

Title:
SYSTEM AND METHOD FOR AUTOMATED NAVIGATION AND CONTROL OF AN ENDOSCOPIC SURGICAL DEVICE
Document Type and Number:
WIPO Patent Application WO/2024/085890
Kind Code:
A1
Abstract:
A surgical device comprising a flexible part circularly wound in a housing is provided. The surgical device comprises a processor configured to predict an intended path for the distal end and to generate control signals. The intended path is predicted based on at least one anatomical structure recognized using data received from one or more sensors. An actuation unit of the surgical device actuates a three-dimensional movement of the distal end along the intended path based on the control signals.

Inventors:
CHAUHAN SANKET (US)
PHAMVU BRYANT (US)
DAS ADITYA (US)
Application Number:
PCT/US2022/049509
Publication Date:
April 25, 2024
Filing Date:
November 10, 2022
Assignee:
SOMEONE IS ME LLC (US)
CHAUHAN SANKET SINGH (US)
PHAMVU BRYANT (US)
DAS ADITYA NARAYAN (US)
International Classes:
A61B34/00; A61M39/08; B65H54/02; B65H75/34; H02G11/02
Attorney, Agent or Firm:
BLACKSTONE, Jason (US)
Claims:
CLAIMS

1. A surgical device comprising: a housing; a flexible part comprising a distal end and wound on the housing; a processor configured to: predict an intended path for the distal end of the flexible part; and generate control signals based on the intended path, wherein the intended path is predicted based on at least one anatomical structure recognized using data received from one or more sensors; and an actuation unit configured to actuate a three-dimensional movement of the distal end of the flexible part along the intended path based on the control signals, the actuation unit comprising: x and y motors configured to actuate a bending movement of the distal end of the flexible part towards the intended path in an x-plane and a y-plane; and a z-motor configured to wind or unwind the flexible part from the housing such that the distal end of the flexible part traverses in a z-plane of the intended path.

2. The surgical device of claim 1, wherein the at least one anatomical structure is recognized using a machine learning model, wherein the machine learning model is trained based on historical data with respect to a plurality of anatomical structures recognized.

3. The surgical device of claim 1, wherein the processor is further configured to create a virtual envelope corresponding to the determined at least one anatomical structure.

4. The surgical device of claim 3, wherein the processor is further configured to predict the intended path for the distal end of the flexible part to avoid the virtual envelope created corresponding to the determined at least one anatomical structure.

5. The surgical device of claim 1, wherein the actuation unit is further configured to restrain or restrict the movement of the distal end of the flexible part when the distal end of the flexible part approaches the at least one anatomical structure.

6. The surgical device of claim 1, wherein the surgical device is configured to perform endoscopic procedures of one of a gastro-intestinal tract, a urinary tract, a respiratory tract, a male or a female genitourinary tract, an ear, nose, throat, brain, spinal cord, cardiovascular system, skeletal system or nervous system.

7. The surgical device of claim 1, wherein the housing is further configured to accommodate a physical barrier that physically separates the flexible part from the housing, wherein the physical barrier is detachably attached to the housing and is disposable or reusable.

8. The surgical device of claim 1, wherein the control signals for a subsequent position are generated based on a current position of the distal end of the flexible part and the data received from the one or more sensors.

9. The surgical device of claim 8, wherein the processor is further configured to: compare the generated subsequent position of the distal end with an intended position along the intended path, and one of: generate the control signals to actuate the three-dimensional movement of the distal end along the intended path, if the subsequent position coincides with the intended position; or generate additional control signals to actuate the three-dimensional movement of the distal end back to the current position, if the subsequent position does not coincide with the intended position.

10. The surgical device of claim 9, wherein the processor utilizes a machine learning model to compare the generated subsequent position of the distal end with an intended position along the intended path.

11. The surgical device of claim 1, further comprising a protective cover configured to cover the flexible part and form a barrier between the flexible part and body fluids, wherein the protective cover is made of a flexible, waterproof, impermeable and transparent material.

12. The surgical device of claim 1, wherein the housing is in the shape of a disc.

13. The surgical device of claim 12, wherein the z-motor is disposed at a center of the disc shaped housing to rotate the housing in order to wind or unwind the flexible part from the housing.

14. The surgical device of claim 13, wherein the x and y motors rotate simultaneously with the disc shaped housing.

15. The surgical device of claim 1, wherein the housing comprises a rack on a top portion on which a pinion, attached to the z-motor, rotates to unwind the flexible part from the housing.

16. The surgical device of claim 15, wherein the x and y motors rotate along with the z-motor.

17. The surgical device of claim 1, wherein the housing is in the shape of a spool.

18. The surgical device of claim 17, wherein the z-motor is attached to a serrated guide wheel attached to the flexible part, wherein the flexible part is sandwiched between the serrated guide wheel on one side and a freely rotatable wheel on the other side, winding or unwinding the flexible part from the spool shaped housing.

19. A surgical device comprising: a housing, comprising: a top side; a bottom side oppositely disposed to the top side; and a rack provided on the top side; a flexible part wound along a circumference of the bottom side of the housing; a processor configured to: predict an intended path for a distal end of the flexible part; and generate control signals based on the intended path, wherein the intended path is predicted based on at least one anatomical structure recognized using data received from one or more sensors; and an actuation unit configured to actuate a three-dimensional movement of the distal end of the flexible part along the intended path based on the control signals, the actuation unit comprising: a z-motor connected to a pinion, wherein the z-motor is configured to traverse the pinion on the rack winding or unwinding the flexible part from the housing such that the distal end of the flexible part retracts or extends by traversing in a z-plane of the intended path; and x and y motors configured to actuate a bending movement of the distal end of the flexible part towards the intended path in an x-plane and a y-plane, wherein the x and y motors traverse along the rack with the z-motor.

20. The surgical device of claim 19, wherein the housing is elliptical or circular in shape.

21. The surgical device of claim 19, wherein the recognition of the anatomical structure comprises using a machine learning model, wherein the machine learning model is trained based on historical data with respect to a plurality of anatomical structures recognized.

22. The surgical device of claim 19, wherein the processor is further configured to create a virtual envelope corresponding to each of the determined anatomical structures.

23. The surgical device of claim 22, wherein the processor is further configured to predict the intended path for the distal end of the flexible part to avoid the virtual envelope created corresponding to the determined anatomical structure.

24. The surgical device of claim 19, wherein the actuation unit is further configured to restrain or restrict the movement of the distal end of the flexible part when the distal end of the flexible part approaches the at least one anatomical structure.

25. The surgical device of claim 19, wherein the surgical device is configured to perform endoscopic procedures of one of a gastro-intestinal tract, a urinary tract, a respiratory tract, a male or a female genitourinary tract, an ear, nose, throat, brain, spinal cord, cardiovascular system, skeletal system or nervous system.

26. The surgical device of claim 19, wherein the housing is further configured to accommodate a physical barrier that physically separates the flexible part from the housing, wherein the physical barrier is detachably attached to the housing and is disposable or reusable.

27. The surgical device of claim 19, further comprising a detachable barrier between the flexible part and the bottom side of the housing.

28. The surgical device of claim 19, wherein the control signals for a subsequent position are generated based on a current position of the distal end of the flexible part and the data received from the one or more sensors.

29. The surgical device of claim 28, wherein the processor is further configured to: compare the generated subsequent position of the distal end with an intended position along the intended path, and one of: generate the control signals to actuate the three-dimensional movement of the distal end along the intended path, if the subsequent position coincides with the intended position; or generate additional control signals to actuate the three-dimensional movement of the distal end back to the current position, if the subsequent position does not coincide with the intended position.

30. The surgical device of claim 29, wherein the processor utilizes a machine learning model to compare the generated subsequent position of the distal end with an intended position along the intended path.

31. The surgical device of claim 19, further comprising a protective cover configured to cover the flexible part and form a barrier between the flexible part and the body fluids, wherein the protective cover is made of a flexible, waterproof, impermeable and transparent material.

32. A surgical device comprising: a disc shaped hub comprising: a circumference; a flexible part wound along the circumference, the flexible part comprising: a distal end; a processor configured to: predict an intended path for the distal end of the flexible part; and generate control signals based on the intended path, wherein the intended path is predicted based on at least one anatomical structure recognized using data received from one or more sensors; and an actuation unit configured to actuate a three-dimensional movement of the distal end of the flexible part along the intended path based on the control signals, the actuation unit comprising: a z-motor connected to a driving wheel, wherein the driving wheel is rotatably connected to a follower wheel and the hub; a serrated guide wheel rotatably connected to the follower wheel, wherein the rotation of the driving wheel rotates the follower wheel, the hub and the serrated guide wheel to wind or unwind the flexible part from the hub such that the distal end of the flexible part traverses in a z-plane of the intended path; and x and y motors configured to actuate a bending movement of the distal end of the flexible part towards the intended path in an x-plane and a y-plane, wherein the x and y motors rotate along the circumference of the hub.

33. The surgical device of claim 32, wherein the hub is elliptical or circular in cross section.

34. The surgical device of claim 32, wherein the z-motor is disposed at a center of the hub to rotate the hub in order to wind or unwind the flexible part from the hub.

35. The surgical device of claim 32, wherein the z-motor is attached to the serrated guide wheel, wherein the flexible part is sandwiched between the serrated guide wheel on one side and a guide on the other side.

36. The surgical device of claim 32, wherein the recognition of the anatomical structure comprises using a machine learning model, wherein the machine learning model is trained based on historical data with respect to a plurality of anatomical structures recognized.

37. The surgical device of claim 32, wherein the processor is further configured to create a virtual envelope corresponding to each of the at least one determined anatomical structure.

38. The surgical device of claim 37, wherein the processor is further configured to predict the intended path for the distal end of the flexible part to avoid the virtual envelope created corresponding to the determined anatomical structure.

39. The surgical device of claim 32, wherein the actuation unit is further configured to restrain or restrict the movement of the distal end of the flexible part when the distal end of the flexible part approaches the at least one anatomical structure.

40. The surgical device of claim 32, wherein the surgical device is configured to perform endoscopic procedures of one of a gastro-intestinal tract, a urinary tract, a respiratory tract, a male or a female genitourinary tract, an ear, nose, throat, brain, spinal cord, cardiovascular system, skeletal system or nervous system.

41. The surgical device of claim 32, wherein the hub is further configured to accommodate a physical barrier that physically separates the flexible part from the hub, wherein the physical barrier is detachably attached to the hub and is disposable or reusable.

42. The surgical device of claim 32, wherein the control signals for a subsequent position are generated based on a current position of the distal end of the flexible part and the data received from the one or more sensors.

43. The surgical device of claim 42, wherein the processor is further configured to: compare the generated subsequent position of the distal end with an intended position along the intended path, and one of: generate the control signals to actuate the three-dimensional movement of the distal end along the intended path, if the subsequent position coincides with the intended position; or generate additional control signals to actuate the three-dimensional movement of the distal end back to the current position, if the subsequent position does not coincide with the intended position.

44. The surgical device of claim 43, wherein the processor utilizes a machine learning model to compare the generated subsequent position of the distal end with an intended position along the intended path.

45. The surgical device of claim 32, further comprising a protective cover configured to cover the flexible part and form a barrier between the flexible part and the body fluids, wherein the protective cover is made of flexible, waterproof, impermeable and transparent material.

46. A surgical device comprising: a hub in the shape of a spool; a flexible part comprising a distal end and wound on the hub; a processor configured to: predict an intended path for the distal end of the flexible part; and generate control signals based on the intended path, wherein the intended path is predicted based on at least one anatomical structure recognized using data received from one or more sensors; and a drive and guide unit configured to guide and drive the flexible part by actuating a three-dimensional movement of the distal end of the flexible part along the intended path based on the control signals, the drive and guide unit comprising: x and y motors attached to the hub and configured to actuate a bending movement of the distal end of the flexible part towards the intended path in an x-plane and a y-plane; and a driver motor configured to rotate a serrated guide wheel, wherein the serrated guide wheel is rotatably connected to a driving wheel and the hub, wherein the rotation of the serrated guide wheel rotates the hub to wind and unwind the flexible part from the hub such that the distal end of the flexible part traverses in a z-plane of the intended path.

47. The surgical device of claim 46, wherein the flexible part is sandwiched between the serrated guide wheel on one side and a guide on the other side.

48. The surgical device of claim 46, wherein the at least one anatomical structure is recognized using a machine learning model, wherein the machine learning model is trained based on historical data with respect to a plurality of anatomical structures recognized.

49. The surgical device of claim 48, wherein the processor is further configured to create a virtual envelope corresponding to each of the determined at least one anatomical structure.

50. The surgical device of claim 49, wherein the processor is further configured to predict the intended path for the distal end of the flexible part to avoid the virtual envelope created corresponding to the determined at least one anatomical structure.

51. The surgical device of claim 46, wherein the drive and guide unit is further configured to restrain or restrict the movement of the distal end of the flexible part when the distal end of the flexible part approaches the at least one anatomical structure.

52. The surgical device of claim 46, wherein the surgical device is configured to perform endoscopic procedures of one of a gastro-intestinal tract, a urinary tract, a respiratory tract, a male or a female genitourinary tract, an ear, nose, throat, brain, spinal cord, cardiovascular system, skeletal system or nervous system.

53. The surgical device of claim 46, wherein the hub is further configured to accommodate a physical barrier that physically separates the flexible part from the hub, wherein the physical barrier is detachably attached to the hub and is disposable or reusable.

54. The surgical device of claim 46, wherein the control signals for a subsequent position are generated based on a current position of the distal end of the flexible part and the data received from the one or more sensors.

55. The surgical device of claim 46, wherein the processor is further configured to: compare the generated subsequent position of the distal end with an intended position along the intended path, and one of: generate the control signals to actuate the three-dimensional movement of the distal end along the intended path, if the subsequent position coincides with the intended position; or generate additional control signals to actuate the three-dimensional movement of the distal end back to the current position, if the subsequent position does not coincide with the intended position.

56. The surgical device of claim 55, wherein the processor utilizes a machine learning model to compare the generated subsequent position of the distal end with an intended position along the intended path.

57. The surgical device of claim 46, further comprising a protective cover configured to cover the flexible part and form a barrier between the flexible part and the body fluids, wherein the protective cover is made of flexible, waterproof, impermeable and transparent material.

58. A method to perform a surgical procedure, comprising: capturing data from one or more sensors; recognizing at least one anatomical structure based on the captured data; predicting an intended path based on the at least one anatomical structure; generating control signals based on the predicted intended path; and actuating a distal end of a flexible part of a surgical device, along the predicted intended path in a three-dimensional manner based on the control signals, wherein the actuation of the flexible part comprises: actuating a z-motor to wind or unwind the flexible part from a housing of the surgical device such that the flexible part traverses in a z-plane of the intended path; and actuating x and y motors to enable a bending movement of the distal end of the flexible part towards the intended path in an x-plane and a y-plane.

59. The method of claim 58, wherein the at least one anatomical structure is recognized using a machine learning model, wherein the machine learning model is trained based on historical data with respect to a plurality of anatomical structures recognized.

60. The method of claim 58, further comprising creating a virtual envelope corresponding to each of the determined one or more anatomical structures.

61. The method of claim 60, wherein the intended path for the distal end of the flexible part is predicted to avoid the virtual envelope created corresponding to the determined at least one anatomical structure.

62. The method of claim 58, wherein the actuation of the distal end of the flexible part comprises restraining or restricting the movement of the distal end of the flexible part when the distal end of the flexible part approaches the at least one anatomical structure.

63. The method of claim 58, wherein the method can be utilized to perform endoscopic procedures of one of a gastro-intestinal tract, a urinary tract, a respiratory tract, a male or a female genitourinary tract, an ear, nose, throat, brain, spinal cord, cardiovascular system, skeletal system or nervous system.

64. The method of claim 58, wherein the control signals for a subsequent position are generated based on a current position of the distal end of the flexible part and the data received from the one or more sensors.

65. The method of claim 64, further comprising: comparing the generated subsequent position of the distal end with an intended position along the intended path, and one of: generating the control signals to actuate the three-dimensional movement of the distal end along the intended path, if the subsequent position coincides with the intended position; or generating additional control signals to actuate the three-dimensional movement of the distal end back to the current position, if the subsequent position does not coincide with the intended position.

66. The method of claim 65, wherein the comparison of the generated subsequent position of the distal end with an intended position along the intended path is based on a machine learning model.

Description:
SYSTEM AND METHOD FOR AUTOMATED CLOSED LOOP NAVIGATION AND CONTROL OF AN ENDOSCOPIC SURGICAL DEVICE

CROSS-REFERENCE INFORMATION

[0001] This application is a continuation-in-part of U.S. patent application No. 17/121,709, filed December 14, 2020, entitled “SYSTEM AND METHOD FOR AUTOMATED INTUBATION,” which is hereby incorporated by reference in its entirety.

FIELD OF INVENTION

[0002] The present disclosure relates generally to automated medical devices, and more specifically to a system and method for automated closed loop navigation and control of invasive surgical devices.

BACKGROUND OF THE INVENTION

[0003] Various surgical procedures include implantation or insertion of medical devices inside a patient’s body. Such devices are inserted into or through a cavity or lumen of a patient for diagnostic and/or interventional purposes in various internal body parts such as the upper, middle or lower gastro-intestinal (GI) tract, cardio-vascular tract, trachea, genito-urinary tract, pulmonary tract, etc.

[0004] One such application of insertion of invasive devices is colonoscopy, which is performed to examine the lower GI tract, including the rectosigmoid, large bowel and the distal part of the small bowel, while passing through the anus. The physician can use a cable-driven endoscope with an imaging device at the distal end to visualize the internal lumen of the lower GI tract. However, the length and natural directional changes of the lower GI tract make the procedure of colonoscopy quite challenging. Hence, performing a colonoscopy requires considerable skill and training. Even with appropriate training, it may be difficult to navigate through the lower GI tract of a patient efficiently. Therefore, there is a need for an efficient and accurate means of controlling and navigating a device within the cavities and lumens inside the human body, such as the GI tract.

SUMMARY

[0005] References to “one embodiment,” “at least one embodiment,” “an embodiment,” “one example,” “an example,” “for example,” and so on indicate that the embodiment(s) or example(s) may include a particular feature, structure, characteristic, property, element, or limitation but that not every embodiment or example necessarily includes that particular feature, structure, characteristic, property, element, or limitation. Further, repeated use of the phrase “in an embodiment” does not necessarily refer to the same embodiment.

[0006] In an aspect of the present invention, an automated closed loop navigation and control system coupled to an invasive medical device is disclosed. The automated closed loop navigation and control system may include a processing circuitry that receives data from at least one data source, such as an image sensor, memory or a database, to recognize structures relevant to the cavity or lumen of the patient and predict an intended path for insertion of the invasive medical device inside the patient. The processing circuitry further generates and communicates the control signals to at least one actuation unit based on the intended path, to actuate the three-dimensional movement of the invasive medical device. A control system for colonoscopy is described, which may be operated in an automated and/or manual manner. Similar systems and methods can be used, with some modifications, inside and in relation to any cavity or lumen inside the human body.
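
For illustration only, the closed loop described in this paragraph can be sketched as a single sense-recognize-predict-actuate step. Every name below (recognize_structures, predict_intended_path, ControlSignal) and every return value is a hypothetical stand-in; the disclosure does not specify any particular implementation.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point3D = Tuple[float, float, float]  # (x, y, z) in device coordinates

@dataclass
class ControlSignal:
    x_bend: float  # bending command toward the path in the x-plane
    y_bend: float  # bending command toward the path in the y-plane
    z_feed: float  # wind (negative) / unwind (positive) command in the z-plane

def recognize_structures(frame: bytes) -> List[str]:
    """Stand-in for structure recognition from sensor data (e.g. an ML model)."""
    return ["lumen_center"]

def predict_intended_path(structures: List[str]) -> List[Point3D]:
    """Stand-in path planner: waypoints chosen relative to recognized structures."""
    return [(0.0, 0.0, 5.0), (0.5, 0.2, 10.0)]

def closed_loop_step(frame: bytes, current: Point3D) -> ControlSignal:
    """One sense-recognize-predict-actuate iteration of the closed loop."""
    structures = recognize_structures(frame)
    path = predict_intended_path(structures)
    nx, ny, nz = path[0]  # next waypoint on the intended path
    cx, cy, cz = current
    return ControlSignal(x_bend=nx - cx, y_bend=ny - cy, z_feed=nz - cz)

signal = closed_loop_step(b"", (0.0, 0.0, 0.0))
print(signal)  # ControlSignal(x_bend=0.0, y_bend=0.0, z_feed=5.0)
```

Repeating this step on each new sensor frame is what closes the loop: the result of the last actuation is observed by the sensors and feeds the next prediction.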

[0007] In an exemplary embodiment of the present invention, an automated colonoscopy system predicts one or more intended path(s) and generates control signals for at least one actuation unit. The intended path(s) is predicted based on the data received from at least one imaging sensor. An overlay of the intended path(s) and/or recognized anatomical structures is also displayed on a user interface over the data received by the user interface from the imaging sensor(s), for effective guidance. If multiple intended paths are displayed, the user can select one or more paths and assign a hierarchy of trajectories. Throughout this process, the information is provided to the user through the user interface, and the user has the ability to manually override the intended path determined by the closed loop navigation and control system, as described later in the detailed description. A manual override could elicit auditory or visual feedback asking the user to confirm the override. The hierarchy between manual control and automated control of insufflation can be changed in the settings, and either can be the default setting. Additionally, the overlay of the intended path can also be visualized on the user interface in the form of augmented reality and/or any other form which provides effective guidance to the user.
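
A minimal sketch of the override handling described above, assuming a hypothetical confirm callback that stands in for the auditory or visual confirmation feedback; none of these names come from the disclosure.

```python
from typing import Callable, List, Optional

def select_path(predicted_paths: List[str],
                override_choice: Optional[int],
                confirm: Callable[[str], bool]) -> str:
    """Return the trajectory to follow, honoring a confirmed manual override."""
    if override_choice is not None:
        # A manual override elicits feedback asking the user to confirm it.
        if confirm(f"Override the automated path with option {override_choice}?"):
            return predicted_paths[override_choice]
    return predicted_paths[0]  # default: highest-ranked automated path

# Example: the user overrides to the second predicted path and confirms.
paths = ["path_a (automated rank 1)", "path_b (automated rank 2)"]
chosen = select_path(paths, override_choice=1, confirm=lambda msg: True)
print(chosen)  # path_b (automated rank 2)
```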

[0008] The instruments for the procedures indicated above may be inserted through the instrument port such that the distal end of the instruments enters the instrument port and exits at the distal end of the flexible part. The processing circuitry may predict an intended path of the instrument. The processing circuitry may also generate and communicate control signals to the actuation unit to actuate the three-dimensional movement of the invasive medical device.

[0009] In one preferred embodiment, the closed loop navigation and control system comprises a main body, a bending portion, a flexible part that connects the main body with the bending portion, a housing unit arranged on the bending portion comprising at least one imaging sensor, a circuitry, a user interface, a disposable cover, an actuation unit to move the distal end and at least one actuation unit to actuate the three-dimensional movement of the flexible part. The length of the bending portion is variable and can range from nearly the tip of the flexible part to the entire flexible part. In other embodiments, the bending portion can be located within any portion of the flexible part, determined by several factors, including but not limited to, the relevant uses and anatomical structures that need to be navigated.

[0010] The processing circuitry can utilize machine learning models along with the data received from the data source(s) to recognize structures relevant to the cavity of the patient, predict an intended path, and generate and communicate control signals to the actuation unit to actuate the three-dimensional movement of the invasive medical device. The intended path may be defined as a path along which the device can guide the invasive medical device once movement has commenced. The generation of the machine learning model involves receiving or collecting training data in the form of predetermined datasets to train at least one neural network. The neural network used could be, but is not limited to, an edge-implemented deep neural network-based object detector and/or any other algorithm known in the art. Forms of machine learning other than neural networks may be substituted, as would be well known to a person of skill in the art.
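
As one hedged illustration of this training step, the fragment below trains a tiny convolutional classifier on placeholder tensors using PyTorch. The dataset, architecture, loss and framework are all assumptions on my part; the disclosure only requires at least one neural network trained on predetermined datasets.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Toy placeholder dataset: 64 grayscale frames (1x32x32) with binary
# "structure present / absent" labels; a real system would use annotated
# endoscopic imagery.
frames = torch.randn(64, 1, 32, 32)
labels = torch.randint(0, 2, (64,))
loader = DataLoader(TensorDataset(frames, labels), batch_size=8, shuffle=True)

# Minimal convolutional classifier standing in for the object detector.
model = nn.Sequential(
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
    nn.Flatten(), nn.Linear(8, 2),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):  # a few passes over the predetermined dataset
    for x, y in loader:
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
```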

[0011] The processing circuitry can be utilized to both predict the intended path for insertion of the device based on at least one recognized anatomical structure and to generate control signals. The processing circuitry can also be utilized to recognize anatomical structure using the data received from the imaging sensor and at least one pre-trained machine learning model. The actuation unit can receive control signals from the processing circuitry to actuate the three-dimensional movement of the flexible part. The actuation unit can use connections with the bending portion to actuate the bending movement of the flexible part in X and Y planes. The actuation unit can also comprise a sliding mechanism to actuate the sliding movement of the flexible part in Z plane by moving the bending portion and its associated actuation unit. Alternatively, the sliding mechanism can actuate the sliding movement of the distal end in the Z plane by direct contact or abutment with the flexible part without displacing the bending portion and its associated actuation unit.
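
The split of one control signal across the bending (X, Y) and sliding (Z) actuators might look like the following sketch; the ActuationUnit class and its fields are illustrative assumptions, not the disclosed mechanism.

```python
from dataclasses import dataclass

@dataclass
class ActuationUnit:
    """Illustrative split of one control signal across the three motors."""
    x_angle: float = 0.0   # cumulative bend in the x-plane
    y_angle: float = 0.0   # cumulative bend in the y-plane
    z_extent: float = 0.0  # length of flexible part slid out in the z-plane

    def apply(self, x_bend: float, y_bend: float, z_feed: float) -> None:
        # The x and y motors actuate the bending movement of the distal end.
        self.x_angle += x_bend
        self.y_angle += y_bend
        # The sliding mechanism winds (z_feed < 0) or unwinds (z_feed > 0)
        # the flexible part; the extent cannot go negative.
        self.z_extent = max(0.0, self.z_extent + z_feed)

unit = ActuationUnit()
unit.apply(x_bend=2.0, y_bend=-1.0, z_feed=5.0)
print(unit)  # ActuationUnit(x_angle=2.0, y_angle=-1.0, z_extent=5.0)
```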

[0012] In another aspect of the present invention, a method to automatically insert an invasive medical device inside the cavity or lumen of the patient is provided which comprises inserting a bending portion and an invasive medical device arranged on the bending portion inside the cavity or lumen of the patient. The method includes collecting data using an imaging sensor arranged on the bending portion and communicating the collected data to a processing circuitry to predict an intended path of insertion of the invasive medical device and generate control signals. The control signals are then communicated to at least one actuation unit to actuate the three-dimensional movement of the invasive medical device. The intended path is preferably predicted by the processing circuitry based on the recognition of at least one structure relevant to the cavity or lumen using the data communicated from the imaging sensor.
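
The position-verification step recited in claims 9, 29, 43, 55 and 65 (comparing a generated subsequent position against the intended position, then either advancing along the path or retracting to the current position) can be sketched as follows. The Euclidean tolerance test is an assumed stand-in for the machine learning comparison recited in claims 10, 30, 44, 56 and 66.

```python
from typing import Tuple

Point3D = Tuple[float, float, float]

def verify_and_correct(subsequent: Point3D,
                       intended: Point3D,
                       current: Point3D,
                       tol: float = 0.5) -> Tuple[str, Point3D]:
    """Advance along the path if the positions coincide; otherwise retract."""
    # Assumed coincidence test: Euclidean distance within a small tolerance.
    error = sum((s - i) ** 2 for s, i in zip(subsequent, intended)) ** 0.5
    if error <= tol:
        return ("advance", intended)  # continue along the intended path
    return ("retract", current)       # additional signals back to current position

print(verify_and_correct((0.1, 0.0, 5.0), (0.0, 0.0, 5.0), (0.0, 0.0, 0.0)))
# ('advance', (0.0, 0.0, 5.0))
```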

[0013] Other embodiments and preferred features of the invention, together with corresponding advantages, will be apparent from the following description and claims.

BRIEF DESCRIPTION OF THE DRAWINGS

[0014] Various aspects as well as embodiments of the present invention are better understood by referring to the following detailed description. To better understand the invention, the detailed description should be read in conjunction with the drawings.

[0015] FIG. 1 illustrates an environment of a closed loop navigation and control system, in accordance with embodiments of the present disclosure.

[0016] FIG. 2a illustrates a robotic unit, in accordance with an embodiment of the present disclosure.

[0017] FIG. 2b illustrates the distal tip of the robotic unit, in accordance with an embodiment of the present disclosure.

[0018] FIG. 2c and FIG. 2d illustrate a top view and a bottom view, respectively, of a cover enclosing the robotic unit of the first embodiment, in accordance with an embodiment of the present disclosure.

[0019] FIG. 3 illustrates a close-up view of the actuation unit of a robotic unit comprising a rack and pinion configuration, in accordance with an embodiment of the present disclosure.

[0020] FIG. 4a illustrates a robotic unit, in accordance with an alternate embodiment of the present disclosure.

[0021] FIG. 4b illustrates a front perspective view of a cover, in accordance with an alternate embodiment of the present disclosure.

[0022] FIG. 5 illustrates a detachable robotic unit, in accordance with an alternate embodiment of the present disclosure.

[0023] FIG. 6a, FIG. 6b and FIG. 6c illustrate the bending section comprising the robotic actuation mechanism of the robotic unit, in accordance with an embodiment of the present disclosure.

[0024] FIG. 6d illustrates the connection of the bending portion with the distal end, in accordance with an embodiment of the present disclosure.

[0025] FIG. 6e illustrates the angulation wires in the robotic unit, in accordance with an embodiment of the current disclosure.

[0026] FIG. 7 illustrates a cross-sectional view of an inner circuitry of the flexible part of the robotic unit, in accordance with an embodiment of the present disclosure.

[0027] FIG. 8 illustrates exemplary functional components of the proposed closed loop navigation and control system, in accordance with an embodiment of the present disclosure.

[0028] FIG. 9 illustrates an exemplary implementation scenario of the closed loop navigation and control system.

[0029] FIG. 10 illustrates a bush coupling mechanism used in the robotic unit, in accordance with an embodiment of the present disclosure.

[0030] FIG. 11 illustrates different positions of the robotic unit around a patient, in accordance with an embodiment of the present disclosure.

DETAILED DESCRIPTION

[0031] The present disclosure is best understood with reference to the detailed figures and description set forth herein. Various embodiments have been discussed with reference to the figures. However, a person skilled in the art will readily appreciate that the detailed descriptions provided herein with respect to the figures are merely for explanatory purposes, as the methods and system may extend beyond the described embodiments. For instance, the teachings presented, and the needs of a particular application, may yield multiple alternatives and suitable approaches to implement the functionality of any detail described herein. Therefore, any approach may extend beyond certain implementation choices in the following embodiments.

[0032] Methods of the present invention may be implemented by performing or executing manually, automatically, or a combination thereof, selected steps or tasks. The term “method” refers to manners, means, techniques, and procedures for accomplishing a given task including, but not limited to, those manners, means, techniques, and procedures either known to or readily developed from known manners, means, techniques, and procedures by practitioners of the art to which the invention belongs. The descriptions, examples, methods, and materials presented in the claims and the specification are not to be construed as limiting but rather as illustrative only. Those skilled in the art will envision many other possible variations within the scope of the technology described herein.

[0033] While reading a description of the exemplary embodiment of the best mode of the invention (hereinafter referred to as the “exemplary embodiment”), one should consider the exemplary embodiment as the best mode for practicing the invention at the time of filing of the patent in accordance with the inventor's belief. As a person with ordinary skill in the art may recognize substantially equivalent structures or substantially equivalent acts to achieve the same results in the same manner, or in a dissimilar manner, the exemplary embodiment should not be interpreted as limiting the invention to one embodiment.

[0034] The discussion of a species (or a specific item) invokes the genus (the class of items) to which the species belongs as well as related species in this genus. Similarly, the recitation of a genus invokes the species known in the art. Furthermore, as technology develops, numerous additional alternatives to achieve an aspect of the invention may arise. Such advances are incorporated within their respective genus and should be recognized as being functionally equivalent or structurally equivalent to the aspect shown or described.

[0035] Unless explicitly stated otherwise, conjunctive words (such as “or,” “and,” “including,” or “comprising”) should be interpreted in the inclusive, and not the exclusive, sense.

[0036] As will be understood by those of ordinary skill in the art, various structures and devices are depicted in block diagram form so as not to obscure the invention. It should be noted in the following discussion that acts with similar names are performed in similar manners unless otherwise stated.

[0037] The foregoing discussions and definitions are provided for clarification purposes and are not limiting. Words and phrases are to be accorded their ordinary, plain meaning unless indicated otherwise. The present disclosure relates generally to automated medical devices, and more specifically to a system and method for automatically navigating invasive surgical devices through a lumen or cavity of the body.

[0038] In an aspect of the present invention, an automated closed loop navigation and control system coupled to an invasive medical device is disclosed. The automated closed loop navigation and control system may include a processing circuitry that receives data from at least one data source, such as an image sensor, memory or a database, to recognize structures such as a cavity or lumen in a patient and predict an intended path for insertion and navigation of an invasive medical device inside the patient. The processing circuitry further generates and communicates the control signals to at least one actuation unit based on a predicted intended path, to actuate the invasive medical device in a three-dimensional manner for controlling the navigation and movement of the invasive medical device. The closed loop navigation and control system related to colonoscopy may be operated in an automated and/or manual manner. Additionally, the closed loop navigation and control system and methods may be utilized in several other procedures, with some modifications, inside any cavity or lumen of the human body.

[0039] The data sources may comprise one or more imaging sensors. Other sensors may include, but are not limited to, infrared cameras, sonic sensors, microwave sensors, fiberoptic shape sensors, photodetectors, mechanical sensors such as pressure sensors, force sensors, proximity sensors, time-of-flight or LIDAR sensors, or other sensors known to a person skilled in the art. In another embodiment, one or more sensors may be fused into a custom-designed single sensor to reduce the dimensions of the sensor. The data captured through a data source may also be captured based on differential absorption of monochromatic (single wavelength or narrow band) or polychromatic (multiple wavelengths at the same time) radiation ranging from the ultraviolet to the far infrared spectrum. The data can be static (a single point, or captured at a single point in time), or dynamic or serial data captured over a series of times, or continuous data that may create a point cloud or a video.

[0040] In an embodiment, the closed loop navigation and control system may be used for diagnostic and interventional endoscopic procedures involving the cavity or lumen of the GI system, hepatobiliary system, respiratory system, male and female genito-urinary systems, cardiovascular system and female reproductive system. In the GI system, the procedures may include, but are not limited to: Esophagoscopy, rigid, transoral; diagnostic, including collection of specimen(s) by brushing or washing; Esophagoscopy, rigid, transoral; with biopsy, single or multiple; Esophagoscopy, flexible, transnasal; diagnostic, including collection of specimen(s) by brushing or washing; Esophagoscopy, flexible, transnasal; with biopsy, single or multiple; Esophagoscopy, flexible, transoral; diagnostic, including collection of specimen(s) by brushing or washing; Esophagoscopy, flexible, transoral; with biopsy, single or multiple; Esophagogastroduodenoscopy, flexible, transoral; diagnostic, including collection of specimen(s) by brushing or washing; Esophagogastroduodenoscopy, flexible, transoral; with biopsy, single or multiple; Small intestinal endoscopy, enteroscopy beyond second portion of duodenum, not including ileum; diagnostic, including collection of specimen(s) by brushing or washing; Small intestinal endoscopy, enteroscopy beyond second portion of duodenum, not including ileum; with biopsy, single or multiple; Small intestinal endoscopy, enteroscopy beyond second portion of duodenum, including ileum; diagnostic, with or without collection of specimen(s) by brushing or washing; Small intestinal endoscopy, enteroscopy beyond second portion of duodenum, including ileum; with biopsy, single or multiple; Ileoscopy, through stoma; diagnostic, including collection of specimen(s) by brushing or washing, when performed; Ileoscopy, through stoma; with biopsy, single or multiple; Colonoscopy through stoma; diagnostic, including collection of specimen(s) by brushing or washing; Colonoscopy through stoma; with biopsy, single or multiple; Sigmoidoscopy, flexible; diagnostic, including collection of specimen(s) by brushing or washing; Sigmoidoscopy, flexible; with biopsy, single or multiple; Colonoscopy, flexible; diagnostic, including collection of specimen(s) by brushing or washing; Colonoscopy, flexible; with biopsy, single or multiple; Esophagoscopy, flexible, transoral; with endoscopic ultrasound examination; Esophagogastroduodenoscopy, flexible, transoral; with endoscopic ultrasound examination limited to the esophagus, stomach or duodenum, and adjacent structures; Esophagogastroduodenoscopy, flexible, transoral; with endoscopic ultrasound examination, including the esophagus, stomach, and either the duodenum or a surgically altered stomach where the jejunum is examined distal to the anastomosis; Colonoscopy through stoma; with endoscopic ultrasound examination, limited to the sigmoid, descending, transverse, or ascending colon and cecum and adjacent structures; Sigmoidoscopy, flexible; with endoscopic ultrasound examination; Colonoscopy, flexible; with endoscopic ultrasound examination limited to the rectum, sigmoid, descending, transverse, or ascending colon and cecum, and adjacent structures; Esophagoscopy, rigid, transoral; with balloon dilation (less than 30 mm diameter); Esophagoscopy, rigid, transoral; with insertion of guide wire followed by dilation over guide wire; Esophagoscopy, flexible, transoral; with dilation of esophagus, by balloon or dilator, retrograde (includes fluoroscopic guidance, when performed); Esophagoscopy, flexible, transoral; with dilation of esophagus with balloon (30 mm diameter or larger) (includes fluoroscopic guidance, when performed); Esophagoscopy, flexible, transoral; with transendoscopic balloon dilation (less than 30 mm diameter); Esophagoscopy, flexible, transoral; with insertion of guide wire followed by passage of dilator(s) over guide wire; Esophagogastroduodenoscopy, flexible, transoral; with dilation of esophagus with balloon (30 mm diameter or larger) (includes fluoroscopic guidance, when performed); Esophagogastroduodenoscopy, flexible, transoral; with dilation of gastric/duodenal stricture(s) (e.g., balloon, bougie); Esophagogastroduodenoscopy, flexible, transoral; with insertion of guide wire followed by passage of dilator(s) through esophagus over guide wire; Esophagogastroduodenoscopy, flexible, transoral; with transendoscopic balloon dilation of esophagus; Ileoscopy, through stoma; with transendoscopic balloon dilation; Colonoscopy through stoma; with transendoscopic balloon dilation; Sigmoidoscopy, flexible; with transendoscopic balloon dilation; Esophagoscopy, rigid, transoral; with removal of foreign body(s); Esophagoscopy, flexible, transoral; with removal of foreign body(s); Esophagogastroduodenoscopy, flexible, transoral; with removal of foreign body(s); Small intestinal endoscopy, enteroscopy beyond second portion of duodenum, not including ileum; with removal of foreign body(s); Colonoscopy through stoma; with removal of foreign body(s); Sigmoidoscopy, flexible; with removal of foreign body(s); Colonoscopy, flexible; with removal of foreign body(s); Esophagoscopy, flexible, transoral; with removal of tumor(s), polyp(s), or other lesion(s) by hot biopsy forceps; Esophagogastroduodenoscopy, flexible, transoral; with removal of tumor(s), polyp(s), or other lesion(s) by hot biopsy forceps; Small intestinal endoscopy, enteroscopy beyond second portion of duodenum, not including ileum; with removal of tumor(s), polyp(s), or other lesion(s) by hot biopsy forceps or bipolar cautery; Colonoscopy through stoma; with removal of tumor(s), polyp(s), or other lesion(s) by hot biopsy forceps; Sigmoidoscopy, flexible; with removal of tumor(s), polyp(s), or other lesion(s) by hot biopsy forceps; Colonoscopy, flexible; with removal of tumor(s), polyp(s), or other lesion(s) by hot biopsy forceps; Esophagoscopy, flexible, transoral; with removal of tumor(s), polyp(s), or other lesion(s) by snare technique; Esophagogastroduodenoscopy, flexible, transoral; with removal of tumor(s), polyp(s), or other lesion(s) by snare technique; Small intestinal endoscopy, enteroscopy beyond second portion of duodenum, not including ileum; with removal of tumor(s), polyp(s), or other lesion(s) by snare technique; Colonoscopy through stoma; with removal of tumor(s), polyp(s), or other lesion(s) by snare technique; Sigmoidoscopy, flexible; with removal of tumor(s), polyp(s), or other lesion(s) by snare technique; Colonoscopy, flexible; with removal of tumor(s), polyp(s), or other lesion(s) by snare technique; Esophagoscopy, flexible, transoral; with ablation of tumor(s), polyp(s), or other lesion(s) (includes pre- and post-dilation and guide wire passage, when performed); Small intestinal endoscopy, enteroscopy beyond second portion of duodenum, not including ileum; with ablation of tumor(s), polyp(s), or other lesion(s) not amenable to removal by hot biopsy forceps, bipolar cautery or snare technique; Colonoscopy through stoma; with ablation of tumor(s), polyp(s), or other lesion(s) (includes pre- and post-dilation and guide wire passage, when performed); Sigmoidoscopy, flexible; with ablation of tumor(s), polyp(s), or other lesion(s) (includes pre- and post-dilation and guide wire passage, when performed); Colonoscopy, flexible; with ablation of tumor(s), polyp(s), or other lesion(s) (includes pre- and post-dilation and guide wire passage, when performed); Esophagoscopy, flexible, transoral; with control of bleeding, any method; Esophagogastroduodenoscopy, flexible, transoral; with control of bleeding, any method; Small intestinal endoscopy, enteroscopy beyond second portion of duodenum, not including ileum; with control of bleeding (e.g., injection, bipolar cautery, unipolar cautery, laser, heater probe, stapler, plasma coagulator); Small intestinal endoscopy, enteroscopy beyond second portion of duodenum, including ileum; with control of bleeding (e.g., injection, bipolar cautery, unipolar cautery, laser, heater probe, stapler, plasma coagulator); Sigmoidoscopy, flexible; with control of bleeding, any method; Colonoscopy through stoma; with control of bleeding, any method; Esophagoscopy, flexible, transoral; with injection sclerosis of esophageal varices; Esophagogastroduodenoscopy, flexible, transoral; with injection sclerosis of esophageal/gastric varices; Esophagoscopy, flexible, transoral; with band ligation of esophageal varices; Esophagogastroduodenoscopy, flexible, transoral; with band ligation of esophageal/gastric varices; Esophagoscopy, flexible, transoral; with placement of endoscopic stent (includes pre- and post-dilation and guide wire passage, when performed); Esophagogastroduodenoscopy, flexible, transoral; with placement of endoscopic stent (includes pre- and post-dilation and guide wire passage, when performed); Small intestinal endoscopy, enteroscopy beyond second portion of duodenum, not including ileum; with transendoscopic stent placement (includes predilation); Small intestinal endoscopy, enteroscopy beyond second portion of duodenum, including ileum; with transendoscopic stent placement (includes predilation); Ileoscopy, through stoma; with placement of endoscopic stent (includes pre- and post-dilation and guide wire passage, when performed); Sigmoidoscopy, flexible; with placement of endoscopic stent (includes pre- and post-dilation and guide wire passage, when performed); Colonoscopy, flexible; with endoscopic stent placement (includes pre- and post-dilation and guide wire passage, when performed); Esophagogastroduodenoscopy, flexible, transoral; with directed placement of percutaneous gastrostomy tube; Small intestinal endoscopy, enteroscopy beyond second portion of duodenum, not including ileum; with placement of percutaneous jejunostomy tube; Small intestinal endoscopy, enteroscopy beyond second portion of duodenum, not including ileum; with conversion of percutaneous gastrostomy tube to percutaneous jejunostomy tube; Replacement of gastrostomy or cecostomy (or other colonic) tube, percutaneous, under fluoroscopic guidance including contrast injection(s); Esophagoscopy, rigid, transoral; with directed submucosal injection(s), any substance; Esophagoscopy, flexible, transoral; with directed submucosal injection(s), any substance; Esophagogastroduodenoscopy, flexible, transoral; with directed submucosal injection(s), any substance; Colonoscopy through stoma; with directed submucosal injection(s), any substance; Sigmoidoscopy, flexible; with directed submucosal injection(s), any substance; Colonoscopy, flexible; with directed submucosal injection(s), any substance; Esophagoscopy, flexible, transoral; with transendoscopic ultrasound-guided intramural or transmural fine needle aspiration/biopsy(s); Esophagogastroduodenoscopy, flexible, transoral; with transendoscopic ultrasound-guided intramural or transmural fine needle aspiration/biopsy(s) (includes endoscopic ultrasound examination limited to the esophagus, stomach or duodenum, and adjacent structures); Esophagogastroduodenoscopy, flexible, transoral; with transendoscopic ultrasound-guided intramural or transmural fine needle aspiration/biopsy(s) (includes endoscopic ultrasound examination of the esophagus, stomach, and either the duodenum or a surgically altered stomach where the jejunum is examined distal to the anastomosis); Colonoscopy through stoma; with transendoscopic ultrasound-guided intramural or transmural fine needle aspiration/biopsy(s), includes endoscopic ultrasound examination limited to the sigmoid, descending, transverse, or ascending colon and cecum and adjacent structures; Sigmoidoscopy, flexible; with transendoscopic ultrasound-guided intramural or transmural fine needle aspiration/biopsy(s).

[0041] In an embodiment, the cavity or lumen inside which the invasive medical device may be inserted can be any natural or artificial cavity inside the human body, including but not limited to the abdominal cavity, orbital cavity, external, middle and internal ear including associated lumens and cavities, brain/cranial cavity, vertebral/spinal cavity, thoracic cavity, peritoneal cavity, pelvic cavity, pleural cavity, oral cavity, nasal cavity, laryngeal cavity, etc. The lumen can be any natural or artificial organ, or structure within an organ system, including but not limited to the following organ systems. In the respiratory system, the lumen can include but is not limited to the continuous respiratory tract: nose, nasopharynx, larynx, trachea, right and left main bronchus, bronchi and bronchioles, alveoli, the lung parenchyma, etc. In the gastro-intestinal (GI) system, the lumen can include but is not limited to the continuous GI tract: mouth, oropharynx, esophagus, stomach (including all different parts of the stomach), duodenum (including all different parts of the duodenum), small intestine (including all different parts of the small intestine), colon (including all different parts of the colon), sigmoid, rectum, anus, ileostomies, colostomies, etc. In the urinary system, the lumen can include but is not limited to the continuous urinary tract: urethra (including all parts of the male and female urethra), vas deferens, urinary bladder, ureter, renal pelvis, renal calyces, renal pyramids and kidney. In the cardiovascular system, the lumen can include but is not limited to all the chambers of the heart, the aorta (including all parts of the aorta), the celiac trunk, all arteries, all veins and capillaries, the inferior and superior vena cava, etc. In the hepato-pancreato-biliary (HPB) system, the lumen can include but is not limited to the continuous HPB tract: liver, gall bladder, pancreas, and all the associated ducts: hepatic, cystic, pancreatic, common bile duct, etc. In the female reproductive system, the lumen can include but is not limited to the vagina, cervix, uterus, fallopian tubes, etc. The cavity or lumen may also include normal anatomical structures and landmarks in these systems, such as the verumontanum, ampulla of Vater, ileo-cecal junction, normal anatomical variants, and abnormal anatomical pathologies such as polyps, tumors, diverticula, etc. The cavity or lumen may enable imaging of a structure in proximity of the endoscope and can be used for diagnostic and interventional procedures such as trans-esophageal echocardiogram, endoscopic ultrasounds, endobronchial ultrasound and other procedures known to a person skilled in the art.

[0042] FIG. 1 illustrates an environment of a closed loop navigation and control system 100, in accordance with an embodiment of the present disclosure. In an embodiment, the closed loop navigation and control system 100 may be provided in a body or a stand (described later) of a detachable robotic unit 104. The closed loop navigation and control system 100 may include a control unit 102 which generates control signals which are transmitted to the detachable robotic unit 104. The detachable robotic unit 104 may include an imaging unit 112 and an actuation unit 114. In an embodiment, the control unit 102 may be connected to the detachable robotic unit 104 through a wired or a wireless connection or a combination of both. Accordingly, the detachable robotic unit 104 may be automatically operated by the control unit 102 based on feedback received from the imaging unit 112 and the sensor unit 116. Thus, a closed loop of processing and control may be created based on inputs provided by the imaging unit 112 and the processing done by the control unit 102 to generate control signals to automatically control the movement of the detachable robotic unit 104 through the actuation unit 114.
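
Purely as an illustration of this feedback wiring, the sketch below models the imaging unit 112, sensor unit 116, control unit 102 and actuation unit 114 as toy classes passing data around the loop; the class interfaces and return values are assumptions, not the disclosed design.

```python
class ImagingUnit:                 # stands in for imaging unit 112
    def frame(self) -> bytes: return b"frame"

class SensorUnit:                  # stands in for sensor unit 116
    def reading(self) -> float: return 0.0

class ActuationUnit:               # stands in for actuation unit 114
    def drive(self, signal: dict) -> None: print("actuating", signal)

class ControlUnit:                 # stands in for control unit 102
    def signals(self, frame: bytes, reading: float) -> dict:
        # Real processing would recognize structures and predict a path here.
        return {"x_bend": 0.0, "y_bend": 0.0, "z_feed": 1.0}

imaging, sensors = ImagingUnit(), SensorUnit()
actuation, control = ActuationUnit(), ControlUnit()
for _ in range(3):  # each pass closes the loop: sense -> process -> actuate
    actuation.drive(control.signals(imaging.frame(), sensors.reading()))
```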

[0043] In an embodiment, the wired or the wireless network or a combination thereof can be implemented as one of the different types of networks, such as an intranet, local area network (LAN), wide area network (WAN), Bluetooth, IEEE 802.11, the internet, Wi-Fi, LTE network, CDMA network, etc. Further, the wired or the wireless network can either be a dedicated network or a shared network. The shared network represents an association of the different types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), etc., to communicate with one another. Further, the wired or the wireless network can include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, etc.

[0044] In an embodiment, the closed loop navigation and control system 100 may be electrically powered through a power supply cable or a rechargeable battery provided to power the various components of the closed loop navigation and control system 100. In an embodiment, the robotic unit 104 may be powered by the same or a different power supply cable as the control unit 102 or may have a separate power supply in the form of a rechargeable battery to power the robotic unit 104.

[0045] The control unit 102 comprises one or more processors 108. The one or more processor(s) 108 may be implemented as one or more microprocessors, microcomputers, single board computers, microcontrollers, digital signal processors, central processing units, graphics processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 108 are configured to fetch and execute computer-readable instructions stored in a memory 110 of the control unit 102. The memory 110 may store one or more computer-readable instructions or routines, which may be fetched and executed to create or share the data units over a network service. The memory 110 may comprise any non-transitory storage device including, for example, volatile memory such as RAM, or non-volatile memory such as EPROM, flash memory, etc. In an embodiment, the control unit 102 may be connected to a cloud server comprising the one or more processors 108 and the memory 110 in the form of a cloud database. The one or more processors 108 may be configured to process data stored in the local memory 110 in the control unit 102 or in the form of a cloud database.

[0046] The control unit 102 may also comprise input/output devices 106. The input/output devices 106 may comprise a variety of interface(s), for example, interfaces for data input and output devices, and the like. The input/output devices 106 may facilitate inputting of instructions by a user 118 communicating with the control unit 102. In an embodiment, the input/output device 106 may be wirelessly connected to the control unit 102 through wireless network interfaces such as Bluetooth®, infrared, or any other wireless radio communication known in the art. In an embodiment, the input/output devices 106 may be connected to a communication pathway for one or more components of the control unit 102 to facilitate the transmission of inputted instructions and output the results of data generated by various components such as, but not limited to, the processor(s) 108 and the memory 110.

[0047] In an embodiment, the control unit 102 may be implemented in any computing device which may be automatically configured or controlled by a user 118 operating the closed loop navigation and control system 100. Further, a user 118 can communicate with the control unit 102 through one or more user devices (not shown) that can be communicatively coupled to the control unit 102 through a wired or a wireless connection or provided as one or more input/output devices 106. The user may be a healthcare provider who may be present during the operation of the medical device comprising the closed loop navigation and control system 100. In an embodiment, the user device (not shown) may include a variety of computing systems, including but not limited to physical manipulation, a touch-enabled computing device, an Artificial Intelligence (AI) enabled interface, a laptop computer, a virtual reality / augmented reality / mixed reality (VR/AR/MR) enabled or integrated interface, a desktop computer, a notebook, a workstation, a portable computer, a personal digital assistant, a handheld device, a joystick or a mobile device. In an embodiment, the input/output device 106 may be configured to receive inputs from a user 118 in the form of, but not limited to, touch, gaze, gesture, voice commands, etc.

[0048] The robotic unit 104 may be detachably connected to the control unit 102. The robotic unit 104 may be of different shapes and sizes and may be designed and selected for the cavity or lumen into which it is to be inserted. For example, the size and shape of the robotic unit 104 for a colonoscope or an upper GI endoscope may be different than that used for a ureteroscope or a bronchoscope, and so on. In an embodiment, the robotic unit 104 may also contain a processing unit (not shown) for processing such as sensor integration and independent functioning of one or more sensors, microcontrollers, motors, actuators, and other components provided in the robotic unit 104. The robotic unit 104 may also contain attachment interfaces and channels for all external devices including, but not limited to, other external surgical devices (not shown), external surgical instruments, energy devices (not shown), insufflation devices (not shown), suction-irrigation devices (not shown), ultrasound (not shown), and other imaging devices (not shown), etc. The robotic unit 104 may constitute a combination of one or more disposable components and reusable components (described later). In one embodiment, the robotic unit 104 may house a tube (described later), which can be of different lengths and diameters corresponding to the cavity or lumen into which it is to be inserted. The tube of the robotic unit 104 may include a flexible part (described later), a bending portion (described later) and a distal end (described later) which may be single use or reusable.

[0049] The robotic unit 104 may be designed to have different shapes and sizes. Further, the robotic unit 104 may be different for each cavity or lumen the endoscope is designed for. For example, the size and shape of the robotic unit 104 for a colonoscope or an upper GI endoscope may be different than that of a ureteroscope or a bronchoscope, etc. The robotic unit 104 may house the flexible part 204, which can be of different lengths and diameters according to the cavity or lumen for which the endoscope is designed.

[0050] The robotic unit 104 may comprise an imaging unit 112, an actuation unit 114 and a sensor unit 116. The imaging unit 112 may include one or more imaging sensors which may capture the images. An imaging unit 112 may be positioned at any location along the flexible part 204, or at any other location that can provide panoramic images. In an embodiment, the image sensors can be placed at the distal end of the flexible part of the robotic unit 104. In an embodiment, the imaging unit 112 may be detachable from the flexible part of the robotic unit 104. In an embodiment, the imaging unit 112 may be introduced through one or more channels through the flexible part of the robotic unit 104. In an embodiment, the image sensors may not be placed directly at the optimal position due to engineering constraints and may be placed elsewhere and connected to the optimal position. The imaging unit 112 may be connected to the optimal position using methods and technologies such as fiber optics, wave guides, and other forms of transmissive medium. In an embodiment, the imaging sensors can be placed on a proximal end of a flexible part (described later) and connected to the distal tip via these methods. In an embodiment, the image sensors may be placed or aligned to have a superimposed field of view. The control unit 102 may generate a panoramic image by stitching the images captured by each sensor of the imaging unit 112. In an embodiment, the panoramic image may be displayed on a display of the input/output device 106. Further, the panoramic image may be utilized to determine an intended path used to operate the actuation unit 114 which in turn actuates the movement of the flexible part 204 of the robotic unit 104 as described in the co-pending patent application PCT/US2021/062988, incorporated herein by reference in its entirety. Custom sensor(s) can be created by combining two or more imaging sensors into a single housing and the data from the custom sensor(s) can be used as a data source. In an embodiment, the imaging sensor(s) can operate at any wavelength along the electromagnetic spectrum and non-electromagnetic spectrum; sensors including, but not limited to, cameras, infrared cameras, ultraviolet sensors, sonic sensors, microwave sensors, photodetectors, or others known to the person skilled in the art can also be employed to achieve the same purpose.

[0051] Further, the flexible part of the robotic unit 104 may also include a sensor unit 116 comprising various sensors or a combination thereof such as, but not limited to, Time-of-Flight sensors, temperature sensors, proximity sensors, pressure sensors, etc. The sensor unit 116 and the imaging unit 112, separately or in combination, may be a data source and may provide data that can be used by the control unit 102 to generate control signals for the robotic unit 104.

[0052] In an embodiment, the panoramic view captured by the imaging unit 112 and the data from the sensor unit 116 may be used by the control unit 102 for the generation of control signals which are transmitted to the actuation unit 114. The actuation unit 114 may include a robotic actuation mechanism (described later) which facilitates the movement of the flexible part of the robotic unit 104 inside of a patient's body as per the intended path that has been determined, overlaid and confirmed. In an embodiment, the control signals from the control unit 102 are transmitted to the actuation unit 114 based on an intended path. In an embodiment, the intended path can be the path along which the movement of the flexible part of the robotic unit 104 would be guided once the movement has commenced.

[0053] In an embodiment of the present invention, the control unit 102, the input/output device 106, and the robotic unit 104 may be associated with a main body (not shown). In an alternative embodiment of the present invention, the control unit 102, the input/output device 106, and the robotic unit 104 may be arranged separately from the main body (not shown).

[0054] FIG. 2a illustrates a robotic unit 104, in accordance with an embodiment of the present disclosure. In an embodiment, the robotic unit 104 may include a housing 202. The housing 202 may contain a tubular flexible part 204 of the robotic unit 104 and an actuation unit 114. The flexible part 204 of the robotic unit 104 may have a distal end 206 with a tip comprising the imaging unit 112, the sensor unit 116 and various other ports (not shown) for water, suction, irrigation, insufflation, lighting, etc. The actuation unit 114 and the housing 202 are configured to form a rack and pinion (as illustrated later). The teeth of the rack 208 can be arranged on the circumference or perimeter of the housing 202. The pinion (not shown) is attached to a Z-motor 212 and freely rotates about the Z-motor coupler axis. The pinion (not shown) rolls over the rack 208 and moves along the circumference or perimeter based on the powering or actuation of the Z-motor 212. The Z-motor 212 is housed on top of a motor block 214 and moves from an initial home position to an end position. The pinion (not shown) is attached on the end of the coupler shaft of the Z-motor 212. The coupler shaft of the Z-motor 212 passes through a hole in a side wall of a tunnel 210 of the motor block 214 which houses the pinion (not shown). The distal end of the pinion (not shown) is a gear with teeth that interlock with the teeth of the rack 208. The rotation of the Z-motor 212 coupler rotates the pinion (not shown) which in turn drives the movement of the flexible part 204 in the Z axis. This movement of the flexible part 204 in the Z direction results in the insertion or retraction of the flexible part 204 in the Z-direction. The speed of the Z-motor 212 is controlled by control signals received from the control unit 102 of FIG. 1. In an embodiment, the flexible part 204 at one end may be attached to the motor block 214 to be connected to the housing 202. Further, X and Y motors 218 can be attached on the same or opposite sides of the motor block 214. As seen in FIG. 2a, the motor block 214 may provide support to the X and Y motors 218 and the Z-motor 212. In an embodiment, the X and Y motors 218 may be attached to the motor block 214 in a manner that the rack 208 is sandwiched between the X and Y motors 218. In an embodiment, limit switches (not shown) may be mounted to the Z-motor 212 and the X and Y motors 218 or the motor block 214 to prevent over-insertion or over-retraction of the flexible part 204, and may enable initial calibration of the actuation unit 114. In an embodiment, limit switches (as described later) may also be placed along the rack 208 to provide additional positional data in the Z axis. The purpose of the limit switches can also be achieved by a physical limiter that can limit the rotation of the X and Y motors 218 and prevent excessive rotation of the motor. In an embodiment, electromagnetic sensors (not shown) and tracking sensors (not shown) may be used to confirm extension or retraction of the flexible part 204, forming a closed loop system of winding and unwinding.
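For illustration only, the rotational-to-linear conversion performed by the Z-motor and pinion can be expressed in a few lines of code. The following Python sketch is not part of the disclosed device; the pinion radius, travel limit, and function names are assumptions chosen for the example, and the limit-switch behavior is emulated by a simple clamp.

```python
import math

# Hypothetical values, not from the patent: pinion pitch radius and the
# maximum travel bounded by the limit switches.
PINION_RADIUS_MM = 8.0
MAX_TRAVEL_MM = 1500.0

def z_travel(motor_angle_deg: float) -> float:
    """Linear insertion distance produced by a given pinion rotation
    (arc length = angle in radians x pinion radius)."""
    return math.radians(motor_angle_deg) * PINION_RADIUS_MM

def clamp_travel(travel_mm: float) -> float:
    """Emulate the limit switches: never over-insert or over-retract."""
    return max(0.0, min(travel_mm, MAX_TRAVEL_MM))

# e.g. three full pinion turns insert roughly 150.8 mm of the flexible part
print(clamp_travel(z_travel(3 * 360)))
```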

[0055] In one preferred embodiment, the automated closed loop navigation and control system 100 for an endoscopic surgical device, also referred to herein as a robotic unit 104, comprises a housing 202 and a flexible part 204 arranged on the housing 202. The housing 202 may comprise at least one imaging sensor, a circuitry, a user interface, and an actuation unit 114 to actuate a three-dimensional movement of a distal end 206. The length of the flexible part 204 is variable, and a bending portion (not shown) can be located at the tip of the flexible part 204 or can cover the length of the flexible part 204 completely. In other embodiments, the bending portion can be located within any portion of the flexible part 204, determined by several factors, including but not limited to, the relevant uses and anatomical structures that need to be navigated. There can also be multiple bending portions within the flexible part 204.

[0056] The X and Y motors 218 may be actuated based on control signals received from the control unit 102. The X and Y motors 218 may navigate the movement of the distal end 206 in an X direction and a Y direction in a 2D plane of movement in order to align the distal end 206 with an intended path. The 2D plane of movement may be determined based on the panoramic image captured by the imaging unit 112. The control unit 102 may provide sets of coordinates of the intended path with which the distal end 206 is to be aligned. The determination of the sets of coordinates of the intended path is described later in the disclosure. A person of skill in the art will recognize that 2D coordinate systems other than X and Y, such as polar coordinates, may also be used.

[0057] In an embodiment, a subsequent position may be determined based on a current position of the distal end 206 of the flexible part 204 and the data received from the one or more sensors of the imaging unit 112 and the sensor unit 116. In an embodiment, the determined subsequent position may be compared with an intended position along the intended path. Accordingly, the control signals may be generated to actuate the three-dimensional movement of the distal end 206 along the intended path if the subsequent position coincides with the intended position. If the subsequent position does not coincide with the intended position, additional control signals may be generated by the control unit 102 to actuate the three-dimensional movement of the distal end back to the current position. In an embodiment, a user may manually operate the control unit 102 to actuate the robotic unit 104 so that the distal end 206 of the flexible part 204 overlaps with the intended position. In an embodiment, multiple subsequent positions may be determined based on the current position and the data received from the imaging unit 112 and the sensor unit 116. A machine learning algorithm may be utilized by the closed loop navigation and control system 100 to automatically actuate the robotic unit 104 so that the distal end 206 of the flexible part 204 overlaps with the intended position, based on the comparison of the generated subsequent position of the distal end 206 with an intended position along the intended path. Accordingly, closed loop control may be provided to actuate the bending portion of the flexible part 204 in accordance with the determined intended path.
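As a rough illustration of the position comparison described above, the following Python sketch implements one iteration of such a closed loop. It is a minimal sketch under stated assumptions: the tolerance value, the vector representation of positions, and the function names are illustrative and are not taken from the disclosure.

```python
import numpy as np

TOLERANCE_MM = 1.0  # assumed acceptance radius around the intended position

def control_step(current, subsequent, intended):
    """One closed-loop iteration: advance along the path if the determined
    subsequent position coincides with the intended one (within tolerance),
    otherwise command a return to the current position."""
    current, subsequent, intended = map(np.asarray, (current, subsequent, intended))
    if np.linalg.norm(subsequent - intended) <= TOLERANCE_MM:
        return intended - current        # displacement command along the path
    return np.zeros_like(current)        # hold at / return to current position

# e.g. a subsequent position 0.22 mm off the intended one is accepted
print(control_step([0.0, 0.0, 0.0], [0.2, 0.1, 5.0], [0.0, 0.0, 5.0]))
```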

[0058] Based on the received sets of coordinates of the intended path, the X and Y motors 218 actuate the corresponding angulation cables running inside the flexible part 204 up to the robotic actuation mechanism (described later) provided right before the distal end 206. In an embodiment, four angulation cables are provided which may be actuated by the X and Y motors 218 in order to align the distal end 206 with the intended path.

[0059] In an embodiment, a surgical procedure involving the insertion of the flexible part 204 may be performed by extending the distal end 206 into a cavity or lumen of a patient. The flexible part 204 may be extended from the home position into and through the cavity or lumen by the Z-motor 212. During the surgical procedure, when the flexible part 204 has been retracted completely from the housing 202, a physical barrier 220 may be detachably attached to the housing 202. The physical barrier may be, but is not limited to, in the form of a sleeve detachably attached to the housing which physically separates the flexible part from the housing. In an embodiment, the physical barrier may be disposable or reusable. In an embodiment, the physical barrier may be, but is not limited to, a disposable drape or a protective cover (not shown) covering the flexible part 204. In an embodiment, the protective cover may form a barrier between the flexible part and the body fluids and is made of a material which may be flexible, waterproof, impermeable and transparent. The purpose of the single use detachable barrier 220 and the disposable sleeve or disposable drape is to provide additional layers to prevent contamination of the robotic unit 104. The Z-motor 212 may be actuated to retract the flexible part 204 and the distal end 206 in a manner that the flexible part 204 is in contact with the disposable barrier only. In an embodiment, a user may place or attach the barrier 220 onto the rack 208 slot around the circumference or perimeter of the housing 202. The barrier 220 would either snap into place or be locked down with clamps or screws onto the circumference of the housing 202, or can be attached by a method that will be known to a person with skill in the art. In an embodiment, the flexible part 204 may have a cross-sectional shape that is, but is not limited to, circular, oval, square, rectangular, etc. In an embodiment, the barrier 220 may be made of PVC, plastic, rubber or a waterproof flexible material known in the art.

[0060] In an embodiment, the motor block 214 may have a male guide rail (not shown) that can decrease the friction when sliding over the female guide rail on the rack 208, or vice versa. It is important that the motor block 214 and the X and Y motors 218 move simultaneously, because this allows the X and Y motors 218 to travel along the circumference or perimeter of the housing 202. The X and Y motors 218 will need to travel so that the length of an angulation cable (not shown) remains constant. This allows for simplified movement of the distal end 206 in three dimensions. A person of skill in the art may also realize that other three-dimensional coordinate schemes such as radial, polar, cylindrical, and spherical can be used in substitution of the x, y, and z coordinates described herein.

[0061] FIG. 2b illustrates the distal end 206 of the robotic unit 104, in accordance with an embodiment of the present disclosure. In an embodiment, the distal end 206, which is attached to the distal end of the flexible part 204, may include an instrument channel 222, an image sensor 224, an air/water channel 226, a visible, UV or IR light source 228, and other sensor(s) 230.

[0062] In an exemplary embodiment of the present invention, the flexible part 204 may contain a shape sensor (not shown). The shape sensor may provide data related to real time position, orientation, speed, velocity, pose, and/or shape at the distal tip and/or the flexible part 204. The shape sensor may include one or more optical fiber(s) along the longitudinal axis of the flexible part 204, where it can be inserted via an instrument port, or mounted externally, or can be inside the lumen of the flexible part 204 temporarily or permanently. The one or more optical fiber(s) may be single core or multicore. In an alternate embodiment, the one or more optical fiber(s) may include Fiber Bragg Gratings (FBG) which may provide data related to the strain along the length of the flexible part. In some embodiments, the same fiber(s) can be used to provide the strain data and connect the imaging sensor to its optimum position as described above. The fiber optic shape sensor can also be a data source, and the control unit 102 may use machine learning models to predict and anticipate variable anatomy including the formation of different kinds of loops during the procedure. During a surgical procedure such as, but not limited to, colonoscopy, these loops may include, but are not limited to, alpha loops, reverse alpha loops, transverse loops, n loops, gamma loops, etc. As the risk of loop formation is predicted before the loop is formed, adequate measures can be taken to avoid formation of these loops. If the loop is still formed, several methods such as manual maneuvers, an overtube, shape locking from flexible to rigid, a motorized pull mechanism (not shown) from the distal end 206 of the flexible part 204, magnetic endoscopic imaging, etc. may be used, as known to a person of skill in the art.

[0063] The distal end 206 of the flexible part 204 may contain one or more opening(s) connected to one or more of the following: intraluminal channel(s) for instrumentation, a monochromatic (single wavelength or narrow band) or polychromatic radiation (multiple wavelengths at the same time) source ranging from the ultraviolet to the far infrared spectrum, one or more image sensor(s) of the imaging unit 112, one or more connections to the image sensor(s), and openings to the channels for suction-irrigation, water-jet, insufflation, etc. In alternate embodiments, the distal end may have an additional ultrasound transducer, a monochromatic (single wavelength or narrow band) or polychromatic radiation (multiple wavelengths at the same time) source ranging from the ultraviolet to the far infrared spectrum, or other imaging components or devices known to a person of skill in the art.

[0064] FIG. 2c and FIG. 2d illustrate a top view and a bottom view, respectively, of a cover enclosing the robotic unit 104 discussed above. FIG. 2c shows a top view of a cover 232, which may be provided in the shape of a circular disc that encloses the housing 202 and the robotic unit 104. The housing comprises a U-shaped opening ridge 234 through which the distal end 206 and the flexible part 204 of the surgical device, such as an endoscope, extend outwards from the housing 202. FIG. 2d shows a bottom view of the cover 236 which contains the robotic unit 104 and the housing 202. In an embodiment, the cover 232 may include plugins and slots for a power cable and/or a battery compartment. A person of skill in the art will know that the housing 202 and the cover 232 can be of any three-dimensional shape capable of enclosing the robotic unit 104.

[0065] FIG. 3 illustrates a close-up view of the actuation unit 114 of a robotic unit 104 comprising a rack and pinion configuration, in accordance with an embodiment of the present disclosure. The pinion 302 is a gear with its teeth intermeshed with the teeth of the rack 208, where the pinion 302 traverses the rack 208 provided on the circumference or perimeter of the housing 202. In an embodiment, the X and Y motors 218 are associated with angulation cables 304 using a mechanism such as, but not limited to, a chain and sprocket mechanism. In an embodiment, the Z-motor 212 and the X and Y motors 218 may be associated with the flexible part 204 using other mechanisms. A limit switch 306 may be provided for physically limiting the rotary motion of the X and Y motors 218 in the x-coordinate and/or the y-coordinate.

[0066] FIG. 4a illustrates a robotic unit 400, in accordance with an alternate embodiment of the present disclosure. As shown in FIG. 4a, a hub 402 holds the flexible part 204 comprising a distal end 206, and a detachable barrier 420 is provided on the circumference or perimeter of the hub 402. A disposable sleeve or a disposable drape (not shown) may be applied on top of the flexible part 204. The purpose of the detachable barrier 420 is similar to that of the single use detachable barrier 220 in the earlier embodiment. The disposable sleeve or disposable drape provides additional layers to prevent contamination of the enclosures of the robotic unit 400. The actuation unit 114 comprises a Z-motor 408 connected to a driving wheel 406. A follower wheel 410 is moveably connected to the driving wheel 406 through a belt or a chain 412 in a pulley or a sprocket wheel arrangement. The method of connection between a follower wheel 410 and a driving wheel 406 will be known to a person with skill in the art. In an embodiment, the driving wheel 406 and the follower wheel 410 may include sprocket wheels or gears rotatably attached to each other through a chain to form a chain drive. The actuation of the Z-motor 408 results in the movement of the driving wheel 406. The driving wheel 406 is connected to a center of the hub 402, so when the driving wheel 406 rotates, the hub 402 rotates. The hub 402 may rotate in a clockwise or anti-clockwise manner to extend or retract the flexible part 204, or vice versa, through one or more serrated guide wheels 414, thus converting the rotational movement of the hub 402 to a linear movement of the flexible part 204. The flexible part 204 may be compressed between the serrated guide wheel 414 and a free-running wheel (not shown). The free-running wheel may have very low friction and rotates only when there is a torque or force acting upon it. It may also contain serrations to increase the pressure on the flexible part 204. When the follower wheel 410 rotates, the serrated guide wheel 414 pushes out or retracts the flexible part 204 from or into the hub 402. As the follower wheel 410 rotates counterclockwise, the serrated guide wheel 414 will rotate clockwise, and this motion will allow the distal end 206 to move forward. It is important that the hub 402 and the follower wheel 410 rotate simultaneously. Further, the X and Y motors 416 also travel along the circumference or perimeter of the hub 402 and are attached to the proximal end of the flexible part 204. The X and Y motors 416 may move so that the length of an angulation cable (not shown) remains constant. The Z-motor 408 and the X and Y motors 416 together allow a three-dimensional movement of the distal end 206. In an embodiment, the serrated guide wheel 414 may be driven directly by a separate motor (not shown). A person of skill in the art will know that the hub 402 and the cover 422 can be of any three-dimensional shape.

[0067] Alternatively, there are a number of different interconnecting arrangements of the driving wheel 406 and the follower wheel 410. The driving wheel 406 and the follower wheel 410 may be of the same size, rotatably connected by a belt 412. In an embodiment, the driving wheel 406 and the follower wheel 410 can have different sizes to increase or decrease the relative speed of rotation of the follower wheel 410. In an alternate embodiment, the follower wheel 410 may be replaced by a Z-motor 408, thus removing the requirement of a belt 412.

[0068] In a working embodiment, the flexible part 204 may move forward until the proximal end reaches a certain distance away from the follower wheel 410. The hub 402 may include one or more ridge(s) or groove(s) 418 on the circumference or perimeter onto which the flexible part 204 is wound. In an embodiment, a detachable barrier 420 is attached to the ridge 418. Once a surgical procedure is over, the flexible part 204 is retracted into the hub 402 by inserting it into the barrier 420. In an embodiment, the barrier 420, including the flexible part 204 and the distal end 206, may be detached from the hub 402. The flexible part 204 and the distal end 206 may be sterilized by removing them from the detachable barrier 420, which may then be discarded. The distal end 206 would retract and slide on top of the detachable barrier 420. Once the distal end 206 reaches its home position, the procedure can start.

[0069] FIG. 4b illustrates a front perspective view of a cover, in accordance with an alternate embodiment of the present disclosure. The cover 422 is a circular disc enclosing the robotic unit 104 and the hub 402. The cover 422 comprises an opening 424 from which the distal end 206 and the flexible part 204 come out during operation of the robotic unit 104.

[0070] FIG. 5 illustrates a detachable robotic unit 104, in accordance with an alternate embodiment 500 of the present disclosure. The detachable robotic unit 104 of the current embodiment comprises a first portion 504 and a second portion 506. The first portion 504 comprises a hub 502 which is attached to a left base 501 on the left side and to a right base 503 on the right side, or vice versa. The hub 502 may be a spool-shaped housing which may have the flexible part 204 wound around it. The second portion 506 is a drive and guide unit for the flexible part and comprises a self-reversing screw 526, a lead screw nut 518, and a driver wheel 508 which may be attached to a driver wheel motor 510. In an embodiment, a guide wheel 514 that may be serrated is also provided. The driver wheel motor 510 may be attached to the first portion 504. The Z-motor 522 may be the same as the driver wheel motor 510. In order to drive the distal end 206 forward, a Z-motor 522 will rotate the serrated guide wheel 514, and simultaneously the driver wheel 508 may be rotated by the driver wheel motor 510, which may rotate the driven wheel 512. The driver wheel 508 and the driven wheel 512 may be gears coupled together with a gear ratio of 1:1. The types of connections, such as gears, belts, chains, etc., and the gear ratios between the driver wheel 508 and the driven wheel 512 will be known to a person with skill in the art. The movement of the driven wheel 512 may unwind the flexible part 204 and may push the distal end 206 outwards through the serrated guide wheel 514. The flexible part 204 may be aligned by the distal end guide 524 that is attached on top of a lead screw nut 518. The flexible part 204 may be compressed between the serrated guide wheel 514 and a guide wheel 520. The guide wheel 520 may be a very low friction wheel and may rotate if there is any torque or force acting upon it. The serrated guide wheel 514 and the guide wheel 520 may contain serrations to increase the pressure on the flexible part 204. When the flexible part 204 is to be inserted or retracted, the Z-motor 522 may rotate the serrated guide wheel 514. The driver wheel motor 510 may rotate the driver wheel 508. The driver wheel 508 may rotate a driven wheel 512 adjacent to it and may have a gear ratio of 1:1. The driven wheel 512 may rotate the spool-shaped first portion 504 to wind the flexible part 204 back onto the first portion 504. During this action, the self-reversing screw 526 will freely rotate without its ends moving. This may cause the lead screw nut 518 to move along the same axis as the self-reversing screw 526. The lead screw nut 518 may have threads matching the self-reversing screw 526 and may be constrained on the linear rails 516 on opposite ends. This may ensure the lead screw nut 518 does not rotate. The lead screw nut 518 may move at the same pitch at which the flexible part 204 winds back onto the hub 502 and may go back and forth on the self-reversing screw 526. This may ensure that the flexible part 204 is wound back on the hub 502 as it was, facilitating repeatability and preventing entanglement. A disposable sleeve or a disposable drape (not shown) may be applied on top of the flexible part 204. The disposable sleeve or disposable drape provides additional layers to prevent contamination of the enclosures of the robotic unit of the alternate embodiment 500.

[0071] FIG. 6a, FIG. 6b and FIG. 6c illustrate the bending portion comprising the robotic actuation mechanism of the robotic unit 104, in accordance with an embodiment of the present disclosure. The bending portion 600 is at the distal end of the flexible part 204, just before the distal end 206. The bending portion 600 comprises multiple independent vertebrae 602, 604 stacked over each other which may be connected by rivets 606 as shown in FIG. 6b. The distal part of the bending section 608 may hold the distal end 206 which may include various sensors, the camera housing (not shown), etc. The vertebrae 602, 604 can be connected in such an arrangement as to allow partial and/or complete rotational motion of each vertebra 602, 604 independently about the rivets 606. The rotational motion of each vertebra 602, 604 can enable bending of the bending portion 600. The vertebrae 602, 604 can be connected to each other and may have eye loop(s) 612 to allow the angulation cables to pass through them, where one end of the cable(s) can be connected to the vertebra 604 at the most distal end of the bending portion 600. The joint connection end of each vertebra 602, 604 may include rounded corners 610 which may distribute the stress upon bending and reduce the load from the angulation cables. The vertebrae 602, 604 may further comprise at least one eye loop 612 arranged on an inner periphery of the circumference of each vertebra 602, 604. The cable(s) from the actuation unit 114 may pass through the eye loop(s) 612 to reach the point of connection at the distal end vertebra 604. The eye loop(s) 612 may form a clover-shaped cross section, allowing the eye loop(s) 612 to remain straight throughout each independent vertebra 602, 604 and maximizing the open internal volume of the vertebrae 602, 604. Having the eye loop(s) 612 of adjacent vertebrae 602, 604 directly aligned with each other and away from the rivets 606 may allow a smoother transition while bending, and may reduce the tension load of the cable(s). Alternatively, a mesh, a combination of the above-described configuration with a mesh, or other feasible arrangements known to the person skilled in the art can be employed to achieve the same purpose.

[0072] FIG. 6d illustrates the connection of the bending portion 600 with the distal end 206, in accordance with an embodiment of the present disclosure. The vertebrae 602, 604 are connected to the distal end 206 at the point of connection 614.

[0073] FIG. 6e illustrates the angulation wires in the robotic unit, in accordance with an embodiment of the current disclosure. As shown in FIG. 6e, angulation wires 616 can be seen passing through the flexible part 204 and further into the bending portion 600. In an embodiment, there may be four angulation wires placed at 90 degrees around the circumference of the robotic unit 104 and connected to the X and Y motors of the actuation unit 114 of the robotic unit 104. The movement of the angulation wires 616 may be controlled by the X and Y motors in order to move the distal end in the X and Y directions.

[0074] FIG. 7 illustrates a cross-sectional view of the flexible part 204 of the robotic unit, in accordance with an embodiment of the present disclosure. As shown in FIG. 7, four angulation wires 616 can be seen arranged at 90 degrees from each other around the circumference of the flexible part 204. The flexible part 204 may comprise an outermost layer 702 made of any waterproof material. Next to the outermost layer 702, a steel wire mesh 704 may be present to improve the tensile strength of the flexible part 204. The flexible part 204 may enclose an optical fiber 706 which may be connected to a light source such as an LED to illuminate the interior anatomy in order for the imaging unit 112 to capture images with appropriate luminescence. The flexible part 204 may further include signal wires 712 for the transmission of signals from the one or more sensors and a wire 716 for varying the thickness of the flexible part 204. Further, the flexible part 204 may comprise a water jet channel 708 for irrigating an area to provide clear passage for the distal end 206 to proceed in a cavity or lumen. Further, the flexible part 204 may include an air channel 710.

[0075] Any collapsed passage in the cavity or lumen may be insufflated with a gas such as CO2 gas, which may provide a better view of an otherwise collapsed cavity or lumen. In an embodiment, the data received from a pressure sensor (not shown) may be combined with the data received from an imaging sensor to create a closed loop insufflation system that automatically achieves a clear view of the cavity or lumen. The air channel 710 may be used for insufflation of an area by discharging a gas such as CO2 gas. A pressure sensor (not shown) may be placed at a tip of the distal end 206, or anywhere along the length of the air channel 710 including the portions outside the flexible part 204. A pressure sensor may provide digital or analog input to the robotic unit 104 or the control unit 102. The pressure of the insufflating gas may be monitored in real time based on pressure sensors, and the amount of insufflating gas discharged may be controlled so that the pressure inside the cavity or lumen does not exceed or go below pre-defined threshold pressure levels. Therefore, any change in pressure levels may be tracked through data received from the pressure sensor. The control unit 102 may utilize machine learning models along with the data received from the data source(s) to identify ideal insufflation levels, which may be achieved when the data from the data source is within the pre-defined threshold pressure levels. The pre-defined threshold pressure levels may be determined based on pre-trained datasets. Once adequate insufflation is achieved, the information would be sent to the control unit 102, which will then send signals to automatically pause the further flow of the insufflating gases, hence creating a closed-loop insufflation system. Once adequate insufflation is achieved, it is maintained at that pressure level; the pressure may be automatically increased or decreased as determined through the closed feedback loop. Throughout the process, the real time pressure information is displayed on the user interface of the input/output device 106. Further, the user interface of the input/output device 106 may provide the user 118 the ability to manually override the closed feedback loop output and manually control the pressure. In an embodiment, manual overriding may elicit auditory or visual feedback asking the user to confirm the override.
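A minimal sketch of this pressure feedback loop, written as a simple threshold (bang-bang) controller in Python, is shown below. The threshold values, units, valve commands, and function name are illustrative assumptions only; the disclosure itself leaves the pre-defined thresholds to pre-trained datasets.

```python
# Assumed pre-defined threshold pressure levels (illustrative values only)
P_MIN_MMHG, P_MAX_MMHG = 8.0, 15.0

def insufflation_step(pressure_mmhg: float) -> str:
    """Return a valve command from one pressure-sensor reading."""
    if pressure_mmhg < P_MIN_MMHG:
        return "OPEN_VALVE"     # under-inflated: admit more CO2
    if pressure_mmhg > P_MAX_MMHG:
        return "CLOSE_VALVE"    # over-inflated: pause the gas flow
    return "HOLD"               # adequate insufflation: maintain level

# Sampled continuously, this forms the closed feedback loop; a manual
# override from the user interface would simply supersede the command.
for reading in (6.5, 12.0, 17.2):
    print(reading, insufflation_step(reading))
```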

[0076] The processors 108 of the control unit 102 of the closed loop navigation and control system 100 can utilize machine learning models along with the data received from the data source(s) or saved in the memory 110 to recognize structures relevant to the cavity of the patient, predict an intended path, and generate and communicate control signals to the actuation unit 114 to actuate the three-dimensional movement of the invasive medical device of the robotic unit 104. The prediction of the intended path of navigation and the recognition of structures relevant to the cavity or lumen can be performed by the control unit 102 by utilizing a machine learning model along with data communicated from the imaging unit 112 and the sensor unit 116. The machine learning model is a part of a computer vision software developed by training one or more neural networks over a labeled dataset of images, where the labeled dataset of images is built by converting a collection of procedure videos into image files and labeling anatomical structures on the image files. In an alternative embodiment, the machine learning model generation involves receiving or collecting training data in the form of predetermined datasets to train at least one neural network. The predetermined datasets can be, but are not limited to, all the data source(s) described above. To enable smooth real time continuous tracking and navigation of the automated closed loop navigation and control system 100, the machine learning models may be optimized for faster execution on a single board compute platform.
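The dataset-building step described above (converting procedure videos into image files for labeling) can be sketched in a few lines. The example below uses OpenCV purely as an illustrative tool; the sampling interval and function name are assumptions, not part of the disclosure.

```python
import cv2  # OpenCV, used here only as an illustrative frame-extraction tool

def video_to_frames(video_path: str, every_n: int = 30) -> list:
    """Convert a procedure video into image frames; each retained frame
    would later be labeled with anatomical structures by an annotator and
    used to train the neural network(s) described above."""
    cap = cv2.VideoCapture(video_path)
    frames, i = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if i % every_n == 0:   # keep roughly one frame per second at 30 fps
            frames.append(frame)
        i += 1
    cap.release()
    return frames
```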

[0077] The control unit 102 may predict at least one new intended path. Once the distal end 206 of the flexible part 204 has reached a first position, the control unit 102 may generate or determine a second position along the intended path. The control unit 102 can continuously generate new positions for the distal end 206 of the flexible part 204 along the intended path based on the data received from at least one imaging sensor of the imaging unit 112. The control unit 102 utilizes a machine learning model to compare the data received from the imaging unit 112 regarding the actual movement of the distal end 206 of the flexible part 204 to the intended movement of the distal end 206.

[0078] In the automated closed loop navigation and control system 100, an initial preprogrammed calibration sequence of the motors and the imaging unit or sensor unit(s) may be performed. The calibration sequence can be used to construct a Jacobian matrix. In an embodiment of the automated closed loop navigation and control system 100, a visual servoing method may be used, wherein inverse kinematics moves the robot joints based on the change in the position of the desired target in the image coordinate frame. A pseudo-inverse of the Jacobian matrix is used to implement the visual servoing. In other embodiments, a reference point minima method may be used, wherein a gradient descent approach is utilized in response to an imaginary potential field between the current position and the desired position in the image coordinate frame. A desired position on the image coordinate frame may be provided with the objective of moving the captured image of the current view of the robot's tool to the desired position by moving the robot joints. This may be performed by direct mapping of the robot's coordinates to the image coordinates. In other embodiments, LASER-guided navigation with a triangulation method may be used to navigate within the cavity or lumen. In other embodiments, force field tracking may be used, where an actual force field such as (but not limited to) a mechanical or magnetic field guides the robot along the desired path. In any of the embodiments, a form of proportional-integral-derivative controller may be used to smooth the tracking and the approach to the desired position.
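A minimal sketch of the Jacobian pseudo-inverse visual-servoing step is shown below, assuming a calibrated Jacobian J that maps joint increments to image-plane displacements. The gain, matrix values, and names are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def servo_step(J: np.ndarray, target_px: np.ndarray, current_px: np.ndarray,
               gain: float = 0.5) -> np.ndarray:
    """One visual-servoing update: joint increments that move the imaged
    feature toward the desired position in the image coordinate frame."""
    image_error = target_px - current_px           # error in pixels
    return gain * np.linalg.pinv(J) @ image_error  # pseudo-inverse of J

# Example with a 2x2 Jacobian (two image axes, two joints: X and Y motors);
# the entries (pixels per unit joint motion) are made-up calibration values.
J = np.array([[12.0, 0.5],
              [0.3, 11.0]])
dq = servo_step(J, np.array([320.0, 240.0]), np.array([300.0, 260.0]))
print(dq)  # joint increments for the X and Y motors
```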

[0079] FIG. 8 illustrates exemplary functional components of the proposed closed loop navigation and control system 100, in accordance with an embodiment of the present disclosure. The one or more processors 108 of FIG. 1 may enable various processing engines 800 such as a data receiving engine 802, an image processing engine 804, an object detection engine 806, a navigation and collision control engine 808, a user interface engine 810 and other engines 812.

[0080] The data receiving engine 802 is configured to receive data from data sources. In an embodiment, the data sources from which the data receiving engine 802 may receive data include the imaging unit 112, the actuation unit 114, the sensor unit 116, the input/output device 106 and the memory 110.

[0081] The image processing engine 804 may utilize the data received from an imaging sensor provided in the imaging unit 112, and the processed data outputted from the image processing engine 804 may be displayed on an output user interface of the input/output device 106 to provide a view of the lumen or cavity of the patient to a user. In an embodiment, the view may be a two-dimensional or a three-dimensional panoramic view of the lumen or cavity of the patient. In an embodiment, the view may be magnified.

[0082] Additionally, the object detection engine 806 may recognize structures in the views created by the image processing engine 804. The views generated may include images which may be overlaid in the form of a virtual envelope over the data received from the imaging sensor on the user interface for effective visual guidance to the user.

[0083] During an invasive surgical procedure, a surgeon may accidentally injure one or more anatomical structures such as, but not limited to, the ureter during hysterectomy, the common bile duct during cholecystectomy, and the iliac vessels and rectum during prostatectomy, especially during their learning curve. The object detection engine 806 may detect such anatomical structures, and the navigation and collision control engine 808 may create a virtual envelope around such anatomical structures depicting that such structures are to be considered impenetrable and unapproachable, thus protecting them from being accidentally damaged during the surgery.

[0084] In an embodiment, the detection of the anatomical structure may ensure the determination of a control output which is subject to non-intersection or non-collision position constraints.

[0085] The navigation and collision control engine 808 may provide collision avoidance, which may be utilized to determine navigation controls and the virtual envelopes around the detected anatomical structure depicting the anatomical structure as a no-fly zone. As the distal end 206 of the flexible part 204 of the robotic unit 104 approaches these anatomical structures, it may be subject to a repulsive field F. This may be accounted for by sampling at 60 Hz to allow computation time for image processing to generate visual and simulated tactile feedback. The repulsive force F may be proportional to both the closing-in velocity of the distal end 206 towards the virtual envelope and the reciprocal of the distance between the virtual envelope and the distal end 206, such that F ∝ v/r, where v is the velocity of the distal end 206 and r is the distance between the virtual envelope and the distal end 206.
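The relation F ∝ v/r can be sketched directly in code. Only the proportionality itself comes from the description above; the vector form, the proportionality constant k, and the small epsilon guard are implementation assumptions added for the example.

```python
import numpy as np

EPS = 1e-6  # avoid division by zero at contact; an implementation choice

def repulsive_force(tip_pos, tip_vel, envelope_point, k: float = 1.0):
    """F ∝ v/r: proportional to the closing-in velocity toward the virtual
    envelope and to the reciprocal of the distance to it. In the scheme
    above this would be evaluated at the 60 Hz sampling rate."""
    tip_pos, tip_vel, envelope_point = map(np.asarray, (tip_pos, tip_vel, envelope_point))
    offset = tip_pos - envelope_point
    r = np.linalg.norm(offset)
    # closing-in velocity: component of tip velocity toward the envelope
    v_close = max(0.0, -np.dot(tip_vel, offset / (r + EPS)))
    return k * v_close / (r + EPS) * (offset / (r + EPS))  # points away

# A tip 2 mm away, closing at 5 mm/s, is pushed back along +z
print(repulsive_force([0, 0, 2], [0, 0, -5], [0, 0, 0]))
```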

[0086] Based on this principle, an algorithm implemented by the navigation and collision control engine 808 will be configured, and its effectiveness may be tested in a virtual testing environment of a surgical workspace created using tools such as, but not limited to, MATLAB (MathWorks, Natick, MA). In an embodiment, the virtual testing environment of the surgical workspace may be designed, and the algorithm implemented by the navigation and collision control engine 808 may be tested in that virtual testing environment using a virtual robotic unit which is a simulated version of the robotic unit 104. Based on the determined efficiency of the algorithm, the same may be integrated and implemented in the navigation and collision control engine 808. In an embodiment, the closed loop navigation and control system 100 may be tested on a non-living model and an animal model, and based on its success it may be rendered fit to be used on a human subject.

[0087] An obstacle can be detected using stereo vision, depth or distance sensing. In the absence of stereo vision, the object view magnification, in terms of its expanse on the image, can be correlated to the distance to the object, which can be used to approximate collision detection and avoidance. In an embodiment, the object detection engine 806 may utilize machine learning models to process the data received from the data source(s) to recognize structures relevant to the cavity or lumen of the patient, or a pre-trained model can identify the obstacles and determine no-fly zones. The navigation and collision control engine 808 may utilize machine learning models to process the data received from the data source(s) to predict and navigate along an intended path avoiding the recognized structures. The training of the machine learning model may involve receiving or collecting training data in the form of predetermined datasets to train at least one neural network. A form of this neural network may be, but is not limited to, an edge-implemented deep neural net-based object detector which is well known in the art. Other forms of machine learning models other than neural networks may be utilized, as would be well known to a person of skill in the art. The predetermined datasets can include, but are not limited to, images and videos, photon count, temperature, position, distance, humidity, gas levels, fluid or enzyme levels, motility studies, pressure, force, etc.
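As a hedged illustration of the magnification-to-distance correlation, the following pinhole-camera sketch (similar triangles) approximates distance from an object's expanse on the image. The focal length, assumed object width, and no-fly threshold are made-up example values, not calibration data from the disclosure.

```python
FOCAL_PX = 600.0        # assumed focal length in pixels
OBJECT_WIDTH_MM = 10.0  # assumed known/typical width of the obstacle

def approx_distance_mm(apparent_width_px: float) -> float:
    """Larger expanse on the image means a closer obstacle: d = f * W / w."""
    return FOCAL_PX * OBJECT_WIDTH_MM / apparent_width_px

def collision_risk(apparent_width_px: float, no_fly_mm: float = 25.0) -> bool:
    """Flag the no-fly zone when the approximated distance falls below it."""
    return approx_distance_mm(apparent_width_px) < no_fly_mm

print(approx_distance_mm(120.0), collision_risk(300.0))  # 50.0 mm, True
```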

[0088] The navigation and collision control engine 808 may generate and communicate control signals for the actuation unit 114 to actuate the three-dimensional movement of the robotic unit 104.

[0089] The navigation and collision control engine 808 may determine the intended path along which the robotic unit 104 may be guided. The intended path may be determined based on the output from the object detection engine 806 and the data received from the sensor unit 116. The processing circuitry may also utilize at least one pre-trained machine learning model to recognize anatomical structures and the intended path using the data received by the data receiving engine 802.

[0090] In an embodiment, the machine learning model may be a computer vision software utilized by the various processing engines 802-812. The computer vision software may be developed by training one or more neural networks over a labeled dataset of images, where the labeled dataset of images is built by converting a collection of procedural videos into image files and labeling anatomical structures on the image files. In an alternative embodiment, the machine learning model generation involves receiving or collecting training data in the form of predetermined datasets to train at least one neural network. The predetermined datasets may also include, but are not limited to, any of the data sources described above.

[0091] The user interface engine 810 may provide an interactive user interface on the input/output device 106 and may comprise a touch-sensitive display. In an embodiment, the display may display a view of the patient's internal anatomy with an overlay of the intended path and/or the recognized VS to provide effective visual guidance through the interactive visual interface for the user 118. In an embodiment, the virtual envelope may be depicted as an overlay over the VS for the user 118 to identify the VS.

[0092] In an embodiment, when the distal end 206 reaches a point of intersection of a passage where the passage may branch into several other branches, the user may, by using one or more controls provided by the user interface engine 810, provide the control signal to the actuation unit 114 to select the intended path to be traversed by the distal end 206. In an embodiment, the one or more controls may be provided in the form of augmented reality and/or any other form which may provide the user 118 effective visual guidance. In an embodiment, the user interface engine 810 may display a pre-determined visual indication if multiple intended paths are detected. The user 118 may select an intended path of choice using an appropriate visual indication. In an embodiment, the user interface engine 810 may display multiple trajectories from which the user 118 may select an intended path. Throughout this process, the information is projected on the display, and the user 118 may manually override the control generated by the navigation and collision control engine 808 and the control unit 102. In an embodiment, the manual overriding may also elicit auditory or visual feedback asking the user to confirm the override in the form of an alarm or a notification. The hierarchy between manual control and automated control may be predefined as default settings based on the training data and may be changed as per requirement.

[0093] In an embodiment, the user interface engine 810 may provide real time vital information of the patient such as, but not limited to, pulse and heart rate, temperature, and blood pressure, and other laboratory results such as, but not limited to, blood gas levels, glucose levels, the detected air pressure inside the lumen or cavity and other results that a person skilled in the art will know.

[0094] In another embodiment, the user interface engine 810 may provide control signals to the actuation unit 114 to actuate the three-dimensional movement of the robotic unit 104 manually using a plurality of buttons such as, but not limited to, up, down, left, right, insert and retract. In an embodiment, the plurality of buttons may enable the actuation of the actuation unit 114 by providing an angular input, such as to move the distal end 206 at a 30-degree angle in the top-right direction. In an embodiment, the plurality of buttons may be provided as touch buttons arranged on the user interface to provide a manual mode of actuation if required by a user 118. The plurality of buttons can also be used by the user to override the automated actuation of the distal end 206 if the user is not satisfied with the intended path determined by the control unit 102.
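An angular input such as "30 degrees in the top-right direction" can be decomposed into components for the X and Y motors, as in the short sketch below. The angle convention, magnitude scaling, and function name are assumptions for illustration only.

```python
import math

def angle_to_xy(angle_deg: float, magnitude: float = 1.0):
    """Map a joystick-style angle (0 deg = right, 90 deg = up) onto the two
    bending axes driven by the X and Y motors."""
    x = magnitude * math.cos(math.radians(angle_deg))
    y = magnitude * math.sin(math.radians(angle_deg))
    return x, y

print(angle_to_xy(30.0))  # (~0.866, 0.5): mostly X, some Y -> top-right
```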

[0095] In an embodiment, the other engine(s) 812 may supplement the functionalities of other modules or the closed loop navigation and control system 100 as required.

[0096] FIG. 9 illustrates an exemplary implementation scenario of the closed loop navigation and control system 100, in accordance with an embodiment of the present disclosure. The exemplary scenario 900 shows a tower 902, a display 904, a user interface 906 and a cart 908 placed near a patient bed 910. In an embodiment, the tower 902 may comprise a display 904. The user interface 906 can be a touch-enabled tablet that may serve as the input/output device 106. The cart 908 may include one or more robotic arms (not shown) and the robotic unit 104 movably coupled to the one or more robotic arms.

[0097] In an embodiment, the tower 902 may further comprise insufflation equipment (not shown) comprising gas cylinders 912 and an outlet tube to connect the gas cylinders 912 to the robotic unit 104. The tower 902 may include a suction vessel (not shown) to contain the sucked-in material during suction. Further, the suction vessel (not shown) may be connected to the suction channel 714 of the flexible part 204. Further, the tower 902 may include a fluid source (not shown) which may act as a fluid source during irrigation and may be connected to the water jet channel 708 of the flexible part 204. The tower 902 may include various other operative energy systems known to the person skilled in the art to power the electrical system in the tower. In an embodiment, the tower 902, the cart 908, the robotic unit 104 and the tablet 906 may have their separate power sources in the form of a battery or a power cable, or may be powered by a common power source shared between each of them through electrical cables or an appropriate means known in the art. The cart 908 and the tower 902 may be configured to stand free on the floor and may include wheels underneath in order for them to be positioned as required. In an alternate embodiment, the cart 908 may be attached to the patient bed 910. In alternate embodiments, the cart 908 can be attached to the ceiling or walls of the operating suite.

[0098] In an embodiment, the cart 908 may contain a physical attachment and communications interface (as shown in FIG. 10, described later) with the robotic unit 104. In an embodiment, the tower 902 and the cart 908 may include wheels to be movably placed near the user 118 and the patient in order for the user 118 to monitor the patient effectively and easily.

[0099] Each of the tablet 906, the cart 908, the robotic unit 104 and the tower 902 may be connected to each other via a wired or a wireless connection. The tower 902 may also contain one or more additional displays 904 which may also be touch-enabled.

[00100] FIG. 10 illustrates a bush coupling mechanism used in the robotic unit, in accordance with an embodiment of the present disclosure. In an embodiment, the bush coupling may be used to connect the one or more robotic arms (not shown) of the cart 908 to one or more robotic units 104. In an embodiment, the robotic arms may be a type of mechanical arm that may be controlled by the control unit 102 and may function similarly to a human arm. The robotic arm may include links of such a manipulator which may be connected by joints allowing either rotational motion (such as in an articulated robot) or translational (linear) displacement. The links of the manipulator can be considered to form a kinematic chain. The terminus of the kinematic chain of the manipulator may be called the end effector and may work analogously to the human hand. The robotic unit 104 can be connected at the end or at any point along the robotic arm.

[00101] In an embodiment, the driver flange 1002 may be attached to the Z-motor of the actuation unit 114 of the robotic unit 104. The driven flange 1004 may be attached to the one or more robotic arms (not shown) provided on the cart 908. In an embodiment, the driver flange 1002 may be connected to the driven flange 1004 through a connection means such as an electromagnetic connection, or nuts and bolts passing through the connection notches 1006 and 1008, etc.

[00102] FIG. 11 illustrates examples of some possible positions of the robotic unit 104 and the cart 908 around a patient 1108. This illustration is not an exclusive list of all possible procedures or positions around a patient 1108. In an embodiment, the cart 908 may be placed at a position 1102 with respect to a patient 1108 undergoing a procedure through the mouth or nose. In another embodiment, the cart 908 may be placed at a position 1104 with respect to a patient 1108 undergoing a transthoracic procedure or a procedure through the mouth or nose. In an embodiment, the cart 908 may be placed at a position 1106 or 1110 with respect to a patient 1108 undergoing a procedure through the urethra, vagina or anus.

[00103] It is intended that the disclosure and examples be considered as exemplary only, with a true scope of disclosed embodiments being indicated by the following claims.