Title:
MEDICAL INTERVENTION CONTROL BASED ON DEVICE TYPE IDENTIFICATION
Document Type and Number:
WIPO Patent Application WO/2022/048984
Kind Code:
A1
Abstract:
Embodiments of the present disclosure encompass a medical intervention control system for improving a medical intervention incorporating an interventional instrument (10) positionable over a guidewire (30). In operation, an adaptive intervention controller (70) derives an identification of the device type of the interventional instrument (10) responsive to receiving shape sensing data generated by the guidewire (30) when the interventional instrument (10) is positioned partly or entirely over the guidewire (30). In accordance with the identification of the device type of the interventional instrument (10), the adaptive intervention controller (70) further adapts an operational control of a medical imaging apparatus (41) by a medical imaging controller (42) for the medical imaging of the interventional instrument (10) and/or an operational control of a medical robot apparatus (91) by a medical robot controller (92) for a robotic navigation of the interventional instrument (10).

Inventors:
BYDLON TORRE (NL)
FLEXMAN MOLLY (NL)
Application Number:
PCT/EP2021/073592
Publication Date:
March 10, 2022
Filing Date:
August 26, 2021
Assignee:
KONINKLIJKE PHILIPS NV (NL)
International Classes:
A61B90/90; A61B34/00; A61B34/20; A61B34/30; A61B90/00; G06T5/00; G06T7/00
Domestic Patent References:
WO2021074437A12021-04-22
WO2017055620A12017-04-06
Foreign References:
US20130216025A12013-08-22
US20180264227A12018-09-20
US10687909B22020-06-23
US20200253668A12020-08-13
US7289652B22007-10-30
Attorney, Agent or Firm:
PHILIPS INTELLECTUAL PROPERTY & STANDARDS (NL)
Claims:

1. A medical intervention control system for improving a medical intervention incorporating an interventional instrument (10) positionable over a guidewire (30) operable to generate shape sensing data, the medical intervention control system comprising: at least one of a medical imaging controller (42) configured to execute an operational control of a medical imaging apparatus (41) for a medical imaging of the interventional instrument (10), and a medical robot controller (92) configured to execute an operational control of a medical robot apparatus (91) for a robotic navigation of the interventional instrument (10); and an adaptive intervention controller (70), wherein the adaptive intervention controller (70) is configured to derive an identification of a device type of the interventional instrument (10) responsive to receiving shape sensing data generated by the guidewire (30) when the interventional instrument (10) is positioned at least partly over the guidewire (30), and wherein the adaptive intervention controller (70) is further configured to adapt at least one of the operational control of the medical imaging apparatus (41) by the medical imaging controller (42) for the medical imaging of the interventional instrument (10) in accordance with the identification of the device type of the interventional instrument (10), and the operational control of the medical robot apparatus (91) by the medical robot controller (92) for the robotic navigation of the interventional instrument (10) in accordance with the identification of the device type of the interventional instrument (10).

2. The medical intervention control system of claim 1, wherein the interventional instrument (10) includes an over-the-wire device (11); and wherein the adaptive intervention controller (70) being configured to derive the identification of the device type of the interventional instrument (10) includes: the adaptive intervention controller (70) configured to identify the device type of the interventional instrument (10) from an interpretation of the shape sensing data indicating at least one pre-defined identity feature of the over-the-wire device (11).

3. The medical intervention control system of claim 1, wherein the interventional instrument (10) includes a hub (20); and wherein the adaptive intervention controller (70) being configured to derive the identification of the device type of the interventional instrument (10) includes: the adaptive intervention controller (70) configured to identify the device type of the interventional instrument (10) from an interpretation of the shape sensing data indicating a pre-defined identity template of the hub (20).

4. The medical intervention control system of claim 1, wherein the interventional instrument (10) includes a hub (20); and wherein the adaptive intervention controller (70) being configured to derive the identification of the device type of the interventional instrument (10) includes: the adaptive intervention controller (70) configured to identify the hub (20) from an interpretation of the shape sensing data indicating a pre-defined identity template of the hub (20), and the adaptive intervention controller (70) configured to determine a plurality of pre-defined device types of interventional instruments from an identification of the hub (20).

5. The medical intervention control system of claim 4, wherein the interventional instrument (10) further includes an over-the-wire device (11); and wherein the adaptive intervention controller (70) being configured to derive the identification of the device type of the interventional instrument (10) further includes at least one of the adaptive intervention controller (70) configured to identify one of the plurality of pre-defined device types of interventional instruments as the device type of the interventional instrument (10) from an interpretation of the shape sensing data further indicating at least one pre-defined identity feature of the over-the-wire device (11); and the adaptive intervention controller (70) configured to identify one of the plurality of pre-defined device types of interventional instruments as the device type of the interventional instrument (10) from a user interaction with a user interface delineating the interventional instrument (10).

6. The medical intervention control system of claim 1, wherein the adaptive intervention controller (70) being configured to adapt the operational control of the medical imaging apparatus (41) by the medical imaging controller (42) for the medical imaging of the interventional instrument (10) includes at least one of the adaptive intervention controller (70) configured to command the medical imaging controller (42) to set at least one medical imaging parameter of the medical imaging apparatus (41) corresponding to the identification of the device type of the interventional instrument (10); the adaptive intervention controller (70) configured to command the medical imaging controller (42) to activate at least one imaging enhancement mode of the medical imaging apparatus (41) corresponding to the identification of the device type of the interventional instrument (10); and the adaptive intervention controller (70) configured to control a display of a protocol user interface for a user interaction with at least one of the at least one medical imaging parameter of the medical imaging apparatus (41) corresponding to the identification of the device type of the interventional instrument (10) and the at least one imaging enhancement mode of the medical imaging apparatus (41) corresponding to the identification of the device type of the interventional instrument (10).

7. The medical intervention control system of claim 6, wherein the adaptive intervention controller (70) being configured to activate the at least one imaging enhancement mode of the medical imaging apparatus (41) includes at least one of the adaptive intervention controller (70) configured to command the medical imaging controller (42) to activate the medical imaging apparatus (41) to delineate pre-defined details of the interventional instrument (10) and to fade at least one of background noise and anatomical features; the adaptive intervention controller (70) configured to command the medical imaging controller (42) to activate the medical imaging apparatus (41) to delineate a relative positioning of the medical imaging apparatus (41) and the interventional instrument (10); the adaptive intervention controller (70) configured to command the medical imaging controller (42) to activate the medical imaging apparatus (41) to generate a multi-dimensional road map imaging of the interventional instrument (10); and the adaptive intervention controller (70) configured to command the medical imaging controller (42) to activate the medical imaging apparatus (41) to implement a multi-dimensional subtraction imaging of the interventional instrument (10).

8. The medical intervention control system of claim 1, wherein the adaptive intervention controller (70) being configured to adapt the operational control of the medical robot apparatus (91) by the medical robot controller (92) for the robotic navigation of the interventional instrument (10) includes at least one of the adaptive intervention controller (70) configured to command the medical robot controller (92) to set at least one robot navigation parameter of the medical robot apparatus (91) corresponding to the identification of the device type of the interventional instrument (10); the adaptive intervention controller (70) configured to command the medical robot controller (92) to activate at least one navigation enhancement mode of the medical robot apparatus (91) corresponding to the identification of the device type of the interventional instrument (10); and the adaptive intervention controller (70) configured to control a display of a protocol user interface for a user interaction with at least one of the at least one robot navigation parameter of the medical robot apparatus (91) corresponding to the identification of the device type of the interventional instrument (10) and the at least one navigation enhancement mode of the medical robot apparatus (91) corresponding to the identification of the device type of the interventional instrument (10).

9. An adaptive intervention controller (70) for improving a medical intervention incorporating an interventional instrument (10) positionable over a guidewire (30) operable to generate shape sensing data, the adaptive intervention controller (70) comprising: a non-transitory machine-readable storage medium encoded with instructions for execution by at least one processor, the non-transitory machine-readable storage medium including instructions to: derive an identification of a device type of the interventional instrument (10) responsive to receiving shape sensing data generated by the guidewire (30) when the interventional instrument (10) is positioned at least partly over the guidewire (30); and adapt at least one of an operational control of a medical imaging apparatus (41) by a medical imaging controller (42) for a medical imaging of the interventional instrument (10) in accordance with the identification of the device type of the interventional instrument (10), and an operational control of a medical robot apparatus (91) by a medical robot controller (92) for a robotic navigation of the interventional instrument (10) in accordance with the identification of the device type of the interventional instrument (10).

10. The adaptive intervention controller (70) of claim 9, wherein the interventional instrument (10) includes an over-the-wire device (11); and wherein the instructions to derive the identification of the device type of the interventional instrument (10) include instructions to: identify the device type of the interventional instrument (10) from an interpretation of the shape sensing data indicating at least one pre-defined identity feature of the over-the-wire device (11).

11. The adaptive intervention controller (70) of claim 9, wherein the interventional instrument (10) includes a hub (20); and wherein the instructions to derive the identification of the device type of the interventional instrument (10) include instructions to: identify the device type of the interventional instrument (10) from an interpretation of the shape sensing data indicating a pre-defined identity template of the hub (20).

12. The adaptive intervention controller (70) of claim 9, wherein the interventional instrument (10) includes a hub (20); and wherein the instructions to derive the identification of the device type of the interventional instrument (10) include instructions to: identify the hub (20) from an interpretation of the shape sensing data indicating a pre-defined identity template of the hub (20); and determine a plurality of pre-defined device types of interventional instruments from an identification of the hub (20).

13. The adaptive intervention controller (70) of claim 12, wherein the interventional instrument (10) further includes an over-the-wire device (11); and wherein the instructions to derive the identification of the device type of the interventional instrument (10) further include instructions to: identify one of the plurality of pre-defined device types of interventional instruments as the device type of the interventional instrument (10) from an interpretation of the shape sensing data further indicating at least one pre-defined identity feature of the over-the-wire device (11); and identify one of the plurality of pre-defined device types of interventional instruments as the device type of the interventional instrument (10) from a user interaction with a user interface delineating the interventional instrument (10).

14. The adaptive intervention controller (70) of claim 9, wherein the instructions to adapt the operational control of the medical imaging apparatus (41) by the medical imaging controller (42) for the medical imaging of the interventional instrument (10) include instructions to at least one of command the medical imaging controller (42) to set at least one medical imaging parameter of the medical imaging apparatus (41) corresponding to the identification of the device type of the interventional instrument (10); command the medical imaging controller (42) to activate at least one imaging enhancement mode of the medical imaging apparatus (41) corresponding to the identification of the device type of the interventional instrument (10); and control a display of a protocol user interface for a user interaction with at least one of the at least one medical imaging parameter of the medical imaging apparatus (41) corresponding to the identification of the device type of the interventional instrument (10) and the at least one imaging enhancement mode of the medical imaging apparatus (41) corresponding to the identification of the device type of the interventional instrument (10).

15. The adaptive intervention controller (70) of claim 14, wherein the instructions to activate the at least one imaging enhancement mode of the medical imaging apparatus (41) include at least one of: an activation of the medical imaging apparatus (41) by the medical imaging controller (42) to delineate pre-defined details of the interventional instrument (10) and to fade at least one of background noise and anatomical features; an activation of the medical imaging apparatus (41) by the medical imaging controller (42) to delineate a step-by-step guidance of a relative positioning of the medical imaging apparatus (41) and the interventional instrument (10); an activation of the medical imaging apparatus (41) by the medical imaging controller (42) to generate a multi-dimensional road map imaging of the interventional instrument (10); and an activation of the medical imaging apparatus (41) by the medical imaging controller (42) to implement a multi-dimensional subtraction imaging of the interventional instrument (10).

16. The adaptive intervention controller (70) of claim 9, wherein the instructions to adapt the operational control of the medical robot apparatus (91) by the medical robot controller (92) for the robotic navigation of the interventional instrument (10) include instructions to at least one of command the medical robot controller (92) to set at least one robot navigation parameter of the medical robot apparatus (91) corresponding to the identification of the device type of the interventional instrument (10); command the medical robot controller (92) to activate at least one navigation enhancement mode of the medical robot apparatus (91) corresponding to the identification of the device type of the interventional instrument (10); and control a display of a protocol user interface for a user interaction with at least one of the at least one robot navigation parameter of the medical robot apparatus (91) corresponding to the identification of the device type of the interventional instrument (10) and the at least one navigation enhancement mode of the medical robot apparatus (91) corresponding to the identification of the device type of the interventional instrument (10).

17. A medical imaging control method executable by an adaptive intervention controller (70) for improving a medical intervention incorporating an interventional instrument (10) positionable over a guidewire (30) operable to generate shape sensing data, the medical imaging control method comprising: deriving, by the adaptive intervention controller (70), an identification of a device type of the interventional instrument (10) responsive to receiving shape sensing data generated by the guidewire (30) when the interventional instrument (10) is positioned at least partly over the guidewire (30); and at least one of adapting, by the adaptive intervention controller (70), an operational control of a medical imaging apparatus (41) by a medical imaging controller (42) for the medical imaging of the interventional instrument (10) in accordance with the identification of the device type of the interventional instrument (10); and adapting, by the adaptive intervention controller (70), an operational control of a medical robot apparatus (91) by a medical robot controller (92) for a robotic navigation of the interventional instrument (10) in accordance with the identification of the device type of the interventional instrument (10).

18. The medical imaging control method of claim 17, wherein the deriving, by the adaptive intervention controller (70), the identification of the device type of the interventional instrument (10) includes at least one of identifying, by the adaptive intervention controller (70), the device type of the interventional instrument (10) from an interpretation of the shape sensing data indicating at least one pre-defined identity feature of an over-the-wire device (11) of the interventional instrument (10); and identifying, by the adaptive intervention controller (70), the device type of the interventional instrument (10) from an interpretation of the shape sensing data indicating a pre-defined identity template of a hub (20) of the interventional instrument (10).

19. The medical imaging control method of claim 17, wherein the deriving, by the adaptive intervention controller (70), the identification of the device type of the interventional instrument (10) includes: identifying, by the adaptive intervention controller (70), a hub (20) of the interventional instrument (10) from an interpretation of the shape sensing data indicating a pre-defined identity template of the hub (20); determining, by the adaptive intervention controller (70), a plurality of pre-defined device types of interventional instruments from an identification of the hub (20); and at least one of identifying, by the adaptive intervention controller (70), one of the plurality of pre-defined device types of interventional instruments as the device type of the interventional instrument (10) from an interpretation of the shape sensing data further indicating at least one pre-defined identity feature of the over-the-wire device (11); and identifying, by the adaptive intervention controller (70), one of the plurality of pre-defined device types of interventional instruments as the device type of the interventional instrument (10) from a user interaction with a user interface delineating the interventional instrument (10).

20. The medical imaging control method of claim 17, wherein at least one of: the adapting, by the adaptive intervention controller (70), the operational control of the medical imaging apparatus (41) by the medical imaging controller (42) for the medical imaging of the interventional instrument (10) includes at least one of commanding, by the adaptive intervention controller (70), the medical imaging controller (42) to activate an image enhancement mode of the medical imaging apparatus (41) corresponding to the identification of the device type of the interventional instrument (10), commanding, by the adaptive intervention controller (70), the medical imaging controller (42) to set at least one medical imaging parameter of the medical imaging apparatus (41) corresponding to the identification of the device type of the interventional instrument (10), and controlling, by the adaptive intervention controller (70), a display of a protocol user interface for a user interaction with at least one of the at least one medical imaging parameter of the medical imaging apparatus (41) corresponding to the identification of the device type of the interventional instrument (10) and the at least one imaging enhancement mode of the medical imaging apparatus (41) corresponding to the identification of the device type of the interventional instrument (10); and the adapting, by the adaptive intervention controller (70), an operational control of a medical robot apparatus (91) by a medical robot controller (92) for a robotic navigation of the interventional instrument (10) includes at least one of commanding, by the adaptive intervention controller (70), the medical robot controller (92) to set at least one robot navigation parameter of the medical robot apparatus (91) corresponding to the identification of the device type of the interventional instrument (10), and controlling, by the adaptive intervention controller (70), a display of a protocol user interface for a user interaction with at least one of the at least one robot navigation parameter of the medical robot apparatus (91) corresponding to the identification of the device type of the interventional instrument (10) and the at least one navigation enhancement mode of the medical robot apparatus (91) corresponding to the identification of the device type of the interventional instrument (10).


Description:
MEDICAL INTERVENTION CONTROL BASED ON DEVICE TYPE IDENTIFICATION

FIELD OF THE INVENTION

The present disclosure relates to systems, controllers and methods for medical interventions incorporating medical imaging systems (e.g., X-Ray imaging systems, magnetic resonance imaging systems, ultrasound imaging systems, etc.) and/or medical robot systems (e.g., operator-manipulated medical robots and computer-assisted medical robots for heart surgeries, thoracic surgeries, gastrointestinal surgeries, etc.).

BACKGROUND OF THE INVENTION

During a vascular intervention, one challenge is to establish a proper view of an interventional instrument (e.g., catheters, delivery devices, scopes, etc.) during a navigation and a deployment of the interventional instrument in the vasculature within a context of also obtaining a good visualization of the anatomy. Various ways of improving image quality to optimize visualization of interventional instruments and anatomy in an imaging environment while reducing radiation are known.

For example, U.S. Patent No. 7,289,652 B2 to Florent et al. entitled "MEDICAL VIEWING SYSTEM AND METHOD FOR DETECTING AND ENHANCING STRUCTURE IN NOISY IMAGES", hereby incorporated by reference, teaches use of embedded markers serving as a basis to improve the visualization of an interventional instrument in a noisy medical image generated by a medical imaging system (e.g., improving a visualization of a stent in a noisy fluoroscopic image generated by an X-Ray imaging system).

By further example, International Patent Application Publication No. (TBD) to Bydlon et al. entitled "IMAGE ENHANCEMENT BASED ON FIBER OPTIC SHAPE SENSING", hereby incorporated by reference, teaches use of optical shape sensing as a marker serving as a basis to improve a visualization of a stent in a fluoroscopy image generated by an X-Ray imaging system.

Additionally, surgical robots have been incorporated as a means for an easier and a safer navigating of an interventional instrument within complex anatomy. Of importance, particularly for a vascular intervention, is to control a setting of navigation parameter(s) of a surgical robot to ensure a minimization of any force trauma applied by the interventional instrument to the anatomy as the interventional instrument is navigated within the anatomy. For example, during a medical intervention, a linear acceleration/deceleration, a rotational acceleration/deceleration and a maximum velocity of the surgical robot in navigating the interventional instrument within the anatomy may be set to ensure a minimization of any force trauma applied by the interventional instrument to the anatomy.
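The per-device-type parameter setting described above can be sketched as a simple lookup from an identified device type to conservative motion limits. This is purely an illustrative sketch: the class, the device-type names, and the numeric limits are all invented for illustration and do not appear in the disclosure.

```python
# Illustrative sketch only: a hypothetical mapping from an identified
# device type to robot navigation limits. All names and values are
# invented for illustration, not taken from the disclosure.
from dataclasses import dataclass

@dataclass(frozen=True)
class NavigationLimits:
    max_linear_accel: float      # mm/s^2
    max_rotational_accel: float  # deg/s^2
    max_velocity: float          # mm/s

# Hypothetical per-device-type limits chosen to bound force trauma.
NAV_LIMITS = {
    "balloon_catheter": NavigationLimits(5.0, 10.0, 8.0),
    "stent_catheter":   NavigationLimits(3.0, 8.0, 5.0),
    "sheath":           NavigationLimits(2.0, 5.0, 4.0),
}

def limits_for(device_type: str) -> NavigationLimits:
    """Return navigation limits for the identified device type,
    falling back to the most conservative profile when unknown."""
    conservative = min(NAV_LIMITS.values(), key=lambda l: l.max_velocity)
    return NAV_LIMITS.get(device_type, conservative)
```

Defaulting an unrecognized device type to the slowest profile reflects the stated goal of minimizing force trauma: when the controller cannot identify the instrument, the safest assumption is the most motion-restrictive one.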

SUMMARY OF THE INVENTION

It is an object of the present disclosure to improve upon medical interventions by acquiring high quality medical images of an interventional instrument (e.g., balloon catheters, stent catheters, ablation catheters, imaging catheters, infusion catheters, endograft deployment devices, sheaths, introducers, mitral clip delivery devices, mitral valve delivery devices, aortic valve delivery devices, endoscopes, bronchoscopes, cannulas, needles, thrombectomy devices, atherectomy devices, etc.) based on adapting an operational control of a medical imaging apparatus (e.g., an X-ray imaging apparatus, a CT apparatus, an MRI apparatus, an ultrasound imaging apparatus, etc.) by a medical imaging controller to an identification of a device type of the interventional instrument.

It is a further object of the present disclosure to improve upon medical interventions by achieving an optimal navigation of an interventional instrument within an anatomy based on adapting an operational control of a medical robot apparatus (e.g., an operator-manipulated surgical robot, a computer-controlled surgical robot, etc.) by a medical robot controller to an identification of a device type of the interventional instrument.

Exemplary embodiments of the present disclosure for improving medical interventions include, but are not limited to, (1) medical intervention control systems, (2) adaptive intervention controllers, and (3) medical intervention control methods.

Various medical intervention control system embodiments of the present disclosure encompass a medical imaging controller and an adaptive intervention controller for improving a medical imaging of an interventional instrument by a medical imaging apparatus. The medical imaging controller is configured to execute an operational control of the medical imaging apparatus for the medical imaging of the interventional instrument. The adaptive intervention controller is configured to derive an identification of a device type of the interventional instrument responsive to receiving shape sensing data generated by a guidewire when the interventional instrument is positioned at least partly over the guidewire. The adaptive intervention controller is further configured to adapt the operational control of the medical imaging apparatus by the medical imaging controller for the medical imaging of the interventional instrument in accordance with the identification of the device type of the interventional instrument.

Various medical intervention control system embodiments of the present disclosure encompass a medical robot controller and an adaptive intervention controller for improving a robotic navigation of the interventional instrument by a medical robot apparatus. The medical robot controller is configured to execute an operational control of the medical robot apparatus for the robotic navigation of the interventional instrument. The adaptive intervention controller is configured to derive an identification of a device type of the interventional instrument responsive to receiving shape sensing data generated by a guidewire when the interventional instrument is positioned at least partly over the guidewire. The adaptive intervention controller is further configured to adapt the operational control of the medical robot apparatus by the medical robot controller for the robotic navigation of the interventional instrument in accordance with the identification of the device type of the interventional instrument.

Various medical intervention control system embodiments of the present disclosure encompass a medical imaging controller, a medical robot controller, and an adaptive intervention controller for improving both a medical imaging of an interventional instrument by a medical imaging apparatus and a robotic navigation of the interventional instrument by a medical robot apparatus. The medical imaging controller is configured to execute an operational control of the medical imaging apparatus for the medical imaging of the interventional instrument, and the medical robot controller is configured to execute an operational control of the medical robot apparatus for the robotic navigation of the interventional instrument.

The adaptive intervention controller is configured to derive an identification of a device type of the interventional instrument responsive to receiving shape sensing data generated by a guidewire when the interventional instrument is positioned at least partly over the guidewire. The adaptive intervention controller is further configured to adapt the operational control of the medical imaging apparatus by the medical imaging controller for the medical imaging of the interventional instrument in accordance with the identification of the device type of the interventional instrument, and to adapt the operational control of the medical robot apparatus by the medical robot controller for the robotic navigation of the interventional instrument in accordance with the identification of the device type of the interventional instrument.
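The control flow described in the summary embodiments above can be sketched as follows. This is a minimal, hypothetical sketch of the idea, not the disclosed implementation: the interpretation of shape sensing data is stubbed out, and every function, method, and parameter name is invented for illustration.

```python
# Minimal control-flow sketch of the adaptive intervention controller
# described above. The shape-data interpretation is stubbed out; all
# function and method names here are hypothetical.

def identify_device_type(shape_sensing_data: dict) -> str:
    """Stub: in the disclosure, the device type is derived from
    pre-defined identity features/templates indicated by the shape
    sensing data generated by the guidewire."""
    return shape_sensing_data.get("identity", "unknown")

def adapt_controls(shape_sensing_data: dict,
                   imaging_controller=None,
                   robot_controller=None) -> str:
    """Adapt imaging and/or robot operational control to the
    identified device type; either controller may be absent."""
    device_type = identify_device_type(shape_sensing_data)
    if imaging_controller is not None:
        # e.g., set imaging parameters / activate enhancement modes
        imaging_controller.set_imaging_parameters(device_type)
    if robot_controller is not None:
        # e.g., set navigation parameters / enhancement modes
        robot_controller.set_navigation_parameters(device_type)
    return device_type
```

The sketch mirrors the "at least one of" structure of the embodiments: the same identification step feeds whichever of the two downstream controllers is present.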

Various adaptive intervention controller embodiments of the present disclosure encompass a non-transitory machine-readable storage medium encoded with instructions for execution by one or more processors for improving a medical imaging of an interventional instrument by a medical imaging apparatus. The non-transitory machine-readable storage medium includes instructions to identify a device type of the interventional instrument responsive to receiving shape sensing data generated by a guidewire when the interventional instrument is positioned at least partly over the guidewire. The non-transitory machine-readable storage medium further includes instructions to adapt the operational control of the medical imaging apparatus by the medical imaging controller for the medical imaging of the interventional instrument in accordance with the identification of the device type of the interventional instrument.

Various adaptive intervention controller embodiments of the present disclosure encompass a non-transitory machine-readable storage medium encoded with instructions for execution by one or more processors for improving a robotic navigation of an interventional instrument by a medical robot apparatus. The non-transitory machine-readable storage medium includes instructions to identify a device type of the interventional instrument responsive to receiving shape sensing data generated by a guidewire when the interventional instrument is positioned at least partly over the guidewire. The non-transitory machine-readable storage medium further includes instructions to adapt the operational control of the medical robot apparatus by the medical robot controller for the robotic navigation of the interventional instrument in accordance with the identification of the device type of the interventional instrument.

Various adaptive intervention controller embodiments of the present disclosure encompass a non-transitory machine-readable storage medium encoded with instructions for execution by one or more processors for improving a medical imaging of an interventional instrument by a medical imaging apparatus and for improving a robotic navigation of an interventional instrument by a medical robot apparatus. The non-transitory machine-readable storage medium includes instructions to identify a device type of the interventional instrument responsive to receiving shape sensing data generated by a guidewire when the interventional instrument is positioned at least partly over the guidewire. The non-transitory machine-readable storage medium further includes instructions to adapt an operational control of the medical imaging apparatus by a medical imaging controller for the medical imaging of the interventional instrument in accordance with the identification of the device type of the interventional instrument and instructions to adapt an operational control of the medical robot apparatus by a medical robot controller for the robotic navigation of the interventional instrument in accordance with the identification of the device type of the interventional instrument.

Various medical intervention control method embodiments of the present disclosure for improving a medical imaging of an interventional instrument by a medical imaging apparatus encompass an adaptive intervention controller deriving an identification of a device type of the interventional instrument responsive to receiving shape sensing data generated by a guidewire when the interventional instrument is positioned at least partly over the guidewire, and further encompass the adaptive intervention controller adapting an operational control of the medical imaging apparatus by a medical imaging controller for the medical imaging of the interventional instrument in accordance with the identification of the device type of the interventional instrument.

Various medical intervention control method embodiments of the present disclosure for improving a robotic navigation of an interventional instrument by a medical robot apparatus encompass an adaptive intervention controller deriving an identification of a device type of the interventional instrument responsive to receiving shape sensing data generated by a guidewire when the interventional instrument is positioned at least partly over the guidewire, and further encompass the adaptive intervention controller adapting an operational control of the medical robot apparatus by a medical robot controller for the robotic navigation of the interventional instrument in accordance with the identification of the device type of the interventional instrument.

Various medical intervention control method embodiments of the present disclosure for improving a medical imaging of an interventional instrument by a medical imaging apparatus and for improving a robotic navigation of an interventional instrument by a medical robot apparatus encompass an adaptive intervention controller deriving an identification of a device type of the interventional instrument responsive to receiving shape sensing data generated by a guidewire when the interventional instrument is positioned at least partly over the guidewire.

These medical intervention control method embodiments of the present disclosure further encompass the adaptive intervention controller adapting an operational control of the medical imaging apparatus by a medical imaging controller for the medical imaging of the interventional instrument in accordance with the identification of the device type of the interventional instrument, and the adaptive intervention controller adapting an operational control of the medical robot apparatus by the medical robot controller for the robotic navigation of the interventional instrument in accordance with the identification of the device type of the interventional instrument.

For the aforementioned embodiments of the present disclosure, an identification of a device type of the interventional instrument may be derived from (1) a pre-defined identity feature of an over-the-wire device of the interventional instrument (e.g., a shape profile, a curvature profile, a strain profile, a twist profile and/or derivative profiles thereof of a lumen of the over-the-wire device) and/or (2) a pre-defined identity template of a hub of the interventional instrument (e.g., a shape profile, a curvature profile, a strain profile, a twist profile and/or derivative profiles thereof of a channel within the hub).

Also for the aforementioned embodiments of the present disclosure, the operational control adaptation of a medical imaging apparatus by a medical imaging controller may include the adaptive intervention controller (1) commanding the medical imaging controller to set medical imaging parameter(s) of the medical imaging apparatus corresponding to the identification of the device type of the interventional instrument (e.g., a controller setting of a dosage, an acquisition time, a shuttering, a windowing, etc. corresponding to an identified balloon catheter), (2) commanding the medical imaging controller to activate an image enhancement mode of the medical imaging apparatus corresponding to the identification of the device type of the interventional instrument (e.g., activation of StentBoost and/or SmartPerfusion corresponding to an identified stent catheter), and/or (3) controlling a display of a protocol user interface for a user interaction with the medical imaging parameter(s) of the medical imaging apparatus corresponding to the identification of the device type of the interventional instrument.

Further for the aforementioned embodiments of the present disclosure, the operational control adaptation of a medical robot apparatus by a medical robot controller may include the adaptive intervention controller (1) commanding the medical robot controller to set robot navigation parameter(s) of the medical robot apparatus corresponding to the identification of the device type of the interventional instrument (e.g., a controller setting of a linear acceleration/deceleration, a rotational acceleration/deceleration and a maximum velocity corresponding to an identified balloon catheter), (2) commanding the medical robot controller to activate a navigation enhancement mode of the medical robot apparatus corresponding to the identification of the device type of the interventional instrument (e.g., maintaining an orientation of the interventional instrument within an anatomy, maintaining positional/orientational relationships between components of the interventional instrument, or updating preplanned paths through the anatomy), and/or (3) controlling a display of a protocol user interface for a user interaction with the medical robot navigation parameter(s) of the medical robot apparatus corresponding to the identification of the device type of the interventional instrument.

The foregoing embodiments and other embodiments of the present disclosure as well as various structures and advantages of the present disclosure will become further apparent from the following detailed description of various embodiments of the present disclosure read in conjunction with the accompanying drawings. The detailed description and drawings are merely illustrative of the present disclosure rather than limiting, the scope of the present disclosure being defined by the appended claims and equivalents thereof.

BRIEF DESCRIPTION OF THE DRAWINGS

The present disclosure will present in detail the following description of exemplary embodiments with reference to the following figures wherein:

FIGS. 1A and 1B illustrate exemplary embodiments of a medical intervention control system in accordance with the present disclosure;

FIG. 2 illustrates exemplary shape profiles of catheters as known in the art of the present disclosure;

FIG. 3 illustrates a cross-section view of a hub as known in the art of the present disclosure;

FIG. 4 illustrates an example of an image of StentBoost showing better image quality of the stent as known in the art of the present disclosure;

FIG. 5 illustrates an optical shape sensed device overlaid upon a pre-operative CT image as known in the art of the present disclosure;

FIG. 6 illustrates an exemplary embodiment of a flowchart representative of a medical intervention control method in accordance with the present disclosure;

FIG. 7 illustrates an exemplary embodiment of a flowchart representative of an X-Ray medical intervention control method in accordance with the present disclosure;

FIGS. 8A-8F illustrate exemplary embodiments of adaptive intervention controllers as shown in FIGS. 1A and 1B in accordance with the present disclosure; and

FIGS. 9A-9F illustrate exemplary embodiments of the medical intervention control systems as shown in FIGS. 1A and 1B in accordance with the present disclosure.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Generally, the present disclosure is applicable to all medical interventions as known in the art of the present disclosure for diagnosing and/or treating a medical condition involving a medical imaging and/or a robotic navigation of an interventional instrument within an anatomy including, but not limited to, (1) vascular procedures utilizing guidewires, catheters, stent sheaths, endograft deployment systems, etc., (2) endoluminal procedures utilizing guidewires, catheters, sheaths, endoscopes or bronchoscopes and (3) orthopedic procedures utilizing k-wires, screwdrivers, etc.

More particularly, the present disclosure improves upon the prior art of medical interventions by providing novel and unique interventional instrument based adaptive control of numerous and various medical imaging modalities including, but not limited to, X-ray medical imaging, ultrasound medical imaging, intravascular ultrasound imaging, magnetic resonance medical imaging, computed tomography medical imaging, optical coherence tomography medical imaging, and endoscopic medical imaging.

The present disclosure improves upon the prior art of medical interventions by providing novel and unique interventional instrument based adaptive control of numerous and various robotic navigation modalities including, but not limited to, an operator-manipulated robotic navigation of an interventional instrument (e.g., a hand manipulator for mimicking normal operator hand movements during a medical intervention) and a computer-controlled robotic navigation of an interventional instrument (e.g., an image based path planning of the interventional instrument within an anatomy).

Additionally, optical shape sensing (OSS) may be utilized by the medical intervention control of the present disclosure.

To facilitate an understanding of the present disclosure, the following description of FIGS. 1A and 1B teaches exemplary embodiments of a medical intervention control system in accordance with the present disclosure. From the description of FIGS. 1A and 1B, those having ordinary skill in the art of the present disclosure will appreciate how to apply the present disclosure to make and use additional embodiments of a medical intervention control system in accordance with the present disclosure.

Referring to FIG. 1A, a first exemplary medical intervention system of the present disclosure employs an interventional instrument 10, a medical imaging system including a medical imaging apparatus 41 and a medical imaging controller 42, a spatial tracking system including a spatial tracking apparatus 53 and a spatial tracking controller 54, and a medical display system including an image processor 60 and a monitor 61.

For purposes of describing and claiming the present disclosure, (1) the term "interventional instrument" encompasses all medical instruments, as known in the art of the present disclosure and hereinafter conceived, including an over-the-wire (OTW) device 11 and an interventional device 12 for diagnosing and/or treating a medical condition, (2) the term "OTW device" encompasses all medical devices, as known in the art of the present disclosure and hereinafter conceived, having a lumen extending over a segment or an entirety of the medical device for accommodating a guidewire 30 within the OTW device, and (3) the term "interventional device" encompasses all medical devices, as known in the art of the present disclosure and hereinafter conceived, deliverable by OTW device 11 to target position(s) within an anatomy and deployable within the anatomy for diagnosing and/or treating a medical condition.

In practice of the present disclosure, interventional device 12 may be internal to, external to, or partially internal/partially external to OTW device 11.

Examples of interventional instrument 10 include, but are not limited to, (1) a balloon catheter including OTW device 11 in the form of a catheter and interventional device 12 in the form of a balloon, (2) a stent catheter including OTW device 11 in the form of a catheter and interventional device 12 in the form of a stent, (3) an endograft delivery system including OTW device 11 in the form of a delivery system and interventional device 12 in the form of a stent graft, and (4) an intravascular therapy system including OTW device 11 in the form of an introducer sheath and interventional device 12 in the form of a percutaneous heart pump.

Also in practice of the present disclosure, OTW device 11 has one or more pre-defined identity features. For purposes of describing and claiming the present disclosure, the term "pre-defined identity feature" is a pre-defined profile of a lumen of OTW device 11 as known in the art of the present disclosure or hereinafter conceived (e.g., a shape profile, a curvature profile, a strain profile, a vibration profile, a twist profile and/or derivative profiles thereof of the lumen of OTW device 11) that serves as a basis for identifying a device type of interventional instrument 10 as will be further described in the present disclosure.

For example, FIG. 2 illustrates known baseline shapes of a tiger catheter 11a, a jacky catheter 11b, an amplatz left catheter 11c, an LCB catheter 11d, an RCB catheter 11e, a Judkins left catheter 11f, a Judkins right catheter 11g, a multipurpose A2 catheter 11h, an IM catheter 11i, a 3D lima catheter 11j and an IM VB-1 catheter 11k, whereby a shape profile, a curvature profile, a strain profile and a twist profile of a lumen within one of these catheters will facilitate an identification of that particular catheter via a shape sensing by guidewire 30 as positioned or translated within the lumen.
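The template-matching idea behind such an identification can be sketched as follows; the template library, curvature values, and matching criterion (RMS error over a sampled curvature profile) below are illustrative assumptions, not actual catheter data:

```python
import numpy as np

# Hypothetical template library: curvature profiles (1/mm) sampled at fixed
# arc-length positions along each catheter's lumen. A real system would use
# richer features (shape, strain, twist profiles) and more robust matching.
TEMPLATES = {
    "Judkins left":  np.array([0.00, 0.01, 0.05, 0.12, 0.08, 0.02]),
    "Judkins right": np.array([0.00, 0.02, 0.09, 0.04, 0.01, 0.00]),
    "multipurpose":  np.array([0.00, 0.00, 0.01, 0.02, 0.02, 0.01]),
}

def identify_device_type(sensed_curvature: np.ndarray) -> str:
    """Return the template whose curvature profile is closest (RMS error)
    to the profile sensed by the guidewire inside the catheter lumen."""
    def rmse(a, b):
        return float(np.sqrt(np.mean((a - b) ** 2)))
    return min(TEMPLATES, key=lambda name: rmse(TEMPLATES[name], sensed_curvature))

# A sensed profile close to the Judkins right template:
sensed = np.array([0.00, 0.02, 0.08, 0.05, 0.01, 0.00])
print(identify_device_type(sensed))  # Judkins right
```

A production identifier would also normalize for where along the lumen the guidewire sits, which the fixed sampling above glosses over.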

Referring back to FIG. 1A, interventional instrument 10 may include a hub 20 attached to OTW device 11. For purposes of describing and claiming the present disclosure, the term "hub" broadly encompasses any object, as known in the art of the present disclosure and hereinafter conceived, having a pre-defined identity template serving as a basis for a spatial registration of OTW device 11 within a reference coordinate system, and the term "pre-defined identity template" broadly encompasses a pre-defined profile of a channel within the hub as known in the art of the present disclosure or hereinafter conceived (e.g., a shape profile, a curvature profile, a strain profile, a twist profile and/or derivative profiles thereof of the channel within hub 20) that serves as a basis for identifying a device type of interventional instrument 10 as will be further described in the present disclosure.

Examples of a hub include, but are not limited to, a unicath hub, a luer lock hub, an over-catheter hub, a hemostatic valve hub, a guidewire torque hub and an introducer hub.

In practice of the present disclosure, hub 20 may be permanently affixed to OTW device 11 or detachably attached to OTW device 11.

In one exemplary embodiment, hub 20 may be constructed in accordance with International Patent Application Publication No. WO 2017/055620 Al to Noonan et al. entitled "HUB FOR DEVICE NAVIGATION WITH OPTICAL SHAPE SENSED GUIDEWIRE", hereby incorporated by reference.

In this exemplary embodiment of hub 20 as shown in FIG. 3, a hub 20a includes a hub body 21, which may have a solid design or a split-half design. In practice, hub body 21 provides a channel 21a having a pre-defined identity template that serves as a basis for identifying a device type of interventional instrument 10 as will be further described in the present disclosure. For example, channel 21a forms a path through hub body 21 as shown, whereby a shape profile of channel 21a is the pre-defined identity template of hub 20a that may be sensed by guidewire 30 being positioned or translated within channel 21a.

By further example, hub body 21 may employ a mechanism 22 for displacing channel 21a from a linear shape profile to a deformable non-linear shape profile to thereby distinguish a portion of channel 21a as the pre-defined identity template of hub 20a that may be sensed by guidewire 30 being positioned or translated within channel 21a. Mechanism 22 includes a spring return button 22a to induce the shape deformation when needed.

By even further example, heat coil(s) 24 may be employed to induce a temperature based axial strain in channel 21a to form a deformable shape profile of channel 21a, thereby distinguishing a portion of channel 21a as the pre-defined identity template of hub 20a that may be sensed by guidewire 30 being positioned or translated within channel 21a.

Hub body 21 may include a locking mechanism 25 to capture an OSS embodiment of guidewire section 30a to thereby impede any translation of OSS guidewire section 30a within hub body 21.

Still referring to FIG. 3, hub body 21 may include a radio-opaque feature 24 to permit the registration of the hub 20a by a medical imaging modality (e.g., fluoroscopy/x-ray, MRI, CT, ultrasound, etc.).

Also in practice, hub body 21 may include an identifier 26 such as, for example, a code, a serial number, a radiofrequency identifier (RFID) tag and a microchip to identify the hub 20a via a database or other reference.

Still referring to FIG. 3, hub body 21 may include a registration feature 29 such as, for example, a divot or a channel as taught by Noonan. Additionally, hub body 21 includes a proximal Luer lock 27 that is free to rotate and pivot to allow improved usability. Lock 27 may include a feature 28 (e.g., a torque stop or a lock) to prevent removal when twisted in one direction while permitting removal in the other direction.

Referring back to FIG. 1A, in practice of the present disclosure, OTW device 11 (and hub 20 if employed) will be loaded upon guidewire 30 as known in the art of the present disclosure whereby interventional device 12 is strategically positioned along guidewire 30 to thereby perform a diagnostic task or therapeutic task. When OTW device 11 (and hub 20 if employed) is(are) fully loaded on guidewire 30 as known in the art of the present disclosure, a section 30a of guidewire 30 will extend through OTW device 11 (and hub 20 if employed). Additionally, a section 30b of guidewire 30 may be proximal to OTW device 11 and/or a section 30c of guidewire 30 may be distal to OTW device 11.

For purposes of describing and claiming the present disclosure, the term "guidewire" broadly encompasses all elongated devices, as known in the art of the present disclosure and hereinafter conceived, constructed with shape sensors to assist in an insertion, a positioning and/or a tracking of an interventional instrument within an anatomy, and the term "shape sensors" broadly encompasses all sensors, as known in the art of the present disclosure and hereinafter conceived, that may be distributed along an entirety of a guidewire or one or more segments of the guidewire to collectively generate data informative of a shape profile of a guidewire and may be further informative of additional profiles of the guidewire (e.g., a curvature profile, a temperature profile, a vibration profile, a strain profile, a twist profile and/or an alpha profile).

Examples of guidewire 30 include, but are not limited to, a floppy guidewire, a stiff guidewire, a measurement wire (e.g. flow wire), a k-wire and a microcatheter.

A first example of a shape sensor is an optical shape sensor with fiber optics embedded within guidewire 30 for optical shape sensing (OSS) of guidewire 30 as known in the art of the present disclosure.

In one exemplary OSS embodiment, the optical shape sensor is based on fiber optic Bragg grating sensors. A fiber optic Bragg grating (FBG) is a short segment of optical fiber that reflects particular wavelengths of light and transmits all others. This is achieved by adding a periodic variation of the refractive index in the fiber core, which generates a wavelength-specific dielectric mirror. A fiber Bragg grating can therefore be used as an inline optical filter to block certain wavelengths, or as a wavelength-specific reflector.

More particularly, an FBG sensor may use Fresnel reflection at each of the interfaces where the refractive index is changing. For some wavelengths, the reflected light of the various periods is in phase so that constructive interference exists for reflection and, consequently, destructive interference for transmission. The Bragg wavelength is sensitive to strain as well as to temperature. This means that Bragg gratings can be used as sensing elements in fiber optic sensors. In an FBG sensor, the measurand (e.g., strain) causes a shift in the Bragg wavelength.
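The FBG measurement principle above can be illustrated numerically; the photo-elastic coefficient and fiber values below are typical textbook assumptions (p_e ≈ 0.22 for silica at constant temperature), not parameters of any particular sensor:

```python
# Sketch of the FBG relations described above, assuming a silica fiber with
# an effective photo-elastic coefficient p_e ≈ 0.22 and negligible
# temperature change, so that  Δλ/λ_B = (1 - p_e) · ε.
P_E = 0.22  # photo-elastic coefficient (typical value for silica, assumed)

def bragg_wavelength(n_eff: float, grating_period_nm: float) -> float:
    """First-order Bragg condition: λ_B = 2 · n_eff · Λ."""
    return 2.0 * n_eff * grating_period_nm

def strain_from_shift(lambda_b_nm: float, delta_lambda_nm: float) -> float:
    """Invert Δλ/λ_B = (1 - p_e)·ε to recover the axial strain ε."""
    return delta_lambda_nm / (lambda_b_nm * (1.0 - P_E))

lam = bragg_wavelength(n_eff=1.45, grating_period_nm=534.48)  # ≈ 1550 nm
eps = strain_from_shift(lam, delta_lambda_nm=0.0121)          # ≈ 10 microstrain
print(round(lam, 1), round(eps * 1e6, 1))  # 1550.0 10.0
```

The same relation with a temperature term added is what lets a single grating double as a temperature sensor.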

In a second exemplary OSS embodiment, the optical shape sensor is based on inherent backscatter. One such approach uses Rayleigh scatter (or other scattering) in standard single-mode communications fiber. Rayleigh scatter occurs as a result of random fluctuations of the index of refraction in the fiber core. These random fluctuations can be modeled as a Bragg grating with a random variation of amplitude and phase along the grating length. By using this effect in three or more cores running within a single length of multi-core fiber, the 3D shape and dynamics of the surface of interest can be followed.

One advantage of an optical shape sensor being embedded within guidewire 30 is that various sensor elements can be distributed over the length of a fiber. Incorporating three or more cores with various sensors (gauges) along a length of a fiber that is embedded in a structure permits a three-dimensional form of such a structure to be precisely determined, typically with better than 1 mm accuracy. Along the length of the fiber, at various positions, a multitude of FBG sensors can be located. From the strain measurement of each FBG, a curvature, a strain and/or a twist of guidewire 30 can be inferred at that position.
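The inference of curvature from per-core strains mentioned above can be sketched as follows; the core radius, core layout, and strain model (ε_i = −κ·r·cos(θ_i − φ) for three outer cores 120° apart) are simplifying assumptions for illustration:

```python
import numpy as np

# Assumed multi-core geometry: three outer cores at radius R from the fiber
# axis, 120 degrees apart. For a bend of curvature kappa toward angle phi,
# each core sees axial strain  eps_i = -kappa * R * cos(theta_i - phi).
R = 0.035  # core offset from the fiber axis in mm (assumed)
THETAS = np.deg2rad([0.0, 120.0, 240.0])

def curvature_from_strains(strains):
    """Least-squares fit of (kappa*cos(phi), kappa*sin(phi)) from core strains."""
    A = -R * np.column_stack([np.cos(THETAS), np.sin(THETAS)])
    (a, b), *_ = np.linalg.lstsq(A, np.asarray(strains), rcond=None)
    kappa = float(np.hypot(a, b))  # curvature magnitude (1/mm)
    phi = float(np.arctan2(b, a))  # bend direction (rad)
    return kappa, phi

# Synthetic check: a 100 mm bend radius (kappa = 0.01/mm) toward phi = 30 deg.
kappa_true, phi_true = 0.01, np.deg2rad(30.0)
eps = [-kappa_true * R * np.cos(t - phi_true) for t in THETAS]
kappa, phi = curvature_from_strains(eps)
print(round(kappa, 4), round(np.rad2deg(phi), 1))  # 0.01 30.0
```

Integrating such per-gauge curvature and twist estimates along the fiber arc length is what yields the full three-dimensional shape.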

Non-limiting examples of optical shape sensors include the FORS sensors commercially offered by Philips Healthcare.

A second example of a shape sensor is an electromagnetic (EM) sensor as known in the art of the present disclosure for detecting a magnetic field to facilitate a measurement of a position and/or an orientation of the EM sensor(s) within the magnetic field. As with OSS, EM sensors may be embedded within guidewire 30 over a length or one or more segments of guidewire 30 to thereby permit a two-dimensional or a three-dimensional shape of guidewire 30 to be precisely determined. Additionally, a curvature, a strain and/or a twist of guidewire 30 can be inferred from the positions of the EM sensors within guidewire 30.
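One simple way to infer local curvature from discrete EM sensor positions, offered purely as an illustrative sketch (real trackers interpolate far more carefully), is to take the inverse circumradius of each triple of consecutive positions:

```python
import numpy as np

# Curvature at each interior sample is taken as the inverse radius of the
# circle through three consecutive EM sensor positions (circumradius
# formula R = abc / 4K, so kappa = 4K / abc with K the triangle area).
def local_curvatures(points: np.ndarray) -> np.ndarray:
    """points: (N, 3) array of sensor positions; returns (N-2,) curvatures."""
    kappas = []
    for p, q, r in zip(points, points[1:], points[2:]):
        a, b, c = (np.linalg.norm(q - p), np.linalg.norm(r - q),
                   np.linalg.norm(r - p))
        area = 0.5 * np.linalg.norm(np.cross(q - p, r - p))
        kappas.append(4.0 * area / (a * b * c))  # 1/circumradius
    return np.array(kappas)

# Sensors sampled along a circle of radius 50 mm should report kappa = 0.02.
t = np.linspace(0, np.pi / 2, 8)
circle = np.column_stack([50 * np.cos(t), 50 * np.sin(t), np.zeros_like(t)])
print(np.allclose(local_curvatures(circle), 0.02))  # True
```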

A non-limiting example of an EM sensor is the AURORA EM sensor commercially offered by NDI, Inc.

Referring back to FIG. 1A, for purposes of describing and claiming the present disclosure, the term "medical imaging apparatus" encompasses all apparatuses, as known in the art of the present disclosure and hereinafter conceived, for directing energy (e.g., X-ray beams, ultrasound, radio waves, magnetic fields, light, electrons, lasers, and radionuclides) into an anatomy for purposes of generating images of the anatomy, and the term "medical imaging controller" encompasses all controllers, as known in the art of the present disclosure and hereinafter conceived, for controlling an activation/deactivation of a medical imaging apparatus (e.g., an X-ray C-arm, an ultrasound probe, etc.) to systematically direct energy into an anatomy via operator-generated commands and/or image-guided procedure-generated commands for the purposes of generating images of the anatomy as known in the art of the present disclosure. Examples of a medical imaging system include, but are not limited to, X-ray imaging systems, ultrasound imaging systems, MRI systems and computed tomography imaging systems manufactured by Philips Healthcare, Siemens Healthcare, GE Healthcare and Hologic.

In one embodiment, imaging controller 42 may set one or more imaging parameters of the activation/deactivation of medical imaging apparatus 41 for generating anatomical images. For purposes of describing and claiming the present disclosure, the term "imaging parameter" broadly encompasses any factor that determines or limits a generation of anatomical images by medical imaging apparatus 41. Examples of an imaging parameter include, but are not limited to, dosage, acquisition time, collimation, shuttering, windowing of an X-ray imaging apparatus, acquisition setting (kV/mA), image processing parameters, frame rate and post-processing filters.

In practice of this embodiment as will be further explained in the present disclosure, medical imaging controller 42 may set the imaging parameter(s) of the activation/deactivation of medical imaging apparatus 41 for generating anatomical images where the imaging parameter setting corresponds to a device type of interventional instrument 10, or medical imaging controller 42 may provide user interface(s) of user selectable protocols of imaging parameters that correspond to a device type of interventional instrument 10.
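Such a device-type-to-parameter adaptation can be sketched as a simple lookup; the device names, parameter keys, and values below are hypothetical placeholders rather than actual controller settings:

```python
# Illustrative mapping from an identified device type to an imaging
# parameter preset. Every name and value here is a made-up placeholder;
# a real controller would expose vendor-specific protocols instead.
IMAGING_PRESETS = {
    "balloon catheter": {"dose": "low", "frame_rate_fps": 15, "shutter": "tight"},
    "stent catheter":   {"dose": "normal", "frame_rate_fps": 30,
                         "enhancement": "StentBoost"},
    "pigtail catheter": {"dose": "normal", "frame_rate_fps": 30,
                         "mode": "roadmap"},
}

def adapt_imaging_control(device_type: str) -> dict:
    """Return the preset for the identified device type, or a safe default."""
    return IMAGING_PRESETS.get(device_type, {"dose": "low", "frame_rate_fps": 7.5})

print(adapt_imaging_control("stent catheter")["enhancement"])  # StentBoost
```

In the user-interface variant, the same table would instead populate a protocol menu for the operator to confirm rather than being applied automatically.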

In another exemplary embodiment, medical imaging controller 42 may activate/deactivate one or more image enhancement modes of medical imaging apparatus 41 including, but not limited to, StentBoost and SmartPerfusion or the like for X-ray imaging apparatuses. For purposes of describing and claiming the present disclosure, the term "image enhancement mode" broadly encompasses any operational mode of medical imaging apparatus 41 that facilitates an image display of interventional instrument 10 that is customized to facilitate an optimal visualization of interventional instrument 10 within medical images. For example, as taught by Florent and Bydlon, StentBoost is an image enhancement tool that enhances stent visualization in relation to vessel walls by localizing marker bands of the stent in each medical image frame, compensating for any motion, and then averaging across the medical image frames to improve the contrast of the image. As exemplarily shown in FIG. 4, a stent is enhanced in an X-ray image 62a by showing finer details of the stent struts while background noise and anatomical structures are faded out. This enables more precise positioning of the stent and the ability to correct for under-deployment immediately.
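The marker-based enhancement principle described for StentBoost can be illustrated with a toy sketch; the single-marker alignment, integer-shift registration, and synthetic data below are simplifying assumptions of mine, as the actual tool localizes two marker bands and uses far more sophisticated registration:

```python
import numpy as np

# Toy version of marker-based enhancement: shift each frame so its tracked
# marker lands at a reference position, then average the aligned frames.
# Structures rigid relative to the marker (the stent) reinforce, while
# background noise averages out.
def align_and_average(frames, markers, ref):
    """frames: list of 2-D arrays; markers: per-frame (row, col) marker
    positions; ref: (row, col) position to align every marker to."""
    acc = np.zeros_like(frames[0], dtype=float)
    for frame, (r, c) in zip(frames, markers):
        acc += np.roll(frame, shift=(ref[0] - r, ref[1] - c), axis=(0, 1))
    return acc / len(frames)

rng = np.random.default_rng(0)
stent = np.zeros((32, 32))
stent[10:14, 10:14] = 1.0  # toy stent patch
frames, markers = [], []
for dr in (0, 3, -2, 5):  # simulated cardiac-motion shifts
    frames.append(np.roll(stent, shift=(dr, dr), axis=(0, 1))
                  + 0.5 * rng.standard_normal((32, 32)))
    markers.append((10 + dr, 10 + dr))
enhanced = align_and_average(frames, markers, ref=(10, 10))
print(enhanced[11, 11] > enhanced[0, 0])  # stent pixel should stand out
```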

By further example, SmartPerfusion is an imaging enhancement technology that provides interventionalists with an objective understanding of the impact of their treatment to help determine the outcome of perfusion procedures. Advanced guidance supports standardized comparisons, and automated functions simplify clinical adoption. More particularly, SmartPerfusion provides step-by-step guidance of a relative positioning of the X-ray apparatus and the interventional instrument during the procedure to aid standardization of pre- and post-comparison runs. For example, a C-arm and a table position can be easily matched with a pre-run position, and a catheter position is stored and visualized to standardize placement for injection.

Additional image enhancement modes include a generation of multi-dimensional roadmap imaging of interventional instrument 10 and an implementation of multi-dimensional subtraction imaging of the interventional instrument. In fluoroscopy roadmapping, an image with peak opacification is used as a mask for subsequent subtraction images. For example, an image with contrast filling the vessel tree can be used as a mask for subsequent fluoroscopy navigation of a catheter within the vessel tree. In order to obtain this roadmap, typically a pigtail catheter is used to inject a contrast agent into the vasculature. Therefore, if a pigtail catheter is the type of device that has been identified, then the physician will most likely want to inject contrast and obtain a roadmap image; hence the imaging system should be set to acquire such an image and, as a next step, utilize this newly acquired roadmap as the mask for future subtraction images. Similarly, Philips has multiple 3D roadmapping tools such as VesselNavigator that provide a 3D overlay (e.g., segmented vessels from pre-operative CT, segmented vessels from intra-operative cone beam CT, planned landmarks such as ring landmarks for target ostia, planned paths). Further, imaging systems typically provide optimized image acquisition protocols and functionality for certain organs, procedures, or devices, for example, ultrasound presets for MitraClip, X-ray presets for spine, and XperCT presets for stroke.
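At its core, the subtraction roadmapping described above amounts to subtracting the peak-opacification mask from each live fluoroscopy frame; the toy arrays below are illustrative only:

```python
import numpy as np

# Toy subtraction roadmap: a peak-opacification frame serves as a mask
# that is subtracted from live frames, so static anatomy cancels and the
# moving device stands out against the opacified vessel tree.
def subtracted_frame(live: np.ndarray, mask: np.ndarray) -> np.ndarray:
    return live.astype(float) - mask.astype(float)

background = np.full((4, 4), 100.0)
vessel_mask = background.copy()
vessel_mask[1, :] = 40.0       # opacified vessel row in the mask frame
live = vessel_mask.copy()
live[1, 2] = 20.0              # catheter tip now inside the vessel
out = subtracted_frame(live, vessel_mask)
print(out[1, 2], out[0, 0])  # -20.0 0.0 : only the device survives subtraction
```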

Referring back to FIG. 1 A, for purposes of describing and claiming the present disclosure, the term “spatial tracking apparatus” encompasses all apparatuses, as known in the art of the present disclosure and hereinafter conceived, including one or more tracking sensor(s) for implementing a localized spatial tracking of interventional instrument 10 within a registered coordinate system.

In one exemplary embodiment, the tracking sensor(s) are optical shape sensor(s) (OSS) as known in the art of the present disclosure that utilize light along a multicore optical fiber for device localization and navigation of interventional instrument 10 during an interventional procedure. The principle involved makes use of distributed strain measurement in the optical fiber using characteristic Rayleigh backscatter or controlled grating patterns (e.g., fiber Bragg gratings). The shape along the optical fiber begins at a specific point along the optical shape sensor, known as the launch or z=0, and the subsequent shape position and orientation are relative to that point. In practice, the optical shape sensor(s) may be integrated into over-the-wire device 11 or guidewire 30 in order to provide live guidance of the device during the interventional procedure without the need for radiation. The integrated fiber provides the position and orientation of the entire device. Non-limiting examples of optical shape sensors include the FORS sensors commercially offered by Philips Healthcare.

In practice of this embodiment, spatial tracking apparatus 53 includes a broadband optical source and wavelength monitor(s) as known in the art of the present disclosure, and spatial tracking controller 54 (e.g., programmed hardware and/or an application specific integrated circuit) controls a directing of light through the optical fiber via the broadband optical source as known in the art of the present disclosure and executes distributed strain measurements in the optical fiber via the wavelength monitor(s) as known in the art of the present disclosure.

Also in practice of the present disclosure, a single optical shape sensor may be utilized for both shape sensing and spatial tracking of interventional instrument 10, or two (2) different optical shape sensors may be independently utilized for shape sensing and spatial tracking of interventional instrument 10. FIG. 5 shows an exemplary overlay of an OSS based interventional instrument upon a pre-operative CT medical image 62b to delineate a position of the OSS based interventional instrument inside a vasculature.

Referring back to FIG. 1A, in a second exemplary embodiment, the tracking sensor(s) are electromagnetic (EM) sensors as known in the art of the present disclosure for detecting a magnetic field to facilitate a measurement of a position and/or an orientation of the EM sensor(s) within the magnetic field. A non-limiting example of an EM based spatial tracking apparatus is the AURORA electromagnetic tracking system commercially offered by NDI, Inc.

In practice of this embodiment, spatial tracking apparatus 53 is an EM field generator as known in the art of the present disclosure, and spatial tracking controller 54 (e.g., programmed hardware and/or application specific integrated circuit) controls a generation of the magnetic field via the EM field generator and measures a position and/or an orientation of the EM sensor(s) within the magnetic field, as known in the art of the present disclosure.

Also in practice of the present disclosure, a single set of EM sensors may be utilized for both shape sensing and spatial tracking of interventional instrument 10, or two (2) different sets of EM sensors may be independently utilized for shape sensing and spatial tracking of interventional instrument 10.

Still referring to FIG. 1A, for purposes of describing and claiming the present disclosure, the term “image processor” encompasses all digital signal processors, as known in the art of the present disclosure and hereinafter conceived, configurable for executing image processing to generate an image.

In one exemplary embodiment, medical image processor 60 is an image processor configured to execute image processing to generate medical image(s) 62 for display on monitor 61. In practice, the image processing performed by medical image processor 60 is dependent upon the type of medical imaging apparatus 41 being deployed and the type of spatial tracking apparatus 53 being deployed.

Still referring to FIG. 1A, in various embodiments, interventional instrument 10, the medical imaging system, the spatial tracking system and the medical display system represent exemplary medical intervention systems as known in the art of the present disclosure, such as, for example, image-guided medical interventions practicable via medical intervention systems commercially offered by Philips Healthcare. The present disclosure improves upon such medical intervention systems as well as other intervention systems, as known in the art of the present disclosure and hereinafter conceived, by providing an adaptive intervention controller 70a for improving a medical image visualization of interventional instrument 10, such as, for example, setting the imaging parameters of medical imaging apparatus 41 in accordance with a shape sensed identification of a device type of interventional instrument 10 and/or activating/deactivating enhancement modes of medical imaging apparatus 41 in accordance with a device type of interventional instrument 10, as will be further detailed in the present disclosure with the description of FIGS. 6 and 7.

Still referring to FIG. 1A, in practice of the present disclosure, adaptive intervention controller 70a may be (1) installed within one of the medical imaging system, the spatial tracking system and the medical display system, (2) distributed among two or more of the medical imaging system, the spatial tracking system and the medical display system or (3) installed within a separate device, such as, for example, a tablet, a laptop, a workstation or a server.

Referring to FIG. 1B, an exemplary medical intervention system of the present disclosure employs interventional instrument 10, a spatial tracking system including spatial tracking apparatus 53 and spatial tracking controller 54, and a medical display system including image processor 60 and a monitor 61 as previously described in the present disclosure (FIG. 1A).

This exemplary medical intervention system further employs a medical robot system including a medical robot apparatus 91 and a medical robot controller 92.

For purposes of describing and claiming the present disclosure, the term “medical robot apparatus” encompasses all apparatuses, as known in the art of the present disclosure and hereinafter conceived, controllable for navigating an interventional instrument within an anatomy during a medical intervention, and the term "medical robot controller" encompasses all controllers, as known in the art of the present disclosure and hereinafter conceived, for controlling a medical robot apparatus to navigate an interventional instrument to target position(s) within the anatomy during the medical intervention via operator-generated commands and/or image-guided procedurally generated commands. Examples of a medical robot system include, but are not limited to, the CorPath® GRX medical robot system, the Magellan™ medical robot system, the Monarch™ medical robot system and the Ion™ medical robot system.

In practice of the present disclosure, as will be further explained in the present disclosure, medical robot controller 92 may automatically set one or more robot navigation parameters of medical robot apparatus 91 to navigate an interventional instrument to target position(s) within the anatomy, where the robot navigation parameter setting corresponds to a device type of interventional instrument 10, or medical robot controller 92 may provide user interface(s) of user selectable protocols of robot navigation parameters that correspond to a device type of interventional instrument 10.

In a first exemplary embodiment as known in the art of the present disclosure, medical robot controller 92 may set robot navigation parameters associated with various motions of interventional instrument 10 as being held by medical robot apparatus 91, such as for example, a setting of a linear velocity, a linear acceleration, a linear deceleration, a rotational velocity, a rotational acceleration, a rotational deceleration, a pivotal velocity, a pivotal acceleration and/or a pivotal deceleration of a distal tip of interventional instrument 10.
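The per-device-type motion parameters in this first embodiment can be pictured as a small lookup keyed by the identified device type. The following is a minimal hedged sketch; the device names, parameter values, and fallback limits are invented placeholders, not values from the disclosure.

```python
# Hypothetical per-device-type robot navigation parameters, as a sketch of
# the first exemplary embodiment. All names and numbers are placeholders.
from dataclasses import dataclass


@dataclass(frozen=True)
class NavigationParameters:
    linear_velocity_mm_s: float
    linear_acceleration_mm_s2: float
    rotational_velocity_deg_s: float


# Example lookup keyed by identified device type (values are invented).
NAV_PARAMS = {
    "balloon catheter": NavigationParameters(5.0, 2.0, 30.0),
    "stiff sheath": NavigationParameters(2.0, 1.0, 10.0),
}


def parameters_for(device_type: str) -> NavigationParameters:
    # Fall back to conservative limits when the device type is unknown.
    return NAV_PARAMS.get(device_type, NavigationParameters(1.0, 0.5, 5.0))
```

In this sketch the adaptive intervention controller would resolve the identified device type to a parameter set and hand it to the medical robot controller; the conservative default mirrors the safety-first framing of the disclosure.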

In a second exemplary embodiment as known in the art of the present disclosure, medical robot controller 92 may set a navigation enhancement mode associated with maintaining an orientation of interventional instrument 10. For example, medical robot controller 92 may set an orientation of interventional instrument 10 relative to a medical imaging apparatus (e.g., a distal tip orientation is maintained relative to a beam path of a registered X-ray system).

In a third exemplary embodiment as known in the art of the present disclosure, medical robot controller 92 may set a navigation enhancement mode associated with maintaining a positional/orientational relationship between components of OTW device 11 and/or a positional/orientational relationship between OTW device 11 and interventional device 12. For example, medical robot controller 92 may set an optimal positioning of a sheath with respect to a catheter for stabilization dependent upon the mechanical properties of the catheter (e.g., how far to pull back a guidewire into the catheter to allow for the catheter to take on its pre-formed shape).

In a fourth exemplary embodiment as known in the art of the present disclosure, medical robot controller 92 may set a navigation enhancement mode associated with updating a planned sequence of motions of interventional instrument 10 within an anatomy. For example, medical robot controller 92 may modify a path to a target position within an anatomy based on a natural mechanical curvature of interventional instrument 10.

In a fifth exemplary embodiment as known in the art of the present disclosure, medical robot controller 92 may set robot navigation parameters associated with defining spatial limits on where the interventional instrument 10 may be navigated within an anatomy. For example, a certain size of sheath may not be advanced beyond a certain depth into renal arteries.
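The spatial-limit embodiment above amounts to a guard that blocks advancement past a per-device, per-vessel depth. A minimal sketch, with the device/vessel keys and depth value invented for illustration:

```python
# Hypothetical spatial-limit check for the fifth exemplary embodiment.
# The (device type, vessel) keys and the 20 mm limit are placeholders.

DEPTH_LIMITS_MM = {
    ("large sheath", "renal artery"): 20.0,
}


def advance_allowed(device_type: str, vessel: str, target_depth_mm: float) -> bool:
    """Permit advancement unless a registered limit would be exceeded."""
    limit = DEPTH_LIMITS_MM.get((device_type, vessel))
    return limit is None or target_depth_mm <= limit
```

A robot controller applying such a check before each commanded advance would realize the "may not be advanced beyond a certain depth" behavior described above.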

Still referring to FIG. 1B, in various embodiments, interventional instrument 10, the medical robot system, the spatial tracking system and the medical display system represent exemplary medical intervention systems as known in the art of the present disclosure, such as, for example, robot based medical interventions practicable via medical intervention systems commercially offered by Philips Healthcare. The present disclosure improves upon such medical intervention systems as well as other intervention systems, as known in the art of the present disclosure and hereinafter conceived, by providing an adaptive intervention controller 70b for improving a robotic navigation of interventional instrument 10 within an anatomy, such as, for example, setting the robot navigation parameters of medical robot apparatus 91 and/or activating/deactivating navigation enhancement modes in accordance with a shape sensed identification of a device type of interventional instrument 10, as will be further detailed in the present disclosure with the description of FIGS. 6 and 7.

Still referring to FIG. 1B, in practice of the present disclosure, adaptive intervention controller 70b may be (1) installed within one of the medical robot system, the spatial tracking system and the medical display system, (2) distributed among two or more of the medical robot system, the spatial tracking system and the medical display system or (3) installed within a separate device, such as, for example, a tablet, a laptop, a workstation or a server.

Referring to FIGS. 1A and 1B, for embodiments of a medical intervention system employing both a medical imaging system and a medical robot system, these embodiments may have a segregation, a partial integration or a complete integration of adaptive intervention controller 70a and adaptive intervention controller 70b.

To further facilitate an understanding of the present disclosure, the following description of FIGS. 6 and 7 teaches exemplary embodiments of a medical intervention control method in accordance with the present disclosure. From the description of FIGS. 6 and 7, those having ordinary skill in the art of the present disclosure will appreciate how to apply the present disclosure to devise and execute additional embodiments of a medical intervention control method in accordance with the present disclosure.

In practice of the present disclosure, a medical intervention control method is executed by an adaptive intervention controller of the present disclosure, such as, for example, adaptive intervention controller 70a (FIG. 1A), adaptive intervention controller 70b (FIG. 1B), or a partial/complete integration of adaptive intervention controllers 70a and 70b.

For purposes of describing and claiming the enhanced medical imaging embodiments of the present disclosure, the term “adaptive intervention controller” encompasses all structural configurations, as understood in the art of the present disclosure and as exemplarily described in the present disclosure, of a main circuit board or an integrated circuit for controlling an application of various principles of the present disclosure for adapting a medical intervention to a shape sensed identification of a device type of interventional instrument 10 as will be further detailed in the description of FIGS. 6 and 7.

The structural configuration of the adaptive intervention controller may include, but is not limited to, processor(s), computer-usable/computer readable storage medium(s), an operating system, application module(s), peripheral device controller(s), slot(s) and port(s).

For purposes of describing and claiming the enhanced medical imaging embodiments of the present disclosure, the term “application module” broadly encompasses an application incorporated within or accessible by a controller consisting of an electronic circuit (e.g., electronic components and/or hardware) and/or an executable program (e.g., executable software stored on non-transitory computer readable medium(s) and/or firmware) for executing a specific application associated with a medical intervention. In practice of the enhanced medical imaging embodiments of the present disclosure, adaptive intervention controller 70a employs application modules for adapting a medical intervention to a shape sensed identification of a device type of interventional instrument 10 as will be further detailed in the description of FIGS. 6 and 7.

Referring to FIG. 6, a flowchart 100 represents an exemplary embodiment of a medical intervention control method of the present disclosure as executable by an adaptive intervention controller of the present disclosure.

A device shape sensing stage S102 of flowchart 100 involves a translation or a positioning of the shape sensors of guidewire 30 within OTW device 11 and/or hub 20 (if employed in interventional instrument 10) whereby the shape sensor(s) of guidewire 30 generate shape sensing data 110 indicative of a shape profile of guidewire 30 within OTW device 11 and/or hub 20 (if employed in interventional instrument 10) and may be further informative of additional profiles of guidewire 30 (e.g., a curvature profile, a temperature profile, a vibration profile, a strain profile, a twist profile and/or an alpha profile).

Also in practice of the present disclosure, device shape sensing stage S102 of flowchart 100 may involve a spatial registration of interventional instrument 10 (FIG. 1) and medical imaging apparatus 41 (FIG. 1) executed by medical imaging controller 42 or adaptive intervention controller 70a. In practice of stage S102, any type of registration technique, as known in the art of the present disclosure and hereinafter conceived, may be utilized to register the interventional instrument and the medical imaging apparatus. Examples of such registration techniques based on spatial dimensions or time series include, but are not limited to, extrinsic image based registration techniques (e.g., invasive with screw markers or noninvasive with fiducial markers), intrinsic image based registration techniques (e.g., landmark based, segmentation based or voxel based), or non-image based registration techniques (e.g., OSS registration).

Still referring to FIG. 6, a device type identification stage S104 of flowchart 100 involves a device type identification of interventional instrument 10 (FIG. 1) executed by adaptive intervention controller 70a. In practice of stage S104, adaptive intervention controller 70a inputs shape sensing data 110 generated during device shape sensing stage S102 to identify (1) pre-defined identity feature(s) of OTW device 11 from the shape of the guidewire, (2) a pre-defined identity template of hub 20 (if employed in interventional instrument 10), and/or (3) temporal changes of 3D shape sensing generated from the guidewire or OTW device 11.

More particularly for stage S104, profile(s) of guidewire 30 may be derived from a translation or a positioning of the shape sensors of guidewire 30 within OTW device 11 and/or hub 20 (if employed) to execute (1) an instantaneous three-dimensional (3D) shape sensing generated from multiple sensors of guidewire 30 positioned within OTW device 11 and/or hub 20 (if employed), or (2) a derived 3D shape sensing generated from one sensor of guidewire 30 temporally translating through OTW device 11 and/or hub 20 (if employed).
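The two sensing modes above differ only in how the 3D samples are assembled. A minimal sketch, under the assumption that each temporal sample carries the sensor's insertion depth alongside its 3D position (the data layout is an invention for illustration):

```python
# Sketch of the two shape-sensing modes: an instantaneous shape is one
# snapshot from many sensors, while a derived shape is rebuilt from one
# sensor sampled over time as it translates through the device.
# The (depth, position) sample layout is a hypothetical assumption.


def instantaneous_shape(sensor_positions):
    """Multiple sensors sampled at one instant already form the 3D shape."""
    return list(sensor_positions)


def derived_shape(samples):
    """samples: iterable of (insertion_depth_mm, (x, y, z)) from a single
    sensor translating through the device; sorting by depth recovers the
    shape ordered from proximal to distal."""
    return [xyz for _depth, xyz in sorted(samples)]
```

Either function yields the ordered point list against which the identity features or identity templates of stage S104 would then be matched.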

For purposes of describing and claiming the present disclosure, the term "device type" broadly encompasses a particular genus of interventional instrument 10 (e.g., a catheter, a deployment device, a delivery device, a therapy device, an imaging device, etc.), a particular species of a genus of interventional instrument 10 (e.g., a balloon catheter, a stent catheter, an ablation catheter, an imaging catheter, an infusion catheter, an endograft deployment device, a sheath, an introducer, a mitral clip delivery device, a mitral valve delivery device, an aortic valve delivery device, an IVUS catheter, etc.), or an exact model of interventional instrument 10 (e.g., a Cook 80cm Cobra catheter, a Medtronic Endurant II endograft system, etc.).

In practice of stage S104, adaptive intervention controller 70a interprets shape sensing data 110 to identify the pre-defined identity feature(s) of OTW device 11 and/or the pre-defined identity template of hub 20 (if employed in interventional instrument 10) that correspond to the shape profile and any other additional profiles of guidewire 30 within OTW device 11 and/or hub 20 (if employed in interventional instrument 10) as indicated by shape sensing data 110.

In one exemplary embodiment of stage S104, adaptive intervention controller 70a executes a search in a database of an organized collection of genuses of interventional instruments and/or species of one or more genuses of interventional instruments to match the shape profile and any other additional profiles of guidewire 30 within OTW device 11 and/or hub 20 (if employed in interventional instrument 10) to pre-defined identity feature(s) of an OTW device 11 and/or the pre-defined identity template of a hub to thereby identify the device type of interventional instrument 10.

In a second exemplary embodiment of stage S104, adaptive intervention controller 70a executes an indexing of a lookup table of an array of genuses of interventional instruments and/or species of one or more genuses of interventional instruments to match the shape profile and any other additional profiles of guidewire 30 within OTW device 11 and/or hub 20 (if employed in interventional instrument 10) to pre-defined identity feature(s) of an OTW device 11 and/or the pre-defined identity template of a hub to thereby identify the device type of interventional instrument 10.

In a third exemplary embodiment of stage S104, adaptive intervention controller 70a executes a scanning of a device reference of genuses of interventional instruments and/or species of one or more genuses of interventional instruments to match the shape profile and any other additional profiles of guidewire 30 within OTW device 11 and/or hub 20 (if employed in interventional instrument 10) to pre-defined identity feature(s) of an OTW device 11 and/or the pre-defined identity template of a hub to thereby identify the device type of interventional instrument 10.
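The database, lookup-table, and device-reference embodiments above share the same underlying matching step. A minimal sketch, assuming curvature profiles serve as the pre-defined identity features; the device names, profiles, and tolerance are invented for illustration:

```python
# Hypothetical lookup-table matching of a measured guidewire curvature
# profile against pre-defined identity features, one entry per device type.
# All entries and the tolerance are placeholders, not values from the
# disclosure.

DEVICE_TABLE = {
    # device type -> expected curvature profile (arbitrary units)
    "pigtail catheter": [0.0, 0.2, 0.8, 0.9, 0.8],
    "straight sheath": [0.0, 0.0, 0.1, 0.0, 0.0],
}


def identify_device(measured, tolerance=0.5):
    """Return the device type whose stored profile best matches the
    measurement (sum of squared differences), or None if nothing is close
    enough to count as an identification."""
    best, best_err = None, tolerance
    for device, profile in DEVICE_TABLE.items():
        err = sum((m - p) ** 2 for m, p in zip(measured, profile))
        if err < best_err:
            best, best_err = device, err
    return best
```

Returning `None` for an unmatched profile corresponds to the "unidentifiable profile" case the disclosure later handles via a user interface or automatic learning.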

Temporal data may also be used in stage S104, where adaptive intervention controller 70a executes a scanning of the shape of the guidewire and saves the shape of the guidewire and/or OTW device 11 with respect to time. Known information about the type of guidewire that is in use may also be saved, for example a floppy guidewire or a stiff guidewire. Depending on the type of guidewire in use, the shape of the OTW device may deform differently during manipulation of the guidewire and OTW device. For example, a floppy guidewire will deform the OTW device minimally, while a stiff guidewire will cause larger changes to the shape of the OTW device. The changes in shape of the OTW device over time may be used to determine the type of device it is. Alternatively, the temporal data of the OTW device, the pre-defined identity features of the OTW device, and the type of guidewire can be utilized together to determine the type of OTW device. For example, if curvature is used to pre-define the OTW device, the curvature peak locations may be similar when used with a floppy or stiff wire, but the magnitude of those peaks may be different. Hence, when the guidewire is known to be floppy and the pre-defined identity feature is from a floppy guidewire, the OTW device is easily determined. However, if the guidewire is stiff, the pre-defined identity feature with a deformation applied to it for the stiff guidewire will then determine the type of OTW device.

In practice of stage S104, adaptive intervention controller 70a may provide a user interface to enable an operator of the system to add any unidentifiable profile(s) of guidewire 30 within OTW device 11 and/or hub 20 (if employed) corresponding to particular device type(s) of interventional instrument(s) to the database, the look-up table, the device reference and/or other data storage means.
Also, adaptive intervention controller 70a may be configured (e.g., via Artificial Intelligence) to automatically add any unidentifiable profile(s) of guidewire 30 within OTW device 11 and/or hub 20 (if employed) corresponding to particular device type(s) of interventional instrument(s) to the database, the look-up table, the device reference and/or other data storage means.
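The stiffness-aware matching idea above — curvature peak locations stay put while peak magnitudes scale with guidewire stiffness — can be sketched as follows. The scale factors, threshold, and tolerance are invented placeholders, and the crude threshold peak detector stands in for whatever feature extraction an implementation would actually use.

```python
# Hedged sketch of stiffness-compensated template matching: a stored
# floppy-wire curvature template is deformed (scaled) per guidewire type
# before comparison. All numeric values are hypothetical.

STIFFNESS_SCALE = {"floppy": 1.0, "stiff": 0.6}  # stiff wire flattens curvature


def peak_locations(profile, threshold=0.5):
    """Indices where curvature exceeds a threshold (crude peak detector)."""
    return [i for i, c in enumerate(profile) if c > threshold]


def matches_template(measured, floppy_template, guidewire_type):
    """Compare a measured curvature profile to a floppy-wire template after
    applying the stiffness deformation for the guidewire in use."""
    scale = STIFFNESS_SCALE[guidewire_type]
    expected = [c * scale for c in floppy_template]
    # Peak locations must agree; magnitudes are compared after scaling.
    if peak_locations(measured) != peak_locations(expected):
        return False
    return all(abs(m - e) < 0.15 for m, e in zip(measured, expected))
```

With a floppy wire the template is used as-is; with a stiff wire the scaled template plays the role of the "pre-defined identity feature with a deformation applied to it" described above.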

Still referring to FIG. 6, an image/robotic adaptation stage S106 of flowchart 100 involves adaptive intervention controller 70a adapting an operational control of medical imaging apparatus 41 by the medical imaging controller 42 in accordance with the identification of the device type of the interventional instrument 10 of stage S104. In practice of stage S106, the adaptation of the operational control of medical imaging apparatus 41 is directed to improving the visualization quality of interventional instrument 10 within medical image(s).

In one exemplary embodiment, adaptive intervention controller 70a searches a database of any imaging enhancement mode(s) of medical imaging apparatus 41 and/or any imaging parameter(s) of medical imaging apparatus 41 associated with the device type of the interventional instrument 10, and issues an adaptation command 111 to medical imaging controller 42 to activate any retrieved imaging enhancement mode(s) of medical imaging apparatus 41 and set any retrieved imaging parameter(s) of medical imaging apparatus 41.

In a second exemplary embodiment, adaptive intervention controller 70a indexes lookup tables of any imaging enhancement mode(s) of medical imaging apparatus 41 and/or any imaging parameter(s) of medical imaging apparatus 41 associated with the device type of the interventional instrument 10, and issues an adaptation command 111 to medical imaging controller 42 to activate any indexed imaging enhancement mode(s) of medical imaging apparatus 41 and set any indexed imaging parameter(s) of medical imaging apparatus 41.

In a third exemplary embodiment, adaptive intervention controller 70a informs medical imaging controller 42 of identification data 112 whereby medical imaging controller 42 searches a database of any imaging enhancement mode(s) of medical imaging apparatus 41 and/or any imaging parameter(s) of medical imaging apparatus 41 associated with the device type of the interventional instrument 10 per identification data 112, and activates any retrieved imaging enhancement mode(s) of medical imaging apparatus 41 and sets any retrieved imaging parameter(s) of medical imaging apparatus 41.
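The three adaptation embodiments above reduce to the same lookup-then-command flow. A minimal sketch, with the device types, mode names, and parameter values all invented for illustration:

```python
# Hypothetical sketch of stage S106 adaptation: resolve the identified
# device type to imaging enhancement modes and parameters, and build the
# adaptation command for the imaging controller. All entries are placeholders.

IMAGING_ADAPTATIONS = {
    "stent catheter": {"modes": ["stent_enhancement"], "params": {"dose": "low"}},
    "endograft deployment device": {"modes": [], "params": {"dose": "high"}},
}


def adaptation_command(device_type):
    """Return the command the adaptive controller would issue to the
    imaging controller, or None when no adaptation is registered."""
    entry = IMAGING_ADAPTATIONS.get(device_type)
    if entry is None:
        return None
    return {"activate_modes": entry["modes"], "set_params": entry["params"]}
```

In the first two embodiments the adaptive intervention controller would evaluate this lookup itself; in the third, the same lookup would run inside the medical imaging controller against identification data 112.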

Upon the completion of flowchart 100, medical imaging controller 42 and spatial tracking controller 54 may be operated to generate medical images 62 as adapted to the shape sensed identification of a device type of interventional instrument 10 to thereby obtain an improved visualization quality of medical images, and/or medical robot controller 92 and spatial tracking controller 54 may be operated to navigate interventional instrument 10 within an anatomy adapted to the shape sensed identification of a device type of interventional instrument 10 to thereby obtain an optimal navigation of interventional instrument 10 at target position(s) within the anatomy.

In practice of stage S106, adaptive intervention controller 70a may provide a user interface to enable an operator of the system to add image enhancement mode(s) of medical imaging apparatus 41 corresponding to particular device type(s) of interventional instrument(s) to the database, the look-up table, the device reference and/or other data storage means. Also, medical imaging controller 42 and/or adaptive intervention controller 70a may be configured (e.g., via Artificial Intelligence) to automatically add any new image enhancement mode(s) of medical imaging apparatus 41 corresponding to particular device type(s) of interventional instrument(s) to the database, the look-up table, the device reference and/or other data storage means.

Referring to FIG. 7, a flowchart 200 represents an exemplary embodiment of an X-Ray based medical intervention control method of the present disclosure as executable by an adaptive intervention controller of the present disclosure.

A device shape sensing stage S202 of flowchart 200 may include OTW device shape sensing embodiment 210 involving a translation or a positioning of the shape sensors of guidewire 30 within OTW device 11 whereby the shape sensor(s) of guidewire 30 generate shape sensing data 110 indicative of a shape profile of guidewire 30 within OTW device 11 and may be further informative of additional profiles of guidewire 30 (e.g., a curvature profile, a temperature profile, a vibration profile, a strain profile, a twist profile and/or an alpha profile).

Alternatively, device shape sensing stage S202 of flowchart 200 may include a hub device shape sensing embodiment 211 involving a translation or a positioning of the shape sensors of guidewire 30 within hub 20 whereby the shape sensor(s) of guidewire 30 generate shape sensing data 110 indicative of a shape profile of guidewire 30 within hub 20 and may be further informative of additional profiles of guidewire 30 (e.g., a curvature profile, a temperature profile, a vibration profile, a strain profile, a twist profile and/or an alpha profile).

Still referring to FIG. 7, a device type identification stage S204 of flowchart 200 involves either an execution of a controller recognition algorithm 212 by the adaptive intervention controller to identify a particular device type of interventional instrument 10 as previously described in the present disclosure, or a user identification 213 of a particular device type of interventional instrument 10 (e.g., via a user delineating interventional instrument via a dropdown list of available interventional instruments).

Still referring to FIG. 7, an image/robotic adaptation stage S206 of flowchart 200 may involve the adaptive intervention controller adapting an operational control of medical imaging apparatus 41 by the medical imaging controller 42 in accordance with the identification of the device type of the interventional instrument 10 via X-ray parameter setting(s) 214 and/or an X-ray enhancement activation 215.

Examples of X-ray parameter setting(s) 214 include, but are not limited to, a setting of a dosage, an acquisition time, a shuttering and a windowing associated with the identified device type of the interventional instrument 10.

Examples of X-ray enhancement activations 215 include, but are not limited to, an activation of StentBoost and/or SmartPerfusion if associated with the identified device type of the interventional instrument 10.

In practice of the stage S206, medical imaging controller 42 and/or the adaptive intervention controller may control a display of one or more user interfaces allowing user interaction with the medical images.

For example, a user interface may be displayed that provides a user interaction to change a color of interventional instrument 10 within the medical image as a means for delineating the identified device type of interventional instrument 10 or to tag the medical image with the identified device type of interventional instrument 10.

By further example, a user interface may be displayed that provides various protocols of imaging parameters/imaging enhancement modes corresponding to the identified device type of interventional instrument 10 whereby the user can select a desired protocol to optimize the generation and/or visualization of the medical images.

Still referring to FIG. 7, image/robotic adaptation stage S206 of flowchart 200 may involve the adaptive intervention controller adapting an operational control of medical robot apparatus 91 by the medical robot controller 92 in accordance with the identification of the device type of the interventional instrument 10 via robot parameter setting(s) 217 and/or a navigation enhancement mode 218.

Examples of robot parameter setting(s) 217 include, but are not limited to, a setting of motions of interventional instrument 10 within an anatomy in accordance with the identified device type of the interventional instrument 10 (e.g., an acceleration, a deceleration and a max velocity of interventional instrument 10 within the anatomy).

Examples of navigation enhancement activations 218 include, but are not limited to, an activation of maintaining an orientation of a distal tip of interventional instrument 10 relative to a beam path of the X-ray imaging apparatus.

In practice of the stage S206, medical robot controller 92 and/or the adaptive intervention controller may control a display of one or more user interfaces allowing user interaction with a navigation of interventional instrument 10 within an anatomy.

For example, a user interface may be displayed that provides a user interaction with a simulated navigation of interventional instrument 10 within X-ray image(s) of the anatomy via operator-manipulated medical robots that corresponds to the identified device type of interventional instrument 10.

By further example, a user interface may be displayed that provides various protocols of robot parameters/robot enhancement modes corresponding to the identified device type of interventional instrument 10 whereby the user can select a desired protocol to optimize the navigation of interventional instrument 10 within an anatomy.

Upon the completion of flowchart 200, medical imaging controller 42 and spatial tracking controller 54 may be operated to generate medical images 62 as adapted to the shape sensed identification of a device type of interventional instrument 10 to thereby obtain an improved visualization quality of medical images, and/or medical robot controller 92 and spatial tracking controller 54 may be operated to navigate interventional instrument 10 within an anatomy adapted to the shape sensed identification of a device type of interventional instrument 10 to thereby obtain an optimal navigation of interventional instrument 10 at target position(s) within the anatomy.

To facilitate a further understanding of the present disclosure, the following description of FIGS. 8A-8F and 9A-9F respectively teach an exemplary embodiment of an adaptive intervention controller of the present disclosure. From this description, those having ordinary skill in the art will appreciate how to apply various aspects of the present disclosure for making and using additional embodiments of an adaptive intervention controller of the present disclosure.

FIGS. 8A-8C illustrate various embodiments 170a-170c of adaptive intervention controller 70a (FIG. 1A) and FIGS. 8D-8F illustrate various embodiments 170d-170f of adaptive intervention controller 70b (FIG. 1B).

Referring to FIGS. 8A-8F, each adaptive intervention controller 170a-170f includes one or more processor(s) 171, memory 172, a user interface 173, a network interface 174, and a storage 175 interconnected via one or more system buses 176.

Each processor 171 may be any hardware device, as known in the art of the present disclosure or hereinafter conceived, capable of executing instructions stored in memory 172 or storage 175 or otherwise processing data. In a non-limiting example, the processor(s) 171 may include a microprocessor, field programmable gate array (FPGA), application-specific integrated circuit (ASIC), or other similar devices.

The memory 172 may include various memories, as known in the art of the present disclosure or hereinafter conceived, including, but not limited to, LI, L2, or L3 cache or system memory. In a non-limiting example, the memory 172 may include static random access memory (SRAM), dynamic RAM (DRAM), flash memory, read only memory (ROM), or other similar memory devices.

The user interface 173 may include one or more devices, as known in the art of the present disclosure or hereinafter conceived, for enabling communication with a user such as an administrator. In a non-limiting example, the user interface may include a command line interface or graphical user interface that may be presented to a remote terminal via the network interface 174.

The network interface 174 may include one or more devices, as known in the art of the present disclosure or hereinafter conceived, for enabling communication with other hardware devices. In a non-limiting example, the network interface 174 may include a network interface card (NIC) configured to communicate according to the Ethernet protocol. Additionally, the network interface 174 may implement a TCP/IP stack for communication according to the TCP/IP protocols. Various alternative or additional hardware or configurations for the network interface 174 will be apparent.

The storage 175 may include one or more machine-readable storage media, as known in the art of the present disclosure or hereinafter conceived, including, but not limited to, read-only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, or similar storage media. In various non-limiting embodiments, the storage 175 may store instructions for execution by the processor(s) 171 or data upon which the processor(s) 171 may operate. For example, the storage 175 stores a base operating system for controlling various basic operations of the hardware.
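The controller composition described above (processor(s) executing application modules held in storage, joined by a system bus) can be illustrated with a minimal sketch. All class and module names below are hypothetical illustrations, not the claimed implementation.

```python
# Hypothetical sketch of the controller hardware composition of
# controllers 170a-170f: a processor fetches application modules from
# storage and executes them; bus traffic is modeled as a simple log.

class Controller:
    def __init__(self):
        self.memory = {}                    # volatile working data
        self.storage = {"base_os": "v1"}    # persistent OS and modules
        self.bus_log = []                   # stand-in for system bus traffic

    def load(self, module_name: str, code) -> None:
        """Install an application module into storage."""
        self.storage[module_name] = code
        self.bus_log.append(("storage", module_name))

    def execute(self, module_name: str, *args):
        """Processor fetches a module from storage and runs it."""
        self.bus_log.append(("exec", module_name))
        return self.storage[module_name](*args)

c = Controller()
c.load("device_shape_sensing", lambda samples: [s * 2 for s in samples])
result = c.execute("device_shape_sensing", [1, 2, 3])
```

The point of the sketch is only the separation the disclosure draws between persistent storage of modules and their execution by the processor(s).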

Referring to FIG. 8A, storage 175 of adaptive intervention controller 170a also stores application modules 177a in the form of executable software/firmware for implementing the various functions of adaptive intervention controller 170a including, but not limited to:

(1) a device shape sensing module 178a for executing an embodiment of stage S102 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S202 of flowchart 200 (FIG. 7) as previously described in the present disclosure;

(2) a device type identification module 178b for executing an embodiment of stage S104 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S204 of flowchart 200 (FIG. 7) as previously described in the present disclosure; and

(3) an imaging adaptation module 178c for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical imaging apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure.
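The three modules above form a pipeline from shape sensing, through device type identification, to imaging adaptation. The following sketch illustrates that flow; the shape templates, preset values, and matching rule are all hypothetical illustrations and not the claimed method.

```python
# Hypothetical sketch of the module pipeline of adaptive intervention
# controller 170a: stage S102 (shape sensing) -> stage S104 (device type
# identification) -> stage S106 (imaging adaptation).

def sense_device_shape(raw_samples: list[float]) -> list[float]:
    """Stage S102: derive a shape profile from guidewire sensing data."""
    return [round(s, 2) for s in raw_samples]

# Assumed per-device shape templates (illustrative values only).
DEVICE_SHAPE_TEMPLATES = {
    "balloon_catheter": [0.1, 0.4, 0.4, 0.1],
    "stent_delivery":   [0.2, 0.2, 0.2, 0.2],
}

def identify_device_type(shape: list[float]) -> str:
    """Stage S104: match the sensed shape against known device templates."""
    def distance(template):
        return sum(abs(a - b) for a, b in zip(shape, template))
    return min(DEVICE_SHAPE_TEMPLATES,
               key=lambda name: distance(DEVICE_SHAPE_TEMPLATES[name]))

# Assumed per-device imaging presets (illustrative values only).
IMAGING_PRESETS = {
    "balloon_catheter": {"dose": "low", "enhancement": "balloon_mode"},
    "stent_delivery":   {"dose": "normal", "enhancement": "stent_boost"},
}

def adapt_imaging(device_type: str) -> dict:
    """Stage S106: select imaging parameters for the identified device."""
    return IMAGING_PRESETS[device_type]

params = adapt_imaging(identify_device_type(
    sense_device_shape([0.11, 0.39, 0.42, 0.08])))
```

A nearest-template match is only one plausible identification scheme; the disclosure leaves the identification technique open.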

In practice, as shown in FIG. 9A, adaptive intervention controller 170a may be installed in a remote device 80a (e.g., a tablet, a laptop, a workstation or a server) that is in communication with a medical imaging system 40a including medical imaging apparatus 41 and medical imaging controller 42 as previously described in the present disclosure, and in further communication with a spatial tracking system 50a including a spatial tracking apparatus 53 and a spatial tracking controller 54 (e.g., an OSS system) as previously described in the present disclosure.

Alternatively, adaptive intervention controller 170a may be installed within medical imaging system 40a or spatial tracking system 50a, or distributed between medical imaging system 40a and spatial tracking system 50a.

Referring to FIG. 8B, storage 175 of adaptive intervention controller 170b also stores application modules 177b in the form of executable software/firmware for implementing the various functions of adaptive intervention controller 170b including, but not limited to:

(1) device shape sensing module 178a for executing an embodiment of stage S102 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S202 of flowchart 200 (FIG. 7) as previously described in the present disclosure;

(2) device type identification module 178b for executing an embodiment of stage S104 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S204 of flowchart 200 (FIG. 7) as previously described in the present disclosure;

(3) an imaging adaptation module 178c for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical imaging apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure; and

(4) one or more medical imaging module(s) 178d, as known in the art of the present disclosure or hereinafter conceived, for controlling an operation of a medical imaging apparatus, such as, for example, setting the imaging parameters of the medical imaging apparatus, activating/deactivating enhancement modes of the medical imaging apparatus and actuating the medical imaging apparatus to generate medical imaging data.
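The three operations attributed to medical imaging module(s) 178d (setting parameters, toggling enhancement modes, triggering acquisition) can be sketched as a minimal interface. All method names and default values below are hypothetical illustrations.

```python
# Hypothetical sketch of a medical imaging module 178d interface:
# parameter setting, enhancement-mode activation/deactivation, and
# actuation of the imaging apparatus to generate imaging data.

class ImagingModule:
    def __init__(self):
        self.params = {"frame_rate": 15, "dose": "normal"}  # assumed defaults
        self.enhancements: set = set()
        self.acquired = []

    def set_parameters(self, **params) -> None:
        """Set imaging parameters of the medical imaging apparatus."""
        self.params.update(params)

    def activate_enhancement(self, mode: str) -> None:
        """Activate an enhancement mode (e.g., a device-specific overlay)."""
        self.enhancements.add(mode)

    def deactivate_enhancement(self, mode: str) -> None:
        """Deactivate an enhancement mode if currently active."""
        self.enhancements.discard(mode)

    def acquire(self) -> dict:
        """Actuate the apparatus to generate one frame of imaging data."""
        frame = {"params": dict(self.params),
                 "enhancements": sorted(self.enhancements)}
        self.acquired.append(frame)
        return frame

m = ImagingModule()
m.set_parameters(frame_rate=30, dose="low")
m.activate_enhancement("stent_boost")
frame = m.acquire()
```

An adaptation module such as 178c would call `set_parameters` and the enhancement methods with values chosen from the identified device type.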

In practice, as shown in FIG. 9B, adaptive intervention controller 170b may be installed in a medical imaging system 40b as a substitute for medical imaging controller 42 (FIG. 1A) whereby adaptive intervention controller 170b controls an operation of a medical imaging apparatus 41 and whereby adaptive intervention controller 170b is in communication with a spatial tracking system 50a including a spatial tracking apparatus 53 and a spatial tracking controller 54 (e.g., an OSS system).

Referring to FIG. 8C, storage 175 of adaptive intervention controller 170c also stores application modules 177c in the form of executable software/firmware for implementing the various functions of adaptive intervention controller 170c including, but not limited to:

(1) device shape sensing module 178a for executing an embodiment of stage S102 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S202 of flowchart 200 (FIG. 7) as previously described in the present disclosure;

(2) device type identification module 178b for executing an embodiment of stage S104 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S204 of flowchart 200 (FIG. 7) as previously described in the present disclosure;

(3) an imaging adaptation module 178c for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical imaging apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure; and

(4) one or more spatial tracking module(s) 178e, as known in the art of the present disclosure or hereinafter conceived, for controlling an operation of a spatial tracking apparatus, such as, for example, actuating sensors of the spatial tracking apparatus to generate sensor shape data or position data (e.g., OSS interrogation of fibre optic Bragg gratings (FBGs)).

In practice, as shown in FIG. 9C, adaptive intervention controller 170c may be installed in a spatial tracking system 50b as a substitute for spatial tracking controller 54 (FIGS. 1A and 1B) whereby adaptive intervention controller 170c controls an operation of a spatial tracking apparatus 53 (e.g., an OSS system) and whereby adaptive intervention controller 170c is in communication with a medical imaging system 40a including a medical imaging apparatus 41 and a medical imaging controller 42.

Referring to FIG. 8D, storage 175 of adaptive intervention controller 170d also stores application modules 177d in the form of executable software/firmware for implementing the various functions of adaptive intervention controller 170d including, but not limited to:

(1) device shape sensing module 178a for executing an embodiment of stage S102 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S202 of flowchart 200 (FIG. 7) as previously described in the present disclosure;

(2) device type identification module 178b for executing an embodiment of stage S104 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S204 of flowchart 200 (FIG. 7) as previously described in the present disclosure; and

(3) a robot adaptation module 178f for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical robot apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure.
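The robot adaptation of stage S106 can be sketched as a lookup from the identified device type to robot navigation limits, mirroring the imaging adaptation. The preset names and numeric limits below are hypothetical illustrations only.

```python
# Hypothetical sketch of robot adaptation module 178f: mapping an
# identified device type onto robot navigation parameter limits for
# robotic navigation of the interventional instrument.

ROBOT_PRESETS = {
    "balloon_catheter": {"max_advance_mm_s": 5.0, "torque_limit": 0.3},
    "stent_delivery":   {"max_advance_mm_s": 2.0, "torque_limit": 0.2},
}
# Conservative fallback when the device type is not recognized.
DEFAULT_PRESET = {"max_advance_mm_s": 1.0, "torque_limit": 0.1}

def adapt_robot_control(device_type: str) -> dict:
    """Stage S106 (robot branch): select navigation limits for the device."""
    return ROBOT_PRESETS.get(device_type, DEFAULT_PRESET)

limits = adapt_robot_control("stent_delivery")
```

Falling back to a conservative default for an unrecognized device type is one plausible safety choice; the disclosure does not prescribe it.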

In practice, as shown in FIG. 9D, adaptive intervention controller 170d may be installed in a remote device 80b (e.g., a tablet, a laptop, a workstation or a server) that is in communication with a medical robot system 90a including medical robot apparatus 91 and medical robot controller 92 as previously described in the present disclosure, and in further communication with a spatial tracking system 50a including a spatial tracking apparatus 53 and a spatial tracking controller 54 (e.g., an OSS system) as previously described in the present disclosure.

Alternatively, adaptive intervention controller 170d may be installed within medical robot system 90a or spatial tracking system 50a, or distributed between medical robot system 90a and spatial tracking system 50a.

Referring to FIG. 8E, storage 175 of adaptive intervention controller 170e also stores application modules 177e in the form of executable software/firmware for implementing the various functions of adaptive intervention controller 170e including, but not limited to:

(1) device shape sensing module 178a for executing an embodiment of stage S102 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S202 of flowchart 200 (FIG. 7) as previously described in the present disclosure;

(2) device type identification module 178b for executing an embodiment of stage S104 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S204 of flowchart 200 (FIG. 7) as previously described in the present disclosure;

(3) robot adaptation module 178f for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical robot apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure; and

(4) one or more medical imaging module(s) 178d, as known in the art of the present disclosure or hereinafter conceived, for controlling an operation of a medical imaging apparatus, such as, for example, setting the imaging parameters of the medical imaging apparatus, activating/deactivating enhancement modes of the medical imaging apparatus and actuating the medical imaging apparatus to generate medical imaging data.

In practice, as shown in FIG. 9E, adaptive intervention controller 170e may be installed in a medical robot system 90b as a substitute for medical robot controller 92 (FIG. 1B) whereby adaptive intervention controller 170e controls an operation of a medical robot apparatus 91 and whereby adaptive intervention controller 170e is in communication with a spatial tracking system 50a including a spatial tracking apparatus 53 and a spatial tracking controller 54 (e.g., an OSS system).

Referring to FIG. 8F, storage 175 of adaptive intervention controller 170f also stores application modules 177f in the form of executable software/firmware for implementing the various functions of adaptive intervention controller 170f including, but not limited to:

(1) device shape sensing module 178a for executing an embodiment of stage S102 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S202 of flowchart 200 (FIG. 7) as previously described in the present disclosure;

(2) device type identification module 178b for executing an embodiment of stage S104 of flowchart 100 (FIG. 6) as previously described in the present disclosure, such as, for example, stage S204 of flowchart 200 (FIG. 7) as previously described in the present disclosure;

(3) robot adaptation module 178f for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical robot apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure; and

(4) one or more spatial tracking module(s) 178e, as known in the art of the present disclosure or hereinafter conceived, for controlling an operation of a spatial tracking apparatus, such as, for example, actuating sensors of the spatial tracking apparatus to generate sensor shape data or position data (e.g., OSS interrogation of fibre optic Bragg gratings (FBGs)).

In practice, as shown in FIG. 9F, adaptive intervention controller 170f may be installed in a spatial tracking system 50b as a substitute for spatial tracking controller 54 (FIGS. 1A and 1B) whereby adaptive intervention controller 170f controls an operation of spatial tracking apparatus 53 (e.g., an OSS system) and whereby adaptive intervention controller 170f is in communication with a medical robot system 90a including a medical robot apparatus 91 and a medical robot controller 92.

Referring back to FIGS. 8A-8C, the respective application modules 177a-177c of adaptive intervention controllers 170a-170c may further include robot adaptation module 178f (FIGS. 8D-8F) for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical robot apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure, whereby adaptive intervention controllers 170a-170c are in communication with a medical robot system including a medical robot apparatus and a medical robot controller.

Referring back to FIGS. 8D-8F, the respective application modules 177d-177f of adaptive intervention controllers 170d-170f may further include an imaging adaptation module 178c for executing an embodiment of stage S106 of flowchart 100 (FIG. 6) related to adapting a medical imaging apparatus to a shape sensed identification of a device type of interventional instrument 10 (FIGS. 1A and 1B) as previously described in the present disclosure, such as, for example, stage S206 of flowchart 200 (FIG. 7) as previously described in the present disclosure, whereby adaptive intervention controllers 170d-170f are in communication with a medical imaging system including a medical imaging apparatus and a medical imaging controller.

Referring to FIGS. 1A-9F, those having ordinary skill in the art of the present disclosure will appreciate numerous benefits of the present disclosure including, but not limited to, (1) reducing/simplifying user interaction with a medical imaging system while achieving optimal medical imaging parameter settings of the medical imaging system and enhanced medical image visualization of an interventional instrument by the medical imaging system, and (2) reducing/simplifying user interaction with a medical robot system while achieving optimal robot navigation parameter settings of the medical robot system and enhanced robotic navigation of an interventional instrument by the medical robot system.

Further, as one having ordinary skill in the art will appreciate in view of the teachings provided herein, structures, elements, components, etc. described in the present disclosure/specification and/or depicted in the Figures may be implemented in various combinations of hardware and software, and provide functions which may be combined in a single element or multiple elements. For example, the functions of the various structures, elements, components, etc. shown/illustrated/depicted in the Figures can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software for added functionality. When provided by a processor, the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared and/or multiplexed. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor (“DSP”) hardware, memory (e.g., read only memory (“ROM”) for storing software, random access memory (“RAM”), non-volatile storage, etc.) and virtually any means and/or machine (including hardware, software, firmware, combinations thereof, etc.) which is capable of (and/or configurable to) perform and/or control a process.

Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (e.g., any elements developed that can perform the same or substantially similar function, regardless of structure). Thus, for example, it will be appreciated by one having ordinary skill in the art in view of the teachings provided herein that any block diagrams presented herein can represent conceptual views of illustrative system components and/or circuitry embodying the principles of the invention. Similarly, one having ordinary skill in the art should appreciate in view of the teachings provided herein that any flow charts, flow diagrams and the like can represent various processes which can be substantially represented in computer readable storage media and so executed by a computer, processor or other device with processing capabilities, whether or not such computer or processor is explicitly shown.

The terms “signal”, “data” and “command” as used in the present disclosure broadly encompass all forms of a detectable physical quantity or impulse (e.g., voltage, current, or magnetic field strength) as understood in the art of the present disclosure and as exemplarily described in the present disclosure for transmitting information and/or instructions in support of applying various inventive principles of the present disclosure as subsequently described in the present disclosure. Signal/data/command communication between various components of the present disclosure may involve any communication method as known in the art of the present disclosure including, but not limited to, signal/data/command transmission/reception over any type of wired or wireless datalink and a reading of signal/data/commands uploaded to a computer-usable/computer readable storage medium.

Having described preferred and exemplary embodiments of the various and numerous inventions of the present disclosure (which embodiments are intended to be illustrative and not limiting), it is noted that modifications and variations can be made by persons skilled in the art in light of the teachings provided herein, including the Figures. It is therefore to be understood that changes can be made in/to the preferred and exemplary embodiments of the present disclosure which are within the scope of the embodiments disclosed herein.

Moreover, corresponding and/or related systems incorporating and/or implementing the device/system of the present disclosure, or such as may be used/implemented in/with a device in accordance with the present disclosure, are also contemplated and considered to be within the scope of the present disclosure. Further, corresponding and/or related methods for manufacturing and/or using a device and/or system in accordance with the present disclosure are also contemplated and considered to be within the scope of the present disclosure.