Title:
HOLOGRAPHIC USER INTERFACES FOR MEDICAL PROCEDURES
Document Type and Number:
WIPO Patent Application WO/2013/057649
Kind Code:
A1
Abstract:
An interactive holographic display system includes a holographic generation module (115) configured to display a holographically rendered anatomical image. A localization system (120) is configured to define a monitored space (126) on or around the holographically rendered anatomical image. One or more monitored objects (128) have their position and orientation monitored by the localization system such that coincidence of spatial points between the monitored space and the one or more monitored objects triggers a response in the holographically rendered anatomical image.

Inventors:
VERARD LAURENT (NL)
CHAN RAYMOND (NL)
RUIJTERS DANIEL SIMON ANNA (NL)
DENISSEN SANDER HANS (NL)
SLEGT SANDER (NL)
Application Number:
PCT/IB2012/055595
Publication Date:
April 25, 2013
Filing Date:
October 15, 2012
Assignee:
KONINKL PHILIPS ELECTRONICS NV (NL)
International Classes:
G03H1/00; G03H1/22
Domestic Patent References:
WO2005010623A2 (2005-02-03)
Foreign References:
US20090237759A1 (2009-09-24)
Other References:
None
Attorney, Agent or Firm:
VAN VELZEN, Maaike et al. (AE Eindhoven, NL)
Claims:
CLAIMS:

1. An interactive holographic display system, comprising:

a holographic generation module (115) configured to display a holographically rendered anatomical image (124);

a localization system (120) configured to define a monitored space (126) on or around the holographically rendered anatomical image; and

one or more monitored objects (128) having their position and orientation monitored by the localization system such that coincidence of spatial points between the monitored space and the one or more monitored objects triggers a response in the holographically rendered anatomical image.

2. The system as recited in claim 1, wherein the holographically rendered anatomical image (124) is generated in-air.

3. The system as recited in claim 1, wherein the localization system (120) includes one or more of a fiber optic shape sensing system, an electromagnetic tracking system, a light sensor array, and a sensing device to determine the position and orientation of the monitored space and the one or more monitored objects in a same coordinate system.

4. The system as recited in claim 1, wherein the one or more monitored objects (128) include a medical instrument, an anatomical feature of a user and a virtual object.

5. The system as recited in claim 1, wherein the response in the holographically rendered anatomical image (124) includes one or more of: translation or rotation of the holographically rendered anatomical image, magnification adjustment of the holographically rendered anatomical image, marking of the holographically rendered anatomical image and feedback generation.

6. The system as recited in claim 5, wherein the response includes haptic feedback to a user.

7. The system as recited in claim 1, wherein the holographically rendered anatomical image (124) includes a response region (504) monitored by the localization system such that upon activating the response region a display event (502) occurs.

8. The system as recited in claim 7, wherein the display event (502) includes a generated help menu, a generated menu of virtual objects to be included in the holographically rendered anatomical image upon selection and a generated information display.

9. The system as recited in claim 1, wherein the holographically rendered anatomical image (124) displays superimposed medical data mapped to positions thereon.

10. The system as recited in claim 1, wherein the response in the holographically rendered anatomical image (124) generates control signals for operating robotically controlled instruments (602).

11. The system as recited in claim 1, wherein the response in the holographically rendered anatomical image includes seed points (162) placed to direct virtual camera angles for an additional display.

12. The system as recited in claim 1, wherein the interactive holographic display system is remotely disposed from a patient location and connected to the patient location over a communication network (910).

13. The system as recited in claim 1, wherein the interactive holographic display system is remotely disposed from a patient location and connected to the patient location over a communication network (910) such that the holographically rendered anatomical image is employed to remotely control instruments (906) at the patient's location.

14. The system as recited in claim 1, further comprising a speech recognition engine (166) configured to convert speech commands into commands for altering an appearance of the holographically rendered anatomical image.

15. An interactive holographic display system, comprising:

a processor (114);

memory (116) coupled to the processor;

a holographic generation module (115) included in the memory and configured to display a holographically rendered anatomical image (124) as an in-air hologram or on a holographic display;

a localization system (120) configured to define a monitored space (126) on or around the holographically rendered anatomical image; and

one or more monitored objects (128) having their position and orientation monitored by the localization system such that coincidence of spatial points between the monitored space and the one or more monitored objects triggers a response in the holographically rendered anatomical image wherein the response in the holographically rendered anatomical image includes one or more of: translation or rotation of the holographically rendered anatomical image, magnification adjustment of the holographically rendered anatomical image, marking of the holographically rendered anatomical image and feedback generation.

16. The system as recited in claim 15, wherein the localization system (120) includes one or more of a fiber optic shape sensing system, an electromagnetic tracking system, a light sensor array and a sensing device to determine the position and orientation of the monitored space and the one or more monitored objects in a same coordinate system.

17. The system as recited in claim 15, wherein the one or more monitored objects (128) include a medical instrument, an anatomical feature of a user and a virtual object.

18. The system as recited in claim 15, wherein the response includes haptic feedback to a user.

19. The system as recited in claim 15, wherein the holographically rendered anatomical image (124) includes a response region (504) monitored by the localization system such that upon activating the response region a display event (502) occurs.

20. The system as recited in claim 19, wherein the display event (502) includes a generated help menu, a generated menu of virtual objects to be included in the holographically rendered anatomical image upon selection and a generated information display.

21. The system as recited in claim 15, wherein the holographically rendered anatomical image (124) displays superimposed medical data mapped to positions thereon.

22. The system as recited in claim 15, wherein the response in the holographically rendered anatomical image generates control signals for operating robotically controlled instruments (602).

23. The system as recited in claim 15, wherein the response in the holographically rendered anatomical image includes seed points (162) placed to direct virtual camera angles for an additional display.

24. The system as recited in claim 15, wherein the interactive holographic display system is remotely disposed from a patient location and connected to the patient location over a communication network (910).

25. The system as recited in claim 15, wherein the interactive holographic display system is remotely disposed from a patient location and connected to the patient location over a communication network (910) such that the holographically rendered anatomical image is employed to remotely control instruments (906) at the patient's location.

26. The system as recited in claim 15, further comprising a speech recognition engine (166) configured to convert speech commands into commands for altering an appearance of the holographically rendered anatomical image.

27. A method for interacting with a holographic display, comprising:

displaying (1002) a holographically rendered anatomical image;

localizing (1004) a monitored space on or around the holographically rendered anatomical image to define a region for interaction;

monitoring (1006) a position and orientation of one or more monitored objects by the localization system;

determining (1008) coincidence of spatial points between the monitored space and the one or more monitored objects; and

if coincidence is determined, triggering (1010) a response in the holographically rendered anatomical image.

28. The method as recited in claim 27, wherein displaying includes generating (1002) the holographically rendered anatomical image in-air.

29. The method as recited in claim 27, wherein the localization system includes one or more of a fiber optic shape sensing system, an electromagnetic tracking system, a light sensor array and a sensor device, and the method further comprises determining (1004) the position and orientation of the monitored space and the one or more monitored objects (1006) in a same coordinate system.

30. The method as recited in claim 27, wherein the one or more monitored objects include a medical instrument, an anatomical feature of a user and a virtual object.

31. The method as recited in claim 27, wherein triggering a response includes one or more of: moving (1012) the holographically rendered anatomical image; adjusting (1014) zoom of the holographically rendered anatomical image, marking (1016) the holographically rendered anatomical image; and generating (1020) feedback to a user.

32. The method as recited in claim 31, wherein generating feedback includes generating haptic feedback for a user.

33. The method as recited in claim 27, wherein the holographically rendered anatomical image includes a response region (1022) monitored by the localization system such that upon activating the response region a display event occurs.

34. The method as recited in claim 33, wherein the display event includes generating (1024) a help menu; generating (1026) a menu of virtual objects to be included in the holographically rendered anatomical image upon selection; and generating (1028) information to be displayed.

35. The method as recited in claim 27, further comprising rendering (1030) the holographically rendered anatomical image with superimposed medical data mapped to positions on the holographically rendered anatomical image.

36. The method as recited in claim 27, further comprising generating (1032) control signals for operating robotically controlled instruments.

Description:
HOLOGRAPHIC USER INTERFACES FOR MEDICAL PROCEDURES

CROSS-REFERENCE TO RELATED APPLICATIONS:

This application claims priority to U.S. provisional application number 61/549,273, filed on October 20, 2011, the entire disclosure of which is incorporated herein by reference.

BACKGROUND:

Technical Field

The present disclosure relates to medical systems, devices and methods, and more particularly to systems, devices and methods pertaining to integration of holographic image data with other information to improve accuracy and effectiveness in medical applications.

Description of the Related Art

Auto-stereoscopic displays (ASDs) for three-dimensional (3D) visualization on a two-dimensional (2D) panel, without the need for user goggles/glasses, have been investigated. However, resolution and processing time limit the ability to render high quality images using this technology. Additionally, these displays have generally been confined to a 2D plane (e.g., preventing a physician from moving around or rotating the display to view the data from different perspectives). Although different perspectives may be permitted with a limited field of view, the field of view for this type of display still suffers from breakdown of movement parallax.

Similarly, user input for manipulation of data objects has largely been confined to mainstream 2D mechanisms, e.g., mice, tablets, keypads, touch panels, camera-based tracking, etc. Accordingly, there is a need for a system, device and method as disclosed and described herein which can be used to overcome the above-identified deficiencies.

SUMMARY

In accordance with the present principles, an interactive holographic display system includes a holographic generation module configured to display a holographically rendered anatomical image. A localization system is configured to define a monitored space on or around the holographically rendered anatomical image. One or more monitored objects have their position and orientation monitored by the localization system such that coincidence of spatial points between the monitored space and the one or more monitored objects triggers a response in the holographically rendered anatomical image.

Another interactive holographic display system includes a processor and memory coupled to the processor. A holographic generation module is included in the memory and configured to display a holographically rendered anatomical image as an in-air hologram or on a holographic display. A localization system is configured to define a monitored space on or around the holographically rendered anatomical image. One or more monitored objects has their position and orientation monitored by the localization system such that coincidence of spatial points between the monitored space and the one or more monitored objects triggers a response in the holographically rendered anatomical image wherein the response in the holographically rendered anatomical image includes one or more of: translation or rotation of the holographically rendered anatomical image, magnification adjustment of the holographically rendered anatomical image, marking of the holographically rendered anatomical image and feedback generation.

A method for interacting with a holographic display includes displaying a holographically rendered anatomical image; localizing a monitored space on or around the holographically rendered anatomical image to define a region for interaction; monitoring a position and orientation of one or more monitored objects by the localization system; determining coincidence of spatial points between the monitored space and the one or more monitored objects; and if coincidence is determined, triggering a response in the holographically rendered anatomical image.

These and other objects, features and advantages of the present disclosure will become apparent from the following detailed description of illustrative embodiments thereof, which is to be read in connection with the accompanying drawings.

BRIEF DESCRIPTION OF DRAWINGS

This disclosure will present in detail the following description of preferred embodiments with reference to the following figures wherein:

FIG. 1 is a block/flow diagram showing a system for interfacing with holograms in accordance with exemplary embodiments;

FIG. 2 is a perspective view of a hologram rendered with a data map or overlay thereon in accordance with an illustrative embodiment;

FIG. 3 is a block diagram showing an illustrative process flow for displaying a data map or overlay in a holographic image in accordance with an illustrative embodiment;

FIG. 4 is a block diagram showing an illustrative system and process flow for displaying static or animated objects in a holographic image in accordance with an illustrative embodiment;

FIG. 5 is a diagram showing an illustrative image for displaying an objects menu for selecting a virtual object during a procedure for display in a holographic image in accordance with an illustrative embodiment;

FIG. 6 is a block diagram showing an illustrative system for controlling a robot using a holographic image in accordance with an illustrative embodiment;

FIG. 7 is a block diagram showing an illustrative system which employs haptic feedback with a holographic image in accordance with an illustrative embodiment;

FIG. 8 is a diagram showing multiple views provided to different perspectives in an illustrative system for displaying a holographic image or the like in accordance with one embodiment;

FIG. 9 is a block diagram showing an illustrative system for controlling a robot remotely over a network using a holographic image in accordance with an illustrative embodiment; and

FIG. 10 is a flow diagram showing a method for interfacing with a hologram in accordance with an illustrative embodiment.

DETAILED DESCRIPTION OF EMBODIMENTS

In accordance with the present principles, systems, devices and methods are described which leverage holographic display technology for medical procedures. This can be done using 3D holographic technologies (e.g., in-air holograms) and real-time 3D input sensing methods such as optical shape sensing to provide a greater degree of human-data interaction during a procedure. Employing holographic technology with other technologies potentially simplifies procedure workflow, instrument selection, and manipulation within the anatomy of interest. Such exemplary embodiments described herein can utilize 3D holographic displays for real-time visualization of volumetric datasets with exemplary localization methods for sensing movements in free space during a clinical procedure, thereby providing new methods of human-data interaction in the interventional suite.

In one exemplary embodiment, 3D holography may be used to fuse anatomical data with functional imaging and "sensing" information. A fourth dimension (e.g., time, color, texture, etc.) can be used to represent a dynamic 3D multimodality representation of the status of an object of interest (e.g., organ). A display can be in (near) real-time and use color-coded visual information and/or haptic feedback/tactile information, for example, to convey different effects of states of the holographically displayed object of interest. Such information can include morphological information about the target, functional information about the object of interest (e.g., flow, contractility, tissue biomechanical or chemical composition, voltage, temperature, pH, pO2, pCO2, etc.), or the measured changes in target properties due to interaction between the target and therapy being delivered. The exemplary 3D holographic display can be seen from (virtually) any angle/direction so that, e.g., multiple users can simultaneously interact with the same understanding and information.

Alternatively, it is possible to simultaneously display different information to different users positioned in the room, such as by displaying different information on each face of a cube or polyhedron, for example.

In one embodiment, one could "touch" or otherwise interact with a specific region of interest in the 3D holographic display (e.g., using one or multiple fingers, virtual tools, or physical instruments being tracked within the same interaction space), and tissue characteristics would become available and displayed in the 3D hologram. Such "touch" can also be used to, e.g., rotate the virtual organ, zoom, tag points in 3D, draw a path and trajectory plan (e.g., for treatment, targeting, etc.), select critical zones to avoid, create alarms, and drop virtual objects (e.g., implants) in 3D in the displayed 3D anatomy.

Exemplary embodiments according to the present disclosure can also be used to facilitate a remote procedure (e.g., where the practitioner "acts" on the virtual organ and a robot simultaneously or subsequently performs the procedure on the actual organ), to practice a procedure before performing the actual procedure in a training or simulation setting, and/or to review/study/teach a procedure after it has been performed (e.g., through data recording, storage, and playback of the 3D holographic display and any associated multimodality signals relevant to the clinical procedure).

Exemplary embodiments according to the present disclosure are further described herein below with reference to the appended figures. While such exemplary embodiments are largely described separately from one another (e.g., for ease of presentation and understanding), one having ordinary skill in the art shall appreciate in view of the teachings herein that such exemplary embodiments can be used independently and/or in combination with each other. Indeed, the implementation and use of the exemplary embodiments described herein, including combinations and variations thereof, all of which are considered a part of the present disclosure, can depend on, e.g., particular laboratory or clinical use/application, integration with other related technologies, available resources, environmental conditions, etc. Accordingly, nothing in the present disclosure should be interpreted as limiting of the subject matter disclosed herein.

A real-time 3D holographic display in accordance with the present principles may include a real-time six degree of freedom (DOF) input via localization technology embedded into a data interaction device (e.g., a haptic device for sensory feedback). An imaging/monitoring system for multidimensional data acquisition may also be employed. Datalinks between the holographic display, localization system/interaction device, and imaging/monitoring system may be provided for communication between these systems. In one embodiment, the display, feedback devices, localization devices, and measurement devices may be employed with or integrated with a computational workstation for decision support and data libraries of case information that can be dynamically updated/recalled during a live case for training/teaching/procedure guidance purposes (e.g., for similar archived clinical cases relative to the procedure and patient undergoing treatment).

It should be understood that the present invention will be described in terms of medical instruments; however, the teachings of the present invention are much broader and are applicable to any systems that can benefit from holographic visualization. In some embodiments, the present principles are employed in tracking or analyzing complex biological or mechanical systems. In particular, the present principles are applicable to internal tracking procedures of biological systems, procedures in all areas of the body such as the lungs, gastro-intestinal tract, excretory organs, blood vessels, etc. The elements depicted in the FIGS. may be implemented in various combinations of hardware and software and provide functions which may be combined in a single element or multiple elements.

The functions of the various elements shown in the FIGS. can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions can be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which can be shared. Moreover, explicit use of the term "processor" or "controller" should not be construed to refer exclusively to hardware capable of executing software, and can implicitly include, without limitation, digital signal processor ("DSP") hardware, read-only memory ("ROM") for storing software, random access memory ("RAM"), non-volatile storage, etc.

Moreover, all statements herein reciting principles, aspects, and embodiments of the invention, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future (i.e., any elements developed that perform the same function, regardless of structure). Thus, for example, it will be appreciated by those skilled in the art that the block diagrams presented herein represent conceptual views of illustrative system components and/or circuitry embodying the principles of the invention. Similarly, it will be appreciated that any flow charts, flow diagrams and the like represent various processes which may be substantially represented in computer readable storage media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.

Furthermore, embodiments of the present invention can take the form of a computer program product accessible from a computer-usable or computer-readable storage medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer readable storage medium can be any apparatus that may include, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or a propagation medium. Examples of a computer-readable medium include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Current examples of optical disks include compact disk - read only memory (CD-ROM), compact disk - read/write (CD-R/W), Blu-Ray™ and DVD.

Referring now to the drawings in which like numerals represent the same or similar elements and initially to FIG. 1, a system 100 for generating and interacting with holographic images is illustratively shown in accordance with one embodiment. System 100 may include a workstation or console 112 from which a procedure is supervised and/or managed.

Workstation 112 preferably includes one or more processors 114 and memory 116 for storing programs and applications. Memory 116 may store a holographic generation module 115 configured to render a holographic image on a display 158 or in-air depending on the application. The holographic generation module 115 codes image data to generate a three dimensional hologram. The coding may provide the hologram on a 2D display or in 3D media or a 3D display. In one example, data from 3D imaging, e.g., computed tomography, ultrasound or magnetic resonance, may be transformed into a hologram using spatial distribution and light intensity to render the hologram.

A localization system 120 includes a coordinate system 122 to which a holographic image or hologram 124 is registered. The localization system 120 may also be employed to register a monitored object 128, which may include virtual instruments, which are separately created and controlled, real instruments, a physician's hands, fingers or other anatomical parts, etc. The localization system 120 may include an electromagnetic tracking system, a shape sensing system, such as a fiber optic based shape sensing system, an optical sensing system, including light sensors and arrays, or other sensing modality, etc. The localization system 120 is employed to define spatial regions in and around the hologram or the holographic image 124 to enable a triggering of different functions or actions as a result of movement in the area of the hologram 124. For example, dynamic locations of a physician's hands may be tracked using a fiber optic shape sensing device. When the physician's hands enter the same space, e.g., a monitored space 126 about a projected hologram 124, the intensity of the hologram may be increased. In another example, the physician's hand movements may be employed to spatially alter the position or orientation of the hologram 124 or to otherwise interact with the hologram 124.
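The triggering described above reduces to a geometric test in the common coordinate system 122: do any tracked points on the monitored object fall inside the monitored space 126? The following Python sketch illustrates one minimal form of this test, assuming (hypothetically) a spherical monitored space and a list of tracked fingertip positions; the patent itself does not prescribe a particular geometry or implementation.

```python
import numpy as np

def points_coincide(object_points, space_center, space_radius):
    """Return True when any tracked point on the monitored object falls
    inside a spherical monitored space centered on the hologram."""
    d = np.linalg.norm(np.asarray(object_points, dtype=float)
                       - np.asarray(space_center, dtype=float), axis=1)
    return bool((d <= space_radius).any())

# Hypothetical fingertip positions (metres) reported by the localization system.
fingertips = [(0.10, 0.02, 0.30), (0.02, 0.01, 0.29)]

# Monitored space: a 0.15 m sphere about the projected hologram's center.
if points_coincide(fingertips, space_center=(0.0, 0.0, 0.3), space_radius=0.15):
    hologram_intensity = 1.0   # e.g., raise intensity when a hand enters
else:
    hologram_intensity = 0.6
```

In practice the monitored space could be any registered volume (e.g., the hologram's bounding mesh), but the coincidence check remains a point-in-region query per tracked sensor.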

A monitored object or sensing system 128 may be spatially monitored relative to the hologram 124 or the space 126 around the hologram 124. The monitored object 128 may include the physician's hands, a real or a virtual tool, another hologram, etc. The monitored object 128 may include a sensor or sensors 132 adapted to monitor the position of the monitored object 128 such that when a position of the object or a portion thereof is within the hologram 124 or the space 126 around the hologram 124, a reaction occurs that is consistent with the type of the monitored object 128 and the action performed or to be performed by the monitored object 128. The sensor or sensors 132 may include EM sensors, fiber optic shape sensors, etc.

In one embodiment, the sensors 132 include fiber optic shape sensors. A sensor interpretation module 134 may be employed to interpret feedback signals from a shape sensing device or system (132). Interpretation module 134 is configured to use the signal feedback (and any other feedback, e.g., optical, electromagnetic (EM) tracking, etc.) to reconstruct motion, deflection and other changes associated with the monitored object 128, which may include a medical device or instrument, virtual tools, human anatomical features, etc. The medical device may include a catheter, a guidewire, a probe, an endoscope, a robot, an electrode, a filter device, a balloon device, or other medical component, etc.

The shape sensing system (132) may include one or more optical fibers which are coupled to the monitored object 128 in a set pattern or patterns. The optical fibers connect to the workstation 112 through cabling 127. The cabling 127 may include fiber optics, electrical connections, other instrumentation, etc., as needed.

Shape sensing system (132) may be based on fiber optic Bragg grating sensors. A fiber optic Bragg grating (FBG) is a short segment of optical fiber that reflects particular wavelengths of light and transmits all others. This is achieved by adding a periodic variation of the refractive index in the fiber core, which generates a wavelength-specific dielectric mirror. A fiber Bragg grating can therefore be used as an inline optical filter to block certain wavelengths, or as a wavelength-specific reflector.

A fundamental principle behind the operation of a fiber Bragg grating is Fresnel reflection at each of the interfaces where the refractive index is changing. For some wavelengths, the reflected light of the various periods is in phase so that constructive interference exists for reflection and, consequently, destructive interference for transmission. The Bragg wavelength is sensitive to strain as well as to temperature. This means that Bragg gratings can be used as sensing elements in fiber optical sensors. In an FBG sensor, the measurand (e.g., strain) causes a shift in the Bragg wavelength.
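The strain sensitivity mentioned above can be made concrete with a small numeric sketch. Assuming a commonly cited first-order relation, the relative wavelength shift at constant temperature is approximately (1 − p_e)·ε, where p_e ≈ 0.22 is a typical effective photoelastic coefficient for silica fiber (a representative value, not one given in this disclosure):

```python
def bragg_shift_pm(wavelength_nm, strain_microstrain, photoelastic=0.22):
    """Approximate Bragg wavelength shift in picometres for a given axial
    strain, using dL/L = (1 - p_e) * strain, temperature held constant."""
    strain = strain_microstrain * 1e-6
    return wavelength_nm * 1e3 * (1.0 - photoelastic) * strain

# 100 microstrain on a 1550 nm grating shifts the Bragg peak by ~121 pm.
shift = bragg_shift_pm(1550.0, 100.0)
```

This is why an interrogator that resolves picometre-scale wavelength shifts can serve as a distributed strain sensor along the fiber.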

One advantage of this technique is that various sensor elements can be distributed over the length of a fiber. Incorporating three or more cores with various sensors (gauges) along the length of a fiber that is embedded in a structure permits a three dimensional form of such a structure to be precisely determined, typically with better than 1 mm accuracy. Along the length of the fiber, at various positions, a multitude of FBG sensors can be located (e.g., 3 or more fiber sensing cores). From the strain measurement of each FBG, the curvature of the structure can be inferred at that position. From the multitude of measured positions, the total three-dimensional form is determined.
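The step from per-station curvature to total form can be sketched in a planar simplification: integrating curvature times arc length along the fiber yields the centreline. This is an illustrative reduction of the 3D multi-core reconstruction described above, not the full algorithm.

```python
import numpy as np

def shape_from_curvature(curvatures, seg_len):
    """Integrate per-segment curvature (1/m), as inferred from FBG strain
    readings, into a 2D centreline: each segment turns the heading by
    curvature * arc length, then advances one segment length."""
    pts = [np.zeros(2)]
    heading = 0.0
    for k in curvatures:
        heading += k * seg_len
        step = seg_len * np.array([np.cos(heading), np.sin(heading)])
        pts.append(pts[-1] + step)
    return np.array(pts)

# A constant curvature of 2 (1/m) over ten 0.05 m segments bends the
# fibre along an arc of radius 0.5 m.
centreline = shape_from_curvature([2.0] * 10, 0.05)
```

The 3D case additionally needs the bend direction at each station, which is why three or more sensing cores (rather than one) are required.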

As an alternative to fiber-optic Bragg gratings, the inherent backscatter in conventional optical fiber can be exploited. One such approach is to use Rayleigh scatter in standard single-mode communications fiber. Rayleigh scatter occurs as a result of random fluctuations of the index of refraction in the fiber core. These random fluctuations can be modeled as a Bragg grating with a random variation of amplitude and phase along the grating length. By using this effect in three or more cores running within a single length of multi-core fiber, the 3D shape and dynamics of the surface of interest can be followed.

In one embodiment, workstation 112 includes an image generation module 148 configured to receive feedback from the shape sensing system 132 or other sensor to sense interactions with the hologram 124. The position and status of the hologram 124 and its surrounding space 126 are known to the localization system 120. When the monitored object 128 enters the space 126 or coincides with the positions of the hologram 124, as determined by a comparison module 142, an action is triggered depending on a type of motion, a type of monitored object 128, a type of procedure or activity and/or any other criteria. The comparison module 142 informs the holographic generation module 115 that a change is needed. The holographic generation module 115 recodes the image data, which is processed and output to the image generation module 148, which updates the hologram 124 in accordance with set criteria.
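The comparison-module behavior described above amounts to a containment test followed by a dispatch to the rendering path. A minimal sketch, assuming a spherical monitored space for simplicity; the class and function names are hypothetical stand-ins for the comparison module 142:

```python
from dataclasses import dataclass

@dataclass
class MonitoredSpace:
    """Spherical stand-in for the monitored space 126 around the hologram."""
    center: tuple = (0.0, 0.0, 0.0)
    radius: float = 1.0

    def contains(self, point):
        return sum((p - c) ** 2 for p, c in zip(point, self.center)) <= self.radius ** 2

def compare_and_trigger(space, tracked_points, handlers):
    """When any point of a monitored object coincides with the monitored
    space, invoke the handler registered for that object type so the
    hologram can be updated; returns the object types that triggered."""
    triggered = []
    for obj_type, points in tracked_points.items():
        hits = [p for p in points if space.contains(p)]
        if hits:
            handlers.get(obj_type, lambda h: None)(hits)
            triggered.append(obj_type)
    return triggered
```

In a real system the handler would depend on the type of motion and procedure as the text describes; here it is simply keyed by object type.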

In illustrative embodiments, the hologram 124 may include an internal organ rendered based on 3D images 152 of a patient or subject 150. The images 152 may be collected from the patient 150 preoperatively using an imaging system 110. Note the imaging system 110 and the patient 150 need not be present to employ the present principles as the system 100 may be employed for training, analysis or other purposes at any time. In this example, a physician employs a pair of gloves having sensors 132 disposed thereon. As the gloves/sensors 132 enter the space 126 and coincide with the hologram 124, the physician is able to rotate or translate the hologram 124. In another embodiment, the gloves include a haptic device 156 that provides tactile feedback depending on a position of the gloves/sensors relative to the hologram 124 or the space 126. In other embodiments, the haptic feedback is indicative of the tissue type corresponding with the hologram 124 and its representation. The haptic device or system 156 may include ultrasound sources, speakers or other vibratory sources to convey differences in state of the hologram 124 using vibrations or sound.

A display 118 and/or display 158 may also permit a user to interact with the workstation 112, the hologram 124 and its components and functions, or any other element within the system 100. This is further facilitated by an interface 130 which may include a keyboard, mouse, a joystick, a haptic device, or any other peripheral or control to permit user feedback from and interaction with the workstation 112.

In one embodiment, a user (practitioner, surgeon, fellow, etc.) can touch (or otherwise interact with) a specific region of interest (ROI) 154 within the 3D holographic display 158 or the hologram 124 within the 3D holographic display (or elsewhere) to display additional information related to the selected specific region of interest, e.g., tissue characteristics, such as temperature, chemical content, genetic signature, pressure, calcification percent, etc. An overlay of information can be displayed or presented on a separate exemplary 2D display (118), whereby parts of the 2D display can be transparent, for example, for better viewing of displayed information. It is also possible that the exemplary 2D display 118 presents or displays other graphics and text in high resolution (e.g., in exemplary embodiments where the 3D display may be of relatively low or limited resolution).

Other embodiments can provide a practitioner (e.g., doctor) with a "heads up" display (as display 158) or a combination display (118 and 158) to accommodate the display/presentation of such additional information. Additionally, other zones or regions of interest 154 can be automatically highlighted and/or outlined within the 3D holographic display 158 or hologram 124. Such other zones of interest can be, e.g., zones which have similar characteristics as the selected zone of interest and/or zones which are otherwise related.

According to yet another exemplary embodiment, the 3D holographic display 158 or hologram 124 may be employed with six degrees of freedom (6DOF) user tracking, e.g., with shape enabled instruments 132 and/or with camera based sensors 137, allowing for use as a user interface in 3D and real-time 6DOF user interaction. For example, a user (e.g., practitioner) is provided with the capability of touching a virtual organ being displayed as a 3D holographic image 124. The user can rotate, zoom in/out (e.g., changing the magnification of the view), tag points in 3D, draw a path and/or trajectory plan, select (critical) zones to avoid, create alarms, insert and manipulate the orientation of virtual implants in 3D in the anatomy, etc. These functions are carried out using the localization system(s) 120 and image generation system or module 148 working in conjunction with the holographic data being displayed for the hologram 124.

Seed points 162 may be created and dropped into the 3D holographic display 158 or hologram 124 by touching (and/or tapping, holding, etc.) a portion of the display 158 or the hologram 124. The seed points 162 may be employed for, e.g., activation of virtual cameras which can provide individually customized viewing perspectives (e.g., orientation, zoom, resolution, etc.) which can be streamed (or otherwise transmitted) onto a separate high resolution 2D display 118.

The touch feature can be employed to create or drop virtual seed points 162 into the 3D display 158 for a plurality of tasks, e.g., initialization of segmentation, modeling, registration or other computation, visualization, planning step, etc. In addition to the 3D holographic display 158 or hologram 124 being used to display data of an anatomy, the display can also be used to display buttons, drop down menus, pointers/trackers, optional functions, etc. allowing users to interact and give commands to the system and/or any computer included therein or connected thereto (e.g., directly connected or via the Internet or other network).

In another embodiment, a microphone 164 may be employed to receive verbal information to connect, control, interact, etc. with the exemplary 3D holographic display 158 or hologram 124 via voice-controlled commands. A speech recognition engine 166 may be provided to convert speech commands into program commands to allow a user (e.g., surgeon) to interact with the 3D holographic display 158 or hologram 124 without having to use their hands. For example, a user could say "SHOW LAO FORTY", and the volume displayed within the holographic image would rotate to the proper angle to provide the user with the desired view. Other commands can range from those which are relatively simple, such as "ZOOM", followed by a specific amount e.g., "3 times" or so as to display particular (additional) information, to more complex commands, e.g., which can be related to a specific task or procedure.
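The mapping from recognized utterances to display commands can be illustrated with a small parser. The phrase table, the supported grammar, and the sign convention chosen for the LAO/RAO angulation below are assumptions made purely for illustration; a speech recognition engine such as element 166 would sit upstream of this step.

```python
import re

# Hypothetical table mapping spelled-out numbers from the recognizer to angles.
WORD_NUMBERS = {"TEN": 10, "TWENTY": 20, "THIRTY": 30, "FORTY": 40, "FIFTY": 50}

def parse_voice_command(utterance):
    """Translate a recognized utterance into a (command, argument) pair
    for the holographic display. The LAO-positive sign convention is an
    assumption for this sketch."""
    text = utterance.strip().upper()
    m = re.match(r"SHOW (LAO|RAO)\s+(\w+)$", text)
    if m:
        angle = WORD_NUMBERS.get(m.group(2))
        if angle is None and m.group(2).isdigit():
            angle = int(m.group(2))
        if angle is not None:
            return ("rotate", angle if m.group(1) == "LAO" else -angle)
    m = re.match(r"ZOOM\s+(\d+(?:\.\d+)?)", text)
    if m:
        return ("zoom", float(m.group(1)))
    return ("unrecognized", utterance)
```

So the example in the text, "SHOW LAO FORTY", would resolve to a rotation command carrying the 40-degree left anterior oblique angulation.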

According to another embodiment, a recording mode can be provided in memory 116 and made available to, e.g., play back a case on a same device for full 3D replay and/or on conventional (2D or 3D) viewing devices with automatic conversion of recorded 3D scenes into multiple 2D viewing perspectives (or rotating 3D models, e.g., in virtual reality modeling language (VRML)). Data connections between the holographic display 158 and recordings archived in a library/database 168 such as a picture archiving and communication system (PACS), Radiology Information Systems (RIS) or other electronic medical record system can be used to facilitate, e.g., visualization and diagnostic interpretation/data mining. Recordings can be replayed and used for, e.g., teaching and training purposes, such as to teach or train others in an individual setting (e.g., when a user wants to review a recorded procedure performed), a small group environment (e.g., with peers and/or management), a relatively large class, lecture, etc. Such exemplary recordings may also be used for marketing presentations, research environments, etc. and may also be employed for quality and regulatory assessment, e.g., process evaluation or procedure assessment by hospital administrators, third-party insurers, investors, the Food and Drug Administration (FDA) and/or other regulatory bodies. Virtual cameras may be employed to capture or record multiple viewpoints/angles and generate multiple 2D outputs for, e.g., video capture or simultaneous display of images on different 2D television screens or monitors (or sections thereof).

Referring to FIG. 2, in another embodiment, three-dimensional (3D) holography may be used to display volumetric data of an anatomy (e.g., from a 3D CT scan), for example, to fuse anatomical with functional imaging and "sensing" information, as well as temporal (time-related) information. The information may be employed to create (generate, produce, display, etc.) a dynamic 3D multimodality representation 202 (e.g., a hologram) of an object (e.g., organ) and a status thereof using visual indicators 204, 206, such as colors, contrast levels and patterns from a display 210. The object 202 (e.g., hologram 124) may show different regions 204, 206 to indicate useful data on the object 202. For example, epicardial and/or endocardial mapping data can be used to, e.g., display electrical activity data on a heart image during an electrophysiology procedure, superimposed with the anatomical imaging data of the heart (e.g., coming from CT, XperCT or MR). Another example is the display of temperature maps which can be provided by MR during ablation, or magnetic resonance high-intensity focused ultrasound (MR-HIFU) 4D (four-dimensional) information during an intervention (e.g., using MR digital data transfer systems and procedures). It is also possible to use information associated with a real-time radiation dose distribution map superimposed over the anatomical target during a radiation oncology treatment (Linac, brachytherapy, etc.), for example. Other embodiments are also contemplated.
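At its simplest, superimposing functional data (electrical activity, temperature, dose) on the anatomy means mapping per-vertex measurements to colors. A minimal sketch using a linear blue-to-red ramp; a clinical system would of course use validated color maps, and the function names here are invented for illustration:

```python
import numpy as np

def scalar_to_rgb(values, vmin, vmax):
    """Map per-vertex functional measurements (e.g., activation time,
    temperature, radiation dose) to a blue-to-red ramp, clipped to range."""
    t = np.clip((np.asarray(values, dtype=float) - vmin) / (vmax - vmin), 0.0, 1.0)
    return np.column_stack([t, np.zeros_like(t), 1.0 - t])  # R, G, B in [0, 1]

def overlay(mesh_vertices, measurements, vmin, vmax):
    """Pair each anatomical vertex with a color encoding its measurement,
    ready to be superimposed on the anatomical rendering."""
    colors = scalar_to_rgb(measurements, vmin, vmax)
    return list(zip(mesh_vertices, map(tuple, colors)))
```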

Referring to FIG. 3, an exemplary holographic visualization of functional and anatomical information, which may be employed during an interventional procedure in accordance with an exemplary embodiment, is illustratively shown. A volumetric image 302 of a heart, in this example, is acquired and may be segmented to reduce computational space and to determine anatomical features of the heart as opposed to other portions of the image. This results in a segmented image 304. Functional or device data is acquired by performing measurements or tests in block 306 on the heart or other anatomical feature. In the illustrative embodiment, an electroanatomical map or other map is generated corresponding with the heart or organ. The map is registered to the segmented image 304 to provide a registered image 310 that may be generated and displayed as a hologram. Real-time catheter 308 data may be collected from within or about the heart using a localization technique (shape sensing, etc.). Data traces of catheter positions or other related data (treatment locations, etc.) may be rendered in a holographic image 312 which includes both the anatomical data (e.g., segmented hologram) and the device data (e.g., catheter data).

Another exemplary embodiment according to the present disclosure includes the acquisition of incomplete data (e.g., projections rather than full 3D images). This may include, for example, data in Fourier (frequency) space where intermittent or incomplete images are acquired. For example, undersampled image data in the frequency domain are collected. According to this exemplary embodiment, it is possible to construct (generate, produce, display, etc.) a 3D holographic image display with a relatively reduced amount of input data, and thus a relatively reduced amount of associated computational processing power and/or time. Depending on the incompleteness of the acquired data and what particular information may not be available, it is possible that the resultant 3D holographic image may be constructed/displayed with (some) limitations.

However, such exemplary embodiments can help achieve real-time or near-real-time dynamic displays with significantly less radiation exposure (e.g., in the case of live X-ray imaging) as well as computational overhead, which benefits can be considered (e.g., balanced, weighed against) in view of the potential limitations associated with this exemplary embodiment.
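The trade-off described here, less input data at the cost of possible image limitations, can be illustrated with the simplest frequency-domain reconstruction: zero-filling the unmeasured samples before an inverse FFT, shown in 2D. This is a deliberately naive sketch; practical undersampled reconstruction uses more sophisticated methods, and the helper name is hypothetical.

```python
import numpy as np

def zero_filled_recon(kspace, mask):
    """Reconstruct an image from undersampled frequency-domain data by
    setting unmeasured samples to zero before the inverse FFT. With a full
    sampling mask the original image is recovered; with fewer samples the
    result degrades gracefully (aliasing/blurring) but costs less to
    acquire and compute."""
    return np.abs(np.fft.ifft2(np.where(mask, kspace, 0.0)))
```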

Referring to FIG. 4, another exemplary embodiment includes inputting virtual instruments or objects into a holographic display. In one embodiment, objects 402 may be digitized or otherwise rendered into a virtual environment 404 and displayed. The objects 402 may be drawn or loaded into the workstation 112 as object data 405 and may be coded into the display 158 and concurrently rendered with the hologram 124. A static image of the object 402 may appear in the hologram 124 and may be separately manipulated with the hologram 124 (and/or on the display 158). The static image may be employed for size comparisons or measurements between the object 402 and the hologram 124.

In one embodiment, a converter box 406 may be included to employ a standardization protocol to provide for a "video-ready" interface to the 3D holographic display 158. For example, with respect to shape sensing technology, the converter box 406 can format the x, y, z coordinates from each localized instrument or object 402 (catheter, implant, etc.) into a space readable by the holographic display 158 (e.g., rasterized/scan converted voxel space, vector space, etc.). This can be performed using the workstation 112 in FIG. 1. The 3D format should at least support voxels (for volumes) and graphical elements/primitives, e.g., meshes (a virtual catheter can be displayed as a tube) and lines in 3D (to encode measurements and text rendering). The 3D format can be varied in accordance with the present disclosure based on, e.g., particular laboratory or clinical use or applications, integration with other related technologies, available resources, environmental conditions, etc. Using this video capability, the object 402 (e.g., a computer aided design rendering, model, scan, etc. for an instrument, medical device, implant, etc.) may be independently manipulated relative to the hologram 124 on the display 158 or in the air. In this way, the object 402 can be placed in or around the hologram 124 to determine whether the object will fit within a portion of the hologram 124, etc. For example, an implant may be placed through a blood vessel to test the fit visually. It is also contemplated that other feedback may be employed. For example, by understanding the space that the object 402 occupies and its orientation, a comparison module may be capable of determining interference between the hologram 124 and the objects 402 to enable, say, haptic feedback to indicate that a clearance for the implant is not possible. Other applications are also contemplated.
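The coordinate formatting performed by such a converter box can be sketched as a world-to-voxel mapping followed by scan conversion. A minimal illustration with assumed names; in practice the origin, voxel spacing and grid shape would come from the display configuration.

```python
import numpy as np

def world_to_voxel(points_xyz, origin, spacing, grid_shape):
    """Map localized instrument coordinates (x, y, z, e.g., in millimetres)
    into voxel indices of the display volume, discarding points outside it."""
    idx = np.round((np.asarray(points_xyz, dtype=float) - origin) / spacing).astype(int)
    inside = np.all((idx >= 0) & (idx < np.array(grid_shape)), axis=1)
    return idx[inside]

def rasterize_instrument(points_xyz, origin, spacing, grid_shape):
    """Scan-convert a tracked catheter or implant into the voxel space
    consumed by the holographic display."""
    volume = np.zeros(grid_shape, dtype=np.uint8)
    for i, j, k in world_to_voxel(points_xyz, origin, spacing, grid_shape):
        volume[i, j, k] = 1
    return volume
```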

In another exemplary embodiment, the system 100 of FIG. 1 and/or FIG. 4 may be employed as an education and/or training tool. For example, a practitioner (e.g., surgeon, physician, fellow, doctor, etc.) could practice a procedure (surgery, case, etc.) virtually prior to actually performing the procedure by understanding the 3D anatomy and/or incorporating the use of actual or virtual tools or instruments (monitored objects 128 and/or objects 402, respectively). A fellow/practitioner could practice (perform virtually) a surgical case/procedure by, e.g., sizing an implant to plan whether it would fit a particular patient's anatomy, etc.

Referring to FIG. 5 with continued reference to FIG. 1, a tracked input device (monitored object 128), e.g., an instrument tracked with shape sensing, electromagnetic tracking, acoustic tracking, or machine vision based optical tracking (time-of-flight cameras or structured light cameras), may be employed in conjunction with the display 158 to access a virtual help mode trigger point 504 (or other functions) in the hologram 124 and generated by the image generation module 148. The virtual help trigger point 504 may include pixel regions within the display or hologram. For example, when manipulating virtual instruments or objects 402, the region or trigger point 504 may be selected (e.g., virtually selected and displayed by using the tip of the tracked virtual tool 402, or using the monitored object 128), which is automatically registered with the hologram 124 in the image.

In one embodiment, the trigger points 504 are selected in the hologram 124 and a menu 502 or other interactive space may open to permit further selections. For example, a fellow/practitioner could first select a program called "HIP" by activating a trigger point 504 to display a 3D CT image of a patient's (subject's) hip, and then select different "HIP IMPLANTS" from different manufacturers to see and "feel" which implant would fit best for the particular patient. It is also possible to use (e.g., physically hold and manipulate) the actual implant in the air and position it within the 3D holographic display to see, feel and assess fit (e.g., if and how well such implant may fit the particular patient).

FIG. 5 shows the virtual menu 502 that may be provided in the holographic display 158 or other display 118 to permit the selection of a stent 508. The virtual menu 502 can be called using the display 158, the hologram 124 or by employing interface 130. Once the stent 508 is selected, a virtual model is rendered (see FIG. 4) in the display 158 or hologram 124 to permit manipulation, measurement, etc.

The virtual menu 502 provides for clinical decision support tying together localization and an exemplary holographic user interface in accordance with an exemplary embodiment. During intra-procedural use, the shape tracked instrument (128), e.g., a catheter, can be navigated to the anatomy of interest (504) and the virtual menu 502 can pop up automatically for each region, or the trigger point 504 may be activated by placing the object tip into the region of the trigger point 504 or otherwise activating the trigger point 504 (e.g., touching it, etc.). An implant or other device may be selected, which is then introduced to allow for device sizing/selection to be performed in the virtual holography space (e.g., within or in close proximity to the holographic display).

Referring to FIG. 6, according to another exemplary embodiment, a 3D holographic display 158 or hologram 124 may be employed during surgery to interact with a device 602 inside the patient. Robotics via a master/slave configuration can be used, where a shape sensed analog 604 of the device 602 moving within the display 158 is employed to actuate the motion of the actual device 602 within a target region 606. A practitioner's (surgeon's, physician's, etc.) hands 610 or voice can be tracked by sensor-based and/or voice-based techniques, such as by, e.g., tracking a physician's hands using a shape-sensing device 608 and shape sensing system 614 in the 3D holographic display. Accordingly, a practitioner's movements (including, e.g., (re)positioning, orientation, etc. of their hands) performed in the holographic display 158 can be transmitted to the device 602, such as a robot 612 (e.g., robotically controlled instruments) inside the patient to replicate such movements within the patient's body, and thereby perform the actual surgery, procedure or task inside the patient's body. Thus, a surgeon can see, touch and feel a 3D holographic display of an organ and perform a procedure thereon (i.e., within the 3D holographic display), causing such procedure to be performed on the actual organ inside of a patient, or simply causing instruments, e.g., robotically controlled instruments, to be moved.

The movement of the physician creates sensing signals using sensor 608 (and/or sensors in device 604), which are adapted into control signals by the system 100 for controlling the robot or other device 602. The signals may be stored in memory (116) for delayed execution if needed or desired. The actual procedure may be performed in real-time (or near real-time), e.g., a robot 612 performs a specific movement within a patient's body concurrently with a surgeon's performance within the 3D holographic display 158. The actual procedure may be performed by, e.g., a robot (only) after a surgeon completes a certain task/movement (or series of tasks or movements), and/or the surgeon confirms that the robot should proceed (e.g., after certain predefined criteria and/or procedural milestone(s) are reached). Such a delay (e.g., between the virtual performance of a task or movement within the 3D holographic display and the actual performance within a patient's body) can help to prevent any movements/tasks being performed within the patient incorrectly and ensure that each such movement/task is performed accurately and precisely by providing the surgeon an opportunity to confirm a movement/task after it has been performed virtually within the 3D holographic display 158 before it is actually performed within a patient's body 150 by the robot 612.

Further, a practitioner could opt to redo a specific movement/task that is virtually performed within the 3D holographic display 158 if the practitioner is not satisfied with such movement or task (for any reason). Thus, for example, if a surgeon were to inadvertently move too far in any particular direction when virtually performing a movement or task in the 3D holographic display, the surgeon could opt to redo such virtual movement or task (as many times as desired or may be necessary) until it is performed correctly, after which the actual movement or task can be performed by a robot inside of the patient with or without dynamic adaptation of the task to adjust for changes in target or therapy instrument on-the-fly (e.g., dynamically, on a continuous basis, in real-time, etc.).

Referring to FIG. 7, another exemplary embodiment includes haptic feedback, which can be incorporated by using, e.g., ultrasound to generate vibrations in the air. A haptic device 710 may take many forms. In one embodiment, a set of ultrasonic probes 702 can be used to send customized waves towards a 3D holographic display 704 to give the user (e.g., a practitioner) a sense of feeling of structures being displayed and represented by the 3D holographic display 704. For example, such waves can be tailored or customized to represent hard or stiff materials of bony structures, while other tissues, such as of the liver and/or vessels, can be represented with a relatively softer "feel" by waves which are tailored or configured accordingly. Such encoding can be realized by, e.g., modulation of the frequency, amplitude, phase, or other spatiotemporal modulation of the excitation imparted by a haptic device 710 to the observer.
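The frequency/amplitude encoding of tissue "feel" described above can be sketched as a lookup plus waveform synthesis. The per-tissue parameters below are invented purely for illustration (stiffer structures are given a sharper, stronger excitation); a real haptic device would use calibrated values.

```python
import numpy as np

# Hypothetical per-tissue haptic parameters: (carrier frequency in Hz,
# relative amplitude 0..1). Values are illustrative only.
TISSUE_HAPTICS = {"bone": (250.0, 1.0), "liver": (80.0, 0.4), "vessel": (120.0, 0.3)}

def haptic_waveform(tissue, duration_s=0.1, sample_rate=8000):
    """Synthesize a modulated excitation that a haptic device could emit
    for a given tissue type; unknown tissues fall back to a gentle default."""
    freq, amp = TISSUE_HAPTICS.get(tissue, (100.0, 0.2))
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    return amp * np.sin(2.0 * np.pi * freq * t)
```

Richer encodings (phase or spatiotemporal modulation, as the text mentions) would extend this same lookup-and-synthesize pattern.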

In another embodiment, the haptic feedback device 710 may be employed to, e.g., represent physical resistance of a particular structure or tissue in response to a movement or task performed by a practitioner. Thus, for example, when a surgeon 714 virtually performs a task within the 3D holographic display 704, using haptic feedback, it is possible for such task to be felt by the surgeon as if the surgeon were actually performing the task within the patient's body. This can be realized using a haptic device 712, such as, e.g., a glove(s), bracelet, or other garments or accessories having actuators or other vibratory elements.

According to one exemplary embodiment, the exemplary 3D holographic display 158 can be seen from (virtually) any angle, e.g., so that all users can interact with the same understanding and information. This is the case for an in-air hologram; however, other display techniques may be employed to provide multiple individuals with a same view.

Referring to FIG. 8, display information may be provided to different users positioned in a room or area, by displaying the same or different information on a geometrical structure 802 (holographically), such as a multi-faceted holographic display where each face of the display (e.g., a cube or polyhedron) displays the information. This can be achieved by projecting multiple 2D video streams 804 on the geometrical structure 802 (e.g., side by side, or partially overlapping) rendered within a holographic output 806. For example, a holographic "cube" display in 3D can show/display on one cube face information (e.g., a 2D live x-ray image) in one particular direction (e.g., the direction of a first practitioner 808), while another cube face of the same "cube" display can show/display another type of information (e.g., an ultrasound image) to a second practitioner 810 positioned elsewhere in the room (e.g., diametrically opposite the display from the first practitioner).

One having ordinary skill in the art will appreciate in view of the teachings provided herein that such exemplary display can be configured at will depending on, e.g., the number of users in the room. It is also possible that the position of each user/practitioner can be tracked in the room and that each individual's display information follows each user's viewing perspective as the user moves (e.g., during a procedure). For example, one particular user (doctor, nurse, etc.) can be provided with the specific information that the user needs regardless of where in the room such particular user moves during a procedure. Further, it is also possible that each user is provided with a unique display, which can be a 2D cube face, such as described above, or a 3D holographic display customized or tailored for such user, and that such a unique display can "follow" the user as the user moves around a room.

Multiple combinations of displays in accordance with this and other exemplary embodiments described herein are possible, providing, e.g., for individual users to have their own unique display and/or be presented with the same information of other users, regardless of the movement and location of a user within a room or elsewhere (e.g., outside of the room, off-site, etc.). Additionally, a user may initially select and change at any time during a procedure what information is displayed to them by, e.g., selecting from predefined templates, selecting specific informational fields, selecting the display of another particular user, etc.

Note that text is an inherently 2D mode of communication. The system may display shapes/symbols identifiable from multiple viewpoints, or represent the text oriented towards the viewer. In the case of multiple viewers, the oriented text may be shown in multiple directions simultaneously or to each viewer independently in different frames.
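Orienting text towards a viewer is the standard billboarding computation: build a rotation whose forward axis points from the text to the viewer. A minimal sketch (it assumes the viewer does not lie directly along the world up axis, where the cross product degenerates; the function name is hypothetical):

```python
import numpy as np

def billboard_matrix(text_pos, viewer_pos, world_up=(0.0, 1.0, 0.0)):
    """Return a rotation matrix with columns (right, up, forward) so that a
    2D text quad placed at text_pos faces viewer_pos; computing one matrix
    per tracked viewer yields legible text for each observer."""
    forward = np.asarray(viewer_pos, dtype=float) - np.asarray(text_pos, dtype=float)
    forward /= np.linalg.norm(forward)
    right = np.cross(world_up, forward)
    right /= np.linalg.norm(right)
    up = np.cross(forward, right)          # re-orthogonalized up vector
    return np.column_stack([right, up, forward])
```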

Referring to FIG. 9, in another exemplary embodiment, a remote system 900 may include at least some of the capabilities of system 100 (FIG. 1) but is remotely disposed relative to a patient 902 and data collection instruments. A user (practitioner, surgeon, fellow, etc.) may conduct a procedure remotely (e.g., with the user being physically located off-site from the location where the subject/patient 902 is located) or assist or provide guidance remotely to the procedure. For example, a user can perform a procedure/task on an exemplary holographic display 904 located at their location. In one embodiment, the display 904 is connected (e.g., via the Internet or other network 910 (wired or wireless)) to the system 100 co-located with the patient 902. System 100 can be in continuous communication with the remote system 900 (e.g., where the user is located) so that the holographic display 904 is continually updated in (near) real-time. Additionally, the system 100 may include robotically controlled instruments 906 (e.g., inside of a patient) which are controlled via commands provided (e.g., via the Internet) by the remote system 900, as described above. These commands are generated based on the user's interaction with the holographic display 904. Holographic displays 158 and 904 may include the same subject matter at one or more locations so that the same information is conveyed at each location. For example, this embodiment may include, e.g., providing guidance or assistance to another doctor around the globe, for peer-to-peer review, expert assistance or a virtual classroom where many students could attend a live case from different locations throughout the world.

Some or all of the exemplary embodiments and features described herein can also be used (at least in part) in conjunction or combination with any other embodiments described herein.

Referring to FIG. 10, a method for interacting with a holographic display is shown in accordance with illustrative embodiments. In block 1002, a holographically rendered anatomical image is generated and displayed. The image may include one or more organs or anatomical regions. The holographically rendered anatomical image may be generated in-air.

In block 1004, a monitored space is localized on or around the holographically rendered anatomical image to define a region for interaction. The localization system may include one or more of a fiber optic shape sensing system, an electromagnetic tracking system, a light sensor array and/or other sensing modality. The position and orientation of the monitored space and the one or more monitored objects are preferably determined in a same coordinate system. The one or more monitored objects may include a medical instrument, an anatomical feature of a user, a virtual object, etc.

In block 1006, a position and orientation of one or more monitored objects is monitored by the localization system. In block 1008, coincidence of spatial points is determined between the monitored space and the one or more monitored objects. In block 1010, if coincidence is determined, a response is triggered in the holographically rendered anatomical image. In block 1012, the response may include moving the holographically rendered anatomical image (e.g., 6DOF) or changing its appearance. In block 1014, the response may include adjusting a zoom (magnification) or other optical characteristics of the holographically rendered anatomical image. In block 1016, the holographically rendered anatomical image may be marked, tagged, targeted, etc. In block 1018, camera viewpoints can be assigned (for other viewers or displays). In block 1020, feedback may be generated to a user. The feedback may include haptic feedback (vibrating device or air), optical feedback (visual or color differences), acoustic feedback (verbal, alarms), etc.

In block 1022, a response region may be provided and monitored by the localization system such that upon activating the response region a display event occurs. The display event may include generating a help menu in block 1024; generating a menu of virtual objects to be included in the holographically rendered anatomical image upon selection in block 1026; and generating information to be displayed in block 1028.

In block 1030, the holographically rendered anatomical image may be generated with superimposed medical data mapped to positions on the holographically rendered anatomical image. In block 1032, the response that is triggered may include generating control signals for operating robotically controlled instruments. The control signals may enable remote operations to be performed.

In interpreting the appended claims, it should be understood that:

a) the word "comprising" does not exclude the presence of other elements or acts than those listed in a given claim;

b) the word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements;

c) any reference signs in the claims do not limit their scope;

d) several "means" may be represented by the same item or hardware or software implemented structure or function; and

e) no specific sequence of acts is intended to be required unless specifically indicated.

Having described preferred embodiments for holographic user interfaces for medical procedures (which are intended to be illustrative and not limiting), it is noted that modifications and variations can be made by persons skilled in the art in light of the above teachings. It is therefore to be understood that changes may be made in the particular embodiments of the disclosure disclosed which are within the scope of the embodiments disclosed herein as outlined by the appended claims. Having thus described the details and particularity required by the patent laws, what is claimed and desired protected by Letters Patent is set forth in the appended claims.