Title:
ADVANCED SYSTEMS AND METHODS FOR PROVIDING REAL-TIME ANATOMICAL GUIDANCE IN A DIAGNOSTIC OR THERAPEUTIC PROCEDURE
Document Type and Number:
WIPO Patent Application WO/2015/069657
Kind Code:
A1
Abstract:
The present disclosure provides, among other things, an imaging system and method for providing anatomical guidance in a diagnostic or therapeutic procedure. Also disclosed are a system and method for providing anatomical guidance in an ex vivo diagnostic procedure. Aspects and embodiments include devices and methods for an imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination for application with a whole body imaging system, an automated pathology system, a minimally invasive system, a computed tomography / magnetic resonance (CT/MR) integrated system, and the like.

Inventors:
NIE SHUMING (US)
MOHS AARON M (US)
MANCINI MICHAEL C (US)
Application Number:
PCT/US2014/063923
Publication Date:
May 14, 2015
Filing Date:
November 04, 2014
Assignee:
UNIV EMORY (US)
GEORGIA TECH RES INST (US)
International Classes:
A61B5/06; A61B5/00; A61B19/00; G01J3/02; G01J3/44; G01N21/359
Foreign References:
US20080051629A12008-02-28
US20080267472A12008-10-30
US20120123205A12012-05-17
US20110152692A12011-06-23
US20040006276A12004-01-08
Other References:
None
Attorney, Agent or Firm:
WIGLEY, David E. et al. (999 Peachtree Street N, Atlanta Georgia, US)
Claims:
CLAIMS

We Claim:

1. A system for providing anatomical guidance in a diagnostic or therapeutic procedure, comprising:

(a) a first light source configured to emit a beam of visible light to an area of interest in a subject;

(b) a second light source configured to emit a beam of near-infrared light to the area of interest;

(c) a probe device optically coupled to the second light source but not optically coupled to the first light source, comprising a plurality of optical fibers configured to deliver the emitted beam of near-infrared light to illuminate the area of interest and configured to collect light that is scattered or emitted from a contrast agent introduced into target tissues in the area of interest, in response to illumination by the second light source;

(d) a first imaging device optically coupled to the probe device and configured to detect the collected light and to generate a corresponding signal that comprises collected light data, and wherein the probe device is further configured to transmit the collected light to the first imaging device through the plurality of optical fibers of the probe device;

(e) a second imaging device configured to detect visible light that is emitted from the area of interest in response to illumination by the first light source through an optical path that has no overlapping portion with the plurality of optical fibers of the probe device, and to generate a corresponding signal comprising visible light data;

(f) a third imaging device configured to detect near-infrared light having a first predetermined wavelength that is emitted from the area of interest in response to illumination by the second light source and to generate a corresponding signal comprising a first set of near-infrared light data;

(g) a fourth imaging device configured to detect near-infrared light having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest in response to illumination by the second light source, and to generate a corresponding signal comprising a second set of near-infrared light data;

(h) a display for displaying at least one visual representation of data; and

(i) a controller in communication with each of the first light source, second light source, first imaging device, second imaging device, third imaging device, fourth imaging device, and display, and programmed to generate at least one real-time integrated visual representation of the area of interest from each of the collected light data, visible light data, first set of near-infrared light data, and second set of near-infrared light data and to display the at least one real-time visual representation on the display, for guidance during the diagnostic or therapeutic procedure.

2. A system according to claim 1, wherein the plurality of optical fibers are configured in an array, and the probe device is integral to a whole body imaging system, an automated pathology system, a minimally invasive system, or a computed tomography / magnetic resonance (CT/MR) integrated system.

3. A system according to claim 1, wherein the system is delivered within a trocar or catheter, or integrated into a biopsy needle or ablation probe.

4. An imaging system using integrated bright-field imaging, near-infrared imaging, and at least one of Raman imaging and fluorescence imaging for evaluating target tissues in an area of interest in a subject, comprising:

(a) a first light source for delivering a beam of visible light to the area of interest and a second light source for delivering a beam of near-infrared light to the area of interest;

(b) a Raman and fluorescence imaging means, comprising:

(i) a probe device optically coupled to the second light source but not optically coupled to the first light source, for delivering the near-infrared light to illuminate target tissues of the area of interest and for collecting at least one of scattered light and emitted light from a corresponding at least one of a Raman probe and a fluorescence probe that is introduced into the target tissues and illuminated by the second light source, the probe device integral to an endoscopic device and comprising a plurality of optical fibers; and

(ii) a first imaging device in communication with the probe device for obtaining at least one of Raman data from the collected scattered light and fluorescence data from the collected emitted light, respectively; and

(c) a bright-field imaging means, comprising:

(i) a second imaging device for obtaining visible light data from visible light emitted from the area of interest in response to illumination by the first light source through an optical path that has no overlapping portion with the plurality of optical fibers of the probe device;

(ii) a third imaging device for obtaining a first set of near-infrared data from light having a first predetermined wavelength that is emitted from the area of interest in response to illumination by the second light source; and

(iii) a fourth imaging device for obtaining a second set of near-infrared data from light having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest in response to illumination by the second light source.

5. An imaging system according to claim 4, wherein the plurality of optical fibers are configured in an array within the probe device, and the probe device is integral to a whole body imaging system, an automated pathology system, a minimally invasive system, or a computed tomography / magnetic resonance (CT/MR) integrated system.

6. An imaging system according to claim 4, wherein the bright-field imaging means further comprises:

(iv) an optical port;

(v) a system lens comprising a UV-NIR compact lens and a first focusing lens group;

(vi) a trichroic prism;

(vii) a first laser attenuating filter;

(viii) a bandpass filter;

(ix) a second laser attenuating filter;

(x) a second focusing lens group, a third focusing lens group, and a fourth focusing lens group;

wherein the optical port and the first imaging device define a first optical path therebetween having the trichroic prism and the second focusing lens group, wherein the optical port and the second imaging device define a second optical path therebetween having the trichroic prism, first laser attenuating filter, and third focusing lens group, and wherein the optical port and the third imaging device define a third optical path therebetween having the trichroic prism, the second laser attenuating filter, bandpass filter, and fourth focusing lens group.

7. A system according to claim 1, wherein the system further comprises software integrated with the imaging software of a CT or MR system to identify, localize, and display registered overlay images showing the presence and extent of tissue areas of interest correlated with the CT or MR systems.

8. A system according to claim 7, wherein the system is further integrated with the software of a robotic surgical system to provide localization information about the presence and extent of tissue areas of interest within a robotic surgical field.

9. A method for providing anatomical guidance in a diagnostic or therapeutic procedure, comprising the steps of:

(a) introducing at least one contrast agent into target tissues in an area of interest in a subject;

(b) emitting a beam of visible light to the area of interest, using a first light source;

(c) emitting a beam of near-infrared light to the area of interest, using a second light source;

(d) delivering the emitted beam of near-infrared light to illuminate the area of interest, using a plurality of optical fibers of a probe device that is optically coupled to the second light source but not optically coupled to the first light source;

(e) collecting at least one of scattered light and emitted light from the contrast agent in response to illumination by the second light source, using the plurality of optical fibers of the probe device, wherein the contrast agent comprises at least one of a Raman probe and a fluorescence probe;

(f) detecting the collected light and generating a corresponding signal that comprises collected light data, using a first imaging device that is optically coupled to the plurality of optical fibers, and wherein the optical fibers are further configured to deliver the collected light to the first imaging device;

(g) detecting visible light that is emitted from the area of interest in response to illumination by the first light source and generating a corresponding signal comprising visible light data, using a second imaging device;

(h) detecting near-infrared light having a first predetermined wavelength that is emitted from the area of interest in response to illumination by the second light source and generating a corresponding signal comprising a first set of near-infrared light data, using a third imaging device;

(i) detecting near-infrared light having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest in response to illumination by the second light source and generating a corresponding signal comprising a second set of near-infrared light data, using a fourth imaging device;

(j) generating at least one real-time integrated visual representation of the area of interest from the collected light data, visible light data, first set of near-infrared data, and second set of near-infrared data, using a controller in communication with each of the first imaging device, second imaging device, third imaging device, and fourth imaging device; and

(k) displaying the at least one real-time integrated visual representation generated by the controller, for guidance during a diagnostic or therapeutic procedure, using a display in communication with the controller.

10. A system, imaging system, or method according to any one of the preceding claims, further comprising an enclosure and at least one movable video camera to provide whole body or other large surface area surveys.

11. A system for providing anatomical guidance in an ex vivo diagnostic procedure, comprising:

(a) a first light source configured to emit a beam of visible light to an area of interest in a subject tissue;

(b) a second light source configured to emit a beam of near-infrared light to the area of interest;

(c) a probe device optically coupled to the second light source but not optically coupled to the first light source, comprising one or more optical fibers configured to deliver the emitted beam of near-infrared light to illuminate the area of interest and configured to collect light that is scattered or emitted from a contrast agent introduced into target tissues in the area of interest, in response to illumination by the second light source;

(d) a first imaging device optically coupled to the probe device and configured to detect the collected light and to generate a corresponding signal that comprises collected light data, and wherein the probe device is further configured to transmit the collected light to the first imaging device through the one or more optical fibers of the probe device;

(e) a second imaging device configured to detect visible light that is emitted from the area of interest in response to illumination by the first light source through an optical path that has no overlapping portion with the optical fibers of the probe device, and to generate a corresponding signal comprising visible light data;

(f) a third imaging device configured to detect near-infrared light having a first predetermined wavelength that is emitted from the area of interest in response to illumination by the second light source and to generate a corresponding signal comprising a first set of near-infrared light data;

(g) a fourth imaging device configured to detect near-infrared light having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest in response to illumination by the second light source, and to generate a corresponding signal comprising a second set of near-infrared light data;

(h) a display for displaying at least one visual representation of data; and

(i) a controller in communication with each of the first light source, second light source, first imaging device, second imaging device, third imaging device, fourth imaging device, and display, and programmed to generate at least one real-time integrated visual representation of the area of interest from each of the collected light data, visible light data, first set of near-infrared light data, and second set of near-infrared light data and to display the at least one real-time visual representation on the display, for guidance during the diagnostic procedure.

12. The system of claim 11, wherein said system further comprises a manual or automated system for examining said subject tissue within one or more carrier containers.

13. The system of claim 11, wherein said system further comprises a software interface that may correlate detected near-infrared light having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest in response to illumination by the second light source with one or more identified locations of said area of interest within one or more carrier containers.

14. The system of claim 13, wherein said software may direct a user to said one or more identified locations of said areas of interest within said one or more carrier containers.

15. The system of claim 14, wherein said carrier containers may be either single use disposables or reusable.

16. A method for providing anatomical guidance in an ex vivo diagnostic procedure, comprising:

(a) introducing at least one of a Raman probe and a fluorescence probe into a subject tissue until the at least one probe has accumulated in the target tissue;

(b) preparing the subject tissue for a diagnostic procedure;

(c) initializing an imaging system for integrated bright-field imaging, near-infrared imaging, and at least one of Raman imaging and fluorescence imaging;

(d) beginning the diagnostic procedure in the subject tissue;

(e) using a first real-time integrated visual representation of the subject tissue, generated by the imaging system, to identify a boundary of the subject tissue that is diseased;

wherein the imaging system is a system according to claim 9.

Description:
ADVANCED SYSTEMS AND METHODS FOR PROVIDING REAL-TIME ANATOMICAL GUIDANCE IN A DIAGNOSTIC OR THERAPEUTIC PROCEDURE

CROSS REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of priority to U.S. Provisional Application No. 61/900,125, filed November 5, 2013, the disclosure of which is incorporated herein by reference in its entirety.

FIELD OF THE DISCLOSURE

The present disclosure generally relates to systems and methods for intra-operatively providing guidance in a diagnostic or therapeutic procedure, including whole body imaging systems.

BACKGROUND

In many areas of surgery, there is a need for anatomical guidance and rapid pathology to be provided during a diagnostic or therapeutic procedure. In the area of surgical oncology, for example, there is a need to determine if a tumor has been completely resected, such as by verifying that the margin of resected tumor tissue is clear, without having to wait for pathology to process the resected tissue to verify that there are no remaining signs of cancerous growth in the margin.

Similarly, medical and biomedical practices often involve the visualization of human or other biological tissues as a means of detecting pathology, including the detection of cancer or pre-cancerous lesions. Such practices may include, but are not limited to, physical examination, endoscopic examinations or treatments, or procedures employing other imaging technologies, such as radiography, fluoroscopy, tomography, computerized tomography, magnetic resonance studies, positron emission tomography, or nuclear medical scans. Such imaging systems may detect abnormalities suggestive of pathology like cancer, but lack the capacity to definitively diagnose, in real time, the presence (or absence) of such pathology in the tissues examined. Therefore, a heretofore unaddressed need still exists in the art to address the aforementioned deficiencies and inadequacies.

SUMMARY

In one aspect, the present disclosure relates to a system for intra-operatively providing anatomical guidance in a diagnostic or therapeutic procedure. In one embodiment, the system includes a first light source that is configured to emit a beam of visible light to an area of interest in a living or non-living subject or subject tissue and a second light source that is configured to emit a beam of near-infrared light to the area of interest. The system also includes a handheld probe that is optically coupled to the second light source, and that includes an optical fiber that is configured to deliver the emitted beam of near-infrared light to illuminate the area of interest. The optical fiber is also configured to collect light that is scattered or light that is emitted from a contrast agent introduced into target tissues in the area of interest, in response to illumination by the second light source. A first electronic imaging device is also included in the system. The first electronic imaging device is optically coupled to the handheld probe and is configured to detect the collected light and to generate a corresponding signal that includes collected light data. The handheld probe is further configured to transmit the collected light to the first electronic imaging device through the optical fiber. The system further includes a second electronic imaging device that is configured to detect visible light that is emitted from the area of interest in response to illumination by the first light source, and to generate a corresponding signal including visible light data. A third electronic imaging device is also included in the system, which is configured to detect near-infrared light having a first predetermined wavelength that is emitted from the area of interest, in response to illumination by the second light source, and which is also configured to generate a corresponding signal including a first set of near-infrared light data.
In addition, the system includes a fourth electronic imaging device that is configured to detect near-infrared light having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest, in response to illumination by the second light source, and the fourth electronic imaging device is also configured to generate a corresponding signal that includes a second set of near-infrared light data. A display for displaying at least one visual representation of data is further included in the system. Also, the system includes a controller that is in communication with each of the first light source, second light source, first electronic imaging device, second electronic imaging device, third electronic imaging device, fourth electronic imaging device, and display. The controller is programmed to generate at least one real-time integrated visual representation of the area of interest from each of the collected light data, visible light data, first set of near-infrared light data, and second set of near-infrared light data, and to display the real-time visual representation on the display for guidance during the diagnostic or therapeutic procedure.

In an embodiment, the contrast agent includes a Raman probe and/or a fluorescence probe and the collected light data includes Raman data and/or fluorescence data, respectively. In this embodiment, the integrated visual representation includes a wide-field image of the area of interest that is generated from the visible light data, a laser excitation image of a selected area of the area of interest that is defined within the wide-field image and that is generated from at least one of the generated first set of near-infrared light data and the generated second set of near-infrared light data, and a Raman image generated from the Raman data and/or a fluorescence image generated from the fluorescence data. The Raman image and/or fluorescence image is defined within the wide-field image and the laser excitation image, as an overlay image on the laser excitation image.
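The layered composition described above (wide-field base image, laser-excitation region, Raman/fluorescence overlay) can be sketched as a simple per-pixel blend. This is an illustrative sketch only: the function name, blend weights, and the fixed green overlay color are assumptions, not details taken from the disclosure.

```python
import numpy as np

def composite_view(wide_field, laser_excitation, raman_overlay, alpha=0.6):
    """Composite a laser-excitation image and a Raman/fluorescence
    overlay onto a wide-field RGB frame (all values in [0, 1]).

    wide_field:       HxWx3 visible-light image
    laser_excitation: HxWx3 image of the NIR illumination spot
    raman_overlay:    HxW signal map in [0, 1] (nonzero where the
                      contrast-agent signal was detected)
    """
    # Blend the laser-excitation image into the wide-field image so the
    # operator can see where the NIR beam currently illuminates.
    frame = 0.7 * wide_field + 0.3 * laser_excitation
    # Show the Raman/fluorescence signal as a colored overlay confined
    # to the illuminated region; green is an arbitrary choice here.
    color = np.array([0.0, 1.0, 0.0])
    mask = raman_overlay[..., None]   # HxW -> HxWx1 for broadcasting
    frame = (1 - alpha * mask) * frame + alpha * mask * color
    return np.clip(frame, 0.0, 1.0)
```

In this sketch each representation stays registered to the same pixel grid, which is the property that lets the overlay be "defined within" the wide-field and laser-excitation images.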

In an embodiment, the first electronic imaging device includes a spectrometer and each of the second electronic imaging device, third electronic imaging device, and fourth electronic imaging device includes a CCD or CMOS camera.

In another aspect, the present disclosure relates to an imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues (including intra-operatively) in an area of interest in a living or non-living subject (or subject tissue). In one embodiment, the system includes a first light source for delivering a beam of visible light to the area of interest and a second light source for delivering a beam of near-infrared light to the area of interest. The system also includes a Raman and/or fluorescence imaging means that includes a handheld probe optically coupled to the second light source, for delivering the near-infrared light to illuminate target tissues of the area of interest, and for collecting scattered light and/or emitted light from a corresponding Raman probe and/or fluorescence probe that is introduced into the target tissues and illuminated by the second light source. The system further includes a first electronic imaging device that is in communication with the handheld probe, for obtaining Raman data and/or fluorescence data from the collected light. In this embodiment, the first electronic imaging device includes a spectrometer. A bright-field imaging means is also included in the system according to this embodiment.
The bright-field imaging means includes: an optical port; a system lens including a UV-NIR compact lens and a first achromatic correction lens; a silver mirror; a first dichroic mirror and a second dichroic mirror; a first shortpass filter and a second shortpass filter; a neutral density filter; a bandpass filter; a longpass filter; a second achromatic lens, a third achromatic lens, and a fourth achromatic lens; a second electronic imaging device for obtaining visible light data from visible light emitted from the area of interest in response to illumination by the first light source; a third electronic imaging device for obtaining a first set of near-infrared data from light having a first predetermined wavelength that is emitted from the area of interest in response to illumination by the second light source; and a fourth electronic imaging device for obtaining a second set of near-infrared data from light having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest in response to illumination by the second light source. Each of the second electronic imaging device, third electronic imaging device, and fourth electronic imaging device includes a CCD camera.

In an embodiment, the optical port and the first electronic imaging device define a first optical path between them that includes the silver mirror, the first dichroic mirror, the second dichroic mirror, and the second achromatic lens, where the optical port and the second electronic imaging device define a second optical path between them that includes the silver mirror, first dichroic mirror, second dichroic mirror, neutral density filter, and third achromatic lens. The optical port and the third electronic imaging device define a third optical path between them that includes the silver mirror, first dichroic mirror, longpass filter, bandpass filter, and fourth achromatic lens. The system of this embodiment also includes a display for displaying at least one visual representation of data, and a controller in communication with each of the first light source, second light source, first electronic imaging device, second electronic imaging device, third electronic imaging device, fourth electronic imaging device, and display. The controller is programmed for generating in real-time an integrated visual representation of the area of interest from the collected light data, first set of near-infrared data, and second set of near-infrared data, and displaying the integrated visual representation on the display, to provide guidance for performing a diagnostic or therapeutic procedure.

In an embodiment, the real-time integrated visual representation of the area of interest includes a wide-field image of the area of interest generated from the visible light data, a laser excitation image of a predetermined area defined within the wide-field image that is generated from the first set of near-infrared data and/or the second set of near-infrared data, and a Raman image and/or fluorescence image that is defined within the laser excitation image and that is generated from corresponding Raman data and/or fluorescence data. The Raman image and/or fluorescence image is an overlay image on the laser excitation image.

In an embodiment, the at least one integrated visual representation of the area of interest includes a wide-field image of the area of interest generated from the visible light data, a laser excitation image of a predetermined area defined within the wide-field image that is generated from at least one of the first set of near-infrared data and the second set of near-infrared data, and at least one of a Raman image and a fluorescence image that is generated from a corresponding at least one of the Raman data and fluorescence data. The laser excitation image is an overlay image on the wide-field image and represents the location of the delivered beam of near-infrared light within the area of interest. The Raman data and/or fluorescence data is represented by a signal that, when exceeding a predefined threshold level, signifies disease in the target tissues.

Further, the Raman image and/or the fluorescence image is a color overlay image on the laser excitation image, having an opacity representative of the level of the signal exceeding the predefined threshold level, and the opacity of the color overlay image decays over time to be progressively more translucent relative to the laser excitation image.
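The thresholding and opacity behavior just described can be sketched as a per-pixel alpha computation. This is a hedged sketch: the disclosure states only that opacity reflects how far the signal exceeds the threshold and that it fades over time, so the normalization rule, exponential decay law, and all parameter names below are illustrative assumptions.

```python
import numpy as np

def overlay_alpha(signal, threshold, t_since_update,
                  decay_tau=2.0, alpha_max=0.8):
    """Per-pixel opacity for the Raman/fluorescence color overlay.

    signal:         per-pixel detected signal level (array or scalar)
    threshold:      predefined disease-signifying threshold
    t_since_update: seconds since the pixel was last refreshed
    decay_tau:      assumed exponential decay time constant (seconds)
    alpha_max:      assumed maximum overlay opacity
    """
    # Only the portion of the signal above the threshold contributes.
    excess = np.clip(signal - threshold, 0.0, None)
    # Normalize so that a signal at 2x threshold is fully opaque
    # (an arbitrary choice for this sketch).
    level = np.clip(excess / np.maximum(threshold, 1e-9), 0.0, 1.0)
    # Opacity decays over time, making stale overlay progressively
    # more translucent relative to the laser excitation image.
    decay = np.exp(-t_since_update / decay_tau)
    return alpha_max * level * decay
```

A below-threshold pixel yields zero opacity (no overlay drawn), and an above-threshold pixel fades smoothly as its data ages, matching the behavior described above.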

In yet another aspect, the present disclosure relates to a method for intra-operatively providing anatomical guidance in a diagnostic or therapeutic procedure. In one embodiment, the method includes the steps of introducing at least one contrast agent into target tissues in an area of interest in a living or non-living subject, and the step of emitting a beam of visible light to the area of interest, using a first light source. The method also includes the step of emitting a beam of near-infrared light to the area of interest, using a second light source, and the step of delivering the emitted beam of near-infrared light to illuminate the area of interest, using an optical fiber of a handheld probe that is optically coupled to the second light source. In addition, the method includes the step of collecting scattered light and/or emitted light from the contrast agent in response to illumination by the second light source, using the optical fiber of the handheld probe. The contrast agent includes a Raman probe and/or a fluorescence probe. Further, the method includes the step of detecting the collected light and generating a corresponding signal that includes collected light data, using a first electronic imaging device that is optically coupled to the optical fiber, where the optical fiber is further configured to deliver the collected light to the first electronic imaging device. The method also includes the step of detecting visible light that is emitted from the area of interest in response to illumination by the first light source and generating a corresponding signal comprising visible light data, using a second electronic imaging device, and the step of detecting near-infrared light having a first predetermined wavelength that is emitted from the area of interest in response to illumination by the second light source and generating a corresponding signal that includes a first set of near-infrared light data, using a third electronic imaging device.
Still further, the method includes the step of detecting near-infrared light having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest in response to illumination by the second light source, and generating a corresponding signal that includes a second set of near-infrared light data, using a fourth electronic imaging device, and the step of generating at least one real-time integrated visual representation of the area of interest from the collected light data, visible light data, first set of near-infrared data, and second set of near-infrared data, using a controller that is in communication with each of the first electronic imaging device, second electronic imaging device, third electronic imaging device, and fourth electronic imaging device.

The method also includes the step of displaying the real-time integrated visual representation generated by the controller, for guidance during a diagnostic or therapeutic procedure, using a display that is in communication with the controller.

In an embodiment, the step of generating the real-time integrated visual representation of the area of interest includes the steps of generating a wide-field image of the area of interest from the visible light data, generating a laser excitation image of a selected area of the area of interest that is defined within the wide-field image, from the first set of near-infrared light data and/or the second set of near-infrared light data, and generating a Raman image and/or a fluorescence image from the collected light data that is defined within the wide-field image and the laser excitation image. The Raman image and/or fluorescence image is an overlay image on the laser excitation image.

In an embodiment, the first electronic imaging device includes a spectrometer, and each of the second electronic imaging device, third electronic imaging device, and fourth electronic imaging device includes a CCD or CMOS camera.

In yet another aspect, the present disclosure relates to software stored on a computer-readable medium that is programmed for causing a controller to perform functions for intra-operatively providing anatomical guidance in a diagnostic or therapeutic procedure. In one embodiment, the functions include causing a first light source in communication with the controller to emit a beam of visible light to an area of interest in a subject, causing a second light source optically coupled to an optical fiber of a handheld probe and in communication with the controller to emit a beam of near-infrared light to the area of interest through the optical fiber, and causing the optical fiber of the handheld probe to collect light scattered from a Raman probe and/or light emitted from a fluorescence probe, in response to illumination by the second light source. The Raman probe and/or fluorescence probe is introduced into the target tissues in the area of interest. The functions also include causing a first electronic imaging device that is in communication with the controller and the optical fiber to detect the collected light, and causing the first electronic imaging device to generate a signal from the collected light that includes Raman data and/or fluorescence data. Further, the functions include causing a second electronic imaging device that is in communication with the controller to detect visible light that is emitted from the area of interest in response to illumination by the first light source, causing the second electronic imaging device to generate a corresponding signal comprising visible light data, causing a third electronic imaging device that is in communication with the controller to detect near-infrared light having a first predetermined wavelength that is emitted from the area of interest in response to illumination by the second light source, and causing the third electronic imaging device to generate a corresponding signal that includes a first set of near-infrared light data.

In addition, the functions include causing a fourth electronic imaging device that is in communication with the controller to detect near-infrared light having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest in response to illumination by the second light source, and causing the fourth electronic imaging device to generate a corresponding signal that includes a second set of near-infrared light data. Further, the functions include generating at least one real-time integrated visual representation of the area of interest from the visible light data, first set of near-infrared data, second set of near-infrared data, and from the Raman data and/or fluorescence data, and causing a display in communication with the controller to display the generated real-time integrated visual representation for guidance during a diagnostic or therapeutic procedure.

In an embodiment, the function of generating the real-time integrated visual representation of the area of interest includes the steps of generating a wide-field image of the area of interest from the visible light data, generating a laser excitation image of a selected area of the area of interest that is defined within the wide-field image from the first set of near-infrared light data and/or the second set of near-infrared light data, and generating a Raman image from the Raman data and/or a fluorescence image from the fluorescence data, that is defined within the wide-field image and the laser excitation image.

In an embodiment, the Raman image and/or fluorescence image is an overlay image on the laser excitation image. The first electronic imaging device includes a spectrometer, and each of the second electronic imaging device, third electronic imaging device, and fourth electronic imaging device includes a CCD camera.

In yet another aspect, the present disclosure relates to a method for intra-operatively identifying disease in target tissues in an area of interest in a subject, to be resected in a diagnostic or therapeutic procedure. In one embodiment, the method includes the step of introducing a Raman probe and/or a fluorescence probe into the area of interest until the probe has accumulated in the target tissues, the step of preparing the living subject and the area of interest for a diagnostic or therapeutic procedure, and the step of initializing an imaging system for integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging. The method also includes the step of beginning the diagnostic or therapeutic procedure in the area of interest, the step of using a first real-time integrated visual representation of the area of interest and the target tissues that is generated by the imaging system to identify a boundary of the target tissues that are diseased, and the step of performing a surgical resection of the identified diseased target tissues within the boundary. Further, the method includes the steps of, after the surgical resection, using a second displayed real-time integrated visual representation of the area of interest and the target tissues, generated by the imaging system, to identify any remaining diseased target tissues within the boundary, and, if any remaining diseased target tissues are identified, performing a series of further surgical resections on the identified remaining diseased target tissues corresponding to a respective series of real-time integrated visual representations generated by the imaging system, until the area of interest is free from diseased target tissues.
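The resect-and-reimage workflow described above is essentially a loop that repeats until the integrated representation shows no remaining disease. A minimal control-flow sketch, with `imager` and `resect` as hypothetical stand-ins for the imaging system and the surgical step:

```python
def guided_resection(imager, resect):
    """Iterative workflow sketch: image, resect every diseased region found,
    then re-image; stop only when the representation shows no disease.

    `imager` returns the set of diseased regions currently visible in the
    integrated visual representation; `resect` removes one region. Both are
    hypothetical stand-ins for the imaging system and the surgical step."""
    diseased = imager()
    while diseased:
        for region in diseased:
            resect(region)
        diseased = imager()  # re-image after each round of resection
```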

In an embodiment, the imaging system includes a first light source that is configured to emit a beam of visible light to an area of interest in a subject and a second light source that is configured to emit a beam of near- infrared light to the area of interest. The system also includes a handheld probe that is optically coupled to the second light source, and that includes an optical fiber that is configured to deliver the emitted beam of near-infrared light to illuminate the area of interest and that is also configured to collect light that is scattered or light that is emitted from a contrast agent introduced into target tissues in the area of interest, in response to illumination by the second light source. A first electronic imaging device is also included in the system. The first electronic imaging device is optically coupled to the handheld probe and is configured to detect the collected light and to generate a corresponding signal that includes collected light data. The handheld probe is further configured to transmit the collected light to the first electronic imaging device through the optical fiber. The system further includes a second electronic imaging device that is configured to detect visible light that is emitted from the area of interest in response to illumination by the first light source, and to generate a

corresponding signal including visible light data. A third electronic imaging device is also included in the system, which is configured to detect near-infrared light having a first predetermined wavelength that is emitted from the area of interest, in response to illumination by the second light source, and which is also configured to generate a corresponding signal including a first set of near- infrared light data. In addition, the system includes a fourth electronic imaging device that is configured to detect near- infrared light having a second predetermined wavelength that is different from the first predetermined wavelength, and that is emitted from the area of interest in response to illumination by the second light source. The fourth electronic imaging device is also configured to generate a corresponding signal that includes a second set of near- infrared light data. A display for displaying at least one visual representation of data is further included in the system. Also, the system includes a controller that is in communication with each of the first light source, second light source, first electronic imaging device, second electronic imaging device, third electronic imaging device, fourth electronic imaging device, and display. The controller is programmed to generate at least one real- time integrated visual representation of the area of interest from each of the collected light data, visible light data, first set of near-infrared light data, and second set of near- infrared light data, and to display the at least one real-time visual representation on the display for guidance during the diagnostic or therapeutic procedure.

In an embodiment of the method, each of the steps of identifying diseased target tissues from the displayed visual representation includes identifying visual representations of the emitted laser excitation light and visual representations of the collected light data that are displayed in a selected area of the visual representation.

In one embodiment, the step of identifying the boundary of the target tissues that are diseased and the step of identifying any remaining diseased target tissues within the boundary include identifying visual representations of the first set of near-infrared light data, second set of near-infrared light data, and collected light data that are displayed in a selected area of the integrated visual representation. The visual representation of the first set of near-infrared data and second set of near-infrared data is a laser excitation image that represents the location of the delivered beam of near-infrared light within the area of interest, and that is displayed as a color overlay image on the wide-field image.

The signal representing the collected light data that is generated by the first electronic imaging device, when exceeding a predetermined threshold level, signifies disease in the target tissues. The visual representation of the collected light data is a color overlay image on the laser excitation image, having an opacity representative of the level of the signal exceeding the predetermined threshold level. The opacity of the color overlay image that represents the collected light data decays over time to be progressively more translucent relative to the laser excitation image.
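The threshold-gated, time-decaying overlay described above (the "virtual phosphorescence" behavior illustrated in FIGS. 14 and 15) can be sketched as a per-frame opacity update. The threshold and decay constants here are arbitrary illustrative values; the disclosure does not specify them.

```python
def update_overlay_opacity(opacity, signal, threshold=0.2, decay=0.85):
    """One display-frame update of the probe overlay's opacity.

    While the collected-light signal exceeds the threshold, the overlay is
    drawn at an opacity representing the excess over the threshold; once the
    signal drops away, the previous opacity decays geometrically, leaving a
    fading trail. `threshold` and `decay` are illustrative values only."""
    if signal > threshold:
        return min(1.0, signal - threshold)  # opacity tracks excess signal
    return opacity * decay                   # progressively more translucent
```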

Yet further embodiments include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination with an optical system, including, but not limited to, endoscopes, colonoscopes, microscopes, surgical microscopes, arthroscopes, laparoscopes, thoracoscopes, mediastinal endoscopes, hysteroscopes, cystoscopes, ureteroscopes, stereomicroscopes, colposcopes, fiber-optical systems, and rigid optical systems.

Still further embodiments include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination for concurrent use with therapeutic ablation probes to include RF, ultrasound, and microwave ablation technologies.

Still further embodiments include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination for concurrent use with therapeutic laser systems.

Embodiments include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination with an automated microscope/histology system to provide automated scanning of tissue. Yet further embodiments include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination with an enclosure and using movable and/or multiple video cameras to provide whole body or other large surface area surveys for detection of skin or subcutaneous cancer.

Still further embodiments include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination for concurrent use with mechanical surgical resection technologies.

Still further embodiments include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination for concurrent use with radioactive or drug-eluting tissue delivery implants.

Embodiments include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination for concurrent use with radiography, fluoroscopy, CT, MR, PET, radionuclide scanning, or other imaging systems to provide real-time diagnostic information combined with imaging data.

These and other aspects of the disclosure will become apparent from the following description of the preferred embodiments, taken in conjunction with the following drawings, although variations and modifications therein may be effected without departing from the spirit and scope of the novel concepts of the disclosure.

BRIEF DESCRIPTION OF THE DRAWINGS

The accompanying drawings illustrate one or more embodiments and together with the written description, serve to explain the principles of the invention. Wherever possible, the same reference numbers are used throughout the drawings to refer to the same or like elements of an embodiment, and wherein:

FIG. 1A shows schematically a system for intra-operatively providing anatomical guidance in a diagnostic or therapeutic procedure, according to an embodiment;

FIG. 1B shows schematically another view of the system according to the embodiment shown in FIG. 1A;

FIG. 2 is a flow chart illustrating the steps of a method for intra-operatively providing anatomical guidance in a diagnostic or therapeutic procedure, using the system according to the embodiment shown in FIGS. 1A and 1B;

FIG. 3 shows schematically a system for providing anatomical guidance in a diagnostic or therapeutic procedure combined with an optical pathway of a medical optical system, according to an embodiment;

FIG. 4 shows an embodiment of an enclosure for use with an imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues in real-time using movable and/or multiple video cameras to provide whole body or other large surface area surveys for detection of skin or subcutaneous cancer;

FIG. 5 shows a cross-section of an embodiment in which an imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein, in real-time, is connected to a sheath enclosing an ablation probe, biopsy needle, therapeutic laser delivery cable, or other medical device designed to be passed subcutaneously into a targeted tissue;

FIG. 6 illustrates the flowchart key for the integrated imaging and spectroscopy software flowcharts provided in this disclosure;

FIG. 7 shows the software startup portion of the integrated imaging and spectroscopy software according to the disclosure;

FIG. 8 illustrates the main software loop of the integrated imaging and spectroscopy software according to the disclosure;

FIG. 9 shows the display mode selector of the imaging process portion in the integrated imaging and spectroscopy software according to the disclosure;

FIG. 10 illustrates the determination of the amount of probe in the recorded spectrum of the imaging process portion in the integrated imaging and spectroscopy software of the disclosure;

FIG. 11 shows finding the laser position in the imaging process portion of the integrated imaging and spectroscopy software of the disclosure;

FIG. 12 shows finding the NIR probe position in the imaging process portion of the integrated imaging and spectroscopy software according to the disclosure;

FIG. 13 illustrates creating the composite display in the imaging process portion of the integrated imaging and spectroscopy software according to the disclosure;

FIG. 14 illustrates the "age" probe buffering (for VP) in the imaging process portion of the integrated imaging and spectroscopy software of this disclosure; and

FIG. 15 illustrates the virtual phosphorescence concept of the integrated imaging and spectroscopy software of this disclosure.

DETAILED DESCRIPTION

The following examples are intended as illustrative only because numerous modifications and variations therein will be apparent to those skilled in the art. Various embodiments are now described in detail. Referring to the drawings, like numbers indicate like components throughout the views. As used in the description herein and throughout the claims that follow, the meaning of "a", "an", and "the" includes plural reference unless the context clearly dictates otherwise. Also, as used in the description herein and throughout the claims that follow, the meaning of "in" includes "in" and "on" unless the context clearly dictates otherwise.

As used herein, the terms "diagnostic procedure" or "therapeutic procedure" encompass any medical or surgical procedure that involves the visualization of tissue surfaces or interior or exterior structures of a subject. These medical or surgical procedures may include, but are not limited to, physical examination, open surgery, minimally invasive surgery, biopsy, percutaneous biopsy or tissue ablation, endoscopy, colonoscopy, colposcopy, bronchoscopy, thoracoscopy, laryngoscopy, laparoscopy, arthroscopy, cystoscopy, ureteroscopy, and in-vivo or ex-vivo microscopy. These procedures may also be performed in conjunction with radiographic or other medical imaging systems, including but not limited to radiography, fluoroscopy, computerized or other tomography, magnetic resonance imaging, medical ultrasound, nuclear medical imaging, positron emission tomography, or other medical imaging systems. Further, as used herein, the terms "intra-operatively" and "intra-operative" shall mean during the course of or within the context of any such diagnostic or therapeutic procedure.

The description will be made as to the embodiments in conjunction with the accompanying drawings in FIGS. 1-5.

Now referring to FIGS. 1A and 1B, in one aspect, the present disclosure relates to a system for intra-operatively providing anatomical guidance in a diagnostic or therapeutic procedure. The system may include a first light source 100 that is configured to emit a beam of visible light to an area of interest 134 of a subject, and a second light source 102a that is configured to emit a beam of near-infrared light to the area of interest 134. The system may also include a handheld probe 104. The handheld probe 104 may be optically coupled to the second light source 102a and may include an optical fiber 106 configured to deliver the emitted beam of near-infrared light to illuminate the area of interest 134. The optical fiber 106 may also be configured to collect light that is scattered 140a and/or light that is emitted 140b from a contrast agent 132a/132b introduced into target tissues in the area of interest 134, in response to illumination by the second light source 102a.

The first electronic imaging device 102b may be optically coupled to the handheld probe 104 and may be configured to detect the collected light 140a/140b and to generate a corresponding signal that includes collected light data. The handheld probe 104 may be further configured to transmit the collected light 140a/140b to the first electronic imaging device 102b through the optical fiber 106.

The second electronic imaging device 126 may be configured to detect visible light 138 that is emitted from the area of interest 134 in response to illumination by the first light source 100, and to generate a corresponding signal that includes visible light data. The third electronic imaging device 122a may be configured to detect near-infrared light 142a having a first predetermined wavelength that is emitted from the area of interest 134, in response to illumination by the second light source 102a, and may also be configured to generate a corresponding signal that includes a first set of near-infrared light data. The fourth electronic imaging device 122b may be configured to detect near-infrared light 142b having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest 134, in response to illumination by the second light source 102a. The fourth electronic imaging device 122b may also be configured to generate a corresponding signal that includes a second set of near-infrared light data.

The system may also include a display 144 for displaying at least one visual representation of data. Also, the system may include a controller 130 that is in communication with each of the first light source 100, the second light source 102a, the first electronic imaging device 102b, the second electronic imaging device 126, the third electronic imaging device 122a, the fourth electronic imaging device 122b, and the display 144. The controller 130 may be programmed to generate at least one real-time integrated visual representation 146 of the area of interest 134 from each of the collected light data, visible light data, first set of near-infrared light data, and second set of near-infrared light data, and to display the visual representation on the display 144 for guidance during the diagnostic or therapeutic procedure.

In some embodiments, the contrast agent 132a/132b may include a Raman probe 132a and/or a fluorescence probe 132b, and the collected light data may include Raman data and/or fluorescence data, respectively. The integrated visual representation 146 may include a wide-field image 146d of the area of interest 134 that is generated from the visible light data, and a laser excitation image 146a of a selected area of the area of interest 134 that is defined within the wide-field image 146d. The laser excitation image 146a may be generated from at least one of the generated first set of near-infrared light data and the generated second set of near-infrared light data. The integrated visual representation 146 may also include a Raman image 146b generated from the Raman data and/or a fluorescence image 146c generated from the fluorescence data. The Raman image 146b and/or fluorescence image 146c may be defined within the wide-field image 146d and the laser excitation image 146a, as an overlay image on the laser excitation image 146a.

The first electronic imaging device 102b may include a spectrometer and each of the second electronic imaging device 126, third electronic imaging device 122a, and fourth electronic imaging device 122b may include a CCD or CMOS camera.

In another aspect, the disclosure relates to an imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging, for intra-operatively evaluating target tissues in an area of interest 134 of a subject. The system may include a first light source 100 for delivering a beam of visible light to the area of interest 134 and a second light source 102a for delivering a beam of near-infrared light to the area of interest 134. The system may also include a Raman imaging means and/or fluorescence imaging means that may include a handheld probe 104 optically coupled to the second light source 102a, for delivering the near-infrared light to illuminate target tissues of the area of interest 134, and for collecting scattered light 140a and/or emitted light 140b from a corresponding Raman probe 132a and/or fluorescence probe 132b that is introduced into the target tissues and illuminated by the second light source 102a. The system may further include a first electronic imaging device 102b that is in communication with the handheld probe 104, for obtaining Raman data and/or fluorescence data from the collected light 140a/140b. The first electronic imaging device 102b may include a spectrometer.

A bright-field imaging means may also be included in the system. The bright-field imaging means may include: an optical port 150; a system lens 108/110a that may include a UV-NIR compact lens 108 and a first achromatic correction lens 110a; a silver mirror 112; a first dichroic mirror 114a and a second dichroic mirror 116a; a first shortpass filter 114b and a second shortpass filter 116b; a neutral density filter 124; a bandpass filter 120; a longpass filter 118; a second achromatic lens 110b, a third achromatic lens 110c, and a fourth achromatic lens 110c; a second electronic imaging device 126 for obtaining visible light data from visible light 138 emitted from the area of interest 134 in response to illumination by the first light source 100; a third electronic imaging device 122a for obtaining a first set of near-infrared data from light 142a having a first predetermined wavelength that is emitted from the area of interest 134 in response to illumination by the second light source 102a; and a fourth electronic imaging device 122b for obtaining a second set of near-infrared data from light 142b having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest 134 in response to illumination by the second light source 102a. Each of the second electronic imaging device 126, third electronic imaging device 122a, and fourth electronic imaging device 122b may include a CCD or CMOS camera.

In some embodiments, the optical port 150 and the first electronic imaging device 102b may define a first optical path between them that includes the silver mirror 112, the first dichroic mirror 114a, the second dichroic mirror 116a, and the second achromatic lens 110b. The optical port 150 and the second electronic imaging device 126 may define a second optical path between them that includes the silver mirror 112, first dichroic mirror 114a, second dichroic mirror 116a, neutral density filter 124, and third achromatic lens 110c. The optical port 150 and the third electronic imaging device 122a may define a third optical path between them that includes the silver mirror 112, first dichroic mirror 114a, longpass filter 118, bandpass filter 120, and fourth achromatic lens 110c. The system may also include the display 144 for displaying at least one visual representation 146 of data, and the controller 130 in communication with each of the first light source 100, second light source 102a, first electronic imaging device 102b, second electronic imaging device 126, third electronic imaging device 122a, fourth electronic imaging device 122b, and display 144.
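For reference, the three optical paths enumerated above can be tabulated as plain data; this is a sketch under the assumption that each path is an ordered list of elements from the optical port to the detector (the dictionary keys and element names simply echo the reference numerals in the text):

```python
# Hypothetical tabulation of the three optical paths described above; the
# keys and element names echo the reference numerals in the text.
OPTICAL_PATHS = {
    "spectrometer_102b": ["silver_mirror_112", "dichroic_114a",
                          "dichroic_116a", "achromatic_lens_110b"],
    "visible_camera_126": ["silver_mirror_112", "dichroic_114a",
                           "dichroic_116a", "neutral_density_124",
                           "achromatic_lens_110c"],
    "nir_camera_122a": ["silver_mirror_112", "dichroic_114a",
                        "longpass_118", "bandpass_120",
                        "achromatic_lens_110c"],
}

def shared_elements(paths):
    """Return the elements common to every path (the shared front end)."""
    return set.intersection(*(set(p) for p in paths.values()))
```

The shared front end (silver mirror and first dichroic mirror) is what lets one optical port feed all three detectors.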

The controller may be any controller now known or later developed. For example, the controller may be, but is not limited to, a central processing unit, a processor, or a microprocessor. The controller may be coupled directly or indirectly to memory elements. The controller may also be a central processing unit or a processor of a machine, such as a conventional or general-purpose computer, that is capable of executing machine-executable instructions. The computer may also include a random-access memory (RAM), a read-only memory (ROM), and I/O devices to which the controller may be coupled. The controller 130 may be programmed to generate in real-time an integrated visual representation 146 of the area of interest 134 from the collected light data, visible light data, first set of near-infrared data, and second set of near-infrared data. The controller 130 may also be programmed to display the integrated visual representation 146 on the display 144, to provide guidance for performing a diagnostic or therapeutic procedure.

In some embodiments, the real-time integrated visual representation 146 of the area of interest 134 may include a wide-field image 146d of the area of interest 134 that is generated from the visible light data, a laser excitation image 146a of a predetermined area defined within the wide-field image 146d that is generated from the first set of near-infrared data and/or the second set of near-infrared data, and a Raman image 146b and/or fluorescence image 146c that is defined within the laser excitation image 146a and that is generated from corresponding Raman data and/or fluorescence data. The Raman image 146b and/or fluorescence image 146c may be an overlay image on the laser excitation image 146a.

In yet another aspect, the present disclosure relates to a method for intra-operatively providing anatomical guidance in a diagnostic or therapeutic procedure. The method may include the steps of introducing at least one contrast agent 132a/132b into target tissues in an area of interest 134 of a subject, and the step of emitting a beam of visible light to the area of interest 134, using a first light source 100. The method may also include the step of emitting a beam of near-infrared light to the area of interest 134, using a second light source 102a, and the step of delivering the emitted beam of near-infrared light to illuminate the area of interest 134, using an optical fiber 106 of a handheld probe 104 that is optically coupled to the second light source 102a. In addition, the method may include the step of collecting scattered light 140a and/or emitted light 140b from the contrast agent 132a/132b in response to illumination by the second light source 102a, using the optical fiber 106 of the handheld probe 104. The contrast agent 132a/132b may include a Raman probe 132a and/or fluorescence probe 132b. Further, the method may include the step of detecting the collected light 140a/140b and generating a corresponding signal that includes collected light data, using a first electronic imaging device 102b optically coupled to the optical fiber 106. The optical fiber 106 may be further configured to deliver the collected light 140a/140b to the first electronic imaging device 102b.

The method may also include the steps of detecting visible light 138 that is emitted from the area of interest 134 in response to illumination by the first light source 100 and generating a corresponding signal that includes visible light data, using a second electronic imaging device 126. Further, the method may also include the steps of detecting near-infrared light 142a having a first predetermined wavelength that is emitted from the area of interest 134 in response to illumination by the second light source 102a and generating a corresponding signal that includes a first set of near-infrared light data, using a third electronic imaging device 122a. Still further, the method may include the steps of detecting near-infrared light 142b having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest 134 in response to illumination by the second light source, and generating a corresponding signal including a second set of near-infrared light data, using a fourth electronic imaging device 122b. In addition, the method may include the step of generating at least one real-time integrated visual representation 146 of the area of interest 134 from the collected light data, visible light data, first set of near-infrared data, and second set of near-infrared data, using a controller 130 that is in communication with each of the first electronic imaging device 102b, second electronic imaging device 126, third electronic imaging device 122a, and fourth electronic imaging device 122b. The method may further include the step of displaying the real-time integrated visual representation 146 generated by the controller 130, for guidance during a diagnostic or therapeutic procedure, using a display 144 that is in communication with the controller 130.

The step of generating the real-time integrated visual representation 146 of the area of interest 134 may include the steps of generating a wide-field image 146d of the area of interest 134 from the visible light data, generating a laser excitation image 146a of a selected area of the area of interest 134 that is defined within the wide-field image 146d, from the first set of near-infrared light data and/or the second set of near-infrared light data, and generating a Raman image 146b and/or a fluorescence image 146c from the collected light data, that is defined within the wide-field image 146d and the laser excitation image 146a. The Raman image 146b and/or fluorescence image 146c may be an overlay image on the laser excitation image 146a.
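By way of a purely illustrative sketch, and not as part of the disclosed system, the composition step described above, in which the laser excitation sub-image is located within the wide-field image and the Raman or fluorescence image is blended over it as an overlay, might be expressed as follows. The function name, the grayscale list-of-lists image representation, and the alpha-blending rule are all assumptions made for illustration:

```python
# Hypothetical sketch of the display-composition step: the wide-field
# image forms the base, the laser excitation image defines a sub-region
# within it, and the Raman/fluorescence image is alpha-blended on top
# of that sub-region.

def compose_overlay(wide_field, excitation, raman, origin, alpha=0.6):
    """Blend `raman` over `excitation` and paste the result into
    `wide_field` at `origin`. Images are 2D lists of grayscale pixel
    values in [0, 255]; `origin` is the sub-region's top-left corner."""
    out = [row[:] for row in wide_field]          # copy the base image
    r0, c0 = origin
    for r, (ex_row, rm_row) in enumerate(zip(excitation, raman)):
        for c, (ex, rm) in enumerate(zip(ex_row, rm_row)):
            # weighted blend of overlay (rm) and excitation (ex) pixels
            out[r0 + r][c0 + c] = int(alpha * rm + (1 - alpha) * ex)
    return out

wide = [[10] * 6 for _ in range(6)]               # plain background
ex = [[100, 100], [100, 100]]                     # excitation sub-image
rm = [[200, 0], [0, 200]]                         # sparse Raman signal
frame = compose_overlay(wide, ex, rm, origin=(2, 2))
```

In a real implementation the same composition would typically run per video frame so that the overlay tracks the handheld probe in real time.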

In some embodiments, one or more contrast agents may be selected for desired tissue responses to allow for a multiplexed system that can simultaneously identify and display fluorescence in differing types of tissues or pathology. Thus, by selecting the appropriate contrast agent, a user could simultaneously and in real-time screen a targeted tissue for various types of cancer or other cellular pathologies.

In further embodiments, the first electronic imaging device 102b may include a spectrometer, and each of the second electronic imaging device 126, third electronic imaging device 122a, and fourth electronic imaging device 122b may include a CCD camera.

In yet another aspect, the present disclosure relates to software stored on a computer-readable medium programmed for causing the controller 130 to perform functions for intra-operatively providing anatomical guidance in a diagnostic or therapeutic procedure. A computer-readable medium may be any computer-readable medium now known or later developed. For example, the computer-readable medium may be any apparatus that may include, store, communicate, propagate, or transport the program for use by or in connection with the controller. The computer-readable medium may be electronic, magnetic, optical, electromagnetic, or infrared. Examples of a computer-readable medium may include, but are not limited to, a removable computer diskette, RAM, ROM, a rigid magnetic disk, and an optical disk, such as a compact disk-read-only memory (CD-ROM), compact disk-read/write (CD-R/W), and DVD.

The functions may include causing a first light source 100 in communication with the controller 130 to emit a beam of visible light to an area of interest 134 of a subject, causing a second light source 102a that is optically coupled to an optical fiber 106 and in communication with the controller 130 to emit a beam of near-infrared light to the area of interest 134 through the optical fiber 106, and causing the optical fiber 106 of the handheld probe 104 to collect light scattered 140a from a Raman probe and/or light emitted 140b from a fluorescence probe, in response to illumination by the second light source 102a. The Raman probe 132a and/or fluorescence probe 132b may be introduced into the target tissues in the area of interest 134. The functions may also include causing a first electronic imaging device 102b that is in communication with the controller 130 and the optical fiber 106 to detect the collected light 140a/140b, and causing the first electronic imaging device 102b to generate a signal from the collected light 140a/140b that includes Raman data and/or fluorescence data. Further, the functions may include causing a second electronic imaging device 126 that is in communication with the controller 130 to detect visible light 138 that is emitted from the area of interest 134 in response to illumination by the first light source 100, causing the second electronic imaging device 126 to generate a corresponding signal comprising visible light data, causing a third electronic imaging device 122a that is in communication with the controller 130 to detect near-infrared light 142a having a first predetermined wavelength that is emitted from the area of interest 134 in response to illumination by the second light source 102a, and causing the third electronic imaging device 122a to generate a corresponding signal that includes a first set of near-infrared light data.
In addition, the functions may include causing a fourth electronic imaging device 122b that is in communication with the controller 130 to detect near-infrared light 142b having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest 134 in response to illumination by the second light source 102a, and causing the fourth electronic imaging device 122b to generate a corresponding signal that includes a second set of near-infrared light data. Also, the functions may include generating at least one real-time integrated visual representation 146 of the area of interest 134 from the visible light data, first set of near-infrared data, second set of near-infrared data, and from the Raman data and/or fluorescence data, and causing a display 144 in communication with the controller 130 to display the generated real-time integrated visual representation 146 for guidance during a diagnostic or therapeutic procedure.
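As a minimal, hypothetical sketch of the controller functions recited above, the controller may be understood as triggering each imaging device per cycle and merging the four resulting data streams into one integrated frame. The class and device names below are invented for illustration and do not appear in the disclosure:

```python
# Illustrative controller loop: acquire from each of four imaging
# devices and merge their outputs into a single integrated frame.

class FakeDevice:
    """Stand-in for an electronic imaging device (102b, 126, 122a, 122b)."""
    def __init__(self, name, value):
        self.name, self.value = name, value
    def acquire(self):
        # A real device would return spectral or pixel data here.
        return {self.name: self.value}

class Controller:
    """Stand-in for controller 130: coordinates devices, fuses data."""
    def __init__(self, devices):
        self.devices = devices
    def integrated_frame(self):
        frame = {}
        for dev in self.devices:
            frame.update(dev.acquire())   # merge each device's data stream
        return frame

devices = [FakeDevice("raman", 1), FakeDevice("visible", 2),
           FakeDevice("nir_1", 3), FakeDevice("nir_2", 4)]
frame = Controller(devices).integrated_frame()
```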

In some embodiments, the function of generating the real-time integrated visual representation 146 of the area of interest 134 may include the steps of generating a wide-field image 146d of the area of interest 134 from the visible light data, generating a laser excitation image 146a of a selected area of the area of interest 134 that is defined within the wide-field image 146d from the first set of near-infrared light data and/or the second set of near-infrared light data, and generating a Raman image 146b from the Raman data and/or a fluorescence image 146c from the fluorescence data, that is defined within the wide-field image 146d and the laser excitation image 146a.

In further embodiments, the Raman image 146b and/or fluorescence image 146c may be an overlay image on the laser excitation image 146a. The first electronic imaging device 102b may include a spectrometer, and each of the second electronic imaging device 126, third electronic imaging device 122a, and fourth electronic imaging device 122b may include a CCD camera.

Now referring also to FIG. 2, in yet another aspect, the present disclosure relates to a method for intra-operatively identifying disease in target tissues in an area of interest 134 of a subject, to be resected in a diagnostic or therapeutic procedure. In one embodiment, the method may include the step 201 of introducing an optical contrast agent into the living subject, the step 203 of introducing a Raman probe and/or a fluorescence probe into the area of interest 134 until the probe has accumulated in the target tissues, the step 205 of preparing the living subject and the area of interest 134 for a diagnostic or therapeutic procedure, and the step 207 of initializing an imaging system for integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging. The method may also include the step 209 of beginning the diagnostic or therapeutic procedure in the area of interest 134, the step 211 of using a first real-time integrated visual representation of the area of interest 134 and the target tissues, generated by the imaging system, to identify a boundary of the target tissues that are diseased, and the step 213 of performing a surgical resection of the identified diseased target tissues within the boundary.

Further, the method may include the step 215 of determining whether all of the cancerous cells have been removed. If it is determined that there are remaining cancerous cells, the method may further include the step 217 of, after the surgical resection, using a second displayed real-time integrated visual representation of the area of interest 134 and the target tissues, generated by the imaging system, to identify any remaining diseased target tissues within the boundary, and repeating step 213 of performing a surgical resection of the identified diseased target tissues within the boundary. The method may include a series of further surgical resections on identified remaining diseased target tissues, corresponding to a respective series of real-time integrated visual representations generated by the imaging system, until the area of interest 134 is free from diseased target tissues.
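The image-resect-reimage cycle of steps 211 through 217 can be summarized as a simple control-flow sketch. This is a hypothetical illustration only; `detect` stands in for the imaging system's identification of diseased regions, and the tissue model is deliberately trivial:

```python
# Control-flow sketch of iterative image-guided resection: identify
# diseased regions, resect them, re-image, and repeat until none remain.

def guided_resection(regions, detect):
    """Repeat resection passes until `detect` reports no diseased
    regions. Returns the number of passes performed."""
    passes = 0
    diseased = detect(regions)
    while diseased:
        for region in diseased:
            regions.remove(region)        # "resect" the identified region
        passes += 1
        diseased = detect(regions)        # re-image after each resection
    return passes

tissue = ["healthy", "tumor", "healthy", "tumor"]
n = guided_resection(tissue, lambda rs: [r for r in rs if r == "tumor"])
```

The key point the sketch captures is that the stopping condition is driven by the imaging feedback rather than by a fixed number of resections.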

In an embodiment, the imaging system may include a first light source 100 that is configured to emit a beam of visible light to an area of interest 134 of a subject and a second light source 102a that is configured to emit a beam of near-infrared light to the area of interest 134. The system may also include a handheld probe 104 that is optically coupled to the second light source 102a, and that includes an optical fiber 106 that is configured to deliver the emitted beam of near-infrared light to illuminate the area of interest 134. The optical fiber 106 may also be configured to collect light 140a that is scattered or light 140b that is emitted from a contrast agent 132a/132b introduced into target tissues in the area of interest 134, in response to illumination by the second light source 102a. A first electronic imaging device 102b may also be included in the system. The first electronic imaging device 102b may be optically coupled to the handheld probe 104 and may be configured to detect the collected light 140a/140b and to generate a corresponding signal that includes collected light data. The handheld probe 104 may be further configured to transmit the collected light 140a/140b to the first electronic imaging device 102b through the optical fiber 106. The system may further include a second electronic imaging device 126 that is configured to detect visible light 138 that is emitted from the area of interest 134 in response to illumination by the first light source 100, and to generate a corresponding signal including visible light data. A third electronic imaging device 122a may also be included in the system, which is configured to detect near-infrared light 142a having a first predetermined wavelength that is emitted from the area of interest 134 in response to illumination by the second light source 102a, and which is also configured to generate a corresponding signal including a first set of near-infrared light data.
In addition, the system may include a fourth electronic imaging device 122b that is configured to detect near-infrared light 142b having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest 134, in response to illumination by the second light source 102a. The fourth electronic imaging device 122b may also be configured to generate a corresponding signal that includes a second set of near-infrared light data.

A display 144 for displaying at least one visual representation 146 of data may be further included in the system. Also, the system may include a controller 130 that is in communication with each of the first light source 100, second light source 102a, first electronic imaging device 102b, second electronic imaging device 126, third electronic imaging device 122a, fourth electronic imaging device 122b, and display 144. The controller 130 may be programmed to generate at least one real-time integrated visual representation 146 of the area of interest 134 from each of the collected light data, visible light data, first set of near-infrared light data, and second set of near-infrared light data, and to display the real-time visual representation 146 on the display 144 for guidance during the diagnostic or therapeutic procedure. In some embodiments, each of the steps of identifying diseased target tissues from the displayed real-time integrated visual representation 146 may include identifying visual representations 146a of the emitted laser excitation light 142a/142b and visual representations 146b/146c of the collected light data displayed in a selected area of the integrated visual representation 146.

Embodiments may include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination with a receiving optical system, including, but not limited to, endoscopes, colonoscopes, microscopes, surgical microscopes, arthroscopes, laparoscopes, thoracoscopes, mediastinal endoscopes, hysteroscopes, cystoscopes, ureteroscopes, stereomicroscopes, colposcopes, fiber-optical systems, and rigid optical systems. Thus, the fundamental system and methods for providing real-time anatomical guidance in a diagnostic or therapeutic procedure may be adapted as required to allow its incorporation into an optical probe integral to an endoscopic device, as well as a therapeutic ablation probe or a therapeutic laser system. This disclosure provides for both borescope-type devices and for video endoscope-type endoscopic devices.

As shown in FIG. 3, the optical pathways of the imaging system and spectroscope/laser source may be coupled electronically, mechanically, and optically to the optical pathway of a receiving optical system to allow the imaging and pathology detection system to be superimposed on the visual field of the receiving optical system. Optical coupling may be achieved by direct coupling of the respective optical pathways using fiber-optic bundles or other optical or electronic intermediaries. Alternately, in other embodiments, optical pathways for the imaging system, laser, illumination, and/or spectroscope may be connected to the target visual field using a parallel optical carrier.

Further embodiments may include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination for concurrent use with therapeutic ablation probes to include RF (radio frequency), ultrasound, and microwave ablation technologies.

Generally, ablation is the removal of a portion of the tissue from the surface of an organ or larger tissue area, typically by vaporization of the undesired portion of the tissue using sufficiently energetic ultrasound or electromagnetic radiation. For example, laser ablation, radio frequency (RF) ablation, ultrasound ablation, microwave ablation, and the like can be used to remove a tumor or other dysfunctional tissue, a fibrous mass, or a portion of the electrical conduction system of the heart, among many other applications. For instance, RF ablation, including pulsed radiofrequency techniques, uses the heat generated from high-frequency alternating current to treat a medical disorder, while lasers useful in ablation include the continuous argon ion, Nd:YAG, and CO2 lasers. This disclosure provides for use of the devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination and concurrently with therapeutic ablation probes, such that the ablation procedures can be performed using real-time image guidance. These concurrent and integrated systems can incorporate any of the computerized algorithms desired for the operation of the ablation probe, for example, any algorithm desired for pulsed, high-current percutaneous radiofrequency (RF) ablation.

Similarly, still further embodiments include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination for concurrent use with therapeutic laser systems. Therapeutic laser methods can provide successful treatment in a number of surgical settings; for example, surgical laser systems (e.g., carbon dioxide, argon, neodymium yttrium aluminum garnet (Nd:YAG)) can be used in oncology for surface ablation of cancerous lesions, photocoagulation of cancerous lesions, and activation of photodynamic therapy agents. In certain embodiments, the disclosed device integrates with therapeutic laser systems at a mechanical level by addition of the therapeutic laser to the local excitation and spectroscopy system (the "Spectropen box"). This system can be augmented with additional software for control of the ablation functions as desired. The therapeutic laser system can be used in open surgery, minimally invasive surgery (e.g., endoscopic procedures), robotic surgery, and in drug-eluting devices. Accordingly, the disclosure provides for devices and methods of making and using an imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination and concurrently with therapeutic laser systems such that laser therapy can be performed using real-time image guidance.

Embodiments may also include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination with an automated microscope/histology system to provide automated scanning of tissue.

Such an automated microscope/histology system may include electronic, optical, and mechanical coupling of the imaging and spectroscopic system described herein with the optical path of a microscope. It may further be provided with an automated system for scanning one or more cassettes containing tissue slices within the sensitivity range of the Raman imaging system, with computer identification of detected fluorescence so that a pathologist or other operator may further analyze the areas of positive fluorescence. In some embodiments, the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein may be provided in real-time combination with an automated microscope/histology system and an automated histological image recognition system to provide fully automated scanning of tissue.
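The automated cassette scan described above can be pictured, in a purely illustrative sketch that is not part of the disclosure, as stepping across each cassette's fields of view, reading a fluorescence intensity per field, and flagging fields above a detection threshold for the pathologist's review. The data layout and threshold value are assumptions:

```python
# Illustrative cassette scan: flag fluorescence-positive fields of view
# so a pathologist can examine those locations further.

def scan_cassettes(cassettes, threshold=50):
    """Return (cassette_index, field_index) pairs whose fluorescence
    reading exceeds `threshold`. `cassettes` is a list of per-cassette
    lists of field-of-view intensity readings."""
    flagged = []
    for ci, fields in enumerate(cassettes):
        for fi, intensity in enumerate(fields):
            if intensity > threshold:
                flagged.append((ci, fi))   # candidate area of positive fluorescence
    return flagged

readings = [[5, 80, 12],      # cassette 0: one bright field
            [3, 4, 90]]       # cassette 1: one bright field
hits = scan_cassettes(readings)
```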

Yet further embodiments may include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination with an enclosure and using movable and/or multiple video cameras to provide whole body or other large surface area surveys for detection of skin or subcutaneous cancer.

A representative example of one such enclosure is shown in FIG. 4. Either fixed or movable camera systems may be provided within said enclosure to allow whole body imaging by an imaging system using integrated bright-field imaging, near-infrared imaging, image mapping, and Raman imaging and/or fluorescence imaging according to the present disclosure.

Still further embodiments may include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination for concurrent use with mechanical surgical resection technologies.

Such embodiments may be represented by the exemplary embodiment 500 shown in FIG. 5, where a concentric sheath 501 encloses the shaft 505 (shown in cross section) of an ablation probe, fiber-optic laser delivery cable, or biopsy needle. The interior 510 of the sheath 501 may comprise a plurality of fiber-optic bundles to allow imaging, illumination, and laser delivery/spectroscopic analysis for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging of the present disclosure. The sheath 501 may be reusable or provided for single use as a sterile, disposable element.

An embodiment such as that represented in FIG. 5 allows direct visualization and fluorescent guidance of the placement of the ablation probe, biopsy needle, or laser delivery fibers for more precisely directed therapeutic application or controlled biopsy.

Still further embodiments may include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination for concurrent use with devices for the delivery of radioactive or drug-eluting tissue delivery implants. Such embodiments may employ a concentric sheath, as represented in FIG. 5, or may be directly integrated into the delivery devices for such radioactive or drug-eluting tissue delivery implants [not shown].

Further embodiments may also include devices and methods of manufacture and use for the imaging system using integrated bright-field imaging, near-infrared imaging, and Raman imaging and/or fluorescence imaging for evaluating target tissues as described herein in real-time combination for concurrent use with radiography, fluoroscopy, CT, MR, PET, radionuclide scanning, or other imaging systems to provide real-time diagnostic information combined with imaging data. Among the embodiments, the imaging system as described herein may be superimposed by computer applications on the image generated by the radiography, fluoroscopy, CT, MR, PET, radionuclide scanning, or other imaging systems to provide real-time input as to the localization of malignant tissue. This may provide tumor mapping capabilities at a resolution higher than that of the radiography, fluoroscopy, CT, MR, PET, radionuclide scanning, or other imaging modalities, or it may be used in conjunction with the interventional use of such modalities to facilitate directed biopsy or local treatment.

Therefore, this disclosure also provides systems and methods for interventional oncology, which includes a variety of minimally invasive, image-guided tumor therapies. Examples of interventional oncology procedures generally can be described as transcatheter therapies or ablative therapies. Transcatheter therapies such as TACE (transarterial chemoembolization) and SIRT (selective internal radiation therapy) are catheter-based tumor treatments. Ablative therapies such as radiofrequency ablation (RFA) or microwave ablation (MWA) generally involve the destruction of the lesion via a percutaneously placed needle. Each of these procedures can be carried out with the systems and methods disclosed herein.

Ablation therapy procedures and interventional biopsies that do not result in ablation are presently carried out using radiographic imaging for the placement of the biopsy needle or ablation probe. These conventional imaging techniques cannot specifically identify malignant tissue, or provide a clear answer to the question of residual live tumor following an ablation. Because there is no way to know whether or not the cancer has been ablated, the clinical protocol now generally calls for a repeat intervention after six months. This six-month interval means that tumor cells left behind unknowingly have ample time for both local and distant cancer spread; or, if the cancer was successfully eliminated during the first procedure, then the second procedure unnecessarily subjects healthy tissue to the ablation process. Accordingly, the system and methods disclosed herein can be used in conjunction with an ablation probe interface and a biopsy needle. In this aspect, the ablation probe can effectively "see" whether or not the cancer is eliminated. If it is, further treatment may not be needed; if it is not, additional radiation can be administered immediately, increasing the likelihood that the first procedure successfully eliminates the cancer. Thus, a fiber-optic interface can allow use of the disclosed methods, systems, and software with interventional catheters and biopsy/ablation probes and needles. Biopsy needles and ablation probes that can immediately detect the presence of cancer while in the body will enable interventional oncologists to immediately assess their targeting and the effect of their ablation efforts.
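The ablate-and-verify feedback described above can be sketched as a simple loop: ablate, re-read the contrast-agent signal at the probe tip, and stop once it falls below a residual-tumor threshold. This is a hypothetical illustration; the signal values, the per-pass signal reduction, and the threshold are all invented assumptions, not measured behavior:

```python
# Illustrative feedback loop for intra-procedure verification: apply
# ablation passes until the tumor signal drops below threshold, rather
# than waiting months for a follow-up intervention.

def ablate_until_clear(signal, threshold=10, reduction=0.5, max_passes=10):
    """Each pass multiplies the remaining tumor signal by `reduction`
    (a toy decay model). Returns (passes_used, final_signal)."""
    passes = 0
    while signal >= threshold and passes < max_passes:
        signal *= reduction               # one ablation pass
        passes += 1
    return passes, signal

passes, residual = ablate_until_clear(100.0)
```

The `max_passes` guard reflects the practical point that the loop must terminate even if the signal never clears, at which point the operator would reassess.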
According to a further aspect, this disclosure provides a system and method that combines a hand-held scanning unit with a whole-body scanning unit, about the size of a phone booth or tanning booth, with a mechanical scanner that scans the entire body for the presence of skin cancer, using either an oral or topical tracer agent. The dermal scanner can identify cancerous tissue that might not be visible or recognizable in a visual scan. By identifying skin cancers earlier, this system and method can improve the likelihood of removing cancer before metastasis and substantially reduce scarring. The whole body scanning unit can generate a whole body screening map to localize any surface areas that require the dermatologist's close examination. In this aspect, various contrast agents can be used, and the dermatologic agent can be administered either orally or topically rather than by injection. The wavelength of the laser activation source for the whole body system can be determined by the contrast agent used, and also can be different from the NIR laser used in open surgery.

The present methods and system can also be employed with automated pathology screening methods. Presently, a pathologist must select tissue from samples for examination, but unfortunately, only a fraction of the entire sample is usually examined under a microscope due to time and financial constraints. The present system can allow the pathologist to load thick sections of the entire surgical specimen into disposable carriers that are scanned using this system to pre-screen the entire sample for cancerous tissue. This directs the pathologist to examine particular locations within the specimen where a tracer agent is detected, thus reducing the likelihood of false-negative reports, and assisting the pathologist to provide a better analysis of the diseased tissue. Unlike automated pathology systems that rely on cell morphology, the present system does not attempt to displace the role of the surgical pathologist, but instead provides helpful, adjunctive information to the pathologist. By using image guidance to locate specific areas for conventional slides to be obtained, the disclosed system and method help reduce pathologist time, and ensure that all of the most likely tumor tissue within every specimen will be examined histologically by the pathologist.

The system and methods disclosed herein also can be used in combination with, and by integration with, CT or MR software and systems, providing the ability to tag desired tissue with the three-dimensional imaging software in modern CT and MR scanning. This integration with CT or MR systems effectively allows signal penetration in most tissues to substantially greater depths with a minimally invasive application. A range of multimodal contrast agents that will work with CT or MR can be employed in this method and system as well. Further, the imaging software integration can also apply to fluoroscopy, providing real-time, deep images of tissue events well below the penetration reach of the initial technology.
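Tagging a fluorescence-positive location in the CT or MR coordinate frame reduces, at its simplest, to mapping the probe's image coordinate into the scanner's frame with a known calibration. The sketch below is a purely illustrative per-axis scale-and-translate mapping; real registration would use a full rigid or affine transform, and the calibration values here are invented:

```python
# Minimal sketch of coordinate registration: map an (x, y) point from
# the probe's image frame into the CT/MR scanner frame using a per-axis
# scale and translation (a simplified stand-in for a full transform).

def to_scanner_frame(point, scale, offset):
    """Apply p * s + o component-wise to map `point` into scanner
    coordinates."""
    return tuple(p * s + o for p, s, o in zip(point, scale, offset))

# Hypothetical calibration: 0.5 mm per pixel, scanner origin offset.
ct_point = to_scanner_frame((10, 20), scale=(0.5, 0.5), offset=(100, 200))
```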

Further, the technology disclosed herein provides the ability to identify specific pathology in image-guided surgery, thereby affording the ability for integration with surgical robotics. By integrating the two imaging software systems, a robot can precisely remove selected tissue, while preserving non-involved tissues and structures with equal precision.

Therefore according to an aspect, disclosed herein is a system for providing anatomical guidance in a diagnostic or therapeutic procedure, comprising:

(a) a first light source configured to emit a beam of visible light to an area of interest in a subject;

(b) a second light source configured to emit a beam of near-infrared light to the area of interest;

(c) a probe device optically coupled to the second light source but not optically coupled to the first light source, comprising a plurality of optical fibers configured to deliver the emitted beam of near-infrared light to illuminate the area of interest and configured to collect light that is scattered or emitted from a contrast agent introduced into target tissues in the area of interest, in response to illumination by the second light source;

(d) a first imaging device optically coupled to the probe device and configured to detect the collected light and to generate a corresponding signal that comprises collected light data, and wherein the probe device is further configured to transmit the collected light to the first imaging device through the plurality of optical fibers of the probe device;

(e) a second imaging device configured to detect visible light that is emitted from the area of interest in response to illumination by the first light source through an optical path that has no overlapping portion with the plurality of optical fibers of the probe device, and to generate a corresponding signal comprising visible light data;

(f) a third imaging device configured to detect near-infrared light having a first predetermined wavelength that is emitted from the area of interest in response to illumination by the second light source and to generate a corresponding signal comprising a first set of near-infrared light data;

(g) a fourth imaging device configured to detect near-infrared light having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest in response to illumination by the second light source, and to generate a corresponding signal comprising a second set of near-infrared light data;

(h) a display for displaying at least one visual representation of data; and

(i) a controller in communication with each of the first light source, second light source, first imaging device, second imaging device, third imaging device, fourth imaging device, and display, and programmed to generate at least one real-time integrated visual representation of the area of interest from each of the collected light data, visible light data, first set of near-infrared light data, and second set of near-infrared light data and to display the at least one real-time visual representation on the display, for guidance during the diagnostic or therapeutic procedure.

For example, the probe device described above can be integral to a whole body imaging system, an automated pathology system, a minimally invasive system, or a computed tomography / magnetic resonance (CT/MR) integrated system. In another aspect, the plurality of optical fibers can be configured in an array such that they are regularly spaced, if desired. Moreover, the system according to this disclosure can be delivered within a trocar or catheter, or integrated into a biopsy needle or ablation probe. For example, the device size can range from about 1 to about 50 gauge. In a further aspect, the system can further comprise software integrated with the imaging software of a CT or MR system to identify, localize, and display registered overlay images showing the presence and extent of tissue areas of interest correlated with the CT or MR systems. In this aspect, the system can be further integrated with the software of a robotic surgical system to provide localization information about the presence and extent of tissue areas of interest within a robotic surgical field if desired.

In another aspect, the imaging system using integrated bright-field imaging, near-infrared imaging, and at least one of Raman imaging and fluorescence imaging for evaluating target tissues in an area of interest in a subject can comprise the following:

(a) a first light source for delivering a beam of visible light to the area of interest and a second light source for delivering a beam of near-infrared light to the area of interest;

(b) a Raman and fluorescence imaging means, comprising:

(i) a probe device optically coupled to the second light source but not optically coupled to the first light source, for delivering the near-infrared light to illuminate target tissues of the area of interest and for collecting at least one of scattered light and emitted light from a corresponding at least one of a Raman probe and a fluorescence probe that is introduced into the target tissues and illuminated by the second light source, the probe device integral to an endoscopic device and comprising a plurality of optical fibers; and

(ii) a first imaging device in communication with the probe device for obtaining at least one of Raman data from the collected scattered light and fluorescence data from the collected emitted light, respectively; and

(c) a bright-field imaging means, comprising:

(i) a second imaging device for obtaining visible light data from visible light emitted from the area of interest in response to illumination by the first light source through an optical path that has no overlapping portion with the plurality of optical fibers of the probe device;

(ii) a third imaging device for obtaining a first set of near-infrared data from light having a first predetermined wavelength that is emitted from the area of interest in response to illumination by the second light source; and

(iii) a fourth imaging device for obtaining a second set of near-infrared data from light having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest in response to illumination by the second light source.

According to a further aspect of the immediately above-described imaging system, the bright-field imaging means of the imaging system can further comprise:

(iv) an optical port;

(v) a system lens comprising a UV-NIR compact lens and a first focusing lens group;

(vi) a trichroic prism;

(vii) a first laser attenuating filter;

(viii) a bandpass filter;

(ix) a second laser attenuating filter;

(x) a second focusing lens group, a third focusing lens group, and a fourth focusing lens group;

wherein the optical port and the first imaging device define a first optical path therebetween having the trichroic prism and the second focusing lens group, wherein the optical port and the second imaging device define a second optical path therebetween having the trichroic prism, first laser attenuating filter, and third focusing lens group, and wherein the optical port and the third imaging device define a third optical path therebetween having the trichroic prism, the second laser attenuating filter, bandpass filter, and fourth focusing lens group.

In this aspect, the plurality of optical fibers can be configured in an array within the probe device, and the probe device can be integral to a whole body imaging system, an automated pathology system, a minimally invasive system, or a computed tomography / magnetic resonance (CT/MR) integrated system as desired.

There is also provided a method for providing anatomical guidance in a diagnostic or therapeutic procedure, comprising the steps of:

(a) introducing at least one contrast agent into target tissues in an area of interest in a subject;

(b) emitting a beam of visible light to the area of interest, using a first light source;

(c) emitting a beam of near-infrared light to the area of interest, using a second light source;

(d) delivering the emitted beam of near-infrared light to illuminate the area of interest, using a plurality of optical fibers of a probe device that is optically coupled to the second light source but not optically coupled to the first light source;

(e) collecting at least one of scattered light and emitted light from the contrast agent in response to illumination by the second light source, using the plurality of optical fibers of the probe device, wherein the contrast agent comprises at least one of a Raman probe and a fluorescence probe;

(f) detecting the collected light and generating a corresponding signal that comprises collected light data, using a first imaging device that is optically coupled to the plurality of optical fibers, and wherein the optical fibers are further configured to deliver the collected light to the first imaging device;

(g) detecting visible light that is emitted from the area of interest in response to illumination by the first light source and generating a corresponding signal comprising visible light data, using a second imaging device;

(h) detecting near-infrared light having a first predetermined wavelength that is emitted from the area of interest in response to illumination by the second light source and generating a corresponding signal comprising a first set of near-infrared light data, using a third imaging device;

(i) detecting near-infrared light having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest in response to illumination by the second light source and generating a corresponding signal comprising a second set of near-infrared light data, using a fourth imaging device;

(j) generating at least one real-time integrated visual representation of the area of interest from the collected light data, visible light data, first set of near-infrared data, and second set of near-infrared data, using a controller in communication with each of the first imaging device, second imaging device, third imaging device, and fourth imaging device; and

(k) displaying the at least one real-time integrated visual representation generated by the controller, for guidance during a diagnostic or therapeutic procedure, using a display in communication with the controller.
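Steps (a) through (k) above amount to a per-frame acquisition-and-fusion loop: four imaging channels are captured and combined into one integrated view. The sketch below is purely illustrative and not part of the claimed subject matter; the function name `integrate_frames`, the thresholding rule, and the two-wavelength agreement test are all assumptions standing in for the controller's actual fusion logic.

```python
# Hypothetical sketch of the controller's fusion step: the Raman/
# fluorescence channel, the visible (bright-field) frame, and the two
# near-infrared frames are combined into one integrated overlay.

def integrate_frames(raman, visible, nir1, nir2, threshold=0.5):
    """Fuse the four data channels into one overlay frame.

    Pixels whose Raman/fluorescence signal exceeds `threshold` are
    marked on top of the bright-field (visible) image; requiring the
    first NIR wavelength to dominate the second is an assumed rule
    for suppressing background.
    """
    overlay = [row[:] for row in visible]          # copy visible frame
    for y, row in enumerate(raman):
        for x, value in enumerate(row):
            if value > threshold and nir1[y][x] > nir2[y][x]:
                overlay[y][x] = "MARK"             # highlight target tissue
    return overlay

# Toy 2x2 frames standing in for the four imaging devices' outputs:
visible = [["v00", "v01"], ["v10", "v11"]]
raman   = [[0.9, 0.1], [0.2, 0.8]]
nir1    = [[1.0, 0.0], [0.0, 1.0]]
nir2    = [[0.2, 0.5], [0.5, 0.2]]

print(integrate_frames(raman, visible, nir1, nir2))
```

In the claimed system this fusion would run continuously so that the integrated representation on the display tracks the surgical field in real time.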

It is noted that any of the system, the imaging system, or the method according to any one of the preceding claims, can further comprise an enclosure and at least one movable video camera to provide whole body or other large surface area surveys.

According to a further aspect of this disclosure, there is provided a system for providing anatomical guidance in an ex vivo diagnostic procedure, comprising:

(a) a first light source configured to emit a beam of visible light to an area of interest in a subject tissue;

(b) a second light source configured to emit a beam of near-infrared light to the area of interest;

(c) a probe device optically coupled to the second light source but not optically coupled to the first light source, comprising one or more optical fibers configured to deliver the emitted beam of near-infrared light to illuminate the area of interest and configured to collect light that is scattered or emitted from a contrast agent introduced into target tissues in the area of interest, in response to illumination by the second light source;

(d) a first imaging device optically coupled to the probe device and configured to detect the collected light and to generate a corresponding signal that comprises collected light data, and wherein the probe device is further configured to transmit the collected light to the first imaging device through the one or more optical fiber(s) of the probe device;

(e) a second imaging device configured to detect visible light that is emitted from the area of interest in response to illumination by the first light source through an optical path that has no overlapping portion with the optical fiber(s) of the probe device, and to generate a corresponding signal comprising visible light data;

(f) a third imaging device configured to detect near-infrared light having a first predetermined wavelength that is emitted from the area of interest in response to illumination by the second light source and to generate a corresponding signal comprising a first set of near-infrared light data;

(g) a fourth imaging device configured to detect near-infrared light having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest in response to illumination by the second light source, and to generate a corresponding signal comprising a second set of near-infrared light data;

(h) a display for displaying at least one visual representation of data; and

(i) a controller in communication with each of the first light source, second light source, first imaging device, second imaging device, third imaging device, fourth imaging device, and display, and programmed to generate at least one real-time integrated visual representation of the area of interest from each of the collected light data, visible light data, first set of near-infrared light data, and second set of near-infrared light data and to display the at least one real-time visual representation on the display, for guidance during the diagnostic procedure.

This system can further comprise a manual or automated system for examining said subject tissue within one or more carrier containers. In this aspect, the system may further comprise a software interface that correlates detected near-infrared light having a second predetermined wavelength that is different from the first predetermined wavelength and that is emitted from the area of interest in response to illumination by the second light source with one or more identified locations of said area of interest within the one or more carrier containers. For example, the software may direct a user to said one or more identified locations of said areas of interest within said one or more carrier containers. The carrier containers may be either single-use disposables or reusable.
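As a purely illustrative sketch of the software interface described above, detections can be mapped to labeled positions within a carrier container so that the software can direct the user to them. The grid layout, the threshold rule, and the name `locate_in_container` are assumptions for illustration only and do not appear in the disclosure.

```python
# Hypothetical sketch: map above-threshold NIR detections to labeled
# positions (e.g. "A3", "C2") within a carrier container scanned as a
# row-major grid.

def locate_in_container(readings, threshold, rows="ABCD", cols=4):
    """Return labeled grid positions whose reading exceeds `threshold`.

    `readings` is a flat list scanned row by row across the container.
    """
    hits = []
    for i, value in enumerate(readings):
        if value > threshold:
            label = rows[i // cols] + str(i % cols + 1)
            hits.append(label)
    return hits

# Toy scan of a 4x4 container; two positions exceed the threshold.
scan = [0.1, 0.2, 0.9, 0.1,
        0.0, 0.1, 0.1, 0.0,
        0.1, 0.8, 0.1, 0.2,
        0.0, 0.1, 0.0, 0.1]
print(locate_in_container(scan, threshold=0.5))
```

The returned labels are what the user-guidance step would display, e.g. highlighting the flagged wells of a disposable or reusable carrier container.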

Accordingly, this disclosure further provides a method for providing anatomical guidance in an ex vivo diagnostic procedure, comprising:

(a) introducing at least one of a Raman probe and a fluorescence probe into a subject tissue until the at least one probe has accumulated in the target tissue;

(b) preparing the subject tissue for a diagnostic procedure;

(c) initializing an imaging system for integrated bright-field imaging, near-infrared imaging, and at least one of Raman imaging and fluorescence imaging;

(d) beginning the diagnostic procedure in the subject tissue;

(e) using a first real-time integrated visual representation of the subject tissue, generated by the imaging system, to identify a boundary of the subject tissue that is diseased;

wherein the imaging system is a system according to claim 8.

The following series of integrated imaging and spectroscopy software flowcharts, with accompanying explanation, provides additional operational information for the system and method executed by the spectroscopy software and system, covering, for example, software startup, the main software loop, and a number of image-processing procedures. Image-processing procedures can include the display mode selector, determination of the amount of probe in the recorded spectrum, determination of the laser position and of the NIR probe position, creation of a composite display, the "age" probe buffering for virtual phosphorescence (VP), and the like. The details of the virtual phosphorescence concept are also shown, along with further explanation of these steps.

CHART 1. Parts list for the fundamental integrated imaging device and method