

Title:
MULTI-SPECTRAL THREE DIMENSIONAL IMAGING SYSTEM AND METHOD
Document Type and Number:
WIPO Patent Application WO/2016/187549
Kind Code:
A1
Abstract:
Disclosed is a computer-implemented method of creating an image of a specimen, including: receiving, by a processor, a first image of a first section of a specimen created using a first wavelength of invisible light; receiving a second image of a second section of the specimen adjacent to the first section, the second image created using the first wavelength of invisible light; co-registering the first image and the second image; and creating, by the processor, a single-plane image of the first section using a next-image process.

Inventors:
HOLT ROBERT W (US)
QUTAISH MOHAMMED Q (US)
HOPPIN JOHN W (US)
SEAMAN MARC E (US)
HESTERMAN JACOB Y (US)
Application Number:
PCT/US2016/033547
Publication Date:
November 24, 2016
Filing Date:
May 20, 2016
Assignee:
INVICRO LLC (US)
International Classes:
G06T15/08; G01N21/64
Domestic Patent References:
WO2008005991A1 (2008-01-10)
Foreign References:
US20140187931A1 (2014-07-03)
JPH01169305A (1989-07-04)
Other References:
ANNEDORE PUNGE ET AL.: "3D Reconstruction of High-Resolution STED Microscope Images.", MICROSCOPY RESEARCH AND TECHNIQUE., vol. 71, no. 9, 2008, pages 644 - 650, XP055331541
JURGEN GLATZ.: "Real-time intra-operative and endoscopic molecular fluorescence imaging.", DEPARTMENT OF ELECTRICAL ENGINEERING AND INFORMATION TECHNOLOGY., 27 October 2014 (2014-10-27), XP055502704
ANNEDORE ET AL.: "3D Reconstruction of High-Resolution STED Microscope Images", MICROSCOPY RESEARCH AND TECHNIQUE, vol. 71, no. 9, 1 September 2008 (2008-09-01), pages 644 - 650, XP055331541, DOI: 10.1002/jemt.20602
See also references of EP 3298588A4
Attorney, Agent or Firm:
SULLIVAN, Todd A. et al. (175 Canal Street, Manchester, New Hampshire, US)
Claims:
CLAIMS

1. A computer-implemented method of creating an image of a specimen, the method comprising:

receiving, by a processor, a first image of a first section of a specimen, the first image created using a first wavelength of invisible light;

receiving, by the processor, a second image of a second section of the specimen, the second section being adjacent to the first section and the second image created using the first wavelength of invisible light;

co-registering, by the processor, the first image and the second image; and creating, by the processor, a single-plane image of the first section using a next-image process.

2. The method of claim 1, further comprising:

receiving, by the processor, an image of an Nth section of the specimen created using the first wavelength of invisible light; and

creating, by the processor, a 3D image of the specimen using at least three images of adjacent specimen sections.

3. The method of claim 2, wherein the creating of the 3D image is performed via a de-blurring process, wherein the de-blurring process includes one of subsurface fluorescence removal, aberration in camera optics or physical correction.

4. The method of claim 3, wherein the specimen is contacted with a second fluorophore, said second fluorophore having an emission spectrum in the range of, but not limited to, approximately 200 nm to 1000 nm and different from the first fluorophore.

5. The method of claim 1, further comprising:

receiving, by the processor, a third image of the first section of the specimen, the third image created using a second wavelength of invisible light;

receiving, by the processor, a fourth image of a second section of the specimen, the fourth image created using the second wavelength of invisible light;

co-registering, by the processor, the third image and the fourth image; and

creating, by the processor, a single-plane image of the first section using a next-image process.

6. The method of claim 1, further comprising:

de-blurring, by the processor, the first image.

7. The method of claim 6, further comprising:

determining tissue type using a visible light image, and wherein the de-blurring is performed based on tissue type.

8. The method of claim 6, wherein the de-blurring includes one of an analytical solution, a Monte Carlo simulation, a point spread function method, and a deconvolution method.

9. The method of claim 8, wherein the deconvolution method includes one of: a measured point spread function kernel, a simulated point spread function kernel, a Richardson-Lucy method, a Wiener filter deconvolution, a Van Cittert deconvolution, a blind deconvolution, and a regularized deconvolution method.

10. The method of claim 1, wherein the specimen is contacted with at least one first fluorophore having an emission spectrum in the range of approximately 200 nm to 1000 nm.

11. The method of claim 1, further comprising:

receiving, by the processor, an image of the first section of the specimen created using visible light; and

co-registering, by the processor, the images of the first section of the specimen.

12. A system for creating an image of a specimen comprising:

a light source;

an image capture device;

a processor configured to:

receive a first image of a first section of a specimen, the first image created using a first wavelength of invisible light;

receive a second image of a second section of the specimen, the second section being adjacent to the first section and the second image created using the first wavelength of invisible light;

co-register the first image and the second image; and

create a single-plane image of the first section using a next-image process.

13. The system of claim 12, further comprising:

a visible light source.

14. The system of claim 12, further comprising:

a filter.

15. The system of claim 12, wherein the processor is further configured to:

receive an image of an Nth section of the specimen created using the first wavelength of invisible light; and

create a 3D image of the specimen using at least three images of adjacent specimen sections.

16. The system of claim 15, wherein the creating of the 3D image is performed via a next-image process.

17. The system of claim 12, wherein the specimen is contacted with a second fluorophore, said second fluorophore having an emission spectrum in the range of approximately 200 nm to 1000 nm and different from the spectrum of the first fluorophore.

18. The system of claim 12, wherein the processor is further configured to:

receive a third image of the first section of the specimen, the third image created using a second wavelength of invisible light;

receive a fourth image of a second section of the specimen, the fourth image created using the second wavelength of invisible light;

co-register the third image and the fourth image; and

create a single-plane image of the first section using a next-image process.

19. The system of claim 12, wherein the processor is further configured to de-blur the first image.

20. The system of claim 19, further comprising:

determining tissue type using a visible light image, and wherein the de-blurring is performed based on tissue type.

21. The system of claim 19, wherein the de-blurring includes one of an analytical solution to optical transport, a Monte Carlo simulation, a point spread function method, and a deconvolution method.

22. The system of claim 21, wherein the deconvolution method includes one of a measured point spread function kernel, a simulated point spread function kernel, a Richardson-Lucy method, a Wiener filter deconvolution, a Van Cittert deconvolution, a blind deconvolution, and a regularized deconvolution method.

23. The system of claim 12, wherein the specimen is contacted with at least one first fluorophore having an emission spectrum in the range of approximately 200 nm to 1000 nm.

24. The system of claim 13, wherein the processor is further configured to:

receive an image of the first section of the specimen created using visible light; and co-register the images of the first section of the specimen.

25. The system of claim 12, further comprising:

a display.

26. A computer-implemented method of creating an image of a specimen, the method comprising: receiving, by a processor, an NXth image of an Nth section of a specimen, the NXth image created using a first wavelength of light;

receiving, by the processor, an NYth image of the Nth section of the specimen, the NYth image created using a second wavelength of light, Y being equal to X+1 and M being equal to N+1, an Mth section being adjacent the Nth section;

receiving, by the processor, a visible light image of the Nth section of the specimen, the visible light image created using visible light;

co-registering, by the processor, the NXth image, the NYth image, and the visible light image; and

creating, by the processor, a 3D image of the specimen using a next-image process on the NXth and NYth images.

27. The method of claim 26, further comprising:

reiterating

the receiving of the NXth image of the Nth section of the specimen, the receiving of the NYth image of the Nth section of the specimen,

the co-registering of the NXth image, the NYth image and the visible light image, and

the creating of the 3D image, where NX=NX+1, NY=NY+1 and M=M+1.

28. The method of claim 27, further comprising:

receiving a Pth visible light image of the Nth section where N is an integer multiple.

29. The method of claim 26, 27 or 28 wherein the specimen is contacted with at least one first fluorophore having an emission spectrum in the range of approximately 200 nm to 1000 nm.

30. A computer-implemented method of creating an image of a specimen, the method comprising:

receiving, by a processor, an Nth image of an Nth section of a specimen, the Nth image created using a first wavelength of light in the range of approximately 650 nm to 900 nm;

receiving, by the processor, a visible light image of the Nth section of the specimen, the visible light image created using visible light;

co-registering, by the processor, the Nth image and the visible image;

creating, by the processor, a 3D image of the specimen using a next-image process on the Nth images;

reiterating:

the receiving of the Nth image of the Nth section of the specimen, the receiving of the visible light image of the Nth section of the specimen,

the co-registering of the Nth image, and the visible light image, and the creating of the 3D image, where N=N+1;

receiving a Pth visible light image of the Nth section where N is an integer multiple, wherein the specimen is contacted with at least one first fluorophore having an emission spectrum in the range of approximately 200 nm to 1000 nm.

Description:
TITLE

Multi-Spectral Three Dimensional Imaging System and Method

CROSS REFERENCE TO RELATED APPLICATION

This application claims benefit of U.S. Provisional Application Serial No. 62/164,800, entitled “Multi-Spectral Three Dimensional Imaging System and Method,” filed May 21, 2015, the entire disclosure of which is incorporated herein by reference.

TECHNICAL FIELD

The inventive concepts relate to imaging and, more specifically, to imaging using different wavelengths of light.

BACKGROUND

Multispectral fluorescence cryoslice imaging has previously been used to measure drug distribution ex vivo in standalone or retrofitted cryoslice imagers [1, 2]. Such specificity in a single imaging system can result in a high cost per scan. For high throughput and low cost, it would be valuable to construct a fluorescence imager, with a corresponding software package, that can work in tandem with common cryoslicing instruments that are already in place. To that end, the methods outlined here demonstrate a workflow of cryofluorescence imaging techniques for a versatile, transportable add-on to existing cryoslicing instruments.

SUMMARY

Aspects of the inventive concepts include a computer-implemented method of creating an image of a specimen including: receiving, by a processor, a first image of a first section of a specimen, the first image created using a first wavelength of invisible light; receiving, by the processor, a second image of a second section of the specimen, the second section being adjacent to the first section and the second image created using the first wavelength of invisible light; co-registering, by the processor, the first image and the second image; and creating, by the processor, a single-plane image of the first section using a next-image process.

Methods may include receiving, by the processor, an image of an Nth section of the specimen created using the first wavelength of invisible light; and creating, by the processor, a 3D image of the specimen using at least three images of adjacent specimen sections, which may be created using next-image processing.

Methods may further include receiving, by the processor, a third image of the first section of the specimen, the third image created using a second wavelength of invisible light; receiving, by the processor, a fourth image of a second section of the specimen, the fourth image created using the second wavelength of invisible light; and co-registering, by the processor, the third image and the fourth image.

Methods may yet further include de-blurring, by the processor, the first image and determining tissue type using a visible light image, and the de-blurring may be performed based on tissue type, for example, by a processor using a classification algorithm based on a stored history or library. According to aspects, the de-blurring may include one of a Monte Carlo simulation, a point spread function method, and a deconvolution method. Also according to aspects, the deconvolution method may include one of a measured point spread function kernel, a simulated point spread function kernel, a Richardson-Lucy method, a Wiener filter deconvolution, a Van Cittert deconvolution, a blind deconvolution, and a regularized deconvolution method.

According to aspects, the specimen may be contacted with at least one first fluorophore having an emission spectrum in the range of approximately 200 nm to 1000 nm. And according to aspects, the specimen may be contacted with a second fluorophore, said second fluorophore having an emission spectrum in the range of approximately 200 nm to 1000 nm and different from the first fluorophore. According to aspects, the method may further include receiving, by the processor, an image of the first section of the specimen created using visible light; and co-registering, by the processor, the images of the first section of the specimen.

Another aspect of the inventive concepts is a system for creating an image of a specimen including a light source, an image capture device, and a processor. The light source may include a visible light source and/or an invisible light source. Also according to aspects, the system may include a display and/or output means for outputting data, including an image created using the system and methods of the inventive concepts.

Another aspect includes a non-transitory program product for implementing methods of creating an image of a specimen according to the inventive concepts.

Yet another aspect of the inventive concepts includes a computer-implemented method of creating an image of a specimen, the method comprising: receiving, by a processor, an NXth image of an Nth section of a specimen, the NXth image created using a first wavelength of light; receiving, by the processor, an NYth image of the Nth section of the specimen, the NYth image created using a second wavelength of light, Y being equal to X+1 and M being equal to N+1, an Mth section being adjacent the Nth section; receiving, by the processor, a visible light image of the Nth section of the specimen, the visible light image created using visible light; co-registering, by the processor, the NXth image, the NYth image, and the visible light image; and

creating, by the processor, a 3D image of the specimen using a next-image process on the NXth and NYth images.

The method may further include reiterating the receiving of the NXth image of the Nth section of the specimen, the receiving of the NYth image of the Nth section of the specimen, the co-registering of the NXth image, the NYth image, and the visible image, and the creating of the 3D image, where NX=NX+1, NY=NY+1 and M=M+1. The method may further include receiving a Pth visible light image of the Nth section when N is an integer multiple.

BRIEF DESCRIPTION OF THE DRAWINGS

Figure 1 is an illustrative flow chart showing processes according to inventive aspects;

Figure 2 illustrates devices in an environment according to inventive aspects;

Figures 3A and 3B show images created using one aspect of the inventive concepts;

Figure 4 is an illustrative series of serial section images captured according to inventive aspects;

Figures 5A, 5B and 5C are illustrative sectional images according to aspects;

Figure 6 illustrates devices in an environment according to inventive aspects;

Figure 7 illustrates devices in an environment according to inventive aspects;

Figure 8 illustrates devices in an environment according to inventive aspects;

Figure 9 illustrates devices in an environment according to inventive aspects; and

Figure 10 illustrates devices in an environment according to inventive aspects.

DETAILED DESCRIPTION

The ability to image the fluorescence distribution of various substances within an animal, organ, or tissue has proven valuable in drug discovery and development. In vivo fluorescence imaging has proven to be a difficult problem, especially for targets at depth, due to the high scattering of light in tissue. In vivo bulk fluorescence imaging has been performed using tomographic recovery as well as epi- and transillumination schemes. However, fluorescence recovery using these forms of imaging has been shown to have poor accuracy for deeply seated and highly absorbing organs such as the liver. Further, the resolution limit for fluorescence recovery in vivo is widely believed to be limited to a few millimeters at best, owing to the ill-posed nature of the fluorescence recovery problem. This resolution limit reduces the ability to provide meaningful fluorescence information on relevant (e.g., vascular) characteristic length scales beyond about a few hundred microns of depth from the surface of a subject. As drug development progresses, examining only the bulk characteristics of fluorophore distribution becomes less sufficient, given the wide field of vascular and targeted fluorescent imaging agents. To perform fluorescence imaging with increased resolution, cryoslicing coupled with fluorescence epi-illumination imaging may be performed. While it has previously been demonstrated that drug distribution imaging can be performed with radiographic methods, fluorescence cryoslice imaging can be performed without the use of ionizing radiation. Further, many agents are shelf-stable and non-toxic. It is also possible to image several agents simultaneously using multispectral imaging techniques.

Aspects of the inventive concepts provide a simple workflow and software for a standalone (but portable) add-on to a cryoslice imager, which is a common class of instrument available in many labs. However, since the fluorescence imaging system according to aspects of the inventive concepts may be constructed for transportation, it is not necessary to combine fluorescence imaging with a single cryoslicing instrument. The versatility of the instrumentation and processing schemes according to the inventive concepts lies in part in their transportability. For example, after imaging using one cryoslicing instrument designed for whole-body imaging, it may be possible to move the fluorescence instrument to another cryoslicer constructed for high-resolution, single-organ imaging.

The extension of existing cryoslice workflows to include fluorescence imaging opens the door to the use of cutting-edge molecular probes. Aspects of the invention make possible the measurement and visualization of drug distribution and/or molecular probes, administered via different delivery methods (for example, by intrathecal, oral, intravenous, or subcutaneous routes, by inhalation, or the like), with molecular probes used singularly or in combination, while using the imaging schemes of the present inventive concepts. Aspects of the invention enable imaging of the area of interest, co-registration of images, and the rendering of multiple fluorophores simultaneously. Also, relative-intensity quantification may be performed, e.g., by leveraging the different properties and distributions of one or more fluorophores. Three-dimensional images can be created and displayed, which allows for detailed analysis of probe transport through regions of interest. Aspects of the inventive concepts utilize the K-FLARE system, optimized for use with fluorophores commonly used in fluorescence-guided surgery, including indocyanine green, but many other fluorophores may be imaged using methods and systems of the inventive concepts. See, for example, the probes listed in Table 1. Finally, the registration of the optical data with additional modalities such as x-ray CT and MRI may enable multi-modality validation of hybrid agents.

Figure 1 is a flow chart showing the processes in a computer-implemented method 100 of creating an image of a specimen; the processes of the method may be performed by a processor. A first process according to an illustrative method may include receiving, by a processor, a first image of a first section of a specimen 102, the first image created using a first wavelength of invisible light. By invisible light, it is understood that electromagnetic energy in wavelengths in the near-infrared (NIR) or infrared (IR) spectrum may be included, as well as electromagnetic energy such as X-ray (as in X-ray fluorescence). In addition, ultrasound, MRI, and other modalities now known or later developed may be used for anatomical guidance. Furthermore, invisible light may include ultraviolet light. Invisible light may further include wavelengths at which any now-known or later-developed fluorophores fluoresce.

Another process in the method may include receiving a second image of a second section of the specimen 104, the second section being adjacent to the first section and the second image created using the first wavelength of invisible light. It is contemplated that serial but non-sequential sections of the specimen may be imaged; for example, every other section, every third section, or a combination of sequential and non-sequential sections may be imaged and processed according to aspects of the inventive concepts. A next process that may be performed includes co-registering the first image and the second image 106. And a final process according to an aspect of the present inventive concepts includes creating a single-plane image of the first section 108 using a next-image process.
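The disclosure does not spell out the internals of the co-registration or next-image steps, so the following is only a minimal sketch under stated assumptions: registration is approximated by FFT-based phase correlation, and the next-image correction is assumed to subtract a blurred, scaled copy of the co-registered adjacent-section image to suppress subsurface signal. The scale factor `alpha`, the blur width, and both function names are illustrative, not values or names from the disclosure.

```python
import numpy as np
from scipy import ndimage

def register_translation(fixed, moving):
    """Estimate the (row, col) shift that aligns `moving` to `fixed`
    using FFT-based phase correlation (integer-pixel accuracy)."""
    f = np.fft.fft2(fixed)
    m = np.fft.fft2(moving)
    cross_power = f * np.conj(m)
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peak positions past the half-way point to negative shifts.
    shift = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return np.array(shift, dtype=float)

def next_image_single_plane(first_img, second_img, alpha=0.3, blur_sigma=2.0):
    """Co-register the adjacent-section image to the first image, then
    subtract a blurred, scaled copy of it (an assumed form of the
    'next-image' correction) to estimate the in-plane fluorescence."""
    shift = register_translation(first_img, second_img)
    aligned_next = ndimage.shift(second_img, shift, order=1, mode="nearest")
    estimated_subsurface = alpha * ndimage.gaussian_filter(aligned_next, blur_sigma)
    return np.clip(first_img - estimated_subsurface, 0.0, None)
```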

Optional processes that may be performed in conjunction with the above processes include receiving an image of an Nth section of the specimen 110 created using the first wavelength of invisible light. A further optional process includes creating a 3D image of the specimen using at least three images of adjacent specimen sections (Fig. 3). These optional processes may be performed iteratively, N times on multiple sections, to create a complete (or incomplete) 3D rendering of the specimen (or part thereof). The creation of the 3D image may be performed via next-image processing. Such a method may be described as follows: receiving an NXth image of an Nth section of a specimen, the NXth image created using a first wavelength of light; receiving an NYth image of the Nth section of the specimen, the NYth image created using a second wavelength of light, Y being equal to X+1 and M being equal to N+1, an Mth section being adjacent the Nth section; receiving a visible light image of the Nth section of the specimen, the visible light image created using visible light; co-registering, by the processor, the NXth image, the NYth image, and the visible light image; and creating a 3D image of the specimen using a next-image process on the NXth and NYth images. Methods according to the inventive concepts may include reiterating each of the following: the receiving of the NXth image of the Nth section of the specimen, the receiving of the NYth image of the Nth section of the specimen, the co-registering of the NXth image, the NYth image, and the visible image, and the creating of the 3D image, where NX=NX+1, NY=NY+1 and M=M+1.
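Continuing the earlier sketch, the iterative reconstruction could be organized as a loop over sections and wavelength channels, stacking each corrected single-plane image into one 3D volume per channel. The channel names and the reuse of the hypothetical `next_image_single_plane` helper from the previous sketch are assumptions for illustration only.

```python
import numpy as np

def build_3d_volumes(section_images, channels=("nir_x", "nir_y")):
    """section_images: list of dicts, one per physical section in cutting
    order, each mapping a channel name to its 2D image.  Applies the
    (hypothetical) next_image_single_plane correction to every section
    against its neighbor and stacks the results into 3D arrays."""
    volumes = {}
    for channel in channels:
        planes = []
        for n in range(len(section_images) - 1):
            current = section_images[n][channel]
            adjacent = section_images[n + 1][channel]
            planes.append(next_image_single_plane(current, adjacent))
        volumes[channel] = np.stack(planes, axis=0)  # (sections, rows, cols)
    return volumes
```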

Further, methods according to the present inventive concepts may include receiving a Pth light image of the Nth section where N is an integer multiple. Co-registering using light images may be useful for determining proper placement of image data and for determining tissue type. Co-registering every fluorescent image with every visible light image of the same section is likely unnecessary, and every 5th, 10th, or other integer multiple (Pth) visible light image may be used for co-registration with its Pth invisible light image.

It is conceived that different fluorophores may be used such that different wavelengths of light may be imaged. Different fluorophores may be used in conjunction with specific binding to find and illuminate different proteins or structures associated with the specimen. Accordingly, a further optional process that may be performed according to inventive aspects includes receiving a third image of the first section of the specimen 114, the third image created using a second wavelength of invisible light. A next optional process may include receiving a fourth image of a second section of the specimen 116, the fourth image created using the second wavelength of invisible light. A next process may include co-registering the third image and the fourth image 118. Co-registration may be performed to ensure that the same structures within different sections are properly lined up in the images.

According to aspects of the inventive concepts, the fluorophores used to contact the specimen may exhibit emission spectra in the range of approximately 200 nm to 1000 nm, and as discussed above, a plurality of fluorophores may be used on a single specimen, each fluorophore having an emission spectrum different from the others. Fluorophores can include, but are not limited to, those detailed in Table 1.

Another optional process that may be performed includes de-blurring the first image and/or any other images. De-blurring may be performed based on tissue type after tissue type is determined using a white light image. De-blurring may be performed using any appropriate method, including, but not limited to, a Monte Carlo simulation, a point spread function method, and a deconvolution method. Further, the deconvolution method used may include, but is not limited to, a measured point spread function kernel, a simulated point spread function kernel, a Richardson-Lucy method, a Wiener filter deconvolution, a Van Cittert deconvolution, a blind deconvolution, and a regularized deconvolution method. In another aspect of the invention, a structural image process can include a deconvolution process that probabilistically deconvolves a known point spread function from an image with a constraint that edges and regions of uniformity in the structural image are retained in the resulting fluorescence/functional image.
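As one concrete instance of the deconvolution options listed above, a Richardson-Lucy iteration against a measured point spread function kernel could be sketched as follows; the iteration count and the small stabilizing constant are illustrative choices, not parameters from the disclosure.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurred, psf, num_iter=30, eps=1e-12):
    """Richardson-Lucy deconvolution of a 2D image with a known 2D PSF.
    The PSF is assumed non-negative; it is normalized to sum to 1 here."""
    psf = psf / psf.sum()
    psf_flipped = psf[::-1, ::-1]  # adjoint (flipped) kernel
    estimate = np.full_like(blurred, blurred.mean(), dtype=float)
    for _ in range(num_iter):
        reblurred = fftconvolve(estimate, psf, mode="same")
        ratio = blurred / (reblurred + eps)
        estimate *= fftconvolve(ratio, psf_flipped, mode="same")
    return estimate
```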

Another optional process includes using alternatives to white light for image recovery, de-blurring, and reconstruction. Any kind of co-registered structural imaging modality (such as x-ray CT, ultrasound, MRI, etc.) may be used for an informed deconvolution.

Additionally, an image can first be partitioned into regions by a human or automated classification method, followed by a de-blur based on point spread functions that are defined for each region.

The de-blurring may be performed on the first image and/or any other images. De-blurring may include a process of removing image artifacts that impair the clarity of the image. Image artifacts amenable to de-blurring can be the result of subsurface fluorescence, camera optic aberrations, and other environmental effects, such as vibrations of equipment.

De-blurring may be performed based on tissue type after tissue type is determined using a white light image. De-blurring may be performed using any appropriate method, including, but not limited to, a Monte Carlo simulation, a point spread function method, and a deconvolution method. Point spread functions are understood to include a means of taking a single point, or combinations of multiple points, in an image and correcting for image artifacts that result in a blur on the image. Point spread function methods are a preferred approach for de-blurring an image where the blur is the result of subsurface fluorescence.

The deconvolution method used may include, but is not limited to, a measured point spread function kernel, a simulated point spread function kernel, a Richardson-Lucy method, a Wiener filter deconvolution, a Van Cittert deconvolution, a blind deconvolution, and a regularized deconvolution method. In another aspect of the invention, a structural image process can include a deconvolution process that probabilistically deconvolves a known point spread function from an image with a constraint that edges and regions of uniformity in the structural image are retained in the resulting fluorescence/functional image.

While the methods discussed above have mentioned imaging using invisible light as defined herein, imaging using visible light may also be used according to aspects of the inventive concepts. An optional process that may be used in conjunction with other processes includes receiving an image of the first section of the specimen created using visible light, and co-registering the images of the first section of the specimen.

Also contemplated is a system for creating an image of a specimen. An illustrative system 200 according to aspects is illustrated in Fig. 2 and includes a light source 202. Light source 202 may be adapted to emit any wavelength or combination of wavelengths of light. That is, light source 202 may emit visible light as well as wavelengths that cause fluorophores to fluoresce. Light source 202 may be configured to emit specific wavelengths alone or simultaneously along with other wavelengths. Light source 202 may be used in conjunction with physical filters 203. Also, filtering may be performed digitally, to manipulate image data to produce digitally altered images. Also illustrated is an image capture device 204. Image capture device 204 may include a digital camera, a digital video capture device, a charge-coupled device (CCD), an intensified charge-coupled device (iCCD), film, a scintillator, optical coherence tomography, or additional optical and medical imaging modalities for anatomical guidance, such as x-ray. System 200 includes a processor 206 specifically programmed to perform the methods according to the inventive concepts. Processor 206 may be embodied in a single unit or in multiple units, either connected or not connected together. For example, a first process may be performed by a processor in one location and a subsequent process may be performed by a different processor, such that the combination of the processes is performed on data that is transmitted from one processor to another. A display 208 is also illustrated. Display 208 may be adapted to show reconstructed images created using methods according to the inventive concepts. Display 208 may be any appropriate display, now known or later developed.

According to alternative embodiments of the inventive concepts, a relatively high-resolution camera may be used, which may lessen the need to co-register the images of various classes (i.e., fluorescence with the lights on versus lights off, low-resolution versus high-resolution white light). According to aspects, high-resolution, inherently co-registered white light and fluorescence images may be simultaneously gathered and stored. Examples of high resolutions that may be used according to aspects include 1024 by 1024 pixels or 3648 by 2432 pixels, or any other appropriate resolution.

According to such aspects, fluorescence reconstruction may be performed as follows. White light and fluorescence images of an exposed block face are captured for each image slice. Fiducial reference markers are visible in the white light images and are typically ink-filled holes drilled into the block. The holes are constructed to be straight, which allows the holes to be used to align one white light image with the next (i.e., images of sequential sections may be aligned with one another). For each slice, the white light and fluorescence images are natively co-registered, meaning that when the white light images are aligned, those alignment transforms may be used to align the fluorescence images. In this manner, a fully aligned 3D distribution of white light and fluorescence information may be gathered.
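A minimal sketch of this fiducial-based alignment is shown below, assuming the drilled-hole fiducials have already been detected as (row, col) point pairs in two white light images. The least-squares rigid (Kabsch) fit and the function names are illustrative assumptions, not the patent's prescribed implementation; the fitted transform is then reused for the natively co-registered fluorescence image.

```python
import numpy as np
from scipy import ndimage

def fit_rigid_transform(points_fixed, points_moving):
    """Least-squares rigid transform (2x2 rotation R, translation t)
    mapping `points_moving` onto `points_fixed`; both are (N, 2) arrays
    of fiducial coordinates detected in the white light images."""
    mu_f = points_fixed.mean(axis=0)
    mu_m = points_moving.mean(axis=0)
    H = (points_moving - mu_m).T @ (points_fixed - mu_f)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against an accidental reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = mu_f - R @ mu_m
    return R, t

def apply_rigid_transform(image, R, t):
    """Resample `image` (white light or fluorescence) under the fitted
    transform.  affine_transform maps output coordinates to input
    coordinates, so the inverse transform is supplied."""
    R_inv = R.T
    offset = -R_inv @ t
    return ndimage.affine_transform(image, R_inv, offset=offset, order=1)
```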

In another aspect of the invention, if the fluorescence images gathered using the high-resolution camera(s) are still subject to the appearance of subsurface fluorescence, the effect of subsurface fluorescence may be removed by using a deconvolution method or a next-image processing method. The parameters required for these corrections can be determined heuristically or empirically. These parameters are likely to be different for each tissue type. According to additional aspects of the invention, tissue identification may be performed from the white light images using color matching, atlas registration, hybrid techniques, or combinations of these, without the use of additional imaging modalities such as x-ray CT or MRI and without human intervention. When the tissues are identified, a lookup table may be used to isolate the appropriate parameters for subsurface fluorescence removal. Each tissue may then be processed with the appropriate parameters for recovery. This results in a distribution of fluorescence as measured by corrected photon counts.
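The color-matching and lookup-table idea could be realized as simply as the sketch below; the tissue classes, reference colors, and per-tissue correction parameters are placeholders, not calibrated values from the disclosure.

```python
import numpy as np

# Hypothetical mean RGB colors for a few tissue classes, and hypothetical
# subsurface-fluorescence-removal parameters keyed by class.  Real values
# would be calibrated per instrument and per study.
REFERENCE_COLORS = {
    "brain":  np.array([200.0, 170.0, 160.0]),
    "liver":  np.array([120.0,  60.0,  50.0]),
    "muscle": np.array([150.0,  80.0,  80.0]),
}
CORRECTION_PARAMS = {
    "brain":  {"alpha": 0.25, "blur_sigma": 1.5},
    "liver":  {"alpha": 0.45, "blur_sigma": 3.0},
    "muscle": {"alpha": 0.35, "blur_sigma": 2.0},
}

def classify_tissue(white_light_rgb):
    """Label each pixel of an (H, W, 3) white light image with the tissue
    class whose reference color is nearest in RGB space."""
    names = list(REFERENCE_COLORS)
    refs = np.stack([REFERENCE_COLORS[n] for n in names])        # (K, 3)
    dists = np.linalg.norm(
        white_light_rgb[..., None, :] - refs, axis=-1)           # (H, W, K)
    return np.asarray(names)[np.argmin(dists, axis=-1)]          # (H, W)

def params_for(label):
    """Look up the subsurface-fluorescence-removal parameters for a label."""
    return CORRECTION_PARAMS[label]
```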

In such cases, when the system is well characterized (i.e., the efficiency and stability characteristics of all cameras and fibers are known), this distribution can be translated into the quantitative distribution of fluorescence. Alternatively, if the total molecular count of fluorophores injected is known and the entire body has been imaged, then it is possible to use a ratiometric technique to recover the fluorescence distribution. Alternatively, if a fluorescent standard is also visible in the field of view, it is possible to scale the fluorescence to a concentration based on the photon counts of the standard.
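The standard-based scaling mentioned last is essentially a ratio; a minimal sketch, assuming a fluorescent standard of known concentration occupies a known region of the field of view, might look like this (the function and argument names are illustrative).

```python
import numpy as np

def counts_to_concentration(corrected_counts, standard_mask,
                            standard_concentration):
    """Scale corrected photon counts to concentration units using the mean
    signal inside the imaged fluorescent standard.  `standard_mask` is a
    boolean array marking the standard's pixels."""
    counts_per_unit = corrected_counts[standard_mask].mean() / standard_concentration
    return corrected_counts / counts_per_unit
```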

A workflow according to these aspects may add meta-information to the images so that the 3D fluorescence and white light images may be indexed in a database. Non-limiting examples of meta-information include identification numbers paired with the name of the subject, the date of imaging, the modality used to capture the information, other numbers, project- or study-specific information, pixel size or voxel size, etc. This means that the images may then be accessed through the cloud anywhere in the world.
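As an illustration only, the per-dataset meta-information could be a small record stored alongside the image stack; every field name and value below is a hypothetical placeholder, not a schema from the disclosure.

```python
# Illustrative meta-information record for database indexing.
dataset_record = {
    "dataset_id": "0001",
    "subject_name": "subject-placeholder",
    "imaging_date": "YYYY-MM-DD",
    "modality": "cryo-fluorescence + white light",
    "study": "example-study",
    "pixel_size_um": 25.0,
    "slice_thickness_um": 50.0,
}
```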

The methods according to the inventive concepts may be embodied as a non-transitory computer program product. Any combination of one or more computer-readable storage device(s) or computer-readable media may be utilized. The computer readable medium may be a computer readable storage medium. A computer readable storage device may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage device would include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage device may be any tangible device or medium that can store a program for use by or in connection with an instruction execution system, apparatus, or device. The term "computer readable storage device," or variations thereof, does not encompass signal propagation media such as a copper cable, optical fiber, or wireless transmission media.

Program code embodied on a computer readable storage device or computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.

Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).

Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to one or more processors of one or more general purpose computers, special purpose computers, or other programmable data processing apparatuses to produce a machine, such that the instructions, which execute via the one or more processors of the computers or other programmable data processing apparatuses, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.

These computer program instructions may also be stored in one or more computer readable storage devices or computer readable media that can direct one or more computers, one or more other programmable data processing apparatuses, or one or more other devices to function in a particular manner, such that the instructions stored in the one or more computer readable storage devices or computer readable media produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.

The computer program instructions may also be loaded onto one or more computers, one or more other programmable data processing apparatuses, or one or more other devices to cause a series of operational steps to be performed on the one or more computers, one or more other programmable data processing apparatuses, or one or more other devices to produce a computer implemented process such that the instructions which execute on the one or more computers, one or more other programmable data processing apparatuses, or one or more other devices provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a," "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.

General Method: For measurement, a subject is first prepared by the appropriate method for the study (i.e., injected with a given dose of contrast agent and then, typically, sacrificed after the appropriate time). A sample of interest is frozen in optimum cutting temperature material (O.C.T.). The O.C.T. block is serially sliced on the cryoslicer, where high-resolution white light images are taken in tandem with fluorescence images of each slice or section. Multiple fluorophores can be imaged simultaneously with the proper filtering. The section images are cropped and aligned based on anatomical and fiducial markers. If multiple cameras are used (i.e., the white light camera is located such that the subject of interest is viewed from a different angle by the white light and fluorescence cameras), then a 12-parameter affine registration is performed along with the standard rigid alignment performed in cryoslice image processing. The fluorescence images are then processed using a modified form of the method presented in Steyer et al. [3], resulting in a 3D fluorescence distribution recovery which is co-registered to a white light image stack.
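A 12-parameter affine transform in 3D is a 3x3 matrix plus a 3-vector translation. How the transform is fitted is not detailed here; purely as a sketch, applying an already-fitted transform to resample a fluorescence volume into the white light frame might look like the following, using scipy's convention that affine_transform maps output coordinates to input coordinates.

```python
import numpy as np
from scipy import ndimage

def apply_affine_3d(volume, A, t):
    """Resample a 3D volume under the 12-parameter affine transform
    x_white = A @ x_fluor + t, where A is 3x3 and t has length 3."""
    A_inv = np.linalg.inv(A)
    offset = -A_inv @ t
    return ndimage.affine_transform(volume, A_inv, offset=offset, order=1)
```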

EXAMPLE

A naïve male Sprague-Dawley rat was injected intrathecally with 70 µL of ZW800-1 [4] at a concentration of 100 µM. The contrast agent was allowed to distribute for 90 minutes before the subject was sacrificed and the brain resected. The brain was then frozen in O.C.T. and imaged using a Leica 3050 cryostat (Leica Camera AG, Wetzlar, Germany), a high-resolution Canon EOS 700 white light camera (Canon, Melville, NY, USA), and a K-FLARE surgical fluorescence imaging system (Curadel LLC, Worcester, MA, USA).

Figure 4 shows a set of images taken on serial sections. Figures 5A-C show sections imaged according to aspects of the inventive concepts. Specifically, Fig. 5A shows a sagittal section, Fig. 5B illustrates a coronal section and Fig. 5C shows a transverse section of rat brain prepared and imaged according to aspects of the inventive concepts.

The measured distribution of the fluorescent tracer matches the expected distribution based on previous studies. The methodology demonstrated that versatile fluorescence imaging techniques can be performed using a standalone fluorescence imaging system in tandem with established cryoslicing instruments.

References:

[1] Roy et al., Anat Rec (2009).

[2] Sarantopoulos et al., Mol Imaging Biol (2011).

[3] Steyer et al., Annals of Biomed. Eng. (2009).

[4] Choi et al., Angew Chem Int Ed Engl (2011).

TABLE 1

Probe              Ex¹ (nm)   Em² (nm)
Y66F               360        508
Y66H               360        442
EBFP               380        440
Wild-type          396, 475   508, 503
GFPuv              385        508
ECFP               434        477
Y66W               436        485
S65A               471        504
S65C               479        507
S65L               484        510
S65T               488        511
EGFP               489        508
EYFP               514        527
DsRed              558        583

Other probes
Monochlorobimane   380        461
Calcein            496        517

¹Ex: Peak excitation wavelength (nm)
²Em: Peak emission wavelength (nm)

See, e.g., US Patent 7,608,392 B2.