Title:
TRACKING POSITION AND ORIENTATION OF A SURGICAL DEVICE THROUGH FLUORESCENCE IMAGING
Document Type and Number:
WIPO Patent Application WO/2021/206556
Kind Code:
A1
Abstract:
There is provided a controller and a method for tracking the position and orientation of a surgical device using fluorescence imaging, and a surgical device configured to enable position and orientation tracking using fluorescence imaging. The controller comprises an input configured to receive, from a fluorescence imaging system, fluorescence imaging data indicative of emitted fluorescence from the surgical device. The controller is configured to identify, in dependence on the fluorescence imaging data, first marker data indicative of emitted fluorescence from a first marker region of the surgical device, and second marker data indicative of emitted fluorescence from a second marker region of the surgical device; determine, in dependence on the first marker data and the second marker data, pose data indicative of a position and an orientation of the surgical device; and output a signal indicative of the pose data.

Inventors:
VAN LEEUWEN FIJS (NL)
HOUWING KRIJN (NL)
VAN OOSTEROM MATTHIAS (NL)
Application Number:
PCT/NL2021/050232
Publication Date:
October 14, 2021
Filing Date:
April 09, 2021
Assignee:
ACADEMISCH ZIEKENHUIS LEIDEN (NL)
International Classes:
A61B34/00; A61B34/20
Foreign References:
US20130274596A12013-10-17
US20180235715A12018-08-23
Other References:
MATHIAS MARKERT: "Entwicklung eines kliniktauglichen Assistenzsystems für die Leberchirurgie" [Development of a clinically usable assistance system for liver surgery], Dr.-Ing. dissertation, 31 January 2011 (2011-01-31), XP055634184, retrieved from the Internet [retrieved on 2019-10-21]
TIMO KRÜGER: "Ein modulares Navigationssystem für die dentale Implantologie" [A modular navigation system for dental implantology], 16 November 2006 (2006-11-16), page 83, XP055624425, retrieved from the Internet [retrieved on 2019-09-20], DOI: 10.14279/depositonce-1467
MAMONE VIRGINIA ET AL: "Robust Laparoscopic Instruments Tracking Using Colored Strips", 8 June 2017, BIG DATA ANALYTICS IN THE SOCIAL AND UBIQUITOUS CONTEXT : 5TH INTERNATIONAL WORKSHOP ON MODELING SOCIAL MEDIA, MSM 2014, 5TH INTERNATIONAL WORKSHOP ON MINING UBIQUITOUS AND SOCIAL ENVIRONMENTS, MUSE 2014 AND FIRST INTERNATIONAL WORKSHOP ON MACHINE LE, ISBN: 978-3-642-17318-9, XP047416508
FUERST BERNHARD ET AL: "First Robotic SPECT for Minimally Invasive Sentinel Lymph Node Mapping", IEEE TRANSACTIONS ON MEDICAL IMAGING, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 35, no. 3, 1 March 2016 (2016-03-01), pages 830 - 838, XP011608615, ISSN: 0278-0062, [retrieved on 20160301], DOI: 10.1109/TMI.2015.2498125
MATTHIAS N. VAN OOSTEROM ET AL: "Computer-assisted surgery : virtual- and augmented-reality displays for navigation during urological interventions", CURRENT OPINION IN UROLOGY., vol. 28, no. 2, 1 March 2018 (2018-03-01), GB, pages 205 - 213, XP055758857, ISSN: 0963-0643, DOI: 10.1097/MOU.0000000000000478
VAN OOSTEROM ET AL., CURRENT OPINION IN UROLOGY, 2019
Attorney, Agent or Firm:
HGF B.V. (NL)
CLAIMS

1. A controller for tracking position and orientation of a surgical device using fluorescence imaging, comprising: an input configured to receive, from a fluorescence imaging system, fluorescence imaging data indicative of emitted fluorescence from the surgical device; one or more processors; and a memory storing computer executable instructions therein which, when executed by the one or more processors, cause the one or more processors to: identify, in dependence on the fluorescence imaging data, first marker data indicative of emitted fluorescence from a first marker region of the surgical device, and second marker data indicative of emitted fluorescence from a second marker region of the surgical device; determine, in dependence on the first marker data and the second marker data, pose data indicative of a position and an orientation of the surgical device; and output a signal indicative of the pose data.

2. The controller of claim 1, wherein: the input is configured to receive scan data indicative of a representation of a patient, and the one or more processors are configured to determine, in dependence on the scan data and the pose data, a relative position and relative orientation of the surgical device to at least one object represented by the scan data.

3. The controller of claim 1 or 2, wherein the one or more processors are further configured to output, to a display, display data indicative of the pose data.

4. The controller of claim 3 when dependent on claim 2, wherein the display data is further indicative of the scan data.

5. The controller of any preceding claim, wherein the one or more processors are configured to identify the first marker data in dependence on a first fluorescence emission spectrum associated with the first marker region and the second marker data in dependence on a second fluorescence emission spectrum associated with the second marker region, wherein the first fluorescence emission spectrum is distinct from the second fluorescence emission spectrum.

6. The controller of any preceding claim, the memory storing geometric data indicative of a geometry of each marker region on the surgical device, the one or more processors being configured to determine the pose data further in dependence on the geometric data.

7. The controller of claim 6, wherein the geometric data is indicative of the first and second marker regions defining an asymmetric pattern along a first axis of the surgical device.

8. The controller of claim 6 or 7, wherein the geometric data is indicative of a shape of the first marker region being distinct from a shape of the second marker region, and wherein the one or more processors are configured to identify the first marker data and second marker data in dependence on the geometric data.

9. The controller of any of claims 6 to 8, wherein the geometric data is indicative of the first and second marker regions being separated along a first axis of the surgical device.

10. The controller of claim 9, wherein the one or more processors are further configured to identify, in dependence on the fluorescence imaging data, third marker data indicative of emitted fluorescence from a third marker region of the surgical device; and determine the position and orientation of the surgical device further in dependence on the third marker data.

11. The controller of claim 10, wherein the geometric data is indicative of the third marker region being located between the first and second marker regions, such that the first, second and third marker regions are unevenly distributed along the first axis.

12. The controller of any of claims 6 to 11, wherein the geometric data is indicative of each marker region extending substantially around a portion of a surface proximal to a cross section of the surgical device.

13. The controller of any preceding claim, comprising an output configured to output a control signal for controlling the surgical device in dependence on the pose data.

14. The controller of claim 13 when dependent on claim 2, wherein the output is configured to output a control signal for controlling the surgical device further in dependence on the scan data.

15. The controller of any preceding claim, wherein the one or more processors are configured to identify, in dependence on the fluorescence imaging data, target data indicative of emitted fluorescence from a target external to the surgical device.

16. The controller of claim 15, wherein the one or more processors are configured to identify the target data in dependence on a target fluorescence emission spectrum associated with the target, the target fluorescence emission spectrum being distinct from the first and second fluorescence emission spectra.

17. The controller of claim 15 or 16, wherein the one or more processors are further configured to determine, in dependence on the target data and the pose data, a distance from the surgical device to the target, and to output a signal indicative of the determined distance.

18. The controller of any preceding claim, wherein the one or more processors are further configured to: determine, in dependence on the pose data, a path of the surgical device; determine surgical performance data indicative of one or more path parameters derived from the path of the surgical device; and output an indication of the surgical performance data.

19. The controller of claim 18, wherein: the one or more processors are further configured to receive ideal path data indicative of a predetermined path through the surgical field; and the one or more path parameters comprise a deviation from the predetermined path.

20. The controller of claim 19, wherein the one or more processors are further configured to output an indication of the predetermined path to a display.

21. The controller of any of claims 18 to 20, wherein the path parameters comprise one or more of a tortuosity, path curvature, path length, acceleration, velocity, or time elapsed for the path.

22. The controller of any of claims 18 to 21, wherein the one or more processors are configured to determine a dexterity metric in dependence on an output of a machine learning algorithm taking one or more of the path parameters as input.

23. The controller of claim 22, wherein the machine learning algorithm comprises a deep learning network, and wherein the deep learning network is trained to differentiate path parameters of experienced surgeons from those of inexperienced surgeons.

24. A surgical device for laparoscopy or endoscopy configured to enable pose determination by fluorescence imaging, the surgical device comprising a first fluorescent marker region and a second fluorescent marker region.

25. The surgical device of claim 24, wherein the surgical device comprises a gamma probe for detection of radioactive tracers.

26. The surgical device of claim 24 or 25, wherein the surgical device comprises a tethered modality.

27. The surgical device of any of claims 24 to 26, wherein the first fluorescent marker region is associated with a first fluorescence emission spectrum and the second fluorescent marker region is associated with a second fluorescence emission spectrum, wherein the first fluorescence emission spectrum is distinct from the second fluorescence emission spectrum.

28. The surgical device of any of claims 24 to 27, wherein the first and second fluorescent marker regions define an asymmetric pattern along a first axis of the surgical device.

29. The surgical device of any of claims 24 to 28, wherein a shape of the first fluorescent marker region is distinct from a shape of the second fluorescent marker region.

30. The surgical device of any of claims 24 to 29, wherein the first and second fluorescent marker regions are separated along a first axis of the surgical device.

31. The surgical device of claim 30, wherein the surgical device comprises a third fluorescent marker region being located between the first and second marker regions, such that the first, second and third marker regions are unevenly distributed along the first axis of the surgical device.

32. The surgical device of any of claims 24 to 31, wherein each marker region extends substantially around a portion of a surface proximal to a cross section of the surgical device.

33. The surgical device of any of claims 24 to 32, wherein the fluorescent marker regions comprise medical grade fluorescent material embedded in the surgical device.

34. The surgical device of claim 33, wherein the medical grade fluorescent material comprises one or more of: fluorescent polymer material or an immobilised fluorescent dye.

35. A system for tracking position and orientation of a surgical device using fluorescence imaging, comprising: the surgical device according to any of claims 24 to 34; a fluorescence imaging device comprising an excitation source and a detector configured to capture fluorescence imaging data indicative of emitted fluorescence from the surgical device; and the controller of any of claims 1 to 23.

36. A computer-implemented method for tracking position and orientation of a surgical device using fluorescence imaging, comprising: receiving, from a fluorescence imaging system, fluorescence imaging data indicative of emitted fluorescence from the surgical device; identifying, in dependence on the fluorescence imaging data, first marker data indicative of emitted fluorescence from a first marker region of the surgical device, and second marker data indicative of emitted fluorescence from a second marker region of the surgical device; determining, in dependence on the first marker data and the second marker data, pose data indicative of a position and an orientation of the surgical device; and outputting a signal indicative of the pose data.

37. The method of claim 36, further comprising: receiving scan data indicative of a representation of a patient, and determining, in dependence on the scan data and the pose data, a relative position and relative orientation of the surgical device to at least one object represented by the scan data.

38. The method of claim 36 or 37, wherein identifying the first marker data and second marker data is in dependence on a first fluorescence emission spectrum associated with the first marker region and a second fluorescence emission spectrum associated with the second marker region, wherein the first fluorescence emission spectrum is distinct from the second fluorescence emission spectrum.

39. The method of any of claims 36 to 38, wherein determining the pose data is further in dependence on geometric data indicative of a geometry of the first and second marker regions on the surgical device.

40. The method of claim 39, wherein the geometric data is indicative of the first and second marker regions defining an asymmetric pattern along a first axis of the surgical device.

41. The method of claim 39 or 40, wherein the geometric data is indicative of a shape of the first marker region being distinct from a shape of the second marker region, and wherein the method comprises identifying the first marker data and second marker data in dependence on the geometric data.

42. The method of any of claims 36 to 41, further comprising identifying, in dependence on the fluorescence imaging data, target data indicative of emitted fluorescence from a target external to the surgical device.

43. The method of claim 42, wherein identifying the target data is in dependence on a target fluorescence emission spectrum associated with the target, the target fluorescence spectrum being distinct from the first and second fluorescence emission spectra.

44. The method of claim 42 or 43, further comprising determining, in dependence on the target data and the pose data, a distance from the surgical device to the target, and outputting a signal indicative of the determined distance.

45. The method of any of claims 36 to 44, further comprising controlling the surgical device in dependence on the pose data.

46. The method of any of claims 36 to 45, comprising: determining, in dependence on the pose data, a path of the surgical device; determining surgical performance data indicative of one or more path parameters derived from the path of the surgical device; and outputting an indication of the surgical performance data.

47. The method of claim 46, comprising: receiving ideal path data indicative of a predetermined path through the surgical field; wherein the one or more path parameters comprise a deviation from the predetermined path.

48. The method of claim 47, comprising outputting an indication of the predetermined path to a display.

49. The method of any of claims 46 to 48, wherein the path parameters comprise one or more of a tortuosity, path curvature, path length, acceleration, velocity, or time elapsed for the path.

50. The method of any of claims 46 to 49, comprising determining a dexterity metric in dependence on an output of a machine learning algorithm taking one or more of the path parameters as input.

51. The method of claim 50, wherein the machine learning algorithm comprises a deep learning network, and wherein the deep learning network is trained to differentiate path parameters of experienced surgeons from those of inexperienced surgeons.

52. Computer software which, when executed, is arranged to perform a method according to any of claims 36 to 51.

Description:
Tracking position and orientation of a surgical device through fluorescence imaging

[0001] The invention relates to a controller, a system and a method for tracking the position and orientation of a surgical device through fluorescence imaging. The invention also relates to a surgical device configured to enable tracking through fluorescence imaging.

BACKGROUND

[0002] During surgery, in particular laparoscopic or robot assisted surgery, advanced surgical guidance is required to compensate for the lack of tactile feedback available to a surgeon. This may be implemented through optical laparoscopic systems providing a direct translation of the surgeon’s natural view. Fluorescence imaging has been incorporated into many surgical laparoscopes to further extend this optical assessment of tissue. However, such optical guidance is limited in identifying lesions beyond the surface. Ultrasound and radioguidance may be implemented to complement the imaging modalities, in the form of DROP-IN ultrasound and gamma probes. DROP-IN probes are configured to be small and flexible such that they may be dropped into the surgical field and positioned by a surgeon with high manoeuvrability. Tracking the position and orientation, known also as the pose, of DROP-IN probes using external tracking techniques is problematic due to the tethered design of the probes. Further, the influence of the robotic tools on electromagnetic fields renders electromagnetic tracking infeasible.

[0003] It is nevertheless desired to accurately and precisely track the pose of objects, such as surgical tools and devices like DROP-IN probes, within the surgical field to facilitate accurate surgical navigation. Tracking may be performed by video, utilising the laparoscopic camera in conjunction with fiducial markings such as “barcode” stickers on the device, and detecting the fiducials using white light imaging (Van Oosterom et al., Current Opinion in Urology, 2019). These techniques may be associated with poor accuracy in certain surgical environments, such as in the presence of blood, smoke, water, tissue and fluorescence imaging. Further, there may be a lack of robustness in tracking changes in orientation of the device.

[0004] It is an object of embodiments of the present invention to mitigate one or more problems of the prior art.

BRIEF SUMMARY OF THE DISCLOSURE

[0005] In accordance with the present invention there is provided a controller, a computer-implemented method and computer software for tracking a position and orientation of a surgical device. There is also provided a surgical device configured to enable tracking.

[0006] According to a first aspect, there is provided a controller for tracking position and orientation of a surgical device using fluorescence imaging, comprising: an input configured to receive, from a fluorescence imaging system, fluorescence imaging data indicative of emitted fluorescence from the surgical device; one or more processors; and a memory storing computer executable instructions therein which, when executed by the one or more processors, cause the one or more processors to: identify, in dependence on the fluorescence imaging data, first marker data indicative of emitted fluorescence from a first marker region of the surgical device, and second marker data indicative of emitted fluorescence from a second marker region of the surgical device; determine, in dependence on the first marker data and the second marker data, pose data indicative of a position and an orientation of the surgical device; and output a signal indicative of the pose data.

[0007] Optionally, third marker data indicative of emitted fluorescence from a third marker region may be identified. The marker regions may be separate. The marker regions may form contiguous segments of a complex shape on the surgical device. The marker regions may define an asymmetric pattern along a first axis of the surgical device. Advantageously, the orientation of the surgical device may then be determined easily when the marker regions are identified in the fluorescence imaging data.

[0008] The input may be configured to receive scan data indicative of a representation of a patient, and the one or more processors are configured to determine, in dependence on the scan data and the pose data, a relative position and relative orientation of the surgical device to at least one object represented by the scan data. Thus, the surgical device may be navigated accurately within the patient. The one or more processors may be further configured to output, to a display, display data indicative of the pose data. The display data may be further indicative of the scan data. For example, the pose of the device may be displayed in an augmented reality overlay on the scan of the patient. The relative pose of the surgical device to the at least one object may be displayed in the augmented reality overlay. For example, an indication of the distance from the surgical device to the at least one object may be displayed.
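
By way of an illustrative sketch only (not part of the original disclosure), the relative position of the surgical device to an object represented in the scan data may be computed by composing homogeneous transforms. The frame names, the 4x4 convention and the assumption that the scan has already been registered to the camera frame are ours:

```python
import numpy as np

def object_in_device_frame(T_cam_device: np.ndarray,
                           T_cam_scan: np.ndarray,
                           p_obj_scan: np.ndarray) -> np.ndarray:
    """Express a scanned object's position in the surgical device frame.

    T_cam_device: 4x4 tracked pose of the device in camera coordinates.
    T_cam_scan:   4x4 registration of the scan volume to camera coordinates.
    p_obj_scan:   (3,) object position in scan coordinates.
    """
    p_obj_cam = T_cam_scan @ np.append(p_obj_scan, 1.0)   # scan -> camera
    p_obj_dev = np.linalg.inv(T_cam_device) @ p_obj_cam   # camera -> device
    return p_obj_dev[:3]

# e.g. a displayed distance: np.linalg.norm(object_in_device_frame(...))
```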

[0009] Optionally, the one or more processors are configured to identify the first marker data in dependence on a first fluorescence emission spectrum associated with the first marker region and the second marker data in dependence on a second fluorescence emission spectrum associated with the second marker region, wherein the first fluorescence emission spectrum is distinct from the second fluorescence emission spectrum. The emission spectra may be distinct in that they define different frequency profiles with unique frequency peaks. That is, each marker region may exhibit a unique fluorescent colour. The regions may then be uniquely identified in the fluorescence imaging data due to their unique colours. If there are three marker regions, the fluorescence emission spectrum of the third marker region may be distinct from the first and second emission spectra.

[0010] The memory may store geometric data indicative of a geometry of each marker region on the surgical device. The marker data and consequently the pose may then be determined in dependence on the geometric data, by identifying regions in the fluorescence imaging data with a corresponding geometry. The geometric data may indicate the shape of each region is distinct. The geometric data may indicate the first and second marker regions are separated along a first axis of the surgical device. Thus, when the regions are identified, the orientation of the first axis may be inferred. The geometric data may be indicative of the third marker region being located between the first and second marker regions, such that the first, second and third marker regions are unevenly distributed along the first axis. The provision of an asymmetric distribution allows the orientation of the first axis to be robustly inferred. The geometric data may indicate that each marker region extends substantially around a portion of a surface proximal to a cross section of the surgical device, i.e. that each marker region defines a ring around the first axis of the surgical device.
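
As a hedged illustration of matching detected regions against stored geometric data, the sketch below labels detected regions by comparing their measured widths with stored ring widths. The specific width values and the one-detected-region-per-marker assumption are illustrative, not taken from the disclosure:

```python
import numpy as np

GEOMETRY_MM = {"first": 4.0, "second": 8.0, "third": 2.0}  # assumed ring widths

def label_regions(measured_widths_mm: list) -> dict:
    """Greedily match each detected region (by index) to the stored marker
    whose width it fits best; assumes one detected region per marker."""
    labels = {}
    remaining = dict(GEOMETRY_MM)
    for i, w in enumerate(measured_widths_mm):
        name = min(remaining, key=lambda k: abs(remaining[k] - w))
        labels[i] = name
        del remaining[name]
    return labels
```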

[0011] The controller may comprise an output configured to output a control signal for controlling the surgical device in dependence on the pose data. Optionally, the device may be controlled further in dependence on the scan data, for example to move the surgical device closer to a location of interest in the scan.

[0012] The one or more processors may be configured to identify, in dependence on the fluorescence imaging data, target data indicative of emitted fluorescence from a target external to the surgical device. The target data may be identified in dependence on a target fluorescence emission spectrum associated with the target, the target fluorescence emission spectrum being distinct from the first and second fluorescence emission spectra. Optionally, a 3D position of the target may be inferred. The target may be a second surgical device, reference target object, or a target tissue labelled with fluorescent tracer. The one or more processors may be further configured to determine, in dependence on the position of the target and the pose of the surgical device, a distance from the surgical device to the target, and to output a signal indicative of the determined distance.
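
A minimal sketch of the distance determination, assuming the target position has already been localised in 3D (for example via the stereoscopic imaging device mentioned above) and that the tracked pose yields a tip position and a unit axis direction. The axial/lateral decomposition is our illustrative addition for a directional probe, not the patent's method:

```python
import numpy as np

def device_to_target(tip_pos, axis_dir, target_pos):
    """Return (euclidean_mm, axial_mm, lateral_mm).

    tip_pos, target_pos: (3,) positions; axis_dir: (3,) unit vector along
    the device's first axis.
    """
    v = np.asarray(target_pos, float) - np.asarray(tip_pos, float)
    a = np.asarray(axis_dir, float)
    axial = float(v @ a)                           # distance along the probe axis
    lateral = float(np.linalg.norm(v - axial * a)) # off-axis offset
    return float(np.linalg.norm(v)), axial, lateral
```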

[0013] The one or more processors may be configured to determine, in dependence on the pose data, a path of the surgical device; determine surgical performance data indicative of one or more path parameters derived from the path of the surgical device; and output an indication of the surgical performance data. Optionally, the one or more processors are further configured to receive ideal path data indicative of a predetermined path through the surgical field; and the one or more path parameters comprise a deviation from the predetermined path. Optionally, the one or more processors are further configured to output an indication of the predetermined path to a display. The path parameters may comprise one or more of a tortuosity, path curvature, path length, acceleration, velocity, or time elapsed for the path. Optionally, the one or more processors are configured to determine a dexterity metric in dependence on an output of a machine learning algorithm taking one or more of the path parameters as input. The machine learning algorithm may comprise a deep learning network. The deep learning network may be trained to differentiate path parameters of experienced surgeons from those of inexperienced surgeons.
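
The path parameters listed above can be illustrated with a short sketch; the sampling format (timestamped tip positions) and the arc-to-chord definition of tortuosity are assumptions, not prescribed by the text:

```python
import numpy as np

def path_parameters(t: np.ndarray, p: np.ndarray) -> dict:
    """t: (N,) timestamps in seconds; p: (N, 3) tip positions in mm."""
    seg = np.diff(p, axis=0)                      # per-sample displacement
    seg_len = np.linalg.norm(seg, axis=1)
    path_length = float(seg_len.sum())
    chord = float(np.linalg.norm(p[-1] - p[0]))   # straight-line distance
    dt = np.diff(t)
    speed = seg_len / dt
    accel = np.diff(speed) / dt[1:]
    return {
        "path_length_mm": path_length,
        "tortuosity": path_length / max(chord, 1e-9),  # arc/chord ratio
        "mean_speed_mm_s": float(speed.mean()),
        "peak_accel_mm_s2": float(np.abs(accel).max()) if accel.size else 0.0,
        "time_elapsed_s": float(t[-1] - t[0]),
    }
```

Parameters such as these could serve as the inputs to the machine learning algorithm described above; the training itself is beyond this sketch.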

[0014] According to another aspect of the invention, there is provided a surgical device for laparoscopy or endoscopy configured to enable pose determination by fluorescence imaging, the surgical device comprising a first fluorescent marker region and a second fluorescent marker region. The surgical device may comprise a surgical probe, a laparoscopic probe, a surgical instrument, a detection modality or an imaging modality. For example, the surgical device may comprise a DROP-IN gamma probe for detection of radioactive tracers.

[0015] The first fluorescent marker region may be associated with a first fluorescence emission spectrum and the second fluorescent marker region may be associated with a second fluorescence emission spectrum, wherein the first fluorescence emission spectrum is distinct from the second fluorescence emission spectrum. If there are three marker regions, the fluorescence emission spectrum of the third fluorescent marker region may be distinct from the first and second emission spectra. That is, each marker region may exhibit a different fluorescent colour, and thus may be uniquely identified in a captured image or video footage.

[0016] A shape of the first fluorescent marker region may be distinct from a shape of the second fluorescent marker region, in order that they may be uniquely identified. Optionally, the first and second fluorescent marker regions are separated along a first axis of the surgical device. The first and second fluorescent marker regions may alternatively be contiguous. The first and second fluorescent marker regions may together define an asymmetric pattern along the first axis of the surgical device, to enable the orientation of the device to be inferred from the identification of the marker regions. Optionally, the surgical device comprises a third fluorescent marker region being located between the first and second marker regions, such that the first, second and third marker regions are unevenly distributed along the first axis of the surgical device. At least one marker region may extend substantially around a portion of a surface proximal to a cross section of the surgical device, to define a ring around the first axis.

[0017] The fluorescent marker regions may comprise medical grade fluorescent material embedded in the surgical device. The medical grade fluorescent material may comprise one or more of: fluorescent polymer material or an immobilised fluorescent dye. For example, the medical grade fluorescent material may comprise PEEK, fluorescent dye in epoxy, or fluorescent dye in silicone.

[0018] According to a further aspect there is provided a system for tracking a position and orientation of a surgical device using fluorescence imaging, comprising the surgical device, a fluorescence imaging device comprising an excitation source and a detector configured to capture fluorescence imaging data indicative of emitted fluorescence from the surgical device; and the controller. The system may be configured for laparoscopic, endoscopic or robot assisted surgery. The fluorescence imaging device may comprise a laparoscopic or endoscopic camera.

[0019] According to a further aspect there is provided a computer-implemented method for tracking position and orientation of a surgical device using fluorescence imaging, comprising: receiving, from a fluorescence imaging system, fluorescence imaging data indicative of emitted fluorescence from the surgical device; identifying, in dependence on the fluorescence imaging data, first marker data indicative of emitted fluorescence from a first marker region of the surgical device, and second marker data indicative of emitted fluorescence from a second marker region of the surgical device; determining, in dependence on the first marker data and the second marker data, pose data indicative of a position and an orientation of the surgical device; and outputting a signal indicative of the pose data. There is also provided computer software which, when executed, is arranged to perform the computer-implemented method.

BRIEF DESCRIPTION OF THE DRAWINGS

[0020] Embodiments of the invention are further described hereinafter with reference to the accompanying drawings, in which:

Figure 1 is a schematic illustration of a system 100 according to an embodiment of the invention;

Figures 2a to 2j illustrate a surgical device 120 according to embodiments of the invention;

Figures 3a and 3b illustrate a surgical device 120 according to an embodiment of the invention;

Figure 4 illustrates a method 400 for tracking the position and orientation of a surgical device 120 according to an embodiment of the invention;

Figures 5a and 5b illustrate a field of view of a fluorescence imaging device 130 and an illustration of an example output displayed to a user according to an embodiment of the invention;

Figure 6 illustrates a method 600 according to an embodiment of the invention;

Figure 7 is a schematic illustration of a system 700 according to an embodiment of the invention;

Figures 8a and 8b illustrate an example of target localisation according to an embodiment of the invention;

Figures 9a and 9b illustrate example display footage including imaging data captured in fluorescence mode according to an embodiment of the invention;

Figure 10 illustrates example fluorescence imaging data captured in white light mode according to an embodiment of the invention, for simultaneous tracking of three different surgical devices;

Figure 11 illustrates a method 1100 according to an embodiment of the invention;

Figure 12 illustrates an example display output 1200 according to an embodiment of the invention; and

Figure 13 illustrates path data collected according to an embodiment of the invention.

DETAILED DESCRIPTION

[0021] Figure 1 illustrates a system 100 according to an embodiment of the invention. The system 100 may be used for tracking the position and orientation of a surgical device 120, for example during laparoscopic or robot assisted surgery.

[0022] The system 100 comprises a controller 110, a fluorescence imaging device 130, and the surgical device 120. The fluorescence imaging device 130 is configured to capture fluorescence imaging data indicative of emitted fluorescence from the surgical device 120, and the controller 110 is configured to perform a method for tracking the position and orientation of the surgical device 120 in dependence on the captured fluorescence data, as will be explained. The system 100 may in some embodiments also comprise a display device 140.

[0023] The surgical device 120 may be any device for which it may be desired to determine an accurate position and orientation within a subject. The surgical device 120 may be particularly suitable for laparoscopic, endoscopic or robot assisted surgery. For example, the surgical device 120 may be a DROP-IN probe, such as a DROP-IN gamma probe for detection of radioactive tracers. Such DROP-IN probes may be used to aid localisation of internal targets in a subject during robot assisted or laparoscopic surgery. Accurate tracking of probes and other laparoscopic instruments is important for surgical navigation. The surgical device 120 is however not limited to DROP-IN probes, and may for example comprise any other surgical probe, surgical instrument, detection modality or imaging modality. In some embodiments, the surgical device 120 is a tethered modality, i.e. physically connected to one or more external systems. The surgical device 120 is configured to enable pose determination and tracking by fluorescence imaging, by way of comprising at least a first fluorescent marker region 121 and a second fluorescent marker region 122. However, in some embodiments the surgical device 120 comprises more than two fluorescent marker regions, for example three fluorescent marker regions. Each fluorescent marker region 121, 122 is characterised by exhibiting some degree of fluorescence. Each fluorescent marker region 121, 122 comprises fluorescent material defined by an ability to absorb electromagnetic (EM) radiation at a first “excitation” frequency and emit EM radiation at a lower “emission” frequency, typically in the visible spectrum, far-red, or near-infrared I and II regions.

[0024] Figure 3a and Figure 3b illustrate an example surgical device 120 according to an embodiment of the invention. The surgical device 120 comprises a DROP-IN gamma probe with three fluorescent marker regions 121, 122, 123. Figure 3a illustrates the surgical device 120 alone. Figure 3b illustrates the surgical device 120 in a tethered form, grasped by surgical forceps, illustrating an exemplary use of the surgical device 120. The three fluorescent marker regions 121, 122, 123 comprise fluorescent rings arranged non-uniformly along an axis of the surgical device 120, however it will be appreciated that the fluorescent marker regions 121, 122, 123 are not limited to this arrangement, as will be explained. The surgical device 120 is illustrated in Figure 3a and Figure 3b alongside the five degrees of freedom (5 DOF) by which it may be desired to track the pose of the surgical device. Figure 3a and Figure 3b illustrate three positional axes x, y, z and two rotational axes θ, ψ. Movement with respect to each axis defines an independent degree of freedom. The rotational and positional axes are respectively oriented such that a change in angle θ defines rotation around the y axis and a change in angle ψ defines rotation around the z axis. According to an embodiment of the invention, the pose of the surgical device 120 may be determined with respect to the 5 DOF defined by the axes illustrated.
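
For illustration only, a minimal container for the 5 DOF pose described above might look as follows. The rotation convention (θ about y, then ψ about z) is inferred from the figure description and is an assumption; roll about the device axis is unobservable for ring-shaped markers, which is consistent with only five degrees of freedom being tracked:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Pose5DOF:
    # Positional degrees of freedom (camera frame, e.g. in mm)
    x: float
    y: float
    z: float
    theta: float  # rotation about the y axis, radians
    psi: float    # rotation about the z axis, radians

    def axis_direction(self) -> np.ndarray:
        """Unit vector along the device's first axis: Rz(psi) @ Ry(theta)
        applied to the x unit vector (an assumed convention)."""
        ct, st = np.cos(self.theta), np.sin(self.theta)
        cp, sp = np.cos(self.psi), np.sin(self.psi)
        return np.array([cp * ct, sp * ct, -st])
```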

[0025] Returning to Figure 1, the fluorescent marker regions 121, 122 may be arranged in a manner such that they can be distinguished by a fluorescence imaging system, for example by one or both of differing geometry or distinct fluorescence emission spectra. The fluorescent marker regions 121, 122 may also be distinctly coloured such that they are distinguishable in white light. If the fluorescent marker regions 121, 122 comprise differing geometry or distinct colouring, it will be appreciated that the regions would also be distinguishable by a conventional imaging system in white light conditions. The two fluorescent marker regions 121, 122 may be arranged such that they are physically separated on the surgical device 120, however it will be appreciated that in some embodiments the fluorescent marker regions 121, 122 may abut. Two regions may be distinguishable despite not being arranged in a physically separated manner. A single complex shape may comprise two or more distinguishable regions. For example, any shape may be divided into two regions having different spectral properties. The two regions may then be distinguishable by a fluorescence imaging system by exhibiting different fluorescent colours, i.e. emission spectra. A single complex shape may also comprise two regions which are geometrically distinguishable. For example, the single complex shape may be a non-equilateral triangle, and each side may define a different fluorescent marker region 121, 122. Each side may be uniquely identified based on the geometry of the shape, and thus the sides will be distinguishable to an imaging system. The potential arrangement and purpose of the fluorescent marker regions 121, 122 will be later described in more detail with reference to several illustrative embodiments.

[0026] The fluorescence imaging device 130 is configured to capture fluorescence imaging data indicative of emitted fluorescence from the surgical device 120, in particular from the fluorescent marker regions 121, 122 of the surgical device 120. The fluorescence imaging device 130 may be a fluorescence camera suitable for laparoscopic or endoscopic procedures, for example the fluorescence imaging device 130 may comprise the Firefly™ fluorescence imaging system. The fluorescence imaging device 130 may therefore be navigated within a subject alongside the surgical device 120. The fluorescence imaging device 130 comprises an excitation source 131 for emitting an excitation signal and a detector 132 for detecting fluorescence emitted by the surgical device 120 responsive to the excitation signal.

[0027] The excitation source 131 is configured to emit an excitation signal suitable for exciting the fluorescent material of the fluorescent marker regions 121, 122. The fluorescent marker regions 121, 122 absorb the excitation signal and fluoresce. The excitation signal may have a frequency profile suitable for selectively targeting excitation of the fluorescent marker regions 121, 122. In such embodiments the fluorescence of the marker regions 121, 122 may cause them to be selectively illuminated within the surrounding scene, and as such will be prominent in the fluorescence imaging data. In this case, the fluorescence imaging device 130 may be described as functioning in “fluorescence mode”. Example fluorescence imaging data captured in fluorescence mode will be illustrated in Figures 9a and 9b.

[0028] In other embodiments the excitation signal may have a broad frequency spectrum including a broad range of visible wavelengths. The broad frequency spectrum may define white light. In this case, the fluorescence imaging device may be described as functioning in “white light mode”. In addition to exciting the fluorescent material in the fluorescent marker regions 121, 122, the remainder of the scene may be illuminated by the emitted broad spectrum light. This allows the fluorescence imaging data captured by the imaging device 130 to comprise a more detailed representation of the surrounding scene in addition to the emitted fluorescence from the marker regions 121, 122. The white light illumination further aids a surgeon in navigation, although the marker regions 121, 122 may be less prominently identified. Example fluorescence imaging data 1000 captured in white light mode will be illustrated in Figure 10. According to some embodiments, the fluorescence imaging device 130 may be a stereoscopic imaging device to enable or improve localisation of other fluorescent targets in 3D space.

[0029] The detector 132 of the fluorescence imaging device 130 is configured to capture an image of the scene by detecting emitted or reflected EM radiation from the scene at a predetermined range of wavelengths. The range of wavelengths detected is configured to comprise at least part of the emission spectrum of each fluorescent marker region 121, 122. The range of wavelengths detected may be configured to comprise at least a portion of the visible spectrum.

[0030] Figure 1 illustrates within the system 100 a controller 110 according to an embodiment of the invention. The controller 110 comprises at least one processor 111 and memory 112. The memory 112 may store computer readable instructions which may be executed by the one or more processors 111 to perform a method for tracking the position and orientation of a surgical device 120. Although illustrated schematically in Figure 1 as a single controller 110, it will be appreciated that in some embodiments the functionality of the controller 110 may be implemented across a plurality of controllers, communicably coupled via a communication means 150.

[0031] The communication means 150 may comprise any means of enabling communication between elements of the system 100. The controller 110 may be operable to transmit and receive data through the communication means 150 via one or more inputs 113 and outputs 114. For example, the communication means 150 may comprise one or more wired connections between elements of the system 100 to which the controller 110 may be interfaced through input/output (I/O) circuitry. The communication means 150 may additionally or alternatively comprise one or more wireless communication channels such as Bluetooth, Infrared or Near-Field Communication (NFC). The communication means 150 may comprise one or more networks such as Local Area Networks (LANs), the Internet and the like.

[0032] The controller 110 is configured to receive, at the input 113, the fluorescence imaging data captured of the surgical device 120 from the fluorescence imaging system. The controller 110 may be operable to store the fluorescence imaging data in the memory 112, and the processor 111 is configured to perform a method in dependence on the fluorescence imaging data in order to determine a pose of the surgical device 120.

[0033] The controller 110 is configured to output a pose signal indicative of the determined pose of the surgical device 120. In some embodiments the controller 110 may be configured to output the pose signal to a user interface in order to output to a user of the system 100 an indication of the pose of the surgical device. For example, a user of the system 100 may be a surgeon or other person wholly or partially in control of the surgical device. The surgical device 120 may form part of a larger robotic surgical system, in which case the surgeon may only be partially in control, with robotic assistance. In some embodiments, the controller 110 may also be at least partially in control of the surgical device. For example, the pose of the surgical device 120 may be automatically adjusted by the controller 110 without intervention by the user. The determined pose may be output to the user in the form of visual, audible, haptic feedback or any combination thereof. The determined pose may in some embodiments be output in conjunction with other information, for example the fluorescence imaging data, contextual scan data or data from one or more further sensors to further aid navigation, as will be explained. In particular, the determined pose may be output as an augmented reality overlay on captured fluorescence imaging data or the scan data.

[0034] According to some embodiments, the system 100 may comprise at least one display device 140 for outputting visual information to the user. The display device 140 may comprise one or more screens or other display equipment. The controller 110 may then be configured to output display data indicative of the pose to the display device 140, and the display device 140 may be configured to output an indication of the pose of the surgical device 120 to the user. The displayed indication may be in some embodiments embedded in video footage on the display 140 to provide an augmented reality indication of the surgical workflow.

[0035] In some embodiments, the surgical device 120 may be at least in part controlled by the controller 110. The controller 110 may then be configured to output a control signal in dependence on the determined pose, to control movement of the surgical device 120. For example, it may be desired to control movement of the surgical device 120 to achieve a desired pose, such as to move closer to a target or to orient correctly relative to a target.
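
A hedged sketch of such closed-loop control is given below: a bounded proportional step toward a desired tip position derived from the determined pose. The gain, step limit and the idea of clamping the command are illustrative assumptions rather than part of the disclosure:

```python
import numpy as np

K_P = 0.5  # proportional gain (assumed)

def control_step(current_tip: np.ndarray, desired_tip: np.ndarray,
                 max_step_mm: float = 2.0) -> np.ndarray:
    """Return a bounded translation command toward the desired tip position."""
    step = K_P * (np.asarray(desired_tip, float) - np.asarray(current_tip, float))
    n = float(np.linalg.norm(step))
    if n > max_step_mm:
        step *= max_step_mm / n       # clamp the step for safety
    return step
```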

[0036] The surgical device 120 will now be described in more detail with reference to Figures 2a to 2j.

[0037] As discussed, the surgical device 120 comprises at least a first fluorescent marker region 121 and a second fluorescent marker region 122. The fluorescent marker regions 121, 122 comprise areas of the surgical device 120 arranged to fluoresce, i.e. emit EM radiation according to a fluorescence emission spectrum when excited by incident radiation. The incident radiation may be provided for example by the excitation source 131 of the fluorescence imaging device 130.

[0038] According to some embodiments the fluorescent marker regions 121, 122 may comprise medical grade fluorescent material. The medical grade fluorescent material may be embedded in the surgical device. Advantageously, the surgical device 120 including the fluorescent marker regions 121, 122 may be sterilisable and thus readily reusable without requiring further addition of new fluorescent markers for every iteration. However, in alternative embodiments it will be appreciated that other more temporary marker regions 121, 122 may be implemented, for example in the form of fluorescent stickers. Optionally, the surgical device 120 may comprise other markings or engravings to illustrate the location at which the fluorescent material should be attached, if disposable fluorescent material is utilised. In this way, the material may be attached at known and reproducible locations on the surgical device to ensure that the fluorescent marker regions 121, 122 are consistently placed and avoid the need to calibrate the system. The medical grade fluorescent material may comprise fluorescent polymer material, for example polyether ether ketone (PEEK). The medical grade fluorescent material may comprise an immobilised fluorescent dye, for example a fluorescent dye in epoxy or silicone. However, it will be appreciated that any medical grade material harbouring fluorescent properties may be utilised in the invention.

[0039] The surgical device 120, as illustrated in Figures 2a to 2j, may comprise a substantially elongate shape, having a first end 124 and a second end 125. Although illustrated for simplicity in Figures 2a to 2g as a generally cuboid or cylindrical shape, it will be appreciated that the shape of the surgical device is not limited in this way and may comprise an alternative or irregular shape. For example, each surgical device 120 illustrated in Figures 2h, 2i and 2j comprises a non-uniform shape. The fluorescent marker regions 121, 122 may be separated along a first axis between the first and second ends 124, 125 of the surgical device 120. Additionally, or alternatively, the fluorescent marker regions 121, 122 may form contiguous segments of a complex shape, which is preferably asymmetric along the first axis. In this way, the orientation of the first axis may be inferred from the identification of the fluorescent marker regions 121, 122.

[0040] The fluorescent marker regions 121, 122 may preferably be arranged in a distinguishable manner. In this way, the regions 121, 122 may be readily distinguished from each other in fluorescence imaging data, which provides directional accuracy in determining the pose of the surgical device 120. According to some embodiments, this distinction between the marker regions 121, 122 may be brought about by the first fluorescent marker region 121 having a geometry which is distinct from a geometry of the second fluorescent marker region 122. For example, the first fluorescent marker region 121 and second fluorescent marker region 122 may comprise distinct shapes. By distinct shapes, it is meant that the fluorescent marker regions 121, 122 may be defined by differently shaped boundaries. Alternatively or additionally, the first fluorescent marker region 121 and second fluorescent marker region 122 may have a distinct geometry by being differently sized. Figures 2a and 2b illustrate surgical devices 120 having differently sized first and second fluorescent marker regions 121, 122 in the form of a wider and a narrower ring.

[0041] The fluorescent marker regions 121, 122 may preferably be arranged such that they are viewable from all angles around the first axis of the surgical device 120. For example, each fluorescent marker region may extend substantially around a portion of a surface proximal to a cross section of the surgical device. The fluorescent marker regions 121, 122 may define ring-shaped regions around the surgical device 120, although this is not limited to surgical devices 120 having any particular cross-sectional shape. For example, the surgical devices in Figures 2a and 2b may each be described as having ring-shaped fluorescent marker regions 121, 122, despite the surgical device 120 in Figure 2a comprising a rectangular cross section and the surgical device 120 in Figure 2b comprising a circular cross section. The ring-shaped fluorescent marker regions 121, 122 may be arranged to be of different widths in order to provide differently sized marker regions.

[0042] Optionally, the surgical device comprises three or more fluorescent marker regions. The inclusion of a third fluorescent marker region may advantageously improve the accuracy with which the pose of the surgical device 120 may be determined, because a third point of reference is provided along the axis of the device. Figure 2c illustrates an embodiment comprising a third fluorescent marker region 123. The third fluorescent marker region may be located between the first and second marker regions along the first axis of the surgical device 120, to further aid the definition of the orientation of the surgical device 120 in fluorescence imaging. According to some embodiments, the first, second and third marker regions may be unevenly distributed along the first axis of the surgical device. In this way, the marker regions 121, 122, 123 may be distinguishable due to defining an asymmetric pattern. The asymmetry along the first axis enables the two ends 124, 125 of the surgical device to be distinguished easily. This asymmetric pattern may be implemented together with the regions 121, 122, 123 having a distinct shape, or the asymmetric pattern may be implemented as an alternative. Therefore, in embodiments comprising an asymmetric distribution of marker regions 121, 122, 123 it is not necessary for each region 121, 122, 123 to also comprise a different shape or area in order to be distinguishable.
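
To illustrate how the asymmetric distribution resolves the two ends of the device, the sketch below orders three ring centroids along the device axis and then flips the ordering using the unequal gaps. Which gap lies nearer the tip is a device-design assumption, not something fixed by the disclosure:

```python
import numpy as np

def order_tip_to_back(c1, c2, c3, tip_gap_larger: bool = True) -> np.ndarray:
    """Order three ring centroids from tip to back along the device axis.

    Uses the uneven spacing of the rings; here we assume the larger
    inter-ring gap lies nearer the tip.
    """
    pts = np.array([c1, c2, c3], dtype=float)
    centre = pts.mean(axis=0)
    d = np.linalg.svd(pts - centre)[2][0]      # principal (device-axis) direction
    pts = pts[np.argsort((pts - centre) @ d)]  # sort along that axis
    gap_near = np.linalg.norm(pts[1] - pts[0])
    gap_far = np.linalg.norm(pts[2] - pts[1])
    if (gap_near > gap_far) != tip_gap_larger:
        pts = pts[::-1]                        # flip so index 0 is the tip end
    return pts
```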

[0043] It will be appreciated that the invention is not limited to the fluorescent marker regions 121, 122 being separated. Figures 2d to 2g illustrate four exemplary embodiments wherein the fluorescent marker regions 121, 122 form contiguous segments of a single complex shape. The fluorescent marker regions 121, 122 comprise two regions of the complex shape which are geometrically distinguishable.

[0044] Figure 2d illustrates an embodiment wherein the first and second fluorescent marker regions 121, 122 form an arrow shaped fluorescent region. The arrow shaped region may be considered to comprise two segments, a rectangular segment and a triangular segment. The two segments respectively define the first and second fluorescent marker regions 121, 122. The two segments are geometrically distinct and thus may each be uniquely identified from a captured image.

[0045] Figure 2e illustrates an embodiment wherein the first and second fluorescent marker regions 121, 122 form a non-equilateral triangle. The triangle may be considered to comprise at least two segments, for example two halves, or a base and an apex. The two halves may be effectively distinguished in dependence on the relative positions of the vertices. Each of the two halves may respectively define the distinct fluorescent marker regions 121, 122 as shown.

[0046] Figures 2f and 2g each illustrate a more complex fluorescent marking comprising a QR code or bar code. The QR code or bar code may be subdivided into two distinguishable fluorescent marker regions 121, 122, for example defining two halves of the QR code or bar code each defining a unique geometry. It will be appreciated that the complex fluorescent markings illustrated in all of Figures 2d to 2g may be considered to form more than two distinguishable regions, and thus may be considered to form a larger plurality of fluorescent marker regions. The illustrated concept simply requires at least two regions of the complex shape to be distinguishable when visible to the fluorescence imaging system 130, in order for the identification of the marking by the fluorescence imaging system 130 to allow inference of the orientation of the surgical device 120.

[0047] Figures 2h to 2j illustrate a surgical device 120 comprising first to third fluorescent marker regions 121, 122, 123 analogous to those illustrated in Figure 2c, in the form of unevenly spaced rings. The surgical device 120 illustrated in Figures 2h to 2j comprises an additional fourth fluorescent marker region 124. The fourth fluorescent marker region 124 is uniquely shaped compared with the first three fluorescent marker regions 121, 122, 123. In the illustrated embodiments, the shape of the fourth fluorescent marker region 124 is defined by the irregular geometry of the surgical device 120. However, it will be appreciated that the fourth fluorescent marker region 124 may comprise a shape independent of the geometry of the surgical device 120. The fluorescent marker region 124 may be combined with any previous embodiment, i.e. it is not limited to being combined with the first to third fluorescent marker regions 121, 122, 123 as illustrated. For example, it can be envisaged that the fourth fluorescent marker region 124 may be combined with the embodiment illustrated in Figure 2d. In this case, a third fluorescent marker region 123 may not be present.

[0048] According to some embodiments, the first fluorescent marker region is associated with a first fluorescence emission spectrum and the second fluorescent marker region is associated with a second fluorescence emission spectrum. The fluorescent marker regions 121, 122 may be distinguishable due to the first fluorescence emission spectrum being distinct from the second fluorescence emission spectrum. By distinct it is meant that the first and second emission spectra have different frequency profiles, and different frequency peaks. In this way, advantageously the fluorescent marker regions 121, 122 may be readily distinguished in fluorescence imaging data without requiring any deduction of the geometry of the regions. Optionally, if the surgical device 120 comprises a third fluorescent marker region 123, the fluorescence emission spectrum of the third fluorescent marker region 123 may be distinct from each of the first and second emission spectra, to further aid distinguishability, however it will be appreciated that this is not required.
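
The following sketch illustrates one way such spectrally distinct regions might be separated in captured data, by matching pixel colour against calibrated reference emission colours. The reference values, the RGB representation and the cosine-similarity test are all assumptions for illustration, not the patent's prescribed processing:

```python
import numpy as np

REFERENCE = {                                  # assumed calibrated emission
    "marker1": np.array([0.1, 0.9, 0.2]),      # colours, normalised RGB
    "marker2": np.array([0.8, 0.2, 0.1]),
}

def classify_pixels(img: np.ndarray, similarity: float = 0.9,
                    brightness: float = 0.2) -> dict:
    """Return a boolean mask per marker; img is float RGB in [0, 1], (H, W, 3).

    A pixel is assigned to a marker if it is bright and its chromaticity
    has high cosine similarity with that marker's reference colour.
    """
    flat = img.reshape(-1, 3)
    norms = np.linalg.norm(flat, axis=1, keepdims=True) + 1e-9
    unit = flat / norms
    masks = {}
    for name, ref in REFERENCE.items():
        sim = unit @ (ref / np.linalg.norm(ref))
        masks[name] = ((sim > similarity) &
                       (norms[:, 0] > brightness)).reshape(img.shape[:2])
    return masks
```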

[0049] As discussed, it may be desired to determine a position and orientation (pose) of the surgical device 120. A method 400 for tracking the pose of the surgical device 120 is illustrated in Figure 4. The method 400 may be performed at least in part by the controller 110.

[0050] The method 400 comprises a step 410 of receiving fluorescence imaging data. The fluorescence imaging data may be received from the fluorescence imaging device 130 and may be indicative of EM radiation detected at an array of the detector 132 of the fluorescence imaging device 130. The fluorescence imaging device 130 is arranged to capture an image of the surgical device 120, and so the fluorescence imaging data is indicative of emitted fluorescence from the surgical device 120. The fluorescence imaging data may in some embodiments comprise video footage captured by the fluorescence imaging device 130 of the surgical device 120.

[0051] Figures 5a and 5b illustrate an example field of view of the fluorescence imaging device 130 and an example output of the method 400 in the form of display data, respectively. Figure 5a illustrates an example field of view comprising the surgical device 120. The fluorescence imaging device 130 is configured to capture fluorescence imaging data, which may take the form of an image of the illustrated field of view. The captured image comprises at least an indication of fluorescence emitted from the fluorescent marker regions 121, 122, 123. Additionally, further detail within the field of view may be discernible in the captured image. This may be the case particularly when the fluorescence imaging device is functioning in “white light” mode and the field of view is sufficiently illuminated. This may also be the case when one or more further fluorescent objects are present within the field of view, for example other surgical devices 120 or target tissue. The system 100 can track a plurality of such objects concurrently, as will be explained with reference to Figures 8 to 10.

[0052] The method 400 comprises a step 420 of identifying marker data within the fluorescence imaging data. The marker data may be identified as bright regions defining a specific fluorescent emission within the captured image, indicative of emitted fluorescence from fluorescent objects within the field of view illustrated in Figure 5a. In “fluorescence mode”, an image captured by the fluorescence imaging device 130 of the illustrated field of view would comprise three isolated bright regions indicative of the three fluorescent marker regions 121, 122, 123. If the marker regions all have the same spectral properties, the regions may all appear in the image to have the same colour. If the marker regions are arranged to have distinct emission spectra, the regions may then be distinguishable due to comprising different apparent colours within the image. In “white light mode”, the contrast between the fluorescent marker regions 121, 122, 123 and the illuminated surroundings may be lower. However, the marker regions 121, 122, 123 may still be identified based on their predefined emission spectra.

[0053] Figure 5b illustrates an example output of the method 400 in the form of display data. Figure 5b illustrates an example image frame 520 captured by the fluorescence imaging device of the field of view illustrated in Figure 5a. The output further comprises an indication of the marker data 521, 522, 523 and pose data 530, as will be described.

[0054] Step 420 comprises distinguishing the first marker region 121 from the second marker region 122 in the captured image frame 520. First marker data 521 indicative of emitted fluorescence from the first fluorescent marker region 121 is thus distinguished from second marker data 522 indicative of emitted fluorescence from the second fluorescent marker region 122. If the surgical device 120 comprises more than two marker regions, step 420 may comprise distinguishing each marker region in the captured image 520, for example to uniquely identify third marker data 523. Each marker data 521, 522, 523 may define a respective area of a frame of the fluorescence imaging data representative of each fluorescent marker region 121, 122, 123.

[0055] According to some embodiments, the footage captured by the fluorescence imaging device may be output to a user in real time for navigation, for example on the display 140. The marker data 521, 522, 523 may then be highlighted or segmented in the displayed footage to draw the user’s attention to the fluorescent marker regions. That is, the marker data 521, 522, 523 may be displayed with the footage in augmented reality. For example, as illustrated in Figure 5b, outlines of the image regions defining the marker data 521, 522, 523 may be superposed on the displayed footage. The segmented outlines may be displayed differently for each of the first, second and third marker data 521, 522, 523 to enable the user to easily distinguish the different fluorescent marker regions 121, 122, 123 within the displayed footage. For example, each segmented outline may be displayed with a different colour.

[0056] The first marker data 521, second marker data 522 and optionally third marker data 523 may be uniquely identified in dependence on one or more distinguishable features of each fluorescent marker region 121, 122, 123. As has been described, the fluorescent marker regions 121, 122, 123 may be distinguishable due to comprising distinct geometries, by defining an asymmetric arrangement on the surgical device 120, due to being associated with distinct fluorescence emission spectra, or any combination thereof. Other distinguishable features, such as unique non-fluorescent colours, may also be implemented. The distinguishable features of the fluorescent marker regions 121, 122, 123 may be stored in the memory 112 of the controller 110 or elsewhere accessible to the processor 111.

[0057] The memory 112 may be configured to store geometric data indicative of a geometry of the fluorescent marker regions 121, 122, 123 on the surgical device 120. The geometric data may then be used to distinguish the first marker data 521 and second marker data 522 from the captured fluorescence imaging data.

[0058] According to some embodiments, the geometric data may be indicative of a shape of each fluorescent marker region 121, 122, 123. The shape of each region may be distinct, as has been described. Step 420 may thus comprise identifying an area of fluorescence in the fluorescence imaging data best matching the distinct shape of each fluorescent marker region 121, 122, 123. The geometric data may be indicative of the 2D projection of each fluorescent marker region 121, 122, 123 from a range of angles. In this way, the identification of the marker data will be robust to variation in the angle of orientation of the surgical device 120 relative to the fluorescence imaging device 130. It will be appreciated that any suitable object detection algorithm, such as those known from image processing, may be used in the identification of the marker data 521, 522, 523.
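
By way of illustration only, a shape-based identification step of this kind might look as follows. This is a minimal sketch assuming the OpenCV library; the threshold value, the function name identify_marker_by_shape and the template contours are hypothetical placeholders rather than details taken from the present disclosure. Hu-moment matching is used here because it is insensitive to scale and in-plane rotation, loosely mirroring the robustness to viewing angle discussed above.

```python
# Minimal sketch: shape-based identification of fluorescent marker
# regions (paragraph [0058]). Assumes OpenCV; threshold and templates
# are illustrative placeholders.
import cv2


def identify_marker_by_shape(fluorescence_frame, template_contours):
    """Return, per template, the detected contour best matching its shape.

    fluorescence_frame: single-channel fluorescence intensity image.
    template_contours: dict mapping a marker label (e.g. 'region_121')
        to a reference contour describing that marker's distinct shape.
    """
    # Segment bright fluorescent areas from the darker background.
    _, binary = cv2.threshold(fluorescence_frame, 50, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    matches = {}
    for label, template in template_contours.items():
        # Lower matchShapes score means a closer shape match
        # (Hu-moment based, so robust to scale and in-plane rotation).
        best = min(contours,
                   key=lambda c: cv2.matchShapes(c, template,
                                                 cv2.CONTOURS_MATCH_I1, 0.0))
        matches[label] = best
    return matches
```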

[0059] The geometric data may be indicative of a relative spacing between the fluorescent marker regions 121, 122, 123. For example, in an embodiment wherein the surgical device comprises three fluorescent marker regions 121, 122, 123 as in Figure 3, the geometric data may be indicative of the first and third marker regions 121, 123 being more closely spaced than the third and second marker regions 123, 122. In this way, the first, second and third marker data 521, 522, 523 may each be distinguished based on the relative spacing between identified fluorescent regions.
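
A minimal sketch of such spacing-based labelling is given below, assuming the three blob centroids have already been extracted from the image; the ordering convention follows the Figure 3 arrangement described above, and all names are hypothetical.

```python
# Minimal sketch: labelling three ring markers by relative spacing
# (paragraph [0059]). Exploits the asymmetry: regions 121 and 123 are
# the most closely spaced pair, with 123 lying nearer to 122.
import numpy as np


def label_rings_by_spacing(centroids):
    """centroids: (3, 2) array of detected blob centres in image coords.

    Returns centroids ordered as (region 121, region 123, region 122).
    """
    c = np.asarray(centroids, dtype=float)
    # Pairwise distances between the three centroids.
    d = {(i, j): np.linalg.norm(c[i] - c[j])
         for i in range(3) for j in range(i + 1, 3)}
    (i, j) = min(d, key=d.get)          # closest pair -> regions 121, 123
    k = ({0, 1, 2} - {i, j}).pop()      # remaining blob -> region 122
    # Of the closest pair, region 123 is the one nearer region 122.
    third = i if np.linalg.norm(c[i] - c[k]) < np.linalg.norm(c[j] - c[k]) else j
    first = j if third == i else i
    return c[first], c[third], c[k]
```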

[0060] According to some embodiments as discussed, the first fluorescent marker region 121 of the surgical device 120 may be associated with a first fluorescence emission spectrum and the second marker region 122 may be associated with a second fluorescence emission spectrum, wherein the first fluorescence emission spectrum is distinct from the second fluorescence emission spectrum. That is, the two fluorescent marker regions may comprise distinct spectral properties, i.e. distinct fluorescent colours. The first and second marker data 521, 522 may then be distinguished in dependence on the distinct first and second emission spectra. A region of the fluorescence imaging data exhibiting fluorescence according to the first fluorescence emission spectrum may be identified as the first marker data 521, and a region exhibiting fluorescence according to the second emission spectrum may be identified as the second marker data 522. For example, step 420 may comprise searching for a fluorescent area of the imaging data having a peak emission frequency corresponding to a peak emission frequency associated with each of the respective first and second fluorescent marker regions. Including such a multispectral distinction may advantageously improve the accuracy of identification of the first and second marker data 521, 522, by mitigating errors introduced by incorrect identification of geometric shapes.
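
For illustration, a colour-based stand-in for this multispectral distinction is sketched below using OpenCV. A genuine multispectral system would compare measured peak emission frequencies directly; here, hue windows in an RGB image act as a proxy, and the hue ranges, labels and saturation/value cut-offs are arbitrary assumptions.

```python
# Minimal sketch: distinguishing marker regions by apparent fluorescent
# colour (paragraph [0060]). Hue windows are illustrative placeholders.
import cv2
import numpy as np

# Hypothetical hue windows (OpenCV hue is 0-179) for two emission spectra.
SPECTRA = {
    'marker_521': (40, 80),    # e.g. green-ish first emission spectrum
    'marker_522': (100, 140),  # e.g. blue-ish second emission spectrum
}


def segment_by_spectrum(bgr_frame):
    """Return one binary mask per marker, keyed by marker label."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    masks = {}
    for label, (h_lo, h_hi) in SPECTRA.items():
        lower = np.array([h_lo, 80, 80])   # require saturated, bright pixels
        upper = np.array([h_hi, 255, 255])
        masks[label] = cv2.inRange(hsv, lower, upper)
    return masks
```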

[0061] Once the first marker data 521 and second marker data 522 have been identified in the fluorescence imaging data, as illustrated in Figure 5, the pose of the surgical device may be deduced.

[0062] The method 400 comprises a step 430 of determining pose data indicative of a position and orientation of the surgical device. The pose data is determined in dependence on the first marker data 521 and the second marker data 522. If further marker data, for example third marker data 523, is identified in step 420, the pose data may be further determined or refined in dependence on the third marker data 523.

[0063] The controller 110 may utilise the identified marker data 521, 522, 523 as fiducials to deduce the pose of the surgical device. In conjunction with predetermined information indicative of known dimensions and arrangement of the marker regions on the surgical device 120, the pose may be determined based on the location of the fiducials in the fluorescence imaging data. The position and relative spacing of the marker data 521, 522, 523 in the image 520 correlates with the position and orientation of the surgical device 120 relative to the fluorescence imaging device 130. That is, there exists a mapping between the location of the identified marker data 521, 522, 523 in the image 520 and the pose of the surgical device 120. The mapping may for example be determined via a calibration of the fluorescence imaging device 130 and by employing a machine vision pose determination algorithm. The size, relative spacing or orientation and location of the marker data 521, 522, 523 may then be used to infer the pose of the surgical device 120 relative to the calibrated fluorescence imaging device 130. For example, as the orientation of the surgical device 120 relative to the fluorescence imaging device 130 is varied, the relative spacing of the marker data 521, 522, 523 in the fluorescence imaging data will consequently change due to perspective distortion. Further, as the location of the surgical device 120 relative to the fluorescence imaging device 130 is varied, the size and location of the marker data 521, 522, 523 within the fluorescence imaging data will consequently change. If the fluorescence imaging device 130 is calibrated and the algorithm is configured with the known dimensions of the fluorescent marker regions 121, 122, 123, then the relative pose of the surgical device 120 may be inferred with 5 degrees of freedom. The accuracy of the inferred pose may be improved by an increased number and/or complexity of fluorescent marker regions 121, 122, 123, due to providing more points of reference on the surgical device 120.
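
A minimal sketch of such a machine vision pose determination step is given below, using a standard perspective-n-point solver from OpenCV; the marker coordinates, camera matrix and distortion coefficients are hypothetical stand-ins for the device design and calibration data. Note that purely collinear ring markers leave roll about the device axis unobservable, which is consistent with the 5 degrees of freedom noted above; an off-axis point such as the fourth marker region 124 would restore the sixth.

```python
# Minimal sketch of a fiducial-based pose determination step
# (paragraph [0063]) via a perspective-n-point solver. All numeric
# values are illustrative placeholders, not disclosure values.
import cv2
import numpy as np

# Hypothetical positions of the marker regions (mm) in the surgical
# device's own coordinate frame; region 124 is off-axis.
MARKER_MODEL = np.array([[0.0, 0.0, 0.0],    # region 121
                         [8.0, 0.0, 0.0],    # region 123
                         [30.0, 0.0, 0.0],   # region 122
                         [45.0, 3.0, 0.0]],  # region 124 (off-axis)
                        dtype=np.float32)


def estimate_pose(image_points, camera_matrix, dist_coeffs):
    """image_points: (4, 2) detected marker centroids, same order as model.

    camera_matrix and dist_coeffs come from calibration of the
    fluorescence imaging device 130.
    """
    ok, rvec, tvec = cv2.solvePnP(MARKER_MODEL,
                                  np.asarray(image_points, np.float32),
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError('pose could not be recovered from the fiducials')
    # rvec/tvec give the device pose relative to the calibrated camera.
    return rvec, tvec
```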

[0064] The method 400 comprises a step 440 of outputting a signal indicative of the pose data. Step 440 may in some embodiments comprise outputting, to a display such as the display 140, display data indicative of the pose data. Figure 5b illustrates example display data comprising an indication 530 of the pose of the surgical device. The display data comprises a quantitative indication 530 of the position and an arrow indicating the orientation; however, other representations may be envisaged. Alternatively, only one of the position and orientation may be displayed. In some embodiments, a relative position such as a distance to a navigation target may be displayed, as will be explained. The display data may be output in conjunction with other information, for example footage from the fluorescence imaging device and an indication of the segmented marker regions, as illustrated in Figure 5b. The signal indicative of the pose need not be output in the form of display data. According to some embodiments, the signal indicative of the pose may be output as audible or haptic feedback to the user, for example. According to some embodiments, the signal may be output in the form of a control signal to control the surgical device 120, for example to adjust the pose of the surgical device to a desired pose.

[0065] Method 400 provides a determination of the pose of the surgical device 120 relative to the fluorescence imaging device 130. However, it may be desired to determine the pose of the surgical device 120 in a larger context within the surgical field. The method 400 may be supplemented by the inclusion of pre-operative and/or intraoperative scan data to determine a pose of the surgical device 120 relative to a patient. Determining a pose of the surgical device 120 relative to a patient may then be utilised to effectively navigate the surgical device 120 within the patient, to a desired target.

[0066] Figure 6 illustrates a method 600 of determining a pose of the surgical device 120 relative to a patient. The method 600 may be performed by the system 100. Figure 7 illustrates a supplemented system 700 suitable for performing the method 600. The system 700 comprises the controller 110, fluorescence imaging device 130, display 140 and surgical device 120 as have already been described with reference to the system 100. Additionally, the system 700 may comprise scan data 720 and a fiducial marker 710 which may be located internally or externally to the patient, as will be explained.

[0067] Method 600 comprises a step 610 of receiving the pose data indicative of the pose of the surgical device 120. The pose data may have been determined by the controller 110, for example in step 430 of method 400. The pose data is indicative of the pose of the surgical device 120 relative to the fluorescence imaging device 130, illustrated in Figure 7 as R1. Method 600 further comprises a step 620 of receiving scan data 720 indicative of a representation of a patient. The scan data may be, for example, a single photon emission computerized tomography (SPECT), positron emission tomography (PET), computerized tomography (CT) or magnetic resonance imaging (MRI) scan of a patient taken prior to surgery. The scan data may highlight a location within the patient that it is desired to target, for example a specific organ or lesion within the patient. The scan data 720 may be stored in the memory 112 or received via the communication means 150.

[0068] According to some embodiments, the scan data 720 may comprise intraoperative imaging data 720 captured by the surgical device 120. The intraoperative imaging data 720 may comprise, for example, freehand SPECT data captured by a detection or imaging modality on the surgical device 120. In this way, the scan data 720 may be captured concurrently with the tracking of the surgical device 120. In such an embodiment, the scan data 720 may be received at the controller 110 from the surgical device 120.

[0069] The method 600 comprises a step 630 of determining the relative pose of the surgical device 120 within the context of the scan data. According to some embodiments, the relative pose of the surgical device 120 to the patient may be determined by tracking the fluorescence imaging device 130 relative to a body of a patient, for example by implementing a traditional external tracking system such as near-infrared optical tracking to determine the relative pose of the patient and the fluorescence imaging device 130. The external tracking may be aided by the placement of external fiducial markers 710 identifiable in the scan data 720. By tracking the fluorescence imaging device 130 relative to the external fiducial markers 710, the relative pose R2 of the fluorescence imaging device 130 to the patient may be inferred. Step 630 may then comprise combining the relative pose R2 of the fluorescence imaging device 130 to the patient with the determined pose R1 of the surgical device 120. By combining in this way, the relative pose R3 of the surgical device 120 to the patient may be determined, and a location and orientation of the surgical device 120 within the scan data 720 may be inferred.

[0070] Alternatively, one or more fiducial markers 710 as illustrated in Figure 7 may be placed internally within the surgical field to mark one or more locations internal to the patient. Internal fiducial markers 710 may be utilised for intraoperative imaging, for example when the scan data 720 is intraoperative imaging data 720 captured using freehand SPECT. The internal fiducial markers 710 are arranged such that they are identifiable within the fluorescence imaging data. The internal fiducial markers 710 may also be identifiable in the intraoperative imaging data 720. For example, each internal fiducial marker 710 may comprise fluorescent material and/or a radioisotope. Step 630 may comprise identifying the internal fiducial marker 710 within the fluorescence imaging data, and consequently inferring the relative pose of the fiducial marker 710 to the fluorescence imaging device 130. The relative pose of the internal fiducial marker 710 to the fluorescence imaging device 130 is illustrated as R2 in Figure 7.

[0071] The identification of the internal fiducial marker 710 in the fluorescence imaging data (or intraoperative imaging data 720) defines the relative pose R2 of the fluorescence imaging device 130 within the patient. The relative pose R2 then allows the intraoperative imaging data 720 and the fluorescence imaging data to be aligned. The relative pose R2 of the intraoperative imaging data 720 to the fluorescence imaging device 130 may then be combined with the determined pose R1 of the surgical device 120. The relative pose of the surgical device 120 within the intraoperative imaging data 720, illustrated as R3 in Figure 7, may then be determined as a composition of R2 and R1. A location and orientation of the surgical device 120 within the intraoperative imaging data 720 may therefore be inferred.
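
For illustration, the chaining of R1 and R2 into R3 described in paragraphs [0069] to [0071] can be sketched with 4x4 homogeneous transforms, a common machine vision convention assumed here rather than one specified by the disclosure:

```python
# Minimal sketch: composing relative poses R1 and R2 into R3.
# Poses are modelled as 4x4 rigid transforms; names are illustrative.
import numpy as np


def homogeneous(rotation, translation):
    """Build a 4x4 rigid transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T


def compose_pose(T_scan_camera, T_camera_device):
    """Chain R2 and R1 to obtain R3.

    T_scan_camera: pose R2 of the fluorescence imaging device 130 in
        the patient / scan-data frame.
    T_camera_device: pose R1 of the surgical device 120 relative to
        the fluorescence imaging device 130.
    Returns the pose R3 of the surgical device in the scan-data frame.
    """
    return T_scan_camera @ T_camera_device
```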

[0072] This may be particularly useful in the context of intraoperative imaging data 720 captured by the surgical device 120, for example in the form of freehand SPECT imaging. Provided the internal fiducial marker 710 is captured in the fluorescence imaging data, the relative location of the surgical device 120 and the fluorescence imaging device 130 with respect to the intraoperative imaging data 720 captured of the patient may be determined without necessitating any external tracking of the fluorescence imaging device 130.

[0073] According to some embodiments the determined relative pose R3 may be output as display data in conjunction with the scan data. In this way, a visualisation of the pose of the surgical device within the context of the pre-operative patient scan 720 or intraoperative imaging data 720 may be provided to the user, which will aid navigation and improve localisation of lesions or other desired targets within the surgical field. The position and orientation of the surgical device 120 within the patient may in this way be accurately pinpointed, and the scan data 720 may then act as a “map” for the navigation of the surgical device 120 within the patient.

[0074] Optionally, a control signal may be output for controlling the surgical device 120 in dependence on the relative pose R3 within the scan data. For example, the control signal may be for controlling the surgical device 120 to move closer to a desired target, or for adjusting the orientation of the surgical device 120 in relation to an object identified within the scan data 720. For example, the scan data 720 may illustrate a lesion or other location to which it is desired to navigate the surgical device 120. Utilising the scan data 720 as a map, the controller may determine a route to the target location and output the control signal to adjust the position and orientation of the surgical device 120 along the route to reach the target location.

[0075] According to some embodiments, the system 100 may be utilised to concurrently track the position of a target external to the surgical device 120 in addition to the pose of the surgical device 120. The fluorescence imaging device 130 and surgical device 120 may be relatively proximal to a target, for example after being navigated utilising the scan data 720 as described above. The fluorescence imaging device 130 may then be utilised to track the target, and an accurate distance from the surgical device 120 to the target may be determined.

[0076] Figure 8a illustrates an example field of view of the fluorescence imaging device 130 comprising both a surgical device 120 and a target 830. The target 830 may be configured to exhibit fluorescence. The target may be identifiable in white light if the fluorescence imaging device 130 is configured to operate in “white light” mode. The target may be otherwise identifiable in the scan data 720. For example the target 830 may comprise a second surgical device, a target tissue such as a lesion which may be labelled with a fluorescent tracer or identified in the scan data 720, or an internal reference target object placed within the surgical field, such as the one or more internal fiducial markers 710. The fluorescence imaging device 130 thus captures an image comprising fluorescence emitted from the fluorescent marker regions of the surgical device 120, and possibly additional light reflected or fluorescence emitted from the target 830.

[0077] Figure 8b illustrates an example output of the system 100, analogous to that displayed in Figure 5b, supplemented by further information indicative of the additional tracking of the target 830.

[0078] Analogously to the identification of the marker data 521, 522, 523 in the fluorescence imaging data, a supplemented method 400 may further comprise identifying target data 821 indicative of emitted light from the target 830 external to the surgical device. If the target is fluorescent, the target data 821 may be identified within the fluorescence imaging data based on a target fluorescence emission spectrum associated with the target. The target fluorescence emission spectrum may be distinct from the first and second fluorescence emission spectra associated with the surgical device, in order to enable the controller 110 to easily distinguish between the target 830 and the surgical device 120.

[0079] If the configuration of the target is known, a 3D position of the target 830 may be determined relative to the fluorescence imaging device 130 in dependence on the identified target data 821. The accuracy of the position tracking may be refined by the provision of a stereoscopic camera within the fluorescence imaging device 130.

[0080] It will be appreciated that in some embodiments, the target 830 may comprise another surgical device of the type illustrated in Figure 2. In this case, if the target 830 comprises suitable marker regions 121, 122, the pose of the target 830 may also be determined in an analogous manner to that of the surgical device 120. In some embodiments, the 3D position or pose of the target may be output to the user. In dependence on the target data 821 and the pose data of the surgical device 120, a distance 823 from the surgical device 120 to the target 830 may be determined.
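
Once the device pose and the target position are expressed in the common frame of the fluorescence imaging device 130, the distance 823 reduces to a Euclidean norm. A minimal sketch follows, with purely illustrative point names and values:

```python
# Minimal sketch: the distance 823 between the surgical device 120 and
# the target 830, given both 3D positions in the camera frame
# (paragraph [0080]). Values below are purely illustrative.
import numpy as np


def device_to_target_distance(device_xyz, target_xyz):
    """Euclidean distance between two 3D points (e.g. in millimetres)."""
    return float(np.linalg.norm(np.asarray(target_xyz, dtype=float)
                                - np.asarray(device_xyz, dtype=float)))

# Example: probe tip at (10, 5, 80) mm and target at (14, 2, 85) mm
# gives a distance of sqrt(16 + 9 + 25), i.e. roughly 7.1 mm.
```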

[0081] According to some embodiments, the target 830 may not be visible in the field of view of the fluorescence imaging device 130. However, the target 830 may be identifiable in the scan data 720. If the relative pose R3 of the surgical device within the scan data 720 has been determined, for example using method 600, then the distance 823 from the surgical device 120 to the target 830 may be determined based on the location of the target 830 as identified in the scan data 720.

[0082] The supplemented method 400 may then comprise outputting a signal indicative of the determined distance 823. For example, as illustrated in Figure 8b, a visual indication 822 of the determined distance may be displayed to the user. The visual indication may take a variety of forms, such as a textual indication as illustrated, or other visual feedback configured to vary with the determined distance, such as a change in coloured light. Other feedback such as audible feedback may be provided, for example a verbal expression of the determined distance, or another auditory cue such as a change in tone representing a change in distance. Although not illustrated, one or both of the pose of the surgical device 120 and the position of the target 830 may also be output.

[0083] The signal may comprise a control signal for controlling the surgical device 120. The control signal may be for controlling the surgical device to move closer to the target 830 or adjust orientation in relation to the target 830.

[0084] Figures 9a and 9b each illustrate example resultant display footage output to a user. Figure 9a illustrates the localisation of a target 830 in the form of a target lesion labelled with a fluorescent tracer, and a surgical device 120 in the form of a DROP-IN probe. Figure 9b illustrates the localisation of a target 830 in the form of an internally placed fluorescent reference target object, and a surgical device 120 in the form of a DROP-IN probe. Each of Figures 9a and 9b illustrates an analogous display to that described with reference to Figure 8b. The marker data 521, 523, 522 has been segmented within the image and highlighted to the user as differently coloured outlines to highlight each fluorescent marker region. The target has been highlighted with a circle. The distance to the target 830 from the DROP-IN probe 120 is output on the display in the form of a textual indication of distance. Further, an indication of the 3D position of both the probe 120 and the target 830 is displayed in the form of (X, Y, Z) co-ordinates.

[0085] It will be appreciated that in some embodiments a plurality of surgical devices 120 may be concurrently tracked by one or more of the disclosed methods. Figure 10 illustrates example fluorescence imaging data 1000 captured in white light mode according to an embodiment of the invention. The fluorescence imaging data 1000 comprises an indication of three surgical devices 1010, 1020 and 1030. The surgical devices 1010, 1020, 1030 in this example are surgical resection instruments. It can be seen that each surgical device 1010, 1020, 1030 comprises a plurality of fluorescent marker regions, wherein the marker regions are arranged in the format of Figures 2h to 2j. Each of the surgical devices 1020, 1030 comprises first to third fluorescent marker regions arranged as three asymmetrically spaced rings, and a fourth marker region shaped to correspond to the unique shape defined by the jaws of the instrument. The surgical device 1010 comprises first and second ring-shaped fluorescent marker regions, and a third differently shaped fluorescent marker region corresponding to the shape defined by the jaws of the instrument. The marker regions of each surgical device 1010, 1020, 1030 thus define an asymmetric pattern along a first axis of the device, allowing the orientation of the device to be readily deduced.

[0086] Each device is distinguishable as the fluorescent marker regions of 1010, 1020 and 1030 have distinct emission spectra. The distinct fluorescent colours defined by the emission spectra are also visible as distinct colours in white light. The first surgical device 1010 comprises yellow fluorescent marker regions 1011, 1012, 1013. The second surgical device 1020 comprises orange fluorescent marker regions 1021, 1022, 1023, 1024. The third surgical device 1030 comprises pink fluorescent marker regions 1031, 1032, 1033, 1034. In this way, the fluorescent marker regions of each device may be separately segmented from the video footage based on the identified colouring. Furthermore, the known distribution and shape of the fluorescent marker regions along each surgical device allows the position and orientation of each surgical device 1010, 1020, 1030 relative to the fluorescent imaging device to be inferred.

[0087] Tracking the pose of a surgical device 120 according to the present invention may be further used to determine an indicator of surgical performance. The pose of the surgical device 120 may be tracked within the surgical field, such as described with reference to the methods 400 and 600. An indication of the determined pose of the surgical device 120 may be output on the display 140 or stored such as in the memory 112. The determined pose may comprise an indication of a 3D position of the surgical device 120 such as in the form of (X, Y, Z) co-ordinates. An indication of the 3D position at a plurality of respective time points may then be used to determine a path or trail of the surgical device in the surgical field over time. The pose data determined via the method 400 or 600 over time may thus be used to generate a sequence of 3D co-ordinates indicative of the path of the surgical device.

[0088] The path of the surgical device may be utilised to provide surgical performance data for the assessment of surgical performance or dexterity. The assessment of surgical performance has numerous applications, such as for use in feedback and assessment during surgical training, and in the validation of instrumentation or procedures. A method 1100 of providing surgical performance data is illustrated in Figure 11.

[0089] The method 1100 comprises a step 1110 of tracking the pose of the surgical device. This may be performed in any way previously described, such as by the method 400 or the method 600. The step 1110 is repeated at a plurality of time points in order to generate a sequence of co-ordinates indicative of the pose of the surgical device over time. A path of the surgical device over time may be determined in dependence on the pose data obtained via method 400 or method 600, for example by storing the determined sequence of co-ordinates in the form of time series data. The path of the surgical device may comprise a series of 3D co-ordinates, for example (X, Y, Z) co-ordinates of a point on the surgical device.

[0090] The method 1100 comprises a step 1120 of determining surgical performance data in dependence on the path of the surgical device. The surgical performance data may be indicative of a level of surgical dexterity or accuracy exhibited by the path of the surgical device. Step 1120 may comprise extracting one or more path parameters of the path of the surgical device in order to construct the surgical performance data.

[0091] The path parameters may comprise any feature of the path relevant to the assessment of surgical performance, such as parameters providing an indication of speed, accuracy and dexterity. It will be appreciated that the specific path parameters included in the surgical performance data may depend on the particular aspects of surgical performance to be assessed. The parameters may comprise in some embodiments one or more of a time taken to traverse the path, a tortuosity of the path, a path curvature, an acceleration of the surgical device 120 on the path, a velocity of the surgical device 120 on the path, or a spread of the path. The tortuosity may be quantified using any suitable metric such as a straightness index or an angular dispersion. The spread of the path may be quantified by, for example, a frequency of the surgical device 120 leaving an area of interest such as the field of view of the imaging device, a length of the path, or an area covered by the path.
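
A minimal sketch of how a few of these path parameters might be computed from tracked pose data is given below; the straightness index shown is one of the tortuosity metrics mentioned above, and all function and parameter names are illustrative.

```python
# Minimal sketch: extracting path parameters (paragraph [0091]) from a
# tracked sequence of 3D co-ordinates with timestamps.
import numpy as np


def path_parameters(points, timestamps):
    """points: (N, 3) array of (X, Y, Z) positions; timestamps: (N,) s."""
    p = np.asarray(points, dtype=float)
    t = np.asarray(timestamps, dtype=float)
    steps = np.diff(p, axis=0)
    step_lengths = np.linalg.norm(steps, axis=1)
    path_length = step_lengths.sum()
    net_displacement = np.linalg.norm(p[-1] - p[0])
    speeds = step_lengths / np.diff(t)
    return {
        'time_taken_s': t[-1] - t[0],
        'path_length': path_length,
        # 1.0 for a perfectly straight path; lower = more tortuous.
        'straightness_index': net_displacement / path_length,
        'mean_speed': speeds.mean(),
        'peak_speed': speeds.max(),
    }
```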

[0092] For example, in the scenario illustrated in Figures 8a and 8b, the surgical task to be performed may comprise accurately and quickly locating the target 830 with the surgical device 120, although it will be appreciated that embodiments of the invention could be readily applied to any other surgical task. A speed of the task may be quantified such as by the time taken to perform the task (e.g. a time taken for the path of the surgical device 120 to meet the target 830), a velocity of the surgical device or an acceleration of the surgical device. Parameters such as tortuosity, path curvature and spread may be indicative of an accuracy of the surgeon performing the task. Together, these extracted parameters can provide an indication of a surgical dexterity or performance of the surgeon.

[0093] In some embodiments, step 1120 may comprise receiving ideal path data indicative of a predetermined path 1210 through the surgical field. Figure 12 illustrates an example display output 1200 which may be shown on the display 140 in an embodiment. The predetermined path 1210 may define a 3D path through the surgical field, or in some embodiments the predetermined path 1210 may define a 2D path through a plane of the surgical field. The predetermined path 1210 may be derived from previously recorded pose data. For example, the predetermined path 1210 may be obtained from tracking the surgical device of an experienced surgeon performing the same surgical task. Alternatively, the predetermined path 1210 may be user-defined. For example, the predetermined path may be input as a sequence of coordinates or otherwise through a user interface of the display 140. In some examples, the input of the predetermined path may be performed during the method 1100, such as in the case of a tutor defining a path during a surgical training exercise. In other embodiments, the predetermined path 1210 may be defined in advance. For example, to facilitate surgical planning for complex procedures, the predetermined path 1210 may be defined in dependence on an estimation of an ideal path for a surgical procedure from pre-operative scan data. The predetermined path 1210 may be user-defined in these embodiments relative to the scan data, such as by inputting the predetermined path through a user interface of the display 140 showing the scan data. As discussed previously, the pose of the surgical device 120 may then be tracked in relation to the scan data as in the method 600, and thus be tracked in relation to the predetermined path 1210. The predetermined path 1210 may be output to the user on the display 140 during the method 1100, as illustrated in the example output 1200.

[0094] Surgical performance may then be quantified by comparison to the ideal path data. The one or more parameters determined in step 1120 may thus comprise a deviation of the path of the surgeon from the predetermined path 1210.
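
A simple deviation measure of this kind might be sketched as follows, taking the mean distance from each tracked point to its nearest point on the predetermined path 1210; this nearest-point formulation is an assumption made for illustration, as the disclosure does not prescribe a particular deviation metric.

```python
# Minimal sketch: deviation of the tracked path from the predetermined
# path 1210 (paragraph [0094]). A denser reference polyline gives a
# tighter approximation to the true nearest-point distance.
import numpy as np


def mean_path_deviation(tracked_points, reference_points):
    """Both arguments are (N, 3) arrays of (X, Y, Z) co-ordinates."""
    tracked = np.asarray(tracked_points, dtype=float)
    reference = np.asarray(reference_points, dtype=float)
    # Distance from every tracked point to every reference point.
    d = np.linalg.norm(tracked[:, None, :] - reference[None, :, :], axis=2)
    # Average, over the tracked path, of the nearest-point distance.
    return float(d.min(axis=1).mean())
```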

[0095] The method 1100 comprises a step 1130 of outputting an indication of the surgical performance data. In some embodiments, the indication may be output directly to a user such as via the display 140, in order to provide live feedback. In the example shown in Figure 12, the output may comprise an indication 1220 of the deviation from the predetermined path. Alternatively, or additionally, one or more of the other path parameters may be output to provide direct feedback to the surgeon regarding the surgical performance. Providing such live feedback may facilitate improved surgical performance and training by highlighting clearly where improvements in performance should be made.

[0096] In some embodiments, the surgical performance data may be output to be stored such as in the memory 112. In this way, the surgical performance data may be utilised for subsequent analysis.

[0097] Figure 13 illustrates a second example of using surgical performance data collected according to the method 1100. The surgical performance data is used in this example for validating improvements in instrumentation. Figure 13 illustrates path data taken for five surgeons (left to right). For each surgeon, a path of the surgical device 120 was tracked in a surgical task locating a target 830 with palpation only (above) and with radioguidance (below). Tracking surgical performance according to the method 1100 was then utilised to quantify and validate the use of the improved radioguidance. Displayed for each path is a 3D surface plot illustrating the time spent in each region of the surgical field. Path parameters could be readily extracted for each path and contrasted between the use of palpation and radioguidance, in order to accurately quantify the improvements made to performance, as shown in the table below. In particular, in this example it can be readily shown that the completion rate and time taken showed significant improvements with radioguidance.

[0098] In some embodiments, step 1120 may comprise determining a dexterity metric. The dexterity metric may be defined as an overall measure of surgical performance or dexterity. The dexterity metric may be utilised to quantify and/or classify the performance of individual surgeons in training, in particular to determine whether a sufficient standard of performance has been met. In some embodiments, the dexterity metric may be determined in dependence on an output of a machine learning algorithm taking one or more of the path parameters as input. The machine learning algorithm in an illustrative embodiment is a deep learning network; however, it will be appreciated that other algorithms may be used. The deep learning network may be trained using training data comprising data taken from a set of experienced surgeons, as a benchmark for surgical performance of a sufficient standard, and data taken from inexperienced novice surgeons performing the same surgical task. Path parameters as described above may be extracted and used as input to train the deep learning network to recognize parameters and thresholds that are indicative of sufficient surgical performance.
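
For illustration only, the sketch below substitutes a small multilayer perceptron from scikit-learn for the deep learning network described above; the feature layout, expert/novice labels and hyperparameters are arbitrary assumptions, not details from the disclosure.

```python
# Minimal sketch: a dexterity classifier of the kind outlined in
# paragraph [0098], trained on path parameters labelled
# expert (1) / novice (0).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def train_dexterity_model(parameter_rows, labels):
    """parameter_rows: (N, F) path-parameter vectors (e.g. time taken,
    straightness index, mean speed); labels: 1 expert, 0 novice."""
    model = make_pipeline(StandardScaler(),
                          MLPClassifier(hidden_layer_sizes=(16, 8),
                                        max_iter=2000, random_state=0))
    model.fit(np.asarray(parameter_rows, dtype=float), np.asarray(labels))
    return model


def dexterity_metric(model, parameter_row):
    # Probability of the 'expert' class serves as a 0-1 dexterity score.
    return float(model.predict_proba([parameter_row])[0, 1])
```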

[0099] The trained machine learning algorithm may be utilised in the method 1100 to produce a dexterity metric. The path parameters obtained may be input to the trained machine learning algorithm. The output of the algorithm may comprise a score, or a classification (e.g. sufficient performance or insufficient performance) which may be used to inform or grade surgeons on the relevant surgical task.

[00100] The surgical performance data comprising the dexterity metric and/or the path parameters may be output to the user in order to provide feedback regarding surgical performance. The feedback may be generic (e.g. classification of sufficient performance) or specific feedback regarding particular path parameters requiring improvement. Additionally, as discussed, the surgical performance data may be utilised to quantify improvements provided by new instrumentation or techniques and validate their efficacy as in Figure 13.

[00101] The present invention thus provides a robust method of tracking the pose of a surgical device, e.g. a laparoscopic DROP-IN probe, within the surgical field using fluorescence imaging. The arrangement of fluorescent marker regions along the surgical device may allow the orientation to be accurately inferred, due to the markers providing a robust distinction between the different ends of the device. Other objects, such as lesions, further surgical devices or internal fiducial markers, may be tracked in conjunction within footage from the fluorescence imaging device, and the surgical device 120 may thus be accurately navigated with respect to these objects. There is further provided a method for interfacing with pre-operative or intraoperative scan data to pinpoint the pose of the surgical device accurately within the patient, and to utilise the scan data as a map for further controlling the surgical device. Furthermore, the tracking may be utilised to monitor surgical performance to enhance surgical training or to validate improvements in surgical technology.

[00102] It will be appreciated that embodiments of the present invention can be realised in the form of hardware, software or a combination of hardware and software. Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a ROM, whether erasable or rewritable or not, or in the form of memory such as, for example, RAM, memory chips, devices or integrated circuits, or on an optically or magnetically readable medium such as, for example, a CD, DVD, magnetic disk or magnetic tape. It will be appreciated that the storage devices and storage media are embodiments of machine-readable storage that are suitable for storing a program or programs that, when executed, implement embodiments of the present invention. Accordingly, embodiments provide a program comprising code for implementing a system or method as claimed in any preceding claim and a machine-readable storage storing such a program. Still further, embodiments of the present invention may be conveyed electronically via any medium such as a communication signal carried over a wired or wireless connection and embodiments suitably encompass the same.

[00103] Throughout the description and claims of this specification, the words “comprise” and “contain” and variations of them mean “including but not limited to”, and they are not intended to (and do not) exclude other moieties, additives, components, integers or steps. Throughout the description and claims of this specification, the singular encompasses the plural unless the context otherwise requires. In particular, where the indefinite article is used, the specification is to be understood as contemplating plurality as well as singularity, unless the context requires otherwise.

[00104] Features, integers, characteristics, compounds, chemical moieties or groups described in conjunction with a particular aspect, embodiment or example of the invention are to be understood to be applicable to any other aspect, embodiment or example described herein unless incompatible therewith. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive. The invention is not restricted to the details of any foregoing embodiments. The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.

[00105] The reader's attention is directed to all papers and documents which are filed concurrently with or previous to this specification in connection with this application and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference.