Title:
SYSTEM AND METHOD FOR UNIQUE IDENTIFICATION OF ITEMS
Document Type and Number:
WIPO Patent Application WO/2021/077044
Kind Code:
A1
Abstract:
Systems and methods for uniquely identifying one or more items of a collection of items are provided based upon at least one naturally occurring feature of the item. In various implementations, the item is a surgical instrument and the naturally occurring feature includes a manufacturing mark (e.g., a manufacturing tooling mark) and/or a mark derived from handling or a purposeful use of the item. An example system includes at least one camera adapted to capture one or more images of an item located at an image station and a controller coupled to the image station, wherein the controller is adapted to identify a region of the item, identify at least one feature of the item comprising a naturally-occurring mark of the item, and identify the item from the collection of items based upon the at least one feature.

Inventors:
MONTANO ROBERT A (US)
Application Number:
PCT/US2020/056189
Publication Date:
April 22, 2021
Filing Date:
October 16, 2020
Assignee:
MONTANO ROBERT A (US)
International Classes:
A61B90/96; G06T7/62; B25J9/16; G06K9/00; G06K9/32; G06T7/11
Domestic Patent References:
WO1999041676A11999-08-19
Foreign References:
US20130238124A12013-09-12
US20150124056A12015-05-07
US20070172129A12007-07-26
Attorney, Agent or Firm:
OSBORNE, Thomas J. (US)
Claims:
CLAIMS

What is claimed is:

1. A system for uniquely identifying one or more items of a collection of items comprising: at least one camera adapted to capture one or more images of an item located at an image station; and a controller coupled to the image station, wherein the controller is adapted to identify a region of the item, identify at least one feature of the item comprising a naturally-occurring mark of the item, and identify the item from the collection of items based upon the at least one feature.

2. The system of claim 1 wherein the system comprises a handling system for delivering the item to the image station.

3. The system of claim 1 or 2 wherein the system comprises a physical measurement device adapted to take at least one physical measurement of the item.

4. The system of claim 3 wherein the physical measurement device comprises at least one of a scale and a volumetric measuring device.

5. The system of claim 3 or 4 wherein the physical measurement comprises at least one of weight, size, and volume.

6. The system of claim 4 wherein the volumetric measuring device comprises at least one of a liquid displacement device, a gel displacement device, and an image processing module of the controller adapted to determine or estimate a volume of the instruments via the one or more images.

7. The system of claims 3-6 wherein the controller is adapted to eliminate one or more items from the collection based on the physical measurement.

8. The system of any of the preceding claims wherein the controller is adapted to identify the item without needing an identification or added marking code.

9. The system of any of the preceding claims wherein the controller is adapted to identify a plurality of features from a plurality of regions of the device and identify the item from the collection of items based upon the plurality of features.

10. The system of any of the preceding claims wherein the naturally occurring mark comprises at least one of a manufacturing tool mark or a scratch due to a purposeful use of the item.

11. The system of any of the preceding claims wherein the naturally occurring mark comprises a mark caused by handling the item.

12. The system of any of the preceding claims wherein the naturally occurring mark comprises a defect of the item.

13. The system of any of the preceding claims wherein the naturally occurring mark comprises a blemish of the item.

14. The system of any of the preceding claims wherein the naturally occurring mark comprises a mark caused by storing the item in proximity to other items.

15. The system of any of the preceding claims wherein the controller is adapted to enhance the at least one image.

16. The system of any of the preceding claims wherein the controller is adapted to determine at least one of a depth, an angle, and a curvature of the naturally occurring mark.

17. A method for uniquely identifying an item of a collection of items comprising: receiving an item at an image station; capturing one or more images of the item at the image station via at least one camera; identifying at least one feature of the item at a region of the item via a controller; and identifying the item from the collection of items based upon the at least one feature via the controller.

18. The method of claim 17 wherein the method comprises delivering the item to the image station via a handling system.

19. The method of claim 17 or 18 wherein the method comprises taking at least one physical measurement of the item.

20. The method of claim 19 wherein the physical measurement device comprises at least one of a scale and a volumetric measuring device.

21. The method of claim 19 or 20 wherein the physical measurement comprises at least one of weight, size, and volume.

22. The method of claim 20 wherein the volumetric measuring device comprises at least one of a liquid displacement device, a gel displacement device, and an image processing module of the controller adapted to determine or estimate a volume of the instruments via the one or more images.

23. The method of claims 19-22 wherein the controller eliminates one or more items from the collection based on the physical measurement.

24. The method of claims 17-23 wherein the controller identifies the item without needing an identification or added marking code.

25. The method of claims 17-24 wherein the controller identifies a plurality of features from a plurality of regions of the device and identifies the item from the collection of items based upon the plurality of features.

26. The method of claims 17-25 wherein the naturally occurring mark comprises at least one of a manufacturing tool mark or a scratch due to a purposeful use of the item.

27. The method of claims 17-26 wherein the naturally occurring mark comprises a mark caused by handling the item.

28. The method of claims 17-27 wherein the naturally occurring mark comprises a defect of the item.

29. The method of claims 17-28 wherein the naturally occurring mark comprises a blemish of the item.

30. The method of any of the preceding claims wherein the naturally occurring mark comprises a mark caused by storing the item in proximity to other items.

31. The method of any of the preceding claims wherein the controller is adapted to enhance the at least one image.

32. The method of any of the preceding claims wherein the controller is adapted to determine at least one of a depth, an angle, and a curvature of the naturally occurring mark.

33. The system or method of any of the preceding claims wherein the controller is adapted to use machine learning to identify instruments.

34. The system or method of claim 33 wherein the controller is adapted to learn to identify the item based on image data.

35. The system or method of claims 33 or 34 wherein the controller comprises a neural network.

36. The system or method of any of the preceding claims wherein the controller is adapted to automatically identify a plurality of regions of the item containing identifiable forensic information.

37. The system or method of any of the preceding claims wherein the controller is adapted to use machine learning to individually identify instruments.

38. The system or method of any of the preceding claims wherein the controller learns, based on a set of items specific to the collection of items, the unique patterns that identify items within the collection of items.

39. The system or method of any of the preceding claims wherein the controller is adapted to use forensic analysis to automatically identify items, optionally wherein the controller is adapted to automatically identify a unique item based on a forensic algorithm.

40. The system or method of any of the preceding claims wherein the controller is adapted to use decision trees and multiple machine learning models to eliminate unidentifiable items.

41. The system or method of any of the preceding claims wherein the controller is adapted to use a set of probability weighted decision trees or a set of true/false decision trees to uniquely identify individual instruments.

42. The system or method of any of the preceding claims wherein the controller is adapted to combine multiple factors with machine learning to identify instruments.

43. The system or method of any of the preceding claims wherein the controller is adapted to combine multiple factors with other information to determine a set of potential item models to be used to identify the item.

44. The system or method of any of the preceding claims wherein the controller is adapted to use temporal tracking of individual items to limit results from a probabilistic machine learning module.

45. The system or method of any of the preceding claims wherein the controller is adapted to use knowledge about a current collection of items in use to identify a set of possible items to be identified, to use the current collection of items in use to limit a set of potential models, and to eliminate impossible items from the output of the models.

46. The system or method of any of the preceding claims wherein the controller is adapted to use a decision tree to combine parametric information and to eliminate a set of machine learning models that cannot be relevant.

47. The system or method of any of the preceding claims wherein the controller is adapted to use a set of decision trees to combine parametric information with a set of results based on applying the models to the images of the item.

Description:
SYSTEM AND METHOD FOR UNIQUE IDENTIFICATION OF ITEMS

CROSS-REFERENCE TO RELATED APPLICATIONS [0001] This application claims the benefit of United States provisional application no. 62/915,740, filed 16 October 2019, which is hereby incorporated by reference as though fully set forth herein.

BACKGROUND

a. Field

[0002] The present disclosure relates to unique identification of items, such as surgical tools. In one particular implementation, unique identification of items within a predetermined set of items is provided.

b. Background

[0003] Tracking items, such as surgical tools, can be useful in a number of applications. In a surgical application, for example, the ability to track individual surgical items can be used to facilitate timely maintenance (e.g., after a predetermined number of uses), identify damaged or broken instruments, and match patient outcomes with a particular instrument. Where a patient develops a post-surgical infection, for example, a particular instrument can be identified and undergo further screening, analysis or treatment (e.g., deep cleaning/disinfectant).

BRIEF SUMMARY

[0004] Systems and methods for uniquely identifying one or more items of a collection of items are provided based upon at least one naturally occurring feature of the item. In various implementations, the item is a surgical instrument and the naturally occurring feature includes a manufacturing mark (e.g., a manufacturing tooling mark) and/or a mark derived from handling or a purposeful use of the item.

[0005] An example system includes at least one camera adapted to capture one or more images of an item located at an image station and a controller coupled to the image station, wherein the controller is adapted to identify a region of the item, identify at least one feature of the item comprising a naturally-occurring mark of the item, and identify the item from the collection of items based upon the at least one feature.

[0006] An example method comprises receiving an item at an image station; capturing one or more images of the item at the image station via at least one camera; identifying at least one feature of the item at a region of the item via a controller; and identifying the item from the collection of items based upon the at least one feature via the controller.

[0007] The foregoing and other aspects, features, details, utilities, and advantages of the present invention will be apparent from reading the following description and claims, and from reviewing the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] Fig. 1 is a block diagram of an example system 10 for uniquely identifying items, such as surgical tools, according to one or more implementations.

[0009] Fig. 2 is a flowchart showing example operations of a method of uniquely identifying one or more items, such as surgical tools, according to one or more implementations.

[0010] Fig. 3 is a flowchart of an example process of identifying an item of a collection of items, according to one or more examples.

[0011] Fig. 4 is a schematic block diagram showing an example processing and physical handling flow of a system of uniquely identifying items from a collection, according to one or more examples.

[0012] Fig. 5 shows an example non-naturally occurring inventory code marking system that may be used to mark items such as surgical instruments, according to one or more examples.

[0013] Fig. 6 is an image of a surgical scissor instrument showing example marks obtained through manufacture or use of the surgical instrument without needing an additional marking or inventory control identifier.

[0014] Fig. 7 is another image of the same surgical scissor instrument shown in Fig. 6 showing the same region shown in Fig. 6 and further showing a second region of marks that are similarly obtained through manufacture or use of the surgical instrument without needing an additional marking or inventory control identifier.

[0015] Fig. 8 shows a closer view of the second region shown in Fig. 7.

[0016] Fig. 9 is an image showing handles of three separate surgical hemostat instruments showing example marks for comparison between the different instruments.

[0017] Fig. 10 is an image showing finger ring portions of the same three surgical hemostat instruments for comparison. Again, differing marks on each of the instruments can be identified.

[0018] Fig. 11 is another image showing a close-up view of a portion of another example surgical instrument.

[0019] Fig. 12 is an image showing side-by-side surgical instruments having different identifiable naturally occurring marks.

[0020] Fig. 13 illustrates an exemplary system useful in implementations of the described technology, according to one or more examples.

DETAILED DESCRIPTION

[0021] Fig. 1 is a block diagram of an example system 10 for uniquely identifying items, such as surgical tools. In this particular example, the system includes a tray conveyor system 12 adapted to move one or more trays of items throughout the system 10. In other examples, one or more trays of items may be manually moved or otherwise moved within the systems (e.g., by one or more robots or other automated devices). The system 10 includes a scale 14 adapted to weigh one or more of the items at a time.

[0022] A camera station 16 including one or more cameras is also provided that is adapted to take one or more images of the item(s) passing through the system. In one implementation, for example, three cameras can be positioned around an instrument handling location of the station 16 and adapted to take images from a plurality of angles, distances, etc. In another implementation, one or more cameras may be movable (e.g., robotically moveable) to provide the ability to take a plurality of images from a plurality of angles, distances, etc. The instrument handling system may also include one or more markings to provide a scale to analyze the item images.

[0023] A volumetric measurement station 18 can also include a volume measurement device adapted to determine (or estimate) a volume of one or more instruments. The volume measurement device, for example, may include a liquid (e.g., water) or gel displacement device, an image processing device adapted to determine or estimate a volume of the instruments via one or more images, a measurement device for determining a plurality of measurements that may be used to determine or estimate a volume of one or more instruments, a casting device, or other volume measurement or estimate device. Volume may be determined by any method, such as solving space, by density and mass, by displacement, or the like.
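One of the volume-estimation approaches named above, determining volume from density and mass, can be sketched in a few lines. The density constant below is an assumed nominal value for surgical stainless steel, not a figure from this disclosure.

```python
# Sketch: estimating an instrument's volume from its measured mass and an
# assumed material density, one of the volume-determination methods
# mentioned above (volume = mass / density).

STAINLESS_STEEL_DENSITY_G_PER_CM3 = 7.9  # assumed nominal density

def volume_from_mass(mass_g: float,
                     density_g_per_cm3: float = STAINLESS_STEEL_DENSITY_G_PER_CM3) -> float:
    """Return estimated volume in cm^3 for a given mass in grams."""
    if mass_g <= 0:
        raise ValueError("mass must be positive")
    return mass_g / density_g_per_cm3

# Example: a 79 g stainless instrument occupies roughly 10 cm^3
print(round(volume_from_mass(79.0), 2))  # -> 10.0
```

A displacement-based or image-based estimate could replace this function without changing the rest of an identification pipeline.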

[0024] One or more sorting stations 20 may be located within the system to move one or more items within the system 10. The one or more sorting stations 20 may be used to separate incoming items into one or more groups, separate items within one or more groups into one or more subgroups, sort items into various locations (e.g., bins or trays) upon unique identification, etc.

[0025] A controller 22 is operatively coupled to the conveyor system 12, scale 14, camera station 16, sorting station 20, or other stations. The controller 22, for example, may comprise a processor, memory, data storage, a database and additional elements as described herein to receive inputs from the various subsystems and control various operations of the subsystems. The controller may also log information about each identified item and/or each item unable to be identified. The controller, for example, may access a database that stores a history of each item and update any information relative to each item. The database, for example, may track usage for routine maintenance schedules, and may track procedures to link an item to a particular procedure or patient.

[0026] Although Fig. 1 shows the system as including a plurality of discrete stations (e.g., scale 14, camera station 16, volumetric measurement station 18, sorting station 20) solely to simplify the description of the overall system, one or more of the stations may provide discrete elements adapted to provide two or more of the operations at a single location. A station, for example, may be adapted to include a scale to weigh one or more items and one or more cameras adapted to take one or more images of the item(s) at the station.

[0027] Fig. 2 is a flowchart showing example operations of a method of uniquely identifying one or more items 30, such as surgical tools. In this example, the method comprises an operation of receiving a plurality of items (e.g., surgical tools) in operation 32. The items may be received individually or in one or more groups (e.g., in one or more trays of surgical tools received post surgery). As described above with reference to Fig. 1, the items and/or trays may be received via an automated system, such as a conveyor system, or manually. The items may be pre-cleaned in operation 34, such as to remove debris that may otherwise interfere with identification or otherwise cleaned or treated to aid in identification of the individual item(s). The item(s) and/or tray may have a unique identifier (e.g., a surgical tray ID such as on a barcode or QR code) that may be used to track the group of instruments over time. For example, the item(s) or tray(s) may be identified as being used in a particular surgical procedure, storage, maintenance, etc. In this manner, a group of items may be identified over time and, as described herein, individual items within a group may further be identified with particularity.

[0028] In various implementations, one or more of the following described operations may be used to eliminate one or more items of a predefined set of items until a single item or group of items is identified. In other implementations, one or more of the following described operations may be performed until a single item or group of items is identified from a predefined set of items. In further implementations, some operations may be performed to eliminate one or more items and other operations may be used to positively identify one or more items from a predefined set of items.

[0029] Each item may then be individually measured, imaged or undergo other identification methods to aid in identification in operation 36. The identification methods may include, for example, weighing, determining size (of the entire item or of one or more components of the item), determining a volume (of the entire item or of one or more components of the item), or the like. The operation of weighing an item, for example, may comprise placing the item on a scale to determine the weight of the item. The operation may be completed manually or in an automated manner, such as delivering the item to the scale via a conveyor or robot device. The operation may further comprise determining a size of the item, such as via image processing. In one implementation, for example, one or more cameras may be used to take one or more images of the item (e.g., from a variety of angles and/or distances) and the image(s) used to determine the size or estimated size of the item and/or one or more components or features of the item via image processing. The images may be taken with the item disposed in view of one or more scale markings to provide a perspective for determining the size. The size may refer to an overall item size, individual component or feature size, and/or relative size of one or more components or features relative to another component or feature or relative to the overall item.
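The size determination described above, converting a pixel-space measurement to physical units using a reference marking visible in the image, reduces to a simple ratio. The reference length and pixel counts below are illustrative assumptions.

```python
# Sketch: converting a pixel-space bounding box to physical size using a
# reference scale marking captured in the same image, as described above.

def pixels_to_mm(pixel_length: float, ref_pixels: float, ref_mm: float) -> float:
    """Convert a measured pixel length to millimeters via a reference marking."""
    return pixel_length * (ref_mm / ref_pixels)

def bounding_box_size_mm(box_px: tuple, ref_pixels: float, ref_mm: float) -> tuple:
    """Return (width_mm, height_mm) for a (width_px, height_px) bounding box."""
    w_px, h_px = box_px
    return (pixels_to_mm(w_px, ref_pixels, ref_mm),
            pixels_to_mm(h_px, ref_pixels, ref_mm))

# Hypothetical example: a 10 mm marking spans 50 pixels; a 700 x 150 px
# bounding box then corresponds to a 140 mm x 30 mm instrument.
print(bounding_box_size_mm((700, 150), ref_pixels=50, ref_mm=10))  # -> (140.0, 30.0)
```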

[0030] In a weighing operation, for example, the weight of an item may be used to sort the item into a subset of items. Varying sized surgical instruments that look similar to other instruments, for example, may be sorted by weight. An item that has an unexpected weight can be removed for further (e.g., manual) identification or handling. In some cases, for example, an item that is heavier than a predetermined limit may in reality comprise two or more items linked or stuck together, or an item that is lighter than a predetermined limit may in reality comprise a portion of a broken item.

[0031] In a sizing operation, a known background may be used in taking a preliminary image and used to define a bounding box or shape that the item will fit within. This can be used to provide an indication of the type of instrument. This may not typically result in a unique identification but can be used to partially identify the item within a class or group and/or can be used to eliminate one or more types or classes of items. Items that do not fit within any known size can be rejected at this point. Again, this may correspond to two or more instruments linked or stuck together or may correspond to a broken portion of an item that is not as large as expected. It is also possible that the item is disposed in an unexpected orientation and may be re-oriented and re-analyzed through the system, allowing it to be identified.
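The weight-based elimination described above can be sketched as a simple catalog filter. The instrument names, nominal weights, and tolerance here are hypothetical placeholders, not values from this disclosure.

```python
# Sketch: eliminating candidate instrument types whose nominal weight is
# inconsistent with a measured weight, and flagging anomalies (e.g., items
# stuck together, or broken fragments). Catalog values are assumed.

CATALOG = {
    "mayo_scissor_curved": 45.0,   # nominal weight in grams (hypothetical)
    "hemostat_5in": 30.0,
    "needle_holder_6in": 60.0,
}

def candidates_by_weight(measured_g: float, tolerance_g: float = 3.0) -> list:
    """Return item types whose nominal weight is within tolerance of the measurement."""
    return [name for name, nominal in CATALOG.items()
            if abs(nominal - measured_g) <= tolerance_g]

def flag_anomalous(measured_g: float, tolerance_g: float = 3.0) -> bool:
    """True when no known type matches, e.g. linked items or a broken portion."""
    return not candidates_by_weight(measured_g, tolerance_g)

print(candidates_by_weight(44.0))  # -> ['mayo_scissor_curved']
print(flag_anomalous(95.0))        # -> True (heavier than any known type)
```

Bounding-box size checks from the sizing operation could be layered into the same filter to reject further candidates before image analysis.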

[0032] In a volume determination operation, one or more cameras located at different angles and/or distances from an item may be used to approximate a volume of an item to be identified. A volume measure can be used in a similar manner to a size measurement.

[0033] The characteristics, such as weight, size, volume and the like, may be used to eliminate one or more items or groups of items of a predefined set to reduce the class of items remaining for subsequent operations. In this manner, items may be identified within an extremely large set of items (e.g., an entire collection of surgical instruments in a surgical facility such as a hospital) without the extraneous computing cost of matching specific, unique identifiers against the entire set of items, thus reducing the computing resources needed for a final determination.

[0034] Again, physical characteristics, such as weight, size, volume and the like, may be used to eliminate a portion of a set of items and potentially reduce the complexity of further identification processing by reducing the size of the set of items from which the item is to be identified.

[0035] In some implementations, however, the physical characteristics need not be measured depending on the system requirements.

[0036] In operation 38, one or more instruments are grouped into one or more sets based on the results of the identification operation 36. For example, a set may be designated to include items having one or more similar characteristics, such as size, weight or other characteristics determined in identification operation 36.

[0037] In operation 40, each set of items can be analyzed, such as using a model (e.g., a model in a neural network, such as but not limited to a TensorFlow convolutional neural network (CNN)), to further identify individual item(s) in the set of items. The model may include various machine learning techniques to aid in the identification process. The set of items can be further divided based on this operation 40 into one or more subsets including related items from within the set. In other implementations, the same set may be maintained for further analysis in a following operation.

[0038] In operation 42, each individual item in the set or subset obtained from operation 40 may be analyzed individually to identify one or more features of the item. The feature(s) may be located in one or more areas of the item where unique marks or identifiable characteristics may be found on the item. The marks or other characteristics may include, for example, tool marks, scratches (e.g., developed during use), etching, staining or the like. In one particular implementation, for example, naturally occurring marks and characteristics, such as those created during the manufacturing process (tool marks) and those occurring in the instrument's purposeful use and storage, are able to be used without resorting to additional extraneous identifiers such as codes (e.g., barcodes, QR codes and the like) that are added as part of an add-on inventory control system.

[0039] Each set of marks or other characteristics on an item may be used as a unique identifier (e.g., fingerprint) that may, individually or in combination, be used to uniquely identify an item within a larger set of items. For example, tool marks, use marks, etch marks, scratches and the like may be able to uniquely identify an individual item (e.g., surgical tool) within a larger set of items (e.g., a set of surgical tools within a particular hospital system). By identifying individual items within the discrete set, the system can be used to track individual items and to create a log or tracking history of each item within a system. For a surgical tool, for example, a particular tool may be tracked to each procedure in which it is used. That information can also be correlated to particular patients so that further treatments, maintenance or other steps can be taken in response to the history of the item and/or based on patient outcomes. Where a patient develops a post-surgical infection, for example, the instrument can be designated for further treatment, such as further cleaning and/or disinfection to prevent additional adverse reactions. Similarly, a history of each item may be maintained so that the instrument may undergo routine maintenance, repair, etc.

[0040] Operation 42 may include one or more forensic algorithms (e.g., an AFIS fingerprint algorithm or a ballistic algorithm or the like) to identify one or more features of each item and, in turn, identify the item within a larger set of items. The operation 42 may extract image data from one or more unique areas of an item. Feature extraction is used to identify a set of features in one or more sets of marks. Based on a set of predefined features, the system accesses a database to locate a corresponding item from the set of items.
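The database lookup described above can be sketched as a nearest-neighbor match between an extracted feature vector and stored per-item vectors. The item identifiers, feature values, and distance threshold below are illustrative assumptions; a real forensic matcher would use far richer features.

```python
# Sketch: matching an extracted mark-feature vector against a database of
# known items, in the spirit of the fingerprint-style lookup described above.

import math

DATABASE = {
    "scissor_04": [0.91, 0.12, 0.55],   # hypothetical mark features per item
    "scissor_07": [0.20, 0.80, 0.33],
    "hemostat_02": [0.50, 0.49, 0.10],
}

def euclidean(a, b):
    """Plain Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(features, threshold=0.2):
    """Return the best-matching item id, or None if no match is close enough."""
    best_id, best_dist = None, float("inf")
    for item_id, known in DATABASE.items():
        d = euclidean(features, known)
        if d < best_dist:
            best_id, best_dist = item_id, d
    return best_id if best_dist <= threshold else None

print(identify([0.90, 0.13, 0.54]))  # close to scissor_04's stored vector
print(identify([0.0, 0.0, 0.0]))     # no sufficiently close match -> None
```

The threshold plays the role of the "sufficiently high likelihood" check mentioned later: an item whose best match is too distant falls out for manual handling.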

[0041] The result of the non-unique identification operations 36 may be used to determine a particular model that may be applicable for performing further identification operations. For example, depending on a group of items remaining after a preliminary identification operation, one or more models or analyses may be advantageous to make a more definite identification in further operations.

[0042] In one implementation, for example, a model may be used to identify the item from one or more individual naturally-occurring features (e.g., marks and characteristics) detected on the item, such as those created during the manufacturing process (tool marks) and those occurring in the instrument's purposeful use and storage, without resorting to additional extraneous identifiers such as codes (e.g., barcodes, QR codes and the like) that are added as part of an add-on inventory control system. Models may include patterns for non-instrument or broken instrument identification. If a resulting identification fails to meet a sufficiently high likelihood that the item/instrument is properly classified, or if the identification process determines that the item is a foreign object, the instrument may be removed from the identification process and analysis may continue with respect to another item.

[0043] One or more models may be used to determine the most likely identification for an item, especially where a certain identification is not determined. In one implementation, a cross-section of tooling marks may represent a frequency pattern that can be transformed using a Fast Fourier Transform (FFT) into a set of sine wave parameters. These can be matched against a known database to identify the item. A combination of models and machine learning matching and classification may be used to perform an identification.

[0044] In one example, a model such as but not limited to a TensorFlow CNN may be used to identify areas on each instrument where unique manufacturing and use marks, or other unique, naturally-occurring marks, can be used to uniquely identify each item in a set of items.
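The frequency-pattern idea above can be illustrated by treating a tooling-mark cross-section as a 1-D surface profile and extracting its dominant spatial frequency. A naive discrete Fourier transform is used to keep the sketch dependency-free (a production system would use an FFT library), and the synthetic profile stands in for real measured data.

```python
# Sketch: extracting the dominant spatial frequency of a 1-D tooling-mark
# profile via a discrete Fourier transform, per the FFT approach above.
# The resulting frequency (and amplitude/phase, if retained) could be
# matched against a database of known instruments.

import cmath
import math

def dominant_frequency(profile, sample_spacing=1.0):
    """Return the dominant non-DC frequency (cycles per unit) of a 1-D profile."""
    n = len(profile)
    mean = sum(profile) / n
    centered = [v - mean for v in profile]       # remove the DC offset
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2 + 1):               # scan non-DC frequency bins
        coeff = sum(centered[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k / (n * sample_spacing)

# Synthetic tool-mark profile: 10 cycles across 200 samples = 0.05 cycles/sample
profile = [2.0 * math.sin(2 * math.pi * 0.05 * t) for t in range(200)]
print(dominant_frequency(profile))  # -> 0.05
```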

[0045] A second set of models can be used to determine where to inspect an item for distinguishing marks. Tools have machining and wear patterns that are unique. A model can determine where to extract unique naturally occurring machining and use marks, or identify where the item has been intentionally marked (non-naturally occurring).

[0046] Each set of naturally occurring marks of an item (e.g., surgical tool/instrument) may provide a unique identifier for the item. In one example, a forensic algorithm or set of algorithms may be used to identify this unique identifier (e.g., fingerprint) of the item. The system may extract image data from one or more unique areas on each item. A feature extraction may be used to identify a set of features in each set of marks. Based on a unique mark, or more likely a unique combination of marks, an item may be identified, such as via a lookup in a database that matches marks to items.

[0047] In one implementation, for example, known forensic methods can be applied to one or more areas of an item to uniquely identify the instrument. If the instrument has been intentionally marked (e.g., with a barcode or QR code), data can be extracted and used to uniquely identify the item.

[0048] In operation 44, a code or identification mark, if present on an item, may also be analyzed. Codes such as barcodes, QR codes or the like may be used. Manufacturer marks, such as make and model marks, and identifying marks (e.g., etched marks, engraved marks, etc.) may be identified in addition to or instead of other identification operations described herein. Where a subset of items (e.g., surgical tools) has unique identifiers, those identifiers may be used instead of the other identification operations described, and the remaining items identified using unique identifying marks/features.

[0049] If an item is not able to be automatically identified, however, the item may be flagged for manual identification and removed from the automated identification set in operation 46.

[0050] Fig. 3 is a flowchart of an example process 50 of identifying an item of a collection of items. In this example, a preliminary analysis 52 of physical characteristics, such as size, weight and volume, is performed as discussed with respect to Fig. 2. These measurements, for example, may be used to eliminate items of the collection that do not fit within the physical characteristics obtained. A feature detection operation 54 is performed to identify a plurality of regions of the item. While the particular example shown in Fig. 3 shows Regions 1, 2, 3, and 4, any number of regions may be identified and analyzed. A final feature extraction including the features identified in one or more of the regions is performed in operation 56, and an exact identification is performed in operation 58.
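The staged flow of Fig. 3, physical measurements first narrowing the candidate set and per-region features then combined for an exact identification, can be sketched end to end. The candidate catalog, per-region match scores, tolerance, and score threshold below are all hypothetical.

```python
# Sketch of the Fig. 3 pipeline: stage 1 prefilters candidates by a physical
# measurement; stage 2 combines per-region feature-match scores; the best
# surviving candidate above a confidence threshold is the exact identification.

CANDIDATES = {
    "scissor_04": {"weight_g": 45.0},   # hypothetical catalog entries
    "scissor_07": {"weight_g": 45.5},
    "hemostat_02": {"weight_g": 30.0},
}

def prefilter(measured_g, tolerance_g=2.0):
    """Stage 1: drop candidates inconsistent with the measured weight."""
    return [c for c, props in CANDIDATES.items()
            if abs(props["weight_g"] - measured_g) <= tolerance_g]

def combine_region_scores(region_scores):
    """Stage 2: average the per-region match scores for each candidate."""
    return {c: sum(s) / len(s) for c, s in region_scores.items()}

def exact_identification(measured_g, region_scores, min_score=0.8):
    """Return the best candidate id, or None if nothing is confident enough."""
    survivors = prefilter(measured_g)
    combined = combine_region_scores(
        {c: s for c, s in region_scores.items() if c in survivors})
    if not combined:
        return None
    best = max(combined, key=combined.get)
    return best if combined[best] >= min_score else None

# Hypothetical match scores from four inspected regions (Regions 1-4)
scores = {"scissor_04": [0.95, 0.90, 0.88, 0.93],
          "scissor_07": [0.40, 0.35, 0.50, 0.45]}
print(exact_identification(45.2, scores))  # -> 'scissor_04'
```

Averaging is only one way to combine regions; the claims also contemplate decision trees and probability-weighted combinations at this step.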

[0051] Fig. 4 is a schematic block diagram showing an example processing and physical handling flow of a system 60 of uniquely identifying items from a collection. In this implementation, a surgical instrument is received at an image location 62. The instrument may undergo physical characteristic measurements at this location or prior to arriving at the location, or may not need the initial physical characteristic measurements.

[0052] One or more images of the instrument are taken at location 62. An image file is directed for processing to location 64, at which feature detection and extraction are performed for a first region at 66, a second region at 68, a third region at 70, and a fourth region at 72. In these operations, the image may be enhanced and inspected at each region to identify one or more possible unique, naturally occurring marks at that region.
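The per-region flow at 66 through 72 might be sketched as extracting mark features from each region independently and concatenating them into a single descriptor for final identification. The image representation (a dict of region names to mark strings) and the region names are placeholders for whatever enhancement and detection pipeline is actually used.

```python
def extract_region_features(image, region):
    """Placeholder for the enhance-and-inspect step for one region;
    here the 'image' is simply a dict of region -> mark descriptors."""
    return image.get(region, [])

def combined_descriptor(image,
                        regions=("region1", "region2", "region3", "region4")):
    """Concatenate per-region features into one descriptor, mirroring
    the four-region flow of Fig. 4; regions with no detected marks
    contribute nothing."""
    features = []
    for r in regions:
        features.extend(extract_region_features(image, r))
    return features

image = {
    "region1": ["scratch:curved:deep"],
    "region2": ["scratch:straight:shallow"],
    "region4": ["impression:circular"],
}
print(combined_descriptor(image))
```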

[0053] In this example, the system identifies the surgical instrument as a curved Mayo scissor surgical instrument and identifies it by code DE158B as curved Mayo scissor instrument number 4 of the collection for the facility. Since the specific item has been identified, it may be tracked, such as to a tray, patient and/or procedure, and the facility may maintain a complete history of the item usage for analytics and other purposes.

[0054] On the physical handling side, the instrument is moved from the image location 62 to a handling system 74 where the instrument can be directed as needed.

[0055] Fig. 5 shows an example non-naturally occurring inventory code marking system that may be used to mark items such as surgical instruments.

[0056] Fig. 6 is an image of a surgical scissor instrument showing example marks obtained through manufacture or use of the surgical instrument without needing an additional marking or inventory control identifier. In this example, the surgical scissor has a plurality of curved scratch marks 80 located in the area of a lock box of the scissors. These scratch marks may be part of the unique identification of the pair of surgical scissors. The marks may be, for example, straight or curved, deep or shallow, and may be examined. In one implementation, for example, the image may be isolated and enhanced in one or more target areas to identify particular characteristics of the marks. The depth, angles, curvature, and other characteristics may be determined and matched against other identifying features.

[0057] Fig. 7 is another image of the same surgical scissor instrument shown in Fig. 6 showing the same region 80 having the scratch marks shown in Fig. 6 and further showing a second region 82 of marks that are similarly obtained through manufacture or use of the surgical instrument without needing an additional marking or inventory control identifier. Fig. 8 shows a closer view of the second region 82 shown in Fig. 7. Both sets of scratch marks in regions 80, 82 may be part of the unique identification of the pair of surgical scissors. Similar regions would be found on an opposite side of the surgical scissor instrument. Thus, by turning the item over, an additional two sets of marks can be imaged. Again, the marks may be, for example, straight or curved, deep or shallow, and may be examined. In one implementation, for example, the image may be isolated and enhanced in one or more target areas to identify particular characteristics of the marks. The depth, angles, curvature, and other characteristics may be determined and matched against other identifying features.

[0058] Fig. 9 is an image showing handles of three separate surgical hemostat instruments showing example marks for comparison between the different instruments. The marks again are obtained through manufacture or use of the surgical instrument without needing an additional marking or inventory control identifier. In this example, each handle portion of the hemostat instruments has different identifiable naturally occurring marks in regions 84, 86, 88. These scratch marks may be part of the unique identification of each hemostat instrument. The marks may be, for example, straight or curved, deep or shallow, and may be examined. In one implementation, for example, the image may be isolated and enhanced in one or more target areas to identify particular characteristics of the marks. The depth, angles, curvature, and other characteristics may be determined and matched against other identifying features.

[0059] Fig. 10 is an image showing finger ring portions of the same three surgical hemostat instruments for comparison. Again, differing marks on each of the instruments can be identified.

[0060] Fig. 11 is another image showing a close-up view of a portion of another example surgical instrument. Again, naturally occurring marks can be identified, such as the circular impression formed in the instrument, likely from storage against another instrument. Again, depth, shape, curvature and other features of the marks can be used to positively identify the instrument.

[0061] Fig. 12 is an image showing side-by-side surgical instruments having different identifiable naturally occurring marks.

[0062] Figs. 6-12 show a plurality of example marks that can be found on surgical instruments that are naturally occurring and not due to an additional intentional marking or inventory control system. These are merely examples that provide example areas of interest in which naturally occurring features of a surgical instrument may be found. In some implementations, multiple different markings for each item (e.g., 5, 6, 7, 8, 9, 10 or even more markings) may be used to identify an instrument within a collection of instruments. By taking advantage of naturally occurring features, a complete set of instruments for a facility may be managed without the additional expense of individually marking or adding inventory control devices (e.g., RFID, barcode, or QR code).

[0063] In various implementations, a system or method may include a controller adapted to use machine learning to identify instruments. The controller can be adapted to learn to identify the item based on image data. The controller can comprise a neural network. The controller can be adapted to automatically identify a plurality of regions of the item containing identifiable forensic information. The controller can be adapted to use machine learning to individually identify instruments. The controller can be adapted to learn, based on a set of items specific to the collection of items, the unique patterns that identify items within the collection of items. The controller can be adapted to use forensic analysis to automatically identify items, optionally wherein the controller is adapted to automatically identify a unique item based on a forensic algorithm. The controller can be adapted to use decision trees and multiple machine learning models to eliminate unidentifiable items. The controller can be adapted to use a set of probability-weighted decision trees or a set of true/false decision trees to uniquely identify individual instruments. The controller can be adapted to combine multiple factors with machine learning to identify instruments. The controller can be adapted to combine multiple factors with other information to determine a set of potential item models to be used to identify the item. The controller can be adapted to use temporal tracking of individual items to limit results from a probabilistic machine learning module. The controller can be adapted to use knowledge about a current collection of items in use to identify a set of possible items to be identified, use the current collection of items in use to limit a set of potential models, and eliminate impossible items from the output of the models. The controller can be adapted to use a decision tree to combine parametric information with a set of machine learning models and to eliminate models that cannot be relevant. The controller can be adapted to use a set of decision trees to combine parametric information with a set of results based on applying the models to the images of the item.
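One of the strategies described above, limiting probabilistic model output using knowledge of the current collection in use and temporal tracking, might be sketched as follows. The score values, item codes, and function name are illustrative assumptions, not the controller's actual implementation.

```python
def restrict_candidates(model_scores, in_use, located_elsewhere):
    """Combine probabilistic machine-learning output with collection
    knowledge: drop items not in the facility's current collection and
    items that temporal tracking places elsewhere, then renormalize the
    remaining probabilities."""
    filtered = {item: p for item, p in model_scores.items()
                if item in in_use and item not in located_elsewhere}
    total = sum(filtered.values())
    if total == 0:
        return {}  # no possible candidates; flag for manual identification
    return {item: p / total for item, p in filtered.items()}

scores = {"DE158A": 0.30, "DE158B": 0.50, "DE158C": 0.20}
in_use = {"DE158A", "DE158B"}       # current collection at the facility
elsewhere = {"DE158A"}              # temporal tracking: in another tray
print(restrict_candidates(scores, in_use, elsewhere))  # → {'DE158B': 1.0}
```

Eliminating impossible items before (or after) the model runs both shrinks the candidate set and raises confidence in the surviving identification, which is the rationale the paragraph above describes.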

[0064] Fig. 13 illustrates an exemplary system useful in implementations of the described technology. A general purpose computer system 1000, which may be used as the described controller, is capable of executing a computer program product to execute a computer process. Data and program files may be input to the computer system 1000, which reads the files and executes the programs therein. Some of the elements of a general purpose computer system 1000 are shown in Fig. 13 wherein a processor 1002 is shown having an input/output (I/O) section 1004, a Central Processing Unit (CPU) 1006, and a memory section 1008. There may be one or more processors 1002, such that the processor 1002 of the computer system 1000 comprises a single central processing unit 1006, or a plurality of processing units, commonly referred to as a parallel processing environment. The computer system 1000 may be a conventional computer, a distributed computer, or any other type of computer. The described technology is optionally implemented in software devices loaded in memory 1008, stored on a configured DVD/CD-ROM 1010 or storage unit 1012, and/or communicated via a wired or wireless network link 1014 on a carrier signal, thereby transforming the computer system 1000 in Fig. 13 into a special purpose machine for implementing the described operations.

[0065] The I/O section 1004 is connected to one or more user-interface devices (e.g., a keyboard 1016 and a display unit 1018), a disk storage unit 1012, and a disk drive unit 1020. Generally, in contemporary systems, the disk drive unit 1020 is a DVD/CD-ROM drive unit capable of reading the DVD/CD-ROM medium 1010, which typically contains programs and data 1022.

The data may be stored in any applicable format and may, in some implementations, be stored in an accessible database that is adapted to link items to activities such as uses, procedures, storage, age, etc. In other implementations, the disk drive may be an external storage system such as a standalone database (e.g., located on one or more networked servers). Computer program products containing mechanisms to effectuate the systems and methods in accordance with the described technology may reside in the memory section 1008, on a disk storage unit 1012, or on the DVD/CD-ROM medium 1010 of such a system 1000. Alternatively, a disk drive unit 1020 may be replaced or supplemented by a floppy drive unit, a tape drive unit, or other storage medium drive unit. The network adapter 1024 is capable of connecting the computer system to a network via the network link 1014, through which the computer system can receive instructions and data embodied in a carrier wave. Examples of such systems include SPARC systems offered by Sun Microsystems, Inc., personal computers offered by Dell Corporation and by other manufacturers of Intel-compatible personal computers, PowerPC-based computing systems, ARM-based computing systems and other systems running a UNIX-based or other operating system. It should be understood that computing systems may also embody devices such as Personal Digital Assistants (PDAs), mobile phones, gaming consoles, set top boxes, etc.

[0066] When used in a LAN-networking environment, the computer system 1000 is connected (by wired connection or wirelessly) to a local network through the network interface or adapter 1024, which is one type of communications device. When used in a WAN-networking environment, the computer system 1000 typically includes a modem, a network adapter, or any other type of communications device for establishing communications over the wide area network. In a networked environment, program modules depicted relative to the computer system 1000 or portions thereof, may be stored in a remote memory storage device. It is appreciated that the network connections shown are exemplary and other means of and communications devices for establishing a communications link between the computers may be used.

[0067] In accordance with an implementation, software instructions and data directed toward operating the subsystems may reside on the disk storage unit 1012, disk drive unit 1020 or other storage medium units coupled to the computer system. Said software instructions may also be executed by CPU 1006.

[0068] The implementations described herein are implemented as logical steps in one or more computer systems. The logical operations are implemented (1) as a sequence of processor-implemented steps executing in one or more computer systems and (2) as interconnected machine or circuit modules within one or more computer systems. The implementation is a matter of choice, dependent on the performance requirements of a particular computer system. Accordingly, the logical operations making up the embodiments and/or implementations described herein are referred to variously as operations, steps, objects, or modules. Furthermore, it should be understood that logical operations may be performed in any order, unless explicitly claimed otherwise or a specific order is inherently necessitated by the claim language.

[0069] Furthermore, certain operations in the methods described above must naturally precede others for the described method to function as described. However, the described methods are not limited to the order of operations described if such order sequence does not alter the functionality of the method. That is, it is recognized that some operations may be performed before or after other operations without departing from the scope and spirit of the claims.

[0070] Although implementations have been described above with a certain degree of particularity, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the spirit or scope of this invention. All directional references (e.g., upper, lower, upward, downward, left, right, leftward, rightward, top, bottom, above, below, vertical, horizontal, clockwise, and counterclockwise) are only used for identification purposes to aid the reader’s understanding of the present invention, and do not create limitations, particularly as to the position, orientation, or use of the invention. Joinder references (e.g., attached, coupled, connected, and the like) are to be construed broadly and may include intermediate members between a connection of elements and relative movement between elements. As such, joinder references do not necessarily infer that two elements are directly connected and in fixed relation to each other. It is intended that all matter contained in the above description or shown in the accompanying drawings shall be interpreted as illustrative only and not limiting. Changes in detail or structure may be made without departing from the spirit of the invention as defined in the appended claims.