Title:
AUGMENTED REALITY-ENABLED SYSTEMS AND METHODS FOR CABLE IDENTIFICATION
Document Type and Number:
WIPO Patent Application WO/2023/180032
Kind Code:
A1
Abstract:
The present invention provides augmented reality systems and methods capable of intuitively recognizing cable-related information for cable assemblies and conveying visual indicators to a user of an augmented reality device. The methods described herein can include receiving video data from a camera, receiving functional parameters for a plurality of medical cables operatively connecting a patient monitoring system to a patient, generating touch data indicating one or more cables as having been touched or as actively being touched, analyzing the plurality of medical cables, and generating an augmented-reality output comprising one or more visual indicators to be superimposed on the plurality of medical cables using an augmented reality device.

Inventors:
TALGORN ELISE CLAUDE VALENTINE (NL)
DELLIMORE KIRAN HAMILTON J (NL)
GEURTS LUCAS JACOBUS FRANCISCUS (NL)
JOYE NEIL FRANCIS (NL)
JANSSEN ANTHONIUS PETRUS GERARDUS EMANUEL (NL)
Application Number:
PCT/EP2023/055256
Publication Date:
September 28, 2023
Filing Date:
March 02, 2023
Assignee:
KONINKLIJKE PHILIPS NV (NL)
International Classes:
G06T19/00; G16H40/63; G06V20/20; G16H40/40
Domestic Patent References:
WO2018116222A12018-06-28
Foreign References:
US20190268470A12019-08-29
US20210398355A12021-12-23
Attorney, Agent or Firm:
PHILIPS INTELLECTUAL PROPERTY & STANDARDS (NL)
Claims:

What is claimed is:

1. An augmented reality-enabled cable identification system, the system comprising:
a memory storing instructions that, when executed by at least one processor, perform the following:
receive video data from an associated camera, wherein the video data includes images of a plurality of medical cables operatively connected to an associated patient and an associated patient monitoring system;
receive functional parameters for the plurality of medical cables from the associated patient monitoring system;
analyze the video data from the associated camera to generate touch data indicating one or more cables of the plurality of medical cables being touched by an associated user;
analyze the plurality of medical cables by:
extracting optical features associated with the one or more cables indicated by the touch data;
comparing the extracted optical features of the one or more indicated cables with optical feature information stored in an optical features database;
identifying at least one cable parameter for the one or more indicated cables based on the comparison with the optical feature information of the optical features database, wherein the at least one cable parameter comprises a cable identity; and
determining whether the one or more indicated cables are functioning properly based on the received functional parameters corresponding to the one or more indicated cables;
generate an augmented-reality (“AR”) output comprising one or more visual indicators to be superimposed on the plurality of medical cables using an associated augmented-reality device, the one or more visual indicators being generated based on the analysis of the plurality of medical cables; and
transmit the AR output to a display of the associated augmented-reality device.

2. The augmented reality-enabled cable identification system of claim 1, wherein the received functional parameters comprise one or more of: an electrical signal; a signal quality; a signal presence; and a cable-related error.

3. The augmented reality-enabled cable identification system of claim 1, wherein the visual indicators of the AR output comprise one or more of: a persistent highlight color; an intermittent highlight color; a cable-related message; and an arrow indicator.

4. The augmented reality-enabled cable identification system of claim 1, further comprising: the optical features database; and the associated augmented-reality device, wherein the associated augmented-reality device includes the display, the associated camera, the at least one processor, and the memory.

5. The augmented reality-enabled cable identification system of claim 1, further comprising: the associated patient monitor; and the plurality of medical cables.

6. The augmented reality-enabled cable identification system of claim 1, wherein the visual indicators of the AR output comprise a persistent highlight color or an intermittent highlight color for each of the one or more indicated cables of the plurality of medical cables.

7. The augmented reality-enabled cable identification system of claim 1, wherein the memory further stores instructions that, when executed by the at least one processor, perform the following: analyze the plurality of cables by: determining whether each of the one or more indicated cables has a non-disrupted cable path from the associated patient monitoring system to the associated patient; and if an indicated cable of the one or more indicated cables does not have a non-disrupted cable path, generate the AR output comprising one or more visual indicators to be superimposed on the plurality of medical cables, wherein the one or more visual indicators include sequential directions for creating a non-disrupted cable path for the indicated cable.

8. The augmented reality-enabled cable identification system of claim 1, wherein the at least one cable parameter further includes: a cable identity; a cable purpose; a cable material; a cable connection type; and a cable type.

9. A computer-implemented method of identifying medical cables in an augmented reality environment, the method comprising:
receiving, from a camera, video data comprising images of a plurality of medical cables operatively connecting a patient to a patient monitoring system;
generating touch data indicating one or more cables of the plurality of medical cables being touched by a user, wherein the touch data is generated by analyzing the images of the video data to identify the one or more cables;
analyzing the plurality of medical cables by:
extracting optical features associated with the one or more cables indicated by the touch data;
comparing the extracted optical features of the one or more cables indicated by the touch data with optical feature information stored in an optical features database; and
identifying at least one cable parameter for the one or more indicated cables based on the comparison with the optical feature information of the optical features database;
generating an augmented-reality (“AR”) output comprising one or more visual indicators to be superimposed on the plurality of medical cables using an augmented-reality device, wherein the one or more visual indicators are generated based on the analysis of the plurality of medical cables; and
transmitting the AR output to a display of the augmented-reality device.

10. The computer-implemented method of claim 9, further comprising: displaying, on the display of the augmented-reality device, the generated AR output.

11. The computer-implemented method of claim 9, further comprising: receiving, from the patient monitoring system, functional parameters for the plurality of medical cables; and analyzing the plurality of medical cables by: determining whether the one or more indicated cables are functioning properly based on the received functional parameters corresponding to the one or more indicated cables.

12. The computer-implemented method of claim 11, wherein the received functional parameters comprise one or more of: an electrical signal; a signal quality; a signal presence; and a cable-related error.

13. The computer-implemented method of claim 9, wherein the visual indicators of the AR output comprise one or more of: a persistent highlight color; an intermittent highlight color; a cable-related message; and an arrow indicator.

14. The computer-implemented method of claim 9, further comprising: analyzing the plurality of cables by: determining whether each of the one or more indicated cables has a non-disrupted cable path from the patient monitoring system to the patient; wherein the one or more visual indicators of the generated AR output includes a set of sequential directions for creating a non-disrupted cable path for each of the indicated cables of the one or more indicated cables determined to not have a non-disrupted cable path from the patient monitoring system to the patient.

15. The computer-implemented method of claim 9, wherein the received functional parameters comprise one or more of an electrical signal, a signal quality, a signal presence, and a cable-related error; and wherein the at least one cable parameter further includes one or more of a cable identity, a cable purpose, a cable material, a cable connection type, and a cable type.

Description:
AUGMENTED REALITY-ENABLED SYSTEMS AND METHODS FOR CABLE IDENTIFICATION

Field of the Disclosure

[0001] The present disclosure is directed generally to cable identification in medical settings, and more specifically, to systems and methods of identifying and/or detangling medical cables.

Background

[0002] In healthcare settings, such as hospitals, a patient may be connected to a patient monitoring system using tens or dozens of medical cables at a single time. As a result, cable-related issues arising from “cable spaghetti” are time consuming and labor intensive for healthcare specialists to diagnose and resolve. For example, a particular cable may have a weak or noisy signal due to a loose connection, but that cable can only be accessed by moving and/or untangling multiple other unrelated cables. However, conventional methods of addressing cable-related issues and “cable spaghetti” are generally limited to tedious, time consuming, and manual trial-and-error type methods.

Summary of the Disclosure

[0003] The present disclosure is directed generally to augmented reality-enabled systems and methods for intuitive cable identification in healthcare settings.

[0004] According to an embodiment, an augmented reality-enabled cable identification system is provided. The system can comprise a memory storing instructions that, when executed by at least one processor, perform the following: receive video data from an associated camera, wherein the video data includes images of a plurality of medical cables operatively connected to an associated patient and an associated patient monitoring system; receive functional parameters for the plurality of medical cables from the associated patient monitoring system; analyze the video data from the associated camera to generate touch data indicating one or more cables of the plurality of medical cables being touched by an associated user; analyze the plurality of medical cables; generate an augmented-reality (“AR”) output comprising one or more visual indicators to be superimposed on the plurality of medical cables using an associated augmented-reality device, the one or more visual indicators being generated based on the analysis of the plurality of medical cables; and transmit the AR output to a display of the associated augmented-reality device. In an aspect, the plurality of medical cables may be analyzed by: extracting optical features associated with the one or more cables indicated by the touch data; comparing the extracted optical features of the one or more indicated cables with optical feature information stored in an optical features database; identifying at least one cable parameter for the one or more indicated cables based on the comparison with the optical feature information of the optical features database, wherein the at least one cable parameter comprises a cable identity; and determining whether the one or more indicated cables are functioning properly based on the received functional parameters corresponding to the one or more indicated cables.

[0005] In an aspect, the received functional parameters can comprise one or more of: an electrical signal; a signal quality; a signal presence; and a cable-related error.

[0006] In an aspect, the visual indicators of the AR output can comprise one or more of: a persistent highlight color; an intermittent highlight color; a cable-related message; an arrow indicator.

[0007] In an aspect, the augmented reality-enabled cable identification system can further comprise: the optical features database; and the associated augmented-reality device, wherein the associated augmented-reality device includes the display, the associated camera, the at least one processor, and the memory.

[0008] In an aspect, the augmented reality-enabled cable identification system can further comprise: the associated patient monitor; and the plurality of medical cables.

[0009] In an aspect, the visual indicators of the AR output can comprise a persistent highlight color or an intermittent highlight color for each of the one or more indicated cables of the plurality of medical cables.

[0010] In an aspect, the memory further stores instructions that, when executed by the at least one processor, analyze the plurality of cables by: determining whether each of the one or more indicated cables has a non-disrupted cable path from the associated patient monitoring system to the associated patient; and if an indicated cable of the one or more indicated cables does not have a non-disrupted cable path, generate the AR output comprising one or more visual indicators to be superimposed on the plurality of medical cables, wherein the one or more visual indicators include sequential directions for creating a non-disrupted cable path for the indicated cable.

[0011] In an aspect, the at least one cable parameter can further include: a cable identity; a cable purpose; a cable material; a cable connection type; and a cable type.

[0012] According to another embodiment, a computer-implemented method of identifying medical cables in an augmented reality environment is provided. The method can comprise: receiving, from a camera, video data comprising images of a plurality of medical cables operatively connecting a patient to a patient monitoring system; generating touch data indicating one or more cables of the plurality of medical cables being touched by a user, wherein the touch data is generated by analyzing the images of the video data to identify the one or more cables; analyzing the plurality of medical cables; generating an augmented-reality (“AR”) output comprising one or more visual indicators to be superimposed on the plurality of medical cables using an augmented-reality device, wherein the one or more visual indicators are generated based on the analysis of the plurality of medical cables; and transmitting the AR output to a display of the augmented-reality device. In an aspect, the plurality of medical cables can be analyzed by: extracting optical features associated with the one or more cables indicated by the touch data; comparing the extracted optical features of the one or more cables indicated by the touch data with optical feature information stored in an optical features database; and identifying at least one cable parameter for the one or more indicated cables based on the comparison with the optical feature information of the optical features database.

[0013] In an aspect, the method can further comprise: displaying, on the display of the augmented-reality device, the generated AR output.

[0014] In an aspect, the method can further comprise: receiving, from the patient monitoring system, functional parameters for the plurality of medical cables; and analyzing the plurality of medical cables by determining whether the one or more indicated cables are functioning properly based on the received functional parameters corresponding to the one or more indicated cables.

[0015] In an aspect, the received functional parameters can comprise one or more of: an electrical signal; a signal quality; a signal presence; and a cable-related error.

[0016] In an aspect, the visual indicators of the AR output can comprise one or more of: a persistent highlight color; an intermittent highlight color; a cable-related message; an arrow indicator.

[0017] In an aspect, the method can further comprise: analyzing the plurality of cables by determining whether each of the one or more indicated cables has a non-disrupted cable path from the patient monitoring system to the patient; wherein the one or more visual indicators of the generated AR output includes a set of sequential directions for creating a non-disrupted cable path for each of the indicated cables of the one or more indicated cables determined to not have a non-disrupted cable path from the patient monitoring system to the patient.

[0018] In an aspect, the received functional parameters comprise one or more of an electrical signal, a signal quality, a signal presence, and a cable-related error; and the at least one cable parameter further includes one or more of a cable identity, a cable purpose, a cable material, a cable connection type, and a cable type.

[0019] These and other aspects of the various embodiments will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.

Brief Description of the Drawings

[0020] In the drawings, like reference characters generally refer to the same parts throughout the different views. Also, the drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the various embodiments.

[0021] FIG. 1 is a flowchart illustrating a method of identifying and/or detangling medical cables according to aspects of the present disclosure.

[0022] FIG. 2 is an illustration showing a patient connected to a patient monitoring system via a plurality of medical cables according to aspects of the present disclosure.

[0023] FIG. 3 is a block diagram of an augmented reality-enabled cable identification system according to aspects of the present disclosure.

Detailed Description of Embodiments

[0024] The present disclosure describes augmented reality systems and methods capable of recognizing cable-related information for medical cable assemblies, including cable arrangement information, and conveying such information to an operator of an augmented reality device. More specifically, in the healthcare setting, a patient can be connected to a patient monitoring system through a plurality of medical cables, which can lead to an issue known as “cable spaghetti” where such cables become intertwined and knotted. Identifying and resolving “cable spaghetti” and other cable-related errors are time consuming and labor-intensive tasks for clinicians and healthcare specialists. Moreover, patient care quality may be negatively impacted if such cable-related issues cannot be quickly and efficiently resolved. As described herein, the augmented reality systems and methods allow for the management and resolution of cable-related issues in a medical environment.

[0025] With reference to FIG. 1, a flowchart illustrating a computer-implemented method 100 of identifying medical cables in an augmented reality environment is provided according to aspects of the present disclosure. As shown, the method 100 can include performing one or more of the following steps using at least one computer processor: at a step 102, receiving video data; at a step 104, generating touch data; at a step 106, analyzing a plurality of medical cables captured in the video data; at a step 118, generating an augmented-reality (“AR”) output; and at a step 120, transmitting the AR output to a display of an augmented reality device. These and other aspects of the present methods 100 are described in further detail below.

[0026] In embodiments, the method 100 can include a step 102 of receiving video data from a camera associated with the augmented reality devices of the present disclosure. In aspects, the video data can comprise images of a plurality of medical cables that operatively connect a particular patient to a patient monitoring system. In further aspects, the video data can be a continuous live-stream of real time events.

[0027] At a step 104, the method 100 can include generating touch data by analyzing the images of the video data. In aspects, the touch data can indicate one or more cables of the plurality of medical cables that were touched by a user. For example, the images of the video data can be processed and analyzed to identify a user’s hand holding or otherwise touching one or more cables of the plurality of medical cables. In aspects, at least two cables may be indicated by the touch data as having been touched, including at least three cables.
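As an illustration only (not part of the disclosure itself), a minimal Python sketch of how touch data might be derived from per-frame detections is shown below. It assumes that an upstream vision step already supplies bounding boxes for detected hands and cables; the Box and CableDetection types and the overlap test are hypothetical.

from dataclasses import dataclass
from typing import List

@dataclass
class Box:
    """Axis-aligned bounding box in pixel coordinates."""
    x1: float
    y1: float
    x2: float
    y2: float

    def intersects(self, other: "Box") -> bool:
        # Two boxes overlap unless one lies entirely to one side of the other.
        return not (self.x2 < other.x1 or other.x2 < self.x1 or
                    self.y2 < other.y1 or other.y2 < self.y1)

@dataclass
class CableDetection:
    cable_id: str
    box: Box

def generate_touch_data(hand_boxes: List[Box],
                        cables: List[CableDetection]) -> List[str]:
    """Return the ids of cables whose detected region overlaps a detected hand.

    Hand and cable detections are assumed to come from an upstream model
    analyzing the video frames (not shown here)."""
    touched = []
    for cable in cables:
        if any(hand.intersects(cable.box) for hand in hand_boxes):
            touched.append(cable.cable_id)
    return touched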

[0028] At a step 106, the method 100 can include analyzing the plurality of medical cables based on at least the images of the video data. In embodiments, the step 106 can be broken down into several sub-steps, including one or more of the following: at a step 108, extracting optical features associated with one or more of the cables of the plurality of medical cables; at a step 110, comparing the extracted optical features of the one or more cables with optical feature information stored in an optical features database; at a step 112, identifying at least one cable parameter for one or more cables of the plurality of medical cables; at a step 114, determining / evaluating the functionality of one or more cables of the plurality of medical cables; and at a step 116, determining cable pathways / cable paths for one or more cables of the plurality of medical cables.

[0029] In aspects, the step 108 includes digitally processing the images of the video data to extract one or more optical features associated with one or more of the plurality of medical cables. In embodiments, optical features are extracted from the video data only for the one or more cables indicated by the touch data as having been touched (or as actively being touched). In further embodiments, the optical features can be, for example and without limitation, the color, shape, diameter, or length of the corresponding cable. The optical features can further include an anatomical location of the patient that the cable is connected to, or a socket of the patient monitoring system that the cable extends from.
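Purely by way of example, the feature extraction of step 108 could be sketched in Python as follows; the use of NumPy, the per-cable segmentation mask, and the specific geometric proxies for length and diameter are assumptions for illustration, not details from the disclosure.

import numpy as np

def extract_optical_features(frame: np.ndarray, cable_mask: np.ndarray) -> dict:
    """Extract simple optical features for one cable from a video frame.

    frame:      H x W x 3 RGB image.
    cable_mask: H x W boolean mask marking the pixels of one cable
                (assumed to come from an upstream segmentation step)."""
    assert cable_mask.any(), "cable mask must contain at least one pixel"
    pixels = frame[cable_mask]                  # N x 3 array of cable pixels
    mean_color = pixels.mean(axis=0)            # average RGB -> cable color
    ys, xs = np.nonzero(cable_mask)
    # Rough geometric proxies: bounding-box diagonal for length,
    # pixel area divided by length for apparent diameter.
    length_px = float(np.hypot(xs.max() - xs.min(), ys.max() - ys.min()))
    diameter_px = float(cable_mask.sum() / max(length_px, 1.0))
    return {
        "mean_rgb": tuple(float(c) for c in mean_color),
        "length_px": length_px,
        "diameter_px": diameter_px,
    }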

[0030] In aspects, the step 110 includes comparing the optical features extracted in step 108 with optical feature information stored in an optical features database. More specifically, the optical features database can store optical feature information for a plurality of known medical cables, including the optical features discussed above. As such, the extracted optical features can be compared with the information stored in the database. In embodiments, the optical features database can be stored on a server remote from the augmented reality device such that in step 110, the one or more processors package the extracted optical features into an inquiry that is transmitted to the remote optical features database. In other embodiments, the optical features database can be stored in a memory local to the augmented reality device such that in step 110, the one or more processors directly compare the extracted optical features with the stored optical feature information.
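A minimal sketch of the comparison in step 110 is given below, assuming a small in-memory database; the entry fields, example cable identities, and the distance-based matching rule are illustrative only. In the remote-database variant described above, the same extracted features would instead be packaged into a query sent to the server.

from math import dist

# Illustrative local database; in practice this information could also live
# on a remote server queried over the network, as described above.
OPTICAL_FEATURES_DB = [
    {"cable_identity": "ECG lead (example)", "mean_rgb": (200.0, 40.0, 40.0), "diameter_px": 6.0},
    {"cable_identity": "SpO2 sensor (example)", "mean_rgb": (60.0, 60.0, 220.0), "diameter_px": 8.0},
]

def match_cable(features: dict) -> dict:
    """Return the database entry whose stored features are closest to the
    extracted ones (simple distance over color plus diameter difference)."""
    def score(entry: dict) -> float:
        color_distance = dist(entry["mean_rgb"], features["mean_rgb"])
        return color_distance + abs(entry["diameter_px"] - features["diameter_px"])
    return min(OPTICAL_FEATURES_DB, key=score)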

[0031] In aspects, the step 112 includes identifying at least one cable parameter associated with the one or more cables of the plurality of medical cables. In embodiments, the at least one cable parameter can be associated with one or more of the cables indicated as having been touched (or as actively being touched) based on the touch data. In further embodiments, the step 112 includes identifying at least one cable parameter for each of the one or more cables indicated by the touch data. For example, and without limitation, the cable parameters can include a cable identity such as the identity of a particular cable, a cable purpose such as the intended function of the cable, a cable material such as a material used in the construction of the cable, a cable connection type such as how the cable connects to the patient and/or the patient monitoring system, and a cable type such as a broader class of cables. In particular embodiments, the one or more cable parameters are identified based on at least the optical features extracted in step 108 and/or the comparison performed in step 110.

[0032] In embodiments, the step 112 can include identifying at least one cable parameter associated with multiple portions or sections of the same cable. For example, as discussed below, a cable may have a disrupted pathway between the patient monitoring system and the patient that is obstructed from view. Nonetheless, in step 112, the method 100 can include identifying two or more non-contiguous sections of a cable as belonging to the same cable and associating the same cable parameters with each non-contiguous section. For example, if the method 100 extracts optical features of a plurality of medical cables wherein only one of the cables is red, but the images of the video data show a red cable extending from the patient monitor that cannot be seen connecting to the patient (i.e., is obstructed from view in some fashion) while another red cable extends from the patient but cannot be seen connecting to the patient monitoring system (i.e., is also obstructed from view), then the step 112 can include identifying these red cables as different sections of the same cable and assigning the same one or more cable parameters to the red cable.
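The linking of non-contiguous cable sections described above could, for instance, use color similarity as the grouping cue, as in the red-cable example. The following Python sketch is a hypothetical illustration; the segment representation (a dict carrying a "mean_rgb" feature) and the tolerance value are assumptions.

def group_segments_by_color(segments, tolerance=20.0):
    """Group visible cable segments that likely belong to the same physical
    cable, using color similarity as the linking cue."""
    groups = []
    for segment in segments:
        for group in groups:
            reference_rgb = group[0]["mean_rgb"]
            if all(abs(a - b) <= tolerance
                   for a, b in zip(reference_rgb, segment["mean_rgb"])):
                group.append(segment)   # close enough in color: same cable
                break
        else:
            groups.append([segment])    # no match found: start a new group
    return groups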

[0033] In aspects, the step 114 includes determining whether one or more cables of the plurality of medical cables are functioning properly. For instance, the method 100 can include a step 122 where one or more functional parameters for the plurality of medical cables are received from the patient monitoring system. Then, at step 114, the functional parameters can be used to determine whether one or more of said cables are functioning properly. As used herein, the phrase “functioning properly” relates to the intended use of the medical cable, and in particular, refers to whether the medical cable is operating as intended or desired. In embodiments, the functional parameters can include one or more of: an electrical signal such as an electrical signal measured in relation to a physiological parameter of the patient, a signal quality such as an indication of the quality (e.g., noise-to-signal ratio, etc.) of the electrical signal, a signal presence such as whether there is a signal being transmitted through the particular cable, a cable-related error such as an error code determined by the patient monitoring system, and the like. Thus, based on such functional parameters, an operational condition can be determined. For example, if there is excessive noise in a signal transmitted through an identified cable, or there is an unusual artifact in the signal, or if the signal is only being received intermittently (e.g., the patient moved and now a signal is no longer being received), then one or a combination of these functional parameters would indicate a problem with the corresponding cable, such as a loose or improper connection. In embodiments, the step 114 can include determining the functionality of only the one or more cables indicated as having been touched (or actively being touched) based on the touch data.

[0034] In aspects, the step 116 includes determining a cable path for one or more cables of the plurality of medical cables based on the images in the video data. As used herein, the terms “cable path” and “cable pathways” are used interchangeably to refer to how a corresponding cable connects two points, such as how a medical cable operatively connects a patient to a patient monitoring system. In embodiments, the step 116 of determining a cable pathway includes determining whether a corresponding cable has a non-disrupted pathway from the patient monitoring system to the patient, or has a disrupted pathway from the patient monitoring system to the patient. For example, a non-disrupted cable pathway could be detected by examining the video data and determining that the entire length of the cable from the patient monitoring system to the patient is entirely visible and not twisted around another cable. Alternatively, a disrupted cable pathway could be detected by examining the video data and determining that a portion or portions of the cable are obstructed from view and/or twisted around other cables.

[0035] In further embodiments, different attributes of the cable pathways can be determined in step 116. For example, if the cable pathway forms a kink and/or several of the plurality of medical cables form a knot, this kink and/or knot attribute may be associated with the cable pathway.

[0036] Turning briefly to FIG. 2, an illustration is provided showing a patient 202 connected to a patient monitoring system 212 via three electrodes 204 and three cables 206, 208, 210. As seen in FIG. 2, cable 206 has a non-disrupted pathway from the patient 202 to the patient monitoring system 212. However, cable 208 and cable 210 are twisted around one another, which may lead to undesirable consequences.
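Referring back to step 114, a simple way to illustrate the functionality check is sketched below; the field names (signal presence, signal-to-noise ratio, error code) and the threshold are hypothetical stand-ins for whatever functional parameters a given patient monitoring system actually reports.

from dataclasses import dataclass
from typing import Optional

@dataclass
class FunctionalParameters:
    """Per-cable values reported by the patient monitoring system.
    Field names and the SNR threshold below are illustrative only."""
    signal_present: bool
    signal_to_noise_db: float
    error_code: Optional[str] = None

def is_functioning_properly(params: FunctionalParameters,
                            min_snr_db: float = 10.0) -> bool:
    """Treat a cable as functioning properly when a signal is present,
    its quality exceeds a threshold, and the monitor reports no error."""
    return (params.signal_present
            and params.signal_to_noise_db >= min_snr_db
            and params.error_code is None)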

[0037] Then, at a step 118, the method 100 can include generating an augmented-reality (“AR”) output. In embodiments, the AR output includes one or more visual indicators to be superimposed on the plurality of medical cables using an augmented-reality device. In further embodiments, the visual indicators of the AR output can include a persistent highlight / overlay color, an intermittent or flashing highlight / overlay color, a cable-related message, an arrow indicator, or the like for one or more of the plurality of medical cables. For instance, as shown in FIG. 2, the AR output 200 includes three visual indicators: (1) cable 208 includes an overlay / highlight color 216; (2) there is a cable-related message 214 indicating a particular error or issue; and (3) there is an indication arrow 218 pointing to cable 208. Accordingly, the AR output can include a variety of combinations of visual indicators.

[0038] Additionally, in aspects, the AR output can include a sequence of visual instructions for guiding a user of the augmented-reality device to address one or more of the cable-related issues identified above, such as instructions for untangling one or more cables with a disrupted pathway. For example, if it is identified that a relevant cable (e.g., a cable indicated by the touch data) is obstructed from view, the AR output could include a sequence of visual indicators emphasizing other cables of the plurality of medical cables to be adjusted (i.e., moved or shifted), which can appear and/or disappear as a user / healthcare specialist follows the instructions. In embodiments such as the scenario depicted in FIG. 2, if a cable is twisted around another cable, then the AR output can include visual indicators 214, 216, 218 indicating that a user should unplug one of the cables and disentangle the cables 208, 210.
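To make the structure of such an AR output concrete, the sketch below models the visual indicators described above (persistent or flashing highlights, cable-related messages, arrows) as plain data objects; the class names, fields, and color choices are illustrative assumptions, not the claimed implementation.

from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class VisualIndicator:
    cable_id: str
    kind: str                                   # "highlight", "message", or "arrow"
    color: Optional[Tuple[int, int, int]] = None
    flashing: bool = False
    text: Optional[str] = None

@dataclass
class AROutput:
    """Collection of indicators to be superimposed on the camera view."""
    indicators: List[VisualIndicator] = field(default_factory=list)

def build_ar_output(touched_ids, faulty_ids) -> AROutput:
    """Highlight every touched cable; flag faulty ones with a flashing
    highlight and an accompanying cable-related message."""
    out = AROutput()
    for cable_id in touched_ids:
        faulty = cable_id in faulty_ids
        out.indicators.append(VisualIndicator(
            cable_id=cable_id, kind="highlight",
            color=(255, 0, 0) if faulty else (0, 255, 0),
            flashing=faulty))
        if faulty:
            out.indicators.append(VisualIndicator(
                cable_id=cable_id, kind="message",
                text=f"Check connection of cable {cable_id}"))
    return out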

[0039] Then, at a step 120, the method 100 can include transmitting the AR output to the display of an augmented-reality device, such as the lenses or display of a wearable headset, wherein the one or more visual indicators generated based on the analysis of the plurality of cables (e.g., step 106) are superimposed upon the plurality of cables in the user’s view. As those of skill in the art will appreciate, the AR output can be updated and/or adjusted as the user’s view of the plurality of cables changes. For example, if the user operating the augmented-reality device changes position, the AR output can update / adjust such that the corresponding visual indicators remain consistent. The AR output can also be updated / adjusted as new video data, touch data, and/or functional parameters are received. In such embodiments, the augmented-reality systems can provide a real-time analysis of the cable assembly connecting a particular patient to a corresponding patient monitoring system.

[0040] Turning now to FIG. 3, a diagram of an augmented reality-enabled cable identification system 300 comprising a cable identification controller 302 is illustrated according to aspects of the present disclosure. As shown, the cable identification controller 302 can include at least one processor 304 operatively connected to a memory storage device 306 that stores instructions 308 for operating the corresponding augmented reality-enabled cable identification system 300. The at least one processor 304 can be a general-purpose processor, a microprocessor, or any conventional processor, controller, microcontroller, or state machine. The processor 304 also may be implemented as a combination of computing devices, such as a combination of a digital signal processor (DSP) and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. The memory storage device 306 can comprise one or more types of transitory and/or non-transitory machine-readable memory, including but not limited to, RAM, ROM, EPROM, EEPROM, optical disk storage, magnetic disk storage or other magnetic storage devices, solid-state storage devices such as flash-based solid-state storage devices (e.g., SD card, micro-SD card, SSD, USB flash drives, etc.), or any other medium that can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose, a special purpose, or other machine with a processor. As used herein, machine-executable instructions 308 can include, for example, instructions 308 and data 310 which cause a general-purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.

[0041] In embodiments, the cable identification controller 302 can further include an interface bus 312 that comprises one or more components to facilitate communication with peripheral / external devices. For example, and without limitation, the interface bus 312 can include a wired and/or wireless communications interface that allows the cable identification controller 302 to send and receive information from the one or more peripheral devices using various communication protocols, such as Wi-Fi, Bluetooth, cellular data networks (e.g., 3G, 4G, 5G, LTE, etc.), or the like. In particular embodiments, the interface bus 312 can include a communications interface that facilitates wireless communication over a private, dedicated spectrum, such as within the 1.4 GHz band.

[0042] More specifically, the interface bus 312 can facilitate communication with one or more devices, such as an augmented-reality device 314 (or a display of the augmented-reality device), a patient monitoring system 316, an associated camera 318, an external computer 320, a remote server 322, and the like. In embodiments, the peripheral devices 320, 322 can include one or more servers, cloud computing devices, backend servers, mobile devices, smartphones, other mobile phones, tablet computers, wearable computing devices, desktop computers, laptop computers, display screens, one or more user input devices such as a keyboard, touchpad, touch screen, mouse or other pointing device, scroll wheel, click wheel, dial, button, stylus, switch, keypad, microphone, camera, and/or combinations thereof. In specific embodiments, the peripheral device 322 can include an optical features database storing optical feature information on a plurality of known medical cables.

[0043] In embodiments, the patient monitoring system 316 can comprise at least a patient monitor having a display for displaying the signals measured by the sensors, wherein the patient monitor is operatively connected to the patient via a plurality of medical cables. The display can incorporate various image generating technologies, such as a liquid crystal display (LCD), light-emitting diode (LED) including organic light-emitting diodes (OLED), projection system, cathode ray tube (CRT), or the like, together with supporting electronics (e.g., digital-to-analog or analog-to-digital converters, signal processors, or the like). A device such as a touchscreen that functions as both input and output device can also be used.

[0044] In embodiments, the interface bus 312 of the cable identification controller 302 can further facilitate wireless communication between one or more of the devices 314, 316, 318, 320, 322 themselves. For example, the interface bus 312 of the cable identification controller 302 can facilitate the wireless communication between the cable identification controller 302 and the patient monitoring system 316 and/or the augmented reality device 314.

[0045] In particular embodiments, the augmented reality-enabled cable identification system 300 can include the memory 306 storing instructions 308 that, when executed by at least one processor 304, perform one or more of the steps of the methods described herein. In aspects, the instructions 308 can include one or more stored program components, such as an input component 326, an analysis component 328, and an augmented-reality (“AR”) component 330. These components may be incorporated into, loaded from, loaded onto, or otherwise operatively available to and from the controller 302.

[0046] The input component 326 can be a stored program component that is executed by at least one processor, such as the at least one processor 304. In particular, the input component 326 can facilitate input/output communication between the controller 302 and various devices 314, 316, 318, 320, 322, such as by sending and receiving input from one or more of such devices. For example, the input component 326 can be configured to receive and/or retrieve video data 332, touch data 334, optical features 336, cable parameters 338, AR outputs 340, and other data / information, such as functional parameters from a patient monitor.

[0047] The analysis component 328 can be a stored program component that is executed by at least one processor, such as the at least one processor 304. In particular, the analysis component 328 can analyze a plurality of medical cables captured in the video data 332. In aspects, the analysis component 328 may extract optical features 336 associated with one or more cables captured in the video data 332, compare the extracted optical features 336 with known / pre-defined optical feature information, identify cable parameters 338 for the different cables, determine the functionality and/or intended use of the cables, and determine the cable pathways, including obstructed and unobstructed cable pathways, as discussed above with respect to the methods 100.

[0048] The augmented-reality (“AR”) component 330 can be a stored program component that is executed by at least one processor, such as the at least one processor 304. Based on the analysis performed using the analysis component 328, the AR component 330 can generate an AR output 340 to transmit to an AR-enabled device (such as AR device 314). As discussed above, the AR output 340 can include one or more visual indicators to be superimposed over a view of the cables, including sequential directions for handling one or more of the medical cables (e.g., untangling the cables, etc.). In aspects, the AR output 340 can further include one or more of a persistent highlight color emphasizing one or more cables, an intermittent or flashing highlight color for emphasizing one or more cables, a cable-related message, one or more arrow indicators, and the like.
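Putting the components of FIG. 3 together, one possible, purely illustrative arrangement is sketched below: a controller passes each frame through input, analysis, and AR components in sequence. The method names and component interfaces are assumptions for the sketch, not the actual implementation of the disclosure.

class CableIdentificationController:
    """Minimal skeleton mirroring the controller of FIG. 3: the 'instructions'
    are ordinary methods and the program components are injected objects."""

    def __init__(self, input_component, analysis_component, ar_component):
        self.input = input_component        # e.g., input component 326
        self.analysis = analysis_component  # e.g., analysis component 328
        self.ar = ar_component              # e.g., AR component 330

    def process_frame(self, frame, functional_parameters):
        # Receive video data, derive touch data, analyze the cables,
        # then generate and transmit an AR output for the current frame.
        video_data = self.input.receive_video(frame)
        touch_data = self.analysis.detect_touch(video_data)
        cable_info = self.analysis.analyze_cables(video_data, touch_data,
                                                  functional_parameters)
        ar_output = self.ar.generate(cable_info)
        self.input.transmit(ar_output)
        return ar_output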

[0049] In further embodiments, the augmented reality-enabled cable identification system 300 can include the cable identification controller 302. In still further embodiments, the augmented reality-enabled cable identification system 300 can include one or more of: the optical features database 322; the associated augmented-reality device 314; the patient monitoring system 316 and/or the plurality of medical cables; and/or the camera 318. In aspects, the augmented-reality device 314 itself includes one or more of: the camera 318; a display; and the cable identification controller 302 comprising a processor 304 and the memory 306.

[0050] It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the inventive subject matter disclosed herein. In particular, all combinations of claimed subject matter appearing at the end of this disclosure are contemplated as being part of the inventive subject matter disclosed herein. It should also be appreciated that terminology explicitly employed herein that also may appear in any disclosure incorporated by reference should be accorded a meaning most consistent with the particular concepts disclosed herein.

[0051] All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.

[0052] The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”

[0053] The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified.

[0054] As used herein in the specification and in the claims, “or” should be understood to have the same meaning as “and/or” as defined above. For example, when separating items in a list, “or” or “and/or” shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as “only one of” or “exactly one of,” or, when used in the claims, “consisting of,” will refer to the inclusion of exactly one element of a number or list of elements. In general, the term “or” as used herein shall only be interpreted as indicating exclusive alternatives (i.e. “one or the other but not both”) when preceded by terms of exclusivity, such as “either,” “one of,” “only one of,” or “exactly one of.”

[0055] As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified.

[0056] In the claims, as well as in the specification above, all transitional phrases such as “comprising,” “including,” “carrying,” “having,” “containing,” “involving,” “holding,” “composed of,” and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of” shall be closed or semi-closed transitional phrases, respectively.

[0057] It should also be understood that, unless clearly indicated to the contrary, in any methods claimed herein that include more than one step or act, the order of the steps or acts of the method is not necessarily limited to the order in which the steps or acts of the method are recited.

[0058] The above-described examples of the described subject matter can be implemented in any of numerous ways. For example, some aspects can be implemented using hardware, software or a combination thereof. When any aspect is implemented at least in part in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single device or computer or distributed among multiple devices/computers.

[0059] The present disclosure can be implemented as a system, a method, and/or a computer program product at any possible technical detail level of integration. The computer program product can include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.

[0060] The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium can be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium comprises the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.

[0061] Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network can comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.

[0062] Computer readable program instructions for carrying out operations of the present disclosure can be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, configuration data for integrated circuitry, or either source code or object code written in any combination of one or more programming languages, comprising an object oriented programming language such as Smalltalk, C++, or the like, and procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions can execute entirely on the user’s computer, partly on the user’s computer, as a standalone software package, partly on the user’s computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer can be connected to the user's computer through any type of network, comprising a local area network (LAN) or a wide area network (WAN), or the connection can be made to an external computer (for example, through the Internet using an Internet Service Provider). In some examples, electronic circuitry comprising, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) can execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.

[0063] Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to examples of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.

[0064] The computer readable program instructions can be provided to a processor of a general purpose computer, a special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions can also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture comprising instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.

[0065] The computer readable program instructions can also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.

[0066] The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various examples of the present disclosure. In this regard, each block in the flowchart or block diagrams can represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the blocks can occur out of the order noted in the Figures. For example, two blocks shown in succession can, in fact, be executed substantially concurrently, or the blocks can sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.

[0067] Other implementations are within the scope of the following claims and other claims to which the applicant can be entitled.

[0068] While several inventive embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the inventive embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the inventive teachings is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific inventive embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed. Inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the inventive scope of the present disclosure.




 