

Title:
PERIPHERAL DEVICE SELECTION ACCORDING TO A PARAMETER DETERMINED BASED ON A CAPTURED IMAGE
Document Type and Number:
WIPO Patent Application WO/2023/285109
Kind Code:
A1
Abstract:
A system is proposed for switching peripheral devices to a computing device selected from a plurality of computing devices, based on an analysis of the user's gaze or head position in pictures captured by a camera and on a selection command sent to a connection device that handles the connections between computing devices and peripheral devices. The system comprises a selection method, devices for implementing the selection method, a connection device to perform the selection and a selection command.

Inventors:
DEFRANCE SERGE (FR)
THIEBAUD SYLVAIN (FR)
MORIN THOMAS (FR)
Application Number:
PCT/EP2022/067341
Publication Date:
January 19, 2023
Filing Date:
June 24, 2022
Assignee:
INTERDIGITAL CE PATENT HOLDINGS SAS (FR)
International Classes:
G06F3/01
Foreign References:
US20080024433A12008-01-31
Attorney, Agent or Firm:
INTERDIGITAL (FR)
Claims:
CLAIMS

1. A method comprising:

- capturing, by one of a plurality of devices, an image of a user,

- determining a device to be selected amongst the plurality of devices based on a parameter representative of the image captured by the one of a plurality of devices, and

- responsively sending a selection command to select the determined device.

2. The method of claim 1 wherein the parameter is an angle of vision, wherein the analysis of the captured image uses an eye-tracking algorithm to determine the angle of vision and wherein a range of angles of vision is associated with each device of the plurality of devices, further comprising selecting the device amongst the plurality of devices for which the determined angle of vision falls within the associated range of angles of vision.

3. The method of claim 1 wherein the parameter is an angular position of the head, wherein the analysis of the captured image uses a head pose algorithm to determine the angular position of the head and wherein a range of angular positions of the head is associated with each device of the plurality of devices, further comprising selecting the device amongst the plurality of devices for which the determined angular position of the head falls within the associated range of angular positions of the head.

4. The method of claim 1 wherein the parameter identifies a collection of labelled pictures corresponding to the captured image and is determined based on machine learning using a collection of labelled pictures for each device of the plurality of devices, further comprising selecting the device associated with the collection of labelled pictures.

5. The method of any of claims 1 to 4, further comprising a setup phase comprising:

- displaying a learning sequence on a screen or plurality of screens associated with a device chosen amongst the plurality of devices,

- capturing images of the user; and

- associating the captured images with the chosen device.

6. The method of claim 5 wherein the setup phase is triggered responsive to a user request.

7. The method of any of claims 1 to 6, wherein a recalibration phase is triggered responsive to a user request.

8. A device comprising a processor configured to:

- obtain a capture of an image of a user, captured by one of a plurality of devices,

- determine a device to be selected amongst the plurality of devices based on a parameter representative of the image captured by the one of a plurality of devices, and

- responsively send a selection command to select the determined device.

9. The device of claim 8 wherein the parameter is an angle of vision, wherein the analysis of the captured image uses an eye-tracking algorithm to determine the angle of vision and wherein a range of angles of vision is associated with each device of the plurality of devices, further comprising selecting the device amongst the plurality of devices for which the determined angle of vision falls within the associated range of angles of vision.

10. The device of claim 8 wherein the parameter is an angular position of the head, wherein the analysis of the captured image uses a head pose algorithm to determine the angular position of the head and wherein a range of angular positions of the head is associated with each device of the plurality of devices, further comprising selecting the device amongst the plurality of devices for which the determined angular position of the head falls within the associated range of angular positions of the head.

11. The device of claim 8 wherein the parameter identifies a collection of labelled pictures corresponding to the captured image and is determined based on machine learning using a collection of labelled pictures for each device of the plurality of devices, further comprising selecting the device associated with the collection of labelled pictures.

12. The device of any of claims 8 to 11, further comprising a setup phase comprising:

- displaying a learning sequence on a screen or plurality of screens associated with a device chosen amongst the plurality of devices,

- capturing images of the user; and

- associating the captured images with the chosen device.

13. The device of claim 12 wherein the setup phase is triggered responsive to a user request.

14. The device of any of claims 8 to 13, wherein a recalibration phase is triggered responsive to a user request.

15. A connection device comprising:

- a first plurality of interface connections,

- a second plurality of interface connections,

- a selector module configured to receive a selection command on at least one of the interface connections of the first plurality of interface connections, and to responsively connect a selected interface connection of the first plurality of interface connections to the second plurality of interface connections.

16. The connection device of claim 15 wherein the choice of the selected interface connection is determined by the interface connection on which the command is received.

17. The connection device of claim 15 wherein the choice of the selected interface connection is determined by a parameter of the received command.

18. The connection device of claim 16 or 17 wherein the interface connections are compliant with the USB specification and wherein the connection device further comprises a plurality of USB devices associated with the first plurality of interface connections and at least one USB hub associated with the second plurality of interface connections.

19. The connection device of claim 18 wherein the connection is done at the physical layer of the USB protocol.

20. The connection device of claim 18 wherein the connection is done at the protocol layer of the USB protocol.

21. A command for selecting an interface connection of a plurality of interface connections in a connection device according to any of claims 15 to 20, the command comprising a header and a command identifier.

22. The command of claim 21 further comprising information representative of the interface connection to be selected.

23. The command of claim 21 or 22, wherein the command is compliant with the USB protocol.

24. A system for handling connections between a plurality of devices and at least one peripheral device, the system comprising:

- at least two devices according to any one of claims 8 to 14;

- at least one peripheral device;

- a connection device according to any one of claims 15 to 20; wherein at least one of the devices operates the method according to any one of claims 1 to 6 and transmits to the connection device a command according to any one of claims 21 to 23.

25. A computer program product comprising instructions for, when executed on a processor:

- capturing, by one of a plurality of devices, an image of a user,

- determining a device to be selected amongst the plurality of devices based on a parameter representative of the image captured by the one of a plurality of devices, and

- responsively sending a selection command to select the determined device.

26. A non-transitory computer-readable medium comprising instructions for, when executed on a processor:

- capturing, by one of a plurality of devices, an image of a user,

- determining a device to be selected amongst the plurality of devices based on a parameter representative of the image captured by the one of a plurality of devices, and

- responsively sending a selection command to select the determined device.

Description:
PERIPHERAL DEVICE SELECTION ACCORDING TO A PARAMETER DETERMINED BASED ON A CAPTURED IMAGE

1. Field of the disclosure

The present invention relates generally to connections between devices. More specifically, the invention relates to connecting peripheral devices such as a USB mouse, USB keyboard or USB headphone to a computing device selected from a plurality of computing devices based on an analysis of the user's gaze, head pose detection or machine-learning-based techniques.

2. Technical background

USB switches are designed to share USB devices such as a keyboard or mouse between multiple computing devices operated by a single user, such as desktop computers, laptops, tablets, smartphones or any other devices with similar connection capabilities. This is convenient since it allows a single user to share a single keyboard and/or mouse (or any other USB-connected device such as a headphone or a smartcard reader) between two or more computing devices and therefore save space on the desk. However, existing solutions require the user to manually switch from one computing device to the other. This manual switch operation is not convenient when the user wants to switch frequently and on the fly between computing devices.

Conventional operating systems already include a mechanism to handle multiple screens connected to a single computing device. In a configuration phase, the user indicates the relative positions of the multiple screens. When in operation, the system detects when the coordinates of the mouse cross the border of one screen and in this case moves the cursor to the next screen, allowing the user to 'switch' automatically from one screen to the other. However, this technique applies only to a single computer.

Embodiments described hereafter have been designed with the foregoing in mind.

3. Summary

The present disclosure proposes a new and inventive solution for switching peripheral devices to a computing device selected from a plurality of computing devices, based on an analysis of the user's gaze or head pose in pictures captured by a camera and on a selection command sent to a connection device that handles the connections between computing devices and peripheral devices.

A first aspect of the present disclosure relates to a method comprising capturing, by one of a plurality of devices, an image of a user, determining a device to be selected amongst the plurality of devices based on a parameter representative of the image captured by the one of a plurality of devices, and responsively sending a selection command to select the determined device.

A second aspect of the present disclosure relates to a device comprising a processor configured to obtain a capture of an image of a user, captured by one of a plurality of devices, determine a device to be selected amongst the plurality of devices based on a parameter representative of the image captured by the one of a plurality of devices, and responsively send a selection command to select the determined device.

A third aspect of the present disclosure relates to a device comprising a first plurality of interface connections, a second plurality of interface connections, and a selector module configured to receive a selection command on one of the interface connections of the first plurality of interface connections, and to responsively connect a selected interface connection of the first plurality of interface connections to the second plurality of interface connections.

A fourth aspect of the present disclosure relates to a command for selecting an interface connection of a plurality of interface connections in a connection device according to the third aspect.

A fifth aspect of the present disclosure relates to a system for handling connections between a plurality of computing devices and a plurality of peripheral devices, the system comprising at least two computing devices according to the second aspect, at least one peripheral device, and a connection device according to the third aspect, wherein at least one of the computing devices operates the method according to the first aspect and transmits to the connection device a command according to the fourth aspect.

A sixth aspect relates to a computer program product comprising program code instructions for implementing the method according to the first aspect or any variant embodiment of the first aspect, when said program is executed on a computer or a processor.

A seventh aspect relates to a non-transitory computer-readable storage medium storing the program code instructions for implementing the method according to the first aspect or any variant embodiment of the first aspect.

In a variant embodiment of the first and second aspects, the parameter is an angle of vision, the analysis of the captured image uses an eye-tracking algorithm to determine the angle of vision, a range of angles of vision is associated with each device of the plurality of devices, and the variant further comprises selecting the device amongst the plurality of devices for which the determined angle of vision falls within its associated range of angles of vision.

In another variant embodiment of the first and second aspects, the parameter is an angular position of the head, the analysis of the captured image uses a head pose algorithm to determine the angular position of the head, a range of angular positions of the head is associated with each device of the plurality of devices, and the variant further comprises selecting the device amongst the plurality of devices for which the determined angular position of the head falls within its associated range of angular positions of the head.

In another variant embodiment of the first and second aspects, the analysis of the captured image is based on machine learning using a collection of labelled pictures for each device of the plurality of devices and the parameter identifies the collection of labelled pictures corresponding to the captured image; the variant further comprises selecting the device associated with that collection of labelled pictures.

Variant embodiments of the first and second aspects further comprise a setup phase comprising displaying a learning sequence on a screen or plurality of screens associated with a device chosen amongst the plurality of devices, capturing images of the user, and associating the captured images with the chosen device.

4. Brief description of the drawings

The invention can be better understood with reference to the following description and drawings, given by way of example and not limiting the scope of protection, and in which:

Figure 1 illustrates an example of layout according to at least one embodiment;

Figure 2A illustrates an example of system according to at least one embodiment based on a single camera and single decision process;

Figure 2B illustrates an example of system according to at least one embodiment based on multiple cameras and multiple decision processes;

Figure 3 illustrates a flowchart for handling the computing device selection according to at least one embodiment;

Figure 4 illustrates a flowchart for the setup phase according to at least one embodiment;

Figure 5 illustrates an example of conventional USB topology;

Figure 6 illustrates a simplified example of conventional USB functional layers;

Figure 7 illustrates an example of connection device according to at least one embodiment based on the USB specification;

Figure 8 illustrates an example of architecture of a connection device according to at least one embodiment based on the USB specification where the connection is done at low level layer;

Figure 9 illustrates an example of architecture of a connection device according to at least one embodiment based on the USB specification where the connection is done at high level layer;

Figure 10A illustrates an example of USB selection command according to an embodiment based on a single detection process;

Figure 10B illustrates an example of USB selection command according to an embodiment based on multiple detection processes;

Figure 11 illustrates a flowchart for a continuous learning process according to at least one embodiment; and

Figure 12 illustrates an example of computing device implementing a method for device selection according to at least one embodiment.

5. Description of embodiments

While example embodiments are capable of various modifications and alternative forms, embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that there is no intent to limit example embodiments to the particular forms disclosed, but on the contrary, example embodiments are to cover all modifications, equivalents, and alternatives falling within the scope of the claims. Like numbers refer to like elements throughout the description of the figures.

Before discussing example embodiments in more detail, it is noted that some example embodiments are described as processes or methods depicted as flowcharts. Although the flowcharts describe the operations as sequential processes, many of the operations may be performed in parallel, concurrently or simultaneously. In addition, the order of operations may be re-arranged. The processes may be terminated when their operations are completed, but may also have additional steps not included in the figures. The processes may correspond to methods, functions, procedures, subroutines, subprograms, etc.

Methods discussed below, some of which are illustrated by the flow charts, may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware or microcode, the program code or code segments to perform the necessary tasks may be stored in a machine or computer readable medium such as a storage medium. A processor(s) may perform the necessary tasks. Specific structural and functional details disclosed herein are merely representative for purposes of describing example embodiments of the present invention. This invention may, however, be embodied in many alternate forms and should not be construed as limited to only the embodiments set forth herein.

Figure 1 illustrates an example of layout according to at least one embodiment. The user sits in front of two computing devices (PC1 210 and PC2 220 in figure 2), each of them connected to two screens. Both computing devices are connected to a connection device to which peripheral devices (231 to 233 in figure 2) are connected. Various types of peripheral devices may be connected to the connection device: input devices such as a keyboard or a mouse, output devices such as a headphone or haptic actuators, or more complex devices such as a smartcard reader, printer, or scanner.

The computing device PC1 is also connected to a camera (213 in figure 2) and runs a decision algorithm (process 300 in figure 3) that determines, based on pictures captured by the camera, which screen the user is currently watching, i.e. which computing device is currently used. This decision is based on a user tracking algorithm, for example using conventional eye-tracking or head-tracking image processing algorithms, or on a machine learning process based on labelled pictures. Such algorithms are able to determine the viewing direction currently aimed at by the user (in other words, an angular position of the user) or an angular position of the user's head.

Other head-tracking mechanisms such as a positioning system attached to the user’s head may also be used. Although such a solution seems more intrusive than the eye or head tracking solution, a positioning system may be integrated in a headphone that the user is wearing and thus would be completely transparent for the user.

The user tracking algorithm determines a parameter representative of the screen currently watched by the user. This parameter is for example an angular position representing the user's attention relative to the screens. The parameter is determined, for example, by analyzing pictures captured by the camera using an eye gaze or head pose/orientation algorithm. The system is set up with relative positioning parameters, and more particularly the angular position that corresponds to the boundary between the angular viewing sectors associated with each computing device. This boundary is represented in the figure by the line A. Therefore, any detected angle at the left of line A according to the user position (or greater than the angle of line A according to the camera direction) is considered to be related to computing device PC1. Similarly, any detected angle at the right of line A according to the user position (or smaller than the angle of line A according to the camera direction) is considered to be related to computing device PC2.

When the decision algorithm determines that the user is watching a screen of the computing device PC2, the computing device PC1 generates a selection command to the connection device instructing it to switch the peripheral devices to the computing device PC2. Such a command is issued for each change of angular viewing sector and also when starting up the devices. A default configuration of the connection may also be set up.

The angles B and C of the figure are related to a hysteresis value that prevents oscillation between the two computing devices when the user's gaze is near a boundary and thus when the measured gaze angle is close to the threshold (i.e. close to the line A in figure 1). When the previous result is PC1, the measured gaze angle should be higher than the threshold plus the hysteresis value to classify the result as PC2. Conversely, when the previous result is PC2, the measured gaze angle should be lower than the threshold minus the hysteresis value.
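
This switching rule with hysteresis can be summarized in a few lines. The following is a minimal Python sketch, not part of the disclosed embodiments: it assumes the gaze angle has already been extracted from a captured picture, follows the sign convention of the preceding paragraph, and uses illustrative names throughout.

```python
# Minimal sketch of the hysteresis rule around line A (illustrative names).
# Per the paragraph above, an angle above threshold + hysteresis switches the
# result to PC2, and an angle below threshold - hysteresis switches it back
# to PC1; flip the comparisons if the camera orientation reverses them.

def classify_gaze(angle: float, previous: str,
                  threshold: float, hysteresis: float) -> str:
    """Return 'PC1' or 'PC2' for a measured gaze angle, with hysteresis."""
    if previous == "PC1":
        return "PC2" if angle > threshold + hysteresis else "PC1"
    # previous == "PC2"
    return "PC1" if angle < threshold - hysteresis else "PC2"
```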

In another embodiment, the selection is based on a machine learning process using sets of labelled pictures for the devices of the plurality of devices, without actually having to determine an angular parameter. In this case, the parameter resulting from the selection process designates the set of labelled pictures to which the captured picture belongs (or is most similar), and thus the device to be selected.

Other configurations may use more than two computing devices and thus would lead to multiple boundaries for the angular viewing sectors. However, the principles would still apply. In other configurations, more than one computing device may be connected to a camera and thus multiple decision algorithms may be running concurrently on multiple computing devices.

Figure 2A illustrates an example of system 200A according to at least one embodiment based on a single camera and a single decision process. This setup corresponds to the example of layout of figure 1. The laptop 210 corresponds to computing device PC1 of figure 1. It integrates a screen 211 and a camera 213 and is connected to an external screen 212. The desktop computer 220 corresponds to computing device PC2 of figure 1. It is connected to two external screens 221 and 222. Both computers 210 and 220 are connected to a connection device 201. Peripherals 231, 232 and 233 are also connected to the connection device 201. The decision process 300 of figure 3, which comprises the decision algorithm mentioned above, is executed on the laptop 210, which then sends the appropriate selection command to the connection device when required. In such an embodiment, only a single decision algorithm is running and the selection command comprises an identification of the computing device to be selected (laptop 210 or desktop 220).

Figure 2B illustrates an example of system 200B according to at least one embodiment based on multiple cameras and multiple decision processes. This setup differs from the setup of figure 2A by the additional camera 223 connected to the desktop computer 220. In such an embodiment, two instances of decision process 300 of figure 3 run concurrently on the laptop computer 210 and on the desktop computer 220. Each of the computing devices may issue a selection command when required.

Although the cameras are also considered peripheral devices, their connection to the computing devices is direct. In other words, these devices are not connected through the connection device 201. This provides the required input images to the decision process by allowing images to be captured at any time, independently of the peripheral selection.

In a variant embodiment, the identification of the computing device to be selected is not needed in the selection command, since the connection device is able to determine the identification of the computing device from the source of the selection command. The system has to be set up specifically to allow this operating mode.

Figure 3 illustrates a flowchart of a selection process according to at least one embodiment. This selection process 300 is executed for example on device 210 of figure 2B or on device 1200 of figure 12. At the startup of the device, a first selection command is sent to the connection device 201 instructing it to select, in step 310, a first device, for example the device 210. This first selection is arbitrary or may be predetermined. The information about the selected device is stored as being the current device. Then the process 300 iterates over the remaining steps. In step 320, the device 210 executes the detection algorithm to determine which of the computing devices is currently being gazed at by the user. In step 330, if the detected device is not different from the current device (branch "no"), no change of gaze has occurred and the process loops back to the detection step 320. If the detected device is different from the current device (branch "yes"), the user's gaze has moved from one device to another. In this case, in step 340, a new selection command is sent to the connection device 201 instructing it to select the newly detected device. The information about the selected device is stored as the current device.
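
As a rough illustration, the loop of process 300 could be sketched as follows in Python. The callables `detect_device` and `send_selection_command` are placeholders for steps 320 and 340 (image capture plus classification, and the USB selection command); they are assumptions for the sketch, not part of the disclosure.

```python
import time

def selection_process(detect_device, send_selection_command,
                      default_device="PC1", period_s=0.5):
    """Illustrative sketch of selection process 300 (names are placeholders)."""
    current = default_device
    send_selection_command(current)           # step 310: initial selection
    while True:
        detected = detect_device()            # step 320: device being gazed at
        if detected != current:               # step 330: gaze changed?
            send_selection_command(detected)  # step 340: switch peripherals
            current = detected                # store the new current device
        time.sleep(period_s)                  # pictures are analyzed periodically
```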

Figure 4 illustrates a flowchart for the setup phase according to at least one embodiment. This process 400 is executed for example on the device 210 of figure 2B or on device 1200 of figure 12. Indeed, before using the process 300 of figure 3, the system first has to be calibrated for the user tracking algorithm to operate, or trained if using a classification system based on a neural network for instance. The setup process 400 aims at capturing a sufficient number of pictures of the user's eye gaze or head orientation in front of his/her computing devices, where the pictures are labelled with the associated device (i.e. the computing device the user wants to use when having his/her eyes/head in such a position). The labelled captured pictures are used as calibration data to determine the appropriate threshold to correctly discriminate cases where the user looks at one or the other computing device. If we consider the example of the classification based on the user's gaze direction, the threshold is the angle of the axis separating the two angular sectors as perceived by the camera capturing the pictures, shown in figure 1 as angle A.

In addition, a hysteresis value may also be defined to avoid a classification oscillating between two computing devices when the measured gaze angle is close to the threshold. Referring back to figure 1, when the previous result is PC1, the measured gaze angle should be higher than the threshold plus the hysteresis value to classify the result as PC2. Conversely, when the previous result is PC2, the measured gaze angle should be lower than the threshold minus the hysteresis value.

When considering using a neural network (or any other classification system based on artificial intelligence), the setup phase produces a training set which is then used to train a recognition model running on a computing device. The recognition model then classifies pictures captured by the camera into two classes: the user looks at PC1, or the user looks at PC2 (still referring to figure 1). In the selection phase, the neural network thus classifies the captured images as corresponding to one of the computing devices (as learned during the setup phase). This classification implements the detection step 320 of the process 300.
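
The disclosure does not mandate a particular model; purely as a stand-in for the trained recognition model, a minimal nearest-centroid classifier over feature vectors extracted from the captured pictures could look as follows. This is a simplification for illustration only; a real system would typically use a trained neural network as described above.

```python
import numpy as np

class NearestCentroidGazeClassifier:
    """Toy stand-in for the recognition model: classifies a feature vector
    extracted from a captured picture as 'PC1' or 'PC2' by distance to the
    mean feature vector of each class collected during the setup phase."""

    def fit(self, features, labels):
        # features: (n_samples, n_dims) array from setup-phase pictures;
        # labels: 'PC1' / 'PC2' strings assigned during the learning sequence.
        features, labels = np.asarray(features), np.asarray(labels)
        self.centroids_ = {lbl: features[labels == lbl].mean(axis=0)
                           for lbl in np.unique(labels)}
        return self

    def predict(self, feature):
        # Return the label whose centroid is closest to the given feature.
        feature = np.asarray(feature)
        return min(self.centroids_,
                   key=lambda lbl: np.linalg.norm(feature - self.centroids_[lbl]))
```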

The setup phase can be triggered when the system runs for the first time, or when the user requests a recalibration, for example by pressing a button on the connection device or via a user interface of the system on one of the computing devices. When two computing devices are used, two series of pictures are captured in the setup phase: the first series concerns the first device and the second series the second device. In the first series, the focus of the user is supposed to be on the first computing device. It corresponds to the case where the connection device should connect the peripherals (keyboard, mouse) to the first computing device. The second series is for the second PC.

In order to generate pictures where the user focuses on the screens associated with a given computing device, the computing device displays a learning sequence (animation or video) on the screen, in step 410, preferably designed so that the gaze of the user (for eye-tracking or head pose) has to scan the whole surface of the screen. If the computing device is connected to multiple screens, then the learning sequence should cover the multiple screens. One example of learning sequence is an animation representing a ball which the user has to follow with the eyes and which crisscrosses the whole surface of the screen (or screens). During the animation of the learning sequence, the computing device records pictures of the user. These pictures are taken while the user is interacting with the computing device playing the learning sequence and thus represent the different positions of the user (with regard to eye gaze and/or head pose) that correspond to the use of this computing device. At the end of the animation, the user is asked to validate the recording, in step 420. If the user considers the recording as not satisfactory, he or she may ask for a new one in step 430. This learning phase is iterated for each of the computing devices.
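
The capture loop of setup process 400 could be sketched as follows; every callable here (`play_learning_sequence`, `capture_image`, `ask_validation`) is a hypothetical placeholder standing in for the animation playback, camera capture and user validation described above.

```python
def run_setup_phase(devices, play_learning_sequence, capture_image, ask_validation):
    """Hypothetical sketch of setup process 400; callables are placeholders."""
    training_set = []
    for device in devices:                        # e.g. ["PC1", "PC2"]
        while True:
            frames = []
            for _ in play_learning_sequence(device):  # step 410: ball animation
                frames.append(capture_image())        # record the user's pose
            if ask_validation(device):                # step 420: user approves?
                training_set += [(f, device) for f in frames]
                break
            # step 430: recording judged unsatisfactory, redo this device
    return training_set
```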

The same type of recording is performed for the second PC. Once both recordings are performed, the system is trained with the collected data and can run in normal operation, i.e. executing the process 300 of figure 3.

In at least one embodiment of the process 300 of figure 3, the step 330 of figure 3 may comprise a buffer mechanism to increase the reliability of the detection. Indeed, the detection is considered as changed only after a series of pictures classified differently from the former classification. The length of this series is determined as a tradeoff between the reaction time of the system (the shorter the series, the faster the system reacts) and the reliability of the decision to switch from one computing device to the other (the longer, the more reliable). For example, a picture analysis period of 500 ms (i.e., analysis of two pictures per second) and a series length of 4 would lead to a reaction time of two seconds. When such a series is detected, the connection device selects the newly detected computing device.
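
A possible shape for this buffer mechanism, as a minimal Python sketch (class and parameter names are illustrative): the selection changes only once `series_len` consecutive pictures have been classified as the same other device.

```python
from collections import deque

class DetectionDebouncer:
    """Sketch of the buffer mechanism of step 330: switch only after
    `series_len` consecutive classifications differ from the current device."""

    def __init__(self, initial, series_len=4):
        self.current = initial
        self.recent = deque(maxlen=series_len)

    def update(self, detected):
        """Feed one per-picture classification; return the stable selection."""
        self.recent.append(detected)
        full = len(self.recent) == self.recent.maxlen
        if full and all(d == self.recent[0] != self.current for d in self.recent):
            self.current = self.recent[0]   # the whole series agrees: switch
            self.recent.clear()
        return self.current
```

With a 500 ms analysis period and series_len=4, this reproduces the two-second reaction time of the example above.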

In order not to overload the computing capabilities of the PC, pictures are not analyzed continuously. However, they are analyzed sufficiently frequently to ensure that the system has a satisfactory reaction time, for example one or two seconds.

The connections between the devices preferably conform to a standard to ensure interoperability. Different standards may be used, such as the USB standard. The USB standard defines the physical and electrical characteristics of the interface and a communication protocol allowing the connected devices to interact, i.e. to transmit or receive commands and data.

Figure 5 illustrates an example of conventional USB topology. Indeed, in at least one embodiment, the connections between the computing devices and the peripheral devices are based on the USB specification. USB is a bus with a master/slave topology. The master is unique and is named 'host' in USB terminology. It is typically implemented by a computing device. The slaves are conventionally peripheral devices, such as a keyboard, mouse, headphones, hard disk drives, or readers for various removable media such as smartcards, memory cards and optical disks. Slave devices (520 to 524 in the figure) are named 'device' in USB terminology. In addition, a USB bus can be split into several physical connections through a 'hub'. A hub typically increases the number of available connections. In the figure, the first hub 510 connects the host 500 to devices 520 and 521 and to a second hub 511 that connects to devices 522 to 524.

Figure 6 illustrates a simplified example of conventional USB functional layers. It represents the different layers of a USB host 601, a USB hub 602 and a USB device 603. Above the physical layer, the link layer handles packets and frame delimiters. The protocol layer handles transactions and data packets. The hub only terminates the link and physical layers. The protocol layer is terminated only in the host. Packets have to be forwarded up to the host where transactions are handled. In the most recent version of USB, a functional layer will also be provided.

However, when a connection device 201 is based on the USB protocol, the architecture of figure 6 needs to be adapted as further described with reference to figure 7.

Figure 7 illustrates an example of connection device according to at least one embodiment based on the USB specification. The connection device 201 illustrated in the figure is a two-to-two device, meaning that it is able to connect two computing devices and two peripherals. Any other number of connections is easily conceivable and would use the same principles. The connection device 201 comprises the connection interfaces 701, 702, 711, 712. The connection interfaces 701 and 702 are connected to the computing devices (for example respectively to devices 210 and 220 of figure 2A) while the connection interfaces 711 and 712 are connected to the peripheral devices (for example respectively to devices 231 and 232 of figure 2A). The connection device 201 also comprises a selector module 111 that is configured to establish the physical connection between one of the computing devices (the "selected" one) and the peripheral devices. The connection is done according to a selection command that is received from one of the computing devices through one of the connection interfaces 701 or 702. More precisely, the connection is done between one of the USB device entities 771 or 773, respectively corresponding to the connection interface 701 for USB#1 and 702 for USB#2, and the USB hub entity 775 corresponding to the connection interfaces 711 for USB#3 and 712 for USB#4, according to one of the architectures described in figures 8 or 9.

In order to perform the selection, the selector module 111 analyzes the incoming communications to detect a selection command and performs the corresponding selection. The selection command uses for example a USB data packet.

The configuration of the connection device is therefore different from a conventional USB switch. The reason is that the connection device is connected to two different USB lines driven by two different hosts: the USB bus driven by PC#1 and the USB bus driven by PC#2. Instead of having one single upstream port like a conventional hub, it has two upstream ports.

From a functional point of view, in at least one embodiment, the connection is done at the physical (low level) layer while in a second embodiment, the connection is done at the protocol or application (high level) layer.

Figure 8 illustrates an example of architecture of a connection device according to at least one embodiment based on the USB specification where the connection is done at the low level layer. In such an embodiment, with reference to figure 7 for the USB connections between the devices, the connection device alternatively connects USB#3 and USB#4 to USB#1 (upstream port to PC#1) or to USB#2 (upstream port to PC#2). The decision whether to connect USB#3 and USB#4 to USB#1 or to USB#2 is taken by the selector module upon reception of a USB command as described herein. The USB command is generated by the detection process operated on one of the PCs, for example PC#1, which communicates its decision to the selector module. Both the detection process and the selector module are applications which use USB-supported data communication. This communication is functionally located above the USB protocol, using USB data packets to carry the information.

The selector module controls the effective connection of USB#3 and USB#4 to either USB#1 or USB#2. In this embodiment, the connection is done at the lowest level. This option is equivalent to physically disconnecting and reconnecting the devices on USB#3 and USB#4 to either USB#1 or USB#2 each time the switch software changes the connection. In this option, the USB switch is a double switch device (one switch device connected on USB#1 and one switch device connected on USB#2), and USB#3 and USB#4 are not autonomous buses, but simple extensions of either USB#1 or USB#2.

Figure 9 illustrates an example of architecture of a connection device according to at least one embodiment based on the USB specification where the connection is done at the high level layer. In such an embodiment, with reference to figure 7 for the USB connections between the devices, USB#3 is an autonomous bus. The connection device, in addition to comprising a double USB device, also comprises a USB host for USB#3. The connection device creates logical devices on both USB#1 and USB#2 for each real device connected on USB#3 and USB#4. A mouse connected on USB#3 thus has a corresponding logical device on both USB#1 and USB#2. However, depending on which computing device is selected by the connection device, the data packets sent by the device on USB#3 will or will not be forwarded on USB#1 or USB#2 respectively. In such an embodiment, a specific subclass may be specified for the high-level connection option.

In at least one embodiment, when a device is connected on USB#3, the connection device reproduces the connection transaction on both USB#1 and USB#2.

In a variant embodiment, the switch may have a differentiated behavior depending on the type of device. If the device is a keyboard or a mouse, further data transactions initiated by the peripheral device will only be reproduced either on USB#1 or USB#2, depending on which computing device is selected. Conversely, if the peripheral device is a printer, it will permanently receive data from both PCs.
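
This differentiated behavior amounts to a small forwarding policy; the following is a hedged sketch where the device classes and port names are illustrative, not taken from the USB specification.

```python
# Hypothetical forwarding policy for the high-level option: input devices
# follow the selected computing device, while a printer stays visible to both.
INPUT_CLASSES = {"keyboard", "mouse"}

def forward_targets(device_class, selected_host):
    """Return the set of upstream ports that should see this peripheral's data,
    e.g. forward_targets("mouse", "USB#1") -> {"USB#1"}."""
    if device_class in INPUT_CLASSES:
        return {selected_host}       # reproduced only on the selected host
    return {"USB#1", "USB#2"}        # e.g. a printer receives from both PCs
```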

For the sake of simplification, the diagrams of figures 7, 8 and 9 are based on an example embodiment where only two computing devices are used. However, a connection device with a higher number of computing devices can be designed based on the same principles.

Figure 10A illustrates an example of USB selection command according to an embodiment based on a single detection process. An example of such a configuration is shown in figure 2A where the detection process 300 is executed on the laptop device 210. In such an embodiment, the selection command comprises an identification corresponding to the selected computing device interface of the connection device 600, for example '1' for the interface 601 and '2' for the interface 602.

In such an embodiment, when the user turns his head from the screen of the laptop connected to the interface 601 to one of the screens of the desktop connected to the interface 602, this event is detected by the detection process 300 running on the laptop. As a result, the laptop sends a selection command on its USB connection to the connection device 600. The connection device 600 detects the selection command and identifies the connection interface to select from the identification value carried in the selection command.

In this case, these identification values must be associated with the corresponding computing devices during the setup phase, for example through a user interface that allows an identification value to be associated with each of the computing devices, in other words, with each of the classifications.

In an implementation based on the USB protocol, such a command could take the following form: a header H, a command C and an identification value ID.

Figure 10B illustrates an example of USB selection command according to an embodiment based on multiple detection processes. An example of such a configuration is shown in figure 2B where the detection process 300 is executed both on the laptop device 210 and on the desktop device 220. In such an embodiment, the selection command does not need any identification corresponding to the selected computing device interface; the selection is determined based on the connection interface on which the command was received.

In such an embodiment, when the user turns his head from the screen of the laptop connected to the interface 601 to one of the screens of the desktop connected to the interface 602, this event is detected by the detection process 300 running on the desktop. As a result, the desktop sends a selection command on its USB connection to the connection device 600. The connection device 600 detects the selection command and identifies the connection interface to select as the one on which the command was received.

This embodiment requires an additional test in the step 330 of the detection process 300 and also in the setup phase. Indeed, in such an embodiment, each of the detection processes operates concurrently and is unaware of the other devices. Thus, instead of handling a value representing one of the computing devices, each process only needs to handle a Boolean value indicating whether the user is looking at the computing device running the process or at another one. In this case, identification values do not need to be associated with the corresponding computing devices during the setup phase.

In an implementation based on the USB protocol, such a command could take the following form: a header H and a command C.
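
To make the two command layouts concrete, here is a hedged sketch of how they could be packed into bytes. The header value, command identifier and field widths are assumptions chosen for illustration; they are not defined by the disclosure or by the USB specification.

```python
import struct

HEADER, CMD_SELECT = 0xA5, 0x01   # illustrative values, not from any spec

def build_command_with_id(interface_id):
    """Figure 10A variant: header H, command C, identification value ID."""
    return struct.pack("BBB", HEADER, CMD_SELECT, interface_id)

def build_command_without_id():
    """Figure 10B variant: header H and command C only; the connection device
    infers the target from the interface on which the command arrives."""
    return struct.pack("BB", HEADER, CMD_SELECT)
```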

In a specific embodiment, the selection is a logical selection and no longer a physical selection. In such an embodiment, the connection device continuously interconnects the peripherals to the plurality of devices and the selection is done directly at the level of the device, by taking into account or completely ignoring the peripheral data according to the result of the detection process. This embodiment does not use the connection device 201 but a 'broadcast'-type connection device that provides the peripheral data to all the connected computing devices. Therefore, the devices can operate independently, performing the selection 'internally' without relying on sending a connection command to an external device.

Figure 11 illustrates a flowchart for a continuous learning process 1100 according to at least one embodiment. Indeed, the system can continuously enrich its calibration measures with new labelled pictures. Misclassifications can easily be detected with the user's assistance. If the user is unhappy with the selection performed (i.e., the current classification), s/he presses a physical button on the connection device to change the selection of the computing device, in step 1110. This action can trigger a collection of new calibration measures. These may be labelled pictures, for instance the last picture or the last few pictures captured before the user pressed the button, for which the user's gaze direction will be computed (assuming that the gaze-direction analysis technique is used to classify pictures).

New calibration measures are used to correct the threshold and hysteresis values used by the classifier, in step 1120. For instance, in the case of using the gaze direction, the threshold angle as well as the hysteresis value may be recomputed and updated. This update may be the consequence of some changes in the user's installation, for instance if the user moves the position of the camera or of the screens. To distinguish between a single, isolated misclassification and a real need to change the calibration, a recalibration threshold is defined. The recalibration threshold triggers the recalibration process in step 1130.

Basically, the recalibration threshold is defined as a certain number of misclassifications triggering the collection of new calibration measures. For instance, a counter counts the number of times that the user presses the button, and therefore triggers a new calibration measure. When the counter reaches a certain value defined as the recalibration threshold, in step 1130, a new calibration process is triggered in step 1140, and the counter is reset.

Another way to define the recalibration threshold is the rate at which the misclassifications occur. If the user changes the relative positions of the screens and/or of the camera, the previously computed calibration values may become wrong, and misclassifications may occur in series. The recalibration process may then be triggered because a certain number of misclassifications occurred in a short period of time. In this case, the counter which triggers the recalibration process measures the rate at which misclassifications occur (and not their total number). Both methods (a threshold based on a count or on the rate) may be combined.
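
Both triggering criteria can be combined in a small helper; the following is a minimal sketch, assuming illustrative threshold values (a count of 5 presses overall, or 3 presses within a 60-second window).

```python
import time

class RecalibrationTrigger:
    """Sketch of the recalibration threshold: trigger on a total count of
    misclassifications or on their rate (both thresholds are illustrative)."""

    def __init__(self, count_threshold=5, rate_window_s=60.0, rate_threshold=3):
        self.count_threshold = count_threshold
        self.rate_window_s = rate_window_s
        self.rate_threshold = rate_threshold
        self.count = 0
        self.presses = []           # timestamps of recent button presses

    def user_pressed_button(self):
        """Record one misclassification; return True if recalibration is due."""
        now = time.monotonic()
        self.count += 1
        self.presses = [t for t in self.presses
                        if now - t < self.rate_window_s] + [now]
        if (self.count >= self.count_threshold
                or len(self.presses) >= self.rate_threshold):
            self.count, self.presses = 0, []   # reset after triggering
            return True
        return False
```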

Figure 12 illustrates an example of computing device implementing a method for device selection according to at least one embodiment. The term computing device is used in this document to differentiate from the peripheral device and is not restricted to devices that aim only at performing computations, but also extends to other devices, for example a television. The device 1200 comprises a processor 1201, memory 1202, a communication interface 1203 and may integrate a display panel 1204.

The processor 1201 is configured to obtain, through the communication interface 1203, an image to be displayed. The image may be stored in the memory 1202 in order to perform the required computations before being provided to the display panel 1204.

The device 1200 includes a processor 1201 configured to execute instructions loaded therein for implementing, for example, the various aspects described in this document such as the selection process 300. The processor may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor can include embedded memory, input/output interfaces, and various other circuitries as known in the art.

The device 1200 includes memory 1202 which can include non-volatile memory and/or volatile memory, including, but not limited to, Electrically Erasable Programmable Read-Only Memory (EEPROM), Read-Only Memory (ROM), Programmable Read-Only Memory (PROM), Random Access Memory (RAM), Dynamic Random Access Memory (DRAM), Static Random Access Memory (SRAM), flash, magnetic disk drive, and/or optical disk drive. The memory can include an internal storage device, an attached storage device (including detachable and non-detachable storage devices), and/or a network accessible storage device, as non-limiting examples.

The device 1200 includes a communication interface 1203 providing input and/or output to the device 1200. Such inputs include, but are not limited to, (i) a radio frequency (RF) portion that receives an RF signal transmitted, for example, over the air by a broadcaster, (ii) a Component (COMP) input terminal (or a set of COMP input terminals), (iii) a Universal Serial Bus (USB) input terminal, and/or (iv) a High Definition Multimedia Interface (HDMI) input terminal, and/or (v) a composite video input.

In various embodiments, the communication interface 1203 has associated respective input processing elements as known in the art. For example, the RF portion can be associated with elements suitable for (i) selecting a desired frequency (also referred to as selecting a signal, or band-limiting a signal to a band of frequencies), (ii) down-converting the selected signal, (iii) band-limiting again to a narrower band of frequencies to select (for example) a signal frequency band which can be referred to as a channel in certain embodiments, (iv) demodulating the down-converted and band-limited signal, (v) performing error correction, and (vi) demultiplexing to select the desired stream of data packets. The RF portion of various embodiments includes one or more elements to perform these functions, for example, frequency selectors, signal selectors, band-limiters, channel selectors, filters, downconverters, demodulators, error correctors, and demultiplexers. The RF portion can include a tuner that performs various of these functions, including, for example, down-converting the received signal to a lower frequency (for example, an intermediate frequency or a near-baseband frequency) or to baseband. In one embodiment, the RF portion and its associated input processing element receives an RF signal transmitted over a wired (for example, cable) medium, and performs frequency selection by filtering, down-converting, and filtering again to a desired frequency band. Various embodiments rearrange the order of the above-described (and other) elements, remove some of these elements, and/or add other elements performing similar or different functions. Adding elements can include inserting elements in between existing elements, such as, for example, inserting amplifiers and an analog-to-digital converter. In various embodiments, the RF portion includes an antenna. Additionally, the communication interface 1203 may comprise a USB and/or HDMI interface that can include respective interface processors for connecting the device 1200 to other electronic devices across USB and/or HDMI connections. Aspects of USB or HDMI interface processing can be implemented within separate interface ICs or within processor 1201 as necessary.

The communication interface 1203 can include, but is not limited to, a transceiver configured to transmit and to receive data over communication channel. The communication interface 1203 can include, but is not limited to, a modem or network card and can be implemented, for example, within a wired and/or a wireless medium.

Data is streamed, or otherwise provided, to the device 1200, in various embodiments, using a wireless network such as a Wi-Fi network, for example IEEE 802.11 (IEEE refers to the Institute of Electrical and Electronics Engineers). The Wi-Fi signal of these embodiments is received over the communication interface 1203, which is adapted for Wi-Fi communications. The communications channel of these embodiments is typically connected to an access point or router that provides access to external networks including the Internet for allowing streaming applications and other over-the-top communications. Other embodiments provide streamed data to the device 1200 using a set-top box that delivers the data over the HDMI connection of the communication interface 1203. Still other embodiments provide streamed data to the device 1200 using the RF connection of the communication interface 1203. As indicated above, various embodiments provide data in a non-streaming manner. Additionally, various embodiments use wireless networks other than Wi-Fi, for example a cellular network or a Bluetooth network.

In at least one embodiment, the display panel 1204 is based on an OLED display panel. In a variant embodiment, the display panel 1204 is based on an LED display panel, or alternatively a mini-LED display panel or a micro-LED display panel. In another variant embodiment, the display panel comprises (not illustrated) a backlight that generates light coupled to a light transmission-type panel that generates the image by filtering the light accordingly. The backlight may be based on LEDs, mini-LEDs, micro-LEDs, OLEDs, QD-LEDs (Quantum Dot Light Emitting Diodes) or CCFLs (Cold-Cathode Fluorescent Lamps). The light transmission-type panel may be an LCD (Liquid Crystal Display) panel.

Various elements of device 1200 can be provided within an integrated housing. Within the integrated housing, the various elements can be interconnected and transmit data therebetween using suitable connection arrangements, for example an internal bus 1205 as known in the art, including the Inter-IC (I2C) bus, wiring, and printed circuit boards.

In at least one embodiment, the device 1200 does not include the display panel 1204, so that an external display may be located in a second device coupled with device 1200. Examples of such devices are desktop computers, set-top boxes, media players, Blu-ray or DVD players, or more generally content receivers and decoders.

Although some embodiments of the present invention have been illustrated in the accompanying drawings and described in the foregoing description, it should be understood that the present invention is not limited to the disclosed embodiments, but is capable of numerous rearrangements, modifications and substitutions without departing from the invention as set forth and defined by the following claims.