Title:
HANDHELD CONTROLLER DEVICE
Document Type and Number:
WIPO Patent Application WO/2020/212866
Kind Code:
A1
Abstract:
The present technology may be implemented as a handheld general-purpose user input device having an external chassis defining a plurality of faces and comprising multiple keys, at least one touch screen, a processing unit and a transceiver unit. The device is connectable to a recipient electronic device via the transceiver unit. The multiple keys, structured in matrices, and the touch screen(s) act as keyboard input. An inertial unit, a vibration motor, an optical sensor and multiple LEDs may also be included. The inertial unit, the optical sensor and the touch surface(s) of the touch screen(s) allow the device to read continuous input values from the user's motion. The vibration motor, the LEDs and the screen of the touch screen act as outputs allowing a user to receive direct feedback.

Inventors:
BRUN DAMIEN (CA)
GOUIN-VALLERAND CHARLES (CA)
GEORGE SEBASTIEN (FR)
Application Number:
PCT/IB2020/053551
Publication Date:
October 22, 2020
Filing Date:
April 15, 2020
Assignee:
UNIV DU MANS (FR)
SOCPRA SCIENCES SANTE ET HUMAINES S E C (CA)
International Classes:
G06F3/02; G06F3/01; G06F3/0346; G06F3/0354
Foreign References:
US20090179869A1 (2009-07-16)
US20120004740A1 (2012-01-05)
US20120154134A1 (2012-06-21)
Attorney, Agent or Firm:
SAWYER, François (CA)
Claims:
What is claimed is:

1. A handheld controller device, comprising:

an external chassis defining a plurality of faces;

a set of input elements comprising:

a plurality of keys distributed over at least a subset of the plurality of faces of the external chassis, and

a touch screen adapted to detect a tactile contact and to translate the tactile contact into a virtual input, the touch screen being mounted on a screen-receiving face of the external chassis;

a signal transceiver unit adapted to communicate with a recipient unit; and a processing unit operatively connected to each of the input elements and to the signal transceiver unit, the processing unit being adapted to:

detect a user interaction captured by one of the input elements, and cause the signal transceiver unit to transmit, to the recipient unit, information about the detected user interaction.

2. The handheld controller device of claim 1, wherein one of the plurality of faces of the external chassis is a detachable face.

3. The handheld controller device of claim 2, wherein the screen-receiving face of the external chassis is the detachable face of the external chassis.

4. The handheld controller device of any one of claims 1 to 3, wherein none of the plurality of keys is on the screen-receiving face of the external chassis.

5. The handheld controller device of any one of claims 1 to 4, further comprising an internal chassis containing the processing unit.

6. The handheld controller device of claim 5, wherein the internal chassis further contains one or more ancillary components selected from an inertial unit, a vibration motor, and a battery, each of the one or more ancillary components being operatively connected to the processing unit.

7. The handheld controller device of claim 6, wherein the inertial unit is a six degrees- of-freedom inertial unit.

8. The handheld controller device of claim 6 or 7, wherein:

the inertial unit is adapted to detect a motion of the handheld controller device; and

the processing unit is further adapted to:

receive from the inertial unit information about the motion of the handheld controller device, and

cause the signal transceiver unit to transmit, to the recipient unit, information about the detected motion of the handheld controller device.

9. The handheld controller device of any one of claims 1 to 8, further comprising a sensor mounted on one of the plurality of faces of the external chassis, wherein:

the sensor is operatively connected to the processing unit;

the sensor is adapted to detect a change of position of the handheld controller device in relation to a surface; and

the processing unit is further adapted to:

receive from the sensor information about the change of position of the handheld controller device, and

cause the signal transceiver unit to transmit, to the recipient unit, information about the detected change of position of the handheld controller device.

10. The handheld controller device of claim 9, wherein the sensor is mounted on the screen-receiving face of the external chassis.

11. The handheld controller device of claim 9 or 10, wherein the sensor is selected from an optical sensor and a laser sensor.

12. The handheld controller device of any one of claims 1 to 11, further comprising a USB port positioned on one of the plurality of faces of the external chassis, the USB port being operatively connected to the processing unit.

13. The handheld controller device of claim 12, wherein the USB port is a micro USB port.

14. The handheld controller device of claim 12 or 13, wherein the USB port is positioned on the screen-receiving face of the external chassis.

15. The handheld controller device of any one of claims 12 to 14, wherein:

the recipient unit is a first recipient unit; and

the processing unit is further adapted to cause the USB port to transmit, to a second recipient unit, information about the detected user interaction.

16. The handheld controller device of any one of claims 12 to 15, wherein the USB port is adapted to provide electric power to the processing unit.

17. The handheld controller device of any one of claims 12 to 16, wherein the USB port is adapted to provide electric power to a battery connected to the processing unit.

18. The handheld controller device of any one of claims 1 to 17, further comprising a power switch mounted on one of the plurality of faces of the external chassis and operatively connected to the processing unit.

19. The handheld controller device of claim 18, wherein the power switch is mounted on the screen-receiving face of the external chassis.

20. The handheld controller device of any one of claims 1 to 19, wherein:

the subset of the plurality of faces of the external chassis comprises between 3 and 5 faces of the external chassis; and

the handheld controller device comprises a distinct touch screen on each face of the external chassis that is not comprised in the subset, each distinct touch screen being operatively connected to the processing unit.

21. The handheld controller device of any one of claims 1 to 20, wherein the external chassis comprises a pair of proximally located holes adapted for attachment of a lanyard or strap.

22. The handheld controller device of any one of claims 1 to 21, wherein the external chassis is made of a material selected from plastic, a metal, a polymer material and a combination thereof.

23. The handheld controller device of any one of claims 1 to 22, wherein the keys protrude from the plurality of faces of the external chassis.

24. The handheld controller device of any one of claims 1 to 23, wherein the keys are selected from tactile dome switches, scissor switches and mechanical switches.

25. The handheld controller device of any one of claims 1 to 24, wherein the keys are grouped in matrices, one matrix being located on each of the subset of the plurality of faces of the external chassis.

26. The handheld controller device of claim 25, wherein each matrix is a four by four matrix.

27. The handheld controller device of claim 25, wherein each matrix is a five by five matrix.

28. The handheld controller device of any one of claims 25 to 27, further comprising a plurality of backlighting LEDs, at least one backlighting LED being mounted in the handheld controller behind each matrix.

29. The handheld controller device of any one of claims 25 to 28, wherein each key of a first subset of the plurality of keys defines a particular value.

30. The handheld controller device of claim 29, wherein each particular value is selected from a letter, a number and a symbol.

31. The handheld controller device of claim 30, wherein each key of a second subset of the plurality of keys is assigned a particular keyboard function.

32. The handheld controller device of claim 31, wherein the screen-receiving face of the external chassis is a bottom face of the external chassis, each matrix being positioned on other faces of the external chassis.

33. The handheld controller device of claim 32, wherein:

keys defining letters are positioned on two adjacent lateral faces of the external chassis; and

keys defining symbols and keys assigned particular keyboard functions are positioned on two other lateral faces and on a top face of the external chassis.

34. The handheld controller device of any one of claims 1 to 33, wherein the virtual input is selected from a specific keycode, an emoticon and an action event.

35. The handheld controller device of claim 34, wherein the action event is selected from mute, sound up, sound down, cut, copy and paste.

36. The handheld controller device of any one of claims 1 to 35, further comprising a vibration motor operatively connected to the processing unit, wherein:

the signal transceiver is adapted to receive feedback data from the recipient unit; and

the processing unit is further adapted to cause the vibration motor to vibrate in response to the received feedback data.

37. The handheld controller device of any one of claims 1 to 35, further comprising a plurality of backlighting LEDs, wherein:

the signal transceiver is adapted to receive feedback data from the recipient unit; and

the processing unit is further adapted to cause at least one of the plurality of backlighting LEDs to emit a light signal in response to the received feedback data.

38. The handheld controller device of any one of claims 1 to 35, wherein:

the signal transceiver is adapted to receive feedback data from the recipient unit; and

the processing unit is further adapted to cause the touch screen to display information related to the feedback data.

39. The handheld controller device of any one of claims 1 to 38, wherein the external chassis is generally shaped as a rectangular prism.

40. The handheld controller device of any one of claims 1 to 38, wherein the external chassis has a generally cubic shape.

41. The handheld controller device of any one of claims 1 to 38, wherein the external chassis defines six faces.

Description:
HANDHELD CONTROLLER DEVICE

CROSS-REFERENCE

[01] The present application claims priority from European Patent Application no. 19315022.4, filed on April 15, 2019, the disclosure of which is incorporated by reference herein.

FIELD

[02] The present technology relates generally to electronic devices, and more particularly to a handheld controller device.

BACKGROUND

[03] Spatial computing and the use of “smart glasses” or “head-mounted displays” have improved rapidly and spread to all continents and cultures. Hardware and software improvements make computational devices powerful presenters of information. They also collect a wide variety of information from the user’s environment, such as GPS location, video, voice, bar code scanning, and small amounts of written text. Such technologies allow the wide adoption and spread of augmented, mixed and virtual reality experiences.

[04] Text input technology for users, in particular for spatial computing, has lagged far behind in development. This is unfortunate because textual content is a significant form of conceptual material from the user.

[05] Many users need to produce, log, or transmit specific textual content using a variety of non-standard characters to electronic devices while they are moving. For instance, emergency medical technicians, reporters, software developers, financial managers and video game players need to transmit large, important, specific data and they would often benefit from being able to send it textually rather than audibly, and without having to look down or focus at their text input interface or device.

[06] Conventional keyboards and controllers have had at least one or more of the following limitations:

1. They are not portable, in that they are intended to be placed on a desk or other work surface (e.g. desktop and laptop keyboards);

2. They are limited to text entry, in that they do not provide a way to move content in either two- or three-dimensional space;

3. They are limited to spatial manipulation in either two or three dimensions, in that they do not provide a way to type;

4. They are not tangible, lacking tactile feedback; and

5. They are limited to the use of thumbs only and are thus prone to causing muscular strain and fatigue.

[07] It would be highly desirable for a spatial computing system to have a unified way to type and manipulate content across the whole reality-virtuality continuum. For example, a user would no longer be constrained to learn and use different interfaces, with different outcomes, to type and manipulate contents in spatial computing systems. Likewise, instead of being limited to inputting data with different devices or interactions that are inadequate for some realities or contexts, a user could use the present technology in the same way, with the same outcome, for a wide variety of contexts across the whole reality-virtuality continuum.

[08] Even though the recent developments identified above may provide benefits, improvements are still desirable.

[09] The subject matter discussed in the background section should not be assumed to be prior art merely as a result of its mention in the background section. Similarly, a problem mentioned in the background section or associated with the subject matter of the background section should not be assumed to have been previously recognized in the prior art. The subject matter in the background section merely represents different approaches.

SUMMARY

[10] Embodiments of the present technology have been developed based on developers’ appreciation of shortcomings associated with the prior art.

[11] In particular, shortcomings of conventional keyboards and controllers may comprise one or more of the lack of portability, the inability to move text contents in two or three-dimensional spaces, the lack of tangible, tactile feedback, and the tendency to cause muscular strain and fatigue.

[12] In one aspect, various implementations of the present technology provide a handheld controller device, comprising: an external chassis defining a plurality of faces;

a set of input elements comprising:

a plurality of keys distributed over at least a subset of the plurality of faces of the external chassis, and

a touch screen adapted to detect a tactile contact and to translate the tactile contact into a virtual input, the touch screen being mounted on a screen-receiving face of the external chassis;

a signal transceiver unit adapted to communicate with a recipient unit; and a processing unit operatively connected to each of the input elements and to the signal transceiver unit, the processing unit being adapted to:

detect a user interaction captured by one of the input elements, and cause the signal transceiver unit to transmit, to the recipient unit, information about the detected user interaction.

[13] In some implementations of the present technology, one of the plurality of faces of the external chassis is a detachable face.

[14] In some implementations of the present technology, the screen-receiving face of the external chassis is the detachable face of the external chassis.

[15] In some implementations of the present technology, none of the plurality of keys is on the screen-receiving face of the external chassis.

[16] In some implementations of the present technology, the handheld controller device further comprises an internal chassis containing the processing unit.

[17] In some implementations of the present technology, the internal chassis further contains one or more ancillary components selected from an inertial unit, a vibration motor, and a battery, each of the one or more ancillary components being operatively connected to the processing unit.

[18] In some implementations of the present technology, the inertial unit is a six degrees-of-freedom inertial unit.

[19] In some implementations of the present technology, the inertial unit is adapted to detect a motion of the handheld controller device; and the processing unit is further adapted to: receive from the inertial unit information about the motion of the handheld controller device, and cause the signal transceiver unit to transmit, to the recipient unit, information about the detected motion of the handheld controller device.

[20] In some implementations of the present technology, the handheld controller device further comprises a sensor mounted on one of the plurality of faces of the external chassis, the sensor being operatively connected to the processing unit; the sensor being adapted to detect a change of position of the handheld controller device in relation to a surface; and the processing unit being further adapted to: receive from the sensor information about the change of position of the handheld controller device, and cause the signal transceiver unit to transmit, to the recipient unit, information about the detected change of position of the handheld controller device.

[21] In some implementations of the present technology, the sensor is mounted on the screen-receiving face of the external chassis.

[22] In some implementations of the present technology, the sensor is selected from an optical sensor and a laser sensor.

[23] In some implementations of the present technology, the handheld controller device further comprises a USB port positioned on one the plurality of faces of the external chassis, the USB port being operatively connected to the processing unit.

[24] In some implementations of the present technology, the USB port is a micro USB port.

[25] In some implementations of the present technology, the USB port is positioned on the screen-receiving face of the external chassis.

[26] In some implementations of the present technology, the recipient unit is a first recipient unit; and the processing unit is further adapted to cause the USB port to transmit, to a second recipient unit, information about the detected user interaction.

[27] In some implementations of the present technology, the USB port is adapted to provide electric power to the processing unit.

[28] In some implementations of the present technology, the USB port is adapted to provide electric power to a battery connected to the processing unit.

[29] In some implementations of the present technology, the handheld controller device further comprises a power switch mounted on one of the plurality of faces of the external chassis and operatively connected to the processing unit.

[30] In some implementations of the present technology, the power switch is mounted on the screen-receiving face of the external chassis.

[31] In some implementations of the present technology, the subset of the plurality of faces of the external chassis comprises between 3 and 5 faces of the external chassis; and the handheld controller device comprises a distinct touch screen on each face of the external chassis that is not comprised in the subset, each distinct touch screen being operatively connected to the processing unit.

[32] In some implementations of the present technology, the external chassis comprises a pair of proximally located holes adapted for attachment of a lanyard or strap.

[33] In some implementations of the present technology, the external chassis is made of a material selected from plastic, a metal, a polymer material and a combination thereof.

[34] In some implementations of the present technology, the keys protrude from the plurality of faces of the external chassis.

[35] In some implementations of the present technology, the keys are selected from tactile dome switches, scissor switches and mechanical switches.

[36] In some implementations of the present technology, the keys are grouped in matrices, one matrix being located on each of the subset of the plurality of faces of the external chassis.

[37] In some implementations of the present technology, each matrix is a four by four matrix.

[38] In some implementations of the present technology, each matrix is a five by five matrix.

[39] In some implementations of the present technology, the handheld controller device further comprises a plurality of backlighting LEDs, at least one backlighting LED being mounted in the handheld controller behind each matrix.

[40] In some implementations of the present technology, each key of a first subset of the plurality of keys defines a particular value.

[41] In some implementations of the present technology, each particular value is selected from a letter, a number and a symbol.

[42] In some implementations of the present technology, each key of a second subset of the plurality of keys is assigned a particular keyboard function.

[43] In some implementations of the present technology, the screen-receiving face of the external chassis is a bottom face of the external chassis, each matrix being positioned on other faces of the external chassis.

[44] In some implementations of the present technology, keys defining letters are positioned on two adjacent lateral faces of the external chassis; and keys defining symbols and keys assigned particular keyboard functions are positioned on two other lateral faces and on a top face of the external chassis.

[45] In some implementations of the present technology, the virtual input is selected from a specific keycode, an emoticon and an action event.

[46] In some implementations of the present technology, the action event is selected from mute, sound up, sound down, cut, copy and paste.

[47] In some implementations of the present technology, the handheld controller device further comprises a vibration motor operatively connected to the processing unit, the signal transceiver being adapted to receive feedback data from the recipient unit; and the processing unit being further adapted to cause the vibration motor to vibrate in response to the received feedback data.

[48] In some implementations of the present technology, the handheld controller device further comprises a plurality of backlighting LEDs, the signal transceiver being adapted to receive feedback data from the recipient unit; and the processing unit being further adapted to cause at least one of the plurality of backlighting LEDs to emit a light signal in response to the received feedback data.

[49] In some implementations of the present technology, the signal transceiver is adapted to receive feedback data from the recipient unit; and the processing unit is further adapted to cause the touch screen to display information related to the feedback data.

[50] In some implementations of the present technology, the external chassis is generally shaped as a rectangular prism.

[51] In some implementations of the present technology, the external chassis has a generally cubic shape.

[52] In some implementations of the present technology, the external chassis defines six faces.

[53] In the context of the present specification, unless expressly provided otherwise, a computer system may refer, but is not limited to, an “electronic device”, an “operation system”, a “system”, a “computer-based system”, a “controller unit”, a “monitoring device”, a “control device” and/or any combination thereof appropriate to the relevant task at hand.

[54] In the context of the present specification, unless expressly provided otherwise, the expressions “computer-readable medium” and “memory” are intended to include media of any nature and kind whatsoever, non-limiting examples of which include RAM, ROM, disks (CD-ROMs, DVDs, floppy disks, hard disk drives, etc.), USB keys, flash memory cards, solid-state drives, and tape drives. Still in the context of the present specification, “a” computer-readable medium and “the” computer-readable medium should not be construed as being the same computer-readable medium. To the contrary, and whenever appropriate, “a” computer-readable medium and “the” computer-readable medium may also be construed as a first computer-readable medium and a second computer-readable medium.

[55] In the context of the present specification, unless expressly provided otherwise, the words “first”, “second”, “third”, etc. have been used as adjectives only for the purpose of allowing for distinction between the nouns that they modify from one another, and not for the purpose of describing any particular relationship between those nouns.

[56] Implementations of the present technology each have at least one of the above-mentioned objects and/or aspects, but do not necessarily have all of them. It should be understood that some aspects of the present technology that have resulted from attempting to attain the above-mentioned objects may not satisfy these objects and/or may satisfy other objects not specifically recited herein.

[57] Additional and/or alternative features, aspects and advantages of implementations of the present technology will become apparent from the following description, the accompanying drawings and the appended claims.

BRIEF DESCRIPTION OF THE DRAWINGS

[58] Embodiments of the present technology are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings, in which like references indicate similar elements. It should be noted that different references to “an” or “one” embodiment in this disclosure are not necessarily to the same embodiment, and such references mean at least one. For a better understanding of the present technology, as well as other aspects and further features thereof, reference is made to the following description, which is to be used in conjunction with the accompanying drawings, wherein:

[59] FIG. 1 is a top-down perspective view of an embodiment of the present technology.

[60] FIG. 2 is a top-down perspective view of an embodiment of the present technology upside down.

[61] FIG. 3A is a top-down perspective inside view of an embodiment of the present technology upside down with a touch screen removed.

[62] FIG. 3B is a top-down perspective exploded view of an embodiment of the present technology upside down.

[63] FIG. 4A is a side elevation view of an embodiment of the present technology.

[64] FIG. 4B is a bottom-focused side elevation view of an embodiment of the present technology.

[65] FIG. 5 is a top plan view of the internal chassis of an embodiment of the present technology.

[66] FIG. 6 is a diagram showing hardware components that comprise an embodiment of the present technology.

[67] FIG. 7 depicts a user holding an embodiment of the present technology while wearing a virtual reality head-mounted display.

[68] FIG. 8 depicts a user holding an embodiment of the present technology while wearing a mixed reality head-mounted display.

[69] FIG. 9 depicts a user interacting with an embodiment of the present technology while sitting in front of a desktop computer.

[70] FIG. 10 depicts a user interacting with an embodiment of the present technology while lying on a couch in front of a smart TV.

[71] FIG. 11 depicts a user interacting with an embodiment of the present technology while lying on a couch with a tablet computer.

[72] FIG. 12 depicts a user holding an embodiment of the present technology while sitting in front of a laptop computer.

[73] FIG. 13 depicts a user standing and holding an embodiment of the present technology.

[74] FIG. 14 depicts a user standing and holding an embodiment of the present technology while wearing a mixed reality head-mounted display.

[75] FIG. 15 depicts a user’s hand in a preferable device position.

[76] FIG. 16 is a diagram showing a preferable custom mapping layout.

[77] FIG. 17 is a diagram showing a helper view from a custom software tool dedicated to recipient devices.

[78] It should also be noted that, unless otherwise explicitly specified herein, the drawings are not to scale.

DETAILED DESCRIPTION

[79] The examples and conditional language recited herein are principally intended to aid the reader in understanding the principles of the present technology and not to limit its scope to such specifically recited examples and conditions. It will be appreciated that those skilled in the art may devise various arrangements that, although not explicitly described or shown herein, nonetheless embody the principles of the present technology.

[80] Furthermore, as an aid to understanding, the following description may describe relatively simplified implementations of the present technology. As persons skilled in the art would understand, various implementations of the present technology may be of a greater complexity.

[81] In some cases, what are believed to be helpful examples of modifications to the present technology may also be set forth. This is done merely as an aid to understanding, and, again, not to define the scope or set forth the bounds of the present technology. These modifications are not an exhaustive list, and a person skilled in the art may make other modifications while nonetheless remaining within the scope of the present technology. Further, where no examples of modifications have been set forth, it should not be interpreted that no modifications are possible and/or that what is described is the sole manner of implementing that element of the present technology.

[82] Moreover, all statements herein reciting principles, aspects, and implementations of the present technology, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof, whether they are currently known or developed in the future. Thus, for example, it will be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative circuitry embodying the principles of the present technology. Similarly, it will be appreciated that any flowcharts, flow diagrams, state transition diagrams, pseudo-code, and the like represent various processes that may be substantially represented in non-transitory computer-readable media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.

[83] The functions of the various elements shown in the figures, including any functional block labeled as a "processor", may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. In some embodiments of the present technology, the processor may be a general-purpose processor, such as a central processing unit (CPU), or a processor dedicated to a specific purpose, such as a digital signal processor (DSP). Moreover, explicit use of the term "processor" should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, application specific integrated circuit (ASIC), field programmable gate array (FPGA), read-only memory (ROM) for storing software, random access memory (RAM), and non-volatile storage. Other hardware, conventional and/or custom, may also be included.

[84] Software modules, or simply modules which are implied to be software, may be represented herein as any combination of flowchart elements or other elements indicating performance of process steps and/or textual description. Such modules may be executed by hardware that is expressly or implicitly shown. Moreover, it should be understood that a module may include, for example, but without being limitative, computer program logic, computer program instructions, software, stack, firmware, hardware circuitry or a combination thereof which provides the required capabilities.

[85] The following detailed description is merely exemplary in nature and is not intended to limit the described embodiments of the application and uses of the described embodiments. As used herein, the word “exemplary” or “illustrative” means “serving as an example, instance, or illustration.” Any implementation described herein as “exemplary” or “illustrative” is not necessarily to be construed as preferred or advantageous over other implementations. All of the implementations described below are exemplary implementations provided to enable persons skilled in the art to practice the disclosure and are not intended to limit the scope of the appended claims. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description.

[86] The present disclosure introduces a new kind of tangible handheld controller device dedicated to text entry, in a sense similar to the keyboard, but being generally shaped as a polyhedron, for example a rectangular prism or a cube. The device may be small enough to be covered and held with the hands. Multiple key matrices are disposed over the faces of the device. The designed form factor allows the device to be highly mobile without reducing the comfort, the key size and the number of fingers involved with a typical keyboard. The device goes beyond text entry and embeds multiple user inputs and outputs such as a touch surface, inertial motion and position, vibrotactile feedback and visual feedback. Examples of the handheld controller device are generally defined as having a rectangular prism form factor or a cubic form factor. Embodiments may vary slightly from the shape of a perfect rectangular prism. For example, each face of the tangible device may be slightly concave or convex and may have adjacent edges that have small length variations. Likewise, edges between adjacent faces and corners of the tangible device may be somewhat rounded. Faces of the device may not be perfectly parallel or perpendicular to one another. The shape of the device may have some variations from an ideal rectangular prism or from an ideal cube, for example in order to achieve ergonomic objectives. Terms such as “rectangular”, “prism”, “cube” and “cubic” should therefore not be understood in the absolute. Other embodiments may depart further from the shape of a rectangular prism or from the shape of a cube. The handheld controller may for example adopt the shape of a truncated pyramid, in which the sides are not perpendicular to the base and a top face created by truncation of a pyramidal shape may not be parallel to the base. Further embodiments may have a pentagonal or hexagonal base and, correspondingly, five or six side faces reaching a pentagonal or hexagonal top. The person of ordinary skill in the art will be able to select a shape of the handheld controller according to the needs of a particular application, whether these needs are governed by ergonomic considerations or by considerations related to uses of the device.

[87] The handheld controller device is intended to be paired with recipient electronic devices such as, but not limited to, augmented, mixed and virtual reality head-mounted displays, personal computers, smart TVs, tablets, mobile phones and any other computing devices that can receive electronic input. The present technology may contain an electronic circuit board which provides computing power to process the different input/output elements and a wireless connection to a recipient computing device, such as, but not limited to, smart glasses. For the purpose of the present description, we use the example of a Bluetooth connection, which provides an interface for a wide variety of electronic devices, but the present technology may work with any device that accepts text input. It may also be used to interface with any other kind of computing device by wire.

[88] A recipient device may be, but is not limited to, a smartphone, a tablet, a laptop, a desktop computer, an augmented/mixed/virtual reality head-mounted display, a smart TV, smart glasses, or a smart watch. Herein, a recipient device is alternatively referred to as a “host.” It is also envisioned that the keyboard of the embodiments of the present technology may be used with any other electronic device for which a keyboard is desirable. “Recipient” as used herein is deemed to include any device that receives inputs from the present technology’s device.

[89] In exemplary implementations, the handheld controller device is a handheld general-purpose user-input device comprising an external enclosure that is generally shaped as a rectangular prism, for example being shaped as a cube, the device further comprising multiple keys, a touch screen, a processing unit and a transceiver (transmitter/receiver) unit, and ancillary components such as an inertial unit, a vibration motor, and an optical sensor. The device is connected to a recipient electronic device via the transceiver unit. The multiple keys are structured in matrices and the touch screen acts as a keyboard input. The inertial unit, the optical sensor and the touch surface of the touch screen allow the device to read continuous input values from a user’s motion. The vibration motor and the screen of the touch screen act as outputs allowing a user to receive direct feedback. Owing to its form factor and affordance, the present technology offers advantages with regard to mobility, comfort, learnability, privacy and playfulness. The combination of input and output sensors creates a novel interface particularly convenient for text entry and more, such as, but not limited to, pointing and three-dimensional interaction in many use cases across the whole reality-virtuality continuum. For instance, we can envision a case where users fully exploit the present technology by typing, programming and designing in a new three-dimensional integrated development environment (IDE) either in mixed or virtual reality.

[90] In exemplary implementations of the present technology, different user inputs and outputs may be transmitted simultaneously between the handheld controller device and the recipient electronic device using a wireless connection.

[91] In exemplary implementations of the present technology, a user may type different kinds of text with the multiple keys. The keys may comprise electric switches structured in matrices. Different characters, such as, but not limited to, letters and numbers, are assigned to the keys by keycodes. A keycode is sent to the recipient device when a user presses a key. The keycode may then be sent repeatedly as long as the key is not released.

[92] In exemplary implementations of the present technology, the inertial unit may capture any movement and rotation of the handheld controller device. Thus, a user may perform mid-air hand gestures, and use these gestures or movements to interact with virtual spatial contents that may appear on their recipient device, such as, but not limited to, a head-mounted display. The virtual spatial contents may convey information or comprise a graphical user interface.
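By way of illustration only, the following C sketch shows one conventional way to implement the key-repeat behaviour described in paragraph [91]. The hook functions (scan_key_pressed, send_keycode), the keymap table and the timing constants are assumptions made for this sketch; the application does not disclose a firmware interface.

    #include <stdbool.h>
    #include <stdint.h>

    /* Hypothetical hardware hooks; not disclosed in the application. */
    extern bool scan_key_pressed(uint8_t key);   /* true while the key is held */
    extern void send_keycode(uint16_t keycode);  /* transmit to the recipient  */

    #define NUM_KEYS        80    /* 5 faces x 16 keys, per paragraph [96]   */
    #define REPEAT_DELAY_MS 500   /* assumed delay before auto-repeat begins */
    #define REPEAT_RATE_MS  50    /* assumed interval between repeated sends */

    extern const uint16_t keymap[NUM_KEYS];  /* key index -> assigned keycode */

    /* Called once per millisecond for each key: sends the keycode on the
       initial press, then repeatedly while the key remains held. held_ms
       tracks how long the key has been down. */
    void service_key(uint8_t key, uint32_t *held_ms)
    {
        if (!scan_key_pressed(key)) {
            *held_ms = 0;                        /* key released: reset */
            return;
        }
        if (*held_ms == 0)
            send_keycode(keymap[key]);           /* initial key press   */
        else if (*held_ms >= REPEAT_DELAY_MS &&
                 (*held_ms - REPEAT_DELAY_MS) % REPEAT_RATE_MS == 0)
            send_keycode(keymap[key]);           /* auto-repeat         */
        (*held_ms)++;
    }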

[93] In exemplary implementations of the present technology, the optical sensor may capture a change of position of the handheld controller device in relation to a surface, such as, but not limited to, a desk, a wall or a body part. The user may then use the device motion to move a cursor displayed on the recipient device.
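A minimal sketch, in C, of the cursor behaviour of paragraph [93], assuming hypothetical read_optical_delta and send_pointer_report hooks and an illustrative gain value; none of these names come from the application.

    #include <stdint.h>

    /* Hypothetical sensor and radio hooks; the application does not
       specify the actual interfaces. */
    extern void read_optical_delta(int16_t *dx, int16_t *dy); /* counts since last read */
    extern void send_pointer_report(int16_t dx, int16_t dy);  /* relative cursor move   */

    /* Forward optical-sensor displacement to the recipient as relative
       cursor motion. A gain factor maps sensor counts to screen pixels;
       its value here is illustrative only. */
    void service_pointer(void)
    {
        const int16_t gain = 2;  /* assumed counts-to-pixels scale */
        int16_t dx, dy;

        read_optical_delta(&dx, &dy);
        if (dx != 0 || dy != 0)
            send_pointer_report((int16_t)(dx * gain), (int16_t)(dy * gain));
    }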

[94] In exemplary implementations of the present technology, a user may receive vibrating feedback based on virtual spatial contents. For instance, the handheld controller device may vibrate while passing through a virtual graphical interface, simulating contact between the device and the virtual content displayed on the recipient device, such as a mixed or virtual reality head-mounted display.

[95] With these fundamentals in place, we will now consider some non-limiting examples to illustrate various implementations of aspects of the present technology.

[96] FIGS. 1 to 4 show an embodiment of a cubic tangible handheld controller device 101-401 comprising an external chassis 102-402. The external chassis 102-402 provides 80 keys 103, 311, 403 equally distributed over five key matrices 303, one on each respective face. Each key matrix has 16 keys (4x4). Needless to say, each key matrix may have more or fewer than 16 keys, for instance 25 keys (5x5); in such a case, the number of matrices (to 3) and the total number of keys (to 75) may vary accordingly, and all the faces without keys (to 3) would include a touch screen. It is contemplated that the device 101-401 may comprise a plurality of touch screens positioned on a corresponding plurality of faces of the device 101-401. The bottom face presented in FIG. 2 is flat and comprises a touch screen 203, 309, an embedded power switch 206, 315, a micro USB female plug 205, 316 and an optical sensor 204, 314. The optical sensor 204, 314 is used to track two-dimensional movements; however, the sensor is not limited to an optical sensor and may be implemented as a laser sensor. How the power switch 206, 315 is implemented is not limited, and it is contemplated that the power switch 206, 315 may be implemented as a button instead. Moreover, although the plug 205, 316 is implemented as a micro USB plug, it is not limited as such, and it is contemplated that any USB plug may be used.

[97] The external chassis 102-402 may be generally understood to be the main physical structure of the present technology and is configured to conform to human hands and fingers. In various embodiments of the present technology, the external chassis 102-402 may be made from any desirable material, such as, but not limited to, plastic, metal, various polymer materials, or any other desired material or combination of materials. In one embodiment, the external chassis 102-402 measures 85x85x85 millimeters. In alternative embodiments, the external chassis 102-402 may vary in size, such as to accommodate the user’s hand (for example, a child’s).

[98] The 80 keys 103, 311, 403 are momentary square button switches of regular fingertip size: 12x12 millimeters. The number of keys 103, 311, 403 (80) and their size (12x12 mm) allow the present technology to offer the same flexibility as a typical keyboard and avoid chording or multi-tap techniques. In one embodiment of the present technology, the keys 103, 311, 403 protrude from the external chassis 102-402, increasing the overall size of the device up to 96.2x96.2x90.6 millimeters (L x W x H). In one embodiment of the present technology, the keys are tactile dome switches, but they may be other types of switches, such as, but not limited to, scissor switches and mechanical switches. In one embodiment of the present technology, the keys 103, 311, 403 are mono-colored in five different colors: red, green, blue, yellow, and white. Keys of each color are grouped together and assigned to different matrices 303. Each key 103, 311, 403 has a movable and optional transparent cover that includes inscriptions such as, but not limited to, letters, numbers and symbols. Although only one set of keys is illustrated in the illustrated embodiment, it should be understood that this is done merely for ease of understanding. It is contemplated that the letters are not limited to the Latin alphabet, nor to the particular symbols displayed.

[99] In one embodiment of the present technology, each key matrix 303 comprises 16 keys, in 4 rows and 4 columns, and 1 RGB LED 312 allowing backlighting. Each key’s centered pivot is separated from the next by 18.5 millimeters.
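The application does not disclose the scanning circuitry, but a 4x4 key matrix such as the one described in paragraph [99] is conventionally read by strobing rows and sensing columns. The C sketch below illustrates this technique; the GPIO hooks (drive_row, read_col) are assumptions.

    #include <stdbool.h>
    #include <stdint.h>

    #define ROWS 4
    #define COLS 4

    /* Hypothetical GPIO hooks; actual pin assignments are not disclosed. */
    extern void drive_row(uint8_t row, bool active);  /* strobe one row line   */
    extern bool read_col(uint8_t col);                /* sense one column line */

    /* Scan one 4x4 matrix by strobing each row in turn and reading the
       columns. Bit (r * COLS + c) of the result is set while the key at
       row r, column c is pressed. */
    uint16_t scan_matrix(void)
    {
        uint16_t state = 0;
        for (uint8_t r = 0; r < ROWS; r++) {
            drive_row(r, true);
            for (uint8_t c = 0; c < COLS; c++)
                if (read_col(c))
                    state |= (uint16_t)1 << (r * COLS + c);
            drive_row(r, false);
        }
        return state;
    }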

[100] The bottom face presented in FIG. 2 is flat and allows the device 101-401 to be placed on a plane surface, such as, but not limited to, a desk, a table or any piece of furniture.

[101] The touch screen 203, 309 is used to convey input and output information. In one embodiment, the touch screen 203, 309 may display different virtual inputs such as, but not limited to, sliders and square keys. Virtual inputs may include symbolic visuals or inscriptions, react to finger touch, and send back to the recipient device customizable information, such as, but not limited to, specific keycodes, emoticons and action events, for instance: mute, sound up and down, cut, copy and paste. In one embodiment, the touch screen 203, 309 is a dedicated touch surface used to track tactile contacts, such as multiple-finger touch and motion, translate these tactile contacts into virtual inputs, and send information related to the virtual inputs to the recipient devices (as touchpads do). The recipient devices can interpret the tracked information as, but not limited to, cursor movements or scrolling.
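As an illustration of the virtual inputs of paragraph [101], the sketch below maps a tactile contact to a rectangular virtual key bound to an action event. The action values follow the examples given (mute, sound up and down, cut, copy, paste); the types, hook function and coordinates are assumptions made for this sketch.

    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical action codes and transmit hook. */
    typedef enum { ACT_MUTE, ACT_SOUND_UP, ACT_SOUND_DOWN,
                   ACT_CUT, ACT_COPY, ACT_PASTE } action_t;
    extern void send_action_event(action_t action);

    /* One virtual input drawn on the touch screen: a rectangular region
       bound to an action event. */
    typedef struct { int16_t x, y, w, h; action_t action; } virtual_key_t;

    /* Illustrative layout of three virtual keys on the bottom face. */
    static const virtual_key_t vkeys[] = {
        {   0, 0, 60, 40, ACT_MUTE       },
        {  60, 0, 60, 40, ACT_SOUND_UP   },
        { 120, 0, 60, 40, ACT_SOUND_DOWN },
    };

    /* Translate a tactile contact at (x, y) into the action event of the
       virtual key it lands on, and transmit it to the recipient device. */
    void on_touch(int16_t x, int16_t y)
    {
        for (size_t i = 0; i < sizeof vkeys / sizeof vkeys[0]; i++) {
            const virtual_key_t *k = &vkeys[i];
            if (x >= k->x && x < k->x + k->w &&
                y >= k->y && y < k->y + k->h) {
                send_action_event(k->action);
                return;
            }
        }
    }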

[102] FIG. 4B is a partial elevation view of the bottom of one embodiment of the present technology. The touch screen 405 (203) is attached to the external chassis 402 (102-302) by clipping onto two protrusions 406L, 406R.

[103] In one embodiment of the present technology, the external chassis 102-402 comprises two proximally located holes 104, 404. A lanyard or strap may be used to tie the device to the wrist or neck. The benefit is twofold: securing the device, thus increasing the user’s agility, and easily allowing users to free their hands while keeping the device close.

[104] FIG. 5 is a diagram of a bottom plan view of one embodiment of the present technology. It shows the internal chassis 501 (304) used to cover the processing unit 502 (305), the power battery 506 (306), the six degrees-of-freedom inertial unit 503 (308), the vibration motor 504 (310) and the signal transceiver unit 505 (313). The processing unit 502 comprises one or more processors operatively connected to one or more memory devices. The one or more memory devices may include non-transitory computer-readable media storing code instructions that are executable by the one or more processors.

[105] In one embodiment of the present technology, the battery 506 (306) is rechargeable by connecting a wire through the USB female plug 205, 316 to a regular USB power source (5 V, 500 mA). While charging, the present technology is still usable. While connected by a wire to a recipient, the battery is not used.

[106] In one embodiment of the present technology, the battery 506 (306) is replaceable after removing the touch screen 203, 309. The touch screen is easily removable from the external chassis 102-402, leaving access to the internal chassis 501 (304).

[107] In one embodiment of the present technology, the signal transceiver unit 505 (313) is a Bluetooth module sending signals at 2.4 GHz. However, the present technology may be implemented with any type of wireless connection, such as, but not limited to, WiFi, other radio frequency (RF) technologies, and infrared (IR) technologies.

[108] In one embodiment of the present technology, a recipient device (e.g. a desktop computer) may be connected, sending or receiving data by a wire through the USB female plug 205, 316. The data sent and received through the USB wire is similar to that sent and received through the signal transceiver unit 505 (313). Moreover, the present technology may be connected to two different recipient devices (e.g. a desktop computer and a virtual reality head-mounted display) at the same time, one through the USB female plug 205, 316 and one through the signal transceiver unit 505 (313).
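A minimal C sketch of the dual-recipient transmission described in paragraph [108]; the transport hooks are assumptions, since the application does not disclose firmware interfaces.

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical transport hooks for the two links of paragraph [108]. */
    extern bool usb_connected(void);
    extern bool wireless_connected(void);
    extern void usb_send(const uint8_t *data, size_t len);
    extern void wireless_send(const uint8_t *data, size_t len);

    /* Send the same report to up to two recipient devices at once: one
       over the USB plug, one over the signal transceiver unit. */
    void send_report(const uint8_t *data, size_t len)
    {
        if (usb_connected())
            usb_send(data, len);
        if (wireless_connected())
            wireless_send(data, len);
    }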

[109] FIG. 6 is a diagram illustrating hardware components 601 and how these components interrelate in an embodiment of the present technology. A set of input elements 602 captures user interaction through the multiple keys (press and release) 606 (103-403), an inertial unit 607 (308, 503), an optical sensor 608 (204, 314) and a touch screen 605 (203, 309). Information about user interactions from all input elements 602 is processed by the processing unit 604 (305) and sent to a recipient device 612 through the signal transceiver 611 (313). A recipient device 612 may, in response or independently, send feedback data through the signal transceiver 611; the feedback data is then processed by the processing unit 604 and routed to one or more dedicated output elements 603, such as the vibration motor 609 (310), the LEDs 610 (312) and the touch screen(s) 605 (309). In response to the feedback data, the processing unit 604 may cause the vibration motor 609 (310) to vibrate, cause one or more of the LEDs 610 (312) to emit a light signal, and/or cause the touch screen(s) 605 (309) to display information related to the feedback data.
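The data flow of FIG. 6 can be summarized as an event loop. The following C sketch mirrors that flow under assumed hook names (poll_input, transceiver_send, transceiver_receive, vibrate, set_leds, screen_show); none of these names come from the application.

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical hooks mirroring the elements of FIG. 6. */
    extern bool poll_input(uint8_t *event, size_t *len);      /* keys, inertial, optical, touch */
    extern void transceiver_send(const uint8_t *e, size_t n); /* to recipient device            */
    extern bool transceiver_receive(uint8_t *fb, size_t *n);  /* feedback from recipient        */
    extern void vibrate(void);
    extern void set_leds(const uint8_t *fb, size_t n);
    extern void screen_show(const uint8_t *fb, size_t n);

    /* Event loop: user interactions go out through the transceiver;
       feedback data is routed to the vibration motor, the LEDs and the
       touch screen, as described for FIG. 6. */
    void main_loop(void)
    {
        uint8_t buf[64];
        size_t  len;

        for (;;) {
            if (poll_input(buf, &len))
                transceiver_send(buf, len);       /* inputs -> recipient  */

            if (transceiver_receive(buf, &len)) { /* recipient -> outputs */
                vibrate();
                set_leds(buf, len);
                screen_show(buf, len);
            }
        }
    }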

[110] FIG. 7 depicts a user holding an embodiment of the handheld controller device 701 (101-401) with both hands while wearing a virtual reality head-mounted display 702. The device 701 (101-401) is wirelessly connected to the virtual reality head-mounted display 702, which acts as a recipient device. The user is highly mobile and can type textual information with the multiple keys as well as manipulate three-dimensional contents with the device’s inertial motion. The recipient device can show any contents to the user such as, but not limited to, text editors, video games and professional applications.

[111] FIG. 8 depicts a user holding an embodiment of the handheld controller device 801 (101-401) with both hands while wearing a mixed reality head-mounted display 802. The device 801 (101-401) is wirelessly connected to the mixed reality head-mounted display 802, which acts as a recipient device. The user is highly mobile and can type textual information with the multiple keys as well as manipulate three-dimensional contents with the device’s inertial motion.

[112] FIG. 9 depicts a user interacting with an embodiment of the handheld controller device 901 (101-401) while sitting in front of an all-in-one desktop computer 902. The device 901 is wirelessly connected to the all-in-one desktop computer 902, which acts as a recipient device. The recipient device can show any contents to the user such as, but not limited to, text editors and integrated development environments.

[113] FIG. 10 depicts a user interacting with an embodiment of the handheld controller device 1001 (101-401) while lying on a couch in front of a smart TV 1002. The device 1001 is wirelessly connected to the smart TV 1002, which acts as a recipient device. The recipient device can show any contents to the user such as, but not limited to, text editors, media service players and video games.

[114] FIG. 11 depicts a user interacting with an embodiment of the handheld controller device 1101 (101-401) while lying on a couch with a tablet computer 1102. The device 1101 is wirelessly connected to the tablet computer 1102, which acts as a recipient device. The recipient device can show any contents to the user such as, but not limited to, text editors and email clients.

[115] FIG. 12 depicts a user holding an embodiment of the handheld controller device 1201 (101-401) while sitting in front of a laptop computer 1202. The device 1201 is wirelessly connected to the laptop computer 1202, which acts as a recipient device. The form factor of the embodiment of the present technology 1201 allows a user to hold and interact with the device while the arms 1203 are resting along the body.

[116] FIG. 13 depicts a user standing and holding with one hand an embodiment of the handheld controller device 1301 (101-401). The device 1301 is wirelessly connected to an out-of-body recipient device such as, but not limited to, a desktop computer or a smart TV. The plane surface of the embodiment of the device 1301 that comprises an optical sensor 204 is against the user’s body. Thanks to the optical sensor 204, the user may use the device 1301 as a pointing device (e.g. a mouse), moving a cursor displayed on the recipient device as the device 1301 moves along the body surface. The surface may be other than the body, for instance a wall or a desktop.

[117] FIG. 14 depicts a user standing and holding with one hand an embodiment of the handheld controller device 1401 (101-401) while wearing a mixed reality head-mounted display 1402. The device 1401 is wirelessly connected to the mixed reality head-mounted display 1402, which acts as a recipient device. The user is highly mobile and uses the inertial motion of the device 1401 as input to manipulate contents in a three-dimensional space. For instance, after selecting a three-dimensional virtual object that appears in the recipient device 1402, the user may rotate the virtual object, which follows the rotation of the device 1401.

[118] FIG. 15 depicts a user’s hands holding an embodiment of the handheld controller device 1501 (101-401). The figure shows the position of the hands and the device for a typing activity. The thumbs are positioned on the top surface 1502 while the other fingers are placed on the forward left 1503L and right 1503R sides. The flat surface carrying the touch screen 203, 309, the optical sensor 204, 314, the micro USB female plug 205, 316 and the power switch 206, 315 is the surface below, opposite the top surface 1502. This is by no means the only way to use an embodiment of the present technology 1501 (101-401); a user may hold and interact with it in various manners.

[119] FIG. 16 is a diagram showing an example of the key layout mapped on the handheld controller device 101-401. The layout is suited to the hands and device position presented in FIG. 15. Most of the letters are placed on the forward left 1601L and right 1601R sides, while special characters are placed on the top 1602, backward left 1603L and backward right 1603R sides. Needless to say, it is contemplated that other layouts may be used.

[120] FIG. 17 is a diagram showing a helper view from an optional software tool dedicated to recipient devices (such as the TV 1002 or the desktop computer 902). The software tool may be installed by the user on their recipient devices. The tool allows a user to see an interactive keyboard layout viewer 1701 showing the current state of each key of the handheld controller device in real time. The viewer is a visual helper presented on the recipient device’s display and visually adapted to the preferable position of the user’s hands and device. The diagram shows what users, such as those depicted in FIGS. 7-14, may see on their recipient device 702-1402 if they choose to have an optional visual helper, whether in augmented, mixed or virtual reality or in a real environment. The visual helper appears above any application in use and may be slightly transparent to avoid occlusion.

[121] Furthermore, in one embodiment of the present technology, the software tool allows a user to customize the layout by mapping characters to different keys, and to save and load custom mapping layouts. The software customization of the layout mapping allows a user to set preferences and to adapt the present technology depending on the language used or on the kind of application, such as, but not limited to, programming, designing and chatting. The customization may include changing key positions, but also adding or removing specific characters. The software customization of the layout mapping may be coupled with the hardware customization afforded by the movable key covers comprising the inscriptions. The software tool comes with several pre-defined layouts, but other custom layouts may be found on the internet and installed on the recipient device by a user.
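A custom mapping layout as described in paragraph [121] reduces, at its simplest, to a key-index-to-keycode table that can be saved and loaded. The C sketch below uses a flat binary file as the storage format; that format, like the type and function names, is an assumption made for illustration.

    #include <stdint.h>
    #include <stdio.h>

    #define NUM_KEYS 80  /* 5 faces x 16 keys */

    /* A custom mapping layout: one keycode per physical key. */
    typedef struct { uint16_t keycodes[NUM_KEYS]; } layout_t;

    /* Save a layout to disk; returns 0 on success, -1 on failure. */
    int save_layout(const layout_t *l, const char *path)
    {
        FILE *f = fopen(path, "wb");
        if (!f) return -1;
        size_t ok = fwrite(l, sizeof *l, 1, f);
        fclose(f);
        return ok == 1 ? 0 : -1;
    }

    /* Load a previously saved layout; returns 0 on success, -1 on failure. */
    int load_layout(layout_t *l, const char *path)
    {
        FILE *f = fopen(path, "rb");
        if (!f) return -1;
        size_t ok = fread(l, sizeof *l, 1, f);
        fclose(f);
        return ok == 1 ? 0 : -1;
    }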

[122] It shall be noted that those skilled in the art will readily recognize numerous adaptations and modifications which can be made to the various embodiments of the present technology which will result in an improved technology, yet all of which will fall within the spirit and scope of the present technology as defined in the following claims. Accordingly, the present technology is to be limited only by the scope of the following claims and their equivalents.

[123] While the above-described implementations have been described and shown with reference to particular steps performed in a particular order, it will be understood that these steps may be combined, sub-divided, or re-ordered without departing from the teachings of the present technology. At least some of the steps may be executed in parallel or in series. Accordingly, the order and grouping of the steps is not a limitation of the present technology.

[124] It should be expressly understood that not all technical effects mentioned herein need to be enjoyed in each and every embodiment of the present technology.

[125] Modifications and improvements to the above-described implementations of the present technology may become apparent to those skilled in the art. The foregoing description is intended to be exemplary rather than limiting. The scope of the present technology is therefore intended to be limited solely by the scope of the appended claims.