
Title:
SENSING A USER'S TOUCH
Document Type and Number:
WIPO Patent Application WO/2020/036604
Kind Code:
A1
Abstract:
A touch sensing device includes an image capture device to capture a plurality of images of a media. The touch sensing device also includes a processing device to compare the plurality of images of the media to determine changes in glare artifacts resulting from the touch of the media by a user.

Inventors:
ROBINSON IAN (US)
VANKIPURAM MITHRA (US)
Application Number:
PCT/US2018/046951
Publication Date:
February 20, 2020
Filing Date:
August 17, 2018
Assignee:
HEWLETT PACKARD DEVELOPMENT CO (US)
International Classes:
G06F3/01; G06F3/042
Foreign References:
US20100066675A1 (2010-03-18)
US20150138232A1 (2015-05-21)
US20170351324A1 (2017-12-07)
US20180075657A1 (2018-03-15)
Attorney, Agent or Firm:
WOODWORTH, Jeffrey et al. (US)
Claims:
CLAIMS

WHAT IS CLAIMED IS:

1. A touch sensing device, comprising:

an image capture device to capture a plurality of images of a media; and

a processing device to compare the plurality of images of the media to determine changes in glare artifacts resulting from the touch of the media by a user.

2. The touch sensing device of claim 1, wherein output of the touch sensing device defines instructions within an extended reality system.

3. The touch sensing device of claim 1, wherein the media comprises:

paper; and

a glossy sleeve surrounding the paper,

wherein the glossy sleeve comprises:

a flat portion covering a detection side of the paper; and

a deformable base covering a non-detection side of the paper.

4. The touch sensing device of claim 1, wherein the media comprises a tattoo comprising a glossy overlay.

5. The touch sensing device of claim 1, wherein the media comprises:

paper; and

a glossy overlay covering at least a portion of a detection side of the paper,

wherein the media bends to change the gloss of the media using a curl formed in the media, a sleeve surrounding the media, a deformable base under the media, or a combination thereof.

6. The touch sensing device of claim 1, wherein the image capture device detects infrared electromagnetic wavelengths.

7. A system for sensing a user’s touch, comprising:

a media comprising a detectable glare artifact when deformed;

an image capture device to capture a plurality of images of the media; and

a processing device to compare the plurality of images of the media to determine changes in the media and detect glare artifacts resulting from the deformation of the media by a user’s touch.

8. The system of claim 7, comprising a deformable surface under the media, the deformable surface allowing the media to deform into the deformable surface under pressure.

9. The system of claim 7, comprising an electromagnetic wave source to illuminate the media.

10. The system of claim 9, wherein the electromagnetic wave source comprises an infrared electromagnetic wave source.

11. The system of claim 10, wherein the image capture device detects infrared electromagnetic wavelengths reflected from the media.

12. The system of claim 9, wherein the electromagnetic wave source comprises a patterned electromagnetic wave source to cause a pattern to be formed onto the media.

13. A method of sensing a user’s touch, comprising:

with an image capture device, capturing an initial image of a media, the media comprising a detectable glare artifact when the media is deformed;

detecting a change in the glare artifact in a subsequent image of the media as compared to the initial image of the media; and

in response to a determination that the change in the glare artifact exists between the initial image and the subsequent image, defining the change in the glare artifact as instructions within an extended reality system.

14. The method of claim 13, comprising:

determining whether a permanent deformation of the media has occurred based on a comparison of the initial image and a subsequent image captured subsequent to the capture of the subsequent image; and

registering the permanent deformation of the media as a baseline image for detection of a subsequent touch event.

15. The method of claim 13, comprising:

recognizing an image printed on the media; and

orienting a user interface for interaction with an extended reality system based on the orientation of the image.

Description:
SENSING A USER’S TOUCH

BACKGROUND

[0001] Extended reality (ER) systems and methods provide users with a sensory experience in which real-world environments and virtual-world environments are combined to give the user a virtually immersive experience within an otherwise real-world environment. Extended reality systems include, for example, augmented reality (AR), augmented virtuality (AV), virtual reality (VR), mixed reality (MR), simulated reality (SR), and other areas interpolated along a reality-virtuality continuum.

BRIEF DESCRIPTION OF THE DRAWINGS

[0002] The accompanying drawings illustrate various examples of the principles described herein and are part of the specification. The illustrated examples are given merely for illustration, and do not limit the scope of the claims.

[0003] Fig. 1 is a block diagram of a touch sensing device, according to an example of the principles described herein.

[0004] Fig. 2 is a block diagram of a system for sensing a user’s touch, according to an example of the principles described herein.

[0005] Fig. 3 is a cross-sectional diagram of a media for use in detecting a touch of the media, according to an example of the principles described herein.

[0006] Fig. 4 is a cross-sectional diagram of a media for use in detecting a touch of the media, according to an example of the principles described herein.

[0007] Fig. 5 is a perspective view of a tattoo media for use in detecting a touch of the tattoo media by a user, according to an example of the principles described herein.

[0008] Fig. 6 is a perspective view of a wrist-worn media for use in detecting a touch of a user, according to an example of the principles described herein.

[0009] Fig. 7 is a diagram of a user and a media before the user touches the media, according to an example of the principles described herein.

[0010] Fig. 8 is a diagram of a user and a media before the user touches the media where the user’s finger creates a shadow over a glare artifact of the media, according to an example of the principles described herein.

[0011] Fig. 9 is a diagram of a user and a media when the user has touched the media creating a distorted glare artifact on the media, according to an example of the principles described herein.

[0012] Fig. 10 is a flowchart showing a method of sensing a user’s touch, according to an example of the principles described herein.

[0013] Fig. 11 is a flowchart showing a method of sensing a user’s touch, according to an example of the principles described herein.

[0014] Throughout the drawings, identical reference numbers designate similar, but not necessarily identical, elements. The figures are not necessarily to scale, and the size of some parts may be exaggerated to more clearly illustrate the example shown. Moreover, the drawings provide examples and/or implementations consistent with the description; however, the description is not limited to the examples and/or implementations provided in the drawings.

DETAILED DESCRIPTION

[0015] Some extended reality (ER) systems may detect user-interactions such as a touch of a user to allow that touch to be interpreted as a user-interactive event that causes the ER system to process data relating to the detected touch. An image capture device such as a camera may be used to detect the movements of the user, and interpret those movements as interactive events with the ER system and environment. In one example, the user may interact with a form of media by touching the media to create the user-interactive event. Some forms of media used in detecting the touch of the user within the ER environment may include papers, textiles, foils, and other types of media, and some of these types of media may include a form factor that is meant to be worn on a part of the user’s body such as the user’s wrist. In this manner, the media serves as an inexpensive form of interactive media that, when captured by an image capture device, may be used as an input device to the ER system.

[0016] However, in some ER systems that utilize image capture devices to, for example, track the user’s movements, the actual touch of the user on the media may be difficult to detect: a finger hovering anywhere from a fraction of a millimeter to several millimeters above the media may be registered as an actual touch even though the user had no intent for the hover to be interpreted as one. This false detection may frustrate the user in some ER scenarios because it runs contrary to the user’s intent.

[0017] Examples described herein provide a touch sensing device. The touch sensing device includes an image capture device to capture a plurality of images of a media. The touch sensing device also includes a processing device to compare the plurality of images of the media to determine changes in glare artifacts resulting from the touch of the media by a user.

[0018] Output of the touch sensing device defines instructions within an extended reality system. The media may include paper, and a glossy sleeve surrounding the paper. The glossy sleeve may include a flat portion covering a detection side of the paper, and a deformable base covering a non-detection side of the paper. The media may include a tattoo including a glossy overlay. Further, the media may include paper, and a glossy overlay covering at least a portion of a detection side of the paper. The media bends to change the gloss of the media using a curl formed in the media, a sleeve surrounding the media, a deformable base under the media, or a combination thereof. Further, in an example, the image capture device detects infrared electromagnetic wavelengths.

[0019] Examples described herein also provide a system for sensing a user’s touch. The system may include a media including a detectable glare artifact when deformed. The system may also include an image capture device to capture a plurality of images of the media, and a processing device to compare the plurality of images of the media to determine changes in the media and detect glare artifacts resulting from the deformation of the media by a user’s touch.

[0020] The system may include a deformable surface under the media. The deformable surface allows the media to deform into the deformable surface under pressure. Further, the system may include an electromagnetic wave source to illuminate the media. The electromagnetic wave source may include an infrared electromagnetic wave source. The image capture device detects infrared electromagnetic wavelengths reflected from the media. The electromagnetic wave source may include a patterned electromagnetic wave source to project a pattern onto the media.

[0021] Examples described herein also provide a method of sensing a user’s touch. The method may include, with an image capture device, capturing an initial image of a media, the media comprising a detectable glare artifact when the media is deformed, detecting a change in the glare artifact in a subsequent image of the media as compared to the initial image of the media, and, in response to a determination that the change in the glare artifact exists between the initial image and the subsequent image, defining the change in the glare artifact as instructions within an extended reality system. The method may also include recognizing, via images captured by the image capture device, the media that is being interacted with, and determining the extents of the media within the field of view of the image capture device. The method may also include detecting the location of the user’s touch on the media in, for example, an (x, y) coordinate within the detected extents of the media. In examples where indicia, graphics, or other interactable marks are included on the media, the ability of the processor to identify the location on the media where the finger of the user’s hand touches the media allows for a plurality of indicia to be presented on the media and provides for a number of different interactions within the ER environment to be realized.

[0022] The method may also include determining whether a permanent deformation of the media has occurred based on a comparison of the initial image and a subsequent image captured subsequent to the capture of the subsequent image, and registering the permanent deformation of the media as a baseline image for detection of a subsequent touch event. Further, the method may include recognizing an image printed on the media, and orienting a user interface for interaction with an extended reality system based on the orientation of the image.

[0023] As used in the present specification and in the appended claims, the term “extended reality” is meant to be understood broadly as any real-and-virtual combined environments and human-machine interactions generated by computer technology and wearables, and includes representative forms such as, for example, augmented reality (AR), augmented virtuality (AV), virtual reality (VR), mixed reality (MR), simulated reality (SR), and other areas interpolated along a reality-virtuality continuum. The levels of virtuality range from partial sensory inputs to immersive virtuality that may be referred to as VR. Examples described herein may be used or applied in any extended reality system or scenario.

[0024] As used in the present specification and in the appended claims, the term “media” is meant to be understood broadly as any object whose surface may be deformable and that includes detectable changes in reflectance of light upon deformation of the surface. Thus, the media may be paper, plastics, skin, overlays, other deformable surfaces, and combinations thereof.

[0025] As used in the present specification and in the appended claims, the term “optical distortions” is meant to be understood broadly as any optically detected change in a surface. In the examples described herein, optical distortions may be detected by capturing a plurality of images of a surface with an image capture device and comparing the plurality of images to determine whether electromagnetic waves reflected off the surface change between the captured images. As a user touches a surface such as the surface of a media, the angles of the surface change and light may reflect differently, creating a glare artifact that is detectable as an optical distortion. In an example, the media may include a glossy surface that assists in the detection of the optical distortions.

[0026] Turning now to the figures, Fig. 1 is a block diagram of a touch sensing device (100), according to an example of the principles described herein. The touch sensing device (100) may form part of an overall extended reality (ER) system that includes a plurality of user-interactive devices such as, for example, VR headsets, joysticks, force balls, tracking balls, controller wands, trackpads, on-device control buttons, motion trackers, bodysuits, treadmills, and motion platforms, among other peripheral devices and accessories. The touch sensing device (100) may be used to detect a user’s touch of the media (150), and that touch is used as input to the ER system to allow the user to interact within the ER environment produced by the ER system.

[0027] The touch sensing device (100) may include a processor (101) and an image capture device (120) to capture a number of images of the media (150). The image capture device (120) may have an angle at which it captures an image of the media (150) as indicated by lines 152. The user’s hand (170) is depicted in Fig. 1 as touching the media (150), and an optical distortion (151) may be detected. In an example, the media (150) may include a glossy surface. Glossiness is an optical property of a surface of the media (150) that indicates how well the surface reflects light in a specular or mirror-like manner. The refractive index of the media (150) or a surface of the media (150) may be a factor that defines the glossiness of the media (150).

[0028] In an example, the media (150) may include portions that include a glossy surface and portions that do not. Further, in one example, the media (150) may include portions that are deformable and portions that are not deformable. These glossy and non-glossy portions and deformable and non-deformable portions may be used to indicate interaction with designated portions of the media (150) that may be interpreted as different types of inputs to the ER environment.

[0029] In one example, the processor (101) may recognize, via images captured by the image capture device (120), the media (150) that is being interacted with, and determine the extents of the media (150) within the field of view of the image capture device (120). The processor may also detect the location of the user’s touch on the media (150) in, for example, an (x, y) coordinate within the detected extents of the media (150). In examples where indicia (Figs. 5 and 6, 503), graphics, or other interactable marks are included on the media (150), the ability of the processor (101) to identify the location on the media (150) where the finger (170-1) of the user’s hand (170) touches the media (150) allows for a plurality of indicia (Figs. 5 and 6, 503) to be presented on the media (150) and provides for a number of different interactions within the ER environment to be realized.

[0030] The processor (101) may instruct the image capture device (120) to capture a plurality of images of the media (150) and the user’s hand (170) before, during, and/or after the user touches the media (150). The processor (101) may then compare the plurality of images of the media (150) to determine changes in the optical properties of the media (150) to detect any optical distortions (151) such as glare artifacts resulting from the touch of the media (150) by the user.
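
As a rough illustration of this comparison step, the following sketch flags a possible touch by differencing two captured frames. It assumes OpenCV is available, and the threshold values are arbitrary assumptions rather than values from this disclosure.

```python
import cv2

DIFF_THRESHOLD = 25        # assumed per-pixel intensity change that counts
MIN_CHANGED_PIXELS = 500   # assumed area below which changes are noise

def glare_changed(prev_frame, curr_frame):
    """Compare two captured images of the media and report whether the
    reflected-light (glare) pattern changed enough to suggest a touch."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(prev_gray, curr_gray)  # per-pixel change in reflectance
    _, changed = cv2.threshold(diff, DIFF_THRESHOLD, 255, cv2.THRESH_BINARY)
    return cv2.countNonZero(changed) > MIN_CHANGED_PIXELS
```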

[0031] By detecting the optical distortions (151), the touch sensing device (100) may detect when the user actually touches the media (150) as opposed to simply hovering his or her finger or another part of the user’s hand (170) over the media (150). In this manner, an actual touch of the media (150) may be detected. At the same time, the position of the user’s fingertip relative to the dimensions of the detected media may also be reported.

[0032] Fig. 2 is a block diagram of a system (200) for sensing a user’s touch, according to an example of the principles described herein. The system (200) includes the touch sensing device (100) of Fig. 1 and the processor (101) and image capture device (120) of the touch sensing device (100).

[0033] Further, the system (200) may include a plurality of electromagnetic wave sources (121-1, 121-2, collectively referred to herein as 121). The electromagnetic wave sources (121) may be any device used to illuminate the media within a field of illumination (122). The electromagnetic wave sources (121) may emit any wavelength of light within or without the visible spectrum. In an example, the electromagnetic wave sources (121) emit visible light such as white light. In this example, the image capture device (120) may detect the visible light directed to the media (150) and reflected therefrom. In an example, the electromagnetic wave sources (121) emit infrared (IR) wavelengths of light. In this example, the image capture device (120) may detect the IR wavelengths directed to the media (150) and reflected therefrom. In an example, the system (200) may operate without the use of the electromagnetic wave sources (121). In this example, ambient light produced by natural or artificial light sources available from the surrounding environment may be reflected from the media (150) and captured by the image capture device (120).

[0034] In an example, a single electromagnetic wave source (121) may be used in the system (200) of Fig. 2, or a plurality of electromagnetic wave sources (121) may be utilized in emitting radiant energy towards the media (150). Further, in an example where a plurality of electromagnetic wave sources (121) are utilized, the system (200) may cause the electromagnetic wave sources (121) to alternate between one another from different directions. In this example, the optical distortions (151) that indicate glare distortions may be detected more effectively. By comparing images captured when a first electromagnetic wave source (121-1) is activated with images captured when a second electromagnetic wave source (121-2) is activated, the system (200) may more effectively detect the optical distortions (151).
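
A minimal sketch of this alternating-illumination scheme follows. The camera and light-source objects and their read()/on()/off() methods are hypothetical placeholders, not interfaces defined by this disclosure.

```python
def detect_with_alternating_sources(camera, source_a, source_b, compare):
    """Capture one frame under each electromagnetic wave source and compare
    the two views. Glare tracks the illumination direction, so a deformation
    of the media shows up as a mismatch between the differently lit frames."""
    source_b.off()
    source_a.on()
    frame_a = camera.read()   # view lit from the first direction
    source_a.off()
    source_b.on()
    frame_b = camera.read()   # view lit from the second direction
    source_b.off()
    return compare(frame_a, frame_b)
```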

[0035] In examples where the media (150) includes a matte finish without a glossy layer or coating to provide a refractive index sufficient to detect the optical distortions (151), no specular light may be detected. In this situation, a laser device or a pico-projector may be used as one of the electromagnetic wave sources (121) to project a pattern onto the media (150) in visible or non-visible wavelengths of light. In this example, the device producing the pattern may be separated from and have a different imaging angle relative to the image capture device (120), as depicted by either of the electromagnetic wave sources (121) in Fig. 2, so that any deformation of the media (150) may cause a relatively significant change in the projected pattern. The image capture device (120) may capture any changes in the pattern, and the processor (101) may identify that change in the pattern as a touch event.

[0036] In an example where a laser is employed to project a pattern onto the media (150), laser speckle may be used to detect movement or deformation of the surface of the media (150). Laser speckle is a spotty interference pattern that arises when laser light is reflected off irregularities in a rough surface such as the rough surface of the media (150). The reflected laser light in this example interferes with itself, and even very small changes in the surface position or orientation create large, easily detectable changes in the speckle pattern. A detected change in the speckle pattern may thus be used to determine when and where the user touched the media.
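
One conventional way to quantify such a speckle change is normalized cross-correlation between a baseline speckle patch and the current patch. The sketch below assumes NumPy and grayscale image patches; the 0.5 threshold in the usage note is an arbitrary assumption.

```python
import numpy as np

def speckle_decorrelation(baseline_patch, current_patch):
    """Return 1 minus the normalized correlation of two grayscale speckle
    patches: near 0 means the surface is unchanged, near 1 means the speckle
    pattern has decorrelated (the surface moved or deformed)."""
    a = baseline_patch.astype(np.float64).ravel()
    b = current_patch.astype(np.float64).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return 0.0
    return 1.0 - float(np.dot(a, b) / denom)

# Usage sketch: a touch might be flagged when decorrelation crosses an
# assumed threshold, e.g. speckle_decorrelation(base, cur) > 0.5.
```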

[0037] The media (150) may be any substrate that may reflect at least some light from its surfaces. For example, Fig. 3 is a cross-sectional diagram of a media (300) for use in detecting a touch of the media (300), according to an example of the principles described herein. The media (300) may be made of paper with or without a material having a refractive index applied thereon. In other words, in an example, the paper (301) may have a naturally glossy surface with a detectable refractive index. In the example of Fig. 3, however, the paper (301) may be coated with a glossy material (302) that provides the paper (301) with a glossy surface with a detectable refractive index.

[0038] The media (150) may also be covered with a sleeve (402) as depicted in Fig. 4. Fig. 4 is a cross-sectional diagram of a media (400) for use in detecting a touch of the media (400), according to an example of the principles described herein. The media (400) of Fig. 4 may include a paper (401) or other substrate enclosed within a sleeve (402). The sleeve (402) has a detectable refractive index that allows for the detection of the optical distortions (151). In an example, the sleeve (402) may include a corrugated underside (403) that allows the paper (401) and the top portion of the sleeve (402) to deform under pressure from the user’s finger (170-1). In this manner, the paper (401) of the media (400) of Fig. 4 may deform under applied pressure to create the optical distortion (151) depicted in Figs. 1 and 2.

[0039] Fig. 5 is a perspective view of a tattoo media (500) for use in detecting a touch of the tattoo media (500) by a user, according to an example of the principles described herein. In the example of Fig. 5, the tattoo media (500) may include a tattoo (501) applied to a portion of the user such as the user’s hand (171), and the opposite hand (170) may be used to interact with the tattoo (501). The tattoo (501) may be either a permanent or a temporary tattoo, and may include a natural or artificial glossy surface (502). The glossy surface (502) may be the result of oils excreted from the user’s body, or may be a residual transfer film (502) left from a temporary tattoo. A temporary tattoo is any decorative image that may be applied to the skin of the user for short periods of time. A process referred to as screen printing is used to create the temporary tattoo image on paper coated with a transfer film. The transfer film allows the image to “slide” off the backing paper and onto the skin when moisture is applied. After drying, the transfer film holds the image on the skin through several washings. Thus, in the example of Fig. 5 where the tattoo media (500) is a temporary tattoo, the glossy surface (502) is provided by the transfer film (502). In the example of Fig. 5, because the tattoo media (500) is applied to the skin, and because the skin is a deformable subsurface as to the tattoo media (500), the deformation of the user’s skin causes the optical distortion (151) to appear and be detectable by the image capture device (120).

[0040] Fig. 6 is a perspective view of a wrist-worn media (600) for use in detecting a touch of a user, according to an example of the principles described herein. The wrist-worn media (600) may include a flexible backing (601) with a form factor that allows the wrist-worn media (600) to encircle the wrist of one of the hands (171) of the user while the user may use the other hand (170) to interact with the wrist-worn media (600). The wrist-worn media (600) may include a glossy material (602) applied to the backing (601) that provides the backing (601) with a glossy surface with a detectable refractive index. Like the tattoo media (500) of Fig. 5, because the wrist-worn media (600) of Fig. 6 is deformable, and because the skin of the user over which the wrist-worn media (600) lies is a deformable subsurface, the deformation of the wrist-worn media (600) causes the optical distortion (151) to appear and be detectable by the image capture device (120).

[0041] In the examples of Figs. 5 and 6, the media (500, 600) may include indicia (503) printed thereon. The indicia (503) in the examples of Figs. 5 and 6 is a quick response (QR) code. Once the user touches the media (500, 600), the image capture device (120) of the touch sensing device (100) may identify the optical distortions (151), and take actions based on the indicia (503) such as, for example, accessing a website defined by the QR code. In other examples, a number of input buttons may be printed on the media (500, 600), and the image capture device (120) may detect the optical distortions (151) that indicate selection of one of the printed buttons. This allows the user to make a plurality of types of input gestures through the media (500, 600).

[0042] In an example, the image capture device (120) may detect the presence and location of the user’s fingertip, and that location may be reported along with the touch event, disambiguating touch events at different locations on the media (150). The processor (101) may recognize, via images captured by the image capture device (120), the media (150) that is being interacted with, and determine the extents of the media (150) within the field of view of the image capture device (120). The processor may also detect the location of the user’s touch on the media (150) in, for example, an (x, y) coordinate within the detected extents of the media (150). In examples where indicia (Figs. 5 and 6, 503), graphics, or other interactable marks are included on the media (150), the ability of the processor (101) to identify the location on the media (150) where the finger (170-1) of the user’s hand (170) touches the media (150) allows for a plurality of indicia (Figs. 5 and 6, 503) to be presented on the media (150) and provides for a number of different interactions within the ER environment to be realized.
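
As one possible realization of reporting an (x, y) coordinate within the detected extents of the media, the sketch below warps the fingertip’s pixel position into media-relative coordinates with a perspective transform. It assumes OpenCV, and the corner ordering and normalized output range are illustrative assumptions; both inputs would come from upstream detection steps not shown here.

```python
import cv2
import numpy as np

def touch_location_on_media(media_corners_px, fingertip_px):
    """Map a fingertip position in image pixels to a normalized (x, y)
    coordinate within the detected extents of the media. `media_corners_px`
    holds the four detected media corners (top-left, top-right, bottom-right,
    bottom-left) in image pixels."""
    dst = np.float32([[0, 0], [1, 0], [1, 1], [0, 1]])  # media-relative frame
    h = cv2.getPerspectiveTransform(np.float32(media_corners_px), dst)
    point = np.float32([[fingertip_px]])                # shape (1, 1, 2)
    x, y = cv2.perspectiveTransform(point, h)[0, 0]
    return float(x), float(y)
```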

[0043] In one example, the media described herein may be three-dimensional (3D) media. The 3D media may be any three-dimensional object with at least a surface region that is deformable, and with a glossy surface that produces the optical distortions. In one example, a deformable, glossy sphere such as a foam ball with a glossy surface may be used to interact with the system (200). In an example, the 3D object may include a smaller section that is independently deformable such as, for example, a flat media (150) that includes a deformable, domed region. In this example, the domed region is large enough to create a detectable gloss artifact as an optical distortion (151) and may be used to indicate an area of the 3D object that is touched.

[0044] Returning to Fig. 2, the touch sensing device (100) of the system (200) may be utilized in any data processing scenario including stand-alone hardware, mobile applications, a computing network, or combinations thereof. Further, the touch sensing device (100) may be used in a computing network, a public cloud network, a private cloud network, a hybrid cloud network, other forms of networks, or combinations thereof. In one example, the methods provided by the touch sensing device (100) are provided as a service over a network by, for example, a third party. In this example, the service may include, for example, the following: a Software as a Service (SaaS) hosting a number of applications; a Platform as a Service (PaaS) hosting a computing platform including, for example, operating systems, hardware, and storage, among others; an Infrastructure as a Service (IaaS) hosting equipment such as, for example, servers, storage components, and network components, among others; an application program interface (API) as a service (APIaaS); other forms of network services; or combinations thereof. The present systems may be implemented on one or multiple hardware platforms, in which the modules in the system can be executed on one or across multiple platforms. Such modules can run on various forms of cloud technologies and hybrid cloud technologies, or can be offered as SaaS that can be implemented on or off the cloud. In another example, the methods provided by the touch sensing device (100) are executed by a local administrator.

[0045] To achieve its desired functionality, the touch sensing device (100) includes various hardware components. Among these hardware components may be the processor (101), a number of data storage devices (102), a number of peripheral device adapters (103), a number of network adapters (104), a number of display devices (109), a number of user-interactive devices (110), and the image capture device (120). These hardware components may be interconnected and communicatively coupled via a bus (105).

[0046] The processor (101) may include the hardware architecture to retrieve executable code from the data storage device (102) and execute the executable code. The executable code may, when executed by the processor (101), cause the processor (101) to implement at least the functionality of instructing the image capture device (120) to capture an initial image of the media (150), instructing the image capture device (120) to capture a subsequent image of the media (150), detecting a change in the glare artifact in a subsequent image of the media as compared to the initial image of the media, and defining the change in the glare artifact as instructions within an extended reality system based on a determination that the change in the glare artifact exists between the initial image and the subsequent image. The executable code may, when executed by the processor (101), cause the processor (101) to implement at least the functionality of determining whether a permanent deformation of the media has occurred based on a comparison of the initial image and a subsequent image captured subsequent to the capture of the subsequent image, registering the permanent deformation of the media as a baseline image for detection of a subsequent touch event, recognizing an image printed on the media, recognizing the extents of the media, recognizing a user’s hand and tracking its fingertip(s), orienting a user interface for interaction with an extended reality system based on the orientation of the image, recognizing, via the image capture device (120), the media that is being interacted with, determining the extents of the media (150) within the field of view of the image capture device (120), detecting the location of the touch in, for example, an (x, y) coordinate location within the detected extents of the media (150), and other processes, according to the methods of the present specification described herein. In the course of executing code, the processor (101) may receive input from and provide output to a number of the remaining hardware units.

[0047] The data storage device (102) may store data such as executable program code that is executed by the processor (101) or other processing device. As will be discussed, the data storage device (102) may specifically store computer code representing a number of applications that the processor (101) executes to implement at least the functionality described herein.

[0048] The data storage device (102) may include various types of memory modules, including volatile and nonvolatile memory. For example, the data storage device (102) of the present example includes Random Access Memory (RAM) (106), Read Only Memory (ROM) (107), and Hard Disk Drive (HDD) memory (108). Many other types of memory may also be utilized, and the present specification contemplates the use of many varying type(s) of memory in the data storage device (102) as may suit a particular application of the principles described herein. In certain examples, different types of memory in the data storage device (102) may be used for different data storage needs. For example, in certain examples the processor (101) may boot from Read Only Memory (ROM) (107), maintain nonvolatile storage in the Hard Disk Drive (HDD) memory (108), and execute program code stored in Random Access Memory (RAM) (106).

[0049] The data storage device (102) may include a computer readable medium, a computer readable storage medium, or a non-transitory computer readable medium, among others. For example, the data storage device (102) may be, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the computer readable storage medium may include, for example, the following: an electrical connection having a number of wires, a portable computer diskette, a hard disk, a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store computer usable program code for use by or in connection with an instruction execution system, apparatus, or device. In another example, a computer readable storage medium may be any non-transitory medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.

[0050] The hardware adapters (103, 104) in the touch sensing device (100) enable the processor (101) to interface with various other hardware elements, external and internal to the touch sensing device (100). For example, the peripheral device adapters (103) may provide an interface to input/output devices, such as, for example, the display device (109), the user-interactive devices (110), the electromagnetic wave sources (121), a mouse, a keyboard, or other data input or data output devices. The peripheral device adapters (103) may also provide access to other external devices such as an external storage device, a number of network devices such as, for example, servers, switches, and routers, client devices, other types of computing devices, and combinations thereof.

[0051] The display device (109) may be provided to allow a user of the touch sensing device (100) to interact with and implement the functionality of the touch sensing device (100). The display device (109) may be a head-mounted display for extended reality applications. The peripheral device adapters (103) may also create an interface between the processor (101) and the display device (109), a printer, or other media output devices. The network adapter (104) may provide an interface to other computing devices within, for example, a network, thereby enabling the transmission of data between the touch sensing device (100) and other devices located within the network.

[0052] As the electromagnetic wave sources (121) provide radiant energy towards the media (150) and the image capture device (120) captures images of the media (150), a deformable substrate (180) may be placed under the media (150) to allow the media (150) to deform under the pressure from the user’s hand (170). In this manner, the optical distortion (151) may be produced as the deformable substrate (180) deforms under the media (150) when pressure is applied to the media (150). In an example, the media (150) may include a curl or natural bend that allows the user to distort the media (150) from the curled state to a flatter or otherwise different state. The curl of the media (150) may be used in addition to or in place of the deformable substrate (180) placed underneath the media (150).

[0053] The user-interactive devices (110) may include, for example, VR headsets, joysticks, force balls, tracking balls, controller wands, data gloves, trackpads, on-device control buttons, motion trackers, bodysuits, treadmills, and motion platforms, among other peripheral devices and accessories. These devices may work in concert with the touch sensing device (100) within an ER system to provide the user with an immersive environment. In one example, the user-interactive devices (110) may be located outside the touch sensing device (100), and instead within an ER system of which the touch sensing device (100) is a part.

[0054] The touch sensing device (100) further includes a number of modules used in the implementation of the functionality of the system (200) and the touch sensing device (100) described herein. The various modules within the touch sensing device (100) include executable program code that may be executed separately. In this example, the various modules may be stored as separate computer program products. In another example, the various modules within the touch sensing device (100) may be combined within a number of computer program products; each computer program product including a number of the modules.

[0055] The touch sensing device (100) may include a differentiation module (115) to, when executed by the processor (101), compare a plurality of images of the media (150) captured by the image capture device (120) to determine changes in the media (150) using the optical distortions (151). The optical distortions (151) may also be referred to herein as glare artifacts, and result from the deformation of the media (150) by a user’s touch. For example, Figs. 7 through 9 depict the media (150) at different phases of a sequence of touching the media (150). Specifically, Fig. 7 is a diagram of a user (170) and a media (150) before the user (170) touches the media (150), according to an example of the principles described herein. Fig. 8 is a diagram of a user (170) and a media (150) before the user (170) touches the media (150) where the user’s finger (170-1) creates a shadow (175) over a glare artifact (701) of the media (150), according to an example of the principles described herein. Further, Fig. 9 is a diagram of a user (170) and a media (150) when the user (170) has touched the media (150), creating a distorted glare artifact (901) on the media (150), according to an example of the principles described herein. Although the glare artifact (701) is depicted in Figs. 7 through 9 as a band of reflected light across the media (150), the number of glare artifacts (701) and their shape and position may vary from one piece of media to another.

[0056] As defined herein, the term “optical distortions” is meant to be understood broadly as any optically detected change in a surface. In the examples described herein, optical distortions (151) may be detected by capturing a plurality of images of a surface with an image capture device (120) and comparing the plurality of images to determine whether electromagnetic waves reflected off the surface change between the captured images. As a user touches a surface such as the surface of the media (150), the angles of the surface change and light may reflect differently, creating a glare artifact that is detectable as an optical distortion. In an example, the media (150) may include a glossy surface that assists in the detection of the optical distortions.

[0057] In one example, a threshold may be established that defines a sufficient change between the plurality of images that may be used by the touch sensing device (100) to determine the existence of the glare artifacts that indicate optical distortions (151) along the surface of the media (150). In one example, regions along the surface of the media (150) may, when touched by a user, create regions of relatively higher intensity electromagnetic waves, regions of different intensities of electromagnetic waves, or different patterns or locations of glare artifacts along the surface of the media (150) due to the change in the manner in which light reflects from the surface of the media (150). These changes are detectable by comparison of the images captured by the image capture device (120).
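
A thresholding scheme of the kind described above might be sketched as follows, assuming NumPy, grayscale images, and caller-supplied rectangular regions; the region dictionary and threshold value are illustrative assumptions.

```python
import numpy as np

REGION_DELTA = 12.0  # assumed mean-intensity shift that signals a touch

def touched_regions(baseline_gray, current_gray, regions):
    """Given named rectangular regions {name: (x, y, w, h)} on the media,
    return the names whose mean intensity shifted past the threshold, i.e.
    regions whose glare pattern changed under the user's touch."""
    hits = []
    for name, (x, y, w, h) in regions.items():
        base = baseline_gray[y:y + h, x:x + w].astype(np.float64)
        curr = current_gray[y:y + h, x:x + w].astype(np.float64)
        if abs(curr.mean() - base.mean()) > REGION_DELTA:
            hits.append(name)
    return hits
```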

[0058] Further, the touch sensing device (100) may include an image analysis module (116) to, when executed by the processor (101), work with the differentiation module (115) to analyze the images captured by the image capture device (120) to identify differences between the images.

[0059] In Fig. 7, the user’s hand (170) may be hovering over the media (150) but is not detectable as a touch event within the glare artifact (701). In an example, the glare artifact (701) may be a selected portion of the media (150) that the system (200) is focusing on to detect changes to the media (150). In an example, the glare artifact (701) may be a portion of the media (150) that for one reason or another creates a detectable glare as produced by the electromagnetic wave sources (121) and/or captured by the image capture device (120). During the stage of touch sensing depicted in Fig. 7, the system (200) may detect and track the user’s finger (170-1) via computer vision-based techniques such as depth detection from shadows or stereo depth detection, and may monitor the shape of the glare artifact (701) on the media (150) close to the user’s finger (170-1).

[0060] In Fig. 8, the user’s finger (170-1) hovers over but does not touch the media (150) or the glare artifact (701) on the media (150). Here, the user (170) has not touched the surface of the media (150), but the user’s finger (170-1) creates a shadow (175) on part of the glare artifact (701). In this situation, the glare artifact (701) is not distorted beyond its initial boundaries as indicated by the dotted lines around the glare artifact (701).

[0061] In Fig. 9, the user’s finger (170-1) has made contact with the media (150), and the glare artifact (701) has been altered because of the force applied by the user’s finger (170-1), creating an optical distortion (151). In the state depicted in Fig. 9, the shadow (175) of the user’s finger (170-1) has disappeared, and the glare artifact (701) is distorted outside its original bounds to create a distortion region (901) indicating to the system (200) that the media (150) has been actually touched by the user (170). In one example, the system (200) may characterize the glare artifact (701) and any optical distortion (151) in terms of the surface curvature of the media (150).
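
The hover-versus-touch distinction drawn in paragraphs [0059] through [0061] could be approximated as below: a shadow dims the glare artifact within its baseline bounds, while a touch pushes glare outside them. The intensity and area thresholds are arbitrary assumptions, and `artifact_mask` is a hypothetical binary mask of the glare artifact (701) taken from the baseline image.

```python
import numpy as np

GLARE_INTENSITY = 200   # assumed gray level above which a pixel counts as glare
SPILL_AREA = 100        # assumed pixel count that means glare escaped its bounds
SHADOW_FRACTION = 0.8   # assumed fraction of baseline glare that must survive

def classify_event(baseline_gray, current_gray, artifact_mask):
    """Tell a hover (finger's shadow dims the glare artifact) from a touch
    (glare distorted outside its baseline bounds)."""
    bright_now = current_gray > GLARE_INTENSITY
    outside = np.logical_and(bright_now, artifact_mask == 0)
    inside = np.logical_and(bright_now, artifact_mask > 0)
    baseline_inside = np.logical_and(baseline_gray > GLARE_INTENSITY,
                                     artifact_mask > 0)
    if outside.sum() > SPILL_AREA:
        return "touch"   # glare spilled past its original bounds (Fig. 9)
    if inside.sum() < SHADOW_FRACTION * max(int(baseline_inside.sum()), 1):
        return "hover"   # glare partly shadowed but not distorted (Fig. 8)
    return "none"        # no interaction detected (Fig. 7)
```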

[0062] The system (200) may capture images of the media (150) throughout the different states depicted in Figs. 7 through 9 and look for distortions between the images captured in connection with Fig. 8 and images captured in connection with Fig. 9 by comparing the images to look for changes in the shape of the glare artifacts (701) that indicate optical distortions (151) in the media (150). The detection of an optical distortion (151) may be translated into an extended reality (ER) input that is to be addressed by the system (200) in some sort of feedback to the user such as, for example, an action taking place in a virtual reality environment in which the system (200) is being employed. In this manner, the detection of the touching of the media (150) serves as input to a larger ER system. The use of media (150) as an interactive input device provides for an inexpensive way to interact with an ER system and does not significantly take processing power or time from the ER system.

[0063] In an example, the system (200) may also detect the removal of the distortion created by the touch of the user’s finger (170-1) that created the distorted glare artifact (901). This may be performed by comparing an image taken after the user removes his or her finger (170-1) from the media (150) with an image taken by the image capture device (120) at the state depicted in Fig. 9. In an example, the user’s touch of the media (150) may have permanently distorted the media (150) by creating wrinkles in the media (150), or may have changed the shape of the glare artifacts by physically moving the media (150). Since some media (150) plastically deform when pressure is applied, these permanent distortions may be recorded by the image capture device (120) and stored in the data storage device (102) as a new baseline image of the media (150) to be used in a subsequent touch event.
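
One way to keep such a baseline current is sketched below; `differs` stands in for whichever image-comparison routine the system uses, and the class itself is an illustrative assumption, not structure from this disclosure.

```python
class BaselineTracker:
    """Track the reference image against which subsequent touches are
    detected, re-registering it when the media appears permanently deformed."""

    def __init__(self, initial_image):
        self.baseline = initial_image

    def after_touch(self, post_touch_image, differs):
        # If the media did not relax back to the previous baseline after the
        # finger was removed, treat the deformation as permanent and use the
        # post-touch image as the new baseline for later touch events.
        if differs(self.baseline, post_touch_image):
            self.baseline = post_touch_image
```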

[0064] Further, some types of media (150) may temporarily deform by creating temporary wrinkles that may “pop out” as the media (150) is moved or settles. This may be the case in media (150) that include a more rigid glossy coating that resists, inhibits, or prevents permanent deformation, but sometimes allows for wrinkles to be created in the media that return to their original states if the user moves the media (150) in any way. The system (200), in this scenario, may detect the temporary deformations, and determine if the temporary deformations have changed between touch events. In this manner, no matter what state the media (150) is in, the system (200) may consider permanent and temporary deformations in its analysis of consecutive images of the media (150) in determining whether the user initially or subsequently touches the media (150).

[0065] Fig. 10 is a flowchart showing a method (1000) of sensing a user’s touch, according to an example of the principles described herein. The method (1000) may include, with an image capture device (120), capturing (block 1001) an initial image of a media (150). The media (150) includes a detectable glare artifact (701) when the media (150) is deformed. The method (1000) may also include detecting (block 1002) a change in the glare artifact (701) in a subsequent image of the media (150) as compared to the initial image of the media (150). In response to a determination that the change in the glare artifact (701) exists between the initial image and the subsequent image, the method (1000) may include defining (block 1003) the change in the glare artifact (701) as instructions within an extended reality system.
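
Read as a loop, the method (1000) might look like the following sketch, where `camera`, `differs`, and `emit_er_instruction` are hypothetical stand-ins for the capture, comparison, and ER-input steps.

```python
def sense_touch(camera, differs, emit_er_instruction):
    """One possible event loop for the method (1000) of Fig. 10."""
    baseline = camera.read()                      # block 1001: initial image
    while True:
        frame = camera.read()                     # capture a subsequent image
        if differs(baseline, frame):              # block 1002: change detected
            emit_er_instruction("touch", frame)   # block 1003: define as input
```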

[0066] Fig. 11 is a flowchart showing a method (1100) of sensing a user’s touch, according to an example of the principles described herein. The method (1100) of Fig. 11 may include blocks 1101 through 1103 that are identical to blocks 1001 through 1003 of Fig. 10. The method (1100) of Fig. 11 may include determining (block 1104) whether a permanent deformation of the media (150) has occurred. At block 1104, the image capture device (120) may detect the movement of the user’s finger (170-1) and/or the user’s whole hand (170) away from the media (150), and the processor (101) executing the differentiation module (115) may determine if a permanent deformation in the media (150) has occurred by comparing images of the media (150) captured before the touch event when the user’s hand was first captured and after the removal of the user’s finger (170-1) and/or the whole hand (170). Further, the system (200) may detect a settling of the media (150) by determining if there is a subsequent change in the optical distortions (151) on the media (150) without a detection of a portion of the user’s hand.

[0067] In response to a determination that a permanent deformation of the media (150) has not occurred (block 1104, determination NO), the method (1100) may proceed with the remainder of the method (1100) at block 1106. In contrast, in response to a determination that a permanent deformation of the media (150) has occurred (block 1104, determination YES), the permanent deformation may be registered (block 1105) as a baseline image for detection of a subsequent touch event. The registration of the permanent distortion may include registration of a detected permanent distortion or a settling of the media (150) as described in connection with block 1104.

[0068] At block 1106, an image (503) printed on the media (150) may be recognized. As described herein, QR codes or other interactive images (503) printed on the media (150) may be used as individual input points where the system (200) may detect the touch of a particular portion of the images (503) or one of several images printed on the media (150) that may be used by the system (200) as input to an ER environment. For example, the method (1100) may also include orienting (block 1107) a user interface for interaction with an extended reality system based on the orientation of the image (503) printed on the media (150). In other words, interaction with the image (503) through touching the image (503) on the media (150) may result in the display of a user interface within the ER environment that relates to the image (503).
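
Blocks 1106 and 1107 could be prototyped with OpenCV’s QR detector, which returns both the decoded payload and the code’s corner points; the rotation computed from the top edge is one possible basis for orienting the user interface. The function below is an illustrative sketch, not the patented method.

```python
import math

import cv2

def qr_payload_and_orientation(image):
    """Recognize a QR code printed on the media and return its payload and the
    in-plane rotation of its top edge, which could drive the orientation of an
    ER user interface. Returns None when no code is decoded."""
    data, points, _ = cv2.QRCodeDetector().detectAndDecode(image)
    if not data or points is None:
        return None
    corners = points.reshape(-1, 2)    # four corners, starting at top-left
    dx, dy = corners[1] - corners[0]   # vector along the code's top edge
    return data, math.degrees(math.atan2(dy, dx))
```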

[0069] Aspects of the present system and method are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to examples of the principles described herein. Each block of the flowchart illustrations and block diagrams, and combinations of blocks in the flowchart illustrations and block diagrams, may be implemented by computer usable program code. The computer usable program code may be provided to a processor of a general-purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the computer usable program code, when executed via, for example, the processor (101) of the touch sensing device (100) or other programmable data processing apparatus, implements the functions or acts specified in the flowchart and/or block diagram block or blocks. In one example, the computer usable program code may be embodied within a computer readable storage medium; the computer readable storage medium being part of the computer program product. In one example, the computer readable storage medium is a non-transitory computer readable medium.

[0070] The specification and figures describe a touch sensing device. The touch sensing device includes an image capture device to capture a plurality of images of a media. The touch sensing device also includes a processing device to compare the plurality of images of the media to determine changes in glare artifacts resulting from the touch of the media by a user.

[0071] The touch sensing device provides an inexpensive way for the user to provide input to an ER system and environment. Further, the touch sensing device uses little processing to achieve an input, and is, therefore, more computationally efficient and expeditious relative to other ER input devices.

[0072] The preceding description has been presented to illustrate and describe examples of the principles described. This description is not intended to be exhaustive or to limit these principles to any precise form disclosed. Many modifications and variations are possible in light of the above teaching.