


Title:
A TOUCH-BASED VIRTUAL-REALITY INTERACTION SYSTEM
Document Type and Number:
WIPO Patent Application WO/2019/032014
Kind Code:
A1
Abstract:
A touch-based virtual-reality (VR) interaction system is disclosed. The VR interaction system comprises a touch sensitive apparatus configured to receive touch input from a user, a VR output device configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space, a positioning unit configured to provide spatial position information of the position of the touch sensitive apparatus relative to the user, and a processing unit configured to map the spatial position information of the touch sensitive apparatus to the VR environment coordinate system. The processing unit is configured to communicate a set of VR environment coordinates of the touch sensitive apparatus to the VR output device so that the touch sensitive apparatus is displayed within the virtual space together with the virtual representation of the touch input. A related method is also disclosed.

Inventors:
JAKOBSON KRISTOFER (SE)
CHRISTIANSSON TOMAS (SE)
KRUS MATTIAS (SE)
Application Number:
PCT/SE2018/050781
Publication Date:
February 14, 2019
Filing Date:
July 31, 2018
Assignee:
FLATFROG LAB AB (SE)
International Classes:
G06F3/01; G06F3/042; G06T7/80; G06T19/00
Foreign References:
US20160093105A1 (2016-03-31)
US20130265393A1 (2013-10-10)
US20030227470A1 (2003-12-11)
EP2981079A1 (2016-02-03)
US20170199580A1 (2017-07-13)
US20170061700A1 (2017-03-02)
US20120206323A1 (2012-08-16)
Attorney, Agent or Firm:
DAVIES, Dominic (SE)
Claims:
Claims

1. A touch-based virtual-reality (VR) interaction system (100) comprising a touch sensitive apparatus (101) configured to receive touch input from a user,

a VR output device (102) configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space,

a positioning unit (103) configured to provide spatial position information of the position of the touch sensitive apparatus relative to the user,

a processing unit (104) configured to map the spatial position information of the touch sensitive apparatus to the VR environment coordinate system, whereby the processing unit is configured to communicate a set of VR environment coordinates of the touch sensitive apparatus to the VR output device so that the touch sensitive apparatus is displayed within the virtual space together with the virtual representation of the touch input.

2. Touch-based VR interaction system according to claim 1, comprising at least one spatial marker (105, 105') arranged on the touch sensitive apparatus, wherein the positioning unit is configured to track the at least one spatial marker to determine an associated position of the touch sensitive apparatus relative to the user.

3. Touch-based VR interaction system according to claim 1 or 2, comprising

an image sensor device (106) configured to be wearable by the user, and wherein the image sensor device is configured to capture image data (107, 107', 107", 107"') associated with the position of the touch sensitive apparatus and communicate the image data to the positioning unit, wherein the positioning unit is configured to determine the position of the touch sensitive apparatus relative to the user based on the captured image data.

4. Touch-based VR interaction system according to claims 2 and 3, wherein the image sensor device is configured to capture image data (107) of the at least one spatial marker and communicate the image data to the positioning unit, wherein the positioning unit is configured to determine the position of the touch sensitive apparatus relative to the user based on the captured image data.

5. Touch-based VR interaction system according to claim 3 or 4, wherein the image sensor device is configured to capture image data (107') displayed by the touch sensitive apparatus and communicate the image data to the positioning unit, wherein the positioning unit is configured to determine the position of the touch sensitive apparatus relative to the user based on the captured image data.

6. Touch-based VR interaction system according to claim 5, wherein the touch sensitive apparatus is configured to display image data comprising at least one orientation tag (107"), and wherein the positioning unit is configured to track the position of the at least one orientation tag to determine an associated position of the touch sensitive apparatus relative to the user.

7. Touch-based VR interaction system according to any of claims 3 - 6, wherein the touch sensitive apparatus is configured to display a calibration image (108) at the position of a user input device (109) on the touch sensitive apparatus when the touch sensitive apparatus receives touch input from the user input device, whereby the image sensor device is configured to capture image data comprising the calibration image and the user input device and/or the user (111), wherein the positioning unit is configured to determine an orientation of the user input device and/or the user relative to the touch sensitive apparatus based on a projected image (110) of the user input device and/or the user on the calibration image.

8. Touch-based VR interaction system according to claim 7, wherein the touch sensitive apparatus is configured to display the calibration image tracking the position of the user input device on the touch sensitive apparatus.

9. Touch-based VR interaction system according to any of claims 3 - 8, comprising a light emitter (116) arranged at a determined spatial position relative to the touch sensitive apparatus, and wherein the image sensor device is configured to capture image data (107"') of light emitted by the light emitter and communicate the image data to the positioning unit, wherein the positioning unit is configured to determine the position of the touch sensitive apparatus relative to the user based on the captured image data.

10. Touch-based VR interaction system according to any of claims 3 - 9, wherein the image sensor device is arranged at the VR output device.

11. Touch-based VR interaction system according to any of claims 1 - 10, comprising a second image sensor device (113, 113') arranged on the touch sensitive apparatus, wherein the second image sensor device is configured to capture image data of the user (111) and/or a user input device (109) and communicate the image data to the positioning unit, wherein the positioning unit is configured to determine an orientation of the user and/or a user input device relative to the touch sensitive apparatus based on the captured image data.

12. Touch-based VR interaction system according to claim 7 or 11, wherein the processing unit is configured to map spatial position information associated with the determined orientation of the user and/or a user input device to the VR environment coordinate system, and wherein the VR output device is configured to display the orientation of the user and/or a user input device in the virtual space.

13. Touch-based VR interaction system according to any of claims 1 - 12, wherein the positioning unit is configured to determine a calibration position of a user input device (109) in the VR environment coordinate system when touching at least one physical coordinate (112) on the touch sensitive apparatus, whereby the processing unit is configured to map the position of the at least one physical coordinate to the VR environment coordinate system by registering the at least one physical coordinate to the calibration position when detecting the touch of the at least one physical coordinate.

14. Touch-based VR interaction system according to any of claims 1 - 13, wherein the VR output device is configured to display the touch sensitive apparatus as a plurality of virtual representations (114) thereof in the virtual space, wherein the processing unit is configured to associate at least a second (115) virtual representation of the plurality of virtual representations with a second set of VR environment coordinates in response to a user input so that the VR output device displays the second virtual representation as being separated within the virtual space from a virtual representation (115') of the touch sensitive apparatus receiving touch input.

15. A method (200) in a touch-based virtual-reality (VR) interaction system (100) having a touch sensitive apparatus (101) configured to receive touch input from a user, and a VR output device (102) configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space, the method comprising

providing (201) spatial information of the position of the touch sensitive apparatus relative to the user,

mapping (202) the spatial position information of the touch sensitive apparatus to the VR environment coordinate system, and

communicating (203) a set of VR environment coordinates of the touch sensitive apparatus to the VR output device so that the touch sensitive apparatus is displayed (204) within the virtual space together with the virtual representation of the touch input.

16. A computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method according to claim 15.

Description:
A touch-based virtual-reality interaction system

Technical Field

The present invention relates generally to the field of virtual-reality (VR) interaction systems. More particularly, the present invention relates to a touch-based VR interaction system and a related method.

Background

To an increasing extent, touch-sensitive panels are being used for providing input data to computers, gaming devices, presentation and conference systems, etc. Alongside this development is the growing field of virtual-reality systems and applications. Virtual reality presents the user with an environment partially, if not fully, disconnected from the actual physical environment of the user. Various ways of interacting with this environment have been tried, including IR-tracked gloves, IR-tracked wands or other gesturing tools, and gyroscope-/accelerometer-tracked objects. The IR-tracked objects are typically tracked using one or more IR sensors configured to view and triangulate IR light sources on the tracked objects. Such interaction systems provide high-latency, low-accuracy user input to the virtual environment. It would thus be advantageous to provide a VR interaction system with a high-precision interface.

Summary

It is an objective of the invention to at least partly overcome one or more of the above-identified limitations of the prior art.

One objective is to provide a VR interaction system with a high-precision interface.

Another objective is to provide a touch-based VR interaction system in which a VR user interacts with a high-precision touch sensitive apparatus in the physical reality whilst viewing the interaction in the virtual reality.

One or more of these objectives, and other objectives that may appear from the description below, are at least partly achieved by means of a touch-based VR interaction system and a related method according to the independent claims, embodiments thereof being defined by the dependent claims.

According to a first aspect a touch-based virtual-reality (VR) interaction system is provided. The VR interaction system comprises a touch sensitive apparatus configured to receive touch input from a user, a VR output device configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space, a positioning unit configured to provide spatial position information of the position of the touch sensitive apparatus relative to the user, and a processing unit configured to map the spatial position information of the touch sensitive apparatus to the VR environment coordinate system. The processing unit is configured to communicate a set of VR environment coordinates of the touch sensitive apparatus to the VR output device so that the touch sensitive apparatus is displayed within the virtual space together with the virtual representation of the touch input.

According to a second aspect a method in a touch-based VR interaction system is provided. The system has a touch sensitive apparatus configured to receive touch input from a user, and a VR output device configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space. The method comprises providing spatial information of the position of the touch sensitive apparatus relative to the user, mapping the spatial position information of the touch sensitive apparatus to the VR environment coordinate system, and communicating a set of VR environment coordinates of the touch sensitive apparatus to the VR output device so that the touch sensitive apparatus is displayed within the virtual space together with the virtual representation of the touch input.

According to a third aspect a computer program product is provided comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method according to the second aspect.

Further examples of the invention are defined in the dependent claims, wherein features for the second and third aspects of the disclosure are as for the first aspect mutatis mutandis.

Some examples of the disclosure provide for a VR interaction system with a high-precision interface.

Some examples of the disclosure provide for a touch-based VR interaction system in which a VR user interacts with a high-precision touch sensitive apparatus in the physical reality whilst viewing the interaction in the virtual reality.

Some examples of the disclosure provide for an enhanced VR experience via interaction with a touch panel.

Some examples of the disclosure provide for capturing input from a user's interaction with a VR environment with a high accuracy.

It should be emphasized that the term "comprises/comprising" when used in this specification is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.

Brief Description of the Drawings

These and other aspects, features and advantages of which examples of the invention are capable will be apparent and elucidated from the following description of examples of the present invention, reference being made to the accompanying schematic drawings, in which:

Fig. 1 shows a touch-based virtual-reality (VR) interaction system according to examples of the disclosure;

Fig. 2 shows a touch-based VR interaction system according to examples of the disclosure;

Fig. 3 shows a touch-based VR interaction system according to examples of the disclosure;

Fig. 4 shows a touch-based VR interaction system according to examples of the disclosure;

Fig. 5 shows a touch-based VR interaction system according to examples of the disclosure;

Fig. 6 shows a touch-based VR interaction system according to examples of the disclosure;

Fig. 7 shows a touch-based VR interaction system according to examples of the disclosure;

Fig. 8 shows a touch-based VR interaction system according to examples of the disclosure;

Fig. 9 shows a touch-based VR interaction system according to examples of the disclosure;

Fig. 10 shows a VR environment in which a plurality of virtual representations of a touch sensitive apparatus is shown, according to examples of the disclosure; and

Fig. 11 is a flowchart of a method in a touch-based VR interaction system according to examples of the disclosure.

Detailed Description

Specific examples of the invention will now be described with reference to the accompanying drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these examples are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. The terminology used in the detailed description of the examples illustrated in the accompanying drawings is not intended to be limiting of the invention. In the drawings, like numbers refer to like elements.

Fig. 1 is a schematic illustration of a touch-based virtual-reality (VR) interaction system 100 comprising a touch sensitive apparatus 101 configured to receive touch input from a user, and a VR output device 102 configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space. The touch sensitive apparatus 101 may be configured to receive input using e.g. one or more fingers, a pointer or stylus etc. on a touch panel 101' of the touch sensitive apparatus 101. The VR output device 102 may be configured to be wearable by the user, and may thus comprise a VR headset. The VR output device 102 presents a virtual space to the user, as well as a virtual representation of the touch input, when the user provides touch input to the touch sensitive apparatus 101. A virtual representation of the user, such as one or more fingers, and/or a pointer or stylus may be presented in the VR output device 102 to facilitate orientation in the virtual space. The objects presented in the virtual space, such as the user, the virtual representation of the touch input, or any other (interactable) VR objects, thus have coordinates determined in the VR environment coordinate system, for visualization via the VR output device 102. The VR coordinates may be determined by sensor devices configured to detect the location and movements of these objects. Further, the touch-based VR interaction system 100 comprises a positioning unit 103 configured to provide spatial position information of the position of the touch sensitive apparatus 101 relative to the user, and a processing unit 104 configured to map the spatial position information of the touch sensitive apparatus 101 to the VR environment coordinate system. The processing unit 104 is configured to communicate a set of VR environment coordinates of the touch sensitive apparatus 101 to the VR output device 102 so that the touch sensitive apparatus 101 is displayed within the virtual space together with the virtual representation of the touch input. The VR user may thus reliably interact with a high-precision touch sensitive apparatus 101 in the physical reality whilst viewing the interaction in VR.
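Purely as an illustrative sketch (not part of the application as filed), the mapping performed by the processing unit 104 may be thought of as a rigid transform: given the pose of the panel in the VR environment coordinate system, both the panel outline and any touch point can be expressed in VR coordinates. The function and parameter names below are assumptions for illustration only; the application does not prescribe a particular interface.

    import numpy as np

    def panel_corners_in_vr(panel_width_m, panel_height_m, rotation, translation):
        """Map the four corners of the touch panel into VR environment coordinates.

        rotation    : 3x3 rotation matrix of the panel relative to the VR origin
        translation : 3-vector giving the panel origin in VR coordinates
        Both are assumed to come from the positioning unit (103).
        """
        # Panel corners in the panel's own coordinate system (origin at lower-left).
        corners_panel = np.array([
            [0.0, 0.0, 0.0],
            [panel_width_m, 0.0, 0.0],
            [panel_width_m, panel_height_m, 0.0],
            [0.0, panel_height_m, 0.0],
        ])
        # Rigid transform into the VR environment coordinate system.
        return corners_panel @ rotation.T + translation

    def touch_to_vr(touch_xy_norm, panel_width_m, panel_height_m, rotation, translation):
        """Map a normalised touch coordinate (0..1, 0..1) to a VR coordinate."""
        point_panel = np.array([touch_xy_norm[0] * panel_width_m,
                                touch_xy_norm[1] * panel_height_m,
                                0.0])
        return rotation @ point_panel + translation

The VR output device 102 would then render the panel at the returned corner coordinates and the virtual representation of the touch input at the returned touch coordinate.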

Various input from the user's interaction with a VR environment may thus be captured with an increased accuracy. For example, touch input of fine details of a component for a machine presented in the VR space may be captured with the increased accuracy and low latency of the touch sensitive apparatus 101, which otherwise would not be resolved by typical spatial sensors in previous VR systems. Mapping the position of the touch sensitive apparatus 101 to the VR environment further provides for an enhanced VR experience, combining the freedom of customizing different VR environments to the user's tasks with the tactile interaction provided by the touch sensitive apparatus 101. Moreover, the simultaneous interaction with the touch sensitive apparatus 101 allows for a more viable handling of user input from a VR environment, such as the communication of a user's input to various related systems and applications. A realistic and more practical utilization of VR may thus be provided, across a range of applications and technical fields.

There are numerous known techniques for providing touch sensitivity to the touch panel 101', e.g. by using cameras to capture light scattered off the point(s) of touch on the panel, by using cameras to directly observe the objects interacting with the panel, by incorporating resistive wire grids, capacitive sensors, strain gauges, etc. into the panel. In one category of touch-sensitive panels known as 'above surface optical touch systems', a plurality of optical emitters and optical receivers are arranged around the periphery of the touch surface of the panel 101' to create a grid of intersecting light paths (otherwise known as detection lines) above the touch surface. Each light path extends between a respective emitter/receiver pair. An object that touches the touch surface will block or attenuate some of the light paths. Based on the identity of the receivers detecting a blocked light path, a processor can determine the location of the intercept between the blocked light paths.
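As a simplified, non-limiting sketch of this principle (assuming the set of blocked emitter/receiver pairs has already been identified from the received light intensities), a single touch location can be estimated from the intersections of the blocked detection lines. Real systems use more robust tomographic or least-squares reconstructions and handle multiple simultaneous touches; the code below is illustrative only.

    import numpy as np
    from itertools import combinations

    def line_intersection(p1, p2, p3, p4):
        """Intersection of the infinite lines through (p1, p2) and (p3, p4), or None if parallel."""
        d1, d2 = p2 - p1, p4 - p3
        denom = d1[0] * d2[1] - d1[1] * d2[0]
        if abs(denom) < 1e-9:
            return None
        t = ((p3[0] - p1[0]) * d2[1] - (p3[1] - p1[1]) * d2[0]) / denom
        return p1 + t * d1

    def estimate_touch_position(blocked_paths):
        """Estimate a single touch position as the mean of pairwise intersections
        of the blocked detection lines.

        blocked_paths: list of (emitter_xy, receiver_xy) tuples, each an
        np.array of shape (2,), for paths whose received intensity dropped
        below a threshold.
        """
        points = []
        for (e1, r1), (e2, r2) in combinations(blocked_paths, 2):
            p = line_intersection(e1, r1, e2, r2)
            if p is not None:
                points.append(p)
        return np.mean(points, axis=0) if points else None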

The touch-based VR interaction system 100 may comprise at least one spatial marker 105, 105', arranged on the touch sensitive apparatus 101, as schematically illustrated in Fig. 2. The positioning unit 103 may be configured to track the at least one spatial marker 105, 105', to determine an associated position of the touch sensitive apparatus 101 relative to the user. The at least one spatial marker 105, 105', may comprise IR markers such as IR light sources, or any other marker configured for allowing tracking by the positioning unit 103, such as markers of different shapes and configurations being physically provided on parts of the touch sensitive apparatus 101 and/or displayed on the touch panel 101' thereof. Accurate mapping of the obtained spatial position information to the VR environment coordinate system may then be provided. Fig. 2 illustrates first and second spatial markers 105, 105', but it is conceivable that the number of spatial markers may be varied to provide for an optimized position detection.

The touch-based VR interaction system 100 may comprise an image sensor device 106 configured to be wearable by the user, as schematically illustrated in Figs. 3 - 8. The image sensor device 106 may be configured to capture image data 107, 107', 107", 107"', associated with the position of the touch sensitive apparatus 101 and communicate the image data to the positioning unit 103, which is configured to determine the position of the touch sensitive apparatus 101 relative to the user based on the captured image data, such as by a triangulation process of the obtained image data. Since the image sensor device 106 may be arranged at the position of the user, i.e. by being wearable, the relative position between the user and the touch sensitive apparatus 101 may be accurately determined. This provides for accurately determining the VR environment coordinates of the touch sensitive apparatus 101 and a precise positioning of the touch sensitive apparatus in the virtual space. Such precise positioning in the virtual space facilitates the interaction with the touch sensitive apparatus 101 when the user is immersed in the VR experience, since the virtual representation of the touch sensitive apparatus 101 may be precisely aligned with the physical touch sensitive apparatus 101. The touch-based VR interaction system 100 thus enables high-resolution input and allows more complex tasks to be carried out by the user in the VR space. The image sensor device 106 may be configured to capture image data 107 of the at least one spatial marker 105, 105', and communicate the image data to the positioning unit 103, which is configured to determine the position of the touch sensitive apparatus 101 relative to the user based on the captured image data. Fig. 3 illustrates an example where the image sensor device 106 locates the position of the touch sensitive apparatus 101 based on spatial markers 105, 105'. The processing unit 104 may then accurately map the retrieved spatial position information to the VR environment coordinate system.
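One conventional way to realise such a determination, offered here only as a sketch and not as the method of the application, is to solve a perspective-n-point (PnP) problem from the known marker layout on the apparatus and the marker positions detected in the image from the wearable image sensor device 106. This assumes a calibrated camera and at least four non-collinear marker points (e.g. marker or tag corners); the function names are illustrative.

    import cv2
    import numpy as np

    def panel_pose_from_markers(marker_positions_panel, marker_pixels, camera_matrix,
                                dist_coeffs=None):
        """Estimate the pose of the touch sensitive apparatus relative to the
        head-worn image sensor from detected spatial markers.

        marker_positions_panel : (N, 3) known 3D marker positions in the panel frame
        marker_pixels          : (N, 2) corresponding pixel coordinates in the image
        camera_matrix          : 3x3 intrinsic matrix of the wearable image sensor

        Returns (R, t): rotation matrix and translation of the panel in the
        camera frame.
        """
        if dist_coeffs is None:
            dist_coeffs = np.zeros(5)
        ok, rvec, tvec = cv2.solvePnP(
            np.asarray(marker_positions_panel, dtype=np.float64),
            np.asarray(marker_pixels, dtype=np.float64),
            camera_matrix, dist_coeffs)
        if not ok:
            raise RuntimeError("Pose estimation failed")
        R, _ = cv2.Rodrigues(rvec)          # rotation vector -> rotation matrix
        return R, tvec.reshape(3)

Combined with the (known or tracked) pose of the wearable sensor in the VR environment coordinate system, this camera-relative pose yields the VR environment coordinates of the touch sensitive apparatus 101.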

The image sensor device 106 may be configured to capture image data 107' displayed by the touch sensitive apparatus 101 and communicate the image data to the positioning unit 103, as schematically illustrated in Fig. 4. The image data 107' displayed by the touch sensitive apparatus 101 may comprise objects of varying shapes and configurations that allow for a calibration of the position of the touch sensitive apparatus 101 in the VR environment coordinate system. A flexible and highly optimizable calibration may thus be provided, since the displayed image data may be varied for different conditions and applications.

The touch sensitive apparatus 101 may be configured to display image data comprising at least one orientation tag 107", as schematically illustrated in Fig. 5. The positioning unit 103 may be configured to track the position of the at least one orientation tag 107" to determine an associated position of the touch sensitive apparatus 101 relative to the user. The number of tags 107" displayed and the configurations thereof may vary to provide for a precise positioning procedure and a VR environment which is accurately anchored to the physical reality, i.e. the touch sensitive apparatus 101.

The touch sensitive apparatus 101 may be configured to display a calibration image 108 at (or at a defined distance to) the position of a user input device 109 on the touch sensitive apparatus 101 when the touch sensitive apparatus receives touch input from the user input device 109, as schematically illustrated in Fig. 6. The image sensor device 106 may be configured to capture image data comprising the calibration image 108 and the user input device 109 and/or the user 111. The positioning unit 103 may be configured to determine an orientation of the user input device 109 and/or the user 111 (such as one or more fingers, hand or lower arm of the user) relative to the touch sensitive apparatus 101 based on a projected image 110 of the user input device 109 and/or the user 111 on the calibration image 108. Thus, by observing which parts of the calibration image 108 are obscured by the user input device 109 and/or the user 111, the positioning unit 103 may determine the orientation, position, or dynamics of the movement, such as the speed or acceleration, of the user input device 109 and/or the user 111. Such spatial position information is then mapped to the VR environment coordinate system as described, which provides for a facilitated interaction with the touch sensitive apparatus 101, e.g. by displaying a virtual representation of the user input device 109 and/or the user 111 in the VR space. This also provides sufficient information to allow effective palm rejection, e.g. by identifying a stylus tip from the projected image and ignoring all other touches around that stylus tip position. As the user is usually looking at their hand when interacting with the touch panel, the calibration image 108 is advantageously displayed around the user input device 109 and/or the hand of the user 111.
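A minimal sketch of such palm rejection, assuming the stylus tip position has already been identified from the projected image 110, might keep only the touch closest to the tip and ignore other touch reports near it. The parameter names and the radius below are illustrative assumptions, not values from the application.

    import numpy as np

    def filter_palm_touches(touches, stylus_tip_xy, palm_radius_m=0.10):
        """Illustrative palm rejection: the touch closest to the stylus tip is
        treated as pen input; all other touches within palm_radius_m of the tip
        are assumed to be the resting hand and are discarded.

        touches: iterable of (x, y) panel coordinates in metres.
        Returns (pen_touch, other_touches).
        """
        touches = [np.asarray(t, dtype=float) for t in touches]
        tip = np.asarray(stylus_tip_xy, dtype=float)
        if not touches:
            return None, []
        dists = [np.linalg.norm(t - tip) for t in touches]
        pen_idx = int(np.argmin(dists))
        pen_touch = touches[pen_idx]
        # Keep only touches outside the palm zone around the stylus tip.
        others = [t for i, t in enumerate(touches)
                  if i != pen_idx and np.linalg.norm(t - tip) > palm_radius_m]
        return pen_touch, others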

The touch sensitive apparatus 101 may be configured to display the calibration image 108 tracking the position of the user input device 109, and/or the user 111, on the touch sensitive apparatus 101. The calibration image 108 may thus follow the position of the user input device 109, and/or the user 111, on the touch sensitive apparatus 101, which may improve the detection of the above-mentioned spatial position information.

The touch-based VR interaction system 100 may comprise a light emitter 116 arranged at a determined spatial position relative to the touch sensitive apparatus 101, as schematically illustrated in Fig. 8. The image sensor device 106 may be configured to capture image data 107"' of light emitted by the light emitter and communicate the image data to the positioning unit 103, which is configured to determine the position of the touch sensitive apparatus 101 relative to the user based on the captured image data. The light may be IR light or light of any other wavelength suitable for detection by the image sensor device 106.

The image sensor device 106 may be arranged at the VR output device 102, as schematically illustrated in Figs. 3 - 8. It is conceivable, however, that the image sensor device 106 may be displaced from the VR output device 102 but arranged at a predetermined distance from the touch sensitive apparatus 101 and communicate with the positioning unit 103, so that the image data 107 - 107"' may be received by the positioning unit 103.

The touch-based VR interaction system 100 may comprise a second image sensor device 113, 113', arranged on the touch sensitive apparatus 101, as schematically illustrated in Fig. 7. The second image sensor device 113, 113', may be configured to capture image data of the user 111 and/or a user input device 109 and communicate the image data to the positioning unit 103, which is configured to determine an orientation of the user 111 and/or a user input device 109 relative to the touch sensitive apparatus 101 based on the captured image data. The second image sensor device 113, 113', may comprise depth cameras for accurately determining the spatial positioning information. Inertia sensors may also track the movement of the user input device 109 for defined periods of time, such as the time between letters when writing a word. The positioning unit 103 may determine the orientation, position, or dynamics of the movement, such as the speed or acceleration, of the user input device 109 and/or the user 111 from the image data. The processing unit 104 may subsequently map such spatial position information to the VR environment coordinate system as described above for providing a precise representation of the user input device 109 and/or the user 111 in the VR space. The accuracy of the virtual representation of the touch input in the VR environment coordinate system may thus be improved so that the user may experience a more direct connection between physical movements of e.g. the input device 109 and the resulting virtual presentation, which is critical for fine touch input gestures, e.g. in high-resolution tasks. Such improved VR representation and tracking of the user input device 109 and/or the user 111 is also advantageous for avoiding disorientation of the user.
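As an illustrative sketch only (the application does not specify a fusion algorithm), inertial data from the user input device 109 could bridge the interval between two touch samples by simple dead reckoning, with the drift corrected at the next touch-anchored position. The function and parameter names are assumptions.

    import numpy as np

    def dead_reckon(last_touch_pos, last_velocity, accel_samples, dt):
        """Bridge the gap between two touch samples by integrating accelerometer
        data from the user input device. A real system would also fuse gyroscope
        data and correct accumulated drift at the next touch.

        last_touch_pos : (3,) last position anchored by the touch panel, in metres
        last_velocity  : (3,) velocity estimate at that moment, in m/s
        accel_samples  : (N, 3) accelerations in the world frame, gravity removed
        dt             : sample period in seconds
        """
        pos = np.asarray(last_touch_pos, dtype=float).copy()
        vel = np.asarray(last_velocity, dtype=float).copy()
        trajectory = [pos.copy()]
        for a in np.asarray(accel_samples, dtype=float):
            vel += a * dt          # integrate acceleration to velocity
            pos += vel * dt        # integrate velocity to position
            trajectory.append(pos.copy())
        return np.array(trajectory)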

The processing unit 104 may thus be configured to map spatial position information associated with the determined orientation of the user 111 and/or a user input device 109 to the VR environment coordinate system, and the VR output device 102 may be configured to display the orientation of the user 111 and/or a user input device 109 in the virtual space.

The positioning unit 103 may be configured to determine a calibration position of a user input device 109 in the VR environment coordinate system when touching at least one physical coordinate 112 on the touch sensitive apparatus 101 (i.e. on the touch panel 101' thereof). The processing unit 104 may be configured to map the position of the at least one physical coordinate to the VR environment coordinate system by registering the at least one physical coordinate to the calibration position when detecting the touch of the at least one physical coordinate 112. Thus, if the user has a tracked user input device 109, such as VR gloves or the like, the user may calibrate the position of the touch sensitive apparatus 101 in the VR space with a few touches on the touch panel 101'. Each touch with the user input device 109 connects the respective physical coordinate at the touch site of the touch panel 101' with the coordinate of the user input device 109 in the VR environment coordinate system, captured at the same point in time.
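For illustration, assuming at least three non-collinear calibration touches, such a registration may be computed as a rigid transform between the physical panel coordinates and the tracked input-device coordinates, for example with the standard Kabsch/Procrustes method. This is a common technique offered as a sketch, not the procedure claimed in the application.

    import numpy as np

    def rigid_transform_from_touches(panel_points, vr_points):
        """Estimate the rigid transform (R, t) mapping panel coordinates to VR
        environment coordinates from a few calibration touches.

        panel_points : (N, 3) physical coordinates (112) touched on the panel,
                       e.g. (x, y, 0) in the panel frame
        vr_points    : (N, 3) tracked input-device coordinates at the same instants
        """
        P = np.asarray(panel_points, dtype=float)
        Q = np.asarray(vr_points, dtype=float)
        cp, cq = P.mean(axis=0), Q.mean(axis=0)
        H = (P - cp).T @ (Q - cq)              # cross-covariance of centred points
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:               # avoid a reflection solution
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = cq - R @ cp
        return R, t                            # vr_point ~= R @ panel_point + t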

The VR output device 102 may be configured to display the touch sensitive apparatus as a plurality of virtual representations 114 thereof in the virtual space, as schematically illustrated in Fig. 10. The processing unit 104 may be configured to associate at least a second 115 virtual representation of the plurality of virtual representations 114 with a second set of VR environment coordinates in response to a user input so that the VR output device 102 displays the second virtual representation 115 as being separated within the virtual space from a virtual representation 115' of the touch sensitive apparatus receiving touch input. For example, it is conceivable that the VR output device 102 displays a presentation session in the VR space in one application, in which a plurality of virtual representations 114 of a touch sensitive apparatus is displayed to a user 111 or a plurality of users. A user 111 may interact with a first virtual representation 115' of the touch sensitive apparatus. The user may subsequently provide a dedicated touch input, such as a swipe gesture, to shift the first virtual representation 115' to a different location in the VR space (e.g. as denoted by reference 115 in Fig. 10) and continue interaction with another virtual representation of the touch sensitive apparatus in the VR space, but with the same physical touch sensitive apparatus 101. Hence, a plurality of virtual representations 114 may be arranged in the VR space for viewing and further interaction by the participating VR users. A user may then 'activate' any of the virtual representations 114 for touch input, by again anchoring a virtual representation 115 to the VR coordinates represented by the touch sensitive apparatus 101. The virtual representation 115' aligned with the physical touch sensitive apparatus 101 may be highlighted, e.g. with a different color in the VR space, to facilitate the user orientation. The touch-based VR interaction system 100 thus provides for a highly dynamic interaction with the freedom to utilize the VR space while ensuring that all of the user's input is structured and retained, with high resolution and accuracy. It is conceivable that several touch sensitive apparatuses 101 are connected over a communication network, where the touch-based VR interaction system 100 incorporates the touch sensitive apparatuses 101 so that simultaneous input to the plurality of touch panels 101' can be provided and mapped to the VR space for simultaneous interaction and viewing by a plurality of users in a network.
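A minimal sketch of the bookkeeping such behaviour implies, with hypothetical names not taken from the application, could keep exactly one virtual board anchored to the VR coordinates of the physical panel (and thus receiving touch input) while parked copies retain their own VR coordinates.

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class VirtualBoard:
        """One virtual representation (114/115) of the physical touch panel."""
        content: list                  # e.g. stored strokes / touch input
        vr_coordinates: np.ndarray     # where this copy is shown in the VR space

    class BoardManager:
        """Illustrative management of a plurality of virtual representations."""
        def __init__(self, panel_vr_coordinates):
            self.panel_vr_coordinates = np.asarray(panel_vr_coordinates, dtype=float)
            self.boards = [VirtualBoard([], self.panel_vr_coordinates.copy())]
            self.active = 0            # index of the board anchored to the panel

        def on_touch(self, stroke):
            # Touch input always goes to the board anchored to the physical panel.
            self.boards[self.active].content.append(stroke)

        def park_active_board(self, parked_vr_coordinates):
            """Swipe gesture: move the active board elsewhere in the VR space
            and start a fresh board anchored to the physical panel."""
            self.boards[self.active].vr_coordinates = np.asarray(parked_vr_coordinates, dtype=float)
            self.boards.append(VirtualBoard([], self.panel_vr_coordinates.copy()))
            self.active = len(self.boards) - 1

        def activate(self, index):
            """Anchor an existing board back to the physical panel for touch input."""
            self.active = index
            self.boards[index].vr_coordinates = self.panel_vr_coordinates.copy()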

Fig. 11 illustrates a flow chart of a method 200 in a touch-based VR interaction system. The order in which the steps of the method 200 are described and illustrated should not be construed as limiting, and it is conceivable that the steps can be performed in varying order. As mentioned, the touch-based VR interaction system 100 has a touch sensitive apparatus 101 configured to receive touch input from a user, and a VR output device 102 configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space. The method 200 comprises providing 201 spatial information of the position of the touch sensitive apparatus 101 relative to the user, and mapping 202 the spatial position information of the touch sensitive apparatus 101 to the VR environment coordinate system. The method 200 comprises communicating 203 a set of VR environment coordinates of the touch sensitive apparatus to the VR output device 102 so that the touch sensitive apparatus 101 is displayed 204 within the virtual space together with the virtual representation of the touch input. The method 200 thus provides for the advantageous benefits as described above in relation to the system 100 and Figs. 1 - 10.

A computer program product is provided comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method 200.

The present invention has been described above with reference to specific examples. However, other examples than the above described are equally possible within the scope of the invention. The different features and steps of the invention may be combined in other combinations than those described. The scope of the invention is only limited by the appended patent claims.

More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings of the present invention is/are used.