

Title:
INTERACTIVE VIRTUAL REALITY SPORTS SIMULATION SYSTEM AND METHODS EMPLOYED THEREOF
Document Type and Number:
WIPO Patent Application WO/2019/155271
Kind Code:
A1
Abstract:
Exemplary embodiments of the present disclosure are directed towards an interactive sports simulation system and method for training a user. The system comprises a haptic feedback enabled device configured for enabling a user to perform a sporting activity, the device comprising vibration motors and a handle. The vibration motors comprise linear resonant actuators, a first eccentric rotating mass motor, and a second eccentric rotating mass motor. The system further comprises a first processing device configured to interface with the vibration motors and an end-user device. The end-user device is configured to detect the sporting activity performed by the user and to transmit different signals corresponding to different patterns of haptic feedback to the first processing device based on the sporting activity performed by the user, and the vibration motors are configured to generate the different patterns of haptic feedback. The system further comprises a feet detection device and a two-dimensional tracking device comprising photosensitive sensors positioned on bars and laser emitters positioned along a line of sight of the photosensitive sensors. The laser emitters are configured to emit laser beams on the photosensitive sensors to detect the user's presence and two-dimensional location in a play area. The system further comprises at least one second processing device positioned in the feet detection device and at least one third processing device positioned in the two-dimensional tracking device, both configured to interface with the laser emitters and the photosensitive sensors.

Inventors:
SAMA VASANTHA SAI (IN)
Application Number:
PCT/IB2018/053064
Publication Date:
August 15, 2019
Filing Date:
May 03, 2018
Assignee:
PROYUGA ADVANCED TECH LIMITED (IN)
International Classes:
G06T19/00; G06V10/147; G09B9/00
Foreign References:
US8992322B2 (2015-03-31)
US9067097B2 (2015-06-30)
Other References:
SONG ET AL.: "A 3D localisation method in indoor environments for virtual reality applications", HUMAN-CENTRIC COMPUTING AND INFORMATION SCIENCES, 13 October 2017 (2017-10-13), Retrieved from the Internet [retrieved on 20181012]
Attorney, Agent or Firm:
C.V. GRANDHI, Krishna (IN)
Claims:
CLAIMS:

1. An interactive sports simulation system, comprising: a haptic feedback enabled device configured for enabling a user to perform a sporting activity comprising a plurality of vibration motors and a handle, whereby the plurality of vibration motors comprising a plurality of linear resonant actuators, at least one first eccentric rotating mass motor, and at least one second eccentric rotating mass motor; at least one first processing device configured to interface with the plurality of vibration motors and an end-user device, whereby the end-user device configured to detect the sporting activity performed by the user and transmit different signals corresponding to different patterns of haptic feedback to the first processing device based on the sporting activity performed by the user and the plurality of vibration motors configured to generate the different patterns of haptic feedback based on the different signals corresponding to the different patterns of haptic feedback received by the first processing device; a feet detection device and a two-dimensional tracking device comprise a plurality of photosensitive sensors positioned on a plurality of bars and a plurality of laser emitters positioned along a line of sight of the plurality of photosensitive sensors, whereby the plurality of laser emitters configured to emit laser beams on the plurality of photosensitive sensors to detect and locate the user’s presence in a play area; and at least one second processing device positioned in the feet detection device and at least one third processing device positioned in the two-dimensional tracking device configured to interface with the plurality of laser emitters and the plurality of photosensitive sensors.

2. The interactive sports simulation system of claim 1, wherein the plurality of vibration motors are positioned in the handle.

3. The interactive sports simulation system of claim 1, wherein the plurality of linear resonant actuators positioned between the first eccentric rotating mass motor and the second eccentric rotating mass motor.

4. The interactive sports simulation system of claim 1, wherein the at least one first processing device is configured to receive different signals corresponding to the different patterns of haptic feedback to actuate the plurality of vibration motors.

5. The interactive sports simulation system of claim 1, wherein the end-user device is configured to create a simulated environment of a sporting event for the user.

6. The interactive sports simulation system of claim 1, wherein the plurality of laser emitters are adjusted such that the laser beams emit on the plurality of photosensitive sensors in the play area.

7. The interactive sports simulation system of claim 1, wherein the at least one second processing device and the at least one third processing device are configured to monitor the presence of laser beams on the plurality of photosensitive sensors.

8. The interactive sports simulation system of claim 1, wherein the at least one second processing device and the at least one third processing device are configured to transmit the status of user’s presence and a two-dimensional location information in the play area to the end-user device.

9. The interactive sports simulation system of claim 1, further comprising a plurality of regulated power supply circuits configured to receive an input power supply voltage from a battery and to output a regulated power supply voltage to the plurality of vibration motors, the at least one first processing device, the at least one second processing device, the at least one third processing device, the plurality of photosensitive sensors, and the plurality of laser emitters.

10. The interactive sports simulation system of claim 8, further comprising at least one charging interface configured to charge the battery.

11. The interactive sports simulation system of claim 1, further comprising at least one charging status indicator configured to indicate the status of charging, a plurality of connection status indicators configured to indicate the connection between the at least one end-user device and the at least one first processing device, the at least one second processing device, the at least one third processing device, and a plurality of user detection status indicators configured to indicate the presence of user.

12. A method for training a user for a sporting event, comprising: creating a simulated environment of a sporting event for a user by an end-user device; performing a sporting activity associated with the sporting event by the user using a haptic feedback enabled device and transmitting different signals corresponding to different patterns of haptic feedback to a first processing device from the end-user device based on the performed sporting activity by the user; generating different patterns of haptic feedback through a plurality of vibration motors based on the different signals corresponding to the different patterns of haptic feedback received by the first processing device; emitting laser beams on a plurality of photosensitive sensors from a plurality of laser emitters in a play area, whereby monitoring the presence of laser beams on a plurality of photosensitive sensors by a second processing device and a third processing device; and detecting and locating a status of user’s presence in the play area and transmitting the status of user’s presence information and a two-dimensional location information in the play area to the end-user device from the second processing device and the third processing device.

13. The method of claim 12, further comprising a step of mapping different signals to the different patterns of haptic feedback.

14. The method of claim 12, further comprising a step of locating the presence of the user in the play area by the third processing device.

15. The method of claim 12, further comprising a step of finding the plurality of photosensitive sensors which are being blocked by the user from the laser beams.

16. The method of claim 12, further comprising a step of detecting light conditions on the plurality of photosensitive sensors.

17. The method of claim 12, further comprising a step of admitting the light beams in the play area by the plurality of photosensitive sensors.

18. A computer program product comprising module code embedded in a non-transitory data storage medium, wherein execution of the module code on an end-user device causes the end-user device to: create a simulated environment of a sporting event for a user by an end-user device; perform a sporting activity associated with the sporting event by the user using a haptic feedback enabled device and transmitting different signals corresponding to different patterns of haptic feedback to a first processing device from the end-user device based on the performed sporting activity by the user; generate different patterns of haptic feedback through a plurality of vibration motors based on the different signals corresponding to the different patterns of haptic feedback received by the first processing device; emit laser beams on a plurality of photosensitive sensors from a plurality of laser emitters in a play area, whereby monitoring the presence of laser beams on a plurality of photosensitive sensors by a second processing device and a third processing device; and detect and locate a status of user’s presence in the play area and transmit the status of user’s presence and a two-dimensional location information in the play area to the end-user device from the second processing device and the third processing device.

Description:
“INTERACTIVE VIRTUAL REALITY SPORTS SIMULATION SYSTEM AND METHODS EMPLOYED THEREOF”

TECHNICAL FIELD

[001] The disclosed subject matter relates generally to interactive sports training systems and methods. More particularly, the present disclosure relates to an interactive sports simulation system and method for training a user.

BACKGROUND

[002] Interactive sports training systems are configured to simulate various sporting events or physical activities, such as cricket, tennis, baseball, ping-pong, hockey, and fishing. Such systems allow users to compete with virtual opponents in virtual playing fields displayed on an end-user device.

[003] Virtual reality environments can provide users with simulated experiences of sporting events. Such virtual reality environments may be particularly useful for sports such as cricket, in which users (e.g., players) may experience many repetitions of plays while avoiding the chronic injuries that may otherwise result on real-world practice fields. However, conventional virtual reality sports simulators do not provide meaningful training experiences or feedback on the performance of the user (e.g., player). For example, conventional virtual reality cricket game hardware peripherals are not able to improve the cricket game experience in the virtual environment. Although visual and audio signals are typically employed to inform the users of their status, many such systems do not provide haptic feedback to the users.

[004] In light of the aforementioned discussion, there exists a need for a system with novel methodologies that would overcome the above-mentioned disadvantages.

SUMMARY

[005] An objective of the present invention is directed towards making the user feel as if the user is playing a real sport.

[006] Another objective of the present invention is directed towards transmitting the data (e.g., the user’s presence) in real time.

[007] Another objective of the present invention is directed towards enhancing the virtual reality experience.

[008] In an embodiment of the present invention, an interactive sports simulation system comprising a haptic feedback enabled device configured for enabling a user to perform a sporting activity comprising a plurality of vibration motors and a handle, the plurality of vibration motors comprising a plurality of linear resonant actuators, at least one first eccentric rotating mass motor, and at least one second eccentric rotating mass motor.

[009] In another embodiment of the present invention, the interactive sports simulation system comprising at least one first processing device configured to interface with the plurality of vibration motors and an end-user device, the end-user device configured to detect the sporting activity performed by the user and transmit different signals corresponding to different patterns of haptic feedback to the at least one first processing device based on the sporting activity performed by the user and the plurality of vibration motors configured to generate the different patterns of haptic feedback based on the different signals corresponding to the different patterns of haptic feedback received by the at least one first processing device.

[0010] In another embodiment of the present invention, the interactive sports simulation system further comprising a feet detection device and a two-dimensional tracking device comprising a plurality of photosensitive sensors positioned on a plurality of bars and a plurality of laser emitters positioned along a line of sight of the plurality of photosensitive sensors, the plurality of laser emitters configured to emit laser beams on the plurality of photosensitive sensors to detect the user’s presence in a play area.

[0011] In another embodiment of the present invention, the interactive sports simulation system further comprising at least one second processing device positioned in the feet detection device and at least one third processing device positioned in the two-dimensional tracking device configured to interface with the plurality of laser emitters and the plurality of photosensitive sensors.

BRIEF DESCRIPTION OF THE DRAWINGS

[0012] In the following, numerous specific details are set forth to provide a thorough description of various embodiments. Certain embodiments may be practiced without these specific details or with some variations in detail. In some instances, certain features are described in less detail so as not to obscure other aspects. The level of detail associated with each of the elements or features should not be construed to qualify the novelty or importance of one feature over the others.

[0013] FIG. 1 is a block diagram representing an example environment in which aspects of the present disclosure can be implemented. Specifically, FIG. 1 depicts a schematic representation of an interactive sports simulation system according to an embodiment of the present invention.

[0014] FIG. 2 is a diagram depicting the haptic feedback enabled device 102 shown in the FIG. 1, in one or more exemplary embodiments.

[0015] FIG. 3 is a block diagram depicting the haptic feedback enabled device 102 shown in the FIG. 1, in one or more exemplary embodiments.

[0016] FIG. 4A-4B are example diagrams depicting a first play area of the user, in one or more exemplary embodiments.

[0017] FIG. 4C is a block diagram depicting the feet detection device 103 comprising an array of photosensitive sensors 408-410 and the laser emitters 412-414, in accordance with one or more embodiments.

[0018] FIG. 5A-FIG. 5B are example diagrams depicting a second play area of the user, in one or more exemplary embodiments.

[0019] FIG. 5C is a block diagram depicting the 2-D feet tracking device 105 comprising an array of photosensitive sensors 510-516 and the laser emitters 518-524, in accordance with one or more embodiments.

[0020] FIG. 6 is a flow diagram depicting a method for training the user for the sporting event in a virtual environment, in one or more exemplary embodiments.

[0021] FIG. 7 is a flow diagram depicting a method for generating the haptic feedback depending upon the sporting activity performed by the user, in one or more exemplary embodiments.

[0022] FIG. 8 is an example flow diagram depicting a method for detecting and transmitting the status of user’s presence in the first play area, in one or more exemplary embodiments.

[0023] FIG. 9 is a flow diagram depicting a method for detecting and transmitting two-dimensional position of user’s feet in the second play area, in one or more exemplary embodiments.

DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

[0024] It is to be understood that the present disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The present disclosure is capable of other embodiments and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting.

[0025] The use of “including”, “comprising” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item. Further, the use of terms “first”, “second”, and “third”, and the like, herein do not denote any order, quantity, or importance, but rather are used to distinguish one element from another.

[0026] Referring to FIG. 1, FIG. 1 is a block diagram 100 representing an example environment in which aspects of the present disclosure can be implemented. Specifically, FIG. 1 depicts a schematic representation of an interactive sports simulation system according to an embodiment of the present invention. The example environment is shown containing only representative devices and systems for illustration. However, real-world environments may contain more or fewer systems or devices. FIG. 1 depicts a haptic feedback enabled device 102, a feet detection device 103, a two-dimensional feet tracking device 105, a network 104, and an end-user device 106. The network 104 may include, but is not limited to, an Ethernet, a wireless local area network (WLAN), a wide area network (WAN), a Bluetooth low energy network, a ZigBee network, a Wi-Fi communication network (e.g., wireless high-speed internet), a combination of networks, or a cellular service such as a 4G (e.g., LTE, mobile WiMAX) or 5G cellular data service.

[0027] In one or more exemplary embodiments, a virtual reality environment provides the user with simulated experiences of sporting events. A simulated environment of a sporting event is generated for the user by the end-user device 106, where the simulated environment depicting the sporting event appears to be in the immediate physical surroundings of the user. The simulated environment presents one or more virtual objects and play areas of the sporting event to the user through the end-user device 106. The virtual object may be, for example, a cricket ball, a baseball, a golf ball, a tennis ball, a ping-pong ball, a hockey puck, a field hockey ball, a hurling ball, and the like. The play areas include, but are not limited to, a batting crease, a bowling crease, a bowling pitch, predefined fielding zones, and the like.

[0028] The haptic feedback enabled device 102 may include, but is not limited to, a cricket bat, a baseball bat, a golf club, a tennis racket, a hockey stick, a ping-pong bat, and the like. The end-user device 106 may include a device such as a personal computer, a workstation, an electronic book reader, a personal digital assistant, a mobile station, a mobile phone, a computing tablet, and the like. The haptic feedback enabled device 102 is configured to provide haptic feedback depending upon the user’s sporting activity. The feet detection device 103 is configured to detect the status of the user’s presence in the play area and transmit the detected status information to the end-user device 106. The two-dimensional feet tracking device 105 is configured to track the feet position of the user in the play area and transmit the tracked information to the end-user device 106.

[0029] Referring to FIG. 2, FIG. 2 is a diagram 200 depicting the haptic feedback enabled device 102 shown in the FIG. 1, in one or more exemplary embodiments. The haptic feedback enabled device 102 comprises vibration motors 202-212 and a handle 214. The vibration motors 202-212 comprise two eccentric rotating mass motors 202 and 204 and four linear resonant actuators 206-212. The eccentric rotating mass motor 202 is positioned at the top of the handle 214 and the linear resonant actuators 206-212 are positioned below the eccentric rotating mass motor 202. The eccentric rotating mass motor 204 is positioned below the linear resonant actuators 206-212.

[0030] Referring to FIG. 3, FIG. 3 is a block diagram 300 depicting the haptic feedback enabled device 102 shown in the FIG. 1, in one or more exemplary embodiments. The haptic feedback enabled device 102 further comprises a first processing device 302, a regulated power supply circuit 304, a battery 306, a charging interface 308, a charging status indicator 310, and a connection status indicator 312. The first processing device 302 includes, but is not limited to, a microcontroller (for example, ARM 7 or ARM 11), a microprocessor, a digital signal processor, a microcomputer, a field programmable gate array, a programmable logic device, a state machine, or logic circuitry. The first processing device 302 is configured to interface with the vibration motors 202-212 to generate different patterns of haptic feedback. The haptic feedback is provided in the form of tactile feedback by applying forces, vibrations, or motions from the vibration motors 202-212 to the user.

[0031] The regulated power supply circuit 304 is configured to receive an input power supply voltage from the battery 306 and to output a regulated power supply voltage to the vibration motors 202-212 and the first processing device 302. The output from the regulated power supply circuit 304 may be unidirectional but is nearly always DC (direct current). The charging interface 308 is configured to charge the battery 306 on providing input power (e.g., DC input power). The charging interface 308 includes a charging integrated circuit and a connector. The charging interface 308 may be a USB interface or a DC female power jack. The charging integrated circuit and the connector are coupled to the battery 306. The charging status indicator 310 is connected to the charging interface 308 to indicate the charging status. The charging status indicator 310 comprises light emitting diodes (red and green LEDs). The charging status may include, but is not limited to, charging in progress, charge completion, and the like. The connection status indicator 312 is configured to indicate the connection between the first processing device 302 and the end-user device 106.

[0032] The user is enabled to perform a sporting activity by the haptic feedback enabled device 102. The sporting activity may include hitting the virtual object using the haptic feedback enabled device 102. The end-user device 106 is configured to detect the performed activity and transmit different signals corresponding to different patterns of haptic feedback to the first processing device 302 depending on how the user performs the sporting activity. The first processing device 302 receives the signal from the end-user device 106 to actuate the vibration motors 202-212. The first processing device 302 is configured to provide haptic feedback through the vibration motors 202-212. The vibration motors 202-212 are configured to generate the haptic feedback. The different signals of the end-user device 106 are mapped to the different patterns of haptic feedback. Each transmitted signal corresponds to a pattern of haptic feedback; depending upon the signal received, the corresponding haptic feedback is generated by the vibration motors 202-212.
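Purely for illustration, the signal-to-pattern mapping described in paragraph [0032] could be realized as a small lookup table on the first processing device 302. The Python sketch below is not part of this disclosure; the signal codes, motor channel numbers, and the drive_motor() helper are hypothetical assumptions.

# Hypothetical sketch of the signal-to-pattern mapping described above.
# Signal codes, motor channel numbers, and drive_motor() are assumptions
# for illustration; the disclosure does not specify them.

import time

# Each signal code maps to a pattern: (motor channels, intensity 0.0-1.0, duration in seconds)
HAPTIC_PATTERNS = {
    "EDGE_HIT":           ([206, 208], 0.4, 0.08),            # light buzz on two LRAs
    "MIDDLE_HIT":         ([206, 208, 210, 212], 0.9, 0.15),  # strong buzz on all four LRAs
    "MISS_SWING":         ([202], 0.3, 0.05),                 # brief pulse on the top ERM motor
    "BALL_IMPACT_HANDLE": ([202, 204], 1.0, 0.20),            # both ERM motors at full strength
}

def drive_motor(channel: int, intensity: float) -> None:
    """Placeholder for the PWM/driver call on the first processing device."""
    print(f"motor {channel} -> {intensity:.2f}")

def play_pattern(signal_code: str) -> None:
    """Actuate the vibration motors for the pattern mapped to a received signal."""
    channels, intensity, duration = HAPTIC_PATTERNS[signal_code]
    for ch in channels:
        drive_motor(ch, intensity)
    time.sleep(duration)
    for ch in channels:
        drive_motor(ch, 0.0)   # stop the motors after the pattern completes

play_pattern("MIDDLE_HIT")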

[0033] Referring to FIG. 4A-4B, FIG. 4A-4B are example diagrams 400a-400b depicting the first play area of the user, in one or more exemplary embodiments. A second processing device 416 (not shown) is configured to detect the presence of the user inside the first play area (e.g., inside the crease mark covering region A and region B). The diagrams 400a-400b depict bars 402-406, the arrays of photosensitive sensors 408-410, and laser emitters 412-414. The bars 402-406 are positioned in a predetermined shape (e.g., in a U-shape). The length of the bars 402-404 may be, for example, 120 cm. The length of the bar 406 may be, for example, 264 cm. The arrays of photosensitive sensors 408-410 are positioned along the bars 402-404 with a certain distance (e.g., 3 cm) between any two photosensitive sensors. The laser emitters 412-414 are positioned at particular places (e.g., at point A and point B) and may be positioned along the line of sight of the photosensitive sensors 408-410. The laser emitters 412-414 are configured to emit the laser beams on the regions (e.g., the laser emitter 412 is configured to emit the laser beam in region A and the laser emitter 414 is configured to emit the laser beam in region B). The arrays of photosensitive sensors 408-410 comprise photosensitive sensors configured to detect the presence of light on them. For example, the laser emitter 412 at point A is mounted such that its laser beam emits on the array of photosensitive sensors 408 on the bar 402. The laser emitter 414 at point B is mounted such that its laser beam emits on the array of photosensitive sensors 410 on the bar 404. The arrays of photosensitive sensors 408-410 are configured to admit the laser beams in the play area (in region A and region B).

[0034] Referring to FIG. 4C, FIG. 4C is a block diagram 400c depicting the feet detection device 103 having the array of photosensitive sensors 408-410 and the laser emitters 412-414, in accordance with one or more embodiments. The diagram 400c depicts the second processing device 416, which is interfaced with the arrays of photosensitive sensors 408-410 and the laser emitters 412-414. The second processing device 416 is configured to transmit data to the end-user device 106 through the network 104. The data may include, for example, the presence of the user in the play area, and the like. A regulated power supply circuit 418 is configured to provide power to the laser emitters 412-414, the second processing device 416, and the arrays of photosensitive sensors 408-410. A user detection status indicator 420 is connected to the second processing device 416 and is configured to indicate the status of the user. The status may include the presence of the user inside the play area or outside the play area. A connection status indicator 422 is connected to the second processing device 416 and is configured to indicate the connection between the second processing device 416 and the end-user device 106. The second processing device 416 continuously monitors the presence of a laser beam on the arrays of photosensitive sensors 408-410 and recognizes whether the user is inside the play area or outside the play area. Referring to FIG. 4B, if the laser beam is blocked on any of the photosensitive sensors, then the user is inside the play area (e.g., the crease marking). If the laser beam emits on all of the photosensitive sensors, then the user is not inside the play area (e.g., the crease marking).
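The presence check performed by the second processing device 416 can be summarized, as a non-authoritative sketch, in a few lines of Python. Each photosensitive sensor is assumed to report True when the laser beam reaches it; the sensor counts and the read_sensor_bar() helper are illustrative assumptions only.

# Minimal sketch of the presence check performed by the second processing
# device. Sensor counts and read_sensor_bar() are illustrative placeholders.

from typing import List

def read_sensor_bar(num_sensors: int) -> List[bool]:
    """Placeholder for sampling one bar of photosensitive sensors (408 or 410)."""
    return [True] * num_sensors   # all lit -> no foot blocking the beam

def user_inside_play_area(bar_a: List[bool], bar_b: List[bool]) -> bool:
    """The user is inside the crease if any sensor in region A or region B is blocked."""
    return not all(bar_a) or not all(bar_b)

region_a = read_sensor_bar(40)   # e.g., a 120 cm bar with 3 cm sensor spacing
region_b = read_sensor_bar(40)
print("inside crease" if user_inside_play_area(region_a, region_b) else "outside crease")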

[0035] Referring to FIG. 5A-FIG. 5B, FIG. 5A-FIG. 5B are example diagrams 500a-500b depicting the second play area of the user, in one or more exemplary embodiments. A third processing device 528 (not shown) is configured to track the two-dimensional position of the user’s feet in the second play area (e.g., along the X-axis and along the Y-axis). The diagrams 500a-500b depict bars 502-508, the arrays of photosensitive sensors 510-516, and the laser emitters 518-524. The bars 502-508 are positioned in a predetermined shape (e.g., in a rectangular shape). The length of the bars 502-504 may be, for example, 12 feet. The length of the bars 506-508 may be, for example, 10 feet. The arrays of photosensitive sensors 510-516 are positioned along the bars 502-508 with a certain distance (e.g., 3 cm) between any two photosensitive sensors. The laser emitters 518-524 are positioned such that their laser beams emit on the arrays of photosensitive sensors 510-516. For example, the laser emitter 518 and the laser emitter 522 are positioned such that their laser beams emit on the array of photosensitive sensors 510 and the array of photosensitive sensors 514, respectively. The laser beam from point A covers the region ABC and the laser beam from point C covers the region ADC. The laser emitter 520 and the laser emitter 524 are positioned such that their laser beams fall on the array of photosensitive sensors 512 and the array of photosensitive sensors 516, respectively. The laser beam from point B covers the region BCD and the laser beam from point D covers the region ABD. The arrays of photosensitive sensors 510-516 are configured to admit the laser beams in the play area (e.g., the region ABCD).

[0036] Referring to FIG. 5C, FIG. 5C is a block diagram 500c depicting the two-dimensional feet tracking device 105 comprising the arrays of photosensitive sensors 510-516 and the laser emitters 518-524, in accordance with one or more embodiments. The block diagram 500c depicts the third processing device 528, which is interfaced with the arrays of photosensitive sensors 510-516 and the laser emitters 518-524. The third processing device 528 is configured to transmit data to the end-user device 106 through the network 104. A regulated power supply circuit 530 is configured to provide power to the arrays of photosensitive sensors 510-516, the laser emitters 518-524, and the third processing device 528. A user detection status indicator 526 is connected to the third processing device 528 and is configured to indicate the status of the user’s presence. A connection status indicator 532 is connected to the third processing device 528 and is configured to indicate the connection between the third processing device 528 and the end-user device 106. The third processing device 528 continuously monitors the presence of the laser beams on the arrays of photosensitive sensors 510-516 on the bars 502-508.

[0037] The third processing device 528 continuously monitors the presence of the laser beams on the arrays of photosensitive sensors 510-516 to recognize whether the user is inside the play area or outside the play area (e.g., the crease marking). The diagram 500b depicts the X-axis and the Y-axis. In an example, the laser emitters 518 and 522 at point A and point C, along with the photosensitive sensors 510 on the bar 502 (BC) and 514 on the bar 504 (AD), give the position along the X-axis. The laser emitters 520 and 524 at point B and point D, along with the photosensitive sensors 516 on the bar 508 (AB) and 512 on the bar 506 (CD), give the position along the Y-axis. As shown in FIG. 5B, if the user stands at point E inside the rectangular region, then the laser beam is blocked on the array of photosensitive sensors 510 on the bar 502 (BC). This gives the position of the user along the X-axis. Similarly, the laser beam is blocked on the array of photosensitive sensors 516 on the bar 508 (AB). This gives the position of the user along the Y-axis.
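As an illustrative sketch only, the blocked sensor indices on the X bar (BC) and the Y bar (AB) could be converted into a two-dimensional position as shown below, assuming the 3 cm sensor spacing mentioned above; all function names here are hypothetical and not defined by this disclosure.

# Illustrative conversion of blocked sensor indices into a 2-D position,
# assuming a uniform 3 cm sensor spacing on each bar. Names are assumptions.

from typing import List, Optional, Tuple

SENSOR_SPACING_CM = 3.0

def blocked_index(bar: List[bool]) -> Optional[int]:
    """Return the index of the first blocked sensor, or None if all sensors are lit."""
    for i, lit in enumerate(bar):
        if not lit:
            return i
    return None

def feet_position(x_bar: List[bool], y_bar: List[bool]) -> Optional[Tuple[float, float]]:
    """Two-dimensional feet position in cm, or None when the user is outside the area."""
    ix, iy = blocked_index(x_bar), blocked_index(y_bar)
    if ix is None or iy is None:
        return None
    return ix * SENSOR_SPACING_CM, iy * SENSOR_SPACING_CM

# Example: sensor 12 blocked on bar BC (~120 sensors over 12 feet) and sensor 5 on bar AB.
x_bar = [True] * 120; x_bar[12] = False
y_bar = [True] * 100; y_bar[5] = False
print(feet_position(x_bar, y_bar))   # -> (36.0, 15.0)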

[0038] Referring to FIG. 6, FIG. 6 is a flow diagram 600 depicting a method for training the user for the sporting event (e.g., cricket) in a virtual environment, in one or more exemplary embodiments. The method 600 may be carried out in the context of the details of FIG. 1, FIG. 2, and FIG. 3, FIG. 4A-4C, and FIG. 5A-5C. However, the method 600 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.

[0039] The method commences at step 602, creating the simulated environment of a sporting event for the user by the end-user device. At step 604, performing the sporting activity using the haptic feedback enabled device by the user and transmitting the different signals corresponding to different patterns of haptic feedback to the first processing device from the end-user device based on the sporting activity performed by the user. At step 606, generating different patterns of haptic feedback through the vibration motors based on the different signals corresponding to the different patterns of haptic feedback received by the first processing device. Thereafter, at step 608, emitting the laser beams on the photosensitive sensors from the laser emitters in the play area. At step 610, monitoring the presence of laser beams on the array of photosensitive sensors by the second processing device and the third processing device. At step 612, locating the user’s presence in the play area. At step 614, transmitting the status of user’s presence and the two-dimensional location information in the play area to the end-user device from the second processing device and the third processing device.
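The flow of steps 602-614 may be pictured, as a hypothetical sketch from the end-user device's perspective, as the loop below. The Simulation, HapticLink, and FeetLink classes and their methods are illustrative stand-ins and not an API defined by this disclosure.

# Hypothetical end-user-device view of steps 602-614; all classes are stubs.

import random
from dataclasses import dataclass

@dataclass
class ActivityEvent:
    haptic_signal: str       # e.g. "MIDDLE_HIT", mapped to a vibration pattern

class Simulation:
    """Stub for the simulated sporting environment (step 602)."""
    def poll_sporting_activity(self):
        # Pretend the user occasionally hits the virtual ball (step 604).
        return ActivityEvent("MIDDLE_HIT") if random.random() < 0.5 else None
    def update_player_state(self, presence, position):
        print(f"presence={presence}, position={position}")     # step 614

class HapticLink:
    """Stub for the link to the first processing device (steps 604-606)."""
    def send(self, signal): print(f"haptic signal sent: {signal}")

class FeetLink:
    """Stub for the feet detection / 2-D tracking devices (steps 608-612)."""
    def receive_status(self): return True
    def receive_position(self): return (36.0, 15.0)

def run_once(sim, haptic, feet):
    event = sim.poll_sporting_activity()
    if event is not None:
        haptic.send(event.haptic_signal)
    sim.update_player_state(feet.receive_status(), feet.receive_position())

run_once(Simulation(), HapticLink(), FeetLink())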

[0040] Referring to FIG. 7, FIG. 7 is a flow diagram 700 depicting a method for generating the haptic feedback depending upon the performed sporting activity by the user, in one or more exemplary embodiments. The method 700 may be carried out in the context of the details of FIG. 1, FIG. 2, and FIG. 3, FIG. 4A-4C, FIG. 5A-5C, and FIG. 6. However, method 700 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.

[0041] The method commences at step 702, where the user performs the sporting activity using the haptic feedback enabled device. At step 704, detecting the performed sporting activity by the end-user device and transmitting different signals associated with the performed sporting activity from the end-user device to the first processing device. Thereafter, at step 706, the first processing device determines whether the different signals are received from the end-user device. If the answer to step 706 is YES, then at step 708, generating different patterns of haptic feedback for the user through the vibration motors based on the different signals corresponding to the different patterns of haptic feedback received by the first processing device, and the method continues at step 702. If the answer to step 706 is NO, then the method continues at step 704.
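The decision loop of steps 704-708 resembles a simple polling cycle on the first processing device. In the sketch below, the queue-based receive_signal() helper is a hypothetical stand-in for the actual radio or serial link, which this disclosure does not specify.

# Illustrative polling loop for steps 704-708; the transport is a stand-in queue.

import queue

signal_queue: "queue.Queue[str]" = queue.Queue()   # stands in for the radio/serial link

def receive_signal(timeout_s: float = 0.05):
    """Return the next signal from the end-user device, or None if none arrived."""
    try:
        return signal_queue.get(timeout=timeout_s)
    except queue.Empty:
        return None                                # answer to step 706 is NO

def haptic_loop(iterations: int) -> None:
    for _ in range(iterations):
        signal = receive_signal()                  # steps 704 and 706
        if signal is not None:
            print(f"actuating vibration motors for pattern: {signal}")   # step 708
        # otherwise keep waiting for the next transmission (back to step 704)

signal_queue.put("EDGE_HIT")
haptic_loop(3)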

[0042] Referring to FIG. 8, FIG. 8 is a flow diagram 800 depicting a method for detecting and transmitting the status of the user’s presence in the play area (e.g., region A and region B), in one or more exemplary embodiments. The method 800 may be carried out in the context of the details of FIG. 1, FIG. 2, and FIG. 3, FIG. 4A-4C, FIG. 5A-5C, FIG. 6, and FIG. 7. However, method 800 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.

The method commences at step 802, monitoring the presence of laser beams on the array of photosensitive sensors by the second processing device. At step 804, the second processing device determines whether the laser beams are emitted on all the photosensitive sensors. If the answer to step 804 is YES, then at step 806, the user is not present in the play area (e.g., region A and region B shown in the FIG. 4B). If the answer to step 804 is NO, then at step 808, the user’s presence is located in the play area (e.g., region A or region B shown in the FIG. 4B). At step 810, transmitting the status of the user’s presence in the play area (e.g., in region A and region B) to the end-user device from the second processing device, and the method continues at step 802.

[0043] Referring to FIG. 9, FIG. 9 is a flow diagram 900 depicting a method for determining and transmitting the two-dimensional position of the user’s feet in the play area (e.g., along the X-axis and along the Y-axis), in one or more exemplary embodiments. The method 900 may be carried out in the context of the details of FIG. 1, FIG. 2, and FIG. 3, FIG. 4A-4C, FIG. 5A-5C, FIG. 6, FIG. 7, and FIG. 8. However, method 900 may also be carried out in any desired environment. Further, the aforementioned definitions may equally apply to the description below.

[0044] The method commences at step 902, monitoring the presence of laser beams on the array of photosensitive sensors by the third processing device. At step 904, the third processing device determines whether the laser beams are emitted on all the photosensitive sensors. If the answer to step 904 is YES, then at step 906, the user is not present in the play area (e.g., inside the region shown in the FIG. 5B). If the answer to step 904 is NO, then at step 908, the photosensitive sensors that are blocked from the laser beams on the bars (e.g., BC and AD) are detected. Thereafter, at step 910, the user’s presence is located in the play area (e.g., along the X-axis). At step 912, the photosensitive sensors that are blocked from the laser beams on the bars (e.g., AB and CD) are detected. Thereafter, at step 914, the user’s presence is located in the play area (e.g., along the Y-axis). At step 916, transmitting the two-dimensional position of the user’s feet in the play area (e.g., along the X-axis and Y-axis) to the end-user device from the third processing device. The process continues at step 902.

[0045] Referring to FIG. 10, FIG. 10 is a block diagram illustrating the details of digital processing system 1000 in which various aspects of the present disclosure are operative by execution of appropriate software instructions. Digital processing system 1000 may correspond to end-user device 106 (or any other system in which the various features disclosed above can be implemented).

[0046] Digital processing system 1000 may contain one or more processors such as a central processing unit (CPU) 1010, random access memory (RAM) 1020, secondary memory 1030, graphics controller 1060, display unit 1070, network interface 1080, and input interface 1090. All the components except display unit 1070 may communicate with each other over communication path 1050, which may contain several buses as is well known in the relevant arts. The components of Figure 10 are described below in further detail.

[0047] CPU 1010 may execute instructions stored in RAM 1020 to provide several features of the present disclosure. CPU 1010 may contain multiple processing units, with each processing unit potentially being designed for a specific task. Alternatively, CPU 1010 may contain only a single general-purpose processing unit.

[0048] RAM 1020 may receive instructions from secondary memory 1030 using communication path 1050. RAM 1020 is shown currently containing software instructions, such as those used in threads and stacks, constituting shared environment 1025 and/or user programs 1026. Shared environment 1025 includes operating systems, device drivers, virtual machines, etc., which provide a (common) run time environment for execution of user programs 1026.

[0049] Graphics controller 1060 generates display signals (e.g., in RGB format) to display unit 1070 based on data/instructions received from CPU 1010. Display unit 1070 contains a display screen to display the images defined by the display signals. Input interface 1090 may correspond to a keyboard and a pointing device (e.g., touch-pad, mouse) and may be used to provide inputs. Network interface 1080 provides connectivity to a network (e.g., using Internet Protocol), and may be used to communicate with other systems (such as those shown in Figure 1, network 104) connected to the network.

[0050] Secondary memory 1030 may contain hard drive 1035, flash memory 1036, and removable storage drive 1037. Secondary memory 1030 may store the data and software instructions (e.g., for performing the actions noted above with respect to the Figures), which enable digital processing system 1000 to provide several features in accordance with the present disclosure.

[0051] Some or all of the data and instructions may be provided on removable storage unit 1040, and the data and instructions may be read and provided by removable storage drive 1037 to CPU 1010. Floppy drive, magnetic tape drive, CD-ROM drive, DVD Drive, Flash memory, removable memory chip (PCMCIA Card, EEPROM) are examples of such removable storage drive 1037.

[0052] Removable storage unit 1040 may be implemented using a medium and storage format compatible with removable storage drive 1037 such that removable storage drive 1037 can read the data and instructions. Thus, removable storage unit 1040 includes a computer readable (storage) medium having stored therein computer software and/or data. However, the computer (or machine, in general) readable medium can be in other forms (e.g., non-removable, random access, etc.).

[0053] In this document, the term "computer program product" is used to generally refer to removable storage unit 1040 or hard disk installed in hard drive 1035. These computer program products are means for providing software to digital processing system 1000. CPU 1010 may retrieve the software instructions, and execute the instructions to provide various features of the present disclosure described above.

[0054] The term “storage media/medium” as used herein refers to any non-transitory media that store data and/or instructions that cause a machine to operate in a specific fashion. Such storage media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical disks, magnetic disks, or solid-state drives, such as secondary memory 1030. Volatile media includes dynamic memory, such as RAM 1020. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid-state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, and any other memory chip or cartridge.

[0055] Storage media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 1050. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.

[0056] Reference throughout this specification to “one embodiment”, “an embodiment”, or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrases “in one embodiment”, “in an embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.

[0057] Although the present disclosure has been described in terms of certain preferred embodiments and illustrations thereof, other embodiments and modifications to preferred embodiments may be possible that are within the principles and spirit of the invention. The above descriptions and figures are therefore to be regarded as illustrative and not restrictive.

[0058] Thus the scope of the present disclosure is defined by the appended claims and includes both combinations and sub-combinations of the various features described hereinabove as well as variations and modifications thereof, which would occur to persons skilled in the art upon reading the foregoing description.