


Title:
A METHOD AND SYSTEM FOR PROVIDING A VIRTUAL REALITY EXPERIENCE
Document Type and Number:
WIPO Patent Application WO/2020/224805
Kind Code:
A1
Abstract:
The present invention relates to a method for providing a virtual reality experience on a mechanical user motion apparatus. The method includes the steps of receiving sensor information from a device mounted relative to the user on the apparatus; processing the sensor information in conjunction with pre-stored kinetic data relating to the motion apparatus to generate a mapping between movement of the user and a virtual world; and displaying the virtual world to the user via a display in accordance with the mapping. A system and software are also disclosed.

Inventors:
MORLEY THOMAS JAMES (GB)
GOLEMBEWSKI MICHAEL WALTER (GB)
Application Number:
PCT/EP2020/025204
Publication Date:
November 12, 2020
Filing Date:
May 04, 2020
Assignee:
STUDIO GO GO LTD (GB)
International Classes:
A63F13/26; A63G31/16; G02B27/01; G06F3/01
Domestic Patent References:
WO2016023817A2, 2016-02-18
WO2017153532A1, 2017-09-14
WO2018109502A1, 2018-06-21
Foreign References:
US20150363976A1, 2015-12-17
US20170072316A1, 2017-03-16
Attorney, Agent or Firm:
CHANDRAHASEN, Gerard Francis (GB)
Claims:
Claims

1. A method for providing a virtual reality experience on a mechanical user motion apparatus, including: a) receiving sensor information from a device mounted relative to the user on the apparatus; b) processing the sensor information in conjunction with pre-stored kinetic data relating to the motion apparatus to generate a mapping between movement of the user and a virtual world; and c) displaying the virtual world to the user via a display in accordance with the mapping.

2. A method as claimed in claim 1, wherein the mechanical user motion apparatus is a mechanical ride.

3. A method as claimed in claim 2, wherein the mechanical ride is an amusement ride.

4. A method as claimed in any one of the preceding claims, wherein the device is worn by the user.

5. A method as claimed in any one of the preceding claims, wherein the sensor information includes one or more selected from the set of acceleration, positional and orientation information.

6. A method as claimed in claim 5, wherein the acceleration, positional and orientation information relates to the physical body of the user.

7. A method as claimed in any one of the preceding claims, wherein the kinetic data includes a physics simulation of the mechanical apparatus.

8. A method as claimed in any one of the preceding claims, further including the step of generating a dynamic physics simulation for the user on the motion apparatus using the pre-stored kinetic data; wherein the sensor information is processed in conjunction with the pre-stored kinetic data by using the dynamic physics simulation.

9. A method as claimed in claim 8, wherein the dynamic physics simulation is refined using the sensor information.

10. A method as claimed in any one of claims 8 to 9, wherein the mapping is generated by extracting information from the dynamic physics simulation.

11. A method as claimed in claim 10, wherein the information includes user information relating to the user’s position and/or orientation.

12. A method as claimed in claim 11, wherein the user’s position and/or orientation is mapped to a virtual position and/or orientation of the user within the virtual world by transforming the user’s position and/or orientation in accordance with one or more transformation functions.

13. A method as claimed in claim 12, wherein the one or more transformation functions are defined to apply at various trigger points during the ride.

14. A method as claimed in claim 13, wherein the trigger points are one or more selected from the set of time and/or motion phase.

15. A method as claimed in any one of claims 12 to 14, further including the step of rendering the virtual world from the perspective of the virtual position and/or orientation of the user.

16. A method as claimed in any one of the preceding claims, wherein the mapping is one-to-one between the movement of the user and the virtual world.

17. A method as claimed in any one of the preceding claims, wherein data representing the movement of the user is transformed during the mapping such that the mapping is not one-to-one between the movement of the user and the virtual world.

18. A method as claimed in claim 17, wherein the data representing the movement of the user is transformed by one or more methods selected from the set of amplification, suppression, reversal, offsetting, phase shifting, and swapping.

19. A method as claimed in any one of the preceding claims, wherein the display is within a headset worn by the user.

20. A method as claimed in any one of the preceding claims, wherein the headset is a virtual reality headset.

21. A method as claimed in any one of the preceding claims, wherein the device is the headset.

22. A method as claimed in any one of the preceding claims, further including the step of filtering the sensor information before processing the sensor information to generate the mapping.

23. A method as claimed in any one of the preceding claims, wherein the sensor information is received from one or more sensors and wherein at least one sensor is located within the device.

24. A system for providing a virtual reality experience on a mechanical user motion apparatus, including: a device, including one or more sensors, each sensor configured for generating sensor information, and the device configured for mounting relative to a user when on the apparatus; a processor configured for processing the sensor information in conjunction with pre-stored kinetic data relating to the motion apparatus to generate a mapping between movement of the user and a virtual world; a display configured for displaying the virtual world to the user via a display in accordance with the mapping; and a memory configured for storing the pre-stored kinetic data.

25. A computer program configured, when executed on a processor, for performing the steps of the method of any one of claims 1 to 23.

Description:
A Method and System for Providing a Virtual Reality Experience

Field of Invention

The present invention is in the field of virtual reality. More particularly, but not exclusively, the present invention relates to provision of a virtual reality experience on a mechanical user motion apparatus.

Background

Circular and/or harmonic and/or repeating mechanical rides provide entertainment and/or sometimes education and/or health and/or leisure benefits to users, or “riders”. Such mechanical rides include amusement rides such as the carousels or Ferris wheels found at fairgrounds, theme parks, and family entertainment centres; health and fitness equipment such as rowing or stepper machines, or trampolines found at gyms, leisure centres or in domestic settings; play equipment such as swings, seesaws or roundabouts found in playgrounds or in domestic settings; kinetic furniture such as porch swings and rocking chairs found in public spaces or in domestic settings; and sports equipment such as rowing boats.

In typical mechanical rides, the user’s visual field is that of the ride’s surroundings or perhaps a static view of the inside of the ride. There is a desire to improve mechanical rides by providing augmentation or replacement of the user's visual field. Some existing mechanical rides in the entertainment sector accomplish this by installation of multi-user screens or personal screens. The screens can display different visual representations corresponding to the movement of the mechanical ride.

Examples of mechanical rides with this feature include:

1. Sum of All Thrill Rides at Epcot - The rider designs their own experience at a terminal that drives a robotic arm with the rider’s seat at the end to provide the physical experience, using a wrap-around screen to deliver the accompanying visual experience.

2. Galactica at Alton Towers - The visual experience is pre-scripted, then stored and played on VR headsets worn by riders on the ride, in synchronisation with the physical experience using keyframe synchronisation points delivered from sensing devices positioned off the ride.

3. Battle for Eire - An interactive simulator-VR attraction, where the story world drives the motion of the simulator and an accompanying audio-visual experience is delivered to the rider’s VR headset.

However, there is a desire for more advanced visual representations taking advantage of virtual reality technology.

It is an object of the present invention to provide a method and system for a virtual reality experience on a mechanical motion apparatus which overcomes the disadvantages of the prior art, or at least provides a useful alternative.

Summary of Invention

According to a first aspect of the invention there is provided a method for providing a virtual reality experience on a mechanical user motion apparatus, including:

a) receiving sensor information from a device mounted relative to the user on the apparatus;

b) processing the sensor information in conjunction with pre-stored kinetic data relating to the motion apparatus to generate a mapping between movement of the user and a virtual world; and

c) displaying the virtual world to the user via a display in accordance with the mapping.

According to a further aspect of the invention there is provided a system for providing a virtual reality experience on a mechanical user motion apparatus, including:

a device, including one or more sensors, each sensor configured for generating sensor information, and the device configured for mounting relative to a user when on the apparatus;

a processor configured for processing the sensor information in conjunction with pre-stored kinetic data relating to the motion apparatus to generate a mapping between movement of the user and a virtual world;

a display configured for displaying the virtual world to the user via a display in accordance with the mapping; and

a memory configured for storing the pre-stored kinetic data.

Other aspects of the invention are described within the claims.

Brief Description of the Drawings

Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings in which:

Figure 1 : shows a block diagram for a system in accordance with an embodiment of the invention;

Figure 2: shows a flow diagram for a method in accordance with an embodiment of the invention;

Figure 3: shows a flow diagram for a data flow method in accordance with an embodiment of the invention;

Figure 4: shows a screenshot of an operator interface in accordance with an embodiment of the invention;

Figure 5: shows a diagram illustrating different methods for a swing in accordance with an embodiment of the invention;

Figure 6: shows a diagram illustrating different methods for a twist/scrambler ride in accordance with an embodiment of the invention.

Figure 7: shows a diagram illustrating a system for a twist/scrambler ride in accordance with an embodiment of the invention;

Figure 8: shows a diagram illustrating a system for a swing in accordance with an embodiment of the invention;

Figure 9: shows a diagram illustrating an inchworm transformation in accordance with an embodiment of the invention.

Figure 10: shows a screenshot of a creator interface for a tool for designing virtual experiences in accordance with an embodiment of the invention;

Figure 11: shows a screenshot of a creator interface for a tool for designing virtual experiences in accordance with an embodiment of the invention; and

Figure 12: shows a screenshot of a creator interface for a tool for designing virtual experiences in accordance with an embodiment of the invention.

Detailed Description of Preferred Embodiments

The present invention provides a method and system for a virtual reality experience on a mechanical user motion apparatus such as a mechanical ride.

Throughout this document the terms user (of a mechanical user motion apparatus) and rider will be used interchangeably.

The inventors have discovered how to provide a virtual reality experience for existing mechanical rides without requiring modification, or requiring only minimal modification, to the ride itself. To create rich user experiences for riders, the inventors have determined that contextual knowledge of the mechanical ride can be used. Over hundreds of years humans have created mechanical rides based on circular and/or harmonic motion. These static rides are typically designed to operate on a relatively small footprint around which a rider’s position will move over time, usually in a repeating pattern. These patterns of movement can be repeated indefinitely and will generate associated patterns of repeating forces experienced by the rider that are (a) related to changes in rotational and linear accelerations, and changes in the perceived direction of gravitational force being experienced by the rider; and/or (b) persistent forces related to circular motion, that are being experienced by the rider as constant accelerations. These patterns are so strong that they can usually be detected and predicted by eye.

Once detected, the inventors have discovered that these repeating patterns can be used to (a) synchronise content produced by the processor to play in harmony with the patterns of movement felt by the rider, (b) predict upcoming position and movement to dynamically generate audio-visual content, (c) correct sensor data noise, and (d) predict and correct rider forward direction. These repeating patterns are often ubiquitous (for example: there are many rides that use fundamental principles of the pendulum). The present invention may use a local configuration file to parametrically adjust the ride model to make it applicable to that local ride. In the case of a pendulum, local information about the pendulum length may be stored.
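By way of illustration, a local configuration file for a pendulum-type ride might look like the following hypothetical sketch; the field names and values are assumptions made for illustration and are not prescribed by this disclosure.

```python
# Hypothetical local ride configuration, loaded in Python. The schema
# ("ride_type", "pendulum_length_m", etc.) is illustrative only.
import json

example_config = """
{
    "ride_type": "swing",
    "pendulum_length_m": 2.0,
    "max_run_time_s": 180,
    "max_intensity": 0.7
}
"""

config = json.loads(example_config)

# The generic ride model is parametrically adjusted with the local values,
# e.g. the local pendulum length is used by the physics simulation.
pendulum_length = config["pendulum_length_m"]
```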

Content that is dynamically generated in this fashion may be able to accommodate mechanical rides with varying lengths of run-time and varying maximum intensity levels. This is advantageous for ride operators such as fairground showmen who are required to operate a ride for longer lengths of time during quiet periods, and at lower intensities for younger audiences, or for those wishing to keep fit where fitness programmes might vary significantly from session to session.

As well as mechanical rides that repeat about a fixed position at a fixed location, the present invention could be applied to mechanical apparatus that repeat about a moving location. For example, for a rowing machine in a rider’s living room, the position and dynamic status of the rider in a mechanical oscillating system can be determined. Equally, the same system could be used by a rower in an actual rowing boat on a river. The position of the rower relative to their oscillating position inside the boat could be determined.

In Figure 1, a system 100 in accordance with embodiments of the invention is shown.

One or more sensors 101 are shown. The sensors 101 may include accelerometer(s), gyroscope(s), GPS (Global Positioning System) sensors, or local positioning sensors. In a preferred embodiment, the sensors 101 include both an accelerometer and a gyroscope.

At least one of the sensors 101 may be mounted relative to a user who is to use/ride a mechanical user motion apparatus. The sensors may be collocated within a single device 102 (as shown in Figure 1) or spread across multiple physical devices. The sensors 101 may be worn on the user, for example, on a belt, on an armband, on the wrist as a watch-style device, or on the head; held by the user; and/or on the user’s person (e.g. within a pocket). At least some of the sensors may be, for example, within a device such as a personal user device such as a smartphone or smartwatch.

A database 103 is shown. The database 103 may be configured for pre-storing information about one or more mechanical user motion apparatuses such as mechanical rides (including amusement rides) or other user motion apparatus (such as exercise apparatus). The information may be kinetic information, for example, relating to the motion of the motion apparatuses. The kinetic information may relate to a physics simulation of the motion apparatuses.

A processor 104 is shown. The processor may be configured for processing the pre-stored kinetic information along with information from the sensors while the user is on the motion apparatus to generate a mapping between movement of the user and a virtual world. This process is defined in greater detail in relation to Figure 2 but may include generating a physics simulation of the motion apparatus, updating the physics simulation using the sensor information, and transforming at least some outputs of the physics simulation, such as “real world” position and/or orientation of the user, via one or more functions to create a mapping to a position and/or orientation for the user within a virtual world.

A display 105 is shown. The display 105 may include one or more screens. The screens may be positioned within a headset 106 (as shown in Figure 1) to be worn by the user, such as a virtual reality, augmented reality, or mixed reality headset, examples of which include the Oculus Rift or Microsoft HoloLens. The display 105 may be configured for displaying the virtual world to the user in accordance with the mapping. The virtual world may be rendered for display by the processor 104 or by another processor, such as a specialised graphics processing unit.

The headset 106 may include audio output 107 such as headphones or earphones.

One or more of the sensors 101 may be within the headset 106 (not shown). In such a way, as the headset 106 is worn by the user, these sensors are within a device mounted on the user.

Referring to Figure 2, a method 200 in accordance with embodiments of the invention will be described.

The method 200 relates to a mechanical user motion apparatus as described in relation to Figure 1.

Pre-stored kinetic data may be retrieved from a database (e.g. 103). The pre-stored kinetic data may include kinetic information specific to the specific mechanical user motion apparatus which the user is on. This kinetic information may be stored in one of a plurality of configuration files, each file specific to a mechanical user motion apparatus of specific dimensions. The pre-stored kinetic data may include physics simulation information for the type of mechanical user motion apparatus which the user is on. For example, the physics simulation information may include simulation information for a swing ride, a twist ride, or a carousel ride. The physics simulation information may define a simulation for the mechanical motion apparatus as a set of equations, a filter (such as a Kalman filter), or as output from a physics simulation engine (such as Physx).

The physics simulation information may be configured using the specific kinetic information (e.g. a configuration file). A real-world simulation for the motion apparatus is generated using the pre-stored kinetic data.

The real-world simulation and the actual position and orientation of the user may be first aligned or initialised. For example, this may be via the user assuming a start position, or the simulation being configured to the user’s actual start position. In some embodiments, the real-world simulation aligns after receiving a plurality of sensor information to enable calibration for a specific user’s ride.

In step 201, sensor information is received from one or more sensors within one or more devices. The device(s) may be mounted relative to a user on the apparatus. The term “on” as in the “user on the apparatus” will be used in relation to embodiments within the present disclosure and will be understood to include meanings such as riding the apparatus, within the apparatus, and positioned relative to the apparatus such that movement of a kinetic portion (or whole) of the apparatus moves the user.

In relation to the term “mounted relative to the user”, for example, the user may wear the device (e.g. on a belt, harness, or headset), hold the device, or have the device within their pocket. Other mountings are also possible. Preferably, the device is mounted in such a way that movement of the user moves the device in at least one axis of movement, but preferably three axes of movement, with the same vector and magnitude, and/or such that orientation changes of the user change the orientation of the device in a correlated way. The orientation of the user may be the user’s body orientation or head position. Orientation may be rotation within a plane or rotation within 3-dimensions.

The sensor information may include position of the user, orientation of the user (e.g. body and/or head), velocity of the user, and acceleration of the user. The sensor information may be filtered, for example, to smooth or correct noisy data.
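As a minimal sketch of such filtering, noisy acceleration samples could be smoothed with a simple exponential moving average; the smoothing factor below is illustrative and not taken from this disclosure.

```python
# Minimal sketch: smoothing noisy acceleration samples with an exponential
# moving average. The smoothing factor alpha is illustrative only.

def smooth_acceleration(samples, alpha=0.2):
    """samples: iterable of (ax, ay, az) tuples; returns a list of smoothed tuples."""
    smoothed = []
    state = None
    for sample in samples:
        if state is None:
            state = sample                       # initialise with the first sample
        else:
            state = tuple((1 - alpha) * s + alpha * x for s, x in zip(state, sample))
        smoothed.append(state)
    return smoothed

# Example usage with a short noisy feed (units of m/s^2).
print(smooth_acceleration([(0.0, 9.9, 0.1), (0.3, 9.7, -0.2), (0.1, 9.8, 0.0)]))
```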

As time elapses, the simulation is updated to reflect predicted real world position and orientation of the user.

In step 202, the simulation may be refined using the sensor information.

In step 203, information is then extracted from the simulation. This information may include user information such as the user’s position and orientation, and may include apparatus information such as the current cycle of the motion apparatus.

In step 204, the user information is used to generate a mapping between movement of the user and a virtual world. Preferably, this converts a user’s real world position and/or orientation to a virtual position and/or orientation for the user in the virtual world. In one embodiment, the user information may be transformed to generate a mapping between movement of the user and a virtual world. In an alternative embodiment, the user information is not transformed and the mapping between the movement of the user and the virtual world is one-to-one.

Where the user information is transformed, one or more transformation functions may be used either continuously or at specific points/times/events while the user is on the motion apparatus. The transformation functions may relate to translational or rotational transformations. Preferably, the user’s real world position and/or orientation is transformed to result in a modified virtual position and/or orientation of the user within the virtual world.

The transformation functions may include amplification, suppression, reversal, offsetting, phase shifting, and/or swapping. It will be appreciated that other transformations may be used. The transformations and ancillary information (such as when to apply the transformations) may be stored within a configuration file specific to the motion apparatus. Selection and/or application of the transformation functions may also use the extracted apparatus information.
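The following sketch illustrates how such transformation functions might be expressed in code; the function names, signatures and gains are assumptions made for illustration only.

```python
# Illustrative translational transformation functions that could map a user's
# real-world position to a virtual position. Names and gains are assumptions.

def amplify(value, factor):
    return value * factor            # exaggerate real movement (factor > 1)

def suppress(value, factor):
    return value * factor            # diminish real movement (0 <= factor < 1)

def reverse(value):
    return -value                    # invert an axis of movement

def offset(value, amount):
    return value + amount            # shift to a different virtual location

def swap(a, b):
    return b, a                      # exchange two axes of movement

# Example: amplify forward motion, suppress vertical motion, keep lateral motion.
def map_position(real_pos, forward_gain=2.0, vertical_gain=0.1):
    x, y, z = real_pos
    return (x, suppress(y, vertical_gain), amplify(z, forward_gain))
```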

In step 205, this mapping (e.g. the user’s virtual position and orientation) is used to render a view from the user’s virtual perspective within the virtual world.

In step 206, this view is displayed to the user via a display. The display is preferably embedded within a headset, such as a virtual reality headset, worn by the user. Audio based on the virtual position and orientation of the user within the virtual world may also be generated and delivered to the user via headphones, earphones, or other speaker configurations.

Estimating the Dynamics of an Amusement Rider Based on Local Sensors, and transforming those Dynamics to create a Virtual Ride

One aspect of embodiments of the present invention is to determine the dynamics of the motion of a rider on an amusement ride, based on sensors (which may be noisy) local to the rider (e.g. internal sensors in a virtual reality headset).

The present disclosure describes several methods of achieving this. These methods may be generally based on the approach of creating a simulation of a ride’s mechanics and combining them with processed measurements from the local sensors in ways that keep the simulation close to reality. An overall approach will be described along with several examples of this approach that could be implemented by a skilled person. A flow of data is shown in Figure 3.

First a physical simulation of a ride’s mechanics is constructed. This simulation may capture the primary constraints and forces which are expected to act on the rider and their seat (e.g. joints or chains, gravity, perhaps friction), and it may be used to simulate the effects of those constraints and forces on the system over time. This physical simulation may be initialised in a form consistent with the rest state of the attraction.

The Physical Simulation

Depending on the context this physical simulation might be created as equations, as a physically based filter such as a Kalman filter, or as a constructed world in dedicated physics-simulation software such as Physx. For example, a swing can be authored as a set of masses and joints within Physx, and Physx will calculate the movement of such a simulation as the coder requests.

Periodically the processor of the system will receive new data from the local sensors. The nature of this data may vary depending on the hardware in use. For example, with known VR headsets the input data consists of a noisy feed of world space accelerations and orientations for the user’s head, but future hardware may provide more - or less - data. The primary source of irreducible noise is that the sensors from a built-in device combine movement of the ride (the "signal") with arbitrary movement of the user's head (the "noise"); this noise is a direct consequence of containing sensors and processing solely within a head-mounted device, and better sensors contained within that device would not be able to separate this signal and noise. Future headsets may provide world space position information about the headset; this could be used to increase accuracy but the present invention does not require this feature. In a preferred embodiment, only world space accelerations and orientations (the minimum sensor configuration) are necessary; it might be possible to create limited experiences with less in alternate embodiments.

Optionally, filtering may be performed on this data feed. This filtering may include smoothing the acceleration data to reduce noise, and/or complex frequency analysis to extract higher-level measurements from the data’s history such as its period or magnitude of oscillation.

This step is optional because different rides have very different types of motion. Some will be amenable to having the acceleration data applied to the physics simulation directly, in which case this filtering is not required; others will not produce good movement, in which case filtering will improve it. The swing (described later in this document), for example, may work without the filtering process but the version with filtering may produce more precise information; more complex rides may not work without some form of filtering.

Whatever filtering is performed, a set of measurements of real world values may be produced. In the simple case, no filtering is performed, and these measurements are the accelerations and orientations received from the local sensors.

The sensor’s world space orientation is aligned with that of the simulated ride.

There are four methods for aligning the rider’s sensor’s world space orientation with that of the simulated ride. The first is through instruction and input. Riders will be instructed to take their ride position, wear their headset, and face forwards, at which point this orientation would be automatically marked as the initial setting for forwards. This method is simple to deploy, but riders may occasionally not be facing forwards at the moment “facing forwards” is recorded. The second option is similar to the first: an input is required, either via the headset or via a networked device operated by the ride operator or rider, which triggers alignment in the software. This method ensures that the rider or operator manually triggers alignment once they are assured that the rider is facing forwards. A development of this method could use one input to trigger alignment signals to all rider headsets at the same time.

The third method is via a headset mounted receiver, coupled with a ride mounted emitter, for example via photoelectric sensor installation, or via the addition of a video sensor (camera) on the headset which references visual markers on the ride. This method has the advantage of absolute verification, but has the disadvantage that it requires additional sensors.

The fourth method is via the simulated model. The model will include the rider's position and a relative direction that can be considered facing forward. Once the ride is operational and in motion, and the model is being calculated and updated, the rider forward facing direction can be determined and world space can be aligned.

The above methods can be used together; for example the first method could provide good initial readings to start driving physical simulations bespoke to each rider. Content would be created that is sensitive to any misalignment that might occur, for example by creating content that is primarily audio with omni-directional visuals. Once confidence is established in the rider facing-forwards direction, content could fully exploit these higher confidence levels.

In parallel to these steps, whenever new measurements are received the physical simulation may be updated according to the time that has passed since the last update, based on the known constraints and simulated forces. This results in a new simulated state that is appropriate for the current time but does not take into account the new measurements. Now that there is an up-to-date simulation state and a set of real world measurements as shown in an operator interface in Figure 4, those measurements may be used to refine the simulated state. There are several possible approaches to this, depending on the measurements that are available and how much influence is determined for them to have on the state. In the simple case, the world space acceleration of the rider in the simulation is replaced with the acceleration data received from the headset, replacing the simulation’s calculated forces but retaining its constraints; this has the effect of moving the simulated rider in a manner consistent with the measured accelerations and the constraints that must be applied to them. There may also be cases where the measurements are sufficient to completely describe the state of the system; in such cases the physical simulation may be unnecessary and the refined state may be produced solely from the measurements.

This refined state is returned to the simulation to be used as the basis for the next update. Finally, the dynamics of the simulated state are provided to the rest of the application so that it may use them to build its experience. These dynamics may be any values that are derivable from the simulated state; this will typically include at least the rider’s position, velocity and acceleration, but may include additional calculated values such as the period and phase of harmonic motion for rides that go through harmonic motion.
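A simplified sketch of this data flow (cf. Figure 3) is given below; the class and function names (Simulation, advance, refine, dynamics) are assumptions used only to show the ordering of the steps.

```python
import time

# Simplified sketch of the update loop: optionally filter the measurements,
# advance the simulation by the elapsed time, refine it with the measurements,
# then expose the resulting dynamics. Names are illustrative.

def filter_measurements(raw):
    return raw   # placeholder for optional smoothing / frequency analysis

def run_loop(simulation, sensor_queue, publish_dynamics):
    last_update = time.monotonic()
    while True:
        measurements = filter_measurements(sensor_queue.get())
        now = time.monotonic()
        simulation.advance(now - last_update)     # apply constraints and simulated forces
        last_update = now
        simulation.refine(measurements)           # e.g. replace simulated acceleration
                                                  # with the measured acceleration
        publish_dynamics(simulation.dynamics())   # position, velocity, acceleration,
                                                  # period/phase for harmonic rides
```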

Translational and rotational data taken directly from the dynamics of the simulated state are transformed.

With a simple worked example of a swing as shown in Figure 5, different transformations can be performed in different phases of the swing cycle. Transformations include amplification, suppression, reversal, offsetting, phase shifting, and swapping. Different combinations create different perceived movements, some of which can create progression through a virtual world, even though the real ride is oscillating about a fixed position; the first of the following examples is illustrated in the sketch after this list. For example:

• Rolling forward: from 0 to 180 degrees, translation in the z-forward axis is amplified and translation in the y-up axis is diminished; from 180 to 360 degrees, translation in the z-forward axis is reversed.

• Undulating upward: from 0 to 180 degrees, translation in the y-up axis is mapped to the translation in the z-forward axis, rotation about the x-right axis is amplified, and translation in the z-forward axis is diminished to zero; from 180 to 360 degrees, translation in the y-up axis is mapped to translation in the z-forward axis and reversed, rotation about the x-right axis is amplified, and translation in the z-forward axis is diminished to zero.
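As a sketch, the “rolling forward” example above might be expressed as a function of the swing phase as follows; the gains and names are assumptions made for illustration.

```python
# Illustrative sketch of the "rolling forward" transformation: during the forward
# half of the cycle (0-180 degrees) forward translation is amplified and vertical
# translation diminished; during the backward half (180-360 degrees) forward
# translation is reversed. Gains are assumptions, not values from this disclosure.

def rolling_forward(phase_deg, real_z, real_y, forward_gain=2.0, vertical_gain=0.1):
    if 0 <= phase_deg < 180:
        virtual_z = real_z * forward_gain       # amplify forward motion
    else:
        virtual_z = -real_z * forward_gain      # reverse forward motion
    virtual_y = real_y * vertical_gain          # diminish vertical motion
    return virtual_z, virtual_y
```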

With a more complex ride example of the Twist (or Scrambler), as shown in Figures 6 and 7, more complex patterns of movement can be produced - such as “spinning off” - by following the same process. Further detail on both a swing and twist example will be described later in this disclosure.

The simulation and transformation steps above may be performed by a simulation and transformation software application (or applications). The virtual experience may be displayed to a user via a VR apparatus.

The VR apparatus may include one or more of the following components: (i) a VR headset, which provides the audio visual experience, (ii) a memory that stores information, (iii) a processor that calculates and plays the audio visual experience to the headset, and (iv) one or more position calculation systems that use motion & position sensors, with motion & position processors to calculate where the headset is in the world locally and/or globally. These elements can exist as physically separate systems, with elements connected wirelessly or wired. Memory and/or processors may be distributed over local or global networks (increasingly viable as network bandwidths increase and latency decreases). VR headsets may be networked locally or globally to enable user-to-user interactions. The simulation and transformation software application can be stored and distributed either globally via the cloud, or locally via a local server, to local VR headsets. The application may need to be configured for a local instance of the ride, which will require a local configuration file. This configuration file can either be stored and downloaded from the same repository, or created and stored locally on the memory of the local headset. Configuration files created locally can be uploaded and stored on local servers or via the cloud to enable sharing with other headsets. Configuration files will be described later in this disclosure.

The processor may, in addition, be connected to additional input and output devices to enable richer forms of user interaction. Output devices could include effects such as vibration units or heating pads, which could exist as additional locally or globally networked peripheral devices, and may exist as individual physical products or built directly into headset-units. Input devices could include triggers, joysticks, and microphones.

For the creation of virtual reality experiences for mechanical rides, the current position and dynamic status of a rider are desirably determined to record the history of their position and dynamic status and predict their future position and dynamic status.

The present invention provides a method for determining position and dynamic status of a rider on a circular and/or harmonic and/or repeating ride using (a) sensors on or within a headset, or (b) sensors placed about the rider’s body.

It will be appreciated that key elements of VR technology are beginning to coalesce to form mobile VR headsets. For example a VR headset may contain all key elements within the same VR headset, which combines all sensors required for global and local positioning, with one processor providing game and position calculations in one place. Examples of such mobile VR headsets include the Oculus Quest. As communications technology advances, bandwidths increase and latency decreases, memory and processing will increasingly become cloud based. However, sensing may still have to be performed locally on the ride and/or rider, and the audio visual experience may still need to be delivered to the user’s ears and eyes.

Designing ride experiences to work with these mobile VR headsets alone, with associated peripherals such as handsets required to function, is advantageous as experiences can be distributed globally as software, using mechanisms such as app stores. Software can be downloaded to any similar headset, to be experienced by riders riding copies of the same ride mechanism anywhere in the world, with little or no modification to the ride itself.

A detailed description of embodiments of the present invention applied to a swing will be described with reference to Figure 7.

In this embodiment, the behaviour of a swing is modelled in two dimensions (considered equivalent to a pendulum) under the effects of gravity and a changing acceleration being applied to the pendulum bob with reference to equations.

The model has two variables: the signed angle θ which the pendulum rod makes with the vertical, and the angular velocity ω with which this angle is changing. Also used are two constants: r, the pendulum’s length (which may be a default value such as 2 metres rather than the exact length of the swing) and g, the acceleration due to gravity. The initial conditions may be arbitrarily set to θ = ω = 0, in other words, a pendulum that is vertical and stationary. This would correlate to an experience where the rider/user starts on the swing stationary.

The simulation may be updated iteratively by calculating new values of the variables after each of a set of fixed time steps. 1/60 s may be chosen as the length of the time step (henceforth Δt), as this is a value that is used very commonly in real time medium fidelity physics simulations like video games, and can thus be considered well-tested.

The sensors may provide new acceleration values at frequent but irregular intervals; when the update is performed the most recently received value of this acceleration may be used, henceforth referred to by the vector a = (a_x, a_y), where a_x represents acceleration in the direction of swinging parallel to the ground and a_y represents upward acceleration perpendicular to the ground.

On each iteration new values may be calculated for the bob’s angle and angular velocity based on the following formulae:

ω ← ω + Δt × (a_x cos θ + (a_y − g) sin θ) / r

θ ← θ + Δt × ω

These new values are stored in the variables and become the new state of the simulation, which can be queried by other systems.
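A minimal numeric sketch of this iterative update is given below. The update formula follows standard driven-pendulum dynamics consistent with the definitions above; the default pendulum length of 2 metres and time step of 1/60 s are the example values from the text, while the test acceleration values are illustrative.

```python
import math

# Minimal sketch of the iterative swing (pendulum) update described above.

R = 2.0          # pendulum length in metres (default example value)
G = 9.81         # acceleration due to gravity
DT = 1.0 / 60.0  # fixed time step

def step(theta, omega, ax, ay):
    """Advance the pendulum by one time step using the latest sensor acceleration.

    theta: signed angle of the rod with the vertical
    omega: angular velocity
    ax:    acceleration in the swinging direction, parallel to the ground
    ay:    upward acceleration, perpendicular to the ground
    """
    alpha = (ax * math.cos(theta) + (ay - G) * math.sin(theta)) / R
    omega = omega + DT * alpha
    theta = theta + DT * omega
    return theta, omega

# Example: one second of simulation with a small constant forward push.
theta, omega = 0.0, 0.0
for _ in range(60):
    theta, omega = step(theta, omega, ax=0.5, ay=0.0)
```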

In this embodiment, the behaviour of a swing is modelled in two dimensions (considered equivalent to a pendulum) under the effects of gravity and a changing acceleration being applied to the pendulum bob using a physics middleware application. For example purposes use of PhysX in Unity is described here, but the process would be largely identical for any physics middleware in any application engine (e.g. PhysX in the Unreal Engine or Havok in the Source engine).

The swing is constructed within the physics engine using its built-in tools. The swing consists of two objects:

• The first object is a pivot at the top, which is massless and constrained to allow no linear motion and no rotation except in the horizontal axis the swing is to swing around.

• The second object is the bob, a point mass which is rigidly attached at a fixed position relative to the pivot, so it will move as the pivot rotates. The mass of the bob is unimportant (as long as it is non-zero), as the system will be dealing with accelerations not forces.

Each time the physics engine performs an internal update it can be configured to send a notification beforehand; at that time the most recent acceleration data received from our sensors may be applied as an acceleration on the bob. The physics engine then performs its update, simulating the movement of the bob under the acceleration applied and under its default gravity (which will match acceleration due to gravity on the surface of Earth).

When other systems want to query the state of the simulation they can do so by querying the physics middleware directly, using its standard APIs.

A detailed description of embodiments of the present invention applied to a carousel will now be described.

In this embodiment, the rotational behaviour of a carousel may be modelled in two dimensions based on noisy measurements of its rotation around vertical and its angular velocity around vertical, using a Kalman Filter.

The state vector x = [θ, ω] has two components; θ is the rotation of the carousel around the vertical axis (relative to an arbitrarily chosen zero rotation), and ω is the angular velocity of our carousel. This state vector may be initialised to [0, 0], and the initial uncertainties may be determined to be a high variance on θ (σ_θ²) and a low variance on ω (σ_ω²) to reflect the fact that the initial rotation angle is completely unknown but that the carousel starts its sequence stationary or moving very slowly; the exact values used here are arbitrary and will generally be compensated for early in the simulation’s run. For example’s sake the variables are set as follows: σ_θ² = 1.5 and σ_ω² = 0.05.

There is no covariance between the initial estimates, so the initial covariance matrix P_0|0 = [[σ_θ², 0], [0, σ_ω²]].

To predict the filter’s state changes over time a state derivative transition model may be used: T = [[0, 1], [0, 0]]; this matrix has the simple physical interpretation that the expected derivative of the rotation angle is equal to the angular velocity, and the expected derivative of the angular velocity is equal to zero. The uncertainty in the angular acceleration (and hence the process noise covariance) is determined to be small but non-zero, on the basis that carousels do not accelerate or decelerate very fast. For example’s sake the angular acceleration variance may be set to σ_a² = 0.005; from this the process noise derivatives covariance matrix is defined to be Q = σ_a² × [[Δt²/4, Δt/2], [Δt/2, 1]], where Δt is the duration of the discrete time step in seconds. With these defined the prediction steps become the standard Kalman prediction equations, with state transition matrix F = I + T Δt:

x_k|k−1 = F x_k−1|k−1

P_k|k−1 = F P_k−1|k−1 Fᵀ + Q

The measurements vector z_k = [z_θ, z_ω] also has two components, corresponding to the measurements of the rotation angle z_θ and the angular velocity z_ω. Since the measurement of the angle is arbitrary mod 2π, this embodiment can arbitrarily choose to represent the measurement with whichever representation of the angle is closest to the current x_θ; this allows treatment of the filter as being linear internally. Because the measurements correspond directly to the state the observation model H is just the identity matrix and can be ignored. The observation noise covariance matrix R_k is dependent on what noise is present in the measurements, so its derivation is outside the scope of this description; similarly the gain matrix G_k describes the confidence in the measurements and its values will be dependent on how those measurements are obtained, so its values are likewise omitted here. With these defined, the Kalman gain and the resulting update step follow the standard Kalman filter update equations; a sketch of this filter is given below.

To run the simulation the prediction step may be executed once every Δt seconds, while the update step can be executed whenever new measurements are made. Once the simulation is running other systems may query the rotation angle or angular velocity at any time by accessing the corresponding components of the state vector x; typically the angle is returned in its canonical form θ mod 2π and the angular velocity ω as-is.
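The following numpy sketch puts the carousel filter together using the example values above. The prediction and update steps are the standard linear Kalman equations with H = I; the observation noise R and the simple optimal gain used here stand in for the sensor-dependent R_k and gain matrix G_k mentioned above, whose values are not given in this disclosure.

```python
import numpy as np

# Sketch of the carousel rotation filter. R below is illustrative; a real system
# would derive it from the sensors in use.

DT = 1.0 / 60.0

x = np.array([0.0, 0.0])                  # state: [theta, omega]
P = np.diag([1.5, 0.05])                  # initial covariance (sigma_theta^2, sigma_omega^2)
F = np.array([[1.0, DT],                  # discrete state transition, F = I + T*dt
              [0.0, 1.0]])
sigma_a2 = 0.005                          # angular acceleration variance
Q = sigma_a2 * np.array([[0.25 * DT**2, 0.5 * DT],
                         [0.5 * DT,      1.0     ]])
R = np.diag([0.1, 0.01])                  # illustrative observation noise

def predict(x, P):
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    z = np.array(z, dtype=float)
    # Wrap the measured angle to the representation closest to the current estimate.
    z[0] = x[0] + (z[0] - x[0] + np.pi) % (2 * np.pi) - np.pi
    K = P @ np.linalg.inv(P + R)          # Kalman gain with H = I
    x = x + K @ (z - x)
    P = (np.eye(2) - K) @ P
    return x, P

# Run: predict every DT seconds; update whenever a measurement arrives.
x, P = predict(x, P)
x, P = update(x, P, [0.3, 0.1])
theta = x[0] % (2 * np.pi)                # canonical rotation angle
```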

Filtering Acceleration Data to Determine Periodicity Using Autocorrelation

In this embodiment, a method of filtering noisy acceleration data from a carousel to calculate an estimate of the carousel’s rotation period is described. This example describes just one method of filtering this data in this way, and others (such as linear regression) may exist. The acceleration data is provided by external sensors at irregular intervals as a three-dimensional cartesian vector (x, y, z). The second component of this vector y measures the vertical acceleration; the other two components measure horizontal accelerations with an arbitrary basis.

As a first step each acceleration vector is converted into an angle about the vertical axis, θ = arctan(x/z). Sign and direction of this conversion are chosen arbitrarily, as they won’t affect the periodicity of the rotation which is the goal. These angles are then interpolated with a sample spacing of Δt to produce a series of angles {θ_i}, where θ_i is the interpolated value of θ at time i × Δt. Care must be taken when interpolating to ensure that interpolation is always performed following the shortest angle distance around the circle. The value of Δt is chosen to balance processing time (higher when Δt is smaller) and accuracy of analysis (also higher when Δt is smaller); for example, Δt = 0.1 s may be used.

Now that there is a series of angles that are uniformly spaced in time, a normalised square difference autocorrelation function can be used to estimate the periodicity of a subsequence {θ_i ... θ_i+n−1} of it; assuming smooth rotation at a constant speed this period will be the period of the carousel’s rotation. To do this the subsequence is first padded to twice its length with zeroes, producing a sequence {x_t}, and the normalised square difference autocorrelation {n_τ} of this padded sequence is then calculated (a sketch of one such calculation is given below).

This sequence {n_τ} will have peaks at offsets τ where the sequence {x_t} is highly correlated with itself; one of these peaks will correspond to the period of the rotation. To determine which one to use, all the peaks of {n_τ} are considered, discarding any whose time τ is an integer multiple of another peak time with similar magnitude (as these correlations indicate multiple cycles), then discarding any peaks that would indicate periods that are implausible based on a physical understanding of the carousel’s limits, and finally choosing the time τ of the remaining highest peak. From this the estimate of the period is calculated as τ × Δt.
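A numpy sketch of this period estimation is given below. The normalised square difference function used here is the common 2·Σ x_t·x_(t+τ) / Σ (x_t² + x_(t+τ)²) form and the peak selection is simplified to a plausibility window; both are assumptions made for illustration rather than the exact formulation of this disclosure.

```python
import numpy as np

DT = 0.1  # sample spacing in seconds

def estimate_period(angles, min_period=2.0, max_period=30.0):
    """Estimate the rotation period from uniformly spaced angle samples."""
    x = np.asarray(angles, dtype=float)
    n = len(x)
    x = np.concatenate([x, np.zeros(n)])         # pad to twice the length with zeroes
    nsdf = np.zeros(n)
    for tau in range(1, n):
        a, b = x[:n], x[tau:tau + n]
        denom = np.sum(a * a + b * b)
        nsdf[tau] = 2.0 * np.sum(a * b) / denom if denom > 0 else 0.0
    # Local maxima of the NSDF mark offsets where the signal correlates with itself.
    peaks = [t for t in range(1, n - 1) if nsdf[t] > nsdf[t - 1] and nsdf[t] >= nsdf[t + 1]]
    # Keep only physically plausible periods for the ride.
    plausible = [t for t in peaks if min_period <= t * DT <= max_period]
    if not plausible:
        return None
    best = max(plausible, key=lambda t: nsdf[t])
    return best * DT                             # estimated period in seconds

# Example: a synthetic 8-second rotation sampled every 0.1 s.
t = np.arange(0, 40, DT)
angles = np.angle(np.exp(1j * 2 * np.pi * t / 8.0))   # wrapped rotation angle
print(estimate_period(angles))
```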

Other filtering methods such as a Butterworth low pass filter or exponential smoothing may be used to reduce noise in the input data.

Defining Confidence Levels for Simulations and Filtered Values

The physical simulations may not necessarily be an accurate representation of the state of the real world at all times. In order to produce high-quality experiences it will often be helpful for other systems to have some idea of how accurate the representations are at the current time, so elements of the virtual world experience that are closely tied to motion can be reserved for times when there is good confidence that the models of the motion are accurate.

In order to accommodate this, an estimated confidence level may be generated in the simulations in the form of a value in the range [0,1] where 1 indicates complete certainty and 0 indicates that the model is in no way indicative of the real world state. Methods for calculating this confidence level will vary considerably depending on the application, but certain common elements will be used (two of these are illustrated in the sketch after the following list):

• If a Kalman Filter is in use the variance of its current state provides a natural source of confidence values. The exact process for converting a variance into a confidence value will depend on the physical interpretation of the state, but in general a variance that is low in size compared to perceptible changes in the state would indicate a high confidence, while a variance that is large compared to perceptible changes in the state would indicate a low confidence.

• If a correlation filter such as autocorrelation is being used to determine a value then the magnitude of the normalised square difference correlation function can be used as a confidence value, as higher correlation indicates a greater confidence that the signals being compared are truly correlated.

• If values of a certain confidence level are being used to drive a simulation that does not itself consider their confidence levels, such as a Kalman Filter with optimal gain, the confidence level of the simulation’s output can be estimated by multiplying the simulation’s confidence level with the confidence level of the input values. This has the intuitive effect of producing a low final confidence if either the inputs or the process are low confidence.

• If confidence levels provided by these measures are noisy an embodiment can artificially introduce some hysteresis into them, for example by exponentially smoothing them or by limiting their rate of change. This matches intuition that an accurate simulation is unlikely to suddenly become substantially inaccurate, and that an inaccurate system is unlikely to suddenly become correct; a system that has been reporting consistently high confidence levels for a period of time provides more overall confidence than one that has only momentarily reported good confidence. This also ensures that any elements of the virtual world experience that change based on the confidence only do so when the changes are meaningful; in other words when confidence in the confidence is high.
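A small sketch of the last two elements, combining confidence levels multiplicatively and adding hysteresis by exponential smoothing, is given below; the smoothing factor is illustrative only.

```python
# Sketch: combine the confidence of a simulation with the confidence of its
# inputs, then exponentially smooth the result to add hysteresis. The smoothing
# factor alpha is illustrative only.

def combined_confidence(simulation_confidence, input_confidence):
    # Low confidence in either the inputs or the process gives a low final confidence.
    return simulation_confidence * input_confidence

def smooth_confidence(previous, current, alpha=0.05):
    # The reported confidence changes only gradually, so momentary spikes or dips
    # do not immediately alter the virtual world experience.
    return (1.0 - alpha) * previous + alpha * current

reported = 0.0
for raw_input_confidence in [0.2, 0.9, 0.95, 0.9, 0.1, 0.92]:
    reported = smooth_confidence(reported, combined_confidence(0.8, raw_input_confidence))
```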

Multiple different types of transformations are possible for transforming movement of a user within a physics simulation to a virtual world “experience”. Some of these transformations will now be described with reference to Figures 9 and 12 and with reference to a method for defining the transformations.

A “transformation” of a rider's motion is any rule or set of rules for moving the rider’s virtual perspective based on the rider’s dynamics. For any such transformed motion physical principles can be used to determine what reaction forces and angular motion the rider would expect to experience when going through such motion, and testing can judge which transformed motions have greater or lesser perceived discrepancy compared to the rider’s real motion, both quantitatively (by comparing the calculated values) and qualitatively (by testing with real riders). In some embodiments, it may be preferred to use transformed motions which have the least discrepancy with the rider’s real motion, and seek to minimise this discrepancy when transformations are designed.

Motion transformations may take any form; in general they are defined by a function which takes a rider’s real world position and orientation over time (p(t), o(t)) and transforms it into a virtual position and orientation over time (p′(t), o′(t)). Embodiments of the present invention provide a method and system (called “tools”) to assist a “creator” to define such general functions between pairs of functions. This will be termed authoring a virtual world experience. It will be appreciated that the definition of the functions does not require the use of a tool; the functions could instead be defined by a developer.

In the described embodiment, a specific tool out of the tools may be designed for describing a small subset of possible motion transformations. For example, the tool may relate only to the motion’s translational movement through the world, and the tool takes responsibility for ensuring that the rider’s rotational movement is commensurate with that. Some tools, however, give control of both the translational and rotational transformations to the creator; others may only give the creator control of rotational transformations. When describing translational motions the tool may be restricted to only describe transformations of movement in terms of one form of the movement; typically that means describing a transformation of one of the rider’s position, velocity or acceleration. In some tools additional views may be provided to show the effect of the creator’s transformation on other forms of movement; for example, if a transformation is described in terms of velocity the tools would allow the creator to also view that transformation’s expected effects on the rider’s position, acceleration, jerk and jounce. Authoring transformations in each form of movement has advantages and disadvantages; forms of movement closer to position allow more control but require much more work to make higher derivatives smooth, which is important for rider comfort. Conversely authoring in higher derivatives makes smoothness easier to achieve but is subject to undesirable drift in the lower derivatives, making the experience more difficult for the creator to control.

Motion Authoring For the Swing

When determining motion transformation processes for a ride the form of the physical motion is considered. In the case of the swing this motion is simple harmonic; when swinging freely the rider follows a repeating pattern of motion over a fixed period, which suggests a natural way of normalising the motion over this period. This period may be called the Harmonic Period, and a value called the Harmonic Phase may be defined which ranges between 0 and 1 to describe where the rider is within this period. When the Rider is at the rearmost apex of the swing the harmonic phase may be defined to be 0; it may increase linearly over time, passing through 0.25 as the rider passes through the nadir moving forward, 0.5 as the rider reaches the frontmost apex, 0.75 as the rider passes again through the nadir moving backward, and wrapping from 1 back to 0 as they return to the rearmost apex.

With this phase value defined, it provides a natural way to think about how a creator can transform a motion differently at different points during the rider’s swing movement. By specifying ranges of this phase the creator may apply a transformation only while the rider is swinging forward (0-0.5), only while the rider is ahead of the midpoint (0.25-0.75), only briefly around the frontmost apex (0.45-0.55), or any other region of interest. They may also specify transitions to smoothly introduce and remove this transformation as the rider enters and leaves the region of interest.
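As a sketch, the Harmonic Phase could be derived from the simulated swing angle and angular velocity as follows; normalising the angular velocity by the pendulum's natural frequency, and taking the angle as positive in the forward swinging direction, are assumptions made for this illustration.

```python
import math

G = 9.81
R = 2.0                                    # pendulum length in metres

def harmonic_phase(theta, omega):
    """Phase in [0, 1): 0 at the rearmost apex, 0.25 at the nadir moving forward,
    0.5 at the frontmost apex, 0.75 at the nadir moving backward.
    theta is assumed positive in the forward swinging direction."""
    w_n = math.sqrt(G / R)                 # natural frequency of the pendulum
    return (math.atan2(omega / w_n, -theta) / (2 * math.pi)) % 1.0

def in_phase_range(phase, start, end):
    """True if phase lies in [start, end), handling ranges that wrap past 1."""
    if start <= end:
        return start <= phase < end
    return phase >= start or phase < end

# Example: apply a transformation only briefly around the frontmost apex.
phase = harmonic_phase(theta=0.4, omega=0.05)
apply_transform = in_phase_range(phase, 0.45, 0.55)
```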

Transformations of motion of the swing may be divided into two types: Oscillating motions, where the virtual position and orientation of the rider return to the same place once each swing cycle, and progression motions, where the rider’s virtual position and/or orientation change permanently with each swing cycle. Oscillating motions are preferred when the experience designer (e.g. a creator) wants to know where the rider will be located when designing the experience, whereas progression motions can be used to create experiences where the rider moves through a virtual world (even though their motion in the real world is only oscillating). These two types of transformations may be combined in the same experience, either sequentially or simultaneously, in order to achieve the experience designer’s goals.

When creating oscillating motions the transformations may be authored in terms of positions, as transforming the velocity or acceleration may be subject to cumulative errors which make it difficult to ensure the motion is truly oscillating without applying extra correction steps on top of the authored transformations. Typically position is transformed by means of a scaling factor on the rider’s position relative to some fixed point (for example an apex or nadir of the swing), separated into two axis; later examples will demonstrate this. Any other transforming function can be used, however.

When creating progression motions any of the rider’s position, velocity or acceleration may be used as the basis for a transformation depending on the effect a creator wishes to achieve. The same technique used to author position transformations for oscillating motions may be used for progression motions; alternatively arbitrary changes in velocity or acceleration may be applied at points during the swing as desired by a creator. This latter technique can be useful for, for example, giving the rider a sensation of an impulse being applied to them, or of a dragging force. Later examples will demonstrate this approach.

Transformations may also be applied to the rider’s orientation. These may be used for creative effect in the experience, for example to create the impression that the rider’s swing angle is greater or less than it really is, or to reduce the discrepancy between the rider’s physiological and visual sensations by matching the direction of acceleration the rider would expect based on their visual experience with the direction of the rider's physiologically perceived acceleration. This reduction of discrepancy is defined as “banking”, by analogy with the corresponding concept on a roller coaster. This banking may be authored by hand by a creator, or applied automatically by software to a degree that is under a creator’s control.

Typically multiple transformations may be used sequentially over the course of an experience; a creator may author these transformations separately, then blend or choose between them according to the time passed during the experience, the number of swing cycles a rider has been through, the rider’s position in the virtual world, the magnitude of the rider's swings, or any other factor they feel is appropriate.

Motion Authoring for the Carousel

Movement of a rider on a carousel has two components: rotational movement around the carousel’s central axle, and vertical oscillation of the horse they are riding.

The centripetal acceleration on a typical carousel is usually very slight, and a rider in a headset will not notice it. Thus, for transformation purposes, this movement can be treated as linear forward motion of the rider at a fixed velocity (with acceleration and deceleration at the start and end of the experience); this velocity can be transformed as a creator sees fit in order to produce a progression experience.

The vertical oscillation of the horse comprises simple harmonic motion, and thus all of the techniques described above for the swing may be applied to it simply by mapping the phase of the swing (rear apex to front apex to rear apex) to the phase of the horse (low apex to high apex to low apex). The same oscillating and progression transformations used on the swing may be applied to the horse’s motion with similar goals and effects. The same approach may be applied to any ride motion which can be mapped to the phases of simple harmonic motion.
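
As an illustrative sketch (the period and amplitude are placeholder values, and the names are assumptions), the horse's oscillation can be reduced to the same normalised phase used for the swing so that the existing tools apply unchanged:

import math

def horse_phase(t, period):
    """Phase of the horse's oscillation: 0 = low apex, 0.5 = high apex, 1 = low apex."""
    return (t % period) / period

def horse_height(t, period, amplitude):
    """Vertical displacement of the horse modelled as simple harmonic motion."""
    return -amplitude * math.cos(2.0 * math.pi * horse_phase(t, period))

# Any swing transformation keyed on phase can now be driven by horse_phase(t, period).
current_phase = horse_phase(t=1.25, period=2.0)   # 0.625: descending from the high apex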

The “Inchworm” Motion Transformation

The “Inchworm” transformation of a swing’s motion describes a progression motion akin to the movement of an inchworm, where the forward arc of the swing gives a sense of reaching forward through the world, and the backward arc gives a sense of the rider dragging the ground underneath them to move an invisible body forwards, as shown in Figure 9. Figure 10 shows how such a motion transformation might be authored using the tools:

The transformation comprises two component transformations executed simultaneously. The first describes an oscillating (or “in-place”) transformation which multiplies the vertical position of the rider by a factor of 0 relative to the nadir, applied across the whole duration of the swing. This has the effect of linearizing the swing’s motion, creating the sense of moving forward and backward along a straight line in the virtual world without the ascent and descent present in the real swing motion. The bank factor is set to 1 (“auto on”). This means the application will automatically rotate the virtual world to match the rider’s visually perceived accelerations with their physiologically perceived accelerations (a bank factor of 0, “auto off”, would mean no modification of the rider’s headset orientation; the virtual orientation would exactly match the real-world orientation).

The second component transformation describes an accelerating impulse used to propel the rider through the virtual world (along the “Z” or forward axis). An impulse profile may be defined (using a standard ADSR curve) that applies the impulse during the second half of the harmonic phase (corresponding to the backward-moving portion of the swing), with a shape artistically tuned by a creator to produce a pleasant motion. A total velocity change may be defined that this impulse should apply (independently of the impulse profile) as a product of any parameters the creator wishes to use; in this example it is scaled based on the highest angle of the current swing cycle in radians, multiplied by 5, multiplied by a strength parameter which can be controlled dynamically by other code over the course of the experience. Again the bank factor may be set to 1, and exponential smoothing may be applied to both the harmonic phase and the final acceleration magnitude.

These two transformations combine linearly to modify the position and acceleration of the rider’s perspective in the virtual world, producing the desired effect.
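
A compact sketch of this combination follows; the ADSR envelope is simplified here to a flat impulse over the backward half of the swing, and all names are assumptions rather than the application's actual interface:

def inchworm_step(real_z, nadir_y, phase, max_angle, strength,
                  dt, half_cycle_time, state):
    """Advance the rider's virtual (z, y) position by one simulation step.
    `state` carries the accumulated forward velocity and progression."""
    # Component 1: in-place transform with a vertical scale of 0 about the nadir.
    virtual_y = nadir_y
    # Component 2: progression impulse on the backward-moving half (phase 0.5-1.0),
    # delivering a total velocity change of max_angle * 5 * strength per cycle.
    if 0.5 <= phase < 1.0:
        total_dv = max_angle * 5.0 * strength
        state["velocity"] += total_dv * dt / half_cycle_time
    state["progress"] += state["velocity"] * dt
    # The components combine linearly: keep the real fore/aft oscillation, add
    # the accumulated progression, and flatten the height.
    return real_z + state["progress"], virtual_y

state = {"velocity": 0.0, "progress": 0.0}
virtual_z, virtual_y = inchworm_step(real_z=0.4, nadir_y=0.0, phase=0.7,
                                     max_angle=0.6, strength=1.0, dt=1.0 / 90.0,
                                     half_cycle_time=1.4, state=state)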

The “Jellyfish” Motion Transformation

The “Jellyfish” transformation of a swing’s motion describes a progression motion akin to the movement of a jellyfish, where the rider bobs up and down as they swing, with the sense of an upward “kick” as they begin the backward portion of a swing. Figure 11 shows how such a motion transformation might be authored using the tools:

The transformation comprises two component transformations executed simultaneously. The first describes an oscillating (or “in-place”) transformation which multiplies the horizontal (forward/back) position of the rider by a factor of 0 relative to the nadir, applied across the whole duration of the swing. This has the effect of linearizing the swing’s motion, creating the sense of moving upward and downward along a straight line in the virtual world without the forward and backward motion present in the real swing motion. The bank factor is set to 1, so the application will automatically rotate the virtual world to match the rider’s visually and physiologically perceived accelerations.

The second component transformation describes an accelerating impulse used to propel the rider through the virtual world (along the “Y” or up axis). An impulse profile may be defined (using a standard ADSR curve) that applies the impulse shortly after the middle of the harmonic phase (corresponding to the point just after the rider passes the forward apex), with a shape artistically tuned by a creator to produce a pleasant motion. A total velocity change may be defined that this impulse should apply (independently of the impulse profile) as a product of any parameters the creator wishes to use; in this example it is scaled based on the highest angle of the current swing cycle in radians, multiplied by 100 (a high value both to create an exaggerated effect and to overcome the virtual world’s gravity). Again the bank factor is set to 1, and exponential smoothing is applied to both the harmonic phase and the final acceleration magnitude.

These two transformations combine linearly to modify the position and acceleration of the rider’s perspective in the virtual world, producing the desired effect.
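
Rather than repeating the full sketch, the difference from the Inchworm can be illustrated as a pair of parameter presets; this representation, and the exact phase window chosen to stand for "shortly after the forward apex", are assumptions for demonstration and not the application's file format:

# Assumed preset representation for the two progression transformations above.
JELLYFISH = {
    "in_place": {"flatten_axis": "z", "scale": 0.0, "reference": "nadir"},
    "impulse": {
        "axis": "y",                                   # upward kick against gravity
        "phase_window": (0.5, 0.65),                   # just after the forward apex
        "velocity_change": lambda max_angle: max_angle * 100.0,
        "bank_factor": 1.0,
    },
}

INCHWORM = {
    "in_place": {"flatten_axis": "y", "scale": 0.0, "reference": "nadir"},
    "impulse": {
        "axis": "z",                                   # forward drag
        "phase_window": (0.5, 1.0),                    # backward-moving half
        "velocity_change": lambda max_angle, strength=1.0: max_angle * 5.0 * strength,
        "bank_factor": 1.0,
    },
}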

The “Scoop” Motion Transformation

The “Scoop” transformation of a swing’s motion describes an oscillating motion where the rider swings normally on the backward-moving portion of the swing, but dips much deeper than their real motion on the forward-moving portion, as if scooping liquid with a ladle. Figure 12 shows how such a motion transformation might be authored using the tools:

This transformation comprises one oscillating (or “in-place”) transformation of the rider’s motion which uses different parameters in different portions of the swing cycle. During the forward-moving portion of the swing cycle a multiplier of 4 is applied to the rider’s vertical position relative to the height of the apex; this creates the effect of a much deeper dip. During the backward-moving portion of the swing all the modifiers are 1, so the real-world swing motion is reproduced virtually as-is.

In this example, changes between these two transformations always happen at the apex. At that point both transformations will produce the same virtual position and velocity for the rider, but they will produce different virtual accelerations, so if the application switched between them instantaneously at the crossover point the rider would experience a momentary jerk. To avoid this, a transition is introduced at both crossover points, with a duration of 0.1 times the harmonic period, which blends between the two transformation outputs using a “smoothest step” curve. This ensures that the jerk experienced by the rider is continuous over time.
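
An illustrative sketch of this piecewise parameterisation and of the cross-fade near the apexes follows; the particular fifth-order polynomial used for the “smoothest step” curve, and the centring of the blend window on each apex, are assumptions:

def smoothest_step(t):
    """Fifth-order ease with zero first and second derivatives at both ends."""
    t = max(0.0, min(1.0, t))
    return t * t * t * (t * (6.0 * t - 15.0) + 10.0)

def scoop_vertical_scale(phase, transition=0.1):
    """Vertical position multiplier relative to the apex height: 4 on the
    forward-moving half (phase 0-0.5), 1 on the backward-moving half, blended
    over `transition` of the harmonic period around each apex."""
    forward, backward = 4.0, 1.0
    half = transition / 2.0
    if abs(phase - 0.5) < half:                       # around the front apex
        t = smoothest_step((phase - (0.5 - half)) / transition)
        return forward + t * (backward - forward)
    if phase < half or phase > 1.0 - half:            # around the rear apex
        p = phase + 1.0 if phase < half else phase    # unwrap across 1.0 -> 0.0
        t = smoothest_step((p - (1.0 - half)) / transition)
        return backward + t * (forward - backward)
    return forward if phase < 0.5 else backward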

In this example a creative decision has been made to set the bank factor to zero, so the application will not automatically adjust the rider’s virtual rotation to align with experienced forces. This decision was made because, in this case, the discomfort created by introducing this additional rotation was felt to be greater than any gain in comfort from aligned forces.

Intensity settings

The boundary between a comfortable and an uncomfortable experience is often subjective, and thresholds are typically determined empirically by the designer. However, parameters such as banking could be set by users depending on whether they want a low, medium or high intensity experience.

Configuration files

When designing experiences for certain rides, it may be desirable for a single application to be usable with multiple rides of similar construction but slightly different measurements or behaviour. Where these differences have a material effect on the physical simulation or experience design of a ride, there will need to be a method to configure information about these differences externally to the application. To this end, parameters describing these differences may become part of a configuration file. This file could be stored and accessed in local memory, on a local network, or in the cloud; an illustrative sketch of such a file is given after the parameter list below.

Parameters that may be included in such a file include:

• For any ride type:

o If an application contains multiple authored experiences, a parameter to decide which experience should be used on this attraction.

o A parameter indicating whether this headset is to be used by an adult or child (for the purposes of positioning and scaling elements of the experience).

• For any mechanical ride with fixed run time:

o A parameter defining the expected run time of a normal run of the attraction (i.e. one that is not ended early by operator intervention).

• For a carousel-type ride:

o A parameter defining whether the ride rotates clockwise or anticlockwise.

o A parameter defining the expected maximum angular velocity of the ride.

o Parameters defining the greatest and least distances of seats from the centre of rotation.

• For a Twist-type ride:

o A parameter defining whether the large arm of the ride rotates clockwise or anticlockwise.

o A parameter defining the expected maximum angular velocity of the large arm.

o A parameter defining the gear ratio between the large and small axles of the ride.

o A parameter defining the distance between the large and small axles of the ride.

o Parameters defining the greatest and least distances of seats from the small axle.
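
By way of illustration, such a configuration might be stored as a small JSON document; the field names and values below are assumptions for demonstration and do not define a schema:

import json

SAMPLE_CONFIG = """
{
  "experience": "ocean_voyage",
  "rider_profile": "adult",
  "expected_run_time_s": 180,
  "carousel": {
    "rotation": "anticlockwise",
    "max_angular_velocity_rad_s": 0.55,
    "seat_radius_min_m": 2.1,
    "seat_radius_max_m": 4.3
  }
}
"""

config = json.loads(SAMPLE_CONFIG)
direction = config["carousel"]["rotation"]   # e.g. used to mirror the virtual world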

A potential advantage of some embodiments of the present invention is that a virtual reality experience can be provided to a user with no or minimal changes to an existing mechanical user motion apparatus such as an amusement ride.

While the present invention has been illustrated by the description of the embodiments thereof, and while the embodiments have been described in considerable detail, it is not the intention of the applicant to restrict or in any way limit the scope of the appended claims to such detail. Additional advantages and modifications will readily appear to those skilled in the art. Therefore, the invention in its broader aspects is not limited to the specific details, representative apparatus and method, and illustrative examples shown and described. Accordingly, departures may be made from such details without departure from the spirit or scope of applicant’s general inventive concept.




 