Title:
SYSTEM AND METHOD FOR CREATING 3D OBJECT WITH TRACKING
Document Type and Number:
WIPO Patent Application WO/2023/223194
Kind Code:
A1
Abstract:
A system and method for creating three dimensional objects that feature tracking through an EMF (electromagnetic field) sensor for deployment in a virtual world. By "virtual world", it is meant a VR (virtual reality) or AR (augmented reality) environment. The system and method enable such objects to be designed, manufactured and then deployed in the virtual world in a reproducible, efficient manner.

Inventors:
VAN DEN BRINK STEPHAN (NL)
WITTEVEEN MAARTEN (NL)
OSTENDORF PIM (NL)
Application Number:
PCT/IB2023/055015
Publication Date:
November 23, 2023
Filing Date:
May 16, 2023
Assignee:
MANUS TECH GROUP B.V. (NL)
International Classes:
A63F13/24; A63F13/22; A63F13/428; G01R33/02; G01S1/70; G06F3/0346
Foreign References:
US20200341538A1 (2020-10-29)
US20220032167A1 (2022-02-03)
US20160246370A1 (2016-08-25)
US202263318030P
Claims:
WHAT IS CLAIMED IS:

1. A method for creating a three dimensional (3D) physical object with a virtual representation in a virtual world, comprising determining at least one interaction point on said 3D physical object; placing a tracker at a zero coordinate on said 3D physical object, wherein said zero coordinate is associated with said virtual representation in a virtual world; wherein upon interacting with said interaction point, moving said 3D physical object or a combination thereof, said virtual representation is altered according to said interacting, said moving or a combination thereof.

2. The method of claim 1, wherein said 3D physical object has an associated 3D model, and wherein said 3D model corresponds to said virtual representation in said virtual world.

3. The method of claims 1 or 2, wherein said tracker comprises an EMF (electromagnetic field) transmitter, an EMF receiver or a combination thereof; wherein a location of said 3D physical object is determined according to detection of an EMF through said tracker.

4. The method of any of the above claims, wherein said virtual world is provided through a virtual world computational device, said virtual world computational device comprising a memory for storing a plurality of instructions and a processor for executing said instructions; wherein said instructions are executed to generate said virtual world.

5. The method of claim 4, wherein said virtual world computational device further comprises a game engine for generating said virtual world.

6. The method of claims 4 or 5, wherein said virtual world is selected from the group consisting of a VR (virtual reality) or AR (augmented reality) environment.

7. The method of any of claims 4-6, further comprising receiving a game object file for said virtual representation, converting said game object file to a 3D printer file, and printing a physical 3D object according to said 3D printer file.

8. The method of claim 7, further comprising printing a plurality of physical 3D objects according to a plurality of 3D printer files and combining said plurality of physical 3D objects to form a single physical 3D object.

9. The method of claims 7 or 8, further comprising adding said tracker to said physical 3D object.

10. The method of any of the above claims, wherein said interacting with said interaction point further comprises providing at least one sensor worn by a user, and transmitting a signal indicative of a distance between said tracker and said at least one sensor.

11. The method of claim 10, wherein one of said sensor and said tracker comprises an EMF transmitter, and the other one of said sensor and said tracker comprises an EMF receiver, wherein said transmitting said signal further comprises determining said distance according to EMF signals from said EMF transmitter and received by said EMF receiver.

12. The method of claim 11, wherein said sensor comprises an EMF transmitter and said tracker comprises an EMF receiver.

13. The method of claim 11, wherein said tracker comprises an EMF transmitter and said sensor comprises an EMF receiver.

14. The method of any of claims 10-13, wherein said sensor is worn by said user as part of an article of apparel.

15. The method of claim 14, wherein said article of apparel comprises a glove, helmet, hat, head band, belt, wrist band, shoe or any suitable article of clothing.

16. The method of claim 15, wherein said article of apparel comprises a glove assembly.
17. The method of any of claims 14-16, wherein said 3D physical object comprises said tracker, and wherein said 3D physical object is manipulated by said user.

18. The method of claim 17, wherein said 3D physical object is manipulated by said user for performing one or more actions in said virtual world.

19. The method of claim 18, wherein said one or more actions in said virtual world are performed by said user for gameplay.

20. The method of any of the above claims, wherein said moving said 3D physical object further comprises moving said 3D physical object by a user; determining a distance between said 3D physical object and said user according to transmission of EMF signals; and transmitting a signal indicative of said distance.

21. The method of any of the above claims, wherein one of said tracker or said sensor comprises said EMF receiver and hence enters sleep mode when said EMF receiver does not sense EMF for a predetermined period of time.

22. The method of claim 21, wherein said tracker or said sensor awakens at a predetermined time interval, such that said EMF receiver scans for EMF; if said EMF is not detected, said tracker or said sensor reenters sleep mode.

23. The method of any of the above claims, wherein said EMF transmission source comprises a synthesizer and a transmission coil; wherein said synthesizer generates electrical signals according to a plurality of instructions, wherein said instructions determine a shape of said EMF signals; wherein said electrical signals are passed to said transmission coil for transmission as EMF signals.

24. The method of claim 23, wherein said shape is a square shape or a triangular shape.

25. The method of any of the above claims, wherein said sensor comprises a magnetic flux density sensor, a magnetic field strength sensor, three Hall effect sensors or any other suitable magnetic sensor or combination thereof; wherein said suitable magnetic sensor is at least able to determine an amplitude of said EMF at an appropriate speed.
The method of any of claims 25-28, wherein said sensor comprises a magnetometer which is able to detect EMF.

The method of any of claims 25-29, wherein said sensor comprises a magnetic flux density sensor, a magnetic field strength sensor, or a combination thereof.

A system for performing the method of any of the above claims.

The system of claim 31, comprising a model computational device for generating a 3D model of said physical object, wherein said model computational device comprises a memory for storing a plurality of instructions and a processor for executing said instructions for performing the method of any of the above claims in regard to said 3D model.

The system of claim 32, further comprising a virtual world computational device, wherein said virtual world computational device comprises a memory for storing a plurality of instructions and a processor for executing said instructions for executing said functions of said virtual world according to any of the above claims.

A system for creating a three dimensional (3D) physical object with a virtual representation in a virtual world, comprising a model computational device for generating a 3D model of said physical object, wherein said 3D model comprises an interaction point for interaction with a user and a zero coordinate point for receiving a tracker, wherein said model computational device comprises a memory for storing a plurality of instructions for generating said 3D model and a processor for executing said instructions; the system further comprising a virtual world computational device, wherein said virtual world computational device comprises a memory for storing a plurality of instructions and a processor for executing said instructions, wherein upon execution of said instructions, said virtual world computational device creates a virtual world for interaction with the user; wherein said virtual world receives said 3D model and generates a virtual representation of said physical object according to
said 3D model, including with regard to said interaction point and said location of said physical object as determined through said tracker.

The system of claim 32, wherein upon interaction with said interaction point, said virtual representation is updated in said virtual world according to instructions executed by said processor of said virtual world computational device.

The system of claim 33, further comprising a 3D printer, wherein said physical object is created according to said 3D model by said 3D printer.

The system of claim 34, wherein said 3D printer receives a plurality of 3D printer files, such that said physical object is created by combining a plurality of 3D printed outputs from said 3D printer.

The system of claims 33 or 34, wherein said 3D model is suitable for rendering by a game engine and wherein said 3D model is converted to at least one 3D printer file.

The system of any of the above claims, wherein said 3D model comprises a plurality of reusable elements and wherein each of said reusable elements is mapped to said virtual representation in said virtual world, such that combining said plurality of reusable elements provides a direct mapping of said 3D model to said virtual representation.

The system of any of the above claims, wherein said tracker comprises an EMF (electromagnetic field) transmitter, an EMF receiver or a combination thereof; wherein a location of said 3D physical object is determined according to detection of an EMF through said tracker.

The system of any of the above claims, wherein one of said tracker or said sensor comprises said EMF receiver and hence enters sleep mode when said EMF receiver does not sense EMF for a predetermined period of time.

The system of claim 24, wherein said tracker or said sensor awakens at a predetermined time interval, such that said EMF receiver scans for EMF; if said EMF is not detected, said tracker reenters sleep mode.
The system of any of the above claims, wherein said virtual world computational device further comprises a game engine for generating said virtual world.

The system of claim 41, wherein said virtual world is selected from the group consisting of a VR (virtual reality) or AR (augmented reality) environment.

The system of any of the above claims, wherein said sensor is worn by said user as part of an article of apparel.

The system of claim 43, wherein said article of apparel comprises a glove, helmet, hat, head band, belt, wrist band, shoe or any suitable article of clothing.

The system of claim 44, wherein said article of apparel comprises a glove assembly.

The system of any of claims 43-45, wherein said 3D physical object comprises said tracker, and wherein said 3D physical object is manipulated by said user.

The system of claim 46, wherein said 3D physical object is manipulated by said user for performing one or more actions in said virtual world.

The system of claim 47, wherein said one or more actions in said virtual world are performed by said user for gameplay.

The system of any of the above claims, wherein said EMF transmission source comprises a synthesizer and a transmission coil; wherein said synthesizer generates electrical signals according to a plurality of instructions, wherein said instructions determine a shape of said EMF signals; wherein said electrical signals are passed to said transmission coil for transmission as EMF signals.

The system of claim 49, wherein said shape is a square shape or a triangular shape.

The system of any of the above claims, wherein said sensor comprises a magnetic flux density sensor, a magnetic field strength sensor, three Hall effect sensors or any other suitable magnetic sensor or combination thereof; wherein said suitable magnetic sensor is at least able to determine an amplitude of said EMF at an appropriate speed.
The system of any of claims 51-54, wherein said sensor comprises a magnetometer which is able to detect EMF.

The system of any of claims 51-55, wherein said sensor comprises a magnetic flux density sensor, a magnetic field strength sensor, or a combination thereof.

A method performed with the system of any of the above claims.

Description:
PCT APPLICATION

Title: SYSTEM AND METHOD FOR CREATING 3D OBJECT WITH TRACKING

Inventors: Stephan van den Brink, Maarten Witteveen, and Pim Ostendorf

FIELD OF THE INVENTION

The present invention relates to a system and method for creating three dimensional objects that feature tracking through an EMF (electromagnetic field) sensor and in particular, to such a system and method for creating such three dimensional objects for deployment in a virtual world.

BACKGROUND OF THE INVENTION

EMF (electromagnetic field) sensors may be used for detecting the position of any object to which they are attached, and hence may be used for tracking. For example, such sensors may be used to detect the position of humans and/or specific human appendages when attached to a human, for example when worn as an item of clothing. Determining the position of humans and/or specific human appendages may be useful, for example, for virtual reality (VR) or augmented reality (AR) devices.

Such sensors and EMF based tracking are also applied to physical objects which are manipulated in a VR or AR environment. Currently, each object needs to be designed separately through a non-reproducible process, which is inefficient and results in higher costs.

BRIEF SUMMARY OF THE INVENTION

The present invention overcomes the drawbacks of the background art, by providing a system and method for creating three dimensional objects that feature tracking through an EMF (electromagnetic field) sensor for deployment in a virtual world. By “virtual world”, it is meant a VR (virtual reality) or AR (augmented reality) environment. The system and method enable such objects to be designed, manufactured and then deployed in the virtual world in a reproducible, efficient manner.

Non-limiting examples of various systems, methods and implementations for localized tracking, for example to track a prop or other 3D (three dimensional) object, are described in US Provisional Application No. 63318030, filed on 9 March 2022, entitled “SYSTEM AND METHOD FOR FINGER TRACKING”, which is owned in common with the present application and which is fully incorporated by reference as if set forth herein.

According to at least some embodiments, there is provided a method for creating a three dimensional (3D) physical object with a virtual representation in a virtual world, comprising determining at least one interaction point on said 3D physical object; placing a tracker at a zero coordinate on said 3D physical object, wherein said zero coordinate is associated with said virtual representation in a virtual world; wherein upon interacting with said interaction point, moving said 3D physical object or a combination thereof, said virtual representation is altered according to said interacting, said moving or a combination thereof.

Optionally said 3D physical object has an associated 3D model, wherein said 3D model corresponds to said virtual representation in said virtual world. Optionally said tracker comprises an EMF (electromagnetic field) transmitter, an EMF receiver or a combination thereof; wherein a location of said 3D physical object is determined according to detection of an EMF through said tracker. Optionally said virtual world is provided through a virtual world computational device, said virtual world computational device comprising a memory for storing a plurality of instructions and a processor for executing said instructions; wherein said instructions are executed to generate said virtual world. Optionally said virtual world computational device further comprises a game engine for generating said virtual world. Optionally said virtual world is selected from the group consisting of a VR (virtual reality) or AR (augmented reality) environment.

Optionally the method further comprises receiving a game object file for said virtual representation, converting said game object file to a 3D printer file, and printing a physical 3D object according to said 3D printer file. Optionally the method further comprises printing a plurality of physical 3D objects according to a plurality of 3D printer files and combining said plurality of physical 3D objects to form a single physical 3D object. Optionally the method further comprises adding said tracker to said physical 3D object. Optionally said interacting with said interaction point further comprises providing at least one sensor worn by a user, and transmitting a signal indicative of a distance between said tracker and said at least one sensor. Optionally one of said sensor and said tracker comprises an EMF transmitter, and the other one of said sensor and said tracker comprises an EMF receiver, wherein said transmitting said signal further comprises determining said distance according to EMF signals from said EMF transmitter and received by said EMF receiver. Optionally said sensor comprises an EMF transmitter and said tracker comprises an EMF receiver. Optionally said tracker comprises an EMF transmitter and said sensor comprises an EMF receiver. Optionally said sensor is worn by said user as part of an article of apparel. Optionally said article of apparel comprises a glove, helmet, hat, head band, belt, wrist band, shoe or any suitable article of clothing. Optionally said article of apparel comprises a glove assembly. Optionally said 3D physical object comprises said tracker, and wherein said 3D physical object is manipulated by said user. Optionally said 3D physical object is manipulated by said user for performing one or more actions in said virtual world. Optionally said one or more actions in said virtual world are performed by said user for gameplay. 
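As a non-limiting illustration of the conversion step from a game object file to a 3D printer file, the sketch below translates a minimal, triangulated OBJ-style mesh into ASCII STL, a format widely accepted by 3D printer slicing software. The assumption that the game object is exported as a triangulated OBJ, as well as all names in the code, are illustrative only and are not drawn from the application itself:

```python
def obj_to_ascii_stl(obj_text, name="prop"):
    """Convert a triangulated OBJ mesh (v/f records only) to ASCII STL.

    Normals are written as zero vectors, which most slicers recompute;
    a real pipeline would also handle quads, materials and units.
    """
    vertices, faces = [], []
    for line in obj_text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            vertices.append(tuple(float(x) for x in parts[1:4]))
        elif parts[0] == "f":
            # OBJ indices are 1-based and may carry /texture/normal refs.
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:4]))
    out = ["solid %s" % name]
    for a, b, c in faces:
        out.append("  facet normal 0 0 0")
        out.append("    outer loop")
        for idx in (a, b, c):
            out.append("      vertex %g %g %g" % vertices[idx])
        out.append("    endloop")
        out.append("  endfacet")
    out.append("endsolid %s" % name)
    return "\n".join(out)
```

In a complete workflow, each converted file would be sent to a slicer and printed, and the printed parts optionally combined into a single physical 3D object as described above.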
Optionally said moving said 3D physical object further comprises moving said 3D physical object by a user; determining a distance between said 3D physical object and said user according to transmission of EMF signals; and transmitting a signal indicative of said distance. Optionally one of said tracker or said sensor comprises said EMF receiver and hence enters sleep mode when said EMF receiver does not sense EMF for a predetermined period of time. Optionally said tracker or said sensor awakens at a predetermined time interval, such that said EMF receiver scans for EMF; if said EMF is not detected, said tracker or said sensor reenters sleep mode. Optionally said EMF transmission source comprises a synthesizer and a transmission coil; wherein said synthesizer generates electrical signals according to a plurality of instructions, wherein said instructions determine a shape of said EMF signals; wherein said electrical signals are passed to said transmission coil for transmission as EMF signals. Optionally said shape is a square shape or a triangular shape. Optionally said sensor comprises a magnetic flux density sensor, a magnetic field strength sensor, three Hall effect sensors or any other suitable magnetic sensor or combination thereof; wherein said suitable magnetic sensor is at least able to determine an amplitude of said EMF at an appropriate speed. Optionally said sensor comprises a magnetometer which is able to detect EMF. Optionally said sensor comprises a magnetic flux density sensor, a magnetic field strength sensor, or a combination thereof.

According to at least some embodiments, there is provided a system for performing the method as described herein. Optionally the system further comprises a model computational device for generating a 3D model of said physical object, wherein said model computational device comprises a memory for storing a plurality of instructions and a processor for executing said instructions for performing the method as described herein in regard to said 3D model. Optionally the system further comprises a virtual world computational device, wherein said virtual world computational device comprises a memory for storing a plurality of instructions and a processor for executing said instructions for executing said functions of said virtual world as described herein.

According to at least some embodiments, there is provided a system for creating a three dimensional (3D) physical object with a virtual representation in a virtual world, comprising a model computational device for generating a 3D model of said physical object, wherein said 3D model comprises an interaction point for interaction with a user and a zero coordinate point for receiving a tracker, wherein said model computational device comprises a memory for storing a plurality of instructions for generating said 3D model and a processor for executing said instructions; the system further comprising a virtual world computational device, wherein said virtual world computational device comprises a memory for storing a plurality of instructions and a processor for executing said instructions, wherein upon execution of said instructions, said virtual world computational device creates a virtual world for interaction with the user; wherein said virtual world receives said 3D model and generates a virtual representation of said physical object according to said 3D model, including with regard to said interaction point and said location of said physical object as determined through said tracker.

Optionally upon interaction with said interaction point, said virtual representation is updated in said virtual world according to instructions executed by said processor of said virtual world computational device. Optionally the system further comprises a 3D printer, wherein said physical object is created according to said 3D model by said 3D printer. Optionally said 3D printer receives a plurality of 3D printer files, such that said physical object is created by combining a plurality of 3D printed outputs from said 3D printer. Optionally said 3D model is suitable for rendering by a game engine and wherein said 3D model is converted to at least one 3D printer file. Optionally said 3D model comprises a plurality of reusable elements and wherein each of said reusable elements is mapped to said virtual representation in said virtual world, such that combining said plurality of reusable elements provides a direct mapping of said 3D model to said virtual representation. Optionally said tracker comprises an EMF (electromagnetic field) transmitter, an EMF receiver or a combination thereof; wherein a location of said 3D physical object is determined according to detection of an EMF through said tracker. Optionally one of said tracker or said sensor comprises said EMF receiver and hence enters sleep mode when said EMF receiver does not sense EMF for a predetermined period of time. Optionally said tracker or said sensor awakens at a predetermined time interval, such that said EMF receiver scans for EMF; if said EMF is not detected, said tracker reenters sleep mode. Optionally said virtual world computational device further comprises a game engine for generating said virtual world.

Optionally said virtual world is selected from the group consisting of a VR (virtual reality) or AR (augmented reality) environment. Optionally said sensor is worn by said user as part of an article of apparel. Optionally said article of apparel comprises a glove, helmet, hat, head band, belt, wrist band, shoe or any suitable article of clothing. Optionally said article of apparel comprises a glove assembly. Optionally said 3D physical object comprises said tracker, and wherein said 3D physical object is manipulated by said user. Optionally said 3D physical object is manipulated by said user for performing one or more actions in said virtual world. Optionally said one or more actions in said virtual world are performed by said user for gameplay. Optionally said EMF transmission source comprises a synthesizer and a transmission coil; wherein said synthesizer generates electrical signals according to a plurality of instructions, wherein said instructions determine a shape of said EMF signals; wherein said electrical signals are passed to said transmission coil for transmission as EMF signals. Optionally said shape is a square shape or a triangular shape. Optionally said sensor comprises a magnetic flux density sensor, a magnetic field strength sensor, three Hall effect sensors or any other suitable magnetic sensor or combination thereof; wherein said suitable magnetic sensor is at least able to determine an amplitude of said EMF at an appropriate speed. Optionally said sensor comprises a magnetometer which is able to detect EMF. Optionally said sensor comprises a magnetic flux density sensor, a magnetic field strength sensor, or a combination thereof. According to at least some embodiments, there is provided a method performed with the system as described herein.

Implementation of the method and system of the present invention involves performing or completing certain selected tasks or steps manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of preferred embodiments of the method and system of the present invention, several selected steps could be implemented by hardware or by software on any operating system or any firmware, or a combination thereof. For example, as hardware, selected steps of the invention could be implemented as a chip or a circuit. As software, selected steps of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In any case, selected steps of the method and system of the invention could be described as being performed by a data processor, such as a computing platform for executing a plurality of instructions.

Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The materials, methods, and examples provided herein are illustrative only and not intended to be limiting.

An algorithm as described herein may refer to any series of functions, steps, one or more methods or one or more processes, for example for performing data analysis.

Implementation of the apparatuses, devices, methods and systems of the present disclosure involves performing or completing certain selected tasks or steps manually, automatically, or a combination thereof. Specifically, several selected steps can be implemented by hardware or by software on an operating system, a firmware, and/or a combination thereof. For example, as hardware, selected steps of at least some embodiments of the disclosure can be implemented as a chip or circuit (e.g., ASIC). As software, selected steps of at least some embodiments of the disclosure can be implemented as a number of software instructions being executed by a computer (e.g., a processor of the computer) using an operating system. In any case, selected steps of methods of at least some embodiments of the disclosure can be described as being performed by a processor, such as a computing platform for executing a plurality of instructions. The processor is configured to execute a predefined set of operations in response to receiving a corresponding instruction selected from a predefined native instruction set of codes.

Software (e.g., an application, computer instructions) which is configured to perform (or cause to be performed) certain functionality may also be referred to as a “module” for performing that functionality, and may also be referred to as a “processor” for performing such functionality. Thus, a processor, according to some embodiments, may be a hardware component, or, according to some embodiments, a software component.

Further to this end, in some embodiments: a processor may also be referred to as a module; in some embodiments, a processor may comprise one or more modules; in some embodiments, a module may comprise computer instructions, which can be a set of instructions, an application, or software, which are operable on a computational device (e.g., a processor) to cause the computational device to conduct and/or achieve one or more specific functionalities.

Some embodiments are described with regard to a “computer,” a “computer network,” and/or a “computer operational on a computer network.” It is noted that any device featuring a processor (which may be referred to as a “data processor”; a “pre-processor” may also be referred to as a “processor”) and the ability to execute one or more instructions may be described as a computer, a computational device, or a processor (e.g., see above), including but not limited to a personal computer (PC), a server, a cellular telephone, an IP telephone, a smart phone, a PDA (personal digital assistant), a thin client, a mobile communication device, a smart watch, a head mounted display or other wearable that is able to communicate externally, a virtual or cloud based processor, a pager, and/or a similar device. Two or more of such devices in communication with each other may be a “computer network.”

BRIEF DESCRIPTION OF THE DRAWINGS

The invention is herein described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only, and are presented in order to provide what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice. In the drawings:

Figure 1 shows a non-limiting, exemplary schematic diagram of a glove and a local device for localized near field tracking;

Figures 2A-2C show non-limiting, exemplary systems for localized near field tracking;

Figures 3A and 3B show a non-limiting, exemplary system incorporating a 3D prop, and also a schematic of the prop itself, respectively;

Figure 4 shows a non-limiting, exemplary method for operating a tracking system as shown herein;

Figure 5 shows a non-limiting, exemplary method for creating a 3D prop for use in the systems and methods as described herein;

Figure 6 shows an exemplary, non-limiting system incorporating a plurality of props as described herein;

Figure 7 shows a non-limiting, exemplary method for determining a location of a tracker on a 3D model of a prop as described herein;

Figure 8 shows a non-limiting, exemplary method for creating 3D props in a reproducible manner; and

Figure 9 shows a non-limiting, exemplary method for converting game object files to 3D printer files.

DESCRIPTION OF AT LEAST SOME EMBODIMENTS

Figure 1 shows a non-limiting, exemplary schematic diagram of a glove and a local device for localized near field tracking. As shown with regard to Figure 1, an EM (electromagnetic) field 103 is generated by the source coil 104 on the back of the hand. This field is then detected by the external tracker 102, which is attached to prop 101, as an example of a 3D (three dimensional) object as described herein. These props could be anything that requires accurate tracking when manipulated within a virtual world, including but not limited to musical instruments, precision tools, utensils or weapons. External tracker 102 then calculates position and rotation in relation to the position of the back of the hand (source coil 104) in real space. This data may then be transmitted to a computer, or other system or device, for further processing, preferably through a standard RF (radiofrequency) transmission.
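The position calculation itself is not detailed above; as a rough, non-limiting illustration, the field of a small source coil falls off approximately with the cube of distance (magnetic dipole approximation), so a tracker can estimate its range from the measured field amplitude. The calibration constant and sample values in the sketch below are hypothetical:

```python
def estimate_range(amplitude, k=1.0):
    """Estimate distance from a measured EMF amplitude.

    Assumes a magnetic-dipole falloff |B| ~ k / r**3, so that
    r = (k / |B|)**(1/3). k is a calibration constant that depends
    on the source coil; the default here is a hypothetical value.
    """
    if amplitude <= 0:
        raise ValueError("amplitude must be positive")
    return (k / amplitude) ** (1.0 / 3.0)

# Under the cubic falloff, doubling the distance reduces the
# measured amplitude eightfold:
near = estimate_range(8.0)  # amplitude 8*k -> r = 0.5
far = estimate_range(1.0)   # amplitude 1*k -> r = 1.0
```

A full solution for position and rotation would combine several such measurements across multiple coil axes, which is beyond the scope of this sketch.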

Optionally, a tracked object may be placed in deep sleep mode where the communication module is turned off, for example due to a period of inactivity. External tracker 102 may for example enter such a deep sleep mode. Periodically, for example at predefined time intervals, the tracked object may then scan for the proximity of an EM transmission signal. Upon detecting such a transmission signal, the tracked object may then restart full operation.

Upon ceasing to detect such an EM transmission signal, for example optionally for a predetermined period of time, the tracked object may then re-enter deep sleep mode. Such a deep sleep mode is useful for reducing power consumption during a period of inactivity.
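The deep sleep behavior described above may be sketched, purely for illustration, as a small state machine. All names and time constants here are assumptions, not taken from the embodiments; the periodic proximity scan is abstracted into a detection flag passed to `tick()`.

```python
# Illustrative sketch only; not the patent's implementation.
AWAKE, DEEP_SLEEP = "awake", "deep_sleep"

class TrackerPower:
    """Power state of a tracked object with an inactivity-based deep sleep."""

    def __init__(self, inactivity_limit):
        self.state = AWAKE
        self.inactivity_limit = inactivity_limit  # seconds of silence before sleeping
        self.silent_for = 0.0

    def tick(self, dt, emf_detected):
        """Advance the state machine by dt seconds."""
        if emf_detected:
            self.state = AWAKE                    # (re)start full operation
            self.silent_for = 0.0
        elif self.state == AWAKE:
            self.silent_for += dt
            if self.silent_for >= self.inactivity_limit:
                self.state = DEEP_SLEEP           # communication module off
        return self.state
```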

Figure 2A shows a non-limiting, exemplary system for near field tracking of an EMF, optionally with shaped wave patterning. As shown in a system 200A, an EMF generator 202 comprises a synthesizer 204 and a transmission coil 206. Synthesizer 204 generates the electrical signals which are then passed to transmission coil 206 for transmission as EMF signals 212. A processor 208A executes instructions stored in a memory 210A to determine when EMF signals 212 are to be emitted by transmission coil 206. Such signals 212 are emitted intermittently, with a periodicity and duration of transmission that is determined according to the instructions stored in the memory 210A and executed by the processor 208A. In particular, the shape of the waves is determined according to these instructions, which in turn determine the electrical signals that are put into transmission coil 206.

EMF signals 212 are received by a sensor 214, through a sensor coil 216. A processor 208B executes instructions stored in a memory 210B which enables the received signals from sensor coil 216 to be measured and optionally for further processing on these signals to be performed.

In accordance with Faraday’s law of induction, EMF signals 212 pass through sensor coil 216 and in turn induce a voltage over sensor coil 216 based on the rate of change of this magnetic field. The current, and therefore also the voltage, over sensor coil 216 takes the shape of the first derivative of the electrical signal that is put into transmission coil 206. The optimal shape of the input electrical signals from synthesizer 204 may be determined so that EMF signals 212 remain measurable at the greatest possible distance. The optimal shape may comprise a square wave or a triangular wave. Instructions stored in memory 210A and executed by processor 208A determine the shape of the input electrical signals.

Although the optimal shape of the electrical signals may comprise a square wave, because such a shape may provide the largest measurable range for sensor 214, the derivative of a square wave is a pulse on the rising edge and another one on the falling edge of each square. These pulses are extremely short and thus difficult to measure at sensor 214. Shaping the electrical input signals as triangular signals makes EMF signals 212 easier to measure at sensor 214, because the derivative is a square wave. In addition, such a triangular shape enables binary information to be embedded in the transferred signal.

When shaping the electrical input signals as triangular signals, the steepness of the signal increases with the frequency of the signal. As the frequency of the input signals increases, the measurable range of EMF signals 212 for sensor 214 also increases. In addition, such a triangular wave results in a square wave when measured by sensor 214, such that binary data may easily be embedded in this signal. This also makes it easier to differentiate between signals from different EMF generators 202, and/or between signals generated by a single EMF generator 202 at different times. Optionally a start and end code may be embedded in EMF signals 212 for synchronization. Such differentiation may also enable a plurality of different sensors 214 to measure each other’s signal and so to determine the distance between them.

Synthesizer 204 may comprise a band pass filter 218A. Lowering the noise floor and/or increasing the amplification at sensor 214 may be achieved with band pass filter 218A. Alternatively, band pass filter 218A may comprise a high pass filter.

Optionally, EMF generator 202 is located at a particular known location, such as the back of a user’s hand, as shown with regard to Figure 1. EMF generator 202 may also be located at another part of a body of the user, including but not limited to, wrist, arm, leg, torso, neck, head, foot and so forth. The fact that EMF generator 202 is at a known location enables the relative location of sensor 214 to be determined through sensing of the EMF received by sensor 214. For example, sensor 214 may be attached to, within, or integrally formed with, a prop, a plane or another object. As used herein, the term “prop” relates to an object that is to be moved, manually manipulated and/or to be otherwise contacted by the body of the user, such as for example a hand of the user. As used herein, the term “plane” relates to an object or location that is not necessarily moved or manually manipulated, or otherwise acted upon by the user. For example, a plane may be a table or other object.

Figure 2B shows a similar system as Figure 2A, except that band pass filter 218B is now located at sensor 214. That is, band pass filter 218B is now located at the EMF receiver rather than the EMF transmitter. The other components in system 200B operate in an identical or at least similar manner to the components of system 200A.

Figure 2C shows a similar system as for Figures 2A and 2B, except that a band pass filter and/or high pass filter is now located at both EMF generator 202 and sensor 214. The other components in system 200C operate in an identical or at least similar manner to the components of systems 200A and 200B.

Sensor 214 may comprise a magnetic flux density sensor or magnetic field strength sensor, three Hall effect sensors or any other suitable magnetic sensor or combination thereof. It should be noted that Hall effect sensors may optionally not be used where the same sensor must both sample a set frequency to gauge RF interference and also receive EMF signals. Each sensor may comprise a magnetometer which is able to detect EMF from transmission coil 206, but preferably comprises a sensor that is at least able to determine an amplitude of the EMF at the appropriate speeds.

For finger tracking applications, preferably the location of each finger is tracked with a separate sensor 214, while the location of all sensors 214, and hence all fingers on one hand, is preferably tracked with one transmission coil 206 for generating EMF signals 212.

Figures 3A and 3B show a non-limiting, exemplary system incorporating a 3D prop, and also a schematic of the prop itself, respectively. Turning now to Figure 3A, a system 300 features a sensor device 302 in communication with a central computational device 320 through a computer network 316. Sensor device 302 features an EMF receiver 304 for receiving EMF signals. Sensor device 302 also features a communication module 306 for supporting communication with central computational device 320 through computer network 316. Instructions for performing functions of sensor device 302 are stored in a memory 311 and are executed by a processor 310.

Sensor device 302 is also within a detection distance of a prop 352, which is a 3D object that features an EMF transmitter 350. When EMF receiver 304 is within detection proximity of EMF transmitter 350, EMF signals generated by EMF transmitter 350 are detected by EMF receiver 304. Such proximity may occur when a user who is wearing sensor device 302 on an article of clothing as described herein, or who otherwise has sensor device 302 attached to their body, grasps or otherwise physically interacts with prop 352. In another non-limiting example, such proximity may occur when another physical object or apparatus has sensor device 302 attached thereto or physically integrated therewith, for example for automated machinery, educational environments and/or IoT (internet of things) operation.

The location of EMF transmitter 350 in regard to prop 352 is known to at least central computational device 320. For example, EMF transmitter 350 may be located in a physical center of prop 352 or at another location. The specific location of EMF transmitter 350 is the zero coordinate for tracking prop 352, such that central computational device 320 is preferably able to track the relative location of prop 352 and sensor 302, as well as their relative orientation. For example, if prop 352 is a tool that a human user is to grasp, incorrectly grasping prop 352 is preferably detectable by central computational device 320, thereby enabling feedback to be given to the user (for example, by not enabling prop 352 to be fully or correctly operational within the virtual world).

Central computational device 320 receives information from sensor 302, to determine the relative location and orientation of prop 352 and sensor 302. For example, such information may comprise processed EMF signal data as received by EMF receiver 304. Central computational device 320 preferably receives such information through a computer interface 332, which is then analyzed by an analysis engine 334. Analysis engine 334 is preferably able to determine the relative location and orientation of prop 352 and sensor 302. Analysis engine 334 then provides such information to a virtual world application (not shown).

Central computational device 320 also preferably comprises a memory 331 for storing instructions for performing the functions of analysis engine 334 and other functions of central computational device 320. These instructions are preferably executed by a processor 330. An electronic storage 322 may store other necessary information.

Figure 3B shows a more detailed schematic of prop 352. Prop 352 features a tracker 354 as previously described, located at a specific location at prop 352. This location may be within, at a side of or attached to prop 352. The location of tracker 354 may be described as the “reference location”, which is sometimes termed the “zero coordinate” or the “0,0,0 coordinate” for tracker 354 within prop 352. These terms are used interchangeably herein. In terms of further data analysis, this location is considered to be the center of prop 352, even if not directly located at the physical center thereof. The offset between the tracker location and the physical center is then used to correctly calculate and create the visualization on an external computer, with regard to the location and placement of prop 352. The 0,0,0 coordinate 358 thus serves as an effective object center. Tracker 354 preferably comprises EMF transmitter 350 (not shown) and may also comprise other components as well, as necessary to enable sensor 302 to receive EMF signals from tracker 354. Effective object center 358 may not be located at the actual physical center of prop 352; however, the position of tracker 354 defines the central location in terms of tracking of prop 352, as it is this position which is tracked.
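Applying the offset between the tracked zero coordinate and the prop's visual origin may be sketched, for illustration only, as a simple translation; the function name, vector representation and the 3 cm figure are assumptions, not taken from the embodiments.

```python
# Illustrative sketch: translate the tracked 0,0,0 coordinate of the tracker
# to the prop's visual origin using a stored, per-prop offset.

def apply_offset(tracked_pos, offset):
    """Return the prop's visual origin for a given tracked tracker position."""
    return tuple(p + o for p, o in zip(tracked_pos, offset))

# Example (assumed): the tracker sits 3 cm below the prop's geometric centre,
# so every tracked position is shifted up by that offset for visualization.
visual_origin = apply_offset((0.50, 1.20, 0.80), (0.0, 0.03, 0.0))
```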

Figure 4 shows a non-limiting, exemplary method for operating a tracking system as shown herein. Turning now to Figure 4, as shown in a method 400, the process begins at 402, when the system is initialized. The system is preferably initialized by initializing the glove or other EMF source being worn by a user, and the tracker, which preferably comprises sensors for receiving EMF signals. The tracker may be contained in a prop or plane, as previously described. The initialization process preferably enables both sides of the transmission/tracking system to begin operation. Optionally one or both sides are calibrated.

At 404, one or more EMF sensors at the tracker wait until they receive a signal from a glove EMF generator. Upon receiving the signal, the one or more EMF sensors lock onto the signal from that particular glove (for example, a left glove or a right glove of a pair). At 406, the tuned EMF sensors listen for the EMF signals from the source (transmission coil). At 408, the EMF signals are received by the sensors at the tracker. At 410, data in regard to the EMF signals is sent to an external computer, preferably including data about the location of the tracker. At 412, the external computer performs a calculation to determine relative location information. If the tracker is a ground truth tracker (such as the previously described plane), then the external computer preferably performs the inverse calculation by knowing the location of the tracker, and then performing a vector calculation to determine the position of the glove. This calculation determines the relative location of the glove in relation to the tracker.
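The two calculation paths at stage 412 may be sketched, as an illustrative assumption about the vector math rather than the patent's actual implementation, as follows: with a fixed ground-truth tracker the glove position is the known tracker position plus the measured offset, while with a movable prop the same measurement anchors the prop to the glove.

```python
# Illustrative sketch of the ground-truth and movable-prop calculations.
# The measured offset is assumed to be the glove's position relative to
# the tracker, expressed in world coordinates.

def glove_from_plane(plane_pos, glove_rel_tracker):
    """Ground-truth tracker (plane): glove = tracker position + offset."""
    return tuple(p + d for p, d in zip(plane_pos, glove_rel_tracker))

def prop_from_glove(glove_pos, glove_rel_tracker):
    """Movable prop: prop tracker = glove position - offset."""
    return tuple(g - d for g, d in zip(glove_pos, glove_rel_tracker))
```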

On the other hand, if the tracker is on a movable prop, then the tracker data is applied to the prop to create a dynamic and continuously updated visualization; and/or to perform other, further calculations. At 414, the relative location information is transmitted to at least one of the glove, the tracker, an external computational device or a combination thereof. At 416, stages 406-414 are preferably repeated at least once.

Figure 5 shows a non-limiting, exemplary method for creating a 3D prop for use in the systems and methods as described herein. As shown in a method 500, the process begins at 502 when a 3D model of a prop is uploaded. The 3D model may be any suitable model for creating such a prop, for example on a 3D printer, as CAD/CAM drawings, as manufacturing schematics, through non-CAD/CAM three dimensional models, from photos or images of a physical object and so forth. At 504, the interaction elements on the 3D model are defined. Such interaction elements may relate to elements that are contacted by the user, manipulated by the user and so forth. Non-limiting examples include a lever, a button, a trigger, a handle, a tool or a portion thereof, a weapon or a portion thereof, and so forth.

At 506, the location of the tracker is determined. The location may be determined as described herein, for example in regard to one or more parameters of the tracker itself, desired interactions with the interaction elements, intended orientation of the prop in regard to the user and so forth. At 508, the prop schematics are created from the 3D model, the defined interaction elements and the determined location of the tracker. The prop schematics may also further include manufacturing information or instructions, such as specific dimensions, materials or parts to be used, and so forth, if not already provided.

At 510, the virtual world schematics are created. These schematics relate to the interactions of the virtual world representation of the prop within that virtual world. For example, for this object to be used, operative and/or to interact with a user in the virtual world, optionally such information as physical properties of a real life material may be applied to the virtual object, so that the virtual object behaves as a user may expect from real world behaviors. Such behaviors may be directly applied within the game engine, as described for example in the Unity game engine documentation (https://docs.unity3d.com/Manual/GameObjects.html). In this system, functions are added as components to a GameObject, which is the representation of a real life object within the game world as a virtual world. Other game engines and/or virtual world engines may operate with other such functions.
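The component pattern described above can be sketched outside of any particular engine; the following is a minimal, non-Unity illustration in which physical properties are attached to a virtual object as named components, in the spirit of the GameObject/component model. All class and property names here are assumptions for illustration.

```python
# Illustrative sketch of attaching physical-property components to a
# virtual object, loosely modeled on the GameObject/component pattern.

class VirtualProp:
    def __init__(self, name):
        self.name = name
        self.components = {}

    def add_component(self, key, **properties):
        """Attach a named component (e.g. physics or material data)."""
        self.components[key] = properties
        return self

# Assumed example: a training sword given real-world-like physics so that
# it behaves in the virtual world as a user would expect.
sword = (VirtualProp("training_sword")
         .add_component("rigidbody", mass_kg=0.9, use_gravity=True)
         .add_component("material", texture="steel", elasticity=0.05))
```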

At 512, the prop is produced, for example through manufacturing or on a 3D printer. For manufacturing, the prop may be produced in an injectable plastic mold or through another manufacturing process. At 514, the tracker is incorporated at the predetermined location. The tracker may be so incorporated during the production of the prop, or may be added after the prop has been produced. At 516, the user interacts with the prop, and the location of the prop in the virtual world is tracked according to real world interactions of the user with the prop.

Figure 6 shows an exemplary, non-limiting system incorporating a plurality of props as described herein. As shown, a system 600 features a plurality of props 614, of which three are shown herein as props 614A-614C for the purpose of illustration and without any intention of being limiting. Props 614 may comprise any suitable prop devices as described herein, including but not limited to a tool, musical instrument or weapon, or other physical 3D object that is physically manipulated in a virtual world.

System 600 also optionally and preferably features a central computational device 620, which is in contact with props 614 through a computer network 610. Network 610 may comprise any suitable wired or wireless communication network, including without limitation WiFi, Bluetooth, radio frequencies and cellular network communication.

Central computational device 620 preferably comprises a processor 630 and a memory 631. Memory 631 stores a plurality of instructions for execution by processor 630 to fulfill the functions of central computational device 620, for example and without limitation to provide an engine 636. For example and without limitation, engine 636 may support game play for an interactive electronic game. A plurality of users may interact with props 614, and may interact with the game according to game play supported by engine 636. The interaction and/or orientation of the users in regard to props 614 may be determined by engine 636; such an interaction and/or relative orientation may affect game play. Props 614 may send signals to central computational device 620 in regard to such interactions and/or relative orientation. In turn, central computational device 620 may send information and/or instructions, and/or may fulfill such functions as keeping score, according to the provided interaction and/or relative orientation information.

Central computational device 620 may also comprise an electronic storage 622, for example for storing user profile information, additional game data and/or other information for supporting the functions of central computational device 620 and/or of system 600 overall.

System 600 may also, additionally or alternatively, comprise a plurality of user computational devices 602, shown as user computational devices 602A-602C for the purpose of illustration only and without any intention of being limiting. Optionally one or more user computational device(s) 602 replace central computational device 620. User computational devices 602A-602C may be used for example to control game play, to receive information about game play and/or to participate in game play, in combination with props 614. User computational devices 602A-602C may also comprise the previously described EMF sensor, containing the previously described EMF receiver (not shown). For this implementation, optionally props 614 do not communicate with central computational device 620 but only with user computational devices 602. In another implementation, the previously described EMF sensor is implemented as another, separate device, which may be worn by the user (not shown). Again, for this implementation, optionally props 614 do not communicate with central computational device 620 but only with the separate sensor device (not shown).

Figure 7 shows a non-limiting, exemplary method for determining a location of a tracker on a 3D model of a prop as described herein. As shown in a method 700, the process begins by creating a plurality of parameters for the tracker at 702. The tracker preferably includes an EMF generator as described herein. The parameters for the tracker may relate to strength of the generated EMF field, which may in turn determine the maximum potential distance of the tracker from the EMF receiver for detection to occur. Such a maximum potential distance may also depend on the parameters of the EMF receiver; optionally such parameters of suitable EMF receivers may be considered in combination with parameters for the tracker. The effect of different materials and thicknesses of such materials may also be considered, such that the position of the tracker within a prop and covered by one or more prop materials may affect the strength of a detectable EMF field. The size and weight of the tracker may also be considered as tracker parameters. A further tracker parameter may relate to the ability of the tracker to withstand application of a physical force, for example through being dropped or contacted by another object, or by a human body.
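The range trade-off among these tracker parameters can be illustrated with a simple model: a magnetic dipole field falls off roughly with the cube of distance, so the maximum detection distance can be estimated from transmitter strength, material attenuation, and the receiver's noise floor. The inverse-cube model and all constants below are illustrative assumptions, not values from the embodiments.

```python
# Illustrative range estimate under an assumed 1/r^3 dipole falloff.

def max_detection_range(field_at_1m, attenuation, sensor_floor):
    """Largest distance r (metres) at which the attenuated field still
    exceeds the sensor's noise floor, given the field strength at 1 m."""
    effective = field_at_1m * attenuation        # loss through prop material
    if effective <= sensor_floor:
        return 0.0
    return (effective / sensor_floor) ** (1.0 / 3.0)

# Under this model, doubling transmitter power extends range by only
# about 26% (2 ** (1/3)), which is why tracker placement and covering
# material matter as much as raw field strength.
r = max_detection_range(field_at_1m=1e-6, attenuation=0.8, sensor_floor=1e-9)
```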

At 704, one or more interaction points on the prop are determined. Such interaction points may relate to the previously described interaction elements. Additionally or alternatively, such interaction points may relate to how the prop is used, such as a portion of a weapon which may contact the body of another user or another object if used in a virtual world.

At 706, one or more handling parameters of the prop are determined. For example, such parameters may include determining whether the prop is meant to be held stationary or to be moved, for example by being swung or thrown. Other parameters may relate to how the prop is meant to be grasped and held. The amount of expected physical force to be applied to the prop may also be determined as a handling parameter. Other such parameters may relate to the extent of force that may be applied to interactive elements of the prop, as well as to the degree and/or extent and/or angle or direction of movement of each such interactive element.

At 708, the virtual world parameters are determined. These parameters may include but are not limited to material and shape properties that are stored in the generated virtual 3D file and linked to the physical printed 3D shape, including but not limited to weight, texture, shape, color, elasticity and so forth. These parameters may be determined according to the virtual world schematics as described with regard to the method of Figure 5.

At 710, the tracker location is preferably selected according to these previously determined parameters, more preferably including all of the tracker parameters, handling parameters for the prop, interaction points on the prop, and the virtual world parameters for determining how the prop exists as a 3D object within the virtual world. The type of tracker may also be determined in regard to the location, as a more powerful EMF generator within a tracker may be desired if the tracker is surrounded by a significant amount of material of the prop, for example.
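One way to combine the previously determined parameters into a location choice is a weighted score over candidate positions; the criteria, weights and candidate names below are purely illustrative assumptions, not the selection method of the embodiments.

```python
# Illustrative sketch: select a tracker location by scoring candidates
# against normalised criteria derived from the determined parameters.

def score_location(candidate, weights):
    """Weighted sum of per-criterion scores; each criterion is in [0, 1]."""
    return sum(weights[k] * candidate[k] for k in weights)

candidates = {
    "handle_core": {"signal_clearance": 0.6, "impact_protection": 0.9,
                    "distance_to_grip": 0.95},
    "blade_base":  {"signal_clearance": 0.9, "impact_protection": 0.4,
                    "distance_to_grip": 0.7},
}
weights = {"signal_clearance": 0.4, "impact_protection": 0.4,
           "distance_to_grip": 0.2}

best = max(candidates, key=lambda name: score_location(candidates[name], weights))
```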

At 712, one or more 3D models of the prop are created, preferably as determined by handling parameters for the prop, interaction points on the prop, and one or more desired materials. The material(s) may be selected according to look, feel (sensation when touched), durability, cost and so forth. The 3D models may also be determined according to the virtual world parameters for determining how the prop exists as a 3D object within the virtual world. Optionally and preferably, the 3D model also takes into consideration the location of the tracker and one or more manufacturing constraints.

At 714, the tracker location is then applied to the 3D model, so that the prop may be produced with the tracker placed in the correct or most suitable location.

Figure 8 shows a non-limiting, exemplary method for creating 3D props in a reproducible manner. As shown in a method 800, the process starts by defining a plurality of reusable prop elements at 802. Such prop elements may include but are not limited to a handle, a button, a lever, a switch, a blade for a sword, a barrel of a gun, an operational element of a tool (such as a blade for a knife, a pair of blades for scissors, a hammer head for a hammer, and so forth). At 804, one or more interaction points are defined on the reusable elements. For example, a location or locations for grasping the prop may be defined on the handle. The placement of the fingers and thumb on the handle may be defined, for one hand or more than one hand. The interaction of one or more fingers and/or of the thumb on a button, lever or switch may be defined for example. The orientation of the prop with regard to one or both hands of the user, and/or of more than one user, may also be defined for example.

At 806, one or more handling parameters are defined on the reusable elements. These handling parameters relate to state based effects that measure changes to the physical shape and apply those in the virtual world, for instance the trigger being pushed. Each state based effect may comprise a triggered effect that can easily be linked between a physical reusable element and a corresponding virtual element in a game engine or other virtual world engine. For example and without limitation, such an element may be an “on pick-up” trigger or an “on hand proximity” trigger.
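A reusable element carrying its interaction points and state based triggers may be represented, as an illustrative assumption about the data layout only, along the following lines:

```python
# Illustrative sketch of a reusable prop element with interaction points
# and state-based handling triggers; names are assumptions for illustration.

from dataclasses import dataclass, field

@dataclass
class ReusableElement:
    name: str
    interaction_points: list = field(default_factory=list)  # e.g. grip positions
    triggers: dict = field(default_factory=dict)            # state -> effect name

handle = ReusableElement(
    name="pistol_grip",
    interaction_points=["palm_rest", "index_trigger"],
    triggers={"on_pick_up": "attach_to_hand", "on_hand_proximity": "highlight"},
)
```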

The handling parameters may be determined according to further user equipment, which may be worn by the user. For example, a user may wear one or more sensors, such as a glove comprising a plurality of sensors on fingers and/or on the wrist. If the user is interacting with an object and a sensor worn by the user is in proximity to the object, then the sensor may detect the presence of the object and/or movements of the object. For example, if the user wears a glove and pulls a trigger on the object, a sensor on the glove may detect such movement, in addition to or in place of the interaction element (trigger) on the object. The trigger may participate in the measurement of the movement or it may not. The game engine or other virtual world engine may then receive the representation of, or data related to, the movement of the trigger, and may then adjust the virtual representation of the object in the virtual world. An “on trigger push” trigger can be embedded in the virtual object and thus linked to the game engine or other virtual world engine. A script or piece of code, embedded in the virtual object, may then be provided in a format that game engines may understand, including but not limited to such game engines as Unity or Unreal.

At 808, one or more potential tracker locations on the reusable elements are defined. Not all reusable elements may feature a tracker location. The tracker location may also be defined for particular tracker(s) and/or for particular materials of the prop, as described herein. At 810, one or more potential tracker parameters are determined. Stages 808 and 810 may be performed in reverse order and/or may be repeated at least once, such that the tracker parameters and location may be adjusted with regard to each other.

At 812, 3D models of the reusable elements are preferably created according to the use and interaction points of such elements, their respective handling parameters, the potential tracker location(s) (if any) and also the tracker parameters (if any). Optionally, at 814 such reusable element 3D models may be uploaded to a marketplace, so that they could be used by more than one prop designer and/or manufacturer. The marketplace may also be internal to a single prop designer and/or manufacturer.

Figure 9 shows a non-limiting, exemplary method for converting game object files to 3D printer files. As shown in a method 900, the process begins by creating a game object according to any suitable art known method at 902. For example, the game object may be rendered as an OBJ file, which may be read by the Unity game engine. Other formats may be applied to other game engines. The file may include for example vertices, skeleton and skin. The skin may be rendered as a mesh, for example as a plurality of connected triangles. The inner skeleton may be present to support movement of moving parts of the object within the game. In such a scenario, optionally only those parts of the skeleton that are required for movement may be present. A texture layer or other visuals on the skin may be applied through a projection map.
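The vertex and face structure of such a file can be illustrated with a minimal reader for the Wavefront OBJ format, handling only `v` (vertex) and `f` (triangular face) records; real OBJ files also carry normals, texture coordinates and other data that full importers handle.

```python
# Minimal illustrative OBJ reader: vertices and triangular faces only.

def parse_obj(lines):
    vertices, faces = [], []
    for line in lines:
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            vertices.append(tuple(float(x) for x in parts[1:4]))
        elif parts[0] == "f":
            # OBJ face indices are 1-based and may carry /texture/normal
            # suffixes, so keep only the vertex index and shift to 0-based.
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:4]))
    return vertices, faces

obj = ["v 0 0 0", "v 1 0 0", "v 0 1 0", "f 1 2 3"]
vertices, faces = parse_obj(obj)
```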

At 904, if not previously determined at 902, the above described mesh is preferably defined. The mesh determines the outer boundaries and contours of the game object. At 906, any movable part(s) that are present in the game object are selected or otherwise indicated. At 908, the game object file is converted to at least one 3D printer file, such as a STL file format for example. Preferably, each movable part of the game object is provided as a separate 3D printer file. Optionally, the game object is further split to a plurality of files, for example to enable the tracker to be correctly located within the physical 3D object.

At 910, the tracker location is preferably defined within the physical 3D object, in relation to the 3D printer file(s). Defining the location preferably also includes determining how the tracker is to be incorporated within the 3D object as printed. For example and without limitation, if the physical 3D object is to be assembled from the 3D printed output of a plurality of 3D printer files, then the tracker may be defined as being attached to, combined with, or integrally formed with, one of the 3D printed objects. At 912, the 3D printer file(s) are preferably adjusted. For example, conversion of the game object file may result in a 3D printer file which needs to be repaired, and/or where the model is incomplete. For example, for a 3D printer file with a mesh comprised of a plurality of triangles, the triangles need to form a complete solid. If the triangles do not form a complete solid, then the slicer (3D slicing engine) cannot appropriately plan the tool paths. Tool path planning is one aspect that may impact the quality of the final 3D printed object and as such, is preferably performed with any necessary corrections and/or adjustments to the 3D printer file(s).
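The "complete solid" requirement above has a standard check: a triangle mesh is watertight only if every edge is shared by exactly two triangles. The following sketch illustrates that check; the mesh representation is an assumption for illustration.

```python
# Illustrative watertightness check: every edge must belong to exactly
# two triangles for the slicer to plan tool paths on a closed solid.

from collections import Counter

def is_watertight(triangles):
    """triangles: list of (i, j, k) vertex-index triples."""
    edges = Counter()
    for a, b, c in triangles:
        for u, v in ((a, b), (b, c), (c, a)):
            edges[tuple(sorted((u, v)))] += 1
    return all(count == 2 for count in edges.values())

# A tetrahedron is the smallest watertight mesh; removing one face leaves
# three boundary edges owned by a single triangle each.
tetra = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
```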

At 914, the 3D component(s) are printed on a 3D printer, according to the 3D printer file(s). At 916, if a plurality of components has been printed separately, such as one or more movable parts, these separately printed components are combined. For example, glue or another adhesive, screws, bolts, nails, straps and other connectors may be used to combine these separately printed components to the physical 3D object. At 918, if not already added, then the tracker is added to the object. The tracker may also be added before or during stage 916.

It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable subcombination.

Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims. All publications, patents and patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual publication, patent or patent application was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention.
