

Title:
METHOD AND SYSTEM FOR GEOREFERENCING DIGITAL CONTENT IN A SCENE OF VIRTUAL REALITY OR AUGMENTED/MIXED/EXTENDED REALITY
Document Type and Number:
WIPO Patent Application WO/2022/129999
Kind Code:
A1
Abstract:
A method and a kit are disclosed for tracking the position of a portable virtual reality VR or augmented/mixed/extended reality AR/MR/XR system, with a precision of less than 1 cm, in real time, optionally without connecting to a remote internet platform, and for georeferencing digital content in a scene of virtual reality or augmented/mixed/extended reality on the portable VR/AR/MR/XR system. The position of the portable VR/AR/MR/XR system is tracked continuously over time (positional tracking), as the portable VR/AR/MR/XR system moves with the user, and the digital content in the VR/AR/MR/XR scene displayed to the user is feedback adjusted depending upon the position tracked at each instant of time. In this way, the digital content is always correctly georeferenced in the VR/AR/MR/XR scene even if the user moves a long distance with the portable VR/AR/MR/XR system, and the user gets a fully immersive experience.

Inventors:
PATTERI GIOVANNI (IT)
SORESINI MASSIMILIANO (IT)
Application Number:
PCT/IB2020/062109
Publication Date:
June 23, 2022
Filing Date:
December 17, 2020
Assignee:
ELIOS S.R.L. (IT)
International Classes:
G01C15/00; G01C21/00; G01C21/16; G06F16/29; G06T17/05; G06T19/00
Domestic Patent References:
WO2012037994A1 (2012-03-29)
Foreign References:
US20200302093A1 (2020-09-24)
EP3165939A1 (2017-05-10)
US20200286289A1 (2020-09-10)
Other References:
Jiebo Luo et al.: "Geotagging in multimedia and computer vision - a survey", Multimedia Tools and Applications, vol. 51, no. 1, 19 October 2010, US, pages 187-211, XP055569453, ISSN: 1380-7501, DOI: 10.1007/s11042-010-0623-y
Attorney, Agent or Firm:
BIESSE S.R.L. (IT)
Claims:
CLAIMS

1. A method for georeferencing digital content in a scene of virtual reality VR or augmented reality AR or mixed reality MR or extended reality XR on a portable VR/AR/MR/XR system (3), comprising:

(A) providing a geolocation device (2), intended to be placed stationary in a position, named initialization position, the geographical coordinates of which may be known a priori and are named initialization geographical coordinates (Ux, Uy, Uz), wherein the geolocation device (2) is equipped with a first satellite antenna (A1);

(B) providing a portable VR/AR/MR/XR system (3) with a second satellite antenna (A2), wherein both first and second satellite antennas (A1, A2) are able to receive the signal from satellites (S) of at least one constellation of global navigation satellites (S) orbiting around the earth;

(C) through the first satellite antenna (A1), determining the latitude, the longitude and optionally the altitude of the geolocation device (2), together referred to as first geographical coordinates (A1x, A1y, A1z);

(D) through the second satellite antenna (A2), determining the latitude, the longitude and optionally the altitude of the portable VR/AR/MR/XR system (3), together referred to as second geographical coordinates (A2x, A2y, A2z);

(E) triangulating the position of the portable VR/AR/MR/XR system (3), i.e. computing enhanced geographical coordinates (En.X, En.Y, En.Z), based on the first geographical coordinates (A1x, A1y, A1z), or the initialization geographical coordinates (Ux, Uy, Uz), and the second geographical coordinates (A2x, A2y, A2z), and

(F) georeferencing at the enhanced geographical coordinates (En.X, En.Y, En.Z) the digital content in a scene of virtual reality or augmented/mixed/extended reality displayed by the portable VR/AR/MR/XR system (3),

(G) tracking the position of the portable VR/AR/MR/XR system (3), as the portable VR/AR/MR/XR system (3) is moved on the ground, and feedback georeferencing the digital content in said scene, by repeating phases (C) to (F) over time.

2. The method according to claim 1, wherein the position of the portable VR/AR/MR/XR system (3) is substantially continuously tracked, and the digital content is substantially continuously georeferenced in said scene, by implementing phase (G) at a frequency of 30 Hz, or higher.

3. The method according to any preceding claim, comprising wirelessly transmitting the first geographical coordinates (A1x, A1y, A1z), or the initialization geographical coordinates (Ux, Uy, Uz), to the portable VR/AR/MR/XR system (3) by the geolocation device (2).

4. The method according to claim 3, wherein said wireless transmission is performed in radiofrequency, preferably at ultra-high frequency UHF.

5. The method according to any preceding claim, wherein phase (E) is carried out by correcting the second geographical coordinates (A2x, A2y, A2z) with the first geographical coordinates (A1x, A1y, A1z), or the initialization geographical coordinates (Ux, Uy, Uz) manually set by the user (U), applying the real-time kinematic RTK positioning technique.
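By way of non-limiting illustration, the correction of phase (E) can be sketched as follows. This shows only the geometric idea behind differential positioning: real RTK operates on carrier-phase observables rather than on already-computed coordinates, and all names and figures below are illustrative assumptions, not taken from the claims.

```python
# Simplified differential correction (illustrative only: real RTK works on
# carrier-phase observables, not on already-computed coordinates).
# The error measured at the stationary base, whose true position (Ux, Uy, Uz)
# is known a priori, is subtracted from the rover fix (A2x, A2y, A2z).

def differential_correction(base_known, base_measured, rover_measured):
    """Return enhanced rover coordinates (En.X, En.Y, En.Z)."""
    # Common-mode GNSS error observed at the base station
    error = tuple(m - k for m, k in zip(base_measured, base_known))
    # Assume the nearby rover sees (almost) the same error
    return tuple(r - e for r, e in zip(rover_measured, error))

# Example: the base reads slightly east of its surveyed position, so the
# same bias is removed from the rover fix.
base_known    = (45.000000, 9.000000, 120.0)   # (Ux, Uy, Uz), surveyed
base_measured = (45.000020, 9.000010, 121.5)   # (A1x, A1y, A1z), antenna A1
rover         = (45.000520, 9.000410, 122.0)   # (A2x, A2y, A2z), antenna A2

enhanced = differential_correction(base_known, base_measured, rover)
```

The rover inherits the common-mode error observed at the stationary base, which is why a nearby base with a priori known coordinates sharpens the rover fix.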

6. The method according to any preceding claim, wherein phase (E) is performed by a processor (4), preferably a processor (4) of the portable VR/AR/MR/XR system (3).

7. The method according to any preceding claim, comprising:

(H) providing the portable VR/AR/MR/XR system (3) with means (5) for implementing simultaneous localization and mapping SLAM of the surrounding environment, for instance at least one Lidar system, and optionally with at least one of an accelerometer (6), a gyroscope (7) and a compass (8), each named sensor, and

(I) using the maps generated at (H), and optionally the signals generated by one or more sensors (6-8), for determining the orientation of the portable VR/AR/MR/XR system (3) in space.

8. The method according to any preceding claim, comprising tracking the vertical position of the portable VR/AR/MR/XR system (3) with respect to the ground by means of at least one Lidar system (5).

9. The method according to any preceding claim, comprising:

(L) providing the portable VR/AR/MR/XR system (3) with means for generating a three-dimensional 3D map of the surrounding environment, for instance at least one Lidar system (5), and defining anchor points POI, that can also be defined as points of interest, in the generated 3D map, wherein each anchor point POI describes the location of a physical object in the real world, and the digital content is feedback georeferenced also with respect to one or more anchor points POI, and optionally said anchor points POI are shared over the internet, for instance in the cloud, among several users of said scene.

10. The method according to any preceding claim, comprising:

- providing the portable VR/AR/MR/XR system (3) with a multimedia device, for instance a smartphone, having a screen, and

- displaying virtual reality or augmented/mixed/extended reality scenes on the screen, with digital content georeferenced at the enhanced geographical coordinates (En.X, En.Y, En.Z).

11. The method according to any preceding claim, comprising:

(M) providing the portable VR/AR/MR/XR system (3) with at least one of an accelerometer (6), a gyroscope (7) and a compass (8), each named sensor, and

(N) determining the distance Dist.A, and optionally the trajectory, walked by the user carrying the portable VR/AR/MR/XR system (3), between a first position having first enhanced geographical coordinates (En.Xt, En.Yt) (latitude, longitude) and a second position having second enhanced geographical coordinates (En.Xt+1, En.Yt+1) (latitude, longitude), based on the sensor (6-8) signals.
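A minimal sketch of how phase (N) might accumulate Dist.A and the trajectory from sensor data, assuming a simple step-detection model (heading from the compass, step length from the accelerometer); the function name and step model are illustrative assumptions, not part of the claims:

```python
import math

# Illustrative dead reckoning: the walked trajectory is accumulated from
# per-step heading (compass) and step length (accelerometer step detection).

def dead_reckon(steps, start=(0.0, 0.0)):
    """steps: list of (heading_deg, step_length_m); returns path and Dist.A."""
    x, y = start
    path = [(x, y)]
    dist = 0.0
    for heading_deg, length in steps:
        h = math.radians(heading_deg)
        x += length * math.sin(h)   # east component
        y += length * math.cos(h)   # north component
        dist += length
        path.append((x, y))
    return path, dist

# Four 0.75 m steps due north, then four due east
path, dist_a = dead_reckon([(0.0, 0.75)] * 4 + [(90.0, 0.75)] * 4)
```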

12. The method according to claim 11, comprising:

(O) eliminating or reducing errors of the signals generated by said sensors (6-8), by applying the Kalman filter to said sensor (6-8) signals.
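Phase (O) can be illustrated with the simplest, scalar form of the Kalman filter; the noise parameters and readings below are illustrative assumptions, not values from the application:

```python
# Minimal scalar Kalman filter of the kind phase (O) refers to: noisy
# sensor readings of a constant quantity are smoothed toward the true value.

def kalman_1d(measurements, q=1e-4, r=0.5, x0=0.0, p0=1.0):
    """q: process noise, r: measurement noise; returns filtered estimates."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                      # predict: uncertainty grows
        k = p / (p + r)             # Kalman gain
        x += k * (z - x)            # update with measurement z
        p *= (1.0 - k)              # uncertainty shrinks
        estimates.append(x)
    return estimates

noisy = [1.2, 0.8, 1.1, 0.9, 1.05, 0.95]   # readings around a true value of 1.0
est = kalman_1d(noisy)
```

Each update blends the prediction with the new reading in proportion to their uncertainties, so random sensor noise is progressively averaged out.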

13. The method according to any preceding claim, comprising:

(P) determining the distance Dist.B walked by the user (U) carrying the portable VR/AR/MR/XR system (3), between a first position having first enhanced geographical coordinates (En.Xt, En.Yt) (latitude, longitude) and a second position having second enhanced geographical coordinates (En.Xt+1, En.Yt+1) (latitude, longitude), by applying the formula Ellipsoidal Earth projected to a plane to said first enhanced geographical coordinates (En.Xt, En.Yt) and said second enhanced geographical coordinates (En.Xt+1, En.Yt+1).
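The "Ellipsoidal Earth projected to a plane" formula of phase (P) is commonly given in the following form, with coefficients in km per degree evaluated at the mean latitude; this is a sketch of that known approximation, not code from the application:

```python
import math

# "Ellipsoidal Earth projected to a plane" distance: accurate to well under
# 1% over the short baselines walked by a user between consecutive fixes.

def planar_distance_km(lat1, lon1, lat2, lon2):
    """Distance in km between two (latitude, longitude) pairs in degrees."""
    mean_lat = math.radians((lat1 + lat2) / 2.0)
    # km per degree of latitude / longitude at the mean latitude
    k1 = (111.13209 - 0.56605 * math.cos(2 * mean_lat)
          + 0.00120 * math.cos(4 * mean_lat))
    k2 = (111.41513 * math.cos(mean_lat) - 0.09455 * math.cos(3 * mean_lat)
          + 0.00012 * math.cos(5 * mean_lat))
    return math.hypot(k1 * (lat2 - lat1), k2 * (lon2 - lon1))

# Dist.B between two enhanced fixes roughly 79 m apart at 45 deg N:
dist_b = planar_distance_km(45.000, 9.000, 45.000, 9.001)
```

Over the few metres between consecutive enhanced fixes, the planar approximation error is negligible compared with the positioning accuracies discussed in the description.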

14. The method according to claim 13 depending upon claim 11 or 12, comprising:

(Q) repeating phase (P) over time, for instance every second or every meter walked by the user (U), and determining the deviation Delta, defined as deviation vector Delta, between Dist.A and Dist.B, and feedback correcting the trajectory walked by the user (U), initially determined based on the sensor signals.

15. The method according to any preceding claim, wherein phases (F) and (G) are implemented without connecting the portable VR/AR/MR/XR system (3) to a remote internet platform.

16. The method according to any preceding claim, comprising providing the portable VR/AR/MR/XR system (3) with an electronic compass (8) and calibrating said compass (8) by walking at least 10 meters from an initial position along the same longitude.

17. The method according to any preceding claim, optionally comprising wirelessly connecting the VR/AR/MR/XR system (3) to a remote internet platform for exchanging content and/or information, and/or for sharing the VR/AR/MR/XR scenes with other users.

18. The method according to any preceding claim, wherein the portable VR/AR/MR/XR system (3) is wearable, for instance it is configured as a visor or helmet, or a handheld device, or it is interfaceable or integrable to/into the control panel of an operating vehicle or a rover, and/or wherein the geolocation device (2) is located on a vehicle, a rover or an unmanned vehicle/drone.

19. A kit (1) comprising a portable virtual reality VR or augmented reality AR, or mixed reality MR or extended reality XR system (3), a geolocation device (2) and processing means (4), wherein:

- the geolocation device (2) is intended to be placed stationary in a position, named initialization position, the geographical coordinates of which may be known a priori and are named initialization geographical coordinates (Ux, Uy, Uz), and is equipped with a first satellite antenna (A1);

- the portable VR/AR/MR/XR system (3) is intended to be carried by a user and is provided with a second satellite antenna (A2);

- both first and second satellite antennas (A1, A2) are set to receive the signal from satellites (S) of at least one constellation of global navigation satellites (S) orbiting around the earth;

- the processing means (4) processes the signal provided by the first satellite antenna (A1) and determines the latitude, the longitude and optionally the altitude of the geolocation device (2), together referred to as first geographical coordinates (A1x, A1y, A1z);

- the processing means (4) processes the signal provided by the second satellite antenna (A2) and determines the latitude, the longitude and optionally the altitude of the portable VR/AR/MR/XR system (3), together referred to as second geographical coordinates (A2x, A2y, A2z);

- the processing means (4) triangulates the position of the portable VR/AR/MR/XR system (3), i.e. it computes enhanced geographical coordinates (En.X, En.Y, En.Z), based on the first geographical coordinates (A1x, A1y, A1z), or the initialization geographical coordinates (Ux, Uy, Uz), and the second geographical coordinates (A2x, A2y, A2z), and

- the portable VR/AR/MR/XR system (3) displays digital content in a scene of virtual reality VR or augmented/mixed/extended reality AR/MR/XR, georeferenced at the enhanced geographical coordinates (En.X, En.Y, En.Z);

- the processing means (4) computes the enhanced geographical coordinates (En.Xt, En.Yt, En.Zt; En.Xt+1, En.Yt+1, En.Zt+1; En.Xt+n, En.Yt+n, En.Zt+n) over time, and the digital content is feedback georeferenced on the basis of the enhanced geographical coordinates (En.Xt, En.Yt, En.Zt), as the portable VR/AR/MR/XR system is moved.

20. The kit (1) according to claim 19, wherein computing the enhanced geographical coordinates and feedback georeferencing the digital content is performed at a frequency of 30 Hz, or higher, to achieve a substantially continuous position tracking of the portable VR/AR/MR/XR system (3) and a substantially continuous georeferencing of said digital content.

21. The kit (1) according to claim 19 or claim 20, wherein the geolocation device (2) comprises a wireless interface and transmits the first geographical coordinates (A1x, A1y, A1z), or the initialization geographical coordinates (Ux, Uy, Uz), to the portable VR/AR/MR/XR system (3).

22. The kit (1) according to claim 21, wherein said wireless interface (R1, R2) is a radio transmitter, preferably operating at ultra-high frequency UHF.

23. The kit (1) according to any preceding claim, wherein the processing means (4) comprises program means programmed for carrying out said triangulation by correcting the second geographical coordinates (A2x, A2y, A2z) with the first geographical coordinates (A1x, A1y, A1z), or the initialization geographical coordinates (Ux, Uy, Uz), applying the real-time kinematic RTK positioning technique.

24. The kit (1) according to any preceding claim, wherein the portable VR/AR/MR/XR system (3) is provided with means (5) for implementing simultaneous localization and mapping SLAM of the surrounding environment, for instance at least one Lidar system, and optionally with at least one of an accelerometer (6), a gyroscope (7) and a compass (8), each named sensor, and the processing means (4) determines the orientation of the portable VR/AR/MR/XR system (3) in space on the basis of said mapping and sensor signals.

25. The kit (1) according to any preceding claim, comprising at least one Lidar system (5) and wherein the processing means (4) tracks the vertical position of the portable VR/AR/MR/XR system (3) with respect to the ground by means of said at least one Lidar system (5).

26. The kit (1) according to any preceding claim, wherein the portable VR/AR/MR/XR system (3) comprises a multimedia device, for instance a smartphone, having a screen, and virtual reality VR or augmented reality AR scenes are displayed on the screen, with digital content georeferenced at the enhanced geographical coordinates.

27. The kit (1) according to any preceding claim, wherein the portable VR/AR/MR/XR system (3) comprises at least one of an accelerometer (6), a gyroscope (7) and a compass (8), each named sensor, and the processing means are programmed for determining the distance Dist.A, and optionally the trajectory, walked by the user (U) carrying the portable VR/AR/MR/XR system (3), between a first position having first enhanced geographical coordinates (En.Xt, En.Yt) (latitude, longitude) and a second position having second enhanced geographical coordinates (En.Xt+1, En.Yt+1) (latitude, longitude), based on the sensor signals.

28. The kit (1) according to claim 27, wherein the processing means applies the Kalman filter to said sensor signals.

29. The kit (1) according to any preceding claim, wherein the processing means (4) are programmed for determining the distance Dist.B walked by the user (U) carrying the portable VR/AR/MR/XR system (3), between a first position having first enhanced geographical coordinates (En.Xt, En.Yt) (latitude, longitude) and a second position having second enhanced geographical coordinates (En.Xt+1, En.Yt+1) (latitude, longitude), by applying the formula Ellipsoidal Earth projected to a plane to said first enhanced geographical coordinates (En.Xt, En.Yt) and said second enhanced geographical coordinates (En.Xt+1, En.Yt+1).

30. The kit (1) according to claim 29 depending upon claim 27 or claim 28, wherein the processing means (4) are programmed for determining the distance Dist.B over time, for instance every second or every meter walked by the user, and determining the deviation Delta, defined as deviation vector Delta, between Dist.A and Dist.B, and feedback correcting the trajectory walked by the user, initially determined based on the sensor signals.

31. The kit (1) according to any preceding claim, wherein the portable VR/AR/MR/XR system (3) comprises an electronic compass (8), and the processing means calibrates said compass upon the user walking at least 10 meters from an initial position along the same longitude.

32. The kit (1) according to any preceding claim, wherein the portable VR/AR/MR/XR system (3) comprises a wireless interface able to connect to a remote internet platform for exchanging content and/or information, and/or for sharing the VR/AR/MR/XR scenes with other users.

33. The kit (1) according to any preceding claim, wherein the VR/AR/MR/XR system (3) is wearable, for instance it is configured as a visor or helmet, or a handheld device, or it is interfaceable or integrable to/into the control panel of an operating vehicle or a rover, and/or wherein the geolocation device is located on a vehicle, a rover or an unmanned vehicle/drone.


Description:
Method and system for georeferencing digital content in a scene of virtual reality or augmented/mixed/extended reality


Field of the invention

The present invention relates to a method and a system for georeferencing digital content, with high precision, in a scene of virtual reality VR, or augmented reality AR or mixed reality MR, or extended reality XR, in particular for applications in sectors such as construction, tourism, recreational/entertainment, education, training and the like.

State of the art

Virtual reality VR may be defined, in the simplest way, as a set of images and sounds, produced by a computer, that seem to represent a place or a situation that a person can take part in.

Currently the most common VR systems use VR headsets to generate realistic images, sounds and other sensations that simulate a user's physical presence in a virtual environment. A person using VR headsets is able to look around the artificial world, move around in it, and interact with virtual features or items.

The simplest VR headsets consist of a head-mounted display with a small screen in front of the eyes, but they can also be equipped with, or associated with, other devices to produce sounds, tactile feedback, and wireless connections to remote devices or the internet. In particular, modern VR headset displays are based on technology developed for smartphones, including: gyroscopes and motion sensors for tracking head, body, and hand positions; small HD screens for stereoscopic displays; and small, lightweight and fast processors.

VR can simulate real workspaces for workplace occupational safety and health purposes, educational purposes, and training purposes. It can be used to provide learners with a virtual environment where they can develop their skills without the real-world consequences of failing. It has been used and studied in primary education, anatomy teaching, astronaut training, flight simulators, miner training, architectural and engineering design, driver training, industrial plant inspection and so on.

Immersive VR engineering systems enable engineers to see virtual prototypes prior to the availability of any physical prototypes. VR has proved very useful for both engineering educators and students. The most significant element lies in the ability of students to interact with 3D models that accurately respond based on real-world possibilities. Future architects and engineers benefit greatly from being able to form an understanding of spatial relationships and to provide solutions based on real-world future applications.

Augmented reality AR may be defined, in the simplest way, as an interactive experience of a real-world environment where the objects that reside in the real world are enhanced by computer-generated information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory. In other words, AR is a system that fulfils three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects.

The overlaid sensory information can be constructive (i.e. additive to the natural environment), or destructive (i.e. masking of the natural environment). This experience is seamlessly interwoven with the physical world such that it is perceived as an immersive aspect of the real environment. In this way, AR alters one's ongoing perception of a real-world environment, whereas VR completely replaces the user's real-world environment with a simulated one.

AR is often referred to using the following synonymous terms: mixed reality MR and computer-mediated reality.

The primary value of AR/MR is the manner in which components of the digital world blend into a person's perception of the real world, not as a simple display of data, but through the integration of immersive sensations, which are perceived as natural parts of an environment. AR/MR is used to enhance natural environments or situations and offer perceptually enriched experiences. With the help of advanced AR/MR technologies, for instance computer vision, object recognition and the incorporation of AR/MR cameras into smartphone applications, the information about the user's surrounding real world becomes interactive and digitally manipulable. Information about the environment and its objects, i.e. digital virtual content, is overlaid on the real world.

Extended reality XR is another emerging term for all immersive technologies, including augmented reality AR, virtual reality VR, and mixed reality MR, plus those that are yet to be created. Immersive, or XR, technologies, extend the reality we experience by either merging the virtual and real worlds, or by creating a fully immersive experience.

In the present application augmented reality AR, mixed reality MR and extended reality XR shall be considered synonyms and often reference is made to AR/MR/XR reality to encompass all the cases.

In recent times, VR and AR/MR/XR have been used more and more often by engineers, architects, technicians, maintenance workers, etc., basically all the technical staff working on a construction site, to simulate operations on the site, and in particular to double check accuracy and correctness of the projects, before operations commence, and during operations, before buildings, facilities and installations are completed.

A known AR system used in construction is the Trimble® SiteVision™ outdoor AR system (geospatial.trimble.com/sitevision), manufactured by the company Trimble Inc., U.S.A. This AR system is a portable device having a handle carrying batteries, with a smartphone and a large satellite antenna both supported by the handle. The large satellite antenna is intended to detect the signals of satellites of a constellation of satellites, for determining the geographical coordinates commonly known as GPS coordinates. In turn, the smartphone is provided with a camera, an accelerometer, a compass, its own satellite antenna and software that mixes digital content with the images taken by the camera, according to AR techniques, by assuming that the position of the smartphone coincides with the position of the GPS coordinates determined by the large antenna, since these elements are vertically aligned above the handle. The digital content may be, for instance, the 2D/3D engineering model of a facility, a house, a road, a bridge, etc., that has to be built nearby, or technical information of any kind, for example a BIM model, and it is overlaid on the actual footage taken by the smartphone, in real time, so that the user sees on the screen of the smartphone both the real landscape in front of him/her and the digital content mixed with the images of the landscape, oriented and localized within the same, i.e. georeferenced. As the user turns the device left and right, or tilts it upwards or downwards, the digital content mixed with the images of the real landscape is updated accordingly, i.e. it is re-oriented and re-positioned in real time or, in other words, the content is constantly georeferenced with respect to the actual position and orientation of the AR system in space.

Similar AR devices are manufactured by the company Leica Geosystems AG, Switzerland.

As anticipated above, known AR systems which georeference digital content when mixing it with images (photographs or video) taken by cameras usually rely on a single large satellite antenna that receives the signals from one or more satellite constellations among GPS, GLONASS, Galileo, QZSS, SBAS, L-Band. For instance, Trimble® SiteVision™ features an antenna having a diameter of about 12.5 cm. In fact, the satellite antenna of the smartphones currently available on the market is small, and the accuracy of the GPS coordinates determined with such small antennas is only 3-5 meters, which is clearly too low to correctly georeference digital content in AR applications, especially in the engineering field. The accuracy of the determination of the GPS coordinates made possible by the large antennas used by known AR systems is much higher, typically 1 cm + 1 ppm RMS horizontally and 2 cm + 1 ppm RMS vertically, when the AR system is stationary. Although large satellite antennas permit determining GPS coordinates with such higher accuracy when the AR system is stationary in a given initial position, compared with the accuracy achievable using the satellite antenna of the smartphone, the accuracy of the position determination is unsatisfactory while the AR system is being moved by the user across the construction site, before the user stops to get a new determination: the error typically ranges from 2 meters to 3 meters at a distance of 100-200 meters from the initial position, and the accuracy may be further reduced when the satellite signal is weak, for instance inside a building, below trees, or when it is cloudy.
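To put the quoted figures in perspective, the "1 cm + 1 ppm RMS" specification scales with the baseline, adding one millimetre of error per kilometre of distance from the reference position. A quick check (the helper function is an illustrative assumption, not from the source):

```python
# RTK-grade stationary accuracy: a fixed term plus a distance-dependent
# parts-per-million term.

def horizontal_rms_cm(baseline_m, fixed_cm=1.0, ppm=1.0):
    """RMS horizontal error in cm for a given baseline in metres."""
    # the ppm term contributes ppm * 1e-6 of the baseline; convert m to cm
    return fixed_cm + ppm * 1e-6 * baseline_m * 100.0

# Stationary, 200 m from the initial position:
err_200m = horizontal_rms_cm(200.0)
```

So a stationary RTK-grade fix 200 m from the initial position is still worth about 1.02 cm, versus the 2-3 m drift reported above while the system is moving.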

Some methods have been developed seeking to maximize the accuracy of the determination of the position and orientation of AR/MR/XR systems and, consequently, seeking to maximize the accuracy of the georeferencing of the digital content in AR/MR/XR applications.

One method is based on an algorithm called SLAM, the acronym of simultaneous localization and mapping, which consists of building a map of the environment surrounding the AR system at a given moment, while localizing the user in that map at the same time. SLAM algorithms have been largely used for guiding autonomous vehicles, precisely for the purpose of constantly mapping the environment around the vehicle while navigating the vehicle at the same time.

Environmental mapping can be achieved, for instance, by processing the images taken by one or more cameras, or using the LIDAR technique (sometimes also referred to as LADAR), which is a method for measuring distances (ranging) by illuminating the target with laser light and measuring the reflection with a sensor: differences in laser return times and wavelengths are used to make digital 3D representations of the target. Therefore, the result of applying a SLAM algorithm to a building or a bridge is a 3D model of that building or bridge.
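The ranging principle mentioned above reduces to a time-of-flight computation: the laser pulse travels to the target and back, so the range is half the round-trip time multiplied by the speed of light. A minimal sketch (the figures are illustrative):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def lidar_range_m(round_trip_s):
    """Range to the target from a measured round-trip time, in metres."""
    # the pulse covers the distance twice (out and back)
    return C * round_trip_s / 2.0

# A return delayed by about 66.7 nanoseconds means a target roughly 10 m away
r = lidar_range_m(66.7e-9)
```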

By cross-correlating the information obtained with the SLAM algorithm and the geographical coordinates obtained with the large satellite antenna, the AR/MR/XR system tries to re-calibrate itself, i.e. it tries to minimize the positioning errors and to maximize the precision of the positioning (geographical location and orientation) of the AR/MR/XR system in space, at the site, as the user moves across the site, as well as the precision of the georeferencing of the digital content which is displayed to the user, mixed with the content captured by the smartphone camera.

Another re-calibration method is based on using markers across the site, i.e. identifiers the position of which is precisely known a priori, and which are recognized by the AR/MR/XR system as soon as the user walks or moves within a given distance from the marker. Common markers are visual markers, but radio or infrared light emitting markers can also be used, as described, for instance, in US 2020/0286289. For example, several markers can be located on a construction site in order to define a grid of so-called points of interest POIs, optionally with some markers placed indoors, and the user of the AR/MR/XR system may walk around the site, visualizing 3D models of buildings or facilities on the smartphone screen, together with the images of the background, and the AR system re-calibrates every time it gets within range of a marker, thereby correcting the positioning initially determined using only the large satellite antenna.

Markers allow for re-calibrating the AR/MR/XR system at the POIs, but bring no advantage when the AR/MR/XR system is out of POI range: the drawback of low accuracy in tracking the position remains between the POIs and, on the other hand, the number of markers that can be installed on a site is necessarily limited.

To clarify, an error of 2-3 meters after having walked 200 meters away from the initial position with the AR/MR/XR system leads to a wrong georeferencing of the digital content: for instance, the 3D model of a bridge is shown on the smartphone screen of the AR/MR/XR system not aligned with the corresponding tunnel, but shifted by 2-3 meters or, similarly, the 3D model of a pipeline which is supposed to intersect an opening in a pumping tower instead shows the pipeline ending in the open field, alongside the tower. These are simple examples of unacceptable positioning errors. Similar errors occur in applications other than engineering, for instance in education/entertainment, when digital information about an installation in a museum does not match what is actually shown on the screen of the smartphone, etc.

Some manufacturers of VR/AR/MR/XR systems provide wireless internet connectivity that allows the VR/AR/MR/XR system to gather information from a remote platform and use this information to correct positioning errors. Typically, these solutions require a paid subscription to the remote platform, which constitutes a drawback for the end user and, above all, they do not work in places that are not reached by a data signal, such as at sea or in rural areas.

Summary of the invention

It is an object of the present invention to provide a method and a system for tracking the position of a portable VR/AR/MR/XR system with high precision, namely less than 1 cm, in real time, optionally without connecting to a remote internet platform, and, consequently, for correctly georeferencing content in VR/AR/MR/XR applications.

A first aspect of the present invention therefore relates to a method according to claim 1 for georeferencing digital content in a scene of virtual reality VR or augmented/mixed/extended reality AR/MR/XR on a portable VR/AR/MR/XR system.

In the present application, the expression digital content means any content that is intended to be displayed on the portable VR/AR/MR/XR system and seen by the user, either alone in a scene of virtual reality VR, or mixed with images of the real world in a scene of augmented/mixed/extended reality AR/MR/XR. Examples of digital content are: CAD drawings of buildings, renderings of vehicles, persons, gardens, fountains, etc., written information like technical charts, and the like.

The term georeferencing, referred to digital content, means determining the correct mathematical correlation between the reference system of the digital content and the reference system of a VR/AR/MR/XR scene, in an immersive way, so that the user watching the VR/AR/MR/XR scene has the impression that the digital content is part of that scene, i.e. it fits the scene as if it were not simply overlaid, but part of the same scene. This means that as the VR/AR/MR/XR scene changes, for instance because the framing of the scene changes, the digital content is adapted to the new framing, in terms of position, perspective, orientation, illumination, shadows, transparency, distance, angle, dimensions, etc. For example, if the digital content is a CAD model of a bridge and the AR/MR/XR scene is intended to show the real valley between two hills where the bridge has to be erected, georeferencing the CAD model means overlaying the CAD model onto a photograph or a video of the valley, so that the user has the impression that the bridge actually extends between the hills in the designed position, with the same perspective and point of view as the photos/videos.
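In practice, georeferencing of this kind amounts to mapping the geographic coordinates at which the content is anchored into the local scene frame of the viewer. A deliberately simplified flat-earth sketch of that mapping, using an east-north-up approximation valid over the short ranges of a construction site; the function name and figures are illustrative assumptions, not the application's implementation:

```python
import math

# Place a geographically anchored object into the local east-north-up (ENU)
# frame of the viewer, using a flat-earth approximation for short ranges.

EARTH_R = 6_371_000.0  # mean Earth radius, m

def geodetic_to_enu(lat, lon, alt, ref_lat, ref_lon, ref_alt):
    """Approximate east/north/up offset in metres of a point from a reference."""
    d_lat = math.radians(lat - ref_lat)
    d_lon = math.radians(lon - ref_lon)
    east = d_lon * EARTH_R * math.cos(math.radians(ref_lat))
    north = d_lat * EARTH_R
    up = alt - ref_alt
    return east, north, up

# A bridge model anchored about 111 m due north of the viewer's position:
viewer = (45.0000, 9.0000, 120.0)   # viewer's enhanced coordinates
anchor = (45.0010, 9.0000, 120.0)   # geographic anchor of the content
east, north, up = geodetic_to_enu(*anchor, *viewer)
```

Once expressed in the local frame, the content can be drawn at that offset and re-projected every time a new enhanced position arrives.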

The method according to the present invention can be used in a variety of fields, for instance: engineering, architecture, tourism, education, medical applications like surgery, sport training, gaming, tactical training, etc.

The method comprises the following phases:

(A) providing a geolocation device, intended to be placed stationary in a position in the field, named initialization position, the geographical coordinates of which may be known a priori and are named initialization geographical coordinates Ux, Uy, Uz, wherein the geolocation device is equipped with a first satellite antenna;

(B) providing a portable VR/AR/MR/XR system with a second satellite antenna.

The geolocation device may be a self-standing electronic device that can be carried by the user and placed on the ground, for instance supported by a tripod, or it can be an electronic device located onboard a vehicle, for instance a truck or a car, or an unmanned vehicle, like a drone, so that it can be positioned on the ground remotely, from a distance. The initialization geographical coordinates of the geolocation device being known a priori means that such coordinates can either be measured accurately on the field by the user, for instance by using existing buildings, roads, electric lines, or the like as references, or they can be precisely determined from cartographic scale maps, cadastral maps, maps provided by the municipality or other official sources, etc. The coordinates include the latitude and the longitude, and may also include the altitude.

The portable VR/AR/MR/XR system is configured to be carried by the user as he/she moves on the ground, for instance while the user walks or takes an elevator, turns or tilts the head, etc. The portable VR/AR/MR/XR system has a display for showing a scene of virtual reality VR or augmented/mixed/extended reality AR/MR/XR to the user. Examples of a portable VR/AR/MR/XR system are a tablet, a smartphone, a laptop, a headset, or a visor to be fitted before the eyes.

Both satellite antennas, i.e. the first satellite antenna of the geolocation device and the second satellite antenna of the portable VR/AR/MR/XR system, are configured to receive the signal from satellites of at least one constellation of global navigation satellites orbiting around the earth like, for example, GPS, GLONASS, Galileo, QZSS, SBAS, etc. Preferably, the precision of these antennas is 1 cm or better.

The method also comprises:

(C) through the first satellite antenna, determining the latitude, the longitude and optionally the altitude of the geolocation device, together referred to as first geographical coordinates A1x, A1y, A1z;

(D) through the second satellite antenna, determining the latitude, the longitude and optionally the altitude of the portable VR/AR/MR/XR system, together referred to as second geographical coordinates A2x, A2y, A2z.

Since the user carrying the portable VR/AR/MR/XR system moves with respect to the geolocation device, there is a distance between the two devices that can be taken into account to triangulate the position of the portable VR/AR/MR/XR system, as described in the following phase (E) of the method: (E) triangulating the position of the portable VR/AR/MR/XR system, i.e. computing what can be defined as enhanced geographical coordinates En.X, En.Y, and optionally En.Z, based on either the first geographical coordinates A1x, A1y, A1z or the initialization geographical coordinates Ux, Uy, Uz, and on the second geographical coordinates A2x, A2y, A2z.

Triangulating the available coordinates improves the accuracy of the tracking of the position of the portable VR/AR/MR/XR system, because the enhanced geographical coordinates En.X, En.Y, En.Z can be determined with sub-centimetric precision, i.e. better than 1 cm.

Upon obtaining the enhanced geographical coordinates En.X, En.Y, En.Z, the method provides:

(F) georeferencing at the enhanced geographical coordinates En.X, En.Y, En.Z the digital content in a scene of virtual reality VR or augmented/mixed/extended reality AR/MR/XR displayed on the portable VR/AR/MR/XR system.

Basically, the digital content is georeferenced by using the most accurate coordinates available, i.e. the enhanced geographical coordinates En.X, En.Y, En.Z of the portable VR/AR/MR/XR system, instead of simply using the second geographical coordinates A2x, A2y, A2z, leading to sub-centimetric precision of the positioning of the digital content in the scene seen by the user.

In order to maintain the accuracy of the positioning of the digital content, the method provides:

(G) tracking the position of the portable VR/AR/MR/XR system, as the portable VR/AR/MR/XR system is moved on the ground by the user or with the user, and feedback georeferencing the digital content in the scene displayed to the user, by repeating the aforementioned phases (C) to (F) over time.

Phase (G) provides important advantages: instead of losing accuracy when georeferencing the digital content in the VR/AR/MR/XR scene between two positions walked by the user (initial and final), phase (G) provides for continuously tracking the position of the portable VR/AR/MR/XR system as it is moved by the user, and this is achieved by repeating phases (C) to (F) over time, at a proper frequency, as a routine. In this way, digital content is always correctly georeferenced in the VR/AR/MR/XR scene even if the user is moving a long distance with the portable VR/AR/MR/XR system. Every time a new triangulation of coordinates is performed and enhanced geographical coordinates En.X, En.Y, En.Z are determined, the portable VR/AR/MR/XR system feedback adjusts the VR/AR/MR/XR scene by determining the univocally corresponding position of the digital content; in other words, the digital content is continuously georeferenced in the VR/AR/MR/XR scene, in feedback, based upon the enhanced geographical coordinates computed each time (En.Xt, En.Yt, En.Zt; En.Xt+1, En.Yt+1, En.Zt+1; En.Xt+n, En.Yt+n, En.Zt+n), and not only upon the second geographical coordinates A2x1, A2y1, A2z1 of the initial position and the second geographical coordinates A2x2, A2y2, A2z2 of the final position of the user.

This feature may be defined as positional tracking.

The proposed method achieves sub-centimetric precision of the digital content positioning in the VR/AR/MR/XR scene and sub-centimetric precision of the positioning of the portable VR/AR/MR/XR system in the field, even without having the portable VR/AR/MR/XR system connected to the internet and to remote platforms that provide correction of coordinates, and even if the user walks long distances, for instance 500 meters or more.

In the present description the first geographical coordinates A1x, A1y, A1z, the initialization geographical coordinates Ux, Uy, Uz, the second geographical coordinates A2x, A2y, A2z and the enhanced geographical coordinates En.X, En.Y, En.Z are not limited to a specific system, meaning that the coordinate system can be 3D Cartesian, earth-centred earth-fixed, stereographic, UTM and UPS, horizontal (latitude, longitude), etc. Therefore, the skilled person should intend references A1x, A1y, A1z, Ux, Uy, Uz, A2x, A2y, A2z and En.X, En.Y, En.Z as merely exemplary.

In the preferred embodiment of the method the position of the portable VR/AR/MR/XR system is substantially continuously tracked, and the digital content is substantially continuously feedback georeferenced in the VR/AR/MR/XR scene, by implementing phase (G) at a frequency of 30 Hz, or a higher frequency. This frequency has proven suitable to achieve the described advantages when the user walks, runs or moves on a vehicle, which makes the method suitable also for gaming and training, not only for engineering and architecture applications.
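By way of illustration only, the repetition of phases (C) to (F) at the chosen frequency can be sketched as a timed loop; all function names below are placeholders for the phases described above, not part of the invention:

```python
import time

TRACKING_HZ = 30  # frequency indicated for phase (G)

def positional_tracking(read_antennas, triangulate, georeference, running):
    """Repeat phases (C)-(F) at TRACKING_HZ while running() is true.

    read_antennas(): returns (A1, A2) coordinate tuples  -- phases (C), (D)
    triangulate(a1, a2): returns enhanced coordinates En -- phase (E)
    georeference(en): updates the VR/AR/MR/XR scene      -- phase (F)
    """
    period = 1.0 / TRACKING_HZ
    while running():
        t0 = time.monotonic()
        a1, a2 = read_antennas()
        en = triangulate(a1, a2)
        georeference(en)
        # Sleep for the remainder of the period to hold the target rate.
        time.sleep(max(0.0, period - (time.monotonic() - t0)))
```

The loop re-reads both antennas, re-triangulates and re-georeferences on every cycle, which is the feedback behaviour of phase (G).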

Preferably the first geographical coordinates A1x, A1y, A1z and/or the initialization geographical coordinates Ux, Uy, Uz are transmitted wirelessly by the geolocation device to the portable VR/AR/MR/XR system. A wired transmission would be a clear alternative, although less practical.

In the preferred embodiment the wireless transmission is performed in radiofrequency, preferably at ultra-high frequency UHF, by fitting proper emitter and receiver into the geolocation device and the portable VR/AR/MR/XR system.

Preferably phase (E), i.e. the triangulation, is carried out by correcting the second geographical coordinates A2x, A2y, A2z with the first geographical coordinates A1x, A1y, A1z, or with the initialization geographical coordinates Ux, Uy, Uz manually set or input by the user, wherein these coordinates are considered taken in the same instant of time, by applying the real-time kinematic RTK positioning technique. An overview of this technique is given in the following presentation: https://www.gps.gov/cgsic/meetings/2009/gakstatter1.pdf.
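Purely as an illustration of the correction idea underlying this phase, the following sketch applies the base-station error vector to the rover fix; real RTK operates on carrier-phase observables rather than on the computed coordinates themselves, and all names here are illustrative:

```python
# Simplified differential-correction sketch (illustrative only).
# The geolocation device's known initialization coordinates are compared
# with its measured coordinates; the resulting error vector is then
# removed from the portable system's fix taken at the same instant.

def correct_position(base_known, base_measured, rover_measured):
    """Apply the base-station error vector to the rover fix.

    base_known:     (Ux, Uy, Uz), known a priori
    base_measured:  (A1x, A1y, A1z), from the first satellite antenna
    rover_measured: (A2x, A2y, A2z), from the second satellite antenna
    Returns the enhanced coordinates (En.X, En.Y, En.Z).
    """
    error = tuple(m - k for m, k in zip(base_measured, base_known))
    return tuple(r - e for r, e in zip(rover_measured, error))
```

Because both antennas see largely the same atmospheric and orbital errors, subtracting the base station's error from the rover's fix cancels most of the common error, which is the intuition behind the RTK correction.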

Preferably the correction, and any other computing, can be performed by processing means of the portable VR/AR/MR/XR system, for instance a CPU or any programmable electronic unit set in communication with the geolocation device.

In the preferred embodiment a phase (H) is implemented, which corresponds to implementing simultaneous localization and mapping (SLAM) of the surrounding environment by the portable VR/AR/MR/XR system. SLAM is performed by dedicated means, for instance at least one Lidar system, and optionally with at least one among an accelerometer, a gyroscope and a compass, each of these being generically named sensor. For instance, if the portable VR/AR/MR/XR system consists of a tablet or a smartphone featuring one or more cameras, SLAM can also be performed by processing the images/video captured by the camera(s) to obtain a map of the environment.

In a preferred phase (I), subsequent to phase (H), the maps generated as described above, and optionally the signals generated by the accelerometer, the gyroscope and the compass, are used for determining the orientation of the portable VR/AR/MR/XR system in space. In fact, phases (A) to (G) permit achieving extremely high accuracy in the determination of the position of the portable VR/AR/MR/XR system, in terms of latitude, longitude and optionally altitude, but these coordinates do not comprise sufficient information to determine the orientation of the portable VR/AR/MR/XR system in space; phase (I) fulfils this task, by permitting the determination of the orientation, for instance mathematically expressed in terms of angle with respect to the magnetic north and angle with respect to an artificial horizon.

In the preferred embodiment of the method the vertical position of the portable VR/AR/MR/XR system with respect to the ground is tracked by means of at least one Lidar system. The Lidar system scans the environment surrounding the portable VR/AR/MR/XR system and the distance of any scanned surface is determined by analysing the wavelength of the reflected light. In this way, the distance of the portable VR/AR/MR/XR system from the ground is determined. This feature can be useful, for instance, when a user standing at the top of a building or a bridge wants to know the precise height of the building/bridge.

Preferably the method comprises:

(L) providing the portable VR/AR/MR/XR system with means for generating a three-dimensional 3D map of the surrounding environment, for instance at least one Lidar system, and defining anchor points or points of interest POI in the generated 3D map, wherein each anchor point POI describes the location of a physical object in the real world. Once one or more POIs are defined in the 3D digital map of the surroundings, the digital content is feedback georeferenced with respect to at least one POI.

In case the portable VR/AR/MR/XR system can be connected to the internet, the anchor points POIs can be shared over the internet, for instance in the cloud, among several users of the same VR/AR/MR/XR scene. In this way, several users can share the same experience at the same time or at different times. For example, if a first user has set an anchor POI at the top of a bridge pillar, this being real or virtual, a second user can have his/her VR/AR/MR/XR scene georeferenced to the same POI even after days or weeks, as construction continues.

The method according to the present invention is preferably implemented, as anticipated above, by providing the portable VR/AR/MR/XR system with a multimedia device, for instance a smartphone, having a screen, and by displaying virtual reality VR or augmented reality AR scenes on the screen, wherein the digital content is georeferenced at the enhanced geographical coordinates and is most preferably oriented in the space by referring to the information gathered through SLAM, Lidar system and sensors.

In a preferred phase (M) the portable VR/AR/MR/XR system is equipped with at least one sensor among an accelerometer, a gyroscope and a compass, and a phase (N) is carried out:

(N) determining the distance Dist.A, and possibly the trajectory, walked by the user carrying the portable VR/AR/MR/XR system, between a first position having first enhanced geographical coordinates En.Xt, En.Yt (latitude, longitude) and a second position having second enhanced geographical coordinates En.Xt+1, En.Yt+1, based on the sensor signals.

Basically, the portable VR/AR/MR/XR system equipped with the sensors is capable of calculating Dist.A and the trajectory by using the sensors, with a different accuracy than when implementing phases (C) to (G) only. Clearly, this information can be used together with the continuous position tracking implemented with phase (G) to perform a real-time double check on the accuracy, as will now be described. In a phase (O), subsequent to phase (N), errors of the signals generated by said sensors are reduced or eliminated by applying the Kalman filter to said sensor signals. The Kalman filter, also known as linear quadratic estimation (LQE), is a known algorithm that uses a series of measurements observed over time, containing statistical noise and other inaccuracies, and produces estimates of unknown variables that tend to be more accurate than those based on a single measurement alone, by estimating a joint probability distribution over the variables for each timeframe. An overview of the Kalman filter is given at this link:
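As an illustrative sketch only of how phase (O) might smooth a noisy sensor reading, a minimal scalar Kalman filter is shown below; the noise variances q and r are arbitrary assumptions, not values taken from the invention:

```python
class Kalman1D:
    """Minimal scalar Kalman filter (sketch of phase (O)).

    Smooths a noisy reading (e.g. a distance estimate) by blending a
    constant-state prediction with each new measurement.
    q: process noise variance, r: measurement noise variance.
    """
    def __init__(self, x0, p0=1.0, q=1e-4, r=0.04):
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def update(self, z):
        # Predict: state unchanged, uncertainty grows by process noise.
        self.p += self.q
        # Update: blend prediction and measurement by the Kalman gain.
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x
```

Feeding the filter repeated noisy measurements of the same quantity makes the estimate converge toward the underlying value while the estimated uncertainty p shrinks.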

Preferably the method also comprises:

(P) determining the distance Dist.B walked by the user carrying the portable VR/AR/MR/XR system, between a first position having first enhanced geographical coordinates En.Xt, En.Yt and a second position having second enhanced geographical coordinates En.Xt+1, En.Yt+1, by applying the formula Ellipsoidal Earth projected to a plane to said first enhanced geographical coordinates En.Xt, En.Yt and said second enhanced geographical coordinates En.Xt+1, En.Yt+1. The formula Ellipsoidal Earth projected to a plane is a formula prescribed by the Federal Communications Commission (FCC) of the United States for computing distances. Alternatively, other known formulas may be applied for computing distances starting from geographical coordinates.
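The FCC formula mentioned above can be sketched in a few lines; the coefficients below are those prescribed in 47 CFR 73.208 (valid for distances up to roughly 475 km), and the function name is illustrative:

```python
import math

def fcc_flat_earth_distance(lat1, lon1, lat2, lon2):
    """Distance in kilometres by the FCC 'ellipsoidal Earth projected
    to a plane' formula (47 CFR 73.208). Inputs in decimal degrees."""
    mid = math.radians((lat1 + lat2) / 2.0)
    # Kilometres per degree of latitude / longitude at the mean latitude.
    k1 = 111.13209 - 0.56605 * math.cos(2 * mid) + 0.00120 * math.cos(4 * mid)
    k2 = 111.41513 * math.cos(mid) - 0.09455 * math.cos(3 * mid) \
         + 0.00012 * math.cos(5 * mid)
    return math.hypot(k1 * (lat2 - lat1), k2 * (lon2 - lon1))
```

Applied to the pairs (En.Xt, En.Yt) and (En.Xt+1, En.Yt+1), this yields Dist.B to be compared against the sensor-derived Dist.A.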

A subsequent phase (Q) consists of repeating phase (P) over time, for instance every second or every meter walked by the user, to determine the deviation Delta, defined as deviation vector Delta, between Dist.A and Dist.B, and feedback correcting the trajectory walked by the user as initially determined based on the sensor signals.

Phases (O) to (Q) permit achieving the best results from the sensors in terms of accuracy of the signals, i.e. of the information, provided. This information is used by the portable VR/AR/MR/XR system to determine the distance walked by the user and his/her trajectory with sub-centimetric precision. As anticipated above, phases (A) to (G) can be implemented without connecting the portable VR/AR/MR/XR system to a remote internet platform, because the method provides for high precision position tracking even without support from remote platforms. Nevertheless, this feature may be implemented in the portable VR/AR/MR/XR system to achieve an even higher precision, or simply to allow sharing of content and information among several users located in the same place or at different places, at the same time or at different times.

Preferably the method is implemented by equipping the portable VR/AR/MR/XR system with an electronic compass. Calibrating the compass is performed by having the user walk at least 10 meters, and preferably 100 meters, from an initial position along the same longitude; the user may refer to a validated physical map or a traditional magnetic compass to identify a same longitude during the calibration process.

Even more preferably, calibrating the compass is performed by having the user walk at least one tenth of the maximum dimension of the digital content to be displayed: for example, if the digital content is a rectangular parking lot having dimensions of 200x100 metres, the VR/AR/MR/XR system opens a wizard to guide the user to walk at least 20 metres, i.e. one tenth of the maximum length of the parking lot, along a same longitude. The VR/AR/MR/XR system helps the user walk along a same longitude, by correcting the user's path with visual and/or audible indications, if necessary, using the processed signals obtained from the satellite antennas and/or sensors. The calibration process ends when the VR/AR/MR/XR system acquires or memorizes the north, which is basically a cartographic north.
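The minimum calibration walk described in the two paragraphs above reduces to a simple rule, sketched here with illustrative names (the 10 m floor comes from the preceding paragraph):

```python
def calibration_walk_metres(content_max_dimension_m, floor_m=10.0):
    """Minimum distance the user must walk along a constant longitude:
    one tenth of the digital content's maximum dimension, never below
    the 10 m floor stated above."""
    return max(floor_m, content_max_dimension_m / 10.0)
```

For the 200x100 m parking lot of the example, the maximum dimension is 200 m, so the wizard would request a walk of at least 20 m; for small content the 10 m floor applies.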

Preferably the method is implemented with a wearable VR/AR/MR/XR system, for instance a visor or helmet, or a handheld device. Alternatively, the VR/AR/MR/XR system is interfaceable or integrable to/into the control panel of an operating vehicle or a rover.

A second aspect of the present invention relates to an assembly or a kit comprising a portable virtual reality VR or augmented/mixed reality AR system, a geolocation device and processing means, wherein:

- the geolocation device is intended to be placed stationary on the ground in a position, named initialization position, the geographical coordinates of which may be known a priori and are named initialization geographical coordinates Ux, Uy, Uz, and is equipped with a first satellite antenna;

- the portable VR/AR/MR/XR system is intended to be carried by a user and is provided with a second satellite antenna;

- both first and second satellite antennas are set to receive the signal from satellites of at least one constellation of global navigation satellites orbiting around the earth;

- the processing means, for instance an electronic device like a CPU or a microprocessor that can be located onboard the portable VR/AR/MR/XR system or onboard the geolocation device, processes the signal provided by the first satellite antenna and determines the latitude, the longitude and optionally the altitude of the geolocation device, together referred to as first geographical coordinates A1x, A1y, A1z;

- the processing means processes the signal provided by the second satellite antenna and determines the latitude, the longitude and optionally the altitude of the portable VR/AR/MR/XR system, together referred to as second geographical coordinates A2x, A2y, A2z.

The processing means is programmed to triangulate the position of the portable VR/AR/MR/XR system, i.e. to compute enhanced geographical coordinates En.X, En.Y, En.Z, based on either the first geographical coordinates A1x, A1y, A1z or the initialization geographical coordinates Ux, Uy, Uz, and on the second geographical coordinates A2x, A2y, A2z.

The portable VR/AR/MR/XR system displays on a screen digital content, for example digital content selected by the user, in a scene of virtual reality VR or augmented/mixed reality AR, georeferenced at the enhanced geographical coordinates En.X, En.Y, En.Z.

Advantageously, the processing means computes the enhanced geographical coordinates over time, i.e. the processing means repetitively computes the enhanced geographical coordinates as a routine (En.Xt, En.Yt, En.Zt; En.Xt+1, En.Yt+1, En.Zt+1; En.Xt+n, En.Yt+n, En.Zt+n), and the digital content is feedback georeferenced on the basis of the enhanced geographical coordinates En.Xt+n, En.Yt+n, En.Zt+n that are computed each time, as the portable VR/AR/MR/XR system is moved.

Preferably, the kit is used to implement the method according to the present invention, described above, with same advantages.

This means that, preferably the processing means computes the enhanced geographical coordinates En.Xt+n, En.Yt+n, En.Zt+n and feedback georeferences the digital content at a frequency of 30 Hz, or higher, to achieve a substantially continuous position tracking of the portable VR/AR/MR/XR system and a substantially continuous georeferencing of said digital content.

Preferably the geolocation device comprises a wireless interface and transmits the first geographical coordinates A1x, A1y, A1z, or the initialization geographical coordinates Ux, Uy, Uz, to the portable VR/AR/MR/XR system. The geolocation device may be equipped with a keyboard and a screen to allow the user to manually input and set the initialization geographical coordinates Ux, Uy, Uz.

In the preferred embodiment the wireless interface is a radio transmitter, preferably operating at ultra-high frequency UHF.

As explained above, with reference to the method, preferably the processing means are programmed for carrying out the above-mentioned triangulation by correcting the second geographical coordinates A2x, A2y, A2z with the first geographical coordinates A1x, A1y, A1z or the initialization geographical coordinates Ux, Uy, Uz, by applying the real-time kinematic RTK positioning technique.

Preferably the portable VR/AR/MR/XR system is provided with means for implementing simultaneous localization and mapping SLAM of the surrounding environment, for instance at least one Lidar system, and optionally with at least one sensor among an accelerometer, a gyroscope and a compass. With this configuration the processing means determines the orientation of the portable VR/AR/MR/XR system in space on the basis of the mapping and the sensor signals. In this way the digital content can be precisely georeferenced and precisely oriented in space, with sub-centimetric accuracy.

A Lidar system may be used onboard the portable VR/AR/MR/XR system to track the vertical position of the portable VR/AR/MR/XR system with respect to the ground; the Lidar system scans the environment and the ground with light beams and the processing means analyses the wavelengths of the reflected light to determine the distance of the scanned surfaces.

As anticipated above, preferably the portable VR/AR/MR/XR system comprises a multimedia device, for instance a smartphone or a tablet, having a screen, and virtual reality VR or augmented reality AR scenes are displayed on the screen, with digital content continuously georeferenced at the enhanced geographical coordinates, which are computed continuously at a given frequency.

Preferably the portable VR/AR/MR/XR system comprises at least one sensor among an accelerometer, a gyroscope and a compass, and the processing means are programmed for determining the distance Dist.A, and possibly the trajectory, walked by the user carrying the portable VR/AR/MR/XR system, between a first position having first enhanced geographical coordinates En.Xt, En.Yt (latitude, longitude) and a second position having second enhanced geographical coordinates En.Xt+1, En.Yt+1 (latitude, longitude), based on the sensor signals. The information provided by the sensors may be used together with the continuous position tracking implemented by the processing means to perform a real-time double check on the accuracy, as will now be described.

Preferably the processing means applies the Kalman filter to the sensor signals, as described above with reference to the method.

Preferably the processing means are programmed also for determining the distance Dist.B walked by the user carrying the portable VR/AR/MR/XR system, between a first position having first enhanced geographical coordinates En.Xt, En.Yt (latitude, longitude) and a second position having second enhanced geographical coordinates En.Xt+1, En.Yt+1 (latitude, longitude), by applying the formula Ellipsoidal Earth projected to a plane to said first and second enhanced geographical coordinates. Alternatively, other known formulas may be applied.

Upon having determined Dist.A, the processing means are programmed for determining the distance Dist.B over time at a given frequency, for instance every second or every meter walked by the user, and determining the deviation Delta, defined as deviation vector Delta, between Dist.A and Dist.B. The deviation vector Delta is then used to feedback correct the trajectory walked by the user as initially determined based on the sensor signals.

Preferably the portable VR/AR/MR/XR system comprises an electronic compass that can be calibrated as described above. The processing means can be set to define anchor points, or points of interest POI in the digital content, as described above.

Brief list of the figures

Further characteristics and advantages of the invention will be better highlighted by examining the following detailed description of its preferred, but not exclusive, embodiments depicted by way of non-limiting example, with the support of the appended drawings, wherein:

- figure 1 is a schematic view of a kit/assembly according to the present invention, in a first configuration;

- figure 2 is a flowchart of the method according to the present invention;

- figure 3 is a schematic view of the kit shown in figure 1, in a second configuration;

- figure 4 is a front view of a component of a kit according to the present invention, and the environment;

- figure 5 is a schematic view of a component of a kit according to the present invention, and the environment;

- figure 6 is a schematic view of a kit according to the present invention during use;

- figure 7 is a perspective view of a component of a kit according to the present invention;

- figure 8 shows a possible application of a kit according to the present invention.

Detailed description of the invention

Figure 1 shows a schematic view of a kit or assembly 1 according to the present invention, comprising a geolocation device 2, a portable virtual reality VR or augmented/mixed reality AR system 3, and processing means 4, preferably in the form of a microprocessor or CPU.

Figure 6 shows a possible physical embodiment of the kit 1.

The geolocation device 2 is intended to be placed stationary on the ground in an initialization position, the geographical coordinates of which are known a priori and are named initialization geographical coordinates Ux, Uy, Uz. In figure 6 the geolocation device 2 is a device supported onto a tripod.

The geolocation device 2 is equipped with a first satellite antenna A1 configured to receive the signal of one or more satellites S of a constellation of satellites orbiting the Earth like, for example, GPS, GLONASS, Galileo, QZSS, SBAS, etc.

The portable VR/AR/MR/XR system 3 is intended to be carried by a user U and is provided with a second satellite antenna A2 configured to receive the signal of one or more satellites S of a constellation of satellites orbiting the Earth.

As shown in the figure, preferably the portable VR/AR/MR/XR system 3 is a smartphone, or a tablet, or a visor comprising a smartphone, etc., being provided with a screen wherein a VR/AR/MR/XR scene is shown to user U.

The processing means 4 are preferably onboard the VR/AR/MR/XR system 3, although it may be possible to have them onboard the geolocation device 2, with the geolocation device 2 communicating with the VR/AR/MR/XR system 3. The processing means 4 processes the signal provided by the first satellite antenna A1 and determines the latitude, the longitude and optionally the altitude of the geolocation device, together referred to as first geographical coordinates A1x, A1y, A1z, for instance in decimal degrees or in degrees, minutes and seconds.

In the same way, the processing means 4 processes the signal provided by the second satellite antenna A2 and determines the latitude, the longitude and optionally the altitude of the portable VR/AR/MR/XR system, together referred to as second geographical coordinates A2x, A2y, A2z.

Preferably, as shown in figure 1 , the geolocation device 2 and the portable VR/AR/MR/XR system 3 are both provided with communicating interfaces R1 and R2, respectively, that permit at least communicating from the geolocation device 2 to the portable VR/AR/MR/XR system 3, and preferably permit two-way communications. In the preferred embodiment communicating interfaces R1 and R2 are radio interfaces, and most preferably at ultra-high frequency UHF.

When available, the initialization geographical coordinates Ux, Uy, Uz are set by the user U, for instance by using the keyboard of a user interface 2' arranged onboard the geolocation device 2, and visible in figure 6. These coordinates are then radio transmitted to the portable VR/AR/MR/XR system 3, at UHF frequency. Alternatively, the geolocation device 2 determines its position by receiving satellite signals with the first satellite antenna A1, and radio transmits the first geographical coordinates A1x, A1y, A1z to the portable VR/AR/MR/XR system 3.

The portable VR/AR/MR/XR system 3, that is distant from the geolocation device 2, for instance 1 meter, 100 meters or 400 meters, determines its position by receiving satellite signals with the second satellite antenna A2, corresponding to the second geographical coordinates A2x, A2y, A2z.

Preferably, the processing means 4 is programmed to convert both the first geographical coordinates A1x, A1y, A1z and the second geographical coordinates A2x, A2y, A2z into UTM (Universal Transverse Mercator) coordinates.

The processing means 4 is programmed to triangulate the position of the portable VR/AR/MR/XR system 3, considering the triangle formed by the geolocation device 2, the portable VR/AR/MR/XR system 3, and a satellite S or any other point having known accepted coordinates. This means that, instead of relying only on the second geographical coordinates A2x, A2y, A2z as obtained by the second satellite antenna A2, the kit 1 performs a triangulation, using the first geographical coordinates or the initialization geographical coordinates, if available, to maximize the accuracy of the determination of the position of the portable VR/AR/MR/XR system 3. As anticipated above, the geographical coordinates of the portable VR/AR/MR/XR system 3 obtained with the triangulation technique are here called enhanced geographical coordinates En.X, En.Y, En.Z. If the accuracy of the first satellite antenna A1 and the second satellite antenna A2 is 1 cm, the enhanced geographical coordinates have higher, sub-centimetric precision.

Such accuracy can be obtained without connecting the kit 1 to the internet to get assistance or correction from a remote platform, that is, without paying a subscription to a service.

The portable VR/AR/MR/XR system displays on its screen the digital content in a scene of virtual reality VR or augmented/mixed reality AR, georeferenced at the enhanced geographical coordinates En.X, En.Y, En.Z. Therefore, the precision of the georeferencing is also sub centimetric.

In order to maximize the accuracy, triangulating is carried out using the first geographical coordinates (A1x, A1y, A1z), or the initialization geographical coordinates (Ux, Uy, Uz), and the second geographical coordinates (A2x, A2y, A2z), and by applying the real-time kinematic RTK positioning technique.

At this point, if the user U moves from his/her initial position, the accuracy of the determination of the position of the portable VR/AR/MR/XR system 3 can only decrease if the processing means 4 estimates the trajectory of the user U and his/her position only on the basis of one or more sensors of the same VR/AR/MR/XR system 3 like, for example, an accelerometer 6, a gyroscope 7, a compass 8, because these sensors cannot match the accuracy of the determination described above to compute the enhanced geographical coordinates En.X, En.Y, En.Z.

Therefore, the method and the kit 1 according to the present invention provide for a different approach, called positional tracking: continuously tracking the position of the portable VR/AR/MR/XR system 3 by repeating the aforementioned triangulation of coordinates, with a sufficiently high frequency, for example 30 Hz or higher. In this way, at each instant of time the processing means 4 computes a new set of enhanced geographical coordinates: (En.Xt, En.Yt, En.Zt); (En.Xt+1, En.Yt+1, En.Zt+1); (En.Xt+n, En.Yt+n, En.Zt+n), etc. For each set of enhanced geographical coordinates (En.Xt+n, En.Yt+n, En.Zt+n) the processing means 4 feedback georeferences the digital content in the VR/AR/MR/XR scene shown to the user U, and the accuracy of the AR scene is maximized, with a great immersive experience for the user.

Clearly, the kit 1 preferably does take into account the signals of sensors 6-8, but the information retrieved from the sensors 6-8 is subordinated to the determination of the enhanced geographical coordinates (En.Xt+n, En.Yt+n, En.Zt+n), which is obtained continuously at the selected frequency. In this respect, 30 Hz proved to be sufficiently high to guarantee continuous tracking of the position of the VR/AR/MR/XR system 3 even if the user runs or moves onboard a vehicle, which permits new uses of VR/AR/MR/XR systems.
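Purely by way of illustration, the positional-tracking loop described above can be sketched as follows. The antenna readings, the triangulation step and the scene structure are all hypothetical placeholders (the actual RTK computation depends on the receivers used); only the loop structure, the 30 Hz rate and the feedback re-georeferencing reflect the description.

```python
import time

TRACKING_HZ = 30            # tracking frequency named in the description
PERIOD = 1.0 / TRACKING_HZ

def read_antenna_a1():
    # Hypothetical stub: first geographical coordinates (A1x, A1y, A1z)
    return (45.0000000, 9.0000000, 120.00)

def read_antenna_a2():
    # Hypothetical stub: second geographical coordinates (A2x, A2y, A2z)
    return (45.0000900, 9.0001200, 120.05)

def triangulate(a1, a2):
    # Placeholder for the RTK-based triangulation computing the
    # enhanced coordinates; here it simply returns the rover reading.
    return a2

def georeference(scene, enhanced):
    # Re-anchor the digital content at the newest enhanced coordinates.
    scene["anchor"] = enhanced

scene = {"anchor": None}
for _ in range(3):          # three iterations instead of an endless loop
    enhanced = triangulate(read_antenna_a1(), read_antenna_a2())
    georeference(scene, enhanced)
    time.sleep(PERIOD)
```

The key design point is that the scene anchor is overwritten on every cycle, so the digital content always follows the most recent enhanced coordinates rather than dead-reckoned sensor estimates.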

As anticipated above, reference signs A1x, A1y, A1z, Ux, Uy, Uz, A2x, A2y, A2z, En.X, En.Y, En.Z, etc. are not intended to limit the scope of protection to a specific coordinate system, as several coordinate systems can be used to indicate a position on earth.

Figure 2 is a flowchart that shows the method according to the present invention, which can be implemented using the kit 1.

With (A) it is indicated that the geolocation device 2, provided with the first satellite antenna A1, is positioned stationary on the ground in the initialization position, characterized by known initialization geographical coordinates Ux, Uy, Uz. With (B) it is indicated setting up the portable VR/AR/MR/XR system 3, provided with the second satellite antenna A2, and eventually moving the portable VR/AR/MR/XR system 3 with respect to the geolocation device 2.

With (C) it is indicated determining the first geographical coordinates A1x, A1y, and optionally A1z of the geolocation device 2, through the first satellite antenna A1.

With (D) it is indicated determining the second geographical coordinates A2x, A2y and optionally A2z of the portable VR/AR/MR/XR system 3, through the second satellite antenna A2.

With (E) it is indicated triangulating the position of the portable VR/AR/MR/XR system 3, as described above, to compute the enhanced geographical coordinates at a first instant of time En.Xt, En.Yt, and optionally En.Zt, with sub-centimetric precision (better than 1 cm).

With (F) it is indicated georeferencing the digital content in a scene of virtual reality VR or augmented/mixed reality AR displayed on the portable VR/AR/MR/XR system 3, at the enhanced geographical coordinates En.Xt, En.Yt, and optionally En.Zt.

With (G) it is indicated determining the enhanced geographical coordinates repetitively, at each selected instant of time, to obtain sets of coordinates En.Xt+1, En.Yt+1, and optionally En.Zt+1, up to En.Xt+n, En.Yt+n, and optionally En.Zt+n. Georeferencing the digital content, as described in (F), is then done in feedback, over time, by continuously adjusting the process to the newly determined set of enhanced geographical coordinates En.Xt+n, En.Yt+n, and optionally En.Zt+n. Therefore, if the portable VR/AR/MR/XR system 3 remains still, the position and the orientation of the digital content do not change; otherwise, if the portable VR/AR/MR/XR system 3 is moved by the user, the digital content is correctly feedback georeferenced for each new position of the portable VR/AR/MR/XR system 3.

Preferably, as shown in figure 3, the portable VR/AR/MR/XR system 3 comprises a system 5 having the function of acquiring a digital map of the surrounding environment. The system 5 is preferably a Lidar system, but it can also be a camera with software that analyses images to extract a digital model of the surfaces, etc.

As explained above, a Lidar system 5 scans the environment with light beams, for instance a laser, and determines the distance of the scanned surfaces by analysing the return times and wavelengths of the reflected light.

By means of the Lidar system 5, the portable VR/AR/MR/XR system 3 is capable of making a digital map of the surroundings and of locating itself in the same map, according to an algorithm known as SLAM (simultaneous localization and mapping), nowadays used in unmanned road and aerial vehicles. Performing a SLAM algorithm allows the portable VR/AR/MR/XR system 3 to determine its orientation in space; therefore, by combining the information provided by the positional tracking with the information gathered by the Lidar system 5 with the SLAM, the kit 1 is able to determine not only the position of the portable VR/AR/MR/XR system 3 on earth, with sub-centimetric precision, but also its orientation, which means that the VR/AR/MR/XR scene is also properly oriented according to the point of view of the user U.

As anticipated above, the signals of sensors 6-8 are also taken into account to determine the exact orientation of the portable VR/AR/MR/XR system 3 in space.

In particular, the following angles can be determined: a first angle with respect to the magnetic north, a second angle with respect to the horizon, and a third angle, which is the tilt angle.
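A common way to derive these three angles from an accelerometer and a compass (magnetometer) is the tilt-compensated compass technique. The sketch below is an assumed, generic implementation of that technique, not taken from the description; sign conventions and axis orientations vary between devices.

```python
import math

def orientation_angles(ax, ay, az, mx, my, mz):
    """Tilt-compensated compass sketch: returns (heading, pitch, roll)
    in degrees from accelerometer readings (ax, ay, az) and
    magnetometer readings (mx, my, mz)."""
    roll = math.atan2(ay, az)                      # tilt angle
    pitch = math.atan2(-ax, math.hypot(ay, az))    # angle to the horizon
    # Project the magnetic field vector onto the horizontal plane
    mxh = mx * math.cos(pitch) + mz * math.sin(pitch)
    myh = (mx * math.sin(roll) * math.sin(pitch) + my * math.cos(roll)
           - mz * math.sin(roll) * math.cos(pitch))
    heading = math.atan2(-myh, mxh)                # angle to magnetic north
    return tuple(math.degrees(a) for a in (heading, pitch, roll))
```

For a device lying flat (gravity entirely on the z axis) with the magnetic field pointing along its x axis, all three angles come out as zero, which is the expected calibration pose.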

Figure 4 shows an exemplary application of the kit 1, wherein the portable VR/AR/MR/XR system 3 is a smartphone 3’ having at least one camera that is framing the environment 9 in front of the smartphone 3’. The environment comprises a construction site into which buildings are to be erected. With reference number 10 it is indicated the digital content which is used to create an augmented reality AR scene 11 on the screen of the smartphone 3’. In particular, the digital content 10 are two buildings, each having six floors. As it can be appreciated, the buildings 10 are georeferenced with respect to the environment 9, meaning that they are overlaid on the images of the environment 9 with the same point of view of the images, in the right perspective, as if the buildings were already present in the scene 11. If the user U walks around the construction site, the AR scene 11 is continuously updated by continuously feedback georeferencing the buildings 10 based upon the position, and preferably the orientation in space, of the smartphone 3’. In this application the camera of the smartphone 3’ is also used to perform the SLAM, by having the images taken by the camera analysed for image recognition.

Figure 5 shows a similar application, wherein a building 12 has to be renovated. The portable VR/AR/MR/XR system 3 is a tablet 3” equipped with a Lidar system 5 that generates a digital model 12’ of the building 12 using laser beams. The digital model 12’ is displayed on the screen of the tablet 3” and an AR scene is created by adding digital content 10 in the form of written technical information related to the building 12. The digital model 12’ can be zoomed, detailed, etc.

All the information gathered, processed or generated by the tablet 3” can be shared over the internet, using a cloud service 13 and/or remote servers 14. The shared information can be used by several users, at the same time or at different times, for example by several workers on the field. In this respect, the tablet 3” is provided with means for establishing a wireless remote internet connection.

Figure 7 shows a portable VR/AR/MR/XR system in the form of a visor or headset 13, having a screen in front of the user’s eyes, typically provided by a smartphone 3’. The visor 13 is also equipped with a Lidar system 5.

Figure 8 shows another possible application of the kit 1, wherein the portable VR/AR/MR/XR system 3 is integrated into the dashboard 15 of an operative vehicle 14 used at the construction site. The environment 9 as seen by the user through the windshield is shown on a screen of the VR/AR/MR/XR system 3 together with the relevant digital content 10, in this case a warning signal warning the user that he/she is approaching electric lines.

With reference to figure 6, the kit 1 allows the user to compute the distance D between two positions P1 and P2 on earth, for example the distance between the actual position P2 of the user U and the position P1 of the geolocation device 2, by using the “Ellipsoidal Earth projected to a plane” formula (I), valid for distances not exceeding 475 kilometres. Position P1 has first geographical coordinates (A1x, A1y) and position P2 has second geographical coordinates (A2x, A2y). The differences in latitude and longitude are as follows:

Δx = A2x − A1x; Δy = A2y − A1y.

Mean latitude is computed as:

Xm = (A1x + A2x) / 2.

Colatitude is computed as follows:

θ = (π/2 − Xm) in radians; θ = (90° − Xm) in degrees.

The distance D is then:

D = √( (K1·Δx)² + (K2·Δy)² )    (I)

where:

D is the distance in kilometres,

Δx and Δy are in degrees,

Xm is in units compatible with the method used for determining cos(Xm),

K1 = 111.13209 − 0.56605 cos(2Xm) + 0.00120 cos(4Xm);

K2 = 111.41513 cos(Xm) − 0.09455 cos(3Xm) + 0.00012 cos(5Xm).

In the above, K1 and K2 are expressed in kilometres per degree.

For instance, if the construction site shown in figure 6 is at sea level (altitude is null), and positions P1 and P2 have the following coordinates:

P1 = (−23.591636, −46.661714, 0),

P2 = (−23.591661, −46.661695, 0), then the exact distance between P1 and P2 is 3.380432468 m and the distance D computed with formula (I) is 3.3803471746537 m, i.e. the difference between the real distance and the computed distance is only 0.000085 metres.

Yet another example at sea level, with positions P1 and P2 having the following coordinates: P1 = (−23.591636, −46.661714, 0), the same as in the example above,

P2 = (−23.593587, −46.660772, 0), then the exact distance between P1 and P2 is 236.504492294 m and the distance D computed with formula (I) is 236.496095100588 m, i.e. the difference between the real distance and the computed distance is only 0.0084 m.
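The computation of formula (I) can be sketched as follows; the function name is an arbitrary choice, and the only care needed is converting the mean latitude to radians before evaluating the cosine terms:

```python
import math

def fcc_distance_m(p1, p2):
    """Distance in metres between two (latitude, longitude) points in
    degrees, using the 'Ellipsoidal Earth projected to a plane'
    formula (I). Valid for distances not exceeding 475 km."""
    lat1, lon1 = p1
    lat2, lon2 = p2
    dx = lat2 - lat1                       # difference in latitude, degrees
    dy = lon2 - lon1                       # difference in longitude, degrees
    xm = math.radians((lat1 + lat2) / 2)   # mean latitude, in radians
    # K1 and K2 in kilometres per degree
    k1 = 111.13209 - 0.56605 * math.cos(2 * xm) + 0.00120 * math.cos(4 * xm)
    k2 = (111.41513 * math.cos(xm) - 0.09455 * math.cos(3 * xm)
          + 0.00012 * math.cos(5 * xm))
    d_km = math.hypot(k1 * dx, k2 * dy)    # formula (I)
    return d_km * 1000.0
```

Run against the two worked examples above (dropping the null altitude), the function reproduces the computed distances of about 3.3803 m and 236.496 m.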

By recalculating the distance D over time, as the user U walks away from the geolocation device 2 carrying the portable VR/AR/MR/XR system 3, the trajectory of the user’s path can be determined precisely, and the direction of movement of the portable VR/AR/MR/XR system 3 is also determined.

Distance D is also computed by the processing means 4 by analysing the signals provided by sensors 6-8, and compared with/corrected against the value obtained by using the coordinates retrieved with the satellite antennas.

The same approach is used to determine the cartographic north, to be considered as an alternative to the magnetic north in case the compass 8 is not sufficiently accurate.

The best way to carry out the present invention, i.e. the way to achieve the best results in terms of accuracy, like sub-centimetric accuracy of the position tracking and of the georeferencing of the digital content, is to continuously:

- computing the enhanced geographical coordinates En.Xt+n, En.Yt+n, and optionally En.Zt+n as described in phase (G);

- analysing the signals provided by sensors 6-8 and cross-correlating these signals with the enhanced geographical coordinates En.Xt+n, En.Yt+n, En.Zt+n in order to determine the orientation of the portable VR/AR/MR/XR system 3 in space;

- computing the distance D walked by the user U and his/her trajectory, and cross-correlating the results with the enhanced geographical coordinates En.Xt+n, En.Yt+n, En.Zt+n in order to eventually correct the enhanced geographical coordinates.

The portable VR/AR/MR/XR system 3 may be used also for determining the cartographic north, when the magnetic north cannot be determined with sufficient precision. In this circumstance, the processing means 4 is programmed to guide the user U in a calibration procedure, consisting of having the user U walk from an initial calibration position C1, for at least 10 metres, along a direction I which leaves the longitude constant, the longitude being determined with the second satellite antenna A2, to a final position C2. Then the processing means 4 determines the line passing through points C1 and C2, which have known second geographical coordinates: such a line indicates the cartographic north, with respect to which angles are computed instead of using the compass 8.
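The calibration procedure can be sketched as follows. The function name and the local equirectangular approximation used to measure angles against the C1-C2 line are assumptions for illustration; they are adequate over the short distances of the calibration walk:

```python
import math

def cartographic_north_bearing(c1, c2, target):
    """C1 and C2 are (lat, lon) points sharing the same longitude, so
    the C1->C2 line lies along the cartographic north-south axis.
    Returns the bearing of 'target' from C2, in degrees clockwise from
    cartographic north, using a local equirectangular approximation."""
    lat2, lon2 = c2
    # North axis from the calibration walk: +1 if walked towards
    # increasing latitude, -1 otherwise (unit vector in east/north form)
    sign = 1.0 if c2[0] >= c1[0] else -1.0
    north = (0.0, sign)
    # Local east/north displacement from C2 to the target point
    de = (target[1] - lon2) * math.cos(math.radians(lat2))
    dn = target[0] - lat2
    angle = math.degrees(math.atan2(de, dn) - math.atan2(north[0], north[1]))
    return angle % 360.0
```

For instance, a target point due east of C2 yields a bearing of 90 degrees with respect to the cartographic north defined by the walk.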

Preferably, the portable VR/AR/MR/XR system 3 is provided with a wireless interface for connecting to the internet for downloading, or uploading and sharing, the digital content, or 3D models like 3D CAD drawings, BIM models, etc. When digital 3D models are already available on the market, for instance a 3D map of part of a city, the portable VR/AR/MR/XR system 3 may download the 3D models instead of proceeding to map the surroundings with the Lidar system 5, or the map obtained from the Lidar system 5 may be compared to the downloaded 3D model to identify discrepancies that are shown to the user U, or to exclude/cover items from the VR/AR/MR/XR scene, or to select whether surfaces, objects or items like persons, walls, vehicles, poles, road signals, etc. have to be shown in the background or in the foreground of the VR/AR/MR/XR scene, according to a technique called body and environmental occlusion.

Preferably the portable VR/AR/MR/XR system 3 memorizes the position of the sun or the moon as the user U frames them with the camera, and the processing means 4 computes shadows and lights of the VR/AR/MR/XR scene, for instance the light entering from a window, in feedback on the basis of the memorized position of the sun/moon, as the user U moves the portable VR/AR/MR/XR system 3.

Preferably the portable VR/AR/MR/XR system 3 is configurable to set virtual anchor points or points of interest POI in the 3D map generated by the Lidar system 5, or in the AR scene. Each anchor point POI describes the location of a physical object in the real world. Once one or more POIs are defined, the digital content is feedback georeferenced with respect to at least one POI. Therefore, as the user moves and sets POIs, these POIs constitute references for any future georeferencing of the digital content in the scene. POIs can be shared with other users, as long as the portable VR/AR/MR/XR system 3 is connected to a network, in order to permit creating a shared AR scene, wherein each user can participate by creating or modelling content, referring to the POIs.
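A minimal sketch of POI-relative georeferencing could look like the following; the class, the field names and the example coordinates are hypothetical, chosen only to show that moving or refining a POI automatically re-georeferences any content anchored to it:

```python
class PointOfInterest:
    """Hypothetical anchor point: a name plus the enhanced
    geographical coordinates of a physical object."""
    def __init__(self, name, x, y, z):
        self.name = name
        self.coords = (x, y, z)

def georeference_to_poi(content_offset, poi):
    """Place digital content at a fixed offset from a POI, so that
    updating the POI re-georeferences the content for free."""
    return tuple(c + o for c, o in zip(poi.coords, content_offset))

# Example: a virtual sign placed 2.5 units above a doorway POI
door = PointOfInterest("main entrance", 45.464211, 9.191383, 122.40)
sign_position = georeference_to_poi((0.0, 0.0, 2.50), door)
```

The offset-from-anchor design also makes POIs easy to share: exchanging the POI record alone is enough for every participant to reconstruct the same georeferenced scene.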

Markers can also be used across the site, either physical markers like poles with visual identifiers/tags, or radio markers, to define physical POIs.