Title:
GEO-LOCATION ASSISTED MOBILE AUGMENTED REALITY
Document Type and Number:
WIPO Patent Application WO/2022/025885
Kind Code:
A1
Abstract:
Systems, methods, and software that support an Augmented Reality (AR) session. In one embodiment, the AR session runs on User Equipment (UE) or other apparatus with a Mobile AR (MAR) application, a camera, and a screen configured to display digital images for the AR session within a field of view of the camera. An agent on the UE receives coordinates for a target object(s) from an edge computing platform via wireless signals, where the coordinates are in a premises coordinate system of a premises. The agent converts the coordinates for the target object(s) from the premises coordinate system to an AR coordinate system created for the AR session, and provides the coordinates in the AR coordinate system to the MAR application. The MAR application overlays virtual content relating to the target object(s) for the AR session based on the coordinates.

Inventors:
UDDIN MOSTAFA (US)
LAKSHMAN T (US)
KODIALAM MURALIDHARAN (US)
MUKHERJEE SARIT (US)
Application Number:
PCT/US2020/044084
Publication Date:
February 03, 2022
Filing Date:
July 29, 2020
Assignee:
NOKIA TECHNOLOGIES OY (FI)
NOKIA AMERICA CORP (US)
International Classes:
G06T15/20; A63F13/53; G06F3/01; G06F3/042; G06T7/30; G06T19/20
Domestic Patent References:
WO2019161903A1 (2019-08-29)
WO2019113380A1 (2019-06-13)
Foreign References:
US20190371067A1 (2019-12-05)
US20190294235A1 (2019-09-26)
Claims:
CLAIMS

What is claimed is:

1. A system (110) that supports an Augmented Reality (AR) session (334) to augment one or more target objects (114) on a premises (100), the system comprising:

User Equipment (UE) (202) comprising: a battery (314); a radio interface component (302) configured to communicate with an edge computing platform (204) via wireless signals; a camera (310); a screen (350) configured to display digital images for the AR session within a field of view (210) of the camera; and at least one processor (304) and memory (306); the at least one processor of the UE executes a Mobile AR (MAR) application (330) to run the AR session, wherein an AR coordinate system (803) is created for the AR session; the at least one processor of the UE further implements an agent (201) to: receive coordinates for the target objects from the edge computing platform, wherein the coordinates are in a premises coordinate system (802) of the premises; convert the coordinates for the target objects from the premises coordinate system to the AR coordinate system; and provide the coordinates in the AR coordinate system to the MAR application; wherein the MAR application is configured to overlay virtual content relating to the target objects for the AR session based on the coordinates.

2. The system of claim 1 wherein: the at least one processor of the UE further implements the agent to compute a coordinate transformation matrix (336) to convert the coordinates for the target objects from the premises coordinate system to the AR coordinate system.

3. The system of claim 2 wherein: the at least one processor of the UE further implements the agent to: perform location measurements in the premises coordinate system and the AR coordinate system at two distinct locations of the UE; estimate a rotation matrix (337) based on the location measurements; compute a translation matrix (338) based on the rotation matrix and at least one of the location measurements; and compute the coordinate transformation matrix based on the translation matrix and the rotation matrix.

4. The system of claim 3 wherein: the at least one processor of the UE further implements the agent to perform the location measurements at the two distinct locations when the UE is stationary; for each of the two distinct locations, the at least one processor of the UE further implements the agent to: determine whether the UE is stationary at a present location; collect a location measurement for the UE at the present location in the premises coordinate system and the AR coordinate system; save the location measurement when the UE remains stationary for a threshold time; and discard the location measurement when the UE moves before the threshold time expires.

5. The system of claim 4 wherein: the UE further comprises an Inertial Measurement Unit (IMU) sensor (312); and the at least one processor of the UE further implements the agent to determine whether the UE is stationary based on sensor data from the IMU sensor.

6. The system of claim 1 wherein: the UE further comprises an Inertial Measurement Unit (IMU) sensor (312) configured to transmit sensor data to the edge computing platform indicating an orientation of the UE; and the at least one processor of the UE further implements the agent to receive the coordinates from the edge computing platform for one or more of the target objects that are within the field of view of the camera based on the orientation of the UE.

7. The system of claim 1 further comprising: the edge computing platform comprising at least one processor (430) and memory (432) that implement a controller to: receive location data for the target objects from a positioning system (206) implemented on the premises; receive sensor data from Inertial Measurement Unit (IMU) sensors (504) implemented at the target objects; estimate the coordinates for the target objects in the premises coordinate system based on the location data and the sensor data; and transmit the coordinates for the target objects to the UE via the wireless signals.

8. The system of claim 7 wherein: the at least one processor of the edge computing platform further implements the controller to estimate the coordinates for the target objects based on a particle filter (414) where the target objects are represented with virtual particles; for the particle filter, the controller is configured to: estimate the coordinates for a target object of the target objects as an average of the virtual particles that represent the target object; and when receiving an indicator that the target object is motionless at a first update time, calculate an estimated location of the target object as a mean of the virtual particles, and re-sample the virtual particles around the estimated location of the target object.

9. The system of claim 8 wherein: for the particle filter, the controller is further configured to: when receiving acceleration data for the target object at a second update time, update positions of the virtual particles based on an estimated acceleration of the virtual particles at a preceding update time, and calculate an estimated acceleration of the virtual particles at the second update time based on a normal distribution of the acceleration data; and when receiving location data for the target object at a third update time, update the positions of the virtual particles based on the estimated acceleration of the virtual particles at the preceding update time, compute weighted values for the virtual particles based on the location data, and re-sample the virtual particles with the weighted values.

10. The system of claim 7 wherein: the positioning system comprises an indoor positioning system (601) comprising positioning tags (502) implemented at the target objects, and one or more locators (604).

11. The system of claim 7 wherein: the at least one processor of the edge computing platform further implements the controller to: receive sensor data from an IMU sensor (312) at the UE indicating an orientation of the UE; identify one or more of the target objects within the field of view of the camera based on the orientation; and restrict the coordinates that are transmitted to the UE based on the one or more of the target objects within the field of view of the camera.

12. A method (1400) of supporting an Augmented Reality (AR) session to augment one or more target objects on a premises, the method comprising: running (1402) the AR session on User Equipment (UE) with a Mobile AR (MAR) application, a camera, and a screen configured to display digital images for the AR session within a field of view of the camera, wherein an AR coordinate system is created for the AR session; receiving (1410) coordinates for the target objects at an agent implemented in the UE from an edge computing platform via wireless signals, wherein the coordinates are in a premises coordinate system of the premises; converting (1412), at the agent, the coordinates for the target objects from the premises coordinate system to the AR coordinate system; and providing (1414) the coordinates in the AR coordinate system from the agent to the MAR application; wherein the MAR application is configured to overlay virtual content relating to the target objects for the AR session based on the coordinates.

13. The method of claim 12 wherein converting the coordinates for the target objects from the premises coordinate system to the AR coordinate system comprises: computing (1420) a coordinate transformation matrix to convert the coordinates for the target objects from the premises coordinate system to the AR coordinate system.

14. The method of claim 13 wherein computing the coordinate transformation matrix comprises: performing (1602) location measurements in the premises coordinate system and the AR coordinate system at two distinct locations of the UE; estimating (1604) a rotation matrix based on the location measurements; computing (1606) a translation matrix based on the rotation matrix and at least one of the location measurements; and computing (1608) the coordinate transformation matrix based on the translation matrix and the rotation matrix.

15. The method of claim 14 wherein performing the location measurements in the premises coordinate system and the AR coordinate system at the two distinct locations of the UE comprises: for each of the two distinct locations: determining (1610) whether the UE is stationary at a present location; collecting (1612) a location measurement for the UE at the present location in the premises coordinate system and the AR coordinate system; saving (1614) the location measurement when the UE remains stationary for a threshold time; and discarding (1616) the location measurement when the UE moves before the threshold time expires.

16. The method of claim 12 further comprising: receiving (1002), at the edge computing platform, location data for the target objects from a positioning system implemented on the premises; receiving (1004) sensor data from Inertial Measurement Unit (IMU) sensors implemented at the target objects; estimating (1006) the coordinates for the target objects in the premises coordinate system based on the location data and the sensor data; and transmitting (1008) the coordinates for the target objects to the UE via the wireless signals.

17. The method of claim 16 wherein estimating the coordinates for the target objects is based on a particle filter where the target objects are represented with virtual particles; for the particle filter, the method further comprises: estimating (1102) the coordinates for a target object of the target objects as an average of the virtual particles that represent the target object; and when receiving an indicator that the target object is motionless at a first update time, calculating (1114) an estimated location of the target object as a mean of the virtual particles, and re-sampling (1116) the virtual particles around the estimated location of the target object.

18. The method of claim 17 wherein: for the particle filter, the method further comprises: when receiving acceleration data for the target object at a second update time, updating (1104) positions of the virtual particles based on an estimated acceleration of the virtual particles at a preceding update time, and calculating (1106) an estimated acceleration of the virtual particles at the second update time based on a normal distribution of the acceleration data; and when receiving location data for the target object at a third update time, updating (1108) the positions of the virtual particles based on the estimated acceleration of the virtual particles at the preceding update time, calculating (1110) weighted values for the virtual particles based on the location data, and re-sampling (1112) the virtual particles with the weighted values.

19. The method of claim 16 wherein transmitting the coordinates for the target objects to the UE comprises: receiving (1010) sensor data from an Inertial Measurement Unit (IMU) sensor at the UE indicating an orientation of the UE; identifying (1012) one or more of the target objects within the field of view of the camera based on the orientation; and restricting (1014) the coordinates that are transmitted to the UE based on the one or more of the target objects within the field of view of the camera.

20. A non-transitory computer readable medium embodying programmed instructions executed by a processor, wherein the instructions direct the processor to implement a method of supporting an Augmented Reality (AR) session to augment one or more target objects on a premises, the method comprising: running the AR session on User Equipment (UE) with a Mobile AR (MAR) application, a camera, and a screen configured to display digital images for the AR session within a field of view of the camera, wherein an AR coordinate system is created for the AR session; receiving coordinates for the target objects at an agent implemented in the UE from an edge computing platform via wireless signals, wherein the coordinates are in a premises coordinate system of the premises; converting, at the agent, the coordinates for the target objects from the premises coordinate system to the AR coordinate system; and providing the coordinates in the AR coordinate system from the agent to the MAR application; wherein the MAR application is configured to overlay virtual content relating to the target objects for the AR session based on the coordinates.

Description:
GEO-LOCATION ASSISTED MOBILE AUGMENTED REALITY

Technical Field

This disclosure is related to the field of Augmented Reality (AR), and more particularly, to Mobile Augmented Reality (MAR).

Background

Augmented Reality (AR) is a technology that superimposes or overlays virtual content, such as graphics, audio, and/or other sensory content, on a user’s view of the real world. Mobile Augmented Reality (MAR) refers to AR technology that is implemented on a mobile device, such as a smart phone, a tablet, a head-mounted display, or another type of mobile User Equipment (UE). For example, a camera on a mobile device captures digital images of a real-world environment, and the mobile device displays the digital images on a screen as still images or video. A MAR application on the mobile device is able to overlay virtual content onto the digital images as displayed on the screen to convey information to the user. MAR may be used to enhance the perception of a real-world view of a user in a variety of ways. However, some MAR applications may be data-intensive and can overwhelm the resources of a mobile device.

Summary

Described herein is a Geo-Location Assisted MAR (GLAMAR) system that supports a MAR application running on a mobile device. As an overview, the GLAMAR system includes a GLAMAR service running on an edge computing platform, and a GLAMAR agent running on a UE. A MAR application on the UE runs an AR session, and a user is able to view one or more target objects in a Field of View (FoV) of the UE through the AR session. The GLAMAR service receives location data for target objects on a premises (e.g., an industrial or manufacturing facility) from a positioning system, and estimates a real-time position of the target objects in a premises coordinate system. The GLAMAR service streams coordinates for the target objects to the GLAMAR agent on the UE. The GLAMAR agent converts the coordinates for the target objects from the premises coordinate system to an AR coordinate system created by the MAR application for the AR session, and provides the coordinates for the target objects in the AR coordinate system to the MAR application. The MAR application is able to accurately overlay virtual content on the target objects for the AR session based on the coordinates provided by the GLAMAR agent. One technical benefit of the GLAMAR system is that location computation of the target objects in the premises coordinate system is offloaded to the edge computing platform, and the resulting coordinates are streamed to the UE in substantially real-time. This reduces processing overhead and battery consumption in the UE. Also, vision-based techniques, which are often limited by the distance between the UE and target objects and are prone to object occlusion, are not needed in the UE to run the AR session.

One embodiment comprises a system that supports an AR session to augment one or more target objects on a premises. The system comprises a UE that includes a battery, a radio interface component configured to communicate with an edge computing platform via wireless signals, a camera, a screen configured to display digital images for the AR session within a field of view of the camera, and at least one processor and memory. The processor of the UE executes a MAR application to run the AR session, where an AR coordinate system is created for the AR session. The processor of the UE further implements an agent (e.g., the GLAMAR agent) to receive coordinates for the target objects from the edge computing platform, where the coordinates are in a premises coordinate system of the premises. The agent converts the coordinates for the target objects from the premises coordinate system to the AR coordinate system, and provides the coordinates in the AR coordinate system to the MAR application. The MAR application is configured to overlay virtual content relating to the target objects for the AR session based on the coordinates.

In another embodiment, the agent computes a coordinate transformation matrix to convert the coordinates for the target objects from the premises coordinate system to the AR coordinate system.

In another embodiment, the agent performs location measurements in the premises coordinate system and the AR coordinate system at two distinct locations of the UE, estimates a rotation matrix based on the location measurements, computes a translation matrix based on the rotation matrix and at least one of the location measurements, and computes the coordinate transformation matrix based on the translation matrix and the rotation matrix.
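By way of illustration only, the following Python sketch shows one way the two-measurement calibration described above could be realized, under the simplifying assumption that the vertical axes of the premises and AR coordinate systems are both gravity-aligned, so that only a rotation about the vertical axis and a horizontal translation remain to be estimated. The function names and the NumPy formulation are illustrative choices and are not taken from the patent.

    import numpy as np

    def estimate_transform(p_prem, p_ar):
        """Estimate a premises-to-AR rigid transform from two paired UE fixes.

        p_prem, p_ar: 2x2 arrays; row i is the UE's (x, y) position in the horizontal
        plane of the premises frame and the AR frame, respectively, measured while the
        UE was stationary at location i.
        Returns a 3x3 homogeneous matrix T with [x_ar, y_ar, 1]^T = T @ [x_prem, y_prem, 1]^T.
        """
        d_prem = p_prem[1] - p_prem[0]            # displacement seen in the premises frame
        d_ar = p_ar[1] - p_ar[0]                  # the same displacement seen in the AR frame
        # The rotation angle is the difference of the two displacement headings.
        theta = np.arctan2(d_ar[1], d_ar[0]) - np.arctan2(d_prem[1], d_prem[0])
        R = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
        # Translation from one correspondence: p_ar = R @ p_prem + t.
        t = p_ar[0] - R @ p_prem[0]
        T = np.eye(3)
        T[:2, :2] = R
        T[:2, 2] = t
        return T

    def to_ar(T, xy_premises):
        """Convert a target object's horizontal premises coordinates to AR coordinates."""
        return (T @ np.array([xy_premises[0], xy_premises[1], 1.0]))[:2]

Under these assumptions, the heading difference of the displacement between the two UE fixes yields the rotation, and either fix then yields the translation, mirroring the rotation-then-translation order recited above.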

In another embodiment, the agent performs the location measurements at the two distinct locations when the UE is stationary. For each of the two distinct locations, the agent determines whether the UE is stationary at a present location, collects a location measurement for the UE at the present location in the premises coordinate system and the AR coordinate system, saves the location measurement when the UE remains stationary for a threshold time, and discards the location measurement when the UE moves before the threshold time expires.

In another embodiment, the UE further comprises an Inertial Measurement Unit (IMU) sensor, and the agent determines whether the UE is stationary based on sensor data from the IMU sensor.
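A minimal sketch of the stationarity-gated measurement collection described in the two preceding paragraphs is shown below. The dwell time and the three helper callables (an IMU-based stationarity test and readers for the UE position in each coordinate system) are hypothetical placeholders; the patent does not specify concrete values or interfaces.

    import time

    STATIONARY_THRESHOLD_S = 2.0   # assumed dwell time; the patent does not fix a value

    def collect_paired_fix(imu_is_stationary, read_premises_fix, read_ar_fix):
        """Collect one paired location measurement while the UE is stationary.

        imu_is_stationary(): True when IMU motion energy is below a threshold (hypothetical).
        read_premises_fix() / read_ar_fix(): current UE position in each frame (hypothetical).
        Returns (premises_xy, ar_xy), or None if the UE moved before the dwell time elapsed.
        """
        while not imu_is_stationary():
            time.sleep(0.1)                          # wait until the UE comes to rest
        fix = (read_premises_fix(), read_ar_fix())   # sample both frames at the same instant
        start = time.monotonic()
        while time.monotonic() - start < STATIONARY_THRESHOLD_S:
            if not imu_is_stationary():
                return None                          # UE moved: discard this measurement
            time.sleep(0.1)
        return fix                                   # UE stayed still: keep this measurement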

In another embodiment, the UE further comprises an IMU sensor configured to transmit sensor data to the edge computing platform indicating an orientation of the UE, and the agent receives the coordinates from the edge computing platform for one or more of the target objects that are within the field of view of the camera based on the orientation of the UE.

In another embodiment, the edge computing platform comprises at least one processor and memory that implement a controller to receive location data for the target objects from a positioning system implemented on the premises, receive sensor data from IMU sensors implemented at the target objects, estimate the coordinates for the target objects in the premises coordinate system based on the location data and the sensor data, and transmit the coordinates for the target objects to the UE via the wireless signals.

In another embodiment, the controller estimates the coordinates for the target objects based on a particle filter where the target objects are represented with virtual particles. For the particle filter, the controller estimates the coordinates for a target object of the target objects as an average of the virtual particles that represent the target object. When receiving an indicator that the target object is motionless at a first update time, the controller calculates an estimated location of the target object as a mean of the virtual particles, and re-samples the virtual particles around the estimated location of the target object.

In another embodiment, when receiving acceleration data for the target object at a second update time, the controller updates positions of the virtual particles based on an estimated acceleration of the virtual particles at a preceding update time, and calculates an estimated acceleration of the virtual particles at the second update time based on a normal distribution of the acceleration data. When receiving location data for the target object at a third update time, the controller updates the positions of the virtual particles based on the estimated acceleration of the virtual particles at the preceding update time, computes weighted values for the virtual particles based on the location data, and re-samples the virtual particles with the weighted values.

In another embodiment, the positioning system comprises an indoor positioning system comprising positioning tags implemented at the target objects, and one or more locators.

In another embodiment, the controller receives sensor data from an IMU sensor at the UE indicating an orientation of the UE, identifies one or more of the target objects within the field of view of the camera based on the orientation, and restricts the coordinates that are transmitted to the UE based on the one or more of the target objects within the field of view of the camera.

Another embodiment comprises a method of supporting an AR session to augment one or more target objects on a premises. The method comprises running the AR session on a UE with a MAR application, a camera, and a screen configured to display digital images for the AR session within a field of view of the camera. The method further comprises receiving coordinates for the target objects at an agent implemented in the UE from an edge computing platform via wireless signals, and converting the coordinates for the target objects from the premises coordinate system to the AR coordinate system. The method further comprises providing the coordinates in the AR coordinate system from the agent to the MAR application. The MAR application is configured to overlay virtual content relating to the target objects for the AR session based on the coordinates.

In another embodiment, converting the coordinates for the target objects from the premises coordinate system to the AR coordinate system comprises computing a coordinate transformation matrix to convert the coordinates for the target objects from the premises coordinate system to the AR coordinate system.

In another embodiment, computing the coordinate transformation matrix comprises performing location measurements in the premises coordinate system and the AR coordinate system at two distinct locations of the UE, estimating a rotation matrix based on the location measurements, computing a translation matrix based on the rotation matrix and at least one of the location measurements, and computing the coordinate transformation matrix based on the translation matrix and the rotation matrix.

In another embodiment, performing the location measurements in the premises coordinate system and the AR coordinate system at the two distinct locations of the UE comprises, for each of the two distinct locations, determining whether the UE is stationary at a present location, collecting a location measurement for the UE at the present location in the premises coordinate system and the AR coordinate system, saving the location measurement when the UE remains stationary for a threshold time, and discarding the location measurement when the UE moves before the threshold time expires.

In another embodiment, the method further comprises receiving, at the edge computing platform, location data for the target objects from a positioning system implemented on the premises, receiving sensor data from IMU sensors implemented at the target objects, estimating the coordinates for the target objects in the premises coordinate system based on the location data and the sensor data, and transmitting the coordinates for the target objects to the UE via the wireless signals.

In another embodiment, estimating the coordinates for the target objects is based on a particle filter where the target objects are represented with virtual particles. For the particle filter, the method further comprises estimating the coordinates for a target object of the target objects as an average of the virtual particles that represent the target object. When receiving an indicator that the target object is motionless at a first update time, the method further comprises calculating an estimated location of the target object as a mean of the virtual particles, and re-sampling the virtual particles around the estimated location of the target object.

In another embodiment, when receiving acceleration data for the target object at a second update time, the method further comprises updating positions of the virtual particles based on an estimated acceleration of the virtual particles at a preceding update time, and calculating an estimated acceleration of the virtual particles at the second update time based on a normal distribution of the acceleration data. When receiving location data for the target object at a third update time, the method further comprises updating the positions of the virtual particles based on the estimated acceleration of the virtual particles at the preceding update time, calculating weighted values for the virtual particles based on the location data, and re-sampling the virtual particles with the weighted values.

In another embodiment, transmitting the coordinates for the target objects to the UE comprises receiving sensor data from an IMU sensor at the UE indicating an orientation of the UE, identifying one or more of the target objects within the field of view of the camera based on the orientation, and restricting the coordinates that are transmitted to the UE based on the one or more of the target objects within the field of view of the camera.

Another embodiment comprises a non-transitory computer readable medium embodying programmed instructions executed by a processor. The instructions direct the processor to implement a method of supporting an AR session to augment one or more target objects on a premises. The method comprises running the AR session on a UE with a MAR application, a camera, and a screen configured to display digital images for the AR session within a field of view of the camera, receiving coordinates for the target objects at an agent implemented in the UE from an edge computing platform via wireless signals, converting the coordinates for the target objects from the premises coordinate system to the AR coordinate system, and providing the coordinates in the AR coordinate system from the agent to the MAR application. The MAR application is configured to overlay virtual content relating to the target objects for the AR session based on the coordinates.

Another embodiment comprises an apparatus, such as a UE, that includes a battery, a radio interface component configured to communicate with an edge computing platform via wireless signals, a camera, a screen configured to display digital images for the AR session within a field of view of the camera, and at least one processor and memory. The processor executes a MAR application to run the AR session, where an AR coordinate system is created for the AR session. The processor further implements an agent (e.g., the GLAMAR agent) to receive coordinates for one or more target objects on a premises from the edge computing platform, where the coordinates are in a premises coordinate system of the premises. The agent converts the coordinates for the target objects from the premises coordinate system to the AR coordinate system, and provides the coordinates in the AR coordinate system to the MAR application. The MAR application is configured to overlay virtual content relating to the target objects for the AR session based on the coordinates.

Another embodiment comprises an apparatus, such as a UE, that includes at least one processor, and at least one memory including computer program code. The at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to run an AR session with a MAR application, where an AR coordinate system is created for the AR session. The computer program code is further configured to cause the apparatus to receive coordinates for one or more target objects on a premises from an edge computing platform, where the coordinates are in a premises coordinate system of the premises. The computer program code is further configured to cause the apparatus to convert the coordinates for the target objects from the premises coordinate system to the AR coordinate system, and provide the coordinates in the AR coordinate system to the MAR application. The MAR application is configured to overlay virtual content relating to the target objects for the AR session based on the coordinates.

Another embodiment comprises an apparatus, such as a UE, that includes a battery, a means for communicating with an edge computing platform via wireless signals, a camera, and a means for displaying digital images for the AR session within a field of view of the camera. The apparatus further includes a means for executing a MAR application to run the AR session, where an AR coordinate system is created for the AR session. The apparatus further includes a means for implementing an agent (e.g., the GLAMAR agent) to receive coordinates for one or more target objects on a premises from the edge computing platform, where the coordinates are in a premises coordinate system of the premises. The agent converts the coordinates for the target objects from the premises coordinate system to the AR coordinate system, and provides the coordinates in the AR coordinate system to the MAR application. The MAR application is configured to overlay virtual content relating to the target objects for the AR session based on the coordinates.

Another embodiment comprises an edge computing platform that includes a radio interface component configured to communicate with a UE via wireless signals, and at least one processor and memory that implement a controller to receive location data for one or more target objects on a premises from a positioning system implemented on the premises, receive sensor data from IMU sensors implemented at the target objects, estimate coordinates for the target objects in the premises coordinate system based on the location data and the sensor data, and transmit the coordinates for the target objects to a UE via the wireless signals.

The above summary provides a basic understanding of some aspects of the specification. This summary is not an extensive overview of the specification. It is intended to neither identify key or critical elements of the specification nor delineate any scope of the particular embodiments of the specification, or any scope of the claims. Its sole purpose is to present some concepts of the specification in a simplified form as a prelude to the more detailed description that is presented later.

Description of the Drawings

Some embodiments of the invention are now described, by way of example only, and with reference to the accompanying drawings. The same reference number represents the same element or the same type of element on all drawings.

FIG. 1 is a plan view of a premises that implements a GLAMAR system in an illustrative embodiment.

FIG. 2 illustrates a GLAMAR system in an illustrative embodiment.

FIG. 3 is a block diagram of a UE in an illustrative embodiment.

FIG. 4 is a block diagram of an edge computing platform in an illustrative embodiment.

FIG. 5 is a block diagram of a target object in an illustrative embodiment.

FIG. 6 is a block diagram of a positioning system in an illustrative embodiment.

FIG. 7 illustrates a locator and a positioning tag in an illustrative embodiment.

FIG. 8 illustrates coordinate systems used by different entities in an illustrative embodiment.

FIG. 9 is a flow chart illustrating a method of reporting location data for target objects in an illustrative embodiment.

FIG. 10 is a flow chart illustrating a method of providing a GLAMAR service at an edge computing platform in an illustrative embodiment.

FIG. 11 is a flow chart illustrating a method of estimating coordinates for a target object in an illustrative embodiment.

FIG. 12 is a distribution of location error in an illustrative embodiment.

FIG. 13 is a distribution of an accelerometer sensor reading while a target object is static in an illustrative embodiment.

FIG. 14 is a flow chart illustrating a method of operating a GLAMAR agent in a UE in an illustrative embodiment.

FIG. 15 illustrates virtual content overlaid on target objects in an illustrative embodiment.

FIG. 16 is a flow chart illustrating a method of computing a coordinate transformation matrix in an illustrative embodiment.

FIG. 17 illustrates an estimation of a rotation matrix in an illustrative embodiment.

Description of Embodiments

The figures and the following description illustrate specific exemplary embodiments. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles of the embodiments and are included within the scope of the embodiments. Furthermore, any examples described herein are intended to aid in understanding the principles of the embodiments, and are to be construed as being without limitation to such specifically recited examples and conditions. As a result, the inventive concept(s) is not limited to the specific embodiments or examples described below, but by the claims and their equivalents.

FIG. 1 is a plan view of a premises 100 that implements a GLAMAR system 110 in an illustrative embodiment. Premises 100 may include an indoor area 102 represented by one or more buildings 104 or other types of indoor structures. Premises 100 may additionally or alternatively include an outdoor area 106. Premises 100 may represent an industrial or manufacturing facility, a warehouse, a factory, a hospital, an office, a care home, or another type of facility. There may be one or more target objects 114 located within the indoor area 102 (i.e., within building 104) and/or in the outdoor area 106 that can be enhanced with MAR technology using a mobile device.

In order to overlay virtual content on a target object 114, a MAR application running on a mobile device needs to know the location of the target object 114 in the real world. Tracking and localizing a large number of target objects 114 can become a burden for the mobile device due to compute and battery requirements. For example, some AR applications use vision-based techniques to recognize and track target objects 114 in a camera’s FoV. Such techniques require recognition of target objects 114 appearing in the camera frame, and tracking procedures running on the AR application. As these procedures are compute-intensive and place a heavy demand on the battery, they are not well suited to run on commercial mobile devices for MAR applications. Thus, GLAMAR system 110 is used to provide geo-location of the target objects 114 to mobile devices so that the target objects 114 can be located precisely in the MAR world.

FIG. 2 illustrates GLAMAR system 110 in an illustrative embodiment. GLAMAR system 110 is a framework or apparatus that supports a MAR application running on a mobile device with location and tracking of target objects 114 by an external platform. In this embodiment, GLAMAR system 110 includes a GLAMAR agent 201 executed on a UE 202, and a GLAMAR service 203 executed on an edge computing platform 204. UE 202 is a hand-held end user device or apparatus that is mobile, such as a mobile phone (e.g., smart phone), a tablet or PDA, etc. Although not shown, multiple UEs 202 may be operating with GLAMAR system 110. Edge computing platform 204 comprises external resources (i.e., external to UE 202) that perform data processing at the edge of a network near UE 202 and target objects 114. For example, UE 202 and target objects 114 communicate with edge computing platform 204 via wireless signals. Thus, a wireless network 208, such as a Personal Area Network (PAN), may be established using a wireless technology such as Bluetooth (e.g., Bluetooth Low Energy (BLE)), WiFi, 3G, 4G, 5G, New Radio (NR), etc. Wireless network 208 is a low-latency network that allows for direct communication between UE 202 and edge computing platform 204. Edge computing platform 204 may be implemented within range of wireless network 208 to reduce communications bandwidth between the elements of GLAMAR system 110. In one embodiment, edge computing platform 204 may comprise Multi-access Edge Computing (MEC) architecture (formerly referred to as Mobile Edge Computing). MEC may be part of a telecommunication network, and may be run isolated from the rest of the network, while having access to local resources on premises 100.

There are different physical deployments of MEC. In one example, MEC and the local User Plane Function (UPF) may be collocated with a base station or the like. UPF is deployed and used to steer traffic towards the targeted MEC applications, such as in GLAMAR system 110. For enhancing the latency and reliability in mission critical applications, GLAMAR system 110 may use 5G Ultra Reliable Low Latency Communications (URLLC) communications. URLLC is a service category of 5G or 5G New Radio aimed at mission critical communications, with a target latency of one millisecond and requirements for end-to-end security and 99.999 percent reliability. This ultra-fast and ultra-reliable type of wireless communication may be used for latency sensitive applications as used in GLAMAR system 110.

Further, edge computing platform 204 may be part of a private telecommunications network, such as a private LTE (Long-Term Evolution) or 5G network.

Also illustrated in FIG. 2 is a positioning system 206, which is a set of devices that locate and/or track target objects 114 on premises 100.

As an overview of the functionality of GLAMAR system 110, UE 202 has a MAR application that runs its own independent AR session, and GLAMAR agent 201 assists the MAR application to overlay virtual content relating to the corresponding target objects 114 in the FoV 210 of a camera within UE 202 (such as overlaying the virtual content on the target objects 114, overlaying it in the vicinity of the target objects 114, pointing to the target objects 114, etc.). Target objects 114 may be stationary or moving in the real world. For instance, a target object 114 may be a box on a conveyor belt, or a robot moving on premises 100. Target objects 114 are equipped with positioning tags for localization via positioning system 206, and may also be equipped with Inertial Measurement Unit (IMU) sensors or the like that provide sensor data to edge computing platform 204 in substantially real-time. GLAMAR service 203 receives location data for target objects 114 from positioning system 206, and provides coordinates for the target objects 114 to UE 202 in substantially real-time. GLAMAR agent 201 on UE 202 performs a mapping procedure to map the coordinates for the target objects 114 from a coordinate system of premises 100 to a coordinate system of the AR session. The MAR application uses the coordinates provided by the GLAMAR agent 201 to accurately overlay virtual content on the target objects 114 displayed for the AR session. One benefit of GLAMAR system 110 is that it uses external geo-location (i.e., positioning system 206) to determine the location of target objects 114, instead of on-device target object location tracking, to offload the burden on UE 202. The location computation is offloaded to edge computing platform 204, and the location information (i.e., coordinates) for the target objects 114 is streamed to UE 202 in substantially real-time using low-latency networking. This makes a MAR application viable to run on UE 202 without requiring high processing capacity or draining the battery.
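Tying the pieces together, a simplified view of the agent-side loop might look like the sketch below. The three callables are hypothetical stand-ins for the streaming interface from the edge platform, the calibrated premises-to-AR transform, and the MAR application's overlay API, none of which are specified at this level of detail in the patent.

    def run_agent_loop(receive_update, premises_to_ar, mar_overlay):
        """Illustrative GLAMAR-agent loop on the UE; all three callables are hypothetical.

        receive_update(): blocks until the next update streamed from the edge platform
            and returns (target_id, xy_premises), with coordinates in the premises frame.
        premises_to_ar(xy_premises): applies the premises-to-AR transform computed
            during the agent's calibration step.
        mar_overlay(target_id, xy_ar): asks the MAR application to place or move the
            virtual content for that target at AR coordinates xy_ar.
        """
        while True:
            target_id, xy_premises = receive_update()   # streamed in substantially real-time
            xy_ar = premises_to_ar(xy_premises)         # premises frame -> AR session frame
            mar_overlay(target_id, xy_ar)               # MAR app anchors virtual content there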

FIG. 3 is a block diagram of UE 202 in an illustrative embodiment. In this embodiment, UE 202 includes a radio interface component 302, one or more processors 304, a memory 306, a user interface component 308, a camera 310, an IMU sensor 312, and a battery 314. Radio interface component 302 is a hardware component that represents the local radio resources of UE 202, such as an RF unit 320 (e.g., transceiver) and one or more antennas 322, used for wireless communications. Radio interface component 302 may be configured for communications via Bluetooth (e.g., BLE), ZigBee, WiFi, 5G, or another protocol. Processor 304 represents the internal circuitry, logic, hardware, etc., that provides the functions of UE 202. Processor 304 may be configured to execute instructions 340 for software that are loaded into memory 306. Processor 304 may comprise a set of one or more processors or may comprise a multi-processor core, depending on the particular implementation. Memory 306 is a computer readable storage medium for data, instructions 340, applications, etc., and is accessible by processor 304. Memory 306 is a hardware storage device capable of storing information on a temporary basis and/or a permanent basis. Memory 306 may comprise a random-access memory, or any other volatile or non-volatile storage device.

User interface component 308 is a hardware component for interacting with an end user. For example, user interface component 308 may include a screen 350 (e.g., touch screen, Liquid Crystal Display (LCD), Light Emitting Diode (LED) display, viewfinder, etc.) or the like. User interface component 308 may include a keyboard or keypad, a tracking device (e.g., a trackball or trackpad), a speaker, a microphone, etc. Camera 310 comprises a digital camera having one or more lenses 360 that focus light on an image sensor 362 (e.g., CMOS sensor) to detect, capture, or output digital images (i.e., pictures or video). IMU sensor 312 is an electronic device that measures and reports a specific force, angular rate, and orientation of a body, such as by using a combination of accelerometers, gyroscopes, magnetometers, etc. UE 202 may include various other components not specifically illustrated in FIG. 3.

In this embodiment, processor 304 implements a MAR application 330 and GLAMAR agent 201. MAR application 330 comprises a software application on a UE that runs an AR session 334 to integrate virtual content 335 (e.g., visual content) into the user’s real-world environment. MAR application 330 may be developed using ARCore, ARKit, or another tool depending on the UE’s platform or operating system. MAR application 330 may store or receive a record or library of virtual content 335 for premises 100. When screen 350 displays digital images for an AR session 334, MAR application 330 operates to overlay or superimpose virtual content 335 onto one or more of the target objects 114 within the FoV 210 of camera 310 to augment the target objects 114.

GLAMAR agent 201 is a software application on a UE (i.e., a local agent) that assists MAR application 330 in overlaying the virtual content 335 onto the digital images for the AR session 334. As will be described in more detail below, GLAMAR agent 201 receives location information for one or more target objects 114 from edge computing platform 204 in substantially real-time. The location information is in the coordinate system of premises 100 (referred to herein as the premises coordinate system). MAR application 330 builds an independent AR coordinate system for the AR session 334. GLAMAR agent 201 therefore translates the location information for target objects 114 from the premises coordinate system to the AR coordinate system for the AR session 334. MAR application 330 may then use the translated location information for the target objects 114 to overlay virtual content 335 onto the target objects 114 for the AR session 334.

FIG. 4 is a block diagram of edge computing platform 204 in an illustrative embodiment. Edge computing platform 204 refers to physical resources that perform data processing to provide one or more services. Edge computing platform 204 may include compute resources, storage resources, and network resources. In this embodiment, edge computing platform 204 includes or provides the following subsystems: a radio interface component 402, a network interface component 404, and a GLAMAR controller 406. Radio interface component 402 is a hardware component that communicates via wireless signals, and includes an RF unit 420 (e.g., transceiver) and one or more antennas 422. Radio interface component 402 may be configured for communications via Bluetooth (e.g., BLE), ZigBee, WiFi, 5G, or another protocol. Network interface component 404 is an optional hardware component that exchanges messages, signaling, or packets with other elements, such as positioning system 206. Network interface component 404 may operate using a variety of protocols. GLAMAR controller 406 is circuitry, logic, hardware, means, etc., configured to provide a GLAMAR service 203. GLAMAR service 203 estimates location information for target objects 114 in the premises coordinate system based on location data from positioning system 206, and also optionally based on sensor data from IMU sensors at the target objects 114. GLAMAR service 203 may use a regenerative particle filter 414 to estimate the location information for target objects 114, as will be described in more detail below.

One or more of the subsystems of edge computing platform 204 may be implemented on a hardware platform comprised of analog and/or digital circuitry. One or more of the subsystems of edge computing platform 204 may be implemented on a processor 430 that executes instructions 434 stored in memory 432. Processor 430 comprises an integrated hardware circuit configured to execute instructions 434, and memory 432 is a non-transitory computer readable storage medium for data, instructions 434, applications, etc., and is accessible by processor 430. In other alternatives, one or more of the subsystems of edge computing platform 204 may be implemented on an edge cloud 440, one or more edge servers 442, or another architecture (e.g., MEC). Edge computing platform 204 may include various other components or sub-systems not specifically illustrated in FIG. 4.

FIG. 5 is a block diagram of a target object 114 in an illustrative embodiment. A target object 114 comprises an item, device, machine, part, component, asset, or any other thing that is annotated or otherwise enhanced with AR technology. As described above, a target object 114 may be a box on a conveyor belt, a robot, etc. In this embodiment, a positioning tag 502 is attached, affixed, or otherwise implemented at target object 114. A positioning tag 502 comprises circuitry, hardware, means, etc., configured to interact with one or more locators of positioning system 206 to determine a location of the positioning tag 502. Target object 114 may further include an IMU sensor 504 that is configured to generate sensor data for target object 114, and transmit the sensor data to edge computing platform 204 or other destinations.

FIG. 6 is a block diagram of positioning system 206 in an illustrative embodiment. In this embodiment, positioning system 206 includes a location server 602, one or more locators 604, and one or more positioning tags 502. A locator 604 comprises a directional positioning reference station installed at premises 100 that interacts with a positioning tag 502 to determine the location of the positioning tag 502. As indicated in FIG. 6, a locator 604 is attached to a fixed surface 610 on premises 100, such as a ceiling or wall of building 104, to obtain line-of-sight measurements with positioning tags 502. Positioning tags 502 transmit signals that are received by one or more of locators 604. A locator 604 measures the Angle of Arrival (AoA) of a signal from a positioning tag 502, and provides measurements to the centralized location server 602. Location server 602 collects measurements from one or more locators 604, and determines location data for the positioning tag 502. Location server 602 may then transmit the location data to edge computing platform 204 or another destination.

FIG. 7 illustrates a locator 604 and a positioning tag 502 in an illustrative embodiment. Locator 604 includes a transceiver (TRX) 710 and a switched antenna array 712. Positioning tag 502 includes a transceiver (TRX) 720, an antenna 722, and a battery 724. In a network-centric mode, positioning tag 502 broadcasts packets (e.g., BLE (Bluetooth Low Energy) packets) at regular intervals, or on a need basis. Locator 604 receives a packet through switched antenna array 712, and measures the AoA of the packet. Although BLE is mentioned in this example, other low-power radio solutions may be used.

In the embodiment shown in FIG. 6, positioning system 206 may comprise an indoor positioning system 601, such as inside building 104 on premises 100. An Indoor Positioning System (IPS) is a system used to locate objects inside a structure, underground, beneath a covered area, etc., such as where Global Positioning Systems (GPS) and other satellite technologies lack precision. One particular type of IPS is High Accuracy Indoor Positioning (HAIP) developed by Nokia. HAIP uses a modified version of BLE for cost-efficient and highly accurate indoor positioning. HAIP may be used to determine the location of a positioning tag 502 with one locator 604 or multiple locators 604 to provide accuracy up to 0.3 meters. Also, HAIP requires much less energy than a typical GPS device, which saves on the battery 724 within a positioning tag 502.

Although illustrated as an indoor positioning system 601 in FIG. 6, positioning system 206 may comprise an outdoor positioning system in other embodiments.

Before describing the functionalities of GLAMAR system 110, the coordinate systems used by different entities and their inter-relationship will be described so that target objects 114 and their corresponding virtual content can be displayed on UE 202.

FIG. 8 illustrates the coordinate systems 800 used by the different entities in an illustrative embodiment. One of the coordinate systems 800 is the premises coordinate system 802. Positioning system 206 on premises 100 has its own point of reference or coordinate system, which is referred to as the premises coordinate system 802. Premises coordinate system 802 helps in representing the location of a target object 114 in two dimensions or three dimensions. In the three-dimensional case, the X-axis and the Y-axis form a horizontal plane parallel to the Earth’s surface, and the Z-axis is perpendicular to the horizontal plane (i.e., opposite of gravity) as shown in FIG. 8. In the two-dimensional case, the X-axis and the Y-axis form the horizontal plane parallel to the Earth’s surface.

Another of the coordinate systems 800 is the AR coordinate system 803. As the camera 310 of UE 202 moves, it uses the visual features of the surrounding environment to build an AR coordinate system 803 for the AR session 334. All objects, real or virtual, must be represented in the AR coordinate system 803 for viewing on UE 202. The origin of the AR coordinate system 803 remains at the starting point of the AR session 334. The AR coordinate system 803 is three-dimensional, having the Y-axis along the direction opposite gravity, and the X-axis and the Z-axis forming the horizontal plane.

Another of the coordinate systems 800 is the UE coordinate system 804. This is a three-dimensional coordinate system with six degrees of freedom (6DoF). The UE coordinate system 804 is relative to the screen 350 of UE 202 when UE 202 is held in its default orientation (i.e., the user holding UE 202 to view screen 350). In its default orientation, the X-axis is horizontal and points to the right, the Y-axis is vertical and points up, and the Z-axis points toward the outside of the screen surface. In this coordinate system, coordinates behind the screen 350 have negative Z values. The axes are not swapped when the orientation of the UE’s screen 350 changes.

Another of the coordinate systems 800 is the reference (Earth’s) coordinate system 805. The reference coordinate system 805 is a three degrees of freedom (3DoF) world coordinate system on the Earth’s surface plane. In this case, the Y-axis is towards global north, the X-axis is towards the east, and the Z-axis is perpendicular to the Earth’s surface plane, opposite of gravity. The reference coordinate system 805 is used as a reference to find the orientation of a target object 114 in the premises coordinate system 802.
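To make the differing up-axis conventions concrete, the following sketch applies one proper rotation that re-expresses a premises-frame point (Z up) in the AR frame's Y-up convention. This is purely illustrative: which horizontal axis maps where depends on how the AR session happened to start, and in practice the remaining heading offset and translation are resolved by the run-time calibration (see FIG. 16 and the transform sketch in the Summary).

    import numpy as np

    # Premises frame: X and Y span the horizontal plane, Z points up (opposite gravity).
    # AR frame: X and Z span the horizontal plane, Y points up (opposite gravity).
    # One proper (right-handed) rotation taking Z-up to Y-up; the in-plane heading
    # offset and the translation are still estimated at run time.
    PREMISES_TO_AR_AXES = np.array([
        [1, 0, 0],   # AR X <- premises X
        [0, 0, 1],   # AR Y <- premises Z (the shared "up" direction)
        [0, -1, 0],  # AR Z <- negative premises Y (keeps the frame right-handed)
    ])

    def swap_up_axis(p_premises_xyz):
        """Re-express a premises (x, y, z) point using the AR frame's Y-up convention."""
        return PREMISES_TO_AR_AXES @ np.asarray(p_premises_xyz, dtype=float)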

The following describes the functionalities of GLAMAR system 110 to support or assist a MAR application 330 on UE 202 in running an AR session 334. As described above, edge computing platform 204 receives location data for target objects 114 from an external positioning system 206 implemented on premises 100.

FIG. 9 is a flow chart illustrating a method 900 of reporting the location data for target objects 114 in an illustrative embodiment. The steps of method 900 will be described with reference to positioning system 206 in FIG. 6, but those skilled in the art will appreciate that method 900 may be performed in other systems. Also, the steps of the flow charts described herein are not all-inclusive and may include other steps not shown, and the steps may be performed in an alternative order.

Positioning system 206 determines location data for one or more target objects 114 (step 902). The location data is in the premises coordinate system 802. In one embodiment, positioning system 206 may determine the location data as follows. A positioning tag 502 at one or more target objects 114 broadcasts packets (e.g., BLE packets). A locator 604 (or multiple locators 604) receives one or more of the packets (optional step 912), and performs a measurement of the AoA of the packet(s) (optional step 914). Locator 604 then reports the measurement(s) to location server 602, and location server 602 determines the location data for the target object(s) 114 based on the measurements from one or more locators 604 (optional step 916). After determining the location data for one or more of the target objects 114, positioning system 206 transmits the location data to edge computing platform 204 (step 904). Positioning system 206 operates in substantially real-time to report the location data, so method 900 may repeat so that location data is continually reported to edge computing platform 204.
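As a deliberately simplified illustration of how AoA measurements can yield a location, the sketch below intersects the horizontal bearing rays reported by two locators at known positions. A deployed location server would typically combine elevation angles and measurements from more locators in a least-squares solve; the function and the example values are illustrative only.

    import numpy as np

    def intersect_bearings(loc_a, az_a, loc_b, az_b):
        """Estimate a tag's horizontal position from two AoA bearings.

        loc_a, loc_b: known (x, y) locator positions in the premises frame.
        az_a, az_b: measured azimuth angles of arrival, in radians, of the tag's
        packet at each locator, expressed in the premises frame.
        Returns the (x, y) intersection of the two bearing rays, i.e., the tag estimate.
        """
        d_a = np.array([np.cos(az_a), np.sin(az_a)])   # unit direction of the ray from locator A
        d_b = np.array([np.cos(az_b), np.sin(az_b)])   # unit direction of the ray from locator B
        # Solve loc_a + s*d_a = loc_b + t*d_b for the scalars s and t.
        A = np.column_stack((d_a, -d_b))
        rhs = np.asarray(loc_b, dtype=float) - np.asarray(loc_a, dtype=float)
        s, _t = np.linalg.solve(A, rhs)
        return np.asarray(loc_a, dtype=float) + s * d_a

    # Example: two locators 10 m apart both see the tag at 45-degree bearings.
    tag_xy = intersect_bearings((0.0, 0.0), np.pi / 4, (10.0, 0.0), 3 * np.pi / 4)
    # -> approximately (5.0, 5.0)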

In another embodiment, positioning system 206 may report the location data on a need basis or a scheduled basis. For example, positioning system 206 may be linked to other process controls on premises 100, such as a conveyor belt function, so that location reporting starts when the conveyor belt starts to convey. In another example, positioning system 206 may report the location data if a user selects a target object 114 to be tracked through UE 202. UE 202 can then start to follow that target object 114, and may update that target object 114 as a priority. Priority may indicate that the location update communication is prioritized over the other target objects 114. The direction of view may be controlled at least partly by the movement direction of the target object 114. Prioritization may be released by an action of the user or by losing the view of the target object 114.

With the location data provided by positioning system 206, GLAMAR service 203 operates to generate coordinates for the target objects 114. FIG. 10 is a flow chart illustrating a method 1000 of providing a GLAMAR service 203 at edge computing platform 204 in an illustrative embodiment. The steps of method 1000 will be described with reference to edge computing platform 204 in FIG. 4, but those skilled in the art will appreciate that method 1000 may be performed in other systems, such as in a 5G environment, for example.

GLAMAR controller 406 of edge computing platform 204 receives the location data for the target objects 114 from positioning system 206 (step 1002). GLAMAR controller 406 may receive the location data from positioning system 206 via radio interface component 402, network interface component 404, or through another interface or port depending on connectivity with positioning system 206. GLAMAR controller 406 also receives sensor data from the target objects 114 (step 1004). As described above, an IMU sensor 504 or the like may be implemented at each target object 114 to collect sensor data (e.g., linear acceleration, orientation, and/or heading) for its corresponding target object 114. The IMU sensors 504 report the sensor data through wireless communications, which is received by GLAMAR controller 406, such as through radio interface component 402.

Noise or other interference, such as from nearby radio sources, may inject errors into the location data received from positioning system 206. Thus, GLAMAR controller 406 estimates coordinates for the target objects 114 in the premises coordinate system 802 based on the location data and the sensor data (step 1006). GLAMAR controller 406 then transmits the coordinates for the target objects 114 to UE 202 (step 1008) and/or other UEs. GLAMAR controller 406 operates in substantially real-time to report the coordinates, so method 1000 may repeat so that the coordinates are continually transmitted to UE 202 and/or other UEs, or at least transmitted on an as-needed basis.

In one embodiment, GLAMAR controller 406 may limit the coordinates that are sent to UE 202 based on which target objects 114 are within the FoV 210 of UE 202. For example, IMU sensor 312 in UE 202 may transmit sensor data to edge computing platform 204 indicating an orientation of UE 202 on premises 100. GLAMAR controller 406 receives the sensor data from IMU sensor 312 (optional step 1010). GLAMAR controller 406 identifies one or more target objects 114 within the FoV 210 of UE 202 based on the orientation (optional step 1012), and restricts the coordinates that are transmitted to UE 202 to those target objects 114 within the FoV 210 of UE 202 (optional step 1014). Therefore, GLAMAR controller 406 may transmit the coordinates exclusively for one or more target objects 114 that are within the FoV 210 of UE 202. This may reduce the amount of data that is streamed from edge computing platform 204 to UE 202 as part of the GLAMAR service 203. However, GLAMAR controller 406 may provide location information for all target objects 114 on premises 100, or for a subset of the target objects 114, depending on the desired implementation.
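
The FoV restriction of optional steps 1010-1014 can be pictured with a short sketch. This is a minimal illustration in Python, assuming the UE's position and heading in the horizontal plane of the premises coordinate system 802 have already been derived from the IMU sensor data; the function and parameter names are hypothetical.

import numpy as np

def targets_in_fov(ue_xy, ue_heading_rad, fov_rad, targets):
    # Return the subset of target coordinates (premises coordinate system)
    # whose bearing from the UE lies within the camera's horizontal FoV.
    visible = {}
    for target_id, (tx, ty) in targets.items():
        bearing = np.arctan2(ty - ue_xy[1], tx - ue_xy[0])
        # Smallest signed angle between the UE heading and the target bearing.
        delta = np.arctan2(np.sin(bearing - ue_heading_rad),
                           np.cos(bearing - ue_heading_rad))
        if abs(delta) <= fov_rad / 2.0:
            visible[target_id] = (tx, ty)
    return visible

# Example: only the target roughly straight ahead of the UE is reported.
targets = {"robot-1": (8.0, 2.1), "robot-2": (-4.0, 6.0)}
print(targets_in_fov(ue_xy=(0.0, 0.0), ue_heading_rad=0.0,
                     fov_rad=np.deg2rad(60.0), targets=targets))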

In one embodiment, GLAMAR controller 406 may use filter 414 to estimate the coordinates for the target objects 114. FIG. 11 is a flow chart illustrating a method 1100 of estimating coordinates for a target object 114 in an illustrative embodiment. According to the filter algorithm, the location of the target object 114 is represented with virtual particles or samples. GLAMAR controller 406 estimates the coordinates for the target object 114 as an average of the virtual particles (step 1102). Filter 414 is considered a “regenerative particle filter” in that the positions of the virtual particles are recalculated in response to update events indicated in the location data and/or sensor data for the target object 114. For example, the update events may include an acceleration update and a location update when the target object 114 is moving, and a motionless update for the target object 114 when it is stationary. The acceleration update occurs when receiving acceleration data (i.e., in the sensor data) for the target object 114 from the IMU sensor 504. The location update occurs when receiving location data for the target object 114 from positioning system 206. The motionless update occurs when the target object 114 has been motionless or stationary for a threshold time. These updates occur at irregular times, which are referred to as update times.

When receiving acceleration data for the target object 114 from the IMU sensor 504 at a present update time, GLAMAR controller 406 updates the positions of the virtual particles based on an estimated acceleration of the virtual particles at a preceding update time (step 1104). GLAMAR controller 406 also calculates an estimated acceleration of the virtual particles at the present update time based on a normal distribution of the acceleration data (step 1106). When receiving location data for the target object 114 at a present update time, such as from positioning system 206, GLAMAR controller 406 updates the positions of the virtual particles based on an estimated acceleration of the virtual particles from a preceding update time (step 1108). GLAMAR controller 406 calculates or computes weighted values for the virtual particles based on the location data (step 1110), and re-samples the virtual particles with the weighted values (step 1112).

When receiving an indicator that the target object 114 is motionless at the present update time, GLAMAR controller 406 calculates an estimated location of the target object 114 as a mean of the virtual particles (step 1114), and re-samples the virtual particles around the estimated location of the target object (step 1116).

The following further describes re-sampling of the virtual particles, and the three types of measurement events that are generated in the GLAMAR framework as described above. Location updates give an estimate of the present location of a target object 114 in the premises coordinate system 802. If Z is the present location, then the estimate of the present location is Y ~ N(Z, σ_L), where σ_L is the covariance matrix and may be estimated from measurements as shown in FIG. 12. FIG. 12 is a distribution of location error in an illustrative embodiment. N(a, b) denotes a normal distribution with mean a and covariance b.

The acceleration update gives the present acceleration of the target object 114. A sensor fusion technique may be used to measure the orientation of the target object 114 with respect to the reference (Earth’s) coordinate system 805. Using the knowledge of the orientation of the premises coordinate system 802 with respect to the reference coordinate system 805, the accelerometer reading of the target object 114 is converted to the premises coordinate system 802. As in the case of the location update, if A is the true acceleration, then the estimate of the acceleration is N(A, σ_A). The covariance matrix σ_A may be estimated from measurements as shown in FIG. 13. FIG. 13 is a distribution of an accelerometer sensor reading while a target object 114 is static in an illustrative embodiment. The acceleration measurement is noisy and will in general give non-zero values even if the target object 114 is not moving.
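
A minimal sketch of the frame conversion described above, assuming the IMU sensor 504 reports the target object's orientation as a unit quaternion with respect to the reference coordinate system 805, and that the premises coordinate system 802 differs from the reference frame only by a known yaw angle. The quaternion ordering and the helper names are assumptions for illustration.

import numpy as np

def quat_to_rotmat(q):
    # Rotation matrix from a unit quaternion q = (w, x, y, z).
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def accel_to_premises(accel_body, q_earth_from_body, premises_yaw_rad):
    # Rotate a body-frame accelerometer reading first into the reference
    # (Earth) coordinate system, then into the premises coordinate system,
    # which is assumed to differ from the Earth frame by a yaw rotation.
    r_earth_from_body = quat_to_rotmat(q_earth_from_body)
    c, s = np.cos(-premises_yaw_rad), np.sin(-premises_yaw_rad)
    r_premises_from_earth = np.array([[c, -s, 0.0],
                                      [s,  c, 0.0],
                                      [0.0, 0.0, 1.0]])
    return r_premises_from_earth @ r_earth_from_body @ np.asarray(accel_body, dtype=float)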

The motionless update indicates that the target object 114 is stationary. Combining the acceleration measurement with gyroscope readings using sensor fusion techniques, it is possible to derive with an extremely high level of confidence whether the target object 114 is stationary or not. The IMU sensor 504 at the target object 114 sends out a motionless indication (i.e., in the sensor data) when it detects that it is not in motion.
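
The stationarity test can be sketched as a simple threshold check over a short window of IMU samples. The thresholds and window handling below are assumptions; a real implementation may use more elaborate sensor fusion.

import numpy as np

GRAVITY = 9.81  # m/s^2

def is_motionless(accel_window, gyro_window, accel_tol=0.15, gyro_tol=0.05):
    # Declare the target stationary when, over a short window of samples,
    # the accelerometer magnitude stays near gravity and the gyroscope
    # magnitude stays near zero (thresholds are illustrative).
    accel_mags = np.linalg.norm(np.asarray(accel_window, dtype=float), axis=1)
    gyro_mags = np.linalg.norm(np.asarray(gyro_window, dtype=float), axis=1)
    return (np.all(np.abs(accel_mags - GRAVITY) < accel_tol)
            and np.all(gyro_mags < gyro_tol))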

These updates arrive at irregular intervals. Let the times at which the updates arrive at GLAMAR controller 406 be denoted by t_1 < t_2 < t_3 < .... Each of these arrival times t_j (where j = 1, 2, 3, ...) corresponds to one of the three types of updates. We use Δ_j = t_{j+1} − t_j to denote the time between updates j and j+1. We use Z_{t_j} to denote the location of the target object 114 at time t_j. In order to simplify notation, we use Z_j to denote Z_{t_j}, and more generally we use the subscript j to denote t_j. We use V_j and A_j to denote the velocity and acceleration of the target object 114 at time t_j. Assume that the initial location Z_0 is known and that the target object 114 is stationary (i.e., V_0 = A_0 = 0).

Filter 414 is a Bayesian update mechanism to track the location of moving objects. Filter 414 represents the location of a target object 114 using n virtual particles. Each virtual particle is evolved in time using the location data and sensor data received at GLAMAR controller 406. The coordinates of the target object 114 at its actual location are estimated as the average of the locations of the n virtual particles. One component in the filter algorithm is the re-sampling step that is performed whenever there is a location update. Re-sampling of the virtual particles eliminates unlikely virtual particles while reinforcing more likely virtual particles, which controls the variance of the location estimator. A natural approach when receiving a motionless update is to simply freeze the existing virtual particles until receiving an acceleration update. However, this leads to poor location estimation. Therefore, a regeneration step is performed when a motionless update is received. In the regeneration step, we first determine the current expected location of the virtual particles and then generate n virtual particles around the current expected location. This regeneration step improves the performance of the particle filter algorithm significantly. With this additional regeneration step, filter 414 is referred to herein as a “regenerative particle filter”.

The particle filter algorithm is described as follows. We use X_j^m to denote the estimated location of particle m at time t_j. In addition, we use V_j^m and A_j^m to denote the velocity and acceleration of particle m at time t_j. At the initial time t_0, assume that we have estimates X_0, V_0, A_0 of the initial target object location, initial velocity, and initial acceleration, respectively. Assume, for example, that an acceleration update A_j is received at time t_j. In this case, the current location and velocity of the particles are updated according to:

X_j^m = X_{j−1}^m + V_{j−1}^m Δ_{j−1} + (1/2) A_{j−1}^m Δ_{j−1}^2,   V_j^m = V_{j−1}^m + A_{j−1}^m Δ_{j−1}

We then generate A_j^m ~ N(A_j, σ_A) for each particle m.

Assume, for example, that a location update Y_j is received at time t_j. In this case, the current location and velocity of the particles are updated according to:

X_j^m = X_{j−1}^m + V_{j−1}^m Δ_{j−1} + (1/2) A_{j−1}^m Δ_{j−1}^2,   V_j^m = V_{j−1}^m + A_{j−1}^m Δ_{j−1}

Next, the conditional probability is calculated as follows: w_j^m = Pr[Y_j | X_j^m]. This probability may be computed since, given X_j^m, the actual location of the target object 114 is distributed as:

N(X_j^m, σ_L)

Once the values of w_j^m are generated for all particles m, n samples are drawn with replacement to generate the new X_j^m, and A_j^m is set to A_{j−1}^m.

When a motionless update is received at time t_j, the estimated location of the target object, Z_j, is computed as the mean of the particle locations:

Z_j = (1/n) Σ_m X_j^m

A new set of n particles X_j^m ~ N(Z_j, σ_L) may be generated. V_j^m is set to 0, and A_j^m is set to 0 for all particles m. The particles are regenerated around the estimated location at this step.
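
Putting the three update types together, the following is a minimal sketch of the regenerative particle filter in Python. It assumes isotropic noise (scalar standard deviations standing in for the covariance matrices σ_A and σ_L), planar coordinates, and hypothetical class and method names; it is meant only to mirror the update rules described above.

import numpy as np

class RegenerativeParticleFilter:
    # Minimal sketch of filter 414: n particles with position, velocity, and
    # acceleration, updated on acceleration, location, and motionless events.

    def __init__(self, z0, n=500, sigma_a=0.2, sigma_l=0.5):
        self.n = n
        self.sigma_a = sigma_a          # std. dev. of acceleration noise
        self.sigma_l = sigma_l          # std. dev. of location noise
        dim = len(z0)
        self.x = np.tile(np.asarray(z0, dtype=float), (n, 1))  # positions
        self.v = np.zeros((n, dim))                             # velocities
        self.a = np.zeros((n, dim))                             # accelerations

    def _propagate(self, dt):
        # Advance each particle using its acceleration from the preceding update.
        self.x += self.v * dt + 0.5 * self.a * dt ** 2
        self.v += self.a * dt

    def acceleration_update(self, a_meas, dt):
        self._propagate(dt)
        # Draw new per-particle accelerations around the measured value.
        self.a = np.random.normal(a_meas, self.sigma_a, size=self.a.shape)

    def location_update(self, y_meas, dt):
        self._propagate(dt)
        # Weight particles by the likelihood of the measured location.
        sq_dist = np.sum((self.x - np.asarray(y_meas, dtype=float)) ** 2, axis=1)
        w = np.exp(-0.5 * sq_dist / self.sigma_l ** 2)
        w /= np.sum(w)
        # Resample with replacement; unlikely particles are eliminated.
        idx = np.random.choice(self.n, size=self.n, replace=True, p=w)
        self.x, self.v, self.a = self.x[idx], self.v[idx], self.a[idx]

    def motionless_update(self):
        # Regenerate particles around the current estimate instead of freezing.
        z_hat = self.estimate()
        self.x = np.random.normal(z_hat, self.sigma_l, size=self.x.shape)
        self.v[:] = 0.0
        self.a[:] = 0.0

    def estimate(self):
        # Estimated coordinates of the target object: mean of the particles.
        return self.x.mean(axis=0)

# Example: feed the three kinds of updates as they arrive.
pf = RegenerativeParticleFilter(z0=(0.0, 0.0))
pf.acceleration_update(a_meas=(0.1, 0.0), dt=0.2)
pf.location_update(y_meas=(0.05, 0.01), dt=0.3)
pf.motionless_update()
print(pf.estimate())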

FIG. 14 is a flow chart illustrating a method 1400 of operating a GLAMAR agent 201 in a UE 202 in an illustrative embodiment. The steps of method 1400 will be described with reference to UE 202 in FIG. 2, but those skilled in the art will appreciate that method 1400 may be performed in other mobile devices.

It is assumed in this embodiment that a user is operating UE 202 on premises 100, and more particularly, within building 104 on premises 100. UE 202 establishes wireless communications with edge computing platform 204, such as by pairing radio interface component 302 with radio interface component 402 of edge computing platform 204. MAR application 330 in UE 202 instantiates and runs an AR session 334 (step 1402). For the AR session 334, screen 350 displays the digital images within the FoV 210 of camera 310 (step 1406). MAR application 330 overlays or superimposes virtual content 335 on one or more of the target objects 114 within the FoV 210 of the UE’s camera 310 (step 1408). In this embodiment, MAR application 330 overlays the virtual content 335 based on coordinates provided by GLAMAR agent 201.

GLAMAR agent 201 receives the coordinates (e.g., continuously) for one or more target objects 114 from GLAMAR controller 406 of edge computing platform 204 (step 1410). The coordinates from GLAMAR controller 406 are received in substantially real-time, and are in the premises coordinate system 802. As described above, MAR application 330 builds an independent AR coordinate system 803 for the AR session 334. GLAMAR agent 201 transforms, translates, or converts the coordinates from the premises coordinate system 802 to the AR coordinate system 803 of the AR session 334 (step 1412). After conversion, GLAMAR agent 201 provides the coordinates for the target objects 114 in the AR coordinate system 803 to MAR application 330 (step 1414). MAR application 330 then overlays or superimposes the virtual content 335 on one or more of the target objects 114 within the FoV 210 of the UE’s camera 310 based on the coordinates provided by GLAMAR agent 201 (step 1408).

FIG. 15 illustrates virtual content 335 overlaid on target objects 114 in an illustrative embodiment. In the example of FIG. 15, two target objects 114 are shown in the FoV 210 of UE 202 for the AR session 334. MAR application 330 overlays virtual content 335 on the two target objects 114 displayed on screen 350. MAR application 330 is able to accurately overlay the virtual content 335 on the target objects 114 based on the coordinates provided by GLAMAR agent 201. The virtual content 335 assists the user of UE 202 in identifying, inspecting, and/or controlling the target objects 114 in the virtual world, and the effects of those actions are carried out in the physical world in real-time. For example, the virtual content 335 may allow a user of UE 202 to identify and distinguish two robots performing a certain task of interest. The virtual content 335 may also display control information about the robots for the user to control the robots in real-time (e.g., move a robot to a particular place by dragging its image on screen 350).
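
As a small sketch of steps 1410-1414, assume GLAMAR agent 201 holds the coordinate transformation matrix 336 as a 4x4 homogeneous matrix mapping premises coordinates into the AR coordinate system (its computation is sketched further below); the callback and dictionary layout are assumptions made for illustration.

import numpy as np

def premises_to_ar(transform_4x4, premises_xyz):
    # Apply the homogeneous coordinate transformation matrix to one point.
    p = np.append(np.asarray(premises_xyz, dtype=float), 1.0)
    return (transform_4x4 @ p)[:3]

def on_coordinates_received(transform_4x4, target_coords, provide_to_mar_app):
    # Steps 1412-1414: convert each target's coordinates and pass them to the
    # MAR application, which overlays virtual content 335 at those positions.
    converted = {target_id: premises_to_ar(transform_4x4, xyz)
                 for target_id, xyz in target_coords.items()}
    provide_to_mar_app(converted)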

In one embodiment, UE 202 may be located in outdoor area 106 (see FIG. 1) and is able to view target objects 114 within building 104 through AR session 334. GLAMAR controller 406 in edge computing platform 204 may transmit a virtual display to UE 202 in addition to the coordinates for the target objects 114. The virtual display may appear when MAR application 330 is directed towards a wall of building 104 behind which the target objects 114 can be found.

In one embodiment, GLAMAR agent 201 computes a coordinate transformation matrix 336 to translate the coordinates from the premises coordinate system 802 to the AR coordinate system 803 (optional step 1420 of FIG. 14). FIG. 16 is a flow chart illustrating a method 1600 of computing a coordinate transformation matrix 336 in an illustrative embodiment. Knowledge of both the origin and the orientation of the two coordinate systems is needed to compute the coordinate transformation matrix 336. Even though such information is easily available for the premises coordinate system 802, it is non-trivial to know the origin and orientation of the AR coordinate system 803. Furthermore, each GLAMAR agent 201 in a UE 202 has a different and independent AR coordinate system 803. For computing the coordinate transformation matrix 336, GLAMAR agent 201 determines both a rotation matrix 337 and a translation matrix 338.

Independent of the translation between the two coordinate systems, GLAMAR agent 201 first estimates the rotation matrix 337 by treating the origins of the AR coordinate system 803 and the premises coordinate system 802 as coincident. GLAMAR agent 201 performs location measurements at two distinct locations of UE 202 (step 1602). GLAMAR agent 201 performs the location measurements when UE 202 is stationary, and performs the location measurements in both the premises coordinate system 802 and the AR coordinate system 803. For instance, GLAMAR agent 201 determines whether UE 202 is stationary (optional step 1610). GLAMAR agent 201 may use a threshold-based technique on gyroscope sensor data from IMU sensor 312 or the like to determine whether UE 202 is stationary. When UE 202 is stationary, GLAMAR agent 201 collects a location measurement for UE 202 at the present location (optional step 1612) in the premises coordinate system 802 and the AR coordinate system 803. When UE 202 remains stationary at the present location for a threshold time, such as five seconds, GLAMAR agent 201 saves the location measurement at the present location (optional step 1614). This eliminates random fluctuation of the location measurement. When UE 202 moves before the threshold time expires, GLAMAR agent 201 discards the location measurement (optional step 1616). When collecting location measurements in the premises coordinate system 802, GLAMAR agent 201 or another element of UE 202 may interact with positioning system 206. For example, GLAMAR agent 201 may determine location information for UE 202 based on BLE packets or other signals received from a locator 604 of positioning system 206. In this way, the location of UE 202 is concealed from positioning system 206. In another example, positioning system 206 may track the location of UE 202 and report location information to GLAMAR agent 201. In one embodiment, positioning system 206 may track the orientation and/or direction of UE 202 and may report the orientation and/or direction of UE 202 to GLAMAR agent 201.
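
A minimal sketch of optional steps 1610-1616, assuming a gyroscope-magnitude threshold for stationarity and the five-second hold time mentioned above; the reader callbacks and threshold values are hypothetical.

import time
import numpy as np

HOLD_SECONDS = 5.0     # threshold time from the description
GYRO_THRESHOLD = 0.05  # rad/s; illustrative stationarity threshold

def collect_stationary_measurement(read_gyro, read_premises_xy, read_ar_xz):
    # Return a paired (premises, AR) location measurement once the UE has
    # been stationary for HOLD_SECONDS, or None if it moves before then.
    start = time.monotonic()
    premises, ar = read_premises_xy(), read_ar_xz()
    while time.monotonic() - start < HOLD_SECONDS:
        if np.linalg.norm(read_gyro()) > GYRO_THRESHOLD:
            return None          # UE moved; discard the measurement (step 1616)
        time.sleep(0.1)
    return premises, ar          # save the measurement (step 1614)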

GLAMAR agent 201 then estimates the rotation matrix 337 based on the location measurements at the two distinct locations (step 1604). FIG. 17 illustrates an estimation of a rotation matrix 337 in an illustrative embodiment. Both the AR coordinate system 803 and the premises coordinate system 802 have one axis aligned with the gravity direction, which is the Y-axis for the AR coordinate system 803 and the Z-axis for the premises coordinate system 802 (see FIG. 8). The other two axes represent the same horizontal plane. Therefore, GLAMAR agent 201 computes the rotation to align the X_a-Z_a plane of the AR coordinate system 803 with the X_p-Y_p plane of the premises coordinate system 802. GLAMAR agent 201 derives the rotation based on the location measurements of UE 202 at the two distinct locations in the AR coordinate system 803 and the premises coordinate system 802. In FIG. 17, X_p-Y_p and X_a-Z_a represent the two horizontal planes of the premises coordinate system 802 and the AR coordinate system 803, respectively. In the horizontal plane, A and B represent the two location measurements of UE 202. A and B have the coordinates (x_a1, z_a1) and (x_a2, z_a2) in the AR coordinate system 803, and (x_p1, y_p1) and (x_p2, y_p2) in the premises coordinate system 802. As shown in FIG. 17, AC_a and BC_a are parallel to the X_a and Z_a axes, respectively. Similarly, AC_p and BC_p are parallel to the X_p and Y_p axes, respectively. Thus, the angle of rotation (α) between the AR coordinate system 803 and the premises coordinate system 802 may be derived as follows:

With the angle of rotation determined, the quaternion vector q1 = [0, sin α/2, 0, cos α/2]^T aligns the X_a-Z_a plane of the AR coordinate system 803 with the X_p-Y_p plane of the premises coordinate system 802. The quaternion vector q2 = [sin 45°, 0, 0, cos 45°]^T aligns the Z_a axis of the AR coordinate system 803 with the Y_p axis of the premises coordinate system 802. The rotation matrix 337 is derived by converting the quaternion vectors q1 and q2 into the corresponding rotation matrices R_q1 and R_q2, respectively, and multiplying the two matrices, which allows for mapping between the premises coordinate system 802 and the AR coordinate system 803.
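
A sketch of how steps 1602-1604 might be computed, under the assumptions that the angle of rotation α is taken as the difference between the heading of segment AB in the premises X_p-Y_p plane and in the AR X_a-Z_a plane (an assumption consistent with FIG. 17, with the sign depending on the chosen conventions), and that quaternions are handled in (w, x, y, z) order; the helper names are hypothetical.

import numpy as np

def quat_to_rotmat(q):
    # Rotation matrix from a unit quaternion q = (w, x, y, z).
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def estimate_rotation(a_premises, b_premises, a_ar, b_ar):
    # Estimate rotation matrix 337 from two stationary measurements A and B,
    # each given in the premises (x_p, y_p) and AR (x_a, z_a) horizontal planes.
    theta_p = np.arctan2(b_premises[1] - a_premises[1], b_premises[0] - a_premises[0])
    theta_a = np.arctan2(b_ar[1] - a_ar[1], b_ar[0] - a_ar[0])
    alpha = theta_p - theta_a
    # q1 rotates by alpha about the vertical Y_a axis; q2 rotates 90 degrees
    # about the X axis to map the Z_a axis onto the Y_p axis.
    q1 = (np.cos(alpha / 2.0), 0.0, np.sin(alpha / 2.0), 0.0)
    q2 = (np.cos(np.pi / 4.0), np.sin(np.pi / 4.0), 0.0, 0.0)
    return quat_to_rotmat(q1) @ quat_to_rotmat(q2)   # R_q1 x R_q2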

GLAMAR agent 201 then computes the translation matrix 338 (step 1606) based on the rotation matrix 337 and at least one of the location measurements to shift the origin between the premises coordinate system 802 and the AR coordinate system 803. When shifting the origin, the right-hand rotation rule must be taken into consideration, because the vertical axes of the two coordinate systems are aligned in opposite directions. In one embodiment, the rotation may be applied to point A to convert its coordinates from the AR coordinate system 803 to the premises coordinate system 802 as follows:

[x_p1', y_p1', z_p1']^T = R_q1 × R_q2 × [x_a1, y_a1, z_a1]^T

Then, the translation vector may be calculated as follows:

[t_x, t_y, t_z]^T = [x_p1, y_p1, z_p1]^T − [x_p1', y_p1', z_p1']^T

It is noted that a negative value is used for the Y-axis component because, despite the q2 rotation being applied about the Z_a axis, that axis remains in the opposite direction of Y_p under right-hand rule rotation.

GLAMAR agent 201 then computes the coordinate transformation matrix 336 based on the translation matrix 338 and the rotation matrix 337 (i.e., coordinate transformation matrix = translation matrix × rotation matrix) (step 1608). Ideally, both coordinate systems 802-803 should be fixed, and a one-time measurement should be sufficient. However, as the AR session 334 continues, the AR coordinate system 803 may be adjusted based on surrounding visual features. Also, location information may be erroneous. Thus, multiple estimations of the rotation matrix 337 may be performed over the life of the AR session 334 to improve the accuracy of the coordinate transformation matrix 336.
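
The composition of the coordinate transformation matrix 336 and its use in step 1412 can be sketched as follows, assuming the rotation matrix 337 maps AR coordinates toward premises coordinates (as in the equation for point A above) and deferring the vertical-axis sign adjustment to the note above; the homogeneous-matrix layout and function names are assumptions.

import numpy as np

def build_transformation_matrix(rotation_3x3, a_ar_xyz, a_premises_xyz):
    # Rotate measurement point A from the AR frame toward the premises frame.
    rotated_a = rotation_3x3 @ np.asarray(a_ar_xyz, dtype=float)
    # Translation vector [t_x, t_y, t_z]: difference between A's measured
    # premises coordinates and its rotated AR coordinates (the vertical
    # component may need its sign flipped, per the right-hand-rule note).
    t = np.asarray(a_premises_xyz, dtype=float) - rotated_a
    transform = np.eye(4)                 # 4x4 homogeneous matrix,
    transform[:3, :3] = rotation_3x3      # i.e., translation x rotation
    transform[:3, 3] = t
    return transform                      # maps AR coordinates -> premises

def premises_to_ar_matrix(ar_to_premises_4x4):
    # Step 1412 converts in the opposite direction (premises -> AR), which is
    # obtained by inverting the composed matrix.
    return np.linalg.inv(ar_to_premises_4x4)

In practice, the agent may recompute this matrix whenever a fresh pair of stationary measurements becomes available, consistent with the multiple estimations noted above.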

One technical benefit of GLAMAR system 110 is that the location computation is offloaded into edge computing platform 204, and streamed to UE 202 in substantially real-time using low latency networking. This makes it attractive for MAR-based applications, as processing overhead and battery consumption on UE 202 may be reduced. Additionally, vision-based techniques are not needed on UE 202. Thus, the separation between UE 202 and a target object 114 can be any arbitrary distance, and is not constrained by the limitation of the vision-based algorithms. While computation heavy operations are executed at edge computing platform 204, the lightweight on-device coordinate transformation keeps user information private within UE 202 itself.

Any of the various elements or modules shown in the figures or described herein may be implemented as hardware, software, firmware, or some combination of these. For example, an element may be implemented as dedicated hardware. Dedicated hardware elements may be referred to as “processors”, “controllers”, or some similar terminology. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, a network processor, application specific integrated circuit (ASIC) or other circuitry, field programmable gate array (FPGA), read only memory (ROM) for storing software, random access memory (RAM), non-volatile storage, logic, or some other physical hardware component or module.

Also, an element may be implemented as instructions executable by a processor or a computer to perform the functions of the element. Some examples of instructions are software, program code, and firmware. The instructions are operational when executed by the processor to direct the processor to perform the functions of the element. The instructions may be stored on storage devices that are readable by the processor. Some examples of the storage devices are digital or solid-state memories, magnetic storage media such as magnetic disks and magnetic tapes, hard drives, or optically readable digital data storage media.

As used in this application, the term “circuitry” may refer to one or more or all of the following:

(a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry);

(b) combinations of hardware circuits and software, such as (as applicable):

(i) a combination of analog and/or digital hardware circuit(s) with software/firmware; and

(ii) any portions of hardware processor(s) with software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions; and

(c) hardware circuit(s) and/or processor(s), such as a microprocessor(s) or a portion of a microprocessor(s), that requires software (e.g., firmware) for operation, but the software may not be present when it is not needed for operation.

This definition of circuitry applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term circuitry also covers an implementation of merely a hardware circuit or processor (or multiple processors) or portion of a hardware circuit or processor and its (or their) accompanying software and/or firmware. The term circuitry also covers, for example and if applicable to the particular claim element, a baseband integrated circuit or processor integrated circuit for a mobile device or a similar integrated circuit in a server, a cellular network device, or other computing or network device.

Although specific embodiments were described herein, the scope of the disclosure is not limited to those specific embodiments. The scope of the disclosure is defined by the following claims and any equivalents thereof.