

Title:
METHOD AND SYSTEM FOR OVERLAYING HOLOGRAMS IN AUGMENTED REALITY/MIXED REALITY (AR/MR) APPLICATIONS
Document Type and Number:
WIPO Patent Application WO/2023/168511
Kind Code:
A1
Abstract:
A system and method for overlaying a hologram representing an object of interest into an augmented reality/mixed reality scene including determining a set of model termination points on the object of interest and then determining a set of scene termination points relating to the set of model termination points of the object of interest in an augmented reality/mixed reality scene. A transformation matrix based on the set of model termination points and the set of scene termination points is then generated and used to map the coordinates of the object of interest into the augmented reality/mixed reality scene. An optimization matrix may also be calculated to further enhance the hologram placement.

Inventors:
SHARIF MOHAMMAD MAHDI (CA)
MAZUMDAR VISHVAM (CA)
HAAS CARL THOMAS (CA)
PECEN MARK (CA)
Application Number:
PCT/CA2022/050341
Publication Date:
September 14, 2023
Filing Date:
March 09, 2022
Assignee:
GLOVE SYSTEMS INC (CA)
International Classes:
G06T19/00
Foreign References:
DE102018113047A12019-12-05
US20180197331A12018-07-12
Other References:
S. GARRIDO-JURADO, R. MUÑOZ-SALINAS, F.J. MADRID-CUEVAS, M.J. MARÍN-JIMÉNEZ: "Automatic generation and detection of highly reliable fiducial markers under occlusion", PATTERN RECOGNITION, vol. 47, no. 6, 1 June 2014 (2014-06-01), GB, pages 2280 - 2292, XP055601771, ISSN: 0031-3203, DOI: 10.1016/j.patcog.2014.01.005
Attorney, Agent or Firm:
WONG, Jeffrey et al. (CA)
Claims:
What is Claimed is:

1. A method of coordinating placement of a hologram representing a 3D model of an object of interest in an augmented reality/mixed reality (AR/MR) scene comprising: determining a set of model termination point coordinates for a set of model termination points on the 3D model in a global coordinate system; determining a set of AR/MR scene target point coordinates for the object of interest within the AR/MR scene, the set of AR/MR scene target point coordinates corresponding to the set of model termination point coordinates; calculating a transformation matrix using the set of model termination point coordinates and the set of AR/MR scene target point coordinates; applying the transformation matrix to all coordinate points on the 3D model to generate a set of hologram scene coordinates; and placing the hologram into the AR/MR scene based on the set of hologram scene coordinates.

2. The method of Claim 1 further comprising: generating an updated transformation matrix based on an optimization matrix; applying the updated transformation matrix to the set of hologram scene coordinates to generate a set of updated hologram scene coordinates; and placing the hologram into the AR/MR scene based on the set of updated hologram scene coordinates.

3. The method of Claim 1 further comprising: generating an updated transformation matrix based on an optimization matrix; applying the updated transformation matrix to all coordinate points on the 3D model to generate a set of updated hologram scene coordinates; and placing the hologram into the AR/MR scene based on the set of updated hologram scene coordinates.

4. The method of Claim 1 further comprising, before determining a set of AR/MR scene target point coordinates for the object of interest within the AR/MR scene: determining an AR/MR scene coordinate system.

5. The method of Claim 1 wherein determining a set of model termination point coordinates for a set of model termination points on the 3D model in a global coordinate system comprises: receiving a set of termination points for the 3D model; determining a relationship between the set of termination points; and generating coordinates in the global coordinate system for each termination point in the set of termination points.

6. The method of Claim 1 wherein determining a set of AR/MR scene target point coordinates for the object of interest within the AR/MR scene comprises: placing targets on the object of interest, the location of the targets corresponding to the set of model termination points; and obtaining, via an AR/MR device, coordinates of the targets in the AR/MR scene, the coordinates of the targets in the AR/MR scene representing the set of AR/MR scene target point coordinates or scene termination points.

7. The method of Claim 1 wherein calculating a transformation matrix using the set of model termination point coordinates and the set of AR/MR scene target point coordinates comprises: calculating a rotational matrix and a translation matrix via a principal component analysis methodology, a cross-product methodology or a cross-product using normal vectors methodology.

8. The method of Claim 1 wherein determining a set of model termination point coordinates for a set of model termination points on the 3D model in a global coordinate system comprises: determining a geometric relationship between the set of model termination points.

9. The method of Claim 8 wherein the geometric relationship between the set of model termination points is the same as a geometric relationship between the set of AR/MR scene target point coordinates.

10. A system for coordinating placement of a hologram representing a 3D model of an object of interest in an augmented reality/mixed reality (AR/MR) scene comprising: an input module for receiving the hologram and for receiving a set of model termination points associated with the hologram; a scene detection module for determining a set of scene termination points for a desired location of the object of interest within an AR/MR scene; a processor for determining a transformation matrix mapping the set of model termination points to the set of scene termination points and for applying the transformation matrix to all coordinates of the object of interest to generate a set of hologram scene coordinates; and a visualization module for placing the hologram in the AR/MR scene based on the set of hologram scene coordinates.

11. The system of Claim 10 further comprising an AR/MR device for housing the scene detection module.

12. The system of Claim 11 wherein the AR/MR device further houses the visualization module.

13. The system of Claim 10 further comprising an optimization module for calculating an optimization matrix to be applied to the set of hologram scene coordinates to generate an updated set of hologram scene coordinates.

Description:
METHOD AND SYSTEM FOR OVERLAYING HOLOGRAMS IN AUGMENTED REALITY/MIXED REALITY (AR/MR) APPLICATIONS

Field

The disclosure is generally directed at augmented reality/mixed reality (AR/MR) applications, and more specifically, at a method and system for overlaying holograms in AR/MR applications or an AR/MR scene.

Background

In the domain of augmented reality/mixed reality (AR/MR), a challenge exists to accurately and robustly overlay, or superimpose, holograms (representing an object of interest) onto a desired location for the object of interest in a scene viewed by an AR/MR device such as a Microsoft™ HoloLens device, an iPhone™, an iPad™, or any other AR/MR enabled device. Placement of the hologram in its desired location within the scene is not a straightforward process. This is because the coordinate system of the object of interest, or of the 3D model of the object of interest on which the hologram is based, differs from the coordinate system of the AR/MR unit or device. Additionally, the relationship between the object of interest and the AR/MR unit is unknown.

Currently, two general approaches exist to overlay a hologram onto an AR/MR scene. The first approach is to use common features between the 3D model/object of interest and the AR/MR scene. The shortcomings of this approach include: (1) there are not always enough distinguishing features on the object of interest; in other words, the performance depends on the geometry of the 3D model of the object of interest that is intended to be viewed; (2) long computation time; and/or (3) the fit between the hologram of the object of interest and the scene is a general fit between two parametric objects; however, in quality control applications, the fit needs to be based on datum points so that deviations can be properly interpreted and reported. The second approach is to use machine learning algorithms to train a model to detect a type of object. For example, a convolutional neural network (CNN) can be trained to detect a specific 20” flange. This approach is only effective if the same object is manufactured, or displayed in a scene, multiple times. However, this assumption may not always hold. For example, in fabrication applications, parts are non-recurring and often custom made.

Therefore, there is provided a novel method and system for overlaying holograms in augmented reality/mixed reality (AR/MR) applications.

Summary

The disclosure is directed at a method and apparatus for overlaying holograms in augmented reality/mixed reality (AR/MR) applications.

In one embodiment, the disclosure includes the selection of at least three corresponding termination points between a digital, or physical, 3D model (model termination points) and an AR/MR scene (scene termination points). The detection of points in the AR/MR scene may be achieved by placing targets onto the scene termination points and then detecting the targets in the scene. Detection of the targets in the scene allows for the creation of correspondence between the hologram and the as-built environment. This correspondence may be implemented via a transformation matrix. Depending on how corresponding termination points are selected between the 3D model and the scene, the 3D model may supplement an object of interest within the scene, or it can be overlaid onto a corresponding object within the scene.

One advantage of the current disclosure is the provision of a method and apparatus that is able to overlay holograms of an object of interest into an AR/MR scene that is not dependent on the geometry of the object of interest. Another advantage is that the method of the disclosure may be simpler and/or faster than current approaches to complete and visualize a hologram within the AR/MR scene. A further advantage is that the disclosure may be more accurate due to its use of termination points along with an optimization calculation or process. Another advantage is that the use of termination points in providing the hologram overlay may be useful in fabrication quality control processes.

In one aspect of the disclosure, there is provided a method of coordinating placement of a hologram representing a 3D model of an object of interest in an augmented reality/mixed reality (AR/MR) scene including determining a set of model termination point coordinates for a set of model termination points on the 3D model in a global coordinate system; determining a set of AR/MR scene target point coordinates for the object of interest within the AR/MR scene, the set of AR/MR scene target point coordinates corresponding to the set of model termination point coordinates; calculating a transformation matrix using the set of model termination point coordinates and the set of AR/MR scene target point coordinates; applying the transformation matrix to all coordinate points on the 3D model to generate a set of hologram scene coordinates; and placing the hologram into the AR/MR scene based on the set of hologram scene coordinates.

In another aspect, the method further includes generating an updated transformation matrix based on an optimization matrix; applying the updated transformation matrix to the set of hologram scene coordinates to generate a set of updated hologram scene coordinates; and placing the hologram into the AR/MR scene based on the set of updated hologram scene coordinates. In a further aspect, the method further includes generating an updated transformation matrix based on an optimization matrix; applying the updated transformation matrix to all coordinate points on the 3D model to generate a set of updated hologram scene coordinates; and placing the hologram into the AR/MR scene based on the set of updated hologram scene coordinates. In yet a further aspect, before determining a set of AR/MR scene target point coordinates for the object of interest within the AR/MR scene, determining an AR/MR scene coordinate system. In another aspect, determining a set of model termination point coordinates for a set of model termination points on the 3D model in a global coordinate system includes receiving a set of termination points for the 3D model; determining a relationship between the set of termination points; and generating coordinates in the global coordinate system for each termination point in the set of termination points.

In yet a further aspect, determining a set of AR/MR scene target point coordinates for the object of interest within the AR/MR scene includes placing targets on the object of interest, the location of the targets corresponding to the set of model termination points; and obtaining, via an AR/MR device, coordinates of the targets in the AR/MR scene, the coordinates of the targets in the AR/MR scene representing the set of AR/MR scene target point coordinates or scene termination points. In another aspect, calculating a transformation matrix using the set of model termination point coordinates and the set of AR/MR scene target point coordinates includes calculating a rotational matrix and a translation matrix via a principal component analysis methodology, a cross-product methodology or a cross-product using normal vectors methodology. In a further aspect, determining a set of model termination point coordinates for a set of model termination points on the 3D model in a global coordinate system includes determining a geometric relationship between the set of model termination points. In another aspect, the geometric relationship between the set of model termination points is the same as a geometric relationship between the set of AR/MR scene target point coordinates.

In another aspect of the disclosure, there is provided a system for coordinating placement of a hologram representing a 3D model of an object of interest in an augmented reality/mixed reality (AR/MR) scene including an input module for receiving the hologram and for receiving a set of model termination points associated with the hologram; a scene detection module for determining a set of scene termination points for a desired location of the object of interest within an AR/MR scene; a processor for determining a transformation matrix mapping the set of model termination points to the set of scene termination points and for applying the transformation matrix to all coordinates of the object of interest to generate a set of hologram scene coordinates; and a visualization module for placing the hologram in the AR/MR scene based on the set of hologram scene coordinates.

In another aspect, the system further includes an AR/MR device for housing the scene detection module. In yet another aspect, the AR/MR device further houses the visualization module. In yet a further aspect, the system includes an optimization module for calculating an optimization matrix to be applied to the set of hologram scene coordinates to generate an updated set of hologram scene coordinates.

Description of Drawings

Embodiments of the present disclosure will now be described, by way of example only, with reference to the attached Figures.

Figure 1a is a photograph of a worker drawing centerlines using manual tools and chalk;

Figure 1b is a photograph of a set of three (3) scene termination points located on a flange;

Figure 1c is a photograph of targets placed on scene termination points of the flange;

Figures 2a to 2f are example targets;

Figure 3 is a schematic diagram of one embodiment of a system for overlaying holograms in augmented reality/mixed reality (AR/MR) applications;

Figure 4a is a flowchart outlining a method of overlaying holograms in augmented reality/mixed reality (AR/MR) applications;

Figure 4b is a flowchart outlining another embodiment of method of overlaying holograms in AR/MR applications;

Figure 5a is a workflow showing the superimposition of a hologram into an AR/MR scene;

Figure 5b is a 3D model (or hologram) of a table that is to be placed in the AR/MR scene;

Figure 5c is a schematic diagram of how the object of interest is processed within the AR/MR scene;

Figure 5d is an AR/MR visualization of the table in the AR/MR scene;

Figure 6a is a flowchart outlining a method of superimposing a hologram to supplement an AR/MR scene;

Figure 6b is a 3D model of the machine component that is to be placed in the AR/MR scene;

Figure 6c is an example of how the machine component is processed within the AR/MR scene;

Figure 6d is an AR/MR visualization of the machine component in the AR/MR scene;

Figure 7 is a schematic diagram of another embodiment of an apparatus for use in overlaying a hologram in an AR/MR scene;

Figure 8 is a schematic representation of selected points for use in a cross-product methodology calculation;

Figure 9 is a schematic representation of selected points for use in a cross-product using normal vectors methodology calculation;

Figure 10 is a schematic representation of selected points for use in an optimization matrix methodology calculation; and

Figures 11a to 11c are examples of user interfaces for interacting with a user.

Detailed Description of the Embodiments

For ease of understanding the following disclosure, definitions are now provided for terms which will be used in the description below.

“Termination points” are defined as points on objects or assemblies that represent a geometric feature. For example, a termination point may be the center-point of a flange, or the intersection of two centerlines within that flange or any other object of interest. These points can also be described as points where assemblies are constrained or connected to each other. Figures 1a to 1c are photographs of a worker locating termination points on a flange, perpendicular to the centerline of the flange body. As shown in Figure 1b, the termination points are selected as the intersections of the centerlines with the circumference of the flange. It should be noted that termination points selected on the model may be referred to as model termination points, and termination points selected (or obtained) from the scene may be referred to as scene termination points.
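
As an illustrative sketch of how such termination points might be computed for the flange example, assuming the flange center, outer radius and a centerline direction in the flange plane are known (the helper name and its inputs are assumptions for illustration, not part of the disclosure):

```python
import numpy as np

def flange_termination_points(center, radius, direction):
    """Points where one centerline crosses the flange circumference.

    Hypothetical helper: `center` is the flange center, `radius` the outer
    radius, and `direction` a vector along one centerline in the flange
    plane. Each centerline meets the circumference at two termination points.
    """
    center = np.asarray(center, dtype=float)
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)             # ensure a unit direction
    return center + radius * d, center - radius * d
```

Two perpendicular centerlines would yield the four termination points pictured in Figure 1b.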

“Hologram” is defined as a three-dimensional (3D) image formed by the interference of light beams from a laser or other coherent light source. In the disclosure, a hologram may be seen as a representation of an object of interest that is superimposed, or overlaid, onto an augmented reality/mixed reality (AR/MR) scene, and that can be viewed by an AR/MR device (such as, but not limited to, a Microsoft™ HoloLens device, an iPhone™, an iPad™, or any other AR/MR enabled component) or unit.

“Target” is defined as an object with a unique pattern or geometry that can be detected by a camera of an AR/MR device or unit. Various methods can be used to detect each individual target in a scene, including, but not limited to, machine vision-based approaches such as pattern recognition, edge detection, and RANSAC. Figures 2a to 2f show example targets. Figures 2a to 2c are targets that may be used for pattern recognition, Figure 2d is a target for use in edge detection combined with RANSAC, Figure 2e is a target for use in edge detection combined with RANSAC and geometry optimization, and Figure 2f is a target for use in RANSAC and geometry optimization.

“3D model” or “3D object” refers to an object of interest that is to be placed within an AR/MR scene.

“Reference point” is defined as a point on a target that has been detected. Generally, each target has at least one reference point. Once a target is detected, the coordinates of a reference point (in the AR/MR unit’s, or AR/MR scene, coordinate system) related to the target can be obtained. Also, the normal vector of the target can be calculated based on this reference point. The reference point may or may not be the center point of the target, as it depends on how the reference point is defined with respect to the target.
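
As an illustrative sketch, a target's normal vector can be computed from three detected non-collinear points on the planar target (for example, corner points around the reference point). The helper below is hypothetical and assumes NumPy; it is not part of the disclosure:

```python
import numpy as np

def target_normal(points):
    """Unit normal of a planar target from three detected non-collinear points.

    `points` is a hypothetical (3, 3) array of point coordinates on the
    target, expressed in the AR/MR scene coordinate system, one per row.
    """
    points = np.asarray(points, dtype=float)
    v1 = points[1] - points[0]
    v2 = points[2] - points[0]
    n = np.cross(v1, v2)               # perpendicular to the target plane
    return n / np.linalg.norm(n)
```

The sign of the normal depends on the ordering of the detected points, which in practice would be fixed by the target's pattern.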

“Supplementing a hologram” refers to cases where the 3D object has components that do not already exist in the AR/MR scene.

“Overlaying a hologram” refers to cases where all components of the 3D object already exist in the scene viewed by the AR/MR unit. Overlaying a hologram is directed at accurately superimposing a hologram of the 3D model or object of interest onto its corresponding or desired position or location within the scene. The main application of this is quality control to detect deviations on the as-built from the 3D model.

Turning to Figure 3, a schematic diagram of a system for inserting a hologram into an AR/MR scene is shown.

In one embodiment, the system 300 includes a server, or central processing unit (CPU), 302, a database 304 and an AR/MR device 306. In some embodiments, the AR/MR device 306 is part of the system 300; in other embodiments, the AR/MR device 306 receives instructions or data from the server 302 to place a hologram within an AR/MR scene being viewed by the device 306, and the server 302 performs all processing of information and data. In other embodiments, the processing of data may be shared between the AR/MR device 306 and the server 302. The database 304 stores information that is uploaded to the server as well as results (such as transformation matrices or optimization matrices) that are calculated, generated or processed by the server 302 and/or AR/MR device 306. A user may interact with the server via a user device 308, which may include, but is not limited to, a laptop, a tablet, a desktop computer, a smartphone and the like.

Turning to Figure 4a, a flowchart outlining a method of inserting or placing a hologram into an AR/MR scene is shown. The hologram may be overlaid into the scene or may be superimposed into the scene.

Initially, the system receives a digital 3D model of a physical object of interest that is to be placed into the AR/MR scene (400). The digital 3D model may be uploaded or inputted to the system by the user, such as via the user device. In other embodiments, a digital 3D model of the physical object of interest may be captured, such as via a camera, and then uploaded or inputted to the system. A hologram of the 3D model is then generated by the system (402). In some embodiments, the digital 3D model may be the hologram, so there is no need to generate a further hologram of the 3D model.

A set of model termination points located on the hologram are then selected (404). In one embodiment, the set of model termination points are input into, and received by, the system by the user, however, the model termination points may also be selected or determined by the system, such as via the server. The coordinates of the model termination points are then determined, or calculated (406) with respect to their position or location within a global coordinate system.

An AR/MR scene coordinate system is then generated or determined (408). A desired position of the object of interest within the AR/MR scene is then determined (410) and the coordinates of the target points based on this desired position within the AR/MR scene coordinate system are then calculated (412). The target points may also be seen as scene termination points. In one embodiment, targets (such as those taught above with respect to Figures 2a to 2f) are placed on the object of interest, the positions of the targets corresponding with the model termination points selected above. The object of interest is then viewed in the AR/MR scene with the AR/MR device and the coordinates of the targets, target coordinate points, scene termination points or reference coordinate points with respect to the AR/MR scene are then calculated or determined. In some embodiments, the coordinates of the target points may also be input into the system by a user.

A transformation matrix is then calculated (414). In one embodiment, the transformation matrix is calculated using the coordinates of the model termination points in the global coordinate system and the coordinates of the target points (or scene termination points) in the scene coordinate system. In one embodiment, the transformation matrix may include a rotational matrix (to address orientation of the object of interest in the scene) and a translation matrix (to address positioning of the object of interest in the scene). Examples of how the transformation matrix may be calculated are discussed below. The hologram of the object may then be placed into the AR/MR scene using the transformation matrix (416). In one embodiment, this may be performed by applying the transformation matrix to each coordinate point of the object of interest in the global coordinate system to determine its corresponding coordinate point or location within the AR/MR scene coordinate system.
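
The cross-product methodology mentioned above can be sketched as follows: an orthonormal frame is built from three non-collinear termination points in each coordinate system, and the rotation and translation follow from aligning the two frames. This is an illustrative sketch only, assuming NumPy; the function names are not part of the disclosure:

```python
import numpy as np

def frame_from_points(p1, p2, p3):
    """Right-handed orthonormal frame from three non-collinear points."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    x = p2 - p1
    x /= np.linalg.norm(x)
    n = np.cross(p2 - p1, p3 - p1)   # normal to the plane of the three points
    n /= np.linalg.norm(n)
    y = np.cross(n, x)               # completes the right-handed frame
    return np.column_stack([x, y, n])

def transform_between(model_pts, scene_pts):
    """4x4 homogeneous transform taking model coordinates into scene coordinates.

    `model_pts` and `scene_pts` are (3, 3) arrays: three termination points,
    one per row, with identical geometric relationships in both systems.
    """
    Fm = frame_from_points(*model_pts)
    Fs = frame_from_points(*scene_pts)
    R = Fs @ Fm.T                                          # rotational part
    t = np.asarray(scene_pts, dtype=float)[0] - R @ np.asarray(model_pts, dtype=float)[0]
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t                             # translation part
    return T
```

Other methodologies named above (principal component analysis, cross-product using normal vectors) would produce the same kind of combined rotation/translation matrix.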

In some embodiments, the system may then generate or calculate an optimization matrix (418) to be applied (420) to the hologram (or coordinate points of the object of interest in the scene coordinate system) and the hologram updated to provide a more detailed positioning of the hologram within the AR/MR scene.

Turning to Figure 4b, another method of inserting a hologram into an AR/MR scene is shown. In the current embodiment, the method is performed by multiple components, such as the AR/MR device and the server. In Figure 4b, some parts of the method will be understood as being standard and may or may not be associated with the method of the disclosure. It is understood that the parts of the method being executed may occur concurrently or consecutively. In some embodiments, the server and the AR/MR device may perform actions concurrently, as actions relating to one device or component do not affect actions of the other device or component.

With respect to the AR/MR device, it is initially started or booted up (430). Once booted up, the AR/MR device transmits a signal to the server to connect to the server (432). After connecting to the server, the AR/MR device requests a digital 3D model and model termination points from the server (434). As discussed above, the digital 3D model (which may or may not be the hologram) represents the object of interest that is to be placed within a scene observed or controlled by the AR/MR device, and the model termination points represent the coordinates of the selected termination points of the 3D model in the global coordinate space. Prior to the request from the AR/MR device, the digital 3D model (436) and the model termination points (438) may already be stored in the server and simply retrieved from a database, or this information may be input into the server by a user in real-time, such as via a request to the user by the server. In one embodiment, termination target points (or the scene termination points) are selected based on the set of termination points that are associated with the 3D model (model termination points) or are provided by a user. In another embodiment, the model termination points are selected based on a selection or determination of termination points on the 3D model by the server. In yet another embodiment, the model termination points may be included as metadata with the 3D model and the server retrieves these termination points when the 3D model is uploaded by the user or requested by the AR/MR device. The AR/MR device then determines if the requested information has been received from the server (440). If it is not received, the AR/MR device may repeat the request to the server for the 3D model and the model termination points (434) until this information is received.

If the AR/MR device receives the information, the AR/MR device then determines the desired position of the hologram within the AR/MR scene and subsequently, the location of the target, or target reference, points, or scene termination points, within the AR/MR scene or with respect to the scene coordinate system (442). Prior to this determination, a coordinate system with respect to the AR/MR scene (or the scene coordinate system) is captured or determined by the AR/MR device.

Once the AR/MR device has the coordinates of the target points (or scene termination points) in the AR/MR scene, the AR/MR device calculates or generates a transformation matrix (444) based on the coordinates of the model termination points in the global coordinate system and the coordinates of the desired location of the target points, or target reference points, in the scene coordinate system in order to position and orient the hologram within the scene. In some embodiments, the generation of the transformation matrix may be performed by the server.

After the transformation matrix has been calculated, the hologram may be overlaid or superimposed into the AR/MR scene by applying the transformation matrix to all points of the hologram/3D model (444) so that corresponding coordinates for all of the points can be calculated for placement of the object of interest in the AR/MR scene with a correct orientation. The application of the transformation matrix to each of the points assists to orient the 3D model correctly within the AR/MR scene.
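
Applying the transformation matrix to every point of the hologram/3D model reduces to a standard homogeneous-coordinate multiplication. A minimal sketch, assuming NumPy (the helper name is illustrative, not part of the disclosure):

```python
import numpy as np

def apply_transform(T, points):
    """Apply a 4x4 homogeneous transformation matrix to an (N, 3) point array."""
    points = np.asarray(points, dtype=float)
    hom = np.hstack([points, np.ones((len(points), 1))])  # homogeneous coordinates
    return (hom @ T.T)[:, :3]                             # back to Cartesian
```

Because the rotation and translation are applied in a single matrix, every point of the model receives a consistent position and orientation in the scene.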

In some embodiments, the device may further request optimization of the hologram placement within the scene (446). In one embodiment, the device transmits a signal to the server to execute an optimization process on the transformation matrix and/or the coordinates of all of the points in the AR/MR scene. Alternatively, the AR/MR device may include the functionality to perform this optimization.

If the server receives a request for optimization from the device, the server may then select an optimization methodology (448) to use and the bounds or constraints that are to be applied to the optimization. The server then determines if it has received the scene coordinate system from the device (450). If it has not, it continues to wait until it has received it. In some embodiments, the server may request this information from the device requesting the optimization. Once the scene coordinate system has been received, the server may then execute or run the selected optimization methodology (452), such as to generate an optimization matrix. The optimization matrix is then applied to the coordinate points of the object of interest (in the global coordinate space) to determine updated target reference coordinate points in the AR/MR space. These updated coordinates are then transmitted to the device (454) and the data is stored by the system, such as in a database.
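
One possible form of the optimization methodology (the disclosure leaves the specific methodology open) is a rigid least-squares correction, such as the Kabsch algorithm, that minimizes the residual between the currently placed hologram termination points and their measured scene counterparts. The sketch below assumes NumPy and is illustrative only:

```python
import numpy as np

def optimization_matrix(placed_pts, scene_pts):
    """Rigid least-squares correction (Kabsch/SVD) between currently placed
    hologram termination points and their measured scene counterparts.

    Both arguments are (N, 3) arrays of corresponding points, N >= 3.
    """
    placed_pts = np.asarray(placed_pts, dtype=float)
    scene_pts = np.asarray(scene_pts, dtype=float)
    cp, cs = placed_pts.mean(axis=0), scene_pts.mean(axis=0)
    H = (placed_pts - cp).T @ (scene_pts - cs)      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                              # proper rotation (no reflection)
    t = cs - R @ cp
    M = np.eye(4)
    M[:3, :3], M[:3, 3] = R, t
    return M
```

Applying this matrix to the hologram scene coordinates yields the updated set of hologram scene coordinates described above.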

The device then determines if the updated points are received (456) and updates the placement of the hologram within the AR/MR scene accordingly (458).

In one embodiment, the server may include a user interface enabling a user to interact with the server. The user interface may include an interface allowing a user to upload, visualize and select termination points on a digital 3D model; an interface enabling a user to select optimization parameters; and/or an interface enabling a user to view previously collected data. Example user interfaces are shown in Figures 11a to 11c.

Turning to Figure 5a, a specific workflow showing the superimposition of a hologram into an AR/MR scene is shown. As can be seen in Figure 5a, the example workflow includes the use of termination points where the hologram is overlaid onto a 3D object within an AR/MR scene. In the current example, a table is being placed within the AR/MR scene. Figure 5b shows a generated 3D model (or hologram) of the table that is to be placed in the scene; Figure 5c shows an example of how the object of interest (the table) is processed within the scene; and Figure 5d is an example of an AR/MR visualization of the table in the scene.

Initially, a number of model termination points on the table are selected (500) and their coordinates in the global coordinate system determined. These selected model termination points may be seen as where targets are placed on the table (in the AR/MR scene) to assist in the generation of a correspondence between a global coordinate system (or coordinate system of the table) and the coordinate system within the AR/MR device or scene, such as in the form of a transformation matrix. In the current embodiment, three model termination points have been selected; however, in other embodiments, any number of three or more model termination points may be selected. In yet other embodiments, the number of model termination points that are required to be selected depends on the methodology used for calculating the transformation matrix to transform the hologram onto the object of interest in its desired location within the AR/MR scene. As will be understood, the selected model termination points should exist in both the global coordinate system and the scene coordinate system, or AR/MR scene space. As seen in Figure 5b, the three model termination points have been selected on the table and are seen as P1, P2 and P3 within the global coordinate system. In this specific example, the coordinates of these model termination points within the global coordinate system may be seen as:

Targets are then placed on the object of interest at the selected model termination points (502), which may be referred to as scene termination points. Placement of the targets needs to be such that the reference point of the target is superimposed onto the selected termination point of the 3D model as accurately as possible. In the current example, the targets are square targets such as those schematically shown in Figures 2a to 2c. A digital 3D model of the table in its desired location is then captured by an AR/MR device, an AR/MR camera or similar device (504) along with the targets in the AR/MR scene. It is understood that the scene coordinate space is automatically captured or determined by the AR/MR device. The coordinates of the targets in the scene coordinate space are then determined (506). This enables an orientation of the table in its desired location to be captured so that when the table is placed in the AR/MR scene, it is oriented correctly. In one embodiment, the reference points of the targets, or scene termination points, are detected by the AR/MR device within the scene based on the desired location. As seen in Figure 5c, the 3D model includes a set of targets placed at the selected model termination points that is then captured by the AR/MR device. In this specific example, the coordinates of the targets within the AR/MR device, or scene, coordinate system may be seen as:

Having the two sets of coordinates, a transformation matrix may be calculated (508) to enable placement of the table into the AR/MR scene in a correct orientation using a shared coordinate system between the global coordinate system and the AR/MR scene coordinate system. In one embodiment, the transformation matrix may include a rotation matrix and a translation matrix. In this specific example, the coordinates of the termination points and targets within the shared coordinate system may be seen as:

After obtaining the two sets of data in the global coordinate system and scene coordinate system, respectively, the hologram can be transformed and visualized to the user within the AR/MR scene (510). In one embodiment, the transformation matrix may be applied to each of the points of the hologram such that their corresponding coordinates in the AR/MR scene may be determined whereby placement, or overlaying, of the hologram in a correct orientation within the scene may be performed.
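As an illustrative sketch of this step (the function name and the numeric rotation and translation values below are hypothetical, not taken from the described system), the transformation matrix may be applied to every coordinate point of the 3D model as follows:

```python
import numpy as np

def place_hologram(model_points, R, T):
    """Apply the transformation matrix (rotation matrix R and translation
    matrix T) to every coordinate point of the 3D model, yielding the
    hologram's coordinates in the AR/MR scene (P_s = R * P_m + T)."""
    model_points = np.asarray(model_points, dtype=float)  # shape (n, 3)
    return (R @ model_points.T).T + T

# Hypothetical example: a 90-degree rotation about the z-axis plus a shift.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
T = np.array([1.0, 2.0, 0.0])
scene_coords = place_hologram([[1.0, 0.0, 0.0]], R, T)
# The model point (1, 0, 0) maps to (1, 3, 0) in the scene.
```

The same call is made once for the full set of hologram points, with R and T obtained from whichever transformation-matrix calculation is in use.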

Turning to Figure 6a, a method of superimposing a hologram using model termination points where the hologram is supplementing the AR/MR scene is shown. In the current example, a machine component is being placed within the AR/MR scene. Figure 6b shows a generated 3D model of the machine component that is to be placed in the scene; Figure 6c shows an example of how the machine component is processed within the AR/MR scene; and Figure 6d is an example of an AR/MR visualization of the machine component in the AR/MR scene.

Initially, a number of model termination points on the 3D model (or hologram) of the machine component are selected (600) and the coordinates of these model termination points within the global coordinate system determined. These selected model termination points may be seen as where targets are placed on the 3D model to assist in the generation of a correspondence between the global coordinate system (or coordinate system of the 3D model) and the coordinate system within the AR/MR device or scene. As with the example of Figures 5a to 5d, the selected model termination points should exist in both the global coordinate system and the AR/MR scene space. As seen in Figure 6b, three points have been selected on the 3D model and are seen as P1, P2 and P3 within the global coordinate system. In this specific example, the coordinates of these model termination points within the global coordinate system may be seen as:

Targets are then placed on the selected model termination points (602). Placement of the targets should be such that the reference point of the target is superimposed onto the corresponding selected model termination point of the 3D model as accurately as possible. In some embodiments, a relationship between the coordinates of the model termination points is used to determine placement of the targets in the AR/MR scene. The reference points of the targets, or scene termination points, are detected by the AR/MR device (604) and their coordinates with respect to the coordinate system of the AR/MR device (or the AR/MR scene) are then obtained or calculated (606). As seen in Figure 6c, the 3D model includes a set of targets placed at the selected model termination points which are detected as scene termination points by the AR/MR device. In this specific example, the coordinates of the targets, or the scene termination points, within the AR/MR device, or scene, coordinate system may be seen as:

Having the two sets of coordinates, a transformation matrix is determined or calculated (608) to enable placement of the machine component hologram in the AR/MR scene in a shared coordinate system between the global coordinate system and the AR/MR device or AR/MR scene. In this specific example, the coordinates of the scene termination points and model termination points within the shared coordinate system may be seen as:

After determining the transformation matrix, the hologram can be transformed and visualized to the user within the AR/MR scene (610).

Turning to Figure 7, a schematic diagram of another embodiment of apparatus for placing holograms in an AR/MR scene is shown. In the current embodiment, the apparatus 700 includes an input apparatus 702 and a processing apparatus 704. In some embodiments, the input apparatus 702 and processing apparatus 704 are located within the server and in some embodiments, separate input and processing apparatuses may be located in the AR/MR device and the server.

The input apparatus 702 may be seen as the component or components that receive input from a user or other user devices, and the processing apparatus 704 may be seen as the component or components that process the inputs to generate or determine coordinate points of the hologram within the AR/MR scene so that the hologram may be placed correctly with respect to orientation, shape and other characteristics within the AR/MR scene.

In the current embodiment, the input apparatus 702 includes a user preparation module 706 that communicates with external peripherals (such as a user device) to receive inputs. These inputs may include, but are not limited to, a digital 3D model (or hologram) 708 of the component, item or object of interest that is being placed in the AR/MR scene, and AR/MR scene information 710, which may be the entire scene or the information required for the input apparatus to generate the scene or to understand the scene coordinate system. In some embodiments, the input apparatus 702 may receive the scene coordinate characteristics or system from the AR/MR device, which may include its own processing apparatus.

In the current embodiment, the processing apparatus 704 includes a scene detection module 710, a correspondence module 712, a transformation module 714, an optimization module 716 and a visualization module 718.

In one embodiment, the scene detection module 710 detects or determines the presence and/or location of the targets in the AR/MR scene coordinate system based on the inputs that are received by the user preparation module 706. In one embodiment, the scene detection module 710 may be implemented in the AR/MR device and captures the object of interest (with targets attached) in its desired location within the AR/MR scene to detect the coordinates of the targets, or the scene termination points, in the scene coordinate space. In another embodiment, information captured by the AR/MR device is provided to the scene detection module (implemented on the server) to determine the coordinate points of the targets, or scene termination points, in the scene coordinate system. The scene detection module 710 may be seen as providing the functionality for providing information associated with where the reference points on targets are detected or located within the AR/MR scene. As discussed above, each target has a reference point and a normal vector associated with the reference point. The location (or coordinates) of the targets in the AR/MR coordinate system may be seen as the AR/MR scene target coordinates, or the scene termination points.

The correspondence module 712 is directed at the functionality of generating a correspondence between the AR/MR scene termination points coordinates and the model termination points (or coordinates of the model termination points) that are selected on the 3D model or hologram (digital 3D model). As discussed above, the selected model termination points have a correspondence with the target points, or scene termination points, in the AR/MR scene as the targets are placed on the selected model termination points before being captured by the AR/MR device. Detected points are sorted such that the correspondence between detected points in the scene and termination points in the 3D model is established.
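One way such a sorting could be realized (an illustrative approach only; the described system may establish the correspondence differently) is to exploit the fact that a rigid transformation preserves pairwise distances between points, so the detected points can be ordered to best reproduce the model's distance pattern:

```python
import numpy as np
from itertools import permutations

def sort_by_correspondence(model_pts, detected_pts):
    """Reorder the detected scene points so that detected_pts[i]
    corresponds to model_pts[i], by choosing the permutation whose
    pairwise-distance matrix best matches that of the model
    termination points (rigid transforms preserve distances)."""
    model_pts = np.asarray(model_pts, dtype=float)
    detected_pts = np.asarray(detected_pts, dtype=float)
    n = len(model_pts)
    d_model = np.linalg.norm(model_pts[:, None] - model_pts[None, :], axis=-1)
    best, best_err = None, np.inf
    for perm in permutations(range(n)):
        cand = detected_pts[list(perm)]
        d_cand = np.linalg.norm(cand[:, None] - cand[None, :], axis=-1)
        err = np.abs(d_cand - d_model).sum()
        if err < best_err:
            best, best_err = cand, err
    return best
```

For the three or four termination points typically involved, the brute-force search over permutations is inexpensive.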

Transformation module 714 provides the functionality to generate or calculate a transformation matrix (associating, corresponding or relating the global coordinate system and the AR/MR scene coordinate system) based on the correspondence calculations of the correspondence module. The transformation matrix may include a rotation matrix and a translation matrix. This transformation matrix may then be used to determine where all coordinate points of the hologram are located within the AR/MR scene so that an orientation and a position of the object of interest is addressed.

In one example of transformation matrix calculation, it may be assumed that the set of model termination points detected, or selected, in the hologram (or 3D model) and the target points, or scene termination points, detected in the AR/MR scene may be represented as:

Points in the 3D model: P_m = {P_m1, P_m2, ..., P_mn}; and

Target points in the scene (AR/MR unit's coordinate system): P_s = {P_s1, P_s2, ..., P_sn}.

The transformation module then generates the transformation matrix, Transform, such that the model termination points P_m in the 3D model are transformed onto the scene termination points P_s in the scene. In one embodiment, the transformation module may calculate a rotation matrix, R, and a translation matrix, T, where:

P_s = R·P_m + T

In one embodiment, this calculation may be performed using a principal component analysis where coordinates of at least three corresponding points between the global coordinate system and the scene coordinate system are used (such that three or more targets should be used). Once the coordinates of the at least three model termination points on the 3D model in the global coordinate system and the coordinates of the corresponding three points (or scene termination points) in the scene coordinate system are established or determined, the transformation matrix may be calculated using the principal component analysis.
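As a sketch of one possible closed-form realization of this calculation (the SVD-based Kabsch method, which serves the same role as the principal component analysis referred to above but may differ from it in detail), the rotation and translation can be recovered from three or more corresponding points:

```python
import numpy as np

def fit_transform(P_m, P_s):
    """Least-squares rotation matrix R and translation matrix T such
    that P_s ~= R @ P_m + T, given n >= 3 corresponding points (rows).
    SVD-based (Kabsch) method, shown as one possible realization of the
    correspondence-based calculation described above."""
    P_m = np.asarray(P_m, dtype=float)
    P_s = np.asarray(P_s, dtype=float)
    c_m, c_s = P_m.mean(axis=0), P_s.mean(axis=0)   # centroids
    H = (P_m - c_m).T @ (P_s - c_s)                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = c_s - R @ c_m
    return R, T
```

With more than three targets, this formulation also averages out small detection errors, since it minimizes the total squared distance between the transformed model points and the detected scene points.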

In another embodiment of transformation matrix calculation, the calculation may be performed using a cross-product methodology which requires exactly three corresponding termination points between the global coordinate system and the scene coordinate system (such that three targets are used). The model termination points in the global coordinate space may be represented as A_m, B_m, and C_m (where the subscript "m" is for points on the 3D model) and the detected points in the scene coordinate system, or scene termination points, (viewed by the AR/MR unit) may be represented as A_s, B_s, and C_s (where the subscript "s" refers to a target reference point in the scene). This is schematically shown in Figure 8, which is an example representation of selected model termination points in the 3D model and the corresponding scene termination points detected by the AR/MR unit.

In the 3D model, two edge vectors and their normal may be formed from the three model termination points:

V1_m = B_m − A_m
V2_m = C_m − A_m
N_m = V1_m × V2_m

In the scene:

V1_s = B_s − A_s
V2_s = C_s − A_s
N_s = V1_s × V2_s

The rotation matrix can be calculated by mapping the basis formed by these vectors in the global coordinate system onto the corresponding basis in the scene coordinate system:

R = [V1_s V2_s N_s]·[V1_m V2_m N_m]^(−1)

and, consistent with P_s = R·P_m + T, the translation matrix can be calculated as:

T = A_s − R·A_m, or

T = B_s − R·B_m, or

T = C_s − R·C_m
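The cross-product construction may be sketched as follows; the orthonormalization of the basis vectors is an implementation detail assumed here, and the function name is illustrative:

```python
import numpy as np

def transform_from_three_points(A_m, B_m, C_m, A_s, B_s, C_s):
    """Rotation matrix R and translation matrix T from exactly three
    corresponding termination points, so that P_s = R @ P_m + T.
    An orthonormal basis is built in each coordinate system from the
    two edge vectors and their cross product; R maps the model basis
    onto the scene basis."""
    A_m, B_m, C_m, A_s, B_s, C_s = (np.asarray(p, dtype=float)
                                    for p in (A_m, B_m, C_m, A_s, B_s, C_s))

    def basis(A, B, C):
        e1 = B - A
        e1 = e1 / np.linalg.norm(e1)
        v2 = C - A
        e2 = v2 - (v2 @ e1) * e1      # component of the second edge
        e2 = e2 / np.linalg.norm(e2)  # orthogonal to the first
        e3 = np.cross(e1, e2)         # normal to the triangle's plane
        return np.column_stack([e1, e2, e3])

    M_m = basis(A_m, B_m, C_m)
    M_s = basis(A_s, B_s, C_s)
    R = M_s @ M_m.T   # inverse of an orthonormal basis is its transpose
    T = A_s - R @ A_m
    return R, T
```

Because the basis is orthonormal, inverting the model basis reduces to a transpose, which keeps the calculation cheap on the AR/MR device.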

In a further embodiment, the calculation of the transformation matrix may be performed using a cross product using normal vectors methodology, which requires exactly three corresponding termination points between the model coordinate system and the scene coordinate system (such that three targets should be used). Figure 9 provides a schematic diagram illustrating one example of a naming convention for the normals and points of the 3D model in the global coordinate system and the scene coordinate system. N_Am may be seen as the normal vector at point A in the 3D model.

In the 3D model space, a vector between two of the model termination points and the normal vector at one of those points may be used to form a basis:

V_m = B_m − A_m
W_m = V_m × N_Am

In the scene space:

V_s = B_s − A_s
W_s = V_s × N_As

The rotation matrix can be calculated, analogously to the cross-product method above, by mapping the model basis onto the scene basis:

R = [V_s N_As W_s]·[V_m N_Am W_m]^(−1)

The translation matrix can be calculated as:

T = A_s − R·A_m, or

T = B_s − R·B_m, or

T = C_s − R·C_m

In yet another embodiment, the transformation matrix calculation may be performed via the optimization module or the calculation of an initial transformation matrix may be improved via an optimization matrix generated by the optimization module.

When superimposing the hologram or 3D model onto or into the AR/MR scene, the superimposition may be improved by applying the optimization matrix to the coordinates of the hologram in the AR/MR scene. The application of the optimization matrix may address errors that may arise in drawing scene termination points on the object (in the AR/MR scene), in placing the targets onto the scene termination points on the object of interest in the scene, and/or in the detection of the target points by the AR/MR unit. To improve the accuracy of the calculated transformation matrix, the optimization module (via the generated optimization matrix) may be used to improve placement or fit of the hologram within the AR/MR scene.

In one embodiment, the optimization module may calculate an optimization matrix, or an updated transformation matrix, by increasing or maximizing a similarity index (function) between the detected target reference points, or scene termination points, in the scene and the corresponding model termination points selected on the 3D model. In other words, by treating the originally selected model termination points as the object's datum points, the geometric relationship between the detected scene termination points in the AR/MR scene should resemble the geometric relationship between those model termination points in the 3D model. This is schematically shown in Figure 10 using three termination points. By maximizing, improving or adjusting the similarity between these relationships, the coordinates in the AR/MR scene may be adjusted, and an updated or more accurate transformation matrix may be calculated. Additionally, the overlay result may be improved from a quality control perspective, since the datum points are known and predetermined (instead of relying on a global fit between the hologram and the desired location of the object of interest in the AR/MR scene).

In one example calculation, the following parameters may be set:

A_m: first model termination point's coordinate in the global coordinate space
B_m: second model termination point's coordinate in the global coordinate space
C_m: third model termination point's coordinate in the global coordinate space
α_m: angle created by vectors C_mA_m and A_mB_m (the first and last vectors)
β_m: angle created by vectors A_mB_m and B_mC_m (the second and third vectors)
A_s: first scene termination point's coordinate detected in the AR/MR scene coordinate space
B_s: second scene termination point's coordinate detected in the AR/MR scene coordinate space
C_s: third scene termination point's coordinate detected in the AR/MR scene coordinate space
α_s: angle created by vectors C_sA_s and A_sB_s (the first and last vectors)
β_s: angle created by vectors A_sB_s and B_sC_s (the second and third vectors)
A_s′: first scene termination point's coordinate updated after the optimization
B_s′: second scene termination point's coordinate updated after the optimization
C_s′: third scene termination point's coordinate updated after the optimization
α_s′: angle created by vectors C_s′A_s′ and A_s′B_s′ (the first and last vectors)
β_s′: angle created by vectors A_s′B_s′ and B_s′C_s′ (the second and third vectors)

With respect to constraints, the optimization should not change the coordinates of the initially detected points by more than the accuracy of the AR/MR unit. In order to address this condition, the following constraints may be used, where ρ is a parametric representation of the total Euclidean value of the inaccuracy of the AR/MR unit:

‖A_s′ − A_s‖ ≤ ρ
‖B_s′ − B_s‖ ≤ ρ
‖C_s′ − C_s‖ ≤ ρ

The objective can be visually defined as making the triangle (or polygon, in the case of more than three termination points) detected in the scene more similar to the triangle created by the termination points in the 3D model.

In order to improve or maximize the hologram insertion, an objective function may be defined that measures this similarity, for example in terms of the corresponding angles and side-length ratios of the two triangles described above.

Once the objective function is defined, various constrained multi-variable optimization methods can be used such as, but not limited to, a trust-region constrained algorithm; a sequential least squares programming (SLSQP) algorithm; or a genetic algorithm. By performing one of the optimization algorithms, the Cartesian locations of the detected scene termination points, or target reference points, can be updated, and the described error sources causing the inaccuracies can be compensated for. Upon updating the coordinates of the hologram in the scene coordinate system, the hologram overlay is adjusted to compensate for errors in the placement of targets and the error in the detection of reference points.
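A minimal sketch of such a constrained optimization, using the SLSQP method named above via SciPy, is shown below. The similarity index used here (matching side-length ratios of the triangle) is a simplified stand-in for the objective function of the described method, and the function name and the bound parameter rho (the Euclidean inaccuracy of the AR/MR unit) are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def optimize_scene_points(model_pts, scene_pts, rho=0.05):
    """Adjust three detected scene termination points so that the
    triangle they form better resembles the model triangle (here:
    matching side-length ratios), without moving any point by more
    than rho from its detected location."""
    model_pts = np.asarray(model_pts, dtype=float)
    scene_pts = np.asarray(scene_pts, dtype=float)

    def side_ratios(p):
        a = np.linalg.norm(p[1] - p[0])
        b = np.linalg.norm(p[2] - p[1])
        c = np.linalg.norm(p[0] - p[2])
        return np.array([a / b, b / c])

    target = side_ratios(model_pts)

    def objective(x):
        # Dissimilarity between scene and model side-length ratios.
        return np.sum((side_ratios(x.reshape(3, 3)) - target) ** 2)

    # Constraint: each updated point stays within rho of its detection.
    cons = [{"type": "ineq",
             "fun": (lambda x, i=i: rho - np.linalg.norm(
                 x.reshape(3, 3)[i] - scene_pts[i]))}
            for i in range(3)]

    res = minimize(objective, scene_pts.ravel(), method="SLSQP",
                   constraints=cons)
    return res.x.reshape(3, 3)
```

The updated points returned by the optimizer would then feed back into the transformation matrix calculation to refine the hologram overlay.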

By using the optimization module, or the updated transformation matrix, improved predictions for the updated coordinate points in the scene coordinate system may be obtained. Some guidelines that may be followed include, but are not limited to: extract all necessary optimization variables and initial ratios; bound the solution to the defined regions; start the optimization with a defined fitness function that checks various parameters including ratios, distance from original points, and whether the new points are in plane; and/or, after reaching a maximum, or high, number of iterations or if the values reached stop changing significantly, stop the search and return the new points found.

In the preceding description, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the embodiments. However, it will be apparent to one skilled in the art that these specific details may not be required. In other instances, well-known structures may be shown in block diagram form in order not to obscure the understanding. For example, specific details are not provided as to whether elements of the embodiments described herein are implemented as a software routine, hardware circuit, firmware, or a combination thereof.

Embodiments of the disclosure or components thereof can be provided as or represented as a computer program product stored in a machine-readable medium (also referred to as a computer-readable medium, a processor-readable medium, or a computer usable medium having a computer-readable program code embodied therein). The machine-readable medium can be any suitable tangible, non-transitory medium, including magnetic, optical, or electrical storage medium including a diskette, compact disk read only memory (CD-ROM), memory device (volatile or non-volatile), or similar storage mechanism. The machine-readable medium can contain various sets of instructions, code sequences, configuration information, or other data, which, when executed, cause a processor or controller to perform steps in a method according to an embodiment of the disclosure. Those of ordinary skill in the art will appreciate that other instructions and operations necessary to implement the described implementations can also be stored on the machine-readable medium. The instructions stored on the machine-readable medium can be executed by a processor, controller or other suitable processing device, and can interface with circuitry to perform the described tasks.

The above-described embodiments are intended to be examples only. Alterations, modifications and variations can be effected to the particular embodiments by those of skill in the art without departing from the scope, which is defined solely by the claims appended hereto.