
Title:
MEASUREMENT OF POSITION AND ORIENTATION OF AN OBJECT
Document Type and Number:
WIPO Patent Application WO/2022/172227
Kind Code:
A1
Abstract:
This invention relates to an apparatus and method for determining the position and orientation of an object in three dimensional space, wherein the apparatus comprises a first light emitting device for emitting a first set of two patterns, the patterns being at an angle to each other, the first light emitting device attachable to the object; at-least one surface positioned in the direction of the emitted first set of two patterns to intercept the first set of two patterns; a camera to capture an image of the projection of the first set of two patterns on the at-least one surface; and a processor operably coupled to the camera to receive the image and determine the position and orientation of the object based on the position and orientation of the first set of two patterns in the image.

Inventors:
GUPTA ARBIND (IN)
Application Number:
PCT/IB2022/051266
Publication Date:
August 18, 2022
Filing Date:
February 13, 2022
Assignee:
GUPTA ARBIND (IN)
International Classes:
G01B11/25; G06K9/62; G06T7/70
Foreign References:
US20190130589A12019-05-02
US20150204662A12015-07-23
Claims:
I/We claim:

1. An apparatus for determining position and orientation of an object in three dimensional space, the apparatus comprising: a first light emitting device for emitting a first set of two patterns, the first set of two patterns being at an angle to each other, the first light emitting device attachable to the object; at-least one surface positioned in the direction of the emitted first set of two patterns to intercept the first set of two patterns; a camera to capture an image of the projection of the first set of two patterns on the at-least one surface; and a processor operably coupled to the camera to receive the image and determine the position and orientation of the object based on the position and orientation of the first set of two patterns in the image.

2. The apparatus according to claim 1, wherein the at-least one surface comprises a first surface and a second surface at an angle to each other, wherein the first surface is positioned in the direction of projection of one of the first set of two patterns and the second surface is positioned in the direction of projection of the other pattern of the first set of two patterns.

3. The apparatus according to claim 1, wherein the first set of two patterns comprises two line segments.

4. The apparatus according to claim 1, further comprising a second light emitting device for emitting a second set of two patterns, the second set of two patterns being distinct from the first set of two patterns, the second set of two patterns being at an angle to each other, the second set of two patterns providing an indication of a desired position on being intercepted by the at-least one surface.

5. The apparatus according to claim 4, wherein the second light emitting device is operably coupled to the processor and the processor is configured to control the second light emitting device.

6. The apparatus according to claim 4, wherein the second light emitting device is positioned such that the second set of two patterns are projected on the at-least one surface.

7. The apparatus according to claim 4, wherein an alignment of the first set of two patterns and the second set of two patterns on the at-least one surface provides an indication of the object being at the desired position and orientation.

8. The apparatus according to claim 1, further comprising a switch operably coupled to the processor, for controlling the first light emitting device such that the first set of two patterns is within the field of view of the camera, based on the last known position of the first set of two patterns.

9. The apparatus according to claim 4, wherein the processor is configured to provide an audio feedback based on the distance between the first set of two patterns and the second set of two patterns.

10. The apparatus according to claim 9, wherein the processor is configured to alter the volume and/or pitch of the audio feedback based on distance between the first set of two patterns and the second set of two patterns.

11. A method of determining the position and orientation of an object in three-dimensional space, the method comprising: providing a first light emitting device for emitting a first set of two patterns, the first set of two patterns being at an angle to each other, the first light emitting device attachable to the object; positioning at-least one surface in the direction of the emitted first set of two patterns to intercept the first set of two patterns; capturing an image of the projection of the first set of two patterns on the at-least one surface by a camera; and determining the position and orientation of the object based on the position and orientation of the first set of two patterns in the image.

12. The method according to claim 11, wherein the at-least one surface comprises a first surface and a second surface at an angle to each other, wherein the first surface is positioned in the direction of projection of one of the first set of two patterns and the second surface is positioned in the direction of projection of the other of the first set of two patterns.

13. The method according to claim 11, wherein the first set of two patterns comprises a set of two line segments.

14. The method according to claim 11, further comprising providing a second light emitting device for emitting a second set of two patterns, the second set of two patterns being distinct from the first set of two patterns, the second set of two patterns being at an angle to each other, the second set of two patterns providing an indication of a desired position on being intercepted by the at-least one surface.

15. The method according to claim 14, further comprising controlling the second light emitting device to emit the second set of two patterns.

16. The method according to claim 14, wherein the second light emitting device is positioned such that the second set of two patterns are projected on the at-least one surface.

17. The method according to claim 14, wherein an alignment of the first set of two patterns and the second set of two patterns on the at-least one surface provides an indication of the object being at the desired position and orientation.

18. The method according to claim 11, further comprising controlling the first light emitting device such that the first set of two patterns are within the field of view of the camera based on the last known position of the first set of two patterns.

19. The method according to claim 11, further comprising providing an audio feedback based on a distance between the first set of two patterns and the second set of two patterns.

20. The method according to claim 19, further comprising altering the pitch and/or volume of the audio feedback based on the distance between the first set of two patterns and the second set of two patterns.

Description:
Measurement of position and orientation of an object

TECHNICAL FIELD

[001] The invention relates to position and orientation measurement and, more specifically, to the determination of the position and orientation of an object and to providing guidance to achieve a desired position and orientation.

BACKGROUND

[002] Ultrasound imaging is usually performed by manual placement of a probe onto the anatomy of the subject. For example, echocardiography is done manually by applying the probe to the chest of the subject, and the axis of the probe is manipulated, while watching the monitor screen, to obtain the best possible view. The quality of the image depends on the position and orientation of the probe in 3D space.

[003] This process, being manual, is dependent on the expertise of the individual performing the imaging. Moreover, this expertise is acquired by the individual performing the imaging over a period of time.

[004] Attempts have been made to determine the position and orientation using various sensors and providing a feedback to the sonographer for a desired location and orientation.

[005] However, such solutions have limitations in terms of accuracy of measurement, size of the sensors used for the purpose, or cost. The accuracy of such systems depends on the accuracy of the sensors used, such as magnetometers, gyroscopes and accelerometers.

[006] Moreover, such sensors sense the position and orientation in a relative manner. As a result, the cumulative value of position and orientation, measured over a series of movements, also leads to accumulation of the associated measurement error.

[007] Since the probe has six degrees of freedom (three for translation along the X, Y and Z axes, and three for rotation about those axes), an audio feedback alone may often be confusing. Hence, a visual feedback showing the sonographer both the desired position and orientation of the probe and its current position and orientation will make it much easier for the sonographer to quickly converge to the desired position and orientation.

OBJECTS

[008] The object of the invention is to provide an apparatus and method for determining the position and orientation of an object in three-dimensional space and to provide guidance to achieve a desired position and orientation.

[009] The object of the invention is achieved by an apparatus and method for determining the position of an object in three-dimensional space. According to an embodiment, the apparatus comprises a first light emitting device for emitting a first set of two patterns, the first set of two patterns being at an angle to each other, the first light emitting device attachable to the object; at-least one surface positioned in the direction of the emitted first set of two patterns to intercept the first set of two patterns; a camera to capture an image of the projection of the first set of two patterns on the at-least one surface; and a processor operably coupled to the camera to receive the image and determine the position and orientation of the object based on the position and orientation of the first set of two patterns in the image.

[0010] According to another embodiment, the at-least one surface comprises a first surface and a second surface at an angle to each other, wherein the first surface is positioned in the direction of projection of one of the first set of two patterns and the second surface is positioned in the direction of projection of the other of the first set of two patterns.

[0011] According to yet another embodiment, the first set of two patterns comprises a set of two line segments.

[0012] According to yet another embodiment, the apparatus further comprises a second light emitting device for emitting a second set of two patterns, the second set of two patterns having the same shape and size as, but being distinct in appearance from, the first set of two patterns, the second set of two patterns being at an angle to each other, and the second light emitting device being positioned such that the second set of two patterns is projected on the at-least one surface.

[0013] According to yet another embodiment, the second light emitting device is operably coupled to the processor and the processor is configured to control the second light emitting device for projecting the second set of two patterns on the at-least one surface, indicating the desired position and orientation of the object. The user can manipulate the position and orientation of the object, thereby changing the position and orientation of the first set of two patterns on the at-least one surface, to bring it to the desired position.

[0014] According to yet another embodiment, an alignment of the first set of two patterns and the second set of two patterns on the at-least one surface provides an indication of the object being at the desired position and orientation.

[0015] According to yet another embodiment, the apparatus further comprises a switch operably coupled to the processor, for controlling the first light emitting device such that the first set of two patterns is within the field of view of the camera, based on the last known position of the first set of two patterns.

[0016] According to yet another embodiment, the processor is configured to provide an audio feedback based on a distance between the first set of two patterns and the second set of two patterns.

[0017] According to yet another embodiment, the processor is configured to alter the pitch and amplitude of the audio feedback based on the distance between the first set of two patterns and the second set of two patterns.

[0018] According to yet another embodiment, the method comprises providing a first light emitting device for emitting a first set of two patterns, the first set of two patterns being at an angle to each other, the first light emitting device attachable to the object; positioning at-least one surface in the direction of the emitted first set of two patterns to intercept the first set of two patterns; capturing an image of the projection of the first set of two patterns on the at-least one surface by a camera; and determining the position and orientation of the object based on the position and orientation of the first set of two patterns in the image.

[0019] The first light emitting device can be attached to the object, and the position and orientation of the object can be determined using the projection of the first set of two patterns captured in the image. The first set of two patterns can be of any shape and geometry, for example a line segment, a set of points, an L-shaped line, a rectangle, and the like. The first set of two patterns and the second set of two patterns are visibly distinct. For example, the first set of two patterns and the second set of two patterns may be of two different distinguishable colors, or may be drawn in different styles such as a dashed line, dotted line, solid line, flashing line, etc.

[0020] For example, the apparatus may be attached to an ECG probe of an ECG system, where the first set of two patterns provides the current position and orientation information of the ECG probe and the second set of two patterns provides the desired position and orientation of the probe. This gives the operator a visual feedback for aligning the current position with the more desirable position and orientation of the probe, which helps in acquiring better quality ultrasound images. The desired position can be pre-defined, may come from an external source, or may be computed by other means; it is used by the processor to control the second light emitting device. The second set of two patterns emitted by the second light emitting device provides the indication of the desired position. The user of the ECG probe may adjust the probe such that the projection of the first set of two patterns aligns or superimposes with the second set of two patterns. This provides the advantage of computing the position and orientation of the ECG probe and providing visual guidance to achieve its desired position and orientation.
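
The alignment test described above can be sketched in code. The following is a minimal illustration, not part of the patent: it treats each projected pattern as a line segment detected in the camera image (a pair of pixel-coordinate endpoints) and reports whether the first set of patterns lies within a tolerance of the second set. All function names and the tolerance value are assumptions.

```python
import numpy as np

def alignment_error(seg_a, seg_b):
    """Sum of Euclidean distances between corresponding endpoints of
    two line segments, each given as [[x1, y1], [x2, y2]] in pixels.
    Endpoint order is normalized so the result is order-independent."""
    a, b = np.asarray(seg_a, float), np.asarray(seg_b, float)
    direct = np.linalg.norm(a - b, axis=1).sum()
    swapped = np.linalg.norm(a - b[::-1], axis=1).sum()
    return min(direct, swapped)

def is_aligned(first_set, second_set, tol=2.0):
    """True when every pattern of the first set lies within `tol`
    pixels of the corresponding pattern of the second set."""
    return all(alignment_error(a, b) <= tol
               for a, b in zip(first_set, second_set))
```

In a full system these endpoints would come from the pattern detector of block 330; here they are supplied directly.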

BRIEF DESCRIPTION OF FIGURES

[0021] Embodiments herein are illustrated in the accompanying drawings, throughout which reference letters or symbols indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:

[0022] Fig 1 shows an exemplary illustration of a chest of a subject where an ultrasound probe can be placed for acquiring ultrasound images, wherein the positions for placing the ultrasound probe are designated as A, B, C and D.

[0023] Fig 2a illustrates a block diagram of the apparatus for determining the position and orientation of an object and providing guidance to achieve a desired position and orientation of the object according to an embodiment herein: i) the object whose position and orientation is to be determined; ii) two surfaces at an angle, S_x and S_y, on which a pattern can be projected; iii) a light emitting device P1 attached to the object; iv) a light emitting device P2, fixed to the room or to the surfaces S_x and S_y; and v) a camera for acquiring images of the patterns projected on the two surfaces S_x and S_y.

[0024] Fig 2b illustrates the projection of the first set of two patterns (line segments) L^T_x and L^T_y on the two surfaces S_x and S_y by light emitting device P1, and of the second set of two patterns (line segments) L_x and L_y on the two surfaces S_x and S_y by light emitting device P2. The first set of two patterns indicates the current position and orientation of the object and the second set of two patterns indicates a desired position and orientation of the probe.

[0025] Fig 3 illustrates a flow chart of an exemplary method for determining the position and orientation of an object according to an embodiment herein.

[0026] Fig 4 shows the arrangement of a preferred embodiment.

[0027] Fig 5 shows a visualization of: i) the probe coordinate system (PCS), with axes X_P, Y_P, Z_P and origin at O_P, attached to the object, and its virtual surfaces S^P_x and S^P_y; ii) the table coordinate system (TCS), attached to the room, with axes X_T, Y_T, Z_T and origin at O_T, and its surfaces S_x and S_y.

[0028] Fig 6a shows an alternate arrangement of the two surfaces, where one of the surfaces, S_x, is at the top.

[0029] Fig 6b shows an alternate arrangement where the two surfaces S_x and S_y are not at a right angle to each other. They are also not vertical, i.e., not perpendicular to the floor.

[0030] Fig 7a shows an alternate arrangement where the camera cum projection system is not attached to the two surfaces S_x and S_y and is placed at a different location.

[0031] Fig 7b shows an alternate arrangement where the camera cum projection system is fixed at an angle to the two surfaces S_x and S_y.

DETAILED DESCRIPTION

[0032] Fig 2b shows the various components of the said apparatus and methods for use with an object, which are described below. Fig 4 shows a preferred embodiment of the apparatus and methods, and it is described in detail for better understanding. In this embodiment, the first and second sets of two patterns each comprise a pair of line segments, and the at-least one surface comprises two surfaces, S_x and S_y, on which patterns from light emitting devices P1 and P2 are projected. Further, the first set of two patterns and the second set of two patterns are distinguishable from each other by their color. It is also sufficient to have one large surface on which the patterns from light emitting devices P1 and P2 can be projected. The apparatus and methods comprise:

a. two surfaces S_x and S_y, which are fixed at an angle to the room and make an angle to each other. Light patterns can be projected on these two surfaces by light emitting devices P1 and P2.

b. a first light emitting device P1, attached to the object whose position and orientation need to be measured accurately. It emits a first set of two patterns, L^T_x and L^T_y, onto the surfaces S_x and S_y. The first set of two patterns is emitted such that the patterns are at an angle to each other. The position and orientation of L^T_x and L^T_y projected onto the surfaces S_x and S_y provide an indication of the position and orientation of the object. To simplify the calculations for computing the position and orientation of the object, the surfaces S_x and S_y are at right angles to each other and to the floor, as shown in Fig 4.

c. a camera, which can be fixed to the room or to surface S_x and/or S_y, and which can take an image of the two surfaces S_x and S_y.

d. a processor, operably coupled to the camera, that computes the position and orientation of the object based on the position and orientation of the first set of two patterns L^T_x and L^T_y in the image.

e. a second light emitting device P2, operably coupled to the processor, for emitting a second set of two patterns, L_x and L_y, that are projected onto the surfaces S_x and S_y respectively. The second set of two patterns emitted by P2 is visually distinct from the first set emitted by P1. P2 is controlled by the processing unit to project the second set of two patterns to indicate the desired position and orientation of the object. The user can attain the desired position and orientation of the object by moving and/or rotating the object so that the first set of two patterns L^T_x and L^T_y projected by P1 aligns with the second set of two patterns L_x and L_y projected by P2.

[0033] Before the present subject matter is described in further detail, it is to be understood that the subject matter is not limited to the particular embodiments described, as such, and may of course vary. It shall become abundantly clear after reading this specification that the subject matter may, without departing from the spirit and scope of the subject matter, also be practiced in other than the exemplified embodiments.

[0034] For the purpose of describing the invention, Fig 2 shows an embodiment of the said apparatus wherein it is used to measure the absolute position and orientation of an ultrasound probe and to provide a visual feedback to the sonographer for a more desirable position and orientation of the probe.

[0035] Other embodiments are shown, for example, in Fig 6a and Fig 6b, where the projection surfaces S_x and S_y are not at a right angle to each other nor to the floor. Fig 7a and Fig 7b show yet another embodiment of the said apparatus, where the camera and light emitting device P2 are not aligned with the light emitting device P1 of the apparatus.

[0036] In the preferred embodiment shown in Fig 4, the patterns projected by light emitting devices P1 and P2 are shown in red and blue respectively. It is understood that the colors are only one example of making the two patterns emitted by P1 and P2 distinct from each other, and there are many other ways in which they can be made distinct.

[0037] Also, in the preferred embodiment shown in Fig 4, the patterns emitted by light emitting devices P1 and P2 are shown as line segments. It is clearly understood that they can be any other pattern, such as a rectangle, two lines forming an L shape, a set of points, etc.

[0038] In another embodiment, the surfaces S_x and S_y can be light sensing surfaces that sense the line segments projected by light emitting devices P1 and P2.

[0039] It shall also become clear that the drawings may not be to scale. In some other examples, the method may vary to include additional blocks or may be practiced in an order different from the order of the blocks discussed in this specification. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting. It must be noted that, as used herein, the singular forms "a", "an", and "the" include plural referents unless the context clearly dictates otherwise. The present subject matter provides a solution to a number of problems, including but not limited to measuring the position and orientation of an ultrasound probe and providing visual feedback to the sonographer for aligning the probe to a more desirable orientation and position.

[0040] Reference is now made to Figure 3, which describes the steps required, according to one embodiment of the apparatus and methods, for measuring the position and orientation of an object and providing visual feedback on the desired position and orientation of the object.

[0041] In block 310, the first light emitting device P1 is attached to the object, projecting two patterns L^T_x and L^T_y on surfaces S_x and S_y respectively, as shown in Fig 2a and Fig 2b. Moving and/or rotating the object also causes the first set of two patterns L^T_x and L^T_y to move and/or tilt.

[0042] In block 320, a camera is mounted to the surfaces S_x and S_y (Fig 2a), or to the floor or ceiling of the room (Fig 7a and Fig 7b). The camera acquires images of the surfaces S_x and S_y and sends them to a processor.

[0043] Block 330 is performed by a processor that detects the position and orientation of the first set of two patterns L^T_x and L^T_y, shown in Fig 2b, from the given image. It then computes the position and orientation of the object based on the position and orientation of the first set of two patterns L^T_x and L^T_y in the image.
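
The pattern-detection step of block 330 can be sketched as follows, assuming the projected segment has already been isolated into a binary mask (for example by color-thresholding the camera image). Principal-component fitting stands in for whatever line detector an implementation would actually use; nothing here is prescribed by the patent.

```python
import numpy as np

def fit_segment(mask):
    """Fit a line segment to the 'on' pixels of a binary mask.
    Returns (centroid, unit direction, (endpoint_1, endpoint_2)),
    obtained via PCA on the pixel coordinates."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    c = pts.mean(axis=0)
    # Principal direction of the point cloud = segment direction.
    _, _, vt = np.linalg.svd(pts - c)
    d = vt[0]
    t = (pts - c) @ d                 # projections along the direction
    return c, d, (c + t.min() * d, c + t.max() * d)
```

The two fitted segments per surface (one red, one blue in the preferred embodiment) give the pattern positions and orientations used in the subsequent pose computation.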

[0044] It is now required to calculate the transformation T (three translation values Tx, Ty and Tz, and three rotation values Rx, Ry and Rz) of the object that transforms the patterns L^T_x and L^T_y from a default initial position and orientation of the object to its current position and orientation. This is equivalent to finding the transformation between a probe coordinate system PCS (at the default position and orientation of the object) and the table coordinate system TCS (which is fixed). Reference is made to Fig 5 illustrating: i) the PCS defined by the axes X_P, Y_P, Z_P with origin at O_P; ii) the TCS defined by the axes X_T, Y_T, Z_T with origin at O_T. The surfaces S_x and S_y in the TCS are represented by the virtual surfaces S^P_x and S^P_y in the PCS, as shown in Fig 5.
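
The transformation T can be represented as a 4x4 homogeneous matrix acting on points [x y z 1]^T. A minimal sketch follows; the X-then-Y-then-Z rotation order is an assumption, since the patent does not fix a convention.

```python
import numpy as np

def make_transform(tx, ty, tz, rx, ry, rz):
    """4x4 homogeneous transform: rotations about X, Y and Z
    (radians, applied in that order) followed by a translation."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = (tx, ty, tz)
    return T

# A point P^P = [x y z 1]^T in the PCS maps to the TCS as P^T = T @ P^P.
```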

[0045] A point P^P in the PCS may be represented as a homogeneous vector P^P = [x y z 1]^T, where (x, y, z) are the coordinates of the point in the PCS. Hence, the coordinates of the point P^P (in the PCS) expressed in the TCS will be P^T = T * P^P. Consider the virtual line segments L^P_x and L^P_y with reference to the PCS, as shown in Fig 5, and the corresponding observed line segments L^T_x and L^T_y in the TCS. Applying the transformation T to the endpoints of L^P_x and L^P_y and to O_P (the coordinates of the line segments in the default position and orientation of the probe, measured in the PCS) gives the corresponding virtual points in the TCS, with the following constraints:

- each of the two transformed endpoints of L^P_x is collinear with the observed segment L^T_x (two collinearity constraints);
- each of the two transformed endpoints of L^P_y is collinear with the observed segment L^T_y (two collinearity constraints);
- the transformed line L'_x and the observed line L^T_x are coplanar;
- the transformed line L'_y and the observed line L^T_y are coplanar.

[0046] Solving the six equations gives the six transformation parameters: three translation values Tx, Ty and Tz, and three rotation values Rx, Ry and Rz.
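
In practice the six constraint equations can be posed as residuals and driven to zero by any nonlinear least-squares solver to recover the six parameters. The following is a hedged sketch of the two residual types only (collinearity and coplanarity); the solver loop and parameterization are omitted, and the function names are illustrative:

```python
import numpy as np

def collinearity_residual(p, a, b):
    """Zero when point p lies on the infinite line through a and b:
    magnitude of the cross product of (p - a) with the unit line
    direction, i.e. the perpendicular distance from p to the line."""
    d = (b - a) / np.linalg.norm(b - a)
    return np.linalg.norm(np.cross(p - a, d))

def coplanarity_residual(p1, p2, q1, q2):
    """Zero when segments (p1, p2) and (q1, q2) lie in a common
    plane: absolute scalar triple product of the spanning vectors."""
    return abs(np.dot(np.cross(p2 - p1, q2 - q1), q1 - p1))
```

Stacking the four collinearity residuals and two coplanarity residuals as a function of (Tx, Ty, Tz, Rx, Ry, Rz) yields exactly the six-equation system described in [0046].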

[0047] The calculation of the transformation T can also be done if the first set of two patterns is projected at two different locations, with different orientations, on just one surface. The calculation is not given here for the sake of brevity and simplicity.

[0048] In block 340, the desired position of the object, as a transformation T^D on the PCS, is provided to the system either manually or by an external source. In an embodiment where the apparatus is attached to an ultrasound probe, the desired position of the ultrasound probe can be derived from the acquired ultrasound image by detecting the various chambers of the heart and computing the desired position based on the size and shape of the chambers detected. In yet another embodiment, the desired position and orientation can be defined based on the location where the probe is placed (as shown in Fig 1). In yet another embodiment, it can be provided by an external system.

[0049] Let T^P2 be the transformation for the light emitting device P2 with respect to the TCS. Hence, a point P^T in the TCS can be transformed into the coordinate system of light emitting device P2 as P^P2 = T^P2 * P^T, or P^T = [T^P2]^-1 * P^P2, where P^P2 is the coordinate of the point in the coordinate system of light emitting device P2. Therefore, the position of the first set of two patterns L^T_x and L^T_y after applying the desired transformation T^D will be L^D_x = T^D * L_x, and its coordinates in the coordinate system of P2 will be L^P2_x = T^P2 * L^D_x, where L_x is the coordinates of the line segment in the default position and orientation of the light emitting device P1 (similarly one can calculate L^P2_y). Here, L^P2_x and L^P2_y are the coordinates of the desired position of the second set of two patterns in the local coordinate system of the light emitting device P2. Block 350 in figure 3 represents the computation of L^P2_x and L^P2_y for projection by the light emitting device P2 onto the two surfaces S_x and S_y respectively.
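
The chain of transforms in block 350 — apply the desired pose to the default pattern endpoints, then move into P2's local frame — can be sketched as follows. The homogeneous 4-vector representation and the function name are illustrative assumptions:

```python
import numpy as np

def desired_pattern_in_p2(T_p2, T_d, endpoints_default):
    """Endpoints are homogeneous [x, y, z, 1] rows at the default
    pose. Apply the desired transformation T^D, then express the
    result in the local frame of projector P2 (P^P2 = T^P2 @ P^T)."""
    pts = np.asarray(endpoints_default, dtype=float).T   # 4 x N
    return (T_p2 @ (T_d @ pts)).T
```

The returned endpoints are what the processor hands to P2 for projection in block 360.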

[0050] In block 360 of figure 3, the processing unit takes the desired position of the second set of two patterns, L^P2_x and L^P2_y, and projects them onto the two surfaces S_x and S_y using the second light emitting device P2. This provides a visual feedback to the operator for adjusting the probe.

[0051] Block 370 in Fig 3 represents the adjustment of the object by the user so that the current position of the object, indicated by the first set of two patterns emitted by the first light emitting device P1, matches the desired position and orientation of the probe, indicated by the second set of two patterns emitted by the second light emitting device P2. This can be complemented with voice feedback to the sonographer. Moreover, the difference between the current and desired position can also be mapped to the pitch and amplitude of an audio tone.
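
The pitch/amplitude mapping mentioned above admits a very small sketch. All ranges and parameter names below are illustrative assumptions, not values from the patent; here pitch rises and volume grows as the patterns converge.

```python
def audio_feedback(distance, d_max=100.0,
                   f_near=880.0, f_far=220.0,
                   v_near=1.0, v_far=0.2):
    """Map an alignment distance (e.g. pixels between the first and
    second sets of patterns) to a tone pitch in Hz and a volume in
    [0, 1], interpolating linearly and clamping beyond d_max."""
    t = min(max(distance / d_max, 0.0), 1.0)  # 0 = aligned, 1 = far
    pitch = f_near + t * (f_far - f_near)
    volume = v_near + t * (v_far - v_near)
    return pitch, volume
```

A real implementation would feed these values to a tone generator in the feedback loop of block 370.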

[0052] If necessary, these steps can be repeated for a more refined positioning of the probe.