Title:
POSE DETERMINATION SYSTEM AND METHOD
Document Type and Number:
WIPO Patent Application WO/2019/068175
Kind Code:
A1
Abstract:
Systems and methods for determination of pose of an autonomous vehicle with respect to a defined area of operation are described. Reference nodes or tags are deployed at known positions proximate to the defined area of operation. The autonomous vehicle can detect the relative position of typically two or more reference nodes, and can determine the pose of the autonomous vehicle with respect to the defined area of operation.

Inventors:
MCMILLAN SCOTT (CA)
STEPHENS SCOTT (US)
LEIES MICHAEL (CA)
DERBEZ ERIC (CA)
Application Number:
PCT/CA2018/051229
Publication Date:
April 11, 2019
Filing Date:
October 01, 2018
Assignee:
XCO TECH INC (CA)
International Classes:
G05D1/02; B60W30/00; B60W40/00
Domestic Patent References:
WO2017079839A1 (2017-05-18)
Foreign References:
US20060224308A1 (2006-10-05)
US20100106356A1 (2010-04-29)
US20170023659A1 (2017-01-26)
US7026992B1 (2006-04-11)
US5408411A (1995-04-18)
EP2278357A2 (2011-01-26)
US9915947B1 (2018-03-13)
Attorney, Agent or Firm:
TEES, Susan M. (CA)
Claims:
CLAIMS

1. An autonomous vehicle system, comprising:

a vehicle, having:

a functional apparatus comprising a mobility apparatus to move the vehicle within an area of operation;

a pose-detecting apparatus having a plurality of antennas to wirelessly send an outbound signal to and receive inbound signals from a plurality of tags being fixed relative to the area of operation;

a supplemental location apparatus to generate a location-related signal; and

a controller to:

determine a pose of the vehicle relative to the tags based on one or more of:

information about the outbound signal and the inbound signals; and

the location-related signal; and

control the mobility apparatus based on the pose to keep the vehicle within the area of operation.

2. The autonomous vehicle system of claim 1, wherein the controller determines 3D pose of the vehicle relative to a predetermined coordinate system defined by the plurality of tags.

3. The autonomous vehicle system of claim 1 or 2, wherein the supplemental location apparatus comprises one or more of a GPS receiver, a real time kinematic receiver, an inertial sensor, radar, an echolocation apparatus, a compass, an odometer, an accelerometer, a magnetometer, a gyroscope, a wheel sensor/encoder, a visual sighting system, a remote-operator-assisted piloting system, or an altimeter.

4. The autonomous vehicle system of claim 1, 2, or 3, wherein when inbound signals are lost or information derived from the outbound signal and the inbound signals is determined to be insufficient to determine pose of the vehicle, the controller selectively determines pose of the vehicle based on the location-related signal generated by the supplemental location apparatus, and controls the mobility apparatus based thereon.

5. The autonomous vehicle system of any one of claims 1 to 4, wherein the controller determines pose from the outbound signal and the inbound signals by detecting the distance to two or more tags employing time of flight.

6. The autonomous vehicle system of any one of claims 1 to 5, wherein the controller determines pose from the outbound signal and the inbound signals by detecting the angle of the tag relative to the vehicle employing the time difference of receipt of inbound signals at different ones of the plural antennas.

7. The autonomous vehicle system of any one of claims 1 to 6, wherein the controller determines pose from the outbound signal and the inbound signal by detecting a phase difference of arrival of the inbound signals between each of the plurality of antennas.

8. The autonomous vehicle system of any one of claims 1 to 7, wherein the outbound and inbound signals are ultra-wideband signals.

9. The autonomous vehicle system of any one of claims 1 to 8, wherein the plurality of antennas are operatively connected to receiver/transceiver radios.

10. A method of autonomously operating a vehicle, the method comprising:

wirelessly sending an outbound signal from at least one of a plurality of antennas of the vehicle;

wirelessly receiving at the plurality of antennas inbound signals emitted by a plurality of tags being fixed relative to an area of operation, the inbound signals emitted in response to the plurality of tags receiving the outbound signal;

generating at a controller of the vehicle an indication of whether a pose of the vehicle relative to the tags can be determined based on information about the outbound signal and the inbound signals;

determining at the controller a pose of the vehicle relative to the tags, comprising:

if the indication is positive, determining the pose based on the information about the outbound signal and the inbound signals; and

if the indication is negative:

obtaining a location-related signal at a supplemental location apparatus of the vehicle; and

determining the pose based on the location-related signal; and

controlling the vehicle based on the pose to keep the vehicle within the area of operation.

11. The method of claim 10, wherein determining pose relative to the tags comprises determining at the controller the 3D pose of the vehicle relative to a predetermined coordinate system defined by the tags.

12. The method of claim 10 or 11, wherein the supplemental location apparatus comprises one or more of a GPS receiver, a real time kinematic receiver, an inertial sensor, radar, an echolocation apparatus, a compass, an odometer, an accelerometer, a magnetometer, a gyroscope, a wheel sensor/encoder, a visual sighting system, a remote-operator-assisted piloting system, or an altimeter.

13. The method of claim 10, 11, or 12, wherein the controller determines pose from the outbound signal and the inbound signals by detecting the distance to two or more tags employing time of flight.

14. The method of any one of claims 10 to 13, wherein the controller determines pose from the outbound signal and the inbound signals by detecting the angle of the tags relative to the vehicle employing the time difference of receipt of incoming signals at different ones of the plural antennas.

15. The method of any one of claims 10 to 14, wherein the controller determines pose from the outbound signals and the inbound signals by detecting a phase difference of arrival of the inbound signals between each of the plurality of antennas.

16. The method of any one of claims 10 to 15, wherein the outbound and inbound signals are ultra-wideband signals.

17. The method of any one of claims 10 to 16, further comprising synchronizing the plurality of tags using a single oscillator.

18. The method of any one of claims 10 to 17, further comprising synchronizing the plurality of hub antennas with a local oscillator.

19. A system for determining three dimensional location and orientation of a movable object in an area of operation, comprising:

at least first, second and third tags configured at respective predetermined non-collinear locations in or near the area of operation to define directly or indirectly a reference coordinate system, each tag having a tag antenna providing respective tag signals; and

a pose determination system adapted to be integrated with or coupled to the object so as to be movable therewith, the pose determination system comprising:

an antenna hub having at least first, second and third hub antennas arranged in a non-collinear configuration to define an object coordinate system and to receive the tag signals; and

a controller coupled to the antenna hub, the controller having one or more processors which execute stored instructions to:

determine the distance of each of the first, second and third hub antennas to each of the first, second and third tags employing timing of receipt of the tag signals;

determine the location of each of the first, second and third hub antennas relative to the reference coordinate system using the antenna to tag distances and predetermined tag locations; and

determine the orientation of the object coordinate system relative to the reference coordinate system using the antenna locations.

20. A pose determination system adapted to be integrated with or coupled to an object so as to be movable therewith, the pose determination system comprising:

an antenna hub having at least first, second and third hub antennas arranged in a non-collinear configuration to define an object coordinate system and to receive tag signals from first, second and third tags; and

a controller coupled to the antenna hub, the controller having one or more processors which execute stored instructions to:

determine the distance of each of the first, second and third hub antennas to each of the first, second and third tags employing timing of receipt of the tag signals;

determine the location of each of the first, second and third hub antennas relative to the reference coordinate system using the antenna to tag distances and predetermined tag locations; and

determine the orientation of the object coordinate system relative to the reference coordinate system using the antenna locations.

21. The system of claim 19 or 20, wherein the first, second and third hub antennas are arranged to form an equilateral triangle.

22. The system of claim 19 or 20, wherein the first, second and third hub antennas are arranged to form an L shape.

23. The system of any one of claims 19 to 22, wherein the pose determination system further comprises a source of a common timing signal provided to the first, second and third hub antennas.

24. The system of any one of claims 19 to 23, wherein the hub antennas provide one or more outgoing signals to the tags.

25. The system of any one of claims 19 to 24, wherein the tag signals and outgoing signals are ultra-wideband signals.

26. The system of any one of claims 19 to 25, wherein the controller determines the object orientation relative to the reference coordinate system in terms of Euler angles for yaw, pitch and roll.

27. The system of any one of claims 19 to 26, wherein the hub antennas are operatively connected to receiver/transceiver radios.

28. A method for determining three dimensional location and orientation of a movable object in an area of operation, comprising:

providing first, second and third tag signals respectively from each of first, second and third tags configured at predetermined non-collinear locations in or near the area of operation to define directly or indirectly a reference coordinate system;

receiving, at an antenna hub having at least first, second and third hub antennas arranged in a non-collinear configuration to define an object coordinate system, the tag signals from the first, second and third tags;

determining at one or more programmed processors the distance of each of the first, second and third hub antennas to each of the first, second and third tags employing timing of receipt of the tag signals;

determining at one or more programmed processors the location of each of the first, second and third hub antennas relative to the reference coordinate system using the antenna to tag distances and predetermined tag locations; and

determining at one or more programmed processors the orientation of the object coordinate system relative to the reference coordinate system using the antenna locations.

29. The method of claim 28, further comprising, at an initial stage, surveying the locations of the first, second and third tags to define a tag coordinate system.

30. The method of claim 29, wherein the surveying comprises recursively defining tag coordinates relative to each other.

31. The method of claim 29 or 30, wherein the surveying comprises moving a pose detection system to a first location, identifying the first tag with a three dimensional coordinate value, and defining second and third tags with respective coordinate values defined relative to the first tag.

32. The method of claim 31, wherein the surveying further comprises moving the pose detection system to a new location and identifying an additional tag with coordinate values defined relative to the first, second or third tags.

33. The method of claim 29, further comprising converting the tag coordinate system to the reference coordinate system.

34. The method of claim 33, wherein the reference coordinate system is an East, North, Up frame determined by the local magnetic North.

35. The method of any one of claims 28 to 34, wherein the object is an autonomous vehicle.

36. The method of any one of claims 28 to 35, further comprising controlling the movable object based on the location and orientation to keep the movable object within an area of operation.

37. A system for tracking a moving object comprising:

a pose-detecting apparatus having a plurality of antennas to wirelessly send an outbound signal to and receive inbound signals from a plurality of tags being fixed relative to an area of operation;

a supplemental location apparatus to generate a location-related signal; and

a processor to determine a pose of the moving object relative to the tags based on one or more of:

information about the outbound signal and the inbound signals; and

the location-related signal.

38. The system of claim 37, wherein the processor determines 3D pose of the moving object relative to a predetermined coordinate system defined by the plurality of tags.

39. The system of claim 37 or 38, wherein the supplemental location apparatus comprises one or more of a GPS receiver, a real time kinematic receiver, an inertial sensor, radar, an echolocation apparatus, a compass, an odometer, an accelerometer, a magnetometer, a gyroscope, a wheel sensor/encoder, a visual sighting system, a remote-operator-assisted piloting system, or an altimeter.

40. The system of claim 37, 38 or 39, wherein when inbound signals are lost or information derived from the outbound signal and the inbound signals is determined to be insufficient to determine pose of the moving object, the processor selectively determines pose of the moving object based on the location-related signal generated by the supplemental location apparatus.

41. The system of any one of claims 37 to 40, wherein the processor determines pose from the outbound signal and the inbound signals by detecting the distance to two or more tags employing time of flight.

42. The system of any one of claims 37 to 41, wherein the processor determines pose from the outbound signal and the inbound signals by detecting the angle of the tag relative to the moving object employing the time difference of receipt of inbound signals at different ones of the plural antennas.

43. The system of any one of claims 37 to 42, wherein the processor determines pose from the outbound signal and the inbound signal by detecting a phase difference of arrival of the inbound signals between each of the plurality of antennas.

44. The system of any one of claims 37 to 43, wherein the outbound and inbound signals are ultra-wideband signals.

45. The system of any one of claims 37 to 44, wherein the plurality of antennas are operatively connected to receiver/transceiver radios.

46. A method for determining pose of a movable object comprising:

wirelessly sending an outbound signal from at least one of a plurality of antennas affixed on the movable object;

wirelessly receiving at the plurality of antennas inbound signals emitted by a plurality of tags being fixed relative to an area of operation, the inbound signals emitted in response to the plurality of tags receiving the outbound signal;

generating at a processor an indication of whether a pose of the movable object relative to the tags can be determined based on information about the outbound signal and the inbound signals;

determining at the processor a pose of the movable object relative to the tags, comprising:

if the indication is positive, determining the pose based on the information about the outbound signal and the inbound signals; and

if the indication is negative:

obtaining a location-related signal at a supplemental location apparatus affixed to the movable object; and

determining the pose based on the location-related signal.

47. The method of claim 46, wherein determining pose relative to the tags comprises determining at the processor the 3D pose of the movable object relative to a predetermined coordinate system defined by the tags.

48. The method of claim 46 or 47, wherein the supplemental location apparatus comprises one or more of a GPS receiver, a real time kinematic receiver, an inertial sensor, radar, an echolocation apparatus, a compass, an odometer, an accelerometer, a magnetometer, a gyroscope, a wheel sensor/encoder, a visual sighting system, a remote-operator-assisted piloting system, or an altimeter.

49. The method of claim 46, 47 or 48, wherein the processor determines pose from the outbound signal and the inbound signals by detecting the distance to two or more tags employing time of flight.

50. The method of any one of claims 46 to 49, wherein the processor determines pose from the outbound signal and the inbound signals by detecting the angle of the tags relative to the movable object employing the time difference of receipt of incoming signals at different ones of the plural antennas.

51. The method of any one of claims 46 to 50, wherein the processor determines pose from the outbound signals and the inbound signals by detecting a phase difference of arrival of the inbound signals between each of the plurality of antennas.

52. The method of any one of claims 46 to 51, wherein the outbound and inbound signals are ultra-wideband signals.

53. The method of any one of claims 46 to 52, further comprising synchronizing the plurality of tags using a single oscillator.

54. The method of any one of claims 46 to 53, further comprising synchronizing the plurality of hub antennas with a local oscillator.

Description:
POSE DETERMINATION SYSTEM AND METHOD

RELATED APPLICATION INFORMATION

The present application claims priority to US provisional application serial number 62/567,523 filed October 3, 2017, the disclosure of which is incorporated herein by reference in its entirety.

FIELD

Embodiments described herein relate to location and orientation monitoring systems and methods. Embodiments described herein further relate to autonomous vehicles and control systems.

BACKGROUND

Determining the location and orientation of an object may be important in a number of circumstances. One circumstance in which these determinations are important, though by no means the only one, is the operation of an autonomous vehicle.

Autonomous vehicles are being developed and marketed for a variety of civilian and military applications. One of the difficulties faced by designers of autonomous vehicles is the challenge of making an autonomous vehicle aware of its location, orientation, and surroundings, such that the autonomous vehicle may operate safely and effectively.

SUMMARY OF EMBODIMENTS OF THE INVENTION

Generally, embodiments of the present invention relate to a system and method for determining a pose of a movable object. In an embodiment, the movable object can be an autonomous vehicle system.

In a broad aspect, an autonomous vehicle system can have a vehicle with a functional apparatus comprising a mobility apparatus to move the vehicle within an area of operation, a pose-detecting apparatus having a plurality of antennas to wirelessly send an outbound signal to and receive inbound signals from a plurality of tags being fixed relative to the area of operation, a supplemental location apparatus to generate a location-related signal, and a controller to determine a pose of the vehicle relative to the tags based on information about the outbound signal and the inbound signals, or the location-related signal, or both, and to control the mobility apparatus based on the pose to keep the vehicle within the area of operation.

In another broad aspect of the invention, a method of autonomously operating a vehicle can comprise wirelessly sending an outbound signal from at least one of a plurality of antennas of the vehicle, wirelessly receiving at the plurality of antennas inbound signals emitted by a plurality of tags being fixed relative to an area of operation, the inbound signals emitted in response to the plurality of tags receiving the outbound signal, generating at a controller of the vehicle an indication of whether a pose of the vehicle relative to the tags can be determined based on information about the outbound signal and the inbound signals, and determining at the controller a pose of the vehicle relative to the tags. If the indication is positive, the pose is determined based on the information about the outbound signal and the inbound signals; if the indication is negative, a location-related signal is obtained at a supplemental location apparatus of the vehicle and the pose is determined based on the location-related signal. The vehicle is then controlled based on the pose to keep the vehicle within the area of operation.

In another broad aspect of the invention, a system for determining three dimensional location and orientation of a movable object in an area of operation can comprise at least first, second and third tags configured at respective predetermined non-collinear locations in or near the area of operation to define directly or indirectly a reference coordinate system, each tag having a tag antenna providing respective tag signals, and a pose determination system adapted to be integrated with or coupled to the object so as to be movable therewith, the pose determination system comprising an antenna hub having at least first, second and third hub antennas arranged in a non- collinear configuration to define an object coordinate system and to receive the tag signals, and a controller coupled to the antenna hub, the controller having one or more processors which execute stored instructions to determine the distance of each of the first, second and third hub antennas to each of the first, second and third tags employing timing of receipt of the tag signals, determine the location of each of the first, second and third hub antennas relative to the reference coordinate system using the antenna to tag distances and predetermined tag locations, and determine the orientation of the object coordinate system relative to the reference coordinate system using the antenna locations.

In another broad aspect of the invention, a pose determination system adapted to be integrated with or coupled to an object so as to be movable therewith comprises an antenna hub having at least first, second and third hub antennas arranged in a non-collinear configuration to define an object coordinate system and to receive tag signals from first, second and third tags, and a controller coupled to the antenna hub. The controller can have one or more processors which execute stored instructions to determine the distance of each of the first, second and third hub antennas to each of the first, second and third tags employing timing of receipt of the tag signals, determine the location of each of the first, second and third hub antennas relative to the reference coordinate system using the antenna to tag distances and predetermined tag locations, and determine the orientation of the object coordinate system relative to the reference coordinate system using the antenna locations.

In another broad aspect of the invention, a method for determining three dimensional location and orientation of a movable object in an area of operation can comprise providing first, second and third tag signals respectively from each of first, second and third tags configured at predetermined non-collinear locations in or near the area of operation to define directly or indirectly a reference coordinate system, receiving, at an antenna hub having at least first, second and third hub antennas arranged in a non-collinear configuration to define an object coordinate system, the tag signals from the first, second and third tags, determining at one or more programmed processors the distance of each of the first, second and third hub antennas to each of the first, second and third tags employing timing of receipt of the tag signals, determining at one or more programmed processors the location of each of the first, second and third hub antennas relative to the reference coordinate system using the antenna to tag distances and predetermined tag locations, and determining at one or more programmed processors the orientation of the object coordinate system relative to the reference coordinate system using the antenna locations.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic diagram of an illustrative autonomous vehicle location system; and

FIG. 2 is an illustrative map showing overlapping defined areas of operation.

DETAILED DESCRIPTION OF EMBODIMENTS OF THE INVENTION

As discussed herein, "pose" is used in the sense of a particular way of a thing being positioned and oriented. Generally speaking, a "pose" of a thing is determined with respect to one or more references, such as the Earth or a landmark or a marker or a tag. A pose may be determined in a particular coordinate system defined usually relative to one of these landmarks or objects or other references. A "pose" of an object includes the location of the object with respect to one or more references, and also includes a heading (in two dimensions, or 2D) or orientation (more generally) with respect to one or more references. In aerospace, a representative 3D application, the full orientation comprises heading, attitude, and bank angles. In other applications the terminology yaw, pitch and roll, and/or Euler angles, may be employed for 3D orientation. A detailed discussion of 3D orientation, including terminology and mathematical descriptions, may be found in the following publications, the disclosures of which are incorporated herein by reference in their entirety: Representing Attitude: Euler Angles, Unit Quaternions, and Rotation Vectors, James Diebel, Stanford University, 20 October 2006, https://www.astro.rug.nl/software/kapteyn/downloads/attitude.pdf; Determination of a Position in Three Dimensions Using Trilateration and Approximate Distances, Willy Hereman and William S. Murphy, Jr., Colorado School of Mines, October 1995, https://inside.mines.edu/~whereman/papers/Murphy-Hereman-Trilateration-MCS-07-1995.pdf.
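For reference, one common convention (shown only as a sketch of standard notation; the rotation order is an assumption, not something prescribed by this disclosure) composes a 3D orientation from yaw ψ, pitch θ, and roll φ as a single rotation matrix:

$$
R(\psi,\theta,\phi) = R_z(\psi)\,R_y(\theta)\,R_x(\phi),\quad
R_z(\psi)=\begin{pmatrix}\cos\psi & -\sin\psi & 0\\ \sin\psi & \cos\psi & 0\\ 0 & 0 & 1\end{pmatrix},\;
R_y(\theta)=\begin{pmatrix}\cos\theta & 0 & \sin\theta\\ 0 & 1 & 0\\ -\sin\theta & 0 & \cos\theta\end{pmatrix},\;
R_x(\phi)=\begin{pmatrix}1 & 0 & 0\\ 0 & \cos\phi & -\sin\phi\\ 0 & \sin\phi & \cos\phi\end{pmatrix}
$$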

The pose of an object may be determined by direct measurement or detection of location and orientation, but this is not the only way to determine an object's pose. Other quantities may be measured or detected that may be indicative of pose, or location or orientation separately. Preferred but non-limiting examples of pose determination are discussed in detail below. As alternative examples, an object's velocity may indicate its direction of movement and its position and its orientation. Similarly, acceleration and change in direction of an object may indicate the pose of the object. As discussed herein, a pose of an object may be determined directly from location and orientation or indirectly from one or more related or derivative parameters. Typically, specific quantities indicative of location and orientation may be computed or otherwise determined, though this is not necessary in all cases.

As will be discussed below, pose is of considerable significance for the operation of an autonomous vehicle. The various embodiments described below include an autonomous vehicle; they are not, however, limited to autonomous vehicles, nor to vehicles at all.

An autonomous vehicle is a machine that moves from place to place without human control or intervention. In some cases, an autonomous vehicle may convey a human from place to place, while in other cases, the autonomous vehicle may be unable to convey a human. In some cases, an autonomous vehicle may be under human control for part of the journey, and in other cases, the autonomous vehicle goes from place to place independent of human control. Autonomous vehicles may be of any size: ships at sea, vessels in space, motorized ground transportation, drones, some weapons or military hardware, carts, trains, automobiles, robotic home vacuum cleaners, and other robotic conveyances can be vehicles that may be, to a degree, autonomous.

Not all autonomous vehicles operate in the same way or under the same conditions. Some autonomous vehicles work in a defined area of operation, and some do not. In general, a defined area of operation bounds or limits the geographical region in which the autonomous vehicle operates. The boundary of the geographical region constrains where the autonomous vehicle can go. Outside the boundary, the autonomous vehicle may not move as it should, or may not move at all; or it may be undesirable for the autonomous vehicle to operate anywhere except for the defined area of operation. In a typical case, the boundaries of the defined area of operation are in close geographic proximity such that an autonomous vehicle operating therein would have line-of-sight view of markers designating the boundaries.

Many of the autonomous vehicles mentioned previously do not operate in defined areas of operation.

The embodiments described herein apply to objects, such as autonomous vehicles, that operate within a defined area of operation. The defined area of operation may be, for example, an outdoor plot of land such as a field, or a factory floor, or a parking lot, or a swimming pool, or a small lake, or a short section of roadway. The defined area of operation may itself be stationary with respect to the planet, but this is not necessary in all implementations. For example, a defined area of operation may exist on the deck of a ship at sea, such that the defined area of operation may be stationary with respect to the ship but moving with respect to the planet. In embodiments, the system and method can be employed to passively track a moving object, such as a person.

To operate within a defined area of operation, an autonomous vehicle ordinarily has to be able to sense or otherwise determine its pose, which may be thought of generally as where the autonomous vehicle is located with respect to the defined area of operation or a landmark proximate to (in, on the border of, or near) the defined area of operation, and the direction in which the autonomous vehicle is oriented or aimed or going with respect to the defined area of operation. It may also be helpful to sense or calculate related or derivative parameters, such as speed of travel, acceleration, change in direction, and future position.

The embodiments discussed below include positioning of at least two reference nodes, having wireless transmission capability, which may be called "tags," at some known positions proximate to the defined area of operation. If the vehicle operates on a flat surface, and only 'heading' is required, then two tags are sufficient; we will refer to this as a 2D pose. In the general case with non-flat topography, where 3D position as well as yaw, pitch, and roll angles are sought, a plurality of tags, such as three or more, will be required; we shall refer to this as the 3D pose. These reference nodes define the area of operation for the autonomous vehicle, and the autonomous vehicle typically stores data indicating which tags are associated with which defined area of operation and how the tags are deployed with respect to the defined area of operation. The tags may be, for example, on the perimeter or border of the defined area of operation, or in the defined area of operation, or near to the defined area of operation, or any combination thereof. Whether a tag is "near" a defined area of operation may be a function of the communication range of the tag. The tags will be essentially stationary with respect to the defined area of operation. In a typical embodiment, the tags will also be comparatively low in functionality, that is, having specialized functions and less versatility than (for example) a general-purpose processor or a cellular telephone. Low functionality may have a number of potential benefits. For example, a tag with low functionality may require little or no external power to operate. As will be mentioned below in connection with an illustrative embodiment, power to the tags may be supplied by solar cells on the tags. There may be embodiments in which a tag is entirely passive, operating on the power received from the autonomous vehicle.

A typical embodiment, however, may include a tag that is active (partially or fully active), that is, powered by some source other than or in addition to the power from the autonomous vehicle. A tag that is active may have an effective range of transmission that exceeds that of a passive tag. An active tag may also support more kinds of signals (in terms of modulation, frequency, information, power, timing, among other things) than a passive tag. A tag with low functionality may also be less costly to produce, which may be advantageous in that some implementations may entail multiple tags. Low functionality may also affect the value of the tags, making them less attractive to thieves or other mischief-makers. Low functionality may also imply that there are fewer ways in which the tags can malfunction or go wrong (and it may be economical to replace a malfunctioning tag).

As previously mentioned, the tags' locations do need to be known with respect to a uniform reference frame. They can either be surveyed (e.g., with respect to the World Geodetic System 1984, or WGS84, datum), or the vehicle could survey them in recursively as follows. Since the vectors from tag_0, tag_1 and tag_2 to the vehicle frame can be computed, we can without loss of generality set the position of tag_0 to be (0,0,0); then the positions of tag_1 and tag_2 relative to tag_0 can be recorded. The vehicle can then be moved to a new location where tag_1, tag_2 and tag_3 can all be seen by the hub sensors, and the position of tag_3 relative to the previously computed tag_1 and tag_2 positions can also be computed, and so on, where the position of tag_(i+1) is computed relative to the previously computed positions of tag_i and tag_(i-1) and then the vehicle is re-positioned anew. Ideally, these measurements could be stored and post-processed as a batch job to obtain an optimal set of positions in the least-squares sense.
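As one way to read the recursive survey just described, here is a minimal sketch; all names and the stop-by-stop data structure are illustrative assumptions rather than part of this disclosure.

```python
import numpy as np

def survey_tags(stop_measurements):
    """Recursively survey tag positions, anchoring tag_0 at the origin.

    stop_measurements: one dict per vehicle stop, mapping tag id -> 3-vector
    from the vehicle to that tag, expressed in a frame whose orientation is
    consistent across stops. The first stop must include tags 0, 1 and 2.
    """
    positions = {0: np.zeros(3)}
    for measured in stop_measurements:
        v = {t: np.asarray(vec, dtype=float) for t, vec in measured.items()}
        known = [t for t in v if t in positions]
        # Place each not-yet-known tag relative to the already-known ones:
        # tag_t - tag_k = v_t - v_k when both are measured from the same stop.
        for t in v:
            if t not in positions and known:
                estimates = [positions[k] + (v[t] - v[k]) for k in known]
                positions[t] = np.mean(estimates, axis=0)
    return positions
```

In practice, the raw stop-by-stop measurements could also be retained and refined together in a single least-squares batch adjustment, as the passage suggests.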

The tags need not be, and ordinarily are not, a part of the autonomous vehicle itself.

Rather, the autonomous vehicle is equipped with a pose-detecting apparatus that can measure, estimate, sense, ascertain, or otherwise detect the relative position of a tag, typically the position of the tag in relation to the vehicle's coordinate system. In a typical embodiment, the pose-detecting apparatus can detect the distance to two or more tags (or three or more tags for 3D pose), and the relative angle (measured with respect to a reference) of each tag. Details of an implementation of range and angle detection using tags are provided in International Patent Application PCT/CA2016/051309, filed November 10, 2016, publication no. WO2017/079839, the disclosure of which is incorporated herein by reference in its entirety. A typical implementation may treat the heading of the autonomous vehicle as a reference, with angles measured relative to the heading. So, for example, an object located at an angle of zero degrees (zero radians) would be straight ahead, and an object located at an angle of 90 degrees (π/2 radians) would be directly to the right (assuming the convention that 90 degrees is to the right and 270 degrees is to the left; the opposite convention also may be applied), and so on. The zero heading can be referenced to a feature of the vehicle (e.g., a direction perpendicular to its steering axle) or, without loss of generality, could be the normal to the hub's three sensor antennae. In one embodiment, by detecting (and detecting over time) the distances and angles, the pose-detecting apparatus can determine the location and orientation of the pose-detecting apparatus (and hence the pose of the autonomous vehicle) with respect to the tags, as well as information related to or derivable therefrom (such as the direction in which the pose-detecting apparatus may be traveling, speed of travel, acceleration, and change in direction). Alternatively, in place of angle detection and as discussed in detail below, three tags may be employed along with tag range information to derive full 3D orientation (yaw, pitch and roll) at one time. Therefore, to clarify, given three or more non-collinear tags, it is possible to determine the 3D pose of the vehicle without it moving.
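To make the range-only case concrete, the sketch below (illustrative names; not the disclosed implementation) recovers one hub antenna's position from its measured distances to three or more surveyed, non-collinear tags by an iterative least-squares solve; repeating it for each of the three hub antennas yields the points from which orientation can then be derived.

```python
import numpy as np

def locate_antenna(tag_positions, ranges, iters=20):
    """Estimate one hub antenna position from its ranges to known tags.

    tag_positions: (N, 3) surveyed tag coordinates, N >= 3, non-collinear.
    ranges:        (N,) measured antenna-to-tag distances.
    Gauss-Newton refinement starting from the centroid of the tags.
    """
    tags = np.asarray(tag_positions, dtype=float)
    r = np.asarray(ranges, dtype=float)
    p = tags.mean(axis=0)                       # initial guess
    for _ in range(iters):
        d = np.linalg.norm(tags - p, axis=1)    # predicted ranges
        J = (p - tags) / d[:, None]             # d(predicted range)/d(p)
        step, *_ = np.linalg.lstsq(J, r - d, rcond=None)
        p = p + step
        if np.linalg.norm(step) < 1e-9:
            break
    return p
```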

Other techniques for determining the pose of the autonomous vehicle with respect to the defined area of operation (including more sophisticated forms of triangulation) may also be employed. As noted above, range information to three tags may be employed to derive 3D pose in a preferred embodiment. For purposes of discussion, however, first the use of distance and angle will be described. Distance-and-angle-based operation is often accurate enough for typical implementations, and may have benefits of low cost/overhead, simplicity, easy set-up, easy maintenance, easy troubleshooting/repair, and ready adaptability, among other things.

For distance-and-angle-based operation, detecting the distances and relative angles to two or more tags (rather than a single tag) is a matter of prudence; and as a practical matter, three or more tags are often useful. Detection of two or more tags (or three or more tags for 3D implementations) not only may improve accuracy and precision, it avoids situations in which a single-tag system will fail. For example, an autonomous vehicle, traveling in a circular path with the tag located at the center of the circle, will read that the distance to the tag is constant, and that the angle to the tag is constant (assuming the tag does not convey any angle information to the autonomous vehicle). In other words, even though the autonomous vehicle is changing position relative to the tag and the defined area of operation, the distance and angle measurements give incomplete information as to how the autonomous vehicle is moving, or in which direction it is moving, or how fast it is going, or where it is within the defined area of operation. When the autonomous vehicle detects distance and angle to a second tag at a different location, however, this failure can be avoided; the autonomous vehicle mathematically can determine its own location and heading. Other circumstances may exist in which detection of a single tag will result in an inaccurate or ambiguous determination of location or orientation or both. Generally speaking, detecting distance and angle to two (or more) tags can reduce or eliminate error or ambiguity in pose and information related to or derivable therefrom. Also generally speaking, three tags are more useful than two, and four are more useful than three. In some cases (such as where the defined area of operation has obstacles or hills or places where tags can be blocked, or where the autonomous vehicle may change altitude), more tags may be helpful.

FIG. 1 is a schematic diagram of a typical autonomous vehicle pose-detecting system 10. The system 10 includes an autonomous vehicle 12, which is a machine that typically includes mechanical and electronic components. The system also includes two tags 14A and 14B that serve as reference nodes. Although two tags are shown in FIG. 1, any number of tags may be employed. (A generic tag may be identified by reference numeral 14.)

The autonomous vehicle 12 may be a vehicle of any kind. For purposes of illustration, a typical autonomous vehicle 12 will be described as a mower that includes apparatus to mow an outdoor field. The outdoor field may be thought of as the defined area of operation of the mower. It may be undesirable for the autonomous vehicle 12 to operate autonomously outside the boundaries of the outdoor field. Other examples of an autonomous vehicle 12, by no means the only examples, include agricultural equipment, irrigation equipment, cleaning equipment, moving/conveying equipment, and delivery equipment. Although ground-based autonomous vehicles will be discussed, alternative embodiments include water-based autonomous vehicles (including those that float or submerge) and air-based autonomous vehicles (such as low-flying drones).

The autonomous vehicle 12 may employ any form of propulsion (such as petroleum-powered, electric, wind-propelled) and may be of any configuration or size. The autonomous vehicle 12 may be configured to convey one or more human beings, or not. The autonomous vehicle 12 may include one or more pieces of functional apparatus 32 according to its general purposes; in the case of a mower, for example, the functional apparatus 32 may include specialized equipment for mowing. The functional apparatus 32 includes one or more kinds of mobility apparatus 34, which convey the autonomous vehicle 12 from place to place (often within the defined area of operation, but the mobility apparatus 34 may convey the autonomous vehicle 12 from place to place outside the defined area of operation as well). Mobility apparatus 34 may also include apparatus that steers the autonomous vehicle 12, that governs the speed of the autonomous vehicle 12, that brakes the autonomous vehicle 12, or other components that make the autonomous vehicle 12 function as a vehicle. Mobility apparatus 34 may include various things such as one or more wheels, propellers, motors, fuel supplies, batteries or other power-related components, rudders, and so forth.

The functional apparatus 32 may also include any of various safety systems, such as systems to suspend operations or shut down in case of malfunction or when any of several hazardous conditions may be present.

The tags 14 may be deployed at any known positions inside the defined area of operation, or on the perimeter or border of the defined area of operation, or proximate to the defined area of operation. The tags 14 may be mounted upon dedicated pedestals, i.e., supporting structures that hold the tags 14 in fixed positions relative to the defined area of operation (and that may have other functionality); the tags 14 may be mounted upon already-existing structures in fixed positions relative to the defined area of operation (such as fence posts, streetlights, buildings, trees, and so on); or any combination thereof. The tags 14 may be deployed at any height above the ground; for some installations, for example, one meter above the ground might be a typical height for all tags 14, while for another installation, some tags 14 may be positioned higher above the ground while others are positioned lower.

The autonomous vehicle 12 includes a pose-detecting apparatus, which includes an antenna hub 16. The antenna hub 16, described in more detail below, wirelessly transmits a signal 18 (the signal being an electromagnetic signal transmitted wirelessly), which is received by the tags 14A, 14B; and the tags 14A, 14B generate a return signal in response to the signal 18. In FIG. 1 , tags 14A, 14B receive the same single signal from the antenna hub 16 and respond to this single signal 18; in some embodiments, the antenna hub 16 may generate multiple signals of different kinds, and the tags 14A, 14B may respond to different signals. The antenna hub 16 receives and detects the return signals 20A, 20B from the tags 14A, 14B. The autonomous vehicle 12 includes a processor 22 (or more generally a "controller" which may include plural processors with dedicated functions) that receives as input the return signals 20A, 20B or signals from the antenna hub 16 that are functions of the received return signals 20A, 20B. As a function of this input, the processor computes, infers, calculates, measures, or otherwise determines the pose of the autonomous vehicle 12 with respect to the tags 14, and with respect to the defined area of operation. As used herein, a first thing (such as an output) is computed or otherwise determined "as a function of" a second thing (such as an input), when the first thing is directly or indirectly dependent upon the second thing; the first thing may be, but need not be, dependent exclusively upon the second thing.

The antenna hub 16 may be in any of several configurations, and can comprise a plurality of hub antennas. The plurality of hub antennas can be operatively connected to any suitable receiver/transceiver radio, including an ultra-wideband (UWB) receiver/transceiver radio, such as an integrated UWB radio system like the DW1000 available from DecaWave of Dublin, Ireland, for example. One illustrative configuration includes three omnidirectional antennas deployed on the vertices of a triangle, such as an equilateral triangle. The distance from one antenna to another may be known to a good degree of precision. When a signal is received from a tag 14, the signal may be received by a first hub antenna in the antenna hub 16 first, and by a second hub antenna in the antenna hub 16 later, after a tiny but measurable delay. By applying principles of geometry and trigonometry, an angle of the tag 14, with respect to the orientation of the autonomous vehicle 12 or the antenna hub 16, can be computed. (Other parameters of interest may be computed or otherwise determined as well.)

Distance of the tag 14, with respect to the orientation of the autonomous vehicle 12 may be computed on the basis of the received signals in a number of ways. One way involves each tag 14 in the system 10 transmitting its response 20 in a manner to reduce interference among responses from several tags.

Tag 14B shows illustrative components of a tag 14. An antenna 36 may detect or receive electromagnetic signals 18 from the autonomous vehicle 12, and transmit electromagnetic signals 20 to the autonomous vehicle 12. A processor 38 may process the electromagnetic signals 18 detected or received by the antenna 36, and may record the time that the signals 18 were received according to an on-board clock 40. (Various tags 14 in the system 10 may synchronize their clocks 40 with one another, but this is not necessary.) Any data, such as the time a signal 18 was received or information about the tag 14B itself, may be stored in memory 42. Although the tag processor 38 may be of any type, there may be practical advantages for the processor 38 to have limited capability or low functionality, as mentioned previously.

In some embodiments, the tag 14B may have a power supply 44, which may include one or more power sources such as a battery, a solar power array, connection to an electrical grid, and so forth. The tag 14B may be configured to operate automatically in a variety of power modes, such as operating in a low-power operating state for much of the time, and automatically switching to a high-power operating state after detecting a signal from an autonomous vehicle 12, and automatically switching back to a low-power operating state after responding to the signal from the autonomous vehicle 12. Operating much of the time in a low-power operating state conserves power for times when more power is useful.

The distance of the antenna hub 16 to a tag 14 is a function of the time it takes for an electromagnetic signal 20 (traveling at the speed of light) to travel from the antenna 36 of a tag 14 to the antenna hub 16. There are numerous ways in which this travel time can be measured. In one embodiment, signal 18 transmitted by the antenna hub 16 may be a polling signal. This same polling signal may be broadcast to all tags 14 in range. In response to the polling signal, a tag 14 may (for example) change from a low-power operating state to a high-power operating state, and record the time at which the signal 18 was received. Since, in some embodiments, various tags 14 may have assigned time windows in which to broadcast their signals 20 (the tags 14 possibly operating on the presumption that the antenna hub 16 is somewhere inside the defined area of operation, but the tags 14 not necessarily having information about where the antenna hub 16 is located), each tag 14 may wait until its assigned time window to transmit its response signal 20. The response signal 20 may include an identification of the tag 14 sending the response signal 20, the time at which the signal 18 from the antenna hub 16 was received, as well as the time at which the response signal 20 was sent (both times according to the on-board clock 40 used by the tag 14). This response signal may be received by the antenna hub 16. Even if the on-board clock 26 of the autonomous vehicle 12 is not perfectly synchronized with the on-board clock 40 of the tag 14, one or more error-correction techniques may be applied to measure the time in which it took for the signal from the tag 14 to reach the antenna hub 16. In particular, if the hub's three antennae are synced to the same local oscillator, then all single-difference common mode range errors cancel out. When the time for the signal to travel from the tag 14 to the antenna hub 16 is known, then the distance from the tag 14 to the antenna hub 16 is also known (linear distance traveled by an electromagnetic signal is the travel time multiplied by the speed of light).
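As a numeric sketch of this round-trip timing (illustrative names; a real implementation would add the clock-drift and error corrections described above), the tag's reported turn-around time is subtracted from the hub's measured round-trip time, which cancels the unknown offset between the two unsynchronized clocks.

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(t_poll_tx, t_reply_rx, t_tag_rx, t_tag_tx):
    """Hub-to-tag distance from one poll/reply exchange.

    t_poll_tx, t_reply_rx : hub-clock times (s) when the poll was sent and
                            the reply was received.
    t_tag_rx, t_tag_tx    : tag-clock times (s), reported in the reply, when
                            the poll arrived and the reply was sent.
    """
    round_trip = t_reply_rx - t_poll_tx      # measured entirely on the hub clock
    turn_around = t_tag_tx - t_tag_rx        # measured entirely on the tag clock
    time_of_flight = (round_trip - turn_around) / 2.0
    return time_of_flight * C
```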

The electromagnetic signals 18, 20 may be in any format, with any characteristics (frequency, band, modulation, etc.). In a typical implementation, ultra-wideband (UWB) signals may be employed. UWB may have a number of advantages, including a wide spectrum of frequency bands, with some frequencies having less risk of being blocked. UWB signals may carry digital information of almost any kind (including timing information and information identifying the responding tag), they may operate at very low power, and they may be less prone to interfere with other electromagnetic signals in the area. UWB can function in a variety of lighting and weather conditions. One concern about UWB may be the range of UWB, which may be less than the range of other electromagnetic communication technologies. Although there is no sharply-defined range for UWB, ordinarily a defined area of operation should be sized and shaped so as not to have any locations in which the autonomous vehicle will be out of range of all of the tags. It may be a criterion for layout of a defined area of operation, in one example, that all locations in the defined area of operation be within 70 meters of at least two tags. In another example, it may be specified that all locations in the defined area of operation be within 50 meters of at least three tags.

Using techniques such as UWB, the autonomous vehicle 12 may compute the distance to one or more tags 14, as well as the angle of each tag relative to the autonomous vehicle 12. With information about distance and angle, the autonomous vehicle 12 may compute the pose of the autonomous vehicle 12 relative to the tags 14, and relative to the defined area of operation. Alternatively, as clarified above, pose may be determined using distance but without detecting angle.

With the computed pose of the autonomous vehicle 12, the processor 22 can control the operation of the autonomous vehicle 12 as a function of the pose (or as a function of any parameters related to or derived from the location and orientation). Examples of controlling the operation include turning a corner, avoiding an obstacle, increasing/decreasing speed, or activating/deactivating some of the functional apparatus 32.

The autonomous vehicle 12 may include one or more memory elements 24, which may be of any kind. Memory 24 may be volatile or non-volatile or any combination thereof. Memory 24 may store software or instructions that pertain to determining the pose of the autonomous vehicle 12 or how the autonomous vehicle 12 is to carry out its functions (e.g., which particular tags 14 are proximate to or define the defined area of operation, or what path to pursue as the autonomous vehicle 12 moves about the defined area of operation, or what to do at a particular site in the defined area of operation, or what hazards may exist in the defined area of operation). Memory 24 may also store data of any kind, such as a map of the defined area of operation, or a record of places where the autonomous vehicle 12 has been. In a typical implementation, the processor 22 cooperates with the memory 24 to perform any of many kinds of computational and decision-making functions.

The autonomous vehicle may include an on-board clock 26. On-board clock 26 may keep time internally or in reference to external time signals (such as wireless network signals or global positioning system (GPS) signals), or both.

The processor 22 in FIG. 1 may be, but need not be, a single discrete component of the autonomous vehicle 12. In some embodiments, the processor 22 may be a general-purpose processor (configured to perform one or more operations by executable instructions), or a specialized processor, or any combination of processing elements. The operations of the processor 22 may be distributed among multiple components. In some embodiments, various processing functions may be divided among multiple elements (for example, some processing may be performed by supplemental location apparatus 28, as discussed below). In some embodiments, components such as the clock 26 may be included in the processor 22, and need not be embodied as discrete components.

Similarly, the memory 24 need not be embodied as a single discrete component. In some embodiments, memory may include one or more memory elements that are physically separated from the autonomous vehicle 12, with data and instructions conveyed wirelessly (for example) to the autonomous vehicle 12.

As noted above, the autonomous vehicle 12 may determine its pose with respect to the defined area of operation by measuring the linear distance from the antenna hub 16 to any number of tags 14, and measuring the angular displacement of the tags. Alternatively, as clarified above, pose may be determined using distance but without detecting angle. The linear distance from the autonomous vehicle 12 to a tag 14 is a function of the time it takes a signal to travel from the autonomous vehicle 12 to the tag 14; it is also possible to think of the linear distance between the autonomous vehicle 12 and the tag 14 as being a function of the time it takes for a signal to travel from the autonomous vehicle 12 to the tag 14 and the time it takes for a reply signal to travel from the tag 14 to the autonomous vehicle 12. Electromagnetic signals travel at the speed of light, so the time it takes for a signal to go from one site to another is a function of the distance between the sites.

As a practical matter, and for purposes of illustrative discussion, it will be assumed that time computations are performed by the autonomous vehicle 12. The autonomous vehicle 12 in effect transmits a signal and measures the time it takes to receive a reply from a tag 14. This measured time is a function of the distance from the autonomous vehicle 12 to the tag 14.

Determination of the angle of a tag 14 relative to the autonomous vehicle 12 may be accomplished by any of several techniques. A comparatively uncomplicated technique may involve the antenna hub 16 having two or more antennas, disposed apart from one another by a known or fixed distance. A reply signal from a tag 14 may be received by the two antennas at slightly different times; the relative angle is a function of this time difference.
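A brief sketch of that time-difference-to-angle step, under the usual assumption that the tag is far away compared with the antenna spacing (names are illustrative):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def angle_from_time_difference(delta_t, baseline):
    """Bearing of a distant tag from the arrival-time difference.

    delta_t  : arrival time at antenna B minus arrival time at antenna A (s).
    baseline : known spacing between the two hub antennas (m).
    Assumes a planar wavefront (tag far away relative to the baseline).
    Returns the angle (radians) off broadside of the antenna pair.
    """
    path_difference = C * delta_t
    ratio = max(-1.0, min(1.0, path_difference / baseline))  # clamp measurement noise
    return math.asin(ratio)
```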

In a particular embodiment of the invention, the method may additionally comprise measuring a phase and time of arrival of a signal, such as an ultra-wideband signal, transmitted by the tag for each of the plurality of hub antennae; determining the differential phase of arrival, differential time of arrival, time angle of arrival and phase angle of arrival for each of the plurality of hub antennae; and determining a location of the tag relative to the plurality of hub antennae using the phase angle of arrival and range of the tag for each of the respective hub antennae. In one such embodiment, determining the location of the tag may comprise determining a three dimensional (or 3D) location of the tag relative to each of the plurality of hub antennae, using the phase angle of arrival and range of the tag for each of the hub antennae. In an exemplary such embodiment, three hub antennas may be used in combination as two or three pairs of antenna elements, to determine a 3D location of the tag using the phase angle of arrival and range of the tag for each of the two or three pairs of antenna elements. In another embodiment, determining the location of the tag may comprise determining an aggregation or average of a plurality of determined locations using the phase angle of arrival and range for each of the two or more respective pairs of hub antennae.

In one embodiment, system 10 may desirably provide for location of each of the tags 14 in the plurality of tags by determining an angle of arrival of the inbound signal with respect to the antenna hub 16, which may be combined with a range of tags 14 from the antenna hub 16 to calculate a relative position of each of the plurality of tags 14 with respect to antenna hub 16, such as recited according to aspects of the presently disclosed methods described in further detail below. In a particular embodiment, system 10 may be adapted to implement embodiments of the present inventive methods which provide for using a differential time of arrival of an inbound signal between the first and second hub antennas to determine a differential time angle of arrival. This time angle of arrival may desirably be used in combination with a multi-lobe differential phase angle of arrival beam pattern calculated from the phase difference of arrival of the inbound signal between the hub antennas, such as to disambiguate the multi-lobe phase angle of arrival beam pattern and provide a desirably more precise, disambiguated phase angle of arrival of the inbound signal relative to the first and second hub antennas. Accordingly, in such an embodiment, system 10 may desirably provide improved accuracy and precision for locating the position of tags 14 relative to the first and second hub antennas, compared with time of arrival methods alone. In another embodiment, system 10 may desirably provide for use of an antenna hub 16 having a plurality of sparsely spaced hub antennas, which may be widely spaced relative to the wavelength of the UWB carrier wave signal, such as to provide greater position determination accuracy for a particular precision of time and/or phase differential measurement at the first and second hub antennas.
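
A minimal sketch of this disambiguation, assuming a widely spaced antenna pair, a wrapped phase-difference measurement, and a coarse but unambiguous time-difference angle (all names illustrative), might look like:

```python
import math

def candidate_phase_angles(delta_phase_rad: float, baseline_m: float,
                           wavelength_m: float) -> list[float]:
    """All angles of arrival consistent with a wrapped phase difference measured
    across a sparsely spaced antenna pair; each integer cycle ambiguity k
    corresponds to one lobe of the multi-lobe beam pattern."""
    angles = []
    max_cycles = int(baseline_m / wavelength_m) + 1
    for k in range(-max_cycles, max_cycles + 1):
        ratio = wavelength_m * (delta_phase_rad + 2.0 * math.pi * k) / (2.0 * math.pi * baseline_m)
        if -1.0 <= ratio <= 1.0:
            angles.append(math.asin(ratio))
    return angles

def disambiguate_with_time_angle(delta_phase_rad: float, coarse_time_angle_rad: float,
                                 baseline_m: float, wavelength_m: float) -> float:
    """Select the phase-derived lobe closest to the coarse (but unambiguous)
    time-difference angle, giving a precise, disambiguated angle of arrival."""
    candidates = candidate_phase_angles(delta_phase_rad, baseline_m, wavelength_m)
    return min(candidates, key=lambda a: abs(a - coarse_time_angle_rad))
```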

In a further embodiment, the antenna hub 16 may optionally also be configured to transmit an outbound signal for reception by the tags 14. In one such embodiment, the outbound signal may be used as a polling signal to initiate a response by tags 14, by transmission of an inbound signal, for example. In another aspect, the outbound signal may be used in connection with the inbound signal to provide a round trip time of flight measurement for determining a range of tags 14 relative to the antenna hub 16, for example. In yet another aspect, the outbound signal may be used in conjunction with the inbound signal and/or optionally also with calibration signal 30 to allow for synchronization of time measurements or to account for clock drift between tags 14 and the antenna hub 16, or to measure and/or calculate error or calibration data such as interference, reflection, multipath, distortion, attenuation, or other factors involving the transmission of UWB signals by system 10.
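
One well-known way of reducing the effect of clock offset and drift in round-trip measurements is symmetric double-sided two-way ranging; the following is a sketch of that generic technique under assumed timestamp names, and is not a description of the specific signalling of system 10:

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def sds_twr_range(t_round1_s: float, t_reply1_s: float,
                  t_round2_s: float, t_reply2_s: float) -> float:
    """Symmetric double-sided two-way ranging: each side measures one round-trip
    time and one reply delay on its own clock; combining the four intervals
    largely cancels the first-order effect of clock offset and drift between
    the antenna hub and a tag."""
    tof_s = ((t_round1_s * t_round2_s) - (t_reply1_s * t_reply2_s)) / (
        t_round1_s + t_round2_s + t_reply1_s + t_reply2_s)
    return SPEED_OF_LIGHT_M_PER_S * tof_s
```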

As alluded to earlier, in embodiments, the system can be employed to passively track a movable object within a defined area of operation. In such embodiments, the antenna hub can be affixed to the movable object, such as a person or animal, and the processor can be simply employed to determine the pose of the movable object relative to the plurality of tags.

Next, a preferred embodiment for determining vehicle orientation will be described.

Having previously surveyed the tag positions via vehicle self-survey or via other survey techniques, one can then perform a translation and rotation of this coordinate system so as to align it with local north (or local magnetic north), such that the coordinates (x, y, z)_enu align with an 'East', 'North', 'Up' (ENU) frame. That is, (x_i, y_i, z_i)_enu = X + R (x_i, y_i, z_i)_survey for some translation vector X and rotation R. The three axes 'East', 'North', and 'Up' need not align with any earth spin or magnetic axis, or even be level to the ground, but they must be orthonormal. Again, without loss of generality, take the normal to the three hub antennae to be the vehicle orientation vector.
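
As an illustrative sketch of this alignment step (array shapes and function names are assumed for the example), the surveyed tag coordinates can be mapped into the ENU frame as follows:

```python
import numpy as np

def survey_to_enu(tag_positions_survey: np.ndarray,
                  X: np.ndarray, R: np.ndarray) -> np.ndarray:
    """Apply the translation X (shape (3,)) and rotation R (shape (3, 3)) that
    align the surveyed coordinate system with the ENU frame:
    p_enu = X + R @ p_survey, applied to every row of an (N, 3) array of
    surveyed tag positions."""
    return X + tag_positions_survey @ R.T
```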

Having our vehicle orientation vector and a reference frame in an ENU frame, suppose the vehicle were driven to some random place in the area of operation. Then, for each hub antenna A_j, one can use the tag locations (x_i, y_i, z_i)_enu and the ranges r_ij (from the i-th tag to the j-th antenna) to perform trilateration calculations and to compute the ENU position of the j-th antenna, (x_aj, y_aj, z_aj)_enu. This can be done as described in the Hereman et al. publication referenced above.
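
By way of illustration only, one generic linearized least-squares trilateration (not necessarily the algorithm of the Hereman et al. publication) that recovers an antenna's ENU position from ranges to at least four non-coplanar tags might be sketched as:

```python
import numpy as np

def trilaterate(tag_positions_enu: np.ndarray, ranges_m: np.ndarray) -> np.ndarray:
    """Least-squares estimate of one antenna's ENU position from ranges to N
    tags at known ENU positions (N >= 4, not all coplanar). Subtracting the
    first range equation from the others linearizes the problem."""
    p = np.asarray(tag_positions_enu, dtype=float)   # (N, 3) tag positions
    r = np.asarray(ranges_m, dtype=float)            # (N,) measured ranges
    A = 2.0 * (p[1:] - p[0])
    b = (np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)) - (r[1:] ** 2 - r[0] ** 2)
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position
```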

It then remains to recover the orientation vector in ENU coordinates. This can be done by calculating the cross product of antenna position differences, (A_2 − A_1)_enu × (A_3 − A_2)_enu = D_enu, i.e. the vector normal to the hub antennas (but now in the ENU frame). The angle of the rotation (relative to the 'north' axis) is φ = cos⁻¹(D_enu · (0, 1, 0)), and the axis is V = (0, 1, 0) × D_enu (V is a unit vector).

From this we can then create a unit quaternion (a.k.a. versor) Q(r, x, y, z):

r = cos(φ/2),

x = V_x sin(φ/2),

y = V_y sin(φ/2),

z = V_z sin(φ/2)

The above-referenced Diebel publication shows how to turn a unit quaternion into a rotation matrix (cf. equation (125)), and equation (72) describes how to recover the Euler angles for yaw, pitch, and roll using the (1, 2, 3) Euler sequence. This provides the 3D orientation derived from signals from three tags received at three antennas in the antenna hub. The above computations may be implemented in processor(s) 22.
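
A compact sketch of the orientation recovery described above is given below; it uses a generic quaternion-to-Euler conversion (here via SciPy) rather than the specific equations of the Diebel publication, and the angle-sequence convention would need to be matched to that reference in a real implementation:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def orientation_from_antenna_positions(a1_enu, a2_enu, a3_enu):
    """Recover yaw, pitch, and roll (radians) from the trilaterated ENU
    positions of the three hub antennas, following the steps above: hub-normal
    D_enu, rotation angle phi about axis V relative to 'north', unit
    quaternion, then Euler angles. Assumes D_enu is not parallel to north."""
    a1, a2, a3 = (np.asarray(a, dtype=float) for a in (a1_enu, a2_enu, a3_enu))
    d = np.cross(a2 - a1, a3 - a2)
    d /= np.linalg.norm(d)                                   # D_enu as a unit vector
    north = np.array([0.0, 1.0, 0.0])
    phi = np.arccos(np.clip(np.dot(d, north), -1.0, 1.0))    # rotation angle
    v = np.cross(north, d)
    v /= np.linalg.norm(v)                                    # rotation axis V (unit vector)
    # Unit quaternion Q(r, x, y, z): r = cos(phi/2), (x, y, z) = V * sin(phi/2)
    q_xyzw = np.concatenate([v * np.sin(phi / 2.0), [np.cos(phi / 2.0)]])
    yaw, pitch, roll = Rotation.from_quat(q_xyzw).as_euler('ZYX')
    return yaw, pitch, roll
```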

Some embodiments of the autonomous vehicle 12 include supplemental location apparatus 28. Supplemental location apparatus 28 may include any of several kinds of location apparatus that may be used in the event that the antenna hub-tag system (or the distance-and-angle technique) is inadequate for brief or extended periods of time. An example of a supplemental location apparatus 28 may be, for example, a GPS receiver, such as a conventional GPS receiver or a real-time kinematic (RTK) receiver. Other examples of supplemental location apparatus 28 may include one or more inertial sensors, or an echolocation apparatus (such as radar or sonar), or a compass, or an odometer, or a gyroscope, or a wheel sensor/encoder, or a visual sighting system, or a remote-operator-assisted piloting system, or an altimeter. Some kinds of supplemental location apparatus 28 may be useful for determining location but not orientation, some may be useful for determining orientation but not location, and some may be useful for determining both. In particular, if using an accelerometer to determine pitch and roll, it is possible to use only two tags to obtain a 3D pose. Similarly, by employing a barometer, and calibrating at a point of known height or using previous 3D pose measurements to calculate differential height, it is also possible to obtain a 3D pose while using only two tags (current low-cost barometers have a relative accuracy of approximately 10 cm). A single-antenna RTK system used alone, however, will yield only position. On the other hand, a properly calibrated MEMS accelerometer, gyro, and magnetometer alone can yield 3D orientation (but not position).
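
For instance, the pitch and roll contributed by an accelerometer could be computed as in the following sketch, which assumes the vehicle is momentarily not accelerating; the function name is illustrative, and axis and sign conventions vary between sensors:

```python
import math

def pitch_roll_from_accelerometer(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Static pitch and roll (radians) from a body-frame accelerometer reading of
    gravity, usable to supply the two orientation angles missing from a
    two-tag range solution. Valid only when the vehicle is not accelerating;
    the accelerometer's axis and sign conventions should be checked."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll
```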

The supplemental location apparatus 28 may generate one or more signals as a function of the quantity being sensed, which in turn is a function of the pose of the autonomous vehicle 12. The processor 22 may determine the pose of the autonomous vehicle 12 (in or outside the defined area of operation) as a function of the signal generated by the supplemental location apparatus 28.

In the course of operation, the antenna hub 16 may lose contact with one or more tags 14 or may fail to receive signals 20 from one or more tags 14. Loss of contact may be due to any number of reasons, such as an object that happens to be interposed between the antenna hub 16 and one or more tags 14 (interfering with line-of-sight or interfering with signals between the antenna hub 16 and one or more tags 14), or damage to a tag 14, or interference from a weather condition, or breakdown or malfunction of the antenna hub 16. Conditions such as any of these may result in outages of the distance-and-angle system. The outages may be momentary, or brief, or extended. Supplemental location apparatus 28 may be used during an outage for purposes of pose correction, or for emergency operation, or for bringing the autonomous vehicle 12 to a safe stop, or returning the autonomous vehicle 12 to a home location, or guiding the autonomous vehicle 12 away from a hazard, or changing the operating mode of the vehicle from autonomous to user-controlled, for example. Supplemental location apparatus 28 may also be used when there is no outage.

As an alternative to, or in addition to, utilizing supplemental location apparatus 28, the autonomous vehicle 12 may use previous data and computations to move about when (for example) contact with all tags (or all but one tag) is lost. The processor 22, having previously computed the position and heading and having information from devices such as a compass, a wheel sensor/encoder, or a vertical gyro, may extrapolate position and heading. If contact with the tags is re-established within a reasonable time, the processor 22 may correct for errors (if any) and the autonomous vehicle 12 may proceed as before. If contact with the tags is not re-established within a reasonable time, the autonomous vehicle 12 may take some other action, such as shutting down or issuing a distress call. The autonomous vehicle 12 may also call upon supplemental location apparatus 28 for assistance with navigating.
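
A minimal sketch of such extrapolation in two dimensions, assuming an odometry distance and a heading change are available from the supplemental sensors (names illustrative; a practical system might instead fuse these measurements in a filter), is:

```python
import math

def dead_reckon(x_m: float, y_m: float, heading_rad: float,
                distance_m: float, heading_change_rad: float) -> tuple[float, float, float]:
    """Extrapolate a previously computed 2D position and heading using a
    travelled distance (e.g. from a wheel encoder) and a heading change
    (e.g. from a compass or gyro), for use while contact with the tags is
    lost."""
    heading_rad += heading_change_rad
    x_m += distance_m * math.cos(heading_rad)
    y_m += distance_m * math.sin(heading_rad)
    return x_m, y_m, heading_rad
```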

The supplemental location apparatus 28 may have deficiencies of its own. Some supplemental location apparatus 28 may be too costly to operate at all times, or may be susceptible to becoming unreliable in certain environments or bad weather, for example. Even so, if the distance-and-angle techniques develop trouble, the supplemental location apparatus may under some circumstances be able to keep the trouble from becoming worse.

As shown in FIG. 1, the autonomous vehicle 12 may include input/output (I/O) devices 30 other than those on the antenna hub 16 or the supplemental location apparatus 28. Such I/O devices 30 may be of any kind; input may be received and output transmitted wirelessly, audibly, visually, haptically, or in any combination thereof, or in other fashions. Examples of other I/O devices 30 include a radio receiver, an alarm, a warning light, a keypad, user controls, and an emergency stop switch.

There are many ways in which a defined area of operation may be defined or otherwise established. One illustrative technique involves having the tags 14 deployed proximate to the expected defined area of operation, and manually positioning or guiding the autonomous vehicle 12 around the perimeter of the defined area of operation. As the autonomous vehicle 12 moves around the perimeter, the autonomous vehicle 12 notes the position of the tags 14. Once the perimeter is closed, the autonomous vehicle 12 has information about the boundaries of the defined area of operation, and the positions of the tags 14 with respect to the boundaries. From this information, the autonomous vehicle 12 may create a map of the defined area of operation. Another technique may include manually positioning or guiding the autonomous vehicle 12 to vertices of the defined area of operation. A further technique may involve moving the autonomous vehicle 12 proximate to a hazard, and instructing the autonomous vehicle 12 to avoid the hazard. Still a further technique may entail the autonomous vehicle automatically following physical perimeter markers, such as a fence, and regarding the area inside the perimeter markers as the defined area of operation.
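
As a sketch of how a recorded perimeter might later be used (the function name and the 2D representation are assumptions for this example), a point-in-polygon test can decide whether a computed pose lies within the defined area of operation:

```python
def point_in_polygon(x: float, y: float, perimeter: list[tuple[float, float]]) -> bool:
    """Ray-casting test of whether the point (x, y) lies inside the polygon
    traced while the vehicle was guided around the perimeter of the defined
    area of operation."""
    inside = False
    n = len(perimeter)
    for i in range(n):
        x1, y1 = perimeter[i]
        x2, y2 = perimeter[(i + 1) % n]
        crosses = (y1 > y) != (y2 > y)
        if crosses and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside
```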

Whatever functions the autonomous vehicle 12 performs, they need not be performed in the same way at all places in a defined area of operation, but may be set or changed or suspended depending upon where the autonomous vehicle 12 is located and/or oriented within the defined area of operation. For purposes of illustration, consider an example that will be discussed in more detail in relation to FIG. 2: the autonomous vehicle is a mower and the defined area of operation is a playing area on a golf course (e.g., tee area, fairway, rough, green, cart paths, and hazards for a single hole). The autonomous vehicle 12 may be instructed to mow this defined area of operation. The mowing operations, however, need not be uniform throughout the entire playing area. The autonomous vehicle 12 may be instructed to avoid the green and the hazards entirely, for example, and do no mowing operations there. The autonomous vehicle 12 may be instructed to mow the grass in the fairway to a shorter length than the grass in the rough, etc.

FIG. 2 is an illustrative map showing overlapping defined areas of operation on a golf course. FIG. 2 shows one (first) playing area 50 on an illustrative golf course, and a neighboring (second) playing area 52 on the same golf course. Each playing area may have its own tee area, fairway, rough, green, and hazards; and the layout of these features will be different for every playing area on the golf course. An autonomous vehicle 12 that functions as a mower may be used to mow the grass in the various playing areas, while avoiding hazards and cutting the grass to desired lengths at various sites.

Deployed on the golf course are several tags 14; in FIG. 2, illustrative tags 54, 56, 58, 60, 62, 64, and 66 are shown. Some of the tags may be deployed in trees proximate to the playing areas, others may be deployed on dedicated pedestals, and others may be deployed in other fashions.

Generally speaking, tags 54, 56, 58, 60, 62, and 66 define the first defined area of operation 68, and the first defined area of operation 68 is related to the first playing area 50. Tags 60, 62, 64, 66, and other tags (not shown in FIG. 2) define the second defined area of operation 70, and the second defined area of operation 70 is related to the second playing area 52.

As depicted in FIG. 2, the first and second defined areas of operation 68, 70 do not overlap geographically, though the first and second defined areas of operation 68, 70 may share a tag 62. There may be instances in which an autonomous vehicle 12 that performs functions on the first defined area of operation 68 may be called upon to perform similar functions on the second defined area of operation 70. Although it may be possible for an autonomous vehicle 12 to move autonomously from the first defined area of operation 68 to the second defined area of operation 70, such movement may be aided by creation of a third defined area of operation 72, which geographically overlaps the first defined area of operation 68 and the second defined area of operation 70. In FIG. 2, the boundaries of the third defined area of operation 72 essentially correspond to tags 58, 60, 64, and 66, and tag 62 is positioned well within and away from the perimeter of the third defined area of operation 72.

When the autonomous vehicle 12 is in the area of overlap of the first defined area of operation 68 and the third defined area of operation 72, the autonomous vehicle 12 may autonomously terminate its functions in the first defined area of operation 68 and begin its functions in the third defined area of operation 72. In the example of an autonomous vehicle 12 that functions as a mower, the autonomous vehicle 12 may finish its work in the first defined area of operation 68, and then move to that part of the first defined area of operation 68 that overlaps the third defined area of operation 72. The autonomous vehicle 12 may then begin its mowing operations in the area between the first defined area of operation 68 and the second defined area of operation 70, such as mowing the grass in the region 74 between the playing areas 50, 52. In some cases, the autonomous vehicle 12 may move directly to the second defined area of operation 70, by moving to that part of the third defined area of operation 72 that overlaps the second defined area of operation 70. The autonomous vehicle 12 may finish its work in the third defined area of operation 72 (which may comprise mowing, or simply moving from one defined area of operation to another), and then move to that part of the third defined area of operation 72 that overlaps the second defined area of operation 70. The autonomous vehicle 12 may then begin its mowing operations in the second defined area of operation 70.

In some cases, it may be advantageous for a large defined area of operation to be broken up into smaller defined areas of operation. In other cases, it may be desirable for a small defined area of operation to be made larger, or to have another defined area of operation positioned nearby or overlapping. The embodiments described herein encompass these possibilities.

In certain embodiments, the pose determination system and method can be employed in an indoor environment. In such instances, accuracy and precision of the pose determination system can be increased by synchronizing the plurality of tags using a single high-stability oscillator. Using a single oscillator can also improve navigation performance, since different tags would normally have different oscillator drifts. It would also be possible to use open standards such as IEEE 1588 PTP-v2 or synchronous Ethernet to disseminate phase and frequency.

Implementation of one or more embodiments may realize one or more advantages, many of which have been mentioned already. The operation of the autonomous vehicle in a defined area of operation may simplify the programming of the autonomous vehicle, in that the autonomous vehicle can be programmed to deal with conditions, hazards, obstacles, and various other eventualities that affect the defined area of operation, rather than the much wider range of eventualities that may affect a broader geographical area. The tags described herein can serve as reference nodes that require little or no external power, and may require little or no infrastructure to communicate with other tags or with any other network. A system that uses UWB can be expected to be reliable at low power and is adaptable to a range of terrains, environments, and weather conditions. For some terrains and environments, UWB coupled with GPS may provide additional reliability and adaptability. Although some of the embodiments have been described in connection with autonomous vehicles, they need not be limited to autonomous vehicles. For example, embodiments may include vehicles that are not autonomous. Further, embodiments may include objects that are not conventionally thought of as vehicles, such as living things. Further, embodiments may be made applicable to a virtual world as well as to a real world. A pose of an object in a virtual world may be determined with respect to a virtual defined area of operation or virtual reference nodes. Virtual world applications may include various forms of gaming in a virtual world.

In the case in which the tags operate on low power and/or have low functionality, processing power can be concentrated in the autonomous vehicle, which may have fewer power constraints than the tags. Security against theft and other mischief can be concentrated in the autonomous vehicle. In an illustrative case, the autonomous vehicle may be locked up when not in use, but it may not be a practical necessity to lock up the tags, which may remain deployed near the defined area of operation.

Embodiments may be applied to run-of-the-mill activities as well as unusual activities. Applications may be civilian as well as military. Uses may be practical as well as artistic (for example, in the illustration in which the autonomous vehicle is a mower, the autonomous vehicle may be programmed to cut grass to produce a pleasing design).

While particular exemplary embodiments have been described along with their various functional components and operational functions, many variations are possible. For example, various functions and components may be implemented in hardware, software, firmware, middleware, or a combination thereof, and utilized in systems, subsystems, components, or subcomponents thereof. In particular, for embodiments implemented in software, elements thereof may be instructions and/or code segments to perform the necessary tasks. The program or code segments may be stored in a machine-readable medium, such as a processor-readable medium or a computer program product, or transmitted by a computer data signal embodied in a carrier wave, or a signal modulated by a carrier, over a transmission medium or communication link. The machine-readable medium or processor-readable medium may include any medium that can store or transfer information in a form readable and executable by a machine, for example a processor, computer, etc.

An embodiment may relate to a computer storage product with a computer-readable medium having computer code thereon for performing various computer-implemented operations. The computer-readable media and computer code may be those specially designed and constructed for the purposes of the disclosed embodiments, or they may be of the kind well known and available to those having skill in the computer software arts. Examples of computer-readable media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and holographic devices; magneto-optical media such as floptical disks; and hardware devices that are specially configured to store and execute program code, such as application-specific integrated circuits ("ASICs"), programmable logic devices ("PLDs"), and ROM and RAM devices including Flash RAM memory storage cards, sticks, and chips. Examples of computer code include machine code, such as produced by a compiler, and files containing higher-level code that are executed by a computer using an interpreter. For example, an embodiment may be implemented using any suitable scripting, markup, and/or programming languages and development tools. Another embodiment may be implemented in hardwired circuitry in place of, or in combination with, machine-executable software instructions.

The illustrative embodiments herein described are not intended to be exhaustive or limiting in nature. Many alterations and modifications are possible in the practice of this invention without departing from the scope thereof.