Title:
METHOD FOR ALIGNING AN AUTONOMOUS MOBILE APPARATUS TO A REFERENCE OBJECT, AN AUTONOMOUS MOBILE APPARATUS, AND A GUIDANCE MODULE THEREOF
Document Type and Number:
WIPO Patent Application WO/2021/015669
Kind Code:
A1
Abstract:
Methods (300), (500), (900) of aligning an autonomous mobile apparatus (10), (40), (70) to a reference object are disclosed. The reference object includes a light source (210) arranged to emit polarized light (140), (1140), (2140). The autonomous mobile apparatus (10), (40), (70) has an image capturing device (110), (1110), (460), (2110), (2460) with a polarizing filter (112), (1112), (2112), (762) for capturing intensity of the polarized light (140), (1140), (2140) and a propulsion device (120), (1120), (2120). Methods (300), (500), (900) include (i) receiving the polarized light (140) by the image capturing device (110), (1110), (460), (2110), (2460) via the polarizing filter (112), (412), (712), (762), and (ii) dynamically adjusting the propulsion device (120), (1120), (2120) to align orientation of the autonomous mobile apparatus (10), (40), (70) to the reference object based on the intensity of the received polarized light (140), (1140), (2140). The autonomous mobile apparatus (10), (40), (70) and a guidance module thereof are also disclosed.

Inventors:
POH HOU SHUN SETH (SG)
Application Number:
PCT/SG2020/050384
Publication Date:
January 28, 2021
Filing Date:
July 07, 2020
Assignee:
NAT UNIV SINGAPORE (SG)
International Classes:
G01S3/14; G05D1/08; B64C13/16; G01S7/499; G02B5/30; G06K9/78
Domestic Patent References:
WO2004041381A2 (2004-05-21)
Foreign References:
CN207281598U (2018-04-27)
CN105786018A (2016-07-20)
EP2905590A1 (2015-08-12)
CN107885223A (2018-04-06)
CN109581456A (2019-04-05)
Other References:
ZHI WEI, CHU JINKUI, LI JINSHAN, WANG YINLONG: "A Novel Attitude Determination System Aided by Polarization Sensor", SENSORS, vol. 18, no. 158, 9 January 2018 (2018-01-09), pages 1-17, XP055786432, DOI: 10.3390/s18010158
PATRUNO C., NITTI M., PETITTI A., STELLA E., D’ORAZIO T.: "A Vision-Based Approach for Unmanned Aerial Vehicle Landing", JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, vol. 95, no. 2, 12 September 2018 (2018-09-12), pages 1-28, XP036838760, DOI: 10.1007/s10846-018-0933-2
Attorney, Agent or Firm:
POH, Chee Kian, Daniel (SG)
Claims:
CLAIMS

1. A method of aligning an autonomous mobile apparatus to a reference object, the reference object including a light source arranged to emit polarized light, and the autonomous mobile apparatus having an image capturing device with a polarizing filter for capturing intensity of the polarized light and a propulsion device, the method comprising

(i) receiving the polarized light by the image capturing device via the polarizing filter; and

(ii) dynamically adjusting the propulsion device to align orientation of the autonomous mobile apparatus to the reference object based on the intensity of the received polarized light.

2. A method according to claim 1, wherein dynamically adjusting the propulsion device in (ii) further comprises

adjusting the propulsion device to rotate the autonomous mobile apparatus in a preset direction; and

if the intensity of the received polarized light is further away from a desired intensity, adjusting the propulsion device to rotate the autonomous mobile apparatus in a direction opposite of the preset direction.

3. A method according to claim 2, further comprising adjusting the propulsion device to rotate the autonomous mobile apparatus in the preset direction if the intensity of the received polarized light is closer to the desired intensity.

4. A method according to claim 2 or 3, wherein the polarized light is linearly polarized, and the polarizing filter is a linear polarizing filter.

5. A method according to claim 4, wherein the desired intensity is achieved when the polarized light is perpendicular to the polarizing filter.

6. A method according to claim 1, wherein the autonomous mobile apparatus includes a secondary image capturing device for capturing an image of the light source, the method further comprising

(iii) identifying a centroid of the light source from at least a captured image of the light source; and

(iv) dynamically adjusting the propulsion device to align a relative position of the autonomous mobile apparatus to the centroid of the light source.

7. A method according to claim 6, wherein the image capturing device and the secondary image capturing device are confocal, the method further comprising identifying pixels in the secondary image capturing device corresponding to the centroid of the light source; and

using pixels in the image capturing device corresponding to the identified pixels in the secondary image capturing device for capturing the intensity of the polarized light.

8. A method according to claim 6 or 7, wherein the secondary image capturing device includes a secondary polarizing filter for capturing a secondary intensity of the polarized light, and wherein dynamically adjusting the propulsion device in (ii) further comprises

receiving the secondary intensity of the polarized light from the secondary image capturing device;

comparing the intensity of the polarized light from the image capturing device with the secondary intensity of the polarized light from the secondary image capturing device; and

adjusting the propulsion device to rotate the autonomous mobile apparatus based on the comparison.

9. A method according to claim 8, wherein a polarizing axis of the polarizing filter and a secondary polarizing axis of the secondary polarizing filter have an angular difference of 90°, and the desired intensity is achieved when the intensity and the secondary intensity are the same.

10. A method according to claim 9, wherein the polarizing axis is at a +45° angle with respect to a longitudinal axis of the autonomous mobile apparatus’ main body, and the secondary polarizing axis is at a -45° angle with respect to the longitudinal axis of the autonomous mobile apparatus’ main body.

11. A method according to claim 9 or 10, wherein the centroid of the light source is identified from a composite image made by overlapping the images of the light source captured by the image capturing device and the secondary image capturing device.

12. A method according to any one of claims 6 to 11, wherein the autonomous mobile apparatus further includes a beamsplitter for splitting the polarized light into a first light beam to be captured by the image capturing device and a second light beam to be captured by the secondary image capturing device.

13. A method according to any preceding claim, wherein the autonomous mobile apparatus rotates about a yaw axis of the autonomous mobile apparatus’ main body.

14. A method according to any preceding claim, wherein the reference object is mobile.

15. A method according to any preceding claim, wherein the autonomous mobile apparatus is a drone, a spacecraft, or an underwater autonomous vehicle.

16. An autonomous mobile apparatus comprising

an image capturing device with a polarizing filter for capturing intensity of polarized light from a light source of a reference object;

a propulsion device for controlling movement of the autonomous mobile apparatus; and

an alignment device communicatively coupled to the image capturing device and the propulsion device, the alignment device configured to dynamically adjust the propulsion device to align the orientation of the autonomous mobile apparatus to the reference object based on the intensity of the polarized light.

17. An autonomous mobile apparatus according to claim 16, wherein the alignment device is further configured to dynamically adjust the propulsion device by

adjusting the propulsion device to rotate the autonomous mobile apparatus in a preset direction; and

if the intensity of the received polarized light is further away from a desired intensity, adjusting the propulsion device to rotate the autonomous mobile apparatus in a direction opposite of the preset direction.

18. An autonomous mobile apparatus according to claim 17, wherein the alignment device is further configured to adjust the propulsion device to rotate the autonomous mobile apparatus in the preset direction if the intensity of the received polarized light is closer to the desired intensity.

19. An autonomous mobile apparatus according to claim 16 or 17, wherein the polarized light is linearly polarized, and the polarizing filter is a linear polarizing filter.

20. An autonomous mobile apparatus according to claim 19, wherein the desired intensity is achieved when the polarized light is perpendicular to the polarizing filter.

21. An autonomous mobile apparatus according to claim 16, further comprising a secondary image capturing device for capturing an image of the light source, wherein the alignment device is further configured to

identify a centroid of the light source from at least a captured image of the light source; and

dynamically adjust the propulsion device to align a position of the autonomous mobile apparatus to the centroid of the light source.

22. An autonomous mobile apparatus according to claim 21, wherein the image capturing device and the secondary image capturing device are confocal, and the alignment device is further configured to

identify pixels in the secondary image capturing device corresponding to the centroid of the light source; and

use pixels in the image capturing device corresponding to the identified pixels in the secondary image capturing device for capturing the intensity of the polarized light.

23. An autonomous mobile apparatus according to claim 21 or 22, wherein the secondary image capturing device includes a secondary polarizing filter for capturing a secondary intensity of the polarized light, and wherein the alignment device is further configured to dynamically adjust the propulsion device by

receiving the secondary intensity of the polarized light from the secondary image capturing device;

comparing the intensity of the polarized light from the image capturing device with the secondary intensity of the polarized light from the secondary image capturing device; and

adjusting the propulsion device to rotate the autonomous mobile apparatus based on the comparison.

24. An autonomous mobile apparatus according to claim 23, wherein a polarizing axis of the polarizing filter and a secondary polarizing axis of the secondary polarizing filter have an angular difference of 90°, and the desired intensity is achieved when the intensity and the secondary intensity are the same.

25. An autonomous mobile apparatus according to claim 24, wherein the polarizing axis is at a +45° angle with respect to a longitudinal axis of the autonomous mobile apparatus’ main body, and the secondary polarizing axis is at a -45° angle with respect to the longitudinal axis of the autonomous mobile apparatus’ main body.

26. An autonomous mobile apparatus according to claim 24 or 25, wherein the centroid of the light source is identified from a composite image made by overlapping the images of the light source captured by the image capturing device and the secondary image capturing device.

27. An autonomous mobile apparatus according to any one of claims 21 to 26, wherein the autonomous mobile apparatus further includes a beamsplitter for splitting the polarized light into two light beams to be captured by the image capturing device and the secondary image capturing device respectively.

28. An autonomous mobile apparatus according to any one of claims 16 to 27, wherein the autonomous mobile apparatus rotates about a yaw axis of the autonomous mobile apparatus’ body.

29. An autonomous mobile apparatus according to any one of claims 16 to 28, wherein the reference object is mobile.

30. An autonomous mobile apparatus according to any one of claims 16 to 29, wherein the autonomous mobile apparatus is a drone, a spacecraft, or an underwater autonomous vehicle.

31. A guidance module to be retrofitted to an autonomous mobile apparatus having a propulsion device, the guidance module comprising

an image capturing device having a polarized filter for receiving polarized light from a light source of a reference object; and

an alignment device arranged to be communicatively coupled to the image capturing device and the propulsion device, the alignment device configured to dynamically adjust the propulsion device to align the orientation of the autonomous mobile apparatus to the reference object based on the intensity of the polarized light.

Description:
METHOD FOR ALIGNING AN AUTONOMOUS MOBILE APPARATUS TO A

REFERENCE OBJECT, AN AUTONOMOUS MOBILE APPARATUS, AND A

GUIDANCE MODULE THEREOF

TECHNICAL FIELD

The present disclosure relates to aligning an autonomous mobile apparatus to a reference object. More specifically, the present disclosure relates to aligning relative positions and orientation of the autonomous mobile apparatus to the reference object.

BACKGROUND

Alignment of the position and orientation of an autonomous mobile apparatus to a reference object that is separated by a distance in three-dimensional space without an external reference, i.e. where reference features are located only on the two objects, is an important task in many applications such as drone landing systems and spacecraft docking systems. The precise alignment of the position and orientation of the drone or spacecraft to the landing or docking platform allows the drone or spacecraft to land or dock safely, and prevents damage to the drone or spacecraft due to misalignment with the landing or docking platform.

Often, techniques used for such alignment tasks have a small range of working distances or are unable to effect the alignment in both position and orientation. This is primarily due to the difficulty at large distances in resolving reference features with dimensions that are optimized for the accurate determination of the orientation of an object at short distances.

Applications where two objects need to approach each other from a large distance and are required to be aligned in both position and orientation at zero distance are particularly susceptible to this difficulty due to the large range of working distances. Some examples include aerial drones landing on small targets, wheeled robots moving back to their charging bases, and spacecraft docking at a space station. The problem is further compounded when the autonomous mobile apparatus has low maneuverability or when there are safety concerns in performing those alignment maneuvers close to the reference object. This calls for a system capable of aligning the relative position and orientation of the autonomous mobile apparatus to the reference object over a large range of working distances.

Therefore, it is desirable to provide a way to align an autonomous mobile apparatus to a reference object in order to address the problems of the prior art and/or to provide the public with a useful choice.

SUMMARY

Various aspects of the present disclosure are described here. It is intended that a general overview of the present disclosure is provided and this, by no means, delineates the scope of the invention.

According to a first aspect, there is provided a method of aligning an autonomous mobile apparatus to a reference object. The reference object includes a light source arranged to emit polarized light. The autonomous mobile apparatus has an image capturing device with a polarizing filter for capturing intensity of the polarized light and a propulsion device. The method includes (i) receiving the polarized light by the image capturing device via the polarizing filter, and (ii) dynamically adjusting the propulsion device to align orientation of the autonomous mobile apparatus to the reference object based on the intensity of the received polarized light.

Compared to conventional means, the described embodiment is less computationally intensive in extracting information on the relative orientation of the autonomous mobile apparatus to the reference object from the intensity of the received polarized light. Furthermore, the polarized light can be transmitted over large distances while retaining its polarization, which allows the method to work over larger distances compared to conventional means. As a result, the described embodiment provides a way to align the autonomous mobile apparatus to the reference object more accurately or more precisely over a large range of working distances since polarized light is used.

Preferably, dynamically adjusting the propulsion device may include adjusting the propulsion device to rotate the autonomous mobile apparatus in a preset direction, and if the intensity of the received polarized light is further away from a desired intensity, adjusting the propulsion device to rotate the autonomous mobile apparatus in a direction opposite of the preset direction.
On the other hand, if the intensity of the received polarized light is closer to the desired intensity, the method may include adjusting the propulsion device to rotate the autonomous mobile apparatus in the preset direction.

Preferably, the polarized light may be linearly polarized, and the polarizing filter may be a linear polarizing filter.

Preferably, the desired intensity may be achieved when the polarized light is perpendicular to the polarizing filter. Advantageously, this allows aligning of the orientation of the autonomous mobile vehicle to the reference object to be performed more precisely.

Preferably, the autonomous mobile apparatus may include a secondary image capturing device for capturing an image of the light source, and the method may further include (iii) identifying a centroid of the light source from at least a captured image of the light source, and (iv) dynamically adjusting the propulsion device to align a position of the autonomous mobile apparatus to the centroid of the light source.

Preferably, the image capturing device and the secondary image capturing device may be confocal, and the method may further include identifying pixels in the secondary image capturing device corresponding to the centroid of the light source, and using pixels in the image capturing device corresponding to the identified pixels in the secondary image capturing device for capturing the intensity of the polarized light.

Preferably, the secondary image capturing device may include a secondary polarizing filter for capturing a secondary intensity of the polarized light, and dynamically adjusting the propulsion device in (ii) may further includes receiving the secondary intensity of the polarized light from the secondary image capturing device, comparing the intensity of the polarized light from the image capturing device with the secondary intensity of the polarized light from the secondary image capturing device, and adjusting the propulsion device to rotate the autonomous mobile apparatus based on the comparison.

Preferably, a polarizing axis of the polarizing filter and a secondary polarizing axis of the secondary polarizing filter may have an angular difference of 90°, and the desired intensity may be achieved when the intensity and the secondary intensity are the same.

Preferably, the polarizing axis may be at a +45° angle with respect to a vertical axis, and the secondary polarizing axis may be at a -45° angle with respect to the vertical axis.

Preferably, the centroid of the light source may be identified from a composite image made by overlapping the images of the light source captured by the image capturing device and the secondary image capturing device.

Preferably, the autonomous mobile apparatus may further include a beamsplitter.

Preferably, the autonomous mobile apparatus may rotate about the yaw axis.

Preferably, the reference object may be mobile.

Preferably, the autonomous mobile apparatus may be a drone, a spacecraft, an underwater autonomous vehicle, or a wheeled robot such as a robotic vacuum cleaner.

According to a second aspect, there is provided an autonomous mobile apparatus including an image capturing device with a polarizing filter for capturing intensity of polarized light from a light source of a reference object, a propulsion device for controlling movement of the autonomous mobile apparatus, and an alignment device communicatively coupled to the image capturing device and the propulsion device. The alignment device is configured to dynamically adjust the propulsion device to align the orientation of the autonomous mobile apparatus to the reference object based on the intensity of the polarized light.

Preferably, the autonomous mobile apparatus may further include a secondary image capturing device for capturing an image of the light source. The alignment device may further be configured to identify a centroid of the light source from at least a captured image of the light source, and dynamically adjust the propulsion device to align a position of the autonomous mobile apparatus to the centroid of the light source.

Preferably, the secondary image capturing device may include a secondary polarizing filter for capturing a secondary intensity of the polarized light. The alignment device may be further configured to dynamically adjust the propulsion device by receiving the secondary intensity of the polarized light from the secondary image capturing device, comparing the intensity of the polarized light from the image capturing device with the secondary intensity of the polarized light from the secondary image capturing device; and adjusting the propulsion device to rotate the autonomous mobile apparatus based on the comparison.
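By way of an illustrative sketch (not part of the disclosure), the comparison-based adjustment above can be modelled with Malus’ law, assuming the preferred arrangement of polarizing axes at +45° and -45° to the vehicle’s longitudinal axis. The function names are hypothetical; the sign of the difference between the two readings selects the rotation direction, and equal readings indicate alignment.

```python
import math

def filter_readings(i0, misalignment_deg):
    """Intensities behind the +45 deg and -45 deg polarizing filters, from
    Malus' law, for a given misalignment of the vehicle's longitudinal axis
    relative to the beacon's polarization axis (illustrative model only)."""
    a = math.radians(misalignment_deg)
    i_plus = i0 * math.cos(a - math.radians(45.0)) ** 2   # primary filter
    i_minus = i0 * math.cos(a + math.radians(45.0)) ** 2  # secondary filter
    return i_plus, i_minus

def rotation_command(i_plus, i_minus, gain=1.0):
    """Signed yaw command: zero when the two readings are equal (aligned);
    the sign selects the rotation direction."""
    return gain * (i_plus - i_minus)
```

At zero misalignment both filters transmit half the incident intensity, so the command vanishes; any misalignment unbalances the two readings with a sign that indicates which way to rotate.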

According to a third aspect, there is provided a guidance module to be retrofitted to an autonomous mobile apparatus having a propulsion device. The guidance module includes an image capturing device having a polarized filter for receiving polarized light from a light source of a reference object, and an alignment device arranged to be communicatively coupled to the image capturing device and the propulsion device, the alignment device configured to dynamically adjust the propulsion device to align the orientation of the autonomous mobile apparatus to the reference object based on the intensity of the polarized light.

BRIEF DESCRIPTION OF THE DRAWINGS

Exemplary embodiments will be described with reference to the following figures in which:

Figure 1 shows an autonomous mobile apparatus in the form of an unmanned aerial vehicle homing in on a reference object in the form of a landing station having an optical beacon according to a first embodiment;

Figure 2 is an overhead view of the optical beacon illustrated in Figure 1;

Figure 3 is a block diagram of a system architecture of the unmanned aerial vehicle illustrated in Figure 1 according to the first embodiment;

Figure 4A is an overhead view of the optical beacon illustrated in Figure 2;

Figure 4B is a line graph showing an intensity of the polarized light for different angular displacements of the unmanned aerial vehicle of Figure 1 from the optical beacon illustrated in Figure 4A;

Figure 5 is a flow diagram of a method performed by the unmanned aerial vehicle illustrated in Figure 3 according to the first embodiment;

Figure 6 is a block diagram of a system architecture of a second unmanned aerial vehicle according to a second embodiment;

Figure 7 is a flow diagram of a method performed by the second unmanned aerial vehicle illustrated in Figure 6 according to the second embodiment;

Figure 8A illustrates an image captured by a primary image capturing device of the second unmanned aerial vehicle illustrated in Figure 6;

Figure 8B is a viewpoint from a secondary image capturing device of the second unmanned aerial vehicle illustrated in Figure 6;

Figure 9 is a block diagram of a system architecture of a third unmanned aerial vehicle according to a third embodiment;

Figure 10A shows cross-sectional views of first and second polarizing filters illustrated in Figure 9;

Figure 10B is a line graph showing intensities of the polarized light captured by the primary image capturing device and the secondary image capturing device for different angular displacements of the unmanned aerial vehicle illustrated in Figure 9 from the optical beacon illustrated in Figure 4A;

Figure 11 is a flow diagram of a method performed by the system architecture illustrated in Figure 9 according to the third embodiment.

DETAILED DESCRIPTION

The following description contains specific examples for illustrative purposes. The person skilled in the art would appreciate that variations and alterations to the specific examples are possible and within the scope of the present disclosure. The figures and the following description of the particular embodiments should not take away from the generality of the preceding summary.

Figure 1 illustrates an autonomous mobile apparatus 10 homing in on a reference object 20. In this embodiment, the autonomous mobile apparatus 10 is an unmanned aerial vehicle 100 (e.g. a drone) and the reference object 20 is a landing station 200, which is movable as the landing station 200 is part of a larger platform (not shown) that is deployed out in the open sea. The landing station 200 includes a light source in the form of an optical beacon 210 that emits polarized light 140 for guiding relative positions and orientation of the unmanned aerial vehicle 100 to land on the landing station 200. The landing station 200 also includes a landing pad 240 for the unmanned aerial vehicle 100 to land on.

Figure 2 illustrates an overhead view of the optical beacon 210, which emits polarized light 140 that is linearly polarized along a polarization axis 220 lying in a horizontal plane.

Figure 3 illustrates a system architecture of the unmanned aerial vehicle 100 as an example of the autonomous mobile apparatus 10. In this embodiment, the unmanned aerial vehicle 100 includes a main body 100a, and an image capturing device in the form of a camera 110 attached to the main body 100a. The camera 110 is arranged to capture the polarized light 140 emitted by the optical beacon 210. The camera 110 includes a polarizing filter 112 which alters the intensity of the polarized light 140 captured by the camera 110.
The unmanned aerial vehicle 100 also includes a propulsion device 120 that is attached to the main body 100a. In this embodiment, the propulsion device 120 includes four sets of propellers 122 driven by respective motors 124 to allow the unmanned aerial vehicle 100 to take flight and to rotate the unmanned aerial vehicle 100 about its yaw axis 100b (see Figure 1). Specifically, the yaw axis 100b is a vertical axis that runs through the middle of the main body 100a when the unmanned aerial vehicle 100 is upright. The four sets of propellers 122 are positioned on the main body 100a such that each propeller 122 is situated at a corner of the main body 100a. Further, the unmanned aerial vehicle 100 includes a power source 160 (e.g. batteries) for powering the various components such as the propulsion device 120.

By correspondingly increasing/decreasing the speed of diametrically opposite propellers 122, the unmanned aerial vehicle 100 performs a yaw (axis) rotation.

The unmanned aerial vehicle 100 also includes an alignment device 130 which comprises a controller 131, a beacon identification module 132 and an orientation module 134. The controller 131 is configured to control the operation of the unmanned aerial vehicle 100 based on a set of instructions stored in a memory. The controller 131 is arranged to be communicatively coupled to the camera 110 for receiving information about the optical beacon 210 from the camera 110. In particular, and for the purpose of aligning the unmanned aerial vehicle 100 to the optical beacon 210, images captured by the camera 110 are transmitted to the controller 131 and information about the intensity of the polarized light 140 emitted by the optical beacon 210 is determined from the captured images. The controller 131 is also arranged to be communicatively coupled to the propulsion device 120 to control its flight through adjusting the speed and direction of the propellers 122. The controller 131 transmits the images captured by the camera 110 to the beacon identification module 132, which is configured to identify the optical beacon 210 based on the images. Once the optical beacon 210 is identified, the beacon identification module 132 notifies the controller 131, which then informs the orientation module 134. The orientation module 134 includes an algorithm to determine the direction in which to rotate the unmanned aerial vehicle 100 based on the intensity of the polarized light 140 determined from the captured images.
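The differential-speed yaw maneuver described above can be sketched as a simple motor mixer. This is an illustrative sketch only, not taken from the disclosure; the function name and the motor ordering are assumptions.

```python
def yaw_mix(base_speed, yaw_cmd):
    """Hypothetical motor mixer for a four-propeller vehicle.

    Motors 0 and 2 sit on one diagonal and spin in one sense; motors 1 and 3
    sit on the other diagonal and spin in the opposite sense. Speeding up one
    diagonal while slowing the other produces a net reaction torque about the
    yaw axis while leaving the total thrust unchanged.
    """
    return [
        base_speed + yaw_cmd,  # motor 0 (first diagonal)
        base_speed - yaw_cmd,  # motor 1 (second diagonal)
        base_speed + yaw_cmd,  # motor 2 (first diagonal)
        base_speed - yaw_cmd,  # motor 3 (second diagonal)
    ]
```

A positive `yaw_cmd` rotates the vehicle one way about the yaw axis 100b, a negative one the other way, with the sum of the four speeds (and hence the lift) unchanged.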

The algorithm used by the orientation module 134 to determine the direction of rotation is based on Malus’ Law, whereby the intensity of linearly polarized light transmitted through a linear polarizing filter depends on the relative angular displacement between the light’s polarization axis and the filter’s optical axis. In other words, the intensity of the polarized light 140 captured by the camera 110 depends on the angular displacement of the unmanned aerial vehicle 100 from the optical beacon 210. This is explained in more detail with reference to Figures 4A and 4B.
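Malus’ law itself is compact enough to state in a few lines. The sketch below is illustrative (the function name is an assumption), with the angle measured between the light’s polarization axis and the filter’s optical axis.

```python
import math

def malus_intensity(i0, angle_deg):
    """Malus' law: transmitted intensity of linearly polarized light of
    incident intensity i0 through a linear polarizer whose optical axis is
    angle_deg away from the light's polarization axis."""
    return i0 * math.cos(math.radians(angle_deg)) ** 2
```

At 0° the full intensity passes; at 90° (crossed axes, as in Figure 4A) the transmitted intensity is zero, which is the condition the orientation module 134 steers toward.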

Once the direction is determined by the orientation module 134, the orientation module 134 informs the controller 131, which then instructs the propulsion device 120 to rotate the unmanned aerial vehicle 100 in the determined direction to align the orientation of the unmanned aerial vehicle 100 to the optical beacon 210.

In this embodiment, the communication channels 150 between the controller 131 and the camera 110 and the propulsion device 120 are shown as one-way wired connections.

Figure 4A illustrates the optical beacon 210 of Figure 2 with the polarization axis 220, as well as an optical axis 230 of the polarizing filter 112, which is aligned to a vertical plane. The optical axis 230 of the polarizing filter 112 is depicted on the optical beacon 210 merely to demonstrate the angular displacement between the polarization axis 220 of the polarized light 140 and the optical axis 230 of the polarizing filter 112.

Figure 4A depicts a desired angular displacement between the polarization axis 220 and the optical axis 230. The polarizing filter 112 of the camera 110 is set up on the unmanned aerial vehicle 100 such that the polarization axis 220 and the optical axis 230 are substantially perpendicular to each other when the orientation of the unmanned aerial vehicle 100 is aligned to the optical beacon 210. Since the intensity of the polarized light 140 captured by the camera 110 is zero (0 W/m²) when the polarization axis 220 and the optical axis 230 are perpendicular to each other, by rotating the unmanned aerial vehicle 100 in a direction that reduces the captured intensity of the polarized light 140 until it reduces to zero, the orientation module 134 is able to align the orientation of the unmanned aerial vehicle 100 to the optical beacon 210 (and the landing station 200). By stopping rotation of the unmanned aerial vehicle 100 as soon as the intensity of the polarized light 140 reduces to zero, the orientation module 134 is able to align the orientation of the unmanned aerial vehicle 100 to the optical beacon 210 with high precision. For comparison, if instead the orientation of the unmanned aerial vehicle 100 were aligned to the optical beacon 210 when the polarization axis 220 and the optical axis 230 are parallel to each other, i.e. when the intensity of the polarized light 140 increases to a maximum, the orientation module 134 could only determine that the intensity of the polarized light 140 had reached the maximum once the intensity starts to reduce from the maximum. This results in over-rotation of the unmanned aerial vehicle 100 relative to the optical beacon 210.

Figure 4B is a line graph 250 showing the expected intensity of the polarized light 140 for different angular displacements of the unmanned aerial vehicle 100 from the optical beacon 210. Notably, the orientation of the unmanned aerial vehicle 100 is aligned to the optical beacon 210 when the angular displacement of the unmanned aerial vehicle 100 from the optical beacon 210 is at 0° or ±180°, at which points the intensity of the polarized light 140 captured by the camera 110 is zero. For the present embodiment, the orientation module 134 may not distinguish between the front and back of the unmanned aerial vehicle 100. As such, at any given intensity, the unmanned aerial vehicle 100 rotates either in the clockwise direction or the counterclockwise direction to achieve the desired orientation. However, except when the intensity is at maximum, one of the directions is always shorter than the other. For example, when the intensity is at 0.4, it is quicker to rotate in the direction indicated by arrow 252 than in the opposite direction indicated by arrow 254. Furthermore, since the landing station 200 is mobile, the orientation of the polarization axis 220 of the optical beacon 210 shifts as well. Consequently, the orientation of the unmanned aerial vehicle 100 is frequently misaligned from the optical beacon 210. It is therefore necessary for the unmanned aerial vehicle 100 to keep making dynamic adjustments to its orientation in response to changes in the orientation of the optical beacon 210 until the unmanned aerial vehicle 100 has landed on the landing pad 240. The method by which the orientation module 134 aligns the orientation of the unmanned aerial vehicle 100 is explained next with reference to Figure 5, which illustrates an exemplary method 300 for aligning the unmanned aerial vehicle 100 to the landing station 200 having the optical beacon 210.
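The intensity profile of line graph 250 and the choice of the shorter rotation direction can be sketched as follows, assuming the captured intensity obeys Malus's law with the filter perpendicular to the polarization axis 220 at alignment. The function names and the sign convention for the rotation direction are illustrative, not part of the disclosure:

```python
import math

def expected_intensity(angle_deg, i_max=1.0):
    """Expected intensity behind the polarizing filter (Malus's law).
    The filter is mounted perpendicular to the beacon's polarization
    axis at alignment, so the transmitted intensity follows sin^2 and
    is zero at angular displacements of 0 deg and +/-180 deg."""
    return i_max * math.sin(math.radians(angle_deg)) ** 2

def shorter_rotation(angle_deg):
    """Sign of the shorter rotation back to zero intensity:
    -1 and +1 denote the two rotation senses (sketch convention).
    The caller is expected to check alignment separately."""
    a = (angle_deg + 180.0) % 360.0 - 180.0  # wrap to (-180, 180]
    if abs(a) <= 90.0:
        return -1 if a > 0 else 1   # rotate back towards 0 deg
    return 1 if a > 0 else -1       # rotate onwards towards +/-180 deg
```

For example, at an angular displacement of 45° the expected intensity is 0.5, and the shorter route back to zero intensity is the rotation that reduces the displacement towards 0°.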
The method 300 can be divided into two stages: a beacon identification stage 310 and an orientation stage 320.

At step 312 of the beacon identification stage 310, the beacon identification module 132 receives the images captured by the camera 110 from the controller 131. At step 314, the beacon identification module 132 processes the visual information and, in particular, looks for the optical beacon 210. The camera 110 further includes an optical filter arranged to only allow light having a wavelength corresponding to the polarized light 140 to pass through. This enables the beacon identification module 132 to differentiate the optical beacon 210 from other light sources. If the optical beacon 210 is not identified in the images captured by the camera 110, the method 300 goes back to step 312. This process is repeated until the optical beacon 210 is identified at step 314. The beacon identification stage 310 ends when the beacon is identified by the beacon identification module 132, which sends a signal to the controller 131 to inform the orientation module 134. At this point, the orientation stage 320 is initiated by the orientation module 134.

At step 322, the orientation module 134 first informs the controller 131 to direct the propulsion device 120 to rotate the unmanned aerial vehicle 100 in a preset direction. By making a rotation, the angular displacement between the polarization axis 220 and the optical axis 230 changes, and hence the intensity of the polarized light 140 changes as well. At step 324, the intensity of the polarized light 140 is captured by the camera 110 again and transmitted by the controller 131 to the orientation module 134. The orientation module 134 determines if the intensity (or brightness) of the polarized light has increased or decreased. If the intensity of the light decreases, the orientation module 134 determines that the unmanned aerial vehicle 100 is rotating in the right direction (the right direction being the shorter direction to reach zero intensity), and informs the controller 131 to direct the propulsion device 120 to continue rotating the unmanned aerial vehicle 100 in the preset direction, thereby returning the method to step 322.

On the other hand, if the intensity of the light increases, the orientation module 134 determines that the unmanned aerial vehicle 100 is rotating in the wrong direction (the longer direction to reach zero intensity). Then, at step 326, the orientation module 134 flips the direction of rotation and informs the controller 131 to direct the propulsion device 120 to rotate the unmanned aerial vehicle 100 in the direction opposite to the preset direction.
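The feedback loop of steps 322 to 326 can be sketched as follows. Here, read_intensity() and rotate() are hypothetical stand-ins for the camera and propulsion-device interfaces, which the disclosure does not specify at this level of detail:

```python
def orientation_loop(read_intensity, rotate, threshold=1e-3, max_steps=1000):
    """Sketch of the orientation stage (steps 322-326): rotate in a
    preset direction and flip the direction whenever the captured
    intensity increases instead of decreasing.

    read_intensity() returns the current captured intensity;
    rotate(direction) commands one incremental rotation."""
    direction = 1                         # preset direction (step 322)
    previous = read_intensity()
    for _ in range(max_steps):
        rotate(direction)
        current = read_intensity()        # step 324: capture again
        if current <= threshold:          # aligned: intensity ~ zero
            return True
        if current > previous:            # wrong (longer) direction
            direction = -direction        # step 326: flip rotation
        previous = current
    return False                          # gave up within max_steps
```

In practice the loop would run for as long as the vehicle is airborne rather than for a fixed number of steps; the max_steps bound is only there to keep the sketch terminating.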

As long as the unmanned aerial vehicle 100 is still airborne, the method 300 continues to adjust the propulsion device 120 to align the orientation of the unmanned aerial vehicle 100 to the optical beacon 210. This allows the orientation module 134 to make continuous adjustments to the orientation of the unmanned aerial vehicle 100 in response to changes in the orientation of the landing station 200 as the unmanned aerial vehicle 100 is homing in on the landing station 200, thus allowing the unmanned aerial vehicle 100 to land more accurately and more precisely on the landing station 200.

At step 324, if the orientation module 134 detects that the intensity of the polarized light 140 has increased instead of decreased despite having rotated the unmanned aerial vehicle 100 in the previously determined direction, the orientation module 134 simply instructs the controller 131 to control the propulsion device 120 to change the direction of rotation according to step 326 such that the unmanned aerial vehicle 100 rotates in the opposite direction. In this way, the unmanned aerial vehicle 100 can quickly react to abrupt changes in the orientation of the unmanned aerial vehicle 100 relative to the optical beacon 210 as the unmanned aerial vehicle 100 is homing in on the landing station 200. In particular, the unmanned aerial vehicle 100 is also able to react quickly to re-orientate itself when its orientation is disturbed by an external force, such as strong winds, which may cause the unmanned aerial vehicle 100 to over-rotate or become completely misaligned. This minimizes the risk of the unmanned aerial vehicle 100 being damaged due to a sudden misalignment to the landing station 200 while the unmanned aerial vehicle 100 is landing.

The first embodiment should not be construed as limitative. For example, the autonomous mobile apparatus 10 is not limited to the unmanned aerial vehicle 100. For example, the autonomous mobile apparatus 10 may be a space vehicle (e.g. a spacecraft). The autonomous mobile apparatus may also be an unmanned underwater vehicle or an unmanned land vehicle (e.g. a wheeled robot such as a robotic vacuum cleaner). The landing station 200 is described as being mobile or movable since the landing station is deployed out at sea. Needless to say, this may not be the case, and the reference object 20 may be deployed on land or be stationary with respect to the position of the autonomous mobile apparatus 10. The landing station 200 may also be deployed on a roving vehicle on land. Depending on the autonomous mobile apparatus 10, the reference object 20 may be a space docking station, an underwater docking station, or another station equipped to enable docking of the autonomous mobile apparatus 10. Furthermore, while the alignment device 130 is described as including the controller 131 in the first embodiment, this is not necessary. In another embodiment, the controller 131 may be a separate component from the alignment device 130.

The controller 131 may instruct the propulsion device 120 to adjust the unmanned aerial vehicle 100 such that the camera 110 captures images in a specific direction. The unmanned aerial vehicle 100 may further include a gimbal attached to the camera 110. The gimbal enables the camera 110 to rotate, thereby increasing the field of view of the camera 110.

To illustrate the scope of the present disclosure, further embodiments are described next.

Figure 6 illustrates a system architecture of an exemplary autonomous mobile apparatus 40 according to a second embodiment. Similarly, in the second embodiment, the autonomous mobile apparatus 40 is an unmanned aerial vehicle 400, and like components in this embodiment use the same reference numerals with an addition of 1000. As illustrated in Figure 6, the unmanned aerial vehicle 400 of the second embodiment includes a main body, a propulsion device 1120 having four sets of propellers 1122 driven by respective motors 1124, an alignment device 1130 having a controller 1131, a beacon identification module 1132 and an orientation module 1134, a power source 1160, and a first camera 1110 having a polarizing filter 1112. Unlike the first embodiment, the unmanned aerial vehicle 400 of the second embodiment further includes a secondary image capturing device in the form of a second camera 460 without a polarizing filter. Additionally, the alignment device 1130 in the second embodiment further includes a positioning module 436 for aligning the [x,y] position of the unmanned aerial vehicle 400 to the optical beacon 210.

In the second embodiment, the first and second cameras 1110, 460 are set up to be confocal. A beamsplitter 420 is implemented to split the polarized light 1140 into two light beams 1140a, 1140b. The first light beam 1140a is received by the first camera via the polarizing filter 1112, where the intensity of the first light beam 1140a is altered by the polarizing filter 1112. The second light beam 1140b is transmitted to the second camera 460. The intensity of the second light beam 1140b is unaffected since there is no polarizing filter in front of the second camera 460. Except for the intensity of the polarized light 1140, which may differ depending on the orientation of the unmanned aerial vehicle 400 relative to the optical beacon 210, the first and second cameras 1110, 460 receive similar images of the optical beacon 210. Just like the first embodiment, the controller 1131 is arranged to be communicatively coupled to the first and second cameras 1110, 460 and the propulsion device 1120. The communication channel 1150 between the first and second cameras 1110, 460 and the controller 1131, and between the propulsion device 1120 and the controller 1131, is a one-way wired communication.

The first and second cameras 1110, 460 transmit information about the optical beacon 210 to the alignment device 1130 for further processing. In particular, the alignment device 1130 obtains information on the intensity of the polarized light 1140 from the first camera 1110, and obtains information about the relative [x,y] position of the optical beacon 210 to the unmanned aerial vehicle 400 from the second camera 460. The operation of the beacon identification module 1132, the orientation module 1134 and the positioning module 436 is described next with reference to Figures 7, 8A and 8B.

Similar to the first embodiment, the second embodiment should not be construed as being limitative. For example, while a beamsplitter is implemented in the second embodiment, the beamsplitter may be replaced with other optical devices that essentially achieve the same results as the beamsplitter. For example, the first and second cameras 1110, 460 may be implemented with focusing lenses such that similar images of the optical beacon 210 are received by the cameras 1110, 460.

Figure 7 illustrates a flow diagram of an exemplary method 500 for aligning the unmanned aerial vehicle 400 to the landing station 200 having the optical beacon 210 according to the second embodiment. Figure 8A illustrates an exemplary image of the optical beacon 210 from a viewpoint of the first camera 1110. Figure 8B illustrates an exemplary image of the optical beacon from a viewpoint of the second camera 460.

The exemplary method 500 incorporates all the steps of the exemplary method 300. Similarly, like steps in the second embodiment use the same reference numerals as the first embodiment but with an addition of 1000. In particular, like the first embodiment, the method 500 includes a beacon identification stage 1310, which is operated by the beacon identification module 1132, and an orientation stage 1320, which is operated by the orientation module 1134. Unlike the first embodiment, the method 500 further includes a positioning stage 530, which is operated by the positioning module 436. The two stages 1320, 530 run in tandem to align the relative positions and orientation of the unmanned aerial vehicle 400 to the landing station 200. The beacon identification stage 1310 and the orientation stage 1320 are not repeated for the sake of brevity. For the following section, it is taken that the optical beacon has been identified by the beacon identification module 1132, and the method 500 is in the positioning stage 530 (and the orientation stage 1320). At step 532 of the positioning stage 530, the positioning module 436 identifies a centroid 212 of the optical beacon 210 from an image 660 of the optical beacon 210 captured by the second camera 460 (this is illustrated in Figure 8B). The positioning module 436 determines if the [x,y] position of the unmanned aerial vehicle 400 is aligned to the centroid 212 of the optical beacon 210. The unmanned aerial vehicle 400 is aligned when the cross 214 is centered on the centroid 212, as shown in Figure 8B.

Furthermore, the centroid 212 of the optical beacon 210 gives the relative displacement in [x,y] of the unmanned aerial vehicle 400 with respect to the optical beacon 210. If the positioning module 436 determines that the cross 214 is not centered on the centroid 212, then at step 534, the positioning module 436 informs the controller 1131 to direct the propulsion device 1120 to move the unmanned aerial vehicle 400 towards the optical beacon based on the relative displacement in [x,y] determined in step 532.
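Step 532 and the displacement computation of step 534 can be sketched as follows, assuming a grayscale image in which the beacon is the dominant bright region. The intensity-weighted centroid is one reasonable implementation; the disclosure does not fix the exact algorithm, and the function names are illustrative:

```python
import numpy as np

def beacon_centroid(image):
    """Intensity-weighted centroid of the beacon in a grayscale image
    (a sketch of step 532). Returns (x, y) in pixel coordinates, or
    None if the frame carries no light at all."""
    img = np.asarray(image, dtype=float)
    total = img.sum()
    if total == 0:
        return None
    ys, xs = np.indices(img.shape)          # row and column indices
    return (xs * img).sum() / total, (ys * img).sum() / total

def displacement_from_center(image):
    """[x, y] displacement of the beacon centroid from the image
    center (the cross 214), used to steer the vehicle at step 534."""
    c = beacon_centroid(image)
    if c is None:
        return None
    h, w = np.asarray(image).shape
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0   # image center coordinates
    return c[0] - cx, c[1] - cy
```

With a single bright pixel at column 3, row 1 of a 5×5 frame, the centroid is (3, 1) and the displacement from the image center is (1, -1).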

The positioning module 436 continuously checks that the [x,y] position of the unmanned aerial vehicle 400 is aligned to the centroid 212 of the optical beacon 210 as long as the unmanned aerial vehicle 400 is active. This allows the positioning module 436 to make continuous adjustments to the relative [x,y] positions of the unmanned aerial vehicle 400 as it is homing in on the landing station 200. Similar to the first embodiment, this is particularly useful since the unmanned aerial vehicle 400 is operated in an environment where the relative [x,y] positions of the unmanned aerial vehicle 400 and/or the landing station 200 are shifting due to external forces (e.g. wind, waves) acting on the unmanned aerial vehicle 400 and the landing station 200. Advantageously, the positioning module 436 can correct any misalignment in the position of the unmanned aerial vehicle 400 quickly, thus enabling the unmanned aerial vehicle 400 to land more accurately and more precisely in a dynamic environment.

Sometimes, ambient light may contribute to optical interference in the image 660 captured by the camera 460. To counter this, an expected size of the image of the optical beacon 210 can be calculated from the field of view of the camera 460 and the distance between the unmanned aerial vehicle 400 and the optical beacon 210. Both are known factors or may be determined readily. Only an area 216 (refer to Figure 8B) within the expected size of the image (in terms of number of camera pixels) may be used to determine the centroid 212 at step 532. When the determination of the centroid 212 is not limited to the area 216, a larger area is used, and ambient light within that larger area contributes to optical interference in the image 660 captured by the camera 460. Advantageously, a more accurate determination of the centroid 212 is achieved when the calculation is limited to the area 216.
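The expected-size calculation can be sketched with a simple pinhole-camera model. The beacon diameter, field-of-view and resolution values in the usage example are illustrative, not taken from the disclosure:

```python
import math

def expected_beacon_pixels(beacon_diameter_m, distance_m,
                           fov_deg, image_width_px):
    """Expected diameter of the beacon image in pixels, estimated
    from the camera's horizontal field of view and the distance to
    the beacon (a simple pinhole-camera approximation)."""
    # Width of the scene visible at the beacon's distance.
    scene_width_m = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
    # Fraction of the scene occupied by the beacon, scaled to pixels.
    return beacon_diameter_m / scene_width_m * image_width_px
```

For instance, a 0.5 m beacon seen from 10 m through a 90° field of view on a 1000-pixel-wide sensor spans about 25 pixels, which bounds the area 216 used for the centroid calculation.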

Furthermore, since the intensity of the image captured by the second camera 460 does not change depending on the orientation of the unmanned aerial vehicle 400 relative to the optical beacon 210, the positioning module 436 can readily identify the centroid 212 of the optical beacon 210 regardless of the orientation of the unmanned aerial vehicle 400. Figures 8A and 8B show the respective viewpoints of the first camera 1110 and the second camera 460 when the unmanned aerial vehicle 400 is in the desired orientation, i.e. when the intensity of the polarized light 1140 captured by the first camera 1110 is zero. As can be seen in Figure 8A, the optical beacon 210 is no longer visible in the image 610 from the viewpoint of the camera 1110.

Since the first and second cameras 1110, 460 have similar viewpoints, it is possible, when determining the intensity of the polarized light 1140 at step 324, to select only the pixels on the camera 1110 that correspond to those pixels in the camera 460 (i.e. the camera pixels within the area 216) used to determine the centroid 212 of the optical beacon 210. Advantageously, this also limits the effect of ambient light interfering with the determination of the intensity of the polarized light 1140.

Figure 9 illustrates a system architecture of an exemplary autonomous mobile apparatus 70 according to a third embodiment. In the third embodiment, the autonomous mobile apparatus 70 is an unmanned aerial vehicle 700, and like components in this embodiment use the same reference numerals with an addition of 2000. As illustrated in Figure 9, the unmanned aerial vehicle 700 of the third embodiment includes a main body, a propulsion device 2120 having four sets of propellers 2122 driven by respective motors 2124, an alignment device 2130 having a controller 2131, a beacon identification module 2132, an orientation module 2134 and a positioning module 2436, a power source 2160, a first camera 2110 having a polarizing filter 2112, and a second camera 2460. Unlike the previous embodiments, the second camera 2460 includes a polarizing filter 762. The first and second cameras 2110, 2460 are similarly set up to be confocal using a beamsplitter 2420 to split the polarized light 2140 into two light beams 2140a, 2140b. The first light beam 2140a is received by the first camera 2110 via the first polarizing filter 2112 and the second light beam 2140b is received by the second camera 2460 via the second polarizing filter 762. As a result, the intensities of the first and second light beams 2140a, 2140b are affected by the respective first and second polarizing filters 2112, 762. In other words, the respective intensities captured by the first camera 2110 and the second camera 2460 are affected by the orientation of the unmanned aerial vehicle 700 to the optical beacon 210.

Similar to the second embodiment, the controller 2131 is arranged to be communicatively coupled to the first and second cameras 2110, 2460 and the propulsion device 2120. The communication channels 2150 are set up to be one-way wired communications.

The first and second cameras 2110, 2460 transmit information about the optical beacon 210 to the controller 2131 for further processing. In particular, the controller 2131 receives information on the intensity of the polarized light 2140 and the [x,y] position of the optical beacon 210 relative to the unmanned aerial vehicle 700 from both cameras 2110, 2460. However, additional processing by the controller 2131 is required before the information may be extracted. The beacon identification module 2132, the orientation module 2134 and the positioning module 2436 will be described in a later section with reference to Figure 11. Similar to the previous embodiments, the third embodiment should not be construed as being limitative.

Figure 10A illustrates the first and second polarizing filters 2112, 762 in an upstanding position. The first and second polarizing filters 2112, 762 include respective front facing portions 801, 805. The front facing portions 801, 805 have respective vertical axes 802, 806 in the upstanding position. Notably, during operation of the unmanned aerial vehicle 700, the front facing portions 801, 805 are effectively pointing downwards so as to receive the first and second light beams 2140a, 2140b. In this position, the direction of the vertical axes 802, 806 corresponds to the longitudinal axis (front to back) of the main body of the unmanned aerial vehicle 700. From this perspective, the first polarizing filter 2112 has an optical axis 804 at +45° to the longitudinal axis of the main body of the unmanned aerial vehicle while the second polarizing filter 762 has an optical axis 808 at -45° to the longitudinal axis of the main body of the unmanned aerial vehicle. Importantly, the optical axes 804, 808 have an angular difference of 90° regardless of the orientation of the unmanned aerial vehicle 700 to the optical beacon 210. This relationship is illustrated in the line graphs 810, 820 of Figure 10B.

Line graph 810 shows the expected intensity of the first light beam 2140a for different angular displacements of the unmanned aerial vehicle 700 from the optical beacon 210. Another way of looking at line graph 810 is by shifting line graph 250 along the x-axis by +45° (or 45° to the right). This is equivalent to a 45° clockwise rotation of the polarizing filter 112 such that the polarization axis 220 is at +45° to the vertical.

Line graph 820 shows the expected intensity of the second light beam 2140b for different angular displacements of the unmanned aerial vehicle 700 from the optical beacon 210. Another way of looking at line graph 820 is by shifting line graph 250 along the x-axis by -45° (or 45° to the left). This is equivalent to a 45° anticlockwise rotation of the polarizing filter 112 such that the polarization axis 220 is at -45° to the vertical.

As can be seen in Figure 10B, the line graphs 810, 820 oscillate out of phase such that when the first camera 2110 captures a maximum intensity of the first light beam 2140a, the second camera 2460 captures a minimum intensity of the second light beam 2140b. By taking a sum of the line graphs 810, 820, a resulting line graph 830 has a constant intensity regardless of the angular displacement of the unmanned aerial vehicle 700 from the optical beacon 210. This feature is particularly useful since images from the first and second cameras 2110, 2460 can be processed (i.e. combined) by the controller 2131 to produce a composite image having a constant intensity. The composite image is then used by the positioning module 2436 for the determination of the centroid 212 of the optical beacon 210 for aligning the [x,y] position of the unmanned aerial vehicle 700 to the optical beacon 210. Advantageously, having a composite image with constant intensity allows the positioning module 2436 to reliably determine the centroid 212 of the optical beacon 210 regardless of the orientation of the unmanned aerial vehicle 700 to the optical beacon 210.

Conversely, a line graph 840 can also be formed by taking a difference of the line graphs 810, 820. The difference of the line graphs 810, 820 is at a minimum when the orientation of the unmanned aerial vehicle 700 is aligned to the optical beacon 210. In other words, the orientation of the unmanned aerial vehicle 700 is aligned to the optical beacon 210 when the intensity of the first light beam 2140a captured by the first camera 2110 is the same as the intensity of the second light beam 2140b captured by the second camera 2460. For the purpose of orientating the unmanned aerial vehicle 700 to be aligned to the optical beacon 210, instead of determining whether the intensity of the polarized light 2140 is decreasing as in the previous embodiments, the orientation module 2134 determines whether the respective intensities of the first and second light beams 2140a, 2140b captured by the first and second cameras 2110, 2460 are balanced.
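The relationship between the line graphs 810, 820, 830 and 840 can be reproduced numerically as follows, assuming Malus's law and the ±45° filter arrangement of Figure 10A. The mapping of sign to filter is an assumption for the sketch:

```python
import math

def beam_intensities(angle_deg, i_max=1.0):
    """Expected intensities of the first and second light beams 2140a,
    2140b behind the +45 deg and -45 deg polarizing filters for a
    given angular displacement of the vehicle from the beacon.
    Each curve is line graph 250 (sin^2) shifted by +/-45 deg."""
    i1 = i_max * math.sin(math.radians(angle_deg - 45.0)) ** 2  # graph 810
    i2 = i_max * math.sin(math.radians(angle_deg + 45.0)) ** 2  # graph 820
    return i1, i2
```

Because the two filter axes differ by 90°, the sum i1 + i2 is constant for every displacement (line graph 830), while the difference i1 - i2 vanishes exactly at alignment (line graph 840).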

Figure 11 illustrates a flow diagram of an exemplary method 900 for aligning the unmanned aerial vehicle 700 to the landing station 200 having the optical beacon 210. The method 900 is a modified version of the method 500, and like steps in the third embodiment use the same reference numerals as the previous embodiments but with an addition of 2000. In particular, like the method 500, the method 900 includes a beacon identification stage 2310 operated by the beacon identification module 2132, an orientation stage 2320 operated by the orientation module 2134 and a positioning stage 2530 operated by the positioning module 2436. Similar to the second embodiment, the orientation stage 2320 and the positioning stage 2530 run in tandem to align the relative [x,y] positions and orientation of the unmanned aerial vehicle 700 to the landing station 200 having the optical beacon 210. The beacon identification stage 2310 is identical to the beacon identification stages 310, 1310 of the first and second embodiments and is not repeated for the sake of brevity.

The positioning stage 2530 is similar to the positioning stage 530 in the second embodiment. The only difference is that the positioning module 2436 uses the composite image formed by combining images from both the first and second cameras 2110,2460 to determine the centroid 212 of the optical beacon 210. The remaining steps of the positioning stage 2530 are also not repeated for the sake of brevity.

At step 922 of the orientation stage 2320, the orientation module 2134 determines if an intensity 922a of the polarized light 2140 captured by the first camera 2110 is the same as an intensity 922b of the polarized light 2140 captured by the second camera 2460, or in other words, whether the intensities are balanced. If the intensities 922a, 922b are balanced, the orientation module 2134 determines that the orientation of the unmanned aerial vehicle 700 is aligned to the optical beacon 210. Notably, the orientation module 2134 continues to check whether the intensities 922a, 922b are balanced as long as the unmanned aerial vehicle 700 is active so that the unmanned aerial vehicle 700 can be readily and dynamically adjusted to correct any misalignments.

If the orientation module 2134 determines that the intensities 922a, 922b are not balanced, then at step 924, the orientation module 2134 determines whether the intensity 922a is greater than the intensity 922b. Unlike the orientation stages 320,1320 in the previous methods 300,500, the orientation module 2134 does not need to inform the controller 2131 to direct the unmanned aerial vehicle 700 to make a rotation in a preset direction in order to determine which direction of rotation is the shorter route for the unmanned aerial vehicle 700 to achieve the desired orientation. By determining which of the first and second cameras 2110,2460 has the higher intensity, the orientation module 2134 can determine the shorter rotation to get to the desired orientation. If the intensity 922a of the first light beam 2140a is higher than the intensity 922b of the second light beam 2140b, then the orientation stage 2320 moves to step 928 where the orientation module 2134 informs the controller 2131 to direct the propulsion device 2120 to rotate the unmanned aerial vehicle 700 such that the angular displacement is increased.

If the intensity 922a of the first light beam 2140a is lower than the intensity 922b of the second light beam 2140b, then the orientation module 2134 flips the direction of rotation at step 926, before informing the controller 2131 to direct the propulsion device 2120 to rotate the unmanned aerial vehicle 700 at step 928 such that the angular displacement is decreased.
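The decision logic of steps 922 to 928 can be sketched as a single comparison; unlike the method 300, no trial rotation is needed. The sign convention for the returned rotation direction is an assumption consistent with the ±45° filter arrangement:

```python
def balance_orientation_step(i1, i2, tolerance=1e-3):
    """One pass of the orientation stage 2320: compare the intensities
    922a (first camera) and 922b (second camera) and choose a rotation.

    Returns 0 when the intensities are balanced (aligned), otherwise
    +1 or -1 for the rotation sense (sketch convention)."""
    if abs(i1 - i2) <= tolerance:   # step 922: balanced -> aligned
        return 0
    if i1 > i2:                     # steps 924/928: rotate one way
        return 1
    return -1                       # step 926: flip the direction
```

With the sin²(θ∓45°) curves of the two beams, the sign of i1 - i2 indicates on which side of alignment the vehicle sits, so the returned direction always takes the shorter route to the desired orientation.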

Once step 928 is completed, the orientation stage 2320 returns to step 922, where the whole process is repeated. This ensures that the orientation module 2134 can react quickly to changes in the orientation of the unmanned aerial vehicle 700 relative to the optical beacon 210 and adjust the unmanned aerial vehicle 700 in response to these changes.

The described embodiments thus propose multiple ways in which the autonomous mobile apparatus 10, 40, 70 aligns to the reference object 20 having the optical beacon 210 that emits polarized light 140, 1140, 2140. Importantly, polarization is a property of light that is largely unaffected by distance. By making use of polarized light to transmit information about the relative positions and orientation of the autonomous mobile apparatus 10, 40, 70 to the reference object 20, the autonomous mobile apparatus 10, 40, 70 has a large range of working distance. More particularly, the autonomous mobile apparatus 10, 40, 70 is able to make adjustments to align itself to the reference object 20 continuously while homing in on the reference object 20 from a large distance away.

The autonomous mobile apparatus 10, 40, 70 includes the alignment device 130, 1130, 2130 having a controller 131, 1131, 2131 that is communicatively coupled to at least one image capturing device 110, 1110, 460, 2110, 2460 having the polarizing filter 112, 1112, 2112, 762. The controller 131, 1131, 2131 is also communicatively coupled to the propulsion device 120, 1120, 2120. It should also be appreciated that the alignment device 130, 1130, 2130 may be provided in a separate guidance module which is retrofitted to the autonomous mobile apparatus 10, 40, 70 prior to deployment. In such an embodiment, it is envisaged that the guidance module may have wired/wireless communication ports to realize the communication channels 150, 1150, 2150 described in the various embodiments. The guidance module may also have a power delivery device to deliver power to the various components of the guidance module either from an independent source, or from the main body of the autonomous mobile apparatus 10, 40, 70. In another embodiment, the image capturing device 110, 1110, 460, 2110, 2460 may be installed on the guidance module or it may also be installed on the main body of the autonomous mobile apparatus 10, 40, 70. In yet another embodiment, the beacon identification module 132, 1132, 2132, the orientation module 134, 1134, 2134, and/or the positioning module 436, 2436 may be incorporated into an existing system in the main body of the autonomous mobile apparatus 10, 40, 70.

The unmanned aerial vehicle 100, 400, 700 may be a drone or a spacecraft. Alternatively, the autonomous mobile apparatus 10, 40, 70 may also be land-based. For example, the autonomous mobile apparatus 10, 40, 70 may be a wheeled robot (e.g. an automated/robotic vacuum cleaner) capable of precise alignment to a charging base. In this case, the propulsion device 120, 1120, 2120 may include two wheels driven by respective motors. Still further, the autonomous mobile apparatus 10, 40, 70 may be an underwater vehicle, in which case the propulsion device 120, 1120, 2120 may include a screw propeller driven by a motor. It should also be appreciated that features and variations described in relation to one embodiment may also be applicable to the other embodiments.

It should be clear that although the present disclosure has been described with reference to specific exemplary embodiments, various modifications may be made to the embodiments without departing from the scope of the invention as laid out in the claims. For example, although the image capturing device used in the embodiments is a camera, depending on the operating environment of the autonomous mobile apparatus 10,40,70, other image capturing devices that are better suited for the intended environment are understood to be possible.

Further, the various embodiments discussed above may be practiced with steps in a different order from that described and illustrated in the Figures. Modifications and alternative constructions apparent to the skilled person are understood to be within the scope of the disclosure.