Title:
METHOD AND SYSTEM FOR OBJECT LOCALIZATION, AND CONTROLLING MOVEMENT OF AN AUTONOMOUS UNDERWATER VEHICLE FOR OBJECT INTERVENTION
Document Type and Number:
WIPO Patent Application WO/2021/045679
Kind Code:
A1
Abstract:
An underwater object localisation method performed by an autonomous underwater vehicle (AUV) (100) having a sensing device coupled to its body is disclosed. In a described embodiment, the method includes generating sensor data (312) corresponding to an underwater object (314) detected by the sensing device, generating a distribution of N particles (203) based on the generated sensor data (312) to represent a posterior distribution of the object's location, with each particle (203) associated with an initial position (326) relative to a reference of the AUV body (100), and transforming the initial positions (326) of the N particles (203) into initial object global coordinates (208) using the AUV's global coordinates (334) to define the object's approximate 3D location, the object global coordinates (208) being independent of the AUV body reference. A localisation system and a control system are also disclosed.

Inventors:
GOH ENG WEI (SG)
CHEW WAN THENG RUTH (SG)
RAAJ YAADHAV (SG)
Application Number:
PCT/SG2020/050495
Publication Date:
March 11, 2021
Filing Date:
August 25, 2020
Assignee:
NAT UNIV SINGAPORE (SG)
International Classes:
B63G8/00; G05D1/00
Foreign References:
US20140165898A1 (2014-06-19)
CN106569179A (2017-04-19)
US20100316233A1 (2010-12-16)
US20140229034A2 (2014-08-14)
Attorney, Agent or Firm:
POH, Chee Kian, Daniel (SG)
Claims:
CLAIMS

1. An underwater object localisation method performed by an autonomous underwater vehicle (AUV), the AUV having a sensing device coupled to its body, the method comprising generating sensor data corresponding to an underwater object detected by the sensing device; generating a distribution of N particles based on the generated sensor data to represent a posterior distribution of the object’s location, with each particle associated with an initial position relative to a reference of the AUV body; and transforming the initial positions of the N particles into initial object global coordinates to define the object’s approximate 3D location, the object global coordinates being independent of the AUV body reference.

2. An underwater object localisation method according to claim 1, wherein the reference of the AUV body is off-centre to the AUV body, and transforming the initial positions of the N particles into initial object global coordinates comprises transforming the initial position from the reference of the AUV body to a reference of the centre of the AUV body.

3. An underwater object localisation method according to claim 2, further comprising transforming the initial position from the reference of the centre of the AUV body to a world frame using the AUV’s global coordinates.

4. An underwater object localisation method according to claim 3, wherein the AUV’s global coordinates comprise the AUV’s global position and orientation.

5. An underwater object localisation method according to any one of the preceding claims, further comprising receiving new sensor data relating to the detected object, and obtaining updated object global coordinates of the N particles based on the initial object global coordinates and the new sensor data.

6. An underwater object localisation method according to claim 5, wherein obtaining updated object global coordinates comprises predicting new object global coordinates from the initial object global coordinates.

7. An underwater object localisation method according to claim 6, further comprising transforming the new object global coordinates into a common reference of the new sensor data, and comparing the new object global coordinates with the sensor data.

8. An underwater object localisation method according to claim 7, wherein the new sensor data includes visual image data, and transforming the new object global coordinates into a reference of the new sensor data comprises deriving a pixel coordinate of each new object global coordinate and comparing the pixel coordinates with the sensor data.

9. An underwater object localisation method according to claim 7 or 8, further comprising assigning weights to the N particles based on the comparison, the assigned weights defining a difference between the detected object’s properties and predefined object properties.

10. An underwater object localisation method according to claim 9, wherein the detected object’s properties include ratio of selected dimensions of the detected object’s bounding box, or colour.

11. An underwater object localisation method according to claim 9 or 10, further comprising adding random numbers to the assigned weights to obtain adjusted weights, the random number being obtained by sampling a value from a normal distribution; and multiplying the sampled value by a constant corresponding to a predefined standard deviation.

12. An underwater object localisation method according to claim 11, further comprising resampling the new object global coordinates based on the adjusted weights to obtain the updated object global coordinates.

13. An underwater object localisation method according to claim 12, further comprising calculating a mean of the updated object global coordinates to determine an improved approximate of the object’s 3D location.

14. An underwater object localisation method according to any preceding claim, wherein the sensing device includes a sonar device, and the sensor data includes a sonar image of the detected object.

15. An underwater object localisation method according to claim 14, wherein the sensor data includes an object attribute comprising coordinates of the detected object as detected by the sonar device.

16. An underwater object localisation method according to claim 15, wherein the distribution of N particles is generated based on particle filtering of the sensor data.

17. An underwater object localisation method according to any one of claims 14 to 16, wherein generating the distribution of N particles comprises assigning the relative position to each particle based on a mean of the coordinates of the detected object, and a predefined variance.

18. An underwater object localisation method according to claim 17, wherein the predefined variance has a larger value in a z-axis than in an x-axis or y-axis.

19. A method of controlling movement of an autonomous underwater vehicle (AUV) in relation to a detected underwater object, the underwater object having a posterior distribution represented by N particles which are associated with respective object global coordinates representing an approximate 3D location of the detected object, the method comprising

(i) calculating a displacement between the AUV and the detected object by comparing global coordinates of the AUV with the object global coordinates of the N particles;

(ii) if the displacement is greater than a preset tolerance, moving the AUV by a percentage of the displacement to reduce the displacement between the AUV and the object; and

(iii) repeating (i) and (ii) until the displacement is lesser than the preset tolerance.

20. A method of controlling movement of the AUV according to claim 19, wherein the percentage varies depending on the displacement between the AUV and the detected object.

21. A method of controlling movement of the AUV according to claim 19, wherein the percentage is reduced as the displacement between the AUV and the detected object is reduced.

22. An underwater object localisation system for an autonomous underwater vehicle (AUV), the system comprising a sensing device coupled to the AUV’s body and arranged to detect underwater objects and further arranged to generate sensor data corresponding to a detected underwater object; and a fusion module arranged to generate a distribution of N particles based on the generated sensor data to represent a posterior distribution of the object’s location, with each particle being arranged to be associated with an initial position relative to a reference of the AUV body; the fusion module further arranged to transform the initial positions of the N particles into initial object global coordinates to define the object’s approximate 3D location in which the object global coordinates are independent of the AUV body reference.

23. A system for controlling movement of an autonomous underwater vehicle (AUV) in relation to a detected underwater object, the underwater object having a posterior distribution represented by N particles which are associated with respective object global coordinates representing an approximate 3D location of the detected object, the system comprising a controller arranged to (i) calculate a displacement between the AUV and the detected object by comparing global coordinates of the AUV with the object global coordinates of the N particles; and if the displacement is greater than a preset tolerance, (ii) cause the AUV to move by a percentage of the displacement to reduce the displacement between the AUV and the object; and (iii) repeat (i) and (ii) until the displacement is lesser than the preset tolerance.

24. An autonomous underwater vehicle comprising a sensing device arranged to detect underwater objects and further arranged to generate sensor data corresponding to a detected underwater object; a fusion module arranged to generate a distribution of N particles based on the generated sensor data to represent a posterior distribution of the object’s location, with each particle being arranged to be associated with an initial position relative to a reference of the AUV body; the fusion module further arranged to transform the initial positions of the N particles into initial object global coordinates to define the object’s approximate 3D location in which the object global coordinates are independent of the AUV body reference; and a mission controller arranged to calculate a displacement between the AUV and the detected object by comparing global coordinates of the AUV with the object global coordinates of the N particles; if the displacement is greater than a preset tolerance, the mission controller is arranged to move the AUV by a percentage of the displacement to reduce the displacement between the AUV and the object until the displacement is lesser than the preset tolerance.

Description:
METHOD AND SYSTEM FOR OBJECT LOCALIZATION, AND CONTROLLING MOVEMENT OF AN AUTONOMOUS UNDERWATER VEHICLE FOR OBJECT INTERVENTION

TECHNICAL FIELD

The present disclosure relates to object localization, and more particularly to object localization as performed by an autonomous underwater vehicle for object intervention.

BACKGROUND

Object detection and tracking techniques that employ LIDAR, radar, and camera sensor fusion are used in self-driving vehicle technologies. However, such techniques cannot be readily utilized in an underwater setting due to differences in the environment.

Existing methods have attempted to localise and track underwater objects by fusing the feeds from sonar and camera sensors of an autonomous underwater vehicle (AUV). However, such methods require well-defined environments with many features to track in order to work effectively. In one instance of sonar-camera sensor fusion, the camera is positioned bottom-facing and as a result is exposed to a well-defined environment with many features to track, coupled with a sonar sensor that is pointed towards the seabed rather than ahead. In this case, however, objects ahead of the underwater vehicle would not be captured by the sonar sensor. In addition, if forward-facing tracking were conducted with the camera facing forward, the AUV may have to track objects which are much further away and with much poorer visibility. In another instance, in the forward domain, objects detected in the sonar are mapped generically and directly into a forward-facing camera image as an area of interest in which to search for landmarks. However, such techniques are not able to acquire precise localisation of an object's 3D coordinates.

In addition, such methods do not work well in environments where the water is murky or where there is underwater current, since the positions of the objects of interest may not be detected with high enough accuracy for more complex intervention operations.

Therefore, it is desirable to provide an underwater localisation method and system for an AUV which addresses the problems above and/or to provide the public with a useful choice.

SUMMARY

According to a first aspect, there is provided an underwater object localisation method performed by an autonomous underwater vehicle (AUV) having a sensing device coupled to its body. The method includes generating sensor data corresponding to an underwater object detected by the sensing device, generating a distribution of N particles based on the generated sensor data to represent a posterior distribution of the object’s location, with each particle associated with an initial position relative to a reference of the AUV body, and transforming the initial positions of the N particles into initial object global coordinates using the AUV’s global coordinates to define the object’s approximate 3D location, the object global coordinates being independent of the AUV body reference.

The described embodiment is particularly advantageous since the object global coordinates corresponding to the N particles are independent of the position of the AUV, so more accurate estimates of the object’s location can be obtained. Such accuracy may thus allow the AUV to manipulate small objects more accurately and precisely, and at the same time operate smoothly in murky waters or in current conditions.

In the described embodiment, the N particles form a distribution that describes the probability of the object being at any 3D coordinate. The particles are weighted to form a posterior distribution that better describes the position of the object. More particles at the same spot equate to a higher probability of that spot being the actual estimated object position. By using particles, a better heuristic model of the object’s position is obtained, instead of relying on typical assumptions such as a Gaussian distribution. N represents the number of particles used to represent this distribution.
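As an informal illustration of this representation (not part of the patent; the names and values below are hypothetical), the particle set can be held as an array of candidate 3D positions with associated weights, and a point estimate of the object's location can be taken as the weighted mean of the particles:

    import numpy as np

    # Hypothetical sketch: N particles stored as an (N, 3) array of candidate
    # object positions, with one weight per particle.
    N_PARTICLES = 500
    particles = np.zeros((N_PARTICLES, 3))              # candidate (x, y, z) positions
    weights = np.full(N_PARTICLES, 1.0 / N_PARTICLES)   # start with uniform weights

    def estimate_position(particles, weights):
        # Weighted mean of the particles gives a point estimate of the object's 3D location.
        return np.average(particles, axis=0, weights=weights)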

Where the reference of the AUV body is off-centre to the AUV body, transforming the initial positions of the N particles into initial object global coordinates may include transforming the initial position from the reference of the AUV body to a reference of the centre of the AUV body.

In an embodiment, the underwater object localisation method may further include transforming the initial position from the reference of the centre of the AUV body to a world frame using the AUV’s global coordinates. Specifically, the AUV’s global coordinates may include the AUV’s global position and orientation.

In a specific embodiment, the underwater object localisation method may further include receiving new sensor data relating to the detected object, and obtaining updated object global coordinates of the N particles based on the initial object global coordinates and the new sensor data. In this respect, obtaining updated object global coordinates may include predicting new object global coordinates from the initial object global coordinates.

Advantageously, the underwater object localisation method may further include transforming the new object global coordinates into a common reference of the new sensor data, and comparing the new object global coordinates with the sensor data.

Where the new sensor data includes visual image data, transforming the new object global coordinates into a reference of the new sensor data may include deriving a pixel coordinate of each new object global coordinate and comparing the pixel coordinates with the sensor data. Preferably, the underwater object localisation method may further include assigning weights to the N particles based on the comparison, the assigned weights defining a difference between the detected object’s properties and predefined object properties.

In a specific example, the detected object’s properties may include ratio of selected dimensions of the detected object’s bounding box, or colour.

The underwater object localisation method may further include adding random numbers to the assigned weights to obtain adjusted weights, the random number being obtained by sampling a value from a normal distribution, and multiplying the sampled value by a constant corresponding to a predefined standard deviation. In an embodiment, the underwater object localisation method may further include resampling the new object global coordinates based on the adjusted weights to obtain the updated object global coordinates.

Preferably, the underwater object localisation method may further include calculating a mean of the updated object global coordinates to determine an improved approximate of the object’s 3D location.

The sensing device may include a sonar device, and the sensor data may include a sonar image of the detected object. It is envisaged that the sensor data may include an object attribute comprising coordinates of the detected object as detected by the sonar device.

In a specific example, the distribution of N particles may be generated based on particle filtering of the sensor data.

Preferably, generating the distribution of N particles may include assigning the relative position to each particle based on a mean of the coordinates of the detected object, and a predefined variance. It is envisaged that the predefined variance may have a larger value in a z-axis than in an x-axis or y-axis.

According to a second aspect of the invention, there is provided a method of controlling movement of an Autonomous Underwater Vehicle (AUV) in relation to a detected underwater object, the underwater object having a posterior distribution represented by N particles which are associated with respective object global coordinates representing an approximate 3D location of the detected object, the method comprising (i) calculating a displacement between the AUV and the detected object by comparing global coordinates of the AUV with the object global coordinates of the N particles, (ii) if the displacement is greater than a preset tolerance, moving the AUV by a percentage of the displacement to reduce the displacement between the AUV and the object, and (iii) repeating (i) and (ii) until the displacement is lesser than the preset tolerance.

As discussed in the described embodiment, the method allows highly accurate manipulation of objects by the AUV. In particular, using the method, the AUV is able to smoothly approach and manipulate small objects, like valves, or even objects that sway in the water with the currents.

It is envisaged that the percentage may vary depending on the displacement between the AUV and the detected object. Further, it is possible that the percentage may be reduced as the displacement between the AUV and the detected object is reduced.

Other aspects of the described embodiment relate to system equivalents of the first and second aspects. Indeed, according to a third aspect, there is provided an autonomous underwater vehicle (AUV). The AUV includes a sensing device arranged to detect underwater objects and further arranged to generate sensor data corresponding to a detected underwater object. The AUV further includes a fusion module arranged to generate a distribution of N particles based on the generated sensor data to represent a posterior distribution of the object’s location, with each particle being arranged to be associated with an initial position relative to a reference of the AUV body. The fusion module is further arranged to transform the initial positions of the N particles into initial object global coordinates to define the object’s approximate 3D location in which the object global coordinates are independent of the AUV body reference. The AUV further includes a mission controller arranged to calculate a displacement between the AUV and the detected object by comparing global coordinates of the AUV with the object global coordinates of the N particles. If the displacement is greater than a preset tolerance, the mission controller is arranged to move the AUV by a percentage of the displacement to reduce the displacement between the AUV and the object until the displacement is lesser than the preset tolerance.

It should be appreciated that features relating to one aspect may be relevant and applicable to the other aspects.

BRIEF DESCRIPTION OF DRAWINGS

An exemplary embodiment will be described with reference to the accompanying drawings in which:

Figure 1 is a block diagram of a system architecture of an Autonomous Underwater Vehicle (AUV);

Figure 2 is a schematic block diagram of a particle filter sensor fusion module of the AUV of Figure 1;

Figure 3 is a flow diagram of a particle-filtering method for object localization using an initialization module of the particle filter sensor fusion module of Figure 2;

Figure 4 is a flow diagram of a particle-filtering method for object localization which follows on from the particle-filtering method of Figure 3;

Figure 5 illustrates a flow diagram of a controller framework for manipulating the AUV of Figure 1 based on input from the particle-filtering method of Figure 4; and

Figures 6A and 6B illustrate thruster configurations for actuating movement of the AUV of Figure 1 based on instructions derived from the controller framework of Figure 5.

DETAILED DESCRIPTION

The following description contains specific examples for illustrative purposes. The person skilled in the art would appreciate that variations and alterations to the specific examples are possible and within the scope of the present disclosure. The figures and the following description of the particular embodiments should not take away from the generality of the preceding summary.

Figure 1 is a schematic diagram of a system architecture of an autonomous underwater vehicle (AUV) 100 suitable for underwater object localisation. The AUV 100 includes a system controller 102 (such as a microprocessor) arranged to be communicatively coupled to a sensing device and, in this embodiment, the sensing device comprises a sonar system 104 and a camera system 106. An object is first detected independently in the sonar system 104 and the camera system 106. In this embodiment, the sonar system 104 comprises a front-facing sonar device 108 and a sonar image processor 110. The sonar device 108 is arranged to capture sonar images of a forward navigation path of the AUV 100 and its vicinity and to transmit the captured sonar images of the detected object to the sonar image processor 110 for further processing. In this embodiment, the sonar image processor 110 comprises a median filter (not shown) to remove non-Gaussian noise that comes from electrical interference or frequency clashes from the sonar image to produce a partially processed sonar image. An image thresholding technique is then performed on the partially processed sonar image to detect or acquire the detected object in the sonar image.

In this embodiment, the camera system 106 comprises a front-facing camera 112 and a camera image processor 114. The camera 112 is arranged to capture visual images, also of the forward navigation path of the AUV 100 and its vicinity, and to transmit the visual images to the camera image processor 114 for further processing. The camera image processor 114 is configured by a user to process the camera images depending on how robust the image processing needs to be, which is affected by the type of underwater object to be detected and the environment in which the AUV 100 operates. Parameters may include uniformity and consistency of colour surfaces, specular effects from surface sunlight, and the shape and design of the object to be detected. For less robust requirements, the camera image processor 114 uses a salient region detection algorithm, while for detection of objects with more challenging features, a deep learning-based approach utilizing deep convolutional neural networks, such as a trained mobile Single Shot Multibox Detector (SSD) deep neural network model, is employed. While only two imaging devices are described herein, the AUV 100 may include more than two imaging devices and other perception sensors, such as, for example, sensing devices facing the rear or downwards of the AUV 100.
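As a rough illustration of the sonar pre-processing described above (a sketch using the OpenCV 4 API; the filter size, threshold value and minimum area are placeholder assumptions, not values from the patent):

    import cv2

    def detect_sonar_objects(sonar_image, ksize=5, thresh=60, min_area=20.0):
        # Median filter to suppress non-Gaussian noise from electrical
        # interference or frequency clashes, then intensity thresholding to
        # segment candidate objects in the sonar image.
        filtered = cv2.medianBlur(sonar_image, ksize)
        _, binary = cv2.threshold(filtered, thresh, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        detections = []
        for c in contours:
            area = cv2.contourArea(c)
            if area >= min_area:                         # keep plausible objects only
                x, y, w, h = cv2.boundingRect(c)
                detections.append({"area": area, "bbox": (x, y, w, h)})
        return detections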

Outputs 104a, 106a, i.e. sensor data, from both the sonar system 104 and the camera system 106 are communicated to the system controller 102 for further action. The sensor data includes a list of objects with object attributes detected by the sonar device 108 and the camera 112. The object attributes include name, area, and bounding box. In the case of sonar images, the object attributes also include real coordinates of the objects derived from the sonar images. The real coordinates include the range and azimuth of the objects detected by the sonar device 108. The system controller 102 is specifically configured to execute instructions of a fusion module arranged to generate a distribution of N particles and in this embodiment, the fusion module is in the form of a particle filter fusion module 118 stored in computer readable memory.

Further details of the sonar system 104 and the camera system 106 (and some other components) may be found in the article “Yaadhav Raaj, Alex John, Tan Jin, 3D Object Localization using Forward Looking Sonar (FLS) and Optical Camera via Particle Filter based Calibration and Fusion, OCEANS 2016 MTS/IEEE Monterey, 2016”, the contents of which are incorporated herein by reference in their entirety.

Notably, prior to autonomous operation of the AUV 100, various transformation matrices are obtained, including between a centre of the AUV 100 and the sonar device 108, and between the centre of the AUV 100 and the camera 112. Since the sonar device 108 and the camera 112 are physically located at different positions on the AUV 100, the purpose is to transform or normalise the object attributes extracted from the sonar and camera images to a common reference. In this described embodiment, the common reference is the centre of the AUV 100. However, any other reference point, including other parts of the AUV 100, may be chosen.

Additionally, calibration of the camera 112 is performed to obtain a camera matrix. The camera matrix provides a method to obtain the image coordinate (i.e. pixel coordinate) of a real 3D point captured with respect to the camera frame. The method produces a 2D point in the image frame of reference and is used to transform the 3D coordinates in the list of objects derived from the outputs 104a, 106a into the image frame of reference. Sonar specifications such as the range, field of vision, angle resolution and range resolution are obtained for more effective utilization of the sonar device 108.

The AUV 100 also includes a navigation system 116 communicatively coupled to the system controller 102. The navigation system 116 includes an inertial navigation system (which can take reference from Ultra Short Base Line, Long Base Line or just pure Doppler Velocity Log inertial navigation) for determining global positioning coordinates of the AUV 100 during its underwater operation, and is capable of inertial navigation and highly accurate dynamic positioning with less than 10 cm accuracy. The navigation system 116 is arranged to communicate navigation data, such as the AUV's 100 global positioning coordinates, to the system controller 102. The global positioning coordinates include 3D coordinates representing the position of the AUV 100 with respect to the world frame. The particle filter sensor fusion module 118 is arranged to receive sensor data and navigation data selectively from the system controller 102. Using the sensor data and the navigation data from the navigation system 116, the particle filter sensor fusion module 118 generates N particles which are used to estimate an approximate 3D position of a detected object (or multiple objects of interest), allowing manipulation of the object by the AUV 100.

The AUV 100 also includes thrusters 123 for controlling movement of the AUV 100 and actuators 125 for manipulating or controlling objects. The AUV 100 includes an actuator controller 124 for controlling the actuators 125, and an AUV control system 122 configured to control the thrusters 123 and the actuator controller 124 respectively. The AUV 100 further includes a mission controller 120 communicatively coupled to the system controller 102, the AUV control system 122 and the actuator controller 124. The system controller 102 is arranged to communicate the 3D coordinates of objects calculated by the particle filter sensor fusion module 118 to the mission controller 120, which uses them to instruct the AUV control system 122 and the actuator controller 124 to perform certain actions.

Figure 2 illustrates a schematic block diagram of the particle filter fusion module 118. The particle filter fusion module 118 includes an initialization module 200 for executing an initializing phase and an updating module 210 for executing an updating phase, for the AUV 100. The initialization module 200 includes a particle generator 202 arranged to receive initial data 204 from the system controller 102 and a transformation module 206. The initial data 204 comprises object attributes that are derived from the sensor data comprising the sonar image of a detected object. From the initial data 204, the particle generator 202 generates N particles 203 to represent a posterior distribution of the object’s location as detected by the sonar device 108. Since the N particles 203 are generated based on object attributes derived from the sonar image, each particle 203 is associated with an initial position relative to the reference frame of the sonar system 104.

The reference is not limited to the reference frame of the sonar system 104. For example, each particle 203 may be associated with an initial position relative to another reference on the AUV body 100. In this way, the relative position determines an approximate location of the N particles based on a posterior distribution in relation to the reference of the body of the AUV 100.

The initial positions of the N particles can be transformed into the AUV 100 frame with the transformation matrix from the sonar frame to the AUV 100 frame, obtained prior to operations. Using the global coordinates of the AUV 100 from the navigation system 116, the transformation module 206 then transforms the initial positions of the N particles into initial object global coordinates 208 to define the detected object’s approximate 3D location, and it should be appreciated that the object global coordinates are independent of the reference on the AUV 100. At the end of the initializing phase, the initialization module obtains an estimate of the approximate location of the detected object (or each of multiple detected objects) in the world frame. This completes the initialization phase for the particle filter fusion module 118.

The updating module 210 is arranged to receive the global coordinates 208 of the N particles 203 from the initialization module 200 and new sensor data 220 from the system controller 102 to improve the estimate of the location of the detected object (or, if multiple objects are detected, the estimated locations of each detected object). The updating module 210 includes a prediction module 212 configured to predict new global coordinates for the N particles 203 based on the N particles’ global coordinates 208 from the initialization module 200, and a weight-assigning module 214 configured to calculate weights to be assigned to the new global coordinates of the N particles 203 based on new sensor data 220 received from the system controller 102. The new sensor data 220 comprises object attributes that are derived from either or both the sonar images and the visual images. The assigned weights indicate how likely it is that a particle correctly indicates the position of the detected object, by comparing the particle to the new sensor data 220 and some preset specifications. The preset specifications are defined by the user and contain information specific to the desired object to be detected, such as the major axis size of the object with a certain tolerance, and this information is used to determine the weights. The weight-assigning module 214 is configured to adjust the weights to balance exploration and convergence.

Next, a resampling module 216 resamples the N particles 203 according to the assigned weights to obtain updated global coordinates 230 for the N particles 203. The resampling module then obtains global coordinates 240 for the detected object or objects from a mean of the N particles’ updated global coordinates 230. Each object’s global coordinate 240 represents an estimate of the 3D coordinate or location of the object in the real world. The resampling module 216 also feeds the updated global coordinates 230 of the N particles 203 back to the prediction module 212 to track and improve the estimates of the 3D coordinates of the objects. Therefore, as the AUV moves closer to or relative to the detected object or objects, new sensor data is obtained and input to the updating module 210, which produces better estimates of the global coordinates 240 of the object(s).

The operation of the initialization module 200 will now be explained with reference to Figure 3. Figure 3 illustrates a flow diagram of a particle-filtering method 300 for object localization using the initialization module 200. At step 310, initial data 204 is received from the system controller 102. The initial data 204 includes sensor data 312 comprising object attributes derived from the sonar image. The object attributes represent objects 314 in the external environment of the AUV 100 that are detected by the sonar device 108. Detected objects are iterated through by the particle generator 202. Notably, the first initialization of the filter (and the positions of each of the particles) is performed using the sensor data 312 from the sonar image.

At step 320, N particles 203 are generated and assigned based on an initial distribution 324, with means represented by the real coordinates in the sensor data 312 and variances predefined by the user. Preferably, the predefined variances are smaller in the x- and y-axes and large in the z-axis, reflecting the large uncertainty in the depth of the object, since the sensor data 312 from the sonar image does not contain depth information. The distribution 324 represents the region in which the detected object is likely to be. Each of the N particles 203 has an initial position 326 that is relative to the sonar system’s reference frame. In this way, the initial posterior distribution of the particles can be determined. In particular, since the N particles 203 are based on the object attributes derived from the sonar image, which are obtained in relation to the reference of the sonar device, the initial positions 326 of the N particles 203 are also with respect to the sonar system’s reference frame.
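A minimal sketch of this initialisation step (assuming the sonar-derived object coordinates are expressed in the sonar frame; the particle count and the standard deviations are placeholder assumptions, with a deliberately larger spread along z to reflect the missing depth information):

    import numpy as np

    def initialise_particles(sonar_xyz, n_particles=500, sigma_xy=0.2, sigma_z=2.0, rng=None):
        # Draw N particles around the sonar-derived object position; the z spread
        # is larger than x/y because the sonar image carries no depth information.
        rng = np.random.default_rng() if rng is None else rng
        stddev = np.array([sigma_xy, sigma_xy, sigma_z])
        return rng.normal(loc=sonar_xyz, scale=stddev, size=(n_particles, 3))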

At step 330, the transformation module 206 transforms the N particles’ initial (relative) positions 326 into the initial object global coordinates 208 via two steps. The first is to transform the initial positions 326 of the N particles 203 from the sonar system’s reference frame into the AUV 100 reference frame using the sonar-to-AUV transformation matrix, obtained prior to operation. The second is to transform the N particles’ positions in the AUV 100 reference frame into the world frame, using the global coordinates 334 of the AUV 100. The global coordinates 334 are used to derive the AUV-to-world transformation matrix, where information such as the global position and orientation, i.e. attitude (roll, pitch and yaw), of the AUV 100 is encoded into the transformation matrix. The N particles’ global coordinates 208 can be determined by multiplying the sonar-to-AUV transformation matrix and the AUV-to-world transformation matrix with the N particles’ initial positions 326. The N particles’ initial positions 326 are stored in global coordinates 208 such that the N particles’ global coordinates 208 are independent of the current position of the AUV 100.
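A sketch of the two-step transformation, assuming 4x4 homogeneous transformation matrices T_auv_sonar (fixed, from pre-mission calibration) and T_world_auv (built from the AUV's global position and attitude); the names are illustrative, not from the patent:

    import numpy as np

    def to_world_frame(particles_sonar, T_auv_sonar, T_world_auv):
        # Chain the sonar-to-AUV and AUV-to-world transforms to express the
        # particle positions in the world frame.
        n = particles_sonar.shape[0]
        homogeneous = np.hstack([particles_sonar, np.ones((n, 1))])   # (N, 4)
        world = (T_world_auv @ T_auv_sonar @ homogeneous.T).T
        return world[:, :3]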

The operation of the updating module 210 will now be explained with reference to Figure 4. Figure 4 illustrates a flow diagram of a particle-filtering method 400 which is a continuation of the particle-filtering method 300.

At step 410, the global coordinates of the N particles 203 are updated by the prediction module 212, from the current global coordinates 208, to form the new global coordinates 412 of the N particles 203. This update increases the particles’ variance in the z-axis if no object from the camera image was matched to the particle filter, indicating the increased uncertainty in the depth of the desired object and increasing the search field for the object depth-wise. The update may also be influenced by the user, via a configuration indicating the speed of the spread of the particles.
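A hedged sketch of this prediction step (the noise magnitudes and the amount of z-axis inflation are assumptions chosen for illustration):

    import numpy as np

    def predict(particles_world, camera_matched, base_sigma=0.05, z_inflation=0.3, rng=None):
        # Diffuse the particles slightly each cycle; widen the spread along z
        # when no camera detection has matched the filter, reflecting growing
        # uncertainty in the object's depth.
        rng = np.random.default_rng() if rng is None else rng
        sigma = np.array([base_sigma, base_sigma, base_sigma])
        if not camera_matched:
            sigma[2] += z_inflation
        return particles_world + rng.normal(0.0, sigma, size=particles_world.shape)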

At step 420, new sensor data 220 is received from the system controller 102 (as the sensing system continuously tracks the detected object, there will be new sensor data). For the updating phase, the new sensor data 220 can come from the sonar device 108 or the camera 112. The new sensor data 220 is used by the updating module 210 to update the N particles’ belief state, which is represented by the global coordinates 208 of the N particles 203.

At step 430, the weight-assigning module 214 first transforms the new global coordinates 412 of the N particles 203 into a frame of reference of either the sonar device 108 (if the new sensor data 220 is derived from a sonar image captured by the sonar device 108) or the camera 112 (if the new sensor data 220 is derived from a visual image captured by the camera 112) for comparison with the new sensor data 220. In the latter case of the visual image, another step is needed to obtain the pixel coordinates corresponding to each particle’s 3D coordinate, using the camera matrix obtained via calibration of the camera 112 prior to the operations.
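For the visual-image case, the extra projection step can be sketched as a simple pinhole projection that ignores lens distortion (the intrinsic values shown are hypothetical, not from the patent):

    import numpy as np

    def project_to_pixel(point_cam, K):
        # Project a 3D point expressed in the camera frame to a pixel coordinate
        # using the 3x3 intrinsic camera matrix K obtained from calibration.
        # Assumes z > 0, i.e. the point lies in front of the camera.
        x, y, z = point_cam
        u = K[0, 0] * x / z + K[0, 2]   # u = fx * x/z + cx
        v = K[1, 1] * y / z + K[1, 2]   # v = fy * y/z + cy
        return u, v

    # Hypothetical camera intrinsics, for illustration only.
    K = np.array([[800.0,   0.0, 320.0],
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])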

At step 440, weights 442 are assigned by the weight-assigning module 214 to each of the N particles 203. A higher weight indicates a higher probability that the particle is representative of the desired object. The weights 442 are assigned by comparing the N particles 203 and the new sensor data 220. For each particle 203, if the particle 203 coincides with an object detected in the new sensor data 220, a value is calculated based on how similar the detected object’s properties are to the predefined desired object’s properties. Such properties include the ratio of the dimensions of the object’s bounding box and the colour of the object. This value becomes the weight of the particle, or, if the particle 203 coincides with multiple objects, the maximum value over all corresponding objects is taken as the weight. A particle 203 that does not coincide with any objects detected in the new sensor data 220 will have a weight of 0.
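One possible weighting scheme consistent with this description (the dictionary keys, tolerances and scoring form are assumptions, not taken from the patent) compares the detection's bounding-box aspect ratio and mean colour with the predefined desired-object properties:

    import numpy as np

    def particle_weight(detection, desired, ratio_tol=0.5, colour_tol=60.0):
        # Score a particle that coincides with a detected object: closer agreement
        # of bounding-box ratio and colour with the desired object gives a weight
        # nearer 1; poor agreement gives a weight near 0.
        w, h = detection["bbox_wh"]
        ratio_err = abs((w / max(h, 1e-6)) - desired["bbox_ratio"])
        colour_err = np.linalg.norm(np.asarray(detection["mean_colour"], float)
                                    - np.asarray(desired["mean_colour"], float))
        ratio_score = max(0.0, 1.0 - ratio_err / ratio_tol)
        colour_score = max(0.0, 1.0 - colour_err / colour_tol)
        return ratio_score * colour_score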

At step 450, the weight-assigning module 214 adjusts the weights 442 slightly with a random number, sampled from a normal distribution multiplied by a constant corresponding to the desired standard deviation of the particle distribution. This is to introduce some uncertainties in the particle distribution and encourage the particles to explore a bigger space if the detected object associated with the particles is not very similar to the desired object, rather than converging. At step 460, the N particles 203 are resampled by the resampling module 216 according to the adjusted weights from step 450 to obtain the updated global coordinates 230 for the N particles 203. The updated global coordinates 230 are fed back to the prediction module 212 where step 410 is again performed with the updated global coordinates 230 of the N particles 203 as new input.

At step 470, global coordinates 240 for each object are obtained from a mean of the N particles’ updated global coordinates 230. The global coordinates 240 for each object represent an estimate of the 3D position of each object in the real world.
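Steps 450 to 470 can be sketched together as follows (the jitter magnitude and the use of simple multinomial resampling are assumptions made for illustration):

    import numpy as np

    def jitter_and_resample(particles, weights, jitter_sigma=0.02, rng=None):
        # Perturb the weights with Gaussian noise to keep some exploration,
        # resample the particles in proportion to the adjusted weights, and
        # return the mean of the resampled set as the object's 3D estimate.
        rng = np.random.default_rng() if rng is None else rng
        adjusted = weights + rng.normal(0.0, 1.0, size=weights.shape) * jitter_sigma
        adjusted = np.clip(adjusted, 1e-9, None)          # keep weights positive
        probs = adjusted / adjusted.sum()
        idx = rng.choice(len(particles), size=len(particles), p=probs)
        resampled = particles[idx]
        return resampled, resampled.mean(axis=0)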

Manipulation of AUV 100

At the end of the process of Figure 4, the object global coordinates 208 of the N particles 203 are with respect to the world frame, and the 3D position of the detected object is also with respect to the world frame. Together with the navigation system 116 of the AUV 100, which provides the AUV’s global coordinates 334, the mission controller 120 is able to control the movement of the AUV 100 using two methods, depending on the level of precision required for a task.

For waypoint navigation of the AUV 100 to an object of interest, where the maneuver of the AUV 100 does not need to be precise, the mission controller 120 (which includes a microprocessor in this instance) calculates a distance between the AUV 100 and the object from the global coordinates 334 of the AUV 100 and the 3D position of the object, and provides a one-off command to the AUV control system 122 of the AUV 100 to move towards the object using the thrusters 123.

However, when manipulating small objects like valves, or when the AUV 100 is operating in an underwater environment with current, there is generally a low tolerance for errors or drifts in the navigation system 116. In such scenarios, precise maneuvering of the AUV 100 is required. The mission controller 120 is configured to send continual goals to the AUV control system 122 or the actuator controller 124 instead of a one-off command. The controller framework 500 for controlling the AUV 100 in such scenarios is described next with reference to Figure 5.

Figure 5 illustrates a flow diagram of a controller framework 500 for manipulating the AUV 100 in situations where precise maneuvering is required. At step 510, the mission controller 120 compares the global coordinates 334 of the AUV 100 and the global coordinates 240 of the object 512 to obtain a displacement margin 514.

At step 520, the mission controller 120 compares the displacement margin 514 to a preset tolerance 522. If the displacement margin 514 is greater than the preset tolerance 522, at step 530, the mission controller 120 instructs the AUV control system 122 to move the AUV 100 by a percentage 532 of the displacement margin 514 to reduce the displacement margin 514 between the AUV 100 and the object 512.

The percentage 532 is preset and represents a trade-off between speed and stability. A higher percentage 532 enables the AUV 100 to move faster but risks overcorrection and overshooting the target. This causes the AUV 100 to hover or vibrate around the target while the mission controller 120 corrects the overcorrection. On the other hand, a lower percentage 532 causes the AUV 100 to move slower but enables smoother operation of the AUV 100 since there is less risk of overcorrection. The percentage 532 can also be set depending on other factors such as distance from the object, where the percentage becomes smaller as the AUV 100 nears the object, for smoother manipulation.

The mission controller 120 again performs step 510 and the cycle repeats until the displacement margin 514 is determined to be lower than the preset tolerance 522 at step 520. When this happens, the cycle ends at step 540.
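The cycle of steps 510 to 540 can be sketched as a simple control loop (get_auv_position, get_object_estimate and move_by are hypothetical callbacks standing in for the navigation system 116, the fusion module 118 and the AUV control system 122; the tolerance and gain values are placeholders):

    import numpy as np

    def approach_object(get_auv_position, get_object_estimate, move_by, tolerance=0.1, gain=0.3):
        # Repeatedly compare the AUV's global position with the object estimate
        # and command a percentage (gain) of the remaining displacement until
        # the displacement margin falls below the preset tolerance. The gain
        # could also be reduced as the AUV nears the object for smoother motion.
        while True:
            displacement = get_object_estimate() - get_auv_position()
            if np.linalg.norm(displacement) < tolerance:
                break
            move_by(gain * displacement)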

Notably, as the particle filter sensor fusion is run simultaneously with the controller framework 500, the estimate of the 3D position of the object 512 is continuously corrected and improved as the AUV 100 moves nearer and nearer to the object 512. This allows the vehicle to approach and/or manipulate, with high accuracy, small objects or objects that sway in the water due to currents. When the AUV 100 is within range of the detected object to be manipulated, the mission controller 120 is arranged to provide instructions to the actuator controller 124, which then controls the actuators 125 to manipulate required tools or equipment to complete the required task at the detected object. The AUV control system 122 includes a proportional-integral-derivative (PID) controller with a total of six loops to enable six degrees of freedom (DOF) for very precise maneuvering of the AUV 100. Each of the PID DOF loops is designed for real-world control, providing integral wind-up protection, actuator saturation limits and derivative-component low-pass filtering.

In this embodiment, the thrusters include four depth thrusters which actuate movement of the AUV 100 based on instructions from the AUV control system 122. There are two thruster configurations available: a vectored thrust configuration 610 and a straight thrust configuration 620, as illustrated in Figures 6A and 6B respectively. Such configurations allow for maximum stability of the AUV 100 in the roll, pitch, and yaw domains, as well as the heave, sway, and surge domains. The AUV control system 122 is able to achieve station-keeping accuracy and stability of less than approximately 10 cm. Notably, the vectored thrust configuration 610 allows for more thrust in a single axis but comes at the cost of greater energy loss. The straight thrust configuration 620 is more energy efficient but provides less single-axis thrust. The AUV control system 122 fully supports the actuation of both thruster configurations 610 and 620. Thruster setpoints that are available for control are based on absolute depth, absolute yaw/heading, displacement relative to the distance travelled in the forward axis of the AUV 100, and displacement relative to the distance travelled in the side axis of the AUV 100.

The proposed particle-filtering methods are implemented on the AUV 100 to achieve a fully autonomous intervention pipeline, from object detection and integration of navigation data to manipulation of objects. In particular, the particle-filtering method 400 involves fusion of sensor data 220 from multiple sensors, namely the sonar device 108 and the camera 112. Notably, the forward-looking sonar device 108 on its own is only able to provide the range and azimuth of an object in the sonar image, while the camera 112 alone may not provide distance measures without additional or prior information. A fusion of the sensor data therefore overcomes the individual sensors’ shortcomings to produce accurate 3D localization of the objects 512. Further, fusion of sensor data from other odometry sensors is possible.

Advantageously, the AUV 100 described in the exemplary embodiments is effective in subsea inspections, including turning of valves, cathodic protection testing and flooded member detection, despite the murkiness of the water, which causes underwater cameras to perform poorly at range.

It should be clear that although the present disclosure has been described with reference to specific exemplary embodiments, various modifications may be made to the embodiments without departing from the scope of the invention as laid out in the claims. For example, although the particle-filtering methods for object localization described are implemented on an Autonomous Underwater Vehicle 100, other algorithms may be used to generate the distribution of the N particles.

Further, various embodiments as discussed above may be practiced with steps in a different order from that disclosed in the description and illustrated in the figures. Modifications and alternative constructions apparent to the skilled person are understood to be within the scope of the disclosure.