
Title:
METHOD TO ENHANCE UNDERWATER LOCALIZATION
Document Type and Number:
WIPO Patent Application WO/2014/067684
Kind Code:
A1
Abstract:
The present invention relates to a method to improve localization of an underwater vehicle, the underwater vehicle having a computed location. The method comprises receiving data representing an underwater region, and determining whether an underwater feature is represented in the received data. In addition, upon positive determination, the method also comprise determining at least one location associated with said underwater feature, computing a new location of the vehicle based on the determined at least one location, and updating the computed location based on the new location.

Inventors:
NASR AMIN (FR)
HOUZAY ERWANN (FR)
GUILLON SÉBASTIEN (FR)
GILMOUR WILLIAM (US)
Application Number:
PCT/EP2013/066948
Publication Date:
May 08, 2014
Filing Date:
August 13, 2013
Assignee:
TOTAL SA (FR)
CHEVRON USA INC (US)
International Classes:
B63G8/00; G05D1/06; G06T3/40
Foreign References:
EP0681230A11995-11-08
FR2965616A12012-04-06
Other References:
GIAN LUCA FORESTI: "Visual Inspection of Sea Bottom Structures by an Autonomous Underwater Vehicle", IEEE TRANSACTIONS ON SYSTEMS, MAN AND CYBERNETICS. PART B:CYBERNETICS, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 31, no. 5, 1 October 2001 (2001-10-01), pages 691 - 705, XP011057004, ISSN: 1083-4419
HALLSET J O: "A vision system for an autonomous underwater vehicle", PATTERN RECOGNITION, 1992, VOL. 1. CONFERENCE A: COMPUTER VISION AND APPLICATIONS, PROCEEDINGS, 11TH IAPR INTERNATIONAL CONFERENCE, THE HAGUE, NETHERLANDS, 30 AUG.-3 SEPT. 1992, LOS ALAMITOS, CA, USA, IEEE COMPUT. SOC, US, 30 August 1992 (1992-08-30), pages 320 - 323, XP010029830, ISBN: 978-0-8186-2910-5, DOI: 10.1109/ICPR.1992.201566
Attorney, Agent or Firm:
WLODARCZYK, Lukasz et al. (52 rue de la Victoire, Paris Cedex 09, FR)
Claims:
CLAIMS

1. A method to improve localization of an underwater vehicle (402), the underwater vehicle having a computed location, wherein the method comprises:

- receiving data (500) representing an underwater region;

- determining (501, 502) whether an underwater feature (407, 401, 408, 405, 406) is represented in the received data;

- upon positive determination:

- determining at least one location associated with said underwater feature;

- computing (506) a new location of the vehicle based on the determined at least one location;

- updating the computed location (507) based on the new location.

2. A method according to claim 1, wherein receiving data representing an underwater region comprises:

- acquiring data of the underwater region with a sonar (403) and/or a camera.

3. A method according to one of the preceding claims, wherein determining whether an underwater feature is represented in the received data comprises:

- determining a first subset of data in the received data representing a cylindrical body;

- determining a second subset of data based on the first subset of data;

- analyzing the second subset of data to determine whether an underwater feature is represented in the second subset of data.

4. A method according to one of the preceding claims, wherein determining whether an underwater feature is represented in the received data comprises:

- retrieving at least one signature from a set of stored signatures (503);

- determining (502) whether the at least one signature matches with a part of the received data.

5. A method according to claim 4, wherein retrieving at least one signature from a set of stored signatures comprises:

- selecting at least one signature from a set of stored signatures (503) based on the distance between the computed location of the vehicle and a location associated with the at least one signature; and

- retrieving the selected at least one signature.

6. A method according to claim 4 or 5, wherein the retrieved at least one signature is representative of a special feature (407, 401, 408, 405, 406) attached to a cylindrical body.

7. A method according to one of the preceding claims, wherein determining whether an underwater feature is represented in the received data comprises:

- computing at least one signature of a subpart in the received data;

- determining whether the computed at least one signature matches with at least one signature in a set of stored signatures.

8. A method according to one of the preceding claims, wherein, if a plurality of locations is associated with said underwater feature, determining a location associated with said underwater feature comprises:

- for each location in the plurality of locations, computing (508) a distance between the latter location and the computed location of the vehicle; and

- determining a closest location in the plurality of locations, the closest location corresponding to the location with the smallest computed distance.

9. A method according to one of the preceding claims, wherein computing a new location of the vehicle comprises:

- determining a relative location of the underwater feature from the vehicle; and

- computing a new location for the AUV, the new location being a function of the relative location of the underwater feature and the determined location associated with said underwater feature.

10. A controller (1000) to improve localization of an underwater vehicle, the underwater vehicle having a computed location, wherein the controller comprises:

- an interface (1003) to receive data representing an underwater region;

- a circuit (1004) configured to determine whether an underwater feature is represented in the received data;

- a circuit (1004) configured to determine a location associated with said underwater feature;

- a circuit (1004) configured to compute a new location of the vehicle based on the determined location;

- a circuit (1004, 1006) configured to update the computed location based on the new location.

11. A non-transitory computer readable storage medium, having stored thereon a computer program comprising program instructions, the computer program being loadable into a data-processing unit and adapted to cause the data-processing unit to carry out the steps of any of claims 1 to 9 when the computer program is run by the data-processing unit.

Description:
METHOD TO ENHANCE UNDERWATER LOCALIZATION

BACKGROUND OF THE INVENTION

The present invention relates to the enhancement of underwater localization and more specifically during a pipeline (or other cylindrical body) survey with an underwater vehicle.

The approaches described in this section could be pursued, but are not necessarily approaches that have been previously conceived or pursued. Therefore, unless otherwise indicated herein, the approaches described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section. Furthermore, all embodiments are not necessarily intended to solve all or even any of the problems brought forward in this section.

An AUV (for "Autonomous Underwater Vehicle") is a powerful tool for carrying out subsea mapping and geotechnical and environmental surveys in deep water.

An AUV needs only minimal support from a surface vessel to carry out a survey. Therefore, an underwater pipeline may be surveyed much faster, and minimal human intervention is required during operations.

To survey an underwater pipeline, the conventional method is to visually inspect the pipeline with an ROV (for "Remotely Operated Vehicle"). A standard video format is used to transmit feedback information to the engineer/operator at the surface, who may thus detect subsea anomalies. Videos are stored in surface storage devices.

It is also possible to post-process the stored video by replaying it and manually identifying pipeline anomalies/defects and any other features.

For an AUV, no high-bandwidth data link (i.e. a direct communication link such as a wire) to the surface is available. Moreover, for safety reasons, an AUV cannot come very close to or into contact with subsea pipelines: it is required that the AUV flies over them.

During surveying operations, the location of the AUV may be uncertain for the navigation module of the AUV. Indeed:

- no GPS signal may be received underwater;

- the Inertial Navigation System (or INS) of an AUV is not accurate over a long period, and a deviation between the real location of the AUV and the location computed by the AUV navigation module may be observed (accumulation of error per distance travelled);

- even if a low-bandwidth data link (acoustic link) may be available, this link could be interrupted due to underwater conditions (temperature, salinity, etc.), and no location data may be sent from the surface to the AUV during an undetermined period.

There is thus a need for improving the localization of the AUV during an underwater pipeline survey.

SUMMARY OF THE INVENTION

The invention relates to a method to improve localization of an underwater vehicle, the underwater vehicle having a computed location.

The method comprises:

- receiving data representing an underwater region;

- determining whether an underwater feature is represented in the received data;

- upon positive determination:

- determining at least one location associated with said underwater feature;

- computing a new location of the vehicle based on the determined at least one location;

- updating the computed location based on the new location.

Hence, it is possible to ensure that the computed location of the underwater vehicle is accurate even when no GPS signal is received. The underwater vehicle may "resynchronize" its location at determined points on the cylindrical body to which underwater features are attached. A cylindrical body may be, for instance, a pipeline, a cable (e.g. a mechanical, electrical, alimentation or hydraulic cable), a line (e.g. an alimentation line) or a riser (e.g. a riser bundle, a single hybrid riser, etc.).

Receiving data representing an underwater region may comprise:

- acquiring data of the underwater region with a sonar and/or a camera.

Sonar and camera devices are simple to use and to fit on an AUV. Moreover, these devices are often already installed on the underwater vehicle.

In a possible embodiment, determining whether an underwater feature is represented in the received data may comprise:

- determining a first subset of data in the received data representing a cylindrical body;

- determining a second subset of data based on the first subset of data;

- analyzing the second subset of data to determine whether an underwater feature is represented in the second subset of data.

The first subset is, for instance, the part of the received data defined by the two contour lines of the identified cylindrical body. Nevertheless, it is noted that an underwater feature may be slightly bigger than the diameter of the cylindrical body. Thus, a second subset of the data may be defined, including for instance the first subset, to be sure to encompass the large feature attached to the cylindrical body. For instance, the second subset may correspond to a dilation of the first subset (e.g. a homothetic transformation) with a factor equal to 110%.

The creation of this second subset may reduce the "data of interest" and thus increase the speed of the algorithm that determines whether an underwater feature is represented.
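As a rough sketch of this dilation, assuming the first subset is tracked as an axis-aligned bounding box in image coordinates (the function name and the box representation are illustrative, not from the patent):

```python
def dilate_bbox(bbox, factor=1.10):
    """Homothetic dilation of an axis-aligned bounding box about its center.

    bbox is (x_min, y_min, x_max, y_max); the default factor of 1.10
    corresponds to the 110% dilation suggested for the second subset.
    """
    x_min, y_min, x_max, y_max = bbox
    cx, cy = (x_min + x_max) / 2.0, (y_min + y_max) / 2.0
    half_w = (x_max - x_min) / 2.0 * factor
    half_h = (y_max - y_min) / 2.0 * factor
    return (cx - half_w, cy - half_h, cx + half_w, cy + half_h)

# The second subset slightly exceeds the pipe's bounding box on every side:
second_subset = dilate_bbox((0.0, 0.0, 100.0, 20.0))  # ≈ (-5, -1, 105, 21)
```

A feature search restricted to `second_subset` then processes only the pixels around the detected cylindrical body.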

In addition, determining whether an underwater feature is represented in the received data may comprise:

- retrieving at least one signature from a set of stored signatures;

- determining whether the at least one signature matches with a part of the received data.

The stored signatures may be signatures that have been previously computed and stored in a database or a memory for future use and comparison. These stored signatures may also be signatures that have been computed during a previous survey of the same cylindrical body.

Once signatures have been extracted from the memory/database, it is possible to determine if these signatures match any part/subset of the received data.

In a possible embodiment, retrieving at least one signature from a set of stored signatures may comprise:

- selecting at least one signature from a set of stored signatures based on the distance between the computed location of the vehicle and a location associated with the at least one signature; and

- retrieving the selected at least one signature.

In the database/memory, a location may be associated with each of the stored signatures.

Not every signature in the stored signatures is relevant to the specific received data. For instance, a signature associated with an underwater feature 10 km away from the computed location of the AUV is not relevant, as such a deviation error of the navigation module of the AUV is hardly possible. Hence, the most probable relevant underwater features are the features close to the computed location of the AUV (i.e. the location provided by the navigation module). For instance, the selected signatures may be the signatures associated with underwater features within a predetermined circle around the computed AUV location.
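This distance-based selection can be sketched as a simple radius filter (a minimal illustration assuming planar coordinates; the data layout and names are assumptions, not from the patent):

```python
import math

def select_signatures(stored, auv_location, radius):
    """Return the signatures whose associated location lies within `radius`
    of the AUV's computed location (the "predetermined circle").

    `stored` is a list of (signature, (x, y)) pairs.
    """
    ax, ay = auv_location
    return [sig for sig, (x, y) in stored
            if math.hypot(x - ax, y - ay) <= radius]

# A feature 10 km away is excluded; only nearby signatures are retrieved.
db = [("flange-A", (10.0, 0.0)), ("anode-B", (9000.0, 4000.0))]
nearby = select_signatures(db, auv_location=(0.0, 0.0), radius=500.0)
```

Matching is then attempted only against `nearby`, which both prunes improbable candidates and speeds up the comparison step.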

The retrieved at least one signature may be representative of a special feature attached to (or present onto) a cylindrical body.

For instance, these features may be anodes, stickers, geometrical markers, sleepers, flanges, etc.

In addition, determining whether an underwater feature is represented in the received data may comprise:

- computing at least one signature of a subpart in the received data;

- determining whether the computed at least one signature matches with at least one signature in a set of stored signatures.

If a plurality of locations is associated with said underwater feature, determining a location associated with said underwater feature may comprise:

- for each location in the plurality of locations, computing a distance between the latter location and the computed location of the vehicle; and

- determining a closest location in the plurality of locations, the closest location corresponding to the location with the smallest computed distance.

If a plurality of locations is associated with said underwater feature, it may indicate that there are uncertainties regarding the real location associated with said underwater feature (e.g. a plurality of signatures of stored features match the signature of the underwater feature).

Indeed, it is probable that a given flange has the same signature as another flange of the same cylindrical body. It may be advantageous to select the location closest to the computed location of the AUV as being the most probable location.
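The closest-location rule (distance computation of step 508 followed by a minimum) can be sketched as follows, assuming planar coordinates (an illustration, not the patent's implementation):

```python
import math

def closest_location(candidates, computed_location):
    """Among several candidate locations matching the same signature,
    pick the one nearest the AUV's computed location."""
    cx, cy = computed_location
    return min(candidates, key=lambda p: math.hypot(p[0] - cx, p[1] - cy))

# Two flanges share a signature; the nearer one is taken as the most
# probable match for the detected feature.
chosen = closest_location([(100.0, 0.0), (2000.0, 0.0)], (120.0, 5.0))
```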

In a possible embodiment, computing a new location of the vehicle may comprise:

- determining a relative location of the underwater feature from the vehicle; and

- computing a new location for the AUV, the new location being a function of the relative location of the underwater feature and the determined location associated with said underwater feature.

Due to the configuration of the detection means of an AUV, when the location of an underwater feature is determined, this does not imply that the location of the AUV is said determined location. Indeed, the distance between the detected underwater feature and the AUV may be a couple of meters.

It may be possible to determine the distance and the orientation of the underwater feature relative to the AUV (for instance, thanks to the position of the underwater feature in the received data and to the configuration of the capture means relative to the AUV direction).
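A planar sketch of this correction, assuming the feature's range and bearing relative to the AUV heading have been measured (angles in radians; all names are illustrative assumptions):

```python
import math

def auv_location_from_feature(feature_location, rel_range, rel_bearing, auv_heading):
    """Compute the AUV position from the known global location of a detected
    feature and its measured range/bearing relative to the AUV.

    `rel_bearing` is the angle at which the feature is seen, measured from
    the AUV heading; the AUV therefore sits `rel_range` back from the
    feature along the resulting global bearing.
    """
    angle = auv_heading + rel_bearing          # bearing in the global frame
    fx, fy = feature_location
    return (fx - rel_range * math.cos(angle),
            fy - rel_range * math.sin(angle))

# A feature known to be at (105, 0), seen 5 m dead ahead while the AUV
# heads along +x, places the AUV at (100, 0).
new_location = auv_location_from_feature((105.0, 0.0), 5.0, 0.0, 0.0)
```

The navigation module would then overwrite its drifted estimate with `new_location`.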

A second aspect of the invention relates to a controller to improve localization of an underwater vehicle, the underwater vehicle having a computed location. The controller comprises:

- an interface to receive data representing an underwater region;

- a circuit configured to determine whether an underwater feature is represented in the received data;

- a circuit configured to determine a location associated with said underwater feature;

- a circuit configured to compute a new location of the vehicle based on the determined location;

- a circuit configured to update the computed location based on the new location.

A third aspect relates to a computer program product comprising a computer readable medium, having thereon a computer program comprising program instructions. The computer program is loadable into a data-processing unit and adapted to cause the data-processing unit to carry out the method described above when the computer program is run by the data-processing unit.

Other features and advantages of the method and apparatus disclosed herein will become apparent from the following description of non-limiting embodiments, with reference to the appended drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which like reference numerals refer to similar elements and in which:

- Figure 1 is a representation of an AUV in survey mode along a pipeline according to a possible embodiment of the invention;

- Figures 2a to 2e are illustrations of images taken by an AUV during a survey according to a possible embodiment of the invention;

- Figure 3 is a flow chart describing a possible embodiment for controlling navigation of a subsea vehicle according to a possible embodiment of the invention;

- Figure 4 is an illustration of detection of underwater features according to a possible embodiment of the invention;

- Figure 5 is a flow chart describing a possible embodiment for improving localization of an underwater vehicle according to a possible embodiment of the invention;

- Figure 6a is an illustration of a sample image taken by an AUV during a survey according to a possible embodiment of the invention;

- Figure 6b is an illustration of a possible deformation of a sample image taken by an AUV during a survey according to a possible embodiment of the invention;

- Figure 7 is an illustration of a possible combination of sample images taken by an AUV during a survey according to a possible embodiment of the invention;

- Figure 8 is an illustration of possible defect detection in a panorama image according to a possible embodiment of the invention;

- Figure 9 is a flow chart describing a possible embodiment for simplifying defect recognition according to a possible embodiment of the invention;

- Figure 10 is a possible embodiment for a device that enables the present invention.

DESCRIPTION OF PREFERRED EMBODIMENTS

Figure 1 is a representation of an AUV in survey mode along a pipeline according to a possible embodiment of the invention.

An AUV (for "Autonomous Underwater Vehicle") is a subsea vehicle that is not directly controlled from the surface.

The AUV 102 may be used to ensure that there is no problem on subsea pipelines such as the pipeline 101 in Figure 1.

To survey the pipeline, the AUV 102 follows the path of the pipeline 101. For instance, if the pipeline is parallel to the axis x of the Cartesian coordinate system (x,y,z) represented in Figure 1, the navigation module of the AUV controls the AUV so that the AUV is translated along this direction. For safety reasons, the distance d between the AUV 102 and the pipeline 101 is kept greater than a predetermined safety distance to avoid any collision.

In addition, the AUV 102 may comprise capture means 103 (such as a camera, a video camera, a sonar, etc.) in order to survey the pipeline and provide information and data to the engineers. The capture means 103 may, for instance, be able to capture visual information close to the pipeline within a predetermined area 104.

Figures 2a to 2e are illustrations of images taken by an AUV during a survey according to a possible embodiment of the invention.

As described with reference to Figure 1, the camera may create images 200 (or sets of data) representing the seabed and comprising the pipeline 204 that is expected to be surveyed.

To control the navigation of the AUV, it is possible to use these images initially captured to survey the pipeline 204. Indeed, it is possible to determine a relative location of the AUV in space (the distance of the AUV from the pipeline):

- knowing the real diameter d204 of the pipeline 204,

- knowing the orientation of the capture means (e.g. the camera axis).

The determination of the relative location of the AUV in space is even more accurate (i.e. the orientation of the AUV compared to the orientation of the pipeline) if the two contour lines (210 and 211) are determined.

This determination may use image processing techniques such as contour detection. If the image is defined as a set of pixels with an amplitude or colour for each pixel, the detection may be done by searching the image for the two lines which maximize the variation of image amplitude orthogonally to the lines. An optimization process may be used to find the two best lines in the image verifying this criterion.

Once the relative location of the AUV from the pipeline is determined (distance and orientation), it is possible to modify the navigation path of the AUV to bring the AUV to a specific distance from the pipeline (e.g. 3 meters) and with a specific relative orientation (e.g. parallel to the pipeline).
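The distance part of this estimate can be sketched with a pinhole-camera relation between the known real diameter and its apparent width in pixels (a minimal illustration; the focal length value below is purely hypothetical and would come from camera calibration):

```python
def distance_to_pipeline(real_diameter_m, apparent_diameter_px, focal_length_px):
    """Pinhole-camera range estimate: an object of known size subtends
    fewer pixels the farther away it is, so
    distance = real_size * focal_length / apparent_size."""
    return real_diameter_m * focal_length_px / apparent_diameter_px

# A 0.5 m pipeline imaged 200 px wide by a camera with f = 1200 px
# would be about 3 m away, i.e. the example survey distance.
d = distance_to_pipeline(0.5, 200.0, 1200.0)
```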

In order to ease this determination, a "control pattern" may be defined in the AUV configuration settings.

If no differences are observed between the control pattern and the representation of the pipeline, then the AUV is well localized. Conversely, observed differences may be used to correct the location of the AUV. Knowing the mathematical model of the camera, the pipeline location, etc., it is possible to compute the displacement between the estimated and the real location: the true location of the AUV can then be estimated. Basically, this pattern may consist of a zone of the captured image 200 where the pipeline (or its representation through the determined contour lines) should remain. A huge number of solutions are possible to define such a "control pattern".

For instance, this pattern may consist of a set of points defining a polygon (e.g. points 220, 221, 222, 224 and 223), and the representation of the pipeline should fit in this polygon.

It is also possible to define segments at the edges of the captured image; the representation of the pipeline should then correspond to these segments at those edges. In Figures 2a to 2e, the pattern is defined with three segments 201, 202 and 203. In order to "validate" this pattern with the representation of the pipeline 204 in the image 200, the following conditions are to be verified:

- the contour line 210 is to go through the point 220 of segment 201 and through the point 223 of segment 203,

- the contour line 211 is to go through the point 222 of segment 202 and through the point 224 of segment 203.

If this pattern is validated (as represented in Figure 2a), the AUV is assumed to be at a correct distance and to have a correct orientation with respect to the pipeline.
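The two conditions above amount to a point-on-line test per reference point. A geometric sketch, assuming each detected contour line is given by two image points and the reference points are pixel coordinates (names and tolerances are illustrative):

```python
def line_through(p, q, point, tol=1.0):
    """True if `point` lies on the infinite line through p and q, within a
    pixel tolerance (perpendicular distance via the 2D cross product)."""
    (px, py), (qx, qy), (x, y) = p, q, point
    cross = (qx - px) * (y - py) - (qy - py) * (x - px)
    length = ((qx - px) ** 2 + (qy - py) ** 2) ** 0.5
    return abs(cross) / length <= tol

def pattern_validated(contour_210, contour_211, pts):
    """Check the conditions above: contour 210 must pass through points 220
    and 223, contour 211 through points 222 and 224. Each contour is a pair
    of points; `pts` maps reference numbers to pixel coordinates."""
    return (line_through(*contour_210, pts[220]) and
            line_through(*contour_210, pts[223]) and
            line_through(*contour_211, pts[222]) and
            line_through(*contour_211, pts[224]))
```

If `pattern_validated` returns False, the offsets between the expected points (220, 222, ...) and the observed crossings (220r, 222r, ...) indicate which corrective motion to apply, as illustrated in Figures 2b to 2e.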

Nevertheless, the pattern may be "not validated".

A first illustration of this invalidity is provided in Figure 2b:

- the contour line 210 goes through the point 220r (which is above the point 220) and through the point 223r (which is to the right of the point 223),

- the contour line 211 goes through the point 222r (which is to the left of the point 222) and through the point 224r (which is below and to the right of the point 224).

It appears that the representation of the pipeline (i.e. its detected contour lines) in the picture 200 is to be rotated in an anti-clockwise direction with a rotation centered on the point 225, in order to "validate" the pattern.

In order to perform a rotation of the representation of the pipeline in the image 200 in an anti-clockwise direction, the AUV may be rotated in a clockwise direction about the axis z (assuming that the pipeline is on the seabed defining the plane (x,y)) (e.g. the direction of the AUV is modified by the AUV navigation module to slightly turn right).

A second illustration of this invalidity is provided in Figure 2c:

- the contour line 210 goes through the point 220r (which is below the point 220) and through the point 223,

- the contour line 211 goes through the point 222r (which is to the left of the point 222) and through the point 224.

Therefore, the segment 203 is locally validated but the segments 201 and 202 are not. In order to validate the pattern with the representation of the pipeline 204 in the image 200, the AUV may be moved in the direction y in order to bring the pipeline closer to the AUV (i.e. to zoom the representation of the pipeline into the bottom-left corner of the image). It may also be useful to slightly rotate the AUV in an anti-clockwise direction about the axis y.

A third illustration of this invalidity is provided in Figure 2d:

- the contour line 210 goes through the point 223r (which is to the right of the point 223) and through the point 220,

- the contour line 211 goes through the point 222r (which is to the left of the point 222) and through the point 224.

Therefore, the segment 203 is not validated but the segments 201 and 202 are locally validated.

In order to validate the pattern with the representation of the pipeline in the image 200, it may be useful to slightly rotate the AUV in a clockwise direction about the axis y.

A fourth illustration of this invalidity is provided in Figure 2e:

- the contour line 210 goes through the point 223r (which is to the left of the point 223) and through the point 220r (which is above the point 220),

- the contour line 211 goes through the point 224r (which is below and to the right of the point 224) and through the point 222r (which is to the right of the point 222).

In order to validate the representation of the pipeline in the image 200, it may be useful to move the AUV away from the pipeline (i.e. to move the AUV in the direction -y).

In order to rotate, translate, etc. the AUV as described above, navigation instructions are sent to the navigation module of the AUV to modify the navigation parameters of the AUV.

With these modified navigation parameters, it is possible to control the AUV to ensure that it follows a subsea pipeline for a survey and captures consistent images of the pipeline (i.e. images where the pipeline is always at the same, or a similar, location).

Figure 3 is a flow chart describing a possible embodiment for controlling navigation of a subsea vehicle according to a possible embodiment of the invention.

Part of this flow chart can represent steps of an example of a computer program which may be executed by a circuit, computer or a computing device.

Upon reception of data 300 (e.g. a 2D array of pixel values, an image, etc.), it is possible to determine (step 301) whether a pipeline has a representation in the received data.

A plurality of methods is possible to determine whether a given feature is present in an image. For instance, this determination may use contour detection or pattern recognition algorithms in conjunction with a database 302 of stored pattern signatures.

If a pipeline is not detected (output KO of the test 309) in the image 300, the AUV is considered "temporarily lost" (output KO of test 310). If no pipeline is detected during a predetermined period of time (for instance 1 min) or after a predetermined number of received images (for instance 10 images), the AUV is considered "lost" (output OK of test 310). The AUV is then configured to go back to a location where a pipeline has previously been detected (message 308) or to a predetermined fallback location.
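The "temporarily lost" / "lost" decision of test 310 can be sketched as a counter of consecutive images without a detection (a minimal illustration; the class and state names are assumptions, and the 10-image threshold matches the example above):

```python
class PipelineTracker:
    """Track consecutive missed pipeline detections and classify the AUV
    state for each received image."""

    def __init__(self, max_missed=10):
        self.max_missed = max_missed
        self.missed = 0
        self.last_seen_location = None  # where a pipeline was last detected

    def update(self, pipeline_detected, location=None):
        """Return 'ok', 'temporarily_lost' or 'lost' for the current image."""
        if pipeline_detected:
            self.missed = 0
            self.last_seen_location = location
            return "ok"
        self.missed += 1
        if self.missed >= self.max_missed:
            # The AUV should return to last_seen_location or a fallback.
            return "lost"
        return "temporarily_lost"
```

A time-based criterion (e.g. 1 min without detection) could be tracked the same way with timestamps instead of an image counter.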

If a pipeline is detected (output OK of the test 309) in the image 300, the contour lines of the pipeline are detected (step 303), and the contour lines may be compared to a predetermined "control pattern" stored in a memory 305 of the AUV in order to determine if the pipeline representation "validates" (see above) this pattern.

The memory 305 and the memory 302 may be the same memory.

If the contour lines do not "validate" this pattern (output KO of test 306), a modification of the navigation parameters (rotations, translations, etc.) may be computed (step 307) and a message 308 may be sent to the navigation module of the AUV to control the AUV survey path.

If the contour lines do "validate" this pattern (output OK of test 306), the navigation parameters do not need to be updated and the AUV continues on its preprogrammed survey path.

Figure 4 is an illustration of detection of underwater features according to a possible embodiment of the invention.

When surveying a pipeline 400 on the seabed, the AUV 402 may be able to detect features attached to the pipeline with detection means 403 (such as a camera, a sonar, a multi-beam sonar, etc.).

The detection may use character recognition algorithms, pattern recognition algorithms or others.

In order to enhance the detection of the underwater features, it is possible:

- to add a reflective covering (e.g. paint with microspheres, etc.) on these features;

- to use materials that reflect/absorb specific wavelengths (IR, UV, red light, etc.);

- etc.

The above features may be for instance:

- a white sticker 407 with black numbers or letters written on it. These numbers or letters may represent an encoded real location (for instance in a signed degrees format, in a DMS + compass direction format, in a degrees minutes seconds format, etc.) or other information;

- a flange 401 that is used to attach two parts of the pipeline together (detected for instance with a pattern recognition algorithm);

- an anode 408 attached to the pipeline;

- a geometrical form 405 painted on the pipeline 400;

- a pipeline sleeper 406 used to avoid any displacement of the pipeline relative to the seabed.

Figure 5 is a flow chart describing a possible embodiment for improving localization of an underwater vehicle according to a possible embodiment of the invention.

Part of this flow chart can represent steps of an example of a computer program which may be executed by a circuit, computer or a computing device.

Upon receiving data (message 500) representing an underwater region (for instance a picture taken by a camera or a video recorder, rows of values taken by a sonar, etc.), it is possible to process the data to identify (step 501) underwater features as described above.

The identification of the underwater features may be performed on only part(s) of the received data: for instance, features may be searched in a bottom-left corner of the received image 500 or any other specific subset of the data.

When possible underwater features are identified in the received data 500, a comparison (step 502) may be performed to find a correspondence (e.g. a signature match) among a plurality of stored features in a database 503. The stored features may have been stored in association with real locations.

It is also possible that the feature detected in the data directly describes (i.e. without the need for an external database) a real location (for instance, a sticker with real coordinates written on it).

If no correspondence is found in the database 503 (test 504, output KO), no action is performed (step 505).

If a single correspondence is found in the database 503 (test 504, output OK), the real location associated with the correspondence in the database 503 is used to update (step 506) the computed location 507 of the AUV.

If a plurality of correspondences is found in the database 503 (test 504, output OK2), it is possible to select (step 508) one correspondence from the plurality. The selected correspondence may be the one for which the distance between its associated real location and the current computed location 507 of the AUV is minimal.

For instance, when a survey is performed on a pipeline, several flanges/anodes may have the same signature, so several correspondences matching an underwater feature may be found in the database. This algorithm for selecting one correspondence assumes that the most probable detected feature is the closest matching feature (i.e. the one with the shortest distance).
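The closest-match selection of step 508 can be sketched as follows; a minimal Python example in which the match dictionaries and flange identifiers are hypothetical illustrations:

```python
import math

def select_correspondence(matches, computed_location):
    """Step 508: among several correspondences sharing the same signature,
    pick the one whose stored real location is closest to the AUV's
    current computed location (507)."""
    return min(matches,
               key=lambda m: math.dist(m["location"], computed_location))

# Hypothetical flange correspondences along a pipeline (x, y in metres):
matches = [
    {"id": "flange-12", "location": (100.0, 40.0)},
    {"id": "flange-13", "location": (150.0, 40.0)},
]
best = select_correspondence(matches, computed_location=(148.0, 41.0))
# flange-13 lies closest to the computed location and is selected
```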

Figure 6a is an illustration of a sample image 600a taken by an AUV during a survey according to a possible embodiment of the invention.

The image 600a comprises the representation of a pipeline 601a with a flange 602a and two perpendicular pipeline valves 603a and 604a.

It is noted that the representation of the pipeline 601a has a perspective effect: the two contour lines of the pipeline (which are normally parallel) cross at a vanishing point (outside image 600a).

In order to compensate for this perspective effect, it is possible to deform image 600a. Figure 6b is an illustration of a possible deformation of a sample image taken by an AUV during a survey according to a possible embodiment of the invention.

This deformation may comprise a perspective correction or perspective transformation (i.e. to set the contour lines parallel) and a rotation (i.e. to set the contour lines horizontal). Thus, objects of the non-transformed image 600a (i.e. elements 601a, 602a, 603a, 604a) are modified into new objects in a transformed image 600b (i.e. elements 601b, 602b, 603b, 604b).

Figure 7 is an illustration of a possible combination of sample images taken by an AUV during a survey according to a possible embodiment of the invention.

During a survey of a pipeline, an AUV may capture a plurality of images along the pipeline.

Due to image acquisition and the perspective effect, the pipe location is not stable between pairs of images. In order to correlate the images and to create the mosaic image, a correction (see above) is applied to each image so that the pipe becomes horizontal with a constant width in the image. The transformation may comprise a simple morphing changing the two detected lines (edges of the pipe) into two parallel and horizontal lines.
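Such a morphing can be sketched column by column: each column is stretched so that the detected top and bottom edges land on two fixed rows. A minimal Python sketch assuming grayscale images stored as lists of rows and edges given as functions of the column index (all names hypothetical, nearest-neighbour sampling for simplicity):

```python
def straighten_pipe(image, top, bot, T, B):
    """Remap each column of `image` so that the detected top edge
    y = top(x) maps to row T and the bottom edge y = bot(x) maps to
    row B: the two edges become parallel horizontal lines of constant
    separation B - T (requires B > T)."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for x in range(w):
        yt, yb = top(x), bot(x)
        for y in range(h):
            # relative position of output row y between the target edges...
            t = (y - T) / (B - T)
            # ...sampled at the same relative position between the detected edges
            src = round(yt + t * (yb - yt))
            if 0 <= src < h:
                out[y][x] = image[src][x]
    return out

# Tiny synthetic frame: a slanted bright band between the two edges.
top = lambda x: 2 + x // 2
bot = lambda x: top(x) + 3
img = [[1 if top(x) <= y <= bot(x) else 0 for x in range(6)]
       for y in range(10)]
flat = straighten_pipe(img, top, bot, T=2, B=5)
# after morphing, the band occupies the constant rows 2..5 in every column
```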

After the transformation of these images (701, 702, etc.), it is possible to combine the transformed images to create a panorama image (or mosaic image).

The mosaic image may be created with the following process:

a/ storing the n corrected images in a memory buffer;

b/ for the first two successive corrected images in the buffer (e.g. 701 and 702), analyzing these images by computing the inter-correlation between them; an overlapping zone (e.g. 704) is thus estimated, and the two images are then flattened into a single image;

c/ storing the flattened image in the buffer so as to replace the two successive corrected images at the first location in the buffer;

d/ if the buffer comprises more than one image, reapplying steps b/ and c/ to obtain the complete mosaic of the pipe 703.
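The buffer-and-flatten loop above can be sketched in a few lines of Python. This is only an illustration: images are lists of pixel rows, a per-column intensity profile acts as a cheap signature, and a mean squared-difference score stands in for the inter-correlation used to estimate the overlapping zone:

```python
def column_profile(img):
    # one value per column: the column sum acts as a cheap signature
    return [sum(row[x] for row in img) for x in range(len(img[0]))]

def overlap_width(a, b, max_overlap):
    """Estimate the overlapping zone (e.g. 704) between two successive
    corrected images (step b/): the candidate width minimizing the mean
    squared difference between the tail of a and the head of b."""
    pa, pb = column_profile(a), column_profile(b)
    best_w, best_err = 1, float("inf")
    for w in range(1, max_overlap + 1):
        err = sum((x - y) ** 2 for x, y in zip(pa[-w:], pb[:w])) / w
        if err < best_err:
            best_w, best_err = w, err
    return best_w

def flatten_pair(a, b, max_overlap):
    # flatten the two images into one, dropping b's overlapping columns
    w = overlap_width(a, b, max_overlap)
    return [ra + rb[w:] for ra, rb in zip(a, b)]

def build_mosaic(images, max_overlap=3):
    buf = list(images)                    # a/ buffer the corrected images
    while len(buf) > 1:                   # d/ repeat while >1 image remains
        buf[:2] = [flatten_pair(buf[0], buf[1], max_overlap)]  # b/, c/
    return buf[0]

# Three tiny 2-row frames, each overlapping the previous one by 2 columns:
frames = [[[1, 2, 3, 4]] * 2, [[3, 4, 5, 6]] * 2, [[5, 6, 7, 8]] * 2]
mosaic = build_mosaic(frames)
# each mosaic row reads 1..8: the three frames stitched into one strip
```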

Figure 8 is an illustration of possible defect detection method in a panorama image according to a possible embodiment of the invention.

For instance, it may be considered that there is a defect if the pipeline is not in contact with (or close to) the seabed. Indeed, if the distance between the seabed and the pipeline is too big (a gap) over a given length along the pipeline, the gravitational forces exerted on the pipeline could be dangerous for the pipeline integrity. A possible method for detecting such defects is described in the application FR 2 965 616.

Moreover, a possible method for detecting such defects may consist in:

- computing a panorama image according to the above method;

- extracting the part 800a of the panorama image corresponding to the area below the representation of the pipeline;

- for each vertical segment (810, 811, 812, 813, etc.) of the extracted part of the panorama image 800a, computing a "contrast variation value" or CVV (820, 821, 822, 823, etc.) related to a contrast of the pixels in that vertical segment;

- if the contrast value, or if the variation of the contrast value (within a zone, along a direction of space, etc.), is below a predetermined threshold (in the present example 190, line 800b), considering that a defect is present.

For instance, the zone 801a of the extracted part of the panorama image 800a, where a gap between the pipeline and the seabed is present, corresponds to the zone 801b in the graphic, where the contrast variation value is below 190.

The zone 802a of the extracted part of the panorama image 800a, where a gap between the pipeline and the seabed is present, corresponds to the zone 802b in the graphic, where the contrast variation values are below 190.
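This per-segment thresholding can be sketched as follows; a minimal Python illustration assuming the extracted strip is a list of grayscale rows and taking the max-min pixel spread as the contrast measure (the patent does not fix the exact contrast formula, so this choice is an assumption):

```python
def contrast_variation(column):
    # contrast of one vertical segment: max-min spread of its pixel values
    return max(column) - min(column)

def detect_gaps(strip, threshold=190):
    """Scan the strip below the pipeline (800a) column by column and
    flag the columns where the contrast variation value falls below the
    threshold (line 800b), hinting at a gap between pipeline and seabed."""
    h, w = len(strip), len(strip[0])
    cvv = [contrast_variation([strip[y][x] for y in range(h)])
           for x in range(w)]
    return [x for x, v in enumerate(cvv) if v < threshold]

# Synthetic strip: textured seabed columns have high contrast, while the
# uniform columns 2 and 3 (a hypothetical gap) have almost none.
strip = [
    [0,   0,   120, 120, 0,   0],
    [100, 100, 120, 120, 100, 100],
    [255, 255, 120, 120, 255, 255],
]
defective = detect_gaps(strip)
# columns 2 and 3 fall below the threshold and are flagged as a gap zone
```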

It may be possible to detect defects such as:

- debris in contact with subsea pipelines, through applying a real-time shape/pattern comparison to pre-identified patterns in the software database;

- drag/scar marks on the seabed, which are considered evidence of "walking pipelines";

- etc.

Upon the detection of such defects, it is possible to:

- produce a preliminary report and/or compare this report with the last produced report to highlight differences;

- identify the defects on the panorama image;

- re-program the AUV route to re-survey the area where defects have been detected;

- use/activate other detection means (such as acoustic sensors, sonar, etc.) to increase the accuracy of the defect detection;

- etc.

Figure 9 is a flow chart describing a method for simplifying defect recognition according to a possible embodiment of the invention.

Part of this flow chart can represent steps of an example of a computer program which may be executed by a circuit.

Upon the reception of a plurality of images (message 900), each image of the plurality may be modified to change its perspective and/or to rotate it (step 901). Once all images are modified, it is possible to combine the modified images to create a panorama (step 902).

The panorama image may be cropped in order to keep only the relevant part of the image (i.e. the part of the image close to the representation of the pipeline in the panorama image, step 903).

It is possible to process the panorama image to detect anomalies/defects (step 904), for instance according to the method described in patent application FR 2 965 616.

The panorama image may be marked according to the previous detection (step 905) to ease a future identification and verification of defects on the pipeline (for instance, to ease the visual inspection by operators/engineers).

The marks may be, for instance, vertical red lines at the location of the detected defects in the panorama image.
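The marking of step 905 can be sketched as painting the defect columns in red; a minimal Python illustration assuming an RGB panorama stored as a list of rows of (r, g, b) tuples, with the defect columns taken as given:

```python
RED = (255, 0, 0)

def mark_defects(panorama, defect_columns):
    """Step 905: draw a vertical red line at each detected defect column
    of the RGB panorama so operators can spot the defects at a glance."""
    marked = [row[:] for row in panorama]   # work on a copy of the image
    for row in marked:
        for x in defect_columns:
            row[x] = RED
    return marked

# 2x4 uniform grey panorama with hypothetical defects at columns 1 and 3:
panorama = [[(90, 90, 90)] * 4 for _ in range(2)]
marked = mark_defects(panorama, [1, 3])
# columns 1 and 3 are now red in every row; the other columns are untouched
```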

Finally, the marked panorama image (message 906) may be output for display, for instance, to the operators/engineers.

Figure 10 is a possible embodiment for a device that enables the present invention.

In this embodiment, the device 1000 comprises a computer, this computer comprising a memory 1005 to store program instructions loadable into a circuit and adapted to cause a circuit 1004 to carry out the steps of the present invention when the program instructions are run by the circuit 1004.

The memory 1005 may also store data and useful information for carrying out the steps of the present invention as described above. The circuit 1004 may be, for instance:

- a processor or a processing unit adapted to interpret instructions in a computer language, the processor or the processing unit possibly comprising, being associated with or being attached to a memory comprising the instructions, or

- the association of a processor / processing unit and a memory, the processor or the processing unit adapted to interpret instructions in a computer language, the memory comprising said instructions, or

- an electronic card wherein the steps of the invention are described within silicon, or

- a programmable electronic chip such as an FPGA chip (for "Field-Programmable Gate Array").

This computer comprises an input interface 1003 for the reception of data used for the above method according to the invention and an output interface 1006 for providing a panorama image, navigation control instructions, or an update of the AUV location as described above.

To ease the interaction with the computer, a screen 1001 and a keyboard 1002 may be provided and connected to the computer circuit 1004.

A person skilled in the art will readily appreciate that various parameters disclosed in the description may be modified and that various embodiments disclosed may be combined without departing from the scope of the invention.

For instance, the description proposes embodiments with pipeline examples. Any cylindrical body may replace these pipelines.