Title:
CAMERA ASSISTED CONTROL SYSTEM FOR AN UNDERWATER VEHICLE
Document Type and Number:
WIPO Patent Application WO/2018/186750
Kind Code:
A1
Abstract:
Control system and method for an underwater vehicle comprising - device for receiving captured images of an underwater scene, - processing means for processing the received captured images, - means for controlling at least one propulsion device of the underwater vehicle, characterised by that the processing means tracks at least one object in the received images, wherein the means for controlling the at least one propulsion device is operated by the processing means to keep the tracked object substantially within the received images.

Inventors:
VIGGEN HENRIKSEN ANDREAS (NO)
Application Number:
PCT/NO2018/050090
Publication Date:
October 11, 2018
Filing Date:
April 05, 2018
Assignee:
BLUEYE ROBOTICS AS (NO)
International Classes:
G05D1/00; G01C3/00
Foreign References:
US 9164506 B1 (2015-10-20)
US 6624899 B1 (2003-09-23)
Other References:
UNKNOWN: "Converting from RGB to HSV", 10 May 2015 (2015-05-10), XP055483090, Retrieved from the Internet [retrieved on 20180611]
Attorney, Agent or Firm:
ACAPO AS (NO)
Claims:
Claims

1. Control system for an underwater vehicle (200) comprising:

- means (202) for receiving captured images of an underwater scene,

- processing means (202) for processing the received captured images,

- means (216) for controlling at least one propulsion device (210) of the underwater vehicle,

characterised in that the processing means (202) are provided with means for tracking at least one object in two dimensions in the received captured images, wherein the means (216) for controlling the at least one propulsion device (210) is operated by the processing means to keep the tracked object substantially within the received images by using the at least one propulsion device (210) to change at least one of a group comprising orientation and position of the underwater vehicle (200),

and wherein the object tracking is performed by using means for converting from an RGB colour model to a cylindrical-coordinate representation.

2. Control system according to claim 1, characterised by the means (216) for controlling the at least one propulsion device (210) being operated by the processing means (202) to locate the at least one tracked object at a desired location in the received captured image.

3. Control system according to claim 1, characterised by the means (216) for controlling the at least one propulsion device (210) being operated by the processing means (202) in such a way that the at least one area of the at least one tracked object is maintained substantially similar over a sequence of received captured images.

4. Control system according to any of claims 1 to 3, characterised by further comprising means (202) for detecting, in the received captured image, an area of at least one reflection (261,263,265,267) of at least one light emitter (260,262,264,266) emitting into the underwater scene for calculating a distance between the at least one light emitter source and the at least one reflection.

5. Control system according to any of claims 1 to 4, characterised by further comprising means (202) for detecting, in the received captured images, a separation distance between two reflections of two light emitters (260,262,264,266) emitting into the underwater scene for calculating a distance between the light emitter source and the reflections.

6. Control system according to claim 1 or 4, characterised by further comprising means for detecting, in the received captured images, at least three reflections (261,263,265,267) of at least three light emitters (260,262,264,266) emitting into the underwater scene,

wherein the at least three light emitters are organized as corners in a polygon in a first plane (252),

wherein the at least three reflections are located in at least one second plane (270),

wherein the reflections are used to calculate distances between the first and the second plane, and

wherein the calculated distances are used to calculate at least an angle Θ between the first and the at least one second plane.

7. Control system according to claim 6, characterised by the at least two light emitters emitting in parallel into the underwater scene.

8. Control system according to claim 6, characterised by the at least two light emitters emitting convergently into the underwater scene.

9. Control system according to claim 6, characterised by the at least two light emitters emitting divergently into the underwater scene.

10. Underwater vehicle system (100) comprising:

- underwater vehicle (200) comprising:

- body for attaching components,

- first image capturing device (220) attached to the body for capturing images in a first direction,

- at least one propulsion device controller (216) for moving the underwater vehicle,

characterised by further comprising:

- control system according to claim 1.

11. Underwater vehicle (200) according to claim 10, characterised by further comprising:

- at least one light emitter (260,262,264,266) attached to the body emitting light in the first direction.

12. Underwater vehicle (200) according to claim 10, characterised by further comprising a buoyancy unit to provide the underwater vehicle with positive buoyancy.

13. Underwater vehicle (200) according to claim 10 or 11, characterised by further comprising a second image capturing device (250) for capturing images in a second direction.

14. Underwater vehicle according to claim 13, characterised by the images captured by the second image capturing device (250) overlapping the images captured by the first image capturing device (220).

15. Method for guiding an underwater vehicle system according to claim 10, comprising the following steps:

- capturing at least one image of a first area, and

- navigating the underwater vehicle based on at least a first detected object in the captured image.

16. Method according to claim 15, characterised by the first detected object being at least one reflection of at least one light emitter.

17. Method according to claim 15 or 16, characterised by the first detected object being an ideogram providing localised information.

18. Method according to any of claims 15 to 17, characterised by the navigating of the underwater vehicle further comprising the step of:

- determining at least a first distance between the first detected object and the underwater vehicle and navigating the underwater vehicle to generally maintain the at least first distance over a sequence of captured images.

19. Method according to claim 18, characterised by the navigating of the underwater vehicle further comprising the steps of:

- determining at least a second distance between the underwater vehicle and at least a second object, and

- navigating the underwater vehicle to additionally generally maintain the second distance over a sequence of captured images.

20. Method according to claim 19, characterised by the navigating of the underwater vehicle further comprising the steps of:

- determining at least a third distance between the underwater vehicle and at least a third object,

- navigating the underwater vehicle to additionally generally maintain the third distance over a sequence of captured images.

Description:
Camera Assisted Control System for an Underwater Vehicle

Technical field

The present invention relates to a system and a method for controlling the orientation and position of an underwater vehicle, according to the preamble of the independent claims.

Background

There has been increased interest in research and development of technical solutions for underwater vehicles. There are many solutions for performing complex underwater operations with high precision, such as seabed mapping, online underwater monitoring, subsea installations, and maintenance on pipes. There are many types of Unmanned Underwater Vehicles (UUV) on the commercial market. Remotely Operated Vehicles (ROV) and Autonomous Underwater Vehicles (AUV) are most common. In most cases, underwater vehicles are expensive and not affordable to private consumers. When operating an underwater vehicle in the sea or other natural bodies of water, the vehicle is exposed to currents that make it difficult to operate. These forces change continuously and induce undesired oscillations and motions that are difficult to compensate for manually.

The inventors have appreciated that the known underwater vehicles described exhibit one or more problems, for example:

(a) the underwater vehicles are cumbersome to operate, requiring training of the operator,

(b) the underwater vehicles are costly to manufacture,

(c) the underwater vehicles do not provide solutions for maintaining dynamic position,

(d) the underwater vehicles lack the ability to perceive the surrounding environment, and

(e) the underwater vehicles lack the control needed to compensate for currents acting on the vehicles under water.

The invention is therefore devised to solve one or more of the problems described in (a) to (e).

Summary of invention

The objective of the present invention is attained by a control system, an underwater system and a method having the features defined in the independent claims.

Preferred embodiments of the present invention are further defined by the dependent claims.

In this specification the term "light emitter" is used. It should be appreciated that the term "light emitter" is to be interpreted broadly to include lasers, flashlights or similar devices that may emit light in a direction. The objective of the "light emitter" is to illuminate an area, and the illumination may be detected by an image capturing device. The "light emitter" may also illuminate particles in such a way that the emitted light creates a detectable ray or cone of light. The light emitter may be constructed in such a way that it illuminates an object with a specific pattern. The pattern may contain information in the form of an ideogram.

An ideogram may be understood as a graphic symbol that represents an idea or concept, independent of any particular language and of specific words or phrases. Examples are arrows and symbols for items on a boat, like bow thrusters, the water line etc. The ideogram may also contain information like direction, orientation, coordinates or other information suitable for navigating the underwater vehicle. In the context of this specification, ideograms may also contain characters, figures and numerals. The ideograms may contain information regarding the direction in which the underwater vehicle should film. The ideogram may contain instructions for a path that should be followed by the underwater vehicle. The ideogram may contain information that may be used to retrieve additional information from a database containing information regarding how to navigate the underwater vehicle.

According to a first aspect of the present invention there is provided a control system for an underwater vehicle comprising means for receiving captured images of an underwater scene, processing means for processing the received captured images, and means for controlling at least one propulsion device of the underwater vehicle. The processing means are provided with means for tracking at least one object in two dimensions in the received captured images. The means for controlling the at least one propulsion device is operated by the processing means to maintain the tracked object substantially within the received images. The at least one propulsion device is used to change at least one of a group comprising orientation and position of the underwater vehicle. The object tracking is performed by using means for converting from an RGB colour model to a cylindrical-coordinate representation. One advantage is ease of tracking objects under varying illumination in an underwater scene.

The means for controlling the at least one propulsion device may be operated by the processing means to locate the at least one tracked object at a desired location in the received captured image.

Locate in this context means that the means for controlling the at least one propulsion device compensates for changes in the location of the tracked object by using the propulsion means to keep the tracked object at the desired location in the received captured image.

The means for controlling the at least one propulsion device may be operated by the processing means in such a way that the at least one area of the at least one tracked object is maintained substantially similar over a sequence of received captured images.

The object tracking may be performed by using means for converting from an RGB colour model to a cylindrical-coordinate representation, like HSV, HSL, ZXY or similar.
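To illustrate why such a conversion helps (a minimal sketch using Python's standard library, with invented sample values; this is not code from the patent): two RGB samples of the same surface colour at different brightness map to the same hue, so a hue threshold survives illumination changes.

```python
import colorsys

def to_hsv(rgb):
    """Convert an 8-bit RGB triple to HSV with components in [0, 1]."""
    r, g, b = (c / 255.0 for c in rgb)
    return colorsys.rgb_to_hsv(r, g, b)

# The same surface colour under bright and dim illumination (invented values).
bright = (200, 100, 50)
dim = (100, 50, 25)  # every channel halved

h1, s1, v1 = to_hsv(bright)
h2, s2, v2 = to_hsv(dim)

# The hue channel is unchanged; only the value (brightness) channel differs,
# which is why thresholding on hue is robust to changing light conditions.
print(abs(h1 - h2) < 1e-9)  # True
print(v1 > v2)              # True
```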

The control system may further comprise means for detecting, in the received captured image, an area of at least one reflection of at least one light emitter emitting into the underwater scene for calculating a distance between the at least one light emitter source and the at least one reflection. One advantage is ease of estimating distance using a light emitter and a camera.

The control system may further comprise means for detecting, in the received captured images, a separation distance between two reflections of two light emitters emitting into the underwater scene for calculating a distance between the light emitter source and the reflections. One advantage is improved precision in measurements and to calculate orientation of the underwater vehicle.
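Under a pinhole camera model, the separation measurement reduces to similar triangles: two parallel lasers mounted a fixed baseline apart appear as dots whose pixel separation is inversely proportional to the distance of the reflecting surface. The sketch below shows the calculation; the function name and the numbers are illustrative assumptions, not the patent's implementation.

```python
def distance_from_dot_separation(focal_px: float, baseline_m: float,
                                 separation_px: float) -> float:
    """Distance to a surface from the pixel separation of two parallel
    laser dots, via the pinhole relation:
        separation_px = focal_px * baseline_m / distance."""
    if separation_px <= 0:
        raise ValueError("the two laser dots must be resolvable in the image")
    return focal_px * baseline_m / separation_px

# Example: 1000 px focal length, lasers 5 cm apart, dots seen 25 px apart.
print(distance_from_dot_separation(1000.0, 0.05, 25.0))  # 2.0 (metres)
```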

The control system may further comprise means for detecting, in the received captured images, at least three reflections of at least three light emitters emitting into the underwater scene. The at least three light emitters are organized as corners in a polygon in a first plane. The at least three reflections are located in at least one second plane. The reflections are used to calculate distances between the first and the at least one second plane. One advantage is improved precision of distance measurements and the ability to calculate the orientation of the underwater vehicle. The calculated distances are used to calculate at least an angle Θ between the first and the at least one second plane. One advantage is improved determination of the orientation of the underwater vehicle.
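The angle calculation can be sketched from two of the per-laser distances: if two parallel lasers separated by a known baseline measure different distances to the surface, the tilt of the second plane relative to the first follows from the arctangent of the difference over the baseline. This is an illustrative simplification of the three-or-more-laser case; the function name and values are assumptions.

```python
import math

def plane_tilt(d1_m: float, d2_m: float, baseline_m: float) -> float:
    """Tilt angle (radians) of the reflecting plane relative to the plane
    of the lasers, from two distance measurements a baseline apart.
    A result of 0 means the surface is faced head-on."""
    return math.atan2(d1_m - d2_m, baseline_m)

# Left laser reads 2.1 m, right laser 2.0 m, lasers 10 cm apart:
# the surface is tilted roughly 45 degrees relative to the laser plane.
print(round(math.degrees(plane_tilt(2.1, 2.0, 0.1)), 3))  # 45.0
```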

The at least two light emitters may be emitting in parallel into the underwater scene. One advantage is improved precision of distance measurements.

The at least two light emitters may be emitting convergently into the underwater scene. One advantage is improved precision of distance measurements. The at least two light emitters may be emitting divergently into the underwater scene. One advantage is improved precision of distance measurements.

According to a second aspect of the present invention there is also provided an underwater vehicle system comprising an underwater vehicle which comprises a body for attaching components, a first image capturing device attached to the body for capturing images in a first direction, at least one propulsion device controller for moving the underwater vehicle, and further comprising a control system.

According to an embodiment, the underwater vehicle may further comprise at least one light emitter attached to the body emitting light in the first direction.

According to another embodiment, the underwater vehicle may further comprise a buoyancy unit to provide the underwater vehicle with positive buoyancy.

According to yet another embodiment, the underwater vehicle may further comprise a second image capturing device for capturing images in a second direction.

According to yet another embodiment, the images captured by the second image capturing device may overlap with the images captured by the first image capturing device.

According to a third aspect of the present invention there is provided a method for guiding an underwater vehicle system comprising the steps of capturing at least one image of a first area and navigating the underwater vehicle based on at least a first detected object in the captured image.

According to another embodiment, the first detected object may be at least one reflection of at least one light emitter.

According to yet another embodiment, the first detected object may be an ideogram providing localised information.

Localised information is, for example, information about an object and its surroundings. Localised information may be coordinates of an object, the geometrical shape of an object, areas that should be avoided in the proximity of the object, or similar. The localised information may advantageously be used for automatically navigating the underwater vehicle.

According to yet another embodiment, the navigating of the underwater vehicle may further comprise the step of determining at least a first distance between the first detected object and the underwater vehicle, wherein the underwater vehicle may be navigated to keep the at least first distance substantially constant over a sequence of captured images.

According to yet another embodiment, the navigating of the underwater vehicle may further comprise the steps of determining at least a second distance between the underwater vehicle and at least a second object and navigating the underwater vehicle to additionally keep the second distance substantially constant over a sequence of captured images.

According to yet another embodiment, the navigating of the underwater vehicle may further comprise the steps of determining at least a third distance between the underwater vehicle and at least a third object and navigating the underwater vehicle to additionally keep the third distance substantially constant over a sequence of captured images.

It will be appreciated that the features of the invention described in the foregoing can be combined in any combination without departing from the scope of the invention.

Description of figures

The invention will now be described with the help of the enclosed figures, showing a system according to the present invention. The different parts of the figures are not necessarily in scale to each other, as the figures are merely for illustrating the invention.

The following description of an exemplary embodiment refers to the drawings, and the following detailed description is not meant or intended to limit the invention. Instead, the scope of the invention is defined by the appended claims.

Reference throughout the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with an embodiment is included in at least one embodiment of the subject matter disclosed. Thus, the appearance of the phrase "in one embodiment" or "in an embodiment" in various places throughout the specification is not necessarily referring to the same embodiment. Further, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.

A preferred embodiment of the invention will now be described, by way of example, with reference to the following figures:

Fig. 1 shows a diagram of the six degrees of freedom for an underwater vehicle,

Fig. 2 shows an embodiment of an underwater vehicle in an underwater environment,

Fig. 3 shows a view from a camera seen from the underwater vehicle in Fig. 1,

Fig. 4 shows a diagram for a feature tracking algorithm,

Fig. 5 shows an imaging system with two lasers,

Fig. 6 shows an imaging system with four lasers,

Fig. 7 shows the geometry between two lasers and a wall, and

Fig. 8 shows an overview of the control system.

Detailed description

The present invention relates to a system for orientating an underwater vehicle to orient a camera. The camera is mounted in the underwater vehicle and is directed in a substantially fixed direction relative to the underwater vehicle. To change the direction in which the camera is capturing images, the underwater vehicle is required to change orientation. The change in orientation of the underwater vehicle is achieved by using propulsion devices mounted on the underwater vehicle. The control of the orientation of the underwater vehicle is performed based on the location of an object. The goal is to have the object in the centre of the captured images. Digital object recognition can identify and track objects in real time in the captured images. An operator selects one or more objects to be tracked. A control system will attempt to orientate the underwater vehicle in such a way that the objects are located at the centre of the captured images. The underwater vehicle will compensate for currents, drag from an umbilical cable, or other forces that may influence the position and orientation of the underwater vehicle. The compensation is performed using the propulsion devices mounted on the underwater vehicle.

The propulsion devices are used to keep an object in a wanted location in the captured images. The propulsion devices can change the orientation and the position of the underwater vehicle. The camera of the underwater vehicle thereby keeps the object at a desired location in the captured images and can track the object by using the propulsion devices. The camera is in some embodiments fixed in the underwater vehicle. In other embodiments the camera is mounted on a gimbal or similar to allow some controlled horizontal and vertical movement of the camera. One advantage of a motorized gimbal is to allow user control of the camera while the underwater vehicle maintains a fixed position and orientation. A gimbal may also improve the image quality by reducing the impact of vibrations, oscillations and other unwanted movement of the underwater vehicle. The camera in the underwater vehicle may be provided with image stabilization to improve the image quality.
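A minimal sketch of how such a correction could map pixel error to thruster commands (a hypothetical proportional controller; the function, gains and image size are assumptions, not the patent's control law):

```python
def thrust_commands(obj_x, obj_y, width, height, k_yaw=1.0, k_heave=1.0):
    """Map the tracked object's pixel position to yaw and heave commands
    in [-1, 1] that steer the camera back towards the object; (0.0, 0.0)
    means the object is already centred."""
    err_x = (obj_x - width / 2) / (width / 2)    # -1 (left edge) .. 1 (right edge)
    err_y = (obj_y - height / 2) / (height / 2)  # -1 (top edge)  .. 1 (bottom edge)
    clamp = lambda v: max(-1.0, min(1.0, v))
    # Object right of centre -> yaw right; object below centre -> dive.
    return clamp(k_yaw * err_x), clamp(k_heave * err_y)

# Object centred in a 1280x720 image: no correction needed.
print(thrust_commands(640, 360, 1280, 720))   # (0.0, 0.0)
# Object at the right edge, vertically centred: full yaw, no heave.
print(thrust_commands(1280, 360, 1280, 720))  # (1.0, 0.0)
```

In practice the gains would be tuned, and an integral or derivative term could be added to damp oscillations induced by currents.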

In the context of the description the terms underwater vehicle, ROV, UUV, and AUV may be used interchangeably. An underwater vehicle should be interpreted as a vehicle that may be remotely operated either wirelessly or over an umbilical cable. Underwater vehicle should also encompass fully or partially autonomously operated underwater vehicles.

Fig. 1 shows a diagram of the six degrees of freedom for an underwater vehicle. For the purpose of illustration, the point O refers to the centre of mass of an underwater vehicle. Translation, or moving forward or backward along the X-axis, is referred to as surge or surging. Translation, or moving left or right along the Y-axis, is referred to as sway or swaying. Translation, or moving up or down along the Z-axis, is referred to as heave or heaving. Rotation, or tilting side to side on the X-axis, is referred to as roll or rolling; this is indicated as K. Rotation, or tilting forward or backward on the Y-axis, is referred to as pitch or pitching; this is indicated as M. Rotation, or turning left or right on the Z-axis, is referred to as yaw or yawing; this is indicated as N.

Fig. 2 shows an embodiment of the underwater system 100. The system comprises an underwater vehicle 200 with an umbilical 401. The umbilical 401 is used to communicate data and instructions between a control unit 300 and the control logic in the underwater vehicle 200. The umbilical 401 may be an Ethernet cable, fibre optic cable or similar that is suitable for immersion in water and that can be used for transmitting digital data between a control unit 300 and the underwater vehicle 200. The first end of the umbilical 401 is connected to the underwater vehicle 200 and the second end of the umbilical is coupled to a control unit 300 or relay unit 400. The relay unit 400 communicates with one or more control units 300 and relays data and instructions between the underwater vehicle 200 and the control unit 300.

The relay unit 400 or the control unit 300 may also provide power, wireless communication in water, a waterproof enclosure for fragile electronics, or similar. Power may be transmitted to the underwater vehicle 200 using the umbilical 401. The underwater vehicle is preferably equipped with a battery to be able to operate autonomously without requiring power from an external source.

The underwater vehicle 200 shown in Fig. 2 comprises a watertight enclosure for protecting electronics from water. Some of the electronics may be exposed to water or protected by suitable products like epoxy resins etc. The underwater vehicle 200 is equipped with a plurality of thrusters to be able to move itself in up to six directions. The underwater vehicle in Fig. 2 has two thrusters 210 for moving the vehicle backwards and forwards. There is one thruster for moving the vehicle upwards and downwards. An additional thruster is provided for moving the vehicle sideways. A camera 220 is fixedly mounted in the front of the underwater vehicle 200. The camera 220 captures images preferably in a fixed direction with respect to the underwater vehicle 200. The camera 220 faces in a defined heading. In this embodiment the camera heading is aligned with a horizontal and vertical axis of the underwater vehicle. Changes to the orientation of the underwater vehicle will change the heading of the camera 220. To change the heading, the thrusters of the underwater vehicle 200 are activated. In this embodiment the camera 220 is a wide-angle camera, i.e. a fisheye camera, giving a large field of view. The camera 220 may be another type of camera suitable for capturing underwater images, like a hyperspectral camera, tele objective, zoom objective, or 360-degree camera.

Fig. 8 shows an overview of the underwater vehicle 200. The underwater vehicle 200 is equipped with a single board computer 202 that runs an operating system, e.g. a Raspberry Pi 2 running Ubuntu Linux 14.04 with Robot Operating System Indigo. The single board computer 202 provides connections to and control of several peripheral units. Among the peripheral units are a pressure sensor 206 and an IMU 208. These sensors 206, 208 are connected to an intermediate microcontroller 204, e.g. an Arduino Mega, where the microcontroller 204 is in turn connected to the single board computer 202 via a suitable bus interface like SPI. A motor controller board 216 is provided for controlling and regulating the propulsion provided by the thrusters 210, 212, 214. The motor controller board 216 is connected to the single board computer 202 via the microcontroller 204. An application executed on the single board computer 202 instructs the motor controller board 216 to regulate and control the thrusters. The camera 220 is also connected to the single board computer 202, using USB. The single board computer 202 is equipped with one or more processors that can execute digital object recognition algorithms. Additional cameras 250 may be connected to the single board computer 202 for providing additional image capturing possibilities.

The underwater vehicle may be manufactured with static buoyancy. The vehicle is positively buoyant and will align its vertical axis vertically, thereby floating stably in the water. In that case pitch and roll movement of the vehicle is not controllable. In other embodiments the buoyancy and the centre of mass may be controlled to enable control of pitch and roll of the vehicle.

Alternatively the underwater vehicle may be moved by wires connected to winches or similar located in proximity of the underwater vehicle. Heave movement of the underwater vehicle may be achieved by using a wire or the umbilical to lower or raise the vehicle from above the water. In other embodiments the underwater vehicle may be anchored to the seabed using an anchor. Heave movement of the vehicle may then be performed by adjusting the distance to the anchor with a cord or wire.

During a dive of the underwater vehicle the operator controls the vehicle from a control unit above water. The control unit is a tablet, e.g. an iPad, iPhone, smartphone, computer or similar. The control unit connects to the relay unit over a wireless network and communicates with the vehicle via the umbilical. An application on the tablet provides functionality to send data to the underwater vehicle and to receive data from the vehicle. The application provides keys or inputs for instructing the vehicle to enable thrusters for changing the orientation of the vehicle. The application receives images captured by the camera and displays the images on the tablet screen.

The control unit may also take the form of a laptop, a desktop pc, smartphone, remote control or similar unit that can transmit instructions to the underwater vehicle.

It is desirable to construct a system that reduces the skills required to operate an underwater vehicle. A common scenario when operating an underwater vehicle under water is that an interesting object is spotted and the operator desires to inspect this object more thoroughly and capture a film of it. It is advantageous for the underwater vehicle to maintain its position relative to the object for capturing stable images of the object. Special features may also be implemented to create special footage of the object, like orbiting around the object, filming at a certain distance, or filming at a certain angle. Maintaining the desired direction of the camera towards a desired object is achieved using digital object recognition in the captured images and adjustment of the vehicle direction based on the object's location in the captured images.

Alternatively the system may comprise multiple image capturing devices that capture images at different locations and angles around the vehicle. This may be advantageous for tracking objects from multiple angles and improving the accuracy of the object tracking.

Fig. 3 shows a field of view 600 from the camera 220. The camera has a field of view 600 with a resolution in two axes defined as X and Y. An object 500 that is desired to be tracked is detected in the captured images. The captured images with the object 500 are framed with an outer rectangle 610.

The object that is desired to be filmed, like a part of a shipwreck, a rock, a slowly moving object or other non-mobile object, is preferably tracked using a feature tracking method. Feature tracking isolates features of the tracked object. Several types of object tracking were considered. Background subtraction was considered unsuitable because the underwater vehicle is moving most of the time, while the object of interest is most likely stationary relative to the background.

Optical flow was considered, but is not suitable due to particles in the water. The filtering strategy is based on unique colours in the captured images. Finding the right limits for the colour of interest can be difficult. A first attempt used the RGB colour system, but RGB was not found suitable since changing light conditions change the RGB values, which makes the values difficult to track. The preferred colour method was to use a cylindrical-coordinate representation such as Hue Saturation Value (HSV). With HSV, a hue value represents a main colour, a saturation value represents the intensity of the main colour, and a value represents the brightness of the main colour. In changing light conditions the main colour remains constant, which eases the tracking of features. A certain threshold on the object's HSV values is used to track the location of an object in the captured images. An advantage of filtering by the object's colour is that it is robust in changing light conditions. It may also be suitable to use similar cylindrical-coordinate colour representations like HSL, YUV, or ZXY or other suitable colour representations that have one or more colour components that are fairly constant regardless of the illumination of the captured scene.
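The HSV-threshold tracking described above can be sketched in a few lines of standard-library Python: threshold every pixel on hue and saturation, then return the centroid of the matching region. The image, thresholds and helper name are invented for illustration; a real implementation would typically use an optimized vision library.

```python
import colorsys

def track_by_hue(image, hue_lo, hue_hi):
    """Locate an object in an RGB image (a list of rows of (r, g, b)
    tuples) by thresholding on hue; returns the (x, y) centroid of the
    matching pixels, or None if nothing matches."""
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
            # Require some saturation so grey water is not matched.
            if hue_lo <= h <= hue_hi and s > 0.3:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)

# 4x4 synthetic scene: blue-grey "water" with a 2x2 yellow patch
# in the bottom-right corner.
water, yellow = (40, 60, 80), (230, 220, 30)
image = [[water] * 4 for _ in range(4)]
for y in (2, 3):
    for x in (2, 3):
        image[y][x] = yellow

# Yellow hue sits around 1/6 of the hue circle.
print(track_by_hue(image, 0.10, 0.25))  # (2.5, 2.5)
```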

Alternatively the feature tracking may use other features like patterns, surface structure, colour combinations or similar features that are easy to track with object recognition. The underwater vehicle may also be equipped with an illumination source, such as a lamp, to be able to film at depth with little light, or during night or low-light conditions. This may be a calibrated lamp that illuminates the scene with calibrated light to improve the feature tracking and the quality of image recordings. To compensate for changing light conditions the lamp may have a controllable intensity.

Alternatively the feature tracking may track a specific feature by tracking an object with a distinct colour which is attached to the object that is to be kept in the field of view of the camera. For example a diver may have a yellow ball attached, e.g. on the head or on the back. Using feature tracking, the yellow ball is easily distinguished from the surroundings. The underwater vehicle is then directed towards the diver and may follow the diver during a dive. The yellow ball is maintained at a desired location in the camera's field of view. A change in distance between the yellow ball and the underwater vehicle is detected by tracking the size of the yellow ball in the images captured by the camera. The yellow ball is only an example; a similar effect is achieved using balls or other objects with clearly visible or detectable features, such as colours or patterns on the object.

The underwater vehicle typically performs vertical movement at 0.6 m/s using one or more thrusters. The vertical movement may be faster than 0.6 m/s to allow rapid change of depth, or slower than 0.6 m/s to allow fine control of the vertical movement and to compensate for currents in the water. The underwater vehicle may be propelled forward at up to 1.5 m/s. It may be propelled forwards or backwards at speeds higher than 1.5 m/s to achieve fast displacement, or slower than 1.5 m/s to e.g. compensate for slow water currents.

The feature tracking may also use visual anchors. These visual anchors may be particular patterns, barcodes, QR codes or similar. On a ship hull there are usually markings that indicate the location of bow thrusters, propellers, depth below surface etc. In addition the boat or other object may be equipped with barcodes, patterns or QR codes containing information regarding location, distance, direction, type of object or other information. The control system of the underwater vehicle may use this information to perform an inspection of the object in a preprogrammed fashion. In other embodiments the information may be used to inform the control system that certain objects, such as inlets to bow thrusters, are to be avoided. Such visual anchors may be used to make sure that an inspection starts at the same location when a previous inspection is to be rerun. The visual anchors may also contain information regarding the direction, depth, distance or similar the underwater vehicle should have relative to the object.

Referring to Fig. 4, a diagram of a feature tracking algorithm is shown. The algorithm is implemented as a software program, but may also wholly or partially be performed by hardware logic. The algorithm is executed in the control unit, but may also be executed in the single board computer of the vehicle, or in combinations where parts of the algorithm are executed in the control unit and other parts in the single board computer. The vehicle is operated under water and the camera captures images. The operator selects an object in the captured images, and a colour interval of HSV colour values is calculated representing the expected range within which the HSV values will vary. The sequential steps in the feature tracking algorithm are represented by the states S1-S10 as shown in Fig. 4. For each captured image S1, the captured image is filtered S2 to only contain pixels within the colour interval. The filtered image is converted into a binary image S3, where the pixels within the HSV colour interval are converted to white and the remaining pixels are converted to black.

A Gaussian blur is performed on the binary image to filter out noise grains. An erode function S4 is used to remove noise at the edges, followed by a dilate function S5. The dilate function smooths the edge of the white area by increasing the minimum size of a pixel group. This is necessary in order to avoid the mask being separated into two contours. The output of the dilate function S5 is fed into a find contour function S6 for finding the outer contour of the white body. Based on the contour, a rectangle is drawn S7 around the contour. The rectangle is drawn over the unfiltered captured image. The centre point of the rectangle is calculated S8 by averaging the pixel coordinates of the four corners. The calculated centre points for a sequence of captured images are low-pass filtered S9 and presented S10 to the control system of the vehicle. The feature tracking and the needed image manipulation functions may be implemented using OpenCV or a similar programming solution that provides similar computer vision functionality. Other suitable edge detection methods may also be used, like Canny edge detection or similar.
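Steps S6-S9 can be sketched in plain Python as follows. This is a minimal illustration on a binary mask given as a list of rows; in a real implementation OpenCV functions such as `cv2.findContours` and `cv2.boundingRect` would supply the corresponding operations on images, and the filter coefficient below is an illustrative assumption:

```python
def bounding_rect(mask):
    """S6-S7: bounding rectangle (x0, y0, x1, y1) of the white (truthy)
    pixels in a binary mask given as a list of rows."""
    xs = [x for row in mask for x, v in enumerate(row) if v]
    ys = [y for y, row in enumerate(mask) for v in row if v]
    if not xs:
        return None  # no white body found in this frame
    return min(xs), min(ys), max(xs), max(ys)

def rect_centre(rect):
    """S8: centre point as the average of the rectangle corner coordinates."""
    x0, y0, x1, y1 = rect
    return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)

def low_pass(points, alpha=0.5):
    """S9: first-order low-pass filter over a sequence of centre points,
    smoothing the coordinates handed to the control system (S10)."""
    out, prev = [], None
    for x, y in points:
        prev = (x, y) if prev is None else (
            alpha * x + (1 - alpha) * prev[0],
            alpha * y + (1 - alpha) * prev[1])
        out.append(prev)
    return out
```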

The feature tracking algorithm may also be configured in such a way that it detects objects that have features suitable for being tracked. These may be features like areas with uniform colour or pattern. The objects that are suitable for being tracked are presented to the operator, whereby the operator may select one or more of them. The control system will then operate the underwater vehicle to orient the camera towards the selected object or objects.

The feature tracking algorithm may also have or use a database of known images of shellfish, fish or other objects. If similar objects are detected in the captured images, the objects may be indicated and presented to the operator. The operator may then select the object desired to track. It may also be advantageous that the control system automatically tracks known objects to facilitate capturing video footage. The feature tracking algorithm may be configured to detect a specific object, such as a coloured ball attached to a diver. The underwater vehicle detects the ball in the captured images and is configured to follow the coloured ball as the diver moves during a dive. This feature may be used to guide the underwater vehicle to a site for capturing images of that site, or to perform additional automated tasks together with the diver. Alternatively other filters may be used, like Kalman filters or similar more advanced filters that take into account sensor data and changes over time, or even perform predictions of future events.

Definition of heading

In the scope of this invention, heading means the direction the camera is facing. A change of the heading of the underwater vehicle to a new heading will make the camera face in the new heading direction. In an embodiment using 360 degree cameras, the heading will be selected based on a logical interpretation of what the surge direction of the vehicle is.

The system allows for several modes of operation of the underwater vehicle. The operator can alternate between the various modes of operation. The level of input from the operator and the level of control of the degrees of freedom (DOF) differ depending on the mode of operation.

Heading mode

One of the modes of operation of the vehicle is heading mode. In this mode, the objective of the feature tracking algorithm is to maintain the desired object in the field of view of the camera by controlling the yaw motion of the underwater vehicle. That is, if the object starts to drift out of the field of view to the left, the vehicle should turn to the left to keep the object at the centre of the field of view. The horizontal position of the object is used in this mode. The other DOFs (surge, sway and heave) are controlled by the operator using the control unit. This mode provides good flexibility in manually manoeuvring in front of and around the object. In a suitable vehicle, pitch and roll may also be manually controlled.
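The yaw correction in heading mode can be illustrated with a minimal proportional controller on the horizontal pixel error; the gain, dead-band and output clamp below are illustrative assumptions, not values from the application:

```python
def yaw_command(object_x, image_width, gain=0.005, deadband=10):
    """Map the tracked object's horizontal pixel position to a normalised
    yaw thruster command: negative turns the vehicle left, positive right."""
    error = object_x - image_width / 2.0  # pixels right of the image centre
    if abs(error) <= deadband:
        return 0.0  # object close enough to the centre: no correction needed
    # Clamp to the thruster's normalised command range [-1, 1].
    return max(-1.0, min(1.0, gain * error))
```

If the object drifts towards the left edge of the image, the error is negative and the command turns the vehicle left, matching the behaviour described above.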

Distance mode

A second mode of operation is distance mode. Distance mode also provides heading control like in the heading mode, but in addition takes the area of the tracked feature into consideration when controlling the vehicle, enabling automatic control of the surge motion. Hence, a certain distance to the object is maintained while the object is kept in the field of view. It is still possible for an operator to manually control the pitch, roll, heave and sway of the underwater vehicle.

Orbit mode

A third mode of operation is orbit mode. Orbit mode introduces auto-depth control using input from the pressure sensor. Pressure data from the pressure sensor is used to maintain the depth of the vehicle. In orbit mode this gives the operator the possibility to move around the object to view it from different angles. In orbit mode, sway motion is the only input from the control unit, and the direction of the sway motion controls the direction of the orbit. Alternatively, pitch and roll may also be automatically controlled if the vehicle is equipped with means for pitch and roll. The orbit mode may be configured in such a way that the vehicle maintains its position when there is no new sway motion input. In other cases, when there is no sway motion input from the control unit, the vehicle may drift to another position while still maintaining the object in the field of view. If a current is present, the vehicle will drift to a heading where it has the least resistance, hence facing the current. This is usually the heading with the greatest station keeping capabilities for most underwater vehicles.

Dynamic positioning

A fourth mode of operation is full dynamic positioning. In this mode the heading of the vehicle is measured by a compass, an IMU, or externally by using one or more cameras. This allows the horizontal position of the object to control the sway motion, instead of the yaw motion as in the heading, distance and orbit modes. The heading and depth of the vehicle are maintained using traditional heading and depth control algorithms. At the same time, computer vision is used to control sway, distance and automatic guidance to the desired depth. The traditional DOFs are not controllable from the control unit, and the vehicle maintains its position automatically. The vehicle will attempt to keep the tracked object within the field of view of a camera, compensating for drift by using the thrusters.

One of the objectives of a camera assisted positioning system is to orient the underwater vehicle in such a way that a desired object is at or close to the centre of the field of view of a camera, while maintaining a constant area of the object in the field of view. The area of the object will decrease if the vehicle moves away from the object, and increase if it moves closer to the object. One advantage of such a system is that footage of the object may be recorded requiring little underwater vehicle manoeuvring skill from the operator. During descent the vehicle transmits the images captured by the camera to the control unit, and the captured images are displayed for the operator to inspect. The operator selects an object in the displayed image or selects an area of the displayed image. The selection may be defined by a rectangle with four corners with four coordinates. The outer coordinates and other features of the object are used to control the heading of the vehicle. The vehicle may be influenced by currents, drag from the umbilical and other forces that influence the heading of the vehicle.

Changes in the coordinates of the object in the captured images are used to regulate the thrusters of the vehicle so that the object is kept at a desired location in the captured images. Changes in the horizontal or vertical coordinates of the selected object may be automatically compensated using a PID regulator controlling the thrusters. To limit the influence of noise in the coordinates, a low-pass filter is utilised. The low-pass filter may have a delay of 20 ms, which is acceptable as it represents less than 10 % of the delay in the video processing used to automatically track features in the captured images. Other types of filters may be combined with sensor data like heading, acceleration, pressure and IMU data to maintain the orientation. These may be Kalman filters or similar.
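The combination of a low-pass filter on the tracked coordinate and a PID regulator driving the thrusters can be sketched as below; the gains, time step and smoothing factor are illustrative assumptions, not tuned values from the application:

```python
class PID:
    """Minimal PID regulator; the error is the offset between the object's
    filtered coordinate and its desired location in the image."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, error):
        self.integral += error * self.dt
        derivative = (0.0 if self.prev_error is None
                      else (error - self.prev_error) / self.dt)
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def low_pass(prev, new, alpha=0.2):
    """Exponential smoothing of the coordinate, limiting the influence of
    noise before the value is fed to the regulator."""
    return new if prev is None else prev + alpha * (new - prev)
```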

In other embodiments, feature tracking may be combined with other computer vision methods. Such combinations would be advantageous for improving the level of autonomy of the underwater vehicle.

It may also be desirable to track multiple objects or multiple features of the same object. This may improve the stability of the feature tracking during, for instance, changing light conditions or changes in visibility through the water. It may be advantageous to have multiple features of the object being tracked to compensate for changes in colours or features when capturing images at different angles or positions. The operator may add or remove features when orbiting an object to dynamically make sure that the same object is tracked during the orbit.

In other embodiments, the processing of the captured images from the camera and the digital object recognition algorithms may be performed partially in the single board computer and partially at another, remote processing resource. This would give the ability to have more processing resources available than can be powered by the battery in the underwater vehicle. This may be advantageous to increase the speed of execution of the digital object recognition and object tracking algorithms, to automate object recognition based on larger datasets that can be stored at a data centre, to change the algorithms, or to execute multiple algorithms simultaneously.

In other embodiments it may be desired to be able to tilt, zoom or in any other way control the direction of the cameras using actuators or servos. This would give the benefit of stabilising the captured images by compensating for movement, shaking, turbulence created by the thrusters, or other external influence like currents in the water.

Fig. 5 shows the underwater vehicle with a secondary imaging system with two lasers 260, 262. A camera 250 has a field of view 252. The two lasers 260, 262 emit light into the field of view, and the laser light is reflected from an object as points 261, 263. Fig. 6 shows the secondary imaging system 240 comprising four lasers 260, 262, 264, 266. The underwater vehicle 200 may be equipped with the secondary imaging system comprising a camera 250 and four lasers 260, 262, 264, 266 that emit laser light in the same direction as the camera 250 is facing. The distances between the camera and the lasers are known. In this embodiment the camera and the lasers are mounted on a plate 252. The four lasers are arranged at fixed angles and distances. The relationship between the camera and the lasers needs to be known to measure distance. When the laser light reflects off an object, the four laser lights are visible in the captured images as four coloured dots. The distances between the four laser dots, together with the known relationship between the laser sources, can be used to calculate the distance between the camera and the object and the angle of inclination of the object. This calculation may be done using trigonometric functions and known proportionalities between the dot area and the distance.

The lasers are arranged on the plate 252 in such a way that they emit divergently with respect to each other. Mounting the lasers divergently is advantageous for more accurately discerning distances. Alternatively, the lasers may be arranged such that they emit light in parallel or convergently with respect to each other.

The secondary imaging system may be fixedly mounted on the underwater vehicle for simplified mounting. In other embodiments it may be moveable with servos to aim the lasers at desired objects. In other embodiments the secondary imaging system may be combined with the camera performing the feature tracking, where the lasers emit into the field of view of the feature tracking camera. The reflections from the lasers are then tracked by the computer vision software.

The following refers to an embodiment with four lasers; distance may be determined in the same way in embodiments with two lasers. Four lasers with known separations emit parallel laser light directed in the same direction as a camera. The laser light reflects back from a wall in front of the vehicle as coloured dots 261, 263, 265, 267. The distance between the centres of the reflected laser dots will change when the vehicle is at different distances from the wall. The coordinates and area of the reflected laser dots in the captured images are detected using computer vision. Maintaining a constant distance to the wall may be attained by adjusting the thrusters to compensate for currents or other disturbances to the vehicle. There is a correspondence between the number of pixels between two dots in the captured images and the distance between the dots at the laser source. The ability to maintain a distance to the wall may allow the operator to control the vehicle by sway movement while maintaining the same distance to the wall. The vehicle would be able to maintain the desired depth and heading by using the depth and heading modes. The operating modes using four lasers are referred to as range finder modes. The lasers may also be controlled individually and switched on and off automatically and manually.

The range finder modes may be used in several different ways, and in combinations of these. The number of pixels that a dot covers in the captured images may also be used to measure distances. This may be done by establishing a correspondence between the number of pixels of a dot in the captured image and the distance between the laser source and the dot.

Measure Distance Mode

A first range finder mode is the measure distance mode. This mode provides distance information to the operator with no automatic control. This information is often very useful under water to assess the size of objects. With a fixed camera with a fixed image resolution, knowing the distance to an object gives the ability to estimate its size. The distance may be displayed on the screen of the control unit.

Range Finder Mode

A second range finder mode is automatic distance mode, which maintains the vehicle at an approximately constant distance in front of the surface of an object. As long as the vehicle does not rotate, it will stabilise at the approximately constant distance, although not necessarily perpendicular to the surface of the object. In other embodiments the automatic distance may be used as auto-altitude, where the lasers and the camera face downwards. An advantage is that a constant distance to the bottom can be maintained, preventing the vehicle from crashing into the bottom. This may be advantageous where a desired depth is controlled by setting a desired height above the seabed (auto-altitude) when driving the vehicle where the seabed is uneven (i.e. of changing depth). Alternatively, the desired depth can be controlled by setting a water pressure to maintain, so that the same depth is maintained independently of the seabed.

Fig. 7 shows the geometry of a range finder with two lasers 260, 262. A distance to a wall 270 in front of the vehicle 200 can be estimated by using two horizontally aligned lasers and a camera 250. Consider a first distance between the vehicle and the wall that is larger than a second distance between the vehicle and the wall. With the two lasers facing towards the wall and emitting laser light on the wall as laser dots, the distance between the two laser dots 261, 263 is shorter at the first distance than at the second distance. The two lasers and the camera are fixed, and it is therefore possible to find a mapping between the distance between the laser dots and the distance from the vehicle to the wall. With two lasers it is possible to find the distance, but not the angle of the wall relative to the vehicle. This is because when the angle Θ changes, both laser dots 261, 263 move to the same side, maintaining the same distance between the dots. Fig. 7 shows an example of an angled wall 270. Angles up to 80 degrees have been tested to work. Angles above 80 degrees are difficult to measure because the furthermost laser dot is hard to detect in the images captured by the camera. Using two lasers to measure distance will work on any surface that reflects laser light.

Fig. 6 shows a setup with four parallel lasers. The four lasers 260, 262, 264, 266 are arranged in a square with one laser in each corner. A camera 250 is arranged in such a way that the reflections 261, 263, 265, 267 of the four lasers 260, 262, 264, 266 are visible in the field of view 252. It is possible to detect the angle between the wall and the vehicle 200 using an arrangement with a fixed camera and four fixed parallel lasers. The four lasers project four dots 261, 263, 265, 267 on the wall 270, and parts of the wall are within the field of view 252 of the camera 250. The pixel locations of the dots 261, 263, 265, 267 in the captured images may be compared to measure the distances between them. Selecting the pairwise vertical dots, the distance between the camera and each pair of laser dots is calculated as in the range finder mode. There is then a pair of distance measurements for the vertical dots (dL and dR). Referring to Fig. 7, where the wall is vertically angled and the vehicle 200 approaches the wall at an angle Θ, the angle Θ can be determined by the function

Θ = arctan((dL - dR) / ParallelDistance)

where Θ is the angle between the underwater vehicle and the wall, dL and dR are the distances from the vehicle to the wall, and ParallelDistance is the distance between the vertical lasers. Typically for a vehicle the ParallelDistance is 5.5 cm, but it may be larger for better accuracy of the distance measurement. The ParallelDistance may also be smaller than 5.5 cm to enable smaller underwater vehicles. Preferably the ParallelDistance is 10 cm or more. Similarly, the pairs of horizontal lasers may be used to estimate the angle of the wall in the horizontal plane.
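The angle relation can be written as a small Python function (assuming dL and dR are given in the same length unit as ParallelDistance):

```python
import math

def wall_angle(d_left, d_right, parallel_distance=0.055):
    """Angle (radians) between the vehicle heading and the wall, from the
    two vertical laser-pair distance estimates dL and dR in metres. The
    default of 0.055 m matches the typical ParallelDistance of 5.5 cm."""
    return math.atan((d_left - d_right) / parallel_distance)
```

A perpendicular approach (dL equal to dR) gives an angle of zero; a positive result means the left side of the wall is further away than the right side.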

The automatic distance mode is implemented as a PID controller, where the estimated distance between the wall and the vehicle is used as input, and changes in the estimated distance are compensated by using the thrusters to maintain a similar distance between the wall and the vehicle. Low-pass filtering is applied to the input data of the PID controller to reduce the influence of noise. Other types of filtering may be applicable to compensate better for noise in the distance measurements.

The range finder detects the number of pixels between the centres of two laser dots. The number of pixels is mapped to a distance between the vehicle and the wall. The number of pixels between two dots was measured by capturing images at 10 cm intervals between 10 cm and 180 cm. Alternatively, the number of pixels a laser dot covers in the captured image is used to map the number of pixels to a distance between the dot and the laser source.
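Such a calibration table can be turned into a distance estimate by linear interpolation, as in the sketch below; the table values are illustrative, not the measured data from the application:

```python
def pixels_to_distance(pixel_gap, calibration):
    """Map the pixel separation between two laser dots to a distance by
    linear interpolation in a calibration table of (pixel_gap, distance)
    pairs captured at known distances. The gap shrinks as range grows."""
    pts = sorted(calibration)  # ascending pixel gap
    if pixel_gap <= pts[0][0]:
        return pts[0][1]       # beyond the furthest calibrated range
    if pixel_gap >= pts[-1][0]:
        return pts[-1][1]      # closer than the nearest calibrated range
    for (g0, d0), (g1, d1) in zip(pts, pts[1:]):
        if g0 <= pixel_gap <= g1:
            t = (pixel_gap - g0) / (g1 - g0)
            return d0 + t * (d1 - d0)

# Illustrative calibration: dots 300 px apart at 10 cm, 75 px apart at 40 cm.
CALIBRATION = [(300, 10.0), (150, 20.0), (75, 40.0)]
```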

Detecting lasers reflected from an object under water depends on the strength of the lasers, the distance to the object, the visibility in the water, the light conditions in the water, the colour of the lasers, etc. There are different approaches to filtering the reflected dots from a captured image of the dots. Several algorithms were tested, but the feature tracking algorithm gave the best results. This was implemented using the feature tracking algorithm mentioned earlier for tracking objects. The algorithm was implemented using OpenCV, but similar computer vision software may be used. The lasers in this embodiment were chosen to emit red light. Red light is easily visible in water and red lasers are low in price. The feature tracking algorithm tracks the red colour dots reflected by an object, i.e. a wall. The camera captures images of the red dots. The red dots are then filtered by size constraints. The size of a dot will change somewhat depending on the distance to the wall, but stays within a range of 3-40 px², where px² denotes square pixels. This filtering on the maximum size of the dots prevents disturbances from large red objects. The range finder functionality is executed in the single board computer, but may be performed by a combination of hardware logic and computer resources in the vehicle or offsite. Performing the range finder functionality in the vehicle reduces the latency of the calculation compared to performing the functionality remote to the vehicle. The camera is directly connected to the single board computer, where the single board computer controls the capturing of images and retrieves the captured images for processing. The camera may be used solely for the purpose of laser detection and be equipped with a colour filter that filters out colours other than that of the laser. The brightness of the images captured by the camera is turned low to filter out the surroundings. The camera also has a narrow field of view, which gives more pixels per area in the area where the laser dots are reflected by an object. To reduce the required size of the image to process, the captured image may be cropped to only contain the area with the red dots. This cropping may be performed by image manipulation software or by direct control of the camera.

Wall Inspection Mode

Wall inspection mode is a mode where the vehicle uses a pressure sensor and the arrangement of four lasers and a camera. The pressure sensor is used to measure depth, as input for maintaining a constant depth. The secondary imaging system with lasers is used to measure the distance between the vehicle and a wall, and the angle of the wall relative to the heading of the vehicle. When enabled, the wall inspection mode allows the vehicle to follow the wall at constant depth, constant distance and constant angle. Manual or automatic control of the sideways movement of the vehicle is available using the control unit.

In other embodiments the wall inspection mode may be augmented to also use feature tracking. Features of the wall or surface to be inspected are tracked using feature tracking. These features may be edges, colours, lines etc. The embodiment may have multiple cameras that may be used for digital feature tracking. For instance, capturing images of a ship may be performed by maintaining the correct angle and distance of the underwater vehicle using the lasers pointing towards the ship hull. A feature of the ship hull is tracked, like a line painted on the hull or the keel. The feature on the ship hull is maintained in the same location in the captured images by operating the propulsion devices of the underwater vehicle.

In other embodiments the wall following mode may be further automated using a pressure sensor to maintain the same depth during the capturing of images.

In a preferred embodiment the lasers are attached to the underwater vehicle. In other embodiments additional lasers may be located at other places than on the vehicle. Such lasers may emit light into the field of view of the camera in the underwater vehicle capturing images. Detecting the reflections of these lasers in the captured images may be used to calculate the position of the underwater vehicle. In yet another embodiment the distance is calculated with one laser. The laser is mounted on the underwater vehicle and directed in the same direction as the camera. The size of the reflection of the laser light captured by the camera is used to determine the distance between the vehicle and the reflection. In other embodiments an additional IMU may be integrated into the vehicle. An IMU gives the ability to track the direction of acceleration of the vehicle, the orientation of the vehicle relative to the magnetic field of the Earth, and, with a gyroscope, changes in the orientation of the vehicle over time. One advantage of the additional IMU is having a backup unit, and it allows for measurement of bending and deformation of the vehicle. In other embodiments a preconfigured database of images of objects may be integrated to enable faster recognition of underwater objects.

In other embodiments a laser may be directed towards and illuminate an object. The control system will detect the reflection from the laser and move the underwater vehicle towards the object. The feature tracking algorithm will attempt to maintain the desired distance and keep the object in a desired location in the captured images. This feature is advantageous in that a diver can direct the underwater vehicle by aiming a laser at an object. It may be further advantageous for the diver to make the underwater vehicle move along a path; this path may be defined by aiming the laser at more than one place. This may be used to make the underwater vehicle capture images along a path defined by a diver using a laser, and provides a method for the diver to easily operate the underwater vehicle when diving. An example is where the diver points at parts of a ship hull, and the underwater vehicle captures images of these parts automatically. It may also be advantageous for the diver to aim the laser in a particular direction to make the vehicle follow the laser ray, in which case an object reflecting the laser is not required.

In other embodiments the laser light may be of colours that are less absorbed in water than red. Lasers emitting light at wavelengths outside the human visible spectrum may be considered, as such light will not be visible in the camera footage presented to the operator. Using special colours for the lasers may allow the lasers to be emitted into the field of view of the camera, yet easily filtered from the footage. Combinations of lasers with different colours may also be used to easily estimate the orientation of the underwater vehicle. Green or blue lasers may also be used, since blue and green laser light is less attenuated in water than red.

In other embodiments the rectangle of four lasers arranged with the camera may have additional lasers in a grid pattern. A grid pattern of lasers projecting light on a surface may be used for seabed mapping or 3D scanning of objects. By combining the grid pattern of lasers with orbit mode, automatic 3D scanning of underwater objects may be performed. In other embodiments the distance and angle measurements may be achieved using three lasers emitting light into the field of view of the camera. The reflections of the lasers on an object are used similarly as with four lasers to estimate distances and angles between the object and the camera. The three lasers may be arranged in a triangle on the underwater vehicle. A distance measurement system with three lasers is the simplest in which angles may be measured, needing the lowest number of lasers.

In other embodiments the distance measurement arrangement may comprise one or more lasers and two cameras, wherein the two cameras have an overlapping field of view. One or more lasers emit light into the overlapping field of view. An object in the overlapping field of view that is illuminated by the laser light will reflect the laser light as coloured dots. The cameras will capture images of a dot from two different angles, and by knowing the distance between the cameras, the distance to the reflected dot may be calculated. This distance calculation may be performed by triangulation. Other embodiments may have more cameras aligned both horizontally and vertically.
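For two horizontally aligned cameras, the triangulation reduces to the standard stereo disparity relation; the focal length and baseline figures in the sketch below are illustrative assumptions, not values from the application:

```python
def stereo_depth(x_left, x_right, focal_px, baseline):
    """Depth of a laser dot from its pixel column in two horizontally
    aligned cameras: depth = focal_length * baseline / disparity, where
    the disparity is the horizontal pixel shift of the dot between the
    two images, focal_px is the focal length in pixels and baseline is
    the distance between the cameras."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("the dot must appear further right in the left image")
    return focal_px * baseline / disparity
```

A dot 100 pixels of disparity away, seen with a 1000 px focal length and a 0.1 m baseline, would be estimated at 1 m; halving the depth doubles the disparity.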

In other embodiments the system may compare the captured images to previously captured images. The underwater vehicle will then attempt to move in such a way that the captured images and the previously captured images match.

In other embodiments the underwater system may record a dive path of the underwater vehicle. The positions of the tracked objects are stored, and on a later dive the underwater vehicle will automatically follow the same dive path. Feature tracking and the laser distance measurements are used to move the underwater vehicle to capture images in the same orientations as in the stored images.

Industrial applicability

The control system and the underwater system are suitable for use in all types of underwater vehicles.