
Title:
IMPROVEMENTS IN VEHICLE CONTROL
Document Type and Number:
WIPO Patent Application WO/2018/166747
Kind Code:
A1
Abstract:
A computing system (10, 19, 185C) for a vehicle (100), the system comprising a processing means (10, 19) arranged to receive, from terrain data capture means (185C) comprising a stereoscopic camera system, terrain information indicative of the topography of the terrain extending ahead of the vehicle (100), the terrain information including at least a stereo image pair of terrain ahead of the vehicle, wherein the processing means (10, 19) is configured to: calculate a 3D point cloud dataset in respect of the terrain ahead of the vehicle based on the terrain information; identify a substantially continuous region of terrain ahead of the vehicle and within a predicted path of the vehicle in which the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value, said region being considered to be an empty region if the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value; determine whether the empty region corresponds to a body of water, being a region which meets the criteria that: the number of pixels in an image of that region captured by the camera system that correspond to specular reflection of a feature of the environment, including the terrain, exceeds a specular reflection density threshold value; and a difference in height between at least two locations of the region is less than a location height difference threshold value, the system being configured to output a signal in dependence on whether terrain corresponding to a body of water has been identified ahead of the vehicle.

Inventors:
KOTTERI JITHESH (GB)
RAVI BINEESH (GB)
JAYARAJ JITHIN (GB)
Application Number:
PCT/EP2018/053853
Publication Date:
September 20, 2018
Filing Date:
February 16, 2018
Assignee:
JAGUAR LAND ROVER LTD (GB)
International Classes:
G06K9/00; B60W30/14
Domestic Patent References:
WO2013124321A12013-08-29
WO2014139875A12014-09-18
Foreign References:
DE102012112164A12014-06-12
US20150356357A12015-12-10
GB2507622A2014-05-07
GB2499461A2013-08-21
US7349776B22008-03-25
GB2492748A2013-01-16
GB2492655A2013-01-09
GB2499279A2013-08-14
GB2508464A2014-06-04
Other References:
ANONYMOUS, SPIE, PO BOX 10 BELLINGHAM WA 98227-0010 USA, 1 January 2009 (2009-01-01), XP040496947
RALPH AESCHIMANN ET AL: "Ground or obstacles? Detecting clear paths in vehicle navigation", 2015 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 1 May 2015 (2015-05-01), pages 3927 - 3934, XP055249069, ISBN: 978-1-4799-6923-4, DOI: 10.1109/ICRA.2015.7139747
Attorney, Agent or Firm:
LOCKEY, Robert Alexander (GB)
Claims:
CLAIMS:

1. A computing system for a vehicle, the system comprising a processing means arranged to receive, from terrain data capture means comprising a stereoscopic camera system, terrain information indicative of the topography of the terrain extending ahead of the vehicle, the terrain information including at least a stereo image pair of terrain ahead of the vehicle, wherein the processing means is configured to:

calculate a 3D point cloud dataset in respect of the terrain ahead of the vehicle based on the terrain information;

identify a substantially continuous region of terrain ahead of the vehicle and within a predicted path of the vehicle in which the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value, said region being considered to be an empty region if the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value;

determine whether the empty region corresponds to a body of water, being a region which meets the criteria that:

the number of pixels in an image of that region captured by the camera system that correspond to specular reflection of a feature of the environment, including the terrain, exceeds a specular reflection density threshold value; and

a difference in height between at least two locations of the region is less than a location height difference threshold value,

the system being configured to output a signal in dependence on whether terrain corresponding to a body of water has been identified ahead of the vehicle.

2. A system according to claim 1 configured wherein, in order for the system to determine that an empty region corresponds to a body of water, the further condition must be met that the difference in height between at least two locations on substantially opposite sides of the region is less than a location height difference threshold value.

3. A system according to claim 1 or claim 2 further configured to determine the slope of terrain to be experienced by the vehicle immediately before the empty region within the predicted path, wherein if the slope exceeds a predetermined slope value the system determines that the region does not correspond to a body of water.

4. A system according to any preceding claim further configured to determine the slope of terrain to be experienced by the vehicle immediately before the empty region within the predicted path, wherein if the slope exceeds a predetermined slope value the system determines that the region does not correspond to a body of water and may correspond to a crest of a slope.

5. A system according to any preceding claim further configured to identify, in at least one of the stereoscopic image pairs corresponding to the 3D point cloud, the empty region, and determine whether the empty region corresponds to a body of water in further dependence at least in part on RGB colour values of at least one image pixel corresponding to the empty region.

6. A system according to claim 5 configured to determine whether the empty region corresponds to a body of water in dependence at least in part on an average RGB colour value over a plurality of image pixels corresponding to the empty region.

7. A system according to claim 5 configured to determine that the empty region does not correspond to a body of water unless the average RGB colour value over a plurality of image pixels corresponding to the empty region is above a predetermined minimum average RGB colour value and less than a predetermined maximum average RGB colour value.

8. A system according to any preceding claim wherein the system being configured to determine whether the number of pixels in an image of the empty region captured by the camera system that correspond to specular reflection of a feature of the environment, including the terrain, exceeds a specular reflection density threshold value, comprises the system being configured to determine the disparity between corresponding pixels of the left and right stereo image pair corresponding to the empty region, the number of pixels corresponding to specular reflection being determined in dependence on the disparity between the corresponding pixels of the left and right image pair.

9. A system according to any preceding claim wherein the system is further configured to require that a predetermined width condition must be met in respect of a width of the region of at least one of the stereoscopic image pairs corresponding to the empty region in order for the empty region to correspond to a body of water.

10. A system according to claim 9 wherein the predetermined width condition comprises the condition that a width or average width of at least a portion of the image that includes the empty region exceeds a predetermined threshold width value.

11. A system according to any preceding claim comprising a speed controller, the speed controller being configured to control vehicle speed based at least in part on whether a body of water has been identified in the path of the vehicle.

12. A system according to claim 11 configured to provide an output to the speed controller indicative of a maximum recommended speed in dependence at least in part on whether a body of water has been identified in the path of the vehicle.

13. A system according to claim 11 or 12 configured to provide an alert to a driver in dependence on whether a body of water has been identified in the path of the vehicle.

14. A system according to any preceding claim wherein the terrain data capture means comprises a stereoscopic camera system.

15. A system according to any preceding claim further comprising the terrain data capture means.

16. A system according to any preceding claim, wherein the system comprises an electronic processor having an electrical input for receiving the terrain information indicative of the topography of terrain ahead of the vehicle; and

an electronic memory device electrically coupled to the electronic processor and having instructions stored therein,

wherein the processor is configured to access the memory device and execute the instructions stored therein such that it is operable to:

calculate a 3D point cloud dataset in respect of the terrain ahead of the vehicle based on the terrain information;

identify a substantially continuous region of terrain ahead of the vehicle and within a predicted path of the vehicle in which the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value, said region being considered to be an empty region if the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value;

determine whether the empty region corresponds to a body of water, being a region which meets the criteria that:

the number of pixels in an image of that region captured by the camera system that correspond to specular reflection of a feature of the environment, including the terrain, exceeds a specular reflection density threshold value; and

a difference in height between at least two locations of the region is less than a location height difference threshold value,

and output a signal in dependence on whether terrain corresponding to a body of water has been identified ahead of the vehicle.

17. A vehicle comprising a system according to any preceding claim.

18. A method of identifying terrain corresponding to a body of water implemented by means of a computing system of a vehicle,

the method comprising receiving, from terrain data capture means comprising a stereoscopic camera system, terrain information indicative of the topography of the terrain extending ahead of the vehicle, the terrain information including at least a stereo image pair of terrain ahead of the vehicle, the method comprising:

calculating a 3D point cloud dataset in respect of the terrain ahead of the vehicle based on the terrain information;

identifying a substantially continuous region of terrain ahead of the vehicle and within a predicted path of the vehicle in which the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value, said region being considered to be an empty region if the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value;

determining whether the empty region corresponds to a body of water, being a region which meets the criteria that:

the number of pixels in an image of that region captured by the camera system that correspond to specular reflection of a feature of the environment, including the terrain, exceeds a specular reflection density threshold value; and

a difference in height between at least two locations of the region is less than a location height difference threshold value,

the method comprising outputting a signal in dependence on whether terrain corresponding to a body of water has been identified ahead of the vehicle.

19. A non-transitory computer readable carrier medium carrying computer readable code for controlling a vehicle to carry out the method of claim 18.

20. A computer program product executable on a processor so as to implement the method of claim 18.

21. A non-transitory computer readable medium loaded with the computer program product of claim 20.

22. A processor arranged to implement the method of claim 18, or the computer program product of claim 20.

Description:
IMPROVEMENTS IN VEHICLE CONTROL

INCORPORATION BY REFERENCE

The contents of co-pending UK patent applications GB2507622 and GB2499461 are hereby incorporated by reference. The contents of US patent no. US7349776 and co-pending international patent applications WO2013/124321 and WO2014/139875 are incorporated herein by reference. The contents of UK patent applications GB2492748, GB2492655 and GB2499279 and UK patent GB2508464 are also incorporated herein by reference.

FIELD OF THE INVENTION

The invention relates to a system for controlling a vehicle. In particular, but not exclusively, the invention relates to a system for controlling steering of a land-based vehicle which is capable of driving in a variety of different and extreme terrains and conditions.

BACKGROUND

In known vehicle steering control systems, such as lane keeping systems, a forward-looking camera system detects lane markings on the road ahead of the vehicle. In some systems feedback in the form of an audible alarm or haptic response, such as vibration of a steering wheel, is provided in the case that the vehicle deviates excessively from a notional lane centreline or crosses a lane marking. Some steering control systems automatically control steerable road wheel steering angle in order to maintain a vehicle in-lane when driving on a highway by reference to lane markings. The present applicant has recognised that known steering control systems are unusable in off-road driving environments where such systems may be of particular value in reducing driver fatigue.

It is against this background that the present invention has been conceived. Embodiments of the invention may provide an apparatus, a method or a vehicle which addresses this problem. Other aims and advantages of the invention will become apparent from the following description, claims and drawings.

SUMMARY OF THE INVENTION

In one aspect of the invention for which protection is sought there is provided a computing system for a vehicle, the system comprising a processing means arranged to receive, from terrain data capture means comprising a stereoscopic camera system, terrain information indicative of the topography of the terrain extending ahead of the vehicle, the terrain information including at least a stereo image pair of terrain ahead of the vehicle, wherein the processing means is configured to:

calculate a 3D point cloud dataset in respect of the terrain ahead of the vehicle based on the terrain information;

identify a substantially continuous region of terrain ahead of the vehicle and within a predicted path of the vehicle in which the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value, said region being considered to be an empty region if the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value;

determine whether the empty region corresponds to a body of water, being a region which meets the criteria that:

the number of pixels in an image of that region captured by the camera system that correspond to specular reflection of a feature of the environment, including the terrain, exceeds a specular reflection density threshold value; and

a difference in height between at least two locations of the region is less than a location height difference threshold value,

the system being configured to output a signal in dependence on whether terrain corresponding to a body of water has been identified ahead of the vehicle.

Embodiments of the present invention have the advantage that the presence of a body of water ahead of the vehicle may be reliably detected and an output signal generated in response.

It is to be understood that, where a body of water is present, the body of water will typically specularly reflect terrain features above the water such as sky, trees, bushes or the like. The apparent distance of these terrain features from the terrain data capture means, based on disparity of the stereo image pair, will be greater than the distance of the body of water from the terrain data capture means. Accordingly, the region of the 3D point cloud corresponding to the location of the body of water may appear to have a lower density of datapoints than would be the case if the body of water were a non-specularly reflecting body such as a body of earth.

Embodiments of the present invention attempt to detect the presence of a body of water based on typical characteristics of such a body, being the presence of specular reflections by the body of water in a captured image of the body of water and the characteristic of flatness typically associated with a body of water.

The system may be configured wherein, in order for the system to determine that an empty region corresponds to a body of water, the further condition must be met that the difference in height between at least two locations on substantially opposite sides of the region is less than a location height difference threshold value.
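The empty-region identification described above can be sketched as follows. This is an illustrative reconstruction rather than the applicant's implementation: the grid resolution, region extent, threshold value and the function name `empty_cells` are all assumptions made for the example.

```python
import numpy as np

def empty_cells(points, extent=(0.0, 10.0, -2.0, 2.0),
                cell=0.5, density_threshold=4.0):
    """Bin a 3D point cloud (N x 3 array: x forward, y lateral, z up)
    into a ground-plane grid over extent = (x_min, x_max, y_min, y_max)
    and return a boolean grid marking cells whose datapoint number
    density (points per square metre) falls below the empty-region
    datapoint density threshold."""
    x_min, x_max, y_min, y_max = extent
    nx = int(round((x_max - x_min) / cell))
    ny = int(round((y_max - y_min) / cell))
    counts = np.zeros((nx, ny))
    # Keep only points inside the region of interest ahead of the vehicle.
    m = ((points[:, 0] >= x_min) & (points[:, 0] < x_max) &
         (points[:, 1] >= y_min) & (points[:, 1] < y_max))
    ix = ((points[m, 0] - x_min) / cell).astype(int)
    iy = ((points[m, 1] - y_min) / cell).astype(int)
    np.add.at(counts, (ix, iy), 1)
    # A specularly reflecting water surface yields few valid stereo
    # matches, so its cells fall below the density threshold.
    return counts / cell**2 < density_threshold
```

A substantially continuous run of flagged cells within the predicted path would then be taken forward as a candidate empty region.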

In some embodiments, by 'opposite sides' is meant points along a given column or row of pixels, that includes a portion of the body of water, at which the body of water meets ground. The pixels whose height is considered may be pixels closest to the empty region, on opposite sides of the empty region, along the column or row.
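A per-row version of this bank-height test might look like the following sketch. The height-difference threshold and the assumption that per-pixel heights are available (e.g. from the point cloud reprojected into the image) are illustrative, not taken from the application.

```python
def flat_across_row(heights, empty_mask, row, max_height_diff=0.15):
    """For one pixel row, compare the heights of the nearest valid
    samples on opposite sides of the empty span. A body of water is
    level, so its two banks should sit at similar heights; a ditch
    or drop-off generally will not."""
    cols = [c for c, e in enumerate(empty_mask[row]) if e]
    if not cols:
        return None  # no empty pixels on this row
    left, right = min(cols) - 1, max(cols) + 1
    if left < 0 or right >= len(empty_mask[row]):
        return None  # empty span runs off the image edge
    return abs(heights[row][left] - heights[row][right]) < max_height_diff
```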

The system may be configured to determine the slope of terrain to be experienced by the vehicle immediately before the empty region within the predicted path, wherein if the slope exceeds a predetermined slope value the system determines that the region does not correspond to a body of water.

The system may be further configured to determine the slope of terrain to be experienced by the vehicle immediately before the empty region within the predicted path, wherein if the slope exceeds a predetermined slope value the system determines that the region does not correspond to a body of water and may correspond to a crest of a slope.
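The slope screen can be illustrated with a short sketch. The sample format and the 15-degree limit are assumptions made for the example; the application does not give a numeric slope value.

```python
import math

def approach_slope_ok(profile, max_slope_deg=15.0):
    """profile is a list of (distance_m, height_m) samples for the
    terrain the vehicle will cross immediately before the empty
    region. If the mean gradient exceeds the limit, the empty region
    is more plausibly the hidden far side of a crest than a body of
    water."""
    (d0, h0), (d1, h1) = profile[0], profile[-1]
    slope_deg = math.degrees(math.atan2(h1 - h0, d1 - d0))
    return abs(slope_deg) <= max_slope_deg
```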

The system may be further configured to identify, in at least one of the stereoscopic image pairs corresponding to the 3D point cloud, the empty region, and determine whether the empty region corresponds to a body of water in further dependence at least in part on RGB (red, green, blue) colour values of at least one image pixel corresponding to the empty region.

The system may be configured to determine whether the empty region corresponds to a body of water in dependence at least in part on an average RGB (red, green, blue) colour value over a plurality of image pixels corresponding to the empty region.

The system may be configured to determine that the empty region does not correspond to a body of water unless the average RGB (red, green, blue) colour value over a plurality of image pixels corresponding to the empty region is above a predetermined minimum average RGB colour value and less than a predetermined maximum average RGB colour value.
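As a sketch of this colour gate, with the band limits chosen purely for illustration (the application does not give numeric RGB values):

```python
def rgb_band_ok(pixels, lo=40.0, hi=200.0):
    """Average the RGB values over the pixels covering the empty
    region and require the mean to fall inside a plausible band for
    water: very dark regions are more likely shadow or a ditch, very
    bright ones saturated sky or glare."""
    mean = sum(sum(p) for p in pixels) / (3.0 * len(pixels))
    return lo < mean < hi
```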

Optionally, the system being configured to determine whether the number of pixels in an image of the empty region captured by the camera system that correspond to specular reflection of a feature of the environment, including the terrain, exceeds a specular reflection density threshold value, comprises the system being configured to determine the disparity between corresponding pixels of the left and right stereo image pair corresponding to the empty region, the number of pixels corresponding to specular reflection being determined in dependence on the disparity between the corresponding pixels of the left and right image pair.
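One way to realise this disparity test, as an illustrative sketch: a mirror-like water surface shows the reflected feature (sky, trees) rather than the surface itself, so stereo matching inside the empty region returns a disparity corresponding to a much greater range than the ground would produce at that image position. The margin value and function name below are assumptions.

```python
def specular_pixel_count(disparity, expected_ground_disparity, margin=2.0):
    """Count empty-region pixels whose measured stereo disparity is
    markedly smaller than the disparity the ground surface would give
    at the same image position; these are treated as specular
    reflections of environment features. Negative disparities denote
    failed matches and are ignored."""
    count = 0
    for d, g in zip(disparity, expected_ground_disparity):
        if 0 <= d < g - margin:  # valid match, but apparently far beyond the ground
            count += 1
    return count
```

The resulting count, normalised by region area, would then be compared against the specular reflection density threshold.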

Optionally, the system is further configured to require that a predetermined width condition must be met in respect of a width of the region of at least one of the stereoscopic image pairs corresponding to the empty region in order for the empty region to correspond to a body of water.

It is to be understood that in some embodiments this requirement is a necessary condition, but not a sufficient condition, for the empty region to correspond to a body of water.

Optionally, the predetermined width condition comprises the condition that a width or average width of at least a portion of the image that includes the empty region exceeds a predetermined threshold width value.

In some embodiments, the system may be configured to require that at least a portion of the empty region has a width exceeding the threshold width value. The portion may be a polygonal portion defined by the system by extraction of pixels of the left or right image of the stereoscopic image pair corresponding to the empty region.

The system may comprise a speed controller, the speed controller being configured to control vehicle speed based at least in part on whether a body of water has been identified in the path of the vehicle.
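The width condition can be sketched as a longest-run test over the pixel mask of the empty region; the 20-pixel threshold is an assumption for the example.

```python
def width_condition(empty_mask, min_width_px=20):
    """Return True if any image row contains a contiguous run of
    empty-region pixels at least min_width_px wide, screening out
    thin stereo-matching gaps that are unlikely to be water."""
    best = 0
    for row in empty_mask:
        run = 0
        for e in row:
            run = run + 1 if e else 0
            best = max(best, run)
    return best >= min_width_px
```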

The speed controller may receive the signal that is output by the computing system in dependence on whether terrain corresponding to a body of water has been identified in the path of the vehicle, and may control vehicle speed based on the signal.

The system may be configured to provide an output to the speed controller indicative of a maximum recommended speed in dependence at least in part on whether a body of water has been identified in the path of the vehicle.

The system may be configured to provide an alert to a driver in dependence on whether a body of water has been identified in the path of the vehicle.

Optionally, the terrain data capture means comprises a stereoscopic camera system.

The system may further comprise the terrain data capture means.

Optionally, the system comprises an electronic processor having an electrical input for receiving the terrain information indicative of the topography of terrain ahead of the vehicle; and

an electronic memory device electrically coupled to the electronic processor and having instructions stored therein,

wherein the processor is configured to access the memory device and execute the instructions stored therein such that it is operable to:

calculate a 3D point cloud dataset in respect of the terrain ahead of the vehicle based on the terrain information;

identify a substantially continuous region of terrain ahead of the vehicle and within a predicted path of the vehicle in which the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value, said region being considered to be an empty region if the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value;

determine whether the empty region corresponds to a body of water, being a region which meets the criteria that:

the number of pixels in an image of that region captured by the camera system that correspond to specular reflection of a feature of the environment, including the terrain, exceeds a specular reflection density threshold value; and

a difference in height between at least two locations of the region is less than a location height difference threshold value,

and output a signal in dependence on whether terrain corresponding to a body of water has been identified ahead of the vehicle.
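Taken together, the aspect above reduces to a conjunction of threshold tests on quantities derived from the stereo pair. A minimal sketch of that final decision follows, with every threshold value chosen purely for illustration:

```python
def classify_empty_region(speckle_count, n_region_pixels,
                          bank_height_diff, mean_rgb,
                          speckle_density_threshold=0.3,
                          max_height_diff=0.15,
                          rgb_band=(40.0, 200.0)):
    """Report the empty region as a body of water only if the
    specular-pixel density, the bank flatness test and the colour
    band test all pass."""
    specular = speckle_count / n_region_pixels > speckle_density_threshold
    flat = bank_height_diff < max_height_diff
    colour_ok = rgb_band[0] < mean_rgb < rgb_band[1]
    return specular and flat and colour_ok
```

The boolean result would drive the output signal, for example a maximum recommended speed passed to the speed controller.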

In an aspect of the invention for which protection is sought there is provided a vehicle comprising a system according to another aspect.

In one aspect of the invention for which protection is sought there is provided a method of identifying terrain corresponding to a body of water implemented by means of a computing system of a vehicle,

the method comprising receiving, from terrain data capture means (185C) comprising a stereoscopic camera system, terrain information indicative of the topography of the terrain extending ahead of the vehicle (100), the terrain information including at least a stereo image pair of terrain ahead of the vehicle, the method comprising:

calculating a 3D point cloud dataset in respect of the terrain ahead of the vehicle based on the terrain information;

identifying a substantially continuous region of terrain ahead of the vehicle and within a predicted path of the vehicle in which the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value, said region being considered to be an empty region if the number density of datapoints in the 3D point cloud corresponding to that region is less than an empty region datapoint density threshold value;

determining whether the empty region corresponds to a body of water, being a region which meets the criteria that:

the number of pixels in an image of that region captured by the camera system that correspond to specular reflection of a feature of the environment, including the terrain, exceeds a specular reflection density threshold value; and

a difference in height between at least two locations of the region is less than a location height difference threshold value,

the method comprising outputting a signal in dependence on whether terrain corresponding to a body of water has been identified ahead of the vehicle.

In an aspect of the invention for which protection is sought there is provided a non-transitory computer readable carrier medium carrying computer readable code for controlling a vehicle to carry out the method of another aspect.

In an aspect of the invention for which protection is sought there is provided a computer program product executable on a processor so as to implement the method of another aspect.

In one aspect of the invention for which protection is sought there is provided a non-transitory computer readable medium loaded with the computer program product of another aspect.

In an aspect of the invention for which protection is sought there is provided a processor arranged to implement the method of another aspect, or the computer program product of another aspect.

Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible.

The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.

BRIEF DESCRIPTION OF DRAWINGS

The present invention will now be described, by way of example only, with reference to the accompanying drawings, in which:

FIGURE 1 is a schematic illustration of a vehicle according to an embodiment of the invention in plan view;

FIGURE 2 shows the vehicle of FIG. 1 in side view;

FIGURE 3 is a high level schematic diagram of an embodiment of the vehicle speed control system of the present invention, including a cruise control system and a low-speed progress control system;

FIGURE 4 illustrates a steering wheel of a vehicle according to the embodiment of FIG. 1;

FIGURE 5 is a schematic illustration in plan view of an elevation map of terrain ahead of the vehicle;

FIGURE 6 shows (a) an example of an image captured by a left camera of a stereoscopic camera system of the vehicle of FIG. 1 and (b) a representation of the corresponding elevation map, generated according to the MLS methodology, in respect of the image shown in (a);

FIGURE 7 is a schematic illustration of an MLS map comprising cells C, in which cells corresponding to the location of an obstacle OB are shown in relatively dark shading, whilst cells corresponding to the shadow SH of the obstacle are shown in medium shading;

FIGURE 8 shows (a) the portion of the image of FIG. 6(a) corresponding to the empty region ER shown in FIG. 6(b), in the form of a cropped polygonal region of the camera image of FIG. 6(a), and (b) an extended cropped polygonal region based on the cropped polygonal region shown in (a);

FIGURE 9 is a diagram mapping a domain of values of speckleCount and average intensity of R, G, B colour values, determined in accordance with an embodiment of the present invention, that are consistent with the presence of a ditch or body of water in a stereoscopic image pair of terrain ahead of a vehicle; and

FIGURE 10 is a flow diagram illustrating operation of a computing system of the vehicle of the embodiment of FIG. 1.

DETAILED DESCRIPTION

References herein to a block such as a function block are to be understood to include reference to software code for performing the function or action specified which may be an output that is provided responsive to one or more inputs. The code may be in the form of a software routine or function called by a main computer program, or may be code forming part of a flow of code not being a separate routine or function. Reference to function block is made for ease of explanation of the manner of operation of embodiments of the present invention.

FIG. 1 shows a vehicle 100 according to an embodiment of the present invention. The vehicle 100 has a powertrain 129 that includes an engine 121 that is connected to a driveline 130 having an automatic transmission 124. It is to be understood that embodiments of the present invention are also suitable for use in vehicles with manual transmissions, continuously variable transmissions or any other suitable transmission.

In the embodiment of FIG. 1 the transmission 124 may be set to one of a plurality of transmission operating modes, being a park mode, a reverse mode, a neutral mode, a drive mode or a sport mode, by means of a transmission mode selector dial 124S. The selector dial 124S provides an output signal to a powertrain controller 11 in response to which the powertrain controller 11 causes the transmission 124 to operate in accordance with the selected transmission mode.

The driveline 130 is arranged to drive a pair of front vehicle wheels 111, 112 by means of a front differential 137 and a pair of front drive shafts 118. The driveline 130 also comprises an auxiliary driveline portion 131 arranged to drive a pair of rear wheels 114, 115 by means of an auxiliary driveshaft or prop-shaft 132, a rear differential 135 and a pair of rear driveshafts 139. Embodiments of the invention are suitable for use with vehicles in which the transmission is arranged to drive only a pair of front wheels or only a pair of rear wheels (i.e. front wheel drive vehicles or rear wheel drive vehicles) or selectable two wheel drive/four wheel drive vehicles. In the embodiment of FIG. 1 the transmission 124 is releasably connectable to the auxiliary driveline portion 131 by means of a power transfer unit (PTU) 131P, allowing operation in a two wheel drive mode or a four wheel drive mode. It is to be understood that embodiments of the invention may be suitable for vehicles having more than four wheels or where only two wheels are driven, for example two wheels of a three wheeled vehicle or four wheeled vehicle or a vehicle with more than four wheels.

A control system for the vehicle engine 121 includes a central controller 10, referred to as a vehicle control unit (VCU) 10, the powertrain controller 11, a brake controller 13 (an anti-lock braking system (ABS) controller) and a steering controller 170C. The ABS controller 13 forms part of a braking system 22 (FIG. 3). The VCU 10 receives and outputs a plurality of signals to and from various sensors and subsystems (not shown) provided on the vehicle. The VCU 10 includes a low-speed progress (LSP) control system 12 shown in FIG. 3, a stability control system (SCS) 14, a cruise control system 16 and a hill descent control (HDC) system 12HD. The SCS 14 improves the safety of the vehicle 100 by detecting and managing loss of traction or steering control.
When a reduction in traction or steering control is detected, the SCS 14 is operable automatically to command the ABS controller 13 to apply one or more brakes of the vehicle to help to steer the vehicle 100 in the direction the user wishes to travel. In the embodiment shown the SCS 14 is implemented by the VCU 10. In some alternative embodiments the SCS 14 may be implemented by the ABS controller 13.

Although not shown in detail in FIG. 3, the VCU 10 further includes a Traction Control (TC) function block. The TC function block is implemented in software code run by a computing device of the VCU 10. The ABS controller 13 and TC function block provide outputs indicative of, for example, TC activity, ABS activity, brake interventions on individual wheels and engine torque requests from the VCU 10 to the engine 121 in the event a wheel slip event occurs. Each of the aforementioned events indicates that a wheel slip event has occurred. In some embodiments the ABS controller 13 implements the TC function block. Other vehicle sub-systems such as a roll stability control system or the like may also be included.

As noted above the vehicle 100 also includes a cruise control system 16 which is operable to automatically maintain vehicle speed at a selected speed when the vehicle is travelling at speeds in excess of 25 kph. The cruise control system 16 is provided with a cruise control HMI (human machine interface) 18 by which means the user can input a target vehicle speed to the cruise control system 16 in a known manner. In one embodiment of the invention, cruise control system input controls are mounted to a steering wheel 171 (FIG. 4). The cruise control system 16 may be switched on by pressing a cruise control system selector button 176. When the cruise control system 16 is switched on, depression of a 'set-speed' control 173 sets the current value of a cruise control set-speed parameter, cruise_set-speed, to the current vehicle speed. Depression of a '+' button 174 allows the value of cruise_set-speed to be increased whilst depression of a '-' button 175 allows the value of cruise_set-speed to be decreased. A resume button 173R is provided that is operable to control the cruise control system 16 to resume speed control at the instant value of cruise_set-speed following driver over-ride. It is to be understood that known on-highway cruise control systems including the present system 16 are configured so that, in the event that the user depresses the brake or, in the case of vehicles with a manual transmission, a clutch pedal, control of vehicle speed by the cruise control system 16 is cancelled and the vehicle 100 reverts to a manual mode of operation which requires accelerator or brake pedal input by a user in order to maintain vehicle speed. In addition, detection of a wheel slip event, as may be initiated by a loss of traction, also has the effect of cancelling control of vehicle speed by the cruise control system 16. Speed control by the system 16 is resumed if the driver subsequently depresses the resume button 173R.
The cruise control system 16 monitors vehicle speed and automatically corrects any deviation from the target vehicle speed so that the vehicle speed is maintained at a substantially constant value, typically in excess of 25 kph. In other words, the cruise control system is ineffective at speeds lower than 25 kph. The cruise control HMI 18 may also be configured to provide an alert to the user about the status of the cruise control system 16 via a visual display of the HMI 18. In the present embodiment the cruise control system 16 is configured to allow the value of cruise_set-speed to be set to any value in the range 25-150 kph. The LSP control system 12 also provides a speed-based control system for the user which enables the user to select a very low target speed at which the vehicle can progress without any pedal inputs being required by the user to maintain vehicle speed. Low-speed speed control (or progress control) functionality is not provided by the on-highway cruise control system 16 which operates only at speeds above 25 kph.

In the present embodiment, the LSP control system 12 is activated by pressing LSP control system selector button 178 mounted on steering wheel 171. The system 12 is operable to apply selective powertrain, traction control and braking actions to one or more wheels of the vehicle 100, collectively or individually.

The LSP control system 12 is configured to allow a user to input a desired value of vehicle target speed in the form of a set-speed parameter, user set-speed, via a low-speed progress control HMI (LSP HMI) 20 (FIG. 1, FIG. 3) which shares certain input buttons 173-175 with the cruise control system 16 and HDC control system 12HD. Provided the vehicle speed is within the allowable range of operation of the LSP control system 12 (which is the range from 2 to 30 kph in the present embodiment although other ranges are also useful) and no other constraint on vehicle speed exists whilst under the control of the LSP control system 12, the LSP control system 12 controls vehicle speed in accordance with a LSP control system set-speed value LSP_set-speed which is set substantially equal to user set-speed. Unlike the cruise control system 16, the LSP control system 12 is configured to operate independently of the occurrence of a traction event. That is, the LSP control system 12 does not cancel speed control upon detection of wheel slip. Rather, the LSP control system 12 actively manages vehicle behaviour when slip is detected.

The LSP control HMI 20 is provided in the vehicle cabin so as to be readily accessible to the user. The user of the vehicle 100 is able to input to the LSP control system 12, via the LSP HMI 20, the desired value of user set-speed as noted above by means of the 'set-speed' button 173 and the '+'/ '-' buttons 174, 175 in a similar manner to the cruise control system 16. The LSP HMI 20 also includes a visual display by means of which information and guidance can be provided to the user about the status of the LSP control system 12.

The LSP control system 12 receives an input from the ABS controller 13 of the braking system 22 of the vehicle indicative of the extent to which the user has applied braking by means of the brake pedal 163. The LSP control system 12 also receives an input from an accelerator pedal 161 indicative of the extent to which the user has depressed the accelerator pedal 161, and an input from the transmission or gearbox 124. This latter input may include signals representative of, for example, the speed of an output shaft of the gearbox 124, an amount of torque converter slip and a gear ratio request. Other inputs to the LSP control system 12 include an input from the cruise control HMI 18 which is representative of the status (ON/OFF) of the cruise control system 16, an input from the LSP control HMI 20, and an input from a gradient sensor 45 indicative of the gradient of the driving surface over which the vehicle 100 is driving. In the present embodiment the gradient sensor 45 is a gyroscopic sensor. In some alternative embodiments the LSP control system 12 receives a signal indicative of driving surface gradient from another controller such as the ABS controller 13. The ABS controller 13 may determine gradient based on a plurality of inputs, optionally based at least in part on signals indicative of vehicle longitudinal and lateral acceleration and a signal indicative of vehicle reference speed (v_actual), being a signal indicative of actual vehicle speed over ground. Methods for the calculation of vehicle reference speed based for example on vehicle wheel speeds are well known. For example in some known vehicles the vehicle reference speed may be determined to be the speed of the second slowest turning wheel, or the average speed of all the wheels. Other ways of calculating vehicle reference speed may be useful in some embodiments, including by means of a camera device or radar sensor.
The HDC system 12HD is activated by depressing button 177 comprised by HDC system HMI 20HD and mounted on the steering wheel 171. When the HDC system 12HD is active, the system 12HD controls the braking system 22 in order to limit vehicle speed to a value corresponding to that of a HDC set-speed parameter HDC_set-speed which may be controlled by a user in a similar manner to the set-speed of the cruise control system 16 and LSP control system, using the same control buttons 173, 173R, 174, 175. The HDC system 12HD is operable to allow the value of HDC_set-speed to be set to any value in the range from 2-30 kph. The HDC set-speed parameter may also be referred to as an HDC target speed. Provided the user does not override the HDC system 12HD by depressing the accelerator pedal 161 when the HDC system 12HD is active, the HDC system 12HD controls the braking system 22 (FIG. 3) to prevent vehicle speed from exceeding HDC_set-speed. In the present embodiment the HDC system 12HD is not operable to apply positive drive torque. Rather, the HDC system 12HD is only operable to cause negative brake torque to be applied, via the braking system 22. It is to be understood that the VCU 10 is configured to implement a known Terrain Response (TR) (RTM) System of the kind described above in which the VCU 10 controls settings of one or more vehicle systems or sub-systems such as the powertrain controller 11 in dependence on a selected driving mode. The driving mode may be selected by a user by means of a driving mode selector 141S (FIG. 1). The driving modes may also be referred to as terrain modes, terrain response (TR) modes, or control modes. In the embodiment of FIG. 1 five driving modes are provided: an 'on-highway' driving mode suitable for driving on a relatively hard, smooth driving surface where a relatively high surface coefficient of friction exists between the driving surface and wheels of the vehicle; a 'sand' driving mode suitable for driving over sandy terrain, being terrain characterised at least in part by relatively high drag, relatively high deformability or compliance and relatively low surface coefficient of friction; a 'grass, gravel or snow' (GGS) driving mode suitable for driving over grass, gravel or snow, being relatively slippery surfaces (i.e. having a relatively low coefficient of friction between surface and wheel and, typically, lower drag than sand); a 'rock crawl' (RC) driving mode suitable for driving slowly over a rocky surface; and a 'mud and ruts' (MR) driving mode suitable for driving in muddy, rutted terrain. Other driving modes may be provided in addition or instead. In the present embodiment the selector 141S also allows a user to select an 'automatic driving mode selection condition' of operation in which the VCU 10 selects automatically the most appropriate driving mode as described in more detail below. The on-highway driving mode may be referred to as a 'special programs off' (SPO) mode in some embodiments since it corresponds to a standard or default driving mode, and is not required to take account of special factors such as relatively low surface coefficient of friction, or surfaces of high roughness.

The LSP control system 12 causes the vehicle 100 to operate in accordance with the value of LSP_set-speed.

In order to cause application of the necessary positive or negative torque to the wheels, the VCU 10 may command that positive or negative torque is applied to the vehicle wheels by the powertrain 129 and/or that a braking force is applied to the vehicle wheels by the braking system 22, either or both of which may be used to implement the change in torque that is necessary to attain and maintain a required vehicle speed. In some embodiments torque is applied to the vehicle wheels individually, for example by powertrain torque vectoring, so as to maintain the vehicle at the required speed. Alternatively, in some embodiments torque may be applied to the wheels collectively to maintain the required speed, for example in vehicles having drivelines where torque vectoring is not possible. In some embodiments, the powertrain controller 11 may be operable to implement torque vectoring to control an amount of torque applied to one or more wheels by controlling a driveline component such as a rear drive unit, front drive unit, differential or any other suitable component. For example, one or more components of the driveline 130 may include one or more clutches operable to allow an amount of torque applied to one or more wheels to be varied. Other arrangements may also be useful. Where a powertrain 129 includes one or more electric machines, for example one or more propulsion motors and/or generators, the powertrain controller 11 may be operable to modulate torque applied to one or more wheels in order to implement torque vectoring by means of one or more electric machines. In some embodiments the LSP control system 12 may receive a signal wheel_slip (also labelled 48 in FIG. 3) indicative of a wheel slip event having occurred.
This signal 48 is also supplied to the on-highway cruise control system 16 of the vehicle, in which it triggers an override or inhibit mode of operation so that automatic control of vehicle speed by the on-highway cruise control system 16 is suspended or cancelled. However, the LSP control system 12 is not arranged to cancel or suspend operation on receipt of the wheel_slip signal 48. Rather, the system 12 is arranged to monitor and subsequently manage wheel slip so as to reduce driver workload. During a slip event, the LSP control system 12 continues to compare the measured vehicle speed with the value of LSP_set-speed, and continues to control automatically the torque applied to the vehicle wheels (by the powertrain 129 and braking system 22) so as to maintain vehicle speed at the selected value. It is to be understood therefore that the LSP control system 12 is configured differently to the cruise control system 16, for which a wheel slip event has the effect of overriding the cruise control function so that manual operation of the vehicle must be resumed, or speed control by the cruise control system 16 resumed by pressing the resume button 173R or set-speed button 173.

The vehicle 100 is also provided with additional sensors (not shown) which are representative of a variety of different parameters associated with vehicle motion and status. These include an inertial measurement unit (IMU) 23 shown in FIG. 3 which provides information indicative of vehicular yaw, roll and pitch angle and rate, and rate of longitudinal acceleration. The signals from the sensors provide, or are used to calculate, a plurality of driving condition indicators (also referred to as terrain indicators) which are indicative of the nature of the terrain conditions over which the vehicle 100 is travelling. The sensors (not shown) on the vehicle 100 include, but are not limited to, sensors which provide continuous sensor outputs to the VCU 10, including wheel speed sensors, as mentioned previously, an ambient temperature sensor, an atmospheric pressure sensor, tyre pressure sensors, wheel articulation sensors, an engine torque sensor (or engine torque estimator), a steering angle sensor, a steering wheel speed sensor, a gradient sensor (or gradient estimator), a lateral acceleration sensor which may be part of the SCS 14, a brake pedal position sensor, a brake pressure sensor, an accelerator pedal position sensor, longitudinal, lateral and vertical motion sensors, and water detection sensors forming part of a vehicle wading assistance system (not shown). In other embodiments, only a selection of the aforementioned sensors may be used.

The VCU 10 also receives a signal from the steering controller 170C. The steering controller 170C is in the form of an electronic power assisted steering unit (ePAS unit) 170C. The steering controller 170C provides a signal to the VCU 10 indicative of the steering force being applied to steerable road wheels 111, 112 of the vehicle 100. This force corresponds to that applied by a user to the steering wheel 171 in combination with steering force generated by the ePAS unit 170C. The ePAS unit 170C also provides a signal indicative of steering wheel rotational position or angle. The steering controller 170C is also configured to set the steering angle of the steerable road wheels to a desired value, using electric motors forming part of the ePAS unit. Thus, the vehicle 100 is configured to implement autonomous steering control when required. In the present embodiment, the VCU 10 evaluates the various sensor inputs to determine the probability that each of the plurality of different TR modes (control modes or driving modes) for the vehicle subsystems is appropriate, with each control mode corresponding to a particular terrain type over which the vehicle is travelling (for example, mud and ruts, sand, grass/gravel/snow) as described above.

If the user has selected operation of the vehicle in the automatic driving mode selection condition, the VCU 10 then selects the most appropriate one of the control modes and is configured automatically to control the subsystems according to the selected mode. This aspect of the invention is described in further detail in our co-pending patent applications GB2492748, GB2492655 and GB2499279, the contents of each of which is incorporated herein by reference as noted above.

As indicated above, the nature of the terrain over which the vehicle is travelling (as determined by reference to the selected control mode) may also be utilised in the LSP control system 12 to determine an appropriate increase or decrease in vehicle speed. For example, if the user selects a value of user set-speed that is not suitable for the nature of the terrain over which the vehicle is travelling, the system 12 is operable to automatically adjust the value of LSP_set-speed to a value lower than user set-speed. In some cases, for example, the user selected speed may not be achievable or appropriate over certain terrain types, particularly in the case of uneven or rough surfaces. If the system 12 selects a set-speed (a value of LSP_set-speed) that differs from the user-selected set-speed user set-speed, a visual indication of the speed constraint is provided to the user via the LSP HMI 20 to indicate that an alternative speed has been adopted.

Other arrangements may be useful. In the present embodiment, the vehicle 100 is provided with a stereoscopic camera system 185C configured to generate stereo colour image pairs by means of a pair of forward-facing colour video cameras comprised by the system 185C. Suitable stereo image pairs in respect of terrain ahead of the vehicle are useful sources of terrain information indicative of the topography of the terrain. A stream of dual video image data is fed from the cameras to the VCU 10 which processes the image data received in a processing portion 19 and repeatedly generates a 3D point cloud data set based on the images received. Techniques for generating 3D point cloud data sets based on stereoscopic image data are well known. Each point in the 3D point cloud data set corresponds to a 3D coordinate of a point on a surface of terrain ahead of the vehicle 100 viewed by each of the forward-facing video cameras of the stereoscopic camera system 185C.
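By way of an illustrative sketch only (this is not part of the disclosed system, and the camera parameters below are assumed example values), a disparity map derived from a stereo image pair may be back-projected into a 3D point cloud using the standard stereo relation depth = focal length × baseline / disparity:

```python
import numpy as np

def disparity_to_point_cloud(disparity, focal_px, baseline_m, cx, cy):
    """Back-project each valid disparity pixel to a 3D point (X, Y, Z)
    in the camera frame; zero-disparity (invalid) pixels are skipped."""
    vs, us = np.nonzero(disparity > 0)
    d = disparity[vs, us].astype(float)
    Z = focal_px * baseline_m / d      # depth from the stereo relation
    X = (us - cx) * Z / focal_px       # lateral offset from the optical axis
    Y = (vs - cy) * Z / focal_px       # vertical offset from the optical axis
    return np.stack([X, Y, Z], axis=1)

# Toy 4x4 disparity map: a larger disparity implies a nearer surface.
disp = np.zeros((4, 4))
disp[2, 1] = 10.0
disp[3, 3] = 20.0
cloud = disparity_to_point_cloud(disp, focal_px=500.0, baseline_m=0.2, cx=2.0, cy=2.0)
```

With these assumed parameters the two pixels map to depths of 10 m and 5 m respectively; a real system would obtain the disparity map from a stereo-matching algorithm.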

In the present embodiment the 3D point cloud dataset is transformed such that the origin of the frame of reference of the dataset is the midpoint of a line joining the points at which the two front wheels 1 1 1 , 1 12 of the vehicle 100 touch the ground over which the vehicle 100 is driving. In the present embodiment the frame of reference is defined with respect to Cartesian coordinates X, Y, Z where X is an axis transverse to the direction of vehicle travel, i.e. along a lateral direction with respect to the vehicle 100, Y is an axis oriented in an upward direction with respect to the vehicle 100, corresponding to a substantially vertically upward direction when the vehicle 100 is parked on level ground, and Z is parallel to or coincident with a longitudinal axis of the vehicle 100, along the direction of travel of the vehicle 100.
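As a sketch of this change of frame (the camera pose used here is an invented example, not a value from the description), each camera-frame point can be rotated and translated into the vehicle frame:

```python
import numpy as np

def to_vehicle_frame(points_cam, R_cam_to_veh, t_cam_in_veh):
    """Apply p_veh = R @ p_cam + t to every row of points_cam."""
    return points_cam @ R_cam_to_veh.T + t_cam_in_veh

# Assume the camera sits 1.5 m above and 0.5 m behind the origin (the
# midpoint between the front wheel contact points), with its axes already
# aligned to the vehicle X (lateral), Y (up) and Z (forward) axes.
R = np.eye(3)
t = np.array([0.0, 1.5, -0.5])
p_cam = np.array([[0.0, -1.5, 10.5]])   # a ground point seen by the camera
p_veh = to_vehicle_frame(p_cam, R, t)   # lies on the ground, 10 m ahead
```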

The processing portion 19 is configured to compute a terrain elevation map populated by data points of the point cloud dataset. FIG. 5 is a schematic illustration in plan view of an elevation map of terrain ahead of the vehicle 100. In the present embodiment the elevation map is generated with respect to the vehicle axes X, Y, Z (i.e. in the vehicle frame of reference). Following the MLS (multi-level surface) map methodology, the elevation map is notionally considered to be composed of square cells C in the X-Z plane of predetermined size as illustrated schematically in FIG. 5 (see inset). In the present embodiment the cells are of side 0.25m although other sizes may be useful in some embodiments. In the present embodiment, each 3D point of the elevation map is assigned to a cell C according to its position with respect to the X-Z plane. A given cell may contain points that are at multiple levels or heights, i.e. having different values of Y coordinate. The points within a given cell are grouped into one or more respective 'patches' according to the value of the Y coordinate, points having a Y coordinate within a given predetermined range of values of the Y coordinate being assigned to a patch corresponding to that range of values of Y coordinate. Where patches of points having different Y coordinates are identified in the same cell the range may, for example, be defined by a +/- range in the Y axis around an average Y value of each patch. Thus, for example, although points within a given patch may have different Y coordinates, as they are allocated within a range they are attributable to the same surface. Each patch may be defined as being substantially parallel to the X-Z plane; for example, it may be defined as an X-Z plane the size of the patch, having a Y coordinate which is the average value of the Y coordinates of the points in that patch.
In contrast, points within the same cell that correspond to a different surface, for example a bridge passing over the driving surface, or an overhanging tree branch, would not form part of the same patch as points falling on the driving surface. It is to be understood that each patch contains information regarding the geometry of the points falling within that patch.
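A minimal sketch of this binning step, using the 0.25 m cell size quoted above and an assumed Y-grouping tolerance, might look as follows; point coordinates are (X, Y, Z) in the vehicle frame:

```python
CELL = 0.25    # cell side in metres, as stated in the text
Y_TOL = 0.15   # patch grouping tolerance in metres -- an assumed value

def build_cells(points):
    """Map an (X-index, Z-index) cell key to a list of 'patches', each a
    sorted list of Y values belonging to one surface level."""
    cells = {}
    for x, y, z in points:
        key = (int(x // CELL), int(z // CELL))
        cells.setdefault(key, []).append(y)
    patched = {}
    for key, ys in cells.items():
        ys.sort()
        patches, current = [], [ys[0]]
        for y in ys[1:]:
            if y - current[-1] <= Y_TOL:
                current.append(y)        # same surface level
            else:
                patches.append(current)  # height gap -> new patch (e.g. overhang)
                current = [y]
        patches.append(current)
        patched[key] = patches
    return patched

# Two ground points plus an overhanging branch above the same cell.
pts = [(0.1, 0.00, 0.1), (0.2, 0.02, 0.2), (0.1, 2.50, 0.15)]
cells = build_cells(pts)   # one cell containing two patches
```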

It is to be understood that data structures other than MLS maps may be employed, for example voxel maps or any other electronic surface-mapping technology, the purpose of the map being to reduce the number of data points to be analysed.

The processing portion 19 analyses the cells and a cell is labelled as an 'obstacle' cell if the variance and range of the datapoints in the lowest patch of the cell exceed respective predetermined threshold values. Additionally, in some arrangements a cell is labelled as an 'obstacle' cell if an overhanging patch is detected having a height below a vehicle dependent threshold value. In some alternative embodiments, mean height may be employed instead. In some embodiments, height difference (mean height) between a current cell and neighbouring cells may be used to detect obstacles, i.e. relatively large steps in height between neighbouring cells may indicate the presence of an obstacle. Cells not labelled as obstacle cells may be labelled as a 'horizontal patch'. If a cell does not contain any points it may be labelled an 'empty patch'. In addition to implementing the MLS methodology as described above, the elevation map is refined such that overhanging patches falsely labelled as horizontal patches are discarded. An overhanging patch is a patch that is at a different height to a corresponding lower patch of the same cell, consistent with the presence of an object that overhangs the surface over which the vehicle 100 is driving, such as a branch of a tree or a bridge that the vehicle 100 is passing underneath.
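The variance-and-range labelling rule can be sketched as below; the two threshold values are illustrative assumptions, not figures from the description:

```python
from statistics import pvariance

VAR_T = 0.01    # variance threshold in m^2 -- an assumed value
RANGE_T = 0.3   # range threshold in metres -- an assumed value

def label_cell(patches):
    """patches: list of Y-value lists for one cell, lowest patch first.
    A cell whose lowest patch is both spread out (high variance) and tall
    (large range) is labelled an obstacle."""
    if not patches:
        return 'empty'
    lowest = patches[0]
    if pvariance(lowest) > VAR_T and (max(lowest) - min(lowest)) > RANGE_T:
        return 'obstacle'
    return 'horizontal'
```

For example, a near-flat patch such as `[[0.00, 0.01, 0.02]]` would be labelled 'horizontal', while a tall scattered patch such as `[[0.0, 0.2, 0.5]]` would be labelled 'obstacle'.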

Cells labelled as obstacle cells are considered to be cells that cannot correspond to a body of water and so are disregarded for the purposes of identifying areas that may correspond to a body of water. Furthermore, 'empty' cells that are behind obstacles and which would be occluded from camera view ('obstacle shadows' or 'shadow cells') are identified and also disregarded. It is to be understood that these 'empty' cells will typically have a relatively low (if not zero) number density of data points per unit horizontal area of terrain.
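One simplified way to mark such shadow cells (a sketch only; a production system would ray-trace against the patch heights) is to test whether an obstacle cell lies on the line between the camera position and the candidate cell in the X-Z plane:

```python
def is_shadowed(cell, camera, obstacle_cells, steps=50):
    """Sample points along the line from the camera position to the cell
    centre; if any sample falls in an obstacle cell, the cell is occluded.
    Cells and positions are (x_index, z_index) pairs on the MLS grid."""
    (cx, cz), (px, pz) = cell, camera
    for i in range(1, steps):
        t = i / steps
        sample = (round(px + t * (cx - px)), round(pz + t * (cz - pz)))
        if sample in obstacle_cells and sample != cell:
            return True
    return False

obstacle_cells = {(0, 5)}
behind = is_shadowed((0, 9), (0, 0), obstacle_cells)    # occluded by the obstacle
beside = is_shadowed((3, 9), (0, 0), obstacle_cells)    # line of sight is clear
```

The 'fan' shape of shadow regions described below follows directly from this geometry: cells on rays passing through the obstacle are occluded, neighbouring rays are not.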

In the present embodiment the processing portion 19 determines a predicted path of the vehicle 100 based on the instant value of steering angle of the vehicle 100. In the present embodiment the steering angle is considered to be directly related to the rotational angle of the steering wheel 171 of the vehicle 100. The predicted path is considered to be a path that will be followed by the vehicle 100 if the vehicle continues moving with the instant steering angle and with substantially no slip or skid of the wheels of the vehicle 100. It is to be understood that the predicted path swept by the vehicle 100 may be considered to be an area of width substantially equal to a maximum track of the vehicle (maximum lateral distance between left and right front or rear wheels) and centred on a centreline of the vehicle 100.
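A common way to realise such a prediction (a sketch under assumed vehicle dimensions, not the patented method) is a kinematic bicycle model: with a fixed steering angle and no slip the vehicle follows a circular arc of radius wheelbase / tan(steer):

```python
import math

WHEELBASE = 2.9   # metres -- an assumed value
TRACK = 1.7       # metres -- assumed maximum track width

def predicted_path(steer_rad, n_points=10, step_m=1.0):
    """Centreline (X, Z) points of the arc the vehicle will follow with a
    fixed steering angle and no wheel slip (kinematic bicycle model)."""
    if abs(steer_rad) < 1e-6:                  # straight ahead
        return [(0.0, i * step_m) for i in range(1, n_points + 1)]
    radius = WHEELBASE / math.tan(steer_rad)   # turn radius
    pts = []
    for i in range(1, n_points + 1):
        theta = i * step_m / radius            # arc angle travelled
        pts.append((radius * (1 - math.cos(theta)), radius * math.sin(theta)))
    return pts

# The swept area is then the strip of width TRACK centred on these points.
```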

The processing portion 19 then identifies substantially continuous regions or areas of terrain that are defined by empty cells, not being shadow cells, and which have at least a portion that lies within the predicted path swept by the vehicle 100. These regions or areas are referred to herein as 'empty regions'.
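Grouping empty cells into such continuous regions amounts to a connected-component search over the cell grid; a minimal flood-fill sketch (4-connectivity is an assumption) is:

```python
def empty_regions(empty_cells):
    """Group empty (non-shadow) cells into 4-connected regions.
    empty_cells: set of (x_index, z_index); returns a list of cell sets."""
    remaining, regions = set(empty_cells), []
    while remaining:
        seed = remaining.pop()
        region, frontier = {seed}, [seed]
        while frontier:
            x, z = frontier.pop()
            for nb in ((x + 1, z), (x - 1, z), (x, z + 1), (x, z - 1)):
                if nb in remaining:       # unvisited neighbour -> same region
                    remaining.discard(nb)
                    region.add(nb)
                    frontier.append(nb)
        regions.append(region)
    return regions

cells = {(0, 0), (0, 1), (1, 1), (5, 5)}   # two separate clusters
regions = empty_regions(cells)
```

Each resulting region would then be kept only if some of its cells fall within the predicted path.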

The reason for identifying empty cells not being shadow cells is that such cells are candidate cells for cells which correspond to the geographical location of a body of water, a crest associated with a slope, a ditch, 'over bright' regions or 'dark' regions, as described in more detail below. It is to be understood that in the case of a body of water, the water will typically specularly reflect objects above the body of water. The disparity of the objects in a stereoscopic image pair will typically be lower than for any objects floating on the surface of the body of water, and therefore the location of data points in the 3D point cloud corresponding to regions of terrain visible in the stereoscopic image pair will not in fact correspond to that region of terrain, because the objects will be considered to be located further away from the stereoscopic camera system 185C. Accordingly, empty cells are typically present in the region of terrain corresponding to the body of water.

In the case that the empty region is associated with a crest of a slope, a measurement of the slope of the terrain immediately ahead of the empty region may provide a useful indication whether the empty region corresponds to a body of water or to a region behind the crest of a slope.

FIG. 6(a) is an example of an image captured by a left camera of the stereoscopic camera system 185C of the vehicle 100. FIG. 6(b) is a representation of the corresponding elevation map, generated according to the MLS methodology, i.e. the map corresponding to that shown in FIG. 5 in respect of the image of FIG. 6(a). For the map of FIG. 6(b), regions identified by the processing portion 19 as corresponding to regions of different type are shown with different contrast or grayscale. Regions corresponding to obstacles OB are shown hatched and labelled OB, regions corresponding to obstacle shadows are labelled SH, and regions corresponding to empty cells not being obstacle shadow cells are labelled ER. The left and right wheel tracks are shown superimposed on the map. The corresponding features are indicated in the camera image of FIG. 6(a) for ease of correlation by the reader.

FIG. 7 illustrates the effect of the presence of obstacles on the 3D point cloud data by way of a relatively simple example. Shown in FIG. 7 is a schematic illustration of an MLS map comprising cells C. Cells C corresponding to the location of an obstacle OB are shown in relatively dark shading, whilst cells corresponding to the shadow SH of the obstacle are shown in medium shading. Other cells are shown unshaded. The location of the stereoscopic camera system 185C relative to the terrain represented by the MLS map is shown schematically in the lower portion of the figure. The 'fan' shape of the shadow region can be understood from the figure to result from the location of the camera 185C, and its location within the bounds of the MLS map will be dependent at least in part on the location of the camera system 185C relative to the object. X and Y vehicle axes (FIG. 2) are also shown superimposed on FIG. 7. For each empty region ER, the processing portion 19 determines, from the 3D point cloud data, the gradient of the terrain immediately preceding the region, along the predicted path and within the path defined by the left and right wheel tracks LW, RW. If the gradient exceeds a predetermined crest gradient value and a width of the region exceeds a predetermined crest width value, then the terrain immediately preceding the region is considered to be a crest region, corresponding to the crest of a slope and not a body of water. In the present embodiment the predetermined crest gradient value is set to substantially 10% and the predetermined crest width value is substantially 2m.
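The crest test above reduces to a pair of threshold comparisons; the following sketch uses the 10% gradient and 2 m width values quoted in the text:

```python
CREST_GRADIENT = 0.10   # substantially 10 %, as stated in the text
CREST_WIDTH_M = 2.0     # substantially 2 m, as stated in the text

def is_crest(preceding_gradient, region_width_m):
    """True when the terrain immediately before an empty region is steep
    enough, and the region wide enough, to be treated as a crest rather
    than a candidate body of water."""
    return (preceding_gradient > CREST_GRADIENT
            and region_width_m > CREST_WIDTH_M)
```

For example, a 15% approach gradient with a 3 m wide empty region would be classified as a crest, whereas the same gradient with a 1 m wide region would not.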

If the processing portion 19 determines that the region is not a crest region, the processing portion 19 attempts to determine whether the region may correspond to a body of water or a ditch. In order to do this, the processing portion 19 identifies, in one of the pairs of 2D images captured by the stereoscopic camera system 185C, a region that corresponds to the empty region ER identified in the MLS map. The processing portion 19 extracts from this 2D image the region corresponding to the empty region ER and, in addition, an area of the image immediately surrounding region ER. This is so as to attempt to include more reflections from objects in the body of water, if indeed the empty region ER does correspond to a body of water.

By way of example, FIG. 8(a) shows the portion of the image of FIG. 6(a) corresponding to the empty region ER shown in FIG. 6(b). The image is in the form of a cropped polygonal region of the camera image of FIG. 6(a). The processing portion 19 defines an extended cropped polygonal region of the image of FIG. 6(a) being a region that includes an area of the image immediately surrounding the cropped polygonal region of FIG. 8(a). The extended cropped polygonal region in this example is shown in FIG. 8(b).

The processing portion 19 then attempts to determine the number of pixels in the extended cropped polygonal region that correspond to the reflection of an image of the environment, such as an image of a tree (being a near object) or sky (being a far object). Such pixels are referred to herein as reflection pixels, and the number of pixels is referred to as the 'speckle count' for the extended cropped polygonal area. The presence of reflection pixels is determined by considering the corresponding region of a disparity image, being a dataset of similar dimensions to each of the image pairs captured by the stereoscopic camera system, where the datapoints of the dataset correspond to the distance (disparity) between corresponding pixels of the respective images. It is to be understood that the disparity between pixels corresponding to a given point on an object will be higher the closer the object is to the camera system 185C. Thus, the disparity map will be expected to show relatively low disparity values in respect of reflection pixels in the empty region.

In the present embodiment, the processing portion 19 is configured to identify reflection pixels in respect of far objects by identifying pixels having a disparity corresponding to a distance of at least 40m from the camera system 185C. The following equation may be employed:

Disparity < (focal length * base line) / depth, where disparity and focal length are in units of pixels, whilst base line and depth are in units of metres.
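By way of illustration only, the far-object test above may be sketched as follows in Python (the function name, use of NumPy and the example values are illustrative and form no part of the application):

```python
import numpy as np

def far_reflection_mask(disparity, focal_length_px, baseline_m, depth_m=40.0):
    """Flag pixels whose disparity implies a distance beyond depth_m.

    disparity: 2D NumPy array of disparity values in pixels. Pixels whose
    disparity is below focal_length * baseline / depth appear further away
    than depth_m and are treated as reflections of far objects (e.g. sky).
    """
    threshold = focal_length_px * baseline_m / depth_m  # units: pixels
    return disparity < threshold
```

For example, with an (assumed) focal length of 800 pixels and a 0.2 m baseline, the 40 m cut-off corresponds to a disparity of 4 pixels.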

The processing portion 19 then identifies reflection pixels in respect of near objects. The processing portion does this by considering the disparity values for each pixel along each substantially vertical column of the image, i.e. from the top of the image to the bottom of the image (or vice-versa). The processing portion 19 checks the variation in the disparity values from top to bottom (or from bottom to top). If no reflections are present in the image, the disparity values will gradually increase from top to bottom (or decrease from bottom to top), i.e. regions of the image corresponding to objects closer to the camera will have higher disparity than those corresponding to objects further away from the camera. However, if reflections are present, the disparity values may decrease rather than increase over at least some pixels moving from top to bottom. In the present embodiment, the processing portion 19 considers the disparity values at the top and bottom of each column of the image, which are considered to correspond to regions that are not part of a body of water, such as ground or objects bordering a puddle. The processing portion 19 then determines the number of pixels in a given column for which the disparity is less than the disparity of the pixels at the top and bottom of the column, that is, pixels that represent an image of an object that is further away than the objects corresponding to the pixels at the top and bottom of the 2D image. These pixels are considered to correspond to reflections from near objects.
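By way of illustration only, the per-column near-object test may be sketched as follows (the function name is illustrative; the end pixels of each column are taken to border the candidate region, as described above):

```python
import numpy as np

def near_reflection_count(disparity_column):
    """Count interior pixels of a vertical column whose disparity is lower
    than that of both end pixels, i.e. pixels that appear further away than
    the terrain bordering the candidate region. Such pixels are interpreted
    as reflections of near objects (e.g. a bush or tree) in a water surface.
    """
    col = np.asarray(disparity_column, dtype=float)
    top, bottom = col[0], col[-1]       # end pixels: not part of the water
    interior = col[1:-1]                # pixels inside the candidate region
    return int(np.sum(interior < min(top, bottom)))
```

For instance, a column of disparities [5, 2, 3, 6, 8] yields a count of 2, since two interior pixels have lower disparity than both end pixels.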

The processing portion 19 then sums the number of near and far reflection pixels identified in the empty region ER to obtain a 'speckleCount' value. The processing portion 19 is configured to then determine whether the empty region ER corresponds to a body of water. The processing portion 19 does this by first eliminating 'dark' regions and 'over bright' regions from the 3D point cloud dataset as regions that might correspond to a body of water.

Thus, the processing portion 19 first determines whether the empty region corresponds to a 'dark' image region or an 'over bright' image region. The processing portion 19 does this by first calculating the average 'colour value' of each of the R, G, B channels in the cropped polygonal region; in the present example this is the region shown in FIG. 8(a). If the average colour values are less than a lower colour value threshold and the value of speckleCount is below a lower speckleCount threshold value, the polygonal region is classified as a 'dark' region. In the present embodiment the lower speckleCount threshold is a value of substantially 10 disparity points per selected polygon region although other values may be useful. The lower colour value threshold is set at 35 arbitrary units in the present embodiment although other values may be useful. If the average colour values are greater than an upper colour value threshold and the value of speckleCount is below the lower speckleCount threshold value, the polygonal region is classified as an 'over bright' region.
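By way of illustration only, this elimination step may be sketched as follows. The lower colour threshold (35) and lower speckleCount threshold (10) are taken from the text; the upper colour threshold of 220 is an assumed illustrative value, since the text does not state it:

```python
import numpy as np

def classify_region(rgb_pixels, speckle_count,
                    lower_colour=35.0, upper_colour=220.0,  # upper value assumed
                    lower_speckle=10):
    """Classify a cropped polygonal region as 'dark', 'over bright' or a
    remaining 'candidate' using the per-channel average colour values and
    the region's speckle count."""
    # Average each of the R, G, B channels over all pixels in the region.
    means = np.asarray(rgb_pixels, dtype=float).reshape(-1, 3).mean(axis=0)
    if (means < lower_colour).all() and speckle_count < lower_speckle:
        return 'dark'
    if (means > upper_colour).all() and speckle_count < lower_speckle:
        return 'over bright'
    return 'candidate'
```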

The processing portion 19 then determines whether any regions not eliminated in this manner correspond to a body of water. The processing portion 19 determines that a region not already eliminated may be a region corresponding to a body of water if the following conditions are met:

(a) the value of speckleCount exceeds a predetermined upper speckleCount threshold value; and

(b) the average width of the extended cropped polygonal region in the image, along the direction of travel of the vehicle, is greater than a width threshold value and the height difference (with respect to the MLS map) between cells at opposite ends of a column of cells spanning the extended polygonal region is less than a height difference threshold value.

In the present embodiment, the predetermined upper speckleCount threshold value is substantially 50 points per polygon region although other values may be useful. In the present embodiment the width threshold value is set to 20 pixels and the height difference threshold value is set to 0.25m. If the processing portion 19 determines that the empty region ER does correspond to a body of water, the processing portion 19 outputs a signal to the LSP control system 12 informing the LSP control system 12 that a body of water has been identified ahead of the vehicle 100 within the predicted path swept by the vehicle 100. The processing portion 19 also provides an output to the LSP control system 12 indicating the distance of the body of water from the vehicle 100.
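By way of illustration only, the water test of conditions (a) and (b) may be sketched as follows, using the threshold values of the present embodiment (speckleCount above 50, width above 20 pixels, column height difference below 0.25 m); the function and parameter names are illustrative:

```python
def is_body_of_water(speckle_count, avg_width_px, column_height_diffs_m,
                     upper_speckle=50, width_threshold_px=20,
                     height_diff_threshold_m=0.25):
    """Return True if the extended cropped polygonal region meets the
    water criteria: a high speckle count, a region wide enough along the
    direction of travel, and a near-level surface (small height difference
    between cells at opposite ends of each spanning column of the MLS map).
    """
    level = all(abs(d) < height_diff_threshold_m for d in column_height_diffs_m)
    return (speckle_count > upper_speckle
            and avg_width_px > width_threshold_px
            and level)
```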

In the present embodiment, if the processing portion 19 determines that the empty region ER does not correspond to a body of water, and is neither a dark region nor an over bright region, the processing portion 19 determines whether the region ER meets the criteria for being classified as a ditch. The processing portion 19 determines that the empty region ER corresponds to a ditch if the following conditions are met:

(a) the average width of the extended cropped polygonal region along the vehicle direction is less than a ditch width threshold value;

(b) the value of speckleCount is less than the predetermined lower speckleCount threshold value.

In the present embodiment, the ditch width threshold value is 20 pixels, being the same threshold value as that of the polygonal region used to identify puddles, although other values may be useful. In the present embodiment the predetermined lower speckleCount threshold value is 5 points. Other values may be useful. In some embodiments, the value of speckleCount must be zero. If the processing portion 19 determines that the empty region ER does correspond to a ditch, the processing portion 19 outputs a signal to the LSP control system 12 informing the LSP control system 12 that a ditch has been identified ahead of the vehicle 100 within the predicted path swept by the vehicle 100. The processing portion 19 also provides an output to the LSP control system 12 indicating the distance of the ditch from the vehicle 100.
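By way of illustration only, the ditch test may be sketched as follows, using the threshold values of the present embodiment (width below 20 pixels, speckleCount below 5); the names are illustrative:

```python
def is_ditch(speckle_count, avg_width_px,
             ditch_width_px=20, lower_speckle=5):
    """Return True if the region meets the ditch criteria: narrow along
    the direction of travel and with (almost) no reflection pixels."""
    return avg_width_px < ditch_width_px and speckle_count < lower_speckle
```

In embodiments requiring a speckleCount of zero, the second condition would instead be `speckle_count == 0`.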

FIG. 9 is a diagram mapping the values of speckleCount against the average intensity of the R, G, B colour values in a region of an image, illustrating the requirements in terms of these parameters for an empty region ER of an image to be considered a dark region (DR), an over bright region (BR), or a region corresponding to a ditch or puddle (DP).

In the present embodiment, the LSP control system 12 is configured to respond to signals from the processing portion 19 indicative that a body of water or a ditch lies in the path of the vehicle ahead. In the present embodiment, the LSP control system 12 is configured to provide an audible alert to the driver in the form of a chime, and a visual alert in the form of a graphic, displayed on the LSP HMI 20, indicating the nature of the terrain feature ahead, i.e. whether a body of water or a ditch is present ahead. The LSP HMI 20 also provides an indication of the distance of the terrain feature from the vehicle 100 and updates this value in real time as the vehicle approaches the feature.

In the present embodiment the LSP control system 12 is configured to cause the speed of the vehicle 100 to reduce to a predetermined water crossing speed in the case that a body of water has been identified, or a predetermined ditch crossing speed in the case that a ditch has been identified. In the present embodiment the predetermined water crossing speed and the predetermined ditch crossing speed are each substantially 5 kph although other values may be useful in some embodiments. FIG. 10 is a flow diagram illustrating operation of the stereoscopic camera system 185C and processing portion 19 of the vehicle 100 of FIG. 1.

At step S101 the stereoscopic camera system 185C captures a stereoscopic image pair being spatially separate images captured by left and right cameras of the system 185C mounted adjacent one another.

In the present embodiment the images are output to the processing portion 19 for step S103, although in some embodiments step S103 is performed by the camera system 185C. The camera system 185C is configured repeatedly to capture stereoscopic image pairs and to output them to the processing portion 19. In the present embodiment image pairs are output to the processing portion 19 at a frame rate of 25 pairs per second. Other values may be useful, such as 10 pairs per second, 12 pairs per second or any other suitable value.

At step S103, in the present embodiment, the processing portion 19 calculates a disparity image based on the most recently received stereoscopic image pair.

At step S105 the processing portion 19 calculates a 3D point cloud based on the disparity image calculated at step S103. At step S107 the processing portion 19 calculates an MLS map based on the 3D point cloud. At step S109 the processing portion 19 detects obstacle shadow regions SH in the MLS map and ignores such regions for the purposes of detecting terrain corresponding to a body of water or a ditch. At step S111 the processing portion 19 identifies the nearest empty region ER of the MLS map corresponding to terrain that lies within the predicted path, being a substantially continuous region of cells of the MLS map having a number density of datapoints that is less than a predetermined empty region datapoint density threshold value. At step S113 the processing portion 19 determines the slope, in a direction parallel to the instant longitudinal axis of the vehicle, of the terrain immediately ahead of the nearest empty region ER of terrain.
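By way of illustration only, the identification of a substantially continuous empty region at step S111 may be sketched as follows, over a grid of per-cell datapoint counts; the use of 4-connectivity and a breadth-first search is an assumption made for the sketch, and the function name is illustrative:

```python
import numpy as np
from collections import deque

def find_empty_region(cell_counts, density_threshold):
    """Return the cells of the largest 4-connected region of an MLS-style
    grid whose per-cell datapoint count is below density_threshold -- a
    sketch of identifying a substantially continuous empty region ER."""
    grid = np.asarray(cell_counts)
    empty = grid < density_threshold
    seen = np.zeros_like(empty, dtype=bool)
    best = set()
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if empty[r, c] and not seen[r, c]:
                # Flood-fill the connected group of empty cells.
                region, queue = set(), deque([(r, c)])
                seen[r, c] = True
                while queue:
                    y, x = queue.popleft()
                    region.add((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and empty[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                if len(region) > len(best):
                    best = region
    return best
```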

At step S115 the processing portion determines whether the slope exceeds a predetermined crest gradient value and a width of the empty region exceeds a predetermined crest width value.

If the slope does not exceed the predetermined crest gradient value or the width of the empty region does not exceed the predetermined crest width value, the processing portion 19 continues at step S117.

If the slope does exceed the predetermined crest gradient value and the width of the empty region exceeds the predetermined crest width value, the processing portion 19 outputs a signal indicating that a crest has been detected ahead of the vehicle 100. The processing portion 19 also outputs a signal indicative of the distance of the crest from the vehicle 100. The processing portion 19 then continues at step S101.
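By way of illustration only, the crest test of steps S115 to S117 may be sketched as follows; the function and parameter names are illustrative, and the threshold values are left as parameters since the text does not state them:

```python
def is_crest(slope, empty_region_width, crest_gradient_threshold,
             crest_width_threshold):
    """Return True if a crest is detected: both the slope of the terrain
    immediately ahead of the empty region and the width of the empty
    region must exceed their respective thresholds."""
    return (slope > crest_gradient_threshold
            and empty_region_width > crest_width_threshold)
```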

At step S117 the processing portion checks the RGB colour values of each of the pixels present in the region of one of the stereoscopic image pairs (in the present embodiment the left image) captured at step S101 that corresponds to the empty region of the MLS map under consideration. The processing portion 19 calculates an average value of the R, G and B colour values of the pixels in that region, which was referred to as a cropped polygonal region in the discussion above. At step S119 the processing portion attempts to determine whether the pixels identified as corresponding to the empty region in the MLS map correspond to reflections of objects (near to the region of terrain corresponding to the empty region of the MLS map, such as a bush or tree adjacent the region, or far from it, such as sky). The processing portion 19 achieves this by defining an extended polygonal region in the manner described above, and identifying the corresponding region in a disparity image generated based on the stereoscopic image pair of which the left image is currently being analysed. The processing portion 19 identifies pixels of the left image corresponding to the empty region which correspond to a pixel of the disparity image having a disparity corresponding to a distance of more than a predetermined disparity image distance from the camera system 185C, in the present embodiment a distance of 40m. Other values may be useful in some embodiments. Such pixels are considered to correspond to specular reflections from far objects.

The processing portion 19 identifies pixels corresponding to specular reflections from near objects by considering the disparity values in columns of pixels of the left image corresponding to the empty region. The processing portion 19 compares the disparity values of pixels in each column within the empty region with the values of disparity of the 'end' pixels of each column, i.e. the pixel at the top and bottom of each column, at the boundary of the empty region with respect to that column. The processing portion then counts the number of pixels within the column that have disparity values that are less than the values of disparity of the end pixels. These pixels are considered to correspond to specular reflections from near objects.

The processing portion 19 then sums the number of pixels corresponding to specular reflections from near and far objects to generate the speckleCount value.

At step S121, the processing portion 19 then determines whether the empty region corresponds to a dark region or an over bright region by comparing the average RGB colour value with the upper and lower colour threshold values, and the value of speckleCount with the upper and lower speckleCount threshold values. As described above, if the average colour value is less than the lower colour value threshold and the value of speckleCount is below the lower speckleCount threshold value the polygonal region is classified as a 'dark' region. If the average colour value is greater than the upper colour value threshold and the value of speckleCount is below the lower speckleCount threshold value the polygonal region is classified as an 'over bright' region. If the empty region is considered to correspond to a dark or over bright region the region is considered not to correspond to a ditch or body of water and the processing portion 19 continues at step S101. At step S123 the processing portion 19 determines that the empty region corresponds to a body of water if (a) the value of speckleCount exceeds the upper speckleCount threshold value and (b) the average width of the extended cropped polygonal region is greater than the width threshold value and the height difference (determined from the MLS map) between cells C at opposite ends of a column of cells spanning the empty region is less than the height difference threshold value. In the present embodiment the processing portion 19 considers each column of cells across the width of the empty region. In some embodiments a reduced number of columns is considered, for example columns at locations a distance of (say) 25%, 50% and 75% of the lateral width of the empty region from one lateral edge of the empty region. Other arrangements may be useful in some embodiments.

At step S123, if the processing portion 19 has determined that the empty region does correspond to a body of water, the processing portion 19 outputs a signal indicative of the presence of a body of water ahead of the vehicle 100 and a signal indicative of the distance of the empty region from the vehicle 100. The processing portion 19 then continues at step S101.

If at step S123 the processing portion 19 has determined that the empty region does not correspond to a body of water, the processing portion 19 continues at step S127. At step S127 the processing portion outputs a signal indicative that a ditch is present ahead of the vehicle, and a signal indicative of the distance of the empty region from the vehicle 100. The processing portion 19 then continues at step S101.

Some embodiments of the present invention enable vehicle operation with enhanced composure when traversing terrain. This is at least in part due to a reduction in driver workload when operating with the LSP control system 12 active. This is because a driver is not required manually to reduce vehicle speed when approaching a terrain feature in the form of a body of water or ditch. Rather, the LSP control system 12 automatically causes a reduction in speed in response to the detection of the terrain feature.

It will be understood that the embodiments described above are given by way of example only and are not intended to limit the invention, the scope of which is defined in the appended claims. Throughout the description and claims of this specification, the words "comprise" and "contain" and variations of the words, for example "comprising" and "comprises", mean "including but not limited to", and are not intended to (and do not) exclude other moieties, additives, components, integers or steps.

Throughout the description and claims of this specification, the singular encompasses the plural unless the context otherwise requires. In particular, where the indefinite article is used, the specification is to be understood as contemplating plurality as well as singularity, unless the context requires otherwise.

Features, integers, characteristics, compounds, chemical moieties or groups described in conjunction with a particular aspect, embodiment or example of the invention are to be understood to be applicable to any other aspect, embodiment or example described herein unless incompatible therewith.