

Title:
SYSTEM AND METHOD FOR CONTROLLING A VEHICLE
Document Type and Number:
WIPO Patent Application WO/2014/181146
Kind Code:
A1
Abstract:
The control system and method to control one or more vehicle functions of a vehicle (1) by recognition of different postures or movements of a user. The vehicle control system (13) comprises a motion recognition unit (15) positioned and oriented so that it can capture images of postures or movements assumed or performed by the user (19) in a surrounding area (18, 18a, 18b) of the vehicle (1); and an electronic control unit (16) that is in communication with the motion recognition unit and that is configured to recognize user postures or movements by comparison with predetermined postures or movements stored in a memory (22) of the vehicle control system. The electronic control unit (16) is configured to generate, depending on the result of the comparison, a control signal in order to control one or more vehicle functions in response to the user postures or movements.

Inventors:
RIBERO RAPHAEL (FR)
Application Number:
PCT/IB2013/001225
Publication Date:
November 13, 2014
Filing Date:
May 06, 2013
Assignee:
RENAULT TRUCKS (FR)
International Classes:
G06F3/01; B62D51/00; G06F3/03
Foreign References:
US20020152010A12002-10-17
US20100235034A12010-09-16
US20080085048A12008-04-10
US20090222149A12009-09-03
US20070177011A12007-08-02
Other References:
None
Attorney, Agent or Firm:
FAUCHEUX, Jérôme (Volvo Corporate IP - TER E70 21299 Route de Lyon, Saint Priest, FR)
Claims:
CLAIMS

1. A vehicle control system to control one or more vehicle functions of a vehicle (1) by recognition of different postures or movements of a user, the vehicle control system (13) comprising :

• a motion recognition unit (15) positioned and oriented so that it can capture images of postures or movements assumed or performed by the user (19) in a surrounding area (18, 18a, 18b) of the vehicle (1) ;

• an electronic control unit (16) in communication with the motion recognition unit and configured to recognize user postures or movements by comparison with predetermined postures or movements stored in a memory (22) of the vehicle control system ;

wherein the electronic control unit (16) is configured to generate, depending on the result of the comparison, a control signal in order to control one or more vehicle functions in response to the user postures or movements.

2. The vehicle control system according to claim 1, characterized in that the motion recognition unit (15) is positioned on an outside surface of the vehicle (1) and is in wireless communication or in wire connection with the electronic control unit (16).

3. The vehicle control system according to claim 1 or 2, characterized in that the vehicle control system comprises an activation interface (30, 31, 32) that includes one or a combination of several elements among the following :

• a switch located on the vehicle dashboard,

• a switch on the vehicle key fob,

• a menu on a touch screen on the dashboard,

• a switch on a remote interface which communicates with the electronic control unit (16) using a wireless technology.

4. The vehicle control system according to any one of the preceding claims, characterized in that the motion recognition unit (15) comprises at least one camera that is preferably a 3D camera.

5. The vehicle control system according to any one of the preceding claims, characterized in that the motion recognition unit is located in a side rear view mirror unit (10a, 10b) of the vehicle (1).

6. The vehicle control system according to the preceding claim, characterized in that said side rear view mirror unit (10a, 10b) is fixed to the vehicle via at least one articulation (20a, 20b) and in that the orientation of the motion recognition unit (15) can be adjusted by modifying the orientation of said rear view mirror unit via said articulation (20a, 20b).

7. The vehicle control system according to the preceding claim, characterized in that said articulation (20a, 20b) is motorized.

8. The vehicle control system according to one of claims 1 to 4, characterized in that the vehicle function controlled by the system includes at least one of :

• the suspensions of the rear and/or front wheel axles,

• the lighting function,

• the tail gate function,

• the crane function,

• the control of the vehicle doors,

• the frontward or rearward motions of the vehicle.

9. A controlling method for controlling at least one function of a vehicle (1) that is equipped with a vehicle control system (13) that comprises at least one motion recognition unit (15) and an electronic control unit (16) that is in communication with the motion recognition unit (15), characterized in that the method comprises the steps of :

• providing (80) an operational zone (18, 18a, 18b) in the surrounding area of the vehicle (1) where a user can assume different postures or perform different movements ;

• capturing (110), with the motion recognition unit (15), at least one image of said user posture or user movement assumed or performed in the operational zone (18, 18a, 18b) ;

• converting (111) said image into a digital signal ;

• comparing (120), with the electronic control unit (16), the digital signal with several sets of comparison data, wherein each set of comparison data represents a preregistered user posture or user movement or represents a preregistered series of user postures or user movements corresponding to at least one operation of a vehicle function ;

• identifying (130), with the electronic control unit (16), which set of comparison data matches the digital signal ;

• executing (170) the corresponding operation that corresponds to the identified set of comparison data.

10. The controlling method according to claim 9, characterized in that, after the step of identifying which set of comparison data matches the digital signal and before the step of executing the corresponding operation, the method comprises the further steps of :

• generating (150) a control signal which contains information for the execution of the corresponding operation ;

• transmitting (160) the control signal to the electronic control unit (25, 26, 28, 29) of the vehicle (1) that is in charge of managing the vehicle function concerned by the execution of the corresponding operation ;

• controlling (170) said vehicle function according to said control signal in order to execute the corresponding operation.

11. The controlling method according to claim 9 or 10, characterized in that the operational zone (18, 18a, 18b) is provided in a location where the user is able to visually check the execution of the corresponding operation.

12. The controlling method according to any one of claims 9 to 11, characterized in that the vehicle control system (13) is configured to control several functions of the vehicle and in that the method comprises a further step of receiving (104) the user selection of the vehicle function that the user intends to control via the vehicle control system.

13. The controlling method according to claim 12, characterized in that the selection of the function is carried out by the user via a remote interface (31).

14. The controlling method according to claim 12, characterized in that the selection of the function is carried out by assuming an initial user posture or by performing an initial user movement that is detected and recognized by the vehicle control system.

15. The controlling method according to any one of the claims 9 to 14, characterized in that the method comprises the further step of informing the user (140), by generating an alarm or a visual or oral message, that his/her said posture or movement doesn't correspond to an available operation of the vehicle control system if no set of comparison data matches the digital signal.

16. The controlling method according to any one of claims 9 to 15, characterized in that the method comprises a step (100) of user identification wherein the user is identified by an identification code that is recognized by the control system in order to authorize the user to control at least one function of the vehicle.

17. The controlling method according to the preceding claim, characterized in that the identification code is generated by a CID (40) located in the surrounding area of the vehicle or by a CID carried by the user when he or she is located in the surrounding area of the vehicle (1).

18. The controlling method according to any one of claims 9 to 17, characterized in that the method comprises, before the step of executing the corresponding operation (170), further steps (141 , 142) of requesting and receiving a user confirmation in order to confirm that the corresponding operation identified by the vehicle control system is the correct one.

19. The controlling method according to claim 18, characterized in that, following a requesting message of the vehicle control system, the user confirmation is given by a new user posture or a new user movement that the vehicle control system recognizes.

20. The controlling method according to any one of claims 9 to 19, characterized in that the method comprises a further step (145) of checking safety conditions wherein, before and/or during the step of executing the corresponding operation (170), it is checked that the execution of the corresponding operation will not interfere with another operation that is currently executed or will not interfere with an outside obstacle detected by the vehicle control system.

21. The controlling method according to the preceding claim, characterized in that the method comprises a further step of preventing or stopping the execution of the corresponding operation if it can interfere with another operation or with an outside obstacle.

22. The controlling method according to any one of the claims 9 to 21 , characterized in that the method further comprises the initial step (79) of activating the controlling system in order to activate at least the motion recognition unit.

23. The controlling method according to the preceding claim, characterized in that the step of activating the vehicle control system further comprises the activation of the parking brake and in that the controlling method further comprises the step of releasing the parking brake if the execution of the corresponding operation shall result in a forward or backward motion of the vehicle.

24. The controlling method according to the preceding claim, characterized in that the step of releasing the parking brake is conditional upon a user confirmation according to claim 18 or 19.

25. The controlling method according to any one of claims 9 to 24, characterized in that the method further comprises, before the step (110) of capturing at least one image of at least one user posture or one user movement, the step (108) of checking if the user (19) is located in the operational zone (18, 18a, 18b).

26. The controlling method according to the preceding claim and claim 22, characterized in that, if the recognition unit doesn't detect (108) the presence of the user (19) in the operational zone (18, 18a, 18b) after a first determined time-out following the activation of the recognition unit, the user is informed by an alarm or a visual or oral message (33) that the vehicle control unit can't detect his or her presence in the operational zone (18, 18a, 18b).

27. The controlling method according to claims 22 and 26, characterized in that, if the recognition unit doesn't detect the presence of the user (19) in the operational zone (18, 18a, 18b) after a second determined time-out following the activation of the recognition unit, the system is deactivated.

28. The controlling method according to any one of claims 9 to 27, characterized in that the method comprises the further steps of :

• receiving (104) the selection by the user of a learning function ;

• receiving (201) the selection by the user, among a predetermined list of operations that can be executed by at least one vehicle function, of the operation that will be learnt by the vehicle control system ;

• capturing (202), with the motion recognition unit (15), at least one reference image of a user posture or of a user movement assumed or performed by the user in the operational zone (18, 18a, 18b) ;

• converting (203) said reference image(s) into a set of reference digital data ;

• storing (204) the set of reference digital data in a memory (22) of the vehicle control system or in a memory of the vehicle ;

• assigning (205) the set of reference digital data to the selected operation.

29. The controlling method according to any one of claims 9 to 27, characterized in that the method comprises the further steps of :

• receiving (104) the selection by the user of a learning function ;

• receiving (301) the selection by the user, among a predetermined list of operations that can be executed by at least one vehicle function, of the operation that will be learnt by the vehicle control system ;

• capturing (302), with the motion recognition unit (15), at least one first reference image of a first user posture or user movement assumed or performed by the user in the operational zone (18, 18a, 18b) ;

• converting (303) said first reference image(s) into a first set of reference digital data ;

• storing (304) the first set of reference digital data in a memory (26) of the vehicle control system or in a memory (22, 23) of the vehicle ;

• capturing (305) at least one second reference image of a second user posture or user movement assumed or performed by the user in the operational zone (18, 18a, 18b) ;

• converting (306) said second reference image(s) into a second set of reference digital data ;

• comparing (307) the second set of reference digital data with the first set of reference digital data ;

• if the second set of reference digital data matches the first one, assigning (308) the first or the second set of reference data to the selected operation.

30. The controlling method according to claim 28 or 29, wherein, in the step of identifying which operation the user selects to be learnt by the controlling system, the user selection is performed via a remote interface (31) that is in wireless communication with the control system.

Description:
System and method for controlling a vehicle

The present invention relates to a system and a method for controlling vehicle functions by body postures and/or body movements.

Background of the invention

For controlling vehicles, human machine interfaces rely on devices such as the steering wheel, joystick, switches, etc.

Through the interface, the user interacts with the machine and the machine responds accordingly with a machine output.

In the conventional human machine interfaces, the user is thus required to have a physical interaction with the machine.

In some cases, and especially in the case of industrial vehicles such as trucks, the physical interaction with the vehicle can be a problem.

An example illustrating this problem is the operation of light checking. It is critical to ensure that the vehicle lighting equipment is fully operational. On a large vehicle, this checking operation requires in practice that an operator is positioned in the vehicle cabin and actuates the various vehicle lights while a second operator stands at the front end and at the rear end of the vehicle to visually record that a particular light is or is not in working condition. This simple maintenance operation thus requires two operators and, for that reason alone, may not be performed at the desired frequency.

A further example of unsatisfactory vehicle interactions can be found for example in the field of construction vehicles. In the construction field, a number of operations have to be performed such as dump tilting, crane elevation etc. Due to the general size of the vehicle, it can prove to be difficult to operate the vehicle through a user interface physically connected to the vehicle while simultaneously taking into account the position of the vehicle or the position of some mobile vehicle equipment, such as a crane or a tipper, in the vehicle environment.

In the same way, adjusting the height of a vehicle cargo floor to the height of a loading platform can prove to be difficult without having a direct view of the rear part of the vehicle cargo that has to be adjusted to the loading platform. In view of the above, it appears that there is room for improvement in the field of vehicle interaction, more specifically in the field of human interaction with an industrial vehicle.

Description of the invention

In this technical context, it is an object of the present invention to provide a system and a method facilitating the control of a vehicle.

In a first aspect, the invention concerns a vehicle control system to control one or more vehicle functions of a vehicle by recognition of different postures or movements of a user. The vehicle control system comprises :

• a motion recognition unit positioned and oriented so that it can capture images of postures or movements assumed or performed by the user in a surrounding area of the vehicle ;

• an electronic control unit in communication with the motion recognition unit and configured to recognize user postures or movements by comparison with predetermined postures or movements stored in a memory of the vehicle control system.

The electronic control unit of the vehicle control system is configured to generate, depending on the result of the comparison, a control signal in order to control one or more vehicle functions in response to the user postures or movements.

The invention makes it possible to control at least one vehicle function by recognition of user posture(s) or user movement(s). This proves to be a great improvement over traditional control interfaces, which can be a push button or a rotary knob fixed in the vehicle and which oblige the user to position himself/herself so as to reach the control interface. In contrast, with the invention the user has a degree of freedom - within the field of detection of the motion recognition unit - to control the vehicle from the outside.

In a preferred embodiment of the invention, the motion recognition unit is positioned on an outside surface of the vehicle and is in wireless communication or in wire connection with the electronic control unit.

In a further embodiment of the invention, the vehicle control system comprises an activation interface (30, 31, 32) that includes one or a combination of several elements among the following :

• a switch located on the vehicle dashboard,

• a switch on the vehicle key fob,

• a menu on a touch screen on the dashboard,

• a switch on a remote interface which communicates with the electronic control unit using a wireless technology.

It is further envisaged that the motion recognition unit comprises at least one camera that is preferably a three-dimensional camera ("3D camera").

It can be envisaged that the motion recognition unit is located in a side rear view mirror unit of the vehicle.

Preferably, said side rear view mirror unit is fixed to the vehicle via at least one articulation and the orientation of the motion recognition unit can be adjusted by modifying the orientation of said rear view mirror unit via said articulation.

In another preferred aspect of the invention said articulation is motorized.

Advantageously, the vehicle function controlled by the system includes at least one vehicle function amongst the following :

• the suspensions of the rear and/or front wheel axles,

• the lighting function,

• the tail gate function,

• the crane function,

• the control of the vehicle doors,

• the frontward or rearward motions of the vehicle.

In a second aspect, the invention concerns a controlling method for controlling one or more functions of a vehicle that is equipped with a vehicle control system that comprises at least one motion recognition unit and an electronic control unit that is in communication with the motion recognition unit, wherein the method comprises the steps of :

• providing an operational zone in the surrounding area of the vehicle where a user can assume different postures or perform different movements ;

• capturing, with the motion recognition unit, at least one image of said user posture or user movement assumed or performed in the operational zone ;

• converting said image into a digital signal ;

• comparing, with the electronic control unit, the digital signal with several sets of comparison data, wherein each set of comparison data represents a preregistered user posture or user movement or represents a preregistered series of user postures or user movements corresponding to at least one operation that can be executed by a vehicle function ;

• identifying, with the electronic control unit, which set of comparison data matches the digital signal ;

• executing the corresponding operation that corresponds to the identified set of comparison data.

In the present application, the terms "the corresponding operation" and "the identified operation" refer to the at least one operation that corresponds to the preregistered user posture or user movement or to the preregistered series of user postures or user movements represented by the set of comparison data identified by the electronic control unit as matching with the digital signal.
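By way of illustration only, the following minimal Python sketch outlines the capture / convert / compare / identify / execute sequence summarized above. The class and function names, the feature-vector representation of postures and the distance-based matching are assumptions introduced for this example; they are not part of the disclosed implementation.

```python
# Minimal, hypothetical sketch of the capture / convert / compare / identify /
# execute sequence. All names and the matching metric are illustrative
# assumptions, not the patented implementation.

from dataclasses import dataclass
from typing import Callable, List, Optional


@dataclass
class ComparisonSet:
    """A preregistered posture/movement (as a feature vector) and its operation."""
    name: str
    features: List[float]          # stands in for the stored set of digital data
    operation: Callable[[], None]  # the vehicle operation assigned to it


def convert_to_digital_signal(image) -> List[float]:
    """Placeholder for the conversion of captured image(s) into a digital signal."""
    # In a real system this would extract posture features from the image.
    return list(image)


def identify(signal: List[float],
             comparison_sets: List[ComparisonSet],
             threshold: float = 1.0) -> Optional[ComparisonSet]:
    """Return the comparison set that matches the digital signal, if any."""
    def distance(a: List[float], b: List[float]) -> float:
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    best = min(comparison_sets, key=lambda c: distance(signal, c.features))
    return best if distance(signal, best.features) <= threshold else None


# Example usage with two toy postures.
if __name__ == "__main__":
    sets = [
        ComparisonSet("arms_up",  [1.0, 0.0], lambda: print("raise rear suspension")),
        ComparisonSet("arms_out", [0.0, 1.0], lambda: print("switch rear lights ON")),
    ]
    captured_image = [0.9, 0.1]                  # stand-in for a captured image
    signal = convert_to_digital_signal(captured_image)
    match = identify(signal, sets)
    if match is not None:
        match.operation()                        # execute the corresponding operation
    else:
        print("no matching posture: inform the user")
```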

Thanks to the inventive method, a user can control one or several functions of a vehicle from the outside by assuming different body postures and/or by performing different body movements. Another advantage is that the user can visually check the execution of the operation corresponding to his/her body posture(s) or body movement(s) without being obliged to get in and get out of the vehicle between each execution.

More precisely, it can be envisaged that, after the step of identifying which set of comparison data matches the digital signal and before the step of executing the corresponding operation, the method comprises the further steps of :

• generating a control signal which contains information for the execution of the corresponding operation ;

• transmitting the control signal to the electronic control unit of the vehicle that is in charge of managing the vehicle function concerned by the execution of the corresponding operation ;

• controlling said vehicle function according to said control signal in order to execute the corresponding operation.

According to an advantageous aspect of the present invention, the operational zone is provided in a location where the user is able to visually check the execution of the corresponding operation.

Thanks to that, the user can visually check the execution of the operation corresponding to his/her body posture(s) or body movement(s) without being obliged to move from the location where he/she assumes body postures or where he/she performs body movements.

According to yet another aspect of the present invention, the vehicle control system is configured to control several functions of the vehicle and the method comprises a further step of receiving the user selection of the vehicle function that the user intends to control via the vehicle control system.

It can be envisaged that the selection of the function is carried out by the user via a remote interface.

In a preferred aspect of the invention, the selection of the function is carried out by assuming an initial user posture or by performing an initial user movement that is detected and recognized by the vehicle control system.

It can be envisaged that the method comprises the further step of informing the user, by generating an alarm or a visual or oral message, that his/her said posture or movement doesn't correspond to an available operation of the vehicle control system if no set of comparison data matches the digital signal.

In a preferred implementation of the method, the method comprises a step of user identification wherein the user is identified by an identification code that is recognized by the control system in order to authorize the user to control at least one function of the vehicle.

Preferably, said identification code is generated by a CID located in the surrounding area of the vehicle or by a CID carried by the user when he or she is located in the surrounding area of the vehicle.

According to another implementation of the method, the method comprises, before the step of executing the corresponding operation, further steps of requesting and receiving a user confirmation in order to confirm that the corresponding operation is the correct one.

Preferably, following a requesting message of the vehicle control system, said user confirmation is given by a new user posture or a new user movement that the vehicle control system recognizes.

The method may comprise a further step of checking safety conditions wherein, before and/or during the step of executing the corresponding operation, it is checked that the execution of the corresponding operation will not interfere with another operation that is currently executed or will not interfere with an outside obstacle detected by the vehicle control system.

The method may also comprise a further step of preventing or stopping the execution of the corresponding operation if it can interfere with another operation or with an outside obstacle.

In another aspect of the invention, the method further comprises the initial step of activating the controlling system in order to activate at least the motion recognition unit.

Preferably, the step of activating the vehicle control system further comprises the activation of the parking brake, and the controlling method further comprises the step of releasing the parking brake if the execution of the corresponding operation shall result in a forward or backward motion of the vehicle.

Advantageously, the step of releasing the parking brake is conditional upon a user confirmation as previously described.

In a further aspect of the invention, the method comprises, before the step of capturing at least one image of at least one user posture or one user movement, the step of checking if the user is located in the operational zone.

Preferably, if the recognition unit doesn't detect the presence of the user in the operational zone after a first determined time-out following the activation of the recognition unit, the user is informed by an alarm or a visual or oral message that the vehicle control unit can't detect his or her presence in the operational zone.

The controlling method may also comprise a learning function in order to allow the user to determine which body postures or movements can be used to control vehicle function(s).

If the method includes a learning function, it may comprise the further steps of :

• receiving the selection by the user of a learning function ;

• receiving the selection by the user, among a predetermined list of operations that can be executed by at least one vehicle function, of the operation that will be learnt by the vehicle control system ;

• capturing, with the motion recognition unit, at least one reference image of a user posture or user movement assumed or performed by the user in the operational zone ;

• converting said reference image(s) into a set of reference digital data ;

• storing the set of reference digital data in a memory of the vehicle control system or in a memory of the vehicle ;

• assigning the set of reference digital data to the selected operation.

If the method includes a learning function, it may otherwise comprise the steps of :

• receiving the selection by the user of a learning function ;

• receiving the selection by the user, among a predetermined list of operations that can be executed by at least one vehicle function, of the operation that will be learnt by the vehicle control system ;

• capturing, with the motion recognition unit, at least one first reference image of a first user posture or user movement assumed or performed by the user in the operational zone ;

• converting said first reference image(s) into a first set of reference digital data ;

• storing the first set of reference digital data in a memory of the vehicle control system or in a memory of the vehicle ;

• capturing at least one second reference image of a second user posture or user movement assumed or performed by the user in the operational zone;

• converting said second reference image(s) into a second set of reference digital data ;

• comparing the second set of reference digital data with the first set of reference digital data ;

• if the second set of reference digital data matches the first one, assigning the first or the second set of reference data to the selected operation.

It can be envisaged that, in the step of identifying which operation the user selects in order to be learnt by the controlling system, the user selection is performed via a remote interface that is in wireless communication with the control system.
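By way of illustration only, a minimal Python sketch of the second learning variant (capture a first reference, capture a second reference, compare them, and assign the gesture to the selected operation only if they match) could look as follows. The function names, the similarity test and the storage structure are assumptions made for this example.

```python
# Hypothetical sketch of the two-capture learning variant: a reference gesture
# is only assigned to the selected operation if a second capture of the same
# gesture matches the first one.

from typing import Dict, List


def matches(first: List[float], second: List[float], tolerance: float = 0.5) -> bool:
    """Crude similarity test between two sets of reference digital data."""
    return sum((a - b) ** 2 for a, b in zip(first, second)) ** 0.5 <= tolerance


def learn_operation(selected_operation: str,
                    capture_reference,                 # callable returning reference data
                    memory: Dict[str, List[float]]) -> bool:
    """Store a gesture for `selected_operation` only if two captures agree."""
    first = capture_reference()    # first reference image(s), already converted
    second = capture_reference()   # second reference image(s), already converted
    if matches(first, second):
        memory[selected_operation] = first   # assign first (or second) set to the operation
        return True
    return False                             # captures disagree: learning failed


# Example usage with canned captures standing in for the motion recognition unit.
if __name__ == "__main__":
    captures = iter([[1.0, 0.2], [1.1, 0.1]])
    stored: Dict[str, List[float]] = {}
    ok = learn_operation("tailgate_lift_up", lambda: next(captures), stored)
    print("learned" if ok else "rejected", stored)
```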

Brief description of the drawings

The following detailed description of embodiments of the invention is better understood when read in conjunction with the appended drawings. However, the invention is not limited to the specific embodiments disclosed.

In the drawings:

Figure 1 is a schematic side view of a vehicle equipped with a system according to the invention;

Figure 2 is a schematic top view of the vehicle equipped with a system according to the invention having one motion recognition unit ;

Figure 3 is a schematic top view of the vehicle equipped with the system of figure 2, further equipped with a second motion recognition unit ;

Figures 4-8 are different flowcharts representing different implementations of the method according to the invention.

Figures 9 and 10 are two flowcharts representing two different learning processes of the method according to the invention.

Detailed description of the invention

In order to facilitate understanding, the same elements and steps in the figures will be denoted by the same reference numbers, and redundant descriptions thereof will be omitted.

The matters defined in the description, such as detailed construction and elements, are provided to assist in a comprehensive understanding of the exemplary embodiment. However, the exemplary embodiment can be practiced without those specifically defined matters. Also, well-known functions or constructions are not described in detail since they would obscure the application with unnecessary detail.

Fig. 1 illustrates a vehicle 1, more precisely a truck having a cabin 2 and a box shaped container 3. The cabin 2 and the container 3 are supported by a chassis frame 4. The vehicle 1 also comprises front and rear wheels 5 connected to the chassis frame 4 via, in particular, axles and a suspension system that can include adjustable pneumatic suspensions 6 in order to make possible a height adjustment H of the vehicle. On figure 1, the wheels 5 are represented according to two different positions of the suspension system. A first position corresponds to a normal driving position where the wheels 5, represented in continuous lines, are relatively close to the chassis frame. A second position corresponds to an operational position of the vehicle that can be, for instance, a position used to unload the container when the vehicle is against a loading platform. The user 19 can get into the cabin 2 or the container 3 via access doors 7. The vehicle also comprises a lighting system having front lights 8 and rear lights 9. The vehicle comprises rear view mirrors 10a and 10b disposed on each side of the cabin and which, in a normal position, are oriented rearward. The vehicle 1 can be equipped with a tailgate 11 arranged at the rear of the container 3 and which is able to rotate R and lift up or lift down L thanks to hydraulic or pneumatic actuators (not shown). It can also be equipped with a front air deflector 12 arranged on the roof of the cabin 2 that can be adjusted A between at least two positions : a folded position represented in continuous lines and a deployed position represented in dashed lines on figure 1.

Instead of a container, the vehicle can be equipped with a tipping trailer, with a crane, with a concrete mixer or with any other auxiliary equipment (not shown).

Instead of being a truck, the vehicle according to the invention can be a construction vehicle, a bus or any other road vehicle.

The vehicle 1 represented on figures 1 to 3 is equipped with a vehicle control system according to the invention. The vehicle control system is able to control one or more vehicle functions of the vehicle 1 by recognition of different postures or movements of a user. Each function of the vehicle 1 is controlled according to different operations that the function is able to execute. The vehicle control system comprises at least one motion recognition unit 15, which can be positioned on the outside surface of the vehicle 1, and at least one electronic control unit 16 (ECU, see figures 2 and 3) that can be a dedicated ECU, as will be described hereunder, or that can be embedded within a multi-function ECU performing other control functions. The vehicle control system can also comprise an activation interface 30, 31, 32 (see figure 3).

When the vehicle is a truck having a container 3, the motion recognition unit 15 can be, for instance, positioned on the top surface of the container 3 or the motion recognition unit 15' can be fixed to the cabin such as represented in dashed lines on figure 1.

The motion recognition unit 15 is able to capture images of body postures assumed by the user 19 or images of body movements performed by the user 19 in the surrounding area of the vehicle 1. The motion recognition unit 15, or a converting unit of the system (not shown) that can be arranged between the motion recognition unit 15 and the ECU 16, converts the images into a digital signal that is transmitted to the ECU 16. The electronic control unit 16 is in communication with the motion recognition unit 15 and is configured to process data received from the motion recognition unit 15 or from a converting unit. To be more precise, the ECU 16 is configured to recognize user postures or movements by comparison with predetermined body postures or body movements stored in a memory 22 of the vehicle control system or in a memory of the vehicle and which are assigned to at least one operation that can be executed by a function of the vehicle 1. When the ECU 16 recognizes user posture(s) or movement(s), it is also configured to generate and transmit a control signal in order to control a function of the vehicle by executing the operation that corresponds to the user posture(s) or movement(s).

The vehicle functions that can be controlled by the vehicle control system are for instance :

• the suspension system of the vehicle, especially the rear suspensions 6 of the vehicle,

• the control of the access doors 7,

• the lighting system including, for instance, front lights 8 and rear lights 9,

• the tailgate 11,

• front air deflector 12,

• low speed motion of the vehicle in the frontward or in the rearward direction.

Each function is designed to execute some operations. For instance :

• the function corresponding to the suspension system allows adjustment of the height of the vehicle and thus the execution of a raising operation or a lowering operation of the vehicle,

• the access door function allows the execution of a locking operation or the execution of an unlocking operation of at least one door of the vehicle,

• the tailgate function is able to execute a deployment operation or a folding operation R of the tailgate and can also execute a lift up operation or a lift down operation L of the tailgate,

• the front air deflector function can be used to adjust A the position of the deflector between at least two positions : a folded position and a deployed position,

• the lighting system function is designed to execute different lighting operations,

• the function corresponding to the low speed motions of the vehicle that is able to execute different operations of low speed motions of the vehicle in the frontward or in the rearward direction.

The lighting operations that can be implemented, considered on their own or in combination, include the following :

• all lights are switched OFF,

• daytime running lights ON,

• position lights ON,

• low beams ON,

• high beams ON,

• rear lights ON,

• front flashing lights ON,

• rear flashing lights ON,

• front fog lights ON,

• rear fog lights ON.

It is understood that the vehicle functions that can be controlled by the ECU 16 are not limited to the above-described functions. If the vehicle is equipped with a crane or with a concrete mixer, the corresponding function and operations can be controlled by the vehicle control system.

The motion recognition unit 15, 15' can typically comprise one or more cameras which can be fixed on a side or top surface of the cabin 2 or on a side or top surface of the box shaped container 3 or trailer. The camera(s) can be a 2D camera. However, the camera is preferably a 3D camera capable of generating data including depth information of a user posture or movements and thus of determining the distance and the variation of distance between the 3D camera and the position of the user or of the user's members such as the user's hands. When the motion recognition unit 15 is fixed on a surface of the vehicle, it is preferably folded in a rest position when the vehicle is driven to prevent aerodynamic drag and thereby limit fuel consumption. In an advantageous arrangement, the motion recognition unit 15 is fixed on a side rear view mirror unit 10a, 10b of the vehicle. The motion recognition unit 15 can be fitted in the side rear view mirror unit 10a, 10b without protruding to the outside of the side rear view mirror unit 10a, 10b. In a particular embodiment of the invention, both side rear view mirrors of the vehicle are equipped with a motion recognition unit 15. The motion recognition unit is able to capture images of said user posture or user movement that falls in a determined operational zone 18. The operational zone 18 may be determined by the field of detection of one or several cameras or can be a more limited area included in said field of detection. The field of detection can be defined by the orientation and the field of view of each camera. It is advantageous to provide the vehicle control system with more than one camera 15 in order to increase the size of the operational area 18 or in order to provide different operational areas 18a, 18b optimally located so that the user can more easily see and visually check the execution of the different operations in response to his / her body postures or body movements. For instance, thanks to a first operational area 18a located at the rear side of the vehicle, the user can visually check, in response to his / her body postures or body movements and without being forced to move from the location where he / she has assumed his / her body postures or performed his / her body movements, the execution of the lighting operations corresponding to rear lights ON, rear flashing lights ON and rear fog lights ON. And thanks to a second operational area 18b located at the front of the vehicle, the user can visually check the execution of the lighting operations corresponding to daytime running lights ON, position lights ON, low beams ON, high beams ON, front flashing lights ON and front fog lights ON.

The motion recognition unit 15 can be in wireless communication or in wire connection with the electronic control unit 16.

The motion recognition unit 15 is suitably connected to a power source such as the vehicle batteries (not shown). The motion recognition unit 15 can also be fitted with a motor (not shown) so that it can be rotated in order to adjust its orientation and to increase the operational zone that can be detected.

As previously described, the motion recognition unit 15 can be located in a side rear view mirror unit 10a, 10b of the vehicle. In this case, the side rear view mirror unit can be fixed to the vehicle 1 through at least one articulation 20a, 20b so that the orientation of the camera 15 can be adjusted by modifying the orientation of the rear view mirror unit. The articulation 20a, 20b can also be motorized so that the vehicle control system can automatically modify the orientation of the rear view mirror unit 10a, 10b. Thanks to that, the detection field of the motion recognition unit 15, and thus the operational zone 18, can easily be modified. In an alternative, the user can adjust the orientation of the rear view mirror unit 10a, 10b via a remote control interface.

The orientation of the motion recognition unit 15 or the orientation of the rear view mirror unit 10a, 10b supporting the camera can be automatically modified from a rest position to an active position as soon as the system is activated. For instance, as soon as the vehicle control system is activated, the right rear view mirror unit 10b can be rotated by an angle α from a position, such as depicted on figure 2, where it is oriented rearward, to a position, such as depicted on figure 3, where it is oriented frontward.

In a different way, the vehicle control system can modify the orientation of the motion recognition unit 15 or modify the orientation of the rear view mirror unit 10a, 10b that supports the motion recognition unit, depending on the vehicle function that the user has previously selected as the function he or she intends to control by his or her body posture(s) or body movement(s). For instance, if the user selects the vehicle function that corresponds to the suspension system 6, 26 in order to adjust the height of the vehicle, the motion recognition unit 15 can be oriented by the vehicle control system in order to delimit, such as represented on figure 2, an operational area 18a located on a side of the vehicle. If, on the contrary, the user selects the front air deflector vehicle function in order to adjust the position of the front air deflector 12, the vehicle control system modifies the orientation α of the motion recognition unit 15 such that the operational area 18b is preferably delimited, such as represented on figure 3, in the front of the vehicle.

Of course, the vehicle control system can modify the orientation of one or of several motion recognition units 15 depending on the vehicle function selected by the user.
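As a purely illustrative sketch of this behaviour, the following Python fragment maps a selected vehicle function to a target orientation of a motorized mirror articulation. The angle values and interface names are assumptions and do not correspond to any disclosed calibration.

```python
# Hypothetical sketch of orienting a motorized mirror-mounted camera according
# to the vehicle function selected by the user. Angles and names are
# illustrative assumptions only.

from typing import Dict


class MirrorArticulation:
    """Stand-in for the motorized articulation 20a/20b driving the mirror unit."""
    def __init__(self) -> None:
        self.angle_deg = 0.0   # 0 deg = normal rearward-facing rest position

    def rotate_to(self, angle_deg: float) -> None:
        print(f"rotating mirror unit to {angle_deg:.0f} deg")
        self.angle_deg = angle_deg


# Assumed mapping: each selectable function gets an orientation that places the
# operational zone where the user can watch the operation being executed.
ORIENTATION_FOR_FUNCTION: Dict[str, float] = {
    "suspension_height": 30.0,    # operational zone 18a on the side of the vehicle
    "front_air_deflector": 150.0, # operational zone 18b in front of the vehicle
    "lighting_rear": 10.0,
}


def orient_for(selected_function: str, articulation: MirrorArticulation) -> None:
    angle = ORIENTATION_FOR_FUNCTION.get(selected_function, 0.0)
    articulation.rotate_to(angle)


if __name__ == "__main__":
    mirror = MirrorArticulation()
    orient_for("front_air_deflector", mirror)   # user selected the air deflector function
```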

The ECU 16 can include a microprocessor 21, a non-volatile memory 22 such as a Read Only Memory (ROM), and a volatile memory such as a Random Access Memory (RAM) 23. The ECU 16 can also include other conventional components such as an input interface circuit, an output interface circuit, and the like. The ECU 16 is programmed to analyze images of a user captured by the motion recognition unit 15. More precisely, the ECU 16 receives, in the form of a digital signal, images from the motion recognition unit 15 or from a converting unit, and compares the images with a series of images stored in the non-volatile memory in the form of a set of digital data. The ECU 16 is thus operatively connected to the motion recognition unit 15.

Preferably, each vehicle function is related to a specific ECU (Electronic Control Unit) 25, 26, 28, 29 of the vehicle that is in charge of managing the vehicle function. Alternatively, one common ECU 25, 26, 28, 29 can be in charge of managing several vehicle functions.

The specific ECUs 25, 26, 28, 29 are preferably connected to the main ECU 16 via a CAN bus 24 commonly used in the field of truck technology.

When the control of a vehicle function results in a motion of the vehicle or a motion of one piece of vehicle equipment, a specific ECU 25, 26, 28, 29 can manage the corresponding vehicle function via at least one actuator. Such an actuator can be the pneumatic suspensions 6, a hydraulic cylinder and so on.

As represented in figures 2 and 3, the different specific ECUs can be, for instance :

• a specific ECU 25 to manage the function corresponding to the low speed motions of the vehicle and that is able to control the engine speed, the gearbox and the clutch of the driveline (not shown) in order to execute a low speed motion of the vehicle in the frontward or in the rearward direction ;

• a specific ECU 26 to manage the function corresponding to the suspension system and that is able to control electro-pneumatic valves connected to at least the rear pneumatic suspensions of the vehicle in order to raise or lower the rear of the vehicle ;

• a specific ECU 28 to manage the lighting system function ;

• a specific ECU 29 to manage the access door function and that is able to control the locking device 14.
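The dispatching of a control signal from the main ECU 16 to the specific ECU in charge of the concerned function, over the CAN bus described above, can be pictured with the following minimal Python sketch; the routing table, message format and ECU identifiers are illustrative assumptions only.

```python
# Hypothetical sketch of the main ECU 16 dispatching a control signal to the
# specific ECU that manages the concerned vehicle function. The message format
# and ECU identifiers are assumptions for illustration.

from dataclasses import dataclass
from typing import Dict


@dataclass
class ControlSignal:
    function: str    # e.g. "suspension", "lighting", "doors", "low_speed_motion"
    operation: str   # e.g. "raise_rear", "rear_lights_on", "unlock"


class SpecificEcu:
    """Stand-in for ECUs 25, 26, 28, 29 reachable over the CAN bus 24."""
    def __init__(self, name: str) -> None:
        self.name = name

    def execute(self, signal: ControlSignal) -> None:
        print(f"{self.name}: executing '{signal.operation}'")


# Assumed routing table: which specific ECU manages which vehicle function.
ROUTING: Dict[str, SpecificEcu] = {
    "low_speed_motion": SpecificEcu("ECU 25 (driveline)"),
    "suspension":       SpecificEcu("ECU 26 (suspension)"),
    "lighting":         SpecificEcu("ECU 28 (lighting)"),
    "doors":            SpecificEcu("ECU 29 (door locks)"),
}


def transmit(signal: ControlSignal) -> None:
    """Forward the control signal to the ECU in charge of the function."""
    ecu = ROUTING[signal.function]
    ecu.execute(signal)          # the specific ECU then controls the function


if __name__ == "__main__":
    transmit(ControlSignal(function="suspension", operation="raise_rear"))
```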

The activation interface 30, 31, 32 makes it possible to set the vehicle in a state where the vehicle control system is activated. When the vehicle control system is activated, at least the motion recognition unit 15 is activated. The ECU 16 of the vehicle control system can be activated at the same time or later, for instance as soon as the user is detected by the motion recognition unit 15.

In normal operating conditions, the vehicle control system is preferably not activated. The activation interface 30, 31 , 32 can have several embodiments and can include one or a combination of several elements among the following :

- a switch 30 located on the vehicle dashboard,

- a menu on a touch screen of the vehicle dashboard,

- a button on a key fob or on the ignition key unit,

- an instruction selected and sent via a remote interface such as a smart phone 31 , which communicates with the vehicle control system using a wireless technology standard for exchanging data over short distances such as Bluetooth or Wi-Fi,

- a microphone unit 32 capable of detecting and converting an oral instruction of the user into an electric or electronic control signal that controls the activation of the vehicle control system.

It is also understood that the vehicle control system can include a user identification device 40, 41 such as an image identification system, a voice identification system or a customer identification device 40 (CID) capable of generating an identification code that is recognized by the vehicle control system to ensure that only an authorized user can use the control system.

When the identification device is for instance a CID 40, it can be part of the ignition key unit or key fob that the user carries with him / her.

When the identification device uses image recognition technology, it can use the motion recognition unit 15 or can use a specific image identification unit that is, for instance, capable of reading a bar code fixed to a user's clothing. For instance, the bar code can be printed or sewn to a user's cap.

Depending on their identification code, the vehicle control system can give different rights to different users in terms of the vehicle functions that can be controlled. For instance, one user can be authorized to control all the vehicle functions accessible through the vehicle control system whereas a second user can be authorized to control only the lighting system function.
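A minimal, purely illustrative Python sketch of such per-user rights is given below; the identification codes and the rights table are assumptions made for this example.

```python
# Hypothetical sketch of authorizing functions per identified user. The
# identification codes and rights table are illustrative only.

from typing import Dict, Set

# Assumed rights table: identification code -> vehicle functions the user may control.
AUTHORIZED_FUNCTIONS: Dict[str, Set[str]] = {
    "CID-0001": {"lighting", "suspension", "doors", "tailgate", "low_speed_motion"},
    "CID-0002": {"lighting"},   # this user may only control the lighting function
}


def is_authorized(identification_code: str, requested_function: str) -> bool:
    """Return True if the identified user may control the requested function."""
    return requested_function in AUTHORIZED_FUNCTIONS.get(identification_code, set())


if __name__ == "__main__":
    print(is_authorized("CID-0002", "lighting"))    # True
    print(is_authorized("CID-0002", "suspension"))  # False
    print(is_authorized("CID-9999", "lighting"))    # False: unknown code
```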

Referring to Figure 4, a first implementation of the method according to the invention is shown in the form of a flowchart.

In a first step 79 of the method according to this first embodiment, the vehicle control system is activated. In this step of activation, at least the motion recognition unit 15 is activated. The ECU 16 of the vehicle control system can be activated during this step 79 or, in a different way, before or after this step. For instance, the ECU 16 of the vehicle control system can be activated as soon as the ignition key is turned on, connecting the different systems and equipment of the vehicle to the service battery, or after the activation of the motion recognition unit 15, for instance when the user is detected by the motion recognition unit 15. In a preferred implementation of the method, the user can activate the vehicle control system by using one of the previously described activation means 30, 31, 32. However, the activation of the motion recognition unit 15 can also be performed automatically, for a determined period of time. For instance, the motion recognition unit 15 is automatically activated when the ignition key is turned on and as soon as the system detects that the driver has got out of the vehicle 1. A door sensor and/or a seat sensor can be used to detect that the driver gets out or has got out of the vehicle.

The step of activating the vehicle control system 79 can also comprise the activation of the parking brake of the vehicle (not shown). In this case the method can also comprise a step of releasing the parking brake (not shown) if the execution of the corresponding operation requires a forward or backward motion of the vehicle. Preferably, the step of releasing the parking brake is conditional upon a user confirmation wherein the user has to confirm that the operation identified by the control system, which the concerned function is about to execute, is the correct one. Even if it is not recommended because of energy consumption, the invention doesn't exclude that the vehicle control system remains activated, or in a sleeping mode in which the energy consumed by the system is reduced, when the ignition key is removed.
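The parking-brake interlock described above can be pictured with the following minimal Python sketch, in which the brake is applied on activation of the system and released only upon user confirmation when the identified operation moves the vehicle. All names are illustrative assumptions.

```python
# Hypothetical sketch of the parking-brake interlock: the brake is applied when
# the control system is activated and released only upon user confirmation when
# the identified operation moves the vehicle. All names are illustrative.

MOTION_OPERATIONS = {"move_forward_low_speed", "move_rearward_low_speed"}


class ParkingBrake:
    def __init__(self) -> None:
        self.applied = False

    def apply(self) -> None:
        self.applied = True
        print("parking brake applied")

    def release(self) -> None:
        self.applied = False
        print("parking brake released")


def activate_control_system(brake: ParkingBrake) -> None:
    brake.apply()   # activation of the system also applies the parking brake


def execute_operation(operation: str, brake: ParkingBrake, user_confirms) -> None:
    if operation in MOTION_OPERATIONS:
        if not user_confirms():          # release is conditional upon confirmation
            print("operation cancelled: no confirmation")
            return
        brake.release()
    print(f"executing '{operation}'")


if __name__ == "__main__":
    brake = ParkingBrake()
    activate_control_system(brake)
    execute_operation("move_forward_low_speed", brake, user_confirms=lambda: True)
```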

In a further step 80, the vehicle control system provides in the surrounding area of the vehicle 1 an operational zone 18, 18a, 18b where the user can assume different postures or perform different movements. In practice, the operational zone can correspond to the zone that the motion recognition unit 15 is able to detect depending on its inherent field of view and on its orientation. The operational zone 18, 18a, 18b can be provided as soon as the motion recognition unit 15 is activated. The operational zone is preferably provided in a location where the user is able to see the operations that are executed by the vehicle function in response to his/her body postures or movements. To this end, several operational zones 18a and 18b can be provided. For instance, the several operational zones 18a and 18b can be provided by using several motion recognition units 15 that can be arranged at different locations of the vehicle 2, 3, 10a, 10b. The position of the operational zone can also be modified. For instance, a camera 15 (see figure 3) used as a motion recognition unit can be designed so that its orientation can be adjusted depending on the user location in the surrounding area of the vehicle 1 and/or depending on the vehicle function that the user selects according to a step 104 hereinafter described in connection with the flowchart of figure 5.

In a next step 110, the motion recognition unit 15 captures at least one image of a user posture or at least one user movement performed in the operational zone 18, 18a, 18b. This step 110 is preferably preceded by a step of user detection 108 in order to detect the presence of the user in the operational zone 18, 18a, 18b. The step of user detection 108 can be performed by the motion recognition unit 15 itself or by a specific sensor (not shown).

According to the optional step 108, if the presence of the user can't be detected in the operational zone 18, 18a, 18b after a first elapsed time T1 following the activation of the vehicle control system, for instance 30 seconds, the user is informed, according to a step 109 of the method, by a visual or oral message or alarm that the control system can't detect him or her in the operational zone 18. After several attempts to detect the user in the operational zone 18, or after a second elapsed time T2 following the activation of the vehicle control system, for instance 90 seconds, the vehicle control system is automatically deactivated according to step 103.
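A minimal Python sketch of this two-stage time-out (warning after T1, deactivation after T2) is given below for illustration only; the polling structure and function names are assumptions, while the 30 second and 90 second values are the examples given above.

```python
# Hypothetical sketch of the detection time-outs: after T1 the user is warned
# (step 109), after T2 the system deactivates (step 103). The polling structure
# and function names are assumptions for illustration.

import time

T1_SECONDS = 30.0   # first time-out: warn the user (step 109)
T2_SECONDS = 90.0   # second time-out: deactivate the system (step 103)


def wait_for_user(user_detected, warn, deactivate, now=time.monotonic) -> bool:
    """Poll for the user in the operational zone; return True once detected."""
    start = now()
    warned = False
    while True:
        if user_detected():
            return True
        elapsed = now() - start
        if not warned and elapsed >= T1_SECONDS:
            warn("control system cannot detect you in the operational zone")
            warned = True
        if elapsed >= T2_SECONDS:
            deactivate()
            return False
        time.sleep(0.1)   # polling interval (illustrative)


if __name__ == "__main__":
    # Simulated clock so the example finishes immediately.
    fake_time = iter(x * 10.0 for x in range(100))
    wait_for_user(user_detected=lambda: False,
                  warn=print,
                  deactivate=lambda: print("system deactivated"),
                  now=lambda: next(fake_time))
```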

The detection of the user in the operational zone 18 can be used to trigger the capture of user postures or user movements by the motion recognition unit. Of course, different solutions can be used to trigger the capture of user postures or user movements. For instance, it can be a particular voice message emitted by the user and that is recognized by the vehicle control system. In this case, the vehicle control system can use a microphone unit 32 and features (not represented) able to process an audio signal. For instance, the user voice message can be a single "top" to trigger the capture of one image of a user posture or a "start top" to trigger the capture of several images of successive user postures or a user movement wherein the "start top" is followed by an "end top" to stop the capture.
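For illustration only, the following Python sketch interprets such recognized voice messages, capturing a single image on "top" and a sequence of images between "start top" and "end top". Speech recognition itself is outside the scope of this sketch and is stubbed out; all names are assumptions.

```python
# Hypothetical sketch of the voice-triggered capture: "top" captures a single
# image, "start top" ... "end top" brackets a sequence of captures. Speech
# recognition itself is stubbed out here.

from typing import Iterable, List


def capture_session(voice_messages: Iterable[str], capture_image) -> List[object]:
    """Return captured images according to the recognized voice commands."""
    images: List[object] = []
    capturing = False
    for message in voice_messages:
        if message == "top" and not capturing:
            images.append(capture_image())          # single posture capture
        elif message == "start top":
            capturing = True                         # begin capturing a movement
        elif message == "end top":
            capturing = False                        # stop capturing the movement
        elif capturing:
            images.append(capture_image())           # frames while the window is open
    return images


if __name__ == "__main__":
    # Simulated stream of recognized voice messages and dummy frames.
    frames = iter(range(100))
    msgs = ["start top", "...", "...", "end top", "top"]
    print(capture_session(msgs, capture_image=lambda: next(frames)))
    # -> two frames captured during the movement, then one single posture: [0, 1, 2]
```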

In a further step 111, the vehicle control system converts said image(s) into a digital signal that is sent to the ECU 16. The image can be converted by the motion recognition unit 15 itself or by a specific converting unit (not shown) connected between the motion recognition unit and the ECU 16.

In a step 120, the digital signal, representative of at least one image of the user posture(s) or user movement(s) captured by the motion recognition unit 15, is compared, by the ECU 16, with several sets of comparison data stored in the non-volatile memory. Each set of comparison data represents a preregistered user posture or user movement or a preregistered series of user postures or user movements to which at least one operation of a vehicle function is assigned.

The user postures or user movements that the system is able to recognize can be body postures or body movements, for instance a C, U or L shape performed with the arms, or a hand position, for instance a distinction can be made between a closed fist and an open hand. The system can be designed to recognize different finger configurations and can also be designed to recognize a combination of body, hand and/or finger postures or movements (in the present application they are called "user posture(s)" / "body posture(s)" or "user movement(s)" / "body movement(s)").

In a step 130, the ECU 16 identifies which set of comparison data matches the digital signal and so identifies the corresponding operation that corresponds to the identified set of comparison data. If, according to step 130, no set of comparison data matches the digital signal, the controlling method preferably returns 131 to the previous step 110, or to an earlier step, for instance to the step 108, to capture a new user posture or movement. In this case, the user is preferably informed 140 by the vehicle control system that his/her posture or movement doesn't correspond to an available operation of the control system. To inform the user, the vehicle control system can generate a visual or an oral alarm or message by using a colour code displayed on the outside of the vehicle or an outside speaker 33 (figures 2 and 3). After several unsuccessful attempts by the vehicle control system to recognize user postures or movements, for instance after three attempts, the vehicle control system can be automatically deactivated.
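A minimal Python sketch of this retry behaviour is given below for illustration; the function names are assumptions, while the limit of three unsuccessful attempts is the example given above.

```python
# Hypothetical sketch of the no-match handling (steps 130, 131, 140): on an
# unrecognized gesture the user is informed and capture is retried; after three
# unsuccessful attempts the system deactivates. Names are assumed.

MAX_ATTEMPTS = 3


def recognize_with_retries(capture_and_convert, identify, inform_user, deactivate):
    """Return the identified comparison set, or None after too many failures."""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        signal = capture_and_convert()               # steps 110 and 111
        match = identify(signal)                     # steps 120 and 130
        if match is not None:
            return match
        inform_user("posture or movement does not correspond to an available operation")
    deactivate()                                     # automatic deactivation
    return None


if __name__ == "__main__":
    signals = iter(["garbage", "garbage", "arms_up"])
    result = recognize_with_retries(
        capture_and_convert=lambda: next(signals),
        identify=lambda s: s if s == "arms_up" else None,
        inform_user=print,
        deactivate=lambda: print("system deactivated"),
    )
    print("identified:", result)
```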

If, according to step 130, the ECU 16 has identified a set of comparison data that matches with the digital signal, in a next step, the corresponding operation, which corresponds to the identified set of comparison data, is executed by the corresponding vehicle function.

To be more precise, in a step 150 the vehicle control system can generate a control signal which contains information for the execution of the corresponding operation.

In a next step 160, the vehicle control system can transmit the control signal to the specific ECU 25, 26, 28, 29 in charge of managing the vehicle function concerned by the execution of the corresponding operation. In other words, the control signal is transmitted to a specific ECU 25, 26, 28, 29 of the vehicle 1, which can be specifically configured to manage only the identified vehicle function or can be a general one configured to manage the identified vehicle function and some others.

In a step 170, the specific ECU controls the corresponding vehicle function according to said control signal. More precisely, the vehicle function, concerned by the execution of the corresponding operation, is controlled according to the control signal that contains information requesting the execution of the corresponding operation. The corresponding operation is then executed by the corresponding vehicle function.

In other words each operation that the system can control is assigned by the system to at least one user posture or user movement that has to be recognized by the ECU 16 in order to request its execution by the corresponding function. After having controlled the vehicle function in response to one or several user postures or user movements, in a step 180 the vehicle control system ideally checks if a new user posture or movement can be detected in the operational zone 18a, 18b.

If yes, the method returns to the previous step 110 in order to capture at least one image of the new user posture or movement and to process it according to step 120 and the following steps.

On the contrary, if the vehicle control system can't detect a new user posture or movement in the operational zone 18, 18a, 18b, or if the vehicle control system can't detect the presence of the user in the operational zone 18, 18a, 18b during a determined period of time, the vehicle control system is preferably deactivated 190. The vehicle control system is preferably deactivated, according to a step 190, after an elapsed time of 30 seconds without detecting a new user posture or a new user movement in the operational zone 18, 18a, 18b.
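A minimal sketch of the monitoring and deactivation behaviour of steps 180 and 190 is given below; the three callables are hypothetical hooks onto the rest of the control system, and only the 30-second delay comes from the description above.

import time

DEACTIVATION_DELAY_S = 30   # "after an elapsed time of 30 seconds", per the description

def monitoring_loop(detect_posture, handle_posture, deactivate):
    """Steps 180 and 190: keep processing new postures or movements and deactivate
    the system once the delay elapses without any detection."""
    last_detection = time.monotonic()
    while True:
        posture = detect_posture()        # returns None when nothing is detected
        if posture is not None:
            handle_posture(posture)       # back to steps 110/120 and following
            last_detection = time.monotonic()
        elif time.monotonic() - last_detection > DEACTIVATION_DELAY_S:
            deactivate()                  # step 190
            return
        time.sleep(0.1)                   # avoid busy-waiting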

The method such as depicted in figure 4 can be implemented to control one vehicle function or can be implemented to control several functions of the vehicle 1. In this latter case, the user has to assume at least one specific posture or has to perform at least one specific movement for each operation that the user can control. To be more precise, a different set of comparison data is stored in the non-volatile memory 22 of the ECU 16 for each operation of each function that the vehicle control system is capable of controlling.

Figure 5 depicts an improved implementation of the method according to the invention. Compared to the implementation of figure 4, in the improved implementation of figure 5 the step 104 is added and the optional steps 105, 106 and 107 can be added.

In the step 104 the vehicle control system receives the user's selection of the function that he or she intends to control via the vehicle control system.

The method implemented according to figure 5 is particularly suitable when the vehicle control system is configured to control several functions of the vehicle. According to this improved implementation, the user selects, before assuming a posture or performing a movement in order to control a function, which function he or she wants to control. The vehicle control system receives, according to step 104, which function the user has selected. As a result, the digital signal received by the ECU from the motion recognition unit can be compared to a limited number of sets of comparison data, namely those corresponding to the function that the user intends to control. Therefore, the risk of executing the wrong operation or controlling the wrong function is reduced. In addition, the number of postures or movements that the user has to remember can be limited, because the same posture or movement can be used for different vehicle functions.
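As a rough sketch, and assuming each comparison set carries a hypothetical function_id attribute, the restriction introduced by step 104 could be expressed as follows.

def comparison_sets_for_function(all_sets, selected_function_id):
    """Step 104 narrows the search: only the sets assigned to the selected
    function are compared with the captured digital signal."""
    return [cs for cs in all_sets if cs.function_id == selected_function_id]

The same posture can then be stored once per function, each time assigned to a different operation.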

Advantageously, the selection of the function can be performed by the user via a remote interface 31. This selection of the function can also be performed by an initial body posture or an initial body movement of the user that is detected and recognized by the vehicle control system, which then identifies the function actually selected by the user.

To further limit the risk of controlling a function that doesn't correspond to the user's choice or that the user has selected by mistake, it can be requested, in an optional step 105, that the user confirm the vehicle function that he/she has selected. In an intermediate step 106, the controlling method checks that the user confirms his/her selection; if not, the method returns 1061 to the step 104 where the user is once again requested to select the function that he or she intends to control.

Preferably, when the selection of the function is not confirmed or is not correctly confirmed following step 106, the controlling method informs the user, in step 107, that the confirmation is not valid or doesn't match his/her initial selection.
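A minimal sketch of the selection and confirmation loop of steps 104 to 107 could look as follows; the three callables are hypothetical interface hooks (remote interface 31, outside speaker 33, and so on).

def select_function(request_selection, request_confirmation, inform_mismatch):
    """Steps 104 to 107: loop until the user validly confirms the selected function."""
    while True:
        selected = request_selection()                  # step 104
        confirmation = request_confirmation(selected)   # step 105
        if confirmation == selected:                    # step 106
            return selected
        inform_mismatch(selected, confirmation)         # step 107, then return 1061 to step 104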

Even if it is not represented on figure 5, the user can at any time during the implementation of the method according to the invention modify the selected function. In particular, once an identified operation has been executed, the user has the opportunity to change the selected function for another.

Figure 6 depicts a further implementation of the method according to the invention. Compared to the implementations of figures 4 and 5, in the implementation of figure 6, user identification is performed in a new step 100. In this step, the user has to be recognized by the control system as being an authorized user, that is to say a user who is authorized to control at least one function of the vehicle. This step 100 is performed before the execution of any operation and preferably before the step 110 where the motion recognition unit captures at least one image of at least one user posture or at least one user movement. More specifically, the user is identified by an identification code that is recognized by the control system, which then authorizes him/her to control at least one function of the vehicle, depending on the vehicle functions that the system can manage and that the user is authorized to control. Indeed, the identification code can authorize the user to control only a part of the available functions, that is to say part of the functions that the vehicle control system is capable of controlling.

The user identification can be a passive identification. For instance, the identification code can be generated by a CID 40 (customer identification device) carried by the user when he or she is located in the surrounding area of the vehicle 1.

As previously explained, the CID can be an electronic chip that is part of the ignition key unit. In an alternative, the identification code can be a readable bar code fixed to a user's clothing, for instance printed on or sewn to a user's cap 45.

When an electronic CID 40 is used to generate the identification code, the electronic CID 40 is provided with a transmitter capable of transmitting the identification code via wireless communication. Ideally, the electronic CID is also equipped with a signal receiver in order to receive a challenge message transmitted via wireless communication by a signal transmitter 41 of the control system. If it has received and recognized the challenge message, the electronic CID 40 replies to it by transmitting to the control system the identification code, which in its turn has to be recognized by the control system.
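One possible challenge/response exchange between the signal transmitter 41 and the electronic CID 40 is sketched below; the use of a shared secret and a keyed hash is only an assumption for illustration, since the description does not mandate any particular authentication scheme.

import hashlib
import hmac
import os

def control_system_challenge():
    """Signal transmitter 41 side: broadcast a random challenge message."""
    return os.urandom(16)

def cid_reply(challenge, shared_secret, identification_code):
    """Electronic CID 40 side: answer a recognized challenge with the identification
    code, here authenticated with a keyed hash (hypothetical scheme)."""
    tag = hmac.new(shared_secret, challenge + identification_code, hashlib.sha256).digest()
    return identification_code, tag

def control_system_verify(challenge, reply, shared_secret, authorized_codes):
    """Control system side: recognize the identification code only if the tag is
    valid and the code belongs to an authorized user."""
    code, tag = reply
    expected = hmac.new(shared_secret, challenge + code, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected) and code in authorized_codes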

In another alternative, the user identification can be an active identification where the user has, for instance, to enter an alphanumeric code in a remote interface 31, such as a smart phone, which then transmits it via wireless communication to the control system.

If the control system can't recognize the identification code, or if it receives no identification code once the control system has been activated or the presence of the user in the operational zone 18, 18a, 18b has been detected, the controlling method can perform several attempts, via the return line 101, to identify the user. Performing several attempts to identify the user is particularly useful when the user identification depends on the location of the user or when the user identification is an active identification. When the user identification is an active one, the control system can inform the user, in step 102 and for instance via an outside speaker 33, that the identification code he or she has entered is incorrect. When the user identification uses an electronic CID and the control system can't receive an identification code from the electronic CID, the control system will also perform several attempts 101 and inform 102 the user, between each attempt, that the electronic CID is not in the vicinity of the vehicle or that the user carrying the electronic CID is not located in the surrounding area of the vehicle.

If after several attempts, for instance three attempts, or if after a determined elapsed time, for instance after 60 seconds, the control system doesn't succeed in identifying the user, the control system can be automatically deactivated in the step 103.
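The retry and deactivation logic of steps 100 to 103 could be sketched as follows; only the three attempts and the 60-second timeout come from the description, while the four callables are illustrative placeholders.

import time

MAX_ATTEMPTS = 3                  # "for instance three attempts"
IDENTIFICATION_TIMEOUT_S = 60     # "for instance after 60 seconds"

def identify_user(read_identification_code, is_authorized, inform_failure, deactivate):
    """Steps 100 to 103: try to identify the user, inform him/her between attempts
    (step 102), and deactivate the control system after too many failures or after
    the timeout (step 103)."""
    start = time.monotonic()
    for attempt in range(MAX_ATTEMPTS):
        code = read_identification_code()      # CID, bar code or actively entered code
        if code is not None and is_authorized(code):
            return code                        # identification succeeded
        inform_failure(attempt + 1)            # step 102, e.g. via the outside speaker 33
        if time.monotonic() - start > IDENTIFICATION_TIMEOUT_S:
            break
    deactivate()                               # step 103
    return None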

Thanks to the steps of user identification according to the invention, an unauthorized person is prevented from using the control system, and a user is prevented from controlling a vehicle function that he/she is not qualified or not authorized to control. Therefore the security of the controlling method and of the control system is greatly increased. This high level of security is especially recommended when the controlling method can be used to control accesses of the vehicle, motions of the vehicle or motions of some vehicle equipment.

Figure 7 depicts a particular implementation of the method according to the invention. Compared to the preceding implementations depicted on figures 4 to 6, the controlling method now requests the user, in a further step 141 and for instance via the outside speaker 33, to confirm, before it is executed, that the operation identified by the ECU 16, that is to say the corresponding operation that corresponds to the set of comparison data identified by the ECU 16 in step 130, is really the one that the user wants the vehicle to execute.

The user confirmation can be performed by a new user posture or a new user movement that the recognition unit detects. Alternatively, the user confirmation can be an oral confirmation that the system recognizes via the microphone unit 32, for instance "YES" to confirm the identified operation or "NO" to indicate that the identified operation is not the correct one.

If, according to step 142, the identified operation is confirmed by the user, the control system requests, according to step 150, the execution of the identified operation by generating, for instance, a control signal which contains information for the execution of the identified operation.

If, on the contrary and according to step 142, the selected operation is not confirmed by the user, the method returns 143 to the step 110 in order to capture at least one new image representative of at least one new user posture or at least one new user movement.

Ideally, the confirmation can be requested only before the execution of some specific operations. For instance, the user confirmation will be requested only when the execution of the operation can affect safety, for example when the execution of the operation will result in a motion of the vehicle or in a motion of a piece of vehicle equipment, such as the motion of a crane or a tailgate. On the contrary, if the function concerns the lighting system and if the selected operation consists of switching on the front lights of the vehicle, a user confirmation is not necessarily requested.
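Sketched in Python, and assuming a purely illustrative set of safety-critical operation identifiers, the conditional confirmation of steps 141 and 142 could look like this.

SAFETY_CRITICAL_OPERATIONS = {
    # Illustrative identifiers; the description does not define such a list.
    "move_vehicle_forward", "move_vehicle_backward",
    "move_crane", "open_tailgate", "close_tailgate",
}

def confirm_if_needed(operation_id, ask_confirmation):
    """Steps 141 and 142: request a confirmation (new posture/movement, or an oral
    "YES"/"NO" via the microphone unit 32) only when the operation can affect safety."""
    if operation_id not in SAFETY_CRITICAL_OPERATIONS:
        return True                        # e.g. switching on the front lights
    return ask_confirmation(operation_id)  # True -> step 150; False -> return 143 to step 110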

If the controlling method also comprises the step 105 of confirmation by the user of the selected vehicle function, such as previously described in connection with figure 5, the confirmation according to steps 141 and 142 offers a safety redundancy that can be useful especially when the execution of the operation consists of a vehicle motion.

In a different implementation of the method, such as depicted on figure 8, the method comprises the further step 145 of checking safety conditions wherein, before or during the step 170 of executing the operation identified by the control system, it is checked that the execution of the identified operation will not interfere with another operation currently being executed or with an outside obstacle detected by the control system. It can be the case especially when the execution of the identified operation can result in a vehicle motion, such as a forward or a backward motion of the vehicle, can result in an adjustment of the vehicle suspensions, or can result in a motion of a piece of vehicle equipment such as the motion of a crane or a tailgate.

If, before the step 170 of executing the identified operation, the control system detects, for instance, the presence of an outside obstacle such as an electric pole or a person located behind the vehicle, the control system will refuse, in step 146, the execution of the identified operation if that operation consists of a backward motion of the vehicle or of lowering the rear tailgate of the vehicle.

If during the step 170 of executing the identified operation, the control system detects, for instance, the presence of an outside obstacle that can interfere with the identified operation, the control system will stop the execution of the identified operation.
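A minimal sketch of the safety check of steps 145, 146 and 170 is given below; run_operation is assumed to be a generator yielding after each partial motion so that obstacles can also be detected during execution, which is only one possible way to model it.

def execute_with_safety_check(operation, detect_obstacle, run_operation, inform_user):
    """Step 145: refuse the operation if an obstacle is detected before execution
    (step 146), and stop it if an obstacle appears during execution (step 170)."""
    if detect_obstacle():
        inform_user("operation refused: obstacle detected")    # step 146
        return False
    for _ in run_operation(operation):       # yields after each partial motion
        if detect_obstacle():
            inform_user("operation stopped: obstacle detected")
            return False
    return True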

If the system refuses or stops the execution of an identified operation, the user is preferably informed that the vehicle can't execute or can't continue to execute the identified operation. In this case, the controlling method returns to step 110 where the user can assume at least one new posture or perform at least one new movement whose image(s) will be captured by the motion recognition unit. In this case, although not represented on figure 8, the user can also decide to select a new function.

Figures 9 and 10 depict learning functions that can be selected by the user. In some implementations of the method, and as previously explained in connection with figure 5, the user can select, according to step 104, which vehicle function he or she wants to control with the control system. The control system can be configured to receive, during this same step 104, the selection by the user of a learning function instead of the selection of a vehicle function.

The learning function can be used by the user in order to assign one or several body postures or body movements to an available operation that the vehicle control system can control. The available operation can be an operation to which a body posture or a body movement is already assigned, or it can be an available operation of a vehicle function to which no body posture or body movement has yet been assigned.

The learning function can be implemented in different ways according to the invention.

In a first implementation, such as represented on figure 9, the learning function can comprise the following steps (a minimal sketch of these steps is given after the list):

• a step 201 of receiving the selection by the user, among a predetermined list of operations that can be executed by at least one vehicle function, of the operation that will be learnt by the vehicle control system;

• a step 202 of capturing, with the motion recognition unit 15, at least one reference image of a reference user posture assumed by the user in the operational zone 18, 18a, 18b or several reference images of a reference user movement performed in the operational zone 18, 18a, 18b;

• a step 203 of converting said reference image(s) into a set of reference digital data;

• a step 204 of storing the set of reference digital data in a memory 22 of the vehicle control system or in a memory of the vehicle;

• a step 205 of assigning the set of reference digital data to the selected operation.
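The sketch below illustrates steps 201 to 205 under the assumption of hypothetical hooks (select_operation, capture_reference, convert_to_digital) and a plain dictionary standing in for the memory 22.

def learn_operation(select_operation, capture_reference, convert_to_digital, memory):
    """Steps 201 to 205 (figure 9): record a reference posture or movement and
    assign it to the operation chosen by the user."""
    operation = select_operation()                           # step 201
    reference_images = capture_reference()                   # step 202, motion recognition unit 15
    reference_data = convert_to_digital(reference_images)    # step 203
    memory[operation] = reference_data                       # steps 204 and 205: store and assign
    return operation, reference_data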

In a second implementation, such as represented on figure 10, the learning function can comprise the following steps (a minimal sketch of these steps is given after the list):

• a step of receiving 301 the selection by the user, among a predetermined list of operations that can be executed by at least one vehicle function, of the operation that will be learnt by the vehicle control system;

• a step of capturing 302, with the motion recognition unit 15, at least one first reference image of a first reference user posture assumed by the user in the operational zone 18, 18a, 18b or several first reference images of a first reference user movement performed in the operational zone 18, 18a, 18b;

• a step of converting 303 said first reference image(s) into a first set of reference digital data;

• a step of storing 304 the first set of reference digital data in a memory 22, 23 of the vehicle control system or in a memory of the vehicle;

• a step of capturing 305 at least one second reference image of a second reference user posture assumed by the user in the operational zone or several second reference images of a second reference user movement performed in the operational zone;

• a step of converting 306 said second reference image(s) into a second set of reference digital data;

• a step of comparing 307 the second set of reference digital data with the first set of reference digital data;

• if the second set of reference digital data matches the first one, a step of assigning 308 the first or the second set of reference data to the selected operation.
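For the second implementation, a comparable sketch of steps 301 to 308 is given below; matches is a hypothetical predicate standing in for the comparison of step 307, and the other hooks are the same illustrative placeholders as above.

def learn_operation_with_check(select_operation, capture_reference, convert_to_digital,
                               matches, memory):
    """Steps 301 to 308 (figure 10): the reference gesture is captured twice and is
    assigned to the selected operation only if the two captures match."""
    operation = select_operation()                      # step 301
    first = convert_to_digital(capture_reference())     # steps 302 and 303
    memory[operation] = first                           # step 304
    second = convert_to_digital(capture_reference())    # steps 305 and 306
    if matches(first, second):                          # step 307
        return operation, first                         # step 308: assignment confirmed
    del memory[operation]                               # no match: discard the tentative assignment
    return None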

The user can select the operation to be learnt by the control system via a remote interface 31 that is in wireless communication with the control system.

The invention is of course not limited to the embodiments described above as examples, but encompasses all technical equivalents and alternatives of the means described as well as combinations thereof.