

Title:
AUTOMATIC VEHICLE CLOSURE OPERATING
Document Type and Number:
WIPO Patent Application WO/2021/093934
Kind Code:
A1
Abstract:
Disclosed is a method for automatically operating a vehicle closure at a vehicle, the method comprising a configuration step (S100) of activating a hands-free closure-operating (HFCO) functionality in the vehicle (V) and setting conditions to operate the vehicle closure automatically, a detecting step (S200) of detecting the presence of a user and generating a sensor signal when the user is located in an authorization region (AUR) in proximity to the vehicle (V), an analyzing step (S300) of analyzing the sensor signal, a generating step (S400) of generating, based on said analyzed sensor signal, an activation signal, provided that the conditions set to operate the vehicle closure automatically are satisfied and provided that the user is located in an activation region (ACR) inside the authorization region (AUR), and an actuating step (S500) of actuating an operating mechanism (60) coupled to the vehicle closure of the vehicle (V) upon receiving the activation signal. Further disclosed are a corresponding system (10) for automatically operating a vehicle closure of the vehicle (V), a computer program comprising instructions to cause the system (10) to execute the steps of the method, and a computer-readable medium having stored thereon the computer program.

Inventors:
LÜDTKE THORSTEN (DE)
Application Number:
PCT/EP2019/080883
Publication Date:
May 20, 2021
Filing Date:
November 11, 2019
Assignee:
BYTON LTD (CN)
BYTON GMBH (DE)
International Classes:
E05F15/73; E05F15/76
Foreign References:
DE102014101201A12015-08-06
US20150247352A12015-09-03
GB2542274A2017-03-15
DE102017128774A12019-06-06
US20190330908A12019-10-31
US7688179B22010-03-30
Attorney, Agent or Firm:
KLUNKER IP PATENTANWÄLTE PARTG MBB (DE)
Claims

1. Method of automatically operating a vehicle closure, the method comprising:

- a configuration step (S100) with activating a hands-free closure-operating functionality in the vehicle (V) and receiving setting conditions to operate the vehicle closure automatically;

- a detecting step (S200) with detecting the presence of a user and generating a sensor signal when the user is located in an authorization region (AUR) in the proximity to the vehicle (V);

- an analyzing step (S300) with analyzing the sensor signal based on the conditions set in the configuration step (S100);

- a generating step (S400) with generating an activation signal based on said sensor signal analyzed in the analyzing step (S300), provided that conditions set for operating the vehicle closure automatically are satisfied and provided that the user is located in an activation region (ACR) inside the authorization region (AUR); and

- an actuating step (S500) with actuating an operating mechanism (60) coupled to the vehicle closure upon receiving the activation signal.

2. The method of claim 1, wherein the configuration step (S100) further comprises automatically activating the hands-free closure-operating functionality by turning on a GPS of the vehicle (V) and detecting the vehicle’s position in an area where the hands-free closure-operating functionality is preset as likely to be used or preset as a preferred area.

3. The method according to claim 1 or 2, wherein the configuration step (S100) further comprises automatically activating the hands-free closure-operating functionality by turning on the GPS of the vehicle (V) and applying a learning algorithm.

4. The method according to any one of the preceding claims, wherein the conditions set for operating the vehicle closure automatically comprise at least one of: detecting a change of position of the user; detecting an image of the user, in particular an image of the user’s face; detecting a gesture of the user; detecting a sound of the user; and detecting a wireless signal sent by the user to the vehicle (V).

5. The method according to claim 4, wherein, in the detecting step (S200), the detecting the change of position of the user is carried out by identifying and cropping an image of the user and tracking the position of said image over time.

6. The method according to claim 4 or claim 5, wherein, in the detecting step (S200), the detecting of the change of position of the user is based on a wireless communication between the vehicle (V) and at least one of a portable remote-control unit and electronic key fob (501) associated to the user.

7. The method according to any one of the preceding claims, wherein, in the generating step (S400), the analyzed sensor signal triggers the generating of the activation signal when the user approaches the vehicle (V) from a determined direction relative to the vehicle (V).

8. The method according to any one of the preceding claims, wherein, in the generating step (S400), the analyzed sensor signal triggers the generating of the activation signal when a hands-free operating condition is recognized by the vehicle (V), wherein the hands-free operating condition consists in the impossibility for the user to use the hands to operate the vehicle closure.

9. The method of claim 8, wherein the hands-free operating condition is determined by combining at least an image of the user, a variation of the position of the user relative to the vehicle (V) and the presence of the user in the activation region (ACR).

10. The method according to any one of the preceding claims, wherein, in the generating step (S400), the sensor signal triggers the generating of the activation signal when a predetermined gesture and/or a predetermined sound of the user is recognized by the vehicle (V).

11. The method according to any one of the preceding claims, wherein, in the generating step (S400), the sensor signal triggers the generating of the activation signal immediately after the user enters the activation region (ACR) as a consequence of a previous detection of a wireless signal sent from the user to the vehicle (V).

12. The method according to any one of the preceding claims, further comprising a detecting step (S600) with detecting a possible hindrance preventing the user from operating the vehicle closure before actuating the operating mechanism (60).

13. The method according to any one of the preceding claims, further comprising calculating the distance of the user relative to the vehicle (V) to detect the presence of the user in the authorization region (AUR) and the activation region (ACR).

14. The method according to any one of the preceding claims, further comprising a detecting step (S800) with detecting the light intensity in the region surrounding the vehicle (V) and activating at least an illuminating element of the vehicle (V) if the detected light intensity is below a determined threshold.

15. The method according to any one of the preceding claims, wherein the activation region (ACR) is defined as a region in the proximity to at least a part of the vehicle (V) extending from said part of the vehicle (V) for a first distance d1, wherein the first distance d1 is between 1 m and 9.5 m, and the authorization region (AUR) is a region adjacent to the activation region (ACR) and extending outside said activation region (ACR) from the vehicle (V) for a second distance d2 between 2 m and 10 m.

16. The method of claim 15, wherein the vehicle closure automatically closes when the user is outside a closing region (CR), the closing region (CR) being a region adjacent to the activation region (ACR) and extending outside and from the activation region (ACR) for a third distance d3, wherein the third distance d3 is at least 0.5 m.

17. The method according to any one of the preceding claims, wherein the method further comprises, after the actuating step (S500), a detecting step (S700) with detecting the presence of the user in the activation region (ACR) and closing the vehicle closure in case the user is out of the activation region (ACR) for a predetermined period of time or in case the presence of the user is detected in the driving position.

18. System (10) for automatically operating a vehicle closure of a vehicle (V) comprising:

- at least a sensor (20) for detecting the presence of a user in an authorization region (AUR) and in an activation region (ACR) in the proximity to the vehicle (V) and generating a sensor signal;

- a control unit (30) within the vehicle (V) for receiving the sensor signal and for receiving instructions by a user related to the activation of a hands-free closure-operating functionality in the vehicle (V) and related to preset conditions to operate the vehicle closure automatically; and

- an actuator (40) for controlling an operating mechanism (60) of the vehicle closure;

wherein the control unit (30) is configured to analyze the sensor signal based on the preset conditions and to generate an activation signal based on the analyzed sensor signal, provided that the preset conditions to automatically operate the vehicle closure are satisfied and provided that the user is located in the activation region (ACR) inside the authorization region (AUR), and wherein the control unit (30) is further configured to send the activation signal to the actuator (40) for operating the vehicle closure.

19. The system (10) of claim 18, further comprising a transmitting and receiving unit (50) for receiving a wireless signal from at least one of a portable remote-control unit and electronic key fob (501) associated to the user.

20. The system (10) of any one of claims 18 to 19, wherein the sensor (20) is at least one of:

- a camera (201) for capturing images and/or for recording videos in the authorization region (AUR);

- a microphone (203) for capturing sounds in the authorization region (AUR); and

- a position sensor (204) for detecting the presence of objects in the authorization region (AUR) and the activation region (ACR).

21. The system (10) of claim 20, comprising a plurality of position sensors (204) evenly distributed around the vehicle (V), said plurality of position sensors (204) comprising at least two front position sensors (204F) on each external side of the vehicle (V), at least two center position sensors (204C) on each external side of the vehicle (V), at least one rear position sensor (204R) on the external rear side of the vehicle (V) and at least one internal position sensor (204I) located inside the vehicle (V).

22. The system (10) of claim 20 or 21, comprising a plurality of cameras (201) evenly distributed around the vehicle (V), said plurality of cameras (201) comprising at least one rear camera (201R) on the external rear side of the vehicle (V), at least two side cameras (201S) on each external side of the vehicle (V), and at least one front camera (201F).

23. The system (10) of any one of claims 18-22, further comprising at least a light sensor (206) for detecting the light intensity in the authorization region (AUR) and activation region (ACR).

24. A vehicle (V) comprising the system for automatically operating a vehicle closure of the vehicle (V) according to any one of claims 18-23.

25. The method according to one of the claims 1-17, or the system (10) according to one of the claims 18-23, or the vehicle according to claim 24, wherein the closure is an access closure at the vehicle (V) which is at least one of the following list consisting of a sliding roof, a fuel access flap, a charging access flap, a lid of a stowage box, an engine hood, a trunk lid, a liftgate.

26. A computer program product comprising a computer program including instructions causing the control unit (30) of the system (10) according to one of the claims 18-23 to carry out the steps of the method according to one of the claims 1-17.

27. A computer-readable medium having stored thereon the computer program product of claim 26.

28. A data stream containing electronically readable control signals that are able to interact with the control unit (30) of the system (10) according to one of the claims 18-23 in such a way that the control unit (30) carries out a method according to one of claims 1-17.

Description:
Automatic Vehicle Closure Operating

Field

In general, the present disclosure relates to the automatic operating of any closure on a vehicle. In particular, the present disclosure concerns a method and system for automatically operating, such as opening and/or closing, an access closure of a vehicle to assist a user who intends to access, e.g. to load, the vehicle through that access closure. In addition, the present disclosure relates to a vehicle comprising the system, a corresponding computer program and computer-readable medium storing said computer program, and a corresponding data stream.

Background

The following background information is provided solely to facilitate understanding of the present disclosure and should by no means be construed as admitted prior art unless expressly designated as such.

There exist hands-free systems and methods to grant authority for accessing vehicles, such as cars, trucks, vans, or the like. Most are based on the interaction between the vehicle and a portable remote-control unit, such as a key fob, to authorize the user to access the vehicle. Once the vehicle recognizes the presence of the portable remote-control unit within a local area, the vehicle merely unlocks a particular door, all doors, or a trunk lid of the vehicle. In addition to the hands-free unlocking functionality, the user still has to perform a secondary action to gain physical access.

US 7,688,179 B2, for example, describes an opening mechanism based on a particular foot movement of the user in proximity of the vehicle. In detail, after the user is identified and validated through the key fob, a door opening sensor, usually located underneath the bumper, can be actuated by kicking a kick switch or by interrupting a laser beam, i.e. exposing a foot to a laser-emitting detector. Thus, by a movement of the foot, the user can open or close the door of the vehicle to gain physical access without having to touch the door with his or her hands. This can be particularly useful when the user is unable to use the hands to open the door of the vehicle, either by directly acting on an opening switch on the vehicle or by using an opening switch on a remote unit, such as the key fob. A typical situation occurs when the user is carrying bulky objects such as shopping bags or luggage that require the use of both hands.

The known methods and systems require the user to actively interact with an opening mechanism to open a particular door or lid when approaching the vehicle. Even though the authorization step occurs automatically by means of the portable remote-control device, to open the door the user still needs to interact with the vehicle by means of a secondary opening method and/or system, for example by the afore-mentioned movement of the foot in proximity to an opening sensor, such as the afore-mentioned kick switch, or by actuating dedicated microswitches, touching capacitive sensors, pressing a corresponding control button at the key fob, using a dedicated smartphone app, or similar. However, the user’s movement required to interact with such a secondary opening method and/or system can be cumbersome or even hindered by different factors. For example, the user cannot easily reach the sensor or switch because the user is sitting in a wheelchair, or, because the user is holding a baby, he or she is unwilling to additionally lift a leg to perform the foot gesture.

The conventional secondary opening methods and systems fail to provide a convenient way for hands-free access to the space behind a particular door at the vehicle, such as the trunk of a vehicle behind the trunk lid.

Summary

It is an aim of the present disclosure to provide a solution by which the above described drawbacks in the conventional hands-free closure-operating mechanisms can be overcome or at least reduced.

The term (access) closure in the following is to be understood as representing all openable closures of a vehicle, such as, from the following non-exhaustive list: sliding roofs, fuel access flaps, charging access flaps, lids of stowage boxes, engine hoods, trunk lids and liftgates, just to name a few. The term “operate a vehicle closure” herein is intended to mean both “opening the closure” and “closing the closure”, but may be restricted to one of these two alternatives.

The terms locking/unlocking a closure are in the present disclosure strictly understood as different from closing/opening a closure. The latter terms concern the process of actually gaining physical access to the space behind a particular closure by opening it, or terminating that access by closing the closure. The former terms merely concern the process of enabling or disabling whether a particular closure can be opened.

The above-stated aim may be achieved with the features of the attached independent claims. Further exemplary implementations and developments are defined in the respective dependent claims.

Here, features and details that are defined in connection with the method for automatically operating a closure of a vehicle according to the present disclosure are correspondingly valid for the corresponding system and vice versa. For this reason, reciprocal reference is made with respect to the present disclosure of the individual aspects.

The core concept of the present disclosure is a method and a system to operate the closure of a vehicle in an automatic manner, particularly without requiring the user to employ the hands. To this effect, the vehicle is configured to recognize the need or intention of the user to operate the closure and to operate it automatically. In other words, when a system in control of the closures implemented in the vehicle is configured to “understand” that the user intends to access a particular closure and at the same time may be unable or unwilling to operate it by employing the hands, the control system is configured to operate the corresponding closure automatically upon the user approaching the vehicle, in particular approaching the particular closure.

Thus, a first aspect of the present disclosure provides a method for automatically operating a vehicle closure, i.e. a closure at the vehicle which can be opened to gain physical access to the space behind the closure and closed to terminate said access. The method comprises the steps:

S100: a configuration step with activating a hands-free closure-operating functionality in the vehicle and receiving setting conditions to operate the vehicle closure automatically.

S200: a detecting step with detecting the presence of a user and generating a sensor signal when the user is located in an authorization region in proximity to the vehicle.

S300: an analyzing step with analyzing the sensor signal based on the set conditions.

S400: a generating step with generating an activation signal based on said sensor signal analyzed in the analyzing step (S300), provided that the conditions set to operate the vehicle closure automatically are satisfied and provided that the user is located in an activation region inside the authorization region.

S500: an actuating step with actuating an operating mechanism coupled to the vehicle closure after receiving the activation signal.
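The sequence of steps S100-S500 can be illustrated with a minimal sketch. All function names, region radii and the single “hands occupied” condition below are illustrative assumptions, not part of the disclosure; a real implementation would fuse several sensors and configurable conditions.

```python
# Minimal sketch of the S100-S500 pipeline; names and values are illustrative.

AUR_RADIUS_M = 10.0  # authorization region outer distance (example value)
ACR_RADIUS_M = 5.0   # activation region outer distance (example value)

def configure(hfco_enabled, conditions):
    """S100: activate the HFCO functionality and store the set conditions."""
    return {"hfco": hfco_enabled, "conditions": conditions}

def detect(user_distance_m):
    """S200: emit a sensor signal only when the user is inside the AUR."""
    if user_distance_m <= AUR_RADIUS_M:
        return {"distance": user_distance_m}
    return None

def analyze(signal, config):
    """S300: check the sensor signal against the configured conditions."""
    return config["hfco"] and config["conditions"].get("hands_occupied", False)

def generate(signal, conditions_met):
    """S400: generate the activation signal only inside the ACR."""
    return conditions_met and signal["distance"] <= ACR_RADIUS_M

def actuate(activation):
    """S500: drive the operating mechanism coupled to the closure."""
    return "closure_opening" if activation else "closure_idle"

def run_pipeline(user_distance_m, hands_occupied):
    config = configure(True, {"hands_occupied": hands_occupied})
    signal = detect(user_distance_m)   # S200
    if signal is None:
        return "closure_idle"          # user outside the AUR
    return actuate(generate(signal, analyze(signal, config)))
```

In this sketch, a user 4 m away with occupied hands triggers the actuator, while the same user 8 m away (inside the AUR but outside the ACR) does not.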

As mentioned above, the term “vehicle closure” is intended to cover any type of openable closure at the vehicle through which a user gains physical access to the space behind the respective closure. Therefore, the method can be configured to automatically open, for instance, a front door, a rear vehicle door, a sliding roof, as well as a tailgate, liftgate, trunk lid or any other type of vehicle closure allowing access of the user or of objects into the vehicle behind the respective closure, e.g. the trunk space behind the tailgate.

In this context, the term “hands-free closure-operating (HFCO) functionality” means that the vehicle closure can be operated, i.e. opened and/or closed, by the user without employing the hands directly or indirectly: directly, e.g. by actuating a switch or grasping a vehicle closure handle coupled to the operating mechanism of the closure; or indirectly, by using a remote control, e.g. at the electronic key fob or a smartphone or smartphone application, or by intentional movements such as a kicking foot motion actuating a kick sensor to remotely operate the vehicle closure. The activation of the HFCO functionality proposed here means that the vehicle is prepared, i.e. configured, to allow the user to operate the vehicle closure without such intentional movements.

The term “authorization region” defines a region in proximity to the vehicle used for the authorization and validation of a user. Once the user enters the authorization region, the vehicle starts verifying the identity of the user. Only authorized users can benefit from the different functionalities of the vehicle.

The term “activation region” defines a region comprised in the authorization region used for activating the HFCO functionalities of the vehicle once predetermined (set or preset) conditions are satisfied.

For example, the authorization region may be represented by an area extending 2-10 meters around the vehicle, whereas the activation region may be represented by an area included in the authorization region and extending 1-9.5 meters around the vehicle. The activation region is at least 0.5 meters less in distance than the authorization region in order to offer enough hysteresis between the open or unlocked status and the closed or locked status, thereby avoiding switching back and forth between the locked and unlocked status when the user is at the border of the authorization region.
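The region geometry with its 0.5 m hysteresis margin can be sketched as a simple classifier; the concrete radii below are example values chosen inside the disclosed 1-9.5 m and 2-10 m ranges and are not prescribed by the disclosure.

```python
def classify_region(distance_m, acr_radius_m=9.5, aur_radius_m=10.0):
    """Classify the user's distance from the vehicle into a region.

    The ACR must end at least 0.5 m inside the AUR so that a user
    standing at the AUR border cannot toggle the lock state back and
    forth (hysteresis). Radii are illustrative example values.
    """
    if aur_radius_m - acr_radius_m < 0.5:
        raise ValueError("ACR must end at least 0.5 m inside the AUR")
    if distance_m <= acr_radius_m:
        return "ACR"      # activation region: HFCO may trigger here
    if distance_m <= aur_radius_m:
        return "AUR"      # authorization region: identity check only
    return "outside"
```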

According to the method, the conditions to automatically operate the vehicle closure may be set in advance. These conditions can be set in at least one of: the factory by the vehicle’s manufacturer; the point of sale by the vehicle’s seller; and directly by the user. In particular, the conditions may be set remotely or in the car through a manual input, or automatically based on a machine learning process. In this way, a control unit located inside the vehicle is configured to prepare the vehicle according to the needs of the user. When the user approaches the vehicle, for example returning from a store carrying shopping bags in both hands, the vehicle is configured to “understand” the situation by analyzing data related to the situation and to automatically operate the corresponding vehicle closure, for example the tailgate, to assist the user in stowing the bags away inside the vehicle, e.g. in the trunk. For operating the vehicle closure concerned, an activation signal may be sent to the operating mechanism coupled to the vehicle closure.

On the other hand, if the user approaches the vehicle without any bags in the hands, the vehicle is configured to “understand” that in this situation the automatic operating of, for example, the tailgate is not necessary. In this case, the sensor signals used to understand the contextual situation will not trigger the generating of an activation signal. Therefore, no activation signal will be sent to the operating mechanism, because the predetermined (set or preset) conditions to automatically operate the vehicle closure are not met, i.e. not satisfied. Due to the possibility of setting the conditions to automatically operate the vehicle closure in advance, the HFCO functionality can be tailored to the needs of any type of user.

For example, a person in a wheelchair can set a condition according to which, once the vehicle detects the presence of that particular wheelchair entering the activation region, the vehicle closure is immediately opened automatically. On the other hand, a user with a baby can set a condition according to which, once the vehicle detects the presence of the user holding the baby, the vehicle rear door automatically opens to allow the user to secure the baby in the child seat.

The activation of the HFCO functionality can also be carried out manually by simply actuating an activating element, such as a dedicated switch button inside the vehicle, on the vehicle’s electronic key fob, or in a corresponding smartphone application. Alternatively or additionally, the HFCO functionality can be activated automatically by means of the GPS of the vehicle. In the latter case, the activation can occur in different manners. For example, the HFCO functionality can be activated by detecting the vehicle’s position in an area where the HFCO functionality is likely to be used. For example, if the user parks the vehicle in close proximity to a shopping mall, e.g. in a parking lot of the mall, by means of the GPS localization a control unit of the vehicle can be configured to recognize that the user most likely intends to buy one or more products in the shopping mall. Thus, when the user returns to the vehicle, the sensors will be turned on to check whether the hands of the user are occupied by holding one or more packages or shopping bags. If the situation falls into the shopping-scenario category, a configuration algorithm configures the HFCO functionality to operate the tailgate automatically, without further input from the user, to assist the deposit of the goods.
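The GPS-based activation described above amounts to a geofence check against preset areas. The following sketch uses a standard haversine distance; the 150 m radius and the coordinate values are illustrative assumptions, not values from the disclosure.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6_371_000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_activate_hfco(vehicle_fix, preset_areas, radius_m=150.0):
    """Activate HFCO when the parked vehicle lies inside any preset area,
    e.g. a mall parking lot or the user's personal parking place."""
    lat, lon = vehicle_fix
    return any(haversine_m(lat, lon, a_lat, a_lon) <= radius_m
               for a_lat, a_lon in preset_areas)
```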

It is noted that the term “algorithm” is used throughout the description to indicate either a particular module realizing a particular function or the overall process (the complete HFCO functionality) carried out by a computer, such as the control unit, in relation to individual functionalities of, or the entire, method for operating, i.e. opening and/or closing, the vehicle’s closure.

Alternatively, or additionally, the HFCO functionality may also be activated by detecting the vehicle’s position in an area that is preset as a preferred area. A preferred area can be, for example, the personal parking place of the user. In this way, each time the vehicle is parked in this place, the user does not need to remember to turn on the functionality, since it may be automatically activated by the control unit.

Alternatively, or additionally, the configuration algorithm behind the HFCO functionality may also be trained by applying a learning algorithm, such as a deep-learning algorithm. For example, if the tailgate is frequently or routinely opened and closed in a particular place, the control unit can “learn” that most likely in that place the user needs to load or unload items into or out of the vehicle. As an example, the control unit of the vehicle may learn that each time the user parks close to the veterinarian, the tailgate needs to be opened and closed to let the dog into or out of the vehicle. Thus, the learning algorithm can be applied to automatically configure the HFCO functionality to be enabled in a particular place after the user has periodically activated the HFCO functionality manually in this particular place, e.g. a predetermined number of times in the same place.
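The place-learning idea above can be sketched with a simple frequency counter: after the user has manually enabled HFCO a given number of times at roughly the same place, that place becomes auto-enabled. A deep-learning model could replace the counter; the threshold and the grid-cell quantization are assumed parameters, not part of the disclosure.

```python
from collections import Counter

class HfcoPlaceLearner:
    """Frequency-based sketch of the learning step (illustrative only)."""

    def __init__(self, threshold=3, cell_deg=0.001):
        self.threshold = threshold   # manual activations before auto-enable
        self.cell_deg = cell_deg     # ~100 m grid cell defines "same place"
        self.counts = Counter()

    def _cell(self, lat, lon):
        # Quantize a GPS fix to a coarse grid cell.
        return (round(lat / self.cell_deg), round(lon / self.cell_deg))

    def record_manual_activation(self, lat, lon):
        """Called each time the user switches HFCO on by hand here."""
        self.counts[self._cell(lat, lon)] += 1

    def auto_enabled(self, lat, lon):
        """True once this place has seen enough manual activations."""
        return self.counts[self._cell(lat, lon)] >= self.threshold
```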

As mentioned above, the conditions to automatically operate the vehicle closure can be set in advance by the user. These conditions may comprise (the detection of) a change of position of the user, an image of the user, a gesture or a sound of the user, and/or a wireless signal sent by the user to the vehicle. In other words, a particular movement of the user relative to the vehicle or a determined gesture or sound of the user can be employed as the condition to operate the vehicle closure automatically. It is noted that these conditions can be combined with each other. For example, the condition to operate the vehicle closure automatically can be the simple detection of the user’s face for face recognition. Alternatively, the condition can be face recognition combined with the movement of the user towards a particular portion of the vehicle, such as the rear of the vehicle.

In a further development of the method, the method may further comprise a detecting step with detecting the change of position of the user by identifying and cropping an image of the user, in particular an image of the user’s face, and tracking the position of said image over time. Alternatively, or additionally, detecting the change of position of the user may be based on a wireless communication between the vehicle and a portable remote-control unit associated to the user.

In a further development of the method, in the generating step the analyzed sensor signal triggers the generation of the activation signal when the user approaches the vehicle from a determined direction relative to the vehicle, in particular when the user approaches the vehicle from a particular direction towards a particular portion of the vehicle, such as from behind the vehicle towards the rear of the vehicle. If, for example, the user is returning from a store and is heading towards the rear of the vehicle, the tailgate as the corresponding closure may be automatically opened due to the recognized intention to place objects in the rear compartment. In this particular case, the sensor signal may be generated by at least one of position sensors, optical sensors such as cameras, or an antenna for receiving wireless signals.

In a further development of the method, the sensor signal triggers the generation of the activation signal when a hands-free operating (set or preset) condition is recognized by the vehicle. The term “hands-free operating condition” relates to the impossibility for the user to use the hands to operate the vehicle closure (for example, when both hands of the user are occupied, e.g. by shopping bags).
In this particular case, the operating condition is determined by combining at least an image of the user, a variation of the position of the user relative to the vehicle and the presence of the user in the authorization region (and/or in the activation region). For example, by means of optical sensors it may be detected that the user is talking on a mobile phone and at the same time is pushing a shopping cart. This condition, combined with the detection of the user moving toward the rear portion of the vehicle, determines the automatic operating of the tailgate once the user enters the authorization region. It is noted that if, for example, it is detected that the user is talking on the mobile phone and pushing a shopping cart but is not heading toward the rear portion of the vehicle, the activation signal is not generated, since the hands-free operating condition is not identified. The user may, for example, simply be heading to the place for storing the shopping carts. In this case, the intention of placing objects into the rear compartment of the vehicle is not recognized and therefore the automatic activation of the operating mechanism of the tailgate of the vehicle is not necessary.

In a further development of the method, the sensor signal triggers the generation of the activation signal when a predetermined gesture and/or a predetermined sound produced by the user is recognized by the vehicle. For example, a camera and/or a microphone can detect a predetermined movement of the head or of the body as well as a predetermined phrase or predetermined sound (e.g. a whistle) of the user. It is noted that the user need not necessarily be a human being. For example, the vehicle may be configured to recognize the barking of a dog wanting to jump inside the rear compartment of the vehicle.
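The combination of cues used to recognize the hands-free operating condition can be sketched as follows. The boolean cues stand in for real perception outputs (an image classifier for occupied hands, a trajectory tracker, a region classifier), and the cone-angle geometry for the "heading toward the rear" cue is an assumption for illustration, not the disclosed implementation.

```python
import math

def approaching_rear(user_pos, user_prev_pos, rear_pos, cone_deg=45.0):
    """Trajectory cue: True when the user's motion vector points at the
    rear of the vehicle within a tolerance cone (illustrative geometry)."""
    vx, vy = user_pos[0] - user_prev_pos[0], user_pos[1] - user_prev_pos[1]
    tx, ty = rear_pos[0] - user_pos[0], rear_pos[1] - user_pos[1]
    norm = math.hypot(vx, vy) * math.hypot(tx, ty)
    if norm == 0.0:
        return False  # user not moving, or already at the rear
    cos_a = max(-1.0, min(1.0, (vx * tx + vy * ty) / norm))
    return math.degrees(math.acos(cos_a)) <= cone_deg

def hands_free_condition(hands_occupied, moving_toward_closure, in_acr):
    """Cue fusion: all three cues must agree before the tailgate opens."""
    return hands_occupied and moving_toward_closure and in_acr
```

With this fusion, a user pushing a shopping cart who walks past the vehicle toward the cart-return area fails the trajectory cue, so no activation signal is produced.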

In a further development of the method, the sensor signal triggers the generation of the activation signal immediately after the user enters the activation region, as a consequence of a corresponding wireless signal previously sent from the user to the vehicle. For example, the user can decide at the check-out of a store or hotel to prepare the vehicle to automatically operate the vehicle closure by sending a wireless signal, using for example a remote-control unit. In this way, immediately after the presence of the user is detected in the activation region, the vehicle closure may be automatically opened.

In a further development, the method may further comprise a detecting step with detecting a possible hindrance to operating the vehicle closure before actuating the operating mechanism. After the sensor signal is analyzed, even if the conditions set in advance by the user are satisfied, the vehicle or rather a control unit inside the vehicle can refrain from generating the activation signal if an obstacle is detected that would collide with the vehicle closure when opening and/or closing. In this way, it is possible to avoid damage to the body of the vehicle when automatically operating the closure.

In order to detect the presence of the user in the authorization region and in the activation region, the method may further comprise a calculating step with calculating the distance of the user relative to the vehicle. The distance may be calculated by analyzing the data of a position sensor that detects the presence of objects within a predetermined distance from the vehicle. The distance may also be calculated by tracking an image of the user, e.g. an image of the user’s face, or by analyzing the received wireless signal sent by the user through a remote-control unit.

In a further development, the method may further comprise a detecting step with detecting the light intensity in the region surrounding the vehicle and activating at least an illuminating element of the vehicle if the detected light intensity is below a determined threshold. By means of one or more light sensors located on the vehicle, the intensity or brightness of the light outside the vehicle may be measured. In case the light intensity or light brightness lies below a predetermined threshold, meaning that a camera would hardly recognize the presence of the user, for example in the authorization region, the headlights, the rear lights, or the fog lights of the vehicle may be turned on automatically. This could be particularly useful, for example, when the HFCO functionality is activated and the vehicle detects the presence of a moving body in its proximity, but the identification of the operating intention is not clear enough. Even if the intention is correctly identified by the vehicle, turning on the illuminating element can furthermore be useful to better determine, for example, the user’s direction or the user’s gesture required as a condition to operate the vehicle closure automatically, and to improve the safety around the vehicle while automatically operating the respective closures.

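The distance calculation and the low-light fallback described above can be sketched as follows. This is an illustrative Python sketch: the vehicle-frame coordinates, the lux threshold and the list of lights are assumptions for the example, not values from the disclosure:

```python
import math

def distance_to_vehicle(user_xy, receiver_xy=(0.0, 0.0)) -> float:
    """Euclidean distance (in metres) between the detected user position
    and the receiver location, both in the vehicle's coordinate frame."""
    return math.hypot(user_xy[0] - receiver_xy[0], user_xy[1] - receiver_xy[1])

def illumination_command(light_lux: float, threshold_lux: float = 50.0):
    """Below the (assumed) threshold a camera can hardly recognize the
    user, so external illuminating elements are switched on."""
    if light_lux < threshold_lux:
        return ["headlights", "rear lights", "fog lights"]
    return []
```

In practice the user position would come from position sensors, image tracking, or wireless signal strength; only the geometric step is shown here.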
According to the present disclosure, the “activation region” is defined as a region in proximity to at least a part or portion of the vehicle. This region can be limited to the rear portion or can surround the entire vehicle and can extend from the portion of the vehicle for a first distance d1, wherein the first distance d1 lies between 1 m and 9.5 m, preferably 3 m. According to the present disclosure, the “authorization region” is defined as a region adjacent to the activation region and extending outside the activation region from the vehicle for a second distance d2, wherein the second distance d2 lies between 2 m and 10 m, preferably 5 m. In particular, the second distance d2 is at least 0.5 m larger than the first distance d1.
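The nesting of the two regions can be sketched as a simple classification by distance (an illustrative Python sketch; the preferred values d1 = 3 m and d2 = 5 m are taken from the description above, the function name is an assumption):

```python
def classify_region(distance_m: float, d1: float = 3.0, d2: float = 5.0) -> str:
    """Classify a user's distance from the vehicle into the nested regions."""
    if d2 < d1 + 0.5:
        # per the disclosure, d2 is at least 0.5 m larger than d1
        raise ValueError("d2 must be at least 0.5 m larger than d1")
    if distance_m <= d1:
        return "activation"     # inside the activation region
    if distance_m <= d2:
        return "authorization"  # in the authorization region, outside activation
    return "outside"
```

For example, a user at 4 m is authorized but does not yet trigger the activation signal; only on crossing into the inner 3 m region can the closure be operated.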

In case the presence of a body outside the vehicle is detected by only using position sensors, the shape and the extension of the activation region or the authorization region would vary based on the number, the location and the characteristics of these sensors on the vehicle. An even distribution of position sensors would determine a continuous region extending all around the vehicle for almost the same distance. If the sensors are concentrated in a particular area of the vehicle, the region would mostly extend for a distance in that area, wherein said distance can be related to the responsiveness of the sensors. In case the presence of a body outside the vehicle is detected by analyzing a wireless signal received from a remote-control unit, the value of the first distance d1 or the second distance d2 can be related to the calculated distance between the remote-control unit and the location of the receiver in the vehicle. In such cases, the region can therefore assume the shape of a circle centered on the position of the receiver in the vehicle.

In a development, the method may further comprise the step of automatically closing the vehicle closure of the vehicle when the user is outside or leaving a closing region. The “closing region” is defined as a region adjacent to the activation region and extending outside and from the activation region for a third distance d3, wherein the third distance d3 must be at least 0.5 m.

It is noted that the closing region can coincide with the authorization region. In this case, when the user enters the authorization region from outside, for example coming from a store, the vehicle analyzes the presence of the user inside said region and starts the authorization process. On the other hand, when the already authorized user enters the authorization region coming from the activation region, the vehicle understands that the user is walking away from the vehicle and can automatically close the closure.

In a further development, after actuating the operating mechanism, the method further comprises a detecting step with detecting the presence of the user in the activation region and closing the vehicle closure of the vehicle in case the user is out of or leaving the activation region for a predetermined period of time, in case the presence of the user is detected moving away from the activation region (and entering into the closing region or authorization region), or in case the user is in the driving position. In this way, the vehicle closure is automatically closed only when it is determined that the user does not need the HFCO functionality anymore. For example, the vehicle is able to determine whether the user has finished loading or unloading the vehicle by verifying that the user has been outside the activation region for a predetermined period of time (because he/she walked away), by detecting the presence of the user at the steering wheel, or by detecting the user walking away from the activation region. In this case, the tailgate is automatically closed.
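
The automatic-closing decision can be summarized as a disjunction of the three situations just described (a hedged Python sketch; the 10-second timeout is an assumed tuning value, the disclosure only speaks of a predetermined period of time):

```python
def should_close(seconds_outside_activation: float,
                 moving_away: bool,
                 in_driving_position: bool,
                 timeout_s: float = 10.0) -> bool:
    """Close the vehicle closure when the user no longer needs it open:
    out of the activation region for a predetermined time, detected
    moving away from it, or seated in the driving position."""
    return (seconds_outside_activation >= timeout_s
            or moving_away
            or in_driving_position)
```

Any one situation suffices; as long as none holds, the closure remains open.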

Optionally, the activation region may comprise a loading region located in close proximity of the closure of the vehicle. For example, the loading region may be defined as a delimited area in proximity of the tailgate of the vehicle, where the user remains during the loading and unloading operations. In a further development, the method may further comprise the step of automatically closing the vehicle closure of the vehicle when the user is outside said loading region.

A second aspect of the present disclosure provides a system for automatically operating a vehicle closure.

The system comprises at least a sensor for detecting the presence of a user in an authorization region in the proximity to the vehicle and generating a sensor signal. The system further comprises a control unit within the vehicle for receiving the sensor signal and for receiving instructions by a user related to the activation of a HFCO functionality in the vehicle and related to conditions to operate the vehicle closure automatically. The system also comprises an actuator for controlling an operating mechanism of the vehicle closure of the vehicle.

In particular, the control unit is configured to analyze the sensor signal and to generate an activation signal based on the analyzed sensor signal provided that the conditions set to automatically operate the vehicle closure are satisfied and provided that the user is located in the activation region inside the authorization region and to send the activation signal to the actuator for operating the corresponding vehicle closure.
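The control unit's gating of the activation signal can be sketched as a small class (illustrative Python; the stand-in `Actuator`, the dictionary-shaped sensor signal and all names are hypothetical simplifications of the system described above):

```python
class Actuator:
    """Stand-in actuator that records whether it was commanded."""
    def __init__(self) -> None:
        self.operated = False

    def operate(self) -> None:
        self.operated = True


class ControlUnit:
    """Generates an activation signal only if the set conditions are
    satisfied AND the user is in the activation region."""
    def __init__(self, actuator: Actuator) -> None:
        self.actuator = actuator
        self.conditions_satisfied = False

    def analyze(self, sensor_signal: dict) -> None:
        # Stand-in analysis: a real control unit would run image,
        # voice, and motion processing on the raw sensor signal.
        self.conditions_satisfied = bool(sensor_signal.get("hands_occupied"))

    def update(self, user_in_activation_region: bool) -> None:
        if self.conditions_satisfied and user_in_activation_region:
            self.actuator.operate()  # send the activation signal
```

The sensor signal alone never actuates the mechanism; both gates must pass before the actuator is commanded, mirroring the two "provided that" clauses above.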

According to a particular configuration, the control unit can control more than one actuator at the same time, each actuator being coupled to a corresponding operating mechanism of a corresponding vehicle closure.

The control unit of the vehicle’s HFCO functionality can be a conventional computer, a programmable controller or any other suitable programmable device. The control unit can receive and store the information from the user regarding, for example, the set conditions to operate the vehicle closure automatically, and can also manage the functionality of the several components of the system, e.g. the detection sensors, the actuation of the operating mechanism, the elaboration of signals, etc.

In a particular configuration, the system may further comprise a transmitting and receiving unit for receiving a wireless signal from a portable remote-control unit associated with the user. The portable remote-control unit may comprise e.g. a key fob, a correspondingly configured smart device, such as a smartphone, or a corresponding application running on the smart device, configured for the user’s authentication. The transmitting and receiving unit can also be used to send and receive information regarding user settings, software updates, etc. between the vehicle or rather the control unit of the vehicle and an external server through, for example, a cloud platform. In this way, the configuration set by the user will not only be stored in the vehicle but also be uploaded into a cloud storage from which the user configuration can be downloaded again, e.g. when changing cars.

The sensor used in the system may be at least one of an optical sensor, such as a camera for capturing images and/or for recording videos in the authorization region, a microphone for capturing sounds in the authorization region, and/or a position and authorization sensor for detecting the presence of objects in the activation and authorization region and loading region. In particular, the system may comprise a combination of different types of sensors located at different places on the vehicle to differentiate the detection capabilities of the vehicle.

Additionally, the system may comprise a plurality of position and authorization sensors evenly distributed around the vehicle. The plurality of position sensors may possibly comprise one or more front position sensors, for example two front position sensors, on each external side of the vehicle, one or more center position sensors, for example two center position sensors, on each external side of the vehicle, one or more rear position sensors, for example one rear position sensor, on the external rear side of the vehicle and one or more internal position sensors, for example one internal position sensor, located inside the vehicle.

Additionally, the system may comprise a plurality of cameras evenly distributed around the vehicle. The plurality of cameras may possibly comprise at least one rear camera on the external rear side of the vehicle, at least two side cameras on each external side of the vehicle, and at least one front camera.

In a further development, the system may further comprise at least a light sensor for detecting the light intensity in the activation region and the authorization region. The light sensor may be located close to the optical sensor in order to detect the light conditions in its proximity. The light intensity or light brightness measured by the light sensor is analyzed by the control unit. If the measured value is below a predetermined threshold, it is considered that the optical sensor does not receive sufficient light to clearly detect the objects. In this case, the control unit sends a command signal to an actuator coupled to an illuminating element of the vehicle to turn on headlights, rear lights, or fog lights.

A third aspect of the present disclosure provides a vehicle comprising the system for automatically operating a vehicle closure according to the second aspect. The system may particularly be configured for carrying out the method according to the first aspect of the present disclosure. The vehicle of the third aspect can be an electric vehicle which may be an automobile, but may, in principle, also be any other kind of vehicle, such as an aircraft, watercraft, or rail vehicle. The vehicle may also be a vehicle having other types of drive or power supply, such as a conventional internal combustion engine or a fuel cell, or can be a hybrid vehicle having an electrical drive and a conventional combustion engine as well.

A fourth aspect of the present disclosure provides a computer program product comprising a computer program including instructions causing the control unit of the system according to the second aspect to carry out the steps of the method according to the first aspect. This computer program can run in the control unit described above.

A fifth aspect of the present disclosure provides a computer-readable medium having stored thereon the computer program product according to the fourth aspect. This computer-readable medium can be integrated in or separate from the control unit described above.

A sixth aspect of the present disclosure provides a data stream containing electronically readable control signals that are able to interact with the control unit of the system according to the second aspect in such a way that the control unit carries out a method according to the first aspect. For example, the data stream may be provided to the control unit from a server at the vehicle’s manufacturer site via a wireless connection of the vehicle to the internet.

Brief Description of the Drawing Figures

Other advantages, features, and details of the present disclosure arise from the following description, in which exemplary implementations are described in detail with reference to drawings. The features described in the claims and in the description may be relevant to the present disclosure individually or in any combination. Likewise, the features mentioned above and below can each be used individually or collectively in any combination. Functionally similar or identical parts or components are in some cases labelled with the same reference symbols. The terms “left”, “right”, “up”, and “down” used in the description of the exemplary implementations relate to the drawings in an orientation with the legends or reference characters legible in the normal fashion. The implementations shown and described are not to be taken as exhaustive but serve as examples for explaining the present disclosure. The detailed description is for the information of those of ordinary skill in the art, which is why known structures and methods are not shown or explained in detail, to avoid complicating the understanding of the present description.

Figure 1 shows a flow chart illustrating steps for automatically operating the vehicle closure of a vehicle.

Figure 2 shows a schematic representation of the system for automatically operating the vehicle closure of a vehicle.

Figure 3 shows a flow chart illustrating steps of Fig. 1 in detail.

Figure 4 is a block diagram describing the employment of user’s preferences.

Figure 5 is a software logic diagram of the method for automatically operating the vehicle closure of a vehicle.

Figure 6 illustrates a top view of a vehicle describing the operating mechanism of the tailgate.

Figure 7 illustrates a top view of a vehicle describing the distribution of position sensors.

Figure 8 illustrates a top view of a vehicle describing the distribution of optical and ultrasonic sensors.

Detailed Description of exemplary implementations

Figure 1 illustrates the steps of the here proposed method for automatically operating a vehicle closure in the system of Figure 2. In particular, the method comprises a configuration step S100, wherein the hands-free closure-operating (HFCO) functionality is activated manually by the user or automatically by the vehicle V or rather by the control unit 30 located in the vehicle V, for example by turning on the GPS. In the configuration step S100, the user also sets the conditions to operate the vehicle closure automatically. In the detecting step S200, the presence of an authorized user is detected by means of at least a sensor 20 (position sensor, optical sensor, ...) or by receiving a wireless signal. A sensor signal is generated when the user is located inside an authorization region AUR in the proximity of the vehicle V.

The method further comprises an analyzing step S300 for analyzing the sensor signal based on the conditions set in advance by the user. If the (set or preset) conditions to operate the vehicle closure automatically are satisfied, the sensor signal triggers the generation of the activation signal in the generating step S400 provided that the user has entered an activation region.

Upon receiving the activation signal, the operating mechanism 60 coupled to the vehicle closure of the vehicle V is actuated in the actuating step S500.

As further illustrated in Figure 2, the system 10 for automatically operating a vehicle closure of a vehicle V comprises one or more sensors 20, which can be of different nature, a control unit 30 for controlling all the elements of the system 10, and an actuator 40 for actuating the operating mechanism 60 of the vehicle closure of the vehicle V. The control unit 30 is the core of the system 10; it receives all the information from the user regarding the conditions to operate the vehicle closure automatically and interacts with the other components of the system 10. Each sensor 20 generates a sensor signal that is sent to the control unit 30. Based on the conditions set by the user, the control unit 30 analyzes the sensor signals and determines whether the conditions are satisfied or not. If the conditions are satisfied, the control unit 30 generates an activation signal that is sent to the actuator 40 for actuating the operating mechanism 60 coupled to the closure.

It is noted that the sensor signal per se generated by the sensor 20 does not directly allow the actuation of the operating mechanism. The sensor signal needs to be analyzed first by the control unit 30 as a function of the conditions to operate the vehicle closure automatically. Only if one or more of the conditions set in advance by the user are satisfied and the user is in the activation region ACR does this sensor signal trigger the generation of an activation signal used to command the actuator 40. The system 10 can additionally comprise a transmitting and receiving unit or transceiver device 50 for receiving and transmitting signals between the vehicle V and an external device.

Figure 3 shows a detailed flow chart of the method steps of Figure 1.

In the configuration step S100, the user can activate the HFCO functionality manually in step S110 by means of a user interface element (touchscreen, gesture, voice, push button, key, or the like) or the HFCO functionality can be activated automatically in step S120, for example by turning on the GPS of the vehicle V. Also in step S100, the user sets the conditions to operate the vehicle closure automatically. This can be carried out by selecting one or more conditions from a list of possible conditions, for example on a touch screen display present in the vehicle V, in step S130. The selection can also be sent to the vehicle V by the user wirelessly using an external device, for example a corresponding smartphone application communicating with the control unit 30 or the transceiver 50 of the system 10. The selection is then stored in the control unit 30 and/or online in the user account. Once the HFCO functionality is activated, the vehicle is prepared to operate the vehicle closure automatically and continually checks whether the conditions set in advance by the user are satisfied.

In the detection step S200, one or more sensors 20 are activated to examine the surroundings of the vehicle V in step S210. Based on the number and location of the sensors 20 on the vehicle V, an authorization region AUR is defined in step S220 in proximity of the vehicle V. At this point, each object or body detected in this region AUR is analyzed in step S230 to determine whether said object or body is the user. In case the user is not detected, the vehicle V continues to examine the surroundings and the vehicle closure remains closed in step S240. Only if the authorized user is detected are the function and the sensors 20 activated (step S250). For example, the user can be detected by face recognition or by detecting a wireless signal for authorization using a key fob or another transmission device such as a smartphone. The corresponding sensor signals generated by the sensors 20 are sent to the control unit 30 in step S250.

In the analyzing step S300, the control unit 30 receives the sensor signals (motion detection/gestures/voice/acoustic signals/approach direction ...) and carries out an analysis of the signals based on the predefined conditions to operate the vehicle closure in step S310. The control unit 30 determines in step S320 whether the conditions set in advance are satisfied. If these conditions are not satisfied, the vehicle closure remains closed in step S330. In this case, the HFCO functionality can be automatically deactivated in the optional step S340. On the other hand, if the conditions are satisfied, an activation signal can be generated in the control unit 30 if the user is detected in the activation region ACR located inside the authorization region AUR. The sensors 20 can be used for this purpose.

In particular, in the generating step S400, the sensor signal triggers the generating step of generating an activation signal by the control unit 30 in step S410 and is sent to the actuator 40 in step S420.

As an additional step, before operating the closure, the method can comprise the step S600 of detecting possible hindrances to operating the closure. In step S610, the control unit 30 determines whether obstacles to operating the vehicle closure are present. This is carried out by analyzing sensor signals of dedicated sensors 20, such as ultrasonic, optical, or other proximity sensors. If obstacles are detected, the vehicle closure remains closed in step S620.

On the other hand, if no obstacles are detected, the method proceeds further with the actuating step S500.
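
The obstacle gate of steps S600/S500 can be summarized in a few lines (an illustrative Python sketch; the state strings are assumptions used only for this example):

```python
def closure_state(activation_signal: bool, obstacle_detected: bool) -> str:
    """Steps S600/S500 in miniature: an obstacle in the closure's swing
    zone vetoes actuation even when an activation signal is present."""
    if activation_signal and not obstacle_detected:
        return "open"    # S510/S520: actuator receives the signal, mechanism runs
    return "closed"      # S620 (obstacle present) or no activation signal
```

An activation signal is thus necessary but not sufficient: the swing zone must also be clear.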

It is noted that the actuating step S500 can directly follow the generating step S400, without passing through the detection of possible obstacles. In the actuating step S500, the activation signal is received by the actuator 40 in step S510 and the operating mechanism is activated in step S520 to operate the vehicle closure of the vehicle.

The method can additionally comprise the step S700 for automatically closing the vehicle closure of the vehicle V. In this step, using the same approach as for detecting the presence of the user in the authorization region AUR and/or in the activation region ACR, it is detected in step S710 whether the user is outside the activation region ACR and/or the authorization region AUR. Alternatively, or additionally, it is detected in step S720 whether the user is located inside the vehicle. As a further alternative, the vehicle closure is automatically closed if the user moves away from the respective closure within the activation region ACR, for example if the user leaves a loading region LR located in proximity of the closure. If none of these situations is verified, the vehicle closure remains open in step S730. On the other hand, if at least one of these situations is verified, i.e. the user is away from the activation region ACR or the authorization region AUR or is inside the vehicle V, the vehicle closure automatically closes in step S740.

In addition to the aforementioned method steps, the method further comprises the step S800 for measuring the light conditions surrounding the vehicle V. In this step, the method comprises determining in step S810 whether the light intensity measured by a dedicated light sensor is below a predetermined threshold value. If the measured value is below the threshold, at least an external illuminating element of the vehicle is turned on in step S820. On the other hand, if the measured value is above the threshold, the external illuminating elements of the vehicle remain turned off in step S830. It is noted that measuring the light conditions is carried out during each step of the method described above and that the illuminating elements of the vehicle are activated only if necessary.

Figure 4 describes how the user’s preferences are employed for carrying out the method for automatically controlling the operating of a vehicle closure of the vehicle V. The first block 72 illustrates the storing and the input of the user preferences UP in the control unit 30, wherein the user preferences UP comprise the activation of the HFCO functionality and the setting of the conditions to automatically operate the vehicle closure of the vehicle V. User account UA settings can be stored directly and locally in the vehicle V. Alternatively, or in addition, these settings can be synchronized with a cloud-based user account UA, i.e. user account UA settings can be stored on a cloud server connected to and accessible via the internet. In this way, the settings and the behavior of the vehicle V can be automatically transferred from one vehicle to another compatible vehicle, e.g. a different vehicle of the same vehicle type. The access to the cloud-based data storage of the user account UA may be established by the vehicle via antenna 52, transceiver 50 and security gateway 54. The local user data storage 56 is present in the first block 72.
The user preferences UP can be sent via an onboard menu, two-directional speech logic or by any other means through the vehicle’s interfaces or through the user account UA. The second block 74 illustrates different sensors 20 that can be used as input means based on the user preferences UP. In particular, the sensors 20 comprise at least one of a camera or rear camera 201, a sensor related to the GPS 202, a microphone 203, a positional sensor 204, parking sensors 205, and a light sensor 206. The sensor signals received by the sensors 20 are analyzed by the control unit 30 together with the user preferences UP to actuate (301) the operating mechanism 60, for example to power the liftgate system, and also to activate (302) the external illuminating elements (e.g. reverse lighting, puddle lights or headlights), if necessary.

Figure 5 illustrates in a block diagram a software logic of the method for automatically operating the vehicle closure of a vehicle V.

The method starts at step S901 when the user decides to park the vehicle V and get out of the vehicle V. The vehicle V automatically locks itself in step S902 if the user walks out of a predefined access zone (activation region ACR), that is, a region extending outside the vehicle for a predetermined distance, for example between 1 m and 10 m, from the driving position. The vehicle V or rather the control unit 30 checks in step S903 whether the HFCO functionality is deactivated. In case the HFCO functionality is deactivated, the method ends in step S904.
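
The auto-lock of step S902 reduces to a distance comparison against the access zone (an illustrative Python sketch; the 3 m default is an assumed value within the 1 m to 10 m range stated above):

```python
def vehicle_locks(distance_from_driving_position_m: float,
                  access_zone_m: float = 3.0) -> bool:
    """S902: the vehicle locks itself once the user has left the
    predefined access zone around the driving position."""
    return distance_from_driving_position_m > access_zone_m
```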

On the other hand, if the HFCO functionality is not deactivated, the vehicle V or rather the control unit 30 checks the user preferences UP in step S905. The user preferences UP are related to the automatic activation of the HFCO functionality as well as the conditions to operate the vehicle closure automatically.

The preferences described in Fig. 5 are possible examples which can be combined with each other and should not be considered as limiting features of the method. The user preferences UP can be based on the position of the vehicle V in places where the HFCO functionality is likely to be used 91. As an example, if the vehicle V enters a supermarket parking lot, the HFCO functionality is automatically activated based on GPS data indicating a location that is prone to lead to a situation where the user will benefit from the HFCO functionality (step S906).

The user preferences UP can also be based on preferred locations of the user 92. Here, the HFCO functionality is automatically activated once the vehicle reaches preferred GPS locations set up by the user (step S907).

In addition, the user preferences UP can be based on a learning algorithm 93. In step S908, the HFCO functionality is automatically activated when the vehicle V reaches places where, after a learning process, it is determined that the user will benefit from an automatic operating of the closure. For example, while collecting usage data for the individual user in the user account UA, the control unit 30 can activate the HFCO functionality in locations where the user has typically opened the vehicle closure (e.g. the tailgate) in past usage. More specifically, this could be connected to places where the user employed a kick sensor, if present, or where the user frequently activated the HFCO functionality manually.

Once the HFCO functionality is activated and based on these three user preferences (91, 92, 93), the user can return to the vehicle V and enter the authorization region AUR in step S909. By means of a dedicated sensor 20, for example a position sensor 204, it is determined in step S910 whether the user is approaching the vehicle V from a particular direction, for example from behind. The sensors are distributed evenly around the vehicle V. In step S911, a different sensor 20, for example the rear camera 201, can be used to collect an image. This image is matched with the user’s position as determined by the position sensor 204 and a motion scenario is analyzed in the control unit 30. An image processing algorithm determines whether (set or preset) conditions to operate the vehicle closure automatically (e.g. hands occupied, pushing a shopping cart ...) are satisfied. In step S912, parking sensors 205 are used to detect the presence of any object obstructing the vehicle closure swing zone. If no object is detected, the vehicle closure is opened in step S913 if the user is detected in the access zone or activation region ACR.
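
One simple realization of the learning preference 93 is frequency counting per location (a hedged Python sketch; the threshold of three past openings is an assumed tuning parameter, and locations are abstracted to hashable keys rather than GPS coordinates):

```python
from collections import Counter

class UsageLearner:
    """Activate the HFCO functionality at locations where the user has
    often opened the tailgate in past usage (threshold assumed)."""

    def __init__(self, min_openings: int = 3) -> None:
        self.openings = Counter()
        self.min_openings = min_openings

    def record_tailgate_opening(self, location) -> None:
        # called whenever the user opens the closure, e.g. via kick
        # sensor or manual HFCO activation, at the current location
        self.openings[location] += 1

    def should_activate_hfco(self, location) -> bool:
        return self.openings[location] >= self.min_openings
```

A production system would cluster nearby GPS fixes into one location and sync the counters with the cloud-based user account; only the counting idea is shown.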

Optionally, in step S914 the vehicle closure automatically closes if the user is not detected any more in the proximity of the vehicle closure or if the user is detected inside the vehicle and the method ends in step S915.

The user preferences UP can further be based on a gesture control of the user 94 or on a voice control of the user 95. Once the HFCO functionality is activated and based on these two user preferences (94, 95), the user can return to the vehicle V and enter the authorization region AUR in step S916. After a gesture or a voice pattern is recognized through a dedicated sensor (a rear camera 201 or a microphone 203) in step S917, the method proceeds directly to step S912 to detect the presence of obstacles before operating the closure. It is noted that the gesture and voice control preferences 94, 95 can be combined with the user preferences 91, 92, 93 related to the GPS, so that after the user enters the authorization region AUR in step S916, the method further proceeds to step S910.

The user preferences UP can furthermore be based on a one-time activation 96. The HFCO functionality can be activated by the user through a remote command in step S918. Based on this user preference (96), the user can return to the vehicle V and enter the authorization region AUR in step S919. After the detection of the user in the authorization region AUR, the method proceeds directly to step S912 to detect the presence of obstacles before operating the closure.

During the entire method described above, the light intensity of the region surrounding the vehicle V is measured by a light sensor in step S920. If it is determined that the surrounding is too dark for a camera to clearly distinguish the user or a movement of the user in a captured image, the external illuminating elements of the vehicle are turned on.

In a possible scenario, a car drives into a supermarket parking lot. The car recognizes the position and prepares for the driver of the vehicle to return with a shopping cart or with hands loaded with shopping bags. When the driver gets out of the car and walks away, for example to a distance of 3 m, the vehicle locks, the closure handles retract, the mirrors fold, and the turn signals light up to indicate the locked status. When the user returns to the vehicle and enters the authorization region AUR with hands loaded with shopping bags or pushing a shopping cart, the turn signals indicate that the car is unlocked, the vehicle closure handles deploy, the mirrors unfold, and the user position is identified through the key fob signal. The rearview camera checks whether the user is pushing a shopping cart or has their hands loaded with shopping bags and, when the user enters the activation region ACR, the tailgate is opened automatically if one of these situations is recognized.

Figure 6 illustrates a top view representation of a vehicle V and the operating mechanism of the tailgate T for a standard vehicle V. The vehicle V usually comprises an external liftgate switch 601 that is configured for only operating the tailgate T from outside and a shutface switch 602 that is configured for only closing the tailgate T from outside. Also, the vehicle usually comprises an internal liftgate switch 603 for operating the tailgate T from inside the vehicle V and, optionally, a kick sensor 207 for opening and closing the tailgate T from outside the vehicle V. When a key fob 501 is detected within the authorization region AUR, the switches for opening and closing the tailgate T are activated. Therefore, the user has the possibility to operate the tailgate T manually using the external liftgate switch 601 or the internal switch 603 and can close the tailgate T using the shutface switch 602. In addition, if present, the user can operate, i.e. open and/or close, the tailgate T using the kick sensor 207. The authorization region AUR is a region surrounding the vehicle V and extending from the vehicle for a predetermined distance d.

Figures 7 and 8 illustrate a top view representation of a vehicle V and an intuitive operating mechanism of the tailgate T according to the method of the present disclosure. Figure 7 shows the even distribution of position sensors 204 on the vehicle V, wherein the position sensors 204 comprise two front position sensors 204F on each external side of the vehicle V, two center position sensors 204C on each external side of the vehicle V, one rear position sensor 204R on the external rear side of the vehicle V and one internal position sensor 204I located inside the vehicle V.

Figure 8 illustrates the distribution of the cameras 201, of the microphone 203, of the light sensor 206 and of the parking sensors or ultrasonic sensors 205. In particular, the cameras 201 comprise one to two rear cameras 201R, two side cameras 201S and one to two front cameras 201F.

Also, Figures 7 and 8 illustrate with dotted lines a schematic representation of the electrical connection between the control unit 30 located inside the vehicle V and the different sensors (position sensors 204, cameras 201, microphone 203, light sensor 206, ultrasonic sensors 205) arranged inside and outside the vehicle V. For the sake of clarity in the drawings, the position sensors 204 of Figure 7 are not shown in Figure 8. However, it should be clear that all the sensors described in Figure 7 and Figure 8 are present on the same vehicle V at the same time.

When, for example, the movement of the key fob 501 is detected by at least one of the sensors 204, the control unit 30 located inside the vehicle V checks whether the (set or preset) conditions to automatically operate the tailgate T are satisfied. By combining the sensor signals generated by the position sensors 204, which detect the position of the user around the vehicle V, with those of the cameras 201, which determine whether the hands of the user are free to operate the tailgate T and track the movements of the user relative to the vehicle V, the control unit 30 can evaluate whether the conditions set in advance by the user are satisfied and consequently operate the tailgate T.
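The fusion of position and camera signals described above reduces to a conjunction of conditions. The sketch below is illustrative only; the activation radius and the boolean inputs are hypothetical stand-ins for the actual sensor signals of the position sensors 204 and cameras 201.

```python
def conditions_satisfied(distance_m, hands_free, moving_towards_tailgate,
                         activation_radius_m=2.0):
    """Illustrative fusion of position-sensor and camera signals.

    The tailgate T is operated only when the user is close enough
    (hypothetical radius `activation_radius_m`), does NOT have free
    hands, and is moving towards the tailgate.
    """
    return (distance_m <= activation_radius_m
            and not hands_free
            and moving_towards_tailgate)
```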

As an example, Figure 7 also illustrates the authorization region AUR, the closing region CR and the activation region ACR surrounding the vehicle V. Specifically, this Figure illustrates the border of the activation region ACR with a dashed line, the border of the closing region CR with a dash-dotted line and the border of the authorization region AUR with a solid line.

Also, Figure 7 shows a loading region LR located in proximity of the tailgate T. Since the position sensors 204 are uniformly distributed on the vehicle V, the activation region ACR, the closing region CR and the authorization region AUR surround the entire vehicle uniformly as well. In particular, the activation region ACR is a continuous region extending all around the vehicle for approximately the same first distance d1. For example, the first distance d1 can be between 1 m and 9.5 m. The closing region CR is a region adjacent to the activation region ACR with a minimum dimension d3 of 0.5 m in order to offer enough hysteresis between the operating/activation of the HFCO functionality/unlocking on the one hand and the closing/locking of the vehicle on the other.

The position sensors 204 can be employed as a sensing system for determining vehicle access. Two or more position sensors 204 are used to calculate the distance between the user (or rather the key fob 501) and the vehicle V. The sensors 204 authorize and prepare access to the vehicle V, for example by deploying the vehicle closure handles, activating the HFCO functionality and powering all needed components and systems. In particular, vehicle access is granted when the user steps into the authorization region AUR. In fact, once the user enters the authorization region AUR, the vehicle V, or rather the control unit 30, starts a process for authorizing and validating the identity of the user to access the vehicle V. If the user is authorized, the vehicle V starts analyzing whether the (set or preset) conditions to operate the vehicle closure automatically are satisfied. This means that the vehicle V can detect images and/or analyze the user's movements relative to the closure. If the conditions are satisfied, the closure opens automatically once the user enters the activation region ACR. The closure can be closed manually by the user after finishing the loading and unloading operations, or automatically, when the user for example walks away from the activation region ACR and enters the closing region CR. The presence of a closing region CR serves to create enough hysteresis between the open or unlocked status and the closed or locked status and to avoid switching back and forth between the locked and unlocked status, for example when the user is at the border of the activation region ACR. Alternatively, the closure is automatically closed when the user walks away from the vehicle V for a threshold distance, for example at least 5 meters, or when the user simply leaves the loading region LR. Also, the closure can be automatically closed when the user is detected inside the vehicle V.
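The region layout and the hysteresis behavior can be sketched as a small state machine over the user's distance from the vehicle. The numeric defaults below are hypothetical example values chosen within the ranges given in the description (d1 between 1 m and 9.5 m, d3 at least 0.5 m); the outer authorization distance is likewise an assumption.

```python
def classify_region(distance_m, d1=2.0, d3=0.5, d_auth=6.0):
    """Classify the user's distance from the vehicle into the regions
    of Figure 7.  All numeric defaults are illustrative examples."""
    if distance_m <= d1:
        return "ACR"      # activation region: closure may open
    if distance_m <= d1 + d3:
        return "CR"       # closing region: hysteresis band
    if distance_m <= d_auth:
        return "AUR"      # authorization region: access is prepared
    return "outside"


def closure_command(previous_state, region):
    """Hysteresis between opening and closing: the closure opens in
    the ACR and only closes again once the user has left the ACR and
    reached the closing region CR or beyond."""
    if region == "ACR":
        return "open"
    if previous_state == "open":
        return "close"    # user left the ACR: CR, AUR or outside
    return previous_state
```

Because the closing region CR lies strictly outside the activation region ACR, a user standing at the ACR border does not toggle the closure back and forth, which is exactly the hysteresis effect described above.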

The rear camera 201R is a camera suitable for detecting a gesture or a motion, or for face recognition of the user. Two rear cameras 201R can be used to evaluate the distance of an object or a body from the vehicle V. The activation of the camera 201R can be performed by a suitable image processing algorithm as part of an analyzing algorithm when the user approaches the vehicle V from behind or generally approaches the tailgate T area. Typically, the position and distance of the user will be sensed by the position sensors around the vehicle. The image processing algorithm can be employed to isolate the image of the authorized user by analyzing the positional and proximity data derived from the sensing system for determining vehicle access. The analyzing algorithm analyzes the user's situation to understand whether the closure, in this case the tailgate T, should be opened automatically, for example in case both hands of the user are occupied carrying goods, or a shopping cart is pushed towards the vehicle V, or the authorized user is in proximity of the vehicle and a dog is approaching from behind.
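The disclosure does not specify how two rear cameras 201R evaluate distance; one common technique for a calibrated stereo pair is triangulation from disparity, sketched below. The parameter values in the test are illustrative and not taken from the disclosure.

```python
def stereo_distance(focal_length_px, baseline_m, disparity_px):
    """Estimate object distance from a calibrated stereo camera pair
    using the standard triangulation relation Z = f * B / d, where f
    is the focal length in pixels, B the camera baseline in meters
    and d the pixel disparity between the two images.

    This is a generic computer-vision sketch, not the specific
    algorithm of the disclosed system.
    """
    if disparity_px <= 0:
        raise ValueError("object must produce a positive disparity")
    return focal_length_px * baseline_m / disparity_px
```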

The microphone 203 can be used to detect any noise outside the vehicle V and to transmit the detected data to the control unit 30 for further analysis, e.g. by a suitable sound processing algorithm. The tailgate T can be opened by a voice command of the user or by any other type of noise (e.g. whistle) set in advance in the conditions to operate the closure.

The ultrasonic sensors 205 are located in proximity of the tailgate T and can represent the parking sensors of the vehicle. Additionally, these sensors can be used to detect the presence of obstacles close to the vehicle before automatically operating the tailgate T.

The light sensor 206 is located close to the camera sensors, for example the rear camera 201R, for measuring the light intensity in its vicinity. In case the surrounding lighting is not sufficient for a suitable motion detection algorithm to work correctly, reversing lights or alternative light sources around the vehicle V can be switched on to increase the brightness of the surroundings and improve the detection of images from the cameras 201.

The above detailed description only illustrates certain exemplary implementations of the present disclosure and is not intended to limit the scope of the present disclosure. Those of ordinary skill in the art understand the description as a whole, so that technical features described in connection with the various implementations can be combined into other implementations understandable to those of ordinary skill in the art. Also, any equivalent or modification of the described implementations, as well as combinations thereof, does not depart from the spirit and principle of the present disclosure and falls within the scope of the present disclosure as well as of the appended claims. As such, provided that these modifications and variants fall into the scope of the claims and equivalent technologies thereof, it is intended to embrace them within the present disclosure as well.