

Title:
MANAGEMENT METHOD AND VEHICLE
Document Type and Number:
WIPO Patent Application WO/2023/166430
Kind Code:
A1
Abstract:
A method comprises the following steps: providing a vehicle (1) comprising a dashboard (3) comprising at least a first interface (4A) and a second interface (4B), the second interface (4B) occupying a central position on the dashboard (3), the vehicle comprising at least one control unit (2) in communication with the first and the second interface (4A, 4B), the second interface (4B) being a user input interface; selecting a first animation (10) relating to a first function (F1) in the second interface (4B); dragging the first animation (10) towards the first interface (4A) by a dragging gesture on the second interface (4B); sending a reproduction of the first animation (10) to the first interface (4A) as a function of the first animation (10) selected and of the drag gesture; displaying the reproduction (15) of the first animation (10) on the first interface (4A).

Inventors:
GAGLIONE DAVIDE (IT)
PROGLIO LUCA (IT)
Application Number:
PCT/IB2023/051885
Publication Date:
September 07, 2023
Filing Date:
March 01, 2023
Assignee:
AUTOMOBILI LAMBORGHINI SPA (IT)
International Classes:
B60K35/00; B60K37/06
Foreign References:
DE102015216108A12017-03-02
EP2018992B12010-11-17
DE102015204978A12016-09-22
Attorney, Agent or Firm:
PUGGIOLI, Tommaso (IT)
Claims:
CLAIMS

1. A method for managing the infotainment functions of a vehicle, comprising:

- providing a vehicle (1) comprising a dashboard (3) comprising at least a first interface (4A) and a second interface (4B), the second interface (4B) occupying a central position on the dashboard (3), the vehicle comprising at least one control unit (2) in communication with the first and the second interface (4A, 4B), the second interface (4B) being a user input interface;

- selecting, in the second interface (4B), a first animation (10) relating to a first function (F1);

- dragging the first animation (10) towards the first interface (4A) by a dragging action performed on the second interface (4B);

- sending to the first interface (4A) a second animation (15) processed as a function of the first animation (10) selected and of the dragging action;

- displaying the second animation (15) on the first interface (4A).

2. The method according to the preceding claim, wherein the step of sending to the first interface (4A) a second animation (15) as a function of the first animation (10) selected and of the dragging action comprises a step of sending to the first interface (4A) a first signal (S1) processed by the control unit (2) as a function of the first animation (10) selected and of the dragging action.

3. The method according to claim 1 or 2, comprising a step of replacing, on the first interface (4A), a previous animation (0) with the second animation (15).

4. The method according to any one of the preceding claims, wherein the step of selecting, on the second interface (4B), a first animation (10) relating to a function (F1) comprises a step of selecting the first animation (10) from a menu relating to the function (F1).

5. The method according to any one of the preceding claims, comprising:

- moving an internal portion of the first animation (10) into the second interface (4B);

- sending, to the first interface (4A), a movement signal as a function of a movement of the internal portion of the first animation (10);

- moving an internal portion of the second animation (15) into the first interface (4A) as a function of the movement signal.

6. The method according to any one of the preceding claims, comprising a step of moving the first animation (10) close to an edge (6) of the second interface (4B) and highlighting at least a portion (12) of the first animation (10) when the first animation (10) is close to the edge (6) of the second interface (4B).

7. The method according to the preceding claim, comprising a step of moving the first animation (10) close to an edge (6) of the second interface (4B) and highlighting at least a portion (7) of the edge (6) in proximity to the first animation (10).

8. The method according to any one of the preceding claims, wherein the vehicle comprises a third interface (4C), the method comprising:

- dragging the first animation (10) towards the third interface (4C) by a dragging action performed on the second interface (4B);

- sending, to the third interface (4C), a third signal (S3) processed by the control unit (2) as a function of the first animation (10) selected and the dragging action;

- sending a third animation (15’) to the third interface (4C) as a function of the third signal (S3);

- displaying the third animation (15’) on the third interface (4C).

9. A vehicle comprising: a dashboard (3) comprising at least a first interface (4A) and a second interface (4B) configured to prepare for selecting a first animation (10) and to allow a user to select the first animation, the first interface (4A) being positioned in a side portion of the dashboard (3) and the second interface (4B) being positioned in the centre of the dashboard (3), the vehicle comprising at least one control unit (2) in communication with the first and the second interface (4A, 4B), and being characterized in that the control unit (2) is configured to send a first signal (S1) to the first interface (4A) as a function of the first animation (10) selected and of a dragging action performed on the second interface (4B), the first interface (4A) being configured to display a second animation (15) as a function of the first signal (S1).

10. The vehicle according to the preceding claim, wherein the control unit (2) is configured to send a movement signal to the first interface (4A) as a function of a movement, performed by a user, of a portion of the first animation (10) into the second interface (4B), the first interface (4A) being configured to display a movement of a portion of the second animation (15) as a function of the movement signal.

11. The vehicle according to claim 9 or 10, wherein the control unit (2) is configured to highlight at least a portion (12) of the first animation (10) and/or is configured to highlight at least a portion (7) of an edge (6) of the second interface (4B).

12. The vehicle according to any one of claims 9 to 11, wherein the control unit (2) is configured to replace, on the first interface (4A), a previous animation (0) with the second animation (15).

13. The vehicle according to any one of claims 9 to 12, comprising a steering wheel (5) having a steering wheel portion (5’, 5”) and at least one control (C1, C2) located in the steering wheel portion (5’, 5”), wherein the first interface (4A) comprises an interface portion (4A’, 4A”) in proximity to the steering wheel portion (5’, 5”) and wherein the control unit (2) is configured to show on the interface portion (4A’, 4A”), after the dragging action performed on the second interface (4B), a category (A1, A2) of animations related to the control (C1, C2).

14. The vehicle according to any one of claims 9 to 13, comprising a third interface (4C), the control unit (2) being configured to send, to the third interface (4C), a third signal (S3) as a function of the first animation (10) selected and the dragging action performed on the second interface (4B), the third interface (4C) being configured to display a third animation (15’) as a function of the third signal (S3).

Description:
DESCRIPTION

MANAGEMENT METHOD AND VEHICLE

Technical field

This invention relates to a method, specifically a method for managing infotainment functions in a vehicle and to a vehicle which integrates a system for managing the infotainment functions.

Background art

Known in the prior art are vehicles provided with an at least partly digital dashboard (called virtual cockpit) where the driver can display numerous categories of functions (and information) on a display mounted in the housing of the instrument panel.

For example, besides the speedometer and rev counter, the virtual cockpit allows displaying the satellite navigator, the on-board computer or audio system information, thus ensuring greater reading comfort and driving safety.

In prior art solutions, the possibility of modifying the virtual cockpit settings, to adopt a favourite screen, for example, is managed directly by the driver via controls which are usually located on the multi-function steering wheel or a touch screen which constitutes the main interface of the infotainment system of the vehicle.

The infotainment screen is usually centrally mounted on the vehicle dashboard and integrates and/or replaces traditional controls.

The infotainment screen can be used to manage a variety of different functions such as, for example, the stereo system, the satellite navigator, applications/widgets and everything regarding connectivity with smartphones or tablets.

The actions that the user must perform to manage the functions (and information) between the virtual cockpit and the infotainment display are complex and potentially dangerous.

To display the different screens and applications in the cockpit, the driver generally has to select the function of interest and confirm the selection with one or more actions using the touch screen or physical buttons located on the main dash tunnel of the vehicle.

These operations, most of which can be carried out while the vehicle is moving, can make managing the functions of the vehicle particularly complicated and confusing.

The passenger also has access to the screen or to the physical buttons on the dashboard used to manage the different functions and, on account of the complexity of the operations, can perform actions that interfere with the driver.

This exposes the driver, passengers and other road users to potentially dangerous situations.

There is therefore a need to simplify the modes of managing the infotainment functions of the vehicle.

Aim of the invention

The aim of this disclosure is at least to meet the above-mentioned need by providing a particularly simple method for managing the infotainment functions in a vehicle and a vehicle to integrate a system for managing the infotainment functions.

At least said aim is achieved by the invention as characterized in the independent claims.

According to an aspect of it, this disclosure relates to a method, in particular a method for managing infotainment functions in a vehicle.

The method comprises a step of providing a vehicle essentially comprising a dashboard comprising at least a first interface and a second interface and at least one control unit in communication with the first and the second interface.

The term "dashboard" is used to denote the set of components situated at the front of the vehicle interior, facing the driver and the passengers, and in some cases in what is known as the "dash tunnel", to perform a variety of functions, for example, for aesthetic, structural, safety and information purposes.

The second interface occupies a central position on the dashboard.

The second interface is a user input interface.

The second interface comprises a touch screen and allows the user to select the different vehicle infotainment functions.

For convenience, the second interface is sometimes referred to as "input interface" in this disclosure.

The first interface is, for example, a screen located in front of the vehicle driver and defining a virtual dashboard or virtual cockpit for the driver.

The first interface is therefore a display interface for the driver-user.

For convenience, the first interface is sometimes referred to as "display interface" in this disclosure.

The method comprises a step of selecting from the input interface a first animation relating to a first function.

The term "functions" is used to refer to the multiple multimedia contents/utilities of the vehicle, such as, for example, the satellite navigator, the audio functions, connectivity applications, widgets and so on.

The first animation may be a road navigation map, an audio player popup menu, a contact from the phone contact list and so on.

The method comprises a step of dragging the first animation in the direction of the display interface by a dragging gesture on the input interface.

In other words, the dragging gesture, or swipe, is an action performed by the user on the touch screen of the input interface towards the display interface.

Advantageously, being able to drag the first animation with a single swipe gesture to transfer it from the input screen, positioned centrally on the dashboard, to the driver's display screen considerably simplifies the operations which the user needs to perform to manage the infotainment functions.

Thanks to the single swipe, the user no longer needs to follow a lengthy selection procedure in response to different questions, at times even on different screens of the input interface or of the display interface, which used to make it particularly complicated for a user to manage the relation between the infotainment interface and the virtual cockpit.

Advantageously, being able to control the communication between the display interface and the input interface with a single swipe/dragging gesture makes it easier to manage the multimedia contents and allows making the most of all the functions and capabilities of the vehicle quickly and intuitively.

If the display interface is positioned on the left of the input interface, the user will make a swipe gesture (on the input interface) towards the left.

If the display interface is positioned on the right of the input interface, on the other hand, the user will make a swipe gesture (on the input interface) towards the right.
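The layout-dependent swipe direction described above can be sketched as follows; the function name, layout dictionary and interface labels are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch: map a swipe direction made on the central input
# interface to the target interface, given the dashboard layout.
# All names here are hypothetical.

def resolve_swipe_target(swipe_direction, layout):
    """Return the interface the dragged animation should reach.

    swipe_direction: "left" or "right"
    layout: dict mapping "left"/"right" of the input interface
            to an interface identifier.
    """
    return layout.get(swipe_direction)

# Driver's display interface on the left, passenger interface on the right.
layout = {"left": "display_interface", "right": "passenger_interface"}
assert resolve_swipe_target("left", layout) == "display_interface"
assert resolve_swipe_target("right", layout) == "passenger_interface"
```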

The method comprises a step of sending to the first interface a second animation processed as a function of the first animation selected and of the swipe gesture.

The second animation is preferably at a fixed position on the display interface.

The display interface preferably has a rev counter, positioned centrally and defining a right-hand portion and a left-hand portion of the interface.

The second animation is situated, for example, in the right-hand portion or in the left-hand portion of the interface.

Preferably, the second animation may be a reproduction of the first animation; that is to say, the second animation copies the shape and multimedia content of the first animation, preferably in a smaller size.

The second animation may be, for example, a schematic representation of the first animation, which only partly reproduces the information content of the first animation.

In other examples, the second animation may be an animation that represents the first animation or a symbolic representation thereof.

The method comprises a step of displaying the second animation in the first interface.

Advantageously, the user can display at least part of the content of the infotainment screen directly in the virtual cockpit without losing attentiveness to look at the dashboard and/or without using any special controls.

The step of sending a second animation to the display interface as a function of the first animation selected and of the swipe gesture may comprise a step of sending to the first interface a first signal, processed by the control unit as a function of the first animation selected and of the swipe gesture.

The control unit puts the first and the second interface, that is, the input interface and the display interface, in communication with each other.

The step of displaying the second animation in the display interface may comprise a step of displaying the second animation in the first interface as a function of the first signal.
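The mediation chain just described, selection and swipe on the input interface, a first signal built by the control unit, and a second animation rendered from that signal, can be sketched as below; the class and field names are hypothetical illustrations, not the disclosed implementation.

```python
# Hypothetical sketch of the control-unit mediation between the two
# interfaces. Names are illustrative only.

class ControlUnit:
    def process(self, selected_animation, drag_direction):
        # The "first signal" carries what the display interface needs
        # to build its version of the selected animation.
        return {"source": selected_animation, "direction": drag_direction}

class DisplayInterface:
    def __init__(self):
        self.current_animation = None

    def show(self, signal):
        # Display the second animation as a function of the first signal,
        # e.g. a reduced-size reproduction of the first animation.
        self.current_animation = f"reproduction_of_{signal['source']}"

unit = ControlUnit()
cockpit = DisplayInterface()
s1 = unit.process("navigation_map", "left")
cockpit.show(s1)
assert cockpit.current_animation == "reproduction_of_navigation_map"
```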

The step of selecting a first animation relating to a first function in the input interface may comprise a step of selecting the first animation from a menu relating to the first function.

The menu may be in the form of a popup list, a drop-down list or a set of icons/windows and so on.

Advantageously, selecting the animation from a menu relating to the function allows the user to choose the multimedia content of interest.

The method may essentially comprise a step of:

- moving an internal portion of the first animation into the input interface;

- sending a movement signal to the first interface as a function of a movement of the internal portion of the first animation into the input interface;

- displaying, in the first interface, a movement of an internal portion of the second animation as a function of the movement signal.

For example, if the first animation is an audio track playback drop-down menu with a minute progress bar, the user can use the touch screen of the input interface to move the progress bar to the desired minute.

At the same time, the user will see, on the virtual cockpit, a movement in the second animation which reproduces or represents the state of the progress bar, on the animation, in the input interface.
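The progress-bar example above can be sketched as follows, a minimal illustration of the movement signal relaying an internal movement from the input interface to the cockpit; all class and attribute names are assumptions.

```python
# Illustrative sketch of the movement mirroring: moving a portion of the
# first animation on the input interface emits a movement signal, and the
# cockpit moves the same portion of the second animation. Hypothetical names.

class ControlUnit:
    def __init__(self):
        self.display = None

    def forward_movement(self, movement_signal):
        self.display.apply_movement(movement_signal)

class InputInterface:
    def __init__(self, control_unit):
        self.control_unit = control_unit
        self.progress = 0.0

    def drag_progress_bar(self, position):
        self.progress = position
        # Movement signal as a function of the internal movement.
        self.control_unit.forward_movement(
            {"portion": "progress_bar", "position": position})

class Cockpit:
    def __init__(self):
        self.progress = 0.0

    def apply_movement(self, movement_signal):
        self.progress = movement_signal["position"]

unit = ControlUnit()
cockpit = Cockpit()
unit.display = cockpit
touch_screen = InputInterface(unit)
touch_screen.drag_progress_bar(0.75)  # user scrubs to 75% of the track
assert cockpit.progress == 0.75
```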

Advantageously, the movement made by the user on the input interface is displayed directly on the cockpit.

To view the movement, it is therefore sufficient for the driver to glance down at the cockpit without having to look away at the central part of the dashboard.

Advantageously, this reduces the time the driver looks away from the road and thus ensures a higher level of road safety.

The method may comprise a step of moving the first animation towards an edge of the input interface and highlighting at least a portion of the first animation to indicate that it can be displayed on the display interface.

The portion of the first animation that is highlighted when the first animation is moved towards the edge of the input interface may be an outer portion of the first animation at the edge of the interface or an edge/boundary of the first animation.

The term "highlighting" is used to mean that the highlighted portion is illuminated or otherwise flagged in order to draw attention to it.

If the display interface is on the left of the input interface, hence the dragging gesture, that is, the swipe, is from right to left, the outer left-hand portion of the animation is the portion that may be highlighted.

Advantageously, highlighting a portion of the first animation when the animation is moved towards the edge of the input interface allows the user to know whether that item can effectively be transferred to the display interface.

Highlighting a portion of the animation therefore allows distinguishing the animations that can be transferred to the display interface from those which the user cannot transfer (for example, for driving safety reasons).

The method may comprise a step of moving the first animation towards an edge of the input interface and highlighting at least a portion of the edge of the interface at the first animation.

In other words, when the first animation is close to an edge of the input interface, at least a portion of the edge of the input interface is highlighted.

Advantageously, highlighting at least a portion of that edge of the interface at the first animation when the animation is moved towards the edge of the input interface allows the user to know whether that item can effectively be transferred to the display interface.

Further, highlighting at least a portion of the edge of the interface tells the user whether the swipe gesture has been made in the correct direction.

If the display interface is on the left of the input interface, hence the dragging gesture, that is, the swipe, is from right to left, at least part of the outer left-hand edge of the input interface is the portion that will be highlighted.

If the display interface is on the right of the input interface, hence the dragging gesture, that is, the swipe, is from left to right, at least part of the outer right-hand edge of the input interface is the portion that will be highlighted.
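The highlighting behaviour above can be sketched as a small decision rule: highlight the matching edge portion only when the content may actually be transferred (some contents may be blocked for driving-safety reasons). The function and labels are hypothetical.

```python
# Sketch of the edge-highlighting logic. Names are illustrative only.

def edge_highlight(drag_direction, transferable):
    """Return the edge portion to highlight, or None if nothing lights up."""
    if not transferable:
        # Non-transferable animations produce no highlight, telling the
        # user the item cannot be moved to the display interface.
        return None
    return {"left": "left_edge", "right": "right_edge"}.get(drag_direction)

assert edge_highlight("left", transferable=True) == "left_edge"
assert edge_highlight("right", transferable=True) == "right_edge"
# e.g. a video feed blocked while driving produces no highlight:
assert edge_highlight("left", transferable=False) is None
```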

The method may comprise a step of replacing a previous animation with the second animation in the first interface.

The previous animation may be a default animation that is displayed in the first interface, for example, when the vehicle is started.

In another example, the previous animation may be an animation of another multimedia content resulting, for example, from a previous dragging gesture, or swipe.

Advantageously, to replace one animation with another, that is to say, to replace the different multimedia contents, it is sufficient to make another swipe, that is, another dragging gesture.

Advantageously, replacing the animations with a swipe/dragging gesture simplifies and streamlines the procedure for selecting different vehicle functions, so the driver does not need to interact with different selection/question screens.

In an example, the first animation may be an audio track playback dropdown menu and the new animation which the user wishes to display may be a road navigation map with which the user replaces the drop-down menu by a single swipe gesture.

The vehicle may comprise a third interface.

The third interface is positioned on the dashboard on the side opposite the display interface with respect to the second interface.

The first interface is a display for the driver and the third interface is a display for the passenger.

The third interface is, for example, a screen.

In this disclosure, the third interface is sometimes referred to as "passenger interface" without thereby losing generality.

The third interface is in communication with the control unit.

In other words, the control unit puts the passenger interface in communication with the input interface.

The method comprises a step of dragging the first animation in the direction of the passenger interface by a swipe gesture made on the input interface.

The method may comprise a step of sending to the passenger interface a third animation processed as a function of the first animation selected and of the swipe gesture.

Advantageously, being able to drag the first animation with a single swipe gesture to transfer it from the input screen, positioned centrally on the dashboard, to the passenger's display screen considerably simplifies the operations which the user needs to perform to manage the infotainment functions.

Thanks to the single swipe, the user no longer needs to follow a lengthy selection procedure in response to different questions, at times even on different screens of the input interface or of the display interface, which used to make it particularly complicated for a user to manage the complex multimedia contents.

Managing the multimedia contents of the vehicle with a single swipe allows speeding up the procedures and interactions between the user and the input interface. It also avoids the passenger interfering with the driver by operating on the same interface.

On the contrary, thanks to the ease with which the multimedia contents between the different interfaces of the vehicle can be managed, the passenger can even be of assistance to the driver by managing, for example, the navigator function.

The step of sending a third animation to the third interface as a function of the first animation selected and of the swipe gesture may comprise a step of sending to the third interface a third signal, processed by the control unit as a function of the first animation selected and of the swipe gesture.

The method may comprise a step of displaying the third animation in the third interface.

The step of displaying the third animation in the third interface may comprise a step of displaying the third animation in the third interface as a function of the third signal.

Advantageously, sending and displaying an animation relating to a multimedia content of the infotainment system on the third interface, that is, on the interface intended for the passenger, means that the passenger does not interfere with the operations performed by the driver on the input interface.

In addition, the passenger can help the driver by managing the functions, that is, the multimedia contents, of the vehicle.

The method may comprise a step of replacing a previous animation with the third animation in the third interface.

The previous animation may be a default animation that is displayed in the third interface, for example, when the vehicle is started.

In another example, the previous animation may be an animation of another multimedia content resulting, for example, from a previous dragging gesture, or swipe.

The method may essentially comprise a step of:

- moving an internal portion of the first animation into the input interface;

- sending a movement signal to the third interface as a function of a movement of the internal portion of the first animation into the input interface;

- displaying, in the third interface, a movement of an internal portion of the third animation as a function of the movement signal.

For example, if the first animation is an audio track playback drop-down menu with a minute progress bar, the user (in this case, the passenger) can use the touch screen of the input interface to move the progress bar to the desired minute.

At the same time, the passenger will see, on the passenger interface, a movement in the third animation which reproduces or represents the state of the progress bar, on the animation, in the input interface.

According to an aspect, this disclosure relates to a vehicle essentially comprising at least a control unit, a dashboard which in turn comprises at least a first interface and a second interface.

The first interface is positioned in a side portion of the dashboard and the second interface is positioned in the centre of the dashboard.

The first interface is a display interface and the second interface is an input interface.

The input interface is configured to enable selection of a first animation.

In other words, the input interface is configured to allow a user to select the first animation.

The control unit is in communication with the display and input interfaces.

The control unit is configured to send a first signal to the display interface as a function of the first animation selected and of a swipe gesture made on the input interface.

Advantageously, having a control unit that is configured to send a first signal to the display interface as a function of the first animation selected and of a swipe gesture made on the input interface allows managing applications and the main contents optimally from the input interface, that is, the infotainment system of the vehicle.

Advantageously, the dragging gesture or swipe makes it easier to transfer the animation from the input interface, which is positioned centrally on the dashboard, to the driver's display interface, thus considerably simplifying the operations which the user has to perform to manage the infotainment functions.

Thanks to the single swipe, the user no longer needs to follow a lengthy selection procedure in response to different questions, at times even on different screens of the input interface or of the display interface, which used to make it particularly complicated for a user to manage the relation between the infotainment interface and the virtual cockpit.

The display interface is configured for displaying a second animation as a function of the first signal.

Advantageously, the user can display at least part of the content of the infotainment screen directly in the virtual cockpit without losing attentiveness to look at the dashboard.

The control unit is configured for replacing a previous animation with the second animation in the first interface.

In an example, the previous animation is a navigation map or a default animation in the virtual cockpit, which is replaced with a second animation which may be, for example, an audio track playback drop-down menu.

Advantageously, the different multimedia contents can be replaced by making another swipe or dragging gesture.

Advantageously, replacing the animations with a swipe/dragging gesture simplifies and streamlines the procedure for selecting different vehicle functions, so the driver does not need to interact with different selection/question screens.

The control unit may be configured for sending a movement signal to the display interface as a function of the user's moving a portion of the first animation in the input interface.

The first interface can be configured to display a movement of a portion of the second animation as a function of the movement signal.

Advantageously, the movement made by the user on the input interface is displayed directly on the cockpit.

To view the movement, it is therefore sufficient for the driver to glance down at the cockpit without having to look away at the central part of the dashboard.

Advantageously, this reduces the time the driver looks away from the road and thus ensures a higher level of road safety.

The first animation may comprise at least one outer portion.

The control unit may be configured to highlight at least one portion of the first animation.

More specifically, the control unit allows highlighting at least one portion of the first animation when the first animation is close to the edge of the input interface.

Advantageously, highlighting an outer portion of the first animation when the animation is moved towards the edge of the input interface allows the user to know whether that item can effectively be transferred to the display interface.

Highlighting an outer portion of the animation therefore allows distinguishing the animations that can be transferred to the display interface from those which the user cannot transfer (for example, for driving safety reasons).

The control unit may be configured to highlight at least one portion of an edge of the second interface.

More specifically, the control unit allows highlighting at least one portion of the edge of the input interface when the first animation is close to the edge of the input interface.

The control unit may be configured to highlight at least one portion of the first animation and to highlight at least one portion of an edge of the second interface.

Advantageously, highlighting at least a portion of the edge at the first animation when the animation is moved towards the edge of the input interface allows the user to know whether that item can effectively be transferred to the display interface.

Further, highlighting at least a portion of the edge of the interface tells the user whether the swipe gesture has been made in the correct direction.

The vehicle may comprise a steering wheel having at least one steering wheel portion provided with at least one control, and the display interface may comprise an interface portion in proximity to that steering wheel portion.

The control may be, for example, a knob, a pushbutton, a lever switch, and so on.

When a swipe gesture is made on the input interface, the control unit is configured to show a category of animations related to that control in the portion of the interface associated with the portion of the steering wheel and in which the control is situated.

The category of animations may be, for example, the media category (radio, audio, phone contact list) or the category of widgets relating to vehicle configurations and dynamics.
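The control-to-category association above can be sketched as a simple lookup; the mapping, control identifiers and category contents below are purely illustrative assumptions.

```python
# Hypothetical sketch: after a swipe, the interface portion next to a
# steering wheel control shows the category of animations related to
# that control. The mapping is illustrative only.

CONTROL_CATEGORIES = {
    "C1": ["radio", "audio", "phone_contacts"],        # media category
    "C2": ["vehicle_configuration", "driving_modes"],  # widget category
}

def category_for_control(control_id):
    """Return the animation category shown for a steering wheel control."""
    return CONTROL_CATEGORIES.get(control_id, [])

assert "radio" in category_for_control("C1")
assert "driving_modes" in category_for_control("C2")
assert category_for_control("C3") == []  # unknown control: nothing shown
```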

The vehicle may comprise a third interface.

The control unit is configured to send a third signal to the third interface as a function of a first animation selected and of a swipe gesture made on the input interface.

Advantageously, being able to drag the first animation with a single swipe gesture to transfer it from the input screen, positioned centrally on the dashboard, to the passenger's display screen considerably simplifies the operations which the user (who, in the case of the third interface may be the driver or the passenger) needs to perform to manage the infotainment functions.

Thanks to the single swipe, the user no longer needs to respond to different questions on different screens of the input interface or of the display interface, which used to make it particularly complicated for a user to manage the relation between the infotainment interface and the virtual cockpit.

It also prevents the passenger from interfering with the driver by operating on the same interface.

On the contrary, thanks to the ease with which the multimedia contents between the different interfaces of the vehicle can be managed, the passenger can even be of assistance to the driver.

The passenger's display interface is configured for displaying a third animation as a function of the third signal.

The third animation may be a reproduction of the first animation, copying the shape and multimedia content of the first animation, preferably in a smaller size.

The third animation may be, for example, a schematic representation of the first animation, which only partly reproduces the information content of the first animation.

In other examples, the third animation may be an animation that represents the first animation or a symbolic representation thereof.

Advantageously, displaying an animation on the third interface, that is, on the interface intended for the passenger, means that the passenger does not interfere with the operations performed by the driver.

In addition, the passenger can, for example, help the driver by managing the functions, that is, the multimedia contents, of the vehicle.

The control unit is configured for replacing a previous animation with the second animation in the third interface.

Brief description of the drawings

The main features of the invention are more apparent from the detailed description which follows, with reference to the accompanying drawings which illustrate a preferred embodiment of the invention purely by way of non-limiting example, and in which:

- Figure 1 shows a configuration of a vehicle dashboard comprising two display interfaces and an input interface, according to the disclosure, in a schematic front view;

- Figure 2A shows a configuration of a vehicle dashboard comprising two display interfaces and an input interface, according to the disclosure, in a schematic front view;

- Figure 2B shows a configuration of a vehicle dashboard comprising two display interfaces and an input interface, according to the disclosure, in a schematic front view;

- Figure 3A shows a configuration of a vehicle dashboard comprising two display interfaces and an input interface, according to the disclosure, in a schematic front view;

- Figure 3B shows a configuration of a vehicle dashboard comprising two display interfaces and an input interface, according to the disclosure, in a schematic front view;

- Figure 4 shows a configuration of a vehicle dashboard comprising two display interfaces and an input interface, according to the disclosure, in a schematic front view;

- Figure 5 shows a configuration of a vehicle dashboard comprising two display interfaces and an input interface, according to the disclosure, in a schematic front view;

- Figure 6 shows a configuration of a vehicle dashboard comprising two display interfaces and an input interface, according to the disclosure, in a schematic front view.

Detailed description of preferred embodiments of the invention

With reference to the accompanying drawings, in particular Figure 1, the numeral 1 denotes a vehicle.

The vehicle 1 essentially comprises:

- at least one control unit 2;

- a dashboard 3 comprising at least an interface 4A (serving as a display interface for the driver) and an interface 4B (serving as an input interface);

- a steering wheel 5 connected to the dashboard 3 and disposed below the interface 4A.

The interface 4B is positioned at the centre of the dashboard 3 and the interface 4A is positioned in a side portion of the dashboard 3.

The interface 4B is configured to enable selection of an animation 10. More specifically, the interface 4B allows a user to select the animation 10 of interest from a menu M1 relating to a function F1, that is, to a multimedia content of the vehicle.

The control unit 2 puts the interfaces 4A and 4B, that is, the display interface and the input interface, in communication with each other.

In an embodiment, the dashboard 3 may comprise an interface 4C that defines a display interface for a passenger.

In an embodiment, as illustrated in the accompanying drawings, the control unit 2 puts the interfaces 4A and 4C in communication with the interface 4B. The unit 2 is configured to send a signal S1 to the interface 4A.

The signal S1 is generated as a function of the animation 10 selected by the user and as a function of a swipe gesture made by the user on the interface 4B in the direction of the interface 4A.

The interface 4A is configured to display an animation 15 processed as a function of the signal S1.
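The routing just described can be sketched in Python. The patent does not specify any implementation, so every name here (the `ControlUnit` class, the `Signal` record, the direction-to-interface table) is a hypothetical illustration of the behaviour, not the actual control unit 2:

```python
from dataclasses import dataclass

@dataclass
class Signal:
    """Hypothetical stand-in for a signal such as S1 or S3."""
    target: str        # receiving interface, e.g. "4A" or "4C"
    animation_id: str  # identifier of the animation 10 selected on 4B

class ControlUnit:
    """Sketch of the control unit 2: routes a selected animation to a
    display interface according to the direction of the swipe gesture."""
    # Assumed left-hand-drive layout: driver display 4A to the left of the
    # central input interface 4B, passenger display 4C to the right.
    ROUTES = {"left": "4A", "right": "4C"}

    def on_swipe(self, animation_id: str, direction: str) -> Signal:
        target = self.ROUTES.get(direction)
        if target is None:
            raise ValueError(f"no display interface lies in direction {direction!r}")
        return Signal(target=target, animation_id=animation_id)

unit = ControlUnit()
s1 = unit.on_swipe("media_menu", "left")   # swipe towards the driver display
```

The single lookup table captures the key design point of the disclosure: one gesture, made on one interface, fully determines both the content and the destination of the transfer.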

In the example illustrated, the interface 4A has a rev counter located at a fixed, central position on the interface 4A.

As illustrated in Figure 3A, the rev counter C divides the interface 4A into two portions 4A' and 4A".

Preferably, the animations related to the functions such as media, radio navigator, phone contact list are displayed in the portion 4A' of the interface 4A, while the widgets relating to vehicle dynamics and configurations are displayed in the portion 4A" of the interface.

As illustrated in Figure 4, the portions 4A' and 4A" of the interface 4A are associated with respective portions 5' and 5" of the steering wheel 5.

Preferably, the portions 5' and 5" are provided with controls C1 and C2, respectively.

The division of the interface 4A according to the type of animation, as described above, is preferably consistent with the controls C1 , C2 available to the driver on the steering wheel 5.

For example, the steering wheel portion 5' located below the interface portion 4A' includes controls C1 such as pushbuttons and/or knobs or other controls that allow the driver to select and scroll through animations of a category A1, such as audio media, radio or phone contacts.

For example, the steering wheel portion 5" located below the interface portion 4A" includes controls C2 such as pushbuttons and/or controls for selecting a category A2 of animations, that is, widgets regarding vehicle dynamics and configurations.

More specifically, following the swipe gesture made on the interface 4B, the control unit 2 is configured to show in the portions 4A', 4A" of the interface 4A, respectively, a category A1, A2 of animations related to the type of control C1, C2.
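The correspondence between controls, categories and display portions described above can be sketched as a simple lookup. The mapping itself is taken from the description; the table and function names are illustrative only:

```python
# Mapping taken from the description; names are hypothetical.
CONTROL_TO_CATEGORY = {
    "C1": "A1",  # C1: media controls -> audio media, radio, phone contacts
    "C2": "A2",  # C2: widget controls -> vehicle dynamics and configurations
}
CATEGORY_TO_PORTION = {
    "A1": "4A'",   # media animations shown in portion 4A'
    "A2": '4A"',   # widgets shown in portion 4A"
}

def portion_for_control(control: str) -> str:
    """Display portion in which the category of animations for a control appears."""
    return CATEGORY_TO_PORTION[CONTROL_TO_CATEGORY[control]]
```

This keeps the on-screen layout consistent with the physical controls: whichever steering-wheel portion carries the control also faces the display portion showing its category.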

The unit 2 is configured to send a signal S3 to the interface 4C.

The signal S3 is generated/processed as a function, for example, of the animation 10 and of a swipe (or dragging) gesture made on the interface 4B in the direction of the interface 4C.

The interface 4C is configured to display an animation 15' processed as a function of the signal S3, as shown in Figure 3B.

The animation 10 comprises a portion 12.

The portion 12 may be the outermost portion of the animation 10 such as, for example, the edge of the animation.

In an embodiment, as illustrated in Figures 2A and 2B, the unit 2 is configured to highlight at least the portion 12 of the animation 10 and to highlight at least a portion 7 of an edge 6 of the interface 4B.

In an embodiment, when the animation 10 moves close to the edge 6 as a result of the swipe gesture, the unit 2 is configured to highlight at least the portion 12 of the animation 10 and/or to highlight at least a portion 7 of an edge 6 of the interface 4B.

In the example illustrated in Figure 2A, the edge portion 7 that is highlighted is the one closest to the display interface, in this case 4A, to which the multimedia content, that is, the animation 10, is to be transferred.

In the example illustrated in Figure 2B, the edge portion 7 that is highlighted is the one closest to the display interface, in this case the passenger's display interface 4C, to which the multimedia content (that is, the animation 10) is to be transferred.
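The highlighting behaviour of Figures 2A and 2B can be sketched as follows. The normalised coordinates, the threshold value and the return labels are assumptions introduced for illustration; the disclosure only states that the animation portion 12 and the nearest edge portion 7 are highlighted:

```python
def highlight_on_drag(x: float, threshold: float = 0.1):
    """Decide what to highlight while the animation 10 is dragged on the
    input interface 4B. `x` is the animation's horizontal position,
    normalised to [0, 1]; 4A is assumed to lie to the left of 4B and
    4C to the right (hypothetical layout).
    Returns (highlight_animation_portion_12, highlighted_edge_portion_7)."""
    if x <= threshold:          # close to the edge 6 nearest the driver display
        return True, "towards 4A"
    if x >= 1.0 - threshold:    # close to the edge 6 nearest the passenger display
        return True, "towards 4C"
    return False, None          # mid-screen: nothing highlighted yet
```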

The interface 4B is configured to enable selection of an animation 20, different from the animation 10.

In other words, the user can select from the interface 4B an animation 20 relating to a function F2, that is, to a second multimedia content of the infotainment system of the vehicle.

In the non-limiting example illustrated in the accompanying drawings, in particular Figure 4, the animation 15 is an audio track playback drop-down menu and the animation 20, a navigator road map.

The unit 2 is configured to send a signal S2 to the interface 4A, generated as a function of the animation 20 selected and of a second dragging gesture, or swipe, made on the interface 4B in the direction of the interface 4A.

The control unit 2 is configured to replace the animation 15 with an animation 25 on the interface 4A, as a function of the signal S2.

As illustrated in the example of Figure 6, the user (in this case the driver) will see the map animation 25 displayed on the interface 4A.

The animation 25 may be a reproduction or a schematic representation of the animation 20 shown on the interface 4B.

Similarly, the control unit 2 is configured to replace, on the interface 4C, the reproduction 15' with a reproduction of a different animation as a function of the animation selected and as a function of a second dragging gesture (or swipe) made by the user on the interface 4B.

In an embodiment not illustrated, the control unit 2 is configured to send a movement signal to the first interface 4A as a function of a movement, performed by a user, of a portion of the first animation 10 into the second interface 4B.

In this embodiment, the interface 4A and/or the interface 4C is configured to display a movement of an internal portion of the animation 15 as a function of the movement signal. In other words, any movement performed by the user on the animation 10 is viewable by the user on the interface 4A and/or 4C.
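A minimal sketch of this mirroring, assuming the reproduction 15 is rendered at a reduced scale (the class name and the scale factor are hypothetical; the disclosure only states that the movement is forwarded and displayed):

```python
class MovementMirror:
    """Forwards movements made on a portion of the animation 10 (on the
    input interface 4B) to its reproduction 15 on a display interface."""

    def __init__(self, scale: float = 0.5):
        self.scale = scale         # reproduction 15 is smaller than animation 10
        self.offset = (0.0, 0.0)   # current position of the moved portion on 4A

    def on_input_move(self, dx: float, dy: float):
        """Movement signal: apply the user's drag, scaled to the reproduction."""
        x, y = self.offset
        self.offset = (x + dx * self.scale, y + dy * self.scale)
        return self.offset
```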

This disclosure has for an object a method for managing infotainment functions in a vehicle.

The method comprises a step of providing a vehicle comprising a dashboard 3, in turn comprising at least an interface 4A and an interface 4B, and at least one control unit 2 in communication with the interfaces 4A and 4B.

The interface 4A is positioned in a side portion of the dashboard 3 and defines a display interface for the driver.

The interface 4B occupies a central position of the dashboard 3 and is an input interface for a user.

The method comprises a step of selecting an animation 10 relating to a function F1 from the interface 4B.

The method comprises a step of dragging the animation 10 in the direction of the interface 4A by a dragging gesture (made) on the interface 4B.

The method comprises a step of sending an animation 15 to the interface 4A as a function of the animation 10 and of the dragging gesture.

The method comprises a step of displaying the animation 15 on the interface 4A.

In an embodiment, the step of sending an animation 15 to the interface 4A as a function of the animation 10 and of the dragging gesture comprises a step of sending to the interface 4A a signal S1, processed by the control unit 2 as a function of the animation 10 selected and of the dragging gesture. In an embodiment, the step of displaying the animation 15 on the interface 4A comprises a step of sending an animation 15 to the interface 4A as a function of the signal S1.

In a preferred embodiment, the step of selecting an animation 10 relating to a function F1 from the interface 4B comprises a step of selecting the animation 10 from a menu M1 relating to the function F1.

As illustrated in sequence in Figures 1, 2A and 3A, the method comprises a step of replacing a previous animation with the animation 15 on the interface 4A.

In an embodiment illustrated in the accompanying drawings, where the vehicle comprises an interface 4C, the method comprises the following steps:

- dragging the animation 10 in the direction of the interface 4C by a dragging gesture (made) on the interface 4B;

- sending, to the interface 4C, a signal S3 processed by the control unit 2 as a function of the animation 10 selected and of the dragging gesture;

- sending an animation 15' to the interface 4C as a function of the signal S3;

- displaying the animation 15' of the animation 10 on the interface 4C.

In an embodiment, the method comprises the steps of:

- moving an internal portion of the animation 10 into the interface 4B;

- sending to the first interface 4A a movement signal as a function of a movement of the internal portion of the first animation 10 into the interface 4B;

- displaying on the interface 4A a movement of an internal portion of the animation 15 as a function of the movement signal.

For example, in a case where the animation 10 is an audio track playback drop-down menu with a minute progress bar, the user can use the touch screen of the input interface to move the progress bar to the desired playback minute.
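The progress-bar interaction can be sketched as a mapping from the touch position to a playback time. The pixel coordinates, parameter names and clamping behaviour are assumptions introduced for illustration:

```python
def seek_seconds(touch_x: float, bar_x0: float, bar_width: float,
                 track_seconds: int) -> int:
    """Map a touch on the progress bar of the animation 10 (input interface
    4B) to a playback position, clamped to the track's duration."""
    frac = (touch_x - bar_x0) / bar_width
    frac = min(max(frac, 0.0), 1.0)   # clamp touches outside the bar
    return round(frac * track_seconds)

# Touching the middle of the bar on a 4-minute track seeks to 2:00.
position = seek_seconds(touch_x=150, bar_x0=50, bar_width=200, track_seconds=240)
```

The resulting position would then be carried in the movement signal so that the progress bar of the reproduction 15 jumps to the same point.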

In a preferred embodiment, the method comprises a step of moving the animation 10 close to an edge 6 of the interface 4B and highlighting at least one portion 12 of the animation 10 when the animation 10 is close to the edge 6 of the interface 4B.

Preferably, the portion 12 of the animation 10 is an outer portion or the edge of the animation 10 itself.

In a preferred embodiment, the method comprises a step of moving the animation 10 close to an edge 6 of the interface 4B and highlighting at least one portion 7 of the edge 6 at or close to the animation 10.

Preferably, when the animation 10 is close to the edge 6 of the interface 4B, both the portion 12 of the animation and the portion 7 of the edge of the input interface at the display interface towards which the swipe gesture is made, are highlighted.

Once the user has selected the animation 10 and starts the swipe gesture by moving the animation closer to the edge 6 of the interface 4B, a wait animation 30, 30' appears on the display interface 4A, 4C, as illustrated in Figures 2A and 2B, while the user completes the swipe gesture.
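The display-side behaviour during the gesture can be summarised as a small state function. The string labels are illustrative; the disclosure names only the wait animation 30, 30' and the reproduction 15:

```python
def display_content(drag_started: bool, swipe_completed: bool) -> str:
    """What the target display interface shows during the gesture."""
    if swipe_completed:
        return "reproduction 15"     # the transferred animation is displayed
    if drag_started:
        return "wait animation 30"   # shown while the swipe is in progress
    return "previous content"        # nothing has been dragged yet
```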

In an embodiment, the method comprises the steps of:

- selecting an animation 20 of a function F2 from the interface 4B;

- dragging the animation 20 in the direction of the interface 4A with a swipe gesture made on the interface 4B;

- sending, to the interface 4A, a signal S2 processed by the control unit 2 as a function of the function F2 selected and of the second swipe gesture;

- sending an animation 25 to the interface 4A as a function of the signal S2;

- replacing the animation 15 with the animation 25 on the interface 4A;

- displaying the animation 25 on the interface 4A.

Preferably, when the animation 20 is close to the edge 6 of the interface 4B, both the portion 22 of the animation 20 and the portion 7 of the edge 6 are highlighted.

Once the user has selected the animation 20 and starts the swipe gesture by moving the animation closer to the edge 6 of the interface 4B, an animation 30" appears on the display interface 4A, as illustrated in Figure 5, while the user completes the swipe gesture.

In a preferred embodiment, the step of selecting an animation 20 relating to a function F2 from the interface 4B comprises a step of selecting the animation 20 from a menu relating to the function F2.