

Title:
TOUCH-GESTURE RECOGNITION-BASED OPERATION IN VEHICLES
Document Type and Number:
WIPO Patent Application WO/2014/002113
Kind Code:
A1
Abstract:
The present subject matter describes a method for achieving operations based on touch-gesture pattern recognition, on-board a vehicle. In an implementation, the method includes determining a pressure exerted on a touch pad (102) of a touch-gesture recognition system (100) in forming at least one touch-gesture pattern. The touch-gesture pattern is formed for operating a device of the vehicle. Based on at least the exerted pressure, the touch-gesture pattern is identified. Further, based on the identifying, an indicating device (212) is operated for indicating the operational device.

Inventors:
CHIPPA SUNIL KUMAR (IN)
DHINAGAR SAMRAJ JABEZ (IN)
Application Number:
PCT/IN2013/000371
Publication Date:
January 03, 2014
Filing Date:
June 14, 2013
Assignee:
TVS MOTOR CO LTD (IN)
International Classes:
B60Q1/34; B60K37/06
Foreign References:
US20060047386A12006-03-02
US20110241850A12011-10-06
US20100250071A12010-09-30
US20100156656A12010-06-24
GB2355055A2001-04-11
Other References:
None
Attorney, Agent or Firm:
SEKARAN, Jayaram et al. (B-6/10 Safdarjung Enclave, New Delhi 9, IN)
Claims:
I/We claim:

1. A method of achieving operations based on touch-gesture pattern recognition on-board a vehicle, the method comprising:

determining a pressure exerted on a touch pad (102) of a touch-gesture recognition system (100) in forming at least one touch-gesture pattern, the at least one touch-gesture pattern being formed for operating a device of the vehicle;

identifying the at least one touch-gesture pattern, based on at least the exerted pressure; and

operating an indicating device (212) to indicate the operational device, based on the identifying.

2. The method as claimed in claim 1, wherein the identifying comprises comparing the at least one touch-gesture pattern to a plurality of predefined patterns.

3. The method as claimed in claim 1, wherein the identifying comprises:

determining at least one contact parameter associated with the touch-gesture pattern; ascertaining at least one actuation command, based on the exerted pressure and the at least one contact parameter; and

executing the at least one actuation command for operating the device of the vehicle.

4. The method as claimed in claim 3, wherein the at least one contact parameter is one of a type of contact and position coordinates associated with the contact achieved on the touch pad (102) while forming the touch-gesture pattern.

5. A touch-gesture pattern recognition system (100) on-board a vehicle, the touch-gesture pattern recognition system (100) comprising:

a touch pad (102) to receive at least one touch-gesture pattern, the touch pad (102) having one or more touch pad sensors (210) for transmitting the at least one touch-gesture pattern;

a microcontroller (104) connected to the touch pad (102), the microcontroller (104) comprising,

a recognition module (204) configured to,

identify the at least one touch-gesture pattern received from the one or more touch pad sensors (210), the touch-gesture pattern being identified based on at least a pressure exerted on the touch pad (102) for forming the at least one touch-gesture pattern; and

determine, based on the identification, an actuation command for operating a device of the vehicle; and

an indicator activation module (206) configured to activate an indicating device

(212) to indicate the device in operation, the activation being achieved based on the actuation command.

6. The touch-gesture pattern recognition system (100) as claimed in claim 5, wherein the device of the vehicle comprises at least one vehicle peripheral device.

7. The touch-gesture pattern recognition system (100) as claimed in claim 5, wherein the device comprises at least one of a horn, a high beam lamp, a low beam lamp, a dipper, a head light, a tail light and an indicator.

8. The touch-gesture pattern recognition system (100) as claimed in claim 5, further comprising:

a pressure determining module (200) to determine the pressure exerted on the touch pad

(102) during formation of the at least one touch-gesture pattern; and

a movement detection module (202) to determine a directional movement associated with the at least one touch-gesture pattern;

wherein the recognition module (204) is configured to identify the at least one touch-gesture pattern and to determine the actuation command, based on the determining by the pressure determining module (200) and the movement detection module (202).

9. The touch-gesture pattern recognition system (100) as claimed in claim 5, wherein the recognition module (204) is further configured to determine a validity of the at least one touch-gesture pattern, based on pre-configured patterns.

10. The touch-gesture pattern recognition system (100) as claimed in claim 5, wherein the touch pad (102) is positioned in the vicinity of a steering mechanism of the vehicle.

Description:
TOUCH-GESTURE RECOGNITION-BASED OPERATION IN VEHICLES

TECHNICAL FIELD

[0001] The present subject matter, in general, relates to vehicle systems and, in particular, relates to touch-gesture recognition-based operations on-board vehicles.

BACKGROUND

[0002] Conventional switching systems in a vehicle require physical activity or manual effort from a driver to accomplish various switching operations while driving the vehicle. Such conventional switching systems include multiple switches to operate multiple devices within these vehicles. These switches may be in the form of buttons, knobs, etc., which are fitted almost everywhere around the operator of the vehicle. The physical activity that may be required includes pushing a button or moving a switch or knob located on the vehicle. For example, in order to turn on the headlights of the vehicle, a driver of a four-wheeled vehicle has to operate a switch. The presence of such an excessive number of switches and devices may lead to confusion and distraction of the operator while driving the vehicle, which can be detrimental to traffic safety. Manual handling of the switches on-board the vehicles is, therefore, a complex and skillful task. On the other hand, if usage of only a few switches is advised as a precautionary measure to avoid distraction, then the available resources may not be optimally utilized.

[0003] In an attempt to alleviate the amount of manual handling of vehicle controls and to minimize the driver's distraction, speech recognition and voice command systems have conventionally been developed and implemented. However, the operation of the speech recognition and voice command systems is based on the clarity of the voice command given, without which these systems will not operate in the prescribed way. This may increase the driver's distraction, as the driver may have to repeat a command several times in order to execute a switching operation. Moreover, the driver has to remember a large number of commands in order to operate all the systems.

SUMMARY

[0004] The present subject matter describes a method for achieving operations based on touch-gesture pattern recognition on-board a vehicle. In an implementation, the method includes determining a pressure exerted on a touch pad of a touch-gesture recognition system in forming at least one touch-gesture pattern. The touch-gesture pattern is formed for operating a device of the vehicle. Based on at least the exerted pressure, the touch-gesture pattern is identified. Further, based on the identifying, an indicating device is operated for indicating the operational device.

[0005] These and other features, aspects, and advantages of the present subject matter will be better understood with reference to the following description and appended claims. This summary is provided to introduce a selection of concepts in a simplified form. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.

BRIEF DESCRIPTION OF DRAWINGS

[0006] The above and other features, aspects and advantages of the subject matter will be better understood with regard to the following description, appended claims, and accompanying drawings where:

[0007] Fig. 1 illustrates a block diagram representation of a touch-gesture recognition system in a vehicle, according to an embodiment of the present subject matter.

[0008] Fig. 2 illustrates a detailed block diagram representation of the touch-gesture recognition system of Fig. 1, according to an embodiment of the present subject matter.

[0009] Figs. 3a to 3e depict configurations of a touch pad of the touch-gesture recognition system of Fig. 1, in various modes of operation, in accordance with an embodiment of the present subject matter.

[0010] Fig. 4 illustrates a method for touch-gesture pattern recognition, according to an embodiment of the present subject matter.

DETAILED DESCRIPTION

[0011] Conventionally, switching devices including multiple switches are provided on-board vehicles, to operate various devices and perform various functions on-board the vehicles. These switches are usually provided on the vehicle, around an operator's seat, for accessibility. However, operating such an excessive number of switches and devices while driving the vehicle may lead to distraction of the operator. In addition, while driving the vehicle, the operator may not be able to identify certain functions or devices on-board the vehicle which are in operation. For example, functions or devices, besides those having an indicator on an instrument panel of the vehicle, may not be identifiable by the operator as to whether such functions or devices are in operation or not. In such a case, the efforts of the operator to operate an already operational device may be duplicated or wasted, and can cause an inconvenience to the operator.

[0012] The disclosed subject matter relates to a touch-gesture recognition system and a method for achieving operations based on touch-gesture pattern recognition in a vehicle. According to an aspect, the system and method can be implemented for executing a number of actuation commands to selectively operate devices within the vehicle. Further, according to an aspect of the present subject matter, the touch-gesture recognition system is configured to determine the device being operated, and subsequently, activate an indicating device to indicate the device in operation, to an operator of the vehicle. The touch-gesture recognition system includes a touch pad, having a touch pad sensor, and a microcontroller. An operator or driver of the vehicle touches the touch pad in a predefined way to apply a touch-gesture pattern for selectively switching ON or OFF the devices and functions to perform the various switching operations within the vehicle, for example, for operation of various vehicle peripheral devices. The vehicle peripheral devices can be, for example, the horn, the high beam and low beam lamps, the dipper, the head and tail lights, the indicators, the air-conditioner, the music player, the window switch, the door lock, wiper assembly, and touch-start mechanism for engine starting.

[0013] In response to the receipt of the touch-gesture pattern, the touch pad sensor of the touch pad generates an intermediate code. The microcontroller analyzes the intermediate code and, based on this analysis, activates one or more devices on-board the vehicle.

[0014] In operation, the touch-gesture recognition system determines the kind of contact made by the operator on the touch pad, say, whether the contact made by the operator on the touch pad sensor is a point contact or a sliding contact, and the directional movement of the contact, say, the position coordinates of the contact on the touch pad in case the contact is of the sliding type. Additionally, the touch-gesture recognition system also evaluates the amount of pressure exerted by the operator on the touch pad while making the touch-gesture pattern. Based upon the determined contact or the pressure or both, the touch-gesture recognition system retrieves and executes a command which corresponds to the touch-gesture pattern to operate one or more devices of the vehicle. With the execution of the command, the one or more devices associated with the command may be operated, for example, activated or deactivated, as the case may be. In said example, if the devices are operational, they may be deactivated, and if the devices are switched off, they may be activated, by the execution of the command.

[0015] Since the touch-gesture pattern is recognized on the basis of pressure, in addition to the directional movement of the contact, the touch-gesture recognition system can accommodate a large number of combinations of touch-gestures, i.e., of directional movement and pressure associated with the touch gesture, for operating various devices of the vehicle. For example, a single-point touch-gesture on the touch pad made by exerting considerably low pressure can activate a horn of the vehicle, whereas the same touch-gesture formed by exerting a high pressure can achieve an engine on-off operation. Since the number of combinations of the touch-gestures is high, a single touch-pad recognition system can be used for a large number of vehicle operations. As will be understood, such touch-gestures can also be recognized based on various factors, such as vehicle condition (vehicle speed or engine speed) and the region of the touch pad on which the touch-gesture is formed.
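The horn/engine example above can be sketched as a simple lookup keyed on gesture type and pressure band. This is an illustrative sketch only; the command names, gesture labels, and threshold value are hypothetical assumptions, not taken from the disclosure.

```python
# Hypothetical mapping of (gesture type, pressure band) combinations to
# actuation commands, mirroring the horn/engine example in the text.
HIGH_PRESSURE_THRESHOLD = 5.0  # assumed units; illustrative value only

COMMAND_TABLE = {
    ("single_point", "low"): "ACTIVATE_HORN",
    ("single_point", "high"): "TOGGLE_ENGINE",
}

def pressure_band(pressure):
    """Classify the exerted pressure into a coarse band."""
    return "high" if pressure >= HIGH_PRESSURE_THRESHOLD else "low"

def resolve_command(gesture, pressure):
    """Return the actuation command for a gesture/pressure combination,
    or None if no command is associated with it."""
    return COMMAND_TABLE.get((gesture, pressure_band(pressure)))
```

Because the same trace yields different commands at different pressure bands, the number of distinguishable gestures grows multiplicatively, which is the point made above.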

[0016] The touch-gesture recognition system as disclosed herein provides a centralized switching system for activating or deactivating the devices implemented in the vehicle. Accordingly, the present subject matter eliminates the possibility of any confusion arising in the mind of the operator due to simultaneous usage of two or more manual switches and control knobs in the vehicle while driving. Further, the touch-gesture recognition system facilitates ease of operation of the various devices of the vehicle by the operator, as compared to conventional manually operated switches and knobs. Additionally, the amount of activity to be performed on the touch pad to operate the devices is considerably less than the activity required to manually actuate mechanical switches or knobs.

[0017] Further, according to an aspect, based on the execution of the command, the touch-gesture recognition system activates an indicating device, such as a light emitting diode (LED) matrix, to indicate, to the operator, the devices that have become operational in response to the touch-gesture. With the provision and operation of the indicating device of the touch-gesture recognition system to depict the operational devices, the operator of the vehicle is always aware of the devices in operation. For example, certain functions and devices that may not be realized by the operator from the instrument cluster, such as door operation or window operation, can be depicted by the indicating system. In addition, devices and functions indicated by the instrument cluster of the vehicle, such as indicator lamps or head lamps, can also be indicated by the indicating system of the touch-gesture recognition system. As will be understood, the devices or functions of the vehicle which are deactivated will no longer be indicated by the indicating device.

[0018] In addition, the indicating device can also be activated by the touch-gesture recognition system to assist the operator in identifying an initial position on the touch pad for creating the touch-gesture pattern on the touch pad. For example, the LEDs on the periphery of the LED matrix can be perpetually activated to allow the operator to easily identify the working region of the touch pad. The touch-gesture pattern formed henceforth by the operator on the screen is correctly detected and recognized by the touch-gesture recognition system, and the operation is easy and convenient for the operator.

[0019] The aspects related to touch-gesture pattern recognition on-board a vehicle shall be explained in detail with respect to the figures. While such aspects of touch-gesture pattern recognition can be implemented in a variety of ways, touch-gesture pattern recognition is described with reference to the following implementations. It should be noted that the description and figures merely illustrate the principles of the present subject matter. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles of the present subject matter and are included within its spirit and scope. Moreover, all statements herein reciting principles, aspects, and embodiments of the present subject matter, as well as specific examples thereof, are intended to encompass equivalents thereof.

[0020] Fig. 1 illustrates a schematic representation of a touch-gesture recognition system 100 in a vehicle, in accordance with one embodiment of the present subject matter.

[0021] As depicted in Fig. 1, the touch-gesture recognition system 100, hereinafter interchangeably referred to as the system 100, includes a touch pad 102 and a microcontroller 104. The touch pad 102 is communicatively connected to the microcontroller 104. The combination of the touch pad 102 and the microcontroller 104 operates to execute actuation commands, which in turn operate a number of devices and functions on-board the vehicle. The devices may vary depending upon the type and class of the vehicle. For example, the devices in case of a two-wheeled vehicle may be less in number and complexity as compared to a four-wheeled vehicle. Without limiting the scope of the subject matter, the recognition system 100 may be implemented in various types of vehicles, such as the two-wheeled vehicle, a three-wheeled vehicle, or the four-wheeled vehicle.

[0022] In an embodiment, the touch pad 102 may be placed in the vicinity of a steering mechanism of the vehicle. For example, the touch pad 102 can be provided on either side of the steering mechanism, i.e., a steering handlebar, of the two-wheeled vehicle, or along with the instrument cluster on the steering mechanism. Likewise, in another example, the touch pad 102 may be placed on a dashboard of a three-wheeled vehicle in the vicinity of the steering mechanism, or on the steering mechanism, i.e., the steering wheel or the steering handlebar. In case of a four-wheeled vehicle, the touch pad 102 may be placed either on the steering mechanism, i.e., the steering wheel, or on the dashboard in the vicinity of the steering mechanism. Such a provision of the touch pad 102 allows the operator to easily and conveniently access the touch pad 102 while driving the vehicle. Hence, according to an aspect of the present subject matter, the operator is able to use the touch-gesture recognition system 100 while driving the vehicle, with uninterrupted and convenient access.

[0023] Further, the touch pad 102 may be of various sizes and shapes, also depending upon the type and class of the vehicle. The shape of the touch-sensitive surface 106 of the touch pad 102 may be rectangular, circular, oval, or any other shape in accordance with the aesthetics of the vehicle. In addition, the touch pad 102 can include a plurality of touch pad sensors (not shown) associated with the touch-sensitive surface 106 of the touch pad 102. As will be explained in detail later, the touch pad sensors can be configured to determine whether a touch-gesture pattern is formed by the operator on the touch pad 102. In addition, the touch pad sensors can be configured to determine other details regarding the touch-gesture pattern, as will be described later. As will be understood, the touch-gesture pattern is formed by the operator to operate one or more devices or functions on-board the vehicle. In an example, the devices or functions provided within the vehicle and activated by the touch-gesture pattern may include vehicle peripheral devices. The vehicle peripheral devices can be, for example, the horn, the high beam and low beam lamps, the dipper, the head and tail lights, the indicators, the air-conditioner, the music player, the window switch, the door lock, the wiper assembly, and the touch-start mechanism for engine starting. Further, in an implementation, the touch-gesture patterns may be made on the touch pad 102 by hand or by using a stylus pen. Moreover, more than one touch pad 102 may be employed in the same vehicle.

[0024] Fig. 2 illustrates a block diagram representation of the system 100 of Fig. 1, in accordance with an embodiment of the present subject matter. As mentioned previously, the system 100 is configured to recognize the touch-gesture pattern formed on the touch pad 102 and operate one or more devices on-board the vehicle, based on the recognized touch-gesture pattern.
According to an implementation, the microcontroller 104 includes a pressure determining module 200, a movement detection module 202, a recognition module 204, and an indicator activation module 206. As will be understood, the various modules may include programs or coded instructions that supplement applications or functions performed by the system 100.

[0025] In addition, the system 100 can include a repository 208. In an implementation, the actuation commands may be stored in the repository 208 as factory settings, during the configuration of the repository 208. The actuation commands may correspond to a variety of composite data patterns related to different touch-gesture patterns. For example, every touch-gesture pattern can include a unique set of numerical values in terms of the pressure value and the position coordinates on the touch-sensitive surface 106 of the touch pad 102. Accordingly, each touch-gesture pattern can be associated with one or more predefined actuation commands, and the microcontroller 104 is configured to execute such actuation commands upon identification of the touch-gesture pattern formed on the touch pad 102. In one example, the actuation commands may be configurable by the operator, whereas, in another example, the actuation commands can be preconfigured and stored in the repository 208. Further, it will be understood that although the repository 208 is shown external to the microcontroller 104, from where the microcontroller 104 retrieves the data and instructions for operation, the repository 208 can be provided internal to the microcontroller 104. Further, the repository 208 may serve for storing data that is processed, received, or generated as a result of the execution of one or more modules in the microcontroller 104.

[0026] As discussed previously, the microcontroller 104 of the system 100 is configured to recognize a touch-gesture pattern formed by the operator on the touch pad 102. As will be understood, the microcontroller 104 is connected to the touch pad 102 and can receive inputs from touch pad sensors 210 of the touch pad 102 for recognizing the touch-gesture pattern formed on the touch pad 102. In addition, according to an embodiment of the present subject matter, the touch pad 102 can include an indicating device 212 which can indicate a currently operational device or a function on-board the vehicle, operable through the touch pad 102. In an example, the indicating device 212 can include a matrix of light emitting diode (LED) indicators provided on a printed circuit board (PCB), referred hereinafter as an LED matrix, and connected to the microcontroller 104 for operation. In said example, the touch-sensitive surface 106 of the touch pad 102 can be a substantially opaque surface, with a plurality of holes. The plurality of holes facilitates the visibility of the indicating device 212, such as the LED matrix, to the operator. As will be understood, the plurality of holes on the touch-sensitive surface 106 and the components of the indicating device 212 are aligned with respect to each other, in the assembled state. In another example, the indicating device 212 can include a liquid crystal display (LCD) monitor, and can be provided with the touch-sensitive surface 106 of the touch pad 102. In such a case, the touch-sensitive surface 106 can be substantially transparent. The indicating device 212, in an example, can be provided based on the type and class of the vehicle.

[0027] In an implementation, an operator may form the touch-gesture pattern by pressing his or her finger against the touch-sensitive surface 106 of the touch pad 102. Such a touch-gesture pattern may be made either by sliding a finger or a stylus on the touch-sensitive surface in one direction, or by making other geometrical shapes and figures, such as a square, a circle, or a triangle, thereon.

[0028] Further, as mentioned earlier, the touch pad sensors 210 receive the touch-gesture pattern formed by the operator on the touch-sensitive surface 106 of the touch pad 102, and accordingly, provide the information regarding the touch-gesture pattern to the microcontroller 104. In one implementation, the touch pad sensors 210 may include a motion detection sensor. Such a motion detection sensor may have an embedded electronic system for calculating the size and trace of the applied touch-gesture pattern.

[0029] Further, in an implementation, the touch pad sensors 210 can also include pressure sensors which provide information regarding the pressure exerted by the operator on the touch pad 102 while making the touch-gesture pattern. In an example, the touch pad sensors 210 can generate an intermediate code corresponding to the touch-gesture pattern formed on the touch pad 102 and communicate the intermediate code to the microcontroller 104. As will be understood, the intermediate code can include information regarding the pattern of the touch-gesture and the pressure exerted on the touch-sensitive surface while making the pattern. Further, the microcontroller 104 can analyze or decode the intermediate code received from the touch pad 102.

[0030] In an implementation, based on the intermediate code, the pressure determining module 200 determines the amount of pressure exerted by the operator while making the touch-gesture pattern. In an example, the amount of pressure exerted by the operator onto the touch pad 102, while applying the touch-gesture pattern, is compared with a pre-defined threshold value by the pressure determining module 200. Accordingly, the pressure determining module 200 can reject the touch-gesture pattern in case the applied pressure is below the pre-defined threshold value.
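The threshold comparison performed by the pressure determining module can be sketched as a minimal acceptance check; the function name and threshold value are hypothetical, chosen only to illustrate the rejection of low-pressure (accidental) touches described above.

```python
# Hypothetical sketch of the pressure determining module's check:
# touch-gesture patterns formed with less than a pre-defined threshold
# pressure are rejected. The threshold value is an illustrative assumption.
PRESSURE_THRESHOLD = 0.5  # assumed units; not specified in the disclosure

def accept_gesture(exerted_pressure):
    """Return True if the gesture was formed with sufficient pressure,
    False if it should be rejected as an accidental touch."""
    return exerted_pressure >= PRESSURE_THRESHOLD
```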

[0031] Further, upon the determination by the pressure determining module 200 that the pressure exerted while forming the pattern is above the predefined threshold, the movement detection module 202 determines the design or trace of the touch-gesture pattern. For example, the movement detection module 202 can analyze the position coordinates corresponding to the touch-gesture pattern, say, as provided in the intermediate code.

[0032] Additionally, the movement detection module 202 can determine one or more contact parameters associated with the touch-gesture pattern to determine the directional movement associated with the touch-gesture pattern. The contact parameters can include position coordinates associated with the touch-gesture pattern. Based on the position coordinates, the movement detection module 202 can determine the directional movement achieved while forming the touch-gesture pattern.

[0033] In addition, the contact parameters can include the type of contact made with the touch-sensitive surface 106 when the touch-gesture pattern is formed, i.e., whether the touch-gesture pattern was made by a point contact or a sliding contact made by the operator, and accordingly, the function on-board the vehicle can be executed. For example, a point touch-gesture, at a certain predefined location on the touch pad 102, can signify operation of door locking or window operation. In an example, the type of contact of the touch-gesture can be determined based on the position coordinates. Say, if the intermediate code includes a single pair of X and Y coordinates corresponding to the position of the touch-gesture pattern, in a two-dimensional X-Y plane, then the movement detection module 202 can determine that the touch-gesture pattern is made by a point contact. On the other hand, if the intermediate code includes more than one pair of X-Y coordinates, the movement detection module 202 determines that the operator has made a sliding contact while making the touch-gesture pattern. Accordingly, if there is more than one pair of position coordinates in an instantaneous intermediate code, then the movement detection module 202 determines that an intended directional movement of the corresponding touch-gesture pattern is achieved on the touch pad 102. Such rules can be stored in the repository 208 for use by the microcontroller 104.
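The point-versus-sliding rule above reduces to counting the coordinate pairs carried by the intermediate code. The sketch below illustrates that rule; the function names and the list-of-pairs representation of the intermediate code are assumptions made for illustration.

```python
# Sketch of the movement detection module's contact-type rule: one (x, y)
# pair in the intermediate code means a point contact, more than one pair
# means a sliding contact. Names and data representation are hypothetical.
def contact_type(coordinates):
    """coordinates: list of (x, y) pairs decoded from the intermediate code."""
    if not coordinates:
        return None  # no contact recorded
    return "point" if len(coordinates) == 1 else "sliding"

def has_directional_movement(coordinates):
    """An intended directional movement exists only for a sliding contact."""
    return contact_type(coordinates) == "sliding"
```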

[0034] Further, in an example, on the basis of the determined contact parameters, the movement detection module 202 can further determine the trace of the touch-gesture pattern. Such operation of the movement detection module 202 enables ascertaining whether the touch-gesture pattern is a geometrical figure (as in the case of the switching mode of operation of the system 100) or a linguistic character or a string (as in the case of the write pad mode of the system 100).

[0035] The results obtained from an evaluation of the touch-gesture pattern, done by the pressure determining module 200 and the movement detection module 202, are communicated to the recognition module 204. Such results, in the form of data, are analyzed by the recognition module 204. Based upon the analysis of the information, the recognition module 204 generates a composite data pattern, and in turn, on the basis of this composite data pattern, the recognition module 204 retrieves a related actuation command from the repository 208 from among a plurality of actuation commands. For example, based on the analysis of the touch-gesture pattern received from the operator, the recognition module 204 can identify the device or the function of the vehicle that the operator intends to operate, say, activate or deactivate.

[0036] In an implementation, based upon the analysis of the inputs from the pressure determining module 200 and the movement detection module 202, the recognition module 204 identifies the touch-gesture pattern by comparison with pre-configured touch-gesture patterns previously stored in the repository 208. Once the touch-gesture pattern formed by the operator has been identified, the recognition module 204 determines the device or function on-board the vehicle that the operator is operating, and the kind of operation that the operator wants to execute, say, activation or deactivation of the device or function. Based on such a determination, the recognition module 204 further determines the actuation command related to that instantaneous touch-gesture pattern and then executes the actuation command to perform the operation that the operator intends to execute. On the basis of this execution, the recognition module 204 operates one or more devices in the vehicle, associated with the identified touch-gesture pattern.
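The recognition step described above can be sketched as a lookup of a composite data pattern against pre-configured patterns in the repository. The tuple structure, pattern entries, and command names below are illustrative assumptions only; the disclosure does not specify the repository's data layout.

```python
# Hypothetical sketch of the recognition module: a composite data pattern
# (contact type, directional movement, pressure band) is compared against
# pre-configured patterns in the repository, and the associated actuation
# command is retrieved. All entries are illustrative, not from the patent.
REPOSITORY = {
    ("sliding", "left_to_right", "low"): "ACTIVATE_RIGHT_INDICATOR",
    ("sliding", "right_to_left", "low"): "ACTIVATE_LEFT_INDICATOR",
    ("point", None, "high"): "TOGGLE_ENGINE",
}

def recognize(composite_pattern):
    """Return the stored actuation command for the composite data pattern,
    or None if the pattern matches no pre-configured entry (invalid)."""
    return REPOSITORY.get(composite_pattern)
```

A pattern that matches no stored entry yields no command, which corresponds to the validity check recited in claim 9.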

[0037] In one embodiment, neural network algorithms may be employed by the recognition module 204 for fetching different commands from the repository 208. In case a variety of operators use the same vehicle, neural network algorithms may additionally be utilized by the microcontroller 104 for storage of one or more operator identifications.

[0038] In a situation, various vehicle operators using the same recognition system 100 may have different ways of applying touch-gesture patterns on the touch pad 102. In other words, each operator tends to apply the touch-gesture patterns in his or her own characteristic manner. In order to interpret touch-gesture patterns of various characteristics, an operator authentication may be performed at the beginning of a driving session. Such operator authentication may be performed with the help of stored profiles of operators who generally drive the vehicle. Moreover, the information related to characteristic gestures exhibited by these operators may be pre-entered into the microcontroller 104. In another implementation, all of the operators may use a common pre-defined gesture for a given operation.

[0039] Further, the repository 208 of the system 100 may also be employed for storage of other parameters apart from the actuation commands. As an example, if the keypad mode of the touch pad 102 is implemented as a password-driven authentication means, then the repository 208 may also be employed to store the password in its memory. Accordingly, the keypad mode of operation may be employed to switch ON or OFF any device within the vehicle with password protection. In the present embodiment, the touch pad 102 in the keypad mode is segmented into different zones and may accordingly have different numerical digits engraved in different zones. In another implementation, alphanumeric and special characters may be engraved on the touch pad 102, and accordingly a touch-gesture pattern may include a sequence of such alphanumeric or special characters. For example, an operator may use a sequence of letters and numbers, such as "A12" or "A&B", as a password to operate the system 100 configured in the keypad mode.
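The password-driven authentication described for the keypad mode can be sketched as follows. The function name and the use of a constant-time comparison are assumptions made for illustration; the disclosure only states that a password is stored in the repository and checked:

```python
import hmac

def check_password(entered: str, stored: str) -> bool:
    """Compare the key sequence entered on the keypad-mode touch pad
    with the stored password. hmac.compare_digest performs a
    constant-time comparison, avoiding timing side channels."""
    return hmac.compare_digest(entered, stored)
```

A sequence such as "A12" or "A&B", assembled from the zones touched by the operator, would be passed as `entered`.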

[0040] Additionally, the indicator activation module 206 is configured to determine the device or function being operated, say activated or deactivated, in response to the execution of the actuation command. In an example, the indicator activation module 206 can be configured to determine the device or function being operated based on the actuation command retrieved by the recognition module 204 to operate that device or function. In another example, the indicator activation module 206 can receive prompts from the recognition module 204 for identifying the device or function being operated. According to an embodiment, the indicator activation module 206 is configured to activate the indicating device 212, based on which device or function is being operated. For example, the indicator activation module 206 can selectively illuminate certain LEDs from the LED matrix to depict the device in operation. In such an example, the indicator activation module 206 can activate one or more pre-configured code sets for the LED matrix so as to activate the indicating device 212 and indicate to the operator the device and the operation being executed.
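The pre-configured code sets for the LED matrix mentioned above can be pictured as a lookup from a device name to the set of LED coordinates to illuminate. The device names, matrix size, and coordinates below are hypothetical illustrations, not values from the disclosure:

```python
# Hypothetical 5x5 LED matrix; each code set is a set of (row, col)
# coordinates for LEDs to illuminate when a given device is in operation.
LED_CODE_SETS = {
    "RIGHT_TURN_INDICATOR": {(r, 4) for r in range(5)},  # rightmost column
    "LEFT_TURN_INDICATOR": {(r, 0) for r in range(5)},   # leftmost column
}

def leds_for(device: str) -> set:
    """Return the LED coordinates pre-configured for a device,
    or an empty set if no code set is configured."""
    return LED_CODE_SETS.get(device, set())
```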

[0041] Additionally, the indicating device 212 can be activated by the indicator activation module 206, based on pre-configured rules, to illuminate the touch-sensitive region of the touch pad 102. Such illumination of the touch pad 102 allows the operator to easily identify the usable region of the touch pad 102, and also allows the operator to identify the initial position on the touch pad 102 to form the touch-gesture pattern.

[0042] In addition, the indicator activation module 206 can further be configured to generate an alert or indication of a successfully analyzed touch-gesture pattern for execution of an associated operation. Similarly, in case of an incorrect touch-gesture pattern, a failure alert is exhibited by the indicator activation module 206, say on a display panel (not shown) connected to the touch-gesture recognition system 100. In addition, in case the touch-gesture pattern is intended for an already ongoing operation, the recognition module 204 can determine such a situation and an appropriate failure alert can then be provided to the operator, say on the display panel. As will be understood, such a failure alert, in this case, can occur in spite of a successful touch-gesture pattern being formed by the operator on the touch pad 102. The failure alert generated by the indicator activation module 206 in this case may be of a different type than the failure alert in case of the incorrect touch-gesture pattern.

[0043] Fig. 3a, 3b, 3c, 3d, and 3e illustrate various views of the touch pad 102 of the system 100, in an embodiment of the present subject matter.

[0044] Fig. 3a, 3b and 3c depict various configurations of the touch pad 102 by which the indicating device 212, such as the LED matrix, is activated for the convenience of the user. As aforementioned, under the switch mode of operation, the touch-gesture pattern may be made anywhere on the touch-sensitive surface 106 of the touch pad 102. According to an implementation, the touch-gesture patterns may be location or position sensitive, and the operation being implemented may depend on the location of the touch-gesture pattern made on the touch pad 102. For instance, in the switch mode of operation of the touch pad 102, a straight line may be drawn in the middle to execute an actuation command that activates a headlight, while a straight line drawn in proximity to either the right-hand or the left-hand edge of the touch pad 102 can activate the right or left turn indicator, respectively.
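The location-sensitive behavior in the example above, the same stroke mapping to different commands depending on where it is drawn, can be sketched as a simple horizontal-position check. The pad width, the 25%/75% boundaries, and the command names are hypothetical choices for illustration:

```python
def classify_line(x: float, pad_width: float = 100.0) -> str:
    """Map the horizontal position of a vertical stroke on the touch pad
    to a command: near the left edge, near the right edge, or the middle.
    Boundary fractions are illustrative assumptions."""
    if x < 0.25 * pad_width:
        return "LEFT_TURN_INDICATOR"
    if x > 0.75 * pad_width:
        return "RIGHT_TURN_INDICATOR"
    return "HEADLIGHT"
```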

[0045] In addition, the operation of the touch pad 102 in the keypad mode can also be location sensitive as the touch-sensitive surface of the touch pad 102 is segmented, and hence, the location of the touch-gesture pattern determines the operation or function on-board the vehicle being implemented. For example, in the keypad mode, a point contact may be made by the operator by touching the touch pad 102 with the help of a finger or a stylus pen within a particular zone or segment. This point contact is applied by the operator at any zone of the touch-sensitive surface of the touch pad 102. Accordingly, the touch-sensitive surface 106 of the touch pad 102 can be segmented to function as a standard keypad, such as a QWERTY keyboard or a standard modern telephone dialing pad.

[0046] Further, each point contact achieved by the operator on the touch pad 102 and the location of the contact can be associated with a function or device. For example, the symbols associated with the zones may include numerical digits. In an implementation, these symbols may be alphanumeric symbols or special characters or a combination of both. Each symbol may correspond to a predetermined actuation command. For example, the numeral "1" may be entered as the input by making a point contact over the zone bearing the numeral "1", so as to activate a horn of the vehicle. Similarly, the numeral "2" may be entered to activate a head lamp. Likewise, other zones provided on the touch-sensitive surface 106 of the touch pad 102 may also be contacted by the operator so as to select those symbols that are desired by the operator.
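The zone segmentation described above, mapping a point contact to a symbol, can be sketched for a telephone-style dialing pad as below. The pad dimensions and the 3x4 layout are illustrative assumptions; the disclosure only states that the surface is segmented into zones bearing symbols:

```python
# Hypothetical 3x4 telephone-style keypad layout on the touch pad.
KEYPAD = [["1", "2", "3"],
          ["4", "5", "6"],
          ["7", "8", "9"],
          ["*", "0", "#"]]

def zone_symbol(x: float, y: float,
                width: float = 90.0, height: float = 120.0) -> str:
    """Map the (x, y) position of a point contact to the symbol engraved
    in that zone, assuming a uniform 3-column by 4-row segmentation."""
    col = min(int(x / (width / 3)), 2)
    row = min(int(y / (height / 4)), 3)
    return KEYPAD[row][col]
```

Each returned symbol would then be looked up against its predetermined actuation command, e.g. "1" for the horn.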

[0047] In an implementation, the touch pad 102 can operate in the write-pad mode, which is location insensitive; in such a mode the touch pad 102 operates with a sliding touch. The touch-gesture patterns under the write-pad mode include a signature or a scribbled letter applied on the touch-sensitive surface 106 of the touch pad 102. Any pre-defined linguistic character or string may be scribbled within the periphery of the touch-sensitive surface 106 of the touch pad 102. Such a mode of operation of the touch pad 102 can allow the system 100 to use effective password-driven authentication for operation.

[0048] Further, the touch-gesture pattern in the switch mode may include various geometrical shapes and figures, while in the write-pad mode, characters or a string including several characters may be entered. Therefore, the write-pad mode and the switch mode differ in terms of the applicable touch-gesture patterns. Hence, as will be understood, in the operation of the system 100, the location of the touch-gesture pattern on the touch-sensitive surface 106 can determine the device or function of the vehicle activated or deactivated by the operator. The touch pad 102, as described herein, provides direction for the operator to operate the system 100. The indicator activation module 206 activates the indicating device 212 to allow the user to identify the location on the touch pad for making the touch-gesture pattern.

[0049] For example, fig. 3a illustrates an implementation of the touch pad 102 in which the peripheral portion of the indicating device 212 is activated by the indicator activation module 206 for the convenience and ease of use of the operator. In said example, the indicator activation module 206 can activate the peripheral LEDs and central LEDs of the LED matrix to illuminate the touch-sensitive surface 106 of the touch pad 102 as shown in fig. 3a. Further, fig. 3b illustrates another implementation in which the indicating device 212 is activated to form a cross-like pattern on the touch-sensitive surface 106 of the touch pad 102. In an example, the indicator activation module 206 can illuminate the LEDs on the LED matrix to illuminate a cross-like pattern on the touch pad 102. Further, as shown in fig. 3c, the indicator activation module 206 can activate the indicating device 212. For example, the entire LED matrix can be illuminated, as shown in fig. 3c.
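The cross-like illumination pattern of fig. 3b can be generated programmatically as the union of the middle row and middle column of the LED matrix. The 5x5 matrix size is an assumption for illustration; the disclosure does not specify the matrix dimensions:

```python
def cross_pattern(rows: int = 5, cols: int = 5) -> set:
    """Return the set of (row, col) LED coordinates forming a cross
    through the center of the matrix: the middle row plus the
    middle column."""
    mid_r, mid_c = rows // 2, cols // 2
    horizontal = {(mid_r, c) for c in range(cols)}
    vertical = {(r, mid_c) for r in range(rows)}
    return horizontal | vertical
```

For a 5x5 matrix this yields nine LEDs, since the center LED belongs to both the row and the column.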

[0050] Further, as mentioned previously, the indicator activation module 206 can determine the device or function being currently operated, based on the touch-gesture pattern identified by the recognition module 204, and accordingly, the indicator activation module 206 can selectively activate the indicating device 212. The activation of the indicating device 212 is done to indicate the device or function of the vehicle in operation, to the operator. Fig. 3d and 3e illustrate the touch pad 102, with the indicating device 212, such as the LEDs on the LED matrix, activated to depict the function in operation. For example, fig. 3d illustrates the pattern of illumination of the LEDs to depict that the right turn indicator of the vehicle is activated, and fig. 3e illustrates the pattern of illumination of the LEDs to depict that the left turn indicator of the vehicle is activated.

[0051] Fig. 4 illustrates a method 400 for achieving operations based on touch-gesture pattern recognition on-board a vehicle, according to an embodiment of the present subject matter. In an implementation, the method 400 is implemented on the touch-gesture recognition system 100, and is executed by the microcontroller 104 of the touch-gesture recognition system 100. In one implementation, the method 400 can be employed by the microcontroller 104 to recognize the touch-gesture pattern, when the touch pad 102 operates under either of the aforementioned modes.

[0052] The method 400 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, and modules. Further, the order in which the method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method, or an alternative method. Additionally, individual blocks may be deleted from the method 400 without departing from the spirit and scope of the subject matter described herein. Furthermore, the method 400 can be implemented in any suitable hardware, software, firmware, or combination thereof.

[0053] At block 402, a touch-gesture pattern is received on a touch pad 102 of a touch-gesture recognition system 100, on-board a vehicle. In an implementation, the touch-gesture pattern is received on a touch-sensitive surface 106 of the touch pad 102. Further, the touch-gesture pattern can be formed by an operator using one of the digits of the hand, i.e., a finger, or by using a stylus. In an implementation, the touch-gesture pattern can be converted into an intermediate code by touch pad sensors 210 and provided to the microcontroller 104 of the touch-gesture recognition system 100.

[0054] Further, at block 404 the touch-gesture pattern is identified. In an implementation, to identify the touch-gesture pattern, a pressure exerted on the touch pad 102 while forming the touch-gesture pattern is determined. In addition, the pressure value associated with the intermediate code is compared with a predetermined threshold pressure value P. Thus, it is determined whether the value of pressure for any instantaneous touch-gesture pattern is below or above the threshold value P. Any pressure below the threshold value may be regarded as null. In an example, the above-mentioned determining and comparison of the pressure associated with the touch-gesture pattern can be achieved by the pressure determining module 200.

[0055] Further, in an implementation, at block 404, to identify the touch-gesture pattern, it is ascertained whether there is any directional movement associated with the applied touch-gesture pattern. For this purpose, one or more contact parameters, such as the position coordinates associated with the touch-gesture pattern, are considered. In said example, on the basis of the presence of more than one pair of position coordinates, the shape and size of the instantaneous touch-gesture pattern may be ascertained, say by the movement detection module 202. As discussed before, if a single pair of position coordinates is present in the data pattern, then the corresponding touch-gesture pattern is considered a point contact having no directional movement.
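The two checks at block 404, comparing the pressure against the threshold P and counting the position-coordinate pairs to distinguish a point contact from a sliding one, can be sketched together. The threshold value and the return labels are hypothetical; the disclosure does not give a numeric value for P:

```python
PRESSURE_THRESHOLD = 0.2  # hypothetical threshold P (arbitrary units)

def classify_contact(pressure: float, coords: list) -> str:
    """Classify a touch-gesture sample from its pressure value and its
    list of (x, y) position-coordinate pairs:
    - below-threshold pressure is regarded as null,
    - a single coordinate pair is a point contact,
    - multiple pairs indicate directional (sliding) movement."""
    if pressure < PRESSURE_THRESHOLD:
        return "null"
    return "point" if len(coords) == 1 else "sliding"
```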

[0056] According to an aspect of the present subject matter, the recognition module 204 generates a composite data pattern based upon the pressure data and the contact parameters, say to identify the type of contact (point or sliding) associated with the touch-gesture pattern. Further, based on the pressure value and the directional movement associated with the touch-gesture pattern, the touch-gesture pattern can be identified, say by the recognition module 204. In an example, based on various combinations of the pressure value and the contact parameters, a touch-gesture pattern can be identified from among a plurality of touch-gesture patterns stored in the repository 208. As explained previously, each touch-gesture pattern can be associated with one or more operation sets, i.e., operations responsible for switching ON or OFF the devices or functions on-board the vehicle.

[0057] At block 406, at least one actuation command can be identified and executed based on the identification of the touch-gesture pattern, to operate one or more vehicle peripheral devices on-board the vehicle. In an implementation, to identify the actuation command, first the operation set, from the plurality of operation sets associated with the touch-gesture pattern, can be determined by the recognition module 204. Further, the recognition module 204 retrieves the actuation command associated with that operation set from the repository 208. In another example, the touch-gesture pattern can be associated with more than one operation set. In such a case, the most relevant operation set is identified, say based on user-defined rules. Further, the actuation command is selected based on the most relevant operation set. In yet another example, the different actuation commands associated with the various operation sets associated with the touch-gesture pattern can be retrieved from the repository 208, and one of the actuation commands is determined for execution, based on user-defined rules. Further, the actuation command is executed by the touch-gesture recognition system 100, to perform an operation on-board the vehicle. The operation can be executed by operating, say activating or deactivating, the vehicle peripheral devices. In an example, the vehicle peripheral devices can be the horn, the high beam and low beam lamps, the dipper, the head and tail lights, the indicators, the air-conditioner, the music player, the window switch, the door lock, the wiper assembly, and a touch-start mechanism for engine starting.
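The selection logic at block 406, resolving a pattern associated with several operation sets by applying user-defined rules, can be sketched as follows. The pattern names, operation sets, commands, and rule format are all hypothetical illustrations:

```python
from typing import Optional

# Hypothetical mapping: pattern -> list of (operation_set, actuation_command).
OPERATION_SETS = {
    "double_tap": [("horn", "SOUND_HORN"), ("engine", "TOUCH_START")],
    "circle": [("lights", "TOGGLE_HEADLIGHT")],
}

# Hypothetical user-defined rule: preferred operation set per pattern.
USER_RULES = {"double_tap": "horn"}

def select_command(pattern: str) -> Optional[str]:
    """Pick one actuation command for a pattern. With a single candidate,
    return it directly; with several, prefer the operation set named in
    the user-defined rules, falling back to the first candidate."""
    candidates = OPERATION_SETS.get(pattern, [])
    if not candidates:
        return None
    if len(candidates) == 1:
        return candidates[0][1]
    preferred = USER_RULES.get(pattern)
    for op_set, cmd in candidates:
        if op_set == preferred:
            return cmd
    return candidates[0][1]
```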

[0058] Further, in case the touch-gesture pattern is unacceptable, for example, if the pressure exerted while forming the touch-gesture pattern is below the threshold value P, then a failure may be signified to the operator, say with the help of a display panel associated with the system 100. Additionally, in another example, if a relevant actuation command corresponding to the touch-gesture pattern is not found within the repository 208, then the operator is alerted of the failure of the operation.

[0059] Further, according to an implementation, at block 408, the device or function being operated, say activated or deactivated, is determined in response to the execution of the actuation command. In an example, the indicator activation module 206 can be configured to determine the device or function being operated based on the actuation command retrieved by the recognition module 204 to operate that device or function. According to an embodiment, the indicator activation module 206 is configured to activate the indicating device 212, based on which device or function is being operated. For example, the indicator activation module 206 can selectively illuminate certain LEDs from the LED matrix to depict the device in operation. In such an example, the indicator activation module 206 can activate one or more pre-configured code sets for the LED matrix to activate the indicating device 212, indicating to the operator the device and the operation being executed.

[0060] The previously described versions of the subject matter and equivalents thereof have many advantages, including those described below.

[0061] The touch-gesture recognition system 100 can be a centralized switching system that activates or deactivates various devices in the vehicle. Accordingly, the need to implement numerous independent switches for each device is eliminated. Hence, the touch-gesture recognition system 100 eliminates the confusion caused to the operator by the simultaneous usage of a number of switches while driving the vehicle. In addition, the time elapsed between the usage of a switch and the performance of the concerned operation is minimized.

[0062] The touch pad 102 occupies less space and can be placed practically anywhere within the vehicle. Moreover, the present touch pad 102 is not prone to mechanical wear and tear and can be used reliably for years. Further, the provision of the indicating device 212 and its operation with the system 100 enable the operator to operate the system conveniently. In addition, with such provision, the operator is less prone to making erroneous touch-gesture patterns while operating the various devices and functions on the touch pad 102. As a result, smooth and effective operation of the system 100 is facilitated by the provision of the indicating device 212 and the activation of the indicating device 212 by the microcontroller 104.

[0063] Although the subject matter has been described in considerable detail with reference to certain embodiments thereof, other embodiments are possible. It is to be understood that the appended claims are not necessarily limited to the features described herein. Rather, the features are disclosed as embodiments of touch-gesture pattern recognition-based operations on-board a vehicle.