

Title:
OPERATION DEVICE AND OPERATION METHOD
Document Type and Number:
WIPO Patent Application WO/2016/001730
Kind Code:
A1
Abstract:
An operation device includes a touch operation unit and a controller. In at least a part of a period of time during a slide operation, the controller maintains an image displayed on a screen of a display device without moving the image, and, based on a positional relationship between an initial contact position of a finger at a beginning of the slide operation and a current contact position of the finger, outputs on the displayed image movement information (98) indicating a movement mode in which the displayed image is moved on the screen in a case where the slide operation terminates at the current contact position of the finger. When the slide operation terminates, the controller moves the displayed image based on a positional relationship between the initial contact position of the finger and a terminal contact position of the finger at a termination of the slide operation.

Inventors:
SHIKATA HIROSHI (JP)
HAMABE RYOTA (JP)
Application Number:
PCT/IB2015/001039
Publication Date:
January 07, 2016
Filing Date:
June 24, 2015
Assignee:
TOYOTA MOTOR CO LTD (JP)
International Classes:
G06F3/0488; G01C21/00; G01C21/36; G06F3/048; G06F3/0485; G06F3/0486
Foreign References:
US20140071130A12014-03-13
EP2444778A22012-04-25
JP2014006708A2014-01-16
Claims:
CLAIMS:

1. An operation device comprising

a touch operation unit including an operation surface provided separately from a display device, and

a controller configured to: (i) in at least a part of a period of time during which a slide operation in which a user slides a finger on the operation surface is performed, maintain an image displayed on a screen of the display device without moving the image, and, based on a positional relationship between an initial contact position of the finger in the operation surface at a beginning of the slide operation and a current contact position of the finger in the operation surface, output on the image displayed on the screen movement information indicating a movement mode in which the image displayed on the screen is moved on the screen in a case where the slide operation terminates at the current contact position of the finger; and (ii) when the slide operation terminates, move the image displayed on the screen based on a positional relationship between the initial contact position of the finger and a terminal contact position of the finger in the operation surface at a termination of the slide operation.

2. The operation device according to claim 1, wherein the movement mode includes a movement direction in which the image displayed on the screen is moved and a movement amount by which the image displayed on the screen is moved.

3. The operation device according to claim 1 or 2, wherein:

the image displayed on the screen is a map image; and

the controller is configured to determine, based on the positional relationship between the initial contact position of the finger and the current contact position of the finger, a position in the map image to be moved to a center position of the screen in the case where the slide operation terminates at the current contact position of the finger, and output the movement information indicating the determined position on the image displayed on the screen.

4. The operation device according to claim 3, wherein the movement information includes an image of an arrow that connects a current center position of the map image with the position in the map image to be moved to the center position of the screen in the case where the slide operation terminates at the current contact position of the finger.

5. The operation device according to claim 3, wherein the controller is configured to determine the position in the map image to be moved to the center position of the screen in the case where the slide operation terminates at the current contact position of the finger such that a distance and a direction from the initial contact position of the finger to the current contact position of the finger correspond to a distance and a direction from the center position of the screen to the position in the map image to be moved to the center position of the screen in the case where the slide operation terminates at the current contact position of the finger.

6. An operation method using a touch operation unit including an operation surface provided separately from a display device, and a controller, the method comprising:

in at least a part of a period of time during which a slide operation in which a user slides a finger on the operation surface is performed, maintaining an image displayed on a screen of the display device without moving the image, and, based on a positional relationship between an initial contact position of the finger in the operation surface at a beginning of the slide operation and a current contact position of the finger in the operation surface, outputting on the image displayed on the screen movement information indicating a movement mode in which the image displayed on the screen is moved on the screen in a case where the slide operation terminates at the current contact position of the finger; and when the slide operation terminates, moving the image displayed on the screen based on a positional relationship between the initial contact position of the finger and a terminal contact position of the finger in the operation surface at a termination of the slide operation.

Description:
OPERATION DEVICE AND OPERATION METHOD

BACKGROUND OF THE INVENTION

1. Field of the Invention

[0001] The invention relates to an operation device and an operation method.

2. Description of Related Art

[0002] There is known a scroll control device in which, in response to each flick operation, the displayed information is scrolled by a certain amount and then the scrolling is stopped (see Japanese Patent Application No. 2014-006708 (JP-A-2014-006708) for example).

[0003] In the configuration described in JP-A-2014-006708, however, since the scroll amount in response to each flick operation is constant, it may be difficult for a user to achieve the desired scroll operation. For example, the user has difficulty in achieving the desired scroll operation when the user wants to perform the scroll operation to arrange a specific point at the center of a screen but the constant scroll amount does not match a distance between a current position of the specific point and the center of the screen.

SUMMARY OF THE INVENTION

[0004] The invention provides an operation device and operation method with which the user can relatively easily achieve movement of an image displayed on a screen in the desired mode.

[0005] A first aspect of the invention relates to an operation device including a touch operation unit including an operation surface provided separately from a display device, and a controller. The controller is configured to: (i) in at least a part of a period of time during which a slide operation in which a user slides a finger on the operation surface is performed, maintain an image displayed on a screen of the display device without moving the image, and, based on a positional relationship between an initial contact position of the finger in the operation surface at a beginning of the slide operation and a current contact position of the finger in the operation surface, output on the image displayed on the screen movement information indicating a movement mode in which the image displayed on the screen is moved on the screen in a case where the slide operation terminates at the current contact position of the finger; and (ii) when the slide operation terminates, move the image displayed on the screen based on a positional relationship between the initial contact position of the finger and a terminal contact position of the finger in the operation surface at a termination of the slide operation.

[0006] A second aspect of the invention relates to an operation method using a touch operation unit including an operation surface provided separately from a display device, and a controller. The method includes: in at least a part of a period of time during which a slide operation in which a user slides a finger on the operation surface is performed, maintaining an image displayed on a screen of the display device without moving the image, and, based on a positional relationship between an initial contact position of the finger in the operation surface at a beginning of the slide operation and a current contact position of the finger in the operation surface, outputting on the image displayed on the screen movement information indicating a movement mode in which the image displayed on the screen is moved on the screen in a case where the slide operation terminates at the current contact position of the finger; and when the slide operation terminates, moving the image displayed on the screen based on a positional relationship between the initial contact position of the finger and a terminal contact position of the finger in the operation surface at a termination of the slide operation.

[0007] With the above configurations, the user can relatively easily achieve movement of the image displayed on the screen in the desired mode.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] Features, advantages, and technical and industrial significance of exemplary embodiments of the invention will be described below with reference to the accompanying drawings, in which like numerals denote like elements, and wherein:

FIG. 1 is a system diagram that shows a main configuration of a vehicular operation device 1 according to an embodiment of the invention;

FIG. 2 is a top view that schematically shows a touchpad 10;

FIG. 3 is a sectional view that schematically shows a section of a main part of the touchpad 10;

FIGs. 4A and 4B are illustrations that show examples of map images displayed on a display 20;

FIG. 5 is a schematic view that shows an example of an arrangement of the touchpad 10 and the display 20 in the operation device 1 according to the embodiment;

FIG. 6 is a diagram that shows an example of state transition implemented by a control unit 16 of the touchpad 10 and a display control unit 30;

FIG. 7 is an illustration that shows an example of a map image including a movement vector image;

FIGs. 8A and 8B are illustrations for generation of the movement vector image;

FIG. 9 is an illustration of an example of a map image after scroll processing is performed on the map image shown in FIG. 7; and

FIG. 10 is a diagram that shows another example of state transition implemented by the control unit 16 of the touchpad 10 and the display control unit 30.

DETAILED DESCRIPTION OF EMBODIMENTS

[0009] Hereinafter, an embodiment of the invention is described in detail with reference to the drawings.

[0010] FIG. 1 is a system diagram that shows a main configuration of a vehicular operation device 1 according to an embodiment of the invention. FIG. 2 is a top view that schematically shows a touchpad 10. FIG. 3 is a sectional view that schematically shows a section of a main part of the touchpad 10. Note that a hand of a user that operates the touchpad 10 is schematically shown in FIG. 2 and not shown in FIG. 3. FIGs. 4A and 4B are illustrations that show examples of map images displayed on a display 20. Specifically, FIG. 4A shows an example of a map image displayed in a cursor state and FIG. 4B shows an example of a map image displayed in a pointer state. FIG. 5 is a schematic view that shows an example of an arrangement of the touchpad 10 and the display 20 in the operation device 1 according to the embodiment.

[0011] A vehicular operation device 1 according to this embodiment includes a touchpad 10 as an example of a touch operation unit, a display 20, and a display control unit 30 as an example of a controller.

[0012] The touchpad 10 is provided at an appropriate location in a vehicle cabin. Preferably, the touchpad 10 is arranged at a location where the driver can operate it easily (or at a location within an arm's length of the driver while maintaining a driving position). The touchpad 10 may be arranged on a console box or in the vicinity of the console box, as shown in FIG. 5. As shown in FIG. 1, the touchpad 10 includes a coordinate detection unit 12, a pressure detection unit 14, a control unit 16, and a memory 18.

[0013] The coordinate detection unit 12 has a two-dimensional operation surface

(touch operation surface), which is a generally flat surface, as shown in FIG. 2. The coordinate detection unit 12 includes a capacitance sensor. Detection signals of the capacitance sensor are transmitted to the control unit 16. The coordinate detection unit 12 is constituted by, for example, a capacitance pad. The capacitance pad has a structure in which electrodes (the capacitance sensor) are arranged in a plane such that a first set of electrodes extend linearly in the x-direction, a second set of electrodes extend linearly in the y-direction, and an insulator is sandwiched between the first set of the electrodes and the second set of the electrodes. When a human finger approaches any of the electrodes with an insulator panel in between, a capacitor consisting of the electrode and the finger as conductive plates is formed, and electric charge on the electrode (and a capacitance) varies. In this case, detection signals of the electrode (signals indicating variation in charge on the electrode) may be transmitted to the control unit 16.

[0014] The coordinate detection unit 12 is configured to be movable in an up-down direction (the z-direction in FIG. 3). Any known mechanism can be employed for making the coordinate detection unit 12 movable in the up-down direction. In the example shown in FIG. 3, the coordinate detection unit 12 is supported by a substrate 60 via elastic members 54. A movable range of the coordinate detection unit 12 in the up-down direction may be set as necessary and may be very narrow.

[0015] The pressure detection unit 14 outputs a signal indicating downward movement of the coordinate detection unit 12. The pressure detection unit 14 is constituted by, for example, a tact switch or a pressure-sensitive sensor such as a piezoelectric element. The pressure detection unit 14 may be arranged at any location as long as the pressure detection unit 14 comes in contact with the coordinate detection unit 12 when the operation surface of the coordinate detection unit 12 moves downward. For example, the pressure-sensitive sensor constituting the pressure detection unit 14 is provided under a central portion of the coordinate detection unit 12 in the example shown in FIG. 3, but the pressure-sensitive sensor may be provided under a peripheral portion of the coordinate detection unit 12. Further, a plurality of pressure-sensitive sensors that constitute the pressure detection unit 14 may be provided at separate locations.

[0016] The control unit 16 and the memory 18 are constituted by one or more microcomputers.

[0017] The control unit 16 detects contact of the finger with the touch operation surface based on outputs (output signals) from the capacitance sensor of the coordinate detection unit 12. The control unit 16 generates a coordinate signal indicating a coordinate of a position in the touch operation surface, that is, a position in the operation surface at which the user touches the touch operation surface with the finger (a position in the operation surface at which the finger comes in contact with the touch operation surface). In a case where the coordinate detection unit 12 is constituted by the capacitance pad, as described above, electric charge is stored in a capacitor consisting of an electrode and the finger, and as a result, a variation amount in electric charge on each electrode varies depending on the position of the finger. Thus, the position of the finger can be specified based on detection signals from the electrodes. Specifically, the control unit 16 generates, when an output level from the coordinate detection unit 12 exceeds a predetermined reference value (a detection threshold), a coordinate signal based on a position of an electrode from which a detection signal having the maximum (local maximum) level is output. The predetermined reference value is a value related to, for example, a variation amount in electric charge on the electrode. When a variation amount (maximum electric charge variation amount) in electric charge on the electrode exceeds the reference value, the control unit 16 determines that the finger is in contact with the touch operation surface and generates a coordinate signal (indicating, for example, a two-dimensional position at which the variation amount in electric charge becomes maximum). 
On the other hand, when a variation amount in electric charge on the electrode does not exceed the reference value, the control unit 16 determines that the finger has no contact with the touch operation surface and does not generate a coordinate signal. The reference value may be stored in the memory 18. The generated coordinate signal is transmitted to the display control unit 30.
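The thresholding and peak-picking logic of paragraph [0017] can be sketched as follows. This is a minimal illustration, not the actual firmware of the control unit 16; the function name, the reference value, and the dictionary representation of electrode readings are all assumptions for the example.

```python
# Illustrative reference value for the charge-variation threshold
# (arbitrary units; the real value would be stored in the memory 18).
REFERENCE_VALUE = 50

def generate_coordinate_signal(charge_variation):
    """charge_variation: dict mapping (x, y) electrode positions to
    measured charge-variation amounts.

    Returns the position of the electrode with the maximum variation
    when that variation exceeds the reference value (finger in contact),
    or None when no electrode exceeds it (no coordinate signal)."""
    pos, peak = max(charge_variation.items(), key=lambda kv: kv[1])
    if peak <= REFERENCE_VALUE:
        return None  # finger not in contact; no coordinate signal generated
    return pos
```

For example, a reading where the electrode at (3, 4) shows the largest variation above the threshold yields the coordinate (3, 4), while readings that all stay below the threshold yield no signal.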

[0018] The control unit 16 generates a determination signal based on an output signal from the pressure detection unit 14. In a case where the pressure detection unit 14 is constituted by a pressure-sensitive sensor, the control unit 16 detects a determination operation performed by the user when an output (indicating a pressure for being pressed down) from the pressure-sensitive sensor exceeds a predetermined threshold Pn, and then generates a determination signal. In a case where the pressure detection unit 14 is constituted by a tact switch, the control unit 16 detects a determination operation performed by the user when an ON signal is input from the tact switch, and then generates a determination signal. The generated determination signal is transmitted to the display control unit 30.
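The determination-operation detection of paragraph [0018] reduces to a simple threshold comparison for the pressure-sensitive sensor case. A minimal sketch, assuming an illustrative threshold value and function name (the actual threshold Pn is implementation-specific):

```python
# Illustrative pressure threshold Pn (arbitrary units).
PRESSURE_THRESHOLD_PN = 2.0

def detect_determination(sensor_output):
    """For a pressure-sensitive sensor: a determination operation is
    detected (and a determination signal generated) when the sensor
    output exceeds the threshold Pn."""
    return sensor_output > PRESSURE_THRESHOLD_PN
```

For a tact switch, the same role is played by simply checking whether an ON signal has been received.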

[0019] The control unit 16 communicates with the display control unit 30 and transmits various information including a coordinate signal and a determination signal to the display control unit 30. A part of or whole function of the control unit 16 may be implemented by the coordinate detection unit 12.

[0020] The display 20 is arranged at a location apart from the touchpad 10, as shown in FIG. 5. The display 20 may be a known display device such as a liquid crystal display or a head-up display (HUD). The display 20 is arranged at an appropriate location in the vehicle cabin, and may be arranged on an instrument panel. The display 20 may be a touch panel display. In the embodiment, however, a display without a touch operation function is used as the display 20. A map image shown in FIG. 4 is displayed on the display 20. Television pictures, images captured by a perimeter monitoring camera, or the like may be displayed on the display 20 when the map image is not displayed.

[0021] The display control unit 30 is constituted by a microcomputer. The display control unit 30 may be embodied as an electronic control unit (ECU). Connection between the display control unit 30 and the touchpad 10 is established in a known mode. The connection may be wired connection, wireless connection, the combination of wired and wireless connections, direct connection, or indirect connection. A part of or whole function of the display control unit 30 may be implemented by the control unit 16 of the touchpad 10 or control unit (not shown in the drawings) provided in the display 20. Conversely, a part of or whole function of the control unit 16 of the touchpad 10 may be implemented by the display control unit 30.

[0022] As basic functions, the display control unit 30 synchronizes the display 20 with the touchpad 10 and supports the user's operation on the touchpad 10.

[0023] FIG. 6 is a diagram that shows an example of state transition implemented by the control unit 16 of the touchpad 10 and the display control unit 30. FIG. 6 shows an example of state transition when a map image is displayed on the display 20.

[0024] An operation mode implemented by the control unit 16 of the touchpad 10 and the display control unit 30 mainly includes a cursor mode and a scroll mode.

[0025] The cursor mode starts when output of the map image starts after an initial state (see an arrow P in FIG. 6), for example. When the display control unit 30 in a standby state receives a coordinate signal input from the touchpad 10 (an example of a condition A), the display control unit 30 makes transition to a touch-ON state from the standby state. The display control unit 30 makes transition from the touch-ON state to one of the cursor state and the pointer state depending on a positional relationship between a coordinate value indicated by the coordinate signal and an operation item displayed on the map image at a time of the transition to the touch-ON state. For example, when the coordinate value indicated by the coordinate signal at the time of the transition to the touch-ON state corresponds not to a position on an operation item displayed on the map image but to a position in a map region (an example of a condition C), the display control unit 30 makes transition from the touch-ON state to the pointer state. On the other hand, when the coordinate value indicated by the coordinate signal at the time of the transition to the touch-ON state corresponds to the position on the operation item displayed on the map image (an example of a condition B), the display control unit 30 makes transition from the touch-ON state to the cursor state. While the display control unit 30 continuously receives coordinate signals input from the touchpad 10, the display control unit 30 makes transition between the cursor state and the pointer state depending on variation in coordinate values indicated by the coordinate signals. For example, when the display control unit 30 in the cursor state receives a coordinate signal with a coordinate value corresponding to a position in the map region (an example of a condition K1), the display control unit 30 makes transition from the cursor state to the pointer state. 
On the other hand, when the display control unit 30 in the pointer state receives a coordinate signal with a coordinate value corresponding to a position on the operation item displayed on the map image (an example of a condition K2), the display control unit 30 makes transition from the pointer state to the cursor state.
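The cursor-mode transitions of paragraph [0025] (conditions A, B, C, K1, K2, plus the return to standby when signals stop) can be sketched as a small transition function. This is an illustration only; the state names follow the text, while `next_state` and the `on_item` predicate ("coordinate lies on an operation item") are assumed helpers, not names from the application.

```python
def next_state(state, coord, on_item):
    """state: one of 'standby', 'touch_on', 'cursor', 'pointer'.
    coord: current coordinate value, or None when no coordinate
    signal is being received. on_item: predicate telling whether
    coord falls on an operation item displayed on the map image."""
    if coord is None:
        return 'standby'  # input of coordinate signals stopped (condition D)
    if state == 'standby':
        return 'touch_on'  # condition A: coordinate signal received
    if state == 'touch_on':
        # conditions B / C: item position -> cursor state, map region -> pointer state
        return 'cursor' if on_item(coord) else 'pointer'
    if state == 'cursor' and not on_item(coord):
        return 'pointer'  # condition K1: coordinate moved into the map region
    if state == 'pointer' and on_item(coord):
        return 'cursor'  # condition K2: coordinate moved onto an operation item
    return state
```

A coordinate on an operation item drives the machine toward the cursor state, a coordinate in the map region toward the pointer state, and loss of input returns it to standby.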

[0026] In the cursor state, a cursor 80 (see FIG. 4A) is drawn on the map image displayed on the display 20. In the cursor state, the display control unit 30 moves a position of the cursor 80 between selection items based on coordinate signals from the touchpad 10. When the display control unit 30 in the cursor state receives a determination signal from the touchpad 10 (an example of a condition E), the display control unit 30 makes transition from the cursor state to a button selected state. In the button selected state, the display control unit 30 performs a determination processing. When the determination operation is performed in the example shown in FIG. 4A, the display control unit 30 may display an image for setting a destination. When input of coordinate signals from the touchpad 10 stops (an example of a condition D), the display control unit 30 makes transition from the cursor state to the standby state.

[0027] In the pointer state, a pointer 92 (see FIG. 4B) that is movable according to the user's operation performed on the touchpad 10 is drawn on the map image displayed on the display 20. In the pointer state, the display control unit 30 moves the pointer 92 on the map image based on a coordinate signal from the touchpad 10. When the display control unit 30 in the pointer state receives a determination signal from the touchpad 10 (an example of a condition F), the display control unit 30 makes transition from the pointer state to the scroll mode. The display control unit 30 operates in a scroll standby state at a time of the transition to the scroll mode. When the display control unit 30 in the scroll standby state detects an operation performed on a back button, the display control unit 30 returns to the pointer state.

[0028] In the scroll standby state, the display control unit 30 generates a map image for scrolling. The map image for scrolling may be substantially identical to the map image displayed in the pointer state, but in this case, operation items are preferably hidden or invalidated. FIG. 7 shows an example of the map image for scrolling. In the example shown in FIG. 7, a cursor 96 instead of the pointer 92 drawn in the pointer state is drawn on the map image for scrolling, along with a movement vector image 98, which will be described later. Alternatively, the pointer 92 may be continuously used instead of the cursor 96. In addition, in the example shown in FIG. 7, an information item 99 indicating a distance on the map image from a host vehicle position image 91 to the cursor 96 is drawn on the map image for scrolling. The cursor 96 is always displayed at a center position of a screen of the display 20. Thus, a position of the cursor 96 does not change even when the map image is scrolled. At the transition from the pointer state to the scroll standby state, the display control unit 30 moves (scrolls) the map image such that a position of the pointer 92 in the map image coincides with the center position of the screen of the display 20 (the position of the cursor 96). The center position of the screen of the display 20 is not necessarily the exact center of the screen of the display 20, and is a position within a predetermined range around the exact center.

[0029] When the display control unit 30 in the scroll standby state detects a determination operation (an operation to press the touchpad 10), the display control unit 30 may output location information about a location in the map image, at which the cursor 96 is currently positioned, and/or location information about the periphery of the location, such as a point of interest (POI). With this configuration, the user who performs the determination operation (designation of a location) when the cursor 96 is positioned on the desired position in the map image in the scroll mode can access location information about the location and/or location information about the periphery of the location. In addition, when the determination operation is made, the display control unit 30 may make transition to one of the pointer state and the cursor state in the cursor mode, and may display on the map image a selection item for each location. In this case, when the display control unit 30 detects a determination operation with a selection item for a specific location being selected (in the cursor state), then the display control unit 30 may output detailed information about the specific location.

[0030] When the display control unit 30 in the scroll standby state receives a coordinate signal from the touchpad 10 (an example of a condition G), the display control unit 30 makes transition from the scroll standby state to a movement vector image displayed state, and keeps operating in the movement vector image displayed state until input of coordinate signals stops (an example of a condition H). That is, the display control unit 30 operates in the movement vector image displayed state during a slide operation in which the user slides the finger on the operation surface of the touchpad 10. The slide operation refers to a series of operations in which the user touches the operation surface of the touchpad 10 with the finger, moves the finger on the operation surface while keeping contact of the finger with the operation surface, and then takes the finger off the operation surface. The slide operation includes a drag operation and a flick operation.

[0031] The display control unit 30 in the movement vector image displayed state (during a slide operation) maintains the image currently displayed on the screen of the display 20 without moving the image, and, based on a coordinate signal initial value (a position in the operation surface, at which the finger comes in contact with the operation surface at a beginning of the slide operation, which will be referred to as an initial contact position of the finger) and a coordinate signal current value (a position in the operation surface, at which the finger is currently in contact with the operation surface, which will be referred to as a current contact position of the finger), outputs movement information indicating a movement mode, as a mode in which the image displayed on the screen is moved on the screen, in a case where the slide operation terminates at the current contact position of the finger. 
In the example shown in FIG. 7, the display control unit 30 draws a movement vector image 98 on the map image based on the coordinate signal initial value and the coordinate signal current value. At this time, the display control unit 30 does not move (scroll) the map image. The display control unit 30 draws the movement vector image 98 such that the starting point of the movement vector image 98 is positioned at the position of the cursor 96. The display control unit 30 determines the ending point of the movement vector image 98 based on a relationship between the coordinate signal initial value and the coordinate signal current value. In this case, the display control unit 30 may operate in a relative coordinate mode. In the relative coordinate mode of the movement vector image displayed state, a coordinate system of the screen of the display 20 and a coordinate system of the operation surface of the touchpad 10 typically have a relationship in which the origin of the coordinate system of the screen of the display 20 coincides with the position of the cursor 96 and the origin of the coordinate system of the operation surface of the touchpad 10 coincides with a position indicated by the coordinate signal initial value. Specifically, in the relative coordinate mode, an ending point position (X, Y) of the movement vector image 98 is determined based on a movement vector [(x2, y2) - (x1, y1)] of the finger moving on the operation surface regardless of the contact position of the finger in the operation surface, as conceptually shown in FIGs. 8A and 8B. In a case where an aspect ratio of the screen of the display 20 is equal to an aspect ratio of the operation surface, the ending point position (X, Y) of the movement vector image 98 and the movement vector may follow a relationship (X, Y) = K [(x2, y2) - (x1, y1)]. Here, K is a constant, and may be a fixed value or a variable value to be set by the user. 
The display control unit 30 may perform a correction of the ending point position (X, Y) of the movement vector image 98 such that, when the calculated ending point position (X, Y) is positioned outside the edge of the currently displayed map image (the edge of the screen), the ending point position is replaced with a position corresponding to the edge of the map image.
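The relative-coordinate computation above, including the scaling by K and the edge correction, can be sketched in a few lines. The function name, the value of K, and the screen half-extents are illustrative assumptions; only the relationship (X, Y) = K [(x2, y2) - (x1, y1)] and the clamp to the screen edge come from the text.

```python
# Illustrative constants: pad-to-screen scale factor K (may be a fixed
# value or user-set) and screen half-extents around the cursor 96.
K = 4.0
HALF_W, HALF_H = 400, 240  # pixels from screen center to the edges

def ending_point(initial, current):
    """initial = (x1, y1), current = (x2, y2): initial and current
    contact positions of the finger on the operation surface.

    Returns the ending point (X, Y) of the movement vector image 98,
    relative to the cursor 96 at the screen center, corrected so that
    a point outside the screen edge is replaced with the edge itself."""
    x = K * (current[0] - initial[0])
    y = K * (current[1] - initial[1])
    x = max(-HALF_W, min(HALF_W, x))
    y = max(-HALF_H, min(HALF_H, y))
    return (x, y)
```

Note that only the displacement of the finger matters, not where on the pad the slide started, which is what the text means by "regardless of the contact position of the finger."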

[0032] When input of coordinate signals stops (an example of a condition H), the display control unit 30 in the movement vector image displayed state makes transition from the movement vector image displayed state to a scroll processing state. That is, the display control unit 30 moves the image displayed on the screen when the slide operation terminates, based on a positional relationship between the initial contact position of the finger and a position in the operation surface at which the finger is in contact with the operation surface at a termination of the slide operation (which will be referred to as a terminal contact position of the finger). Specifically, in the scroll processing state, the display control unit 30 moves the map image such that the ending point position (X, Y) of the movement vector image 98 immediately before the input of the coordinate signals stops is moved to the position of the cursor 96. That is, the display control unit 30 moves the map image such that a position in the map image corresponding to a coordinate value of the last coordinate signal received before the input of the coordinate signals stops (the terminal contact position of the finger) is moved to the position of the cursor 96. At this time, the display control unit 30 stops displaying the movement vector image 98. In other words, when the ending point position of the movement vector image 98 has reached the position of the cursor 96, the length of the movement vector image 98 becomes zero and thus the movement vector image 98 is not displayed. FIG. 9 shows an example of a map image after scroll processing is performed in a case where the input of coordinate signals stops when the movement vector image 98 in FIG. 7 is displayed on the map image. In the example shown in FIG. 9, a position S in the map image corresponding to the ending point position of the movement vector image 98 in FIG. 7 has moved to the position of the cursor 96. 
When the display control unit 30 in the scroll processing state completes the scroll processing (an example of a condition I), the display control unit 30 makes transition from the scroll processing state to the scroll standby state.

[0033] Thus, in the movement vector image displayed state, the user can move the ending point position of the movement vector image 98 to the desired position (the desired location or facility) in the map image by touching the operation surface of the coordinate detection unit 12 with an operation finger (such as a forefinger) and moving the operation finger on the operation surface, while viewing the display 20. Then, by taking the operation finger off the operation surface when the ending point position of the movement vector image 98 has reached the desired position in the map image, the user can perform the scroll operation such that the position (see the position S in FIGs. 7 and 9) in the map image corresponding to the ending point position of the movement vector image 98 is moved to the position of the cursor 96.

[0034] As described above, according to the embodiment, the user can move the ending point position of the movement vector image 98 and select the desired position in the map image by touching the operation surface of the coordinate detection unit 12 with the operation finger and moving the operation finger on the operation surface. Then, by taking the operation finger off the operation surface when the ending point position of the movement vector image 98 has reached the desired position in the map image, the user can scroll the map image in a mode in which the desired position is moved to the position of the cursor 96. As a result, the user can relatively easily achieve scrolling of the map image in the desired mode.

[0035] In addition, according to the embodiment, the map image is not scrolled in the movement vector image displayed state. With this configuration, the user can move the desired position in the map image to the position of the cursor 96 more easily than in a comparative configuration in which the user needs to move the desired position in the map image to the position of the cursor 96 while scrolling the map image. In addition, it is possible to prevent the driver from fixing his or her eyes on the display 20 or driving inattentively (or it is possible to reduce driver distraction) by stopping scrolling of the map image in the movement vector image displayed state. The movement vector image 98 is basically a static graphic for supporting the driver, although the ending point position of the movement vector image 98 changes (the length and/or the direction of the movement vector image 98 also change with the change in the ending point position) as described above. Because the change in the ending point position of the movement vector image 98 does not involve scrolling of the map image, the movement vector image 98 can be regarded as a static graphic. Thus, with the movement vector image 98, it is possible to prevent the driver from fixing his or her eyes on the display 20 or driving inattentively.

[0036] In the pointer state, a scroll method in which a press operation is performed on the touchpad 10 after the pointer 92 is moved to the desired position in the map image can be employed for achieving a scrolling similar to the scrolling in the above embodiment. In such a scroll method, however, the position of the pointer 92 is likely to be slightly displaced due to a change in a finger position at a time of the press operation, vibration occurring in the host vehicle, or the like. On the other hand, in the embodiment, an operation to set the ending point position of the movement vector image 98 is established when the user takes the finger off the operation surface. Thus, the displacement in the position of the pointer 92 can be reduced and operability can be improved.

[0037] Hereinabove, the embodiment is described in detail. However, the invention is not limited to the specific embodiment, and various modifications within the scope of the claims are possible. In addition, any or all of the elements in the above embodiment may be combined together.

[0038] For example, a touch operation performed on the touchpad 10 is detected using a capacitance sensor in the above embodiment, but the touch operation may be detected using a mechanism (a sensor) other than the capacitance sensor. For example, the touchpad 10 may be constituted by a small pressure-sensitive touch panel. Alternatively, the touchpad 10 may have a configuration using resistance wires, a conductive rubber, or information about light blocked by the finger.

[0039] In the above embodiment, a determination operation is an operation to press the operation surface of the touchpad 10. However, a determination operation may be performed using a press switch (a mechanical switch) provided separately from the touchpad 10.

[0040] The above embodiment relates to an operation device 1 for a vehicle. However, the invention may be applied to various operation devices other than the vehicular operation device, including an operation device for a marine vessel and an operation device for a machine such as a robot.

[0041] In the above embodiment, the map image displayed on the screen is scrolled. However, an image to be scrolled may be an image other than the map image. For example, the image to be scrolled may include contents arranged on plural pages of a book. In this case, an operation to turn a page of the book may serve as the scroll operation in the above embodiment. For example, in the movement vector image displayed state, the display control unit 30 displays a movement vector image on the image including the book contents based on a coordinate signal initial value and a coordinate signal current value. In this case, the movement vector image may extend rightward or leftward from a position of a cursor such that a direction pointed by the movement vector image indicates a direction to turn a page (for example, the movement vector image pointing to the right indicates an operation to go to the next page), and a length of the movement vector image may indicate the number of pages to be turned. In addition, a numeral indicating the number of pages to be turned may be displayed.
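The page-turning variant above maps the vector's direction to a turn direction and its length to a page count. A minimal sketch under stated assumptions follows; the function name, the per-page step length, and the string labels are hypothetical, not taken from the embodiment.

```python
def pages_to_turn(dx, page_step):
    """Map the horizontal extent dx of the movement vector to a
    page-turn command.  A vector pointing right (dx > 0) goes to the
    next page; the magnitude, divided by an assumed per-page step
    length, gives the number of pages to turn."""
    count = abs(dx) // page_step
    direction = "next" if dx > 0 else "previous"
    return direction, count
```

With a step of 40 units, a vector extending 120 units to the right would be read as "turn three pages forward".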

[0042] In the above embodiment, the map image is not scrolled in the movement vector image displayed state (during the slide operation), but in a specific situation, the map image may be scrolled even in the movement vector image displayed state. In other words, the map image may not be scrolled in a part of a slide operation period, which is a period of time during which the slide operation is performed, but the map image may be scrolled in the other part of the slide operation period. For example, as shown in FIG. 10, when the ending point position of the movement vector image 98 remains at a position corresponding to the edge of the map image for a predetermined time or longer (an example of a condition K), the display control unit 30 makes transition to the scroll state. In this case, the display control unit 30 may scroll the map image in a direction opposite to a direction pointed by the movement vector image 98 in the scroll state. In the scroll state, the display control unit 30 may continue or stop displaying the movement vector image 98. When the ending point position of the movement vector image 98 moves away from the position corresponding to the edge of the map image (an example of a condition L) in the scroll state, the display control unit 30 returns to the movement vector image displayed state. When input of coordinate signals stops (an example of a condition H), the display control unit 30 in the scroll state may make transition from the scroll state to the scroll processing state.
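The conditions H, I, K, and L described in the embodiment define a small state machine among the movement vector image displayed state, the scroll state, the scroll processing state, and the scroll standby state. The table-driven sketch below is an illustrative assumption about how such transitions could be organized; the state names follow the text, the dispatch mechanism does not come from the embodiment.

```python
# Hypothetical sketch of the state transitions described in the text.
# Condition labels (H, I, K, L) follow the embodiment's own naming.
TRANSITIONS = {
    ("movement_vector_displayed", "H"): "scroll_processing",  # finger lifted
    ("movement_vector_displayed", "K"): "scroll",             # endpoint held at edge
    ("scroll", "L"): "movement_vector_displayed",             # endpoint leaves edge
    ("scroll", "H"): "scroll_processing",                     # finger lifted
    ("scroll_processing", "I"): "scroll_standby",             # scrolling completed
}

def next_state(state, condition):
    # Remain in the current state when no transition is defined
    # for the (state, condition) pair.
    return TRANSITIONS.get((state, condition), state)
```

For example, lifting the finger (condition H) from either the movement vector image displayed state or the scroll state leads to the scroll processing state, matching paragraphs [0032] and [0042].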

[0043] In the above embodiment, the movement vector image 98 is used as information indicating a mode for scrolling the map image when contact of the finger with the operation surface is cancelled (or the movement information indicating the movement mode of the image displayed on the screen in a case where the slide operation terminates at the current contact position of the finger) in the movement vector image displayed state, but information (an image) other than the movement vector image 98 may be used instead of or in addition to the movement vector image 98. The information other than the movement vector image 98 may be any information as long as the information indicates, directly or indirectly, a position in the map image that will be moved to the position of the cursor 96 when input of coordinate signals stops. For example, the display control unit 30 may indicate the position in the map image that will be moved to the position of the cursor 96 when the input of coordinate signals stops (the position corresponding to the ending point position of the movement vector image 98) in a highlighted manner. Indicating the position in the highlighted manner may include putting the position in a frame having a circular shape or the like, and blinking the frame.

[0044] In the above embodiment, the cursor 96 is drawn on the map image for scrolling, but the cursor 96 may be omitted.

[0045] In the above embodiment, the touchpad 10 is mounted in the vehicle. However, a touch operation unit of a mobile device (for example, a smartphone or a tablet computer) that can be brought into the vehicle cabin may be used as an alternative to the touchpad 10. In this case, the mobile device may communicate with the display control unit 30 via Bluetooth (a registered trademark) to implement a function similar to that of the above described touchpad 10.