
Patent Searching and Data


Title:
TOUCH INPUTS DETECTIONS
Document Type and Number:
WIPO Patent Application WO/2019/182609
Kind Code:
A1
Abstract:
An example display device includes a display panel having a display region. The display device also includes a first touch sensor positioned on top of the display region. The first touch sensor is to detect a first touch input received at an active region of the first touch sensor. The active region of the first touch sensor corresponds to a first region of the display region. The display device further includes a second touch sensor positioned on top of the first touch sensor. The second touch sensor is to detect a second touch input received at an active region of the second touch sensor. The active region of the second touch sensor corresponds to a second region of the display region that is different from the first region.

Inventors:
JABORI MONJI G (US)
THAI THONG (US)
WANG SIMON (US)
Application Number:
PCT/US2018/024009
Publication Date:
September 26, 2019
Filing Date:
March 23, 2018
Assignee:
HEWLETT PACKARD DEVELOPMENT CO (US)
International Classes:
G06F3/041
Foreign References:
US8525799B12013-09-03
US20140164991A12014-06-12
US20140306905A12014-10-16
US6459424B12002-10-01
Attorney, Agent or Firm:
SU, Benjamin (US)
Claims:
Claims

What is claimed is:

1. A display device comprising:

a display panel having a display region;

a first touch sensor positioned on top of the display region, the first touch sensor is to detect a first touch input received at an active region of the first touch sensor, wherein the active region of the first touch sensor corresponds to a first region of the display region; and

a second touch sensor positioned on top of the first touch sensor, the second touch sensor is to detect a second touch input received at an active region of the second touch sensor, wherein the active region of the second touch sensor corresponds to a second region of the display region that is different from the first region.

2. The display device of claim 1, wherein the first touch sensor has a first scan rate, and wherein the second touch sensor has a second scan rate different than the first scan rate.

3. The display device of claim 1, further comprising:

a glass layer positioned on top of the second touch sensor;

a first controller connected to the first touch sensor; and

a second controller connected to the second touch sensor, wherein the second touch sensor is attached to the display panel via an adhesive layer.

4. The display device of claim 1, wherein the first touch sensor and the second touch sensor are capacitive touch sensors.

5. The display device of claim 1, wherein the first touch sensor and the second touch sensor are resistive touch sensors.

6. The display device of claim 1, wherein the active region of the first touch sensor and the active region of the second touch sensor have an overlapping side.

7. A non-transitory computer-readable storage medium comprising instructions that when executed cause a processor of a computing device to:

set an active region of a first touch sensor of a display device of the computing device to match dimensions of a display region of the display device;

in response to receiving a partitioning instruction:

partition the display region into a first region and a second region;

change the active region of the first touch sensor to match dimensions of the first region; and

set an active region of a second touch sensor of the display device to match dimensions of the second region.

8. The non-transitory computer-readable storage medium of claim 7, wherein the instructions when executed further cause the processor to:

set a scan rate of the first touch sensor to a first scan rate; and

set a scan rate of the second touch sensor to a second scan rate different from the first scan rate.

9. The non-transitory computer-readable storage medium of claim 8, wherein the first scan rate corresponds to a first type of touch input, and wherein the second scan rate corresponds to a second type of touch input.

10. The non-transitory computer-readable storage medium of claim 7, wherein the active region of the first touch sensor and the active region of the second touch sensor have an overlapping side, and wherein the instructions when executed further cause the processor to process a particular touch input involving the overlapping side differently than touch inputs not involving the overlapping side.

11. The non-transitory computer-readable storage medium of claim 7, wherein the instructions when executed further cause the processor to disable the second touch sensor when the active region of the first touch sensor is set to match the dimensions of the display region.

12. A non-transitory computer-readable storage medium comprising instructions that when executed cause a display device to:

detect an invalid touch input via a first touch sensor of the display device;

set a rejection region within an active region of the first touch sensor via a first controller, wherein the first controller is connected to the first touch sensor;

in response to detecting a subsequent touch input within the rejection region:

discard first touch information associated with the subsequent touch input via the first controller, wherein the first touch information is generated via the first touch sensor; and

output second touch information associated with the subsequent touch input to a processor connected to the display device via a second controller of the display device, wherein the second touch information is generated via a second touch sensor of the display device.

13. The non-transitory computer-readable storage medium of claim 12, wherein the instructions when executed further cause the display device to, in response to determining an absence of the invalid touch input, remove the rejection region.

14. The non-transitory computer-readable storage medium of claim 12, wherein the instructions when executed further cause the display device to:

set the active region of the first touch sensor to match dimensions of a display region of the display device; and

set an active region of the second touch sensor to match the dimensions of the display region.

15. The non-transitory computer-readable storage medium of claim 12, wherein the instructions when executed further cause the display device to, in response to detecting the subsequent touch input outside the rejection region, output touch information associated with the subsequent touch input via a single controller.

Description:

TOUCH INPUTS DETECTIONS

BACKGROUND

[0001] A computing device, such as a notebook computer or a tablet, may receive an input from a user of the computing device via different input devices. An example input device may be a keyboard. Another example input device may be a mouse.

BRIEF DESCRIPTION OF THE DRAWINGS

[0002] Some examples of the present application are described with respect to the following figures:

[0003] FIG. 1A illustrates a functional block diagram of a display device having a plurality of touch sensors to detect touch inputs, according to an example;

[0004] FIG. 1B illustrates a display region of the display device of FIG. 1A, according to an example;

[0005] FIG. 1C illustrates a cross sectional view of a layered stack of the display device of FIG. 1A, according to an example;

[0006] FIG. 1D illustrates a cross sectional view of a layered stack of the display device of FIG. 1A, according to another example;

[0007] FIGs. 2A-2B illustrate an approach to detect touch inputs using a plurality of touch sensors, according to an example;

[0008] FIG. 3 illustrates an approach to process a touch input involving an overlapping side, according to an example;

[0009] FIG. 4 illustrates an approach to filter invalid touch inputs using a plurality of touch sensors, according to an example;

[0010] FIG. 5 illustrates a method of operation at a display device to set active regions of the plurality of touch sensors, according to an example;

[0011] FIG. 6 illustrates a method of operation at a display device having a plurality of touch sensors to filter invalid touch inputs, according to an example;

[0012] FIG. 7 illustrates a computing device with a display device having a plurality of touch sensors, according to an example; and

[0013] FIG. 8 illustrates a display device having a plurality of touch sensors to detect an invalid touch input, according to an example.

DETAILED DESCRIPTION

[0014] One approach to provide inputs to a computing device is via touch. For example, a touch sensor may be integrated with a display panel to form a display device. A user may use a finger or a stylus to directly interact with content displayed on the display device. The touch sensor may detect the location of a physical contact between the finger or stylus and a surface of the display device. The location information may be used as an input (e.g., the location may indicate an item or a function selected by the user).

[0015] In some situations, a physical display device may display content from a plurality of sources, such as different operating systems. Thus, the area where content is presented may be partitioned via processor executable instructions into a plurality of virtual displays. Each virtual display may be configured independently of the other virtual displays (e.g., different resolutions, different sources, etc.).

[0016] Examples described herein provide an approach to provide touch input detection for a plurality of virtual displays. For example, a display device may include a display panel having a display region. The display device may also include a first touch sensor positioned on top of the display region. The first touch sensor may detect a first touch input received at an active region of the first touch sensor. The active region of the first touch sensor may correspond to a first region of the display region. The display device may further include a second touch sensor positioned on top of the first touch sensor. The second touch sensor may detect a second touch input received at an active region of the second touch sensor. The active region of the second touch sensor may correspond to a second region of the display region that is different from the first region.

[0017] In another example, a non-transitory computer-readable storage medium may include instructions that when executed cause a processor of a computing device to set an active region of a first touch sensor of the computing device to match dimensions of a display region of the computing device. In response to receiving a partitioning instruction, the instructions when executed may also cause the processor to partition the display region into a first region and a second region; change the active region of the first touch sensor to match dimensions of the first region; and set an active region of a second touch sensor of the computing device to match dimensions of the second region.

[0018] In another example, a non-transitory computer-readable storage medium may include instructions that when executed cause a display device to detect an invalid touch input via a first touch sensor of the display device. The instructions when executed may also cause the display device to set a rejection region within an active region of the first touch sensor via a first controller. The first controller may be connected to the first touch sensor. In response to detecting a subsequent touch input within the rejection region, the instructions when executed may further cause the display device to discard first touch information associated with the subsequent touch input via the first controller and output second touch information associated with the subsequent touch input to a processor connected to the display device via a second controller of the display device. The first touch information may be generated via the first touch sensor. The second touch information may be generated via a second touch sensor of the display device. Thus, examples described herein may enable touch input detection for a display device with a plurality of virtual displays.

[0019] FIG. 1A illustrates a functional block diagram of a display device 100 having a plurality of touch sensors to detect touch inputs, according to an example. As used herein, display device 100 may be an electronic device that outputs information as images visible to humans. Display device 100 may be implemented using hardware components, processor executable instructions, or a combination thereof. Display device 100 may include a display panel 102, a first touch sensor 104, a second touch sensor 106, a first controller 108, and a second controller 110.

[0020] As used herein, display panel 102 may be an electronic device that converts input information to images. Display panel 102 may be implemented using a plurality of technologies. Display panel 102 may be, for example, a liquid crystal display (LCD) panel, an organic light-emitting diode (OLED) panel, a plasma panel, etc.

[0021] As used herein, a touch sensor may be an electronic device that produces an electrical response in response to a physical contact at a sensing area. Touch sensors 104 and 106 may be implemented using a plurality of technologies. In some examples, touch sensors 104 and 106 may be capacitive touch sensors where a touch is detected by measuring a change in capacitance. In some examples, touch sensors 104 and 106 may be resistive touch sensors where a touch is detected by measuring a change in resistance. In some examples, touch sensors 104 and 106 may be implemented using the same kind of sensing technology.

[0022] As used herein, a controller may be a hardware device suitable for retrieval and execution of instructions stored in a computer-readable storage medium (not shown in FIG. 1A). Controllers 108 and 110 may be implemented as semiconductor-based microprocessors.

[0023] During operation, first controller 108 may be connected to first touch sensor 104 via an electrical connection (e.g., cables, circuit traces). First controller 108 may control operations of first touch sensor 104. Second controller 110 may be connected to second touch sensor 106 via an electrical connection. Second controller 110 may control operations of second touch sensor 106. In some examples, first controller 108 and second controller 110 may be connected via an electrical connection to exchange information.

[0024] FIG. 1B illustrates a display region 112 of display device 100 of FIG. 1A, according to an example. Display region 112 may be located on a side of display panel 102. As used herein, display region 112 may be an area of display panel 102 where images are displayed or rendered visible. Display region 112 may be of any shape. As illustrated in FIG. 1B, display region 112 may have a rectangular shape with a length L and a width W.

[0025] FIG. 1C illustrates a cross sectional view of a layered stack 114 of display device 100 of FIG. 1A, according to an example. Display panel 102, first touch sensor 104, and second touch sensor 106 may form layered stack 114. Display panel 102 may be the bottom layer of stack 114. First touch sensor 104 may be the middle layer of stack 114. First touch sensor 104 may be positioned on top of a surface of display panel 102 where display region 112 is formed. First touch sensor 104 may be sized to cover the entire area of display region 112. Second touch sensor 106 may be the top layer of stack 114. Second touch sensor 106 may be positioned on top of first touch sensor 104 such that first touch sensor 104 is positioned between display panel 102 and second touch sensor 106. In some examples, controllers 108 and 110 (not shown in FIG. 1C) may be positioned outside of stack 114.

[0026] Although FIG. 1C illustrates display panel 102 being positioned adjacent to first touch sensor 104 and first touch sensor 104 being positioned adjacent to second touch sensor 106, other layer(s) may be positioned between display panel 102 and first touch sensor 104 and/or between second touch sensor 106 and first touch sensor 104.

[0027] FIG. 1D illustrates a cross sectional view of a layered stack 116 of display device 100 of FIG. 1A, according to another example. Layered stack 116 may include display panel 102 as the bottom of stack 116. Layered stack 116 may also include adhesive layer 118 on top of display panel 102. Layered stack 116 may further include first touch sensor 104 positioned on top of display panel 102 such that adhesive layer 118 may be positioned between display panel 102 and first touch sensor 104. Adhesive layer 118 may be implemented using any material that affixes first touch sensor 104 to display panel 102. Layered stack 116 may further include second touch sensor 106 positioned on top of first touch sensor 104. Layered stack 116 may further include a glass layer 120 positioned on top of second touch sensor 106. Glass layer 120 may provide protection against physical damages such as scratches.

[0028] FIGs. 2A-2B illustrate an approach to detect touch inputs using a plurality of touch sensors, according to an example. FIGs. 2A-2B may be described with reference to FIGs. 1A-1D. In FIG. 2A, display region 112 is shown in the absence of multiple virtual displays. An active region of first touch sensor 104 may be set to match the dimensions of display region 112 via first controller 108. That is, the active region of first touch sensor 104 may have the same length and width as display region 112. As used herein, an active region may be an area of a touch sensor (e.g., touch sensor 104, touch sensor 106) where an electrical signal generated in response to a physical touch is reported or output by a controller (e.g., first controller 108, second controller 110) connected to the touch sensor. In some examples, non-active regions of first touch sensor 104 (i.e., regions of first touch sensor 104 not set as part of the active region) may be disabled such that a physical touch at a non-active region may not be able to generate an electrical signal. In some examples, electrical signals generated by touches at the non-active regions may be discarded or ignored by first controller 108.

[0029] When a physical touch is detected by first touch sensor 104, first touch sensor 104 may generate touch information that indicates the location of the touch on first touch sensor 104. The touch information may indicate the location based on the layout of scan lines of first touch sensor 104. For example, first touch sensor 104 may have scan lines positioned in intersecting rows and columns. The scan lines may be implemented as electrodes. For example, the scan lines positioned in columns may be implemented as a first layer of electrodes. The scan lines positioned in rows may be implemented as a second layer of electrodes. Thus, each physical location of first touch sensor 104 may have corresponding coordinates (X (row), Y (column)). The coordinates (in row and column) may be used as the touch information.

[0030] When the active region of first touch sensor 104 is set to match the dimensions of display region 112, touch information generated by first touch sensor 104 may directly correspond to a location of display region 112 as the active region and display region 112 have the same area. The area of display region 112 may be digitally divided into the same row and column layout as the scan lines of first touch sensor 104. For example, the left top corner of display region 112 may have coordinates (0, 0), indicating that the coordinates are the intersection of scan line row 0 and column 0. As another example, the right top corner of display region 112 may have coordinates (0, 1000), indicating that the coordinates are the intersection of scan line row 0 and column 1000. As other examples, a touch input T1 may have coordinates (350, 100) and a touch input T2 may have coordinates (400, 800).
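The active-region behavior described in paragraphs [0028]-[0030] can be sketched in a few lines. The class and method names below are illustrative, not from the source; the (row, column) coordinates follow the scan-line convention described above.

```python
# Hypothetical sketch of a touch controller that reports only touches
# received inside its active region (names are illustrative).

class TouchControllerSketch:
    def __init__(self, active_region):
        # active_region: (row_min, col_min, row_max, col_max) in scan lines
        self.active_region = active_region

    def report(self, touch):
        """Return the (row, col) touch if inside the active region, else None."""
        row, col = touch
        r_min, c_min, r_max, c_max = self.active_region
        if r_min <= row <= r_max and c_min <= col <= c_max:
            return touch   # reported to the host for further processing
        return None        # discarded or ignored, per paragraph [0028]

# Active region matching a full display region of rows 0-500, columns 0-1000:
ctrl = TouchControllerSketch((0, 0, 500, 1000))
assert ctrl.report((350, 100)) == (350, 100)   # touch input T1 is reported
assert ctrl.report((400, 800)) == (400, 800)   # touch input T2 is reported
```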

[0031] In some examples, second touch sensor 106 may be disabled in the absence of multiple virtual displays. In some examples, in the absence of multiple virtual displays, the active region of second touch sensor 106 may also be set to match the dimensions of display region 112, but touch information generated by second touch sensor 106 may be discarded by second controller 110.

[0032] Turning to FIG. 2B, during operation, display device 100 may receive a partitioning instruction 202. Partitioning instruction 202 may be executable by display device 100 (e.g., via first controller 108 and/or second controller 110). In some examples, display device 100 may receive partitioning instruction 202 from a processing entity that controls the operations of display device 100. For example, display device 100 may be integrated as a display of a notebook computer. The processing entity may be a processor of the notebook computer. Partitioning instruction 202 may include information on how display region 112 is to be partitioned into different virtual displays. For example, partitioning instruction 202 may include information on the number of virtual displays that are to be shown on display region 112, the dimensions (e.g., height, width) of each virtual display, and/or the positioning of each virtual display relative to other virtual display(s).

[0033] In response to receiving partitioning instruction 202, display device 100 may partition display region 112 into multiple regions based on the number of virtual displays to be presented. Display device 100 may also set active regions of touch sensors 104 and 106 based on partitioning instruction 202. For example, partitioning instruction 202 may indicate that a first virtual display 204 and a second virtual display 206 are to be displayed on display region 112. Thus, a first region 208 of display region 112 may be used for first virtual display 204 and a second region 210 of display region 112 may be used for second virtual display 206. Second region 210 may be different from first region 208. As an example, first region 208 may be 20% of display region 112 and second region 210 may be the remaining 80% of display region 112.
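The 20%/80% split in this example can be sketched as a simple column partition. The function below is a hypothetical illustration, assuming the partition runs along the column axis of a 1000-column display region.

```python
# Illustrative sketch of partitioning a display region into two virtual
# displays; the function name and split convention are assumptions.

def partition(display_cols, first_fraction):
    """Split a display of `display_cols` columns into two column ranges."""
    split = int(display_cols * first_fraction)
    first_region = (0, split)               # columns for first virtual display
    second_region = (split, display_cols)   # columns for second virtual display
    return first_region, second_region

# A 20% / 80% split of a 1000-column display region, as in the example above:
assert partition(1000, 0.2) == ((0, 200), (200, 1000))
```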

[0034] First controller 108 may change the active region of first touch sensor 104 from matching the dimensions of display region 112 to matching the dimensions of first virtual display 204 (thus the dimensions of first region 208). Second controller 110 may set an active region of second touch sensor 106 to match the dimensions of second virtual display 206 (thus the dimensions of second region 210). For example, the active region of first touch sensor 104 may be set to left top corner (0, 0), left bottom corner (500, 0), right top corner (0, 200), and right bottom corner (500, 200). The active region of second touch sensor 106 may be set to left top corner (0, 200), left bottom corner (500, 200), right top corner (0, 800), and right bottom corner (500, 800).

[0035] Each virtual display 204 and 206 may have internal coordinates for each virtual location within. For example, the internal coordinates of the first virtual display may be (0, 0) for the top left corner, (500, 0) for the bottom left corner, (0, 200) for the top right corner, and (500, 200) for the bottom right corner. The internal coordinates of the second virtual display may be (0, 0) for the top left corner, (500, 0) for the bottom left corner, (0, 800) for the top right corner, and (500, 800) for the bottom right corner.

[0036] Based on the relative positioning of virtual displays 204 and 206 (e.g., regions 208 and 210), the touch information generated by each touch sensor 104 and 106 may be converted or mapped to the corresponding internal coordinates of each virtual display 204 and 206. For example, first touch sensor 104 may detect touch input T1 in first virtual display 204 to generate the first touch information. The first touch information associated with touch input T1 may be (350, 100). When the processing entity controlling first virtual display 204 (e.g., a processor or an operating system) receives the first touch information from first controller 108, the processing entity may convert the coordinates in the first touch information to the internal coordinates of first virtual display 204, which may also be (350, 100). As another example, second touch sensor 106 may generate second touch information associated with touch input T2. The second touch information may be (400, 800). The second touch information may be converted to corresponding internal coordinates of second virtual display 206, which may be (200, 600).
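One way to sketch the conversion from sensor coordinates to a virtual display's internal coordinates is a plain offset translation by the region's origin. This is an assumption for illustration only: the mapping in the text may also involve scaling, since its example converts (400, 800) to (200, 600) rather than a pure offset result.

```python
# Minimal sketch of mapping sensor coordinates into a virtual display's
# internal coordinates, assuming a simple offset translation (an assumption;
# the source's own example may imply an additional scaling step).

def to_internal(touch, region_origin):
    """Translate (row, col) sensor coordinates by the region's top-left origin."""
    row, col = touch
    r0, c0 = region_origin
    return (row - r0, col - c0)

# With the second region starting at column 200, a touch at (400, 800) on the
# second sensor maps to (400, 600) under this offset-only assumption:
assert to_internal((400, 800), (0, 200)) == (400, 600)
```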

[0037] In some examples, first virtual display 204 may be dedicated to receiving a first particular type of touch input and second virtual display 206 may be dedicated to receiving a second particular type of touch input. Thus, the scan rate of the active region of first touch sensor 104 may be set to a first scan rate at first controller 108 and the scan rate of the active region of second touch sensor 106 may be set to a second scan rate at second controller 110. As used herein, a scan rate may be the frequency at which the output of a touch sensor (e.g., touch sensors 104 and 106) is measured by a touch controller (e.g., controllers 108 and 110). The second scan rate may be different from the first scan rate. As an example, first virtual display 204 may be dedicated to receiving touch inputs from a stylus or a pen; thus, the scan rate of the active region of first touch sensor 104 may be set to 300 Hertz (Hz). Second virtual display 206 may be dedicated to receiving touch inputs from a human finger; thus, the scan rate of the active region of second touch sensor 106 may be set to 100 Hz. In some examples, both touch sensors 104 and 106 may be set to have the same scan rate.
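The per-sensor scan rates in this example (300 Hz for stylus input, 100 Hz for finger input) can be captured in a small configuration sketch; the names below are hypothetical, not from the source.

```python
# Hypothetical per-input-type scan-rate table matching the example above.

SCAN_RATES_HZ = {"stylus": 300, "finger": 100}

def scan_period_ms(input_type):
    """Time between sensor scans, in milliseconds, for a given input type."""
    return 1000.0 / SCAN_RATES_HZ[input_type]

assert SCAN_RATES_HZ["stylus"] == 300
assert round(scan_period_ms("finger"), 2) == 10.0   # 100 Hz -> 10 ms per scan
```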

[0038] FIG. 3 illustrates an approach to process a touch input involving an overlapping side, according to an example. FIG. 3 may be described with reference to FIGs. 1A-1C and 2A-2B. First region 208 used for first virtual display 204 and second region 210 used for second virtual display 206 may have an overlapping side 302 as scan lines of first touch sensor 104 and scan lines of second touch sensor 106 may overlap. For example, overlapping side 302 may correspond to intersections between scan lines at column 200 and rows 0 to 500 on both touch sensors 104 and 106.

[0039] In some examples, a particular touch input involving overlapping side 302 may be processed (e.g., by a processor of a computing device) differently than touch inputs not involving overlapping side 302. For example, a particular touch input T3 may be a touch input involving overlapping side 302. Touch input T3 may be a continuous touch input (e.g., a swipe gesture) that crosses overlapping side 302 (e.g., as a starting point or an end point). Thus, touch information of touch input T3 may include multiple sets of coordinates, such as (200, 200), (200, 199), and (200, 198). First controller 108 may transmit the touch information to a processing entity that may process the touch information. For example, the processing entity may be an operating system of a computing device. In response to receiving the touch information, the processing entity may process the touch information as an input to change content displayed in virtual display 204 since the touch information involved overlapping side 302.
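A minimal sketch of classifying a gesture by whether it touches overlapping side 302, modeled here (per the example above) as scan-line column 200; the function name is an illustrative assumption.

```python
# Sketch: does a continuous touch input (a sequence of (row, col) points)
# involve the overlapping side shared by both sensors?

OVERLAP_COL = 200  # column of overlapping side 302, per the example

def involves_overlap(coords):
    """True if any coordinate of the gesture lies on the overlapping column."""
    return any(col == OVERLAP_COL for _, col in coords)

assert involves_overlap([(200, 200), (200, 199), (200, 198)]) is True   # T3
assert involves_overlap([(250, 170), (250, 169), (250, 168)]) is False  # T4
```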

[0040] In contrast, a touch input T4 may be a touch input not involving overlapping side 302. That is, touch input T4 does not cross overlapping side 302. For example, touch information of touch input T4 may include coordinates (250, 170), (250, 169), and (250, 168). When the processing entity receives the touch information of touch input T4, the processing entity may process the touch information of touch input T4 as a selection of content displayed at coordinates (250, 170), (250, 169), and (250, 168).

[0041] FIG. 4 illustrates an approach to filter invalid touch inputs using a plurality of touch sensors, according to an example. FIG. 4 may be described with reference to FIGs. 1A-1C. In some examples, the active regions of first touch sensor 104 and second touch sensor 106 may be set to match the dimensions of display region 112.

[0042] During operation, when a user of display device 100 grips display region 112, first touch sensor 104 may detect the grip (as indicated by thumb 402) and generate touch information of the grip. First controller 108 may determine if the touch input corresponding to the grip is a valid touch input based on touch information generated by first touch sensor 104. For example, first touch sensor 104 may generate the same touch information when the grip is present; thus, based on the duration of the repeated touch information, first controller 108 may determine that the touch information indicates a grip rather than a valid touch input.
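The grip heuristic above can be sketched as follows. The scan-count threshold is an assumed parameter for illustration, not a value from the text.

```python
# Hypothetical grip detector: a contact that reports the same coordinates
# for longer than a threshold number of scans is treated as a grip.

def is_grip(samples, threshold=30):
    """samples: (row, col) touch coordinates reported on consecutive scans."""
    if len(samples) < threshold:
        return False
    # A grip produces identical touch information on every scan; a valid
    # tap or swipe is shorter or moves between scans.
    return all(s == samples[0] for s in samples)

assert is_grip([(480, 10)] * 60) is True          # stationary contact: grip
assert is_grip([(480, 10), (481, 12)]) is False   # short/moving contact
```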

[0043] In response to determining that a grip is present (i.e., an invalid touch input), first controller 108 may set a rejection region 404 within the active region of first touch sensor 104. As used herein, a rejection region may be an area of a touch sensor (e.g., touch sensor 104, touch sensor 106) where an electrical signal generated in response to a physical touch is discarded by a controller (e.g., first controller 108, second controller 110) connected to the touch sensor. Thus, when rejection region 404 is enabled, touch information of any subsequent touch inputs detected within rejection region 404 may be discarded by first controller 108 as invalid touch inputs. For example, when first touch sensor 104 detects a touch input T5, first controller 108 may receive touch information of touch input T5 from first touch sensor 104. First controller 108 may determine if touch input T5 is within rejection region 404 by comparing the touch information of touch input T5 to coordinates of rejection region 404. In response to a determination that touch input T5 is within rejection region 404, first controller 108 may determine touch input T5 to be an invalid touch input and discard the touch information of touch input T5. Thus, first controller 108 does not output or report the touch information of touch input T5.
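The rejection-region behavior can be sketched as a simple coordinate filter. The region bounds and touch coordinates below are hypothetical, since the text gives no numeric values for rejection region 404 or touch input T5.

```python
# Sketch of a rejection region inside an active region: touch information
# for touches inside the region is discarded rather than reported.

def filter_touch(touch, rejection_region):
    """Return the touch if outside the rejection region, else None (discard)."""
    row, col = touch
    r_min, c_min, r_max, c_max = rejection_region
    inside = r_min <= row <= r_max and c_min <= col <= c_max
    return None if inside else touch

rejection = (400, 0, 500, 50)   # hypothetical strip near the gripped edge
assert filter_touch((450, 20), rejection) is None         # discarded as invalid
assert filter_touch((250, 300), rejection) == (250, 300)  # reported normally
```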

[0044] In some examples, first controller 108 may inform second controller 110 that a grip is detected and communicate the touch information of the grip to second controller 110. When second touch sensor 106 detects touch input T5, second controller 110 may receive the touch information of touch input T5 from second touch sensor 106. Second controller 110 may compare the touch information of touch input T5 to the touch information of the grip to determine if touch input T5 is the grip. In response to a determination that touch input T5 is different from the grip, second controller 110 may output or report the touch information of touch input T5 (e.g., to a processor of a computing device) for further processing.

[0045] In some examples, after rejection region 404 is enabled, when a subsequent touch input T6 is received that is outside of rejection region 404, touch information of touch input T6 may be reported or output by one of first controller 108 and second controller 110. For example, first controller 108 may receive the touch information of touch input T6 from first touch sensor 104 as first touch information. Second controller 110 may receive the touch information of touch input T6 from second touch sensor 106 as second touch information. First controller 108 may determine that the first touch information is associated with a valid touch input as the first touch information may indicate that touch input T6 is outside of rejection region 404. First controller 108 may transmit the first touch information to second controller 110. Second controller 110 may compare the first touch information to the second touch information. In response to a determination that the first touch information matches the second touch information, second controller 110 may output or report the second touch information for further processing instead of outputting both the first touch information and the second touch information.
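The single-controller reporting described in paragraph [0045] can be sketched as a simple matching step at the second controller. The coordinate tuples, the distance-based match test, and the tolerance value are illustrative assumptions; the disclosure only requires that matching touch information be reported once.

```python
# Sketch of second-controller reporting per paragraph [0045]: when the
# first and second touch information match, only the second is reported.
# The Euclidean match test and tolerance are assumptions for the sketch.

def second_controller_report(first_info, second_info, tol=2.0):
    """Report second_info once when it matches first_info within tol units."""
    dx = first_info[0] - second_info[0]
    dy = first_info[1] - second_info[1]
    if dx * dx + dy * dy <= tol * tol:
        return second_info   # matched: report only the second touch information
    return None              # no match: not reported by this path
```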

[0046] In some examples, the user may remove the grip from display region 112; thus, first touch sensor 104 may not be able to detect the grip. First controller 108 may determine an absence of the grip since first controller 108 may stop receiving the touch information of the grip. In response to the determination, first controller 108 may remove rejection region 404 from the active region of first touch sensor 104.

[0047] FIG. 5 illustrates a method 500 of operation at a display device to set active regions of the plurality of touch sensors, according to an example. Method 500 may be implemented using display device 100 of FIG. 1A. Method 500 may be described with reference to FIGs. 1A and 2B. Method 500 includes setting an active region of a touch sensor, at 502. For example, first controller 108 may set an active region of first touch sensor 104. Second controller 110 may set an active region of second touch sensor 106. In some examples, the active region of first touch sensor 104 and/or the active region of second touch sensor 106 may match the dimensions of display region 112. In some examples, the active region of first touch sensor 104 and/or the active region of second touch sensor 106 may be set to the dimensions of a region smaller than display region 112, such as first region 208 and second region 210.

[0048] Method 500 also includes determining if a partitioning instruction, such as partitioning instruction 202, has been received at display device 100, at 504. The dimensions of the active regions of first touch sensor 104 and second touch sensor 106 may be maintained (unchanged) in the absence of partitioning instruction 202. Method 500 further includes changing the active region based on the partitioning instruction, at 506. For example, in response to receiving partitioning instruction 202, first controller 108 may change the active region of first touch sensor 104 from matching the dimensions of display region 112 to matching the dimensions of first virtual display 204 (thus the dimensions of first region 208). Second controller 110 may set an active region of second touch sensor 106 to match the dimensions of second virtual display 206 (thus the dimensions of second region 210).
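The partitioning described in paragraphs [0047]–[0048] can be sketched as splitting the display region into one region per virtual display. Equal-width, side-by-side vertical partitions are an assumption for this sketch; the disclosure only requires that each sensor's active region correspond to a distinct region of the display.

```python
# Sketch of partitioning display region 112 into virtual-display regions
# per paragraph [0048]. Equal-width vertical slices are an assumption.

def partition_display(width, height, num_virtual_displays):
    """Split a width-by-height display region into equal-width regions,
    each returned as (x_min, y_min, x_max, y_max)."""
    region_width = width // num_virtual_displays
    return [
        (i * region_width, 0, (i + 1) * region_width, height)
        for i in range(num_virtual_displays)
    ]
```

For instance, a 1920x1080 display region split into two virtual displays would yield regions corresponding to first region 208 and second region 210, and each controller would set its sensor's active region to one of them.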

[0049] FIG. 6 illustrates a method 600 of operation at a display device having a plurality of touch sensors to filter invalid touch inputs, according to an example. Method 600 may be implemented using display device 100 of FIG. 1A. Method 600 may be described with reference to FIGs. 1A and 4. Method 600 includes detecting a touch input, at 602. For example, first touch sensor 104 may detect a touch input.

[0050] Method 600 also includes determining if the touch input is a valid touch input, at 604. For example, first controller 108 may determine if a touch input is a valid touch input based on touch information generated by first touch sensor 104. Method 600 further includes outputting touch information of the touch input when the touch input is valid, at 606. For example, first controller 108 may output the touch information to a processing entity for further processing.

[0051] Method 600 further includes setting a rejection region when the touch input is an invalid touch input, at 608. For example, in response to determining that a grip is present (i.e., an invalid touch input), first controller 108 may set a rejection region 404 within the active region of first touch sensor 104.

[0052] Method 600 further includes determining if a touch input is detected within the rejection region, at 610. For example, first controller 108 may determine if a touch input, such as touch input T5, is detected within rejection region 404 by comparing touch information of the touch input to coordinates of rejection region 404. Method 600 further includes outputting touch information via a single controller when the touch input is detected outside the rejection region, at 612. For example, when a touch input T6 is received that is outside of rejection region 404, touch information of touch input T6 may be reported or output by one of first controller 108 and second controller 110.

[0053] Method 600 further includes discarding first touch information when the touch input is detected within the rejection region, at 614. For example, when rejection region 404 is enabled, touch information of any subsequent touch inputs detected within rejection region 404 may be discarded by first controller 108 as invalid touch inputs. Method 600 further includes outputting second touch information, at 616. For example, second controller 110 may output or report the second touch information of touch input T6 for further processing instead of outputting both the first touch information and the second touch information.
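The decision flow of method 600 (steps 602 through 616) can be sketched as a single dispatch function. The data shapes, the `is_valid` callback, and the rectangle representation of the rejection region are illustrative assumptions, not part of the disclosure.

```python
# Sketch of method 600's filtering flow. The validity test and region
# representation (x0, y0, x1, y1) are assumptions for this sketch.

def in_region(xy, region):
    """Return True when coordinates xy fall inside region (x0, y0, x1, y1)."""
    x, y = xy
    x0, y0, x1, y1 = region
    return x0 <= x <= x1 and y0 <= y <= y1

def filter_touch(touch, rejection_region, is_valid):
    """Dispatch a detected touch input per method 600.

    Returns an (action, payload) pair:
      ('reject', touch)  - 608: invalid touch sets a rejection region
      ('discard', None)  - 614: touch inside the rejection region, discarded
      ('output', touch)  - 606/612: valid touch, reported for processing
    """
    if not is_valid(touch):
        return ('reject', touch)
    if rejection_region is not None and in_region(touch, rejection_region):
        return ('discard', None)
    return ('output', touch)
```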

[0054] FIG. 7 illustrates a computing device 700 with a display device having a plurality of touch sensors, according to an example. Computing device 700 may be, for example, a web-based server, a local area network server, a cloud-based server, a notebook computer, a desktop computer, an all-in-one system, a tablet computing device, a mobile phone, an electronic book reader, or any other electronic device suitable for receiving touch inputs. Computing device 700 may include display device 100 of FIG. 1A. Computing device 700 may also include a processor 702 and a computer-readable storage medium 704.

[0055] Processor 702 may be a central processing unit (CPU), a semiconductor-based microprocessor, and/or other hardware devices suitable for retrieval and execution of instructions stored in computer-readable storage medium 704. Processor 702 may fetch, decode, and execute instructions 706, 708, 710, 712, 714, and 716 to control a process of detecting touch inputs via multiple touch sensors. As an alternative or in addition to retrieving and executing instructions, processor 702 may include at least one electronic circuit that includes electronic components for performing the functionality of instructions 706, 708, 710, 712, 714, 716, or a combination thereof.

[0056] Computer-readable storage medium 704 may be any electronic, magnetic, optical, or other physical storage device that contains or stores executable instructions. Thus, computer-readable storage medium 704 may be, for example, Random Access Memory (RAM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a storage device, an optical disc, etc. In some examples, computer-readable storage medium 704 may be a non-transitory storage medium, where the term "non-transitory" does not encompass transitory propagating signals. As described in detail below, computer-readable storage medium 704 may be encoded with a series of processor executable instructions 706, 708, 710, 712, 714, and 716. Processor 702 may control operations of display device 100 based on instructions 706, 708, 710, 712, 714, and/or 716. In some examples, instructions 706, 708, 710, 712, 714, 716, or a combination thereof may be encoded in a computer-readable storage medium (not shown in FIG. 7) of first controller 108 and/or second controller 110.

[0057] Active region setting instructions 706 may set an active region of a touch sensor. For example, referring to FIG. 2A, the active region of first touch sensor 104 may be set to match the dimensions of display region 112.

Partitioning instruction receiving instructions 708 may receive a partitioning instruction. For example, referring to FIG. 2B, display device 100 may receive a partitioning instruction 202.

[0058] Display region partitioning instructions 710 may partition display region 112 of the display device into multiple regions based on the number of virtual displays to be presented. For example, referring to FIG. 2B, in response to receiving partitioning instruction 202, display device 100 may partition display region 112 into multiple regions based on the number of virtual displays to be presented. Active region changing instructions 712 may change the active region of a controller based on the partitioning instruction. For example, referring to FIG. 2B, display device 100 may also set active regions of touch sensors 104 and 106 based on partitioning instruction 202.

[0059] Scan rate setting instructions 714 may set a scan rate of a touch sensor. For example, referring to FIG. 2B, the scan rate of the active region of first touch sensor 104 may be set to a first scan rate at first controller 108 and the scan rate of the active region of second touch sensor 106 may be set to a second scan rate at second controller 110. Overlapping side touch input processing instructions 716 may process a touch input involving an overlapping side of two virtual displays differently than a touch input not involving an overlapping side. For example, referring to FIG. 3, a particular touch input involving overlapping side 302 may be processed (e.g., by a processor of a computing device) differently than touch inputs not involving overlapping side 302.

[0060] FIG. 8 illustrates a display device 800 having a plurality of touch sensors to detect an invalid touch input, according to an example. Display device 800 may implement display device 100 of FIG. 1A. Display device 800 may include display panel 102, first touch sensor 104, second touch sensor 106, first controller 108, second controller 110, and a computer-readable storage medium 802. Computer-readable storage medium 802 may be similar to computer-readable storage medium 704 of FIG. 7. Computer-readable storage medium 802 may be encoded with a series of instructions 804, 806, 808, 810, 812, 814, and 816 to control operations of first controller 108 and/or second controller 110.

[0061] Invalid touch input detecting instructions 804 may detect an invalid touch input. For example, referring to FIG. 4, first controller 108 may determine if the touch input corresponding to the grip is a valid touch input based on touch information generated by first touch sensor 104. Rejection region setting instructions 806 may set a rejection region in response to a detection of an invalid touch input. For example, referring to FIG. 4, in response to determining that a grip is present (i.e., an invalid touch input), first controller 108 may set rejection region 404 within the active region of first touch sensor 104.

[0062] Subsequent touch input detecting instructions 808 may detect a touch input received after the rejection region is enabled. For example, referring to FIG. 4, after rejection region 404 is enabled, when a subsequent touch input T6 is received that is outside of rejection region 404, touch information of touch input T6 may be reported or output by one of first controller 108 and second controller 110. Touch information discarding instructions 810 may discard touch information of a touch input that is received within the rejection region. For example, referring to FIG. 4, in response to a determination that touch input T5 is within rejection region 404, first controller 108 may determine touch input T5 to be an invalid touch input and discard the touch information of touch input T5.

[0063] Touch information outputting instructions 812 may output touch information of a touch input received outside the rejection region. For example, referring to FIG. 4, in response to a determination that the first touch information matches the second touch information, second controller 110 may output or report the second touch information for further processing.

[0064] Rejection region removing instructions 814 may remove the rejection region. For example, referring to FIG. 4, first controller 108 may remove rejection region 404 from the active region of first touch sensor 104. Active region setting instructions 816 may set an active region of a touch sensor. For example, referring to FIG. 2A, an active region of first touch sensor 104 may be set to match the dimensions of display region 112 via first controller 108.

[0065] The use of "comprising", "including", or "having" and variations thereof herein are synonymous, are meant to be inclusive or open-ended, and do not exclude additional unrecited elements or method steps.