Title:
MOBILE DEVICE AND METHOD FOR INTERPRETING GESTURES WITHOUT OBSTRUCTING THE SCREEN
Document Type and Number:
WIPO Patent Application WO/2016/054190
Kind Code:
A1
Abstract:
A mobile computer device comprising a front, a back, and at least two sides. The front comprises a display with a touch screen, and the sides connect the front to the back. The mobile device comprises a side touch sensor disposed in at least one of the two sides. The mobile device also comprises a proximity sensor that senses heat, a processor, and a computer memory. The computer memory may have executable instructions for the processor that comprise activating the side touch sensor when the proximity sensor senses heat above a threshold.

Inventors:
VESIKIVI PETRI (FI)
Application Number:
PCT/US2015/053198
Publication Date:
April 07, 2016
Filing Date:
September 30, 2015
Assignee:
PCMS HOLDINGS INC (US)
International Classes:
G06F1/32; G06F3/0488
Foreign References:
US20110187660A1 (2011-08-04)
US20110264928A1 (2011-10-27)
US20100287470A1 (2010-11-11)
EP2720129A1 (2014-04-16)
US20120249411A1 (2012-10-04)
Other References:
None
Attorney, Agent or Firm:
ROCCIA, Vincent, J. et al. (Suite 1700, Philadelphia, PA, US)
Claims:
WHAT IS CLAIMED IS:

1. A mobile computer device, comprising:

a front having a front length and a front width and a display;

a back having a back length and a back width;

at least two sides, each having a side length and a side width, wherein the side width is less than the front width and the back width;

a side touch sensor disposed in at least one of the two sides; and

a computer processor comprising executable instructions for recognizing a gesture sensed by the touch sensor and manipulating the display based on the gesture.

2. The mobile device of claim 1, wherein the display comprises a display touch screen.

3. The mobile device of claim 1, further comprising a proximity sensor that is electrically coupled to the processor, and a computer memory comprising executable instructions for activating the side touch sensor in response to the proximity sensor.

4. The mobile device of claim 3, wherein the proximity sensor comprises a temperature sensor for sensing heat.

5. The mobile device of claim 1, wherein the side touch sensor comprises a capacitive sensor and a touch screen.

6. The mobile device of claim 1, wherein the mobile device comprises a wireless transmit/receive unit (WTRU) comprising a receiver and a transmitter for communicating wirelessly.

7. The mobile device of claim 1, wherein the side touch sensor is disposed in the at least two sides.

8. A mobile computer device, comprising:

a front comprising a display comprising a front touch screen;

a back;

at least two sides, wherein the sides connect the front to the back;

a side touch sensor disposed in at least one of the two sides;

a proximity sensor, disposed in the device, that senses heat; and

a processor comprising executable instructions for activating the side touch sensor when the proximity sensor senses heat above a first threshold.

9. The mobile device of claim 8, wherein the side touch sensor is disposed in the at least two sides.

10. The mobile device of claim 8, wherein the proximity sensor comprises a temperature sensor for sensing heat.

11. The mobile device of claim 8, wherein the side touch sensor comprises a capacitive sensor and a touch screen.

12. The mobile device of claim 8, wherein the mobile device comprises a wireless transmit/receive unit (WTRU) comprising a receiver and a transmitter for communicating wirelessly.

13. The mobile device of claim 8, wherein the processor further comprises executable instructions for deactivating the side touch sensor if the magnitude of the difference between the temperature determined by the proximity sensor on the front of the device and the temperature of the back of the device is less than a second threshold.

14. The mobile device of claim 13, further comprising a third side and a fourth side, wherein the side touch sensor is disposed in the third and fourth sides.

15. A mobile device, comprising:

a front comprising a front display comprising a front touch screen;

a back;

at least two sides, wherein the sides connect the front to the back; and

a side touch sensor, disposed in at least one of the two sides, that is distinct from the front touch screen.

16. The mobile device of claim 15, further comprising a processor and memory electrically coupled to the side touch sensor and the front touch screen.

17. The mobile device of claim 16, further comprising a proximity sensor, disposed in the device, that senses heat, and wherein the processor comprises executable instructions for activating the side touch sensor when the proximity sensor senses heat above a threshold.

18. The mobile device of claim 15, wherein the side touch sensor comprises a side touch screen and a capacitive sensor.

19. The mobile device of claim 15, wherein the mobile device comprises a wireless transmit/receive unit (WTRU) comprising a receiver and a transmitter for communicating wirelessly.

20. The mobile device of claim 15, wherein the side touch sensor is disposed in the at least two sides.

Description:
MOBILE DEVICE AND METHOD FOR INTERPRETING TOUCH GESTURES WITHOUT OBSTRUCTING THE SCREEN

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims the benefit of United States Provisional Application No. 62/057,752, filed September 30, 2014, which is hereby incorporated by reference herein.

BACKGROUND

[0002] Mobile devices may have a display that has a touch screen user interface. The touch screen user interface may also be a computer display that displays electronic images to the user. The touch screen may be configured to accept gesture inputs to the mobile device, such as swiping, inputting data by touching the screen, or moving display images with gestures. Through the use of these gestures, the user may control the inputs to the device.

[0003] Viewing data on a touch screen may be difficult while a user is manipulating the touch screen because, when a user is touching the screen or gesturing, the user's hand may block at least a portion of the screen. For devices that use an electrical input device as opposed to a finger, the device may block the screen as well. This problem is illustrated by, for example, trying to read text on the display while at the same time touching the screen to scroll the text to the next page or to continue displaying text or images. The user may have to interrupt reading text or viewing an image to conduct a gesture.

SUMMARY

[0004] A mobile computer device comprises a front that may have a display comprising a touch screen. The device may also have a back and at least two sides. The sides connect the front to the back. The mobile device comprises a side touch sensor, such as a capacitive sensor (e.g., a touch screen), disposed in at least one of the two sides. The side touch sensor may control movement on the front touch screen in addition to, or as an alternative to, control from the front touch screen. For example, a user may conduct gestures on the side touch sensor, such as scrolling, swiping, selecting, focusing, and moving images, to control the images on the front display. The mobile device may have a processor and memory with executable instructions for using gestures inputted into the side touch sensor to move the images on the front display.
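By way of illustration only, the following is a minimal sketch (in Python, not taken from the patent) of executable instructions that translate gestures sensed on a side touch sensor into manipulations of the front display. The gesture names and the DisplayController interface are invented for the example.

from dataclasses import dataclass
from enum import Enum, auto


class Gesture(Enum):
    SWIPE_UP = auto()
    SWIPE_DOWN = auto()
    TAP = auto()
    DOUBLE_TAP = auto()


@dataclass
class DisplayController:
    """Stand-in for the controller that moves images on the front display."""
    scroll_offset: int = 0

    def scroll(self, delta: int) -> None:
        self.scroll_offset += delta
        print(f"display scrolled to offset {self.scroll_offset}")

    def select(self) -> None:
        print("image under focus selected")


def handle_side_gesture(gesture: Gesture, display: DisplayController) -> None:
    # Executable instructions (here, plain Python) that translate a
    # side-sensor gesture into a manipulation of the front display.
    if gesture is Gesture.SWIPE_UP:
        display.scroll(-1)
    elif gesture is Gesture.SWIPE_DOWN:
        display.scroll(+1)
    elif gesture is Gesture.DOUBLE_TAP:
        display.select()


if __name__ == "__main__":
    display = DisplayController()
    handle_side_gesture(Gesture.SWIPE_DOWN, display)  # scrolls without covering the screen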

[0005] The mobile device may also comprise a proximity sensor that senses heat, a processor, and a computer memory. The sensors may be disposed to sense the temperature on the back side and the front side of the device. The computer memory may have executable instructions for the processor that comprise activating the side touch sensor when the proximity sensor senses heat above a threshold. The processor may be programmed with logic to subtract the temperatures sensed at the front and back sensors and, if the magnitude of the sensed temperature difference is greater than a threshold, the processor can activate the side touch sensor. If the magnitude of the sensed temperature difference is less than a threshold, the processor can disable the side touch sensor. By enabling and disabling the side touch sensor based on the magnitude of the front and back temperature difference, the device can activate the side touch sensor when held in the palm of a user's hand (e.g., for mobile devices that can fit within the hand), so that the side touch sensor can be responsive to the user's fingers, and disable the side touch sensor when not held by the user's palm, so that inadvertent touches of the capacitive side touch sensor are not used to change the display.

[0006] The side touch sensor may comprise a capacitive sensor that senses touch based on a voltage difference that is determined by touching the capacitive sensor.

[0007] The mobile device may comprise a wireless transmit/receive unit (WTRU) comprising a receiver and a transmitter for communicating wirelessly.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] Figure 1A is a diagram of an example communications system in which one or more disclosed embodiments may be implemented.

[0009] Figure 1B is a system diagram of an example wireless transmit/receive unit in which one or more disclosed embodiments may be implemented.

[00010] Figure 1C is a system diagram of a radio access network and core network in which one or more disclosed embodiments may be implemented.

[00011] Figure 1D is a system diagram of a radio access network and a core network in which one or more disclosed embodiments may be implemented.

[00012] Figure 1E is a system diagram of the radio access network and the core network in which one or more disclosed embodiments may be implemented.

[00013] Figure 2 is an embodiment of a mobile device for implementing one or more disclosed embodiments;

[00014] Figure 3 is a top view of a front of the embodiment of the mobile device of Figure 2;

[00015] Figure 4 is a bottom view of the back of the embodiment of the mobile device of Figure 2;

[00016] Figure 5 is a side view of the embodiment of the mobile device of Figure 2;

[00017] Figure 5A is an exploded diagrammatical side view of the embodiment of the mobile device of Figure 2.

[00018] Figure 6 is a diagrammatical view of the embodiment of Figure 2;

[00019] Figure 7 is a diagrammatical view of the embodiment of Figure 2.

[00020] Figure 7A is a diagrammatical view of the embodiment of Figure 2 showing the device held in a user hand;

[00021] Figure 8 is a diagram of an embodiment of an electrical system that can be used with the embodiment of Figure 2;

[00022] Figure 9 is a diagram of an embodiment of an electrical system that can be used with the embodiment of Figure 2;

[00023] Figure 10 is an embodiment of an algorithm that can be used with the embodiment of Figure 2; and

[00024] Figure 11 is an embodiment of signal processing that can be used with the embodiment of Figure 2.

[00025] Figure 12 is a top view of the back of a mobile device and side views of the back of a mobile device showing a sensor configuration.

[00026] Figure 13 is a top view of the back of a mobile device and side views of the back of a mobile device showing a sensor configuration.

DETAILED DESCRIPTION

[00027] A detailed description of illustrative embodiments will now be described with reference to the various Figures. Although this description provides a detailed example of possible implementations, it should be noted that the details are intended to be exemplary and in no way limit the scope of the application.

[00028] FIG. 1A is a diagram of an example communications system 100 in which one or more disclosed embodiments may be implemented. The communications system 100 may be a multiple access system that provides content, such as voice, data, video, messaging, broadcast, etc., to multiple wireless users. The communications system 100 may enable multiple wireless users to access such content through the sharing of system resources, including wireless bandwidth. For example, the communications systems 100 may employ one or more channel access methods, such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), and the like.

[00029] As shown in FIG. 1A, the communications system 100 may include wireless transmit/receive units (WTRUs) 102a, 102b, 102c, and/or 102d (which generally or collectively may be referred to as WTRU 102), a radio access network (RAN) 103/104/105, a core network 106/107/109, a public switched telephone network (PSTN) 108, the Internet 110, and other networks 112, though it will be appreciated that the disclosed embodiments contemplate any number of WTRUs, base stations, networks, and/or network elements. Each of the WTRUs 102a, 102b, 102c, 102d may be any type of device configured to operate and/or communicate in a wireless environment. By way of example, the WTRUs 102a, 102b, 102c, 102d may be configured to transmit and/or receive wireless signals and may include user equipment (UE), a mobile station, a fixed or mobile subscriber unit, a pager, a cellular telephone, a personal digital assistant (PDA), a smartphone, a laptop, a netbook, a personal computer, a wireless sensor, consumer electronics, and the like.

[00030] The communications systems 100 may also include a base station 114a and a base station 114b. Each of the base stations 114a, 114b may be any type of device configured to wirelessly interface with at least one of the WTRUs 102a, 102b, 102c, 102d to facilitate access to one or more communication networks, such as the core network 106/107/109, the Internet 110, and/or the networks 112. By way of example, the base stations 114a, 114b may be a base transceiver station (BTS), a Node-B, an eNode B, a Home Node B, a Home eNode B, a site controller, an access point (AP), a wireless router, and the like. While the base stations 114a, 114b are each depicted as a single element, it will be appreciated that the base stations 114a, 114b may include any number of interconnected base stations and/or network elements.

[00031] The base station 114a may be part of the RAN 103/104/105, which may also include other base stations and/or network elements (not shown), such as a base station controller (BSC), a radio network controller (RNC), relay nodes, etc. The base station 114a and/or the base station 114b may be configured to transmit and/or receive wireless signals within a particular geographic region, which may be referred to as a cell (not shown). The cell may further be divided into cell sectors. For example, the cell associated with the base station 114a may be divided into three sectors. Thus, in one embodiment, the base station 114a may include three transceivers, i.e., one for each sector of the cell. In another embodiment, the base station 114a may employ multiple-input multiple-output (MIMO) technology and, therefore, may utilize multiple transceivers for each sector of the cell.

[00032] The base stations 114a, 114b may communicate with one or more of the WTRUs 102a, 102b, 102c, 102d over an air interface 115/116/117, which may be any suitable wireless communication link (e.g., radio frequency (RF), microwave, infrared (IR), ultraviolet (UV), visible light, etc.). The air interface 115/116/117 may be established using any suitable radio access technology (RAT).

[00033] More specifically, as noted above, the communications system 100 may be a multiple access system and may employ one or more channel access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and the like. For example, the base station 114a in the RAN 103/104/105 and the WTRUs 102a, 102b, 102c may implement a radio technology such as Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access (UTRA), which may establish the air interface 115/116/117 using wideband CDMA (WCDMA). WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+). HSPA may include High-Speed Downlink Packet Access (HSDPA) and/or High-Speed Uplink Packet Access (HSUPA).

[00034] In another embodiment, the base station 114a and the WTRUs 102a, 102b, 102c may implement a radio technology such as Evolved UMTS Terrestrial Radio Access (E-UTRA), which may establish the air interface 115/116/117 using Long Term Evolution (LTE) and/or LTE-Advanced (LTE-A).

[00035] In other embodiments, the base station 114a and the WTRUs 102a, 102b, 102c may implement radio technologies such as IEEE 802.16 (i.e., Worldwide Interoperability for Microwave Access (WiMAX)), CDMA2000, CDMA2000 1X, CDMA2000 EV-DO, Interim Standard 2000 (IS-2000), Interim Standard 95 (IS-95), Interim Standard 856 (IS-856), Global System for Mobile communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), GSM EDGE (GERAN), and the like.

[00036] The base station 114b in FIG. 1A may be a wireless router, Home Node B, Home eNode B, or access point, for example, and may utilize any suitable RAT for facilitating wireless connectivity in a localized area, such as a place of business, a home, a vehicle, a campus, and the like. In one embodiment, the base station 114b and the WTRUs 102c, 102d may implement a radio technology such as IEEE 802.11 to establish a wireless local area network (WLAN). In another embodiment, the base station 114b and the WTRUs 102c, 102d may implement a radio technology such as IEEE 802.15 to establish a wireless personal area network (WPAN). In yet another embodiment, the base station 114b and the WTRUs 102c, 102d may utilize a cellular-based RAT (e.g., WCDMA, CDMA2000, GSM, LTE, LTE-A, etc.) to establish a picocell or femtocell. As shown in FIG. 1A, the base station 114b may have a direct connection to the Internet 110. Thus, the base station 114b may not be required to access the Internet 110 via the core network 106/107/109.

[00037] The RAN 103/104/105 may be in communication with the core network 106/107/109, which may be any type of network configured to provide voice, data, applications, and/or voice over internet protocol (VoIP) services to one or more of the WTRUs 102a, 102b, 102c, 102d. For example, the core network 106/107/109 may provide call control, billing services, mobile location-based services, pre-paid calling, Internet connectivity, video distribution, etc., and/or perform high-level security functions, such as user authentication. Although not shown in FIG. 1A, it will be appreciated that the RAN 103/104/105 and/or the core network 106/107/109 may be in direct or indirect communication with other RANs that employ the same RAT as the RAN 103/104/105 or a different RAT. For example, in addition to being connected to the RAN 103/104/105, which may be utilizing an E-UTRA radio technology, the core network 106/107/109 may also be in communication with another RAN (not shown) employing a GSM radio technology.

[00038] The core network 106/107/109 may also serve as a gateway for the WTRUs 102a, 102b, 102c, 102d to access the PSTN 108, the Internet 110, and/or other networks 112. The PSTN 108 may include circuit-switched telephone networks that provide plain old telephone service (POTS). The Internet 110 may include a global system of interconnected computer networks and devices that use common communication protocols, such as the transmission control protocol (TCP), user datagram protocol (UDP) and the internet protocol (IP) in the TCP/IP internet protocol suite. The networks 112 may include wired or wireless communications networks owned and/or operated by other service providers. For example, the networks 112 may include another core network connected to one or more RANs, which may employ the same RAT as the RAN 103/104/105 or a different RAT.

[00039] Some or all of the WTRUs 102a, 102b, 102c, 102d in the communications system 100 may include multi-mode capabilities, i.e., the WTRUs 102a, 102b, 102c, 102d may include multiple transceivers for communicating with different wireless networks over different wireless links. For example, the WTRU 102c shown in FIG. 1A may be configured to communicate with the base station 114a, which may employ a cellular-based radio technology, and with the base station 114b, which may employ an IEEE 802 radio technology.

[00040] FIG. 1B is a system diagram of an example WTRU 102. As shown in FIG. 1B, the WTRU 102 may include a processor 118, a transceiver 120, a transmit/receive element 122, a speaker/microphone 124, a keypad 126, a display/touchpad 128, non-removable memory 130, removable memory 132, a power source 134, a global positioning system (GPS) chipset 136, and other peripherals 138. It will be appreciated that the WTRU 102 may include any subcombination of the foregoing elements while remaining consistent with an embodiment. Also, embodiments contemplate that the base stations 114a and 114b, and/or the nodes that base stations 114a and 114b may represent, such as but not limited to a base transceiver station (BTS), a Node-B, a site controller, an access point (AP), a home node-B, an evolved home node-B (eNodeB), a home evolved node-B (HeNB), a home evolved node-B gateway, and proxy nodes, among others, may include some or all of the elements depicted in FIG. 1B and described herein.

[00041] The processor 118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 102 to operate in a wireless environment. The processor 118 may be coupled to the transceiver 120, which may be coupled to the transmit/receive element 122. While FIG. 1B depicts the processor 118 and the transceiver 120 as separate components, it will be appreciated that the processor 118 and the transceiver 120 may be integrated together in an electronic package or chip.

[00042] The transmit/receive element 122 may be configured to transmit signals to, or receive signals from, a base station (e.g., the base station 114a) over the air interface 115/116/117. For example, in one embodiment, the transmit/receive element 122 may be an antenna configured to transmit and/or receive RF signals. In another embodiment, the transmit/receive element 122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example. In yet another embodiment, the transmit/receive element 122 may be configured to transmit and receive both RF and light signals. It will be appreciated that the transmit/receive element 122 may be configured to transmit and/or receive any combination of wireless signals.

[00043] In addition, although the transmit/receive element 122 is depicted in FIG. IB as a single element, the WTRU 102 may include any number of transmit/receive elements 122. More specifically, the WTRU 102 may employ MIMO technology. Thus, in one embodiment, the WTRU 102 may include two or more transmit/receive elements 122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 115/116/117.

[00044] The transceiver 120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 122 and to demodulate the signals that are received by the transmit/receive element 122. As noted above, the WTRU 102 may have multi-mode capabilities. Thus, the transceiver 120 may include multiple transceivers for enabling the WTRU 102 to communicate via multiple RATs, such as UTRA and IEEE 802.11, for example.

[00045] The processor 118 of the WTRU 102 may be coupled to, and may receive user input data from, the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit). The processor 118 may also output user data to the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128. In addition, the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132. The non-removable memory 130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 118 may access information from, and store data in, memory that is not physically located on the WTRU 102, such as on a server or a home computer (not shown).

[00046] The processor 118 may receive power from the power source 134, and may be configured to distribute and/or control the power to the other components in the WTRU 102. The power source 134 may be any suitable device for powering the WTRU 102. For example, the power source 134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like.

[00047] The processor 118 may also be coupled to the GPS chipset 136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 102. In addition to, or in lieu of, the information from the GPS chipset 136, the WTRU 102 may receive location information over the air interface 115/116/117 from a base station (e.g., base stations 114a, 114b) and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 102 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.

[00048] The processor 118 may further be coupled to other peripherals 138, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity. For example, the peripherals 138 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands-free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.

[00049] FIG. 1C is a system diagram of the RAN 103 and the core network 106 according to an embodiment. As noted above, the RAN 103 may employ a UTRA radio technology to communicate with the WTRUs 102a, 102b, 102c over the air interface 115. The RAN 103 may also be in communication with the core network 106. As shown in FIG. 1C, the RAN 103 may include Node-Bs 140a, 140b, 140c, which may each include one or more transceivers for communicating with the WTRUs 102a, 102b, 102c over the air interface 115. The Node-Bs 140a, 140b, 140c may each be associated with a particular cell (not shown) within the RAN 103. The RAN 103 may also include RNCs 142a, 142b. It will be appreciated that the RAN 103 may include any number of Node-Bs and RNCs while remaining consistent with an embodiment.

[00050] As shown in FIG. 1C, the Node-Bs 140a, 140b may be in communication with the RNC 142a. Additionally, the Node-B 140c may be in communication with the RNC 142b. The Node-Bs 140a, 140b, 140c may communicate with the respective RNCs 142a, 142b via an Iub interface. The RNCs 142a, 142b may be in communication with one another via an Iur interface. Each of the RNCs 142a, 142b may be configured to control the respective Node-Bs 140a, 140b, 140c to which it is connected. In addition, each of the RNCs 142a, 142b may be configured to carry out or support other functionality, such as outer loop power control, load control, admission control, packet scheduling, handover control, macrodiversity, security functions, data encryption, and the like.

[00051] The core network 106 shown in FIG. 1C may include a media gateway (MGW) 144, a mobile switching center (MSC) 146, a serving GPRS support node (SGSN) 148, and/or a gateway GPRS support node (GGSN) 150. While each of the foregoing elements is depicted as part of the core network 106, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.

[00052] The RNC 142a in the RAN 103 may be connected to the MSC 146 in the core network 106 via an IuCS interface. The MSC 146 may be connected to the MGW 144. The MSC 146 and the MGW 144 may provide the WTRUs 102a, 102b, 102c with access to circuit-switched networks, such as the PSTN 108, to facilitate communications between the WTRUs 102a, 102b, 102c and traditional land-line communications devices.

[00053] The RNC 142a in the RAN 103 may also be connected to the SGSN 148 in the core network 106 via an IuPS interface. The SGSN 148 may be connected to the GGSN 150. The SGSN 148 and the GGSN 150 may provide the WTRUs 102a, 102b, 102c with access to packet-switched networks, such as the Internet 110, to facilitate communications between the WTRUs 102a, 102b, 102c and IP-enabled devices.

[00054] As noted above, the core network 106 may also be connected to the networks 112, which may include other wired or wireless networks that are owned and/or operated by other service providers.

[00055] FIG. 1D is a system diagram of the RAN 104 and the core network 107 according to an embodiment. As noted above, the RAN 104 may employ an E-UTRA radio technology to communicate with the WTRUs 102a, 102b, 102c over the air interface 116. The RAN 104 may also be in communication with the core network 107.

[00056] The RAN 104 may include eNode-Bs 160a, 160b, 160c, though it will be appreciated that the RAN 104 may include any number of eNode-Bs while remaining consistent with an embodiment. The eNode-Bs 160a, 160b, 160c may each include one or more transceivers for communicating with the WTRUs 102a, 102b, 102c over the air interface 116. In one embodiment, the eNode-Bs 160a, 160b, 160c may implement MIMO technology. Thus, the eNode-B 160a, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 102a.

[00057] Each of the eNode-Bs 160a, 160b, 160c may be associated with a particular cell (not shown) and may be configured to handle radio resource management decisions, handover decisions, scheduling of users in the uplink and/or downlink, and the like. As shown in FIG. 1D, the eNode-Bs 160a, 160b, 160c may communicate with one another over an X2 interface.

[00058] The core network 107 shown in FIG. 1D may include a mobility management entity (MME) 162, a serving gateway 164, and a packet data network (PDN) gateway 166. While each of the foregoing elements is depicted as part of the core network 107, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.

[00059] The MME 162 may be connected to each of the eNode-Bs 160a, 160b, 160c in the RAN 104 via an S1 interface and may serve as a control node. For example, the MME 162 may be responsible for authenticating users of the WTRUs 102a, 102b, 102c, bearer activation/deactivation, selecting a particular serving gateway during an initial attach of the WTRUs 102a, 102b, 102c, and the like. The MME 162 may also provide a control plane function for switching between the RAN 104 and other RANs (not shown) that employ other radio technologies, such as GSM or WCDMA.

[00060] The serving gateway 164 may be connected to each of the eNode-Bs 160a, 160b, 160c in the RAN 104 via the S1 interface. The serving gateway 164 may generally route and forward user data packets to/from the WTRUs 102a, 102b, 102c. The serving gateway 164 may also perform other functions, such as anchoring user planes during inter-eNode B handovers, triggering paging when downlink data is available for the WTRUs 102a, 102b, 102c, managing and storing contexts of the WTRUs 102a, 102b, 102c, and the like.

[00061] The serving gateway 164 may also be connected to the PDN gateway 166, which may provide the WTRUs 102a, 102b, 102c with access to packet-switched networks, such as the Internet 110, to facilitate communications between the WTRUs 102a, 102b, 102c and IP-enabled devices.

[00062] The core network 107 may facilitate communications with other networks. For example, the core network 107 may provide the WTRUs 102a, 102b, 102c with access to circuit-switched networks, such as the PSTN 108, to facilitate communications between the WTRUs 102a, 102b, 102c and traditional land-line communications devices. For example, the core network 107 may include, or may communicate with, an IP gateway (e.g., an IP multimedia subsystem (IMS) server) that serves as an interface between the core network 107 and the PSTN 108. In addition, the core network 107 may provide the WTRUs 102a, 102b, 102c with access to the networks 112, which may include other wired or wireless networks that are owned and/or operated by other service providers.

[00063] FIG. 1E is a system diagram of the RAN 105 and the core network 109 according to an embodiment. The RAN 105 may be an access service network (ASN) that employs IEEE 802.16 radio technology to communicate with the WTRUs 102a, 102b, 102c over the air interface 117. As will be further discussed below, the communication links between the different functional entities of the WTRUs 102a, 102b, 102c, the RAN 105, and the core network 109 may be defined as reference points.

[00064] As shown in FIG. 1E, the RAN 105 may include base stations 180a, 180b, 180c, and an ASN gateway 182, though it will be appreciated that the RAN 105 may include any number of base stations and ASN gateways while remaining consistent with an embodiment. The base stations 180a, 180b, 180c may each be associated with a particular cell (not shown) in the RAN 105 and may each include one or more transceivers for communicating with the WTRUs 102a, 102b, 102c over the air interface 117. In one embodiment, the base stations 180a, 180b, 180c may implement MIMO technology. Thus, the base station 180a, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 102a. The base stations 180a, 180b, 180c may also provide mobility management functions, such as handoff triggering, tunnel establishment, radio resource management, traffic classification, quality of service (QoS) policy enforcement, and the like. The ASN gateway 182 may serve as a traffic aggregation point and may be responsible for paging, caching of subscriber profiles, routing to the core network 109, and the like.

[00065] The air interface 117 between the WTRUs 102a, 102b, 102c and the RAN 105 may be defined as an R1 reference point that implements the IEEE 802.16 specification. In addition, each of the WTRUs 102a, 102b, 102c may establish a logical interface (not shown) with the core network 109. The logical interface between the WTRUs 102a, 102b, 102c and the core network 109 may be defined as an R2 reference point, which may be used for authentication, authorization, IP host configuration management, and/or mobility management.

[00066] The communication link between each of the base stations 180a, 180b, 180c may be defined as an R8 reference point that includes protocols for facilitating WTRU handovers and the transfer of data between base stations. The communication link between the base stations 180a, 180b, 180c and the ASN gateway 182 may be defined as an R6 reference point. The R6 reference point may include protocols for facilitating mobility management based on mobility events associated with each of the WTRUs 102a, 102b, 102c.

[00067] As shown in FIG. 1E, the RAN 105 may be connected to the core network 109. The communication link between the RAN 105 and the core network 109 may be defined as an R3 reference point that includes protocols for facilitating data transfer and mobility management capabilities, for example. The core network 109 may include a mobile IP home agent (MIP-HA) 184, an authentication, authorization, accounting (AAA) server 186, and a gateway 188. While each of the foregoing elements is depicted as part of the core network 109, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.

[00068] The MIP-HA 184 may be responsible for IP address management, and may enable the WTRUs 102a, 102b, 102c to roam between different ASNs and/or different core networks. The MIP-HA 184 may provide the WTRUs 102a, 102b, 102c with access to packet-switched networks, such as the Internet 110, to facilitate communications between the WTRUs 102a, 102b, 102c and IP-enabled devices. The AAA server 186 may be responsible for user authentication and for supporting user services. The gateway 188 may facilitate interworking with other networks. For example, the gateway 188 may provide the WTRUs 102a, 102b, 102c with access to circuit-switched networks, such as the PSTN 108, to facilitate communications between the WTRUs 102a, 102b, 102c and traditional land-line communications devices. In addition, the gateway 188 may provide the WTRUs 102a, 102b, 102c with access to the networks 112, which may include other wired or wireless networks that are owned and/or operated by other service providers.

[00069] Although not shown in FIG. 1E, it will be appreciated that the RAN 105 may be connected to other ASNs and the core network 109 may be connected to other core networks. The communication link between the RAN 105 and the other ASNs may be defined as an R4 reference point, which may include protocols for coordinating the mobility of the WTRUs 102a, 102b, 102c between the RAN 105 and the other ASNs. The communication link between the core network 109 and the other core networks may be defined as an R5 reference point, which may include protocols for facilitating interworking between home core networks and visited core networks.

[00070] Figure 2 depicts a diagrammatical view of a preferred embodiment of a mobile device 200. The mobile device 200 may be a WTRU as described with reference to Figures 1A-1E. The mobile device need not have a receiver or a transmitter; in alternative embodiments, the mobile device 200 lacks either or both of a receiver and a transmitter. The mobile device 200 may be a smart phone, cellular telephone, tablet, laptop, or other mobile computing device. The mobile device may have a front side 202, a back side 204, a first side 206, a second side 208, a third side 210, and a fourth side 212, as shown with reference to Figures 2-6. The front side 202, the back side 204, the first side 206, the second side 208, the third side 210, and the fourth side 212 may be plastic, metal, or any suitable material for a mobile device. The front side 202, the back side 204, the first side 206, the second side 208, the third side 210, and the fourth side 212 may be integrally formed. Alternatively, one or more of the front side 202, the back side 204, the first side 206, the second side 208, the third side 210, and the fourth side 212 may be separate pieces that are connected by any conventional means, such as one or more fasteners, one or more adhesives, heat, or a fusion process.

[00071] Figure 3 is a top view of the front side 202 of the mobile device 200. The front side 202 may have a computer display 214. The display 214 may have a touch screen that permits a user to input gestures to the device 200 and thereby move images on the display 214 or input data. The display 214 may include a resistive or force sensor or sensors, a capacitive and/or proximity sensor or sensors, surface acoustic wave technologies, optical imaging technologies, dispersive signal technologies, acoustic pulse recognition, and/or any other suitable sensor or technology that may be used to record a tap and/or a pressure thereof. For example, the display 214 may be a capacitive touch screen that may include a capacitive touch sensor used in smart phones, tablets, or computers (e.g., Apple iPhone or iPad). Examples of capacitive touch screens that can be used include a TrueTouch® Multi-Touch All-Points Touchscreen Controller connected to a capacitive touch sensor integrated into the liquid-crystal display (LCD), like the displays generally used in smart phones. The front side preferably has a front width 216 and a front length 218.

[00072] Figure 4 is a top view of the back side 204 of the mobile device 200. The back side 204 may have a back width 220 and a back length 222.

[00073] The mobile device 200 may have one or more proximity sensors 224. An example of a proximity sensor that may be used is an electrode capacitive sensor connected to a controller that may recognize the change in capacitance when it is blocked by an object such as a hand. The one or more proximity sensors 224 may comprise temperature sensors 224a, 224b disposed in the front side 202 and the back side 204 of the device 200 for sensing the temperature difference between the front side 202 and the back side 204. When the temperature sensors 224a, 224b sense a temperature difference exceeding a certain threshold, they can activate the touch screens 250 directly or be coupled to the processor to activate the touch screen. Conversely, when the temperature difference is within a certain threshold, the sensors 224a, 224b can directly, or as controlled by the processor, deactivate the touch screen 250. For example, if a device is held by a user in the palm of the hand (e.g., Fig. 7A), the sensors 224a, 224b may sense a higher temperature on the back side 204 than on the front side 202 and activate the touch screen 250. When the device 200 is set on a surface, the temperature difference sensed by sensors 224a and 224b between the front side 202 and the back side 204 may decrease below a threshold and the touch screen 250 will be deactivated. In alternative or additional examples (e.g., rather than or in conjunction with temperature sensors), a capacitive proximity sensor may be used. For example, the device configuration may be such that the back side may have two touch sensitive areas covering the top and bottom halves of the back side, each a plastic-film capacitive sensor with, for example, a Cypress TrueTouch touch screen controller such as the CY8CTMA340 TrueTouch® Multi-Touch All-Points Touchscreen Controller that may support up to four fingers with accuracy. Such a device configuration may be used to detect a hand. Additionally, a simple electrode that may be used as a touch button in the middle of the lower part of the body and the upper part of the body, connected to, for example, a Cypress CapSense controller such as the CY8CMBR2110, may be used to detect proximity of the hand and activate the side sensors accordingly.

[00074] As shown in Figure 5, the first side 206 may have a side length 228 and a side width 230. The side width 230 may be less than the front width 216 and the back width 220. The side length 228 may be substantially the same as the front length 218 and the back length 222. The second side 208 may have a side length and side width that are respectively the same or substantially the same (e.g., the lengths are within 95% of each other) as the first side length 228 and width 230. Alternatively, the side length and the side width of the second side 208 may differ from the first side length and width if the device is irregularly shaped. The third side 210 and the fourth side 212 may have a side length and a side width. The side width of the third side 210 and the fourth side 212 may be the same or substantially the same (e.g., the widths are within 95% of each other) as the side width 230 of the first side 206 and the second side 208. The side lengths of the third side 210 and the fourth side 212 may be the same or substantially the same (e.g., the lengths are within 95% of each other), or may differ if the device is irregularly shaped.

[00075] As shown in Figure 5A, the first side 206 may have a touch sensor 250. The touch sensor 250 may be a capacitive sensor or any other suitable sensor or technology that may receive a touch such as a tap and/or a pressure associated therewith as described herein (e.g., a resistive or force sensor or sensors, a proximity sensor or sensors, surface acoustic wave technologies, optical imaging technologies, dispersive signal technologies, acoustic pulse recognition, and/or any other suitable sensor or technology that may be used to record a tap and/or a pressure thereof).

[00076] The touch sensor 250 may have two capacitive sensors 250a and 250b. Each of the second side 208, third side 210, and fourth side 212 may also have a touch sensor 250. The touch sensor 250 may run around the entire sides 206, 208, 210, 212 of the device 200 or any portion of the device 200. The sensor on the side may comprise an array of electrodes connected to a multiplexing controller that may detect the changes in capacitance caused by touches on the side. Further, in an example, a slider type of sensor composed of chevron-shaped electrodes may be used for detecting the positioning and movement of fingers along the side of the device. The sensor may be connected to, for example, a controller such as a Cypress CapSense controller, including the CY8CMBR2110, that may support ten electrodes. According to an example (e.g., for better or improved accuracy in some instances for detecting a swipe), a plastic-film capacitive sensor may be used (e.g., of the kind typically used in a touch screen) with, for example, a Cypress TrueTouch touch screen controller such as the CY8CTMA340 TrueTouch® Multi-Touch All-Points Touchscreen Controller, which may support up to four fingers. Such a configuration or set up may be used to create two touch sensitive panels on the back of the device including, for example, one in the upper part of the body and the other one in the lower part of the body. In examples, these could be used not only to detect how the device may be held in the hand, but also to detect gestures on the back side of the device.
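As a hedged illustration of how an array of electrodes read through a multiplexing controller might be turned into a finger position along the side, the following Python sketch estimates the position as the capacitance-weighted centroid of the active electrodes. The readings and the threshold value are invented for the example.

def slider_position(readings: list[float], touch_threshold: float = 5.0) -> float | None:
    """Return a position in [0, 1] along the side, or None if nothing is touching.

    readings: change in capacitance per electrode, ordered bottom to top.
    """
    active = [(i, r) for i, r in enumerate(readings) if r > touch_threshold]
    if not active:
        return None
    total = sum(r for _, r in active)
    # Capacitance-weighted centroid of the electrodes that see a touch.
    centroid = sum(i * r for i, r in active) / total
    return centroid / (len(readings) - 1)  # normalize to [0, 1]


# Example: a finger resting near the top of a ten-electrode slider.
print(slider_position([0, 0, 0, 0, 0, 0, 2, 8, 12, 6]))  # ~0.88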

[00077] The user may use the touch sensor 250 to manipulate the display 214. For example, the user may provide a swipe gesture with the touch sensor 250 that moves the display as if swiping the display 214 itself. By way of example, a swipe with the touch sensor 250 may move the displayed image up or down to provide scrolling action to provide additional text or images. As shown in Figure 6, the image 251 on the display 214 may be "moved" in the direction from side 210 to side 212 or in the direction from side 212 to side 210 by gesturing with the touch sensor 250 from side 210 to side 212 or from side 212 to side 210, as indicated by the arrow 253. "Moved" is used to describe the process of electrically displaying new material and includes, for example, scrolling, page turning, and moving images. A display image 259 may be moved laterally by swiping the touch sensor from side 210 to side 212 or from side 212 to side 210, as shown by the arrow 255 in Figure 7. For example, a gesture in the direction of arrow 255 in Figure 7 from side 210 to side 212 can simulate paging down to the next page in a document, and a gesture in the direction from side 212 to side 210 can simulate paging up to the previous page in a document. This is also shown in Figure 7A. Figure 7A depicts the device 200 held in a user's right hand. The side touch sensor 250 may be manipulated with a thumb. Alternatively or additionally, the touch sensor may be manipulated with one or more other fingers on side 208 (e.g., a two finger tap may be used as a gesture). The mobile device 200 can be configured so that a swipe gesture in any particular direction causes a particular movement of the image on the touch screen (e.g., page up, page down, next page, previous page, etc.).
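The paging behavior described above can be sketched as follows; this Python fragment (an illustrative assumption, not the patent's implementation) classifies a swipe from successive normalized positions along the side sensor and maps its direction to page-down or page-up.

def classify_swipe(positions: list[float], min_travel: float = 0.2) -> str | None:
    """positions: normalized slider positions sampled over time (0 = one end, 1 = other)."""
    if len(positions) < 2:
        return None
    travel = positions[-1] - positions[0]
    if travel > min_travel:
        return "page_down"   # e.g., a gesture toward side 212 pages forward
    if travel < -min_travel:
        return "page_up"     # a gesture back toward side 210 pages back
    return None              # too little travel to count as a swipe


print(classify_swipe([0.1, 0.25, 0.4, 0.6]))  # 'page_down'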

[00078] A "zoom/pinch" gesture may be performed by a touch on a touch screen accompanied by movement. A "zoom/pinch" gesture may be performed by providing a gesture from the first side 210 to the second side 212 or the second side 212 to the first side 210 on the touch screen 250 on either side 206 or 208, and providing a touch gesture on the touch screen 250 on the other side 206 or 208. It will be appreciated that the zoom/pinch feature can be employed by using a touch sensor on any side of the device as the touch sensor and the touch sensor on another side of the device as the movement gesture. For example, a thumb may be used to perform a gesture on one side and one or more fingers may be used to provide a gesture on the other side. The thumb can be constant and one or more fingers may be moved up or down to provide a zoom/pinch gesture. Moving a finger up towards the top may be a zoom in (e.g., enlarge) gesture, and moving a finger down may be a zoom out (e.g., reduce size) gesture. The mobile device 200 can be configures so that the zoom/pinch gesture corresponds to a particular gesture.

[00079] One or more touch screens 250 can be used to zoom or focus in or out on the display 214. For example, by touching the touch screen 250 an image can be selected, and by moving fingers together or apart, the selected display image can be made larger or smaller. An image can be selected for focusing on (e.g., zooming in or out), and when selected, vibration of the selected image can be simulated on the display by moving the object, as controlled by the processor in response to selection. An image can be selected with various gestures, such as tapping the touch screen 250 twice within a certain period of time. The mobile device can be configured so that when an image is selected it vibrates or shakes, and if there is no gesture within a certain time frame, the selected image will stop shaking. This can be performed by the mobile device processor comparing whether a gesture for the selected image has been received within a threshold time period.
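The double-tap selection and selection timeout described above might be tracked as in the following Python sketch; the tap window and timeout lengths are invented values, since the patent leaves the time periods unspecified.

class SelectionTracker:
    DOUBLE_TAP_WINDOW = 0.4   # seconds between taps that count as a double tap (assumed)
    SELECTION_TIMEOUT = 3.0   # seconds a selection stays alive with no gesture (assumed)

    def __init__(self) -> None:
        self.last_tap_time: float | None = None
        self.selected_since: float | None = None

    def on_tap(self, now: float) -> None:
        if self.last_tap_time is not None and now - self.last_tap_time <= self.DOUBLE_TAP_WINDOW:
            self.selected_since = now  # double tap: select, start "shaking" the image
        self.last_tap_time = now

    def is_selected(self, now: float) -> bool:
        if self.selected_since is None:
            return False
        if now - self.selected_since > self.SELECTION_TIMEOUT:
            self.selected_since = None  # no gesture in time: stop shaking, deselect
            return False
        return True


tracker = SelectionTracker()
tracker.on_tap(0.0)
tracker.on_tap(0.3)              # second tap within the window -> selected
print(tracker.is_selected(1.0))  # True
print(tracker.is_selected(5.0))  # False, selection timed out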

[00080] For mobile devices 200 that are handheld, the mobile device 200 may have a feature for configuring the mobile device 200 for either a left-handed or right-handed user. For a handheld mobile device 200, the device may determine in which hand it is being held and control the touch screens 250 in response to the left or the right hand holding the device. For example, if a touch screen senses two or more simultaneous inputs (e.g., two or more fingers), that indicates that the opposing side touch screen is proximal to the person's thumb. With reference to Figure 2, if the touch screen 250 in side 206 senses two inputs, the processor may determine that the device is held in the left hand. The touch screen 250 that senses two or more inputs may be used for selecting an image or zooming, and the opposite side touch screen can be used for scrolling (e.g., with the thumb). For example, in Figure 7A, the thumb is shown on one side and the other fingers are shown on the opposing side, and each can be sensed by the touch screens on opposing sides.
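A minimal sketch of this handedness inference: the side reporting two or more simultaneous contacts is assumed to be under the fingers, so the thumb (and the scrolling role) is assigned to the opposite side. The contact counts are stand-ins for real sensor inputs.

def infer_handedness(side_206_contacts: int, side_208_contacts: int) -> str | None:
    """Return 'left' or 'right' for the holding hand, or None if ambiguous."""
    if side_206_contacts >= 2 and side_208_contacts < 2:
        return "left"    # fingers wrap side 206 -> thumb on side 208, left-hand grip
    if side_208_contacts >= 2 and side_206_contacts < 2:
        return "right"   # fingers wrap side 208 -> thumb on side 206, right-hand grip
    return None


print(infer_handedness(side_206_contacts=3, side_208_contacts=1))  # 'left'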

[00081] The proximity sensor 224 can sense that the device is being held in a user's hand. The proximity sensor may be electrically coupled to the processor such that the processor only activates the touch sensor 250 when the proximity sensor 224 senses a person's hand. For example, the touch sensor may be coupled to the proximity sensor directly or through the processor by an electrical or mechanical switch. The switch can activate the touch sensor when the proximity sensor senses the hand. The switch can be an electrical switch that is comprised of hardware or software. For example, the processor can be programmed to determine whether the proximity sensor senses a hand, and if the proximity sensor senses a hand, the processor can enable the touch sensor 250 by, for example, providing power. When the processor determines that the proximity sensor does not sense a hand, the processor disables the touch sensor by, for example, removing power. The proximity feature 224 prevents unwanted gestures from being inputted to the touch screen 250 when the mobile device 200 is not held in a hand.

[00082] The mobile device 200 may have a power switch 201, as shown in Figure 2. The power switch 201 may be electrically coupled to the battery and the touch screen 250 directly or through the processor. Upon activating the power switch 201, the touch screen 250 may receive power and be enabled. Upon deactivating the power switch 201, power may be removed from the touch screen 250 and the touch screen 250 may be disabled. The power switch 201 can control power to the touch screen 250 in addition to the proximity sensor 224, or the proximity sensor 224 need not be used.

[00083] Activation of the touch screen 250 may also be controlled by one or more software programs, such as mobile applications that run on the mobile device. For example, a mobile application can have software that asks the user whether to enable or disable the touch screen 250, and the software can be coupled to the processor to control the activation/deactivation of the touch screen 250.

[00084] The display 214 may have a touch screen 214a. The touch screen 214a may operate independently of one or more side touch screens 250, such that the display touch screen 214a is not inhibited or affected by the touch sensors 250.

[00085] The mobile device 200 may have the electrical processing as shown in Figure 8. The processing includes a processor comprising a gesture recognizer 802, an application 804 having logic to process gestures, and a screen controller 806. As shown, the gesture recognizer can be electrically coupled to the touch screen 250 (e.g., capacitive sensors) and the proximity sensors 224. The gesture recognizer 802 may also be electrically coupled to the application logic 804. The application logic 804 may be electrically coupled to the screen controller 806, and the screen controller may be electrically coupled to the device screen 214.
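As a hedged illustration of the Figure 8 chain, the following Python sketch wires a gesture recognizer to application logic and a screen controller; the class and method names are invented, and the recognition step is reduced to a trivial direction test.

class ScreenController:
    def apply(self, command: str) -> None:
        print(f"screen: {command}")  # e.g., redraw the display with the scrolled content


class ApplicationLogic:
    def __init__(self, screen: ScreenController) -> None:
        self.screen = screen

    def on_gesture(self, gesture: str) -> None:
        # Map the recognized gesture to its effect on the display.
        effects = {"swipe_up": "scroll up", "swipe_down": "scroll down", "double_tap": "select image"}
        if gesture in effects:
            self.screen.apply(effects[gesture])


class GestureRecognizer:
    def __init__(self, app: ApplicationLogic) -> None:
        self.app = app

    def on_raw_touch(self, start: float, end: float) -> None:
        # Classify raw side-sensor samples into a gesture type and forward it.
        gesture = "swipe_up" if end < start else "swipe_down"
        self.app.on_gesture(gesture)


recognizer = GestureRecognizer(ApplicationLogic(ScreenController()))
recognizer.on_raw_touch(start=0.8, end=0.2)  # prints "screen: scroll up"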

[00086] The gesture recognizer 802 is preferably programmed with an algorithm stored in any of the computer memory described above. The gesture recognizer 802 algorithm can recognize the gesture type as a touch, movement type, amount of movement, selection, and so forth. The gesture recognizer may send a signal to the application 804 indicative of the type of gesture that has been inputted into the touch screen 250. The application logic 804 determines the corresponding effect on the screen display 214 from the inputted gesture and transmits instructions to the screen controller corresponding to the type of gesture. The screen controller 806 controls the screen display in response to the application instructions by, for example, scrolling, selecting images, or zooming.

[00087] As shown in Figure 9, the mobile device 200 may have the processing shown in Figure 8, wherein the proximity sensors 224 comprise temperature sensors 224a, 224b. In this embodiment, the mobile device may further have a comparator, which may be implemented in hardware or software, that compares the temperature sensed by temperature sensor 224a with the temperature sensed by temperature sensor 224b. The logic of the comparator is set forth in Figure 10, and when the comparator is implemented in software, executable instructions for executing the algorithm in Figure 10 may be saved in the device memory, which can be any suitable computer memory, as described above with reference to Figures 1A-1E. At step 1002, the processor may receive the temperature T1 sensed from one of the two sensors 224a, 224b. At step 1004, the processor may receive the temperature T2 sensed from the other of the two sensors 224a, 224b. The temperatures from the sensors can be saved in computer memory of the type described with reference to Figures 1A-1E. At step 1006, the processor can then subtract T1 and T2 and save the difference in memory. The processor can compare the magnitude of the difference between T1 and T2 to a first threshold at step 1006. If the magnitude of the difference between T1 and T2 is greater than (or, in another embodiment, greater than or equal to) the first threshold, the processor can then enable the side touch screen 250 at step 1008. The processor enables the side touch screen 250 because the temperature difference indicates that the device is held by a user, as shown in Figure 7A. The first threshold may be set to be indicative of a user holding a device 200 in a hand with one side (e.g., 204) covered by the hand and the other side (e.g., 202) exposed to ambient. If the magnitude of the difference between T1 and T2 is not greater than (or, in another embodiment, not greater than or equal to) the first threshold, the processor can move to step 1010. The threshold may be indicative of the difference between ambient temperature and the temperature of a person's hand. At step 1010, the processor can compare the difference between T1 and T2 to a second threshold. If the magnitude of the difference between T1 and T2 is less than (or less than or equal to) the second threshold, the processor can disable the side touch screen at step 1012. The processor disables the side touch screen 250 because the temperature difference indicates that the device is not held by a user. The first threshold may be equal to the second threshold. The first threshold may differ from the second threshold.
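The Figure 10 comparator logic might be expressed in software as the following Python sketch; the threshold values are illustrative assumptions, since the patent leaves them implementation-defined, and the between-thresholds case simply keeps the current state.

def update_side_sensor(t_front: float, t_back: float,
                       enabled: bool,
                       enable_threshold: float = 4.0,
                       disable_threshold: float = 2.0) -> bool:
    """Return the new enabled/disabled state of the side touch sensor."""
    diff = abs(t_front - t_back)   # steps 1002-1006: read T1 and T2, take |T1 - T2|
    if diff > enable_threshold:    # steps 1006/1008: device likely held in a hand
        return True
    if diff < disable_threshold:   # steps 1010/1012: device likely set down
        return False
    return enabled                 # between thresholds: keep the current state


state = False
state = update_side_sensor(t_front=24.0, t_back=33.0, enabled=state)  # held: enabled
print(state)   # True
state = update_side_sensor(t_front=24.5, t_back=25.0, enabled=state)  # on a table: disabled
print(state)   # False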

[00088] Figure 11 depicts a signal flow diagram for the signals that may be used in the device 200. As shown, at 1102, the proximity sensor 224 can send a signal to the gesture recognizer. If the signal indicates that the device is hand held (e.g., with temperature sensors, the temperature difference being above a threshold), the gesture recognizer 802 sends a signal to the screen controller at 1104 to enable the side touch screen 250. When the side touch screen 250 receives a gesture, that gesture is sent to the gesture recognizer 802 at 1106. The gesture recognizer 802 communicates a signal indicative of the gesture (e.g., gesture type, magnitude, direction) to the application 804 at 1108 (e.g., a swipe plus direction is shown in Figure 11). The application 804 determines the signals to send to the screen controller 806 that correspond to the gesture for displaying images on the display 214, and sends those signals to the screen controller at 1110.
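
A minimal sketch of the Figure 11 message sequence, under the assumption that each signal can be represented as a logged string, is set out below; the function name and message formats are illustrative only.

    # Hypothetical sketch of the Figure 11 signal flow.
    def signal_flow(temp_diff: float, threshold: float, direction: str) -> list:
        log = []
        # 1102: the proximity sensor 224 signals the gesture recognizer.
        log.append(f"1102 sensor -> recognizer: diff={temp_diff}")
        if temp_diff > threshold:
            # 1104: the recognizer enables the side touch screen 250.
            log.append("1104 recognizer -> controller: enable side touch")
            # 1106: a gesture on the side touch screen reaches the recognizer.
            log.append(f"1106 side touch -> recognizer: raw swipe {direction}")
            # 1108: the recognizer reports gesture type and direction.
            log.append(f"1108 recognizer -> application: swipe {direction}")
            # 1110: the application commands the screen controller 806.
            log.append(f"1110 application -> controller: scroll {direction}")
        return log

    for message in signal_flow(8.0, 5.0, "up"):
        print(message)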

[00089] The mobile device may have a configuration as shown in Figure 12. Figure 12 illustrates the positioning of sensors on the back side 214 and the sides 210, 212 of the WTRU. The back side may have a sensor array a 1200 and a sensor array b 1202. Side 212 may have sensor array c 1204 and sensor array d 1206. Side 210 may have sensor array e 1208 and sensor array f 1210. The sensitive areas on the sides of the mobile device may be activated when the device is held in the palm of the hand. This can be accomplished by a proximity sensor in the lower part of the mobile device body (e.g., sensor array b 1202), by two proximity sensors (one in the upper part and one in the lower part of the mobile device body, e.g., sensor arrays a, b 1200, 1202), or by two temperature sensors (one in the upper part and the other in the lower part of the mobile device) as described above. Along one long edge of the device there may be two capacitive touch sensors 1204, 1206, which run parallel to each other, at a maximal distance apart, from the bottom of the device to the top. A similar set of touch sensors 1208, 1210 may be embedded on the other long edge of the mobile device. The touch screen on the front of the mobile device may still work as a touch screen, and touches on it can also be interpreted; a user may scroll the content by swiping the front touch screen instead of the sides 210, 212 of the mobile device. The two touch-sensitive areas 1200, 1202 on the back 214 of the mobile device can be used for determining the mobile device's position in the hand and as touch control devices for the front side touch screen.

[00090] In order to recognize multiple fingers, sensors 1204, 1206, 1208, and 1210 are arrays of sensors. If both sensor array 1200 and sensor array 1202 are simultaneously covered, the back and side panel sensors may be switched off. If the front side touch screen is covered by at least 50% (e.g., covered by a hand), the back panel sensor arrays 1200, 1202 and the sensors on the sides 1204, 1206, 1208, 1210 may be switched on. If sensor array 1202 senses that it is covered by at least 50% (e.g., by the palm of the hand), sensor 1206 senses more than one finger, and sensor 1210 senses that it is mostly covered (e.g., more than 50%), sensors 1200, 1208, and 1204 are activated. In this case, the mobile device may be held in a hand (e.g., in a right hand).

Likewise, if sensor array 1202 senses that it is covered by at least 50% (e.g., by the palm of the hand), sensor 1210 senses more than one finger, and sensor 1206 senses that it is mostly covered (e.g., more than 50%), sensors 1200, 1204, 1208 are activated. In this case, the mobile device may be held in a hand (e.g., held in a left hand). If sensor array 1200 senses that it is mostly covered (e.g., by more than 50%, e.g., by the palm of the hand), sensor 1208 senses more than one finger, and sensor 1204 senses that it is mostly covered (e.g., by more than 50%), sensors 1202, 1210, and 1206 are activated. In this case, the mobile device may be held in a hand (e.g., held in a right hand). If sensor array 1200 senses that it is covered (e.g., by more than 50%, e.g., by the palm of the hand), sensor 1204 senses more than one finger, and sensor 1208 senses that it is mostly covered (e.g., by more than 50%), sensors 1202, 1210, and 1206 are activated. In this case, the mobile device may be held in a palm (e.g., the mobile device is held in the left hand).
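
By way of non-limiting illustration, the activation rules of this paragraph and the preceding one might be expressed in software as follows. The input format, namely coverage fractions and finger counts keyed by reference number, is an assumption made for the sketch, as is giving the both-back-arrays-covered rule priority over the others.

    # Hypothetical sketch of the Figure 12 activation rules.
    # coverage: fraction (0.0-1.0) covered per array; fingers: touch counts.
    def held_hand(coverage: dict, fingers: dict) -> tuple:
        def covered(array: str) -> bool:
            return coverage.get(array, 0.0) > 0.5  # "mostly covered"
        if covered("1200") and covered("1202"):
            return "none", set()        # back covered: panels switched off
        if covered("1202") and fingers.get("1206", 0) > 1 and covered("1210"):
            return "right", {"1200", "1208", "1204"}
        if covered("1202") and fingers.get("1210", 0) > 1 and covered("1206"):
            return "left", {"1200", "1204", "1208"}
        if covered("1200") and fingers.get("1208", 0) > 1 and covered("1204"):
            return "right", {"1202", "1210", "1206"}
        if covered("1200") and fingers.get("1204", 0) > 1 and covered("1208"):
            return "left", {"1202", "1210", "1206"}
        return "unknown", set()

    hand, active = held_hand({"1202": 0.8, "1210": 0.7}, {"1206": 2})
    # hand == "right"; active == {"1200", "1208", "1204"}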

[00091] The mobile device may have a configuration as shown in FIG. 13. Figure 13 illustrates the positioning of sensors on the back side 214 and the sides 210, 212 of the mobile device. The back side may have a sensor array a 1300 and a sensor array b 1302. Side 210 may have sensor array c 1304 and sensor array d 1306. Side 212 may have sensor array e 1308 and sensor array f 1310. The configuration of FIG. 13 may operate similarly to that of FIG. 12, but may also have IR/temperature sensors 1300a, 1300b, 1300c, 1300d, 1302a, 1302b, 1302c, 1302d as an alternative or complement to using sensor arrays 1300 and 1302 for detection. If at least three of the sensors 1302a, 1302b, 1302c, 1302d are each mostly covered (e.g., more than 50%) (e.g., detecting proximity of the hand), the sensors 1300, 1308, and 1304 may be activated. This may indicate the mobile device is held upright. Similarly, if at least three of the sensors 1300a, 1300b, 1300c, 1300d are each mostly covered (e.g., more than 50%), sensors 1302, 1306, and 1310 are activated. The configurations shown in FIGS. 12 and 13 can work with the other descriptions of the mobile device herein. In the mobile device of FIGS. 12 and 13, a proximity sensor on the back of the device may activate the "touch area" on the sides of the device. The touch area is composed of two capacitive sensors that can sense the movement of a finger up and down the side of the device and scroll the content on the screen in relation to the movement of the finger. The mobile device may have a touch-sensitive area around it, which may be activated by switching it on in the device setup. Switching the touch-sensitive area around the device on and off may be controlled programmatically from an application running on the device. Content may be scrolled to the next page by swiping from the front of the side to the back of the side of the device. Likewise, content can be moved to the previous page by swiping from the back side of the device to the front side of the device.
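
As a hedged illustration of the at-least-three-of-four rule above, the following sketch assumes each IR/temperature sensor reports the fraction of its area that is covered; the function name and input format are assumptions made for the sketch.

    # Hypothetical sketch of the Figure 13 IR/temperature-sensor rule.
    # Each list holds the covered fraction (0.0-1.0) of sensors
    # 1300a-1300d or 1302a-1302d; "mostly covered" is read as > 50%.
    def activations(ir_1300: list, ir_1302: list) -> set:
        def mostly_covered(readings: list) -> int:
            return sum(r > 0.5 for r in readings)
        if mostly_covered(ir_1302) >= 3:
            # Lower back sensors covered by the hand: likely held upright.
            return {"1300", "1308", "1304"}
        if mostly_covered(ir_1300) >= 3:
            return {"1302", "1306", "1310"}
        return set()

    active = activations(ir_1300=[0.1, 0.2, 0.1, 0.0],
                         ir_1302=[0.9, 0.8, 0.7, 0.2])
    # active == {"1300", "1308", "1304"}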

Keeping the thumb in place and moving the index finger up and down the touch sensor on the opposite side of the device may generate a zoom/pinch gesture. Moving the index finger up may be interpreted as zoom in (content on the touch screen is enlarged), and moving the index finger down may be interpreted as zoom out (content becomes smaller). Regardless of the extra touch area on the sides of the device, the front side touch screen will continue to work normally, e.g., touches on it will be interpreted in parallel with touches on the touch sensors on the sides. A temperature sensor embedded in the lower part of the device body and another sensor in the top of the body may be used, instead of a proximity sensor, to recognize whether the device is held in the hand. If the temperature difference sensed by the two sensors exceeds a set threshold, the device is determined to be held in hand and the touch sensors on the sides of the device may be activated. A capacitive sensor on the side of the device may be used to determine in which hand the device is held by sensing a multi-touch. If a side sensor recognizes two or more fingers, the signal from the other side sensor may be interpreted for moving content up and down, and vice versa. In this way, the system may automatically work both for left-handed and right-handed users without any additional setup. Instead of scrolling text, the movement of a thumb can be interpreted as moving focus among selectable items on the touch screen. If an item on the screen is in focus, the focus may be indicated by vibrating the item in focus on the screen. If an item on the screen is in focus, it may be selected by tapping the side touch-sensitive area twice with a thumb within a set time period. If the time period is exceeded, the gesture is not processed further. A two-finger tap may be recognized if the user taps the sides with both the index finger and the thumb. A two-finger swipe gesture may be recognized if the user simultaneously moves both the index finger and the thumb upwards or downwards along the side of the device. A rotation gesture may be recognized if a user moves a thumb and an index finger in opposite directions (e.g., thumb up and index finger down, or vice versa). The touch-sensitive panel on the back or the touch screen on the front may be used to recognize a long press, pan, and screen edge pan. If a user covers the back of the device with a hand, the back panel and the side sensors may be turned off. If more than 50% of the front screen is covered by a hand, the side sensors as well as the back panel sensors may be switched on.
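
By way of non-limiting illustration, the side-sensor gestures described above might be classified as in the following sketch. The displacement inputs, the stillness tolerance, and the 0.4-second double-tap window are assumptions made for the sketch, the last standing in for the disclosure's unspecified "set time period".

    # Hypothetical classification of side-sensor gestures. Inputs are
    # vertical displacements of the thumb (one side) and the index
    # finger (opposite side); positive values mean upward movement.
    def classify(thumb_dy: float, index_dy: float,
                 still: float = 1.0) -> str:
        if abs(thumb_dy) < still and index_dy > 0:
            return "zoom in"             # thumb in place, index moves up
        if abs(thumb_dy) < still and index_dy < 0:
            return "zoom out"            # thumb in place, index moves down
        if thumb_dy * index_dy > 0:
            return "two-finger swipe"    # both fingers move the same way
        if thumb_dy * index_dy < 0:
            return "rotate"              # fingers move in opposite directions
        return "none"

    # A double tap on the side selects the focused item if the two taps
    # fall within a set time period (assumed here to be 0.4 s).
    def is_select(tap_times: list, period: float = 0.4) -> bool:
        return (len(tap_times) >= 2
                and tap_times[-1] - tap_times[-2] <= period)

    print(classify(0.2, 8.0))          # -> "zoom in"
    print(is_select([10.00, 10.25]))   # -> True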

[00092] Gestures on the sides may not obstruct the user's view of the screen (e.g., while reading books or magazines). The front side touch screen may work in parallel with the extra touch-sensitive area on the sides of the device. The touch area on the sides may not be active if the device is not held in the palm of the hand. The gesture recognizer for the side touch sensor may be active when the screen is active (e.g., back lights are on) and/or when a temperature sensor recognizes that the device is held.

[00093] Because computing processes can be moved between computers, it is to be understood that the inventions described herein are not limited to a certain computer or processor, and the inventions can be carried out in one or more computers as described above. It is to be understood, however, that even though numerous characteristics and advantages of the present invention have been set forth in the foregoing description, together with details of the structure and function of the invention, the disclosure is illustrative only, and changes may be made in detail, especially in matters of arrangement of parts, details reflected in the icons, and other display characteristics, within the principles of the invention to the full extent indicated by the broad general meaning of the terms in which the appended claims are expressed.

[00094] Although features and elements are described above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable media include electronic signals (transmitted over wired or wireless connections) and computer-readable storage media. Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks, and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.