Title:
METHOD AND SYSTEM OF MANAGING A USER INTERFACE OF A COMMUNICATIONS DEVICE
Document Type and Number:
WIPO Patent Application WO/2003/088013
Kind Code:
A2
Abstract:
A method and a system of managing a user interface of a mobile wireless communications device, for use in data devices, comprise the step of determining the relative location of the user and the device. The determination is performed on the basis of the signals from an array (208) of sensors (210). A control unit (204) determines this relative location. Knowing the relative location of the user, the control unit (204) can automatically change the orientation of the user interface. If the user interface (202) is a display screen, its content can be rotated to a position that ensures the user the best possible view. If the user interface (202) is a keypad or a touch screen, the functions of each key can be redefined to ensure the best possible position for use. Knowing the distance between the user and the user interface, the control unit (204) can change the size of the most important information window on the display screen. The change is made automatically as the distance between the user and the user interface changes. The importance of information windows is ranked, and the ranking is stored in memory (206).

Inventors:
NAVARRO-PRIETO RAQUEL (GB)
BAKER PAUL DOMINIC (GB)
REX JAMES ALEXANDER (GB)
Application Number:
PCT/EP2003/001610
Publication Date:
October 23, 2003
Filing Date:
February 18, 2003
Assignee:
MOTOROLA INC (US)
NAVARRO-PRIETO RAQUEL (GB)
BAKER PAUL DOMINIC (GB)
REX JAMES ALEXANDER (GB)
International Classes:
G06F1/16; G09G5/00; H04M1/724; H04M1/72466; (IPC1-7): G06F1/00
Domestic Patent References:
WO2001027727A22001-04-19
WO2002093331A12002-11-21
Foreign References:
EP1316877A12003-06-04
Other References:
ANON.: "Personal Computer environmental control via a proximity sensor" IBM TECHNICAL DISCLOSURE BULLETIN, vol. 36, no. 8, August 1993 (1993-08), pages 343-345, XP000390248 Armonk, NY, US
Attorney, Agent or Firm:
Treleven, Colin (Property Operations Midpoint, Alencon Link, Basingstoke, Hampshire RG22 4PD, GB)
Claims:
1. A method of managing a content and/or an orientation of a user interface (202) of a mobile wireless communication device, the method comprising the steps of: the mobile wireless communication device sensing a relative location (102, 104) of a user and said user interface of said device; adapting said content and/or said orientation of said user interface.
2. A method according to claim 1, wherein said sensing of said relative location is performed on the basis of signals from an array of sensors.
3. A method according to claim 1, wherein said content comprises an image presented on said user interface, and the step of adapting the content comprises the steps of: zooming in to the information window, when the angle between the normal to the display screen and the line-of-sight to the user increases; zooming back from the information window, when the angle between the normal to the display screen and the line-of-sight to the user decreases.
4. A method according to claim 1, wherein said content comprises an image presented on said user interface, and the step of adapting the content is applied when said display screen contains more than one information window, and said adapting comprises the steps of: zooming in to the information window with the most important content, when said distance between said display screen and said user exceeds a predefined value; zooming back from the information window with the most important content, when said distance between said display screen and said user is shorter than said predefined value, allowing other information windows with content of a lower ranked importance to be displayed on said display screen.
5. A method according to claim 4, wherein the ranking of said importance of said content of said display screen is predefined by said user, and can be changed dynamically.
6. A method according to claim 4, wherein the ranking of said importance of said content of said display screen is predefined in dependence on the kind of content.
7. A method according to claim 4, wherein said kind of content can be a video picture, a text, or a graphic.
8. A method according to claim 5 or 6, wherein said predefined ranking of said content is stored in a memory of said device.
9. A method according to claim 1, wherein the user interface is a display screen, and said sensing of said relative position consists of: determining a distance between a user and said display screen; and/or determining said relative position of said display screen and said user; characterised wherein the adapting of said orientation of said display screen consists of a step of a rotation of said content of said display screen, when said relative position of said user and said display screen would not ensure a practical position for viewing of said display screen.
10. A method according to claim 9, wherein an angle of said rotation is approximately equal to 90 degrees, or a multiple thereof.
11. A method according to claim 1, wherein said user interface is a keypad and/or a touch screen.
12. A method according to claim 11, wherein said adapting of said orientation of said keypad and/or said touch screen consists of a step of redefining of functions of each key of said keypad and/or said touch screen, when said relative position of said user and said keypad and/or said touch screen does not ensure a practical position for using said keypad and/or said touch screen.
13. A method according to claim 12 wherein said redefining of functions of each key of said keypad and/or said touch screen ensures said user a practical position for using said keypad and/or said touch screen.
14. A method according to claim 1 wherein said adapting is performed automatically.
15. A method according to claim 1 wherein said user can manually influence said adapting.
16. A system for managing a content and/or an orientation of a user interface (202) of a mobile wireless communication device, said system comprising: a control unit (204), adapted to determine a relative location of a user and said user interface (202), and to control said content and/or said orientation; a user interface (202) for entering and/or providing information, connected to said control unit; an array (208) of sensors (210) for detecting said user, connected to said control unit.
17. A system according to claim 16, wherein said user interface is a display screen and/or a keypad and/or a touch screen.
18. A system according to claim 17, comprising a memory for storing a predefined ranking of an importance of said content of said display screen, that is connected to said control unit.
19. A system according to claim 17, wherein said system has said display screen on more than one face.
20. A system according to claim 16, wherein said array of sensors consists of one or more of: at least two microphones, infrared sensors, ultrasonic sensors, capacitive sensors, or radar, or any combination thereof.
21. A mobile telephone, a portable or mobile (PMR) radio, a personal digital assistant (PDA), or a laptop computer according to any of claims 16 to 20, or adapted to operate in accordance with the method of any of claims 1 to 15.
Description:
Method and System of Managing a User Interface of a Communications Device

Technical Field

The present invention relates to managing a content and/or an orientation of a user interface of a mobile wireless communications device. The invention may, for example, be useful when a keyboard or mouse is not convenient.

Background

With the growth of wireless telecommunications, there has been an increased focus on small mobile data devices. These devices provide a wide range of services, and offering such a wide range of services on small devices has dramatically increased the number of design issues related to the presentation of information. The mobile data devices may provide information comprising video, audio, data or a combination of these.

Estimating the direction of talkers and other sound sources using microphone arrays is known in the prior art, as are videoconferencing systems that use microphone arrays to locate talkers and direct cameras at them.

Diverse stimuli are used as inputs for interface navigation, e.g. gaze and voice.

Systems that use eye gaze interaction are attractive because they use people's spontaneous eye movements to control the interface. Although these systems could accurately track a user's attention on the display, they are not practical for many products. One of the reasons is the high cost of eye-tracking technology. Another disadvantage of these systems is the high level of intrusiveness of the eye-tracking devices, e.g. head-mounted systems.

In a method known in the prior art, the three-dimensional (tilt) movements of a mobile communication device are used to control the display orientation.

In other devices, the user can change the orientation of the display content by actuating special features. The user can also actively select the orientation, for example by pressing a button.

Statement of Invention

It is an object of the present invention to provide a novel method and system for managing a user interface of a mobile wireless communication device which overcomes the disadvantages of the prior art.

The invention comprises a method in accordance with appended independent claim 1, a system in accordance with appended independent claim 16, and devices in accordance with claim 21.

The prime benefit of a user interface in accordance with the present invention is that the user does not need to re-position either him/herself or the device in order to use it. Additionally, multiple users could more easily time-share a single display.

With a multi-orientation hand-held device, the user does not have to consider how to pick up the device. The device will automatically adapt itself to provide the best possible presentation of information. Also, as the display could rotate freely, left-handed people may not have some of the difficulties that the non-symmetric displays of PDAs (Personal Digital Assistant) present at the moment.

Another application of the present invention is for managing orientation of other kinds of user interfaces than just displays. It can be easily applied for managing functions of keys on keypads and touch screens. The keypad and touch screen of the device will be automatically redefined to provide orientation that is easiest to use.

This is a significant departure from the current art, as seen in WO 01/88679 (MathEngine). The cited prior art only concerns displays, and relies on knowledge of the actual position of the device.

Brief description of the drawings

The present invention will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings, in which:

Figure 1 is a flowchart illustrating a method of managing content and orientation of a user interface in accordance with an embodiment of the present invention; and

Figure 2 is a schematic illustration of a system for managing content and orientation of a user interface in accordance with an embodiment of the present invention.

Detailed description of the preferred embodiment

The term "content of a user interface" herein below refers, in the case of a display screen, to the picture presented on it. In the case of keypads and touch screens, it refers to the function that is assigned to each key.

The term"information window"herein below refers to an integral part of a picture on the display screen with consistent content. This may be, for example, one of several windows presented by a'Windows' operating system.

The term"redefining"herein below refers to a process of changing assignments of each of the keys of the keypad and/or touch screen.

The present invention allows a mobile wireless communications device or system to gather information about the orientation and relative position of the user.

Technologies used for tracking the relative position of the user may involve voice, ultrasonic, capacitive, radar and infrared tracking. This information is used to select how the user interface should be adapted and oriented to enhance user interaction.
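
As an illustration of one way such tracking could be realised (a minimal sketch only, not taken from the patent), the bearing of a talker can be estimated from two microphones by cross-correlating their signals and converting the time difference of arrival into an angle. The sample rate, microphone spacing and function name below are assumptions of the example.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, assumed value at room temperature
MIC_SPACING = 0.10      # m, assumed spacing between the two microphones
SAMPLE_RATE = 48_000    # Hz, assumed audio sample rate

def estimate_bearing(left: np.ndarray, right: np.ndarray) -> float:
    """Estimate the user's bearing in radians (0 = straight ahead) from two
    microphone channels using the time difference of arrival (sketch)."""
    # The lag of the cross-correlation peak gives the inter-microphone delay.
    corr = np.correlate(left, right, mode="full")
    lag = int(np.argmax(corr)) - (len(right) - 1)
    delay = lag / SAMPLE_RATE
    # Convert the delay into an angle, clipped to the physically possible range.
    sin_theta = np.clip(delay * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return float(np.arcsin(sin_theta))
```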

Hence, this invention presents a distinctive way of managing the information presented on the displays of small mobile devices. It is, however, within the contemplation of a person reasonably skilled in the art to extend this principle to the interfaces of fixed devices. One example of such a fixed device is a public information kiosk, which may for example be built into a table.

In accordance with the present invention, there is thus provided a method and system for managing diverse "levels" of information on the display. For instance, it can be applied when the device offers diverse windows, each using a different medium. In such a case the user can select which one is the most important. For example, a window showing a diagram may be more significant to the user than another window that is showing text.

The method allows optimisation of the organization (zooming in and out) of the windows presented to the user. In the case where the device has a single window, the distance between the user and the display can automatically cause the window to zoom in. The advantage of this method is that it helps automate both window management and the zoom ratio for a user. The method takes into account what the user can see at any given distance from the display. Automatic window management becomes very important in scenarios where the user is mobile and not able to use a mouse or keyboard for this task.
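
A minimal sketch of such an automatic zoom ratio follows; the base size and reference viewing distance are assumed values, chosen so that the angular size of the content stays roughly constant as the user moves away.

```python
def legible_size_px(distance_m: float, base_px: int = 12,
                    ref_distance_m: float = 0.3) -> int:
    """Scale content so that text legible at ref_distance_m (assumed)
    stays legible at the measured distance (small-angle approximation)."""
    return max(base_px, round(base_px * distance_m / ref_distance_m))
```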

The device ascertains information about the relative position of the user and the device. Therefore the device can zoom in to the image displayed as the angle between the normal to the display screen and the user's line of sight increases, i.e. the user moves away from the optimal viewing axis of the device. The device can zoom out from the image displayed as the angle between the normal to the display screen and the user's line of sight decreases, i.e. the user moves towards the optimal viewing axis of the device. Prior art zooming strategies based simply upon the distance of the user from the device may also be incorporated.
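
One possible mapping from viewing angle to zoom factor is sketched below; the 1/cos term, which compensates for the foreshortening of an obliquely viewed display, is an assumption of this example rather than a formula given in the description.

```python
import math

def angle_zoom_factor(view_angle_rad: float) -> float:
    """Zoom factor that grows as the angle between the display normal and
    the user's line of sight increases (sketch; capped near 90 degrees)."""
    return 1.0 / max(math.cos(view_angle_rad), 0.1)
```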

In accordance with another aspect of the present invention there is provided a method and a system for automatically orienting displays. It can be applied in small data devices that can be used irrespective of their orientation. The display screen of the device will show images in various orientations relative to the body of the device. This useful feature enables a device's user to interact with it from multiple viewing directions. Additionally if it had such screens on more than one face, it could be held in yet more different orientations. This concept can also be applied to mobile phones.

Referring to Figures 1 and 2, in step 100 the relative location of the user and the user interface is determined. For this purpose, signals from an array 208 of sensors 210 are used. The control unit 204 determines this relative location.

The determination performed in step 100 consists of two independent substeps, 102 and 104. In step 102, the distance between the user and the user interface 202 is determined. In step 104, the control unit 204 determines the relative position of the user and the user interface.
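
Step 102 could, for example, use an ultrasonic time-of-flight measurement as sketched below (the echo-delay input and the speed of sound are assumptions of the example); step 104 could reuse a bearing estimate such as the one sketched earlier.

```python
SPEED_OF_SOUND = 343.0  # m/s, assumed value at room temperature

def distance_from_echo(echo_delay_s: float) -> float:
    """Step 102 (sketch): an ultrasonic pulse travels to the user and back,
    hence the division by two."""
    return echo_delay_s * SPEED_OF_SOUND / 2.0
```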

One example of a user interface 202 that can be managed according to the present invention is a display screen.

The control unit 204, in step 106, compares the determined distance with predefined values. These predefined values are distances related to sizes of the information windows on the display screen. Each predefined distance value has a corresponding information window size that assures the necessary detail recognition. When the determined distance exceeds one of these predefined values, the control unit 204, in step 108, zooms in on the information window with the most important content. In step 110, the information window with the most important content is zoomed out. Zooming out is performed when the determined distance is shorter than the predefined value for the current size of the information window.

After zooming out, other windows may then also be displayed on the display screen. Steps 108 and 110 may only be performed when the display screen contains more than one information window.
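
The comparison of steps 106 to 110 could look roughly like the sketch below; the threshold value and the window objects with their importance attribute and zoom_in/zoom_out methods are assumptions of the example.

```python
def adapt_zoom(distance_m: float, windows: list, threshold_m: float = 0.6) -> None:
    """Steps 106-110 (sketch): zoom in on the most important window when the
    user is far away, and zoom back out when the user is close again."""
    if not windows:
        return
    most_important = max(windows, key=lambda w: w.importance)
    if distance_m > threshold_m:
        most_important.zoom_in()    # step 108: fill the screen with this window
    else:
        most_important.zoom_out()   # step 110: other windows become visible again
```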

The importance of an information window is ranked by the user. The ranking is stored in memory 206 and can be changed dynamically. The ranking can also be predefined by the kind of content of the information window. The information window may contain a video picture, a text, or a graphic. The user will select the one of these that is most useful when relatively far from the display.
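
The ranking could be held in memory 206 as a simple mapping from kind of content to importance, changeable at run time; the values and names below are purely illustrative.

```python
# Illustrative importance ranking (higher = more important), stored in
# memory 206; it can be predefined per kind of content or set by the user.
window_ranking = {
    "video": 3,
    "graphic": 2,
    "text": 1,
}

def set_importance(kind: str, importance: int) -> None:
    """Let the user re-rank a kind of content dynamically (sketch)."""
    window_ranking[kind] = importance
```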

The relative position of the user and the user interface determined in step 104 can be used for changing orientation of the display screen.

The control unit 204, in step 112, compares the determined relative position of the user and the display screen with a predefined set of relative positions. Each predefined position has a corresponding orientation of the display screen content that assures the best possible view direction. In step 114, the display screen content is rotated when the determined relative position of the user does not match the predefined position of current orientation of the display screen content.

After rotation, the relative position of the user matches the predefined position of current orientation of the display screen content.
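
Steps 112 and 114 could be sketched as snapping the measured bearing of the user to the nearest member of a predefined set of orientations and rotating the content only when the current orientation does not match; the 90-degree set and the display.rotate_to call are assumptions of the example.

```python
PREDEFINED_ORIENTATIONS = (0, 90, 180, 270)  # degrees, assumed set

def nearest_orientation(user_bearing_deg: float) -> int:
    """Step 112 (sketch): pick the predefined orientation closest to the
    user's measured bearing around the device."""
    return min(PREDEFINED_ORIENTATIONS,
               key=lambda o: abs((user_bearing_deg - o + 180) % 360 - 180))

def maybe_rotate(display, current_deg: int, user_bearing_deg: float) -> int:
    """Step 114 (sketch): rotate the content only when it does not already
    face the user; display.rotate_to is an assumed driver call."""
    target = nearest_orientation(user_bearing_deg)
    if target != current_deg:
        display.rotate_to(target)
    return target
```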

The angle of rotation of the display screen content can be freely chosen. However, for some kinds of display screens the angle of rotation should be approximately equal to 90 degrees or a multiple thereof.
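
For example, a framebuffer held as a pixel array can be rotated in 90-degree steps; the NumPy-based sketch below is purely illustrative.

```python
import numpy as np

def rotate_content(framebuffer: np.ndarray, angle_deg: int) -> np.ndarray:
    """Rotate display content by a multiple of 90 degrees (sketch).
    The framebuffer is assumed to be an H x W (or H x W x channels) array."""
    if angle_deg % 90 != 0:
        raise ValueError("angle must be a multiple of 90 degrees")
    return np.rot90(framebuffer, k=(angle_deg // 90) % 4)
```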

Other examples of the user interface 202 that can be managed according to the present invention are a keypad and a touch screen.

The control unit 204, in step 112, compares the determined relative position of the user and the keypad and/or the touch screen with a predefined set of positions.

Each predefined position has a corresponding orientation of the keys of the keypad and the touch screen that assures the easiest possible use. In step 116, the keys' definitions are redefined when the determined relative position of the user does not match the predefined position of the current keys' definitions. After redefining, the relative position of the user matches the predefined position of the current keys' definitions.
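
Redefining the keys can be sketched as re-indexing a key-to-function grid when the orientation changes; the 4 x 3 layout and the function names below are assumptions of the example.

```python
# Assumed 4 x 3 keypad described as rows of key functions; what is
# redefined in step 116 is which function sits at which physical key.
UPRIGHT_LAYOUT = [
    ["menu", "up",   "back"],
    ["left", "ok",   "right"],
    ["call", "down", "end"],
    ["1",    "2",    "3"],
]

def redefine_keys(layout, orientation_deg: int):
    """Return the key-to-function grid as seen by a user positioned at
    orientation_deg around the device (multiples of 90 degrees, sketch)."""
    result = [list(row) for row in layout]
    for _ in range((orientation_deg // 90) % 4):
        # Rotate the grid by 90 degrees clockwise: reverse rows, then transpose.
        result = [list(row) for row in zip(*result[::-1])]
    return result
```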

The relative position of the user and the screen is known. Therefore, the image or a window may be zoomed in as the angle between the user and the normal axis to the screen increases. Conversely, the image or window may be zoomed out as the angle between the user and the normal axis to the screen decreases.

All changes of the screen content and its orientation, as well as redefining the keys of the keypad and the touch screen, can be referred to as adapting. This adapting according to the present invention is performed automatically, but it is also possible that the user of the device can manually influence this adapting.

Reference is now made to Fig. 2, which depicts a system that can manage the content and/or orientation of the user interface in accordance with an embodiment of the present invention. In another embodiment of the present invention, a device incorporating system 200 has the display screen on more than one face.

A system 200 managing the content and/or orientation of the user interface 202, according to the present invention, comprises an array 208 of sensors 210. The array 208 of sensors 210 is connected to a control unit 204.

The control unit 204 is able to determine the relative location of the user and the user interface. The determination is made on the basis of signals from the array 208 of sensors 210. The control unit 204 is connected to the user interface 202 and controls its content and/or orientation.

The user interface 202 can be a display screen and/or a keypad and/or a touch screen. A memory 206, for storing a predefined ranking of an importance of the content of the display screen, is connected to the control unit 204.
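
The arrangement of Figure 2 could be wired together roughly as in the sketch below; the class and method names are assumptions of the example, not part of the patent.

```python
class ControlUnit:
    """Sketch of control unit 204: it reads the sensor array, consults the
    ranking stored in memory, and adapts the connected user interface."""

    def __init__(self, sensor_array, user_interface, memory):
        self.sensors = sensor_array   # array 208 of sensors 210
        self.ui = user_interface      # display / keypad / touch screen 202
        self.memory = memory          # ranking store 206

    def update(self) -> None:
        distance, bearing = self.sensors.relative_location()    # steps 102, 104
        self.ui.adapt_content(distance, self.memory.ranking())  # steps 106-110
        self.ui.adapt_orientation(bearing)                       # steps 112-116
```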

The array 208 of sensors 210 can consist of at least two microphones, infrared sensors, capacitive sensors, a radar transponder, or ultrasonic sensors, or any combination of these.

A system in accordance with the invention, or the method of the invention, may be used in various data devices. In particular, the invention is usable in portable or mobile radio communication devices. Therefore the system may be used in a mobile telephone or a portable or mobile PMR radio. The invention may also be used in a personal digital assistant (PDA) or laptop computer, linked for example by a radio or infra-red communication link to a network. Such a network may be in a building, or be a cellular telephone network, or a UMTS/3G network.