

Title:
DISPLAY VIEWING POSITION SETTINGS BASED ON USER RECOGNITIONS
Document Type and Number:
WIPO Patent Application WO/2017/160302
Kind Code:
A1
Abstract:
In one example, an electronic device is described, which includes a position activator, a database including display positions associated with a plurality of users, and a processor coupled to the position activator and the database. The processor may retrieve a display position corresponding to a user operating the electronic device from the database and trigger the position activator to set a viewing position of a display of the electronic device based on the retrieved display position.

Inventors:
LO ERIC (TW)
LEE AUSTIN (TW)
CHIANG JEFFREY (TW)
Application Number:
PCT/US2016/022996
Publication Date:
September 21, 2017
Filing Date:
March 18, 2016
Assignee:
HEWLETT PACKARD DEVELOPMENT CO LP (US)
International Classes:
G06F3/01; G06F1/16
Domestic Patent References:
WO2002071315A2 (2002-09-12)
Foreign References:
US20100295827A1 (2010-11-25)
US20090025022A1 (2009-01-22)
US20150070271A1 (2015-03-12)
US20110292009A1 (2011-12-01)
Attorney, Agent or Firm:
SU, Benjamin et al. (US)
Claims:
WHAT IS CLAIMED IS:

1. An electronic device comprising:

a position activator;

a database comprising display positions associated with a plurality of users; and

a processor coupled to the position activator and the database, wherein the processor is to:

retrieve a display position corresponding to a user operating the electronic device from the database; and

trigger the position activator to set a viewing position of a display of the electronic device based on the retrieved display position.

2. The electronic device of claim 1, wherein the position activator is to adjust a height of the display, a viewing angle of the display, or a combination thereof.

3. The electronic device of claim 2, wherein the position activator is to adjust a horizontal viewing angle of the display, adjust a vertical viewing angle of the display, rotate the display in clockwise or counter clockwise direction along an X-Y plane, or a combination thereof.

4. The electronic device of claim 1, further comprising a user recognition engine to recognize the user operating the electronic device using a facial recognition process, a gesture recognition process, a speech recognition process, or a voiceprint analysis process.

5. The electronic device of claim 1, further comprising:

a supporting platform connected to the position activator and the display of the electronic device, and wherein the position activator is to set the viewing position of the display based on the retrieved display position via the supporting platform.

6. The electronic device of claim 1, further comprising:

an image capturing device to capture video data of the user of the electronic device; and

a user recognition engine to:

extract face information from the video data received from the image capturing device; and

recognize a face of the user by comparing the extracted face information with face information stored in the database, wherein the processor is to retrieve the display position corresponding to the user upon recognizing the face of the user.

7. A method comprising:

providing a database comprising settings information related to display positions and user information related to users associated with the display positions;

visually recognizing a user of an electronic device by a user recognition engine using the user information;

retrieving a display position corresponding to the recognized user using the settings information; and

automatically adjusting a viewing position of a display of the electronic device based on the retrieved display position.

8. The method of claim 7, wherein automatically adjusting the viewing position of the display comprises adjusting a height of the display, a viewing angle of the display, or a combination thereof.

9. The method of claim 8, wherein adjusting the viewing angle of the display comprises:

adjusting a horizontal viewing angle of the display, adjusting a vertical viewing angle of the display, rotating the display in clockwise or counter clockwise direction along an X-Y plane, or a combination thereof.

10. The method of claim 7, wherein visually recognizing the user of the electronic device using the user information comprises:

extracting face information from input image data coming from an image capturing device; and

performing a facial recognition process in which the extracted face information is compared with face information stored in the database to visually recognize the user, wherein the user information comprises the face information of the user.

11. The method of claim 7, wherein the viewing position of the display is automatically adjusted based on the retrieved display position via a supporting platform connected to the electronic device.

12. A non-transitory machine-readable storage medium comprising instructions executable by a processor of an electronic device to:

receive input image data from an image capturing device;

extract face information from the input image data;

identify a user of the electronic device by comparing the extracted face information with face information stored in a database;

retrieve a display position corresponding to the identified user; and

instruct to adjust a viewing position of a display of the electronic device based on the retrieved display position.

13. The non-transitory machine-readable storage medium of claim 12, wherein the instructions are executable by the processor to adjust a height of the display, a viewing angle of the display, or a combination thereof.

14. The non-transitory machine-readable storage medium of claim 13, wherein the instructions are executable by the processor to:

adjust a horizontal viewing angle of the display, adjust a vertical viewing angle of the display, rotate the display in clockwise or counter clockwise direction along an X-Y plane, or a combination thereof.

15. The non-transitory machine-readable storage medium of claim 12, further comprising instructions that are executable by the processor to:

provide the database, wherein the display position and the face information related to the user associated with the display position are stored in the database.

AMENDED CLAIMS

received by the International Bureau on 01 August 2017 (01.08.17)

1. An electronic device comprising:

a position activator;

a database comprising display positions associated with a plurality of users; and

a processor coupled to the position activator and the database, wherein the processor is to:

retrieve a display position corresponding to a user operating the electronic device from the database, wherein the display position indicates a viewing position preference of the user; and

trigger the position activator to set a viewing position of a display of the electronic device based on the retrieved display position.

2. The electronic device of claim 1, wherein the position activator is to adjust a height of the display, a viewing angle of the display, or a combination thereof.

3. The electronic device of claim 2, wherein the position activator is to adjust a horizontal viewing angle of the display, adjust a vertical viewing angle of the display, rotate the display in clockwise or counter clockwise direction along an X-Y plane, or a combination thereof.

4. The electronic device of claim 1, further comprising a user recognition engine to recognize the user operating the electronic device using a facial recognition process, a gesture recognition process, a speech recognition process, or a voiceprint analysis process.

5. The electronic device of claim 1, further comprising:

a supporting platform connected to the position activator and the display of the electronic device, and wherein the position activator is to set the viewing position of the display based on the retrieved display position via the supporting platform.

6. The electronic device of claim 1, further comprising:

an image capturing device to capture video data of the user of the electronic device; and

a user recognition engine to:

extract face information from the video data received from the image capturing device; and

recognize a face of the user by comparing the extracted face information with face information stored in the database, wherein the processor is to retrieve the display position corresponding to the user upon recognizing the face of the user.

7. A method comprising:

providing a database comprising settings information related to display positions and user information related to users associated with the display positions;

visually recognizing a user of an electronic device by a user recognition engine using the user information;

retrieving a display position corresponding to the recognized user using the settings information, wherein the display position indicates a viewing position preference of the user; and

automatically adjusting a viewing position of a display of the electronic device based on the retrieved display position.

8. The method of claim 7, wherein automatically adjusting the viewing position of the display comprises adjusting a height of the display, a viewing angle of the display, or a combination thereof.

9. The method of claim 8, wherein adjusting the viewing angle of the display comprises:

adjusting a horizontal viewing angle of the display, adjusting a vertical viewing angle of the display, rotating the display in clockwise or counter clockwise direction along an X-Y plane, or a combination thereof.

10. The method of claim 7, wherein visually recognizing the user of the electronic device using the user information comprises:

extracting face information from input image data coming from an image capturing device; and

performing a facial recognition process in which the extracted face information is compared with face information stored in the database to visually recognize the user, wherein the user information comprises the face information of the user.

11. The method of claim 7, wherein the viewing position of the display is automatically adjusted based on the retrieved display position via a supporting platform connected to the electronic device.

12. A non-transitory machine-readable storage medium comprising instructions executable by a processor of an electronic device to:

receive input image data from an image capturing device;

extract face information from the input image data;

identify a user of the electronic device by comparing the extracted face information with face information stored in a database;

retrieve a display position corresponding to the identified user, wherein the display position indicates a viewing position preference of the user; and

instruct to adjust a viewing position of a display of the electronic device based on the retrieved display position.

13. The non-transitory machine-readable storage medium of claim 12, wherein the instructions are executable by the processor to adjust a height of the display, a viewing angle of the display, or a combination thereof.

14. The non-transitory machine-readable storage medium of claim 13, wherein the instructions are executable by the processor to:

adjust a horizontal viewing angle of the display, adjust a vertical viewing angle of the display, rotate the display in clockwise or counter clockwise direction along an X-Y plane, or a combination thereof.

15. The non-transitory machine-readable storage medium of claim 12, further comprising instructions that are executable by the processor to:

provide the database, wherein the display position and the face information related to the user associated with the display position are stored in the database.

Description:
DISPLAY VIEWING POSITION SETTINGS BASED ON USER RECOGNITIONS

BACKGROUND

[0001] Users are increasingly utilizing electronic devices such as desktop computers, personal computers, all-in-one personal computers, tablet computers, notebook computers, game players, and televisions, for performing various tasks. For example, users may view electronic device displays from different locations or positions.

BRIEF DESCRIPTION OF THE DRAWINGS

[0002] Examples are described in the following detailed description and in reference to the drawings, in which:

[0003] Fig. 1 is a block diagram of an example electronic device including components to set a display viewing position corresponding to a user;

[0004] Fig. 2 is a block diagram of the example electronic device illustrating additional components;

[0005] Fig. 3 is an example display illustrating a supporting platform to adjust the display viewing position;

[0006] Fig. 4 is an example flow chart of a method for setting display viewing position based on user recognitions; and

[0007] Fig. 5 illustrates a block diagram of an example computing device for setting display viewing positions based on user recognitions.

DETAILED DESCRIPTION

[0008] Users may have differing preferences for viewing position of electronic devices such as personal computers. As users take turns using such a device, each user may have to adjust the viewing position of the electronic device in accordance with the user's preference. The users may have to manually adjust the viewing position of the electronic device, and therefore time and labor may be involved.

[0009] Examples described herein may provide a database including settings information related to display positions and user information related to the users associated with the display positions (i.e., individual settings associated with each user). For example, the user information stored in the database may be a face image of the user instructing the settings change to the electronic device. The user information and the settings information related to the individual settings for each user can be registered in advance in the database. During operation, a user of an electronic device may be recognized by a user recognition engine using the user information. For example, user information may be extracted from input data and the extracted user information may be compared with the user information stored in the database to recognize the user. Further, a preferred display position of the recognized user may be retrieved using the settings information. Furthermore, a viewing position of a display of the electronic device may be automatically set (e.g., adjusted) based on the retrieved display position.

[0010] For example, automatically adjusting the viewing position of the display may include adjusting a height of the display, a viewing angle (i.e., angular position) of the display, or a combination thereof. In this case, adjusting the viewing angle of the display may include adjusting a horizontal viewing angle of the display, adjusting a vertical viewing angle of the display, rotating the display in a clockwise or counter clockwise direction along an X-Y plane, or a combination thereof. Similarly, other settings associated with the electronic device, such as volume, equalization, resolution, contrast, and/or brightness, can also be automatically set or adjusted based on user recognition.

[0011] As described below, examples described herein may automatically set the viewing position of the display, based on a facial recognition technique, when a user logs in to an electronic device. This enables each member of a group, such as a family or a small company, to share the same electronic device without wasting time re-configuring the settings (i.e., adjusting the viewing position of the display). Also, users may experience the default settings of the electronic device as being like their own personal settings.

[0012] Turning now to the figures, Fig. 1 is a block diagram of example electronic device 102 including components to set a display viewing position corresponding to a user. Example electronic device 102 may include a desktop computer, a portable personal computer, an all-in-one personal computer, a tablet computer, a notebook computer, a game player, or a television. Further, electronic device 102 may include a display for presenting visual information, such as text, graphics, and the like. Electronic device 102 may include position activator 108, database 104, and processor 106 coupled to position activator 108 and database 104.

[0013] Database 104 may be a storage unit to store display positions associated with a plurality of users and user information related to the users associated with the display positions. Example user information stored in database 104 may include a face image of the user instructing the settings change to electronic device 102. The user information and settings information (e.g., the display positions associated with the users) related to individual settings for each user can be registered/stored in advance in database 104. In the example shown in Fig. 1, database 104 is shown as a part of electronic device 102; however, database 104 can also reside in an external storage device, such as a hard disk, a storage card, or a data storage medium, and can be accessible by electronic device 102.
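The per-user settings database described above can be sketched as a simple mapping from user identifiers to registered face data and a preferred display position. This is an illustrative sketch only; the field names (height_mm, pan_deg, tilt_deg, rotation_deg) and the record layout are assumptions, not part of the described design:

```python
# Illustrative sketch of database 104: user information plus settings
# information registered in advance, one record per user.
from dataclasses import dataclass

@dataclass
class DisplayPosition:
    height_mm: int       # height of the display (assumed unit)
    pan_deg: float       # horizontal viewing angle
    tilt_deg: float      # vertical viewing angle
    rotation_deg: float  # rotation in the X-Y plane

@dataclass
class UserRecord:
    face_template: list[float]  # face information registered in advance
    position: DisplayPosition   # this user's preferred display position

database: dict[str, UserRecord] = {
    "user_a": UserRecord(face_template=[0.12, 0.88, 0.45],
                         position=DisplayPosition(320, -5.0, 10.0, 0.0)),
}
```

A lookup such as `database["user_a"].position` then corresponds to retrieving the display position for a recognized user.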

[0014] During operation, processor 106 may retrieve a display position corresponding to a user operating electronic device 102 from database 104. In one example, the user operating electronic device 102 may be recognized using a facial recognition process, a gesture recognition process, a speech recognition process, a voiceprint analysis process, or the like. Further, processor 106 may trigger position activator 108 to set a viewing position of a display of electronic device 102 based on the retrieved display position. Position activator 108 may adjust a height of the display, a viewing angle of the display, or a combination thereof. For example, the position activator may adjust a horizontal viewing angle of the display, adjust a vertical viewing angle of the display, rotate the display in a clockwise or counter clockwise direction along the X-Y plane, or a combination thereof. The vertical viewing angle may refer to a degree above or below an imaginary horizontal line at the level of the viewer's eyes and the center of the display. The horizontal viewing angle may refer to a degree left or right of an imaginary horizontal line at the level of the viewer's eyes and the center of the display.
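The retrieve-then-trigger flow of paragraph [0014] can be sketched as follows. The `PositionActivator` class and the `set_position` interface are hypothetical stand-ins for the motorized stand hardware, and the tuple layout of a stored position is an assumption for illustration:

```python
class PositionActivator:
    """Hypothetical stand-in for the position activator hardware."""
    def set_position(self, height_mm, pan_deg, tilt_deg, rotation_deg):
        # A real implementation would drive the stand's motors here;
        # this sketch just records the commanded viewing position.
        self.current = (height_mm, pan_deg, tilt_deg, rotation_deg)

def apply_user_position(user_id, database, activator):
    """Retrieve the stored display position for the recognized user
    and trigger the activator to set the viewing position."""
    position = database[user_id]  # settings registered in advance
    activator.set_position(*position)

activator = PositionActivator()
apply_user_position("user_a", {"user_a": (320, -5.0, 10.0, 0.0)}, activator)
```

In the described device this logic would sit in the processor, with the activator reached over the physical communication interface of Fig. 2.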

[0015] Referring now to Fig. 2, which is a block diagram of the example electronic device shown in Fig. 1, illustrating additional components. Electronic device 102 may include user recognition engine 202. Processor 106 may include physical communication interface 204 and configuration manager 206. User recognition engine 202 and position activator 108 may be connected to processor 106 via physical communication interface 204. In the example of Fig. 2, database 104 is shown as being connected to processor 106 and user recognition engine 202; however, database 104 can also be implemented as a part of user recognition engine 202 and/or processor 106.

[0016] During operation, an image capturing device may capture video data of the user viewing the display of the electronic device. The image capturing device may include an inbuilt camera in electronic device 102 or an external camera communicatively connected to electronic device 102. The image capturing device may provide video data (e.g., image data) to user recognition engine 202. In one example, the image capturing device may capture video data of the user of electronic device 102 when the user starts/logs into electronic device 102. Further, user recognition engine 202 may extract face information from the video data received from the image capturing device. Furthermore, user recognition engine 202 may recognize a face of the user (i.e., identify the user of electronic device 102) by comparing the extracted face information with face information (i.e., a face image) stored in advance in database 104. Even though the examples herein describe recognizing the user using a facial recognition process, other techniques such as a gesture recognition process, a speech recognition process, a voiceprint analysis process, and the like can also be used to recognize the user.

[0017] For example, when face information of user A is registered in database 104, then user recognition engine 202 (e.g., a facial recognition engine) may identify the user of electronic device 102 as user A by the aforementioned facial recognition process. In another example, when user A has registered keyword information in database 104, user recognition engine 202 (e.g., a speech recognition engine) can identify the user of electronic device 102 as user A by a speech recognition process in which keyword information extracted from input voice may be compared with keyword information that user A has stored in advance in database 104.

[0018] In yet another example, when user A has registered voiceprint information in database 104, user recognition engine 202 (e.g., a voiceprint analysis engine) can identify the user of electronic device 102 as user A by a voiceprint analysis process in which voiceprint information extracted from voice uttered by user A may be compared with voiceprint information that user A has stored in advance in database 104. In yet another example, when user A has registered gesture information in database 104, user recognition engine 202 (e.g., a gesture recognition engine) can identify the user of electronic device 102 as user A by a gesture recognition process in which gesture information extracted from the input image may be compared with gesture information stored by a user of electronic device 102 in advance in database 104.
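Common to the facial, speech, voiceprint, and gesture examples above is a compare-against-registered-templates step. One way to sketch it is a nearest-neighbor match under a distance threshold; the feature vectors, the Euclidean metric, and the threshold value are all illustrative assumptions, not details from the source:

```python
import math

def recognize(extracted, registered, threshold=0.5):
    """Return the user whose registered template is closest to the
    extracted feature vector, or None if nothing is close enough."""
    best_user, best_dist = None, threshold
    for user_id, template in registered.items():
        dist = math.dist(extracted, template)  # Euclidean distance
        if dist < best_dist:
            best_user, best_dist = user_id, dist
    return best_user

registered = {"user_a": [0.12, 0.88, 0.45], "user_b": [0.90, 0.10, 0.30]}
print(recognize([0.11, 0.87, 0.46], registered))  # prints "user_a"
```

Returning `None` for an unrecognized user lets the device fall back to default settings rather than moving the display.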

[0019] Upon recognizing the user, configuration manager 206 residing in processor 106 may retrieve the display position corresponding to the user from database 104. For example, configuration manager 206 may retrieve a display position associated with a user using the settings information stored in database 104. Then, position activator 108 may adjust a height of the display, a viewing angle of the display, or a combination thereof based on the retrieved display position. In one example, user recognition engine 202, configuration manager 206, and position activator 108 may be implemented as engines or modules comprising any combination of hardware and programming to implement the functionalities described herein.

[0020] Referring now to Fig. 3, which is an example display 302 illustrating supporting platform 304 to adjust the display viewing position. Example electronic device 300 may include supporting platform 304 connected to display 302 of electronic device 300 and image capturing device 306 (e.g., a camera). Also, supporting platform 304 may be communicatively coupled to a processor and/or a position activator of electronic device 300. In one example, electronic device 300 may be a device (e.g., an all-in-one computer) that houses each component except the keyboard and mouse inside the same case as display 302, or a device (e.g., a desktop computer) that includes display 302 and a central processing unit externally connected to display 302. During operation, the position activator may set the viewing position of display 302 based on the retrieved display position via supporting platform 304.

[0021] Fig. 4 is an example flow chart 400 of a method for setting a display viewing position based on user recognitions. At 402, a database including settings information related to display positions and user information related to users associated with the display positions may be provided. For example, the settings information and the user information may be registered or stored in the database by users of the electronic device. The display positions may include position changes to be applied to the display, and the user information may include face information of the users associated with the display positions.

[0022] At 404, a user of an electronic device may be visually recognized by a user recognition engine using the user information. For example, face information may be extracted from input image data coming from an image capturing device (e.g., a camera) and a facial recognition process may be performed in which the extracted face information may be compared with face information stored in the database to visually recognize the user.

[0023] At 406, a display position corresponding to the recognized user may be retrieved using the settings information in the database. At 408, a viewing position of a display of the electronic device may be automatically adjusted based on the retrieved display position. In one example, automatically adjusting the viewing position of the display may include adjusting a height of the display, a viewing angle of the display, or a combination thereof. Further, adjusting the viewing angle of the display may include adjusting a horizontal viewing angle of the display, adjusting a vertical viewing angle of the display, rotating the display in a clockwise or counter clockwise direction along an X-Y plane, or a combination thereof. For example, the viewing position of the display may be automatically adjusted based on the retrieved display position via a supporting platform connected to the electronic device.
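The four blocks of flow chart 400 (402 through 408) can be strung together in one sketch. Every name here (`recognize_user`, `Stand`, the exact-match recognizer, the tuple position) is a hypothetical helper invented for illustration; in particular, the exact-match lookup merely stands in for a real facial recognition process:

```python
def recognize_user(image, faces):
    """Hypothetical recognizer for block 404: exact-match lookup
    stands in for an actual facial recognition process."""
    for user_id, template in faces.items():
        if template == image:
            return user_id
    return None

class Stand:
    """Hypothetical supporting platform / position activator."""
    def set_position(self, position):
        self.position = position  # record the commanded viewing position

def set_viewing_position(image, database, stand):
    # 402: database with settings and user information is provided (passed in)
    # 404: visually recognize the user of the electronic device
    user_id = recognize_user(image, database["faces"])
    if user_id is None:
        return False  # unrecognized user: leave the display as-is
    # 406: retrieve the display position for the recognized user
    # 408: automatically adjust the viewing position via the stand
    stand.set_position(database["positions"][user_id])
    return True

db = {"faces": {"user_a": "face_a"}, "positions": {"user_a": (320, -5.0)}}
stand = Stand()
set_viewing_position("face_a", db, stand)
```

Returning `False` for an unknown user is one plausible policy; the source does not specify behavior when recognition fails.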

[0024] Fig. 5 illustrates a block diagram 500 of an example electronic device 502 for setting display viewing positions based on user recognitions. The electronic device 502 may include processor 504 and a machine-readable storage medium 506 communicatively coupled through a system bus. Processor 504 may be any type of central processing unit (CPU), microprocessor, or processing logic that interprets and executes machine-readable instructions stored in the machine-readable storage medium 506. Machine-readable storage medium 506 may be a random access memory (RAM) or another type of dynamic storage device that may store information and machine-readable instructions that may be executed by processor 504. For example, the machine-readable storage medium 506 may be synchronous DRAM (SDRAM), double data rate (DDR), Rambus DRAM (RDRAM), Rambus RAM, etc., or storage memory media such as a floppy disk, a hard disk, a CD-ROM, a DVD, a pen drive, and the like. In an example, the machine-readable storage medium 506 may be a non-transitory machine-readable medium. In an example, the machine-readable storage medium 506 may be remote but accessible to the electronic device 502.

[0025] The machine-readable storage medium 506 may store instructions 508-516. In an example, instructions 508-516 may be executed by the processor 504 to provide a mechanism for setting display viewing position based on user recognitions. Instructions 508 may be executed by the processor 504 to receive input image data coming from an image capturing device (e.g., a camera). Instructions 510 may be executed by the processor 504 to extract face information from the input image data. Instructions 512 may be executed by the processor 504 to identify a user of the electronic device by comparing the extracted face information with face information stored in a database. Instructions 514 may be executed by the processor 504 to retrieve a display position corresponding to the identified user. Instructions 516 may be executed by the processor 504 to provide instructions to adjust the viewing position of the display of the electronic device based on the retrieved display position. Upon receiving the instructions, the position activator may adjust the viewing position of the display based on the retrieved display position.

[0026] It may be noted that the above-described examples of the present solution are for the purpose of illustration only. Although the solution has been described in conjunction with a specific example thereof, numerous modifications may be possible without materially departing from the teachings and advantages of the subject matter described herein. Other substitutions, modifications, and changes may be made without departing from the spirit of the present solution. All of the features disclosed in this specification (including any accompanying claims, abstract, and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.

[0027] The terms "include," "have," and variations thereof, as used herein, have the same meaning as the term "comprise" or an appropriate variation thereof. Furthermore, the term "based on," as used herein, means "based at least in part on." Thus, a feature that is described as based on some stimulus can be based on the stimulus or a combination of stimuli including the stimulus.

[0028] The present description has been shown and described with reference to the foregoing examples. It is understood, however, that other forms, details, and examples can be made without departing from the spirit and scope of the present subject matter that is defined in the following claims.