


Title:
METHOD AND USER APPARATUS FOR WINDOW COLORING
Document Type and Number:
WIPO Patent Application WO/2015/039567
Kind Code:
A1
Abstract:
Window coloring methods and user apparatus are provided. A background image under a coverage area of a window is obtained. The window includes component parts, and the background image includes pixel points. A color-value of each pixel point of the background image is counted. According to the color-value of each pixel point, a color-value of each component part in the window is determined. Each component part is rendered (or over-painted) according to the color-value of each component part.

Inventors:
PANG MENGYU (CN)
CHEN TIAN (CN)
YAO XIAOWEN (CN)
Application Number:
PCT/CN2014/085975
Publication Date:
March 26, 2015
Filing Date:
September 05, 2014
Assignee:
TENCENT TECH SHENZHEN CO LTD (CN)
International Classes:
G06F9/44
Foreign References:
CN101299804A2008-11-05
CN102768625A2012-11-07
CN102568010A2012-07-11
US20070139430A12007-06-21
US20090132938A12009-05-21
Attorney, Agent or Firm:
ADVANCE CHINA IP LAW OFFICE (No. 85 Huacheng Avenue, Tianhe District, Guangzhou, Guangdong 3, CN)
Claims:
Claims

1. A window coloring method comprising:

obtaining a background image under a coverage area of a window, wherein the window comprises a plurality of component parts, and the background image comprises a plurality of pixel points;

counting a color-value of each pixel point of the background image; and

determining a color-value of each component part in the window according to the color-value of each pixel point, and rendering each component part according to the color-value of each component part.

2. The method according to claim 1, wherein the step of counting the color-value of each pixel point in the background image comprises:

traversing the color-value of each pixel point in the background image, and

grouping each pixel point according to the color-value of each pixel point into a plurality of groups such that a color-value distance between a minimum color-value and a maximum color-value of the pixel points in a same group is less than a first pre-set threshold value.

3. The method according to claim 2, wherein the step of determining the color-value of each component part in the window according to the color-value of each pixel point comprises:

(a) selecting a first-part group from the plurality of groups such that the first-part group contains a maximum number of pixel points with an average pixel-point color-value, averaged from the maximum number of pixel points in the first-part group and determined as a first part color-value of the window;

(b) deleting one or more groups each having an average pixel-point color-value that provides a color difference from the first part color-value by less than a second pre-set threshold value, and providing a plurality of remaining groups after deleting the one or more groups from the plurality of groups;

(c) selecting a second-part group from the plurality of remaining groups such that the second-part group contains a maximum number of pixel points having an average pixel-point color-value determined as a second part color-value of the window; and

(d) repeating the steps (b) and (c) until a color-value of each component part of the window is determined, wherein the color-value of each component part of the window comprises at least the first part color-value and the second part color-value.

4. The method according to claim 2, wherein the step of determining the color-value of each component part in the window according to the color-value of each pixel point comprises:

selecting a first-part group from the plurality of groups such that the first-part group contains a maximum number of pixel points having an average pixel-point color-value, averaged from pixel points in the first-part group and determined as a first part color-value of the window; and

calculating color-values of component parts other than the first part color-value of the window, according to the first part color-value and a first distribution scheme pre-set for color-values of component parts in the window.

5. The method according to claim 2, wherein the step of determining the color-value of each component part in the window according to the color-value of each pixel point comprises:

calculating color-values of component parts of the window, according to an average pixel-point color-value of a group containing a maximum number of pixel points and a second distribution scheme pre-set for color-values of component parts in the window.

6. The method according to any one of claims 1-5, wherein after obtaining the background image under the coverage area of the window, the method further comprises:

compressing the background image.

7. The method according to claim 1, wherein the window comprises a focus box, a text input box, or a combination thereof.

8. The method according to claim 1, wherein the background image comprises a screen capture of the window.

9. The method according to claim 1, wherein each component part of the window comprises background, main title, subtitle, and body text.

10. A user apparatus comprising:

an obtaining unit, configured to obtain a background image under a coverage area of a window, wherein the window comprises a plurality of component parts, and the background image comprises a plurality of pixel points;

a counting unit, configured to count a color-value of each pixel point in the background image obtained by the obtaining unit;

a determining unit, configured to determine a color-value of each component part of the window according to the color-value of each pixel point counted by the counting unit; and

a rendering unit, configured to render each component part based on the color-value of each component part determined by the determining unit.

11. The apparatus according to claim 10, wherein the counting unit comprises:

a traversing sub-unit, configured to traverse the color-value of each pixel point in the background image; and

a grouping sub-unit, configured to group each pixel point according to the color-value of each pixel point, traversed by the traversing sub-unit, into a plurality of groups such that a color-value distance between a minimum color-value and a maximum color-value of the pixel points in a same group is less than a first pre-set threshold value.

12. The apparatus according to claim 11, wherein the determining unit comprises:

a first determining sub-unit, configured to select a first-part group from the plurality of groups such that the first-part group contains a maximum number of pixel points with an average pixel-point color-value, determined as a first part color-value of the window;

a deleting sub-unit, configured to delete one or more groups each having an average pixel-point color-value that provides a color difference from the first part color-value by less than a second pre-set threshold value, and to provide a plurality of remaining groups after deleting the one or more groups from the plurality of groups; and

a second determining sub-unit, configured to select a second-part group from the plurality of remaining groups such that the second-part group contains a maximum number of pixel points having an average pixel-point color-value determined as a second part color-value of the window, wherein the deleting sub-unit and the second determining sub-unit are repeatedly used until a color-value of each component part of the window is determined, and wherein the color-value of each component part of the window comprises at least the first part color-value and the second part color-value.

13. The apparatus according to claim 11, wherein the determining unit comprises:

a third determining sub-unit, configured to select a first-part group from the plurality of groups such that the first-part group contains a maximum number of pixel points having an average pixel-point color-value, averaged from the maximum number of the pixel points in the first-part group and determined as a first part color-value of the window; and

a calculating sub-unit, configured to calculate color-values of component parts other than the first part color-value of the window, according to the first part color-value determined by the third determining sub-unit and a first distribution scheme pre-set for color-values of component parts in the window.

14. The apparatus according to claim 11, wherein:

the determining unit is configured to calculate color-values of component parts of the window, according to an average pixel-point color-value of a group containing a maximum number of pixel points and a second distribution scheme pre-set for color-values of component parts in the window.

15. The apparatus according to any one of claims 10-14, further comprising:

a compressing unit, configured to compress the background image.

16. The apparatus according to claim 10, wherein the window comprises a focus box, a text input box, or a combination thereof.

17. The apparatus according to claim 10, wherein the background image comprises a screen capture of the window.

18. The apparatus according to claim 10, wherein each component part of the window comprises background, main title, subtitle, and body text.

Description:
METHOD AND USER APPARATUS FOR WINDOW COLORING

CROSS-REFERENCES TO RELATED APPLICATIONS

[0001] This application claims priority to Chinese Patent Application No. 2013104253368, filed on September 17, 2013, the entire content of which is incorporated herein by reference.

FIELD OF THE DISCLOSURE

[0002] The present disclosure generally relates to the field of computer technology and, more particularly, relates to methods and user apparatus for window coloring.

BACKGROUND

[0003] Generally, a window displayed on a computer screen is suspended over a background image. The color of the background image may influence how titles or contents in the window appear. To make the window and the background image look more harmonious and more comfortable to human eyes, a frosted-glass appearance may be used to provide a fuzzy effect and to make color-value changes between adjacent pixels gentle and smooth. Sharp details in the background image can thus be removed and a smooth, unified hue obtained. However, a frosted-glass effect may require maintaining a large amount of background image information, which is sometimes disordered.

[0004] Therefore, the background image greatly influences the color of the window. When the text in the window or the block colors of icons have low contrast with the background image, it can be hard to accurately identify the text and block colors displayed in the window.

BRIEF SUMMARY OF THE DISCLOSURE

[0005] One aspect or embodiment of the present disclosure includes a method for window coloring. A background image under a coverage area of a window is obtained. The window includes component parts, and the background image includes pixel points. A color-value of each pixel point of the background image is counted. According to the color-value of each pixel point, a color-value of each component part in the window is determined. Each component part is rendered according to the color-value of each component part.

[0006] Another aspect or embodiment of the present disclosure includes a user apparatus.

The user apparatus includes an obtaining unit, a counting unit, a determining unit, and a rendering unit. The obtaining unit is configured to obtain a background image under a coverage area of a window. The window includes component parts, and the background image includes pixel points. The counting unit is configured to count a color-value of each pixel point in the background image obtained by the obtaining unit. The determining unit is configured to determine a color-value of each component part of the window according to the color-value of each pixel point counted by the counting unit. The rendering unit is configured to render each component part based on the color-value of each component part determined by the determining unit.

[0007] Other aspects or embodiments of the present disclosure can be understood by those skilled in the art in light of the description, the claims, and the drawings of the present disclosure.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] The following drawings are merely examples for illustrative purposes according to various disclosed embodiments and are not intended to limit the scope of the present disclosure.

[0009] FIG. 1 depicts an exemplary window coloring method consistent with various disclosed embodiments;

[0010] FIG. 2 depicts an exemplary user apparatus consistent with various disclosed embodiments;

[0011] FIG. 3 depicts another exemplary user apparatus consistent with various disclosed embodiments;

[0012] FIG. 4 depicts another exemplary user apparatus consistent with various disclosed embodiments;

[0013] FIG. 5 depicts another exemplary user apparatus consistent with various disclosed embodiments;

[0014] FIG. 6 depicts another exemplary user apparatus consistent with various disclosed embodiments; and

[0015] FIG. 7 depicts another exemplary user apparatus consistent with various disclosed embodiments.

DETAILED DESCRIPTION

[0016] Reference will now be made in detail to exemplary embodiments of the disclosure, which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.

[0017] FIGS. 1-7 depict exemplary window coloring methods and user apparatus consistent with various disclosed embodiments. As disclosed, the color of each component part of a window can be determined according to the color of the background image, such that each component part of the window, together with the background image, appears more integrated and harmonious as a whole.

[0018] FIG. 1 depicts an exemplary window coloring method consistent with various disclosed embodiments.

[0019] In Step 101, a background image under a coverage area of a window is obtained. The window includes component parts, and the background image includes pixel points.

[0020] In various embodiments, a window can be a focus box, a text input box, or any suitable display area on a screen. A background image can be a desktop image or any suitable image(s) that are opened, e.g., as shown on a screen. In various embodiments, the background image can be an obtained image having a size corresponding to the size of the coverage area of a window on the background image. In some cases, a background image can be considered a screen capture of the corresponding window covering the background image.

[0021] In Step 102, a color-value of each pixel point of the background image is counted.

The color-values of the pixel points in the background image may or may not be the same. In one embodiment, the color-value of each pixel point in the background image is different.
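As a minimal illustration of this counting step, a tally over the pixel points is sufficient. The sketch below assumes the pixel points are given as a flat list of scalar color-values; the function name is invented for illustration and does not come from the source.

```python
from collections import Counter

def count_color_values(pixel_points):
    """Tally how many pixel points of the background image carry
    each color-value."""
    return Counter(pixel_points)
```

A `Counter` keyed by color-value is enough for the later grouping step, which only needs to know which color-values occur and how often.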

[0022] In Step 103, a color-value of each component part in the window is determined according to the color-value of each pixel point. Each component part is rendered (e.g., over-painted) according to the color-value of each component part.

[0023] As disclosed, a background image under a coverage area of a window is obtained.

The window includes component parts, and the background image includes pixel points. A color-value of each pixel point of the background image is counted. According to the color-value of each pixel point, a color-value of each component part in the window is determined. According to the color-value of each component part, each component part is rendered.

[0024] Currently, the color of component parts in a window may be in low contrast with the color of the corresponding background image, making them hard to distinguish from one another. The disclosed window coloring method determines the color of each component part in the window according to the color of the background image, which allows the window and the background image to look more integrated and harmonious, e.g., on a screen of a terminal device.

[0025] Optionally, based on the method disclosed in FIG. 1, the step of counting a color-value of each pixel point of the background image includes the following exemplary process.

[0026] The color-value of each pixel point in the background image can be traversed. Each pixel point can be grouped, according to the color-value of each pixel point, into multiple groups, such that a color-value distance between a minimum color-value and a maximum color-value of the pixel points in a same group is less than a first pre-set threshold value.

[0027] In one embodiment, after traversing the color-value of each pixel point in the background image, a set of pixel points can be established. The set contains each pixel point of the background image, and its pixel points can then be grouped. An exemplary grouping method starts from the pixel point having the minimum color-value (or the maximum color-value) and groups into the same group the pixel points whose color-value distance from that minimum (or maximum) color-value is less than the first pre-set threshold value. Then, starting again from the pixel point having the minimum (or maximum) color-value among the remaining pixel points, the grouping step is repeated until all pixel points are grouped. In one embodiment, the first pre-set threshold value can be about 0.01 or any other suitable value without limitation.
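The grouping procedure above can be sketched as follows, assuming color-values are normalized scalars (consistent with the example threshold of 0.01) and starting each group from the minimum remaining color-value; the function name is illustrative, not from the source.

```python
def group_pixels(color_values, threshold=0.01):
    """Greedily group pixel color-values so that, within a group, the
    distance between the minimum and maximum color-value stays below
    the first pre-set threshold."""
    remaining = sorted(color_values)
    groups = []
    while remaining:
        # Start a new group from the minimum remaining color-value.
        base = remaining[0]
        group = [v for v in remaining if v - base < threshold]
        remaining = [v for v in remaining if v - base >= threshold]
        groups.append(group)
    return groups
```

Because the list is sorted once up front, each pass only needs to compare against the current minimum, which keeps the sketch simple at the cost of repeated list scans.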

[0028] Optionally, based on the exemplary method of FIG. 1, the step of determining the color-value of each component part in the window according to the color-value of each pixel point includes the following exemplary steps.

[0029] A first-part group can be selected from multiple groups such that the first-part group contains a maximum number of pixel points having an average pixel-point color-value, averaged from the maximum number of pixel points in the first-part group and determined as a first part color-value of the window (Step a).

[0030] One or more groups, each having an average pixel-point color-value that differs from the first part color-value by less than a second pre-set threshold value, can then be deleted to provide multiple remaining groups after deleting the one or more groups from the multiple groups (Step b).

[0031] A second-part group can be selected from the multiple remaining groups such that the second-part group contains a maximum number of pixel points having an average pixel-point color-value determined as a second part color-value of the window (Step c).

[0032] Steps (b) and (c) can be repeated until a color-value of each component part of the window is determined. The color-value of each component part of the window can include at least the first part color-value and the second part color-value.

[0033] For example, each component part of the window can include background, main title, subtitle, body text, while the first part can be the background having a first part color-value and the second part can be the main title having a second part color-value.

[0034] After the pixel points are grouped, the set of pixel points is thus partitioned into multiple groups. The group containing a maximum number of pixel points can then be determined. In cases when multiple (e.g., two or more) groups contain a same number of pixel points, one of the multiple groups can be randomly selected, and the average pixel-point color-value of the randomly-selected group can be determined as the background color-value of the window.

[0035] To highlight the main title against the background color, the main title can have a color with sufficient contrast to the background color. A minimum color-value distance between the background color-value and the main title color-value can be pre-defined. Such a minimum color-value distance can be about 0.5 or any other suitable pre-defined value.

[0036] One or more groups having an average color-value distance from the background color-value of less than a second pre-set threshold value can be deleted, and the set of groups can then be updated to exclude the deleted groups.

[0037] A second-part group can be selected from the groups remaining after the deletion, such that the second-part group contains a maximum number of pixel points having an average color-value, which is then determined as the main title color-value of the window.

[0038] Likewise, the set of groups can be further updated. In certain updating processes, other values (instead of the second pre-set threshold value) can be used for the updating. Another group containing a maximum number of pixel points can then be selected from the groups remaining after the set is updated. The average color-value of the pixel points in this group can be determined as a color-value for the subtitle and the body text of the window.

[0039] When there are other component parts, the color-values of these other component parts can also be determined using the same methods disclosed herein.
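Under the same assumptions (groups of scalar color-values, the example second threshold of 0.5), the selection-and-deletion loop of steps (a)-(d) might look like this sketch. The names and the tie-breaking behavior (`max` picks the first largest group rather than a random one) are illustrative choices, not from the source.

```python
def pick_part_colors(groups, num_parts, second_threshold=0.5):
    """Repeatedly take the average color of the group holding the most
    pixel points as the next part color, then delete every remaining
    group whose average color differs from that pick by less than the
    second pre-set threshold (steps (a)-(d))."""
    def average(group):
        return sum(group) / len(group)

    colors = []
    remaining = list(groups)
    while remaining and len(colors) < num_parts:
        largest = max(remaining, key=len)   # group with most pixel points
        part_color = average(largest)
        colors.append(part_color)
        # Drop groups whose average is too close to the chosen color.
        remaining = [g for g in remaining
                     if abs(average(g) - part_color) >= second_threshold]
    return colors
```

The loop stops once the requested number of part colors (e.g., background, then main title) has been found or no sufficiently distant groups remain.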

[0040] Optionally, based on the window coloring method of FIG. 1, the step of determining the color-value of each component part in the window according to the color-value of each pixel point can include the following exemplary steps.

[0041] A first-part group can be selected from multiple groups such that the first-part group contains a maximum number of pixel points having an average pixel-point color-value, averaged from pixel points in the first-part group and determined as a first part color-value of the window.

[0042] According to the first part color-value and a first distribution scheme pre-set for color-values of component parts in the window, color-values of component parts other than the first part color-value of the window can be calculated.

[0043] For example, according to the above-mentioned method, the background color-value can be determined. According to the determined background color-value and a first distribution scheme pre-set for color-values of component parts in the window, the color-values of component parts other than the first part color-value of the window can be calculated.
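The source does not define the pre-set distribution scheme itself. One plausible reading, shown purely as an assumption, is a fixed table of color-value offsets applied to the determined background color; the part names and offset values below are invented for the example.

```python
# Hypothetical "first distribution scheme": fixed color-value offsets
# (wrapped into [0, 1)) applied to the background color for each of
# the other component parts. The offsets are illustrative only.
FIRST_SCHEME = {"main_title": 0.5, "subtitle": 0.35, "body_text": 0.3}

def colors_from_scheme(background_color, scheme=FIRST_SCHEME):
    """Derive the remaining component-part colors from the background
    color-value and a pre-set offset scheme."""
    return {part: (background_color + offset) % 1.0
            for part, offset in scheme.items()}
```

A scheme like this guarantees a fixed contrast relationship between the background and each other part, which matches the stated goal of keeping the parts distinguishable.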

[0044] Optionally, based on the window coloring method of FIG. 1, the step of determining the color-value of each component part in the window according to the color-value of each pixel point can include the following exemplary steps.

[0045] According to an average pixel-point color-value of a group containing a maximum number of pixel points and a second distribution scheme pre-set for color-values of component parts in the window, color-values of component parts of the window can be calculated.

[0046] In one embodiment, based on the average pixel-point color-value of the group containing a maximum number of pixel points and the second distribution scheme, the color-values of the component parts of the window can be directly calculated, without the iterative group-selection and group-deletion process described above.

[0047] Optionally, based on the window coloring method of FIG. 1, after obtaining the background image under the coverage area of the window, the method further includes: compressing the background image.

[0048] In one embodiment, a captured background image (e.g., a screenshot) may be too large, which can reduce real-time analysis efficiency. In this case, the pixel points of the obtained background image can be compressed, e.g., proportionally scaled down to 80*80 pixel points or fewer.
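A proportional downscale such as the one described (to 80*80 pixel points or fewer) can be sketched with plain nearest-neighbour sampling; a real implementation would typically use a platform or library scaler, and the function name here is invented.

```python
def compress_image(pixels, max_side=80):
    """Proportionally shrink a 2-D grid of pixel color-values so that
    neither side exceeds max_side, using nearest-neighbour sampling."""
    h, w = len(pixels), len(pixels[0])
    scale = min(1.0, max_side / max(h, w))  # keep the aspect ratio
    new_h, new_w = max(1, int(h * scale)), max(1, int(w * scale))
    return [[pixels[int(y * h / new_h)][int(x * w / new_w)]
             for x in range(new_w)]
            for y in range(new_h)]
```

Images already within the size limit pass through unchanged, since the scale factor is capped at 1.0.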

[0049] FIG. 2 depicts an exemplary user apparatus for window coloring consistent with various disclosed embodiments. The exemplary user apparatus 20 includes obtaining unit 201, counting unit 202, determining unit 203, and/or rendering unit 204.

[0050] The obtaining unit 201 is configured to obtain a background image under a coverage area of a window. The window includes component parts, and the background image includes pixel points.

[0051] The counting unit 202 is configured to count a color-value of each pixel point in the background image obtained by the obtaining unit 201.

[0052] The determining unit 203 is configured to determine a color-value of each component part of the window according to the color-value of each pixel point counted by the counting unit 202.

[0053] The rendering unit 204 is configured to render each component part based on the color-value of each component part determined by the determining unit 203.

[0054] As disclosed, the obtaining unit 201 obtains a background image under a coverage area of a window. The counting unit 202 counts a color-value of each pixel point in the background image obtained by the obtaining unit 201. The determining unit 203 determines a color-value of each component part of the window according to the color-value of each pixel point counted by the counting unit 202. The rendering unit 204 renders each component part based on the color-value of each component part determined by the determining unit 203.

[0055] The color of component parts in a window may be in low contrast with the color of the corresponding background image, making them hard to distinguish from one another. The disclosed user apparatus determines the color of each component part in the window according to the color of the background image, which allows the window and the background image to look more integrated and harmonious, e.g., on a screen of a terminal device.

[0056] Optionally, based on the apparatus of FIG. 2, the counting unit 202 further includes a color-value traversing sub-unit 2021 and/or a grouping sub-unit 2022, as shown in FIG. 3. The color-value traversing sub-unit 2021 is configured to traverse the color-value of each pixel point in the background image.

[0057] The grouping sub-unit 2022 is configured to group each pixel point according to the color-value of each pixel point, traversed by the color-value traversing sub-unit 2021, into a plurality of groups such that a color-value distance between a minimum color-value and a maximum color-value of the pixel points in a same group is less than a first pre-set threshold value.

[0058] Optionally, based on the apparatus of FIG. 3, the determining unit 203 includes: first determining sub-unit 2031, deleting sub-unit 2032, and/or second determining sub-unit 2033, as shown in FIG. 4.

[0059] The first determining sub-unit 2031 is configured to select a first-part group from the plurality of groups such that the first-part group contains a maximum number of pixel points with an average pixel-point color-value, determined as a first part color-value of the window.

[0060] The deleting sub-unit 2032 is configured to delete one or more groups each having an average pixel-point color-value that provides a color difference from the first part color-value, determined by the first determining sub-unit 2031, by less than a second pre-set threshold value, and to provide a plurality of remaining groups after deleting the one or more groups from the plurality of groups.

[0061] The second determining sub-unit 2033 is configured to select a second-part group from the plurality of remaining groups such that the second-part group contains a maximum number of pixel points having an average pixel-point color-value determined as a second part color-value of the window.

[0062] The deleting sub-unit and the second determining sub-unit are repeatedly used until a color-value of each component part of the window is determined. The color-value of each component part of the window comprises at least the first part color-value and the second part color-value.

[0063] Optionally, based on the apparatus of FIG. 3, the determining unit 203 includes: third determining sub-unit 2034 and/or calculating sub-unit 2035, as shown in FIG. 5.

[0064] The third determining sub-unit 2034 is configured to select a first-part group from the plurality of groups such that the first-part group contains a maximum number of pixel points having an average pixel-point color-value, averaged from the maximum number of the pixel points in the first-part group and determined as a first part color-value of the window.

[0065] The calculating sub-unit 2035 is configured to calculate color-values of component parts other than the first part color-value of the window, according to the first part color-value determined by the third determining sub-unit 2034 and a first distribution scheme pre-set for color-values of component parts in the window.

[0066] Optionally, in the apparatus of FIG. 3, the determining unit 203 is configured to calculate color-values of component parts of the window, according to an average pixel-point color-value of a group containing a maximum number of pixel points and a second distribution scheme pre-set for color-values of component parts in the window.

[0067] Optionally, based on the apparatus of FIG. 2, the user apparatus 20 can further include a compressing unit 205 configured to compress the background image as shown in FIG. 6.

[0068] The present disclosure further provides a user apparatus to address the technical problems described above. FIG. 7 depicts another exemplary user apparatus consistent with various disclosed embodiments. The user apparatus can be used to implement the disclosed window coloring methods.

[0069] The exemplary user apparatus can include a mobile phone, a tablet computer, a PDA (personal digital assistant), a POS (point-of-sale) terminal, an in-vehicle computer, or any other suitable terminal device.

[0070] Note that FIG. 7 depicts at least a portion of an exemplary user apparatus. As shown in FIG. 7, the exemplary user apparatus 700 can include an RF (Radio Frequency) circuit 710, a storage device 720 including one or more computer-readable storage media, an input unit 730, a display unit 740, a sensor 750, an audio circuit 760, a transmission module 770, a processor 780 including one or more processing cores, a power supply 790, and/or other components. In various embodiments, the user apparatus described herein can include more or fewer components than depicted in FIG. 7. Certain components/parts can be omitted, combined, replaced, and/or added.

[0071] The RF circuit 710 can be used to send and receive information or signals during communication. In particular, downlink information received from a base station can be processed by the one or more processors 780, and uplink data can be sent to the base station. Generally, the RF circuit 710 can include, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a subscriber identity module (SIM) card, a transceiver, a coupler, an LNA (low-noise amplifier), a duplexer, etc. In addition, the RF circuit 710 can communicate with other devices via a wireless communication network. The wireless communication can use any communication standard or protocol, including but not limited to GSM (Global System for Mobile Communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, and SMS (Short Messaging Service).

[0072] The storage device 720 can be used for storing software programs and modules. By running the software programs and modules stored in the storage device 720, the processor 780 can perform various functional applications and data processing. The storage device 720 can include a program storage area and a data storage area. The program storage area can store the operating system and the applications (such as sound playback, image playback, etc.) required by at least one function. The data storage area can store data (such as audio data, a phone book, etc.) created when using the user apparatus. In addition, the storage device 720 can include a high-speed random access memory and a non-volatile memory. For example, the storage device 720 can include at least one disk memory, flash memory, and/or other non-volatile solid-state memory elements. Accordingly, the storage device 720 can further include a memory controller to provide the processor 780 and the input unit 730 with access to the storage device 720.

[0073] The input unit 730 can be used to receive inputted numeric or character information, and to generate keyboard, mouse, joystick, trackball, or optical signal input related to user settings and function controls. Specifically, the input unit 730 can include a touch control panel 731 and other input device(s) 732. The touch control panel 731, also known as a touch-sensitive surface, touch screen, or touch panel, can collect touch operations that a user conducts on or near it. For example, a user can use a finger, a stylus, or any other suitable object or attachment on the touch control panel 731 or on an area near it. The touch control panel 731 can drive a connecting device based on a preset program.

Optionally, the touch control panel 731 can include a touch detection device and a touch controller. The touch detection device can detect a user's touch position, detect a signal generated by the touch operation, and send the signal to the touch controller. The touch controller can receive the touch information from the touch detection device, convert the touch information into contact coordinates to send to the processor 780, and receive and execute commands sent from the processor 780.

Furthermore, the touch control panel 731 can be realized by resistive, capacitive, infrared, surface acoustic wave, and/or other types of touch technology. In addition to the touch control panel 731, the input unit 730 can also include other input device(s) 732. Specifically, the other input device(s) 732 can include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), a trackball, a mouse, an operating lever, or combinations thereof.

[0074] The display unit 740 can be used to display information inputted by the user, information provided to the user, and a variety of graphical user interfaces of the user apparatus 700. These graphical user interfaces can be formed by images, text, icons, videos, and/or any combinations thereof. The display unit 740 can include a display panel 741 configured by, e.g., an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode) display, etc. Further, the touch control panel 731 can cover the display panel 741. When the touch control panel 731 detects a touch operation on or near it, the touch operation can be sent to the processor 780 to determine a type of the touch operation. Accordingly, the processor 780 can provide visual output on the display panel 741. Although in FIG. 7 the touch control panel 731 and the display panel 741 are shown as two separate components to achieve input and output functions, in some embodiments, the touch control panel 731 and the display panel 741 can be integrated to perform both input and output functions.

[0075] The user apparatus 700 in FIG. 7 can further include at least one sensor 750, such as optical sensors, motion sensors, and other suitable sensors. Specifically, the optical sensors can include an ambient light sensor and a proximity sensor. The ambient light sensor can adjust the brightness of the display panel 741 according to the brightness of the ambient light. The proximity sensor can turn off the display panel 741 and/or its backlighting when the user apparatus 700 is moved to an ear. As a type of motion sensor, a gravity sensor can detect the amount of acceleration in each direction (e.g., along three axes) and detect the magnitude and direction of gravity when stationary. The gravity sensor can be used to identify phone posture (for example, switching between horizontal and vertical screens, related games, magnetometer calibration posture, etc.) and/or vibration recognition related functions (e.g., pedometer, percussion, etc.). The user apparatus 700 can also be configured with, e.g., a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and/or other sensors.

[0076] The audio circuit 760 can include an audio input device 761 such as a microphone and an audio output device 762 such as a speaker, and can provide an audio interface between the user and the user apparatus 700. The audio circuit 760 can transmit an electrical signal, converted from received audio data, to the speaker 762 to be converted into an audio signal output. On the other hand, the microphone 761 can convert a collected sound signal into an electrical signal, which can be received by the audio circuit 760 and converted into audio data. The audio data can be output to the processor 780 for processing and then transmitted via the RF circuit 710 to, e.g., another terminal. Alternatively, the audio data can be output to the storage device 720 for further processing. The audio circuit 760 can also include an earphone jack to provide communications between a peripheral headset and the user apparatus 700.

[0077] The user apparatus 700 can use the transmission module 770 to help users send/receive e-mails, browse websites, access streaming media, etc. The transmission module 770 can provide users with wireless or wired broadband Internet access. In various embodiments, the transmission module 770 can be configured within or outside of the user apparatus 700 as depicted in FIG. 7.

[0078] The processor 780 can be a control center of the user apparatus 700: using a variety of interfaces and circuits to connect various parts, e.g., within a mobile phone; running or executing software programs and/or modules stored in the storage device 720; calling the data stored in the storage device 720; and/or performing various functions and data processing of the user apparatus 700, e.g., to monitor the mobile phone as a whole.

[0079] Optionally, the processor 780 can include one or more processing cores. In an exemplary embodiment, the processor 780 can integrate an application processor with a modulation/demodulation (modem) processor. The application processor mainly handles the operating system, user interface, and applications, while the modem processor mainly handles wireless communications. In various embodiments, the modem processor may or may not be integrated into the processor 780.

[0080] The user apparatus 700 can further include a power supply 790 (such as a battery) to power the various components of the user apparatus. In an exemplary embodiment, the power supply can be connected to the processor 780 via a power management system, which can thus manage charging, discharging, and/or power consumption functions. The power supply 790 can also include one or more DC or AC power supplies, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and/or any other suitable components. Although not shown in FIG. 7, the user apparatus 700 can further include a camera, a Bluetooth module, etc., without limitation.

[0081] The processor(s) 780 of the user apparatus 700 can upload executable files corresponding to processes of one or more programs to the storage device 720. The processor(s) 780 can then be used to run these one or more programs stored in the storage device 720.

[0082] For example, the processor(s) 780 can cause the exemplary user apparatus to perform disclosed window coloring methods. In the methods, a background image under a coverage area of a window is obtained. The window includes component parts, and the background image includes pixel points. A color-value of each pixel point of the background image is counted. According to the color-value of each pixel point, a color-value of each component part in the window is determined. Each component part is rendered according to the color-value of each component part.

[0083] When counting the color-value of each pixel point in the background image, the color-value of each pixel point in the background image can be traversed, and each pixel point can be grouped according to the color-value of each pixel point into a plurality of groups such that a color-value distance between a minimum color-value and a maximum color-value of the pixel points in a same group is less than a first pre-set threshold value.

[0084] When determining the color-value of each component part in the window according to the color-value of each pixel point, exemplary steps can be performed including: (a) selecting a first-part group from the plurality of groups such that the first-part group contains a maximum number of pixel points with an average pixel-point color-value, averaged from the maximum number of pixel points in the first-part group and determined as a first part color-value of the window; (b) deleting one or more groups each having an average pixel-point color-value that provides a color difference from the first part color-value by less than a second pre-set threshold value, and providing a plurality of remaining groups after deleting the one or more groups from the plurality of groups; (c) selecting a second-part group from the plurality of remaining groups such that the second-part group contains a maximum number of pixel points having an average pixel-point color-value determined as a second part color-value of the window; and (d) repeating the steps (b) and (c), until a color-value of each component part of the window is determined, wherein the color-value of each component part of the window comprises at least the first part color-value and the second part color-value.
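The iterative selection of steps (a)-(d) can be sketched as follows. This is an illustrative Python sketch under assumed representations not specified in the disclosure: a group is a list of color triples, and the color difference is a Euclidean distance.

```python
def color_distance(c1, c2):
    """Euclidean distance between two color triples (assumed metric)."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

def average_color(group):
    """Average pixel-point color-value of a group."""
    n = len(group)
    return tuple(sum(c[i] for c in group) / n for i in range(3))

def pick_part_colors(groups, second_threshold, num_parts):
    """Return up to num_parts component-part color-values."""
    remaining = list(groups)
    part_colors = []
    while remaining and len(part_colors) < num_parts:
        # Steps (a)/(c): take the group with the most pixel points and
        # use its average color-value as the next part color-value.
        largest = max(remaining, key=len)
        part_color = average_color(largest)
        part_colors.append(part_color)
        # Step (b): delete groups whose average color-value differs from
        # the selected part color-value by less than the threshold
        # (this also removes the selected group itself).
        remaining = [g for g in remaining
                     if color_distance(average_color(g), part_color)
                     >= second_threshold]
    return part_colors
```

For example, with a dominant dark group and a smaller light group, the first two part color-values come out as the dark and light averages, in that order.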

[0085] For determining the color-value of each component part in the window according to the color-value of each pixel point, a first-part group can be selected from the plurality of groups such that the first-part group contains a maximum number of pixel points having an average pixel-point color-value, averaged from pixel points in the first-part group and determined as a first part color-value of the window. Color-values of component parts other than the first part color-value of the window can be calculated, according to the first part color-value and a first distribution scheme pre-set for color-values of component parts in the window.

[0086] Optionally, for determining the color-value of each component part in the window according to the color-value of each pixel point, color-values of component parts of the window can be calculated according to an average pixel-point color-value of a group containing a maximum number of pixel points and a second distribution scheme pre-set for color-values of component parts in the window.

[0087] After obtaining the background image under the coverage area of the window, the background image can be compressed. The window can include a focus box, a text input box, or a combination thereof. The background image can include a screen capture of the window. Each component part of the window can include background, main title, subtitle, and body text.

[0088] Conventionally, colors of component parts in a window may be in low contrast with colors of the corresponding background image, making them hard to distinguish from one another. The disclosed window coloring method can determine the color of each component part in the window according to the color of the background image, which allows the window and the background image to look more integrated and harmonious, e.g., on a screen of a user apparatus or any terminal device.

[0089] The disclosed methods can be implemented by an apparatus/device including one or more processors and a non-transitory computer-readable storage medium having instructions stored thereon. The non-transitory computer-readable storage medium can be included in the storage device shown in FIG. 7 or can be independently configured outside the user apparatus. The instructions can be executed by the one or more processors of the apparatus/device to perform the methods disclosed herein. In some cases, the instructions can include one or more modules corresponding to the disclosed window coloring methods. In the methods, a background image under a coverage area of a window is obtained. The window includes component parts, and the background image includes pixel points. A color-value of each pixel point of the background image is counted. According to the color-value of each pixel point, a color-value of each component part in the window is determined. Each component part is rendered according to the color-value of each component part.

[0090] When counting the color-value of each pixel point in the background image, the color-value of each pixel point in the background image can be traversed, and each pixel point can be grouped according to the color-value of each pixel point into a plurality of groups such that a color-value distance between a minimum color-value and a maximum color-value of the pixel points in a same group is less than a first pre-set threshold value.

[0091] When determining the color-value of each component part in the window according to the color-value of each pixel point, exemplary steps can be performed including: (a) selecting a first-part group from the plurality of groups such that the first-part group contains a maximum number of pixel points with an average pixel-point color-value, averaged from the maximum number of pixel points in the first-part group and determined as a first part color-value of the window; (b) deleting one or more groups each having an average pixel-point color-value that provides a color difference from the first part color-value by less than a second pre-set threshold value, and providing a plurality of remaining groups after deleting the one or more groups from the plurality of groups; (c) selecting a second-part group from the plurality of remaining groups such that the second-part group contains a maximum number of pixel points having an average pixel-point color-value determined as a second part color-value of the window; and (d) repeating the steps (b) and (c), until a color-value of each component part of the window is determined, wherein the color-value of each component part of the window comprises at least the first part color-value and the second part color-value.

[0092] For determining the color-value of each component part in the window according to the color-value of each pixel point, a first-part group can be selected from the plurality of groups such that the first-part group contains a maximum number of pixel points having an average pixel-point color-value, averaged from pixel points in the first-part group and determined as a first part color-value of the window. Color-values of component parts other than the first part color-value of the window can be calculated, according to the first part color-value and a first distribution scheme pre-set for color-values of component parts in the window.

[0093] Optionally, for determining the color-value of each component part in the window according to the color-value of each pixel point, color-values of component parts of the window can be calculated according to an average pixel-point color-value of a group containing a maximum number of pixel points and a second distribution scheme pre-set for color-values of component parts in the window.

[0094] After obtaining the background image under the coverage area of the window, the background image can be compressed. The window can include a focus box, a text input box, or a combination thereof. The background image can include a screen capture of the window. Each component part of the window can include background, main title, subtitle, and body text.

[0095] Conventionally, the current background color of a window displayed on a terminal device is not analyzed and/or extracted, so "automatic" coloring of the component parts of a window is desired to provide a better visual effect. As disclosed, in one embodiment, a coloring algorithm for a translucent window is provided based on color analysis of the background image, such that the ground-color of the window can be "automatically" merged with the color of the desktop image serving as the background of the window. Based on the disclosed methods and apparatus, text (e.g., title, subtitle, body text, etc.) displayed on the window can automatically be assigned a color having sufficient contrast with the background image while having a hue style consistent with the original background image.

[0096] Certain embodiments also provide color analysis of the background color, including the exemplary steps of capturing an image; processing the image; counting color-values; grouping color-values; obtaining the background color; deleting groups similar to the background color; obtaining the title color; and/or obtaining the color-value of the subtitle.

[0097] For capturing an image, when the window is moving, a timer is started to periodically capture the image in the back area of the window, e.g., by passing the rectangular coordinate frame to be captured and the current window ID "wid" to the function "CGWindowListCreateImage(*(CGRect*) &frame, kCGWindowListOptionOnScreenBelowWindow, wid, kCGWindowImageShouldBeOpaque)" to obtain layered images of all layers under the window in the assigned rectangle.

[0098] For processing the image, when the captured image is too big in size, real-time analysis efficiency can be affected. The pixel points of the obtained background image can therefore be compressed, e.g., proportionally compressed into a size of 80*80 pixel points or smaller (e.g., by calling "drawInRect" on the original image with the rectangle "NSMakeRect(0, 0, 80, 80)").
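The proportional compression above can be sketched portably. The disclosure uses the Cocoa "drawInRect"/"NSMakeRect" calls; the nearest-neighbor loop below is an assumed stand-in for illustration, operating on a flat row-major pixel list.

```python
def downscale(pixels, src_w, src_h, dst_w=80, dst_h=80):
    """Nearest-neighbor downscale of a row-major pixel list.

    A portable stand-in for the "drawInRect"-based reduction; the
    disclosure only requires that the image be proportionally
    compressed to about 80*80 pixel points before analysis.
    """
    out = []
    for y in range(dst_h):
        sy = y * src_h // dst_h  # nearest source row
        for x in range(dst_w):
            sx = x * src_w // dst_w  # nearest source column
            out.append(pixels[sy * src_w + sx])
    return out
```

For a 100*100 capture, this yields the 6400-pixel working set that the counting and grouping steps traverse.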

[0099] For counting color-values, each pixel of the processed image can be traversed. The color-value of each pixel can be stored in a mathematical array (e.g., using the function "colorAtX:y:" to obtain the color-value at an assigned coordinate of the "NSBitmapImageRep" image data, and converting the color-value into a YUV color-value). YUV is an exemplary color encoding method.
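The RGB-to-YUV conversion mentioned above can be sketched as follows. The BT.601 coefficients are an assumption for illustration; the disclosure does not specify which YUV variant is used.

```python
def rgb_to_yuv(r, g, b):
    """Convert normalized RGB (0..1) to YUV using BT.601 coefficients.

    Y carries luminance; U and V carry chrominance, so a YUV
    color-value distance separates brightness from hue.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)  # scaled B - Y
    v = 0.877 * (r - y)  # scaled R - Y
    return (y, u, v)
```

Pure black maps to (0, 0, 0) and pure white to approximately (1, 0, 0): both have zero chrominance and differ only in luma.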

[00100] For grouping color-values, all pixels of the mathematical array can be grouped. For example, when pixels have a color-value distance less than a "delta", these pixels can be grouped together in one group. Multiple groups can be formed, including Array1, Array2, ..., ArrayN, with similar color-values in a same group. A "GroupingArray" can be used to store the grouped arrays. As disclosed herein, a same color-value may refer to color-values having a color-value distance therebetween less than the value of "delta", which can be, e.g., 0.01 or any suitable number.

[00101] For obtaining the background color (e.g., a primary color in the captured image), among the groups Array1, Array2, ..., ArrayN, the group containing the maximum number of elements, indicating that the contained similar color-values appear the maximum number of times, can be used as the background color.
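The grouping by "delta" described above can be sketched as follows. Note an assumption: the disclosure requires the distance between the minimum and maximum color-values within a group to stay under the threshold, whereas this greedy single-pass sketch compares each color only against a group's first member, which is a common simplification.

```python
def color_distance(c1, c2):
    """Euclidean distance between two color triples (assumed metric)."""
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

def group_by_delta(colors, delta=0.01):
    """Greedy grouping: each color-value joins the first group whose
    first member is within `delta`, otherwise it starts a new group."""
    groups = []
    for c in colors:
        for g in groups:
            if color_distance(g[0], c) < delta:
                g.append(c)
                break
        else:
            groups.append([c])
    return groups
```

The resulting list of groups plays the role of the "GroupingArray": the largest group supplies the background color, and subsequent largest groups (after deletions) supply the title and subtitle colors.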

[00102] After deleting the groups similar to the background color, the "GroupingArray" can then be updated. The secondary color to be obtained (e.g., the color for the title, text, etc.) has to have sufficient contrast with the background color (that is, the color-value distance is greater than a certain threshold value). Therefore, groups "ArrayX1, ArrayX2" in the "GroupingArray" which have a color-value distance from the background color less than "delta" (where "delta" here represents enough contrast, e.g., "delta" = 0.5) are deleted.

[00103] For obtaining the color-value of the title (or main title), another group having the maximum number of elements in the updated "GroupingArray" can provide a first secondary color, which has great contrast with the background and appears many times in the image. This can be used as the color for the main title. Once again, the "GroupingArray" is further updated by deleting groups having smaller contrast with the title color-value (for example, deleting all groups having a color-value distance less than 0.1).

[00104] For obtaining the color-values of the subtitle, body text, etc., a repeated process can be performed similar to obtaining the color of the title, e.g., by obtaining the group having the maximum number of elements in the "GroupingArray" as the color-value, and then updating the "GroupingArray" by deleting corresponding groups.

[00105] In some cases, when a color-value of the title, subtitle, and/or body text cannot be obtained by the above exemplary steps, for example, when the background color is a pure color, a text color can be calculated to provide enough contrast with, and be matched to, the background color.
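One way such a fallback text color could be calculated is sketched below. The luma offset and the choice to preserve the background's chrominance are illustrative assumptions; the disclosure only requires enough contrast matched to the background color.

```python
def fallback_text_color(bg_yuv, min_contrast=0.5):
    """Derive a text color with sufficient luma contrast against a
    pure-color background, keeping the background's chrominance.

    Assumed heuristic: shift luma by `min_contrast` toward the far
    side of its 0..1 range, so dark backgrounds get light text and
    light backgrounds get dark text, while U/V keep the hue style.
    """
    y, u, v = bg_yuv
    text_y = y - min_contrast if y >= 0.5 else y + min_contrast
    return (max(0.0, min(1.0, text_y)), u, v)
```

A bright background (luma 0.9) thus yields dark text (luma 0.4), and a dark background (luma 0.1) yields light text (luma 0.6), in both cases with the same chrominance as the background.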

[00106] In addition, the obtained primary color and secondary colors can be used to replace the colors of the corresponding areas of the window. When the background color changes, a smooth transition may be provided using animations.

[00107] It should be understood that the steps described in the various methods of the present disclosure may be carried out in the order shown or, alternately, in a different order. Therefore, the order of the steps illustrated should not be construed as limiting the scope of the present disclosure. In addition, certain steps may be performed simultaneously.

[00108] In the present disclosure, each embodiment is progressively described, i.e., each embodiment is described with a focus on its differences from other embodiments. Similar and/or the same portions among the various embodiments can be referred to one another. In addition, exemplary apparatus and/or systems are described with respect to the corresponding methods.

[00109] The disclosed methods, apparatus, and/or systems can be implemented in a suitable computing environment. The disclosure can be described with reference to symbol(s) and step(s) performed by one or more computers, unless otherwise specified. Therefore, the steps and/or implementations described herein can be described one or more times and executed by computer(s). As used herein, the term "executed by computer(s)" includes execution by a computer processing unit on electronic signals of data in a structured form. Such execution can convert the data or maintain the data in a position in a memory system (or storage device) of the computer, which can be reconfigured to alter the execution of the computer, as appreciated by those skilled in the art. The data structure maintained by the data has a physical location in the memory with specific properties defined by the data format. However, the embodiments described herein are not so limited, and the steps and implementations described herein may also be performed by hardware.

[00110] As used herein, the term "module" or "unit" can refer to software objects executed on a computing system. A variety of components described herein, including elements, modules, units, engines, and services, can be executed in the computing system. The methods, apparatus, and/or systems can be implemented in a software manner or, of course, using hardware. All of these are within the scope of the present disclosure.

[00111] A person of ordinary skill in the art can understand that the units/modules included herein are described according to their functional logic, but are not limited to the above descriptions as long as the units/modules can implement corresponding functions. Further, the specific name of each functional module is used to be distinguished from one another without limiting the protection scope of the present disclosure.

[00112] In various embodiments, the disclosed units/modules can be configured in one apparatus (e.g., a processing unit) or configured in multiple apparatus as desired. The units/modules disclosed herein can be integrated in one unit/module or in multiple units/modules. Each of the units/modules disclosed herein can be divided into one or more sub-units/modules, which can be recombined in any manner. In addition, the units/modules can be directly or indirectly coupled or otherwise communicated with each other, e.g., by suitable interfaces.

[00113] One of ordinary skill in the art would appreciate that suitable software and/or hardware (e.g., a universal hardware platform) may be included and used in the disclosed methods, apparatus, and/or systems. For example, the disclosed embodiments can be implemented by hardware only, or alternatively by software products only. The software products can be stored in a computer-readable storage medium including, e.g., ROM/RAM, magnetic disk, optical disk, etc. The software products can include suitable commands to enable a terminal device (e.g., a mobile phone, a personal computer, a server, a network device, etc.) to implement the disclosed embodiments.

[00114] For example, the disclosed methods can be implemented by an apparatus/device including one or more processors and a non-transitory computer-readable storage medium having instructions stored thereon. The instructions can be executed by the one or more processors of the apparatus/device to perform the methods disclosed herein. In some cases, the instructions can include one or more modules corresponding to the disclosed methods.

[00115] Note that the terms "comprising", "including", or any other variants thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus containing a number of elements includes not only those elements but also other elements that are not expressly listed, or further includes elements inherent to the process, method, article, or apparatus. Without further restrictions, the statement "includes a..." does not exclude other elements from being included in the process, method, article, or apparatus having those elements.

[00116] The embodiments disclosed herein are exemplary only. Other applications, advantages, alternations, modifications, or equivalents to the disclosed embodiments are obvious to those skilled in the art and are intended to be encompassed within the scope of the present disclosure.

INDUSTRIAL APPLICABILITY AND ADVANTAGEOUS EFFECTS

[00117] Without limiting the scope of any claim and/or the specification, examples of industrial applicability and certain advantageous effects of the disclosed embodiments are listed for illustrative purposes. Various alternations, modifications, or equivalents to the technical solutions of the disclosed embodiments can be obvious to those skilled in the art and can be included in this disclosure.

[00118] Window coloring methods and user apparatus are provided. A background image under a coverage area of a window is obtained. The window includes component parts, and the background image includes pixel points. A color-value of each pixel point of the background image is counted. According to the color-value of each pixel point, a color-value of each component part in the window is determined. Each component part is rendered according to the color-value of each component part.

[00119] By using the disclosed methods and user apparatus, the color of each component part of a window can be determined according to the color of the background image, such that the component parts of the window and the background image, as a whole, appear more integrated and harmonious.

REFERENCE SIGN LIST

User apparatus 20

Obtaining unit 201

Counting unit 202

Determining unit 203

Rendering unit 204

Compressing unit 205

Color-value traversing sub-unit 2021

Grouping sub-unit 2022

First determining sub-unit 2031

Deleting sub-unit 2032

Second determining sub-unit 2033

Third determining sub-unit 2034

Calculating sub-unit 2035

RF (radio frequency) circuit 710

Storage device 720

Input unit 730

Display unit 740

Sensor 750

Audio circuit 760

Transmission module 770

Processor 780

Power supply 790

Touch control panel 731

Other input device(s) 732

Display panel 741

Audio input device 761

Audio output device 762