

Title:
METHOD, TERMINAL DEVICE, AND SYSTEM FOR INSTANT MESSAGING
Document Type and Number:
WIPO Patent Application WO/2015/032284
Kind Code:
A1
Abstract:
A method, a terminal device and a system for instant messaging (IM) are provided. Image data corresponding to a first display area are received for displaying a profile picture on the first display area on a communication interface of the terminal device during an IM communication. Communication data corresponding to a second display area are received for at least displaying text on the second display area on the communication interface during the IM communication. The image data corresponding to the first display area and the communication data corresponding to the second display area are sent to a receiver.

Inventors:
WANG JIAO (CN)
Application Number:
PCT/CN2014/085263
Publication Date:
March 12, 2015
Filing Date:
August 27, 2014
Assignee:
TENCENT TECH SHENZHEN CO LTD (CN)
International Classes:
H04L12/58
Foreign References:
CN102780649A2012-11-14
CN102594723A2012-07-18
CN102289339A2011-12-21
Attorney, Agent or Firm:
BEIJING SAN GAO YONG XIN INTELLECTUAL PROPERTY AGENCY CO., LTD. (No.5 Huizhong Road Chaoyang District, Beijing 1, CN)
Claims:
WHAT IS CLAIMED IS:

1. An IM method implemented by a message sending terminal device, comprising:

receiving image data corresponding to a first display area for displaying a profile picture on the first display area on a communication interface of the message sending terminal device during an IM communication;

receiving communication data corresponding to a second display area for at least displaying text on the second display area on the communication interface during the IM communication; and

sending the image data corresponding to the first display area and the communication data corresponding to the second display area to a receiver.

2. The method according to claim 1, wherein before receiving the image data corresponding to the first display area, the method further comprises importing and storing the image data corresponding to the first display area by:

importing a local image and storing the local image as the image data corresponding to the first display area; or

importing a network image and storing the network image as the image data corresponding to the first display area; or

capturing a real-time image and storing the real-time image as the image data corresponding to the first display area.

3. The method according to any one of claims 1-2, further comprising: setting predefined data corresponding to the image data of the first display area and contained in the communication data of the second display area; and

receiving the predefined data on the second display area and receiving image data corresponding to the predefined data on the first display area.

4. The method according to claim 1, wherein:

the step of receiving the image data corresponding to the first display area comprises identifying a long-press operation on an image and using the long-pressed image as the image data corresponding to the profile picture on the first display area; and

the step of receiving the communication data corresponding to the second display area comprises receiving configuration information for dynamically displaying the communication data corresponding to the second display area.

5. The method according to claim 1, wherein:

the image data corresponding to the first display area contain static image data or dynamic image data corresponding to an image sent for a first time or an image icon corresponding to an image sent for a non-first time; and

the communication data contain at least one of text information data, image information data, and voice information data.

6. The method according to claim 5, wherein the step of sending the image data corresponding to the first display area and the communication data corresponding to the second display area to the receiver comprises: encapsulating the image data and the communication data according to a data type to provide encapsulated data, wherein the image data are encapsulated with a profile picture ID used to identify the image data for displaying the profile picture on the first display area during the IM communication, and

sending the encapsulated data to the receiver.

7. An IM method implemented by a message receiving terminal device, comprising:

receiving data sent from a sender;

parsing the received data to obtain image data corresponding to a first display area and communication data corresponding to a second display area on a communication interface of the message receiving terminal device,

wherein the image data corresponding to the first display area are configured to display a profile picture on the first display area during an IM communication, and wherein the communication data corresponding to the second display area are configured to at least display text during the IM communication; and

displaying the image data on the first display area and displaying the communication data on the second display area.

8. The method according to claim 7, wherein the step of parsing the received data to obtain the image data corresponding to the first display area and the communication data corresponding to the second display area comprises:

parsing to obtain image-type data according to a data type of the received data; and parsing to obtain the image data corresponding to the first display area according to a profile picture ID encapsulated by the image-type data, wherein the received data include the image data corresponding to the first display area, and remaining data in the received data are the communication data corresponding to the second display area.

9. The method according to claim 7, wherein:

the image data corresponding to the first display area contain static image data or dynamic image data corresponding to an image received for a first time or an image icon corresponding to an image received for a non-first time; and

the communication data contain at least one of text information data, image information data, and voice information data.

10. The method according to claim 9, wherein the step of displaying the image data on the first display area comprises:

when the image data corresponding to the first display area include the static image data or the dynamic image data corresponding to the image received for the first time, displaying a default image on the first display area, loading the static image data or the dynamic image data, and displaying the static image or the dynamic image on the first display area; and

when the image data corresponding to the first display area include the image icon corresponding to the image received for the non-first time, obtaining the received image data by a receiver, and displaying the image corresponding to the image data.

11. The method according to claim 7, further comprising: parsing the received data to obtain configuration information for dynamically displaying the communication data corresponding to the second display area;

according to the configuration information for dynamically displaying, dynamically displaying the communication data corresponding to the second display area;

determining whether the communication data of the second display area contain predefined data corresponding to the image data of the first display area; and

when the communication data of the second display area contain the predefined data, displaying image data corresponding to the predefined data on the first display area.

12. An IM terminal device for sending a message, comprising:

a first inputting module, configured to receive image data corresponding to a first display area for displaying a profile picture on the first display area on a communication interface during an IM communication;

a second inputting module, configured to receive communication data corresponding to a second display area for at least displaying text on the second display area on the communication interface during the IM communication; and

a data sending module, configured to send the image data corresponding to the first display area and the communication data corresponding to the second display area to a receiver.

13. The device according to claim 12, further comprising a profile-picture importing module configured to import and store the image data corresponding to the first display area,

wherein the profile-picture importing module is configured to import a local image and store the local image as the image data corresponding to the first display area; or to import a network image and store the network image as the image data corresponding to the first display area; or to capture a real-time image and store the real-time image as the image data corresponding to the first display area.

14. The device according to any one of claims 12-13, further comprising a setting module, wherein:

the setting module is configured to set predefined data corresponding to the image data of the first display area and contained in the communication data of the second display area; and after the second inputting module receives the predefined data, the first inputting module receives image data corresponding to the predefined data.

15. The device according to claim 12, wherein

the first inputting module further comprises a long-press identifying control configured to identify a long-press operation on the image and to use the long-pressed image as the image data corresponding to the profile picture on the first display area; and

the second inputting module is further configured to receive configuration information for dynamically displaying the communication data corresponding to the second display area.

16. The device according to claim 12, wherein:

the image data corresponding to the first display area and inputted by the first inputting module contain static image data or dynamic image data corresponding to an image sent for a first time or an image icon corresponding to an image sent for a non-first time; and the communication data, inputted by the second inputting module, contain at least one of text information data, image information data, and voice information data.

17. The device according to claim 16, wherein the data sending module is configured to encapsulate the image data and the communication data according to a data type to provide encapsulated data and to send the encapsulated data to the receiver,

wherein the image data are encapsulated with a profile picture ID used to identify the image data for displaying the profile picture on the first display area during the IM communication.

18. An IM terminal device for receiving a message, comprising:

a data receiving module, configured to receive data sent from a sender;

a data processing module, configured to parse the received data to obtain image data corresponding to a first display area and communication data corresponding to a second display area on a communication interface,

wherein the image data corresponding to the first display area are configured to display a profile picture on the first display area during an IM communication, and wherein the communication data corresponding to the second display area are configured to display text during the IM communication; and

a displaying module, configured to display the image data on the first display area and to display the communication data on the second display area.

19. The device according to claim 18, wherein the data processing module is configured to parse to obtain image-type data according to a data type of the received data, and to parse to obtain the image data corresponding to the first display area according to a profile picture ID encapsulated by the image-type data, wherein:

the received data include the image data corresponding to the first display area, and remaining data in the received data are the communication data corresponding to the second display area;

the image data corresponding to the first display area contain static image data or dynamic image data corresponding to an image received for a first time or an image icon corresponding to an image received for a non-first time; and

the communication data contain at least one of text information data, image information data, and voice information data.

20. The device according to claim 19, wherein the displaying module is configured:

when the image data corresponding to the first display area include the static image data or the dynamic image data corresponding to the image received for the first time, to display a default image on the first display area, to load the static image data or the dynamic image data, and to display the static image or the dynamic image on the first display area; and

when the image data corresponding to the first display area include the image icon corresponding to the image received for the non-first time, to obtain the received image data by the receiver, and to display the image corresponding to the image data.

21. The device according to claim 18, wherein: the data processing module is further configured to determine whether the communication data of the second display area contain predefined data corresponding to the image data of the first display area;

the displaying module is configured, when the communication data of the second display area contain the predefined data, to display image data corresponding to the predefined data on the first display area;

the data processing module is further configured to parse the received data to obtain configuration information for dynamically displaying the communication data corresponding to the second display area; and

the displaying module is configured, according to the configuration information for dynamically displaying, to dynamically display the communication data corresponding to the second display area.

22. An IM system comprising a sending terminal according to any one of claims 12-17.

23. An IM system comprising a receiving terminal according to any one of claims 18-21.

Description:
METHOD, TERMINAL DEVICE, AND SYSTEM FOR

INSTANT MESSAGING

CROSS-REFERENCES TO RELATED APPLICATIONS

[0001] This application claims priority to Chinese Patent Application No. 2013104006375, filed on September 05, 2013, the entire content of which is incorporated herein by reference.

FIELD OF THE DISCLOSURE

[0002] The present disclosure generally relates to the field of communication

technology and, more particularly, relates to methods, terminal devices, and systems for instant messaging (IM).

BACKGROUND

[0003] Instant Messaging (IM) is an instant communication message technology based on the Internet. Conventionally, most IM sending terminals only need to edit a communication message to be sent, which includes text, emoticons, and pictures. The communication message is then sent to an IM receiving terminal. On a receiving terminal interface, a profile picture display area is usually used to display a profile picture of the user who sends the message. Generally, another display area, often called a "bubble", may be located near this profile picture display area. The bubble display area may generally contain text information and images.

[0004] However, the same profile picture of the user is repeatedly displayed on the profile picture display area, which may waste display area of the IM terminal interface, especially for mobile terminals. In addition, when both the bubble display area and the profile picture display area include pictures, image decoding and displaying have to be performed simultaneously. This may overly occupy system resources and thus affect system performance.

[0005] Thus, there is a need to overcome these and other problems of the prior art and to provide methods, terminal devices, and systems for instant messaging (IM).

BRIEF SUMMARY OF THE DISCLOSURE

[0006] One aspect or embodiment of the present disclosure includes an IM method implemented by a message sending terminal device. In the methods, image data corresponding to a first display area are received for displaying a profile picture on the first display area on a communication interface of the message sending terminal device during an IM communication. Communication data corresponding to a second display area are received for at least displaying text on the second display area on the communication interface during the IM communication. The image data corresponding to the first display area and the communication data

corresponding to the second display area are sent to a receiver.

[0007] Another aspect or embodiment of the present disclosure includes an IM method implemented by a message receiving terminal device. Data sent from a sender are received. The received data are parsed to obtain image data corresponding to a first display area and communication data corresponding to a second display area on a communication interface of the message receiving terminal device. The image data corresponding to the first display area are configured to display a profile picture on the first display area during an IM communication, and the communication data corresponding to the second display area are configured to at least display text during the IM communication. The image data are displayed on the first display area and the communication data are displayed on the second display area.

[0008] Another aspect or embodiment of the present disclosure includes an IM terminal device for sending a message. The IM terminal device includes a first inputting module, a second inputting module, and a data sending module. The first inputting module is configured to receive image data corresponding to a first display area for displaying a profile picture on the first display area on a communication interface during an IM communication. The second inputting module is configured to receive communication data corresponding to a second display area for at least displaying text on the second display area on the communication interface during the IM communication. The data sending module is configured to send the image data corresponding to the first display area and the communication data corresponding to the second display area to a receiver.

[0009] Another aspect or embodiment of the present disclosure includes an IM terminal device for receiving a message. The IM terminal device includes a data receiving module, a data processing module, and a displaying module. The data receiving module is configured to receive data sent from a sender. The data processing module is configured to parse the received data to obtain image data corresponding to a first display area and communication data corresponding to a second display area on a communication interface. The image data corresponding to the first display area are configured to display a profile picture on the first display area during an IM communication. The communication data corresponding to the second display area are configured to display text during the IM communication. The displaying module is configured to display the image data on the first display area and to display the communication data on the second display area.

[0010] Other aspects or embodiments of the present disclosure can be understood by those skilled in the art in light of the description, the claims, and the drawings of the present disclosure.

BRIEF DESCRIPTION OF THE DRAWINGS

[0011] The following drawings are merely examples for illustrative purposes according to various disclosed embodiments and are not intended to limit the scope of the present disclosure.

[0012] FIG. 1 depicts an exemplary instant messenger (IM) method consistent with various disclosed embodiments;

[0013] FIG. 2 depicts another exemplary IM method consistent with various disclosed embodiments;

[0014] FIGS. 3A-3F depict exemplary inputting interfaces of an IM terminal consistent with various disclosed embodiments;

[0015] FIG. 4 depicts an exemplary interface for importing a profile picture consistent with various disclosed embodiments;

[0016] FIG. 5 depicts another exemplary IM method consistent with various disclosed embodiments;

[0017] FIGS. 6A-6B depict exemplary displaying interfaces of an IM terminal consistent with various disclosed embodiments;

[0018] FIG. 7 depicts an exemplary IM terminal consistent with various disclosed embodiments;

[0019] FIG. 8 depicts another exemplary IM terminal consistent with various disclosed embodiments;

[0020] FIG. 9 depicts another exemplary IM terminal consistent with various disclosed embodiments;

[0021] FIG. 10 depicts an exemplary IM system consistent with various disclosed embodiments;

[0022] FIG. 11 depicts another exemplary IM system consistent with various disclosed embodiments;

[0023] FIG. 12 depicts an exemplary environment incorporating certain disclosed embodiments; and

[0024] FIG. 13 depicts an exemplary IM mobile terminal consistent with various disclosed embodiments.

DETAILED DESCRIPTION

[0025] Reference will now be made in detail to exemplary embodiments of the disclosure, which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.

[0026] FIGS. 1-13 depict exemplary methods, terminal devices, and systems for instant messaging (IM) consistent with various disclosed embodiments.

[0027] FIG. 1 depicts an exemplary instant messenger (IM) method consistent with various disclosed embodiments. The exemplary IM method can be implemented by an IM sending terminal (or IM sending terminal device). The IM sending terminal can be a stationary sending terminal such as a desktop computer, or can be a mobile sending terminal such as a personal computer, a mobile phone, a personal digital assistant, etc. The sending terminal can be used for "single player to single player" IM application, or for multiplayer IM application. The multiplayer can be in a group or a discussion sub-group.

[0028] In Step 102, image data corresponding to a first display area are received.

[0029] The image data corresponding to the first display area are configured to display a profile picture on the first display area during an IM communication. That is, the profile picture of the IM user is displayed on the first display area of the IM interface. The received image data corresponding to the first display area inputted by the user can be static image data or dynamic image data corresponding to an image sent for a first time, or an image icon corresponding to an image sent for a non-first time. The static image, e.g., a PNG (Portable Network Graphics) image, can be displayed as a static profile picture; the dynamic image, e.g., a GIF (Graphics Interchange Format) image, can be displayed as a profile picture having an animation effect.

[0030] When the image to be sent is sent for the first time, the static image data or the dynamic image data corresponding to the image can be sent. When the image has been sent more than one time, only the image icon (which has been previously used) corresponding to the image needs to be re-sent. The image icon can include, but is not limited to, a serial number of a pre-stored image and/or an informative digest value calculated according to the image data, such as an MD5 (Message Digest Algorithm 5) hash.

[0031] As such, the image data corresponding to the first display area inputted by the user are received. One user inputting method includes inputting the image data corresponding to the first display area via a button inputting control. For example, referring to FIGS. 3A-3B, when clicking on button 302 for profile picture emoticons (PP emoticons), selection box 304 of profile picture emoticons can be displayed on the interface. When one profile picture in the selection box 304 is selected, the profile picture can be inputted into area 306 at the leftmost side of an input box as shown in FIG. 3B.
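The first-time/non-first-time distinction above can be sketched in code. The following is an illustrative sketch only, not the patented implementation: the full image bytes are sent once, and thereafter only a short digest (the "image icon") identifying the already-transferred image is sent. The names `send_profile_image` and `sent_icons`, and the payload fields, are assumptions introduced for this example.

```python
import hashlib

# Icons (MD5 digests) of images this sender has already transmitted.
sent_icons = set()

def send_profile_image(image_bytes: bytes) -> dict:
    """Build the payload for a profile-picture image, first send or repeat."""
    # The "informative digest value" mentioned in [0030], e.g., an MD5 hash.
    icon = hashlib.md5(image_bytes).hexdigest()
    if icon in sent_icons:
        # Non-first send: only the icon is needed; the receiver can look up
        # the cached image data it stored on the first transfer.
        return {"type": "icon", "icon": icon}
    sent_icons.add(icon)
    # First send: transmit the full static or dynamic image data.
    return {"type": "image", "icon": icon, "data": image_bytes}
```

Sending the digest instead of the full image on repeats is what lets the method save data volume for frequently reused profile-picture emoticons.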

[0032] When the inputted image (e.g., the selected profile picture emoticon or the selected profile picture from the selection box 304) is sent for the first time, the image data corresponding to the image need to be sent. When the image has been sent more than once, only the image icon corresponding to the image needs to be sent when the image is sent again.

[0033] The user inputting method further includes: using a long-press identifying control to input the image data corresponding to the first display area. By identifying a long-press operation on an image by the user, the long-pressed image can be used as the image data corresponding to the profile picture on the first display area. For example, referring to FIG. 3C, when long-pressing the profile picture 308 (e.g., one that has been previously sent), the profile picture 308 can be inputted in the area 310 of the input box. The long-pressing can take about 2 to 3 seconds, for example. Because the profile picture 308 has been sent before, resending the profile picture 308 only needs to send the image icon corresponding to the image (e.g., profile picture 308).

[0034] The inputting method further includes: inputting the image data corresponding to the first display area via an input box control corresponding to the first display area. Referring to FIG. 3D, the input box can be pre-set to include two inputting areas: area 312 and another area. The area 312 is used to input the image data corresponding to the profile picture, while the area to the right of area 312 is configured to input the communication data, such as text, image, etc.

[0035] In Step 104, communication data corresponding to a second display area are received. The communication data corresponding to the second display area are used to display text (also referred to as text body) during IM communication. For example, the text can be displayed in the bubble area of the terminal interface. The received communication data inputted by the user can contain at least one of text information data, image information data, and voice information data. Various methods for inputting communication data can be used. For example, referring to FIG. 3E, text information data of "from bottom of your heart" can be inputted in area 314 of the input box.

[0036] After inputting the image data corresponding to the first display area and the communication data corresponding to the second display area, the image data and the communication data can be sent to a message receiver. Referring to FIG. 3F, when the sender decides or confirms to send, for example, by pressing the "enter" key, the profile picture corresponding to the image data and the communication data can firstly be displayed on the area 316 of the sending terminal interface.

[0037] In one embodiment, when implementing the step of receiving the communication data corresponding to the second display area, an extended function can be included. For example, configuration information for dynamically displaying the communication data corresponding to the second display area, selected and inputted by the user, can be received. When the communication receiver receives the configuration information for dynamically displaying the communication data corresponding to the second display area, the data can be displayed according to the corresponding dynamic displaying effect. For example, the user can select and input configuration information for a verbatim displaying effect for the text information data of the communication corresponding to the second display area, and can select and input configuration information for a gradually changing displaying effect for the static image of the communication corresponding to the second display area.
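As a hedged sketch of the "configuration information for dynamically displaying" idea: the sender could attach a display-effect hint per data type, and the receiver could render accordingly. The function names and the effect names (`verbatim`, `gradual`) below are invented for illustration; the patent does not specify a concrete format.

```python
def build_display_config(text_effect: str = "verbatim",
                         image_effect: str = "gradual") -> dict:
    """Hypothetical per-data-type display-effect hints sent with a message."""
    return {"text": text_effect, "static_image": image_effect}

def render_text(text: str, config: dict) -> list:
    """Return the animation frames a verbatim effect would show.

    A verbatim displaying effect reveals the text one character at a time;
    other (unrecognized) effects fall back to showing the whole text at once.
    """
    if config.get("text") == "verbatim":
        return [text[: i + 1] for i in range(len(text))]
    return [text]
```

A receiver honoring such hints would, for example, display "hi" as the frame sequence "h", "hi" instead of all at once.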

[0038] In Step 106, the image data corresponding to the first display area and the communication data corresponding to the second display area are sent to a receiver.

[0039] After the sender user confirms to send, the sending terminal sends the image data corresponding to the first display area and the communication data corresponding to the second display area to a receiver, according to a pre-set communication protocol. The sending method includes: directly sending to the receiver by the sender, or transmitting to the receiver via a server.

[0040] In the disclosed IM method, by inputting the image data corresponding to the first display area, inputting the communication data corresponding to the second display area, and sending the image data and the communication data to the receiver, each time information is sent, the sender user can select image data corresponding to a different profile picture. The terminal can then be effectively used, especially for the limited display interface of mobile terminals. Conventional operations of processing the repeatedly displayed static profile picture can be omitted. IM performance and human-device interaction can be improved.
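The encapsulation described in claim 6 (and its parsing counterpart in claim 8) can be sketched as follows. This is a minimal illustration, not the actual wire protocol: the concrete field names, the JSON encoding, and the `PROFILE_PICTURE_ID` marker value are all assumptions. The key idea is that the image data carry a profile picture ID so the receiver can distinguish first-display-area (profile picture) image data from ordinary bubble content.

```python
import json

# Assumed marker identifying image data destined for the first display area.
PROFILE_PICTURE_ID = "profile_picture"

def encapsulate(image_data: bytes, communication_data: str) -> bytes:
    """Sender side: encapsulate both kinds of data according to data type."""
    packet = {
        "image": {
            "data_type": "image",
            "id": PROFILE_PICTURE_ID,      # marks data for the first display area
            "payload": image_data.hex(),   # image bytes, hex-encoded for JSON
        },
        "communication": {
            "data_type": "text",
            "payload": communication_data, # shown in the second display area
        },
    }
    return json.dumps(packet).encode()

def parse(raw: bytes):
    """Receiver side: split the received data back out by data type."""
    packet = json.loads(raw)
    image = bytes.fromhex(packet["image"]["payload"])
    text = packet["communication"]["payload"]
    return image, text
```

With this shape, the remaining (non-profile-picture) data in a packet are, as claim 8 puts it, the communication data for the second display area.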

[0041] For example, conventionally, when a sender sends an image, the receiver needs to process displaying the profile picture in the first display area and also needs to process displaying the image sent by the sender in the second display area. In the disclosed IM method, the image sent by the sender (e.g., an emoticon image) can be placed in the first display area, which omits the operation of processing and displaying the static profile picture in the first display area to improve IM performance. Of course, in the second display area, other images may or may not be inputted. In one embodiment, the image data corresponding to the first display area contain static image data or dynamic image data corresponding to an image sent for a first time, or an image icon corresponding to an image sent for a non-first time. The disclosed IM method effectively improves the image sending speed and reduces the data volume.

[0042] FIG. 2 depicts another exemplary IM method consistent with various disclosed embodiments. The exemplary IM method can be implemented by an IM sending terminal. The IM sending terminal can be a stationary sending terminal such as a desktop computer, or can be a mobile sending terminal such as a personal computer, a mobile phone, a personal digital assistant, etc. The sending terminal can be used for "single player to single player" IM application, or for multiplayer IM application. The multiplayer can be in a group or a discussion sub-group.

[0043] In Step 202, the image data corresponding to the first display area can be imported and stored.

[0044] The importing and storing of the image data corresponding to the first display area can include, but is not limited to: importing a local image and storing the local image as the image data corresponding to the first display area; or importing a network image and storing the network image as the image data corresponding to the first display area; or capturing a real-time image and storing the real-time image as the image data corresponding to the first display area. In one embodiment, the image data corresponding to the first display area can be imported via the profile picture emoticon importing control.

[0045] For example, as shown in FIG. 4, in an importing interface 402, an image made by a third party or an image within the terminal can be imported by clicking on the button "Click to import customized profile picture". An image can be captured as the image data corresponding to the first display area by clicking on the button "Click to capture customized profile picture".

[0046] In Step 204, image data corresponding to the first display area are received.

[0047] The image data corresponding to the first display area are configured to display a profile picture on the first display area during an IM communication. That is, the profile picture of the IM user is displayed on the first display area of the IM interface. The received image data corresponding to the first display area inputted by the user can be static image data or dynamic image data corresponding to an image sent for a first time, or an image icon corresponding to an image sent for a non-first time. Various inputting methods can be used to input the image data corresponding to the first display area by a user, e.g., as described in FIGS. 3A-3B and in Step 102.

[0048] In one embodiment, when implementing the step of receiving the communication data corresponding to the second display area, an extended function can be included. For example, configuration information for dynamically displaying the communication data corresponding to the second display area, selected and inputted by the user, can be received. When the communication receiver receives the configuration information for dynamically displaying the communication data corresponding to the second display area, the data can be displayed according to the corresponding dynamic displaying effect. For example, the user can select and input configuration information for a verbatim displaying effect for the text information data of the communication corresponding to the second display area, and can select and input configuration information for a gradually changing displaying effect for the static image of the communication corresponding to the second display area.

[0049] In Step 206, communication data corresponding to a second display area are received. The communication data corresponding to the second display area are used to display text (also referred to as a text body) during the IM communication. The received communication data inputted by the user can contain at least one of text information data, image information data, and voice information data. Exemplary inputting methods can refer to FIG. 3E and Step 104.

[0050] In various embodiments, before Step 204 and Step 206 are implemented, predefined data that are contained in the communication data of the second display area and that correspond to the image data of the first display area can be set. After the predefined data are received on the second display area, the image data corresponding to the predefined data can be received on the first display area. For example, a corresponding relationship between the text information data "uhn" of the communication data and certain image data corresponding to the first display area can be predefined. When the text information data "uhn" inputted by the user are received, the image data of the first display area corresponding to "uhn" can be automatically received and displayed as the correspondingly inputted profile picture in the first display area.
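The predefined correspondence between text triggers and first-display-area image data might be sketched as a simple lookup; the trigger table, the identifiers, and the function name below are illustrative assumptions, not taken from the disclosure:

```python
# Hypothetical mapping from predefined text triggers to profile-picture IDs.
# Only the trigger "uhn" appears in the text; the IDs are made up for the sketch.
TRIGGER_TO_PICTURE = {
    "uhn": "pic_nod_001",
    "haha": "pic_laugh_002",
}

def pick_profile_picture(text, default_id="pic_default"):
    """Return the profile-picture ID for the first display area.

    If the inputted message text matches a predefined trigger, the mapped
    picture is selected automatically; otherwise the default picture is kept.
    """
    return TRIGGER_TO_PICTURE.get(text.strip(), default_id)
```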

[0051] In Step 208, the image data corresponding to the first display area and the communication data corresponding to the second display area are sent to a receiver.

[0052] After the sender user confirms the sending, the sending terminal sends the image data corresponding to the first display area and the communication data corresponding to the second display area to a receiver, according to a pre-set communication protocol. The sending method includes: sending directly to the receiver, or transmitting to the receiver via a server.

[0053] In one embodiment, the sender can encapsulate the image data and the communication data according to a data type to provide encapsulated data. The image data are encapsulated with a profile picture ID used to identify the image data for displaying a profile picture on the first display area during the IM communication. The encapsulated data can be sent to the receiver. For example, the data type can be used to identify data as, e.g., text, image, voice, etc. for the communication.
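As an illustration only, the encapsulation described above might be sketched as follows; the field names (`type`, `content`, `profile_picture_id`) and the JSON wire format are assumptions, since the disclosure does not fix a concrete format:

```python
import json

def encapsulate(data_type, content, profile_picture_id=None):
    """Package one piece of communication data together with its data type.

    Image data additionally carry a profile picture ID, so the receiver
    can tell which image is meant for the first display area.
    """
    message = {"type": data_type, "content": content}
    if data_type == "image" and profile_picture_id is not None:
        message["profile_picture_id"] = profile_picture_id
    return json.dumps(message)
```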

[0054] For images, a parameter "FaceSticker" can be added. When "FaceSticker"=true, that is, when the value of "FaceSticker" is a true value, the image corresponding to the profile picture ID can be used for profile-picture display in the first display area. When "FaceSticker"=false, that is, when the value of "FaceSticker" is a false value, the image corresponding to the profile picture ID can be used for display in the second display area.
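The routing implied by the "FaceSticker" parameter can be sketched as follows; the message is assumed to be a dictionary for illustration, as the disclosure specifies only the flag itself:

```python
def display_area_for_image(message):
    """Route an image message to a display area using the FaceSticker flag.

    FaceSticker=true  -> the image identified by the profile picture ID
                         is shown as the profile picture (first area).
    FaceSticker=false -> the image is ordinary message content (second area).
    """
    if message.get("FaceSticker") is True:
        return "first"   # profile-picture area
    return "second"      # message-body / bubble area
```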

[0055] In addition, in some embodiments, the profile picture can be directly defined as a data type, for example, as the profile-picture type "faceimage". As such, the data types sent from the sender include text, voice, image, and profile picture, while data of the image type are displayed in the second display area, e.g., in bubble areas. In this manner, data can be identified directly from the data type. For example, <tag type=sound>content</tag> identifies the data content as data of the voice type, and <tag type=faceimage>content</tag> identifies the data content as data of the profile-picture type.

[0056] The disclosed IM methods, terminal devices, and systems can be used to input the image data corresponding to the first display area and the communication data corresponding to the second display area on a communication interface of a terminal device during an IM communication, and then to send the image data and the communication data to the receiver. Each time messages are sent, the sender can select image data corresponding to a different profile picture. The terminal interface space can then be effectively used, especially for mobile terminals with limited interface space for displaying. Conventional operations for processing a repeatedly displayed static profile picture can thus be omitted. IM performance and human-device interactivity can be improved.

[0057] In addition, the image data corresponding to the first display area can contain static image data or dynamic image data corresponding to an image sent for a first time, or an image icon corresponding to an image sent for a non-first time. The disclosed IM methods, terminal devices, and systems can effectively improve image sending speed and save data volume. Further, before the image data corresponding to the first display area are sent, the image data can be imported and stored, which is convenient for importing a large amount of image data. Processing efficiency can then be improved.
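A minimal parser for tag-identified data of this kind might look as follows; the exact tag syntax `<tag type=...>content</tag>` is an assumption reconstructed from the examples in the text:

```python
import re

# Pattern for the illustrative tag format, e.g. <tag type=sound>content</tag>.
# The syntax is reconstructed from the examples; it is not a fixed wire format.
TAG_RE = re.compile(r"<tag\s+type=(\w+)>(.*?)</tag>", re.DOTALL)

def parse_tagged(data):
    """Split a tagged payload into (data_type, content) pairs."""
    return TAG_RE.findall(data)
```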

[0058] FIG. 5 depicts another exemplary IM method consistent with various disclosed embodiments. The exemplary IM method can be implemented by an IM receiving terminal. The IM receiving terminal can be a stationary receiving terminal such as a desktop computer, or can be a mobile receiving terminal such as a personal computer, a mobile phone, a personal digital assistant, etc. The receiving terminal can be used for a "single player to single player" IM application, or for a multiplayer IM application. The multiplayer application can be in a group or a discussion sub-group.

[0059] In Step 502, data sent from a sender are received. According to the pre-set communication protocol, the receiving terminal receives the data sent from the sender. The receiving terminal can receive the data directly from the sender, or can receive the data transmitted via a server.

[0060] In Step 504, the received data are parsed to obtain image data corresponding to a first display area and communication data corresponding to a second display area.

[0061] According to the pre-set encapsulating protocol, the receiving terminal parses the received data to obtain image data corresponding to the first display area and communication data corresponding to the second display area. The image data corresponding to the first display area are configured to display a profile picture on the first display area during an IM communication, and the communication data corresponding to the second display area are configured to display text during the IM communication.

[0062] In one embodiment, according to the data type of the received data, the receiving terminal parses the data to obtain the image-type data and further to obtain the text, voice, etc. Then, according to the profile picture ID encapsulated with the image-type data, the receiving terminal parses the data to obtain the image data corresponding to the first display area.

[0063] For example, for the image-type data, a parameter "FaceSticker" can be added. When "FaceSticker"=true, that is, when the value of "FaceSticker" is a true value, the image corresponding to the profile picture ID is used to display the profile picture in the first display area. When "FaceSticker"=false, that is, when the value of "FaceSticker" is a false value, the image corresponding to the profile picture ID is identified as an image to be displayed in the second display area.

[0064] In addition, in one embodiment, the profile picture can be directly defined as a certain data type, for example, as the profile-picture type "faceimage". The data types sent from the sending terminal then include text, voice, image, and profile picture, and the image-type data are configured to be displayed in the second display area, such as in the bubble areas. The receiving terminal can thus parse the received data directly from the data type. For example, <tag type=sound>content</tag> identifies the data content as data of the voice type, and <tag type=faceimage>content</tag> identifies the data content as data of the profile-picture type.

[0065] In Step 506, the image data corresponding to the first display area and the communication data corresponding to the second display area are displayed.

[0066] The image data corresponding to the first display area can be static image data or dynamic image data corresponding to an image sent for a first time, or an image icon corresponding to an image sent for a non-first time. A static image (e.g., PNG) can be displayed as a static profile picture, and a dynamic image (e.g., GIF) can be displayed as a profile picture having an animation effect. The image icon can be, e.g., a serial number of a pre-stored image or a message digest value calculated from the image data, such as an MD5 value. The communication data contain at least one of text information data, image information data, and voice information data.
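The image icon for a non-first-time send can be, per the text, a digest such as MD5 computed over the image data. A minimal sketch, assuming raw image bytes as input:

```python
import hashlib

def image_icon(image_bytes):
    """Compute a compact icon (MD5 digest) for previously sent image data.

    For a non-first-time send, only this short identifier travels over the
    wire; the receiver looks the full image up in its local storage.
    """
    return hashlib.md5(image_bytes).hexdigest()
```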

[0067] FIGS. 6A-6B depict exemplary displaying interfaces of an IM terminal consistent with various disclosed embodiments. As shown in FIG. 6A, the image data corresponding to the first display area are displayed. That is, the profile picture is displayed in the first display area. When the image data corresponding to the first display area are static image data or dynamic image data corresponding to an image sent for a first time, a default image 602 can first be displayed in the first display area of the receiving terminal display interface. The default image 602 can be, e.g., a pre-defined image/picture used to indicate that the image data are loading in the first display area. The receiving terminal loads the static image data or the dynamic image data from a server or a sending terminal. After the data are loaded, the corresponding static image or dynamic image 604 can be displayed in the first display area, as shown in FIG. 6B.

[0068] When the image data corresponding to the first display area are an image icon corresponding to an image sent for a non-first time, the receiving terminal directly obtains the previously received image data corresponding to the icon, and displays the corresponding image in the first display area.

[0069] The communication data corresponding to the second display area can be displayed. For example, the communication data corresponding to the second display area can be displayed in bubble areas including, e.g., text, images, etc. A voice message can be displayed as an icon. By clicking on this icon, the voice message can be played by the receiving terminal.

[0070] In addition, when the communication data corresponding to the second display area are displayed, an extended function can be included. For example, it can be determined whether the communication data corresponding to the second display area contain the pre-defined data corresponding to the image data of the first display area. When the communication data corresponding to the second display area contain the pre-defined data, the image data corresponding to the pre-defined data can be displayed in the first display area.

[0071] Further, the received data can be parsed to obtain the configuration information for dynamically displaying the communication data corresponding to the second display area. According to the configuration information, the communication data corresponding to the second display area can be dynamically displayed. For example, when text is displayed, the text can be displayed verbatim (character by character). When an image is displayed, the image can be displayed with a gradually-changing effect.

[0072] In the disclosed IM method, the receiving terminal receives the data sent from the sender, parses the data to obtain the image data corresponding to the first display area and the communication data corresponding to the second display area, and displays the image data corresponding to the first display area and the communication data corresponding to the second display area. Different images can thus be displayed in the first display area. By displaying different profile pictures during the communication, the terminal interface, especially the limited display interface of mobile terminals, can be effectively utilized. Interactivity of the terminal can be improved.
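The verbatim (character-by-character) displaying effect can be sketched as a generator of successive display states; the rendering loop and timing between frames are left out as assumptions of the sketch:

```python
def verbatim_frames(text):
    """Yield successive display states for verbatim rendering of text
    in the second display area; a rendering loop would show each state
    with a short delay to produce the character-by-character effect.
    """
    for i in range(1, len(text) + 1):
        yield text[:i]
```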

[0073] FIG. 7 depicts an exemplary IM terminal (or IM terminal device) consistent with various disclosed embodiments. The IM terminal can be used as a message sending terminal 700. The sending terminal 700 can be, e.g., a desktop computer, a personal computer, a mobile phone, a personal digital assistant, etc.

[0074] The exemplary sending terminal 700 includes: a first inputting module 702, a second inputting module 704, and a data sending module 706.

[0075] The first inputting module 702 is configured to receive image data corresponding to a first display area. The image data corresponding to the first display area are configured to display a profile picture on the first display area during an IM communication.

[0076] The second inputting module 704 is configured to receive communication data corresponding to a second display area. The communication data corresponding to the second display area are configured to display text during the IM communication.

[0077] The data sending module 706 is configured to send the image data corresponding to the first display area and the communication data corresponding to the second display area to a receiving terminal device.

[0078] The image data corresponding to the first display area, inputted by a user and received by the first inputting module 702, can include static image data or dynamic image data corresponding to an image sent for a first time, or an image icon corresponding to an image sent for a non-first time. The communication data, received by the second inputting module 704 and inputted by the user, can contain at least one of text information data, image information data, and voice information data.

[0079] In addition, the first inputting module 702 can contain, e.g., a long-press identifying control configured to identify a long-press operation on an image by the user and to use the long-pressed image as the image data corresponding to the profile picture on the first display area. The second inputting module 704 is further configured to receive the configuration information for dynamically displaying the communication data corresponding to the second display area.

[0080] The data sending module 706 is configured to encapsulate the image data and the communication data according to a data type to provide encapsulated data and to send the encapsulated data to the receiver. The image data can be encapsulated with a profile picture ID used to identify the image data for displaying a profile picture on the first display area during the IM communication.

[0081] FIG. 8 depicts another exemplary IM terminal consistent with various disclosed embodiments. The exemplary IM terminal can be a sending terminal 800. The sending terminal 800 can be, e.g., a desktop computer, a personal computer, a mobile phone, a personal digital assistant, etc.

[0082] The sending terminal 800 includes a profile-picture importing module 802, a first inputting module 806, a second inputting module 808, and/or a data sending module 810.

[0083] The profile-picture importing module 802 is configured to import and store image data corresponding to a first display area. For example, the profile-picture importing module 802 can import a local image and store the local image as the image data corresponding to the first display area; import a network image and store the network image as the image data corresponding to the first display area; or capture a real-time image and store the real-time image as the image data corresponding to the first display area.

[0084] The first inputting module 806 is configured to receive the image data corresponding to the first display area. The image data corresponding to the first display area are configured to display a profile picture on the first display area during an IM communication.

[0085] The second inputting module 808 is configured to receive communication data corresponding to a second display area. The communication data corresponding to the second display area are configured to display text during the IM communication.

[0086] The data sending module 810 is configured to send the image data corresponding to the first display area and the communication data corresponding to the second display area to a receiving terminal device.

[0087] In one embodiment, the image data corresponding to the first display area, received by the first inputting module 806 and inputted by a user, can be static image data or dynamic image data corresponding to an image sent for a first time, or an image icon corresponding to an image sent for a non-first time. The communication data, received by the second inputting module 808 and inputted by the user, can contain at least one of text information data, image information data, and voice information data.

[0088] In addition, the first inputting module 806 can contain, e.g., a long-press identifying control configured to identify a long-press operation on an image by a user and to use the long-pressed image as the image data corresponding to the profile picture on the first display area.

[0089] The second inputting module 808 is further configured to receive configuration information for dynamically displaying the communication data corresponding to the second display area.

[0090] In this embodiment, the sending terminal 800 can include extended functions. The sending terminal 800 can further include a setting module 804 configured to set the predefined data that are contained in the communication data of the second display area and that correspond to the image data of the first display area.

[0091] After the second inputting module 808 receives the predefined data on the second display area, the first inputting module 806 can automatically receive the corresponding image data. For example, the setting module 804 can pre-define a corresponding relationship between the text information data "uhn" of the communication data and certain image data corresponding to the first display area. When the text information data "uhn" inputted by a user are received, the image data of the first display area corresponding to "uhn" can be automatically received and displayed as the correspondingly inputted profile picture in the first display area.

[0092] The data sending module 810 is configured to encapsulate the image data and the communication data according to a data type to provide encapsulated data and to send the encapsulated data to the receiver. The image data are encapsulated with a profile picture ID used to identify the image data for displaying a profile picture on the first display area during the IM communication.

[0093] FIG. 9 depicts another exemplary IM terminal consistent with various disclosed embodiments. The exemplary IM terminal can include a receiving terminal 900 including, e.g., a desktop computer, a personal computer, a mobile phone, a personal digital assistant, etc.

[0094] The receiving terminal 900 includes: a data receiving module 902, a data processing module 904, and/or a displaying module 906. The data receiving module 902 is configured to receive data sent from a sender.

[0095] The data processing module 904 is configured to parse the received data to obtain image data corresponding to a first display area and communication data corresponding to a second display area. The image data corresponding to the first display area are configured to display a profile picture on the first display area during an IM communication, and the communication data corresponding to the second display area are configured to display text during the IM communication.

[0096] The displaying module 906 is configured to display the image data corresponding to the first display area and the communication data corresponding to the second display area.

[0097] In one embodiment, the data processing module 904 is configured to parse the received data to obtain image-type data according to the data type of the received data, and to parse the image-type data to obtain the image data corresponding to the first display area according to the profile picture ID encapsulated with the image-type data. In some cases, the received data include the image data corresponding to the first display area, and the remaining data in the received data can be the communication data corresponding to the second display area.

[0098] The image data corresponding to the first display area can contain static image data or dynamic image data corresponding to an image received for a first time, or an image icon corresponding to an image received for a non-first time. The communication data can contain at least one of text information data, image information data, and voice information data.

[0099] In this embodiment, the displaying module 906 is configured, when the image data corresponding to the first display area are the static image data or the dynamic image data corresponding to the image received for the first time, to display a default image on the first display area, load the static image data or the dynamic image data, and display the static image or the dynamic image on the first display area. The displaying module 906 is further configured, when the image data corresponding to the first display area are the image icon corresponding to the image received for the non-first time, to obtain the corresponding image data stored on the receiver and to display the image corresponding to the image data.
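The display flow of the displaying module on the receiving side might be sketched as follows; the `cache`, `loader`, and `show` interfaces are hypothetical stand-ins for the receiver's local storage, network loading, and rendering, and are not part of the disclosure:

```python
def render_first_area(image_message, cache, loader, show):
    """Display flow for the first display area on the receiving terminal.

    Non-first-time images arrive as an icon (e.g., a digest) and are
    fetched from the local cache. First-time images are shown as a
    default placeholder while the real image loads, then replaced.
    """
    if "icon" in image_message:                   # image received before
        show(cache[image_message["icon"]])
        return
    show("default_image")                         # placeholder (cf. image 602)
    show(loader(image_message["image_ref"]))      # loaded image (cf. image 604)
```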

[00100] The data processing module 904 is further configured to determine whether the communication data of the second display area contain predefined data corresponding to the image data of the first display area. The displaying module 906 is configured, when the communication data of the second display area contain the predefined data, to display image data corresponding to the predefined data on the first display area.

[00101] In addition, the data processing module 904 is further configured to parse the received data to obtain the configuration information for dynamically displaying the communication data corresponding to the second display area. The displaying module 906 is further configured, according to the configuration information for dynamic displaying, to dynamically display the communication data corresponding to the second display area, for example, to display text verbatim.

[00102] FIG. 10 depicts another exemplary IM system consistent with various disclosed embodiments. The exemplary IM system includes a sending terminal 1002 and a receiving terminal 1004. The sending terminal 1002 can be any sending terminal as described above. The receiving terminal 1004 can be any receiving terminal as described above.

[00103] FIG. 11 depicts another exemplary IM system consistent with various disclosed embodiments. The exemplary system includes a sending terminal 1102, a server 1104, and a receiving terminal 1106. The sending terminal 1102 can be any sending terminal as described above. The receiving terminal 1106 can be any receiving terminal as described above. The server 1104 can be configured to receive the image data corresponding to the first display area and the communication data corresponding to the second display area sent from the sending terminal 1102. The server 1104 can further be configured to transmit the image data corresponding to the first display area and the communication data corresponding to the second display area to the receiving terminal 1106.

[00104] FIG. 12 depicts an exemplary environment 1200 incorporating certain disclosed embodiments. As shown in FIG. 12, the environment 1200 can include a server 1204, a terminal (or client) 1206, and a communication network 1202. The server 1204 and the client 1206 may be coupled through the communication network 1202 for information exchange, such as webpage browsing, Internet searching, data downloading, etc. Although only one client 1206 and one server 1204 are shown in the environment 1200, any number of clients 1206 or servers 1204 may be included, and other devices may also be included.

[00105] The communication network 1202 may include any appropriate type of communication network for providing network connections to the server 1204 and the client 1206, or among multiple servers 1204 or clients 1206. For example, the communication network 1202 may include the Internet or other types of computer networks or telecommunication networks, either wired or wireless.

[00106] The terminal 1206 can refer to any appropriate user terminal or terminal device with certain computing capabilities, such as a personal computer (PC), a workstation computer, a server computer, a hand-held computing device (tablet), a smart phone or mobile phone, or any other user-side computing device. The terminal 1206 can include one or more terminal devices, e.g., as shown in FIGS. 7-11. The server 1204 can refer to one or more server computers configured to provide certain server functionalities, such as database management and search engines. A server may also include one or more processors to execute computer programs in parallel.

[00107] In operation, terminal 1206 may cause server 1204 to perform certain actions, such as an Internet search or other database operations. Server 1204 may be configured to provide structures and functions for such actions and operations. More particularly, server 1204 may include a data transmitting system for real-time database transmission.

[00108] Suitable software and/or hardware may be included and used in the disclosed methods, terminal devices, and/or systems. For example, the disclosed embodiments can be implemented by hardware alone, or alternatively by software products alone. The software products can be stored in a computer-readable storage medium including, e.g., ROM/RAM, a magnetic disk, an optical disk, etc. The software products can include suitable commands to enable a terminal device (e.g., a mobile phone, a personal computer, a server, or a network device, etc.) to implement the disclosed embodiments.

[00109] Various embodiments further include an IM mobile terminal. The exemplary IM mobile terminal can include a mobile phone, a tablet computer, a PDA (personal digital assistant), a POS (point of sale) device, an in-vehicle computer, etc.

[00110] FIG. 13 depicts at least a portion of an exemplary mobile terminal. As shown in FIG. 13, the exemplary terminal 1300 can include an RF (Radio Frequency) circuit 1310, a storage device 1320 including one or more computer-readable storage media, an input unit 1330, a display unit 1340, a sensor 1350, an audio circuit 1360, a transmission module 1370, a processor 1380 including one or more processing cores, a power supply 1390, and/or other components. In various embodiments, the terminal(s) described herein can include more or fewer components than depicted in FIG. 13. Certain components/parts can be omitted, combined, replaced, and/or added.

[00111] The RF circuit 1310 can be used to send/receive information or to send/receive signals during communication. In particular, after downlink information is received from a base station, the information can be processed by the one or more processors 1380. Further, data related to the uplink can be sent to the base station. Generally, the RF circuit 1310 can include, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a subscriber identity module (SIM) card, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, etc. In addition, the RF circuit 1310 can communicate with other devices via a wireless communication network. The wireless communication can use any communication standards or protocols, including, but not limited to, GSM (Global System for Mobile Communications), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, and SMS (Short Messaging Service).

[00112] The storage device 1320 can be used for storing software programs and modules, such as the software programs and modules disclosed herein. By running the software programs and modules stored in the storage device 1320, the processor 1380 can perform various functional applications and data processing to achieve IM communications. The storage device 1320 can include a program storage area and a data storage area. The program storage area can store the operating system and the applications (such as sound playback, image playback, etc.) required by at least one function. The data storage area can store data (such as audio data, a phone book, etc.) created when the terminal is used. In addition, the storage device 1320 can include a high-speed random access memory and a non-volatile memory. For example, the storage device 1320 can include at least one disk memory, flash memory, and/or other volatile solid-state memory elements. Accordingly, the storage device 1320 can further include a memory controller to provide the processor 1380 and the input unit 1330 with access to the storage device 1320.

[00113] The input unit 1330 can be used to receive inputted numeric or character information, and to generate keyboard, mouse, joystick, trackball, or optical signal input related to user settings and function controls. Specifically, the input unit 1330 can include a touch control panel 1331 and other input device(s) 1332. The touch control panel 1331, also known as a touch-sensitive surface, touch screen, or touch panel, can collect touch operations that a user conducts on or near the touch-sensitive surface 1331. For example, a user can use a finger, a stylus, or any other suitable object or attachment on the touch-sensitive surface 1331 or on an area near the touch-sensitive surface 1331. The touch-sensitive surface 1331 can drive a connecting device based on a preset program. Optionally, the touch control panel 1331 can include a touch detection device and a touch controller. The touch detection device can detect the user's touch position, detect a signal due to a touch operation, and send the signal to the touch controller. The touch controller can receive touch information from the touch detection device, convert the touch information into contact coordinates to send to the processor 1380, and receive and execute commands sent from the processor 1380. Furthermore, the touch control panel 1331 can be realized by resistive, capacitive, infrared, surface acoustic wave, and/or other types of touch sensing. In addition to the touch control panel 1331, the input unit 1330 can also include other input device(s) 1332. Specifically, the other input device(s) 1332 can include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), a trackball, a mouse, an operating lever, or combinations thereof.

[00114] The display unit 1340 can be used to display information inputted by the user, information provided to the user, and a variety of graphical user interfaces of the terminal 1300. These graphical user interfaces can be formed by images, text, icons, videos, and/or any combinations thereof. The display unit 1340 can include a display panel 1341 configured as, e.g., an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode) display, etc. Further, the touch control panel 1331 can cover the display panel 1341. When the touch control panel 1331 detects a touch operation on or near the touch-sensitive surface, the touch operation can be sent to the processor 1380 to determine the type of the touch operation. Accordingly, the processor 1380 can provide a visual output on the display panel 1341. Although in FIG. 13 the touch-sensitive surface 1331 and the display panel 1341 are shown as two separate components to achieve input and output functions, in some embodiments, the touch control panel 1331 and the display panel 1341 can be integrated to perform the input and output functions.

[00115] The terminal 1300 in FIG. 13 can further include at least one sensor 1350, such as optical sensors, motion sensors, and other suitable sensors. Specifically, the optical sensors can include an ambient light sensor and a proximity sensor. The ambient light sensor can adjust the brightness of the display panel 1341 according to the brightness of ambient light. The proximity sensor can turn off the display panel 1341 and/or the backlight when the terminal 1300 moves close to an ear. As a type of motion sensor, a gravity sensor can detect the magnitude of acceleration in each direction (e.g., along three axes) and can detect the magnitude and direction of gravity when stationary. The gravity sensor can be used to identify phone posture (for example, switching between horizontal and vertical screens, related games, magnetometer calibration posture, etc.) and/or vibration-recognition-related functions (e.g., pedometer, percussion, etc.). The terminal 1300 can also be configured with, e.g., a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and/or other sensors.

[00116] The audio circuit 1360 can include an audio input device 1361 such as a microphone and an audio output device 1362 such as a speaker, and can provide an audio interface between the user and the terminal 1300. The audio circuit 1360 can transmit an electrical signal converted from received audio data to the speaker 1362, which converts it into an audio signal for output. On the other hand, the microphone 1361 can convert a collected sound signal into an electrical signal, which can be received by the audio circuit 1360 and converted into audio data. The audio data can be output to the processor 1380 for processing and then transmitted via the RF circuit 1310 to, e.g., another terminal. Alternatively, the audio data can be output to the storage device 1320 for further processing. The audio circuit 1360 can also include an earphone jack to provide communication between a peripheral headset and the terminal 1300.

[00117] The terminal 1300 can use the transmission module 13130 to help users send/receive emails, browse websites, access streaming media, etc. The transmission module 13130 can provide users with wireless or wired broadband Internet access. In various embodiments, the transmission module 13130 can be configured within or outside of the terminal 1300 as depicted in FIG. 13.

[00118] The processor 1380 can be a control center of the terminal 1300: using a variety of interfaces and circuits to connect various parts, e.g., within a mobile phone; running or executing software programs and/or modules stored in the storage device 1320; calling the data stored in the storage device 1320; and/or performing various functions and data processing of the terminal 1300, e.g., to monitor the overall mobile phone. Optionally, the processor 1380 can include one or more processing cores. In an exemplary embodiment, the processor 1380 can integrate an application processor with a modulation and demodulation processor. The application processor mainly handles the operating system, user interface, and applications. The modulation and demodulation processor mainly handles wireless communications. In various embodiments, the modulation and demodulation processor may or may not be integrated into the processor 1380.

[00119] The terminal 1300 can further include a power supply 1390 (such as a battery) to power the various components of the terminal. In an exemplary embodiment, the power supply can be connected to the processor 1380 via a power management system, which can thus manage charging, discharging, and/or power management functions. The power supply 1390 can also include one or more DC or AC power supplies, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and/or any other suitable components. Although not shown in FIG. 13, the terminal 1300 can further include a camera, a Bluetooth module, etc. without limitations.

[00120] The processor(s) 1380 of the terminal 1300 can load executable files corresponding to the processes of one or more programs into the storage device 1320. The processor(s) 1380 can then run these one or more programs stored in the storage device 1320. For example, the processor(s) 1380 can cause the exemplary mobile terminal to perform disclosed IM methods.

[00121] In the disclosed IM methods, image data corresponding to a first display area are received for displaying a profile picture on the first display area on a communication interface of the message sending terminal device during an IM communication. Communication data corresponding to a second display area are received for at least displaying text on the second display area on the communication interface during the IM communication. The image data corresponding to the first display area and the communication data corresponding to the second display area are sent to a receiver.
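The send-side flow of paragraph [00121] can be sketched as follows. This is a minimal illustration only: the disclosure defines no concrete API, so the function and field names (`build_outgoing_message`, `first_area`, `second_area`, `send_to_receiver`) are assumptions.

```python
# Illustrative sketch of the send-side IM flow in [00121].
# All names are hypothetical; the disclosure does not define an API.

def build_outgoing_message(first_area_image: bytes, second_area_text: str) -> dict:
    """Combine profile-picture image data (first display area) with
    communication data (second display area) into one message."""
    return {
        "first_area": {"image_data": first_area_image},
        "second_area": {"text": second_area_text},
    }

def send_to_receiver(message: dict, transport: list) -> None:
    # 'transport' stands in for the terminal's RF/transmission module.
    transport.append(message)

# Usage: a minimal in-memory "transport" receives the combined message.
outbox = []
msg = build_outgoing_message(b"\x89PNG...", "hello")
send_to_receiver(msg, outbox)
```

In an actual terminal, the transport would be the RF circuit 1310 or transmission module 13130 rather than a list.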

[00122] For example, before the image data corresponding to the first display area are received, the image data can be imported and stored by: importing a local image and storing the local image as the image data corresponding to the first display area; importing a network image and storing the network image as the image data corresponding to the first display area; or capturing a real-time image and storing the real-time image as the image data corresponding to the first display area.

[00123] Optionally, predefined data can be set corresponding to the image data of the first display area and contained in the communication data of the second display area. The predefined data can be received on the second display area, and image data corresponding to the predefined data can be received on the first display area.
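The three import paths of paragraph [00122] (local image, network image, real-time capture) can be sketched as a single storing routine. The source labels and the `store` structure below are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch of [00122]: importing and storing image data for the
# first display area from one of three sources. Names are illustrative.

def import_profile_image(source: str, payload: bytes, store: dict) -> None:
    """Store image bytes tagged by their source; 'source' is one of
    'local' (local image), 'network' (network image), or
    'realtime' (image captured in real time)."""
    if source not in ("local", "network", "realtime"):
        raise ValueError("unknown image source: " + source)
    store["first_area_image"] = {"source": source, "data": payload}

store = {}
import_profile_image("local", b"<image bytes>", store)
```

Storing ahead of time in this way is what lets the later send step reference already-imported image data.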

[00124] When receiving the image data corresponding to the first display area, a long-press operation on an image can be identified, and the long-pressed image can be used as the image data corresponding to the profile picture on the first display area. When receiving the communication data corresponding to the second display area, configuration information can be received for dynamically displaying the communication data corresponding to the second display area.

[00125] The image data corresponding to the first display area can contain static image data or dynamic image data corresponding to an image sent for a first time, or an image icon corresponding to an image sent for a non-first time. The communication data can contain at least one of text information data, image information data, and voice information data.

[00126] When sending the image data corresponding to the first display area and the communication data corresponding to the second display area to the receiver, the image data and the communication data can be encapsulated according to a data type to provide encapsulated data. The image data are encapsulated with a profile picture ID used to identify the image data for displaying the profile picture on the first display area during the IM communication. The encapsulated data can be sent to the receiver.
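The encapsulation step of paragraph [00126] can be sketched as follows: image-type data carry a profile picture ID so the receiver can route them to the first display area, while communication data are packaged by their own type. The packet layout and field names are hypothetical illustrations, not a wire format defined by the disclosure.

```python
# Sketch of the encapsulation in [00126]. The packet structure and field
# names ('type', 'profile_pic_id', 'payload') are assumptions.

def encapsulate(image_data: bytes, comm_data: str, profile_pic_id: str) -> dict:
    packets = []
    # Image-type data are tagged with the profile picture ID, which the
    # receiver uses to identify data for the first display area.
    packets.append({"type": "image",
                    "profile_pic_id": profile_pic_id,
                    "payload": image_data})
    # Communication data (here text; could be image or voice) are
    # encapsulated according to their data type.
    packets.append({"type": "text", "payload": comm_data})
    return {"packets": packets}

encapsulated = encapsulate(b"gif-bytes", "hi there", "pic-001")
```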

[00127] In various embodiments, the processor(s) 1380 can cause the exemplary mobile terminal to perform disclosed IM methods. In the methods, data sent from a sender are received. The received data are parsed to obtain image data corresponding to a first display area and communication data corresponding to a second display area on a communication interface of the message receiving terminal device. The image data corresponding to the first display area are configured to display a profile picture on the first display area during an IM communication, and the communication data corresponding to the second display area are configured to at least display text during the IM communication. The image data are displayed on the first display area, and the communication data are displayed on the second display area.

[00128] When parsing the received data to obtain the image data corresponding to the first display area and the communication data corresponding to the second display area, image-type data can be obtained according to a data type of the received data, and the image data corresponding to the first display area can be obtained according to a profile picture ID encapsulated with the image-type data. The received data include the image data corresponding to the first display area, and the remaining data in the received data are the communication data corresponding to the second display area.

[00129] The image data corresponding to the first display area contain static image data or dynamic image data corresponding to an image received for a first time, or an image icon corresponding to an image received for a non-first time. The communication data contain at least one of text information data, image information data, and voice information data.

[00130] For displaying the image data on the first display area, when the image data corresponding to the first display area include the static image data or the dynamic image data corresponding to the image received for the first time, a default image can be displayed on the first display area, the static image data or the dynamic image data can be loaded, and the static image or the dynamic image can then be displayed on the first display area. When the image data corresponding to the first display area include the image icon corresponding to the image received for the non-first time, the previously received image data can be obtained by the receiver, and the image corresponding to the image data can be displayed.
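The receiver-side logic of paragraphs [00129]–[00130], distinguishing a first-time image (full static/dynamic data) from a non-first-time image (icon referencing cached data), can be sketched as follows. The message fields (`kind`, `icon_id`, `data`) and the cache structure are illustrative assumptions.

```python
# Sketch of the display decision in [00129]-[00130]. Field names and the
# cache layout are assumptions; a real terminal would also show a default
# image while the first-time data load.

def resolve_first_area_image(received: dict, cache: dict) -> bytes:
    if received["kind"] in ("static", "dynamic"):
        # First-time image: full image data arrive; cache them under the
        # icon ID so later messages can send only the icon.
        cache[received["icon_id"]] = received["data"]
        return received["data"]
    # Non-first-time image: only an icon is sent, saving data volume;
    # look up the previously received image data.
    return cache[received["icon_id"]]

cache = {}
first = resolve_first_area_image(
    {"kind": "static", "icon_id": "a1", "data": b"img"}, cache)
repeat = resolve_first_area_image({"kind": "icon", "icon_id": "a1"}, cache)
```

The icon-only path is what the later paragraph [00150] credits with improving sending speed and saving data volume.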

[00131] The received data can be parsed to obtain configuration information for dynamically displaying the communication data corresponding to the second display area. According to the configuration information for dynamic displaying, the communication data corresponding to the second display area can be dynamically displayed. It can be determined whether the communication data of the second display area contain predefined data corresponding to the image data of the first display area. When the communication data of the second display area contain the predefined data, image data corresponding to the predefined data can be displayed on the first display area.
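The predefined-data check of paragraphs [00123] and [00131] can be sketched as a lookup: if the second display area's text contains predefined data, the mapped image data are shown on the first display area. The keyword-to-image mapping below is a hypothetical example; the disclosure does not specify what the predefined data look like.

```python
# Sketch of [00131]: scanning communication data for predefined data that
# map to first-display-area image data. The mapping is an assumption.

PREDEFINED = {"(smile)": b"smile-image-bytes"}  # hypothetical mapping

def image_for_predefined(comm_text: str):
    """Return first-area image data if the text contains predefined data,
    else None (the first area keeps its current profile picture)."""
    for key, image in PREDEFINED.items():
        if key in comm_text:
            return image
    return None

hit = image_for_predefined("hello (smile)")
miss = image_for_predefined("hello")
```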

[00132] In current chat tools (e.g., IM tools), the profile picture and emoticons are displayed separately on terminal interfaces, and the profile picture is displayed repeatedly. For mobile phones having a limited display size, the repeatedly displayed profile picture wastes screen space. In addition, when emoticons and text content are displayed together, more display effects and stronger emoticon expression are desirable. Further, pictures posted and mixed with the text content in current terminal interfaces consume excessive data volume.

[00133] The present disclosure solves technical problems of terminal technology including mobile phones, computers, and handheld electronic devices, especially when using these terminal devices for displaying profile pictures, static pictures, dynamic pictures, text messages, etc. in IM tools.

[00134] In an exemplary embodiment, when chatting, an area on a terminal interface for displaying a profile picture can be used to display different contents (e.g., pictures/images such as .gif and/or .png files, and/or text) according to the user's messaging content.

[00135] In one embodiment, by introducing profile picture emoticons, a customized profile picture made by a third party and/or captured by the user can be imported. In some cases, pictures can be imported from the local terminal, or a GIF can be captured after certain notifications.

[00136] For example, the disclosed profile picture emoticon can be used by clicking on the conventional emoticon entrance, clicking on groups for profile picture emoticons, selecting a desired profile picture, and inputting text. The disclosed profile picture emoticon can also be used by long-pressing a profile picture which has already been used in a chat page, to use it directly as the profile picture. The disclosed profile picture emoticon can further be used via a fixed resident profile-picture entrance in the input box.

[00137] When a profile picture emoticon is received, the profile picture emoticon can be processed (e.g., loaded) after first displaying a default image. After being successfully downloaded, the images/pictures of the profile picture emoticon can be displayed (sometimes with an animation effect). In some cases, the animation effect can be extended for dynamically displaying text including, e.g., a verbatim (character-by-character) display.

[00138] In a certain embodiment, when a receiving terminal receives a message, the message content can be parsed (e.g., including a FaceSticker parameter) to determine whether the parsed image/picture is a profile picture emoticon. When it is determined that the parsed image/picture is a profile picture emoticon (e.g., FaceSticker=true), the parsed image/picture can be pulled and displayed on the display area for the profile picture. When it is determined that the parsed image/picture is not a profile picture emoticon (e.g., FaceSticker=false), the parsed image/picture can be displayed on the bubble display area.

[00139] It should be understood that the steps described in the various methods of the present disclosure may be carried out in the order shown or, alternatively, in a different order. Therefore, the order of the steps illustrated should not be construed as limiting the scope of the present disclosure. In addition, certain steps may be performed simultaneously.

[00140] In the present disclosure, each embodiment is progressively described, i.e., each embodiment is described with a focus on its differences from other embodiments. Similar and/or identical portions of the various embodiments can be referred to among one another. In addition, exemplary apparatus and/or systems are described with respect to corresponding methods.
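The FaceSticker routing of paragraph [00138] can be sketched as a simple dispatch: the parsed parameter decides whether an image goes to the profile-picture display area or the bubble display area. The message dictionary format and area names below are illustrative assumptions.

```python
# Sketch of [00138]: routing a parsed image by its FaceSticker parameter.
# The message structure and area names are assumptions for illustration.

def display_area_for(message: dict) -> str:
    """Return the display area for the message's image: the profile-picture
    area when FaceSticker is true, else the chat bubble area."""
    if message.get("FaceSticker") is True:
        return "profile_picture_area"
    return "bubble_area"

sticker_msg = {"FaceSticker": True, "image": b"gif-bytes"}
plain_msg = {"FaceSticker": False, "image": b"png-bytes"}
```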

[00141] The disclosed methods, apparatus, and/or systems can be implemented in a suitable computing environment. The disclosure can be described with reference to symbol(s) and step(s) performed by one or more computers, unless otherwise specified. Therefore, the steps and/or implementations described herein can be described one or more times and executed by computer(s). As used herein, the term "executed by computer(s)" includes the execution, by a computer processing unit, of operations on electronic signals of data in a structured form. Such execution can convert data or maintain the data at a position in a memory system (or storage device) of the computer, which can be reconfigured to alter the execution of the computer, as appreciated by those skilled in the art. The data structure maintained by the data has a physical location in the memory with specific properties defined by the data format. However, the embodiments described herein are not so limited. The steps and implementations described herein may also be performed by hardware.

[00142] As used herein, the term "module" or "unit" can refer to software objects executed on a computing system. A variety of components described herein, including elements, modules, units, engines, and services, can be executed on the computing system. The methods, apparatus, and/or systems can be implemented in software. Of course, the methods, apparatus, and/or systems can also be implemented using hardware. All of these implementations are within the scope of the present disclosure.

[00143] A person of ordinary skill in the art can understand that the units/modules included herein are described according to their functional logic, but are not limited to the above descriptions as long as the units/modules can implement the corresponding functions. Further, the specific name of each functional module is merely used to distinguish it from the others, without limiting the protection scope of the present disclosure.

[00144] In various embodiments, the disclosed units/modules can be configured in one apparatus (e.g., a processing unit) or configured in multiple apparatuses as desired. The units/modules disclosed herein can be integrated in one unit/module or distributed among multiple units/modules. Each of the units/modules disclosed herein can be divided into one or more sub-units/modules, which can be recombined in any manner. In addition, the units/modules can be directly or indirectly coupled or can otherwise communicate with each other, e.g., via suitable interfaces.

[00145] For example, the disclosed methods can be implemented by an apparatus/device including one or more processors and a non-transitory computer-readable storage medium having instructions stored thereon. The instructions can be executed by the one or more processors of the apparatus/device to perform the methods disclosed herein. In some cases, the instructions can include one or more modules corresponding to the disclosed methods.

[00146] Note that the terms "comprising", "including", and any other variants thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus containing a number of elements includes not only those elements but also other elements that are not expressly listed, or further includes elements inherent to the process, method, article, or apparatus. Without further restrictions, the statement "includes a " does not exclude other elements included in the process, method, article, or apparatus having those elements.

[00147] The embodiments disclosed herein are exemplary only. Other applications, advantages, alterations, modifications, or equivalents to the disclosed embodiments are obvious to those skilled in the art and are intended to be encompassed within the scope of the present disclosure.

INDUSTRIAL APPLICABILITY AND ADVANTAGEOUS EFFECTS

[00148] Without limiting the scope of any claim and/or the specification, examples of industrial applicability and certain advantageous effects of the disclosed embodiments are listed for illustrative purposes. Various alterations, modifications, or equivalents to the technical solutions of the disclosed embodiments can be obvious to those skilled in the art and can be included in this disclosure.

[00149] The disclosed IM methods, terminal devices, and systems can be used to input the image data corresponding to the first display area and the communication data corresponding to the second display area on a communication interface of a terminal device during an IM communication, and then send the image data and the communication data to the receiver. Each time messages are sent, the sender can select image data corresponding to a different profile picture. The terminal interface space can then be used effectively, especially for mobile terminals with limited interface space for displaying. Conventional operations of processing the repeatedly displayed static profile picture can thus be omitted. IM performance and human-device interactivity can be improved.

[00150] In addition, the image data corresponding to the first display area can contain static image data or dynamic image data corresponding to an image sent for a first time, or an image icon corresponding to an image sent for a non-first time. The disclosed IM methods, terminals, and systems can effectively improve image sending speed and save data volume. Further, before sending the image data corresponding to the first display area, the image data can be imported and stored, which is convenient for importing a large amount of image data. Processing efficiency can then be improved.

REFERENCE SIGN LIST

Sending terminal 700

First inputting module 702

Second inputting module 704

Data sending module 706

Sending terminal 800

Profile-picture importing module 802

Setting module 804

First inputting module 806

Second inputting module 808

Data sending module 810

Receiving terminal 900

Data receiving module 902

Data processing module 904

Displaying module 906

Sending terminal 1002

Receiving terminal 1004

Sending terminal 1102

Server 1104

Receiving terminal 1106

Environment 1200

Communication network 1202

Server 1204

Terminal 1206

RF (radio frequency) circuit 1310

Storage device 1320

Input unit 1330

Display unit 1340

Sensor 1350

Audio circuit 1360

Transmission module 13130

Processor 1380

Power supply 1390

Touch control panel 1331

Other input device(s) 1332

Display panel 1341

Audio input device 1361

Audio output device 1362