Title:
METHOD AND SYSTEM FOR COMPUTERIZED SEARCHING AND MATCHING USING EMOTIONAL PREFERENCE
Document Type and Number:
WIPO Patent Application WO/2007/117980
Kind Code:
A4
Abstract:
A method and system for capturing the emotional preferences of human subjects, generating machine-readable emotional code and using the code to optimize computerized searching and matching operations between entities are disclosed. The entity can be a human user, a product, or a service. The emotional code can thus serve as a universal language expressing human emotion that communicates among entities. After understanding the sending party's emotional profile, the receiving party can adapt its operation to achieve better results.

Inventors:
WILLCOCK ALEX (GB)
Application Number:
PCT/US2007/064971
Publication Date:
December 31, 2008
Filing Date:
March 27, 2007
Assignee:
IMAGINI HOLDINGS LTD (GB)
LUI JACQUELINE C (CN)
WILLCOCK ALEX (GB)
International Classes:
G06F17/00
Attorney, Agent or Firm:
LUI, Jacqueline, C. (1201, 12/F, Kwai Hung Holdings Centre, 89 King's Road, North Point, Hong Kong, CN)
Claims:

What is claimed is:

1. A method of codifying human emotion into a machine-readable language by a first computer application comprising the steps of:

(a) creating a multimedia object of an entity; said multimedia object being selectable by a user through an end-user system;

(b) obtaining human emotional preferences through capturing user responses from a plurality of users while they make selections from a plurality of said multimedia objects through said end-user system; said selection implicitly revealing said user's emotional preference;

(c) categorizing said user responses into categories; and

(d) assigning a category label to each category; said category label being an emotional code in said machine-readable language, whereby said user's emotional preference may be digitally characterized and conveyed to a second computer application; and wherein said second computer application is able to adapt its operation based on the interpretation of said emotional code.

2. The method of claim 1, wherein said multimedia objects are digital images.

3. The method of claim 1, wherein said multimedia objects are attached with a plurality of keywords; said keywords describing the human feeling towards a said multimedia object.

4. The method of claim 3, wherein said user response may be at least one of the following responses:

(a) positive response indicating said multimedia object said user prefers;

(b) negative response indicating said multimedia object said user dislikes;

(c) don't know response indicating said multimedia object said user does not know;

(d) said user's comments on said positive response;

(e) said user's comments on said negative response; and

(f) said user's comments on said don't know response.

5. The method of claim 4, wherein said categorizing step further comprises the steps of:

(a) retrieving said user responses from said user;

(b) tallying up the number of occurrences of each keyword on a list of keywords, said list of keywords aggregated from said multimedia objects said user selected; and

(c) invoking an expert system that categorizes said user to at least one category based on said user responses and the result of said tallying step.

6. The method of claim 5, wherein said expert system is a rule-based expert system that captures psychology expertise in categorizing said user based on said user responses.

7. The method of claim 5, wherein said expert system is a neural network classification system that classifies said user responses into said categories.

8. The method of claim 1, wherein said categorizing step further comprises the steps of:

(a) retrieving said user responses from said plurality of users;

(b) invoking a cluster analysis module to categorize said plurality of user responses into at least one cluster; and

(c) assigning said category label to each said cluster.

9. The method of claim 8, wherein said invoking step invokes a statistical clustering module to generate a cluster mean parameter and a cluster variance parameter for each said cluster; said cluster mean parameter being a multi-dimensional vector defining the cluster center, and said cluster variance parameter being a multi-dimensional matrix.

10. The method of claim 1 further comprising associating said emotional codes to an entity; said associating step comprising

(a) obtaining an entity multimedia object of said entity;

(b) choosing a plurality of representative multimedia objects; each representative multimedia object pre-selected to represent a broad category of human emotion;

(c) pre-assigning at least one said emotional code to each of a plurality of representative multimedia objects;

(d) displaying said plurality of representative multimedia objects and said entity multimedia object on a display device for a second user to operate;

(e) requesting said second user to select among said plurality of representative multimedia objects a first representative multimedia object that is most similar to said entity object; and

(f) assigning said emotional code pre-assigned to said first representative multimedia object to said entity.

11. A method according to claim 10 further comprising a search step for an enterprise to present a filtered plurality of entities to a third user when said third user reveals his emotional code to said enterprise, said search step further comprising the steps of:

(a) defining a peripheral region using said third user's emotional code, said peripheral region centered around said third user's emotional code;

(b) selecting a plurality of said entities having emotional code falling within said peripheral region;

(c) presenting said plurality of said entities from step (b) to said third user as said filtered plurality of entities such that irrelevant entities whose emotional code does not remotely match said third user's emotional code will not be presented to said third user.

12. The method of claim 11, wherein said categorizing step (in claim 1) invokes a statistical clustering module to categorize said plurality of user responses into at least one cluster whereby each cluster is characterized by a cluster mean parameter and a cluster variance parameter; said peripheral region being centered at said cluster mean parameter; and the extent of said peripheral region being a proportional constant multiplying said cluster variance parameter.

13. A method of filtering irrelevant items from a search operation using machine-readable emotional code; said emotional code generated by categorizing user responses into categories; said user responses captured when said user selects items from a plurality of choices; said selection revealing said user's emotional preferences; and labeling said emotional code to each of said category; and said emotional code assigned to a first entity, said first entity being said user who generates said user responses; and said user responses categorized to said category corresponding to said emotional code assigned to said first entity; a second entity, said second entity being a product by moving a multimedia representation of said product to a second multimedia object that has been assigned a second emotional code, said multimedia representation of said product inheriting said second emotional code, and a third entity, said third entity being a service by moving a multimedia representation of said service to a second multimedia object that has been assigned a third emotional code, said multimedia representation of said service inheriting said third emotional code; said method comprising the steps of:

(a) obtaining a search entity; said search entity selected from a group of said first entity, said second entity and said third entity;

(b) using said search entity emotional code to define a peripheral region centered around said search entity emotional code;

(c) connecting to a database that comprises a plurality of database entities; each of said database entity selected from a group of said first entity, said second entity and said third entity;

(d) retrieving a plurality of database entities; each of said database entities having its emotional code falling within said peripheral region; and

(e) presenting said plurality of database entities from step (d) to said search entity.

14. A method of filtering irrelevant items from a match operation using machine-readable emotional code; said emotional code generated by categorizing user responses into categories; said user responses captured when said user selects items from a plurality of choices; said selection revealing said user's emotional preferences; and labeling said emotional code to each of said category; and said emotional code assigned to a first entity, said first entity being said user who generates said user responses; and said user responses categorized to said category corresponding to said emotional code assigned to said first entity; a second entity, said second entity being a product by moving a multimedia representation of said product to a second multimedia object that has been assigned a second emotional code, said multimedia representation of said product inheriting said second emotional code, and a third entity, said third entity being a service by moving a multimedia representation of said service to a second multimedia object that has been assigned a third emotional code, said multimedia representation of said service inheriting said third emotional code; said method comprising the steps of:

(a) obtaining a match entity; said match entity selected from a group of said first entity, said second entity and said third entity;

(b) connecting to a database that comprises a plurality of database entities; each of said database entity selected from a group of said first entity, said second entity and said third entity;

(c) retrieving a plurality of database entities; each of said database entities having its emotional code identical to said emotional code of said match entity; and

(d) presenting said plurality of database entities from step (c) to said match entity.

15. A computer system for codifying human emotion into a machine readable language, comprising:

(a) a profiling server configured to:

I. capture the emotional preferences from a plurality of users;

II. generate emotional codes from said emotional preferences; and

III. generate a plurality of representative multimedia objects; each associated with at least one said emotional code;

(b) an enterprise system capable of assigning entity emotional codes to a plurality of entity multimedia objects; said entity multimedia object representing an entity offered by an enterprise, and said entity multimedia object inherits at least one said emotional code from said representative multimedia object;

(c) an end-user system capable of receiving user emotional code of said user from said server; said user emotional code being said emotional code describing said emotional preferences of said user; and

(d) a search and match engine configured to receive said user emotional code from said end-user system and said entity emotional code from said enterprise system; said search and match engine further configured to retrieve a plurality of said entity multimedia object of which entity emotional codes proximate said user emotional code.

16. The computer system in claim 15, wherein said profiling server further comprises:

(a) a cataloguing system configured to conduct emotional surveys to capture said emotional preferences; to categorize the results of said emotional surveys into categories, and to assign said emotional codes to said categories;

(b) a toolbox for customizing said emotional surveys; and

(c) a search engine optimization system for providing said search and match engine the capability of matching using said emotional codes.

17. The computer system in claim 16, wherein said cataloguing system further comprises:

(a) a survey module configured to send said emotional surveys to said users and collect said results of said emotional surveys from said users;

(b) a plurality of databases for storing and providing relevant information of said emotional surveys; and

(c) an analysis module to receive said results of said emotional surveys from said survey module and to categorize said results into categories; said analysis module further assigning a category label to each category, said category label being an emotional code in said machine-readable language.

18. The computer system of claim 15, wherein said enterprise system and said search and match engine run on the same computer system.

19. The computer system of claim 15, wherein said profile server and said search and match engine run on the same computer system.

20. The computer system of claim 15, wherein said end user system is capable of running an Internet browser application and said emotion code is transmitted to said end user system in the form of a computer cookie file.

Description:

METHOD AND SYSTEM FOR COMPUTERIZED SEARCHING AND MATCHING

USING EMOTIONAL PREFERENCE

FIELD OF INVENTION

[0001] The present invention relates to a computerized searching and matching system, and in particular to a searching and matching system that is customized to make use of the user's emotional profile.

BACKGROUND OF INVENTION

[0002] In conventional text-based computerized searching and matching systems, a user needs to specify a text string as matching criteria to perform a searching and matching operation. Since it requires users to exercise their logical thought process to input the search criteria precisely, this kind of searching and matching system is only effective in retrieving factual information from users. It is not effective in capturing users' emotional reflexes. However, it is well observed that humans make snap judgments based on instinctive emotional inclination. Unfortunately, there is a severe limitation in expressing the user's emotional reflex to conventional searching and matching systems in an intuitive manner. As a result, the system may not return meaningful matching results to the users.

SUMMARY OF INVENTION

[0003] In the light of the foregoing background, it is an object of the present invention to provide an alternative computerized system that can capture the emotional profiles of users and use such profiles to assist in searching or matching operations. This system may also be applied for searching and matching between different entities, where an entity can be a human being, a product or a service.

[0004] Accordingly, the present invention, in one aspect, is a method of codifying human emotion into a machine-readable language by a first computer application. This method first creates a multimedia object for each entity. It then presents those multimedia objects for users to select. When users make the selection, they implicitly reveal their emotional preferences. Such human emotional preferences are then captured and categorized into categories. Each of these categories is assigned an emotional code in a machine-readable language. In so doing, the user's emotional preference may be digitally characterized and conveyed to a second computer application; and the second computer application can then adapt its operation based on the interpretation of the emotional code. This operation may be a search operation, a match operation, an operation to find items with similar emotional code, an operation to find dissimilar emotional code, or any combination thereof.

[0005] In a preferred embodiment of the present invention, the multimedia objects are digital images. These multimedia objects are also attached with a plurality of keywords, with each keyword describing the human feeling towards that multimedia object.

[0006] In another preferred embodiment, the user response may comprise positive or negative responses from the user; as well as the comments that the user makes on those responses.

[0007] In one implementation, the categorizing step comprises the steps of (a) retrieving user responses from said user; (b) compiling a list of keywords aggregated from the multimedia objects that the user selected; (c) tallying up the number of occurrences of each keyword on the list of keywords; and (d) invoking an expert system that categorizes this user to at least one category based on the result of step (c) and other relevant information from the user's responses. In one embodiment of this implementation, the expert system is a rule-based expert system that captures the expertise of a psychologist when he categorizes the user based on the user's responses. In an alternative embodiment, this expert system is a neural network classification system that classifies the user responses into said categories.

[0008] In another implementation, the categorizing step invokes a statistical cluster analysis module to categorize a plurality of user responses into at least one cluster. Each cluster has a cluster mean parameter and a cluster variance parameter. And each cluster is assigned an emotional code.

[0009] In yet another implementation, this invention further comprises a method of associating an emotional code to an entity, which can be either the products or services that an enterprise is offering. This associating method first selects a plurality of representative multimedia objects from a pool. Each of them is pre-selected to represent a broad category of human emotion and is assigned at least one emotional code. Then for each entity, an entity multimedia object is created. This can be, for example, a digital image of that entity. Next, the plurality of representative multimedia objects and the entity multimedia object are displayed on a display device for a second user to operate. When the second user selects among the plurality of representative multimedia objects a first representative multimedia object that is most similar to that entity object, the emotional code pre-assigned to the first representative multimedia object is transferred to that entity and hence the entity inherits that emotional code.

[0010] In a variation of the above method, this invention further comprises a search step for the enterprise to present a filtered plurality of entities to a third user when the third user chooses to reveal his emotional code to the enterprise. The search step first defines a peripheral region centered around the third user's emotional code. It then selects a plurality of entities having emotional code falling within the peripheral region and then presents them to the third user as the filtered plurality of entities, so that irrelevant entities whose emotional code does not remotely match the third user's emotional code will not be presented to him.

[0011] In a preferred embodiment of this aspect of the present invention, and when the categorizing step invokes a statistical clustering module to categorize a plurality of user responses into at least one cluster so that each cluster is characterized by a cluster mean parameter and a cluster variance parameter, the peripheral region can be created so that it is centered at the cluster mean parameter; and the extent of this region is a proportional constant multiplying the cluster variance parameter.

[0012] According to another aspect of the present invention, a system for codifying human emotion into a machine-readable language is disclosed. The system comprises a profiling server, an enterprise system, an end-user system and a search and match engine. The profiling server is configured to (a) capture the emotional preferences from a plurality of users; (b) generate emotional codes from the emotional preferences; and (c) generate a plurality of representative multimedia objects, each of which is associated with at least one emotional code. The enterprise system has a product or service catalog and each of the products or services is tagged with an emotional code. (A detailed description of how this is done will be given later.) The end-user system is capable of receiving the user's emotional code from the server. The search and match engine is configured to receive the user's emotional code from the end-user system and the product or service emotional code from the enterprise system so that it can retrieve a plurality of products or services whose emotional codes proximate the user's emotional code.

[0013] In one embodiment, the enterprise system and the search and match engine run on the same computer system. In another embodiment, the profile server and the search and match engine are hosted on the same computer system.

[0014] In a preferred embodiment, the end user system is capable of running an Internet browser application and the emotion code is transmitted to the end user system in the form of a computer cookie file.

[0015] There are many advantages to the present invention. A primary advantage is that it can capture the emotional reflex of a user, something that cannot be done by conventional text-based questionnaires, and use this emotional profile to improve search precision. It is well known in the art that consumers make snap judgment decisions in their buying habits. Such decisions are largely based on their instinctive emotional preferences rather than a logical, analytical thought process. Before this invention, there were no effective and efficient methods to codify human emotional preference to a computer. But through this invention, the emotional dimension of human preferences can now be effectively captured and conveyed just like other rational information. The emotional profiling methodology therefore provides additional insight for market researchers to understand the behavior of consumers. Such insights cannot be obtained by conventional text-based surveys.

[0016] Another advantage of the present invention is that when all users and objects are tagged with their respective emotional code, it allows market researchers to perform targeted marketing in the emotional space. The emotional code can become a universally accepted vocabulary of communication in that space. As such, this invention makes it possible for human emotional feelings and preferences to be communicated digitally across cyber-space, so that commerce can zoom in on what the consumer actually likes immediately. Similarly, users can also find like-minded users easily.

BRIEF DESCRIPTION OF FIGURES

[0017] Fig. 1 is the block diagram of the emotional profiling system in one embodiment.

[0018] Fig. 2 is the block diagram of the computerized emotional profiling server according to one example of the present invention.

[0019] Fig. 3a shows the major software modules and databases of the emotional profile cataloguing system.

[0020] Fig. 3b shows the internal composition of the survey form design toolbox.

[0021] Fig. 4 illustrates a specific example of the user interface when the user is doing the survey.

[0022] Fig. 5 shows another specific example of the user interface when the user is doing the survey.

[0023] Fig. 6 is the illustration of a specific data representation example of the emotional code or Visual DNA.

[0024] Fig. 7a is a specific example of the auto image tagger, where tagging has not been done yet.

[0025] Fig. 7b is the specific example of auto image tagger, where tagging is partially done.

[0026] Fig. 8a shows a specific example of the user interface of the search engine with a diverse range of results.

[0027] Fig. 8b shows a specific example of the user interface of the search engine with a more homogeneous range of results.

[0028] Fig. 9 shows a specific example of the internal architecture of the hardware and software system of the matching system.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

[0029] As used herein and in the claims, "comprising" means including the following elements or steps but not excluding others. Moreover, "entity" includes both a living being such as a user, and an entity object. The term "entity object" is used herewith to denote a non-living entity such as but not limited to a company, a corporation, a product, a service, a brand, a concept such as 'holiday', or scenery. The term "multimedia object" refers to a multimedia data structure that represents the entity in the computer system. It can be, but is not limited to, a digital image, a video clip, a sound file, or a text-based object. Moreover, the terms "emotional code", "Visual DNA" and "emotional fingerprint" are used interchangeably in this document and they all mean a machine-readable code that codifies human emotion.

[0030] Referring now to Fig. 1, the first embodiment of the present invention is an emotional profiling system (22) comprising multiple subsystems, which include a computerized emotional profiling server (35), at least one enterprise system (91), at least one search and match engine (92) and at least one end-user system (93). These subsystems are each connected to the computerized emotional profiling server (35) via different data communication channels (23a, 23b, 23c, 23d and 23e). These data communication channels establish point-to-point data paths between the two parties. This can be done either through a private communication network, a public network such as the Internet, or a virtual private network (VPN). A channel may traverse one or more local area networks (LAN), metropolitan area networks (MAN), wide area networks (WAN), or a combination thereof. Each of such networks may be implemented using leased lines, optical fiber, wireless technologies, or other networking technologies.

[0031] In Fig. 2, the internal structure of the computerized emotional profiling server (35) is revealed. It further comprises (a) an emotional profile cataloguing system (36) that sends out emotional survey forms to users, collects and categorizes the survey results and assigns emotional codes to different emotional categories; (b) a survey form design toolbox (45) that allows a designer to design a customized survey form quickly; and (c) a search engine optimization module (46). This module can be embedded into the search and match engine (92) so that the latter can make use of the emotional code to retrieve items that closely match the user's emotional preference.

[0032] Figs. 3a and 3b further illustrate one implementation of the emotional profile cataloguing system (36) and of the survey form design toolbox (45), respectively. This cataloguing system (36) makes use of the media survey module (20), the analysis module (44), and a plurality of databases including the survey result database (101), the emotional profile database (102), the multimedia presentation database (103), the name list database (104) and the survey document database (105). The media survey module (20) sends out survey forms (21) to users. The survey result database (101) stores the survey results from the media survey module (20) and sends them to the analysis module (44). The latter in turn connects to the emotional profile database (102). The survey form design toolbox (45) consists of three parts: the auto image tagger (178), the image database (180) and the module builder (182), as shown in Fig. 3b. The details of these systems will be elaborated later.

[0033] The system architecture of the first embodiment of the present invention is thus described. Now the description will turn to the operation of the multiple subsystems, and how they interact with each other to achieve the object of the present invention.

[0034] Returning to Fig. 1, the computerized emotional profiling server (35) sends a set of data packages, called Insight packages in this specific example, to those enterprise systems (91) that subscribe to its service via the data communication channel (23a). Each Insight package stores the analyzed results of users' emotional information, a set of representative multimedia objects and the associated emotional codes. Using the Insight packages, the staff at the enterprise can gain a better understanding of users' preferences. Next, the enterprise system (91) will tag their products or services with the emotional codes (this will be explained in more detail later), and then send the product or service information, together with its emotional code, to a search and match engine (92). Meanwhile, the computerized emotional profiling server (35) may also distribute the user's emotional code, as well as information unique to this user, back to the user in the form of a 'know-me' packet to an end-user system (93) owned by the user via another data communication channel (23e).
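The patent does not give a concrete layout for the Insight package, so the following is only a minimal sketch, in Python, of one possible container for the three parts named above (analyzed survey results, representative multimedia objects and their emotional codes). All class and field names are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class RepresentativeObject:
    """A representative multimedia object pre-tagged with one or more emotional codes."""
    object_id: str
    media_url: str
    emotional_codes: List[str] = field(default_factory=list)

@dataclass
class InsightPackage:
    """Hypothetical container for the data sent to a subscribing enterprise system (91)."""
    domain: str                         # e.g. a survey domain such as 'furniture'
    analyzed_results: Dict[str, float]  # aggregate statistics from the surveys
    representative_objects: List[RepresentativeObject] = field(default_factory=list)
```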

[0035] In one specific example of the emotional profiling system (22), the user is a consumer, the enterprise system is an on-line shopping site, and the search and match engine is provided by a third-party commerce system. The consumer, through the end-user system (93), connects to the commerce system hosting the search and match engine (92) via the data communication path (23d); and the merchandiser makes use of the enterprise system (91) to offer his product or service information to the commerce system via another data communication path (23b). Through the commerce system, the consumer can select what product or service to purchase. As mentioned before, each product or service can be tagged with an emotional code. When the consumer also reveals his emotional code to the commerce system, the commerce system can select those products from the merchandiser's enterprise system (91) that match the consumer's emotional code, thus providing the consumer with his most desired products. In so doing, targeted trading between merchants and consumers can be facilitated. Furthermore, the commerce system may also record the choices that the consumer has selected from the plurality of products or services presented to him, and send this information back to the computerized emotional profiling server (35) for subsequent 'click-through' analysis. This is done via another data communication path (23c).

Survey Form Generation and Processing

[0036] Turning now to Fig. 3a, the internal structure of the emotional profile cataloguing system (36) is shown. As described previously, the cataloguing system (36) sends out emotional survey forms to users, collects and categorizes the survey results and assigns emotional codes to different emotional categories. More specifically, the cataloguing system (36) sends out a survey form (21) to a user and prompts the user to choose one or more answers. The survey form (21) comprises at least one question together with a plurality of multimedia objects. The multimedia object can be, but is not limited to, a textual object, an image, photograph, picture, audio clip, sound clip, movie, or any combination thereof. In a preferred embodiment, the survey form is displayed on the web browser of the user's computing device. A typical web browser is Internet Explorer from Microsoft. Fig. 4 shows one example of a survey form whereby the multimedia objects are digital images. In yet another preferred embodiment, the image is a mouse-clickable object so that when the user clicks on a particular image, the web browser detects which image the user has chosen and sends this information back to the media survey module (20) as the user's choice for that survey question.

[0037] The survey document database (105) comprises a plurality of survey forms. The survey form may be entirely text-based (i.e. the multimedia objects are text objects). This kind of survey form records the factual and demographic information about the users such as their sex, age range, income level and the like. This is the traditional survey questionnaire. An important aspect of the present invention is to gather not only demographic information from the user but also his emotional preferences. Hence in a typical survey document, the survey forms comprise not only pure text-based forms but also forms with rich multimedia objects such as the one shown in Fig. 4. Research has shown that when pure text-based questionnaires are presented to a user, the user will go through a 'considered thought process' to read the multiple-choice answers before he selects the one that is applicable to him. However, it is well known that feelings drive a vast majority of human behavior and choices, and how people feel in the test environment is closer to how they would feel and act in real life. These kinds of inner feelings can better be captured by the user's direct emotional response to visual stimuli. Hence a survey form comprising a plurality of images or visual objects can capture the emotional reflex of the user in a way that cannot be done by conventional text-based questionnaires. Recent experimental results showed that the number of people willing to respond to a multimedia questionnaire was almost three times higher than for a traditional text-based questionnaire. More importantly, new consumer segments were found in the multimedia survey that would not have been uncovered by a traditional text-based survey.

[0038] An exemplary implementation of generating the survey forms and conducting the survey is given here. Firstly, appropriate multimedia objects are chosen to build survey forms. In a preferred embodiment, these multimedia objects are images which are extracted from an image database. This image database can be a private collection, public database, or commercial database. An example of a commercial image database is Gettyimages®. Each image is tagged with a set of keywords. These keywords are carefully selected to describe the human's view and feeling towards that image. An exemplary set of keywords are beach, snow mountain, metropolitan area, natural setting; or classical, romantic, traditional, trendy, etc. In one embodiment, this is done manually by humans. In a preferred embodiment, it is done by psychologist experts. In another embodiment, this process can be performed by a computer module, the details of which will be discussed later.

[0039] In this exemplary implementation, a suite of survey forms is created, each for a different domain. For example, there is one survey form for sports, another one for travel, etc. Each survey form comprises multiple survey pages. Some of the survey pages capture the factual information of a user; these are the traditional text-based survey pages. Other survey pages comprise a textual question, and multiple images that are tagged with keywords for the user to choose from.
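To make this structure concrete, here is a minimal sketch, under assumed names, of how such a survey document could be modeled: one form per domain, with text-based pages for factual questions and image pages whose choices carry keywords that stay hidden from the respondent. None of these class names appear in the patent itself.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ImageChoice:
    """An image choice; its keywords are invisible to the respondent (see [0041])."""
    image_id: str
    keywords: List[str] = field(default_factory=list)

@dataclass
class SurveyPage:
    question: str
    text_choices: List[str] = field(default_factory=list)            # factual pages
    image_choices: List[ImageChoice] = field(default_factory=list)   # emotional pages

@dataclass
class SurveyForm:
    domain: str                                   # e.g. 'sports' or 'travel'
    pages: List[SurveyPage] = field(default_factory=list)
```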

[0040] Afterwards, the survey forms are sent out to a plurality of users to fill out. In one implementation, the list of users can be acquired from various name-list databases (104) that contain the contact information or email addresses of the users. This can be a company's name-list database, an advertising company that sells a commercialized name-list database, user communities like interest clubs or professional organizations, or online communities such as Yahoo or MSN groups. To achieve unbiased survey results, the more users participating in the survey, the more reliable the survey statistics will be. On the other hand, the pool of users from the name-list database (104) can be partitioned into many groups with different classifications. For example, the users can be divided into groups according to ethnic origin, race, geographical territory, culture or language, etc. Hence the same survey may be conducted on different user groups. Though the size of each group may not be as big as a large pool, the group so partitioned may have certain common characteristics, and hence it may yield more refined and meaningful emotional profiles as compared to the survey results from a large user pool.

[0041] When a user responds to the survey question by choosing visually the most desirable image from a plurality of alternatives, the system also records the keywords associated with the selected image for subsequent analysis. As the keywords attached to each image are invisible to the user, what he selects is a full reflection of his emotional preference on that subject. Referring to Fig. 4 as a specific example for illustration, the question "Happiness to me is ..." shows a number of images to the user such as praying under the sun (186), relaxing in the bathtub (188), taking a ride through the countryside (190), or sleeping with a pet (192). If the user selects the "relaxing in the bathtub" (188) image for instance, he has indirectly selected several keywords such as quiescence, warmth and indoor that have been associated with this image previously. Nonetheless, the survey results to be transmitted are not the keywords but just the simple selections the user made among the images, as the extraction and analysis of keywords associated with selected images is done in the emotional profiling server (35). Moreover, there is a chance that the user dislikes some of these images. In one implementation, a 'remove' button (194) for each item as shown in Fig. 5 is also provided for the user to express his negative feeling towards an image. This negative feedback is just as important as his positive selections and will also be saved as part of the survey results for subsequent analysis.

[0042] During the survey, the system also provides means for the user to write comments about what and why he likes and dislikes about the images. These comments are saved as part of the survey record, added to his survey results, and may be used by the analysis module to determine the most appropriate emotional code for this user.

[0043] In addition to capturing the user's emotional preference by asking him to go through the entire survey form explicitly, it may also be possible to do so implicitly when he is browsing the Internet. In a specific embodiment, when the user is visiting an on-line shopping site, what product/service the user clicks on in the on-line product/service catalog also reveals his emotional preference. Such information can be captured by the system and stored in the user's survey record for further analysis.

[0044] Last but not least, the system also provides the users the facility to invite their friends to join a user group and answer the survey questions. They can invite their friends to join the group through websites, email, blogs, podcasting, or any other means available in online social networks. This kind of 'viral effect' can greatly extend the user survey pool.

Analysis of Survey Results

[0045] After all the survey result records from the user community have been collected, the emotional profile cataloguing system (36) can invoke the analysis module (44) to analyze them. In general, the survey result record has a complex data structure in order to store the multi-faceted demographic data and emotional preferences of the user in a multi-dimensional data representation. For example, it may be implemented as a high-dimensional matrix, a tree structure or an object-oriented data type. In one implementation, it comprises a vector that records the demographic data of the user, a multi-dimensional matrix that records his emotional preferences, and text fields that record the positive and negative comments from the user. The multi-dimensional matrix may further comprise the choice vector that registers the choices made by the user. It may further comprise the speed vector that records the time it takes for the user to make each choice, and the sequential ordering vector that registers the ordering of choices if the user selects more than one choice for a question.
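A minimal sketch of such a record is given below, assuming NumPy arrays for the vectors and matrix; the class and field names are illustrative only and do not come from the patent.

```python
from dataclasses import dataclass, field
from typing import List
import numpy as np

@dataclass
class SurveyResultRecord:
    """One user's survey record, mirroring the components listed in [0045]."""
    demographics: np.ndarray                  # vector of demographic data
    emotional_preferences: np.ndarray         # multi-dimensional matrix of preferences
    choices: np.ndarray = field(default_factory=lambda: np.array([]))         # choice vector
    response_times: np.ndarray = field(default_factory=lambda: np.array([]))  # speed vector
    choice_order: np.ndarray = field(default_factory=lambda: np.array([]))    # ordering vector
    positive_comments: List[str] = field(default_factory=list)
    negative_comments: List[str] = field(default_factory=list)
```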

[0046] Having obtained the survey results stored in a multi-dimensional data representation from the survey result database (101), the analysis module (44) can now perform two kinds of analysis that assign users to different emotional code categories. The first type of analysis is keyword analysis, which can be done manually or automatically. Keyword analysis is to analyze the keywords associated with the images selected by the respondent, and deduce which category this user should be assigned to. In one implementation, the keyword analysis module reads the chosen images stored in the survey result, and finds the keywords from a database associated with each image. It then aggregates them into a keyword list. In a specific implementation, this analysis is performed by psychologist experts. The experts can study the keyword list, the number of occurrences of each keyword in the list, and the positive and negative comments entered by the user, and then determine the emotional code of this user.

[0047] In another implementation, this analysis is done automatically by a computer module. As a specific illustrative example, the user has answered thirteen questions in a survey. The keyword "sunshine" appears five times, and is the predominant keyword. Though the keyword "sad" also appears once, the module weighs all the evidence and assigns this user to the "optimist" emotional code category. Alternatively, other Artificial Intelligence techniques can also be employed. Specifically, when a psychology expert articulates the reasons why he assigns a user a certain emotional code after studying the user's survey record, those reasons can be encoded into a rule-based expert system so that the latter can draw the same conclusion. Alternatively, after a large number of users have been categorized, a neural network classifier can be used to learn this categorization mapping. In this case, the keyword tallies from the user's survey result are fed to the input nodes of a neural network, and the output nodes are set to the particular emotional code that this user is assigned to. The neural network training algorithm is then invoked to learn the associative mapping of this past example. After training, the neural network can then be used to assign the emotional code to a new user when his keyword list is available.
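As a purely illustrative sketch of the automatic keyword analysis described above, the Python fragment below tallies the keywords attached to the images a user selected and applies a toy rule standing in for the psychologist-derived rules; the image-to-keyword table, the rule and the fallback category name are assumptions introduced here for illustration.

```python
from collections import Counter

# Hypothetical keyword lookup per image; in the patent these keywords live in a database.
IMAGE_KEYWORDS = {
    "img_bathtub": ["quiescence", "warmth", "indoor"],
    "img_sunrise": ["sunshine", "outdoor", "energy"],
}

def tally_keywords(chosen_image_ids):
    """Aggregate and count the keywords attached to the images a user selected."""
    counts = Counter()
    for image_id in chosen_image_ids:
        counts.update(IMAGE_KEYWORDS.get(image_id, []))
    return counts

def assign_emotional_code(keyword_counts):
    """Toy rule base standing in for the expert rules sketched in [0047]."""
    if keyword_counts.get("sunshine", 0) > keyword_counts.get("sad", 0):
        return "optimist"                 # category name taken from the example in [0047]
    return "introspective"                # illustrative fallback category, not from the patent

counts = tally_keywords(["img_sunrise", "img_sunrise", "img_bathtub"])
print(assign_emotional_code(counts))      # -> "optimist"
```

A neural network classifier, as also mentioned in [0047], would replace assign_emotional_code with a model trained on previously categorized keyword tallies.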

[0048] In any case, the emotional code, together with the keyword statistics and all other relevant information, is stored as an emotional profile record for this user in the emotional profile database (102).

[0049] The second analysis method is a statistical clustering technique that performs clustering analysis on these multi-dimensional vectors. As a typical example, this module can invoke the SPSS statistical package from SPSS Inc. The cluster analysis module produces at least one cluster from the survey results, and also the statistics associated with this cluster.

[0050] In operation, the statistical analysis module retrieves all the survey result records from the survey result database (101) and segregates them into clusters or emotional profile categories. Each cluster is represented by a cluster center with its cluster statistics such as the cluster mean and variance. The cluster mean is represented by a multi-dimensional vector while the cluster variance is a multi-dimensional matrix. These clustering statistics, together with the emotional code and all other relevant information related to this emotional profile category, are stored as an emotional profile record in the emotional profile database (102).
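The patent names SPSS as a typical clustering package; as an illustration only, the sketch below uses scikit-learn's KMeans (an assumption, not the patent's tool) to produce the cluster mean vector and cluster variance matrix described above from an array of survey response vectors.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_survey_vectors(response_vectors, n_clusters=8):
    """Cluster multi-dimensional survey response vectors; return, per cluster,
    the cluster mean (a vector) and cluster variance (a covariance matrix)."""
    X = np.asarray(response_vectors, dtype=float)
    model = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
    profiles = []
    for label in range(n_clusters):
        members = X[model.labels_ == label]
        profiles.append({
            "emotional_code": f"cluster_{label}",      # a numeric code, as suggested in [0051]
            "mean": members.mean(axis=0),              # multi-dimensional vector
            "variance": np.cov(members, rowvar=False), # multi-dimensional matrix
        })
    return profiles
```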

[0051] After keyword or cluster analysis, users in the same emotional profile category can then be assigned the same emotional code. This code becomes the personal emotional code for this segment of users. As it denotes the emotional preference of the user, it is also referred to as his Visual DNA, or emotional fingerprint. It is advantageous to use an easy-to-remember name or image to denote the emotional code for future reference. For example, the names 'traditionalist', 'modernist' and 'environmentalist' can be used. This can become the international emotional code; and when this emotional code is adopted worldwide, it can greatly improve the efficiency of consumer / commerce communication, as will be seen by way of examples in subsequent paragraphs. Alternatively, a numeric code can be adopted. For example, each cluster can simply be referred to by a cluster number.

[0052] A specific data representation example of the emotional code or Visual DNA is illustrated in Fig. 6. The emotional preferences of a user are divided into twelve domains, each offering alternative tags to one of which the user belongs. For instance, if he likes basketball in the Sport domain, he belongs to Tag 5. And if he prefers herbal medicine, then he belongs to Tag 2 in the Health domain. The analysis module (44) is able to assign a code to this user because the user has completed the surveys for these two domains. Hence if he completes all surveys for all twelve domains, then his emotional code or Visual DNA code (152) would be a text string of twelve entries, i.e. "S5 HE2 EN8 L4 F8 E2 P3 FA7 TR3 H1 T6 FR2" as shown in Fig. 6. On the other hand, if the user only completes a subset of the survey forms, his Visual DNA code will only consist of codes from those domains that have data on his emotional preferences.
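As a concrete (and purely illustrative) reading of this representation, the sketch below composes and parses such a Visual DNA string from a per-domain tag mapping. The domain prefixes are taken from the example string of Fig. 6; the helper names and the prefix-to-domain mapping are assumptions.

```python
# Domain prefixes as they appear in the example Visual DNA string of Fig. 6.
DOMAIN_ORDER = ["S", "HE", "EN", "L", "F", "E", "P", "FA", "TR", "H", "T", "FR"]

def compose_visual_dna(tags_by_domain):
    """Build the space-separated Visual DNA string, skipping domains with no survey data."""
    parts = []
    for prefix in DOMAIN_ORDER:
        tag = tags_by_domain.get(prefix)
        if tag is not None:
            parts.append(f"{prefix}{tag}")
    return " ".join(parts)

def parse_visual_dna(code_string):
    """Split a Visual DNA string such as 'S5 HE2 ...' back into {domain_prefix: tag_number}."""
    result = {}
    for token in code_string.split():
        prefix = token.rstrip("0123456789")
        result[prefix] = int(token[len(prefix):])
    return result

# Example: a user who has only completed the Sport and Health surveys.
print(compose_visual_dna({"S": 5, "HE": 2}))   # -> "S5 HE2"
```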

[0053] In Fig. 6, an example having twelve domains and eight tags in each domain is used to illustrate a specific data representation of the Visual DNA. It should be emphasized that this inventive system can cater for any number of domains and any number of tags within each domain. Hence the specific numbers mentioned should not be construed as the only numbers required in the implementation of this invention.

Survey Form Design Toolbox

[0054] As briefly introduced before, the survey form design toolbox (45) facilitates the generation of survey documents. Referring back to Fig. 3b, in one implementation the toolbox (45) consists of three modules. The first module is the image database (180) from which images for a survey question can be extracted for use. A typical commercial image database is Gettyimages®. The second essential part of the toolbox (45) is an auto image tagger (178) that assigns keywords to images in a simple and intuitive manner. Its detailed operation will be discussed in the following paragraph. The third module is the module builder (182), which assists a survey designer in composing a survey document consisting of a plurality of survey pages, each of which comprises a question and multiple image or text objects for users to choose from. It may also provide a facility for the designer to incorporate additional information such as a company logo or other relevant data into the survey document.

[0055] An exemplary computer screen layout of the auto image tagger is shown in Figs. 7a and 7b. The auto image tagger interface screen comprises two panels: the untagged image panel (158) and the tagged image panel (160). As its name implies, the former contains images not tagged with keywords while those in the latter panel are tagged. To assign keywords to the untagged images, a user simply compares an untagged image against all those in the tagged image panel, and identifies the most similar one. He then uses a computer mouse to drag the untagged image onto that most similar one in the tagged image panel. In Fig. 7a, for instance, the first image (196), the second image (198) and the third image (200) reside in the untagged image panel (158); and images (202) and (204) are standard images tagged with keywords. The user then drags the second picture (198) and the third picture (200) onto a first standard image (202), whereas the first picture (196) is dragged onto a second standard image (204). The newly tagged images (196), (198) and (200) in Fig. 7b become smaller in size and reside on the corresponding tagged images in the tagged image panel (160). By so doing, the untagged image inherits the keywords from the tagged image.
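A minimal sketch of the tag-inheritance step behind this drag-and-drop gesture is given below, assuming a simple Image class; the class and field names are illustrative, not the patent's API. The same mechanism is reused by the emotional code auto-tagger of paragraph [0058], with emotional codes in place of keywords.

```python
class Image:
    """Illustrative in-memory representation of an image in either panel."""
    def __init__(self, image_id, keywords=None, emotional_codes=None):
        self.image_id = image_id
        self.keywords = set(keywords or [])
        self.emotional_codes = set(emotional_codes or [])

def drop_onto(untagged, tagged):
    """Called when an untagged image is dragged onto a tagged one:
    the untagged image inherits the tags of the target image."""
    untagged.keywords |= tagged.keywords
    untagged.emotional_codes |= tagged.emotional_codes   # same mechanism reused in [0058]
    return untagged

standard = Image("202", keywords={"quiescence", "warmth", "indoor"})
print(drop_onto(Image("198"), standard).keywords)   # -> {'quiescence', 'warmth', 'indoor'}
```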

End User System (93)

[0056] After a new user is assigned an emotional code or Visual DNA code, the computerized emotional profiling server (35) creates a 'know-me' package for the user. The package includes the emotional code assigned to this user by the server (35), his comments on the images he selected or removed, his personal information or hobbies, his network of friends, etc. The server (35) can send this 'know-me' package to the user's end-user system (93). In one implementation, the 'know-me' package can be in the form of a 'cookie file' that is widely used by web browsers. In other implementations, it can be stored in the form of a token carried by the user. As an illustrative example, the 'know-me' package can be stored in the SIM card or the USIM card inside a mobile phone, a smart-card carried by the user himself, or any memory device that can store data persistently. With the user's permission, he can share his 'know-me' package with his friends, a user community, or a shopping site so that others can understand his emotional preferences.
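The patent does not specify a cookie format, so the following is only one plausible sketch: the 'know-me' package is serialized to JSON, base64-encoded, and carried in a single browser cookie. The field names and the cookie name are assumptions made for illustration.

```python
import base64
import json
from http.cookies import SimpleCookie

def build_know_me_cookie(visual_dna, comments, hobbies):
    """Pack a hypothetical 'know-me' package into a single cookie value."""
    package = {
        "visual_dna": visual_dna,      # e.g. "S5 HE2"
        "comments": comments,
        "hobbies": hobbies,
    }
    encoded = base64.urlsafe_b64encode(json.dumps(package).encode()).decode()
    cookie = SimpleCookie()
    cookie["know_me"] = encoded        # cookie name is an assumption
    return cookie.output(header="Set-Cookie:")

def read_know_me_cookie(encoded_value):
    """Reverse of build_know_me_cookie, as a shopping site might do on the server side."""
    return json.loads(base64.urlsafe_b64decode(encoded_value.encode()).decode())
```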

Enterprise System (91)

[0057] Merchandisers or other service providers have a keen interest in knowing consumer preferences and the latest trends. Hence, the computerized emotional profiling server (35) can send an Insight package to those enterprise systems (91) whose owners subscribe to this service. The Insight package stores the analyzed results of the consumer surveys for at least one domain, a plurality of representative multimedia objects, each of which represents a broad category of human emotion and is tagged with emotional codes or Visual DNA, and other relevant information. In one implementation, the representative multimedia objects are chosen by an expert. As an illustrative example, one multimedia object may be chosen to represent 'happiness' while another one represents 'sadness', etc. When a consumer is willing to reveal his emotional code, the merchandiser or other service provider can select those products or services in his catalog that appeal to that emotional code. To do so, the merchandiser or other service provider needs to tag his products or services with the same set of emotional codes. This can be done by an emotional code auto-tagger system. This system is similar to the auto image tagger module described previously, except that the latter tags images with keywords while the emotional code auto-tagger tags images with emotional codes.

[0058] In operation, an entity multimedia object of a product or service offered by the merchandiser or other service provider is first created. In a preferred implementation, both the entity multimedia object and the representative multimedia objects are digital images or visual objects. The user interface of the emotional code auto-tagger is the same as shown in Fig. 7a and Fig. 7b. The tagged image panel (160) is filled with a plurality of representative multimedia objects, each of which is tagged with at least one emotional code or Visual DNA, while the untagged image panel (158) is filled with the entity multimedia objects that represent the merchandiser's products or services. The merchandiser can then drag an image from the untagged image panel (158) onto one of the images in the tagged image panel (160) as mentioned before, and in so doing, the entity multimedia object will inherit the emotional code from that representative multimedia object.

Search and Match Engine (92)

[0059] As mentioned previously, the emotional code or Visual DNA can be used as a universal code by both the consumers and the merchandisers. The consumer can use this code to express his emotional preference while the merchandisers can segment their products or services according to this code.

[0060] In a traditional on-line shopping site, a consumer visiting the site will typically enter a few keywords on what he wants, and a search engine at this site will search the product or service catalog and display a plurality of choices for the consumer to select. Unfortunately, many recommendations may not match what the consumer wants, resulting in a waste of time for the consumer and a loss of sales opportunities for the merchandisers. The search and match engine (92) can incorporate the search engine optimization module (46) from the computerized emotional profiling server (35) so that it can make use of the universal Visual DNA to weed out the irrelevant choices, thus providing a more pleasant and favorable shopping experience to the consumer. The actual operation of this system is illustrated by way of an example in the following paragraphs:

[0061] In a specific example, and referring back to Fig. 1, a consumer uses a web browser available at his end-user system (93) to visit an on-line commerce system that is equipped with a search and match engine (92). The commerce system, in turn, receives the product and service catalog from an enterprise system (91) of a furniture store. In this case, each piece of furniture (i.e., a sofa) is represented by a multimedia object in the form of an image that has already been tagged with at least one emotional code as described previously. The consumer's 'know-me' package (which contains the consumer's Visual DNA) is stored at his end-user system (93) in the form of the 'cookie file' that can be accessed by the web browser.

[0062] Fig. 8a is an illustrative example of the screen shot when the consumer first enters the aforementioned on-line commerce site. The user only needs to input the word "sofas" as the search string into the input box (168), and a plurality of sofa images are displayed in the display box (174). If the user is willing to provide his 'know-me' package to the furniture site, which is necessary for Visual DNA matching, he can click the 'know-me' check box (170). This will cause the web browser to send the consumer's 'know-me' cookie file to the online commerce site. Failure to do so would result in the search engine behaving like an ordinary text-based one without optimization. After the user checks the 'know-me' box and clicks the "search" button (172) again, another plurality of sofa images appears in the display box (174). A sliding bar (176) on top of the page becomes enabled too. At this stage the sliding bar (176) is placed at the rightmost or broadest position, which means that the search engine returns a set of items with the broadest diversity relative to the consumer's Visual DNA. The sofas thus selected are shown in the display box (174) of Fig. 8a. As one can see, they vary greatly in style and structure. In Fig. 8b, when the sliding bar (176) is positioned at the leftmost or narrowest position, the sofas shown in the display box (174) are quite similar in style and structure to each other, since they are selected with the smallest diversity relative to the consumer's Visual DNA. Therefore, the consumer does not need to specify his search string with a detailed textual description of his preference but can instead use the sliding bar (176) to adjust the proximity of the search results according to his emotional preference. The advantages of this search optimization are that not only is it fast and accurate, but it also saves the user the effort of spelling out every aspect of his preference in the search string, which is inefficient and time consuming.

[0063] Behind the scenes, the search engine optimization module that is embedded in the search and match engine (92) of the commerce system uses the consumer's Visual DNA code to define a 'peripheral region' according to the sliding bar setting in the consumer's browser. If the setting is broadest, the peripheral region is set to be wider, and the commerce system (92) chooses product or service items from its catalog within that wider peripheral region. Hence the selected items will have more diverse styles or structures among one another. When the setting is narrowest, the peripheral region becomes smaller; hence the items selected will be more homogeneous.

[0064] One specific implementation of this peripheral region is disclosed as follows. When statistical clustering analysis is used, the clustering algorithm produces one or more cluster centers, each associated with its statistical mean and variance. The cluster center can be used as one of the components of the emotional code or Visual DNA. In this setting, the peripheral region is the region centered at the cluster mean, with its extent or radius proportional to the cluster variance. A wide peripheral region therefore corresponds to a bigger proportionality constant: when this constant multiplies the cluster variance, it yields a bigger radius and hence a bigger peripheral region. Conversely, a narrow peripheral region corresponds to a smaller proportionality constant, yielding a smaller coverage area, and the items selected will be closer to the cluster mean and hence more homogeneous.
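As a rough illustration of paragraphs [0063] and [0064], the Python sketch below selects catalog items inside a peripheral region whose radius is the sliding bar's proportionality constant multiplied by the cluster variance. It assumes that items and the consumer's cluster center are represented in the same numeric feature space; the mapping from slider position to constant is an arbitrary choice for illustration.

    import math

    def select_items(items, cluster_mean, cluster_variance, slider):
        """items: list of (item_id, feature_vector) pairs; slider in [0, 1],
        0 = narrowest, 1 = broadest position of sliding bar (176)."""
        # The slider chooses a proportionality constant; as described in [0064],
        # multiplying it by the cluster variance yields the region radius.
        k_min, k_max = 0.5, 3.0                  # assumed range of the constant
        k = k_min + slider * (k_max - k_min)
        radius = k * cluster_variance
        selected = []
        for item_id, features in items:
            dist = math.dist(features, cluster_mean)   # distance to the cluster mean
            if dist <= radius:
                selected.append((item_id, dist))
        # Items closest to the cluster mean come first: the most homogeneous choices.
        return [item_id for item_id, _ in sorted(selected, key=lambda t: t[1])]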

[0065] Since the consumer may not take part in surveys covering all domains, the Visual DNA for domains with no survey data may not be available (the missing data problem). One way to overcome this problem is to check the category this consumer belongs to, find out, for each missing domain, which Visual DNA code most users in that category have, and assign that code to this consumer as an initial guess. As a specific example, the consumer may have answered only the survey on sport, and not the vacation domain, so he has a Visual DNA code for sport (e.g., he likes 'winter sport'). The analysis result shows that most people who like winter sport would like to go to a skiing resort for vacation, so he is assigned the Visual DNA code 'skiing resort' in the vacation category even though there is no survey data on 'vacation' for him. When this consumer pays multiple visits to one or more vacation sites, the vacation products or services that he selects are sent back to the computerized emotional profiling server (35) in the form of 'click-through' data via data communication channel (23c). The latter can invoke the analysis module (44) to analyze this click-through data and obtain a more accurate emotional profile of this consumer in the vacation domain. This may result in a new Visual DNA code for this consumer, which is passed back to the consumer in the form of a new 'know-me' package.
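A minimal sketch of the initial-guess step in [0065] is given below; the data layout (per-user dictionaries of domain codes) is assumed purely for illustration.

    # Impute a missing-domain Visual DNA code from the most common code among
    # users in the same category, as described in [0065].
    from collections import Counter

    def impute_missing_code(user_codes, category_users_codes, domain):
        """user_codes: {domain: code} for this consumer.
        category_users_codes: list of {domain: code} dicts for other users in
        the same category.  Returns a code for `domain`, falling back to the
        category's most common code when the consumer has no survey data."""
        if domain in user_codes:
            return user_codes[domain]
        tally = Counter(codes[domain] for codes in category_users_codes if domain in codes)
        if not tally:
            return None                      # no basis for an initial guess
        return tally.most_common(1)[0][0]    # e.g. 'skiing resort' for 'vacation'

    # Example from the text: the consumer answered only the 'sport' survey.
    consumer = {"sport": "winter sport"}
    peers = [{"sport": "winter sport", "vacation": "skiing resort"},
             {"sport": "winter sport", "vacation": "skiing resort"},
             {"sport": "winter sport", "vacation": "beach resort"}]
    print(impute_missing_code(consumer, peers, "vacation"))   # -> 'skiing resort'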

Hardware and Software Implementation Details

[0066] The system block diagram shown in Fig. 1 depicts one embodiment of the present invention which employs a distributed computing architecture, where the profiling server (35), the enterprise system (91), the search and match engine (92) and the end-user system (93) are physically located apart from each other. Each of these systems can be a data processing system (800) as shown in Fig. 9. This system (800) consists of both hardware (801) and software components (802) that can be used to implement the embodiment of the present invention. The hardware components further comprise a Central Processing Unit (CPU) (810), memory (811), storage (812), and multiple interfaces such as the peripheral interface (813), network interface (814), input interface (815) and output interface (816).

[0067] The CPU (810) can be a single microprocessor or multiple processors combined. Memory (811) can include read-only memory, random-access memory or other memory technologies, singly or in combination. Storage (812) typically includes persistent storage such as magnetic hard disks, floppy disks, optical storage devices such as CD-ROM, and semiconductor storage devices such as flash memory cards, or other storage technologies, singly or in combination. The network interface (814) enables the data processing device (800) to exchange information with external data communication networks such as a Personal Area Network (PAN), a Local Area Network (LAN), a Wide Area Network (WAN), the Internet, and other data communication network architectures, upon which the data communication channel is established. The network interface (814) can include an Ethernet interface, a wireless LAN interface device, a Bluetooth interfacing device and other networking devices, singly or in combination.

[0068] Software (802) further includes the operating system (850) and one or more software implementations of the systems shown in Fig. 1.

[0069] It should be understood by those skilled in the art that the division between hardware and software is a conceptual division for ease of understanding and is somewhat arbitrary. Furthermore, the application software systems may be executed in a distributed computing environment. The software program and its related databases can be stored in a separate file server or database server and transferred to the local host for execution. The data processing device (800) shown in Fig. 9 is therefore one exemplary embodiment of how the invention can be implemented; those skilled in the art will appreciate that alternative embodiments can be adopted to implement the present invention.

[0070] The preferred embodiments of the present invention are thus fully described. Although the description referred to particular embodiments, it will be clear to one skilled in the art that the present invention may be practiced with variations of these specific details. Hence this invention should not be construed as limited to the embodiments set forth herein.

[0071] For example, in the system block diagram shown in Fig. 1, the enterprise system (91) may host its own search and match engine (92), so both subsystems can run on a single computer. It is clear, however, that the computerized emotional profiling server (35) may also host the search and match engine for other merchandisers to use. In other cases, the end user may not have a powerful computing device to act as the end-user system (93); in that case, the profiling server (35), upon receiving an instruction from the user, can send the 'know-me' package directly to the search and match engine (92). Hence, it is clear to one skilled in the art that one or more subsystems can be co-located in a single computer system. The block diagram in Fig. 1 teaches a flexible framework for how the emotional profiling system (22) can be implemented, and any variation of implementation based on the teaching of this framework therefore also falls within the scope of this invention.

[0072] In addition, it is clear to those skilled in the art that there are many ways to interconnect the computerized emotional profiling server (35) with the many different kinds of computing devices that the user or designer uses. For example, the computing device can be, but is not limited to, an information kiosk, a personal computer, desktop computer, laptop computer, palm-top, PDA, data tablet, smart-phone, interactive TV, Internet TV or any other device that can prompt the user with a message, solicit a response, and connect to the data communication channel. The data communication network can be, but is not limited to, the Internet (25), a cellular wireless network, a wired local area network (LAN), a wireless LAN, wired or wireless metropolitan area networks, or any combination of the above.

[0073] Moreover, the search engine optimization system (46) may be implemented as a downloadable module to be installed in a third-party computer system as a specialized search engine. For example, it can replace the traditional search engine of an on-line shopping site.

[0074] Fig. 5 discloses a specific example of the user interface when the user is responding to the survey. Although only a positive response and a negative response are shown in this figure, it is obvious to those skilled in the art that other kinds of responses could be displayed for the user to choose. For example, a "don't know" option can be added, together with a text box for the user to enter his comments.

[0075] The keyword analysis method described above, which assigns an emotional code to a user, is preferably performed by an automated process that mimics a human expert's reasoning and decision-making process. While expert system and neural network approaches have been mentioned previously, other Artificial Intelligence technologies can also be used. These include, but are not limited to, biologically-inspired computing, fuzzy logic, or commonsense reasoning. Those skilled in the art can also adopt other Artificial Intelligence techniques, based on the teaching of this invention, to develop other automated methods.
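By way of illustration only, the toy Python sketch below tallies the keywords attached to the images a user selected and applies hand-written rules in place of the expert system; the keywords, thresholds and category labels are invented and are not part of the disclosure.

    # Toy sketch of the keyword analysis in [0075]: tally keywords from the
    # user's selected images, then assign a category with simple rules.
    from collections import Counter

    def tally_keywords(selected_objects):
        """selected_objects: list of keyword lists, one per selected image."""
        counts = Counter()
        for keywords in selected_objects:
            counts.update(keywords)
        return counts

    def categorize(counts):
        """Hand-written rules standing in for the expert system (labels invented)."""
        if counts["adventurous"] + counts["outdoor"] >= 3:
            return "EXPLORER"
        if counts["cosy"] + counts["traditional"] >= 3:
            return "HOMEMAKER"
        return "BALANCED"

    selections = [["cosy", "warm"], ["traditional", "cosy"], ["cosy"]]
    print(categorize(tally_keywords(selections)))   # -> 'HOMEMAKER'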

[0076] While the aforementioned paragraphs use an on-line furniture shopping scenario to teach how the emotional code or Visual DNA can be used to overcome the limitations of traditional search engines, the underlying invention can be applied to many other scenarios. Hence, instead of a consumer searching the products or services of an on-line site, the emotional profiling system (22) can be generalized to retrieving, searching or matching operations between two entities, where an entity can be a user, a product, or a service, as mentioned in the first paragraph of the detailed description section. In one such scenario, the emotional profiling system (22) may be configured for a search entity to find a list of database entities that have a similar emotional code. When both the search and the database entities are human beings, the system matches people with similar emotional profiles. When the search entity is a product or service from a merchandiser and the database entities are human users, it is a direct marketing application. When both the search and database entities are products or services, the system can identify a suite of offerings for users having a particular Visual DNA code. Similarly, the same principle applies in a matching application, whereby the system returns a list of database entities whose emotional code exactly matches that of a match entity.
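A generalized search/match operation of this kind can be sketched as follows; the representation of the emotional code as a plain string label, and the labels themselves, are assumptions made purely for illustration.

    # Sketch of the generalized matching operation in [0076]: any entity
    # (user, product or service) carries an emotional code, and the same
    # routine serves people matching, product search and direct marketing.
    from dataclasses import dataclass

    @dataclass
    class Entity:
        name: str
        kind: str            # "user", "product" or "service"
        emotional_code: str  # Visual DNA label (illustrative strings)

    def match(search_entity, database):
        """Return database entities whose emotional code exactly matches that
        of the search entity (the matching application described above)."""
        return [e for e in database if e.emotional_code == search_entity.emotional_code]

    # Direct-marketing case: the search entity is a service, the database
    # entities are human users.
    people = [Entity("Ann", "user", "EXPLORER"), Entity("Bob", "user", "HOMEMAKER")]
    offer = Entity("ski holiday", "service", "EXPLORER")
    print([e.name for e in match(offer, people)])   # -> ['Ann']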

[0077] In addition, while an e-commerce scenario is given here, the emotional profiling system (22) can be applied to much broader areas, for example between an information seeker and an information provider, where the latter can be a government institution, a public library, or any other similar organization. When all the entities are tagged with the Visual DNA code, this code becomes a universal, machine-readable language that codifies human emotion. Those skilled in the art can make use of this code, and the teaching disclosed in this invention, to make existing search and match operations much more efficient.