
Title:
SYSTEM AND METHOD FOR VIRTUALLY TRYING-ON CLOTHES
Document Type and Number:
WIPO Patent Application WO/2022/137245
Kind Code:
A1
Abstract:
This invention is directed to a system (110) and method for virtually trying-on clothes. This invention allows a user to virtually predict the fitness and appearance of clothes when worn. The user inputs one or more body metrics, including gender, height, and weight of the user into the system. Furthermore, the user inputs a front pose (610) and a selfie photograph, wherein the photograph includes face and body of the user. The system, based on the input metrics and the photographs, generates a 3D avatar (700) of the user using augmented reality and computer vision. Thereafter, 3D clothes are superimposed on the 3D avatar such that the user can visualize the clothes as worn over the 3D avatar in 360 degrees by rotating the 3D avatar along the vertical axis of the 3D avatar.

Inventors:
CHHAPOLIA SAKSHI (IN)
KUMAR ANKIT (IN)
Application Number:
PCT/IN2020/051050
Publication Date:
June 30, 2022
Filing Date:
December 24, 2020
Assignee:
CHHAPOLIA SAKSHI (IN)
KUMAR ANKIT (IN)
International Classes:
G06T19/00; G06F3/01; G06Q30/06; G06T15/00
Foreign References:
US10664903B12020-05-26
Other References:
TEXEL: "Texel - Virtual try-on in store", YOUTUBE, XP055952770, Retrieved from the Internet
Attorney, Agent or Firm:
BANSAL, Rohit (IN)
Claims:
CLAIMS

We claim:

1. A computer implemented method for virtually trying-on clothes, the method comprising:
a. generating one or more 3D clothes;
b. receiving, from a user, user-metrics, the user-metrics comprising height of the user and weight of the user;
c. receiving, from the user, a first photograph, the first photograph depicting a front side of the user;
d. receiving, from the user, a second photograph, the second photograph depicting a selfie of the user;
e. generating a 3D avatar based on the user-metrics, the first photograph and the second photograph; and
f. applying the one or more 3D clothes to the 3D avatar.

2. The computer implemented method of claim 1, wherein the user-metrics further comprise gender of the user.

3. The computer implemented method of claim 1, further comprising displaying the 3D avatar to the user after step f.

4. The computer implemented method of claim 1, further comprising allowing the user to rotate the 3D avatar in 360 degrees after step f.

5. The computer implemented method of claim 1, wherein the front side comprises body and face.

6. A virtual trying-on system for clothes, the virtual trying-on system comprising: a system (110) in electronic communication (150) with one or more user access devices (140), the system (110) comprising: one or more processors; and a non-transitory, computer-readable medium comprising a set of instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising:
a) retrieving one or more 3D clothes from a clothes database (120);
b) receiving, from a user, user-metrics, the user-metrics comprising height of the user and weight of the user;
c) receiving, from the user, a first photograph, the first photograph depicting a front side of the user;
d) receiving, from the user, a second photograph, the second photograph depicting a selfie of the user;
e) generating a 3D avatar based on the user-metrics, the first photograph and the second photograph; and
f) applying the one or more 3D clothes to the 3D avatar.

7. The virtual trying-on system of claim 6, wherein the operations further comprise displaying the 3D avatar to the user after step f.

8. The virtual trying-on system of claim 6, wherein the operations further comprise allowing the user to rotate the 3D avatar in 360 degrees after step f.

9. The virtual trying-on system of claim 6, wherein the user access device is a desktop computer.

10. The virtual trying-on system of claim 6, wherein the user access device is a smartphone.


Description:
SYSTEM AND METHOD FOR VIRTUALLY TRYING-ON CLOTHES

FIELD OF INVENTION AND USE OF INVENTION

The invention relates to a virtual clothing modeling system and a computer-implemented method, more particularly, the invention relates to a system and a method for virtually trying on clothes.

PRIOR ART AND PROBLEM TO BE SOLVED

E-commerce, also known as electronic commerce, refers to the buying or selling of goods or services over the internet. E-commerce has boosted the buying and selling of goods and services by eliminating theoretical geographical limitations. Its advantages include a faster buying/selling procedure, the ability to place orders 24 hours a day and 7 days a week, and the ability for customers to easily select products from different providers without moving around physically.

In the case of clothes, a consumer often travels to a store and tries on several articles of clothing in a trial room. For example, the consumer tries on the clothes to be assured that they will fit properly, and to see how the clothes they wish to buy look when worn. Trying on clothes thus becomes a major limitation in the online buying or selling of clothes: the consumer has to rely on the photographs of the clothes and the measurement scales provided by the seller. Overall, the clothes-buying experience of consumers is generally unsatisfactory. Sellers often allow consumers to return clothes that do not fit properly or are not liked; however, returns place an extra burden on sellers and shrink their profits.

Properly addressing the issues with selling or buying clothes online would improve the efficiency of the retailer's business, increase consumer satisfaction and experience, and positively impact the carbon footprint if fewer items are returned.

Various approaches are known for improving the reliability of online retailing for customers. For example, a US Patent Application, Pub. No. US20120299912A1 discloses a method to help a user visualize how a wearable article will look on the user's body. The method includes creating a head and body model of the user based on depth maps and thereafter creating a three-dimensional avatar based on the model.

The methods of the prior art suffer from one or more disadvantages, such as being too complex in processing or requiring a complex setup. Another major disadvantage of the prior-art methods is that they do not actually allow a user to predict whether the clothes will fit or how the user will look wearing them. Generally, the clothes are applied to the front side of the model rather than actually worn over the model. Thus, the user does not get a fair idea of the fit of the clothing.

Thus, considering the advantages of e-commerce, a method which allows users to virtually and more accurately predict the fitness and appearance of clothes when worn is needed.

The term clothes herein connotes all forms of wearable articles of clothing and includes apparels and garments. Moreover, the terms clothes, apparel and garment may be used interchangeably.

The term avatar herein connotes a 3D model of a user in accordance with one or more embodiments of the invention.

OBJECTS OF THE INVENTION

An objective of this invention is, therefore, to allow users to virtually and precisely predict the fitness and appearance, when worn, of the clothes they wish to buy.

Another objective of this invention is to allow the users to rotate a 3D model of the user 360 degrees for visualizing the clothes superimposed on the 3D model.

Yet another objective of this invention is to be simpler and more economical to execute.

SUMMARY OF THE INVENTION

Certain embodiments of this invention are directed to a system and method for virtually trying-on clothes. This invention allows a user to virtually and precisely predict the fitness and appearance of clothes on their bodies. The user inputs one or more body metrics into the system. The body metrics include gender, height, and weight of the user. Furthermore, the user inputs a front pose and a selfie photograph, wherein the photograph includes face and body of the user. The system, based on the input metrics and photographs, generates a 3D avatar of the user using augmented reality and computer vision. Thereafter, 3D clothes are superimposed on the 3D avatar such that the user can visualize the clothes as worn over the 3D avatar in 360 degrees by rotating the 3D avatar along the vertical axis of the 3D avatar.

BRIEF DESCRIPTION OF DRAWINGS

The accompanying figures, which are incorporated herein, form part of the specification and illustrate embodiments of this invention. Together with the description, the figures further serve to explain the principles of this invention and to enable a person skilled in the relevant arts to make and use the invention.

Fig. 1 is a block diagram showing components of the virtual try-on system, in accordance with an exemplary embodiment of this invention.

Fig. 2 is a flow chart showing a method of providing a clothes database, in accordance with an exemplary embodiment of this invention.

Fig. 3 is a flow chart showing a method of generating a 3D avatar in accordance with an exemplary embodiment of this invention.

Fig. 4 is a flow chart showing a method of visualizing 3D clothes on a 3D avatar in accordance with an exemplary embodiment of this invention.

Fig. 5 is an embodiment of clothes as displayed to the user, for example in a catalogue or website.

Fig. 6 shows an embodiment of an interface for inputting the front and selfie photographs by the user.

Fig. 7 shows rotation of the 3D avatar in accordance with an exemplary embodiment of this invention.

DETAILED DESCRIPTION OF INVENTION

Certain embodiments of this invention are directed to a virtual trying-on system and a computer implemented method.

Subject matter will now be described more fully hereinafter with reference to the accompanying drawings, which form a part hereof, and which show, by way of illustration, specific exemplary embodiments. Subject matter may, however, be embodied in a variety of different forms and, therefore, covered or claimed subject matter is intended to be construed as not being limited to any exemplary embodiments set forth herein; exemplary embodiments are provided merely to be illustrative. Likewise, a reasonably broad scope for claimed or covered subject matter is intended. Among other things, for example, the subject matter may be embodied as methods, devices, components, or systems. The following detailed description is, therefore, not intended to be taken in a limiting sense.

The word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any embodiment described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments. Likewise, the term “embodiments of the present invention” does not require that all embodiments of the invention include the discussed feature, advantage or mode of operation. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of embodiments of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises”, “comprising”, “includes” and/or “including”, when used herein, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.

Embodiments of the invention may be implemented in one or a combination of hardware, firmware, and software. Embodiments of the invention may also be implemented as instructions stored on a machine-readable medium, which may be read and executed by a computing platform to perform the operations described herein. A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer). For example, a machine-readable medium may include read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.), and others.

Unless specifically stated otherwise, and as may be apparent from the following description and claims, it should be appreciated that throughout the specification descriptions utilizing terms such as “processing,” “computing,” “calculating,” “determining,” or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities within the computing system's registers and/or memories into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices.

Embodiments of this invention may also include apparatuses and systems for performing the operations described herein. An apparatus or system may be specially constructed for the desired purposes, or it may comprise a general-purpose device selectively activated or reconfigured by a program stored in the device.

The following detailed description includes the best currently contemplated mode or modes of carrying out exemplary embodiments of the invention. The description is not to be taken in a limiting sense but is made merely for the purpose of illustrating the general principles of the invention, since the scope of the invention will be best defined by the allowed claims of any resulting patent.

Referring to Fig. 1, which shows an embodiment of this invention, one or more user access devices 140 and a system 110 are connected through a communication network 150. Further shown in Fig. 1 are databases 120 and 130 connected to the system 110. The database 120 comprises 3D clothes, and the database 130 comprises user profiles, each user profile comprising the 3D avatar of a user.

In one case, the user access device 140 is a computing system having a display and an input means. For example, the user device can be a wireless handheld device such as a mobile phone, smart phone, PDA or the like. The user device can also be a desktop computing device, such as a laptop or other personal computer or the like.

The communication network 150 can be a wired connection or a wireless connection. Communications can be achieved via one or more networks, such as, but not limited to, one or more of WiMax, a Local Area Network (LAN), Wireless Local Area Network (WLAN), a Personal area network (PAN), a Campus area network (CAN), a Metropolitan area network (MAN), a Wide area network (WAN), a Wireless wide area network (WWAN), Global System for Mobile Communications (GSM), Personal Communications Service (PCS), Bluetooth, Wi-Fi, 2G, 2.5G, 3G, 4G, IMT-Advanced, pre-4G, 3G LTE, 3GPP LTE, LTE Advanced, mobile WiMax, WiMax 2, or Wireless MAN-Advanced networks. In addition, communications to and from the user access devices 140 can be achieved by an open network, such as the Internet, or a private network, such as an intranet and/or an extranet. In one exemplary embodiment, communications can be achieved by a secure communications protocol, such as secure sockets layer (SSL) or transport layer security (TLS).

The system 110 is a computing system comprising a memory and a processor. The processor may include one or more processor cores to execute the instructions of the system 110. In various embodiments, the processor may include any of various commercially available processors, including but not limited to an AMD® Athlon®, Duron® or Opteron® processor; an ARM® application, embedded or secure processor; an IBM® and/or Motorola® DragonBall® or PowerPC® processor; an IBM and/or Sony Cell processor; or an Intel® Celeron®, Core (2) Duo®, Core (2) Quad®, Core i3®, Core i5®, Core i7®, Atom®, Itanium®, Pentium®, Xeon® or XScale® processor.

The memory includes a machine-readable medium on which is stored one or more sets of data structures and instructions (e.g., software) embodying or utilized by any one or more of the methodologies or functions described herein. The instructions may also reside, completely or at least partially, within the main memory, static memory, and/or within the processor during execution thereof by the computer system, with the main memory, static memory, and the processor also constituting machine-readable media.

While the memory is illustrated in an exemplary embodiment to be a single medium, the term “memory” may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more instructions. The term “machine-readable medium” shall also be taken to include any tangible medium that is capable of storing, encoding or carrying instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure, or that is capable of storing, encoding or carrying data structures utilized by or associated with such instructions. The term “machine-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, and optical and magnetic media. Specific examples of machine-readable media include non-volatile memory, including but not limited to, by way of example, semiconductor memory devices (e.g., electrically programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM)) and flash memory devices; magnetic disks such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.

Now referring to Fig. 2, which illustrates a method 200 of generating and storing 3D clothes. The clothes are generated in 3D using any commercially known algorithm. Various parameters, such as the size and fitting of the clothes, are annexed to the 3D clothing. The 3D clothes are stored in the clothes database 120, which is connected to the system 110. It is to be understood that although the database 120 is shown as connected to the system 110, the database 120 can be a part of the system 110 itself or externally connected to it.
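The storage step of method 200 can be sketched as follows. This is a minimal illustration only: the patent does not specify a schema or storage engine, so the table name, columns and serialized mesh placeholder below are all assumptions.

```python
import sqlite3

# Illustrative sketch of the clothes database 120: each 3D garment record
# carries the size and fitting parameters annexed to the 3D clothing.
# Schema and column names are hypothetical, not taken from the patent.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE clothes_3d (
           id   INTEGER PRIMARY KEY,
           name TEXT NOT NULL,
           size TEXT NOT NULL,   -- e.g. 'M', 'L'
           fit  TEXT NOT NULL,   -- e.g. 'slim', 'regular'
           mesh BLOB NOT NULL    -- serialized 3D mesh data (placeholder)
       )"""
)
conn.execute(
    "INSERT INTO clothes_3d (name, size, fit, mesh) VALUES (?, ?, ?, ?)",
    ("denim jacket", "M", "regular", b"\x00mesh-bytes"),
)
conn.commit()

# A listing (Fig. 5) would later be linked back to such a record by id/size.
row = conn.execute(
    "SELECT name, size, fit FROM clothes_3d WHERE size = ?", ("M",)
).fetchone()
```

An in-memory database is used here only so the sketch is self-contained; the text notes the database can equally be internal to the system 110 or external.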

Referring to Fig. 3, which shows a method 300 for generating a 3D avatar of the user. At step 310, the user logs into the user access device 140. For example, the user may enter his credentials, including a username and password. On logging into the system 110, the user can be provided with an interface on the display of the user access device 140. The interface may allow the user to input one or more parameters as requested by the system 110. At step 320, the system 110 requests the user's gender. The form may include a drop-down list of possible answers, such as male, female and the like, from which the user may select his gender. At step 330, the system 110 receives the height of the user. For example, the user may select “cm” as the unit from a drop-down list and input “167” as the height. At step 340, the user inputs his weight along with the corresponding unit of weight. For example, the user may select “pounds” from a drop-down list and input 70 as his weight. At step 350, the user can upload his front pose photograph, i.e., a photograph of the front side of the user including his body and face. Alternatively, the user can be provided with an option to capture the photograph using a camera of the user access device 140. At step 360, the user uploads another photograph, his selfie. Fig. 6 illustrates an embodiment of the interface presented by the system 110 to the user for requesting the front and selfie photographs. The first pose in the figure is a front pose 610 and the second pose is a left selfie 620.
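Since steps 320-340 accept height and weight in user-selected units, a back end would plausibly normalize them before avatar generation. The sketch below shows one way to do this; the `UserMetrics` class and `normalize_metrics` function are hypothetical helpers, not part of the patent, though the conversion factors are standard.

```python
from dataclasses import dataclass

CM_PER_INCH = 2.54       # standard conversion
KG_PER_POUND = 0.453592  # standard conversion

@dataclass
class UserMetrics:
    """Normalized user-metrics of steps 320-340 (hypothetical structure)."""
    gender: str
    height_cm: float
    weight_kg: float

def normalize_metrics(gender, height, height_unit, weight, weight_unit):
    """Convert user-entered height/weight to metric units (cm, kg)."""
    if height_unit == "in":
        height = height * CM_PER_INCH
    if weight_unit == "pounds":
        weight = weight * KG_PER_POUND
    return UserMetrics(gender=gender, height_cm=height, weight_kg=weight)

# The worked example from the text: height 167 cm, weight 70 pounds.
m = normalize_metrics("male", 167, "cm", 70, "pounds")
```

Normalizing once at intake keeps the avatar-generation step (370) free of unit handling, whatever units the drop-down lists offered.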

On successful upload of the photographs by the user, the empty outlines illustrating a body are filled with the photographic details of the user. Referring back to Fig. 3, at step 370, the system 110 renders the details obtained above from the user to generate a 3D avatar of the user. The 3D avatar is a simulated 3D model of the user, rendered using augmented reality and computer vision, and having a face similar to the face of the user. The body of the 3D avatar resembles the body metrics of the user. At step 380, the 3D avatar is stored in the user profile database 130 and can be retrieved later by the system 110. Alternatively, the 3D avatar is stored only in the user access device 140; for example, it can be stored in the cache memory of the user access device 140. Storing the 3D avatar in the user access device 140 may be required when the user does not want to share or save his details on an external server.
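The storage alternative at step 380 amounts to a branch on the user's privacy preference. A toy sketch, with the user-profile database 130 and the device cache both modeled as plain dictionaries (an assumption purely for illustration):

```python
# Stand-ins for the two stores described at step 380: the user-profile
# database 130 on the server side, and the cache memory of the user
# access device 140. Both are plain dicts here for illustration only.
server_profiles = {}  # user profile database 130
device_cache = {}     # cache of the user access device 140

def store_avatar(user_id, avatar, share_with_server):
    """Persist the 3D avatar according to the user's privacy preference."""
    if share_with_server:
        server_profiles[user_id] = avatar   # retrievable later by system 110
    else:
        device_cache[user_id] = avatar      # never leaves the device

# A user who declines to share details with an external server:
store_avatar("user-1", {"mesh": "..."}, share_with_server=False)
```

The retrieval step of method 400 would then check the same preference to decide which store to read from.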

Now referring to Fig. 4, shown is a method 400 for visualizing clothes superimposed on the 3D avatar. The users, by visualizing the clothes on their 3D avatar, can predict how the corresponding clothes will fit them and how they will look wearing them. At step 410, the user may log into the user access device 140, as discussed for step 310. The user, once logged into the system 110 at step 310, need not log in again at step 410 unless the user had logged out of the system 110. At step 420, the user may browse through a catalog of the seller. For example, the seller may list the clothes on their website, and the user can browse the clothes by accessing the website. Photographs of the clothes are listed with information such as size, color, fitting, etc. Fig. 5 illustrates one such listing, which is a photograph of the front side of clothes. Each of the listings of clothes is linked to corresponding 3D clothes.

On selection of clothes for visualization by the user at step 420, the system 110 retrieves the 3D clothes linked to the selected clothes at step 430. Thereafter, at step 440, the system 110 retrieves the 3D avatar of the user from the user profiles database 130 or the cache memory of the user access device 140, as the case may be. At step 450, the system 110, using augmented reality, superimposes the 3D clothes over the 3D avatar, such that the 3D avatar is wearing the clothes. At step 460, the 3D avatar is displayed to the user on their user access device 140. The user is also provided with suitable controls for rotation of the avatar at step 470. Step 470 is further illustrated in Fig. 7, which shows the 3D avatar. The user, through the controls, can rotate the 3D avatar along its vertical axis through 360 degrees. First, the user is presented with the front side of the avatar, as shown in Fig. 7A. The user thereafter rotates the avatar clockwise 90 degrees, as shown in Fig. 7B. The user further rotates the avatar 90 degrees along its vertical axis (the height of the avatar), as shown in Fig. 7C. Similarly, the user can further rotate the avatar through a full 360 degrees.
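The rotation at step 470 is a standard yaw rotation about the avatar's vertical (y) axis: a turn by angle t maps a vertex (x, y, z) to (x·cos t + z·sin t, y, -x·sin t + z·cos t). A minimal sketch, with the avatar reduced to a bare vertex list (a real renderer would transform the whole mesh and its applied 3D clothes together):

```python
import math

def rotate_about_vertical(vertices, degrees):
    """Rotate a list of (x, y, z) vertices about the y (vertical) axis."""
    t = math.radians(degrees)
    cos_t, sin_t = math.cos(t), math.sin(t)
    return [
        (x * cos_t + z * sin_t, y, -x * sin_t + z * cos_t)
        for (x, y, z) in vertices
    ]

# Four quarter turns, as in the Fig. 7 sequence continued to a full 360
# degrees, bring every vertex back to its starting position.
front = [(1.0, 2.0, 0.0)]  # a single sample vertex of the avatar
turned = front
for _ in range(4):
    turned = rotate_about_vertical(turned, 90)
```

Note the y-coordinate (the avatar's height axis) is left untouched, which is exactly what rotation "along the vertical axis" requires.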

While the foregoing written description of the invention enables one of ordinary skill to make and use what is considered presently to be the best mode thereof, those of ordinary skill will understand and appreciate the existence of variations, combinations, and equivalents of the specific embodiment, method, and examples herein. The invention should therefore not be limited by the above-described embodiment, method, and examples, but by all embodiments and methods within the scope and spirit of the invention as claimed.