Title:
A SYSTEM FOR MANUFACTURING PERSONALIZED PRODUCTS BY MEANS OF ADDITIVE MANUFACTURING DOING AN IMAGE-BASED RECOGNITION USING ELECTRONIC DEVICES WITH A SINGLE CAMERA
Document Type and Number:
WIPO Patent Application WO/2018/130291
Kind Code:
A1
Abstract:
A method is proposed for generating a personalized eyewear frame model, comprising the steps of: acquiring a real-time video of a user's face; taking into account eyewear design parameters selected by the user; generating a plurality of specific points matching the face of the user; recognizing face parameters and calculating measurements from the specific points; loading an eyewear frame model matching the face parameters, measurements and the selected design parameters; displaying a real-time augmented reality video including the real-time video of the user's face and the loaded frame embedded into it; sensing facial expression from the user's face; calculating a degree of satisfaction; repeating the previous steps until the degree of satisfaction is greater than a threshold; selecting the corresponding eyewear frame model; creating a specific file of the selected eyewear frame model that can be used as is by a manufacturing module.

Inventors:
MIRANDA ORTE JOSÉ MARIA (ES)
RODRIGUEZ DE LA PENA MARCOS (ES)
BRAGADO HERNANDO ALEJANDRO (ES)
FERNANDEZ MARTINEZ JESUS (ES)
Application Number:
PCT/EP2017/050615
Publication Date:
July 19, 2018
Filing Date:
January 12, 2017
Assignee:
ATOS SPAIN SA (ES)
International Classes:
G06T19/00; G06K9/00
Foreign References:
US20150055085A12015-02-26
US20140050408A12014-02-20
US20150127132A12015-05-07
Other References:
G S SHERGILL ET AL: "COMPUTERIZED SALES ASSISTANTS: THE APPLICATION OF COMPUTER TECHNOLOGY TO MEASURE CONSUMER INTEREST - A CONCEPTUAL FRAMEWORK", JOURNAL OF ELECTRONIC COMMERCE RESEARCH, vol. 9, no. 2, 7 May 2008 (2008-05-07), California State University, pages 176 - 191, XP055240869, Retrieved from the Internet [retrieved on 20160113]
Attorney, Agent or Firm:
LOUISET, Raphaël et al. (FR)
Claims:
CLAIMS

1. A computing method for generating a three dimensional personalized eyewear frame model, comprising the steps of:

- acquiring a real-time video of a user's face, composed of a plurality of images,

- taking into account eyewear design parameters selected by the user,

- for each image of the video, generating a plurality of specific points matching the face of the user,

- recognizing face parameters and calculating measurements from the specific points,

- loading an eyewear frame model matching the face parameters and measurements and the selected design parameters,

- displaying a real-time augmented reality video including the real time video of the user's face and the loaded frame embedded into it in a normal wearing position according to the face parameters and measurements,

- sensing facial expression from the user's face within the video,

- calculating a degree of satisfaction from the facial expression,

- repeating the previous steps until the degree of satisfaction is greater than a threshold,

- selecting the corresponding eyewear frame model,

- creating a specific file of the selected eyewear frame model that can be used as is by a manufacturing module such as a three dimensional printer.

2. Method for manufacturing an eyewear frame product comprising the steps of the method of claim 1, and further comprising a step of manufacturing an eyewear frame product according to the generated specific file.

3. Method according to claim 1 or 2, wherein the number of generated specific points is 67.

4. Method according to any of claims 1 to 3, wherein the face parameters are chosen among the following list: positions of the pupils, position of the tip of the nose, position of the mouth, positions of the end-parts of the cheeks, positions of the ears, position of the tip of the chin, positions of the forehead extremities.

5. Method according to any of claims 1 to 4, wherein the measurements are chosen among the following list: distance between pupils, distance between pupils line and nose, distance between the pupils line and the upper lip, distance between the pupils line and the top of the forehead, width of jaw at the level of the upper lip line, width of face at the level of the pupils line, length of the face, frontal nose angle, width of nose at the level where the eyewear frame is supported, the height of ears, lateral distance between the line of the nose where the eyewear frame is supported, position of the upper anchoring of ear to the skull.

6. A system for generating a three dimensional personalized eyewear frame model, comprising:

a server to store data-mining market-trend studies and a library of eyewear frame models,

a customization interface allowing a user to select design parameters such as a color or a texture,

a computer vision module comprising a single camera to acquire a real-time video of the user's face composed of a plurality of images, and a processing module programmed for:

o identifying specific points on each image of the real-time video,

o recognizing the face parameters,

o calculating measurements on the face,

o detecting variations of the positions of the specific points,

o calculating a degree of satisfaction of the user according to these variations,

a parametric optimization module programmed for:

o taking into account the facial parameters, the degree of satisfaction of the user, and qualitative information such as market trends or user feedback,

o loading an eyewear frame model matching the face parameters and measurements and the selected design parameters,

a computer-aided design module programmed for creating specific eyewear frame models for each image and completing the model library with these models,

an augmented reality module programmed for displaying a generated frame model on the image of the user's face.

7. A system for manufacturing a personalized eyewear frame product comprising the system of claim 6 and a manufacturing module to manufacture an eyewear frame according to an eyewear frame model file.

Description:
A system for manufacturing personalized products by means of additive manufacturing doing an image-based recognition using electronic devices with a single camera

FIELD OF THE INVENTION

The present invention relates to a system that makes possible the creation of personalized parts by additive manufacturing techniques using electronic devices and only a single camera for the acquisition. In particular, this invention relates to an augmented reality system for generating a personalized eyewear frame model for subsequent manufacturing, allowing the user to visualize himself wearing the modeled eyewear frame.

BACKGROUND OF THE INVENTION

The present invention focuses on the personalization of eyewear frames, allowing the user to visualize himself wearing a personalized eyewear frame in real time. It relates to a system built from blocks that guide the user from the recognition of his personal metrics, through the parametric optimization of the chosen geometry, to the manufacturing of the eyewear frame by three-dimensional printing.

As far as real-time data acquisition is concerned, using a single camera to obtain as many control points as required is a challenge. A calibration mode must be developed for measuring the filmed object, whereby the focal distance may be accurately calculated.

Augmented reality is commonly used to render real textures of filmed objects, overlapping those objects with real-time video. This solution must withstand the challenge of approximating the real human-eye perception of textures and colors. The present invention addresses that issue by creating a three-layer texture in which the overlapping of intermediate images provides depth characteristics.
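
By way of illustration only, and not as part of the disclosed system, the following minimal sketch shows how three such texture layers could be blended back to front by alpha compositing; the layer contents and opacities are assumed values.

```python
# Minimal sketch (not the patented implementation): back-to-front alpha
# compositing of three hypothetical texture layers, where the intermediate
# layer's opacity carries the depth cue mentioned above.
import numpy as np

def composite(layers):
    """Blend RGBA float layers (values in [0, 1]) from back to front."""
    out = np.zeros_like(layers[0][..., :3])
    for layer in layers:
        rgb, alpha = layer[..., :3], layer[..., 3:4]
        out = alpha * rgb + (1.0 - alpha) * out
    return out

# Three illustrative 64x64 RGBA layers: background, intermediate (depth), detail.
h, w = 64, 64
background = np.dstack([np.full((h, w, 3), 0.2), np.ones((h, w, 1))])
intermediate = np.dstack([np.full((h, w, 3), 0.5), np.full((h, w, 1), 0.6)])
detail = np.dstack([np.full((h, w, 3), 0.9), np.full((h, w, 1), 0.3)])

result = composite([background, intermediate, detail])
print(result.shape)  # (64, 64, 3)
```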

There is no existing commercial manufacturing file format that includes every feature related not only to geometry but also to finishing details, colors and all the metadata capable of being added to a customer account. The present invention addresses that issue with a manufacturing file different from the commonly used file formats.

The specific characteristics of the present invention that make it unique are presented below:

- it takes measurements of face features in real time;

- it provides customers with the eyewear frame geometry most suitable to their faces;

- it allows users to re-design the frame geometry. In other words, users may select and modify a frame geometry by switching parameters;

- the real-time showcase of the invention gives users the possibility to vary the geometry and lets them exchange colors and textures;

- the present system permits users to fit, view and purchase products on their own.

SUMMARY OF THE INVENTION

The present invention provides a computing method for generating a three dimensional personalized eyewear frame model, comprising the steps of:

- acquiring a real-time video of a user's face, composed of a plurality of images,

- taking into account eyewear design parameters selected by the user,

- for each image of the video, generating a plurality of specific points matching the face of the user,

- recognizing face parameters and calculating measurements from the specific points,

- loading an eyewear frame model matching the face parameters and measurements and the selected design parameters,

- displaying a real-time augmented reality video including the real time video of the user's face and the loaded frame embedded into it in a normal wearing position according to the face parameters and measurements,

- sensing facial expression from the user's face within the video,

- calculating a degree of satisfaction from the facial expression,

- repeating the previous steps until the degree of satisfaction is greater than a threshold,

- selecting the corresponding eyewear frame model,

- creating a specific file of the selected eyewear frame model that can be used as is by a manufacturing module such as a three dimensional printer.

A method is also proposed for manufacturing an eyewear frame product, comprising the steps of the previous method and further comprising a step of manufacturing an eyewear frame product according to the generated specific file.

According to various features considered alone or in combination:

- the number of generated specific points is 67,

- the face parameters are chosen among the following list: positions of the pupils, position of the tip of the nose, position of the mouth, positions of the end-parts of the cheeks, positions of the ears, position of the tip of the chin, positions of the forehead extremities,

- the measurements are chosen among the following list: distance between pupils; distance between pupils line and nose; distance between the pupils line and the upper lip; distance between the pupils line and the top of the forehead; width of jaw at the level of the upper lip line; width of face at the level of the pupils line; length of the face; frontal nose angle; width of nose at the level where the eyewear frame is supported; height of ears; lateral distance between the line of the nose where the eyewear frame is supported; position of the upper anchoring of ear to the skull.

A system is also provided for generating a three dimensional personalized eyewear frame model, comprising:

- a server to store data-mining market-trend studies and a library of eyewear frame models,

- a customization interface allowing a user to select design parameters such as a color or a texture,

- a computer vision module comprising a single camera to acquire a real-time video of the user's face composed of a plurality of images, and a processing module programmed for:

o identifying specific points on each image of the real-time video,

o recognizing the face parameters,

o calculating measurements on the face,

o detecting variations of the positions of the specific points,

o calculating a degree of satisfaction of the user according to these variations,

- a parametric optimization module programmed for:

o taking into account the facial parameters, the degree of satisfaction of the user, and qualitative information such as market trends or user feedback,

o loading an eyewear frame model matching the face parameters and measurements and the selected design parameters,

- a computer-aided design module programmed for creating specific eyewear frame models for each image and completing the model library with these models,

- an augmented reality module programmed for displaying a generated eyewear frame model on the image of the user's face.

A system is also provided for manufacturing a personalized eyewear frame product, comprising the previous system and a manufacturing module for manufacturing an eyewear frame according to an eyewear frame model file.

The invention will be better understood, and other details, features and advantages of the invention will appear, on reading the following description, given by way of non-limiting examples with reference to the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG.1 illustrates specific points and two distance measurements on a user's face.

FIG.2 is an illustration of an eyewear frame model example.

FIG.3 is an illustration of all the required measurements on the user's face.

FIG.3 is an illustration of face parameters.

FIG.4 is a functional diagram presenting the flowchart of the system.

FIG.5a is an illustration of the pattern used by the computer-aided design module.

FIG.5b and FIG.5c are two examples of eyewear shapes derived from the pattern of FIG.5a.

DETAILED DESCRIPTION

The present invention refers to a system and a method for manufacturing a personalized eyewear frame product. The method provided by the system is focused on creating or selecting the eyewear frame model that best satisfies the customer.

The method for generating a three dimensional personalized eyewear frame model comprises the steps of:

- acquiring a real-time video of a user's face, composed of a plurality of images,

- taking into account eyewear design parameters selected by the user,

- for each image of the video, generating a plurality of specific points matching the face of the user,

- recognizing face parameters and calculating measurements from the specific points,

- loading an eyewear frame model matching the face parameters and measurements and the selected design parameters,

- displaying a real-time augmented reality video including the real time video of the user's face and the loaded frame embedded into it in a normal wearing position according to the face parameters and measurements,

- sensing facial expression from the user's face within the video,

- calculating a degree of satisfaction from the facial expression,

- repeating the previous steps until the degree of satisfaction is greater than a threshold,

- selecting the corresponding eyewear frame model,

- creating a specific file of the selected eyewear frame model that can be used as is by a manufacturing module such as a three dimensional printer,

- manufacturing an eyewear frame product according to the generated specific file.

In a preferred embodiment of the present invention, the number of generated specific points is 67. Positions of specific points are illustrated in FIG.1. The ensemble of points constitutes a mesh that matches the face of the user. The specific points are generated through different convergence algorithms. Once convergence is achieved, it is possible to know the position of each point.
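
The patent does not name a particular landmark algorithm. As a hedged illustration only, a publicly available 68-point detector (dlib's shape predictor, one point more than the 67 mentioned here) can produce a comparable mesh from a single-camera video frame; the library choice and model file name are assumptions, not part of the disclosure.

```python
# Illustrative sketch only: dlib's 68-point predictor is used here as a
# stand-in for the 67-point mesh described in the patent.
import cv2
import dlib

detector = dlib.get_frontal_face_detector()
# Pretrained model file distributed with dlib (assumed to be available locally).
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def landmarks_for_frame(frame):
    """Return a list of (x, y) landmark positions for the first face found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector(gray, 1)
    if not faces:
        return []
    shape = predictor(gray, faces[0])
    return [(shape.part(i).x, shape.part(i).y) for i in range(shape.num_parts)]

cap = cv2.VideoCapture(0)  # single camera, as in the patent
ok, frame = cap.read()
if ok:
    points = landmarks_for_frame(frame)
    print(f"{len(points)} specific points generated")
cap.release()
```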

Referring to FIG.3, the face parameters are chosen among the following list:

- positions of the pupils 1, 2,

- position of the tip of the nose 3,

- position of the mouth 4,

- positions of the end-parts of the cheeks 5 and 6,

- positions of the ears 7 and 8,

- position of the tip of the chin 9,

- positions of the forehead extremities 10 and 11.

The measurements are chosen among the following list:

- distance between pupils,

- distance between pupils line and nose,

- distance between the pupils line and the upper lip,

- distance between the pupils line and the top of the forehead,

- distance (d1) between the upper anchoring of the ears to the skull,

- width (d2) of the jaw at the level of the upper lip line,

- width of the face at the level of the pupils line,

- length of the face,

- frontal nose angle,

- width of the nose at the level where the eyewear frame is supported,

- height of the ears,

- lateral distance between the line of the nose where the eyewear frame is supported.

The user may select design parameters for an eyewear frame model. Generally, the types of design parameters that can be set are defined according to the ISO 8624 standard for the conventional design of an eyewear frame.
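
As an illustration only, two of the measurements listed above (distance between pupils and width of the face at the level of the pupils line) might be computed from the specific points as follows; the landmark coordinates and the pixel-to-millimetre scale are assumed values.

```python
# Hedged example: two of the listed measurements computed from hypothetical
# landmark coordinates (pixel units) and a calibration scale factor.
import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Illustrative pixel coordinates; in practice they come from the landmark mesh.
left_pupil, right_pupil = (312, 240), (388, 242)
left_cheek, right_cheek = (270, 245), (430, 247)

mm_per_pixel = 0.82  # from the single-camera calibration step (assumed value)

interpupillary_mm = distance(left_pupil, right_pupil) * mm_per_pixel
face_width_mm = distance(left_cheek, right_cheek) * mm_per_pixel

print(f"distance between pupils: {interpupillary_mm:.1f} mm")
print(f"face width at pupils line: {face_width_mm:.1f} mm")
```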

To load an eyewear frame model 12 matching the face parameters and measurements and the selected design parameters, a specific database is used. If no model in the database matches both the face parameters and the selected design parameters, then a new model is created by a dedicated application (typically a parametric three dimensional modeler, e.g. FreeCAD) and added to the database. An eyewear frame model 12 has a modular geometry, variable between circular and rectangular shapes, which makes it possible to build a large number of different shapes as illustrated in FIG.5a, FIG.5b and FIG.5c.
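
The database schema and matching rule are not disclosed; the following is a hypothetical lookup-or-create sketch in which the field names and the tolerance are assumptions.

```python
# Hypothetical lookup-or-create logic; field names and tolerances are assumed.
from dataclasses import dataclass

@dataclass
class FrameModel:
    model_id: str
    lens_width_mm: float
    bridge_width_mm: float
    color: str

def load_or_create_model(library, measurements, design_params, tolerance_mm=2.0):
    """Return a library model close to the measurements, or create a new one."""
    for model in library:
        if (abs(model.lens_width_mm - measurements["lens_width_mm"]) <= tolerance_mm
                and abs(model.bridge_width_mm - measurements["bridge_width_mm"]) <= tolerance_mm
                and model.color == design_params["color"]):
            return model
    # No match: a parametric modeler (e.g. FreeCAD scripting) would generate one here.
    new_model = FrameModel("generated-001", measurements["lens_width_mm"],
                           measurements["bridge_width_mm"], design_params["color"])
    library.append(new_model)
    return new_model

library = [FrameModel("classic-round", 48.0, 20.0, "black")]
chosen = load_or_create_model(library, {"lens_width_mm": 49.0, "bridge_width_mm": 19.5},
                              {"color": "black"})
print(chosen.model_id)  # classic-round
```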

The selected eyewear frame model 12 is displayed in real-time in the video of the user's face in a normal wearing position according to the face parameters and measurements.

Facial expression algorithms are used to detect variations of the positions of the specific points. According to these variations and to a table of correspondences between variations of the specific point positions in the face and particular facial expressions, it is possible to attribute a degree of satisfaction to the user. This metric can be, for example, a percentage of satisfaction. The goal of the method is to propose to the user a model that provides the highest possible degree of satisfaction. The method is performed until this metric reaches a value greater than a threshold.
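
A schematic sketch of this iterate-until-satisfied loop is given below; the expression detector, the correspondence table and the threshold are placeholders for the modules and values described above, not the actual implementation.

```python
# Schematic loop only; the expression detector and the correspondence table
# are stand-ins for the modules described in the patent.
SATISFACTION_TABLE = {"smile": 0.9, "neutral": 0.5, "frown": 0.2}  # assumed values
THRESHOLD = 0.8

def detect_expression(frame_index):
    # Stand-in for the facial-expression algorithm; cycles through examples.
    return ["frown", "neutral", "smile"][frame_index % 3]

def propose_next_model(current):
    return current + 1  # stand-in for the parametric optimization module

model, satisfaction, frame_index = 0, 0.0, 0
while satisfaction <= THRESHOLD:
    expression = detect_expression(frame_index)
    satisfaction = SATISFACTION_TABLE[expression]
    print(f"model {model}: expression={expression}, satisfaction={satisfaction:.0%}")
    if satisfaction <= THRESHOLD:
        model = propose_next_model(model)
    frame_index += 1

print(f"selected model {model}")
```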

The method is performed in real time, so the user can change the design parameters at any time. The system providing the method continuously updates the model according to the currently selected design parameters.

A system is also proposed for manufacturing a personalized eyewear frame, implementing the previously described method. The system and its flowchart are illustrated in FIG.4. The system comprises:

- a server 101 to store data-mining market-trend studies and a library of eyewear frame models,

- a customization interface 102 allowing a user to select design parameters such as a color or a texture,

- a computer vision module 103 comprising a single camera to acquire a real-time video of the user's face composed of a plurality of images, and a processing module programmed for:

o identifying specific points on each image of the real-time video,

o recognizing the face parameters,

o calculating measurements on the face,

o detecting variations of the positions of the specific points,

o calculating a degree of satisfaction of the user according to these variations,

- a parametric optimization module 104 programmed for:

o taking into account the facial parameters, the degree of satisfaction of the user, and qualitative information such as market trends or user feedback,

o loading an eyewear frame model 12 matching the face parameters and measurements and the selected design parameters,

- a computer-aided design module 105 programmed for creating specific eyewear frame models for each image and completing the model library with these models,

- an augmented reality module 106 programmed for displaying a generated frame on the image of the user's face,

- a manufacturing module 107 to manufacture an eyewear frame according to an eyewear frame model file.

The computer vision module 103 recognizes the facial parameters of the user by filming a video with only one camera.

A calibration procedure of the single camera is performed. The calibration process consists in measuring an object whose dimensions are known, in order to establish the relation between a distance in meters and a distance in pixels in the image.
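
A minimal sketch of such a calibration, assuming a reference object of known width (here a standard ID-1 card, 85.6 mm wide), could look like this:

```python
# Minimal calibration sketch: a reference object of known physical width is
# measured in pixels to obtain a millimetres-per-pixel scale factor.
def calibration_scale(known_width_mm, measured_width_px):
    """Return millimetres per pixel at the object's distance from the camera."""
    return known_width_mm / measured_width_px

# Example: an ID-1 card (85.6 mm wide) spans 214 pixels in the image (assumed).
mm_per_pixel = calibration_scale(85.6, 214)
print(f"{mm_per_pixel:.3f} mm per pixel")  # 0.400 mm per pixel
```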

The computer vision module 103 recognizes the positions of the 67 specific points on the user's face, image by image. Then, it is able to recognize the parameters illustrated in FIG.3 and to calculate measurements on the user's face. On the other hand, this module contains a facial expression recognition algorithm, which evaluates the degree of satisfaction of the user. Each specific point is considered as an action unit, and the temporal evolution of these action units can be interpreted as a human expression. An artificial neural network may be previously trained to recognize those action units and classify them as emotions in a table. Then, when an action unit is recognized, the corresponding emotion is known from the classification table. To each emotion, a degree of satisfaction (for example a percentage) can be assigned. Thereby, at any moment of the process, the state of satisfaction of the user may be evaluated.
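
The trained network and its classification table are not disclosed; the sketch below only illustrates the table-lookup step, with assumed action-unit patterns, emotions and satisfaction percentages.

```python
# Hedged sketch of the classification-table step described above; the action-unit
# labels, the emotion mapping and the percentages are illustrative assumptions.
ACTION_UNIT_TO_EMOTION = {
    ("AU6", "AU12"): "happiness",        # cheek raiser + lip corner puller
    ("AU1", "AU4", "AU15"): "sadness",
    ("AU4", "AU7"): "anger",
}
EMOTION_TO_SATISFACTION = {"happiness": 0.9, "sadness": 0.3, "anger": 0.1}

def satisfaction_from_action_units(active_units):
    """Map a set of recognized action units to a satisfaction percentage."""
    for pattern, emotion in ACTION_UNIT_TO_EMOTION.items():
        if set(pattern).issubset(active_units):
            return EMOTION_TO_SATISFACTION[emotion]
    return 0.5  # neutral fallback

print(satisfaction_from_action_units({"AU6", "AU12"}))  # 0.9
```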

The parametric optimization module 104 generates sets of design parameters to provide the user with the most suitable set of eyewear frame models 12. It takes into account the facial parameters detected by the computer vision module 103. It is also continuously fed with information regarding physiognomic studies, market trends (for example in a certain geographic area), and user feedback (e.g. personal style).

The parametric models are developed by the computer-aided design module 105. It typically uses a parametric three dimensional modeler such as FreeCAD to build the eyewear frame models. To build a model, a pattern is defined to derive the glasses shapes. According to a preferred embodiment of the present invention, four arcs and five lines are chosen as illustrated in FIG.5a, and an auxiliary shape is determined with the following parameters: type of glasses box, nasofrontal angle, and external angle. This pattern allows the lens shape to be changed easily, as illustrated in FIG.5a, FIG.5b and FIG.5c.
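
The exact parametrization is not given beyond the elements listed above. The following sketch merely bundles those parameters and derives the centres of the four corner arcs of a rounded lens contour, with the corner radius added as an assumed extra parameter; the real pattern would be realized inside the parametric modeler itself.

```python
# Sketch only: the actual pattern is built in a parametric modeler such as
# FreeCAD. Here the named parameters are bundled and used to derive a few
# illustrative control points for one lens contour.
from dataclasses import dataclass

@dataclass
class LensPattern:
    box_width_mm: float          # "type of glasses box" dimensions
    box_height_mm: float
    nasofrontal_angle_deg: float
    external_angle_deg: float
    corner_radius_mm: float      # radius of the four corner arcs (assumed parameter)

def corner_arc_centers(p: LensPattern):
    """Return the four arc centres of a rounded-rectangle lens contour."""
    dx = p.box_width_mm / 2 - p.corner_radius_mm
    dy = p.box_height_mm / 2 - p.corner_radius_mm
    return [(+dx, +dy), (-dx, +dy), (-dx, -dy), (+dx, -dy)]

pattern = LensPattern(52.0, 40.0, nasofrontal_angle_deg=8.0,
                      external_angle_deg=5.0, corner_radius_mm=6.0)
for cx, cy in corner_arc_centers(pattern):
    print(f"arc centre at ({cx:+.1f}, {cy:+.1f}) mm")
```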

Once the user's facial parameters have been acquired by the computer vision module 103, an automatic glasses recommendation process is run by a recommendation algorithm and a model 12 is presented to the user by displaying it on the user's face in a real-time video in a normal wearing position. Then, if the user does not accept the presented model 12, the user may change some shape aspects (design parameters) and another eyewear frame model 12 will be presented.

The models are stored in a database in a digital format which contains parameters and identification numbers for quick access.
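
The digital format is not specified; a hypothetical record with an identification number and quick-access parameters might look like this (all field names and values are illustrative):

```python
# Illustrative storage record only; the real database schema is not disclosed.
import json

record = {
    "model_id": "EF-2017-000123",           # identification number for quick access
    "design_parameters": {                   # ISO 8624-style boxed-lens values
        "lens_width_mm": 52.0,
        "bridge_width_mm": 19.0,
        "temple_length_mm": 140.0,
        "color": "tortoise",
    },
    "face_measurements": {"interpupillary_mm": 62.5, "nose_width_mm": 17.0},
}

print(json.dumps(record, indent=2))
```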

The augmented reality module 106 is responsible for displaying, in the video on the user's face, the eyewear frame models generated in real time by the computer-aided design module 105.

The manufacturing module 107 is used for manufacturing the eyewear frames according to the selected or modified model 12. The material used to manufacture the eyewear frame product may be a plastic (such as polyamide) or a metal (such as steel). For efficiency and tracking reasons, printing files are preferably sent to the manufacturing module in a compact format (typically .CAD or .stl extensions) which contains all the information (geometry, finish, color, etc.) of the selected eyewear frame.
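
The compact format itself is not described in detail. As a hedged sketch only, geometry and finish/colour metadata could be bundled into a single package for the manufacturing module as follows; the archive layout and field names are assumptions, not the patent's format.

```python
# Hedged sketch: pack a geometry file plus finish/colour metadata into one
# archive for the manufacturing module. The layout is an assumption.
import json
import zipfile

def write_print_package(stl_path, metadata, out_path):
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as pkg:
        pkg.write(stl_path, arcname="frame.stl")              # geometry
        pkg.writestr("metadata.json", json.dumps(metadata))   # finish, colour, tracking

metadata = {"model_id": "EF-2017-000123", "material": "polyamide",
            "finish": "matte", "color": "tortoise"}
# write_print_package("frame.stl", metadata, "frame_package.zip")  # example call
```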

This system makes it possible to create personalized eyewear frames by means of an additive manufacturing technique and electronic devices which, from the user's point of view, are reduced to a single camera for data acquisition.