WO2018130291A1 - System for manufacturing personalized products by means of additive manufacturing performing image-based recognition using single-camera electronic devices - Google Patents

System for manufacturing personalized products by means of additive manufacturing performing image-based recognition using single-camera electronic devices

Info

Publication number
WO2018130291A1
Authority
WO
WIPO (PCT)
Prior art keywords
face
user
eyewear frame
parameters
eyewear
Prior art date
Application number
PCT/EP2017/050615
Other languages
English (en)
Inventor
José Maria MIRANDA ORTE
Marcos RODRIGUEZ DE LA PENA
Alejandro BRAGADO HERNANDO
Jesus FERNANDEZ MARTINEZ
Original Assignee
Atos Spain Sociedad Anonima
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Atos Spain Sociedad Anonima filed Critical Atos Spain Sociedad Anonima
Priority to EP17700932.1A (EP3568835A1)
Priority to PCT/EP2017/050615 (WO2018130291A1)
Publication of WO2018130291A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/006 - Mixed reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174 - Facial expression recognition

Definitions

  • the present invention relates to a system that makes possible the creation of personalized parts by additive manufacturing techniques, using electronic devices and only a single camera for the acquisition.
  • this invention relates to an augmented reality system for generating a personalized eyewear frame model for later manufacturing, allowing the user to visualize himself wearing the modeled eyewear frame.
  • the present invention focuses on the personalization of eyewear frames, allowing the user to visualize himself wearing a personalized eyewear frame in real time. It relates to a system built from blocks that guide the user from a step of recognizing his personal metrics, through the parametric optimization of the chosen geometry, to finally manufacturing the eyewear frame by three-dimensional printing.
  • a calibration mode must be developed for measuring the filmed object, whereby the focal distance may be accurately calculated.
  • Augmented reality is commonly used to render realistic textures of filmed objects, overlapping those objects with real-time video. This solution must withstand the challenge of approximating the real human-eye perception of textures and colors.
  • the present invention addresses this issue by creating a three-layer texture in which the overlapping of intermediate images provides depth characteristics, as sketched below.
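A minimal sketch of this kind of layered compositing, assuming the three layers can be approximated by alpha-blending an intermediate shadow layer and an eyewear layer over the camera frame; the layer decomposition and all names here are illustrative assumptions, not the patent's actual texture pipeline.

```python
# Illustrative sketch only: a three-layer composite built by alpha-blending
# a hypothetical intermediate (shadow) layer and an eyewear layer over the
# live video frame.
import numpy as np

def composite(base_bgr: np.ndarray, layer_bgra: np.ndarray) -> np.ndarray:
    """Alpha-blend one BGRA layer over a BGR frame of the same size."""
    alpha = layer_bgra[..., 3:4].astype(np.float32) / 255.0
    out = (layer_bgra[..., :3].astype(np.float32) * alpha
           + base_bgr.astype(np.float32) * (1.0 - alpha))
    return out.astype(np.uint8)

def render_three_layers(frame_bgr, shadow_bgra, eyewear_bgra):
    # bottom: camera image; middle: shadow layer providing depth cues;
    # top: the rendered eyewear frame
    return composite(composite(frame_bgr, shadow_bgra), eyewear_bgra)
```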
  • the present system permits users to fit, show and purchase products on their own.
  • the present invention provides a computing method for generating a three-dimensional personalized eyewear frame model, comprising the steps of:
  • a manufacturing module such as a three-dimensional printer.
  • A method is also proposed for manufacturing an eyewear frame product, comprising the steps of the previous method and further comprising a step of manufacturing an eyewear frame product according to the generated specific file.
  • the number of generated specific points is 67.
  • the face parameters are chosen from the following list: positions of the pupils, position of the tip of the nose, position of the mouth, positions of the end-parts of the cheeks, positions of the ears, position of the tip of the chin, positions of the forehead extremities.
  • the measurements are chosen from the following list: distance between pupils; distance between the pupils line and the nose; distance between the pupils line and the upper lip; distance between the pupils line and the top of the forehead; width of the jaw at the level of the upper lip line; width of the face at the level of the pupils line; length of the face; frontal nose angle; width of the nose at the level where the eyewear frame is supported; height of the ears; lateral distance between the line of the nose where the eyewear frame is supported; position of the upper anchoring of the ear to the skull.
  • A system is also provided for generating a three-dimensional personalized eyewear frame model, comprising:
  • a customization interface allowing a user to select design parameters such as a color or a texture
  • a computer vision module comprising a single camera to acquire a real-time video of the user's face composed of a plurality of images, and a processing module programmed for:
    • identifying specific points on each image of the real-time video,
  • a parametric optimization module programmed for:
  • a computer-aided design module programmed for creating specific eyewear frame models for each image and completing the model library with these models
  • an augmented reality module programmed for displaying a generated eyewear frame model on the image of the user's face.
  • A system is also provided for manufacturing a personalized eyewear frame product, comprising the previous system and a manufacturing module for manufacturing an eyewear frame according to an eyewear frame model file.
  • FIG.1 illustrates specific points and two distance measurements on a user's face.
  • FIG.2 is an illustration of an eyewear frame model example.
  • FIG.3 is an illustration of the face parameters and of all the required measurements on the user's face.
  • FIG.4 is a functional diagram presenting the flowchart of the system.
  • FIG.5a is an illustration of the pattern used by the computer-aided design module.
  • FIG.5b and FIG.5c are two examples of eyewear shapes derived from the pattern of FIG.5a.
  • the present invention refers to a system and a method for manufacturing a personalized eyewear frame product.
  • the method provided by the system is focused on creating or selecting the eyewear frame model that best satisfies the customer.
  • the method for generating a three-dimensional personalized eyewear frame model comprises the steps of:
  • the number of generated specific points is 67. The positions of these specific points are illustrated in FIG.1.
  • the ensemble of points constitutes a mesh that matches the face of the user.
  • the specific points are generated through different convergence algorithms. Once convergence is achieved, it is possible to know the position of each point, as in the sketch below.
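The convergence algorithms themselves are not published; as a stand-in, the sketch below generates a comparable per-image mesh with dlib's off-the-shelf 68-point shape predictor (one point more than the 67 specified here). The model file name is the one distributed with dlib's example data and is an assumption of this sketch.

```python
# Sketch: per-frame landmark mesh with dlib's 68-point predictor, standing
# in for the patent's unspecified 67-point convergence algorithms.
import cv2
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

cap = cv2.VideoCapture(0)                    # the system's single camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray):
        shape = predictor(gray, face)
        for i in range(shape.num_parts):     # draw the mesh on the image
            cv2.circle(frame, (shape.part(i).x, shape.part(i).y),
                       1, (0, 255, 0), -1)
    cv2.imshow("specific points", frame)
    if cv2.waitKey(1) == 27:                 # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```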
  • the face parameters are chosen from the list given above.
  • the measurements are chosen from the list given above.
  • the user may select design parameters for an eyewear frame model.
  • the types of design parameters that can be set are defined according to the ISO 8624 standard for the conventional design of an eyewear frame, as sketched below.
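A minimal sketch of such a parameter set; the fields follow common boxing-system dimensions plus the cosmetic choices mentioned above, since the patent does not publish its exact list, and all names are hypothetical.

```python
# Hypothetical container for user-selectable design parameters.
from dataclasses import dataclass

@dataclass
class DesignParameters:
    lens_width_mm: float      # horizontal boxed lens size
    lens_height_mm: float     # vertical boxed lens size
    bridge_width_mm: float    # distance between the two lens boxes
    temple_length_mm: float   # arm length
    color: str = "black"
    texture: str = "matte"
```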
  • To load an eyewear frame model 12 matching the face parameters and measurements and the selected design parameters, a specific database is used. If no model in the database matches both the face parameters and the selected design parameters, then a new model is created by a dedicated application (typically a parametric three-dimensional modeler, e.g. FreeCAD) and added to the database.
  • An eyewear frame model 12 has a modular geometry, variable between a circular and a rectangular shape, which makes it possible to build up a large number of different shapes, as illustrated in FIG.5a, FIG.5b and FIG.5c.
  • the selected eyewear frame model 12 is displayed in real time in the video of the user's face, in a normal wearing position according to the face parameters and measurements.
  • Facial expression algorithms are used to detect variations in the positions of the specific points. According to these variations and to a table of correspondences between specific-point position variations in the face and particular facial expressions, it is possible to attribute a degree of satisfaction to the user.
  • This metric can be a percentage of satisfaction, for example.
  • the goal of the method is to propose to the user a model that provides the highest possible degree of satisfaction. The method is performed until this metric reaches a value greater than a threshold.
  • the method is performed in real time, so the user can change the design parameters at any time.
  • the system providing the method continuously updates the model according to the currently selected design parameters, as in the control-flow sketch below.
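Putting the preceding steps together, the iteration can be sketched as the loop below; every helper is a hypothetical placeholder for one of the modules described in this document, and the threshold value is an assumption.

```python
# Control-flow sketch of the method: iterate until the satisfaction metric
# exceeds the threshold. All helpers are hypothetical placeholders.
def personalize(next_frame, detect_points, recognize_face, load_model,
                display_ar, satisfaction_of, current_design_params,
                threshold=0.8):
    while True:
        frame = next_frame()
        points = detect_points(frame)                  # computer vision module
        face_params, measures = recognize_face(points)
        model = load_model(face_params, measures,
                           current_design_params())    # may change at any time
        display_ar(frame, model)                       # augmented reality module
        if satisfaction_of(points) > threshold:        # expression recognition
            return model                               # model to manufacture
```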
  • As shown in FIG.4, the system comprises:
  • a server 101 to store data-mining market-trend studies and a library of eyewear frame models
  • a customization interface 102 allowing a user to select design parameters such as a color or a texture
  • a computer vision module 103 comprising a single camera to acquire a real-time video of the user's face composed of a plurality of images
  • a processing module programmed for:
  • a parametric optimization module 104 programmed for:
  • a computer-aided design module 105 programmed for creating specific eyewear frame models for each image and completing the model library with these models
  • an augmented reality module 106 programmed for displaying a generated frame on the image of the user's face.
  • a manufacturing module 107 to manufacture an eyewear frame according to an eyewear frame model file.
  • the computer vision module 103 recognizes the facial parameters of the user by filming a video with only one camera.
  • a calibration procedure of the single camera is performed.
  • the calibration process consists of taking measurements of an object whose dimensions are known, in order to establish the relation between a distance in meters and a distance in pixels in the image, as in the sketch below.
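A minimal sketch of that relation, assuming for illustration that the known object is an ISO/IEC 7810 ID-1 card (85.6 mm wide) and that the face is filmed at the same distance as the card.

```python
# Pixel-to-meter calibration sketch: measure a known object once, then reuse
# the resulting scale for face measurements at the same filming distance.
def meters_per_pixel(known_width_m: float, measured_width_px: float) -> float:
    return known_width_m / measured_width_px

CARD_WIDTH_M = 0.0856                      # ISO/IEC 7810 ID-1 card (assumption)
scale = meters_per_pixel(CARD_WIDTH_M, measured_width_px=412.0)
pupil_distance_px = 250.0                  # distance measured in the image
print(f"{pupil_distance_px * scale * 1000:.1f} mm")   # ~51.9 mm
```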
  • the computer vision module 103 recognizes the positions of the 67 specific points on the user's face, image by image. It is then able to recognize the parameters illustrated in FIG.3 and to calculate measurements on the user's face, as sketched below.
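For instance, the distance between pupils can be derived from the point mesh by averaging each eye's contour landmarks; the indices below follow the common 68-point convention, since the patent's own 67-point numbering is not published.

```python
# Sketch: interpupillary distance (in pixels) from a list of (x, y) landmarks.
import math

def pupil_centers(points):
    left, right = points[36:42], points[42:48]   # eye-contour landmark groups
    center = lambda pts: (sum(x for x, _ in pts) / len(pts),
                          sum(y for _, y in pts) / len(pts))
    return center(left), center(right)

def interpupillary_distance_px(points) -> float:
    (lx, ly), (rx, ry) = pupil_centers(points)
    return math.hypot(rx - lx, ry - ly)
```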
  • this module contains a facial expression recognition algorithm, which evaluates the degree of satisfaction of the user.
  • Each specific point is considered as an action unit, and the temporal evolution of these action units can be interpreted as a human expression.
  • An artificial neural network may be trained beforehand to recognize those action units and classify them as emotions in a table. When an action unit is recognized, the corresponding emotion is then found in the classification table. A degree of satisfaction (for example a percentage) can be assigned to each emotion. Thereby, at any moment of the process, the state of satisfaction of the user may be evaluated, for example via a lookup table as sketched below.
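The lookup step can be sketched as a simple table from the recognized emotion to a satisfaction percentage; the emotions and values below are assumptions, since the patent does not publish its classification table.

```python
# Hypothetical emotion-to-satisfaction table: the trained network's output
# label is mapped to the percentage used by the optimization loop.
SATISFACTION_TABLE = {
    "happiness": 0.90,
    "surprise":  0.70,
    "neutral":   0.50,
    "disgust":   0.20,
    "anger":     0.10,
}

def satisfaction(emotion: str) -> float:
    return SATISFACTION_TABLE.get(emotion, 0.50)   # unknown labels -> neutral
```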
  • the parametric optimization module 104 generates sets of design parameters to provide the user with the most suitable set of eyewear frame models 12. It takes into account the facial parameters detected by the computer vision module 103. It is also continuously fed with information regarding physiognomic studies, market trends (for example in a certain geographic area), and user feedback (e.g. personal style).
  • the parametric models are developed by the computer-aided design module 105, which typically uses a parametric three-dimensional modeler such as FreeCAD to build the eyewear frame models.
  • a pattern is defined to derive glasses shapes. According to a preferred embodiment of the present invention, four arcs and five lines are chosen, as illustrated in FIG.5a, and an auxiliary shape is determined with the following parameters: type of glasses box, nasofrontal angle, and external angle. This pattern allows the lens shape to be changed easily, as illustrated in FIG.5a, FIG.5b and FIG.5c, and as sketched below.
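Since FreeCAD is the named modeler, the following sketch (to be run in FreeCAD's Python console) shows the spirit of such a pattern with four three-point arcs only, omitting the five lines of the actual FIG.5a pattern; varying the bulge parameter morphs the outline between rounded-rectangular and near-circular, and all dimensions are illustrative assumptions.

```python
# Simplified FreeCAD sketch of a parametric lens outline: four arcs joined
# at their end points, extruded to a frame thickness. Not the actual pattern.
import FreeCAD as App
import Part

def lens_solid(width=50.0, height=40.0, bulge=6.0, thickness=4.0):
    w, h = width / 2.0, height / 2.0
    arcs = [  # each Part.Arc passes through start, middle and end points
        Part.Arc(App.Vector(-w,  h, 0), App.Vector(0,  h + bulge, 0), App.Vector( w,  h, 0)),
        Part.Arc(App.Vector( w,  h, 0), App.Vector(w + bulge, 0, 0),  App.Vector( w, -h, 0)),
        Part.Arc(App.Vector( w, -h, 0), App.Vector(0, -h - bulge, 0), App.Vector(-w, -h, 0)),
        Part.Arc(App.Vector(-w, -h, 0), App.Vector(-w - bulge, 0, 0), App.Vector(-w,  h, 0)),
    ]
    wire = Part.Wire([a.toShape() for a in arcs])
    return Part.Face(wire).extrude(App.Vector(0, 0, thickness))

doc = App.newDocument("frame")
solid = lens_solid()
Part.show(solid)
solid.exportStl("eyewear_frame.stl")   # compact file for the manufacturing module
```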
  • the models are stored in a database in a digital format which contains parameters and identification numbers for quick access; a minimal sketch of such a store follows.
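A minimal sketch of such a store, assuming a SQLite table keyed by an identification number and holding JSON-encoded parameters plus the path to the geometry file; the schema and file names are assumptions.

```python
# Hypothetical model store with parameters and identification numbers.
import json
import sqlite3

con = sqlite3.connect("frames.db")
con.execute("""CREATE TABLE IF NOT EXISTS models (
                 id     INTEGER PRIMARY KEY,
                 params TEXT NOT NULL,   -- JSON-encoded design/face parameters
                 path   TEXT NOT NULL    -- geometry file for manufacturing
               )""")
params = json.dumps({"lens_width_mm": 50.0, "color": "black"})
con.execute("INSERT INTO models (params, path) VALUES (?, ?)",
            (params, "frame_0001.stl"))
con.commit()
con.close()
```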
  • the augmented reality module 106 is responsible for displaying, in real time, the eyewear frame models generated by the computer-aided design module 105 in the video, over the user's face.
  • the manufacturing module 107 is used for manufacturing the eyewear frames according to the selected or modified model 12.
  • the material used to manufacture the eyewear frame product may be a plastic (such as polyamide) or a metal (such as steel).
  • printing files are preferably sent to the manufacturing module in a compact format (typically .CAD or .stl extensions) which contains all the information (geometry, finish, color, etc.) of the selected eyewear frame.
  • This system makes it possible to create personalized eyewear frames by means of an additive manufacturing technique and electronic devices which, from the user's point of view, are reduced to a single camera for data acquisition.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to a method for generating a personalized eyewear frame model, comprising the steps of: acquiring a real-time video of a user's face; taking into account eyewear design parameters selected by the user; generating a plurality of specific points matching the user's face; recognizing face parameters and calculating measurements from the specific points; loading an eyewear frame model matching the face parameters, the measurements and the selected design parameters; displaying a real-time augmented reality video comprising the real-time video of the user's face with the loaded frame fitted onto it; detecting a facial expression from the user's face; calculating a degree of satisfaction; repeating the previous steps until the degree of satisfaction is greater than a threshold; selecting the corresponding eyewear frame model; and creating a specific file of the selected eyewear frame model that can be used as-is by a manufacturing module.
PCT/EP2017/050615 2017-01-12 2017-01-12 System for manufacturing personalized products by means of additive manufacturing performing image-based recognition using single-camera electronic devices WO2018130291A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP17700932.1A EP3568835A1 (fr) 2017-01-12 2017-01-12 System for manufacturing personalized products by means of additive manufacturing performing image-based recognition using single-camera electronic devices
PCT/EP2017/050615 WO2018130291A1 (fr) 2017-01-12 2017-01-12 System for manufacturing personalized products by means of additive manufacturing performing image-based recognition using single-camera electronic devices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2017/050615 WO2018130291A1 (fr) 2017-01-12 2017-01-12 System for manufacturing personalized products by means of additive manufacturing performing image-based recognition using single-camera electronic devices

Publications (1)

Publication Number Publication Date
WO2018130291A1 (fr) 2018-07-19

Family

ID=57860838

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2017/050615 WO2018130291A1 (fr) 2017-01-12 2017-01-12 System for manufacturing personalized products by means of additive manufacturing performing image-based recognition using single-camera electronic devices

Country Status (2)

Country Link
EP (1) EP3568835A1 (fr)
WO (1) WO2018130291A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140050408A1 (en) * 2012-08-14 2014-02-20 Samsung Electronics Co., Ltd. Method for on-the-fly learning of facial artifacts for facial emotion recognition
US20150055085A1 (en) * 2013-08-22 2015-02-26 Bespoke, Inc. Method and system to create products
US20150127132A1 (en) * 2013-11-01 2015-05-07 West Coast Vision Labs Inc. Method and system for generating custom-fit eye wear geometry for printing and fabrication

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
G S SHERGILL ET AL: "COMPUTERIZED SALES ASSISTANTS: THE APPLICATION OF COMPUTER TECHNOLOGY TO MEASURE CONSUMER INTEREST - A CONCEPTUAL FRAMEWORK", JOURNAL OF ELECTRONIC COMMERCE RESEARCH, vol. 9, no. 2, 7 May 2008 (2008-05-07), California State University, pages 176 - 191, XP055240869, Retrieved from the Internet <URL:http://aut.researchgateway.ac.nz/bitstream/handle/10292/1715/paper7.pdf?sequence=2&isAllowed=y> [retrieved on 20160113] *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10685457B2 (en) 2018-11-15 2020-06-16 Vision Service Plan Systems and methods for visualizing eyewear on a user

Also Published As

Publication number Publication date
EP3568835A1 (fr) 2019-11-20

Similar Documents

Publication Publication Date Title
US11914226B2 (en) Method and system to create custom, user-specific eyewear
US11495002B2 (en) Systems and methods for determining the scale of human anatomy from images
CN110678875B (zh) System and method for guiding a user in taking a selfie
CN109690617B (zh) System and method for a digital makeup mirror
CN109310196B (zh) Makeup assistance device and makeup assistance method
US20170169501A1 Method and system for evaluating fitness between wearer and eyeglasses
JP5225870B2 (ja) Emotion analysis device
JP2017194301A (ja) Face shape measurement device and method
CN112884556A (zh) Mixed-reality-based shop display method, system, device and medium
WO2018130291A1 (fr) System for manufacturing personalized products by means of additive manufacturing performing image-based recognition using single-camera electronic devices
KR102267688B1 (ko) Application execution method with improved hair self-diagnosis accuracy
WO2022024274A1 (fr) Image processing device, image processing method, and recording medium
JP7439932B2 (ja) Information processing system, data storage device, data generation device, information processing method, data storage method, data generation method, recording medium, and database
KR101734212B1 (ko) Facial expression practice system
KR20240009440A (ko) Computer-based body part analysis methods and systems
KR20180132309A (ko) Method for providing user-customized products and device therefor

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 17700932

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017700932

Country of ref document: EP

Effective date: 20190812