KR101805056B1 - Eyeglasses try-on simulation method using augumented reality - Google Patents


Info

Publication number
KR101805056B1
KR101805056B1 · KR1020150102782A
Authority
KR
South Korea
Prior art keywords
face
change
tracking
image
facial
Prior art date
Application number
KR1020150102782A
Other languages
Korean (ko)
Other versions
KR20170010985A (en)
Inventor
배유환
박준엽
Original Assignee
(주)월드트렌드
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)월드트렌드 filed Critical (주)월드트렌드
Priority to KR1020150102782A priority Critical patent/KR101805056B1/en
Publication of KR20170010985A publication Critical patent/KR20170010985A/en
Application granted granted Critical
Publication of KR101805056B1 publication Critical patent/KR101805056B1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02CSPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C13/00Assembling; Repairing; Cleaning
    • G02C13/003Measuring during assembly or fitting of spectacles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Optics & Photonics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a method of simulating eyeglass wearing of a subject using augmented reality, comprising: a data loading step of converting 3D modeling data of eyeglasses into simulatable resources, classifying the imported data into a lens group, a material group, and a decoration group, and storing each as a resource; a face tracking step of recognizing an image of the subject's face and tracking changes in posture, in which the facial image is detected and normalized, a Fourier transform of the detected face is performed, an eigenface is generated and projected, and a 6DOF (six degrees of freedom) value is output; a simulation step of generating an image in which the eyeglasses are overlaid on the subject's face, changing the display state according to selection, and processing subroutines for lens change, decoration change, and model change in response to a user's event request; and an updating step in which, during the simulation, the wearing state of the glasses is linked to changes in facial posture: a virtual space is created, resources are loaded, face tracking data is applied to link the face position, the background of the virtual space is replaced with a camera image source, the tracking result is converted into a 6DOF value, and the result is processed as a snapshot.
Accordingly, even without wearing actual glasses at an optician's shop, the wearing state changes in real time in conjunction with various posture changes, enabling not only real-time expression of various fashion styles but also convenient selection of glasses that suit the wearer.

Description

[0001] The present invention relates to an eyeglass wear simulation method using augmented reality.

More particularly, the present invention relates to a method of simulating the wearing of eyeglasses using augmented reality, in which changes in the wearing state are rendered in real time in conjunction with various posture changes, so that a subject can try on glasses without actually wearing them at an optician's shop.

Computers, smartphones, and other information devices have moved beyond simply letting people experience a new world and have become useful tools applied directly in everyday life. They are also used as a commercial marketing means, increasingly applied in the fields of clothing, jewelry, and eyeglasses.

Related prior art documents are disclosed in Korean Patent Laid-Open Publication No. 2010-0050052 (Prior Document 1) and Korean Patent Registration No. 1260287 (Prior Document 2).

Prior art document 1 includes: a first step of inputting a face image of the wearer; a second step of designating position reference points in the image; a third step of calculating the position and rotation angle of the glasses with respect to the face image based on the reference points; a fourth step of generating a glasses image modified according to the calculated values; and a fifth step of synthesizing the generated glasses image with the wearer's image.

Prior art document 2 includes: photographing the actual surrounding environment through a camera module; detecting an operation pattern from a user manipulation and acquiring information about the recognized object; generating a virtual user-customized spectacle-lens image based on that information; and outputting a vision-adjustment-effect image by superimposing the virtual user-customized spectacle-lens image on the image of the actual surroundings.

However, since prior art document 1 synthesizes the glasses onto a photograph of the wearer, it is insufficient for simulation of the various posture changes that occur at an optician's shop; and since prior art document 2 merely outputs images before and after vision adjustment, it is difficult to render various fashion styles of glasses wearing in real time.

1. Korean Patent Laid-Open Publication No. 2010-0050052, "Virtual Wear Method of Glasses" (published May 13, 2010)
2. Korean Patent Registration No. 1260287, "Method of Simulating Eyeglass Lenses Using Augmented Reality" (published May 3, 2013)

It is an object of the present invention to overcome the above-mentioned problems of the related art by providing a method of simulating eyeglass wearing using augmented reality that renders changes in the wearing state in real time in conjunction with various posture changes, without the subject wearing actual glasses at an optician's shop, so that the user can not only express various fashion styles in real time but also conveniently select glasses that suit their taste.

According to an aspect of the present invention, there is provided a method of simulating eyeglass wearing of a subject on a display using augmented reality, the method comprising: a data loading step of converting 3D modeling data of glasses into simulatable resources, classifying the imported data into a lens group, a material group, and a decoration group, and storing each as a resource; a face tracking step of recognizing an image of the subject's face and tracking changes in posture, in which the facial image is detected and normalized, a Fourier transform of the detected face is performed, an eigenface is generated and projected, and a 6DOF value is output; a simulation step of generating an image in which the eyeglasses are overlaid on the subject's face, changing the display state according to selection, and processing subroutines for lens change, decoration change, and model change in response to a user's event request; and an updating step in which, during the simulation, the wearing state of the glasses is linked to changes in facial posture: a virtual space is created, resources are loaded, face tracking data is applied to link the face position, the background of the virtual space is replaced with a camera image source, the tracking result is converted into a 6DOF value, and the result is processed as a snapshot.

delete

It should be understood that the terms and words used in this specification and the claims are not to be construed in their ordinary or dictionary meanings, but in accordance with meanings and concepts consistent with the technical idea of the present invention, based on the principle that the inventor may properly define the concept of a term in order to describe the invention in the best way. Therefore, the embodiments described in this specification and the configurations shown in the drawings are merely the most preferred embodiments of the present invention and do not represent all of its technical ideas; it should be understood that various equivalents and modifications are possible.

As described above, according to the present invention, the change in wearing state is rendered in real time in conjunction with various posture changes even without wearing actual glasses at an optician's shop, so that the user can not only express various fashion styles in real time but also conveniently select glasses that suit their taste.

FIG. 1 is a schematic representation of a system to which the method according to the present invention is applied.
FIG. 2 is a diagram showing an example of the data attributes of the method according to the present invention.
FIG. 3 is a flowchart illustrating the data loading step according to the present invention.
FIG. 4 is a flowchart illustrating the face tracking step according to the present invention.
FIG. 5 is a flowchart illustrating the simulation step according to the present invention.
FIG. 6 is a flowchart illustrating the updating step according to the present invention.
FIG. 7 shows screen states of main parts in which the method according to the present invention is implemented.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.

The present invention proposes a method of simulating the wearing of glasses by a subject on a display using augmented reality. It can be understood as a process in which a subject confirms and selects glasses (a model) at an optician's shop without actually wearing them, although the present invention is not limited to this situation. It belongs to the field of augmented reality, in which the real and the virtual are combined and processed by simulation on the display.

The present invention is implemented on a computer (PC) comprising a main body with a microprocessor, a memory, and an input/output interface; input units such as a keyboard, a mouse, and a camera; and a display. In FIG. 1, steps S10, S20, S30, and S40 are performed in the main body and are shown connected to the input and output units, respectively. Steps S10, S20, S30, and S40 are merely exemplary and are not strictly limited. The microprocessor handles sequential flows and interrupts in conjunction with the memory (including hard disks).
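The four steps S10 to S40 can be pictured as a pipeline: one-time resource loading followed by a per-frame tracking, simulation, and update loop. The following is a minimal illustrative sketch; each step function is a hypothetical placeholder standing in for the processing the patent assigns to the main body, not the actual implementation.

```python
# Illustrative sketch of the S10-S40 pipeline shown in FIG. 1.
# All step functions are hypothetical placeholders.

def data_loading():          # S10: convert 3D model data into resources
    return {"lens": ["LeftLens"], "material": ["Frame"], "decoration": []}

def face_tracking(camera_frame):  # S20: track the face, output a 6DOF value
    return (0.0, 0.0, 0.0, 0.0, 0.0, 0.0)  # (tx, ty, tz, rx, ry, rz)

def simulate(resource, pose):     # S30: overlay glasses at the tracked pose
    return {"resource": resource, "pose": pose}

def update(scene, camera_frame):  # S40: composite over camera, snapshot
    return {"background": camera_frame, **scene}

resource = data_loading()                   # performed once
for camera_frame in ["frame0", "frame1"]:   # per-frame loop
    pose = face_tracking(camera_frame)
    scene = simulate(resource, pose)
    snapshot = update(scene, camera_frame)
print(snapshot["background"])   # frame1
```

The one-shot S10 outside the loop and S20–S40 inside it mirrors the description: resources are built once, while tracking and compositing repeat for every camera frame.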

Step S10 of the present invention is a data loading step for converting 3D modeling data of spectacles into simulatable resources. It imports the 3D modeling data (Max data) of the spectacles, regroups the imported data, and stores it as resources. This is performed by the microprocessor using a hard disk or an external server.

In a detailed configuration of the present invention, the data loading step is characterized in that the loaded data is classified into a lens group, a material group, and a decoration group and stored as respective resources. This involves parsing and classifying the imported Max data: lens parts are classified and grouped, parts whose color and texture are changeable are grouped as materials, and the corresponding decorative parts are grouped as decorations; the grouped data is then collected, converted (rebuilt) into a simulatable resource (FBX), and stored. FBX is Autodesk's 3D production and interchange format, which includes information such as textures and materials and enables layer blending.
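The grouping described above can be sketched as a simple classification pass over the imported part list. The part names and the name-based classification rule below are illustrative assumptions; the patent does not specify how parts are identified.

```python
# Hypothetical sketch of the data-loading classification: imported model
# parts are sorted into lens, material, and decoration groups before being
# rebuilt into a single simulatable (FBX-like) resource.

def classify_parts(parts):
    """Group imported model parts by their role in the eyeglasses model."""
    groups = {"lens": [], "material": [], "decoration": []}
    for name in parts:
        lowered = name.lower()
        if "lens" in lowered:
            groups["lens"].append(name)
        elif "deco" in lowered or "ornament" in lowered:
            groups["decoration"].append(name)
        else:
            # frames and other color/texture-changeable parts -> material
            groups["material"].append(name)
    return groups

imported = ["LeftLens", "RightLens", "FrameFront", "TempleDeco", "HingeOrnament"]
resource = classify_parts(imported)
print(resource["lens"])       # ['LeftLens', 'RightLens']
print(resource["material"])   # ['FrameFront']
```

A real loader would carry geometry, texture, and material data alongside each name, but the three-way grouping is the essential structure the later simulation steps rely on.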

Meanwhile, the lens grouping illustrated in FIG. 3 may include the power, color, shape, and the like of a lens, and the material grouping may include the various materials applied to glasses; items that do not belong to either group may also be included. Glasses frames may be classified into the material group or the decoration group.

Step S20 of the present invention is a face tracking step of recognizing an image of the subject's face and tracking changes in posture. It is a process of recognizing the subject's face as it changes among a frontal state, a leftward-rotated state, and a rightward-rotated state. This is performed by the microprocessor through the camera, using an algorithm stored in the memory.

In a detailed configuration of the present invention, the face tracking step detects the face and performs normalization, performs a Fourier transform of the detected face, generates and projects an eigenface, and outputs a 6DOF value. This is accomplished by detecting and normalizing the face, removing texture and lighting effects, Fourier-transforming the detected facial values and filtering out unwanted components, generating eigenface shapes, finding the value that matches the current facial shape, and outputting it as a 6DOF value.
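The normalize → Fourier-filter → eigenface-project chain can be sketched with NumPy. This is a generic illustration of those three operations under stated assumptions (random stand-in data, an arbitrary low-pass cutoff); the patent's actual filter parameters and pose-matching step are not specified, so the final coefficient-to-6DOF matching is omitted.

```python
import numpy as np

def normalize(face):
    """Zero-mean, unit-variance normalization to suppress lighting effects."""
    f = face.astype(np.float64)
    return (f - f.mean()) / (f.std() + 1e-8)

def lowpass(face, keep=8):
    """Fourier-transform the face and keep only low-frequency components,
    filtering out unwanted high-frequency detail (texture, noise)."""
    spec = np.fft.fftshift(np.fft.fft2(face))
    h, w = face.shape
    mask = np.zeros((h, w))
    mask[h // 2 - keep:h // 2 + keep, w // 2 - keep:w // 2 + keep] = 1
    return np.real(np.fft.ifft2(np.fft.ifftshift(spec * mask)))

def project(face, mean_face, eigenfaces):
    """Project a filtered face onto an eigenface basis; the coefficients
    would then be matched against stored poses to derive a 6DOF value."""
    return eigenfaces @ (face.ravel() - mean_face)

rng = np.random.default_rng(0)
face = rng.random((32, 32))              # stand-in for a detected face crop
eigenfaces = rng.random((5, 32 * 32))    # 5 basis faces, one per row
mean_face = rng.random(32 * 32)
coeffs = project(lowpass(normalize(face)), mean_face, eigenfaces)
print(coeffs.shape)   # (5,)
```

In practice the eigenface basis comes from PCA over a training set of face crops, and the 6DOF output (three translations, three rotations) is obtained by matching the projection coefficients against reference poses.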

Step S30 of the present invention is a simulation step of generating an image in which the glasses are overlaid on the subject's face and changing the display state according to selection. It is a process of virtually synthesizing various types of glasses onto the subject's face into a worn state. This is performed by the microprocessor, using an algorithm stored in the memory, according to instructions entered through the keyboard or mouse.

In a detailed configuration of the present invention, the simulation step processes subroutines for lens change, decoration change, and model change in response to a user's event request. Upon receiving a user event such as a mouse click, it may change the main color of the model, change the decoration of a model that includes decorations, switch to another model, or, in the case of sunglasses, change the color of the lens.
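A natural way to organize these subroutines is a dispatch table keyed by event type. The handler names and the model-state dictionary below are illustrative assumptions, not the patent's implementation.

```python
# Hedged sketch of the event-handling subroutines: a user event (e.g. a
# mouse click) is routed to the matching change routine.

def change_lens(model):
    model["lens_color"] = "brown"      # e.g. tint sunglasses lenses
    return model

def change_decoration(model):
    model["decoration"] = "silver"     # swap the decorative parts
    return model

def change_model(model):
    model["name"] = "model_B"          # switch to another frame model
    return model

HANDLERS = {
    "lens_change": change_lens,
    "decoration_change": change_decoration,
    "model_change": change_model,
}

def on_event(event, model):
    """Route a user event request to its subroutine; unknown events no-op."""
    handler = HANDLERS.get(event)
    return handler(model) if handler else model

state = {"name": "model_A", "lens_color": "clear", "decoration": "gold"}
state = on_event("lens_change", state)
print(state["lens_color"])   # brown
```

The table makes it easy to add further subroutines (e.g. size change) without touching the dispatch logic, which matches the description of the simulation step responding to arbitrary user event requests.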

Step S40 of the present invention is an updating step for linking the wearing state of the glasses to changes in facial posture during the simulation. The wearing state of the eyeglass model changes in conjunction with the subject turning the face to the left or right. This is a process in which the microprocessor executes the set algorithm and outputs the result to the display.

In a detailed configuration of the present invention, the updating step creates a virtual space, loads resources, applies face tracking data to link the face position, replaces the background of the virtual space with the camera image source, converts the tracking result into a 6DOF value, and processes it as a snapshot during the simulation. Concretely, the virtual space is created, the resource is loaded, the pivot (center) of the virtual space is applied, and the 6DOF value of the face tracking data is applied so that the glasses follow the same position values as the motion of the face; the Rigid Face Tracking API converts the tracking results into 6DOF values, the background of the virtual space is replaced with the image source of the camera, and a snapshot is generated and output to the display. The snapshot keeps the underlying data constantly updated without interfering with backup operations.
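The final compositing of the update step, placing the rendered glasses layer over the camera image at the tracked position, amounts to alpha blending at the face's (tx, ty) offset. The sketch below assumes a pre-rendered RGBA-style sprite and uses only the translational part of the 6DOF value; a full implementation would also apply the rotations (rx, ry, rz) before blending.

```python
import numpy as np

def composite(camera_frame, glasses, alpha, tx, ty):
    """Alpha-blend a rendered glasses layer onto the camera image at the
    tracked face position (tx, ty). Rotation handling is omitted here."""
    out = camera_frame.astype(np.float64)
    h, w = glasses.shape[:2]
    a = alpha[..., None]                      # per-pixel opacity in [0, 1]
    roi = out[ty:ty + h, tx:tx + w]           # region the sprite covers
    out[ty:ty + h, tx:tx + w] = a * glasses + (1 - a) * roi
    return out.astype(camera_frame.dtype)

frame = np.zeros((8, 8, 3), dtype=np.uint8)        # camera background
glasses = np.full((2, 4, 3), 255, dtype=np.uint8)  # opaque white sprite
alpha = np.ones((2, 4))                            # fully opaque mask
snapshot = composite(frame, glasses, alpha, tx=2, ty=3)
print(snapshot[3, 2].tolist())   # [255, 255, 255]
```

Running this per camera frame, with (tx, ty) refreshed from the tracker's 6DOF output, yields the linked-wearing-state behavior described for step S40.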

In FIG. 2, the attributes of the main steps are merely illustrative, and assist in searching based on metadata.

FIG. 7(a) shows a state in which the user wears glasses; FIG. 7(b) shows the wearing of glasses linked to the rotation of the face through face tracking; and FIG. 7(c) shows the color, size, and the like of the glasses being changed during the process. The states of FIGS. 7(a) to (c) are stored in the memory, so that the user can easily reproduce them without photographing the same model again.

According to the present invention, the wearing state can easily be changed in real time in conjunction with various posture changes without wearing actual glasses at an optician's shop, providing real-time expression of various fashion styles and convenient selection of glasses that suit the user.

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit and scope of the invention as defined by the appended claims. It is therefore intended that such variations and modifications fall within the scope of the appended claims.

S10: Data Loading Step S20: Facial Tracking Step
S30: Simulation step S40: Updating step

Claims (5)

A method for simulating eyeglass wearing of a subject on a display using augmented reality, the method comprising:
A data loading step of converting 3D modeling data of spectacles into simulatable resources, classifying the imported data into a lens group, a material group, and a decoration group, and storing each as a resource;
A face tracking step of recognizing an image of the subject's face and tracking changes in posture, in which the facial image is detected and normalized, a Fourier transform of the detected face is performed, an eigenface is generated and projected, and a 6DOF value is output;
A simulation step of generating an image in which the eyeglasses are overlaid on the subject's face, changing the display state according to selection, and processing subroutines for lens change, decoration change, and model change in response to a user's event request; and
An updating step in which, during the simulation, the wearing state of the glasses is linked to changes in facial posture: a virtual space is created, resources are loaded, face tracking data is applied to link the face position, the background of the virtual space is replaced with a camera image source, the tracking result is converted into a 6DOF value, and the result is processed as a snapshot.
(Claims 2 to 5: deleted)
KR1020150102782A 2015-07-21 2015-07-21 Eyeglasses try-on simulation method using augumented reality KR101805056B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150102782A KR101805056B1 (en) 2015-07-21 2015-07-21 Eyeglasses try-on simulation method using augumented reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150102782A KR101805056B1 (en) 2015-07-21 2015-07-21 Eyeglasses try-on simulation method using augumented reality

Publications (2)

Publication Number Publication Date
KR20170010985A KR20170010985A (en) 2017-02-02
KR101805056B1 true KR101805056B1 (en) 2017-12-05

Family

ID=58154216

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150102782A KR101805056B1 (en) 2015-07-21 2015-07-21 Eyeglasses try-on simulation method using augumented reality

Country Status (1)

Country Link
KR (1) KR101805056B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220096461A (en) 2020-12-31 2022-07-07 주식회사 산업기술경영진흥원 Glasses wearing simulation system using kiosk

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102286146B1 (en) 2017-12-28 2021-08-05 (주)월드트렌드 System for sales of try-on eyeglasses assembly and the assembled customized glasses
KR102149395B1 (en) 2018-08-31 2020-08-31 주식회사 더메이크 System for providing eyewear wearing and recommendation services using a true depth camera and method of the same
KR102620702B1 (en) 2018-10-12 2024-01-04 삼성전자주식회사 A mobile apparatus and a method for controlling the mobile apparatus
KR102231239B1 (en) 2018-12-18 2021-03-22 김재윤 Eyeglasses try-on simulation method
KR102325829B1 (en) * 2018-12-31 2021-11-12 이준호 Recommendation method for face-wearing products and device therefor

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100386962B1 (en) 2000-11-02 2003-06-09 김재준 Method and system for putting eyeglass' image on user's facial image
JP5648299B2 (en) 2010-03-16 2015-01-07 株式会社ニコン Eyeglass sales system, lens company terminal, frame company terminal, eyeglass sales method, and eyeglass sales program

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100050052A (en) 2008-11-05 2010-05-13 김영준 Virtual glasses wearing method
KR101260287B1 (en) 2012-04-27 2013-05-03 (주)뷰아이텍 Method for simulating spectacle lens image using augmented reality

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100386962B1 (en) 2000-11-02 2003-06-09 김재준 Method and system for putting eyeglass' image on user's facial image
JP5648299B2 (en) 2010-03-16 2015-01-07 株式会社ニコン Eyeglass sales system, lens company terminal, frame company terminal, eyeglass sales method, and eyeglass sales program

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20220096461A (en) 2020-12-31 2022-07-07 주식회사 산업기술경영진흥원 Glasses wearing simulation system using kiosk

Also Published As

Publication number Publication date
KR20170010985A (en) 2017-02-02

Similar Documents

Publication Publication Date Title
KR101805056B1 (en) Eyeglasses try-on simulation method using augumented reality
US20210215953A1 (en) Systems and methods for creating eyewear with multi-focal lenses
US20200410775A1 (en) Systems and methods for determining the scale of human anatomy from images
KR102399289B1 (en) Virtual try-on system and method of glasses
KR101821284B1 (en) Method and system to create custom products
EP3137938B1 (en) Facial expression tracking
US11900569B2 (en) Image-based detection of surfaces that provide specular reflections and reflection modification
KR20220049600A (en) Virtual fitting system and method for eyeglasses
JP2017531221A (en) Countering stumbling when immersed in a virtual reality environment
CN115803750B (en) Virtual try-on system for glasses using a frame of reference
US11062476B1 (en) Generating body pose information
KR20100050052A (en) Virtual glasses wearing method
JP2021174553A (en) Virtual image generation method and system of deep learning base
CN107492001B (en) Virtual glasses try-on method and device and service terminal
CN104299143A (en) Virtual try-in method and device
WO2023160074A1 (en) Image generation method and apparatus, electronic device, and storage medium
Tang et al. Making 3D eyeglasses try-on practical
CN116452291A (en) Virtual fitting method, virtual fitting device, electronic equipment and storage medium
CN106651500B (en) Online shopping system based on video image recognition technology and virtual reality technology of spatial feedback characteristics
KR20220067964A (en) Method for controlling an electronic device by recognizing movement in the peripheral zone of camera field-of-view (fov), and the electronic device thereof
CN113744411A (en) Image processing method and device, equipment and storage medium
Bai Mobile augmented reality: Free-hand gesture-based interaction
Kang et al. Eyeglass Remover Network based on a Synthetic Image Dataset.
KR102580427B1 (en) Method for providing virtual fitting service and system for same
US20240135555A1 (en) 3d space carving using hands for object capture

Legal Events

Date Code Title Description
A201 Request for examination
J201 Request for trial against refusal decision
J301 Trial decision

Free format text: TRIAL NUMBER: 2016101004055; TRIAL DECISION FOR APPEAL AGAINST DECISION TO DECLINE REFUSAL REQUESTED 20160708

Effective date: 20170731

S901 Examination by remand of revocation
GRNO Decision to grant (after opposition)