WO2015044866A1 - Système de mesure et de visualisation virtuelle de lunettes dans un scénario réel - Google Patents

Système de mesure et de visualisation virtuelle de lunettes dans un scénario réel

Info

Publication number
WO2015044866A1
Authority
WO
WIPO (PCT)
Prior art keywords
observer
camera
glasses
lenses
face
Prior art date
Application number
PCT/IB2014/064773
Other languages
English (en)
Portuguese (pt)
Other versions
WO2015044866A4 (fr)
Inventor
César SILVA
Original Assignee
Silva César
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Silva César
Publication of WO2015044866A1
Publication of WO2015044866A4


Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C13/00 Assembling; Repairing; Cleaning
    • G02C13/003 Measuring during assembly or fitting of spectacles
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/239 Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/246 Calibration of cameras

Definitions

  • The present invention is in the field of optics and optometry, and relates to the measurement and virtual visualization of eyeglasses using an electronic camera device, as support for eyewear e-commerce applications and for the simulation of ophthalmic and sun-protection lenses.
  • The present invention allows a user to answer two questions when choosing a particular model of glasses remotely (for example via the Internet): (1) what does the wearer look like with that particular pair of glasses, i.e. how do the glasses look when placed on his or her face; and (2) how do the glasses affect vision, i.e. what is the effect of the lenses on the objects surrounding the user.
  • The present invention proposes a divergent stereo system that simultaneously captures the observer and the observed scene. It also proposes a method that calibrates the system, reconstructs and measures points on the observer's face, and allows the visualization of virtual objects overlaid on the real scenario.
  • The Internet is a network that connects computers, businesses and people from around the world, providing a unique infrastructure for logistics, communications, retail, transactions, payments, organizations, new forms of work, etc.
  • E-commerce has grown in all business areas, including less conventional ones such as services and the retailing of furniture, clothing, shoes and glasses.
  • The present invention is a system for the measurement and visualization of virtual glasses in a real scenario involving the observer and the observed scene. It proposes not only to simulate the general appearance of the glasses on the face, but also to simulate the observed scene as seen through the virtual lenses.
  • The proposed system comprises a divergent stereo vision system (two cameras), which means that the fields of view do not intersect.
  • Divergent stereo vision systems can be found in many currently available devices, such as tablets or smartphones whose cameras are on the front and the back of the device.
  • The divergent stereo vision system captures the observer's face and the scene observed by the observer. By presenting these two views on the screen, the system offers a simultaneous perception of the observer and of the observed scene.
  • This perception is enriched by the overlay of virtual objects or effects.
  • Virtual glasses are overlaid as images over the face.
  • Image effects are applied to the objects to simulate the lenses of the glasses (ophthalmic or sun).
  • FIG. 1 shows the main steps that make up the method associated with the present invention.
  • FIG. 2 shows the divergent stereo system inserted in a real scenario composed of the observer and the observed scene. It also shows the movement made by the device to capture a video sequence of images.
  • FIG. 3 shows the system in its convergent stereo configuration, including a mirror.
  • FIG. 4 shows, in simplified geometric terms, how the system works in its convergent stereo configuration.
  • FIG. 5 shows in geometric terms how the calibration is performed, using a calibration device and its known points.
  • FIG. 6 shows the images resulting from the capture by the two cameras in the divergent stereo vision configuration, showing the observer and the observed scene with overlaid virtual objects.
  • FIG. 7 shows in geometric terms the optometric measurements obtainable with the present invention.
  • The present invention consists of a divergent stereo system and a method for simultaneously capturing an observer and the observed scene, with the following features:
  • The system consists of: a processing unit; two video cameras - a front one (201) (501), defined as the one that points at the observer, and a rear one (202) (502), defined as the one that points at the observed scene - placed in opposite directions; a screen that displays the images captured by the cameras; a rotation estimation module (using internal device sensors such as the gyroscope, compass and accelerometer, as claimed in Claim 4); and also a calibration device which, combined with a flat mirror (503), allows the system to be calibrated.
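  • As an illustration only (not part of the claimed method), the components listed above could be grouped in code as follows; all names and fields in this sketch are hypothetical:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class CameraModel:
    """Pinhole camera described by its 3x4 projection matrix P, estimated during calibration."""
    P: np.ndarray

@dataclass
class GlassesTryOnSystem:
    """Hypothetical grouping of the components described in the text."""
    front_camera: CameraModel        # (201)/(501): points at the observer
    rear_camera: CameraModel         # (202)/(502): points at the observed scene
    screen_size: tuple               # (width, height) of the display, in pixels

    def device_rotation(self, gyro, compass, accel) -> np.ndarray:
        """Rotation estimation module: fuse the internal sensors into a 3x3 rotation matrix."""
        raise NotImplementedError    # provided by the device's sensor-fusion API
```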
  • The calibration device and the mirror interact with the system as follows:
  • The embodiment of (Claim 3) is a planar grid (507) formed of black squares on a white background, containing a set of previously measured points that are introduced into the calibration process, which is essential to know the geometric relationship between the two cameras;
  • FIG. 5 shows a possible configuration of cameras C1 (501) and C2 (502), mirror S (503) and the calibration device (507).
  • M (505) is a known point of the calibration device.
  • The estimation of P1 and P2 corresponds to the resolution of an inverse problem, given m1, m2 and M.
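  • In standard multi-view geometry notation (a sketch of the usual formulation, not text taken from the patent; tildes denote homogeneous coordinates), this inverse problem can be written as:

```latex
% Projection of a known calibration point M onto the two images
\lambda_1 \,\tilde{m}_1 = P_1 \tilde{M}, \qquad
\lambda_2 \,\tilde{m}_2 = P_2 \tilde{M}
% (for camera C2 the point actually observed is the one seen via the mirror,
%  denoted Q in the text below)
% Calibration estimates P_1 and P_2 by minimising the reprojection error
% over all measured correspondences (M_k, m_{1,k}, m_{2,k}):
\{\hat{P}_1, \hat{P}_2\} = \arg\min_{P_1, P_2}
\sum_k \bigl\| m_{1,k} - \pi(P_1 \tilde{M}_k) \bigr\|^2
     + \bigl\| m_{2,k} - \pi(P_2 \tilde{M}_k) \bigr\|^2
% where \pi([x,\,y,\,w]^\top) = (x/w,\; y/w) is the perspective division.
```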
  • The solution uses tools known in the fields of algebra and regression. For the estimation process to converge and be robust, not just one point M (505) is used but a collection of points taken from the surface of the calibration device, which in this configuration are the corners of the black squares. To further enlarge this collection of points, the calibration device is captured in various positions and inclinations.
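  • A minimal sketch of this step, assuming OpenCV and that the 3D coordinates of the grid corners are known in a common reference frame; the function names are illustrative, not the patent's:

```python
import cv2
import numpy as np

def detect_grid_corners(image, pattern_size=(9, 6)):
    """Find the inner corners of the black/white calibration grid in one image."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if not found:
        return None
    corners = cv2.cornerSubPix(
        gray, corners, (5, 5), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    return corners.reshape(-1, 2)

def estimate_projection_matrix(points_3d, points_2d):
    """Direct Linear Transform: solve for a 3x4 projection matrix P from >= 6 point pairs."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 4)   # right null vector, defined up to scale
```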
  • The points Q of the calibration device captured by camera C2 contribute to the estimation of matrix P2 (Claim 7), using the same expression as above, where M is replaced by Q.
  • Figure 4 shows geometrically how to reconstruct a three-dimensional point R (405), knowing the corresponding coordinates r1 and r2 of the projections of that point in the images taken by cameras C1 and C2. Knowing r1, r2 and the relative position between the mirror S (403) and the optical centers of cameras C1 and C2, the two projection lines can be drawn, starting from the optical centers and passing through r1 and r2 (where the line from C2 is reflected on mirror S). By triangulation (intersection of the two referred lines), the point R is reconstructed in its three-dimensional coordinates.
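  • A minimal sketch of the triangulation step, assuming the mirror reflection has already been folded into the projection matrix of camera C2:

```python
import numpy as np

def triangulate_point(P1, P2, r1, r2):
    """Linear triangulation of one 3D point R from its two projections r1, r2.

    P1, P2: 3x4 projection matrices of the two views.
    r1, r2: (u, v) pixel coordinates of the point in each view.
    """
    def rows(P, uv):
        u, v = uv
        return [u * P[2] - P[0], v * P[2] - P[1]]
    A = np.asarray(rows(P1, r1) + rows(P2, r2))
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]          # back from homogeneous coordinates
```

OpenCV's cv2.triangulatePoints implements the same linear triangulation for batches of points.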
  • The system is placed in its divergent stereo configuration (without the mirror), as shown in FIG. 2.
  • The observer (204) stands motionless looking at the observed scene (Claim 10) while the camera system is moved (203) along a trajectory that is preferably horizontal and rotational, between -60 and 60 degrees, as set forth in Claim 11.
  • FIG. 6 shows an example of the images presented on the system screen.
  • The face (600) makes an apparent rotation movement between -60 and 60 degrees (604).
  • The observed scene (601) shows a certain relative motion in the image (605), as shown in FIG. 6.
  • The videos are not intended to be in perfect conformity with the real world, because the observer is not actually following the scene in coherence with the movement of the cameras; in fact the observer remains static relative to the observed scene throughout the entire video. However, the coherence between the apparent movement of the observer and the movement of the observed scene is accurate (in terms of speed), giving the illusion that the observer follows the observed scene with the gaze (and the head).
  • A step (103) of estimating the rotation of the device of said system, performed by the rotation estimation module of said system, which in the preferred configuration uses the accelerometer, gyroscope and compass. Image information can also be used to estimate the rotation, by tracking characteristic points in the images captured either by the front camera or by the rear camera.
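  • As an illustrative sketch of the image-based alternative (sensor-fusion APIs differ per device), the per-frame yaw increment could be approximated from tracked feature points, assuming an approximately pure rotation and a known focal length in pixels:

```python
import cv2
import numpy as np

def estimate_rotation_increment(prev_gray, cur_gray, focal_px):
    """Rough yaw increment (radians) between two consecutive frames, assuming
    the device motion is (mostly) a pure rotation about the vertical axis."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=8)
    if pts is None:
        return 0.0
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray, pts, None)
    good = status.ravel() == 1
    if not good.any():
        return 0.0
    dx = nxt[good, 0, 0] - pts[good, 0, 0]   # horizontal displacements of tracked points
    # for a small pure rotation about the vertical axis, dx ~= focal_px * dtheta
    return float(np.median(dx)) / focal_px
```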
  • A step (104) in which virtual glasses are overlaid on the observer's face, taking into account the device rotation estimated in d) and the points on the observer estimated in b):
  • FIG. 6 shows an image (602) with overlaid virtual glasses (606).
  • The virtual glasses must be rotated according to the movement (604) of the device at each moment of the video.
  • The virtual glasses are scaled and positioned so that the lenses projected in the image are aligned or centered with respect to the centers of the left (608) and right (607) pupils (Claim 12).
  • To align the virtual glasses it is necessary to detect in the image, for each instant of the video, the centers of the pupils. This can be achieved using known image processing techniques for eye and pupil detection.
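  • One possible realisation of this detection step, using OpenCV's stock Haar cascade for eyes (a coarse approximation; a dedicated pupil localiser would refine the centers):

```python
import cv2

_eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_pupil_centers(frame_bgr):
    """Approximate the two pupil centers as the centers of the two largest eye
    detections, returned left-to-right in image coordinates."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    eyes = _eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) < 2:
        return None
    eyes = sorted(eyes, key=lambda b: b[2] * b[3], reverse=True)[:2]
    centers = sorted((x + w / 2.0, y + h / 2.0) for x, y, w, h in eyes)
    return centers[0], centers[1]
```

The scale of the overlaid glasses can then be chosen so that the distance between the lens centers of the virtual frame matches the detected inter-pupil distance in pixels.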
  • The element overlay process (609) also applies to the observed scene (601), as presented in Claim 13, since the rotation along the video is known, and the overlaid elements can be fixed to and follow visual elements in the observed scene.
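  • A sketch of how an overlay can stay attached to the observed scene when the camera motion is approximately a pure rotation; here K is the 3x3 intrinsic matrix from the calibration step and R the rotation from the estimation module (the sign/direction convention of R is an assumption of this sketch):

```python
import cv2
import numpy as np

def warp_overlay_for_rotation(overlay_rgba, K, R):
    """Warp an overlay authored for the reference frame so it stays attached to
    the observed scene after the device rotates by R.

    For a purely rotating camera the image-to-image mapping is the homography
    H = K @ R @ inv(K)."""
    H = K @ R @ np.linalg.inv(K)
    h, w = overlay_rgba.shape[:2]
    return cv2.warpPerspective(overlay_rgba, H, (w, h))
```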
  • The observed scene can thus be simulated as seen through the glasses, by combining the overlay with the effect of the ophthalmic lens.
  • In step b) of the presented method the pupil centers are estimated.
  • FIG. 7 shows the supporting geometry for obtaining the most relevant measurements.
  • The front view is the image taken when the observer is looking directly at the camera (formally, the optical axis of the front camera is parallel to the optical axis of the observer).
  • The measurements taken in the frontal view are: - the interpupillary distance (700), which is the distance between the centers of the pupils (Claim 17); - the right height (701), which is the height between the center of the right pupil and the lower horizontal line of the right lens contour; - the left height (702), which is the height between the center of the left pupil and the lower horizontal line of the left lens contour (Claim 18); - the main measurements of the lens contours, such as the width of the lens (705), the minimum distance between the lenses (703) and the maximum height (704) (Claim 19);
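  • A minimal sketch of how these measurements follow from the reconstructed 3D points (units are those of the reconstruction, e.g. millimetres; the choice of vertical axis is an assumption of this sketch):

```python
import numpy as np

def interpupillary_distance(left_pupil_3d, right_pupil_3d):
    """Distance (700) between the reconstructed 3D pupil centers."""
    return float(np.linalg.norm(np.asarray(right_pupil_3d)
                                - np.asarray(left_pupil_3d)))

def fitting_height(pupil_3d, lens_bottom_3d, vertical_axis=(0.0, 1.0, 0.0)):
    """Height (701)/(702): vertical offset between a pupil center and the lowest
    point of the corresponding lens contour, measured along the vertical axis."""
    v = np.asarray(vertical_axis, dtype=float)
    v /= np.linalg.norm(v)
    return float(abs(np.dot(np.asarray(pupil_3d) - np.asarray(lens_bottom_3d), v)))
```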

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Optics & Photonics (AREA)
  • Eyeglasses (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The present invention is in the technical field of optics and ophthalmology. The present invention relates to a system and a method that allow a user not only to visualize virtual glasses on his or her own face, but also to simulate the effect of the lenses of those glasses on an observed scene. To this end, the invention relates to a divergent stereo system consisting of two cameras, and a method that calibrates this system, reconstructs points on the observer's face, allowing exact measurements, and superimposes virtual objects on the real scenario.
PCT/IB2014/064773 2013-09-26 2014-09-23 Système de mesure et de visualisation virtuelle de lunettes dans un scénario réel WO2015044866A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
PT107198 2013-09-26
PT107198A PT107198A (pt) 2013-09-26 2013-09-26 Sistema de medida e visualização virtual de óculos num cenário real

Publications (2)

Publication Number Publication Date
WO2015044866A1 true WO2015044866A1 (fr) 2015-04-02
WO2015044866A4 WO2015044866A4 (fr) 2015-06-18

Family

ID=51897397

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/064773 WO2015044866A1 (fr) 2013-09-26 2014-09-23 Système de mesure et de visualisation virtuelle de lunettes dans un scénario réel

Country Status (2)

Country Link
PT (1) PT107198A (fr)
WO (1) WO2015044866A1 (fr)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001088654A2 (fr) * 2000-05-18 2001-11-22 Visionix Ltd. Systeme d'adaptation de lunettes et procedes d'adaptation correspondants

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100380176C (zh) * 1997-05-16 2008-04-09 Hoya株式会社 眼镜定制系统和合成图像的方法
US20080074440A1 (en) * 2006-09-25 2008-03-27 Yiling Xie System and method of optimizing optical products through public communication network
FR2945365B1 (fr) * 2009-05-11 2011-06-24 Acep France Procede et systeme de selection en ligne d'une monture de lunettes virtuelle

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001088654A2 (fr) * 2000-05-18 2001-11-22 Visionix Ltd. Systeme d'adaptation de lunettes et procedes d'adaptation correspondants

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LEBRALY P ET AL: "Flexible extrinsic calibration of non-overlapping cameras using a planar mirror: Application to vision-based robotics", INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2010 IEEE/RSJ INTERNATIONAL CONFERENCE ON, IEEE, PISCATAWAY, NJ, USA, 18 October 2010 (2010-10-18), pages 5640 - 5647, XP031920672, ISBN: 978-1-4244-6674-0, DOI: 10.1109/IROS.2010.5651552 *
MIAOLONG YUAN ET AL: "A mixed reality system for virtual glasses try-on", PROCEEDINGS OF THE 10TH INTERNATIONAL CONFERENCE ON VIRTUAL REALITY CONTINUUM AND ITS APPLICATIONS IN INDUSTRY, VRCAI '11, 1 January 2011 (2011-01-01), New York, New York, USA, pages 363, XP055158336, ISBN: 978-1-45-031060-4, DOI: 10.1145/2087756.2087816 *

Also Published As

Publication number Publication date
WO2015044866A4 (fr) 2015-06-18
PT107198A (pt) 2015-03-26

Similar Documents

Publication Publication Date Title
TWI755671B (zh) 用於眼鏡之虛擬試戴系統及方法
CN111031893B (zh) 用于确定与眼科装置相关联的至少一个参数的方法
US11550151B2 (en) Method of determining an eye parameter of a user of a display device
JP5562520B2 (ja) ユーザの光学パラメータを決定する装置、方法、および関連するコンピュータプログラム
JP6014038B2 (ja) 眼鏡装用シミュレーション方法、プログラム、装置、眼鏡レンズ発注システム及び眼鏡レンズの製造方法
KR102073460B1 (ko) 렌즈 시스템을 통한 드리프트 프리 눈 추적을 제공하는 머리 장착형 눈 추적 디바이스 및 방법
US20160173864A1 (en) Pickup of objects in three-dimensional display
PT106430B (pt) Sistema para medição da distância interpupilar usando um dispositivo equipado com um ecrã e uma câmara
US10620454B2 (en) System and method of obtaining fit and fabrication measurements for eyeglasses using simultaneous localization and mapping of camera images
JP2019194702A (ja) 累進眼科用レンズの少なくとも1つの光学的設計パラメータを判断する方法
US20180330545A1 (en) Device and method for providing augmented reality for user styling
US20180042477A1 (en) Device and method for distance determination and / or centering using corneal reflexions
CN109979016B (zh) 一种ar设备显示光场影像的方法、ar设备和存储介质
US10685457B2 (en) Systems and methods for visualizing eyewear on a user
JP2012066002A (ja) 眼鏡の視野画像表示装置
JP6446465B2 (ja) 入出力装置、入出力プログラム、および入出力方法
Tong et al. Optical distortions in VR bias the perceived slant of moving surfaces
US12008711B2 (en) Determining display gazability and placement of virtual try-on glasses using optometric measurements
Lee et al. A calibration method for eye-gaze estimation systems based on 3D geometrical optics
Wibirama et al. Design and implementation of gaze tracking headgear for Nvidia 3D Vision®
WO2015044866A1 (fr) Système de mesure et de visualisation virtuelle de lunettes dans un scénario réel
JP6479835B2 (ja) 入出力装置、入出力プログラム、および入出力方法
US20200209652A1 (en) System and Method of Obtaining Fit and Fabrication Measurements for Eyeglasses Using Depth Map Scanning
CN111587397B (zh) 图像生成装置、眼镜片选择系统、图像生成方法以及程序
Zhou et al. Optical-path-difference analysis and compensation for asymmetric binocular catadioptric vision measurement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14796892

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14796892

Country of ref document: EP

Kind code of ref document: A1