IL286236B2 - Method, centering device and computer program product for measuring the distance of a user from a centering device - Google Patents

Method, centering device and computer program product for measuring the distance of a user from a centering device

Info

Publication number
IL286236B2
Authority
IL
Israel
Application number
IL286236A
Other languages
Hebrew (he)
Other versions
IL286236A (en)
IL286236B1 (en)
Original Assignee
Rodenstock Gmbh
Application filed by Rodenstock Gmbh filed Critical Rodenstock Gmbh
Publication of IL286236A publication Critical patent/IL286236A/en
Publication of IL286236B1 publication Critical patent/IL286236B1/en
Publication of IL286236B2 publication Critical patent/IL286236B2/en

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 13/00 Assembling; Repairing; Cleaning
    • G02C 13/003 Measuring during assembly or fitting of spectacles
    • G02C 13/005 Measuring geometric parameters required to locate ophthalmic lenses in spectacles frames

Landscapes

  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Optics & Photonics (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Description

R 3274IL

Method, centering device and computer program product for determining the distance of a user from a centering device

The present invention relates to a method for measuring the distance of a user from a centering device, a centering device for measuring the distance of a user from the centering device, and a computer program product for carrying out the method.
Since the introduction of individually optimized spectacle lenses, it has been possible to meet the requirements of persons with visual defects and, for example, to provide spectacle lenses with individually optimized visual ranges. Individually configured spectacle lenses make it possible to optimally correct the optical defects of vision of a user of the spectacle lenses.
The lenses of optical eyeglasses are regularly edged to certain target specifications, which may vary depending on the user of the eyeglasses. For example, the optical lenses of the eyeglasses are regularly arranged in the spectacle frame in such a way that look-through points of the user’s eyes in the position of use are arranged at predetermined positions on the respective eyeglass lenses. In this context, it is said that the spectacle lenses are correctly centered in the frame of the eyeglasses.
Checking this centering of the spectacle lenses, i.e., checking manufacturing quality and/or grinding-in quality, may be carried out either on the finished eyeglasses or on the edge-formed spectacle lenses. For the centering, optical parameters of the user can be determined, such as a pupil distance, look-through points, a frame lens angle, etc.
Optical parameters and terms such as "pupil distance", "grinding-in height", "centering point", "position of use", "functional engravings", "look-through point", etc. are defined in, and may be learned from, relevant standards, such as DIN EN ISO 13666, DIN 58 208, DIN EN ISO 8624 and DIN 5340.
Traditionally, centering is checked using manual measuring procedures, e.g. at the optician, by marking the functional engravings and measuring distances using a pupil distance ruler. Since they are performed by human operators, manual checks are always prone to error.
There are also known semi-automated methods for checking the centering, the grinding-in height and/or the pupil distance, wherein individual points (such as a centering point) are first pre-marked on the spectacle lenses.
Document WO 2006/079540 A1 discloses a centering device by means of which optical parameters of a user can be determined and thus a centering can be performed. With this prior art centering device, the user is positioned, for example by an optician, in front of the centering device in such a way that the user is positioned at a target distance in front of it, in which the centering device records images of the user. The centering device comprises imaging devices that generate images of the user's face. The optical parameters required for centering are calculated from these images.
It is an object of the invention to simplify and/or improve the operation of a centering device, in particular to simplify and/or improve the positioning of the user in front of a centering device.
This object is achieved by the subject-matter of the independent claims. Embodiments of the invention are subject-matter of the dependent claims.
An aspect relates to a method for measuring the distance of a user from a centering device which is configured to determine optical parameters of the user. In this method, the distance is determined from a centering device that comprises two imaging devices having a previously known calibration with respect to each other. The user is positioned in front of the centering device in such a way that the two imaging devices are at least partially directed at the face of the user facing the centering device. By means of the two imaging devices, two image data sets of at least partial areas of the user's face are generated from two different spatial directions. In each of the two image data sets, a plurality of predetermined characteristic facial points of the user are determined by means of image processing algorithms. By comparing the position of corresponding characteristic facial points in the two image data sets, the distance of the user from the centering device is determined, taking into account the calibration of the two imaging devices with respect to each other.
The method is used to measure the distance of the user from the centering device. The method may have the objective to position the user in front of the centering device in such a way that the latter can determine optical parameters of the user in order to perform centering. In this context, it is advantageous if the user is positioned in a target distance range in order to perform the centering with particularly good results. The target distance range may depend, for example, on the depth of field range to which the centering device's imaging devices are set. Since the optical parameters of the user are preferably determined based on pixel-precise image data, it is advantageous if the user is arranged in the target distance range in such a way that the imaging devices can record high-quality, i.e. sharp, image data of at least parts of the user's face.
The centering device may be a centering device as disclosed in above mentioned document WO 2006/079540 A1. The centering device comprises at least two imaging devices having a fixed, predetermined calibration with respect to each other. For determining the optical parameters, there are generated centering image data of the user, more precisely at least of partial areas of the user's face, using the at least two imaging devices. Using the image data generated by the two imaging devices, and based on the knowledge of the a priori known calibration and, as the case may be, the user’s distance from the centering device, it is then possible to determine 3D coordinates of individual points in the face of the user and/or of eyeglasses or a spectacle frame worn by the user in the position of use. These 3D coordinates may then be used to calculate the optical parameters important for centering, such as pupil distance, frame lens angle, etc.
It is crucial for the determination of the optical parameters that the user is correctly positioned in front of the centering device. The positioning of the user in front of the centering device may include a check of the distance of the user from the centering device. With the method according to the invention, the distance of the user from the centering device can be determined without additional hardware components (such as an infrared distance sensor), that is using only hardware components that the centering device already comprises. For example, at least two of the imaging devices, preferably exactly two of the imaging devices of the centering device are used, as well as a data processing device which may also already be present in the centering device. Additional hardware components can be dispensed with, which may reduce the number of components and/or the costs of the centering device.
According to the method, the user is positioned in front of the centering device in such a way that the at least two imaging devices are at least partially directed at the face of the user who is facing the centering device. In other words, the user is positioned in front of the centering device in such a way that the imaging devices can capture images of the user's face at least partially. In this first, approximate positioning of the user, it is not yet necessary that the user be positioned exactly within a target distance range. Rather, the user may be positioned arbitrarily in front of the centering device as long as the user is more or less facing it. Preferably, however, the user may already be positioned in front of the centering device at least approximately in a target distance range (e.g. estimated by the optician).
Subsequently, using the at least two imaging devices, there are generated at least two image data sets of at least partial areas of the user's face from two different spatial directions. The image data sets respectively may contain either at least partial areas of the user's face, or additionally also eyeglasses and/or a spectacle frame which the user is wearing in the position of use. The image data sets are generated from two different spatial directions because they are generated by the two differently positioned imaging devices. Here, the imaging devices substantially simultaneously generate the two image data sets so that these contain two images and/or recordings and/or information of at least partial areas of the user's face at approximately the same time.
Subsequently, a plurality of predetermined, in particular individual, characteristic facial points of the user are determined in each of the two image data sets by means of image processing algorithms. The individual, predetermined characteristic facial points of the user may include nasal and/or temporal points on the eyebrows, the corners of the eyes, pupil centers, cheekbone points, ear base points, chin base points, and/or similar facial points of the user. Here, the image processing algorithms may include image recognition algorithms that automatically and/or semi- automatically recognize the predetermined characteristic facial points in the image data sets.
The image processing algorithms may have been trained accordingly in order to recognize the predetermined facial points as accurately as possible. The image processing algorithms are thus preferably trained image processing algorithms which have been trained, for example by means of "machine learning", to recognize the predetermined characteristic facial points, preferably on the basis of at least 20, more preferably on the basis of at least 100, image data sets of different faces.
It should be noted here that the detection of individual, predetermined characteristic facial points in the image data sets according to the invention differs from a so-called block matching of the different image data sets. In prior art stereo camera systems, it is known to use so-called block matching methods to determine corresponding areas in the recorded image. In that case, a pixel area is selected in each of the images captured by the two different stereo cameras and compared with a pixel area in the other image data set. For example, in conventional stereo camera systems, there may be found in the two different image data sets pixel areas of about 10 x 10 pixels each that are as similar as possible and therefore possibly match each other. In this way conventional stereo camera systems are able to determine a distance and/or a spatiality. Block matching methods however are only suitable for stereo camera systems in which the recording cameras only generate image data sets from recording directions that differ insignificantly. Thus, there are generated only images having sufficiently corresponding pixel blocks. Therefore, block matching methods considerably limit the positioning possibilities of the stereo cameras.
Using the image processing algorithms to determine the characteristic individual facial points of the user, it is possible to arrange the imaging devices with a larger vertical and/or horizontal distance to each other on the centering device than would be possible with block matching methods. The image processing algorithms used thus improve the possibilities of positioning the imaging devices. For example, commercially available face recognition software can recognize individual facial points even when a face is captured only in profile.
After the predetermined characteristic facial points have been determined in the two (preferably digital) image data sets, the positions of corresponding characteristic facial points in the two image data sets are compared with each other. In particular, the pixel positions of the corresponding characteristic facial points may be compared. Here, the term "corresponding characteristic facial points" means the same facial point in the two image data sets, such as the left pupil center, the chin tip, the nose tip, etc. For example, the pixel position of the left pupil center point in the first image data set can be compared with the pixel position of the left pupil center point in the second image data set. Taking into account the calibration of the two imaging devices with respect to each other, the distance of the user from the centering device can be determined from the positions of the corresponding characteristic facial points, for example, by means of triangulation. Such triangulation methods are known in the state of the art and allow, for example, to determine 3D coordinates of the corresponding characteristic facial points, which in turn may allow to calculate and/or determine the distance from the centering device.
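The triangulation step described above can be sketched as follows. This is a minimal illustration assuming an idealized, rectified stereo pair with a purely horizontal baseline, so that depth follows from Z = f · B / d (disparity d in pixels, focal length f in pixels, baseline B in meters); the patent itself does not fix such a geometry, and all numbers are hypothetical.

```python
def distance_from_disparity(x_left, x_right, focal_px, baseline_m):
    """Depth of one corresponding facial point in a rectified stereo
    pair: Z = f * B / d, where d is the horizontal disparity in pixels,
    f the focal length in pixels and B the camera baseline in meters."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("corresponding point must have positive disparity")
    return focal_px * baseline_m / disparity

def user_distance(points_left, points_right, focal_px, baseline_m):
    """Average the per-point depths over all corresponding characteristic
    facial points (given as x pixel coordinates in each image data set)."""
    depths = [distance_from_disparity(xl, xr, focal_px, baseline_m)
              for xl, xr in zip(points_left, points_right)]
    return sum(depths) / len(depths)

# Two facial points (e.g. left pupil center, nose tip) seen at shifted
# pixel columns; f = 1400 px, B = 0.10 m, disparity 200 px -> Z = 0.7 m.
z = user_distance([820.0, 640.0], [620.0, 440.0], 1400.0, 0.10)
```

A real centering device would additionally apply the full internal and external calibration (lens distortion, non-parallel optical axes) rather than this rectified simplification.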
The distance of the user determined in this way may be used in the course of calculating the optical parameters in a subsequent measuring step, and/or may be used to correct the position of the user in front of the centering device. For example, the determined distance may indicate that the distance of the user from the centering device needs to be reduced and/or increased before the actual measurement of the user's optical parameters is performed. The distance measurement may be repeated until the user is within a target distance range in which the determination of the optical parameters by the centering device can take place.
Alternatively or additionally, it is also possible to use the distance determined in this way directly in the calculation of the optical parameters, specifically as part of the basic data based on which the user data in three-dimensional space are generated by the centering device.
Thus, the method according to the invention allows to make a distance determination without requiring further additions to the hardware of the centering device. The distance can be determined using solely the hardware components already present in the centering device. This can improve the application possibilities of the centering device as well as the general handling of the centering device. Furthermore, a more accurate distance determination may improve the result of the determination of the user's optical parameters.
According to one embodiment, the characteristic facial points in the two sets of image data are determined using face recognition software. Face recognition software is designed to recognize faces of persons in image data sets. For this purpose, face recognition software programs use so-called feature points, which they attempt to identify and recognize in the faces, and/or on the basis of which they recognize faces in image data sets. In the method, the predetermined characteristic facial points are used as feature points of a face recognition software. Here, there may be used by default exactly those facial points that the corresponding face recognition software uses by default. Alternatively, additional and/or other facial points may be used for this purpose. These facial points may include, in particular, facial points close to the eyes, on the basis of which the distance of the user's eye area from the centering device can be determined particularly well. As an example, the image processing algorithms of the Dlib libraries, available on the Internet, may be used for this purpose. Dlib is a free software library, written in the C++ programming language, comprising algorithms for machine learning, image processing and machine vision. Thus, standardized, free and/or commercially available face recognition software may be used to determine the distance of the user from the centering device.
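As an illustration of selecting near-eye feature points, the sketch below assumes the widely used 68-point facial landmark numbering (as produced, for example, by Dlib's shape predictor models: indices 36-41 for one eye, 42-47 for the other). That numbering is a convention of those models, not something specified by the text above.

```python
# Index ranges of the common 68-point landmark scheme (an assumption
# taken from Dlib-style shape predictors, not from the patent itself).
LANDMARK_GROUPS = {
    "right_eye": range(36, 42),
    "left_eye": range(42, 48),
    "nose": range(27, 36),
}

def near_eye_points(landmarks):
    """Select only the near-eye landmarks from a full 68-point list of
    (x, y) pixel coordinates - the points on whose basis the distance
    of the user's eye area can be determined particularly well."""
    indices = list(LANDMARK_GROUPS["right_eye"]) + list(LANDMARK_GROUPS["left_eye"])
    return [landmarks[i] for i in indices]

# 68 dummy landmarks: landmark i sits at pixel (i, i).
pts = [(i, i) for i in range(68)]
eyes = near_eye_points(pts)
```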
In an embodiment, in each of the two sets of image data there are determined from about 20 to about 100 different characteristic facial points. This number is necessary and/or sufficient in order to recognize with sufficient accuracy the face in the image data sets. Preferably, there are determined in each image data set from about 40 to about 80 different characteristic facial points. For example, there exist standardized face recognition software programs that recognize faces in image data sets based on different face points as feature points.
According to an embodiment, there is generated a feedback depending on the determined distance at least until the user is positioned in a target distance range in front of the centering device. In this case, the user may be guided into the target distance range by means of a feedback signal. The feedback may be output by the centering device, for example by means of a colored signal lamp, a display of the current actual distance and/or an acoustic signal and/or a voice output, which guides the user into the target distance range.
The target distance range in particular may be a range in front of the centering device in which the image acquisition devices achieve a maximum of their depth of field. For example, the target distance range may be arranged as a distance range from about cm to about 100 cm, preferably from about 60 cm to about 75 cm, in front of the centering device. Depending on the quality of the image acquisition devices, the target distance range may also be dimensioned differently, in particular may be larger, if higher quality image acquisition devices (e.g. with a larger depth of field range) are used. In this case, the determination of the distance may be performed iteratively to allow the user to be guided into the target distance range. Furthermore, even after the target distance range has been reached, the distance may continue to be determined, for example to provide positive feedback, for example to indicate that the user is within the target distance range. The last determined distance of the user from the centering device may be taken as an input for determining the user's optical parameters during a subsequent main measurement.
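The iterative guidance into the target distance range can be sketched as a simple feedback loop. The 60-75 cm bounds are the preferred range mentioned above; the message strings and function names are illustrative assumptions.

```python
def distance_feedback(measured_cm, target_min_cm=60.0, target_max_cm=75.0):
    """Map a measured user distance to a feedback message guiding the
    user into the target distance range (bounds from the preferred
    range in the text; messages are illustrative)."""
    if measured_cm < target_min_cm:
        return "move back"
    if measured_cm > target_max_cm:
        return "move closer"
    return "ok"

def guide_user(measurements):
    """Iterate over successive distance measurements (e.g. from a live
    feed) until the user is inside the target range; return the last
    determined distance, which may feed the subsequent main measurement."""
    for m in measurements:
        if distance_feedback(m) == "ok":
            return m
    return None

# The user starts too far away and steps closer until "ok" is reached.
final = guide_user([90.0, 82.0, 71.0])
```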
According to an embodiment, the characteristic facial points include both a right and a left near-pupil facial point. Taking into account the distance of the determined right and left near-pupil facial points, an average pupil distance of the user from the centering device is determined as the distance of the user from the centering device. This ensures that the characteristic points of the face and/or of the eyeglasses arranged on the face which are relevant for determining the optical parameters are all arranged approximately in the target distance range. For example, the points relevant for determining the optical parameters may all be arranged near the pupil. In particular, the respective center of the pupil may be used as the right and/or left near-pupil facial point. Alternatively, another characteristic facial point of the respective eye or of a part of the face close to the eye may be used for this purpose, such as at the brows and/or the nose of the user.
According to an embodiment, digital videos are recorded with the imaging devices as the two image data sets. Face recognition software may also be used to recognize characteristic facial points in videos. Typically, face recognition software is able to identify and recognize individual characteristic facial points in both still images and moving images, i.e., videos.
In a further aspect of this embodiment, the two sets of image data are generated as a live feed in which the user's characteristic facial points are identified (e.g., iteratively) in real time. In this case, the live feed may for example be sent to a screen on which an operator, such as an optician, can verify that the image processing algorithms correctly recognize the face. The live feed may be used particularly well to guide the user into a target distance range.
According to an embodiment, after having performed the distance measurement using the two image acquisition devices, the centering device generates respective centering image data of at least one partial area of the user's face and determines optical parameters of the user based on these centering image data (e.g., relevant for centering).
There may be generated 3D coordinates of predetermined points in and/or around the user's face, from which the optical parameters can be calculated. The 3D coordinates may be determined in any 3D coordinate system, e.g., in the reference system of the earth, in the reference system of the user, in the reference system of the centering device, and/or in the reference system of the image acquisition devices. The 3D coordinates subsequently may be further processed.
The centering image data may in particular include information about the user's head and/or face with a spectacle frame attached thereto. There may be acquired 3D coordinates of predetermined points of the system comprising the user's head with spectacle frame, which may be used to calculate optical parameters, e.g., pupil centers.
In a further aspect of this embodiment, the image data sets for determining the distance of the user are generated at a lower resolution than the centering image data for determining the optical parameters. For example, the image data sets for determining the distance may be generated with half or a quarter of the full and/or maximum camera resolution, whereas the centering image data are generated with the full, maximum camera resolution. Using a reduced amount of data for distance measurement, and thus a reduced volume of data, may be utilized for example to enable a live data connection via a local WLAN. Technically, it may be sufficient for distance determination to generate the image data sets for distance measurement in a lower resolution, since measuring the distance does not require to be performed with a high pixel accuracy. Thus, for distance measurement, it is only necessary to place the user within a target distance range, which may include a distance range of several centimeters and thus comprises a distance variance. The actual main measurement, i.e. the measurement of the optical parameters, is however preferably carried out with pixel accuracy and the highest possible resolution in order to enable the most accurate centering possible.
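The resolution reduction for the distance measurement can be illustrated with a simple integer-factor decimation. This is a minimal sketch: real devices would typically downscale in the camera or with an imaging library, and pixel coordinates found in the reduced image must be scaled back by the same factor before they are compared against the full-resolution calibration.

```python
def downsample(image, factor=2):
    """Decimate a 2-D image (given as a list of pixel rows) by an
    integer factor, e.g. factor=2 for half resolution. Coordinates
    measured in the result must be multiplied by `factor` to map back
    to full-resolution pixel positions."""
    return [row[::factor] for row in image[::factor]]

# An 8x8 test image whose pixel value encodes its (x, y) position.
full = [[x + 10 * y for x in range(8)] for y in range(8)]
half = downsample(full, 2)  # 4x4: a quarter of the data volume
```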
According to an embodiment, the calibration comprises both an internal and an external calibration. The internal calibration may include data and/or metrics that describe and/or define how a light beam exits the imaging device(s). For example, the internal calibration may include a focal length, an image size, a lens distortion, an image center (i.e., a target of the central beam from the imaging device), a pixel size, and/or a relation of the pixel size to a unit of measure in real space. The external calibration may include data and/or metrics that describe and/or define how the two imaging devices are positioned relative to each other. In particular, the external calibration may include a distance between the two cameras, i.e., location information in three-dimensional space, and an orientation of the optical axes of the imaging devices, e.g., an angle therebetween.
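The two calibration parts listed above can be grouped into a small data structure. The field names and example values are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class InternalCalibration:
    """How a light beam exits one imaging device (illustrative fields)."""
    focal_length_px: float            # focal length in pixel units
    image_size: tuple                 # (width, height) in pixels
    principal_point: tuple            # image center hit by the central beam
    pixel_size_mm: float              # relation of pixels to real-space units
    distortion: list = field(default_factory=list)  # lens distortion coefficients

@dataclass
class ExternalCalibration:
    """How the two imaging devices are positioned relative to each other."""
    baseline_m: float                 # distance between the two cameras
    axis_angle_deg: float             # angle between the optical axes

internal = InternalCalibration(1400.0, (1920, 1080), (960.0, 540.0), 0.003)
external = ExternalCalibration(baseline_m=0.10, axis_angle_deg=12.0)
```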
An aspect relates to a centering device for determining optical parameters of a user and for measuring the distance of the user from the centering device, comprising two imaging devices having a pre-known calibration with respect to each other, which are designed and arranged to respectively generate image data sets of at least partial areas of the user's face from two different spatial directions. A data processing device comprises point recognition means configured to determine, in each of the generated image data sets, a plurality of predetermined characteristic facial points of the user, respectively, by means of image processing algorithms. The data processing device further comprises a distance determination device configured to determine a distance of the user from the centering device by comparing the position of corresponding characteristic facial points in the two image data sets, taking into account the calibration of the two imaging devices with respect to each other.
The centering device may in particular carry out the method according to the aspect described above. Therefore, all explanations regarding the method also apply to the centering device and vice versa.
According to an embodiment, the point recognition means is configured to determine the characteristic facial points in the two sets of image data by means of face recognition software.
In an embodiment, the centering device comprises distance output means configured to generate a feedback depending on the distance determined by the distance determination device, at least until the user is positioned within a target distance range in front of the centering device. The distance output means may, for example, comprise a screen that outputs a feedback regarding the current actual distance of the user from the centering device. Alternatively or additionally, the distance output means may comprise a speaker and/or indicator lights such as LEDs.
According to an embodiment, the two imaging devices are further configured and arranged respectively to generate centering image data of at least portions of the user's face. The data processing device comprises a user data determination device configured to determine, based on the generated centering image data, user data of at least a portion of the face or at least a portion of a system of the face and a pair of eyeglasses of the user disposed thereon in a position of use, the user data comprising location information in three-dimensional space of predetermined points of the portion of the face or the portion of the system. The data processing device further comprises a parameter determination device configured to determine at least a portion of the user's optical parameters based on the user data. The centering device comprises a data output means configured to output at least a portion of the determined optical parameters of the user. Details of the arrangement of the imaging devices, the parameter determination device and the data output means can be found in WO 2006/079540 A1.
In other words, the centering device comprises all elements for a professional, individual centering of a pair of eyeglasses for the user. Here, the data processing device may be realized in one or in several parts. Furthermore, the data output means may also be realized as being the distance output means. Alternatively, separate output means may be provided for this purpose. Preferably, however, the distance measurement is performed without additional hardware, the data processing device being a single unit and the output means being designed both as a distance output means and as a data output means. As a result, the component cost is reduced and thus the overall construction cost is reduced.
One aspect relates to a computer program product comprising program portions which, when loaded in a computer, are configured to perform the method described above.
In the context of the present invention, the terms "substantially" and/or "about" may be used to include a deviation of up to 5% from a numerical value following the term, a deviation of up to 5° from a direction following the term, and/or from an angle following the term.
Terms such as top, bottom, above, below, etc., refer to the reference frame of the earth in an operating position of the subject matter of the invention, unless otherwise specified.
The invention is described in more detail below with reference to exemplary embodiments shown in figures. Here, identical or similar reference signs may indicate identical or similar features of the embodiments. Individual features shown in the figures may be implemented in other embodiments.
Fig. 1A is a schematic top view of a user with elements of a centering device;
Fig. 1B is a schematic side view of the user with elements of the centering device;
Fig. 2A is a schematic representation of an image data set generated by an imaging device containing information about the user's face; and
Fig. 2B is a schematic representation of the image data set generated by the imaging device with marked characteristic facial features of the user.
Figures 1A and 1B schematically show a user 100 in the measuring area of a centering device, which is realized as a device for determining 3D coordinates of predetermined points. Regarding the centering device, only a first imaging device 10 and a second imaging device 20 are shown as elements. Both imaging devices 10 and 20 are aligned to a designated measuring range. The measuring range is not specifically identified in Figures 1A and 1B and may correspond to a target distance range. In the measuring range, there is positioned the user 100, more precisely, the head of the user 100 with a spectacle frame 110 arranged thereon in the use position. The use position is defined in the standards mentioned in the above. The centering device is configured and provided to detect predetermined points of the system consisting of the head of the user 100 and the spectacle frame 110 arranged thereon in the position of use, and to calculate their 3D coordinates. This is used to calculate optical parameters such as a pupil distance, a frame lens angle, a spectacle lens pre-tilt, a corneal vertex distance of the eyeglasses and the eye, and a grinding-in height of the spectacle lenses.
The imaging devices 10 and 20 may be digital cameras and respectively comprise an optical axis 11 and 21. The first optical axis 11 is directed from the center of a recording lens and/or aperture of the first imaging device 10 to the measurement area, and defines an imaging direction of an image that may be recorded by the first imaging device 10. Likewise, the second optical axis 21 is directed from the center of a pickup lens and/or aperture of the second imaging device 20 to the measurement area and defines an imaging direction of an image that may be recorded by the imaging device 20.
The first imaging device 10 and the second imaging device 20 are arranged in a pre-known relation to each other, i.e. they have a pre-known calibration. This may mean that the distance between the two imaging devices 10 and 20 in three-dimensional space is known. Furthermore, the arrangements and in particular the courses of the optical axes 11 and 21 with respect to each other may be pre-known. In particular, in case the optical axes 11 and 21 intersect, an angle between the first optical axis 11 and the second optical axis 21 may be pre-known.
In the embodiment shown in Figures 1A and 1B, the optical axes 11 and 21 intersect at least approximately between the eyes of the user, on the bridge of the nose. In alternative embodiments, the optical axes 11 and 21 do not necessarily intersect at a point, but may merely come within a minimum distance of each other in the measurement range and/or target distance range. This smallest distance may be, for example, at most 10 cm. Here, as part of the calibration, there may be known the smallest distance between the optical axes 11 and 21, an intermediate vertical angle, if any, between projections of the optical axes 11 and 21 onto a vertical plane, and/or an intermediate horizontal angle, if any, between projections of the optical axes 11 and 21 onto a horizontal plane.
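Whether two non-intersecting optical axes stay within such a tolerance can be checked with the standard closest-distance formula for two lines in space. The following sketch is illustrative only; the function name and the representation of an axis as a point plus a direction vector are assumptions, not part of the patent.

```python
import math

def closest_distance_between_lines(p1, d1, p2, d2):
    """Smallest distance between two lines p1 + t*d1 and p2 + s*d2.

    p1, p2: a point on each optical axis; d1, d2: direction vectors.
    """
    # The cross product of the two directions spans the common perpendicular.
    cx = d1[1]*d2[2] - d1[2]*d2[1]
    cy = d1[2]*d2[0] - d1[0]*d2[2]
    cz = d1[0]*d2[1] - d1[1]*d2[0]
    norm = math.sqrt(cx*cx + cy*cy + cz*cz)
    w = (p2[0]-p1[0], p2[1]-p1[1], p2[2]-p1[2])
    if norm < 1e-12:
        # Axes are parallel: distance of p2 from the first line.
        t = sum(wi*di for wi, di in zip(w, d1)) / sum(di*di for di in d1)
        foot = tuple(p + t*di for p, di in zip(p1, d1))
        return math.dist(foot, p2)
    # Project the connecting vector onto the unit common perpendicular.
    return abs(w[0]*cx + w[1]*cy + w[2]*cz) / norm
```

A calibration check could then assert, for example, `closest_distance_between_lines(...) <= 0.10` for axes given in meters.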
In the exemplary embodiment shown in Figs. 1A and 1B, the optical axes 11 and 21 have a horizontal offset from each other, and thus a vertical angle between them (cf. Fig. 1B), as well as a vertical offset from each other, and thus a horizontal angle between them (cf. Fig. 1A). This arrangement with both a horizontal and a vertical offset between the two imaging devices 10 and 20 can enable a particularly favorable determination of the optical parameters relevant for centering.
Based on the aforementioned arrangements of the imaging devices 10 and 20 as well as the associated optical axes 11 and 21, it is possible to calculate by means of triangulation the 3D coordinates of predetermined points in centering image data sets recorded by the imaging devices 10 and 20. The 3D coordinates of the predetermined points can be used to calculate the optical parameters required for centering.
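One common way to triangulate a 3D point from two calibrated cameras is the midpoint method: the calibration turns each matched image point into a viewing ray from the camera center, and the 3D point is taken as the midpoint of the shortest segment between the two rays. This is a minimal sketch of that idea, not the patent's specific implementation; camera centers and ray directions are assumed given by the calibration.

```python
def triangulate_midpoint(c1, d1, c2, d2):
    """Approximate 3D position of a point seen by two calibrated cameras.

    c1, c2: camera centers; d1, d2: viewing-ray directions through the
    matched image points. Returns the midpoint of the shortest segment
    connecting the rays c1 + t*d1 and c2 + s*d2.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    # Solve for the ray parameters t, s minimizing |(c1+t*d1) - (c2+s*d2)|.
    w = tuple(a - b for a, b in zip(c1, c2))
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("rays are (nearly) parallel")
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    p1 = tuple(ci + t * di for ci, di in zip(c1, d1))
    p2 = tuple(ci + s * di for ci, di in zip(c2, d2))
    return tuple((x + y) / 2 for x, y in zip(p1, p2))
```

For exactly intersecting rays the midpoint is the intersection itself; with measurement noise it lies halfway along the residual gap between the rays.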
Here, the centering device may be adjusted, calibrated and/or fixed under statically fixed arrangement of the imaging devices 10 and 20 with respect to each other. A calibration object may be used to calibrate the device, i.e., to record and/or store the pre-known relation of the first imaging device 10 to the second imaging device 20.
By means of the centering device, however, not only the optical parameters are determined; in a preceding step, the distance of the user from the centering device is first determined, i.e. for example the average distance to the two imaging devices 10 and 20. For this purpose, each of the two imaging devices 10 and 20 generates a respective image data set which contains at least a partial area of the user's face.
Fig. 2A schematically shows such a digital image data set generated by the imaging device 10, which contains information about the face of the user 100. Here, the image data set contains the entire face of the user 100, on which a spectacle frame 110 and/or eyeglasses are arranged in the position of use. However, since the spectacle frame 110 is less relevant for the distance measurement, the distance measurement may also be performed without the spectacle frame, i.e. the image data set may only contain information about the face of the user 100 without the spectacle frame 110.
For measuring the distance, each of the two digital image data sets, which were recorded by the two imaging devices 10 and 20 at the same time, is examined for individual, predetermined, characteristic facial points. This examination is performed automatically or semi-automatically by means of trained image processing algorithms, in particular by means of face recognition software.
Fig. 2B schematically shows which individual, characteristic facial points 30, 31, 32 have been determined and marked by the image processing algorithms in the image data set generated by the imaging device 10 (and shown in Fig. 2A). The image processing algorithms are trained to recognize predetermined facial points 30, 31, 32 in the image data sets generated by the imaging devices 10 and 20.
Commercially available image processing algorithms for face recognition may, for example, be trained to identify and determine 68 different facial points as so-called feature points. The image processing algorithm used for distance measurement can, for example, use exactly these standard facial points and determine them during the distance measurement.
Alternatively, the image processing algorithm may be trained to determine additional facial points, and/or some of the default facial points used may be omitted.
Completely different, newly defined facial points may also be used and determined for distance measurement.
In an embodiment, only facial points around the eye area are used for distance measurement, which (in another embodiment) may be supplemented by facial points of the nose area. In these embodiments, image data sets containing only partial areas of the face of the user 100, e.g. only the eye area and/or nose area of the user 100, may suffice for distance measurement.
In an embodiment, a different number of facial points is identified and used for distance measurement. For a sufficient balance between required computing power and achieved accuracy, it is advantageous to use between about 40 and about 100 different, single characteristic facial points for distance measurement. However, this number may vary; in particular, it may be increased as transmission capacity and/or computing power increase.
In the image data set shown in Fig. 2B, face points 30, 31 and 32 are determined and marked with a black circle, only some of which are exemplarily marked with a reference sign. Some neighboring face points are connected with a connecting line according to a scheme specified by the image processing algorithm.
In the image data set, a group of facial points 30 around the chin area up to the two ears is marked and connected with connecting lines 40. Furthermore, a group of facial points 30 around the mouth area, a group around the nose area, a group around each of the two eyes, and a group around each of the two eyebrows are each marked with a plurality of facial points 30, which are each connected to each other with connecting lines 40 guided by predetermined algorithm rules.
In general, the image processing algorithm used may use and determine in the image data sets at least one, more, or all of the following groups of characteristic facial points:
- a chin group of facial points 30 around the chin area, possibly up to both ears of the user 100, wherein the chin group may include the tip of the chin as a facial point 30;
- a mouth group of facial points 30 around the mouth area of the user 100, wherein the mouth group may include the right and/or left corner of the lips as a facial point 30;
- a nose group of facial points 30 around the nose area of the user 100, wherein the nose group may include the tip of the nose, a right nostril center, and/or a left nostril center as a facial point 30;
- a right eye group of facial points 30 around the right eye of the user 100, wherein the right eye group may include the right pupil center 31 as a facial point 30;
- a left eye group of facial points 30 around the left eye of the user 100, wherein the left eye group may include the left pupil center 32 as a facial point 30;
- a right eyebrow group of facial points 30 around the right eyebrow of the user 100, wherein the right eyebrow group may include a right and/or left eyebrow tip as a facial point 30; and/or
- a left eyebrow group of facial points 30 around the left eyebrow of the user 100, wherein the left eyebrow group may include a right and/or left eyebrow tip as a facial point 30.
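These groups map naturally onto the index ranges of the widely used 68-point facial landmark convention (the iBUG 300-W annotation scheme used, e.g., by dlib's shape predictor). The concrete index ranges below are a property of that convention, not of the patent, and the selection helper is a hypothetical illustration.

```python
# Index ranges of the common 68-point facial landmark convention
# (0-based, end-exclusive), grouped as in the description above.
LANDMARK_GROUPS = {
    "chin":          range(0, 17),   # jawline, ear to ear
    "right_eyebrow": range(17, 22),
    "left_eyebrow":  range(22, 27),
    "nose":          range(27, 36),
    "right_eye":     range(36, 42),
    "left_eye":      range(42, 48),
    "mouth":         range(48, 68),
}

def points_for_groups(landmarks, group_names):
    """Select the landmark coordinates belonging to the given groups.

    landmarks: sequence of 68 (x, y) pixel coordinates, in the order
    produced by a 68-point landmark detector.
    """
    return [landmarks[i] for name in group_names for i in LANDMARK_GROUPS[name]]
```

An eye-area-only distance measurement, as in the embodiment above, would then operate on `points_for_groups(landmarks, ["right_eye", "left_eye"])`.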
In particular, in the embodiment shown, a right pupil-near facial point 31 as well as a left pupil-near facial point 32 are determined and identified. To measure the distance of the user from the centering device, a mean pupil distance may be used, i.e. the average of the distances of the right pupil-near facial point 31 and of the left pupil-near facial point 32 from the centering device. Alternatively, a mean eyebrow distance and/or a nose distance may also be used as the distance of the user from the centering device.
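Once the two near-pupil points have been located in 3D, the mean pupil distance reduces to averaging their distances from a reference point on the device. A minimal sketch, assuming the device origin and the triangulated 3D coordinates are given in meters:

```python
import math

def mean_pupil_distance(right_pupil_point, left_pupil_point,
                        device_origin=(0.0, 0.0, 0.0)):
    """Average of the distances of the two near-pupil 3D points
    (reference signs 31 and 32) from a reference point on the device."""
    d_right = math.dist(right_pupil_point, device_origin)
    d_left = math.dist(left_pupil_point, device_origin)
    return (d_right + d_left) / 2.0
```

The same averaging applies unchanged to an eyebrow-based or nose-based distance.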
The evaluation of the image data sets, i.e. the recognition and marking of the characteristic facial points, is performed on exactly those two image data sets which were generated substantially simultaneously by the two differently positioned imaging devices 10 and 20. These two image data sets to be evaluated differ from each other because they were taken from different spatial directions, e.g. from a different vertical height and/or from a different (horizontal) cardinal direction.
By comparing the pixel positions of mutually corresponding facial points 30 in these two evaluated image data sets, i.e., for example, by comparing the two right pupil-near facial points 31 and/or the two left pupil-near facial points 32, the distance of the user 100 from the centering device may be determined by means of triangulation, taking into account the calibration of the two imaging devices 10 and 20.
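For an ideally rectified, parallel-axis stereo pair, the general calibrated triangulation reduces to the textbook disparity relation Z = f·B/d, which makes the link between pixel-position difference and distance explicit. This simplified model is for illustration only; the patent's cameras have a more general (angled) arrangement, and the focal length and baseline values are assumptions.

```python
def depth_from_disparity(x_left_px, x_right_px, focal_px, baseline_m):
    """Depth of a matched facial point for a rectified stereo pair.

    x_left_px, x_right_px: horizontal pixel coordinates of the same
    facial point in the left and right image; focal_px: focal length in
    pixels; baseline_m: camera separation in meters.
    """
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("matched point must have positive disparity")
    return focal_px * baseline_m / disparity  # Z = f * B / d
```

For example, with an assumed focal length of 1000 px and a 12 cm baseline, a 200 px disparity corresponds to a distance of 0.6 m.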
For the distance measurement, it may be sufficient to calculate the distance of a single facial point 30 from the centering device, e.g. the distance of the facial point of the tip of the nose, the left corner of the mouth, or the right pupil center 31, etc. Preferably, however, for purposes of error control, the centering device calculates the distances of at least two of the facial points 30, 31, 32 and forms an average value. If these distances differ greatly, the measurement may be repeated on a new image and/or the mean value of the distance may be formed from a larger number of determined facial points 30, 31, 32, in particular from all facial points 30, 31, 32.
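The averaging-with-error-control step could be sketched as follows; the spread threshold and the use of the standard deviation as the "differ greatly" criterion are assumptions for illustration, not values from the patent.

```python
from statistics import mean, pstdev

def combined_distance(point_distances, max_spread_m=0.02):
    """Average the per-facial-point distances and flag bad measurements.

    point_distances: distances (in meters) of several facial points from
    the centering device.  Returns (mean_distance, ok); ok is False when
    the values scatter by more than max_spread_m, suggesting the frame
    should be re-measured or more facial points should be included.
    """
    avg = mean(point_distances)
    spread = pstdev(point_distances)
    return avg, spread <= max_spread_m
```

When `ok` is False, a caller could retry on the next frame or re-run the average over all determined facial points.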
The evaluation of the image data sets may be performed on live images, i.e. using a live feed. This live evaluation may be used to guide the user 100 into a target distance range, e.g., into an optimal focus range of the two imaging devices 10 and 20.
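The live guidance loop amounts to comparing each distance estimate against the target range and emitting a direction cue. A minimal sketch; the range bounds and message strings are hypothetical.

```python
def positioning_feedback(distance_m, target_range=(0.55, 0.75)):
    """Turn a live distance estimate into a guidance message, repeated
    until the user stands inside the imaging devices' focus range."""
    lo, hi = target_range
    if distance_m < lo:
        return "move back"
    if distance_m > hi:
        return "move closer"
    return "ok"
```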
This enables an active, purely software-based distance measurement without additional hardware, e.g. without a distance measuring device (based, for example, on ultrasound or a laser interferometer). This enables an improved positioning of the user 100 and reduces user errors during positioning. This can improve the measurement result of the centering.

List of reference signs
10 first imaging device
11 first optical axis
20 second imaging device
21 second optical axis
30 facial point
31 facial point close to right pupil
32 facial point close to left pupil
40 connecting line
100 user
110 spectacle frame

Claims (13)

Claims:
1. Method for measuring the distance of a user from a centring device which is designed to determine optical parameters of the user, wherein - the distance to the centring device is determined, the centring device having two image recording devices with a previously known calibration relative to one another; - the user is positioned in front of the centring device in such a way that the two image recording devices are at least partially directed towards the face of the user facing the centring device; - two image data sets of at least partial areas of the face of the user are generated from two different spatial directions by means of the two image recording devices; - in each of the two image data sets, a plurality of predetermined characteristic facial points of the user are determined by means of image processing algorithms, wherein the characteristic facial points are used as feature points of a facial recognition software, which are determined in the two image data sets by means of the facial recognition software; and - by comparing the position of corresponding characteristic facial points in the two image data sets, taking into account the calibration of the two image recording devices with respect to each other, the distance of the user from the centring device is determined.
2. Method according to claim 1, wherein in each of the two image data sets from about 20 to about 100 different characteristic facial points are determined.
3. Method according to claim 1 or 2, wherein feedback is generated as a function of the determined distance at least until the user is positioned in a desired distance range in front of the centring device.
4. Method according to one of the preceding claims, wherein - the characteristic facial points contain both a right and a left facial point close to the pupil, and - a mean distance between the centring device and the pupils of the user is determined as the distance of the user from the centring device, taking into account the determined right and left facial points close to the pupil.
5. Method according to any of the preceding claims, wherein digital videos are recorded with the image recording devices as the two sets of image data.
6. Method according to claim 5, wherein the two image data sets are generated as a live feed in which the characteristic facial points of the user are determined in real time.
7. Method according to one of the preceding claims, wherein the centring device generates centring image data of at least one partial area of the face of the user in each case after the distance measurement with the two image recording devices and determines optical parameters of the user on the basis of this centring image data.
8. Method according to claim 7 , wherein the image data sets for determining the distance of the user are generated at a lower resolution than the centring image data for determining the optical parameters.
9. Method according to any of the preceding claims, wherein the calibration comprises both an internal and an external calibration.
10. Centring device for determining optical parameters of a user and for measuring the distance of the user from the centring device, comprising: - two image recording devices with a previously known calibration to each other, which are designed and arranged to generate image data sets of at least partial areas of the face of the user from two different spatial directions; - a data processing device with - a point recognition device which is designed to determine a plurality of predetermined characteristic facial points of the user in each of the generated image data sets by means of image processing algorithms, wherein the characteristic facial points are used as feature points of a facial recognition software which are determined in the two image data sets by means of the facial recognition software; - a distance determination device which is designed to determine a distance of the user from the centring device by comparing the position of corresponding characteristic facial points in the two image data sets, taking into account the calibration of the two image recording devices with respect to each other.
11. Centring device according to claim 10 , having a distance output device which is designed to generate a feedback signal as a function of the distance determined by the distance determination device at least until the user is positioned in a target distance range in front of the centring device.
12. Centring device according to claim 10 or 11, wherein - the two image recording devices are also designed and arranged to each generate centring image data of at least partial areas of the user's face; - the data processing means comprises user data determining means adapted to determine, from the generated centring image data, user data of at least a portion of the face or at least a portion of a system of the face and spectacles of the user disposed thereon in a position of use, the user data comprising location information in three-dimensional space of predetermined points of the portion of the face or the portion of the system; - the data processing means further comprises parameter determining means adapted to determine at least a portion of the optical parameters of the user from the user data; and - the centring device comprises data output means adapted to output at least a portion of the determined optical parameters of the user.
13. Computer program product comprising program parts which, when loaded in a computer, are adapted to perform a method according to any one of claims 1 to 9.
IL286236A 2019-03-12 2020-03-11 Method, centering device and computer program product for measuring the distance of a user from a centering device IL286236B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019001762.5A DE102019001762A1 (en) 2019-03-12 2019-03-12 Method, centering device and computer program product for measuring the distance of a user from a centering device
PCT/EP2020/056473 WO2020182867A1 (en) 2019-03-12 2020-03-11 Method, centering device and computer program product for measuring the distance of a user from a centering device

Publications (3)

Publication Number Publication Date
IL286236A IL286236A (en) 2021-12-01
IL286236B1 IL286236B1 (en) 2025-09-01
IL286236B2 true IL286236B2 (en) 2026-01-01

Family

ID=69844812

Family Applications (1)

Application Number Title Priority Date Filing Date
IL286236A IL286236B2 (en) 2019-03-12 2020-03-11 Method, centering device and computer program product for measuring the distance of a user from a centering device

Country Status (5)

Country Link
EP (1) EP3938838B1 (en)
DE (1) DE102019001762A1 (en)
ES (1) ES2982718T3 (en)
IL (1) IL286236B2 (en)
WO (1) WO2020182867A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12360404B2 (en) * 2019-12-19 2025-07-15 Essilor International Geometrico-morphological parameter of a subject wearing an eyewear
CN113032047B (en) * 2021-03-29 2024-07-05 京东方科技集团股份有限公司 Face recognition system application method, electronic device and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4323384A1 (en) * 1992-09-09 1994-03-10 Buchmann Optical Eng Spectacles frame lens matching measuring device - with image of head divided into four parts eliminating need for holding and centring head
US6076928A (en) * 1998-06-15 2000-06-20 Fateh; Sina Ideal visual ergonomic system for computer users
WO2017174525A1 (en) * 2016-04-04 2017-10-12 Carl Zeiss Vision International Gmbh Method and device for determining parameters for spectacle fitting

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7315631B1 (en) * 2006-08-11 2008-01-01 Fotonation Vision Limited Real-time face tracking in a digital image acquisition device
DE102005003699B4 (en) 2005-01-26 2018-07-05 Rodenstock Gmbh Apparatus and method for determining optical parameters of a user; A computer program product
DE102010015795A1 (en) * 2010-04-20 2011-10-20 Hans-Joachim Ollendorf Video centering system with visual field evaluation
DE102017115136B4 (en) * 2017-07-06 2024-08-14 Bundesdruckerei Gmbh Device and method for detecting biometric features of a person’s face


Also Published As

Publication number Publication date
EP3938838B1 (en) 2024-04-24
EP3938838C0 (en) 2024-04-24
IL286236A (en) 2021-12-01
DE102019001762A1 (en) 2020-09-17
EP3938838A1 (en) 2022-01-19
ES2982718T3 (en) 2024-10-17
IL286236B1 (en) 2025-09-01
WO2020182867A1 (en) 2020-09-17
