WO2018074528A1 - Image creation device, image creation method, image creation program, spectacle lens design method, and spectacle lens manufacturing method - Google Patents
- Publication number: WO2018074528A1 (application PCT/JP2017/037741)
- Authority
- WO
- WIPO (PCT)
Classifications
- G02C7/00—Optical parts
- G02C7/024—Methods of designing ophthalmic lenses
- G02C7/025—Methods of designing ophthalmic lenses considering parameters of the viewed object
- G02C7/027—Methods of designing ophthalmic lenses considering wearer's parameters
- G02C7/061—Spectacle lenses with progressively varying focal power
- G02C13/00—Assembling; Repairing; Cleaning
- G02C13/005—Measuring geometric parameters required to locate ophthalmic lenses in spectacle frames
- G02B27/0172—Head mounted displays characterised by optical features
- G02B2027/0178—Eyeglass type head-mounted displays
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0025—Operational features thereof characterised by electronic signal processing, e.g. eye models
- G06F30/00—Computer-aided design [CAD]
- G06T15/06—Ray-tracing
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T7/0014—Biomedical image inspection using an image reference approach
- G06T7/593—Depth or shape recovery from multiple images from stereo images
- G06T2207/10012—Stereo images
- G06T2207/30041—Eye; Retina; Ophthalmic
- G06V40/19—Sensors for eye characteristics, e.g. of the iris
- H04N13/128—Adjusting depth or disparity
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N2013/0081—Depth or disparity estimation from stereoscopic image signals
Definitions
- the present invention relates to an image creation device, an image creation method, an image creation program, a spectacle lens design method, and a spectacle lens manufacturing method.
- a method is known for displaying an image that shows binocular visual performance when both eyes are rotated toward each object point in the visual field and observe it through a spectacle lens (see Patent Document 1). However, that method does not reflect an image obtained by projecting the entire visual field onto the retina.
- the image creation apparatus includes a storage unit that stores three-dimensional object-scene information on the arrangement, shape, and optical characteristics of structures in a virtual object scene, three-dimensional spectacle lens information on the arrangement, shape, and optical characteristics of a spectacle lens, and three-dimensional eyeball information on the arrangement, shape, and optical characteristics of the eyes of a wearer who virtually views the object scene through the spectacle lens, and a retinal image creation unit that creates a retinal image based on the object-scene three-dimensional information, the spectacle lens three-dimensional information, and the eyeball three-dimensional information. The retinal image is a virtual image in which the object scene viewed by the wearer through the spectacle lens is projected onto the retina of the eye.
- the image creation method includes creating a retinal image based on three-dimensional object-scene information on the arrangement, shape, and optical characteristics of structures in a virtual object scene, three-dimensional spectacle lens information on the arrangement, shape, and optical characteristics of a spectacle lens, and three-dimensional eyeball information on the arrangement, shape, and optical characteristics of the eye of a wearer who virtually views the object scene through the spectacle lens. The retinal image is an image in which the object scene that the wearer virtually views through the spectacle lens is projected onto the retina of the wearer's eye.
- the image creation program causes a computer to create a retinal image based on three-dimensional object-scene information on the arrangement, shape, and optical characteristics of structures in a virtual object scene, three-dimensional spectacle lens information on the arrangement, shape, and optical characteristics of the spectacle lens, and three-dimensional eyeball information on the wearer's eye. The retinal image is an image in which the object scene that the wearer virtually sees through the spectacle lens is projected onto the retina of the wearer's eye.
- the spectacle lens design method designs a spectacle lens based on the shape of the spectacle lens used to create the retinal image in the image creation apparatus of the first aspect.
- the method for manufacturing a spectacle lens includes designing the spectacle lens by the design method of the fourth aspect and manufacturing the spectacle lens so designed.
- the image creation apparatus creates a retinal image and a binocular vision image obtained when the wearer virtually views the object scene.
- FIG. 1 is a diagram schematically showing a configuration of an image creating apparatus 1 according to the present embodiment.
- the image creating apparatus 1 includes an input unit 8, a storage unit 9, a control unit 10, a display unit 21, and a communication unit 22.
- the control unit 10 includes an external world model construction unit 11, an eyeball model construction unit 12, a spectacle lens model construction unit 13, a retinal image creation unit 14, a corresponding point calculation unit 17, a binocular vision image creation unit 18, and a moving image creation unit 19.
- the retinal image creation unit 14 includes a ray tracing unit 15. The arrows in FIG. 1 indicate the main flow of information related to image creation.
- the input unit 8 includes an input device such as a keyboard and accepts the input data necessary for processing in the external world model construction unit 11, the eyeball model construction unit 12, and the spectacle lens model construction unit 13, which are described later.
- the input unit 8 outputs the input data to the external world model construction unit 11, the eyeball model construction unit 12, and the spectacle lens model construction unit 13 of the control unit 10.
- a configuration in which a communication unit 22 described later receives input data and outputs the input data to the control unit 10 may be employed.
- the input data input method is not particularly limited, and data stored in advance in the storage unit 9 may be used as input data.
- the storage unit 9 is configured by a non-volatile storage medium such as a memory or a hard disk, exchanges data with the control unit 10, and stores various data such as retinal images and binocular vision images.
- the control unit 10 includes a CPU or the like, functions as the main body that controls the image creation apparatus 1, and performs various processing, including the image creation processing, by executing a program stored in the storage unit 9 or in non-volatile memory arranged in the control unit 10.
- the external world model construction unit 11 constructs an external world model in which a geometric object is associated with a three-dimensional coordinate by using later-described external world description data (see FIG. 3) among the input data received by the input unit 8.
- the external model construction unit 11 outputs the constructed three-dimensional external model to the retinal image creation unit 14.
- in the following, the external world model is an indoor landscape model, and an example (see FIG. 4) in which rectangular parallelepiped or columnar objects resembling desks and chairs are arranged will be described; however, the external world model is not particularly limited as long as it can be described in three dimensions.
- “landscape” refers to an external world that can be visually recognized, and the content is not particularly limited.
- the eyeball model construction unit 12 constructs a three-dimensional eyeball model using later-described eyeball description data (see FIG. 3) among the input data received by the input unit 8, and outputs it to the retinal image creation unit 14 together with position information of the eyeball model in the external world model.
- the spectacle lens model construction unit 13 constructs a three-dimensional model of the spectacle lens using spectacle lens description data (see FIG. 3) among the input data received by the input unit 8, and outputs it to the retinal image creation unit 14 together with position information of the spectacle lens model in the external world model.
- the retinal image creation unit 14 creates a retinal image based on the external world model data input from the external world model construction unit 11, the eyeball model data and its position information in the external world model input from the eyeball model construction unit 12, and the spectacle lens model data and its position information in the external world model input from the spectacle lens model construction unit 13.
- the retinal image of the present embodiment is a virtual image in which a landscape virtually viewed by a wearer through a spectacle lens is projected onto each retina of both eyes of the wearer.
- the ray tracing unit 15 of the retinal image creation unit 14 calculates the luminance of the light incident on each position of the retina of both eyes of the eyeball model by two-stage ray tracing.
- in the first stage, incident rays are traced in the reverse direction from each position on the retina of the eyeball model, and the position and incident direction of the corresponding incident ray on the cornea front surface of the eyeball model are calculated.
- in the second stage, the incident ray at the cornea front surface calculated in the first stage is traced further in the reverse direction, and the luminance of the light at the corresponding retinal position is calculated from the scattered light at the corresponding object point in the external world model.
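As a rough illustration of the two-stage reverse tracing described above, the sketch below traces a ray backward from a retinal point through the pupil centre to the cornea front surface (stage one), then continues the reversed ray into the scene to sample the object luminance (stage two). The geometry is drastically simplified to two dimensions with a thin-lens approximation, and every numeric value (axial length, pupil position, scene distance) is an illustrative assumption, not a figure from this document.

```python
def stage1_retina_to_cornea(retina_y, retina_z=-24.0, pupil_z=-3.6, cornea_z=0.0):
    """Stage 1: trace backward from a retinal point through the pupil centre,
    returning (height, slope) of the corresponding incident ray at the cornea
    front surface. Distances are in millimetres along the optical axis."""
    slope = (0.0 - retina_y) / (pupil_z - retina_z)      # ray aims at the pupil centre
    y_cornea = retina_y + slope * (cornea_z - retina_z)  # ray height at the cornea
    return y_cornea, slope

def stage2_cornea_to_scene(y_cornea, slope, scene_z, scene_luminance, eye_power=0.0):
    """Stage 2: refract at the cornea (thin-lens approximation) and continue the
    reversed ray to the scene plane, sampling the object luminance there."""
    slope_out = slope + y_cornea * eye_power
    return scene_luminance(y_cornea + slope_out * scene_z)

def retinal_luminance(retina_y, scene_luminance):
    """Luminance assigned to one retinal position by the two-stage trace."""
    y, s = stage1_retina_to_cornea(retina_y)
    return stage2_cornea_to_scene(y, s, scene_z=1000.0, scene_luminance=scene_luminance)

# A scene that is bright above the optical axis and dark below it.
bright_above = lambda y: 1.0 if y > 0 else 0.0

# The retinal image is inverted: a point *below* the axis receives the bright half.
print(retinal_luminance(-1.0, bright_above))  # -> 1.0
```

Tracing from the retina outward, rather than from the scene inward, ensures every retinal position receives exactly one traced ray bundle, which is the point of doing the tracing in reverse.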
- the ray tracing unit 15 outputs the obtained binocular retinal image to the corresponding point calculation unit 17.
- the corresponding point calculation unit 17 calculates the corresponding points of the left and right retinal images based on the correlation coefficient and the difference between the pixel values of the binocular retinal images.
- the corresponding points in this case are the pair consisting of the position in the left-eye retinal image and the position in the right-eye retinal image that receive light from the same object point in the external world model. Further, the corresponding point calculation unit 17 calculates, as the parallax, the difference in pixel position (x, y) between corresponding points of the left and right retinal images.
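One simple way to realise such a corresponding-point search (a hypothetical stand-in for the correlation-based method described above, not the patent's own algorithm) is block matching along an image row: the window around a left-image pixel is compared against candidate windows in the right image, the position with the smallest sum of absolute pixel differences is taken as the corresponding point, and the parallax is the difference in pixel position.

```python
def find_corresponding_x(left_row, right_row, x_left, half_window=1):
    """Return the x in right_row whose window best matches the window around
    x_left in left_row, using a sum-of-absolute-differences criterion."""
    w = half_window
    patch = left_row[x_left - w : x_left + w + 1]
    best_x, best_cost = None, float("inf")
    for x in range(w, len(right_row) - w):
        cand = right_row[x - w : x + w + 1]
        cost = sum(abs(a - b) for a, b in zip(patch, cand))
        if cost < best_cost:
            best_x, best_cost = x, cost
    return best_x

left  = [0, 0, 9, 5, 9, 0, 0, 0]
right = [0, 0, 0, 0, 9, 5, 9, 0]   # same pattern shifted 2 px to the right

x_l = 3                            # centre of the pattern in the left image
x_r = find_corresponding_x(left, right, x_l)
parallax = x_r - x_l
print(x_r, parallax)               # -> 5 2
```

A real implementation would search in two dimensions and combine the difference measure with a correlation coefficient, as the text suggests, but the pixel-position-difference definition of parallax is the same.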
- the corresponding point calculation unit 17 outputs the binocular retinal image and the corresponding point and parallax information at a plurality of positions on the retinal image to the binocular vision image creation unit 18.
- the corresponding point calculation unit 17 can create a parallax display image indicating the parallax distribution of the binocular vision image before and/or after the parallax correction.
- the corresponding point calculation unit 17 can output the obtained parallax display image to the display unit 21 for display, output it to the communication unit 22 for transmission to an external device, output it to the moving image creation unit 19 for use in creating a moving image, or output it to the storage unit 9 for storage, as appropriate.
- the binocular vision image creation unit 18 creates a binocular vision image based on the binocular retinal image input from the corresponding point calculation unit 17 and information on the corresponding point and parallax.
- the binocular vision image creation unit 18 can vary the amount of parallax correction depending on the position of the retinal image by using a parallax correction parameter serving as an index of the amount of parallax correction.
- the binocular vision image creation unit 18 can output the obtained binocular vision image to the display unit 21 for display, output it to the communication unit 22 for transmission to an external device, output it to the moving image creation unit 19 for use in creating a moving image, or output it to the storage unit 9 for storage, as appropriate.
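A hedged sketch of how a position-dependent parallax correction parameter might behave: here k(x, y) in [0, 1] gives full correction at the image centre and none at the periphery, so the amount of parallax removed varies with the position in the retinal image. The linear fall-off and all constants are illustrative assumptions, not values from this document.

```python
def correction_parameter(x, y, cx, cy, radius):
    """Correction parameter k: 1.0 (full correction) at the image centre,
    falling off linearly to 0.0 at `radius` so the periphery keeps its
    original parallax."""
    r = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5
    return max(0.0, 1.0 - r / radius)

def corrected_parallax(parallax, x, y, cx=50, cy=50, radius=100):
    """Remove a position-dependent fraction of the measured parallax."""
    return parallax * (1.0 - correction_parameter(x, y, cx, cy, radius))

print(corrected_parallax(4.0, 50, 50))   # centre: parallax fully removed -> 0.0
print(corrected_parallax(4.0, 150, 50))  # far periphery: unchanged -> 4.0
```

Varying the correction with retinal position in this way lets a single binocular vision image be fused sharply near the gazing point while leaving peripheral disparity, which is one plausible use of the parameter the text describes.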
- the moving image creation unit 19 creates a moving image showing changes in the retinal image, the binocular vision image, or the parallax display image by temporally varying parameters such as the data constituting the input data and the parallax correction parameter for the images input from the binocular vision image creation unit 18 and the storage unit 9.
- the moving image creation unit 19 can output the obtained moving image to the display unit 21 for display, output it to the communication unit 22 for transmission to an external device, or output it to the storage unit 9 for storage, as appropriate.
- the display unit 21 is configured by a device capable of displaying images, such as a liquid crystal monitor, and displays images input from the binocular vision image creation unit 18, the moving image creation unit 19, and so on.
- the communication unit 22 is configured by a communication device capable of communicating via the Internet or the like, and transmits an image created by the image creation device 1 and transmits / receives necessary data as appropriate.
- Each function of the control unit 10 may be distributed and arranged in a plurality of devices, and may be configured to perform the above-described image creation processing as one system as a whole while communicating information between the devices. Further, the storage unit 9, the display unit 21, and the communication unit 22 may be configured using an external device of the image creation device 1.
- FIG. 2 is a flowchart showing an image creation method performed by the image creation apparatus of the present embodiment and the flow of design and manufacture of a spectacle lens.
- the image creation method and the like will be described in detail along the flowchart of FIG.
- in step S1001, the input unit 8 accepts the input data necessary for model construction.
- FIG. 3 shows the structure of input data.
- the input data includes external world description data that defines the content of the external model, eyeball description data that defines the content of the eyeball model, and spectacle lens description data that defines the content of the spectacle lens model.
- the data included in the input data are not limited to those illustrated in FIG. 3; part of the data illustrated in FIG. 3 may be predetermined fixed values, and the design may be changed as appropriate.
- the external world description data includes the data used by the external world model construction unit 11 to construct the external world model: the classification and shape of each geometric object arranged in the external world model, position information of the geometric objects, material property information, illumination information, gaze point information, and so on.
- the geometric objects include geometric elements such as spheres, planes, cylindrical surfaces, and cubes, as well as composite objects representing structures such as walls, desks, and chairs defined by combining them; that is, geometric objects fall into two categories, geometric elements and composite objects.
- Each geometric object is set with position information that indicates where and in what direction it is placed in the external model, and information on light reflectance and transmittance, color, texture, etc.
- the fine three-dimensional structure of a geometric object and the like can also be expressed by replacing it with a texture mapped onto the surface of the geometric object.
- the illumination information includes the position of the illumination, the color of the illumination light, the wavelength distribution, the light intensity, and the like.
- the gaze point information includes the position of the gaze point.
- the eyeball description data includes the data used by the eyeball model construction unit 12 to construct the eyeball model: geometric information on the wearer's eyeball structure, material property information of the wearer's eyeball structure, information on the retina, and position information of the eyeball model.
- the geometric information of the eyeball structure includes the positions of optical elements included in the eyeball such as the crystalline lens, retina, cornea, and pupil, the radius of curvature, and the pupil diameter.
- the material characteristic information of the eyeball structure includes optical characteristics such as the refractive index of the optical element included in the eyeball.
- the information on the retina includes a retinal projection range on which an external model is projected.
- the retinal projection range is a range where a ray tracing start point exists when ray tracing is performed in the reverse direction from the retina.
- the position information of the eyeball model indicates where, and in what orientation, the eyeball model is placed in the external world model.
- the spectacle lens description data includes spectacle lens geometric information, spectacle lens material property information, and spectacle lens model position information.
- the spectacle lens geometric information includes the outer shape information of the spectacle lens, the center thickness, and shape data of the two main surfaces on the object side and the eyeball side of the spectacle lens and of the peripheral surface.
- these lens surface shapes of the spectacle lens are described by, for example, a spline representation.
- the material characteristic information of the spectacle lens includes data such as a refractive index.
- the position information of the spectacle lens model indicates where, and in what orientation, the spectacle lens model is placed in the external world model.
- for example, a salesperson at a spectacle lens store can acquire the wearer's prescription data and, as appropriate, acquire and input the data necessary for model construction by taking actual measurements at the store.
- the salesperson can ask about the wearer's daily behavior and the environments in which the wearer often spends time, and then select from among external world models prepared in advance, or let the wearer select one.
- for the eyeball description data, data from a shape measuring device using X-rays or the like may be acquired, known values may be used with reference to the Gullstrand model eye, or general average values based on the wearer's age and sex may be input.
- the spectacle lens description data can be acquired from a design device, or calculated and acquired based on the wearer's prescription data and the frame selected by the wearer.
- the method of acquiring the input data shown in FIG. 3 is not particularly limited. Once the input data have been input, the process proceeds to step S1003.
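The three categories of input data described above could be organised, for example, as the following dataclasses. Every field name and default value is an assumption for illustration; the text specifies only the categories of data, not a concrete schema.

```python
from dataclasses import dataclass, field

@dataclass
class ExternalWorldData:
    """External world description data: objects, illumination, gaze point."""
    objects: list = field(default_factory=list)        # geometric/composite objects
    illumination: dict = field(default_factory=dict)   # position, colour, intensity
    gaze_point: tuple = (0.0, 0.0, 1000.0)

@dataclass
class EyeballData:
    """Eyeball description data: geometry, materials, retina, placement."""
    surface_radii_mm: dict = field(default_factory=dict)   # cornea, lens surfaces
    refractive_indices: dict = field(default_factory=dict)
    pupil_diameter_mm: float = 4.0
    retinal_projection_range_deg: float = 40.0
    position: tuple = (0.0, 0.0, 0.0)

@dataclass
class SpectacleLensData:
    """Spectacle lens description data: geometry, material, placement."""
    center_thickness_mm: float = 2.0
    refractive_index: float = 1.6
    front_surface: object = None    # e.g. a spline surface description
    back_surface: object = None
    position: tuple = (0.0, 0.0, 12.0)   # e.g. a 12 mm vertex distance

@dataclass
class InputData:
    external: ExternalWorldData
    eyeball: EyeballData
    lens: SpectacleLensData

data = InputData(ExternalWorldData(), EyeballData(), SpectacleLensData())
print(data.eyeball.pupil_diameter_mm)   # -> 4.0
```

Bundling the three description-data blocks into one object mirrors how the input unit hands them to the three model construction units.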
- in step S1003, the external world model construction unit 11 constructs an external world model based on the external world description data input in step S1001.
- the external model construction unit 11 places a geometric object at each position in a virtual space whose position is specified by three-dimensional coordinates, and determines illumination and a gazing point.
- FIG. 4 is a diagram illustrating an example of the external world model.
- the outside world model 5 includes a gazing point 51, a lighting 52, an object 53, a wall 58, and a floor 59.
- the gazing point 51 indicates the position at which the eyeball models 30L and 30R virtually gaze.
- the illumination 52 illuminates the target landscape represented by the external world model.
- the object 53 is a geometric object, or a composite object formed by combining a plurality of geometric objects, and represents interior items in the target landscape such as a painting, a figurine, a desk, or a chair.
- the wall 58 and the floor 59 may use a predetermined data set or may be set by input data.
- the object 53 may also be a visual measurement target such as an eye chart, which can then serve as a reference for virtual visual acuity.
- the center of the floor 59 may be set as the origin of the coordinate system, or any other position may be used as the origin.
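Step S1003 can be pictured with a minimal container that places named geometric objects at three-dimensional coordinates and records the illumination and gazing point, loosely mirroring the room of FIG. 4. All coordinates below are invented for illustration.

```python
class ExternalWorldModel:
    """Virtual 3-D scene: placed objects, illumination, and gazing point."""

    def __init__(self):
        self.objects = []        # (name, shape, position) triples
        self.illumination = None
        self.gaze_point = None

    def add_object(self, name, shape, position):
        """Place a geometric object at a 3-D coordinate in the scene."""
        self.objects.append((name, shape, position))

model = ExternalWorldModel()
model.add_object("floor", "plane", (0.0, -1.5, 0.0))
model.add_object("wall", "plane", (0.0, 0.0, 5.0))
model.add_object("desk", "box", (1.0, -0.75, 3.0))
model.illumination = {"position": (0.0, 2.5, 2.0), "intensity": 1.0}
model.gaze_point = (0.0, 0.0, 3.0)   # where the eyeball models are aimed

print(len(model.objects))   # -> 3
```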
- in step S1005, the eyeball model construction unit 12 constructs an eyeball model based on the eyeball description data input in step S1001.
- in steps S1005 to S1009, each process is performed for both eyes before proceeding to the next step; alternatively, the processing from step S1005 to step S1009 may first be performed for one eye and then repeated for the other eye.
- FIG. 5 is a diagram illustrating an example of an eyeball model.
- the eyeball model 30 includes a lens 31, a retina 32, a pupil 35, a cornea 36, a vitreous body 37, and an anterior chamber 38.
- the crystalline lens 31 includes a crystalline lens edge portion 33 and a crystalline lens core 34.
- the crystalline lens edge portion 33 includes a crystalline lens rear surface 330p and a crystalline lens front surface 330a.
- the crystalline lens core 34 includes a crystalline lens core rear surface 340p and a crystalline lens core front surface 340a.
- the cornea 36 includes a corneal rear surface 360p and a corneal front surface 360a.
- an optical axis 39 of an eye optical system that includes a crystalline lens 31, a retina 32, a pupil 35, and a cornea 36 is defined. Since the eyeball model 30 is three-dimensional shape structure data, the optical axis 39 can be decentered or tilted.
- the retina 32 is shown by hatched portions.
- a retinal projection range (not shown) defined by the input data is set on the retina 32, and light incident on the retinal projection range is the target of the ray tracing described later. Because the refractive index of the crystalline lens in an actual eyeball differs between its center and its edge, the crystalline lens of the eyeball model 30 is modeled as two parts with different refractive indices, the crystalline lens edge portion 33 and the crystalline lens core 34, so as to be optically equivalent.
- the pupil 35 is modeled as transmitting light only through its central opening, reflecting the optical characteristics of the iris diaphragm.
- the cornea 36 refracts light throughout its volume, and the cornea front surface 360a serves as the entry site for light from outside the body.
- the vitreous body 37 is the optical path medium between the crystalline lens rear surface 330p and the retina, and the anterior chamber 38 is the optical path medium between the crystalline lens front surface 330a and the cornea rear surface 360p.
- the position of each optical element constituting the eye optical system is defined.
- refractive indices and the like are defined for the cornea 36, the vitreous body 37, the anterior chamber 38, the crystalline lens edge portion 33, and the crystalline lens core 34, and a radius of curvature and the like are defined for each optical surface, including the crystalline lens core front surface 340a and the crystalline lens core rear surface 340p.
- the structure of the eyeball model can be designed as appropriate, for example by dividing the crystalline lens 31 into more parts, and the orientation, reference position, and so on of each component of the eyeball model may be changed.
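The optical media of such an eyeball model can be listed in traversal order together with their refractive indices. The values below are approximate textbook figures in the spirit of the Gullstrand schematic eye (which the text mentions later as a source of known values); they are not taken from the patent.

```python
# Ordered media of the model eye, cornea to retina, with assumed
# Gullstrand-style refractive indices (approximate textbook values).
EYE_MEDIA = [
    ("cornea",           1.376),
    ("anterior chamber", 1.336),
    ("lens edge",        1.386),
    ("lens core",        1.406),
    ("lens edge",        1.386),
    ("vitreous body",    1.336),
]

def media_sequence():
    """Return the refractive indices along the optical axis, cornea to retina."""
    return [n for _, n in EYE_MEDIA]

print(media_sequence()[0], media_sequence()[-1])  # -> 1.376 1.336
```

The two-index lens (edge 1.386, core 1.406) reflects the same center-versus-edge distinction the eyeball model 30 makes with the lens edge portion 33 and lens core 34.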
- the eyeball model construction unit 12 also constructs eyeball models 30 in which the thicknesses of the crystalline lens edge portion 33 and the crystalline lens core 34 are changed, in order to simulate the wearer's accommodation function.
- FIG. 6 shows the lens system including the crystalline lens edge portion 33, the crystalline lens core 34, the pupil 35, and the cornea 36 in the eyeball model 30 of FIG. 5; in FIGS. 6(a) and 6(b), parts corresponding to those in FIG. 5 are denoted by the same reference numerals.
- the lens edge 33 and the lens core 34 in FIG. 6(a) are in the pre-contraction (unaccommodated) state, and the spacings between the lens core front surface 340a-1, the lens core rear surface 340p-1, the lens front surface 330a-1, and the lens rear surface 330p-1 are narrower than in the case of FIG. 6(b).
- FIG. 6(b) simulates the case where the wearer exerts accommodation in the lens system of FIG. 6(a).
- in FIG. 6(b), the lens edge 33 and the lens core 34 are thicker along the optical axis, and the distance between the lens front surface 330a-2 and the lens rear surface 330p-2 and the distance between the lens core front surface 340a-2 and the lens core rear surface 340p-2 are increased.
- the absolute values of the radii of curvature of the lens front surface 330a-2, the lens rear surface 330p-2, the lens core front surface 340a-2, and the lens core rear surface 340p-2 are smaller than in FIG. 6(a).
- for the contraction of the lens edge 33 and the lens core 34 from the state of FIG. 6(a) to that of FIG. 6(b), the eyeball model construction unit 12 creates a plurality of eyeball models 30 in which the positions and radii of curvature of the lens front surface 330a, the lens rear surface 330p, the lens core front surface 340a, and the lens core rear surface 340p are varied over a plurality of stages. Once the eyeball models 30 have been constructed, the process proceeds to step S1007.
- a plurality of eyeball models 30 may be constructed by changing the cornea 36, the front surface of the cornea 360a, the rear surface of the cornea 360p, the pupil 35, and other optical elements.
- a plurality of different eyeball models 30 may also be constructed according to the environment, such as the illumination of the external model 5. For example, the size of the opening of the pupil 35 may be changed by feeding back the intensity of the light reaching the retina 32 obtained by the ray tracing described later.
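The staged eyeball models described above can be held as a simple list of refracting surfaces. The Python sketch below is illustrative only: the surface positions, radii, and refractive indices are placeholder values loosely following Gullstrand-style schematic eyes, not the parameters of this application, and the `accommodated` helper is a hypothetical stand-in for the staged lens changes between FIG. 6(a) and FIG. 6(b).

```python
from dataclasses import dataclass

@dataclass
class Surface:
    """One refracting spherical surface on the optical axis."""
    z: float        # axial position from the corneal apex [mm]
    radius: float   # signed radius of curvature [mm]
    n_after: float  # refractive index of the medium behind the surface

# Illustrative unaccommodated eye (placeholder values, Gullstrand-like):
EYE_RELAXED = [
    Surface(z=0.0, radius=7.7,  n_after=1.376),   # corneal front surface 360a
    Surface(z=0.5, radius=6.8,  n_after=1.336),   # corneal rear 360p / anterior chamber 38
    Surface(z=3.6, radius=10.0, n_after=1.386),   # lens front surface 330a (lens edge 33)
    Surface(z=4.1, radius=7.9,  n_after=1.406),   # lens core front surface 340a (core 34)
    Surface(z=6.6, radius=-5.8, n_after=1.386),   # lens core rear surface 340p
    Surface(z=7.2, radius=-6.0, n_after=1.336),   # lens rear surface 330p / vitreous 37
]

def accommodated(surfaces, step):
    """Crude accommodation stages (step = 0, 1, 2, ...): steepen every lens
    surface by 3% per step, mimicking the staged models the construction
    unit builds between FIG. 6(a) and FIG. 6(b). Axial thickness changes
    are omitted for brevity; the z >= 3.6 test marking 'lens surfaces'
    is an assumption tied to the placeholder geometry above."""
    return [Surface(s.z, s.radius * (1 - 0.03 * step), s.n_after)
            if s.z >= 3.6 else s for s in surfaces]
```

Shrinking the absolute radii per stage matches the observation above that the curvature radii of 330a-2, 330p-2, 340a-2, and 340p-2 have smaller absolute values in the accommodated state.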
- the spectacle lens model construction unit 13 constructs a spectacle lens model based on the spectacle lens description data input in step S1001.
- the spectacle lens model construction unit 13 constructs a three-dimensional model of the spectacle lens based on the outer shape information of the spectacle lens, the center thickness, the front and rear two surfaces of the spectacle lens on the object side and the eyeball side, and the surface shape data of the peripheral surface.
- the surface shape data are represented as splines, so a spectacle lens of arbitrary shape, including a progressive addition lens, can be modeled.
- once the spectacle lens model has been constructed, the process proceeds to step S1009. Note that the method is not particularly limited as long as a three-dimensional spectacle lens model can be constructed; if the spectacle lens store has shape data, the model may be built from those data.
- the ray tracing unit 15 of the retinal image creation unit 14 calculates the optical path, intensity, wavelength distribution, and the like of the light from the external model 5 incident on each position of the retina 32 by ray tracing.
- the ray tracing unit 15 traces the light incident on each position of the retinal projection range of the retina 32 in the direction opposite to its traveling direction, and calculates the incident position and incident direction at the corneal front surface 360a.
- FIG. 7 is a diagram schematically showing a method of ray tracing inside the eyeball model 30 of the ray tracing unit 15.
- the retinal projection range is the portion of the retinal sphere spanning 90° in longitude and latitude.
- the light 43 emitted from each position in the retinal projection range of the retina 32 may be traced, and the position and traveling direction of the corresponding light 45 emitted from the corneal front surface 360a may be calculated. If the direction of the light 45 emitted from the corneal front surface 360a is reversed, the incident position and the incident direction on the corneal front surface 360a corresponding to the position of the retina 32 can be calculated.
- for the second-stage ray tracing in the external model 5, the ray tracing unit 15 performs reverse ray tracing based on the position and traveling direction of the light incident on the corneal front surface 360a obtained in the first stage: it calculates intersections with objects, traces reflected and transmitted rays, and computes the illumination. For example, from the position and traveling direction of the light incident on the corneal front surface 360a, the reflectance at the object point, and so on, the ray tracing unit 15 can determine from which object point of the external model the light was scattered, and can calculate the light intensity, wavelength, and so on from the light traveling from the illumination to that object point.
- the ray tracing unit 15 calculates the luminance, expressed in RGB or the like, of each point on the retina 32 based on the intensity and wavelength of the light incident from the object point of the external model thus obtained.
- the obtained luminance data at each point of the retina 32 constitutes a retinal image.
- FIG. 8 is a diagram illustrating a retinal image.
- the retinal image of FIG. 8 is the image of the external model 5 projected onto the right eyeball model 30R when the gazing point 51 (see FIG. 4) of the external model 5 of FIG. 4 is virtually viewed from the right eyeball model 30R.
- in FIG. 8, it can be seen that the wall 58, the floor 59, and the object 53 are projected onto the retinal image 70.
- the retinal image 70 is an image in which the retina 32 that is a curved surface is assigned to two-dimensional coordinates.
- the retinal image creation unit 14 approximates the shape of the retina 32 as part of a spherical surface (the retinal sphere), and maps the luminance data at each point on the retina 32 calculated by the ray tracing unit 15 by associating angles on the retinal sphere, such as latitude and longitude, with coordinate positions on a plane.
- the retinal image creation unit 14 maps, for example, the luminance data in the range of longitudes θ0 to θ1 and latitudes φ0 to φ1 on the retinal sphere to a retinal image 70 of Nh × Nv pixels.
- Nh represents the number of pixels in the horizontal direction
- Nv represents the number of pixels in the vertical direction.
- the process proceeds to step S1011.
- the retinal image creation unit 14 may set the coordinate origin at the intersection of the optical axis 39 of the eyeball model 30 and the retina 32, and project the luminance data of each point on the retina 32 onto the plane that passes through the origin and is perpendicular to the optical axis.
- a known conversion formula can be used for calculation of conversion from the three-dimensional spherical coordinate system (r, ⁇ , ⁇ ) to the three-dimensional orthogonal coordinate system (x, y, z).
- the method for setting the coordinate system is not particularly limited as long as desired conversion such as setting the origin can be realized.
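The angular binning above can be sketched directly: per the description, the pixel size is Sh = (θ1 − θ0)/Nh and Sv = (φ1 − φ0)/Nv, and pixel (i, j) corresponds to θ = θ0 + (i + 1/2)·Sh, φ = φ0 + (j + 1/2)·Sv. The function names below are illustrative, and the clamping of out-of-range indices is an assumption.

```python
def retina_to_pixel(theta, phi, theta0, theta1, phi0, phi1, Nh, Nv):
    """Map a point on the retinal sphere, given by longitude theta and
    latitude phi within [theta0, theta1] x [phi0, phi1], to the (i, j)
    pixel of an Nh x Nv retinal image."""
    Sh = (theta1 - theta0) / Nh   # horizontal angular size of one pixel
    Sv = (phi1 - phi0) / Nv       # vertical angular size of one pixel
    i = min(int((theta - theta0) / Sh), Nh - 1)
    j = min(int((phi - phi0) / Sv), Nv - 1)
    return i, j

def pixel_to_retina(i, j, theta0, theta1, phi0, phi1, Nh, Nv):
    """Inverse mapping: the retinal-sphere angle at the centre of pixel
    (i, j): theta = theta0 + (i + 1/2)*Sh, phi = phi0 + (j + 1/2)*Sv."""
    Sh = (theta1 - theta0) / Nh
    Sv = (phi1 - phi0) / Nv
    return theta0 + (i + 0.5) * Sh, phi0 + (j + 0.5) * Sv
```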
- step S1011 the corresponding point calculation unit 17 calculates corresponding points of the left and right retinal images 70.
- a corresponding point is, for a pixel corresponding to an arbitrary object point of the external model 5, the position on each of the left and right retinal images 70 onto which that object point is projected, or the pixel corresponding to that position.
- FIG. 9 is a diagram for explaining a method of calculating corresponding points.
- FIG. 9A shows a retinal image 70L of the left-eye eyeball model 30L in FIG. 4 with a template 60L indicating a calculation range when calculating corresponding points.
- FIG. 9B is obtained by adding a template 60R indicating a calculation range when calculating corresponding points to the retinal image 70R of the right-eye eyeball model 30R in FIG.
- both the template 60L and the template 60R are composed of pixels in an 11 ⁇ 11 square range centering on the corresponding pixel.
- the template setting method corresponding to the pixel of interest can be adjusted as appropriate.
- the size of the template can be set to a square such as 3 ⁇ 3, 5 ⁇ 5, or 17 ⁇ 17.
- the corresponding point calculation unit 17 calculates the degree of similarity between the luminance values of the pixels included in the template 60L corresponding to the pixel 61L in the left-eye retinal image 70L and the luminance values of the pixels included in the template 60R corresponding to the pixel 61R in the right-eye retinal image 70R.
- the corresponding point calculation unit 17 calculates the similarity of luminance values between a plurality of pixels included in the templates 60L and 60R by a method using a correlation coefficient.
- the coordinate origin is set at a position corresponding to the intersection of the optical axis 39 of the eyeball model 30 and the retina 32, and the X axis is set in the horizontal direction of the image and the Y axis is set in the vertical direction of the image. Then, each pixel in the template is designated by local coordinates (xi, yj).
- the corresponding point calculation unit 17 takes, in the right-eye retinal image 70R, the template centered on the pixel shifted by dx and dy along the X and Y axes from the position corresponding to the pixel 61L in the left-eye retinal image 70L, and calculates the correlation coefficient of the luminance values between that template and the template centered on the pixel 61L.
- the corresponding point calculation unit 17 changes dx and dy in a range from 0 to several pixels, and obtains a template having a high degree of similarity, that is, a template having the highest correlation coefficient and its center pixel.
- the obtained center pixel becomes the corresponding point 61R of the pixel 61L.
- alternatively, corresponding points may be calculated by taking the differences between corresponding pixels of the two templates, using the sum of squared differences as the similarity, and selecting the template with the smallest sum of squared differences.
- the luminance may be any value of RGB, or may be calculated using the luminance signal Y calculated from RGB.
- the corresponding point calculation unit 17 calculates the corresponding point for each pixel of the left and right retinal images 70 and, taking the pixel offsets between corresponding points in the X and Y directions as the parallax, creates a parallax distribution mapped onto the retinal image 70, or a parallax display image that displays that parallax distribution.
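The template search described above can be sketched as normalized cross-correlation over a small disparity window. The 11 × 11 template (half-width 5) matches the templates 60L/60R; the search range of a few pixels follows the description, while the array layout and function names are illustrative assumptions.

```python
import numpy as np

def ncc(a, b):
    """Normalised cross-correlation of two equal-size templates."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def find_corresponding_point(left, right, y, x, half=5, search=4):
    """For pixel (y, x) of the left retinal image, search the right image
    within +/- search pixels for the template (an 11x11 square for
    half=5) with the highest correlation; returns (dy, dx, score)."""
    tl = left[y - half:y + half + 1, x - half:x + half + 1]
    best = (0, 0, -2.0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            tr = right[y + dy - half:y + dy + half + 1,
                       x + dx - half:x + dx + half + 1]
            if tr.shape != tl.shape:   # skip templates clipped by the border
                continue
            s = ncc(tl, tr)
            if s > best[2]:
                best = (dy, dx, s)
    return best
```

The (dy, dx) of the best match is the per-pixel parallax from which the parallax distribution is built; swapping `left` and `right` gives the search from a right-eye pixel mentioned in the description.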
- step S1013 the binocular vision image creation unit 18 creates the binocular vision image 71 by combining the left and right retinal images 70L and 70R.
- FIG. 10 is a diagram in which the left and right retinal images 70 are superimposed without being processed. Since the left and right retinal images 70 have parallax, the corresponding points do not coincide with each other when the retinal images 70 are synthesized without being processed, and the images are not clear.
- the shift is schematically shown by indicating the object 53L of the left-eye retinal image 70L with a broken line and the object 53R of the right-eye retinal image 70R with a solid line.
- the binocular vision image creation unit 18 synthesizes the left and right images, locally shifting them based on the per-pixel parallax information and its correction parameters.
- the correction parameters can be set appropriately based on experience.
- the correction parameters adjust the fusion ratio, which is the left/right luminance ratio at the time of image composition, and the left/right shift ratio according to the degree of eye dominance.
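One plausible reading of this synthesis step can be sketched in Python: each pixel is shifted by a fraction of its parallax (the shift ratio) and the left/right luminances are then mixed by the fusion ratio. The sign conventions, the integer shifts, and the function name are assumptions for illustration, not the exact procedure of the application.

```python
import numpy as np

def fuse(left, right, disparity_x, shift_ratio=0.5, fusion_ratio=0.5):
    """Blend left/right retinal images into one binocular image.
    shift_ratio splits each pixel's horizontal parallax between the two
    eyes (dominant-eye weighting); fusion_ratio is the left/right
    luminance mix (1.0 = left eye only). Grayscale, integer shifts."""
    h, w = left.shape
    out = np.zeros_like(left, dtype=float)
    for y in range(h):
        for x in range(w):
            d = disparity_x[y, x]
            xl = int(round(x + shift_ratio * d))          # sample in left image
            xr = int(round(x - (1.0 - shift_ratio) * d))  # sample in right image
            lv = left[y, xl] if 0 <= xl < w else 0.0
            rv = right[y, xr] if 0 <= xr < w else 0.0
            out[y, x] = fusion_ratio * lv + (1.0 - fusion_ratio) * rv
    return out
```

With zero disparity and a 0.5 fusion ratio this reduces to a plain average of the two retinal images; tilting either parameter models the dominant-eye weighting mentioned above.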
- FIG. 11 is a diagram showing a binocular image 71 obtained by synthesizing the retinal image 70. Unlike the figure in which the left and right retinal images 70 in FIG. 10 are superimposed, the image is clear.
- the process proceeds to step S1015.
- step S1015 the binocular vision image creation unit 18 processes the binocular vision image 71 obtained in step S1013 for display.
- the binocular vision image creating unit 18 may send a plurality of binocular vision images 71 obtained by repeating steps S1003 to S1013 to the moving image creation unit 19 to create a moving image, for example.
- the moving image creation unit 19 creates a moving image configured to sequentially display a plurality of binocular vision images 71 obtained by changing parameters such as the input data. For example, a moving image including the case where the lens edge 33 and the lens core 34 change from the state of FIG. 6(a) to the state of FIG. 6(b) is created from binocular vision images 71 based on eyeball models 30 at a plurality of stages between the two states.
- the moving image to be created can be configured by changing any parameter of the eyeball model 30 along the time axis. Further, a case where the line-of-sight direction is changed may be hypothesized, and a change in the retinal image 70 or the like when the eyeball moves may be represented as a moving image.
- step S1017 the display unit 21 displays the created retinal image 70, binocular vision image 71, parallax display image, moving image, and the like.
- the process proceeds to step S1019.
- step S1019 the control unit 10 determines whether to change the spectacle lens and display the retinal image 70 and the like again.
- if the wearer or the salesperson at the spectacle lens store, after checking the retinal image 70, the binocular vision image 71, the parallax display image, the moving image, and the like displayed on the display unit 21, inputs an instruction to change the spectacle lens and create the retinal image 70 and the like again, an affirmative determination is made in step S1019 and the process returns to step S1007. Otherwise, a negative determination is made in step S1019 and the process proceeds to step S1021.
- the control unit 10 can issue a redesign instruction to the design apparatus 93 (see FIG. 12) as necessary.
- the design device 93 may design the spectacle lens based on the shape of the spectacle lens used for creating the retinal image 70 and the like, the correction parameters used when creating the binocular vision image 71, the parallax distribution, and so on.
- step S1021 the control unit 10 transmits, to the design apparatus 93 (see FIG. 12), a processing instruction for the spectacle lens used to create the retinal image 70, as well as information necessary for processing the spectacle lens.
- after the processing instruction has been transmitted, the process proceeds to step S1023.
- FIG. 12 shows a spectacle lens manufacturing system 90 that manufactures the spectacle lenses used by the image creation apparatus 1 of the present embodiment to create the retinal image 70.
- the spectacle lens manufacturing system 90 includes an image creation device 1, a processing machine control device 91, a spectacle lens processing machine 92, and a design device 93.
- the arrows in FIG. 12 indicate the flow of data used for manufacturing a spectacle lens.
- the spectacle lens processing machine 92 manufactures the spectacle lens to which the processing instruction is sent in step S1021.
- the design device 93 transmits the spectacle lens design data, which it sent to the image creation device 1 as part of the input data, to the processing machine control device 91, and under the control of the processing machine control device 91 the spectacle lens processing machine 92 manufactures the spectacle lens.
- the image creation apparatus 1 of the present embodiment includes a retinal image creation unit 14 that creates retinal images 70, in which the external model 5 virtually viewed by the wearer through the spectacle lens is projected onto each retina of the wearer's two eyes, based on external description data on the optical characteristics, such as the arrangement, shape, and reflectance, of the objects 53 in the external model 5; spectacle lens description data on the optical characteristics, such as the arrangement, shape, and refractive index, of the spectacle lens; and eyeball description data on the optical characteristics, such as the arrangement, shape, and refractive index, of both eyes of the wearer who virtually views the external model 5 through the spectacle lens. This makes it possible to assume visual images in various practical scenes, according to the eyeball structure, the scene, the spectacle lens, and so on, when the wearer is virtually assumed to wear the spectacle lens.
- the image creation apparatus 1 of the present embodiment further includes a corresponding point calculation unit 17 that calculates, in the retinal images 70 of both eyes, the corresponding points for an arbitrary position in the external model 5 and, based on the calculated corresponding points, calculates the binocular parallax corresponding to that position.
- the image creation apparatus 1 of the present embodiment calculates the corresponding points based on the correlation coefficient Dcorr, or the difference, between the luminance of each pixel included in the template 60L composed of a plurality of pixels set in the left-eye retinal image 70L and the luminance of each pixel included in the template 60R composed of a plurality of pixels set in the right-eye retinal image 70R. Thereby, corresponding points can be detected directly from a comparison of the retinal images 70 of the two eyes.
- the binocular vision image creation unit 18 creates the binocular vision image 71 from the retinal images 70 based on the binocular parallax and on correction parameters such as the ratio by which the left and right image components are shifted and the fusion ratio.
- the display unit 21 displays the binocular disparity distribution corresponding to the retinal image 70.
- the ray tracing unit 15 calculates, for the light incident on each position of the retina 32 of both eyes, the incident direction and incident position on the corneal front surface 360a of each eye and, based on that incident direction and incident position, calculates the ray path along which light from the external model 5 propagates through the corneal front surface 360a to each position of the retina 32 and the luminance of the pixel corresponding to each position of the retina 32. Thereby, the light from the external model 5 that reaches the retina 32 can be traced appropriately.
- the display unit 21 displays the retinal images 70, or the binocular vision image 71 based on the retinal images 70 of both eyes, as a moving image reflecting changes in the eyeball description data. Thereby, the visual image in a practical scene when an eyeball parameter is changed can be presented in an easily understandable way.
- the shapes of both eyes are calculated from the wearer's accommodative power and pupil diameter. Thereby, the refractive power of the eye's optical system and the degree of stop aperture can be reproduced appropriately.
- the retinal image creation unit 14 may create the retinal image 70 in combination with a corrective lens, taking an eyeball structure with refractive error into consideration. Thereby, a lens that appropriately corrects the refractive error can be provided based on the retinal image 70 and the like of the present embodiment.
- in this modification, the retinal image 70, the binocular vision image 71, the parallax distribution, the moving image, and the like are created in combination with a virtual corrective lens.
- the spectacle lens model construction unit 13 constructs a virtual correction lens model and outputs it to the retinal image creation unit 14.
- the retinal image creation unit 14 creates a retinal image 70 based on the correction lens model.
- the virtual corrective lens model can be constructed based on the wearer's prescription received by the input unit 8 and input data relating to the corrective lens.
- the display unit 21 can display the retinal image 70, the binocular vision image 71, the parallax distribution, the moving image, and the like with and without the correcting lens in a comparable manner, for example, simultaneously.
- in this modification, the retinal image creation unit 14 further creates the retinal images 70 projected onto the retinas 32 of both eyes based on data such as the input data on the arrangement, shape, and optical characteristics of the virtual corrective lens. Thereby, the effect of the corrective lens can be displayed in an easily understandable way.
- the eyeball structure is described by the eyeball description data.
- the eyeball description data may be calculated from the prescription data of the wearer.
- thereby, the eyeball model 30 can be constructed from the prescription data even when the wearer's eyeball structure cannot be measured or directly measured data cannot be acquired.
- FIG. 13 is a diagram for explaining a method of calculating eyeball description data from the wearer's prescription data.
- parameters of the eyeball structure, in particular the curvatures or radii of curvature of the corneal front surface 360a and the corneal rear surface 360p, are calculated from the prescription data of spherical power, astigmatic power, and astigmatic axis angle by an iterative algorithm.
- FIG. 13 shows the cornea 36, the crystalline lens 31, the retinal position center 320 located on the optical axis 39 indicated by the dash-dot line, and so on.
- the eyeball model construction unit 12 determines the curvatures of the anterior corneal surface 360a and the posterior corneal surface 360p so as to match the spherical power of the prescription data.
- the curvatures of the corneal front surface 360a and the corneal rear surface 360p are first set arbitrarily; by ray tracing based on these curvatures, a ray from the retinal position center 320 is traced through the pupil and out of the cornea, and the refractive power of the ray wavefront is calculated at the position where the ray intersects a spherical surface 25 mm from the center of rotation of the eyeball.
- if the absolute value of the difference between the refractive power at the intersection position and the prescription spherical power is less than 0.02 diopter (hereinafter "D"), the set curvatures of the corneal front surface 360a and the corneal rear surface 360p are adopted. If the absolute value of the difference is 0.02 D or more, the curvatures of the corneal front surface 360a and the corneal rear surface 360p are increased or decreased appropriately based on the value of the difference, and the ray tracing is performed again; for example, if the spherical power is +S, the curvature is steepened, and if it is −S, the curvature is flattened. This procedure is repeated until the difference between the refractive power at the intersection position and the prescription spherical power becomes less than 0.02 D.
- in the above, the criterion for deciding whether to adopt the set curvatures of the corneal front surface 360a and the corneal rear surface 360p is 0.02 D, but it can be set as appropriate to a value such as 0.01 D or 0.03 D.
- the position at which the refractive power of the ray wavefront is calculated can also be set as appropriate, for example by selecting it from the range of 10 to 15 mm beyond the cornea in the optical-axis direction. The same applies to the case with astigmatism below.
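The adjust-and-retrace loop above can be sketched as a simple proportional iteration. Here `wavefront_power(c)` is a caller-supplied stand-in for the full ray trace, and the linear thin-surface model in `toy_power` is purely illustrative; only the 0.02 D stopping rule comes from the description.

```python
def fit_corneal_curvature(prescription_D, wavefront_power, c0=1.0 / 7.8,
                          tol=0.02, step=1.0 / 400.0, max_iter=200):
    """Iterate the adjust-and-retrace loop: while the simulated wavefront
    power differs from the prescription spherical power by tol (0.02 D)
    or more, tighten or flatten the corneal curvature c [1/mm] and trace
    again. wavefront_power(c) stands in for the ray-traced evaluation."""
    c = c0
    for _ in range(max_iter):
        err = wavefront_power(c) - prescription_D
        if abs(err) < tol:
            return c
        c -= step * err  # err < 0: power too weak, steepen; err > 0: flatten
    raise RuntimeError("curvature iteration did not converge")

# Purely illustrative stand-in: one thin refracting surface in air
# (power ~ (n - 1)*c, in dioptres for c in 1/mm), offset by a nominal
# 60 D emmetropic eye. Not the application's ray-traced evaluation.
def toy_power(c, n=1.376, baseline_D=60.0):
    return (n - 1.0) * c * 1000.0 - baseline_D
```

With `toy_power`, fitting a +2.00 D prescription converges in a few iterations; with a real ray tracer in place of `toy_power`, the same loop implements the stopping rule described above.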
- when there is astigmatism, the corneal front surface 360a and the corneal rear surface 360p are toric surfaces.
- a toric surface has a meridian of minimum curvature and a meridian of maximum curvature at 90° to each other about a predetermined axis, and is specified by the base curvature, the cross curvature, and the direction of the base curvature.
- the eyeball model construction unit 12 sets the base curvature, the cross curvature, and the direction of the base curvature and, by ray tracing a ray out of the cornea, calculates the refractive power in the base direction, the refractive power in the cross direction, and the direction of the base refractive power of the ray wavefront at the intersection of the ray with the spherical surface 25 mm from the center of rotation of the eyeball.
- the eyeball model construction unit 12 judges the evaluation criterion to be satisfied if the absolute value of the difference between the refractive power in the base direction and the prescription spherical power, and the absolute value of the difference between the base-direction refractive power minus the cross-direction refractive power and the prescription astigmatic power, are each less than 0.02 D or the like.
- for the direction of the base refractive power, the eyeball model construction unit 12 judges the evaluation criterion to be satisfied if the difference is less than a few degrees, for example less than 1 degree.
- if all the evaluation criteria are satisfied, the eyeball model construction unit 12 adopts the toric surfaces with the set base curvature, cross curvature, and base-curvature direction as the model of the corneal front and rear surfaces. If any of the evaluation criteria is not satisfied, the eyeball model construction unit 12 sets the base curvature, the cross curvature, and the direction of the base curvature again and performs the evaluation.
- alternatively, a spectacle lens corresponding to the prescription data may be placed in front of the eyeball, ray tracing of a plane wave directed from in front of the spectacle lens toward the optical center of the spectacle lens may be performed, and parameters such as the curvatures of the corneal front surface 360a and the corneal rear surface 360p may be determined by an iterative algorithm so that the refractive power at the retinal center position 320 becomes less than 0.02 D or the like. This makes it possible to construct a more precise model reflecting the conditions at the time of the eye examination.
- a program for realizing the information processing functions of the image creation apparatus 1 may be recorded on a computer-readable recording medium, and the program relating to the control of the image creation processing described above and its related processing recorded on the recording medium may be loaded into a computer system and executed.
- the “computer system” includes an OS (Operating System) and hardware of peripheral devices.
- the “computer-readable recording medium” refers to a portable recording medium such as a flexible disk, a magneto-optical disk, an optical disk, and a memory card, and a storage device such as a hard disk built in the computer system.
- the "computer-readable recording medium" may further include media that hold a program dynamically for a short time, such as a communication line used when a program is transmitted via a network such as the Internet or a communication line such as a telephone line, and media that hold a program for a certain period, such as the volatile memory inside a computer system serving as a server or a client in that case.
- the above program may realize only part of the functions described above, or may realize them in combination with a program already recorded in the computer system.
- FIG. 14 shows how the program is provided.
- the PC 950 is provided with a program via the CD-ROM 953. Further, the PC 950 has a connection function with the communication line 951.
- a computer 952 is a server computer that provides the program, and stores the program in a recording medium such as a hard disk.
- the communication line 951 is a communication line such as the Internet or personal computer communication, or a dedicated communication line.
- the computer 952 reads the program from the hard disk and transmits it to the PC 950 via the communication line 951. That is, the program is carried as a data signal on a carrier wave and transmitted via the communication line 951.
- the program can be supplied as a computer-readable computer program product in various forms such as a recording medium and a carrier wave.
- the present invention is not limited to the contents of the above embodiment.
- Other embodiments conceivable within the scope of the technical idea of the present invention are also included in the scope of the present invention.
Description
According to a second aspect of the present invention, an image creation method includes creating a retinal image based on target scene three-dimensional information relating to the arrangement, shapes, and optical characteristics of structures in a virtual target scene, spectacle lens three-dimensional information relating to the arrangement, shape, and optical characteristics of a spectacle lens, and eyeball three-dimensional information relating to the arrangement, shape, and optical characteristics of an eye of a wearer who virtually views the target scene through the spectacle lens; the retinal image is an image in which the target scene virtually viewed by the wearer through the spectacle lens is projected onto the retina of the eye of the wearer.
According to a third aspect of the present invention, an image creation program causes a computer to execute retinal image creation processing that creates a retinal image based on target scene three-dimensional information relating to the arrangement, shapes, and optical characteristics of structures in a virtual target scene, spectacle lens three-dimensional information relating to the arrangement, shape, and optical characteristics of a spectacle lens, and eyeball three-dimensional information relating to the arrangement, shape, and optical characteristics of an eye of a wearer who virtually views the target scene through the spectacle lens; the retinal image is an image in which the target scene virtually viewed by the wearer through the spectacle lens is projected onto the retina of the eye of the wearer.
According to a fourth aspect of the present invention, a spectacle lens design method designs a spectacle lens based on the shape of the spectacle lens used for creating the retinal image in the image creation apparatus of the first aspect.
According to a fifth aspect of the present invention, a spectacle lens manufacturing method includes designing the spectacle lens by the design method of the fourth aspect and manufacturing the spectacle lens designed by the design method.
Note that the communication unit 22 described later may be configured to receive the input data and output them to the control unit 10. The method of inputting the input data is not particularly limited, and data stored in advance in the storage unit 9 may be used as the input data.
In the present embodiment, "scene" refers to the visible external world, and its content is not particularly limited.
The data included in the input data are not limited to those illustrated in FIG. 3; some of the items illustrated in FIG. 3 may be fixed, predetermined values, and the design may be changed as appropriate.
When the input data have been input, the process proceeds to step S1003.
As the object 53, an object for visual acuity measurement, such as an eye chart, may be placed; this can serve as a reference for virtual visual acuity. In the external model 5, the origin of the coordinate system may be set at the center or the central portion of the floor 59, or at any other arbitrary position.
When the external model has been constructed, the process proceeds to step S1005.
Steps S1005 to S1009 are configured so that each process is performed for both eyes before proceeding to the next step, but the processing of steps S1005 to S1009 may instead be performed for one eye and then for the other eye.
Further, the amounts of change of the lens edge 33 and the lens core 34 may be determined based on the accommodative power of the wearer, thereby determining the eyeball model 30.
The size Sh × Sv of one pixel is calculated as Sh = (θ1 − θ0)/Nh and Sv = (φ1 − φ0)/Nv. The position corresponding to each pixel can be set as appropriate, for example θ = θ0 + (i + 1/2)·Sh and φ = φ0 + (j + 1/2)·Sv (where 0 ≤ i < Nh and 0 ≤ j < Nv).
Although the above maps two-dimensionally based on angles on the retinal sphere, the luminance data may instead be projected directly onto an arbitrary plane, such as the XZ plane, after the three-dimensional spherical coordinate system (r, θ, φ) is expressed in the three-dimensional orthogonal coordinate system (x, y, z).
The correlation coefficient between the templates is given by equation (1) (formula not reproduced here), where fl and fr with an overbar denote the average luminance values over the whole of each template, calculated by equation (2) (formula not reproduced here).
Further, although the above searches the right-eye image for the point corresponding to a given pixel 61L of the left eye, the left-eye image may instead be searched for the point corresponding to a given pixel of the right eye.
When the calculation of the corresponding points and the creation of the parallax distribution and the like are finished, the process proceeds to step S1013.
When the binocular vision image 71 has been created, the process proceeds to step S1015.
なお、両眼視画像作成部18は、例えばステップS1003~S1013までを繰り返して得た複数の両眼視画像71を動画像作成部19に送り、動画像を作成させてもよい。動画像作成部19は、入力データ等のパラメータを変化させて得られた複数の両眼視画像71を並べて逐次表示するように構成した動画像を作成する。例えば、図6(a)の状態から図6(b)の状態まで水晶体辺縁部33および水晶体コア34を変化させた場合を含んだ動画像を、両状態間の異なる複数の段階の眼球モデル30に基づいて作成された両眼視画像71を基に作成する。
作成する動画像は、眼球モデル30の任意のパラメータを時間軸に沿って変化させて構成することができる。また、視線方向を変えた場合等を仮想し、眼球が運動した場合の網膜画像70等の変化を動画像として表してもよい。
When returning to step S1007, the control unit 10 can, if necessary, issue a redesign instruction to the design device 93 (see Fig. 12). The design device 93 may design the spectacle lens based on the shape of the spectacle lens used to create the retinal image 70 and the like, the correction parameters used when creating the binocular vision image 71, the parallax distribution, and so on.
In step S1023, the spectacle lens processing machine 92 manufactures the spectacle lens for which the processing instruction was sent in step S1021. The design device 93 transmits the spectacle lens design data, sent to the image creation device 1 for example as part of the input data, to the processing machine control device 91, and the spectacle lens processing machine 92 manufactures the spectacle lens under the control of the processing machine control device 91.
(1) The image creation device 1 of the present embodiment comprises a retinal image creation unit 14 that creates retinal images 70, in which the external-world model 5 virtually viewed by the wearer through the spectacle lens is projected onto each retina of the wearer's two eyes, based on external-world description data regarding the arrangement, shape, and optical properties such as reflectance of the objects 53 in the external-world model 5; spectacle lens description data regarding the arrangement, shape, and optical properties such as refractive index of the spectacle lens; and eyeball description data regarding the arrangement, shape, and optical properties such as refractive index of the two eyes of the wearer who virtually views the external-world model 5 through the spectacle lens. This makes it possible, when the wearer is assumed to be wearing the spectacle lens, to envision the visual image in a variety of realistic situations according to the eyeball structure, the scenery, the spectacle lens, and so on.
(Variation 1)
In the embodiment described above, the retinal image creation unit 14 may create the retinal image 70 in combination with a correction lens, taking into account an eyeball structure with a refractive error. This makes it possible to provide a lens that appropriately corrects the refractive error, based on the retinal image 70 and the like of the present embodiment.
In the embodiment described above, the eyeball structure is described by the eyeball description data, but the device may instead be configured to calculate the eyeball description data from the wearer's prescription data. This makes it possible to construct the eyeball model 30 from the prescription data even when the wearer's eyeball structure cannot be measured directly or such measured data is unavailable.
Although the criterion above for whether to adopt the set curvatures of the anterior corneal surface 360a and the posterior corneal surface 360p is 0.02 D, it can be set as appropriate to a value such as 0.01 D or 0.03 D. The position at which the refractive power of the ray wavefront is calculated can also be set as appropriate, for example selected from a range of 10 to 15 mm along the optical axis direction beyond the cornea. The same applies to the case with astigmatism described below.
Alternatively, a spectacle lens corresponding to the prescription data may be placed in front of the eyeball, a plane wave directed from in front of the spectacle lens toward its optical center may be ray-traced, and parameters such as the curvatures of the anterior corneal surface 360a and the posterior corneal surface 360p may be determined by an iterative algorithm so that the refractive power at the retinal center position 320 becomes, for example, less than 0.02 D. This enables construction of a more precise model reflecting the conditions at the time of the eye examination.
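One possible form of such an iterative algorithm is a simple bisection on a curvature parameter, sketched below. Here residual_power stands in for the ray-tracing step that returns the signed refractive power at the retinal center for a given curvature, and it is assumed to be monotonic over the bracketing interval; all names are hypothetical, not from the patent.

```python
def fit_curvature(residual_power, c_low, c_high, tol=0.02, max_iter=100):
    """Bisection sketch: find a curvature at which the residual
    refractive power at the retinal centre falls below tol diopters.
    Assumes residual_power(c_low) < 0 < residual_power(c_high)."""
    for _ in range(max_iter):
        c_mid = 0.5 * (c_low + c_high)
        p = residual_power(c_mid)
        if abs(p) < tol:
            return c_mid
        if p > 0:
            c_high = c_mid   # overpowered: reduce curvature
        else:
            c_low = c_mid    # underpowered: increase curvature
    return 0.5 * (c_low + c_high)
```

In practice two curvatures (anterior and posterior) and possibly astigmatic axes are fitted jointly, so a multivariate solver would replace this one-dimensional sketch.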
A program for realizing the information processing functions of the image creation device 1 may be recorded on a computer-readable recording medium, and the program recorded on this recording medium, relating to the control of the image creation processing described above and its related processing, may be loaded into a computer system and executed. The "computer system" here includes an OS (Operating System) and peripheral hardware. The "computer-readable recording medium" refers to portable recording media such as flexible disks, magneto-optical disks, optical disks, and memory cards, and to storage devices such as hard disks built into computer systems. The "computer-readable recording medium" may further include media that hold a program dynamically for a short time, such as a communication line used when transmitting the program via a network such as the Internet or a communication line such as a telephone line, and media that hold a program for a certain period of time, such as the volatile memory inside a computer system serving as a server or client in that case. The above program may realize only part of the functions described above, or may realize the functions described above in combination with a program already recorded in the computer system.
Japanese Patent Application No. 2016-205990 (filed October 20, 2016)
Claims (15)
- An image creation device comprising:
a storage unit that stores object scene three-dimensional information regarding the arrangement, shape, and optical properties of structures in a virtual object scene,
spectacle lens three-dimensional information regarding the arrangement, shape, and optical properties of a spectacle lens, and
eyeball three-dimensional information regarding the arrangement, shape, and optical properties of an eye of a wearer who views the object scene through the virtual spectacle lens; and
a retinal image creation unit that creates a retinal image based on the object scene three-dimensional information, the spectacle lens three-dimensional information, and the eyeball three-dimensional information,
wherein the retinal image is a virtual image projected onto the retina of the eye of the wearer when the wearer views the object scene through the spectacle lens. - The image creation device according to claim 1, comprising:
a corresponding point calculation unit that calculates, in the retinal images of both eyes, corresponding points that correspond to an arbitrary position in the object scene; and
a parallax calculation unit that calculates, based on the corresponding points, the parallax of the two eyes with respect to the position.
- The image creation device according to claim 2, wherein
the corresponding point calculation unit
calculates the corresponding points based on a correlation coefficient or a difference between the luminance of each pixel included in a first pixel region consisting of a plurality of pixels set in the retinal image of one of the two eyes and the luminance of each pixel included in a second pixel region consisting of a plurality of pixels set in the retinal image of the other eye. - The image creation device according to claim 2 or 3, comprising
a composite image creation unit that creates a composite image from the retinal images of the two eyes based on the parallax of the two eyes and parallax correction parameters including a fusion ratio. - The image creation device according to any one of claims 2 to 4, comprising
a parallax display unit that displays a distribution of the parallax of the two eyes corresponding to the retinal images. - The image creation device according to any one of claims 1 to 5, wherein
the retinal image creation unit comprises a correction image creation unit that creates the retinal image projected onto the retina of each of the eyes further based on correction lens three-dimensional information regarding the arrangement, shape, and optical properties of a correction lens. - The image creation device according to any one of claims 1 to 6, comprising
a ray tracing unit that calculates, for a ray incident on each position of the retina of the eye, the incident direction and the incident position on the anterior surface of the cornea of the eye, and
calculates, based on the incident direction and the incident position, the ray path along which the ray propagates from the object scene through the anterior surface of the cornea to each position of the retina, together with the luminance of the pixel corresponding to each position of the retina. - The image creation device according to any one of claims 1 to 7, comprising
an eyeball shape calculation unit that calculates the shape and optical properties of the eye of the wearer from prescription data of the wearer. - The image creation device according to claim 8, wherein
the eyeball shape calculation unit calculates the shape of the eye from the accommodative power and the pupil diameter of the wearer. - The image creation device according to any one of claims 1 to 9, comprising
a moving image display unit that displays, based on changes in the eyeball three-dimensional information, the retinal image, or a composite image based on the retinal images of both eyes, as a moving image. - An image creation method comprising creating a retinal image based on object scene three-dimensional information regarding the arrangement, shape, and optical properties of structures in a virtual object scene, spectacle lens three-dimensional information regarding the arrangement, shape, and optical properties of a spectacle lens, and eyeball three-dimensional information regarding the arrangement, shape, and optical properties of an eye of a wearer who virtually views the object scene through the spectacle lens,
wherein the retinal image is an image in which the object scene virtually viewed by the wearer through the spectacle lens is projected onto the retina of the eye of the wearer. - The image creation method according to claim 11, wherein
a composite image is created from the retinal images of the two eyes based on a parallax correction parameter that sets an amount by which the parallax of the two eyes is corrected. - An image creation program that causes a computer to execute retinal image creation processing of creating a retinal image based on object scene three-dimensional information regarding the arrangement, shape, and optical properties of structures in a virtual object scene, spectacle lens three-dimensional information regarding the arrangement, shape, and optical properties of a spectacle lens, and eyeball three-dimensional information regarding the arrangement, shape, and optical properties of an eye of a wearer who virtually views the object scene through the spectacle lens,
wherein the retinal image is an image in which the object scene virtually viewed by the wearer through the spectacle lens is projected onto the retina of the eye of the wearer. - A spectacle lens design method for designing a spectacle lens based on the shape of the spectacle lens used for creating the retinal image in the image creation device according to any one of claims 1 to 10.
- A spectacle lens manufacturing method comprising:
designing the spectacle lens by the design method according to claim 14; and
manufacturing the spectacle lens designed by the design method.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP17863041.4A EP3531194A4 (en) | 2016-10-20 | 2017-10-18 | IMAGE PRODUCTION DEVICE, IMAGE PRODUCTION METHOD, IMAGE PRODUCTION PROGRAM, METHOD FOR DESIGNING AN EYE GLASS LENS AND METHOD FOR PRODUCING AN EYE GLASS LENS |
JP2018546389A JP7078540B2 (ja) | 2016-10-20 | 2017-10-18 | 画像作成装置、画像作成方法、画像作成プログラム、眼鏡レンズの設計方法および眼鏡レンズの製造方法 |
CA3040852A CA3040852C (en) | 2016-10-20 | 2017-10-18 | Image creation device, method for image creation, image creation program, method for designing eyeglass lens and method for manufacturing eyeglass lens |
US16/389,428 US10958898B2 (en) | 2016-10-20 | 2019-04-19 | Image creation device, method for image creation, image creation program, method for designing eyeglass lens and method for manufacturing eyeglass lens |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016205990 | 2016-10-20 | ||
JP2016-205990 | 2016-10-20 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/389,428 Continuation US10958898B2 (en) | 2016-10-20 | 2019-04-19 | Image creation device, method for image creation, image creation program, method for designing eyeglass lens and method for manufacturing eyeglass lens |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018074528A1 true WO2018074528A1 (ja) | 2018-04-26 |
Family
ID=62018651
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/037741 WO2018074528A1 (ja) | 2016-10-20 | 2017-10-18 | 画像作成装置、画像作成方法、画像作成プログラム、眼鏡レンズの設計方法および眼鏡レンズの製造方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10958898B2 (ja) |
EP (1) | EP3531194A4 (ja) |
JP (1) | JP7078540B2 (ja) |
CA (1) | CA3040852C (ja) |
WO (1) | WO2018074528A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021149031A (ja) * | 2020-03-23 | 2021-09-27 | ホヤ レンズ タイランド リミテッドHOYA Lens Thailand Ltd | 仮想画像生成装置及び仮想画像生成方法 |
JP2022525806A (ja) * | 2019-03-22 | 2022-05-19 | エシロール・アンテルナシオナル | 視覚タスクのための視覚機器の性能を評価する装置及び方法 |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2901477C (en) | 2015-08-25 | 2023-07-18 | Evolution Optiks Limited | Vision correction system, method and graphical user interface for implementation on electronic devices having a graphical display |
US11631224B2 (en) * | 2016-11-21 | 2023-04-18 | Hewlett-Packard Development Company, L.P. | 3D immersive visualization of a radial array |
US11353699B2 (en) | 2018-03-09 | 2022-06-07 | Evolution Optiks Limited | Vision correction system and method, light field display and light field shaping layer and alignment therefor |
CA3021636A1 (en) | 2018-10-22 | 2020-04-22 | Evolution Optiks Limited | Light field display, adjusted pixel rendering method therefor, and vision correction system and method using same |
US11693239B2 (en) | 2018-03-09 | 2023-07-04 | Evolution Optiks Limited | Vision correction system and method, light field display and light field shaping layer and alignment therefor |
US11500460B2 (en) | 2018-10-22 | 2022-11-15 | Evolution Optiks Limited | Light field device, optical aberration compensation or simulation rendering |
US11327563B2 (en) | 2018-10-22 | 2022-05-10 | Evolution Optiks Limited | Light field vision-based testing device, adjusted pixel rendering method therefor, and online vision-based testing management system and method using same |
US10636116B1 (en) * | 2018-10-22 | 2020-04-28 | Evolution Optiks Limited | Light field display, adjusted pixel rendering method therefor, and vision correction system and method using same |
US10936064B2 (en) | 2018-10-22 | 2021-03-02 | Evolution Optiks Limited | Light field display, adjusted pixel rendering method therefor, and adjusted vision perception system and method using same addressing astigmatism or similar conditions |
US10860099B2 (en) | 2018-10-22 | 2020-12-08 | Evolution Optiks Limited | Light field display, adjusted pixel rendering method therefor, and adjusted vision perception system and method using same addressing astigmatism or similar conditions |
US11966507B2 (en) | 2018-10-22 | 2024-04-23 | Evolution Optiks Limited | Light field vision testing device, adjusted pixel rendering method therefor, and vision testing system and method using same |
US10761604B2 (en) | 2018-10-22 | 2020-09-01 | Evolution Optiks Limited | Light field vision testing device, adjusted pixel rendering method therefor, and vision testing system and method using same |
US11500461B2 (en) | 2019-11-01 | 2022-11-15 | Evolution Optiks Limited | Light field vision-based testing device, system and method |
US11789531B2 (en) | 2019-01-28 | 2023-10-17 | Evolution Optiks Limited | Light field vision-based testing device, system and method |
US11635617B2 (en) | 2019-04-23 | 2023-04-25 | Evolution Optiks Limited | Digital display device comprising a complementary light field display or display portion, and vision correction system and method using same |
US11902498B2 (en) | 2019-08-26 | 2024-02-13 | Evolution Optiks Limited | Binocular light field display, adjusted pixel rendering method therefor, and vision correction system and method using same |
EP3798944A1 (en) * | 2019-09-30 | 2021-03-31 | Hoya Lens Thailand Ltd. | Learning model generation method, computer program, eyeglass lens selection support method, and eyeglass lens selection support system |
US11823598B2 (en) | 2019-11-01 | 2023-11-21 | Evolution Optiks Limited | Light field device, variable perception pixel rendering method therefor, and variable perception system and method using same |
US11487361B1 (en) | 2019-11-01 | 2022-11-01 | Evolution Optiks Limited | Light field device and vision testing system using same |
CN112353363A (zh) * | 2020-10-24 | 2021-02-12 | 泰州市华仕达机械制造有限公司 | 基于开孔分析的尺寸选择系统 |
DE102020215285A1 (de) | 2020-12-03 | 2022-06-09 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zur Parallaxeregelung und binokulare Datenbrille mit einer Recheneinheit zu einem Durchführen des Verfahrens |
KR20230119176A (ko) * | 2020-12-10 | 2023-08-16 | 어플라이드 머티어리얼스, 인코포레이티드 | 웹 에지 계측 |
EP4043946A1 (en) * | 2021-02-12 | 2022-08-17 | Essilor International | A device and method for automatically evaluating visual equipment |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003075785A (ja) * | 2001-06-20 | 2003-03-12 | Vision Megane:Kk | 眼鏡・コンタクトレンズ度数決定システムおよびその方法 |
JP2003177076A (ja) * | 2001-09-06 | 2003-06-27 | Hoya Corp | 眼鏡レンズの両眼視性能表示方法及びその装置 |
WO2010044383A1 (ja) * | 2008-10-17 | 2010-04-22 | Hoya株式会社 | 眼鏡の視野画像表示装置及び眼鏡の視野画像表示方法 |
JP2010134460A (ja) * | 2008-11-06 | 2010-06-17 | Seiko Epson Corp | 眼鏡レンズ用視覚シミュレーション装置、眼鏡レンズ用視覚シミュレーション方法及び眼鏡レンズ用視覚シミュレーションプログラム |
WO2013175923A1 (ja) * | 2012-05-25 | 2013-11-28 | Hoya株式会社 | シミュレーション装置 |
WO2014046206A1 (ja) * | 2012-09-19 | 2014-03-27 | 株式会社ニコン | 視線検出装置、表示方法、視線検出装置較正方法、眼鏡レンズ設計方法、眼鏡レンズ選択方法、眼鏡レンズ製造方法、印刷物、眼鏡レンズ販売方法、光学装置、視線情報検出方法、光学機器設計方法、光学機器、光学機器選択方法、及び、光学機器製造方法 |
JP2015108852A (ja) * | 2007-01-25 | 2015-06-11 | ローデンストック.ゲゼルシャフト.ミット.ベシュレンクテル.ハフツング | フレキシブル遠近両用レンズのオプティマイザ |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6792401B1 (en) * | 2000-10-31 | 2004-09-14 | Diamond Visionics Company | Internet-based modeling kiosk and method for fitting and selling prescription eyeglasses |
EP1291633B1 (en) * | 2001-09-06 | 2005-09-28 | Hoya Corporation | Method for evaluating binocular performance of spectacle lenses, method for displaying said performance and apparatus therefore |
US7275822B2 (en) * | 2004-03-18 | 2007-10-02 | Essilor International (Compagnie Generale D'optique) | Progressive addition lenses with adjusted image magnification |
FR2874709B1 (fr) * | 2004-08-27 | 2006-11-24 | Essilor Int | Procede de determination d'une paire de lentilles ophtalmiques progressives |
JP5410548B2 (ja) * | 2009-02-03 | 2014-02-05 | エルジー・ケム・リミテッド | 立体映像表示装置用光学フィルタの製造方法 |
CN102369476B (zh) * | 2009-02-05 | 2014-04-30 | Hoya株式会社 | 眼镜镜片的评价方法、眼镜镜片的设计方法、眼镜镜片的制造方法、眼镜镜片的制造系统及眼镜镜片 |
FR2950984B1 (fr) * | 2009-10-05 | 2012-02-03 | Interactif Visuel Systeme Ivs | Procede et equipement de mesures pour la personnalisation et le montage de lentilles ophtalmiques correctrices |
GB201310368D0 (en) * | 2013-06-11 | 2013-07-24 | Sony Comp Entertainment Europe | Head-mountable apparatus and systems |
WO2015168794A1 (en) * | 2014-05-07 | 2015-11-12 | Michael Quigley | Method of prescribing/making eyewear for an individual |
US10121178B2 (en) * | 2014-06-13 | 2018-11-06 | Ebay Inc. | Three-dimensional eyeglasses modeling from two-dimensional images |
US9665984B2 (en) * | 2014-07-31 | 2017-05-30 | Ulsee Inc. | 2D image-based 3D glasses virtual try-on system |
US11956414B2 (en) * | 2015-03-17 | 2024-04-09 | Raytrx, Llc | Wearable image manipulation and control system with correction for vision defects and augmentation of vision and sensing |
US20170132845A1 (en) * | 2015-11-10 | 2017-05-11 | Dirty Sky Games, LLC | System and Method for Reducing Virtual Reality Simulation Sickness |
EP3430804A4 (en) * | 2016-03-15 | 2019-11-13 | Deepsee Inc. | APPLICATIONS, METHOD AND APPARATUS FOR 3D DISPLAY |
CN105913487B (zh) * | 2016-04-09 | 2018-07-06 | 北京航空航天大学 | 一种基于人眼图像中虹膜轮廓分析匹配的视线方向计算方法 |
2017
- 2017-10-18 EP EP17863041.4A patent/EP3531194A4/en active Pending
- 2017-10-18 CA CA3040852A patent/CA3040852C/en active Active
- 2017-10-18 JP JP2018546389A patent/JP7078540B2/ja active Active
- 2017-10-18 WO PCT/JP2017/037741 patent/WO2018074528A1/ja unknown
2019
- 2019-04-19 US US16/389,428 patent/US10958898B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3531194A4 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022525806A (ja) * | 2019-03-22 | 2022-05-19 | エシロール・アンテルナシオナル | 視覚タスクのための視覚機器の性能を評価する装置及び方法 |
JP7434353B2 (ja) | 2019-03-22 | 2024-02-20 | エシロール・アンテルナシオナル | 視覚タスクのための視覚機器の性能を評価する装置及び方法 |
JP2021149031A (ja) * | 2020-03-23 | 2021-09-27 | ホヤ レンズ タイランド リミテッドHOYA Lens Thailand Ltd | 仮想画像生成装置及び仮想画像生成方法 |
WO2021193261A1 (ja) * | 2020-03-23 | 2021-09-30 | ホヤ レンズ タイランド リミテッド | 仮想画像生成装置及び仮想画像生成方法 |
JP7272985B2 (ja) | 2020-03-23 | 2023-05-12 | ホヤ レンズ タイランド リミテッド | 仮想画像生成装置及び仮想画像生成方法 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2018074528A1 (ja) | 2019-09-26 |
CA3040852C (en) | 2023-04-04 |
CA3040852A1 (en) | 2018-04-26 |
EP3531194A4 (en) | 2020-05-06 |
JP7078540B2 (ja) | 2022-05-31 |
EP3531194A1 (en) | 2019-08-28 |
US10958898B2 (en) | 2021-03-23 |
US20190246095A1 (en) | 2019-08-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018074528A1 (ja) | 画像作成装置、画像作成方法、画像作成プログラム、眼鏡レンズの設計方法および眼鏡レンズの製造方法 | |
JP7381482B2 (ja) | ディスプレイシステムのための深度ベースの中心窩化レンダリング | |
US10319154B1 (en) | Methods, systems, and computer readable media for dynamic vision correction for in-focus viewing of real and virtual objects | |
JP6972105B2 (ja) | 固定距離の仮想現実システムおよび拡張現実システムならびに方法 | |
JP6014038B2 (ja) | 眼鏡装用シミュレーション方法、プログラム、装置、眼鏡レンズ発注システム及び眼鏡レンズの製造方法 | |
KR102077105B1 (ko) | 사용자 인터랙션을 위한 디스플레이를 설계하는 장치 및 방법 | |
JP2023126616A (ja) | ディスプレイシステムのための深度ベース中心窩化レンダリング | |
JP7227165B2 (ja) | ディスプレイにおける虚像を制御する方法 | |
TW201937238A (zh) | 虛擬及擴增實境頭戴式耳機的改善或關於該耳機的改善 | |
AU2019224109B2 (en) | Holographic real space refractive sequence | |
JP2024069461A (ja) | 左および右ディスプレイとユーザの眼との間の垂直整合を決定するためのディスプレイシステムおよび方法 | |
WO2020079906A1 (ja) | ヘッドマウントディスプレイおよびこれに用いられる広焦点レンズの設計方法 | |
CN111373307A (zh) | 立体眼镜、该立体眼镜中使用的眼镜镜片的设计方法以及立体图像的观察方法 | |
US11253149B2 (en) | Holographic real space refractive sequence | |
JP2003177076A (ja) | 眼鏡レンズの両眼視性能表示方法及びその装置 | |
JP3347514B2 (ja) | 眼光学系のシミュレーション装置 | |
JP7241702B2 (ja) | 画像作成装置、眼鏡レンズ選択システム、画像作成方法およびプログラム | |
RU2609285C1 (ru) | Способ формирования многопланового изображения и мультифокальный стереоскопический дисплей | |
US20240176418A1 (en) | Method and system for improving perfomance of an eye tracking system | |
JP2024004042A (ja) | シミュレーション装置、データ送信装置、モデル生成装置、事前データ生成装置、画像生成装置、シミュレーション方法、及びシミュレーションプログラム | |
JP2024129041A (ja) | 頭部搭載型ディスプレイシステムにおける変形に関する補償 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17863041; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2018546389; Country of ref document: JP; Kind code of ref document: A |
ENP | Entry into the national phase | Ref document number: 3040852; Country of ref document: CA |
NENP | Non-entry into the national phase | Ref country code: DE |
ENP | Entry into the national phase | Ref document number: 2017863041; Country of ref document: EP; Effective date: 20190520 |
Ref document number: 2017863041 Country of ref document: EP Effective date: 20190520 |