WO2013175923A1 - Simulation Device - Google Patents
Simulation Device
- Publication number
- WO2013175923A1 (PCT/JP2013/061897)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- simulation
- lens
- distance
- display
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0016—Operational features thereof
- A61B3/0041—Operational features thereof characterised by display arrangements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/02—Subjective types, i.e. testing apparatus requiring the active assistance of the patient
- A61B3/028—Subjective types, i.e. testing apparatus requiring the active assistance of the patient for testing visual acuity; for determination of refraction, e.g. phoropters
- A61B3/032—Devices for presenting test symbols or characters, e.g. test chart projectors
-
- G—PHYSICS
- G02—OPTICS
- G02C—SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
- G02C13/00—Assembling; Repairing; Cleaning
Definitions
- the present invention relates to a simulation apparatus that displays a pseudo image that an eyeglass lens wearer would see through the eyeglass lens.
- a simulation apparatus that allows a person who intends to wear spectacle lenses to experience a state of wearing the spectacle lenses is used (for example, see Patent Documents 1 and 2).
- a person who intends to wear spectacle lenses can experience the appearance (image distortion, blur, etc.) when passing through the spectacle lens prior to ordering the lenses.
- since the eyeglasses store uses image formation by simulation, it does not need to prepare sample lenses for every lens prescription a prospective wearer might desire, and it can even let the prospective wearer experience how things look through a lens type not included among the sample lenses.
- an original image (still image or moving image) that serves as the basis of image formation by simulation is captured in advance or generated by computer graphics (hereinafter abbreviated as “CG”).
- the conventional simulation apparatus is configured to allow a person who intends to wear spectacle lenses to experience a lens wearing state in a simulated manner using an original image prepared in advance.
- the reason for using the original image prepared in advance by the conventional simulation apparatus is as follows.
- for image formation by simulation, in addition to lens data such as a lens prescription, data on the distance from the lens to the constituent elements (for example, each pixel) of the original image is indispensable.
- the lens focal length is a fixed value for each spectacle lens, so if the distance to a constituent element of the original image differs, the appearance of that element through the spectacle lens naturally differs as well.
- since the original image is prepared in advance, the distances to the constituent elements of the original image can be specified in advance. For example, in the case of an image generated by CG, it is possible to have distance data for each pixel. For this reason, the conventional simulation apparatus uses an original image prepared in advance.
- as in Patent Document 1, if a plurality of imaging cameras using, for example, CCD (Charge-Coupled Device) sensors or CMOS (Complementary Metal-Oxide Semiconductor) sensors are used, it is also possible to obtain an image and distance data corresponding to that image by stereo vision using the plurality of devices.
- An object of the present invention is to provide a simulation apparatus capable of generating a simulation image in which the difference in appearance corresponding to differences in distance is correctly reflected.
- the present invention has been devised to achieve the above object.
- the inventors of the present application first examined using an image that matches the reality of real space as an original image that is the basis of image formation by simulation.
- specifically, they examined using an imaging camera to capture an image that matches the reality of real space, that is, an image for displaying in real time the objects (subjects) in the field of view while dynamically following the field of vision of the person who will wear the spectacle lens.
- however, even when stereo vision with a plurality of imaging cameras is used as described above, the distance data that can be acquired is limited, and therefore the images that can be simulated from the imaging results of the imaging cameras alone are also limited.
- if the distance merely needs to be specified, it may be possible to use a distance measuring device such as a widely used infrared length measuring sensor. However, the distance needs to be specified for each constituent element (for example, each pixel) of the original image, and detecting distance data for each pixel using a general distance measuring device is therefore not very realistic in view of the information processing capability of an ordinary simulation apparatus.
- the inventors of the present application have further studied earnestly.
- as a result, the inventors of the present application arrived at the new idea of using a distance image sensor of the kind used in surveillance systems or game devices in a simulation apparatus that, in order to display objects in the field of view in real time while dynamically following the field of view of the person who intends to wear the lens, requires analysis processing such as optical ray tracing, and of acquiring with that sensor a distance image having the same angle of view as the imaging result of the imaging camera. Such an idea was not found in the past.
- the present invention has been made based on the above-mentioned new idea by the inventors.
- a first aspect of the present invention is a simulation apparatus that displays, in a pseudo manner, an image that a spectacle lens wearer would see through the spectacle lens, comprising: an imaging camera that captures an image within the field of view of the wearer; a distance image sensor that acquires a distance image having the same angle of view as the imaging result of the imaging camera; data acquisition means that acquires lens data of the spectacle lens; image generation means that performs image processing on the imaging result of the imaging camera based on the distance image and the lens data to generate a simulation image reflecting the appearance through the spectacle lens; and an image display that displays and outputs the simulation image to the wearer.
- the image generation means and the image display are characterized by individually corresponding to the left-eye simulation image and the right-eye simulation image, respectively.
- the imaging camera and the distance image sensor are also provided individually corresponding to the left and right eyes.
- the images handled by the imaging camera, the distance image sensor, the image generation means, and the image display are characterized by being moving images.
- the imaging camera, the distance image sensor, and the image display are characterized by being incorporated in a housing that can be attached to the head of the wearer.
- the simulation apparatus according to the fifth aspect is characterized by further comprising a camera drive mechanism that changes the arrangement state of the imaging camera.
- at least the imaging camera, the distance image sensor, and the image display are characterized by being incorporated in a portable information processing terminal.
- according to the present invention, a person who intends to wear a spectacle lens can experience the lens wearing state while viewing an image that matches the reality of real space, and even in that case it is possible to generate a simulation image that correctly reflects the difference in appearance corresponding to the difference in distance.
- FIG. 1 is an explanatory diagram illustrating a schematic configuration example of the entire simulation apparatus according to the first embodiment.
- the simulation apparatus displays, in a pseudo manner, an image that a spectacle lens wearer would see through the spectacle lens, thereby allowing a person who intends to wear the spectacle lens to experience the lens wearing state in a simulated manner. As shown in FIG. 1A, the simulation apparatus according to the first embodiment is roughly divided into a head-mounted display (hereinafter abbreviated as “HMD”) unit 10 and a control computer unit 20.
- the HMD unit 10 includes a housing 11 and a mounting band 12 connected to the housing 11, and is thereby configured to be mountable on the head of a person who intends to wear spectacle lenses (hereinafter simply referred to as the “lens wearer”).
- the housing 11 incorporates an imaging camera 13, a distance image sensor 14, and an image display display 15.
- the imaging camera 13 dynamically follows the field of view of the lens wearer and captures images of the objects (subjects) existing in that field of view, thereby capturing an image that matches the reality of the real space in which the lens wearer is located.
- the imaging camera 13 for example, it may be possible to use a camera configured using a CCD sensor, a CMOS sensor, or the like.
- the imaging camera 13 is desirably provided individually for each of the left and right eyes of the lens wearer. In that case, as shown in FIG. 1(b), one of the individually provided imaging cameras 13 captures the angle of view 16L corresponding to the visual field of the lens wearer's left eye, and the other captures the angle of view 16R corresponding to the visual field of the lens wearer's right eye.
- the imaging camera 13 is compatible with imaging of moving images.
- the distance image sensor 14 acquires a distance image having the same angle of view as the imaging result obtained by the imaging camera 13.
- the “distance image” refers to an image having distance information from the distance image sensor 14 to the subject instead of values such as color and shading of a general two-dimensional RGB image.
- the distance image sensor 14 that acquires the distance image detects, for each subject in an image having the same angle of view as the imaging result of the imaging camera 13, the magnitude of the distance in the depth direction between that subject and the distance image sensor 14. The distance detection may be performed using a known method.
- Known methods include, for example, the TOF (Time Of Flight) method, which detects distance by measuring the time it takes for emitted light to bounce off the subject and return, and the SL (Structured Light) method, which measures distance from the distortion of a pattern of laser light projected onto the subject and reflected back.
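As a rough illustration of the TOF principle mentioned above, the measured round-trip time of the emitted light converts directly to a distance. This is a hedged sketch with illustrative names; the patent itself does not give this computation:

```python
# TOF sketch: distance from the round-trip time of an emitted light pulse.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the subject: light travels there and back, so halve it."""
    return C * round_trip_seconds / 2.0

# A pulse returning after about 6.67 nanoseconds corresponds to roughly 1 metre.
d = tof_distance(6.671e-9)
```

A distance image sensor performs this measurement (or the structured-light equivalent) for every pixel of a frame at once, rather than one point at a time.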
- Such a distance image sensor 14 is desirably provided individually corresponding to each of the left and right eyes of the lens wearer, similarly to the imaging camera 13. However, even when the imaging camera 13 is individually provided corresponding to each of the left and right eyes of the lens wearer, it has a function of correcting the distance detection result by the distance image sensor 14 for the left and right eyes. If so, it is conceivable to share one distance image sensor 14 for the left and right eyes.
- the distance image sensor 14 is preferably capable of handling moving images in the same manner as the imaging camera 13, and it is even more desirable that it can handle high-frame-rate moving images, which allow smooth image handling.
- the distance image sensor 14 acquires a distance image having the same angle of view as the imaging result of the imaging camera 13; the “same angle of view” here of course includes the case where the angles of view completely match, and also includes the case where the angles of view are not identical but alignment is performed so that they can be treated as the same.
- the distance image sensor 14 is not necessarily separate from the imaging camera 13. That is, the distance image sensor 14 may be configured integrally with the imaging camera 13 by using, for example, a camera device that can simultaneously acquire a general two-dimensional RGB image and a distance image having distance information.
- the image display 15 is arranged in front of the eye of the lens wearer wearing the housing 11 of the HMD unit 10 and performs image display for the lens wearer.
- as the image display 15, for example, it may be possible to use a display configured using an LCD (Liquid Crystal Display).
- examples of the displayed image include a simulation image, that is, an image that the lens wearer would see through the spectacle lens.
- the image display 15 preferably includes a left-eye display panel and a right-eye display panel individually corresponding to the left-eye simulation image and the right-eye simulation image.
- it is desirable that the image display 15 is compatible with video display output.
- the control computer unit 20 has a function as a computer device that performs information processing instructed by a predetermined program. Specifically, the control computer unit 20 includes a CPU (Central Processing Unit), an HDD (Hard Disk Drive), a ROM (Read Only Memory), a RAM (Random Access Memory), an external interface (I/F), and the like.
- the control computer unit 20 may be incorporated in the housing 11 of the HMD unit 10 or may be provided separately from the HMD unit 10. When provided separately from the HMD unit 10, the control computer unit 20 can communicate with the HMD unit 10 via a wired or wireless communication line.
- FIG. 2 is a block diagram illustrating a functional configuration example of the simulation apparatus according to the first embodiment.
- functions as the communication unit 21, the data acquisition unit 22, and the image generation unit 23 are realized by the control computer unit 20 executing a predetermined program.
- the communication means 21 is a function by which the control computer unit 20 communicates with the HMD unit 10. Specifically, the communication means 21 receives the image captured by the imaging camera 13 and the distance image acquired by the distance image sensor 14, and sends the simulation image generated by the image generation means 23 (described later) to the image display 15.
- the communication protocol used by the communication unit 21 is not particularly limited.
- the data acquisition means 22 is a function for acquiring lens data of the spectacle lenses that the lens wearer is scheduled to wear.
- the acquired lens data includes at least lens prescription data and lens shape design data for each of the left eyeglass lens and the right eyeglass lens.
- the lens prescription data is data relating to the lens power, the addition power, the spherical power, the astigmatism power, the astigmatism axis, the prism power, the prism base direction, the near inset amount, and the like.
- the lens shape design data is data related to the shape of the spectacle lens, such as the refractive index and Abbe number of the lens material, the coordinate value data of the lens refracting surfaces (front and back surfaces), thickness data such as the lens center thickness, progressive zone length, etc.
- lens data acquisition may be performed, for example, by the data acquisition means 22 in the control computer unit 20 accessing a data server device through a network line (not shown). Alternatively, if the lens data is compiled into a database and stored in a storage device included in the control computer unit 20, the data acquisition means 22 may access that storage device.
- since lens data for various spectacle lenses exists in the data server device or the like, the desired lens data may be acquired by performing a search using, for example, the manufacturer name and product model number of the spectacle lens as keys.
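As a hedged sketch of such a key-based lookup, the lens data described above (lens prescription data and lens shape design data) might be modeled and retrieved as follows. All field names, the catalogue contents, and the manufacturer/model values are hypothetical, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class LensData:
    spherical_power: float   # dioptres (lens prescription data)
    astigmatic_power: float  # dioptres
    refractive_index: float  # lens shape design data
    abbe_number: float

# Stand-in for the data server device / local database mentioned in the text.
CATALOGUE = {
    ("ExampleOptics", "EO-100"): LensData(-2.50, -0.75, 1.60, 42.0),
}

def acquire_lens_data(manufacturer: str, model: str) -> LensData:
    """Look up lens data using manufacturer name and model number as keys."""
    return CATALOGUE[(manufacturer, model)]
```

In practice the catalogue would be queried over a network line or read from the control computer unit's storage device, as the text describes.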
- the image generation means 23 is a function for generating a simulation image, that is, an image that the lens wearer would see through the spectacle lens. As will be described in detail later, the image generation means 23 performs image processing on the captured image obtained by the imaging camera 13 and received by the communication means 21, based on the distance image acquired by the distance image sensor 14 and received by the communication means 21 and on the lens data of the spectacle lens acquired by the data acquisition means 22, thereby generating a simulation image that reflects the appearance through the spectacle lens.
- the image generation means 23 desirably corresponds individually to the left-eye simulation image and the right-eye simulation image, that is, it is desirably capable of generating both a left-eye simulation image and a right-eye simulation image. Moreover, the image generation means 23 desirably supports generation of simulation images from moving images.
- the communication means 21, data acquisition means 22, and image generation means 23 described above are realized by the control computer unit 20 having a function as a computer device executing a predetermined program (that is, a simulation execution program).
- the simulation execution program is installed in the HDD or the like of the control computer unit 20 and used; prior to installation, it may be provided through a network line connected to the control computer unit 20, or it may be provided stored in a storage medium readable by the control computer unit 20.
- <Simulation processing procedure> Next, the execution procedure of the simulation process performed to allow the lens wearer to experience the wearing state of the spectacle lens in a simulated manner with the simulation apparatus configured as described above will be described.
- FIG. 3 is a flowchart illustrating an example of an execution procedure of a simulation process performed by the simulation apparatus according to the first embodiment.
- the simulation process described in the first embodiment can be broadly divided into a pre-processing step (S1), an original image acquisition step (S2), a distortion information specifying step (S3), a blur information specifying step (S4), an image processing step (S5), and a simulation image output step (S6).
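The per-frame flow of those steps might be organized roughly as follows. This is an illustrative skeleton with placeholder functions, not the patent's implementation; every name is hypothetical:

```python
# Sketch of one frame of the simulation process (steps S3-S6).
# S1 (lens data acquisition) and S2 (image + distance image capture)
# are assumed to have produced the three arguments below.

def run_simulation_frame(lens_data, captured_image, distance_image):
    distortion = specify_distortion(lens_data, distance_image)            # S3
    psf_map = specify_blur(lens_data, distance_image)                     # S4
    simulated = apply_image_processing(captured_image, distortion, psf_map)  # S5
    return simulated  # handed to the image display in S6

def specify_distortion(lens_data, distance_image):
    return {"displacement": "per-pixel image-side/object-side correspondence"}

def specify_blur(lens_data, distance_image):
    return {"psf": "per-pixel point spread function"}

def apply_image_processing(image, distortion, psf_map):
    # Placeholder: applying the displacement map and convolving with the
    # per-pixel PSF would go here.
    return image
```

The real apparatus repeats this for every frame of the moving image, once for the left eye and once for the right.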
- the lens data is acquired by the data acquisition means 22 for the spectacle lens that the lens wearer plans to wear.
- the data acquisition means 22 acquires lens data for each of the left eyeglass lens and the right eyeglass lens.
- the spectacle lens that the lens wearer intends to wear may be specified in accordance with the operation contents of the operation unit (not shown) of the HMD unit 10 or the control computer unit 20.
- in the original image acquisition step (S2), an object (subject) existing in the field of view of the lens wearer, with the HMD unit 10 mounted on the lens wearer's head, is imaged by the imaging camera 13 incorporated in the HMD unit 10.
- the captured image obtained by the imaging camera 13 is received by the communication means 21, and the captured image is used as the original image on which the simulation image is based.
- in the control computer unit 20, if the imaging camera 13 is provided individually for each of the left and right eyes of the lens wearer, a left-eye original image corresponding to the visual field of the lens wearer's left eye and a right-eye original image corresponding to the visual field of the right eye are obtained.
- the distance image sensor 14 acquires a distance image having the same angle of view as the image pickup result in accordance with the image pickup by the image pickup camera 13, and the communication means 21 receives the distance image.
- that is, the control computer unit 20 acquires the left-eye original image and the right-eye original image based on the imaging result of the imaging camera 13, together with distance information (distance magnitude) for each pixel constituting the left-eye original image and for each pixel constituting the right-eye original image. This acquisition is performed with one frame as one unit, individually for each frame.
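A minimal sketch of the per-frame data just described, where each frame pairs an original image with a same-sized distance image so that every pixel carries distance information. Pure-Python lists stand in for real image buffers; the function and its defaults are illustrative:

```python
# One frame as one unit: an RGB original image plus a distance image of the
# same dimensions, so depth[y][x] is the distance for the pixel at (x, y).

def make_frame(width, height, rgb_fill=(0, 0, 0), depth_fill=1.0):
    """Build an RGB image and a same-sized distance image (metres)."""
    rgb = [[rgb_fill for _ in range(width)] for _ in range(height)]
    depth = [[depth_fill for _ in range(width)] for _ in range(height)]
    return rgb, depth

rgb, depth = make_frame(4, 3, depth_fill=2.5)
```

Because the distance image shares the angle of view of the camera image, no per-pixel matching step is needed before the simulation processing.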
- in the distortion information specifying step (S3), the image generation means 23 specifies, for each of the left-eye spectacle lens and the right-eye spectacle lens, the mode of occurrence of the image distortion that arises when the lens wearer looks through the spectacle lens scheduled to be worn. When viewed through a spectacle lens, light rays are refracted; in other words, an object point seen at one position with the naked eye moves to another position when viewed through the spectacle lens. This movement causes the image distortion.
- specifically, the position of an arbitrary light transmission point in the spectacle lens is recognized based on the lens data acquired in the pre-processing step (S1), and, using this together with the distance image acquired in the original image acquisition step (S2), the mode of occurrence of the distortion can be specified by, for example, a ray tracing method.
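Ray tracing through a spectacle lens repeatedly applies refraction at the lens surfaces. As a hedged illustration of that underlying operation (a single-surface sketch, not the patent's method), Snell's law can be computed as follows:

```python
import math

# Snell's law at one refracting surface: n1*sin(t1) = n2*sin(t2).
# Angles are measured from the surface normal, in radians.
# A full ray tracer would repeat this at each lens surface along the ray.

def refract_angle(theta_in: float, n1: float, n2: float) -> float:
    """Angle of the refracted ray leaving a surface between media n1 and n2."""
    s = n1 * math.sin(theta_in) / n2
    if abs(s) > 1.0:
        raise ValueError("total internal reflection")
    return math.asin(s)

# A ray hitting an n = 1.60 lens surface at 30 degrees bends toward the normal.
t2 = refract_angle(math.radians(30), 1.0, 1.60)
```

The accumulated bending over both surfaces is what moves an object point to a different apparent position, producing the distortion described above.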
- the distance image acquired by the distance image sensor 14 specifies the distance between the distance image sensor 14 and the subject.
- the distance specified by the distance image of the distance image sensor 14 may be used as-is as the distance between the spectacle lens and the subject. Alternatively, the distance between the spectacle lens and the subject may be calculated by applying a predetermined correction calculation to the distance image obtained by the distance image sensor 14.
- the ray tracing method may be performed using a known technique (see Japanese Patent No. 3342423, International Publication No. 2010/044383, etc.), and thus detailed description thereof is omitted here.
- as for the image distortion, it may be possible to reduce the processing load by using, for example, spline interpolation approximation of the ray data.
- the spline interpolation approximation of the ray data may be performed using a known technique (see Japanese Patent No. 3342423, International Publication No. 2010/044383, etc.).
- in the blur information specifying step (S4), the image generation means 23 specifies, for each of the left-eye spectacle lens and the right-eye spectacle lens, the mode of occurrence of the image blur that arises when the lens wearer looks through the spectacle lens to be worn.
- the reason an image blurs when viewed through a spectacle lens is that not all rays from an object point converge on a single point of the retina. That is, the light from the object point forms a light quantity distribution spread over a range centered on the image point. This distribution is referred to as a point spread function (hereinafter abbreviated as “PSF”). Therefore, the mode of occurrence of the image blur can be specified by obtaining the PSF.
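A common simplification, not stated in the patent, models the PSF as a two-dimensional Gaussian whose width grows with defocus (which in turn depends on the distance to the object point). A minimal sketch under that assumption:

```python
import math

# Gaussian approximation of a PSF: light from one object point spreads over
# a small neighbourhood of its image point. sigma would be chosen per pixel
# from the defocus implied by the distance image and the lens data.

def gaussian_psf(size: int, sigma: float):
    """size x size Gaussian kernel, normalized so the weights sum to 1."""
    c = size // 2
    k = [[math.exp(-((x - c) ** 2 + (y - c) ** 2) / (2 * sigma ** 2))
          for x in range(size)] for y in range(size)]
    total = sum(sum(row) for row in k)
    return [[v / total for v in row] for row in k]

psf = gaussian_psf(5, 1.2)  # normalization preserves total luminance
```

Real PSFs computed by ray tracing are generally asymmetric (astigmatism, coma), which is why the patent obtains them per light transmission point rather than assuming a fixed shape.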
- at this time, as in the distortion information specifying step (S3), the image generation means 23 can determine the distance for each constituent element (for example, each pixel) of the original image using the distance image acquired in the original image acquisition step (S2), and can therefore appropriately obtain PSFs that differ depending on the distance. The PSF is obtained by recognizing the position of an arbitrary light transmission point in the spectacle lens based on the lens data acquired in the pre-processing step (S1) and tracing rays to the subject based on the distance image acquired in the original image acquisition step (S2); this may be performed using a known technique (see Japanese Patent No. 3342423, International Publication No. 2010/044383, etc.).
- in the image processing step (S5), processing is performed by the image generation means 23. Specifically, image processing is performed on the left-eye original image to reflect the distortion specified in the distortion information specifying step (S3) for the left-eye spectacle lens and the blur specified by the PSF obtained in the blur information specifying step (S4); the image obtained by this image processing is set as the left-eye simulation image. Similarly, image processing is performed on the right-eye original image to reflect the distortion specified in the distortion information specifying step (S3) for the right-eye spectacle lens and the blur specified by the PSF obtained in the blur information specifying step (S4); the image obtained is set as the right-eye simulation image.
- the image processing in the image processing step (S5) may be performed as follows, for example.
- the distortion can be reflected by obtaining the correspondence between the image side and the object (subject) side for all pixels in the field of view and applying (moving) the luminance information of the original image based on that correspondence. As a result, a distortion image in which the distortion information is reflected in the original image is obtained.
- the blur can be reflected by distributing the luminance of each pixel to peripheral pixels based on the PSF and reconstructing the luminance of all the pixels of the image.
- Such a process is also called a convolution operation. That is, in the image processing step (S5), a simulation image is generated from the original image by performing a convolution operation between the distortion image and the PSF of each pixel.
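A minimal sketch of such a convolution operation on a grayscale image, using one uniform kernel for all pixels (the patent's processing would use a PSF that varies per pixel). Pure-Python lists keep the example self-contained:

```python
# Convolution: each pixel's luminance is distributed to its neighbours
# according to the kernel (here standing in for the PSF), and every output
# pixel is reconstructed as the weighted sum of contributions it receives.

def convolve2d(image, kernel):
    h, w = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    oy, ox = kh // 2, kw // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for ky in range(kh):
                for kx in range(kw):
                    sy, sx = y + ky - oy, x + kx - ox
                    if 0 <= sy < h and 0 <= sx < w:  # ignore out-of-frame pixels
                        acc += image[sy][sx] * kernel[ky][kx]
            out[y][x] = acc
    return out

# A single bright pixel spread by a uniform 3x3 kernel: blur in miniature.
blurred = convolve2d([[0, 0, 0], [0, 9, 0], [0, 0, 0]],
                     [[1 / 9] * 3 for _ in range(3)])
```

With a per-pixel PSF, the inner kernel would be looked up for each source pixel, which is why the distance image matters: pixels at different distances blur differently.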
- the detailed technique of the image processing performed in the image processing step (S5) may be performed using a known technique (see Japanese Patent No. 3342423, International Publication No. 2010/044383, etc.). The description is omitted here.
- in the simulation image output step (S6), the communication means 21 sends the left-eye simulation image and the right-eye simulation image generated in the image processing step (S5) to the image display 15.
- the image display 15 displays and outputs the left-eye simulation image on the left-eye display panel and the right-eye simulation image on the right-eye display panel.
- the simulation apparatus repeatedly performs such a series of simulation processes for each frame constituting the moving image.
- as a result, the lens wearer wearing the HMD unit 10 on the head can have a simulated experience of the spectacle lens wearing state by visually recognizing a simulation image that reflects the appearance through the spectacle lens.
- as described above, the simulation apparatus includes the imaging camera 13 and uses its imaging result as the original image on which the simulation image is based. Therefore, to allow the lens wearer to experience the spectacle lens wearing state in a simulated manner, real-time display is possible using an image based on the actual space in which the lens wearer is present, not an image of a virtual space drawn by CG, while dynamically following the wearer's field of view.
- in addition, since the distance image sensor 14 acquires a distance image having the same angle of view as the imaging result of the imaging camera 13, the distance information (distance magnitude) for each pixel constituting the original image can be grasped based on the distance image, and the difference in appearance corresponding to the difference in distance can be correctly reflected. That is, by using the distance image acquired by the distance image sensor 14, unlike the case where distance data is detected pixel by pixel using a general distance measuring device, the distance information for every pixel constituting a frame can be obtained with one frame as a unit; an excessive processing load during simulation processing can therefore be avoided, which makes the apparatus very suitable for real-time display of simulation images.
- therefore, with the simulation apparatus of the first embodiment, the lens wearer can experience the lens wearing state while observing an image that matches the reality of real space, and even so a simulation image that correctly reflects the difference in appearance corresponding to the difference in distance can be generated.
- furthermore, the image display 15, the lens data acquired by the data acquisition means 22, and the simulation image generated by the image generation means 23 individually correspond to the left and right eyes of the lens wearer, so different simulation images can be displayed and output for the left and right eyes. Therefore, according to the simulation apparatus of the first embodiment, even when different spectacle lenses are prescribed for the left and right eyes, for example, a simulation image that accurately reflects the appearance through each spectacle lens can be displayed and output to the wearer. In addition, if the simulation images corresponding to the left and right eyes take the convergence angle of the eyes into account, correspondence to stereoscopic, so-called 3D, images can also be realized.
- furthermore, in the simulation apparatus of the first embodiment, not only the image display 15, the lens data, and the simulation image, but also the imaging camera 13 and the distance image sensor 14 in the HMD unit 10 correspond individually to the left and right eyes of the lens wearer. Therefore, even when different simulation images are displayed and output for the left and right eyes as described above, the original images obtained by the imaging camera 13 and the distance images obtained by the distance image sensor 14 can be used as they are. That is, there is no need for complicated processing such as applying data correction to a single original image to obtain a left-eye original image and a right-eye original image, so the processing load during simulation processing can be reduced compared with the case where such processing is required.
- furthermore, since the simulation apparatus of the first embodiment handles moving images, the lens wearer visually recognizes a simulation image that follows the field of view, unlike with a still image. With a still image, even if the lens wearer changes the orientation of the head, a simulation image corresponding to the field of view before the change continues to be displayed and output; with a moving image, the display output of the simulation image can be switched in accordance with the change in the orientation of the lens wearer's head.
- in addition, the imaging camera 13, the distance image sensor 14, and the image display 15 are incorporated in the housing 11 of the HMD unit 10. Therefore, when the lens wearer wears the HMD unit 10 on the head, the lens wearer can visually recognize, as a simulation image, an accurate reflection of how the objects (subjects) existing in the field of view in the direction the head is facing would appear through the spectacle lens.
- For the simulated experience of the spectacle-lens wearing state, the lens wearer only has to wear the HMD unit 10 on the head, which is very convenient.
- In the first embodiment, the imaging camera 13, the distance image sensor 14, and the like are provided individually for the left and right eyes of the lens wearer as an example, but they may also be configured to be shared. For example, even when a single imaging camera 13 is shared by the left and right eyes, if distance images are acquired for each eye, a left-eye original image and a right-eye original image can be obtained by performing data correction processing on the single original image based on the distance information specified from each distance image. Specifically, from an image captured by one imaging camera 13 placed around the midpoint between the left and right eyes and its distance image, a left-eye original image and a right-eye original image as seen from the respective viewpoints of the two eyes can be generated.
- In that case, since both original images are generated from a single captured image, alignment adjustment of the imaging results of multiple imaging cameras 13 becomes unnecessary, unlike the case where a plurality of imaging cameras 13 are used.
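The single-camera view synthesis described above can be sketched as a simple depth-based forward warp. This is only an illustrative sketch under a pinhole-camera assumption, not the patented correction process; the function name and the absence of occlusion filling are assumptions introduced here.

```python
import numpy as np

def synthesize_eye_view(image, depth, eye_offset_m, focal_px):
    """Warp a center-camera image to a laterally shifted eye viewpoint.

    image:        H x W array of luminance values from the shared camera
    depth:        H x W array of per-pixel distances in meters (distance image)
    eye_offset_m: lateral offset of the eye from the camera (e.g. +/- PD / 2)
    focal_px:     camera focal length in pixels
    """
    h, w = image.shape
    out = np.zeros_like(image)
    # Per-pixel disparity: closer objects shift more between viewpoints.
    disparity = np.round(focal_px * eye_offset_m / depth).astype(int)
    for y in range(h):
        for x in range(w):
            nx = x + disparity[y, x]
            if 0 <= nx < w:
                out[y, nx] = image[y, x]  # simple forward warp, no hole filling
    return out
```

A full implementation would additionally fill the disocclusion holes that appear behind foreground objects.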
- As for the distance image sensor 14, the distance to the subject does not differ greatly between the left and right eyes, so it is conceivable to share a single sensor placed around the midpoint between the two eyes.
- the imaging camera 13 is fixed at a position where it can capture an angle of view corresponding to the field of view of the lens wearer.
- the imaging camera 13 may be movably supported by a mechanism including a drive source such as a motor.
- In that case, the positions of the imaging cameras 13 provided individually for the left and right eyes of the lens wearer can be moved so that the spacing between the cameras matches the lens wearer's interpupillary distance (PD).
- Furthermore, when the lens wearer looks at a near object and the eyes converge, the imaging cameras 13 can be turned so that their angles of view are directed inward in accordance with the convergence of the left and right eyes.
- FIG. 4 is an explanatory diagram illustrating a schematic configuration example of the entire simulation apparatus according to the second embodiment.
- the simulation apparatus according to the second embodiment is roughly configured to include an information processing terminal unit 30 and a control computer unit 20. That is, the simulation apparatus in the second embodiment includes an information processing terminal unit 30 instead of the HMD unit 10 described in the first embodiment.
- The information processing terminal unit 30 consists of a portable information processing terminal such as a tablet, a smartphone, or a PDA (Personal Digital Assistant), and is configured so that the lens wearer can hold and use it by hand.
- the information processing terminal unit 30 includes a plate-shaped (tablet-shaped) housing 31 having a size that can be held by a lens wearer.
- The housing 31 incorporates an image display 32 on one surface (the surface viewed by the lens wearer), and an imaging camera 33 and a distance image sensor 34 on the other surface (the surface not viewed by the lens wearer).
- the image display 32 is provided with a function as a touch panel that is operated by a lens wearer to input information.
- A function as a computer device that performs information processing specified by a predetermined program is also incorporated in the housing 31 (not shown). With these functions, the information processing terminal unit 30 can communicate with the outside.
- The image display 32 displays an image for the lens wearer in the same manner as the image display 15 described in the first embodiment.
- The housing 31 incorporates only one image display 32, which selectively displays either the left-eye simulation image or the right-eye simulation image.
- the image display 32 is not necessarily limited to one, and may be configured by a left-eye display panel and a right-eye display panel as in the case of the first embodiment. Further, one display screen may be divided into regions so as to individually correspond to the left-eye simulation image and the right-eye simulation image.
- the imaging camera 33 captures an image in accordance with the reality of the real space where the lens wearer is present, like the imaging camera 13 described in the first embodiment.
- Only one imaging camera 33 is incorporated in the housing 31; it selectively captures either an angle of view corresponding to the lens wearer's left-eye field of view or an angle of view corresponding to the right-eye field of view.
- The imaging camera 33 is not necessarily limited to one; cameras may be provided individually for the left and right eyes as in the first embodiment.
- The distance image sensor 34 acquires a distance image having the same angle of view as the imaging result of the imaging camera 33, like the distance image sensor 14 described in the first embodiment. As with the imaging camera 33, only one distance image sensor 34 is incorporated in the housing 31. When the imaging camera 33 is provided individually for each of the left and right eyes, it is desirable that the distance image sensor 34 also be provided individually for each eye.
- It is desirable that the imaging camera 33 and the distance image sensor 34 be configured integrally rather than separately, that is, as a monocular camera (sensor). This is because a monocular configuration makes it easier to reduce the size and weight of the information processing terminal unit 30 than a multi-lens configuration. Therefore, as the imaging camera 33 and the distance image sensor 34, it is conceivable to use, for example, a CMOS sensor that can simultaneously acquire an ordinary RGB image and a distance image carrying distance information. This CMOS sensor is configured by integrating pixels for distance image acquisition (Z pixels) with RGB pixels.
- With such a configuration, the RGB image and the distance image can be acquired simultaneously with a single CMOS sensor (see, for example, URL: http://www.nikkei.com/article/DGXNASFK24036_U2A220C1000000/).
- Alternatively, it is conceivable to use, for example, an image sensor for monocular 3D cameras that enables 3D imaging with a single image sensor.
- This image sensor alternately arranges, column by column, left-eye/right-eye pixels that convert light incident from the left and right directions into separate electrical signals, making it possible to obtain left-eye and right-eye images simultaneously with a single image sensor.
- control computer unit 20 has a function as a computer device that performs information processing instructed by a predetermined program.
- The control computer unit 20 may be incorporated in the housing 31 of the information processing terminal unit 30, that is, it may use the function as a computer device within the housing 31; however, it is desirably provided separately from the information processing terminal unit 30.
- In that case, the control computer unit 20 can communicate with the information processing terminal unit 30 via a wireless or wired communication line (for example, a public wireless LAN). That is, viewed from the information processing terminal unit 30, the control computer unit 20 realizes so-called cloud computing.
- control computer unit 20 includes functions as a communication unit 21, a data acquisition unit 22, and an image generation unit 23. Since these functions 21 to 23 are the same as those in the first embodiment, the description thereof is omitted here.
- Simulation processing procedure: Next, the execution procedure of the simulation processing performed in the simulation apparatus of the second embodiment, in order to give the lens wearer a simulated experience of wearing the spectacle lens, will be described.
- The following description takes as an example the case where only one imaging camera 33 and one distance image sensor 34 are incorporated in the information processing terminal unit 30.
- The simulation process described in the second embodiment includes a preprocessing step (S1), an original image acquisition step (S2), a distortion information specifying step (S3), a blur information specifying step (S4), an image processing step (S5), and a simulation image output step (S6).
- S1 preprocessing step
- S2 original image acquisition step
- S3 distortion information specifying step
- S4 blur information specifying step
- S5 image processing step
- S6 simulation image output step
- The lens wearer operates the information processing terminal unit 30 to access the control computer unit 20 and designates, from the terminal, which of the left and right eyes is to be processed. Receiving this designation, the control computer unit 20 acquires, via the data acquisition means 22, lens data for either the left-eye or the right-eye spectacle lens.
- The control computer unit 20 receives the captured image via the communication means 21 and uses it as the original image on which the simulation image is based; it also receives the distance image via the communication means 21 and grasps the distance information (distance) for each pixel constituting the captured image.
- the subsequent distortion information specifying step (S3), blur information specifying step (S4), and image processing step (S5) are the same as in the first embodiment.
- the control computer unit 20 generates a simulation image corresponding to the imaging result of the information processing terminal unit 30.
- the communication means 21 sends the simulation image generated in the image processing step (S5) to the information processing terminal unit 30.
- The image display 32 then displays the simulation image for the lens wearer.
- When the designation of which of the left and right eyes is to be processed is switched on the information processing terminal unit 30, the information processing terminal unit 30 and the control computer unit 20 repeat the series of processing steps described above for the designation after switching.
- The lens wearer operating the information processing terminal unit 30 can thus have a simulated experience of the spectacle-lens wearing state by visually recognizing the simulation image reflecting the appearance through the spectacle lens.
- With the simulation apparatus of the second embodiment, the lens wearer can experience the lens wearing state while observing an image matching the reality of real space, and even in that case a simulation image correctly reflecting differences in appearance due to differences in distance can be generated. That is, the second embodiment obtains the same effects as the first embodiment.
- the simulation apparatus includes an information processing terminal unit 30 instead of the HMD unit 10 described in the first embodiment.
- the housing 31 constituting the information processing terminal unit 30 incorporates at least an image display display 32, an imaging camera 33, and a distance image sensor 34.
- The information processing terminal unit 30 can communicate with the control computer unit 20, which realizes cloud computing. Therefore, according to the simulation apparatus of the second embodiment, the lens wearer can have a simulated experience of wearing the spectacle lens by accessing the control computer unit 20 from the information processing terminal unit 30. That is, the lens wearer can easily experience the spectacle-lens wearing state using, for example, his or her own information processing terminal unit 30, which is very convenient.
- Because the information processing terminal unit 30 owned by the lens wearer can be used by the side providing the simulated-experience service of the lens wearing state (a spectacle store, a spectacle lens manufacturer, etc.), the service can be provided at low cost without requiring a special terminal apparatus.
- Since the simulation apparatus of the second embodiment displays the simulation image on the image display 32 of the information processing terminal unit 30, multiple people can view the simulation image at the same time, unlike the case based on the HMD unit 10 described in the first embodiment. Therefore, according to the simulation apparatus of the second embodiment, it becomes feasible, for example, for the lens wearer and a clerk at a spectacle store to view the simulation image simultaneously while the clerk explains its characteristics to the wearer, enabling an interactive simulated-experience service of the lens wearing state.
- the simulation image displayed on the image display 32 of the information processing terminal unit 30 is an image corresponding to the entire field of view of the lens wearer.
- “full field of view” refers to a range of viewing angles that a lens wearer can visually recognize through a spectacle lens, for example, a range of about 90 ° in the horizontal direction and about 70 ° in the vertical direction.
- The image display 32 of the information processing terminal unit 30 may not necessarily have a sufficient screen size for displaying an image corresponding to the lens wearer's entire field of view. For this reason, the image display 32 may selectively display a simulation image for a partial visual field corresponding to part of the entire field of view.
- FIG. 5 is an explanatory diagram illustrating a display example of a simulation image by the simulation apparatus according to the second embodiment.
- FIG. 5A illustrates a state in which, when the spectacle lens is a progressive power lens, a partial region including the central portion among the right, central, and left portions of the far vision region is displayed on the image display 32 as a partial visual field corresponding to part of the entire field of view.
- FIG. 5B illustrates a state in which, when the spectacle lens is a progressive power lens, a partial region including the central portion among the right, central, and left portions of the near vision region is displayed on the image display 32 as a partial visual field corresponding to part of the entire field of view.
- As represented by these display examples, when the spectacle lens is a progressive power lens, the image display 32 can, for example, divide the entire visual field region into nine regions and selectively display any one of the right, central, and left portions of the far vision region, the near vision region, and the intermediate vision region, with each belonging to a separate region.
- Each of these partial visual field regions need only correspond to part of the entire visual field region, and the partial visual field regions may have image portions that overlap one another.
- Such selective display can be realized by having the control computer unit 20 divide the original image acquired in the original image acquisition step (S2) into a plurality of regions based on a predetermined division mode and perform the image processing of the image processing step (S5) for each region. When selective display is performed for each partial visual field, display switching on the image display 32 (switching of the partial visual field) may be performed using a known operation function provided in the information processing terminal unit 30.
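The region division described above can be illustrated with a short sketch. The 3 x 3 grid and the helper name are assumptions introduced here for illustration; the actual division mode used by the apparatus is a matter of design.

```python
import numpy as np

def split_into_partial_fields(original, rows=3, cols=3):
    """Divide the original image into rows x cols partial visual fields.

    Returns a dict keyed by (row, col); in a 3 x 3 division, (0, 1) would
    be the central portion of the far vision region, (2, 1) the central
    portion of the near vision region, and so on.
    """
    h, w = original.shape[:2]
    ys = np.linspace(0, h, rows + 1, dtype=int)  # row boundaries
    xs = np.linspace(0, w, cols + 1, dtype=int)  # column boundaries
    return {(r, c): original[ys[r]:ys[r + 1], xs[c]:xs[c + 1]]
            for r in range(rows) for c in range(cols)}
```

Each returned region can then be processed and displayed independently, with the terminal's operation function switching between keys.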
- By displaying only a partial visual field in this way, the display output can be performed without reducing the simulation image.
- Moreover, the entire field of view can still be visually recognized by the lens wearer through display switching. That is, the selective display for each partial visual field exemplified here is very suitable when the image display 32 does not have a sufficient screen size, and has a very high affinity with the information processing terminal unit 30, in which portability and transportability are emphasized.
- The simulation image displayed on the image display 32 of the information processing terminal unit 30 may have a contour image of the clearness index of the spectacle lens superimposed on it, as shown in FIGS. 5A and 5B.
- The "clearness index" is one index for evaluating the performance of spectacle lenses (particularly progressive power lenses). Since the details of the clearness index are based on a known technique (see, for example, Japanese Patent No. 3919097), the description is omitted here.
- The superposition can be realized by having the control computer unit 20 obtain the clearness index of the spectacle lens based on the lens data acquired in the preprocessing step (S1) and composite the contour lines of the clearness index in the image processing step (S5).
- Superimposing such a clearness-index contour image makes it easier to grasp the characteristics of the lens's visual properties than when no contour image is superimposed.
- Even if the image display 32 has a low resolution, the lens wearer can be given a simulated experience of the lens wearing state using a small, lightweight, and inexpensive information processing terminal unit 30. Furthermore, if the contour image is superimposed, subtle differences in visual properties between spectacle lenses can be made apparent through the differences in the superimposed contour images.
Abstract
Description
To achieve this objective, the present inventors first examined using, as the original image on which the simulated image formation is based, an image that matches the reality of real space. For this, one could use an imaging camera that captures images matching the reality of real space. An image matching the reality of real space here means an image for displaying, in real time, objects (subjects) within the field of view of the prospective wearer of the spectacle lens while dynamically following that field of view. However, as described above, even when stereo vision with multiple imaging cameras is used, the distance data that can be acquired is limited and the images that can be simulated are therefore restricted; camera imaging results alone are thus not suitable for a simulation that gives a simulated experience of the lens wearing state.
If the only requirement were to determine distance, a distance measuring device such as the widely used infrared length measurement sensor could be employed. However, the distance must be determined for each component (for example, each pixel) of the original image. Detecting distance data for every pixel with a general-purpose distance measuring device is therefore hardly realistic, considering the information processing capability of an ordinary simulation apparatus.
On this point, the present inventors conducted further intensive study. As a result, they arrived at a new idea not found in the prior art: in a simulation application that gives a prospective spectacle-lens wearer a simulated experience of the lens wearing state, in order to display objects within the wearer's field of view in real time while dynamically following that field of view, a distance image sensor of the kind used in surveillance systems and game devices is deliberately applied to a simulation apparatus that requires analytical processing such as ray tracing of the optical system, and that distance image sensor acquires a distance image with the same angle of view as the imaging result of the imaging camera.
The present invention has been made based on this new idea by the present inventors.
A second aspect of the present invention is characterized in that, in the invention according to the first aspect, at least the image generation means and the image display correspond individually to the left-eye simulation image and the right-eye simulation image.
A third aspect of the present invention is characterized in that, in the invention according to the second aspect, the imaging camera and the distance image sensor are also provided individually, corresponding to the left and right eyes.
A fourth aspect of the present invention is characterized in that, in the invention according to the first, second, or third aspect, the images handled by the imaging camera, the distance image sensor, the image generation means, and the image display are moving images.
A fifth aspect of the present invention is characterized in that, in the invention according to any one of the first to fourth aspects, at least the imaging camera, the distance image sensor, and the image display are incorporated in a housing wearable on the wearer's head.
A sixth aspect of the present invention is characterized in that, in the invention according to the fifth aspect, a camera drive mechanism that changes the arrangement of the imaging camera is provided.
A seventh aspect of the present invention is characterized in that, in the invention according to any one of the first to fourth aspects, at least the imaging camera, the distance image sensor, and the image display are incorporated in a portable information processing terminal.
In the first embodiment, the description is divided into the following sections.
1. Schematic configuration of the simulation apparatus
2. Functional configuration of the simulation apparatus
3. Simulation processing procedure
4. Effects of the first embodiment
5. Modifications and the like
First, the schematic configuration of the entire simulation apparatus in the first embodiment will be described.
FIG. 1 is an explanatory diagram showing a schematic configuration example of the entire simulation apparatus in the first embodiment.
The HMD unit 10 includes a housing 11 and a mounting band 12 connected to it, so that it can be worn on the head of a prospective spectacle-lens wearer (hereinafter simply "lens wearer"). The housing 11 incorporates the imaging camera 13, the distance image sensor 14, and the image display 15.
Like the imaging camera 13, the distance image sensor 14 is desirably capable of handling moving images; if it can also handle high-frame-rate video, it can support smooth video, which is even more desirable.
The distance image sensor 14 need not necessarily be separate from the imaging camera 13. That is, the distance image sensor 14 may be integrated with the imaging camera 13, for example by using a camera device that can simultaneously acquire an ordinary two-dimensional RGB image and a distance image carrying distance information.
The control computer unit 20 functions as a computer device that performs information processing specified by a predetermined program; specifically, it is configured by a combination of a CPU (Central Processing Unit), an HDD (hard disk drive), ROM (Read Only Memory), RAM (Random Access Memory), an external interface (I/F), and the like. The control computer unit 20 may be incorporated in the housing 11 of the HMD unit 10, or may be provided separately from the HMD unit 10. When provided separately, the control computer unit 20 is assumed to be able to communicate with the HMD unit 10 via a wired or wireless communication line.
Next, the functional configuration of the simulation apparatus in the first embodiment will be described.
FIG. 2 is a block diagram showing a functional configuration example of the simulation apparatus in the first embodiment.
Next, the execution procedure of the simulation processing performed in the simulation apparatus configured as above, in order to give the lens wearer a simulated experience of wearing the spectacle lens, will be described.
The simulation processing described in the first embodiment is broadly divided into a preprocessing step (S1), an original image acquisition step (S2), a distortion information specifying step (S3), a blur information specifying step (S4), an image processing step (S5), and a simulation image output step (S6).
Image distortion can be specified as follows: in a virtual three-dimensional coordinate space, the position of an arbitrary light transmission point on the spectacle lens is recognized based on the lens data acquired in the preprocessing step (S1), and the distance from the spectacle lens to the subject is recognized based on the distance image acquired in the original image acquisition step (S2); the manner in which distortion arises can then be specified, for example, by using a ray tracing technique. Here, the distance image acquired by the distance image sensor 14 specifies the distance between the distance image sensor 14 and the subject. However, since the distance image sensor 14 and the spectacle lens are located close to each other, the distance specified by the distance image may simply be treated as the distance between the spectacle lens and the subject. Alternatively, the distance between the spectacle lens and the subject may be calculated by correcting the distance image from the distance image sensor 14 through a predetermined arithmetic process. In this way, the distance to the spectacle lens is known for each component (for example, each pixel) of the original image, so the image distortion can be specified. As for the ray tracing technique itself, a known technique may be used (see Japanese Patent No. 3342423, International Publication No. WO 2010/044383, and the like), so a detailed description is omitted here.
In specifying the image distortion, the processing load can also be reduced, for example, by using spline interpolation approximation of ray data. The spline interpolation approximation of ray data may likewise be performed using a known technique (see Japanese Patent No. 3342423, International Publication No. WO 2010/044383, and the like).
However, even when viewing through the same position on the spectacle lens, the PSF differs if the distance to the object point differs. In the image generation means 23, the distance for each component (for example, each pixel) of the original image is known from the distance image acquired in the original image acquisition step (S2), as in the distortion information specifying step (S3), so a PSF that differs depending on that distance can be determined appropriately. The PSF is determined by recognizing the position of an arbitrary light transmission point on the spectacle lens based on the lens data acquired in the preprocessing step (S1) and the distance to the subject based on the distance image acquired in the original image acquisition step (S2), and then applying a known technique (see Japanese Patent No. 3342423, International Publication No. WO 2010/044383, and the like).
Distortion can be reflected by obtaining, for every pixel in the field of view, the correspondence between the image side and the object (subject) side, and assigning (moving) the luminance information of the original image based on that correspondence. This yields a distorted image in which the distortion information is reflected in the original image.
Blur can be reflected by distributing the luminance of each pixel to surrounding pixels based on the PSF and reconstructing the luminance of all pixels of the image. Such processing is also called convolution.
That is, in the image processing step (S5), a simulation image is generated from the original image by performing a convolution of the distorted image with the PSF of each pixel.
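The scatter-style convolution described above, in which each pixel's luminance is distributed to its neighbors according to that pixel's own PSF, can be sketched as follows. This is an illustrative simplification, assuming the per-pixel PSFs have already been computed (for example by ray tracing) as small normalized kernels.

```python
import numpy as np

def apply_psf(distorted, psfs):
    """Distribute each pixel's luminance to its neighbors using that
    pixel's own PSF (a small 2D kernel), reconstructing all luminances.

    distorted: H x W luminance image (distortion already applied)
    psfs:      H x W array of k x k kernels, each summing to 1
    """
    h, w = distorted.shape
    k = psfs.shape[2]
    r = k // 2
    # Pad the accumulator so border pixels can spread outward safely.
    out = np.zeros((h + 2 * r, w + 2 * r))
    for y in range(h):
        for x in range(w):
            # Spread this pixel's luminance according to its PSF.
            out[y:y + k, x:x + k] += distorted[y, x] * psfs[y, x]
    return out[r:h + r, r:w + r]
```

Because the kernel varies per pixel, this cannot be replaced by a single global FFT convolution; the loop (or a vectorized equivalent) is the essential structure.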
Details of the image processing performed in the image processing step (S5) may be carried out using known techniques (see Japanese Patent No. 3342423, International Publication No. WO 2010/044383, and the like), so the description is omitted here.
According to the simulation apparatus described in the first embodiment, the following effects are obtained.
The first embodiment of the present invention has been described above, but the above disclosure merely shows an exemplary embodiment of the present invention, and the technical scope of the present invention is not limited to the exemplary embodiment described above.
Modifications other than the embodiment described above are described below.
For example, even when a single imaging camera 13 is shared by the left and right eyes, if distance images are acquired for each of the left and right eyes, a left-eye original image and a right-eye original image can be obtained by performing data correction processing on the single original image, based on the distance information specified from each distance image. Specifically, from an image captured by one imaging camera 13 placed around the midpoint between the left and right eyes and its distance image, a left-eye original image and a right-eye original image as seen from the respective viewpoints of the two eyes can be generated. In that case, since both original images are generated from a single original image, there is the advantage that, unlike when multiple imaging cameras 13 are used, alignment adjustment of the imaging results of the individual cameras becomes unnecessary.
As for the distance image sensor 14, for example, since the distance to the subject does not differ greatly between the left and right eyes, a single sensor placed around the midpoint between the eyes could be shared.
In that case, for example, the positions of the imaging cameras 13 provided individually for the left and right eyes of the lens wearer can be moved so that the spacing between the cameras matches the lens wearer's interpupillary distance (PD).
Furthermore, for example, when the lens wearer looks at a near object and the left and right eyes converge, the imaging cameras 13 can be turned so that their angles of view are directed inward in accordance with the convergence.
Next, a second embodiment of the present invention will be described. Here, mainly the differences from the first embodiment described above are explained.
6. Schematic configuration of the simulation apparatus
7. Functional configuration of the simulation apparatus
8. Simulation processing procedure
9. Effects of the second embodiment
10. Modifications and the like
First, the schematic configuration of the entire simulation apparatus in the second embodiment will be described.
FIG. 4 is an explanatory diagram showing a schematic configuration example of the entire simulation apparatus in the second embodiment.
The information processing terminal unit 30 consists of a portable information processing terminal such as a tablet, a smartphone, or a PDA (Personal Digital Assistant), and is configured so that the lens wearer can hold and use it by hand. For this purpose, the information processing terminal unit 30 includes a plate-shaped (tablet-shaped) housing 31 of a size the lens wearer can hold. The housing 31 incorporates an image display 32 on one surface (the surface viewed by the lens wearer), and an imaging camera 33 and a distance image sensor 34 on the other surface (the surface not viewed by the lens wearer). The image display 32 is provided with a touch panel function operated by the lens wearer to input information. The housing 31 also incorporates (not shown) a function as a computer device that performs information processing specified by a predetermined program. With these functions, the information processing terminal unit 30 can communicate with the outside.
For this purpose, as the imaging camera 33 and the distance image sensor 34, it is conceivable to use, for example, a CMOS sensor that can simultaneously acquire an ordinary RGB image and a distance image carrying distance information. This CMOS sensor is configured by integrating pixels for distance image acquisition (Z pixels) with RGB pixels. With such a configuration, an RGB image and a distance image can be acquired simultaneously with a single CMOS sensor (see, for example, URL: http://www.nikkei.com/article/DGXNASFK24036_U2A220C1000000/).
Alternatively, as the imaging camera 33 and the distance image sensor 34, it is conceivable to use, for example, an image sensor for monocular 3D cameras that enables 3D imaging with a single image sensor. This image sensor alternately arranges, column by column, left-eye/right-eye pixels that convert light incident from the left and right directions into separate electrical signals, making it possible to obtain left-eye and right-eye images simultaneously with a single image sensor. With such a configuration, stereoscopic information can be extracted by using the disparity between the left-eye and right-eye images, and a distance image can further be extracted from that stereoscopic information, so an RGB image and a distance image can be acquired simultaneously with a single image sensor (see, for example, URL: http://panasonic.co.jp/news/topics/2013/108643.html).
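Extracting a distance image from left-eye/right-eye disparity as mentioned above follows the standard stereo relation depth = focal length × baseline / disparity. A minimal sketch, assuming the per-pixel disparity map has already been obtained by block matching or a similar technique:

```python
import numpy as np

def disparity_to_depth(disparity_px, baseline_m, focal_px):
    """Convert a per-pixel disparity map (in pixels) between the left-eye
    and right-eye images into a distance image (in meters), using the
    standard stereo relation depth = focal * baseline / disparity.
    Zero or negative disparity is mapped to infinity (no depth estimate)."""
    disparity = np.asarray(disparity_px, dtype=float)
    depth = np.full(disparity.shape, np.inf)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth
```

For the tiny sub-pixel baselines of a monocular 3D sensor, the same relation holds, but disparity must be estimated to sub-pixel precision.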
As in the first embodiment, the control computer unit 20 functions as a computer device that performs information processing specified by a predetermined program. The control computer unit 20 may be incorporated in the housing 31 of the information processing terminal unit 30, that is, it may use the computer-device function within the housing 31; however, considering information processing capability and the like, it is desirably provided separately from the information processing terminal unit 30. In that case, the control computer unit 20 is assumed to be able to communicate with the information processing terminal unit 30 via a wireless or wired communication line (for example, a public wireless LAN). In other words, viewed from the information processing terminal unit 30, the control computer unit 20 realizes so-called cloud computing.
In the simulation apparatus of the second embodiment, the control computer unit 20 has the functions of communication means 21, data acquisition means 22, and image generation means 23. Since these functions 21 to 23 are the same as in the first embodiment, their description is omitted here.
Next, the execution procedure of the simulation processing performed in the simulation apparatus of the second embodiment, in order to give the lens wearer a simulated experience of wearing the spectacle lens, will be described. Here, the following description takes as an example the case where only one imaging camera 33 and one distance image sensor 34 are incorporated in the information processing terminal unit 30.
According to the simulation apparatus described in the second embodiment, the following effects are obtained.
The second embodiment of the present invention has been described above, but the above disclosure merely shows an exemplary embodiment of the present invention, and the technical scope of the present invention is not limited to the exemplary embodiment described above.
Modifications other than the embodiment described above are described below.
Incidentally, the information processing terminal unit 30 desirably excels in portability and transportability. Therefore, the image display 32 of the information processing terminal unit 30 may not necessarily have a sufficient screen size to display an image corresponding to the lens wearer's entire field of view.
For this reason, the image display 32 of the information processing terminal unit 30 may selectively display the simulation image for a partial visual field corresponding to part of the entire field of view.
FIG. 5(a) illustrates a state in which, when the spectacle lens is a progressive power lens, a partial region including the central portion among the right, central, and left portions of the far vision region is displayed on the image display 32 as a partial visual field corresponding to part of the entire field of view.
Similarly, FIG. 5(b) illustrates a state in which, when the spectacle lens is a progressive power lens, a partial region including the central portion among the right, central, and left portions of the near vision region is displayed on the image display 32 as a partial visual field corresponding to part of the entire field of view.
As represented by these display examples, when the spectacle lens is a progressive power lens, the image display 32 can, for example, divide the entire visual field region into nine regions and selectively display any one of the right, central, and left portions of the far vision region, the near vision region, and the intermediate vision region, with each belonging to a separate region. Each of these partial visual field regions need only correspond to part of the entire visual field region, and the partial visual field regions may have image portions that overlap one another.
The superposition of the clearness-index contour image can be realized by having the control computer unit 20 obtain the clearness index of the spectacle lens based on the lens data acquired in the preprocessing step (S1), generate an image representing the contour lines of that clearness index in the image processing step (S5), and composite that contour image with the simulation image.
Superimposing such a clearness-index contour image makes it easier to grasp the characteristics of the lens's visual properties than when no contour image is superimposed. This is particularly effective when the resolution of the image display 32 is insufficient: the blur, distortion, and so on reflected in the simulation image may not be fully reproduced on a low-resolution display, but a superimposed contour image can supplement the parts that cannot be fully reproduced. If a low-resolution image display 32 suffices, the lens wearer can be given a simulated experience of the lens wearing state using a small, lightweight, and inexpensive information processing terminal unit 30. Furthermore, if contour images are superimposed, subtle differences in visual properties between spectacle lenses can be made apparent through the differences in the superimposed contour images.
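Burning clearness-index contour lines into the simulation image, as described above, can be sketched minimally as follows. The tolerance-band approach and the function name are assumptions introduced here for illustration; a production implementation would trace true iso-lines from the index field computed from the lens data.

```python
import numpy as np

def overlay_contours(sim_image, clarity, levels, line_value=0.0, tol=0.01):
    """Burn clearness-index contour lines into a grayscale simulation image.

    sim_image: H x W simulation image (modified in place and returned)
    clarity:   H x W clearness-index values derived from the lens data
    levels:    iterable of index values at which to draw contour lines
    tol:       half-width of the band treated as lying on a contour
    """
    for level in levels:
        mask = np.abs(clarity - level) < tol  # pixels near this contour level
        sim_image[mask] = line_value          # draw the contour line
    return sim_image
```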
Claims (7)
- A simulation apparatus that displays a pseudo image that a wearer of a spectacle lens would see through the spectacle lens, the apparatus comprising:
an imaging camera that captures images within the wearer's field of view;
a distance image sensor that acquires a distance image having the same angle of view as the imaging result of the imaging camera;
data acquisition means that acquires lens data of the spectacle lens;
image generation means that performs image processing on the imaging result of the imaging camera based on the distance image and the lens data to generate a simulation image reflecting the appearance through the spectacle lens; and
an image display that displays and outputs the simulation image as the image the wearer would see through the spectacle lens.
- The simulation apparatus according to claim 1, wherein at least the image generation means and the image display correspond individually to a left-eye simulation image and a right-eye simulation image.
- The simulation apparatus according to claim 2, wherein the imaging camera and the distance image sensor are also provided individually, corresponding to the left and right eyes.
- The simulation apparatus according to claim 1, 2, or 3, wherein the images handled by the imaging camera, the distance image sensor, the image generation means, and the image display are moving images.
- The simulation apparatus according to any one of claims 1 to 4, wherein at least the imaging camera, the distance image sensor, and the image display are incorporated in a housing wearable on the wearer's head.
- The simulation apparatus according to claim 5, further comprising a camera drive mechanism that changes the arrangement of the imaging camera.
- The simulation apparatus according to any one of claims 1 to 4, wherein at least the imaging camera, the distance image sensor, and the image display are incorporated in a portable information processing terminal.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP13794149.8A EP2856931B1 (en) | 2012-05-25 | 2013-04-23 | Simulation device |
CN201380025382.7A CN104284622B (zh) | 2012-05-25 | 2013-04-23 | 模拟装置 |
US14/403,701 US9967555B2 (en) | 2012-05-25 | 2013-04-23 | Simulation device |
JP2014516730A JP6023801B2 (ja) | 2012-05-25 | 2013-04-23 | シミュレーション装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-119885 | 2012-05-25 | ||
JP2012119885 | 2012-05-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013175923A1 true WO2013175923A1 (ja) | 2013-11-28 |
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/061897 WO2013175923A1 (ja) | 2012-05-25 | 2013-04-23 | シミュレーション装置 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3342423B2 (ja) | 1998-10-09 | 2002-11-11 | Hoya Corporation | Simulation device for an eye optical system |
JP3919097B2 (ja) | 2001-09-06 | 2007-05-23 | Hoya Corporation | Method and device for displaying binocular vision performance of spectacle lenses |
JP2009003812A (ja) * | 2007-06-22 | 2009-01-08 | Sgi Japan Ltd | Method and device for generating defocused images |
WO2010044383A1 (ja) | 2008-10-17 | 2010-04-22 | Hoya Corporation | Visual field image display device for eyeglasses and visual field image display method for eyeglasses |
JP2010134460A (ja) * | 2008-11-06 | 2010-06-17 | Seiko Epson Corp | Visual simulation device, visual simulation method, and visual simulation program for spectacle lenses |
JP4609581B2 (ja) * | 2008-03-26 | 2011-01-12 | Seiko Epson Corp | Simulation device, simulation program, and recording medium storing the simulation program |
JP2012066002A (ja) * | 2010-09-27 | 2012-04-05 | Hoya Corp | Visual field image display device for eyeglasses |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1993688B (zh) * | 2004-08-03 | 2012-01-18 | Silverbrook Research Pty Ltd | Walk-up printing |
US7914148B2 (en) * | 2005-11-15 | 2011-03-29 | Carl Zeiss Vision Australia Holdings Limited | Ophthalmic lens simulation system and method |
EP1862110A1 (en) * | 2006-05-29 | 2007-12-05 | Essilor International (Compagnie Generale D'optique) | Method for optimizing eyeglass lenses |
EP2198769A1 (en) * | 2008-12-22 | 2010-06-23 | Essilor International (Compagnie Générale D'Optique) | A method of and an apparatus for simulating an optical effect of an optical lens |
US20110075257A1 (en) * | 2009-09-14 | 2011-03-31 | The Arizona Board Of Regents On Behalf Of The University Of Arizona | 3-Dimensional electro-optical see-through displays |
US9348141B2 (en) * | 2010-10-27 | 2016-05-24 | Microsoft Technology Licensing, Llc | Low-latency fusing of virtual and real content |
US9304319B2 (en) * | 2010-11-18 | 2016-04-05 | Microsoft Technology Licensing, Llc | Automatic focus improvement for augmented reality displays |
US8576276B2 (en) * | 2010-11-18 | 2013-11-05 | Microsoft Corporation | Head-mounted display device which provides surround video |
2013
- 2013-04-23 EP EP13794149.8A patent/EP2856931B1/en active Active
- 2013-04-23 WO PCT/JP2013/061897 patent/WO2013175923A1/ja active Application Filing
- 2013-04-23 JP JP2014516730A patent/JP6023801B2/ja active Active
- 2013-04-23 US US14/403,701 patent/US9967555B2/en active Active
- 2013-04-23 CN CN201380025382.7A patent/CN104284622B/zh active Active
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016045866A1 (de) * | 2014-09-22 | 2016-03-31 | Carl Zeiss Vision International Gmbh | Display device for demonstrating optical properties of spectacle lenses |
US10292581B2 (en) | 2014-09-22 | 2019-05-21 | Carl Zeiss Vision International Gmbh | Display device for demonstrating optical properties of eyeglasses |
WO2018074528A1 (ja) * | 2016-10-20 | 2018-04-26 | Nikon-Essilor Co., Ltd. | Image creation device, image creation method, image creation program, spectacle lens design method, and spectacle lens manufacturing method |
JP7252892B2 (ja) | 2017-07-03 | 2023-04-05 | Nikon-Essilor Co., Ltd. | Method for designing spectacle lens and method for manufacturing spectacle lens |
WO2019009034A1 (ja) * | 2017-07-03 | 2019-01-10 | Nikon-Essilor Co., Ltd. | Method for designing eyeglass lens, method for manufacturing eyeglass lens, eyeglass lens, eyeglass lens ordering device, eyeglass lens order receiving device, and eyeglass lens ordering and order receiving system |
US11754856B2 (en) | 2017-07-03 | 2023-09-12 | Nikon-Essilor Co., Ltd. | Method for designing eyeglass lens, method for manufacturing eyeglass lens, eyeglass lens, eyeglass lens ordering device, eyeglass lens order receiving device, and eyeglass lens ordering and order receiving system |
JPWO2019009034A1 (ja) * | 2017-07-03 | 2020-05-21 | Nikon-Essilor Co., Ltd. | Method for designing eyeglass lens, method for manufacturing eyeglass lens, eyeglass lens, eyeglass lens ordering device, eyeglass lens order receiving device, and eyeglass lens ordering and order receiving system |
WO2019044710A1 (ja) * | 2017-08-31 | 2019-03-07 | Nikon Corporation | Ophthalmic apparatus, image generation device, program, and ophthalmic system |
JPWO2019044710A1 (ja) * | 2017-08-31 | 2020-10-01 | Nikon Corporation | Ophthalmic apparatus, image generation device, program, and ophthalmic system |
JP7182920B2 (ja) | 2018-07-02 | 2022-12-05 | Canon Inc. | Image processing device, image processing method, and program |
JP2020004325A (ja) * | 2018-07-02 | 2020-01-09 | Canon Inc. | Image processing device, image processing method, and program |
JP7467830B2 (ja) | 2019-05-08 | 2024-04-16 | Nidek Co., Ltd. | Visual simulation method and visual simulation program |
JP2021149031A (ja) * | 2020-03-23 | 2021-09-27 | HOYA Lens Thailand Ltd | Virtual image generation device and virtual image generation method |
WO2021193261A1 (ja) * | 2020-03-23 | 2021-09-30 | HOYA Lens Thailand Ltd | Virtual image generation device and virtual image generation method |
JP7272985B2 (ja) | 2020-03-23 | 2023-05-12 | HOYA Lens Thailand Ltd | Virtual image generation device and virtual image generation method |
Also Published As
Publication number | Publication date |
---|---|
JPWO2013175923A1 (ja) | 2016-01-12 |
EP2856931A1 (en) | 2015-04-08 |
US20150163480A1 (en) | 2015-06-11 |
JP6023801B2 (ja) | 2016-11-09 |
US9967555B2 (en) | 2018-05-08 |
CN104284622A (zh) | 2015-01-14 |
CN104284622B (zh) | 2016-11-09 |
EP2856931B1 (en) | 2020-03-18 |
EP2856931A4 (en) | 2016-01-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6023801B2 (ja) | Simulation device | |
KR102417177B1 (ko) | Head-mounted display for virtual and mixed reality with inside-out positional, user body and environment tracking | |
CN107071382B (zh) | Stereoscopic image display device | |
CA3043204C (en) | Apparatus and method for a dynamic "region of interest" in a display system | |
KR20190026004A (ko) | Single depth tracked accommodation-vergence solutions | |
CN106484116B (zh) | Method and device for processing media files | |
Balram et al. | Light-field imaging and display systems | |
WO2013027755A1 (ja) | Eyeglass wearing simulation method, program, device, eyeglass lens ordering system, and eyeglass lens manufacturing method | |
AU2014370001A1 | 3-D light field camera and photography method | |
CN108141578A (zh) | Presentation camera | |
JPWO2014030403A1 (ja) | Simulation device, simulation system, simulation method, and simulation program | |
JP2012079291A (ja) | Program, information storage medium, and image generation system | |
CN109799899B (zh) | Interaction control method and device, storage medium, and computer equipment | |
JP7148634B2 (ja) | Head-mounted display device | |
JP2012513604A5 (ja) | ||
JP6509101B2 (ja) | Image display device, program, and method for displaying an object on a spectacle-like optical see-through binocular display | |
CN109255838B (zh) | Method and device for avoiding ghost images when viewing an augmented reality display device | |
JP6915368B2 (ja) | Multifocal visual output method and multifocal visual output device | |
JP2017098596A (ja) | Image generation method and image generation device | |
CN107087153B (zh) | 3D image generation method and device, and VR device | |
CN112153319B (zh) | AR information display method and device based on video communication technology | |
CN111736692B (zh) | Display method, display device, storage medium, and head-mounted device | |
Balram | Fundamentals of light field imaging and display systems | |
US20220232201A1 | Image generation system and method | |
Wetzstein | Augmented and virtual reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13794149; Country of ref document: EP; Kind code of ref document: A1 |
ENP | Entry into the national phase | Ref document number: 2014516730; Country of ref document: JP; Kind code of ref document: A |
NENP | Non-entry into the national phase | Ref country code: DE |
WWE | Wipo information: entry into national phase | Ref document number: 14403701; Country of ref document: US |
WWE | Wipo information: entry into national phase | Ref document number: 2013794149; Country of ref document: EP |