WO2019058811A1 - Display device and display control method - Google Patents
- Publication number
- WO2019058811A1 (application PCT/JP2018/030102, JP2018030102W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pixel
- image
- eye
- display
- display surface
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
- H04N13/307—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using fly-eye lenses, e.g. arrangements of circular lenses
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
Definitions
- the present disclosure relates to a display device and a display control method.
- a display device includes a pixel array, a lens array, and a control unit.
- the pixel array has a display surface in which a plurality of pixels are arranged at a first pitch.
- the lens array is disposed to face the display surface and has a plurality of lenses arranged at a second pitch greater than the first pitch along the extension direction of the display surface.
- the control unit generates a virtual image at a position different from the display surface by causing the lights from the plurality of pixels to be viewed as one continuous image through the plurality of lenses, and controls the display operation of the pixel array so that the position of the binocular convergence surface, along the direction orthogonal to the display surface, is controlled arbitrarily and independently of the position of the virtual image.
- in a display control method, a plurality of lights from a plurality of pixels in a pixel array, which has a display surface in which the pixels are arranged at a first pitch, are viewed as one continuous image through a plurality of lenses arranged, along the extending direction of the display surface, at a second pitch greater than the first pitch, thereby generating a virtual image at a position different from the display surface.
- a plurality of pixel regions corresponding to the respective lenses are each divided into a right-eye pixel region and a left-eye pixel region; an image for the right eye is displayed on the right-eye pixels in the right-eye pixel region among the plurality of pixels, and an image for the left eye is displayed on the left-eye pixels in the left-eye pixel region among the plurality of pixels.
- the method includes controlling the pixel array so that the convergence position and the adjustment position are arbitrarily controlled along the direction orthogonal to the display surface.
- because the control unit controls the convergence position and the adjustment position arbitrarily, even when, for example, the observer tilts his or her face with respect to the display surface, the image viewed by the observer does not become a double image, and the risk of unpleasant eye fatigue can be avoided.
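The convergence surface mentioned above is set through the horizontal disparity between the left-eye and right-eye images. As a minimal geometric sketch (not the patent's own formulation; the function name, the 63 mm inter-pupil distance, and the symmetric-eyes assumption are all mine), the distance at which the two visual axes cross follows from similar triangles:

```python
def convergence_distance_mm(pd_mm: float, screen_mm: float,
                            disparity_mm: float) -> float:
    """Distance at which the two visual axes cross for a feature drawn
    at x_L (left-eye image) and x_R (right-eye image) on a plane at
    screen_mm, with disparity = x_L - x_R. Crossed disparity (> 0)
    pulls the convergence point nearer than the screen."""
    return pd_mm * screen_mm / (pd_mm + disparity_mm)

# zero disparity: convergence lies on the display plane itself
on_screen = convergence_distance_mm(63.0, 400.0, 0.0)   # 400.0 mm
nearer    = convergence_distance_mm(63.0, 400.0, 5.0)   # < 400 mm
```

Driving the right-eye and left-eye pixel regions with a chosen disparity therefore moves the convergence surface along the direction orthogonal to the display surface without moving the virtual image.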
- according to the display device as an embodiment of the present disclosure, it is possible to reduce the burden on the observer while securing excellent visibility.
- the effects of the present disclosure are not limited to the above; any of the effects described below may be obtained.
- FIG. 1 is a schematic view illustrating a basic configuration of a display device according to a first embodiment of the present disclosure.
- FIG. 2 is a schematic diagram showing a configuration example of a display device that displays an ordinary two-dimensional image.
- FIG. 3 is a conceptual diagram schematically showing a state in which the image of the display surface is formed on the observer's retina in an ordinary two-dimensional display device.
- FIG. 4 is a conceptual diagram schematically showing a state in which the image of the display surface is not formed on the observer's retina in an ordinary two-dimensional display device.
- FIG. 5 is a conceptual view schematically showing a state in which the image to be displayed is clearly reproduced on the observer's retina in the display device as the first embodiment of the present disclosure.
- FIG. 7 is an explanatory diagram for explaining the relationship between the virtual image position and the minimum pixel array pitch in the display device shown in FIG. 6, followed by a conceptual diagram for explaining control of the convergence position under certain conditions of that display device.
- a conceptual diagram illustrates the relationship between the image recognized by the left eye and the image recognized by the right eye when the observer's viewpoint rotates under conditions in which a virtual image is obtained by the display device shown in FIG. 6, together with a conceptual diagram of the image visually recognized in that case; further conceptual diagrams show the same relationship, and the visually recognized image, under conditions different from those shown in FIG. 10; a schematic view then shows the relationship between the continuous display region in monocular viewing in the display device shown in FIG. 6 and the corresponding elements.
- a conceptual diagram describes the relationship between the position of the observer's pupils and the position of the continuous display region in binocular viewing in the display device as the first embodiment of the present disclosure; explanatory views show a first application example and a second application example of that display device.
- FIG. 19 is a flow chart showing an example of the processing procedure of a display control method in the display device shown in FIG. 18.
- an explanatory diagram describes the relationship between the virtual image position and the minimum pixel array pitch in the display device shown in FIG. 18.
- a schematic view shows the relationship between the continuous display region in monocular viewing in the display device shown in FIG. 18, the corresponding microlenses, and the unit pixel regions of the pixel array.
- a schematic view shows the relationship between the continuous display region in binocular viewing in the display device shown in FIG. 18, the corresponding microlenses, and the unit pixel regions of the pixel array; a schematic diagram shows the overall configuration of a display device as a modification of the present disclosure.
- 1. First Embodiment
- 1-1. Basic principle in the first embodiment
- 1-2. Display device according to the first embodiment
- 1-2-(1) Configuration of the display device for visual acuity compensation
- 1-2-(2) Operation of the display device for visual acuity compensation
- 1-2-(3) Display control method for visual acuity compensation in the display device
- 1-2-(4) Application examples of the display device
- 1-3-(1) Example of application to a mobile device
- 1-3-(2) Example of application to an electronic loupe
- 1-3-(3) Example of application to a vehicle-mounted device
- 2. Second Embodiment
- 2-1. Background of the second embodiment
- 2-2. Device configuration for vision compensation
- 2-3. Display control method for visual acuity compensation
- 2-4. Control of the convergence position
- 2-5. Functions and effects of the display device
- Other variations
- FIG. 1 is a schematic view showing a configuration example of a light beam reproduction type display device. For comparison, a configuration example of a display device that displays a general two-dimensional image is shown in FIG. 2.
- FIG. 2 is a schematic view showing a configuration example of a display device for displaying a general two-dimensional image.
- the display surface 815 of the general display device 80 is configured by a pixel array 810 in which a plurality of pixels 811 are two-dimensionally arrayed.
- the pixel array 810 is illustrated in FIG. 2 as if the pixels 811 are arranged in a line for convenience, in practice, the pixels 811 are also arranged in the paper depth direction.
- the emission direction of the light from each pixel 811 is not controlled; light whose amount is controlled in the same way is emitted in every direction.
- a two-dimensional image is displayed on the display surface 815 of the pixel array 810.
- hereinafter, the display device 80 that displays a two-dimensional image, that is, image information having no depth information, as represented by FIG. 2, is also referred to as the two-dimensional display device 80.
- the light beam reproduction type display device 15 shown in FIG. 1 includes a pixel array 110 in which a plurality of pixels 111 are two-dimensionally arranged, and a microlens array 120 provided on the display surface 115 of the pixel array 110.
- the pixel array 110 is illustrated as if the pixels 111 are arranged in a line, but in practice, the pixels 111 are also arranged in the paper depth direction.
- the microlenses 121 are likewise actually also arranged in the depth direction of the drawing. Since the light from each pixel 111 is emitted through the microlenses 121, in the light beam reproduction type display device 15 the lens surface 125 of the microlens array 120 serves as the apparent display surface.
- the pitch of the microlenses 121 constituting the microlens array 120 is configured to be larger than the pitch of the pixels 111 constituting the pixel array 110. That is, the plurality of pixels 111 exist immediately below one microlens 121. Therefore, light from a plurality of pixels 111 enters one microlens 121 and is emitted with directivity. Therefore, by appropriately controlling the driving of each pixel 111, it is possible to adjust the direction, the intensity, and the like of the light emitted from each of the microlenses 121.
- each microlens 121 constitutes a light emission point, and the light emitted from each light emission point is controlled by the plurality of pixels 111 provided directly under that microlens 121. By driving each pixel 111 based on the light ray information, the light emitted from each light emission point is controlled, and a desired light ray state is realized.
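As a rough sketch of this directivity (paraxial geometry only; the function and the example numbers are illustrative assumptions, not values taken from the embodiment), the lateral offset of a pixel from its lens axis, divided by the lens-to-pixel gap, fixes the direction of the ray bundle emitted through that lens:

```python
import math

def emission_angle_deg(pixel_index: int, pixels_per_lens: int,
                       pixel_pitch_mm: float, gap_mm: float) -> float:
    """Paraxial emission angle (degrees) for one pixel under one
    microlens: the pixel's lateral offset from the lens axis over the
    lens-to-pixel gap sets the direction of the emitted ray bundle."""
    centre = (pixels_per_lens - 1) / 2          # lens axis in pixel units
    offset_mm = (pixel_index - centre) * pixel_pitch_mm
    return math.degrees(math.atan2(offset_mm, gap_mm))

# e.g. 15 pixels of 10 um pitch under one lens, ~3.5 mm gap (assumed)
angles = [emission_angle_deg(i, 15, 0.010, 3.5) for i in range(15)]
```

The centre pixel emits straight ahead and pixels on either side of the axis emit symmetrically to either side, which is what lets one lens address many directions at once.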
- FIG. 3 is a schematic view showing a state in which an image of a display surface is formed on the retina of the observer in a general two-dimensional display device 80.
- FIG. 4 is a schematic view showing a state in which the image of the display surface is not formed on the retina of the observer in a general two-dimensional display device 80.
- FIG. 5 is a schematic view showing the relationship between the virtual image plane in the light beam reproduction type display device 15 and the imaging plane on the retina of the observer; that is, it schematically shows a state in which the image to be displayed is clearly reproduced on the retina of the observer.
- FIGS. 3 to 5 schematically show the pixel array 810 and its display surface 815 of the general two-dimensional display device 80, or the microlens array 120 and its lens surface 125 of the light beam reproduction type display device 15, together with the lens 201 and the retina 203 of the observer's eye.
- in FIG. 3, how an image 160 is displayed on the display surface 815 is schematically illustrated.
- in a general two-dimensional display device 80, in a state in which the observer's focus adjustment matches the display surface 815, the light from each pixel 811 of the pixel array 810 passes through the lens 201 of the observer's eye and forms an image on the retina 203. That is, the imaging plane 204 lies on the retina 203.
- Arrows drawn with different line types in FIG. 3 indicate light of different intensities emitted from each pixel 811.
- in FIG. 4, the display surface 815 is closer to the observer than in the state shown in FIG. 3, and the observer's focus adjustment does not match the display surface 815.
- consequently, the light from each pixel 811 of the pixel array 810 does not form an image on the observer's retina 203; the imaging plane 204 is located farther from the pixel array 810 than the retina 203. In this case, the observer perceives an out-of-focus, blurred image.
- FIG. 4 shows a state in which, for example, an observer having presbyopia sees a blurred image in an attempt to look at a display surface present nearby.
- FIG. 5 illustrates the state of the light beam when the light beam reproduction type display device 15 is driven to display the image 160 on the virtual image plane 150 as a virtual image to the observer.
- the lens surface 125 exists relatively near the viewer.
- the virtual image surface 150 is set as a virtual display surface located farther than the lens surface 125 which is a real display surface.
- each microlens 121, that is, each light emission point, does not emit light isotropically but emits light of different intensities in different directions, so that the emission state of the light can be controlled.
- the light emitted from each of the microlenses 121 is controlled so as to reproduce the light from the image 160 on the virtual image plane 150. More specifically, assuming, for example, virtual pixels 151 (151a, 151b) on the virtual image plane 150, displaying the image 160 on the virtual image plane 150 can be regarded as emitting light of a first intensity from one virtual pixel 151a and light of a second intensity from the other virtual pixel 151b.
- accordingly, the microlens 121a emits light of the first intensity in the direction corresponding to the light from the virtual pixel 151a, and light of the second intensity in the direction corresponding to the light from the virtual pixel 151b; in this way, the light emission state is controlled.
- a pixel array is provided on the back side of the microlens array 120 (the right side in FIG. 6), and by controlling the driving of each pixel of that pixel array, the light emission state of the light from the microlens 121a is controlled.
- the distance of the virtual image plane 150 from the retina 203 is set to a position at which the observer can adjust focus, for example the position of the display surface 815 shown in FIG. 3.
- by driving the light beam reproduction display device 15 so as to reproduce light from the image 160 on the virtual image plane 150 located at such a position, although the imaging plane 204 for light from the lens surface 125, which is the real display surface, would lie behind the retina 203, the image 160 on the virtual image plane 150 is at a position where it forms an image on the retina 203; the actual light ray state is the same as when a real image exists on the virtual image plane 150.
- in this way, the light beam reproduction display device produces a light ray state similar to that when a virtual image is present. Therefore, even if the distance between a presbyopic viewer and the lens surface 125 is short, the viewer can view a good image 160 as in far vision.
- as described above, the light beam reproduction type display device reproduces the light from the image 160 on the virtual image plane 150, which is set at a position where focus adjustment is easy for the presbyopic observer, and presents that light to the observer.
- this allows the viewer to view the image 160 on the virtual image plane 150, where focus adjustment can be achieved. Therefore, even if the lens surface 125, which is the actual display surface, is at a viewing distance at which the observer cannot adjust focus, an in-focus image is provided to the observer, who can thus clearly observe the fine image 160 without using an additional optical compensation device such as reading glasses.
- in this way, the observer's visual acuity is supplemented to enable observation at a short distance, so an image on which high-density information is displayed can be observed well.
- the virtual image surface 150 is set farther than the lens surface 125 which is the actual display surface.
- the present embodiment is not limited to such an example.
- the virtual image surface 150 may be set closer to the lens surface 125 which is a real display surface.
- in this case, the virtual image plane 150 is set, for example, so as to be in focus for a myopic observer. This enables a myopic observer to observe the in-focus image 160 without using an optical compensation device such as glasses or contact lenses.
- switching between vision compensation for a presbyopic observer and vision compensation for a myopic observer requires changing only the data displayed on each pixel, and need not be accompanied by any change in hardware.
- FIG. 6 illustrates the entire configuration of a display device 10 according to an embodiment of the present disclosure.
- the display device 10 includes a pixel array 110 in which a plurality of pixels 111 are two-dimensionally arranged, a microlens array 120 arranged to face the display surface 115 of the pixel array 110, and a control unit 130 that controls the driving of each pixel 111 of the pixel array 110.
- the pixel array 110 and the microlens array 120 shown in FIG. 6 are substantially the same as those shown in FIG.
- the control unit 130 drives each pixel 111 so as to reproduce a predetermined light beam state based on the light beam information.
- the display device 10 is configured as a light beam reproduction display device.
- the pitch of the microlenses 121 in the microlens array 120 is configured to be larger than the pitch of the pixels 111 in the pixel array 110, as in the light beam reproduction type display device 15 shown in FIG.
- Light from a plurality of pixels 111 is incident on one microlens 121, and each light is emitted with directivity.
- each micro lens 121 constitutes a light emission point.
- the microlenses 121 correspond to pixels in a general two-dimensional display device, and in the display device 10, the lens surface 125 of the microlens array 120 is an apparent display surface.
- the pixel array 110 is formed of, for example, a liquid crystal panel of a liquid crystal display device having a pixel pitch of about 10 μm.
- the pixel array 110 may be combined with the various components provided for the pixels of a general liquid crystal display device, such as drive elements for driving each pixel of the pixel array 110 and a light source (backlight).
- the present embodiment is not limited to such an example, and another display device such as an organic EL display device may be used as the pixel array 110.
- the pixel pitch is not limited to the above example, and may be appropriately designed in consideration of the resolution to be realized.
- the microlens array 120 is configured, for example, by two-dimensionally arranging a convex lens having a focal length of 3.5 mm at a pitch of 0.15 mm.
- the microlens array 120 is provided to substantially cover the entire pixel array 110.
- the distance between the pixel array 110 and the microlens array 120 is set to be longer than the focal length of each microlens 121 of the microlens array 120, so that the image on the display surface 115 is formed on a plane, substantially parallel to the display surface 115 (or the lens surface 125), that includes the pupil of the observer.
- the imaging position of this image can generally be set in advance as the observation position assumed when the observer views the display surface 115.
- the focal length and pitch of the microlenses 121 in the microlens array 120 are not limited to the above example, and may be designed as appropriate according to the arrangement relationship with other members and the imaging position of the image on the display surface 115 (the observation position of the observer).
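The statement that the lens-to-pixel distance slightly exceeds the focal length follows from the thin-lens equation when the pixel plane is imaged onto the observation plane. A minimal sketch, assuming the 3.5 mm focal length above and a hypothetical 300 mm viewing distance (the function name and the viewing distance are mine, not the embodiment's):

```python
def lens_to_pixel_gap_mm(focal_mm: float, viewing_mm: float) -> float:
    """Gap C between the pixel array and the lens array so that the
    pixel plane is imaged onto a plane at the viewing distance B.
    Thin-lens equation: 1/f = 1/C + 1/B  ->  C = f*B / (B - f)."""
    return focal_mm * viewing_mm / (viewing_mm - focal_mm)

gap = lens_to_pixel_gap_mm(3.5, 300.0)   # ~3.54 mm, just beyond f
```

As the assumed viewing distance grows, the required gap approaches the focal length itself, which is why the text specifies a distance only slightly longer than f.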
- the control unit 130 is configured by a processor such as a central processing unit (CPU) or a digital signal processor (DSP).
- the control unit 130 controls driving of each pixel 111 of the pixel array 110 by operating according to a predetermined program.
- the control unit 130 has a light beam information generation unit 131 and a pixel drive unit 132 as its functional parts.
- the light beam information generation unit 131 generates light beam information based on the area information, the virtual image position information, and the image information.
- the area information is information on an area group consisting of a plurality of areas that are smaller than the pupil diameter of the observer and are set on a plane that includes the pupil of the observer and is substantially parallel to the lens surface 125 of the microlens array 120.
- the area information includes information on the distance between the plane on which the areas are set and the lens surface 125, and information on the size of the areas.
- a plane 205 including the pupil of the observer, a plurality of areas 207 set on the plane 205, and an area group 209 are simply illustrated.
- a plurality of regions 207 are set to exist in the pupil of the observer.
- the area group 209 is set on the plane 205 in such a range that the light emitted from each of the microlenses 121 can reach. That is, the microlens array 120 is configured such that the light emitted from each pixel 111 irradiates the region group 209 according to the position and the angle of the light beam.
- the intensity and the like of the light emitted from each of the microlenses 121 are adjusted in accordance with the combination of the microlenses 121 and the region 207. That is, for each region 207, the irradiation state of light incident on the region 207 is controlled.
- each area 207 corresponds to the size of the projection of one pixel 111 onto the pupil, and the spacing between the areas 207 can be said to indicate the spatial sampling interval at which light is incident on the pupil of the observer. To obtain a sufficient visual effect, the area 207 needs to be sufficiently smaller than the pupil diameter. In the following description, the area 207 is also referred to as the sampling area 207.
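The size of a sampling area can be estimated as the pixel size magnified by the pixel-to-pupil imaging. A sketch under the example numbers used earlier (10 μm pixels, a ~3.54 mm lens-to-pixel gap, a 300 mm viewing distance; all of these, and the function name, are assumptions for illustration):

```python
def sampling_region_mm(pixel_mm: float, viewing_mm: float,
                       gap_mm: float) -> float:
    """Size of one pixel's projection on the pupil plane: the pixel
    size scaled by the paraxial magnification B/C of the
    pixel-plane-to-pupil-plane imaging."""
    return pixel_mm * viewing_mm / gap_mm

size = sampling_region_mm(0.010, 300.0, 3.54)   # ~0.85 mm across
```

With these numbers the projected pixel is well under a typical 2 mm to 8 mm pupil diameter, satisfying the "sufficiently smaller than the pupil" condition stated above.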
- the virtual image position information is information on a position (virtual image generation position) at which a virtual image is generated.
- the virtual image generation position is the position of the virtual image plane 150 shown in FIG.
- the virtual image position information includes information on the distance from the lens surface 125 to the virtual image generation position.
- the image information is two-dimensional image information presented to the observer.
- based on the area information, the virtual image position information, and the image information, the light beam information generation unit 131 generates ray information representing the light ray state required for light from an image based on the image information, displayed at the virtual image generation position based on the virtual image position information, to enter each sampling area 207.
- the light beam information includes information on the light emission state of each micro lens 121 for reproducing the light beam state, and information on the irradiation state of the light on each sampling area 207.
- the image information may be transmitted from another device, or may be stored in advance in a storage device (not shown) provided in the display device 10.
- the image information may be information on an image, a text, a graph, or the like representing the results of various processes performed by a general information processing apparatus.
- the virtual image position information may be input in advance by, for example, an observer or a designer of the display device 10, and may be stored in the storage device.
- the virtual image generation position is set so as to be in focus for the observer.
- a designer of the display device 10 may set a general focus position that fits a relatively large number of viewers having presbyopia as a virtual image generation position.
- the virtual image generation position may be appropriately adjusted by the observer in accordance with the user's visual acuity, and the virtual image position information in the storage device may be updated each time.
- the area information may be input in advance by, for example, the observer or a designer of the display device 10, and may be stored in the storage device.
- the distance, included in the area information, between the lens surface 125 and the plane 205 on which the sampling areas 207 are set (corresponding to the observation position of the observer) may generally be set based on the position at which the observer is assumed to observe the display device 10. For example, if the device on which the display device 10 is mounted is a wristwatch-type wearable device, the distance may be set in consideration of the distance between the observer's pupil and the arm on which the wearable device is worn.
- alternatively, if the device is a television, the distance may be set in consideration of the distance between a general observer's pupil and the television when viewing it.
- the distance may also be adjusted automatically by an observation distance measuring device such as a camera, or adjusted as appropriate by the observer according to the mode of use, with the stored information updated each time an adjustment is made.
- the light beam information generation unit 131 provides the generated light beam information to the pixel drive unit 132.
- based on the light beam information, the pixel drive unit 132 drives each pixel 111 of the pixel array 110 so as to reproduce the light ray state obtained when the image based on the image information is displayed on the virtual image plane. At this time, the pixel drive unit 132 drives each pixel 111 so that the light emitted from each microlens 121 is controlled independently for each sampling region 207. Thereby, as described above, the irradiation state of the light incident on each sampling region 207 is controlled region by region. FIG. 6 illustrates, for example, a state in which light 123 formed by superimposing light from a plurality of pixels 111 is incident on each sampling region 207.
- the display device 10 shares part of its configuration with light beam reproduction type display devices widely used as naked-eye 3D display devices.
- in a naked-eye 3D display device, however, the main purpose is to display images having binocular parallax to the left and right eyes of the observer, so the emission state of the emitted light is controlled only in the horizontal direction, not in the vertical direction. For this reason, lenticular lenses are often provided on the display surface of the pixel array.
- in contrast, the display device 10 is mainly intended to display a virtual image in order to compensate the adjustment function of the observer's eye. For this reason, the emission state naturally must be controlled in both the horizontal and vertical directions.
- therefore, the display device 10 includes the two-dimensionally arrayed microlenses 121, instead of the above-mentioned lenticular lenses, arranged to face the display surface 115 of the pixel array 110.
- FIG. 7 is a view for explaining light rays emitted from the micro lens 121.
- FIG. 7, similarly to FIG. 5, schematically shows the microlens array 120 and its lens surface 125 serving as the display surface, the virtual image plane 150, the virtual pixels 151 on the virtual image plane 150, the image 160 on the virtual image plane 150, and the lens 201 and retina 203 of the observer's eye.
- FIG. 7 adds the display surface 115 of the pixel array 110 to FIG. 5 described above; duplicate explanations of the matters already described with reference to FIG. 5 are therefore omitted.
- Light is emitted from each of the microlenses 121 so as to reproduce light from the image 160 on the virtual image plane 150.
- the image 160 can be thought of as a two-dimensional image on the virtual image plane 150 displayed by the virtual pixels 151 on the virtual image plane 150.
- in FIG. 7, the range 124 of light that can be independently controlled by one particular microlens 121 is schematically illustrated.
- the pixel group 112 (a part of the pixel array 110) immediately below that microlens 121 is driven so as to reproduce the rays that come from the virtual pixels 151 on the virtual image plane 150 included in the range 124 and pass through this one microlens 121.
- by performing the same drive control for each of the microlenses 121, light is emitted from each microlens 121 so as to reproduce the rays in each direction from the image 160 on the virtual image plane 150.
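The drive just described amounts to tracing the undeviated chief ray from a virtual pixel through each lens centre back to the pixel plane. A paraxial sketch of that mapping (my own formulation, not the patent's; the example distances are assumed):

```python
def pixel_under_lens(x_virtual_mm: float, x_lens_mm: float,
                     virt_depth_mm: float, gap_mm: float) -> float:
    """Lateral position on the pixel plane whose chief ray through the
    lens centre appears to come from a virtual pixel at x_virtual_mm,
    a distance virt_depth_mm behind the lens surface. The central ray
    is undeviated, so the pixel lies on the line joining the virtual
    pixel and the lens centre, a gap_mm behind the lens."""
    return x_lens_mm + (x_virtual_mm - x_lens_mm) * gap_mm / virt_depth_mm

# virtual pixel 10 mm off-axis, virtual plane 500 mm behind a lens at 0
x_p = pixel_under_lens(10.0, 0.0, 500.0, 3.54)   # small off-axis offset
```

Because the gap is tiny compared with the virtual-image depth, neighbouring lenses address the same virtual pixel with only slightly different pixel offsets, which is what makes the ray bundles from many lenses converge on one virtual point.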
- the microlens array 120 is configured such that the irradiation state of light on each sampling area 207 repeats periodically in units larger than the user's maximum pupil diameter; to this end, the distance B between the lens surface 125 and the pupil, the distance C between the pixel array 110 and the microlens array 120, the pitch of the microlenses 121 in the microlens array 120, and the pixel size and pitch in the pixel array 110 are set accordingly.
- the conditions required for the repetition period of the irradiation state of the sampling area 207 will be considered more specifically.
- the repetition period Λ of the irradiation state of the sampling areas 207 (hereinafter also simply called the repetition period) corresponds to the movement distance of the pupil at which another pixel group 112 (a part of the pixel array 110) immediately below a microlens 121 in FIG. 7 comes to be observed in the same way through that microlens 121, and it can be set with reference to the user's inter-pupil distance (PD: Pupil Distance).
- if the repetition period Λ is larger than the inter-pupil distance PD, the left and right eyes can be included within the same repetition period. It therefore becomes possible to perform stereoscopic display, in the manner of naked-eye 3D display techniques, together with the visual acuity compensation display described above.
- when the user's viewpoint transits between sampling area groups, normal viewing is disturbed; by increasing the repetition period Λ, the frequency with which the viewpoint transits between sampling area groups as the viewpoint moves is reduced, and hence the frequency with which the display is disturbed can be reduced.
- it is therefore desirable that the repetition period Λ be as large as possible, particularly when realizing functions other than visual acuity compensation, such as stereoscopic display, at the same time.
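Whether Λ exceeds PD can be estimated from the geometry already introduced: the width of one pixel group under a lens, magnified onto the pupil plane by the viewing-distance-to-gap ratio B/C, gives the period there. A sketch with assumed example values (pixel count per lens, distances, and the 63 mm PD are all illustrative, not the patent's figures):

```python
def repetition_period_mm(pixels_per_lens: int, pixel_mm: float,
                         viewing_mm: float, gap_mm: float) -> float:
    """Period, on the pupil plane, at which the same lens shows the
    next pixel group: the pixel-group width magnified by B/C."""
    return pixels_per_lens * pixel_mm * viewing_mm / gap_mm

PD_MM = 63.0   # typical inter-pupil distance (assumed)
period = repetition_period_mm(15, 0.010, 300.0, 3.54)
stereo_capable = period > PD_MM   # are both eyes inside one period?
```

With these particular numbers the period comes out well below PD, illustrating why the design variables B, C, and the pitches must be chosen jointly when stereoscopic display is wanted on top of visual acuity compensation.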
- FIG. 8 is a flow chart showing an example of the processing procedure of the display control method for visual acuity compensation according to the present embodiment. Each process shown in FIG. 8 corresponds to a process executed by the control unit 130 shown in FIG. 6.
- First, light ray information is generated based on area information, virtual image position information, and image information (step S101).
- The area information is information on the sampling area group 209, which includes the observer's pupil and consists of a plurality of sampling areas 207 set on a plane substantially parallel to the display surface of the display device 10 (the lens surface 125 of the microlens array 120).
- the virtual image position information is information on a position (virtual image generation position) at which the display device 10 generates a virtual image. For example, the virtual image generation position is set to a position in focus for the observer.
- the image information is two-dimensional image information presented to the observer.
- In step S101, information indicating the light beam state required for light from the image based on the image information, displayed at the virtual image generation position based on the virtual image position information, to be incident on each sampling area constituting the sampling area group is generated as the light ray information.
- The light ray information includes information on the light emission state of each microlens 121 for reproducing this light beam state, and information on the irradiation state of the light to each sampling area 207.
- the process shown in step S101 corresponds to the process performed by the light beam information generation unit 131 shown in FIG. 6, for example.
- Next, each pixel 111 is driven so that the incident state of light is controlled for each sampling area 207 based on the light ray information (step S102).
- As a result, the light beam state described above is reproduced, and a virtual image of the image based on the image information is displayed at the virtual image generation position based on the virtual image position information. That is, a clear, in-focus display is realized for the observer.
- It is preferable that the position of the virtual image plane 150 be at a finite distance. This is because it is sufficient to display, in the pixels 111, an image corresponding to each of the microlenses 121, so the number of pixels 111 required in the pixel array 110 is reduced compared with the case where the virtual image plane 150 is at infinity. Hereinafter, the relationship between the virtual image surface 150, the display surface 115, the lens surface 125, and the plane 205 including the observer's pupil, their mutual distances, and the arrangement pitch of the plurality of pixels 111 in the pixel array 110 will be described with reference to FIG. 9.
- θc represents the minimum resolution angle.
- H is the distance between the point 150P1 and the point 150P2 on the virtual image plane 150.
- B is the distance between the pupil of the observer and the lens surface 125.
- C is the distance between the lens surface 125 and the display surface 115.
- D is the distance between the lens surface 125 and the virtual image surface 150.
- H = (B + D) × tanθc … (4)
- The arrangement pitch P_LC is expressed by the following conditional expression (5): P_LC ≤ H × C / D … (5)
- The arrangement pitch P_LC is the minimum spacing of the pixels 111 visible to the observer.
- tanθc = tan(1/60°)
- the distance B = 150 mm
- the distance C = 20 mm.
- The required arrangement pitch P_LC is larger when the position of the virtual image plane 150 with respect to the pixel array 110 is set to the minimum necessary finite distance than when it is set to infinity. The required fineness of the pixel array can therefore be relaxed, which reduces the cost of the device. It also reduces the number of pixels 111 required in the pixel array 110 for an equal display capacity, so the cost of the pixel array and its drive circuit can be reduced. Note that the actual distance D may be set to any distance sufficient for vision compensation to be achieved (i.e., for the observer to be able to focus).
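The numbers above can be combined into a short check. The relation P_LC = H × C / D used here is a reconstruction (a pixel at distance C behind the lens surface is magnified onto the virtual image plane at distance D by roughly D/C, with H = (B + D)·tanθc from expression (4)), and the 350 mm virtual image depth is an illustrative assumption, so treat this as a sketch rather than the patent's exact computation.

```python
import math

def min_visible_pitch(b_mm: float, c_mm: float, d_mm: float,
                      theta_c_deg: float = 1.0 / 60.0) -> float:
    """Pixel pitch at the observer's resolution limit, assuming
    P_LC = H * C / D with H = (B + D) * tan(theta_c)."""
    h = (b_mm + d_mm) * math.tan(math.radians(theta_c_deg))
    return h * c_mm / d_mm

B, C = 150.0, 20.0                               # distances from the text, in mm
finite = min_visible_pitch(B, C, d_mm=350.0)     # virtual image 350 mm away (assumed)
far = min_visible_pitch(B, C, d_mm=1e9)          # approximates D at infinity

# The finite-distance pitch comes out larger, so a coarser pixel array suffices.
print(f"finite D: {finite * 1000:.1f} um, D -> infinity: {far * 1000:.1f} um")
```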
- The control unit 130 controls the plurality of lights emitted from the plurality of pixels 111 such that they are viewed as one continuous, integrated image through the plurality of microlenses 121. That is, the group of lights emitted from the pixel array 110 and reaching the observer's eyes via the microlens array 120 is recognized by the observer as a single image.
- The control unit 130 generates a virtual image at a position different from the display surface by causing the plurality of lights from the plurality of pixels 111 to be viewed as a continuous, integrated image through the plurality of microlenses 121, and controls the display operation of the pixel array 110 so that the position of the binocular convergence plane in the direction orthogonal to the display surface can be set arbitrarily, independently of the position of the virtual image.
- the control unit 130 controls the display operation of the pixel array 110 so that, for example, the position of the binocular convergence surface becomes substantially equal to the position of the virtual image.
- the convergence position Pc is on the display surface 115, not on the virtual image surface 150.
- When the convergence position Pc and the adjustment position Pf (the position of the virtual image plane 150) differ in the Z-axis direction, eye strain of the observer may be caused; it is also said that the convergence function may cause fatigue when the convergence distance itself is small.
- When the convergence position Pc and the adjustment position Pf are different as shown in FIG. 10, and the direction in which the observer's right eye 211R and left eye 211L are aligned is not parallel to the direction in which the microlenses 121 are arranged in the microlens array 120, the image recognized by the observer may become a double image, which may make it difficult to identify characters and the like.
- In the case shown in FIG. 11, a good composite image 160C is obtained. That is, one good composite image 160C, in which the image 160L recognized by the left eye 211L (indicated by a solid line in FIG. 11) and the image 160R recognized by the right eye 211R (indicated by a solid line in FIG. 11) match exactly, can be formed in the observer's brain.
- In FIG. 12, the left-right direction 211H connecting the left eye 211L and the right eye 211R is inclined, in a plane parallel to the display surface 115, with respect to the horizontal direction (X-axis direction) of the microlens array 120, that is, the direction in which the microlenses 121 are arranged. In this case, the composite image 160CC is recognized as a double image by the observer, because the image 160LL recognized by the left eye 211L (indicated by a broken line in FIG. 12) and the image 160RR recognized by the right eye 211R (indicated by a broken line in FIG. 12) do not coincide.
- In the example shown in FIG. 13, the repetition period Λ is larger than PD (Λ > PD).
- When a straight line 211H (indicated by a solid line) passing through the centers of both eyes of the observer coincides with the horizontal direction (X-axis direction) 115H of the display surface 115, a good composite image 160C is obtained, as shown in the upper part of FIG. 13. That is, one good composite image 160C, in which the image 160L recognized by the left eye 211L (indicated by a solid line in FIG. 13) and the image 160R recognized by the right eye 211R (indicated by a solid line in FIG. 13) match exactly, can be formed in the observer's brain.
- This is because the left eye 211L and the right eye 211R are in a space in which a continuous light beam state is reproduced through the same microlenses, so the convergence position Pc and the adjustment position Pf coincide on the virtual image plane 150. For this reason, as shown in the lower part of FIG. 13, even when the observer tilts the face, the display appears just as if the object were on the virtual image plane 150: although the image 160LL recognized by the left eye 211L (indicated by a broken line in FIG. 13) and the image 160RR recognized by the right eye 211R (indicated by a broken line in FIG. 13) are inclined, no mismatch occurs in binocular vision.
- From the above, it can be seen that the problem described earlier (that the image recognized by the observer may become a double image, making characters and the like difficult to identify) occurs when the repetition period Λ is smaller than the inter-pupil distance PD, so that the two eyes observe substantially the same pixels 111 through different microlenses 121.
- On the other hand, making the repetition period Λ larger than the inter-pupil distance PD can degrade the display image quality. This is because the microlenses 121 must then be made considerably larger, which lowers the resolution and also worsens the aberration of the lenses.
- Therefore, in the present embodiment, matching of the convergence and the adjustment is realized by inputting image information separately to the left and right eyes, instead of making the repetition period Λ larger than the inter-pupil distance PD. The details will be described below.
- FIG. 15A schematically illustrates the relationship between one continuous display area E having a repetition period ⁇ in the left and right direction, the corresponding microlenses 121A to 121E, and the pixel groups 113A to 113E of the pixel array 110 in the display device 10.
- The pixel groups 113A to 113E are each configured by a plurality of pixels 111, and correspond to the pixel groups 112 described above.
- FIG. 15A shows the appearance of light when the pixel array 110 is viewed monocularly. As shown in FIG. 15A, the pixel groups 113A to 113E are arranged in the lateral direction of the drawing so as to correspond to the respective microlenses 121A to 121E, which are likewise arranged in the lateral direction. That is, for example, the image light emitted from the pixel group 113A enters the continuous display area E through the microlens 121A, and the image light emitted from the pixel group 113B enters the continuous display area E through the microlens 121B. Similarly, the image light emitted from each of the pixel groups 113C to 113E is incident on the continuous display area E through the microlenses 121C to 121E, respectively.
- the positional relationship and size ratio of the continuous display area E, the microlenses 121A to 121E, and the pixel groups 113A to 113E illustrated in FIG. 15A are examples, and the present disclosure is not limited thereto.
- With this configuration, a normal virtual image can be viewed even when the observer's viewpoint (the position of the pupil PL or the pupil PR) with respect to the display surface 115 changes in a plane parallel to the display surface 115.
- the image light visually recognized from different viewpoint positions in one continuous display area E is emitted from different pixels 111 in each pixel group 113.
- In the continuous display area E, the image light viewed in the area on the right side of the center position is emitted from, for example, pixels 111 in the area on the left side of the center position of each pixel group 113. Conversely, the image light viewed in the area on the left side of the center position is emitted from, for example, pixels 111 in the area on the right side of the center position of each pixel group 113.
- Therefore, in the present embodiment, each pixel group 113 is divided into a right-eye pixel area 116 (116A to 116E) and a left-eye pixel area 117 (117A to 117E).
- The control unit 130 further causes a right-eye image to be displayed on the right-eye pixels in the right-eye pixel region 116 among the plurality of pixels 111, and a left-eye image to be displayed on the left-eye pixels in the left-eye pixel region 117. The user can then visually recognize the right-eye image displayed in the right-eye pixel region 116 in the right-eye image region ER of the continuous display region E, and the left-eye image displayed in the left-eye pixel region 117 in the left-eye image region EL of the continuous display region E. In this way, the images observed by the left and right eyes can be controlled independently, and hence convergence can be controlled freely.
- FIG. 15C is a conceptual diagram for explaining the relationship between the positions of the pupils PL and PR of the observer and the position of the continuous display area E.
- Λ is the arrangement pitch of the plurality of continuous display areas E arranged in the horizontal direction, and n is a natural number.
- In this case, each continuous display area E is bisected in the horizontal direction (X-axis direction), and the display of the pixel array 110 is controlled so that the left-eye image areas EL and the right-eye image areas ER corresponding to them are alternately arranged in the horizontal direction. In this way, it is possible to control the convergence of both eyes even when the repetition period Λ is smaller relative to PD than in the example of FIG. 15B, that is, in the case of Λ < PD.
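The alternating assignment of pixels to the two eyes can be sketched as follows. The half-split rule follows the bisection described above, and the mirrored mapping reflects the earlier observation that light seen on the right side of a continuous display area E comes from pixels on the left side of the pixel group; the function name and the choice of which half feeds which eye's region are illustrative assumptions.

```python
def eye_for_pixel(col: int, group_width: int) -> str:
    """Decide which eye's image a pixel column in a pixel group 113 displays.

    The viewing geometry is mirrored: light from the left half of a group is
    seen on the right half of the continuous display area E (assumed here to
    be the right-eye image region ER), and vice versa.
    """
    return "right" if col < group_width // 2 else "left"

group_width = 8  # pixels per group (illustrative)
layout = [eye_for_pixel(c, group_width) for c in range(group_width)]
print(layout)  # first half feeds ER, second half EL
```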
- As described above, the control unit 130 can control the pixel array 110 so that the convergence position Pc and the adjustment position Pf (the position of the virtual image plane 150) approach each other. Bringing the convergence position Pc and the adjustment position Pf close together reduces the possibility of eyestrain of the observer caused by a mismatch between the two.
- Therefore, with the display device 10 and the display control method using the same, the burden on the observer can be reduced while excellent visibility is secured.
- stereoscopic vision can be controlled simultaneously with vision compensation by arbitrarily controlling the convergence position.
- Even in this case, the light constituting the left-eye image is appropriately guided to the pupil PL of the left eye 211L by the constraints of the optical system described above, and the light constituting the appropriate right-eye image is guided to the pupil PR of the right eye 211R; since convergence and adjustment can be corrected simultaneously in the same manner, a normal virtual image can be viewed.
- In the present embodiment, the adjustment position Pf is at a finite distance. Therefore, the number of pixels 111 in the pixel array 110 can be reduced compared with the case where the adjustment position Pf is at infinity.
- The display device 10 is also preferable in that the matching of the convergence position Pc and the adjustment position Pf can be realized without any change in lens design, such as the dimensions of the microlenses 121.
- FIG. 16 is a diagram showing a configuration example when the display device 10 is applied to another mobile device.
- In the example of FIG. 16, the first housing 171 in which the pixel array 110 is mounted and the second housing 172 in which the microlens array 120 is mounted are connected to each other by a connection member 173 and thereby integrated.
- the first housing 171 corresponds to the main body of the mobile device, and a processing circuit or the like that controls the operation of the entire mobile device including the display device 10 may be mounted in the first housing 171.
- The connecting member 173 is a rod-shaped member provided with rotating shafts at both ends; as shown in FIG. 16, one rotating shaft is connected to the side surface of the first housing 171, and the other is connected to the side surface of the second housing 172.
- the first housing 171 and the second housing 172 are rotatably connected to each other by the connection member 173.
- A visual aid device in which a camera is provided on the front surface of a housing and in which information on a paper surface photographed by the camera is enlarged and displayed on a display screen provided on the back of the housing (hereinafter referred to as an electronic loupe device) is known.
- The user can read enlarged maps, characters, and the like on the display screen by placing the electronic loupe device on a paper surface, such as a map or newspaper, so that the camera faces the paper.
- the display device 10 according to the first embodiment can be suitably applied to such an electronic loupe device.
- FIG. 17 shows an example of a general electronic loupe device.
- the camera is mounted on the surface of the housing of the electronic loupe device 820.
- the electronic loupe device 820 is mounted on the paper surface 817 such that the camera faces the paper surface 817.
- the figures, characters and the like on the sheet 817 photographed by the camera are appropriately enlarged and displayed on the display screen on the back of the case of the electronic loupe device 820.
- a user who finds it difficult to read small figures or characters due to presbyopia or the like can more easily read information on the paper.
- However, a general electronic loupe device 820 as illustrated in FIG. 17 simply displays the photographed image at a predetermined magnification. Since the user needs to enlarge the display to the extent that it can be read without blurring, the number of characters (the amount of information) displayed at one time on the display screen decreases. Consequently, in order to read a wide range of information on the paper surface 817, the electronic loupe device 820 must be moved frequently over the paper surface 817.
- the display device 10 when the display device 10 is mounted on an electronic loupe device, for example, a configuration example in which a camera is mounted on the front surface of the housing and the display device 10 is mounted on the rear surface of the housing can be considered.
- In this case, an image including the information on the paper surface photographed by the camera may be displayed by the display device 10 mounted on the back surface of the housing.
- the above problem can be avoided by applying the display device 10 to an on-vehicle display device for displaying the driving support information as described above.
- The display device 10 can generate a virtual image behind (farther away than) the actual display surface (that is, the microlens array 120); for example, the generation position of the virtual image can be set sufficiently far away.
- Then, when the user, who is the driver, looks at the display device 10, various information can be displayed at the same distance as the view of the outside through the windshield. Therefore, even when the user alternately views the outside world and the driving support information on the on-vehicle display device 10, the time taken for refocusing can be shortened.
- the display device 10 can be suitably applied to an on-vehicle display device that displays driving support information.
- By applying the display device 10 to a vehicle-mounted display device, it is possible to fundamentally solve the safety problem caused by the driver's refocusing time described above.
- As described above, the display device 10 reproduces the light beam state from a virtual image assumed to exist at a predetermined position based on the virtual image position information, thereby providing the user with a display corresponding to that virtual image.
- The position at which the virtual image is generated (virtual image generation position) is appropriately set according to the user's visual acuity. For example, by setting the virtual image generation position to a focal position matched to the user's vision, an image can be displayed so as to compensate for the user's vision.
- However, a predetermined restriction exists when configuring the display device 10 in this way, and the degree of freedom in design is low.
- In the second embodiment, a technique will be described in which the user's visual acuity compensation is performed by a different method, using an apparatus configuration substantially the same as that of the display device 10 shown in FIG. 6.
- In the display device 10, the constituent members need to satisfy predetermined conditions.
- Specifically, the configurations and arrangement positions of the pixel array 110 and the microlens array 120 are determined according to the size of the sampling areas 207, the required resolution, the repetition period Λ, and the like.
- It is preferable that the size of the sampling area 207 be sufficiently small relative to the user's pupil diameter, specifically 0.6 mm or less.
- Between the size ds of the sampling area 207, the size dp of the pixels 111 of the pixel array 110, the distance B between the lens surface 125 of the microlens array 120 and the pupil, and the distance C between the lens surface 125 of the microlens array 120 and the display surface 115 of the pixel array 110, there is the relationship shown in the following equation (7).
- Therefore, the pixel size dp, the distance B, and the distance C can be determined according to the size ds of the sampling area 207 required for the display device 10 (hereinafter referred to as condition 1). Since a smaller size ds of the sampling area 207 is preferable, the pixel size dp, the distance B, and the distance C are determined, for example, such that the size ds of the sampling area 207 becomes smaller.
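Equation (7) itself did not survive extraction; by similar triangles through the lens center (a pixel of size dp at distance C behind the lens surface, projected onto the pupil plane at distance B), the relation would be ds = dp × B / C. The sketch below uses that reading as an assumption rather than the patent's exact equation.

```python
def sampling_area_size(dp_mm: float, b_mm: float, c_mm: float) -> float:
    """Sampling-area size ds on the pupil plane, assuming ds = dp * B / C
    (a similar-triangles reconstruction of equation (7))."""
    return dp_mm * b_mm / c_mm

# With B = 150 mm and C = 20 mm, keeping ds at or below the preferred
# 0.6 mm bound requires dp <= 0.6 * C / B = 0.08 mm.
dp_max = 0.6 * 20.0 / 150.0
print(f"max pixel size dp: {dp_max:.3f} mm")
assert sampling_area_size(dp_max, 150.0, 20.0) <= 0.6
```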
- each of the microlenses 121 of the microlens array 120 behaves as a pixel.
- the resolution of the display device 10 is determined by the pitch of the microlenses 121.
- Therefore, the pitch of the microlenses 121 can be determined according to the resolution required for the display device 10 (hereinafter referred to as condition 2). In general, a higher resolution is preferable, so the pitch of the microlenses 121 is required to be smaller, for example.
- The distance D is the distance from the microlens array 120 to the virtual image generation position and is also referred to as the virtual image depth. Therefore, the pixel size dp and the distance C can also be determined according to the resolution and the distance D required for the display device 10 (hereinafter referred to as condition 3).
- As described above, a larger repetition period Λ is preferable in order to provide the user with a normal view more stably. Therefore, the pitch of the microlenses 121, the distance B, and the distance C are determined, for example, such that the repetition period Λ becomes larger (hereinafter referred to as condition 4).
- The values related to the configuration and arrangement positions of the pixel array 110 and the microlens array 120, such as the pixel size dp, the distance D, the pitch of the microlenses 121, the distance B, and the distance C, may be appropriately determined so as to satisfy the conditions 1 to 4 required for the display device 10.
- However, the pixel size dp, the distance D, the pitch of the microlenses 121, the distance B, the distance C, and the like cannot be set independently of one another.
- For example, the resolution and the repetition period Λ required for the display device 10 are first determined from the viewpoint of product performance.
- the pitch of the microlenses 121 may be determined so as to satisfy the resolution required for the display device 10.
- the distance C can be determined based on the condition 4 so as to satisfy the repetition period ⁇ required for the display device 10.
- The distance B can be set, for example, as the distance at which the user generally observes the display device 10, so the design freedom for the distance B is small. Therefore, once the pitch of the microlenses 121 and the distance C are determined, the pixel size dp is determined based on condition 1 so as to satisfy the size ds of the sampling area 207 required for the display device 10. Consequently, when trying to reduce the size ds of the sampling area 207, the pixel size dp becomes relatively small accordingly.
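The design order described above (resolution → lens pitch, repetition period → C, sampling size → dp) can be chained in a short sketch. All numeric relations here reuse the similar-triangle assumptions Λ ≈ p × B / C and ds ≈ dp × B / C, which are readings of the text rather than the patent's exact expressions, and the input values are illustrative.

```python
def design_chain(p_mm: float, b_mm: float, lambda_mm: float, ds_mm: float):
    """Given lens pitch p (from the required resolution), viewing distance B,
    required repetition period Lambda, and required sampling size ds,
    derive the distance C and the pixel size dp."""
    c_mm = p_mm * b_mm / lambda_mm   # condition 4: Lambda ~ p * B / C
    dp_mm = ds_mm * c_mm / b_mm      # condition 1: ds ~ dp * B / C
    return c_mm, dp_mm

c, dp = design_chain(p_mm=0.5, b_mm=150.0, lambda_mm=10.0, ds_mm=0.6)
print(f"C = {c:.1f} mm, dp = {dp * 1000:.0f} um")
```

Note how shrinking the required ds directly shrinks dp, which is exactly the pressure on pixel fineness that motivates the second embodiment.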
- In view of this, the present applicant examined a technology capable of performing vision compensation with a device configuration substantially the same as that of the display device 10 while maintaining the pixel size dp at a practical size.
- In the first embodiment, a virtual image of the image on the display surface of the pixel array 110 is generated at an arbitrary position by appropriately driving each pixel 111 of the pixel array 110 and thereby controlling the light beam state.
- On the other hand, a convex lens generally has the function of generating, at a predetermined position, a virtual image of an object enlarged at a magnification determined by the distance between the convex lens and the object and by the focal length f of the lens. If the user is allowed to observe a virtual image optically generated by such a convex lens, it should be possible to realize vision compensation for a user who has, for example, presbyopia.
- each microlens 121 can function as a magnifying glass similar to the above-described convex lens 821. That is, each of the microlenses 121 can cause the user who observes the object through the microlens 121 to observe a virtual image in which the object is enlarged.
- Therefore, by configuring the display so that virtual images of the display of the pixel array 110 are generated by the respective microlenses 121 of the microlens array 120 (that is, by arranging the pixel array 110 closer to the microlenses 121 than the focal length of each microlens 121), it is possible to provide the user with an enlarged, resolved image (that is, a virtual image) without performing light beam reproduction. At this time, if the size of the image displayed on the pixel array 110 is adjusted in consideration of the magnification of each microlens 121 as described above, the amount of information provided to the user does not decrease.
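The magnifier behavior of a single microlens can be checked with the thin-lens equation; this is textbook optics rather than text from the patent, and the numeric values are illustrative.

```python
def virtual_image(c_mm: float, f_mm: float):
    """Thin-lens sketch: a display at distance c in front of a convex lens of
    focal length f, with c < f, yields a virtual image at v = f*c/(f - c) on
    the display side of the lens, magnified by m = f/(f - c)."""
    if c_mm >= f_mm:
        raise ValueError("a virtual image requires the display inside the focal length")
    v_mm = f_mm * c_mm / (f_mm - c_mm)
    m = f_mm / (f_mm - c_mm)
    return v_mm, m

v, m = virtual_image(c_mm=20.0, f_mm=25.0)   # illustrative values
print(f"virtual image {v:.0f} mm behind the lens plane, magnification {m:.1f}x")
```

With these values the virtual image forms 100 mm behind the lens plane at 5× magnification, illustrating why the virtual image depth is fixed "in hardware" by C and f in this embodiment.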
- the display device 10 illustrated in FIG. 6 can be regarded as having a plurality of lenses (that is, microlenses 121) arranged on the display surface side of the pixel array 110.
- Each microlens 121 does not have to have a large angle of view to cover the entire display surface of the pixel array 110, and thus can be formed as a convex lens of a realistic size.
- However, the display in the pixel array 110 must be controlled appropriately. That is, each pixel 111 of the pixel array 110 is driven so that light rays are emitted toward the user's pupil such that the images viewed by the user through the respective microlenses 121 of the microlens array 120 form a continuous, integrated display; in other words, the driving of each pixel 111 may be controlled so that the user can observe the virtual images formed by the microlenses 121 as one continuous, integrated image.
- the position of the virtual image in the image processing is adjusted to be equal to the virtual image generation position determined from the hardware configuration of the microlens 121.
- Thereby, the image divided among the microlenses 121 is provided to the user as one continuous image.
- Since the virtual image is generated optically by the microlenses 121, it is not necessary to make the sampling areas 207 small for vision compensation. Therefore, condition 1 above need not be considered.
- In addition, since the resolution of the display device 10 can be determined according to the magnification of the microlenses 121 rather than their pitch, condition 2 above need not be considered either.
- Therefore, a general-purpose display pixel array can be used, and the display device can be configured without increasing the manufacturing cost.
- In the second embodiment, the virtual image generation position is determined in a so-called hardware manner, according to the distance between the microlenses 121 and the display surface of the pixel array 110 (that is, the distance C) and the focal length f of the microlenses 121. Further, unlike the first embodiment, the virtual image can be generated only on the side of the pixel array opposite to the microlenses. Therefore, in the second embodiment, although there is the advantage that the pixel size dp does not need to be reduced, the virtual image depth D cannot be changed arbitrarily by changing only the driving of the pixel array 110. The first embodiment and the second embodiment may therefore be selected according to the application.
- FIG. 18 is a view showing an example of the configuration of a display device according to the second embodiment.
- As shown in FIG. 18, the display device 40 includes a pixel array 110 in which a plurality of pixels 111 are two-dimensionally arranged, a microlens array 120 provided on the display surface 115 side of the pixel array 110, and a control unit 430 that controls the driving of each pixel 111 of the pixel array 110.
- the configuration of each of the pixel array 110 and the microlens array 120 is the same as the configuration of these members in the display device 10 shown in FIG. 6, and thus the detailed description thereof is omitted here.
- In the display device 10 according to the first embodiment, the distance between the pixel array 110 and the microlens array 120 is set to be longer than the focal length of each microlens 121 of the microlens array 120. In the display device 40, by contrast, the pixel array 110 is arranged such that the distance between the pixel array 110 and the microlens array 120 is shorter than the focal length of each microlens 121 of the microlens array 120.
- In the display device 10, the pixel array 110 and the microlens array 120 need to be designed to satisfy all of the conditions 1 to 4 described above; therefore, the pixel size dp and the pitch of the microlenses 121 tend to be relatively small.
- In the display device 40, on the other hand, conditions 1 and 2 among the conditions 1 to 4 need not be considered. Therefore, the pixel size dp may be larger than in the first embodiment, and may be equivalent to, for example, that of general-purpose displays in wide use.
- In the display device 40, the pixel array 110 and the microlens array 120 are designed to satisfy the conditions 3 and 4. That is, in the display device 40, the pixel size dp, the distance D, and the distance C may be set so as to satisfy a predetermined resolution.
- the control unit 430 includes, for example, a processor such as a CPU or a DSP, and controls driving of each pixel 111 of the pixel array 110 by operating according to a predetermined program.
- the control unit 430 has a light beam information generation unit 431 and a pixel drive unit 432 as its function.
- The functions of the light beam information generation unit 431 and the pixel drive unit 432 correspond to the functions of the light beam information generation unit 131 and the pixel drive unit 132 in the display device 10 illustrated in FIG. 6, with some of those functions changed.
- description of the control unit 430 that is the same as the control unit 130 of the display device 10 will be omitted, and differences from the control unit 130 will be mainly described.
- the light beam information generation unit 431 generates light beam information for driving each pixel 111 of the pixel array 110 based on the image information and the virtual image position information.
- the image information is two-dimensional image information presented to the user, as in the first embodiment.
- However, the virtual image position information is not set arbitrarily as in the first embodiment; it is information on the predetermined virtual image generation position determined according to the distance C and the focal length of each microlens 121 of the microlens array 120.
- Specifically, the light beam information generation unit 431 generates, as the light beam information, information indicating the light beam state in which the images visually recognized through the respective microlenses 121 of the microlens array 120 become a continuous, integrated display based on the image information.
- based on the virtual image position information, the light beam information generation unit 431 generates the light beam information so that the virtual image associated with this continuous, integrated display coincides with the virtual image generation position determined by the positional relationship between the pixel array 110 and the microlens array 120 and by the optical characteristics of the microlenses 121.
- the light beam information generation unit 431 may appropriately adjust the light beam information in consideration of the magnification of the microlens 121 so that the size of the image finally observed by the user becomes an appropriate size.
- the light beam information generation unit 431 provides the generated light beam information to the pixel drive unit 432.
- the image information and virtual image position information may be transmitted from another device, or may be stored in advance in a storage device (not shown) provided in the display device 40.
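The role of the light beam information generation unit 431 can be illustrated with a simple thin-lens model. The sketch below is not the patented implementation: the function names, the thin-lens approximation, and the sample values of the focal length and the lens-pixel distance C are assumptions introduced only for illustration. With the pixel plane closer to a microlens 121 than its focal length, a magnified virtual image forms behind the display, and the resulting magnification indicates how much the generation unit would scale the source image so that the finally observed image has an appropriate size.

```python
def virtual_image_position(f_mm: float, c_mm: float) -> float:
    """Distance from the lens to the virtual image (thin-lens model).

    f_mm: focal length of a microlens 121 (assumed value).
    c_mm: lens-pixel distance C; must be smaller than f_mm so that a
          magnified virtual image is formed behind the display.
    """
    if c_mm >= f_mm:
        raise ValueError("C must be smaller than the focal length for a virtual image")
    # 1/v = 1/C - 1/f gives the virtual image distance; return its magnitude.
    return 1.0 / (1.0 / c_mm - 1.0 / f_mm)


def magnification(f_mm: float, c_mm: float) -> float:
    """Lateral magnification of the virtual image relative to the pixel plane."""
    return virtual_image_position(f_mm, c_mm) / c_mm
```

For example, with an assumed focal length of 3 mm and C = 2 mm, the virtual image forms 6 mm behind the lens with a magnification of 3, so the source image would be pre-scaled by roughly 1/3.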
- the pixel drive unit 432 drives each pixel 111 of the pixel array 110 based on the light beam information.
- when the pixel drive unit 432 drives each pixel 111 of the pixel array 110 based on the light beam information, the light emitted from each microlens 121 is controlled so that the image viewed through the microlens array 120 forms a continuous, integrated display. The user can thereby recognize the optical virtual images generated by the individual microlenses 121 as one continuous, integrated image.
- the configuration of the display device 40 according to the second embodiment has been described above.
- FIG. 20 is a flowchart showing an example of the processing procedure of the display control method according to the second embodiment.
- each process shown in FIG. 20 corresponds to a process executed by the control unit 430 described above.
- the virtual image position information is information on the position (virtual image generation position) at which the virtual image is generated on the display device 40.
- specifically, the virtual image position information is information on a predetermined virtual image generation position determined according to the lens-pixel distance C and the focal length of each microlens 121 of the microlens array 120.
- the image information is two-dimensional image information presented to the user.
- in step S201, information indicating a light beam state in which the image viewed through each microlens 121 of the microlens array 120 appears as a continuous, integrated display is generated as light beam information based on the image information.
- the light beam information may be generated so that the virtual image position associated with this continuous, integrated display matches the virtual image generation position determined by the positional relationship between the pixel array 110 and the microlens array 120 based on the virtual image position information and by the optical characteristics of the microlenses 121.
- in addition, the light beam information may be adjusted as appropriate, taking the magnification of the microlenses 121 into account, so that the image finally observed by the user has an appropriate size.
- each pixel is driven such that an image viewed through each of the microlenses 121 of the microlens array 120 can be displayed continuously and integrally based on the light beam information (step S202).
- the optical virtual image generated by each of the microlenses 121 is provided to the user as a continuous and integral image.
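The two steps of the flowchart can be summarized as a minimal pipeline. This is an illustrative skeleton only: the data structures and the pass-through mapping from image samples to drive values are assumptions, and a real implementation would remap pixels per microlens 121 for the determined virtual image generation position.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Tuple


@dataclass
class RayInfo:
    # Drive value per pixel index (x, y); a stand-in for the light beam
    # information of the method, whose exact format is not specified here.
    drive_values: Dict[Tuple[int, int], float]


def generate_ray_info(image, virtual_image_pos_mm: float) -> RayInfo:
    """Step S201 (sketch): derive per-pixel drive values from the 2D image.

    Here each pixel simply copies the image sample; the virtual image
    position argument is carried only to show where the remapping for
    the virtual image plane would take place.
    """
    values = {(x, y): image[y][x]
              for y in range(len(image))
              for x in range(len(image[0]))}
    return RayInfo(drive_values=values)


def drive_pixels(ray_info: RayInfo,
                 set_pixel: Callable[[int, int, float], None]) -> None:
    """Step S202 (sketch): apply the drive values to the pixel array."""
    for (x, y), v in ray_info.drive_values.items():
        set_pixel(x, y, v)
```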
- in the display device 40 as well, it is preferable that the adjustment position Pf (the position of the virtual image plane 150) be at a finite distance.
- this makes it possible to increase the arrangement pitch of the pixels 111 compared with the case where the adjustment position Pf is at infinity, and consequently to reduce the number of pixels 111 in the pixel array 110. That is, as shown in FIG. 20, conditional expression (5) given above also holds in the display device 40. Therefore, the required arrangement pitch PLC is larger when the position of the virtual image plane 150 relative to the pixel array 110 is set to the minimum necessary finite distance than when it is set to infinity, so the number of pixels 111 in the pixel array 110 can be reduced.
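The tendency described here, that a finite virtual image distance permits a larger pixel pitch than an infinite one, can be checked with a rough geometric model. The formula below is an assumption for illustration (lateral magnification taken as dv/C and a fixed angular resolution of the eye), not conditional expression (5) itself.

```python
import math


def required_pixel_pitch(theta_rad: float, b_mm: float,
                         c_mm: float, dv_mm: float) -> float:
    """Upper bound on the pixel pitch for one virtual-image pixel to stay
    below the eye's angular resolution theta_rad (geometric sketch).

    b_mm : eye-to-lens-array distance B
    c_mm : lens-pixel distance C
    dv_mm: lens-to-virtual-image-plane distance (math.inf for infinity)

    Assumes lateral magnification m = dv/C, so a display pixel of pitch p
    appears with size p*m on the virtual image plane at distance B + dv.
    """
    if math.isinf(dv_mm):
        return theta_rad * c_mm  # limit of theta*(B+dv)*C/dv as dv -> inf
    return theta_rad * (b_mm + dv_mm) * c_mm / dv_mm
```

With an assumed resolution of about one arc minute, B = 300 mm, and C = 2 mm, the allowed pitch at dv = 600 mm is 1.5 times the allowed pitch at infinity, matching the trend stated above.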
- in the display device 40, the repetition period Λ likewise exists. If the repetition period Λ is smaller than the interpupillary distance PD, then, as described for the first embodiment, this is advantageous for reducing the size of the display device, but there are concerns such as eye fatigue and the risk of double images caused by the convergence position and the adjustment position differing. Conversely, if the interpupillary distance PD is smaller than the repetition period Λ, these concerns are eliminated, but large problems remain, such as increased aberration accompanying the larger numerical aperture of the lens array, and a realistic design is often difficult.
- in the second embodiment of the present disclosure, in order to solve the problems of both cases arising from the magnitude relationship between the interpupillary distance PD and the repetition period Λ simultaneously, matching of convergence and adjustment is realized by inputting image information separately to the left and right eyes. The details are described below.
- FIG. 21A schematically illustrates the relationship between one continuous display area E having a repetition period Λ in the left-right direction, the corresponding microlenses 121A to 121E, and the pixel groups 113A to 113E of the pixel array 110 in the display device 40.
- FIG. 21A shows the appearance of light when the pixel array 110 is viewed monocularly.
- in the display device 40, the relationship between the focal length of the microlenses 121A to 121E and the distance between the microlenses 121A to 121E and the pixel array 110 differs from that in the first embodiment; as shown in FIG. 21A, the continuous display area E is therefore narrower than Λ.
- in the second embodiment, a plurality of pixels 111 are observed through one microlens 121 when viewed from the same viewpoint. Therefore, near the boundaries of the pixel groups 113A to 113E, an incomplete whole image is observed, and the normal continuous display area E is narrowed. Except for this point, the overall configuration of the second embodiment is the same as that of the first embodiment, so detailed description of the common parts is omitted.
- the positional relationship and size ratio of the continuous display area E, the microlenses 121A to 121E, and the pixel groups 113A to 113E illustrated in FIG. 21A are examples, and the present disclosure is not limited thereto.
- also in the display device 40, a normal virtual image can be viewed even when the observer's viewpoint (the position of the pupil PL or the pupil PR) with respect to the display surface 115 changes in a plane parallel to the display surface 115.
- in the continuous display area E, the image light viewed in the area to the right of the center position is emitted, for example, from the pixels 111 in the area to the left of the center position of the pixel group 113. Conversely, the image light viewed in the area to the left of the center position is emitted, for example, from the pixels 111 in the area to the right of the center position of the pixel group 113.
- each pixel group 113 is accordingly divided into a right-eye pixel region 116 (116A to 116E) and a left-eye pixel region 117 (117A to 117E).
- if the control unit 430 then causes a right-eye image to be displayed on the right-eye pixels in the right-eye pixel region 116 among the plurality of pixels 111 and a left-eye image to be displayed on the left-eye pixels in the left-eye pixel region 117, the user views the right-eye image displayed in the right-eye pixel region 116 within the right-eye image region ER of the continuous display region E, and views the left-eye image displayed in the left-eye pixel region 117 within the left-eye image region EL. In this way, the images observed by the left and right eyes can be controlled independently, so convergence can be controlled freely.
- as shown in FIG. 15C, the display device 40 is configured to substantially satisfy conditional expression (6) given in the description of the first embodiment, where Λ is the arrangement pitch of the plurality of continuous display areas E arranged in the horizontal direction and n is a natural number.
- each continuous display area E is bisected in the horizontal direction (X-axis direction), and the display of the pixel array 110 is controlled so that left-eye image areas EL and the corresponding right-eye image areas ER are arranged alternately in the horizontal direction. In this way, the convergence of both eyes can be controlled even when the repetition period Λ is equal to or less than the interpupillary distance PD, that is, when Λ ≤ PD.
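The alternating bisection of the continuous display areas can be sketched as follows. The phase convention, the function names, and the assumed form PD ≈ (n + 1/2)·Λ for expression (6) are illustrative guesses introduced here, not taken from the patent text.

```python
def image_region(x_mm: float, lam_mm: float, left_first: bool = True) -> str:
    """Return which half of the periodic pattern a horizontal position falls in.

    Each continuous display area E of width lam_mm is bisected into a
    left-eye image area EL and a right-eye image area ER, arranged
    alternately along the X axis. Which half comes first is a free
    choice in this sketch, controlled by left_first.
    """
    phase = (x_mm % lam_mm) / lam_mm
    first, second = ("EL", "ER") if left_first else ("ER", "EL")
    return first if phase < 0.5 else second


def pupils_in_opposite_halves(pd_mm: float, lam_mm: float) -> bool:
    """Check whether two pupils separated by pd_mm land in different halves,
    as intended when PD is close to (n + 1/2) * lam (assumed form of
    conditional expression (6))."""
    return image_region(0.0, lam_mm) != image_region(pd_mm, lam_mm)
```

For example, with Λ = 10 mm, a pupil separation of 65 mm (= 6.5 periods) places the two pupils in opposite halves, while 60 mm (an integer number of periods) does not.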
- as described above, the control unit 430 can control the pixel array 110 so that the convergence position Pc and the adjustment position Pf become equal. Therefore, even when the observer tilts his or her face with respect to the display surface, for example, the image viewed by the observer is prevented from becoming a double image. Further, since the convergence position Pc and the adjustment position Pf (the position of the virtual image plane 150) can be made equal in the Z-axis direction, the possibility of eyestrain caused by a mismatch between the convergence position Pc and the adjustment position Pf is alleviated. Furthermore, eyestrain due to convergence itself in close vision can be avoided. Therefore, according to the display device 40 and the display control method using it, the burden on the observer can be reduced while excellent visibility is secured. In addition, by arbitrarily controlling the convergence position, stereoscopic display can be controlled simultaneously with vision compensation.
- a pupil position detection unit 231 that detects the position of the pupil of the right eye and the position of the pupil of the left eye may be further provided.
- the display device 20 has the same configuration as the display device 10 shown in FIG. 6 except that the control unit 230 is provided instead of the control unit 130.
- the control unit 230 has a light beam information generation unit 131, a pixel drive unit 132, and a pupil position detection unit 231 as its functional parts.
- the pupil position detection unit 231 detects the positions of the pupils PL and PR of the observer.
- any known method used in general pupil position detection technology may be applied, for example.
- for example, the display device 20 is provided with an imaging device capable of capturing the observer's face, and the pupil position detection unit 231 can detect the positions of the observer's pupils PL and PR by analyzing the captured image acquired by the imaging device using a known image analysis method.
- the pupil position detection unit 231 provides the light beam information generation unit 131 with information about the detected positions of the pupils PL and PR of the observer.
- based on the information about the positions of the observer's pupils PL and PR from the pupil position detection unit 231, the light beam information generation unit 131 generates light beam information so that the pupil PL is located in the left-eye image region EL (FIG. 15C) and the pupil PR is located in the right-eye image region ER (FIG. 15C).
- the pixel drive unit 132 then controls the pixel array 110 so that the light for the right eye and the light for the left eye are emitted optimally according to the position of the pupil PL of the left eye 211L and the position of the pupil PR of the right eye 211R detected by the pupil position detection unit 231.
- in this way, by using the pupil position detection unit 231 to appropriately feed back information about the positions of the observer's pupils PL and PR to the light beam information generation unit 131, the likelihood that the observer can obtain a favorable virtual image can be increased.
- alternatively, the pupil position detection unit 231 may be used as a distance measuring device: the distance (B + C) between the observer's pupils PR and PL and the pixel array 110 may be measured as appropriate and fed back to the control unit 230 to adjust the position of the virtual image plane 150.
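One conceivable feedback rule based on the detected pupil positions is sketched below. It is a toy rule introduced for illustration, not the patented control: it shifts the periodic EL/ER pattern so that the detected left pupil sits at the center of a left-eye image area (taken here as phase 0.25 of the period).

```python
def pattern_shift_for_tracking(pl_x: float, lam: float) -> float:
    """Lateral shift (same units as pl_x) to apply to the periodic EL/ER
    pattern so that the detected left pupil at position pl_x lands at the
    center of a left-eye image area EL, assumed to be phase 0.25 of the
    repetition period lam. A toy feedback rule for illustration only."""
    phase = (pl_x % lam) / lam
    # Smallest shift moving the pattern so the pupil phase becomes 0.25.
    delta = (0.25 - phase) * lam
    # Wrap into the interval (-lam/2, lam/2].
    while delta <= -lam / 2:
        delta += lam
    while delta > lam / 2:
        delta -= lam
    return delta
```

A pupil already centered in EL needs no shift; a pupil at the EL/ER boundary is corrected by a quarter period.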
- the arrangement positions and shapes of the pixel array 110, the microlens array 120, the control units 130, 230, and 430, and the like described in the above embodiments are merely examples, and the present disclosure is not limited to these.
- the display device described in the above-described embodiment and the like is not limited to the case where all the components described above are included, and may further include other components.
- applications of the display device described in the above embodiments are not limited to the portable electronic devices described above, such as smartphones and tablet-type terminals; they also include in-vehicle devices, televisions used in homes, and devices that display information indoors and outdoors. The display device is further applicable to various medical devices (for example, an endoscopic surgery system, an operating room system, or a microsurgery system).
- a pixel array having a display surface in which a plurality of pixels are arranged at a first pitch;
- a lens array having a plurality of lenses arranged to face the display surface and arranged at a second pitch greater than the first pitch along the extension direction of the display surface;
- a virtual image is generated at a position different from the display surface by causing a plurality of lights from the plurality of pixels to be viewed as a continuous integral image through the plurality of lenses; and
- a control unit configured to control the display operation of the pixel array so as to arbitrarily control the position of the convergence plane in the direction orthogonal to the display surface, independently of the position of the virtual image.
- a pixel array having a display surface in which a plurality of pixels are arranged at a first pitch;
- a lens array having a plurality of lenses arranged to face the display surface and arranged at a second pitch greater than the first pitch along the extension direction of the display surface;
- a virtual image is generated at a position different from the display surface by causing a plurality of lights from the plurality of pixels to be viewed as a continuous integral image through the plurality of lenses; and
- a control unit that controls the display operation of the pixel array such that, in the direction orthogonal to the display surface, the position of the binocular convergence plane is substantially equal to the position of the virtual image.
- the control unit divides a pixel area corresponding to each of the plurality of lenses on the display surface of the pixel array into a right-eye pixel area and a left-eye pixel area,
- displays the right-eye image on the right-eye pixels in the right-eye pixel area, and
- displays the left-eye image on the left-eye pixels in the left-eye pixel area.
- the position of the virtual image is at a finite distance,
- controlling the display operation of a pixel array having a display surface in which a plurality of pixels are arranged at a first pitch, so that a plurality of lights from the plurality of pixels are viewed as a continuous integral image through the plurality of lenses of a lens array in which a plurality of lenses are arranged, along the extension direction of the display surface, at a second pitch greater than the first pitch; dividing, on the display surface of the pixel array, the pixel area corresponding to each lens into a right-eye pixel area and a left-eye pixel area; displaying a right-eye image on the right-eye pixels in the right-eye pixel area among the plurality of pixels and a left-eye image on the left-eye pixels in the left-eye pixel area among the plurality of pixels; and arbitrarily controlling the convergence position, independently of the adjustment position, in the direction orthogonal to the display surface.
- controlling the display operation of a pixel array having a display surface in which a plurality of pixels are arranged at a first pitch, so that a plurality of lights from the plurality of pixels are viewed as a continuous integral image through the plurality of lenses of a lens array in which a plurality of lenses are arranged, along the extension direction of the display surface, at a second pitch greater than the first pitch; dividing, on the display surface of the pixel array, the pixel area corresponding to each lens into a right-eye pixel area and a left-eye pixel area; displaying a right-eye image on the right-eye pixels in the right-eye pixel area among the plurality of pixels and a left-eye image on the left-eye pixels in the left-eye pixel area among the plurality of pixels; and controlling the pixel array so that the convergence position and the adjustment position are substantially matched.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
The invention relates to a display device capable of ensuring better visibility while reducing the burden on a user. The display device according to the invention comprises a pixel array, a lens array, and a control unit. The pixel array comprises a display surface on which a plurality of pixels are arranged at a first pitch. The lens array is arranged to face the display surface and comprises a plurality of lenses arranged at a second pitch greater than the first pitch along the extension direction of the display surface. The control unit causes a plurality of light beams from the plurality of pixels to pass through the plurality of lenses so as to form a single continuous image, thereby generating a virtual image at a position different from that of the display surface, and controls the display operation of the pixel array so that the position of an eye convergence plane is controlled arbitrarily, independently of the position of the virtual image, in a direction orthogonal to the display surface.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019543476A JP7184042B2 (ja) | 2017-09-19 | 2018-08-10 | 表示装置および表示制御方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-178694 | 2017-09-19 | ||
JP2017178694 | 2017-09-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019058811A1 true WO2019058811A1 (fr) | 2019-03-28 |
Family
ID=65811134
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/030102 WO2019058811A1 (fr) | 2017-09-19 | 2018-08-10 | Dispositif d'affichage, et procédé de commande d'affichage |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7184042B2 (fr) |
WO (1) | WO2019058811A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115039405A (zh) * | 2020-12-22 | 2022-09-09 | 京东方科技集团股份有限公司 | 一种显示装置及其制备方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10178658A (ja) * | 1996-12-19 | 1998-06-30 | Olympus Optical Co Ltd | 立体画像表示装置 |
JP2003209858A (ja) * | 2002-01-17 | 2003-07-25 | Canon Inc | 立体画像生成方法及び記録媒体 |
JP2006262191A (ja) * | 2005-03-17 | 2006-09-28 | Victor Co Of Japan Ltd | 複数視点立体映像表示方法及び複数視点立体映像表示装置並びに複数視点立体映像表示プログラム |
WO2016038997A1 (fr) * | 2014-09-08 | 2016-03-17 | ソニー株式会社 | Dispositif d'affichage, procédé d'attaque de dispositif d'affichage et dispositif électronique |
WO2016072194A1 (fr) * | 2014-11-07 | 2016-05-12 | ソニー株式会社 | Dispositif d'affichage et procédé de commande d'affichage |
- 2018
- 2018-08-10 JP JP2019543476A patent/JP7184042B2/ja active Active
- 2018-08-10 WO PCT/JP2018/030102 patent/WO2019058811A1/fr active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JP7184042B2 (ja) | 2022-12-06 |
JPWO2019058811A1 (ja) | 2020-11-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10805598B2 (en) | Wearable 3D lightfield augmented reality display | |
JP7185331B2 (ja) | インテグラルイメージング方式のライトフィールドディスプレイ用にライトフィールド画像をレンダリングする方法 | |
JP6704349B2 (ja) | 表示装置及び表示方法 | |
JP3984907B2 (ja) | 画像観察システム | |
JP7185303B2 (ja) | インテグラルイメージングおよびリレー光学部品を用いたヘッドマウント・ライトフィールド・ディスプレイ | |
JP3151770B2 (ja) | 複眼式画像表示装置 | |
JP7182796B2 (ja) | インテグラルイメージングおよび導波路プリズムを用いたヘッドマウント・ライトフィールド・ディスプレイ | |
WO2019012385A1 (fr) | Systèmes de réalité virtuelle et de réalité augmentée avec correction dynamique de la vision | |
JP3979604B2 (ja) | ディスプレイ | |
US20060181767A1 (en) | Three-dimensional image observation microscope system | |
Lim et al. | Fatigue-free visual perception of high-density super-multiview augmented reality images | |
JP7184042B2 (ja) | 表示装置および表示制御方法 | |
Zabels et al. | Integrated head-mounted display system based on a multi-planar architecture | |
KR101746719B1 (ko) | 디스플레이패널과 개별 렌즈간의 거리를 각각 달리하는 렌즈어레이를 이용한 3차원 영상 출력 방법 | |
JPWO2018101170A1 (ja) | 表示装置、及び、電子ミラー | |
JP7127415B2 (ja) | 虚像表示装置 | |
JP2011133672A (ja) | 表示装置および表示方法 | |
Huang et al. | Design of an optical see-through multi-focal-plane stereoscopic 3D display with eye-tracking ability | |
JP3825414B2 (ja) | 三次元表示装置 | |
Hua | Optical methods for enabling focus cues in head-mounted displays for virtual and augmented reality | |
US20180234670A1 (en) | Display device | |
KR101746717B1 (ko) | 초점이 각각 다른 렌즈어레이를 이용한 3차원 영상 출력 방법 | |
JP7437934B2 (ja) | ヘッドアップディスプレイ装置 | |
CN111650754B (zh) | 一种平视显示设备 | |
CN118363178A (zh) | 一种抬头显示设备的标定方法、装置、电子设备及存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18859791 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2019543476 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 18859791 Country of ref document: EP Kind code of ref document: A1 |