US9762870B2 - Image processing device and image display apparatus - Google Patents
- Publication number
- US9762870B2 (Application US 15/010,677)
- Authority
- US
- United States
- Prior art keywords
- image data
- information
- luminance
- correction
- filter
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
- G02B27/0172 — Head-up displays; head mounted, characterised by optical features
- G06K9/4661
- G06T5/00 — Image enhancement or restoration
- G09G3/001 — Control arrangements or circuits for visual indicators using specific devices, e.g. projection systems
- G09G3/003 — Control arrangements or circuits for visual indicators using specific devices, to produce spatial visual effects
- H04N9/31 — Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3182 — Colour adjustment, e.g. white balance, shading or gamut
- H04N9/3194 — Testing thereof including sensor feedback
- G02B2027/014 — Head-up displays characterised by optical features comprising information/image processing systems
- G02B2027/0178 — Head mounted; eyeglass type
- G09G2320/0233 — Improving the luminance or brightness uniformity across the screen
- G09G2360/144 — Detecting light within display terminals, the light being ambient light
Definitions
- Embodiments described herein relate generally to an image processing device and image display apparatus.
- Image display apparatuses for displaying images to the eyes of a user have been proposed.
- the image display apparatus may be a head-mounted display.
- the image display apparatus may comprise a display unit, a projection unit, and a reflection unit.
- Light from a light source is incident on the display unit, and the display unit emits light including image information based on display image data.
- the direction of at least a part of the light rays of the light including image information is corrected by an optical component, and the projection unit emits the corrected light.
- the reflection unit reflects at least a part of the light rays of the corrected light toward an eye of a user.
- An image (an observation image) observed by a user may include blurring, color shift, and/or distortion due to an aberration caused by optical components or lens systems in the reflection unit, such as lenses or half mirrors.
- Image processing techniques to suppress this deterioration in image quality have been proposed.
- An inverse of the deterioration characteristics due to the aberration is applied to the image data to be input to the image display apparatus prior to display.
- an inverse filter having inverse characteristics relative to a space filter or frequency filter that expresses the blurring aberration in the observation image may be applied to the display image data. The user is thus able to observe an image in which the blurring is suppressed by the inverse filter.
- FIG. 1 is a diagram of an image display apparatus according to a first embodiment.
- FIG. 2 is a block diagram of the image display apparatus according to the first embodiment.
- FIG. 3 is a block diagram of a deblurring unit according to the first embodiment.
- FIG. 4 is a flow chart corresponding to FIG. 2 .
- FIG. 5 is a flow chart corresponding to FIG. 3 .
- FIG. 6 is a block diagram of the image display apparatus according to a second embodiment.
- FIG. 7 is a flow chart corresponding to FIG. 6 .
- FIG. 8 is a diagram of a structure of an image display apparatus.
- an image processing device includes a background luminance input unit, an image processing unit and a luminance setting unit.
- the background luminance input unit stores background luminance information regarding background luminance.
- the image processing unit generates display image data and contrast conversion information.
- the display image data is provided by modulating a contrast of a correction image data set so that pixel values included in the correction image data set are in an acceptable range.
- the correction image data set is provided by correcting blurring in an input image data set to be input to the image processing unit based on the background luminance information.
- the contrast conversion information is information used for modulating the contrast.
- the luminance setting unit generates luminance setting information to set a luminance of the projection unit based on the contrast conversion information.
- FIG. 1 is a diagram of an image display apparatus according to a first embodiment.
- an image display apparatus 100 comprises a circuit (image processing device) 110 , a projection unit 120 , and a reflection unit 130 .
- the circuit 110 is connected to an external memory medium or network, and receives input image data.
- the connection may be wired or wireless.
- the circuit 110 may be connected to the projection unit 120 electrically through a cable 115 and may transmit the display image data to a display 122 in the projection unit 120 .
- the projection unit 120 emits light including image information based on display setting information.
- the projection unit 120 may comprise a light source 121 , the display 122 , and a lens system 123 .
- the light source 121 may be a white LED (Light Emitting Diode), or may comprise a red LED, a green LED, and a blue LED.
- the display 122 may comprise a plurality of pixels arranged in a plane to display an image based on the display image data.
- the display 122 transmits light L 1 emitted from the light source 121 .
- Light including image information L 2 based on the display image data is emitted by the display 122 toward the lens system 123 .
- the light source 121 may not necessarily be needed in the case where the display 122 is a self-lighting type display.
- the display 122 may, for example, comprise a liquid crystal display, an organic EL (Electro Luminescence) display or LCOS (Liquid Crystal On Silicon) display.
- the lens system 123 is located between the display and the reflection unit 130 in a light path of the light including image information L 2 .
- the lens system 123 corrects a direction of at least a part of the light including image information to travel toward the reflection unit 130 .
- the light that travels from the lens system 123 toward the reflection unit 130 is defined as correction light L 3 .
- the lens system 123 includes at least one optical component.
- the optical component may be, for example, a lens, a prism, or a mirror. In the case where the lens system 123 includes a plurality of optical components, the optical components are not necessarily directly-aligned.
- the reflection unit 130 reflects at least a part of light emitted by the projection unit toward the eye 161 of a viewer. That is, the reflection unit 130 reflects at least a part of light rays included in the correction light L 3 towards the eye 161 , so the reflection unit 130 reflects at least a part of the light including image information toward the eye.
- the reflection unit 130 transmits a part of the incident light so that the user observes the background on which the image overlaps. At least a part of the light rays included in the correction light L 3 reflected by the reflection unit 130 is incident on the pupil 162 and is collected on the retina 163 by the lens and the cornea.
- a virtual image is formed on an extended line of the light rays included in the correction light L 3 , extended in the direction opposite to the traveling direction. In this way, the user 160 is able to view the observation image 170 .
- the relative position between the display 122 and a lens system 123 is fixed in the projection unit 120 .
- the relative position between the display 122 and the lens system 123 may not be fixed in the projection unit 120 .
- the distance and relative attitude between the display 122 and the lens system 123 may be adjusted by adjusting screws that fix the position of the display 122 and/or the lens system 123 on the projection unit.
- the distance between the pupils 162 and the observation image 170 and the size of the observation image 170 may be adjusted by adjusting the distance between the display 122 and the lens system 123 .
- the reflection unit 130 transmits a part of the light that is incident on the reflection unit 130 . In this manner, a user is able to see the view through the reflection unit 130 .
- the reflection unit 130 is provided along a first plane 131 .
- the reflection unit 130 may comprise a plurality of fine reflection planes arranged along the first plane 131 .
- the first plane 131 may be flat or curved.
- Each of the plurality of reflection planes may be implemented using a half mirror that reflects at least a part of the light that is incident on the reflection plane.
- the reflection unit 130 need not be implemented using a half mirror. Any member that transmits a part of the incident light and reflects or otherwise modifies a reflection angle of another part of the incident light may be used as the reflection unit 130 .
- Each of the plurality of reflection planes is inclined with respect to the first plane 131 , and there are gaps between the reflection planes along a direction perpendicular to the first plane 131 .
- the angle between each reflection plane and the first plane 131 is set based on the light axis of the projection unit 120 and the positional relationship between the eyes 161 and the observation image 170 . The reflection angle changes according to the angle between the reflection plane and the first plane 131 .
- the reflection unit 130 may be implemented by a plurality of reflection planes, e.g., forming a Fresnel mirror.
- the observation image 170 is displayed in front of a user 160 in this embodiment.
- the observation image 171 may alternatively be displayed at an edge of the range of the user's vision, so as not to block the user's view.
- the image display apparatus 100 may be implemented in a heads-up display.
- the image display apparatus 100 may be, for example, a spectacle type image display apparatus.
- the image display apparatus 100 may include a spectacle frame or holding part 140 .
- FIG. 1 shows image display apparatus 100 mounted on the head of a user 160 with the holding part 140 .
- the image display apparatus 100 may further include spectacle lenses 150 .
- the reflection units 130 may be included in the spectacle lenses 150 .
- the holding part 140 may include a nose pad 141 , bridge 142 , and temple 143 .
- the bridge 142 portion may connect spectacle lenses 150 .
- the holding part 140 may further include a rim to hold the spectacle lenses 150 .
- a non-refractive glass may be used as the spectacle lenses 150 .
- Each of the spectacle lenses 150 has a first surface 151 and a second surface 152 being apart from the first surface 151 .
- the reflection units 130 may be provided between the first surface 151 and the second surface 152 .
- the reflection units 130 may additionally or alternatively be provided on the first surface 151 or the second surface 152 .
- the relative positions between the nose pad 141 and the spectacle lens 150 may be fixed. That is, the relative positions between the reflection unit 130 and the spectacle lens 150 may be fixed.
- the spectacle lenses 150 may be held by the holding part 140 .
- the angle between the holding part 140 and the spectacle lens 150 may be adjustable.
- the projection unit 120 may be held by the holding part 140 .
- the position and attitude of the projection unit 120 may be adjustable.
- the nose pad 141 may be placed on the user's nose 165 and the temple 143 on the user's ear 164 .
- Relative positions of the holding part 140 , the spectacle lens 150 , and the reflection unit 130 may be decided based on the position of the nose 165 and the ear 164 of the user 160 .
- Relative positions of the holding portion 140 and the reflection unit 130 may be substantially fixed.
- An observation image 170 may include blurring, color shift, and distortion because of an aberration caused by the lens system 123 and a component in the reflection unit, such as a lens or a half mirror.
- the light including image information from the display 122 forms a group of light rays from each pixel in the display 122 .
- a light ray from a pixel spreads as the ray travels.
- the light ray spreads out in a cone shape.
- a part of a light ray passes through the lens system 123 , is reflected by the reflection unit 130 , is incident on the eye of the user, and is collected at one point on the retina 163 . In this case, the user is able to observe the observation image 170 without blurring.
- a part of a light ray may diverge from an ideal path because of an aberration.
- light rays from a plurality of pixels may overlap each other on the retina 163 , so that the user observes the observation image with blurring.
- an image processing technique may be applied in which an inverse filter, having an inverse characteristic relative to a space filter or frequency filter that expresses the deterioration process causing blurring in the observation image, is applied to the display image data.
- processing with the inverse filter may result in a pixel value that is out of the acceptable range for the image.
- an edge where a pixel value is near the border of the acceptable range is likely to include blurring.
- the acceptable range may be set for each image or each image data.
- Circuit 110 may set the acceptable range for the input image data, the deblurring image data, and the display image data.
- the acceptable range may be stored in an image processing unit described later. The acceptable range may be determined prior to processing by the image processing unit.
- a so-called Wiener filter may be applied to input image data of a projector preliminarily to correct blurring of a focus on a screen.
- a clipping process may be executed for the pixel value being out of the acceptable range or
- contrast of the image may be modulated so that the pixel values are in the acceptable range.
- the effect of deblurring may be suppressed, thus causing distortion of the image.
- the blurring may be corrected; however, the visibility of the observation image may be low due to a decrease in the contrast of the observation image relative to the background. This contrast refers to the ratio of the maximum luminance of the observation image 170 to the luminance of the background. The higher this contrast, the higher the visibility of the observation image 170 .
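The two remedies above, clipping and contrast modulation, can be contrasted with a small numeric sketch. The pixel values and function names are illustrative, not from the patent:

```python
import numpy as np

def clip_to_range(pixels, max_val):
    # Clipping: out-of-range values are simply truncated; edges where
    # the deblurring filter produced overshoot or undershoot are distorted.
    return np.clip(pixels, 0, max_val)

def modulate_contrast(pixels, max_val):
    # Contrast modulation: rescale linearly so every value fits the
    # acceptable range; the shape is preserved but contrast decreases.
    lo, hi = pixels.min(), pixels.max()
    return (pixels - lo) * max_val / (hi - lo)

# Deblurred data with undershoot (< 0) and overshoot (> 255).
corrected = np.array([-20.0, 10.0, 120.0, 260.0])
print(clip_to_range(corrected, 255))      # ends are flattened
print(modulate_contrast(corrected, 255))  # in range, lower contrast
```

Clipping preserves mid-range contrast but distorts the extremes; modulation preserves the signal shape at the cost of overall contrast, which is exactly the trade-off the embodiments address.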
- FIG. 2 is a block diagram of the image display apparatus according to the first embodiment.
- FIG. 3 is a block diagram of a deblurring unit according to the first embodiment.
- the circuit (image processing device) 110 comprises a background luminance input unit 210 , an image processing unit 220 , and a luminance correction unit (luminance setting unit) 230 .
- the background luminance input unit 210 , the image processing unit 220 , and the luminance correction unit (luminance setting unit) 230 may be included in the circuit 110 .
- the background luminance input unit 210 stores information (background information) regarding a luminance of a background on which the observation image 170 is overlapped and transmits the background information to the image processing unit 220 .
- the background luminance input unit 210 may acquire the background information.
- the image processing unit 220 comprises a deblurring unit 221 and a contrast conversion unit 222 .
- the deblurring unit 221 corrects the blurring in the input image data that is input to the circuit 110 and transmits the correction image data to the contrast conversion unit 222 .
- the input image data and the correction image data include pixel values.
- the contrast conversion unit 222 modulates a contrast of the pixel values in the correction image data so that the pixel values are in the acceptable range and transmits the modulated correction image data as a display image data to the display 122 of the projection unit 120 .
- the contrast conversion unit 222 also transmits information used as contrast conversion information to the luminance correction unit 230 .
- the luminance correction unit 230 transmits the luminance setting information to set the luminance of the projection unit 120 based on the contrast conversion information.
- the luminance correction unit 230 calculates luminance of the observation image 170 , as decreased by the modulating of the contrast of the pixel values in the correction image data, and transmits the result to the light source 121 of the projection unit 120 as the luminance correction information.
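One plausible way to quantify the compensation described above, assuming the contrast modulation is a linear rescaling of the pixel range; the function name and sample values are illustrative, not from the patent:

```python
def luminance_correction_factor(f_min, f_max, max_val):
    # Compressing the pixel range [f_min, f_max] into [0, max_val]
    # scales the displayed luminance by this contrast gain:
    gain = max_val / (f_max - f_min)
    # Raising the light-source luminance by the reciprocal restores
    # the brightness of the observation image.
    return 1.0 / gain

# Correction image data spanning [-20, 260] squeezed into [0, 255]:
print(luminance_correction_factor(-20.0, 260.0, 255.0))  # ≈ 1.098
```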
- the display 122 emits the light including image information to the lens system 123 by displaying the display image based on the display image data transmitted by the image processing unit 220 , and transmitting light from the light source 121 based on the luminance correction information.
- the lens system 123 changes a traveling direction of at least a part of the light including image information toward the reflection unit 130 .
- the deblurring unit 221 comprises filter image processing units 330 , 340 .
- the deblurring unit 221 may further comprise a filter selection unit 310 and a filter switching unit 320 .
- Each of the filter image processing units 330 , 340 applies a different filter to the input image data and transmits the correction image data. That is, the deblurring unit 221 comprises a filter set consisting of two filters.
- the filter selection unit 310 transmits filter information to select a filter image processing unit based on the background luminance information to the filter switching unit 320 .
- the filter switching unit 320 switches filters (the filter image processing units) based on the filter information.
- An example in which a filter is selected from a filter set consisting of two filters (filter image processing units) is described.
- a filter may be selected from a filter set consisting of three or more filters.
- FIGS. 2 and 3 are examples only, and it is to be understood that the disclosed apparatus may be implemented in other ways. For example, a part of each block may be implemented apart from the image display apparatus.
- FIG. 4 is a flow chart corresponding to the image display apparatus 100 in FIG. 2 .
- FIG. 5 is a flow chart corresponding to FIG. 3 . Processing in each of the blocks of the image display apparatus 100 is described with reference to FIGS. 2 through 5 .
- In step S 410 , the background luminance on which the observation image 170 is overlapped (the background luminance in front of the user 160 ) is input to the background luminance input unit 210 , and the background luminance information is transmitted to the image processing unit 220 .
- the background luminance may, for example, be measured by a sensor such as a camera.
- the background luminance may be measured every time the background luminance changes and the measurement result may be input to the background luminance input unit 210 .
- the background luminance may be preliminarily measured and input to the background luminance input unit 210 .
- the background luminance of each pattern may be preliminarily measured and input to the background luminance input unit 210 .
- the background luminance may be input manually or automatically.
- a user 160 may input the background luminance to the background luminance input unit 210 , which may be implemented with software or hardware.
- In step S 420 , the blurring in the input image data is corrected based on the background luminance by the deblurring unit 221 , and the correction image data is transmitted to the contrast conversion unit 222 .
- an image observed by a user may include blurring, color shift, and distortion because of an aberration caused by the lens system 123 and reflection unit 130 .
- the inverse filter, which has an inverse characteristic relative to the space filter or frequency filter that expresses the deterioration process causing blurring in the observation image, may be applied to the input image data.
- An energy function that expresses the difference between the observation image 170 and a desirable blurring-free image to be observed by the user is made smaller by the inverse filter. Therefore, a blurring-corrected observation image can be displayed to the user.
- a pixel value of the correction image data may be out of the acceptable range.
- if the contrast of the correction image data is modulated so that the pixel values are in the acceptable range, there is a problem that the contrast of the observation image is decreased.
- the decrease in a contrast of the observation image may be suppressed.
- an inverse filter that has strong effect for correcting blurring may be applied.
- the deterioration process that causes blurring in the observation image 170 may be expressed as a space filter.
- a spot diagram may be obtained by tracing a part of the light rays that are emitted from a pixel in the display 122 and are incident on the pupils 162 , and by plotting the points at which the observation image 170 intersects lines extended along those light rays from the reflection unit 130 , in the direction opposite to the traveling direction.
- the spot diagram is a space filter expressed based on the deterioration process.
- a space filter may be obtained by imaging the observation image 170 with a camera and estimating the deterioration process.
- the deterioration process may be expressed as below.
- y = Bx (1)
- y is a vector that expresses the observation image 170
- x is a vector that expresses a desirable image to be observed by a user that does not include blurring
- B is a matrix of a space filter that expresses the deterioration process.
- the inverse filter may, for example, be acquired by finding a correction image data to decrease the energy function J.
- J = ∥x − B{circumflex over (x)}∥^2 + λ∥C{circumflex over (x)}∥^2 (2)
- ⁇ circumflex over (x) ⁇ is a vector that expresses the correction image data
- C is a matrix that expresses a regularization operator
- λ is a regularization weight coefficient.
- Matrix C may, for example, be a Laplacian operator or operators.
- the first term in relation (2) expresses the energy of the difference between a desirable blurring-free image to be observed by the user and the observation image 170 .
- the second term in relation (2) is a regularization term for finding the correction image data stably.
- the smaller the regularization weight coefficient λ, the stronger the correction of the blurring in the observation image 170 , but also the bigger the overshoot or undershoot around the edges of the correction image data.
- the overshoot or undershoot may cause a pixel value in the correction image data to fall out of the acceptable range.
- the regularization weight coefficient λ may be, for example, selected based on an edge intensity of the image. The weaker the edge intensity is, the smaller the regularization weight coefficient λ may be.
- the inverse filter may, for example, be determined so as to make smaller an energy function that expresses a difference between predetermined image data and image data generated by convolution of the correction image data with the space filter that expresses the deterioration process causing blurring in the observation image.
- the inverse filter may be calculated by finding the correction image data to minimize the energy function J of the relation (2).
- I = (B^T B + λC^T C)^−1 B^T (3)
- I is a matrix that expresses the inverse filter.
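Relation (3) can be checked numerically on a small one-dimensional example. In this sketch the blur matrix B, the discrete Laplacian used for C, and the weight value are illustrative choices, not taken from the patent:

```python
import numpy as np

def inverse_filter_matrix(B, lam):
    # I = (B^T B + lam * C^T C)^(-1) B^T, with C a 1-D discrete
    # Laplacian serving as the regularization operator.
    n = B.shape[1]
    C = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
    return np.linalg.inv(B.T @ B + lam * (C.T @ C)) @ B.T

# Toy 1-D blur: each observed value mixes neighbouring pixels.
n = 5
B = 0.5 * np.eye(n) + 0.25 * (np.eye(n, k=1) + np.eye(n, k=-1))
x = np.array([0.0, 0.0, 1.0, 0.0, 0.0])  # sharp source image
y = B @ x                                 # blurred observation
I = inverse_filter_matrix(B, lam=1e-3)
x_hat = I @ y                             # deblurred estimate, close to x
print(np.round(x_hat, 2))
```

With a small weight the estimate recovers the sharp impulse; increasing the weight damps the correction, matching the trade-off described around relation (2).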
- the inverse filter may be, for example, a high-pass filter or a sharpening filter.
- a high-pass filter or a sharpening filter may be implemented, for example, using an unsharp mask.
- The deblurring in step S 420 will be described with reference to FIGS. 3 and 5 .
- the filter selection unit 310 transmits filter information to the filter switching unit 320 , so as to select a filter from a filter set consisting of two filters based on the background luminance information.
- the filter set may, for example, comprise an inverse filter A that has a strong deblurring effect (that is, a small value of λ in relation (2)) and an inverse filter B that has a weak deblurring effect (a large value of λ).
- the filter set may, for example, comprise high-pass filters that have different pass levels or sharpening filters that have different sharpening levels.
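The selection step can be sketched as a simple threshold on the background luminance. Which filter is preferred at which luminance level is not specified in this excerpt, so the mapping, the threshold value, and the names below are assumptions:

```python
def select_filter(background_luminance, threshold=100.0):
    # Hypothetical rule: against a dark background the strongly
    # deblurring filter A (small regularization weight) is selected;
    # against a bright background the weakly deblurring filter B
    # (large weight) is selected instead.
    if background_luminance < threshold:
        return "inverse_filter_A"
    return "inverse_filter_B"

print(select_filter(30.0))   # inverse_filter_A
print(select_filter(400.0))  # inverse_filter_B
```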
- In step S 520 , the filter switching unit 320 switches the filter to be applied to the input image data based on the filter information.
- the filter selected by the filter switching unit 320 is applied to the input image data.
- the way that the inverse filter is applied to the input image data may depend on the type of the inverse filter, e.g., whether the inverse filter is a space filter or a frequency filter.
- the selected filter may be applied directly to the pixel in the input image data.
- the selected filter may be, for example, applied to the transformed input image data, which is transformed in a given frequency range with a Fourier transform.
- the correction image data may be generated by multiplying at least one of the pixel values in the input image data by a weighting coefficient that expresses the inverse filter.
- the correction image data may be generated by multiplying transformed image data obtained by Fourier transform of the input image data by the inverse filter, and by transforming the result of the multiplication with an inverse Fourier transform.
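The frequency-domain path can be sketched as follows. The Gaussian blur model and the regularization weight ε are assumptions used to make the example self-contained:

```python
import numpy as np

def apply_inverse_filter_fft(image, blur_freq, eps=1e-2):
    """Apply a regularized inverse of `blur_freq` (the blur's frequency
    response, same shape as `image`) in the Fourier domain, then
    transform back with an inverse Fourier transform."""
    inv = np.conj(blur_freq) / (np.abs(blur_freq) ** 2 + eps)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * inv))

# Example: blur a random image with an assumed separable Gaussian response,
# then restore it with the regularized inverse.
rng = np.random.default_rng(0)
img = rng.random((32, 32))
gx = np.exp(-0.5 * (np.fft.fftfreq(32) * 6.0) ** 2)   # 1-D low-pass response
H = np.outer(gx, gx)                                  # 2-D frequency response
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * H))
restored = apply_inverse_filter_fft(blurred, H)
```

With ε → 0 this approaches a plain inverse filter; a larger ε behaves like the weaker filter in the filter set described above.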
- the inverse filter may be applied to a value in a color space that is linear with respect to the luminescence of the observation image 170 .
- a color space may be a CIE-RGB color system or a CIE-XYZ color system.
- the inverse filter may be applied to a value obtained by transforming a pixel value of the input image data in a color space that is linear to the luminescence of the observation image 170 . Then the value may be transformed in the original color space.
- the inverse filter may be applied to a value in a color space that is not linear to the luminescence of the observation image 170 as long as the effect of the deblurring does not decrease undesirably.
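The linearize-filter-re-encode path described above can be sketched as follows; the gamma value 2.2 and the 1-D 3-tap sharpening kernel are assumptions, not values from the patent:

```python
import numpy as np

def filter_in_linear_space(pixels, kernel, gamma=2.2):
    """Decode assumed gamma-encoded 8-bit pixel values into a space that is
    linear with respect to luminance, filter there, and re-encode."""
    linear = (pixels / 255.0) ** gamma            # to a luminance-linear space
    filtered = np.convolve(linear, kernel, mode="same")
    filtered = np.clip(filtered, 0.0, 1.0)        # stay in the valid range
    return (filtered ** (1.0 / gamma)) * 255.0    # back to the original space

# Usage: a simple sharpening kernel applied in the linear space.
sharpened = filter_in_linear_space(np.array([10.0, 10.0, 200.0, 10.0, 10.0]),
                                   np.array([-0.5, 2.0, -0.5]))
```

Filtering in the linear space avoids the deblurring error that gamma-encoded values would otherwise introduce, as the passage above notes.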
- the contrast conversion unit 222 transmits the display image data, which is obtained by modulating the contrast of the correction image data so that the pixel values of the correction image data are in the acceptable range, to the projection unit 120 .
- the contrast conversion unit 222 also transmits the contrast conversion information to the luminance correction unit 230 .
- the image processing unit 220 modulates the contrast based on the maximum value and the minimum value of the correction image data. For example, if the minimum value of the correction image data is out of the acceptable range, contrast conversion may be performed on the correction image data according to relation (4).
- the function f(x, y) is a pixel value of the coordinate (x, y) in the correction image data.
- the function g(x, y) is a pixel value of the coordinate (x, y) in the display image data.
- the value f max is the maximum value of the f(x, y).
- the value f min is the minimum value of the f(x, y).
- the value MAX is the maximum value in the acceptable range. For example, if an image has N-bit pixels, MAX is 2^N − 1 because the acceptable range is from 0 to 2^N − 1.
- contrast conversion to the correction image data may be performed according to relation (5).
- g(x, y) = ((MAX − f_min)/(f_max − f_min)) × (f(x, y) − f_min) + f_min (5)
- contrast conversion to the correction image data may be performed according to relation (6).
- Contrast conversion may be performed in other ways than the ways described above as long as pixel values in the modulated correction image data are within the acceptable range.
- contrast conversion may be performed based on an LUT (Look Up Table), which associates pixel values before the modulation with pixel values after the modulation.
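The linear mapping of relation (5) can be implemented directly. MAX = 255 (8-bit pixels) and the sample values are illustrative assumptions:

```python
import numpy as np

def contrast_convert(f, MAX=255.0):
    """Relation (5): keep f_min fixed and linearly compress the pixel
    values so that the maximum of the correction image data maps to MAX."""
    f_min, f_max = float(f.min()), float(f.max())
    return (MAX - f_min) / (f_max - f_min) * (f - f_min) + f_min

corrected = np.array([10.0, 100.0, 300.0])  # maximum exceeds the 8-bit range
display = contrast_convert(corrected)       # spans [10, 255] afterwards
```

The same mapping could be tabulated once into an LUT indexed by integer pixel value, as the passage above suggests.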
- the pixel values of the correction image data may lie outside of the acceptable range as long as visibility of the observation image does not decrease.
- the pixel values of the observation image that lie outside of the acceptable range may be clipped.
- contrast of the correction image may be modulated so that the minimum value is ⁇ 5 and the maximum value is 260.
- the pixel values that lie outside of the acceptable range may be clipped. If the number of pixels whose values are clipped is small, or the change of the pixel values caused by the clipping is small, the loss of deblurring effect and the distortion caused by the clipping are likewise small.
- contrast conversion may be performed with relation (4). In this case, visibility of the observation image will be improved because of improvement of contrast of the observation image.
- the contrast conversion information is information to indicate what kind of contrast conversion is performed.
- the contrast conversion information may include the maximum pixel value of the correction image data.
- the contrast conversion information may include information to indicate the amount of change of the maximum pixel value or the amount of change of the minimum pixel value of the correction image data.
- step S 440 the luminance correction unit 230 finds an amount of luminance correction to correct the luminance of the light source in the projection unit 120 so that the decrease in luminance of the observation image caused by the contrast modulation is canceled.
- the luminance correction unit 230 transmits the result as the luminance correction information to the projection unit 120 .
- an amount of luminance correction may be found with relation (7).
- L is a reference luminance value of the light source and ΔL is an amount of luminance correction.
- a relationship between a pixel value of the correction data and luminance of light including image information from the display 122 may be non-linear.
- a relation between a pixel value of the correction data and luminance of light including image information from the display 122 may have gamma characteristic as below.
- an amount of luminance correction may be calculated with the relation (9).
- ΔL = ((f_max/MAX)^γ − 1) × L (9)
- step S 450 the light source emits light that is corrected based on the luminance correction information toward the display 122 .
- when the luminance correction information has an amount of luminance correction expressed with relation (7) or (9), luminance of light from the light source 121 may be corrected according to relation (10).
- L_c = ΔL + L (10) where L_c is a luminance value of corrected light emitted from the light source 121 .
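Relations (9) and (10) can be sketched together. The reference luminance, f_max, and gamma values used below are illustrative assumptions:

```python
def corrected_luminance(L, f_max, MAX=255.0, gamma=2.2):
    """Compensate the light source for the luminance lost when pixel
    values were compressed from f_max down to MAX on a gamma display."""
    delta_L = ((f_max / MAX) ** gamma - 1.0) * L   # relation (9)
    return delta_L + L                             # relation (10): L_c = dL + L
```

With γ = 1 this reduces to the linear case of relation (7); with f_max > MAX the correction amount is positive, so the light source brightens exactly enough to cancel the contrast compression.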
- the luminance of the observation image 170 does not decrease. Therefore, the luminance and contrast of the background do not decrease, and the contrast of the observation image 170 after the correction increases with respect to the contrast of the observation image 170 before the correction. Therefore, visibility of the observation image 170 is improved.
- step S 460 when the display image data input from the image processing unit 220 is displayed on the display 122 , light including image information is emitted toward the lens system 123 .
- Light from the light source 121 whose luminance is corrected based on the luminance correction information, is transmitted to the display 122 .
- the reflection unit 130 reflects at least a part of light rays of the correction light toward the user's eyes 161 .
- the correction light L 3 is reflected by the reflection unit 130 so as to be incident on pupils 162 and is collected on the retina 163 by the lens and cornea so that a virtual image is formed in front of the user 160 .
- the reflection unit 130 also transmits a part of the light incident on the reflection unit 130 . Therefore the user 160 is able to observe the observation image 170 and outside world in front of the user.
- the user 160 is provided with an observation image 170 in which the blurring due to aberration caused by, for example, the lens system 123 and the reflection unit 130 has been corrected. Furthermore, in the case where a pixel value after the inverse filter is applied is out of the acceptable range, the corrected observation image is provided without decreasing the luminance and contrast of the background. The decrease of the contrast of the observation image is further suppressed.
- deterioration such as blurring, color shift, and distortion may occur.
- a user 160 observes an observation image that includes more blurring, color shift, and distortion than estimated to occur in the observation image.
- the color shift and distortion in the observation image may be corrected by distorting the image inversely with respect to the color shift and distortion caused by the aberration.
- FIG. 6 is a block diagram of an example of the image display apparatus according to the second embodiment.
- the image processing unit 220 of the image display apparatus 100 further comprises a distortion correction unit 710 .
- the image processing unit 220 comprises the deblurring unit 221 , the distortion correction unit 710 , and the contrast conversion unit 222 .
- the deblurring unit 221 corrects blurring in the image data input to the image display apparatus 100 and transmits the correction image data to the distortion correction unit 710 .
- the distortion correction unit 710 corrects distortion in the correction image data and transmits distortion correction image data to the contrast conversion unit 222 .
- the contrast conversion unit 222 transmits display image data to the projection unit 120 and transmits contrast conversion information to luminance correction unit 230 .
- the block diagram illustrated in FIG. 6 is exemplary and is not required for implementation. For example, a part of each block may be separated from the image display apparatus.
- the deblurring unit 221 and the distortion correction unit 710 may be substituted for each other.
- the distortion correction unit may correct distortion in the input image data and transmit distortion correction image data to the deblurring unit 221 .
- the deblurring unit 221 may correct blurring in the distortion correction image data and may transmit deblurring image data to the contrast conversion unit 222 .
- the contrast conversion unit 222 may transmit display image data in which the contrast of the deblurring image data is modulated to the projection unit 120 .
- the deblurring unit 221 and the distortion correction unit 710 may be substituted by a unit that is able to correct both blurring and distortion.
- the unit may correct blurring and distortion in the input image data and transmit correction image data to the contrast conversion unit 222 .
- the contrast conversion unit 222 may transmit display image data in which the contrast of the correction image data is modulated to the projection unit 120 .
- FIG. 7 is a flow chart to illustrate the image display apparatus 100 according to the second embodiment. Process in the image processing unit 220 will be described with FIGS. 6 and 7 .
- step S 810 the deblurring unit 221 corrects blurring in the input image data input to the image display apparatus 100 , based on the background luminance, and transmits the correction image data to the distortion correction unit 710 .
- the blurring may be corrected in the same manner as described above with respect to the first embodiment.
- step S 820 the distortion correcting unit 710 corrects distortion of the correction image data and transmits the distortion-corrected image data to the contrast conversion unit 222 .
- color shift and distortion due to aberration are expressed by an LUT indicating a relation between arbitrary pixels in the display 122 and the positions in the observation image corresponding to the pixels.
- the deblurring image data is distorted inversely to a color shift and distortion based on the LUT. Therefore, a user is able to observe the observation image that does not include color shift and distortion.
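The LUT-based inverse distortion can be sketched as follows. The nearest-neighbour sampling and the toy 2-pixel-shift LUT are assumptions for illustration:

```python
import numpy as np

def correct_distortion(image, lut_y, lut_x):
    """Inverse-warp `image` using an LUT that gives, for each display pixel
    (i, j), the source coordinate to show there, so the distortion added by
    the optics cancels out. Nearest-neighbour sampling for brevity."""
    h, w = image.shape
    ys = np.clip(np.rint(lut_y).astype(int), 0, h - 1)
    xs = np.clip(np.rint(lut_x).astype(int), 0, w - 1)
    return image[ys, xs]

# Toy LUT: suppose the optics shift the image 2 pixels left, so the LUT
# samples 2 pixels to the right to pre-compensate.
img = np.arange(16.0).reshape(4, 4)
ii, jj = np.meshgrid(np.arange(4), np.arange(4), indexing="ij")
out = correct_distortion(img, ii, jj + 2)
```

A per-channel LUT of the same form would correct color shift as well, since chromatic aberration displaces each color component differently.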
- the contrast conversion unit 222 transmits to the projection unit 120 the display image data, in which the contrast is modulated so that pixel values of the distortion correction image data are in the acceptable range.
- the contrast conversion unit 222 also transmits the contrast conversion information to the luminance correction unit 230 .
- the contrast of the correction image data may be modulated, for example, in the same manner as described above with respect to the first embodiment.
- the user 160 is provided with an observation image 170 in which blurring, color shift, and distortion due to aberration caused by the lens system 123 and the reflection unit 130 have been corrected. Furthermore, in the case in which a pixel value after an inverse filter is applied is out of the acceptable range, the corrected observation image is provided without decreasing the luminance and contrast of the background. The decrease of the contrast of the observation image is also suppressed.
- FIG. 8 is a diagram of an example of a structure of the image display apparatus.
- the circuit 110 includes, for example, an interface 610 , a processing circuit 620 , and a memory 630 .
- the circuit 110 may, for example, be connected to a recording medium outside of the image display apparatus, a network, or an image reproducer, and acquire the input image data through an interface.
- the circuit 110 may be connected to a recording medium, network, or an image reproducer by wire or by a wireless network.
- the processing circuit 620 may, for example, process the input image data, information acquired through the sensor 650 , and image information, based on the program 640 . For example, processes of the image processing unit 220 and the luminance correction unit 230 may be performed in the processing circuit 620 based on the program 640 .
- a program 640 to process acquired input image data may be stored in the memory 630 .
- the memory 630 may, for example, comprise a magnetically or optically recordable recording medium and may record the acquired input image data, information acquired through the sensor 650 , and image information.
- the memory 630 may record a program 640 to control the image display apparatus 100 or various setting information.
- the input image data may be transformed based on the program 640 and display image data may be generated.
- Image information such as the input image data and display image data may be stored in the memory 630 .
- the program 640 may be preliminarily stored in the memory 630 .
- the program 640 may be provided through a memory medium such as CD-ROM (Compact Disc Read Only Memory) or network and installed in the memory 630 .
- the circuit 110 may include a sensor 650 .
- the sensor 650 may be an arbitrary sensor such as a spectral radiance meter, a spectroradiometer, a camera, a microphone, a position sensor, or an acceleration sensor.
- background luminance in front of the user may be measured.
- an image to be displayed on the display 122 may be changed based on the information acquired by the sensor 650 . Convenience and visibility of the image display apparatus may be improved.
- An integrated circuit (IC) or IC chip set such as LSI (Large Scale Integration) may be used as a part or whole of each block in the circuit 110 .
- Each block may be implemented by an individual circuit.
- a part or whole of each block may be implemented by an integrated circuit.
- Blocks may be implemented as a unit.
- a block or certain blocks may be separated from the other blocks.
- a part of a block may be separated from the other part of the block.
- a dedicated circuit or general processor may be used in the circuit 110 , rather than LSI circuitry.
- processing of the circuit 110 , the image processing unit 220 , and luminance correction unit 230 may be performed by a processor such as CPU in a general-purpose computer executing a program.
- the program may be pre-installed in the computer.
- the program may be provided through a memory medium such as CD-ROM or network and installed in the computer.
- all or part of the processing may be performed by an OS (Operating System) operating in a computer or an integrated system based on an instruction from the program installed from a memory medium, or by middleware (MW) such as database management software or network software.
- a memory medium is not limited to a medium independent from a computer or an integrated system; a memory medium that stores a program downloaded through a LAN or the Internet may also be used. A plurality of memory media may be used for processing in these embodiments.
- a computer or an integrated system may execute each disclosed process based on a program memorized in memory media.
- a computer or an integrated system may be implemented by an apparatus comprising a personal computer or microcomputer, or a system comprising a plurality of apparatus connected to a network.
- the computer may be implemented not only by a personal computer, but also by an image processing unit or microcomputer included in information processing equipment.
- a computer is any device that is able to perform the functions described in these embodiments.
Abstract
Description
y = Bx (1)
where y is a vector that expresses the observation image data.
J = ‖x − Bx̂‖² + ε‖Cx̂‖² (2)
where x̂ is a vector that expresses the correction image data; C is a matrix that expresses a regularization operator; and ε is a regularization weight coefficient.
I = (BᵀB + εCᵀC)⁻¹Bᵀ (3)
where I is a matrix that expresses the inverse filter.
where l is a value of luminance of light including image information; f is a pixel value in the correction image; and γ is a parameter to express gamma characteristic. In this case, an amount of luminance correction may be calculated with the relation (9).
Claims (12)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-076102 | 2015-04-02 | ||
JP2015076102A JP2016197145A (en) | 2015-04-02 | 2015-04-02 | Image processor and image display device |
Publications (2)
Publication Number | Publication Date |
---|---|
US20160295183A1 US20160295183A1 (en) | 2016-10-06 |
US9762870B2 true US9762870B2 (en) | 2017-09-12 |
Family
ID=57015431
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/010,677 Expired - Fee Related US9762870B2 (en) | 2015-04-02 | 2016-01-29 | Image processing device and image display apparatus |
Country Status (2)
Country | Link |
---|---|
US (1) | US9762870B2 (en) |
JP (1) | JP2016197145A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220099877A1 (en) * | 2019-02-01 | 2022-03-31 | tooz technologies GmbH | Light-guiding arrangement, imaging optical unit, head mounted display and method for improving the imaging quality of an imaging optical unit |
US20230137831A1 (en) * | 2021-11-03 | 2023-05-04 | Samsung Electronics Co., Ltd. | Electronic device for improving image quality |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110431840B (en) * | 2017-03-28 | 2021-12-21 | 索尼公司 | Image processing apparatus, method and storage medium |
WO2019230108A1 (en) * | 2018-05-28 | 2019-12-05 | ソニー株式会社 | Image processing device and image processing method |
Citations (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5612708A (en) * | 1994-06-17 | 1997-03-18 | Hughes Electronics | Color helmet mountable display |
US20020099257A1 (en) * | 2001-01-21 | 2002-07-25 | Parker Donald E. | Alleviating motion, simulator, and virtual environmental sickness by presenting visual scene components matched to inner ear vestibular sensations |
US20020154214A1 (en) * | 2000-11-02 | 2002-10-24 | Laurent Scallie | Virtual reality game system using pseudo 3D display driver |
US20060007056A1 (en) * | 2004-07-09 | 2006-01-12 | Shu-Fong Ou | Head mounted display system having virtual keyboard and capable of adjusting focus of display screen and device installed the same |
US20080031490A1 (en) * | 2006-08-07 | 2008-02-07 | Canon Kabushiki Kaisha | Position and orientation measuring apparatus and position and orientation measuring method, mixed-reality system, and computer program |
US20100045571A1 (en) * | 2007-11-20 | 2010-02-25 | Kakuya Yamamoto | Beam-scan display apparatus, display method, and vehicle |
US20100091027A1 (en) * | 2008-10-14 | 2010-04-15 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method |
US20100103077A1 (en) * | 2007-11-20 | 2010-04-29 | Keiji Sugiyama | Image display apparatus, display method thereof, program, integrated circuit, goggle-type head-mounted display, vehicle, binoculars, and desktop display |
US20100177035A1 (en) * | 2008-10-10 | 2010-07-15 | Schowengerdt Brian T | Mobile Computing Device With A Virtual Keyboard |
US20100199232A1 (en) * | 2009-02-03 | 2010-08-05 | Massachusetts Institute Of Technology | Wearable Gestural Interface |
US20100287485A1 (en) * | 2009-05-06 | 2010-11-11 | Joseph Bertolami | Systems and Methods for Unifying Coordinate Systems in Augmented Reality Applications |
US20110057862A1 (en) * | 2009-09-07 | 2011-03-10 | Hsin-Liang Chen | Image display device |
US20110142514A1 (en) * | 2009-12-10 | 2011-06-16 | Fuji Xerox Co., Ltd. | Medium clamping device and image forming device |
US20120139933A1 (en) * | 2010-12-03 | 2012-06-07 | Fujitsu Limited | Image display device and method, and image processing device |
US8223024B1 (en) * | 2011-09-21 | 2012-07-17 | Google Inc. | Locking mechanism based on unnatural movement of head-mounted display |
US8228315B1 (en) * | 2011-07-12 | 2012-07-24 | Google Inc. | Methods and systems for a virtual input device |
US20120242560A1 (en) * | 2011-03-24 | 2012-09-27 | Seiko Epson Corporation | Head-mounted display device and control method for the head-mounted display device |
US20130182185A1 (en) * | 2010-10-15 | 2013-07-18 | Sharp Kabushiki Kaisha | Image processing device, image processing method, image processing program, and recording medium |
US20130222213A1 (en) * | 2012-02-29 | 2013-08-29 | Recon Instruments Inc. | Modular heads-up display systems |
US20140184477A1 (en) * | 2012-12-27 | 2014-07-03 | Seiko Epson Corporation | Head-mounted display |
US20140268336A1 (en) * | 2013-03-13 | 2014-09-18 | Seiko Epson Corporation | Virtual image display apparatus |
US20150009416A1 (en) * | 2012-02-22 | 2015-01-08 | Sony Corporation | Display device, image processing device and image processing method, and computer program |
US20150050880A1 (en) * | 2013-08-16 | 2015-02-19 | Samsung Electronics Co., Ltd. | Data communication method and apparatus based on wireless communication |
US20150312468A1 (en) * | 2014-04-23 | 2015-10-29 | Narvaro Inc. | Multi-camera system controlled by head rotation |
US20150326258A1 (en) * | 2014-05-08 | 2015-11-12 | Samsung Electronics Co., Ltd. | Apparatus and method for controlling power amplifier bias |
US20150338662A1 (en) | 2014-05-20 | 2015-11-26 | Kabushiki Kaisha Toshiba | Display device |
US20160028940A1 (en) * | 2013-03-27 | 2016-01-28 | Fujifilm Corporation | Image processing device, imaging device, image processing method and computer readable medium |
US9360672B2 (en) * | 2013-07-11 | 2016-06-07 | Seiko Epson Corporation | Head mounted display device and control method for head mounted display device |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004233867A (en) * | 2003-01-31 | 2004-08-19 | Nikon Corp | Picture display device |
JP2006133439A (en) * | 2004-11-05 | 2006-05-25 | Nikon Corp | Head-mounted display |
JP2008258802A (en) * | 2007-04-03 | 2008-10-23 | Canon Inc | Image display system |
JP5223452B2 (en) * | 2008-05-20 | 2013-06-26 | 株式会社リコー | Projector, projection image forming method, and vehicle head-up display device |
US20120147163A1 (en) * | 2010-11-08 | 2012-06-14 | DAN KAMINSKY HOLDINGS LLC, a corporation of the State of Delaware | Methods and systems for creating augmented reality for color blindness |
US20140253605A1 (en) * | 2013-03-05 | 2014-09-11 | John N. Border | Controlling brightness of a displayed image |
JP2016519322A (en) * | 2014-04-10 | 2016-06-30 | マイクロソフト テクノロジー ライセンシング,エルエルシー | Display image brightness control |
JP6469130B2 (en) * | 2014-11-10 | 2019-02-13 | マクセル株式会社 | Projector and image display method |
-
2015
- 2015-04-02 JP JP2015076102A patent/JP2016197145A/en active Pending
-
2016
- 2016-01-29 US US15/010,677 patent/US9762870B2/en not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
Oyamada, Y., et al., "Focal Pre-Correction of Projected Image for Deblurring on Displayed Image on the Screen", Graduate School of Science and Technology, Keio University, MIRU2007/Proceedings, pp. 1295-1300 (2007). |
Also Published As
Publication number | Publication date |
---|---|
US20160295183A1 (en) | 2016-10-06 |
JP2016197145A (en) | 2016-11-24 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: KABUSHIKI KAISHA TOSHIBA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YAMAGUCHI, JUN;BABA, MASAHIRO;KOKOJIMA, YOSHIYUKI;REEL/FRAME:037646/0716 Effective date: 20151221 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20210912 |