WO2021095581A1 - Electronic device - Google Patents
- Publication number
- WO2021095581A1 (PCT/JP2020/040954)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- optical system
- imaging optical
- electronic device
- imaging
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/80—Geometric correction
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/02—Bodies
- G03B17/12—Bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B30/00—Camera modules comprising integrated lens units and imaging units, specially adapted for being embedded in other devices, e.g. mobile phones or vehicles
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/0202—Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
- H04M1/026—Details of the structure or mounting of specific components
- H04M1/0264—Details of the structure or mounting of specific components for a camera module assembly
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/53—Constructional details of electronic viewfinders, e.g. rotatable or detachable
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/265—Mixing
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/011—Head-up displays characterised by optical features comprising device for correcting geometrical aberrations, distortion
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02F—OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
- G02F2201/00—Constructional arrangements not provided for in groups G02F1/00 - G02F7/00
- G02F2201/58—Arrangements comprising a monitoring photodetector
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B11/00—Filters or other obturators specially adapted for photographic purposes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/0202—Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
- H04M1/026—Details of the structure or mounting of specific components
- H04M1/0266—Details of the structure or mounting of specific components for a display module assembly
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/52—Details of telephonic subscriber devices including functional features of a camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/142—Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
- H04N2007/145—Handheld terminals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/142—Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
- H04N7/144—Constructional details of the terminal equipment, e.g. arrangements of the camera and the display camera and display on the same optical axis, e.g. optically multiplexing the camera and display for eye to eye contact
-
- H—ELECTRICITY
- H10—SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
- H10K—ORGANIC ELECTRIC SOLID-STATE DEVICES
- H10K59/00—Integrated devices, or assemblies of multiple devices, comprising at least one organic light-emitting element covered by group H10K50/00
- H10K59/30—Devices specially adapted for multicolour light emission
- H10K59/38—Devices specially adapted for multicolour light emission comprising colour filters or colour changing media [CCM]
Definitions
- This disclosure relates to electronic devices.
- Recent electronic devices such as smartphones, mobile phones, and personal computers (PCs) mount a camera in the frame (bezel) of the display unit, making videophone calls and video shooting easy. Because smartphones and mobile phones are often carried in a pocket or bag, their external dimensions should be as compact as possible. On the other hand, if the display screen is small, raising the display resolution shrinks the rendered character size and makes text harder to read. It has therefore been studied to enlarge the display screen as much as possible without enlarging the outer dimensions of the electronic device, by narrowing the bezel around the display screen.
- However, the bezel width cannot be made smaller than the outer diameter of the camera.
- Moreover, during a call the user's line of sight usually rests near the center of the display screen, away from the optical axis of the camera, so the captured image shows a gaze that does not meet the viewer's and looks unnatural.
- To avoid these problems, it has been proposed to arrange the camera module on the side opposite to the display surface of the display unit and to capture subject light that passes through the display unit.
- One aspect of the present disclosure provides an electronic device capable of suppressing degradation of the image quality of camera images while reducing the bezel width.
- To this end, the electronic device includes: a display unit having a displayable region in which display optical systems are arranged in an array along a first direction and a second direction intersecting the first direction; a plurality of imaging optical systems that, in a third direction intersecting the first and second directions, are arranged on the side opposite to the display surface of the display unit so as to overlap the displayable region, and that include at least a first imaging optical system and a second imaging optical system whose coordinates differ from those of the first imaging optical system in at least one of the first direction and the second direction; and an image acquisition unit that acquires image data based on information acquired by the first imaging optical system and the second imaging optical system.
- The first imaging optical system and the second imaging optical system may be any two optical systems selected from the plurality of imaging optical systems. That is, each configuration described below may hold for at least two of the plurality of imaging optical systems. Furthermore, one feature and another feature may hold for different combinations of first and second optical systems. In short, the following features may be possessed by at least two of the plurality of imaging optical systems.
- Light may propagate from the display surface of the display unit to the first imaging optical system and the second imaging optical system via optical paths having different optical characteristics. All of the plurality of imaging optical systems may have different optical characteristics, or some may share the same characteristics. Thus, at least two imaging optical systems may differ in the optical characteristics of the path along which light propagates from the display surface; these differing characteristics can change how flare appears in each system.
- The display unit of the electronic device may be provided with apertures through which light incident on the display surface propagates, so that light incident from the display surface reaches the imaging optical systems through the apertures. These apertures may give rise to the optical characteristics described above.
- The aperture that propagates light to the first imaging optical system and the aperture that propagates light to the second imaging optical system may have different layouts, and these different layouts may produce different optical characteristics.
- The aperture that propagates light to the first imaging optical system and the aperture that propagates light to the second imaging optical system may form diffraction images in different directions.
- For example, the aperture that propagates light to the first imaging optical system may be larger in the first direction than in the second direction, while the aperture that propagates light to the second imaging optical system may be larger in the second direction than in the first direction. With this arrangement, the direction in which flare arises differs between the two imaging optical systems.
- The electronic device may further include a third imaging optical system that, centered on the first imaging optical system, has parallax equivalent to that of the second imaging optical system but in the opposite direction, and the image acquisition unit may acquire image data based on information acquired from the second and third imaging optical systems together with information acquired from the first imaging optical system. That is, for a combination of two imaging optical systems, a further imaging optical system may be provided that has equivalent parallax in the opposite direction, centered on either of the two.
- In the first direction or the second direction of the display surface, the first imaging optical system may be provided near the center, with the second and third imaging optical systems sandwiching it.
- The first imaging optical system may have a smaller area facing the display surface in the third direction than the second and third imaging optical systems. That is, the second and third imaging optical systems may be provided at both ends of the display surface of the electronic device, while the less conspicuous first imaging optical system is provided near the center.
- The first imaging optical system and the second imaging optical system may have different coordinates in both the first direction and the second direction. That is, the two imaging optical systems may differ in coordinates only in the first direction, only in the second direction, or in both; on the display surface they may thus be arranged horizontally, vertically, or in any direction.
- When synthesizing the data acquired from the first imaging optical system and the data acquired from the second imaging optical system, the image acquisition unit may, at each position in the composite image data, adopt the data with the lower intensity. In acquired data, flare often has higher brightness and light intensity than the subject being imaged, so when acquiring image data the image acquisition unit may base each pixel on the lower-intensity signal of the two imaging optical systems.
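A minimal sketch of this lower-intensity selection, assuming the two captures are already aligned to the same pixel grid (the text does not specify how alignment is performed):

```python
import numpy as np

def flare_suppressing_composite(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Fuse two aligned captures by keeping, per pixel, the lower-intensity
    sample. Flare raises local intensity, so the less-affected camera wins
    at each position (a simplification of the idea described in the text)."""
    if img_a.shape != img_b.shape:
        raise ValueError("captures must be aligned to the same shape")
    return np.minimum(img_a, img_b)

# Two toy 4x4 captures, each with a bright flare streak in a different place.
a = np.full((4, 4), 50, dtype=np.uint8)
b = np.full((4, 4), 50, dtype=np.uint8)
a[1, :] = 250   # horizontal flare streak in capture A
b[:, 2] = 250   # vertical flare streak in capture B
fused = flare_suppressing_composite(a, b)
```

Both streaks vanish in the fused result except where they cross, since there neither capture offers a clean sample.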
- The first and second imaging optical systems may each have a direction in which reflections occur preferentially, and when their outputs differ by a predetermined value or more for a given direction, the image acquisition unit may acquire the imaging result using only one of them. For example, when the first imaging optical system generates flare along the first direction and the second imaging optical system generates flare along the second direction, the image acquisition unit may acquire image data based on the output of whichever system exhibits less flare.
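One possible way to score and compare directional flare, sketched under the assumption that system 1 flares along rows and system 2 along columns (the function names and threshold are illustrative, not taken from the patent):

```python
import numpy as np

def directional_flare_score(img: np.ndarray, axis: int) -> float:
    """Sum intensities along the expected flare direction; a streak shows up
    as a spike in the resulting profile, and the spike height above the
    median profile value serves as the flare score."""
    profile = img.astype(float).sum(axis=axis)
    return float(profile.max() - np.median(profile))

def pick_less_flared(img1: np.ndarray, img2: np.ndarray,
                     threshold: float = 100.0) -> np.ndarray:
    """Return the capture whose known flare direction scores lower, when the
    two scores differ by at least `threshold` (an illustrative value)."""
    s1 = directional_flare_score(img1, axis=1)  # row sums expose horizontal streaks
    s2 = directional_flare_score(img2, axis=0)  # column sums expose vertical streaks
    if abs(s1 - s2) < threshold:
        return img1  # outputs comparable: no need to switch
    return img1 if s1 < s2 else img2

img1 = np.full((4, 4), 50.0)
img2 = np.full((4, 4), 50.0)
img1[1, :] = 250.0   # strong horizontal flare in system 1
img2[:, 2] = 100.0   # mild vertical flare in system 2
chosen = pick_less_flared(img1, img2)
```

With these toy inputs, system 1's streak scores far higher than system 2's, so the output of system 2 is selected.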
- The space between the first imaging optical system and the second imaging optical system may be shielded from light, suppressing the influence of each optical system on the other.
- The image acquisition unit may synthesize the information acquired by the first and second imaging optical systems using a trained model; for example, the data output from the plurality of imaging optical systems may be combined using a model generated by machine learning.
- This trained model may be trained on data collected from multiple electronic devices, for example devices manufactured under the same model number. Even within one model number, the model may be switched and retrained depending on the imaging mode or the like.
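As a purely illustrative sketch (the patent does not specify a model architecture), such a trained fusion could take the form of a per-pixel gate whose parameters would be learned from captures across many devices; here the single weight `w` is hand-set rather than learned:

```python
import numpy as np

def sigmoid(x: np.ndarray) -> np.ndarray:
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(img_a: np.ndarray, img_b: np.ndarray, w: float = -0.05) -> np.ndarray:
    """Per-pixel gated blend of two captures. The gate leans toward the
    dimmer capture at each pixel (flare is bright), mimicking what a trained
    fusion model might learn. `w` is a hand-set stand-in for a learned
    parameter."""
    a = img_a.astype(float)
    b = img_b.astype(float)
    gate = sigmoid(w * (a - b))   # a >> b  ->  gate ~ 0  ->  output ~ b
    return gate * a + (1.0 - gate) * b

a = np.full((4, 4), 50.0)
b = np.full((4, 4), 50.0)
a[1, :] = 250.0                   # flare streak only in capture A
fused = gated_fusion(a, b)
```

Where the captures agree the gate sits at 0.5 and the blend is a no-op; where capture A is flared the gate collapses toward capture B.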
- The image acquisition unit may make corrections when the parallax among images acquired by the plurality of imaging optical systems exceeds a predetermined amount.
- For example, image data may be acquired by detecting, from the parallax, a region where flare occurs and applying image processing to that region.
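A sketch of this parallax-based detection and repair, assuming a known horizontal disparity between the two systems (the `disparity` and `threshold` values are illustrative placeholders):

```python
import numpy as np

def flare_mask_from_parallax(img_a: np.ndarray, img_b: np.ndarray,
                             disparity: int = 1, threshold: int = 50) -> np.ndarray:
    """Flag pixels that still disagree strongly after compensating the known
    horizontal disparity between the two systems; for a correctly aligned
    scene the large residual differences are flare candidates."""
    shifted_b = np.roll(img_b, disparity, axis=1)  # crude disparity compensation
    diff = np.abs(img_a.astype(int) - shifted_b.astype(int))
    return diff > threshold

def repair_flare(img_a: np.ndarray, img_b: np.ndarray,
                 disparity: int = 1, threshold: int = 50) -> np.ndarray:
    """Replace flagged pixels in capture A with the disparity-compensated
    samples from capture B."""
    mask = flare_mask_from_parallax(img_a, img_b, disparity, threshold)
    out = img_a.copy()
    out[mask] = np.roll(img_b, disparity, axis=1)[mask]
    return out

# Flat toy scene, so the wrap-around of np.roll is harmless here.
a = np.full((4, 6), 50, dtype=np.uint8)
b = np.full((4, 6), 50, dtype=np.uint8)
a[2, :] = 240                      # flare streak only in capture A
repaired = repair_flare(a, b)
```

A real implementation would estimate disparity per pixel (e.g. block matching) rather than assume a constant shift.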
- At least one imaging optical system may be composed of a microlens array; that is, one imaging system may be provided not with a single lens but with a lens array.
- A plurality of imaging optical systems may be provided within the area covered by the microlens array; in this way, one microlens array can form a plurality of imaging optical systems.
- The first and second imaging optical systems may acquire information on the same image sensor. For example, a plurality of imaging optical systems may be formed on one image sensor, that is, a one-chip sensor. By providing a microlens array on one chip, a plurality of imaging optical systems can be formed, one per region.
- the image acquisition unit may be arranged on the same chip as the image sensor.
- For example, one chip may include both an image sensor and a logic circuit: the data acquired by the image sensor is AD-converted, and the converted data is processed by the logic circuit. The device may also be formed as a plurality of stacked chips rather than a single chip.
- the display unit may include a plurality of display optical systems having different optical features.
- The display optical systems may be a mixture of OLED, MicroLED, liquid crystal, and the like.
- In this case, flare arising from reflection, refraction, and diffraction in the display unit can have different characteristics for each imaging optical system.
- At least one of the plurality of imaging optical systems may operate only when signal correction is required by the image acquisition unit. For example, whether correction by an imaging optical system is performed may be switched based on the surrounding environment, such as providing an image sensor that performs correction when the electronic device detects a strong light source.
- The first and second imaging optical systems may be integrated; for example, their light-receiving elements and the optical paths to those elements may be adjacent to each other.
- The first and second imaging optical systems may be provided near a boundary of the display surface; the boundary is, for example, an edge of the display surface, and the plurality of imaging optical systems may be provided at this edge.
- The first and second imaging optical systems may be arranged 50 mm or more and 80 mm or less apart, and the image acquisition unit may generate parallax image data from the information they acquire. That is, the two imaging optical systems may be spaced by a distance equivalent to the distance between human eyes.
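With such an eye-distance baseline, depth can be recovered from the parallax via the standard pinhole stereo relation Z = f·B/d (the numeric values below are illustrative, not from the patent):

```python
def depth_from_disparity(focal_px: float, baseline_mm: float,
                         disparity_px: float) -> float:
    """Pinhole stereo relation Z = f * B / d: with two cameras a baseline B
    apart (e.g. ~65 mm, inside the 50-80 mm range in the text), a pixel
    disparity d at focal length f (in pixels) maps to depth Z in mm."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# e.g. 1000 px focal length, 65 mm baseline, 20 px disparity -> 3250 mm
z = depth_from_disparity(1000.0, 65.0, 20.0)
```

Larger disparities correspond to closer objects, which is why a wider baseline improves depth resolution at distance.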
- Display units may be provided on both sides of the device. A plurality of imaging optical systems may be provided on each display unit; alternatively, one display unit may have a plurality of imaging optical systems while the other has none, or one may have a single imaging optical system while the other has a plurality.
- The device may include a fourth imaging optical system different from the first and second imaging optical systems, and the fourth imaging optical system, paired with the first or second imaging optical system, may have any of the characteristics above. As noted at the beginning, "first" and "second" merely denote two of the plurality of imaging optical systems and do not indicate specific ones.
- A schematic external view of an electronic device according to an embodiment.
- A diagram showing an example of an image captured by the imaging optical system of the preceding figure.
- A diagram showing an example of an image acquired by the image acquisition unit according to an embodiment.
- A diagram showing an example of combining the images acquired by the imaging optical systems at both ends of the preceding figure.
- Schematic cross-sectional views of an electronic device according to an embodiment.
- Schematic diagrams showing examples of the imaging optical system according to an embodiment.
- Schematic diagrams showing examples of the implementation of the imaging unit according to an embodiment.
- Diagrams showing examples of the arrangement of the imaging optical systems according to an embodiment.
- Diagrams showing examples of the layout of the apertures according to an embodiment.
- a plan view when the electronic device of one embodiment is applied to a capsule endoscope.
- a rear view when the electronic device of one embodiment is applied to a digital single-lens reflex camera.
- the electronic device may have components and functions not shown or described.
- the following description does not exclude components or functions not shown or described.
- although some components have been changed in size, shape, aspect ratio, etc. in the drawings for the sake of explanation, they have appropriate sizes, shapes, aspect ratios, etc. when mounted.
- the signal to be acquired is described as image information or imaging information, but these are broad concepts that include still images, moving images, and individual frames of a moving image.
- FIG. 1 is a schematic cross-sectional view of the electronic device 1 according to the first embodiment.
- the electronic device 1 in FIG. 1 is an arbitrary electronic device having both a display function and a shooting function, such as a smartphone, a mobile phone, a tablet, and a PC.
- the first direction is the rightward direction of the drawing
- the second direction is the direction perpendicular to the drawing
- the third direction is the downward direction of the drawing. That is, the second direction is a direction that intersects the first direction
- the third direction is a direction that intersects both the first direction and the second direction.
- the intersection may be at an angle of 90°, but need not be exactly 90°.
- the first direction and the second direction are distinguished for convenience, and are equivalent even if interchanged.
- the electronic device 1 of FIG. 1 includes an imaging optical system (a camera module or the like) arranged on the side opposite to the display surface of the display unit 2. That is, the electronic device 1 is provided with the imaging optical system 3 on the back side of the display surface of the display unit 2, and the imaging optical system 3 therefore takes pictures through the display unit 2.
- the display unit 2 is a structure in which a display panel 4, a circularly polarizing plate 5, a touch panel 6, and a cover glass 7 are laminated in this order.
- the lamination of FIG. 1 is shown as an example; an adhesive or bonding layer may be provided between the display panel 4, the circularly polarizing plate 5, the touch panel 6, and the cover glass 7 as necessary. Further, the order of the circularly polarizing plate 5 and the touch panel 6 may be changed as appropriate depending on the design.
- the imaging optical system 3 is provided on the opposite side of the display surface of the display unit 2.
- the imaging optical system 3 includes, for example, a photoelectric element (light-receiving element) that receives light and photoelectrically converts it into an analog signal, and an optical system that propagates the light incident on the display surface to the photoelectric element.
- the optical system may be, for example, an opening provided in the display panel 4.
- a plurality of the imaging optical systems 3 are provided for one display unit 2 of the electronic device 1, and for example, two are provided as shown in the figure. The light applied to the display surface is diffracted at the opening and propagated to the light receiving element as shown by the arrow in the figure.
- the image pickup optical system 3 includes, for example, an image pickup unit 8 and an optical system 9 that collects and diffuses light incident on the image pickup unit 8 from a display surface.
- the plurality of imaging optical systems 3 are arranged so as to have different coordinates in the second direction, as shown in the figure, but the present invention is not limited to this. For example, it may have different coordinates in the first direction, or it may have different coordinates in both the first direction and the second direction.
- the display panel 4 may be provided with, for example, an OLED (Organic Light Emitting Diode) display or a TFT-driven liquid crystal display as the optical system for displaying (display optical system), or it may be equipped with Micro LEDs.
- the display optical system may include a light emitting element based on other display principles.
- the light-emitting elements of the display optical system may be arranged, for example, in a stripe array or a mosaic array; they may be arranged in an array along the first direction and the second direction, or in an oblique direction, and pixel thinning may be performed in some portions.
- the display optical system may have a light emitting element provided with a laminated filter to change the display color.
- the display panel 4 may be composed of a plurality of layers such as an anode layer and a cathode layer. Further, these layers may be formed of a material having a high transmittance.
- the display panel 4 may be provided with a member having a low transmittance such as a color filter layer.
- the display panel 4 may include a substrate 4a and an OLED unit.
- the substrate 4a may be formed of, for example, polyimide or the like.
- an opening may be formed according to the arrangement location of the imaging optical system 3. If the subject light passing through the aperture is incident on the imaging optical system 3, the image quality of the image captured by the imaging optical system 3 can be improved. Further, it may be provided with a light propagation path formed by a substance having a high transmittance instead of an opening. Also in this case, the light incident from the display surface of the display unit 2 is received by the imaging optical system 3 and converted into a signal.
- the circularly polarizing plate 5 is provided, for example, in order to reduce glare or enhance the visibility of the display screen 1a even in a bright environment.
- a touch sensor is incorporated in the touch panel 6.
- touch sensors of various types exist, such as capacitive and resistive-film types, and any method may be used.
- the touch panel 6 and the display panel 4 may be integrated.
- the cover glass 7 is provided to protect the display panel 4 and the like.
- an adhesive layer or an adhesive layer such as OCA (Optical Clear Adhesive) may be provided at an appropriate place. Further, depending on the design, the order of the circularly polarizing plate 5 and the touch panel 6 in the third direction may be interchanged.
- FIG. 2 shows a schematic external view and a cross-sectional view of the electronic device 1 shown in FIG.
- the cross-sectional view shows a cross section of the display portion including the display unit 2 along the dash-dot line shown in the figure. Circuits and the like other than the housing and the display portion of the electronic device 1 are omitted.
- the display screen 1a extends close to the outer diameter size of the electronic device 1, and the width of the bezel 1b around the display screen 1a is reduced to several mm or less.
- the bezel 1b is often equipped with a front camera.
- the front camera may be positioned, as the plurality of imaging optical systems 3, at substantially the center of the display screen 1a in the second direction, as shown by the dotted lines in the external view.
- the external view of FIG. 2 is shown as an example; the imaging optical systems 3, that is, the front cameras, may be arranged at arbitrary positions in the first and second directions on the side opposite to the display surface (back-surface side) of the display screen 1a. For example, they may be arranged at the peripheral edge portion (end portion, boundary portion) of the display screen 1a. As shown in the external view of FIG. 2, the plurality of imaging optical systems 3 are provided so as to have different coordinates in, for example, the first direction. Even when the imaging optical systems 3 are arranged at arbitrary positions, they may be arranged so as to have different coordinates in at least one of the first direction and the second direction. Further, although two imaging optical systems 3 are drawn, the present invention is not limited to this, and more imaging optical systems may be provided on the side opposite to the display surface.
- the imaging optical system 3 is provided on the back-surface side opposite to the display surface of the display unit 2. Details are omitted in this cross-sectional view.
- the adhesive layer and the like are also provided in the configuration of the cross-sectional view of FIG. 2, but are omitted for the sake of simplicity.
- FIG. 3A is a diagram showing an example of the imaging optical system 3.
- the image pickup optical system 3 includes, for example, an image pickup unit 8 and an optical system 9.
- the optical system 9 is arranged on the incident surface side of the light of the imaging unit 8, that is, on the side closer to the display unit 2. The light transmitted through the display surface of the display unit 2 is propagated to the image pickup unit 8 by the optical system 9.
- the imaging unit 8 includes, for example, a light receiving element such as a photodiode and a photoelectric element.
- the light collected, diffused, and propagated by the optical system 9 is received by the imaging pixel array provided in the imaging unit 8, which outputs an analog signal.
- the image pickup pixel array may be provided with a color filter such as a Bayer array on the incident surface side of each image pickup element, or may be provided with a laminated color filter.
- a filter for acquiring a color image may be provided.
- other elements, circuits, etc. necessary for receiving light and outputting analog signals are provided.
- the photoelectric conversion element may be a CMOS (Complementary Metal-Oxide-Semiconductor) element or a CCD (Charge Coupled Device) element.
- the above-mentioned filter, a polarizing element, and the like may be provided.
- the optical system 9 may include, for example, a lens. Further, the optical system 9 may be understood as a concept including the opening provided in the display panel 4 described above. For example, as the optical system 9, an opening is provided in the display panel 4, and a lens is arranged at a position closer to the imaging unit 8 than this opening in the third direction.
- this opening may be provided, for example, in the substrate 4a having a low transmittance, together with a lens that propagates the light transmitted through the opening to the imaging unit 8.
- the lens and the opening define optical features such as the numerical aperture NA and the F-number of each imaging optical system 3.
- the optical systems 9 may differ in other optical features; for example, the imaging optical systems 3 may have different Abbe numbers.
- the lens is shown as a single lens, but is not limited to this, and may be provided as a lens system including a plurality of various types of lenses.
- the aperture and the lens are shown as an example, and the configuration of the optical system 9 is not necessarily limited to these combinations. Further, in the figure, one lens is provided for each aperture, but the present invention is not limited to this.
- a plurality of openings may be provided for one lens in the optical system 9. In the region where no opening exists, for example, the light-emitting elements of the display panel 4 may be provided, and the openings may be provided so as to weave between these light-emitting elements. By arranging them in this way, the imaging optical system 3 can be provided without interrupting the display.
- the plurality of imaging optical systems 3 may be formed having different optical features depending on the shape of the aperture, the performance of the lens, and the like.
- the corresponding optical systems 9 may have different optical features.
- the imaging optical system 3 may be divided into a plurality of groups, and each group may have different optical characteristics.
- the optical systems 9 may be provided by changing the shape or orientation of the openings, or the lens material or the like, so that, for example, two imaging optical systems 3 have common optical characteristics and one imaging optical system 3 has different optical characteristics.
- the layout of the opening is described as an expression including the shape and orientation of the opening.
- the light incident from the display surface side of the display unit 2 is refracted by the optical system 9 and received by the image pickup unit 8.
- the display on the display unit 2 may be adjusted so as to be easy to see, as in the case of a normal display.
- an aperture is provided between the light emitting pixels of the display panel 4, a lens is provided on the side opposite to the display surface of the opening in the third direction, and light incident from the display surface is projected onto the image pickup unit 8.
- an opening may be provided between each of the continuous light emitting pixels. In other words, the configuration may be such that a light emitting pixel is provided between the openings.
- FIG. 4 shows an example of a block diagram showing a configuration related to the imaging operation of the electronic device 1 according to the present embodiment.
- the electronic device 1 includes a display unit 2, a plurality of imaging optical systems 3, a pre-processing unit 10, an image acquisition unit 12, a post-processing unit 14, an output unit 16, a control unit 18, and a storage unit 20.
- one display unit 2 is provided with a plurality of imaging optical systems 3 on the side opposite to the display surface.
- the imaging optical system 3 includes an imaging unit 8 and an optical system 9, respectively.
- the preprocessing unit 10 is a circuit that processes an analog signal output by the imaging unit 8.
- the preprocessing unit 10 includes, for example, an ADC (Analog to Digital Converter) and converts the input analog signal into digital image data.
- the image acquisition unit 12 acquires the captured image from the digital image data converted by the preprocessing unit 10.
- the imaging result is acquired based on the digital image data acquired from the plurality of imaging optical systems 3. More specifically, the image acquisition unit 12 uses the image data obtained by the plurality of imaging optical systems 3 to acquire and output an imaging result in which the flare generated in each imaging optical system 3 is suppressed.
- the post-processing unit 14 performs appropriate processing on the imaging result output by the image acquisition unit 12 and outputs the image.
- the appropriate processing may be, for example, image processing or signal processing such as pixel defect correction, edge enhancement, noise removal, brightness adjustment, color correction, white balance adjustment, distortion correction, and autofocus processing. Further, this processing may be processing specified by the user.
- the output unit 16 outputs information to the outside of the electronic device 1.
- the output unit 16 includes, for example, an output interface.
- the output interface may be, for example, an interface that outputs a digital signal such as USB (Universal Serial Bus), or a user interface such as a display. Further, the output interface provided in the output unit 16 may also include an input interface.
- the control unit 18 controls the processing in the electronic device 1.
- the control unit 18 may include, for example, a CPU (Central Processing Unit), and may control the processing of the pre-processing unit 10, the image acquisition unit 12, the post-processing unit 14, and the output unit 16. Further, shooting by the imaging optical system 3 may be executed based on an imaging timing instructed via the user interface.
- the storage unit 20 stores the data in the electronic device 1.
- the storage unit 20 may be, for example, a memory such as a DRAM (Dynamic Random Access Memory) or a storage such as an SSD (Solid State Drive).
- the storage unit 20 may be a built-in memory or a memory such as a removable memory card. Further, the storage unit 20 is not necessarily provided inside the electronic device 1, and may be an external storage or the like connected via an input / output interface. Information is appropriately input / output to and from the storage unit 20 at a timing required by the electronic device 1.
- the imaging optical system 3, the pre-processing unit 10, the image acquisition unit 12, the post-processing unit 14, the output unit 16, the control unit 18, and the storage unit 20 may be formed on one chip, or some of them may be formed as separate chips as appropriate.
- FIG. 5 is a diagram showing the imaging optical system 3 from the display surface side of the display unit 2.
- the display panel 4 can be seen from the display surface, and a plurality of light-emitting pixels 4b arranged in an array in the first direction and the second direction are formed on the display panel 4, as shown by the dotted lines.
- the arrangement and orientation of the light-emitting pixels 4b are shown as an example; they are arranged as in FIG. 5, but may instead be installed rotated by 45°.
- the light emitting pixel 4b is drawn as a square, but the present invention is not limited to this, and the light emitting pixel 4b may be a rectangle extending in any direction, and may not be a rectangle.
- a plurality of imaging optical systems 3 having an opening between the light emitting pixels 4b of the display panel 4 are provided.
- the first imaging optical system 3A and the second imaging optical system 3B are shown.
- each imaging optical system is provided with elliptical openings: the first optical system 9A has openings whose major axis lies in the second direction, and the second optical system 9B has openings whose major axis lies in the first direction.
- a plurality of openings as an optical system may be provided in each imaging optical system.
- the first imaging unit 8A is provided below the opening provided in the first optical system 9A
- the second imaging unit 8B is provided below the opening provided in the second optical system 9B.
- the image pickup unit 8 may be provided at a position deviated from the opening.
- the light passing through the aperture is appropriately diffused and focused in the imaging region of the imaging unit 8 between each aperture and the imaging unit 8 or at the aperture.
- the lens may be provided as part of the optical system 9.
- Other optical systems may be provided so that light can be appropriately received in the imaging region of the imaging unit 8 instead of the lens.
- the opening indicates an optical transmission region, which may be an air gap or may be filled with a transparent material such as resin.
- a material that transmits specific wavelengths, such as a color filter, may be used; the material filling the opening is not limited.
- the optical characteristics of the first optical system 9A and the second optical system 9B may be different by filling the openings with materials having different transmittances, refractive indexes, and the like.
- the imaging optical systems 3 have different opening layouts and are arranged so as to be displaced from each other, as with the first imaging optical system 3A and the second imaging optical system 3B shown in FIG. 5. Because the opening layouts differ, the optical characteristics of the optical systems belonging to each differ, and light is incident on each imaging unit 8 from the display surface of the display unit 2 via different optical characteristics. That is, light is incident on the first imaging unit 8A and the second imaging unit 8B via different optical features.
- the openings belonging to each imaging optical system may have similar elliptical directions.
- the optical systems 9 that generate diffraction images in different directions may be formed by providing similar openings oriented in different directions.
- one optical system 9 is not necessarily one continuous region when viewed from the display surface of the display unit 2 including the display panel 4; it may be configured as a plurality of separate regions containing openings arranged periodically between the light-emitting pixels.
- the long axis of one aperture corresponds to approximately 2 to 3 display pixels, but the present invention is not limited to this.
- the opening may have, for example, a longer major axis or a shorter major axis.
- the major axis of the openings forming part of the second optical system 9B is longer than that of the openings forming part of the first optical system 9A; the major axis may be lengthened or shortened as appropriate.
- FIGS. 6A and 6B show images converted from the analog signals acquired by the respective imaging optical systems 3 shown in FIG. 5, with their coordinates adjusted.
- FIG. 6A is an image acquired based on the first imaging optical system 3A
- FIG. 6B is an image acquired based on the second imaging optical system 3B.
- the image acquisition unit 12 may adjust the position.
- the imaging position may be adjusted so that, for example, the image obtained by displaying a mirror image of the image acquired by each imaging optical system 3 on the display panel 4 overlaps the reflected image when the display surface is viewed from the front.
- the position may be adjusted so that the image acquired from the imaging optical system 3 is displayed at the center of the display surface.
- the correction may be performed based on the positional deviation of the imaging optical system 3 from the center of the display surface.
- the position adjustment is not limited to this, and may be performed by any method as long as it is appropriately controlled.
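As one possible realization of this position adjustment, the image from each camera can be shifted by that camera's known offset from the display-surface center. This is a simplified sketch using NumPy; the function name and the use of `np.roll` are illustrative assumptions (a real pipeline would resample sub-pixel shifts and crop or pad rather than wrap at the borders).

```python
import numpy as np

def align_to_display_center(img: np.ndarray,
                            offset_rows: int, offset_cols: int) -> np.ndarray:
    """Shift an image by the known positional offset of its imaging
    optical system from the display-surface center, so that images from
    differently placed cameras can be compared pixel by pixel.
    Note: np.roll wraps around at the borders; cropping or padding would
    be used instead in a real implementation."""
    return np.roll(img, shift=(-offset_rows, -offset_cols), axis=(0, 1))

img = np.arange(9).reshape(3, 3)
# A camera located one row below the center: shift its image up by one row.
print(align_to_display_center(img, 1, 0)[0])  # -> [3 4 5]
```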
- flare occurs as shown in the white areas in FIGS. 6A and 6B.
- flare occurs as shown in the figure as a region brighter than the image actually desired to be captured.
- in order to suppress flares occurring at different locations, the image acquisition unit 12 compares the light intensities in the signals output from the first imaging optical system 3A and the second imaging optical system 3B, and corrects the image by adopting, for each pixel, the value with the lower luminance after conversion into a digital signal.
- the preprocessing unit 10 may make a selection based on the signal strength.
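A minimal sketch of this lower-luminance selection, assuming the two images have already been position-aligned (NumPy; the sample pixel values are hypothetical, not from the document):

```python
import numpy as np

def suppress_flare(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Combine two position-aligned images of the same scene by taking,
    for each pixel, the lower luminance value. Flare appears as a region
    brighter than the true scene, and it occurs at different locations
    in the two imaging optical systems, so the darker of the two samples
    is the one less affected by flare."""
    return np.minimum(img_a, img_b)

# Hypothetical 1-D example: each camera sees the true scene (value 100)
# except where its own flare adds brightness.
scene_a = np.array([100, 100, 250, 100])   # flare at pixel 2
scene_b = np.array([100, 250, 100, 100])   # flare at pixel 1
print(suppress_flare(scene_a, scene_b))    # -> [100 100 100 100]
```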
- FIG. 7 shows an image processed by the image acquisition unit 12 and output from the electronic device 1.
- the post-processing unit 14 adjusts, for example, brightness and the like. By outputting the image processed in this way, flare is suppressed as shown in FIG. 7, and an image adjusted to natural brightness or the like can be acquired.
- the region where the flare is generated may be determined depending on each of the imaging optical systems 3.
- the flare generation region may be stored in advance in the storage unit 20 or the image acquisition unit 12.
- the pixel value selection process or the pixel value composition process can be executed at high speed. For example, for a certain pixel, an image may be acquired based on the output from a predetermined imaging optical system 3 without having to compare the pixel values output from the respective imaging optical systems 3.
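One way to realize this predetermined-region selection is a mask lookup rather than a per-pixel value comparison. This is a sketch under the assumption that each camera's flare region is stored in advance (e.g. in the storage unit 20) as a boolean mask; the function and variable names are hypothetical.

```python
import numpy as np

def select_by_flare_mask(img_a: np.ndarray, img_b: np.ndarray,
                         flare_mask_a: np.ndarray) -> np.ndarray:
    """Where imaging optical system A is known in advance to suffer
    flare (from a stored flare map), take the pixel from system B;
    elsewhere keep the pixel from A. No per-pixel value comparison is
    needed, which allows high-speed execution."""
    return np.where(flare_mask_a, img_b, img_a)

img_a = np.array([100, 240, 100])           # flare brightens pixel 1
img_b = np.array([100, 100, 100])
mask_a = np.array([False, True, False])     # precomputed flare region of A
print(select_by_flare_mask(img_a, img_b, mask_a))  # -> [100 100 100]
```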
- flare can be suppressed in an image taken by a camera provided on the back surface of the display unit, that is, a front camera arranged so as to overlap the display. As a result, in an electronic device, it is possible to provide an imaging optical system capable of acquiring an accurate image on the front side of the display without increasing the width of the bezel on the surface on which the display is provided.
- flare components are likely to occur in an imaging optical system 3 near the region struck by strong light. Even in such a case, by using the image acquired by at least one other imaging optical system 3 provided at a different position, it is possible to acquire an image in which flare is suppressed, as described above.
- the first imaging optical system 3A and the second imaging optical system 3B are selected from a plurality of imaging optical systems, and three or more imaging optical systems 3 may be provided. That is, as long as at least two of three or more imaging optical systems 3 function as the first imaging optical system 3A and the second imaging optical system 3B, the actions and effects of the present embodiment and the following embodiments can be obtained. Further, with three or more imaging optical systems 3, providing two or more pairs of imaging optical systems having the characteristics of the first and second imaging optical systems described above further increases the degree of freedom in acquiring captured images, and can also improve the accuracy of flare suppression.
- the same imaging optical system may be used for different combinations as long as they are not the same combination.
- for example, the combinations (3X, 3Y) and (3X, 3Z), in which 3X is common as the (first imaging optical system, second imaging optical system) pair, may be used as two pairs.
- each imaging optical system 3 is provided with an aperture as the optical system 9, but the present invention is not limited to this.
- since flares occur at different positions in the plurality of imaging optical systems 3, the influence of flare can be suppressed by using the images acquired by the plurality of imaging optical systems 3.
- the shape of the opening has been described as elliptical, but it is not limited to this; it may be, for example, a rectangle or a rounded rectangle.
- it may be formed by an arbitrary closed curve that can be appropriately received by the imaging unit 8.
- it does not have to have the same shape in the thickness direction of the opening, that is, in the third direction.
- for example, the upper portion, that is, the one closer to the display surface, may be rectangular, and the portion closer to the imaging unit 8 may be elliptical.
- the shape of the opening and the like have been described as optical features, but the present invention is not limited to this.
- the material to be filled in the aperture may be changed for each imaging optical system 3 to have different characteristics.
- the optical system 9 may be provided with a λ/4 wave plate (quarter-wave plate).
- the p-polarized wave may be cut in order to reduce flare caused by reflection in the display panel 4.
- different wave plates may be provided as the optical systems 9 of the plurality of imaging optical systems 3.
- the influence of flare due to reflection, diffraction, etc. on the display unit 2 can be changed for each imaging optical system 3, and various image correction methods can be used.
- the electronic device has a plurality of imaging optical systems whose openings have a similar layout and which can reduce the influence of flare.
- FIG. 8 shows the display surface of the electronic device 1 according to the present embodiment in the same manner as in FIG.
- the first imaging optical system 3A and the second imaging optical system 3B have the same aperture layout.
- the first imaging optical system 3A and the second imaging optical system 3B exist at different positions in the second direction. That is, each optical system has an elliptical opening having a long axis in the second direction, and is arranged at a position deviated in the second direction.
- FIG. 9 is an image acquired by the second imaging optical system 3B.
- the image acquired by the first imaging optical system 3A is assumed to be FIG. 6A.
- since both imaging optical systems 3 have elliptical openings with their major axes in the second direction, the flare generation directions are the same.
- since the first imaging optical system 3A and the second imaging optical system 3B are arranged so as to be displaced in the second direction, flare occurs at positions displaced in the second direction.
- the image acquisition unit 12 can acquire an image in which the influence of flare is suppressed.
- the first imaging optical system 3A and the second imaging optical system 3B may also be displaced from each other in the first direction. When they are displaced in both the first direction and the second direction in this way, the center points of flare generation in the imaging optical systems lie at different positions in the first direction, so an image with better flare suppression can be obtained.
- FIG. 10 is a diagram showing an example of the electronic device 1 according to the present embodiment.
- the electronic device 1 includes a first imaging optical system 3A, a second imaging optical system 3B, and a third imaging optical system 3C on the opposite side of the display surface of the display unit 2.
- these imaging optics are provided along the first direction.
- the first imaging optical system 3A may be located at the center of the screen, and the second imaging optical system 3B and the third imaging optical system 3C may be provided near the edge portions (boundaries) of the display surface of the display unit 2 so as to sandwich the first imaging optical system 3A.
- the vicinity may be, for example, one to several display-element pixels from the edge or boundary; as other examples, it may be a few mm from the edge of the housing of the electronic device 1, or a few percent of the width or height of the electronic device 1.
- with this arrangement, the parallax between the first imaging optical system 3A and the second imaging optical system 3B and the parallax between the third imaging optical system 3C and the first imaging optical system 3A become equivalent. Consequently, by using the image acquired by the second imaging optical system 3B together with the image acquired by the third imaging optical system 3C, an image whose parallax with respect to the first imaging optical system 3A is (almost) zero can be generated.
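The parallax-cancelling synthesis described above can be illustrated with a short sketch. This is an illustrative Python/NumPy example, not part of the patent: it assumes the two side images are already rectified so that their disparities relative to the central system are equal and opposite, in which case a simple average approximates the central viewpoint.

```python
import numpy as np

def synthesize_center_view(img_b, img_c):
    """Approximate the viewpoint of the central first imaging optical
    system 3A by averaging the images of 3B and 3C, which are placed
    symmetrically about 3A so their disparities are equal and opposite.
    img_b, img_c: float arrays of identical shape (H, W) or (H, W, 3).
    Illustrative sketch only; a real implementation would warp each
    image by its estimated disparity before blending."""
    return 0.5 * (img_b.astype(np.float64) + img_c.astype(np.float64))
```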
- the imaging optical systems are arranged at the edge of the display surface in this way because, even if pixel disorder occurs there, it has little effect on the user.
- the size of the first imaging optical system 3A near the center of the display surface may be smaller than that of other imaging optical systems.
- the size may be, for example, the size of the opening or the size of the imaging unit 8.
- the first imaging optical system 3A may include, as the optical system 9, an elliptical aperture having a major axis in the first direction, while the second imaging optical system 3B and the third imaging optical system 3C may each include, as the optical system 9, an elliptical aperture having a major axis in the second direction.
- FIG. 11 is a diagram showing the images that can be acquired from each imaging optical system when such apertures are provided. These are the images acquired by the first imaging optical system 3A, the second imaging optical system 3B, and the third imaging optical system 3C, in order from the top. As described above, in the first imaging optical system 3A, flare occurs along the second direction, and in the second imaging optical system 3B and the third imaging optical system 3C, flare occurs at positions displaced along the first direction.
- FIG. 12 is an image in which the images of the second imaging optical system 3B and the third imaging optical system 3C are superimposed in consideration of parallax.
- flare can be brought to both ends as shown in FIG. 12, for example.
- by using this superimposed image of FIG. 12 together with the top image of FIG. 11, it is possible, for example, to acquire an image in which flare is suppressed as shown in FIG. 7.
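One simple form of comparing the superimposed image with the top image of FIG. 11 is a pixel-wise minimum, since flare only adds brightness and falls at different positions in the two images. The following Python/NumPy sketch is illustrative only and assumes the two images are already registered:

```python
import numpy as np

def suppress_flare_min(img1, img2):
    """Pixel-wise comparison/selection: because flare brightens pixels
    and appears at different positions in the two aligned images,
    keeping the darker value at each pixel removes most of the flare.
    Both inputs are assumed to be parallax-compensated float arrays."""
    return np.minimum(img1, img2)
```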
- the composition or correction of these images may be executed by the image acquisition unit 12.
- the second imaging optical system 3B and the third imaging optical system 3C are assumed to be at both ends, but the present invention is not limited to this.
- as long as the second imaging optical system 3B and the third imaging optical system 3C are provided at positions where the same parallax appears across the first imaging optical system 3A, an image in which flare is suppressed can be acquired in the same manner.
- the respective imaging optical systems 3 may be arranged at positions shifted in both the first direction and the second direction.
- the intensity of the generated flare may vary depending on the direction. For example, flare may occur strongly in the first direction but weakly in the second direction. In such a case, in the present embodiment, an imaging optical system in which flare is likely to occur in the first direction or the second direction is appropriately selected, and an image in which flare is suppressed may be acquired based on the signal acquired from the selected imaging optical system.
- FIG. 13 shows an example of an image acquired based on the output from the first imaging optical system 3A in a case where flare in the second direction is generated strongly and flare in the first direction is not generated so strongly.
- since the flare intensity in the first direction is weak, an image in which the flare portion is darker than in FIG. 6A is acquired.
- since flare in the second direction is generated strongly, an image in which the flare portion is bright, as shown in FIG. 6B, is acquired as the image output from the second imaging optical system 3B.
- an image in which the flare is suppressed may be acquired based on the output from the first imaging optical system 3A.
- the preprocessing unit 10 calculates the variance of the brightness in the first direction and the second direction in the image data output from each imaging optical system 3. For example, the variance along the first direction is calculated for each row and averaged, and the variance along the second direction is calculated for each column and averaged. The flare intensity can be regarded as stronger in the direction with the higher average value. Alternatively, the direction may be determined based on the difference between the maximum brightness and the minimum brightness for each row or column instead of the variance.
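The variance computation just described can be written compactly. The sketch below (Python/NumPy, illustrative only) averages the per-row variance for the first direction and the per-column variance for the second direction, and reports the direction with the larger spread:

```python
import numpy as np

def flare_direction(img):
    """Estimate the direction of stronger flare from luminance spread.
    The variance along the first direction is computed per row and
    averaged; the variance along the second direction is computed per
    column and averaged. The direction with the larger mean variance
    is reported. img: 2-D float array."""
    row_var = img.var(axis=1).mean()   # spread along the first direction
    col_var = img.var(axis=0).mean()   # spread along the second direction
    return "first" if row_var > col_var else "second"
```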
- which imaging optical system 3 is likely to generate flare depends on the optical system 9 of each imaging optical system 3; for example, as described above, it is determined by the direction of the aperture provided in the optical system 9. Based on the direction of the aperture, the imaging optical system 3 to be prioritized when flare occurs in a given direction may be determined in advance.
- the optical element that controls flare is not limited to the aperture layout; the entire display panel, including the wiring of the circuit, also affects it.
- the flare and diffraction shapes also change depending on what kind of pattern is periodically laid out in the circuit and how the light interferes. With this in mind, the direction in which flare is likely to occur may be determined.
- the image acquisition unit 12 acquires an image in which flare is suppressed based on the imaging optical system 3 whose priority direction differs from the direction in which flare is generated. For example, when the images shown in FIGS. 13 and 6B are output via the respective imaging optical systems 3, the image of FIG. 13, which has less flare, may be selected and output. Alternatively, a weighted average of the images of FIGS. 13 and 6B may be calculated in which the weight of the image of FIG. 13 is increased.
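The weighted average just mentioned might look like the following sketch (Python/NumPy; the weight value 0.8 is an assumed example, not a value from the source):

```python
import numpy as np

def weighted_blend(img_low_flare, img_high_flare, w=0.8):
    """Weighted average that favors the image from the imaging optical
    system whose priority direction differs from the flare direction
    (img_low_flare corresponds to the less-flared image, FIG. 13 in the
    text). w is an assumed example weight."""
    return w * img_low_flare + (1.0 - w) * img_high_flare
```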
- when a plurality of imaging optical systems 3 have the first direction as their priority direction, an image may be acquired using any of the above-described acquisition methods (for example, selection or synthesis) with those imaging optical systems 3.
- alternatively, an image in which the flare is suppressed may be acquired based on the image output via an imaging optical system 3 whose priority direction differs from the direction in which the flare occurs.
- FIG. 14 is a diagram showing a cross-sectional view of the electronic device 1 according to the present embodiment.
- the electronic device 1 includes a light-shielding portion 30 between the plurality of imaging optical systems 3.
- the light-shielding portion 30 may be a light-shielding film formed of a material having a high light-shielding property, or may be an absorption film formed of a material having a high light absorption rate.
- a light-shielding portion 30 may be provided so that light does not propagate between the imaging optical systems 3.
- FIG. 15 shows another example, in which the light-shielding portions 30 may be provided so as to prevent the light reflected by the display panel 4 and the circularly polarizing plate 5 from propagating in the first direction and the second direction with respect to the plurality of imaging optical systems 3.
- FIG. 16 is still another example, and the light-shielding portion 30 may be provided so as to penetrate the display panel 4. By providing it in this way, it is possible to block not only the light reflected by the display unit 2 but also the light reflected by components other than the display unit 2.
- by arranging the light-shielding portion 30 in the region provided with the display panel 4 and the circularly polarizing plate 5, the generation of flare due to light reflected or diffracted in these layers can be suppressed. Even if strong flare is generated in a certain region due to the influence of incident light or the like, and that light produces strong flare in the imaging optical system 3 on which it is incident, the influence of the flare on the other imaging optical systems 3 separated by the light-shielding portion 30 can be reduced.
- that is, by reducing the influence of light reflection and diffraction on the side opposite to the display surface of the display unit 2 in the third direction, mutual influence between the imaging units 8 can be suppressed.
- the light-shielding portion 30 may be provided in the touch panel 6. Further, a sufficiently thin light-shielding portion 30 may be formed so as to reach within the region of the cover glass 7. In this case, the size may be appropriately adjusted and arranged so that the display of the display unit 2 on the display surface looks like a natural image to the user. Further, in the display panel 4, the brightness of the light emitting pixels 4b arranged around the light-shielding portion 30 may be made higher than the brightness of the other light emitting pixels 4b, and other software measures may be taken.
- even in this form, the image acquisition unit 12 can acquire an image with little influence of flare by comparing or synthesizing the output values from the plurality of imaging optical systems 3, as in the above-described embodiments.
- the image acquisition unit 12 acquires an image having a small influence of flare by a predetermined calculation (including comparison).
- an image in which flare is suppressed is acquired by synthesizing outputs from a plurality of imaging optical systems 3 using a model.
- the model may be a statistical model.
- a model may be generated by statistically calculating what kind of operation should be used to synthesize the outputs of the various imaging optical systems 3, and the image acquisition unit 12 may acquire an image with little influence of flare by inputting the information acquired from the plurality of imaging optical systems 3 into this model.
- the model may be a neural network model trained by deep learning.
- the neural network model may be formed by MLP (Multi-Layer Perceptron), CNN (Convolutional Neural Network), or the like.
- parameters trained in advance on a plurality of teacher data may be stored in the storage unit 20 or the image acquisition unit 12, and the image acquisition unit 12 may form a neural network model based on the stored parameters.
- the image acquisition unit 12 may acquire an image in which flare is suppressed by using the data output from the plurality of imaging optical systems 3.
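As an illustration of how stored parameters could drive such a model, the following sketch runs a minimal MLP forward pass over pixel values stacked from two imaging optical systems. The layer shapes and the example parameters are hypothetical; a practical model (for example, a CNN) would be far larger:

```python
import numpy as np

def mlp_forward(x, params):
    """Minimal MLP forward pass illustrating how the image acquisition
    unit 12 could combine pixel values from plural imaging optical
    systems using parameters trained in advance and read from the
    storage unit 20. Shapes and layer count are assumptions.
    x: (N, d_in) stacked pixel values; params: list of (W, b) tuples."""
    h = x
    for W, b in params[:-1]:
        h = np.maximum(h @ W + b, 0.0)   # ReLU hidden layers
    W, b = params[-1]
    return h @ W + b                     # linear output: corrected pixel

# Hypothetical trained parameters: 2 camera inputs -> 1 corrected pixel.
params = [(np.array([[0.5], [0.5]]), np.array([0.0]))]
```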
- the electronic device 1 may further improve the training accuracy by using the captured image.
- the training may be executed in the control unit 18 or the like of the electronic device 1.
- a plurality of electronic devices 1 may transmit data to storage existing in a cloud or the like, training may be performed on a server or the like, and the retrained parameters may be reflected in each electronic device 1. In this case, only the flare information may be transmitted so that privacy information, including the user's face information, is not included. Further, the transmission and reception of data from the electronic device 1 may be placed in a state that the user can select by, for example, opt-in or opt-out.
- the image acquisition unit 12 may acquire an image not only by linear processing but also by non-linear processing, particularly, calculation using various models including a trained model.
- the image acquisition unit 12 acquires an image based on a pixel value or a priority direction, or by combining images based on a method using a model or the like.
- in the present embodiment, the region in which image correction is executed is limited to a part of the image.
- the image acquisition unit 12 can predict, for each imaging optical system 3, the region where flare occurs when there is a bright subject, and the image may be corrected based on the prediction result.
- Flare is often caused by nearby optical elements. Therefore, in the images acquired by the plurality of imaging optical systems 3, the parallax of the flare is likely to be larger than that of other subjects. Accordingly, the parallax may be calculated by the image acquisition unit 12 or the like, and a region where the parallax is large may be predicted as a region where flare is generated.
- in the predicted flare region, a pixel value is determined based on information acquired from another imaging optical system 3.
- interpolation processing from another imaging optical system 3 may be executed in this region.
- the correction process is not limited to this, and the correction process may be performed by another calculation such as a weighting calculation so that the influence of the pixel value output from the other imaging optical system 3 becomes large.
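The region-limited weighting correction described above might be sketched as follows (Python/NumPy; the mask construction and the weight 0.9 are assumed examples):

```python
import numpy as np

def correct_flare_region(img, img_other, flare_mask, w_other=0.9):
    """In the predicted flare region (flare_mask True), weight the
    pixel values from the other imaging optical system 3 heavily;
    elsewhere keep the original image unchanged. w_other is an assumed
    example weight, not a value from the source."""
    out = img.astype(np.float64).copy()
    blended = w_other * img_other + (1.0 - w_other) * img
    out[flare_mask] = blended[flare_mask]
    return out
```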
- an image with little influence of flare may be acquired based on the information acquired from these other plural imaging optical systems 3.
- this region may be updated by reflecting the training described in the above embodiment in the model. By reflecting it as training, it is possible, for example, to improve the accuracy of correction across different users having the same electronic device 1.
- this region may also be determined, for example, from the incident light while nothing is displayed on the display panel 4.
- alternatively, the display panel 4 may be caused to emit light with an intensity that causes flare, in order to predict the region.
- processing for determining the region where flare occurs may be performed at timings other than the imaging timing desired by the user, for example, at timings before and after shooting. This processing may be executed by, for example, the control unit 18 or the image acquisition unit 12.
- the electronic device 1 includes a microlens array as the optical system 9 of the imaging optical system 3.
- FIG. 17 is a diagram showing an imaging optical system 3 according to the present embodiment.
- the optical system 9 of the imaging optical system 3 includes a microlens array 32.
- the light that has passed through the microlens array 32 appropriately enters the imaging unit 8, is converted into a signal by the imaging unit 8, and is output.
- the preprocessing unit 10 may reconstruct the image based on the signal output from the imaging unit 8. Based on this reconstructed image, the image acquisition unit 12 acquires an image in which the influence of flare is reduced according to each of the above-described embodiments.
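A highly simplified view of such a reconstruction is to split the raw sensor frame into per-lens sub-images and average them. This sketch (Python/NumPy) only illustrates the data flow from the imaging unit 8 to the preprocessing unit 10; real light-field reconstruction would also re-register the sub-images:

```python
import numpy as np

def reconstruct_from_mla(raw, n_lens_y, n_lens_x):
    """Very simplified reconstruction of an image from a sensor behind
    a microlens array 32: the raw frame is split into one sub-image per
    lens and the sub-images are averaged. The lens grid size is an
    assumed parameter; real reconstruction also shifts/registers the
    sub-images before combining them."""
    h, w = raw.shape
    sub_h, sub_w = h // n_lens_y, w // n_lens_x
    subs = raw[:n_lens_y * sub_h, :n_lens_x * sub_w].reshape(
        n_lens_y, sub_h, n_lens_x, sub_w)
    return subs.mean(axis=(0, 2))   # average over the lens grid
```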
- FIG. 18 is a diagram showing another example of the imaging optical system 3 according to the present embodiment.
- the first imaging optical system 3A and the second imaging optical system 3B include a common microlens array 32 as the optical system 9.
- the region on which the light collected by each lens is incident is limited to a certain region. Therefore, by arranging an imaging unit 8 for each region, a plurality of imaging optical systems 3 can be formed.
- the light-shielding unit 30 described in the fifth embodiment may be provided between the first imaging unit 8A and the second imaging unit 8B.
- two imaging optical systems are provided, but the present invention is not limited to this, and three or more imaging optical systems 3 may be formed by the same microlens array 32.
- FIG. 19 is a diagram showing an example of the imaging optical system 3 according to the present embodiment.
- the electronic device 1 includes one imaging unit 8 for the first imaging optical system 3A and the second imaging optical system 3B.
- a first imaging unit 8A and a second imaging unit 8B are defined in a region where imaging is possible.
- Light passing through the first optical system 9A is incident on the first imaging unit 8A, and light is incident on the second imaging unit 8B via the second optical system 9B.
- a plurality of imaging optical systems 3 may be formed by providing the same imaging unit 8 with a plurality of optical systems 9.
- FIG. 20 is a diagram showing another implementation example of this embodiment. That is, the microlens array 32 may be provided as the optical system 9, and the same microlens array 32 and the imaging unit 8 may be used to form the first imaging optical system 3A and the second imaging optical system 3B.
- FIG. 21 is a modification of FIG. 20, and a light-shielding portion 30 is provided between the first imaging optical system 3A and the second imaging optical system 3B.
- FIG. 22 is a modification of FIG. 19, in which a light-shielding portion 30 is provided between the first optical system 9A and the second optical system 9B.
- in the figure, the light-shielding portion 30 does not penetrate the display panel 4, but of course, as in the example of FIG. 16, the light-shielding portion 30 may be provided so as to penetrate the display panel 4.
- by making the imaging unit 8 common to the plurality of imaging optical systems 3 in this way, the circuit configuration and the semiconductor process of the imaging unit 8 can be simplified.
- a plurality of imaging optical systems 3 can be integrated by using the same element in part or in whole as in the present embodiment.
- in this way, the influence of parallax between the imaging optical systems 3 can be reduced, and the accuracy of the correction can be improved.
- as a modification, a plurality of imaging optical systems 3 may be integrated so that the optical systems 9 are adjacent to each other. In this case as well, accurate correction can be performed while the influence of parallax is reduced.
- FIG. 23A is a diagram showing another arrangement example of the microlens array 32.
- the microlens array may be arranged closer to the imaging unit 8 than the aperture of the optical system 9.
- the optical system 9 may be formed with the aperture and the microlens array 32 as separate configurations.
- the aperture is not always essential, and the optical system 9 may be formed by arranging the microlens array 32 on the side opposite to the surface of the display panel 4.
- FIG. 23B shows yet another example of the arrangement of the microlens array 32 and the arrangement of the display panel 4.
- one or more microlens arrays 32 may be provided for a plurality of openings.
- an aperture may be appropriately formed by leaving a region between the light emitting pixels of the display panel 4, and the microlens array 32 may be provided between the lower portion of the aperture and the imaging unit 8.
- the light-shielding portion 30 may be provided at an appropriate position.
- FIG. 24 shows an example of the configuration of the chip according to the present embodiment. This configuration is outlined as an example without limitation, and does not preclude the chip from being provided with other functions. That is, a selector, an I / F, a power supply node, and the like are appropriately provided in addition to those shown in the figure.
- an analog circuit such as the preprocessing unit 10 and a logic circuit such as the image acquisition unit 12 may be provided on the same chip as the image pickup unit 8. Further, the control unit 18 and the storage unit 20 may also be provided.
- by forming the imaging unit 8, the preprocessing unit 10, and the image acquisition unit 12 on the same chip in this way, data processing can be executed at high speed with suppressed signal deterioration, without using an interface for transmitting and receiving signals and data.
- the preprocessing unit 10 does not have to be common, and may be provided for each imaging unit 8. In this case, the digital image data may be transmitted from each preprocessing unit 10 to the image acquisition unit 12.
- FIG. 25 is a diagram showing another example of the chip configuration.
- the configuration other than the imaging unit 8, the preprocessing unit 10, the image acquisition unit 12, and the interface is omitted.
- the imaging units 8 may be provided on separate chips.
- one chip is provided with an imaging unit 8, and another chip is also provided with an imaging unit 8.
- the imaging unit 8 on the other chip transmits its signal to the first chip, which is provided with a receiving interface 22.
- the signal is received via the receiving interface 22, and the image acquisition unit 12 acquires an image from it together with the signal from the imaging unit 8 provided on the same chip. If necessary, the preprocessing unit 10 executes signal processing before outputting the data to the image acquisition unit 12.
- the preprocessing unit 10 may be provided on both chips, or may be provided only on the other chip.
- further imaging units 8 may be provided on additional chips. With such a configuration, the chip area related to each imaging unit 8 can be kept small even when the imaging optical systems 3 are arranged at various positions on the side opposite to the display surface of the display unit 2.
- FIG. 26 is still another example, in which a plurality of imaging units 8 are located on separate chips, and an image acquisition unit 12 is located on yet another chip.
- in this way, the logic circuit that processes the digital signal and the photoelectric conversion circuit containing the light receiving elements can be configured as separate chips. By doing so, the degree of freedom in arranging the imaging optical systems 3 can be further increased without significantly increasing the chip area.
- the receiving interface 22 may be, for example, an interface capable of executing data reception according to the MIPI standard. Further, it may be capable of high-speed transfer of another standard. Depending on the signal to be transmitted, an analog signal may be transmitted and received, or a digital signal may be transmitted and received.
- the type of interface can be appropriately selected according to how the imaging unit 8 and the preprocessing unit 10 are mounted on the chips.
- the light emitting pixels 4b of the display panel 4 have not been particularly described so far, but by varying the light emitting pixels 4b, the occurrence of flare in the plurality of imaging optical systems 3 can be changed, which also enables the image acquisition unit 12 to acquire an image less affected by flare.
- the light emitting pixels 4b around the first imaging optical system 3A may be of a different type from the light emitting pixels 4b around the second imaging optical system 3B.
- one may be an OLED and the other may be a MicroLED.
- by using light emitting elements, in particular display optical systems, with different optical characteristics, the influence of flare can be given different characteristics in each imaging optical system.
- in this case, the flare generation in the images acquired from the individual imaging optical systems 3 differs depending on the situation. Since the flare characteristics differ, for example, one imaging optical system 3 acquires an image strongly affected by flare while the other imaging optical system 3 acquires an image only weakly affected by flare. As described above, by using light emitting elements having different characteristics, the methods of image composition and correction available to the image acquisition unit 12 can be expanded in various ways.
- images output from a plurality of imaging optical systems 3 are used to acquire an image in which the influence of flare is reduced. That is, a plurality of imaging optical systems 3 are activated, and an image is acquired based on the information acquired from each of them.
- in the present embodiment, at least one imaging optical system 3 is activated as needed to reduce the influence of flare; for example, at least one additional imaging optical system 3 is activated only when image correction is required.
- the imaging optical system 3 near the center of the display surface may be activated.
- when the influence of flare is detected, another imaging optical system 3 may be activated, and the image acquisition unit 12 may acquire an image in which the influence of flare is reduced.
- the influence of flare may be judged based on other conditions.
- in the above, the determination is based on the output from one imaging optical system 3, but it is not limited to this; for example, the determination may be made based on the magnitude of the signals from several light receiving elements. As another example, it may be based on the distribution of the brightness or the like of the image displayed on the display panel 4. In this way, the influence of flare can be judged by various determination methods.
- the control unit 18 operates an appropriate imaging optical system 3, a signal based on the incident light is acquired, and the image acquisition unit 12 acquires an image.
- the control unit 18 may also execute the determination of the influence of the flare described above. In this way, power saving can be achieved by operating at least a part of the imaging optical system 3 as needed.
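A minimal version of the flare judgment that gates the activation of additional imaging optical systems could look like this (Python/NumPy; the saturation thresholds are assumed example values, not values from the source):

```python
import numpy as np

def needs_second_camera(img, sat_level=250, sat_fraction=0.01):
    """Judge from one imaging optical system's output whether flare is
    likely, here taken as an unusually large fraction of near-saturated
    pixels; only then would the control unit 18 power up another
    imaging optical system 3. Thresholds are assumed example values."""
    frac = np.mean(img >= sat_level)
    return bool(frac > sat_fraction)
```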
- FIGS. 27 to 31B are diagrams showing an arrangement example of the imaging optical system 3 as viewed from the display surface, respectively.
- Each of these figures shows an example arrangement and is not limiting. Further, an arrangement may include part or all of the arrangements in a plurality of figures. That is, the electronic device 1 may include three or more imaging optical systems 3 combining the features of a plurality of figures, not only the features of a single figure.
- the imaging optical system 3 may be provided at both ends (near the boundary) in any direction of the display screen 1a, for example, in the first direction. By arranging in this way, the imaging optical system 3 can be provided at a position inconspicuous to the user. It may be provided at both ends in the second direction, that is, at the upper and lower ends in FIG. 27, respectively.
- two imaging optical systems 3 may be provided in the display screen 1a at a distance of 50 mm to 80 mm.
- a plurality of imaging optical systems 3 may be arranged so as to have a deviation in each of the first direction and the second direction.
- by providing each optical system 9 with an aperture having a major axis in the same direction, it is possible to acquire images whose flare centers are displaced in both the first direction and the second direction. Therefore, the image acquisition unit 12 can acquire a highly accurate flare-suppressed image even with simpler processing (for example, the comparison and selection processes shown in the first embodiment).
- three or more imaging optical systems 3 may be provided.
- they may be arranged without symmetry, as shown in FIG. 30, or they may be arranged with some symmetry with respect to the first direction, the second direction, the center point, or the like.
- the third imaging optical system 3 may be arranged at a position away from the other two imaging optical systems 3. In this case, the flare intensity due to the position may be different in each imaging optical system 3.
- FIGS. 31A and 31B show the same electronic device 1.
- FIG. 31B is an external view seen from the direction of the arrow in FIG. 31A.
- the imaging optical system 3 on each display screen may be arranged at the same position with respect to the display screen 1a, or may be arranged at a completely different position.
- the imaging optical system 3 in FIG. 31B may be used as a rear camera in normal use, and can also be used as a two-eye (or more) rear camera.
- the imaging optical system 3 can be arranged in various ways on the display screen 1a.
- At least two of the plurality of imaging optical systems 3 may be provided at different positions.
- the arrangement can be freely selected regardless of the drawings or the like.
- as shown in FIG. 2 and the like, by providing a plurality of imaging optical systems 3 displaced in the first direction around the center of the display surface in the second direction, it is possible to acquire images and videos matching the natural line of sight of a user looking around the center of the screen.
- the present invention is not limited to this, and for example, it may be provided slightly above in the second direction.
- when the center of a displayed face is located around the center of the screen, the imaging optical systems may be arranged around the displayed eyes. Further, apart from the above, they may similarly be provided near the center of the first direction along the second direction. In this case, the same image can be acquired even if the screen is rotated.
- the layout may be more complicated than the simple shape as shown in FIGS. 5 and 8.
- FIGS. 32 to 35 are diagrams showing examples of the layout of the openings. As shown in FIG. 32, for example, the openings may be arranged so as to be offset in the second direction with respect to the display elements.
- the opening may have a shape in which an ellipse having a major axis in the first direction and an ellipse having a major axis in the second direction are combined between display elements.
- the aperture belonging to the first optical system 9A and the aperture belonging to the second optical system 9B do not have to intersect at an angle close to 90 °.
- the angle of the light emitting element of the display panel 4 may be partially changed.
- for example, the light emitting elements provided on the display surface, in the third direction of any of the optical systems 9, may be rotated by 45° or some other significant angle.
- as shown in FIG. 35, the openings shown in FIG. 33 may be arranged as shown in FIG. 34.
- in this case as well, the apertures forming part of the optical systems 9 may be laid out so that flare occurs in significantly different directions.
- the electronic device 1 is provided with three or more imaging optical systems 3, the layouts of the openings shown in FIGS. 5, 8, 32 to 35 may be combined.
- since the flare directions then combine various directions depending on the imaging optical system 3, the range of image acquisition, for example image correction, composition, and selection, can be further expanded.
- as described above, in the present embodiment, the imaging optical systems 3 are arranged on the side opposite to the display surface of the display unit 2, and the light passing through the display unit 2 is acquired by the plurality of imaging units 8. Part of the light passing through the display unit 2 is repeatedly reflected within the display unit 2 and is then incident on the imaging units 8 of the plurality of imaging optical systems 3.
- by integrating the plurality of imaging units 8, it is possible to easily and reliably generate a captured image in which the flare component and the diffracted light component contained in the incident light (including the light repeatedly reflected within the display unit 2) are suppressed.
- the processing in the image acquisition unit 12 or the like may be composed of a digital circuit or a programmable circuit such as an FPGA (Field Programmable Gate Array). Further, the processing content may be described in a program, and information processing by software may be concretely realized by using hardware resources such as a CPU.
- FIG. 36 is a plan view when the electronic device 1 of each embodiment is applied to the capsule endoscope 50.
- the capsule endoscope 50 of FIG. 36 includes, in a housing 51 having hemispherical end faces and a cylindrical central portion, a camera (ultra-small camera) 52 for capturing an image in the body cavity, a memory 53 for recording the image data captured by the camera 52, and a wireless transmitter 55 for transmitting the recorded image data to the outside via an antenna 54 after the capsule endoscope 50 is discharged to the outside of the subject's body.
- a CPU (Central Processing Unit) 56 and a coil (magnetic force / current conversion coil) 57 are provided in the housing 51.
- the CPU 56 controls the shooting by the camera 52 and the data storage operation in the memory 53, and also controls the data transmission from the memory 53 to the data receiving device (not shown) outside the housing 51 by the wireless transmitter 55.
- the coil 57 supplies electric power to the camera 52, the memory 53, the wireless transmitter 55, the antenna 54, and the light source 52b described later.
- the housing 51 is provided with a magnetic (reed) switch 58 for detecting when the capsule endoscope 50 is set in the data receiving device.
- When the reed switch 58 detects that the capsule endoscope has been set on the data receiving device and data transmission has become possible, the CPU 56 supplies electric power from the coil 57 to the wireless transmitter 55.
- The camera 52 has, for example, an image sensor 52a and an optical system 9 for capturing images inside the body cavity, and a plurality of light sources 52b that illuminate the inside of the body cavity.
- The camera 52 is composed of, for example, a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device), with LEDs (Light Emitting Diodes) as the light sources 52b.
- The display unit 2 in the electronic device 1 of the above-described embodiments is a concept that includes light emitters such as the light source 52b of FIG. 36.
- The capsule endoscope 50 of FIG. 36 has, for example, two light sources 52b, and these light sources 52b can be configured as a display panel 4 having a plurality of light source units or as an LED module having a plurality of LEDs. In this case, by arranging the imaging unit 8 of the camera 52 below the display panel 4 or the LED module, restrictions on the layout of the camera 52 are reduced, and a smaller capsule endoscope 50 can be realized.
- FIG. 37 is a rear view when the electronic device 1 of the above-described embodiment is applied to the digital single-lens reflex camera 60.
- Digital single-lens reflex cameras 60 and compact cameras have a display unit 2 for displaying a preview screen on the back surface opposite the lens.
- the imaging optical system 3 may be arranged on the side opposite to the display surface of the display unit 2 so that the photographer's face image can be displayed on the display screen 1a of the display unit 2.
- Since the imaging optical system 3 can be arranged in an area overlapping the display unit 2, it is not necessary to provide the imaging optical system 3 in the bezel of the display unit 2, and the display unit 2 can be made as large as possible.
- FIG. 38A is a plan view showing an example in which the electronic device 1 of the above-described embodiment is applied to a head-mounted display (hereinafter, HMD) 61.
- the HMD 61 of FIG. 38A is used for VR (Virtual Reality), AR (Augmented Reality), MR (Mixed Reality), SR (Substitutional Reality), and the like.
- Current HMDs have a camera 62 mounted on the outer surface, so the HMD wearer can view images of the surroundings, but the people around the wearer cannot see the wearer's eyes or facial expressions.
- the display surface of the display unit 2 is provided on the outer surface of the HMD 61, and the imaging optical system 3 is provided on the opposite side of the display surface of the display unit 2.
- With this configuration, the facial expression of the wearer captured by the imaging optical system 3 can be displayed on the display surface of the display unit 2, and the people around the wearer can grasp the wearer's facial expression and eye movement in real time.
- The electronic device 1 can thus be used for various purposes, and its utility value can be enhanced.
- A display unit having a displayable region in which display optical systems are arranged in an array along a first direction and a second direction intersecting the first direction.
- A plurality of imaging optical systems arranged on the side opposite to the display surface of the display unit so as to overlap the displayable region in a third direction intersecting the first direction and the second direction, including at least
- a first imaging optical system, and a second imaging optical system having coordinates different from those of the first imaging optical system in at least one of the first direction and the second direction.
- An image acquisition unit that acquires image data based on the information acquired by the first imaging optical system and the second imaging optical system. An electronic device comprising the above.
- Light propagates from the display surface of the display unit to the first imaging optical system and the second imaging optical system via optical systems having different optical characteristics. The electronic device according to (1).
- The display unit is provided with apertures through which light incident from the display surface propagates, and light incident from the display surface propagates to the imaging optical systems through the apertures.
- The electronic device according to (1) or (2).
- the aperture that propagates light to the first imaging optical system and the aperture that propagates light to the second imaging optical system have different layouts.
- the aperture that propagates light to the first imaging optical system and the aperture that propagates light to the second imaging optical system form diffraction images in different directions.
- A third imaging optical system having, with respect to the first imaging optical system, the same parallax as that between the first imaging optical system and the second imaging optical system is provided.
- The image acquisition unit acquires the image data based on information obtained from the information acquired by the second imaging optical system and the third imaging optical system, and on information acquired from the first imaging optical system.
- The electronic device according to any one of (1) to (5).
- In the first direction or the second direction of the display surface, the first imaging optical system is provided near the center.
- the second imaging optical system and the third imaging optical system are provided near the boundary of the display surface so as to sandwich the first imaging optical system.
- the first imaging optical system has a smaller region facing the display surface in the third direction than the second imaging optical system and the third imaging optical system.
- the arrangement of the first imaging optical system and the second imaging optical system has different coordinates in both the first direction and the second direction.
- the electronic device according to any one of (1) to (7).
- When combining data acquired from the first imaging optical system with data acquired from the second imaging optical system, the image acquisition unit acquires the imaging result based on the data having lower intensity at each position in the image data that is the combination result.
- the electronic device according to any one of (1) to (8).
- The first imaging optical system and the second imaging optical system each have a direction that is preferentially reflected in the output.
- The image acquisition unit acquires the imaging result using one of the outputs when a difference of a predetermined value or more occurs between the outputs for each direction.
- the electronic device according to any one of (1) to (9).
- the space between the first imaging optical system and the second imaging optical system is shielded from light.
- the electronic device according to any one of (1) to (10).
- the image acquisition unit synthesizes the information acquired by the first imaging optical system and the second imaging optical system using the trained model.
- the electronic device according to any one of (1) to (11).
- the trained model is trained on the basis of data collected from multiple electronic devices.
- the image acquisition unit makes corrections when the parallax in a plurality of images acquired by the plurality of imaging optical systems exceeds a predetermined amount.
- the electronic device according to any one of (1) to (13).
- At least one of the imaging optical systems is composed of a microlens array.
- the electronic device according to any one of (1) to (14).
- A plurality of the imaging optical systems are provided in the region where the microlens array is provided.
- the first image pickup optical system and the second image pickup optical system acquire information in the same image pickup device.
- the electronic device according to any one of (1) to (16).
- the image acquisition unit is arranged on the same chip as the image sensor.
- the display unit includes a plurality of display optical systems having different optical characteristics.
- the electronic device according to any one of (1) to (18).
- At least one of the plurality of imaging optical systems operates when signal correction is required in the image acquisition unit.
- the electronic device according to any one of (1) to (19).
- the first imaging optical system and the second imaging optical system are integrated.
- the electronic device according to any one of (1) to (20).
- the first imaging optical system and the second imaging optical system are provided near the boundary of the display surface.
- the electronic device according to any one of (1) to (21).
- the first imaging optical system and the second imaging optical system are arranged at a distance of 50 mm or more and 80 mm or less.
- the image acquisition unit generates parallax image data of information acquired by the first imaging optical system and the second imaging optical system.
- the electronic device according to any one of (1) to (22).
- the display unit is provided on both sides of the device.
- the electronic device according to any one of (1) to (23).
- the first imaging optical system and the second imaging optical system are any two of the plurality of imaging optical systems.
- the electronic device according to any one of (1) to (24).
- A fourth imaging optical system different from the first imaging optical system and the second imaging optical system is provided.
- The fourth imaging optical system, together with the first imaging optical system or the second imaging optical system, has any of the features described in (1) to (24). An electronic device.
- 1: Electronic device, 1a: Display screen, 1b: Bezel, 2: Display unit, 3: Imaging optical system, 3A: First imaging optical system, 3B: Second imaging optical system, 3C: Third imaging optical system, 4: Display panel, 4a: Substrate, 4b: Light-emitting pixel, 5: Circularly polarizing plate, 6: Touch panel, 7: Cover glass, 8: Imaging unit, 8A: First imaging unit, 8B: Second imaging unit, 9: Optical system, 9A: First optical system, 9B: Second optical system, 10: Preprocessing unit, 12: Image acquisition unit, 14: Post-processing unit, 16: Output unit, 18: Control unit, 20: Storage unit, 22: Receiving interface, 30: Light-shielding unit, 32: Microlens array
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Optics & Photonics (AREA)
- Studio Devices (AREA)
- Geometry (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
Abstract
Description
FIG. 1 is a schematic cross-sectional view of the electronic device 1 according to the first embodiment. The electronic device 1 of FIG. 1 is any electronic device having both a display function and an imaging function, such as a smartphone, mobile phone, tablet, or PC. As shown at the lower left of the figure, the first direction points to the right of the drawing, the second direction is perpendicular to the drawing, and the third direction points downward in the drawing. That is, the second direction intersects the first direction, and the third direction intersects the first direction and the second direction. Note that "intersecting" may include intersecting at an angle of 90°, and the angle need not be exactly 90°. As can be seen from the figure, the first direction and the second direction are distinguished only for convenience and are interchangeable.
The electronic device according to the present embodiment has a plurality of imaging optical systems provided with apertures of similar layout, which makes it possible to reduce the influence of flare.
The present embodiment describes an electronic device having three imaging optical systems.
The intensity with which flare occurs may vary with direction: for example, flare may occur strongly in the first direction but weakly in the second direction. In such a case, the present embodiment may appropriately select the imaging optical system according to the direction in which flare is likely to occur, the first direction or the second direction, and acquire a flare-suppressed image based on the signal obtained from the selected imaging optical system.
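The direction-dependent selection described above can be sketched as follows. This is a minimal numpy illustration, not taken from the patent: the energy measure, function names, and axis convention are assumptions. It compares, for two registered frames, the intensity variation along each camera's known flare-prone direction and keeps the frame that is less affected.

```python
import numpy as np

def directional_energy(img: np.ndarray, axis: int) -> float:
    """Mean squared finite difference along one axis -- a rough proxy
    for streak-like flare energy oriented across that axis."""
    return float(np.mean(np.diff(img, axis=axis) ** 2))

def select_low_flare(img_a: np.ndarray, img_b: np.ndarray,
                     flare_axis_a: int, flare_axis_b: int) -> np.ndarray:
    """Pick the frame whose known flare-prone direction carries less energy.
    flare_axis_*: axis (0 = first direction, 1 = second direction) along
    which that camera's aperture layout tends to produce flare."""
    e_a = directional_energy(img_a, flare_axis_a)
    e_b = directional_energy(img_b, flare_axis_b)
    return img_a if e_a <= e_b else img_b
```

In practice such a choice could also be made per region rather than per frame; the whole-frame comparison above is only the simplest form of the idea.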
In the electronic device 1 of the present embodiment, the imaging optical systems 3 are arranged such that, even when flare occurs, its occurrence is weak in at least one of the imaging optical systems 3.
In each of the embodiments described above, the image acquisition unit 12 acquired an image with little influence of flare by a predetermined calculation (including comparison). In contrast, the present embodiment combines the outputs of the plurality of imaging optical systems 3 using a model to acquire a flare-suppressed image.
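One concrete instance of such a "predetermined calculation (including comparison)" is to keep the lower intensity at each position, since flare and diffraction only ever add light to the scene. A hedged numpy sketch, assuming the two frames are already registered to the same geometry (function name is illustrative):

```python
import numpy as np

def min_composite(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Per-position comparison: flare and diffracted light add intensity,
    so at each pixel the lower of the two registered values is the better
    estimate of the flare-free scene."""
    if img_a.shape != img_b.shape:
        raise ValueError("inputs must be registered to the same shape")
    return np.minimum(img_a, img_b)
```

A trained model, as used in this embodiment, can replace this fixed rule with a learned combination, but the minimum composite remains a useful baseline.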
In the embodiments described above, the image acquisition unit 12 acquires an image based on pixel values or priority directions, or by combining images using a model-based method. The present embodiment attempts to limit, for each imaging optical system 3, the region in which image correction is performed.
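Limiting the correction region might look like the following sketch: only positions where the two registered frames disagree by more than a threshold are treated as flare-suspect and corrected, while the rest of the primary frame is left untouched. This is an illustrative assumption, not the patent's algorithm; the threshold value and names are hypothetical.

```python
import numpy as np

def limited_correction(primary: np.ndarray, secondary: np.ndarray,
                       threshold: float = 0.1) -> np.ndarray:
    """Correct only where the primary frame is brighter than the secondary
    by more than `threshold` (a hypothetical tuning value): excess
    brightness there is taken as suspected flare, and the lower intensity
    is substituted. Everywhere else the primary frame is kept as-is."""
    out = primary.copy()
    mask = (primary - secondary) > threshold  # primary brighter: likely flare
    out[mask] = secondary[mask]
    return out
```

Restricting the correction to such a mask keeps most of the primary image untouched, which reduces computation and avoids degrading flare-free regions.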
In the present embodiment, the electronic device 1 includes a microlens array as the optical system 9 of the imaging optical system 3.
In the embodiment described above, a configuration in which a plurality of imaging units 8 are provided in the same microlens array 32 was described as an example; in the present embodiment, the electronic device 1 includes the same imaging unit 8 for a plurality of optical systems 9.
The present embodiment describes the chip configuration of the imaging unit 8 and related components.
In the embodiments described above, the light-emitting pixels 4b of the display panel 4 were not specifically discussed; however, the light-emitting pixels 4b can be used to vary the occurrence of flare in the plurality of imaging optical systems 3, allowing the image acquisition unit 12 to acquire an image with little influence of flare.
Each of the embodiments described above acquires an image in which the influence of flare is reduced using images output from the plurality of imaging optical systems 3. That is, the plurality of imaging optical systems 3 are kept activated, and an image is acquired based on the information obtained from each of them. In the present embodiment, at least one imaging optical system 3 is activated as needed to reduce the influence of flare. For example, at least one imaging optical system 3 is provided that is activated when image correction is necessary.
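The on-demand activation can be illustrated with a small sketch (assumed logic, not from the patent): the secondary imaging system stays inactive until a cheap flare heuristic on the primary frame indicates that correction is needed. The capture callables stand in for the actual hardware, and the saturation thresholds are hypothetical.

```python
import numpy as np

def needs_correction(img: np.ndarray, sat: float = 0.98,
                     frac: float = 0.01) -> bool:
    """Heuristic flare check: a meaningful fraction of near-saturated
    pixels suggests flare (thresholds are illustrative only)."""
    return float(np.mean(img >= sat)) > frac

def acquire(primary_capture, secondary_capture) -> np.ndarray:
    """Keep the secondary imaging system powered down unless the primary
    frame appears to need correction; only then capture a second frame
    and merge it in (capture callables are stand-ins for hardware)."""
    img = primary_capture()
    if not needs_correction(img):
        return img
    img2 = secondary_capture()          # activated on demand
    return np.minimum(img, img2)        # simple flare-suppressing merge
```

Activating the extra imaging optical system only when needed saves power in the common, flare-free case.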
In each of the embodiments described above, various implementations of the imaging optical systems 3 and the image acquisition unit 12 were described. The present embodiment describes the arrangement of the imaging optical systems 3.
Various specific candidates are conceivable for the electronic device 1 having the configuration described in the above embodiments. For example, FIG. 36 is a plan view when the electronic device 1 of each embodiment is applied to a capsule endoscope 50. The capsule endoscope 50 of FIG. 36 includes, in a housing 51 having hemispherical end faces and a cylindrical central portion, a camera (ultra-compact camera) 52 for capturing images inside a body cavity, a memory 53 for recording the image data captured by the camera 52, and a wireless transmitter 55 for transmitting the recorded image data to the outside via an antenna 54 after the capsule endoscope 50 has been discharged from the subject's body.
FIG. 37 is a rear view when the electronic device 1 of the above embodiments is applied to a digital single-lens reflex camera 60. Digital single-lens reflex cameras 60 and compact cameras have a display unit 2 on the back surface opposite the lens that shows a preview screen. The imaging optical system 3 may be arranged on the side opposite to the display surface of this display unit 2 so that the photographer's face image can be displayed on the display screen 1a of the display unit 2. In the electronic device 1 according to each of the above embodiments, the imaging optical system 3 can be arranged in a region overlapping the display unit 2, so the imaging optical system 3 need not be provided in the bezel of the display unit 2, and the display unit 2 can be made as large as possible.
FIG. 38A is a plan view showing an example in which the electronic device 1 of the above embodiments is applied to a head-mounted display (HMD) 61. The HMD 61 of FIG. 38A is used for VR (Virtual Reality), AR (Augmented Reality), MR (Mixed Reality), SR (Substitutional Reality), and the like. As shown in FIG. 38B, current HMDs have a camera 62 mounted on the outer surface; while the HMD wearer can view images of the surroundings, the people around the wearer cannot see the wearer's eyes or facial expressions.
The electronic device includes:
a display unit having a displayable region in which display optical systems are arranged in an array along a first direction and a second direction intersecting the first direction;
a plurality of imaging optical systems arranged, in a third direction intersecting the first direction and the second direction, on the side opposite to the display surface of the display unit so as to overlap the displayable region, the plurality of imaging optical systems including at least
a first imaging optical system, and
a second imaging optical system having coordinates different from those of the first imaging optical system in at least one of the first direction and the second direction; and
an image acquisition unit that acquires image data based on information acquired by the first imaging optical system and the second imaging optical system.
An electronic device comprising the above.
Light propagates from the display surface of the display unit to the first imaging optical system and the second imaging optical system via optical systems having different optical characteristics.
The electronic device according to (1).
The display unit is provided with apertures through which light incident from the display surface propagates, and
light incident from the display surface propagates to the imaging optical systems through the apertures.
The electronic device according to (1) or (2).
The aperture that propagates light to the first imaging optical system and the aperture that propagates light to the second imaging optical system have different layouts.
The electronic device according to (3).
The aperture that propagates light to the first imaging optical system and the aperture that propagates light to the second imaging optical system form diffraction images in different directions.
The electronic device according to (3) or (4).
A third imaging optical system having, with respect to the first imaging optical system, the same parallax as that between the first imaging optical system and the second imaging optical system is provided, and
the image acquisition unit acquires the image data based on information obtained from the information acquired by the second imaging optical system and the third imaging optical system, and on information acquired from the first imaging optical system.
The electronic device according to any one of (1) to (5).
In the first direction or the second direction of the display surface,
the first imaging optical system is provided near the center,
the second imaging optical system and the third imaging optical system are provided near the boundary of the display surface so as to sandwich the first imaging optical system, and
the first imaging optical system has a smaller region facing the display surface in the third direction than the second imaging optical system and the third imaging optical system.
The electronic device according to (6).
The arrangement of the first imaging optical system and the second imaging optical system has different coordinates in both the first direction and the second direction.
The electronic device according to any one of (1) to (7).
When combining data acquired from the first imaging optical system with data acquired from the second imaging optical system, the image acquisition unit acquires the imaging result based on the data having lower intensity at each position in the image data that is the combination result.
The electronic device according to any one of (1) to (8).
The first imaging optical system and the second imaging optical system each have a direction that is preferentially reflected in the output, and
the image acquisition unit acquires the imaging result using one of the outputs when a difference of a predetermined value or more occurs between the outputs for each direction.
The electronic device according to any one of (1) to (9).
The space between the first imaging optical system and the second imaging optical system is shielded from light.
The electronic device according to any one of (1) to (10).
The image acquisition unit combines the information acquired by the first imaging optical system and the second imaging optical system using a trained model.
The electronic device according to any one of (1) to (11).
The trained model is trained based on data collected from a plurality of electronic devices.
The electronic device according to (12).
The image acquisition unit applies a correction when the parallax in a plurality of images acquired by the plurality of imaging optical systems exceeds a predetermined amount.
The electronic device according to any one of (1) to (13).
At least one of the imaging optical systems is composed of a microlens array.
The electronic device according to any one of (1) to (14).
A plurality of the imaging optical systems are provided in the region where the microlens array is provided.
The electronic device according to (15).
The first imaging optical system and the second imaging optical system acquire information with the same imaging element.
The electronic device according to any one of (1) to (16).
The image acquisition unit is arranged on the same chip as the imaging element.
The electronic device according to (17).
The display unit includes a plurality of the display optical systems having different optical characteristics.
The electronic device according to any one of (1) to (18).
At least one of the plurality of imaging optical systems operates when signal correction is required in the image acquisition unit.
The electronic device according to any one of (1) to (19).
The first imaging optical system and the second imaging optical system are integrated.
The electronic device according to any one of (1) to (20).
The first imaging optical system and the second imaging optical system are provided near the boundary of the display surface.
The electronic device according to any one of (1) to (21).
The first imaging optical system and the second imaging optical system are arranged at a distance of 50 mm or more and 80 mm or less from each other, and
the image acquisition unit generates parallax image data from the information acquired by the first imaging optical system and the second imaging optical system.
The electronic device according to any one of (1) to (22).
The display unit is provided on both surfaces of the device.
The electronic device according to any one of (1) to (23).
The first imaging optical system and the second imaging optical system are any two of the plurality of imaging optical systems.
The electronic device according to any one of (1) to (24).
A fourth imaging optical system different from the first imaging optical system and the second imaging optical system is provided, and
the fourth imaging optical system, together with the first imaging optical system or the second imaging optical system, has any of the features described in (1) to (24).
An electronic device.
2: Display unit,
3: Imaging optical system, 3A: First imaging optical system, 3B: Second imaging optical system, 3C: Third imaging optical system,
4: Display panel, 4a: Substrate, 4b: Light-emitting pixel,
5: Circularly polarizing plate,
6: Touch panel,
7: Cover glass,
8: Imaging unit, 8A: First imaging unit, 8B: Second imaging unit,
9: Optical system, 9A: First optical system, 9B: Second optical system,
10: Preprocessing unit,
12: Image acquisition unit,
14: Post-processing unit,
16: Output unit,
18: Control unit,
20: Storage unit,
22: Receiving interface,
30: Light-shielding unit,
32: Microlens array
Claims (20)
- A display unit having a displayable region in which display optical systems are arranged in an array along a first direction and a second direction intersecting the first direction;
a plurality of imaging optical systems arranged, in a third direction intersecting the first direction and the second direction, on the side opposite to the display surface of the display unit so as to overlap the displayable region, the plurality of imaging optical systems including at least a first imaging optical system and a second imaging optical system having coordinates different from those of the first imaging optical system in at least one of the first direction and the second direction; and
an image acquisition unit that acquires image data based on information acquired by the first imaging optical system and the second imaging optical system.
An electronic device comprising the above.
- Light propagates from the display surface of the display unit to an imaging element of the first imaging optical system and an imaging element of the second imaging optical system via optical systems having different optical characteristics.
The electronic device according to claim 1.
- The display unit is provided with apertures through which light incident from the display surface propagates, and light incident from the display surface propagates to the imaging optical systems through the apertures.
The electronic device according to claim 1.
- The aperture that propagates light to the first imaging optical system and the aperture that propagates light to the second imaging optical system have different layouts.
The electronic device according to claim 3.
- A third imaging optical system having, with respect to the first imaging optical system, the same parallax as that between the first imaging optical system and the second imaging optical system is provided, and the image acquisition unit acquires the image data based on information obtained from the information acquired by the second imaging optical system and the third imaging optical system, and on information acquired from the first imaging optical system.
The electronic device according to claim 1.
- In the first direction or the second direction of the display surface, the first imaging optical system is provided near the center, the second imaging optical system and the third imaging optical system are provided near the boundary of the display surface so as to sandwich the first imaging optical system, and the first imaging optical system has a smaller region facing the display surface in the third direction than the second imaging optical system and the third imaging optical system.
The electronic device according to claim 5.
- When combining data acquired from the first imaging optical system with data acquired from the second imaging optical system, the image acquisition unit acquires the imaging result based on the data having lower intensity at each position in the image data that is the combination result.
The electronic device according to claim 1.
- The first imaging optical system and the second imaging optical system each have a direction that is preferentially reflected in the output, and the image acquisition unit acquires the imaging result using one of the outputs when a difference of a predetermined value or more occurs between the outputs for each direction.
The electronic device according to claim 1.
- The space between the first imaging optical system and the second imaging optical system is shielded from light.
The electronic device according to claim 1.
- The image acquisition unit combines the information acquired by the first imaging optical system and the second imaging optical system using a trained model.
The electronic device according to claim 1.
- The image acquisition unit applies a correction when the parallax in a plurality of images acquired by the plurality of imaging optical systems exceeds a predetermined amount.
The electronic device according to claim 1.
- At least one of the imaging optical systems is composed of a microlens array.
The electronic device according to claim 1.
- A plurality of the imaging optical systems are provided in the region where the microlens array is provided.
The electronic device according to claim 12.
- The first imaging optical system and the second imaging optical system acquire information with the same imaging element.
The electronic device according to claim 1.
- The display unit includes a plurality of the display optical systems having different optical characteristics.
The electronic device according to claim 1.
- At least one of the plurality of imaging optical systems operates when signal correction is required in the image acquisition unit.
The electronic device according to claim 1.
- The first imaging optical system and the second imaging optical system are integrated.
The electronic device according to claim 1.
- The first imaging optical system and the second imaging optical system are provided near the boundary of the display surface.
The electronic device according to claim 1.
- The first imaging optical system and the second imaging optical system are arranged at a distance of 50 mm or more and 80 mm or less from each other, and the image acquisition unit generates parallax image data from the information acquired by the first imaging optical system and the second imaging optical system.
The electronic device according to claim 1.
- The display unit is provided on both surfaces of the device.
The electronic device according to claim 1.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20887168.1A EP4060405A4 (en) | 2019-11-12 | 2020-10-30 | ELECTRONIC DEVICE |
KR1020227014957A KR20220091496A (ko) | 2019-11-12 | 2020-10-30 | 전자 기기 |
US17/773,968 US20220343471A1 (en) | 2019-11-12 | 2020-10-30 | Electronic apparatus |
JP2021556027A JPWO2021095581A1 (ja) | 2019-11-12 | 2020-10-30 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019-204844 | 2019-11-12 | ||
JP2019204844 | 2019-11-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021095581A1 true WO2021095581A1 (ja) | 2021-05-20 |
Family
ID=75912510
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/040954 WO2021095581A1 (ja) | 2019-11-12 | 2020-10-30 | 電子機器 |
Country Status (7)
Country | Link |
---|---|
US (1) | US20220343471A1 (ja) |
EP (1) | EP4060405A4 (ja) |
JP (1) | JPWO2021095581A1 (ja) |
KR (1) | KR20220091496A (ja) |
CN (2) | CN112866518A (ja) |
TW (1) | TW202134769A (ja) |
WO (1) | WO2021095581A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023112780A1 (ja) * | 2021-12-13 | 2023-06-22 | ソニーセミコンダクタソリューションズ株式会社 | 画像表示装置及び電子機器 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006060535A (ja) * | 2004-08-20 | 2006-03-02 | Sharp Corp | 携帯電話装置 |
WO2007013272A1 (ja) * | 2005-07-28 | 2007-02-01 | Sharp Kabushiki Kaisha | 表示装置及びバックライト装置 |
US20120050601A1 (en) * | 2010-08-26 | 2012-03-01 | Samsung Electronics Co., Ltd. | Method of controlling digital photographing apparatus and digital photographing apparatus |
JP2014138290A (ja) * | 2013-01-17 | 2014-07-28 | Sharp Corp | 撮像装置及び撮像方法 |
JP2016136246A (ja) * | 2015-01-23 | 2016-07-28 | 三星ディスプレイ株式會社Samsung Display Co.,Ltd. | 表示装置 |
US20180069060A1 (en) | 2011-10-14 | 2018-03-08 | Apple Inc. | Electronic Devices Having Displays With Openings |
US20180198980A1 (en) * | 2017-01-06 | 2018-07-12 | Intel Corporation | Integrated Image Sensor and Display Pixel |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8199185B2 (en) * | 1995-09-20 | 2012-06-12 | Videotronic Systems | Reflected camera image eye contact terminal |
US7209160B2 (en) * | 1995-09-20 | 2007-04-24 | Mcnelley Steve H | Versatile teleconferencing eye contact terminal |
US6888562B2 (en) * | 2003-03-26 | 2005-05-03 | Broadcom Corporation | Integral eye-path alignment on telephony and computer video devices using a pinhole image sensing device |
TWI558215B (zh) * | 2003-06-17 | 2016-11-11 | 半導體能源研究所股份有限公司 | 具有攝像功能之顯示裝置及雙向通訊系統 |
JP4845336B2 (ja) * | 2003-07-16 | 2011-12-28 | 株式会社半導体エネルギー研究所 | 撮像機能付き表示装置、及び双方向コミュニケーションシステム |
US20070002130A1 (en) * | 2005-06-21 | 2007-01-04 | David Hartkop | Method and apparatus for maintaining eye contact during person-to-person video telecommunication |
US8022977B2 (en) * | 2005-10-17 | 2011-09-20 | I2Ic Corporation | Camera placed behind a display with a transparent backlight |
US8390671B2 (en) * | 2006-05-25 | 2013-03-05 | I2Ic Corporation | Display with gaps for capturing images |
US7714923B2 (en) * | 2006-11-02 | 2010-05-11 | Eastman Kodak Company | Integrated display and capture apparatus |
US20090009628A1 (en) * | 2007-07-06 | 2009-01-08 | Michael Janicek | Capturing an image with a camera integrated in an electronic display |
US8154582B2 (en) * | 2007-10-19 | 2012-04-10 | Eastman Kodak Company | Display device with capture capabilities |
US8164617B2 (en) * | 2009-03-25 | 2012-04-24 | Cisco Technology, Inc. | Combining views of a plurality of cameras for a video conferencing endpoint with a display wall |
JP5684488B2 (ja) * | 2009-04-20 | 2015-03-11 | 富士フイルム株式会社 | 画像処理装置、画像処理方法およびプログラム |
US8456586B2 (en) * | 2009-06-11 | 2013-06-04 | Apple Inc. | Portable computer display structures |
JP5836768B2 (ja) * | 2011-11-17 | 2015-12-24 | キヤノン株式会社 | 撮像装置付き表示装置 |
KR101864452B1 (ko) * | 2012-01-12 | 2018-06-04 | 삼성전자주식회사 | 이미지 촬영 및 화상 통화 장치와 방법 |
JP2015012127A (ja) * | 2013-06-28 | 2015-01-19 | ソニー株式会社 | 固体撮像素子および電子機器 |
KR101462351B1 (ko) * | 2013-08-16 | 2014-11-14 | 영남대학교 산학협력단 | 시선일치 영상통화장치 |
US9767728B2 (en) * | 2015-10-30 | 2017-09-19 | Essential Products, Inc. | Light sensor beneath a dual-mode display |
US10277043B2 (en) | 2016-09-23 | 2019-04-30 | Apple Inc. | Wireless charging mats for portable electronic devices |
CN110336907A (zh) * | 2019-08-21 | 2019-10-15 | 惠州Tcl移动通信有限公司 | 终端、拍摄方法及存储介质 |
-
2020
- 2020-10-30 EP EP20887168.1A patent/EP4060405A4/en active Pending
- 2020-10-30 JP JP2021556027A patent/JPWO2021095581A1/ja active Pending
- 2020-10-30 US US17/773,968 patent/US20220343471A1/en active Pending
- 2020-10-30 KR KR1020227014957A patent/KR20220091496A/ko active Search and Examination
- 2020-10-30 WO PCT/JP2020/040954 patent/WO2021095581A1/ja unknown
- 2020-11-05 TW TW109138548A patent/TW202134769A/zh unknown
- 2020-11-11 CN CN202011252523.7A patent/CN112866518A/zh active Pending
- 2020-11-11 CN CN202022601717.5U patent/CN213718047U/zh active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006060535A (ja) * | 2004-08-20 | 2006-03-02 | Sharp Corp | 携帯電話装置 |
WO2007013272A1 (ja) * | 2005-07-28 | 2007-02-01 | Sharp Kabushiki Kaisha | 表示装置及びバックライト装置 |
US20120050601A1 (en) * | 2010-08-26 | 2012-03-01 | Samsung Electronics Co., Ltd. | Method of controlling digital photographing apparatus and digital photographing apparatus |
US20180069060A1 (en) | 2011-10-14 | 2018-03-08 | Apple Inc. | Electronic Devices Having Displays With Openings |
JP2014138290A (ja) * | 2013-01-17 | 2014-07-28 | Sharp Corp | 撮像装置及び撮像方法 |
JP2016136246A (ja) * | 2015-01-23 | 2016-07-28 | 三星ディスプレイ株式會社Samsung Display Co.,Ltd. | 表示装置 |
US20180198980A1 (en) * | 2017-01-06 | 2018-07-12 | Intel Corporation | Integrated Image Sensor and Display Pixel |
Non-Patent Citations (1)
Title |
---|
See also references of EP4060405A4 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023112780A1 (ja) * | 2021-12-13 | 2023-06-22 | ソニーセミコンダクタソリューションズ株式会社 | 画像表示装置及び電子機器 |
Also Published As
Publication number | Publication date |
---|---|
KR20220091496A (ko) | 2022-06-30 |
JPWO2021095581A1 (ja) | 2021-05-20 |
EP4060405A4 (en) | 2023-01-25 |
US20220343471A1 (en) | 2022-10-27 |
TW202134769A (zh) | 2021-09-16 |
EP4060405A1 (en) | 2022-09-21 |
CN112866518A (zh) | 2021-05-28 |
CN213718047U (zh) | 2021-07-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9360671B1 (en) | Systems and methods for image zoom | |
CN111726502B (zh) | 电子设备及显示装置 | |
US10437080B2 (en) | Eyewear, eyewear systems and associated methods for enhancing vision | |
KR20210130773A (ko) | 이미지 처리 방법 및 머리 장착형 디스플레이 디바이스 | |
CN216625895U (zh) | 电子设备 | |
KR20220063467A (ko) | 디스플레이를 포함하는 웨어러블 전자 장치 | |
WO2021095581A1 (ja) | 電子機器 | |
KR20200073211A (ko) | 전자 기기 | |
KR20230154987A (ko) | 화상관찰 장치 | |
WO2021157389A1 (ja) | 電子機器 | |
US20190215446A1 (en) | Mobile terminal | |
EP4400941A1 (en) | Display method and electronic device | |
KR101815164B1 (ko) | 이미지의 생성을 위해 다수의 서브-이미지를 캡처하기 위한 카메라 | |
JP2007108626A (ja) | 立体映像生成システム | |
Tsuchiya et al. | An optical design for avatar-user co-axial viewpoint telepresence | |
US11238830B2 (en) | Display device and display method thereof | |
TW202147822A (zh) | 電子機器 | |
WO2021157324A1 (ja) | 電子機器 | |
US12035058B2 (en) | Electronic equipment | |
WO2022244354A1 (ja) | 撮像素子及び電子機器 | |
US20240064417A1 (en) | Systems and methods for multi-context image capture | |
TWI841071B (zh) | 可薄型化的影像感應模組 | |
US20230222757A1 (en) | Systems and methods of media processing | |
US20240205380A1 (en) | Head-Mounted Electronic Device with Display Recording Capability | |
US20230179868A1 (en) | Systems and methods for determining image capture settings |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20887168 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2021556027 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 20227014957 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2020887168 Country of ref document: EP Effective date: 20220613 |