CN213718047U - Electronic device - Google Patents


Info

Publication number: CN213718047U
Application number: CN202022601717.5U
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: optical system, photographing optical, imaging, electronic device, display
Inventors: 中田征志, 金井淳一
Current assignee: Sony Semiconductor Solutions Corp
Original assignee: Sony Semiconductor Solutions Corp
Events: application filed by Sony Semiconductor Solutions Corp; application granted; publication of CN213718047U
Legal status: Active

Classifications

    • G06F1/1626 Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06T5/80 Geometric correction
    • G02B27/0172 Head mounted displays characterised by optical features
    • G03B17/12 Bodies with means for supporting objectives, supplementary lenses, filters, masks, or turrets
    • G03B30/00 Camera modules comprising integrated lens units and imaging units, specially adapted for being embedded in other devices, e.g. mobile phones or vehicles
    • G03B35/08 Stereoscopic photography by simultaneous recording
    • G06F1/1637 Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686 Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated camera
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H04M1/0264 Details of the structure or mounting of specific components for a camera module assembly
    • H04N23/45 Cameras or camera modules generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N23/53 Constructional details of electronic viewfinders, e.g. rotatable or detachable
    • H04N23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H04N5/265 Mixing
    • G02B2027/011 Head-up displays characterised by optical features comprising a device for correcting geometrical aberrations, distortion
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G02F2201/58 Arrangements comprising a monitoring photodetector
    • G03B11/00 Filters or other obturators specially adapted for photographic purposes
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • G06T2207/20081 Training; Learning
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging
    • H04M1/0266 Details of the structure or mounting of specific components for a display module assembly
    • H04M2250/52 Details of telephonic subscriber devices including functional features of a camera
    • H04N2007/145 Handheld terminals
    • H04N7/144 Camera and display on the same optical axis, e.g. optically multiplexing the camera and display for eye-to-eye contact
    • H10K59/38 Devices specially adapted for multicolour light emission comprising colour filters or colour changing media [CCM]

Abstract

An electronic apparatus is provided that suppresses degradation of the image quality of images captured by a camera while reducing the width of the bezel. The electronic device includes: a display unit having a displayable region in which display optical systems are arranged in an array along a first direction and a second direction intersecting the first direction; a plurality of imaging optical systems that are arranged on the side of the display unit opposite to the display surface so as to overlap the displayable region in a third direction intersecting the first direction and the second direction, and that include at least a first imaging optical system and a second imaging optical system whose coordinate differs from that of the first imaging optical system in at least one of the first direction and the second direction; and an image acquisition unit that acquires image data based on the information acquired by the first and second imaging optical systems.

Description

Electronic device
Technical Field
The present utility model relates to an electronic device.
Background
Recent electronic devices such as smartphones, mobile phones, and personal computers (PCs) have a camera arranged on the bezel (frame) of the display unit, so that video calls and video shooting can be performed easily. Since smartphones and mobile phones are usually carried in a pocket or a bag, their overall dimensions must be kept as compact as possible. On the other hand, when the display screen is small, the higher the display resolution, the smaller the displayed characters become and the harder they are to read. Therefore, reducing the width of the bezel around the display screen is being studied so that the display screen can be made as large as possible without increasing the overall size of the electronic device.
However, since a camera or the like is generally mounted on the bezel of the electronic apparatus, the bezel width cannot be made smaller than the outer diameter of the camera. Further, when the camera is placed in the bezel, the user's line of sight, for example during a video call, is usually concentrated near the center of the display screen, so the line of sight deviates from the optical axis of the camera and a captured image is obtained in which the line of sight does not meet the camera and therefore appears unnatural.
In order to avoid the above problems, it has been proposed to arrange a camera module on the side opposite to the display surface of the display unit and to capture, with the camera, subject light that has passed through the display unit.
[ Prior art documents ]
[ patent document ]
Patent document 1: U.S. patent publication 2018/0069060
SUMMARY OF THE UTILITY MODEL
[ Problem to be solved by the utility model ]
However, since a part of the light passing through the display unit is reflected or diffracted before entering the camera, the image quality of the captured image is degraded by glare caused by this reflection or by the diffraction.
An aspect of the present utility model provides an electronic device capable of suppressing the degradation of the image quality of images captured by the camera while reducing the bezel width.
[ Means for solving the problems ]
According to one embodiment, an electronic device includes: a display unit having a displayable region in which display optical systems are arranged in an array along a first direction and a second direction intersecting the first direction; a plurality of imaging optical systems arranged on the side of the display unit opposite to the display surface so as to overlap the displayable region in a third direction intersecting the first direction and the second direction, the plurality of imaging optical systems including at least a first photographing optical system and a second photographing optical system having a coordinate different from that of the first photographing optical system in at least one of the first direction and the second direction; and an image acquisition section that acquires image data based on the information acquired by the first and second photographing optical systems. By shifting the positions of at least two imaging optical systems in this way, the position of the glare appearing in the captured image can be changed. For example, the first photographing optical system and the second photographing optical system are any two optical systems selected from the plurality of photographing optical systems. That is, the features described below apply to at least two of the plurality of imaging optical systems, and one feature and another feature may belong to first and second optical systems chosen as different combinations. In this way, at least two of the plurality of imaging optical systems may have the features described below.
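To make the claimed structure easier to follow, the sketch below is a rough illustration only; the class and function names, and the per-pixel-minimum synthesis used as a placeholder, are assumptions rather than part of the utility model text. It models two under-display cameras at different in-plane coordinates plus an image acquisition step that combines their outputs.

import numpy as np
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class PhotographingOpticalSystem:
    """One under-display camera: its position in the display plane and a capture function."""
    x: float                           # coordinate along the first direction (mm)
    y: float                           # coordinate along the second direction (mm)
    capture: Callable[[], np.ndarray]  # returns raw image data from the imaging unit

@dataclass
class ElectronicDevice:
    """Display with several imaging optical systems behind the displayable region."""
    cameras: List[PhotographingOpticalSystem]

    def acquire_image(self) -> np.ndarray:
        """Image acquisition section: combine the data of at least two cameras whose
        coordinates differ, so glare falls at different positions in each capture
        and can be suppressed during synthesis (here: a simple per-pixel minimum)."""
        frames = [cam.capture().astype(np.float32) for cam in self.cameras]
        return np.minimum.reduce(frames)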
In the electronic apparatus, light may also be transmitted from the display surface of the display portion to the first photographing optical system and the second photographing optical system through optical systems having different optical characteristics. Further, the plurality of photographing optical systems may all have different optical characteristics, or a part thereof may have the same optical characteristics. In this way, the at least two photographing optical systems may have different optical characteristics in the optical path of the light propagating from the display surface. By having different optical characteristics, the characteristics of the generated glare may be changed.
The electronic device may be provided with an opening in the display portion for transmitting light incident from the display surface, and the light incident from the display surface may be transmitted to the imaging optical system through the opening. For example, the optical characteristics described above may be determined by the opening.
The opening that propagates light to the first photographing optical system and the opening that propagates light to the second photographing optical system may have different layouts. Different optical characteristics can be produced by these layouts.
The opening that causes light to travel to the first photographing optical system and the opening that causes light to travel to the second photographing optical system may form diffraction images in different directions. For example, the opening that propagates light to the first photographing optical system is made larger in the first direction than in the second direction, while the opening that propagates light to the second photographing optical system is made larger in the second direction than in the first direction. In this way, the diffraction images are formed in different directions by the different openings, so that the directions in which glare is generated can be made to differ between the two photographing optical systems.
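For reference, a standard single-slit diffraction estimate (general optics, not taken from the utility model text) makes this direction dependence concrete. For an opening of width a illuminated by light of wavelength λ, the first diffraction minimum lies at an angle θ given by

\[ \sin\theta \approx \frac{\lambda}{a} \]

so the spread is largest along the narrow dimension of the opening; two openings elongated along orthogonal directions therefore throw their diffraction streaks, and the associated glare, along different directions.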
The electronic apparatus may include a third photographing optical system that has, with respect to the first photographing optical system, the same parallax as the second photographing optical system but in the opposite direction, and the image acquisition unit may acquire the image data based on information acquired from the second photographing optical system and the third photographing optical system and information acquired from the first photographing optical system. That is, for a combination of two imaging optical systems, a further imaging optical system having the same parallax in the opposite direction is arranged around one of the two imaging optical systems.
The first imaging optical system is provided near the center in the first direction or the second direction of the display surface, and the second imaging optical system and the third imaging optical system are provided near the boundary of the display surface with the first imaging optical system interposed therebetween. That is, the second and third photographing optical systems are provided at both ends of the display surface of the electronic apparatus 1, and the first photographing optical system, which is less conspicuous than the two photographing optical systems, is provided near the center.
The first photographing optical system and the second photographing optical system may have different coordinates in the first direction and/or the second direction. That is, the two photographing optical systems may have different coordinates only in the first direction, only in the second direction, or in both the first direction and the second direction. In other words, on the display surface, the two photographing optical systems may be aligned in the horizontal direction, in the vertical direction, or in an arbitrary direction.
When synthesizing the data acquired from the first photographing optical system and the data acquired from the second photographing optical system, the image acquisition unit may acquire the imaging result by adopting, in the synthesized image data, the data having the lower intensity. Glare usually has higher brightness and light intensity than the photographic subject, so when acquiring image data the image acquisition section can acquire the image based on the lower-intensity signal among the signals acquired from the two photographing optical systems.
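As a minimal sketch of this lower-intensity selection (an illustration under assumptions, not the claimed implementation; the function name and the prior alignment of the two frames are assumed), a per-pixel choice of the darker value over the two registered captures suppresses glare that appears at different positions in each capture.

import numpy as np

def synthesize_low_intensity(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Combine two registered captures by keeping, per pixel, the lower-luminance value.

    Glare is usually brighter than the subject and falls at different positions
    in the two captures, so taking the darker value removes most of it.
    Assumes frame_a and frame_b have the same shape and are already aligned.
    """
    a = frame_a.astype(np.float32)
    b = frame_b.astype(np.float32)
    if a.ndim == 3:  # color images: compare by luminance, then keep whole pixels
        weights = np.array([0.299, 0.587, 0.114], dtype=np.float32)
        mask = (a @ weights <= b @ weights)[..., None]
        return np.where(mask, a, b)
    return np.minimum(a, b)  # grayscale: element-wise minimum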
The first and second imaging optical systems may each have a direction in which reflection or diffraction preferentially occurs, and when the outputs in the respective directions differ by a predetermined value or more, the image acquisition unit can acquire the imaging result using either output. For example, when the first photographing optical system generates glare in the first direction and the second photographing optical system generates glare in the second direction, the image acquisition section may acquire the image data based on the output of the photographing optical system in which the glare is smaller.
Light shielding may be performed between the first photographing optical system and the second photographing optical system. Thus, the first and second imaging optical systems are shielded from light, and the mutual influence of the optical systems can be suppressed.
The image acquisition unit may synthesize information acquired by the first and second photographing optical systems using the trained model. In this way, for example, data output from a plurality of photographing optical systems can be synthesized using a model generated by machine learning.
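As an illustration of what such a trained model could look like (a sketch only; the network architecture, the weights file name, and the training setup are assumptions and not part of the utility model), a small convolutional network can take the two captures stacked along the channel axis and predict a single glare-suppressed image.

import torch
import torch.nn as nn

class FusionNet(nn.Module):
    """Toy learned-fusion model: two RGB captures in, one fused RGB image out."""
    def __init__(self) -> None:
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(6, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1),
        )

    def forward(self, frame_a: torch.Tensor, frame_b: torch.Tensor) -> torch.Tensor:
        # frame_a, frame_b: (N, 3, H, W) tensors from the two photographing optical systems
        return self.body(torch.cat([frame_a, frame_b], dim=1))

# Inference with weights trained offline (e.g. on data collected from many devices
# of the same model); "fusion_weights.pt" is a hypothetical file name:
# model = FusionNet()
# model.load_state_dict(torch.load("fusion_weights.pt"))
# model.eval()
# fused = model(frame_a, frame_b)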
The trained model may be trained based on data collected from a plurality of electronic devices. For example, the trained model may be generated from data of electronic apparatuses of the same device model as the electronic apparatus 1. Further, even for the same device model, the trained model may be switched and trained depending on the shooting mode or the like.
The image acquisition unit may perform correction when the parallax among the plurality of images acquired by the plurality of photographing optical systems exceeds a predetermined amount. For example, a region where glare is generated can be detected from the parallax, and the image data can be acquired by performing image processing on that region.
At least one of the photographing optical systems may be constituted by a microlens array. In this way, a microlens array may be provided as the lens system of the photographing optical system instead of a single lens.
A plurality of photographing optical systems may be arranged in a region where the microlens array is disposed. Thus, a plurality of photographing optical systems can be formed using one microlens array.
The first photographing optical system and the second photographing optical system can acquire information through the same imaging element. In this way, a plurality of photographing optical systems can be formed on one imaging element, such as a single-chip imaging element. In combination with the above, by providing a microlens array on one chip, a plurality of photographing optical systems can be provided, one for each region.
The image pickup section may be disposed on the same chip as the imaging element. For example, one chip may have an imaging element and a logic circuit; the data acquired by the imaging element is AD-converted, and the converted data is processed by the logic circuit. Instead of a single chip, a plurality of stacked chips may be used.
The display portion may include a plurality of display optical systems having different optical characteristics. For example, OLEDs, micro-LEDs, liquid crystal elements, and the like may be mixed as the display optical systems; in this case, the glare caused by reflection, refraction, or diffraction in the display portion can have different characteristics depending on the imaging optical system.
At least one of the plurality of photographing optical systems operates when the image acquisition section needs a correction signal. For example, the electronic device includes an imaging element that performs correction when detecting a strong light source, and whether or not correction is performed by the imaging optical system may be switched based on the surrounding environment.
The first photographing optical system and the second photographing optical system may be integrally formed. For example, in the two photographing optical systems, the light receiving element and the optical path to the light receiving element may be adjacent to each other.
The first photographing optical system and the second photographing optical system may be disposed near a boundary of the display surface. The boundary refers to, for example, an end portion of the display surface, at which the plurality of photographing optical systems may be disposed.
The first imaging optical system and the second imaging optical system may be arranged at a distance of 50 mm or more and 80 mm or less, and the image acquisition unit may generate parallax image data from the information acquired by the first imaging optical system and the second imaging optical system. In this way, the two photographing optical systems can be arranged at a distance comparable to the distance between the two human eyes.
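For reference, standard stereo geometry (not taken from the utility model text) relates such parallax image data to distance: with a baseline B of 50 mm to 80 mm between the two systems, a focal length f, and a disparity d measured between corresponding points in the two images, the distance Z to the subject is approximately

\[ Z \approx \frac{f\,B}{d} \]

so a pair spaced at roughly human interocular distance can be used for depth estimation or stereoscopic imaging in addition to glare suppression.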
The display portions may be provided on both sides of the electronic apparatus. In this case, each of the plurality of display units may be provided with a plurality of imaging optical systems; one display unit may be provided with a plurality of imaging optical systems while the other is provided with none; or one display unit may be provided with one imaging optical system and the other with a plurality of imaging optical systems.
A fourth photographing optical system different from the first photographing optical system and the second photographing optical system may be provided, and the fourth photographing optical system and the first photographing optical system or the second photographing optical system may have any of the features described above. As described above, the first and second photographing optical systems are merely two systems extracted from the plurality of photographing optical systems, and do not indicate a specific photographing optical system.
Drawings
Fig. 1 is a schematic cross-sectional view of an electronic device according to an embodiment.
Fig. 2 is a schematic external view of an electronic device according to an embodiment.
Fig. 3A is a schematic diagram showing an example of an imaging optical system according to an embodiment.
Fig. 3B is a schematic diagram showing an example of the photographing optical system according to the embodiment.
Fig. 4 is a block diagram of an imaging operation of the electronic device according to the embodiment.
Fig. 5 is a diagram showing the imaging optical system according to the embodiment from the display surface side.
Fig. 6A is a diagram showing an example of an image captured by the capturing optical system of fig. 5.
Fig. 6B is a diagram showing an example of an image captured by the capturing optical system of fig. 5.
Fig. 7 is a diagram showing an example of an image acquired by the image acquisition unit according to the embodiment.
Fig. 8 is a diagram showing the imaging optical system according to the embodiment from the display surface side.
Fig. 9 is a diagram showing an example of an image captured by the capturing optical system of fig. 8.
Fig. 10 is a schematic external view of an electronic device according to an embodiment.
Fig. 11 is a diagram showing an example of an image captured by the capturing optical system of fig. 10.
Fig. 12 is a diagram showing an example of combining images acquired by the photographing optical systems at both ends of fig. 11.
Fig. 13 is a diagram showing an example of an image captured by the imaging optical system according to the embodiment.
Fig. 14 is a schematic cross-sectional view of an electronic device according to an embodiment.
Fig. 15 is a schematic cross-sectional view of an electronic device according to an embodiment.
Fig. 16 is a schematic cross-sectional view of an electronic device according to an embodiment.
Fig. 17 is a schematic diagram showing an example of an imaging optical system according to an embodiment.
Fig. 18 is a schematic diagram showing an example of an imaging optical system according to an embodiment.
Fig. 19 is a schematic diagram showing an example of an imaging optical system according to an embodiment.
Fig. 20 is a schematic diagram showing an example of an imaging optical system according to an embodiment.
Fig. 21 is a schematic diagram showing an example of an imaging optical system according to an embodiment.
Fig. 22 is a schematic diagram showing an example of an imaging optical system according to an embodiment.
Fig. 23A is a schematic diagram showing an example of an imaging optical system according to an embodiment.
Fig. 23B is a schematic diagram showing an example of the photographing optical system according to the embodiment.
Fig. 24 is a schematic diagram showing an example of an embodiment of an imaging unit according to an embodiment.
Fig. 25 is a schematic diagram showing an example of an embodiment of an imaging unit according to an embodiment.
Fig. 26 is a schematic diagram showing an example of an embodiment of an imaging unit according to an embodiment.
Fig. 27 is a diagram showing an example of the arrangement of the imaging optical system according to the embodiment.
Fig. 28 is a diagram showing an example of the arrangement of the imaging optical system according to the embodiment.
Fig. 29 is a diagram showing an example of the arrangement of the imaging optical system according to the embodiment.
Fig. 30 is a diagram showing an example of the arrangement of the imaging optical system according to the embodiment.
Fig. 31A is a diagram showing an example of the arrangement of the imaging optical system according to the embodiment.
Fig. 31B is a diagram showing an example of the arrangement of the imaging optical system according to the embodiment.
Fig. 32 is a diagram showing an example of the layout of the openings according to the embodiment.
Fig. 33 is a diagram showing an example of the layout of the openings according to the embodiment.
Fig. 34 is a diagram showing an example of the layout of the openings according to the embodiment.
Fig. 35 is a diagram showing an example of the layout of the openings according to the embodiment.
Fig. 36 is a plan view of the electronic apparatus according to the embodiment applied to a capsule endoscope.
Fig. 37 is a rear view when the electronic apparatus of an embodiment is applied to a digital single lens reflex camera.
Fig. 38A is a diagram showing an example in which the electronic apparatus according to one embodiment is applied to an HMD.
Fig. 38B is a diagram showing a conventional HMD.
Description of reference numerals:
1, electronic device; 1a, display screen; 1b, frame (bezel); 2, display unit; 3, photographing optical system; 3A, first photographing optical system; 3B, second photographing optical system; 3C, third photographing optical system; 4, display panel; 4a, substrate; 4b, light-emitting pixel; 5, circularly polarizing plate; 6, touch panel; 7, cover glass; 8, imaging unit; 8A, first imaging unit; 8B, second imaging unit; 9, optical system; 9A, first optical system; 9B, second optical system; 10, preprocessing unit; 12, image acquisition unit; 14, post-processing unit; 16, output unit; 18, control unit; 20, storage unit; 22, receiving interface; 30, light shielding portion; 32, microlens array.
Detailed Description
Embodiments of an electronic device are described below with reference to the drawings. The following description focuses on the main components of the electronic device, but the electronic device may have components and functions that are not shown or described, and the description does not exclude them. For the purpose of explanation, sizes, shapes, aspect ratios, and the like may be changed from those of an actual device, which may have any appropriate size, shape, aspect ratio, and the like when implemented. In the following description, the acquired signal is referred to as image information or photographing information; these terms are used in a broad sense that includes still images, moving images, single frames of a video, and the like.
(first embodiment)
Fig. 1 is a schematic cross-sectional view of an electronic apparatus 1 of the first embodiment. The electronic device 1 in fig. 1 is any electronic device having both a display function and an imaging function, such as a smartphone, a mobile phone, a tablet, or a PC. As shown at the lower left of the figure, the first direction points to the right of the figure, the second direction is perpendicular to the plane of the figure, and the third direction points downward in the figure. That is, the second direction intersects the first direction, and the third direction intersects the first direction and the second direction. Here, "intersecting" includes intersecting at an angle of 90°, but the angle need not be exactly 90°. In addition, as can be seen from the drawings, the first direction and the second direction are distinguished for convenience and may be interchanged.
The electronic device 1 shown in fig. 1 includes an imaging optical system 3 (for example, a camera module) disposed on the side of the display unit 2 opposite to the display surface. In other words, the electronic apparatus 1 is provided with the imaging optical system 3 on the inner surface side (the side opposite to the display surface) of the display unit 2. The imaging optical system 3 therefore performs imaging through the display unit 2.
As shown in fig. 1, the display unit 2 is a structure in which a display panel 4, a circularly polarizing plate 5, a touch panel 6, and a cover glass 7 are laminated in this order. The laminate of fig. 1 is an example, and an adhesive layer or a bonding layer may be provided between the display panel 4, the circularly polarizing plate 5, the touch panel 6, and the cover glass 7 as needed. The order of the circularly polarizing plate 5 and the touch panel 6 may be changed as appropriate according to the design.
The imaging optical system 3 is provided on the opposite side of the display surface of the display unit 2. The photographing optical system 3 includes, for example, a photoelectric element (light receiving element) that receives light and photoelectrically converts it into an analog signal, and an optical system that propagates the light irradiated to the display surface to the photoelectric element. The optical system may be, for example, an opening provided in the display panel 4. One display unit 2 of the electronic device 1 includes a plurality of the imaging optical systems 3, for example, two as shown in the figure. Light irradiated onto the display surface is diffracted at the opening and propagated toward the light receiving element as shown by the arrow in the figure. Instead of providing the opening, an optical system having some optical characteristics, such as adjusting the optical path length or changing the polarization state, may be provided. The imaging optical system 3 includes, for example, an imaging unit 8 and an optical system 9, and the optical system 9 condenses, diffuses, or the like light entering the imaging unit 8 from the display surface.
As shown in the figure, the plurality of imaging optical systems 3 have different coordinates in the second direction, for example, but are not limited thereto. For example, the first direction may have different coordinates, and the first direction and the second direction may have different coordinates.
Although not shown in detail, the display panel 4 may include, for example, an OLED (Organic Light Emitting Device), a liquid crystal element such as a TFT liquid crystal, or a micro-LED as the optical system for display (display optical system). The display optical system may also include light-emitting elements based on other display principles. The light-emitting elements serving as the display optical system may be arranged along the first direction and the second direction in a stripe pattern or a mosaic pattern, or may be arranged in an oblique direction or at partial pixel intervals. In the display optical system, the light-emitting elements may also be combined with laminated filters to change the display color. When OLEDs or the like are provided, the display panel 4 may be composed of a plurality of layers such as an anode layer and a cathode layer. These layers may be formed of a material having a high transmittance.
The display panel 4 may also be provided with a member having a low transmittance, such as a color filter layer. When the display panel 4 includes an OLED, it may include, for example, a substrate 4a and an OLED portion. The substrate 4a may be formed of polyimide or the like. When the substrate 4a is made of a material having a low light transmittance, such as polyimide, an opening may be formed corresponding to the portion where the imaging optical system 3 is arranged. When the subject light passing through the opening is made incident on the imaging optical system 3, the image quality of the image captured by the imaging optical system 3 can be improved. Further, instead of forming an opening, a light propagation path formed of a substance having a high transmittance may be provided. In this case as well, the light incident from the display surface of the display unit 2 is received by the photographing optical system 3 and converted into a signal.
The circularly polarizing plate 5 is provided, for example, to reduce glare or to improve the visibility of the display screen 1a even in a bright environment. The touch panel 6 has a built-in touch sensor. The touch sensor may be of various types, such as a capacitance type or a resistive film type, and any type may be employed. Further, the touch panel 6 and the display panel 4 may be integrally formed. The cover glass 7 is provided to protect the display panel 4 and the like. As described above, an adhesive layer such as OCA (Optically Clear Adhesive) or a bonding layer may be provided at an appropriate position. In addition, the order of the circularly polarizing plate 5 and the touch panel 6 in the third direction may be interchanged according to the design.
Fig. 2 shows a schematic external view and a cross-sectional view of the electronic apparatus 1 shown in fig. 1. The sectional view illustrates a section of the display portion including the display section 2 at the dot-and-dash line shown in the figure. Circuits and the like other than the housing and the display portion of the electronic apparatus 1 are omitted.
In the external view, the display screen 1a extends to a size close to the outer dimensions of the electronic apparatus 1, and the width of the frame 1b around the display screen 1a is several millimeters or less. In general, a front camera is often arranged on the frame 1b. In the present embodiment, for example, as shown by the broken lines in the external view, the front cameras, implemented as the plurality of photographing optical systems 3, are located at approximately the center of the display screen 1a in the second direction. By arranging the front cameras as the imaging optical systems 3 on the side opposite to the display surface of the display unit 2 in this way, it is not necessary to arrange a front camera on the frame 1b, and the width of the frame 1b can be narrowed.
The external view of fig. 2 shows one example, and the photographing optical systems 3, i.e., the front cameras, may be arranged at any position in the first direction or the second direction of the display screen 1a, on the side (back side) opposite to the display surface of the display unit 2. For example, they may be arranged at a peripheral portion (end portion, boundary portion) of the display screen 1a. As shown in the external view of fig. 2, the plurality of photographing optical systems 3 have, for example, different coordinates in the first direction. Wherever they are arranged, the imaging optical systems 3 have coordinates that differ in at least one of the first direction and the second direction. Further, although two photographing optical systems 3 are drawn, the present utility model is not limited thereto, and more photographing optical systems may be arranged on the side opposite to the display surface.
For example, as shown in a cross-sectional view, the imaging optical system 3 is disposed on the inner surface side of the display unit 2 opposite to the display surface side. In addition, the sectional view is a simplified diagram. For example, similarly to the above, an adhesive layer or the like is provided in the structure of the cross-sectional view of fig. 2, but the description thereof is omitted for simplicity.
Fig. 3A is a diagram showing an example of the photographing optical system 3. The imaging optical system 3 includes, for example, an imaging unit 8 and an optical system 9. The optical system 9 is disposed on the light incident surface side of the imaging unit 8, i.e., on the side closer to the display unit 2. The light passes through the display surface of the display unit 2 and propagates to the imaging unit 8 through the optical system 9.
The imaging unit 8 includes a light receiving element such as a photodiode or another photoelectric element. The light is condensed or diffused and propagated by the optical system 9, and is received by an imaging pixel array included in the imaging unit 8, which outputs an analog signal. The imaging pixel array may be provided with a color filter, such as a Bayer-array color filter or a laminated color filter, on the incident surface side of each imaging element. Further, an optical filter for acquiring a color image may be provided. Further, although not shown, other elements, circuits, and the like necessary for receiving light and outputting an analog signal are provided. For example, a CMOS (Complementary Metal-Oxide-Semiconductor) sensor or a CCD (Charge Coupled Device) sensor may be used for the photoelectric conversion. Further, the above-described filter, a polarizing element, and the like may be provided.
The optical system 9 may include a lens, for example. The concept of the optical system 9 also includes the above-described opening provided in the display panel 4. For example, when the display panel 4 is provided with an opening as part of the optical system 9, a lens is arranged at a position closer to the imaging unit 8 than the opening in the third direction. The opening is provided, for example, in the substrate 4a having a low transmittance, and a lens that propagates the light transmitted through the opening to the imaging unit 8 is provided. For example, optical characteristics such as the numerical aperture (NA) and the F-number of each imaging optical system 3 are defined by this lens and opening. The optical systems 9 may also differ from one another in other optical characteristics, such as the Abbe number. The lens is shown as a single lens, but the lens system is not limited thereto and may include a plurality of lenses of various types.
The opening and the lens are examples, and the configuration of the optical system 9 is not limited to this combination. In the figure, one lens is provided for one opening, but the present invention is not limited to this. For example, as shown in fig. 3B, a plurality of openings may be provided for one lens in the optical system 9. In a region where no opening is provided, for example, light emitting elements of the display panel 4 are provided, and an opening may be formed between these light emitting elements. By configuring in this way, the photographing optical system 3 can be configured without impairing the display.
The plurality of imaging optical systems 3 may have different optical characteristics depending on the shape of the opening, the performance of the lens, and the like. When there are three or more imaging optical systems 3, the corresponding optical systems 9 may each have different optical characteristics. As another example, the photographing optical systems 3 may be divided into a plurality of groups, each group having different optical characteristics. For example, the optical system 9 may differ in opening shape, orientation, lens material, or the like, so that two photographing optical systems 3 have common optical characteristics while one photographing optical system 3 has different optical characteristics. In the following, the term "opening layout" is used to express the shape, orientation, and the like of the opening.
As shown by the arrows in fig. 3A, light enters from the display surface side of the display unit 2, is refracted by the optical system 9, and is received by the imaging unit 8. In a portion where the optical system 9 is not provided, reflection and the like can be suppressed as appropriate in the case of a normal display, and display of the display section 2 can be adjusted so as to be easy to view. For example, an opening is provided between the pixels of the display panel 4, a lens is provided on the opposite side of the opening from the display surface in the third direction, and light incident from the display surface is projected onto the imaging unit 8. Further, openings may be provided between the consecutive light emitting pixels, respectively. In other words, the light-emitting pixels may be provided between the openings.
Here, one example of the photographing function of the electronic apparatus 1 is explained.
Fig. 4 shows an example of a block diagram showing a configuration related to a shooting operation of the electronic apparatus 1 according to the present embodiment. The electronic apparatus 1 includes: a display unit 2, a plurality of imaging optical systems 3, a preprocessing unit 10, an image acquisition unit 12, a post-processing unit 14, an output unit 16, a control unit 18, and a storage unit 20.
As in the above-described drawings, a plurality of imaging optical systems 3 are provided on the side opposite to the display surface of one display unit 2. The imaging optical systems 3 include an imaging unit 8 and an optical system 9, respectively.
The preprocessing unit 10 is a circuit that processes the analog signal output from the imaging unit 8. The preprocessing unit 10 includes, for example, an ADC (Analog-to-Digital Converter) and converts the input analog signal into digital image data.
The image acquisition unit 12 acquires a captured image from the digital image data converted by the preprocessing unit 10, based on the digital image data acquired from the plurality of photographing optical systems 3. More specifically, the image acquisition section 12 acquires and outputs, for example, an imaging result in which the glare generated in each imaging optical system 3 is suppressed by using the image data acquired by the plurality of imaging optical systems 3.
The post-processing unit 14 performs appropriate processing on the imaging result output from the image acquisition unit 12 and outputs the result. The appropriate processing may refer to, for example, image processing or signal processing such as pixel defect correction, edge enhancement, noise removal, brightness adjustment, color correction, white balance adjustment, distortion correction, auto-focus processing, and the like. Further, the appropriate process may be a process specified by the user.
The output unit 16 outputs information to the outside of the electronic apparatus 1. The output unit 16 includes, for example, an output interface. The output interface is, for example, an interface for outputting digital signals, such as USB (Universal Serial Bus), or a user interface such as a display. The output interface provided in the output unit 16 may also serve as an input interface.
The control unit 18 controls the processing in the electronic apparatus 1. The control unit 18 may include, for example, a CPU (Central Processing Unit), and may control the processing of the preprocessing unit 10, the image acquisition unit 12, the post-processing unit 14, and the output unit 16. Further, the control unit 18 may control shooting by the photographing optical systems 3 based on a shooting timing instructed via the user interface.
The storage unit 20 stores data in the electronic device 1. The storage unit 20 is, for example, a memory such as a DRAM (Dynamic Random Access Memory) or a storage device such as an SSD (Solid State Drive). The storage unit 20 may be a built-in memory or a removable memory such as a memory card. The storage unit 20 does not necessarily have to be provided inside the electronic device 1, and may be a storage device or the like connected externally through an input/output interface. In either case, the storage unit 20 inputs and outputs information within the electronic device 1 at necessary timings as appropriate.
A part or all of the above-described components may be formed on the same substrate. For example, the imaging optical systems 3, the preprocessing unit 10, the image acquisition unit 12, the post-processing unit 14, the output unit 16, the control unit 18, and the storage unit 20 may be formed on one chip, or some of them may be formed on another chip as appropriate. In addition, a partial structure formed on one substrate and a partial structure formed on another substrate may be laminated in the manufacturing process by a technique such as CoC (Chip on Chip), CoW (Chip on Wafer), or WoW (Wafer on Wafer).
Next, the operations of the imaging optical system 3 and the image acquisition unit 12 will be described in detail.
Fig. 5 is a view showing the imaging optical systems 3 from the display surface side of the display unit 2, showing, for example, a partial region of the display unit 2 viewed from the display surface side. When the display panel 4 is viewed from the display surface, light-emitting pixels 4b formed of a plurality of light-emitting elements are arranged in an array in the first direction and the second direction on the display panel 4, as indicated by the dotted lines. The arrangement and orientation of the pixels 4b are merely examples; for example, the pixels 4b may be arranged rotated by 45° with respect to the arrangement of fig. 5. The light-emitting pixel 4b is drawn as a square, but is not limited thereto; it may be a rectangle extending in any direction, or may have a non-rectangular shape.
A plurality of photographing optical systems 3, each having an opening formed between the light-emitting pixels 4b of the display panel 4, may be provided. For example, fig. 5 shows a first photographing optical system 3A and a second photographing optical system 3B. As an example, as shown in the drawing, an elliptical opening having its major axis along the second direction is formed as the first optical system 9A, and an elliptical opening having its major axis along the first direction is formed as the second optical system 9B. A plurality of openings may be provided as the optical system in each photographing optical system. In fig. 5, for example, the first imaging section 8A is arranged below the openings of the first optical system 9A, and the second imaging section 8B is arranged below the openings of the second optical system 9B. Without being limited thereto, the imaging unit 8 may be provided at a position offset from the openings.
Further, although not shown, a lens may be provided between each opening and the imaging unit 8 or in the opening as a part of the optical system 9 so that light passes through the opening and is appropriately diffused and focused on the imaging area of the imaging unit 8. Instead of the lens, another optical system may be provided so that the imaging area of the imaging section 8 can appropriately receive light. Here, the opening means an optically transmissive region, and may be an air gap or may be filled with a transparent material such as resin. The material filling the opening is not limited, and may be a material that transmits a specific wavelength through a color filter or the like. For example, filling the openings with materials having different transmittances, refractive indices, and the like can make the first optical system 9A and the second optical system 9B have different optical characteristics.
The imaging optical systems 3 have different aperture layouts and are arranged at different positions, as with, for example, the first imaging optical system 3A and the second imaging optical system 3B shown in fig. 5. The optical characteristics of the optical systems differ depending on the aperture layout, and light enters each imaging unit 8 from the display surface of the display unit 2 based on these different optical characteristics. That is, light enters the first imaging unit 8A and the second imaging unit 8B based on different optical characteristics.
For example, as shown in fig. 5, the openings of the respective imaging optical systems may be identical ellipses oriented in different directions. By giving the same opening different orientations in this way, it is possible to form optical systems 9 that generate diffraction images in different directions.
Fig. 5 shows a state in which two openings are provided, along the first direction and the second direction respectively, between three consecutive pixels, but the present invention is not limited to this. For example, each opening may be provided so as to span a larger number of consecutive light-emitting pixels. As described above, a single optical system 9 does not necessarily have to be a single continuous region of the display unit 2 including the display panel 4 when viewed from the display surface; it may be configured as a plurality of individual regions including openings periodically arranged between pixels.
In fig. 5, the long axis of one opening corresponds to approximately two or three display pixels, but the present invention is not limited thereto. The long axis of the opening may be longer or shorter.
In addition, the apertures need not be identical. In order to generate diffraction images having different characteristics, the major axis of the aperture forming part of the second optical system 9B may be longer or shorter than the major axis of the aperture forming part of the first optical system 9A. By making the aperture layouts different in this way, the influence of glare can be made to differ further between the signals output from the first and second imaging optical systems 3A and 3B. As a result, the glare suppression operation by the image acquisition unit 12 can be assisted. The influence of the opening layout on glare varies depending on the state of the electronic device 1 or of the components in the electronic device 1, and can therefore be defined appropriately in the design of the electronic device 1.
Fig. 6A and 6B show images in which coordinates are adjusted by conversion from analog signals acquired by the respective imaging optical systems 3 shown in fig. 5. Fig. 6A is an image acquired based on the first photographing optical system 3A, and fig. 6B is an image acquired based on the second photographing optical system 3B.
The image acquisition section 12 may perform position adjustment. The shooting position can be adjusted, for example, so that an image obtained by displaying on the display panel 4 a mirror image of the image acquired by each photographing optical system 3 coincides with the reflection seen when the display surface is observed from the front. For example, when a person appears at the center of the display surface, the position is adjusted so that the image acquired from the photographing optical system 3 is displayed at the center of the display surface. Alternatively, the correction may be performed based on the positional deviation of the photographing optical system 3 from the center of the display surface. The position adjustment is not limited to these methods and may be controlled as appropriate; any method may be used.
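As a rough illustration only, and not part of the disclosure, the following Python sketch shows one way such a position adjustment could be implemented, assuming each imaging optical system's offset from the display center is known in image pixels; the function name and offset values are hypothetical.

```python
import numpy as np
from scipy.ndimage import shift  # simple (sub-pixel) translation


def align_to_display_center(image: np.ndarray, offset_px: tuple) -> np.ndarray:
    """Translate a grayscale image so that a subject at the display center appears centered.

    offset_px: (dy, dx) displacement of this imaging optical system 3 from the
    center of the display surface, expressed in image pixels (hypothetical,
    device-dependent values).
    """
    dy, dx = offset_px
    # Moving the optics by (+dy, +dx) shifts the subject by (-dy, -dx) in the
    # image, so translate back by (+dy, +dx). Border pixels are filled with zeros.
    return shift(image, shift=(dy, dx), order=1, mode="constant", cval=0.0)


# Example: an image from a camera placed 120 px to the right of the display center.
# img_b_aligned = align_to_display_center(img_b, offset_px=(0.0, 120.0))
```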
For example, when the position is adjusted in the above manner, glare occurs as shown by the white areas in fig. 6A and 6B. The glare appears, as shown in the figures, as regions brighter than the scene actually being photographed.
In this case, for the signals output from the first and second imaging optical systems 3A and 3B, the image acquiring unit 12 selects, after conversion into digital signals, the pixel value having the lower light intensity or luminance value; since the glare is generated at different portions in the two images, this selection suppresses it, and the image is corrected accordingly. When processing a signal before conversion into a digital signal, the preprocessing section 10 may instead perform the selection based on the signal intensity.
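A minimal sketch of this per-pixel selection, assuming the two images have already been position-adjusted to the same frame; the function names are illustrative, not part of the disclosure.

```python
import numpy as np


def suppress_glare_by_min(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Combine two position-adjusted grayscale images by keeping, per pixel,
    the lower value. Because the glare appears at different locations in the
    two imaging optical systems, the brighter (glare-affected) value at each
    pixel can be discarded."""
    return np.minimum(img_a, img_b)


def suppress_glare_by_luma(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Variant for color images: drive the selection by a crude luminance proxy."""
    luma_a = img_a.mean(axis=-1)
    luma_b = img_b.mean(axis=-1)
    take_a = (luma_a <= luma_b)[..., None]  # broadcast over the channel axis
    return np.where(take_a, img_a, img_b)
```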
Fig. 7 shows an image output from the electronic device 1 by the processing of the image acquisition unit 12. After the processing by the image acquisition unit 12, the post-processing unit 14 adjusts, for example, brightness. By outputting the image thus processed, as shown in fig. 7, glare can be suppressed, and an image adjusted to natural brightness can be obtained.
For example, the regions where glare occurs, as shown in fig. 6A and 6B, may depend on the respective imaging optical systems 3. In such a case, the glare generating regions may be stored in advance in the storage unit 20 or in the image acquiring unit 12. As a result, the pixel-value selection or synthesis processing can be performed quickly. For example, for a given pixel it is not necessary to compare the pixel values output from the respective imaging optical systems 3; instead, the image can be acquired based on the output of a predetermined imaging optical system 3.
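A brief sketch of this mask-based shortcut, under the assumption that a boolean glare map for one system has been stored beforehand; names and data layout are hypothetical.

```python
import numpy as np


def combine_with_known_glare_region(img_a: np.ndarray, img_b: np.ndarray,
                                    glare_mask_a: np.ndarray) -> np.ndarray:
    """Use a precomputed glare map instead of a per-pixel comparison.

    glare_mask_a is a boolean array, True where imaging optical system 3A is
    known to produce glare (e.g. loaded from the storage unit 20). In those
    regions the output is taken from system 3B, elsewhere from system 3A, so
    no pixel-value comparison is needed at capture time. Both images are
    assumed grayscale and position-adjusted to the same frame.
    """
    out = img_a.copy()
    out[glare_mask_a] = img_b[glare_mask_a]
    return out
```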
As described above, according to the present embodiment, by making the directions of the generated diffracted light differ among the plurality of imaging optical systems, it is possible to suppress glare in an image captured by a front camera arranged so as to overlap the display, that is, a camera provided on the side of the display unit opposite to its display surface. As a result, in the electronic apparatus, an imaging optical system capable of acquiring a highly accurate image can be provided on the display front side without increasing the bezel width of the surface on which the display is installed.
For example, when light emitted from an intense light source is irradiated to a local portion of the display surface, glare is easily generated in the photographing optical system 3 in the vicinity of the light irradiation area. In this case, it is also possible to acquire an image in which the glare is suppressed as described above by using an image acquired in at least one other photographing optical system 3 set to a different position.
The first imaging optical system 3A and the second imaging optical system 3B are two systems selected from the plurality of imaging optical systems; three or more imaging optical systems 3 may be provided. That is, it suffices that at least two of the three or more imaging optical systems 3 function as the first imaging optical system 3A and the second imaging optical system 3B, in which case the operations and effects described in the present embodiment and in the embodiments below can be achieved. In addition, by providing, among the three or more imaging optical systems 3, two or more pairs having the characteristics of the pair of the first imaging optical system and the second imaging optical system, the degree of freedom in acquiring the captured image can be further improved, and the accuracy of glare suppression can be further increased. The pairs need not be distinct; the same imaging optical system may belong to different pairs. For example, when imaging optical systems 3X, 3Y, and 3Z are provided, the two combinations (3X, 3Y) and (3X, 3Z), which share 3X, may be used as (first imaging optical system, second imaging optical system) pairs.
In the above description, each imaging optical system 3 has an opening as the optical system 9, but the present invention is not limited to this. For example, one may be provided with an opening and the other may not. In this case as well, similarly to the above, since the glare is generated at different positions in the plurality of imaging optical systems 3, its influence can be suppressed using the images acquired by the plurality of imaging optical systems 3.
The opening shape is an ellipse, but is not limited thereto. For example, it may be a rectangle or a rounded rectangle. It may also be bounded by any closed curve as long as light can be appropriately received by the imaging section 8. The opening need not have the same shape throughout the thickness direction, i.e., the third direction. For example, a more complicated shape may be used, such as a rectangle on the upper side, i.e., the side closer to the display surface, and an ellipse on the side closer to the imaging unit 8.
In the above description, the shape of the opening and the like are described as optical characteristics, but the present invention is not limited to this. For example, the material filled in the opening may be changed for each photographing optical system 3 to have different characteristics. For example, a λ/4 wavelength plate may be provided as the optical system 9. In this case, the p-wave may be shielded to reduce glare caused by the influence of reflection in the display panel 4. Different wavelength plates may be provided as the optical system 9 for the plurality of photographing optical systems 3. As a result, the influence of glare due to reflection, diffraction, or the like in the display unit 2 can be changed for each imaging optical system 3, and various image correction methods can be used.
(second embodiment)
The electronic device according to the present embodiment includes a plurality of imaging optical systems having openings with the same layout and capable of reducing the influence of glare.
Fig. 8 shows a display surface of the electronic device 1 according to the present embodiment in the same manner as fig. 5. In the present embodiment, the first photographing optical system 3A and the second photographing optical system 3B have the same aperture layout. On the other hand, unlike the first embodiment, the first photographing optical system 3A and the second photographing optical system 3B exist at different positions in the second direction. That is, each optical system has an elliptical aperture having a major axis along the second direction, and is arranged at a position shifted in the second direction.
Fig. 9 is an image acquired by the second photographing optical system 3B; the image acquired by the first photographing optical system 3A is assumed to be that of fig. 6A. Since both photographing optical systems 3 have elliptical openings with their major axes in the second direction, the directions in which glare is generated coincide. However, since the first imaging optical system 3A and the second imaging optical system 3B are arranged shifted in the second direction, the glare occurs at positions shifted in the second direction.
Therefore, the image acquiring unit 12 can acquire an image in which the influence of glare is suppressed, as in the first embodiment. The first imaging optical system 3A and the second imaging optical system 3B may furthermore be shifted in the first direction as well. When the imaging optical systems are shifted in both the first direction and the second direction in this way, the center points in the first direction at which glare occurs in the respective imaging optical systems are located at different positions, and an image in which the glare is suppressed even further can be obtained.
(third embodiment)
In this embodiment, an electronic apparatus including three imaging optical systems will be described.
Fig. 10 is a diagram showing an example of the electronic device 1 according to the present embodiment. The electronic apparatus 1 includes a first imaging optical system 3A, a second imaging optical system 3B, and a third imaging optical system 3C on the side of the display unit 2 opposite to the display surface. For example, the photographing optical systems are arranged along the first direction. The first photographing optical system 3A is provided at the center of the screen, and the second photographing optical system 3B and the third photographing optical system 3C are provided near the ends (boundaries) of the display surface of the display unit 2, with the first photographing optical system 3A interposed between them. Here, "near" means, for example, within one to several display pixels from the end or boundary; as other examples, it may mean within several millimeters from the end of the housing of the electronic apparatus 1, or within several percent of the width or height of the electronic apparatus 1.
With this arrangement, the parallax between the first photographing optical system 3A and the second photographing optical system 3B and the parallax between the third photographing optical system 3C and the first photographing optical system 3A are equal. That is, an image with a parallax (almost) of 0 with respect to the first photographing optical system 3A can thereby be generated using the image acquired by the second photographing optical system 3B and the image acquired by the third photographing optical system 3C.
This arrangement is used because, even if the display pixels there are impaired, the end of the display surface has little effect on the user. On the other hand, the size of the first imaging optical system 3A near the center of the display surface may, for example, be made smaller than that of the other imaging optical systems. Here, the size may be, for example, the size of the opening or the size of the imaging unit 8. With this configuration, the image displayed by the display section 2 can be made more natural.
In fig. 10, for example, the first imaging optical system 3A may have an ellipse having a major axis in the first direction as the optical system 9, and the second imaging optical system 3B and the third imaging optical system 3C may have an ellipse having a major axis in the second direction as the optical system 9.
Fig. 11 is a diagram showing images that can be acquired from the respective photographing optical systems when such an opening is provided. The images acquired by the first photographing optical system 3A, the second photographing optical system 3B, and the third photographing optical system 3C are in order from the top. In this way, glare occurs in the first imaging optical system 3A along the second direction, and glare occurs in the second imaging optical system 3B and the third imaging optical system 3C at positions displaced along the first direction.
Fig. 12 is an image in which the images of the second photographing optical system 3B and the third photographing optical system 3C are superimposed with the parallax taken into account. When an image is acquired in this way, the glare may be positioned at both ends, as shown for example in fig. 12. Using this image and the top image of fig. 11, an image in which the glare is suppressed, such as that shown in fig. 7, can then be obtained. The synthesis or correction of these images may be performed by the image acquisition section 12.
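As a simplified sketch only, and not the disclosed processing, the superposition and the subsequent combination with the central image could look as follows, assuming a single dominant, pre-calibrated disparity for the main subject; function names and the disparity value are hypothetical.

```python
import numpy as np


def synthesize_center_view(img_left: np.ndarray, img_right: np.ndarray,
                           disparity_px: float) -> np.ndarray:
    """Superimpose the two outer images (systems 3B and 3C), taking parallax into account.

    disparity_px: horizontal disparity of the main subject between the two outer
    imaging optical systems. Shifting each image by half the disparity toward the
    center and averaging yields an image with (almost) zero parallax relative to
    the central system 3A. np.roll wraps at the edges; a real implementation
    would pad instead.
    """
    half = int(round(disparity_px / 2))
    left_shifted = np.roll(img_left, -half, axis=1)
    right_shifted = np.roll(img_right, +half, axis=1)
    return (left_shifted.astype(np.float32) + right_shifted.astype(np.float32)) / 2.0


def combine_with_center(center_img: np.ndarray, synthesized: np.ndarray) -> np.ndarray:
    # As in the first embodiment, keep the darker value per pixel to discard
    # glare that appears in only one of the two images.
    return np.minimum(center_img.astype(np.float32), synthesized)
```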
In the above description, the second photographing optical system 3B and the third photographing optical system 3C are provided specifically at both ends, but the present invention is not limited thereto. For example, by providing the second imaging optical system 3B and the third imaging optical system 3C at positions that give the same parallax, with the first imaging optical system 3A interposed between them, an image in which glare is suppressed can be acquired in the same manner. In the present embodiment the offset is along the first direction, but it may instead be along the second direction. Further, each imaging optical system 3 may be arranged at a position shifted in both the first direction and the second direction.
(fourth embodiment)
The intensity of glare may vary depending on the direction. For example, it may be generated strongly in the first direction and weakly in the second direction. In this case, in the present embodiment, an imaging optical system that easily generates glare in the first direction or the second direction is appropriately selected, and an image in which glare is suppressed is acquired based on a signal acquired from the selected imaging optical system.
Fig. 13 shows an example of an image acquired based on the output of the first photographing optical system 3A in the configuration of fig. 5 when the glare generated in the first direction is weak while the glare generated in the second direction is strong. As shown in fig. 13, since the glare intensity in the first direction is weak, an image whose glare portion is darker than in fig. 6A is acquired. On the other hand, since the glare in the second direction is strong, the image output from the second photographing optical system 3B has a bright glare portion, as shown in fig. 6B.
In this way, when the intensity of the glare varies depending on the direction, an image in which the glare is suppressed can be obtained based on the output from the first imaging optical system 3A. The intensity of the glare is determined, for example, by the preprocessing unit 10 calculating, from the image data output by each imaging optical system 3, the variance of the luminance along the first direction and along the second direction. For example, the variance along the first direction is calculated for each row and averaged, and the variance along the second direction is calculated for each column and averaged; the direction with the higher average can be regarded as the direction in which the glare is stronger. The direction may also be determined not from the variance but from the difference between the maximum and minimum luminance in each row or column.
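A minimal sketch of this direction estimate, assuming the image rows run along the first direction and the columns along the second direction; this is an illustration, not the disclosed implementation.

```python
import numpy as np


def glare_direction_by_variance(image: np.ndarray) -> str:
    """Estimate along which direction the glare is stronger.

    The luminance variance is computed within each row (first direction) and
    within each column (second direction) and averaged; the direction with the
    higher average variance is taken as the direction of stronger glare.
    """
    var_first = image.var(axis=1).mean()   # per-row variance, averaged over rows
    var_second = image.var(axis=0).mean()  # per-column variance, averaged over columns
    return "first" if var_first > var_second else "second"


def glare_direction_by_range(image: np.ndarray) -> str:
    """Alternative criterion mentioned above: max-min luminance per row/column."""
    range_first = (image.max(axis=1) - image.min(axis=1)).mean()
    range_second = (image.max(axis=0) - image.min(axis=0)).mean()
    return "first" if range_first > range_second else "second"
```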
As described above, the direction in which glare is easily generated depends on the optical system 9 of each photographing optical system 3; for example, as described above, it is determined by the orientation of the opening provided in the optical system 9. The photographing optical system 3 to be given priority when glare is generated in a given direction can therefore be determined in advance from the orientation of the opening.
A case in which the glare-prone direction follows from the layout of the openings has been described, but the present invention is not limited thereto. The optical system governing glare is not only the opening layout but may be the entire display panel, including its circuit wiring. The shape of the glare and of the diffraction also varies with how the circuit patterns are periodically arranged and how the light interferes. Based on these factors, it is possible to determine in which direction glare is easily generated.
The image acquiring unit 12 acquires an image in which glare is suppressed based on the photographing optical system 3 whose priority direction differs from the direction in which the glare occurs. For example, when images such as those of fig. 13 and fig. 6B are output by the respective photographing optical systems 3, the image of fig. 13, which has less glare, can be selected and output. Alternatively, a weighted average of the images of fig. 13 and fig. 6B may be calculated with a larger weight given to the image of fig. 13.
As another example, when strong glare is generated in the second direction, an image is acquired based on the above-described arbitrary acquisition (e.g., selection, synthesis) method using a plurality of the photographing optical systems 3 whose first directions are the priority directions.
In this way, when the glare has directionality, an image in which the glare is suppressed can be acquired based on the image output by the photographing optical system 3 whose priority direction differs from that direction.
(fifth embodiment)
In the electronic apparatus 1 of the present embodiment, even when glare occurs, its occurrence can be kept small in at least one of the imaging optical systems 3.
Fig. 14 is a diagram showing a cross-sectional view of the electronic device 1 according to the present embodiment. As shown in fig. 14, the electronic apparatus 1 includes a light shielding portion 30 between the plurality of imaging optical systems 3. The light shielding portion 30 may be a light shielding film made of a material having high light shielding properties, or may be an absorption film made of a material having high light absorption.
In this way, the light shielding portion 30 that prevents light from propagating between the imaging optical systems 3 can be provided on the side of the display panel 4 opposite to the display surface.
Fig. 15 shows another example in which the light shielding portion 30 is provided so as not to transmit light reflected by the display panel 4 and the circularly polarizing plate 5 in the first direction and the second direction with respect to the plurality of imaging optical systems 3.
Fig. 16 shows yet another example, in which the light shielding portion 30 is provided so as to penetrate the display panel 4. In this way, it is possible to block not only light reflected inside the display unit 2 but also light reflected outside the display unit 2.
For example, as shown in fig. 15 and 16, by disposing the light shielding portion 30 in the region including the display panel 4 and the circularly polarizing plate 5, the occurrence of glare due to reflection, diffraction, and the like in these layers can be suppressed. Even if strong glare is generated in a certain region by incident light or the like, and therefore in the imaging optical system 3 that receives light from that region, the influence of the glare can be reduced in the other imaging optical systems 3 separated from it by the light shielding portion 30.
Further, as shown in fig. 14 and 16, for example, the influence of reflection and diffraction of light on the side opposite to the display surface of the display unit 2 in the third direction can be reduced, thereby suppressing the influence on the imaging units 8.
In the above description, the light shielding portion 30 extends as far as the circularly polarizing plate 5, but it may also be provided in the touch panel 6, for example. Further, the light shielding portion 30 may be formed thin so as to reach the region of the cover glass 7. In this case, the displayed content of the display section 2 can be appropriately adjusted in size and arranged so as to appear natural to the user. In the display panel 4, the luminance of the pixels 4b arranged around the light shielding portion 30 may be made higher than that of the other pixels 4b by software.
In this way, even when glare occurs, by shielding the plurality of imaging optical systems 3 from each other, when a certain imaging optical system 3 generates strong glare, it is possible to acquire an image with little influence of glare using the other imaging optical system 3. The image acquiring unit 12 can compare or combine output values of a plurality of imaging optical systems 3 to acquire an image with less influence of glare, for example, as in the above-described embodiment.
(seventh embodiment)
In the above embodiments, the image acquiring unit 12 acquires an image with a small influence of glare by a predetermined operation (including comparison). In contrast, in the present embodiment, the outputs from the plurality of imaging optical systems 3 are synthesized using a model to obtain an image in which glare is suppressed.
For example, the model may be a statistical model. The model is generated by statistically determining which calculation should be applied to the output of each imaging optical system 3 for synthesis, and the image acquisition unit 12 acquires an image with little influence of glare by inputting the information acquired from the plurality of imaging optical systems 3 into the model.
For example, the model may be a neural network model trained by deep learning. The Neural Network model may be formed of MLP (Multi-Layer Perceptron), CNN (Convolutional Neural Network), or the like. In this case, the storage unit 20 or the image acquisition unit 12 may store parameters trained from a plurality of teacher data in advance, and the image acquisition unit 12 may form a neural network model based on the stored parameters. Using the formed trained model, the image acquisition unit 12 acquires an image in which glare is suppressed using data output from the plurality of imaging optical systems 3.
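For concreteness only, a minimal sketch of such a CNN is given below in PyTorch; the architecture, channel counts, and loss are illustrative assumptions and not the configuration disclosed in this embodiment.

```python
import torch
import torch.nn as nn


class GlareSuppressionNet(nn.Module):
    """Toy CNN: takes the images from two imaging optical systems stacked along
    the channel axis and predicts a glare-suppressed image."""

    def __init__(self, channels_per_image: int = 1):
        super().__init__()
        in_ch = 2 * channels_per_image
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, channels_per_image, kernel_size=3, padding=1),
        )

    def forward(self, img_a: torch.Tensor, img_b: torch.Tensor) -> torch.Tensor:
        x = torch.cat([img_a, img_b], dim=1)  # (N, 2C, H, W)
        return self.body(x)


# Training would minimize, e.g., an L1 loss against glare-free reference images:
# loss = nn.functional.l1_loss(model(img_a, img_b), target)
```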
In addition, when using a trained model, the electronic apparatus 1 may further improve the training accuracy using captured images. For example, the control unit 18 of the electronic device 1 may perform the training. As another example, a plurality of electronic devices 1 may transmit data to a storage device in the cloud or the like, training may be performed on a server, and the retrained parameters may be reflected back to the electronic devices 1. In this case, only the glare information may be transmitted so as not to include privacy information such as the user's face. Data transmission and reception by the electronic device 1 may be made selectable by the user, for example through opt-in or opt-out settings.
In this way, the image acquisition unit 12 can acquire an image not only by linear processing but also by nonlinear processing, particularly by calculation using various models including a trained model.
(eighth embodiment)
In the above embodiments, the image acquisition unit 12 acquires an image by image synthesis or the like based on pixel values, priority directions, or a model. In the present embodiment, the region in which image correction is performed is defined for each photographing optical system 3.
For example, as shown in fig. 6A and the like, the region in which a bright object produces glare as a bright spot is, in each imaging optical system 3, largely determined by the characteristics of its optical system 9, the arrangement of the photographing optical system 3, and the like. Therefore, when a bright object is present, the image acquisition unit 12 can predict the glare generating region of each imaging optical system 3 and correct the image based on the prediction result.
Glare is often caused by optical elements located close by. The parallax of glare between the images acquired by the plurality of photographing optical systems 3 therefore tends to be larger than that of other subjects. The image acquisition unit 12 or the like can thus calculate the parallax and predict regions with large parallax as glare generating regions.
In the correction, for example, in a region where the probability of glare is high in a certain photographing optical system 3, the pixel values are determined based on information acquired from another photographing optical system 3. For example, in addition to the comparison operation described in the first embodiment, interpolation from the other photographing optical system 3 may be performed in that region. The present invention is not limited to this; the correction may also be executed by other calculations, such as weighting that increases the influence of the pixel values output from the other imaging optical system 3.
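Purely as an illustration of the parallax-based prediction and replacement described above, the sketch below uses OpenCV block matching to flag high-disparity regions and then fills them from the other system; the threshold and parameters are hypothetical and device-dependent, and this is not the disclosed algorithm.

```python
import cv2
import numpy as np


def predict_glare_by_parallax(img_a_gray: np.ndarray, img_b_gray: np.ndarray,
                              disparity_threshold: float = 32.0) -> np.ndarray:
    """Flag pixels whose disparity between the two imaging optical systems is
    unusually large as glare candidates (glare behaves like a very close light
    source). Inputs must be 8-bit single-channel images."""
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disp = stereo.compute(img_a_gray, img_b_gray).astype(np.float32) / 16.0
    return disp > disparity_threshold  # boolean glare-candidate mask for image A


def correct_flagged_regions(img_a: np.ndarray, img_b: np.ndarray,
                            glare_mask: np.ndarray) -> np.ndarray:
    """Replace flagged pixels in image A with the corresponding pixels of image B
    (simple interpolation from the other imaging optical system)."""
    out = img_a.copy()
    out[glare_mask] = img_b[glare_mask]
    return out
```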
When three or more imaging optical systems 3 are provided and the probability of glare occurring in the other plurality of imaging optical systems 3 in the area is low, an image with little influence of glare may be acquired based on information acquired from the other plurality of imaging optical systems 3.
Further, the region may be reflected in the trained model of the above embodiment and updated. By feeding this back as training, correction accuracy can be improved, for example, across different users having the same electronic apparatus 1.
As another example, the region may be determined in a state in which the display panel 4 is not displaying anything but light is incident. Instead of turning the display off, the region may also be predicted by driving the display panel 4 at an intensity that causes glare. In this way, the process of determining the region where glare occurs can be performed not at the shooting timing desired by the user but at other timings, for example, before or after shooting. This process may be executed by the control unit 18 or the image acquisition unit 12, for example.
(ninth embodiment)
In the present embodiment, the electronic apparatus 1 includes a microlens array as the optical system 9 of the imaging optical system 3.
Fig. 17 is a diagram showing the imaging optical system 3 according to the present embodiment. The optical system 9 of the imaging optical system 3 includes a microlens array 32. The light passing through the microlens array 32 is appropriately incident on the imaging section 8, and is converted into a signal to be output from the imaging section 8.
The preprocessing unit 10 reconstructs an image based on the signal output from the imaging unit 8. The image acquisition unit 12 acquires an image in which the influence of glare is reduced by the above embodiments, based on the reconstructed image.
Fig. 18 is a diagram showing another example of the imaging optical system 3 according to the present embodiment. The first and second imaging optical systems 3A and 3B include a common microlens array 32 as the optical system 9. Since the microlens array 32 limits, to some extent, the region on which the light condensed by each lens is incident, a plurality of imaging optical systems 3 can be formed by arranging an imaging unit 8 for each region. In this case, for example, the light shielding portion 30 described in the fifth embodiment may be provided between the first imaging portion 8A and the second imaging portion 8B. In fig. 18, two imaging optical systems are provided, but this is not a limitation; three or more imaging optical systems 3 may share the same microlens array 32.
(tenth embodiment)
In the above-described embodiment, the configuration in which the plurality of imaging units 8 are provided in the same microlens array 32 has been described as an example, but in the present embodiment, the electronic apparatus 1 is provided with the same imaging unit 8 for the plurality of optical systems 9.
Fig. 19 is a diagram showing an example of the imaging optical system 3 according to the present embodiment. The electronic apparatus 1 is provided with one imaging unit 8 for the first imaging optical system 3A and the second imaging optical system 3B. In the imaging unit 8, a first imaging unit 8A and a second imaging unit 8B are defined in the imaging-possible area. The light enters the first image pickup unit 8A via the first optical system 9A, and the light enters the second image pickup unit 8B via the second optical system 9B. In this way, a plurality of imaging optical systems 3 can be formed by providing a plurality of optical systems 9 for the same imaging unit 8.
Fig. 20 is a view showing another example of mounting of the present embodiment. That is, the microlens array 32 is provided as the optical system 9, and the first imaging optical system 3A and the second imaging optical system 3B are formed using the same microlens array 32 and the imaging unit 8.
Fig. 21 is a modification of fig. 20, and a light shielding portion 30 is provided between the first imaging optical system 3A and the second imaging optical system 3B. By providing the light shielding portion 30 in this manner, light incident on each region of the imaging portion 8 can be controlled more clearly.
Fig. 22 is a modification of fig. 19, and a light shielding portion 30 is provided between the first optical system 9A and the second optical system 9B. By providing the light shielding portion 30 in this way, even if the microlens array 32 is not provided, the light incident on the first image pickup portion 8A and the second image pickup portion 8B can be clearly controlled.
In fig. 21 and 22, the light shielding portion 30 does not penetrate the display panel 4, but it may extend into or penetrate the display panel 4 as in the example of fig. 16.
By sharing the imaging unit 8 with the plurality of imaging optical systems 3 in this way, the circuit configuration and semiconductor process of the imaging unit 8 can be simplified.
For example, as in the present embodiment, a plurality of imaging optical systems 3 can be integrated by partially or entirely using the same elements. By thus integrally arranging the plurality of imaging optical systems 3, the influence of parallax between the imaging optical systems 3 can be reduced. This makes it possible to reduce the influence of parallax even when correcting images acquired separately, and to improve correction accuracy.
In addition, instead of sharing the imaging unit 8 or the optical system 9 as in the present embodiment, a plurality of imaging optical systems 3 may be integrated so that the optical systems 9 are adjacent to each other in the configuration in each of the above embodiments. In this case as well, correction with high accuracy can be performed while reducing the influence of parallax.
Fig. 23A is a diagram showing another arrangement example of the microlens array 32. As shown in fig. 23A, the microlens array is disposed within the opening of the optical system 9, on the side closer to the imaging section 8. Thus, the opening and the microlens array 32 together form the optical system 9 as separate structures. Of course, the optical system 9 may also be formed by disposing the microlens array 32 on the side of the display panel 4 opposite to the display surface, without providing an opening.
Fig. 23B shows another example of the configuration of the microlens array 32 and the configuration of the display panel 4. One or more microlens arrays 32 may be provided for the plurality of openings. In this case, an opening may be formed by leaving an area of the light-emitting pixels of the display panel 4 as appropriate, and the microlens array 32 may be provided between the lower portion of the opening and the imaging section 8.
In the case of fig. 23A and 23B, it is needless to say that the light shielding portion 30 may be provided at an appropriate position.
(eleventh embodiment)
In the present embodiment, a chip configuration of the imaging unit 8 and the like will be described.
Fig. 24 shows an example of a chip configuration according to the present embodiment. The configuration is shown schematically and is not limiting; cases in which other functions are provided on different chips are not excluded. That is, a selector, an I/F, a power supply node, and the like may suitably be provided in addition to the illustrated configuration.
As shown in fig. 24, an analog circuit such as the preprocessing section 10, a logic circuit such as the image acquisition section 12, and the imaging section 8 may be provided on the same chip. The control unit 18 and the storage unit 20 may also be provided there. By forming the imaging unit 8, the preprocessing unit 10, and the image acquisition unit 12 on the same chip in this manner, data processing can be performed at high speed with suppressed signal degradation, since signals and data need not pass through an external transmission/reception interface.
The preprocessing unit 10 may not be shared, but may be provided for each imaging unit 8. In this case, the digital image data may be transmitted from each preprocessing unit 10 to the image acquisition unit 12.
Fig. 25 is a diagram showing another example of the chip configuration. In fig. 25, elements other than the imaging unit 8, the preprocessing unit 10, the image acquisition unit 12, and the interface are omitted. The imaging sections 8 may be provided on separate chips: one chip is provided with one imaging section 8, and the other chip with the other imaging section 8. The one chip includes an interface for transmitting the information of its imaging unit 8 to the other chip, where it is received via the reception interface 22.
In the other chip, the signal received via the reception interface 22 is combined with the signal from the imaging unit 8 included in that chip, and the image acquisition unit 12 performs image acquisition. The preprocessing unit 10 performs signal processing as necessary before outputting data to the image acquisition unit 12. The preprocessing unit 10 may be provided on both chips or only on the latter chip.
Of course, instead of two chips, imaging sections 8 may be provided on a larger number of chips. With this configuration, even when the imaging optical systems 3 are disposed at various positions on the side opposite to the display surface of the display unit 2, the area of the chips related to the imaging units 8 can be kept small.
Fig. 26 is another example in which the plurality of photographing sections 8 are on a single chip and the image acquisition section 12 is provided on another chip. In this way, the logic circuit for processing the digital signal and the photoelectric conversion circuit including the light receiving element can be configured as separate chips. With this arrangement, the degree of freedom in the arrangement of the imaging optical system 3 and the like can be further improved without increasing the chip area.
In fig. 24 to 26, the reception interface 22 may be, for example, an interface capable of receiving data according to the MIPI standard. High-speed transmission under other standards may also be used. The transmitted signal may follow a standard for transmitting and receiving analog signals or one for digital signals. The interface type can be selected appropriately according to how the imaging section 8 and the preprocessing section 10 are mounted on the chips.
(twelfth embodiment)
In the above embodiments, the light-emitting pixels 4b of the display panel 4 have not been specifically described; however, by making the occurrence of glare differ among the plurality of imaging optical systems 3 through the light-emitting pixels 4b, the image acquisition unit 12 can likewise obtain an image with little influence of glare.
For example, when the first imaging optical system 3A and the second imaging optical system 3B are provided as the two imaging optical systems 3, the light-emitting pixels 4b around the first imaging optical system 3A and those around the second imaging optical system 3B may be of different types; for example, one may be an OLED and the other a micro LED. By providing display optical systems with different light-emitting elements, and in particular different optical characteristics, the two systems can be given mutually different behavior with respect to glare.
When glare arises under different optical characteristics, the glare that appears in the image acquired from each photographing optical system 3 differs depending on the situation. By making the characteristics of the glare different, for example, an image strongly affected by glare is acquired from one photographing optical system 3 while an image weakly affected by glare is acquired from the other. In this way, by using light-emitting elements having different characteristics, the image combining and correction methods available to the image acquisition unit 12 can be extended in various ways.
(thirteenth embodiment)
The foregoing embodiments use images output from a plurality of photographing optical systems 3 to acquire images with little influence of glare; that is, the plurality of photographing optical systems 3 are activated, and the image is acquired from the information each of them obtains. In the present embodiment, at least one photographing optical system 3 is activated only as necessary to reduce the influence of glare. For example, there may be at least one photographing optical system 3 that is activated only when image correction is required.
For example, in the initial state, only the photographing optical system 3 near the center of the display surface may be activated. When it is determined, based on the image acquired by that imaging optical system 3, that the influence of glare needs to be reduced, the other imaging optical systems 3 are additionally activated, and the image acquiring unit 12 acquires an image with little influence of glare.
For example, when strong light strikes the entire display surface or only a part of it, it is determined that the influence of glare is strong. The influence of glare may also be judged based on other conditions. In the above description the output of one imaging optical system 3 is used, but the present invention is not limited to this; the determination may be made, for example, based on the magnitude of the signals of a plurality of light receiving elements. As another example, the determination may be made based on the luminance distribution or the like of the image displayed on the display panel 4. In this way, the influence of glare can be judged by various methods.
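A hypothetical sketch of such on-demand activation follows; the thresholds and the camera interface are assumptions introduced only for illustration, not part of the disclosure.

```python
import numpy as np

# Assumed constants: the frame is normalized to [0, 1].
GLARE_LUMA_THRESHOLD = 0.9    # pixels at or above this value count as "strong light"
GLARE_AREA_FRACTION = 0.01    # minimum fraction of such pixels to trigger correction


def glare_correction_needed(primary_frame: np.ndarray) -> bool:
    """Decide from the primary (central) camera alone whether the auxiliary
    imaging optical systems should be activated."""
    bright = primary_frame >= GLARE_LUMA_THRESHOLD
    return bright.mean() >= GLARE_AREA_FRACTION


def capture(primary_camera, auxiliary_cameras):
    """primary_camera / auxiliary_cameras: hypothetical objects with a read() method."""
    frame = primary_camera.read()
    if not glare_correction_needed(frame):
        return frame                                  # keep the others powered down
    aux_frames = [cam.read() for cam in auxiliary_cameras]
    # Combine as in the earlier embodiments, e.g. by a per-pixel minimum.
    return np.minimum.reduce([frame, *aux_frames])
```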
Based on the above determination, the control unit 18 operates the appropriate imaging optical system 3, acquires a signal based on the incident light, and acquires an image by the image acquisition unit 12. The determination of the influence of the glare may be performed by the control unit 18. In this way, energy can be saved by operating at least a part of the imaging optical system 3 as needed.
(fourteenth embodiment)
In the above embodiments, various modes of the imaging optical system 3 and the image acquisition unit 12 are described. In the present embodiment, the arrangement of the photographing optical system 3 will be described.
Fig. 27 to 31B are views each showing an example of the arrangement of the photographing optical systems 3 as viewed from the display surface. Each of these drawings shows an example of the configuration, and the configuration is not limited to what is shown. Further, features from several of the drawings may be combined; that is, the electronic apparatus 1 may be provided with three or more imaging optical systems 3 having the features of not only one drawing but of several drawings.
As shown in fig. 27, the photographing optical systems 3 may be disposed at both end portions (near the boundary) of the display screen 1a in an arbitrary direction, for example the first direction. With this configuration, the photographing optical systems 3 can be placed at positions inconspicuous to the user. They may also be provided at both ends in the second direction, i.e., at the upper and lower ends in fig. 27.
As shown in fig. 28, the distance between the two photographing optical systems 3 in the display screen 1a may be set to 50 mm to 80 mm. With this configuration, images having a parallax close to that of human eyes can be acquired from the two photographing optical systems 3. The influence of glare can thereby be reduced using the plurality of photographing optical systems 3, and a stereoscopic image that looks natural to a human can be generated. Further, a third imaging optical system 3 may be disposed at another position, and an image with further reduced glare can be obtained using this third imaging optical system 3, which has a parallax with respect to the other two. In this way, a stereoscopic image can be acquired while improving the accuracy of glare suppression.
As shown in fig. 29, a plurality of imaging optical systems 3 may be arranged so as to be shifted in both the first and second directions. For example, by providing each optical system 9 with openings whose major axes point in the same direction, images can be obtained in which the glare is offset in both the first direction and the second direction. The image acquisition unit 12 can then acquire a highly accurate image with glare suppressed by simpler processing (for example, the comparison and selection processing shown in the first embodiment).
As shown in fig. 30, three or more photographing optical systems 3 may be provided. When three or more imaging optical systems 3 are arranged, they may be arranged so as not to have symmetry as shown in fig. 30, or may be arranged so as to have some symmetry with respect to the first direction, the second direction, or the center point. Further, the third photographing optical system 3 may be arranged at a position distant from the other two photographing optical systems 3. In this case, the intensity of the glare caused by the position may be made different in each imaging optical system 3.
Fig. 31A and 31B show the same electronic device 1; fig. 31B is an external view seen from the direction of the arrow in fig. 31A. There may thus be a plurality of display screens 1a. The imaging optical systems 3 in the respective display screens may be arranged at the same positions with respect to each display screen 1a, as shown in the drawing, or may be arranged at completely different positions. For example, at least two imaging optical systems 3 may be arranged on one surface at both ends along the first direction, and at least two imaging optical systems 3 on the other surface at both ends along the second direction. In this case, for example, the photographing optical systems 3 of fig. 31B may normally be used as a rear camera, or a rear camera that performs capture with two (or more) systems may be used.
In this way, the photographing optical system 3 can be configured in various ways on the display screen 1 a.
As described in the above embodiments, at least two of the plurality of imaging optical systems 3 may be provided at mutually shifted positions. The arrangement can be chosen freely and is not limited to the drawings. For example, as in fig. 2 and the like, by providing a plurality of imaging optical systems 3 shifted along the first direction at the center of the display surface in the second direction, an image with a natural line of sight can be acquired while the user looks at the center of the screen. This is not limiting; they may, for example, be provided slightly above the center in the second direction. For example, when acquiring an image of the user's face, the center of the face is located at the center of the screen, so the photographing optical systems may be arranged near where the eyes are displayed. They may also be arranged along the second direction near the center of the first direction; in this case, an equivalent image can be acquired when the screen is rotated.
Next, other examples of the layout of the openings included in the optical system 9 are shown. These layouts are more complicated than the simple shapes shown in fig. 5, fig. 8, and the like.
Fig. 32 to 35 are views showing an example of the layout of the openings. As shown in fig. 32, for example, the openings are also arranged to be shifted in the second direction with respect to the display element.
As shown in fig. 33, for example, the opening is formed in a shape in which an ellipse having a major axis in the first direction and an ellipse having a major axis in the second direction are combined between the display elements.
As shown in fig. 34, for example, the opening belonging to the first optical system 9A and the opening belonging to the second optical system 9B may be at an angle close to 90° without intersecting. In this case, the angle of the light emitting elements of the display panel 4 may also be changed locally. For example, the arrangement angle, on the display surface, of the light emitting elements lying above an arbitrary optical system 9 in the third direction may be rotated by 45° or by another suitable angle.
Of course, as shown in fig. 35, the opening shown in fig. 33 may be arranged as shown in fig. 34.
In this way, the openings forming part of the optical system 9 may also be laid out so as to intentionally generate glare in other directions. When the electronic apparatus 1 is provided with three or more imaging optical systems 3, the aperture layouts shown in fig. 5, 8, and 32 to 35 may be combined. In that case, since the glare generation directions are combined in more varied ways among the photographing optical systems 3, the range of image acquisition techniques, such as image correction, synthesis, and selection, can be further expanded.
As described above, according to each embodiment, the imaging optical systems 3 are disposed on the side of the display unit 2 opposite to the display surface, and the light passing through the display unit 2 is acquired by the plurality of imaging units 8. Part of the light passing through the display unit 2 is repeatedly reflected within the display unit 2 and then enters the imaging units 8 of the plurality of imaging optical systems 3. According to the above embodiments, images are acquired from the signals obtained by the plurality of imaging optical systems 3 (including the integrated case), so that the glare components and diffraction components contained in the light that is repeatedly reflected within the display unit 2 before entering the plurality of imaging units 8 can be suppressed easily and reliably in the captured image.
For example, the processing in the image acquisition unit 12 may be implemented by a digital circuit, or by a programmable circuit such as an FPGA (Field Programmable Gate Array). The processing contents may also be described as a program, and the information processing by software may be concretely realized using hardware resources such as a CPU.
Several application examples are listed below.
(fifteenth embodiment)
As specific candidate configurations of the electronic apparatus 1 having the configuration described in the above embodiment, various configurations can be considered. For example, fig. 36 is a plan view when the electronic apparatus 1 according to each embodiment is applied to the capsule endoscope 50. The capsule endoscope 50 of fig. 36 includes, for example, in a casing 51 having hemispherical end surfaces and a cylindrical central portion: a camera (subminiature camera) 52 for taking an image of the inside of the body cavity; a memory 53 for recording image data taken by the camera 52; and a wireless transmitter 55 for transmitting the recorded image data to the outside through the antenna 54 after the capsule endoscope 50 is discharged outside the subject's body.
In addition, a CPU (Central Processing Unit) 56 and a coil (magnetic-force/current conversion coil) 57 are provided in the case 51. The CPU 56 controls shooting by the camera 52 and the operation of storing data in the memory 53, and controls the transmission of data from the memory 53 via the wireless transmitter 55 to a data receiving device (not shown) outside the housing 51. The coil 57 supplies power to the camera 52, the memory 53, the wireless transmitter 55, the antenna 54, and the light sources 52b described later.
The casing 51 is also provided with a magnetic (reed) switch 58 for detecting when the capsule endoscope 50 is set in the data receiving apparatus. The CPU 56 supplies power from the coil 57 to the wireless transmitter 55 at the point in time when the reed switch 58 detects that the capsule has been set in the data receiving apparatus and data transmission has become possible.
The camera 52 includes, for example, an imaging element 52a including the optical system 9 for capturing an image of the inside of the body cavity, and a plurality of light sources 52b for illuminating the body cavity. Specifically, the camera 52 is configured by, for example, a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device), with LEDs (Light Emitting Diodes) as the light sources 52b.
The concept of the display unit 2 in the electronic device 1 of the foregoing embodiments includes light emitters such as the light source 52b of fig. 36. The capsule endoscope 50 of fig. 36 has, for example, two light sources 52b, and these light sources 52b may be constituted by a display panel 4 having a plurality of light source sections or by an LED module having a plurality of LEDs. In this case, by disposing the imaging unit 8 of the camera 52 below the display panel 4 or the LED module, restrictions on the layout of the camera 52 can be reduced, and a smaller capsule endoscope 50 can be realized.
(sixteenth embodiment)
Fig. 37 is a rear view of the electronic apparatus 1 according to the embodiment applied to a digital single lens reflex camera 60. The digital single lens reflex camera 60 and the compact camera have a display unit 2 that displays a preview screen on the back surface on the side opposite to the lens. The imaging optical system 3 may be disposed on the side of the display unit 2 opposite to the display surface, and the face image of the photographer may be displayed on the display screen 1a of the display unit 2. In the electronic apparatus 1 according to each of the above embodiments, since the imaging optical system 3 can be disposed in the region overlapping the display unit 2, it is not necessary to provide the imaging optical system 3 in the frame portion of the display unit 2, and the size of the display unit 2 can be increased as much as possible.
(seventeenth embodiment)
Fig. 38A is a plan view showing an example in which the electronic apparatus 1 of the embodiments is applied to a head mounted display (hereinafter referred to as HMD) 61. The HMD 61 of fig. 38A can be used for VR (Virtual Reality), AR (Augmented Reality), MR (Mixed Reality), SR (Substitutional Reality), or the like. In a conventional HMD, as shown in fig. 38B, a camera 62 is mounted on the outer surface, so that the wearer of the HMD can see images of the surroundings, but the people around the wearer cannot see the wearer's eyes or facial expression.
Therefore, in fig. 38A, the display surface of the display unit 2 is provided on the outer surface of the HMD 61, and the photographing optical system 3 is provided on the side opposite to the display surface of the display unit 2. This makes it possible to display the wearer's facial expression, as captured by the photographing optical system 3, on the display surface of the display unit 2, so that the people around the wearer can grasp the wearer's facial expression and eye movement in real time.
In the example of fig. 38A, since the imaging optical system 3 is provided on the rear surface side of the display unit 2, the installation place of the imaging optical system 3 is not limited, and the degree of freedom in design of the HMD61 can be increased. Further, since the camera can be arranged at the optimum position, it is possible to prevent a problem such as misalignment of the line of sight of the wearer displayed on the display surface.
As described above, in the present embodiment, the electronic device 1 of the embodiment can be used for various purposes and can improve the utility value.
The present technology can be configured as follows.
(1) An electronic device, comprising:
a display unit having a displayable region having display optical systems arranged in an array along a first direction and a second direction intersecting the first direction;
a plurality of imaging optical systems that are arranged on a side of the display unit opposite to a display surface so as to overlap the displayable region in a third direction intersecting the first direction and the second direction, the plurality of imaging optical systems including at least:
a first photographing optical system; and
a second photographing optical system having a coordinate different from that of the first photographing optical system in at least one of the first direction and the second direction; and
and an image acquisition unit that acquires image data based on the information acquired by the first and second imaging optical systems.
(2) The electronic apparatus according to (1), wherein,
light is transmitted from the display surface of the display portion to the first photographing optical system and the second photographing optical system through optical systems having different optical characteristics.
(3) The electronic apparatus according to (1) or (2), wherein,
the display unit is provided with an opening for transmitting light incident from the display surface,
light incident from the display surface propagates to the photographing optical system through the opening.
(4) The electronic apparatus according to (3), wherein,
the opening that propagates light to the first photographing optical system and the opening that propagates light to the second photographing optical system have different layouts.
(5) The electronic apparatus according to (3) or (4), wherein,
the opening through which light is propagated to the first photographing optical system and the opening through which light is propagated to the second photographing optical system form diffraction images in different directions.
(6) The electronic apparatus according to any one of (1) to (5),
the electronic apparatus includes a third photographing optical system whose parallax with respect to the first photographing optical system is the same as that of the second photographing optical system,
the image acquisition unit acquires the image data based on information acquired from the second photographing optical system and the third photographing optical system, and on information acquired from the first photographing optical system.
(7) The electronic apparatus according to (6), wherein,
in the first direction or the second direction of the display surface,
the first photographing optical system is disposed near the center,
the second photographing optical system and the third photographing optical system are provided in the vicinity of a boundary of the display surface with the first photographing optical system interposed therebetween,
the first photographing optical system has a smaller area facing the display surface in the third direction than the second photographing optical system and the third photographing optical system.
(8) The electronic apparatus according to any one of (1) to (7),
the first photographing optical system and the second photographing optical system have different coordinates in the first direction and the second direction.
(9) The electronic apparatus according to any one of (1) to (8),
when the data acquired from the first photographing optical system and the data acquired from the second photographing optical system are combined, the image acquisition unit acquires, as the result of the combination, an imaging result based on the data having the lower intensity (see the minimal sketch following this list).
(10) The electronic apparatus according to any one of (1) to (9),
the first photographing optical system and the second photographing optical system each have a direction in which light is preferentially reflected,
and when the outputs in the respective directions differ by a predetermined value or more, the image acquisition unit acquires the imaging result using either one of the results.
(11) The electronic apparatus according to any one of (1) to (10),
the first photographing optical system and the second photographing optical system are shielded from light.
(12) The electronic apparatus according to any one of (1) to (11), wherein,
the image acquisition unit combines information acquired by the first photographing optical system and the second photographing optical system using a trained model.
(13) The electronic device according to (12), wherein,
the trained model is trained based on data collected from a plurality of electronic devices.
(14) The electronic apparatus according to any one of (1) to (13),
the image acquisition unit performs correction when the parallax in the plurality of images acquired by the plurality of imaging optical systems exceeds a predetermined amount.
(15) The electronic apparatus according to any one of (1) to (14),
at least one of the photographing optical systems is constituted by a microlens array.
(16) The electronic device according to (15), wherein,
a plurality of the photographing optical systems are disposed in a region where the microlens array is disposed.
(17) The electronic apparatus according to any one of (1) to (16),
the first photographing optical system and the second photographing optical system acquire information through the same imaging element.
(18) The electronic device according to (17), wherein,
the image acquisition unit and the imaging element are disposed on the same chip.
(19) The electronic apparatus according to any one of (1) to (18),
the display unit includes a plurality of the display optical systems having different optical characteristics.
(20) The electronic apparatus according to any one of (1) to (19),
at least one of the plurality of photographing optical systems operates when the image acquisition unit needs a correction signal.
(21) The electronic apparatus according to any one of (1) to (20),
the first photographing optical system is integrally formed with the second photographing optical system.
(22) The electronic apparatus according to any one of (1) to (21), wherein,
the first photographing optical system and the second photographing optical system are disposed near a boundary of the display surface.
(23) The electronic apparatus according to any one of (1) to (22),
the first photographing optical system and the second photographing optical system are arranged at a distance of 50 mm or more and 80 mm or less from each other, and
the image acquisition unit generates parallax image data from the information acquired by the first photographing optical system and the second photographing optical system (a minimal sketch appears after this enumeration).
(24) The electronic apparatus according to any one of (1) to (23),
display units are provided on both sides of the electronic apparatus.
(25) The electronic apparatus according to any one of (1) to (24),
the first photographing optical system and the second photographing optical system are any two selected from among the plurality of photographing optical systems.
(26) An electronic apparatus comprising a fourth photographing optical system different from the first photographing optical system and the second photographing optical system, wherein
the fourth photographing optical system, together with the first photographing optical system or the second photographing optical system, has any of the features of (1) to (24).
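Configuration (9) above lends itself to a short illustration. The Python sketch below is one assumption-laden way to realize that combining rule: it presumes that the captures from the first and second photographing optical systems are already registered to each other and that flare or a diffraction artifact only ever adds intensity, so the per-pixel minimum approximates the artifact-free scene. It is a sketch of the idea, not the implementation described in the embodiments.

```python
# Minimal sketch of configuration (9): keep the lower-intensity value at each
# pixel when combining two registered captures of the same scene.
import numpy as np

def combine_low_intensity(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Pixel-wise combination that favors the darker (less flared) sample."""
    if img_a.shape != img_b.shape:
        raise ValueError("inputs must be registered to the same geometry")
    return np.minimum(img_a, img_b)

# Usage with synthetic data standing in for the two optical systems' outputs:
# each capture is contaminated by an additive artifact in a different half.
rng = np.random.default_rng(0)
scene = rng.integers(0, 200, size=(480, 640), dtype=np.uint16)
capture_a = scene.copy()
capture_a[:, :320] += 40  # artifact confined to the left half
capture_b = scene.copy()
capture_b[:, 320:] += 40  # artifact confined to the right half
combined = combine_low_intensity(capture_a, capture_b)
assert np.array_equal(combined, scene)  # the minimum recovers the clean scene
```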
The aspects of the present invention are not limited to the above embodiments but include various modifications that those skilled in the art can conceive, and the effects of the present invention are likewise not limited to those described above. That is, various additions, changes, and partial deletions are possible without departing from the conceptual scope and spirit of the present invention derived from the matters defined in the claims and their equivalents.
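As a further hedged illustration, configuration (23) can be pictured with the short sketch below: two systems spaced about 65 mm apart (inside the stated 50 mm to 80 mm range) yield a left/right pair from which parallax data can be derived. OpenCV block matching, the baseline value, the matcher parameters, and the synthetic test pattern are all assumptions for the sketch rather than details taken from the disclosure.

```python
# Minimal sketch of configuration (23): derive a coarse parallax (disparity)
# map from a left/right pair captured by two laterally spaced optical systems.
import numpy as np
import cv2

BASELINE_MM = 65.0  # assumed spacing, within the 50 mm to 80 mm range

def parallax_from_pair(left_gray: np.ndarray, right_gray: np.ndarray) -> np.ndarray:
    """Return a disparity map from an 8-bit grayscale stereo pair."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # StereoBM returns fixed-point disparities scaled by 16.
    return matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

# Synthetic pair: the right view is the left view shifted by 8 pixels, standing
# in for the offset the assumed baseline would induce for a nearby subject.
rng = np.random.default_rng(0)
left = rng.integers(0, 255, size=(240, 320), dtype=np.uint8)
right = np.roll(left, -8, axis=1)
disparity = parallax_from_pair(left, right)  # interior values should sit near 8
```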

Claims (20)

1. An electronic device is characterized by comprising:
a display unit having a displayable region in which display optical systems are arranged in an array along a first direction and a second direction intersecting the first direction;
a plurality of imaging optical systems that are arranged on a side of the display unit opposite to a display surface so as to overlap the displayable region in a third direction intersecting the first direction and the second direction, the plurality of imaging optical systems including at least:
a first photographing optical system; and
a second photographing optical system having a different coordinate from the first photographing optical system in at least one of the first direction and the second direction; and
an image acquisition unit that acquires image data based on information acquired by the first photographing optical system and the second photographing optical system.
2. The electronic device of claim 1,
light is transmitted from the display surface of the display unit to the imaging element of the first photographing optical system and the imaging element of the second photographing optical system through optical systems having different optical characteristics.
3. The electronic device of claim 1,
the display unit is provided with an opening for transmitting light incident from the display surface,
light incident from the display surface propagates to the photographing optical system through the opening.
4. The electronic device of claim 3,
the opening that propagates light to the first photographing optical system and the opening that propagates light to the second photographing optical system have different layouts.
5. The electronic device of claim 1,
the electronic device includes a third photographing optical system whose parallax with respect to the first photographing optical system is the same as that of the second photographing optical system,
the image acquisition unit acquires the image data based on information acquired from the second photographing optical system and the third photographing optical system, and on information acquired from the first photographing optical system.
6. The electronic device of claim 5,
in the first direction or the second direction of the display surface,
the first photographing optical system is disposed near the center,
the second photographing optical system and the third photographing optical system are provided in the vicinity of a boundary of the display surface with the first photographing optical system interposed therebetween,
an area of the first photographing optical system facing the display surface in the third direction is smaller than that of the second photographing optical system and that of the third photographing optical system.
7. The electronic device of claim 1,
when the data acquired from the first photographing optical system and the data acquired from the second photographing optical system are combined, the image acquisition unit acquires, as the result of the combination, an imaging result based on the data having the lower intensity.
8. The electronic device of claim 1,
the first photographing optical system and the second photographing optical system each have a direction in which light is preferentially reflected,
and when the outputs in the respective directions differ by a predetermined value or more, the image acquisition unit acquires the imaging result using either one of the results.
9. The electronic device of claim 1,
the first photographing optical system and the second photographing optical system are shielded from light.
10. The electronic device of claim 1,
the image acquisition unit combines information acquired by the first photographing optical system and the second photographing optical system using a trained model.
11. The electronic device of claim 1,
the image acquisition unit performs correction when the parallax in the plurality of images acquired by the plurality of imaging optical systems exceeds a predetermined amount.
12. The electronic device of claim 1,
at least one of the photographing optical systems is constituted by a microlens array.
13. The electronic device of claim 12,
a plurality of the photographing optical systems are disposed in a region where the microlens array is disposed.
14. The electronic device of claim 1,
the first photographing optical system and the second photographing optical system acquire information through the same imaging element.
15. The electronic device of claim 1,
the display unit includes a plurality of the display optical systems having different optical characteristics.
16. The electronic device of claim 1,
at least one of the plurality of imaging optical systems operates when the image acquisition unit needs a correction signal.
17. The electronic device of claim 1,
the first photographing optical system is integrally formed with the second photographing optical system.
18. The electronic device of claim 1,
the first photographing optical system and the second photographing optical system are disposed near a boundary of the display surface.
19. The electronic device of claim 1,
the first photographing optical system and the second photographing optical system are arranged at a distance of 50 mm or more and 80 mm or less from each other, and
the image acquisition unit generates parallax image data from the information acquired by the first photographing optical system and the second photographing optical system.
20. The electronic device of claim 1,
display units are provided on both sides of the electronic device.
CN202022601717.5U 2019-11-12 2020-11-11 Electronic device Active CN213718047U (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-204844 2019-11-12
JP2019204844 2019-11-12

Publications (1)

Publication Number Publication Date
CN213718047U true CN213718047U (en) 2021-07-16

Family

ID=75912510

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202011252523.7A Pending CN112866518A (en) 2019-11-12 2020-11-11 Electronic device
CN202022601717.5U Active CN213718047U (en) 2019-11-12 2020-11-11 Electronic device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202011252523.7A Pending CN112866518A (en) 2019-11-12 2020-11-11 Electronic device

Country Status (7)

Country Link
US (1) US20220343471A1 (en)
EP (1) EP4060405A4 (en)
JP (1) JPWO2021095581A1 (en)
KR (1) KR20220091496A (en)
CN (2) CN112866518A (en)
TW (1) TW202134769A (en)
WO (1) WO2021095581A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023112780A1 (en) * 2021-12-13 2023-06-22 ソニーセミコンダクタソリューションズ株式会社 Image display device and electronic apparatus

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8199185B2 (en) * 1995-09-20 2012-06-12 Videotronic Systems Reflected camera image eye contact terminal
US7209160B2 (en) * 1995-09-20 2007-04-24 Mcnelley Steve H Versatile teleconferencing eye contact terminal
US6888562B2 (en) * 2003-03-26 2005-05-03 Broadcom Corporation Integral eye-path alignment on telephony and computer video devices using a pinhole image sensing device
TWI381702B (en) * 2003-06-17 2013-01-01 Semiconductor Energy Lab A display device having an image pickup function and a two-way communication system
JP4845336B2 (en) * 2003-07-16 2011-12-28 株式会社半導体エネルギー研究所 Display device with imaging function and bidirectional communication system
JP2006060535A (en) * 2004-08-20 2006-03-02 Sharp Corp Portable telephone system
US20070002130A1 (en) * 2005-06-21 2007-01-04 David Hartkop Method and apparatus for maintaining eye contact during person-to-person video telecommunication
WO2007013272A1 (en) * 2005-07-28 2007-02-01 Sharp Kabushiki Kaisha Display device and backlight device
WO2007047685A2 (en) * 2005-10-17 2007-04-26 I2Ic Corporation Combined video display and camera system
WO2007138543A2 (en) * 2006-05-25 2007-12-06 Udayan Kanade Display with gaps for capturing images
US7714923B2 (en) * 2006-11-02 2010-05-11 Eastman Kodak Company Integrated display and capture apparatus
US20090009628A1 (en) * 2007-07-06 2009-01-08 Michael Janicek Capturing an image with a camera integrated in an electronic display
US8154582B2 (en) * 2007-10-19 2012-04-10 Eastman Kodak Company Display device with capture capabilities
US8164617B2 (en) * 2009-03-25 2012-04-24 Cisco Technology, Inc. Combining views of a plurality of cameras for a video conferencing endpoint with a display wall
JP5684488B2 (en) * 2009-04-20 2015-03-11 富士フイルム株式会社 Image processing apparatus, image processing method, and program
US8456586B2 (en) * 2009-06-11 2013-06-04 Apple Inc. Portable computer display structures
KR20120019703A (en) * 2010-08-26 2012-03-07 삼성전자주식회사 Method for controlling a digital photographing apparatus and the digital photographing apparatus
US8947627B2 (en) 2011-10-14 2015-02-03 Apple Inc. Electronic devices having displays with openings
JP5836768B2 (en) * 2011-11-17 2015-12-24 キヤノン株式会社 Display device with imaging device
KR101864452B1 (en) * 2012-01-12 2018-06-04 삼성전자주식회사 Image taking and video communincation device and method
JP2014138290A (en) * 2013-01-17 2014-07-28 Sharp Corp Imaging device and imaging method
JP2015012127A (en) * 2013-06-28 2015-01-19 ソニー株式会社 Solid state image sensor and electronic apparatus
KR101462351B1 (en) * 2013-08-16 2014-11-14 영남대학교 산학협력단 Apparatus for eye contact video call
KR102289904B1 (en) * 2015-01-23 2021-08-18 삼성디스플레이 주식회사 Display apparatus
US9767728B2 (en) * 2015-10-30 2017-09-19 Essential Products, Inc. Light sensor beneath a dual-mode display
US20180090999A1 (en) 2016-09-23 2018-03-29 Apple Inc. Wireless charging mat with multiple coil arrangements optimized for different devices
US10958841B2 (en) * 2017-01-06 2021-03-23 Intel Corporation Integrated image sensor and display pixel
CN110336907A (en) * 2019-08-21 2019-10-15 惠州Tcl移动通信有限公司 Terminal, image pickup method and storage medium

Also Published As

Publication number Publication date
US20220343471A1 (en) 2022-10-27
KR20220091496A (en) 2022-06-30
JPWO2021095581A1 (en) 2021-05-20
EP4060405A4 (en) 2023-01-25
EP4060405A1 (en) 2022-09-21
TW202134769A (en) 2021-09-16
CN112866518A (en) 2021-05-28
WO2021095581A1 (en) 2021-05-20

Similar Documents

Publication Publication Date Title
US6104431A (en) Visual axis detecting apparatus and method including scanning light source, and image device using same
US10437080B2 (en) Eyewear, eyewear systems and associated methods for enhancing vision
CN108461045B (en) Display device and method for manufacturing the same
US20140125810A1 (en) Low-profile lens array camera
US10623697B2 (en) Display panel, display device and image pickup method therefor
JP2007116208A (en) Compound eye imaging apparatus
CN216625895U (en) Electronic device
KR20170141140A (en) Head mounted display and gaze detection system
KR20200073211A (en) Electronics
CN213718047U (en) Electronic device
CN105227828B (en) Filming apparatus and method
US11917298B2 (en) Electronic device
JP3819733B2 (en) Imaging device
CN212727101U (en) Electronic device
CN107172338A (en) A kind of camera and electronic equipment
KR101815164B1 (en) Camera to capture multiple sub-images for generation of an image
WO2021157324A1 (en) Electronic device
CN206573783U (en) Virtual reality helmet
KR20200062220A (en) Electronics
WO2021149503A1 (en) Electronic device
WO2022244354A1 (en) Imaging element and electronic device
CN111694183A (en) Display device and display method thereof
WO2021226770A1 (en) Mobile terminal, method for acquiring image, and computer-readable storage medium
WO2023050040A1 (en) Camera module and electronic device

Legal Events

Date Code Title Description
GR01 Patent grant