GB2614876A - Camera System - Google Patents

Info

Publication number
GB2614876A
Authority
GB
United Kingdom
Prior art keywords
image
image sensor
movement
under
display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB2200361.0A
Other versions
GB2614876B (en)
Inventor
Richards David
Carr Joshua
John Burbridge Daniel
Koveos Yannis
Garden Danny
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cambridge Mechatronics Ltd
Original Assignee
Cambridge Mechatronics Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cambridge Mechatronics Ltd filed Critical Cambridge Mechatronics Ltd
Priority to GB2200361.0A priority Critical patent/GB2614876B/en
Publication of GB2614876A publication Critical patent/GB2614876A/en
Application granted granted Critical
Publication of GB2614876B publication Critical patent/GB2614876B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B29/00Combinations of cameras, projectors or photographic printing apparatus with non-photographic non-optical apparatus, e.g. clocks or weapons; Cameras having the shape of other objects
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B30/00Camera modules comprising integrated lens units and imaging units, specially adapted for being embedded in other devices, e.g. mobile phones or vehicles
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B5/00Adjustment of optical system relative to image or object surface other than for focusing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637Details related to the display arrangement, including those related to the mounting of the display in the housing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/58Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/68Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/682Vibration or motion blur correction
    • H04N23/685Vibration or motion blur correction performed by mechanical compensation
    • H04N23/687Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

An under-display camera system 1 in which a transmissive display device 2 is mounted on a support structure 3 and an image capture system 10, including an image sensor 11 and a focussing lens 12, is located behind the transmissive display device 2. An actuator system 13, 43 moves an optical component (e.g. the sensor 11 or the lens 12 of the image capture system 10, or an optical element of a spatially non-uniform illumination system 40 located behind the display device 2) relative to the support structure 3. A control unit (31, fig. 5) controls the actuator system to move the optical component so that diffraction effects from the transmissive display device change, and captures images at different points during this movement; an image processing unit (32, fig. 5) combines the plural captured images to produce an image in which the diffraction effects are reduced. Movement may be translational or rotational relative to an optical axis of the capture system. The transmissive display device 2 may be e.g. an LCD panel with a waveguide backlight. The support structure 3 may be e.g. a housing of a mobile telephone.

Description

CAMERA SYSTEM
Field
The present invention relates to an under-display camera system.
Background
An image capture system may be arranged behind a transmissive display device to improve the aesthetics or reduce the size of a product incorporating the image capture system, for example a mobile telephone. However, when the image capture system is arranged behind the transmissive display device, images focussed on the image sensor may have diffraction effects from the transmissive display device. These may be formed by the elements of the transmissive display device having different transmissivities, typically including elements that have low transmissivity or are not transmissive. This reduces the quality of captured images, so the diffraction effects would desirably be reduced.
Summary
According to a first aspect of the present invention, there is provided an under-display camera system comprising: a transmissive display device mounted on a support structure; an image capture system comprising optical components including an image sensor for capturing an image and a lens for focussing an image on the image sensor, the image capture system being arranged behind the transmissive display device such that images focussed on the image sensor have diffraction effects from the transmissive display device; an actuator system arranged to drive movement of at least one optical component of the image capture system relative to the support structure in a manner that the diffraction effects from the transmissive display device change; a control unit arranged to control the actuator system to drive movement of the at least one optical component of the image capture system and to control the image sensor device to capture plural images having different diffraction effects from the transmissive display device; an image processing device arranged to perform an image combination process on the plural captured images to produce a combined image in which the diffraction effects are reduced.
In some types of embodiment, the movement of the at least one optical component of the image capture system driven by the actuator system may be translational movement perpendicular to an optical axis of the image capture system. In this case, the at least one optical component that is moved may comprise the lens only, or the lens and the image sensor together.
In some types of embodiment, the movement of the at least one optical component of the image capture system driven by the actuator system may be translational movement parallel to an optical axis of the image capture system. In this case, the at least one optical component that is moved may preferably comprise the image sensor only, but may alternatively comprise the lens only, or the lens and the image sensor together.
In some types of embodiment, the at least one optical component that is moved comprises the image sensor, the movement of the at least one optical component of the image capture system driven by the actuator system is rotational movement around an optical axis of the image capture system, and the control unit is arranged to detect rotational movement of the support structure and is arranged to control the actuator system to drive said rotational movement of the at least one optical component of the image capture system around an optical axis of the image capture system in a sense that counters the detected rotational movement of the support structure.
The image combination process may comprise: optionally, aligning the captured images; detecting regions of the captured images having diffraction effects; and correcting the detected regions.
The step of correcting the detected regions may comprise selecting pixel values from the plural images having a minimum value, a median value or a mean value.
The invention may be applied to an under-display imaging system in which the image sensor is a colour image sensor, or to an under-display imaging system in which the camera system is a structured light camera system or a time-of-flight camera system and the image sensor is optionally a monochrome image sensor.
According to a second aspect of the present invention, there is provided under-display camera system comprising: a transmissive display device mounted on a support structure; an image capture system comprising optical components including an image sensor for capturing an image and a lens for focussing an image on the image sensor, the image capture system being arranged behind the transmissive display device such that images focussed on the image sensor have diffraction effects from the transmissive display device; an illumination system arranged to output spatially non-uniform light onto a scene imaged by the image capture system; an actuator system arranged to drive movement of at least one optical component of the illumination system relative to the support structure to cause movement of the spatially non-uniform light on the scene, whereby the diffraction effects from the transmissive display device change; a control unit arranged to control the actuator system to drive movement of the at least one optical component of the illumination system and to control the image sensor device to capture plural images having changed diffraction effects from the transmissive display device; an image processing device arranged to perform an image combination process on the plural captured images to produce a combined image in which the diffraction effects are reduced.
The illumination system may be arranged behind the transmissive display device to reduce the size of a product incorporating the image capture system, but this is not essential.
The camera system may be a structured light camera system or a time-of-flight camera system which employs spatially non-uniform illumination as described in WO2020/030916 (which is incorporated by reference).
The movement of the spatially non-uniform light on the scene caused by the movement of the at least one optical component of the image capture system driven by the actuator system may be rotational movement.
The image combination process may comprise transforming at least one of the captured images to align the spatially non-uniform light in respect of each captured image; detecting regions of the captured images having diffraction effects; and correcting the detected regions.
The step of correcting the detected regions may comprise selecting pixel values from the plural images having a minimum value, a median value or a mean value.
The image sensor may be a monochrome image sensor.
The two aspects of the present invention may be combined together in a single image capture system.
Brief Description of the Drawings
Embodiments of the present invention will now be described, by way of example only, with reference to the accompanying drawings, which show the following.
Figs. 1 to 4 are schematic side views of an under-display camera system having four alternative constructions.
Fig. 5 is a diagram of a control unit and an image processing unit of the under-display camera system.
Fig. 6 is a flow chart of an image combination process performed in the image processing unit for images captured by an under-display camera system having a construction shown in Figs. 1 to 4.
Figs. 7 and 8 are schematic side views of an under-display camera system that is a spatially non-uniform light system having two alternative constructions.
Fig. 9 is a flow chart of an image combination process performed in the image processing unit for images captured by an under-display camera system having a construction as shown in Figs. 7 and 8.
Detailed Description
Figs. 1 to 4 show an under-display camera system 1 having four alternative constructions. In each case, the camera system 1 comprises the following components.
The under-display camera system 1 comprises a transmissive display device 2 mounted on a support structure 3. The transmissive display device 2 may be of any type that is transmissive to light, for example a liquid crystal display (LCD) panel incorporating a backlight in the form of a waveguide. Such a transmissive display device 2 has variable transmissivity across the structure of its pixels, typically including some regions that have relatively high transmissivity and other regions that have relatively low transmissivity or are not transmissive.
The support structure 3 may take any form and may be for example a housing of a product in which the under-display camera system 1 is incorporated, for example a mobile telephone.
The under-display camera system 1 comprises an image capture system 10 comprising optical components including an image sensor 11 for capturing an image and a lens 12 for focussing an image on the image sensor 11. The image sensor 11 may be of any type, for example being a CMOS (Complementary Metal Oxide Semiconductor) device. The image sensor 11 may be a colour image sensor or a monochrome image sensor, depending on the application.
The image capture system 10 is arranged behind the transmissive display device 2. As a result of the transmissivity of the transmissive display device 2, images may be captured on the image sensor 11. However, the images focussed on the image sensor 11 and captured thereby have diffraction effects from the transmissive display device 2, as a result of the variable transmissivity of the transmissive display device 2, which may be considered to act as a diffraction grating. This reduces the quality of individual captured images, but this problem is tackled in the under-display camera system 1 as follows.
The under-display camera system 1 comprises an actuator system 13 which is arranged to drive movement of at least one optical component of the image capture system 10 relative to the support structure 3. Different optical components of the image capture system 10 are moved in the three alternative constructions of Figs. 1 to 3 as follows.
In the construction of Fig. 1, the actuator system 13 drives movement of the image sensor 11 relative to the support structure 3.
In the construction of Fig. 2, the actuator system 13 drives movement of the lens 12 relative to the support structure 3.
In the construction of Fig. 3, the actuator system 13 drives movement of the image sensor 11 and lens 12 together relative to the support structure 3.
In the construction of Fig. 4, the camera system 1 is a spatially non-uniform light camera system, in which the image capture system 10 has a construction as shown in Fig. 2, and the image sensor 11 may be a monochrome image sensor. In addition, the camera system 1 further comprises an illumination system 40. The illumination system 40 comprises a light source 41 and an optical element 42 that is arranged to output spatially non-uniform light onto a scene imaged by the image capture system 10.
The illumination system 40 is arranged behind the transmissive display device in Fig. 4, but that is not essential.
The light source 41 may be a vertical-cavity surface-emitting laser (VCSEL). The optical element 42 may be a diffractive optical element (DOE). The illumination system 40 may take the form disclosed in detail in WO2020/030916.
The actuator system 13 may be arranged to drive movement using any suitable actuation technology. In one type of embodiment, the actuator system 13 may comprise shape memory alloy (SMA) wires, some examples of which are given below. Alternatively, the actuator system 13 may use any other type of actuator, including for example a voice-coil motor.
In each of the alternative constructions of Figs. 1 to 3, the actuator system 13 drives movement of the at least one optical component of the image capture system 10 in a manner that the diffraction effects from the transmissive display device change. Some examples of types of movement driven by the actuator system 13 are as follows.
In a first example, the movement of the at least one optical component of the image capture system 10 driven by the actuator system 13 is translational movement perpendicular to an optical axis of the image capture system 10. This causes relative motion between the transmissive display device 2, which acts as a diffraction grating, and the lens 12 (and optionally also the image sensor 11), which causes a difference in the location of the peaks of the diffraction from the transmissive display device 2, whether or not the image sensor 11 is moved together with the lens 12. Thus, in this example the at least one optical component of the image capture system 10 whose movement is driven may be the lens 12 only, or the lens 12 and the image sensor 11 together.
In this first example, the degree of movement of the lens 12 may typically be less than the pitch of the pixels of the transmissive display device 2, which act as a diffraction grating.
In this first example, where the actuator system 13 comprises SMA wires, this movement may be achieved by a configuration of four SMA wires as disclosed in WO2013/175197 or a configuration of eight SMA wires as disclosed in WO2011/104518.
In a second example, the movement of the at least one optical component of the image capture system 10 driven by the actuator system 13 is translational movement parallel to an optical axis of the image capture system 10. In this example, the movement changes the magnification of the diffraction effect created by the transmissive display device 2, which acts as a diffraction grating, for example by changing the distance from the image sensor 11 and/or the lens 12 to the transmissive display device 2. This effectively causes the diffraction pattern to change by spreading out as the magnification decreases, so that the effective size of the diffraction pattern increases. Thus, in this example the at least one optical component of the image capture system 10 whose movement is driven may preferably be the image sensor 11 only, but may alternatively be the lens 12 only, or the lens 12 and the image sensor 11 together.
In this second example, where the actuator system 13 comprises SMA wires, this movement may be achieved by a configuration of eight SMA wires as disclosed in WO2011/104518.
In a third example, the at least one optical component of the image capture system 10 whose movement is driven is the image sensor 11, and the movement driven by the actuator system 13 is rotational movement around an optical axis of the image capture system 10. This example may take advantage of rotational movement of the image sensor to provide optical image stabilisation. In that case, the control unit 31 detects rotational movement of the support structure 3. This may be detected from the output of the image sensor 11 or using an optional rotation sensor 33, if provided. Then, the control unit 31 controls the actuator system 13 to drive rotational movement of the image sensor 11 around an optical axis of the image capture system 10 in a sense that counters the detected rotational movement of the support structure 3.
In this third example, where the actuator system 13 comprises SMA wires, this movement may be achieved by a configuration of four SMA wires as disclosed in WO 2017/072525.
Fig. 5 shows a control unit 31 and an image processing unit 32 of the under-display camera system 1 that are used to reduce diffraction effects, as follows.
The control unit 31 is arranged to control the image sensor 11 and the actuator system 13.
During image capture, the control unit 31 controls the actuator system 13 to drive movement of the optical component of the image capture system, for example the image sensor 11, the lens 12 or the image sensor 11 and lens 12 together, so that the diffraction effects from the transmissive display device change. The control unit 31 also controls the image sensor 11 to capture plural images at different points during this movement. Thus, the plural captured images have different diffraction effects from the transmissive display device 2.
Any number of plural captured images may be used, for example two, three, or more. In general, increasing the number of captured images improves the correction at the expense of increasing the overall image capture time, so the choice of number is a balance between these factors.
The plural captured images are supplied to the image processing unit 32, which performs an image combination process on the plural captured images. The image combination process produces a combined image in which the diffraction effects are reduced. In general terms, as each of the plural captured images has different diffraction effects, it is possible to make use of the differing information in the plural captured images to reduce the diffraction effects in the combined image. It is possible to apply any image combination process which achieves this.
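The capture sequence described above can be sketched as follows. The `Actuator` and `ImageSensor` classes here are hypothetical stand-ins for the actuator system 13 and image sensor 11, not interfaces from the patent; the point is only that one frame is captured at each of several actuator positions, so each frame sees a differently placed diffraction pattern.

```python
class Actuator:
    """Hypothetical stand-in for actuator system 13: holds a position."""
    def __init__(self):
        self.position = 0.0

    def move_to(self, position):
        # Shifting the lens/sensor shifts the diffraction peaks.
        self.position = position


class ImageSensor:
    """Hypothetical stand-in for image sensor 11: returns a frame
    tagged with the actuator position at capture time."""
    def __init__(self, actuator):
        self.actuator = actuator

    def capture(self):
        return {"position": self.actuator.position}


def capture_plural_images(actuator, sensor, positions):
    """Capture one frame at each actuator position, so that each frame
    has different diffraction effects from the display."""
    frames = []
    for p in positions:
        actuator.move_to(p)
        frames.append(sensor.capture())
    return frames


actuator = Actuator()
sensor = ImageSensor(actuator)
frames = capture_plural_images(actuator, sensor, [0.0, 0.5, 1.0])
```

The positions (here 0.0, 0.5 and 1.0, in arbitrary units) are illustrative; in the first example above they would be displacements smaller than the display pixel pitch.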
Fig. 6 is a flow chart of an example of a suitable image combination process performed in the image processing unit 32, which is performed as follows.
Step S1 is performed in cases where the movement driven by the actuator system 13 causes the captured images to be misaligned on the image sensor 11. Whether that occurs is dependent on the type of movement, and does not occur in all cases. Where the captured images are not misaligned, step S1 is omitted. It may also be possible to omit step S1 if the degree of movement of the captured image relative to the image sensor is relatively small. Thus, step S1 is optional.
In step S1, the plural captured images are aligned. This may be performed by any conventional alignment process. The alignment may be performed based on the content of the captured images themselves. Alternatively, the alignment may be performed based on the position of the optical component driven by the actuator system 13. The relative alignment of each captured image is dependent on that position, so the images may be aligned by performing a transformation that is the inverse of the transformation at each given position.
In step S2, regions of the captured images where diffraction effects are present are identified. This may be done based on the difference between the pixel values in the plural captured images, for example comparing the difference with a threshold.
In step S3, the identified regions of the captured images having diffraction effects are corrected, thereby providing a combined image with improved quality. Any suitable correction process may be performed.
In one example, correction is performed by selecting pixel values from the plural images having a minimum value, a median value or a mean value.
As an alternative, step S2 may be omitted. In this case, in regions where diffraction effects are present the same effect is achieved in step S3, but in regions where there are no diffraction effects, step S3 has no effect on the quality of the image.
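The detection and correction steps of the image combination process can be sketched as follows for frames that are already aligned. The threshold value and the choice of the per-pixel minimum as the corrected value are illustrative, not specified by the patent.

```python
import numpy as np


def combine_images(images, threshold=10.0):
    """Detect pixels where the aligned frames disagree by more than
    `threshold` (candidate diffraction regions) and replace them with
    the per-pixel minimum; keep the per-pixel mean elsewhere."""
    stack = np.stack([np.asarray(img, dtype=np.float64) for img in images])
    spread = stack.max(axis=0) - stack.min(axis=0)  # inter-frame disagreement
    diffraction = spread > threshold                # detect diffraction regions
    combined = stack.mean(axis=0)
    combined[diffraction] = stack.min(axis=0)[diffraction]  # correct them
    return combined


# Three frames of a flat scene; each has a diffraction peak at a
# different pixel because the optics moved between captures.
frames = [np.full((4, 4), 100.0) for _ in range(3)]
frames[0][0, 0] += 50
frames[1][1, 1] += 50
frames[2][2, 2] += 50
result = combine_images(frames)
```

Because a diffraction peak brightens a given pixel in only some of the frames, the per-pixel minimum suppresses it, while undisturbed pixels keep their common value.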
Figs. 7 and 8 show two alternative constructions for an under-display camera system 1 that is a spatially non-uniform light camera system, as follows. In each construction, the under-display camera system 1 has the same construction as shown in Fig. 4, except as will be described.
The under-display camera system 1 comprises an image capture system 10 that is arranged as described above. The image sensor 11 may be a monochrome image sensor.
The under-display camera system 1 further comprises an illumination system 40. The illumination system 40 comprises a light source 41 and an optical element 42 that is arranged to output spatially non-uniform light onto a scene imaged by the image capture system 10.
The illumination system 40 is arranged behind the transmissive display device in Figs. 7 and 8, but that is not essential.
The light source 41 may be a vertical-cavity surface-emitting laser (VCSEL). The optical element 42 may be a diffractive optical element (DOE). The illumination system 40 may take the form disclosed in detail in WO2020/030916.
However, the difference from the construction as shown in Fig. 4 is that the actuator system 13 is replaced by an actuator system 43 arranged to drive movement of at least one optical component of the illumination system 40 relative to the support structure 3, instead of at least one optical component of the image capture system 10. Different optical components of the illumination system 40 are moved in the two alternative constructions of Figs. 7 and 8 as follows.
In the construction of Fig. 7, the actuator system 43 drives movement of the optical element 42 of the illumination system 40.
In the construction of Fig. 8, the actuator system 43 drives movement of the light source 41 of the illumination system 40.
The actuator system 43 may be arranged to drive movement using any suitable actuation technology. In one type of embodiment, the actuator system 43 may comprise SMA wires, some examples of which are given below. Alternatively, the actuator system 43 may use any other type of actuator, including for example a voice-coil motor.
In each of the alternative constructions of Figs. 7 and 8, the actuator system 43 drives movement of the at least one optical component of the illumination system 40 in a manner that causes movement of the spatially non-uniform light on the scene. As a result, there is a change in the diffraction effects on the images captured by the image sensor 11 from the transmissive display device 2.
The movement of the spatially non-uniform light on the scene caused by the movement of the at least one optical component of the illumination system driven by the actuator system is rotational movement. This may be achieved by rotational movement of the light source 41 or rotational movement of the optical element 42.
The control unit 31 and the image processing unit 32 of the under-display camera system 1 take the form shown in Fig. 5, except that the control unit 31 controls the actuator system 43 instead of the actuator system 13.
During image capture, the control unit 31 controls the actuator system 43 to drive movement of the at least one optical component of the illumination system 40, for example the light source 41 or the optical element 42, so that the spatially non-uniform light moves on the scene and the diffraction effects from the transmissive display device 2 change. The control unit 31 also controls the image sensor 11 to capture plural images at different points during this movement. Thus, the plural captured images have different diffraction effects from the transmissive display device 2. These diffraction effects could otherwise confuse the analysis of the spatially non-uniform light.
Again, any number of plural captured images may be used, for example two, three, or more. In general, increasing the number of captured images improves the correction at the expense of increasing the overall image capture time, so the choice of number is a balance between these factors.
The plural captured images are supplied to the image processing unit 32, which performs an image combination process on the plural captured images. The image combination process produces a combined image in which the diffraction effects are reduced. In general terms, as each of the plural captured images has different diffraction effects, it is possible to make use of the differing information in the plural captured images to reduce the diffraction effects in the combined image. It is possible to apply any image combination process which achieves this.
Fig. 9 is a flow chart of an example of a suitable image combination process performed in the image processing unit 32, which is performed as follows.
In step T1, the plural captured images are aligned. This may be performed by any conventional alignment process. The alignment may be performed based on the content of the captured images themselves. Alternatively, the alignment may be performed based on the position of the optical component driven by the actuator system 43. The alignment may be performed by performing a transformation which is the inverse of the transformation applied to the spatially non-uniform light.
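When alignment is based on the recorded actuator position, step T1 amounts to applying the inverse of a known transformation. The sketch below assumes a pure integer-pixel translation and uses `numpy.roll` purely for illustration; real alignment would use sub-pixel interpolation or content-based registration and handle image borders properly.

```python
import numpy as np

def align_by_known_shift(image, shift_xy):
    """Undo a known (dx, dy) translation, in integer pixels.

    Mirrors the variant of step T1 in which alignment uses the recorded
    position of the optical component: the applied transformation is
    inverted. np.roll is a toy stand-in; real code would crop or
    interpolate at the borders instead of wrapping.
    """
    dx, dy = shift_xy
    return np.roll(np.roll(image, -dy, axis=0), -dx, axis=1)

img = np.arange(16.0).reshape(4, 4)
shifted = np.roll(np.roll(img, 1, axis=0), 2, axis=1)  # simulated movement
restored = align_by_known_shift(shifted, (2, 1))       # inverse transform
```

Applying the inverse shift recovers the original image exactly in this idealised integer-shift case.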
In step T2, regions of the captured images where diffraction effects are present are identified. This may be done based on the difference between the pixel values in the plural captured images, for example comparing the difference with a threshold.
In step T3, the identified regions of the captured images having diffraction effects are corrected, thereby providing a combined image with improved quality. Any suitable correction process may be performed. In one example, correction is performed by selecting pixel values from the plural images having a minimum value, a median value or a mean value.
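Steps T2 and T3 can be sketched in a few lines of NumPy. This is one illustrative interpretation, not the claimed implementation: regions are flagged where the per-pixel spread across the aligned captures exceeds a threshold (step T2), and flagged pixels are replaced by the per-pixel median across captures (step T3; the minimum or mean would work the same way).

```python
import numpy as np

def combine(images, threshold=10.0):
    """Combine aligned captures, correcting regions with diffraction effects.

    images: list of aligned 2-D arrays of identical shape.
    Step T2: flag pixels whose value spread across captures exceeds threshold.
    Step T3: at flagged pixels, select the per-pixel median across captures;
    elsewhere, keep the first capture unchanged.
    """
    stack = np.stack(images).astype(float)
    spread = stack.max(axis=0) - stack.min(axis=0)  # difference between captures
    flagged = spread > threshold                    # T2: diffraction regions
    combined = stack[0].copy()
    combined[flagged] = np.median(stack, axis=0)[flagged]  # T3: correction
    return combined

a = np.full((3, 3), 100.0); a[1, 1] = 200.0  # artefact in first capture
b = np.full((3, 3), 100.0); b[0, 0] = 180.0  # artefact in a different place
c = np.full((3, 3), 100.0)
result = combine([a, b, c])
```

Because the artefacts fall in different places in each capture, the median at every flagged pixel is the uncorrupted value, and the combined image is artefact-free.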
As an alternative, step T2 may be omitted. In that case, step T3 is applied across the whole image: in regions where diffraction effects are present, the same correction is achieved, while in regions with no diffraction effects, step T3 has substantially no effect on the quality of the image.
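The variant without step T2 reduces to applying the selection of step T3 at every pixel. A minimal sketch, again assuming the median is the chosen selection: the per-pixel median across aligned captures suppresses artefacts where the captures differ and leaves unaffected regions essentially unchanged.

```python
import numpy as np

def combine_without_identification(images):
    """Per-pixel median across aligned captures (step T3 applied everywhere)."""
    return np.median(np.stack(images).astype(float), axis=0)

a = np.full((2, 2), 50.0); a[0, 1] = 250.0  # artefact in first capture only
b = np.full((2, 2), 50.0)
c = np.full((2, 2), 50.0)
result = combine_without_identification([a, b, c])
```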
In another alternative (not shown), the actuator system 43 of the illumination system 40 may be provided in addition to the actuator system 13 of the image capture system 10, that is, by modifying the under-display camera system 1 of either Fig. 7 or Fig. 8 to replace the image capture system 10 with the image capture system 10 of any of Figs. 1 to 3.

Claims (16)

  1. An under-display camera system comprising: a transmissive display device mounted on a support structure; an image capture system comprising optical components including an image sensor for capturing an image and a lens for focussing an image on the image sensor, the image capture system being arranged behind the transmissive display device such that images focussed on the image sensor have diffraction effects from the transmissive display device; an actuator system arranged to drive movement of at least one optical component of the image capture system relative to the support structure in a manner that the diffraction effects from the transmissive display device change; a control unit arranged to control the actuator system to drive movement of the at least one optical component of the image capture system and to control the image sensor to capture plural images having different diffraction effects from the transmissive display device; and an image processing device arranged to perform an image combination process on the plural captured images to produce a combined image in which the diffraction effects are reduced.
  2. An under-display imaging system according to claim 1, wherein the movement of the at least one optical component of the image capture system driven by the actuator system is translational movement perpendicular to an optical axis of the image capture system.
  3. An under-display imaging system according to claim 2, wherein the at least one optical component comprises the lens only, or the lens and the image sensor together.
  4. An under-display imaging system according to claim 1, wherein the movement of the at least one optical component of the image capture system driven by the actuator system is translational movement parallel to an optical axis of the image capture system.
  5. An under-display imaging system according to claim 4, wherein the at least one optical component comprises the image sensor only, the lens only, or the lens and the image sensor together.
  6. An under-display imaging system according to claim 1, wherein the at least one optical component comprises the image sensor, the movement of the at least one optical component of the image capture system driven by the actuator system is rotational movement around an optical axis of the image capture system, and the control unit is arranged to detect rotational movement of the support structure and is arranged to control the actuator system to drive said rotational movement of the at least one optical component of the image capture system around an optical axis of the image capture system in a sense that counters the detected rotational movement of the support structure.
  7. An under-display imaging system according to any one of the preceding claims, wherein the image combination process comprises: optionally, aligning the captured images; correcting regions of the captured images having diffraction effects.
  8. An under-display imaging system according to claim 7, wherein the step of correcting the regions of the captured images having diffraction effects comprises selecting pixel values from the plural images having a minimum value, a median value or a mean value.
  9. An under-display imaging system according to any one of the preceding claims, wherein the image sensor is a colour image sensor.
  10. An under-display imaging system according to any one of claims 1 to 8 or 11 to 16, wherein the camera system is a structured light camera system or a time-of-flight camera system and the image sensor is optionally a monochrome image sensor.
  11. An under-display camera system comprising: a transmissive display device mounted on a support structure; an image capture system comprising optical components including an image sensor for capturing an image and a lens for focussing an image on the image sensor, the image capture system being arranged behind the transmissive display device such that images focussed on the image sensor have diffraction effects from the transmissive display device; an illumination system arranged to output spatially non-uniform light onto a scene imaged by the image capture system; an actuator system arranged to drive movement of at least one optical component of the illumination system relative to the support structure to cause movement of the spatially non-uniform light on the scene, whereby the diffraction effects from the transmissive display device change; a control unit arranged to control the actuator system to drive movement of the at least one optical component of the illumination system and to control the image sensor to capture plural images having changed diffraction effects from the transmissive display device; and an image processing device arranged to perform an image combination process on the plural captured images to produce a combined image in which the diffraction effects are reduced.
  12. An under-display imaging system according to claim 11, wherein the illumination system is arranged behind the transmissive display device.
  13. An under-display imaging system according to claim 11 or 12, wherein the movement of the spatially non-uniform light on the scene caused by the movement of the at least one optical component of the illumination system driven by the actuator system is rotational movement.
  14. An under-display imaging system according to any one of claims 11 to 13, wherein the image combination process comprises: aligning the captured images; and correcting regions of the captured images having diffraction effects.
  15. An under-display imaging system according to claim 14, wherein the step of correcting the regions of the captured images comprises selecting pixel values from the plural images having a minimum value, a median value or a mean value.
  16. An under-display imaging system according to any one of claims 11 to 15, wherein the image sensor is a monochrome image sensor.
GB2200361.0A 2022-01-12 2022-01-12 Camera System Active GB2614876B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2200361.0A GB2614876B (en) 2022-01-12 2022-01-12 Camera System


Publications (2)

Publication Number Publication Date
GB2614876A (en) 2023-07-26
GB2614876B (en) 2024-03-06

Family

ID=86990824

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113777687A (en) * 2021-08-12 2021-12-10 惠州Tcl云创科技有限公司 Camera assembly under screen, backlight module and display device
CN114549581A (en) * 2021-12-24 2022-05-27 北京旷视科技有限公司 Image generation method, system, electronic device, storage medium, and program product
CN114779487A (en) * 2022-04-28 2022-07-22 深圳市安思疆科技有限公司 Optical device and optical system


