WO2021057731A1 - Image processing method and apparatus, electronic device, and medium - Google Patents


Info

Publication number
WO2021057731A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
target
brightness value
brightness
pixel
Prior art date
Application number
PCT/CN2020/116887
Other languages
English (en)
French (fr)
Inventor
杨卓坚
Original Assignee
维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Priority to JP2022508992A priority Critical patent/JP7397966B2/ja
Priority to EP20868126.2A priority patent/EP4037304A4/en
Publication of WO2021057731A1 publication Critical patent/WO2021057731A1/zh
Priority to US17/691,859 priority patent/US11678066B2/en

Classifications

    • H — ELECTRICITY
        • H04 — ELECTRIC COMMUNICATION TECHNIQUE
            • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N 23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
                    • H04N 23/10 — Generating image signals from different wavelengths
                        • H04N 23/13 — Generating image signals from different wavelengths with multiple sensors
                    • H04N 23/50 — Constructional details
                        • H04N 23/55 — Optical parts specially adapted for electronic image sensors; Mounting thereof
                    • H04N 23/57 — Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
                    • H04N 23/70 — Circuitry for compensating brightness variation in the scene
                        • H04N 23/71 — Circuitry for evaluating the brightness variation
                        • H04N 23/76 — Circuitry for compensating brightness variation in the scene by influencing the image signals
                    • H04N 23/80 — Camera processing pipelines; Components thereof
                • H04N 25/00 — Circuitry of solid-state image sensors [SSIS]; Control thereof
                    • H04N 25/40 — Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
                        • H04N 25/42 — Extracting pixel data by switching between different modes of operation using different resolutions or aspect ratios, e.g. switching between interlaced and non-interlaced mode

Definitions

  • This application relates to the field of data processing, and in particular to an image processing method, device, electronic equipment and medium.
  • Cutting holes in the display or backlight area can increase the screen-to-body ratio of a mobile phone to a certain extent.
  • However, when viewed from the front, the user will still see the black dot of the camera hole, so a true full-screen display has not yet been realized.
  • If the camera is placed below the display screen, a full-screen display can be realized, but because the display screen has only limited light transmittance, the brightness of the image captured by the camera is very low.
  • the present application provides an image processing method, device, electronic equipment, and medium to solve the problem of low brightness of the image captured by the camera below the display screen.
  • The first aspect of the present application provides an image processing method applied to an electronic device, where the electronic device includes an under-screen camera assembly and the under-screen camera assembly includes M color cameras.
  • The image processing method includes:
  • acquiring the first image taken by each of N color cameras among the M color cameras;
  • for each first image, acquiring the brightness value of at least some pixels in the first image;
  • using the acquired brightness values to increase the brightness of the target image, and using the brightness-enhanced target image as the image of the target object taken by the under-screen camera assembly, where the target image is the first image taken by one of the N color cameras;
  • M and N are both integers greater than or equal to 2, and N is less than or equal to M.
  • The second aspect of the present application provides an image processing device applied to an electronic device, where the electronic device includes an under-screen camera assembly and the under-screen camera assembly includes M color cameras.
  • the image processing device includes:
  • an image acquisition module, configured to acquire the first image taken by each of N color cameras among the M color cameras;
  • a brightness value obtaining module, configured to obtain, for each first image, the brightness value of at least some pixels in the first image;
  • a brightness enhancement module, configured to use the obtained brightness values to increase the brightness of the target image, and to use the brightness-enhanced target image as the image of the target object taken by the under-screen camera assembly, where the target image is the first image taken by one of the N color cameras;
  • M and N are both integers greater than or equal to 2, and N is less than or equal to M.
  • The third aspect of this application also provides an electronic device, which includes:
  • an under-screen camera assembly, where the under-screen camera assembly includes M color cameras; and
  • an image processing device, configured to acquire the first image taken by each of N color cameras among the M color cameras, to obtain, for each first image, the brightness value of at least some pixels in the first image, to use the obtained brightness values to increase the brightness of the target image, and to use the brightness-enhanced target image as the image of the target object taken by the under-screen camera assembly, where the target image is the first image taken by one of the N color cameras.
  • M and N are both integers greater than or equal to 2, and N is less than or equal to M.
  • The fourth aspect of this application also provides an electronic device.
  • The electronic device includes a processor, a memory, and a computer program stored on the memory and runnable on the processor.
  • When the computer program is executed by the processor, the image processing method of the first aspect of the application is implemented.
  • The present application also provides a computer storage medium on which computer program instructions are stored.
  • When the computer program instructions are executed by a processor, the image processing method of the first aspect of the present application is implemented.
  • In the embodiments of the present application, the brightness values of at least part of the pixels in the first images collected by the color cameras set under the screen are used to increase the brightness of the target image, and the brightness-enhanced target image is used as the image captured by the camera assembly, thereby improving the brightness of the image captured by the under-screen camera assembly while realizing a full-screen display.
  • FIG. 1 is a schematic flowchart of an embodiment of an image processing method provided by the first aspect of the application
  • Fig. 2 is a front view of an exemplary electronic device to which the image processing method of the first aspect of the present application can be applied;
  • FIG. 3 is a top view showing an example of an under-screen camera assembly including four color cameras in an exemplary electronic device
  • Fig. 4 is a front view of an example of a color camera in an exemplary electronic device
  • Figure 5 is an exploded view of the color camera in Figure 4.
  • FIG. 6 is a top view showing an example in which two color cameras are included in an exemplary electronic device
  • Figure 7 is an exploded view of the two color cameras in Figure 6;
  • FIG. 8 is a schematic diagram showing an example of a target area of a first image in an embodiment of the present application.
  • FIG. 9 is a schematic diagram showing an example of the correspondence between pixels in different first images in an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of an embodiment of an image processing device provided by the second aspect of the application.
  • FIG. 11 is a schematic structural diagram of an embodiment of an electronic device provided by this application.
  • FIG. 1 shows a schematic flowchart of an embodiment of an image processing method 100 applied to an electronic device according to the first aspect of the present application.
  • the electronic device 200 to which the image processing method 100 provided in the present application is applied includes:
  • the display panel 210 has a first surface 210a and a second surface 210b opposite to each other.
  • the first surface 210a is the display surface of the display panel 210, and the display panel 210 includes a light-transmitting area 210c.
  • the under-screen camera assembly 220 is disposed on the second surface 210b side of the display panel 210 and corresponds to the position of the light-transmitting area 210c.
  • the under-screen camera assembly 220 includes M color cameras. M is an integer greater than or equal to 2.
  • the display panel 210 in the light-transmitting area 210c may be made of light-transmitting materials such as glass or polyimide (PI).
  • electronic devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, and the like.
  • the M color cameras in the under-screen camera component 220 may be arranged according to a preset rule.
  • the M color cameras may be arranged along a preset circular arc track, arranged along a preset circular track, or arranged in an array.
  • the under-screen camera assembly 220 includes four color cameras arranged in an array, and FIG. 3 shows a top view of the four cameras.
  • each color camera in the under-screen camera assembly 220 has the same structure.
  • Figures 4 and 5 show a side view and an exploded view of the color camera.
  • the color camera includes a lens 407, a photosensitive chip 408, a circuit board 409 connected to the photosensitive chip 408, a lens base 410, and a filter assembly 411.
  • the circuit board 409 may be a flexible printed circuit (FPC).
  • the lens 407 is a condensing lens.
  • the lens holder 410 is provided on the circuit board 409.
  • the lens 407 and the filter assembly 411 are arranged on the lens holder 410.
  • the lens holder 410 includes a first mounting portion 410a and a second mounting portion 410b.
  • the first mounting portion 410a is used for mounting the lens 407.
  • The lens 407 may be connected to the first mounting portion 410a by threads, and multiple lens elements can be installed in the lens 407.
  • the second mounting portion 410b is used for mounting the filter assembly 411.
  • the filter assembly 411 is disposed in the cavity of the lens holder 410.
  • the lens 407 and the photosensitive chip 408 are arranged opposite to each other, and the photosensitive chip 408 is arranged on the circuit board 409 and is electrically connected to the circuit board 409.
  • the filter component 411 is located between the lens 407 and the photosensitive chip 408, and is used to implement the filter function during the shooting process.
  • The color camera further includes a connector 412; the connector 412 is disposed on the circuit board 409 and electrically connected to it, and is used to connect an external device to supply power to the color camera and transmit information.
  • Two color cameras of the under-screen camera assembly 220 may share the circuit board 409 and the connector 412. Figures 6 and 7 show a top view and an exploded view of two color cameras sharing the circuit board 409 and the connector 412.
  • By disposing the under-screen camera assembly 220 on the second surface 210b side of the display panel 210, there is no need to dig holes in the display screen or the backlight area, and a full-screen display of the electronic device can be realized.
  • the light-transmitting area 210c of the display panel 210 also needs to have a display function, and the light-transmitting area 210c has only a certain light transmittance. Therefore, in order to ensure that the image captured by the under-screen camera component 220 under the display panel 210 has a higher brightness, the image processing method shown in FIG. 1 needs to be used.
  • the image processing method 100 provided by the present application includes S110 to S130.
  • the acquired brightness value is used to increase the brightness of the target image, and the target image with the increased brightness is used as the captured image of the under-screen camera component 220.
  • the target image is the first image taken by one of the N color cameras.
  • N is an integer greater than or equal to 2, and N is less than or equal to M.
  • an independent lens 407 is provided above the photosensitive chip 408 of each color camera in the under-screen camera assembly 220, which can image separately.
  • the combination of the photosensitive chip 408 and the lens 407 in the color camera can ensure that each color camera has a certain light transmittance.
  • At least two color cameras are provided on the second surface 210b side of the display panel 210, corresponding to the light-transmitting area 210c, which can increase the brightness collected by the under-screen camera assembly 220, that is, improve the light transmittance of the under-screen camera assembly 220.
  • The brightness of the selected target image is increased by using the brightness values of at least part of the pixels in the first image collected by each color camera, and the brightness-enhanced target image is used as the captured image of the under-screen camera assembly 220, thereby improving the brightness of the image captured by the under-screen camera assembly 220.
  • an image taken by a color camera located at a preset position among the N color cameras may be used as the target image.
  • the image taken by the color camera at the center position can be used as the target image.
  • S120 includes S1201 to S1205.
  • the target shooting distance between the under-screen camera component 220 and the target object photographed by the under-screen camera component 220 is acquired.
  • the target shooting distance is the distance between the under-screen camera component 220 and the target object to be photographed. Since the distance between each color camera in the under-screen camera component 220 is very close, it can be considered that the shooting distance of each color camera shooting the target object is the same.
  • When the under-screen camera assembly 220 needs to photograph a target object, the light-transmitting area 210c above the under-screen camera assembly 220 should be extinguished, that is, its display function is suspended.
  • the under-screen camera assembly 220 may be calibrated in advance, that is, at different shooting distances, each color camera of the under-screen camera assembly 220 is used to photograph the calibrated object.
  • the location area of the calibration object in the image is taken as the predetermined target area of the image.
  • the calibration object can be the target object to be photographed, or other photographic objects. If the calibration object is the target object, the predetermined target area of the image taken by the color camera at any shooting distance stored in advance is the location area where the target object is located in the image.
  • the predetermined target area may be expressed using a pixel coordinate range.
  • the number of pixels in the predetermined target area of the image captured by each color camera is the same.
  • For each pixel in the predetermined target area of an image, there is a pixel at the corresponding position with the same color information in the predetermined target area of the image captured by each of the other color cameras.
  • As an example, the under-screen camera assembly 220 includes 4 color cameras, and it is assumed that the size of the first image taken of the target object by each camera is 8 pixels * 8 pixels.
  • the area of 4 pixels*4 pixels in the thick black frame in each first image is the target area corresponding to the first image.
  • The target area in FIG. 8 is only schematic, and the image size in FIG. 8 is only for reference and is not specifically limited. In some embodiments, the first image may be, for example, 8 megapixels.
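  • The calibration scheme described above can be sketched as a lookup from shooting distance to a pre-stored pixel-coordinate range. The following is a minimal illustration only; the camera IDs, calibrated distances, and coordinate values are hypothetical and do not appear in the patent.

```python
# Sketch of a calibration lookup: for each color camera, a table maps a
# calibrated shooting distance (in cm) to the predetermined target area,
# expressed as a pixel-coordinate range (row_start, col_start, row_end, col_end).
# All names and values here are illustrative, not from the patent.

CALIBRATION = {
    # camera_id -> {distance_cm: (r0, c0, r1, c1)}
    0: {30: (2, 2, 6, 6), 50: (1, 1, 5, 5)},
    1: {30: (2, 0, 6, 4), 50: (1, 0, 5, 4)},
}

def target_area(camera_id: int, shooting_distance_cm: float):
    """Return the predetermined target area stored for the calibrated
    distance closest to the measured target shooting distance."""
    table = CALIBRATION[camera_id]
    nearest = min(table, key=lambda d: abs(d - shooting_distance_cm))
    return table[nearest]

print(target_area(0, 33))  # nearest calibrated distance is 30 cm -> (2, 2, 6, 6)
```

In practice the stored ranges would come from photographing a calibration object at each distance, as the passage above describes.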
  • The brightness value of each pixel in the target area of each first image needs to be used to increase the brightness of the pixels in the target area of the target image; therefore, the brightness values of all pixels in the target area of each first image need to be acquired first.
  • After acquiring the brightness value of each pixel in the target area of each first image, step S140 includes S1401 to S1402.
  • each pixel in the target area of the target image is taken as the first target pixel.
  • The brightness of each pixel in the target area of the target image needs to be increased; therefore, each pixel in the target area of the target image can be used as the first target pixel.
  • For each first target pixel, the brightness value of the first target pixel and the first brightness value are added to obtain the first target brightness value, and the brightness value of the first target pixel is increased to the first target brightness value.
  • the first brightness value is the sum of the brightness values of the pixel points corresponding to the first target pixel point in the target area of each first image except the target image.
  • each first target pixel has a pixel with a corresponding position and the same color information in the target area of each first image except the target image.
  • the pixels in the i-th row and j-th column of each target area are pixels in corresponding positions, and the color information of the pixels with corresponding positions in each target area is the same.
  • the pixels in the first row and the first column of each target area are corresponding 4 pixels, and the color information of the four pixels is red.
  • Assume that the first image taken by the first color camera is the target image, and the pixel in the first row and first column of the target area in the upper left corner of Figure 8 is the first target pixel, with brightness value R1.
  • Assume that the brightness value of the pixel in the first row and first column of the target area in the upper right corner of Fig. 8 is R2, that of the lower left corner is R3, and that of the lower right corner is R4.
  • Then the first brightness value R' = R2 + R3 + R4, and the first target brightness value is R1 + R' = R1 + R2 + R3 + R4.
  • In this way, the brightness value of each pixel in the target area of the target image can be improved. Then, the target image with the brightness of its target area increased is used as the image of the target object taken by the under-screen camera assembly 220.
  • The brightness values of all pixels in the target area of each first image are used to increase the brightness value of each pixel in the target area of the target image, thereby improving the brightness of the image captured by the under-screen camera assembly 220.
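  • The target-area enhancement of S1401 to S1402 and the R1 + R2 + R3 + R4 example can be sketched as follows. The array shapes and values are purely illustrative; real processing would operate on per-channel sensor data, and saturation handling is omitted.

```python
import numpy as np

# Minimal sketch: add, to each pixel in the target area of the target image,
# the sum of the brightness values of the corresponding pixels in the target
# areas of the other N-1 first images (the "first brightness value").

def enhance_target_area(target, others, area):
    """target: 2-D brightness array of the target image;
    others: target-area crops (same shape as the area) from the other images;
    area: (r0, c0, r1, c1) target-area bounds within the target image."""
    r0, c0, r1, c1 = area
    out = target.astype(np.int64)            # widen to avoid overflow
    first_brightness = np.sum([o.astype(np.int64) for o in others], axis=0)
    out[r0:r1, c0:c1] += first_brightness    # R1 becomes R1 + R2 + ... + RN
    return out

# 4 cameras (N = 4), mirroring the Figure 8 example.
t = np.full((4, 4), 10)                      # target image, brightness R1 = 10
o = [np.full((2, 2), v) for v in (20, 30, 40)]  # crops with R2, R3, R4
print(enhance_target_area(t, o, (0, 0, 2, 2))[0, 0])  # 10+20+30+40 = 100
```

Pixels outside the target area are left unchanged here; the patent handles them separately with the preset-multiple adjustment described below.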
  • the fields of view of all the color cameras in the under-screen camera assembly 220 have the same overlapping area. Since the field of view of each color camera has the same overlapping area, the image captured by each color camera has its own corresponding target area.
  • The closer the color cameras in the under-screen camera assembly 220 are set to each other, the more pixels are included in the target area of each first image, which can further improve the overall brightness of the image captured by the under-screen camera assembly 220.
  • The brightness value of each pixel in the non-target area of the target image may also be obtained, and then the brightness value of each pixel in the non-target area of the target image is increased by a preset multiple.
  • the preset multiple is N-1 times.
  • That is, the brightness value of each pixel in the non-target area of the target image is adjusted to N times its original value.
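  • This non-target-area adjustment can be sketched as follows; the function and values are a hypothetical illustration, and clipping to the valid pixel range is omitted for brevity.

```python
import numpy as np

# Sketch: pixels outside the target area receive no cross-camera sum, so to
# keep the image uniform their brightness is increased by the preset multiple
# of N-1 times, i.e. scaled to N times the original value.

def scale_non_target(image, area, n_cameras):
    """Multiply every pixel outside `area` by n_cameras (N), leaving the
    target area itself untouched."""
    r0, c0, r1, c1 = area
    out = image.astype(np.int64) * n_cameras   # N times everywhere...
    out[r0:r1, c0:c1] = image[r0:r1, c0:c1]    # ...then restore target area
    return out

img = np.full((4, 4), 5)
res = scale_non_target(img, (1, 1, 3, 3), 4)   # N = 4 cameras
print(res[0, 0], res[1, 1])  # non-target pixel: 20, target-area pixel: 5
```

In a full pipeline this would run after the target-area summation, so both regions end up brightened consistently.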
  • In other embodiments, in step S120, for each first image, the brightness values of all pixels in the first image are acquired.
  • After obtaining the brightness values of all pixels in each first image, step S140 includes S1401' to S1402'.
  • each pixel in the target image is taken as the second target pixel.
  • the second brightness value is the sum of the brightness values of the pixel points corresponding to the second target pixel point in each first image except the target image.
  • The size of each first image is the same; the pixels in the i-th row and j-th column of each first image are regarded as corresponding pixels, and the color information of corresponding pixels in each first image is the same.
  • the pixels in the first row and the second column in each first image are corresponding 4 pixels, and the color information of the four pixels is green.
  • Assume that the brightness value of the pixel in the first row and second column of the first image in the upper left corner of Figure 9 is G1, that of the upper right corner is G2, that of the lower left corner is G3, and that of the lower right corner is G4.
  • Then the second brightness value G' = G2 + G3 + G4, and the second target brightness value is G1 + G' = G1 + G2 + G3 + G4.
  • In this way, the brightness value of each pixel in the target image can be increased. Then, the target image with the brightness of each pixel increased is used as the image of the target object taken by the under-screen camera assembly 220.
  • Since the distance between the color cameras of the under-screen camera assembly 220 is very small, the differences between the first images can be ignored; there is no need to calibrate a target area in advance, and the brightness values of all pixels in each first image can be used directly to increase the brightness of all pixels in the target image, which improves the brightness of the image captured by the under-screen camera assembly 220 and increases the image processing speed.
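  • This calibration-free variant reduces to a direct element-wise sum of corresponding pixels over all N first images, mirroring the G1 + G2 + G3 + G4 example above. A minimal sketch with illustrative names and values:

```python
import numpy as np

# Sketch of the variant without a calibrated target area: corresponding pixels
# (same row, same column, same color) of all N first images are summed, so
# every pixel of the target image is enhanced. Saturation handling is omitted.

def enhance_full_image(first_images):
    """first_images: list of same-shaped 2-D brightness arrays from the
    N color cameras; the first one is taken as the target image."""
    stack = np.stack([img.astype(np.int64) for img in first_images])
    return stack.sum(axis=0)  # each pixel becomes G1 + G2 + ... + GN

imgs = [np.full((2, 2), v) for v in (1, 2, 3, 4)]  # N = 4 first images
print(enhance_full_image(imgs)[0, 0])  # 1+2+3+4 = 10
```

Because no per-distance target area has to be looked up, this path trades a small accuracy loss (the images are treated as perfectly aligned) for processing speed, as the passage above notes.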
  • FIG. 10 shows a schematic structural diagram of an embodiment of an image processing apparatus applied to the above-mentioned electronic equipment provided by the second aspect of the present application.
  • the image processing apparatus 1000 provided by the embodiment of the present application includes:
  • the image acquisition module 1010 is configured to acquire the first image taken by each of the N color cameras in the M color cameras;
  • the brightness value obtaining module 1020 is configured to obtain the brightness value of at least some pixels in the first image for each first image;
  • The brightness enhancement module 1030 is configured to use the acquired brightness values to enhance the brightness of the target image, and to use the brightness-enhanced target image as the image of the target object captured by the under-screen camera assembly, where the target image is the first image taken by one of the N color cameras.
  • In the embodiment of the present application, the brightness values of at least part of the pixels in the first images collected by the color cameras, which are set on the second surface 210b side of the display panel 210 and correspond to the light-transmitting area 210c, are used to increase the brightness of the target image, and the brightness-enhanced target image is used as the captured image of the under-screen camera assembly 220, so that the brightness of the captured image of the under-screen camera assembly 220 is improved while realizing a full-screen display.
  • the brightness value obtaining module 1020 includes:
  • the target shooting distance acquiring unit is configured to acquire the target shooting distance between the under-screen camera component 220 and the target object photographed by the under-screen camera component 220.
  • the target area determining unit is configured to obtain the target area of each first image at the target shooting distance based on the predetermined target area in the image taken by each color camera at different shooting distances stored in advance.
  • the first brightness value obtaining unit is configured to obtain, for each first image, the brightness value of each pixel in the target area of the first image.
  • the predetermined target area of the image taken by the color camera is the location area where the target object in the image is located.
  • the brightness enhancement module 1030 is used for:
  • For each first target pixel, the brightness value of the first target pixel and the first brightness value are added to obtain the first target brightness value, and the brightness value of the first target pixel is increased to the first target brightness value.
  • the first brightness value is the sum of the brightness values of the pixel points corresponding to the first target pixel point in the target area of each first image except the target image.
  • the brightness enhancement module 1030 is also used for:
  • the brightness value of each pixel in the non-target area of the acquired target image is increased by a preset multiple.
  • the preset multiple is N-1 times.
  • the brightness value obtaining module 1020 is used for:
  • the brightness values of all pixels in the first image are acquired.
  • the brightness enhancement module 1030 is also used for:
  • the second brightness value is the sum of the brightness values of the pixel points corresponding to the second target pixel point in each first image except the target image.
  • the third aspect of the present application also provides an electronic device, including:
  • An under-screen camera assembly where the under-screen camera assembly includes M color cameras;
  • an image processing device, configured to acquire the first image taken by each of N color cameras among the M color cameras, to obtain, for each first image, the brightness value of at least some pixels in the first image, to use the acquired brightness values to increase the brightness of the target image, and to use the brightness-enhanced target image as the image of the target object captured by the under-screen camera assembly, where the target image is the first image taken by one of the N color cameras;
  • M and N are both integers greater than or equal to 2, and N is less than or equal to M.
  • the electronic device provided by the embodiment of the present application may include the display panel 210 in FIG. 2.
  • the under-screen camera assembly can be used for the under-screen camera assembly 220 described above with reference to FIG. 2 and any of its embodiments.
  • the image processing device may be the image processing device 1000 described with reference to FIG. 10 and any of its embodiments.
  • the fourth aspect of the present application also provides an electronic device, which includes:
  • a memory for storing a program; and
  • a processor, configured to run the program stored in the memory to execute each step in the image processing method of the embodiments of the present application, where the method includes: acquiring the first image taken by each of N color cameras among the M color cameras; for each first image, obtaining the brightness value of at least part of the pixels in the first image; using the obtained brightness values to increase the brightness of the target image; and using the brightness-enhanced target image as the image of the target object captured by the under-screen camera assembly 220, where the target image is the first image captured by one of the N color cameras.
  • the electronic device provided in the present application can implement each process in any embodiment of the image processing method of the first aspect of the present application. To avoid repetition, details are not described herein again.
  • In the embodiment of the present application, the brightness values of at least part of the pixels in the first images collected by the color cameras, which are set on the second surface 210b side of the display panel 210 and correspond to the light-transmitting area 210c, are used to increase the brightness of the target image, and the brightness-enhanced target image is used as the captured image of the under-screen camera assembly 220, so that the brightness of the captured image of the under-screen camera assembly 220 is improved while realizing a full-screen display.
  • FIG. 11 is a schematic structural diagram of an embodiment of an electronic device provided by the fourth aspect of the application.
  • the electronic device 1100 includes, but is not limited to: a radio frequency unit 1101, a network module 1102, an audio output unit 1103, an input unit 1104, a sensor 1105, a display unit 1106, a user input unit 1107, an interface unit 1108, a memory 1109, a processor 1110, and Power supply 1111 and other components.
  • the electronic device 1100 also includes a first screen and a second screen.
  • Those skilled in the art can understand that the structure of the electronic device shown in FIG. 11 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than those shown in the figure, a combination of certain components, or a different arrangement of components.
  • electronic devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, and pedometers.
  • the processor 1110 is configured to acquire a first image captured by each of N color cameras among the M color cameras; for each first image, acquire the brightness values of at least some of the pixels in the first image; use the acquired brightness values to increase the brightness of a target image; and use the brightness-increased target image as the image of the target object captured by the under-screen camera assembly 220, where the target image is the first image captured by one of the N color cameras.
  • the brightness values of at least some of the pixels in the first image collected by each color camera, which is set on the second surface 210b side of the display panel 210 at a position corresponding to the light-transmitting area 210c, are used to increase the brightness of the target image, and the brightness-increased target image is used as the captured image of the under-screen camera assembly 220, so that the brightness of the image captured by the under-screen camera assembly 220 is improved while a full-screen display is realized.
  • the radio frequency unit 1101 can be used to receive and send signals during the process of sending and receiving information or during a call. Specifically, downlink data received from a base station is passed to the processor 1110 for processing, and uplink data is sent to the base station.
  • the radio frequency unit 1101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 1101 can also communicate with the network and other devices through a wireless communication system.
  • the electronic device provides users with wireless broadband Internet access through the network module 1102, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 1103 can convert the audio data received by the radio frequency unit 1101 or the network module 1102 or stored in the memory 1109 into audio signals and output them as sounds. Moreover, the audio output unit 1103 may also provide audio output related to a specific function performed by the electronic device 1100 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 1103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 1104 is used to receive audio or video signals.
  • the input unit 1104 may include a graphics processing unit (GPU) 11041 and a microphone 11042. The graphics processor 11041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frames can be displayed on the display unit 1106.
  • the image frame processed by the graphics processor 11041 may be stored in the memory 1109 (or other storage medium) or sent via the radio frequency unit 1101 or the network module 1102.
  • the microphone 11042 can receive sound, and can process such sound into audio data.
  • in a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 1101 and output.
  • the electronic device 1100 further includes at least one sensor 1105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 11061 according to the ambient light, and the proximity sensor can turn off the display panel 11061 and/or the backlight when the electronic device 1100 is moved to the ear.
  • as one kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the electronic device (such as switching between landscape and portrait, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tapping); the sensor 1105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not repeated here.
  • the display unit 1106 is used to display information input by the user or information provided to the user.
  • the display unit 1106 may include a display panel 11061, and the display panel 11061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 1107 can be used to receive inputted numeric or character information, and generate key signal inputs related to user settings and function control of the electronic device.
  • the user input unit 1107 includes a touch panel 11071 and other input devices 11072.
  • the touch panel 11071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 11071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 11071 may include two parts, a touch detection device and a touch controller.
  • the touch detection device detects the position of the user's touch, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends the coordinates to the processor 1110, and receives and executes commands sent by the processor 1110.
  • the touch panel 11071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 1107 may also include other input devices 11072.
  • other input devices 11072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 11071 can be overlaid on the display panel 11061.
  • when the touch panel 11071 detects a touch operation on or near it, it transmits the operation to the processor 1110 to determine the type of the touch event, and the processor 1110 then provides a corresponding visual output on the display panel 11061 according to the type of the touch event.
  • although the touch panel 11071 and the display panel 11061 are shown as two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 11071 and the display panel 11061 can be integrated to implement these input and output functions; this is not specifically limited here.
  • the interface unit 1108 is an interface for connecting an external device and the electronic device 1100.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 1108 can be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements in the electronic device 1100, or to transfer data between the electronic device 1100 and the external device.
  • the memory 1109 can be used to store software programs and various data.
  • the memory 1109 may mainly include a storage program area and a storage data area.
  • the program storage area may store an operating system and application programs required by at least one function (such as a sound playback function or an image playback function); the data storage area may store data created through the use of the phone (such as audio data and a phone book).
  • the memory 1109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • the processor 1110 is the control center of the electronic device. It connects the various parts of the entire electronic device through various interfaces and lines, and performs the various functions of the electronic device and processes data by running or executing the software programs and/or modules stored in the memory 1109 and calling the data stored in the memory 1109, thereby monitoring the electronic device as a whole.
  • the processor 1110 may include one or more processing units; preferably, the processor 1110 may integrate an application processor, which mainly handles the operating system, user interface, and application programs, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 1110.
  • the electronic device 1100 may also include a power source 1111 (such as a battery) for supplying power to various components.
  • the power supply 1111 may be logically connected to the processor 1110 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
  • the electronic device 1100 includes some functional modules not shown, which will not be repeated here.
  • the fifth aspect of the present application also provides a computer-readable storage medium.
  • the computer-readable storage medium stores a computer program.
  • when the computer program is executed by a processor, each process of the embodiments of the image processing method of the first aspect of the present application is implemented.
  • examples of the computer-readable storage medium include non-transitory computer-readable storage media, such as a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.
  • such a processor may be, but is not limited to, a general-purpose processor, a dedicated processor, an application-specific processor, or a field-programmable logic array. It should also be understood that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can also be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Exposure Control For Cameras (AREA)
  • Cameras In General (AREA)
  • Camera Bodies And Camera Details Or Accessories (AREA)

Abstract

This application discloses an image processing method and apparatus, an electronic device, and a medium. The method is applied to an electronic device that includes an under-screen camera assembly, the under-screen camera assembly including M color cameras. The method includes: acquiring a first image captured by each of N color cameras among the M color cameras; for each first image, acquiring brightness values of at least some of the pixels in the first image; and using the acquired brightness values to increase the brightness of a target image and using the brightness-increased target image as the captured image of the under-screen camera assembly, the target image being the first image captured by one of the N color cameras; where M and N are both integers greater than or equal to 2, and N is less than or equal to M.

Description

Image processing method and apparatus, electronic device, and medium
Cross-reference to related applications
This application claims priority to Chinese Patent Application No. 201910921251.6, filed in China on September 27, 2019, the entire contents of which are incorporated herein by reference.
Technical field
This application relates to the field of data processing, and in particular to an image processing method and apparatus, an electronic device, and a medium.
Background
With the rapid development of mobile smart electronic devices, user demand for them keeps growing, and handset manufacturers do their utmost to increase the screen-to-body ratio of electronic devices to improve the user experience. However, because a front camera lens must have a certain light transmittance, the display forms a black block in the camera lens region that cannot show content, which degrades the user experience.
At present, cutting a hole in the display or backlight region, for example the waterdrop-notch design, can raise the screen-to-body ratio of a phone to some extent. However, viewed from the front, the user still sees the black dot of the camera hole, so a full-screen display is still not fully realized.
Placing the camera beneath the display can realize a full-screen display, but because the display has only a limited light transmittance, the images captured by the camera have very low brightness.
Summary
This application provides an image processing method and apparatus, an electronic device, and a medium, to solve the problem that images captured by a camera beneath a display have low brightness.
To solve the above technical problem, a first aspect of this application provides an image processing method applied to an electronic device, the electronic device including an under-screen camera assembly, and the under-screen camera assembly including M color cameras;
the image processing method including:
acquiring a first image captured by each of N color cameras among the M color cameras;
for each first image, acquiring brightness values of at least some of the pixels in the first image;
using the acquired brightness values to increase the brightness of a target image, and using the brightness-increased target image as the captured image of the under-screen camera assembly, the target image being the first image captured by one of the N color cameras;
where M and N are both integers greater than or equal to 2, and N is less than or equal to M.
In a second aspect, this application provides an image processing apparatus applied to an electronic device, the electronic device including an under-screen camera assembly, and the under-screen camera assembly including M color cameras,
the image processing apparatus including:
an image acquisition module, configured to acquire a first image captured by each of N color cameras among the M color cameras;
a brightness value acquisition module, configured to acquire, for each first image, brightness values of at least some of the pixels in the first image; and
a brightness enhancement module, configured to use the acquired brightness values to increase the brightness of a target image, and to use the brightness-increased target image as the image of the target object captured by the under-screen camera assembly, the target image being the first image captured by one of the N color cameras;
where M and N are both integers greater than or equal to 2, and N is less than or equal to M.
In a third aspect, this application further provides an electronic device, including:
an under-screen camera assembly including M color cameras; and
an image processing apparatus, configured to acquire a first image captured by each of N color cameras among the M color cameras; for each first image, acquire brightness values of at least some of the pixels in the first image; use the acquired brightness values to increase the brightness of a target image; and use the brightness-increased target image as the image of the target object captured by the under-screen camera assembly, the target image being the first image captured by one of the N color cameras; where M and N are both integers greater than or equal to 2, and N is less than or equal to M.
In a fourth aspect, this application further provides an electronic device including a processor, a memory, and a computer program stored in the memory and runnable on the processor, where the computer program, when executed by the processor, implements the image processing method of the first aspect of this application.
In a fifth aspect, this application further provides a computer storage medium storing computer program instructions, where the computer program instructions, when executed by a processor, implement the image processing method of the first aspect of this application.
In the embodiments of this application, the brightness values of at least some of the pixels in the first image collected by each color camera set beneath the screen are used to increase the brightness of the target image, and the brightness-increased target image is used as the captured image of the camera assembly, so that the brightness of the image captured by the under-screen camera assembly is improved while a full-screen display is realized.
Brief description of the drawings
The present application can be better understood from the following description of specific embodiments of the present application taken in conjunction with the accompanying drawings, in which the same or similar reference numerals denote the same or similar features.
FIG. 1 is a schematic flowchart of an embodiment of the image processing method provided by the first aspect of this application;
FIG. 2 is a front view of an exemplary electronic device to which the image processing method of the first aspect of this application can be applied;
FIG. 3 is a top view showing an example of an under-screen camera assembly including four color cameras in the exemplary electronic device;
FIG. 4 is a front view of an example of one color camera in the exemplary electronic device;
FIG. 5 is an exploded view of the color camera in FIG. 4;
FIG. 6 is a top view showing an example including two color cameras in the exemplary electronic device;
FIG. 7 is an exploded view of the two color cameras in FIG. 6;
FIG. 8 is a schematic diagram showing an example of the target region of a first image in an embodiment of this application;
FIG. 9 is a schematic diagram showing an example of the correspondence between pixels in different first images in an embodiment of this application;
FIG. 10 is a schematic structural diagram of an embodiment of the image processing apparatus provided by the second aspect of this application;
FIG. 11 is a schematic structural diagram of an embodiment of the electronic device provided by this application.
Detailed description
The technical solutions in the embodiments of this application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of this application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of this application without creative effort fall within the protection scope of this application.
FIG. 1 shows a schematic flowchart of an embodiment of the image processing method 100 applied to an electronic device, provided by the first aspect of this application. As shown in FIG. 2, the electronic device 200 to which the image processing method 100 of this application is applied includes:
a display panel 210 having a first surface 210a and a second surface 210b opposite to each other, the first surface 210a being the display face of the display panel 210, and the display panel 210 including a light-transmitting area 210c; and
an under-screen camera assembly 220, set on the second surface 210b side of the display panel 210 at a position corresponding to the light-transmitting area 210c, the under-screen camera assembly 220 including M color cameras, where M is an integer greater than or equal to 2.
The display panel 210 in the light-transmitting area 210c may be made of a light-transmitting material such as glass or polyimide (PI).
In the embodiments of this application, electronic devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, and wearable devices.
In the embodiments of this application, the M color cameras in the under-screen camera assembly 220 may be arranged according to a preset rule. For example, the M color cameras may be arranged along a preset arc, along a preset circle, or in an array. As a specific example, the under-screen camera assembly 220 includes four color cameras arranged in an array; FIG. 3 shows a top view of these four cameras.
In some embodiments of this application, every color camera in the under-screen camera assembly 220 has the same structure. FIGS. 4 and 5 show a side view and an exploded view of a color camera. As shown in FIG. 4, the color camera includes a lens 407, a photosensitive chip 408, a circuit board 409 connected to the photosensitive chip 408, a lens holder 410, and a filter assembly 411. As an example, the circuit board 409 may be a flexible printed circuit (FPC). The lens 407 is a condensing lens.
Referring to FIG. 4, the lens holder 410 is set on the circuit board 409, and the lens 407 and the filter assembly 411 are set on the lens holder 410. Referring to FIG. 5, the lens holder 410 includes a first mounting portion 410a for mounting the lens 407 and a second mounting portion 410b for mounting the filter assembly 411. In some embodiments, the lens 407 may be connected to the first mounting portion 410a by a thread, and multiple lens elements may be mounted inside the lens 407. The filter assembly 411 is set inside the cavity of the lens holder 410.
Referring to FIG. 4, the lens 407 faces the photosensitive chip 408, and the photosensitive chip 408 is set on and electrically connected to the circuit board 409. Referring to FIG. 5, the filter assembly 411 is located between the lens 407 and the photosensitive chip 408 and implements the filtering function during shooting.
In the embodiments of this application, the color camera further includes a connector 412, which is set on the circuit board 409 and electrically connected to it; the connector 412 is used to connect to an external device to supply power to, and transfer information with, the color camera.
In some embodiments of this application, to save internal space in the electronic device, two color cameras of the under-screen camera assembly 220 may share a circuit board 409 and a connector 412. FIGS. 6 and 7 show a top view and an exploded view of two color cameras sharing a circuit board 409 and a connector 412.
In the embodiments of this application, by setting the under-screen camera assembly 220 on the second surface 210b side of the display panel 210, no hole needs to be cut in the display or backlight region, and a full-screen display of the electronic device can be realized. When the electronic device realizes a full-screen display, the light-transmitting area 210c of the display panel 210 also needs to have a display function, so the light-transmitting area 210c has only a limited light transmittance. Therefore, to ensure that the images captured by the under-screen camera assembly 220 beneath the display panel 210 have high brightness, the image processing method shown in FIG. 1 is needed.
Referring to FIG. 1, the image processing method 100 provided by this application includes S110 to S130.
In S110, a first image captured by each of N color cameras among the M color cameras is acquired.
In S120, for each first image, brightness values of at least some of the pixels in the first image are acquired.
In S130, the acquired brightness values are used to increase the brightness of the target image, and the brightness-increased target image is used as the captured image of the under-screen camera assembly 220.
The target image is the first image captured by one of the N color cameras. N is an integer greater than or equal to 2, and N is less than or equal to M.
In the embodiments of this application, each color camera in the under-screen camera assembly 220 has an independent lens 407 above its photosensitive chip 408 and can form an image on its own. For each color camera, the combination of the photosensitive chip 408 and the lens 407 guarantees the camera a certain light transmittance. Compared with setting a single camera beneath the display panel 210, setting at least two color cameras on the second surface 210b side of the display panel 210 at positions corresponding to the light-transmitting area 210c can increase the brightness collected by the under-screen camera assembly 220, that is, increase its light transmittance. Therefore, by using the brightness values of at least some of the pixels in the first image collected by each color camera to increase the brightness of the selected target image, and using the brightness-increased target image as the captured image of the under-screen camera assembly 220, the brightness of the image captured by the under-screen camera assembly 220 is improved.
In some embodiments of this application, the image captured by the color camera at a preset position among the N color cameras may be taken as the target image. As an example, if the at least two cameras are arranged in a line, the image captured by the color camera at the center position may be taken as the target image.
In some embodiments, S120 includes S1201 to S1205.
In S1201, the target shooting distance between the under-screen camera assembly 220 and the target object photographed by the under-screen camera assembly 220 is acquired.
In the embodiments of this application, the target shooting distance is the distance between the under-screen camera assembly 220 and the target object to be photographed. Because the color cameras in the under-screen camera assembly 220 are very close to one another, the shooting distance from each color camera to the target object can be regarded as the same.
It is worth mentioning that when the under-screen camera assembly 220 needs to photograph the target object, the light-transmitting area 210c above the under-screen camera assembly 220 is turned off, that is, its display function is suspended.
In S1203, the target region of each first image at the target shooting distance is obtained based on pre-stored predetermined target regions in the images captured by each color camera at different shooting distances.
In the embodiments of this application, the under-screen camera assembly 220 may be calibrated in advance: at different shooting distances, each color camera of the under-screen camera assembly 220 photographs a calibration object. At any shooting distance, for the image captured by each color camera, the position region of the calibration object in that image is taken as the predetermined target region of the image. The calibration object may be the target object to be photographed or another object. If the calibration object is the target object, then for any pre-stored shooting distance, the predetermined target region of the image captured by a color camera is the position region of the target object in that image.
As an example, a predetermined target region may be represented by a range of pixel coordinates. At any shooting distance, the predetermined target regions of the images captured by the color cameras contain the same number of pixels. At a given shooting distance, each pixel in the predetermined target region of one color camera's image has a correspondingly positioned pixel with the same color information in the predetermined target region of every other color camera's image.
As shown in FIG. 8, as an example, the under-screen camera assembly 220 includes four color cameras, and it is assumed that the first image of the target object captured by each camera is 8 pixels by 8 pixels. The 4-pixel-by-4-pixel region inside the bold black frame in each first image is the target region of that first image. The target regions in FIG. 8 are only schematic, and the image sizes in FIG. 8 are only illustrative and not specifically limited. In some embodiments, the size of a first image may be 8 million pixels by 8 million pixels.
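The distance-indexed region lookup of S1203 can be sketched as follows. The table format (distance in millimetres mapped to one `(row0, row1, col0, col1)` box per camera) and the nearest-neighbour match are assumptions for illustration; the patent only states that predetermined target regions are pre-stored per shooting distance.

```python
def lookup_target_regions(calibration, shooting_distance):
    """Return the per-camera target regions stored for the calibrated
    distance closest to the measured shooting distance.

    calibration: dict mapping a shooting distance to a list of per-camera
    pixel regions. Nearest-neighbour selection is an assumed policy.
    """
    nearest = min(calibration, key=lambda d: abs(d - shooting_distance))
    return calibration[nearest]


# Hypothetical calibration table: each region is a 4x4 box, as in the
# FIG. 8 example of 8x8 first images with 4x4 target regions.
calibration = {
    300: [(0, 4, 0, 4), (0, 4, 1, 5), (1, 5, 0, 4), (1, 5, 1, 5)],
    500: [(2, 6, 2, 6), (2, 6, 2, 6), (2, 6, 2, 6), (2, 6, 2, 6)],
}
regions = lookup_target_regions(calibration, 460)  # nearest stored distance is 500
```

A finer calibration grid simply adds more entries to the table; interpolating between stored regions would be a further refinement not described in the source.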
In S1205, for each first image, the brightness value of every pixel in the target region of the first image is acquired.
In some embodiments of this application, the brightness values of all pixels in the target region of each first image are used to increase the brightness values of the pixels in the target region of the target image, so the brightness values of all pixels in the target region of each first image must first be acquired.
After the brightness value of every pixel in the target region of each first image is acquired, step S140 includes S1401 and S1402.
In S1401, each pixel in the target region of the target image is taken as a first target pixel.
In the embodiments of this application, the brightness value of every pixel in the target region of the target image is to be increased, so every pixel in the target region of the target image can be taken as a first target pixel.
In S1402, for each first target pixel, the brightness value of the first target pixel and a first brightness value are added to obtain a first target brightness value, and the brightness value of the first target pixel is raised to the first target brightness value.
The first brightness value is the sum of the brightness values of the pixels corresponding to the first target pixel in the target region of every first image other than the target image.
It should be noted that each first target pixel has a correspondingly positioned pixel with the same color information in the target region of every first image other than the target image.
As an example, referring to FIG. 8, the pixels in row i, column j of each target region are corresponding pixels, and corresponding pixels in the target regions have the same color information.
Referring to FIG. 8, the pixels in row 1, column 1 of the target regions are four corresponding pixels, and the color information of the four pixels is red. Suppose the first image captured by the first color camera is the target image; then the pixel in row 1, column 1 of the target region in the upper-left corner of FIG. 8 is a first target pixel, with brightness value R1. Suppose the brightness value of the pixel in row 1, column 1 of the target region in the upper-right corner of FIG. 8 is R2, that in the lower-left corner is R3, and that in the lower-right corner is R4.
Then the first brightness value R' = R2 + R3 + R4, and the first target brightness value corresponding to the pixel in row 1, column 1 of the target region of the target image is R_total = R1 + R' = R1 + R2 + R3 + R4. The brightness value of the pixel in row 1, column 1 of the target region of the target image is then raised to R_total.
Following a similar method, the brightness value of every pixel in the target region of the target image can be increased. The target image whose target region has been brightness-increased is then used as the image of the target object captured by the under-screen camera assembly 220.
In the embodiments of this application, by using the brightness values of all pixels in the target region of each first image to increase the brightness value of every pixel in the target region of the target image, the brightness of the image captured by the under-screen camera assembly 220 is increased.
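The per-pixel accumulation of S1401 and S1402 can be sketched with NumPy. The `(row0, row1, col0, col1)` region format and the clipping to an 8-bit range are assumptions for illustration, not fixed by the patent; the sum itself follows R_total = R1 + R2 + R3 + R4.

```python
import numpy as np

def boost_target_region(first_images, regions, target_index=0):
    """Raise each pixel in the target image's target region by the sum of
    the corresponding pixels in every other first image's target region."""
    target = first_images[target_index].astype(np.int64)
    r0, r1, c0, c1 = regions[target_index]
    for i, (img, reg) in enumerate(zip(first_images, regions)):
        if i == target_index:
            continue
        s0, s1, t0, t1 = reg
        # corresponding pixels sit at the same offset inside each region
        target[r0:r1, c0:c1] += img[s0:s1, t0:t1]
    return np.clip(target, 0, 255).astype(np.uint8)
```

For example, with four cameras each reading brightness 40 inside the region, the boosted region reads 160, while pixels outside the region keep their original value.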
In the embodiments of this application, the fields of view of all the color cameras in the under-screen camera assembly 220 share the same overlapping region. It is because every color camera's field of view shares this overlapping region that each captured image has its own corresponding target region.
The closer the color cameras in the under-screen camera assembly 220 are set to one another, the more pixels the target region of each first image contains, which can further increase the overall brightness of the image captured by the under-screen camera assembly 220.
In some embodiments of this application, to reduce the brightness difference between the target region and the non-target region in the image of the target object captured by the under-screen camera assembly 220, step S140 may further acquire the brightness value of every pixel in the non-target region of the target image and then raise each of those brightness values by a preset multiple.
As a specific example, the preset multiple is N-1 times; that is, the brightness value of every pixel in the non-target region of the target image is adjusted to N times its original value.
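The non-target-region adjustment can be sketched as a mask-based scale, reading "raise by N-1 times" as multiplying by N, as the text itself states. The region format and the 8-bit clipping are assumptions for illustration.

```python
import numpy as np

def scale_non_target_region(target, region, n_cameras):
    """Multiply brightness outside the target region by N so the two
    regions end up at a comparable level (region: row0, row1, col0, col1)."""
    out = target.astype(np.int64)
    mask = np.zeros(out.shape, dtype=bool)
    r0, r1, c0, c1 = region
    mask[r0:r1, c0:c1] = True  # the summed target region stays untouched
    out[~mask] *= n_cameras
    return np.clip(out, 0, 255).astype(np.uint8)
```

With N = 4 cameras, a non-target pixel of brightness 30 becomes 120, roughly matching a target-region pixel summed over four frames of similar brightness.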
In other embodiments of this application, to increase the speed of image processing, the target region of each first image need not be acquired; the brightness values of all pixels in each first image can be used directly to brighten the target image. Specifically, in step S120, for each first image, the brightness values of all pixels in the first image are acquired.
After the brightness values of all pixels in each first image are acquired, step S140 includes S1401' and S1402'.
In S1401', each pixel in the target image is taken as a second target pixel.
In S1402', for each second target pixel, the brightness value of the second target pixel and a second brightness value are added to obtain a second target brightness value, and the brightness value of the second target pixel is raised to the second target brightness value.
The second brightness value is the sum of the brightness values of the pixels corresponding to the second target pixel in every first image other than the target image.
In some embodiments of this application, the first images are all the same size; the pixels in row i, column j of each first image are taken as mutually corresponding pixels, and corresponding pixels in the first images have the same color information.
Referring to FIG. 9, the pixels in row 1, column 2 of the first images are four corresponding pixels, and the color information of the four pixels is green. Suppose the brightness value of the pixel in row 1, column 2 of the first image in the upper-left corner of FIG. 9 is G1, that in the upper-right corner is G2, that in the lower-left corner is G3, and that in the lower-right corner is G4.
Then the second brightness value G' = G2 + G3 + G4, and the second target brightness value corresponding to the pixel in row 1, column 2 of the target image is G_total = G1 + G' = G1 + G2 + G3 + G4. The brightness value of the pixel in row 1, column 2 of the target image is then raised to G_total.
Following a similar method, the brightness value of every pixel in the target image can be increased. The target image with the brightness of every pixel increased is then used as the image of the target object captured by the under-screen camera assembly 220.
That is, if the color cameras of the under-screen camera assembly 220 are very close to one another, the differences between the first images are negligible, so the target regions need not be calibrated in advance; all pixels of each first image can be used directly to brighten the target image, which increases the speed of image processing while increasing the brightness of the image captured by the under-screen camera assembly 220.
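When the frames are treated as aligned, S1401' and S1402' reduce to a plain per-pixel sum over all first images; a minimal sketch, assuming equally sized 8-bit frames, with clipping to the 8-bit range as an added assumption:

```python
import numpy as np

def boost_whole_image(first_images, target_index=0):
    """Raise every pixel of the target image by the sum of the
    corresponding pixels in all other first images."""
    total = first_images[target_index].astype(np.int64)
    for i, img in enumerate(first_images):
        if i != target_index:
            total += img
    return np.clip(total, 0, 255).astype(np.uint8)
```

For the FIG. 9 example this yields G_total = G1 + G2 + G3 + G4 at every pixel, with no region lookup, which is why this variant is faster.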
FIG. 10 shows a schematic structural diagram of an embodiment of the image processing apparatus applied to the above electronic device, provided by the second aspect of this application. The image processing apparatus 1000 provided by this embodiment includes:
an image acquisition module 1010, configured to acquire a first image captured by each of N color cameras among the M color cameras;
a brightness value acquisition module 1020, configured to acquire, for each first image, brightness values of at least some of the pixels in the first image; and
a brightness enhancement module 1030, configured to use the acquired brightness values to increase the brightness of the target image, and to use the brightness-increased target image as the image of the target object captured by the under-screen camera assembly, the target image being the first image captured by one of the N color cameras.
In the embodiments of this application, the brightness values of at least some of the pixels in the first image collected by each color camera, set on the second surface 210b side of the display panel 210 at a position corresponding to the light-transmitting area 210c, are used to increase the brightness of the target image, and the brightness-increased target image is used as the captured image of the under-screen camera assembly 220, so that the brightness of the image captured by the under-screen camera assembly 220 is improved while a full-screen display is realized.
In the embodiments of this application, the brightness value acquisition module 1020 includes:
a target shooting distance acquisition unit, configured to acquire the target shooting distance between the under-screen camera assembly 220 and the target object photographed by the under-screen camera assembly 220;
a target region determination unit, configured to obtain the target region of each first image at the target shooting distance based on pre-stored predetermined target regions in the images captured by each color camera at different shooting distances; and
a first brightness value acquisition unit, configured to acquire, for each first image, the brightness value of every pixel in the target region of the first image.
In the embodiments of this application, for any pre-stored shooting distance, the predetermined target region of the image captured by a color camera is the position region of the target object in that image.
In the embodiments of this application, the brightness enhancement module 1030 is configured to:
take each pixel in the target region of the target image as a first target pixel; and
for each first target pixel, add the brightness value of the first target pixel and a first brightness value to obtain a first target brightness value, and raise the brightness value of the first target pixel to the first target brightness value,
where the first brightness value is the sum of the brightness values of the pixels corresponding to the first target pixel in the target region of every first image other than the target image.
In the embodiments of this application, the brightness enhancement module 1030 is further configured to:
raise the brightness value of every pixel in the acquired non-target region of the target image by a preset multiple, where the preset multiple is N-1 times.
In the embodiments of this application, the brightness value acquisition module 1020 is configured to:
acquire, for each first image, the brightness values of all pixels in the first image.
In the embodiments of this application, the brightness enhancement module 1030 is further configured to:
take each pixel in the target image as a second target pixel; and
for each second target pixel, add the brightness value of the second target pixel and a second brightness value to obtain a second target brightness value, and raise the brightness value of the second target pixel to the second target brightness value,
where the second brightness value is the sum of the brightness values of the pixels corresponding to the second target pixel in every first image other than the target image.
A third aspect of this application further provides an electronic device, including:
an under-screen camera assembly including M color cameras; and
an image processing apparatus, configured to acquire a first image captured by each of N color cameras among the M color cameras; for each first image, acquire brightness values of at least some of the pixels in the first image; use the acquired brightness values to increase the brightness of the target image; and use the brightness-increased target image as the image of the target object captured by the under-screen camera assembly, the target image being the first image captured by one of the N color cameras;
where M and N are both integers greater than or equal to 2, and N is less than or equal to M.
The electronic device provided by this embodiment may include the display panel 210 in FIG. 2. In the electronic device according to this embodiment, the under-screen camera assembly may be the under-screen camera assembly 220 described above with reference to FIG. 2 and any of its embodiments, and the image processing apparatus may be the image processing apparatus 1000 described with reference to FIG. 10 and any of its embodiments.
A fourth aspect of this application further provides an electronic device, including:
a memory for storing a program; and
a processor for running the program stored in the memory to execute the steps of the image processing method of the embodiments of this application, where the method includes: acquiring a first image captured by each of N color cameras among the M color cameras; for each first image, acquiring brightness values of at least some of the pixels in the first image; using the acquired brightness values to increase the brightness of a target image; and using the brightness-increased target image as the image of the target object captured by the under-screen camera assembly 220, the target image being the first image captured by one of the N color cameras.
The electronic device provided by this application can implement each process in any embodiment of the image processing method of the first aspect of this application; to avoid repetition, details are not repeated here. In the embodiments of this application, the brightness values of at least some of the pixels in the first image collected by each color camera, set on the second surface 210b side of the display panel 210 at a position corresponding to the light-transmitting area 210c, are used to increase the brightness of the target image, and the brightness-increased target image is used as the captured image of the under-screen camera assembly 220, so that the brightness of the image captured by the under-screen camera assembly 220 is improved while a full-screen display is realized.
FIG. 11 is a schematic structural diagram of an embodiment of the electronic device provided by the fourth aspect of this application.
The electronic device 1100 includes, but is not limited to: a radio frequency unit 1101, a network module 1102, an audio output unit 1103, an input unit 1104, a sensor 1105, a display unit 1106, a user input unit 1107, an interface unit 1108, a memory 1109, a processor 1110, a power supply 1111, and other components. The electronic device 1100 also includes a first screen and a second screen. Those skilled in the art can understand that the electronic device structure shown in FIG. 11 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than shown, combine certain components, or use a different component layout. In the embodiments of this application, electronic devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, pedometers, and the like.
The processor 1110 is configured to acquire a first image captured by each of N color cameras among the M color cameras; for each first image, acquire brightness values of at least some of the pixels in the first image; use the acquired brightness values to increase the brightness of a target image; and use the brightness-increased target image as the image of the target object captured by the under-screen camera assembly 220, the target image being the first image captured by one of the N color cameras.
In the embodiments of this application, the brightness values of at least some of the pixels in the first image collected by each color camera, set on the second surface 210b side of the display panel 210 at a position corresponding to the light-transmitting area 210c, are used to increase the brightness of the target image, and the brightness-increased target image is used as the captured image of the under-screen camera assembly 220, so that the brightness of the image captured by the under-screen camera assembly 220 is improved while a full-screen display is realized.
It should be understood that, in the embodiments of this application, the radio frequency unit 1101 can be used to receive and send signals during the process of sending and receiving information or during a call; specifically, downlink data received from a base station is passed to the processor 1110 for processing, and uplink data is sent to the base station. Typically, the radio frequency unit 1101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, and a duplexer. In addition, the radio frequency unit 1101 can also communicate with the network and other devices through a wireless communication system.
The electronic device provides users with wireless broadband Internet access through the network module 1102, for example helping users send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 1103 can convert audio data received by the radio frequency unit 1101 or the network module 1102, or stored in the memory 1109, into an audio signal and output it as sound. Moreover, the audio output unit 1103 can also provide audio output related to a specific function performed by the electronic device 1100 (for example, a call signal reception sound or a message reception sound). The audio output unit 1103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 1104 is used to receive audio or video signals. The input unit 1104 may include a graphics processing unit (GPU) 11041 and a microphone 11042. The graphics processor 11041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The processed image frames can be displayed on the display unit 1106, stored in the memory 1109 (or another storage medium), or sent via the radio frequency unit 1101 or the network module 1102. The microphone 11042 can receive sound and process it into audio data; in a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 1101 and output.
The electronic device 1100 further includes at least one sensor 1105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 11061 according to the ambient light, and the proximity sensor can turn off the display panel 11061 and/or the backlight when the electronic device 1100 is moved to the ear. As one kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the electronic device (such as switching between landscape and portrait, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tapping); the sensor 1105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not repeated here.
The display unit 1106 is used to display information input by the user or provided to the user. The display unit 1106 may include a display panel 11061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, and the like.
The user input unit 1107 can be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 1107 includes a touch panel 11071 and other input devices 11072. The touch panel 11071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 11071 with a finger, a stylus, or any other suitable object or accessory). The touch panel 11071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends the coordinates to the processor 1110, and receives and executes commands sent by the processor 1110. In addition, the touch panel 11071 can be implemented in multiple types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 11071, the user input unit 1107 may also include other input devices 11072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not repeated here.
Further, the touch panel 11071 can be overlaid on the display panel 11061. When the touch panel 11071 detects a touch operation on or near it, it transmits the operation to the processor 1110 to determine the type of the touch event, and the processor 1110 then provides a corresponding visual output on the display panel 11061 according to the type of the touch event. Although in FIG. 11 the touch panel 11071 and the display panel 11061 are two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 11071 and the display panel 11061 can be integrated to implement these input and output functions; this is not specifically limited here.
The interface unit 1108 is an interface for connecting an external device to the electronic device 1100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 1108 can be used to receive input (for example, data information or power) from an external device and transmit the received input to one or more elements in the electronic device 1100, or to transfer data between the electronic device 1100 and the external device.
The memory 1109 can be used to store software programs and various data. The memory 1109 may mainly include a program storage area and a data storage area: the program storage area may store an operating system and application programs required by at least one function (such as a sound playback function or an image playback function); the data storage area may store data created through the use of the phone (such as audio data and a phone book). In addition, the memory 1109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 1110 is the control center of the electronic device. It connects the various parts of the entire electronic device through various interfaces and lines, and performs the various functions of the electronic device and processes data by running or executing the software programs and/or modules stored in the memory 1109 and calling the data stored in the memory 1109, thereby monitoring the electronic device as a whole. The processor 1110 may include one or more processing units; preferably, the processor 1110 may integrate an application processor, which mainly handles the operating system, user interface, and application programs, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 1110.
The electronic device 1100 may also include a power supply 1111 (such as a battery) for supplying power to the various components; preferably, the power supply 1111 may be logically connected to the processor 1110 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
In addition, the electronic device 1100 includes some functional modules that are not shown, which are not repeated here.
A fifth aspect of this application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements each process of any embodiment of the image processing method of the first aspect of this application and can achieve the same technical effect; to avoid repetition, details are not repeated here. Examples of the computer-readable storage medium include non-transitory computer-readable storage media, such as a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.
It should be noted that, in this document, the terms "comprise", "include", or any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or apparatus including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "including a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that includes the element.
Aspects of the present disclosure are described above with reference to flowcharts and/or block diagrams of methods, apparatuses (systems), or computer program products according to embodiments of the present disclosure. It should be understood that each block of the flowcharts and/or block diagrams, and combinations of blocks, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or another programmable data processing apparatus to produce a machine, such that the instructions, executed via the processor of the computer or other programmable data processing apparatus, enable the implementation of the functions/actions specified in one or more blocks of the flowcharts and/or block diagrams. Such a processor may be, but is not limited to, a general-purpose processor, a dedicated processor, an application-specific processor, or a field-programmable logic array. It should also be understood that each block of the block diagrams and/or flowcharts, and combinations of blocks, can also be implemented by a dedicated hardware-based system that performs the specified functions or actions, or by a combination of dedicated hardware and computer instructions.
The embodiments of this application have been described above with reference to the accompanying drawings, but this application is not limited to the specific embodiments described above, which are merely illustrative rather than restrictive. Inspired by this application, a person of ordinary skill in the art can make many further forms without departing from the purpose of this application and the scope protected by the claims, all of which fall within the protection of this application.

Claims (14)

  1. An image processing method applied to an electronic device, the electronic device including an under-screen camera assembly, and the under-screen camera assembly including M color cameras;
    the method including:
    acquiring a first image captured by each of N color cameras among the M color cameras;
    for each first image, acquiring brightness values of at least some of the pixels in the first image;
    using the acquired brightness values to increase the brightness of a target image, and using the brightness-increased target image as the captured image of the under-screen camera assembly, the target image being the first image captured by one of the N color cameras;
    where M and N are both integers greater than or equal to 2, and N is less than or equal to M.
  2. The method according to claim 1, wherein acquiring, for each first image, brightness values of at least some of the pixels in the first image includes:
    acquiring a target shooting distance between the under-screen camera assembly and a target object photographed by the under-screen camera assembly;
    obtaining the target region of each first image at the target shooting distance based on pre-stored predetermined target regions in the images captured by each color camera at different shooting distances; and
    for each first image, acquiring the brightness value of every pixel in the target region of the first image.
  3. The method according to claim 2, wherein, for any pre-stored shooting distance, the predetermined target region of the image captured by the color camera is the position region of the target object in that image.
  4. The method according to claim 2, wherein using the acquired brightness values to increase the brightness of the target image includes:
    taking each pixel in the target region of the target image as a first target pixel; and
    for each first target pixel, adding the brightness value of the first target pixel and a first brightness value to obtain a first target brightness value, and raising the brightness value of the first target pixel to the first target brightness value;
    where the first brightness value is the sum of the brightness values of the pixels corresponding to the first target pixel in the target region of every first image other than the target image.
  5. The method according to claim 4, wherein using the acquired brightness values to increase the brightness of the target image further includes:
    raising the brightness value of every pixel in the acquired non-target region of the target image by a preset multiple, the preset multiple being N-1 times.
  6. The method according to claim 1, wherein acquiring, for each first image, brightness values of at least some of the pixels in the first image includes:
    for each first image, acquiring the brightness values of all pixels in the first image.
  7. The method according to claim 6, wherein using the acquired brightness values to increase the brightness of the target image includes:
    taking each pixel in the target image as a second target pixel; and
    for each second target pixel, adding the brightness value of the second target pixel and a second brightness value to obtain a second target brightness value, and raising the brightness value of the second target pixel to the second target brightness value;
    where the second brightness value is the sum of the brightness values of the pixels corresponding to the second target pixel in every first image other than the target image.
  8. An image processing apparatus applied to an electronic device, the electronic device including an under-screen camera assembly, and the under-screen camera assembly including M color cameras;
    the apparatus including:
    an image acquisition module, configured to acquire a first image captured by each of N color cameras among the M color cameras;
    a brightness value acquisition module, configured to acquire, for each first image, brightness values of at least some of the pixels in the first image; and
    a brightness enhancement module, configured to use the acquired brightness values to increase the brightness of the target image, and to use the brightness-increased target image as the image of the target object captured by the under-screen camera assembly, the target image being the first image captured by one of the N color cameras;
    where M and N are both integers greater than or equal to 2, and N is less than or equal to M.
  9. An electronic device, including:
    an under-screen camera assembly including M color cameras; and
    an image processing apparatus, configured to acquire a first image captured by each of N color cameras among the M color cameras; for each first image, acquire brightness values of at least some of the pixels in the first image; use the acquired brightness values to increase the brightness of the target image; and use the brightness-increased target image as the image of the target object captured by the under-screen camera assembly, the target image being the first image captured by one of the N color cameras;
    where M and N are both integers greater than or equal to 2, and N is less than or equal to M.
  10. The electronic device according to claim 9, wherein the M color cameras are arranged in an array.
  11. The electronic device according to claim 9, wherein the fields of view of all the cameras in the under-screen camera assembly share the same overlapping region.
  12. The electronic device according to claim 9, wherein the color camera includes:
    a lens, a lens holder, a filter assembly, a photosensitive chip, and a circuit board connected to the photosensitive chip;
    where the lens holder is set on the circuit board, the lens and the filter assembly are set on the lens holder, and the filter assembly is located between the lens and the photosensitive chip.
  13. An electronic device, including a processor and a memory storing computer program instructions;
    where the processor, when executing the computer program instructions, implements the method according to any one of claims 1 to 7.
  14. A computer storage medium storing computer program instructions which, when executed by a processor, implement the method according to any one of claims 1 to 7.
PCT/CN2020/116887 2019-09-27 2020-09-22 Image processing method and apparatus, electronic device, and medium WO2021057731A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2022508992A JP7397966B2 (ja) 2019-09-27 2020-09-22 Image processing method, image processing apparatus, and electronic device
EP20868126.2A EP4037304A4 (en) 2019-09-27 2020-09-22 IMAGE PROCESSING METHOD AND APPARATUS, ELECTRONIC DEVICE AND MEDIA
US17/691,859 US11678066B2 (en) 2019-09-27 2022-03-10 Image processing method, electronic device and medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910921251.6A CN110661972B (zh) 2019-09-27 2019-09-27 图像处理方法、装置、电子设备和介质
CN201910921251.6 2019-09-27

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/691,859 Continuation US11678066B2 (en) 2019-09-27 2022-03-10 Image processing method, electronic device and medium

Publications (1)

Publication Number Publication Date
WO2021057731A1 true WO2021057731A1 (zh) 2021-04-01

Family

ID=69039466

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/116887 WO2021057731A1 (zh) 2019-09-27 2020-09-22 图像处理方法、装置、电子设备和介质

Country Status (5)

Country Link
US (1) US11678066B2 (zh)
EP (1) EP4037304A4 (zh)
JP (1) JP7397966B2 (zh)
CN (1) CN110661972B (zh)
WO (1) WO2021057731A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110769151B (zh) * 2019-09-27 2021-10-15 维沃移动通信有限公司 图像处理方法、装置、电子设备和介质
CN110661972B (zh) * 2019-09-27 2021-02-23 维沃移动通信有限公司 图像处理方法、装置、电子设备和介质

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012057619A1 (en) * 2010-10-24 2012-05-03 Ziv Attar System and method for imaging using multi aperture camera
CN103986874A (zh) * 2014-05-29 2014-08-13 宇龙计算机通信科技(深圳)有限公司 一种图像获取装置、图像获取方法及终端
CN105611185A (zh) * 2015-12-18 2016-05-25 广东欧珀移动通信有限公司 图像生成方法、装置及终端设备
CN107079083A (zh) * 2015-11-25 2017-08-18 华为技术有限公司 一种拍照方法、拍照装置和终端
CN109348123A (zh) * 2018-10-25 2019-02-15 努比亚技术有限公司 拍照方法、移动终端及计算机可读存储介质
CN109587396A (zh) * 2018-11-27 2019-04-05 浙江舜宇光学有限公司 拍摄方法和拍摄装置
CN110661972A (zh) * 2019-09-27 2020-01-07 维沃移动通信有限公司 图像处理方法、装置、电子设备和介质
CN110971805A (zh) * 2019-12-20 2020-04-07 维沃移动通信有限公司 一种电子设备及其拍照方法

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4411059B2 (ja) 2003-12-12 2010-02-10 Canon Inc. Display device with camera, communication device, and communication system
US8314859B2 (en) * 2008-05-29 2012-11-20 Lg Electronics Inc. Mobile terminal and image capturing method thereof
JP5406619B2 (ja) 2009-07-27 2014-02-05 Canon Inc. Moving image playback device, imaging device, control methods therefor, and program
JP6322099B2 (ja) 2014-09-12 2018-05-09 Canon Inc. Image processing apparatus and image processing method
CN204836352U (zh) * 2015-09-07 2015-12-02 深圳六滴科技有限公司 System and device for adjusting the brightness of images captured by multiple cameras
CN105577941A (zh) * 2016-02-01 2016-05-11 Huizhou TCL Mobile Communication Co., Ltd. Terminal screen brightness adjustment method and system
JP6598028B2 (ja) 2016-02-22 2019-10-30 Panasonic IP Management Co., Ltd. Imaging device
US9843736B2 (en) * 2016-02-26 2017-12-12 Essential Products, Inc. Image capture with a camera integrated display
US20180114493A1 (en) * 2016-10-21 2018-04-26 Motorola Mobility Llc Electronic Device with Display-Based Image Compensation and Corresponding Systems and Methods
WO2019006749A1 (zh) * 2017-07-07 2019-01-10 Huawei Technologies Co., Ltd. Terminal with camera and shooting method
US10911656B2 (en) * 2017-11-21 2021-02-02 Microsoft Technology Licensing, Llc Optical isolation systems for displays
US10122969B1 (en) * 2017-12-07 2018-11-06 Microsoft Technology Licensing, Llc Video capture systems and methods
US10681281B2 (en) * 2017-12-28 2020-06-09 Htc Corporation Mobile device, and image processing method for mobile device
US10762664B2 (en) * 2017-12-29 2020-09-01 Intel Corporation Multi-camera processor with feature matching
CN109840475A (zh) * 2018-12-28 2019-06-04 深圳奥比中光科技有限公司 人脸识别方法及电子设备
CN109756680B (zh) * 2019-01-30 2021-05-14 Oppo广东移动通信有限公司 图像合成方法、装置、电子设备及可读存储介质
CN109788207B (zh) * 2019-01-30 2021-03-23 Oppo广东移动通信有限公司 图像合成方法、装置、电子设备及可读存储介质
CN110062082B (zh) * 2019-05-24 2024-05-17 Oppo广东移动通信有限公司 一种显示屏及终端设备
US11153513B2 (en) * 2019-08-19 2021-10-19 Synaptics Incorporated Light source for camera

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012057619A1 (en) * 2010-10-24 2012-05-03 Ziv Attar System and method for imaging using multi aperture camera
CN103986874A (zh) * 2014-05-29 2014-08-13 宇龙计算机通信科技(深圳)有限公司 一种图像获取装置、图像获取方法及终端
CN107079083A (zh) * 2015-11-25 2017-08-18 华为技术有限公司 一种拍照方法、拍照装置和终端
CN105611185A (zh) * 2015-12-18 2016-05-25 广东欧珀移动通信有限公司 图像生成方法、装置及终端设备
CN109348123A (zh) * 2018-10-25 2019-02-15 努比亚技术有限公司 拍照方法、移动终端及计算机可读存储介质
CN109587396A (zh) * 2018-11-27 2019-04-05 浙江舜宇光学有限公司 拍摄方法和拍摄装置
CN110661972A (zh) * 2019-09-27 2020-01-07 维沃移动通信有限公司 图像处理方法、装置、电子设备和介质
CN110971805A (zh) * 2019-12-20 2020-04-07 维沃移动通信有限公司 一种电子设备及其拍照方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4037304A4 *

Also Published As

Publication number Publication date
CN110661972B (zh) 2021-02-23
US11678066B2 (en) 2023-06-13
CN110661972A (zh) 2020-01-07
US20220201223A1 (en) 2022-06-23
EP4037304A1 (en) 2022-08-03
JP7397966B2 (ja) 2023-12-13
EP4037304A4 (en) 2022-11-23
JP2022544304A (ja) 2022-10-17

Similar Documents

Publication Publication Date Title
WO2021057735A1 (zh) Image processing method and apparatus, electronic device, and medium
WO2021104195A1 (zh) Image display method and electronic device
WO2021036542A1 (zh) Screen recording method and mobile terminal
WO2020253382A1 (zh) Information display method and terminal device
WO2019174628A1 (zh) Photographing method and mobile terminal
WO2021104321A1 (zh) Image display method and electronic device
WO2021233176A1 (zh) Ambient light detection method and electronic device
WO2021047325A1 (zh) Display panel control method and electronic device
WO2021036623A1 (zh) Display method and electronic device
WO2021147911A1 (zh) Mobile terminal, photographing-mode detection method, and storage medium
WO2021121398A1 (zh) Video recording method and electronic device
US11678066B2 (en) Image processing method, electronic device and medium
WO2021129745A1 (zh) Touch key, control method, and electronic device
KR20210057790A (ko) Information processing method and terminal
WO2021208890A1 (zh) Screenshot method and electronic device
WO2021185139A1 (zh) Electronic device and screen display method
WO2021190387A1 (zh) Detection result output method, electronic device, and medium
WO2020238562A1 (zh) Display method and terminal
US11863901B2 (en) Photographing method and terminal
WO2020220893A1 (zh) Screenshot method and mobile terminal
US20200341623A1 (en) Image display method and mobile terminal
WO2020020060A1 (zh) Fill light module, fingerprint recognition method, and terminal device
WO2020216181A1 (zh) Terminal device and control method therefor
WO2020156119A1 (zh) Application interface adjustment method and mobile terminal
WO2021204101A1 (zh) Display method and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20868126

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2022508992

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2020868126

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2020868126

Country of ref document: EP

Effective date: 20220428