WO2023236209A1 - Image processing method and apparatus, electronic device and storage medium - Google Patents

Image processing method and apparatus, electronic device and storage medium

Info

Publication number
WO2023236209A1
WO2023236209A1 (PCT/CN2022/098240)
Authority
WO
WIPO (PCT)
Prior art keywords
light spot
image
spot area
area
effective
Prior art date
Application number
PCT/CN2022/098240
Other languages
English (en)
Chinese (zh)
Inventor
尹双双
董家旭
饶强
陈妹雅
刘阳晨旭
江浩
Original Assignee
北京小米移动软件有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京小米移动软件有限公司 filed Critical 北京小米移动软件有限公司
Priority to PCT/CN2022/098240 priority Critical patent/WO2023236209A1/fr
Priority to CN202280004273.6A priority patent/CN117616777A/zh
Publication of WO2023236209A1 publication Critical patent/WO2023236209A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/187 Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60 Noise processing, e.g. detecting, correcting, reducing or removing noise

Definitions

  • the present disclosure relates to the technical field of image processing, and specifically to an image processing method, device, electronic device and storage medium.
  • the camera program of the terminal device can provide a variety of photo modes, so that it has the various functions of a professional camera and can satisfy users' photography needs in various scenarios.
  • photography with a physical blur function is mainly realized through professional lenses of professional cameras.
  • embodiments of the present disclosure provide an image processing method, device, electronic device and storage medium to solve the defects in the related technology.
  • an image processing method including:
  • the first image is blurred and rendered according to the effective light spot area.
  • determining an effective light spot area in at least one first light spot area in the first image according to at least one second light spot area in the second image includes:
  • the first light spot area belonging to the first intersection among the at least one first light spot area is determined as the effective light spot area, wherein the first intersection is the intersection of the at least one first light spot area and the at least one second light spot area.
  • the method further includes:
  • the color parameter of each pixel is determined according to the color parameter of the target pixel, wherein the color parameter is used to perform a blur rendering process on the first image.
  • determining the color parameters of each pixel point in each effective light spot area according to the color parameters of the target pixel point includes:
  • when the color parameter difference between each pixel in the i-th effective light spot area and the target pixel of the i-th effective light spot area is less than a preset difference threshold, the color parameters of each pixel in the i-th effective light spot area are adjusted according to the color parameters of the target pixel, where i is an integer greater than 0 and not greater than N, and N is the total number of effective light spot areas in the first image;
  • when the color parameter difference between at least one pixel in the j-th effective light spot area and the target pixel of the j-th effective light spot area is greater than the preset difference threshold, the color parameters of each pixel in the j-th effective light spot area are determined to remain unchanged, where j is an integer greater than 0 and not greater than N.
  • the method further includes:
  • the target brightness parameter is the brightness parameter of the pixel with the smallest brightness parameter in the effective light spot area
  • the brightness parameter of each pixel is adjusted according to the target brightness parameter, wherein the brightness parameter is used to perform blur rendering processing on the first image.
  • before determining the effective light spot area in the at least one first light spot area in the first image based on the at least one second light spot area in the second image, the method further includes:
  • light spot detection is performed on the first image and the second image respectively to obtain at least one first light spot area in the first image and at least one second light spot area in the second image;
  • performing light spot detection on the first image and the second image respectively to obtain the at least one first light spot area in the first image and the at least one second light spot area in the second image includes:
  • Pixels in the first image with brightness higher than the first brightness threshold are determined as first light spot pixels, and at least one connected domain composed of the first light spot pixels is determined as a first light spot area;
  • Pixels in the second image with brightness higher than the second brightness threshold are determined as second light spot pixels, and at least one connected area composed of the second light spot pixels is determined as a second light spot area.
  • determining at least one connected domain composed of the first light spot pixels as the first light spot area includes: determining at least one connected domain composed of the first light spot pixels and with a number of pixels within a preset number range as the first light spot area;
  • Determining at least one connected area composed of the second light spot pixel points as the second light spot area includes:
  • At least one connected area composed of the second light spot pixels and with a number of pixels within a preset number range is determined as a second light spot area.
  • an image processing device includes:
  • An acquisition module configured to acquire the first image and the second image collected by the image acquisition device for the same scene, where the first image is a normal exposure image and the second image is an underexposure image;
  • a determining module configured to determine an effective light spot area in at least one first light spot area in the first image according to at least one second light spot area in the second image;
  • a rendering module configured to perform blur rendering processing on the first image according to the effective light spot area.
  • the determining module is specifically used to:
  • the first light spot area belonging to the first intersection among the at least one first light spot area is determined as the effective light spot area, wherein the first intersection is the intersection of the at least one first light spot area and the at least one second light spot area.
  • a color module is also included for:
  • after determining an effective light spot area in at least one first light spot area in the first image based on at least one second light spot area in the second image, determine a target pixel for each effective light spot area, wherein the target pixel is the pixel with the highest color saturation in the effective light spot area;
  • the color parameter of each pixel is determined according to the color parameter of the target pixel, wherein the color parameter is used to perform a blur rendering process on the first image.
  • when the color module is used to determine the color parameters of each pixel in each effective light spot area according to the color parameters of the target pixel, it is specifically configured to:
  • when the color parameter difference between each pixel in the i-th effective light spot area and the target pixel of the i-th effective light spot area is less than a preset difference threshold, the color parameters of each pixel in the i-th effective light spot area are adjusted according to the color parameters of the target pixel, where i is an integer greater than 0 and not greater than N, and N is the total number of effective light spot areas in the first image;
  • when the color parameter difference between at least one pixel in the j-th effective light spot area and the target pixel of the j-th effective light spot area is greater than the preset difference threshold, the color parameters of each pixel in the j-th effective light spot area are determined to remain unchanged, where j is an integer greater than 0 and not greater than N.
  • a brightness module is also included for:
  • after determining an effective light spot area in at least one first light spot area in the first image based on at least one second light spot area in the second image, determine a target brightness parameter of each effective light spot area;
  • the target brightness parameter is the brightness parameter of the pixel with the smallest brightness parameter in the effective spot area
  • the brightness parameter of each pixel is adjusted according to the target brightness parameter, wherein the brightness parameter is used to perform blur rendering processing on the first image.
  • a detection module is also included, configured to perform light spot detection on the first image and the second image respectively to obtain the at least one first light spot area in the first image and the at least one second light spot area in the second image;
  • the detection module is specifically used to:
  • Pixels in the first image with brightness higher than the first brightness threshold are determined as first light spot pixels, and at least one connected domain composed of the first light spot pixels is determined as a first light spot area;
  • Pixels in the second image with brightness higher than the second brightness threshold are determined as second light spot pixels, and at least one connected area composed of the second light spot pixels is determined as a second light spot area.
  • when the detection module is used to determine at least one connected domain composed of the first light spot pixels as the first light spot area, it is specifically used to: determine at least one connected domain composed of the first light spot pixels and with a number of pixels within a preset number range as the first light spot area;
  • when the detection module is used to determine at least one connected area composed of the second light spot pixels as the second light spot area, it is specifically used to:
  • At least one connected area composed of the second light spot pixels and with a number of pixels within a preset number range is determined as a second light spot area.
  • an electronic device includes a memory and a processor.
  • the memory is used to store computer instructions executable on the processor.
  • the processor is used to execute the computer instructions to implement the image processing method described in the first aspect.
  • a computer-readable storage medium on which a computer program is stored, and when the program is executed by a processor, the method described in the first aspect is implemented.
  • According to the image processing method, since the first image is a normal exposure image and the second image is an underexposure image, areas mistakenly recognized as light spots in the normal exposure image, such as light-colored objects, will not be identified as light spot areas in the underexposure image. Therefore, the effective light spot areas screened out of the first light spot areas by using the second light spot areas are more accurate, and mistakenly identified first light spot areas are removed, so that the first image can be blur-rendered on the basis of the determined effective light spot areas, imitating the physical blur function of a professional camera. If this method is applied to the camera program of a terminal device, the functions of the camera program can be enriched and the camera effect can be brought closer to that of a professional camera.
  • Figure 1 is a flow chart of an image processing method according to an exemplary embodiment of the present disclosure
  • FIG. 2 is a flowchart of an image processing method according to another exemplary embodiment of the present disclosure.
  • Figure 3 is a schematic structural diagram of an image processing device according to an exemplary embodiment of the present disclosure.
  • FIG. 4 is a structural block diagram of an electronic device according to an exemplary embodiment of the present disclosure.
  • Although the terms first, second, third, etc. may be used in this disclosure to describe various information, the information should not be limited to these terms. These terms are only used to distinguish information of the same type from each other.
  • first information may also be called second information, and similarly, the second information may also be called first information.
  • the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
  • At least one embodiment of the present disclosure provides an image processing method. Please refer to FIG. 1, which shows the flow of the method, including steps S101 to S103.
  • the method can be applied to a terminal device, for example, to an algorithm that simulates physical blur in a camera program of the terminal device.
  • the terminal device may have an image acquisition device such as a camera. These image acquisition devices can acquire images, and the camera program of the terminal device can control various parameters in the image acquisition process of the image acquisition device.
  • This method can be applied to the scene where the camera program of the terminal device captures images. That is, this method is used to blur-render the images collected by the image acquisition device, thereby obtaining the image output by the camera program, namely the image that the user obtains when taking pictures with the camera program.
  • step S101 a first image and a second image collected by the image capture device for the same scene are obtained, where the first image is a normal exposure image and the second image is an underexposure image.
  • the image acquisition device can continuously collect the first image and the second image for the same scene.
  • The same scene is the scene that the user takes pictures of, that is, the real scene in the field of view of the image acquisition device. It can be understood that this step does not limit the order in which the first image and the second image are collected: the first image can be collected first and then the second image, the second image can be collected first and then the first image, or different sub-cameras in the image acquisition device can collect the first image and the second image simultaneously.
  • the first image is a normal exposure image, that is, the image collected under the normal exposure (ie, the default exposure) when the camera program takes the photo.
  • The second image is an underexposed image, that is, an image captured with an exposure smaller than the normal exposure used when the camera program takes the photo. It can be understood that the proportional relationship between the exposure amount when collecting underexposed images and the exposure amount when collecting normal exposure images can be set in advance, such as 80%, 75%, 60%, etc.; the exposure amount can be controlled by controlling the exposure time.
  • step S102 an effective light spot area is determined in at least one first light spot area in the first image according to at least one second light spot area in the second image.
  • The first light spot area in the first image and the second light spot area in the second image can be detected in advance. That is, before step S102, light spot detection can be performed on the first image and the second image respectively to obtain at least one first light spot area in the first image and at least one second light spot area in the second image.
  • The purpose of spot detection is to detect the light spot areas in the image, and the difference between a light spot area and other areas is mainly reflected in brightness; therefore, spot detection can be performed on the first image and the second image separately in the YUV domain. If the first image and the second image are RGB images, they can be converted from the RGB domain to the YUV domain before spot detection is performed on them, and spot detection can then be completed using the Y channel (i.e., the brightness channel) of the first image and the second image, as sketched below.
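As an illustrative aside (not part of the disclosure), the YUV conversion and Y-channel extraction could be sketched in Python with OpenCV as follows; the file names are hypothetical placeholders:

```python
import cv2

# Load the normal exposure (EV0) and underexposure (EV-) frames;
# the file names are hypothetical placeholders.
ev0_bgr = cv2.imread("ev0.png")
ev_minus_bgr = cv2.imread("ev_minus.png")

# Convert both frames from the RGB (BGR in OpenCV) domain to the YUV domain.
ev0_yuv = cv2.cvtColor(ev0_bgr, cv2.COLOR_BGR2YUV)
ev_minus_yuv = cv2.cvtColor(ev_minus_bgr, cv2.COLOR_BGR2YUV)

# Spot detection only needs the Y (brightness) channel of each image.
y0 = ev0_yuv[:, :, 0]
y_minus = ev_minus_yuv[:, :, 0]
```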
  • methods such as brightness thresholding, energy functions, and deep learning can be used for spot detection.
  • light spot detection can be performed on the first image and the second image respectively in the following manner: pixels in the first image with brightness higher than the first brightness threshold are determined as first light spot pixels, and at least one connected domain composed of the first light spot pixels is determined as a first light spot area; pixels in the second image with brightness higher than the second brightness threshold are determined as second light spot pixels, and at least one connected domain composed of the second light spot pixels is determined as a second light spot area.
  • the brightness value of the pixel is the value of the pixel in the Y channel.
  • the pixel with a brightness value higher than the first brightness threshold can be determined as the first light spot pixel, and the other pixels can be determined as non-first light spot pixels;
  • the pixels with a brightness value higher than the second brightness threshold can be determined as the second light spot pixels, and the other pixels can be determined as non-second light spot pixels.
  • a connected domain division standard of a four-connected connected domain or an eight-connected connected domain can be used to determine the connected domain composed of the first light spot pixels and the connected domain composed of the second light spot pixels. Then each independent connected domain in the first image can be assigned a unique label value, such as a number, etc. This label value is also the label value of the first light spot area determined by the connected domain; each independent connected domain in the second image can be assigned Each independent connected domain is assigned a unique label value, such as a number, etc., and the label value is also the label value of the second light spot area determined by the connected domain. Label values can be assigned to connected domains using two-pass or seed-fill methods.
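A minimal sketch of this thresholding-and-labelling step, continuing the snippet above and assuming OpenCV's built-in connected-component labelling and hypothetical brightness thresholds:

```python
import cv2
import numpy as np

def detect_spot_areas(y_channel: np.ndarray, brightness_threshold: int):
    """Threshold the Y channel and label the connected domains of spot pixels."""
    # Pixels brighter than the threshold are candidate light spot pixels.
    spot_mask = (y_channel > brightness_threshold).astype(np.uint8)
    # Give each 8-connected domain a unique label value; label 0 is background.
    num_labels, labels = cv2.connectedComponents(spot_mask, connectivity=8)
    return num_labels, labels

# Hypothetical thresholds; the disclosure does not fix concrete values.
num1, labels1 = detect_spot_areas(y0, brightness_threshold=230)       # first image
num2, labels2 = detect_spot_areas(y_minus, brightness_threshold=200)  # second image
```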
  • It should be noted that a large-area light source out of focus does not form a light spot. Therefore, when determining the first light spot area and the second light spot area, the area of each connected domain can be further measured, connected domains with too large an area can be excluded, and connected domains with an area within a reasonable range can be determined as light spot areas. In this way, large-area light sources are prevented from being mistakenly identified as light spot areas, which improves the accuracy of light spot area detection.
  • The area of a connected domain can be characterized by the number of pixels in it: at least one connected domain composed of the first light spot pixels and with a number of pixels within the preset number range is determined as a first light spot area, and at least one connected domain composed of the second light spot pixels and with a number of pixels within the preset number range is determined as a second light spot area.
  • The preset number range can be set in advance, for example, fewer than a preset quantity threshold (such as 300 pixels); a sketch of this size filter follows.
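Continuing the sketch, the size filter might look like this; the 300-pixel ceiling follows the example above, and the helper name is hypothetical:

```python
def filter_by_area(labels: np.ndarray, num_labels: int, max_pixels: int = 300):
    """Keep only connected domains whose pixel count lies in the preset range."""
    kept = np.zeros_like(labels)
    for label in range(1, num_labels):      # label 0 is the background
        count = int(np.sum(labels == label))
        # Exclude over-large domains (e.g. a defocused large-area light source).
        if 0 < count <= max_pixels:
            kept[labels == label] = label
    return kept

labels1 = filter_by_area(labels1, num1)  # candidate first light spot areas
labels2 = filter_by_area(labels2, num2)  # candidate second light spot areas
```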
  • the image acquisition device may have a position change when acquiring the first image and acquiring the second image. For example, the user may shake when holding the terminal device to take pictures. Changes in the position of the image acquisition device will cause the first image and the second image not to completely overlap, but to have some deviations. Therefore, before determining the effective light spot area, the first image and the second image can be aligned.
  • The Y channel in the YUV domain can be used to complete the alignment of the first image and the second image, that is, the Y channel of the first image is aligned with the Y channel of the second image, thereby completing the alignment of the two images.
  • Specifically, the brightness of the second image can first be increased through the Y-channel histogram of the second image; optical flow alignment is then used to align the first image and the brightened second image; finally, the alignment of the first image and the second image is completed according to the alignment result (see the sketch below). For example, if after brightening the second image aligns with the first image after being shifted 15 pixels upward and 20 pixels to the right, then the second image can be shifted 15 pixels upward and 20 pixels to the right to align with the first image. Through the alignment process, it can be ensured that pixels at the same position in the first image and the second image correspond to the same real-world scene point, thereby improving the screening accuracy of the effective light spot areas.
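One possible realization of this brighten-then-align step, using a plain histogram stretch in place of the histogram-based brightening and Farnebäck dense optical flow reduced to a single global offset; the sign convention of the offset may need adjustment in practice:

```python
def align_underexposed_to_normal(y0: np.ndarray, y_minus: np.ndarray):
    """Brighten the underexposed Y channel, then estimate a global (dx, dy)
    offset between the two frames via dense optical flow."""
    # Stand-in for the Y-channel-histogram brightening described above.
    y_bright = cv2.normalize(y_minus, None, 0, 255, cv2.NORM_MINMAX)
    # Dense optical flow between the two brightness channels.
    flow = cv2.calcOpticalFlowFarneback(y0, y_bright, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # Collapse the flow field to one global offset via the median vector.
    return float(np.median(flow[..., 0])), float(np.median(flow[..., 1]))

dx, dy = align_underexposed_to_normal(y0, y_minus)
# Shift the second image's label map so that pixels at the same coordinates
# correspond to the same real-world scene point.
m = np.float32([[1, 0, -dx], [0, 1, -dy]])
labels2 = cv2.warpAffine(labels2.astype(np.float32), m,
                         (labels2.shape[1], labels2.shape[0]),
                         flags=cv2.INTER_NEAREST).astype(labels2.dtype)
```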
  • each first light spot area belonging to a first intersection in the at least one first light spot area can be determined as an effective light spot area, wherein the first intersection is the at least one The intersection of the first light spot area and the at least one second light spot area.
  • The first intersection can be determined based on the position coordinates of the first light spot areas in the first image and the position coordinates of the second light spot areas in the second image, that is, a first light spot area and a second light spot area with the same position coordinates join the first intersection.
  • Alternatively, the first image marked with the first light spot areas and the second image marked with the second light spot areas are superimposed, and each first light spot area that overlaps a second light spot area in the second image is determined as an effective light spot area.
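In code, this superposition amounts to keeping each first-spot label whose pixels overlap the second-spot mask; a NumPy sketch with a hypothetical helper name:

```python
def screen_effective_spots(labels1: np.ndarray, labels2: np.ndarray):
    """Keep only first light spot areas that overlap some second light spot area."""
    effective = np.zeros_like(labels1)
    second_mask = labels2 > 0
    for label in np.unique(labels1):
        if label == 0:          # skip the background
            continue
        region = labels1 == label
        # A first spot area is effective if it intersects the second spot mask.
        if np.any(region & second_mask):
            effective[region] = label
    return effective

effective_labels = screen_effective_spots(labels1, labels2)
```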
  • step S103 a blur rendering process is performed on the first image according to the effective light spot area.
  • In an exemplary embodiment, the depth information of the current imaging scene can be calculated through the multi-camera system or a deep learning algorithm of the terminal device, the blur radius corresponding to each pixel outside the focus plane is then calculated based on the depth information, and finally a picture with a blur effect is generated based on the blur radius corresponding to each pixel.
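The disclosure leaves the rendering details open; as one toy illustration (not the claimed method), a layered Gaussian blur whose radius grows with distance from the focus plane, assuming a depth map normalized to [0, 1]:

```python
def render_bokeh(image: np.ndarray, depth: np.ndarray, focus_depth: float,
                 max_radius: int = 15):
    """Toy depth-based blur: the radius grows away from the focus plane."""
    out = image.copy()
    radius = np.clip(np.abs(depth - focus_depth) * max_radius, 0, max_radius)
    # Quantize the radii into a few levels and blur each level as one layer.
    for r in range(1, max_radius + 1, 2):
        layer = cv2.GaussianBlur(image, (2 * r + 1, 2 * r + 1), 0)
        mask = (radius >= r) & (radius < r + 2)
        out[mask] = layer[mask]
    return out
```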
  • According to the image processing method, since the first image is a normal exposure image and the second image is an underexposure image, areas mistakenly recognized as light spots in the normal exposure image, such as light-colored objects, will not be identified as light spot areas in the underexposure image. Therefore, the effective light spot areas screened out of the first light spot areas by using the second light spot areas are more accurate, and mistakenly identified first light spot areas are removed, so that the first image can be blur-rendered on the basis of the determined effective light spot areas, imitating the physical blur function of a professional camera. If this method is applied to the camera program of a terminal device, the functions of the camera program can be enriched and the camera effect can be brought closer to that of a professional camera.
  • In the present disclosure, the first image and the second image collected by the image acquisition device for the same scene are acquired, and light spot detection can be performed on each of them, thereby obtaining at least one first light spot area in the first image and at least one second light spot area in the second image; the second light spot areas in the second image can then be used to determine the effective light spot areas among the first light spot areas in the first image, that is, some or all of the first light spot areas are determined as effective light spot areas; finally, the first image can be blur-rendered based on these effective light spot areas.
  • Since the first image is a normal exposure image and the second image is an underexposure image, areas mistakenly recognized as light spots in the normal exposure image, such as light-colored objects, will not be recognized as light spot areas in the underexposure image. Therefore, the effective light spot areas screened out of the first light spot areas by using the second image are more accurate, and mistakenly identified first light spot areas are removed, thereby achieving a better blur rendering effect and preventing non-light-spot areas from being rendered with a light spot effect. If this method is applied to the camera program of a terminal device, the camera program can accurately identify the light spot areas in the image to be blurred, so that it can achieve a better blur rendering effect.
  • the color information in the spot area is easily lost, especially the color information in the overexposed spot area.
  • The U and V channels in an overexposed spot area have a patchy distribution, that is, the U and V channel values are all close to 128, so the color information in the effective spot areas in the first image may be lost. Therefore, after determining the effective light spot areas in the at least one first light spot area in the first image based on the at least one second light spot area in the second image, the color information can be restored in the following manner (see the sketch after the two cases below): first, determine the target pixel of each effective light spot area, where the target pixel is the pixel with the highest color saturation in the effective light spot area; the pixel with the highest color saturation mostly lies near the boundary of the effective light spot area, that is, within the halo. For example, the color saturation of each pixel in each effective light spot area can be judged based on its values in the U and V channels. Next, in each effective light spot area, the color parameter of each pixel is determined according to the color parameter of the target pixel, where the color parameter is used to blur-render the first image.
  • Specifically, when the color parameter difference between each pixel in the i-th effective light spot area and the target pixel of the i-th effective light spot area is less than the preset difference threshold, the color parameters of each pixel in the i-th effective light spot area are adjusted according to the color parameters of the target pixel, where i is an integer greater than 0 and not greater than N, and N is the total number of effective light spot areas in the first image. This is because in this case the original color in the effective light spot area is a single color, namely the color of the target pixel, so the color parameters of each pixel can be adjusted using the color parameters of the target pixel.
  • When the color parameter difference between at least one pixel in the j-th effective light spot area and the target pixel of the j-th effective light spot area is greater than the preset difference threshold, the color parameters of each pixel in the j-th effective light spot area are determined to remain unchanged, where j is an integer greater than 0 and not greater than N. This is because in this case the original colors in the effective spot area comprise at least two colors; if the color parameters of each pixel were adjusted using the color parameters of the target pixel, some pixels in the effective spot area would be adjusted to a color different from their original color.
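A sketch of this color restoration under the stated conditions; the saturation proxy (U/V distance from the neutral value 128) follows the description above, while the difference metric and the threshold value are assumptions:

```python
def restore_spot_color(yuv: np.ndarray, effective_labels: np.ndarray,
                       diff_threshold: float = 30.0):
    """Propagate the most saturated pixel's chroma inside single-color spots."""
    u = yuv[:, :, 1].astype(np.int16)
    v = yuv[:, :, 2].astype(np.int16)
    saturation = (u - 128) ** 2 + (v - 128) ** 2  # distance from neutral chroma
    out = yuv.copy()
    for label in np.unique(effective_labels):
        if label == 0:
            continue
        region = effective_labels == label
        # Target pixel: the most saturated pixel in this effective spot area.
        idx = np.argmax(np.where(region, saturation, -1))
        ty, tx = np.unravel_index(idx, saturation.shape)
        tu, tv = int(u[ty, tx]), int(v[ty, tx])
        # Color parameter difference of each pixel to the target pixel.
        diff = np.sqrt((u - tu) ** 2 + (v - tv) ** 2)
        if np.all(diff[region] < diff_threshold):
            # Single-color spot: adopt the target pixel's chroma everywhere.
            out[:, :, 1][region] = tu
            out[:, :, 2][region] = tv
        # Otherwise the spot holds several colors; leave it unchanged.
    return out
```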
  • In this way, the lost color information of the pixels in the effective light spot areas can be restored, so that in the image blur-rendered according to the effective light spot areas, the colors in the light spot areas stay true and color loss is avoided.
  • The brightness level of the effective light spot areas can be enhanced in the following manner: first, determine the target brightness parameter of each effective spot area, where the target brightness parameter is the brightness parameter of the pixel with the smallest brightness parameter in the effective spot area; the brightness parameter of a pixel is its value on the Y channel. Next, in each effective light spot area, the brightness parameter of each pixel is adjusted according to the target brightness parameter, where the brightness parameter is used to blur-render the first image.
  • In an exemplary embodiment, the brightness parameters of the pixels can be adjusted through a formula that characterizes a gamma curve, in which Y' is the adjusted brightness parameter of the pixel, Y is the original brightness parameter of the pixel, and Y_min is the target brightness parameter.
  • In this way, the energy response values of the pixels in the Y channel are remapped to enrich the brightness levels, so that in the image blur-rendered according to the effective light spot areas, the brightness of the light spot areas is realistic and layered.
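The gamma formula itself is not reproduced above; the sketch below assumes a plausible form, Y' = Y_min + (255 - Y_min)·((Y - Y_min)/(255 - Y_min))^γ, anchored at the target brightness Y_min, with γ a hypothetical tuning constant:

```python
def remap_spot_brightness(yuv: np.ndarray, effective_labels: np.ndarray,
                          gamma: float = 2.0):
    """Gamma-remap Y inside each effective spot, anchored at the area minimum."""
    y = yuv[:, :, 0].astype(np.float32)
    out = yuv.copy()
    for label in np.unique(effective_labels):
        if label == 0:
            continue
        region = effective_labels == label
        y_min = float(y[region].min())   # target brightness parameter Y_min
        span = max(255.0 - y_min, 1.0)
        # Assumed gamma curve: non-linearly stretch levels above Y_min.
        y_new = y_min + span * ((y[region] - y_min) / span) ** gamma
        out[:, :, 0][region] = np.clip(y_new, 0, 255).astype(np.uint8)
    return out
```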
  • FIG. 2 exemplarily shows the complete flow of the image processing method provided by the present disclosure.
  • Specifically, the normal exposure frame EV0 and the underexposure frame EV- collected by the image acquisition device are first obtained; YUV domain conversion is then performed on EV0 and EV- respectively, that is, they are converted to the YUV domain, and EV0 and EV- are aligned; brightness threshold detection is then performed on EV0 to obtain the first light spot pixels, and connected domain detection is performed to obtain the first light spot areas; the same detection is performed on EV- to obtain the second light spot areas, and the second light spot areas are used to filter the first light spot areas, that is, the intersection of the two is retained; it is then judged whether each remaining first light spot area needs color enhancement, and the overexposed light spots are color-enhanced; finally, the light spot energy values in the remaining first light spot areas are remapped, thereby improving the brightness level of the first light spot areas and obtaining the image to be rendered.
  • Afterwards, the image to be rendered can be blur-rendered; a compact recap of the whole flow is sketched below.
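Stitching the illustrative helpers from the earlier sketches together gives a compact picture of the Figure 2 flow; all names and thresholds come from those sketches, not from the disclosure:

```python
num1, labels1 = detect_spot_areas(y0, brightness_threshold=230)       # EV0
num2, labels2 = detect_spot_areas(y_minus, brightness_threshold=200)  # EV-
labels1 = filter_by_area(labels1, num1)          # connected domain detection
labels2 = filter_by_area(labels2, num2)
# (image alignment omitted for brevity; see the alignment sketch above)
effective = screen_effective_spots(labels1, labels2)   # keep the intersection
ev0_yuv = restore_spot_color(ev0_yuv, effective)       # color enhancement
ev0_yuv = remap_spot_brightness(ev0_yuv, effective)    # energy remapping
# ev0_yuv is now the image to be rendered; blur rendering follows.
```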
  • an image processing device is provided. Please refer to FIG. 3.
  • the device includes:
  • the acquisition module 301 is used to acquire the first image and the second image collected by the image acquisition device for the same scene, where the first image is a normal exposure image and the second image is an underexposure image;
  • Determining module 302 configured to determine an effective light spot area in at least one first light spot area in the first image according to at least one second light spot area in the second image;
  • the rendering module 303 is configured to perform blur rendering processing on the first image according to the effective light spot area.
  • the determining module is specifically used to:
  • the first light spot area belonging to the first intersection among the at least one first light spot area is determined as the effective light spot area, wherein the first intersection is the intersection of the at least one first light spot area and the at least one second light spot area.
  • a color module is also included for:
  • after determining an effective light spot area in at least one first light spot area in the first image based on at least one second light spot area in the second image, determine a target pixel for each effective light spot area, wherein the target pixel is the pixel with the highest color saturation in the effective light spot area;
  • the color parameter of each pixel is determined according to the color parameter of the target pixel, wherein the color parameter is used to perform a blur rendering process on the first image.
  • when the color module is used to determine the color parameters of each pixel in each effective light spot area according to the color parameters of the target pixel, it is specifically configured to:
  • when the color parameter difference between each pixel in the i-th effective light spot area and the target pixel of the i-th effective light spot area is less than a preset difference threshold, the color parameters of each pixel in the i-th effective light spot area are adjusted according to the color parameters of the target pixel, where i is an integer greater than 0 and not greater than N, and N is the total number of effective light spot areas in the first image;
  • when the color parameter difference between at least one pixel in the j-th effective light spot area and the target pixel of the j-th effective light spot area is greater than the preset difference threshold, the color parameters of each pixel in the j-th effective light spot area are determined to remain unchanged, where j is an integer greater than 0 and not greater than N.
  • a brightness module is also included for:
  • after determining an effective light spot area in at least one first light spot area in the first image based on at least one second light spot area in the second image, determine a target brightness parameter of each effective light spot area;
  • the target brightness parameter is the brightness parameter of the pixel with the smallest brightness parameter in the effective spot area
  • the brightness parameter of each pixel is adjusted according to the target brightness parameter, wherein the brightness parameter is used to perform blur rendering processing on the first image.
  • a detection module is also included, configured to perform light spot detection on the first image and the second image respectively to obtain the at least one first light spot area in the first image and the at least one second light spot area in the second image;
  • the detection module is specifically used to:
  • Pixels in the first image with brightness higher than the first brightness threshold are determined as first light spot pixels, and at least one connected domain composed of the first light spot pixels is determined as a first light spot area;
  • Pixels in the second image with brightness higher than the second brightness threshold are determined as second light spot pixels, and at least one connected area composed of the second light spot pixels is determined as a second light spot area.
  • when the detection module is used to determine at least one connected domain composed of the first light spot pixels as the first light spot area, it is specifically used to: determine at least one connected domain composed of the first light spot pixels and with a number of pixels within a preset number range as the first light spot area;
  • when the detection module is used to determine at least one connected area composed of the second light spot pixels as the second light spot area, it is specifically used to:
  • At least one connected area composed of the second light spot pixels and with a number of pixels within a preset number range is determined as a second light spot area.
  • the device 400 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like.
  • the device 400 may include one or more of the following components: a processing component 402, a memory 404, a power supply component 406, a multimedia component 408, an audio component 410, an input/output (I/O) interface 412, a sensor component 414, and communications component 416.
  • Processing component 402 generally controls the overall operations of device 400, such as operations associated with display, phone calls, data communications, camera program operations, and recording operations.
  • the processing element 402 may include one or more processors 420 to execute instructions to complete all or part of the steps of the above method.
  • processing component 402 may include one or more modules that facilitate interaction between processing component 402 and other components.
  • processing component 402 may include a multimedia module to facilitate interaction between multimedia component 408 and processing component 402.
  • Memory 404 is configured to store various types of data to support operations at device 400 . Examples of such data include instructions for any application or method operating on device 400, contact data, phonebook data, messages, pictures, videos, etc.
  • Memory 404 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk or optical disk.
  • Power component 406 provides power to various components of device 400 .
  • Power components 406 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power to device 400 .
  • Multimedia component 408 includes a screen that provides an output interface between the device 400 and the user.
  • the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user.
  • the touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of the touch or sliding operation, but also detect the duration and pressure associated with the touch or sliding operation.
  • multimedia component 408 includes a front-facing camera and/or a rear-facing camera. When the device 400 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera may receive external multimedia data.
  • Each front-facing camera and rear-facing camera can be a fixed optical lens system or have focal length and optical zoom capability.
  • Audio component 410 is configured to output and/or input audio signals.
  • audio component 410 includes a microphone (MIC) configured to receive external audio signals when device 400 is in operating modes, such as call mode, recording mode, and voice recognition mode. The received audio signals may be further stored in memory 404 or sent via communication component 416 .
  • audio component 410 also includes a speaker for outputting audio signals.
  • the I/O interface 412 provides an interface between the processing component 402 and a peripheral interface module, which may be a keyboard, a click wheel, a button, etc. These buttons may include, but are not limited to: Home button, Volume buttons, Start button, and Lock button.
  • Sensor component 414 includes one or more sensors for providing various aspects of status assessment for device 400 .
  • For example, the sensor component 414 can detect the open/closed state of the device 400 and the relative positioning of components, such as the display and keypad of the device 400; the sensor component 414 can also detect a change in position of the device 400 or a component of the device 400, the presence or absence of user contact with the device 400, the orientation or acceleration/deceleration of the device 400, and temperature changes of the device 400.
  • Sensor assembly 414 may also include a proximity sensor configured to detect the presence of nearby objects without any physical contact.
  • Sensor assembly 414 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications.
  • the sensor component 414 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • Communication component 416 is configured to facilitate wired or wireless communication between apparatus 400 and other devices.
  • the device 400 can access a wireless network based on a communication standard, such as WiFi, 2G or 3G, 4G or 5G or a combination thereof.
  • the communication component 416 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel.
  • the communications component 416 also includes a near field communications (NFC) module to facilitate short-range communications.
  • the NFC module can be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology and other technologies.
  • In an exemplary embodiment, apparatus 400 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors or other electronic components, for performing the above image processing method.
  • In an exemplary embodiment, the present disclosure also provides a non-transitory computer-readable storage medium including instructions, such as the memory 404 including instructions.
  • The instructions can be executed by the processor 420 of the device 400 to complete the above image processing method.
  • The non-transitory computer-readable storage medium may be a ROM, random access memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The present disclosure relates to an image processing method and apparatus, an electronic device, and a storage medium. The method comprises: acquiring a first image and a second image collected by an image acquisition device for the same scene, the first image being a normal exposure image and the second image being an underexposure image; determining an effective light spot area in at least one first light spot area in the first image according to at least one second light spot area in the second image; and performing blur rendering processing on the first image according to the effective light spot area.
PCT/CN2022/098240 2022-06-10 2022-06-10 Image processing method and apparatus, electronic device and storage medium WO2023236209A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2022/098240 WO2023236209A1 (fr) 2022-06-10 2022-06-10 Image processing method and apparatus, electronic device and storage medium
CN202280004273.6A CN117616777A (zh) 2022-06-10 2022-06-10 Image processing method and apparatus, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/098240 WO2023236209A1 (fr) 2022-06-10 2022-06-10 Image processing method and apparatus, electronic device and storage medium

Publications (1)

Publication Number Publication Date
WO2023236209A1 true WO2023236209A1 (fr) 2023-12-14

Family

ID=89117464

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/098240 WO2023236209A1 (fr) 2022-06-10 2022-06-10 Image processing method and apparatus, electronic device and storage medium

Country Status (2)

Country Link
CN (1) CN117616777A (fr)
WO (1) WO2023236209A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103916610A (zh) * 2013-01-07 2014-07-09 通用汽车环球科技运作有限责任公司 Glare reduction for dynamic rearview mirror
CN105635593A (zh) * 2014-10-13 2016-06-01 广达电脑股份有限公司 Multiple exposure imaging system and white balance method thereof
CN107197146A (zh) * 2017-05-31 2017-09-22 广东欧珀移动通信有限公司 Image processing method and related product
US20220005169A1 (en) * 2018-11-29 2022-01-06 Samsung Electronics Co., Ltd. Image processing method and electronic device supporting same
CN114565517A (zh) * 2021-12-29 2022-05-31 骨圣元化机器人(深圳)有限公司 Image denoising method and apparatus for infrared camera, and computer device

Also Published As

Publication number Publication date
CN117616777A (zh) 2024-02-27

Similar Documents

Publication Publication Date Title
WO2019183813A1 (fr) Image capture device and method
AU2016200002B2 (en) High dynamic range transition
KR101800101B1 (ko) Method, apparatus, program and recording medium for adjusting shooting parameters
CN109345485B (zh) Image enhancement method and apparatus, electronic device and storage medium
CN109379572B (zh) Image conversion method and apparatus, electronic device and storage medium
WO2020034737A1 (fr) Imaging control method and apparatus, electronic device, and computer-readable storage medium
CN111586282B (zh) Shooting method and apparatus, terminal and readable storage medium
CN108154465B (zh) Image processing method and apparatus
KR101930460B1 (ko) Photographing apparatus and control method
JP2012199675A (ja) Image processing apparatus, image processing method and program
US9025050B2 (en) Digital photographing apparatus and control method thereof
CN108040204B (zh) Multi-camera-based image shooting method, apparatus and storage medium
WO2023071933A1 (fr) Camera photographing parameter adjustment method and apparatus, and electronic device
JP2013017218A (ja) Image processing apparatus, image processing method and program
CN111586280B (zh) Shooting method and apparatus, terminal and readable storage medium
CN108156381B (zh) Photographing method and apparatus
WO2023236209A1 (fr) Image processing method and apparatus, electronic device and storage medium
WO2016074414A1 (fr) Image processing method and apparatus
KR102512787B1 (ko) Method, apparatus and medium for displaying a shooting preview image
WO2023236215A1 (fr) Image processing method and apparatus, and storage medium
WO2023231009A1 (fr) Focusing method and apparatus, and storage medium
US20240005521A1 (en) Photographing method and apparatus, medium and chip
US20230410260A1 (en) Method and apparatus for processing image
WO2023220957A1 (fr) Image processing method and apparatus, mobile terminal and storage medium
EP3941042A1 (fr) Image processing method and apparatus, camera assembly, electronic device and storage medium

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 202280004273.6

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22945350

Country of ref document: EP

Kind code of ref document: A1