WO2021208706A1 - High dynamic range image synthesis method and apparatus, image processing chip, and aerial camera - Google Patents

High dynamic range image synthesis method and apparatus, image processing chip, and aerial camera

Info

Publication number
WO2021208706A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
synthesized
brightness
exposure
images
Prior art date
Application number
PCT/CN2021/083350
Other languages
English (en)
French (fr)
Inventor
李昭早
Original Assignee
深圳市道通智能航空技术股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市道通智能航空技术股份有限公司
Publication of WO2021208706A1 publication Critical patent/WO2021208706A1/zh
Priority to US17/938,517 (published as US20230038844A1)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/741 Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/68 Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
    • H04N23/681 Motion detection
    • H04N23/6811 Motion detection based on the image signal
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/71 Circuitry for evaluating the brightness variation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265 Mixing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10141 Special mode during image acquisition
    • G06T2207/10144 Varying exposure
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10141 Special mode during image acquisition
    • G06T2207/10152 Varying illumination
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20172 Image enhancement details
    • G06T2207/20208 High dynamic range [HDR] image processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Definitions

  • the present invention relates to the technical field of image processing, in particular to a high dynamic range image synthesis method, device, image processing chip and aerial camera.
  • High dynamic range images are image data that can provide more bright and dark image details than ordinary images, and can better reflect the visual effects in the real environment.
  • In the prior art, a high dynamic range image is synthesized from multiple frames of ordinary images (i.e., low dynamic range images) with different exposure times, using the optimal details corresponding to each exposure time.
  • the HDR image obtained by synthesizing multiple frames of ordinary images through multiple exposures is prone to trailing of moving objects and reduced picture clarity in application scenarios with fast relative motion, such as aerial photography; bright and dark artifacts may even appear.
  • the embodiments of the present invention aim to provide a high dynamic range image synthesis method, device, image processing chip, and aerial camera, which can solve the defects of existing HDR image synthesis methods.
  • a high dynamic range image synthesis method, including: acquiring multiple frames of images to be synthesized, each frame having a different exposure time; calculating the average brightness of the images to be synthesized, and determining the image brightness type of the images to be synthesized according to the average brightness; calculating the brightness difference between adjacent pixels in the same frame of the image to be synthesized, and calculating the inter-frame difference of the images to be synthesized at the same pixel position according to the brightness difference; determining, according to the inter-frame difference, the motion state of the image at the pixel position; and, according to the image brightness type and the motion state, weighting and synthesizing the images to be synthesized into a corresponding high dynamic range image.
  • the image brightness type includes: a high-illuminance scene and a low-illuminance scene; determining the image brightness type of the image to be synthesized according to the average brightness specifically includes: when the average brightness is greater than or equal to a preset brightness detection threshold, determining that the image brightness type of the image to be synthesized is a high-illuminance scene; when the average brightness is less than the brightness detection threshold, determining that the image brightness type of the image to be synthesized is a low-illuminance scene.
  • calculating the average brightness of the images to be synthesized specifically includes: superimposing the brightness value of each pixel in an image to be synthesized to obtain an accumulated brightness value; summing the accumulated brightness values of each frame to be synthesized to obtain a total brightness value; and calculating the average brightness value according to the total brightness value, the number of images to be synthesized, and the size of the images to be synthesized.
  • the image to be synthesized includes: a short-exposure image, a medium-exposure image, and a long-exposure image obtained by continuous shooting; the exposure time of the short-exposure image is shorter than that of the medium-exposure image, and the exposure time of the medium-exposure image is shorter than that of the long-exposure image.
  • the motion state includes: moving pixels and stationary pixels; determining the motion state of the image to be synthesized at a pixel position according to the inter-frame difference specifically includes: determining whether the inter-frame difference between the short-exposure image and the medium-exposure image, and the inter-frame difference between the medium-exposure image and the long-exposure image, are greater than or equal to a preset motion detection threshold; if yes, determining that the motion state at the pixel position is a moving pixel; if not, determining that the motion state at the pixel position is a stationary pixel.
  • calculating the brightness difference between adjacent pixels in the same image to be synthesized includes: calculating a first brightness difference between a target pixel and an adjacent first pixel, and a second brightness difference between the target pixel and an adjacent second pixel; and taking the difference between the first brightness difference and the second brightness difference as the brightness difference of the target pixel.
  • the weighted synthesis of the plurality of images to be synthesized into corresponding high dynamic range images according to the image brightness type and the motion state specifically includes: when the motion state of the pixel position is a stationary pixel and the image brightness type is a high-illuminance image, the pixels of the short-exposure image, the medium-exposure image, and the long-exposure image at the pixel position are weighted according to the short-exposure weight coefficient, medium-exposure weight coefficient, and long-exposure weight coefficient, and synthesized into the pixel of the high dynamic range image at the same pixel position.
  • the weighted synthesis of the plurality of images to be synthesized into corresponding high dynamic range images according to the image brightness type and the motion state specifically includes: when the motion state of the pixel position is a stationary pixel and the image brightness type is a low-illuminance image, the pixels of the medium-exposure image and the long-exposure image at the pixel position are weighted and synthesized into the pixel of the high dynamic range image at the same pixel position; when the motion state is a moving pixel and the image brightness type is a low-illuminance image, the medium-exposure image and the long-exposure image are discarded, and the pixel of the short-exposure image at the pixel position is weighted by the short-exposure weight coefficient to obtain the pixel of the high dynamic range image at the same pixel position.
  • a high dynamic range image synthesis device including:
  • the image frame acquisition module is used to acquire multiple frames of images to be synthesized, each frame having a different exposure time; the brightness detection module is used to calculate the average brightness of the images to be synthesized and determine the image brightness type of the images to be synthesized according to the average brightness; the secondary difference calculation module is used to calculate the brightness difference between adjacent pixels in the same frame of the image to be synthesized, and to calculate the inter-frame difference of the images to be synthesized at the same pixel position according to the brightness difference; the motion detection module is used to determine the motion state of the image to be synthesized at the pixel position according to the inter-frame difference; the synthesis module is used to weight and synthesize the images to be synthesized into the corresponding high dynamic range image according to the image brightness type and the motion state.
  • an image processing chip including: a processor and a memory communicatively connected with the processor; computer program instructions are stored in the memory, and when the computer program instructions are called by the processor, the processor executes the above-mentioned high dynamic range image synthesis method.
  • an aerial camera includes:
  • an image sensor, used to collect multiple frames of images with set shooting parameters; a controller, connected to the image sensor and used to trigger the image sensor to collect multiple frames of images with different exposure times; an image processor, used to receive the multiple frames of images collected by the image sensor through continuous exposure and to perform the above-mentioned high dynamic range image synthesis method on the received frames to obtain a high dynamic range image; and a storage device, connected to the image processor and used to store the high dynamic range image.
  • the HDR image synthesis method of the embodiment of the present invention adaptively adjusts the weight ratio of the different ordinary images during HDR synthesis according to the motion state and the image brightness type of the images to be synthesized, effectively avoiding the trailing of moving objects and the loss of picture clarity that occur when multi-frame images are synthesized into an HDR image.
  • FIG. 1 is a schematic diagram of an application scenario of a high dynamic range image synthesis method according to an embodiment of the present invention
  • FIG. 2 is a structural block diagram of an aerial camera provided by an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of a high dynamic range image synthesis device provided by an embodiment of the present invention.
  • FIG. 4 is a method flowchart of a high dynamic range image synthesis method according to an embodiment of the present invention;
  • FIG. 5 is a method flowchart of a method for determining a motion state according to an embodiment of the present invention;
  • FIG. 6 is a schematic diagram of pixel positions provided by an embodiment of the present invention.
  • FIG. 7 is a schematic structural diagram of an image processing chip provided by an embodiment of the present invention.
  • the high dynamic range image is composed of multiple frames of ordinary images with different exposure times to better show the bright and dark details.
  • Fig. 1 is an application scenario of a high dynamic range image synthesis method provided by an embodiment of the present invention.
  • a drone 10 equipped with an aerial camera, a smart terminal 20 and a wireless network 30 are included.
  • the UAV 10 may be an unmanned aerial vehicle driven by any type of power, including but not limited to a quadcopter, a fixed-wing aircraft, or a helicopter model. It can be provided with the appropriate size or power as the actual situation requires, so as to offer a load capacity, flight speed, and flight range that meet the needs of use.
  • the aerial camera can be any type of image acquisition device, including a sports camera, a high-definition camera, or a wide-angle camera.
  • its aerial camera can be installed and fixed on the drone through a fixed bracket such as a pan/tilt, and controlled by the drone 10 to perform image collection tasks.
  • one or more functional modules can also be added to the drone so that it can achieve corresponding functions, such as a built-in main control chip that serves as the control core of drone flight and data transmission, or a video transmission device that uploads the collected image information to a device connected with the drone.
  • the smart terminal 20 may be any type of smart device used to establish a communication connection with the drone, such as a mobile phone, a tablet computer, or a smart remote control.
  • the smart terminal 20 may be equipped with one or more different user interaction devices to collect user instructions or display and feedback information to the user.
  • For example, the smart terminal 20 may include buttons, a display screen, a touch screen, speakers, and remote-control joysticks.
  • the smart terminal 20 may be equipped with a touch screen, through which the user's remote-control instructions for the drone are received and the image information obtained by the aerial camera is displayed to the user; the user can also switch the image information currently displayed on the screen through the touch screen.
  • the UAV 10 and the smart terminal 20 can also integrate existing image visual processing technologies to further provide more intelligent services.
  • the drone 10 can collect images through an aerial camera, and then the smart terminal 20 analyzes the operation gestures in the image, and finally realizes the user's gesture control of the drone 10.
  • the wireless network 30 may be a wireless communication network based on any type of data transmission principle for establishing a data transmission channel between two nodes, such as a Bluetooth network, a WiFi network, a wireless cellular network, or a combination thereof located in a specific signal frequency band.
  • FIG. 2 is a structural block diagram of an aerial camera 11 provided by an embodiment of the present invention.
  • the aerial camera 11 may include: an image sensor 111, a controller 112 and an image processor 113.
  • the image sensor 111 is a functional module used to collect images with set shooting parameters.
  • the optical signal corresponding to the visual image is projected onto the photosensitive element through the lens and related optical components, and the optical signal is converted into the corresponding electrical signal by the photosensitive element.
  • the shooting parameter is a parameter variable that can be adjusted related to the lens and related optical component structure (such as the shutter) during the image acquisition process of the image sensor 111, such as aperture, focal length, or exposure time.
  • the image sensor 111 can acquire one frame of image for each exposure.
  • the controller 112 is the control core of the image sensor 111. It is connected to the image sensor, and can correspondingly control the shooting behavior of the image sensor 111 according to the received instruction, for example, set one or more shooting parameters of the image sensor 111.
  • the controller 112 can trigger the image sensor to continuously acquire multiple frames of images with different exposure times.
  • the number of collected images is a constant value manually set, which may be a default value preset by a technician, or a value set by the user according to the synthesis needs of high dynamic range images during use.
  • three frames of images with different exposure times can be continuously collected. According to the length of the exposure time, they are called short-exposure images, medium-exposure images, and long-exposure images.
  • the image processor 113 is a functional module for synthesizing high dynamic range images. It can receive multiple frames of images continuously collected by the image sensor and synthesize them into corresponding high dynamic range images (HDR).
  • the aerial camera may further include a storage device 114 for storing data information generated by the aerial camera 11 during use, such as storing an image to be synthesized, a synthesized high dynamic range image, and the like.
  • any type of non-volatile memory with suitable capacity, such as an SD card, flash memory, or a solid-state drive, can be used.
  • the storage device 114 may also be a detachable structure or a distributed arrangement structure.
  • the aerial camera may only be provided with a data interface, and data such as an image to be synthesized or a high dynamic range image can be transferred to a corresponding device for storage through the data interface.
  • one or more functional modules (such as the controller, the image processor, and the storage device) of the aerial camera 11 shown in FIG. 2 can also be integrated into the UAV 10 as part of the UAV 10.
  • the functional modules of the aerial camera 11 are only exemplarily described based on the image acquisition process, and are not used to limit the functional modules of the aerial camera 11.
  • Fig. 3 is a structural block diagram of a high dynamic range image synthesis device provided by an embodiment of the present invention.
  • the high dynamic range image synthesis device can be implemented by the above-mentioned image processor.
  • the composition of the high dynamic range image synthesis device is described in the form of functional modules.
  • the functional modules shown in FIG. 3 can be selectively implemented through software, hardware, or a combination of software and hardware according to actual needs; for example, they can be implemented by the processor calling a related software application stored in the memory.
  • the high dynamic range image synthesis device 300 includes: an image frame acquisition module 310, a brightness detection module 320, a secondary difference calculation module 330, a motion detection module 340, and a synthesis module 350.
  • the image frame acquisition module 310 is used to acquire multiple frames of images to be synthesized, and each frame of the images to be synthesized has a different exposure time.
  • the image to be synthesized is the image data information acquired by the image sensor with one exposure collection.
  • the images to be synthesized obtained by continuous acquisition can be combined into an image set for synthesizing the final high dynamic range image.
  • the brightness detection module 320 is configured to calculate the average brightness of the image to be synthesized, and determine the image brightness type of the image to be synthesized according to the average brightness.
  • the image to be synthesized may have significantly different image brightness due to the different environment in which it was taken.
  • each image to be synthesized can be roughly divided into different image brightness types according to the difference in image brightness.
  • the image to be synthesized can be divided into two different image brightness types: high-illuminance images and low-illuminance images according to daytime shooting and nighttime shooting.
  • the secondary difference calculation module 330 is used to calculate the brightness difference between adjacent pixels in the same frame of the image to be synthesized, and to calculate the inter-frame difference of the images to be synthesized at the same pixel position according to the brightness difference. The inter-frame difference shows the change of a specific area between different images to be synthesized.
  • the motion detection module 340 is configured to determine the motion state of the image to be synthesized at the pixel point position according to the difference between the frames.
  • the difference between frames obtained by the quadratic difference calculation indicates the dynamic change of a certain area in time. Therefore, it can be judged whether different positions of the image have moved based on this, so as to determine the movement state of these pixel positions.
  • the specific motion states can be defined according to the needs of the actual situation.
  • the motion state can be simply divided into motion pixels and static pixels.
  • a moving pixel indicates that the image at the pixel point has moved.
  • the static pixel indicates that the image at the pixel position has not moved.
  • the synthesis module 350 is configured to weight and synthesize the image to be synthesized into a corresponding high dynamic range image according to the image brightness type and the motion state.
  • Weighted synthesis refers to assigning corresponding weight values to the different images to be synthesized and finally synthesizing them to obtain the required high dynamic range image. By adjusting the weight values of the images to be synthesized, images with poorer quality can be given less consideration, reducing their influence on the quality of the high dynamic range image.
  • the weight value of the image to be synthesized is adaptively adjusted and considered based on the image brightness type and motion state, thereby effectively avoiding the interference of part of the image to be synthesized, which is beneficial to improving the quality of the HDR image.
  • the application scenario shown in Figure 1 takes an aerial camera applied to a drone as an example.
  • the high dynamic range image synthesis method can also be used in other types of scenes and devices to improve the quality of the output high dynamic range image.
  • the high dynamic range image synthesis method disclosed in the embodiment of the present invention is not limited to being applied to the unmanned aerial vehicle shown in FIG. 1.
  • FIG. 4 is a method flowchart of a high dynamic range image synthesis method provided by an embodiment of the present invention. As shown in FIG. 4, the image processing method includes the following steps:
  • each frame of the image to be synthesized has a different exposure time.
  • the specific exposure time can be set according to the needs of the actual situation, which is an empirical value and will not be repeated here.
  • These images to be synthesized are all obtained by continuous shooting and serve as the data basis for synthesizing a high dynamic range image.
  • Average brightness is the overall image brightness of the images to be synthesized, reflecting the illuminance of the surrounding environment when the images were taken. The higher the average brightness, the better the illumination of the surrounding environment when the images to be synthesized were shot.
  • the average brightness can be calculated in the following manner:
  • the brightness value of each pixel in the image to be synthesized is superimposed to obtain the accumulated brightness value.
  • the accumulated brightness values of each frame to be synthesized are summed to obtain the total brightness value.
  • the average brightness value is calculated according to the total brightness value, the number of images to be synthesized, and the size of the images to be synthesized. In this way, the per-pixel average brightness over the multiple frames of images to be synthesized can be calculated and used as the "average brightness".
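The averaging steps above can be sketched as follows. This is a minimal illustration only; the function name and the use of NumPy are assumptions, not part of the patent.

```python
import numpy as np

def average_brightness(frames):
    """frames: list of 2-D arrays of per-pixel brightness values.

    Per the described steps: superimpose the brightness of every pixel in
    each frame (accumulated value), sum the per-frame accumulated values
    (total value), then divide by frame count times frame size.
    """
    total = sum(float(f.sum()) for f in frames)  # total brightness value
    n = len(frames)                              # number of images to be synthesized
    h, w = frames[0].shape                       # size of each image
    return total / (n * h * w)
```

For example, three uniform 2x2 frames with brightness 10, 20, and 30 yield an average brightness of 20.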
  • the image brightness type is a type that is pre-defined or divided according to the brightness. In different use cases, it can be divided into an appropriate number of image brightness types according to use needs, so that images to be synthesized with similar average brightness are regarded as the same image brightness type for further processing.
  • the brightness type of the image to which the image to be synthesized belongs can be determined by setting an appropriate brightness detection threshold. For example, when the image brightness type includes a high-illuminance image and a low-illuminance image, a brightness detection threshold can be preset.
  • When the average brightness is greater than or equal to a preset brightness detection threshold, it is determined that the image to be synthesized is a high-illuminance image.
  • When the average brightness is less than the brightness detection threshold, it is determined that the image to be synthesized is a low-illuminance image.
  • the brightness detection threshold is an empirical value, which can be set according to the needs of the actual situation.
  • a high-illuminance image corresponds to a scene with sufficient light, such as daytime or artificial lighting.
  • the low-illuminance image indicates that the shooting scene of the image to be synthesized is a scene with severely insufficient light such as at night.
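The thresholding rule can be sketched as follows. The threshold value of 64 is an assumed placeholder; the patent only states that it is an empirical value.

```python
BRIGHTNESS_THRESHOLD = 64  # assumed empirical value; not specified in the text

def image_brightness_type(avg_brightness, threshold=BRIGHTNESS_THRESHOLD):
    """Classify a set of images to be synthesized by average brightness.

    >= threshold -> high-illuminance (daytime or well-lit scene)
    <  threshold -> low-illuminance  (night or severely under-lit scene)
    """
    return "high" if avg_brightness >= threshold else "low"
```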
  • the image to be synthesized is actually composed of many different pixels. A pixel is the smallest basic unit in a frame of image. In the same frame of image, the difference between adjacent pixels roughly reflects the texture of the subject.
  • the specific calculation method of the brightness difference may include the following steps:
  • the target pixel is the pixel currently selected for calculation to determine its motion state. As shown in FIG. 6, any pixel is bordered by 8 adjacent pixels. The first pixel and the second pixel are two of the 8 pixels located around the target pixel.
  • the inter-frame difference is calculated from the difference between the brightness differences of different images to be synthesized at the same position.
  • the texture of the images taken continuously in multiple frames at the same position should not change significantly.
  • the difference between frames based on the secondary difference can reflect the movement of the subject.
  • When the inter-frame difference is too large, it indicates that the subject is moving violently.
  • When the inter-frame difference is low, it indicates that the position of the subject has basically not changed.
  • the secondary difference provided in this embodiment measures the difference between different images to be synthesized based on the brightness difference between adjacent pixels, which effectively avoids the influence of the overall brightness difference between the images to be synthesized and serves as an accurate basis for motion detection.
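The secondary (two-stage) difference described above can be sketched as follows. Function names and the neighbour-offset representation are assumptions for illustration; note how a uniform exposure offset between two frames cancels out, which is the property the text relies on.

```python
import numpy as np

def brightness_difference(frame, y, x, n1, n2):
    """Brightness difference of target pixel (y, x) within one frame.

    n1, n2: (dy, dx) offsets of two of the eight neighbours (cf. FIG. 6).
    First difference: target vs. first neighbour; second difference:
    target vs. second neighbour; return the difference between them.
    """
    d1 = float(frame[y, x]) - float(frame[y + n1[0], x + n1[1]])
    d2 = float(frame[y, x]) - float(frame[y + n2[0], x + n2[1]])
    return d1 - d2

def inter_frame_difference(frame_a, frame_b, y, x, n1, n2):
    """Secondary difference: change of the per-frame texture measure between
    two frames at the same pixel position. Because each frame's own
    brightness difference is taken first, a constant brightness offset
    between the two exposures cancels out."""
    return abs(brightness_difference(frame_a, y, x, n1, n2)
               - brightness_difference(frame_b, y, x, n1, n2))
```

If frame_b is simply frame_a with 50 added to every pixel (a uniform exposure change), the inter-frame difference is zero, so only genuine texture changes register as motion.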
  • the state of motion refers to whether the subject is moving. Specifically, the pixel positions where the photographed object has moved are called moving pixels, and the pixel positions where the subject has not moved are called stationary pixels.
  • Taking as an example images to be synthesized that include a short-exposure image, a medium-exposure image, and a long-exposure image obtained by continuous shooting (where the exposure time of the short-exposure image is shorter than that of the medium-exposure image, and the exposure time of the medium-exposure image is shorter than that of the long-exposure image), the specific determination process of the motion state is described in detail below.
  • the method of judging the motion state includes:
  • Step 520: determine whether both K1 (the inter-frame difference between the short-exposure image and the medium-exposure image) and K2 (the inter-frame difference between the medium-exposure image and the long-exposure image) are less than or equal to a preset motion detection threshold. If not, go to step 530; if yes, go to step 540.
  • the motion detection threshold is an empirical value, which can be set according to the needs of the actual situation.
  • “Motion pixel” means that the subject at the pixel point has moved. In this embodiment, all pixels at this position of the image to be synthesized are called “moving pixels”.
  • “Still pixel” means that the subject at the pixel position has not moved. In this embodiment, all pixels of the image to be synthesized at this position are called “still pixels”.
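The comparison in step 520 can be sketched as follows. The threshold value is an assumed placeholder, since the text only calls it an empirical value.

```python
MOTION_THRESHOLD = 8.0  # assumed empirical value; set per actual situation

def motion_state(k1, k2, threshold=MOTION_THRESHOLD):
    """k1: inter-frame difference between short- and medium-exposure images;
    k2: inter-frame difference between medium- and long-exposure images.

    Both at or below the threshold -> stationary pixel (step 540);
    otherwise -> moving pixel (step 530).
    """
    return "still" if (k1 <= threshold and k2 <= threshold) else "moving"
```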
  • The two indicators of image brightness type and motion state can well reflect the scene conditions when the images to be synthesized were shot. Accordingly, the weight of each frame to be synthesized in the synthesis process can be adjusted adaptively, so that the synthesized high dynamic range image has better image quality.
  • each pixel of the same high dynamic range image is calculated and determined by the pixel of each image to be synthesized at the same pixel position through weighted synthesis.
  • the specific weighting calculation process is as follows:
  • The short-exposure, medium-exposure, and long-exposure weight coefficients are all preset weight values that can be adjusted or set according to actual needs; they indicate the default weight ratio of each frame to be synthesized when synthesizing the high dynamic range image.
  • When the motion state at the pixel position is a static pixel and the image brightness type is a high-illuminance image, the pixels of the short-exposure, medium-exposure, and long-exposure images at the pixel position are weighted and synthesized into the pixel of the high dynamic range image at the same pixel position according to the short-exposure, medium-exposure, and long-exposure weight coefficients.
  • When the motion state at the pixel position is a motion pixel and the image brightness type is a high-illuminance image, the short-exposure image and the long-exposure image are discarded, and the pixel of the high dynamic range image at the same pixel position is weighted and synthesized from the pixel of the medium-exposure image at that position and the medium-exposure weight coefficient.
  • When the motion state is a static pixel and the image brightness type is a low-illuminance image, the short-exposure image is discarded, and the pixels of the medium-exposure image and the long-exposure image at the pixel position are weighted and synthesized into the pixel of the high dynamic range image at the same pixel position according to the medium-exposure and long-exposure weight coefficients.
  • Under low illuminance, an image to be synthesized with a shorter exposure time exhibits more light noise and lower image quality; the weight coefficient of the short-exposure image can therefore be adjusted to zero in such a situation to avoid adversely affecting the final synthesized high dynamic range image.
  • When the motion state is a motion pixel and the image brightness type is a low-illuminance image, the weight coefficients of the short-exposure image and the medium-exposure image can be adjusted to zero to avoid adverse effects on the final synthesized high dynamic range image.
  • In this way, the interference of some low-quality images to be synthesized is avoided, so that the final output high dynamic range image achieves high dynamic range and high definition in static daytime scenes, and low picture noise with no trailing of moving objects in night scenes.
  • Assume the image sensor continuously collects, each time, a short-exposure image with exposure time x/2, a medium-exposure image with exposure time x, and a long-exposure image with exposure time 2x as the images to be synthesized.
  • Each image to be synthesized is w pixels long and h pixels wide.
  • Preferably, the short-exposure image is sent to the image processor first, as soon as it is captured, with the medium-exposure image and the long-exposure image sent in sequence, so that the entire image synthesis process has minimal latency.
  • The brightness detection module 320 calculates the average brightness of the images to be synthesized by the following formula (1):
  • L = (Σ_{i=1..h} Σ_{j=1..w} (S(i,j) + M(i,j) + L(i,j))) / (3 × w × h)  (1)
  • S(i,j) is the brightness value of the pixel in the i-th row and j-th column of the short-exposure image
  • M(i,j) is the brightness value of the pixel in the i-th row and j-th column of the medium-exposure image
  • L(i,j) is the brightness value of the pixel in the i-th row and j-th column of the long-exposure image
  • L (without subscripts) is the average brightness.
  • The brightness detection module 320 determines whether the average brightness L is greater than or equal to a preset brightness detection threshold T. When L is greater than or equal to T, the image brightness type of the images to be synthesized is determined to be a high-illuminance image; when L is less than T, it is determined to be a low-illuminance image.
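The brightness classification above can be sketched in a few lines of Python. This is a minimal illustration with toy 2×2 frames and an arbitrary threshold, not values taken from the patent:

```python
def average_brightness(short, mid, long_):
    # Formula (1): mean luminance over every pixel of all three exposures.
    total = sum(v for img in (short, mid, long_) for row in img for v in row)
    count = sum(len(row) for img in (short, mid, long_) for row in img)
    return total / count

def brightness_type(short, mid, long_, threshold):
    # High-illuminance when the average reaches the detection threshold T.
    avg = average_brightness(short, mid, long_)
    return "high" if avg >= threshold else "low"

S = [[10, 12], [11, 13]]   # toy 2x2 short exposure
M = [[40, 42], [41, 43]]   # medium exposure
L = [[80, 82], [81, 83]]   # long exposure

print(round(average_brightness(S, M, L), 2))   # 44.83
print(brightness_type(S, M, L, threshold=50))  # low
```

In a real pipeline the three frames would be full w×h luminance planes and the threshold T would be an empirically tuned constant, as the text notes.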
  • The secondary difference calculation module 330 calculates the brightness difference between adjacent pixels through the following formulas (2-1) to (2-3):
  • ΔS(i,j) = ||S(i,j) − S(i+1,j)| − |S(i,j) − S(i,j+1)||  (2-1)
  • ΔM(i,j) = ||M(i,j) − M(i+1,j)| − |M(i,j) − M(i,j+1)||  (2-2)
  • ΔL(i,j) = ||L(i,j) − L(i+1,j)| − |L(i,j) − L(i,j+1)||  (2-3)
  • ΔS(i,j) is the brightness difference of the pixel in the i-th row and j-th column of the short-exposure image
  • ΔM(i,j) is the brightness difference of the pixel in the i-th row and j-th column of the medium-exposure image
  • ΔL(i,j) is the brightness difference of the pixel in the i-th row and j-th column of the long-exposure image.
  • Based on the brightness differences, the inter-frame difference between the short-exposure image and the medium-exposure image can be calculated as |ΔS(i,j) − ΔM(i,j)|, and the inter-frame difference between the medium-exposure image and the long-exposure image as |ΔM(i,j) − ΔL(i,j)|.
  • The motion detection module 340 determines, based on the differences calculated by the secondary difference calculation module 330, whether both inter-frame differences are less than the preset motion detection threshold A.
  • When both inter-frame differences are less than A, the motion state of the pixel position (i, j) is a static pixel; when either of them is greater than or equal to A, the motion state of the pixel position (i, j) is a motion pixel.
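The secondary-difference and motion-detection steps above can be sketched as follows. The luminance values and the threshold A are illustrative toy choices, and border pixels (where i+1 or j+1 falls outside the frame) are ignored for brevity:

```python
def second_diff(img, i, j):
    # Brightness difference at (i, j) per formulas (2-1)-(2-3): the absolute
    # difference between the downward and rightward neighbour differences.
    d_down = abs(img[i][j] - img[i + 1][j])
    d_right = abs(img[i][j] - img[i][j + 1])
    return abs(d_down - d_right)

def motion_state(S, M, L, i, j, A):
    # Static only if BOTH inter-frame differences fall below threshold A.
    k1 = abs(second_diff(S, i, j) - second_diff(M, i, j))
    k2 = abs(second_diff(M, i, j) - second_diff(L, i, j))
    return "static" if k1 < A and k2 < A else "motion"

S = [[10, 12], [11, 13]]          # toy 2x2 exposures sharing the same texture
M = [[40, 42], [41, 43]]
L_still = [[80, 82], [81, 83]]    # long exposure, subject unchanged
L_moved = [[80, 95], [81, 83]]    # long exposure, subject moved near (0, 1)

print(motion_state(S, M, L_still, 0, 0, A=4))  # static
print(motion_state(S, M, L_moved, 0, 0, A=4))  # motion
```

Note how the three frames differ greatly in absolute brightness yet are classified as static: only a change in local texture, not in overall exposure, trips the motion detector.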
  • The synthesis module 350 is connected to the brightness detection module 320 and the motion detection module 340; it adjusts and determines the specific weight coefficients according to the image brightness type and motion state they provide, and completes the synthesis of the high dynamic range image.
  • The synthesis module 350 is preset with the ideal weighted synthesis process shown in the following formula (3):
  • H(i,j) = a×S(i,j) + b×M(i,j) + c×L(i,j)  (3)
  • a is a short exposure weighting coefficient
  • b is a medium exposure weighting coefficient
  • c is a long exposure weighting coefficient.
  • S (i,j) is the pixel in the i-th row and j-th column of the short-exposure image
  • M (i,j) is the pixel in the i-th row and j-th column of the medium-exposure image
  • L(i,j) is the pixel in the i-th row and j-th column of the long-exposure image
  • H (i, j) is the pixel points in the i-th row and j-th column of the synthesized high dynamic range image.
  • When the image brightness type is a high-illuminance image and the motion state is a static pixel (the ideal case), the synthesis module 350 performs weighted synthesis according to formula (3).
  • When the image brightness type is a high-illuminance image and the motion state is a motion pixel, the synthesis module 350 adjusts the coefficients a and c to zero and performs weighted synthesis as shown in the following formula (3-1): H(i,j) = b×M(i,j).
  • When the image brightness type is a low-illuminance image and the motion state is a static pixel, the synthesis module 350 discards the noisier short-exposure image and performs weighted synthesis as shown in the following formula (3-2): H(i,j) = b×M(i,j) + c×L(i,j).
  • When the image brightness type is a low-illuminance image and the motion state is a motion pixel, the synthesis module 350 uses only the long-exposure image, whose exposure time is sufficiently long, and performs weighted synthesis as shown in the following formula (3-3): H(i,j) = c×L(i,j).
  • Through the above method, the images to be synthesized captured over multiple consecutive exposures can be integrated in a targeted way into an HDR image of higher image quality, avoiding the trailing of moving objects, loss of picture sharpness, and even brightness errors to which synthesized HDR images are prone in high-speed moving scenes such as aerial photography.
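Putting the four weighting cases together, a per-pixel blend might look like the following sketch. The weights a, b, c and the sample luminance values are illustrative assumptions; note that, exactly as in formulas (3-1) to (3-3), the remaining weights are not renormalized after the others are zeroed:

```python
def blend_pixel(s, m, l, a, b, c, brightness, motion):
    # Choose effective weights per formulas (3) and (3-1)-(3-3).
    if brightness == "high" and motion == "static":
        wa, wb, wc = a, b, c      # ideal case, formula (3)
    elif brightness == "high":    # high illuminance + motion, formula (3-1)
        wa, wb, wc = 0.0, b, 0.0
    elif motion == "static":      # low illuminance + static, formula (3-2)
        wa, wb, wc = 0.0, b, c
    else:                         # low illuminance + motion, formula (3-3)
        wa, wb, wc = 0.0, 0.0, c
    return wa * s + wb * m + wc * l

a, b, c = 0.2, 0.5, 0.3           # illustrative preset weight coefficients
s, m, l = 10, 40, 80              # one pixel position across the three exposures

print(blend_pixel(s, m, l, a, b, c, "high", "static"))  # 46.0
print(blend_pixel(s, m, l, a, b, c, "low", "motion"))   # 24.0
```

A production implementation would typically run this selection per pixel over whole frames (the brightness type is global, the motion state per position) and might renormalize the surviving weights so the output stays in the expected luminance range.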
  • An embodiment of the present invention also provides a non-volatile computer storage medium. The computer storage medium stores at least one executable instruction, and the computer-executable instruction can execute the high dynamic range image synthesis method in any of the foregoing method embodiments.
  • Fig. 7 shows a schematic structural diagram of an image processing chip according to an embodiment of the present invention.
  • the specific embodiment of the present invention does not limit the specific implementation of the image processing chip.
  • The image processing chip may include: a processor 702, a communication interface (Communications Interface) 704, a memory 706, and a communication bus 708.
  • the processor 702, the communication interface 704, and the memory 706 communicate with each other through the communication bus 708.
  • The communication interface 704 is used to communicate with network elements of other devices, such as clients or other servers.
  • the processor 702 is configured to execute the program 710, and specifically can execute the relevant steps in the above-mentioned embodiment of the high dynamic range image synthesis method.
  • the program 710 may include program code, and the program code includes computer operation instructions.
  • The processor 702 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention.
  • the one or more processors included in the network slicing device may be the same type of processor, such as one or more CPUs, or different types of processors, such as one or more CPUs and one or more ASICs.
  • the memory 706 is used to store the program 710.
  • the memory 706 may include a high-speed RAM memory, and may also include a non-volatile memory (non-volatile memory), for example, at least one disk memory.
  • the program 710 may be specifically used to enable the processor 702 to execute the high dynamic range image synthesis method in any of the foregoing method embodiments.
  • The computer software may be stored in a computer-readable storage medium; when executed, the program may include the processes of the above method embodiments.
  • The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of the present invention relate to a high dynamic range image synthesis method and apparatus, an image processing chip, and an aerial camera. The method includes: acquiring multiple frames of images to be synthesized with different exposure times; calculating the average brightness of the images to be synthesized; determining the image brightness type of the images to be synthesized according to the average brightness; calculating the brightness difference between adjacent pixels within the same frame of image to be synthesized; calculating, according to the brightness differences, the inter-frame difference of different images to be synthesized at the same pixel position; determining, according to the inter-frame difference, the motion state of the images to be synthesized at the pixel position; and weighting and synthesizing the images to be synthesized into a corresponding high dynamic range image according to the image brightness type and the motion state. By adaptively adjusting the weights according to the motion state and the image brightness type, problems such as trailing of moving objects and reduced picture sharpness when synthesizing HDR images are avoided.

Description

High dynamic range image synthesis method and apparatus, image processing chip, and aerial camera
This application claims priority to Chinese patent application No. 2020102915710, filed with the Chinese Patent Office on April 14, 2020 and entitled "High dynamic range image synthesis method and apparatus, image processing chip, and aerial camera", the entire contents of which are incorporated herein by reference.
【Technical Field】
The present invention relates to the field of image processing technologies, and in particular, to a high dynamic range image synthesis method and apparatus, an image processing chip, and an aerial camera.
【Background】
A high dynamic range (HDR) image is, compared with an ordinary image, image data that provides more detail in bright and dark regions and better reflects the visual effect of a real environment. Typically, an HDR image is synthesized from multiple frames of ordinary images (i.e., low dynamic range images) with different exposure times, using the best details corresponding to each exposure time.
However, an HDR image obtained by synthesizing multiple ordinary frames produced by multiple exposures is, in particular application scenarios that may involve high movement speeds, such as aerial photography, prone to trailing of moving objects and reduced picture sharpness, and sometimes even brightness errors.
Therefore, how to avoid the series of defects, such as trailing of moving objects and reduced sharpness, that arise when an HDR image is synthesized from multiple frames is an urgent problem to be solved.
【Summary】
Embodiments of the present invention aim to provide a high dynamic range image synthesis method and apparatus, an image processing chip, and an aerial camera, which can overcome the defects of existing HDR image synthesis approaches.
To solve the above technical problem, an embodiment of the present invention provides the following technical solution: a high dynamic range image synthesis method, including:
acquiring multiple frames of images to be synthesized, each frame having a different exposure time; calculating the average brightness of the images to be synthesized; determining the image brightness type of the images to be synthesized according to the average brightness; calculating the brightness difference between adjacent pixels within the same frame of image to be synthesized; calculating, according to the brightness differences, the inter-frame difference of different images to be synthesized at the same pixel position; determining, according to the inter-frame difference, the motion state of the images to be synthesized at the pixel position; and weighting and synthesizing the images to be synthesized into a corresponding high dynamic range image according to the image brightness type and the motion state.
Optionally, the image brightness type includes a high-illuminance scene and a low-illuminance scene. Determining the image brightness type of the images to be synthesized according to the average brightness specifically includes: when the average brightness is greater than or equal to a preset brightness detection threshold, determining that the image brightness type of the images to be synthesized is a high-illuminance scene; when the average brightness is less than the brightness detection threshold, determining that the image brightness type of the images to be synthesized is a low-illuminance scene.
Optionally, calculating the average brightness of the images to be synthesized specifically includes: accumulating the brightness values of every pixel in an image to be synthesized to obtain a brightness sum; summing the brightness sums of every frame of the images to be synthesized to obtain a total brightness value; and calculating the average brightness value according to the total brightness value, the number of images to be synthesized, and the size of the images to be synthesized.
Optionally, the images to be synthesized include a short-exposure image, a medium-exposure image, and a long-exposure image obtained by continuous shooting; the exposure time of the short-exposure image is shorter than that of the medium-exposure image, and the exposure time of the medium-exposure image is shorter than that of the long-exposure image.
Optionally, the motion state includes motion pixels and static pixels. Determining, according to the inter-frame difference, the motion state of the images to be synthesized at the pixel position specifically includes: judging whether the inter-frame difference between the short-exposure image and the medium-exposure image and the inter-frame difference between the medium-exposure image and the long-exposure image are both greater than or equal to a preset motion detection threshold; if yes, determining that the motion state of the pixel position is a motion pixel; if not, determining that the motion state of the pixel position is a static pixel.
Optionally, calculating the brightness difference between adjacent pixels in the same image to be synthesized specifically includes: calculating a first brightness difference between a target pixel and an adjacent first pixel, and a second brightness difference between the target pixel and an adjacent second pixel; and taking the difference between the first brightness difference and the second brightness difference as the brightness difference of the target pixel.
Optionally, weighting and synthesizing the several images to be synthesized into the corresponding high dynamic range image according to the image brightness type and the motion state specifically includes:
presetting a corresponding short-exposure weight coefficient, medium-exposure weight coefficient, and long-exposure weight coefficient for the short-exposure image, the medium-exposure image, and the long-exposure image respectively;
when the motion state at the pixel position is a static pixel and the image brightness type is a high-illuminance image, weighting and synthesizing the pixels of the short-exposure image, the medium-exposure image, and the long-exposure image at the pixel position into the pixel of the high dynamic range image at the same pixel position according to the short-exposure weight coefficient, the medium-exposure weight coefficient, and the long-exposure weight coefficient.
Optionally, weighting and synthesizing the several images to be synthesized into the corresponding high dynamic range image according to the image brightness type and the motion state specifically includes:
when the motion state at the pixel position is a motion pixel and the image brightness type is a high-illuminance image, discarding the short-exposure image and the long-exposure image;
weighting and synthesizing the pixel of the high dynamic range image at the same pixel position according to the pixel of the medium-exposure image at the pixel position and the medium-exposure weight coefficient;
when the motion state is a static pixel and the image brightness type is a low-illuminance image, discarding the short-exposure image;
weighting and synthesizing the pixels of the medium-exposure image and the long-exposure image at the pixel position into the pixel of the high dynamic range image at the same pixel position according to the medium-exposure weight coefficient and the long-exposure weight coefficient;
when the motion state is a motion pixel and the image brightness type is a low-illuminance image, discarding the medium-exposure image and the long-exposure image;
weighting and synthesizing the pixel of the high dynamic range image at the same pixel position according to the pixel of the short-exposure image at the pixel position and the short-exposure weight coefficient.
To solve the above technical problem, an embodiment of the present invention further provides the following technical solution: a high dynamic range image synthesis apparatus, including:
an image frame acquisition module, configured to acquire multiple frames of images to be synthesized, each frame having a different exposure time; a scene detection module, configured to calculate the average brightness of the images to be synthesized and determine their image brightness type according to the average brightness; a secondary difference calculation module, configured to calculate the brightness difference between adjacent pixels within the same frame of image to be synthesized and to calculate, according to the brightness differences, the inter-frame difference of the images to be synthesized at the same pixel position; a motion detection module, configured to determine, according to the inter-frame difference, the motion state of the images to be synthesized at the pixel position; and a synthesis module, configured to weight and synthesize the images to be synthesized into a corresponding high dynamic range image according to the image brightness type and the motion state.
To solve the above technical problem, an embodiment of the present invention further provides the following technical solution: an image processing chip, including a processor and a memory communicatively connected to the processor; the memory stores computer program instructions that, when invoked by the processor, cause the processor to execute the high dynamic range image synthesis method described above.
To solve the above technical problem, an embodiment of the present invention further provides the following technical solution: an aerial camera. The aerial camera includes:
an image sensor, configured to acquire multiple frames of images with set shooting parameters; a controller, connected to the image sensor and configured to trigger the image sensor to acquire multiple frames of images with different exposure time lengths; an image processor, configured to receive the multiple frames of images acquired by the image sensor through continuous exposure and to execute the high dynamic range image synthesis method described above on the received multiple frames of images to obtain a high dynamic range image; and a storage device, connected to the image processor and configured to store the high dynamic range image.
Compared with the prior art, the HDR image synthesis method of the embodiments of the present invention adaptively adjusts the weight ratio of the different ordinary images during synthesis of the HDR image according to the motion state of the images to be synthesized and their image brightness type, thereby effectively avoiding problems such as trailing of moving objects and reduced picture sharpness when an HDR image is synthesized from multiple frames.
【Brief Description of the Drawings】
One or more embodiments are exemplarily illustrated by the figures in the corresponding drawings. These exemplary illustrations do not constitute a limitation on the embodiments. Elements with the same reference numerals in the drawings denote similar elements. Unless otherwise stated, the figures in the drawings are not drawn to scale.
Fig. 1 is a schematic diagram of an application scenario of the high dynamic range image synthesis method according to an embodiment of the present invention;
Fig. 2 is a structural block diagram of an aerial camera provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of a high dynamic range image synthesis apparatus provided by an embodiment of the present invention;
Fig. 4 is a flowchart of an image processing method provided by an embodiment of the present invention;
Fig. 5 is a flowchart of a motion state determination method provided by an embodiment of the present invention;
Fig. 6 is a schematic diagram of pixel positions provided by an embodiment of the present invention;
Fig. 7 is a structural schematic diagram of an image processing chip provided by an embodiment of the present invention.
【Detailed Description】
To facilitate understanding of the present invention, the present invention is described in more detail below with reference to the accompanying drawings and specific embodiments. It should be noted that when an element is described as being "fixed to" another element, it may be directly on the other element, or one or more intermediate elements may exist between them. When an element is described as being "connected to" another element, it may be directly connected to the other element, or one or more intermediate elements may exist between them. The orientations or positional relationships indicated by terms such as "upper", "lower", "inner", "outer", and "bottom" used in this specification are based on the orientations or positional relationships shown in the drawings, and are only for the convenience of describing the present invention and simplifying the description, rather than indicating or implying that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation; they therefore cannot be construed as limiting the present invention. In addition, the terms "first", "second", and "third" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance.
Unless otherwise defined, all technical and scientific terms used in this specification have the same meanings as commonly understood by those skilled in the technical field of the present invention. The terms used in the specification of the present invention are only for the purpose of describing specific embodiments and are not intended to limit the present invention. The term "and/or" used in this specification includes any and all combinations of one or more of the associated listed items.
In addition, the technical features involved in the different embodiments of the present invention described below can be combined with each other as long as they do not conflict.
When a camera takes a photograph, exposure times of different lengths change the amount of light entering the photosensitive element, thereby providing images with different details. A high dynamic range image is synthesized from multiple frames of ordinary images with different exposure times so as to better present bright and dark details.
In some shooting environments, such as high-speed movement, images with unsuitable exposure times suffer from significant quality problems. These images therefore need to be screened and adjusted in order to improve the quality of the synthesized high dynamic range image and avoid problems such as reduced picture sharpness and brightness errors.
Fig. 1 shows an application scenario of the high dynamic range image synthesis method provided by an embodiment of the present invention. As shown in Fig. 1, the application scenario includes a drone 10 equipped with an aerial camera, a smart terminal 20, and a wireless network 30.
The drone 10 may be an unmanned aerial vehicle driven by any type of power, including but not limited to a quadrotor drone, a fixed-wing aircraft, and a helicopter model. It may have the corresponding size or power according to actual needs, so as to provide load capacity, flight speed, and flight endurance that meet usage requirements.
The aerial camera may be any type of image acquisition device, including an action camera, a high-definition camera, or a wide-angle camera. As one of the functional modules carried on the drone, the aerial camera may be mounted and fixed on the drone via a mounting bracket such as a gimbal, and is controlled by the drone 10 to perform image acquisition tasks.
Of course, one or more functional modules may also be added to the drone so that it can realize corresponding functions, for example a built-in main control chip serving as the control core for drone flight and data transmission, or an image transmission device that uploads the acquired image information to a device connected to the drone.
The smart terminal 20 may be any type of smart device used to establish a communication connection with the drone, such as a mobile phone, a tablet computer, or a smart remote controller. The smart terminal 20 may be equipped with one or more different user interaction devices used to collect user instructions or to present and feed back information to the user.
These interaction devices include but are not limited to buttons, a display screen, a touch screen, a speaker, and a remote control joystick. For example, the smart terminal 20 may be equipped with a touch display screen, through which the user's remote control instructions for the drone are received and the image information obtained by the aerial camera is presented to the user; the user may also switch the image information currently displayed on the screen via the touch screen.
In some embodiments, existing image vision processing technology may also be combined between the drone 10 and the smart terminal 20 to further provide more intelligent services. For example, the drone 10 may acquire images via the aerial camera, the smart terminal 20 may then parse the operation gestures in the images, and gesture control of the drone 10 by the user is finally realized.
The wireless network 30 may be a wireless communication network based on any type of data transmission principle for establishing a data transmission channel between two nodes, such as a Bluetooth network, a WiFi network, a wireless cellular network located in a specific signal band, or a combination thereof.
Fig. 2 is a structural block diagram of the aerial camera 11 provided by an embodiment of the present invention. As shown in Fig. 2, the aerial camera 11 may include an image sensor 111, a controller 112, and an image processor 113.
The image sensor 111 is a functional module used to acquire images with set shooting parameters. It projects the optical signal corresponding to the visual picture onto the photosensitive element via a lens and related optical components, and the photosensitive element converts the optical signal into a corresponding electrical signal.
The shooting parameters are adjustable parameter variables of the image sensor 111 during image acquisition that are related to the lens and related optical component structures (such as the shutter), for example aperture, focal length, or exposure time. The image sensor 111 can acquire one frame of image with each exposure.
The controller 112 is the control core of the image sensor 111. It is connected to the image sensor and can control the shooting behavior of the image sensor 111 according to received instructions, for example setting one or more shooting parameters of the image sensor 111.
Under suitable trigger conditions, the controller 112 can trigger the image sensor to continuously acquire multiple frames of images with different exposure times. The number of acquired images is a manually set constant value; it may be a default value preset by technicians, or a value set by the user during use according to the needs of high dynamic range image synthesis.
For example, three frames of images with different exposure times may be acquired continuously. According to the length of the exposure time, they are respectively called the short-exposure image, the medium-exposure image, and the long-exposure image.
The image processor 113 is a functional module used to synthesize high dynamic range images. It can receive the multiple frames of images continuously acquired by the image sensor and synthesize them into a corresponding high dynamic range (HDR) image.
In some embodiments, the aerial camera may further include a storage device 114 for storing data information generated during use of the aerial camera 11, such as the images to be synthesized and the synthesized high dynamic range images. Specifically, it may be any type of non-volatile memory with a suitable capacity, such as an SD card, flash memory, or a solid-state drive.
In some embodiments, the storage device 114 may also be a removable structure or a structure arranged in a distributed manner. The aerial camera may be provided with only a data interface, through which data such as the images to be synthesized or the high dynamic range images are transferred to a corresponding device for storage.
It should be noted that one or more functional modules of the aerial camera 11 shown in Fig. 2 (such as the controller, the image processor, and the storage device) may also be integrated into the drone 10 as part of the drone 10. Fig. 2 describes the functional modules of the aerial camera 11 only by way of example based on the image acquisition process, and is not intended to limit the functional modules of the aerial camera 11.
Fig. 3 is a structural block diagram of the high dynamic range image synthesis apparatus provided by an embodiment of the present invention. The high dynamic range image synthesis apparatus may be executed by the above-mentioned image processor. In this embodiment, the composition of the apparatus is described in terms of functional modules.
Those skilled in the art can understand that the functional modules shown in Fig. 3 may be selectively implemented by software, hardware, or a combination of software and hardware according to actual needs. For example, they may be implemented by a processor invoking a related software application stored in a memory.
As shown in Fig. 3, the high dynamic range image synthesis apparatus 300 includes an image frame acquisition module 310, a brightness detection module 320, a secondary difference calculation module 330, a motion detection module 340, and a synthesis module 350.
The image frame acquisition module 310 is used to acquire multiple frames of images to be synthesized, each frame having a different exposure time. An image to be synthesized is the image data information acquired by one exposure of the image sensor. The continuously acquired images to be synthesized can form an image set used to synthesize the final high dynamic range image.
The brightness detection module 320 is used to calculate the average brightness of the images to be synthesized and to determine their image brightness type according to the average brightness.
Depending on the environment in which they were shot, the images to be synthesized may have significantly different brightness conditions. In this embodiment, the images to be synthesized can be roughly divided into different image brightness types according to differences in image brightness.
For example, according to daytime shooting and nighttime shooting, the images to be synthesized can be divided into two different image brightness types: high-illuminance images and low-illuminance images.
The secondary difference calculation module 330 is used to calculate the brightness difference between adjacent pixels within the same frame of image to be synthesized, and to calculate, according to the brightness differences, the inter-frame difference of the images to be synthesized at the same pixel position. This indicates how the different images to be synthesized change within a specific region.
The motion detection module 340 is used to determine, according to the inter-frame difference, the motion state of the images to be synthesized at the pixel position.
Since the inter-frame difference obtained by the secondary difference calculation indicates the dynamic change of a region over time, it can be used to judge whether movement has occurred at different positions of the image, thereby determining the motion state of these pixel positions. The specific motion states can be defined according to actual needs.
For example, the motion state may simply be divided into motion pixels and static pixels. A motion pixel indicates that the image at that pixel position has moved, while a static pixel indicates that the image at that pixel position has not moved.
The synthesis module 350 is used to weight and synthesize the images to be synthesized into a corresponding high dynamic range image according to the image brightness type and the motion state.
"Weighted synthesis" means assigning corresponding weight values to different images to be synthesized and finally synthesizing the required high dynamic range image. By adjusting the weight values of the images to be synthesized, images of poorer quality can be given less consideration, thereby reducing their influence on the quality of the high dynamic range image.
In this embodiment, the weight values of the images to be synthesized are adjusted and considered adaptively according to the image brightness type and the motion state, which effectively avoids the interference of some images to be synthesized and helps improve the quality of the HDR image.
Although the application scenario shown in Fig. 1 takes an aerial camera carried by a drone as an example, those skilled in the art can understand that the high dynamic range image synthesis method can also be used in other types of scenarios and devices to improve the quality of the output high dynamic range image. The high dynamic range image synthesis method disclosed in the embodiments of the present invention is not limited to application on the drone shown in Fig. 1.
Fig. 4 is a flowchart of the high dynamic range image synthesis method provided by an embodiment of the present invention. As shown in Fig. 4, the image processing method includes the following steps:
410: Acquire multiple frames of images to be synthesized.
Each frame of image to be synthesized has a different exposure time. The specific exposure times can be set according to actual needs and are empirical values, which are not described in detail here. These images to be synthesized are images obtained by continuous shooting. They serve as the data basis for synthesizing one high dynamic range image.
420: Calculate the average brightness of the images to be synthesized.
The "average brightness" is the overall image brightness of the images to be synthesized, reflecting the illuminance of the surrounding environment when the images were shot. A higher average brightness indicates better illuminance of the surrounding environment when the images to be synthesized were shot.
The average brightness can be calculated in any suitable way. In some embodiments, it can be calculated as follows:
First, accumulate the brightness values of every pixel in an image to be synthesized to obtain a brightness sum. Then, sum the brightness sums of every frame of the images to be synthesized to obtain a total brightness value. Finally, calculate the average brightness value according to the total brightness value, the number of images to be synthesized, and the size of the images to be synthesized. In this way, the average brightness value of the multiple frames of images to be synthesized can be calculated and used as the "average brightness".
430: Determine the image brightness type of the images to be synthesized according to the average brightness.
The image brightness type is a type pre-defined or divided according to brightness. In different usage situations, a suitable number of image brightness types can be defined according to need, so that images to be synthesized with similar average brightness are treated as the same image brightness type for further processing.
In some embodiments, the image brightness type of the images to be synthesized can be determined by setting a suitable brightness detection threshold. For example, when the image brightness types include high-illuminance images and low-illuminance images, a brightness detection threshold can be preset.
When the average brightness is greater than or equal to the preset brightness detection threshold, the images to be synthesized are determined to be high-illuminance images. When the average brightness is less than the brightness detection threshold, they are determined to be low-illuminance images.
The brightness detection threshold is an empirical value that can be set according to actual needs. A high-illuminance image corresponds to a scene with sufficient light or illuminance, such as daytime, while a low-illuminance image indicates that the shooting scene of the images to be synthesized is one with severely insufficient light, such as nighttime.
440: Calculate the brightness difference between adjacent pixels within the same frame of image to be synthesized.
An image to be synthesized is actually composed of many different pixels. A pixel is the smallest basic unit in a frame of image. Within the same frame, the differences between adjacent pixels roughly reflect the texture of the photographed subject.
Specifically, the calculation of the brightness difference may include the following steps:
First, calculate a first brightness difference between the target pixel and an adjacent first pixel, and a second brightness difference between the target pixel and an adjacent second pixel. Then, take the difference between the first brightness difference and the second brightness difference as the brightness difference of the target pixel.
The target pixel is the currently selected pixel whose motion state needs to be determined. As shown in Fig. 6, any pixel can be surrounded by 8 adjacent pixels. The first pixel and the second pixel are two of the 8 pixels surrounding the target pixel.
450: Calculate, according to the brightness differences, the inter-frame difference of different images to be synthesized at the same pixel position.
The inter-frame difference is calculated as the difference between the brightness differences of different images to be synthesized at the same position. Those skilled in the art can understand that, when the photographed subject has not moved significantly, the texture at the same position in multiple continuously shot frames should not change significantly.
Therefore, the inter-frame difference obtained from the secondary difference can reflect the movement of the photographed subject. When the inter-frame difference is too large, it indicates that the subject is moving violently; when the inter-frame difference is low, it indicates that the position of the subject has basically not changed.
Since the images to be synthesized have significantly different exposure times, there are inherently large brightness gaps between them. The traditional approach of comparing the brightness difference at the same pixel position to detect whether the subject has moved cannot eliminate the inherent brightness gaps between the images to be synthesized.
The secondary difference provided in this embodiment measures the difference between different images to be synthesized based on the brightness differences between adjacent pixels, which effectively avoids the influence of the inherent brightness gaps between the images to be synthesized and serves as an accurate basis for motion detection.
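The robustness argument above, that comparing secondary differences sidesteps the brightness gap between exposures, can be demonstrated with a small sketch. Here the exposure gap is modeled as a global additive offset (a simplification; for a purely multiplicative exposure change the cancellation is not exact), and the values are illustrative toys:

```python
def second_diff(img, i, j):
    # Secondary difference (formulas (2-1)-(2-3)): compare the pixel's
    # vertical and horizontal neighbour differences instead of raw luminance.
    d_down = abs(img[i][j] - img[i + 1][j])
    d_right = abs(img[i][j] - img[i][j + 1])
    return abs(d_down - d_right)

S = [[10, 12], [11, 13]]                   # short exposure (toy values)
M = [[v + 30 for v in row] for row in S]   # same static scene, globally brighter

raw_gap = M[0][0] - S[0][0]                # raw luminance comparison sees 30
frame_diff = abs(second_diff(S, 0, 0) - second_diff(M, 0, 0))
print(raw_gap, frame_diff)                 # 30 0
```

The raw per-pixel comparison reports a large difference even though nothing moved, while the secondary-difference comparison reports zero, which is exactly why it is the better basis for motion detection here.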
460: Determine, according to the inter-frame difference, the motion state of the images to be synthesized at the pixel position.
The motion state refers to whether the photographed subject has moved. Specifically, a pixel position where the subject has moved can be called a motion pixel, and a pixel position where the subject has not moved is called a static pixel.
Taking as an example images to be synthesized that include a short-exposure image, a medium-exposure image, and a long-exposure image obtained by continuous shooting (where the exposure time of the short-exposure image is shorter than that of the medium-exposure image, and the exposure time of the medium-exposure image is shorter than that of the long-exposure image), the determination of the motion state is described in detail. As shown in Fig. 5, the motion state determination method includes:
510: Calculate the inter-frame difference K1 between the short-exposure image and the medium-exposure image, and the inter-frame difference K2 between the medium-exposure image and the long-exposure image.
520: Judge whether both K1 and K2 are less than or equal to the preset motion detection threshold. If not, go to step 530; if yes, go to step 540.
The motion detection threshold is an empirical value that can be set according to actual needs.
530: Determine that the motion state of the pixel position is a motion pixel.
"Motion pixel" means that the photographed subject at that pixel position has moved. In this embodiment, the pixels of all images to be synthesized at this position are called "motion pixels".
540: Determine that the motion state of the pixel position is a static pixel.
"Static pixel" means that the photographed subject at that pixel position has not moved. In this embodiment, the pixels of all images to be synthesized at this position are called "static pixels".
470: Weight and synthesize the images to be synthesized into a corresponding high dynamic range image according to the image brightness type and the motion state.
As described above, the two indicators of image brightness type and motion state can well reflect the scene conditions when the images to be synthesized were shot. Accordingly, the weight of each frame of image to be synthesized in the synthesis process can be adjusted adaptively, so that the synthesized high dynamic range image has better image quality.
Likewise, the specific weighted synthesis process is described in detail taking as an example images to be synthesized that include a short-exposure image, a medium-exposure image, and a long-exposure image obtained by continuous shooting.
In this embodiment, every pixel of the same high dynamic range image is calculated and determined by weighted synthesis from the pixels of each image to be synthesized at the same pixel position. The specific weighting calculation process is as follows:
First, preset a corresponding short-exposure weight coefficient, medium-exposure weight coefficient, and long-exposure weight coefficient for the short-exposure image, the medium-exposure image, and the long-exposure image respectively.
The short-exposure, medium-exposure, and long-exposure weight coefficients are all preset weight values that can be adjusted or set according to actual needs; they indicate the weight ratio of each frame of image to be synthesized when synthesizing the high dynamic range image under normal circumstances.
Then, determine the motion state of the pixel position and the image brightness type of the images to be synthesized, and handle the following cases accordingly:
1) When the motion state at the pixel position is a static pixel and the image brightness type is a high-illuminance image, the pixels of the short-exposure, medium-exposure, and long-exposure images at the pixel position are weighted and synthesized into the pixel of the high dynamic range image at the same pixel position according to the short-exposure, medium-exposure, and long-exposure weight coefficients.
The static case with sufficient shooting illuminance can be regarded as the ideal case; the weight coefficients do not need to be adjusted, and the preset weight ratio can be used directly.
2) When the motion state at the pixel position is a motion pixel and the image brightness type is a high-illuminance image, the short-exposure image and the long-exposure image are discarded, and the pixel of the high dynamic range image at the same pixel position is weighted and synthesized from the pixel of the medium-exposure image at the pixel position and the medium-exposure weight coefficient.
In the case of motion with sufficient illuminance, neither an overly long nor an overly short exposure time can deliver good shooting quality (blur and dimness easily occur). Therefore, the weight coefficients of the short-exposure image and the long-exposure image need to be adjusted to zero to avoid adversely affecting the final synthesized high dynamic range image.
3) When the motion state is a static pixel and the image brightness type is a low-illuminance image, the short-exposure image is discarded, and the pixels of the medium-exposure image and the long-exposure image at the pixel position are weighted and synthesized into the pixel of the high dynamic range image at the same pixel position according to the medium-exposure and long-exposure weight coefficients.
Under low illuminance, an image to be synthesized with a shorter exposure time exhibits more light noise and lower image quality. The weight coefficient of the short-exposure image can therefore be adjusted to zero in this situation to avoid adversely affecting the final synthesized high dynamic range image.
4) When the motion state is a motion pixel and the image brightness type is a low-illuminance image, the medium-exposure image and the long-exposure image are discarded, and the pixel of the high dynamic range image at the same pixel position is weighted and synthesized from the pixel of the short-exposure image at the pixel position and the short-exposure weight coefficient.
Under low illuminance with a moving subject, a longer exposure time is needed to guarantee the amount of light and capture a clear image of the subject. Therefore, the weight coefficients of the short-exposure image and the medium-exposure image can be adjusted to zero to avoid adversely affecting the final synthesized high dynamic range image.
Through the above adaptive adjustment of the weight coefficients, the interference of some low-quality images to be synthesized is avoided, so that the final output high dynamic range image achieves high dynamic range and high definition in static daytime scenes, and low picture noise with no trailing of moving objects in night scenes.
To fully describe the present invention, the execution of the high dynamic range image synthesis method disclosed in the embodiments of the present invention in the image processor is described in detail below with reference to a specific example.
Assume that, each time, the image sensor continuously acquires a short-exposure image with exposure time x/2, a medium-exposure image with exposure time x, and a long-exposure image with exposure time 2x as the images to be synthesized, where each image to be synthesized is w pixels long and h pixels wide.
Preferably, the short-exposure image is sent to the image processor first, as soon as it has been captured, with the medium-exposure image and the long-exposure image sent in sequence, so that the whole image synthesis process has minimal latency.
During processing, the brightness detection module 320 calculates the average brightness of the images to be synthesized by the following formula (1):
L = (Σ_{i=1..h} Σ_{j=1..w} (S(i,j) + M(i,j) + L(i,j))) / (3 × w × h)  (1)
where S(i,j) is the brightness value of the pixel in the i-th row and j-th column of the short-exposure image, M(i,j) is the brightness value of the pixel in the i-th row and j-th column of the medium-exposure image, L(i,j) is the brightness value of the pixel in the i-th row and j-th column of the long-exposure image, and L is the average brightness.
The brightness detection module 320 judges whether the average brightness L is greater than or equal to a preset brightness detection threshold T. When L is greater than or equal to T, the image brightness type of the images to be synthesized is determined to be a high-illuminance image; when L is less than T, it is determined to be a low-illuminance image.
In addition, the secondary difference calculation module 330 calculates the brightness differences between adjacent pixels through the following formulas (2-1) to (2-3):
ΔS(i,j) = ||S(i,j) − S(i+1,j)| − |S(i,j) − S(i,j+1)||  (2-1)
ΔM(i,j) = ||M(i,j) − M(i+1,j)| − |M(i,j) − M(i,j+1)||  (2-2)
ΔL(i,j) = ||L(i,j) − L(i+1,j)| − |L(i,j) − L(i,j+1)||  (2-3)
where ΔS(i,j) is the brightness difference of the pixel in the i-th row and j-th column of the short-exposure image, ΔM(i,j) is that of the medium-exposure image, and ΔL(i,j) is that of the long-exposure image. (As shown in Fig. 6, the adjacent pixels are the 8 pixels surrounding the target pixel.)
Based on the brightness differences, the inter-frame difference between the short-exposure image and the medium-exposure image can be calculated as |ΔS(i,j) − ΔM(i,j)|, and the inter-frame difference between the medium-exposure image and the long-exposure image as |ΔM(i,j) − ΔL(i,j)|.
Based on the inter-frame differences calculated by the secondary difference calculation module 330, the motion detection module 340 judges whether both inter-frame differences are less than a preset motion detection threshold A.
When |ΔS(i,j) − ΔM(i,j)| < A and |ΔM(i,j) − ΔL(i,j)| < A, the motion state of the pixel position (i, j) is determined to be a static pixel.
When either inter-frame difference is greater than or equal to the motion detection threshold A, the motion state of the pixel position (i, j) is determined to be a motion pixel.
The synthesis module 350 is connected to the brightness detection module 320 and the motion detection module 340; it adjusts and determines the specific weight coefficients according to the image brightness type and motion state they provide, and completes the synthesis of the high dynamic range image.
The synthesis module 350 is preset with the ideal weighted synthesis process shown in the following formula (3):
H(i,j) = a×S(i,j) + b×M(i,j) + c×L(i,j)  (3)
where a is the short-exposure weight coefficient, b is the medium-exposure weight coefficient, and c is the long-exposure weight coefficient; S(i,j) is the pixel in the i-th row and j-th column of the short-exposure image, M(i,j) is that of the medium-exposure image, L(i,j) is that of the long-exposure image, and H(i,j) is the pixel in the i-th row and j-th column of the synthesized high dynamic range image.
When the image brightness type is a high-illuminance image and the motion state is a static pixel, this is the ideal case, and the synthesis module 350 performs weighted synthesis according to formula (3).
When the image brightness type is a high-illuminance image and the motion state is a motion pixel, the synthesis module 350 adjusts the coefficients a and c to zero and performs weighted synthesis as shown in the following formula (3-1):
H(i,j) = b×M(i,j)  (3-1)
When the image brightness type is a low-illuminance image and the motion state is a static pixel, the synthesis module 350 discards the noisier short-exposure image and performs weighted synthesis as shown in the following formula (3-2):
H(i,j) = b×M(i,j) + c×L(i,j)  (3-2)
When the image brightness type is a low-illuminance image and the motion state is a motion pixel, the synthesis module 350 uses only the long-exposure image, whose exposure time is sufficiently long, and performs weighted synthesis as shown in the following formula (3-3):
H(i,j) = c×L(i,j)  (3-3)
Through the above method, the images to be synthesized captured after multiple consecutive exposures can be integrated in a targeted way into an HDR image of higher image quality, avoiding problems such as trailing of moving objects, loss of picture sharpness, and even brightness errors, to which synthesized HDR images are prone in high-speed moving scenes such as aerial photography.
An embodiment of the present invention also provides a non-volatile computer storage medium. The computer storage medium stores at least one executable instruction, and the computer-executable instruction can execute the high dynamic range image synthesis method in any of the above method embodiments.
Fig. 7 shows a structural schematic diagram of an image processing chip according to an embodiment of the present invention. The specific embodiments of the present invention do not limit the specific implementation of the image processing chip.
As shown in Fig. 7, the image processing chip may include: a processor (processor) 702, a communication interface (Communications Interface) 704, a memory (memory) 706, and a communication bus 708.
The processor 702, the communication interface 704, and the memory 706 communicate with each other through the communication bus 708. The communication interface 704 is used to communicate with network elements of other devices, such as clients or other servers. The processor 702 is used to execute the program 710, and may specifically execute the relevant steps in the above embodiments of the high dynamic range image synthesis method.
Specifically, the program 710 may include program code, and the program code includes computer operation instructions.
The processor 702 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention. The one or more processors included in the device may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
The memory 706 is used to store the program 710. The memory 706 may include a high-speed RAM memory, and may also include a non-volatile memory, for example at least one disk memory.
The program 710 may specifically be used to cause the processor 702 to execute the high dynamic range image synthesis method in any of the above method embodiments.
Those skilled in the art should further appreciate that the steps of the exemplary high dynamic range image synthesis method described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the composition and steps of each example have been described generally in terms of function in the above description. Whether these functions are executed in hardware or software depends on the specific application and design constraints of the technical solution.
Those skilled in the art may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of the present invention. The computer software may be stored in a computer-readable storage medium; when executed, the program may include the processes of the embodiments of the above methods. The storage medium may be a magnetic disk, an optical disc, a read-only memory, a random access memory, or the like.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Under the concept of the present invention, the technical features in the above embodiments or in different embodiments may also be combined, the steps may be implemented in any order, and many other variations of the different aspects of the present invention as described above exist, which are not provided in detail for the sake of brevity. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or make equivalent replacements of some of the technical features therein; and these modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (11)

  1. A high dynamic range image synthesis method, characterized by comprising:
    acquiring multiple frames of images to be synthesized, each frame of image to be synthesized having a different exposure time;
    calculating an average brightness of the images to be synthesized;
    determining an image brightness type of the images to be synthesized according to the average brightness;
    calculating a brightness difference between adjacent pixels in a same frame of image to be synthesized;
    calculating, according to the brightness differences, an inter-frame difference of different images to be synthesized at a same pixel position;
    determining, according to the inter-frame difference, a motion state of the images to be synthesized at the pixel position; and
    weighting and synthesizing the images to be synthesized into a corresponding high dynamic range image according to the image brightness type and the motion state.
  2. The method according to claim 1, characterized in that the image brightness type comprises: high-illuminance images and low-illuminance images;
    determining the image brightness type of the images to be synthesized according to the average brightness specifically comprises:
    when the average brightness is greater than or equal to a preset brightness detection threshold, determining that the images to be synthesized are high-illuminance images;
    when the average brightness is less than the brightness detection threshold, determining that the images to be synthesized are low-illuminance images.
  3. The method according to claim 1, characterized in that calculating the average brightness of the images to be synthesized specifically comprises:
    accumulating the brightness values of every pixel in an image to be synthesized to obtain a brightness sum;
    summing the brightness sums of every frame of the images to be synthesized to obtain a total brightness value;
    calculating the average brightness value according to the total brightness value, the number of images to be synthesized, and the size of the images to be synthesized.
  4. The method according to claim 1, characterized in that the images to be synthesized comprise: a short-exposure image, a medium-exposure image, and a long-exposure image obtained by continuous shooting;
    the exposure time of the short-exposure image is shorter than that of the medium-exposure image; the exposure time of the medium-exposure image is shorter than that of the long-exposure image.
  5. The method according to claim 4, characterized in that the motion state comprises: motion pixels and static pixels;
    determining, according to the inter-frame difference, the motion state of the images to be synthesized at the pixel position specifically comprises:
    judging whether the inter-frame difference between the short-exposure image and the medium-exposure image and the inter-frame difference between the medium-exposure image and the long-exposure image are both less than a preset motion detection threshold;
    if yes, determining that the motion state of the pixel position is a static pixel;
    if not, determining that the motion state of the pixel position is a motion pixel.
  6. The method according to claim 1, characterized in that calculating the brightness difference between adjacent pixels in a same image to be synthesized specifically comprises:
    calculating a first brightness difference between a target pixel and an adjacent first pixel, and a second brightness difference between the target pixel and an adjacent second pixel;
    taking the difference between the first brightness difference and the second brightness difference as the brightness difference of the target pixel.
  7. The method according to claim 5, characterized in that weighting and synthesizing the several images to be synthesized into the corresponding high dynamic range image according to the image brightness type and the motion state specifically comprises:
    presetting a corresponding short-exposure weight coefficient, medium-exposure weight coefficient, and long-exposure weight coefficient for the short-exposure image, the medium-exposure image, and the long-exposure image respectively;
    when the motion state at the pixel position is a static pixel and the image brightness type is a high-illuminance image, weighting and synthesizing the pixels of the short-exposure image, the medium-exposure image, and the long-exposure image at the pixel position into the pixel of the high dynamic range image at the same pixel position according to the short-exposure weight coefficient, the medium-exposure weight coefficient, and the long-exposure weight coefficient.
  8. The method according to claim 7, characterized in that weighting and synthesizing the several images to be synthesized into the corresponding high dynamic range image according to the image brightness type and the motion state specifically comprises:
    when the motion state at the pixel position is a motion pixel and the image brightness type is a high-illuminance image, discarding the short-exposure image and the long-exposure image;
    weighting and synthesizing the pixel of the high dynamic range image at the same pixel position according to the pixel of the medium-exposure image at the pixel position and the medium-exposure weight coefficient;
    when the motion state is a static pixel and the image brightness type is a low-illuminance image, discarding the short-exposure image;
    weighting and synthesizing the pixels of the medium-exposure image and the long-exposure image at the pixel position into the pixel of the high dynamic range image at the same pixel position according to the medium-exposure weight coefficient and the long-exposure weight coefficient;
    when the motion state is a motion pixel and the image brightness type is a low-illuminance image, discarding the medium-exposure image and the long-exposure image;
    weighting and synthesizing the pixel of the high dynamic range image at the same pixel position according to the pixel of the short-exposure image at the pixel position and the short-exposure weight coefficient.
  9. A high dynamic range image synthesis apparatus, characterized by comprising:
    an image frame acquisition module, configured to acquire multiple frames of images to be synthesized, each frame of image to be synthesized having a different exposure time;
    a brightness detection module, configured to calculate an average brightness of the images to be synthesized and to determine an image brightness type of the images to be synthesized according to the average brightness;
    a secondary difference calculation module, configured to calculate a brightness difference between adjacent pixels in a same frame of image to be synthesized, and to calculate, according to the brightness differences, an inter-frame difference of the images to be synthesized at a same pixel position;
    a motion detection module, configured to determine, according to the inter-frame difference, a motion state of the images to be synthesized at the pixel position; and
    a synthesis module, configured to weight and synthesize the images to be synthesized into a corresponding high dynamic range image according to the image brightness type and the motion state.
  10. An image processing chip, characterized by comprising: a processor and a memory communicatively connected to the processor;
    the memory stores computer program instructions that, when invoked by the processor, cause the processor to execute the high dynamic range image synthesis method according to any one of claims 1 to 8.
  11. An aerial camera, characterized by comprising:
    an image sensor, configured to acquire multiple frames of images with set shooting parameters;
    a controller, connected to the image sensor and configured to trigger the image sensor to acquire multiple frames of images with different exposure time lengths;
    an image processor, configured to receive the multiple frames of images acquired by the image sensor through continuous exposure, and to execute the high dynamic range image synthesis method according to any one of claims 1 to 8 on the received multiple frames of images to obtain a high dynamic range image.
PCT/CN2021/083350 2020-04-14 2021-03-26 高动态范围图像合成方法、装置、图像处理芯片及航拍相机 WO2021208706A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/938,517 US20230038844A1 (en) 2020-04-14 2022-10-06 High dynamic range image synthesis method and apparatus, image processing chip and aerial camera

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010291571.0 2020-04-14
CN202010291571.0A CN111479072B (zh) 2020-04-14 2020-04-14 高动态范围图像合成方法、装置、图像处理芯片及航拍相机

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/938,517 Continuation US20230038844A1 (en) 2020-04-14 2022-10-06 High dynamic range image synthesis method and apparatus, image processing chip and aerial camera

Publications (1)

Publication Number Publication Date
WO2021208706A1 true WO2021208706A1 (zh) 2021-10-21

Family

ID=71751968

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/083350 WO2021208706A1 (zh) 2020-04-14 2021-03-26 高动态范围图像合成方法、装置、图像处理芯片及航拍相机

Country Status (3)

Country Link
US (1) US20230038844A1 (zh)
CN (1) CN111479072B (zh)
WO (1) WO2021208706A1 (zh)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IT201900005536A1 (it) * 2019-04-10 2020-10-10 Doss Visual Solution S R L Metodo di acquisizione immagini per una macchina di ispezione ottica
CN111479072B (zh) * 2020-04-14 2021-12-17 深圳市道通智能航空技术股份有限公司 高动态范围图像合成方法、装置、图像处理芯片及航拍相机
CN111770243B (zh) * 2020-08-04 2021-09-03 深圳市精锋医疗科技有限公司 内窥镜的图像处理方法、装置、存储介质
CN111787183B (zh) * 2020-08-04 2021-09-03 深圳市精锋医疗科技有限公司 内窥镜的图像处理方法、装置、存储介质
WO2022041287A1 (zh) * 2020-08-31 2022-03-03 华为技术有限公司 获取图像的方法、装置、设备及计算机可读存储介质
CN114630053B (zh) * 2020-12-11 2023-12-12 青岛海信移动通信技术有限公司 一种hdr图像显示方法及显示设备
CN114650361B (zh) * 2020-12-17 2023-06-06 北京字节跳动网络技术有限公司 拍摄模式确定方法、装置、电子设备和存储介质
CN114820404A (zh) * 2021-01-29 2022-07-29 北京字节跳动网络技术有限公司 图像处理方法、装置、电子设备及介质
CN115037915B (zh) * 2021-03-05 2023-11-14 华为技术有限公司 视频处理方法和处理装置
KR20220156242A (ko) * 2021-05-18 2022-11-25 에스케이하이닉스 주식회사 이미지 처리 장치
US11863880B2 (en) * 2022-05-31 2024-01-02 Microsoft Technology Licensing, Llc Image frame selection for multi-frame fusion
CN117082355B (zh) * 2023-09-19 2024-04-12 荣耀终端有限公司 图像处理方法和电子设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101262564A (zh) * 2007-03-09 2008-09-10 索尼株式会社 图像处理装置、成像装置、图像处理方法和计算机程序
EP2175635A1 (en) * 2008-10-10 2010-04-14 Samsung Electronics Co., Ltd. Method and apparatus for creating high dynamic range image
CN107231530A (zh) * 2017-06-22 2017-10-03 维沃移动通信有限公司 一种拍照方法及移动终端
CN110572585A (zh) * 2019-08-26 2019-12-13 Oppo广东移动通信有限公司 图像处理方法、装置、存储介质及电子设备
CN111479072A (zh) * 2020-04-14 2020-07-31 深圳市道通智能航空技术有限公司 高动态范围图像合成方法、装置、图像处理芯片及航拍相机

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7239805B2 (en) * 2005-02-01 2007-07-03 Microsoft Corporation Method and system for combining multiple exposure images having scene and camera motion
JP5821571B2 (ja) * 2011-11-28 2015-11-24 富士通株式会社 画像合成装置及び画像合成方法
CN103973989B (zh) * 2014-04-15 2017-04-05 北京理工大学 获取高动态图像的方法及系统
US10638052B2 (en) * 2017-04-12 2020-04-28 Samsung Electronics Co., Ltd. Method and apparatus for generating HDR images
CN108419023B (zh) * 2018-03-26 2020-09-08 华为技术有限公司 一种生成高动态范围图像的方法以及相关设备
CN109005361A (zh) * 2018-08-06 2018-12-14 Oppo广东移动通信有限公司 控制方法、装置、成像设备、电子设备及可读存储介质
CN108881731B (zh) * 2018-08-06 2021-07-02 Oppo广东移动通信有限公司 全景拍摄方法、装置和成像设备
CN109005346B (zh) * 2018-08-13 2020-04-03 Oppo广东移动通信有限公司 控制方法、装置、电子设备和计算机可读存储介质
CN108989700B (zh) * 2018-08-13 2020-05-15 Oppo广东移动通信有限公司 成像控制方法、装置、电子设备以及计算机可读存储介质
CN109120862A (zh) * 2018-10-15 2019-01-01 Oppo广东移动通信有限公司 高动态范围图像获取方法、装置及移动终端
CN109286758B (zh) * 2018-10-15 2021-02-12 Oppo广东移动通信有限公司 一种高动态范围图像的生成方法、移动终端及存储介质
CN110381263B (zh) * 2019-08-20 2021-04-13 Oppo广东移动通信有限公司 图像处理方法、装置、存储介质及电子设备

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101262564A (zh) * 2007-03-09 2008-09-10 索尼株式会社 图像处理装置、成像装置、图像处理方法和计算机程序
EP2175635A1 (en) * 2008-10-10 2010-04-14 Samsung Electronics Co., Ltd. Method and apparatus for creating high dynamic range image
CN107231530A (zh) * 2017-06-22 2017-10-03 维沃移动通信有限公司 一种拍照方法及移动终端
CN110572585A (zh) * 2019-08-26 2019-12-13 Oppo广东移动通信有限公司 图像处理方法、装置、存储介质及电子设备
CN111479072A (zh) * 2020-04-14 2020-07-31 深圳市道通智能航空技术有限公司 高动态范围图像合成方法、装置、图像处理芯片及航拍相机

Also Published As

Publication number Publication date
US20230038844A1 (en) 2023-02-09
CN111479072A (zh) 2020-07-31
CN111479072B (zh) 2021-12-17

Similar Documents

Publication Publication Date Title
WO2021208706A1 (zh) 高动态范围图像合成方法、装置、图像处理芯片及航拍相机
US11089207B2 (en) Imaging processing method and apparatus for camera module in night scene, electronic device and storage medium
WO2020177723A1 (zh) 图像处理方法、夜间拍摄方法、图像处理芯片及航拍相机
CN111418201B (zh) 一种拍摄方法及设备
CN108419023B (zh) 一种生成高动态范围图像的方法以及相关设备
US11532076B2 (en) Image processing method, electronic device and storage medium
WO2020057198A1 (zh) 图像处理方法、装置、电子设备及存储介质
CN109218627B (zh) 图像处理方法、装置、电子设备及存储介质
JP6953311B2 (ja) 画素データに対して演算を実行するためのシステムおよび方法
CN110445988A (zh) 图像处理方法、装置、存储介质及电子设备
EP3306913B1 (en) Photographing method and apparatus
WO2014093042A1 (en) Determining an image capture payload burst structure based on metering image capture sweep
WO2014158241A1 (en) Viewfinder display based on metering images
WO2021160001A1 (zh) 图像获取方法和装置
US10609265B2 (en) Methods and apparatus for synchronizing camera flash and sensor blanking
TW201931844A (zh) 在不同影像擷取條件下組合光學及數位變焦
CN114189634B (zh) 图像采集方法、电子设备及计算机存储介质
CN107613190A (zh) 一种拍照方法及终端
EP3454547A1 (en) Imaging apparatus, image processing apparatus, imaging method, image processing method, and storage medium
CN116744120A (zh) 图像处理方法和电子设备
CN117135293B (zh) 图像处理方法和电子设备
CN107979729B (zh) 一种显示预览图像的方法及设备
WO2023185584A1 (zh) 一种飞行控制方法、无人飞行器以及可读存储介质
WO2021253167A1 (zh) 数字变焦成像方法、装置、相机以及无人飞行器系统
CN105208286A (zh) 一种模拟慢速快门的拍摄方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21788785

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21788785

Country of ref document: EP

Kind code of ref document: A1