WO2018176925A1 - Method and apparatus for generating an HDR image - Google Patents


Info

Publication number
WO2018176925A1
Authority
WO
WIPO (PCT)
Prior art keywords
reference image
image
pixel
value
threshold
Prior art date
Application number
PCT/CN2017/117106
Other languages
English (en)
Chinese (zh)
Inventor
Li Xin (李欣)
Song Mingli (宋明黎)
Chen Ke (陈柯)
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2018176925A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50: Control of the SSIS exposure
    • H04N25/57: Control of the dynamic range
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T7/00: Image analysis
    • G06T7/20: Analysis of motion

Definitions

  • the present invention relates to the field of image processing, and in particular, to a method and apparatus for generating an HDR image.
  • HDR: High-Dynamic Range imaging
  • LDR: Low-Dynamic Range imaging
  • High dynamic range imaging was originally used only for purely computer-generated images. Methods were later developed for generating HDR images from photographs taken over different exposure ranges. As handheld cameras have become more widespread and smartphones easier to use, many amateur photographers can now generate photographs of high-dynamic-range scenes through mobile applications.
  • HDR features are offered by HTC (High Tech Computer Corp.), Nokia, Samsung, and other mobile phone manufacturers.
  • A more traditional method was proposed by Paul Debevec in 1997. It uses the exposure times of the images to solve for the camera's response function, then uses that function to inversely map each image from pixel values back to the scene irradiance field, obtaining for each pixel an irradiance proportional to the actual scene brightness. The multiple images are then fused by weighted averaging in the irradiance domain, and tone mapping finally yields the HDR image.
  • The effect of such methods often depends on how well the camera response function is solved, and the response function is sensitive to image noise. To obtain an accurate response function, several high-quality images with different exposure times must be taken and sampled at points of differing actual brightness, which makes the operation complicated.
  • Moreover, the camera's response function may change over time, so calibration is required each time, which demands standardized operation by the user. If the user does not calibrate the response function well, the quality of the HDR synthesis is greatly reduced. A method relying on camera response-function calibration therefore gives a poor user experience, may also degrade the obtained image quality, and is unsuitable for handheld camera applications.
  • Another approach is to fuse in the image domain. Mertens proposed the exposure fusion method in 2009, which computes the saturation, contrast, and well-exposedness (a measure of how well an object is rendered) of each pixel in multiple images, combines these three measures into a weight for each pixel of each picture, and then blends the pictures using those weights. This method needs only the input images; it requires neither solving the camera response function nor a final tone mapping.
  • Both the Debevec method and the Mertens method can only handle static scenes; if objects in the scene move, ghosting results. If an object's motion is relatively large, the affected picture cannot be included among the pictures used for fusion, so only a few pictures are merged. Because brighter images have longer exposure times, the displacement of a moving object is larger in the brighter images than in the reference image, so the brighter images of a moving scene are often discarded. The fused result is then darker, object recognizability decreases, and the detail areas of the scene are not enhanced, so some detail is lost.
  • The present invention provides an HDR image generation method and apparatus that can detect a large motion area in the target scene and use a detail area to enhance the detail of the first fused HDR image, generating a target HDR image that is rich in detail.
  • An HDR image generating method comprises: acquiring a first motion region and a second motion region of a sequence of images, where the sequence comprises a reference image, a first non-reference image, and a second non-reference image whose exposure durations of the target scene increase in the order first non-reference image, reference image, second non-reference image; the first motion region is the region where the first non-reference image differs in gray value from the reference image, and the second motion region is the region where the second non-reference image differs in gray value from the reference image; and determining a target motion region according to a comparison result between a first ratio and a first threshold and a comparison result between a second ratio and the first threshold, where the first ratio is the proportion of the first motion region in the first non-reference image, the second ratio is the proportion of the second motion region in the second non-reference image, and the target motion region includes at least one connected region.
  • By means of the first threshold, a motion area of a certain size can be detected in the target scene and the range of the motion area expanded; that is, the present application can tolerate a large motion area in the scene and remove the ghost from the adjusted first HDR image, producing a second HDR image without ghosting.
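The gray-value comparison that yields a motion region can be sketched as follows; the difference threshold `diff_thresh` is an assumed illustrative parameter, since the text does not specify how large a gray-value difference counts as motion:

```python
import numpy as np

def motion_region(non_ref_gray, ref_gray, diff_thresh=10):
    """Boolean mask of pixels whose gray value differs from the reference
    image by more than diff_thresh (the motion region of this exposure)."""
    diff = np.abs(non_ref_gray.astype(np.int16) - ref_gray.astype(np.int16))
    return diff > diff_thresh
```

The first and second ratios compared against the first threshold are then simply `mask.mean()`, the fraction of the non-reference image covered by its motion region.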
  • The method further includes: determining an edge of the first non-reference image, the edge comprising edge pixels; determining, for each pixel in the first non-reference image, whether the value of the surrounding edge pixels is greater than a second threshold; if so, determining the area formed by all pixels of the first non-reference image whose surrounding edge-pixel value exceeds the second threshold as the detail area of the first non-reference image; and, based on the detail area, fusing the first non-reference image with the second HDR image to obtain the target HDR image.
  • By using the edge of the first non-reference image, the detail region is acquired to perform detail enhancement on the second HDR image, making the resulting target HDR image richer in detail.
  • Determining the target motion region according to the comparison results of the first ratio and the second ratio with the first threshold includes: when neither the first ratio nor the second ratio is greater than the first threshold, superimposing the first motion region and the second motion region and taking the superimposed region as the target motion region; or, when the first ratio is not greater than the first threshold and the second ratio is greater than the first threshold, determining that the first motion region is the target motion region; or, when the first ratio is greater than the first threshold and the second ratio is not greater than the first threshold, determining that the second motion region is the target motion region.
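The three-way comparison above can be summarized in a small helper. The 0.075 default follows the "not less than 7.5%" bound stated for the first threshold, and the string labels are illustrative only:

```python
def target_motion_source(ratio1, ratio2, threshold=0.075):
    """Decide which motion region(s) form the target motion region.

    ratio1/ratio2: fraction of the first/second non-reference image
    covered by its motion region.
    """
    if ratio1 <= threshold and ratio2 <= threshold:
        return "union"   # superimpose the first and second motion regions
    if ratio1 <= threshold and ratio2 > threshold:
        return "first"   # second exposure moved too much; keep first region
    if ratio1 > threshold and ratio2 <= threshold:
        return "second"
    return "none"        # both exceed the threshold; the detail area and
                         # reference image are fused instead (see below)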
  • The first weight value is the sum of the products of the saturation, contrast, and exposure degree of each pixel in the first non-reference image and of each pixel in the second non-reference image; or, the first weight value is the product of the saturation, contrast, and exposure degree of each pixel in the first non-reference image; or, the first weight value is the product of the saturation, contrast, and exposure degree of each pixel in the second non-reference image. The second weight value is the product of the saturation, contrast, and exposure degree of each pixel in the reference image.
  • Obtaining these values includes: when neither the first ratio nor the second ratio is greater than the first threshold, calculating the saturation, contrast, and exposure degree of each pixel in the first and second non-reference images, multiplying the three measures of each pixel to obtain that pixel's weight value in each image, and adding the weight value of each pixel in the first non-reference image to that of the corresponding pixel in the second non-reference image to obtain the first weight value; or, when the first ratio is not greater than the first threshold and the second ratio is greater than the first threshold, calculating the saturation, contrast, and exposure degree of each pixel in the first non-reference image and multiplying them to obtain the first weight value; the remaining case (first ratio greater than the threshold, second ratio not) is handled symmetrically using the second non-reference image.
  • the first threshold is not less than 7.5%.
  • Obtaining the first HDR image according to the first weight value and the second weight value includes: taking the weighted average of the first weight value of a first pixel and the second weight value of a second pixel as the weight value of a third pixel, where the first pixel is any one of the pixels for which the first weight value is determined and the second pixel is the pixel corresponding to the first pixel in the reference image; the first HDR image is then obtained from the weight values of all the third pixels.
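A sketch of how the per-pixel weight maps might then combine the aligned exposures. The normalised weighted average is an assumed standard exposure-fusion blend; the text only fixes that the averaged weight becomes the third pixel's weight, not the exact blending formula:

```python
import numpy as np

def first_hdr(images, weight_maps, eps=1e-12):
    """Blend aligned exposures with per-pixel weights.

    images: list of (H, W, 3) float arrays.
    weight_maps: list of (H, W) float arrays, one per image.
    """
    w = np.stack(weight_maps).astype(float) + eps
    w /= w.sum(axis=0, keepdims=True)           # normalise across exposures
    return (np.stack(images) * w[..., None]).sum(axis=0)
```

With equal weights this reduces to a plain average; where one exposure's weight dominates, its pixels dominate the first HDR image.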
  • The method further includes: scanning the image in which the target motion region is located and, when that image contains pixels whose pixel value equals a third threshold, marking those pixels; the area formed by all the marked pixels is the connected region.
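The scan-and-mark step amounts to connected-component labelling of the marked pixels. A minimal 4-connected flood-fill sketch follows; the third-threshold test is assumed to have already produced the boolean mask passed in:

```python
def connected_regions(mask):
    """4-connected component labelling of a boolean mask.

    Returns (labels, count): labels is a grid where each marked pixel
    carries its region id (1..count); unmarked pixels carry 0.
    """
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    count = 0
    for sy in range(h):
        for sx in range(w):
            if mask[sy][sx] and not labels[sy][sx]:
                count += 1
                stack = [(sy, sx)]           # iterative flood fill
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and mask[y][x] and not labels[y][x]:
                        labels[y][x] = count
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, count
```

Each resulting component is one connected region of the target motion area, over which the first- and second-weight sums are later accumulated.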
  • Determining the image that serves as the luminance reference according to the comparison of the sum of the first weights and the sum of the second weights includes: when the sum of the first weights is not less than the sum of the second weights, taking the reference image as the luminance reference; when the sum of the first weights is less than the sum of the second weights, taking the first non-reference image as the luminance reference.
  • Performing the brightness adjustment on the first HDR image using the reference image to obtain the adjusted first HDR image includes: when the sum of the first weights is not less than the sum of the second weights, calculating the luminance value of the first HDR image and that of the reference image and using their difference to brighten the first HDR image; or, when the sum of the first weights is less than the sum of the second weights, calculating the luminance value of the first HDR image and that of the first non-reference image and using their difference to dim the first HDR image, obtaining the adjusted first HDR image in either case.
  • Using the difference between the luminance value of the first HDR image and that of the reference image, the brightness adjustment of the first HDR image to obtain the adjusted first HDR image includes: taking the luminance value of the reference image as the benchmark and multiplying the luminance value of the first HDR image by a preset first luminance ratio, to obtain a first HDR image with the same luminance value as the reference image.
  • When the sum of the first weights is smaller than the sum of the second weights, dimming the first HDR image to obtain the adjusted first HDR image includes: taking the luminance value of the first non-reference image as the benchmark and multiplying the luminance value of the first HDR image by a preset second luminance ratio, to obtain a first HDR image with the same luminance value as the first non-reference image.
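The luminance-ratio adjustment can be sketched as scaling by the ratio of mean luminances. Using the mean luminance as the benchmark for the "preset luminance ratio" is an assumption, since the text does not define how that ratio is preset:

```python
import numpy as np

def match_brightness(hdr_lum, target_lum):
    """Scale the first HDR image's luminance so its mean matches the
    benchmark image: the reference image when brightening, or the first
    non-reference image when dimming.

    hdr_lum, target_lum: float luminance arrays in [0, 1].
    """
    ratio = target_lum.mean() / max(hdr_lum.mean(), 1e-12)
    return np.clip(hdr_lum * ratio, 0.0, 1.0)
```

The same routine covers both the brightening and the dimming branch: the ratio is above 1 when the benchmark is brighter and below 1 when it is darker.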
  • The method further includes performing guided filtering and exposure correction on the target HDR image, so that each part of the target HDR image is clearer.
  • The method further includes: determining an edge of the first non-reference image, the edge comprising edge pixels; determining, for each pixel in the first non-reference image, whether the value of the surrounding edge pixels is greater than a second threshold; if so, determining the area formed by all pixels of the first non-reference image whose surrounding edge-pixel value exceeds the second threshold as the detail area of the first non-reference image; and, when the first ratio and the second ratio are both greater than the first threshold, fusing the detail area with the reference image to obtain the target HDR image.
  • Determining the edge of the first non-reference image includes: performing edge detection on each pixel of the first non-reference image using the Canny operator, and determining the edge of the first non-reference image according to the edge detection result.
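A sketch of the detail-region test described above. A gradient-magnitude threshold stands in for the Canny operator (in practice `cv2.Canny` would typically be used), and the 3x3 neighbourhood, edge threshold, and count threshold are all illustrative assumptions:

```python
import numpy as np

def detail_region(gray, edge_thresh=0.2, count_thresh=3):
    """Mark pixels with more than count_thresh edge pixels in their 3x3
    neighbourhood as belonging to the detail region.

    gray: float grayscale image in [0, 1].
    """
    gy, gx = np.gradient(gray.astype(float))
    edges = np.hypot(gx, gy) > edge_thresh        # stand-in for Canny edges
    # count edge pixels in each 3x3 neighbourhood (wrap-around at borders)
    counts = sum(np.roll(np.roll(edges, dy, 0), dx, 1)
                 for dy in (-1, 0, 1) for dx in (-1, 0, 1))
    return counts > count_thresh
```

Pixels near strong edges end up inside the detail region, which is then used to enhance the corresponding areas of the fused HDR image.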
  • The present invention provides an apparatus for generating an HDR image, the apparatus comprising: an acquiring unit configured to acquire a first motion area and a second motion area of a sequence of images, where the image sequence includes a reference image, a first non-reference image, and a second non-reference image whose exposure durations of the target scene increase in the order first non-reference image, reference image, second non-reference image; the first motion region is the region where the first non-reference image differs in gray value from the reference image, and the second motion region is the region where the second non-reference image differs in gray value from the reference image; and a determining unit configured to determine a target motion region according to a comparison result between a first ratio and a first threshold and a comparison result between a second ratio and the first threshold, where the first ratio is the proportion of the first motion region in the first non-reference image, the second ratio is the proportion of the second motion region in the second non-reference image, and the target motion region includes at least one connected region.
  • The determining unit is further configured to determine an edge of the first non-reference image, the edge comprising edge pixels. The apparatus further includes a judging unit configured to determine, for each pixel in the first non-reference image, whether the value of the surrounding edge pixels is greater than a second threshold; the determining unit is further configured to, if so, determine the area formed by all pixels of the first non-reference image whose surrounding edge-pixel value exceeds the second threshold as the detail area of the first non-reference image; and the fusion unit is further configured to fuse, based on the detail area, the first non-reference image with the second HDR image to obtain the target HDR image.
  • The determining unit is specifically configured to: when neither the first ratio nor the second ratio is greater than the first threshold, superimpose the first motion region and the second motion region and take the superimposed region as the target motion region; or, when the first ratio is not greater than the first threshold and the second ratio is greater than the first threshold, determine that the first motion region is the target motion region; or, when the first ratio is greater than the first threshold and the second ratio is not greater than the first threshold, determine that the second motion region is the target motion region.
  • The processing unit is specifically configured to take the weighted average of the first weight value of the first pixel and the second weight value of the second pixel as the weight value of the third pixel, where the first pixel is any one of the pixels for which the first weight value is determined and the second pixel is the pixel corresponding to the first pixel in the reference image; the first HDR image is obtained from the weight values of all the third pixels.
  • The device further includes a scanning unit configured to scan the image in which the target motion area is located and, when that image contains pixels whose pixel value equals a third threshold, mark those pixels; the determining unit is further configured to determine that the area formed by all marked pixels is the connected area.
  • The determining unit is specifically configured to: when the sum of the first weights is not less than the sum of the second weights, determine that the reference image serves as the luminance reference; when the sum of the first weights is less than the sum of the second weights, determine that the first non-reference image serves as the luminance reference.
  • The processing unit is specifically configured to: when the sum of the first weights is not less than the sum of the second weights, calculate the luminance value of the first HDR image and that of the reference image and use their difference to perform brightness adjustment on the first HDR image, obtaining the adjusted first HDR image; or, when the sum of the first weights is less than the sum of the second weights, calculate the luminance value of the first HDR image and that of the first non-reference image and use their difference to dim the first HDR image, obtaining the adjusted first HDR image.
  • The apparatus further includes a guided filtering and exposure correction unit configured to perform guided filtering and exposure correction on the target HDR image.
  • The determining unit is further configured to: determine an edge of the first non-reference image, the edge comprising edge pixels; determine, for each pixel in the first non-reference image, whether the value of the surrounding edge pixels is greater than a second threshold; and, if so, determine the area formed by all pixels of the first non-reference image whose surrounding edge-pixel value exceeds the second threshold as the detail area of the first non-reference image. The fusion unit is further configured to, when the first ratio and the second ratio are both greater than the first threshold, fuse the detail region with the reference image to obtain the target HDR image.
  • The determining unit is specifically configured to perform edge detection on each pixel of the first non-reference image using the Canny operator, and to determine the edge of the first non-reference image according to the edge detection result.
  • An embodiment of the present invention provides a computer storage medium for storing computer software instructions for generating the HDR image, the instructions including a program designed to execute the method of the first aspect.
  • The present invention provides an apparatus for generating an HDR image, the generating apparatus comprising a processor, a memory, and a computer program stored in the memory and runnable on the processor, the processor executing the program to perform the following:
  • acquiring a first motion region and a second motion region of an image sequence, where the image sequence includes a reference image, a first non-reference image, and a second non-reference image whose exposure durations of the target scene increase in the order first non-reference image, reference image, second non-reference image; the first motion region is the region where the first non-reference image differs in gray value from the reference image, and the second motion region is the region where the second non-reference image differs in gray value from the reference image;
  • determining a target motion region according to a comparison result between the first ratio and the first threshold and a comparison result between the second ratio and the first threshold, where the first ratio is the proportion of the first motion region in the first non-reference image, the second ratio is the proportion of the second motion region in the second non-reference image, and the target motion region includes at least one connected region;
  • the first weight value is the sum of the products of the saturation, contrast, and exposure degree of each pixel in the first non-reference image and of each pixel in the second non-reference image; or, the first weight value is the product of the saturation, contrast, and exposure degree of each pixel in the first non-reference image; or, the first weight value is the product of the saturation, contrast, and exposure degree of each pixel in the second non-reference image;
  • the second weight value is the product of the saturation, contrast, and exposure degree of each pixel in the reference image;
  • the sum of the first weights is the sum of the weight values of the pixels of the region of the reference image that overlaps the connected region;
  • the sum of the second weights is the sum of the weight values of the pixels of the region of the first non-reference image that overlaps the connected region, where a pixel's weight value is the product of its saturation, contrast, and exposure degree;
  • the adjusted first HDR image is fused with the reference image or the first non-reference image based on the connected region to obtain a second HDR image.
  • the processor is further configured to execute:
  • if the value of the edge pixels around a pixel is greater than the second threshold, determining the area formed by all pixels of the first non-reference image whose surrounding edge-pixel value exceeds the second threshold as the detail region of the first non-reference image;
  • the first non-reference image and the second HDR image are merged based on the detail region to obtain a target HDR image.
  • determining a target motion area according to a comparison result between the first ratio and the first threshold and a comparison result between the second ratio and the first threshold including:
  • the first ratio and the second ratio are not greater than the first threshold, the first motion region and the second motion region are superimposed to determine that the superposed region is the target motion region;
  • the first weight value is the sum of the products of the saturation, contrast, and exposure degree of each pixel in the first non-reference image and of each pixel in the second non-reference image; or, the first weight value is the product of the saturation, contrast, and exposure degree of each pixel in the first non-reference image; or, the first weight value is the product of the saturation, contrast, and exposure degree of each pixel in the second non-reference image; and the second weight value is the product of the saturation, contrast, and exposure degree of each pixel in the reference image, including:
  • the first threshold is not less than 7.5%.
  • the first HDR image is obtained according to the first weight value and the second weight value, including:
  • taking the weighted average of the first weight value of the first pixel and the second weight value of the second pixel as the weight value of the third pixel, where the first pixel is any one of the pixels for which the first weight value is determined and the second pixel is the pixel corresponding to the first pixel in the reference image;
  • a first HDR image is obtained based on the obtained weight values of all the third pixel points.
  • the method further includes:
  • determining a reference image according to a comparison result of a sum of the first weight and the second weight including:
  • the brightness adjustment process is performed on the first HDR image by using the reference image to obtain the adjusted first HDR image, including:
  • the first HDR image is dimmed to obtain an adjusted first HDR image.
  • brightening the first HDR image using the difference between the luminance value of the first HDR image and that of the reference image to obtain the adjusted first HDR image, including:
  • dimming the first HDR image using the difference between the luminance value of the first HDR image and that of the first non-reference image to obtain the adjusted first HDR image, including:
  • the luminance value of the first HDR image is multiplied with the preset second luminance ratio based on the luminance value of the first non-reference image to obtain a first HDR image having the same luminance value as the first non-reference image.
  • the method further includes:
  • the processor is further configured to execute:
  • if the value of the edge pixels around a pixel is greater than the second threshold, determining the area formed by all pixels of the first non-reference image whose surrounding edge-pixel value exceeds the second threshold as the detail region of the first non-reference image;
  • the detail region and the reference image are merged to obtain a target HDR image.
  • determining an edge of the first non-reference image includes:
  • the edge of the first non-reference image is determined based on the result of the edge detection.
  • FIG. 1 is a flowchart of a method for generating an HDR image according to an embodiment of the present invention;
  • FIG. 2 is a schematic diagram of obtaining a final connected area according to an embodiment of the present invention;
  • FIG. 3 is a schematic diagram of obtaining a second HDR image according to an embodiment of the present invention;
  • FIG. 4 is a schematic diagram of obtaining a target HDR image according to an embodiment of the present invention;
  • FIG. 5 is a schematic structural diagram of an apparatus for generating an HDR image according to an embodiment of the present invention;
  • FIG. 6 is another schematic structural diagram of an apparatus for generating an HDR image according to an embodiment of the present invention;
  • FIG. 7 is a schematic structural diagram of still another apparatus for generating an HDR image according to an embodiment of the present invention.
  • the application scenario of the present application is the field of image processing, in which the method for generating an HDR image provided by the present application can be applied.
  • FIG. 1 is a flowchart of a method for generating an HDR image according to an embodiment of the present invention. As shown in FIG. 1, the HDR image generation method includes the following steps:
  • Step 101 Acquire a first motion area and a second motion area of the image sequence.
  • the image sequence includes a reference image, a first non-reference image, and a second non-reference image.
  • the exposure durations of the first non-reference image, the reference image, and the second non-reference image for the target scene increase in sequence; the first motion region is the region in which the gray value of the first non-reference image differs from that of the reference image, and the second motion region is the region in which the gray value of the second non-reference image differs from that of the reference image.
  • the present application only uses three images as an example for illustration. It can be understood that the number of images may be three or more.
  • the exposure times of the first non-reference image and the second non-reference image are the closest to that of the reference image; the exposure time of the first non-reference image is shorter than that of the reference image, and the exposure time of the second non-reference image is longer than that of the reference image.
  • three images of the target scene with gradually increasing exposure times may be taken by a terminal device, where the terminal device may be a device having a camera, including but not limited to a camera (such as a digital camera), a video camera, a mobile phone (such as a smart phone), a tablet (Pad), a personal digital assistant (PDA), a portable device (for example, a portable computer), a wearable device, or the like.
  • the RGB maps of the original first non-reference image, the original reference image, and the original second non-reference image are respectively converted into grayscale images.
  • histogram matching (HM) is performed, and the SURF feature point detection algorithm is used to obtain homography matrices for the original first non-reference image, the original reference image, and the dimmed original second non-reference image. Each homography matrix is used to map the corresponding image so that the images are aligned, eliminating motion caused by camera shake and obtaining the first non-reference image, the reference image, and the second non-reference image (see 201, 202, and 203 in FIG. 2).
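The alignment step above maps each image through a homography estimated from matched feature points (SURF in the source). As an illustrative sketch, given point correspondences, the homography itself can be estimated with the direct linear transform (DLT); the feature detection and matching stage is omitted, and the function names are our own:

```python
import numpy as np

def estimate_homography(src, dst):
    """Direct linear transform: find H (3x3) with dst ~ H @ src in homogeneous coords."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    h = vt[-1].reshape(3, 3)          # null-space vector = homography up to scale
    return h / h[2, 2]

def apply_homography(h, pts):
    """Map 2-D points through the homography (the per-image mapping step)."""
    pts = np.asarray(pts, dtype=float)
    homog = np.c_[pts, np.ones(len(pts))] @ h.T
    return homog[:, :2] / homog[:, 2:3]
```

With at least four non-degenerate correspondences, the estimated matrix reproduces the mapping; a real pipeline would obtain the correspondences from a feature detector and warp whole images rather than points.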
  • Obtaining the first motion area and the second motion area of the image sequence specifically includes:
  • the RGB maps of the first non-reference image, the reference image, and the second non-reference image are respectively converted into grayscale maps (see 204, 205, and 206 in FIG. 2, respectively).
  • HM is performed on the grayscale image of the first non-reference image, with the grayscale image of the reference image as the standard, to brighten the grayscale image of the first non-reference image (see 207 in FIG. 2);
  • performing histogram matching on the grayscale image of the second non-reference image and dimming the grayscale image of the second non-reference image (see 208 in FIG. 2).
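The histogram matching used to brighten or dim the grayscale images can be sketched with a classical CDF-mapping implementation (a simplified stand-in for the HM step; the function name is ours):

```python
import numpy as np

def match_histograms(source, reference):
    """Remap the gray levels of `source` so its histogram matches `reference`."""
    src_vals, src_idx, src_cnt = np.unique(source.ravel(),
                                           return_inverse=True, return_counts=True)
    ref_vals, ref_cnt = np.unique(reference.ravel(), return_counts=True)
    src_cdf = np.cumsum(src_cnt) / source.size      # cumulative distribution of source
    ref_cdf = np.cumsum(ref_cnt) / reference.size   # cumulative distribution of reference
    # for each source level, pick the reference level with the closest CDF value
    mapped = np.interp(src_cdf, ref_cdf, ref_vals)
    return mapped[src_idx].reshape(source.shape)
```

Matching the dark grayscale image against the reference brightens it, and matching the bright one against the reference dims it, which is how 207 and 208 in FIG. 2 are obtained.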
  • for example, suppose the gray value of a pixel in the dimmed grayscale image of the second non-reference image is 200 and the gray value of the corresponding pixel in the grayscale image of the reference image is 100; the difference between the gray values is 100. If the preset threshold is 50, the difference between the gray values is greater than the preset threshold, so the pixel value of that pixel is set to 1.
  • threshold filtering and an erosion-dilation operation are performed to denoise the mask and obtain the second motion region: the pixels whose value is 1 form the white area of 210 in FIG. 2, the pixels whose value is 0 form the black area of 210 in FIG. 2, and the obtained second motion region is shown at 210 in FIG. 2.
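The thresholded gray-value difference and the erosion-dilation denoising can be sketched as follows (the threshold value and 3×3 structuring element are illustrative choices, not specified by the source):

```python
import numpy as np
from scipy import ndimage

def motion_mask(adjusted_gray, reference_gray, diff_threshold=50):
    """Pixels whose gray-value difference exceeds the threshold are set to 1,
    then an erosion followed by a dilation (opening) removes isolated noise."""
    diff = np.abs(adjusted_gray.astype(np.int32) - reference_gray.astype(np.int32))
    mask = (diff > diff_threshold).astype(np.uint8)
    return ndimage.binary_opening(mask, structure=np.ones((3, 3))).astype(np.uint8)
```

A genuine motion blob survives the opening, while single stray pixels caused by noise are erased, which is the "de-noised" motion region described above.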
  • Step 102 Determine a target motion area according to a comparison result between the first ratio and the first threshold and a comparison result between the second ratio and the first threshold.
  • the first ratio is a proportion of the first motion area in the first non-reference image
  • the second ratio is a proportion of the second motion area in the second non-reference image
  • the target motion area includes at least one connected area.
  • a connected area generally refers to an image region composed of adjacent foreground pixels having the same pixel value.
  • determining the target motion area according to a comparison result between the first ratio and the first threshold and a comparison result between the second ratio and the first threshold including:
  • the first ratio and the second ratio are not greater than the first threshold, the first motion area and the second motion area are superimposed to determine that the superimposed area is the target motion area; or
  • when the first ratio and the second ratio are not greater than the first threshold, the first motion region and the second motion region are superimposed, and the obtained target motion region is shown at 211 in FIG. 2.
  • the method further includes: scanning the image in which the target motion region is located; when there is a pixel whose value equals the third threshold in that image, marking every pixel whose value is the third threshold; and determining that the area formed by all the marked pixels is the connected area.
  • the third threshold is 1, and the area formed by the pixel points having all pixel values of 1 in the target motion area is the connected area.
  • when the third threshold is 0, the area formed by all pixels whose value is 0 in the target motion region is determined, and the remaining area of the image in which the target motion region is located, excluding the pixels whose value is 0, is the connected area.
  • the first motion area includes two moving areas, such as the head and the foot
  • the head is a connected area
  • the foot is a connected area.
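The scan-and-mark procedure above (group all pixels equal to the third threshold by adjacency) is standard connected-component labeling; `scipy.ndimage.label` is used here as an illustrative stand-in, with a toy mask containing a "head" and a "foot" region:

```python
import numpy as np
from scipy import ndimage

# a toy target motion area with two moving parts, e.g. "head" and "foot"
target = np.zeros((8, 8), dtype=np.uint8)
target[1:3, 1:3] = 1      # head
target[5:7, 2:5] = 1      # foot

# label marks all pixels equal to 1 and groups adjacent marked pixels
labels, num_regions = ndimage.label(target)
```

Here `num_regions` is 2: the head forms one connected area and the foot another, matching the example in the text.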
  • the number of connected areas is 0.
  • the number of connected regions is 1.
  • the reference image is used as the guide image, and guided filtering (GF) is performed on the connected area, so that the connected area after GF is closer to the edges of the objects in the actual scene.
  • GF intelligently expands the connected area so that the connected area after GF more closely resembles a complete region, appropriately enlarging it. If GF is not performed, then, because the person's hand in the scene is moving, the obtained connected area will be missing the edge of the hand, that is, the finger positions; performing GF on the connected region of the target motion region expands the connected region in the edge region (i.e., the finger positions), thereby avoiding ghosting at the finger positions. The connected area obtained after guided filtering is shown at 212 in FIG. 2.
  • after GF, the pixel values of the connected region become small values between 0 and 1. The connected region before GF is superimposed with the connected region after GF; pixel values greater than 1 are set to 1, and values less than 1 remain unchanged, so that the resulting connected region looks like an entire moving object rather than a partially moving object and the transition is more natural. See 211 and 212 in FIG. 2: the connected region 211 before guided filtering and the connected region 212 after guided filtering are superimposed to obtain the final connected region 213.
  • when the fusion processing is performed based on the connected region, the fusion uses the finally obtained connected region, that is, 213 in FIG. 2.
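The guided filtering and the superimpose-and-clip step can be sketched with a plain-numpy implementation of the gray-scale guided filter (He et al.); the radius and regularization `eps`, and the function names, are illustrative choices not specified by the source:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide, mask, radius=4, eps=1e-3):
    """Gray-scale guided filter: smooths `mask` while following edges of `guide`."""
    size = 2 * radius + 1
    mean_i = uniform_filter(guide, size)
    mean_p = uniform_filter(mask, size)
    var_i = uniform_filter(guide * guide, size) - mean_i * mean_i
    cov_ip = uniform_filter(guide * mask, size) - mean_i * mean_p
    a = cov_ip / (var_i + eps)          # local linear model: q = a * guide + b
    b = mean_p - a * mean_i
    return uniform_filter(a, size) * guide + uniform_filter(b, size)

def refine_connected_region(guide, mask):
    """Superimpose the region before and after GF; values above 1 are clipped to 1."""
    return np.minimum(mask + guided_filter(guide, mask), 1.0)
```

The refined mask follows object edges in the guide (reference) image while never dropping below full selection inside the original region after clipping, which matches the superposition rule described above.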
  • the first threshold may be set as needed and is not less than 7.5%. The present application can therefore detect a motion area of a certain size in the target scene and expand the range of the motion area; that is, the present application can allow a larger motion area to exist in the scene.
  • Step 103 Obtain a first HDR image according to the first weight value and the second weight value.
  • the first weight value is a sum of a product of saturation, contrast, and exposure degree of each pixel in the first non-reference image and a product of saturation, contrast, and exposure degree of each pixel in the second non-reference image.
  • the first weight value is a product of saturation, contrast, and exposure degree of each pixel in the first non-reference image; or, the first weight value is saturation of each pixel in the second non-reference image, The product of the contrast and the degree of exposure; the second weight is the product of the saturation, contrast, and exposure level of each pixel in the reference image.
  • the following describes how to calculate the first weight value and the second weight value.
  • when the first ratio and the second ratio are not greater than the first threshold, the saturation, contrast, and exposure degree of each pixel in the first non-reference image and in the second non-reference image are calculated; the saturation, contrast, and exposure degree of each pixel in the first non-reference image are multiplied to obtain a weight value for each pixel in the first non-reference image, and the saturation, contrast, and exposure degree of each pixel in the second non-reference image are multiplied to obtain a weight value for each pixel in the second non-reference image; the weight value of each pixel in the first non-reference image and the weight value of the corresponding pixel in the second non-reference image are added to obtain the first weight value; or,
  • the first ratio is not greater than the first threshold and the second ratio is greater than the first threshold, calculating saturation, contrast, and exposure level of each pixel in the first non-reference image, in the first non-reference image Multiplying the saturation, contrast, and exposure level of each pixel to obtain the first weight value; or,
  • obtaining the first HDR image according to the first weight value and the second weight value includes: weighting and averaging the first weight value of a first pixel and the second weight value of a second pixel, and using the weighted average as the weight value of a third pixel, where the first pixel is any one of the pixels for which a first weight value is determined and the second pixel is the pixel in the reference image corresponding to the first pixel; and obtaining the first HDR image according to the weight values of all the third pixels.
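The per-pixel weight (product of saturation, contrast, and exposure degree) and the weighted averaging can be sketched as below. The source does not define the three measures precisely, so the standard exposure-fusion choices are assumed here: channel standard deviation for saturation, Laplacian magnitude for contrast, and a Gaussian around mid-gray for exposure degree:

```python
import numpy as np
from scipy.ndimage import laplace

def quality_weight(img):
    """Per-pixel weight = contrast * saturation * exposure degree.
    `img` is float RGB in [0, 1] with shape (H, W, 3)."""
    gray = img.mean(axis=2)
    contrast = np.abs(laplace(gray))                 # response of a Laplacian filter
    saturation = img.std(axis=2)                     # spread across the R, G, B channels
    well_exposed = np.exp(-((img - 0.5) ** 2) / (2 * 0.2 ** 2)).prod(axis=2)
    return contrast * saturation * well_exposed + 1e-12   # keep weights strictly positive

def weighted_fuse(images):
    """Normalize the weights across exposures and average the images with them."""
    w = np.stack([quality_weight(im) for im in images])
    w /= w.sum(axis=0, keepdims=True)
    return (w[..., None] * np.stack(images)).sum(axis=0)
```

Because the normalized weights form a convex combination at every pixel, the fused result always lies between the darkest and brightest input value at that pixel.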
  • Step 104: Determine a reference image according to a comparison result between the sum of the first weights and the sum of the second weights.
  • the sum of the first weights is the sum of the weight values of the pixels in the region of the reference image that overlaps the connected area, and the sum of the second weights is the sum of the weight values of the pixels in the region of the first non-reference image that overlaps the connected area.
  • the determining the reference image according to the comparison result of the sum of the first weight and the second weight comprises: determining that the sum of the first weights is not less than a sum of the second weights The reference image is a reference image; and when the sum of the first weights is less than the sum of the second weights, determining that the first non-reference image is a reference image.
  • the reference image is determined to be a reference image.
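Step 104 reduces to comparing two weight sums over the connected area; a minimal sketch (the function and argument names are ours):

```python
import numpy as np

def choose_base(weights_reference, weights_first_nonref, connected_mask):
    """Pick the image whose weight sum over the connected area is larger;
    per the method, a tie (sum1 >= sum2) goes to the reference image."""
    sum1 = weights_reference[connected_mask > 0].sum()
    sum2 = weights_first_nonref[connected_mask > 0].sum()
    return "reference" if sum1 >= sum2 else "first_non_reference"
```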
  • Step 105 Perform brightness adjustment processing on the first HDR image by using the reference image to obtain an adjusted first HDR image.
  • performing brightness adjustment processing on the first HDR image by using the reference image to obtain an adjusted first HDR image includes:
  • when the reference image is determined as the reference, performing brightness adjustment processing on the first HDR image to obtain the adjusted first HDR image includes: multiplying the brightness value of the first HDR image by a preset first brightness ratio, based on the brightness value of the reference image, to obtain a first HDR image having the same brightness value as the reference image;
  • when the first non-reference image is determined as the reference, performing dimming processing on the first HDR image to obtain the adjusted first HDR image includes: multiplying the brightness value of the first HDR image by a preset second brightness ratio, based on the brightness value of the first non-reference image, to obtain a first HDR image having the same brightness value as the first non-reference image.
  • the resulting adjusted first HDR image is shown at 301 in FIG. 3. Since the first HDR image has ghosts, de-ghosting is performed on it; step 106 below describes in detail how to remove the ghosts of the first HDR image.
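The brightness adjustment multiplies the first HDR image by a preset luminance ratio. As an illustration, the ratio can be derived from the mean luminances of the two images; this derivation is our assumption, since the source only states that the ratio is preset:

```python
import numpy as np

def adjust_brightness(hdr, base):
    """Scale `hdr` so its mean luminance matches that of `base` (both in [0, 1])."""
    ratio = base.mean() / max(float(hdr.mean()), 1e-12)  # stands in for the preset ratio
    return np.clip(hdr * ratio, 0.0, 1.0)
```

When the base is brighter than the HDR image this brightens it (the reference-image case); when the base is darker it dims it (the first-non-reference case).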
  • Step 106: Fuse the adjusted first HDR image with the reference image or the first non-reference image based on the connected area to obtain a second HDR image.
  • the fusion processing is Laplacian pyramid fusion. When Laplacian pyramid fusion is performed, the adjusted first HDR image is fused with the reference image or the first non-reference image; denote the two images a and b and the fusion result c. Part of c must be taken from a and contains no information from b; likewise, the part taken from b contains no information from a. The connected area is what distinguishes which information is taken from a and which from b: it is a black-and-white mask in which white means the pixel is taken from the adjusted first HDR image and black means the pixel is taken from the reference image or the first non-reference image. Strictly speaking, Laplacian pyramid fusion therefore requires three images: two source images (the finally synthesized second HDR image consists of pixels from these two) and a mask image indicating how each point is taken. The connected region serves as the mask image (its values need not be strictly black or white, i.e., strictly 0 or 1). The connected area here refers specifically to the final connected area, 213 in FIG. 2. Thereby, the ghosts of the adjusted first HDR image are removed.
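A minimal Laplacian-pyramid blend over the three images (source a, source b, and the connected-region mask) might look as follows; the down/upsampling uses a simple Gaussian scheme rather than any particular library's pyramid routines, and the level count is illustrative:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def _down(img):
    """Blur, then drop every other row and column."""
    return gaussian_filter(img, 1.0)[::2, ::2]

def _up(img, shape):
    """Zero-insert back to `shape`, then blur (x4 restores the signal energy)."""
    out = np.zeros(shape)
    out[::2, ::2] = img
    return gaussian_filter(out, 1.0) * 4.0

def blend_pyramid(a, b, mask, levels=3):
    """Laplacian-pyramid blend: take `a` where mask is 1 and `b` where it is 0."""
    ga, gb, gm = [a], [b], [mask]
    for _ in range(levels):                 # Gaussian pyramids of both sources + mask
        ga.append(_down(ga[-1]))
        gb.append(_down(gb[-1]))
        gm.append(_down(gm[-1]))
    # Laplacian levels: the detail lost between successive Gaussian levels
    la = [ga[i] - _up(ga[i + 1], ga[i].shape) for i in range(levels)] + [ga[-1]]
    lb = [gb[i] - _up(gb[i + 1], gb[i].shape) for i in range(levels)] + [gb[-1]]
    fused = [gm[i] * la[i] + (1.0 - gm[i]) * lb[i] for i in range(levels + 1)]
    out = fused[-1]                         # collapse the blended pyramid
    for i in range(levels - 1, -1, -1):
        out = _up(out, fused[i].shape) + fused[i]
    return out
```

Blending each pyramid level with a downsampled copy of the mask is what makes the seam between the two sources transition smoothly instead of showing a hard cut.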
  • the method further includes:
  • Step 107: Determine an edge of the first non-reference image, the edge including edge pixels; determine whether the value of the edge pixels around each pixel in the first non-reference image is greater than a second threshold; if so, determine the area formed by all pixels in the first non-reference image whose surrounding edge-pixel value is greater than the second threshold as the detail region of the first non-reference image; and, based on the detail region, fuse the first non-reference image with the second HDR image to obtain a target HDR image. Thereby, the details of the second HDR image are enhanced, and the obtained target HDR image is richer in detail.
  • an edge refers to the collection of pixels whose gray values change abruptly; it is the most basic feature of an image. The pixels on an edge are defined as edge pixels. If the edge-pixel value around a pixel is greater than the second threshold, the pixel is rich in detail. The second threshold may be set according to actual needs.
  • the fusion processing here may also be a Laplacian pyramid fusion process.
  • the role of the detail area is the same as that of the connected area mentioned in step 106, and is used as a mask image.
  • the detail area is used here to indicate which information is taken from the first non-reference image and which information is taken from the second HDR image, and details will not be described again.
  • determining an edge of the first non-reference image includes: performing edge detection on each pixel of the first non-reference image by using a Canny operator, and determining the edge of the first non-reference image according to the result of the edge detection.
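A sketch of the detail-region step: the source specifies a Canny operator, but for a dependency-light illustration a Sobel gradient-magnitude threshold stands in for Canny, and the fraction of edge pixels in a window around each pixel is taken as the "value of the edge pixels around" it. Both thresholds and the window size are illustrative:

```python
import numpy as np
from scipy import ndimage

def detail_region(gray, edge_thresh=0.2, density_thresh=0.15, win=7):
    """Edge map (Sobel magnitude as a stand-in for Canny), then keep pixels
    whose surrounding window contains many edge pixels."""
    gx = ndimage.sobel(gray, axis=1)
    gy = ndimage.sobel(gray, axis=0)
    edges = (np.hypot(gx, gy) > edge_thresh).astype(float)
    density = ndimage.uniform_filter(edges, size=win)  # edge pixels around each pixel
    return density > density_thresh
```

Textured areas (dense edges) end up inside the detail region; flat areas do not, so the subsequent fusion takes detail from the first non-reference image only where detail actually exists.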
  • the method further includes:
  • Step 108: Perform guided filtering and exposure correction on the target HDR image.
  • the purpose of the guided filtering is to denoise the second fused image, and the purpose of the exposure correction is brightness adjustment.
  • the detail area is referred to as 401 in FIG. 4, and the obtained target HDR image is referred to as 402 in FIG. 4.
  • the image obtained after guided filtering and exposure correction of the target HDR image is shown at 403 in FIG. 4.
  • the method for generating the HDR image further includes:
  • Step 109: Determine an edge of the first non-reference image, the edge including edge pixels; determine the area formed by all pixels in the first non-reference image whose surrounding edge-pixel value is greater than the second threshold as the detail region of the first non-reference image; and fuse the detail region with the reference image to obtain a target HDR image.
  • 201, 202, 203 in FIG. 2, 202, 301, and 302 in FIG. 3, and 201, 401, 402, and 403 in FIG. 4 are all RGB images, but in order to meet the requirements of the drawings of the specification, It is shown in Figures 2, 3 and 4 as a grayscale image.
  • when generating an HDR image, the method for generating an HDR image can acquire the motion region of each non-reference image and can use multiple motion regions to depict the motion of objects in the scene; it can even be used to synthesize HDR video.
  • FIG. 5 is a schematic structural diagram of an apparatus for generating an HDR image according to an embodiment of the present invention, for performing the method described in FIG. 1.
  • the HDR image generating apparatus 500 includes: an obtaining unit 510, a determining unit 520, a processing unit 530, and a fusion unit 540.
  • the acquiring unit 510 is configured to acquire a first motion area and a second motion area of the image sequence, where the image sequence includes a reference image, a first non-reference image, and a second non-reference image, the first non-reference image, The exposure time of the reference image and the second non-reference image to the target scene are sequentially incremented, and the first motion area is an area where the first non-reference image has a gray value difference with respect to the reference image, The second motion region is a region where the second non-reference image has a difference in gray value with respect to the reference image.
  • a determining unit 520 configured to determine a target motion region according to a comparison result between the first ratio and the first threshold and a comparison result between the second ratio and the first threshold; wherein the first ratio is the a ratio of the first motion area in the first non-reference image, the second ratio is a ratio of the second motion area in the second non-reference image, the target motion area including at least one Connected area.
  • the processing unit 530 is configured to obtain a first HDR image according to the first weight value and the second weight value, where the first weight value is the product of the saturation, contrast, and exposure degree of each pixel in the first non-reference image; or, the first weight value is the product of the saturation, contrast, and exposure degree of each pixel in the second non-reference image; or, the first weight value is the sum of the product of the saturation, contrast, and exposure degree of each pixel in the first non-reference image and the product of the saturation, contrast, and exposure degree of each pixel in the second non-reference image; the second weight value is the product of the saturation, contrast, and exposure degree of each pixel in the reference image.
  • the determining unit 520 is further configured to determine the reference image according to a comparison result between the sum of the first weights and the sum of the second weights, where the sum of the first weights is the sum of the weight values of the pixels in the region of the reference image overlapping the connected area, and the sum of the second weights is the sum of the weight values of the pixels in the region of the first non-reference image overlapping the connected area; the weight value of a pixel is the product of the saturation, contrast, and exposure degree of the pixel.
  • the processing unit 530 is further configured to perform brightness adjustment processing on the first HDR image by using the reference image to obtain an adjusted first HDR image.
  • the merging unit 540 is configured to fuse the adjusted first HDR image with the reference image or the first non-reference image based on the connected area to obtain a second HDR image.
  • the determining unit 520 is further configured to determine an edge of the first non-reference image, where the edge includes an edge pixel point;
  • FIG. 6 is another schematic structural diagram of an HDR image generating apparatus according to an embodiment of the present invention. As shown in FIG. 6, the HDR image generating apparatus 600 further includes a determining unit 610.
  • the determining unit 610 is configured to determine whether a value of the edge pixel point around each pixel point in the first non-reference image is greater than a second threshold.
  • the determining unit 520 is further configured to: if the value of the edge pixel point is greater than the second threshold, the value of the edge pixel point around the pixel point in the first non-reference image is greater than all pixels of the second threshold The area formed by the dots is determined as the detail area of the first non-reference image.
  • the merging unit 540 is further configured to perform fusion processing on the first non-reference image and the second HDR image based on the detail region to obtain a target HDR image.
  • the determining unit 520 is specifically configured to: when the first ratio and the second ratio are not greater than the first threshold, superimpose the first motion region and the second motion region to determine The superimposed area is the target motion area; or,
  • the first weight value is the sum of the product of the saturation, contrast, and exposure degree of each pixel in the first non-reference image and the product of the saturation, contrast, and exposure degree of each pixel in the second non-reference image; or, the first weight value is the product of the saturation, contrast, and exposure degree of each pixel in the first non-reference image; or, the first weight value is the product of the saturation, contrast, and exposure degree of each pixel in the second non-reference image; the second weight value is the product of the saturation, contrast, and exposure degree of each pixel in the reference image.
  • the first threshold is not less than 7.5%.
  • processing unit 530 is specifically configured to:
  • the device further includes: a scanning unit 620.
  • the scanning unit 620 is configured to scan the image in which the target motion area is located and, when there is a pixel whose value equals the third threshold in that image, mark the pixels whose value is the third threshold.
  • the determining unit 520 is further configured to determine that an area formed by all the marked pixels is the connected area.
  • determining unit 520 is specifically configured to:
  • the reference image is a reference image when a sum of the first weights is not less than a sum of the second weights
  • processing unit 530 is specifically configured to:
  • when the reference image is determined as the reference, performing brightness adjustment processing on the first HDR image to obtain the adjusted first HDR image includes: multiplying the brightness value of the first HDR image by a preset first brightness ratio, based on the brightness value of the reference image, to obtain a first HDR image having the same brightness value as the reference image.
  • when the first non-reference image is determined as the reference, performing dimming processing on the first HDR image to obtain the adjusted first HDR image includes: multiplying the brightness value of the first HDR image by a preset second brightness ratio, based on the brightness value of the first non-reference image, to obtain a first HDR image having the same brightness value as the first non-reference image.
  • the apparatus further includes: a guide filtering and exposure correction unit 630.
  • the guiding filtering and exposure correction unit 630 is configured to perform guided filtering and exposure correction on the target HDR image.
  • the determining unit 520 is further configured to determine an edge of the first non-reference image, where the edge includes an edge pixel point.
  • the determining unit 610 is further configured to determine whether a value of the edge pixel point around each pixel point in the first non-reference image is greater than a second threshold.
  • the determining unit 520 is further configured to: if the value of the edge pixel point is greater than the second threshold, the value of the edge pixel point around the pixel point in the first non-reference image is greater than all pixels of the second threshold The area formed by the dots is determined as the detail area of the first non-reference image.
  • the merging unit 540 is further configured to: when the first ratio and the second ratio are both greater than the first threshold, perform fusion processing on the detail region and the reference image to obtain a target HDR image.
  • the determining unit 520 is specifically configured to: perform edge detection on each pixel of the first non-reference image by using a canny operator; and determine, according to a result of the edge detection, the first non-reference image. edge.
  • FIG. 7 is still another schematic structural diagram of an apparatus for generating an HDR image according to an embodiment of the present invention, where the apparatus is used to perform the method illustrated in FIG. 1.
  • the system 700 includes a processor 710, a memory 720, a display 730, a receiver 740, a communication interface 750, and a system bus 760; the processor 710, the memory 720, the display 730, the receiver 740, and the communication interface 750 establish connections through the system bus 760. One or more programs are stored in the memory 720 and configured to be executed by the processor 710, and the one or more programs include all the instructions of the method for generating the HDR image.
  • the processor 710 can be a central processing unit (English: central processing unit, abbreviation: CPU).
  • the memory 720 may include a volatile memory (English: volatile memory), such as a random access memory (English: random-access memory, abbreviation: RAM); the memory may also include a non-volatile memory (English: non-volatile memory) , for example, read-only memory (English: read-only memory, abbreviation: ROM), flash memory, hard disk (English: hard disk drive, abbreviation: HDD) or solid state drive (English: solid state drive, abbreviation: SSD); memory 720 may also include a combination of the above types of memory.
  • An exemplary storage medium is coupled to processor 710 to enable processor 710 to read information from, and to write information to, the storage medium.
  • the storage medium can also be an integral part of the processor 710.
  • the processor 710 and the storage medium can be located in an ASIC. Additionally, the ASIC can be located in a core network interface device.
  • the processor 710 and the storage medium may also exist as discrete components in the core network interface device.
  • the functions described herein can be implemented in hardware, software, firmware, or any combination thereof.
  • the functions may be stored in a computer readable medium or transmitted as one or more instructions or code on a computer readable medium.
  • Computer readable media includes both computer storage media and communication media including any medium that facilitates transfer of a computer program from one location to another.
  • a storage medium may be any available media that can be accessed by a general purpose or special purpose computer.


Abstract

The present invention relates to an HDR image generation method and apparatus. The method comprises: acquiring a first motion region and a second motion region of an image sequence; determining a target motion region according to a comparison result between a first ratio and a first threshold and a comparison result between a second ratio and the first threshold; obtaining a first HDR image according to first weight values and second weight values; determining a reference image according to a comparison result between the sum of the first weight values and the sum of the second weight values; performing brightness adjustment processing on the first HDR image by means of the reference image to obtain an adjusted first HDR image; and, on the basis of a connected region, fusing the adjusted first HDR image with the reference image or a first non-reference image to obtain a second HDR image. A motion region of a certain size in a target scene can thus be detected, extending the range of the motion region; in other words, a relatively large motion region can be allowed to exist in the scene. Moreover, ghosting in the adjusted first HDR image is removed, so that the second HDR image is free of ghosts.
PCT/CN2017/117106 2017-03-31 2017-12-19 Procédé et appareil de génération d'image hdr WO2018176925A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710209247.8A CN108668093B (zh) 2017-03-31 2017-03-31 Hdr图像的生成方法及装置
CN201710209247.8 2017-03-31

Publications (1)

Publication Number Publication Date
WO2018176925A1 true WO2018176925A1 (fr) 2018-10-04

Family

ID=63674214

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/117106 WO2018176925A1 (fr) 2017-03-31 2017-12-19 Procédé et appareil de génération d'image hdr

Country Status (2)

Country Link
CN (1) CN108668093B (fr)
WO (1) WO2018176925A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112767281A (zh) * 2021-02-02 2021-05-07 北京小米松果电子有限公司 Image ghost removal method and apparatus, electronic device, and storage medium
CN113012070A (zh) * 2021-03-25 2021-06-22 常州工学院 Fuzzy-control-based method for acquiring image sequences of high-dynamic scenes
CN113313661A (zh) * 2021-05-26 2021-08-27 Oppo广东移动通信有限公司 Image fusion method and apparatus, electronic device, and computer-readable storage medium
CN113705509A (zh) * 2021-09-02 2021-11-26 北京云蝶智学科技有限公司 Method and apparatus for acquiring test question analysis information
CN114240813A (zh) * 2021-12-14 2022-03-25 成都微光集电科技有限公司 Image processing method and apparatus, device, and storage medium
CN115293994A (zh) * 2022-09-30 2022-11-04 腾讯科技(深圳)有限公司 Image processing method and apparatus, computer device, and storage medium

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110503044 2018-12-03 2019-11-26 神盾股份有限公司 Fingerprint sensor and fingerprint sensing method thereof
CN111489320A (zh) * 2019-01-29 2020-08-04 华为技术有限公司 Image processing method and apparatus
CN113674181A (zh) * 2020-05-13 2021-11-19 武汉Tcl集团工业研究院有限公司 Multi-exposure image alignment and fusion method and device
CN111915635A (zh) * 2020-08-21 2020-11-10 广州云蝶科技有限公司 Method and system for generating test question analysis information supporting self-marking
CN113012081A (zh) * 2021-01-28 2021-06-22 北京迈格威科技有限公司 Image processing method, apparatus and electronic system
CN113240614B (zh) * 2021-04-07 2023-02-10 华南理工大学 High-dynamic image fusion method suitable for ultra-strong arc-light scenes in K-TIG welding
CN113626633A (zh) * 2021-09-01 2021-11-09 北京云蝶智学科技有限公司 Picture retrieval method and apparatus
CN113723539A (zh) * 2021-09-02 2021-11-30 北京云蝶智学科技有限公司 Test question information collection method and apparatus
CN116233607B (zh) * 2021-12-01 2024-05-14 Oppo广东移动通信有限公司 Multi-exposure image processing method, apparatus, chip and electronic device
CN114664047A (zh) * 2022-05-26 2022-06-24 长沙海信智能系统研究院有限公司 Expressway fire identification method, apparatus and electronic device
CN117710264A (zh) * 2023-07-31 2024-03-15 荣耀终端有限公司 Image dynamic range calibration method and electronic device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140307044A1 (en) * 2013-04-15 2014-10-16 Qualcomm Incorporated Reference image selection for motion ghost filtering
CN105894484A (zh) * 2016-03-30 2016-08-24 山东大学 HDR reconstruction algorithm based on histogram normalization and superpixel segmentation
CN106056629A (zh) * 2016-05-31 2016-10-26 南京大学 High-dynamic-range imaging method for removing ghosts through moving-object detection and extension

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101852616B (zh) * 2010-04-30 2012-07-11 北京航空航天大学 Method and apparatus for extracting star targets under high-dynamic conditions
CN106169182B (zh) * 2016-05-25 2019-08-09 西安邮电大学 Method for synthesizing multiple images with different exposures


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112767281A (zh) * 2021-02-02 2021-05-07 北京小米松果电子有限公司 Image ghost removal method and apparatus, electronic device, and storage medium
CN112767281B (zh) * 2021-02-02 2024-04-30 北京小米松果电子有限公司 Image ghost removal method and apparatus, electronic device, and storage medium
CN113012070A (zh) * 2021-03-25 2021-06-22 常州工学院 Fuzzy-control-based method for acquiring image sequences of high-dynamic scenes
CN113012070B (zh) * 2021-03-25 2023-09-26 常州工学院 Fuzzy-control-based method for acquiring image sequences of high-dynamic scenes
CN113313661A (zh) * 2021-05-26 2021-08-27 Oppo广东移动通信有限公司 Image fusion method and apparatus, electronic device, and computer-readable storage medium
CN113705509A (zh) * 2021-09-02 2021-11-26 北京云蝶智学科技有限公司 Method and apparatus for acquiring test question analysis information
CN114240813A (zh) * 2021-12-14 2022-03-25 成都微光集电科技有限公司 Image processing method and apparatus, device, and storage medium
CN115293994A (zh) * 2022-09-30 2022-11-04 腾讯科技(深圳)有限公司 Image processing method and apparatus, computer device, and storage medium
CN115293994B (zh) * 2022-09-30 2022-12-16 腾讯科技(深圳)有限公司 Image processing method and apparatus, computer device, and storage medium

Also Published As

Publication number Publication date
CN108668093A (zh) 2018-10-16
CN108668093B (zh) 2020-08-14

Similar Documents

Publication Publication Date Title
WO2018176925A1 (fr) Method and apparatus for generating an HDR image
CN108335279B (zh) Image fusion and HDR imaging
EP2987135B1 (fr) Reference image selection for motion ghost filtering
US9077913B2 (en) Simulating high dynamic range imaging with virtual long-exposure images
JP6467787B2 (ja) Image processing system, imaging apparatus, image processing method, and program
KR101662846B1 (ko) Apparatus and method for generating a bokeh effect in out-of-focus shooting
US9131201B1 (en) Color correcting virtual long exposures with true long exposures
US8891867B2 (en) Image processing method
US8340417B2 (en) Image processing method and apparatus for correcting skin color
US8929683B2 (en) Techniques for registering and warping image stacks
US8284271B2 (en) Chroma noise reduction for cameras
KR20200023651A (ko) Preview picture blurring method and apparatus, and storage medium
US20150063694A1 (en) Techniques for combining images with varying brightness degrees
JP6720881B2 (ja) Image processing apparatus and image processing method
CN106791451B (zh) Photographing method for an intelligent terminal
CN107564085B (zh) Image warping processing method and apparatus, computing device, and computer storage medium
JP6904788B2 (ja) Image processing apparatus, image processing method, and program
EP3179716B1 (fr) Image processing method, computer storage medium, device and terminal
US9900503B1 (en) Methods to automatically fix flash reflection at capture time
CN107087114B (zh) Photographing method and apparatus
US20230252612A1 (en) De-ghosting and see-through prevention for image fusion
Mangiat et al. Automatic scene relighting for video conferencing
JP6873815B2 (ja) Image processing apparatus, imaging apparatus, image processing method, program, and storage medium
JP6668646B2 (ja) Image processing apparatus, image processing method, and program
CN117710264A (zh) Image dynamic range calibration method and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17902749

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17902749

Country of ref document: EP

Kind code of ref document: A1