WO2023070261A1 - Photographing method and apparatus, and storage medium - Google Patents

Photographing method and apparatus, and storage medium Download PDF

Info

Publication number
WO2023070261A1
Authority
WO
WIPO (PCT)
Prior art keywords
frame
scene
shooting scene
exposure
dark
Prior art date
Application number
PCT/CN2021/126074
Other languages
English (en)
Chinese (zh)
Inventor
李泽飞
郑子翔
胡涛
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to PCT/CN2021/126074 priority Critical patent/WO2023070261A1/fr
Publication of WO2023070261A1 publication Critical patent/WO2023070261A1/fr

Definitions

  • the present application relates to the technical field of photographing, and in particular, to a photographing method, device and storage medium.
  • Stack photography can fuse multiple frames of images collected under different exposure parameters to obtain a high dynamic range image, which makes the final image more detailed and closer to the actual scene, and can also remove noise in the image.
  • To do so, it is necessary to measure the dynamic range of the scene, determine a set of exposure parameter sequences based on that dynamic range, and use the exposure parameter sequence to collect multiple frames of images for synthesizing the high dynamic range image. Therefore, accurately measuring the dynamic range of a scene is a prerequisite for obtaining a high-quality high dynamic range image.
  • the present application provides a photographing method, device and storage medium.
  • In a first aspect, a photographing method is provided, comprising the following steps:
  • in response to a photographing instruction triggered by a user, a target frame is acquired based on the scene information of the shooting scene, wherein the target frame includes a dark frame and/or a bright frame, the exposure of the dark frame is lower than that of normal exposure and is used to determine the brightness distribution of the bright area of the shooting scene, and the exposure of the bright frame is higher than that of normal exposure and is used to determine the brightness distribution of the dark area of the shooting scene;
  • the dynamic range of the shooting scene is determined based on the target frame;
  • An image is captured according to the dynamic range of the shooting scene.
  • In a second aspect, a photographing device is provided. The photographing device includes a processor, a memory, and a computer program stored in the memory for execution by the processor, and when the processor executes the computer program, the following steps are implemented:
  • in response to a photographing instruction triggered by a user, a target frame is acquired based on the scene information of the shooting scene, wherein the target frame includes a dark frame and/or a bright frame, the exposure of the dark frame is lower than that of normal exposure and is used to determine the brightness distribution of the bright area of the shooting scene, and the exposure of the bright frame is higher than that of normal exposure and is used to determine the brightness distribution of the dark area of the shooting scene;
  • the dynamic range of the shooting scene is determined based on the target frame;
  • An image is captured according to the dynamic range of the shooting scene.
  • In a third aspect, a computer-readable storage medium is provided, on which a computer program is stored, and when the computer program is executed, the method of the above-mentioned first aspect is implemented.
  • Fig. 1 is a schematic diagram of the principle of stack photography according to an embodiment of the present application.
  • Fig. 2 is a flowchart of a photographing method according to an embodiment of the present application.
  • Fig. 3 is a schematic diagram of a sequence of image frames collected by stack photography according to an embodiment of the present application.
  • Fig. 4 is a schematic diagram of a sequence of image frames collected by stack photography according to an embodiment of the present application.
  • Figs. 5(a)-5(c) are schematic diagrams of the photographing method of the embodiment of the present application.
  • Fig. 6 is a schematic diagram of a logical structure of a photographing device according to an embodiment of the present application.
  • Stack photography can be used to denoise the image and restore the dynamic range of the image, making the details of the image richer and more realistically restoring the actual scene.
  • the dynamic range of the shooting scene is usually measured before taking stack photos.
  • The dynamic range of the shooting scene refers to the difference in brightness between the brightest part and the darkest part of the shooting scene; the greater the difference, the greater the dynamic range.
  • As shown in Fig. 1, a set of exposure parameter sequences can be determined based on the dynamic range of the shooting scene, for example a sequence ordered from high to low exposure (exposure parameter 1, exposure parameter 2, ... in Fig. 1), and then multiple frames of images (image frame 1, image frame 2, ... in Fig. 1) are acquired based on this exposure parameter sequence and used to synthesize the high dynamic range image.
  • Accurately measuring the dynamic range of the shooting scene is a prerequisite for capturing high-quality high-dynamic-range images.
  • Generally, a frame of the shooting scene is collected with normal exposure, and the dynamic range of the shooting scene is then determined based on the brightness distribution of that image. For example, one straightforward approach is to preset, for different scenes, the expected brightness l0 of the dark region of the image and the expected brightness h0 of the bright region of the image.
  • The dark area can be a pixel point or pixel region that represents the minimum brightness of the image, for example the 10% of pixels with the smallest brightness values in the image, and the bright area can be a pixel point or pixel region that represents the maximum brightness of the image, for example the 10% of pixels with the largest brightness values in the image.
  • the bright area and the dark area can be set based on actual needs, as long as they can represent the maximum brightness and the minimum brightness of the image.
  • the average brightness l of the dark region of the image and the average brightness h of the bright region of the image can be determined.
  • The brightening range can be log2(l0/l) and the reduction range can be log2(h0/h); the dynamic range of the scene can then be determined based on the brightening range log2(l0/l) and the reduction range log2(h0/h).
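  • As an illustrative sketch only (not taken from the patent text; the 10% percentile fraction follows the example above, while the default expected-brightness values standing in for l0 and h0 are assumptions), the brightening range log2(l0/l) and the reduction range log2(h0/h) could be computed from a normally exposed frame as follows:

```python
import numpy as np

def estimate_ranges(gray, l0=40.0, h0=200.0, fraction=0.10):
    """gray: 2-D array of pixel brightness values in [0, 255]."""
    pixels = np.sort(gray.ravel())
    n = max(1, int(pixels.size * fraction))
    l = pixels[:n].mean()        # average brightness of the dark region (smallest 10%)
    h = pixels[-n:].mean()       # average brightness of the bright region (largest 10%)
    brighten = np.log2(l0 / max(l, 1e-6))   # brightening range log2(l0/l)
    reduce_ = np.log2(h0 / max(h, 1e-6))    # reduction range log2(h0/h)
    return l, h, brighten, reduce_
```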
  • However, images collected with normal exposure may be underexposed or overexposed, which makes it impossible to accurately determine the brightness distribution of the bright or dark areas of the image, so the dynamic range of the shooting scene can only be roughly estimated. For example, the image may contain large overexposed or underexposed areas whose brightness values are clipped to 255 or 0, so their actual brightness values cannot be obtained. The brightness distribution of the bright or dark areas can therefore only be roughly estimated, which leads to an inaccurate dynamic range being determined for the scene.
  • the embodiment of the present application provides a photographing method.
  • In this method, the exposure can be appropriately reduced and an additional dark frame can be collected to measure the brightness distribution of the bright area of the scene, or the exposure can be appropriately increased and an additional bright frame can be collected to measure the brightness distribution of the dark area of the scene. In this way, the dynamic range of the shooting scene can be accurately determined, and an image can be shot based on that dynamic range.
  • the photographing method provided in the embodiment of the present application can be used in various photographing devices, for example, it can be used in various types of digital cameras (such as professional cameras, sports cameras, mobile phone cameras, handheld pan-tilt cameras, etc.), and can also be used in thermal imaging camera.
  • The camera can also be mounted on various mobile platforms, such as unmanned aerial vehicles, unmanned ground vehicles, etc.
  • the photographing method provided in the embodiment of the present application is also applicable to various photographing scenarios, for example, a common photographing scene, a zero-delay photographing scene, or a time-lapse photographing scene.
  • the photographing method as shown in Figure 2 may include the following steps:
  • In step S202, in response to a photographing instruction triggered by a user, a target frame is acquired based on the scene information of the shooting scene, wherein the target frame includes a dark frame and/or a bright frame; the exposure of the dark frame is lower than that of normal exposure and is used to determine the brightness distribution of the bright area of the shooting scene, and the exposure of the bright frame is higher than that of normal exposure and is used to determine the brightness distribution of the dark area of the shooting scene.
  • Usually, the exposure parameters at which the captured image reaches the best brightness can be determined based on the ambient brightness of the current shooting scene, a frame of image is collected based on those exposure parameters, and the dynamic range of the shooting scene is determined based on this frame of image. This way of determining an appropriate exposure parameter based on the ambient brightness is referred to herein as normal exposure. With normal exposure, in overly bright or dark scenes the captured image may be overexposed or underexposed, which makes it impossible to accurately determine the dynamic range of the shooting scene.
  • The camera control on the photographing device may be triggered by the user to issue a photographing instruction.
  • After the photographing device receives the photographing instruction, it can collect target frames according to the scene information of the current shooting scene, wherein the target frames include at least one of a dark frame or a bright frame.
  • the dark frame or the bright frame can be one frame or multiple frames of images.
  • the exposure of the dark frame is lower than that of the normal exposure, which can be used to determine the brightness distribution of the bright area of the shooting scene.
  • The exposure of the bright frame is higher than that of normal exposure and is used to determine the brightness distribution of the dark area of the shooting scene.
  • After the brightness distributions of the bright area and the dark area of the shooting scene are determined, the dynamic range of the shooting scene, that is, the overall brightness distribution of the shooting scene, can be obtained. Therefore, in step S204, after the target frame is acquired, the target frame can be used to determine the brightness distribution of the bright area and/or dark area of the shooting scene, so as to obtain the dynamic range of the shooting scene.
  • In step S206, an appropriate exposure parameter may be determined according to the dynamic range of the shooting scene to shoot an image.
  • In some embodiments, an appropriate exposure parameter for the target frame can be determined based on the scene information of the current shooting scene, and the target frame is then acquired according to the determined exposure parameter. For example, taking a dark frame as the target frame: since the dark frame is used to determine the brightness distribution of the bright area of the shooting scene, the dark frame should contain as few overexposed areas as possible, or none at all, to ensure that the brightness distribution of the bright region can be accurately determined from it; an appropriate exposure parameter for collecting the dark frame can therefore be determined in combination with the scene information of the current shooting scene. The situation is similar for bright frames. The accuracy requirement on the exposure parameters of the target frame is not high, since it is only necessary to ensure that the target frame is neither overexposed nor underexposed, and such an exposure parameter can easily be determined from the scene information of the shooting scene.
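  • A minimal heuristic sketch of this idea (the offset rule and its constants are assumptions, not the patent's actual method): the dark-frame exposure is placed a few EV below normal exposure, with a larger offset when the reference frame is more heavily clipped, and symmetrically for the bright frame:

```python
def dark_frame_ev(normal_ev, overexposed_ratio):
    # More clipping in the reference frame -> push the dark frame further below normal exposure.
    offset = min(4.0, 1.0 + 6.0 * overexposed_ratio)
    return normal_ev - offset

def bright_frame_ev(normal_ev, underexposed_ratio):
    offset = min(4.0, 1.0 + 6.0 * underexposed_ratio)
    return normal_ev + offset
```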
  • the scene information of the shooting scene may be various information that can characterize the brightness of the shooting scene.
  • the scene information of the shooting scene may be the scene type of the shooting scene, for example, whether the current shooting scene is sunny or rainy, whether it is a daytime or night scene, whether it is a person or a landscape, and so on.
  • the scene information of the shooting scene may be the ambient brightness of the shooting scene, for example, the ambient brightness of the current shooting scene may be measured by various light metering methods.
  • the scene information of the shooting scene may also be the brightness information of the reference frame, for example, it may be the brightness histogram of the reference frame, the area of the overexposed area in the reference frame, etc., wherein the reference frame is an image frame collected during normal exposure.
  • For some shooting scenes, the reference frame obtained with normal exposure can already be used to accurately measure the dynamic range of the scene, so not all shooting scenes need an additional target frame to determine the dynamic range. Therefore, in some embodiments, whether to acquire the target frame may first be determined based on the brightness distribution of the reference frame. For example, after the reference frame is obtained by normal exposure, the proportion of the overexposed area in the entire reference frame can be counted; if this proportion is greater than a first preset threshold, a dark frame is acquired based on the scene information of the shooting scene. Likewise, the proportion of the underexposed area in the entire reference frame can be counted; if this proportion is greater than a second preset threshold, a bright frame is acquired based on the scene information of the shooting scene. The first preset threshold and the second preset threshold can be set according to actual needs.
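  • A sketch of this threshold check, assuming 8-bit pixel values; the clipping limits (>= 250 counted as overexposed, <= 5 as underexposed) and the 5% default thresholds are illustrative assumptions:

```python
import numpy as np

def frames_to_collect(reference, first_threshold=0.05, second_threshold=0.05):
    total = reference.size
    over_ratio = np.count_nonzero(reference >= 250) / total
    under_ratio = np.count_nonzero(reference <= 5) / total
    need_dark = over_ratio > first_threshold      # first preset threshold
    need_bright = under_ratio > second_threshold  # second preset threshold
    return need_dark, need_bright
```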
  • Whether a shooting scene is prone to overexposure or underexposure can also be predicted in advance. For example, in an extremely dark night scene the collected images are usually prone to underexposure, while when shooting against the sun or toward a light source the collected images are prone to overexposure. Such scenes require an additional target frame so that the dynamic range of the scene can be determined with its help. Therefore, the correspondence between scene types and the type of target frame to be collected can be preset; before collecting the target frame, it can first be determined whether the shooting scene belongs to a preset type of scene, and then whether to collect a target frame and which type of target frame to collect.
  • For example, the shooting scene in the reference frame can be identified; when the current shooting scene is recognized as a first type of scene, a dark frame is acquired based on the scene information of the shooting scene, and/or when the current shooting scene is recognized as a second type of scene, a bright frame is acquired based on the scene information of the shooting scene, wherein images captured in the first type of scene are prone to overexposure and images captured in the second type of scene are prone to underexposure.
  • the first type of scenarios and the second type of scenarios may be preset based on experience.
  • the brightness distribution of the bright area of the shooting scene is determined based on the dark frame or a reference frame collected under normal exposure, and the brightness distribution of the dark area of the shooting scene is determined based on the bright frame or the reference frame. For example, if there are not too many overexposed regions in the reference frame acquired by normal exposure, the brightness distribution of the bright region of the shooting scene can be determined more accurately, and the brightness distribution of the bright region of the shooting scene can be determined based on the reference frame. If there are too many overexposed areas in the reference frame, dark frames can be collected, and the brightness distribution of the bright area of the shooting scene can be determined by using the dark frames. For the brightness distribution of the dark area in the shooting scene, a similar method is adopted, which will not be repeated here.
  • In some embodiments, the average brightness of the bright region of the dark frame may be determined first, and the brightness distribution of the bright region of the shooting scene is then determined based on the difference between that average brightness and a first expected brightness, together with the exposure parameters of the dark frame.
  • the bright area of the dark frame may be a pixel point or a pixel area representing the maximum brightness in the dark frame. For example, it may be the top 10% (this value can be adjusted based on actual needs) pixels with the largest brightness values in the dark frame.
  • the first expected brightness may be an expected brightness value preset by the user, and the first expected brightness may be different in different shooting scenarios.
  • For example, an increase or decrease range can be determined from the difference between the average brightness of the bright region of the dark frame and the first expected brightness, and the brightness distribution of the bright region of the shooting scene can then be determined based on this increase or decrease range and the exposure parameters of the dark frame.
  • Similarly, the average brightness of the dark region of the bright frame may be determined first, and the brightness distribution of the dark area of the shooting scene is then determined based on the difference between that average brightness and a second expected brightness, together with the exposure parameters of the bright frame.
  • the dark area of the bright frame may be a pixel point or pixel area representing the minimum brightness in the bright frame. For example, it may be the first 10% (this value can be adjusted based on actual needs) pixels with the smallest brightness value in the bright frame.
  • the second expected brightness may be an expected brightness value preset by the user, and the second expected brightness may be different in different shooting scenarios.
  • For example, an increase or decrease range can be determined from the difference between the average brightness of the dark region of the bright frame and the second expected brightness, and the brightness distribution of the dark area of the shooting scene can then be determined based on this increase or decrease range and the exposure parameters of the bright frame.
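  • One possible formulation of the above (an assumption about how the measurement and the frame's exposure offset are combined, not the patent's exact formula): express the brightness distribution as the EV offsets, relative to normal exposure, at which the scene's bright and dark regions would reach their expected brightness, and take the span between them as the dynamic range:

```python
import math

def bright_region_ev(h_dark, h0, dark_frame_ev_offset):
    """h_dark: average brightness of the bright region of the dark frame;
    dark_frame_ev_offset: EV of the dark frame relative to normal exposure (negative)."""
    return dark_frame_ev_offset + math.log2(h0 / h_dark)

def dark_region_ev(l_bright, l0, bright_frame_ev_offset):
    """l_bright: average brightness of the dark region of the bright frame;
    bright_frame_ev_offset: EV of the bright frame relative to normal exposure (positive)."""
    return bright_frame_ev_offset + math.log2(l0 / l_bright)

def dynamic_range_ev(bright_ev, dark_ev):
    # Span, in stops, between the exposure that renders the bright region correctly
    # and the exposure that renders the dark region correctly.
    return dark_ev - bright_ev
```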
  • stack photography may be performed based on the dynamic range of the shooting scene.
  • a set of exposure parameter sequences may be determined according to the dynamic range of the shooting scene, hereinafter referred to as the first exposure parameter sequence.
  • Then, a plurality of original image frames for synthesizing the high dynamic range image, hereinafter referred to as first original frames, may be acquired based on the first exposure parameter sequence, and the high dynamic range image may be synthesized using these first original frames.
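  • A sketch of how such a first exposure parameter sequence might be built from the measured dynamic range (the 2 EV bracketing step is an assumption; the patent does not fix a step size):

```python
def first_exposure_sequence(bright_ev, dark_ev, step_ev=2.0):
    """bright_ev/dark_ev: EV offsets relative to normal exposure that render the scene's
    bright and dark regions at their expected brightness (bright_ev <= dark_ev)."""
    evs, ev = [], bright_ev
    while ev < dark_ev:
        evs.append(round(ev, 2))
        ev += step_ev
    evs.append(round(dark_ev, 2))
    return evs   # e.g. first_exposure_sequence(-2.5, 3.0) -> [-2.5, -0.5, 1.5, 3.0]
```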
  • In the above process, the determination of the dynamic range of the shooting scene and the acquisition of the original frames for compositing the high dynamic range image are performed serially.
  • It takes a certain amount of time to determine the first exposure parameter sequence. For example, as shown in Fig. 3, the calculation starts from the reference frame s0 obtained with normal exposure: the exposure parameters of the target frame are determined from the scene information of the scene, which takes a certain amount of time, and these exposure parameters may only take effect after 3 to 4 frames, that is, s4 can be used as the target frame. After the target frame s4 is collected, the dynamic range of the shooting scene is determined based on s4 and the exposure parameter sequence is determined based on that dynamic range; this again takes a certain amount of time, for example the exposure parameter sequence may only take effect after another 3 to 4 frames, that is, only the image frames after s8 can be used as first original frames. As a result, 6 to 8 frames may be wasted (for example, s1-s7 cannot be used to synthesize the high dynamic range image), so a long time passes between the user triggering the shot and the high dynamic range image actually being captured, and the user needs to wait for a long time.
  • Therefore, in some embodiments, a second exposure parameter sequence can be determined based on the scene information of the shooting scene, multiple second original frames can be acquired based on the second exposure parameter sequence, and the high dynamic range image can then be composited from the multiple first original frames and the multiple second original frames. In this way, the high dynamic range image is obtained more quickly.
  • In some embodiments, the step of determining the exposure parameters of the target frame based on the scene information of the shooting scene can be executed in parallel with the step of determining the second exposure parameter sequence based on the scene information of the shooting scene. For example, determining the exposure parameters of the target frame from the shooting scene information costs 3 to 4 frames of time; within this period, the second exposure parameter sequence for compositing the high dynamic range image can be determined based on the same shooting scene information. For example, a set of second exposure parameters can be estimated based on the scene type of the shooting scene, the brightness of the shooting scene, and the brightness distribution of the reference frame, and second original frames can then be acquired; in other words, part of the original frames used to synthesize the high dynamic range image can be acquired by making full use of this waiting time. As shown in Fig. 4, during time period T1 the two processes are executed simultaneously.
  • Similarly, the step of collecting the multiple second original frames and the step of determining the dynamic range of the shooting scene based on the target frame may be executed in parallel. Determining the dynamic range of the shooting scene also takes 3 to 4 frames of time, so this time can be used to collect 3 to 4 second original frames based on the predetermined second exposure parameter sequence. As shown in Fig. 4, within time period T2 the two processes are executed simultaneously.
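  • As a sketch of this parallelism (the worker functions are hypothetical placeholders for the two steps of time period T1; time period T2 can be overlapped in the same way):

```python
from concurrent.futures import ThreadPoolExecutor

def determine_target_frame_exposure(scene_info):
    # placeholder: estimate the dark/bright-frame exposure parameters from scene info
    return {"dark_frame_ev": -2.0}

def estimate_second_sequence(scene_info):
    # placeholder: predict exposure parameter sequence B from scene info
    return [-2.0, 0.0, 2.0]

def time_period_t1(scene_info):
    with ThreadPoolExecutor(max_workers=2) as pool:
        f_target = pool.submit(determine_target_frame_exposure, scene_info)
        f_seq_b = pool.submit(estimate_second_sequence, scene_info)
        return f_target.result(), f_seq_b.result()
```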
  • In some embodiments, the first exposure parameter sequence may also be adjusted based on the second exposure parameter sequence, so that image acquisition is performed based on the adjusted first exposure parameter sequence.
  • Since a high dynamic range image is synthesized from a series of images with different exposures, it is possible to determine which of the second original frames can be used to synthesize the high dynamic range image; images with the same exposure as these frames then do not need to be collected again, and only the exposures still missing for compositing the high dynamic range image need to be selected from the first exposure parameter sequence and acquired accordingly.
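  • A sketch of this "top-up" selection (the 0.5 EV matching tolerance is an assumption): keep only the exposures of the first sequence not already covered by a usable second original frame:

```python
def missing_exposures(first_sequence, captured_second_evs, tolerance_ev=0.5):
    remaining = []
    for ev in first_sequence:
        if not any(abs(ev - b) <= tolerance_ev for b in captured_second_evs):
            remaining.append(ev)
    return remaining

# e.g. missing_exposures([-3, -1, 1, 3], [-1.2, 0.9]) -> [-3, 3]
```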
  • Conventionally, a frame rate limitation is imposed, that is, a certain exposure time interval is required between two adjacent frames of images.
  • However, the time required for data processing differs from frame to frame. For example, only the target frame and the reference frame need data processing (the target frame is used to determine the dynamic range of the shooting scene, and the reference frame is used to determine whether a target frame needs to be collected and to determine the exposure parameters of the target frame), while the first original frames and the second original frames do not need data processing. Therefore, the frame rate limitation can be removed, and during the photographing process the exposure time interval between two adjacent frames can be set based on the time needed to perform data processing on the earlier of the two frames.
  • If a frame of image needs a long time for processing, the next frame can be exposed after a longer interval following that frame; if a frame needs only a short time for processing or no processing at all, the next frame can be collected after a short interval, or continuously with no interval. In this way, the user's waiting time can be shortened.
  • For example, the exposure time interval between a first specified frame and the frame following it is a preset duration, and the exposure time interval between a second specified frame and the frame following it is 0, wherein the first specified frame includes the target frame and/or the reference frame (s0 and s4 in Fig. 4), the second specified frame includes the first original frames and/or the second original frames (the frames other than s0 and s4 in Fig. 4), and the reference frame is an image frame collected during normal exposure.
  • The preset duration can be determined based on the actual time needed to perform data processing on that frame; for example, it can be set to 20 ms.
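  • A sketch of this interval rule (the frame-type names are illustrative; the 20 ms gap is the example value mentioned above): frames that require data processing are followed by a preset gap, while all other frames are exposed back-to-back:

```python
def exposure_gap_ms(frame_type, processing_gap_ms=20):
    needs_processing = frame_type in ("reference", "target")  # first specified frames
    return processing_gap_ms if needs_processing else 0       # second specified frames

# e.g. [exposure_gap_ms(t) for t in ("reference", "raw", "raw", "target", "raw")]
# -> [20, 0, 0, 20, 0]
```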
  • Fig. 5(a) is a schematic diagram of a photographing method in an embodiment of the present application.
  • the camera will collect a preview image in real time, and the preview image is obtained according to normal exposure based on the ambient brightness.
  • When the user triggers the photographing command, the camera counts the proportions of the overexposed area and the underexposed area in the preview image of the previous frame (hereinafter referred to as the reference frame s0). If both proportions are below the preset thresholds, the average brightness l of the dark region of the reference frame s0 and the average brightness h of the bright region of the reference frame s0 can be determined directly.
  • The dynamic range of the shooting scene is then obtained from these values, and the exposure parameter sequence A is determined based on this dynamic range.
  • Since the exposure parameter sequence A determined based on the dynamic range needs a delay of 3 to 4 frames to take effect (for example, it starts to take effect at s4), after capturing the reference frame s0 the camera still captures 3 to 4 frames of images with normal exposure (s1-s3); once the exposure parameter sequence A has taken effect, it starts to collect multiple first original frames (s4-s6) based on sequence A for synthesizing the high dynamic range image.
  • Fig. 5(b) is a schematic diagram of a photographing method in another embodiment of the present application.
  • When the user triggers the photographing command, the camera counts the proportions of the overexposed area and the underexposed area in the preview image of the previous frame (hereinafter referred to as the reference frame s0). Suppose the proportion of the overexposed area exceeds its preset threshold while the proportion of the underexposed area is below its threshold. The camera then determines the brightness distribution of the dark area of the shooting scene based on the reference frame s0 (in a manner similar to the previous embodiment), and at the same time estimates the exposure parameters of a dark frame based on the scene type of the current shooting scene, the brightness histogram of the reference frame s0, and the ambient brightness of the shooting scene.
  • The exposure of the dark frame is lower than that of the reference frame s0, so that the collected dark frame contains no overexposed area or only a very small proportion of one; the brightness distribution of the bright area of the shooting scene can then be determined based on the dark frame, the dynamic range of the shooting scene is obtained, and the exposure parameter sequence A is set based on this dynamic range.
  • the exposure parameters of the dark frame determined based on the shooting scene need to be delayed by 3 to 4 frames to take effect, that is, after the reference frame s0, it takes an interval of 3 to 4 frames to collect the dark frame (s4 in the figure).
  • Likewise, the dynamic range of the shooting scene determined based on the dark frame, and the exposure parameter sequence A determined based on that dynamic range, need a further delay of 3 to 4 frames to take effect; without further measures, the 6 to 8 frames in between would be unavailable, capturing the high dynamic range image would take a long time, and the user would have to wait for a long time.
  • At the same time, the exposure parameter sequence B of the second original frames required for synthesizing the high dynamic range image is also determined in parallel based on the shooting scene.
  • While the dynamic range of the shooting scene is being determined based on the dark frame, the second original frames used to synthesize the high dynamic range image (s6-s7 in the figure) are collected in parallel based on the predicted exposure parameter sequence B. If the predicted exposure parameter sequence B is relatively accurate, the second original frames acquired based on it can basically be used to synthesize the high dynamic range image.
  • In this way, the acquisition of original frames for synthesizing the high dynamic range image and the determination of the exposure parameter sequence A based on the dynamic range of the shooting scene are executed in parallel, shortening the shooting time.
  • After the exposure parameter sequence A has been determined based on the dynamic range of the shooting scene, it can first be determined which of the second original frames shot based on the exposure parameter sequence B are usable; based on the second original frames that have been shot, the exposure parameters of the original frames still to be photographed are then selected from the exposure parameter sequence A, and the first original frames (s8-s11 in the figure) are collected.
  • Fig. 5(c) is a schematic diagram of a photographing method in another embodiment of the present application.
  • When the user triggers the photographing command, the camera counts the proportions of the overexposed area and the underexposed area in the preview image of the previous frame (hereinafter referred to as the reference frame s0); assume that both the proportion of the overexposed area and the proportion of the underexposed area are greater than their preset thresholds.
  • the camera will estimate the exposure parameters of the dark frame and the exposure parameters of the bright frame based on the scene type of the current shooting scene, the brightness histogram of the reference frame s0, and the ambient brightness of the shooting scene.
  • The exposure of the bright frame is higher than that of the reference frame s0, so that the collected bright frame contains no underexposed area or only a very small proportion of one; the brightness distribution of the dark area of the shooting scene can then be determined based on the bright frame, and the dark frame works similarly for the bright area. Furthermore, the dynamic range of the shooting scene can be obtained based on the dark frame and the bright frame, and the exposure parameter sequence A is set based on this dynamic range.
  • At the same time, the exposure parameter sequence B of the second original frames required for synthesizing the high dynamic range image is also predicted in parallel based on the shooting scene.
  • While the dynamic range of the shooting scene is being determined based on the dark frame and the bright frame, and the exposure parameter sequence A is being determined based on that dynamic range, the second original frames for synthesizing the high dynamic range image are also collected in parallel based on the predicted exposure parameter sequence B.
  • If the predicted exposure parameter sequence B is relatively accurate, the second original frames (s6-s8) collected based on it can basically be used to synthesize the high dynamic range image. Therefore, the acquisition of original frames for synthesizing the high dynamic range image and the determination of the exposure parameter sequence A based on the dynamic range of the shooting scene are executed in parallel, shortening the shooting time.
  • After the exposure parameter sequence A has been determined based on the dynamic range of the shooting scene, it can be determined which of the second original frames (s6-s8) shot based on the predicted exposure parameter sequence B are usable; based on the captured second original frames (s6-s8), the exposure parameters of the first original frames (s9-s11) still to be photographed are selected from the exposure parameter sequence A, and the missing original frames are supplemented.
  • In addition, traditional exposure sequences are limited by the frame rate, that is, a minimum time interval is imposed between two adjacent frames.
  • The purpose of this interval is to leave time to process data after a frame is collected and to issue the parameters for the subsequent photo sequence.
  • Taking Fig. 5(c) as an example, it can be seen from the figure that only s0, s4 and s5 need data processing, while the remaining frames do not.
  • Therefore, for the frames that do not need data processing, the frame rate limit can be removed and the images can be exposed continuously.
  • In that case the stack acquisition time is limited only by the exposure time, so the brighter the scene, the shorter the exposure time and the shorter the total time spent taking the photo.
  • the embodiment of the present application also provides a photographing device.
  • the processor 61 implements the following steps when executing the computer program:
  • in response to a photographing instruction triggered by a user, a target frame is acquired based on the scene information of the shooting scene, wherein the target frame includes a dark frame and/or a bright frame, the exposure of the dark frame is lower than that of normal exposure and is used to determine the brightness distribution of the bright area of the shooting scene, and the exposure of the bright frame is higher than that of normal exposure and is used to determine the brightness distribution of the dark area of the shooting scene;
  • the dynamic range of the shooting scene is determined based on the target frame;
  • An image is captured according to the dynamic range of the shooting scene.
  • In some embodiments, when the processor is used to acquire the target frame based on the scene information of the shooting scene, it is specifically used for:
  • determining the exposure parameters of the target frame based on the scene information of the shooting scene, so that the target frame is acquired based on the exposure parameters.
  • In some embodiments, the target frame is a dark frame, and when the processor is used to acquire the target frame based on the scene information of the shooting scene, it is specifically used for: if the proportion of the overexposed area in the reference frame is greater than a first preset threshold, acquiring the dark frame based on the scene information of the shooting scene, wherein the reference frame is an image frame collected during normal exposure; and/or
  • the target frame is a bright frame, and acquiring the target frame based on the scene information of the shooting scene includes: if the proportion of the underexposed area in the reference frame is greater than a second preset threshold, acquiring the bright frame based on the scene information of the shooting scene.
  • In some embodiments, when the processor is used to acquire the target frame based on the scene information of the shooting scene, it is specifically used for:
  • when the current shooting scene is recognized as a first type of scene, acquiring the dark frame based on the scene information of the shooting scene; and/or
  • when the current shooting scene is recognized as a second type of scene, acquiring the bright frame based on the scene information of the shooting scene, wherein images collected in the first type of scene are prone to overexposure and images collected in the second type of scene are prone to underexposure.
  • The brightness distribution of the bright area of the shooting scene is determined based on the dark frame or the reference frame, and the brightness distribution of the dark area of the shooting scene is determined based on the bright frame or the reference frame, wherein the reference frame is an image frame collected during normal exposure.
  • Determining the brightness distribution of the bright region of the shooting scene based on the dark frame includes: determining the average brightness of the bright region of the dark frame, and determining the brightness distribution of the bright region of the shooting scene based on the difference between that average brightness and a first expected brightness, together with the exposure parameters of the dark frame.
  • Determining the brightness distribution of the dark area of the shooting scene based on the bright frame includes: determining the average brightness of the dark region of the bright frame, and determining the brightness distribution of the dark area of the shooting scene based on the difference between that average brightness and a second expected brightness, together with the exposure parameters of the bright frame.
  • In some embodiments, when the processor is configured to capture an image according to the dynamic range of the shooting scene, it is specifically configured to: determine a first exposure parameter sequence according to the dynamic range of the shooting scene, and acquire a plurality of first original frames for synthesizing a high dynamic range image based on the first exposure parameter sequence.
  • In some embodiments, the processor is further configured to: determine a second exposure parameter sequence based on the scene information of the shooting scene, acquire a plurality of second original frames based on the second exposure parameter sequence, and synthesize the high dynamic range image by using the plurality of first original frames and the plurality of second original frames.
  • the step of collecting the plurality of second original frames is performed in parallel with the step of determining the dynamic range of the shooting scene based on the target frame.
  • the step of determining the exposure parameter of the target frame based on the scene information of the shooting scene is performed in parallel with the step of determining the second exposure parameter sequence based on the scene information of the shooting scene.
  • the processor before acquiring a plurality of first original frames based on the first exposure parameter sequence, the processor is further configured to:
  • the first exposure parameter sequence is adjusted based on the second exposure parameter sequence, so as to perform image acquisition based on the adjusted first exposure sequence.
  • the exposure time interval between two adjacent frames is set based on the time length for performing data processing on the previous frame of the two adjacent frames.
  • In some embodiments, the exposure time interval between the first specified frame and the next frame after the first specified frame is a preset duration, and the exposure time interval between the second specified frame and the next frame after the second specified frame is 0, wherein the first specified frame includes the target frame and/or the reference frame, the second specified frame includes the first original frame and/or the second original frame, and the reference frame is the image frame collected during normal exposure.
  • the scene information includes one or more of the following: the scene type of the shooting scene, the ambient brightness of the shooting scene, the brightness histogram of the reference frame, and the area of the overexposed area in the reference frame, wherein the The reference frame is the image frame collected during normal exposure.
  • the embodiments of this specification further provide a computer storage medium, where a program is stored in the storage medium, and when the program is executed by a processor, the method in any of the foregoing embodiments is implemented.
  • Embodiments of the present description may take the form of a computer program product embodied on one or more storage media (including but not limited to magnetic disk storage, CD-ROM, optical storage, etc.) having program code embodied therein.
  • Computer usable storage media includes both volatile and non-permanent, removable and non-removable media, and may be implemented by any method or technology for information storage.
  • Information may be computer readable instructions, data structures, modules of a program, or other data.
  • Examples of storage media for computers include, but are not limited to: phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Disc (DVD) or other optical storage, Magnetic tape cartridge, tape magnetic disk storage or other magnetic storage device or any other non-transmission medium that can be used to store information that can be accessed by a computing device.
  • As for the device embodiment, since it basically corresponds to the method embodiment, reference may be made to the description of the method embodiment for related parts.
  • The device embodiments described above are only illustrative; the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed over multiple network elements. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this embodiment, which can be understood and implemented by those skilled in the art without creative effort.

Landscapes

  • Studio Devices (AREA)

Abstract

A photographing method and apparatus, and a storage medium are provided. The method comprises the following steps: in response to a photographing instruction triggered by a user, acquiring a target frame on the basis of scene information of a shooting scene, wherein the target frame comprises a dark frame and/or a bright frame, the exposure of the dark frame is lower than that of normal exposure and is used to determine the brightness distribution of a bright area of the shooting scene, and the exposure of the bright frame is higher than that of normal exposure and is used to determine the brightness distribution of a dark area of the shooting scene (S202); determining the dynamic range of the shooting scene on the basis of the target frame (S204); and capturing an image according to the dynamic range of the shooting scene (S206). By means of the method, the dynamic range of a shooting scene can be measured accurately.
PCT/CN2021/126074 2021-10-25 2021-10-25 Photographing method and apparatus, and storage medium WO2023070261A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/126074 WO2023070261A1 (fr) 2021-10-25 2021-10-25 Photographing method and apparatus, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/126074 WO2023070261A1 (fr) 2021-10-25 2021-10-25 Photographing method and apparatus, and storage medium

Publications (1)

Publication Number Publication Date
WO2023070261A1 true WO2023070261A1 (fr) 2023-05-04

Family

ID=86159916

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/126074 WO2023070261A1 (fr) 2021-10-25 2021-10-25 Photographing method and apparatus, and storage medium

Country Status (1)

Country Link
WO (1) WO2023070261A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103581565A (zh) * 2012-07-20 2014-02-12 佳能株式会社 Image pickup apparatus, control method of image pickup apparatus, and electronic device
US20160309071A1 (en) * 2015-04-14 2016-10-20 Fotonation Limited Image acquisition method and apparatus
US20160352995A1 (en) * 2015-05-26 2016-12-01 SK Hynix Inc. Apparatus for generating image and method thereof
CN107205120A (zh) * 2017-06-30 2017-09-26 维沃移动通信有限公司 Image processing method and mobile terminal
CN107707827A (zh) * 2017-11-14 2018-02-16 维沃移动通信有限公司 High-dynamic-range image shooting method and mobile terminal


Similar Documents

Publication Publication Date Title
WO2020038109A1 (fr) Procédé et dispositif de photographie, terminal et support de stockage lisible par ordinateur
CN108495050B (zh) 拍照方法、装置、终端及计算机可读存储介质
CN110445988B (zh) 图像处理方法、装置、存储介质及电子设备
US20150264271A1 (en) Image Blurring Method and Apparatus, and Electronic Devices
US8305487B2 (en) Method and apparatus for controlling multiple exposures
CN110198417A (zh) 图像处理方法、装置、存储介质及电子设备
JP2017514334A (ja) スローシャッタの撮像方法及び撮像装置
CN110246101B (zh) 图像处理方法和装置
CN111064898B (zh) 图像拍摄方法及装置、设备、存储介质
JP2003348438A (ja) 画像撮影方法および装置、画像選択方法および装置並びにプログラム
CN105812670B (zh) 一种拍照的方法及终端
US10999526B2 (en) Image acquisition method and apparatus
CN106303243A (zh) 一种拍照方法、装置及终端
WO2022142177A1 (fr) Procédé et appareil de génération d'image hdr, dispositif électronique et support de stockage lisible
WO2016082347A1 (fr) Procédé et dispositif de compensation de luminosité, et support d'enregistrement informatique
US20160094825A1 (en) Method for selecting metering mode and image capturing device thereof
JP6467190B2 (ja) 露出制御装置及びその制御方法、撮像装置、プログラム、記憶媒体
TW201349854A (zh) 影像擷取裝置及其影像合成方法
CN111586308A (zh) 图像处理方法、装置及电子设备
CN1467991A (zh) 摄像装置和摄像方法
US20140168505A1 (en) Digital imaging exposure metering system
CN115278069A (zh) 图像处理方法及装置、计算机可读存储介质、终端
WO2018119590A1 (fr) Procédé et dispositif de mesurage de lumière, procédé et dispositif d'exposition, et véhicule aérien sans pilote
JP2004289383A (ja) 撮像装置、画像データ生成方法、画像データ処理装置および画像データ処理プログラム
WO2017088314A1 (fr) Procédé de prise de vues, terminal, et support de stockage informatique

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE