CN109286758B - High dynamic range image generation method, mobile terminal and storage medium - Google Patents


Info

Publication number
CN109286758B
Authority
CN
China
Prior art keywords
exposure time
moving object
time set
image
mobile terminal
Prior art date
Legal status
Active
Application number
CN201811195806.5A
Other languages
Chinese (zh)
Other versions
CN109286758A (en)
Inventor
Liu Yinhua (刘银华)
Sun Jianbo (孙剑波)
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201811195806.5A priority Critical patent/CN109286758B/en
Publication of CN109286758A publication Critical patent/CN109286758A/en
Application granted granted Critical
Publication of CN109286758B publication Critical patent/CN109286758B/en


Classifications

    • H: Electricity
    • H04: Electric communication technique
    • H04N: Pictorial communication, e.g. television
    • H04N23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N5/265: Mixing (studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects)

Abstract

The application belongs to the technical field of image processing and provides a high dynamic range image generation method, a mobile terminal, and a computer-readable storage medium. The method comprises the following steps: determining the movement speed of a moving object in a preview picture acquired by a camera of the mobile terminal; obtaining an exposure time set according to the movement speed of the moving object, the exposure time set comprising at least two exposure times; controlling the camera of the mobile terminal to acquire images based on each exposure time in the exposure time set, obtaining an image corresponding to each exposure time; and synthesizing the images corresponding to the exposure times to obtain the high dynamic range image.

Description

High dynamic range image generation method, mobile terminal and storage medium
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to a method for generating a high dynamic range image, a mobile terminal, and a computer-readable storage medium.
Background
Compared with an ordinary image, a High Dynamic Range (HDR) image provides greater dynamic range and more image detail. It is synthesized from Low Dynamic Range (LDR) images captured at different exposure times, using the best-detailed LDR image at each exposure time, and therefore better reflects the visual experience of a person in the real environment.
At present, a high dynamic range image obtained by combining several images shot at different exposure times can capture more detail; however, the contours of local objects often come out blurred, so currently photographed high dynamic range images are often of poor quality.
Disclosure of Invention
In view of this, embodiments of the present application provide a method for generating a high dynamic range image, a mobile terminal, and a computer-readable storage medium, so as to address the problems that local object contours in current high dynamic range images are blurred and image quality is poor.
A first aspect of an embodiment of the present application provides a method for generating a high dynamic range image, including:
determining the motion speed of a moving object in a preview picture acquired by a camera of the mobile terminal;
obtaining an exposure time set according to the movement speed of the moving object, wherein the exposure time set comprises at least two exposure times;
controlling a camera of the mobile terminal to acquire an image based on each exposure time in the exposure time set, and obtaining an image corresponding to each exposure time in the exposure time set;
and synthesizing the images respectively corresponding to each exposure time in the exposure time set to obtain the high dynamic range image.
A second aspect of an embodiment of the present application provides a mobile terminal, including:
a movement speed determining unit, configured to determine the movement speed of a moving object in a preview picture acquired by a camera of the mobile terminal;
an exposure time obtaining unit, configured to obtain an exposure time set according to a motion speed of the moving object, where the exposure time set includes at least two exposure times;
the image acquisition unit is used for controlling a camera of the mobile terminal to acquire an image based on each exposure time in the exposure time set and obtain an image corresponding to each exposure time in the exposure time set;
and the image synthesis unit is used for synthesizing the images respectively corresponding to each exposure time in the exposure time set to obtain the high dynamic range image.
A third aspect of an embodiment of the present application provides a mobile terminal, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the method provided in the first aspect of the embodiment of the present application when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by one or more processors, performs the steps of the method provided by the first aspect of embodiments of the present application.
A fifth aspect of embodiments of the present application provides a computer program product comprising a computer program that, when executed by one or more processors, performs the steps of the method provided by the first aspect of embodiments of the present application.
The embodiment of the application provides a method for generating a high dynamic range image that determines a plurality of exposure times from the movement speed of a moving object in a preview picture acquired by a camera of the mobile terminal. The camera is then controlled to acquire an image at each exposure time in the exposure time set, and the images corresponding to the exposure times are synthesized into a high dynamic range image. Because the exposure times are determined from the movement speed, a sharper image of the region where the moving object is located can be obtained through the choice of exposure times, while the multiple exposure times still yield images rich in detail; the synthesized high dynamic range image therefore avoids unclear local object contours.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without inventive effort.
Fig. 1 is a schematic flow chart of an implementation of a method for generating a high dynamic range image according to an embodiment of the present application;
fig. 2 is a schematic flow chart of an implementation of another method for generating a high dynamic range image according to an embodiment of the present application;
fig. 3 is a schematic flow chart of an implementation of another method for generating a high dynamic range image according to an embodiment of the present application;
fig. 4 is a schematic block diagram of a mobile terminal according to an embodiment of the present application;
fig. 5 is a schematic block diagram of another mobile terminal provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Fig. 1 is a schematic flow chart of an implementation of a method for generating a high dynamic range image according to an embodiment of the present application, and as shown in the figure, the method may include the following steps:
and step S101, determining the motion speed of a moving object in a preview picture acquired by a camera of the mobile terminal.
In the embodiment of the application, the preview pictures acquired by the camera of the mobile terminal arrive frame by frame. Therefore, once it is determined that the mobile terminal needs to acquire a high dynamic range image, the movement speed of a moving object in the currently acquired preview picture can be determined in real time while in high-dynamic-range mode, or it can be determined after a photographing instruction is received.
When taking a picture, a user typically holds the mobile terminal still and then presses a photographing key or button, so the terminal is usually in a relatively static state while photographing. If no moving object is present, the difference between two consecutive preview pictures should be small; if a moving object is present, the faster it moves, the larger that difference. The moving object and its movement speed can therefore be determined from the currently acquired preview picture and one or more consecutive preview pictures preceding it. For example, the currently acquired preview picture and the previous one are obtained, their difference image is computed, the region where the moving object is located is determined from the difference image, and the movement speed is determined from the distribution range of differing pixels in the difference image together with the time difference between the acquisition of the two pictures. The movement speed need not be a specific value and may instead be a movement-speed level; for example, a region can be delimited from the distribution of differing pixels, and the level determined from the region's area divided by the time difference.
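As an illustrative sketch of the frame-differencing idea above (not code from the patent), the following estimates a movement-speed level from two consecutive grayscale preview frames; the function name, difference threshold, and speed bins are assumptions chosen for the example.

```python
import numpy as np

def estimate_speed_level(prev_frame, curr_frame, dt, diff_thresh=25,
                         rate_bins=(0.3, 1.5, 4.5)):
    """Rough movement-speed level from two consecutive grayscale preview frames.

    prev_frame, curr_frame: 2-D uint8 arrays; dt: inter-frame time in seconds.
    Returns a level from 0 (static) to 3 (fast). Thresholds are illustrative.
    """
    # absolute difference image between the two preview frames
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    moving = diff > diff_thresh          # pixels that changed noticeably
    rate = moving.mean() / dt            # changed-area fraction per second
    return int(sum(rate > b for b in rate_bins))
```

Two identical frames give level 0; a frame where half the pixels change within one ~33 ms preview interval lands in the fastest bin.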
Step S102, obtaining an exposure time set according to the movement speed of the moving object, wherein the exposure time set comprises at least two exposure times.
In the embodiment of the present application, exposure time sets corresponding to different movement speeds or movement-speed levels may be preset; once the speed or level is determined, the corresponding set is looked up in the database. When the movement speed is high, the set includes at least one short exposure time, because a short exposure generally yields a relatively sharp image of a moving object, capturing more of its detail and reducing blur in the moving region of the resulting image. In addition, to capture more detail overall, several different exposure times can be set, and the images corresponding to the different exposure times are finally synthesized into a high dynamic range image.
And step S103, controlling a camera of the mobile terminal to acquire an image based on each exposure time in the exposure time set, and obtaining an image corresponding to each exposure time in the exposure time set.
And step S104, synthesizing the images respectively corresponding to each exposure time in the exposure time set to obtain a high dynamic range image.
In the embodiment of the present application, even when several short-exposure images are included, factors such as hand shake during shooting may still blur the region where the moving object is located in some of the captured images, so the final synthesized high dynamic range image may show blurring or ghosting at the moving object's boundary. Therefore, before synthesis, images in which that region is blurred can be removed and only the remaining images synthesized. When a moving object is present, images taken at normal and long exposure times are very likely to be blurred anyway, and they are used not for detail in the moving region but for detail in the other regions. Accordingly, among the images whose exposure time is below a time threshold (the exposure time used when acquiring the preview picture may be preset as this threshold), any image whose moving-region blur exceeds a blur threshold is deleted; that is, blurred short-exposure images are removed, and the remaining images are synthesized into the high dynamic range image.
As another embodiment of the present application, the synthesizing images respectively corresponding to each exposure time in the exposure time set to obtain the high dynamic range image includes:
judging whether, among the images corresponding to the minimum exposure time and the intermediate exposure times in the exposure time set, there is an image in which the blur of the region where the moving object is located exceeds a threshold;
and if so, discarding each image whose moving-region blur exceeds the threshold, and synthesizing the remaining images to obtain the high dynamic range image.
In this embodiment, the minimum exposure time is the smallest exposure time in the exposure time set, and the intermediate exposure times are the exposure times in the set other than the minimum and the maximum. It should be noted that, to preserve detail of the moving object, either every short-exposure image whose moving-region blur exceeds the threshold may be discarded before synthesizing the rest, or only a preset number of the shortest-exposure images may be screened in this way; for example, among the two images with the smallest exposure times, those whose moving-region blur exceeds the threshold are removed.
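A minimal sketch of the discard-and-merge selection described above; the function name, parameters, and thresholds are hypothetical, and the actual HDR fusion of the kept frames is omitted.

```python
def select_frames_for_merge(frames, exposures, blur_scores, t_thresh, blur_thresh):
    """Drop short-exposure frames whose moving region is too blurry.

    frames: list of images; exposures: matching exposure times in seconds;
    blur_scores: blur of the moving region per frame (higher = blurrier).
    A frame is discarded only if its exposure is below t_thresh AND its
    moving-region blur exceeds blur_thresh; normal/long exposures are kept
    for detail in the static regions. Illustrative only.
    """
    return [f for f, e, b in zip(frames, exposures, blur_scores)
            if not (e < t_thresh and b > blur_thresh)]
```

The kept frames would then be fed to an HDR fusion routine.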
According to the method and the device, a plurality of exposure times are determined from the movement speed, so a sharp image of the region where the moving object is located can be obtained through the choice of exposure times, and images rich in detail are obtained across the multiple exposure times; the synthesized high dynamic range image thus avoids unclear local object contours.
Fig. 2 is a schematic flow chart of an implementation of another method for generating a high dynamic range image according to an embodiment of the present application, and as shown in the drawing, the method describes how to determine a moving speed of a moving object in a preview picture acquired by a camera of a mobile terminal based on the embodiment shown in fig. 1, and the method may include the following steps:
step S201, based on the preview picture currently acquired by the camera and N continuous preview pictures before the preview picture currently acquired, determining a moving object in the preview picture currently acquired by the camera.
In the embodiment of the application, to compute the moving object in the currently acquired preview picture, two preview pictures are selected from the N consecutive preview pictures preceding it (for example, the two closest in acquisition time to the current picture). The difference between each selected picture and the current picture is computed, yielding two difference images; each difference image is binarized; and the intersection of the target regions of the two binarized images gives the moving object of the current frame.
For example, let the current preview frame be H_n, and select the preview frames H_(n-1) and H_(n-2), all three being grayscale images. Compute the difference between H_(n-1) and H_n and the difference between H_(n-2) and H_n, obtaining T_(n-(n-1)) and T_(n-(n-2)). Binarize T_(n-(n-1)) and T_(n-(n-2)) to find the difference region of H_(n-1) and H_n (the black or white region, also referred to as the target region) and the difference region of H_(n-2) and H_n, then compute the intersection of the two binarized images' difference regions (the region common to both) to obtain the region where the moving object of the current frame is located; this region in the currently acquired preview picture is the moving object. Once the moving object is determined, its movement speed can be determined.
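The two-difference-image intersection can be sketched as follows; the binarization threshold is an assumption.

```python
import numpy as np

def moving_region(h_n, h_n1, h_n2, thresh=25):
    """Region of the moving object in the current frame H_n.

    h_n, h_n1, h_n2: grayscale frames H_n, H_(n-1), H_(n-2) as uint8 arrays.
    Binarizes the two difference images and intersects them, as described
    above. Returns a boolean mask of the moving region.
    """
    # binarized difference images T_(n-(n-1)) and T_(n-(n-2))
    t1 = np.abs(h_n.astype(np.int16) - h_n1.astype(np.int16)) > thresh
    t2 = np.abs(h_n.astype(np.int16) - h_n2.astype(np.int16)) > thresh
    return t1 & t2   # intersection: True where H_n differs from both frames
```

Taking the intersection suppresses the "ghost" of the object's old positions, which appear in only one of the two difference images.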
It should be noted that the movement speed in the embodiment of the present application is not the speed of the actual object, but the speed at which the object's image changes position within the preview picture. In general, at the same exposure time, the more blurred an object appears in the preview picture, the higher its movement-speed level, and the less sharp the moving object's boundary. Therefore, the blur of the region where the moving object is located in the currently acquired preview picture can be obtained, and the movement-speed level determined from that blur.
Step S202, calculating a first gradient change value of the gray value of each pixel point on the contour line of the region where the moving object is located in the first direction, and calculating a second gradient change value of the gray value of each pixel point on the contour line of the region where the moving object is located in the second direction, wherein the first direction is perpendicular to the second direction.
Step S203, calculating a data characteristic value of the first and second gradient change values, and obtaining the blur of the region where the moving object is located from the data characteristic value.
In this embodiment of the application, once the moving object is determined, its contour line in the preview picture is also determined. A first gradient change value of the gray value in the horizontal direction can then be computed for each pixel on the contour line of the region where the moving object is located (one first gradient change value per pixel), and likewise a second gradient change value in the vertical direction (one second gradient change value per pixel). In practice, one direction, two perpendicular directions, or multiple directions may be used. This yields twice as many gradient change values as there are contour pixels. A data characteristic value of these gradient change values (for example, their average) is then computed: the larger the characteristic value (representing larger gradient change), the sharper the boundary and the lower the blur. That is, the data characteristic value is inversely related to the blur of the region where the moving object is located. The blur may be a specific value or a level.
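Steps S202 and S203 can be sketched as below, using the mean absolute gradient along the contour as the data characteristic value and its reciprocal as the blur; the exact formula and the `contour_blur` name are assumptions, not the patent's definition.

```python
import numpy as np

def contour_blur(gray, contour_px):
    """Blur estimate of a moving region from gradients along its contour.

    gray: 2-D float array (grayscale preview frame);
    contour_px: iterable of (row, col) contour pixel coordinates.
    Computes |horizontal| and |vertical| gray-level gradients at each
    contour pixel; a sharper boundary gives a larger mean gradient, so
    the blur is taken as the reciprocal of that mean.
    """
    gy, gx = np.gradient(gray.astype(np.float64))  # row (vertical), col (horizontal)
    grads = [abs(gx[r, c]) for r, c in contour_px] + \
            [abs(gy[r, c]) for r, c in contour_px]
    mean_grad = float(np.mean(grads))
    return 1.0 / (mean_grad + 1e-6)   # blur inversely related to gradient
```

A hard step edge scores lower blur than a smooth ramp across the same boundary.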
And step S204, determining the movement speed of the moving object according to the blur of the region where the moving object is located in the currently acquired preview picture and the exposure time of that preview picture.
In the embodiment of the application, the movement speed (or speed level) of the moving object is related both to the blur of the region where it is located and to the exposure time at which the current preview picture was acquired, so both quantities are needed to determine it. The correspondence among blur, exposure time, and movement speed can be preset; once the blur and exposure time are known, the movement speed or speed level is obtained by table lookup.
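A sketch of the table lookup in step S204; the exposure-time banding, table contents, and function name are all assumptions for illustration.

```python
def lookup_speed_level(blur_level, exposure_time, table):
    """Movement-speed level via a preset (blur level, exposure band) table.

    table: dict mapping (blur_level, exposure_band) -> speed level, preset
    as described in step S204. The banding of exposure times into three
    ranges below is an illustrative assumption.
    """
    if exposure_time < 1 / 250:
        band = 0          # short exposure
    elif exposure_time < 1 / 60:
        band = 1          # medium exposure
    else:
        band = 2          # long exposure
    return table.get((blur_level, band), 0)  # default: treat as static
```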
Fig. 3 is a schematic flow chart of an implementation of another method for generating a high dynamic range image according to an embodiment of the present application, and as shown in the figure, the method describes how to obtain an exposure time set according to a motion speed of a moving object on the basis of the embodiment shown in fig. 1, and the method may include the following steps:
step S301, determining the minimum exposure time in the exposure time set according to the motion speed of the moving object.
In this embodiment, the minimum exposure time in the set of exposure times may be determined according to the motion speed of the moving object based on a rule that the motion speed and the minimum exposure time are inversely proportional. For example, the greater the moving speed of the moving object, the smaller the corresponding minimum exposure time, and the smaller the moving speed of the moving object, the larger the corresponding minimum exposure time.
Step S302, taking the exposure time at which the camera acquires the current preview picture as the maximum exposure time in the exposure time set, where the set may contain one or more instances of the maximum exposure time.
In this embodiment of the application, a maximum exposure time may be set in advance, and the maximum exposure time may be an exposure time when the camera acquires a current preview screen, or may be a time longer than the exposure time when the camera acquires the current preview screen.
Step S303, generating an intermediate exposure time in the exposure time set according to the minimum exposure time and the maximum exposure time, where the intermediate exposure time is an exposure time in the exposure time set other than the minimum exposure time and the maximum exposure time.
In the embodiment of the present application, after the minimum and maximum exposure times are determined, the intermediate exposure times can be generated according to a preset number of exposure times or a preset exposure-time step size. It should be noted that duplicate exposure times are allowed in the set; for example, there may be two instances of the maximum exposure time, or several instances of a given intermediate exposure time.
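Steps S301 through S303 can be sketched as follows; the speed-level-to-minimum-exposure table and the geometric spacing of the intermediate times are illustrative assumptions (the patent only requires an inverse relation and a preset count or step size).

```python
import numpy as np

# hypothetical lookup: higher speed level -> shorter minimum exposure (S301)
MIN_EXPOSURE_BY_LEVEL = {0: 1 / 60, 1: 1 / 125, 2: 1 / 250, 3: 1 / 500}

def build_exposure_set(speed_level, preview_exposure, n_times=4):
    """Exposure-time set per steps S301-S303 (illustrative).

    speed_level: movement-speed level (0..3 here);
    preview_exposure: exposure time of the current preview picture, used
    as the maximum exposure time (S302). Intermediate times (S303) are
    geometrically spaced between the minimum and maximum.
    """
    t_min = min(MIN_EXPOSURE_BY_LEVEL[speed_level], preview_exposure)
    t_max = preview_exposure
    # geometric spacing gives evenly spread exposure "stops"
    return [float(t) for t in np.geomspace(t_min, t_max, num=n_times)]
```

For a fast-moving subject (level 3) previewed at 1/30 s, this yields four exposures from 1/500 s up to 1/30 s.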
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 4 is a schematic block diagram of a mobile terminal according to an embodiment of the present application, and only a portion related to the embodiment of the present application is shown for convenience of description.
The mobile terminal 4 may be a software unit, a hardware unit, or a combined software and hardware unit built into a mobile terminal such as a mobile phone, tablet computer, or notebook computer, or may be integrated into such a mobile terminal as an independent component.
The mobile terminal 4 includes:
a moving speed determining unit 41, configured to determine a moving speed of a moving object in a preview picture acquired by a camera of the mobile terminal;
an exposure time obtaining unit 42, configured to obtain an exposure time set according to a motion speed of the moving object, where the exposure time set includes at least two exposure times;
an image acquisition unit 43, configured to control a camera of the mobile terminal to acquire an image based on each exposure time in the exposure time set, and obtain an image corresponding to each exposure time in the exposure time set;
and an image synthesizing unit 44 configured to synthesize images respectively corresponding to each exposure time in the exposure time set to obtain a high dynamic range image.
Optionally, the movement speed determination unit 41 includes:
a moving object determining module 411, configured to determine, based on the preview picture currently acquired by the camera and N consecutive preview pictures before the preview picture currently acquired, a moving object in the preview picture currently acquired by the camera;
a blur obtaining module 412, configured to obtain the blur of the region where the moving object is located in the preview picture currently acquired by the camera;
and a movement speed determining module 413, configured to determine the movement speed of the moving object according to the blur of the region where the moving object is located in the currently acquired preview picture and the exposure time of that preview picture.
Optionally, the blur degree obtaining module 412 is further configured to:
calculate a first gradient change value, in a first direction, of the gray values of the pixel points on the contour line of the region where the moving object is located, and calculate a second gradient change value of those gray values in a second direction, where the first direction is perpendicular to the second direction;
and calculate the data characteristic value of the first gradient change value and the second gradient change value, and obtain the blur degree of the region where the moving object is located according to the data characteristic value.
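A minimal sketch of this gradient-based blur degree, assuming central-difference gradients and a reciprocal mapping from mean gradient magnitude to blur degree (both our own choices; the patent only names "gradient change values" and a "data characteristic value"):

```python
import numpy as np

def blur_degree(gray, contour_pts):
    """Blur degree of a region from gray-level gradients along its contour.

    Sharp edges have large gray-value gradients; motion blur spreads an edge
    and lowers them, so a small mean gradient magnitude maps to a high blur
    degree. `contour_pts` is an iterable of (row, col) pixel coordinates.
    """
    g = gray.astype(np.float64)
    # Central-difference gradients in two perpendicular directions.
    grad_rows, grad_cols = np.gradient(g)
    mags = [np.hypot(grad_cols[r, c], grad_rows[r, c]) for r, c in contour_pts]
    mean_grad = float(np.mean(mags))
    # "Data characteristic value" -> blur degree: inversely related to gradient.
    return 1.0 / (1.0 + mean_grad)
```

A hard step edge should therefore score a lower blur degree than a gentle ramp.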
Optionally, the exposure time obtaining unit 42 includes:
a minimum exposure time obtaining module 421, configured to determine a minimum exposure time in the exposure time set according to the motion speed of the moving object;
a maximum exposure time obtaining module 422, configured to take the exposure time at which the camera acquires the current preview picture as the maximum exposure time in the exposure time set, where the exposure time set may contain one or more instances of the maximum exposure time;
an intermediate exposure time determining module 423, configured to generate an intermediate exposure time in the exposure time set according to the minimum exposure time and the maximum exposure time, where the intermediate exposure time is an exposure time in the exposure time set other than the minimum exposure time and the maximum exposure time.
Optionally, the minimum exposure time obtaining module 421 is further configured to:
and determining the minimum exposure time in the exposure time set according to the motion speed of the moving object based on a rule that the motion speed and the minimum exposure time are in inverse proportion.
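The inverse-proportion rule can be sketched as follows; the one-pixel blur budget and the shutter-range clamps are illustrative assumptions, not values from the patent:

```python
def min_exposure_time(speed_px_per_s, blur_budget_px=1.0,
                      floor_s=1 / 8000, ceil_s=1 / 30):
    """Minimum exposure time, inversely proportional to motion speed.

    To keep the moving object's smear within `blur_budget_px` pixels, the
    exposure must not exceed blur_budget / speed; the result is clamped to
    an assumed supported shutter range.
    """
    if speed_px_per_s <= 0:
        return ceil_s                 # static scene: longest allowed exposure
    t = blur_budget_px / speed_px_per_s
    return min(max(t, floor_s), ceil_s)
```

Doubling the speed halves the returned exposure, which is the inverse-proportion behavior the passage describes.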
Optionally, the intermediate exposure time determining module 423 is further configured to:
and determining the intermediate exposure times in the exposure time set based on a preset number of exposure times in the exposure time set or a preset exposure time step.
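Both options above — a preset count of exposures in the set, or a preset step — can be sketched as follows (geometric spacing and the multiplicative step convention are our assumptions; the patent does not fix a spacing rule):

```python
import numpy as np

def intermediate_times(t_min, t_max, count=None, step=None):
    """Intermediate exposure times strictly between t_min and t_max.

    `count` is a preset total number of exposures in the set (including the
    minimum and maximum); `step` is a preset multiplicative exposure step,
    e.g. 2.0 for one-stop spacing.
    """
    if count is not None:
        # Geometric spacing: `count` total times, endpoints excluded from result.
        ts = np.geomspace(t_min, t_max, count)
        return [float(t) for t in ts[1:-1]]
    if step is not None and step > 1:
        ts, t = [], t_min * step
        while t < t_max:
            ts.append(t)
            t *= step
        return ts
    return []
```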
Optionally, the image synthesizing unit 44 includes:
a determining module 441, configured to judge whether, among the images respectively corresponding to the minimum exposure time and the intermediate exposure times in the exposure time set, there is an image in which the blur degree of the region where the moving object is located is greater than a threshold;
and an image synthesizing module 442, configured to, if such an image exists, discard each image in which the blur degree of the region where the moving object is located is greater than the threshold, and synthesize the remaining images to obtain the high dynamic range image.
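A sketch of modules 441 and 442 together: frames whose moving-object region exceeds the blur threshold are dropped before fusion. The plain averaging stand-in for fusion and the `None` convention for unchecked frames (e.g. the maximum-exposure frame) are our assumptions:

```python
import numpy as np

def merge_frames(frames, region_blur, threshold):
    """Discard frames whose moving-object region is too blurred, then merge.

    `region_blur` holds the blur degree of the moving-object region in each
    frame, or None for frames that are not checked. Kept frames are averaged
    as a simple stand-in for a full HDR fusion.
    """
    kept = [f.astype(np.float64) for f, b in zip(frames, region_blur)
            if b is None or b <= threshold]
    if not kept:
        raise ValueError("all frames discarded; lower the blur threshold")
    return sum(kept) / len(kept)
```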
It will be apparent to those skilled in the art that, for convenience and simplicity of description, the foregoing functional units and modules are merely illustrated in terms of division, and in practical applications, the foregoing functional allocation may be performed by different functional units and modules as needed, that is, the internal structure of the mobile terminal is divided into different functional units or modules to perform all or part of the above described functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the mobile terminal may refer to the corresponding process in the foregoing method embodiment, and is not described herein again.
Fig. 5 is a schematic block diagram of a mobile terminal according to another embodiment of the present application. As shown in fig. 5, the mobile terminal 5 of this embodiment includes: one or more processors 50, a memory 51 and a computer program 52 stored in said memory 51 and executable on said processors 50. The processor 50, when executing the computer program 52, implements the steps in the various method embodiments described above, such as the steps S101 to S104 shown in fig. 1. Alternatively, the processor 50, when executing the computer program 52, implements the functions of the modules/units in the above-described mobile terminal embodiments, such as the functions of the modules 41 to 44 shown in fig. 4.
Illustratively, the computer program 52 may be partitioned into one or more modules/units, which are stored in the memory 51 and executed by the processor 50 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 52 in the mobile terminal 5. For example, the computer program 52 may be divided into a motion speed determination unit, an exposure time obtaining unit, an image acquisition unit, an image composition unit.
a moving speed determining unit, configured to determine the moving speed of a moving object in a preview picture acquired by a camera of the mobile terminal;
an exposure time obtaining unit, configured to obtain an exposure time set according to a motion speed of the moving object, where the exposure time set includes at least two exposure times;
the image acquisition unit is used for controlling a camera of the mobile terminal to acquire an image based on each exposure time in the exposure time set and obtain an image corresponding to each exposure time in the exposure time set;
and the image synthesis unit is used for synthesizing the images respectively corresponding to each exposure time in the exposure time set to obtain the high dynamic range image.
Other units or modules can be referred to the description of the embodiment shown in fig. 4, and are not described again here.
The mobile terminal includes, but is not limited to, a processor 50 and a memory 51. Those skilled in the art will appreciate that Fig. 5 is only an example of the mobile terminal 5 and does not limit it; the mobile terminal 5 may include more or fewer components than shown, combine certain components, or use different components. For example, the mobile terminal may also include input devices, output devices, network access devices, buses, and the like.
The processor 50 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 51 may be an internal storage unit of the mobile terminal 5, such as a hard disk or a memory of the mobile terminal 5. The memory 51 may also be an external storage device of the mobile terminal 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, provided on the mobile terminal 5. Further, the memory 51 may also include both an internal storage unit and an external storage device of the mobile terminal 5. The memory 51 is used for storing the computer program and other programs and data required by the mobile terminal. The memory 51 may also be used to temporarily store data that has been output or is to be output.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed mobile terminal and method may be implemented in other ways. For example, the above-described embodiments of the mobile terminal are merely illustrative, and for example, the division of the modules or units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, realizes the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content of the computer-readable medium may be suitably increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, the computer-readable medium does not include electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (8)

1. A method of generating a high dynamic range image, comprising:
determining the motion speed of a moving object in a preview picture acquired by a camera of the mobile terminal;
obtaining an exposure time set according to the movement speed of the moving object, wherein the exposure time set comprises at least two exposure times;
controlling a camera of the mobile terminal to acquire an image based on each exposure time in the exposure time set, and obtaining an image corresponding to each exposure time in the exposure time set;
synthesizing images respectively corresponding to each exposure time in the exposure time set to obtain a high dynamic range image;
wherein the obtaining of the set of exposure times according to the motion speed of the moving object comprises:
determining the minimum exposure time in the exposure time set according to the movement speed of the moving object;
taking the exposure time at which the camera acquires the current preview picture as the maximum exposure time in the exposure time set, wherein the exposure time set may contain one or more instances of the maximum exposure time;
generating an intermediate exposure time in the exposure time set according to the minimum exposure time and the maximum exposure time, wherein an intermediate exposure time is an exposure time in the exposure time set other than the minimum exposure time and the maximum exposure time, and duplicate exposure times are allowed in the exposure time set;
the synthesizing images respectively corresponding to each exposure time in the exposure time set to obtain a high dynamic range image includes:
judging whether, among the images respectively corresponding to the minimum exposure time and the intermediate exposure time in the exposure time set, there is an image in which the blur degree of the region where the moving object is located is greater than a threshold;
and if there is an image in which the blur degree of the region where the moving object is located is greater than the threshold, discarding each such image and synthesizing the remaining images to obtain the high dynamic range image.
2. The method for generating a high dynamic range image according to claim 1, wherein the determining the moving speed of the moving object in the preview picture acquired by the camera of the mobile terminal comprises:
determining a moving object in the preview picture currently acquired by the camera based on the preview picture currently acquired by the camera and N continuous preview pictures before the preview picture currently acquired;
acquiring the blur degree of the region where the moving object is located in the preview picture currently acquired by the camera;
and determining the moving speed of the moving object according to the blur degree of the region where the moving object is located in the currently acquired preview picture and the exposure time of the currently acquired preview picture.
3. The method for generating a high dynamic range image according to claim 2, wherein the acquiring the blur degree of the region where the moving object is located in the preview picture currently acquired by the camera comprises:
calculating a first gradient change value, in a first direction, of the gray values of the pixel points on the contour line of the region where the moving object is located, and calculating a second gradient change value of those gray values in a second direction, wherein the first direction is perpendicular to the second direction;
and calculating the data characteristic value of the first gradient change value and the second gradient change value, and obtaining the blur degree of the region where the moving object is located according to the data characteristic value.
4. The method of generating a high dynamic range image according to claim 1, wherein said determining a minimum exposure time of the set of exposure times according to the moving speed of the moving object comprises:
and determining the minimum exposure time in the exposure time set according to the motion speed of the moving object based on a rule that the motion speed and the minimum exposure time are in inverse proportion.
5. The method of generating a high dynamic range image of claim 1, wherein said generating an intermediate exposure time of the set of exposure times from the minimum exposure time and the maximum exposure time comprises:
and determining the intermediate exposure times in the exposure time set based on a preset number of exposure times in the exposure time set or a preset exposure time step.
6. A mobile terminal, comprising:
a moving speed determining unit, configured to determine the moving speed of a moving object in a preview picture acquired by a camera of the mobile terminal;
an exposure time obtaining unit, configured to obtain an exposure time set according to a motion speed of the moving object, where the exposure time set includes at least two exposure times;
the image acquisition unit is used for controlling a camera of the mobile terminal to acquire an image based on each exposure time in the exposure time set and obtain an image corresponding to each exposure time in the exposure time set;
an image synthesis unit for synthesizing images corresponding to each exposure time in the exposure time set to obtain a high dynamic range image;
wherein the exposure time obtaining unit includes:
the minimum exposure time obtaining module is used for determining the minimum exposure time in the exposure time set according to the movement speed of the moving object;
a maximum exposure time obtaining module, configured to take the exposure time at which the camera acquires the current preview picture as the maximum exposure time in the exposure time set, wherein the exposure time set may contain one or more instances of the maximum exposure time;
an intermediate exposure time determining module, configured to generate an intermediate exposure time in the exposure time set according to the minimum exposure time and the maximum exposure time, wherein an intermediate exposure time is an exposure time in the exposure time set other than the minimum exposure time and the maximum exposure time, and duplicate exposure times are allowed in the exposure time set;
the image synthesizing unit includes:
a judging module, configured to judge whether, among the images respectively corresponding to the minimum exposure time and the intermediate exposure time in the exposure time set, there is an image in which the blur degree of the region where the moving object is located is greater than a threshold;
and an image synthesis module, configured to, if there is an image in which the blur degree of the region where the moving object is located is greater than the threshold, discard each such image and synthesize the remaining images to obtain the high dynamic range image.
7. A mobile terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 5 when executing the computer program.
8. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by one or more processors, implements the steps of the method according to any one of claims 1 to 5.
CN201811195806.5A 2018-10-15 2018-10-15 High dynamic range image generation method, mobile terminal and storage medium Active CN109286758B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811195806.5A CN109286758B (en) 2018-10-15 2018-10-15 High dynamic range image generation method, mobile terminal and storage medium


Publications (2)

Publication Number Publication Date
CN109286758A CN109286758A (en) 2019-01-29
CN109286758B 2021-02-12

Family

ID=65176463

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811195806.5A Active CN109286758B (en) 2018-10-15 2018-10-15 High dynamic range image generation method, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN109286758B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110213498B (en) * 2019-05-29 2021-04-23 Oppo广东移动通信有限公司 Image generation method and device, electronic equipment and computer readable storage medium
CN110445989B (en) * 2019-08-05 2021-03-23 Oppo广东移动通信有限公司 Image processing method, image processing device, storage medium and electronic equipment
CN111479072B (en) * 2020-04-14 2021-12-17 深圳市道通智能航空技术股份有限公司 High dynamic range image synthesis method and device, image processing chip and aerial camera
CN111901525A (en) * 2020-07-29 2020-11-06 西安欧亚学院 Multi-camera artificial intelligence image processing method
CN112437235B (en) * 2020-11-11 2022-03-01 Oppo广东移动通信有限公司 Night scene picture generation method and device and mobile terminal

Citations (4)

Publication number Priority date Publication date Assignee Title
CN104660915A (en) * 2015-02-09 2015-05-27 广东欧珀移动通信有限公司 Control method and device for panoramic photography exposure
CN106469433A (en) * 2015-08-19 2017-03-01 奥林巴斯株式会社 Camera head, image capture method
CN107395997A (en) * 2017-08-18 2017-11-24 维沃移动通信有限公司 A kind of image pickup method and mobile terminal
CN107566748A (en) * 2017-09-22 2018-01-09 维沃移动通信有限公司 A kind of image processing method, mobile terminal and computer-readable recording medium

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
JP4286068B2 (en) * 2003-06-03 2009-06-24 大塚電子株式会社 Screen quality evaluation method



Similar Documents

Publication Publication Date Title
CN109286758B (en) High dynamic range image generation method, mobile terminal and storage medium
CN108898567B (en) Image noise reduction method, device and system
CN109005367B (en) High dynamic range image generation method, mobile terminal and storage medium
CN109064428B (en) Image denoising processing method, terminal device and computer readable storage medium
CN109005368B (en) High dynamic range image generation method, mobile terminal and storage medium
CN109166156B (en) Camera calibration image generation method, mobile terminal and storage medium
CN108230333B (en) Image processing method, image processing apparatus, computer program, storage medium, and electronic device
CN110766679A (en) Lens contamination detection method and device and terminal equipment
CN110796600B (en) Image super-resolution reconstruction method, image super-resolution reconstruction device and electronic equipment
CN110335216B (en) Image processing method, image processing apparatus, terminal device, and readable storage medium
WO2011084279A2 (en) Algorithms for estimating precise and relative object distances in a scene
CN110660090B (en) Subject detection method and apparatus, electronic device, and computer-readable storage medium
CN109040596B (en) Method for adjusting camera, mobile terminal and storage medium
CN110084765B (en) Image processing method, image processing device and terminal equipment
CN111131688B (en) Image processing method and device and mobile terminal
CN111311482A (en) Background blurring method and device, terminal equipment and storage medium
CN112348686B (en) Claim settlement picture acquisition method and device and communication equipment
CN111405185B (en) Zoom control method and device for camera, electronic equipment and storage medium
CN111311481A (en) Background blurring method and device, terminal equipment and storage medium
CN110796041A (en) Subject recognition method and device, electronic equipment and computer-readable storage medium
CN108805838B (en) Image processing method, mobile terminal and computer readable storage medium
CN111340722B (en) Image processing method, processing device, terminal equipment and readable storage medium
CN111222446B (en) Face recognition method, face recognition device and mobile terminal
CN111371987B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN108769521B (en) Photographing method, mobile terminal and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant