CN109005368B - High dynamic range image generation method, mobile terminal and storage medium - Google Patents

High dynamic range image generation method, mobile terminal and storage medium

Info

Publication number
CN109005368B
CN109005368B (application CN201811196979.9A)
Authority
CN
China
Prior art keywords
image
images
reference image
candidate
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811196979.9A
Other languages
Chinese (zh)
Other versions
CN109005368A (en)
Inventor
刘银华
孙剑波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201811196979.9A priority Critical patent/CN109005368B/en
Publication of CN109005368A publication Critical patent/CN109005368A/en
Application granted granted Critical
Publication of CN109005368B publication Critical patent/CN109005368B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/741Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Abstract

The application belongs to the technical field of image processing and provides a high dynamic range image generation method, a mobile terminal, and a computer-readable storage medium. The method comprises the following steps: collecting multiple frames of images with different exposure times through a camera of a mobile terminal; displaying the multi-frame images and prompting the user to select the image whose motion region is clearest as a reference image; acquiring the image selected by the user, using it as the reference image, and using the other images in the multi-frame images as candidate images; determining the degree of coincidence between each candidate image and the reference image; and synthesizing the candidate images whose degree of coincidence is below a threshold with the reference image, based on the motion region of the reference image, to obtain a high dynamic range image.

Description

High dynamic range image generation method, mobile terminal and storage medium
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to a method for generating a high dynamic range image, a mobile terminal, and a computer-readable storage medium.
Background
Compared with an ordinary image, a High-Dynamic Range (HDR) image can provide a greater dynamic range and more image detail. The final HDR image is synthesized from Low-Dynamic Range (LDR) images captured at different exposure times, using the LDR image with the best detail for each exposure time, so that it better reflects the visual effect a person perceives in a real environment.
At present, a high dynamic range image obtained by combining multiple images shot at different exposure times can capture more detail; however, the contours of local objects often become blurred. As a result, currently photographed high dynamic range images tend to be of poor quality.
Disclosure of Invention
In view of this, embodiments of the present application provide a method for generating a high dynamic range image, a mobile terminal, and a computer-readable storage medium, so as to address the problems that the local object contours of current high dynamic range images are blurred and the resulting image quality is poor.
A first aspect of an embodiment of the present application provides a method for generating a high dynamic range image, including:
acquiring multi-frame images with different exposure times through a camera of the mobile terminal, and displaying the multi-frame images to prompt a user to select a frame of image with the clearest motion area from the multi-frame images as a reference image;
acquiring an image selected by a user, taking the selected image as a reference image, and taking other images except the reference image in the multi-frame image as candidate images;
and determining the coincidence degree of the candidate image and the reference image, and synthesizing the candidate image with the coincidence degree lower than a threshold value and the reference image based on the motion region of the reference image to obtain a high dynamic range image.
A second aspect of an embodiment of the present application provides a mobile terminal, including:
the display unit is used for acquiring multi-frame images with different exposure times through a camera of the mobile terminal and displaying the multi-frame images so as to prompt a user to select a frame of image with the clearest motion area from the multi-frame images as a reference image;
the reference image determining unit is used for acquiring an image selected by a user, taking the selected image as a reference image, and taking other images except the reference image in the multi-frame image as candidate images;
and the image synthesis unit is used for determining the coincidence degree of the candidate image and the reference image, and synthesizing the candidate image with the coincidence degree lower than a threshold value and the reference image based on the motion area of the reference image to obtain a high dynamic range image.
A third aspect of an embodiment of the present application provides a mobile terminal, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the method provided in the first aspect of the embodiment of the present application when executing the computer program.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by one or more processors, performs the steps of the method provided by the first aspect of embodiments of the present application.
A fifth aspect of embodiments of the present application provides a computer program product comprising a computer program that, when executed by one or more processors, performs the steps of the method provided by the first aspect of embodiments of the present application.
The embodiment of the application provides a method for generating a high dynamic range image. Multiple frames of images with different exposure times are collected through a camera of a mobile terminal and displayed, and the user is prompted to select the image whose motion region is sharpest as a reference image. The image selected by the user is acquired and used as the reference image, and the other images in the multi-frame images are used as candidate images. The degree of coincidence between each candidate image and the reference image is determined, and the candidate images whose degree of coincidence is below a threshold are synthesized with the reference image, based on the motion region of the reference image, to obtain a high dynamic range image. Because the user selects the image with the clearest motion region from the multiple images with different exposure times as the reference image, the motion region of the synthesized high dynamic range image is clearer. Synthesizing only the candidate images whose degree of coincidence is below the threshold with the reference image combines the motion region of the reference image with the sufficiently different non-motion regions of the candidate images, ensuring that the contour of the motion region is clear while more detail is obtained in the non-motion regions.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart of an implementation of a method for generating a high dynamic range image according to an embodiment of the present application;
fig. 2 is a schematic flow chart of an implementation of another method for generating a high dynamic range image according to an embodiment of the present application;
fig. 3 is a schematic block diagram of a mobile terminal according to an embodiment of the present application;
fig. 4 is a schematic block diagram of another mobile terminal provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Fig. 1 is a schematic flow chart of an implementation of a method for generating a high dynamic range image according to an embodiment of the present application, and as shown in the figure, the method may include the following steps:
Step S101, acquiring multi-frame images with different exposure times through a camera of the mobile terminal, and displaying the multi-frame images to prompt a user to select a frame of image with the clearest motion area from the multi-frame images as a reference image.
In the embodiment of the application, the camera of the mobile terminal can collect multiple frames of images whose exposure times differ. After the multi-frame images are obtained, they are displayed on a display interface of the mobile terminal, either in the order in which they were collected or in order of their motion areas from clear to blurred. While the images are displayed, prompt information is sent to prompt the user to select the image with the clearest motion area from the multi-frame images as the reference image.
A moving object may be present in the scene when the user is shooting. When there is a moving object, the set of exposure times needs to include a short exposure, because a moving object captured with a short exposure time is relatively clear. However, owing to factors such as hand shake during shooting, the motion area in the image captured with the shortest exposure time is not necessarily the clearest. To make the motion area of the finally synthesized high dynamic range image clear, the image with the clearest motion area needs to be selected as the reference image, so the user can select one image as the reference image according to the shooting effect he or she desires.
Step S102, acquiring an image selected by a user, taking the selected image as a reference image, and taking other images except the reference image in the multi-frame image as candidate images.
In the embodiment of the application, after a user selects one image by means of a physical key, a virtual button, touch control or the like, the image can be used as a reference image, and other images except the image can be used as candidate images.
Step S103, determining the coincidence degree of the candidate image and the reference image, and synthesizing the candidate image and the reference image with the coincidence degree lower than a threshold value based on the motion area of the reference image to obtain a high dynamic range image.
In the embodiment of the present application, the degree of coincidence between each candidate image and the reference image can be determined. Since the motion region in the reference image is the clearest, the motion region of the reference image can supply the main characteristics of the motion region in the synthesized high dynamic range image, while the non-motion regions of the candidate images supply the other details. The higher the degree of coincidence between a candidate image and the reference image, the more similar the two images are and the less additional detail their synthesis can express; conversely, the larger the difference between the two images, the more detail the synthesis can express. A threshold therefore needs to be set, and the candidate images whose degree of coincidence is below the threshold are synthesized with the reference image to obtain the high dynamic range image. In this way, the resulting high dynamic range image contains more detail.
As another embodiment of the present application, the determining the degree of coincidence between the candidate image and the reference image includes:
acquiring a gray difference image between pixel points in the candidate image and pixel points in the reference image;
and carrying out binarization processing on the gray difference image to obtain a binarized image, and determining the coincidence degree of the candidate image and the reference image according to the binarized image.
In the embodiment of the present application, the degree of coincidence between the candidate image and the reference image may be calculated directly, or it may be calculated after the motion region has been removed from each image.
Taking the case in which the motion regions are not removed as an example, the candidate image and the reference image can each be converted to a grayscale image, and the gray difference image between them is then calculated by subtracting the gray levels of pixel points with the same coordinates. Binarization processing is performed on the gray difference image to obtain a binarized image. The white area (or black area) of the binarized image is the difference area between the candidate image and the reference image, and the degree of difference is represented by the area of that region or its number of pixel points; similarly, the black area (or white area) is the coincidence area between the candidate image and the reference image, and the degree of coincidence is represented by the area of that region or its number of pixel points.
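As an illustration only, the following Python/OpenCV sketch implements the gray-difference approach just described. The binarization threshold (30) and the use of the coincident-pixel fraction as the degree of coincidence are assumptions for the sketch; the embodiment leaves both open.

```python
import cv2
import numpy as np

def coincidence_degree(candidate_bgr, reference_bgr, diff_thresh=30):
    """Fraction of pixels at which the candidate and reference images coincide."""
    cand = cv2.cvtColor(candidate_bgr, cv2.COLOR_BGR2GRAY)
    ref = cv2.cvtColor(reference_bgr, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(cand, ref)  # gray difference image
    # Binarize: white pixels mark the difference area, black pixels the coincidence area.
    _, binary = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    return np.count_nonzero(binary == 0) / binary.size
```

Candidate images whose returned value falls below the chosen threshold would then be passed on to the synthesis step.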
As another embodiment of the present application, the synthesizing the candidate image with the overlapping degree lower than the threshold value and the reference image to obtain the high dynamic range image based on the motion region of the reference image includes:
based on the motion area of the reference image, correcting the motion area of the candidate image with the coincidence degree lower than a threshold value to obtain a candidate corrected image;
and synthesizing the reference image and the candidate corrected image to obtain a high dynamic range image.
In the embodiment of the application, in order that the motion region of the finally obtained high dynamic range image is clearer and its contour boundary sharp, the motion region of each candidate image whose degree of coincidence is below the threshold can be corrected based on the motion region of the reference image to obtain a candidate corrected image; the reference image and the candidate corrected images are then synthesized to obtain the high dynamic range image. Of course, when calculating the degree of coincidence between a candidate image and the reference image, the degree of coincidence between the reference image and the candidate corrected image may also be calculated.
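The embodiment does not specify how the motion region is corrected or which fusion operator is used. The sketch below shows one plausible reading, with two assumptions: the motion region of each selected candidate is simply overwritten with the reference image's pixels, and the frames are merged with OpenCV's Mertens exposure fusion (any other HDR merge could be substituted).

```python
import cv2
import numpy as np

def synthesize_hdr(reference, candidates, motion_mask):
    """reference, candidates: uint8 BGR frames; motion_mask: uint8 mask,
    255 inside the motion region of the reference image."""
    region = motion_mask > 0
    corrected = []
    for cand in candidates:  # candidates whose coincidence is below the threshold
        fixed = cand.copy()
        fixed[region] = reference[region]  # correct the candidate's motion region
        corrected.append(fixed)
    # Mertens exposure fusion needs no exposure times; it returns a float
    # image in roughly [0, 1], which is scaled back to 8-bit here.
    fused = cv2.createMergeMertens().process([reference] + corrected)
    return np.clip(fused * 255.0, 0, 255).astype(np.uint8)
```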
According to the present application, the user selects the image with the clearest motion region from the multiple frames of images with different exposure times as the reference image, so the motion region of the synthesized high dynamic range image is clearer. Synthesizing only the candidate images whose degree of coincidence is below the threshold with the reference image combines the motion region of the reference image with the sufficiently different non-motion regions of the candidate images, ensuring that the contour of the motion region is clear while more detail is obtained in the non-motion regions.
Fig. 2 is a schematic flow chart of an implementation of another method for generating a high dynamic range image according to an embodiment of the present application, and as shown in the figure, the method describes how to display the multiple frames of images on the basis of the embodiment shown in fig. 1, and specifically may include the following steps:
step S201, selecting a frame of image, selecting two frames of images from other frames of images except the current frame of image in the multiple frames of images, and calculating differences between the selected two frames of images and the current frame of image respectively to obtain two difference images.
In the embodiment of the present application, when displaying the multi-frame images, it is desirable to display them with the motion regions ordered from clear to blurred, so as to shorten the user's selection time, or even to use the image with the clearest motion region directly as the reference image. It is therefore necessary to determine the motion region of each frame. Before the motion region of a candidate image is corrected, it differs relatively strongly from the motion region of the reference image, and the difference image between the reference image and a candidate image usually contains both the motion region of the reference image and that of the candidate image. Therefore, when calculating the motion region of each frame in the multi-frame images, one frame can be selected as the current frame image, two frames can be selected from the other images, and the difference between each selected frame and the current frame image can be calculated to obtain two difference images.
Step S202, the two difference images are respectively subjected to binarization processing to obtain two binarization images, an intersection area of target areas of the two binarization images is calculated, and a motion area of the current frame image is obtained.
In the embodiment of the present application, the two difference images by themselves do not yet yield the motion region of the current frame image; only after binarization processing does each difference image represent the region in which the selected frame differs from the current frame image. Therefore, binarization processing is performed on the two difference images to obtain the two difference regions, and taking the intersection of the two difference regions yields the motion region of the current frame image.
By way of example, assume that the motion region a of image A, the motion region b of image B, and the motion region c of image C are to be calculated. Images A, B, and C are first converted to grayscale images. The difference image (gray difference image) between image A and image B is then calculated, and a suitable threshold is selected for binarization processing so that the main differences between images A and B are represented; the binarized image obtained in this way (the binarized difference image between A and B) contains regions a and b. Likewise, the binarized difference image between image A and image C contains regions a and c. Taking the intersection of the two binarized images yields region a, which is the motion region of image A. After the motion region is determined, it is still necessary to determine how sharp or blurred it is.
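A minimal sketch of steps S201 and S202 following the A/B/C example above; the binarization threshold is again an assumed value.

```python
import cv2

def motion_region(current, frame1, frame2, thresh=30):
    """Binary mask of the motion region of `current` (all inputs grayscale)."""
    d1 = cv2.absdiff(current, frame1)  # difference image with the first selected frame
    d2 = cv2.absdiff(current, frame2)  # difference image with the second selected frame
    _, b1 = cv2.threshold(d1, thresh, 255, cv2.THRESH_BINARY)  # regions a and b
    _, b2 = cv2.threshold(d2, thresh, 255, cv2.THRESH_BINARY)  # regions a and c
    # The intersection of the two difference regions keeps only region a,
    # the motion region of the current frame.
    return cv2.bitwise_and(b1, b2)
```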
Step S203, calculating a first gradient change value of the gray value of each pixel point on the contour line of the motion region in the first direction, and calculating a second gradient change value of the gray value of each pixel point on the contour line of the motion region in the second direction, where the first direction is perpendicular to the second direction.
Step S204, calculating the data characteristic values of the first gradient change value and the second gradient change value, and obtaining the ambiguity of the motion area according to the data characteristic values.
In this embodiment of the application, once the motion region is determined, the contour line of the motion region in the image is also determined. A first gradient change value of the gray value of each pixel point on the contour line can then be calculated in the horizontal direction (one first gradient change value per pixel point), and a second gradient change value in the vertical direction (one second gradient change value per pixel point). This yields twice as many gradient change values as there are pixel points. A data characteristic value of these gradient change values (for example, their average) is then calculated: the larger the data characteristic value (i.e., the larger the gradient change), the sharper the boundary and the lower the blur of the motion region. In other words, the data characteristic value is inversely proportional to the blurriness of the motion region. The degree of blur may be a specific value or a rank.
Step S205, after calculating the blur degree of the motion region of each frame image in the multiple frame images, displaying the multiple frame images according to the sequence of the blur degrees from small to large.
In the embodiment of the present application, as can be seen from the above description, the degree of blur of the motion region of an image is related to the gradient change values along the boundary of the motion region. Displaying the multi-frame images in order of blur from small to large is therefore equivalent to displaying them in order of the data characteristic value of the first and second gradient change values from large to small.
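The sketch below illustrates steps S203 to S205 under stated assumptions: Sobel derivatives stand in for the unspecified gradient operator, the mean serves as the data characteristic value, and the OpenCV 4 findContours signature is assumed.

```python
import cv2
import numpy as np

def contour_sharpness(gray, motion_mask):
    """Mean of the gradient change values of the gray values along the
    motion-region contour; a larger value means a sharper contour,
    i.e. a lower degree of blur."""
    contours, _ = cv2.findContours(motion_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0)  # first direction (horizontal)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1)  # second direction (vertical)
    vals = []
    for cnt in contours:
        for x, y in cnt.reshape(-1, 2):
            vals.append(abs(float(gx[y, x])))  # first gradient change value
            vals.append(abs(float(gy[y, x])))  # second gradient change value
    return float(np.mean(vals)) if vals else 0.0

# Display order of step S205: ascending blur = descending characteristic value.
# order = sorted(range(len(frames)), reverse=True,
#                key=lambda i: contour_sharpness(frames[i], masks[i]))
```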
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 3 is a schematic block diagram of a mobile terminal according to an embodiment of the present application, and only a portion related to the embodiment of the present application is shown for convenience of description.
The mobile terminal 3 may be a software unit, a hardware unit, or a unit combining software and hardware built into a mobile terminal such as a mobile phone, a tablet computer, or a notebook computer, or may be integrated into such a mobile terminal as an independent component.
The mobile terminal 3 includes:
the display unit 31 is configured to collect multiple frames of images with different exposure times through a camera of the mobile terminal, and display the multiple frames of images to prompt a user to select a frame of image with the clearest motion area from the multiple frames of images as a reference image;
a reference image determining unit 32, configured to obtain an image selected by a user, use the selected image as a reference image, and use other images than the reference image in the multi-frame images as candidate images;
and an image synthesizing unit 33, configured to determine a degree of coincidence between the candidate image and the reference image, and synthesize the candidate image and the reference image, of which the degree of coincidence is lower than a threshold, based on a motion region of the reference image to obtain a high dynamic range image.
Optionally, the display unit 31 is further configured to:
and determining a motion area of each frame of image in the multi-frame images, and displaying the multi-frame images according to the sequence of the motion areas from clear to fuzzy.
Optionally, the display unit 31 includes:
a difference image obtaining module 311, configured to select two frames of images from other frame images except for the current frame image in the multiple frames of images when calculating a motion region of each frame of image in the multiple frames of images, and calculate differences between the selected two frames of images and the current frame image, so as to obtain two difference images;
a motion region obtaining module 312, configured to perform binarization processing on the two difference images respectively to obtain two binarized images, and calculate an intersection region of target regions of the two binarized images to obtain a motion region of the current frame image.
Optionally, the display unit 31 further includes:
and the display module 313 is configured to calculate a gradient change value of a gray value of each pixel point on the contour line of the motion region, and display the multi-frame image according to a descending order of the gradient change values.
Optionally, the display module 313 is further configured to:
calculating a first gradient change value of the gray value of each pixel point on the contour line of the motion area in a first direction, and calculating a second gradient change value of the gray value of each pixel point on the contour line of the motion area in a second direction, wherein the first direction is perpendicular to the second direction;
and calculating the data characteristic value of the first gradient change value and the second gradient change value, and taking the data characteristic value as the gradient change value of each pixel gray value on the contour line of the motion area.
Optionally, the image synthesizing unit 33 includes:
a gray-scale difference image obtaining module 331, configured to obtain a gray-scale difference image between a pixel point in the candidate image and a pixel point in the reference image;
the contact ratio determining module 332 is configured to perform binarization processing on the grayscale difference image to obtain a binarized image, and determine a contact ratio between the candidate image and the reference image according to the binarized image.
Optionally, the image synthesizing unit 33 includes:
a correcting module 333, configured to correct, based on the motion region of the reference image, the motion region of the candidate image with the coincidence degree lower than the threshold value, to obtain a candidate corrected image;
a synthesizing module 334, configured to synthesize the reference image and the candidate rectified image to obtain a high dynamic range image.
It will be apparent to those skilled in the art that, for convenience and simplicity of description, the foregoing functional units and modules are merely illustrated in terms of division, and in practical applications, the foregoing functional allocation may be performed by different functional units and modules as needed, that is, the internal structure of the mobile terminal is divided into different functional units or modules to perform all or part of the above described functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the mobile terminal may refer to the corresponding process in the foregoing method embodiment, and is not described herein again.
Fig. 4 is a schematic block diagram of a mobile terminal according to another embodiment of the present application. As shown in fig. 4, the mobile terminal 4 of this embodiment includes: one or more processors 40, a memory 41, and a computer program 42 stored in the memory 41 and executable on the processors 40. The processor 40, when executing the computer program 42, implements the steps in the various method embodiments described above, such as the steps S101 to S103 shown in fig. 1. Alternatively, the processor 40, when executing the computer program 42, implements the functions of the modules/units in the above-described mobile terminal embodiments, such as the functions of the modules 31 to 33 shown in fig. 3.
Illustratively, the computer program 42 may be partitioned into one or more modules/units that are stored in the memory 41 and executed by the processor 40 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 42 in the mobile terminal 4. For example, the computer program 42 may be divided into a display unit, a reference image determining unit, and an image synthesis unit, wherein:
the display unit is used for acquiring multi-frame images with different exposure times through a camera of the mobile terminal and displaying the multi-frame images so as to prompt a user to select a frame of image with the clearest motion area from the multi-frame images as a reference image;
the reference image determining unit is used for acquiring an image selected by a user, taking the selected image as a reference image, and taking other images except the reference image in the multi-frame image as candidate images;
and the image synthesis unit is used for determining the coincidence degree of the candidate image and the reference image, and synthesizing the candidate image with the coincidence degree lower than a threshold value and the reference image based on the motion area of the reference image to obtain a high dynamic range image.
Other units or modules may refer to the description of the embodiment shown in fig. 3, and are not described herein again.
The mobile terminal includes, but is not limited to, a processor 40 and a memory 41. Those skilled in the art will appreciate that fig. 4 is only one example of the mobile terminal 4 and does not limit it; the mobile terminal 4 may include more or fewer components than shown, combine some components, or use different components. For example, the mobile terminal may also include input devices, output devices, network access devices, buses, etc.
The processor 40 may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 41 may be an internal storage unit of the mobile terminal 4, such as a hard disk or a memory of the mobile terminal 4. The memory 41 may also be an external storage device of the mobile terminal 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, provided on the mobile terminal 4. Further, the memory 41 may also include both an internal storage unit and an external storage device of the mobile terminal 4. The memory 41 is used for storing the computer program and other programs and data required by the mobile terminal. The memory 41 may also be used to temporarily store data that has been output or is to be output.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed mobile terminal and method may be implemented in other ways. For example, the above-described embodiments of the mobile terminal are merely illustrative, and for example, the division of the modules or units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, realizes the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (9)

1. A method of generating a high dynamic range image, comprising:
acquiring multi-frame images with different exposure times through a camera of the mobile terminal, and displaying the multi-frame images to prompt a user to select a frame of image with the clearest motion area from the multi-frame images as a reference image;
acquiring an image selected by a user to determine a shooting effect required by the user, taking the selected image as a reference image, and taking other images except the reference image in the multi-frame image as candidate images;
determining the coincidence degree of the candidate image and the reference image, and synthesizing the candidate image with the coincidence degree lower than a threshold value and the reference image based on the motion region of the reference image to obtain a high dynamic range image;
the determining a degree of coincidence of the candidate image and the reference image comprises:
acquiring a gray difference image between pixel points in the candidate image and pixel points in the reference image;
and carrying out binarization processing on the gray difference image to obtain a binarized image, and determining the coincidence degree of the candidate image and the reference image according to the binarized image.
2. The method of generating a high dynamic range image according to claim 1, wherein said displaying the plurality of frame images includes:
and determining a motion area of each frame of image in the multi-frame images, and displaying the multi-frame images according to the sequence of the motion areas from clear to fuzzy.
3. The method of generating a high dynamic range image according to claim 2, wherein said determining the motion region of each frame image of the plurality of frame images comprises:
when calculating the motion area of each frame of image in the multi-frame images, selecting two frames of images from other frames of images except the current frame of image in the multi-frame images, and calculating the difference between the selected two frames of images and the current frame of image respectively to obtain two difference images;
and respectively carrying out binarization processing on the two difference images to obtain two binarization images, and calculating the intersection area of the target areas of the two binarization images to obtain the motion area of the current frame image.
4. The method of generating a high dynamic range image according to claim 2, wherein said displaying the plurality of frame images in order of motion area from clear to blurred comprises:
and calculating the gradient change value of the gray value of each pixel point on the contour line of the motion area, and displaying the multi-frame image according to the sequence of the gradient change values from large to small.
5. The method of claim 4, wherein the calculating the gradient change value of the gray-level value of each pixel point on the contour line of the motion region comprises:
calculating a first gradient change value of the gray value of each pixel point on the contour line of the motion area in a first direction, and calculating a second gradient change value of the gray value of each pixel point on the contour line of the motion area in a second direction, wherein the first direction is perpendicular to the second direction;
and calculating the data characteristic value of the first gradient change value and the second gradient change value, and taking the data characteristic value as the gradient change value of each pixel gray value on the contour line of the motion area.
6. The method for generating a high dynamic range image according to any one of claims 1 to 5, wherein the synthesizing the candidate image having a degree of coincidence lower than a threshold value and the reference image based on the motion region of the reference image to obtain the high dynamic range image includes:
based on the motion area of the reference image, correcting the motion area of the candidate image with the coincidence degree lower than a threshold value to obtain a candidate corrected image;
and synthesizing the reference image and the candidate corrected image to obtain a high dynamic range image.
7. A mobile terminal, comprising:
the display unit is used for acquiring multi-frame images with different exposure times through a camera of the mobile terminal and displaying the multi-frame images so as to prompt a user to select a frame of image with the clearest motion area from the multi-frame images as a reference image;
the reference image determining unit is used for acquiring an image selected by a user to determine a shooting effect required by the user, taking the selected image as a reference image, and taking other images except the reference image in the multi-frame images as candidate images;
an image synthesis unit, configured to determine a degree of coincidence between the candidate image and the reference image, and synthesize the candidate image and the reference image, of which the degree of coincidence is lower than a threshold, based on a motion region of the reference image to obtain a high dynamic range image;
the determining a degree of coincidence of the candidate image and the reference image comprises:
acquiring a gray difference image between pixel points in the candidate image and pixel points in the reference image;
and carrying out binarization processing on the gray difference image to obtain a binarized image, and determining the coincidence degree of the candidate image and the reference image according to the binarized image.
8. A mobile terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 6 when executing the computer program.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by one or more processors, implements the steps of the method according to any one of claims 1 to 6.
CN201811196979.9A 2018-10-15 2018-10-15 High dynamic range image generation method, mobile terminal and storage medium Active CN109005368B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811196979.9A CN109005368B (en) 2018-10-15 2018-10-15 High dynamic range image generation method, mobile terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811196979.9A CN109005368B (en) 2018-10-15 2018-10-15 High dynamic range image generation method, mobile terminal and storage medium

Publications (2)

Publication Number Publication Date
CN109005368A CN109005368A (en) 2018-12-14
CN109005368B true CN109005368B (en) 2020-07-31

Family

ID=64589985

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811196979.9A Active CN109005368B (en) 2018-10-15 2018-10-15 High dynamic range image generation method, mobile terminal and storage medium

Country Status (1)

Country Link
CN (1) CN109005368B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111754411B (en) * 2019-03-27 2024-01-05 Tcl科技集团股份有限公司 Image noise reduction method, image noise reduction device and terminal equipment
CN110166710A (en) * 2019-06-21 2019-08-23 上海闻泰电子科技有限公司 Image composition method, device, equipment and medium
CN110532957B (en) * 2019-08-30 2021-05-07 北京市商汤科技开发有限公司 Face recognition method and device, electronic equipment and storage medium
CN112672056A (en) * 2020-12-25 2021-04-16 维沃移动通信有限公司 Image processing method and device
CN113052815B (en) * 2021-03-23 2022-06-24 Oppo广东移动通信有限公司 Image definition determining method and device, storage medium and electronic equipment
CN117278865A (en) * 2023-11-16 2023-12-22 荣耀终端有限公司 Image processing method and related device
CN117576490B (en) * 2024-01-16 2024-04-05 口碑(上海)信息技术有限公司 Kitchen environment detection method and device, storage medium and electronic equipment

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7295232B2 (en) * 2003-01-15 2007-11-13 Canon Kabushiki Kaisha Camera and program
CN101674428B (en) * 2009-09-18 2011-06-15 青岛海信电器股份有限公司 Video equipment control method, video playing equipment and playing control system
CN102404602A (en) * 2011-09-23 2012-04-04 浙江工业大学 Vidicon definition detection method based on definition test card
CN102694966B (en) * 2012-03-05 2014-05-21 天津理工大学 Construction method of full-automatic video cataloging system
CN104349066B (en) * 2013-07-31 2018-03-06 华为终端(东莞)有限公司 A kind of method, apparatus for generating high dynamic range images
CN105827971B (en) * 2016-03-31 2019-01-11 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN106131443A (en) * 2016-05-30 2016-11-16 南京大学 A kind of high dynamic range video synthetic method removing ghost based on Block-matching dynamic estimation
CN106060418A (en) * 2016-06-29 2016-10-26 深圳市优象计算技术有限公司 IMU information-based wide dynamic image fusion method
JP6333318B2 (en) * 2016-07-29 2018-05-30 株式会社Screenホールディングス Image processing method, image processing apparatus, and imaging apparatus
CN107230192B (en) * 2017-05-31 2020-07-21 Oppo广东移动通信有限公司 Image processing method, image processing device, computer-readable storage medium and mobile terminal
CN107610075A (en) * 2017-08-29 2018-01-19 维沃移动通信有限公司 Image combining method and mobile terminal
CN107800979B (en) * 2017-10-23 2019-06-28 深圳看到科技有限公司 High dynamic range video image pickup method and filming apparatus
CN108459788B (en) * 2018-03-15 2020-05-22 维沃移动通信有限公司 Picture display method and terminal

Also Published As

Publication number Publication date
CN109005368A (en) 2018-12-14

Similar Documents

Publication Publication Date Title
CN109005368B (en) High dynamic range image generation method, mobile terminal and storage medium
CN109064428B (en) Image denoising processing method, terminal device and computer readable storage medium
CN109286758B (en) High dynamic range image generation method, mobile terminal and storage medium
CN109166156B (en) Camera calibration image generation method, mobile terminal and storage medium
CN109005367B (en) High dynamic range image generation method, mobile terminal and storage medium
CN110796600B (en) Image super-resolution reconstruction method, image super-resolution reconstruction device and electronic equipment
CN108230333B (en) Image processing method, image processing apparatus, computer program, storage medium, and electronic device
CN108764139B (en) Face detection method, mobile terminal and computer readable storage medium
US9424632B2 (en) System and method for generating high dynamic range images
CN109214996B (en) Image processing method and device
CN111131688B (en) Image processing method and device and mobile terminal
CN111311482A (en) Background blurring method and device, terminal equipment and storage medium
CN108776800B (en) Image processing method, mobile terminal and computer readable storage medium
CN111028276A (en) Image alignment method and device, storage medium and electronic equipment
CN111882565B (en) Image binarization method, device, equipment and storage medium
CN113052754B (en) Method and device for blurring picture background
CN108805838B (en) Image processing method, mobile terminal and computer readable storage medium
CN114037992A (en) Instrument reading identification method and device, electronic equipment and storage medium
CN111563517A (en) Image processing method, image processing device, electronic equipment and storage medium
CN112149592A (en) Image processing method and device and computer equipment
CN111340722B (en) Image processing method, processing device, terminal equipment and readable storage medium
CN114140481A (en) Edge detection method and device based on infrared image
CN111222446B (en) Face recognition method, face recognition device and mobile terminal
US20060204091A1 (en) System and method for analyzing and processing two-dimensional images
CN108769521B (en) Photographing method, mobile terminal and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant