CN114630050A - Photographing method, device, medium and terminal equipment - Google Patents

Photographing method, device, medium and terminal equipment

Info

Publication number
CN114630050A
CN114630050A (application CN202210302690.0A)
Authority
CN
China
Prior art keywords: image, frames, frame image, preview images, images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210302690.0A
Other languages
Chinese (zh)
Inventor
黄健钟
朱翔
李宏成
李海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Spreadtrum Semiconductor Nanjing Co Ltd
Original Assignee
Spreadtrum Semiconductor Nanjing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Spreadtrum Semiconductor Nanjing Co Ltd filed Critical Spreadtrum Semiconductor Nanjing Co Ltd
Priority to CN202210302690.0A priority Critical patent/CN114630050A/en
Publication of CN114630050A publication Critical patent/CN114630050A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a photographing method, apparatus, medium and terminal device. The photographing method includes: acquiring M frames of preview images captured by a camera; selecting N frames of preview images from the M frames according to the exposure parameters and image information of the camera, determining a key frame image from the N frames, and aligning the remaining N-1 preview images with the key frame image to obtain N candidate images; and sorting the N candidate images and executing an image processing algorithm on the sorted candidates to obtain a target image. The method reduces the complexity of the image processing algorithms used in the photographing process.

Description

Photographing method, device, medium and terminal equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a photographing method, apparatus, medium, and terminal device.
Background
Cameras have become an indispensable part of mobile phones, and the performance expected of them keeps rising. To optimize camera performance, multiple image processing algorithms are currently stacked on one another in the software implementation. As the algorithms accumulate, however, the cost of maintaining the processing pipeline also rises, because each image processing algorithm needs its own additional control flow and data flow, which becomes a bottleneck for algorithm integration. To improve image quality, many image processing algorithms are nevertheless still integrated, so both the time and the space complexity of the photographing process become very high.
Therefore, a photographing method is needed that reduces the complexity of the image processing algorithms while improving the photographing effect.
Disclosure of Invention
Embodiments of the invention provide a photographing method, apparatus, medium and terminal device that reduce the complexity of the image processing algorithms.
In a first aspect, an embodiment of the present invention provides a photographing method, the method including: acquiring M frames of preview images captured by a camera; selecting N frames of preview images from the M frames according to the exposure parameters and image information of the camera, determining a key frame image from the N frames, and aligning the N-1 preview images other than the key frame image with the key frame image to obtain N candidate images; and sorting the N candidate images and executing an image processing algorithm on the sorted candidates to obtain a target image.
The photographing method provided by the embodiment of the invention has the following beneficial effects: before the image processing algorithm is executed on the image frames, the required number of frames and a key frame are selected from the preview images, and the key frame is used to align the other frames as a preprocessing step. The number of image frames delivered to the image processing algorithm is thereby adjusted dynamically, and the frames delivered to the algorithm are preprocessed, which optimizes the final camera imaging effect.
In a possible implementation, aligning the N-1 preview images other than the key frame image with the key frame image to obtain N candidate images includes performing the following processing for a K-th frame image, where the K-th frame image is any one of the N-1 preview images other than the key frame image:
extracting feature points of the K-th frame image and of the key frame image; matching the feature points of the K-th frame image with those of the key frame image; adjusting the correspondence between the K-th frame image and the key frame image according to the matching result to obtain transformation parameters; and adjusting the spatial layout of the K-th frame image to be consistent with that of the key frame image using the transformation parameters. With this implementation the image frames can be aligned and the differences between them reduced, which improves the final image optimization effect.
In one possible implementation, selecting N preview images from the M preview images according to the exposure parameters of the camera and determining a key frame image from the N preview images includes:
determining a starting frame preview image and an ending frame preview image among the M preview images; and selecting N preview images from the M preview images in the order of traversal from the ending frame preview image back to the starting frame preview image, and determining a key frame image from the N preview images.
In one possible implementation, sorting the N candidate images includes sorting them according to the requirements of the image processing algorithm, so that the frames are delivered in the order the algorithm expects.
In one possible implementation, the exposure parameters include one or more of exposure time length, sensitivity, and exposure compensation value, and the image information includes one or more of brightness information and color information.
In a second aspect, an embodiment of the present invention further provides a photographing apparatus, which includes a module/unit for executing the method of any one of the possible designs of the first aspect. These modules/units may be implemented by hardware, or by hardware executing corresponding software.
In a third aspect, an embodiment of the present application provides a terminal device, which includes a processor and a memory. Wherein the memory is used to store one or more computer programs; the one or more computer programs stored in the memory, when executed by the processor, enable the terminal device to implement the method of any one of the possible designs of the first aspect described above.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium that includes a computer program which, when run on an electronic device, causes the electronic device to perform the method of any one of the possible designs of the first aspect.
In a fifth aspect, the present application further provides a computer program product which, when run on a terminal device, causes the terminal device to execute the method of any one of the possible designs of the first aspect.
For the advantageous effects of the second to fifth aspects, reference may be made to the description of the first aspect above.
Drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the drawings needed for describing the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of a terminal device according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a photographing method according to an embodiment of the present invention;
fig. 3 is a schematic diagram of an image processing flow according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of an image processing method in a photographing scene according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a photographing effect according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a photographing apparatus according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of another terminal device according to an embodiment of the present invention.
Detailed Description
Before describing embodiments of the present invention in detail, some terms used in the embodiments of the present invention will be explained below to facilitate understanding by those skilled in the art.
At present there are many kinds of image processing algorithms, which differ in the types and numbers of frames they require, the functions they implement, and their processing capability. Examples include the High Dynamic Range (HDR) imaging algorithm and the Multi-Frame Noise Reduction (MFNR) algorithm. Both require a fixed number of image frames, but the required numbers differ, and each image processing algorithm needs an additional set of control flow and data flow, so the time and space complexity of the photographing process is high. The invention provides a photographing method that can improve the photographing effect while simplifying the code implementation.
The technical solutions in the embodiments of the present application are described below with reference to the drawings. The terminology used in the following embodiments is for the purpose of describing particular embodiments only and is not intended to limit the present application. As used in the specification of the present application and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, such as "one or more", unless the context clearly indicates otherwise. It should also be understood that in the following embodiments of the present application, "at least one" and "one or more" mean one, two, or more than two. The term "and/or" describes an association relationship between associated objects and covers three cases; for example, A and/or B may represent: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather mean "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless otherwise specifically stated. The term "coupled" includes both direct and indirect connections, unless otherwise noted. "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
In the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as examples, illustrations or descriptions. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
The photographing method provided in the embodiment of the present application may be applied to a terminal device as shown in fig. 1, where fig. 1 shows a hardware configuration block diagram of the terminal device 100.
In some embodiments, the terminal apparatus 100 includes at least one of a tuner demodulator 110, a communicator 120, a collector 130, an external device interface 140, a controller 150, a display 160, an audio output interface 170, a memory, a power supply, and a user interface.
In some embodiments, the display 160 includes a display screen component for displaying pictures and a driving component for driving image display; it receives image signals output by the controller and displays video content, image content, a menu manipulation interface, and a user manipulation interface.
In some embodiments, the display 160 may be at least one of a liquid crystal display, an organic light-emitting diode (OLED), and a projection display, and may also be a projection device and a projection screen.
In some embodiments, the tuner/demodulator 110 receives broadcast television signals via wired or wireless reception and demodulates audio/video signals from a plurality of wireless or wired broadcast television signals.
In some embodiments, communicator 120 is a component for communicating with external devices or servers according to various communication protocol types. In one possible embodiment, the communicator may include at least one of a wireless fidelity (wifi) module, a bluetooth module, a wired ethernet module, or other network communication protocol chip or near field communication protocol chip, and an infrared receiver. The terminal device 100 can establish transmission and reception of control signals and data signals with other devices through the communicator 120.
In some embodiments, the collector 130 is used to collect external environment or signals interacting with the outside. In one possible embodiment, the collector 130 includes a light receiver, a sensor for collecting the intensity of ambient light; or the collector 130 includes an image collector for collecting external environment scenes, attributes of the user, or user interaction gestures; still alternatively, the collector 130 includes a sound collector for receiving external sound.
In some embodiments, the external device interface 140 may include, but is not limited to, the following: any one or more of a high-definition multimedia interface, an analog or data high-definition component input interface, a composite video input interface, a Universal Serial Bus (USB) input interface, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
In some embodiments, the controller 150 controls the operation of the terminal device 100 and responds to user operations through the various software control programs stored in the memory. The controller 150 controls the overall operation of the terminal device 100. In one possible embodiment, in response to receiving a user command to select an object displayed on the display 160, the controller 150 performs the operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects. In one possible embodiment, the object may be a hyperlink, an icon, or other actionable control. Operations related to the selected object are: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon.
In some embodiments the controller comprises at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a Random Access Memory (RAM), a Read Only Memory (ROM), first to nth interfaces for input/output, a communication Bus (Bus), and the like.
The central processor is used to execute the operating system and the application program instructions stored in the memory, and to run various application programs, data and content according to the interaction instructions received from the outside, so as to finally display and play various audio and video content. The central processor may include a plurality of processors; in one possible embodiment, it includes a main processor and one or more sub-processors.
In some embodiments, the graphics processor is used to generate various graphics objects. In one possible embodiment, the graphics objects include at least one of icons, operation menus, and graphics displayed in response to user input instructions. The graphics processor includes an arithmetic unit, which performs operations on the interaction instructions input by the user and displays the various objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit so that they can be shown on the display.
In some embodiments, the video processor is configured to receive an external video signal, and perform at least one of video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to a standard codec protocol of the input signal, so as to obtain a signal that can be directly displayed or played on the terminal device 100.
In some embodiments, the video processor includes at least one of a demultiplexing module, a video decoding module, an image composition module, a frame rate conversion module, and a display formatting module. The demultiplexing module demultiplexes the input audio and video data stream. The video decoding module processes the demultiplexed video signal, including decoding and scaling. The image composition module superimposes and mixes the graphical user interface (GUI) signal generated by the graphics generator with the scaled video image, according to the user input or the GUI signal, to generate an image signal for display. The frame rate conversion module converts the frame rate of the input video. The display formatting module receives the frame-rate-converted video output signal and changes the signal to conform to the display format. In one possible embodiment, the display formatting module outputs RGB data signals.
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform at least one of noise reduction, digital-to-analog conversion, and amplification processing to obtain a sound signal that can be played in the speaker.
In some embodiments, a user may enter user commands on a graphical user interface displayed on the display 160, and the user input interface receives the user input commands through the graphical user interface. Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, the user interface is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A common presentation form of a user interface is a graphical user interface, which refers to a user interface displayed in a graphical manner and related to computer operations. It may be an interface element such as an icon, window, control, etc. displayed in a display screen of the electronic device, where the control may include at least one of an icon, button, menu, tab, text box, dialog box, status bar, navigation bar, etc. visual interface elements.
In some embodiments, user interface 180 is an interface that may be used to receive control inputs. In one possible embodiment, the user interface 180 may be a physical key on the body of the terminal device.
In a possible implementation scheme, the terminal device 100 may be any one of a mobile phone, a tablet computer, a handheld computer, a Personal Computer (PC), a cellular phone, a Personal Digital Assistant (PDA), a wearable device, a smart home device, an in-vehicle computer, a game machine, or an Augmented Reality (AR)/Virtual Reality (VR) device. It should be noted that the present embodiment does not specifically limit the device form of the terminal device 100.
Based on the terminal device 100 shown in fig. 1, an embodiment of the present application provides a photographing method. As shown in fig. 2, the flow of the method may be executed by the terminal device 100 and includes the following steps:
s201, acquiring M frames of preview images acquired by the camera.
In this step, when the photographing function is started, the terminal device 100 collects M frames of preview images captured by the camera within a preset time period; the duration of the preset time period may be 3 to 5 seconds. In this embodiment, a preview image is a preview picture displayed on the photographing function interface.
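For illustration only, S201 can be pictured as a small ring buffer that retains the preview frames captured during the preset window; the buffer size, the 4-second window and the metadata fields used below are assumptions introduced here, not values taken from the disclosure. A minimal Python sketch:

```python
from collections import deque
import time

class PreviewBuffer:
    """Keeps the most recent preview frames captured within a preset window.
    max_frames and window_seconds are assumed values for illustration."""

    def __init__(self, max_frames=8, window_seconds=4.0):
        self.frames = deque(maxlen=max_frames)
        self.window_seconds = window_seconds

    def push(self, frame, metadata):
        # metadata is assumed to carry per-frame exposure duration, sensitivity,
        # exposure compensation and simple image statistics (brightness, color)
        self.frames.append((time.monotonic(), frame, metadata))

    def snapshot(self):
        # The M frames handed to the rest of the pipeline: everything captured
        # inside the preset time window, in capture order
        now = time.monotonic()
        return [(frame, meta) for t, frame, meta in self.frames
                if now - t <= self.window_seconds]
```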
S202, selecting N frames of preview images from the M frames of preview images according to the exposure parameters and the image information of the camera, and determining a key frame image from the N frames of preview images.
In a possible embodiment, a starting frame preview image and an ending frame preview image among the M preview images may be determined first; N preview images are then selected from the M preview images in the order of traversal from the ending frame preview image back to the starting frame preview image, and a key frame image is determined from the N preview images. The key frame image may be the frame that best satisfies the exposure and white balance requirements.
In this embodiment, the exposure parameters include one or more of exposure duration, sensitivity, and exposure compensation value, and the image information includes one or more of brightness information and color information. Optionally, the image information may also include image size and image resolution.
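As a non-authoritative sketch of S202, the selection could traverse the preview frames from the ending frame back to the starting frame, keep N of them, and score each by how well it meets the exposure and white-balance requirements. The scoring rule, the target luma value and the metadata keys (`mean_luma`, `awb_error`) are assumptions for illustration only:

```python
def select_frames(previews, n, target_luma=118.0):
    """Select N frames by traversing from the ending frame back to the starting
    frame, and pick the key frame as the one closest to the exposure and
    white-balance targets (the score and target_luma are assumptions)."""
    # previews: list of (frame, metadata) ordered from starting frame to ending frame
    candidates = list(reversed(previews))[:n]   # traverse end -> start, keep N

    def score(item):
        _, meta = item
        # Smaller is better: distance from target brightness plus color-cast error
        return abs(meta["mean_luma"] - target_luma) + meta["awb_error"]

    key_index = min(range(len(candidates)), key=lambda i: score(candidates[i]))
    return candidates, key_index
```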
S203, aligning the N-1 frames of preview images other than the key frame image among the N frames of preview images with the key frame image to obtain N frames of candidate images.
In this step, in a possible implementation, the following processing is performed for a K-th frame image among the N-1 preview images other than the key frame image, where the K-th frame image is any one of those images, to complete the alignment: extracting feature points of the K-th frame image and of the key frame image; matching the feature points of the K-th frame image with those of the key frame image; adjusting the correspondence between the K-th frame image and the key frame image according to the matching result to obtain transformation parameters; and adjusting the spatial layout of the K-th frame image to be consistent with that of the key frame image using the transformation parameters. This implementation reduces the differences between the N image frames as much as possible and helps optimize the imaging effect.
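One common way to realize the feature-point extraction, matching and transformation described above is feature-based registration with a homography. The disclosure does not name a specific detector or estimator, so the ORB features, brute-force Hamming matcher and RANSAC threshold below are assumptions, not the patented implementation:

```python
import cv2
import numpy as np

def align_to_key_frame(frame, key_frame):
    """Warp `frame` so that its spatial layout matches `key_frame`.
    ORB + brute-force matching + RANSAC homography is one possible realization
    of the 'transformation parameters'; the constants here are assumptions."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(frame, None)      # feature points of the K-th frame
    kp2, des2 = orb.detectAndCompute(key_frame, None)  # feature points of the key frame

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Transformation parameters relating the K-th frame to the key frame
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    h, w = key_frame.shape[:2]
    return cv2.warpPerspective(frame, H, (w, h))       # layout now matches the key frame
```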
S204, sorting the N frames of candidate images, and executing an image processing algorithm on the sorted N frames of candidate images to obtain a target image.
In a possible embodiment, the N candidate images may be sorted as follows: as shown in fig. 3, the image sensor of the camera captures several frames of preview images, a key frame is selected, the key frame and the other frames are aligned, the N candidate images are sorted according to the characteristics of the image processing algorithm to obtain N sorted candidate images, and the image processing algorithm is executed on them to obtain the target image. By sorting the N candidate images, the method can control both the number of frames fed to the image processing algorithm and their order, so as to achieve a better processing result; combining different frame counts in this way keeps the algorithm control simple and efficient and improves the processing effect of the image processing algorithm. Optionally, the image processing algorithm includes any one or more of: automatic white balance, automatic lens shading correction, automatic exposure, color noise removal, luminance noise removal, face detection, multi-frame denoising, high dynamic range, full dynamic range, dynamic range enhancement, or global luminance mapping.
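The "sorting according to the characteristics of the image processing algorithm" might, for instance, mean darkest-to-brightest ordering for an HDR algorithm and key-frame-first ordering for multi-frame noise reduction; these two rules are illustrative assumptions rather than requirements stated in the disclosure:

```python
def order_for(algorithm, candidates, key_index):
    """Reorder the aligned candidates to suit the chosen algorithm.
    The two rules below (HDR: darkest to brightest; MFNR: key frame first)
    are assumptions used only to illustrate sorting by algorithm characteristics."""
    if algorithm == "hdr":
        return sorted(candidates, key=lambda item: item[1]["mean_luma"])
    if algorithm == "mfnr":
        key = candidates[key_index]
        return [key] + [c for i, c in enumerate(candidates) if i != key_index]
    return list(candidates)
```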
Regarding the photographing process of the terminal device, with reference to fig. 4: during normal photographing, camera imaging may suffer from problems such as moiré and background color cast, so optimization by an image algorithm is needed. In this embodiment, before the image processing algorithm is executed, N preview frames and a key frame are selected from the multi-frame preview images captured by the image sensor in the hardware layer (kernel/HW); the key frame may be the frame with the best exposure and white balance among the N preview frames. The abstraction layer (HAL/OEM) then aligns the selected N preview frames in its capture function; that is, exposure and white balance are optimized against the brightness and color of the key frame so that the frames have uniform exposure brightness and uniform color. The abstraction layer (HAL/OEM) then sorts the optimized N candidate frames, the adaptation layer (Adapter/Algos) selects the corresponding image processing algorithm (or the N candidate frames may be reordered according to the image processing algorithm), and the image processing algorithm is executed to generate the target image. On the one hand, the method dynamically adjusts the number of image frames delivered to the image processing algorithm; on the other hand, it preprocesses those frames, thereby optimizing the final camera imaging effect. For example, as shown in fig. 5, if the image processing algorithm is a dynamic range enhancement algorithm, this embodiment first performs the preprocessing above and then executes the dynamic range enhancement algorithm: the preview image seen by the user is shown in fig. 5(a), and the final picture produced when the user taps the shutter is shown in fig. 5(b). By comparison, the image of the flying insect in motion is enhanced.
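Tying S201-S204 together, a possible driver for the capture path (reusing the `select_frames`, `align_to_key_frame` and `order_for` sketches above, with `run_algorithm` standing in for whatever algorithm the adaptation layer selects) might look like the following; this is a sketch of the control flow under those assumptions, not the actual HAL/OEM implementation:

```python
def capture(previews, n, algorithm, run_algorithm):
    """End-to-end sketch of the capture path (S202-S204).
    `run_algorithm` is a placeholder for the algorithm chosen by the adaptation layer."""
    frames, key_index = select_frames(previews, n)                   # S202: N frames + key frame
    key_frame = frames[key_index][0]
    aligned = [(frame, meta) if i == key_index
               else (align_to_key_frame(frame, key_frame), meta)
               for i, (frame, meta) in enumerate(frames)]            # S203: align to key frame
    ordered = order_for(algorithm, aligned, key_index)               # S204: sort for the algorithm
    return run_algorithm([frame for frame, _ in ordered])            # S204: produce target image
```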
In some embodiments of the present application, a photographing apparatus is further disclosed, as shown in fig. 6, which is configured to implement the method described in the above method embodiments. The apparatus includes: an obtaining unit 601, configured to obtain M frames of preview images captured by a camera; a selecting unit 602, configured to select N preview images from the M preview images according to the exposure parameters and image information of the camera and to determine a key frame image from the N preview images, where M is greater than or equal to N and both M and N are positive integers; an alignment unit 603, configured to align the N-1 preview images other than the key frame image with the key frame image to obtain N candidate images; a sorting unit 604, configured to sort the N candidate images; and a processing unit 605, configured to execute an image processing algorithm on the sorted N candidate images to obtain a target image.
For all relevant details of the steps in the above method embodiment, reference may be made to the functional description of the corresponding functional modules, which is not repeated here.
In other embodiments of the present application, a terminal device is disclosed. As shown in fig. 7, the terminal device 700 may include: one or more processors 701; a memory 702; a display 703; one or more application programs (not shown); and one or more computer programs 704, which may be connected via one or more communication buses 705. The one or more computer programs 704 are stored in the memory 702 and configured to be executed by the one or more processors 701, and include instructions that may be used to perform the steps in the embodiments of figs. 2 to 4.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
Each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a computer-readable storage medium. Based on such an understanding, the part of the technical solutions of the embodiments of the present application that essentially contributes to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a flash memory, a removable hard drive, a read-only memory, a random access memory, a magnetic disk, an optical disk, and the like.
The above description is only a specific implementation of the embodiments of the present application, but the scope of the embodiments of the present application is not limited thereto, and any changes or substitutions within the technical scope disclosed in the embodiments of the present application should be covered by the scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A photographing method, applied to a terminal device, characterized by comprising the following steps:
acquiring M frames of preview images captured by a camera of the terminal device;
selecting N frames of preview images from the M frames of preview images according to the exposure parameters and the image information of the camera, and determining a key frame image from the N frames of preview images, wherein M is greater than or equal to N, and both M and N are positive integers;
aligning the N-1 frames of preview images other than the key frame image among the N frames of preview images with the key frame image to obtain N frames of candidate images; and
sorting the N frames of candidate images, and executing an image processing algorithm on the sorted N frames of candidate images to obtain a target image.
2. The method according to claim 1, wherein aligning N-1 preview images of the N preview images other than the key frame image with the key frame image to obtain N candidate images comprises:
for a K-th frame image among the N-1 frames of preview images other than the key frame image in the N frames of preview images, where the K-th frame image is any one of those frames and K is a positive integer, performing the following processing:
extracting feature points of the K-th frame image and of the key frame image;
matching the feature points of the K-th frame image with the feature points of the key frame image;
adjusting the correspondence between the K-th frame image and the key frame image according to the matching result to obtain transformation parameters; and
adjusting the spatial layout of the K-th frame image to be consistent with the spatial layout of the key frame image through the transformation parameters.
3. The method of claim 2, wherein selecting N frames of preview images from the M frames of preview images according to the exposure parameters of the camera and determining a key frame image from the N frames of preview images comprises:
determining a starting frame preview image and an ending frame preview image in the M frames of preview images; and
selecting N frames of preview images from the M frames of preview images in the order of traversal from the ending frame preview image to the starting frame preview image, and determining a key frame image from the N frames of preview images.
4. The method according to any one of claims 1 to 3, wherein sorting the N frames of candidate images comprises:
sorting the N frames of candidate images according to the characteristics of the image processing algorithm.
5. The method according to any one of claims 1 to 3, wherein the exposure parameters include one or more of an exposure time period, a sensitivity, and an exposure compensation value, and the image information includes one or more of luminance information and color information.
6. A photographing apparatus, applied to a terminal device, characterized in that the apparatus comprises:
an obtaining unit, configured to obtain M frames of preview images captured by a camera;
a selecting unit, configured to select N frames of preview images from the M frames of preview images according to the exposure parameters and image information of the camera and to determine a key frame image from the N frames of preview images, wherein M is greater than or equal to N, and both M and N are positive integers;
an alignment unit, configured to align the N-1 frames of preview images other than the key frame image among the N frames of preview images with the key frame image to obtain N frames of candidate images;
a sorting unit, configured to sort the N frames of candidate images; and
a processing unit, configured to execute an image processing algorithm on the sorted N frames of candidate images to obtain a target image.
7. The apparatus according to claim 6, wherein the alignment unit, when aligning the N-1 frames of preview images other than the key frame image among the N frames of preview images with the key frame image to obtain the N frames of candidate images, is specifically configured to perform the following processing for a K-th frame image, where the K-th frame image is any one of the N-1 frames of preview images other than the key frame image:
extracting feature points of the K-th frame image and of the key frame image;
matching the feature points of the K-th frame image with the feature points of the key frame image;
adjusting the correspondence between the K-th frame image and the key frame image according to the matching result to obtain transformation parameters; and
adjusting the spatial layout of the K-th frame image to be consistent with the spatial layout of the key frame image through the transformation parameters.
8. The apparatus according to claim 7, wherein the selecting unit, when selecting N frames of preview images from the M frames of preview images according to the exposure parameters of the camera and determining a key frame image from the N frames of preview images, is specifically configured to:
determine a starting frame preview image and an ending frame preview image in the M frames of preview images; and
select N frames of preview images from the M frames of preview images in the order of traversal from the ending frame preview image to the starting frame preview image, and determine a key frame image from the N frames of preview images.
9. The apparatus according to any one of claims 6 to 8, wherein the sorting unit, when sorting the N frames of candidate images, is specifically configured to:
sort the N frames of candidate images according to the characteristics of the image processing algorithm.
10. The apparatus according to any one of claims 6 to 8, wherein the exposure parameters include one or more of an exposure time period, a sensitivity, and an exposure compensation value, and the image information includes one or more of luminance information and color information.
11. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method of any one of claims 1 to 5.
12. A terminal device comprising a memory and a processor, the memory having stored thereon a computer program operable on the processor, the computer program, when executed by the processor, causing the processor to carry out the method of any one of claims 1 to 5.
CN202210302690.0A 2022-03-25 2022-03-25 Photographing method, device, medium and terminal equipment Pending CN114630050A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210302690.0A CN114630050A (en) 2022-03-25 2022-03-25 Photographing method, device, medium and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210302690.0A CN114630050A (en) 2022-03-25 2022-03-25 Photographing method, device, medium and terminal equipment

Publications (1)

Publication Number Publication Date
CN114630050A true CN114630050A (en) 2022-06-14

Family

ID=81904552

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210302690.0A Pending CN114630050A (en) 2022-03-25 2022-03-25 Photographing method, device, medium and terminal equipment

Country Status (1)

Country Link
CN (1) CN114630050A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107277387A (en) * 2017-07-26 2017-10-20 维沃移动通信有限公司 High dynamic range images image pickup method, terminal and computer-readable recording medium
WO2019071613A1 (en) * 2017-10-13 2019-04-18 华为技术有限公司 Image processing method and device
CN109547700A (en) * 2018-12-27 2019-03-29 维沃移动通信有限公司 Photographic method and terminal
WO2022021999A1 (en) * 2020-07-27 2022-02-03 虹软科技股份有限公司 Image processing method and image processing apparatus


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination