CN114070999B - Image processing method, device and storage medium - Google Patents

Image processing method, device and storage medium

Info

Publication number
CN114070999B
CN114070999B (application CN202010756691.3A)
Authority
CN
China
Prior art keywords
image
planetary
processed
target
preview
Prior art date
Legal status
Active
Application number
CN202010756691.3A
Other languages
Chinese (zh)
Other versions
CN114070999A (en)
Inventor
冯靖鹏
熊悦欣
王倩
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202010756691.3A
Publication of CN114070999A
Application granted
Publication of CN114070999B
Status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N 23/632 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N 23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

The disclosure relates to an image processing method, an image processing device and a storage medium. The image processing method is applied to a terminal and includes: acquiring an image to be processed collected in real time by one or more cameras; stretching the image to be processed to obtain a target image, and generating a preview image with a planetary special effect based on the target image; displaying the preview image; and generating a planetary special effect image based on the preview image in response to a shooting instruction. With this image processing method, the user does not need to obtain a picture of the image to be processed in advance when producing a planetary special effect image; instead, the preview image with the planetary special effect can be viewed directly while the image is being captured, which reduces the processing complexity for the user and improves the user's experience and satisfaction.

Description

Image processing method, device and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method, an image processing device, and a storage medium.
Background
With the continuous development of terminal technology, imaging technology on terminals has also advanced, and users' expectations for imaging have risen accordingly. For example, generating images with a planetary special effect has become a focus of user attention.
Currently, to process an image into an image with a planetary special effect, a picture of the image to be processed has to be obtained in advance and the planetary special effect processing is then applied to that picture, which increases the complexity of image processing for the user.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an image processing method, apparatus, and storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided an image processing method. The image processing method is applied to a terminal, and comprises the following steps: acquiring an image to be processed acquired in real time through one or more cameras; the image to be processed is stretched to obtain a target image, and a preview image with a planetary special effect is generated based on the target image; displaying the preview image; and generating a planetary special effect image based on the preview image in response to a shooting instruction.
In one embodiment, the stretching the image to be processed to obtain a target image includes: and splicing the first target edge and the second target edge of the image to be processed, and filling the spliced part to obtain the target image, wherein the first target edge is opposite to the second target edge.
In another embodiment, the stretching the image to be processed to obtain a target image includes: determining a first object image in the image to be processed; and in the process of stretching the image to be processed, reducing the deformation rate of the first object image to obtain the target image.
In yet another embodiment, the stretching the image to be processed to obtain a target image includes: determining a second object image in the image to be processed; and performing image stretching processing on the second object image to obtain a third object image, and obtaining the target image based on the third object image.
In yet another embodiment, the acquiring the image to be processed acquired in real time by one or more cameras includes: acquiring a plurality of acquired images acquired by a plurality of cameras in real time; based on the plurality of acquired images, identifying an overlapping part of the acquired images, fusing the overlapping part to obtain a fused image, and taking the fused image as the image to be processed.
In yet another embodiment, the planetary effect image includes a planetary effect video, and the generating a planetary effect image based on the preview image in response to a photographing instruction includes: responding to a shooting instruction, and acquiring continuous multi-frame images; processing the continuous multi-frame images according to the processing mode of the preview images to obtain a plurality of intermediate images with planetary special effects, which correspond to the continuous multi-frame images; and connecting a plurality of intermediate images in series to generate the planetary special effect video.
According to a second aspect of embodiments of the present disclosure, there is provided an image processing apparatus applied to a terminal. The image processing apparatus includes: an acquisition module, configured to acquire an image to be processed collected in real time by one or more cameras; a preview image generation module, configured to obtain a target image by stretching the image to be processed and to generate a preview image with a planetary special effect based on the target image; a preview image display module, configured to display the preview image; and a planetary special effect image generation module, configured to generate a planetary special effect image based on the preview image in response to a shooting instruction.
In one embodiment, the preview image generation module is configured to: stitch a first target edge and a second target edge of the image to be processed, and fill the stitched portion to obtain the target image, wherein the first target edge is opposite to the second target edge.
In another embodiment, the preview image generation module includes: an object image determination sub-module, configured to determine a first object image in the image to be processed; and a target image acquisition sub-module, configured to reduce the deformation rate of the first object image in the process of stretching the image to be processed, so as to obtain the target image.
In yet another embodiment, the preview image generation module includes: an object image determination sub-module, configured to determine a second object image in the image to be processed; and a target image acquisition sub-module, configured to perform image stretching processing on the second object image to obtain a third object image, and to obtain the target image based on the third object image.
In yet another embodiment, the acquisition module includes: a captured image acquisition sub-module, configured to acquire a plurality of captured images collected in real time by a plurality of cameras; and a fused image acquisition sub-module, configured to identify overlapping portions of the captured images based on the plurality of captured images, fuse the overlapping portions to obtain a fused image, and take the fused image as the image to be processed.
In yet another embodiment, the planetary special effect image includes a planetary special effect video, and the planetary special effect image generation module includes: an instruction response sub-module, configured to acquire continuous multi-frame images in response to a shooting instruction; an image processing sub-module, configured to process the continuous multi-frame images in the same manner as the preview image, to obtain a plurality of intermediate images with the planetary special effect corresponding to the continuous multi-frame images; and a planetary special effect video generation sub-module, configured to connect the plurality of intermediate images in series to generate the planetary special effect video.
According to a third aspect of embodiments of the present disclosure, there is provided an image processing apparatus. The image processing apparatus includes a processor and a memory for storing processor-executable instructions, wherein the processor is configured to perform the image processing method of the first aspect of the present disclosure or of any implementation of the first aspect.
According to a fourth aspect of embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium storing instructions that, when executed by a processor of a mobile terminal, enable the mobile terminal to perform the image processing method of the first aspect of the present disclosure or of any implementation of the first aspect.
The technical solution provided by the embodiments of the present disclosure may include the following beneficial effects: a preview image with a planetary special effect is generated from the acquired image to be processed and can be displayed on a display screen; when the preview image with the planetary special effect meets the user's requirements, the planetary special effect image can be generated based on the preview image in response to the user's shooting instruction. In this way, the user does not need to obtain a picture of the image to be processed in advance when producing the planetary special effect image; instead, the preview image with the planetary special effect can be viewed directly while the image is being captured, which reduces the processing complexity for the user and improves the user's experience and satisfaction.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flowchart illustrating an image processing method according to an exemplary embodiment;
FIG. 2 shows a block diagram of a terminal;
FIG. 3 is a flowchart illustrating the acquisition of a target image in an image processing method according to an exemplary embodiment;
FIG. 4 is a flowchart illustrating the acquisition of a target image in another image processing method according to an exemplary embodiment;
FIG. 5 shows a schematic diagram of an image to be processed;
FIG. 6 shows a schematic diagram of a preview image with a planetary special effect corresponding to the image to be processed;
FIG. 7 is a flowchart illustrating the generation of a planetary special effect image based on a preview image in an image processing method according to an exemplary embodiment;
FIG. 8 is a block diagram of an image processing apparatus according to an exemplary embodiment;
FIG. 9 is a block diagram of an apparatus for image processing according to an exemplary embodiment;
FIG. 10 is a block diagram of another apparatus for image processing according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
In the drawings, the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The described embodiments are some, but not all, embodiments of the present disclosure. The embodiments described below by referring to the drawings are exemplary and intended for the purpose of explaining the present disclosure and are not to be construed as limiting the present disclosure. Based on the embodiments in this disclosure, all other embodiments that a person of ordinary skill in the art would obtain without making any inventive effort are within the scope of protection of this disclosure. Embodiments of the present disclosure are described in detail below with reference to the attached drawings.
In the description of the present embodiment, it should be understood that the terms "center", "longitudinal", "lateral", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, are merely for convenience in describing the present embodiment and simplifying the description, and do not indicate or imply that the device or element in question must have a specific orientation, be configured and operated in a specific orientation, and therefore should not be construed as limiting the scope of protection of the present embodiment. It should be noted that: the relative arrangement of the components, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
With the continuous development of terminal technology, the imaging technology on the terminal is also continuously developed, and the requirements of users on imaging are correspondingly higher. For example, generating an image with planetary effects becomes a hotspot for the user's attention.
An image with a planetary special effect is an image that, after special-effect processing, presents a bird's-eye, miniature-planet appearance. Currently, to process an image into an image with a planetary special effect, a picture of the image to be processed has to be obtained in advance and the planetary special effect processing is then applied to that picture, which increases the complexity of image processing for the user.
With the image processing method provided by the present disclosure, the user does not need to obtain a picture of the image to be processed in advance when producing a planetary special effect image; instead, the preview image with the planetary special effect can be viewed directly during image acquisition, which reduces the processing complexity for the user and improves the user's experience and satisfaction.
The image processing method provided by the embodiments of the present disclosure can be applied to a terminal. The terminal may be a mobile terminal with image acquisition and processing functions, such as a smartphone or a tablet, and the processed image can be displayed on the mobile terminal. The terminal may also be a terminal that has no image acquisition function but has an image processing function, with the processed image displayed on the terminal. The present disclosure does not specifically limit the terminal.
Fig. 1 is a flowchart illustrating an image processing method according to an exemplary embodiment, and as shown in fig. 1, the image processing method is applied to a terminal, which may include a display screen for displaying a preview image and a planetary special effect image generated based on the preview image. The image processing method includes steps S11 to S14, each of which will be described below.
In step S11, an image to be processed is acquired in real time by one or more cameras.
The camera may be located on the terminal, in which case the terminal directly obtains the image to be processed collected by the camera. The camera may also be a camera independent of the terminal; in that case the camera is communicatively connected to the terminal, the image to be processed collected by the camera in real time is transmitted to the terminal, and the terminal processes the image.
In an example, the camera and the terminal may be located on the same local area network (LAN) and transmit images over that network. In another example, the camera may also transmit images to the terminal via Bluetooth or ultrasonic transmission.
It should be noted that the image to be processed may be a panoramic image or a non-panoramic image. At this point the image to be processed is not yet the finally generated captured image; it can be understood as the image that the user previews in the camera viewfinder while framing the shot.
In step S12, a target image is obtained by stretching the image to be processed, and a preview image with a planetary special effect is generated based on the target image.
Based on the image to be processed collected by the camera in real time, the image to be processed is stretched to obtain a target image. The target image is then processed algorithmically so that a preview image with the planetary special effect corresponding to the image to be processed can be generated in real time. For example, a coordinate transformation may be applied to the image to be processed to achieve the stretching; the stretching may be used to map the image to be processed into a curved image.
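As an illustration of the coordinate transformation mentioned above, the following is a minimal sketch of a polar remapping that turns a (quasi-)panoramic image into a round "planet" picture. It assumes Python with OpenCV and NumPy, which the patent does not specify; the function name, output size and mapping details are illustrative only, not the patent's prescribed algorithm.

```python
import cv2
import numpy as np

def tiny_planet(panorama: np.ndarray, out_size: int = 800) -> np.ndarray:
    """Map a (quasi-)panoramic image into a round 'planet' image via a
    polar coordinate transform: the bottom row of the input becomes the
    centre of the planet, the top row becomes the outer rim."""
    h, w = panorama.shape[:2]
    cx = cy = out_size / 2.0
    r_max = out_size / 2.0

    # Inverse mapping: for every output pixel, find the source pixel.
    ys, xs = np.indices((out_size, out_size), dtype=np.float32)
    dx, dy = xs - cx, ys - cy
    r = np.sqrt(dx * dx + dy * dy)
    theta = np.arctan2(dy, dx)                              # [-pi, pi]

    map_x = (theta / (2 * np.pi) + 0.5) * (w - 1)           # angle  -> column
    map_y = (1.0 - np.clip(r / r_max, 0.0, 1.0)) * (h - 1)  # radius -> row

    return cv2.remap(panorama, map_x.astype(np.float32),
                     map_y.astype(np.float32), cv2.INTER_LINEAR)

# Example (stands in for one live camera frame in steps S11/S12):
# frame = cv2.imread("frame_to_be_processed.jpg")
# preview = tiny_planet(frame)      # preview image with the planetary effect
```

In the preview flow of step S12, such a transform would be applied to each incoming camera frame, and the result shown as the preview image of step S13.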
In step S13, a preview image is displayed.
The terminal can display the preview image with the planetary special effect, generated in real time for the image to be processed, on the terminal; further, the preview image with the planetary special effect can be displayed on the terminal's display screen, so that the user can preview it and judge whether it meets the user's requirements.
In step S14, in response to the photographing instruction, a planetary special effect image is generated based on the preview image.
When the user determines that the preview image with the planetary special effect corresponding to the image to be processed meets the user's requirements, the planetary special effect image can be generated from that preview image by issuing a shooting instruction.
In one example, the shooting instruction issued by the user may be implemented by tapping a shooting-confirmation button on the terminal. The present disclosure does not limit the specific form of the shooting instruction.
According to the image processing method, the preview image with the planetary special effect is generated through the collected image to be processed, and the preview image can be displayed on a display screen; when the preview image with the planetary special effect meets the user's requirements, the planetary special effect image can be generated based on the preview image by responding to the shooting instruction of the user. By the method, a user does not need to obtain an image photo to be processed in advance in the process of obtaining the planetary special effect image, but can directly preview the preview image with the planetary special effect in the process of image acquisition, so that the processing complexity of the user in the process of obtaining the planetary special effect image is reduced, and the experience and satisfaction of the user in the use process are improved.
The following embodiments explain the process of obtaining the target image by stitching the image to be processed.
In an exemplary embodiment of the present disclosure, a first target edge and a second target edge of the image to be processed may be stitched together, and the stitched portion may be filled, so as to obtain the target image.
In the application process, an image to be processed can be acquired based on a camera. The camera can be a camera positioned on the terminal, and the terminal can obtain an image to be processed directly through the camera. The camera can also be a camera independent of the terminal, and the acquired image to be processed is transmitted to the terminal in a communication transmission mode.
The camera may be an ordinary camera or a wide-angle camera. Since the camera does not have a 360° field of view, a single image to be processed collected in real time by the camera cannot by itself provide a panorama of the scene.
In the application process, the first target edge and the second target edge of the image to be processed can be spliced to obtain an image in a panoramic-like form, namely a target image.
It should be noted that the first target edge may be opposite to the second target edge. Wherein the first target edge is at least one of a first edge and a second edge of the image to be processed, the first edge intersecting the second edge.
In one example, the image to be processed may be a square image. Where the left edge of the square image may be the first edge, then the upper or lower edge of the square image may be the second edge. Wherein the first target edge of the image to be processed may be the left edge (first edge) or the upper edge (second edge) of the image to be processed. If the first target edge is the left edge of the image to be processed, the second target edge is the right edge of the image to be processed; if the first target edge is the upper side edge of the image to be processed, the second target edge is the lower side edge of the image to be processed.
In this embodiment, the first target edge and the second target edge of the image to be processed are spliced to obtain a panoramic-like image, which can lay a foundation for obtaining a preview image with a planetary special effect.
Further, a preview image with a planetary effect may be generated based on the target image.
In an example, based on the obtained panoramic-like form of the image, i.e., the target image, a preview image with planetary special effects corresponding to the target image may be generated in real time through an algorithm. The obtained preview image with the planetary special effect can provide more interesting image preview experience for users.
Further, when the user determines that the generated preview image with the planetary special effect meets the requirement of the user, the preview image with the planetary special effect corresponding to the image to be processed can be made to generate the planetary special effect image through shooting instructions. And further, the processing complexity of the user in the process of obtaining the planetary special effect image is reduced, and the experience and satisfaction of the user in the use process are improved.
It should be noted that, when the first target edge and the second target edge of the image to be processed are stitched to obtain the target image, the resulting seam may look unnatural. Therefore, the stitched portion can also be filled, so that the seam of the target image is coherent, natural and smooth.
In an example, the portion where the first target edge and the second target edge of the image to be processed are stitched can be filled using a deep-learning-based image restoration (inpainting) technique, which reduces the discrepancy at the seam and makes the stitched portion of the target image more natural and smooth, laying a foundation for the user to obtain a better preview image with the planetary special effect.
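A minimal sketch of this edge stitching and seam filling follows. The patent mentions deep-learning-based image restoration; as a stand-in, the sketch uses OpenCV's classical inpainting, so the choice of algorithm, the band width and the function name are assumptions.

```python
import cv2
import numpy as np

def stitch_and_fill_edges(image: np.ndarray, band: int = 8) -> np.ndarray:
    """Join the left and right edges (first/second target edges) of an
    8-bit image and fill the joint so the seam looks continuous."""
    h, w = image.shape[:2]

    # Roll the image by half its width so the original left/right edges
    # meet in the middle of the frame, where the joint can be repaired.
    rolled = np.roll(image, w // 2, axis=1)

    # Mark a narrow vertical band around the joint as the region to fill.
    mask = np.zeros((h, w), dtype=np.uint8)
    mask[:, w // 2 - band: w // 2 + band] = 255

    # Fill the joint; classical inpainting stands in here for the
    # deep-learning image restoration mentioned above.
    repaired = cv2.inpaint(rolled, mask, 3, cv2.INPAINT_TELEA)

    # Roll back so the image keeps its original orientation.
    return np.roll(repaired, -(w // 2), axis=1)
```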
In the application process, the image to be processed can include an image in a still picture such as a picture or a photo, and also can include an image in a certain video frame in a dynamic picture such as a video. Accordingly, the preview image corresponding to the image to be processed may include a preview image with a planetary effect corresponding to an image in a still picture such as a picture or a photo, and may also include a preview image with a planetary effect corresponding to an image in a certain video frame in a dynamic picture such as a video.
When the image to be processed is an image in a still picture such as a picture or a photograph, the preview image corresponding to the image to be processed is a preview image with a planetary special effect corresponding to the image in the still picture such as the picture or the photograph. The planetary effect image generated based on the preview image may be an image of a still picture, such as a photograph. In the application process, the preview image can be directly determined to be a photo with a planetary special effect.
In the application process, the image to be processed may be a plurality of acquired images acquired in real time by a plurality of cameras. Based on the acquired images acquired by the cameras in real time, a fusion image with panoramic effect can be obtained. Further, a target image is obtained based on the fusion image, and a preview image with a planetary special effect is obtained based on the target image.
The present disclosure will explain a process of obtaining a target image based on a plurality of captured images captured by a plurality of cameras by the following embodiments.
In one embodiment, based on a plurality of acquired images acquired by a plurality of cameras, overlapping portions of the acquired images are identified, the overlapping portions are fused to obtain a fused image, and the fused image is used as an image to be processed. The obtained fusion image is an image with panoramic effect.
In one example, multiple cameras may acquire acquired images in real time. The plurality of cameras may collect collected images of different angles simultaneously, for example, the plurality of cameras may include camera a, camera B, camera C, and camera D. The camera A can acquire an eastern acquired image P1 in real time; the camera B can acquire a southward acquisition image P2 in real time; the camera C can acquire a western acquisition image P3 in real time; the camera D can collect the north-oriented collected image P4 in real time.
Based on the collected images simultaneously collected by the camera a, the camera B, the camera C, and the camera D, overlapping portions of the collected images are identified, for example, overlapping portions between the collected image P1 and the collected image P2, overlapping portions between the collected image P2 and the collected image P3, overlapping portions between the collected image P3 and the collected image P4, and overlapping portions between the collected image P4 and the collected image P1 can be identified, respectively. Further, the overlapped parts are fused to obtain a fused image.
It should be noted that the fused image is an image with a panoramic effect. The fused image lays a foundation for the user to obtain a better preview image with the planetary special effect.
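A minimal sketch of fusing the overlapping portions of several simultaneously captured images into one panoramic image to be processed. OpenCV's high-level stitcher is used here as one possible fusion method; the patent does not name a specific algorithm, and the file names in the usage comment are hypothetical.

```python
import cv2

def fuse_captures(captures):
    """Fuse several simultaneously captured, partially overlapping images
    (e.g. P1..P4 from cameras A..D) into one panoramic image to be
    processed, using OpenCV's high-level stitcher."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, fused = stitcher.stitch(captures)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return fused

# Hypothetical usage with four frames facing east, south, west and north:
# p1, p2, p3, p4 = (cv2.imread(name) for name in
#                   ("east.jpg", "south.jpg", "west.jpg", "north.jpg"))
# to_be_processed = fuse_captures([p1, p2, p3, p4])
```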
In the application process, the fused image can be used as an image to be processed. Further, the first target edge and the second target edge of the image to be processed are spliced, the spliced part is filled to obtain a target image, and a preview image with a planetary special effect is generated based on the target image.
Since the process of obtaining the target image based on the image to be processed and obtaining the preview image with the planetary special effect based on the target image and the advantageous effects thereof have been described in detail above, the description thereof will not be repeated.
Further, when the user determines that the generated preview image with the planetary special effect meets the requirement of the user, the preview image with the planetary special effect corresponding to the image to be processed can be made to generate the planetary special effect image through shooting instructions. And further, the processing complexity of the user in the process of obtaining the planetary special effect image is reduced, and the experience and satisfaction of the user in the use process are improved.
In one possible embodiment, the image to be processed may also be acquired by a camera on the terminal. The number of cameras can be one or more. The terminal processes the image acquired based on the camera, generates and displays a preview image with the planetary special effect on the terminal or a display screen of the terminal, so that a user does not need to obtain an image picture to be processed in advance in the process of acquiring the planetary special effect image, but directly previews the preview image with the planetary special effect in the process of acquiring the image, and further, the experience and satisfaction of the user in the use process are improved.
An embodiment of a terminal applying the image processing method according to the embodiments of the present disclosure to collect an image to be processed through a camera will be described below.
Fig. 2 shows a block diagram of a terminal.
In an exemplary embodiment of the present disclosure, as shown in fig. 2, the terminal 100 includes a front camera 102 disposed at a front side of the terminal 100 and a rear camera 101 disposed at a rear side of the terminal 100.
In the application process, the terminal 100 can collect the image to be processed in real time through the front camera 102 and the rear camera 101 at the same time.
In one embodiment, the front camera 102 and the rear camera 101 may have the same configuration parameters, such as focal length size and aperture size. Wherein, the angles of view of the front camera 102 and the rear camera 101 may be both greater than or equal to 180 °.
Since the front camera 102 and the rear camera 101 are disposed at the front side and the rear side of the terminal 100 respectively, and both have fields of view greater than or equal to 180°, a panoramic image of the scene can be obtained from the two images collected simultaneously by the front camera 102 and the rear camera 101.
Further, the overlapping portions of the two images can be identified and stitched to obtain the target image; based on the target image, a preview image with the planetary special effect is generated for the panorama of the image to be processed.
With the maturity of folding screen technology, the terminal also can be provided with the folding screen to can be provided with a plurality of cameras on the folding screen. Because the folding screen can be folded to different directions, a plurality of cameras arranged on the folding screen can be respectively oriented to different directions. In one extreme case, four cameras may be provided on the folding screen, and by the folding manner of the folding screen, the four cameras face east, south, west and north, respectively.
To ensure that the generated preview image with the planetary special effect has a good appearance, the target image can also be obtained in the following manner, and the preview image with the planetary special effect is then generated based on the target image.
Fig. 3 is a flowchart illustrating a method of processing an image to acquire a target image according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, as shown in fig. 3, obtaining a target image by performing a stretching process on an image to be processed may include step S21 and step S22. The steps will be described separately.
In step S21, a first object image in the image to be processed is determined.
The first subject image may be understood as a main photographed subject, such as a photographed person.
In step S22, the deformation ratio of the first object image is reduced in the course of stretching the image to be processed to obtain a target image.
Since the preview image with the planetary special effect often causes the object image on the original image to deform to a certain extent, in order to ensure the display effect of the first object image on the preview image with the planetary special effect, the deformation rate of the first object image can be reduced in the process of converting the image to be processed into the target image.
In one embodiment, the position of the region in which the first object image is located in the image to be processed is determined (for convenience of description this region may be referred to as the first region); in the process of stretching the image to be processed, deformation of the first region is reduced and the stretching deformation is applied to the region other than the first region, so that the stretching of the image to be processed is achieved and the target image is obtained.
Further, a preview image with a planetary effect may be generated based on the target image.
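A sketch, under stated assumptions, of one way to reduce the deformation rate of the first object image while the rest of the image is stretched: warp the full image, then paste the undeformed subject back where its centre landed. The subject mask is assumed to come from some segmentation step (not specified by the patent), and `stretch_fn` stands for the stretching transform (for instance the tiny-planet remap sketched earlier); none of this is the patent's prescribed implementation.

```python
import cv2
import numpy as np

def stretch_keep_subject(image, subject_mask, stretch_fn):
    """Stretch `image` with `stretch_fn` while keeping the first object
    image (given by the binary `subject_mask`) essentially undeformed."""
    warped = stretch_fn(image)

    ys, xs = np.nonzero(subject_mask)
    if xs.size == 0:                 # no subject found: nothing to protect
        return warped
    x0, x1, y0, y1 = xs.min(), xs.max(), ys.min(), ys.max()
    subject = image[y0:y1 + 1, x0:x1 + 1]
    patch_mask = subject_mask[y0:y1 + 1, x0:x1 + 1] > 0

    # Track the subject centre through the warp by warping a marker image.
    marker = np.zeros(image.shape[:2], dtype=np.uint8)
    cv2.circle(marker, ((x0 + x1) // 2, (y0 + y1) // 2), 5, 255, -1)
    warped_marker = stretch_fn(cv2.merge([marker, marker, marker]))[:, :, 0]
    ty, tx = np.unravel_index(int(np.argmax(warped_marker)), warped_marker.shape)

    # Paste the undeformed subject centred on that point (clipped to bounds).
    ph, pw = subject.shape[:2]
    top = int(np.clip(ty - ph // 2, 0, max(warped.shape[0] - ph, 0)))
    left = int(np.clip(tx - pw // 2, 0, max(warped.shape[1] - pw, 0)))
    roi = warped[top:top + ph, left:left + pw]
    h_fit, w_fit = roi.shape[:2]
    m = patch_mask[:h_fit, :w_fit]
    roi[m] = subject[:h_fit, :w_fit][m]
    return warped
```

This illustrates only one possible strategy for the "reduced deformation of the first region" idea; the patent leaves the concrete protection scheme open.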
In the process of obtaining the target image, it can also be achieved in the following manner.
Fig. 4 is a flowchart illustrating an acquisition target image in another image processing method according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, as shown in fig. 4, obtaining the target image by stretching the image to be processed may include step S31 and step S32. The steps will be described separately.
In step S31, a second object image in the image to be processed is determined.
The second subject image can be understood as other subject images than the main subject. Wherein the main subject may be the first subject image described above. For example, if the first object image is a person, the second object image is another object image other than the person, for example, a ground, a tree, a house, or the like.
In step S32, the second object image is subjected to stretching processing to obtain a third object image, and a target image is obtained based on the third object image.
Since the second object image is not the main photographed subject in the image to be processed, in the process of stretching the image to be processed, and in order to preserve the appearance of the main photographed subject, the stretching of the image to be processed may be achieved by stretching the area where the second object image is located. The second object image is stretched to obtain a third object image, and the target image is obtained based on the third object image.
In this way, the display effect of the main photographed image (which may be the first object image described above in one example) on the preview image with the planetary special effect can be effectively ensured.
For convenience of explanation of the present embodiment, the following examples will be described.
FIG. 5 shows a schematic representation of an image to be processed; fig. 6 shows a schematic diagram of a preview image with planetary special effects with respect to the image to be processed.
In an example, an image to be processed is acquired, as shown in fig. 5, and includes a first object image (person 2) and a second object image (ground 1, house 3, and tree 4). As a possible embodiment, the image to be processed may be obtained by identifying the overlapping portions of the plurality of captured images described above and fusing those overlapping portions; since this acquisition process has been described in detail above, it is not repeated here. To ensure that the first object image (person 2) is displayed well in the preview image with the planetary special effect, the deformation rate of the first object image (person 2) may be reduced in the process of converting the image to be processed into the target image, while the stretching of the image to be processed is achieved by stretching the second object image (ground 1, house 3, and tree 4) to obtain the third object image, thereby obtaining the target image. For example, in some optional embodiments, after the first object image is determined, the first object image may be resized according to a first preset ratio, for example reduced or enlarged according to the first preset ratio; likewise, after the second object image is determined, the second object image may be resized according to a second preset ratio, for example reduced or enlarged according to the second preset ratio. The resizing of the first object image and the second object image may be applied alone or in combination with the various embodiments of the present disclosure.
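A minimal sketch of the preset-ratio resizing mentioned in this example. The bounding boxes, the ratios and the function name are hypothetical; which object is enlarged or reduced, and by how much, is left open by the patent.

```python
import cv2

def rescale_object(image, box, ratio):
    """Resize the object image inside `box` = (x, y, w, h) by `ratio` and
    paste it back centred on the same spot.  When shrinking, the part of
    the original object not covered by the smaller patch is left as-is;
    a fuller implementation could inpaint it."""
    x, y, w, h = box
    patch = image[y:y + h, x:x + w]
    new_w, new_h = max(1, int(w * ratio)), max(1, int(h * ratio))
    resized = cv2.resize(patch, (new_w, new_h), interpolation=cv2.INTER_LINEAR)

    out = image.copy()
    cx, cy = x + w // 2, y + h // 2
    nx, ny = cx - new_w // 2, cy - new_h // 2
    x0, y0 = max(nx, 0), max(ny, 0)
    x1, y1 = min(nx + new_w, out.shape[1]), min(ny + new_h, out.shape[0])
    out[y0:y1, x0:x1] = resized[y0 - ny:y1 - ny, x0 - nx:x1 - nx]
    return out

# e.g. enlarge the person (first object image) by a first preset ratio and
# reduce the tree (second object image) by a second preset ratio:
# img = rescale_object(img, person_box, 1.2)
# img = rescale_object(img, tree_box, 0.8)
```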
Further, based on the target image, a preview image with a planetary special effect is obtained as shown in fig. 6.
In the process of stretching the image to be processed, it can also be achieved in the following manner.
An alignment process is performed on the stretched portion of the image to be processed, that is, on the joint where the first target edge and the second target edge of the image to be processed are stitched, so that the head and tail of the third object image in the image to be processed are aligned along the horizontal direction at the stitching joint.
Still taking the first object image as the person 2 and the second object image as including the ground 1 as an example: during the stretching process, the stretched ground 1 (i.e., the third object image) needs to be joined end to end along the horizontal direction. If there is a significant height difference between the two ends of the stretched ground 1, the appearance of the resulting stitched image will suffer. Therefore, the stretched portion of the image to be processed can be aligned so that the head and tail of the third object image meet level along the horizontal direction at the stitching joint, further ensuring the quality of the preview image with the planetary special effect obtained based on the target image.
In an exemplary embodiment of the present disclosure, the image processing method provided by the present disclosure further includes receiving a touch instruction from the user and, in response to the touch instruction, performing on the preview image the operation or presentation corresponding to that touch instruction.
In an example, to provide the user with preview images at multiple viewing angles, the image processing method provided by the present disclosure may further adjust the presentation angle of the preview image based on the touch instruction. This embodiment is described in detail below.
When a rightward-sliding touch instruction from the user is detected and received, the terminal responds by rotating the preview image with the planetary special effect displayed on the terminal or its display screen to the right, and displays the preview image with the planetary special effect at the rotated viewing angle.
Through this embodiment, the preview image with the planetary special effect can be presented to the user from multiple viewing angles, improving the user's experience and satisfaction during use.
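A sketch of how the sliding rotation could be realised: because the planet preview is a polar remapping of a panorama, a cyclic horizontal shift of the panorama before the remap rotates the displayed planet. The pixels-per-swipe factor is an assumption, and `tiny_planet()` refers to the earlier sketch.

```python
import numpy as np

def rotated_planet_preview(panorama: np.ndarray, touch_dx: int,
                           px_per_swipe_px: int = 4) -> np.ndarray:
    """Rotate the planet preview in response to a horizontal swipe by
    cyclically shifting the panorama before the polar remap."""
    shifted = np.roll(panorama, touch_dx * px_per_swipe_px, axis=1)
    return tiny_planet(shifted)   # tiny_planet() from the earlier sketch
```

A zoom gesture could be handled in a similar spirit, as discussed below.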
In an example, to provide a display function of zooming in or zooming out a preview image for a user, the image processing method provided in the present disclosure may further adjust a display size of the preview image based on the touch instruction. This embodiment will be described in detail below.
When a touch instruction from the user to enlarge the preview image is detected and received, the terminal responds by enlarging the preview image with the planetary special effect and displaying it on the terminal or its display screen, thereby improving the user's experience and satisfaction during use.
In the present disclosure, the touch instruction and the function corresponding to the touch instruction are not specifically limited, and may be determined according to actual situations.
As can be seen from the above description, the image to be processed may also be an image in a video frame of a dynamic picture such as a video. Correspondingly, the preview image corresponding to the image to be processed may also be a preview image with the planetary special effect corresponding to an image in a video frame of such a video. Accordingly, a planetary special effect video corresponding to a video can also be obtained through the image processing method of the present disclosure, where the video is composed of continuous video frames; the "image in a certain video frame" mentioned above is understood to be a frame of that video. The process of obtaining a planetary special effect video corresponding to a video is described below.
The following embodiments describe, for the case in which the planetary special effect image includes a planetary special effect video, the process of generating the planetary special effect image based on the preview image in response to a shooting instruction.
Fig. 7 is a flowchart illustrating generation of a planetary special effect image based on a preview image in an image processing method according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, generating a planetary special effect image based on the preview image includes steps S41 to S43, as shown in fig. 7. The steps will be described separately.
In step S41, in response to a photographing instruction, continuous multi-frame images are acquired.
The continuous multi-frame image may be a multi-frame image acquired by a camera. Wherein successive multi-frame images may form a video.
In step S42, the continuous multi-frame image is processed in accordance with the processing method of the preview image, and a plurality of intermediate images having a special planetary effect corresponding to the continuous multi-frame image are obtained.
An intermediate image with the planetary special effect may be understood as the preview image, with the planetary special effect, corresponding to the picture in one video frame of a video composed of continuous video frames.
In step S43, a plurality of intermediate images are connected in series to generate a planetary special effect video.
In one embodiment, when the user determines that the intermediate image with the planetary special effect meets the requirement of the user, a plurality of intermediate images can be connected in series to generate the planetary special effect video. By the method, the video with the planetary special effect can be generated, and experience and satisfaction of a user in the using process are improved.
In the application process, the intermediate images with the planetary special effects are connected according to the time sequence, and then the planetary special effect video can be obtained.
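A minimal sketch of steps S41–S43 for the video case: each captured frame is processed in the same way as the preview image and the intermediate images are written out in time order. It assumes Python with OpenCV; `tiny_planet()` is the transform sketched earlier, and the file names, codec and frame rate are illustrative assumptions.

```python
import cv2

def planetary_video(input_path: str, output_path: str, fps: float = 30.0,
                    out_size: int = 800) -> None:
    """Apply the preview-image processing to every frame of a captured
    clip and concatenate the intermediate images into a planetary
    special effect video."""
    cap = cv2.VideoCapture(input_path)
    fourcc = cv2.VideoWriter_fourcc(*"mp4v")
    writer = cv2.VideoWriter(output_path, fourcc, fps, (out_size, out_size))

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Intermediate image with the planetary effect for this frame.
        planet = tiny_planet(frame, out_size)
        writer.write(planet)

    cap.release()
    writer.release()

# planetary_video("captured_frames.mp4", "planet_effect.mp4")
```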
As can be seen from the above description, according to the image processing method provided by the present disclosure, a preview image with a planetary special effect can be generated by an acquired image to be processed, and the preview image can be displayed on a display screen; when the preview image with the planetary special effect meets the user's requirements, the planetary special effect image can be generated based on the preview image by responding to the shooting instruction of the user. By the method, a user does not need to obtain an image photo to be processed in advance in the process of obtaining the planetary special effect image, but can directly preview the preview image with the planetary special effect in the process of image acquisition, so that the processing complexity of the user in the process of obtaining the planetary special effect image is reduced, and the experience and satisfaction of the user in the use process are improved.
Based on the same conception, the embodiment of the disclosure also provides an image processing device.
It will be appreciated that, in order to implement the above-described functions, the image processing apparatus provided in the embodiments of the present disclosure includes corresponding hardware structures and/or software modules that perform the respective functions. The disclosed embodiments may be implemented in hardware or a combination of hardware and computer software, in combination with the various example elements and algorithm steps disclosed in the embodiments of the disclosure. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application, but such implementation is not to be considered as beyond the scope of the embodiments of the present disclosure.
Fig. 8 is a block diagram of an image processing apparatus according to an exemplary embodiment. The image processing apparatus 200 is applied to a terminal, which may include a display screen. Referring to fig. 8, the image processing apparatus 200 includes an acquisition module 121, a preview image generation module 122, a preview image display module 123, and a planetary special effect image generation module 124. The respective modules will be described below.
The acquiring module 121 is configured to acquire images to be processed acquired in real time through one or more cameras.
The preview image generating module 122 is configured to obtain a target image by stretching an image to be processed, and generate a preview image with a planetary special effect based on the target image.
The preview image display module 123 is configured to display the preview image.
The planetary special effect image generation module 124 is configured to generate a planetary special effect image based on the preview image in response to a shooting instruction.
In an exemplary embodiment of the present disclosure, the preview image generation module 122 is configured to: stitch a first target edge and a second target edge of the image to be processed, and fill the stitched portion to obtain the target image, wherein the first target edge is opposite to the second target edge.
In an exemplary embodiment of the present disclosure, the preview image generation module 122 includes an object image determination sub-module and a target image acquisition sub-module. The sub-modules are described below.
The object image determination sub-module is configured to determine a first object image in the image to be processed.
The target image acquisition sub-module is configured to reduce the deformation rate of the first object image in the process of stretching the image to be processed, so as to obtain the target image.
In an exemplary embodiment of the present disclosure, the acquisition module 121 includes a captured image acquisition sub-module and a fused image acquisition sub-module. The sub-modules are described below.
The captured image acquisition sub-module is configured to acquire a plurality of captured images collected in real time by a plurality of cameras.
The fused image acquisition sub-module is configured to identify overlapping portions of the captured images based on the plurality of captured images, fuse the overlapping portions to obtain a fused image, and take the fused image as the image to be processed.
In an exemplary embodiment of the present disclosure, the preview image generation module 122 includes an object image determination sub-module and a target image acquisition sub-module, each of which is described below.
The object image determination sub-module is configured to determine a second object image in the image to be processed.
The target image acquisition sub-module is configured to perform image stretching processing on the second object image to obtain a third object image, and to obtain the target image based on the third object image.
In an exemplary embodiment of the present disclosure, the planetary special effect image includes a planetary special effect video, and the planetary special effect image generation module 124 includes an instruction response sub-module, an image processing sub-module, and a planetary special effect video generation sub-module. The sub-modules are described below.
The instruction response sub-module is configured to acquire continuous multi-frame images in response to a shooting instruction.
The image processing sub-module is configured to process the continuous multi-frame images in the same manner as the preview image, to obtain a plurality of intermediate images with the planetary special effect corresponding to the continuous multi-frame images.
The planetary special effect video generation sub-module is configured to connect the plurality of intermediate images in series to generate the planetary special effect video.
The specific manner in which the various modules perform the operations in the apparatus of the above embodiments have been described in detail in connection with the embodiments of the method, and will not be described in detail herein.
Fig. 9 is a block diagram illustrating an apparatus 800 for image processing according to an exemplary embodiment. For example, the apparatus 800 for image processing may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, or the like.
Referring to fig. 9, an apparatus 800 for image processing may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operations of the apparatus 800 for image processing, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interactions between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800 for image processing. Examples of such data include instructions for any application or method operating on the apparatus 800 for image processing, contact data, phonebook data, messages, pictures, videos, and the like. The memory 804 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power component 806 provides power to the various components of the device 800 for image processing. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the apparatus 800 for image processing.
The multimedia component 808 includes a screen that provides an output interface between the apparatus 800 for image processing and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from a user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or slide action, but also the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front camera and/or a rear camera. When the apparatus 800 for image processing is in an operation mode, such as a photographing mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capabilities.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 800 for image processing is in an operation mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 further includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 814 includes one or more sensors for providing status assessments of various aspects of the apparatus 800 for image processing. For example, the sensor assembly 814 may detect an on/off state of the apparatus 800 for image processing and the relative positioning of components, such as the display and keypad of the apparatus 800 for image processing. The sensor assembly 814 may also detect a change in position of the apparatus 800 for image processing or of one of its components, the presence or absence of user contact with the apparatus 800 for image processing, the orientation or acceleration/deceleration of the apparatus 800 for image processing, and a change in the temperature of the apparatus 800 for image processing. The sensor assembly 814 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the apparatus 800 for image processing and other devices. The apparatus 800 for image processing may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 816 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra-Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 for image processing may be implemented by one or more Application-Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for performing the above method.
In an exemplary embodiment, a non-transitory computer-readable storage medium is also provided, such as the memory 804 including instructions executable by the processor 820 of the apparatus 800 for image processing to perform the above-described method. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Fig. 10 is a block diagram illustrating another apparatus for image processing according to an exemplary embodiment. For example, the apparatus 1100 for image processing may be provided as a server. Referring to fig. 10, the apparatus 1100 for image processing includes a processing component 1122, which further includes one or more processors, and memory resources represented by a memory 1132 for storing instructions executable by the processing component 1122, such as application programs. The application programs stored in the memory 1132 may include one or more modules, each corresponding to a set of instructions. Further, the processing component 1122 is configured to execute the instructions to perform the image processing methods described above.
The apparatus 1100 for image processing may further include a power supply component 1126 configured to perform power management of the apparatus 1100 for image processing, a wired or wireless network interface 1150 configured to connect the apparatus 1100 for image processing to a network, and an input/output (I/O) interface 1158. The apparatus 1100 for image processing may operate based on an operating system stored in the memory 1132, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
It is understood that the term "plurality" in this disclosure means two or more, and other quantifying terms are to be read similarly. "And/or" describes an association between associated objects and indicates that three relationships are possible; for example, "A and/or B" may mean that A exists alone, that A and B exist together, or that B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. The singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It is further understood that the terms "first," "second," and the like are used to describe various information, but such information should not be limited to these terms. These terms are only used to distinguish one type of information from another and do not denote a particular order or importance. Indeed, the expressions "first", "second", etc. may be used entirely interchangeably. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure.
It will be further understood that "connected" includes both direct connection where no other member is present and indirect connection where other element is present, unless specifically stated otherwise.
It will be further understood that although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (12)

1. An image processing method, applied to a terminal, comprising:
acquiring an image to be processed captured in real time by one or more cameras;
performing stretching processing on the image to be processed to obtain a target image, and generating a preview image with a planetary special effect based on the target image;
displaying the preview image;
generating a planetary special effect image based on the preview image in response to a shooting instruction;
wherein the performing stretching processing on the image to be processed to obtain the target image comprises:
splicing a first target edge and a second target edge of the image to be processed, and filling the spliced portion to obtain the target image, wherein the first target edge is opposite to the second target edge.
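By way of illustration only, and not as part of the claims: the edge splicing and planetary warp recited in claim 1 might look roughly like the following sketch, assuming an OpenCV/NumPy environment and a BGR panorama-like input. The blend width, the output size, and the choice of mapping the bottom of the frame to the planet centre are assumptions of this sketch, not requirements of the patent.

import cv2
import numpy as np

def little_planet(pano, out_size=1080):
    # Sketch only: wrap a BGR panorama-like image so its left and right
    # edges meet, then remap it into polar coordinates ("tiny planet").
    w = pano.shape[1]

    # 1. Splice the two opposite edges: cross-blend a narrow strip so the
    #    seam disappears when the image is wrapped into a circle.
    blend = max(8, w // 50)
    left = pano[:, :blend].astype(np.float32)
    right = pano[:, -blend:].astype(np.float32)
    alpha = np.linspace(0.0, 1.0, blend, dtype=np.float32)[None, :, None]
    wrapped = pano.copy()
    wrapped[:, :blend] = ((1 - alpha) * right + alpha * left).astype(pano.dtype)
    wrapped = wrapped[:, : w - blend]  # drop the now-duplicated right strip

    # 2. Put radius on the x axis and angle on the y axis, with the bottom
    #    of the panorama (the "ground") ending up at the planet centre.
    polar_src = cv2.rotate(wrapped, cv2.ROTATE_90_COUNTERCLOCKWISE)
    polar_src = cv2.flip(polar_src, 1)

    # 3. Inverse polar warp: polar image -> square "planet" image.
    center = (out_size / 2.0, out_size / 2.0)
    return cv2.warpPolar(
        polar_src, (out_size, out_size), center, out_size / 2.0,
        cv2.WARP_POLAR_LINEAR | cv2.WARP_INVERSE_MAP | cv2.INTER_LINEAR)

A preview with the planetary special effect can then be produced by running such a transform on each preview frame before display.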
2. The image processing method according to claim 1, wherein obtaining the target image by stretching the image to be processed comprises:
determining a first object image in the image to be processed;
and in the process of stretching the image to be processed, reducing the deformation rate of the first object image to obtain the target image.
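By way of illustration only, and not as part of the claims: claim 2's idea of reducing the deformation rate of a detected object during stretching can be sketched as a non-uniform horizontal remap in which the background columns absorb most of the added width. The bounding box, the protect parameter, and the use of cv2.remap are assumptions of this sketch, not details given in the patent.

import cv2
import numpy as np

def stretch_with_protected_region(img, x0, x1, target_width, protect=0.2):
    # Sketch only: stretch a BGR image horizontally to target_width while
    # letting the columns in [x0, x1) -- the detected object -- deform less
    # than the background. protect in [0, 1]: 0 keeps the object at its
    # original scale, 1 stretches it like everything else.
    h, w = img.shape[:2]
    n_obj = x1 - x0
    n_bg = w - n_obj
    # Per-column output widths: background columns absorb most of the extra
    # width, object columns only a fraction `protect` of it.
    s_bg = (target_width - n_obj * (1 - protect)) / (n_bg + n_obj * protect)
    s_obj = 1 + protect * (s_bg - 1)
    widths = np.full(w, s_bg, dtype=np.float64)
    widths[x0:x1] = s_obj
    # Map each source column to its output position, then invert that
    # monotone mapping to get, for every output pixel, a source x.
    cum = np.cumsum(widths)
    cum = cum / cum[-1] * (target_width - 1)
    out_x = np.arange(target_width, dtype=np.float64)
    src_x = np.interp(out_x, cum, np.arange(w)).astype(np.float32)
    map_x = np.tile(src_x, (h, 1))
    map_y = np.repeat(np.arange(h, dtype=np.float32)[:, None], target_width, axis=1)
    return cv2.remap(img, map_x, map_y, interpolation=cv2.INTER_LINEAR)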
3. The image processing method according to claim 1, wherein obtaining the target image by stretching the image to be processed comprises:
determining a second object image in the image to be processed;
and performing image stretching processing on the second object image to obtain a third object image, and obtaining the target image based on the third object image.
4. The image processing method according to claim 1, wherein acquiring the image to be processed captured in real time by the one or more cameras comprises:
acquiring a plurality of images captured in real time by a plurality of cameras;
identifying, based on the plurality of captured images, an overlapping part of the images, fusing the overlapping part to obtain a fused image, and using the fused image as the image to be processed.
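By way of illustration only, and not as part of the claims: one way to obtain the fused image of claim 4 is to hand the simultaneously captured frames to an off-the-shelf stitcher that detects the overlapping part and blends it. The patent does not prescribe OpenCV or this particular API; the sketch below simply shows the shape of the step.

import cv2

def fuse_camera_frames(frames):
    # Sketch only: frames is a list of BGR images from several cameras with
    # overlapping fields of view; the stitcher finds the overlap and blends it.
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, fused = stitcher.stitch(frames)
    if status != cv2.Stitcher_OK:
        raise RuntimeError("stitching failed with status %d" % status)
    return fused  # used as the image to be processed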
5. The image processing method according to claim 1, wherein the planetary special effect image comprises a planetary special effect video, and generating the planetary special effect image based on the preview image in response to the shooting instruction comprises:
acquiring continuous multi-frame images in response to the shooting instruction;
processing the continuous multi-frame images in the same manner as the preview image to obtain a plurality of intermediate images with the planetary special effect corresponding to the continuous multi-frame images;
and connecting the plurality of intermediate images in series to generate the planetary special effect video.
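By way of illustration only, and not as part of the claims: the planetary special effect video of claim 5 amounts to applying the same per-frame transform used for the preview to a burst of consecutive frames and writing the results out in order. little_planet() refers to the sketch after claim 1; the codec, frame rate, and file name below are illustrative choices.

import cv2

def planetary_video(frames, path="planet.mp4", fps=30, out_size=1080):
    # Sketch only: process consecutive frames like the preview image and
    # concatenate the intermediate planetary images into a video.
    writer = cv2.VideoWriter(path, cv2.VideoWriter_fourcc(*"mp4v"),
                             fps, (out_size, out_size))
    for frame in frames:
        writer.write(little_planet(frame, out_size))
    writer.release()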
6. An image processing apparatus, applied to a terminal, comprising:
an acquisition module, configured to acquire an image to be processed captured in real time by one or more cameras;
a preview image generation module, configured to perform stretching processing on the image to be processed to obtain a target image and to generate a preview image with a planetary special effect based on the target image;
a preview image display module, configured to display the preview image; and
a planetary special effect image generation module, configured to generate a planetary special effect image based on the preview image in response to a shooting instruction;
wherein the preview image generation module is further configured to splice a first target edge and a second target edge of the image to be processed and to fill the spliced portion to obtain the target image, the first target edge being opposite to the second target edge.
7. The image processing apparatus of claim 6, wherein the preview image generation module comprises:
an object image determination sub-module, configured to determine a first object image in the image to be processed; and
a target image obtaining sub-module, configured to reduce the deformation rate of the first object image in the process of stretching the image to be processed, to obtain the target image.
8. The image processing apparatus of claim 6, wherein the preview image generation module comprises:
an object image determination sub-module, configured to determine a second object image in the image to be processed; and
a target image obtaining sub-module, configured to perform image stretching processing on the second object image to obtain a third object image, and to obtain the target image based on the third object image.
9. The image processing apparatus of claim 6, wherein the acquisition module comprises:
an image acquisition sub-module, configured to acquire a plurality of images captured in real time by a plurality of cameras; and
a fused image obtaining sub-module, configured to identify, based on the plurality of captured images, an overlapping part of the images, fuse the overlapping part to obtain a fused image, and use the fused image as the image to be processed.
10. The image processing apparatus of claim 6, wherein the planetary special effect image comprises a planetary special effect video, and the planetary special effect image generation module comprises:
an instruction response sub-module, configured to acquire continuous multi-frame images in response to the shooting instruction;
an image processing sub-module, configured to process the continuous multi-frame images in the same manner as the preview image, to obtain a plurality of intermediate images with the planetary special effect corresponding to the continuous multi-frame images; and
a planetary special effect video generation sub-module, configured to connect the plurality of intermediate images in series to generate the planetary special effect video.
11. An image processing apparatus, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the image processing method according to any one of claims 1 to 5.
12. A non-transitory computer-readable storage medium having stored thereon instructions that, when executed by a processor of a mobile terminal, cause the mobile terminal to perform the image processing method according to any one of claims 1 to 5.
CN202010756691.3A 2020-07-31 2020-07-31 Image processing method, device and storage medium Active CN114070999B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010756691.3A CN114070999B (en) 2020-07-31 2020-07-31 Image processing method, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010756691.3A CN114070999B (en) 2020-07-31 2020-07-31 Image processing method, device and storage medium

Publications (2)

Publication Number Publication Date
CN114070999A CN114070999A (en) 2022-02-18
CN114070999B true CN114070999B (en) 2023-12-12

Family

ID=80227522

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010756691.3A Active CN114070999B (en) 2020-07-31 2020-07-31 Image processing method, device and storage medium

Country Status (1)

Country Link
CN (1) CN114070999B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1705363A (en) * 2004-05-31 2005-12-07 佳能株式会社 Apparatus and method for image processing
CN102780852A (en) * 2012-05-31 2012-11-14 新奥特(北京)视频技术有限公司 Method for realizing spherical transition special effects
CN103020900A (en) * 2012-11-15 2013-04-03 北京小米科技有限责任公司 Method and device for image processing

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008001515A1 (en) * 2006-06-28 2008-01-03 Panasonic Corporation Special effect imaging device and special effect imaging method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1705363A (en) * 2004-05-31 2005-12-07 佳能株式会社 Apparatus and method for image processing
CN102780852A (en) * 2012-05-31 2012-11-14 新奥特(北京)视频技术有限公司 Method for realizing spherical transition special effects
CN103020900A (en) * 2012-11-15 2013-04-03 北京小米科技有限责任公司 Method and device for image processing

Also Published As

Publication number Publication date
CN114070999A (en) 2022-02-18

Similar Documents

Publication Publication Date Title
KR102194094B1 (en) Synthesis method, apparatus, program and recording medium of virtual and real objects
US11368632B2 (en) Method and apparatus for processing video, and storage medium
CN111510645B (en) Video processing method and device, computer readable medium and electronic equipment
CN108495032B (en) Image processing method, image processing device, storage medium and electronic equipment
CN107426502B (en) Shooting method and device, electronic equipment and storage medium
CN108632543B (en) Image display method, image display device, storage medium and electronic equipment
CN106791483B (en) Image transmission method and device and electronic equipment
CN114009003A (en) Image acquisition method, device, equipment and storage medium
CN112312042A (en) Display control method, display control device, electronic equipment and storage medium
US11265529B2 (en) Method and apparatus for controlling image display
CN114070999B (en) Image processing method, device and storage medium
KR102557592B1 (en) Method and apparatus for displaying an image, electronic device and computer-readable storage medium
CN111461950A (en) Image processing method and device
CN112653841B (en) Shooting method and device and electronic equipment
EP3905660A1 (en) Method and device for shooting image, and storage medium
CN108596971B (en) Image display method and device
CN113315903A (en) Image acquisition method and device, electronic equipment and storage medium
CN113766115B (en) Image acquisition method, mobile terminal, device and storage medium
CN110876013B (en) Method and device for determining image resolution, electronic equipment and storage medium
CN118055334A (en) Photographing method, photographing device and storage medium
CN118018866A (en) Photographing method, photographing device and storage medium
CN114943791A (en) Animation playing method, device, equipment and storage medium
CN117480772A (en) Video display method and device, terminal equipment and computer storage medium
CN118018854A (en) Method, device and storage medium for generating high dynamic range image
CN118052752A (en) Image processing method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant