CN114070999A - Image processing method, device and storage medium - Google Patents

Image processing method, device and storage medium

Info

Publication number
CN114070999A
CN114070999A (application CN202010756691.3A)
Authority
CN
China
Prior art keywords
image
processed
target
special effect
preview
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010756691.3A
Other languages
Chinese (zh)
Other versions
CN114070999B (en)
Inventor
冯靖鹏
熊悦欣
王倩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202010756691.3A
Publication of CN114070999A
Application granted
Publication of CN114070999B
Legal status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80: Camera processing pipelines; Components thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631: Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632: Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • H04N23/90: Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H04N23/95: Computational photography systems, e.g. light-field imaging systems
    • H04N23/951: Computational photography systems, e.g. light-field imaging systems, by using two or more images to influence resolution, frame rate or aspect ratio
    • H04N5/00: Details of television systems
    • H04N5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

The present disclosure relates to an image processing method, apparatus, and storage medium. The image processing method is applied to a terminal and includes: acquiring, in real time through one or more cameras, an image to be processed; stretching the image to be processed to obtain a target image, and generating a preview image with a planet special effect based on the target image; displaying the preview image; and generating a planet special effect image based on the preview image in response to a shooting instruction. With this method, the user does not need to obtain a photo in advance in order to produce a planet special effect image; instead, a preview with the planet special effect can be viewed directly during image capture. This reduces the processing complexity of obtaining a planet special effect image and improves the user's experience and satisfaction.

Description

Image processing method, device and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, and a storage medium.
Background
With the continuous development of terminal technology, imaging technology on terminals has also advanced, and users' expectations for imaging have risen accordingly. For example, generating an image with a planet special effect has become a popular demand among users.
At present, to process an image into one with a planet special effect, a photo to be processed must be obtained in advance and then subjected to planet special effect processing, which increases the complexity of image processing for the user.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an image processing method, apparatus, and storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided an image processing method. The image processing method is applied to a terminal, and comprises the following steps: acquiring images to be processed acquired in real time through one or more cameras; stretching the image to be processed to obtain a target image, and generating a preview image with a planet special effect based on the target image; displaying the preview image; and generating a planet special effect image based on the preview image in response to a shooting instruction.
In one embodiment, the obtaining the target image by stretching the image to be processed includes: and splicing a first target edge and a second target edge of the image to be processed, and filling the spliced part to obtain the target image, wherein the first target edge is opposite to the second target edge.
In another embodiment, the obtaining the target image by stretching the image to be processed includes: determining a first object image in the image to be processed; and in the process of stretching the image to be processed, reducing the deformation rate of the first object image to obtain the target image.
In another embodiment, the obtaining the target image by stretching the image to be processed includes: determining a second object image in the image to be processed; and performing image stretching processing on the second object image to obtain a third object image, and obtaining the target image based on the third object image.
In another embodiment, the acquiring the to-be-processed image acquired by one or more cameras in real time includes: acquiring a plurality of acquired images acquired by a plurality of cameras in real time; and identifying the overlapped part of the collected images based on the plurality of collected images, fusing the overlapped part to obtain a fused image, and taking the fused image as the image to be processed.
In yet another embodiment, the planetary special effect image includes a planetary special effect video, and the generating the planetary special effect image based on the preview image in response to the capture instruction includes: responding to a shooting instruction, and acquiring continuous multi-frame images; processing the continuous multi-frame images according to the processing mode of the preview images to obtain a plurality of intermediate images with planetary special effects corresponding to the continuous multi-frame images; and connecting a plurality of intermediate images in series to generate a planet special effect video.
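A hedged sketch of the video branch described above: each captured frame is processed with the same per-frame pipeline as the preview image, and the resulting intermediate planet-effect images are connected in series. The function name and the representation of the "video" as a stacked array are illustrative assumptions; an actual implementation would encode the frames into a video container.

```python
import numpy as np

def make_planet_video(frames, process_frame):
    """Apply the same per-frame 'planet' processing used for the preview
    to each captured frame, then connect the intermediate images in
    series (represented here as one stacked array of frames)."""
    processed = [process_frame(f) for f in frames]   # intermediate planet-effect images
    return np.stack(processed, axis=0)               # frame-serial "video"
```

For example, three consecutive 2x2 frames processed with a dummy pipeline yield a 3-frame video tensor of shape (3, 2, 2).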
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing apparatus applied to a terminal. Wherein the image processing apparatus includes: the acquisition module is used for acquiring images to be processed acquired in real time through one or more cameras; the preview image generating module is used for stretching the image to be processed to obtain a target image and generating a preview image with a planet special effect based on the target image; the preview image display module is used for displaying the preview image; and the planet special effect image generation module is used for responding to a shooting instruction and generating a planet special effect image based on the preview image.
In one embodiment, the generate preview image module is configured to: and splicing a first target edge and a second target edge of the image to be processed, and filling the spliced part to obtain the target image, wherein the first target edge is opposite to the second target edge.
In another embodiment, the generating a preview image module includes: a determining object image submodule for determining a first object image in the image to be processed; and the target image obtaining submodule is used for reducing the deformation rate of the first target image in the process of stretching the image to be processed to obtain the target image.
In yet another embodiment, the generating a preview image module comprises: a determining object image submodule for determining a second object image in the image to be processed; and the target image obtaining sub-module is used for carrying out image stretching processing on the second object image to obtain a third object image and obtaining the target image based on the third object image.
In yet another embodiment, the obtaining module includes: the acquisition image submodule is used for acquiring a plurality of acquisition images acquired by a plurality of cameras in real time; and the fused image sub-module is used for identifying the overlapped part of the collected images based on the collected images, fusing the overlapped part to obtain a fused image, and taking the fused image as the image to be processed.
In yet another embodiment, the planet effect image comprises a planet effect video, and the generate planet effect image module comprises: the response instruction submodule is used for responding to the shooting instruction and acquiring continuous multi-frame images; the processing image submodule is used for processing the continuous multi-frame images according to the processing mode of the preview images to obtain a plurality of intermediate images with the planetary special effect corresponding to the continuous multi-frame images; and generating a planet special effect video submodule for connecting the plurality of intermediate images in series to generate a planet special effect video.
According to a third aspect of the embodiments of the present disclosure, there is provided an image processing apparatus. The image processing apparatus includes a processor; a memory for storing processor-executable instructions; wherein the processor is configured to: the image processing method in the first aspect or any one implementation manner of the first aspect of the present disclosure is performed.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium, wherein instructions of the storage medium, when executed by a processor of a mobile terminal, enable the mobile terminal to perform the image processing method of the first aspect of the present disclosure or any one of the implementation manners of the first aspect.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects: the method and the device generate a preview image with a planet special effect through the acquired image to be processed, and can display the preview image on a display screen; when the preview image with the planetary special effect meets the requirements of the user, the planetary special effect image can be generated based on the preview image by responding to the shooting instruction of the user. By the method, a user does not need to obtain an image photo to be processed in advance in the process of obtaining the planet special effect image, but can directly preview the preview image with the planet special effect in the process of image acquisition, so that the processing complexity of the user in the process of obtaining the planet special effect image is reduced, and the experience and satisfaction of the user in the using process are increased.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a flow diagram illustrating an image processing method according to an exemplary embodiment;
fig. 2 shows a block diagram of a terminal;
FIG. 3 is a flow diagram illustrating acquisition of a target image in an image processing method according to an exemplary embodiment;
FIG. 4 is a flow diagram illustrating acquisition of a target image in another image processing method according to an exemplary embodiment;
FIG. 5 shows a schematic view of an image to be processed;
FIG. 6 is a schematic diagram of a preview image with a planetary effect for an image to be processed;
FIG. 7 is a flow diagram illustrating the generation of a planetary special effect image based on a preview image in an image processing method according to an exemplary embodiment;
FIG. 8 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment;
FIG. 9 illustrates a block diagram of an apparatus for image processing, according to an exemplary embodiment;
FIG. 10 illustrates a block diagram of another apparatus for image processing, according to an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
In the drawings, the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The described embodiments are only a subset of the embodiments of the present disclosure, and not all embodiments. The embodiments described below with reference to the drawings are exemplary and intended to be illustrative of the present disclosure, and should not be construed as limiting the present disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments disclosed herein without making any creative effort, shall fall within the protection scope of the present disclosure. Embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
In the description of the present embodiment, it is to be understood that the terms "center", "longitudinal", "lateral", "front", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc. indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience of description and simplicity of description, and do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed in a particular orientation, and be operated, and therefore, should not be construed as limiting the scope of the present embodiment. It should be noted that: the relative arrangement of the components, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
With the continuous development of terminal technology, imaging technology on terminals has also advanced, and users' expectations for imaging have risen accordingly. For example, generating an image with a planet special effect has become a popular demand among users.
An image with the planet special effect is an image that has been processed to resemble a bird's-eye view of a miniature planet. At present, to process an image into one with a planet special effect, a photo to be processed must be obtained in advance and then subjected to planet special effect processing, which increases the complexity of image processing for the user.
The present disclosure provides an image processing method with which a user can preview an image with the planet special effect directly during image capture, without having to obtain a photo to be processed in advance. This reduces the processing complexity of obtaining a planet special effect image and improves the user's experience and satisfaction.
The image processing method provided by the embodiment of the disclosure can be applied to a terminal, and the terminal can be a mobile terminal such as a smart phone, a tablet and the like with an image acquisition function and a processing function, and can display a processed image on the mobile terminal. The terminal may also be a terminal that does not have an image capturing function but has an image processing function, and may display a processed image on the terminal. In the present disclosure, the terminal is not particularly limited.
Fig. 1 is a flowchart illustrating an image processing method according to an exemplary embodiment. As shown in fig. 1, the image processing method is applied to a terminal, which may include a display screen for displaying the preview image and the planet special effect image generated based on the preview image. The image processing method includes steps S11 to S14, described separately below.
In step S11, an image to be processed, captured in real time by one or more cameras, is acquired.
The camera can be positioned at the terminal, and the terminal directly acquires the image to be processed acquired by the camera; the camera can also be a camera independent of the terminal, wherein the camera is in communication connection with the terminal, and can transmit the image to be processed acquired by the camera in real time to the terminal, and the image processing is performed by the terminal.
In one example, the camera and the terminal may be located in the same lan, and image transmission may be performed through the same lan. In another example, the camera may also perform image transmission with the terminal through bluetooth or ultrasonic transmission.
It should be noted that the image to be processed may be a panoramic image or a non-panoramic image. The image to be processed is a captured image that has not yet been committed as the final generated image; it can be understood as the image the user previews in the camera's viewfinder during capture.
In step S12, a target image is obtained by performing stretching processing on the image to be processed, and a preview image having a special effect of planets is generated based on the target image.
Based on the image to be processed captured by the camera in real time, stretching processing is performed to obtain the target image. The target image is then processed by an algorithm to generate, in real time, a preview image with the planet special effect corresponding to the image to be processed. For example, a coordinate transformation may be applied to the image to be processed to realize the stretching; the stretching may be used to bend the image to be processed into a curved image.
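The disclosure leaves the concrete coordinate transformation open. As an illustrative sketch only (not the patent's disclosed implementation), a common way to produce such a "planet" look is an inverse polar mapping that wraps a panorama onto a disc, with the bottom row of the panorama at the disc's centre. The function name `tiny_planet` and the nearest-neighbour sampling below are assumptions.

```python
import numpy as np

def tiny_planet(pano: np.ndarray, out_size: int = 256) -> np.ndarray:
    """Map a panorama (H x W [x C]) onto a disc via an inverse polar
    transform, approximating the 'planet' effect: radius selects the
    panorama row, angle selects the panorama column."""
    h, w = pano.shape[:2]
    c = (out_size - 1) / 2.0                        # disc centre
    ys, xs = np.mgrid[0:out_size, 0:out_size]
    dx, dy = xs - c, ys - c
    r = np.sqrt(dx * dx + dy * dy)                  # radius from centre
    theta = np.arctan2(dy, dx)                      # angle around centre
    # bottom of the panorama (the "ground") maps to the disc centre
    rows = np.clip((1.0 - r / c) * (h - 1), 0, h - 1).astype(int)
    cols = (((theta + np.pi) / (2 * np.pi)) * (w - 1)).astype(int)
    out = pano[rows, cols]
    out[r > c] = 0                                  # outside the disc: black
    return out
```

A vertical gradient panorama, for instance, becomes a set of concentric rings, brightest (bottom row) at the centre.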
In step S13, a preview image is displayed.
The terminal can display the preview image with the planet special effect corresponding to the image to be processed, which is generated in real time, on the terminal, and further can display the preview image with the planet special effect on a display screen of the terminal, so that a user can preview and judge whether the generated preview image with the planet special effect corresponding to the image to be processed meets the requirements of the user.
In step S14, in response to the photographing instruction, a planetary special effect image is generated based on the preview image.
When the user determines that the preview image with the planet special effect corresponding to the image to be processed meets his or her requirements, a planet special effect image can be generated from that preview image via a shooting instruction.
In one example, the shooting instruction may be issued by the user tapping a shooting button on the terminal. The present disclosure does not limit the specific form in which the shooting instruction is issued.
According to the image processing method provided by the disclosure, a preview image with a planet special effect is generated through an acquired image to be processed, and the preview image can be displayed on a display screen; when the preview image with the planetary special effect meets the requirements of the user, the planetary special effect image can be generated based on the preview image by responding to the shooting instruction of the user. By the method, a user does not need to obtain an image photo to be processed in advance in the process of obtaining the planet special effect image, but can directly preview the preview image with the planet special effect in the process of image acquisition, so that the processing complexity of the user in the process of obtaining the planet special effect image is reduced, and the experience and satisfaction of the user in the using process are increased.
The present disclosure will explain a process of obtaining a stitched image by stitching images to be processed by the following embodiments.
In an exemplary embodiment of the present disclosure, a first target edge and a second target edge of an image to be processed may be subjected to stitching processing, and a stitched portion is subjected to padding processing, so as to obtain a stitched image.
In the application process, a to-be-processed image can be acquired based on a camera. The camera can be a camera positioned on the terminal, and the terminal can directly obtain an image to be processed through the camera. The camera can also be a camera independent of the terminal, and transmits the acquired image to be processed to the terminal in a communication transmission mode.
The camera may be an ordinary camera or a wide-angle camera. Since a camera does not have a 360° field of view, a single image to be processed captured in real time cannot by itself provide a panorama of the scene.
In the application process, the first target edge and the second target edge of the image to be processed can be spliced to obtain an image in a panorama-like form, namely a target image.
It should be noted that the first target edge may be opposite to the second target edge. The first target edge is at least one of a first edge and a second edge of the image to be processed, and the first edge intersects with the second edge.
In one example, the image to be processed may be a square image, whose left edge serves as the first edge and whose upper or lower edge serves as the second edge. The first target edge of the image to be processed may then be the left edge (a first edge) or the upper edge (a second edge). If the first target edge is the left edge of the image to be processed, the second target edge is the right edge; if the first target edge is the upper edge, the second target edge is the lower edge.
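As a hedged sketch of this splicing step (assuming the left edge is the first target edge and the right edge the second), the two opposite edges can be joined by appending a narrow strip from the left edge onto the right side, so that the image wraps around like a 360° panorama. The function name `splice_edges` and its `overlap` parameter are illustrative, not from the patent.

```python
import numpy as np

def splice_edges(img: np.ndarray, overlap: int = 4) -> np.ndarray:
    """Join the left (first target) edge onto the right (second target)
    edge by appending a copy of the leftmost columns, producing a
    wrap-around, panorama-like image."""
    return np.concatenate([img, img[:, :overlap]], axis=1)
```

The appended strip duplicates the left edge exactly, which is why the seam then needs the filling step described below in the text.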
In this embodiment, the first target edge and the second target edge of the image to be processed are spliced to obtain an image in a panoramic-like form, and a basis can be laid for obtaining a preview image with a planetary special effect.
Further, a preview image having a planetary special effect may be generated based on the target image.
In one example, based on the obtained image in the panoramic-like form, namely the target image, a preview image with a planetary special effect corresponding to the target image can be generated in real time through an algorithm. The obtained preview image with the planet special effect can provide more interesting image preview experience for users.
Further, when the user determines that the generated preview image with the planet special effect meets his or her requirements, a planet special effect image can be generated from that preview image via a shooting instruction. This reduces the processing complexity of obtaining a planet special effect image and improves the user's experience and satisfaction.
It should be noted that when the target image is obtained by stitching the first and second target edges of the image to be processed, the result often shows an unnatural join at the stitched position. The stitched portion can therefore be filled in, so that the seam of the target image is continuous, natural, and smooth.
In an example, a part spliced by a first target edge and a second target edge of an image to be processed can be filled up based on an image restoration technology of deep learning, so that the difference of the spliced part is reduced, the spliced part of the target image is more natural and smooth, and a foundation is laid for a user to obtain a preview image with a planetary special effect and a better effect.
In the application process, the image to be processed may include an image in a still picture such as a picture or a photograph, and may also include an image in a certain video frame in a moving picture such as a video. Accordingly, the preview image corresponding to the image to be processed may include a preview image with a planetary special effect corresponding to an image in a static picture such as a picture or a photograph, and may further include a preview image with a planetary special effect corresponding to an image in a certain video frame in a dynamic picture such as a video.
When the image to be processed is an image in a still picture, such as a picture or photograph, the corresponding preview image is a preview image with the planet special effect for that still picture. The planet special effect image generated from the preview image may then itself be a still picture, for example a photograph; in practice, the preview image can be directly confirmed as a photo with the planet special effect.
In the application process, the image to be processed can be a plurality of collected images collected by a plurality of cameras in real time. Based on the collected images collected by the plurality of cameras in real time, a fusion image with a panoramic effect can be obtained. Further, a target image is obtained based on the fused image, and a preview image with a planetary special effect is obtained based on the target image.
The present disclosure will explain a process of obtaining a target image based on a plurality of captured images captured by a plurality of cameras by the following embodiments.
In one embodiment, the overlapping portion of the acquired images may be identified based on a plurality of acquired images acquired by a plurality of cameras, the overlapping portion may be fused to obtain a fused image, and the fused image may be used as the image to be processed. The obtained fusion image is an image with a panoramic effect.
In one example, multiple cameras may acquire the acquired images in real time. The plurality of cameras can simultaneously acquire the acquired images at different angles, and for example, the plurality of cameras can comprise a camera A, a camera B, a camera C and a camera D. The camera A can acquire an east acquired image P1 in real time; the camera B can acquire a southward acquired image P2 in real time; the camera C can acquire a western acquisition image P3 in real time; the camera D can acquire the north-oriented acquired image P4 in real time.
Based on the captured images simultaneously captured by the cameras a, B, C, and D, the overlapping portions of the captured images are identified, and for example, the overlapping portion between the captured image P1 and the captured image P2, the overlapping portion between the captured image P2 and the captured image P3, the overlapping portion between the captured image P3 and the captured image P4, and the overlapping portion between the captured image P4 and the captured image P1 may be identified, respectively. Further, the overlapped portions are fused to obtain a fused image.
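A minimal sketch of this fusion step, under the simplifying assumption that adjacent captures (P1 to P4) are already aligned and overlap by a known number of columns; a real system would first register the images, e.g. by feature matching. All names and the fixed-overlap model are illustrative.

```python
import numpy as np

def fuse_pair(a: np.ndarray, b: np.ndarray, overlap: int) -> np.ndarray:
    """Fuse two adjacent captures whose trailing/leading `overlap`
    columns show the same scene, by averaging the shared region."""
    shared = (a[:, -overlap:].astype(float) + b[:, :overlap].astype(float)) / 2.0
    return np.concatenate([a[:, :-overlap].astype(float), shared,
                           b[:, overlap:].astype(float)], axis=1)

def fuse_ring(captures, overlap):
    """Chain-fuse captures (e.g. P1..P4 from cameras A..D) into one
    panorama-like fused image, used here as the image to be processed."""
    pano = captures[0].astype(float)
    for nxt in captures[1:]:
        pano = fuse_pair(pano, nxt, overlap)
    return pano
```

Averaging is the simplest possible fusion; weighted (feathered) blending across the overlap would give a smoother transition.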
The fused image is an image having a panoramic effect. The obtained fusion image can lay a foundation for a user to obtain a preview image with a better effect and a planetary special effect.
In the application process, the fused image can be used as an image to be processed. Further, the first target edge and the second target edge of the image to be processed are spliced, the spliced part is subjected to filling processing to obtain a target image, and a preview image with a planet special effect is generated based on the target image.
Since the process of obtaining the target image based on the image to be processed and obtaining the preview image with the planet special effect based on the target image and the beneficial effects thereof have been described in detail above, no further description is given here.
Further, when the user determines that the generated preview image with the planet special effect meets his or her requirements, a planet special effect image can be generated from that preview image via a shooting instruction. This reduces the processing complexity of obtaining a planet special effect image and improves the user's experience and satisfaction.
In a possible embodiment, the image to be processed may also be captured by one or more cameras on the terminal. The terminal processes the captured image and generates and displays the preview image with the planetary special effect on its display screen, so that the user can preview the planetary special effect directly during image capture, without needing to obtain a photo to be processed in advance. This further improves the user's experience and satisfaction.
An embodiment in which a terminal applying the image processing method of the present disclosure acquires the image to be processed through a camera is described below.
Fig. 2 shows a block diagram of a terminal.
In an exemplary embodiment of the present disclosure, as shown in fig. 2, the terminal 100 includes a front camera 102 disposed at a front side of the terminal 100 and a rear camera 101 disposed at a rear side of the terminal 100.
During the application process, the terminal 100 can simultaneously acquire the images to be processed through the front camera 102 and the rear camera 101 in real time.
In one embodiment, the front camera 102 and the rear camera 101 may have the same configuration parameters, such as focal length size and aperture size. Wherein the field angles of the front camera 102 and the rear camera 101 may each be greater than or equal to 180 °.
Since the front camera 102 and the rear camera 101 are respectively disposed on the front side and the rear side of the terminal 100, and the field angles of both are greater than or equal to 180 °, when the front camera 102 and the rear camera 101 simultaneously capture images, a panoramic image of the image to be processed can be obtained based on the two images captured by the front camera 102 and the rear camera 101.
Further, the overlapping parts of the two images can be identified and stitched to obtain a target image; based on the target image, a preview image with the planetary special effect is generated for the panorama to be processed.
As folding-screen technology matures, the terminal may also be provided with a folding screen carrying multiple cameras. Because the folding screen can be folded in different directions, the cameras arranged on it can face different directions. In an extreme case, the folding screen may carry four cameras that, through the way the screen is folded, face east, south, west, and north respectively.
To ensure that the generated preview image with the planetary special effect looks good, the target image can also be obtained in the following manner, and the preview image with the planetary special effect is then generated based on the target image.
FIG. 3 is a flow diagram illustrating acquisition of a target image in an image processing method according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, as shown in fig. 3, obtaining the target image by performing the stretching process on the image to be processed may include steps S21 and S22. The steps will be described separately below.
In step S21, a first object image in the image to be processed is determined.
The first object image may be understood as a main subject to be photographed, for example, a person to be photographed.
In step S22, the deformation rate of the first object image is reduced during the stretching of the image to be processed, to obtain the target image.
Because generating a preview image with the planetary special effect usually deforms the object images in the original image to some degree, the deformation rate of the first object image can be reduced while converting the image to be processed into the target image, so as to preserve the display effect of the first object image in the preview.
In an embodiment, the region occupied by the first object image in the image to be processed (referred to as the first region for convenience) may be identified. During stretching, deformation of the first region is reduced, and the stretching of the image to be processed is instead achieved by stretching the regions outside the first region, yielding the target image.
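One plausible way to realize this region-protecting stretch, sketched under assumed details (horizontal stretching only, a single protected column band, nearest-neighbor remap): the column-index map stays 1:1 inside the protected band, and the regions outside the band absorb all of the added width.

```python
import numpy as np

def stretch_protecting_region(img: np.ndarray, new_w: int, keep: tuple) -> np.ndarray:
    """Horizontally stretch img (H x W) to new_w columns while the protected
    column band keep=(start, end) stays at its original width; the regions
    outside the band absorb all of the stretching (nearest-neighbor remap)."""
    h, w = img.shape[:2]
    start, end = keep
    band = end - start
    s = (new_w - band) / (w - band)       # stretch factor outside the band
    left_d = int(round(start * s))        # where the band starts in the output
    out = np.empty((h, new_w), dtype=img.dtype)
    for dst in range(new_w):
        if dst < left_d:                  # stretched left region
            src = int(dst / s)
        elif dst < left_d + band:         # protected band, copied 1:1
            src = start + (dst - left_d)
        else:                             # stretched right region
            src = min(int(end + (dst - left_d - band) / s), w - 1)
        out[:, dst] = img[:, src]
    return out

img = np.tile(np.arange(8), (2, 1))       # toy 2x8 image
out = stretch_protecting_region(img, new_w=12, keep=(3, 5))
print(out.shape)                          # (2, 12)
print(out[0, 5], out[0, 6])               # protected band copied as-is: 3 4
```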
Further, a preview image having a planetary special effect may be generated based on the target image.
The target image can also be obtained in the following manner.
FIG. 4 is a flow diagram illustrating acquisition of a target image in another image processing method according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, as shown in fig. 4, obtaining the target image by stretching the image to be processed may include steps S31 and S32. The steps will be described separately below.
In step S31, a second object image in the image to be processed is determined.
The second object image may be understood as an object image other than the main subject. The main subject may be the first subject image described above. For example, if the first object image is a person, the second object image is an image of an object other than the person, such as the ground, a tree, a house, or the like.
In step S32, the second object image is subjected to stretching processing to obtain a third object image, and a target image is obtained based on the third object image.
Because the second object image is not the main subject in the image to be processed, the stretching of the image to be processed can be achieved by stretching the region where the second object image is located, so as to preserve the effect of the main subject. That is, the second object image is stretched to obtain a third object image, and the target image is obtained based on the third object image.
In this way, the effect of displaying the main captured image (in one example, the first object image described above) on the preview image with the special effect of the planet can be effectively ensured.
For convenience, this embodiment is illustrated with the following example.
FIG. 5 shows a schematic view of an image to be processed; fig. 6 shows a schematic representation of a preview image with planetary effect for an image to be processed.
In one example, an image to be processed is acquired, as shown in fig. 5, in which the image to be processed includes a first object image (person 2) and a second object image (ground 1, house 3, and tree 4). As a possible embodiment, the image to be processed may be obtained by the process described above of identifying the overlapping portions of multiple captured images and fusing them; since that process has been described in detail, it is not repeated here. To preserve the display effect of the first object image (person 2) in the preview image with the planetary special effect, the deformation rate of the first object image can be reduced while converting the image to be processed into the target image; the stretching of the image to be processed is achieved by stretching the second object image (ground 1, house 3, and tree 4) to obtain a third object image, from which the target image is obtained. For example, in some optional embodiments, after the first object image is determined, its size may be adjusted according to a first preset ratio, for example reduced or enlarged by the first preset ratio; similarly, after the second object image is determined, its size may be adjusted according to a second preset ratio, for example reduced or enlarged by the second preset ratio. These size adjustments of the first object image and the second object image may be applied individually or in combination in the embodiments of the present disclosure.
Further, based on the target image, a preview image having a special effect of a planet is obtained, as shown in fig. 6.
In the process of stretching the image to be processed, the stretching can be realized in the following manner.
The stretched portions of the image to be processed are aligned; that is, the joint where the first target edge and the second target edge of the image to be processed are stitched is aligned, so that the head and tail of the third object image in the image to be processed are aligned along the horizontal direction at the stitching joint.
Still taking the person 2 as the first object image and the ground 1 as part of the second object image, during stretching the stretched ground 1 (i.e. the third object image) needs to join up along the horizontal direction; if there is a noticeable height difference between the two ends of the stretched ground 1, the effect of the resulting stitched image suffers. Aligning the stretched portions of the image to be processed therefore keeps the head and tail of the third object image level at the stitching joint, which in turn ensures the effect of the preview image with the planetary special effect obtained from the target image.
In an exemplary embodiment of the present disclosure, the image processing method further includes receiving a touch instruction of a user, and performing an operation or a presentation corresponding to the touch instruction on the preview image in response to the touch instruction.
In an example, in order to provide a preview image with multiple angles for a user, the image processing method provided by the present disclosure may further adjust a display angle of the preview image based on a touch instruction. This embodiment will be described in detail below.
When a right-swipe touch instruction from the user is detected, the terminal responds by rotating the displayed preview image with the planetary special effect to the right, and displays the preview image at the rotated viewing angle on the terminal or its display screen.
Through the embodiment, the preview image with the planet special effect in multiple visual angles can be provided for the user, so that the experience and satisfaction of the user in the using process are increased.
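Because the planet-effect preview is derived from a panorama, one simple way to rotate the viewing angle in response to a swipe is to cyclically shift the panorama's columns before the polar remap. The sketch below assumes that convention (which the patent does not specify); a right swipe is mapped to a positive angle:

```python
import numpy as np

def rotate_view(pano: np.ndarray, degrees: float) -> np.ndarray:
    """Rotate the viewing angle of a planet-effect preview by cyclically
    shifting the source panorama's columns before the polar remap."""
    w = pano.shape[1]
    shift = int(round(degrees / 360.0 * w))  # angle -> column offset
    return np.roll(pano, shift, axis=1)

pano = np.tile(np.arange(8), (2, 1))         # toy 2x8 panorama
rotated = rotate_view(pano, degrees=90)      # 90 deg = 2 of 8 columns
print(rotated[0, :4])                        # [6 7 0 1]
```

Re-running the polar remap on the shifted panorama then yields the preview at the new viewing angle.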
In an example, in order to provide a display function of enlarging or reducing the preview image for the user, the image processing method provided by the present disclosure may further adjust a display size of the preview image based on the touch instruction. This embodiment will be described in detail below.
When a touch instruction from the user to magnify the preview image is detected, the terminal responds by enlarging the displayed preview image with the planetary special effect and showing it on the terminal or its display screen, thereby improving the user's experience and satisfaction.
In the present disclosure, the touch instruction and the function corresponding to the touch instruction are not specifically limited, and may be determined according to actual situations.
As noted above, the image to be processed may also be an image from a video frame of a dynamic picture such as a video, and correspondingly the preview image may be a preview image, with the planetary special effect, corresponding to that frame. The image processing method can therefore also produce a planetary special effect video corresponding to a video composed of consecutive frames; the "video frame of a dynamic picture" above can be understood as one frame of such a video. The process of obtaining a planetary special effect video corresponding to a video is described below.
The following embodiments explain the process of generating a planetary special effect image, including a planetary special effect video, based on the preview image in response to a shooting instruction.
Fig. 7 is a flowchart illustrating generation of a planetary special effect image based on a preview image in an image processing method according to an exemplary embodiment.
In an exemplary embodiment of the present disclosure, as shown in fig. 7, generating a planetary special effect image based on the preview image includes steps S41 to S43. The steps will be described separately below.
In step S41, in response to the photographing instruction, a continuous multi-frame image is acquired.
The continuous multi-frame image may be a multi-frame image acquired by a camera. Wherein successive frames of images may constitute a video segment.
In step S42, the consecutive multi-frame images are processed in accordance with the preview image processing method, and a plurality of intermediate images having a planetary special effect corresponding to the consecutive multi-frame images are obtained.
An intermediate image with the planetary special effect can be understood as the preview image, having the planetary special effect, of the picture in one frame of a video composed of consecutive video frames.
In step S43, the plurality of intermediate images are concatenated to generate a planetary special effect video.
In one embodiment, when the user determines that the intermediate image with the planetary special effect meets the requirement of the user, a plurality of intermediate images can be connected in series to generate the planetary special effect video. By the method, videos with the planetary special effects can be generated, and the experience and satisfaction of users in the using process are improved.
In the application process, the intermediate images with the planet special effects are connected according to the time sequence, and then the planet special effect video can be obtained.
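Steps S41 to S43 can be sketched as follows, with a stand-in per-frame transform in place of the real preview pipeline (the function names are illustrative):

```python
import numpy as np

def planet_video(frames, make_planet):
    """Apply the same preview-image pipeline to every captured frame and
    concatenate the intermediate images, in time order, into a clip."""
    intermediates = [make_planet(f) for f in frames]   # one planet image per frame
    return np.stack(intermediates)                     # (num_frames, H, W) clip

# Five toy 4x8 frames; a trivial transform stands in for the real pipeline.
frames = [np.full((4, 8), i, dtype=float) for i in range(5)]
clip = planet_video(frames, make_planet=lambda f: f * 2)
print(clip.shape)  # (5, 4, 8)
```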
As can be seen from the above description, the image processing method provided by the present disclosure can generate a preview image with the planetary special effect from the acquired image to be processed and display it on a display screen; when the preview image meets the user's requirements, a planetary special effect image can be generated from it in response to the user's shooting instruction. In this way, the user need not obtain a photo to be processed in advance, but can preview the planetary special effect directly during image capture, which reduces the processing complexity of obtaining a planetary special effect image and improves the user's experience and satisfaction.
Based on the same conception, the embodiment of the disclosure also provides an image processing device.
It is understood that the image processing apparatus provided by the embodiments of the present disclosure includes a hardware structure and/or a software module for performing each function in order to realize the above functions. The disclosed embodiments can be implemented in hardware or a combination of hardware and computer software, in combination with the exemplary elements and algorithm steps disclosed in the disclosed embodiments. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
Fig. 8 is a block diagram illustrating an image processing apparatus according to an exemplary embodiment. The image processing apparatus 200 is applied to a terminal, and the terminal may include a display screen. Referring to fig. 8, the image processing apparatus 200 includes an acquiring module 121, a preview image generating module 122, a preview image displaying module 123, and a planet effect image generating module 124. Each module will be described separately below.
The acquiring module 121 is configured to acquire, in real time, the image to be processed captured by one or more cameras.
The preview image generating module 122 is configured to obtain a target image by stretching the image to be processed, and to generate a preview image with the planetary special effect based on the target image.
The preview image displaying module 123 is configured to display the preview image.
The planet effect image generating module 124 is configured to generate a planetary special effect image based on the preview image in response to a shooting instruction.
In an exemplary embodiment of the present disclosure, the preview image generating module 122 is configured to stitch a first target edge and a second target edge of the image to be processed, and to fill the stitched portion to obtain the target image, wherein the first target edge is opposite to the second target edge.
In an exemplary embodiment of the present disclosure, the preview image generating module 122 includes an object image determining sub-module and a target image acquiring sub-module. Each sub-module is described below.
The object image determining sub-module is configured to determine a first object image in the image to be processed.
The target image acquiring sub-module is configured to reduce the deformation rate of the first object image during the stretching of the image to be processed, to obtain the target image.
In an exemplary embodiment of the present disclosure, the acquiring module 121 includes a captured image acquiring sub-module and a fused image acquiring sub-module. Each sub-module is described below.
The captured image acquiring sub-module is configured to acquire, in real time, a plurality of images captured by the plurality of cameras.
The fused image acquiring sub-module is configured to identify the overlapping portions of the captured images, fuse the overlapping portions to obtain a fused image, and use the fused image as the image to be processed.
In an exemplary embodiment of the present disclosure, the preview image generating module 122 includes an object image determining sub-module and a target image acquiring sub-module, which are described below.
The object image determining sub-module is configured to determine a second object image in the image to be processed.
The target image acquiring sub-module is configured to perform stretching processing on the second object image to obtain a third object image, and to obtain the target image based on the third object image.
In an exemplary embodiment of the present disclosure, the planetary special effect image includes a planetary special effect video, and the planet effect image generating module 124 includes an instruction responding sub-module, an image processing sub-module, and a planetary special effect video generating sub-module. Each sub-module is described below.
The instruction responding sub-module is configured to acquire continuous multi-frame images in response to a shooting instruction.
The image processing sub-module is configured to process the continuous multi-frame images in the same manner as the preview image, to obtain a plurality of intermediate images with the planetary special effect corresponding to the continuous multi-frame images.
The planetary special effect video generating sub-module is configured to concatenate the plurality of intermediate images to generate the planetary special effect video.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 9 is a block diagram illustrating an apparatus 800 for image processing according to an example embodiment. For example, the apparatus for image processing 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 9, an apparatus 800 for image processing may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls the overall operation of the apparatus 800 for image processing, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus for image processing 800. Examples of such data include instructions for any application or method operating on the apparatus for image processing 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power component 806 provides power to various components of the device 800 for image processing. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the apparatus for image processing 800.
The multimedia component 808 comprises a screen providing an output interface between the apparatus for image processing 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus for image processing 800 is in an operation mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive an external audio signal when the apparatus for image processing 800 is in an operation mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing status assessment of various aspects of the apparatus 800 for image processing. For example, the sensor component 814 may detect an open/closed state of the apparatus for image processing 800, a relative positioning of components, such as a display and a keypad of the apparatus for image processing 800, the sensor component 814 may also detect a change in position of the apparatus for image processing 800 or a component of the apparatus for image processing 800, the presence or absence of user contact with the apparatus for image processing 800, an orientation or acceleration/deceleration of the apparatus for image processing 800, and a change in temperature of the apparatus for image processing 800. Sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communication between the apparatus for image processing 800 and other devices in a wired or wireless manner. The apparatus 800 for image processing may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 for image processing may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors, or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the apparatus for image processing 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Fig. 10 is a block diagram illustrating another apparatus for image processing according to an exemplary embodiment. For example, the apparatus 1100 for image processing may be provided as a server. Referring to fig. 10, the apparatus 1100 for image processing includes a processing component 1122, which further includes one or more processors, and memory resources, represented by a memory 1132, for storing instructions executable by the processing component 1122, such as application programs. The application programs stored in the memory 1132 may include one or more modules, each corresponding to a set of instructions. Further, the processing component 1122 is configured to execute instructions to perform the image processing methods described above.
The apparatus 1100 for image processing may further include a power supply component 1126 configured to perform power management of the apparatus 1100, a wired or wireless network interface 1150 configured to connect the apparatus 1100 to a network, and an input/output (I/O) interface 1158. The apparatus 1100 for image processing may operate based on an operating system stored in the memory 1132, such as Windows Server, Mac OS X, Unix, Linux, FreeBSD, or the like.
It is understood that "a plurality" in this disclosure means two or more, and other words are analogous. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. The singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It will be further understood that the terms "first," "second," and the like are used to describe various information and that such information should not be limited by these terms. These terms are only used to distinguish one type of information from another and do not denote a particular order or importance. Indeed, the terms "first," "second," and the like are fully interchangeable. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure.
It will be further understood that, unless otherwise specified, "connected" includes direct connections between the two without the presence of other elements, as well as indirect connections between the two with the presence of other elements.
It is further to be understood that while operations are depicted in the drawings in a particular order, this is not to be understood as requiring that such operations be performed in the particular order shown or in serial order, or that all illustrated operations be performed, to achieve desirable results. In certain environments, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (14)

1. An image processing method, applied to a terminal, the method comprising:
acquiring an image to be processed acquired in real time through one or more cameras;
stretching the image to be processed to obtain a target image, and generating a preview image with a planet special effect based on the target image;
displaying the preview image;
and generating a planet special effect image based on the preview image in response to a shooting instruction.
2. The image processing method according to claim 1, wherein stretching the image to be processed to obtain the target image comprises:
splicing a first target edge and a second target edge of the image to be processed, and filling the spliced part to obtain the target image, wherein the first target edge is opposite to the second target edge.
3. The image processing method according to claim 2, wherein stretching the image to be processed to obtain the target image comprises:
determining a first object image in the image to be processed;
and in the process of stretching the image to be processed, reducing the deformation rate of the first object image to obtain the target image.
4. The image processing method according to any one of claims 1 to 3, wherein stretching the image to be processed to obtain the target image comprises:
determining a second object image in the image to be processed;
and performing image stretching processing on the second object image to obtain a third object image, and obtaining the target image based on the third object image.
5. The image processing method according to claim 1, wherein acquiring the image to be processed acquired in real time through one or more cameras comprises:
acquiring a plurality of acquired images acquired in real time by a plurality of cameras;
and identifying an overlapped part based on the plurality of acquired images, fusing the overlapped part to obtain a fused image, and taking the fused image as the image to be processed.
6. The image processing method according to claim 1, wherein the planet special effect image comprises a planet special effect video, and generating the planet special effect image based on the preview image in response to the shooting instruction comprises:
in response to the shooting instruction, acquiring continuous multi-frame images;
processing the continuous multi-frame images according to the processing mode of the preview image to obtain a plurality of intermediate images with the planet special effect corresponding to the continuous multi-frame images;
and connecting the plurality of intermediate images in series to generate the planet special effect video.
7. An image processing apparatus, applied to a terminal, comprising:
the acquisition module is used for acquiring images to be processed acquired in real time through one or more cameras;
the preview image generating module is used for stretching the image to be processed to obtain a target image and generating a preview image with a planet special effect based on the target image;
the preview image display module is used for displaying the preview image;
and the planet special effect image generation module is used for responding to a shooting instruction and generating a planet special effect image based on the preview image.
8. The image processing apparatus according to claim 7, wherein the preview image generating module is configured to:
splice a first target edge and a second target edge of the image to be processed, and fill the spliced part to obtain the target image, wherein the first target edge is opposite to the second target edge.
9. The image processing apparatus according to claim 8, wherein the preview image generating module comprises:
an object image determining submodule, configured to determine a first object image in the image to be processed;
and a target image obtaining submodule, configured to reduce the deformation rate of the first object image in the process of stretching the image to be processed, to obtain the target image.
10. The image processing apparatus according to any one of claims 7 to 9, wherein the preview image generating module comprises:
an object image determining submodule, configured to determine a second object image in the image to be processed;
and a target image obtaining submodule, configured to perform image stretching processing on the second object image to obtain a third object image, and obtain the target image based on the third object image.
11. The image processing apparatus according to claim 7, wherein the acquisition module comprises:
an image acquiring submodule, configured to acquire a plurality of acquired images acquired in real time by a plurality of cameras;
and an image fusing submodule, configured to identify an overlapped part based on the plurality of acquired images, fuse the overlapped part to obtain a fused image, and take the fused image as the image to be processed.
12. The image processing apparatus according to claim 7, wherein the planet special effect image comprises a planet special effect video, and the planet special effect image generation module comprises:
an instruction responding submodule, configured to acquire continuous multi-frame images in response to the shooting instruction;
an image processing submodule, configured to process the continuous multi-frame images according to the processing mode of the preview image to obtain a plurality of intermediate images with the planet special effect corresponding to the continuous multi-frame images;
and a planet special effect video generating submodule, configured to connect the plurality of intermediate images in series to generate the planet special effect video.
13. An image processing apparatus characterized by comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the image processing method of any one of claims 1 to 6.
14. A non-transitory computer-readable storage medium having stored thereon instructions that, when executed by a processor of a mobile terminal, enable the mobile terminal to perform the image processing method of any one of claims 1 to 6.
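The claims describe the "planet special effect" only at a functional level. For orientation, a common way to realize this kind of effect outside the patent is an inverse polar remap: each output pixel is converted to polar coordinates about the image centre, the angle selects a column of the (stretched) source image, and the radius selects a row, so that the bottom edge of the source wraps into the centre of a circular "planet". The NumPy sketch below illustrates that general idea only; the function name `tiny_planet`, its parameters, and the nearest-neighbour sampling are illustrative assumptions, not the claimed implementation.

```python
import numpy as np

def tiny_planet(img, out_size=256):
    """Remap a panorama-like image into a 'tiny planet' (illustrative sketch).

    Each output pixel (x, y) is expressed in polar coordinates (r, theta)
    about the output centre: theta picks a source column, r picks a source
    row, with the bottom row of the source mapped to the planet's centre.
    """
    h, w = img.shape[:2]
    ys, xs = np.mgrid[0:out_size, 0:out_size]
    cx = cy = (out_size - 1) / 2.0
    dx, dy = xs - cx, ys - cy
    # Normalised radius (0 at centre, ~1 at the inscribed circle's edge).
    r = np.sqrt(dx * dx + dy * dy) / (out_size / 2.0)
    # Angle normalised to [0, 1] so it indexes source columns directly.
    theta = (np.arctan2(dy, dx) + np.pi) / (2.0 * np.pi)
    src_x = np.clip((theta * (w - 1)).astype(int), 0, w - 1)
    # r = 0 -> bottom source row; r >= 1 -> top source row.
    src_y = np.clip(((1.0 - np.clip(r, 0.0, 1.0)) * (h - 1)).astype(int), 0, h - 1)
    # Fancy indexing performs the nearest-neighbour remap in one step.
    return img[src_y, src_x]
```

For a video variant along the lines of claim 6, the same remap would simply be applied to each captured frame before the frames are concatenated into a clip.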
CN202010756691.3A 2020-07-31 2020-07-31 Image processing method, device and storage medium Active CN114070999B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010756691.3A CN114070999B (en) 2020-07-31 2020-07-31 Image processing method, device and storage medium

Publications (2)

Publication Number Publication Date
CN114070999A true CN114070999A (en) 2022-02-18
CN114070999B CN114070999B (en) 2023-12-12

Family ID: 80227522

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010756691.3A Active CN114070999B (en) 2020-07-31 2020-07-31 Image processing method, device and storage medium

Country Status (1)

Country Link
CN (1) CN114070999B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1705363A (en) * 2004-05-31 2005-12-07 佳能株式会社 Apparatus and method for image processing
US20100231786A1 (en) * 2006-06-28 2010-09-16 Panasonic Corporation Image special effect apparatus and image special effect method
CN102780852A (en) * 2012-05-31 2012-11-14 新奥特(北京)视频技术有限公司 Method for realizing spherical transition special effects
CN103020900A (en) * 2012-11-15 2013-04-03 北京小米科技有限责任公司 Method and device for image processing

Also Published As

Publication number Publication date
CN114070999B (en) 2023-12-12

Similar Documents

Publication Publication Date Title
KR102194094B1 (en) Synthesis method, apparatus, program and recording medium of virtual and real objects
US11368632B2 (en) Method and apparatus for processing video, and storage medium
US9332208B2 (en) Imaging apparatus having a projector with automatic photography activation based on superimposition
CN108495032B (en) Image processing method, image processing device, storage medium and electronic equipment
EP3010226A2 (en) Method and apparatus for obtaining photograph
CN107426502B (en) Shooting method and device, electronic equipment and storage medium
CN108632543B (en) Image display method, image display device, storage medium and electronic equipment
CN106791483B (en) Image transmission method and device and electronic equipment
EP3945494A1 (en) Video processing method, apparatus and storage medium
KR20120012201A (en) Method for photographing panorama picture
CN105554372A (en) Photographing method and device
CN111083371A (en) Shooting method and electronic equipment
CN112312042A (en) Display control method, display control device, electronic equipment and storage medium
CN110827219B (en) Training method, device and medium of image processing model
US20160295118A1 (en) Method and apparatus for displaying framing information
US11265529B2 (en) Method and apparatus for controlling image display
CN114070999B (en) Image processing method, device and storage medium
KR102557592B1 (en) Method and apparatus for displaying an image, electronic device and computer-readable storage medium
CN111586296B (en) Image capturing method, image capturing apparatus, and storage medium
CN113852752B (en) Photo taking method, photo taking device and storage medium
WO2023225910A1 (en) Video display method and apparatus, terminal device, and computer storage medium
JP2005117106A (en) Mobile telephone with imaging function and control method thereof, and control program
CN118018854A (en) Method, device and storage medium for generating high dynamic range image
CN118055334A (en) Photographing method, photographing device and storage medium
CN118052752A (en) Image processing method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant