CN110070585A - Image generating method, device and computer readable storage medium
- Publication number: CN110070585A
- Application number: CN201910101336.XA
- Authority: CN (China)
- Prior art date: 2019-01-31
- Legal status: Pending (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
Abstract
The present disclosure discloses an image generation method, an image generation apparatus, an electronic device and a computer-readable storage medium. The image generation method includes: acquiring a first layer image; acquiring a second layer image; calculating a first image area obtained after the first layer image and the second layer image are superimposed; and rendering to generate a first image according to the first image area. By calculating the superimposed image area in advance and only then performing the image rendering, the embodiments of the present disclosure solve the technical problem in the prior art that generating a superimposed image of different layers requires multiple rendering passes and increases system overhead.
Description
Technical Field
The present disclosure relates to the field of image processing, and in particular, to an image generation method and apparatus, an electronic device, and a computer-readable storage medium.
Background
With the development of computer technology, the range of applications of intelligent terminals has expanded greatly; for example, an intelligent terminal can be used to listen to music, play games, chat online, take photos, and so on. As for photographing, intelligent terminals now offer more than ten million photographing pixels, high definition, and a photographing effect comparable to that of a professional camera.
At present, when an intelligent terminal is used for photographing, not only can the traditional photographing functions be realized with the photographing software built in at the factory, but photographing effects with additional functions can also be obtained by downloading an application program (APP for short) from the network, for example APPs providing dark-light detection, a beauty camera, super-pixel functions and the like. Various special effects, such as beautification, filtering, eye enlargement and face slimming, can be formed by combining various basic image generation operations.
Currently, some special effects are generated by superimposing multiple image layers. The common way to superimpose multiple layers is to draw and superimpose them layer by layer on separate layers. For example, to stack three layers A, B and C, layer A is drawn first, then layer B, and finally layer C, so each layer requires its own rendering pass.
Disclosure of Invention
In a first aspect, an embodiment of the present disclosure provides an image generation method, including: acquiring a first image layer image; acquiring a second image layer image; calculating a first image area after the first image layer image and the second image layer image are superposed; and rendering to generate a first image according to the first image area.
Further, the acquiring the first layer image includes: acquiring an image to be processed; and extracting a foreground image of the image to be processed as a first image layer image.
Further, the acquiring the image to be processed includes: and acquiring a video image, and taking a current video image frame of the video image as an image to be processed.
Further, the extracting a foreground image of the image to be processed as a first image-layer image includes: acquiring an extraction template image of a foreground image; and superposing the template image and the image to be processed to obtain the foreground image as a first image layer image.
Further, the superimposing the template image and the image to be processed to obtain the foreground image as a first image-layer image includes: and superposing the template image and the image to be processed in a transparency manner to obtain the foreground image as a first image layer image.
Further, the obtaining the second layer image includes: moving the first layer image to obtain a moving image of the first layer; calculating a superposition image of the first layer image and the moving image of the first layer according to a first logic calculation rule; and taking the superposed image as a second image layer image.
Further, the calculating the first image area after the first image-layer image and the second image-layer image are superimposed includes: and calculating a first image area after the first image layer image and the second image layer image are overlapped according to a second logic calculation rule.
Further, the generating a first image by rendering according to the first image area includes: and rendering to generate a first image according to the front-back relation of the first image layer image and the second image layer image in the first image area.
Further, the calculating a superimposed image of the first layer image and the moving image of the first layer according to the first logic calculation rule includes: calculating a first intersection image of the first image layer image and the moving image of the first image layer; calculating a reverse image of the intersection image according to the first intersection image; and calculating a second intersection image of the reverse image of the intersection image and the moving image of the first image layer.
Further, the calculating a first image area after the first layer image and the second layer image are superimposed according to a second logic calculation rule includes: calculating a union set image of the first image layer image and the second image layer image; and taking the image area of the union image as a first image area.
Further, the moving the first layer image to obtain a moving image of the first layer includes: moving the first layer image for multiple times to obtain a plurality of moving images of the first layer; the calculating the superimposed image of the first layer image and the moving image of the first layer according to the first logic calculation rule includes: and calculating the superposed images of the first layer images and the moving images of the plurality of first layers according to a first logic calculation rule.
In a second aspect, an embodiment of the present disclosure provides an image generating apparatus, including:
the first image acquisition module is used for acquiring a first image layer image;
the second image acquisition module is used for acquiring a second image layer image;
the first image area calculation module is used for calculating a first image area after the first image layer image and the second image layer image are superposed;
and the rendering module is used for rendering and generating a first image according to the first image area.
Further, the first image obtaining module further includes:
the image to be processed acquisition module is used for acquiring an image to be processed;
and the foreground image extraction module is used for extracting a foreground image of the image to be processed as a first image layer image.
Further, the module for acquiring the image to be processed further includes:
and the video image acquisition module is used for acquiring a video image and taking the current video image frame of the video image as an image to be processed.
Further, the foreground image extraction module includes:
the template image acquisition module is used for acquiring an extraction template image of the foreground image;
and the foreground image extraction submodule is used for superposing the template image and the image to be processed to obtain the foreground image serving as a first image layer image.
Further, the foreground image extraction sub-module is further configured to:
and superposing the template image and the image to be processed in a transparency manner to obtain the foreground image as a first image layer image.
Further, the second image obtaining module further includes:
the moving module is used for moving the first layer image to obtain a moving image of the first layer;
the first logic calculation module is used for calculating a superposed image of the first layer image and the moving image of the first layer according to a first logic calculation rule; and taking the superposed image as a second image layer image.
Further, the first image region calculating module includes:
and the second logic calculation module is used for calculating a first image area after the first image layer image and the second image layer image are superposed according to a second logic calculation rule.
Further, the rendering module is further configured to:
and rendering to generate a first image according to the front-back relation of the first image layer image and the second image layer image in the first image area.
Further, the first logic calculation module is further configured to:
calculating a first intersection image of the first image layer image and the moving image of the first image layer;
calculating a reverse image of the intersection image according to the first intersection image;
and calculating a second intersection image of the reverse image of the intersection image and the moving image of the first image layer.
Further, the second logic calculation module is further configured to:
calculating a union set image of the first image layer image and the second image layer image;
and taking the image area of the union image as a first image area.
Further, the moving module is further configured to: moving the first layer image for multiple times to obtain a plurality of moving images of the first layer; the first logic computation module is further configured to: and calculating the superposed images of the first layer images and the moving images of the plurality of first layers according to a first logic calculation rule.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: at least one processor; and
a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform any of the image generation methods of the preceding first aspect.
In a fourth aspect, the present disclosure provides a non-transitory computer-readable storage medium, wherein the non-transitory computer-readable storage medium stores computer instructions for causing a computer to execute the image generation method of any one of the foregoing first aspects.
The disclosure provides an image generation method, an image generation apparatus, an electronic device and a computer-readable storage medium. The image generation method comprises the following steps: acquiring a first layer image; acquiring a second layer image; calculating a first image area after the first layer image and the second layer image are superimposed; and rendering to generate a first image according to the first image area. By calculating the superimposed image area in advance and only then performing the image rendering, the embodiments of the present disclosure solve the technical problem in the prior art that generating superimposed images of different layers requires multiple rendering passes and increases system overhead.
The foregoing is a summary of the present disclosure. In order to promote a clear understanding of the technical means of the present disclosure, embodiments are described in detail below; the disclosure may also be embodied in other specific forms without departing from its spirit or essential attributes.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings needed for the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present disclosure, and those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a flowchart of an embodiment of an image generation method provided in an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a specific example of acquiring a first layer image according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a specific example of acquiring a second layer image according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of a specific example of generating a first image region according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an embodiment of an image generating apparatus provided in an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an electronic device provided according to an embodiment of the present disclosure.
Detailed Description
The embodiments of the present disclosure are described below with specific examples, and other advantages and effects of the present disclosure will be readily apparent to those skilled in the art from the disclosure in the specification. It is to be understood that the described embodiments are merely illustrative of some, and not restrictive, of the embodiments of the disclosure. The disclosure may be embodied or carried out in various other specific embodiments, and various modifications and changes may be made in the details within the description without departing from the spirit of the disclosure. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict. All other embodiments, which can be derived by a person skilled in the art from the embodiments disclosed herein without making any creative effort, shall fall within the protection scope of the present disclosure.
It is noted that various aspects of the embodiments are described below within the scope of the appended claims. It should be apparent that the aspects described herein may be embodied in a wide variety of forms and that any specific structure and/or function described herein is merely illustrative. Based on the disclosure, one skilled in the art should appreciate that one aspect described herein may be implemented independently of any other aspects and that two or more of these aspects may be combined in various ways. For example, an apparatus may be implemented and/or a method practiced using any number of the aspects set forth herein. Additionally, such an apparatus may be implemented and/or such a method may be practiced using other structure and/or functionality in addition to one or more of the aspects set forth herein.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present disclosure, and the drawings only show the components related to the present disclosure rather than the number, shape and size of the components in actual implementation, and the type, amount and ratio of the components in actual implementation may be changed arbitrarily, and the layout of the components may be more complicated.
In addition, in the following description, specific details are provided to facilitate a thorough understanding of the examples. However, it will be understood by those skilled in the art that the aspects may be practiced without these specific details.
Fig. 1 is a flowchart of a first embodiment of an image generation method provided in this disclosure, where the image generation method provided in this embodiment may be executed by an image generation apparatus, the image generation apparatus may be implemented as software, or implemented as a combination of software and hardware, and the image generation apparatus may be integrated in some device in an image generation system, such as an image generation server or an image generation terminal device. As shown in fig. 1, the method comprises the steps of:
Step S101: acquiring a first image layer image;
In one embodiment, the first image layer image may be any image.
In an embodiment, the acquiring the first layer image includes: acquiring an image to be processed; and extracting a foreground image of the image to be processed as a first image layer image.
In this embodiment, the image to be processed may be obtained by an image sensor. An image sensor refers to any of various devices capable of acquiring images; typical image sensors are video cameras, cameras, and the like. In this embodiment, the image sensor may be a camera on the terminal device, such as a front-facing or rear-facing camera on a smart phone, and the image acquired by the camera may be displayed directly on the display screen of the smart phone. In this embodiment, acquiring the image to be processed may mean acquiring the current video frame of the video captured by the current terminal device; since a video is composed of a plurality of video frames, the image processing in this embodiment may be performed on the video frames of the video.
In this embodiment, the image to be processed may be any image obtained locally or from a network. In this embodiment, extracting a foreground image of the image to be processed as the first layer image includes: acquiring an extraction template image of the foreground image; and superimposing the template image and the image to be processed to obtain the foreground image. The template image includes a region having the same shape as the foreground image to be extracted, and the foreground image is extracted through that region. Superimposing the template image and the image to be processed to obtain the foreground image includes: superimposing the template image and the image to be processed by transparency to obtain the foreground image. In this embodiment, the template image comprises two regions: a first region having a transparency of 0, i.e. completely opaque, and a second region having a transparency of 1, i.e. completely transparent. In this embodiment, the transparency of the image in the RGB color space is used to calculate the foreground image. Let the image to be processed be G1 and the template image be G2, with transparency denoted by alpha; the foreground image G3 can then be calculated by the following formula:
G3 = G1.alpha * G2.alpha
Therefore, the required foreground image can be extracted through the transparent area of the template image.
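By way of a non-limiting illustrative sketch (not part of the claimed implementation), the transparency-based extraction above can be expressed over NumPy RGBA arrays; the function and array names are assumptions for illustration, and the transparency convention of the template follows the description above.
```python
import numpy as np

def extract_foreground(g1: np.ndarray, g2: np.ndarray) -> np.ndarray:
    """Sketch of the formula G3 = G1.alpha * G2.alpha.

    g1: image to be processed, shape (H, W, 4), RGBA with alpha in [0, 1].
    g2: extraction template image of the same shape; its alpha channel selects
        the foreground-shaped region (the exact transparency convention is an
        assumption here).
    Returns g3: g1 with its alpha multiplied by the template's alpha, so that
    only the region selected by the template remains visible.
    """
    g3 = g1.copy()
    g3[..., 3] = g1[..., 3] * g2[..., 3]
    return g3
```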
Optionally, in a specific embodiment, the template image has the same size as the image to be processed, and a region with the same shape as the foreground image is located in the template image at the position corresponding to the foreground image of the image to be processed. As shown in fig. 2, 201 is an image to be processed, 203 is a foreground image of the image to be processed 201, and 204 is a background image of the image to be processed 201. 202 is a template image for extracting the foreground image 203; the template image includes a transparent area 205, which has the same shape and size as the foreground image 203, and an opaque background area 206. The template image 202 is superimposed on the image to be processed 201; since 206 is opaque and 205 is transparent, the foreground image 203 is extracted from the image to be processed 201, yielding image 207.
It can be understood that the shape of the transparent region of the template image can be set arbitrarily, and can be set to any required shape to extract different foreground images, which is not described herein again.
Step S102: acquiring a second image layer image;
In one embodiment, the second image-layer image may be any image.
In an embodiment, the manner of acquiring the image of the second layer may be the same as the manner of acquiring the image of the first layer.
In an embodiment, the acquiring the second layer image includes: moving the first layer image to obtain a moving image of the first layer; calculating a superimposed image of the first layer image and the moving image of the first layer according to a first logic calculation rule; and taking the superimposed image as the second layer image. In this embodiment, the first layer image is moved, and a new image is then generated as the second layer image by superimposing the moved copy with the original first layer image. As shown in fig. 3, a specific way of obtaining the second layer image in this embodiment is as follows: 305 in the image 301 is extracted to obtain a foreground image; the foreground image 305 is moved to the left to obtain an image 302 containing a moved foreground image 306; the images 305 and 306 are then superimposed, and the second layer image 307 in the image 304 is obtained according to the following calculation logic:
G4 = !(G5 AND G6) AND G6
where G4 is the final second layer image 307, G5 is the foreground image 305, and G6 is the shifted foreground image 306; computing according to the above logic yields image 307.
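As a non-limiting sketch only (the boolean-mask representation, the helper names and the shift helper are assumptions, not the claimed implementation), the first logic calculation rule can be written over binary masks as follows.
```python
import numpy as np

def move_mask(mask: np.ndarray, dx: int, dy: int) -> np.ndarray:
    """Translate a boolean mask by (dx, dy) pixels, filling the uncovered border with False."""
    h, w = mask.shape
    assert abs(dx) < w and abs(dy) < h, "shift must be smaller than the mask"
    moved = np.zeros_like(mask)
    src = mask[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
    moved[max(0, dy):h - max(0, -dy), max(0, dx):w - max(0, -dx)] = src
    return moved

def second_layer_mask(g5: np.ndarray, g6: np.ndarray) -> np.ndarray:
    """First logic calculation rule: G4 = !(G5 AND G6) AND G6.

    g5: boolean mask of the first layer image (305).
    g6: boolean mask of the moving image of the first layer (306).
    Returns the mask of the second layer image (307): the part of the moved
    copy that does not overlap the original layer.
    """
    intersection = np.logical_and(g5, g6)    # first intersection image
    reverse = np.logical_not(intersection)   # reverse image of the intersection
    return np.logical_and(reverse, g6)       # second intersection image
```
For instance, second_layer_mask(g5, move_mask(g5, -20, 0)) would reproduce the leftward-shift example of fig. 3 under these assumptions.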
It is understood that the above logic calculation rule is only an example, and the logic calculation rule in the present disclosure may be any rule, and any calculation rule may be designed as needed to obtain a desired result of the superimposed image, which is not described herein again.
Step S103: calculating a first image area after the first image layer image and the second image layer image are superposed;
in this embodiment, the calculating the first image area after the first layer image and the second layer image are superimposed includes: and calculating a first image area after the first image layer image and the second image layer image are overlapped according to a second logic calculation rule. Fig. 3 shows an example of a first image area after the first layer image and the second layer image are superimposed by calculating a second logical calculation rule, such as the image 308 shown in fig. 3, and a superimposed image of the first layer image and the second layer image, i.e. the first image area 309, is obtained by the following logical calculation rule:
G7=G4or G5
wherein G is7Is a superimposed image. Through the above logic calculation, the second layer image 307 and the first layer image 305 are superimposed to generate a first image area 309, in which an effect that the image 307 is in the background and the image 305 is in the foreground is presented.
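Continuing the non-limiting boolean-mask sketch above (the names are assumptions), the second logic calculation rule is a union of the two layer masks.
```python
import numpy as np

def first_image_area(g4: np.ndarray, g5: np.ndarray) -> np.ndarray:
    """Second logic calculation rule: G7 = G4 OR G5.

    g4: boolean mask of the second layer image (307).
    g5: boolean mask of the first layer image (305).
    Returns the union mask, i.e. the first image area (309) that will then be
    rendered in a single pass.
    """
    return np.logical_or(g4, g5)
```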
In one embodiment, a UV map for the first image region used for rendering may be generated.
It is understood that the above logic calculation rule is only an example, and the logic calculation rule in the present disclosure may be any rule, and any calculation rule may be designed as needed to obtain a desired result of the superimposed image, which is not described herein again.
Step S104: rendering to generate a first image according to the first image area.
In one embodiment, the rendering to generate the first image according to the first image area includes: rendering to generate the first image according to the front-back relation of the first layer image and the second layer image in the first image area. In this embodiment, the first layer image is rendered in a first layer of the display area and the second layer image is rendered in a second layer of the display area; since the overlapping portion of the first layer image and the second layer image has already been resolved by the logic calculations in step S102 and step S103, drawing can be completed with a single rendering pass. Specifically, a UV map of the superimposed image may be generated from the result of step S103, and the first image is generated by rendering its colors into the display area through the UV map.
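Purely as an illustrative sketch (the mask-based composite below stands in for the UV-map rendering and is an assumption, not the claimed renderer), a single-pass composite of the precomputed region could look like this:
```python
import numpy as np

def render_first_image(first_layer: np.ndarray, second_layer: np.ndarray,
                       g5: np.ndarray, g4: np.ndarray,
                       background: np.ndarray) -> np.ndarray:
    """Single-pass composite of the precomputed first image area.

    first_layer / second_layer: (H, W, 3) color images of the two layers.
    g5 / g4: boolean masks of the first and second layer images; because the
        overlap was already removed from g4 by the first logic rule, every
        pixel of the first image area belongs to exactly one layer.
    background: (H, W, 3) image shown outside the first image area.
    """
    out = background.copy()
    out[g4] = second_layer[g4]   # second layer image sits behind ...
    out[g5] = first_layer[g5]    # ... the first layer image in front
    return out
```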
In an embodiment, in step S102, the moving the first layer image to obtain a moving image of the first layer includes: moving the first layer image multiple times to obtain a plurality of moving images of the first layer. In a specific example of this embodiment, the first layer image is moved by the same distance in four directions, i.e., up, down, left, and right.
In this case, the calculating a superimposed image of the first layer image and the moving image of the first layer according to the first logic calculation rule includes: calculating the superimposed image of the first layer image and the plurality of moving images of the first layer according to the first logic calculation rule. As described in step S102, when there are a plurality of moving images of the first layer, the single-copy superimposition result is calculated repeatedly, once for each moving image. Following the example of step S102, after the first layer image is moved in the four directions, i.e. up, down, left and right, and superimposed with the first layer image according to the first logic calculation rule, a superimposed image as shown at 401 in fig. 4 is obtained, that is, the second layer image needed in this embodiment. Then, in step S103, the first layer image and the second layer image obtained in this embodiment are superimposed according to the second logic calculation rule to generate a first image area, as shown at 402 in fig. 4, which is rendered in step S104. Through this specific example, different superimposed image effects can be generated by setting the moving rule and the logic calculation rules; for example, in this specific example a flower effect is achieved in which the petals lie beneath the flower-center layer, producing a layered look, and the flower effect can be generated with a single image rendering pass instead of rendering five times.
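As a further non-limiting sketch of this four-direction example (it reuses the move_mask and second_layer_mask helpers sketched under step S102; combining the per-direction results by union is an assumption):
```python
import numpy as np

# move_mask and second_layer_mask are the illustrative helpers sketched under step S102.

def flower_second_layer(g5: np.ndarray, distance: int) -> np.ndarray:
    """Build the second layer mask (401 in fig. 4) by moving the first layer mask
    by the same distance in four directions and applying the first logic rule to each copy."""
    moves = [(distance, 0), (-distance, 0), (0, distance), (0, -distance)]  # right, left, down, up
    second = np.zeros_like(g5, dtype=bool)
    for dx, dy in moves:
        moved = move_mask(g5, dx, dy)              # one moving image of the first layer
        second |= second_layer_mask(g5, moved)     # keep the part outside the original layer
    return second

def flower_first_image_area(g5: np.ndarray, distance: int) -> np.ndarray:
    """First image area (402 in fig. 4): union of the first layer mask and the flower-shaped second layer."""
    return np.logical_or(g5, flower_second_layer(g5, distance))
```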
The disclosure provides an image generation method, an image generation apparatus, an electronic device and a computer-readable storage medium. The image generation method comprises the following steps: acquiring a first layer image; acquiring a second layer image; calculating a first image area after the first layer image and the second layer image are superimposed; and rendering to generate a first image according to the first image area. By calculating the superimposed image area in advance and only then performing the image rendering, the embodiments of the present disclosure solve the technical problem in the prior art that generating superimposed images of different layers requires multiple rendering passes and increases system overhead.
Fig. 5 is a schematic structural diagram of an embodiment of an image generating apparatus provided in an embodiment of the present disclosure, and as shown in fig. 5, the apparatus 500 includes: a first image acquisition module 501, a second image acquisition module 502, a first image region calculation module 503, and a rendering module 504. Wherein,
a first image obtaining module 501, configured to obtain a first layer image;
a second image obtaining module 502, configured to obtain a second layer image;
a first image area calculating module 503, configured to calculate a first image area after the first layer image and the second layer image are superimposed;
and a rendering module 504, configured to generate a first image by rendering according to the first image area.
Further, the first image obtaining module 501 further includes:
the image to be processed acquisition module is used for acquiring an image to be processed;
and the foreground image extraction module is used for extracting a foreground image of the image to be processed as a first image layer image.
Further, the module for acquiring the image to be processed further includes:
and the video image acquisition module is used for acquiring a video image and taking the current video image frame of the video image as an image to be processed.
Further, the foreground image extraction module includes:
the template image acquisition module is used for acquiring an extraction template image of the foreground image;
and the foreground image extraction submodule is used for superposing the template image and the image to be processed to obtain the foreground image serving as a first image layer image.
Further, the foreground image extraction sub-module is further configured to:
and superposing the template image and the image to be processed in a transparency manner to obtain the foreground image as a first image layer image.
Further, the second image obtaining module 502 further includes:
the moving module is used for moving the first layer image to obtain a moving image of the first layer;
the first logic calculation module is used for calculating a superposed image of the first layer image and the moving image of the first layer according to a first logic calculation rule; and taking the superposed image as a second image layer image.
Further, the first image region calculating module 503 includes:
and the second logic calculation module is used for calculating a first image area after the first image layer image and the second image layer image are superposed according to a second logic calculation rule.
Further, the rendering module 504 is further configured to:
and rendering to generate a first image according to the front-back relation of the first image layer image and the second image layer image in the first image area.
Further, the first logic calculation module is further configured to:
calculating a first intersection image of the first image layer image and the moving image of the first image layer;
calculating a reverse image of the intersection image according to the first intersection image;
and calculating a second intersection image of the reverse image of the intersection image and the moving image of the first image layer.
Further, the second logic calculation module is further configured to:
calculating a union set image of the first image layer image and the second image layer image;
and taking the image area of the union image as a first image area.
Further, the moving module is further configured to: moving the first layer image for multiple times to obtain a plurality of moving images of the first layer; the first logic computation module is further configured to: and calculating the superposed images of the first layer images and the moving images of the plurality of first layers according to a first logic calculation rule.
The apparatus shown in fig. 5 can perform the method of the embodiment shown in fig. 1, and reference may be made to the related description of the embodiment shown in fig. 1 for a part of this embodiment that is not described in detail. The implementation process and technical effect of the technical solution refer to the description in the embodiment shown in fig. 1, and are not described herein again.
Referring now to FIG. 6, a block diagram of an electronic device 600 suitable for use in implementing embodiments of the present disclosure is shown. The electronic devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., car navigation terminals), and the like, and fixed terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 6, electronic device 600 may include a processing means (e.g., central processing unit, graphics processor, etc.) 601 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage means 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the electronic device 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, image sensor, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 illustrates an electronic device 600 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 609, or may be installed from the storage means 608, or may be installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring a first image layer image; acquiring a second image layer image; calculating a first image area after the first image layer image and the second image layer image are superposed; and rendering to generate a first image according to the first image area.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of an element does not in some cases constitute a limitation on the element itself.
The foregoing description is only exemplary of the preferred embodiments of the disclosure and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the disclosure herein is not limited to the particular combination of features described above, but also encompasses other embodiments in which any combination of the features described above or their equivalents does not depart from the spirit of the disclosure. For example, the above features and (but not limited to) the features disclosed in this disclosure having similar functions are replaced with each other to form the technical solution.
Claims (14)
1. An image generation method, comprising:
acquiring a first image layer image;
acquiring a second image layer image;
calculating a first image area after the first image layer image and the second image layer image are superposed;
and rendering to generate a first image according to the first image area.
2. The image generation method according to claim 1, wherein the obtaining the first image-layer image includes:
acquiring an image to be processed;
and extracting a foreground image of the image to be processed as a first image layer image.
3. The image generation method of claim 2, wherein the acquiring the image to be processed comprises:
and acquiring a video image, and taking a current video image frame of the video image as an image to be processed.
4. The image generation method according to claim 2, wherein the extracting a foreground image of the image to be processed as the first layer image includes:
acquiring an extraction template image of a foreground image;
and superposing the template image and the image to be processed to obtain the foreground image as a first image layer image.
5. The image generation method according to claim 4, wherein the superimposing the template image and the image to be processed to obtain the foreground image as a first image-layer image includes:
and superposing the template image and the image to be processed in a transparency manner to obtain the foreground image as a first image layer image.
6. The image generation method according to claim 2, wherein the obtaining the second image-layer image includes:
moving the first layer image to obtain a moving image of the first layer;
calculating a superposition image of the first layer image and the moving image of the first layer according to a first logic calculation rule;
and taking the superposed image as a second image layer image.
7. The image generation method according to claim 1, wherein the calculating the first image area after the first image-layer image and the second image-layer image are superimposed includes:
and calculating a first image area after the first image layer image and the second image layer image are overlapped according to a second logic calculation rule.
8. The image generation method of claim 1, wherein the rendering the first image from the first image region comprises:
and rendering to generate a first image according to the front-back relation of the first image layer image and the second image layer image in the first image area.
9. The image generation method according to claim 6, wherein the calculating of the superimposed image of the first layer image and the moving image of the first layer according to the first logic calculation rule includes:
calculating a first intersection image of the first image layer image and the moving image of the first image layer;
calculating a reverse image of the intersection image according to the first intersection image;
and calculating a second intersection image of the reverse image of the intersection image and the moving image of the first image layer.
10. The image generation method according to claim 7, wherein the calculating the first image area after the first layer image and the second layer image are superimposed according to the second logic calculation rule includes:
calculating a union set image of the first image layer image and the second image layer image;
and taking the image area of the union image as a first image area.
11. The image generation method according to claim 6, wherein the moving the image of the first layer to obtain a moving image of the first layer includes:
moving the first layer image for multiple times to obtain a plurality of moving images of the first layer;
the calculating the superimposed image of the first layer image and the moving image of the first layer according to the first logic calculation rule includes:
and calculating the superposed images of the first layer images and the moving images of the plurality of first layers according to a first logic calculation rule.
12. An image generation apparatus, comprising:
the first image acquisition module is used for acquiring a first image layer image;
the second image acquisition module is used for acquiring a second image layer image;
the first image area calculation module is used for calculating a first image area after the first image layer image and the second image layer image are superposed;
and the rendering module is used for rendering and generating a first image according to the first image area.
13. An electronic device, comprising:
a memory for storing non-transitory computer readable instructions; and
a processor for executing the computer readable instructions such that the processor when executing implements the image generation method of any of claims 1-11.
14. A computer-readable storage medium storing non-transitory computer-readable instructions that, when executed by a computer, cause the computer to perform the image generation method of any one of claims 1-11.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910101336.XA CN110070585A (en) | 2019-01-31 | 2019-01-31 | Image generating method, device and computer readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110070585A true CN110070585A (en) | 2019-07-30 |
Family
ID=67366119
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910101336.XA Pending CN110070585A (en) | 2019-01-31 | 2019-01-31 | Image generating method, device and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110070585A (en) |
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1588453A (en) * | 2004-07-15 | 2005-03-02 | 浙江大学 | Travel-in-picture method based on relative depth computing |
CN102572305A (en) * | 2011-12-20 | 2012-07-11 | 深圳市万兴软件有限公司 | Method and system for processing video image |
CN102663786A (en) * | 2012-03-30 | 2012-09-12 | 惠州Tcl移动通信有限公司 | Layer superposition method and mobile terminal employing the same |
CN104038700A (en) * | 2014-06-26 | 2014-09-10 | Tcl集团股份有限公司 | Picture taking method and device |
CN105227860A (en) * | 2014-07-02 | 2016-01-06 | 索尼公司 | Image generating method, device and mobile terminal |
US20180095711A1 (en) * | 2016-09-30 | 2018-04-05 | Tomoki KANDA | Communication terminal, communication system, transmission method, and recording medium storing program |
CN107016976A (en) * | 2017-05-31 | 2017-08-04 | 西安诺瓦电子科技有限公司 | Display control method and device and display screen system |
CN108062510A (en) * | 2017-11-17 | 2018-05-22 | 维库(厦门)信息技术有限公司 | Dynamic display method and computer equipment during a kind of multiple target tracking fructufy |
CN108234825A (en) * | 2018-01-12 | 2018-06-29 | 广州市百果园信息技术有限公司 | Method for processing video frequency and computer storage media, terminal |
CN108805849A (en) * | 2018-05-22 | 2018-11-13 | 北京京东金融科技控股有限公司 | Image interfusion method, device, medium and electronic equipment |
Non-Patent Citations (1)
Title |
---|
希望之星海: "PS tip: duplicating and transforming within the same layer" (ps小技巧:同一图层的复制变换), Baidu Wenku, HTTPS://WENKU.BAIDU.COM/VIEW/237DB6F50508763230121200.HTML?FROM=SEARCH *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112347301A (en) * | 2019-08-09 | 2021-02-09 | 北京字节跳动网络技术有限公司 | Image special effect processing method and device, electronic equipment and computer readable storage medium |
CN112348748A (en) * | 2019-08-09 | 2021-02-09 | 北京字节跳动网络技术有限公司 | Image special effect processing method and device, electronic equipment and computer readable storage medium |
CN112583996A (en) * | 2019-09-29 | 2021-03-30 | 北京嗨动视觉科技有限公司 | Video processing method and video processing device |
CN112849154A (en) * | 2021-01-27 | 2021-05-28 | 广州路派电子科技有限公司 | Backing track and image auxiliary system and method |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190730