CN110782415A - Image completion method and device and terminal equipment - Google Patents


Info

Publication number: CN110782415A
Application number: CN201911061908.2A
Authority: CN (China)
Prior art keywords: frame, optical flow, image, completion, network
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 刘烨枫
Current and original assignee: Hefei Map Duck Mdt Infotech Ltd
Application filed by Hefei Map Duck Mdt Infotech Ltd
Priority to CN201911061908.2A
Publication of CN110782415A

Classifications

    • G06T5/77
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/045 Combinations of networks
    • G06N3/08 Learning methods
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/269 Analysis of motion using gradient-based methods
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence

Abstract

The invention relates to the technical field of image completion and provides an image completion method and device, comprising the following steps: calculating the forward optical flow and the backward optical flow from the nth frame to the (n+1)th frame according to an optical flow network; obtaining an (n+2)th frame construction frame according to the backward optical flow from the nth frame to the (n+1)th frame, and obtaining an optical flow occlusion region according to the forward optical flow from the nth frame to the (n+1)th frame and the (n+2)th frame construction frame; and inputting the (n+2)th frame construction frame and the mask corresponding to the optical flow occlusion region into an image completion network to obtain a completed image. The method calculates the occlusion region using the forward optical flow of two consecutive frames of the video and completes it with a convolutional neural network, thereby solving the ghosting problem arising in optical flow calculation.

Description

Image completion method and device and terminal equipment
Technical Field
The invention belongs to the technical field of image completion, and particularly relates to an image completion method and terminal equipment.
Background
Traditional image completion algorithms are mainly based on mathematical and physical methods; deep learning, however, has achieved excellent results in the vision field, image completion among them.
Since ghosting may occur in a video when two consecutive frames are related through optical flow calculation, it is necessary to provide an image completion solution.
Disclosure of Invention
In view of this, embodiments of the present invention provide an image completion method and terminal equipment, so as to solve the problem in the prior art that ghosting is generated through optical flow calculation.
A first aspect of an embodiment of the present invention provides an image completion method, including:
calculating the forward optical flow and the backward optical flow from the nth frame to the (n+1)th frame according to an optical flow network;
obtaining an (n+2)th frame construction frame according to the backward optical flow from the nth frame to the (n+1)th frame, and obtaining an optical flow occlusion region according to the forward optical flow from the nth frame to the (n+1)th frame and the (n+2)th frame construction frame;
and inputting the (n+2)th frame construction frame and the mask corresponding to the optical flow occlusion region into an image completion network to obtain a completed image.
Further, the calculating of the forward optical flow and the backward optical flow from the nth frame to the (n+1)th frame according to the optical flow network includes:
calculating the change of the pixels of the nth frame image to the pixels of the (n+1)th frame image in the time domain to obtain the forward optical flow and the backward optical flow.
Further, the obtaining of the (n+2)th frame construction frame according to the backward optical flow from the nth frame to the (n+1)th frame includes:
obtaining the (n+2)th frame construction frame according to the backward optical flow from the nth frame to the (n+1)th frame, based on the (n+1)th frame.
Further, the obtaining of the optical flow occlusion region according to the forward optical flow from the nth frame to the (n+1)th frame and the (n+2)th frame construction frame includes:
obtaining the optical flow occlusion region in the (n+2)th frame construction frame according to the forward optical flow from the nth frame to the (n+1)th frame, based on the (n+2)th frame construction frame.
Further, the training method of the completion network comprises:
inputting the (n+2)th frame construction frame;
calculating the loss function between the (n+2)th frame construction frame and the (n+2)th frame original image;
updating the gradient according to the loss function;
adjusting parameters of the completion network based on the gradient update.
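The four training steps above can be condensed into one iteration sketch. This is an assumption-laden toy: parameters and frames are flat lists of floats, and `grad_fn` is a hypothetical stand-in for the backpropagation that a deep learning framework would perform.

```python
def train_step(weights, frame_built, frame_true, grad_fn, lr):
    """One training iteration: compute the MSE loss between the constructed
    frame and the original, obtain the gradient (grad_fn stands in for
    backpropagation), and update the completion-network parameters."""
    n = len(frame_built)
    loss = sum((b - t) ** 2 for b, t in zip(frame_built, frame_true)) / n
    grads = grad_fn(weights, loss)
    new_weights = [w - lr * g for w, g in zip(weights, grads)]
    return new_weights, loss

# Toy run with a constant-gradient stand-in for backpropagation:
new_w, loss = train_step([1.0], [2.0, 4.0], [0.0, 0.0],
                         lambda w, l: [0.5], lr=0.5)
# loss == 10.0, new_w == [0.75]
```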
A second aspect of an embodiment of the present invention provides an image complementing apparatus, including:
the optical flow calculation module is used for calculating the forward optical flow and the backward optical flow from the nth frame to the (n+1)th frame according to an optical flow network;
the (n+2)th frame construction frame and optical flow occlusion module is used for obtaining the (n+2)th frame construction frame according to the backward optical flow from the nth frame to the (n+1)th frame, and obtaining the optical flow occlusion region according to the forward optical flow from the nth frame to the (n+1)th frame and the (n+2)th frame construction frame;
and the image completion module is used for inputting the (n+2)th frame construction frame and the mask corresponding to the optical flow occlusion region into the image completion network to obtain a completed image.
Further, the optical flow calculation module includes:
and the optical flow calculating unit is used for calculating the change of the pixels of the nth frame image to the pixels of the (n + 1) th frame image in a time domain to obtain a forward optical flow and a backward optical flow.
Further, the image completion device further comprises a completion network training module, and the completion network training module comprises:
an input unit for inputting the (n + 2) th frame construction frame;
the loss function unit is used for calculating the loss function between the (n+2)th frame construction frame and the (n+2)th frame original image;
a gradient updating unit for performing gradient updating according to the loss function;
and the parameter updating unit is used for updating and adjusting the parameters of the completion network based on the gradient.
A third aspect of embodiments of the present invention provides an image complementing terminal device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the method provided in the first aspect when executing the computer program.
A fourth aspect of embodiments of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method as provided in the first aspect above.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
the method utilizes forward optical flows of two continuous frames in the video to calculate the shielding area and utilizes a convolution neural network to complement so as to solve the problem of ghosting generated in optical flow calculation.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a schematic flow chart of an implementation of an image completion method according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a completion network according to an embodiment of the present invention;
fig. 3 is a schematic flow chart of a completion network training method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of an image completion apparatus according to an embodiment of the present invention;
fig. 5 is a schematic diagram of an image completion terminal device according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to explain the technical means of the present invention, the following description will be given by way of specific examples.
Example one
Fig. 1 shows the implementation flow of an image completion method according to an embodiment of the present invention. The execution subject of the method may be a terminal device. The details are as follows:
step S101, calculating forward optical flows and backward optical flows of the nth frame to the (n + 1) th frame according to the optical flow network.
Optionally, optical flow uses the change of the pixels of an image sequence in the time domain and the correlation between adjacent frames to establish the correspondence between two adjacent frames, and thereby calculates the motion information of objects between the frames. The nth frame and the (n+1)th frame are input into a preset optical flow network to obtain the forward optical flow and the backward optical flow. Further, the optical flow network includes two network structures: FlowNetS (FlowNetSimple) and FlowNetC (FlowNetCorr). FlowNetS directly stacks the two input images along the channel dimension; its network structure contains only convolutional layers. FlowNetC first extracts features from the two input images separately and then computes the correlation of the features, that is, the features of the two images are convolved with each other in the spatial dimension.
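The difference between the two input schemes can be sketched as follows. This is a hypothetical, framework-free illustration in which a pixel (or feature vector) is a plain list of channel values: `flownet_s_input` mimics FlowNetS's channel-wise stacking, and `correlation` mimics a single displacement of FlowNetC's correlation layer; both names are illustrative, not from the patent.

```python
def flownet_s_input(frame_a, frame_b):
    """FlowNetS feeds the two frames stacked along the channel dimension:
    each pixel becomes the concatenation of its channels from both frames."""
    return [[pa + pb for pa, pb in zip(ra, rb)]
            for ra, rb in zip(frame_a, frame_b)]

def correlation(feat_a, feat_b, y, x, dy, dx):
    """FlowNetC-style correlation for one displacement: dot product between
    the feature vector at (y, x) in map A and the vector at (y+dy, x+dx)
    in map B."""
    va = feat_a[y][x]
    vb = feat_b[y + dy][x + dx]
    return sum(a * b for a, b in zip(va, vb))

# 1x1 frames with 2 channels each:
stacked = flownet_s_input([[[1, 2]]], [[[3, 4]]])   # [[[1, 2, 3, 4]]]
score = correlation([[[1, 2]]], [[[3, 4]]], 0, 0, 0, 0)  # 1*3 + 2*4 = 11
```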
Step S102, obtaining an (n+2)th frame construction frame according to the backward optical flow from the nth frame to the (n+1)th frame, and obtaining an optical flow occlusion region according to the forward optical flow from the nth frame to the (n+1)th frame and the (n+2)th frame construction frame.
Optionally, the (n+2)th frame construction frame is obtained from the backward optical flow computed in step S101. A coordinate grid with the same dimensions as the forward optical flow is then created, and the forward optical flow is added to the grid to obtain new coordinates; target pixels on which no new coordinate lands are optical flow holes, and the union of these blank blocks is counted as the optical flow occlusion region.
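The hole-counting step can be sketched directly from that description. This is a simplified illustration (nearest-neighbour rounding, a hypothetical `occlusion_mask` helper, and the mask convention defined later in the text: 0 = hole to be filled, 1 = valid pixel); the patent does not specify an implementation.

```python
def occlusion_mask(flow):
    """Forward-warp a coordinate grid by the forward flow; target pixels that
    no source pixel lands on are optical-flow holes (the occlusion region)."""
    h, w = len(flow), len(flow[0])
    mask = [[0] * w for _ in range(h)]   # start with all holes
    for y in range(h):
        for x in range(w):
            dy, dx = flow[y][x]
            ty, tx = round(y + dy), round(x + dx)
            if 0 <= ty < h and 0 <= tx < w:
                mask[ty][tx] = 1         # covered by some source pixel
    return mask

# A uniform rightward flow of (0, +1) leaves the leftmost column uncovered:
flow = [[(0, 1)] * 3 for _ in range(2)]
mask = occlusion_mask(flow)
# mask == [[0, 1, 1], [0, 1, 1]]
```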
Step S103, inputting the (n+2)th frame construction frame and the mask corresponding to the optical flow occlusion region into the image completion network to obtain a completed image.
Optionally, as shown in fig. 2, the completion network PGnet is constructed in advance from a U-Net structure combined with partial convolution (convolution conditioned only on valid pixels) using a binary mask: a mask value of 0 represents an optical flow hole, and a mask value of 1 represents a pixel that needs no repair. Each partial convolution layer fills only the hole region and updates the binary mask in real time until the repair is complete.
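A single partial-convolution step of the kind described above can be sketched as follows. This is a minimal single-channel, bias-free version under the usual partial-convolution rule (convolve only valid pixels, renormalise by the valid count, mark the window valid if it saw any valid pixel); the function name and list-of-rows representation are illustrative, not from the patent.

```python
def partial_conv(img, mask, kernel):
    """One partial-convolution step: convolve only over valid pixels,
    renormalise by the number of valid pixels in the window, and mark any
    window that saw at least one valid pixel as valid in the updated mask."""
    h, w = len(img), len(img[0])
    k = len(kernel) // 2
    win = len(kernel) * len(kernel[0])   # total window size
    out = [[0.0] * w for _ in range(h)]
    new_mask = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, valid = 0.0, 0
            for i in range(-k, k + 1):
                for j in range(-k, k + 1):
                    yy, xx = y + i, x + j
                    if 0 <= yy < h and 0 <= xx < w and mask[yy][xx]:
                        acc += kernel[i + k][j + k] * img[yy][xx]
                        valid += 1
            if valid > 0:
                out[y][x] = acc * win / valid   # renormalise by valid count
                new_mask[y][x] = 1
            # windows with no valid pixel stay 0 and remain masked
    return out, new_mask
```

Stacking such layers progressively shrinks the hole region, because each pass marks as valid every output pixel whose window touched at least one valid input pixel.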
Further, as shown in fig. 3, the method for training the completion network includes:
step S301, inputting the n +2 th frame building frame.
Step S302 is to calculate the loss function of the n +2 th frame and the n +2 th frame original image.
Alternatively, the above-mentioned loss function of the n +2 th frame construction frame and the n +2 th frame original may use MSE (mean square error).
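For concreteness, the MSE between a constructed frame and the original, averaged over all pixels, is simply (this toy helper and its list-of-rows representation are illustrative, not from the patent):

```python
def mse(pred, target):
    """Mean squared error between a constructed frame and the original,
    averaged over all pixels."""
    n = len(pred) * len(pred[0])
    return sum((p - t) ** 2
               for pr, tr in zip(pred, target)
               for p, t in zip(pr, tr)) / n

# mse([[0.0, 2.0]], [[0.0, 0.0]]) == 2.0   (squared errors 0 and 4, mean 2)
```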
Step S303, performing gradient update according to the loss function.
Alternatively, the formula for the gradient update is shown in equation (1):
W′=W-αΔW (1)
where W represents the weight parameter of the completion network, W' represents the updated weight parameter, α is the preset learning rate, and Δ W is the calculated gradient.
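Equation (1) is a plain gradient-descent step; element-wise over the weights it reads (values in the example are arbitrary, chosen only so the arithmetic is exact):

```python
def sgd_step(weights, grads, lr):
    """Plain gradient-descent update W' = W - alpha * dW from equation (1),
    applied element-wise to a flat list of weights."""
    return [w - lr * g for w, g in zip(weights, grads)]

# sgd_step([1.0, -2.0], [0.5, 1.0], lr=0.5) == [0.75, -2.5]
```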
Alternatively, an existing adaptive gradient optimizer can be used to perform the gradient update; in particular, the Adam optimizer may be used. Further, the MSE result, the weight parameters of the neural network, and the preset learning rate are input into the Adam optimizer to obtain the updated weight parameters.
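The patent names the Adam optimizer without detail; for reference, a single-scalar Adam update following the standard formulation looks like this (hyperparameter defaults are Adam's usual ones, not values from the patent):

```python
import math

def adam_step(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update for a single scalar parameter: exponential moving
    averages of the gradient (m) and squared gradient (v), bias-corrected
    by the step count t (starting at 1)."""
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    w = w - lr * m_hat / (math.sqrt(v_hat) + eps)
    return w, m, v

# On the first step the bias correction makes the update ~ lr * sign(g):
w, m, v = adam_step(0.0, 1.0, 0.0, 0.0, t=1)
```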
Step S304, adjusting the parameters of the completion network based on the gradient update.
Further, the weight parameters of the neural network are adjusted through the gradient update.
Optionally, the updated weight parameter obtained by the calculation replaces the original weight parameter in the completion network, so as to become a new completion network.
Further, the completion network is trained until the performance index of the generated image reaches a preset threshold. The performance indexes of the images generated by the completion network include the peak signal-to-noise ratio PSNR (Peak Signal-to-Noise Ratio) and the bit rate BPP (bits per pixel). Optionally, at a fixed BPP it is determined whether the PSNR reaches the preset threshold; a higher PSNR represents less information lost relative to the original picture.
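PSNR is computed from the MSE against the original image; a minimal sketch for 8-bit images (peak value 255) follows. The helper name and list-of-rows representation are illustrative.

```python
import math

def psnr(pred, target, peak=255.0):
    """Peak signal-to-noise ratio in dB: 10*log10(peak^2 / MSE).
    Higher means less information lost; identical images give infinity."""
    n = len(pred) * len(pred[0])
    err = sum((p - t) ** 2
              for pr, tr in zip(pred, target)
              for p, t in zip(pr, tr)) / n
    if err == 0:
        return float("inf")
    return 10.0 * math.log10(peak * peak / err)

# A maximal per-pixel error of 255 gives MSE = 255^2, i.e. 0 dB:
worst = psnr([[255.0]], [[0.0]])
```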
In this embodiment, the occlusion region is calculated using the forward optical flow of two consecutive frames of the video and completed with a convolutional neural network, which solves the ghosting problem arising in optical flow calculation.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Example two
Fig. 4 is a block diagram of an image completion apparatus according to an embodiment of the present invention; for convenience of explanation, only the portion related to the embodiment is shown. The image completion device 4 includes: an optical flow calculation module 41, an (n+2)th frame construction frame and optical flow occlusion module 42, and an image completion module 43.
Wherein, the optical flow calculating module 41 is configured to calculate forward optical flows and backward optical flows of the nth frame to the (n + 1) th frame according to an optical flow network;
an (n+2)th frame construction frame and optical flow occlusion module 42, configured to obtain the (n+2)th frame construction frame according to the backward optical flow from the nth frame to the (n+1)th frame, and to obtain the optical flow occlusion region according to the forward optical flow from the nth frame to the (n+1)th frame and the (n+2)th frame construction frame;
and an image completion module 43, configured to input the (n+2)th frame construction frame and the mask corresponding to the optical flow occlusion region into the image completion network to obtain a completed image.
Further, the optical flow calculation module 41 includes:
and the optical flow calculating unit is used for calculating the change of the pixels of the nth frame image to the pixels of the (n + 1) th frame image in a time domain to obtain a forward optical flow and a backward optical flow.
Further, the image completing device 4 further includes a completing network training module 44, and the completing network training module 44 includes:
an input unit for inputting the (n + 2) th frame construction frame;
a loss function unit, configured to calculate the loss function between the (n+2)th frame construction frame and the (n+2)th frame original image;
a gradient updating unit for performing gradient updating according to the loss function;
and the parameter updating unit is used for updating and adjusting the parameters of the completion network based on the gradient.
This embodiment calculates the occlusion region using the forward optical flow of two consecutive frames of the video and completes it with a convolutional neural network, which solves the ghosting problem arising in optical flow calculation.
EXAMPLE III
Fig. 5 is a schematic diagram of an image complementing terminal device according to an embodiment of the present invention. As shown in fig. 5, the image complementing terminal device 5 of this embodiment includes: a processor 50, a memory 51 and a computer program 52, such as an image complementing program, stored in said memory 51 and executable on said processor 50. The processor 50 executes the computer program 52 to implement the steps in the various embodiments of the image completion method, such as the steps 101 to 103 shown in fig. 1. Alternatively, the processor 50 executes the computer program 52 to realize the functions of the modules/units in the device embodiments, such as the modules 41 to 43 shown in fig. 4.
Illustratively, the computer program 52 may be divided into one or more modules/units, which are stored in the memory 51 and executed by the processor 50 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 52 in the image completion terminal device 5. For example, the computer program 52 may be divided into an optical flow calculation module, an (n+2)th frame construction frame and optical flow occlusion module, and an image completion module, where the specific functions of the modules are as follows:
the optical flow calculation module is used for calculating forward optical flows and backward optical flows of the nth frame to the (n + 1) th frame according to the optical flow network;
the (n+2)th frame construction frame and optical flow occlusion module is used for obtaining the (n+2)th frame construction frame according to the backward optical flow from the nth frame to the (n+1)th frame, and obtaining the optical flow occlusion region according to the forward optical flow from the nth frame to the (n+1)th frame and the (n+2)th frame construction frame;
and the image completion module is used for inputting the (n+2)th frame construction frame and the mask corresponding to the optical flow occlusion region into the image completion network to obtain a completed image.
The image completion terminal device 5 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The image complementing terminal device may include, but is not limited to, the processor 50 and the memory 51. It will be understood by those skilled in the art that fig. 5 is merely an example of the image complementing terminal device 5, and does not constitute a limitation to the image complementing terminal device 5, and may include more or less components than those shown, or combine some components, or different components, for example, the image complementing terminal device may further include an input-output device, a network access device, a bus, and the like.
The processor 50 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 51 may be an internal storage unit of the image complementing terminal device 5, such as a hard disk or a memory of the image complementing terminal device 5. The memory 51 may be an external storage device of the image complementing terminal device 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the image complementing terminal device 5. Further, the memory 51 may include both an internal storage unit of the image complementing terminal device 5 and an external storage device. The memory 51 is used to store the computer program and other programs and data required by the image complementing terminal device. The above-mentioned memory 51 may also be used to temporarily store data that has been output or is to be output.
From the above, this embodiment calculates the occlusion region using the forward optical flow of two consecutive frames of the video and completes it with a convolutional neural network, thereby solving the ghosting problem arising in optical flow calculation.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned functions may be distributed as different functional units and modules according to needs, that is, the internal structure of the apparatus may be divided into different functional units or modules to implement all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the above modules or units is only one logical function division, and there may be other division manners in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units described above, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium and can implement the steps of the embodiments of the method when the computer program is executed by a processor. The computer program includes computer program code, and the computer program code may be in a source code form, an object code form, an executable file or some intermediate form. The computer readable medium may include: any entity or device capable of carrying the above-mentioned computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signal, telecommunication signal, software distribution medium, etc. It should be noted that the computer readable medium described above may be suitably increased or decreased as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media excludes electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The above embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. An image completion method, comprising:
calculating a forward optical flow and a backward optical flow from the nth frame to the (n+1)th frame according to an optical flow network;
obtaining an (n+2)th frame construction frame according to the backward optical flow from the nth frame to the (n+1)th frame, and obtaining an optical flow occlusion region according to the forward optical flow from the nth frame to the (n+1)th frame and the (n+2)th frame construction frame;
and inputting the (n+2)th frame construction frame and a mask corresponding to the optical flow occlusion region into an image completion network to obtain a completed image.
2. The image completion method according to claim 1, wherein the calculating of the forward optical flow and the backward optical flow from the nth frame to the (n+1)th frame according to the optical flow network comprises:
calculating the change of the pixels of the nth frame image to the pixels of the (n+1)th frame image in the time domain to obtain the forward optical flow and the backward optical flow.
3. The image completion method according to claim 1, wherein the obtaining of the (n+2)th frame construction frame according to the backward optical flow from the nth frame to the (n+1)th frame comprises:
obtaining the (n+2)th frame construction frame according to the backward optical flow from the nth frame to the (n+1)th frame, based on the (n+1)th frame.
4. The image completion method according to claim 1, wherein the obtaining of the optical flow occlusion region according to the forward optical flow from the nth frame to the (n+1)th frame and the (n+2)th frame construction frame comprises:
obtaining the optical flow occlusion region in the (n+2)th frame construction frame according to the forward optical flow from the nth frame to the (n+1)th frame, based on the (n+2)th frame construction frame.
5. The image completion method of claim 1, wherein the training method of the completion network comprises:
inputting the (n+2)th frame construction frame;
calculating a loss function between the (n+2)th frame construction frame and the (n+2)th frame original image;
updating the gradient according to the loss function;
adjusting parameters of the completion network based on the gradient update.
6. An image complementing apparatus, comprising:
the optical flow calculation module is used for calculating a forward optical flow and a backward optical flow from the nth frame to the (n+1)th frame according to an optical flow network;
the (n+2)th frame construction frame and optical flow occlusion module is used for obtaining an (n+2)th frame construction frame according to the backward optical flow from the nth frame to the (n+1)th frame, and obtaining an optical flow occlusion region according to the forward optical flow from the nth frame to the (n+1)th frame and the (n+2)th frame construction frame;
and the image completion module is used for inputting the (n+2)th frame construction frame and a mask corresponding to the optical flow occlusion region into an image completion network to obtain a completed image.
7. The image completion apparatus according to claim 6, wherein the optical flow calculation module comprises:
an optical flow calculation unit, configured to calculate the change in the time domain from the pixels of the nth frame image to the pixels of the (n+1)th frame image to obtain the forward optical flow and the backward optical flow.
8. The image completion apparatus of claim 6, further comprising a completion network training module, the completion network training module comprising:
an input unit, configured to input the (n+2)th frame construction frame;
a loss function unit, configured to calculate the loss function between the (n+2)th frame construction frame and the original (n+2)th frame image;
a gradient update unit, configured to perform a gradient update according to the loss function;
a parameter update unit, configured to adjust the parameters of the completion network based on the gradient update.
9. An image completion terminal device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any one of claims 1 to 5 when executing the computer program.
10. A computer-readable storage medium storing a computer program which, when executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
CN201911061908.2A 2019-11-01 2019-11-01 Image completion method and device and terminal equipment Pending CN110782415A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911061908.2A CN110782415A (en) 2019-11-01 2019-11-01 Image completion method and device and terminal equipment

Publications (1)

Publication Number Publication Date
CN110782415A 2020-02-11

Family

ID=69388530

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911061908.2A Pending CN110782415A (en) 2019-11-01 2019-11-01 Image completion method and device and terminal equipment

Country Status (1)

Country Link
CN (1) CN110782415A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106056630A (en) * 2016-06-06 2016-10-26 南昌航空大学 Occlusion region detection method based on image sequence optical flow and triangular mesh
CN106791279A (en) * 2016-12-30 2017-05-31 中国科学院自动化研究所 Motion compensation process and system based on occlusion detection
CN109889849A (en) * 2019-01-30 2019-06-14 北京市商汤科技开发有限公司 Video generation method, device, medium and equipment

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113592777A (en) * 2021-06-30 2021-11-02 北京旷视科技有限公司 Image fusion method and device for double-shooting and electronic system
CN115118948A (en) * 2022-06-20 2022-09-27 北京华录新媒信息技术有限公司 Method and device for repairing irregular occlusion in panoramic video
CN115118948B (en) * 2022-06-20 2024-04-05 北京华录新媒信息技术有限公司 Repairing method and device for irregular shielding in panoramic video

Similar Documents

Publication Publication Date Title
US10755173B2 (en) Video deblurring using neural networks
Bao et al. Memc-net: Motion estimation and motion compensation driven neural network for video interpolation and enhancement
JP7146091B2 (en) Information embedding method in video, computer equipment and computer program
US20180324465A1 (en) Edge-aware spatio-temporal filtering and optical flow estimation in real time
WO2021022685A1 (en) Neural network training method and apparatus, and terminal device
CN108600783B (en) Frame rate adjusting method and device and terminal equipment
CN110830808A (en) Video frame reconstruction method and device and terminal equipment
WO2020253103A1 (en) Video image processing method, device, apparatus, and storage medium
CN109309826B (en) Image color balancing method and device, terminal equipment and readable storage medium
CN110913218A (en) Video frame prediction method and device and terminal equipment
CN110197183B (en) Image blind denoising method and device, computer equipment and storage medium
CN110753225A (en) Video compression method and device and terminal equipment
CN110782415A (en) Image completion method and device and terminal equipment
CN110913219A (en) Video frame prediction method and device and terminal equipment
CN112967207A (en) Image processing method and device, electronic equipment and storage medium
CN110913230A (en) Video frame prediction method and device and terminal equipment
CN113744159B (en) Defogging method and device for remote sensing image and electronic equipment
CN111083478A (en) Video frame reconstruction method and device and terminal equipment
CN113989165A (en) Image processing method, image processing device, electronic equipment and storage medium
CN113222856A (en) Inverse halftone image processing method, terminal equipment and readable storage medium
CN113628259A (en) Image registration processing method and device
CN111861940A (en) Image toning enhancement method based on condition continuous adjustment
CN111083494A (en) Video coding method and device and terminal equipment
CN111083479A (en) Video frame prediction method and device and terminal equipment
CN110458754B (en) Image generation method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200211