CN116579964B - Dynamic frame gradual-in gradual-out dynamic fusion method and device - Google Patents
- Publication number
- CN116579964B (application CN202310577161.6A)
- Authority
- CN
- China
- Prior art keywords
- image data
- data
- fusion
- frame data
- dynamic
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a dynamic fusion method and device with fade-in and fade-out of moving frames. The method comprises the following steps: acquiring front moving frame data and rear moving frame data of an array camera; blurring the front moving frame data to obtain first fused image data; comparing the rear moving frame data with the first fused image data to obtain difference image data; and generating second fused image data according to the difference image data, then fusing the first fused image data with the second fused image data to obtain an image fusion result. The invention solves the technical problem that, in the prior art, dynamic light-field-camera footage is handled merely by splicing different moving frames and trimming the result into the final fused image data, so that during high-speed, real-time monitoring of multiple moving targets, adequate image processing cannot be achieved and the final quality of the image fusion suffers.
Description
Technical Field
The invention relates to the field of dynamic image data processing, and in particular to a dynamic fusion method and device with fade-in and fade-out of moving frames.
Background
With the continuous development of intelligent science and technology, intelligent devices are used more and more widely in people's daily life, work, and study, improving quality of life and increasing the efficiency of learning and working.
At present, in the domestic field of camera-array and light-field-camera surveillance, captured dynamic images are generally split and recombined according to the time window a user wants to analyze, so as to obtain monitoring footage that is easy to recognize and display intuitively. In the prior art, however, dynamic light-field-camera footage is handled merely by splicing different moving frames and trimming the result into the final fused image data; during high-speed, real-time monitoring of multiple moving targets, adequate image processing cannot be achieved, and the final quality of the image fusion suffers.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiment of the invention provides a dynamic fusion method and device with fade-in and fade-out of moving frames, so as to at least solve the technical problem that, in the prior art, dynamic light-field-camera footage is handled merely by splicing different moving frames and trimming the result into the final fused image data, so that during high-speed, real-time monitoring of multiple moving targets, adequate image processing cannot be achieved and the final quality of the image fusion suffers.
According to one aspect of an embodiment of the present invention, there is provided a dynamic fusion method with fade-in and fade-out of moving frames, including: acquiring front moving frame data and rear moving frame data of an array camera; blurring the front moving frame data to obtain first fused image data; comparing the rear moving frame data with the first fused image data to obtain difference image data; and generating second fused image data according to the difference image data, then fusing the first fused image data with the second fused image data to obtain an image fusion result.
Optionally, acquiring the front moving frame data and the rear moving frame data of the array camera includes: acquiring, in real time, timestamp data of the array camera's captures; and selecting the front moving frame data and the rear moving frame data according to the timestamp data and a reference time threshold.
Optionally, comparing the rear moving frame data with the first fused image data to obtain the difference image data includes: extracting all pixel parameters of the first fused image data, where the pixel parameters include pixel data, coordinate data, and boundary data; comparing the rear moving frame data against those pixel parameters to obtain a comparison result, where the comparison result is the difference value between the image data in the rear moving frame data and the first fused image data; and outputting the comparison result as the difference image data.
Optionally, generating the second fused image data according to the difference image data and fusing the first fused image data with the second fused image data to obtain the image fusion result includes: fading the difference image data to obtain the second fused image data; and, taking the first fused image data as the reference, superposing the content of the second fused image data onto the corresponding coordinates in the first fused image data to obtain the image fusion result.
According to another aspect of the embodiment of the present invention, there is also provided a dynamic fusion device with fade-in and fade-out of moving frames, including: an acquisition module, configured to acquire front moving frame data and rear moving frame data of an array camera; a processing module, configured to blur the front moving frame data to obtain first fused image data; a comparison module, configured to compare the rear moving frame data with the first fused image data to obtain difference image data; and a generating module, configured to generate second fused image data according to the difference image data and to fuse the first fused image data with the second fused image data to obtain an image fusion result.
Optionally, the acquisition module includes: an acquisition unit, configured to acquire, in real time, timestamp data of the array camera's captures; and a selection unit, configured to select the front moving frame data and the rear moving frame data according to the timestamp data and a reference time threshold.
Optionally, the comparison module includes: an extraction unit, configured to extract all pixel parameters of the first fused image data, where the pixel parameters include pixel data, coordinate data, and boundary data; a comparison unit, configured to compare the rear moving frame data against those pixel parameters to obtain a comparison result, where the comparison result is the difference value between the image data in the rear moving frame data and the first fused image data; and an output unit, configured to output the comparison result as the difference image data.
Optionally, the generating module includes: a fading unit, configured to fade the difference image data to obtain the second fused image data; and a superposition unit, configured to superpose, taking the first fused image data as the reference, the content of the second fused image data onto the corresponding coordinates in the first fused image data to obtain the image fusion result.
According to another aspect of the embodiment of the present invention, there is further provided a nonvolatile storage medium, where the nonvolatile storage medium includes a stored program, and when the program runs, the program controls a device in which the nonvolatile storage medium is located to execute a dynamic frame fade-in and fade-out dynamic fusion method.
According to another aspect of the embodiment of the present invention, there is also provided an electronic device including a processor and a memory; the memory stores computer-readable instructions, and the processor is configured to run the computer-readable instructions, which, when executed, perform a dynamic fusion method with fade-in and fade-out of moving frames.
In the embodiment of the invention, front moving frame data and rear moving frame data of an array camera are acquired; the front moving frame data are blurred to obtain first fused image data; the rear moving frame data are compared with the first fused image data to obtain difference image data; and second fused image data are generated from the difference image data and fused with the first fused image data to obtain an image fusion result. The method thereby solves the technical problem that, in the prior art, dynamic light-field-camera footage is handled merely by splicing different moving frames and trimming the result into the final fused image data, so that during high-speed, real-time monitoring of multiple moving targets, adequate image processing cannot be achieved and the final quality of the image fusion suffers.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a flow chart of a dynamic frame fade-in and fade-out dynamic fusion method according to an embodiment of the invention;
FIG. 2 is a block diagram of a dynamic frame fade-in and fade-out dynamic fusion device according to an embodiment of the present invention;
FIG. 3 is a block diagram of a terminal device for performing the method according to an embodiment of the invention;
FIG. 4 is a block diagram of a memory unit for holding or carrying program code that implements the method according to an embodiment of the invention.
Detailed Description
In order that those skilled in the art may better understand the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of protection of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with an embodiment of the present invention, a method embodiment of a dynamic frame fade-in and fade-out dynamic fusion method is provided, it being noted that the steps illustrated in the flowchart of the figures may be performed in a computer system, such as a set of computer executable instructions, and that, although a logical order is illustrated in the flowchart, in some cases, the steps illustrated or described may be performed in an order other than that illustrated herein.
Example 1
FIG. 1 is a flowchart of a dynamic fusion method with fade-in and fade-out of moving frames according to an embodiment of the invention. As shown in FIG. 1, the method includes the following steps:
step S102, front moving frame data and rear moving frame data of the array camera are collected.
Optionally, acquiring the front moving frame data and the rear moving frame data of the array camera includes: acquiring, in real time, timestamp data of the array camera's captures; and selecting the front moving frame data and the rear moving frame data according to the timestamp data and a reference time threshold.
Specifically, the prior art simply splices different moving frames of a light-field camera's dynamic footage and trims the result into the final fused image data; during high-speed, real-time monitoring of multiple moving targets, this cannot deliver adequate image processing and degrades the final fusion quality. To address this, the data of each camera in the light-field or array camera must first be collected and, according to the timestamp information, separated into the front moving frame data and the rear moving frame data of the dynamic image stream, where the action captured in the front moving frame data occurs earlier than a preset reference time threshold and the action captured in the rear moving frame data occurs later than that threshold.
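The timestamp-based selection described above can be illustrated with a short sketch. This is a hypothetical minimal example, not the patented implementation: the `Frame` structure, its field names, and the choice of a strict-before / at-or-after split around the reference threshold are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """A captured frame with its acquisition timestamp (illustrative fields)."""
    timestamp: float  # seconds since the start of capture
    pixels: object    # placeholder for the image payload

def split_by_reference(frames, reference_time):
    """Partition frames into front moving frame data (action before the
    reference time threshold) and rear moving frame data (action after it)."""
    front = [f for f in frames if f.timestamp < reference_time]
    rear = [f for f in frames if f.timestamp >= reference_time]
    return front, rear

# Four frames captured 40 ms apart, split around a 0.08 s threshold.
frames = [Frame(0.00, None), Frame(0.04, None), Frame(0.08, None), Frame(0.12, None)]
front, rear = split_by_reference(frames, reference_time=0.08)
```

In a real array-camera pipeline, each camera's stream would be partitioned this way before the fusion steps that follow.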
Step S104, the front moving frame data are blurred to obtain the first fused image data.
Specifically, in the embodiment of the invention, once the front moving frame data have been acquired, the pixels and image information in them must be blurred to a certain degree before any subsequent fused image data can be processed. The blurring may use a boundary identification value to delimit the blur region, so that the degree of blurring is controlled; this facilitates the fusion of the other image data that follow and produces fused image data that are convenient to analyze.
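The patent does not name a concrete blur algorithm, so the sketch below uses a naive box blur in NumPy as a stand-in; the kernel radius plays the role of the "degree of blurring", and the boundary-identification logic is omitted.

```python
import numpy as np

def box_blur(image, radius=1):
    """Average each pixel over a (2*radius+1)^2 neighbourhood; a simple
    stand-in for the unspecified blurring step of the patent."""
    h, w = image.shape
    padded = np.pad(image, radius, mode="edge")  # replicate borders
    out = np.zeros((h, w), dtype=np.float64)
    k = 2 * radius + 1
    for dy in range(k):          # accumulate the k*k shifted copies
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

# A single bright pixel spreads evenly over its neighbourhood.
prev = np.array([[0., 0., 0.],
                 [0., 9., 0.],
                 [0., 0., 0.]])
first_fused = box_blur(prev, radius=1)  # every entry becomes 1.0 here
```

A production implementation would more likely use a separable Gaussian; the point is only that the first fused image is a softened copy of the front moving frame data.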
Step S106, the rear moving frame data are compared with the first fused image data to obtain the difference image data.
Optionally, comparing the rear moving frame data with the first fused image data to obtain the difference image data includes: extracting all pixel parameters of the first fused image data, where the pixel parameters include pixel data, coordinate data, and boundary data; comparing the rear moving frame data against those pixel parameters to obtain a comparison result, where the comparison result is the difference value between the image data in the rear moving frame data and the first fused image data; and outputting the comparison result as the difference image data.
Specifically, once the first fused image data have been generated, the embodiment of the present invention compares the rear moving frame data with the first fused image data so as to find the difference between the images, that is, the difference value. All pixel parameters of the first fused image data are extracted, where the pixel parameters include pixel data, coordinate data, and boundary data; the rear moving frame data are compared against these parameters to obtain a comparison result, namely the difference value between the image data in the rear moving frame data and the first fused image data; and this comparison result is output as the difference image data, which serves as the data source for the subsequent fusion operation.
Step S108, second fused image data are generated according to the difference image data, and the first fused image data are fused with the second fused image data to obtain the image fusion result.
Optionally, generating the second fused image data according to the difference image data and fusing the first fused image data with the second fused image data to obtain the image fusion result includes: fading the difference image data to obtain the second fused image data; and, taking the first fused image data as the reference, superposing the content of the second fused image data onto the corresponding coordinates in the first fused image data to obtain the image fusion result.
Specifically, the difference between the image data contained in the rear moving frame data and that contained in the front moving frames, obtained by comparison against the first fused image, stems mainly from the movement and posture changes of the moving targets. The difference image data are therefore used to form the second fused image data that must be fused with the first fused image data; this is the key step by which the embodiment of the invention generates the final fused image data.
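The fading and superposition might look as follows; the linear intensity scaling used for "fading" and the additive, clipped overlay at matching coordinates are illustrative assumptions, since the patent fixes neither the weight nor the blending rule.

```python
import numpy as np

def fade(diff, alpha=0.6):
    """'Fade' the difference image by scaling its intensity, so the
    superposed content blends in gradually (alpha is an assumed weight)."""
    return alpha * diff

def superpose(first_fused, second_fused):
    """Overlay the faded difference onto the reference at the same
    coordinates, clipping to a valid intensity range."""
    return np.clip(first_fused + second_fused, 0.0, 255.0)

first_fused = np.full((2, 2), 100.0)
diff = np.array([[0.0, 0.0],
                 [0.0, 50.0]])     # change detected at one coordinate
second_fused = fade(diff)          # faded copy of the difference
result = superpose(first_fused, second_fused)
```

Varying `alpha` per frame (rising for fade-in, falling for fade-out) would give the gradual appearance and disappearance of moving content that the title describes.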
This embodiment thus solves the technical problem that, in the prior art, dynamic light-field-camera footage is handled merely by splicing different moving frames and trimming the result into the final fused image data, so that during high-speed, real-time monitoring of multiple moving targets, adequate image processing cannot be achieved and the final quality of the image fusion suffers.
Example 2
FIG. 2 is a block diagram of a dynamic fusion device with fade-in and fade-out of moving frames according to an embodiment of the present invention. As shown in FIG. 2, the device includes:
and the acquisition module 20 is used for acquiring the front moving frame data and the rear moving frame data of the array camera.
Optionally, the acquisition module includes: an acquisition unit, configured to acquire, in real time, timestamp data of the array camera's captures; and a selection unit, configured to select the front moving frame data and the rear moving frame data according to the timestamp data and a reference time threshold.
Specifically, the prior art simply splices different moving frames of a light-field camera's dynamic footage and trims the result into the final fused image data; during high-speed, real-time monitoring of multiple moving targets, this cannot deliver adequate image processing and degrades the final fusion quality. To address this, the data of each camera in the light-field or array camera must first be collected and, according to the timestamp information, separated into the front moving frame data and the rear moving frame data of the dynamic image stream, where the action captured in the front moving frame data occurs earlier than a preset reference time threshold and the action captured in the rear moving frame data occurs later than that threshold.
a processing module 22, configured to blur the front moving frame data to obtain first fused image data.
Specifically, in the embodiment of the invention, once the front moving frame data have been acquired, the pixels and image information in them must be blurred to a certain degree before any subsequent fused image data can be processed. The blurring may use a boundary identification value to delimit the blur region, so that the degree of blurring is controlled; this facilitates the fusion of the other image data that follow and produces fused image data that are convenient to analyze.
a comparison module 24, configured to compare the rear moving frame data with the first fused image data to obtain difference image data.
Optionally, the comparison module includes: an extraction unit, configured to extract all pixel parameters of the first fused image data, where the pixel parameters include pixel data, coordinate data, and boundary data; a comparison unit, configured to compare the rear moving frame data against those pixel parameters to obtain a comparison result, where the comparison result is the difference value between the image data in the rear moving frame data and the first fused image data; and an output unit, configured to output the comparison result as the difference image data.
Specifically, once the first fused image data have been generated, the embodiment of the present invention compares the rear moving frame data with the first fused image data so as to find the difference between the images, that is, the difference value. All pixel parameters of the first fused image data are extracted, where the pixel parameters include pixel data, coordinate data, and boundary data; the rear moving frame data are compared against these parameters to obtain a comparison result, namely the difference value between the image data in the rear moving frame data and the first fused image data; and this comparison result is output as the difference image data, which serves as the data source for the subsequent fusion operation.
a generating module 26, configured to generate second fused image data according to the difference image data and to fuse the first fused image data with the second fused image data to obtain an image fusion result.
Optionally, the generating module includes: a fading unit, configured to fade the difference image data to obtain the second fused image data; and a superposition unit, configured to superpose, taking the first fused image data as the reference, the content of the second fused image data onto the corresponding coordinates in the first fused image data to obtain the image fusion result.
Specifically, the difference between the image data contained in the rear moving frame data and that contained in the front moving frames, obtained by comparison against the first fused image, stems mainly from the movement and posture changes of the moving targets. The difference image data are therefore used to form the second fused image data that must be fused with the first fused image data; this is the key step by which the embodiment of the invention generates the final fused image data.
This embodiment thus solves the technical problem that, in the prior art, dynamic light-field-camera footage is handled merely by splicing different moving frames and trimming the result into the final fused image data, so that during high-speed, real-time monitoring of multiple moving targets, adequate image processing cannot be achieved and the final quality of the image fusion suffers.
According to another aspect of the embodiment of the present invention, there is further provided a nonvolatile storage medium, where the nonvolatile storage medium includes a stored program, and when the program runs, the program controls a device in which the nonvolatile storage medium is located to execute a dynamic frame fade-in and fade-out dynamic fusion method.
Specifically, the method includes the following steps: acquiring front moving frame data and rear moving frame data of an array camera; blurring the front moving frame data to obtain first fused image data; comparing the rear moving frame data with the first fused image data to obtain difference image data; and generating second fused image data according to the difference image data, then fusing the first fused image data with the second fused image data to obtain an image fusion result. Optionally, acquiring the front moving frame data and the rear moving frame data of the array camera includes: acquiring, in real time, timestamp data of the array camera's captures; and selecting the front moving frame data and the rear moving frame data according to the timestamp data and a reference time threshold. Optionally, comparing the rear moving frame data with the first fused image data to obtain the difference image data includes: extracting all pixel parameters of the first fused image data, where the pixel parameters include pixel data, coordinate data, and boundary data; comparing the rear moving frame data against those pixel parameters to obtain a comparison result, where the comparison result is the difference value between the image data in the rear moving frame data and the first fused image data; and outputting the comparison result as the difference image data. Optionally, generating the second fused image data according to the difference image data and fusing the first fused image data with the second fused image data to obtain the image fusion result includes: fading the difference image data to obtain the second fused image data; and, taking the first fused image data as the reference, superposing the content of the second fused image data onto the corresponding coordinates in the first fused image data to obtain the image fusion result.
According to another aspect of the embodiment of the present invention, there is also provided an electronic device including a processor and a memory; the memory stores computer-readable instructions, and the processor is configured to run the computer-readable instructions, which, when executed, perform a dynamic fusion method with fade-in and fade-out of moving frames.
Specifically, the method includes the following steps: acquiring front moving frame data and rear moving frame data of an array camera; blurring the front moving frame data to obtain first fused image data; comparing the rear moving frame data with the first fused image data to obtain difference image data; and generating second fused image data according to the difference image data, then fusing the first fused image data with the second fused image data to obtain an image fusion result. Optionally, acquiring the front moving frame data and the rear moving frame data of the array camera includes: acquiring, in real time, timestamp data of the array camera's captures; and selecting the front moving frame data and the rear moving frame data according to the timestamp data and a reference time threshold. Optionally, comparing the rear moving frame data with the first fused image data to obtain the difference image data includes: extracting all pixel parameters of the first fused image data, where the pixel parameters include pixel data, coordinate data, and boundary data; comparing the rear moving frame data against those pixel parameters to obtain a comparison result, where the comparison result is the difference value between the image data in the rear moving frame data and the first fused image data; and outputting the comparison result as the difference image data. Optionally, generating the second fused image data according to the difference image data and fusing the first fused image data with the second fused image data to obtain the image fusion result includes: fading the difference image data to obtain the second fused image data; and, taking the first fused image data as the reference, superposing the content of the second fused image data onto the corresponding coordinates in the first fused image data to obtain the image fusion result.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present invention, each embodiment has its own emphasis; for portions not described in detail in one embodiment, reference may be made to the related descriptions of the other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technology content may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary, and the division of the units, for example, may be a logic function division, and may be implemented in another manner, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interfaces, units or modules, or may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, fig. 3 is a schematic diagram of the hardware structure of a terminal device according to an embodiment of the present application. As shown in fig. 3, the terminal device may include an input device 30, a processor 31, an output device 32, a memory 33, and at least one communication bus 34. The communication bus 34 is used to implement communication connections between these elements. The memory 33 may comprise a high-speed RAM and may further comprise a non-volatile memory (NVM), such as at least one magnetic disk memory; the memory 33 may store various programs for performing various processing functions and implementing the method steps of the present embodiment.
Alternatively, the processor 31 may be implemented as, for example, a central processing unit (Central Processing Unit, abbreviated as CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and the processor 31 is coupled to the input device 30 and the output device 32 through wired or wireless connections.
Alternatively, the input device 30 may include a variety of input devices, for example at least one of a user-oriented user interface, a device-oriented device interface, a programmable software interface, a camera, and a sensor. Optionally, the device-oriented device interface may be a wired interface for data transmission between devices, or a hardware insertion interface (such as a USB interface, a serial port, etc.) for data transmission between devices. Optionally, the user-oriented user interface may be, for example, control keys, a voice input device for receiving voice input, and a touch-sensing device (such as a touch screen or touch pad with touch-sensing functionality) for receiving user touch input. Optionally, the programmable software interface may be, for example, an entry through which a user edits or modifies a program, such as an input pin interface or input interface of a chip. Optionally, a transceiver with communication functionality, such as a radio frequency transceiver chip, a baseband processing chip, or a transceiver antenna, may also serve as an input. An audio input device such as a microphone may receive voice data. The output device 32 may include a display, an audio output device, and the like.
In this embodiment, the processor of the terminal device may include the functionality for executing each module of the data processing apparatus described above; the specific functions and technical effects may be found in the foregoing embodiments and are not repeated here.
Fig. 4 is a schematic hardware structure of a terminal device according to another embodiment of the present application. Fig. 4 is a specific embodiment of the implementation of fig. 3. As shown in fig. 4, the terminal device of the present embodiment includes a processor 41 and a memory 42.
The processor 41 executes the computer program code stored in the memory 42 to implement the methods of the above-described embodiments.
The memory 42 is configured to store various types of data to support operation at the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, and video. The memory 42 may include a random access memory (RAM) and may also include a non-volatile memory, such as at least one disk memory.
Optionally, a processor 41 is provided in the processing assembly 40. The terminal device may further include: a communication component 43, a power supply component 44, a multimedia component 45, an audio component 46, an input/output interface 47 and/or a sensor component 48. The components and the like specifically included in the terminal device are set according to actual requirements, which are not limited in this embodiment.
The processing component 40 generally controls the overall operation of the terminal device. The processing component 40 may include one or more processors 41 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 40 may include one or more modules that facilitate interactions between the processing component 40 and other components. For example, processing component 40 may include a multimedia module to facilitate interaction between multimedia component 45 and processing component 40.
The power supply assembly 44 provides power to the various components of the terminal device. Power supply components 44 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for terminal devices.
The multimedia component 45 comprises a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a liquid crystal display (LCD) and a touch panel (TP). If the display screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation.
The audio component 46 is configured to output and/or input audio signals. For example, the audio component 46 includes a Microphone (MIC) configured to receive external audio signals when the terminal device is in an operational mode, such as a speech recognition mode. The received audio signals may be further stored in the memory 42 or transmitted via the communication component 43. In some embodiments, audio assembly 46 further includes a speaker for outputting audio signals.
The input/output interface 47 provides an interface between the processing assembly 40 and peripheral interface modules, which may be click wheels, buttons, etc. These buttons may include, but are not limited to: volume button, start button and lock button.
The sensor assembly 48 includes one or more sensors for providing status assessment of various aspects for the terminal device. For example, the sensor assembly 48 may detect the open/closed state of the terminal device, the relative positioning of the assembly, the presence or absence of user contact with the terminal device. The sensor assembly 48 may include a proximity sensor configured to detect the presence of nearby objects in the absence of any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor assembly 48 may also include a camera or the like.
The communication component 43 is configured to facilitate communication between the terminal device and other devices in a wired or wireless manner. The terminal device may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot for inserting a SIM card, so that the terminal device may log into a GPRS network and establish communication with a server through the internet.
From the above, it will be appreciated that the communication component 43, the audio component 46, the input/output interface 47, and the sensor component 48 referred to in the embodiment of fig. 4 may be implemented as the input device in the embodiment of fig. 3.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that those skilled in the art may make various modifications and refinements without departing from the principles of the present invention, and such modifications and refinements are also intended to fall within the scope of the present invention.
Claims (8)
1. A dynamic frame fade-in and fade-out dynamic fusion method, comprising:
acquiring front moving frame data and rear moving frame data of an array camera;
blurring the front moving frame data to obtain first fusion image data;
comparing the rear moving frame data with the first fusion image data to obtain difference image data; and
generating second fusion image data according to the difference image data, and fusing the first fusion image data with the second fusion image data to obtain an image fusion result;
wherein generating the second fusion image data according to the difference image data, and fusing the first fusion image data with the second fusion image data to obtain the image fusion result, comprises:
applying fading processing to the difference image data to obtain the second fusion image data; and
superposing, with the first fusion image data as a reference, the content of the second fusion image data onto the corresponding coordinates in the first fusion image data to obtain the image fusion result.
2. The method of claim 1, wherein the acquiring the front and rear moving frame data of the array camera comprises:
acquiring, in real time, timestamp data of frames captured by the array camera; and
selecting the front moving frame data and the rear moving frame data according to the timestamp data and a reference time threshold.
3. The method of claim 1, wherein comparing the rear moving frame data with the first fusion image data to obtain the difference image data comprises:
extracting all pixel parameters of the first fusion image data, wherein the pixel parameters comprise pixel data, coordinate data, and boundary data;
comparing the rear moving frame data against the pixel parameters to obtain a comparison result, wherein the comparison result is the difference between the image data in the rear moving frame data and the first fusion image data; and
outputting the comparison result as the difference image data.
4. A dynamic frame fade-in and fade-out dynamic fusion device, comprising:
the acquisition module is used for acquiring front moving frame data and rear moving frame data of the array camera;
the processing module is used for blurring the front moving frame data to obtain first fusion image data;
the comparison module is used for comparing the rear moving frame data with the first fusion image data to obtain difference image data;
the generation module is used for generating second fusion image data according to the difference image data, and fusing the first fusion image data with the second fusion image data to obtain an image fusion result;
wherein the generation module comprises:
the fading unit, which is used for applying fading processing to the difference image data to obtain the second fusion image data; and
the superposition unit, which is used for superposing, with the first fusion image data as a reference, the content of the second fusion image data onto the corresponding coordinates in the first fusion image data to obtain the image fusion result.
5. The apparatus of claim 4, wherein the acquisition module comprises:
the acquisition unit is used for acquiring, in real time, timestamp data of frames captured by the array camera; and
the selection unit is used for selecting the front moving frame data and the rear moving frame data according to the timestamp data and a reference time threshold.
6. The apparatus of claim 4, wherein the comparison module comprises:
an extracting unit, configured to extract all pixel parameters of the first fused image data, where the pixel parameters include: pixel data, coordinate data, boundary data;
the comparison unit is used for comparing the post-motion frame data according to the pixel parameters to obtain a comparison result, wherein the comparison result is a difference value between the image data in the post-motion frame data and the first fusion image data;
and the output unit is used for taking the comparison result as output and outputting the difference image data.
7. A non-volatile storage medium, characterized in that the non-volatile storage medium comprises a stored program, wherein the program, when run, controls a device in which the non-volatile storage medium is located to perform the method of any one of claims 1 to 3.
8. An electronic device comprising a processor and a memory, the memory storing computer readable instructions for execution by the processor, wherein the computer readable instructions, when executed, perform the method of any of claims 1 to 3.
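Claims 2 and 5 select the front/rear frame pair by comparing capture timestamps against a reference time threshold. A minimal sketch follows; the patent does not define how the threshold partitions the frames, so the semantics here (front = latest frame at least one threshold before the reference time, rear = earliest frame at least one threshold after it) are an assumption.

```python
from typing import List, Tuple

def select_frame_pair(timestamps: List[float], reference_time: float,
                      threshold: float) -> Tuple[int, int]:
    """Pick indices of the front and rear moving frames from a list of
    capture timestamps, given a reference time and a time threshold.
    The partitioning rule is an illustrative assumption."""
    # Candidate front frames: captured no later than reference - threshold.
    front_candidates = [i for i, t in enumerate(timestamps)
                        if t <= reference_time - threshold]
    # Candidate rear frames: captured no earlier than reference + threshold.
    rear_candidates = [i for i, t in enumerate(timestamps)
                       if t >= reference_time + threshold]
    if not front_candidates or not rear_candidates:
        raise ValueError("no frame pair satisfies the time threshold")
    # Choose the pair closest to the reference time on each side.
    front = max(front_candidates, key=lambda i: timestamps[i])
    rear = min(rear_candidates, key=lambda i: timestamps[i])
    return front, rear
```

For a 10 fps stream with a 0.05 s threshold around a 0.2 s reference, this picks the frames at 0.1 s and 0.3 s, i.e. the nearest frames straddling the reference instant.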
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310577161.6A CN116579964B (en) | 2023-05-22 | 2023-05-22 | Dynamic frame gradual-in gradual-out dynamic fusion method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310577161.6A CN116579964B (en) | 2023-05-22 | 2023-05-22 | Dynamic frame gradual-in gradual-out dynamic fusion method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116579964A CN116579964A (en) | 2023-08-11 |
CN116579964B true CN116579964B (en) | 2024-02-02 |
Family
ID=87533675
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310577161.6A Active CN116579964B (en) | 2023-05-22 | 2023-05-22 | Dynamic frame gradual-in gradual-out dynamic fusion method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116579964B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111464760A (en) * | 2020-05-06 | 2020-07-28 | Oppo(重庆)智能科技有限公司 | Dynamic image generation method and device and terminal equipment |
CN113286078A (en) * | 2021-05-07 | 2021-08-20 | Oppo广东移动通信有限公司 | Image processing method and device, terminal and computer readable storage medium |
CN113592887A (en) * | 2021-06-25 | 2021-11-02 | 荣耀终端有限公司 | Video shooting method, electronic device and computer-readable storage medium |
CN113706421A (en) * | 2021-10-27 | 2021-11-26 | 深圳市慧鲤科技有限公司 | Image processing method and device, electronic equipment and storage medium |
CN115022679A (en) * | 2022-05-30 | 2022-09-06 | 北京百度网讯科技有限公司 | Video processing method, video processing device, electronic equipment and medium |
CN115883988A (en) * | 2023-02-17 | 2023-03-31 | 南昌航天广信科技有限责任公司 | Video image splicing method and system, electronic equipment and storage medium |
-
2023
- 2023-05-22 CN CN202310577161.6A patent/CN116579964B/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111464760A (en) * | 2020-05-06 | 2020-07-28 | Oppo(重庆)智能科技有限公司 | Dynamic image generation method and device and terminal equipment |
CN113286078A (en) * | 2021-05-07 | 2021-08-20 | Oppo广东移动通信有限公司 | Image processing method and device, terminal and computer readable storage medium |
CN113592887A (en) * | 2021-06-25 | 2021-11-02 | 荣耀终端有限公司 | Video shooting method, electronic device and computer-readable storage medium |
CN113706421A (en) * | 2021-10-27 | 2021-11-26 | 深圳市慧鲤科技有限公司 | Image processing method and device, electronic equipment and storage medium |
CN115022679A (en) * | 2022-05-30 | 2022-09-06 | 北京百度网讯科技有限公司 | Video processing method, video processing device, electronic equipment and medium |
CN115883988A (en) * | 2023-02-17 | 2023-03-31 | 南昌航天广信科技有限责任公司 | Video image splicing method and system, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN116579964A (en) | 2023-08-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN115426525B (en) | High-speed dynamic frame linkage image splitting method and device | |
CN115623336B (en) | Image tracking method and device for hundred million-level camera equipment | |
CN115631122A (en) | Image optimization method and device for edge image algorithm | |
CN115984126A (en) | Optical image correction method and device based on input instruction | |
CN116614453B (en) | Image transmission bandwidth selection method and device based on cloud interconnection | |
CN116261044B (en) | Intelligent focusing method and device for hundred million-level cameras | |
CN116579964B (en) | Dynamic frame gradual-in gradual-out dynamic fusion method and device | |
CN114866702A (en) | Multi-auxiliary linkage camera shooting technology-based border monitoring and collecting method and device | |
CN116579965B (en) | Multi-image fusion method and device | |
CN116468883B (en) | High-precision image data volume fog recognition method and device | |
CN116228593B (en) | Image perfecting method and device based on hierarchical antialiasing | |
CN115511735B (en) | Snow field gray scale picture optimization method and device | |
CN116664413B (en) | Image volume fog eliminating method and device based on Abbe convergence operator | |
CN115984333B (en) | Smooth tracking method and device for airplane target | |
CN115345808B (en) | Picture generation method and device based on multi-element information acquisition | |
CN116389915B (en) | Method and device for reducing flicker of light field camera | |
CN116088580B (en) | Flying object tracking method and device | |
CN115914819B (en) | Picture capturing method and device based on orthogonal decomposition algorithm | |
CN116402935B (en) | Image synthesis method and device based on ray tracing algorithm | |
CN117896625A (en) | Picture imaging method and device based on low-altitude high-resolution analysis | |
CN116757981A (en) | Multi-terminal image fusion method and device | |
CN116452481A (en) | Multi-angle combined shooting method and device | |
CN116309523A (en) | Dynamic frame image dynamic fuzzy recognition method and device | |
CN117351341A (en) | Unmanned aerial vehicle fish school identification method and device based on decomposition optimization | |
CN118735818A (en) | Method and device for sharpening surveillance video |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||