CN115170818A - Dynamic frame image feature extraction method and device - Google Patents
- Publication number
- CN115170818A (application number CN202210889466.6A)
- Authority
- CN
- China
- Prior art keywords
- dynamic frame
- data
- dynamic
- frame
- dimensional matrix
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
Abstract
The invention discloses a method and a device for extracting dynamic frame picture features. The method comprises: acquiring original picture data; inputting the original picture data into a dynamic frame decomposition model to generate dynamic frame data; constructing a dynamic frame three-dimensional matrix from the dynamic frame data and a Lagrangian normal operator, wherein the parameter characterization functions of the matrix comprise the dynamic frame HDR dynamic range, frame amount data, and dynamic fluctuation data; and extracting the dynamic frame pictures that meet the requirements according to the three-dimensional matrix and preset extraction information. The invention solves the technical problem that prior-art dynamic image processing relies only on feature recognition or feature extraction over the static frame pictures of a dynamic frame; because the workload of such extraction is heavy and the extraction requirements change with the conditions of the dynamic frame, complete dynamic frame picture feature data cannot be accurately extracted by that method alone.
Description
Technical Field
The invention relates to the field of dynamic picture processing, in particular to a method and a device for extracting dynamic frame picture features.
Background
With the continuous development of intelligent science and technology, people increasingly use intelligent devices in daily life, work, and study. The use of intelligent technology has improved people's quality of life and increased the efficiency of their study and work.
At present, when a dynamic frame picture is identified and processed, it is often necessary to extract feature values of the picture under dynamic frame conditions; for example, certain required regions or features in the dynamic frame picture are extracted and processed, and the extraction results are used for data and image processing operations such as model building on an upstream server. In the prior art, however, dynamic image processing relies only on feature recognition or feature extraction over the static frame pictures of the dynamic frame. The workload of such extraction is often heavy, and the extraction requirements change with the conditions of the dynamic frame, so complete dynamic frame picture feature data cannot be accurately extracted by that method alone.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the invention provide a method and a device for extracting dynamic frame picture features, to at least solve the technical problem that prior-art dynamic image processing relies only on feature recognition or feature extraction over the static frame pictures of a dynamic frame; because the extraction workload is heavy and the extraction requirements change with the conditions of the dynamic frame, complete dynamic frame picture feature data cannot be accurately extracted by that method alone.
According to one aspect of the embodiments of the invention, a method for extracting dynamic frame picture features is provided, comprising: acquiring original picture data; inputting the original picture data into a dynamic frame decomposition model to generate dynamic frame data; constructing a dynamic frame three-dimensional matrix from the dynamic frame data and a Lagrangian normal operator, wherein the parameter characterization functions of the matrix comprise the dynamic frame HDR dynamic range, frame amount data, and dynamic fluctuation data; and extracting the dynamic frame pictures that meet the requirements according to the three-dimensional matrix and preset extraction information.
Optionally, before inputting the original picture data into the dynamic frame decomposition model to generate dynamic frame data, the method further includes: acquiring dynamic frame decomposition historical data according to the original picture data; and training the dynamic frame decomposition model with the dynamic frame decomposition historical data.
Optionally, constructing the dynamic frame three-dimensional matrix from the dynamic frame data and the Lagrangian normal operator includes: obtaining the Lagrangian operator characterizing the normal distribution in the dynamic data group; and calculating the dynamic frame three-dimensional matrix from the dynamic frame data and the Lagrangian normal operator according to a Lagrangian normal formula, in which:
δ and θ are the axial T matrix parameters of the three-dimensional matrix, the vector T is the dynamic frame distribution timestamp, s is the dynamic frame conversion function, and α is the Lagrangian normal operator.
Optionally, before extracting the dynamic frame pictures that meet the requirements according to the dynamic frame three-dimensional matrix and the preset extraction information, the method further includes: generating the preset extraction information according to the original picture data and the demand parameters.
According to another aspect of the embodiments of the invention, a dynamic frame picture feature extraction apparatus is also provided, comprising: an acquisition module, configured to acquire original picture data; a generating module, configured to input the original picture data into a dynamic frame decomposition model and generate dynamic frame data; a construction module, configured to construct a dynamic frame three-dimensional matrix from the dynamic frame data and a Lagrangian normal operator, wherein the parameter characterization functions of the matrix comprise the dynamic frame HDR dynamic range, frame amount data, and dynamic fluctuation data; and an extraction module, configured to extract the dynamic frame pictures that meet the requirements according to the three-dimensional matrix and preset extraction information.
Optionally, in the apparatus, the acquisition module is further configured to acquire dynamic frame decomposition historical data according to the original picture data; and a training module is configured to train the dynamic frame decomposition model with the dynamic frame decomposition historical data.
Optionally, the construction module includes: an acquisition unit, configured to obtain the Lagrangian operator characterizing the normal distribution in the dynamic data group; and a calculating unit, configured to calculate the dynamic frame three-dimensional matrix from the dynamic frame data and the Lagrangian normal operator according to a Lagrangian normal formula, in which:
δ and θ are the axial T matrix parameters of the three-dimensional matrix, the vector T is the dynamic frame distribution timestamp, s is the dynamic frame conversion function, and α is the Lagrangian normal operator.
Optionally, in the apparatus, the generating module is further configured to generate the preset extraction information according to the original picture data and the demand parameters.
According to another aspect of the embodiments of the invention, a non-volatile storage medium is also provided, comprising a stored program, wherein, when the program runs, it controls the device in which the non-volatile storage medium is located to execute the dynamic frame picture feature extraction method.
According to another aspect of the embodiments of the invention, an electronic device is also provided, comprising a processor and a memory; the memory stores computer-readable instructions, and the processor is configured to execute the computer-readable instructions to perform the dynamic frame picture feature extraction method.
In the embodiments of the invention, the method acquires original picture data; inputs the original picture data into a dynamic frame decomposition model to generate dynamic frame data; constructs a dynamic frame three-dimensional matrix from the dynamic frame data and a Lagrangian normal operator, wherein the parameter characterization functions of the matrix comprise the dynamic frame HDR dynamic range, frame amount data, and dynamic fluctuation data; and extracts the dynamic frame pictures that meet the requirements according to the three-dimensional matrix and preset extraction information. This solves the technical problem that prior-art dynamic image processing relies only on feature recognition or feature extraction over the static frame pictures of a dynamic frame, whose workload is heavy and whose extraction requirements change with the conditions of the dynamic frame, so that complete dynamic frame picture feature data cannot be accurately extracted by that method alone.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
Fig. 1 is a flowchart of a method for extracting features of a dynamic frame picture according to an embodiment of the present invention;
Fig. 2 is a block diagram of a dynamic frame picture feature extraction apparatus according to an embodiment of the present invention;
Fig. 3 is a block diagram of a terminal device for performing the method according to the invention, according to an embodiment of the invention;
Fig. 4 is a schematic diagram of a memory unit for holding or carrying program code that implements the method according to the invention, according to an embodiment of the invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In accordance with an embodiment of the present invention, a method embodiment of a dynamic frame picture feature extraction method is provided. It is noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system, such as one executing a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, in some cases the steps may be performed in an order different from that shown or described here.
Example one
Fig. 1 is a flowchart of a method for extracting features of a dynamic frame picture according to an embodiment of the present invention, and as shown in fig. 1, the method includes the following steps:
step S102, acquiring original picture data.
Specifically, the prior art processes dynamic images only by performing feature recognition or feature extraction on the static frame pictures of a dynamic frame; the workload of such extraction is often heavy, and the extraction requirements change with the conditions of the dynamic frame, so complete dynamic frame picture feature data cannot be accurately extracted by that method alone. To overcome this defect, original picture data is first acquired by a high-precision image acquisition device. The original picture data may be a sequence of consecutive static images (a multi-frame module) or a dynamic image, and it provides the data source for the subsequent decomposition and processing of the images.
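By way of a hedged illustration only (the patent names no concrete data format), the acquisition step can be sketched as a small helper that coerces either allowed input form, a list of consecutive still images or an already-stacked dynamic image, into a single (n, H, W) array for the later steps; the function name and shapes are assumptions.

```python
import numpy as np

def acquire_raw_picture_data(frames):
    """Normalise acquired picture data into an (n, H, W) frame stack.

    The text allows either a sequence of consecutive static images
    (a multi-frame module) or a dynamic image; both are coerced here
    into one grayscale stack for the decomposition step. Illustrative only.
    """
    arr = np.asarray(frames, dtype=np.float32)
    if arr.ndim == 2:            # a single still image becomes a 1-frame stack
        arr = arr[None, ...]
    if arr.ndim != 3:
        raise ValueError("expected image data shaped (H, W) or (n, H, W)")
    return arr
```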
Step S104, inputting the original picture data into a dynamic frame decomposition model to generate dynamic frame data.
Specifically, the acquired original picture data of the dynamic frame is converted into dynamic frame data, that is, the image pixel data is converted into n-1 horizontal factors per group of n frames, so that the feature information in the dynamic frame data can be extracted more accurately in the subsequent processing. The dynamic frame decomposition model may be a DNN neural network model trained specifically to decompose original pictures into dynamic frame data: the original picture data is taken as the input vector, and the model computes the output vector of dynamic frame data.
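As a hedged sketch of step S104 (the trained DNN itself is not specified in the source), a plain inter-frame difference can stand in for the decomposition model to show the expected shapes: n raw frames in, n-1 horizontal factors out.

```python
import numpy as np

def decompose_dynamic_frames(raw_frames: np.ndarray) -> np.ndarray:
    """Stand-in for the patent's DNN dynamic-frame decomposition model.

    Takes n raw frames shaped (n, H, W) and returns n-1 inter-frame
    "dynamic factors", mirroring the text's "n-1 horizontal factors per
    n frames". The real model is a trained DNN; plain differencing is
    used here only to illustrate the input/output shapes.
    """
    if raw_frames.ndim != 3 or raw_frames.shape[0] < 2:
        raise ValueError("expected a stack of at least two frames (n, H, W)")
    # Each dynamic factor captures the change between consecutive frames.
    return np.diff(raw_frames.astype(np.float32), axis=0)
```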
Optionally, before the original picture data is input into a dynamic frame decomposition model and dynamic frame data is generated, the method further includes: acquiring dynamic frame decomposition historical data according to the original picture data; and training the dynamic frame decomposition model according to the dynamic frame decomposition historical data.
Specifically, to train the dynamic frame decomposition model, historical decomposition records, that is, historical picture data-dynamic frame data matrices, are extracted from a historical database according to the original picture data. These provide multiple data sources for training and yield a complete, mature dynamic frame decomposition model.
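The training step might be sketched as follows; since the source gives no architecture or loss, a per-pixel linear model fitted by gradient descent stands in for the DNN, and the (N, D) data layout of the historical pairs is an assumption.

```python
import numpy as np

def train_decomposition_model(history_x, history_y, lr=0.1, epochs=2000):
    """Hedged sketch: fit an element-wise linear map w*x + b that predicts
    dynamic-frame data from raw-frame data, standing in for DNN training
    on the "dynamic frame decomposition historical data".

    history_x: (N, D) flattened raw-frame samples
    history_y: (N, D) flattened dynamic-frame targets
    """
    history_x = np.asarray(history_x, dtype=np.float64)
    history_y = np.asarray(history_y, dtype=np.float64)
    d = history_x.shape[1]
    w = np.zeros(d)
    b = np.zeros(d)
    for _ in range(epochs):
        pred = history_x * w + b      # element-wise linear prediction
        grad = pred - history_y       # gradient of 0.5 * MSE w.r.t. pred
        w -= lr * (history_x * grad).mean(axis=0)
        b -= lr * grad.mean(axis=0)
    return w, b
```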
Step S106, constructing a dynamic frame three-dimensional matrix from the dynamic frame data and the Lagrangian normal operator, wherein the parameter characterization functions of the matrix comprise: the dynamic frame HDR dynamic range, frame amount data, and dynamic fluctuation data.
Optionally, constructing the dynamic frame three-dimensional matrix from the dynamic frame data and the Lagrangian normal operator includes: obtaining the Lagrangian operator characterizing the normal distribution in the dynamic data group; and calculating the dynamic frame three-dimensional matrix from the dynamic frame data and the Lagrangian normal operator according to a Lagrangian normal formula, in which:
δ and θ are the axial T matrix parameters of the three-dimensional matrix, the vector T is the dynamic frame distribution timestamp, s is the dynamic frame conversion function, and α is the Lagrangian normal operator.
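Because the Lagrangian normal formula itself is not reproduced in this text, the construction step can only be sketched under assumptions: here the three characterization channels named above (HDR dynamic range, frame amount data, dynamic fluctuation data) are computed per spatial block of each dynamic frame, and a scalar α stands in for the Lagrangian normal operator. The block layout and statistics are illustrative, not the patent's actual formula.

```python
import numpy as np

def build_dynamic_frame_matrix(dyn_frames, alpha=1.0, blocks=2):
    """Hedged sketch of step S106: a 3-D matrix shaped (frames, blocks, 3).

    axis 0: dynamic frame index in timestamp order (the vector T)
    axis 1: spatial block of the frame
    axis 2: characterization channel (HDR range, frame amount, fluctuation)
    """
    n, h, w = dyn_frames.shape
    mat = np.zeros((n, blocks * blocks, 3))
    bh, bw = h // blocks, w // blocks
    for t in range(n):
        for i in range(blocks):
            for j in range(blocks):
                patch = dyn_frames[t, i*bh:(i+1)*bh, j*bw:(j+1)*bw]
                mat[t, i * blocks + j] = (
                    float(patch.max() - patch.min()),  # HDR dynamic range
                    float(np.abs(patch).mean()),       # frame amount data
                    float(patch.std()),                # dynamic fluctuation data
                )
    return alpha * mat  # alpha stands in for the Lagrangian normal operator
```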
Step S108, extracting the dynamic frame pictures that meet the requirements according to the dynamic frame three-dimensional matrix and the preset extraction information.
Specifically, after the three-dimensional matrix data of the dynamic frame is obtained, it is combined with the preset extraction information so that multi-point dynamic frame generation can be performed through the matrix data, and the dynamic frame pictures are extracted according to the preset extraction requirements.
Optionally, before the extracting, according to the dynamic frame three-dimensional matrix and preset extraction information, a dynamic frame picture meeting requirements, the method further includes: and generating the preset extraction information according to the original picture data and the demand parameters.
Specifically, before the dynamic frame pictures that meet the requirements are extracted according to the dynamic frame three-dimensional matrix and the preset extraction information, the preset extraction information is generated according to the original picture data and the demand parameters. The demand parameters may be generated and transmitted by a user according to the user's requirements for dynamic frame picture extraction, and the preset extraction information serves as one of the parameters for generating the final dynamic frame pictures.
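A hedged sketch of step S108 follows; the dictionary keys used for the preset extraction information are illustrative, since the patent does not define its concrete format.

```python
import numpy as np

def extract_matching_frames(frame_matrix, extraction_info):
    """Select dynamic-frame indices whose characterization values satisfy
    the preset extraction information.

    frame_matrix: (n_frames, n_blocks, 3) array from the construction step
    extraction_info: dict with assumed keys 'channel' (0 = HDR range,
                     1 = frame amount, 2 = fluctuation) and 'min_value'
                     derived from the demand parameters
    """
    ch = extraction_info["channel"]
    thr = extraction_info["min_value"]
    # A frame qualifies when its mean channel value over blocks meets the threshold.
    scores = frame_matrix[:, :, ch].mean(axis=1)
    return [i for i, s in enumerate(scores) if s >= thr]
```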
Through this embodiment, the technical problem is solved that prior-art dynamic image processing relies only on feature recognition or feature extraction over the static frame pictures of a dynamic frame, whose workload is heavy and whose extraction requirements change with the conditions of the dynamic frame, so that complete dynamic frame picture feature data cannot be accurately extracted by that method alone.
Example two
Fig. 2 is a block diagram of a structure of a dynamic frame picture feature extraction apparatus according to an embodiment of the present invention, as shown in fig. 2, the apparatus includes:
and an obtaining module 20, configured to obtain original picture data.
Specifically, the prior art processes dynamic images only by performing feature recognition or feature extraction on the static frame pictures of a dynamic frame; the workload of such extraction is often heavy, and the extraction requirements change with the conditions of the dynamic frame, so complete dynamic frame picture feature data cannot be accurately extracted by that method alone. To overcome this defect, original picture data is first acquired by a high-precision image acquisition device. The original picture data may be a sequence of consecutive static images (a multi-frame module) or a dynamic image, and it provides the data source for the subsequent decomposition and processing of the images.
A generating module 22, configured to input the original picture data into a dynamic frame decomposition model and generate dynamic frame data.
Specifically, the acquired original picture data of the dynamic frame is converted into dynamic frame data, that is, the image pixel data is converted into n-1 horizontal factors per group of n frames, so that the feature information in the dynamic frame data can be extracted more accurately in the subsequent processing. The dynamic frame decomposition model may be a DNN neural network model trained specifically to decompose original pictures into dynamic frame data: the original picture data is taken as the input vector, and the model computes the output vector of dynamic frame data.
Optionally, in the apparatus, the acquisition module is further configured to acquire dynamic frame decomposition historical data according to the original picture data; and a training module is configured to train the dynamic frame decomposition model with the dynamic frame decomposition historical data.
Specifically, to train the dynamic frame decomposition model, historical decomposition records, that is, historical picture data-dynamic frame data matrices, are extracted from a historical database according to the original picture data. These provide multiple data sources for training and yield a complete, mature dynamic frame decomposition model.
A construction module 24, configured to construct a dynamic frame three-dimensional matrix from the dynamic frame data and the Lagrangian normal operator, wherein the parameter characterization functions of the matrix comprise: the dynamic frame HDR dynamic range, frame amount data, and dynamic fluctuation data.
Optionally, the construction module includes: an acquisition unit, configured to obtain the Lagrangian operator characterizing the normal distribution in the dynamic data group; and a calculating unit, configured to calculate the dynamic frame three-dimensional matrix from the dynamic frame data and the Lagrangian normal operator according to a Lagrangian normal formula, in which:
δ and θ are the axial T matrix parameters of the three-dimensional matrix, the vector T is the dynamic frame distribution timestamp, s is the dynamic frame conversion function, and α is the Lagrangian normal operator.
An extraction module 26, configured to extract the dynamic frame pictures that meet the requirements according to the dynamic frame three-dimensional matrix and the preset extraction information.
Specifically, after the three-dimensional matrix data of the dynamic frame is obtained, it is combined with the preset extraction information so that multi-point dynamic frame generation can be performed through the matrix data, and the dynamic frame pictures are extracted according to the preset extraction requirements.
Optionally, the apparatus further comprises: and the generating module is further used for generating the preset extraction information according to the original picture data and the demand parameters.
Specifically, before the dynamic frame pictures that meet the requirements are extracted according to the dynamic frame three-dimensional matrix and the preset extraction information, the preset extraction information is generated according to the original picture data and the demand parameters. The demand parameters may be generated and transmitted by a user according to the user's requirements for dynamic frame picture extraction, and the preset extraction information serves as one of the parameters for generating the final dynamic frame pictures.
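The four modules described above can be wired into one pipeline; this composition is an illustrative assumption (the class and parameter names are not taken from the claims), with the module internals left pluggable.

```python
class DynamicFramePictureExtractor:
    """Hedged sketch of the apparatus in Fig. 2: the four modules
    composed into one pipeline. Each module is any callable, so the
    stand-ins from Example one (or real implementations) can be plugged in.
    """

    def __init__(self, acquire, generate, construct, extract):
        self.acquire = acquire      # acquisition module 20
        self.generate = generate    # generating module 22
        self.construct = construct  # construction module 24
        self.extract = extract      # extraction module 26

    def run(self, source, extraction_info):
        raw = self.acquire(source)              # original picture data
        dyn = self.generate(raw)                # dynamic frame data
        mat = self.construct(dyn)               # dynamic frame 3-D matrix
        return self.extract(mat, extraction_info)
```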
Through this embodiment, the technical problem is solved that prior-art dynamic image processing relies only on feature recognition or feature extraction over the static frame pictures of a dynamic frame, whose workload is heavy and whose extraction requirements change with the conditions of the dynamic frame, so that complete dynamic frame picture feature data cannot be accurately extracted by that method alone.
According to another aspect of the embodiments of the invention, a non-volatile storage medium is also provided, comprising a stored program, wherein, when the program runs, it controls the device in which the non-volatile storage medium is located to execute the dynamic frame picture feature extraction method.
Specifically, the method comprises: acquiring original picture data; inputting the original picture data into a dynamic frame decomposition model to generate dynamic frame data; constructing a dynamic frame three-dimensional matrix from the dynamic frame data and a Lagrangian normal operator, wherein the parameter characterization functions of the matrix comprise the dynamic frame HDR dynamic range, frame amount data, and dynamic fluctuation data; and extracting the dynamic frame pictures that meet the requirements according to the three-dimensional matrix and preset extraction information. Optionally, before inputting the original picture data into the dynamic frame decomposition model, the method further includes: acquiring dynamic frame decomposition historical data according to the original picture data; and training the dynamic frame decomposition model with the historical data. Optionally, constructing the dynamic frame three-dimensional matrix includes: obtaining the Lagrangian operator characterizing the normal distribution in the dynamic data group; and calculating the matrix from the dynamic frame data and the Lagrangian normal operator according to a Lagrangian normal formula, in which:
δ and θ are the axial T matrix parameters of the three-dimensional matrix, the vector T is the dynamic frame distribution timestamp, s is the dynamic frame conversion function, and α is the Lagrangian normal operator. Optionally, before extracting the dynamic frame pictures that meet the requirements, the method further includes: generating the preset extraction information according to the original picture data and the demand parameters.
According to another aspect of the embodiments of the invention, an electronic device is also provided, comprising a processor and a memory; the memory stores computer-readable instructions, and the processor is configured to execute the computer-readable instructions to perform the dynamic frame picture feature extraction method.
Specifically, the method comprises: acquiring original picture data; inputting the original picture data into a dynamic frame decomposition model to generate dynamic frame data; constructing a dynamic frame three-dimensional matrix from the dynamic frame data and a Lagrangian normal operator, wherein the parameter characterization functions of the matrix comprise the dynamic frame HDR dynamic range, frame amount data, and dynamic fluctuation data; and extracting the dynamic frame pictures that meet the requirements according to the three-dimensional matrix and preset extraction information. Optionally, before inputting the original picture data into the dynamic frame decomposition model, the method further includes: acquiring dynamic frame decomposition historical data according to the original picture data; and training the dynamic frame decomposition model with the historical data. Optionally, constructing the dynamic frame three-dimensional matrix includes: obtaining the Lagrangian operator characterizing the normal distribution in the dynamic data group; and calculating the matrix from the dynamic frame data and the Lagrangian normal operator according to a Lagrangian normal formula, in which:
δ and θ are the axial T matrix parameters of the three-dimensional matrix, the vector T is the dynamic frame distribution timestamp, s is the dynamic frame conversion function, and α is the Lagrangian normal operator. Optionally, before extracting the dynamic frame pictures that meet the requirements, the method further includes: generating the preset extraction information according to the original picture data and the demand parameters.
The serial numbers of the above embodiments of the present invention are for description only and do not indicate the relative merits of the embodiments.
In the above embodiments of the present invention, each embodiment is described with its own emphasis; for parts not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The apparatus embodiments described above are merely illustrative. For example, the division into units may be only a logical functional division; in an actual implementation there may be other divisions, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through interfaces, units, or modules, and may be electrical or in another form.
The units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed across multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, Fig. 3 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application. As shown in Fig. 3, the terminal device may include an input device 30, a processor 31, an output device 32, a memory 33, and at least one communication bus 34. The communication bus 34 is used to realize the communication connections between these elements. The memory 33 may comprise high-speed RAM and may also include non-volatile memory (NVM), such as at least one disk memory; the memory may store various programs for performing various processing functions and implementing the method steps of this embodiment.
Alternatively, the processor 31 may be implemented by, for example, a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and the processor 31 is coupled to the input device 30 and the output device 32 through a wired or wireless connection.
Optionally, the input device 30 may include a variety of input devices, for example at least one of a user-oriented user interface, a device-oriented device interface, a software-programmable interface, a camera, and a sensor. Optionally, the device-oriented device interface may be a wired interface for data transmission between devices, or a hardware plug-in interface (e.g., a USB interface or a serial port) for data transmission between devices. Optionally, the user-oriented user interface may be, for example, user-oriented control keys, a voice input device for receiving voice input, or a touch sensing device (e.g., a touch screen or touch pad with a touch sensing function) for receiving user touch input. Optionally, the software-programmable interface may be, for example, an entry for a user to edit or modify a program, such as an input pin interface or an input interface of a chip. Optionally, the input device may also include a transceiver, such as a radio-frequency transceiver chip with a communication function, a baseband processing chip, and a transceiver antenna. An audio input device such as a microphone may receive voice data. The output device 32 may include a display, a speaker, or another output device.
In this embodiment, the processor of the terminal device includes a module for executing functions of each module of the data processing apparatus in each device, and specific functions and technical effects may be obtained by referring to the foregoing embodiments, which are not described herein again.
Fig. 4 is a schematic diagram of a hardware structure of a terminal device according to another embodiment of the present application; it shows a specific implementation of the embodiment of Fig. 3. As shown in Fig. 4, the terminal device of this embodiment includes a processor 41 and a memory 42.
The processor 41 executes the computer program code stored in the memory 42 to implement the method in the above-described embodiment.
The memory 42 is configured to store various types of data to support operation of the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, and videos. The memory 42 may comprise random access memory (RAM) and may further comprise non-volatile memory, such as at least one disk memory.
Optionally, the processor 41 is provided in the processing component 40. The terminal device may further include: a communication component 43, a power component 44, a multimedia component 45, an audio component 46, an input/output interface 47, and/or a sensor component 48. The specific components included in the terminal device are set according to actual requirements, which is not limited in this embodiment.
The processing component 40 generally controls the overall operation of the terminal device. Processing component 40 may include one or more processors 41 to execute instructions to perform all or a portion of the steps of the above-described method. Further, processing component 40 may include one or more modules that facilitate interaction between processing component 40 and other components. For example, the processing component 40 may include a multimedia module to facilitate interaction between the multimedia component 45 and the processing component 40.
The power component 44 provides power to the various components of the terminal device. The power component 44 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the terminal device.
The multimedia component 45 includes a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display screen includes a touch panel, the display screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The audio component 46 is configured to output and/or input audio signals. For example, the audio component 46 includes a Microphone (MIC) configured to receive external audio signals when the terminal device is in an operational mode, such as a voice recognition mode. The received audio signal may further be stored in the memory 42 or transmitted via the communication component 43. In some embodiments, audio assembly 46 also includes a speaker for outputting audio signals.
The input/output interface 47 provides an interface between the processing component 40 and peripheral interface modules, which may be click wheels, buttons, etc. These buttons may include, but are not limited to: a volume button, a start button, and a lock button.
The sensor assembly 48 includes one or more sensors for providing various aspects of status assessment for the terminal device. For example, the sensor assembly 48 may detect the open/closed status of the terminal device, the relative positioning of the components, the presence or absence of user contact with the terminal device. The sensor assembly 48 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor assembly 48 may also include a camera or the like.
The communication component 43 is configured to facilitate wired or wireless communication between the terminal device and other devices. The terminal device may access a wireless network based on a communication standard, such as Wi-Fi, 2G, or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot for inserting a SIM card, so that the terminal device can log on to a GPRS network and establish communication with a server via the Internet.
From the above, the communication component 43, the audio component 46, the input/output interface 47 and the sensor component 48 referred to in the embodiment of fig. 4 can be implemented as the input device in the embodiment of fig. 3.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and refinements without departing from the principle of the present invention, and these modifications and refinements should also be regarded as falling within the protection scope of the present invention.
Claims (10)
1. A method for extracting features of a dynamic frame image, comprising:
acquiring original picture data;
inputting the original picture data into a dynamic frame decomposition model to generate dynamic frame data;
constructing a dynamic frame three-dimensional matrix according to the dynamic frame data and a Lagrangian normal operator, wherein parameter characterization functions of the dynamic frame three-dimensional matrix comprise: dynamic frame HDR dynamic range, frame amount data, and dynamic fluctuation data;
and extracting a dynamic frame picture meeting the requirements according to the dynamic frame three-dimensional matrix and preset extraction information.
2. The method of claim 1, wherein prior to said inputting said original picture data into a dynamic frame decomposition model to generate dynamic frame data, said method further comprises:
acquiring dynamic frame decomposition historical data according to the original picture data;
and training the dynamic frame decomposition model according to the dynamic frame decomposition historical data.
3. The method according to claim 1, wherein the constructing a dynamic frame three-dimensional matrix according to the dynamic frame data and the Lagrangian normal operator comprises:
obtaining a Lagrangian operator characterizing the normal distribution in the dynamic data group;
calculating the dynamic frame three-dimensional matrix based on the dynamic frame data and the Lagrangian normal operator according to a Lagrangian normal formula, wherein the formula includes:
where δ and θ are the axial matrix parameters of the three-dimensional matrix, the vector T is the dynamic frame distribution timestamp, s is the dynamic frame conversion function, and α is the Lagrangian normal operator.
4. The method according to claim 1, wherein before said extracting a satisfactory dynamic frame picture according to the dynamic frame three-dimensional matrix and preset extraction information, the method further comprises:
and generating the preset extraction information according to the original picture data and the demand parameters.
5. A dynamic frame picture feature extraction device, comprising:
the acquisition module is used for acquiring original picture data;
the generating module is used for inputting the original picture data into a dynamic frame decomposition model to generate dynamic frame data;
the construction module is used for constructing a dynamic frame three-dimensional matrix according to the dynamic frame data and a Lagrangian normal operator, wherein parameter characterization functions of the dynamic frame three-dimensional matrix comprise: dynamic frame HDR dynamic range, frame amount data, and dynamic fluctuation data;
and the extraction module is used for extracting the dynamic frame picture meeting the requirements according to the dynamic frame three-dimensional matrix and preset extraction information.
6. The apparatus of claim 5, further comprising:
the acquisition module is also used for acquiring dynamic frame decomposition historical data according to the original picture data;
and the training module is used for training the dynamic frame decomposition model according to the historical dynamic frame decomposition data.
7. The apparatus of claim 5, wherein the building module comprises:
the acquisition unit is used for obtaining a Lagrangian operator characterizing the normal distribution in the dynamic data group;
the calculating unit is used for calculating the dynamic frame three-dimensional matrix based on the dynamic frame data and the Lagrangian normal operator according to a Lagrangian normal formula, wherein the formula includes:
where δ and θ are the axial matrix parameters of the three-dimensional matrix, the vector T is the dynamic frame distribution timestamp, s is the dynamic frame conversion function, and α is the Lagrangian normal operator.
8. The apparatus of claim 5, further comprising:
and the generating module is further used for generating the preset extraction information according to the original picture data and the demand parameters.
9. A non-volatile storage medium, comprising a stored program, wherein the program, when executed, controls an apparatus in which the non-volatile storage medium is located to perform the method of any one of claims 1 to 4.
10. An electronic device comprising a processor and a memory; the memory has stored therein computer readable instructions for execution by the processor, wherein the computer readable instructions when executed perform the method of any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210889466.6A CN115170818A (en) | 2022-07-27 | 2022-07-27 | Dynamic frame image feature extraction method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210889466.6A CN115170818A (en) | 2022-07-27 | 2022-07-27 | Dynamic frame image feature extraction method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115170818A true CN115170818A (en) | 2022-10-11 |
Family
ID=83497928
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210889466.6A Pending CN115170818A (en) | 2022-07-27 | 2022-07-27 | Dynamic frame image feature extraction method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115170818A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115809006A (en) * | 2022-12-05 | 2023-03-17 | 北京拙河科技有限公司 | Method and device for controlling manual instruction by picture |
CN116309523A (en) * | 2023-04-06 | 2023-06-23 | 北京拙河科技有限公司 | Dynamic frame image dynamic fuzzy recognition method and device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1998238A (en) * | 2004-06-15 | 2007-07-11 | 株式会社Ntt都科摩 | Device and method for generating a transmit frame |
CN103546314A (en) * | 2011-05-04 | 2014-01-29 | 成都勤智数码科技股份有限公司 | Device for predicting IT (information technology) operation and maintenance indexes by using correlation |
CN106933950A (en) * | 2017-01-22 | 2017-07-07 | 四川用联信息技术有限公司 | New Model tying algorithm realizes search engine optimization technology |
CN107765216A (en) * | 2017-08-29 | 2018-03-06 | 宁波大学 | Target location and timing parameter combined estimation method in unsynchronized wireless networks |
CN110852956A (en) * | 2019-07-22 | 2020-02-28 | 江苏宇特光电科技股份有限公司 | Method for enhancing high dynamic range image |
CN112001215A (en) * | 2020-05-25 | 2020-11-27 | 天津大学 | Method for identifying identity of text-independent speaker based on three-dimensional lip movement |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1998238A (en) * | 2004-06-15 | 2007-07-11 | 株式会社Ntt都科摩 | Device and method for generating a transmit frame |
CN103546314A (en) * | 2011-05-04 | 2014-01-29 | 成都勤智数码科技股份有限公司 | Device for predicting IT (information technology) operation and maintenance indexes by using correlation |
CN106933950A (en) * | 2017-01-22 | 2017-07-07 | 四川用联信息技术有限公司 | New Model tying algorithm realizes search engine optimization technology |
CN107765216A (en) * | 2017-08-29 | 2018-03-06 | 宁波大学 | Target location and timing parameter combined estimation method in unsynchronized wireless networks |
CN110852956A (en) * | 2019-07-22 | 2020-02-28 | 江苏宇特光电科技股份有限公司 | Method for enhancing high dynamic range image |
CN112001215A (en) * | 2020-05-25 | 2020-11-27 | 天津大学 | Method for identifying identity of text-independent speaker based on three-dimensional lip movement |
Non-Patent Citations (2)
Title |
---|
石杰: "Study on Dynamic Brain Network Characteristics of Psychiatric Patients Based on Resting-State Functional Magnetic Resonance Imaging", China Master's Theses Full-text Database, Medicine and Health Sciences * |
葛红: "Research on Biomass Combustion Monitoring Based on Flame Spectrum Analysis and Image Processing", China Doctoral Dissertations Full-text Database, Engineering Science and Technology II * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115809006A (en) * | 2022-12-05 | 2023-03-17 | 北京拙河科技有限公司 | Method and device for controlling manual instruction by picture |
CN115809006B (en) * | 2022-12-05 | 2023-08-08 | 北京拙河科技有限公司 | Method and device for controlling manual instructions through picture |
CN116309523A (en) * | 2023-04-06 | 2023-06-23 | 北京拙河科技有限公司 | Dynamic frame image dynamic fuzzy recognition method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN115170818A (en) | Dynamic frame image feature extraction method and device | |
CN115426525B (en) | High-speed dynamic frame linkage image splitting method and device | |
CN115375582A (en) | Moire digestion method and device based on low-order Taylor decomposition | |
CN115631122A (en) | Image optimization method and device for edge image algorithm | |
CN115984126A (en) | Optical image correction method and device based on input instruction | |
CN114842424A (en) | Intelligent security image identification method and device based on motion compensation | |
CN116614453B (en) | Image transmission bandwidth selection method and device based on cloud interconnection | |
CN115334291A (en) | Tunnel monitoring method and device based on hundred million-level pixel panoramic compensation | |
CN115474091A (en) | Motion capture method and device based on decomposition metagraph | |
CN115187570B (en) | Singular traversal retrieval method and device based on DNN deep neural network | |
CN116402935B (en) | Image synthesis method and device based on ray tracing algorithm | |
CN115205313B (en) | Picture optimization method and device based on least square algorithm | |
CN115460389B (en) | Image white balance area optimization method and device | |
CN116797479B (en) | Image vertical distortion conversion method | |
CN116389915B (en) | Method and device for reducing flicker of light field camera | |
CN116468883B (en) | High-precision image data volume fog recognition method and device | |
CN115914819B (en) | Picture capturing method and device based on orthogonal decomposition algorithm | |
CN116579965B (en) | Multi-image fusion method and device | |
CN115809006B (en) | Method and device for controlling manual instructions through picture | |
CN116228593B (en) | Image perfecting method and device based on hierarchical antialiasing | |
CN115511735B (en) | Snow field gray scale picture optimization method and device | |
CN116664413B (en) | Image volume fog eliminating method and device based on Abbe convergence operator | |
CN116309523A (en) | Dynamic frame image dynamic fuzzy recognition method and device | |
CN117896625A (en) | Picture imaging method and device based on low-altitude high-resolution analysis | |
CN117911570A (en) | Image generation method and device based on low-pass filtering |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20221011 |