CN109243179B - Method and device for distinguishing dynamic capture frames - Google Patents

Method and device for distinguishing dynamic capture frames

Info

Publication number
CN109243179B
CN109243179B CN201811321354.0A
Authority
CN
China
Prior art keywords
time
frame data
snapshot
frame
distinguished
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811321354.0A
Other languages
Chinese (zh)
Other versions
CN109243179A (en)
Inventor
陈务
汤官华
戈志明
曹李军
陈卫东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Keda Technology Co Ltd
Original Assignee
Suzhou Keda Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Keda Technology Co Ltd filed Critical Suzhou Keda Technology Co Ltd
Priority to CN201811321354.0A priority Critical patent/CN109243179B/en
Publication of CN109243179A publication Critical patent/CN109243179A/en
Application granted granted Critical
Publication of CN109243179B publication Critical patent/CN109243179B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/017Detecting movement of traffic to be counted or controlled identifying vehicles
    • G08G1/0175Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a method and a device for distinguishing dynamic capture frames. The method comprises: obtaining the sending time T0 of a random snapshot instruction; obtaining the transmission start signal time T1 of the snapshot frame data to be distinguished; obtaining the mean value tn of the time interval differences of several adjacent frame data transmissions; obtaining the acquisition completion time T2 of the frame data to be distinguished; and, when T2 falls within [T1, T1+tn], determining that the frame data to be distinguished is the capture frame; wherein the interval between the transmission start signal time T1 and the snapshot instruction sending time T0 is greater than the time required by the field programmable gate array to pre-process the frame data. The mean value tn of the time interval differences of adjacent frame data transmissions monitors the performance of the frame data processing front end, and the monitored mean value is updated in real time, so the work of acquiring the capture frame can be completed smoothly with good stability. The method adapts to dynamic snapshot instructions and can simply and effectively distinguish snapshot frames from ordinary frames.

Description

Method and device for distinguishing dynamic capture frames
Technical Field
The invention relates to the technical field of image processing, in particular to a method and a device for distinguishing dynamic capture frames.
Background
With the growth of China's overall economic strength and income levels, the number of motor vehicles has been increasing rapidly at 10-20% per year, road construction has accelerated, and the level of urbanization nationwide keeps rising. This further aggravates the contradiction between the current state of traffic management and its demands, and traffic-related criminal and public security cases also increase year by year, in particular vehicles fleeing along roads after accidents or crimes, vehicle theft and robbery, and driving in violation of regulations.
An intelligent vehicle monitoring and recording system for public security checkpoints is very useful for continuously and automatically recording the composition, flow distribution, and violation situation of vehicles travelling on highways (national roads). It provides important basic and operational data for traffic planning, traffic management, and road maintenance departments, and provides important technical means and evidence for quickly correcting traffic violations and for quickly solving hit-and-run traffic accidents and motor vehicle robbery cases; it is therefore of great significance to the safe operation of highways (national roads) and to improving the rapid-response capability of highway traffic management.
The intelligent vehicle monitoring and recording system at a security checkpoint uses advanced photoelectric, computer, image processing, pattern recognition, and WEB data access technologies to continuously record, in real time and around the clock, a front feature image of every motor vehicle on the monitored road, a panoramic image of the vehicle, and a real-time video stream of the road surface. A license plate recognizer automatically recognizes the vehicle's number plate from the captured image, can perform dynamic vehicle control and violation alarming, and can organically share the information of each monitoring point over the public security network. The acquisition of the front feature image of each motor vehicle is important for license plate recognition.
At present, the front feature image is generally acquired by firing a strobe flash while the image sensor operates at low gain, so that only a single frame of data is affected and the normal frames before and after it are not influenced. A program on the processor typically recognizes this capture frame by a particular method and notifies the upper-layer business process. The current method for the processor to recognize the capture frame is as follows: at the snapshot moment, before the data reaches the processor, a feature code is added to the snapshot frame data by hardware, and after the program on the processor identifies the feature code it notifies the upper-layer application.
In the above method, adding the feature code to the snapshot frame data by hardware is usually implemented by a Field-Programmable Gate Array (FPGA), and it is aimed at the case where the processor can directly acquire the unprocessed sensor data. However, if hardware preprocessing takes place before the processor obtains the data, the feature code will be changed, causing the method to fail, so that capture frames cannot be correctly distinguished from the continuous frame data.
Disclosure of Invention
In view of this, embodiments of the present invention provide a method and an apparatus for distinguishing a dynamic captured frame, so as to solve the problem in the prior art that a captured frame cannot be effectively distinguished from continuous frame data.
According to a first aspect, an embodiment of the present invention provides a method for distinguishing dynamic capture frames, including: obtaining the sending time T0 of a random snapshot instruction; obtaining the transmission start signal time T1 of the snapshot frame data to be distinguished; obtaining the mean value tn of the time interval differences of a plurality of adjacent frame data transmissions; obtaining the acquisition completion time T2 of the frame data to be distinguished; and, when T2 falls within [T1, T1+tn], determining that the frame data to be distinguished is the capture frame; wherein the interval between the transmission start signal time T1 and the snapshot instruction sending time T0 is greater than the time required by the field programmable gate array to pre-process the frame data.
Optionally, before obtaining the transmission start signal time T1 of the snapshot frame data to be distinguished, the method further includes: obtaining a time average value for preprocessing all frame data within a first preset time range; comparing the time average value with a preset time average value to obtain a comparison result; and determining, based on the comparison result, that the error between the time average value and the preset time average value is within a preset range.
Optionally, obtaining the mean value tn of the time interval differences of a plurality of adjacent frame data transmissions includes: obtaining the acquisition completion times of all frame data within the second preset time range, recorded as s1, s2, …, sn, sn+1; and calculating the time interval difference mean value:

tn = [(s2 - s1) + (s3 - s2) + … + (sn+1 - sn)] / n = (sn+1 - s1) / n
Optionally, a time average value of preprocessing all frame data within a first preset time range is obtained, where the preprocessing includes noise reduction.
Optionally, before obtaining the transmission start signal time T1 of the snapshot frame data to be distinguished, the method further includes: sending a snapshot instruction to the field programmable gate array; and adjusting the exposure parameter and the gain parameter of the image sensor.
According to a second aspect, an embodiment of the present invention provides a device for distinguishing dynamic capture frames, including a first obtaining module, a second obtaining module, a third obtaining module, a fourth obtaining module, and a first determining module, where:
The first acquisition module is configured to obtain the sending time T0 of the random snapshot instruction; the second acquisition module is configured to obtain the transmission start signal time T1 of the snapshot frame data to be distinguished; the third acquisition module is configured to obtain the mean value tn of the time interval differences of a plurality of adjacent frame data transmissions; the fourth acquisition module is configured to obtain the acquisition completion time T2 of the frame data to be distinguished; and the first determining module is configured to determine, when T2 falls within [T1, T1+tn], that the frame data to be distinguished is the capture frame; wherein the interval between the transmission start signal time T1 and the snapshot instruction sending time T0 is greater than the time required by the field programmable gate array to pre-process the frame data.
Optionally, the method further comprises: the fifth acquisition module is used for acquiring a time average value for preprocessing all frame data in the first preset time range; the comparison module is used for comparing the time average value with a preset time average value to obtain a comparison result; and the second determination module is used for determining that the error between the time mean value and the preset time mean value is in a preset range based on the comparison result.
Optionally, the third obtaining module includes: a first obtaining unit, configured to obtain the acquisition completion time of all frame data within the second preset time range, which is recorded as s1,s2,…,sn,sn+1(ii) a A calculating unit, configured to calculate the time interval difference mean value:
Figure BDA0001857595700000032
Optionally, the apparatus further comprises a preprocessing module, wherein the preprocessing module comprises a noise reduction unit.
Optionally, the apparatus further comprises: the sending module is used for sending a snapshot instruction to the field programmable gate array; and the adjusting module is used for adjusting the exposure parameter and the gain parameter of the image sensor.
According to a third aspect, an embodiment of the present invention provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method for distinguishing dynamic capture frames according to any implementation of the first aspect.
According to a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which computer instructions are stored, and the instructions, when executed by a processor, implement the steps of the dynamic snapshot frame distinguishing method according to any one of the first aspect.
The embodiment of the invention provides a method and a device for distinguishing dynamic snapshot frames. The method includes: obtaining the sending time T0 of a random snapshot instruction; obtaining the transmission start signal time T1 of the snapshot frame data to be distinguished; obtaining the mean value tn of the time interval differences of a plurality of adjacent frame data transmissions; obtaining the acquisition completion time T2 of the frame data to be distinguished; and, when T2 falls within [T1, T1+tn], determining that the frame data to be distinguished is the capture frame; wherein the interval between the transmission start signal time T1 and the snapshot instruction sending time T0 is greater than the time required by the field programmable gate array to pre-process the frame data. The mean value tn of the time interval differences of adjacent frame data transmissions monitors the performance of the frame data processing front end, and the snapshot instruction sending time T0, the frame data transmission start time T1 and the frame data acquisition completion time T2 must satisfy the relationship: T2 falls within [T1, T1+tn], and the interval between the transmission start signal time T1 and the snapshot instruction sending time T0 is greater than the time required by the field programmable gate array to pre-process the frame data. The monitored time interval difference mean value is updated in real time, so the work of acquiring the capture frame can be completed smoothly with good stability; the method adapts to dynamic snapshot instructions and can simply and effectively distinguish snapshot frames from ordinary frames.
The embodiment of the invention further provides a method and a device for distinguishing dynamic snapshot frames in which, before obtaining the transmission start signal time T1 of the first snapshot frame data to be distinguished, the method further comprises: obtaining a time average value for preprocessing all frame data within a first preset time range; comparing the time average value with a preset time average value to obtain a comparison result; and determining that the comparison result is smaller than a preset value. By monitoring the average frame data preprocessing time over a period of time, the performance of the frame data processing front end 25 is monitored indirectly so as to judge whether the snapshot frame has been dropped; this is applicable regardless of whether the device performance is high or low, so the dynamic adaptability is good.
Drawings
The features and advantages of the present invention will be more clearly understood by reference to the accompanying drawings, which are illustrative and not to be construed as limiting the invention in any way, and in which:
FIG. 1 is a flow chart illustrating a method for differentiating dynamic capture frames in an embodiment of the present invention;
FIG. 2 is a block diagram of a system employing a method for distinguishing dynamic snapshot frames according to an embodiment of the present invention;
FIG. 3 shows a block diagram of a device for distinguishing dynamic capture frames according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
FIG. 5 illustrates a snapshot implementation of an embodiment of the present invention;
fig. 6 shows a block diagram of a system structure of a distinguishing device for dynamic capture frames according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
One applicable scenario of the method and the device for distinguishing dynamic snapshot frames in the embodiments of the present invention is a public security checkpoint. The snapshot is random: it is performed only when a condition is met, for example when a coil detector detects that a vehicle is passing.
Example one
Fig. 1 shows a flowchart of a method for distinguishing dynamic snapshot frames in an embodiment of the present invention, and fig. 2 shows a system structure block diagram of a method for distinguishing dynamic snapshot frames in an embodiment of the present invention, where the method for distinguishing dynamic snapshot frames includes:
Step S10, obtaining the sending time T0 of the random snapshot instruction.
The sending time T0 of the snapshot instruction sent by the snapshot logic 21 to the field programmable gate array 22 (hereinafter FPGA22) is obtained. Meanwhile, the snapshot logic 21 indirectly controls the image sensor 23 by configuring the field programmable gate array 22, and the image sensor 23 switches its shooting settings to the snapshot mode. The snapshot logic 21 runs on the processor and determines the working mode of the image sensor 23: normal shooting mode or snapshot mode. Owing to the limitations of the actual circuit, the snapshot logic 21 is not directly connected to the image sensor 23 but is connected through the FPGA22; it therefore cannot directly change the configuration of the image sensor 23, and instead changes the shooting parameter settings of the image sensor 23 indirectly by configuring the FPGA22.
Step S20, obtaining the transmission start signal time T1 of the snapshot frame data to be distinguished.
After the snapshot is completed, the image sensor 23 sends the snapshot frame data to the FPGA22; for example, the image sensor 23 is an IMX178 and outputs its data signal over LVDS. The FPGA22 converts the received frame data into MIPI signals and sends them to the snapshot logic 21. While the FPGA22 is sending the frame data to the snapshot logic 21, the frame data acquisition controller 24 sends a transmission start signal time to the snapshot logic 21; this start time is the transmission start signal time T1 of the snapshot frame data. The frame data acquisition controller 24 completes the reception and decoding of the frame data in the physical layer, and when it has decoded the frame data into a complete frame it sends a frame data acquisition completion signal to the snapshot logic 21.
Step S30, obtaining the mean value tn of the time interval differences of a plurality of adjacent frame data transmissions.
The transmission time interval difference of adjacent frame data is the difference between the transmission start time of the current frame data and that of the previous frame, or the difference between the transmission completion time of the current frame data and that of the previous frame.
The working mode of the public security checkpoint intelligent vehicle monitoring and recording system is as follows: the image sensor 23 takes pictures of the road at certain time intervals to obtain frame data; the frame data is transmitted to the FPGA22, and since the image sensor 23 outputs LVDS, the FPGA22 needs to convert it into MIPI output for subsequent processing, which is the format processing performed before the frame data is touched by an upper-layer application; the FPGA22 then transmits the frame data to the snapshot logic 21 or the upper-layer application, and before this transmission the frame data processing front end 25 (corresponding to the circuit of the snapshot logic 21) performs preprocessing, including noise reduction, on the frame data, which is the preliminary image processing before the frame data is touched by the upper-layer application; after the preprocessing is completed, the snapshot logic 21 or the upper-layer application can access the frame data, and the system releases the storage space to store the next frame of data to be preprocessed.
The reciprocal of the mean value tn of the time interval differences of several adjacent frame data transmissions represents the frame data acquisition frequency during that period, i.e. the acquisition frame rate of the system. The time interval difference mean value therefore directly reflects the performance of the frame data processing front end 25: if the reciprocal of the mean value is consistent with or close to the frame rate set by the upper-layer application, the performance of the frame data processing front end 25 is considered to satisfy the current acquisition frame rate.
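As an illustration of this check (not part of the claimed method; the timestamps, tolerance and function names below are assumptions), a minimal C sketch that computes the mean interval difference from a set of acquisition completion times and compares its reciprocal with the frame rate set by the upper-layer application might look as follows:

```c
#include <stdio.h>
#include <math.h>

/* Mean of the n interval differences between n+1 completion times s[0..n];
 * the sum of adjacent differences telescopes to (s[n] - s[0]) / n. */
static double mean_interval_diff(const double *s, int n)
{
    return (s[n] - s[0]) / (double)n;
}

int main(void)
{
    /* Hypothetical completion times (in seconds) of 6 consecutive frames. */
    double s[] = {0.0000, 0.0334, 0.0667, 0.1001, 0.1334, 0.1667};
    int n = 5;                      /* number of intervals                        */
    double set_frame_rate = 30.0;   /* frame rate set by the upper-layer application */

    double tn = mean_interval_diff(s, n);
    double actual_rate = 1.0 / tn;  /* reciprocal of the mean interval difference */

    /* The front end keeps up if the measured rate is close to the set rate. */
    if (fabs(actual_rate - set_frame_rate) < 1.0)
        printf("front end satisfies the acquisition frame rate (%.2f fps)\n", actual_rate);
    else
        printf("possible overload: measured %.2f fps vs set %.2f fps\n",
               actual_rate, set_frame_rate);
    return 0;
}
```

Because the sum of adjacent differences telescopes, the mean reduces to (sn+1 - s1)/n, which is exactly what the sketch computes and what step S32 below formalizes.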
Step S40, acquiring the acquisition completion time T2 of the frame data to be distinguished.
After the image sensor 23 shoots the snapshot frame, the snapshot frame data is transmitted through the circuit, and the snapshot logic 21 or the upper-layer application can only access the frame data after it has been stored. The acquisition completion time T2 of the frame data is the time at which it is stored; at that point the snapshot logic 21 may mark the storage space for identification.
The system will not release the storage space for storing the current frame data until the frame data processing front-end 25 has processed the previous frame data. If the frame data is stored after each frame is shot, an additional storage device is needed to store the frame data to be processed.
However, there is only one frame data processing front end. If the performance of the frame data processing front end 25 is insufficient, and the current capture frame has been fully shot and format-converted by the FPGA22 while the previous frame has not yet been completely processed, the current capture frame obtains no storage space and its data is lost (a dropped frame); its acquisition completion time cannot be obtained, and the acquisition completion time actually obtained belongs to some frame after the capture frame. Similarly, even if an additional storage device is added, frames may still be dropped if the storage space is insufficient while the performance of the frame data processing front end 25 is insufficient.
Because of possible frame loss, this step alone cannot determine whether the frame data whose acquisition completion time is T2 is the snapshot frame to be distinguished.
Step S50, when T2 falls within [T1, T1+tn], determining that the frame data to be distinguished is the capture frame. The interval between the transmission start signal time T1 and the snapshot instruction sending time T0 is greater than the time required by the field programmable gate array to pre-process the frame data.
Case 1: if the snapshot instruction sending time T0 falls in a period during which the image sensor 23 is not shooting, then without doubt the interval between the transmission start signal time T1 and the snapshot instruction sending time T0 can be greater than the time required by the field programmable gate array to format-convert the frame data. Specifically, the time the FPGA22 needs to format-convert the frame data can be tested beforehand to obtain an average value or a time range.
In checkpoint monitoring, even when video is being recorded, a frame rate is set for the recording, i.e. there is a time interval between the shooting of two adjacent frames; the next frame is not shot the instant the previous one finishes. When the performance of the frame data processing front end 25 is sufficient, there is also an idle period, after the preprocessing of the previous frame is completed and before the next frame is shot, in which no operation is performed. In case 1 above, considering only the state of the image sensor 23, the snapshot instruction occurs in the idle time between the shooting of two adjacent frames, and naturally the frame whose transmission start signal time is closest to the snapshot instruction sending time is the capture frame.
Case 2: if the snapshot instruction sending time T0 falls within the period during which the image sensor 23 is shooting, what the snapshot logic 21 captures are the transmission start and completion times of the current frame (the frame being shot), which is obviously not the capture frame; the T1 and T2 the processor needs to capture are the transmission start time and transmission completion time of the next frame. Setting the above condition excludes the interference of this data. After the capture frame is distinguished, the snapshot logic 21 marks its storage space, and the mark is not affected by the processing of any subsequent frame data, which guarantees the accuracy of the distinguishing result.
In case 2 above, since the snapshot logic 21 and the image sensor 23 operate completely independently, the image sensor 23 simply shoots according to its preset parameters and frame rate. When the snapshot instruction generated by the snapshot logic 21 reaches the image sensor 23, if the image sensor 23 is shooting a normal frame, it does not stop shooting; the FPGA22 changes the parameter configuration only after the image sensor 23 has finished shooting that frame, so in this case the next frame is the capture frame. This is why the interval between the transmission start signal time T1 and the snapshot instruction sending time T0 is constrained as defined above.
In the present embodiment, the mean value tn of the time interval differences of a plurality of adjacent frame data transmissions monitors the performance of the frame data processing front end 25, and the snapshot instruction sending time T0, the frame data transmission start time T1 and the frame data acquisition completion time T2 must satisfy the relationship: T2 falls within [T1, T1+tn], and the interval between the transmission start signal time T1 and the snapshot instruction sending time T0 is greater than the time required by the field programmable gate array to pre-process the frame data. The monitored time interval difference mean value is updated in real time, so the work of acquiring the capture frame can be completed smoothly with good stability; the method adapts to dynamic snapshot instructions and can simply and effectively distinguish snapshot frames from ordinary frames.
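To make the relationship between T0, T1, T2 and tn concrete, the following is a minimal C sketch of the decision rule of steps S10-S50; the numeric timestamps and the name is_snapshot_frame are illustrative assumptions rather than part of the patent:

```c
#include <stdbool.h>
#include <stdio.h>

/* Decision rule of steps S10-S50: the frame whose acquisition completion time
 * T2 falls inside [T1, T1 + tn] is treated as the capture frame, provided the
 * transmission start signal trails the snapshot instruction by more than the
 * FPGA preprocessing time. All times are in seconds. */
static bool is_snapshot_frame(double T0, double T1, double T2,
                              double tn, double fpga_preproc_time)
{
    if (T1 - T0 <= fpga_preproc_time)   /* constraint on T1 relative to T0 */
        return false;
    return (T2 >= T1) && (T2 <= T1 + tn);
}

int main(void)
{
    /* Hypothetical timestamps for illustration only. */
    double T0 = 1.000;   /* snapshot instruction sending time            */
    double T1 = 1.010;   /* transmission start signal time of the frame  */
    double T2 = 1.038;   /* acquisition completion time of the frame     */
    double tn = 0.0333;  /* mean interval difference of adjacent frames  */
    double fpga_preproc_time = 0.004; /* measured beforehand             */

    printf("capture frame? %s\n",
           is_snapshot_frame(T0, T1, T2, tn, fpga_preproc_time) ? "yes" : "no");
    return 0;
}
```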
As an optional implementation manner, before step S20, the method further includes:
step S11, obtaining a time average value of preprocessing all frame data within the first preset time range.
The first preset time range includes at least several frames of data before the current frame, and may also include all the data in a period of time before the current frame. The frame data preprocessing time is the interval between the processing completion times of adjacent frames handled by the frame data processing front end 25 within the preset time range, i.e. the interval at which the storage space is released within that time range.
And step S12, comparing the time average value with a preset time average value to obtain a comparison result.
The preset time average value may be derived from the acquisition frame rate set by the upper-layer application. For example, if the set acquisition frame rate is 30 frames per second, the time interval between two adjacent frames is 1/30 second, and the preprocessing time average of all frame data within the first preset time range is compared with 1/30 second.
In step S13, it is determined that the error between the time average value and the preset time average value is within a preset range based on the comparison result.
The preprocessing performance of the frame data processing front end 25 is not always matched with the shooting interval of the image sensor 23, and the device aging can cause the frame data processing front end 25 not to finish the preprocessing of the previous frame in the time interval of two adjacent frames. Here, the capturing interval of the image sensor 23 coincides with the acquisition frame rate set by the upper layer application mentioned in step S12.
In the present embodiment, the performance of the frame data processing front end 25 is monitored to detect the situation that may occur in step S40: the average preprocessing time of all frame data within a period of time is compared with the reciprocal of the acquisition frame rate. For example, with an acquisition frame rate of 30 frames per second, if the comparison error is within ±3 ms, the performance of the frame data processing front end 25 is considered to match the shooting interval of the image sensor 23; if the error exceeds 3 ms, the performance of the frame data processing front end 25 is considered insufficient, and an alarm is sent to notify the upper-layer application that frames are being dropped, which means that if a capture frame is currently being distinguished, the capture frame may be lost and cannot be distinguished normally.
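A rough sketch of this monitoring step (steps S11-S13) is given below; the function name, the way the preprocessing times are collected, and passing the ±3 ms tolerance as a parameter are assumptions for illustration:

```c
#include <math.h>
#include <stdbool.h>

/* Steps S11-S13 as a sketch: compare the average preprocessing time of recent
 * frames against the frame period implied by the configured acquisition frame
 * rate, within a given tolerance (e.g. 0.003 s). Returns true when the front
 * end is keeping up, false when a frame-loss alarm should be raised. */
static bool front_end_keeps_up(const double *preproc_times, int count,
                               double set_frame_rate, double tolerance_s)
{
    double sum = 0.0;
    for (int i = 0; i < count; ++i)
        sum += preproc_times[i];
    double mean_preproc = sum / (double)count;   /* measured time average      */
    double frame_period = 1.0 / set_frame_rate;  /* preset time average (1/fps) */
    return fabs(mean_preproc - frame_period) <= tolerance_s;
}

/* Example usage: front_end_keeps_up(times, 30, 30.0, 0.003)
 * compares the measured average against 1/30 s with a 3 ms tolerance. */
```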
In this embodiment, if the actual acquisition frame rate reflected by the time interval difference between two adjacent frames of data transmission is smaller than or equal to the acquisition frame rate set by the upper application, it is determined that the result of distinguishing the capturing frame is valid.
If the actual acquisition frame rate reflected by the time interval difference of two adjacent frame data transmissions is greater than the acquisition frame rate set by the upper-layer application, there is an abnormality: the frame data processing front end 25 is overloaded, the preprocessing cannot be completed in time, the storage space cannot be released in time, and frames are eventually dropped. In that case an alarm device can additionally be provided that is triggered when frames are dropped and sends alarm information to the upper-layer application; the upper-layer application then appropriately reduces the acquisition frame rate, reduces the memory occupation, reduces the CPU workload, or reduces the network packet sending rate, thereby lowering resource usage and alleviating the frame loss.
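As a hedged illustration of this alarm-and-throttle response (the callback mechanism and the 20% reduction step are assumptions, not taught by the patent), one possible sketch is:

```c
#include <stdbool.h>

/* Sketch of the frame-loss alarm response described above; the alarm callback
 * and the step by which the frame rate is reduced are illustrative assumptions. */
typedef void (*alarm_fn)(const char *msg);

static double handle_frame_loss(double set_frame_rate, bool frame_loss_detected,
                                alarm_fn alarm)
{
    if (!frame_loss_detected)
        return set_frame_rate;               /* nothing to do                   */
    alarm("frame loss detected: front end overloaded");
    /* The upper-layer application lowers the acquisition frame rate to reduce
     * memory, CPU and network load until preprocessing keeps up again. */
    double reduced = set_frame_rate * 0.8;   /* assumed 20% reduction per step  */
    return reduced < 1.0 ? 1.0 : reduced;
}
```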
By monitoring the average frame data preprocessing time over a period of time, the performance of the frame data processing front end 25 is monitored indirectly so as to judge whether the snapshot frame has been dropped; this is applicable regardless of whether the device performance is high or low, so the dynamic adaptability is good.
As an alternative embodiment, step S30 includes:
step S31, acquiring the acquisition completion time of all frame data in the second preset time range, and recording the acquisition completion time as S1,s2,…,sn,sn+1
Step S32, calculating the time interval difference mean value:

tn = [(s2 - s1) + (s3 - s2) + … + (sn+1 - sn)] / n = (sn+1 - s1) / n
In this embodiment, the mean value of the time interval differences may instead be calculated from the transmission start times of all the frame data within the preset time range. The preset time range includes at least a period of time before the transmission start time of the previous frame data. The reciprocal of the time interval difference mean value tn represents the frame data acquisition frequency during that period, so the mean value directly reflects the performance of the frame data processing front end 25; if its reciprocal is consistent with or close to the frame rate set by the upper-layer application, the performance of the frame data processing front end 25 is considered to satisfy the current acquisition frame rate.
As an alternative embodiment, in step S11, the preprocessing includes noise reduction.
In the present embodiment, the frame data processing front end 25 receives the frame data format-converted by the FPGA22, and performs preprocessing including noise reduction thereon.
As an optional implementation manner, before step S20, the method further includes:
and step S14, sending a snapshot instruction to the field programmable gate array.
In step S15, the exposure parameter and the gain parameter of the image sensor are adjusted.
In the present embodiment, as shown in fig. 2, the snapshot logic 21 sends a snapshot instruction to the FPGA22, and at the same time, performs snapshot configuration on the FPGA22, and the FPGA22 controls the image sensor 23 to adjust shooting parameters. After shooting is finished, the FPGA22 and the image sensor 23 are both changed to the original configuration, and common frames are still shot before the next snapshot instruction is sent.
Example two
Fig. 3 shows a block diagram of a discriminating apparatus for dynamic capture frames, the apparatus comprising:
a first obtaining module 301, configured to obtain sending time T of the random snapshot instruction0
A second obtaining module 302, configured to obtain a transmission start signal time T of the snapshot frame data to be distinguished1
A third obtaining module 303, configured to obtain a mean time interval difference t of transmission of a plurality of adjacent frame datan
A fourth obtaining module 304, configured to obtain the acquisition completion time T of the frame data to be distinguished2
The first determination module 305, at T2In [ T ]1,T1+tn]And the method is used for determining the frame data to be distinguished as the capture frames to be distinguished.
Wherein the transmission start signal time T1And snapshot instruction sending time T0The interval of (a) is greater than the time required for the field programmable gate array to pre-process the frame data.
As an optional implementation, further comprising:
a fifth obtaining module 306, configured to obtain a time average of preprocessing all frame data within a first preset time range.
And a comparing module 307, configured to compare the time average with a preset time average to obtain a comparison result.
A second determining module 308, configured to determine that an error between the time mean and the preset time mean is within a preset range based on the comparison result.
As an optional implementation, the third obtaining module 303 includes:
a first obtaining unit 3031, configured to obtain the acquisition completion time of all frame data within a second preset time range, which is recorded as s1,s2,…,sn,sn+1
A calculating unit 3032, configured to calculate a time interval difference mean value:
Figure BDA0001857595700000121
as an optional implementation, the apparatus further comprises a preprocessing module 309, wherein the preprocessing module 309 comprises a noise reduction unit 3091.
As an optional implementation, further comprising:
and the sending module 310 is configured to send a snapshot instruction to the field programmable gate array.
And an adjusting module 311, configured to adjust an exposure parameter and a gain parameter of the image sensor.
Further description of the functions of the modules is the same as that of the first embodiment, and is not repeated herein.
EXAMPLE III
Fig. 4 shows a schematic structural diagram of an electronic device according to an embodiment of the present invention. The electronic device may include a processor 401 and a memory 404, which may be connected by a bus or in another manner; connection by a bus is taken as an example in fig. 4.
Processor 401 may be a Central Processing Unit (CPU). The Processor 401 may also be other general purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, or combinations thereof.
The memory 404 is a non-transitory computer readable storage medium, and can be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules corresponding to the method for distinguishing dynamic capture frames in the embodiment of the present invention (for example, the first obtaining module 301, the second obtaining module 302, the third obtaining module 303, the fourth obtaining module 304, and the first determining module 305 shown in fig. 3). The processor 401 executes various functional applications and data processing of the processor by running non-transitory software programs, instructions and modules stored in the memory 404, that is, implements the method for distinguishing dynamic capture frames in the above method embodiments.
The memory 404 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created by the processor 401, and the like. Further, the memory 404 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, memory 404 may optionally include memory located remotely from processor 401, which may be connected to processor 401 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more modules are stored in the memory 404 and when executed by the processor 401 perform a method of distinguishing dynamic capture frames as in the embodiment shown in fig. 1.
The specific details of the electronic device may be understood by referring to the corresponding related descriptions and effects in the embodiments shown in fig. 1 to fig. 3, and are not described herein again.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic Disk, an optical Disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a Flash Memory (Flash Memory), a Hard Disk (Hard Disk Drive, abbreviated as HDD) or a Solid State Drive (SSD), etc.; the storage medium may also comprise a combination of memories of the kind described above.
Fig. 5 shows a snapshot implementation diagram in an embodiment of the present invention, in which an APQ8056 is a processor model, an IMX178 is an image sensor model: the processor sends a snapshot instruction 501 to an FPGA (field programmable gate array), the FPGA sends a snapshot configuration signal 502 to the image sensor, the image sensor sends snapshot frame data 503 to the FPGA, the FPGA sends a snapshot frame data transmission starting signal 504 to the processor, and the FPGA sends the snapshot frame data 505 to the processor.
FIG. 6 is a block diagram of a system of a distinguishing device for dynamic capture frames according to an embodiment of the present invention: the snapshot logic sends a snapshot instruction to the FPGA and configures the FPGA, and the processor indirectly controls the image sensor to adjust the setting parameters for snapshot; the image sensor sends snapshot frame data to the FPGA; the FPGA sends a frame data transmission starting signal to the snapshot logic and simultaneously transmits the snapshot frame data to a frame data acquisition controller; the snapshot logic receives a snapshot frame data transmission completion signal sent by the frame data acquisition controller, and configures the frame data acquisition controller to mark a storage space in which the snapshot frame data are stored; and the frame data processing front end receives the snapshot frame data, performs preprocessing operation including noise reduction, and sends a frame data processing completion signal to the snapshot logic. The upper layer application can then directly process the snapshot frame data.
Although the embodiments of the present invention have been described in conjunction with the accompanying drawings, those skilled in the art may make various modifications and variations without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope defined by the appended claims.

Claims (12)

1. A method for differentiating dynamic capture frames, comprising:
obtaining the sending time T0 of a random snapshot instruction;
obtaining the transmission start signal time T1 of snapshot frame data to be distinguished;
obtaining the mean value tn of the time interval differences of a plurality of adjacent frame data transmissions;
obtaining the acquisition completion time T2 of the frame data to be distinguished;
when T2 falls within [T1, T1+tn], determining that the frame data to be distinguished is the capture frame;
wherein the interval between the transmission start signal time T1 and the snapshot instruction sending time T0 is greater than the time required by the field programmable gate array to pre-process the frame data.
2. The method for differentiating a dynamic capture frame according to claim 1, wherein before obtaining the transmission start signal time T1 of the snapshot frame data to be distinguished, the method further comprises:
acquiring a time average value for preprocessing all frame data in a first preset time range;
comparing the time average value with a preset time average value to obtain a comparison result;
and determining that the error between the time mean value and the preset time mean value is in a preset range based on the comparison result.
3. The method of claim 1, wherein obtaining the mean value tn of the time interval differences of a plurality of adjacent frame data transmissions comprises:
obtaining the acquisition completion times of all frame data within a second preset time range, recorded as s1, s2, …, sn, sn+1;
calculating the time interval difference mean value:

tn = [(s2 - s1) + (s3 - s2) + … + (sn+1 - sn)] / n = (sn+1 - s1) / n
4. the method according to claim 2, wherein a time average of preprocessing all frame data within a first preset time range is obtained, wherein the preprocessing includes noise reduction.
5. The method for differentiating a dynamic capture frame according to claim 1, wherein before obtaining the transmission start signal time T1 of the first snapshot frame data to be distinguished, the method further comprises:
sending a snapshot instruction to the field programmable gate array;
and adjusting the exposure parameter and the gain parameter of the image sensor.
6. An apparatus for differentiating a dynamic capture frame, comprising:
a first obtaining module for obtaining the sending time T of the random snapshot instruction0
A second obtaining module for obtaining the transmission start signal time T of the snapshot frame data to be distinguished1
A third obtaining module, configured to obtain a mean time difference t between transmission time intervals of multiple adjacent frame datan
A fourth obtaining module for obtaining the acquisition completion time T of the frame data to be distinguished2
A first determination module at T2In [ T ]1,T1+tn]When the frame data is the captured frame to be distinguished, determining the frame data to be distinguished as the captured frame to be distinguished;
wherein the transmission start signal time T1And the snapshot instruction sending time T0The interval of (a) is greater than the time required for the field programmable gate array to pre-process the frame data.
7. The apparatus for differentiating a dynamic capture frame of claim 6, further comprising:
the fifth acquisition module is used for acquiring a time average value for preprocessing all frame data in the first preset time range;
the comparison module is used for comparing the time average value with a preset time average value and acquiring a comparison result;
and the second determination module is used for determining that the error between the time mean value and the preset time mean value is in a preset range based on the comparison result.
8. The apparatus for differentiating a dynamic capture frame according to claim 6, wherein said third obtaining module comprises:
a first obtaining unit, configured to obtain the acquisition completion time of all frame data within a second preset time range, which is recorded as s1,s2,…,sn,sn+1
A calculating unit, configured to calculate the time interval difference mean value:
Figure FDA0002489748300000031
9. the apparatus for differentiating a dynamic capture frame of claim 7, further comprising a preprocessing module, wherein said preprocessing module comprises a noise reduction unit.
10. The apparatus for differentiating a dynamic capture frame of claim 6, further comprising:
the sending module is used for sending a snapshot instruction to the field programmable gate array;
and the adjusting module is used for adjusting the exposure parameter and the gain parameter of the image sensor.
11. An electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method of any one of claims 1-5.
12. A computer-readable storage medium having stored thereon computer instructions, which when executed by a processor, perform the steps of the method of any of claims 1-5.
CN201811321354.0A 2018-11-07 2018-11-07 Method and device for distinguishing dynamic capture frames Active CN109243179B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811321354.0A CN109243179B (en) 2018-11-07 2018-11-07 Method and device for distinguishing dynamic capture frames

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811321354.0A CN109243179B (en) 2018-11-07 2018-11-07 Method and device for distinguishing dynamic capture frames

Publications (2)

Publication Number Publication Date
CN109243179A CN109243179A (en) 2019-01-18
CN109243179B true CN109243179B (en) 2020-11-03

Family

ID=65077384

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811321354.0A Active CN109243179B (en) 2018-11-07 2018-11-07 Method and device for distinguishing dynamic capture frames

Country Status (1)

Country Link
CN (1) CN109243179B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110620701B (en) * 2019-09-12 2024-03-08 北京百度网讯科技有限公司 Data stream monitoring processing method, device, equipment and storage medium
CN117376495B (en) * 2023-12-06 2024-02-23 苏州元脑智能科技有限公司 Image relay device, in-vehicle apparatus, vehicle, and data transmission method

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101893804B (en) * 2010-05-13 2012-02-29 杭州海康威视软件有限公司 Exposure control method and device
CN102194320B (en) * 2011-04-25 2014-05-21 杭州海康威视数字技术股份有限公司 High-definition network intelligent camera and high-definition network intelligent shooting method
KR101858695B1 (en) * 2012-04-09 2018-05-16 엘지전자 주식회사 Method for managing data
CN103680137B (en) * 2012-09-18 2016-07-06 浙江大华技术股份有限公司 Image acquiring method and device based on intelligent traffic monitoring system
CN103856764B (en) * 2012-11-30 2016-07-06 浙江大华技术股份有限公司 A kind of device utilizing double-shutter to be monitored
US20150007057A1 (en) * 2013-07-01 2015-01-01 Cisco Technlogy, Inc. System and Method for Application Sharing
CN104580908B (en) * 2014-12-31 2019-03-05 惠州Tcl移动通信有限公司 A kind of video capture method and mobile terminal
CN105681653A (en) * 2016-01-12 2016-06-15 深圳市云智易联科技有限公司 Video file generation method and device
CN106534667B (en) * 2016-10-31 2020-02-11 努比亚技术有限公司 Distributed collaborative rendering method and terminal

Also Published As

Publication number Publication date
CN109243179A (en) 2019-01-18

Similar Documents

Publication Publication Date Title
CN110312057B (en) Intelligent video processing device
WO2018223955A1 (en) Target monitoring method, target monitoring device, camera and computer readable medium
CN110910655A (en) Parking management method, device and equipment
CN102194320B (en) High-definition network intelligent camera and high-definition network intelligent shooting method
CN112291520B (en) Abnormal event identification method and device, storage medium and electronic device
CN111105621B (en) Method and device for detecting illegal parking
CN109243179B (en) Method and device for distinguishing dynamic capture frames
CN106851229B (en) Security and protection intelligent decision method and system based on image recognition
CN111862627A (en) Traffic violation stop snapshot automatic processing system
CN110610610A (en) Vehicle access management method and device and storage medium
WO2020258720A1 (en) Blocking detection method and apparatus for image acquisition device, device, and storage medium
CN112601049B (en) Video monitoring method and device, computer equipment and storage medium
CN106981104B (en) Charging monitoring method, device, server and system
KR101610295B1 (en) Intelligent enforcement system for illegal parking enforcement and providing method thereof
KR101595545B1 (en) System and method for evaluating performance of image device of intelligent transportation system
CN105138321B (en) The control method and system of terminal
CN111818286A (en) Video monitoring equipment fault detection system
CN114973135A (en) Head-shoulder-based sequential video sleep post identification method and system and electronic equipment
CN113112814B (en) Snapshot method and device without stopping right turn and computer storage medium
CN115410112A (en) Elevator abnormal stay recognition method and device and storage medium
KR100936443B1 (en) Vehicle monitoring system and method using the same
CN114359828A (en) Target behavior recording method, device, storage medium and electronic device
WO2013173994A1 (en) Embedded system board, method, front device, backend server and system for video surveillance
CN111241879A (en) Vehicle detection method and device, electronic equipment and readable storage medium
CN110853363A (en) Vehicle traffic violation identification and video extraction device and method based on multiple algorithms

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant