CN112148769A - Data synchronization method, device, storage medium and electronic device - Google Patents

Data synchronization method, device, storage medium and electronic device

Info

Publication number
CN112148769A
Authority
CN
China
Prior art keywords
data information
group
data
time
information
Prior art date
Legal status
Pending
Application number
CN202010970355.9A
Other languages
Chinese (zh)
Inventor
李冬冬
李乾坤
卢维
殷俊
王凯
汪巧斌
泮江江
汤文轩
Current Assignee
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd
Priority to CN202010970355.9A
Publication of CN112148769A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 - Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/24 - Querying
    • G06F 16/245 - Query processing
    • G06F 16/2458 - Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
    • G06F 16/2474 - Sequence data queries, e.g. querying versioned data
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 - Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/27 - Replication, distribution or synchronisation of data between databases or within a distributed database system; Distributed database system architectures therefor

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

The embodiment of the invention provides a data synchronization method, a data synchronization device, a storage medium and an electronic device. The method comprises: acquiring a first group of data information from a first device and a second group of data information from a second device, wherein the first group of data information and the second group of data information are data information obtained by the first device and the second device, respectively, performing multiple information acquisitions on a target area within a predetermined time period, and the data acquisition frequencies of the first device and the second device are different; and matching the first group of data information with the second group of data information based on the acquisition times of the first group of data information and of the second group of data information to obtain a matching result. This solves the technical problem in the related art that the data acquired by data acquisition devices are not synchronized in time and are therefore difficult to utilize effectively, improves the utilization efficiency of the acquired data, and accelerates subsequent processing.

Description

Data synchronization method, device, storage medium and electronic device
Technical Field
The embodiment of the invention relates to the field of communication, and in particular to a data synchronization method, a data synchronization device, a storage medium and an electronic device.
Background
In the prior art, traditional security terminal equipment mainly consists of visible light cameras, but a visible light camera cannot work at night. An infrared camera can compensate for this shortcoming of the visible light camera, but this undoubtedly increases cost and operational difficulty. In addition, optical sensors are affected by the weather, and the monitoring effect is unsatisfactory in heavy fog, rain or snow, so area monitoring technology based on millimeter wave radar has become a hot topic of recent research. A millimeter wave radar actively transmits electromagnetic waves and receives signals of the same frequency; it has a very high detection probability for moving objects or objects with a large RCS (radar cross section), and a lower (but non-zero) detection probability for static objects. A millimeter wave radar can work around the clock and is less affected by the weather.
A millimeter wave radar can monitor many kinds of targets; the targets the user is interested in should be extracted from them, while targets the user is not interested in, or false targets, should be terminated/filtered as soon as possible. One purpose of target trajectory classification is to screen/filter targets. For example, in a park, a swaying tree forms a low-speed target track that moves within a small range; the target type is neither person, vehicle nor animal, so it does not need to be reported, or the track-ending method should be called as soon as possible to delete the track. If a small dog runs through the park, the trajectory should also be terminated in time, since it is not a target the user cares about (the user cares about persons or vehicles). If the track is formed by a pedestrian, the radar outputs the pedestrian's track information to the camera, and the camera takes pictures or records video according to the spatial position information of the track provided by the radar.
However, in the related art, since data acquisition devices (e.g., radars or cameras) need to perform corresponding processing on the data after acquiring it, and the acquisition frequencies of different data acquisition devices differ, data from different data acquisition devices cannot be determined to have been acquired at the same time. It is therefore difficult to jointly analyze the data acquired by different devices, which leads to the technical problem that the data acquired by the data acquisition devices are difficult to utilize effectively.
For the technical problem in the related art that the data acquired by data acquisition devices are not synchronized in time and are difficult to utilize effectively, no effective solution has yet been proposed.
Disclosure of Invention
The embodiment of the invention provides a data synchronization method, a data synchronization device, a storage medium and an electronic device, which are used for at least solving the technical problems that data acquired by data acquisition equipment in the related technology are not synchronized in time and are difficult to be effectively utilized.
According to an embodiment of the present invention, there is provided a data synchronization method including: acquiring a first group of data information from first equipment and a second group of data information from second equipment, wherein the first group of data information is data information obtained by the first equipment performing multiple information acquisition on a target area within a preset time period, the second group of data information is data information obtained by the second equipment performing multiple information acquisition on the target area within the preset time period, and the data acquisition frequencies of the first equipment and the second equipment are different; and matching the data information included in the first group of data information with the data information included in the second group of data information based on the acquisition time of the first group of data information acquired by the first equipment and the acquisition time of the second group of data information acquired by the second equipment to obtain a matching result.
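The two steps of the method can be illustrated with a minimal sketch. The code below is not part of the patent; the class and function names are illustrative assumptions. It only assumes that every piece of data information carries an acquisition timestamp, and it matches each entry of the first group to the entry of the second group whose acquisition time is closest:

```python
from dataclasses import dataclass
from typing import Any, List, Tuple


@dataclass
class DataInfo:
    acq_time: float  # acquisition time of this piece of data information (seconds)
    payload: Any     # the collected content itself (target points, an image, ...)


def match_by_time(first_group: List[DataInfo],
                  second_group: List[DataInfo]) -> List[Tuple[DataInfo, DataInfo]]:
    """For each entry of first_group, pick the entry of second_group whose
    acquisition time differs the least (nearest-neighbour matching in time)."""
    matches: List[Tuple[DataInfo, DataInfo]] = []
    for a in first_group:
        if not second_group:
            break
        b = min(second_group, key=lambda x: abs(x.acq_time - a.acq_time))
        matches.append((a, b))
    return matches
```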
According to another embodiment of the present invention, there is provided a data synchronization apparatus including: an obtaining module, configured to obtain a first set of data information from a first device and a second set of data information from a second device, where the first set of data information is data information obtained by the first device performing multiple information acquisitions on a target area within a predetermined time period, the second set of data information is data information obtained by the second device performing multiple information acquisitions on the target area within the predetermined time period, and data acquisition frequencies of the first device and the second device are different; and the matching module is used for matching the data information included in the first group of data information with the data information included in the second group of data information based on the acquisition time of the first group of data information acquired by the first equipment and the acquisition time of the second group of data information acquired by the second equipment to obtain a matching result.
According to a further embodiment of the present invention, there is also provided a computer-readable storage medium having a computer program stored thereon, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
According to yet another embodiment of the present invention, there is also provided an electronic device, including a memory in which a computer program is stored and a processor configured to execute the computer program to perform the steps in any of the above method embodiments.
According to the invention, the first group of data information from the first device and the second group of data information from the second device are obtained, where the two groups are data information obtained by the first device and the second device, respectively, performing multiple information acquisitions on the target area within the predetermined time period, and the data acquisition frequencies of the two devices are different; the first group of data information and the second group of data information are then matched based on the acquisition time of the first group and the acquisition time of the second group to obtain the matching result. Therefore, the technical problem in the related art that the data acquired by data acquisition devices are not synchronized in time and are difficult to utilize effectively can be solved, improving the utilization efficiency of the acquired data, accelerating subsequent processing, and ensuring time synchronization of the acquired data.
Drawings
Fig. 1 is a block diagram of a hardware configuration of a mobile terminal of a data synchronization method according to an embodiment of the present invention;
FIG. 2 is a flow chart of an alternative method of synchronizing data according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of an alternative method of synchronizing data according to an embodiment of the invention;
FIG. 4 is a schematic diagram of an alternative method of synchronizing data according to an embodiment of the present invention;
FIG. 5 is a flow diagram illustrating an alternative method for synchronizing data according to an embodiment of the present invention;
FIG. 6 is a flow chart illustrating an alternative method for synchronizing data according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of an alternative method of synchronizing data according to an embodiment of the invention;
FIG. 8 is a flow chart illustrating an alternative method for synchronizing data according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of an alternative data synchronization apparatus according to an embodiment of the present invention.
Detailed Description
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings in conjunction with the embodiments.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
The method embodiments provided in the embodiments of the present application may be executed in a mobile terminal, a computer terminal, or a similar computing device. Taking the example of the method running on the mobile terminal, fig. 1 is a hardware structure block diagram of the mobile terminal of the data synchronization method according to the embodiment of the present invention. As shown in fig. 1, the mobile terminal may include one or more (only one shown in fig. 1) processors 102 (the processor 102 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA), and a memory 104 for storing data, wherein the mobile terminal may further include a transmission device 106 for communication functions and an input-output device 108. It will be understood by those skilled in the art that the structure shown in fig. 1 is only an illustration, and does not limit the structure of the mobile terminal. For example, the mobile terminal may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
The memory 104 may be used for storing computer programs, for example, software programs and modules of application software, such as computer programs corresponding to the data synchronization method in the embodiment of the present invention, and the processor 102 executes various functional applications and data processing by running the computer programs stored in the memory 104, so as to implement the above-mentioned method. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the mobile terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used for receiving or transmitting data via a network. Specific examples of the network described above may include a wireless network provided by a communication provider of the mobile terminal. In one example, the transmission device 106 includes a Network adapter (NIC), which can be connected to other Network devices through a base station so as to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module, which is used for communicating with the internet in a wireless manner.
In the present embodiment, a method for synchronizing data running on a mobile terminal, a computer terminal or a similar computing device is provided, and fig. 2 is a flowchart of an alternative method for synchronizing data according to an embodiment of the present invention, as shown in fig. 2, the flowchart includes the following steps:
s202, acquiring a first group of data information from first equipment and a second group of data information from second equipment, wherein the first group of data information is data information obtained by the first equipment performing multiple information acquisition on a target area within a preset time period, the second group of data information is data information obtained by the second equipment performing multiple information acquisition on the target area within the preset time period, and the data acquisition frequencies of the first equipment and the second equipment are different;
s204, based on the acquisition time of the first group of data information acquired by the first equipment and the acquisition time of the second group of data information acquired by the second equipment, matching the data information included in the first group of data information with the data information included in the second group of data information to obtain a matching result.
Optionally, in this embodiment, the application scenarios of the data synchronization method may include, but are not limited to, parks, construction sites, intersections, roads, warehouse entry/exit in parks, gates, and the like.
The above is merely an example, and the present embodiment does not limit this.
Optionally, in this embodiment, the first device may include, but is not limited to, a radar device, a video camera, a sound recording device, a scanner, and other terminal devices capable of data acquisition, and the second device may include, but is not limited to, the same or different from the first device.
Optionally, in this embodiment, fig. 3 is a schematic diagram of a network architecture of an optional data synchronization method according to an embodiment of the present invention. As shown in fig. 3, a first device 302 acquires a first group of data information and transmits it to an MCU 304, and a second device 306 acquires a second group of data information and transmits it to the MCU 304. The MCU 304 matches the first group of data information with the second group of data information based on the acquisition time at which the first device 302 acquired the first group and the acquisition time at which the second device 306 acquired the second group, so as to obtain a corresponding matching result. The MCU 304 may include, but is not limited to, a preconfigured chip or system capable of completing the matching processing.
Optionally, in this embodiment, the matching result may be obtained by, for each piece of data information in the first group of data information, determining from the second group of data information the data information corresponding to its acquisition time, and then generating the corresponding matching result.
Optionally, in this embodiment, the first group of data information may include, but is not limited to, data information obtained by the first device performing information acquisition on a target area. The target area may be preset by the system, and may also be configured according to actual service needs. The data information obtained by the first device and by the second device performing multiple information acquisitions on the target area within the predetermined time period may include, but is not limited to, data information obtained by the two devices performing multiple information acquisitions on the same target area within the same time period. The length of the predetermined time period may be preset by the system, and may also be configured so that data collection continues until a preset condition is reached, at which point the predetermined time period ends. The preset conditions may include, but are not limited to, reaching the maximum storage capacity of a storage device used for storing the first group of data information and/or the second group of data information, the first device and/or the second device failing to operate properly, and the like.
The above is merely an example, and the present embodiment is not limited in any way.
Optionally, in this embodiment, the data acquisition frequency of the second device is different from the data acquisition frequency of the first device, which may include but is not limited to that the data acquisition frequency of the first device is less than the data acquisition frequency of the second device or that the data acquisition frequency of the first device is greater than the data acquisition frequency of the second device.
It should be noted that, in the case where the data acquisition frequency of the first device is lower than that of the second device, the first group of data information may be taken as the reference, and for each piece of data information in the first group, the data information having a matching relationship with it may be selected from the second group; in the case where the data acquisition frequency of the first device is higher than that of the second device, the second group of data information may be taken as the reference, and for each piece of data information in the second group, the data information having a matching relationship with it may be selected from the first group.
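A short sketch of this reference selection, reusing the match_by_time helper and DataInfo type sketched above; the frequency parameters and the function name are illustrative assumptions:

```python
from typing import List, Tuple


def match_groups(first_group: "List[DataInfo]", first_freq_hz: float,
                 second_group: "List[DataInfo]", second_freq_hz: float
                 ) -> "List[Tuple[DataInfo, DataInfo]]":
    """Take the group collected at the lower acquisition frequency as the
    reference and match entries of the other group to it; pairs are always
    returned in (first group entry, second group entry) order."""
    if first_freq_hz <= second_freq_hz:
        return match_by_time(first_group, second_group)
    # The first device is the faster one: use the second group as the reference.
    return [(a, b) for (b, a) in match_by_time(second_group, first_group)]
```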
Optionally, in this embodiment, the time for the first device to acquire the first set of data information and the time for the second device to acquire the second set of data information may include, but is not limited to, the time for the first device and the second device to start data acquisition, and may also include, but is not limited to, the time before the data is further processed after the data acquisition is completed.
Taking the first device as a millimeter wave radar as an example, the radar signal processing process may be as shown in fig. 4, where the process includes the following steps:
S402, starting;
S404, the millimeter wave radar (corresponding to the aforementioned first device) transmits a signal;
S406, the signal transmitted by the millimeter wave radar contacts a target and is reflected;
S408, receiving the returned signal after transmission;
S410, sending a control instruction and carrying out signal processing;
S412, transmitting the control instruction;
S414, encoding the acquired data;
S416, transmitting the encoded data (corresponding to the first group of data information);
S418, next cycle.
In the above flow, t_r1 denotes the time at which the radar transmits the signal, and Δt_r12 denotes the time taken for the radar signal to travel from its emission to the detected target. t_r2 denotes the time at which the radar signal contacts the target, i.e., the time point at which the radar obtains the target measurement, which is the acquisition time of the first group of data information acquired by the first device, so t_r2 = t_r1 + Δt_r12. The return of the radar signal is generally considered to take the same time; since the speed of light is very high, it is considered that Δt_r12 ≈ 0 and therefore t_r2 ≈ t_r1. Δt_r23 denotes the signal-processing process applied to the radar wave signal; its input may include, but is not limited to, radar waves, and its output may include, but is not limited to, the number of target points and detailed target point information. The time consumption of Δt_r23 is not negligible. Δt_r34 is the encoding process, and its time consumption is not negligible. Δt_r45 is the transmission process of the encoded data, and its time consumption is not negligible. The valid information of the control instruction sent in S410 occupies only one byte (the frame number) and does not need to be encoded, so the encoding time of the control instruction is negligible. Δt_rc is the transmission process of the control instruction in S412; its time consumption is in principle not negligible, but the content transmitted in S412 is only one byte, which is negligible compared with the encoded data transmitted in S416.
It should be noted that the radar operates at a fixed frame rate, and its transmitted signal is an unconstrained active behavior. That is, whether there is an object in the monitored scene or not, the above-described processes of S402 to S408 are performed.
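The radar-side flow of fig. 4 can be sketched as two channels per frame: the one-byte frame number goes out immediately on the fast control channel (S410/S412), while signal processing, encoding and transmission of the target data (S410/S414/S416) follow on the slower data channel, tagged with the same frame number. The function names and transports below are placeholders, not an actual radar API:

```python
def process_returns(raw_returns):
    # Placeholder for radar signal processing (S410): resolve the target points.
    return raw_returns


def encode(frame_no, targets):
    # Placeholder for encoding (S414): prepend the frame number to the payload.
    return bytes([frame_no & 0xFF]) + repr(targets).encode()


def radar_frame_cycle(frame_no, raw_returns, send_ctrl, send_data):
    """One radar cycle (S404 to S418), sketched.

    send_ctrl(byte_value) -- fast control channel to the MCU (one byte, not encoded)
    send_data(payload)    -- slower data channel carrying the encoded target points
    """
    # S410/S412: send the frame number at once; its arrival time at the MCU
    # approximates the acquisition time t_r2 of this frame.
    send_ctrl(frame_no & 0xFF)

    # S410/S414/S416: processing, encoding and transmission are serial and
    # comparatively slow (their time consumption is not negligible).
    targets = process_returns(raw_returns)
    send_data(encode(frame_no, targets))

    return (frame_no + 1) % 256  # frame numbers occupy one byte and cycle 0..255
```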
Taking the second device as a video sensor as an example, the video sensor signal processing process can be as shown in fig. 5, and the process includes the following steps:
S502, starting;
S504, the object emits light/reflects light;
S506, the photosensitive device (corresponding to the video sensor) receives the signal;
S508, sending a control instruction and carrying out signal processing;
S510, transmitting the control instruction;
S512, encoding the acquired data;
S514, transmitting the encoded data (corresponding to the second group of data information);
S516, next cycle.
In the above flow, the video sensor is a passive device. t_c1 denotes the time at which light leaves the target object, and Δt_c12 denotes the time taken for the light signal to travel from the target to the video sensor. t_c2 denotes the time point corresponding to the target state obtained by the sensor, so t_c2 = t_c1 + Δt_c12. The transmission of the optical signal takes time, but since light is fast, it is considered that Δt_c12 ≈ 0 and therefore t_c2 ≈ t_c1. Δt_c23 denotes the signal-processing process; its input may include, but is not limited to, light, and its output may include, but is not limited to, an RGB image or a YUV image. The time consumption of Δt_c23 is not negligible. Δt_c34 is the encoding process, and its time consumption is not negligible. Δt_c45 is the transmission process of the encoded data, and its time consumption is not negligible. The valid information of the control instruction sent in S508 occupies only one byte (the frame number) and does not need to be encoded, so the encoding time of the control instruction is negligible. Δt_cc is the transmission process of the control instruction in S510; its time consumption is in principle not negligible, but the control instruction transmitted in S510 is only one byte, which is negligible compared with the encoded data transmitted in S514.
It should be noted that the video sensor operates at a fixed frame rate, and the signal processing is an unconstrained active behavior. That is, the process from S502 to S516 is performed completely regardless of whether there is an object in the monitored scene.
According to the invention, the first group of data information from the first device and the second group of data information from the second device are obtained, where the two groups are data information obtained by the first device and the second device, respectively, performing multiple information acquisitions on the target area within the predetermined time period, and the data acquisition frequencies of the two devices are different; the first group of data information and the second group of data information are then matched based on the acquisition time of the first group and the acquisition time of the second group to obtain the matching result. Therefore, the technical problem in the related art that the data acquired by data acquisition devices are not synchronized in time and are difficult to utilize effectively can be solved, improving the utilization efficiency of the acquired data, accelerating subsequent processing, and ensuring time synchronization of the acquired data.
In an optional embodiment, before matching the data information included in the first set of data information with the data information included in the second set of data information based on the acquisition time of the first device acquiring the first set of data information and the acquisition time of the second device acquiring the second set of data information to obtain a matching result, the method further includes: receiving a first set of control information from the first device and a second set of control information from the second device; determining a time of receiving each control information included in the first set of control information as a time of acquiring the corresponding data information included in the first set of data information by the first device, and determining a time of receiving each control information included in the second set of control information as a time of acquiring the corresponding data information included in the second set of data information by the second device.
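A sketch of the receiving side described in this embodiment: the MCU timestamps every control message the moment it arrives and records that timestamp, keyed by the frame number the message carries, as the acquisition time of the corresponding data information. The use of time.monotonic() and the class layout are assumptions for illustration:

```python
import time
from collections import OrderedDict


class ControlChannelReceiver:
    """Records, for one device, the MCU receive time of each one-byte control command."""

    def __init__(self) -> None:
        # frame number -> time at which the control command reached the MCU
        self.acq_times: "OrderedDict[int, float]" = OrderedDict()

    def on_control_command(self, frame_no: int) -> None:
        # The receive time is taken as the acquisition time of the data
        # information that will later arrive tagged with the same frame number.
        self.acq_times[frame_no] = time.monotonic()


radar_rx = ControlChannelReceiver()  # receives the first group of control information
video_rx = ControlChannelReceiver()  # receives the second group of control information
```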
Optionally, in this embodiment, the receiving the first group of control information from the first device may include, but is not limited to, step S412 shown in fig. 4, and the first group of control information may include, but is not limited to, a control instruction, for example, a frame number, a serial number, and the like.
Taking the example that the first device is a millimeter wave radar, the radar sensor signal processing involves many processes. The time of receiving each piece of control information in the first group of control information is approximately t_r2, i.e., the moment at which the radar detects the target. The target (data) information transmission channel involves three processes whose time consumption is not negligible: signal processing Δt_r23, encoding Δt_r34 and transmission Δt_r45. These three processes are serial and fixed in time (the fluctuation is small and negligible), so the total delay of the data channel is approximately Δt_r23 + Δt_r34 + Δt_r45. The control instruction transmission channel involves only one process whose time consumption is not negligible, namely the transmission Δt_rc, and Δt_rc is far smaller than Δt_r23 + Δt_r34 + Δt_r45. Therefore, the time consumption of the control information transmission channel is far less than that of the data information transmission channel.
Taking the second device as a video sensor, the video sensor signal processing involves many processes. The time of receiving each piece of control information in the second group of control information is approximately t_c2, i.e., the moment at which the video sensor detects the target. The target (data) information transmission channel involves three processes whose time consumption is not negligible: signal processing Δt_c23, encoding Δt_c34 and transmission Δt_c45. These three processes are serial and fixed in time (the fluctuation is small and negligible), so the total delay of the data channel is approximately Δt_c23 + Δt_c34 + Δt_c45. The control instruction transmission channel involves only one process whose time consumption is not negligible, namely the transmission Δt_cc, and Δt_cc is far smaller than Δt_c23 + Δt_c34 + Δt_c45. Therefore, the time consumption of the control information transmission channel is far less than that of the data information transmission channel.
Through this embodiment, the MCU can effectively obtain the times at which the first device and the second device acquired the data information, and can then match the first group of data information with the second group of data information according to those acquisition times to generate the matching result. Therefore, the technical problem in the related art that the data acquired by data acquisition devices are not synchronized in time and are difficult to utilize effectively can be solved, improving the utilization efficiency of the acquired data, accelerating subsequent processing, and ensuring time synchronization of the acquired data.
In an optional embodiment, determining the time to receive each control information included in the first set of control information as the acquisition time for the first device to acquire the corresponding data information included in the first set of data information comprises: under the condition that the first group of control information comprises corresponding first identification information which is generated and sent by the first equipment according to the sequence of acquiring each data information included in the first group of data information, determining the time for receiving the first identification information as the time for acquiring the data information included in the first group of data information and corresponding to the first identification information; determining a time to receive each control information included in the second set of control information as a collection time at which the second device collects the corresponding data information included in the second set of data information includes: and under the condition that the second group of control information comprises corresponding second identification information which is generated and sent by the second equipment according to the sequence of acquiring each data information included in the second group of data information, determining the time for receiving the second identification information as the time for acquiring the data information included in the second group of data information and corresponding to the second identification information.
Optionally, in this embodiment, the first identifier and the second identifier may include, but are not limited to, identifiers capable of recording an acquisition time or an acquisition sequence of the data information, for example, a frame number, an image number, and the like.
Optionally, in this embodiment, the time of receiving the first identification information may be determined as the acquisition time of the data information corresponding to the first identification information included in the first group of data information, and likewise the time of receiving the second identification information may be determined as the acquisition time of the data information corresponding to the second identification information included in the second group of data information.
For example, when the MCU receives the first identification information, the time when the first identification information is received is determined as the time information corresponding to the first identification; and when the MCU receives the second identification information, determining the time for receiving the second identification information as the time information corresponding to the second identification, and matching based on the time information corresponding to the first identification and the time information corresponding to the second identification.
Through this embodiment, the matching correspondence between the first group of control information and the second group of control information can be completed efficiently, and the correspondence between the acquisition times of the first device and of the second device can then be obtained. Therefore, the technical problem in the related art that the data acquired by data acquisition devices are not synchronized in time and are difficult to utilize effectively can be solved, improving the utilization efficiency of the acquired data, accelerating subsequent processing, and ensuring time synchronization of the acquired data.
In an optional embodiment, matching the data information included in the first set of data information with the data information included in the second set of data information based on the acquisition time of the first device acquiring the first set of data information and the acquisition time of the second device acquiring the second set of data information comprises: binding the first identifier and the second identifier based on the time of receiving the first identifier and the time of receiving the second identifier so as to bind the first identifier and the second identifier with the smallest difference of the receiving times; and correspondingly matching the data information corresponding to the first identifier and included in the first group of data information with the data information corresponding to the second identifier and included in the second group of data information according to the binding result of the first identifier and the second identifier.
Optionally, in this embodiment, the binding between the first identifier and the second identifier may include, but is not limited to, as shown in fig. 6, where, taking the first device as a radar and the second device as a camera as an example, the binding relationship may include, but is not limited to, as follows:
the binding relationship is generated in a manner that the time difference between the radar control command and the video control command after binding is minimized. It should be noted that the time is based on the clock on the MCU.
The time marked in fig. 6 is the time when the signal reaches the MCU. In the binding, the radar signal is taken as the reference (the reference is the sensor operating at the lower frequency, and the radar operating frequency is generally lower than the video sensor operating frequency). As shown in fig. 6, arrow 602 represents the arrival time of a radar signal, and arrow 604 represents the arrival time of a video signal.
Since the video signal has a higher frequency than the radar signal and the binding process takes the radar signal as the reference, some video signals are occasionally left unbound. The dashed boxes in fig. 6 represent the signals that complete the binding.
The output of the control instruction binding process is pairs of bound control instructions together with their internal time differences dT_i, where dT_i denotes the time difference of the i-th binding result.
Through the embodiment, the technical effect of minimizing the time difference between the radar control command and the video control command after binding can be achieved.
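A minimal sketch of the binding shown in fig. 6, with the radar control commands (the lower-frequency sensor) as the reference: each radar command is bound to the still-unbound video command whose MCU receive time is closest, so the bound pairs have the smallest time differences, and video commands without a partner are left unbound, matching the empty slots in fig. 6. The output follows the description above as (radar frame number, video frame number, dT_i) tuples; the function name and the greedy pairing strategy are assumptions:

```python
from typing import Dict, List, Tuple


def bind_control_commands(radar_times: Dict[int, float],
                          video_times: Dict[int, float]
                          ) -> List[Tuple[int, int, float]]:
    """radar_times / video_times map a frame number to its MCU receive time.

    Returns (radar_frame_no, video_frame_no, dT) tuples, where dT is the
    receive-time difference of the bound pair (dT_i in the description)."""
    bindings: List[Tuple[int, int, float]] = []
    unbound = dict(video_times)  # video commands that have not been bound yet
    for r_no, r_time in sorted(radar_times.items(), key=lambda kv: kv[1]):
        if not unbound:
            break
        v_no = min(unbound, key=lambda k: abs(unbound[k] - r_time))
        bindings.append((r_no, v_no, unbound[v_no] - r_time))
        del unbound[v_no]  # each video command is bound at most once
    return bindings
```

Using the receiver sketch given earlier, bind_control_commands(radar_rx.acq_times, video_rx.acq_times) would yield the bound pairs that are subsequently transmitted to the SOC for data binding.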
In an optional embodiment, before matching the data information included in the first set of data information with the data information included in the second set of data information based on the acquisition time of the first device acquiring the first set of data information and the acquisition time of the second device acquiring the second set of data information to obtain a matching result, the method further includes: receiving a binding result from a target device, wherein the binding result is used for indicating a binding relationship between first identification information of the first group of data information collected by the first device and second identification information of the second group of data information collected by the second device, wherein first identification information of data information included in the first group of data information having a smallest difference in reception time is bound with second identification information of data information included in the second group of data information, the first identification information is generated by the first device according to the sequence of acquiring each data information included in the first group of data information and is sent to the target device, the second identification information is generated by the second device according to the sequence of acquiring each data information included in the second group of data information and is sent to the target device; and matching the data information included in the first group of data information with the data information included in the second group of data information based on the binding result to obtain a matching result.
Optionally, in this embodiment, the MCU obtains the control instruction sent by the first device and the control instruction sent by the second device to generate the binding relationship, and the binding relationship sent by the MCU is then received, so that the data information included in the first group of data information can be matched with the data information included in the second group of data information according to the binding relationship to obtain the matching result. Taking the first device as a radar, the second device as a camera, and the device generating the binding relationship as an MCU as an example, as shown in fig. 7, the MCU 702 receives the first identification information and the second identification information used for binding and sent by the radar 704 and the camera 706, generates the corresponding binding relationship, and then sends the binding relationship to the MCU/SOC 708, so that the data information included in the first group of data information is matched with the data information included in the second group of data information based on the binding result to obtain the matching result. This reduces the computational pressure on the MCU, allows the data acquisition synchronization process of the first device and the second device to be executed in a distributed manner, and achieves the technical effects of improving processing efficiency and accelerating time synchronization.
In an optional embodiment, the first identification information includes first frame number information, and the second identification information includes second frame number information.
In an optional embodiment, after matching the data information included in the first set of data information with the data information included in the second set of data information based on the acquisition time of the first device to acquire the first set of data information and the acquisition time of the second device to acquire the second set of data information, and obtaining a matching result, the method further includes: and inputting the matching result into a target recognition model to obtain a target recognition result, wherein the target recognition model is obtained by training the recognition model to be trained, and the target recognition result is used for indicating whether the matching result contains a target object.
Optionally, in this embodiment, the target recognition model may be a model obtained by training a recognition model to be trained through a learning manner including, but not limited to, a supervised manner or an unsupervised manner, and the target recognition result may include, but not limited to, whether the matching result includes the target object, may also include, but not limited to, a probability that the target area includes the target object, and may also include, but not limited to, a probability that a corresponding operation needs to be performed after the target object is recognized in the target area.
It should be noted that, taking the first device as a millimeter wave radar and the second device as a camera as an example, the millimeter wave radar can monitor many kinds of targets, extract the targets the user is interested in, and terminate/filter uninteresting or false targets as soon as possible. One purpose of target trajectory classification is to screen/filter targets. For example, in a park, a light (force 3) wind occasionally blows and the trees shake, forming a low-speed target track that moves within a small range; the target type is neither person, vehicle nor animal, so it does not need to be reported, or the track-ending method should be called as soon as possible to delete the track. If a small dog runs through the park, the trajectory should also be terminated in time, since it is not a target the user cares about (the user cares about persons or vehicles). If the track is formed by a pedestrian, the radar outputs the pedestrian's track information to the camera, and the camera takes pictures or records video according to the spatial position information of the track provided by the radar.
Through this embodiment, the data information acquired by the first device and the second device can be synchronized in time, and the corresponding data information acquired at the same time can then be input into the target recognition model to obtain the target recognition result, thereby achieving the technical effect of improving the utilization efficiency of the data information.
In an alternative embodiment, the first device comprises a millimeter wave radar; the second device includes a video sensor.
In an optional embodiment, before obtaining the first set of data information from the first device and the second set of data information from the second device, the method further comprises: controlling the millimeter wave radar to resolve a group of signals collected by the millimeter wave radar to obtain the number of a group of target points in the target area and corresponding target point information; generating the first group of data information according to the number of the group of target points and the corresponding target point information; under the condition that the target equipment comprises a video sensor, controlling the video sensor to perform signal processing on a group of signals acquired by the video sensor to obtain a group of image data; generating the second set of data information from the set of image data.
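Sketched data structures for the two groups produced in this embodiment: each radar frame carries the number of target points and the per-point information (distance, angle, RCS and radial velocity, as listed in step R4 below), and each video frame carries an RGB or YUV image. The field names are illustrative, not taken from the patent:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class RadarTargetPoint:
    distance_m: float
    angle_deg: float
    rcs: float                   # radar cross section
    radial_velocity_mps: float


@dataclass
class RadarFrame:                # one element of the first group of data information
    frame_no: int                # one byte, cycles 0..255
    points: List[RadarTargetPoint] = field(default_factory=list)

    @property
    def num_points(self) -> int:
        return len(self.points)


@dataclass
class VideoFrame:                # one element of the second group of data information
    frame_no: int                # one byte, cycles 0..255
    image: bytes                 # encoded RGB or YUV image data
```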
The invention will be further illustrated with reference to specific examples:
fig. 8 is a schematic flowchart of an alternative data synchronization method according to an embodiment of the present invention, and as shown in fig. 8, the flowchart includes the following steps:
taking the first device as a Radar (Radar), the second device as a video sensor (Camera), and the processing device as an MCU as an example, steps R1 to R8 are Radar processing flows, steps C1 to C7 are video sensor processing flows, and steps M1 to M4 are processing device processing flows.
R1: radar signal emission. This step is a fixed-period, automatically triggered process. Go to step R2.
R2: the radar signal contacts the target and is reflected. Go to step R3.
R3: the radar sensor receives the radar returns. Jump to R7 and R4 simultaneously; if successful, step R7 is executed. Go to step R4.
R4: radar signal processing. From the radar returns, the radar target points are resolved, such as the number of target points and the content of each target (distance, angle, RCS and radial velocity). This step has a low real-time requirement. Go to step R5.
R5: encoding. The radar target point information is encoded. This step has a low real-time requirement. Go to step R6.
R6: transmission. A processor on the radar transmits the radar target points to the SOC. This step has a low real-time requirement. Go to step M3.
R7: sending a control instruction. The instruction is the radar frame number; it occupies one byte and is an integer from 0 to 255, used cyclically: 0->1->…->254->255->0->1->…. This step has a higher real-time requirement and a higher priority. Go to step R8.
R8: transmission. The radar control instruction is sent to the MCU. This step has a higher real-time requirement and a higher priority. Go to step M1.
C1: the object emits/reflects light. This is the place and time at which the video signal is generated. Go to step C2.
C2: the photosensitive element of the video sensor senses the light. Jump to C6 and C3 simultaneously; if successful, step C6 is executed. Go to step C3.
C3: video signal processing, i.e., the process of obtaining RGB or YUV images. This step has a low real-time requirement. Go to step C4.
C4: encoding. This step has a low real-time requirement. Go to step C5.
C5: transmission. The processor on the video sensor transmits the YUV image to the SOC. This step has a low real-time requirement. Go to step M3.
C6: sending a control instruction. The instruction is the video frame number; it occupies one byte and is an integer from 0 to 255, used cyclically: 0->1->…->254->255->0->1->…. This step has a higher real-time requirement and a higher priority. Go to step C7.
C7: transmission. The video control instruction is sent to the MCU. This step has a higher real-time requirement and a higher priority. Go to step M1.
M1: MCU binding. The times of the control instructions are synchronized to obtain matched radar and video control instructions. This step has a low real-time requirement. Go to step M2.
M2: transmission. The bound control instructions are transmitted to the SOC. This step has a low real-time requirement. Go to step M3.
M3: data binding. On the SOC, the radar frame data and the video frame data are matched based on the bound control instructions (a sketch of this matching is given after step M4). Go to step M4.
M4: end.
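A sketch of step M3, assuming the SOC keeps the most recently received radar and video frames keyed by their one-byte frame numbers, and reusing the frame types sketched earlier; the names are illustrative:

```python
from typing import Dict, List, Tuple


def bind_frame_data(bindings: List[Tuple[int, int, float]],
                    radar_frames: Dict[int, "RadarFrame"],
                    video_frames: Dict[int, "VideoFrame"]
                    ) -> List[Tuple["RadarFrame", "VideoFrame"]]:
    """bindings: (radar_frame_no, video_frame_no, dT) tuples received from the
    MCU in step M2; radar_frames / video_frames: decoded frame data received in
    steps R6 / C5, keyed by frame number."""
    matched = []
    for r_no, v_no, _dT in bindings:
        if r_no in radar_frames and v_no in video_frames:
            matched.append((radar_frames[r_no], video_frames[v_no]))
    return matched
```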
Through this embodiment, multiple sensors can work cooperatively with all of them time-synchronized. The time synchronization effect is improved, erroneous information is prevented from being carried into subsequent links, and self-correction that is difficult to complete in subsequent links is avoided. The number of delay links is reduced, so the delay and its fluctuation are reduced; the control instruction transmission link carries little content, which further reduces delay and fluctuation. A symmetric hardware circuit design can also be implemented as a hardware trigger mechanism to further reduce the uncertainty of the delay. The time difference corresponding to the bound frames can be calculated accurately (the error range is controllable) rather than being a non-deterministic value with a large range, and few real-time dependency conditions are required.
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
In this embodiment, a data synchronization apparatus is further provided, and the apparatus is used to implement the foregoing embodiments and preferred embodiments, and the description already made is omitted. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
Fig. 9 is a block diagram of an alternative data synchronization apparatus according to an embodiment of the present invention, as shown in fig. 9, the apparatus includes:
an obtaining module 902, configured to obtain a first set of data information from a first device and a second set of data information from a second device, where the first set of data information is data information obtained by performing multiple information acquisitions on a target area by the first device within a predetermined time period, the second set of data information is data information obtained by performing multiple information acquisitions on the target area by the second device within the predetermined time period, and data acquisition frequencies of the first device and the second device are different;
a matching module 904, configured to match, based on the acquisition time for the first device to acquire the first group of data information and the acquisition time for the second device to acquire the second group of data information, data information included in the first group of data information and data information included in the second group of data information, so as to obtain a matching result.
In an optional embodiment, the apparatus is further configured to:
receiving first group control information from the first device and second group control information from the second device before matching data information included in the first group data information with data information included in the second group data information based on acquisition time of the first device for acquiring the first group data information and acquisition time of the second device for acquiring the second group data information to obtain a matching result; determining a time of receiving each control information included in the first set of control information as a time of acquiring the corresponding data information included in the first set of data information by the first device, and determining a time of receiving each control information included in the second set of control information as a time of acquiring the corresponding data information included in the second set of data information by the second device.
In an alternative embodiment of the method according to the invention,
the apparatus is further configured to determine a time to receive each control information included in the first set of control information as a collection time at which the first device collects corresponding data information included in the first set of data information by: under the condition that the first group of control information comprises corresponding first identification information which is generated and sent by the first equipment according to the sequence of acquiring each data information included in the first group of data information, determining the time for receiving the first identification information as the time for acquiring the data information included in the first group of data information and corresponding to the first identification information;
the apparatus is further configured to determine a time to receive each control information included in the second set of control information as a collection time at which the second device collects corresponding data information included in the second set of data information by: and under the condition that the second group of control information comprises corresponding second identification information which is generated and sent by the second equipment according to the sequence of acquiring each data information included in the second group of data information, determining the time for receiving the second identification information as the time for acquiring the data information included in the second group of data information and corresponding to the second identification information.
In an optional embodiment, the apparatus is further configured to match data information included in the first set of data information with data information included in the second set of data information based on a collection time of the first device collecting the first set of data information and a collection time of the second device collecting the second set of data information by: binding the first identifier and the second identifier based on the time of receiving the first identifier and the time of receiving the second identifier so as to bind the first identifier and the second identifier with the smallest difference of the receiving times; and correspondingly matching the data information corresponding to the first identifier and included in the first group of data information with the data information corresponding to the second identifier and included in the second group of data information according to the binding result of the first identifier and the second identifier.
In an optional embodiment, the apparatus is further configured to:
receiving a binding result from a target device before matching the data information included in the first group of data information with the data information included in the second group of data information based on the acquisition time of the first group of data information acquired by the first device and the acquisition time of the second group of data information acquired by the second device to obtain a matching result, wherein the binding result is used for indicating a binding relationship between first identification information of the first group of data information acquired by the first device and second identification information of the second group of data information acquired by the second device, wherein the first identification information of the data information included in the first group of data information is bound with the second identification information of the data information included in the second group of data information whose receiving times differ the least, the first identification information is generated by the first device in the order of acquiring each piece of data information included in the first group of data information and is sent to the target device, and the second identification information is generated by the second device in the order of acquiring each piece of data information included in the second group of data information and is sent to the target device; and matching the data information included in the first group of data information with the data information included in the second group of data information based on the binding result to obtain the matching result.
In an optional embodiment, the first identification information includes first frame number information, and the second identification information includes second frame number information.
In an optional embodiment, the apparatus is further configured to: after matching the data information included in the first group of data information with the data information included in the second group of data information based on the acquisition time of the first group of data information acquired by the first device and the acquisition time of the second group of data information acquired by the second device to obtain a matching result, input the matching result into a target recognition model to obtain a target recognition result, wherein the target recognition model is obtained by training a recognition model to be trained, and the target recognition result is used for indicating whether the matching result includes a target object.
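As a rough sketch of this recognition step, assuming the matched radar/image pair has already been converted into a single feature tensor for a trained PyTorch classifier that outputs one logit; the function name, the conversion, and the 0.5 threshold are illustrative assumptions, not part of the patent.

    import torch


    def detect_target(model: torch.nn.Module, pair_features: torch.Tensor) -> bool:
        """Return True when the trained recognition model judges that the matched
        radar/image feature tensor contains the target object; the model is
        assumed to output a single logit."""
        with torch.no_grad():
            score = torch.sigmoid(model(pair_features)).item()
        return score > 0.5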
In an alternative embodiment, the first device comprises a millimeter wave radar; the second device includes a video sensor.
In an optional embodiment, the apparatus is further configured to: before acquiring the first group of data information from the first device and the second group of data information from the second device, controlling the millimeter wave radar to resolve a group of signals collected by the millimeter wave radar, so as to obtain the number of a group of target points in the target area and the corresponding target point information; generating the first group of data information according to the number of the group of target points and the corresponding target point information; in a case where the target device includes a video sensor, controlling the video sensor to perform signal processing on a group of signals acquired by the video sensor to obtain a group of image data; and generating the second group of data information from the group of image data.
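For illustration only, the two groups of data information described above could be represented by data structures along these lines; the field names (frame_id, points, pixels) are assumptions, not terminology from the patent.

    from dataclasses import dataclass
    from typing import List, Tuple


    @dataclass
    class RadarFrame:
        """One entry of the first group: the resolved target points of one
        millimeter wave radar sweep, plus their count."""
        frame_id: int
        points: List[Tuple[float, float, float]]  # e.g. (range, azimuth, speed)

        @property
        def point_count(self) -> int:
            return len(self.points)


    @dataclass
    class ImageFrame:
        """One entry of the second group: one processed image from the video sensor."""
        frame_id: int
        pixels: bytes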
It should be noted that the above modules may be implemented by software or by hardware; for the latter, this may be achieved in, but is not limited to, the following manner: the modules are all located in the same processor; alternatively, the modules are respectively located, in any combination, in different processors.
Embodiments of the present invention also provide a computer-readable storage medium having a computer program stored thereon, wherein the computer program is arranged to perform the steps of any of the above-mentioned method embodiments when executed.
In the present embodiment, the above-mentioned computer-readable storage medium may be configured to store a computer program for executing the steps of:
S1, acquiring a first group of data information from a first device and a second group of data information from a second device, wherein the first group of data information is data information obtained by the first device performing multiple information acquisition on a target area within a preset time period, the second group of data information is data information obtained by the second device performing multiple information acquisition on the target area within the preset time period, and the data acquisition frequencies of the first device and the second device are different;
and S2, matching the data information included in the first group of data information with the data information included in the second group of data information based on the acquisition time of the first group of data information acquired by the first device and the acquisition time of the second group of data information acquired by the second device to obtain a matching result.
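Reusing the hypothetical helpers sketched earlier, steps S1 and S2 could be exercised roughly as follows; radar_frames and video_frames are assumed to be dictionaries of RadarFrame / ImageFrame objects keyed by frame identifier, and the recorders are assumed to have timestamped every control message while the data was being collected.

    from typing import Dict, List, Tuple


    def synchronize(radar_recorder: "AcquisitionTimeRecorder",
                    video_recorder: "AcquisitionTimeRecorder",
                    radar_frames: Dict[int, "RadarFrame"],
                    video_frames: Dict[int, "ImageFrame"]
                    ) -> List[Tuple["RadarFrame", "ImageFrame"]]:
        """S1 is assumed done: both groups were collected and every control
        message was timestamped by the recorders. S2: bind identifiers by the
        smallest receive-time difference and pair the data information."""
        bindings = bind_identifiers(radar_recorder.receive_times,
                                    video_recorder.receive_times)
        return [(radar_frames[r_id], video_frames[v_id])
                for r_id, v_id in bindings.items()]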
In an exemplary embodiment, the computer-readable storage medium may include, but is not limited to: various media capable of storing a computer program, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Embodiments of the present invention also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.
In an exemplary embodiment, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
In an exemplary embodiment, the processor may be configured to perform the following steps by means of a computer program:
S1, acquiring a first group of data information from a first device and a second group of data information from a second device, wherein the first group of data information is data information obtained by the first device performing multiple information acquisition on a target area within a preset time period, the second group of data information is data information obtained by the second device performing multiple information acquisition on the target area within the preset time period, and the data acquisition frequencies of the first device and the second device are different;
and S2, matching the data information included in the first group of data information with the data information included in the second group of data information based on the acquisition time of the first group of data information acquired by the first device and the acquisition time of the second group of data information acquired by the second device to obtain a matching result.
For specific examples in this embodiment, reference may be made to the examples described in the above embodiments and exemplary embodiments, and details of this embodiment are not repeated herein.
It will be apparent to those skilled in the art that the modules or steps of the invention described above may be implemented using a general-purpose computing device; they may be centralized on a single computing device or distributed across a network of computing devices; they may be implemented using program code executable by the computing devices, so that they may be stored in a storage device and executed by a computing device; in some cases, the steps shown or described may be performed in an order different from that described herein; alternatively, they may be fabricated separately as individual integrated circuit modules, or multiple of them may be fabricated as a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the principle of the present invention should be included in the protection scope of the present invention.

Claims (12)

1. A method for synchronizing data, comprising:
acquiring a first group of data information from first equipment and a second group of data information from second equipment, wherein the first group of data information is data information obtained by the first equipment performing multiple information acquisition on a target area within a preset time period, the second group of data information is data information obtained by the second equipment performing multiple information acquisition on the target area within the preset time period, and the data acquisition frequencies of the first equipment and the second equipment are different;
and matching the data information included in the first group of data information with the data information included in the second group of data information based on the acquisition time of the first group of data information acquired by the first equipment and the acquisition time of the second group of data information acquired by the second equipment to obtain a matching result.
2. The method of claim 1, wherein before matching the data information included in the first set of data information with the data information included in the second set of data information based on the acquisition time of the first device acquiring the first set of data information and the acquisition time of the second device acquiring the second set of data information to obtain a matching result, the method further comprises:
receiving a first set of control information from the first device and a second set of control information from the second device;
determining a time of receiving each control information included in the first set of control information as a time of acquiring the corresponding data information included in the first set of data information by the first device, and determining a time of receiving each control information included in the second set of control information as a time of acquiring the corresponding data information included in the second set of data information by the second device.
3. The method of claim 2, wherein:
determining a time to receive each control information included in the first set of control information as a collection time at which the first device collects the corresponding data information included in the first set of data information includes: under the condition that the first group of control information comprises corresponding first identification information which is generated and sent by the first equipment according to the sequence of acquiring each data information included in the first group of data information, determining the time for receiving the first identification information as the time for acquiring the data information included in the first group of data information and corresponding to the first identification information;
determining a time to receive each control information included in the second set of control information as a collection time at which the second device collects the corresponding data information included in the second set of data information includes: and under the condition that the second group of control information comprises corresponding second identification information which is generated and sent by the second equipment according to the sequence of acquiring each data information included in the second group of data information, determining the time for receiving the second identification information as the time for acquiring the data information included in the second group of data information and corresponding to the second identification information.
4. The method of claim 3, wherein matching data information included in the first set of data information with data information included in the second set of data information based on a time of acquisition for the first set of data information by the first device and a time of acquisition for the second set of data information by the second device comprises:
binding the first identifier and the second identifier based on the time of receiving the first identifier and the time of receiving the second identifier so as to bind the first identifier and the second identifier with the smallest difference of the receiving times;
and correspondingly matching the data information corresponding to the first identifier and included in the first group of data information with the data information corresponding to the second identifier and included in the second group of data information according to the binding result of the first identifier and the second identifier.
5. The method of claim 1, wherein before matching the data information included in the first set of data information with the data information included in the second set of data information based on the acquisition time of the first device acquiring the first set of data information and the acquisition time of the second device acquiring the second set of data information to obtain a matching result, the method further comprises:
receiving a binding result from a target device, wherein the binding result is used for indicating a binding relationship between first identification information of the first group of data information collected by the first device and second identification information of the second group of data information collected by the second device, wherein first identification information of data information included in the first group of data information having a smallest difference in reception time is bound with second identification information of data information included in the second group of data information, the first identification information is generated by the first device according to the sequence of acquiring each data information included in the first group of data information and is sent to the target device, the second identification information is generated by the second device according to the sequence of acquiring each data information included in the second group of data information and is sent to the target device;
and matching the data information included in the first group of data information with the data information included in the second group of data information based on the binding result to obtain a matching result.
6. The method according to any of claims 3 to 5, wherein the first identification information comprises first frame number information and the second identification information comprises second frame number information.
7. The method of claim 1, wherein after matching the data information included in the first set of data information with the data information included in the second set of data information based on the acquisition time of the first device acquiring the first set of data information and the acquisition time of the second device acquiring the second set of data information to obtain a matching result, the method further comprises:
and inputting the matching result into a target recognition model to obtain a target recognition result, wherein the target recognition model is obtained by training the recognition model to be trained, and the target recognition result is used for indicating whether the matching result contains a target object.
8. The method of claim 1, wherein:
the first device comprises a millimeter wave radar;
the second device includes a video sensor.
9. The method of claim 8, wherein prior to obtaining the first set of data information from the first device and the second set of data information from the second device, the method further comprises:
controlling the millimeter wave radar to resolve a group of signals collected by the millimeter wave radar to obtain the number of a group of target points in the target area and corresponding target point information;
generating the first group of data information according to the number of the group of target points and the corresponding target point information;
under the condition that the target equipment comprises a video sensor, controlling the video sensor to perform signal processing on a group of signals acquired by the video sensor to obtain a group of image data;
generating the second set of data information from the set of image data.
10. An apparatus for synchronizing data, comprising:
an obtaining module, configured to obtain a first set of data information from a first device and a second set of data information from a second device, where the first set of data information is data information obtained by the first device performing multiple information acquisitions on a target area within a predetermined time period, the second set of data information is data information obtained by the second device performing multiple information acquisitions on the target area within the predetermined time period, and data acquisition frequencies of the first device and the second device are different;
and the matching module is used for matching the data information included in the first group of data information with the data information included in the second group of data information based on the acquisition time of the first group of data information acquired by the first equipment and the acquisition time of the second group of data information acquired by the second equipment to obtain a matching result.
11. A computer-readable storage medium, in which a computer program is stored, wherein the computer program is arranged to perform the method of any of claims 1 to 9 when executed.
12. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is arranged to execute the computer program to perform the method of any of claims 1 to 9.
CN202010970355.9A 2020-09-15 2020-09-15 Data synchronization method, device, storage medium and electronic device Pending CN112148769A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010970355.9A CN112148769A (en) 2020-09-15 2020-09-15 Data synchronization method, device, storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010970355.9A CN112148769A (en) 2020-09-15 2020-09-15 Data synchronization method, device, storage medium and electronic device

Publications (1)

Publication Number Publication Date
CN112148769A true CN112148769A (en) 2020-12-29

Family

ID=73892301

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010970355.9A Pending CN112148769A (en) 2020-09-15 2020-09-15 Data synchronization method, device, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN112148769A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050044438A1 (en) * 2003-08-19 2005-02-24 Kabushiki Kaisha Toshiba Trace data processing apparatus and method
WO2017089355A1 (en) * 2015-11-24 2017-06-01 T2 Data Ab Data synchronization in a distributed data storage system
CN106408940A (en) * 2016-11-02 2017-02-15 南京慧尔视智能科技有限公司 Microwave and video data fusion-based traffic detection method and device
US20200104649A1 (en) * 2018-09-27 2020-04-02 Siemens Aktiengesellschaft Method, system, and computer program product for identifying device
CN111382768A (en) * 2018-12-29 2020-07-07 华为技术有限公司 Multi-sensor data fusion method and device
CN110741385A (en) * 2019-06-26 2020-01-31 Oppo广东移动通信有限公司 Gesture recognition method and device and location tracking method and device
CN110632849A (en) * 2019-08-23 2019-12-31 珠海格力电器股份有限公司 Intelligent household appliance, control method and device thereof and storage medium
CN110726990A (en) * 2019-09-23 2020-01-24 江苏大学 Multi-sensor fusion method based on DS-GNN algorithm
CN111192329A (en) * 2019-12-10 2020-05-22 苏州智加科技有限公司 Sensor calibration result verification method and device and storage medium
CN111291682A (en) * 2020-02-07 2020-06-16 浙江大华技术股份有限公司 Method and device for determining target object, storage medium and electronic device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
宋杰; 何友; 唐小明; 邱军海: "雷达视频回波信号的实时采集、显示与存储系统" [Real-time acquisition, display and storage system for radar video echo signals], 数据采集与处理 [Journal of Data Acquisition and Processing], no. 01, 30 May 2006 (2006-05-30) *
戴明桢, 唐明杰, 杨振琪: "多通道、大容量同步数据采集系统" [Multi-channel, high-capacity synchronous data acquisition system], 航空学报 [Acta Aeronautica et Astronautica Sinica], no. 08, 25 August 1994 (1994-08-25) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113473118A (en) * 2021-08-23 2021-10-01 追觅创新科技(苏州)有限公司 Data timestamp alignment method, device, equipment and storage medium
CN113473118B (en) * 2021-08-23 2024-05-14 追觅创新科技(苏州)有限公司 Data timestamp alignment method, device, equipment and storage medium
CN114710228A (en) * 2022-05-31 2022-07-05 杭州闪马智擎科技有限公司 Time synchronization method and device, storage medium and electronic device
CN114710228B (en) * 2022-05-31 2022-09-09 杭州闪马智擎科技有限公司 Time synchronization method and device, storage medium and electronic device

Similar Documents

Publication Publication Date Title
CN108073890B (en) Method and system for motion recognition in video sequences captured by a camera
CN109740004B (en) Filing method and device
US11301754B2 (en) Sharing of compressed training data for neural network training
CN109410278B (en) Target positioning method, device and system
CN112148769A (en) Data synchronization method, device, storage medium and electronic device
CN102065275B (en) Multi-target tracking method in intelligent video monitoring system
CN111368622A (en) Personnel identification method and device, and storage medium
CN111929672A (en) Method and device for determining movement track, storage medium and electronic device
CN111047622B (en) Method and device for matching objects in video, storage medium and electronic device
CN112633120A (en) Intelligent roadside sensing system based on semi-supervised learning and model training method
CN110519690A (en) The determination method and device in candidate search region, storage medium, electronic device
CN111125382A (en) Personnel track real-time monitoring method and terminal equipment
CN112770265A (en) Pedestrian identity information acquisition method, system, server and storage medium
CN111896941A (en) Target track determination method and device for radar data
CN110473015A (en) A kind of smart ads system and advertisement placement method
CN111640300B (en) Vehicle detection processing method and device
CN113160406A (en) Road three-dimensional reconstruction method and device, storage medium and electronic equipment
CN113160272A (en) Target tracking method and device, electronic equipment and storage medium
CN115830342A (en) Method and device for determining detection frame, storage medium and electronic device
CN113592003B (en) Picture transmission method, device, equipment and storage medium
CN111866468B (en) Object tracking distribution method, device, storage medium and electronic device
CN114945033A (en) Vehicle end data returning method, vehicle end controller, cloud server and vehicle
CN113469130A (en) Shielded target detection method and device, storage medium and electronic device
CN112836089A (en) Method and device for confirming motion trail, storage medium and electronic device
CN114092958A (en) Method, system, electronic device and readable storage medium for object re-identification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination