CN111682918B - Synchronous control method, device and system of sensor and storage medium


Info

Publication number
CN111682918B
Authority
CN
China
Prior art keywords
sensor
visibility
control device
synchronous control
illumination intensity
Legal status: Active
Application number
CN202010521864.3A
Other languages
Chinese (zh)
Other versions
CN111682918A
Inventor
谷之韬
Current Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Application filed by Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN202010521864.3A
Publication of CN111682918A
Application granted
Publication of CN111682918B

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04JMULTIPLEX COMMUNICATION
    • H04J3/00Time-division multiplex systems
    • H04J3/02Details
    • H04J3/06Synchronising arrangements
    • H04J3/0635Clock or time synchronisation in a network
    • H04J3/0638Clock or time synchronisation among nodes; Internode synchronisation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01DMEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00Measuring or testing not otherwise provided for
    • G01D21/02Measuring two or more variables by means not covered by a single other subclass
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04JMULTIPLEX COMMUNICATION
    • H04J3/00Time-division multiplex systems
    • H04J3/02Details
    • H04J3/06Synchronising arrangements
    • H04J3/0635Clock or time synchronisation in a network
    • H04J3/0638Clock or time synchronisation among nodes; Internode synchronisation
    • H04J3/0658Clock or time synchronisation among packet nodes
    • H04J3/0661Clock or time synchronisation among packet nodes using timestamps
    • H04J3/0664Clock or time synchronisation among packet nodes using timestamps unidirectional timestamps
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/12Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks

Abstract

The application provides a synchronous control method, device and system of a sensor, and a storage medium. In the method, a synchronous control device dynamically determines a first sensor from at least two sensors according to environment data acquired in real time, and treats the sensors other than the first sensor as second sensors. When the first sensor acquires data according to a preset pulse signal, the synchronous control device controls the second sensor to acquire data according to the pulse signal preset in the first sensor, so that the second sensor and the first sensor acquire data synchronously and the data acquired by the at least two sensors correspond correctly. Because the first sensor can be switched adaptively as the environment changes, the accuracy of the finally obtained fusion data is higher.

Description

Synchronous control method, device and system of sensor and storage medium
Technical Field
The present disclosure relates to the field of sensor automatic control technologies, and in particular, to a method, an apparatus, a system, and a storage medium for synchronous control of a sensor.
Background
With the continuous development of data acquisition technology, users expect to acquire multiple kinds of data in a target area: not only images, but also radar data and the like. Acquiring such varied data often relies on several different sensors.
In the prior art, when several sensors acquire data in the same target area, the sensors are synchronized by software, for example by controlling each sensor to acquire data at the same time point. However, different sensors have independent timing systems and timing deviations inevitably exist between them, so the acquisition times of the sensors are inconsistent and the data acquired by the sensors cannot be correctly matched in the time dimension.
Disclosure of Invention
The application provides a synchronous control method, device and system of sensors, and a storage medium, which enable the data collected by each sensor to correspond correctly in the time dimension.
In a first aspect, an embodiment of the present application provides a synchronous control method for sensors, including:
the synchronous control device dynamically determines a first sensor from at least two sensors according to environment data acquired in real time, and takes the sensors except the first sensor as second sensors;
when the first sensor carries out data acquisition according to a preset pulse signal, the synchronous control device controls the second sensor to carry out data acquisition according to the preset pulse signal in the first sensor, so that the second sensor and the first sensor carry out synchronous data acquisition.
In a second aspect, an embodiment of the present application provides a synchronization control apparatus, including:
the synchronous mode setting unit is used for dynamically determining a first sensor from at least two sensors according to environment data acquired in real time and taking the sensors except the first sensor as second sensors;
and the synchronous control unit is used for controlling the second sensor to carry out data acquisition according to a preset pulse signal in the first sensor so as to enable the second sensor and the first sensor to carry out synchronous data acquisition.
In a third aspect, an embodiment of the present application provides a synchronization control apparatus, including: a memory and a processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored by the memory, causing the processor to perform the method of synchronous control of a sensor according to the first aspect.
In a fourth aspect, an embodiment of the present application provides a synchronous control system for sensors, including: the synchronization control apparatus according to the third aspect and at least two sensors;
and the synchronous control device is respectively connected with each sensor and is used for controlling the at least two sensors to carry out synchronous data acquisition.
In a fifth aspect, an embodiment of the present application provides a storage medium, including: a readable storage medium and a computer program for implementing the synchronization control method of the sensor according to the first aspect.
The embodiments of the present application provide a synchronous control method, device and system of a sensor, and a storage medium. The synchronous control device dynamically determines a first sensor from at least two sensors according to environment data collected in real time and treats the sensors other than the first sensor as second sensors; it then controls the second sensor to acquire data according to the pulse signal preset in the first sensor, so that the second sensor and the first sensor acquire data synchronously and the data acquired by the at least two sensors correspond correctly. Because the first sensor that outputs the synchronization signal is selected from the multiple sensors according to the prevailing environmental conditions, the system switches adaptively between different synchronization modes under different environmental conditions or requirements, and the accuracy of the finally obtained fusion data is higher.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 is a schematic diagram of a synchronous control system of a sensor according to an embodiment of the present application;
Fig. 2 is a schematic diagram of another synchronous control system of a sensor according to an embodiment of the present application;
Fig. 3 is a schematic flowchart of a synchronous control method of a sensor according to an embodiment of the present application;
Fig. 4 is a schematic flowchart of another synchronous control method of a sensor according to an embodiment of the present application;
Fig. 5 is a schematic diagram of the hardware structure of a synchronous control system of a sensor according to an embodiment of the present application;
Fig. 6 is a schematic structural diagram of a synchronization control apparatus according to an embodiment of the present application;
Fig. 7 is a block diagram of a synchronization control apparatus according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In a scenario in which several sensors acquire data in the same target area, multidimensional fusion data of the target area can be obtained by fusing, in the same time dimension, the data acquired by the sensors. In order to fuse the data acquired by the sensors and obtain accurate fusion data, the data acquired by the sensors must correspond one-to-one in the time dimension; therefore each sensor must be synchronized by a synchronization control device so that the sensors acquire data synchronously.
In practical applications, in order to simplify the control flow, the sensors are often synchronized by a passive synchronization method: the synchronization control device receives a synchronization control signal sent by a primary sensor (hereinafter referred to as the first sensor), controls a secondary sensor (hereinafter referred to as the second sensor) to acquire data according to that signal, records the time information of the signal, also referred to as a timestamp, through an internal hardware interrupt, and fuses the data acquired by the sensors according to the timestamps, thereby achieving synchronous multi-sensor data acquisition.
In the embodiments of the present application, the first sensor is the sensor that can acquire accurate data under the current environmental conditions, so different first sensors need to be set in different application scenarios; for example, the camera is set as the first sensor when the illumination is strong, and the radar sensor is set as the first sensor when the illumination is weak.
Based on this application scenario, in order to obtain accurate fusion data, the embodiments of the present application synchronize the sensors so that the data they acquire correspond correctly in the time dimension; the environment is detected in real time and, under each environmental condition, the sensor with the higher accuracy is selected as the first sensor, so that the synchronization control mode switches automatically; the second sensors are controlled to acquire data synchronously according to the pulse signal with which the first sensor acquires data, and the data acquired by the more accurate first sensor is fused with the data acquired by the second sensors, so that the resulting fusion data is more accurate.
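As a rough illustration of this adaptive selection, the following C sketch (the sensor types, threshold value and function names are assumptions introduced for illustration, not part of this disclosure) shows how a controller might re-evaluate the first sensor from a real-time environment reading; the remaining sensors then act as second sensors driven by the first sensor's pulse signal.

```c
/* Illustrative sketch only: names and thresholds are assumed for clarity. */
#include <stdio.h>

typedef enum { SENSOR_IMAGE, SENSOR_RADAR, SENSOR_GRAVITY } sensor_id_t;

/* Pick the "first" (master) sensor from the current environment reading. */
static sensor_id_t select_first_sensor(double lux, double lux_threshold)
{
    /* Image sensors are assumed more accurate under strong illumination. */
    return (lux > lux_threshold) ? SENSOR_IMAGE : SENSOR_RADAR;
}

int main(void)
{
    const double lux_threshold = 200.0;              /* assumed value            */
    double samples[] = { 350.0, 320.0, 40.0, 35.0 }; /* simulated lux readings   */

    for (int i = 0; i < 4; i++) {
        sensor_id_t first = select_first_sensor(samples[i], lux_threshold);
        /* All other sensors become "second" sensors and are triggered by the
         * pulse signal preset in the first sensor (not modelled here). */
        printf("lux=%.1f -> first sensor: %s\n", samples[i],
               first == SENSOR_IMAGE ? "image" : "non-image (radar)");
    }
    return 0;
}
```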
Fig. 1 is a schematic diagram of a synchronous control system of a sensor according to an embodiment of the present disclosure.
As shown in fig. 1, the synchronous control system 001 of the sensor includes: a synchronization control device 01 and at least two sensors 02.
The synchronous control device 01 is connected to each sensor 02 and is used to control the at least two sensors to acquire data synchronously. The synchronous control device 01 may be a single chip, or may consist of several chips, such as a Field-Programmable Gate Array (FPGA), a Complex Programmable Logic Device (CPLD) and a single-chip microcomputer (e.g., an ARM (Advanced RISC Machines) single-chip microcomputer).
Illustratively, the sensor 02 may include one or more of: a radar sensor, such as a laser radar sensor or a millimeter-wave radar sensor; an image sensor, such as a digital camera, a digital video camera or a single-lens reflex camera; a thermal sensor, such as an infrared thermometer; and a gravity sensor.
The synchronization control device 01 is also configured to acquire environmental data, determine a first sensor among the at least two sensors 02 based on the environmental data, and use a sensor other than the first sensor among the at least two sensors 02 as a second sensor. Illustratively, the synchronous control device 01 controls the second sensor to perform data acquisition according to a pulse signal preset in the first sensor, so that the second sensor and the first sensor perform synchronous data acquisition.
Alternatively, there may be one or more first sensors; in general there is one first sensor, which can provide a more stable pulse signal. If there are at least two first sensors, it should be understood that they should be preset with the same or similar pulse signals, or be sensors of the same type, for example all image sensors.
Illustratively, as shown in Fig. 1, the synchronous control system 001 of the sensor further includes a data processing device 03. The data processing device 03 is connected to the at least two sensors 02 in a wired or wireless manner and is configured to receive the data sent by each sensor 02 and store it, or to fuse the data sent by the sensors 02 to obtain fusion data.
It should be understood by those skilled in the art that the synchronous control system of the sensors provided in the embodiments of the present application can be applied in any field in which several sensors acquire data of a target area. Taking the monitoring field as an example, the synchronous control system 001 of the sensor may specifically be a monitoring device, or a component of a monitoring device; the monitoring device acquires data of the target area through its deployed sensors and fuses the data acquired by the sensors to obtain fusion data.
For example, where the monitoring device is used to monitor a road surface, the deployed sensors may include a camera, a gravity sensor and a millimeter-wave radar sensor. During daytime monitoring, the camera serves as the first sensor: while the camera acquires image data, the gravity sensor is controlled to synchronously acquire the gravity data corresponding to each frame of image (the gravity data can be understood as the pressure borne by the road surface in the target area), and the millimeter-wave radar sensor is controlled to synchronously acquire the radar data corresponding to each frame. During night monitoring, the millimeter-wave radar sensor serves as the first sensor: while it acquires radar data, the camera is controlled to synchronously acquire the image data corresponding to each radar acquisition, and the gravity sensor is controlled to synchronously acquire the gravity data corresponding to each radar acquisition. Based on the correspondence among the radar data, the image data and the gravity data, the different data collected by the sensors are fused to obtain fused monitoring data.
Fig. 2 is a schematic diagram of another synchronous control system for sensors according to an embodiment of the present disclosure.
Referring to Fig. 2, in the embodiment of the present application, the synchronous control system 001 of the sensor further includes a clock control device 04 connected to the synchronization control device 01.
The clock control means 04 is used to send a time reference signal to the synchronization control means 01.
The clock control device 04 may also be disposed in the synchronization control device 01 as one of its components.
Optionally, in order to provide an accurate time reference signal to the synchronization control device 01, a high-precision temperature-compensated crystal oscillator (TCXO) is disposed in the clock control device 04.
For example, based on the time reference signal, the synchronization control device 01 can record a timestamp at each acquisition time at which the first sensor acquires data according to the pulse signal, and from the timestamps it can obtain the correspondence between the data acquired by the first sensor and the data acquired by the second sensor.
For example, when the synchronization control mode is the active control mode, the synchronization control device 01 may also generate a synchronization pulse signal according to the time reference signal and send it to each sensor, so that each sensor acquires data according to the synchronization pulse signal. The synchronization pulse signal may trigger each sensor's acquisition by level, rising edge, falling edge, etc.; the trigger mode is not limited in this solution.
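As a minimal sketch of the active control mode (the time-reference frequency, pulse rate and function names below are assumed for illustration), the controller can derive a periodic synchronization pulse from the time reference signal and distribute it to all sensors; how each sensor is triggered by the pulse (level or edge) is left open, as stated above.

```c
/* Illustrative sketch of the active control mode; timing values are assumed. */
#include <stdint.h>
#include <stdio.h>

#define REF_TICKS_PER_SECOND 1000000u                      /* assumed 1 MHz reference */
#define SYNC_PERIOD_TICKS    (REF_TICKS_PER_SECOND / 25u)  /* assumed 25 Hz pulse     */

static void send_sync_pulse_to_all_sensors(uint64_t tick)
{
    /* Placeholder: in hardware this would toggle the camera/radar sync lines. */
    printf("sync pulse at tick %llu\n", (unsigned long long)tick);
}

int main(void)
{
    uint64_t next_pulse = SYNC_PERIOD_TICKS;

    /* Simulate the time reference counter for a fraction of a second. */
    for (uint64_t tick = 0; tick < REF_TICKS_PER_SECOND / 5u; tick++) {
        if (tick == next_pulse) {
            send_sync_pulse_to_all_sensors(tick);
            next_pulse += SYNC_PERIOD_TICKS;
        }
    }
    return 0;
}
```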
For example, in order to facilitate the user to select the synchronization control mode and provide personalized settings for the user, the synchronization control device 01 may set the synchronization control mode to the active control mode according to a synchronization mode control instruction input by the user, or set a corresponding sensor as the first sensor according to the synchronization mode control instruction input by the user.
In a specific implementation, if the user manually selects a synchronization control mode on the host computer interface, the system preferentially switches to the mode selected by the user. If the mode selected by the user is the active control mode, the FPGA outputs a synchronization pulse signal to sensors such as the millimeter-wave radar and the camera so that they acquire data synchronously.
In the embodiment of the present application, as shown in Fig. 2, the synchronous control system 001 of the sensor further includes one or more environment-monitoring sensors connected to the synchronization control device, namely a light sensor 05 and/or a visibility sensor 06.
The light sensor 05 is used to acquire the illumination intensity of the target area, convert it into an electrical signal and output the signal to the synchronization control device 01; alternatively, the light sensor 05 judges whether the acquired illumination intensity meets a preset condition and outputs a high level or a low level to the synchronization control device 01 according to the result. For example, after acquiring the illumination intensity, the light sensor 05 determines whether it is greater than a preset threshold and whether it has remained above the threshold for longer than a preset time; if so, it sends a high level to the synchronization control device 01, otherwise it sends a low level, and the synchronization control device 01 determines the first sensor according to the received level value.
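The level output of the light sensor can be pictured as a small debounce state machine that only changes its output after the illumination has stayed on one side of the threshold for the preset time. The C fragment below is an assumed illustration; the threshold value, hold time and sample arrangement are not taken from this disclosure.

```c
/* Illustrative debounce of the light-sensor level output; values assumed. */
#include <stdbool.h>
#include <stdio.h>

#define LUX_THRESHOLD  200.0   /* assumed illumination threshold        */
#define HOLD_SAMPLES   60      /* assumed "first preset duration"       */

static bool output_level = false; /* false = low (weak light), true = high */
static int  stable_count = 0;

/* Feed one illumination sample; returns the (possibly updated) level. */
static bool light_sensor_update(double lux)
{
    bool above = (lux > LUX_THRESHOLD);

    if (above != output_level) {
        /* Condition differs from current output: count how long it persists. */
        if (++stable_count >= HOLD_SAMPLES) {
            output_level = above;  /* switch only after the hold time */
            stable_count = 0;
        }
    } else {
        stable_count = 0;          /* condition matches output: reset timer */
    }
    return output_level;
}

int main(void)
{
    /* Simulate a brief flash of light, a dark interval, then sustained strong light. */
    for (int i = 0; i < 200; i++) {
        double lux = (i < 5 || i > 80) ? 500.0 : 20.0;
        bool level = light_sensor_update(lux);
        if (i % 50 == 0)
            printf("sample %3d: lux=%.0f level=%s\n", i, lux, level ? "HIGH" : "LOW");
    }
    return 0;
}
```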
The following describes a method for controlling synchronization of sensors according to the present application with several embodiments.
Fig. 3 is a schematic flowchart of a synchronization control method for sensors according to an embodiment of the present disclosure.
In order to enable at least two sensors to acquire data synchronously, as shown in step S102 in Fig. 3, the second sensor is controlled to acquire data according to the pulse signal preset in the first sensor, so that the second sensor and the first sensor acquire data synchronously and the data acquired by the at least two sensors correspond correctly. It should be understood that the first sensor and the second sensor may be preset from the at least two sensors; the first sensor is the sensor matched to the current environmental conditions, that is, the one that can acquire accurate data under those conditions.
Further, so that the first sensor automatically matches the current environmental conditions, in step S101 shown in Fig. 3, before the sensors are synchronized, the synchronization control device dynamically determines the first sensor from the at least two sensors according to the environment data collected in real time and treats the remaining sensors as second sensors; the first sensor can thus be switched adaptively as the environment changes, making the finally obtained fusion data more accurate.
After the synchronization control device starts, it may collect environment data and determine the first sensor from the at least two sensors accordingly, or it may determine the first sensor according to a preset instruction; this is not limited in the present solution.
The embodiment of the application provides the following possible implementation ways for how to acquire the environment data and how to determine the first sensor according to the environment data:
the first method is as follows: and when the environment data is the illumination intensity, determining whether the illumination intensity is greater than an illumination intensity threshold value.
If the illumination intensity is greater than the illumination intensity threshold, determine whether it has stayed above the threshold for longer than a first preset duration. If so, the image data acquired by the image sensor is more accurate, so the first sensor is determined to be an image sensor. For example, if there are several image sensors, the first sensor may be all of them, one image sensor able to acquire more accurate image data may be chosen from them, or one or more image sensors may be selected according to a preset strategy. It should be understood that if the first sensor consists of at least two image sensors, they should acquire data according to the same or similar pulse signals. Further, the remaining sensors of the at least two sensors are treated as second sensors.
For example, if the illumination intensity is strong and remains so for the first preset duration (e.g., 1 minute), the light sensor outputs a high level; after recognizing the high level, the synchronization control device uses the image sensor as the first sensor.
If the illumination intensity is less than or equal to the illumination intensity threshold, determine whether it has stayed at or below the threshold for longer than the first preset duration. If so, data acquired by a non-image sensor, such as a radar sensor, a thermal sensor or a gravity sensor, is more accurate, so the first sensor is determined to be a non-image sensor. For example, if there are several non-image sensors, the first sensor may be all of them, one non-image sensor able to acquire more accurate data may be chosen from them, or one or more non-image sensors may be selected according to a preset strategy. It should be understood that if the first sensor consists of at least two non-image sensors, they should acquire data according to the same or similar pulse signals. Further, the remaining sensors of the at least two sensors are treated as second sensors.
For example, if the illumination intensity is weak and remains so for the first preset duration (e.g., 1 minute), the light sensor outputs a low level; after recognizing the low level, the synchronization control device uses the non-image sensor as the first sensor.
When the illumination intensity has been above the threshold for less than the first preset duration, or at or below the threshold for less than the first preset duration, the level output by the light sensor does not change and the existing synchronization control mode is maintained. The existing synchronization control mode is not limited in this solution: the synchronization control device may send a synchronization pulse signal to each sensor so that each sensor acquires data according to it; or it may control the other sensors to acquire data synchronously according to a preset pulse signal of the first sensor; or it may do so according to the pulse signal of the first sensor most recently determined from the environment data.
This embodiment detects not only the illumination intensity but also how long it persists, which avoids changing the first sensor frequently when the illumination is unstable.
Optionally, the light sensor may collect the illumination intensity and send it to the synchronization control device, which then determines the first sensor from the illumination intensity and the preset threshold; or the light sensor may collect the illumination intensity, determine whether it is greater than the illumination intensity threshold, and output different levels after the intensity has stayed above, or at or below, the threshold for the first preset duration, with the synchronization control device determining the first sensor according to the level output by the light sensor.
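On the controller side, the first way then reduces to reacting to the level reported by the light sensor. The following sketch (C, with hypothetical names) keeps the existing synchronization mode while the level is unchanged and switches the first sensor only on a level transition.

```c
/* Illustrative controller-side handling of the light-sensor level; names assumed. */
#include <stdbool.h>
#include <stdio.h>

typedef enum { FIRST_IMAGE, FIRST_NON_IMAGE } first_sensor_t;

static first_sensor_t current_first = FIRST_NON_IMAGE;
static bool last_level = false;          /* last level seen from the light sensor */

static void on_light_level(bool level)
{
    if (level == last_level)
        return;                          /* level unchanged: keep existing mode   */

    last_level    = level;
    current_first = level ? FIRST_IMAGE  /* high level -> image sensor leads      */
                          : FIRST_NON_IMAGE;
    printf("switching first sensor to %s\n",
           current_first == FIRST_IMAGE ? "image" : "non-image");
}

int main(void)
{
    bool samples[] = { false, false, true, true, true, false };
    for (int i = 0; i < 6; i++)
        on_light_level(samples[i]);
    return 0;
}
```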
The second way: when the environment data is visibility, determine whether the visibility is greater than a visibility threshold, similarly to the first way.
If the visibility is greater than the visibility threshold, determine whether it has stayed above the threshold for longer than a second preset duration. If so, the image data acquired by the image sensor is more accurate, so the first sensor is determined to be an image sensor.
For example, if the visibility is greater than the visibility threshold, for example greater than 10 m, and this lasts for the second preset duration (e.g., 1 minute), the visibility sensor outputs a high level; after recognizing the high level, the synchronization control device uses the image sensor as the first sensor.
If the visibility is less than or equal to the visibility threshold, determine whether it has stayed at or below the threshold for longer than the second preset duration. If so, the data acquired by the non-image sensor is more accurate, so the first sensor is determined to be a non-image sensor.
For example, if the visibility is less than or equal to the visibility threshold, for example 10 m or less, and this lasts for the second preset duration (e.g., 1 minute), the visibility sensor outputs a low level; after recognizing the low level, the synchronization control device uses the non-image sensor as the first sensor.
When the visibility has been above the threshold for less than the second preset duration, or at or below the threshold for less than the second preset duration, the level output by the visibility sensor does not change and the existing synchronization control mode is maintained.
This embodiment detects not only the visibility but also how long it persists, which avoids changing the first sensor frequently when the visibility is unstable.
Further, the other sensors of the at least two sensors except the first sensor are regarded as second sensors.
For example, there may be one or more first sensors; the process of selecting at least one sensor as the first sensor from several image sensors or several non-image sensors is similar to the first way and is not repeated here.
Optionally, the visibility sensor may collect the visibility and send it to the synchronization control device, which then determines the first sensor from the visibility and the preset threshold; or the visibility sensor may collect the visibility, determine whether it is greater than the visibility threshold, and output different levels after the visibility has stayed above, or at or below, the threshold for the second preset duration, with the synchronization control device determining the first sensor according to the level output by the visibility sensor.
The visibility sensor can measure atmospheric visibility (meteorological optical range) by measuring the concentration of light-scattering particles in the air (smoke, dust, haze, fog, rain and snow).
The third way: when the environment data includes both the illumination intensity and the visibility, determining the first sensor according to the illumination intensity, the visibility, the illumination intensity threshold and the visibility threshold includes:
determining respectively whether the illumination intensity is greater than the illumination intensity threshold and whether the visibility is greater than the visibility threshold.
If the illumination intensity is greater than the illumination intensity threshold and the visibility is greater than the visibility threshold, then when the illumination intensity has been above its threshold for longer than the first preset duration and the visibility has been above its threshold for longer than the second preset duration (illustratively, when the light sensor outputs a high level and the visibility sensor outputs a high level), the first sensor is determined to be an image sensor.
If the illumination intensity is less than or equal to the illumination intensity threshold and the visibility is less than or equal to the visibility threshold, then when the illumination intensity has been at or below its threshold for longer than the first preset duration and the visibility has been at or below its threshold for longer than the second preset duration (i.e., the light sensor outputs a low level and the visibility sensor outputs a low level), the first sensor is determined to be a non-image sensor.
If either condition is not met, the process of determining the first sensor is not executed; that is, when the light sensor and the visibility sensor output one high level and one low level, the existing synchronization control mode is maintained.
For example, there may be one or more first sensors; the process of selecting at least one sensor as the first sensor from several image sensors or several non-image sensors is similar to the first way and is not repeated here.
Optionally, the first preset duration and the second preset duration may be the same or different, and this is not limited in this embodiment.
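The third way can be summarised as a joint decision over the two debounced levels: the first sensor changes only when both conditions agree, otherwise the existing mode is kept. The C sketch below is illustrative; the enum and function names are assumptions.

```c
/* Illustrative combined (illumination + visibility) decision; names assumed. */
#include <stdbool.h>
#include <stdio.h>

typedef enum { FIRST_UNCHANGED, FIRST_IMAGE, FIRST_NON_IMAGE } decision_t;

/* light_high / vis_high are the debounced level outputs of the two sensors. */
static decision_t decide_first_sensor(bool light_high, bool vis_high)
{
    if (light_high && vis_high)
        return FIRST_IMAGE;       /* strong light and good visibility       */
    if (!light_high && !vis_high)
        return FIRST_NON_IMAGE;   /* weak light and poor visibility         */
    return FIRST_UNCHANGED;       /* mixed levels: keep the existing mode   */
}

int main(void)
{
    const char *names[] = { "keep existing mode", "image sensor", "non-image sensor" };
    bool light[] = { true,  true,  false, false };
    bool vis[]   = { true,  false, true,  false };

    for (int i = 0; i < 4; i++)
        printf("light=%d visibility=%d -> %s\n",
               light[i], vis[i], names[decide_first_sensor(light[i], vis[i])]);
    return 0;
}
```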
Through the three possible implementations above, the synchronization control device determines the first sensor from the at least two sensors according to the illumination intensity and/or the visibility and a preset threshold (including at least one of the illumination intensity threshold and the visibility threshold). The sensor with the highest accuracy under the current environmental conditions is thus matched automatically as the first sensor, and the second sensors are then controlled to acquire data synchronously according to the pulse signal of the first sensor, which improves the accuracy of the data acquired from the target area.
Fig. 4 is a schematic flowchart of another synchronization control method for a sensor according to an embodiment of the present disclosure. As shown in fig. 4, the synchronous control method of the sensor is applied to a synchronous control system of the sensor, which exemplarily includes a synchronous control device, a first sensor, a second sensor and a data processing device. After determining the first sensor and the second sensor based on any of the above embodiments, the method for synchronously controlling the sensors further includes the following steps:
s1: the first sensor sends a synchronization control signal to the synchronization control device.
S2: the first sensor acquires data according to a preset pulse signal.
The execution order of steps S1 and S2 is not limited in this embodiment; they may also be executed simultaneously.
After the synchronization control device has determined each sensor connected to it to be a first sensor or a second sensor, the first sensor acquires data according to the pulse signal preset in it and sends a synchronization control signal to the synchronization control device before or after each acquisition, or at each acquisition time; in other words, the synchronization control device receives the synchronization control signal sent by the first sensor. The synchronization control signal is used to trigger the second sensor to acquire data.
S3: the first sensor sends data to the data processing device.
The first sensor sends the acquired data to the data processing device in real time.
S4: the synchronization control device sends a synchronization control signal to the second sensor.
For example, step S4 and step S3 may be performed simultaneously. The synchronous control device sends the synchronous control signal sent by the first sensor to the second sensor, so that the second sensor can acquire data according to the synchronous control signal.
S5: the synchronization control means records a time stamp based on the time reference signal.
The execution order of steps S5 and S4 is not limited in this embodiment; for example, steps S5 and S4 may be executed simultaneously.
The synchronization control device receives in real time the time reference signal sent by the clock control device. Each time it receives the synchronization control signal, i.e., at each acquisition time at which the first sensor acquires data according to the preset pulse signal, it records a timestamp; the timestamp indicates the correspondence between the data acquired by the second sensor and the data acquired by the first sensor.
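A minimal sketch of this timestamping step, assuming a simple tick counter driven by the time reference signal and hypothetical function names, is given below: the counter value is latched each time a synchronization control signal arrives.

```c
/* Illustrative timestamp latching on each synchronization control signal. */
#include <stdint.h>
#include <stdio.h>

#define MAX_STAMPS 256

static uint64_t time_reference_ticks = 0;   /* driven by the clock control device */
static uint64_t stamps[MAX_STAMPS];
static int      stamp_count = 0;

/* Called once per acquisition of the first sensor (i.e. per sync signal). */
static void on_sync_control_signal(void)
{
    if (stamp_count < MAX_STAMPS)
        stamps[stamp_count++] = time_reference_ticks;  /* record the timestamp */
    /* The timestamp is later sent to the second sensor so that its data can be
     * matched against the data collected by the first sensor. */
}

int main(void)
{
    for (int i = 0; i < 3; i++) {
        time_reference_ticks += 40000;   /* simulated 40 ms between acquisitions */
        on_sync_control_signal();
    }
    for (int i = 0; i < stamp_count; i++)
        printf("acquisition %d at tick %llu\n", i, (unsigned long long)stamps[i]);
    return 0;
}
```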
S6: and the second sensor acquires data according to the synchronous control signal.
The second sensor performs data acquisition in response to the received synchronization control signal.
S7: the synchronization control means transmits the time stamp to the second sensor.
S8: the second sensor transmits data containing the time stamp to the data processing apparatus.
The execution order of steps S7 and S6 is not limited in this solution. The synchronization control device sends the recorded timestamp to the second sensor, so that the second sensor sends the timestamp together with the acquired data to the data processing device.
S9: and the data processing device performs data fusion according to the data sent by the first sensor and the second sensor to obtain fused data.
The data processing device may simply store the data sent by the first sensor and the second sensor, or it may fuse them; the fusion process uses the timestamps to determine the correspondence between the data collected by the first sensor and the data collected by the second sensor, obtains the fused data, and stores it or sends it to other equipment for use.
It should be understood that in a scenario where several sensors collect data from the same target area, for example when a detection system uses millimeter-wave radar, laser radar, cameras and other sensors for event detection, registering the data collected by each sensor in the same time dimension yields multidimensional fusion data of the target area, with the data from the sensors corresponding correctly in the time dimension.
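As an illustration of the fusion step, the C sketch below (record layout, sensor names and values are assumed) pairs each second-sensor record with the first-sensor record carrying the same timestamp recorded by the synchronization control device.

```c
/* Illustrative timestamp-based pairing of first- and second-sensor records. */
#include <stdint.h>
#include <stdio.h>

typedef struct { uint64_t ts; double value; } record_t;

int main(void)
{
    /* Simulated data: radar (first sensor) and gravity (second sensor) records
     * that share timestamps recorded by the synchronization control device.   */
    record_t radar[]   = { {1000, 12.5},  {1040, 13.1},  {1080, 12.9}  };
    record_t gravity[] = { {1000, 980.2}, {1040, 981.0}, {1080, 979.8} };
    int n = 3;

    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            if (radar[i].ts == gravity[j].ts) {   /* match on the shared timestamp */
                printf("ts=%llu radar=%.1f gravity=%.1f -> fused record\n",
                       (unsigned long long)radar[i].ts,
                       radar[i].value, gravity[j].value);
                break;
            }
        }
    }
    return 0;
}
```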
Fig. 5 is a schematic hardware structure diagram of a synchronous control system of a sensor according to an embodiment of the present disclosure.
As shown in Fig. 5, in this embodiment the synchronization control device consists of an FPGA and an ARM single-chip microcomputer; the clock control device, such as a TCXO, may also be regarded as part of the synchronization control device.
The FPGA provides synchronization interfaces for several sensors and can be connected to several different sensors, for example through a camera synchronization interface, a radar synchronization interface and an infrared thermal sensor interface.
The FPGA receives the time reference signal sent by the TCXO, manages and distributes the time reference signal, and sends the distributed time reference signal to the ARM single chip microcomputer.
The ARM single-chip microcomputer uses the distributed time reference signal it receives as its clock; it receives a synchronization mode control instruction sent by the host computer, or receives an indication level sent by the light sensor and/or the visibility sensor and generates a synchronization mode control instruction from that level, and sends the synchronization mode control instruction to the FPGA.
The FPGA determines the synchronization control mode from the received synchronization mode control instruction. The synchronization control modes include an active control mode and a passive control mode; in this embodiment the sensors connected to the synchronization control device are a camera and a millimeter-wave radar sensor, so the passive control mode specifically includes a video trigger control mode and a radar trigger control mode. Illustratively, in the active control mode the FPGA generates a synchronization pulse signal from the received time reference signal and sends it to the camera through the camera synchronization interface and to the millimeter-wave radar sensor through the radar synchronization interface.
Illustratively, if the synchronization control mode is the video trigger control mode, the synchronization control signal sent by the camera at each image-frame acquisition time is received through the camera synchronization interface and forwarded to the millimeter-wave radar sensor through the radar synchronization interface, controlling the radar sensor to acquire data synchronously.
Illustratively, if the synchronization control mode is the radar trigger control mode, the synchronization control signal sent by the millimeter-wave radar sensor at each radar data acquisition time is received through the radar synchronization interface and forwarded to the camera through the camera synchronization interface, controlling the camera to acquire data synchronously.
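The three control modes amount to a dispatch on the selected mode. The following C sketch (mode names and routing functions are assumptions) illustrates how, on each timing event, the controller either drives both sensors from its own pulse or forwards the pulse of the selected first sensor to the other sensor.

```c
/* Illustrative dispatch over the three synchronization control modes; names assumed. */
#include <stdio.h>

typedef enum { MODE_ACTIVE, MODE_VIDEO_TRIGGER, MODE_RADAR_TRIGGER } sync_mode_t;

static void pulse_camera(void) { printf("-> camera sync line\n"); }
static void pulse_radar(void)  { printf("-> radar sync line\n"); }

/* Called on every timing event (internal pulse or pulse from the first sensor). */
static void dispatch(sync_mode_t mode)
{
    switch (mode) {
    case MODE_ACTIVE:        /* controller-generated pulse goes to all sensors */
        pulse_camera();
        pulse_radar();
        break;
    case MODE_VIDEO_TRIGGER: /* camera is the first sensor: forward its pulse  */
        pulse_radar();
        break;
    case MODE_RADAR_TRIGGER: /* radar is the first sensor: forward its pulse   */
        pulse_camera();
        break;
    }
}

int main(void)
{
    printf("active mode:\n");        dispatch(MODE_ACTIVE);
    printf("video trigger mode:\n"); dispatch(MODE_VIDEO_TRIGGER);
    printf("radar trigger mode:\n"); dispatch(MODE_RADAR_TRIGGER);
    return 0;
}
```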
In the present application, a human-computer interaction interface may be deployed on the host computer to provide the user with personalized settings: the user can input a synchronization mode control instruction through the interface and thereby directly instruct the synchronization control device to set the synchronization control mode to the active control mode, the video trigger control mode or the radar trigger control mode.
The millimeter wave radar sensor and the camera output the acquired data to the flash memory for storage in the form of data packets.
Fig. 6 is a schematic structural diagram of a synchronization control device according to an embodiment of the present application. As shown in fig. 6, the synchronization control apparatus 10 includes:
the synchronous mode setting unit 11 is configured to dynamically determine a first sensor from at least two sensors according to environment data acquired in real time, and use a sensor other than the first sensor as a second sensor;
and the synchronous control unit 12 is configured to control the second sensor to perform data acquisition according to a preset pulse signal in the first sensor, so that the second sensor and the first sensor perform synchronous data acquisition.
The embodiment of the present application provides a synchronization control apparatus 10 including a synchronization mode setting unit 11 and a synchronization control unit 12. The apparatus dynamically determines a first sensor from at least two sensors according to environment data collected in real time and treats the remaining sensors as second sensors; it then controls the second sensors to acquire data according to the pulse signal preset in the first sensor, so that the second sensors and the first sensor acquire data synchronously and the data acquired by the at least two sensors correspond correctly. Since the first sensor can be switched adaptively as the environment changes, the accuracy of the finally obtained fusion data is higher.
In a possible design, the synchronization mode setting unit 11 is specifically configured to determine the first sensor from at least two sensors according to the illumination intensity and/or the visibility and a preset threshold.
In one possible design, the synchronization pattern setting unit 11 is specifically configured to:
determining whether the illumination intensity is greater than an illumination intensity threshold, wherein the preset threshold comprises the illumination intensity threshold;
if the illumination intensity is greater than the illumination intensity threshold, determining that the first sensor is an image sensor when the time length that the illumination intensity is greater than the illumination intensity threshold is greater than a first preset time length;
if the illumination intensity is smaller than or equal to the illumination intensity threshold, determining that the first sensor is a non-image sensor when the time length that the illumination intensity is smaller than or equal to the illumination intensity threshold is longer than a first preset time length.
In one possible design, the synchronization pattern setting unit 11 is specifically configured to:
determining whether the visibility is greater than a visibility threshold, wherein the preset threshold comprises the visibility threshold;
if the visibility is greater than the visibility threshold, determining that the first sensor is an image sensor when the time length of the visibility greater than the visibility threshold is greater than a second preset time length;
if the visibility is less than or equal to the visibility threshold, determining that the first sensor is a non-image sensor when the time length of the visibility less than or equal to the visibility threshold is longer than a second preset time length.
In one possible design, the synchronization pattern setting unit 11 is specifically configured to:
determining whether the illumination intensity is greater than an illumination intensity threshold and the visibility is greater than a visibility threshold;
if the illumination intensity is greater than an illumination intensity threshold value and the visibility is greater than a visibility threshold value, determining that the first sensor is an image sensor when the duration that the illumination intensity is greater than the illumination intensity threshold value is greater than a first preset duration and the duration that the visibility is greater than the visibility threshold value is greater than a second preset duration;
if the illumination intensity is less than or equal to an illumination intensity threshold value, and the visibility is less than or equal to a visibility threshold value, determining that the first sensor is a non-image sensor when the duration of the illumination intensity which is less than or equal to the illumination intensity threshold value is longer than a first preset duration, and the duration of the visibility which is less than or equal to the visibility threshold value is longer than a second preset duration.
In one possible design, the synchronization pattern setting unit 11 is further configured to:
acquiring the illumination intensity of a target area through a light sensor;
and/or, collecting the visibility of the target area through a visibility sensor.
In one possible design, the non-image sensor includes: at least one of a radar sensor, a thermal sensor, and a gravity sensor.
In one possible design, the synchronization control unit 12 is specifically configured to:
receiving a synchronous control signal sent by the first sensor at the acquisition time when the first sensor acquires data once according to the pulse signal;
and sending the synchronous control signal to the second sensor, so that the second sensor performs data acquisition according to the synchronous control signal.
In one possible design, the synchronization control unit 12 is further configured to:
recording, based on the time reference signal output by the clock control device, a timestamp at each acquisition time at which the first sensor acquires data according to the pulse signal; the timestamp indicates the correspondence between the data collected by the second sensor and the data collected by the first sensor.
The synchronization control apparatus provided in the foregoing embodiment may implement the technical solutions of the foregoing method embodiments, and the implementation principles and technical effects are similar, which are not described herein again.
Fig. 7 is a block diagram of a synchronization control apparatus according to an embodiment of the present application. As shown in fig. 7, the synchronization control device 600 generally includes: a processor 601 and a memory 602. Optionally, a bus 603 may also be included. The bus 603 is used to realize the connection between the elements.
The processor 601 may include one or more processing cores, for example a 4-core or 8-core processor. The processor 601 may be implemented in at least one of the following hardware forms: DSP (Digital Signal Processor), FPGA (Field-Programmable Gate Array) or PLA (Programmable Logic Array). The processor 601 may also include a main processor and a coprocessor; the main processor, also called a CPU (Central Processing Unit), processes data in the awake state, while the coprocessor is a low-power processor that processes data in the standby state. In some embodiments, the processor 601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 601 may also include an AI (Artificial Intelligence) processor for handling computing operations related to machine learning.
Memory 602 may include one or more computer-readable storage media, which may be non-transitory. The memory 602 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 602 is used to store at least one instruction for execution by the processor 601 to implement the synchronization control method of the sensor provided by the method embodiments herein.
Those skilled in the art will appreciate that the configuration shown in fig. 7 is not intended to be limiting of the synchronization control apparatus 600, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
Embodiments of the present application further provide a non-transitory computer-readable storage medium, where when instructions in the storage medium are executed by a processor of an electronic device, the electronic device is enabled to execute the synchronization control method for a sensor provided in the foregoing embodiments.
The computer-readable storage medium in this embodiment may be any available medium that can be accessed by a computer or a data storage device such as a server, a data center, etc. that is integrated with one or more available media, and the available media may be magnetic media (e.g., floppy disks, hard disks, magnetic tapes), optical media (e.g., DVDs), or semiconductor media (e.g., SSDs), etc.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Embodiments of the present application further provide a computer program product containing instructions, which when run on a computer, cause the computer to execute the synchronization control method for a sensor provided by the above embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. A method for synchronous control of a sensor, comprising:
the synchronous control device dynamically determines a first sensor from at least two sensors according to environment data acquired in real time, and takes the sensors except the first sensor as second sensors;
when the first sensor performs data acquisition according to a preset pulse signal, the synchronous control device controls the second sensor to perform data acquisition according to the preset pulse signal in the first sensor, so that the second sensor and the first sensor perform synchronous data acquisition;
the environment data includes illumination intensity and/or visibility, and the synchronous control device dynamically determines a first sensor from at least two sensors according to the environment data collected in real time, including:
and the synchronous control device determines the first sensor from at least two sensors according to the illumination intensity and/or the visibility and a preset threshold value.
2. The method of claim 1, wherein if the environment data includes illumination intensity, the synchronous control device determining the first sensor from at least two sensors according to the collected environment data and a preset threshold value comprises:
the synchronous control device determines whether the illumination intensity is greater than an illumination intensity threshold, wherein the preset threshold value includes the illumination intensity threshold;
if the illumination intensity is greater than the illumination intensity threshold, the synchronous control device determines that the first sensor is an image sensor when the duration for which the illumination intensity is greater than the illumination intensity threshold exceeds a first preset duration;
if the illumination intensity is less than or equal to the illumination intensity threshold, the synchronous control device determines that the first sensor is a non-image sensor when the duration for which the illumination intensity is less than or equal to the illumination intensity threshold exceeds the first preset duration.
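
A minimal, hypothetical sketch of the selection logic in claim 2: the primary sensor only switches after the illumination condition has persisted for the first preset duration. The threshold and duration values are illustrative placeholders, not values from the patent.

import time

IMAGE_SENSOR = "image_sensor"
NON_IMAGE_SENSOR = "non_image_sensor"

class IlluminationSelector:
    """Chooses the first (primary) sensor from illumination intensity with a
    persistence requirement, so brief fluctuations do not cause switching."""

    def __init__(self, lux_threshold=50.0, first_preset_duration_s=5.0):
        self.lux_threshold = lux_threshold
        self.first_preset_duration_s = first_preset_duration_s
        self._above = None          # current above/below state of the threshold
        self._since = None          # when that state began
        self.first_sensor = IMAGE_SENSOR

    def update(self, illumination_lux, now=None):
        now = time.monotonic() if now is None else now
        above = illumination_lux > self.lux_threshold
        if above != self._above:    # condition flipped: restart the persistence timer
            self._above, self._since = above, now
        elif now - self._since > self.first_preset_duration_s:
            self.first_sensor = IMAGE_SENSOR if above else NON_IMAGE_SENSOR
        return self.first_sensor

Claim 3 follows the same pattern, with visibility and the second preset duration in place of illumination intensity and the first.
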
3. The method of claim 1, wherein if the environment data includes visibility, the synchronous control device determining the first sensor from at least two sensors according to the collected environment data and a preset threshold value comprises:
the synchronous control device determines whether the visibility is greater than a visibility threshold, wherein the preset threshold value includes the visibility threshold;
if the visibility is greater than the visibility threshold, the synchronous control device determines that the first sensor is an image sensor when the duration for which the visibility is greater than the visibility threshold exceeds a second preset duration;
if the visibility is less than or equal to the visibility threshold, the synchronous control device determines that the first sensor is a non-image sensor when the duration for which the visibility is less than or equal to the visibility threshold exceeds the second preset duration.
4. The method of claim 1, wherein if the environment data includes illumination intensity and visibility, the synchronous control device determining the first sensor from at least two sensors according to the collected environment data and a preset threshold value comprises:
the synchronous control device determines whether the illumination intensity is greater than an illumination intensity threshold and whether the visibility is greater than a visibility threshold;
if the illumination intensity is greater than the illumination intensity threshold and the visibility is greater than the visibility threshold, the synchronous control device determines that the first sensor is an image sensor when the duration for which the illumination intensity is greater than the illumination intensity threshold exceeds a first preset duration and the duration for which the visibility is greater than the visibility threshold exceeds a second preset duration;
if the illumination intensity is less than or equal to the illumination intensity threshold and the visibility is less than or equal to the visibility threshold, the synchronous control device determines that the first sensor is a non-image sensor when the duration for which the illumination intensity is less than or equal to the illumination intensity threshold exceeds the first preset duration and the duration for which the visibility is less than or equal to the visibility threshold exceeds the second preset duration.
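
Claims 2 to 4 can be read together as one selection rule; the hypothetical helper below assumes the caller tracks, for each quantity, whether it is above its threshold and how long that state has persisted (all names and default values are illustrative).

IMAGE_SENSOR = "image_sensor"
NON_IMAGE_SENSOR = "non_image_sensor"

def select_first_sensor(lux_above, lux_held_s, vis_above, vis_held_s,
                        first_preset_s=5.0, second_preset_s=5.0,
                        current=IMAGE_SENSOR):
    """Return the primary sensor given how long the illumination and visibility
    conditions have held; keep the current choice when neither rule fires."""
    persistent = lux_held_s > first_preset_s and vis_held_s > second_preset_s
    if lux_above and vis_above and persistent:
        return IMAGE_SENSOR          # bright and clear: the camera leads
    if not lux_above and not vis_above and persistent:
        return NON_IMAGE_SENSOR      # dark or low visibility: e.g. radar leads
    return current                   # mixed or not yet persistent: no switch
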
5. The method of any one of claims 1 to 4, wherein before the synchronous control device dynamically determines the first sensor from the at least two sensors according to the environment data acquired in real time, the method further comprises:
the synchronous control device acquires the illumination intensity of a target area through a light sensor;
and/or the synchronous control device acquires the visibility of the target area through a visibility sensor.
6. The method of any of claims 3 to 4, wherein the non-image sensor comprises: at least one of a radar sensor, a thermal sensor, and a gravity sensor.
7. The method of any one of claims 1 to 4, wherein the synchronous control device controlling the second sensor to perform data acquisition according to the pulse signal preset in the first sensor comprises:
at the acquisition moment of each data acquisition performed by the first sensor according to the pulse signal, the synchronous control device receives a synchronization control signal sent by the first sensor;
the synchronous control device sends the synchronization control signal to the second sensor, so that the second sensor performs data acquisition according to the synchronization control signal.
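
One plausible shape for the forwarding step of claim 7, assuming a callback-style sensor interface (on_acquisition and trigger_acquisition are hypothetical names, not defined by the patent):

class SyncForwarder:
    """Receives the first sensor's synchronization control signal at each
    acquisition moment and fans it out to every second sensor."""

    def __init__(self, first_sensor, second_sensors):
        self.second_sensors = list(second_sensors)
        # The first sensor emits a signal at each acquisition moment of its pulse.
        first_sensor.on_acquisition(self._forward)

    def _forward(self, sync_signal):
        for sensor in self.second_sensors:
            sensor.trigger_acquisition(sync_signal)   # acquire at the same moment
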
8. The method of claim 7, wherein before the synchronous control device sends the synchronization control signal to the second sensor so that the second sensor performs data acquisition according to the synchronization control signal, the method further comprises:
the synchronous control device records, according to a time reference signal output by a clock control device, a timestamp at the acquisition time of each data acquisition performed by the first sensor according to the pulse signal; the timestamp is used for indicating the correspondence between the data collected by the second sensor and the data collected by the first sensor.
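
The timestamp bookkeeping of claim 8 might look like the following, where clock.reference_time() stands in for reading the clock control device's time reference signal; the record layout is an assumption made for illustration.

from collections import OrderedDict

class TimestampRecorder:
    """Stamps each acquisition of the first sensor and keys all sensors' data
    on that timestamp, so second-sensor samples can be matched to first-sensor ones."""

    def __init__(self, clock):
        self.clock = clock              # source of the time reference signal
        self.records = OrderedDict()    # timestamp -> {sensor name: data}

    def on_first_sensor_acquisition(self, data):
        ts = self.clock.reference_time()
        self.records[ts] = {"first_sensor": data}
        return ts                       # shared with the second sensors' samples

    def on_second_sensor_data(self, sensor_name, data, ts):
        # The timestamp indicates which first-sensor acquisition this data belongs to.
        self.records.setdefault(ts, {})[sensor_name] = data
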
9. A synchronous control device, comprising:
a synchronization mode setting unit, configured to dynamically determine a first sensor from at least two sensors according to environment data acquired in real time, and to take the sensors other than the first sensor as second sensors;
a synchronization control unit, configured to control the second sensor to perform data acquisition according to a pulse signal preset in the first sensor, so that the second sensor and the first sensor perform synchronous data acquisition;
wherein the environment data includes illumination intensity and/or visibility, and the synchronization mode setting unit is specifically configured to determine the first sensor from the at least two sensors according to the illumination intensity and/or the visibility and a preset threshold value.
10. A synchronous control device, comprising: a memory and a processor;
the memory stores computer-executable instructions;
the processor executes the computer-executable instructions stored in the memory, causing the processor to perform the synchronous control method of a sensor according to any one of claims 1 to 8.
11. A synchronous control system for a sensor, comprising: the synchronous control device according to claim 10 and at least two sensors;
wherein the synchronous control device is connected to each of the sensors and is configured to control the at least two sensors to perform synchronous data acquisition.
12. The system of claim 11, further comprising: a clock control device connected to the synchronous control device;
the clock control device is configured to send a time reference signal to the synchronous control device.
13. The system of claim 11, further comprising: a light sensor and/or a visibility sensor connected to the synchronous control device;
the light sensor is configured to acquire the illumination intensity of a target area;
the visibility sensor is configured to acquire the visibility of the target area.
14. The system of claim 11, further comprising:
a data processing device configured to perform data fusion on the data acquired by the at least two sensors.
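
As a toy illustration of the data fusion in claim 14, records keyed by the shared timestamps (as in the sketch under claim 8) can simply be merged per timestamp; real fusion would of course be sensor-specific, and this helper is only an assumption for illustration.

def fuse_by_timestamp(records):
    """Merge per-timestamp readings into fused frames.
    `records` maps timestamp -> {sensor name: data}."""
    fused = []
    for ts in sorted(records):
        frame = {"timestamp": ts}
        frame.update(records[ts])   # one entry per sensor that reported at this pulse
        fused.append(frame)
    return fused
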
15. A storage medium, comprising: a readable storage medium and a computer program, wherein the computer program is used to implement the synchronous control method of a sensor according to any one of claims 1 to 8.
CN202010521864.3A 2020-06-10 2020-06-10 Synchronous control method, device and system of sensor and storage medium Active CN111682918B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010521864.3A CN111682918B (en) 2020-06-10 2020-06-10 Synchronous control method, device and system of sensor and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010521864.3A CN111682918B (en) 2020-06-10 2020-06-10 Synchronous control method, device and system of sensor and storage medium

Publications (2)

Publication Number Publication Date
CN111682918A CN111682918A (en) 2020-09-18
CN111682918B (en) 2022-06-10

Family

ID=72435198

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010521864.3A Active CN111682918B (en) 2020-06-10 2020-06-10 Synchronous control method, device and system of sensor and storage medium

Country Status (1)

Country Link
CN (1) CN111682918B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113259040B (en) * 2021-05-06 2022-03-11 北京觉非科技有限公司 Time synchronization method, apparatus, and computer-readable storage medium
CN114338951A (en) * 2021-12-30 2022-04-12 智道网联科技(北京)有限公司 Sensor synchronization method, device and system and vehicle
CN115508040B (en) * 2022-11-17 2023-03-10 中国空气动力研究与发展中心高速空气动力研究所 Synchronous parallel acquisition system for data of speed field and temperature field and application method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105680975A (en) * 2016-03-07 2016-06-15 浙江大学 Time synchronization method of master-slave structure multi-node network
CN107743054A (en) * 2017-08-25 2018-02-27 杭州德泽机器人科技有限公司 System during a kind of synchronous pair of multisensor
CN108957478A (en) * 2018-07-23 2018-12-07 上海禾赛光电科技有限公司 Multisensor synchronous sampling system and its control method, vehicle
CN109905194A (en) * 2019-02-21 2019-06-18 初速度(苏州)科技有限公司 A kind of vehicle-mounted terminal system and synchronization data obtaining method, device
CN110493744A (en) * 2019-08-20 2019-11-22 郑州大学 A kind of synchronous data sampling method and system of master-slave radio sensor
US10582137B1 (en) * 2018-09-26 2020-03-03 Zoox, Inc. Multi-sensor data capture synchronizaiton

Also Published As

Publication number Publication date
CN111682918A (en) 2020-09-18

Similar Documents

Publication Publication Date Title
CN111682918B (en) Synchronous control method, device and system of sensor and storage medium
CN105430251A (en) Information processing device and image input device
TWI541767B (en) Method for controlling a surveillance system with aid of automatically generated patrol routes, and associated apparatus
JP7305768B2 (en) VEHICLE CONTROL METHOD, RELATED DEVICE, AND COMPUTER STORAGE MEDIA
CN106327461B (en) A kind of image processing method and device for monitoring
CN103945109A (en) Image pickup apparatus, remote control apparatus, and methods of controlling image pickup apparatus and remote control apparatus
CN107948515A (en) A kind of camera synchronous method and device, binocular camera
CN109089087A (en) The audio-visual linkage of multichannel
US20180084397A1 (en) Control apparatus, control system, and method for controlling control apparatus
KR20210034520A (en) Image capturing apparatus, control method, and computer-readable storage medium
CN109753158B (en) VR device delay determination method and control terminal
CN110690514A (en) Battery self-discharge period adjusting method and unmanned aerial vehicle
CN108965525B (en) Detection method and device, terminal, computer equipment and readable storage medium
US10529135B2 (en) Low-power mode feature identification at a head mounted display
CN105163055A (en) Dynamic recording device, playback device, dynamic recording method and playback method
CN115690224A (en) External parameter calibration method for radar and camera, electronic device and storage medium
CN104796585A (en) Image dynamic recording device, playback device, image dynamic recording method and playback method
CN104219425A (en) Thermal-image dynamic recording device, thermal-image dynamic playback device, thermal-image dynamic recording method and thermal-image dynamic playback method
EP3879260A1 (en) Abnormal part display apparatus, abnormal part display system, abnormal part display method, and abnormal part display program
US11477359B2 (en) Image capturing apparatus, control method, and computer-readable storage medium
CN208171448U (en) Power control cabinet infrared imaging inspection apparatus and system
CN113473118A (en) Data timestamp alignment method, device, equipment and storage medium
CN207854020U (en) Thermal imaging system and bucket tooth monitoring system based on thermal imaging
EP4068238A1 (en) Image capturing method and apparatus, electronic photography device, and computer-readable storage medium
US20220140963A1 (en) Communication method and apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant