CN110839131A - Synchronization control method, synchronization control device, electronic equipment and computer readable medium - Google Patents
- Publication number
- CN110839131A (Application CN201911162227.5A)
- Authority
- CN
- China
- Prior art keywords
- camera
- laser beam
- azimuth angle
- laser radar
- laser
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 title claims abstract description 34
- 230000001360 synchronised effect Effects 0.000 claims abstract description 16
- 238000004590 computer program Methods 0.000 claims description 6
- 238000010586 diagram Methods 0.000 description 10
- 230000006870 function Effects 0.000 description 5
- 238000003384 imaging method Methods 0.000 description 4
- 230000008447 perception Effects 0.000 description 4
- 238000012545 processing Methods 0.000 description 4
- 238000004891 communication Methods 0.000 description 3
- 230000008878 coupling Effects 0.000 description 3
- 238000010168 coupling process Methods 0.000 description 3
- 238000005859 coupling reaction Methods 0.000 description 3
- 230000000694 effects Effects 0.000 description 3
- 238000009434 installation Methods 0.000 description 3
- 230000005540 biological transmission Effects 0.000 description 2
- 238000001514 detection method Methods 0.000 description 2
- 230000007613 environmental effect Effects 0.000 description 2
- 230000009471 action Effects 0.000 description 1
- 230000006399 behavior Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 230000004927 fusion Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000012544 monitoring process Methods 0.000 description 1
- 230000008569 process Effects 0.000 description 1
- 230000000630 rising effect Effects 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 238000010408 sweeping Methods 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
The invention provides a synchronization control method, a synchronization control apparatus, an electronic device and a computer-readable medium, relating to the technical field of device control. The method includes the following steps: acquiring a laser beam emission azimuth angle of a laser radar and a field angle of each camera in a camera group; determining a target camera based on the laser beam emission azimuth angle and the field angle of each camera in the camera group, wherein the target camera is the camera whose field angle coincides with the laser beam emission azimuth angle; and sending a trigger instruction to the target camera so that the target camera acquires image information based on the trigger instruction, thereby alleviating the technical problem of low control accuracy in existing synchronization control methods for a laser radar and a camera.
Description
Technical Field
The present invention relates to the field of device control technologies, and in particular, to a synchronization control method and apparatus, an electronic device, and a computer-readable medium.
Background
With the increasing level of intelligence, various intelligent driving vehicles and intelligent robots (hereinafter collectively referred to as "intelligent devices") are increasingly used in people's daily lives, such as intelligent sweeping robots, intelligent express delivery vehicles and intelligent toys. What all intelligent devices have in common is the ability to perceive the external environment: an intelligent device learns whether obstacles exist in its surroundings through its perception system, so as to determine its travel route and next action.
At present, synchronization control of the sensors in an intelligent device is typically handled as follows: the computing platform stamps a timestamp on the data when it receives the sensor data, and the back-end perception algorithm then selects data with close timestamps for perception fusion. This approach suffers from low accuracy, mainly because the timestamp assigned by the computing platform only represents the time at which the data was received by the platform, not the time at which the sensor actually sampled the surrounding environment, so it cannot be guaranteed that the sensors acquired the environmental data at the same time. Synchronization control between the laser radar and the camera suffers from the same drawback.
No effective solution has been proposed for the above problems.
Disclosure of Invention
In view of the above, the present invention provides a synchronization control method, a synchronization control apparatus, an electronic device, and a computer readable medium, so as to alleviate the technical problem of low control accuracy of the existing synchronization control method for laser radar and camera.
In a first aspect, an embodiment of the present invention provides a synchronization control method, including: acquiring a laser beam emission azimuth angle of a laser radar and a field angle of each camera in a camera group; determining a target camera based on the laser beam emission azimuth angle and the field angle of each camera in the camera group, wherein the target camera is a camera corresponding to the field angle which is coincident with the laser beam emission azimuth angle; sending a trigger instruction to the target camera to enable the target camera to acquire image information based on the trigger instruction.
Further, the method further comprises: constructing a signal data packet of the laser radar based on the scanning angular velocity of the laser radar and the time interval of the laser radar emitting laser beams at adjacent azimuth angles, wherein the signal data packet comprises: each laser beam emission azimuth angle of the laser radar and the emission timestamp of the laser beam corresponding to each laser beam azimuth angle.
Further, the scanning range of the laser radar is 360 °.
Further, the field angle coverage of the camera group is 360 °.
In a second aspect, an embodiment of the present invention provides a synchronization control apparatus comprising an acquisition unit, a determination unit and an execution unit, wherein the acquisition unit is used for acquiring a laser beam emission azimuth angle of a laser radar and a field angle of each camera in a camera group; the determination unit is used for determining a target camera based on the laser beam emission azimuth angle and the field angle of each camera in the camera group, wherein the target camera is a camera corresponding to the field angle which is coincident with the laser beam emission azimuth angle; and the execution unit is used for sending a trigger instruction to the target camera so that the target camera acquires image information based on the trigger instruction.
Further, the apparatus further comprises: a constructing unit, configured to construct a signal data packet of the lidar based on a scanning angular velocity of the lidar and a time interval at which the lidar emits a laser beam at an adjacent azimuth, where the signal data packet includes: each laser beam emission azimuth angle of the laser radar and the emission timestamp of the laser beam corresponding to each laser beam azimuth angle.
Further, the scanning range of the laser radar is 360 °.
Further, the field angle coverage of the camera group is 360 °.
In a third aspect, an embodiment of the present invention further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the method in any one of the above first aspects when executing the computer program.
In a fourth aspect, the present invention also provides a computer-readable medium having non-volatile program code executable by a processor, where the program code causes the processor to execute the method in any one of the above first aspects.
In the embodiment of the invention, the laser beam emission azimuth angle of the laser radar and the field angle of each camera in the camera group are first obtained; then a target camera is determined based on the laser beam emission azimuth angle and the field angle of each camera in the camera group, the target camera being the camera whose field angle coincides with the laser beam emission azimuth angle; and finally a trigger instruction is sent to the target camera so that the target camera acquires image information based on the trigger instruction. In this way, the image acquisition time of the camera is determined from the data of the laser radar, so that the laser radar and the camera acquire surrounding environment data at the same time. This alleviates the technical problem of low control accuracy in existing synchronization control methods for a laser radar and a camera, and achieves the technical effect of high-precision synchronization control of the laser radar and the camera.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a flowchart of a synchronization control method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a lidar according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a camera according to an embodiment of the present invention;
fig. 4 is a schematic view illustrating an installation of a laser radar and a camera according to an embodiment of the present invention;
fig. 5 is a schematic view illustrating an installation of another lidar and a camera according to an embodiment of the present invention;
fig. 6 is a schematic diagram of a laser radar and a camera according to an embodiment of the present invention;
fig. 7 is a signal diagram of a laser radar and a camera according to an embodiment of the present invention;
fig. 8 is a schematic diagram of a synchronization control apparatus according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The first embodiment is as follows:
In accordance with an embodiment of the present invention, an embodiment of a synchronization control method is provided. It should be noted that the steps illustrated in the flowchart of the figure may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flowchart, in some cases the steps shown or described may be performed in an order different from the one described here.
Fig. 1 is a flowchart of a synchronization control method according to an embodiment of the present invention, as shown in fig. 1, the method includes the steps of:
step S102, acquiring a laser beam emission azimuth angle of the laser radar and a field angle of each camera in the camera group;
step S104, determining a target camera based on the laser beam emission azimuth angle and the field angle of each camera in the camera group, wherein the target camera is a camera corresponding to the field angle which is coincident with the laser beam emission azimuth angle;
step S106, sending a trigger instruction to the target camera so that the target camera acquires image information based on the trigger instruction.
In the embodiment of the invention, the laser beam emission azimuth angle of the laser radar and the field angle of each camera in the camera group are first obtained; then a target camera is determined based on the laser beam emission azimuth angle and the field angle of each camera in the camera group, the target camera being the camera whose field angle coincides with the laser beam emission azimuth angle; and finally a trigger instruction is sent to the target camera so that the target camera acquires image information based on the trigger instruction. In this way, the image acquisition time of the camera is determined from the data of the laser radar, so that the laser radar and the camera acquire surrounding environment data at the same time. This alleviates the technical problem of low control accuracy in existing synchronization control methods for a laser radar and a camera, and achieves the technical effect of high-precision synchronization control of the laser radar and the camera.
The perception system is an important component of an intelligent device; without an environment perception system, an intelligent device is like a person without sight and hearing. Intelligent devices rely on a number of external sensors to perceive the environment; currently common environment sensors include laser radar (LiDAR), microwave radar, vision sensors and the like. In an expressway environment, where speeds are high, a microwave radar (such as a millimeter-wave radar) with a long detection range is usually selected; in urban environments, where the surroundings are complex, a laser radar with a large detection angle is usually selected. The vision sensor (camera) is the most flexible and relatively inexpensive, and is the sensor with the greatest application prospects.
A camera is the sensor that collects the largest amount of information in an intelligent device and is similar to the human eye, but a camera cannot accurately measure the distance to an obstacle in the scene, which a laser radar (LiDAR) can detect. The disadvantage of LiDAR is that it cannot tell what kind of object an obstacle is. A perception system of an intelligent device therefore usually comprises a variety of sensors, the purpose being to combine the respective strengths and weaknesses of the sensors in order to analyze the type, size, motion state and so on of target obstacles for path planning and behavior decisions of the intelligent device. Synchronization of multiple sensors thus becomes an indispensable function in the perception system of an intelligent device: all sensors are required to acquire data of the surrounding environment synchronously so that it can be analyzed comprehensively.
The laser radar emits a laser beam through a transmitter; the laser beam is reflected back to a laser receiver after encountering an object, and the exact distance to the obstacle is calculated from the time difference between transmission and reception and the propagation speed of the laser beam. As shown in fig. 2, which is a schematic diagram of the operation of the lidar, the scanning range of the lidar is 360° and the scanning angular velocity is w; that is, if the laser beam is currently scanning at azimuth a0, the scanning position at the next instant is a0 + w.
The principle of camera imaging is pinhole imaging: an object forms a sharp image on the imaging plane by reflecting light (or emitting light itself) through a small aperture. Because of this imaging principle and the lens, the field angle of each camera (i.e. the range of the field of view that the camera can image) is limited. As shown in fig. 3, which is a schematic diagram of the operation of the camera, scenery within the azimuth range a0 to a2 can be imaged, while scenery outside that azimuth range cannot be imaged (i.e. the camera cannot see that area of the scene).
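For illustration only, the following minimal Python sketch shows one way to test whether a laser beam azimuth falls inside a camera's field angle; the function name, the degree-based angle convention and the half-open interval are assumptions made for the sketch and are not part of the patent.

```python
def azimuth_in_fov(beam_azimuth_deg: float, fov_start_deg: float, fov_end_deg: float) -> bool:
    """Return True if the beam azimuth lies within [fov_start, fov_end) on the 0-360 deg circle."""
    beam = beam_azimuth_deg % 360.0
    start = fov_start_deg % 360.0
    end = fov_end_deg % 360.0
    if start <= end:
        return start <= beam < end
    # The field of view crosses the 0 deg boundary, e.g. 315 deg to 45 deg.
    return beam >= start or beam < end

# Example: a camera covering 315 deg to 45 deg sees a beam fired at 10 deg.
assert azimuth_in_fov(10.0, 315.0, 45.0)
```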
Finally, the laser radar and the camera are controlled synchronously. The laser beam emission azimuth angle of the laser radar is analyzed, and when it coincides with the field-of-view azimuth angle of a camera, a trigger instruction is generated to trigger that camera to capture an image. In this way the laser radar and the camera can acquire data synchronously, but some constraints apply: 1) the camera needs to support an external trigger function; 2) the camera must be mounted directly above or below the laser radar (as shown in fig. 4).
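A non-authoritative sketch of the control flow described above (steps S102 to S106) might look as follows; the CameraConfig structure, the trigger callback and the per-azimuth entry point are illustrative assumptions rather than the patent's implementation, and azimuth_in_fov is the helper sketched earlier.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class CameraConfig:
    name: str
    fov_start_deg: float         # start azimuth of this camera's field angle
    fov_end_deg: float           # end azimuth of this camera's field angle
    trigger: Callable[[], None]  # sends the hardware external-trigger pulse (assumed interface)

def select_target_camera(beam_azimuth_deg: float,
                         cameras: List[CameraConfig]) -> Optional[CameraConfig]:
    """Step S104: pick the camera whose field angle coincides with the beam azimuth."""
    for cam in cameras:
        if azimuth_in_fov(beam_azimuth_deg, cam.fov_start_deg, cam.fov_end_deg):
            return cam
    return None

def on_lidar_azimuth(beam_azimuth_deg: float, cameras: List[CameraConfig]) -> None:
    """Steps S102/S106: called for each azimuth parsed from the lidar data; triggers the matching camera."""
    target = select_target_camera(beam_azimuth_deg, cameras)
    if target is not None:
        target.trigger()  # the externally triggered camera then captures an image
```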
In an embodiment of the present invention, the method further includes the steps of:
step S108, constructing a signal data packet of the laser radar based on the scanning angular velocity of the laser radar and the time interval of the laser radar emitting laser beams at adjacent azimuth angles, wherein the signal data packet comprises: each laser beam emission azimuth angle of the laser radar and the emission timestamp of the laser beam corresponding to each laser beam azimuth angle.
In the embodiment of the invention, the azimuth angle in the signal data packet of the laser radar represents the angle at which the current laser beam is emitted, and the timestamp represents the emission time of the first laser beam in the current signal data packet. Because the time interval between laser emissions at adjacent azimuth angles is constant (for a Velodyne VLP-16 lidar, for example, the interval is 55.296 µs), the emission time of the laser beam at any azimuth angle can be calculated in sequence, yielding the emission timestamp corresponding to each azimuth angle.
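As a non-authoritative illustration of the timestamp reconstruction just described, the sketch below offsets each azimuth by a constant firing interval from the packet timestamp; the packet layout (a start timestamp plus an ordered list of azimuths) is an assumption about how the data is exposed, and the default interval is the 55.296 µs figure quoted above for the VLP-16.

```python
def azimuth_timestamps(packet_timestamp_us: float,
                       azimuths_deg: list,
                       firing_interval_us: float = 55.296) -> list:
    """Return (azimuth, emission_timestamp_us) pairs for one signal data packet.

    The packet timestamp marks the first laser firing in the packet; each later
    azimuth is offset by the constant firing interval between adjacent azimuths.
    """
    return [(azimuth, packet_timestamp_us + i * firing_interval_us)
            for i, azimuth in enumerate(azimuths_deg)]
```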
The above method will be described in detail with reference to fig. 5 to 7:
Taking a camera group with four cameras as an example, as shown in fig. 5 and fig. 6, four cameras (named Cam0, Cam1, Cam2 and Cam3) are installed at four positions directly below the LiDAR. Each camera is responsible for a corresponding 90° field angle, and the monitored area is divided into four sections I, II, III and IV. The laser radar (LiDAR) supports 360° scanning.
According to the synchronization control requirements, the operating frequency of the LiDAR and the exposure time and exposure mode of the cameras are set in advance. After each device is powered on and working normally, the current scanning azimuth information is parsed from the received LiDAR data, and if the current scanning azimuth coincides with a camera's field-of-view azimuth, a trigger signal is generated to externally trigger that camera. In the installation schematic of the LiDAR and the cameras shown in fig. 6, the coincidence azimuth angles corresponding to Cam0, Cam1, Cam2 and Cam3 are defined as w0, w1, w2 and w3 respectively. The synchronization control process for the LiDAR and the cameras is illustrated below.
Take Velodyne's VLP-16 lidar and the ON AR0231 camera as examples. The VLP-16 laser radar has an operating frequency of 5-20 Hz, which is typically set to 10 Hz to balance rotation speed and scanning accuracy. Thus, the frequency at which a LiDAR scan passes through each coincidence azimuth angle is 10 Hz, i.e., the frequency of the trigger signals (Trigger_0, Trigger_1, Trigger_2, Trigger_3) that the LiDAR scan produces for the cameras (Cam0, Cam1, Cam2, Cam3) is 10 Hz (as shown in fig. 7, where a rising edge of the LiDAR signal indicates that the data for a scan azimuth is valid). When the azimuth angle in the LiDAR scanning data coincides with the azimuth angle w0 of Cam0, the trigger signal Trigger_0 for Cam0 is generated; the trigger signals Trigger_1, Trigger_2 and Trigger_3 for Cam1, Cam2 and Cam3 are generated by analogy. If each camera is triggered only by its own trigger signal, the camera frame rate is 10 fps (frames per second). If a frame rate of 10 fps cannot meet the requirements of the back-end machine vision algorithm (or for other reasons), the trigger signals can be combined, i.e., a Cam_Tri signal with an adjustable frequency (10 Hz, 20 Hz, 30 Hz or 40 Hz) is generated (as shown in fig. 7), so that simultaneous multi-camera triggering, synchronous control of the cameras and the LiDAR, and the requirements of both human vision and machine vision can all be satisfied.
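As a rough illustration of the trigger-rate arithmetic above (not an implementation from the patent), the sketch below enumerates the merged Cam_Tri pulse times for a given rotation rate and set of coincidence azimuths; the evenly spaced azimuths in the example and the function name are assumptions.

```python
def cam_tri_pulse_times(coincidence_azimuths_deg, rotation_hz=10.0, duration_s=1.0):
    """Merge the per-camera trigger pulses into one Cam_Tri pulse train.

    With the lidar rotating at rotation_hz, each coincidence azimuth is crossed
    once per revolution, so each camera's trigger runs at rotation_hz and the
    merged signal runs at len(coincidence_azimuths_deg) * rotation_hz.
    """
    period_s = 1.0 / rotation_hz
    pulses = []
    for azimuth in coincidence_azimuths_deg:
        offset_s = (azimuth % 360.0) / 360.0 * period_s  # time within one revolution
        t = offset_s
        while t < duration_s:
            pulses.append(round(t, 6))
            t += period_s
    return sorted(pulses)

# Four cameras at 0/90/180/270 deg and 10 Hz rotation give a 40 Hz Cam_Tri signal.
print(len(cam_tri_pulse_times([0.0, 90.0, 180.0, 270.0])))  # 40
```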
Example two:
the embodiment of the present invention further provides a synchronization control device, which is mainly used for executing the synchronization control method provided in the foregoing content of the embodiment of the present invention, and the following describes the synchronization control device provided in the embodiment of the present invention in detail.
Fig. 8 is a schematic diagram of a synchronization control apparatus according to an embodiment of the present invention, and as shown in fig. 8, the synchronization control apparatus mainly includes: an acquisition unit 10, a determination unit 20 and an execution unit 30.
The acquiring unit 10 is configured to acquire a laser beam emission azimuth angle of the laser radar and a field angle of each camera in the camera group;
the determining unit 20 is configured to determine a target camera based on the laser beam emission azimuth angle and the field angle of each camera in the camera group, where the target camera is a camera corresponding to the field angle that coincides with the laser beam emission azimuth angle;
the execution unit 30 is configured to send a trigger instruction to the target camera, so that the target camera acquires image information based on the trigger instruction.
In the embodiment of the invention, the laser beam emission azimuth angle of the laser radar and the field angle of each camera in the camera group are first obtained; then a target camera is determined based on the laser beam emission azimuth angle and the field angle of each camera in the camera group, the target camera being the camera whose field angle coincides with the laser beam emission azimuth angle; and finally a trigger instruction is sent to the target camera so that the target camera acquires image information based on the trigger instruction. In this way, the image acquisition time of the camera is determined from the data of the laser radar, so that the laser radar and the camera acquire surrounding environment data at the same time. This alleviates the technical problem of low control accuracy in existing synchronization control methods for a laser radar and a camera, and achieves the technical effect of high-precision synchronization control of the laser radar and the camera.
Optionally, the apparatus further comprises: a constructing unit, configured to construct a signal data packet of the lidar based on a scanning angular velocity of the lidar and a time interval at which the lidar emits a laser beam at an adjacent azimuth, where the signal data packet includes: each laser beam emission azimuth angle of the laser radar and the emission timestamp of the laser beam corresponding to each laser beam azimuth angle.
Optionally, the scanning range of the lidar is 360 °.
Optionally, the field angle coverage of the camera group is 360 °.
The present application also provides a computer readable medium having non-volatile program code executable by a processor, the program code causing the processor to perform the method of any of the above method embodiments.
Example three:
as shown in fig. 9, the electronic device 900 includes one or more processors 902, one or more storage devices 904, an input device 906, an output device 908, and a data collector 910, which are interconnected via a bus system 912 and/or other form of connection (not shown). It should be noted that the components and configuration of the electronic device 900 shown in FIG. 9 are exemplary only, and not limiting, and the electronic device may have other components and configurations as desired.
The processor 902 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 900 to perform desired functions.
The storage 904 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random access memory (RAM) and/or cache memory. The non-volatile memory may include, for example, read-only memory (ROM), a hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer-readable storage medium, and the processor 902 may execute these instructions to implement the client functionality and/or other desired functionality of the embodiments of the invention described below. Various applications and various data, such as data used and/or generated by the applications, may also be stored in the computer-readable storage medium.
The input device 906 may be a device used by a user to input instructions and may include one or more of a keyboard, a mouse, a microphone, a touch screen, and the like.
The output device 908 may output various information (e.g., images or sounds) to an outside (e.g., a user), and may include one or more of a display, a speaker, and the like.
The data collector 910 obtains the laser beam emission azimuth angle of the lidar and the field angles of the cameras in the camera group, and stores the obtained laser beam emission azimuth angle of the lidar and the obtained field angles of the cameras in the camera group in the storage device 904 for use by other components.
Illustratively, an exemplary electronic device for implementing the synchronization control method according to the embodiments of the present invention may be implemented on a device such as a server.
In addition, in the description of the embodiments of the present invention, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
Finally, it should be noted that: the above-mentioned embodiments are only specific embodiments of the present invention, which are used for illustrating the technical solutions of the present invention and not for limiting the same, and the protection scope of the present invention is not limited thereto, although the present invention is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: any person skilled in the art can modify or easily conceive the technical solutions described in the foregoing embodiments or equivalent substitutes for some technical features within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present invention, and they should be construed as being included therein. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (10)
1. A synchronization control method, comprising:
acquiring a laser beam emission azimuth angle of a laser radar and a field angle of each camera in a camera group;
determining a target camera based on the laser beam emission azimuth angle and the field angle of each camera in the camera group, wherein the target camera is a camera corresponding to the field angle which is coincident with the laser beam emission azimuth angle;
sending a trigger instruction to the target camera to enable the target camera to acquire image information based on the trigger instruction.
2. The method of claim 1, further comprising:
constructing a signal data packet of the laser radar based on the scanning angular velocity of the laser radar and the time interval of the laser radar emitting laser beams at adjacent azimuth angles, wherein the signal data packet comprises: each laser beam emission azimuth angle of the laser radar and the emission timestamp of the laser beam corresponding to each laser beam azimuth angle.
3. The method of claim 1, wherein the scanning range of the lidar is 360 °.
4. The method of claim 1, wherein the camera group has a 360 ° field of view coverage.
5. A synchronous control device, comprising: an acquisition unit, a determination unit and an execution unit, wherein,
the acquisition unit is used for acquiring a laser beam emission azimuth angle of the laser radar and a field angle of each camera in the camera group;
the determining unit is used for determining a target camera based on the laser beam emission azimuth angle and the field angle of each camera in the camera group, wherein the target camera is a camera corresponding to the field angle which is coincident with the laser beam emission azimuth angle;
the execution unit is used for sending a trigger instruction to the target camera so that the target camera acquires image information based on the trigger instruction.
6. The apparatus of claim 5, further comprising:
the constructing unit is used for constructing a signal data packet of the laser radar based on the scanning angular velocity of the laser radar and the time interval of the laser radar emitting laser beams at adjacent azimuth angles, wherein the signal data packet comprises: each laser beam emission azimuth angle of the laser radar and the emission timestamp of the laser beam corresponding to each laser beam azimuth angle.
7. The apparatus of claim 5, wherein the scanning range of the lidar is 360 °.
8. The apparatus of claim 5, wherein the camera group has a 360 ° field of view coverage.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the method of any of the preceding claims 1 to 4 are implemented when the computer program is executed by the processor.
10. A computer-readable medium having non-volatile program code executable by a processor, wherein the program code causes the processor to perform the method of any of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911162227.5A CN110839131A (en) | 2019-11-22 | 2019-11-22 | Synchronization control method, synchronization control device, electronic equipment and computer readable medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911162227.5A CN110839131A (en) | 2019-11-22 | 2019-11-22 | Synchronization control method, synchronization control device, electronic equipment and computer readable medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110839131A true CN110839131A (en) | 2020-02-25 |
Family
ID=69577222
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911162227.5A Pending CN110839131A (en) | 2019-11-22 | 2019-11-22 | Synchronization control method, synchronization control device, electronic equipment and computer readable medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110839131A (en) |
- 2019-11-22: application CN201911162227.5A filed in China; published as CN110839131A; status: Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103984198A (en) * | 2014-04-17 | 2014-08-13 | 北京金景科技有限公司 | 360-degree panorama camera |
CN107407866A (en) * | 2015-02-24 | 2017-11-28 | 嗨魄Vr公司 | Laser radar stereoscopic fusion true man's outdoor scene threedimensional model video reconstruction for 360 ° of body virtual reality videos of six degree of freedom |
US20180003822A1 (en) * | 2016-07-01 | 2018-01-04 | Baidu Online Network Technology (Beijing) Co., Ltd | Environmental sensing device and information acquiring method applied to environmental sensing device |
CN107991662A (en) * | 2017-12-06 | 2018-05-04 | 江苏中天引控智能系统有限公司 | A kind of 3D laser and 2D imaging synchronous scanning device and its scan method |
CN108957478A (en) * | 2018-07-23 | 2018-12-07 | 上海禾赛光电科技有限公司 | Multisensor synchronous sampling system and its control method, vehicle |
CN110082739A (en) * | 2019-03-20 | 2019-08-02 | 深圳市速腾聚创科技有限公司 | Method of data synchronization and equipment |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113219487A (en) * | 2021-05-07 | 2021-08-06 | 北京理工大学 | High-speed target surface feature and motion parameter measuring device and method |
CN113219487B (en) * | 2021-05-07 | 2022-07-19 | 北京理工大学 | High-speed target surface feature and motion parameter measuring device and method |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111435162B (en) | Laser radar and camera synchronization method, device, equipment and storage medium | |
KR102245648B1 (en) | Multi-dimensional data capture of an environment using plural devices | |
US20100208941A1 (en) | Active coordinated tracking for multi-camera systems | |
CN113671480A (en) | Radar and video fusion traffic target tracking method, system, equipment and terminal | |
CN112016483B (en) | Relay system, method, device and equipment for target detection | |
CN110491060B (en) | Robot, safety monitoring method and device thereof, and storage medium | |
CN111045000A (en) | Monitoring system and method | |
CN112017250A (en) | Calibration parameter determination method and device, radar vision equipment and radar ball joint system | |
EP3940666A1 (en) | Digital reconstruction method, apparatus, and system for traffic road | |
WO2022179207A1 (en) | Window occlusion detection method and apparatus | |
US11808857B2 (en) | Multi-sensor superresolution scanning and capture system | |
CN111179329A (en) | Three-dimensional target detection method and device and electronic equipment | |
US20220044558A1 (en) | Method and device for generating a digital representation of traffic on a road | |
US20230169683A1 (en) | Association of concurrent tracks across multiple views | |
US20230003549A1 (en) | Calibration of sensor position offsets based on rotation and translation vectors for matched trajectories | |
JP2022526071A (en) | Situational awareness monitoring | |
JP2016085602A (en) | Sensor information integrating method, and apparatus for implementing the same | |
US20220373683A1 (en) | Image processing device, monitoring system, and image processing method | |
CN113076830A (en) | Environment passing area detection method and device, vehicle-mounted terminal and storage medium | |
US20240201371A1 (en) | Three-dimensional ultrasonic imaging method and system based on lidar | |
CN110839131A (en) | Synchronization control method, synchronization control device, electronic equipment and computer readable medium | |
CN109708659A (en) | A kind of distributed intelligence photoelectricity low latitude guard system | |
CN118015559A (en) | Object identification method and device, electronic equipment and storage medium | |
EP3510573B1 (en) | Video surveillance apparatus and method | |
JP6988797B2 (en) | Monitoring system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20200225 |