CN114499733A - Four-legged robot-mounted SLAM device and sensor time synchronization method - Google Patents


Info

Publication number: CN114499733A
Authority: CN (China)
Prior art keywords: data, module, camera, GNSS, value
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Application number: CN202210142484.8A
Other languages: Chinese (zh)
Inventors: 王庆, 张颖, 严超, 王怀虎, 黎露, 陈晓宇
Current Assignee: Southeast University (the listed assignees may be inaccurate; Google has not performed a legal analysis)
Original Assignee: Southeast University
Application filed by Southeast University
Priority to CN202210142484.8A
Publication of CN114499733A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04J: MULTIPLEX COMMUNICATION
    • H04J 3/00: Time-division multiplex systems
    • H04J 3/02: Details
    • H04J 3/06: Synchronising arrangements
    • H04J 3/0635: Clock or time synchronisation in a network
    • H04J 3/0638: Clock or time synchronisation among nodes; Internode synchronisation

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

Provided are a four-legged robot-mounted SLAM device and a sensor time synchronization method. The device comprises a GNSS/INS integrated navigation device, a laser radar, a camera, a processor, a visualization platform and a quadruped robot. In the method, the GNSS/INS integrated navigation device outputs a PPS signal to the synchronization board, which feeds a 1 Hz signal to the laser radar and to the software pulse-per-second counting module and a 60 Hz signal to the camera, so that the laser radar and the camera generate data synchronously. All modules run in their own threads without mutual interference. When the data of the laser radar, the camera and the GNSS/INS integrated navigation device have all been sent to the asynchronous thread module, the asynchronous thread module sends all the data to the data synchronization module, and the data synchronization module forwards all subsequent data to the subsequent processing module. The software pulse-per-second counting module collects and counts the second pulses output by the GNSS, providing a counting basis for the other modules.

Description

Four-legged robot-mounted SLAM device and sensor time synchronization method
Technical Field
The invention belongs to the field of vehicle-mounted SLAM devices and hard time synchronization of sensors, and particularly relates to a four-legged robot-mounted SLAM device and a sensor time synchronization method.
Background
From the perspective of the robot, an omnidirectional mobile robot can move in any direction and can be used in military, industrial, domestic and other fields. Simultaneous Localization and Mapping (SLAM) for mobile robots is a popular research problem in the robotics field and is the premise and basis of a mobile robot's autonomous task planning and path planning. Put simply, the SLAM problem requires a mobile robot to build a map of an unknown environment while localizing itself on that map in real time. The process is comparable to a person entering a completely unfamiliar environment without any device for determining position and orientation: he can only recognize the environment and determine his own location by observing his surroundings and estimating his own motion.
From the perspective of the sensors, high-precision synchronous acquisition of multi-sensor data is one of the key technologies of automatic driving. With worldwide advances in technology, automatic driving has become a hot spot of current research and has drawn heavy investment from enterprises and universities: large foreign technology companies such as Google and Uber successively released automatic driving automobiles in 2005, and leading domestic technology companies such as Baidu and TuSimple opened automatic driving research projects in 2013. Most current multi-sensor data acquisition systems applied to automatic driving integrate a number of separate modules, and therefore suffer from high overall cost, low time synchronization precision, large volume and relatively complex assembly.
The accuracy, synchronism and validity of the collected sensor data are key to ensuring accurate positioning and navigation for automatic driving. Existing multi-sensor data acquisition technologies, in China and abroad, generally have the following shortcomings: (1) the time references of the various sensors are not unified, so sensor data fusion is difficult to carry out effectively; (2) the functions of the various sensors are independent of each other and do not form effective complementarity; (3) because GPS signals are easily interfered with, high-precision positioning of the vehicle in arbitrary scenarios is difficult to achieve.
To achieve effective fusion of multi-sensor data, the sensors must be highly integrated, their multi-source data must be unified onto the same time and space reference by suitable technical means, and data synchronization among the sensors must be guaranteed, so that accurate pose determination and positioning of an automatically driven vehicle can be achieved more effectively and accurately. For the unification of the space reference, the relative position of each sensor can be obtained through calibration, so that the exact position of each sensor in a specific coordinate system can be calculated from the initial coordinates in a high-precision map. The unification of the time reference requires the absolute time accuracy of the acquisition system to be within a certain error range and the multi-sensor data to be acquired synchronously with ultra-low latency. On the one hand, existing automatic driving systems at home and abroad generally employ multiple laser radars, cameras and inertial navigation units, which greatly increases the production cost and volume of the system. On the other hand, existing schemes for synchronizing the sensor data have certain defects. For example, collision-time estimation based on multi-sensor fusion, out-of-order multi-sensor data fusion, and several international projects under study, such as the IntelliDrive project, the European INTERSAFE project, and the German Ko-PER project within the Ko-FAS initiative, exchange the result data of each sensor to achieve more reliable driving assistance on the basis of a separate synchronous acquisition module for each sensor; this, however, creates a larger time-stamping problem, and besides synchronizing the different local sensors these methods must also synchronize the clocks of different vehicles and infrastructure. In these works, the time from the start of a measurement to the completion of the data transmission is ignored, so the data of the individual sensors carry large time errors that can reach several hundred milliseconds.
Disclosure of Invention
In order to solve the above technical problems, and considering that the time references of the various sensors differ, the invention provides a four-legged robot-mounted SLAM device and a sensor time synchronization method. The GNSS/INS integrated navigation device outputs a PPS signal to the synchronization board, and the synchronization board divides the PPS signal into 1 Hz and 60 Hz signals. The 1 Hz signal is fed to the laser radar and to the software pulse-per-second counting module, and the 60 Hz signal is fed to the camera, so that the laser radar and the camera generate data synchronously. The laser radar sends the point cloud data to the laser radar acquisition module through the Ethernet, records the value and frame value of the current pulse-per-second module, and sends them to the asynchronous thread module; the camera sends the camera data to the camera acquisition module through a USB 3.0 interface, records the value and frame value of the current pulse-per-second module, and sends them to the asynchronous thread module; the pulse-per-second module receives the 1 Hz signal sent by the GNSS/INS integrated navigation device through a GPIO port, records the pulse-per-second count value and sends it to the asynchronous thread module; the GNSS/INS integrated navigation device sends the GNSS value, the IMU value and the pulse-per-second value to the GNSS acquisition module through the serial port and on to the asynchronous thread module. All modules run in their own threads without mutual interference. When the data of the laser radar, the camera and the GNSS/INS integrated navigation device have all been sent to the asynchronous thread module, the asynchronous thread module sends all the data to the data synchronization module, and the data synchronization module forwards all subsequent data to the subsequent processing module. The software pulse-per-second counting module collects and counts the second pulses output by the GNSS, providing a counting basis for the other modules.
The invention provides a four-legged robot-mounted SLAM device comprising a GNSS/INS integrated navigation device, an RS-LiDAR-32 laser radar, a UI-3140CP camera, a processor and a quadruped robot. The GNSS/INS integrated navigation device integrates a high-precision GNSS board card and an automotive-grade MEMS IMU, and carries an integrated 4G communication module. The GNSS/INS integrated navigation device, the RS-LiDAR-32 laser radar and the UI-3140CP camera are all connected to the processor, which consists of three control boards, namely a core board, an expansion board and a synchronization board. The PPS signal is divided into 1 Hz and 60 Hz signals by the synchronization board; the 1 Hz signal is fed to the RS-LiDAR-32 laser radar and the software pulse-per-second module, and the 60 Hz signal is fed to the camera, so that the laser radar and the camera generate data synchronously. The RS-LiDAR-32 laser radar sends the point cloud data to the laser radar acquisition module through the Ethernet, records the value and frame value of the current pulse-per-second module, and sends them to the asynchronous thread module; the camera sends the camera data to the camera acquisition module through a USB 3.0 interface, records the value and frame value of the current pulse-per-second module, and sends them to the asynchronous thread module; the pulse-per-second module receives the 1 Hz signal sent by the GNSS/INS integrated navigation device through a GPIO port, records the pulse-per-second count value and sends it to the asynchronous thread module; the GNSS/INS integrated navigation device sends the GNSS value, the IMU value and the pulse-per-second value to the GNSS acquisition module through the serial port and on to the asynchronous thread module. All modules run in their own threads without mutual interference. When the data of the laser radar, the camera and the GNSS/INS integrated navigation device have all been sent to the asynchronous thread module, the asynchronous thread module sends all the data to the data synchronization module, and the data synchronization module forwards all subsequent data to the subsequent processing module.
The invention further provides a sensor time synchronization method for the four-legged robot-mounted SLAM device, which comprises the following steps:
s1, connecting the laser radar, the camera and the GNSS/INS integrated navigation device to the synchronization board, and feeding the PPS signal output by the GNSS/INS integrated navigation device into the synchronization board;
the step S1 includes the following steps:
s11, dividing the PPS signal into 1 Hz and 60 Hz signals on the synchronization board;
s12, feeding the 1 Hz signal to the laser radar and the software pulse-per-second module, and the 60 Hz signal to the camera;
s2, the laser radar sends the point cloud data to the laser radar acquisition module through the Ethernet, records the value and frame value of the current pulse-per-second module, and sends them to the asynchronous thread module;
the step S2 includes the following steps:
s21, registering the laser radar driver error callback function and the laser radar driver data callback function;
s22, initializing and starting the laser radar driver, and processing the laser radar data;
s23, calling the laser radar driver parsing function to decode the data frame, and sending the data frame to the asynchronous thread module;
s3, the camera sends the camera data to the camera acquisition module through a USB 3.0 interface, records the value and frame value of the current pulse-per-second module, and sends them to the asynchronous thread module;
the step S3 includes the following steps:
s31, acquiring the camera list, opening the camera and initializing it;
s32, starting the camera exposure event monitoring thread;
s33, starting the camera frame-transfer-complete event monitoring thread;
s4, the pulse-per-second counting module receives the 1 Hz signal sent by the GNSS/INS integrated navigation device through a GPIO port, records the pulse-per-second count value, and sends it to the asynchronous thread module;
the step S4 includes the following steps:
s41, opening the GPIO port corresponding to the 1PPS signal and initializing it;
s42, polling the GPIO port and reading its state;
s43, adding 1 to the global counter, and continuing to poll the GPIO port state;
s5, the GNSS/INS integrated navigation device sends the GNSS value, the IMU value and the pulse-per-second value to the GNSS acquisition module through the serial port and on to the asynchronous thread module;
the step S5 includes the following steps:
s51, initializing serial port 0 and serial port 1, where serial port 0 and serial port 1 are the interfaces given by the input parameters serial_0 and serial_1 respectively;
s52, reading a section of serial port 0 data, and judging whether the data contains a GPGGA command;
s53, reading a section of serial port 1 data, and judging whether the data contains a GPGGA command;
s54, exchanging the serial port configuration;
s55, processing the data of the two serial ports;
s6, when the data of the laser radar, the camera and the GNSS/INS integrated navigation device have all been sent to the asynchronous thread module, the asynchronous thread module sends all the data to the data synchronization module, and the data synchronization module forwards all subsequent data to the subsequent processing module;
the step S6 includes the following steps:
s61, the synchronization module processes the radar data and forwards it to the subsequent processing module;
s62, the synchronization module processes the camera data and forwards it to the subsequent processing module;
s63, the synchronization module processes the GNSS data and forwards it to the subsequent processing module.
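Each of the steps above stamps its data with the current software pulse-per-second count and a frame value before handing it to the asynchronous thread module. As a purely illustrative sketch (the patent does not prescribe a concrete data layout; all names below are assumptions), such a tagged message could be modelled in Python as:

from dataclasses import dataclass, field
from enum import Enum
import time

class Source(Enum):
    LIDAR = "lidar"
    CAMERA = "camera"
    GNSS = "gnss"   # NMEA and IMU frames from the integrated navigation device

@dataclass
class SensorFrame:
    source: Source      # acquisition module that produced the frame
    pps_count: int      # software pulse-per-second counter value at capture time
    frame_value: int    # per-sensor frame counter
    payload: bytes      # raw frame (point cloud packet, image, NMEA or IMU frame)
    host_time: float = field(default_factory=time.monotonic)  # diagnostics only

Because every frame carries the same GNSS-disciplined pps_count, the data synchronization module can align streams without relying on the host clock.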
As a further improvement of the present invention, the step S32 includes the following steps: acquiring a camera exposure event, creating a new camera event record, initializing its fields, and putting the record into the record queue.
As a further improvement of the present invention, the step S33 includes the following steps: acquiring a camera frame-transfer-complete event, acquiring the corresponding camera frame data, searching the record queue for the corresponding camera exposure event record, creating and initializing a camera frame information object, and sending it to the asynchronous thread module.
As a further improvement of the present invention, the step S52 includes the following steps:
if the GPGGA command is not found, continue with step S53;
if it is found, perform the following step:
s521, use the default serial port configuration and continue with step S55.
The step S53 includes the following steps:
if the GPGGA command is found, continue with steps S54 and S55;
if it is not found, perform step S521.
As a further improvement of the present invention, the step S55 includes the following steps: reading the NMEA data, determining the frame tail according to '\n' and the frame head according to '$', and sending the NMEA data frame to the data synchronization module; reading the IMU data, calling the framing function to frame it, discarding all data frames except RAWIMUS frames, and sending the IMU data to the asynchronous thread module.
As a further improvement of the present invention, the step S61 includes the following steps:
s611, while forwarding of the radar data has not yet started, replacing the held radar data with the latest radar data;
s612, judging whether the counter value at which forwarding starts has been determined;
the step S612 includes the following steps:
if yes, continue with S613;
if not, judge whether the other data are ready, and if so, calculate the count value at which forwarding starts;
s613, if the current counter value is greater than or equal to the forwarding-start value, mark the radar data as started forwarding.
As a further improvement of the present invention, the step S62 includes the following steps:
s621, while forwarding of the camera data has not yet started, replacing the held camera data with the latest camera data;
s622, judging whether the counter value at which forwarding starts has been determined;
the step S622 includes the following steps:
if yes, continue with S623;
if not, judge whether the other data are ready, and if so, calculate the count value at which forwarding starts;
s623, if the current counter value is greater than or equal to the forwarding-start value, mark the camera data as started forwarding.
As a further improvement of the present invention, the step S63 includes the following steps:
s631, judging whether forwarding of the GNSS data has started;
s632, judging whether the data is IMU data.
The step S631 includes the following steps:
if yes, the GNSS data is forwarded to the subsequent processing module;
if not, continue with S632.
The step S632 includes the following steps:
if the data is IMU data:
ss11, replace the held data with the latest IMU data;
ss12, judge whether the counter value at which forwarding starts has been determined;
ss13, if the current counter values of both the IMU and NMEA data are greater than or equal to the forwarding-start value, mark the GNSS data as started forwarding.
The step SS12 includes the following steps:
if yes, continue with SS13;
if not, judge whether the other data are ready, and if so, calculate the count value at which forwarding starts.
If the data is not IMU data:
ss21, replace the held data with the latest NMEA data;
ss22, judge whether the counter value at which forwarding starts has been determined;
ss23, if the current counter values of both the IMU and NMEA data are greater than or equal to the forwarding-start value, mark the GNSS data as started forwarding.
The step SS22 includes the following steps:
if yes, continue with SS23;
if not, judge whether the other data are ready, and if so, calculate the count value at which forwarding starts.
Advantageous effects:
Compared with the prior art: in an unknown environment, the carrier synchronously acquires data with the SLAM device it carries. The four-legged robot-mounted SLAM device comprises a GNSS/INS integrated navigation device, an RS-LiDAR-32 laser radar, a UI-3140CP camera, a processor and a quadruped robot. The GNSS/INS integrated navigation device integrates a high-precision GNSS board card and an automotive-grade MEMS IMU, and carries an integrated 4G communication module. The GNSS/INS integrated navigation device, the RS-LiDAR-32 laser radar and the UI-3140CP camera are all connected to the processor, which comprises three control boards, namely a core board, an expansion board and a synchronization board. To address the problems that the data of the individual sensors carry large time errors and that hard time synchronization cannot otherwise be achieved, the GNSS/INS integrated navigation device outputs a PPS signal to the synchronization board of the processor, and the synchronization board divides the PPS signal into 1 Hz and 60 Hz signals. The 1 Hz signal is fed to the laser radar and the software pulse-per-second module, and the 60 Hz signal is fed to the camera, so that the laser radar and the camera generate data synchronously. The laser radar sends the point cloud data to the laser radar acquisition module through the Ethernet, records the value and frame value of the current pulse-per-second module, and sends them to the asynchronous thread module; the camera sends the camera data to the camera acquisition module through a USB 3.0 interface, records the value and frame value of the current pulse-per-second module, and sends them to the asynchronous thread module; the pulse-per-second module receives the 1 Hz signal sent by the GNSS/INS integrated navigation device through a GPIO port, records the pulse-per-second count value and sends it to the asynchronous thread module; the GNSS/INS integrated navigation device sends the GNSS value, the IMU value and the pulse-per-second value to the GNSS acquisition module through the serial port and on to the asynchronous thread module. All modules run in their own threads without mutual interference. When the data of the laser radar, the camera and the GNSS/INS integrated navigation device have all been sent to the asynchronous thread module, the asynchronous thread module sends all the data to the data synchronization module, and the data synchronization module forwards all subsequent data to the subsequent processing module.
Drawings
FIG. 1 is a flow chart of a method for hard synchronization of sensor time in accordance with the present invention;
FIG. 2 is a SLAM apparatus design of the present invention;
FIG. 3 is an initialization flow diagram of the present invention;
FIG. 4 is a diagram of the synchronization board hardware wiring of the present invention;
FIG. 5 is a flow chart of a lidar acquisition module of the present invention;
FIG. 6 is a flow chart of a camera acquisition module of the present invention;
FIG. 7 is a flow chart of a counter collection module of the present invention;
FIG. 8 is a flow chart of a GNSS acquisition module of the present invention;
FIG. 9 is a lidar data processing flow diagram of the present invention;
FIG. 10 is a camera data processing flow diagram of the present invention;
FIG. 11 is a flow chart of GNSS data processing according to the present invention.
Description of reference numerals:
1. GNSS/INS integrated navigation device; 2. RS-LiDAR-32 laser radar; 3. UI-3140CP camera; 4. processor; 5. quadruped robot.
Detailed Description
The invention is described in further detail below with reference to the following figures and embodiments:
the flow chart of the sensor time hard synchronization method is shown in figure 1, the SLAM device design chart is shown in figure 2, and the four-footed robot-borne SLAM device comprises GNSS/INS combined navigation equipment 1, an RS-LiDAR-32 laser radar 2, a UI-3140CP camera 3, a processor 4 and a four-footed robot 5. The GNSS/INS integrated navigation equipment is internally integrated with a high-precision GNSS board card and a vehicle-scale MEMS IMU and is provided with an integrated 4G communication module. The GNSS/INS combined navigation equipment, the RS-LiDAR-32 laser radar and the UI-3140CP camera are all connected with the processor. The processor comprises three control panels such as a core panel, an expansion panel and a synchronization panel. Aiming at the problems that the data of each sensor has large time errors and time hard synchronization cannot be realized, a synchronization board which outputs PPS signals by GNSS/INS combined navigation equipment and is connected to a processor is considered, and the PPS signals are divided into 1Hz signals and 60Hz signals by the synchronization board. The 1Hz signal is accessed into the laser radar and the software second pulse module, and the 60Hz signal is accessed into the camera, so that the synchronous data generation of the laser radar and the camera is realized. The laser radar sends the point cloud number to a laser radar acquisition module through the Ethernet, records the numerical value and the frame value of the current second pulse module, and sends the numerical value and the frame value to an asynchronous thread module; the camera sends the camera data to the camera acquisition module through a usb3.0 interface, records the numerical value and the frame value of the current second pulse module, and sends the numerical value and the frame value to the asynchronous thread module; the second pulse module receives a 1Hz signal sent by the GNSS/INS combined navigation equipment through the GPIO port, records a second pulse count value and sends the second pulse count value to the asynchronous thread module; the GNSS/INS integrated navigation equipment sends the GNSS numerical value, the IMU numerical value and the pulse per second value to the GNSS acquisition module through the serial port and sends the GNSS numerical value, the IMU numerical value and the pulse per second value to the asynchronous thread module. All threads run in respective threads without mutual interference. When the data of the laser radar, the camera and the GNSS/INS integrated navigation equipment are all sent to the asynchronous thread module, the asynchronous thread module sends all the data to the data synchronization module, and the data synchronization module forwards all the subsequent data to the subsequent processing module.
In the invention, while the carrier moves in an unknown environment, the GNSS/INS integrated navigation device outputs a PPS signal to the synchronization board of the processor, and the synchronization board divides the PPS signal into 1 Hz and 60 Hz signals. The 1 Hz signal is fed to the laser radar and the software pulse-per-second module, and the 60 Hz signal is fed to the camera, so that the laser radar and the camera generate data synchronously. The laser radar, the camera and the GNSS/INS integrated navigation device run in their own threads without mutual interference.
I. Configuring the four-legged robot-mounted SLAM device
The four-legged robot-mounted SLAM device comprises a GNSS/INS integrated navigation device, an RS-LiDAR-32 laser radar, a UI-3140CP camera, a processor, a visualization platform and a quadruped robot. The GNSS/INS integrated navigation device integrates a high-precision GNSS board card and an automotive-grade MEMS IMU, and carries a 4G communication module. The GNSS/INS integrated navigation device, the RS-LiDAR-32 laser radar and the UI-3140CP camera are all connected to the processor, which comprises three control boards, namely a core board, an expansion board and a synchronization board. The core board is an NVIDIA Jetson TX2 embedded AI computing platform; it adopts a Pascal GPU architecture, integrates a dual-core 64-bit NVIDIA Denver 2 and a quad-core ARM Cortex-A57 MPCore, and provides 8 GB of memory with 59.7 GB/s of memory bandwidth, exceeding the specifications of the Jetson TX1. The expansion board expands the input and output channels of the core board into a standard interface compatible with Jetson TX2/TX2i/TX2-4GB modules. The synchronization of the synchronization board refers to the synchronization of its output signals; the laser radar, the camera and the GNSS/INS integrated navigation device are all connected to the synchronization board. The GNSS/INS integrated navigation device is connected to a type-c docking station through the COM1 serial port and a USB adapter cable, the camera is connected to the type-c docking station through a USB cable, and the type-c docking station is connected to the TX2 through a type-c cable; the laser radar is connected to the TX2 via a network cable. The GNSS/INS integrated navigation device outputs the PPS signal to the TX2, and the TX2 divides the PPS signal into 1 Hz and 60 Hz signals. The 1 Hz signal is then sent out from the CH2 interface and received by the GPS PULSE interface of the laser radar; the 60 Hz signal is sent out from the CH0 interface and received by the GPIO-1 interface of the camera. The processor is connected to the visualization platform.
II. Sensor time hard synchronization method
1. Initialization procedure
The main function of this module is to initialize the acquisition modules of the related devices, the counter module and so on, providing the operating basis for the subsequent data acquisition.
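As an illustrative sketch only (all names are assumptions, not part of the claimed implementation), initialization can be thought of as creating the asynchronous thread module's queue and starting each module in its own thread, using the PpsCounter and acquisition sketches given in the subsections below:

import queue
import threading

def initialize(gpio_read, lidar_driver):
    async_queue = queue.Queue()               # inbox of the asynchronous thread module
    counter = PpsCounter(read_pin=gpio_read)  # software pulse-per-second counter (see below)
    threading.Thread(target=counter.run, daemon=True).start()
    run_lidar_acquisition(lidar_driver, counter, async_queue)  # lidar callbacks (see below)
    # The camera and GNSS acquisition threads are started in the same way; every
    # module stamps its frames with counter.value before queueing them, so all
    # threads run independently without mutual interference.
    return async_queue, counter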
The initialization flow chart is shown in fig. 3:
2. Synchronization board
The synchronization of the synchronization board refers to the synchronization of its output signals; the laser radar, the camera and the GNSS/INS integrated navigation device are all connected to the synchronization board. The GNSS/INS integrated navigation device is connected to a type-c docking station through the COM1 serial port and a USB adapter cable, the camera is connected to the type-c docking station through a USB cable, and the type-c docking station is connected to the TX2 through a type-c cable; the laser radar is connected to the TX2 via a network cable. The GNSS/INS integrated navigation device outputs the PPS signal to the TX2, and the TX2 divides the PPS signal into 1 Hz and 60 Hz signals. The 1 Hz signal is then sent out from the CH2 interface and received by the GPS PULSE interface of the laser radar; the 60 Hz signal is sent out from the CH0 interface and received by the GPIO-1 interface of the camera. In this way, synchronous data generation of the laser radar and the camera is achieved.
The synchronization board hardware wiring diagram is shown in fig. 4:
3. Laser radar acquisition module
The main function of this module is to call the laser radar driver functions to initialize the laser radar and to register the relevant callback interfaces.
The laser radar sends the point cloud data to the laser radar acquisition module through the Ethernet; the module registers the laser radar driver error callback function and data callback function, initializes and starts the laser radar driver, processes the laser radar data, calls the laser radar driver parsing function to decode each data frame, records the value and frame value of the current pulse-per-second module, and sends them to the asynchronous thread module.
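A sketch of the callback registration and frame hand-off is shown below. The driver interface (register_error_callback, register_data_callback, parse_frame, start) is a hypothetical stand-in for the RS-LiDAR-32 SDK, and SensorFrame/Source are the illustrative types sketched earlier:

import queue

def run_lidar_acquisition(driver, pps_counter, async_queue: queue.Queue):
    frame_no = 0

    def on_error(err):
        # error callback registered with the driver (step S21)
        print(f"lidar driver error: {err}")

    def on_data(raw_packet):
        # data callback (S21): decode the frame (S23), stamp it with the current
        # software pulse-per-second count, and hand it to the asynchronous thread.
        nonlocal frame_no
        decoded = driver.parse_frame(raw_packet)
        frame_no += 1
        async_queue.put(SensorFrame(
            source=Source.LIDAR,
            pps_count=pps_counter.value,
            frame_value=frame_no,
            payload=decoded,
        ))

    driver.register_error_callback(on_error)  # S21
    driver.register_data_callback(on_data)    # S21
    driver.start()                            # S22: initialize and start the driver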
The flow chart of the laser radar acquisition module is shown in fig. 5:
4. Camera acquisition module
This module mainly monitors the relevant camera events to obtain the times at which they occur and the camera frame data.
The camera sends the camera data to the camera acquisition module through a USB 3.0 interface; the module acquires the camera list, opens and initializes the camera, starts the camera exposure event monitoring thread, starts the camera frame-transfer-complete event monitoring thread, records the value and frame value of the current pulse-per-second module, and sends them to the asynchronous thread module.
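The fragment below is an illustrative sketch of this matching logic (steps S32/S33): each completed frame is paired with the exposure record created when the exposure event fired. The event hooks are hypothetical, since the UI-3140CP SDK defines its own event mechanism, and SensorFrame/Source are the illustrative types sketched earlier:

from collections import deque

exposure_records = deque()  # record queue of camera exposure events (S32)

def on_exposure_event(pps_counter):
    # S32: on each exposure event, create a new record stamped with the
    # current software pulse-per-second count and append it to the queue.
    exposure_records.append({"pps_count": pps_counter.value})

def on_frame_complete(frame_data, frame_no, async_queue):
    # S33: on frame-transfer completion, fetch the matching exposure record
    # and forward a camera frame information object to the asynchronous thread.
    if not exposure_records:
        return  # no matching exposure record; drop the frame (or log it)
    record = exposure_records.popleft()
    async_queue.put(SensorFrame(
        source=Source.CAMERA,
        pps_count=record["pps_count"],  # stamped at exposure, not at transfer end
        frame_value=frame_no,
        payload=frame_data,
    ))

Stamping at exposure time rather than at transfer completion is what avoids the transfer-latency errors of up to several hundred milliseconds criticized in the background section.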
The camera acquisition module flowchart is shown in fig. 6:
5. Counter acquisition module
The main function of this module is to count the 1PPS signals output by the GNSS, providing a counting basis for the other modules.
The pulse-per-second module polls the GPIO port, reads its state, adds 1 to the global counter on each pulse, continues polling the GPIO port state, records the pulse-per-second count value, and sends it to the asynchronous thread module.
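The following Python fragment is an illustrative sketch of this polling counter (steps S41-S43); the read_pin callable and the 1 ms polling period are assumptions (on a Jetson TX2, read_pin could, for example, wrap a Jetson.GPIO input read):

import time

class PpsCounter:
    def __init__(self, read_pin):
        self._read_pin = read_pin   # callable returning the current GPIO level (0 or 1)
        self.value = 0              # global second-pulse count read by other modules

    def run(self):
        last = self._read_pin()     # S41/S42: port opened elsewhere; read initial state
        while True:
            level = self._read_pin()
            if level and not last:  # rising edge of the 1PPS signal
                self.value += 1     # S43: global counter + 1
            last = level
            time.sleep(0.001)       # poll far faster than 1 Hz so no edge is missed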
The flow chart of the counter acquisition module is shown in fig. 7:
6. GNSS acquisition module
This module collects the NMEA protocol data and the IMU data of the GNSS device. The NMEA data currently comprises only GPGGA sentences; the IMU data comprises accelerometer increment data and gyroscope increment data at a frequency of 125 Hz.
The GNSS/INS integrated navigation device sends the GNSS value, the IMU value and the pulse-per-second value to the GNSS acquisition module through the serial ports. The module initializes serial port 0 and serial port 1, which are the interfaces given by the input parameters serial_0 and serial_1 respectively. It reads a section of serial port 0 data; if the data contains a GPGGA command, the two serial ports are processed with the default configuration. If not, it reads a section of serial port 1 data and judges whether that contains a GPGGA command: if not, the two serial ports are processed with the default configuration; if so, the serial port configuration is exchanged before processing. The NMEA data is read, the frame tail is determined by '\n' and the frame head by '$', and the NMEA data frame is sent to the subsequent processing module; the IMU data is read, the framing function is called to frame it, all data frames except RAWIMUS frames are discarded, and the IMU data frames are sent to the subsequent processing module.
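A sketch of this port-detection and framing logic (steps S51-S55) is given below, assuming the pyserial package; the baud rate, buffer sizes and the GPGGA probe are illustrative assumptions:

import serial  # pyserial, assumed

def open_gnss_ports(serial_0: str, serial_1: str):
    p0 = serial.Serial(serial_0, 115200, timeout=1)  # S51: initialize both ports
    p1 = serial.Serial(serial_1, 115200, timeout=1)
    # S52/S53: NMEA (GPGGA) is expected on serial port 0; if it appears on
    # serial port 1 instead, exchange the port roles (S54) before processing (S55).
    if b"GPGGA" in p0.read(512):
        return p0, p1   # default serial port configuration
    if b"GPGGA" in p1.read(512):
        return p1, p0   # exchanged serial port configuration
    return p0, p1       # fall back to the default configuration (S521)

def read_nmea_sentences(nmea_port):
    # S55: an NMEA frame head is '$' and the frame tail is '\n'.
    buf = b""
    while True:
        buf += nmea_port.read(256)
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            head = line.find(b"$")
            if head >= 0:
                yield line[head:]  # one complete NMEA sentence, e.g. $GPGGA,...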
The flow chart of the GNSS acquisition module is shown in fig. 8:
7. Synchronization module
The function of this module is that subsequent data is forwarded to the subsequent processing module only once the laser radar, the camera and the GNSS device all have output data.
When the data of the laser radar, the camera and the GNSS/INS integrated navigation device have all been sent to the asynchronous thread module, the asynchronous thread module sends all the data to the data synchronization module, which processes the laser radar data, the camera data and the GNSS data separately.
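Below is a condensed, illustrative Python sketch of this gating behaviour, merging the per-stream logic of steps S61-S63. The DataSynchronizer name, the readiness rule and the treatment of IMU/NMEA frames as a single GNSS stream are assumptions:

class DataSynchronizer:
    REQUIRED = {Source.LIDAR, Source.CAMERA, Source.GNSS}

    def __init__(self, forward):
        self.forward = forward    # callable into the subsequent processing module
        self.latest = {}          # newest frame held per source before forwarding starts
        self.start_count = None   # counter value at which forwarding begins

    def on_frame(self, frame, current_count):
        if self.start_count is None:
            # Not forwarding yet: keep only the latest frame of each stream and
            # check whether all other data are ready (S612/S622/SS12).
            self.latest[frame.source] = frame
            if self.REQUIRED <= set(self.latest):
                self.start_count = current_count + 1  # count value at which to start
            return
        # Forwarding has started: pass on every frame stamped at or after the
        # common start count (S613/S623/SS13).
        if frame.pps_count >= self.start_count:
            self.forward(frame)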
The laser radar data processing flow chart is shown in fig. 9:
the camera data processing flow chart is shown in fig. 10:
The GNSS data processing flow chart is shown in fig. 11.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention in any way; any modifications or equivalent variations made according to the technical spirit of the present invention fall within the scope of the present invention as claimed.

Claims (9)

1. A four-legged robot-mounted SLAM device comprising a GNSS/INS integrated navigation device, an RS-LiDAR-32 laser radar, a UI-3140CP camera, a processor, and a quadruped robot, characterized in that: the GNSS/INS integrated navigation device integrates a high-precision GNSS board card and an automotive-grade MEMS IMU and carries an integrated 4G communication module; the GNSS/INS integrated navigation device, the RS-LiDAR-32 laser radar and the UI-3140CP camera are all connected to the processor, which consists of three control boards, namely a core board, an expansion board and a synchronization board. The PPS signal is divided into 1 Hz and 60 Hz signals by the synchronization board; the 1 Hz signal is fed to the RS-LiDAR-32 laser radar and the software pulse-per-second module, and the 60 Hz signal is fed to the camera, so that the laser radar and the camera generate data synchronously. The RS-LiDAR-32 laser radar sends the point cloud data to the laser radar acquisition module through the Ethernet, records the value and frame value of the current pulse-per-second module, and sends them to the asynchronous thread module; the camera sends the camera data to the camera acquisition module through a USB 3.0 interface, records the value and frame value of the current pulse-per-second module, and sends them to the asynchronous thread module; the pulse-per-second module receives the 1 Hz signal sent by the GNSS/INS integrated navigation device through a GPIO port, records the pulse-per-second count value and sends it to the asynchronous thread module; the GNSS/INS integrated navigation device sends the GNSS value, the IMU value and the pulse-per-second value to the GNSS acquisition module through the serial port and on to the asynchronous thread module. All modules run in their own threads without mutual interference. When the data of the laser radar, the camera and the GNSS/INS integrated navigation device have all been sent to the asynchronous thread module, the asynchronous thread module sends all the data to the data synchronization module, and the data synchronization module forwards all subsequent data to the subsequent processing module.
2. A sensor time synchronization method for the four-legged robot-mounted SLAM device of claim 1, characterized in that the method comprises the following steps:
s1, connecting the laser radar, the camera and the GNSS/INS integrated navigation device to the synchronization board, and feeding the PPS signal output by the GNSS/INS integrated navigation device into the synchronization board;
the step S1 includes the following steps:
s11, dividing the PPS signal into 1 Hz and 60 Hz signals on the synchronization board;
s12, feeding the 1 Hz signal to the laser radar and the software pulse-per-second module, and the 60 Hz signal to the camera;
s2, the laser radar sends the point cloud data to the laser radar acquisition module through the Ethernet, records the value and frame value of the current pulse-per-second module, and sends them to the asynchronous thread module;
the step S2 includes the following steps:
s21, registering the laser radar driver error callback function and the laser radar driver data callback function;
s22, initializing and starting the laser radar driver, and processing the laser radar data;
s23, calling the laser radar driver parsing function to decode the data frame, and sending the data frame to the asynchronous thread module;
s3, the camera sends the camera data to the camera acquisition module through a USB 3.0 interface, records the value and frame value of the current pulse-per-second module, and sends them to the asynchronous thread module;
the step S3 includes the following steps:
s31, acquiring the camera list, opening the camera and initializing it;
s32, starting the camera exposure event monitoring thread;
s33, starting the camera frame-transfer-complete event monitoring thread;
s4, the pulse-per-second counting module receives the 1 Hz signal sent by the GNSS/INS integrated navigation device through a GPIO port, records the pulse-per-second count value, and sends it to the asynchronous thread module;
the step S4 includes the following steps:
s41, opening the GPIO port corresponding to the 1PPS signal and initializing it;
s42, polling the GPIO port and reading its state;
s43, adding 1 to the global counter, and continuing to poll the GPIO port state;
s5, the GNSS/INS integrated navigation device sends the GNSS value, the IMU value and the pulse-per-second value to the GNSS acquisition module through the serial port and on to the asynchronous thread module;
the step S5 includes the following steps:
s51, initializing serial port 0 and serial port 1, where serial port 0 and serial port 1 are the interfaces given by the input parameters serial_0 and serial_1 respectively;
s52, reading a section of serial port 0 data, and judging whether the data contains a GPGGA command;
s53, reading a section of serial port 1 data, and judging whether the data contains a GPGGA command;
s54, exchanging the serial port configuration;
s55, processing the data of the two serial ports;
s6, when the data of the laser radar, the camera and the GNSS/INS integrated navigation device have all been sent to the asynchronous thread module, the asynchronous thread module sends all the data to the data synchronization module, and the data synchronization module forwards all subsequent data to the subsequent processing module;
the step S6 includes the following steps:
s61, the synchronization module processes the radar data and forwards it to the subsequent processing module;
s62, the synchronization module processes the camera data and forwards it to the subsequent processing module;
s63, the synchronization module processes the GNSS data and forwards it to the subsequent processing module.
3. The sensor time synchronization method of the four-legged robot-mounted SLAM device according to claim 2, characterized in that the step S32 includes the following steps: acquiring a camera exposure event, creating a new camera event record, initializing its fields, and putting the record into the record queue.
4. The sensor time synchronization method of the four-legged robot-mounted SLAM device according to claim 2, characterized in that the step S33 includes the following steps: acquiring a camera frame-transfer-complete event, acquiring the corresponding camera frame data, searching the record queue for the corresponding camera exposure event record, creating and initializing a camera frame information object, and sending it to the asynchronous thread module.
5. The sensor time synchronization method of the four-legged robot-mounted SLAM device according to claim 2, characterized in that the step S52 includes the following steps:
if the GPGGA command is not found, continue with step S53;
if it is found, perform the following step:
s521, use the default serial port configuration and continue with step S55.
The step S53 includes the following steps:
if the GPGGA command is found, continue with steps S54 and S55;
if it is not found, perform step S521.
6. The sensor time synchronization method of the four-legged robot-mounted SLAM device according to claim 2, characterized in that the step S55 includes the following steps: reading the NMEA data, determining the frame tail according to '\n' and the frame head according to '$', and sending the NMEA data frame to the data synchronization module; reading the IMU data, calling the framing function to frame it, discarding all data frames except RAWIMUS frames, and sending the IMU data to the asynchronous thread module.
7. The sensor time synchronization method of the four-legged robot-mounted SLAM device according to claim 2, characterized in that the step S61 includes the following steps:
s611, while forwarding of the radar data has not yet started, replacing the held radar data with the latest radar data;
s612, judging whether the counter value at which forwarding starts has been determined;
the step S612 includes the following steps:
if yes, continue with S613;
if not, judge whether the other data are ready, and if so, calculate the count value at which forwarding starts;
s613, if the current counter value is greater than or equal to the forwarding-start value, mark the radar data as started forwarding.
8. The sensor time synchronization method of the four-legged robot-mounted SLAM device according to claim 2, characterized in that the step S62 includes the following steps:
s621, while forwarding of the camera data has not yet started, replacing the held camera data with the latest camera data;
s622, judging whether the counter value at which forwarding starts has been determined;
the step S622 includes the following steps:
if yes, continue with S623;
if not, judge whether the other data are ready, and if so, calculate the count value at which forwarding starts;
s623, if the current counter value is greater than or equal to the forwarding-start value, mark the camera data as started forwarding.
9. The sensor time synchronization method of the four-legged robot-mounted SLAM device according to claim 2, characterized in that the step S63 includes the following steps:
s631, judging whether forwarding of the GNSS data has started;
s632, judging whether the data is IMU data.
The step S631 includes the following steps:
if yes, the GNSS data is forwarded to the subsequent processing module;
if not, continue with S632.
The step S632 includes the following steps:
if the data is IMU data:
ss11, replace the held data with the latest IMU data;
ss12, judge whether the counter value at which forwarding starts has been determined;
ss13, if the current counter values of both the IMU and NMEA data are greater than or equal to the forwarding-start value, mark the GNSS data as started forwarding.
The step SS12 includes the following steps:
if yes, continue with SS13;
if not, judge whether the other data are ready, and if so, calculate the count value at which forwarding starts.
If the data is not IMU data:
ss21, replace the held data with the latest NMEA data;
ss22, judge whether the counter value at which forwarding starts has been determined;
ss23, if the current counter values of both the IMU and NMEA data are greater than or equal to the forwarding-start value, mark the GNSS data as started forwarding.
The step SS22 includes the following steps:
if yes, continue with SS23;
if not, judge whether the other data are ready, and if so, calculate the count value at which forwarding starts.
CN202210142484.8A (filed 2022-02-16, priority 2022-02-16): Four-legged robot-mounted SLAM device and sensor time synchronization method. Status: Pending. Publication: CN114499733A.

Priority Applications (1)

CN202210142484.8A (publication CN114499733A), priority date 2022-02-16, filing date 2022-02-16: Four-legged robot-mounted SLAM device and sensor time synchronization method

Applications Claiming Priority (1)

CN202210142484.8A (publication CN114499733A), priority date 2022-02-16, filing date 2022-02-16: Four-legged robot-mounted SLAM device and sensor time synchronization method

Publications (1)

Publication number: CN114499733A, publication date: 2022-05-13

Family ID: 81480273

Family Applications (1)

CN202210142484.8A (publication CN114499733A, pending), priority date 2022-02-16, filing date 2022-02-16

Country Status (1)

CN: CN114499733A

Patent Citations (3)

* Cited by examiner, † Cited by third party
CN112362051A *, priority 2020-10-16, published 2021-02-12: GNSS/INS/LIDAR-SLAM information fusion-based mobile robot navigation positioning system
CN112787740A *, priority 2020-12-26, published 2021-05-11: Multi-sensor time synchronization device and method
CN113922910A *, priority 2021-10-09, published 2022-01-11: Sensor time synchronization processing method, device and system

Cited By (2)

* Cited by examiner, † Cited by third party
CN114754769A *, priority 2022-06-15, published 2022-07-15, 天津大学四川创新研究院: Data synchronization time service system and method for laser radar and inertial sensor
CN114754769B *, priority 2022-06-15, published 2022-11-18, 天津大学四川创新研究院: Data synchronization time service system and method for laser radar and inertial sensor

Similar Documents

Publication / Title
CN109211298B (en) Sensor calibration method and device
US11704833B2 (en) Monocular vision tracking method, apparatus and non-transitory computer-readable storage medium
CN110329273B (en) Method and device for synchronizing data acquired by unmanned vehicle
CN110178048B (en) Method and system for generating and updating vehicle environment map
CN108776474B (en) Robot embedded computing terminal integrating high-precision navigation positioning and deep learning
US20230060096A1 (en) Sensor module, electronic apparatus, and vehicle
CN103189717A (en) Indoor positioning using pressure sensors
CN112945228B (en) Multi-sensor time synchronization method and synchronization device
WO2020118545A1 (en) Time-aware occupancy grid mapping for robots in dynamic environments
CN106197410A (en) For the method and apparatus accurately capturing inertial sensor data
CN108988974B (en) Time delay measuring method and device and system for time synchronization of electronic equipment
CN110044377B (en) Vicon-based IMU offline calibration method
CN114926378B (en) Method, system, device and computer storage medium for sound source tracking
CN110398258A (en) A kind of performance testing device and method of inertial navigation system
CN114499733A (en) Four-legged robot-mounted SLAM device and sensor time synchronization method
CN112947384B (en) Multifunctional satellite simulation test system
CN108871317B (en) High-precision star sensor information processing system
CN111443370B (en) Vehicle positioning method, device and equipment and vehicle
EP3955030A1 (en) Apparatus for precise positioning, mapping or object detection
CN114563017B (en) Navigation performance test system and method for strapdown inertial navigation device
CN115685275A (en) POS high-precision shielding-avoiding positioning method
WO2022256976A1 (en) Method and system for constructing dense point cloud truth value data and electronic device
CN112947383B (en) Satellite simulation test system for data stream multi-directional transmission
US20220341751A1 (en) Systems and methods for multi-sensor mapping using a single device that can operate in multiple modes
CN109669192B (en) Use method of multi-station distance and direction measuring instrument in underwater acoustic test

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination