CN209928281U - Automatic pilot - Google Patents

Automatic pilot

Info

Publication number
CN209928281U
CN209928281U CN201921238135.6U
Authority
CN
China
Prior art keywords
module
camera
autopilot
electrically connected
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201921238135.6U
Other languages
Chinese (zh)
Inventor
张亮
鄢胜超
周家龙
夏凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Smart Mapping Tech Co Ltd
Original Assignee
Shenzhen Smart Mapping Tech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Smart Mapping Tech Co Ltd filed Critical Shenzhen Smart Mapping Tech Co Ltd
Priority to CN201921238135.6U priority Critical patent/CN209928281U/en
Application granted granted Critical
Publication of CN209928281U publication Critical patent/CN209928281U/en

Landscapes

  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiment of the utility model discloses an autopilot comprising a GNSS module, an IMU module, a lidar, cameras, an FPGA module, a time reference module, an inertial navigation unit, a dead reckoning unit and a processor, wherein the cameras include an external trigger camera and a binocular camera. The FPGA module is electrically connected with the GNSS module, the IMU module, the lidar, the external trigger camera, the time reference module, the inertial navigation unit and the dead reckoning unit; the processor is electrically connected with the GNSS module, the binocular camera, the external trigger camera and the lidar. By using the FPGA module to process and fuse the data of each sensor and then transmitting the result to the processor in real time for autonomous-driving decision-making and control, the embodiment solves the problems that sensor data are difficult to fuse, sensor functions are isolated, and high-precision positioning is difficult to achieve in arbitrary scenes, thereby improving the safety, reliability and availability of unmanned driving.

Description

Automatic pilot
Technical Field
The utility model relates to the technical field of unmanned driving, and in particular to an autopilot.
Background
The autopilot is the brain of the unmanned vehicle: all actions of the vehicle are controlled and completed by the autopilot. The autopilot receives and processes the data collected by the sensors to form the vehicle's control commands, including steering wheel angle, accelerator opening, braking and the like. In existing unmanned systems, a differential global navigation satellite system is often used for vehicle self-positioning; a high-precision inertial measurement unit is used for measuring and controlling the vehicle's attitude; and lidars, cameras, millimeter-wave radars and the like are used for sensing environmental targets and detecting obstacles. These data are sent to the processing unit through various interfaces such as network ports, serial ports and CAN. The reliability of the autopilot depends to a great extent on the accuracy of data fusion, of vehicle positioning and attitude determination, and of perception.
Current unmanned systems have several disadvantages that affect the safety, reliability and availability of unmanned vehicles:
(1) the time references of the different sensors are not unified, making effective sensor data fusion difficult;
(2) the sensors operate independently and do not complement one another effectively;
(3) because GNSS signals are easily interfered with, high-precision positioning of the vehicle in arbitrary scenes is difficult to achieve.
SUMMARY OF THE UTILITY MODEL
The technical problem to be solved by the embodiments of the utility model is to provide an autopilot, so as to improve the safety, reliability and availability of unmanned driving.
To solve this technical problem, an embodiment of the utility model provides an autopilot applied in an unmanned vehicle driving system, comprising a GNSS module, an IMU module, a lidar, cameras, an FPGA module, a time reference module, an inertial navigation unit, a dead reckoning unit and a processor, wherein the cameras include an external trigger camera and a binocular camera; the FPGA module is electrically connected with the GNSS module, the IMU module, the lidar, the external trigger camera, the time reference module, the inertial navigation unit and the dead reckoning unit; and the processor is electrically connected with the GNSS module, the binocular camera, the external trigger camera and the lidar.
Further, the autopilot comprises an odometer electrically connected with the FPGA module.
Further, the autopilot comprises a gigabit network interface electrically connected with the processor.
Further, the autopilot comprises a USB 3.0 interface electrically connected with the processor.
Further, the autopilot comprises an encoder electrically connected with the FPGA module.
Further, the autopilot comprises a TF card module electrically connected with the processor.
Further, the autopilot comprises an mSATA interface electrically connected with the processor.
Further, the autopilot comprises an HDMI interface electrically connected with the processor.
The embodiment of the utility model provides an autopilot comprising a GNSS module, an IMU module, a lidar, cameras, an FPGA module, a time reference module, an inertial navigation unit, a dead reckoning unit and a processor. The FPGA module processes and fuses the data of each sensor and transmits the result to the processor in real time for autonomous-driving decision-making and control. This solves the problems that sensor data are difficult to fuse, sensor functions are isolated, and high-precision positioning is difficult to achieve in arbitrary scenes, thereby improving the safety, reliability and availability of unmanned driving.
Drawings
Fig. 1 is a schematic structural diagram of an autopilot according to an embodiment of the present invention.
Fig. 2 is a circuit diagram of a portion of the time reference module and the FPGA module according to an embodiment of the present invention.
Fig. 3 is a circuit diagram of a power module according to an embodiment of the present invention.
Fig. 4 is another circuit diagram of a part of the FPGA module according to an embodiment of the present invention.
Fig. 5 is a circuit diagram of the USB3.0 interface according to an embodiment of the present invention.
Fig. 6 is a circuit diagram of a GNSS module and an inertial navigation unit according to an embodiment of the present invention.
Fig. 7 is a circuit diagram of a camera and an IMU module according to an embodiment of the present invention.
Fig. 8 is a circuit diagram of a processor according to an embodiment of the invention.
Fig. 9 is a circuit diagram of an encoder according to an embodiment of the present invention.
Fig. 10 is a schematic block diagram of an autopilot according to an embodiment of the invention.
Detailed Description
It should be noted that, in the present application, the embodiments and features of the embodiments may be combined with each other without conflict, and the present invention is further described in detail with reference to the accompanying drawings and specific embodiments.
In the embodiments of the present utility model, directional indications (such as up, down, left, right, front and rear) are used only to explain the relative positional relationship and motion of components in a particular posture (as shown in the drawings); if that posture changes, the directional indication changes accordingly.
In addition, the descriptions "first", "second", etc. in the present utility model are for descriptive purposes only and are not to be construed as indicating or implying relative importance or the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature.
Referring to fig. 1 to 10, an autopilot according to an embodiment of the present invention mainly includes a GNSS module, an IMU module, a lidar, a camera, an FPGA module, a time reference module, an inertial navigation unit, a dead reckoning unit, and a processor.
The autopilot is applied in an unmanned vehicle driving system and controls the steering, accelerator, brake and other systems of the unmanned vehicle. The cameras include an external trigger camera and a binocular camera. The FPGA module is electrically connected with the GNSS module, the IMU module (using an ADIS16490 chip), the lidar (a 16-line lidar), the external trigger camera, the time reference module, the inertial navigation unit and the dead reckoning unit. Preferably, the camera is a CCD camera, and the inertial navigation unit is a high-precision MEMS inertial navigation unit. The processor is electrically connected with the GNSS module, the binocular camera, the external trigger camera and the lidar, and uses an NVIDIA Jetson TX2 platform as its core processor.
In the embodiment of the utility model, each sensor's data is first sent to the FPGA module. Exploiting the FPGA's low latency and high concurrency, an accurate timestamp can be attached to every type of sensor data inside the FPGA module, achieving microsecond-level time synchronization. After the sensors are time-synchronized, the data of the GNSS module, the IMU module, the lidar, the camera and the odometer are fused in the FPGA module and the vehicle's coordinates are output, realizing high-precision vehicle positioning in arbitrary scenes. In addition, the lidar and camera data are fused in the FPGA module so that the point cloud output by the lidar carries color information and the image data carries depth information, which enhances the accuracy of target classification and extraction.
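The timestamp-then-match idea described above can be sketched in software (an illustrative Python sketch only; the patent discloses a hardware design, and every name below is hypothetical):

```python
import bisect

class TimestampAligner:
    """Toy model of the FPGA-side idea: stamp every sensor packet
    against one shared microsecond clock, then match heterogeneous
    streams by time rather than by arrival order."""

    def __init__(self):
        # sensor name -> list of (timestamp_us, payload), appended in time order
        self.streams = {}

    def stamp(self, sensor, t_us, payload):
        self.streams.setdefault(sensor, []).append((t_us, payload))

    def nearest(self, sensor, t_us):
        """Return the packet from `sensor` whose timestamp is closest to t_us."""
        packets = self.streams[sensor]
        times = [t for t, _ in packets]
        i = bisect.bisect_left(times, t_us)
        candidates = packets[max(0, i - 1):i + 1]
        return min(candidates, key=lambda p: abs(p[0] - t_us))

aligner = TimestampAligner()
aligner.stamp("imu", 1_000_000, "imu_sample_a")
aligner.stamp("imu", 1_000_500, "imu_sample_b")
aligner.stamp("lidar", 1_000_400, "scan_1")
# the IMU sample nearest in time to the lidar scan
t_match, payload = aligner.nearest("imu", 1_000_400)
```

Once every stream carries timestamps from the same clock, fusing heterogeneous sensors reduces to nearest-in-time (or interpolated) matching, which is why the unified time reference matters.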
After the sensor data is synchronized by the FPGA module, all data carry accurate timestamps, which lays the foundation for fusing heterogeneous multi-sensor data and improves the precision and reliability of the fusion. The embodiment uses multi-sensor data fusion to achieve high-precision coordinate measurement of the vehicle in complex scenes, improving the reliability of unmanned driving. Fusing the lidar data with the camera's image data greatly increases the information content of the data, which helps improve the accuracy, robustness and reliability of environmental target detection and classification.
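The lidar-camera fusion mentioned above — giving lidar points color and image pixels depth — amounts to projecting each 3-D point through the camera model and sampling the pixel it lands on. A minimal sketch under simplifying assumptions (identity extrinsics, pinhole intrinsics `K`; all values are illustrative, not from the patent):

```python
import numpy as np

def colorize_points(points_xyz, image, K):
    """Project 3-D lidar points through pinhole intrinsics K and attach
    the RGB value of the pixel each point lands on (extrinsics assumed
    identity, i.e. points already in the camera frame)."""
    colored = []
    h, w = image.shape[:2]
    for x, y, z in points_xyz:
        if z <= 0:                      # point behind the camera plane
            continue
        u = int(K[0, 0] * x / z + K[0, 2])   # column in the image
        v = int(K[1, 1] * y / z + K[1, 2])   # row in the image
        if 0 <= u < w and 0 <= v < h:
            colored.append((x, y, z, *(int(c) for c in image[v, u])))
    return colored

K = np.array([[100.0, 0.0, 32.0],
              [0.0, 100.0, 24.0],
              [0.0, 0.0, 1.0]])
img = np.zeros((48, 64, 3), dtype=np.uint8)
img[24, 42] = (255, 0, 0)               # one red pixel
pts = [(0.5, 0.0, 5.0)]                 # projects onto that pixel
result = colorize_points(pts, img, K)
```

The reverse direction (attaching each pixel's depth) uses the same projection: every pixel hit by a point inherits that point's range.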
As an embodiment, the autopilot further includes an odometer electrically connected to the FPGA module.
As one embodiment, the autopilot further includes a gigabit interface electrically connected to the processor.
As an embodiment, the autopilot further includes a USB3.0 interface electrically connected to the processor.
As an embodiment, the autopilot further includes an encoder electrically connected to the FPGA module.
As an embodiment, the autopilot further includes a TF card module electrically connected to the processor.
As one embodiment, the autopilot further includes an MSATA interface electrically connected to the processor. The MSATA interface is used for externally connecting an MSATA hard disk.
As one embodiment, the autopilot further includes an HDMI interface electrically connected to the processor.
As one embodiment, the autopilot further includes a power module electrically connected to the processor.
Each of these peripheral interfaces is used to extend the functions of the autopilot.
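The odometer and dead reckoning unit above address the GNSS-denied case: integrating wheel-travel increments and heading changes keeps a position estimate alive when satellite signals are jammed. A minimal illustrative sketch of that recursion (the patent does not disclose its exact algorithm):

```python
import math

def dead_reckon(x, y, heading_rad, steps):
    """Propagate a 2-D pose from odometer increments: each step is a
    (distance_travelled, heading_change) pair; the heading change is
    applied before the displacement."""
    for d, dtheta in steps:
        heading_rad += dtheta
        x += d * math.cos(heading_rad)
        y += d * math.sin(heading_rad)
    return x, y, heading_rad

# drive 10 m along +x, turn 90 degrees left, drive 5 m along +y
x, y, h = dead_reckon(0.0, 0.0, 0.0, [(10.0, 0.0), (5.0, math.pi / 2)])
```

In practice the heading change would come from the IMU or inertial navigation unit and the distance from the odometer/encoder, with the fused GNSS fix periodically resetting the accumulated drift.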
The working principle of the embodiment of the utility model is as follows: the autopilot is powered on and started, and its system clock is set through the GNSS module or the time reference module; the FPGA module sends NMEA and PPS signals to the lidar, sends synchronous trigger signals to the external trigger camera, and receives data from sensors such as the inertial navigation unit, the encoder, the lidar and the cameras; the FPGA module processes and fuses the data and then transmits it to the processor in real time; the processor completes the data fusion and makes the autonomous-driving decisions and control, finally realizing high-precision positioning and accurate automatic control of unmanned driving.
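The NMEA-plus-PPS pairing works because the lidar latches its internal clock on the PPS edge and reads the corresponding absolute time from the accompanying NMEA sentence. A sketch of NMEA-0183 framing with its XOR checksum (the GPRMC body below is the textbook example sentence, not real fix data):

```python
def nmea_sentence(body):
    """Frame an NMEA-0183 sentence: '$', the body, '*', then the XOR of
    all body characters as two uppercase hex digits."""
    checksum = 0
    for ch in body:
        checksum ^= ord(ch)
    return "${}*{:02X}".format(body, checksum)

sentence = nmea_sentence(
    "GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W")
```

The FPGA emits one such sentence per second alongside the PPS pulse, so every lidar packet can be tagged with an absolute time consistent with the other sensors.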
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (8)

1. An autopilot applied in an unmanned vehicle driving system, characterized in that it comprises a GNSS module, an IMU module, a lidar, cameras, an FPGA module, a time reference module, an inertial navigation unit, a dead reckoning unit and a processor, wherein the cameras include an external trigger camera and a binocular camera; the FPGA module is electrically connected with the GNSS module, the IMU module, the lidar, the external trigger camera, the time reference module, the inertial navigation unit and the dead reckoning unit; and the processor is electrically connected with the GNSS module, the binocular camera, the external trigger camera and the lidar.
2. The autopilot of claim 1 further comprising an odometer electrically connected to the FPGA module.
3. The autopilot of claim 1 further comprising a gigabit interface electrically connected to the processor.
4. The autopilot of claim 1 further comprising a USB3.0 interface electrically connected to the processor.
5. The autopilot of claim 1 further comprising an encoder electrically connected to the FPGA module.
6. The autopilot of claim 1 further comprising a TF card module electrically connected to the processor.
7. The autopilot of claim 1 further comprising an MSATA interface electrically connected to the processor.
8. The autopilot of claim 1 further comprising an HDMI interface electrically connected to the processor.
CN201921238135.6U 2019-08-02 2019-08-02 Automatic pilot Active CN209928281U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201921238135.6U CN209928281U (en) 2019-08-02 2019-08-02 Automatic pilot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201921238135.6U CN209928281U (en) 2019-08-02 2019-08-02 Automatic pilot

Publications (1)

Publication Number Publication Date
CN209928281U true CN209928281U (en) 2020-01-10

Family

ID=69094183

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201921238135.6U Active CN209928281U (en) 2019-08-02 2019-08-02 Automatic pilot

Country Status (1)

Country Link
CN (1) CN209928281U (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111343379A (en) * 2020-02-25 2020-06-26 South China University of Technology FPGA-based high-speed data acquisition device and method
CN111556226A (en) * 2020-07-13 2020-08-18 Shenzhen Smart Mapping Tech Co Ltd Camera system
CN114526725A (en) * 2022-02-21 2022-05-24 Shandong New Generation Information Industry Technology Research Institute Co Ltd Super-fusion navigation system based on system-on-chip
CN114526725B (en) * 2022-02-21 2023-11-24 Shandong New Generation Information Industry Technology Research Institute Co Ltd Super-fusion navigation system based on system-in-chip

Similar Documents

Publication Publication Date Title
CN209928281U (en) Automatic pilot
JP7157054B2 (en) Vehicle navigation based on aligned images and LIDAR information
CN111192295B (en) Target detection and tracking method, apparatus, and computer-readable storage medium
EP3792660B1 (en) Method, apparatus and system for measuring distance
CN107817488B (en) Unmanned aerial vehicle obstacle avoidance device and method based on millimeter wave radar and vision fusion
US8510027B2 (en) Method for judging vehicle traveling position and vehicle traveling position judgment device
CN111223315B (en) Traffic guidance object recognition device, traffic guidance object recognition method, and storage medium
JP2021516401A (en) Data fusion method and related equipment
CN109215083A (en) The method and apparatus of the calibrating external parameters of onboard sensor
CN109583416B (en) Pseudo lane line identification method and system
CN109747530A (en) A kind of dual camera and millimeter wave merge automobile sensory perceptual system
CN112208529B (en) Perception system for object detection, driving assistance method, and unmanned device
CN111508276B (en) High-precision map-based V2X reverse overtaking early warning method, system and medium
US20230047404A1 (en) Driver assistance system and method
CN110620632A (en) Time synchronization method and device
CN112041767B (en) System and method for synchronizing vehicle sensors and devices
KR20150078881A (en) Method for measureling position of vehicle using cloud computing
CN112884892A (en) Unmanned mine car position information processing system and method based on road side device
CN210518410U (en) Automobile sensor system based on time synchronization and automatic driving vehicle
CN111731304B (en) Vehicle control device, vehicle control method, and storage medium
CN113008237A (en) Path planning method and device and aircraft
CN210038170U (en) Tightly-coupled automatic driving sensing system
CN112666954B (en) Intelligent driving device, intelligent driving method, intelligent driving system and driving device
CN111753901B (en) Data fusion method, device, system and computer equipment
CN112083412B (en) Fusion method of millimeter wave radar and C-V2X system, system and electronic equipment thereof

Legal Events

Date Code Title Description
GR01 Patent grant