CN208224794U - Robot terminal device - Google Patents

Robot terminal device

Info

Publication number
CN208224794U
Authority
CN
China
Prior art keywords
chip
jetson
module
terminal device
navigation module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201820790147.9U
Other languages
Chinese (zh)
Inventor
郭迟
董巍
张艺芬
陈梁
代永红
崔竞松
郭文飞
左文炜
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Jingtian Electrical Co ltd
Zhongshan Saibotan Intelligent Technology Co ltd
Wuhan University WHU
Original Assignee
Wuhan Jingtian Electrical Co ltd
Zhongshan Saibotan Intelligent Technology Co ltd
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Jingtian Electrical Co ltd, Zhongshan Saibotan Intelligent Technology Co ltd, Wuhan University WHU filed Critical Wuhan Jingtian Electrical Co ltd
Priority to CN201820790147.9U priority Critical patent/CN208224794U/en
Application granted granted Critical
Publication of CN208224794U publication Critical patent/CN208224794U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Navigation (AREA)

Abstract

The utility model provides a robot terminal device comprising a core processing unit Jetson chip, an FPGA chip, an embedded ARM module, and several sensors, all arranged as one integrated unit. The sensors comprise a GNSS satellite navigation module, an IMU inertial navigation module, a lidar, and a camera. The core processing unit Jetson chip is connected to the camera and the lidar, while the FPGA chip is separately connected to the GNSS satellite navigation module, the IMU inertial navigation module, and the embedded ARM module; the embedded ARM module in turn is connected to the servo motors that control the robot. The device is highly integrated and offers a rich set of interfaces, providing a hardware basis for multi-sensor information fusion and deep learning. It is small and low-cost, giving it a strong market advantage in line with the current trend toward miniaturized robot products.

Description

Robot terminal device
Technical field
The utility model provides a robot terminal device and belongs to the field of robotic equipment.
Background art
The acquisition and processing of multi-source sensor data is key to a robot's autonomous perception, precise control, and mission planning. Because embedded platforms are compact, low-power, and highly integrated, they have become the mainstream development platform for robotics. However, a typical robot today still requires three industrial PCs, for positioning, driving, and core computation, just to complete basic tasks. Existing embedded robot platforms therefore generally suffer from insufficient computing power, low hardware integration, a lack of deep-learning platform support, and a lack of environmental intelligence sensing capability, and are in urgent need of improvement at the hardware level.
Utility model content
The utility model mainly provides a robot terminal device with a high degree of integration, supplying a hardware platform basis for collaborative precision positioning and intelligent sensing in large-scale environments.
The technical solution adopted by the utility model is a robot terminal device comprising an integrally arranged core processing unit Jetson chip, an FPGA chip, an embedded ARM module, and several sensors. The sensors include a GNSS satellite navigation module, an IMU inertial navigation module, a lidar, and a camera. The core processing unit Jetson chip is connected to the camera and the lidar; the FPGA chip is separately connected to the GNSS satellite navigation module, the IMU inertial navigation module, and the embedded ARM module; and the embedded ARM module is connected to the servo motors that control the robot.
Further, the GNSS satellite navigation module uses a Sinan K505 chip.
Further, the core processing unit Jetson chip is a Jetson TX1, which connects to the lidar and the camera through USB interfaces.
Further, the core processing unit Jetson chip is fitted with a cooling fan.
Further, the FPGA chip is an Altera Cyclone III.
Further, the IMU inertial navigation module uses an ADIS16460 chip.
Further, the embedded ARM module is fitted with an LCD module.
Compared with the prior art, the device is highly integrated and offers a rich set of interfaces. Through a highly integrated embedded platform it provides a hardware foundation for multi-sensor information fusion and deep learning, so that the equivalent of a single industrial computer can offer the user high-precision, real-time, intelligent functions such as adaptive dynamic navigation and positioning, path planning, obstacle avoidance, and intelligent environment perception. The device is small and low-cost, giving it a strong market advantage in line with the current trend toward miniaturized robot products.
Brief description of the drawings
Fig. 1 is a block diagram of the device according to an embodiment of the utility model.
Specific embodiment
To help those of ordinary skill in the art understand and implement the utility model, it is described in further detail below with reference to the drawings and an embodiment. It should be understood that the embodiment described here serves only to describe and explain the utility model and does not limit it.
The utility model is a highly integrated, high-precision robot device that provides a variety of interfaces for attaching sensor equipment, realizing access to multi-source sensor data. The overall design supports the acquisition, processing, and output of multi-component signals, delivers continuous high-precision multi-sensor measurements, and supports fused adaptive positioning and deep-learning capability.
Referring to Fig. 1, the robot terminal device provided by the embodiment of the utility model integrates a core processing unit Jetson chip, an FPGA chip, a high-precision GNSS satellite navigation module, a six-axis IMU inertial navigation module, and an STM32 embedded ARM module, and can also carry other sensors such as a camera and a lidar. The core processing unit Jetson chip is connected to the camera and the lidar; the FPGA chip is connected to the GNSS satellite navigation module, the IMU inertial navigation module, and the embedded ARM module; and the embedded ARM module is connected to the servo motors that control the robot.
When the device of the embodiment is in use, the FPGA provides the system clock and performs clock synchronization, while the observations of the GNSS satellite navigation module and the inertial sensing module are fed to the ARM cores of the core processing unit Jetson chip, supporting real-time high-precision positioning solutions. The serial and USB interfaces of the core processing unit Jetson chip connect sensors such as the camera and the lidar, supporting the robot's environment perception and mission planning.
The modules of the embodiment are described below:
The high-precision GNSS satellite navigation module uses a Sinan K505 chip. It supports the RTK positioning function and therefore offers high-precision BeiDou positioning. The module outputs raw observations and supports differential positioning, receiving both satellite data and ground-based augmentation system data, thereby providing high-precision GNSS positioning for the robot.
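As an illustration of the kind of data such a receiver delivers, the sketch below parses an NMEA 0183 GGA sentence, a common GNSS output format. The patent does not specify the K505's actual message format, so the sentence and field choices here are illustrative assumptions only.

```python
def parse_gga(sentence: str) -> dict:
    """Parse an NMEA 0183 GGA sentence into latitude, longitude and fix info."""
    fields = sentence.split("*")[0].split(",")
    if not fields[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")

    def dm_to_deg(dm: str, hemi: str) -> float:
        # NMEA packs angles as ddmm.mmmm (degrees + minutes); S/W are negative.
        dot = dm.index(".")
        degrees = float(dm[:dot - 2])
        minutes = float(dm[dot - 2:])
        value = degrees + minutes / 60.0
        return -value if hemi in ("S", "W") else value

    return {
        "time": fields[1],
        "lat": dm_to_deg(fields[2], fields[3]),
        "lon": dm_to_deg(fields[4], fields[5]),
        "fix_quality": int(fields[6]),  # 4 = RTK fixed, 5 = RTK float
        "num_sats": int(fields[7]),
    }

# Hypothetical sentence for a position near Wuhan (not real receiver output).
fix = parse_gga("$GNGGA,060000.00,3031.5000,N,11421.0000,E,4,18,0.7,40.0,M,-14.0,M,1.0,0000*5B")
```

A `fix_quality` of 4 marks an RTK fixed solution, the centimetre-level mode that the module's RTK function targets.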
The core processing unit is a Jetson TX1. It contains a quad-core ARM Cortex-A57 processor module and a 256-core low-power GPGPU acceleration module. The GPGPU, a general-purpose computing graphics processor, is designed for graphics computation and optimized for floating-point operations, which makes it suitable for compute-intensive, highly parallel workloads, chiefly image processing and deep learning. The ARM processor has strong processing capability, handles ordinary computation, and serves here as the main processor. The core processing unit therefore has the computing power to meet real-time deep-learning demands, enabling intelligent environment perception and mission planning: when the utility model is in operation, it can run deep-learning-based target identification and detection algorithms. At the same time, the Jetson's many built-in external interfaces, such as its USB ports, allow sensors such as the lidar and camera to be attached for intelligent environment perception, supporting the construction of semantic maps with accurate range information and, on that basis, mission planning. Its peripherals further include a network interface, an HDMI high-definition multimedia interface, a SATA hard-disk interface, and spare USB ports, and a fan is mounted to cool the system. In a concrete implementation, an SD card can also be fitted as needed.
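The real-time identification loop such a processor would run can be sketched as follows. The `model` callable stands in for a trained deep network (which the patent does not name), and all identifiers are illustrative.

```python
def detect_stream(frames, model, threshold=0.5):
    """Run a detector over (frame_id, image) pairs, keeping confident hits.

    `model` maps an image to a list of (label, confidence, bbox) tuples;
    here it stands in for a trained network running on the GPU.
    """
    results = []
    for frame_id, image in frames:
        detections = model(image)
        # Discard low-confidence detections before they reach the planner.
        kept = [d for d in detections if d[1] >= threshold]
        results.append((frame_id, kept))
    return results

# Stub "network" used purely to exercise the loop.
stub_model = lambda image: [("person", 0.9, (0, 0, 10, 10)),
                            ("clutter", 0.2, (1, 1, 2, 2))]
out = detect_stream([(0, "frame0")], stub_model)
```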
The FPGA is an Altera Cyclone III. It both provides the system clock and performs clock synchronization, so it can read the measurement data from the GNSS satellite navigation module chip and the IMU inertial navigation module chip and support system task scheduling. In a concrete implementation, the FPGA chiefly provides the system clock and clock synchronization, uses timed triggers to read the measurements output by the GNSS and the IMU, and forwards them. Through this task-scheduling approach, the user avoids the inflexibility of custom circuits and the limited gate count of programmable devices, and can base a variety of robot application developments on it.
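A software analogue of this synchronization role, pairing each GNSS epoch with the nearest-in-time IMU sample on a shared clock, might look like the sketch below; the data shapes are assumptions, since the patent describes the mechanism only at the hardware level.

```python
from bisect import bisect_left

def align_to_gnss(gnss_epochs, imu_samples):
    """Pair each GNSS epoch with the IMU sample closest to it in time.

    Both streams are assumed to carry timestamps from one shared clock,
    which in the device is the FPGA's job.
    """
    imu_times = [t for t, _ in imu_samples]
    pairs = []
    for t_gnss, obs in gnss_epochs:
        i = bisect_left(imu_times, t_gnss)
        # Consider the neighbours on both sides of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(imu_times)]
        j = min(candidates, key=lambda k: abs(imu_times[k] - t_gnss))
        pairs.append((t_gnss, obs, imu_samples[j][1]))
    return pairs

# A 100 Hz IMU stream and two roughly 1 Hz GNSS epochs (synthetic data).
imu = [(k * 0.01, k) for k in range(300)]
pairs = align_to_gnss([(1.004, "epoch1"), (2.003, "epoch2")], imu)
```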
The IMU inertial navigation module chip is an ADIS16460. It chiefly provides six-axis inertial measurement data, supplying odometry information for the robot and supporting combined navigation solutions with the GNSS observations. The GNSS observation data and the IMU measurement data, once read and forwarded by the FPGA, are fed into the STM32 embedded ARM module, supporting the combined-navigation pose solution.
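The combined-navigation solution itself is not specified in the patent; as a minimal stand-in, the sketch below shows one step of a 1-D loosely coupled filter in which IMU dead reckoning is corrected by a GNSS position using a fixed blending gain (a real integrated-navigation filter would compute this gain, e.g. as a Kalman gain).

```python
def fuse(pos, vel, accel, dt, gnss_pos=None, gain=0.2):
    """One step of a minimal 1-D loosely coupled GNSS/IMU position filter.

    Dead-reckons with the IMU acceleration every step; when a GNSS fix
    arrives, blends it in with a fixed gain (illustrative stand-in for
    the Kalman gain of a real integrated-navigation filter).
    """
    vel = vel + accel * dt      # propagate velocity from the IMU
    pos = pos + vel * dt        # propagate position
    if gnss_pos is not None:    # apply the GNSS correction when available
        pos = pos + gain * (gnss_pos - pos)
    return pos, vel

# Dead-reckon for 1 s under constant acceleration, then take one GNSS fix.
pos, vel = 0.0, 0.0
for _ in range(10):
    pos, vel = fuse(pos, vel, accel=1.0, dt=0.1)
pos, vel = fuse(pos, vel, accel=0.0, dt=0.1, gnss_pos=1.0)
```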
The embedded ARM module is an STM32, whose CPU uses the 32-bit ARM Cortex architecture. Although its performance falls short of the main processor, its power consumption is lower, making it suitable as a coprocessor. The chip chiefly manages and schedules on-board output and access to the various peripherals: it can compute the combined-navigation position solution and can also be connected directly to control servo motors, supporting multi-channel serial servo control, including the low-level drive of the robot and the control of a manipulator arm. In a concrete implementation, an LCD display module, a control module, a COM interface, and a Mini USB interface can also be attached.
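Multi-channel serial servo control implies one command frame per channel; the byte layout below (header, channel byte, 16-bit position, inverted-sum checksum) is entirely hypothetical, shown only to illustrate the idea, since the patent does not define a wire protocol.

```python
def servo_frame(channel: int, position: int) -> bytes:
    """Build one command frame for a servo channel on a serial bus.

    Hypothetical layout: 0x55 0xAA header, channel byte, 16-bit
    big-endian position, inverted-sum checksum over the payload.
    """
    if not (0 <= channel <= 15 and 0 <= position <= 0xFFFF):
        raise ValueError("channel must be 0-15, position 0-65535")
    payload = bytes([channel, position >> 8, position & 0xFF])
    checksum = (~sum(payload)) & 0xFF  # receiver recomputes and compares
    return b"\x55\xaa" + payload + bytes([checksum])

frame = servo_frame(3, 0x1234)  # command channel 3 to position 0x1234
```

On the device, the coprocessor would write such frames to a UART; the checksum lets the drive reject frames corrupted in transit.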
The purpose of the utility model is to provide a highly integrated robot terminal device that gives embedded robot system development a highly integrated, high-performance, convenient platform, so as to realize multi-sensor data fusion in the robot, support high-precision real-time positioning by various means, and support path planning; through deep learning, environment perception can be further realized, enabling obstacle avoidance and semantic-level tasks.
The utility model proposes only the hardware design for protection; it is a product to be put on the market. In a concrete implementation, the user can configure the terminal device's mode of operation as needed, for example:
Step 1: start the robot terminal device provided by the utility model. The device not only integrates numerous components, including the embedded ARM module, the GNSS satellite navigation module, the FPGA, the core processing module Jetson TX1, and the IMU inertial navigation module, but also provides a variety of interfaces for attaching sensors, along with a 4G module for high-speed network data transmission, and can carry sensors such as a camera and a lidar.
Step 2: based on the integrated high-precision GNSS satellite navigation module and IMU inertial navigation module, provide continuous, real-time, high-precision outdoor global positioning and path planning.
Step 3: combine sensors such as the IMU inertial navigation module, the lidar, and the camera to realize real-time positioning and path planning under indoor conditions.
Step 4: realize intelligent environment perception through sensors such as the lidar and the camera, building a semantic map with accurate range information from the camera and the lidar. This is implemented as follows:
1. Deep learning is applied on the core processing module Jetson TX1 to the images acquired by the monocular camera, realizing real-time target identification and semantic segmentation of the camera images;
2. The environmental information obtained by lidar scanning is matched against the semantic targets in the images, yielding a semantic map with accurate range information.
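The matching step can be sketched by attaching a per-column semantic label from the segmented image to each lidar return inside the camera's field of view. The linear angle-to-column mapping and all names below are simplifying assumptions, not the patent's method.

```python
def label_scan(scan, seg_columns, fov_deg=90.0, img_width=640):
    """Attach a semantic label to each 2-D lidar return in the camera view.

    `scan` holds (angle_deg, range_m) pairs; `seg_columns` maps each image
    column to the class predicted by the segmentation network. Angles are
    mapped to columns linearly, a deliberate simplification.
    """
    half = fov_deg / 2.0
    labelled = []
    for angle, rng in scan:
        if not -half <= angle <= half:
            continue  # outside the camera's field of view: no label
        col = int((angle + half) / fov_deg * (img_width - 1))
        labelled.append((angle, rng, seg_columns[col]))
    return labelled

# Left half of the image segmented as "wall", right half as "door".
seg = ["wall"] * 320 + ["door"] * 320
points = label_scan([(-10.0, 2.5), (30.0, 1.2), (80.0, 3.0)], seg)
```

Each labelled return carries both an accurate range and a semantic class, which is exactly the pairing a semantic map needs.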
Step 5: the control commands produced by the planner are sent through the STM32 chip to the low-level drives, directly and precisely controlling the robot's motion so as to complete semantic-level tasks.
As needed, the user can exploit the rich sensor data and the powerful processing capability to realize further applications.
It should be understood that the above description of a preferred embodiment is relatively detailed and should not therefore be taken as limiting the scope of the utility model's patent protection. Without departing from the scope protected by the claims of the utility model, a person skilled in the art may, guided by the utility model, make substitutions or variations, all of which fall within the protection scope of the utility model; the claimed scope is determined by the appended claims.

Claims (7)

1. A robot terminal device, characterized by comprising an integrally arranged core processing unit Jetson chip, an FPGA chip, an embedded ARM module, and a plurality of sensors, the sensors including a GNSS satellite navigation module, an IMU inertial navigation module, a lidar, and a camera; the core processing unit Jetson chip is connected to the camera and the lidar; the FPGA chip is separately connected to the GNSS satellite navigation module, the IMU inertial navigation module, and the embedded ARM module; and the embedded ARM module is connected to servo motors controlling the robot.
2. The robot terminal device according to claim 1, characterized in that the GNSS satellite navigation module uses a Sinan K505 chip.
3. The robot terminal device according to claim 1, characterized in that the core processing unit Jetson chip is a Jetson TX1 and connects to the lidar and the camera through USB interfaces.
4. The robot terminal device according to claim 1, characterized in that the core processing unit Jetson chip is fitted with a fan.
5. The robot terminal device according to claim 1, characterized in that the FPGA chip is an Altera Cyclone III.
6. The robot terminal device according to claim 1, characterized in that the IMU inertial navigation module uses an ADIS16460 chip.
7. The robot terminal device according to any one of claims 1 to 6, characterized in that the embedded ARM module is fitted with an LCD module.
CN201820790147.9U 2018-05-24 2018-05-24 Robot terminal device Active CN208224794U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201820790147.9U CN208224794U (en) 2018-05-24 2018-05-24 Robot terminal device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201820790147.9U CN208224794U (en) 2018-05-24 2018-05-24 Robot terminal device

Publications (1)

Publication Number Publication Date
CN208224794U true CN208224794U (en) 2018-12-11

Family

ID=64506535

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201820790147.9U Active CN208224794U (en) 2018-05-24 2018-05-24 Robot terminal device

Country Status (1)

Country Link
CN (1) CN208224794U (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109855624A * 2019-01-17 2019-06-07 Ningbo Sunny Intelligent Technology Co., Ltd. Navigation device and navigation method for an AGV
CN110488712A * 2019-08-30 2019-11-22 Shanghai YOGO Robot Co., Ltd. Embedded human-machine-interaction mainboard for a delivery robot
CN112388677A * 2020-10-27 2021-02-23 Sichuan University Miniature VSLAM vision sensor
CN114526725A * 2022-02-21 2022-05-24 Shandong New Generation Information Industry Technology Research Institute Co., Ltd. Hyper-converged navigation system based on a system-on-chip
CN114526725B * 2022-02-21 2023-11-24 Shandong New Generation Information Industry Technology Research Institute Co., Ltd. Hyper-converged navigation system based on a system-on-chip
CN115847451A * 2022-12-26 2023-03-28 Jiangxi Hongdu Aviation Industry Group Co., Ltd. Distributed intelligent robot control system


Similar Documents

Publication Publication Date Title
CN208224794U (en) Robot terminal device
CN108776474B (en) Robot embedded computing terminal integrating high-precision navigation positioning and deep learning
CN206709853U Indoor simultaneous localization and mapping system for a multi-rotor unmanned aerial vehicle
CN103901895B Target positioning method and robot based on an unscented FastSLAM algorithm and matching optimization
EP2572336B1 (en) Mobile device, server arrangement and method for augmented reality applications
CN109029433 Method for extrinsic calibration and time synchronization of vision and inertial-navigation fusion SLAM on a mobile platform
CN110446159 System and method for accurate indoor positioning and autonomous navigation of an unmanned aerial vehicle
CN103064416A (en) Indoor and outdoor autonomous navigation system for inspection robot
CN109358342B (en) Three-dimensional laser SLAM system based on 2D laser radar and control method
CN110488850 Vision navigation system and method for a quadrotor drone based on Raspberry Pi
CN111260751B (en) Mapping method based on multi-sensor mobile robot
US20230236280A1 (en) Method and system for positioning indoor autonomous mobile robot
Klinker et al. Distributed user tracking concepts for augmented reality applications
US20200259570A1 (en) Indoor localization with beacon technology based on signal strength distribution and deep learning techniques
CN112198903A (en) Modular multifunctional onboard computer system
Singh et al. Ubiquitous hybrid tracking techniques for augmented reality applications
CN105184268A (en) Gesture recognition device, gesture recognition method, and virtual reality system
Tippetts et al. An on-board vision sensor system for small unmanned vehicle applications
He et al. Cooperative localization and evaluation of small-scaled spherical underwater robots
CN115307646B (en) Multi-sensor fusion robot positioning method, system and device
He et al. Visual positioning system for small-scaled spherical robot in underwater environment
CN113960614A (en) Elevation map construction method based on frame-map matching
Hu et al. 3D indoor modeling using a hand-held embedded system with multiple laser range scanners
CN110764511A (en) Mobile robot with multi-sensor fusion and control method thereof
Zhang et al. Recent Advances in Mobile Robot Localization in Complex Scenarios

Legal Events

Date Code Title Description
GR01 Patent grant