CN111949018A - ROS system intelligent vehicle - Google Patents

ROS system intelligent vehicle

Info

Publication number
CN111949018A
CN111949018A (application CN202010653136.8A)
Authority
CN
China
Prior art keywords
wheel
main controller
real time
controller
wheel trolley
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010653136.8A
Other languages
Chinese (zh)
Inventor
Wu Ping
Liu Feng
Zheng Yinyan
He Bingbing
Chen Jianfei
Jin Chenxin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wenzhou University
Original Assignee
Wenzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wenzhou University filed Critical Wenzhou University
Priority to CN202010653136.8A priority Critical patent/CN111949018A/en
Publication of CN111949018A publication Critical patent/CN111949018A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30Map- or contour-matching
    • G01C21/32Structuring or formatting of map data

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides an ROS system intelligent vehicle, which comprises a four-wheel trolley and a main controller, a motion controller, four motors, a laser radar and a camera, all mounted on the four-wheel trolley. The camera calibrates images of the trolley's surroundings in real time. The laser radar determines, in real time, the detectable range around the trolley and the coordinates of obstacles within that range. The main controller performs coordinate conversion on the surrounding images to construct a navigation map and, in that map and based on a grid-map value principle, uses the detectable range and obstacle coordinates to detect in real time any collision risk between the trolley and an obstacle; when a risk exists, it generates a steering command and issues it to the motion controller. The motion controller drives one or more of the motors to adjust wheel speeds according to the steering command, so that the trolley runs along the adjusted motion track. The intelligent vehicle overcomes the existing intelligent vehicles' shortcomings in automatic speed change and automatic road identification, improving their intelligence.

Description

ROS system intelligent vehicle
Technical Field
The invention relates to the technical field of intelligent robots, and in particular to an ROS (Robot Operating System) system intelligent vehicle.
Background
The intelligent vehicle is a product of combining the latest achievements of electronic computing with the modern automobile industry: a single-chip microcomputer controls the trolley to move forward, turn left and turn right. With the development of artificial-intelligence technology, the intelligence of such vehicles needs further improvement.
In China, many robotics and autonomous-driving enterprises have adopted ROS. ROS is a free, open-source meta-operating system whose underlying tasks run on Linux; it is an integrated repository of software frameworks and function packages in which nodes communicate with one another through topics in a distributed architecture, whether on the same machine or on different machines. ROS takes code reuse as a core design goal: parts of the functionality are packaged as function libraries for convenient use.
It is therefore necessary to research an intelligent vehicle based on the ROS system, so as to overcome the existing intelligent vehicles' shortcomings in automatic speed change and automatic road identification and further improve their intelligence.
Disclosure of Invention
The technical problem to be solved by the embodiments of the invention is to provide an ROS system intelligent vehicle that overcomes the existing intelligent vehicles' shortcomings in automatic speed change and automatic road identification and further improves their intelligence.
To solve this technical problem, an embodiment of the invention provides an ROS system intelligent vehicle comprising a four-wheel trolley and a Raspberry Pi 4B-based main controller, a motion controller, four motors, a laser radar and a camera, all mounted on the four-wheel trolley; wherein,
the camera is connected with the first end of the main controller and is used for calibrating the peripheral image of the four-wheel trolley in real time;
the laser radar is connected with the second end of the main controller and used for determining the peripheral detectable range of the four-wheel trolley and the coordinates of obstacles in the range in real time;
the third end of the main controller is connected with the input end of the motion controller; the main controller performs coordinate conversion on the surrounding images calibrated by the camera in real time to construct a navigation map, detects in real time, in the constructed navigation map and based on a grid-map value principle, any collision risk between the four-wheel trolley and an obstacle according to the detectable range and in-range obstacle coordinates determined by the laser radar in real time, generates a steering instruction for automatically adjusting the current motion track of the four-wheel trolley when a collision risk is detected, and issues the steering instruction to the motion controller;
the output end of the motion controller is connected with the four motors coupled to the four wheels of the four-wheel trolley; according to a steering instruction issued by the main controller, the motion controller drives one or more of the four motors to adjust the speed of the corresponding wheels, so that the four-wheel trolley runs along the adjusted motion track.
ROS system software is flashed onto the main controller. Its hardware comprises a Cortex-A72 64-bit quad-core processor, two micro-HDMI ports supporting dual displays at 4K resolution, 4Kp60 hardware video decoding, 4 GB of RAM, a dual-band 2.4/5.0 GHz wireless LAN interface, a Bluetooth 5.0 interface, a Gigabit Ethernet interface, USB 3.0 interfaces and a PoE interface.
The main controller is connected with the laser radar and the camera through USB 3.0 interfaces.
The motion controller adopts an STM32 microcontroller; it is connected with the main controller through a serial port and with the four motors through four encoders, one per motor.
The embodiment of the invention has the following beneficial effects:
using a Raspberry Pi 4B-based main controller, the invention performs coordinate conversion on images calibrated in real time to construct a navigation map, detects in that map the collision risk between the four-wheel trolley and any obstacle within the detectable range determined in real time by the laser radar, generates a steering instruction to automatically adjust the trolley's current motion track when a collision risk exists, and has the motion controller drive one or more motors to adjust the wheel speeds so that the trolley runs along the adjusted track. This overcomes the existing intelligent vehicles' shortcomings in automatic speed change and automatic road identification and further improves their intelligence.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is within the scope of the present invention for those skilled in the art to obtain other drawings based on the drawings without inventive exercise.
FIG. 1 is a schematic system diagram of an ROS system intelligent vehicle according to an embodiment of the present invention;
FIG. 2 is a logic connection diagram of a main controller, a motion controller, four motors, a laser radar and a camera in the ROS system intelligent vehicle provided by the embodiment of the invention;
fig. 3 is a schematic diagram of a grid map value sampling adopted by the main controller in fig. 2.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the accompanying drawings.
As shown in fig. 1, in an embodiment of the present invention, an ROS system intelligent vehicle includes a four-wheel trolley 1 and, mounted on it, a Raspberry Pi 4B-based main controller 2, a motion controller 3, four motors 4, a laser radar 5, and a camera 6; wherein,
the camera 6 is connected with the first end of the main controller 2 and is used for calibrating the peripheral image of the four-wheel trolley 1 in real time;
the laser radar 5 is connected with the second end of the main controller 2 and is used for determining the peripheral detectable range of the four-wheel trolley 1 and the coordinates of obstacles in the range in real time;
the third end of the main controller 2 is connected with the input end of the motion controller 3; the main controller performs coordinate conversion on the surrounding images calibrated by the camera 6 in real time to construct a navigation map, detects in real time, in the constructed navigation map and based on a grid-map value principle, any collision risk between the four-wheel trolley 1 and an obstacle according to the detectable range and in-range obstacle coordinates determined by the laser radar 5 in real time, generates a steering instruction for automatically adjusting the current motion track of the four-wheel trolley 1 when a collision risk is detected, and issues the steering instruction to the motion controller 3;
the output end of the motion controller 3 is connected with the four motors 4 coupled to the four wheels (not shown) of the four-wheel trolley 1; according to a steering instruction issued by the main controller 2, the motion controller drives one or more of the four motors 4 to adjust the speed of the corresponding wheels, so that the four-wheel trolley 1 runs along the adjusted motion track.
It should be noted that the main controller 2 is not limited to the functions described above; it may also provide voice recognition, face recognition, and similar functions. For example, a voice-recognition module can be added to recognize user commands (such as start, turn, pause): the main controller 2 receives and analyzes the module's output and generates the corresponding control command for the motion controller 3. As another example, a face-recognition module can be added to check user permissions: the main controller 2 receives the related face information and judges the user's current access rights. Meanwhile, the main controller 2 is provided with a preset path-adjustment algorithm: when a collision risk between the four-wheel trolley 1 and any obstacle in the detectable range is detected, the current motion track is adjusted according to this algorithm, forming the corresponding steering instruction.
It can be understood that the current motion track of the four-wheel trolley 1 is preset inside the main controller 2 and reflected on the navigation map, so that the trolley can automatically navigate and run along it in simulation according to the navigation map. Of course, the current motion track of the four-wheel trolley 1 may also be set temporarily by the user.
In the embodiment of the invention, collision risk between the four-wheel trolley and any obstacle in the detectable range is determined by calculating the distance between the trolley's center of mass and the obstacle and judging whether that distance is smaller than a preset collision-warning distance: if it is, a collision risk exists; otherwise, there is no collision risk.
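The distance test described above can be sketched as follows; the function names and the 0.5 m warning distance are illustrative assumptions, not values taken from the patent:

```python
import math

# Hypothetical sketch of the collision test described above: a risk flag is
# raised when the distance from the trolley's centre of mass to an obstacle
# falls below a preset warning distance (0.5 m is an assumed value).
COLLISION_WARNING_DISTANCE = 0.5  # metres, assumed

def collision_risk(centroid, obstacle, warning_distance=COLLISION_WARNING_DISTANCE):
    """Return True when the obstacle is closer than the warning distance."""
    dx = obstacle[0] - centroid[0]
    dy = obstacle[1] - centroid[1]
    return math.hypot(dx, dy) < warning_distance

def any_collision_risk(centroid, obstacles):
    """Check every obstacle reported inside the lidar's detectable range."""
    return any(collision_risk(centroid, ob) for ob in obstacles)
```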
In one embodiment, the main controller 2 is flashed with ROS system software; its hardware includes a Cortex-A72 64-bit quad-core processor, two micro-HDMI ports supporting dual displays at 4K resolution, 4Kp60 hardware video decoding, 4 GB of RAM, a dual-band 2.4/5.0 GHz wireless LAN interface, a Bluetooth 5.0 interface, a Gigabit Ethernet interface, USB 3.0 interfaces and a PoE interface. The main controller 2 is connected with the laser radar 5 and the camera 6 through USB 3.0 interfaces. The motion controller 3 is an STM32 microcontroller, connected with the main controller 2 through a serial port and with the four motors 4 through four encoders, one per motor.
As shown in fig. 2 and fig. 3, the application scenario of the ROS system smart car in the embodiment of the present invention is further explained:
the ROS system intelligent vehicle mainly comprises two parts, a motion controller OpenCRP (Open-source Control Module for ROS on Pi) is based on an STM32F103 single chip microcomputer, 4 motors are connected to a motor encoder, meanwhile, a power supply input is communicated with a power supply of a raspberry group 4B on a main controller, a serial port is communicated with the power supply, cameras for external laser radar and depth vision are connected to the raspberry group 4B of the main controller by means of USB ports, and the ROS system is installed on the raspberry group 4B of the main controller. And then the user side can be connected to the raspberry group through distributed communication, so that control is realized.
The four motors at the OpenCRP end of the motion controller drive the four wheels of the ROS intelligent vehicle; left/right turning and forward/backward motion are controlled by adjusting the speed difference between the wheels via the encoders, forming the electric drive system. The motion controller OpenCRP connects to the Raspberry Pi 4B main controller through TTL serial communication, which is the interaction between the control system and the drive system, successfully connecting the "brain" to the "limbs". The laser radar and depth-vision camera are connected to the Raspberry Pi 4B of the main controller, i.e. the sensor data enters ROS and participates in scene analysis, realizing the sensor extension of the whole system.
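The speed-difference steering described above can be illustrated with a minimal skid-steer model. The track width below is an assumed parameter, and the mapping from body velocity to per-wheel speeds is a standard differential-drive sketch, not the OpenCRP firmware's actual code:

```python
# Minimal sketch of speed-difference steering for a skid-steer four-wheel base:
# both wheels on one side share a speed, and turning comes from the left/right
# difference. The track width is an assumed parameter, not from the patent.
TRACK_WIDTH = 0.3  # metres between left and right wheels, assumed

def wheel_speeds(linear, angular, track=TRACK_WIDTH):
    """Map a body velocity (m/s) and yaw rate (rad/s) to per-wheel speeds."""
    left = linear - angular * track / 2.0
    right = linear + angular * track / 2.0
    # Front and rear wheels on each side turn at the same speed.
    return {"front_left": left, "rear_left": left,
            "front_right": right, "rear_right": right}
```

Driving straight gives equal speeds on both sides; a pure yaw command gives opposite-sign speeds, which is the "difference of speed" the encoders regulate.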
Furthermore, the main controller, i.e. the Raspberry Pi 4B, needs to be flashed with an Ubuntu MATE 16.04 system: go to the official site (https://ubuntu-mate.org/ports/raspberry-pi/), download the 16.04 image for the Raspberry Pi, and download the Win32DiskImager burning software on Windows. After burning, insert the SD card into the Raspberry Pi 4B, connect a keyboard and mouse, plug in the power, and execute ROS installation steps similar to those for the virtual machine above (the steps are basically the same). While the ROS intelligent vehicle runs, ROS is installed inside the Raspberry Pi 4B main controller, not on a host computer. ROS is a distributed software framework whose nodes can be connected through remote IP addresses in the hosts file. Realizing distributed multi-machine communication mainly depends on two steps: 1. Set the IP addresses to ensure the underlying link is connected; first run ifconfig in a terminal to determine each machine's IP address. 2. Run sudo vim /etc/hosts and, on the master, edit in the slave's IP address and chosen hostname.
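The two steps above amount to making each machine resolve the others by name and pointing every node at a single ROS master. A sketch of the slave-side configuration, with illustrative IP addresses and hostnames only:

```python
import os

# Illustrative values only: substitute the real addresses found with ifconfig.
MASTER_IP = "192.168.1.10"   # Raspberry Pi 4B running roscore (assumed)
SLAVE_IP = "192.168.1.20"    # user-side machine (assumed)

# Step 1 corresponds to /etc/hosts entries such as these on both machines:
HOSTS_ENTRIES = f"{MASTER_IP} ros-master\n{SLAVE_IP} ros-slave\n"

# Step 2: every node on the slave must point at the single ROS master and
# advertise its own reachable address.
os.environ["ROS_MASTER_URI"] = f"http://{MASTER_IP}:11311"
os.environ["ROS_HOSTNAME"] = SLAVE_IP
```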
The motion controller OpenCRP is an open-source ROS control module dedicated to the Raspberry Pi 4B main controller. It uses an STM32F103 as its main control, carries an onboard IMU (accelerometer/gyroscope) sensor, and supports 4-channel closed-loop DC motor control. It is provided with a BOOT key and supports ISP serial-port firmware updates. It can be plugged directly onto the Raspberry Pi, supplying power to and communicating over serial with the Raspberry Pi 4B through the pin header. The STM32F103 microcontroller's resources include: 48 KB SRAM, 256 KB FLASH, 2 basic timers, 4 general-purpose timers, 2 advanced timers, and 51 general-purpose IO ports. The motor-driving module is the TB6612FNG dual-channel PWM motor driver, each of which can drive two motors.
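The 4-channel closed-loop DC motor control mentioned above is typically an encoder-feedback speed loop per wheel. The sketch below shows one such PI loop; the gains, limits and PWM interpretation are assumptions for illustration, not the OpenCRP firmware:

```python
# Illustrative PI speed loop for one wheel: the encoder supplies the measured
# speed, and the output would feed the TB6612FNG as a signed PWM duty cycle.
# Gains and limits are assumed values, not taken from the OpenCRP firmware.
class WheelSpeedPI:
    def __init__(self, kp=0.8, ki=0.2, limit=1.0):
        self.kp, self.ki, self.limit = kp, ki, limit
        self.integral = 0.0

    def step(self, target, measured, dt):
        """One control period: returns a PWM duty clipped to [-limit, limit]."""
        error = target - measured
        self.integral += error * dt
        duty = self.kp * error + self.ki * self.integral
        return max(-self.limit, min(self.limit, duty))
```

One such controller would run per motor, called at a fixed rate with the target speed from the serial command and the measured speed from the encoder.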
The laser radar's ranging output is a sensor_msgs/LaserScan message, which contains a std_msgs/Header (message header) and, specifically: angle_min, the starting angle of the detectable range; angle_max, the ending angle, which together with angle_min defines the laser radar's detectable range; angle_increment, the angle step between adjacent beams; time_increment, the time step between adjacent measurements, used for compensation when the sensor is in relative motion; scan_time, the time required to collect one frame of data; range_min, the nearest detectable depth threshold; range_max, the farthest detectable depth threshold; and ranges, the array of per-frame depth data. The odometry information is given by rosmsg show nav_msgs/Odometry; its pose field holds the four-wheel trolley's current position: the XYZ three-axis position and orientation parameters, plus a covariance matrix for correcting errors.
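A short sketch of how the listed fields fit together: the i-th sample of ranges lies at bearing angle_min + i * angle_increment, and samples outside [range_min, range_max] should be discarded. The numeric values used below are made-up examples, not real sensor output:

```python
import math

# The i-th range sample in a LaserScan sits at angle_min + i * angle_increment.
def beam_angle(angle_min, angle_increment, i):
    return angle_min + i * angle_increment

def beam_to_xy(rng, angle):
    """Convert one (range, bearing) sample to lidar-frame x/y coordinates."""
    return rng * math.cos(angle), rng * math.sin(angle)

def valid(rng, range_min, range_max):
    """Samples outside [range_min, range_max] should be discarded."""
    return range_min <= rng <= range_max
```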
The camera calibrates the surrounding images, and the Raspberry Pi 4B main controller converts coordinates through the calibration to achieve 3D stereoscopic vision and establish an imaging model. For example, a Kinect camera can be used to obtain depth information for mapping by the ROS intelligent vehicle.
Fig. 3 is a schematic diagram of the grid-map values used by the Raspberry Pi 4B main controller. The square frame is the outline of the four-wheel trolley; the position is judged mainly by the distance between the obstacle and the inscribed circle of that outline, and the circumscribed circle bounds the area of potential impact. In general, a larger cell value means a larger probability of an obstruction.
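This inscribed/circumscribed rule mirrors the usual grid-map (costmap) inflation scheme. A hedged sketch follows, with the radii, decay rate and the 253/254 value convention assumed in the style of ROS costmap_2d rather than taken from the patent:

```python
import math

# Sketch of the grid-value rule described above: cells closer to an obstacle
# than the inscribed radius are certain collisions; between the inscribed and
# circumscribed radii lies the potential-impact band; beyond it the value
# decays with distance. All constants here are assumed for illustration.
LETHAL = 254
INSCRIBED = 253

def cell_value(distance, r_inscribed=0.2, r_circumscribed=0.3, decay=10.0):
    if distance <= r_inscribed:
        return LETHAL          # the robot footprint overlaps the obstacle
    if distance <= r_circumscribed:
        return INSCRIBED       # potential impact depending on orientation
    # Larger value = higher probability of an obstruction, decaying outward.
    return int((INSCRIBED - 1) * math.exp(-decay * (distance - r_inscribed)))
```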
Navigation simulation can be built on the Raspberry Pi 4B main controller, for example through the following procedure:
(1) Mapping simulation
Start the gmapping demonstration (laser radar):
roslaunch mbot_gazebo mbot_laser_nav_gazebo.launch
roslaunch mbot_navigation gmapping_demo.launch
roslaunch mbot_teleop mbot_teleop.launch
After running from the starting point to the end point, save the map: rosrun map_server map_saver -f cloister_mapping
After saving, the corresponding .pgm map file and the .yaml map-information file are obtained.
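The .yaml file follows map_server's standard metadata layout. The sketch below shows illustrative field values and how that metadata maps a world coordinate to a pixel in the .pgm image; the file name and numbers are examples, not from the patent:

```python
# Sketch of the map metadata that map_server writes next to the .pgm image.
# The field names are map_server's standard ones; the values are examples.
map_metadata = {
    "image": "cloister_mapping.pgm",  # occupancy image saved above (assumed name)
    "resolution": 0.05,               # metres per pixel
    "origin": [-10.0, -10.0, 0.0],    # map-frame pose of the lower-left pixel
    "occupied_thresh": 0.65,          # pixels darker than this are obstacles
    "free_thresh": 0.196,             # pixels lighter than this are free space
    "negate": 0,
}

def world_to_pixel(x, y, meta):
    """Convert a map-frame point to pixel indices in the .pgm image."""
    ox, oy, _ = meta["origin"]
    res = meta["resolution"]
    return int((x - ox) / res), int((y - oy) / res)
```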
Install the navigation framework: sudo apt-get install ros-kinetic-navigation
(2) Navigation simulation
roslaunch mbot_gazebo mbot_laser_nav_gazebo.launch
roslaunch mbot_navigation nav_cloister_demo.launch
(3) Map building navigation
Details of the xtark_nav navigation launch file:
(launch-file listing reproduced in the original only as figures)
Connect the host to the robot over SSH, enter the launch folder of the xtark_nav package, and modify the map path in the xtark_nav navigation launch file; the default maps are stored in "/ros_ws/src/xtark_nav/maps".
Run the xtark_nav.launch navigation node: roslaunch xtark_nav xtark_nav.launch
Then start an rviz terminal to view the real-time 2D/3D scene: rosrun rviz rviz
A target position is issued with the 2D Nav Goal button, and the ROS robot automatically navigates, avoiding obstacles, to reach the target position.
Change the navigation map to the map file obtained from the previous mapping step, then start navigation to travel between the designated start and end points.
The embodiment of the invention has the following beneficial effects:
using a Raspberry Pi 4B-based main controller, the invention performs coordinate conversion on images calibrated in real time to construct a navigation map, detects in that map the collision risk between the four-wheel trolley and any obstacle within the detectable range determined in real time by the laser radar, generates a steering instruction to automatically adjust the trolley's current motion track when a collision risk exists, and has the motion controller drive one or more motors to adjust the wheel speeds so that the trolley runs along the adjusted track. This overcomes the existing intelligent vehicles' shortcomings in automatic speed change and automatic road identification and further improves their intelligence.
While the invention has been described in connection with what is presently considered to be the most practical and preferred embodiment, it is to be understood that the invention is not to be limited to the disclosed embodiment, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.

Claims (4)

1. An ROS system intelligent vehicle, characterized by comprising a four-wheel trolley and a Raspberry Pi 4B-based main controller, a motion controller, four motors, a laser radar and a camera, all mounted on the four-wheel trolley; wherein,
the camera is connected with the first end of the main controller and is used for calibrating the peripheral image of the four-wheel trolley in real time;
the laser radar is connected with the second end of the main controller and used for determining the peripheral detectable range of the four-wheel trolley and the coordinates of obstacles in the range in real time;
the third end of the main controller is connected with the input end of the motion controller; the main controller performs coordinate conversion on the surrounding images calibrated by the camera in real time to construct a navigation map, detects in real time, in the constructed navigation map and based on a grid-map value principle, any collision risk between the four-wheel trolley and an obstacle according to the detectable range and in-range obstacle coordinates determined by the laser radar in real time, generates a steering instruction for automatically adjusting the current motion track of the four-wheel trolley when a collision risk is detected, and issues the steering instruction to the motion controller;
the output end of the motion controller is connected with the four motors coupled to the four wheels of the four-wheel trolley; according to a steering instruction issued by the main controller, the motion controller drives one or more of the four motors to adjust the speed of the corresponding wheels, so that the four-wheel trolley runs along the adjusted motion track.
2. The ROS system intelligent vehicle of claim 1, wherein ROS system software is flashed onto the main controller, and the hardware comprises a Cortex-A72 64-bit quad-core processor, two micro-HDMI ports supporting dual displays at 4K resolution, 4Kp60 hardware video decoding, 4 GB of RAM, a dual-band 2.4/5.0 GHz wireless LAN interface, a Bluetooth 5.0 interface, a Gigabit Ethernet interface, USB 3.0 interfaces and a PoE interface.
3. The ROS system smart car of claim 2, wherein the master controller is connected to both the lidar and the camera via USB 3.0 interfaces.
4. The ROS system intelligent vehicle of claim 1, wherein the motion controller is an STM32 microcontroller, connected with the main controller through a serial port and with the four motors through four encoders, one per motor.
CN202010653136.8A 2020-07-08 2020-07-08 ROS system intelligent vehicle Pending CN111949018A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010653136.8A CN111949018A (en) 2020-07-08 2020-07-08 ROS system intelligent vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010653136.8A CN111949018A (en) 2020-07-08 2020-07-08 ROS system intelligent vehicle

Publications (1)

Publication Number Publication Date
CN111949018A true CN111949018A (en) 2020-11-17

Family

ID=73340336

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010653136.8A Pending CN111949018A (en) 2020-07-08 2020-07-08 ROS system intelligent vehicle

Country Status (1)

Country Link
CN (1) CN111949018A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115648221A (en) * 2022-11-22 2023-01-31 福州大学 Education robot based on ROS system
CN116067555A (en) * 2023-04-06 2023-05-05 西南交通大学 Bolt looseness detection system and method for urban rail transit and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105487535A (en) * 2014-10-09 2016-04-13 东北大学 Mobile robot indoor environment exploration system and control method based on ROS
CN108279679A (en) * 2018-03-05 2018-07-13 华南理工大学 A kind of Intelligent meal delivery robot system and its food delivery method based on wechat small routine and ROS
CN108981695A (en) * 2018-07-26 2018-12-11 浙江大学 A kind of carriage navigation system based on ROS
CN109062201A (en) * 2018-07-23 2018-12-21 南京理工大学 Intelligent navigation micro-system and its control method based on ROS
CN209215939U (en) * 2018-08-30 2019-08-06 苏州德斯米尔智能科技有限公司 A kind of all-around mobile intelligent carriage based on ROS system
CN210835730U (en) * 2019-12-19 2020-06-23 郭凯诚 Control device of ROS blind guiding robot
CN210819528U * 2019-09-27 2020-06-23 内蒙古智诚物联股份有限公司 IOT (Internet of things) intelligent household robot based on ROS (Robot Operating System) system


Similar Documents

Publication Publication Date Title
JP6915009B2 (en) Sensor calibration methods and equipment, computer equipment, media and vehicles
US10990099B2 (en) Motion planning methods and systems for autonomous vehicle
CN110082781B (en) Fire source positioning method and system based on SLAM technology and image recognition
US10867409B2 (en) Methods and systems to compensate for vehicle calibration errors
US10974390B2 (en) Autonomous localization and navigation equipment, localization and navigation method, and autonomous localization and navigation system
CN114474061B (en) Cloud service-based multi-sensor fusion positioning navigation system and method for robot
CN111949018A (en) ROS system intelligent vehicle
WO2021218310A1 (en) Parking method and apparatus, and vehicle
CN112147999A (en) Automatic driving experiment AGV vehicle platform
CN110901656B (en) Experimental design method and system for autonomous vehicle control
Kessler et al. Bridging the gap between open source software and vehicle hardware for autonomous driving
US20220371579A1 (en) Driving assistance apparatus, vehicle, and driving assistance method
CN114684202B (en) Intelligent system for automatically driving vehicle and integrated control method thereof
US11975738B2 (en) Image annotation for deep neural networks
CN111547045A (en) Automatic parking method and device for vertical parking spaces
CN116067555B (en) Bolt looseness detection system and method for urban rail transit and storage medium
Kannapiran et al. Go-CHART: A miniature remotely accessible self-driving car robot
CN212781778U (en) Intelligent vehicle based on vision SLAM
CN108646759B (en) Intelligent detachable mobile robot system based on stereoscopic vision and control method
CN115648221A (en) Education robot based on ROS system
CN103529837A (en) Dual-core two-wheeled top-speed microcomputer mouse-based diagonal sprint servo system
Liu et al. The multi-sensor fusion automatic driving test scene algorithm based on cloud platform
CN115758687A (en) Unmanned aerial vehicle autopilot simulation platform
KR102175943B1 (en) Platform and Automotive platform for education of unmanned vehicle
CN115373404A (en) Mobile robot for indoor static article identification and autonomous mapping and working method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Ye Yixiu

Inventor after: Wu Ping

Inventor after: Wang Xingguo

Inventor after: Zheng Yinyan

Inventor after: Jin Chenxin

Inventor before: Wu Ping

Inventor before: Liu Feng

Inventor before: Zheng Yinyan

Inventor before: He Bingbing

Inventor before: Chen Jianfei

Inventor before: Jin Chenxin

CB03 Change of inventor or designer information
RJ01 Rejection of invention patent application after publication

Application publication date: 20201117

RJ01 Rejection of invention patent application after publication