CN117234139A - Control system based on ROS hanging rail carrying robot - Google Patents

Control system based on ROS hanging rail carrying robot

Info

Publication number
CN117234139A
Authority
CN
China
Prior art keywords
robot
ros
camera
control system
usb
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311493407.8A
Other languages
Chinese (zh)
Inventor
王安炜
张超
王涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Jiegou Information Technology Co ltd
Original Assignee
Shandong Jiegou Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Jiegou Information Technology Co ltd filed Critical Shandong Jiegou Information Technology Co ltd
Priority to CN202311493407.8A
Publication of CN117234139A

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Manipulator (AREA)

Abstract

The invention provides a control system for a suspended rail carrying robot based on ROS (Robot Operating System), and relates to the technical field of intelligent control of agricultural greenhouses. The system adopts a track-guided detection mode, comprises a robot local system and a remote operation library, and uses the ROS robot development framework to control the movement of the suspended rail carrying robot. Through the control system the robot can move smoothly along the single suspended rail, take corners, and stop completely anywhere on the rail; it can also visually identify abnormal conditions in the environment, raise an alarm in time, and accurately locate its own position in the map.

Description

Control system based on ROS hanging rail carrying robot
Technical Field
The invention relates to the technical field of intelligent control of agricultural greenhouses, and particularly provides a control system for a suspended rail carrying robot based on ROS (Robot Operating System).
Background
As society advances, the traditional mode of agricultural production can no longer meet the needs of modern development, and the new-style agricultural greenhouse is widely pursued in the industry. Such agricultural equipment is mainly greenhouse facilities: unconstrained by time and place, they make agricultural production possible even in special environments such as highlands, deep mountains and deserts.
Agricultural greenhouses are now increasingly common, but the working environment inside them is relatively harsh and demands a great deal of manual labor. Materials must be carried at every stage of the work; ensuring normal crop growth also requires extensive manual pesticide spraying, which endangers the operators' health; and picking likewise requires many workers.
Existing greenhouses mostly use ground-track carrying equipment, which occupies precious greenhouse floor space and hinders ground operations inside the greenhouse.
Disclosure of Invention
Because GPS signals may suffer interference inside the greenhouse, robot positioning relies mainly on a combination of relative and absolute positioning. Relative positioning uses an encoder and an inertial measurement unit (IMU): the encoder computes the robot's displacement along the track from the change in pulse count per sampling period, but this method accumulates error and is unsuitable for long-distance positioning. The IMU contains a three-axis accelerometer and a three-axis gyroscope, outputs the robot's three-axis acceleration and three-axis angular velocity, and position and velocity changes are obtained by resolving these measurements. IMU-based positioning has good short-term accuracy, but it drifts as time passes: even a small constant error becomes extremely large after repeated integration.
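The drift argument can be made concrete with a small numeric sketch. Everything below is illustrative: the encoder resolution, wheel radius and bias value are assumptions, not parameters from the patent.

```python
import math

# Illustrative dead-reckoning sketch: the encoder turns pulse counts into
# displacement along the rail, while a constant IMU accelerometer bias grows
# without bound once it has been integrated twice.
PULSES_PER_REV = 1024          # assumed encoder resolution
WHEEL_RADIUS_M = 0.05          # assumed drive-wheel radius

def encoder_displacement(pulse_delta):
    """Displacement along the track for one sampling period (metres)."""
    revs = pulse_delta / PULSES_PER_REV
    return revs * 2.0 * math.pi * WHEEL_RADIUS_M

def imu_position_error(bias_mps2, t_s):
    """Position error from a constant accelerometer bias after double
    integration: 0.5 * b * t^2, the drift described in the text."""
    return 0.5 * bias_mps2 * t_s ** 2

print(encoder_displacement(2048))     # two wheel revolutions, ~0.628 m
print(imu_position_error(0.01, 60))   # 0.01 m/s^2 bias after one minute, ~18 m
```

The quadratic growth of the IMU error is why the text pairs relative positioning with an absolute reference (the rail-mounted two-dimensional code tags).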
To address these problems, the technical task of the invention is to provide a control system for a suspended rail carrying robot based on ROS.
The system adopts a track-guided detection mode, comprises a robot local system and a remote operation library, and uses the ROS robot development framework to control the movement of the suspended rail carrying robot;
the remote operation library is responsible for heavy data storage, data management, data processing and processing of three-dimensional image depth information;
the robot local system comprises a driving system, a data acquisition system, a communication system and an extended motion system.
Furthermore, the remote operation library uses the microprocessor of a central computer as the decision center of the robot;
the central computer is connected through a serial port to an STM32F407 serving as the lower computer, which controls the driving system and the braking system.
Further, the driving system consists of motors and motor drivers, is controlled over CAN communication, and comprises four motor drivers and four motor groups;
during robot operation the data acquisition system extracts useful information from the surrounding environment and reports it to the central computer for decision making;
the communication system comprises USB serial communication, a local gigabit Ethernet network, a Controller Area Network (CAN) bus and wireless technology.
Still further, the suspended rail structure of the suspended rail carrying robot comprises a rail bracket, rail-member connectors and inclined-rail connectors; a two-dimensional code tag is arranged on the rail bracket, and the ROS suspended rail carrying robot scans the two-dimensional code on the bracket with its camera to achieve absolute positioning, thereby eliminating accumulated error.
Furthermore, the control system uses the ROS rosbridge function package to control the robot over WebSocket; rosbridge provides a JSON interface that lets non-ROS platforms use ROS functionality.
Further, in remote control the control system uses the rosbridge function package to make the Web server side part of the ROS system;
on the robot side, the control system runs the rosbridge WebSocket server, which listens on port 9090 after starting;
on the server side, the control system uses a WebSocket client to open a connection to the robot side and send JSON data (the format must follow the rosbridge v2.0 protocol); the WebSocket server parses the request data and calls the corresponding ROS API to control the robot;
with the websocket-client package installed, the robot side connects to the server side automatically after startup.
Further, the implementation of absolute positioning includes:
the control system uses the ROS usb_cam function package to drive the robot's USB camera via the V4L protocol, and the camera outputs two-dimensional image data; usb_cam is the ROS driver package for V4L USB cameras, and its core node is usb_cam_node; the specific procedure is as follows:
when the usb_cam launch file is run, usb_cam_node starts first and configures the corresponding parameters; the image_view node is then run, subscribing to the image topic /usb_cam/image_raw to visualize what the camera sees;
the monocular camera is then calibrated: the camera calibration package camera_calibration is installed with the install command, and the required calibration target pattern (under the robot_vision/doc folder) is printed out and pasted onto flat cardboard for use;
after calibration is completed, the YAML configuration file is generated automatically under ~/.ros/camera_info.
Furthermore, the robot local system uses a serial-port inertial sensor and exchanges control information and robot state information between the robot computing host and the main control unit through a serial communication module; two serial ports are led out on the main control board of the robot local system, serial port 1 connecting to the robot control host through a USB-to-RS232 chip for the information exchange, and serial port 2 serving as a later expansion interface.
Furthermore, the control system designs each module in a distributed manner according to the mobile robot's different functions and hardware modules, encapsulates the specific functions of different modules in different nodes, manages the nodes uniformly through the ROS node manager, completes inter-node communication through topic publishing and subscribing, and outputs cmd_vel messages to the motor driving module to finally achieve autonomous control.
Furthermore, robot camera identification is implemented as follows:
the ar_track_alvar function package is installed in ROS;
the ar_track_alvar package supports either a USB camera or an RGB-D camera as the visual sensor for identifying the two-dimensional codes, handled by two different identification nodes, individualMarkersNoKinect and individualMarkers;
identification with the most commonly used USB camera is described first.
After installation, ar_track_alvar is found under the ROS default installation path /opt/ros/melodic/share;
the pr2_indiv_no_kinect.launch file in the ar_track_alvar package's launch folder is copied as a template, modified to set the USB camera in use, and saved as robot_vision/launch/ar_track_camera.launch;
the camera view is opened in the rviz interface, with the world and camera coordinate systems shown in the main view; the two-dimensional code tag is placed within the camera's field of view for identification; several two-dimensional codes can be identified accurately at the same time, and the coordinate axes drawn on each code in the image represent its identified pose;
the ar_track_alvar package identifies each two-dimensional code in the image together with its spatial pose, and computes its spatial position relative to the camera, thereby implementing the obstacle-avoidance and absolute-positioning functions.
The ar_pose_marker topic of the ar_track_alvar package lists the information of all identified two-dimensional codes, including each code's ID and pose, and the message data can be printed with 'rostopic echo'.
Compared with the prior art, the control system based on the ROS suspended rail carrying robot has the following outstanding beneficial effects:
through the control system the robot can move smoothly along the single suspended rail, take corners, and stop completely anywhere on the rail; it can also visually identify abnormal conditions in the environment, raise an alarm in time, and accurately locate its own position in the map.
Drawings
FIG. 1 is a block diagram of a robotic control system of the present invention;
FIG. 2 is a block diagram of the ROS software node and robot hardware modules of the control system of the present invention;
fig. 3 is a schematic diagram of a remote control implementation process according to the present invention.
Detailed Description
The invention will be described in further detail with reference to the drawings and examples.
The system adopts a track-guided detection mode, comprises a robot local system and a remote operation library, and uses the ROS robot development framework to control the movement of the suspended rail carrying robot;
as shown in fig. 1, the carrying robot needs high computing power due to the need of processing a large amount of image data, and a computer of the carrying robot is mainly used for heavy data storage, data management, data processing and processing of three-dimensional image depth information by taking a high-performance Intel CoreTMi7 microprocessor as a decision center of the robot. The central computer is connected with STM32F407 through a serial port to serve as a lower computer, and the driving system is controlled to drive the braking system. The robot subsystem further includes a drive system, a data acquisition system, a communication system, and an extended motion system. The driving system consists of motors and motor drivers, is controlled by a CAN communication mode and consists of four motor drivers and four motor groups. The data acquisition system is used for extracting useful information from the surrounding environment and reporting the useful information to the central computer in the running process of the robot so as to make decisions. The communication system includes USB serial port communication, a local gigabit Ethernet network, a Control Area Network (CAN) bus, and wireless technology.
The robot control system uses an embedded control board. STM32 serial communication is a common data communication method in embedded control, typically used for information transfer between sensors and controllers and for interaction between controllers. The invention uses a serial-port inertial sensor, and control information and robot state information must be exchanged between the robot computing host and the main control unit, so a serial communication module is an essential part. Two serial ports are led out on the designed main control board: serial port 1 connects to the robot control host through a USB-to-RS232 chip for the information exchange, and serial port 2 is reserved as a later expansion interface.
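The patent only states that the host exchanges data with the STM32F407 over a serial port; it does not give the frame protocol. The sketch below is therefore a hedged illustration: the frame layout (header byte, command byte, little-endian int16 speed, additive checksum) is an assumption, not the patent's actual format.

```python
import struct

# Hypothetical host-to-lower-computer frame: header, command, speed, checksum.
# The layout is an illustrative assumption; only the idea of a framed binary
# serial protocol comes from the text.
FRAME_HEADER = 0xAA

def build_speed_frame(cmd, speed_mm_s):
    """Pack a command frame: header (u8), cmd (s8), speed (s16 LE), checksum."""
    body = struct.pack('<Bbh', FRAME_HEADER, cmd, speed_mm_s)
    checksum = sum(body) & 0xFF          # low byte of the byte sum
    return body + bytes([checksum])

# Sending it would use pyserial (pip install pyserial), roughly:
#   import serial
#   with serial.Serial('/dev/ttyUSB0', 115200, timeout=0.1) as port:
#       port.write(build_speed_frame(0x01, 500))
print(build_speed_frame(0x01, 500).hex())   # -> aa01f401a0
```

Keeping the frame builder separate from the I/O makes the protocol testable without the RS232 hardware attached.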
The robot's main motion mechanism is an in-wheel motor, i.e. a brushless DC motor, and its drive control directly affects the robot's motion performance. The main function of the motor driver is to receive the output signal of the drive board and convert it into the corresponding three-phase voltage. The motor is rated at 24 V and 150 W, with a rated output torque of 1.2 N·m, a maximum torque of 2.8 N·m and a rated speed of 1500 r/min, and is equipped with Hall sensors.
As shown in fig. 2, the control system designs and develops each module in a distributed manner according to the mobile robot's different functions and hardware modules, encapsulates the specific functions of different modules in different nodes, manages the nodes uniformly through the ROS node manager, completes inter-node communication through topic publishing and subscribing, and outputs cmd_vel messages to the motor driving module to finally achieve autonomous control, commanding the robot to start, stop, accelerate, decelerate or reverse in response to external conditions.
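The cmd_vel output described above can be sketched as follows. The wheel radius and the mapping from a Twist message to four identical wheel speeds are illustrative assumptions for a rail vehicle that only needs forward motion; the rospy usage in the comment is the standard ROS 1 pattern, not code from the patent.

```python
import math

# Map a cmd_vel linear velocity to wheel speeds for the four hub motors.
WHEEL_RADIUS_M = 0.05   # assumed drive-wheel radius

def twist_to_wheel_rpm(linear_x_mps):
    """Convert a cmd_vel linear.x (m/s) into the same rpm for all four motors."""
    rps = linear_x_mps / (2.0 * math.pi * WHEEL_RADIUS_M)
    return [rps * 60.0] * 4

# Under ROS 1 the publishing side would look like this (requires rospy):
#   import rospy
#   from geometry_msgs.msg import Twist
#   rospy.init_node('rail_robot_commander')
#   pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
#   msg = Twist(); msg.linear.x = 0.3
#   pub.publish(msg)
print(twist_to_wheel_rpm(0.3))
```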
The suspended rail structure of the suspended rail carrying robot comprises a rail bracket, rail-member connectors and inclined-rail connectors; a two-dimensional code tag is arranged on the rail bracket, and the ROS suspended rail carrying robot scans the two-dimensional code on the bracket with its camera to achieve absolute positioning, thereby eliminating accumulated error.
As shown in fig. 3, the control system uses the ROS rosbridge function package to control the robot over WebSocket; rosbridge provides a JSON interface that lets non-ROS platforms use ROS functionality.
In remote control, the control system uses the rosbridge function package to make the Web server side part of the ROS system;
on the robot side, the control system runs the rosbridge WebSocket server, which listens on port 9090 after starting;
on the server side, the control system uses a WebSocket client to open a connection to the robot side and send JSON data (the format must follow the rosbridge v2.0 protocol); the WebSocket server parses the request data and calls the corresponding ROS API to control the robot;
with the websocket-client package installed, the robot side connects to the server side automatically after startup.
The implementation of absolute positioning includes the following steps:
the control system uses the ROS usb_cam function package to drive the robot's USB camera via the V4L protocol, and the camera outputs two-dimensional image data; usb_cam is the ROS driver package for V4L USB cameras, and its core node is usb_cam_node; the specific procedure is as follows:
when the usb_cam launch file is run, usb_cam_node starts first and configures the corresponding parameters; the image_view node is then run, subscribing to the image topic /usb_cam/image_raw to visualize what the camera sees;
the monocular camera is then calibrated: the camera calibration package camera_calibration is installed with the install command, and the required calibration target pattern (under the robot_vision/doc folder) is printed out and pasted onto flat cardboard for use;
after calibration is completed, the YAML configuration file is generated automatically under ~/.ros/camera_info.
The robot local system uses a serial-port inertial sensor and exchanges control information and robot state information between the robot computing host and the main control unit through a serial communication module; two serial ports are led out on the main control board of the robot local system, serial port 1 connecting to the robot control host through a USB-to-RS232 chip for the information exchange, and serial port 2 serving as a later expansion interface.
The control system designs each module in a distributed manner according to the mobile robot's different functions and hardware modules, encapsulates the specific functions of different modules in different nodes, manages the nodes uniformly through the ROS node manager, completes inter-node communication through topic publishing and subscribing, and outputs cmd_vel messages to the motor driving module to finally achieve autonomous control.
Robot camera identification is implemented as follows:
the ar_track_alvar function package is installed in ROS;
the ar_track_alvar package supports either a USB camera or an RGB-D camera as the visual sensor for identifying the two-dimensional codes, handled by two different identification nodes, individualMarkersNoKinect and individualMarkers;
identification with the most commonly used USB camera is described first.
After installation, ar_track_alvar is found under the ROS default installation path /opt/ros/melodic/share;
the pr2_indiv_no_kinect.launch file in the ar_track_alvar package's launch folder is copied as a template, modified to set the USB camera in use, and saved as robot_vision/launch/ar_track_camera.launch;
the camera view is opened in the rviz interface, with the world and camera coordinate systems shown in the main view; the two-dimensional code tag is placed within the camera's field of view for identification; several two-dimensional codes can be identified accurately at the same time, and the coordinate axes drawn on each code in the image represent its identified pose;
the ar_track_alvar package identifies each two-dimensional code in the image together with its spatial pose, and computes its spatial position relative to the camera, thereby implementing the obstacle-avoidance and absolute-positioning functions.
The above embodiments are only preferred embodiments of the invention; common variations and substitutions made by those skilled in the art within the scope of the technical solution of the invention are intended to fall within the scope of the invention.

Claims (10)

1. A control system based on an ROS suspended rail carrying robot, characterized in that the system adopts a track-guided detection mode, comprises a robot local system and a remote operation library, and uses the ROS robot development framework to control the movement of the suspended rail carrying robot;
the remote operation library is responsible for heavy data storage, data management, data processing and the processing of three-dimensional image depth information;
the robot local system comprises a driving system, a data acquisition system, a communication system and an extended motion system.
2. The control system based on the ROS suspended rail carrying robot of claim 1, wherein the remote operation library uses the microprocessor of a central computer as the decision center of the robot;
the central computer is connected through a serial port to an STM32F407 serving as the lower computer, which controls the driving system and the braking system.
3. The control system based on the ROS suspended rail carrying robot according to claim 2, wherein the driving system consists of motors and motor drivers, is controlled over CAN communication, and comprises four motor drivers and four motor groups;
during robot operation the data acquisition system extracts useful information from the surrounding environment and reports it to the central computer for decision making;
the communication system comprises USB serial communication, a local gigabit Ethernet network, a Controller Area Network (CAN) bus and wireless technology.
4. A control system based on an ROS suspended rail carrying robot according to any one of claims 1-3, wherein the suspended rail structure of the suspended rail carrying robot comprises a rail bracket, rail-member connectors and inclined-rail connectors; a two-dimensional code tag is arranged on the rail bracket, and the ROS suspended rail carrying robot scans the two-dimensional code on the bracket through a camera to achieve absolute positioning, thereby eliminating error.
5. The control system of claim 4, wherein the control system uses the ROS rosbridge function package to control the robot over WebSocket and provides a JSON interface that enables non-ROS platforms to use ROS functions.
6. The control system based on the ROS suspended rail carrying robot of claim 5, wherein in remote control the control system uses the rosbridge function package to make the Web server side part of the ROS system;
on the robot side, the control system runs the rosbridge WebSocket server, which listens on port 9090 after starting;
on the server side, the control system uses a WebSocket client to open a connection to the robot side and send JSON format data; the WebSocket server parses the request data and calls the corresponding ROS API to control the robot;
with the websocket-client package installed, the robot side connects to the server side automatically after startup.
7. The control system based on the ROS suspended rail carrying robot of claim 6, wherein absolute positioning is achieved as follows:
the control system uses the ROS usb_cam function package to drive the robot's USB camera via the V4L protocol, and the camera outputs two-dimensional image data; usb_cam is the ROS driver package for V4L USB cameras, and its core node is usb_cam_node; the specific procedure is as follows:
when the usb_cam launch file is run, usb_cam_node starts first and configures the corresponding parameters; the image_view node is then run, subscribing to the image topic /usb_cam/image_raw to visualize what the camera sees;
the monocular camera is then calibrated: the camera calibration package camera_calibration is installed with the install command, and the required calibration target pattern under the robot_vision/doc folder is printed out and pasted onto flat cardboard for use;
after calibration is completed, the YAML configuration file is generated automatically under ~/.ros/camera_info.
8. The control system based on the ROS suspended rail carrying robot of claim 4, wherein
the robot local system uses a serial-port inertial sensor and exchanges control information and robot state information between the robot computing host and the main control unit through a serial communication module; two serial ports are led out on the main control board of the robot local system, serial port 1 connecting to the robot control host through a USB-to-RS232 chip for the information exchange, and serial port 2 serving as a later expansion interface.
9. The control system based on the ROS suspended rail carrying robot of claim 4, wherein
the control system designs each module in a distributed manner according to the mobile robot's different functions and hardware modules, encapsulates the specific functions of different modules in different nodes, manages the nodes uniformly through the ROS node manager, completes inter-node communication through topic publishing and subscribing, and outputs cmd_vel messages to the motor driving module to finally achieve autonomous control.
10. The control system of claim 7, wherein robot camera identification is implemented as follows:
the ar_track_alvar function package is installed in ROS;
the ar_track_alvar package supports either a USB camera or an RGB-D camera as the visual sensor for identifying the two-dimensional codes, handled by two different identification nodes, individualMarkersNoKinect and individualMarkers;
identification with the most commonly used USB camera is described first;
after installation, ar_track_alvar is found under the ROS default installation path /opt/ros/melodic/share;
the pr2_indiv_no_kinect.launch file in the ar_track_alvar package's launch folder is copied as a template, modified to set the USB camera in use, and saved as robot_vision/launch/ar_track_camera.launch;
the camera view is opened in the rviz interface, with the world and camera coordinate systems shown in the main view; the two-dimensional code tag is placed within the camera's field of view for identification; several two-dimensional codes can be identified accurately at the same time, and the coordinate axes drawn on each code in the image represent its identified pose;
the ar_track_alvar package identifies each two-dimensional code in the image together with its spatial pose, and computes its spatial position relative to the camera, thereby implementing the obstacle-avoidance and absolute-positioning functions.
CN202311493407.8A 2023-11-10 2023-11-10 Control system based on ROS hanging rail carrying robot Pending CN117234139A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311493407.8A CN117234139A (en) 2023-11-10 2023-11-10 Control system based on ROS hanging rail carrying robot


Publications (1)

Publication Number Publication Date
CN117234139A true CN117234139A (en) 2023-12-15

Family

ID=89098574

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311493407.8A Pending CN117234139A (en) 2023-11-10 2023-11-10 Control system based on ROS hanging rail carrying robot

Country Status (1)

Country Link
CN (1) CN117234139A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108406764A (en) * 2018-02-02 2018-08-17 上海大学 Intelligence style of opening service robot operating system and method
CN110673612A (en) * 2019-10-21 2020-01-10 重庆邮电大学 Two-dimensional code guide control method for autonomous mobile robot
CN112017247A (en) * 2020-08-11 2020-12-01 盐城工学院 Method for realizing unmanned vehicle vision by using KINECT
CN113253719A (en) * 2021-04-06 2021-08-13 南京工程学院 Intelligent mobile equipment based on ROS operating system and communication establishing method
US20220001534A1 (en) * 2020-07-02 2022-01-06 Robert Bosch Gmbh Uniform remote control of mobile platforms
CN114800518A (en) * 2022-05-20 2022-07-29 东南大学 Multi-mobile-robot cooperative control experiment platform based on embedded framework


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wang Chen (王宸): "机器视觉与图像识别" [Machine Vision and Image Recognition], Beijing Institute of Technology Press, 31 July 2022, pages 161-165 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination