WO2021254367A1 - Robot system and positioning and navigation method - Google Patents

Robot system and positioning and navigation method

Info

Publication number
WO2021254367A1
Authority
WO
WIPO (PCT)
Prior art keywords
positioning
module
navigation
robot
navigation device
Prior art date
Application number
PCT/CN2021/100278
Other languages
English (en)
French (fr)
Inventor
姚秀军
崔丽华
王栋
桂晨光
Original Assignee
京东科技信息技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京东科技信息技术有限公司 filed Critical 京东科技信息技术有限公司
Publication of WO2021254367A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/161 Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J9/1666 Programme controls characterised by motion, path, trajectory planning; avoiding collision or forbidden zones
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/021 Optical sensing devices
    • B25J19/022 Optical sensing devices using lasers
    • B25J19/026 Acoustical sensing devices
    • B25J19/04 Viewing devices

Definitions

  • the present disclosure generally relates to the field of artificial intelligence technology, and more specifically, to a robot system and a positioning and navigation method.
  • Mobile robots are now widely used in daily work and production.
  • Mobile robots can construct maps and plan paths during use.
  • however, current mobile robots do not construct maps accurately enough, their path planning is not always reasonable, they often collide with obstacles, and their autonomous navigation and obstacle avoidance are not robust enough, which reduces work efficiency and shortens the service life of mobile robots.
  • the present disclosure provides a robot system.
  • the system includes a positioning and navigation device, a driving device, a control device, a human-computer interaction device, and an environment sensing device.
  • the environment perception device includes at least an ultrasonic sensor, a lidar, and a depth camera.
  • the positioning and navigation device is respectively connected with the driving device, the control device, the human-computer interaction device, the ultrasonic sensor, the lidar, and the depth camera;
  • the driving device is configured to provide driving force
  • the control device is configured to control the robot
  • the human-computer interaction device is configured to send operation instructions to the positioning and navigation device, and to receive feedback information from the positioning and navigation device;
  • the environment sensing device is configured to receive a data collection instruction sent by the positioning and navigation device, and send the collected environment data to the positioning and navigation device;
  • the positioning and navigation device is configured to construct a map and perform route planning.
  • the environment sensing device further includes an infrared sensor, an anti-drop sensor, an inertial measurement unit, and a collision sensor
  • the positioning and navigation device is connected to the infrared sensor, the anti-drop sensor, the inertial measurement unit, and the collision sensor, respectively;
  • the infrared sensor is configured to perform infrared ranging
  • the anti-drop sensor is configured to detect the height of the ground, and adjust the forward direction of the robot based on the detected height of the ground;
  • the inertial measurement unit is configured to measure the three-axis attitude angle and acceleration of the robot.
  • the collision sensor is configured to detect a collision signal and transmit the collision signal to the positioning and navigation device.
  • the positioning and navigation device includes: a positioning module and a navigation module, and the positioning module is connected to the navigation module;
  • the positioning module is configured to perform pose detection and relocation
  • the navigation module is configured to perform obstacle avoidance processing, speed control, path smoothness control, path deduction, and 3D obstacle detection.
  • the positioning and navigation device includes a mapping module
  • the mapping module is configured to perform graph-optimization loop-closure processing and to build a map.
  • the positioning and navigation device includes: a security module and a micro-control unit, and the security module is connected to the micro-control unit;
  • the security module is configured for main-controller heartbeat detection and ultrasonic collision detection, and to set up a lidar fence;
  • the micro control unit is configured to control the movement of the driving device.
  • the positioning and navigation device includes a monitoring module
  • the monitoring module is configured for power detection, positioning accuracy detection and safety early warning.
  • the human-computer interaction device includes: a visualization software module and a software development kit module, the visualization software module and the software development kit module are respectively connected to the positioning and navigation device;
  • the visualization software module is configured to set inspection points, manage maps, set modes and real-time monitoring;
  • the software development kit module is configured to send mapping instructions and navigation instructions, to set virtual walls, and to obtain positions.
  • the positioning and navigation device includes:
  • a collision switch configured to open the circuit after the robot collides; and
  • an emergency stop module configured to disconnect the circuit's power supply so as to stop the robot's movement.
  • the positioning and navigation device includes:
  • a recording module configured to record working data and send the working data to the human-computer interaction device
  • the working data includes: running data, abnormal data, and operation data.
  • the present disclosure provides a positioning and navigation method applied to any one of the above-mentioned robot systems. The method includes: receiving, through the human-computer interaction device, operation information sent by a user; controlling the movement of the driving device according to the operation information; detecting environment data through the environment sensing device; and
  • performing positioning and navigation processing through the positioning and navigation device according to the environment data.
  • the robot system includes a positioning and navigation device, a driving device, a control device, a human-computer interaction device, and an environment sensing device.
  • the environment sensing device includes at least an infrared sensor, an ultrasonic sensor, a lidar, and a depth camera,
  • the positioning and navigation device is respectively connected with the driving device, the control device, the human-computer interaction device, the infrared sensor, the ultrasonic sensor, the lidar, and the depth camera.
  • Figure 1 shows a structural diagram of a robot system provided by an embodiment of the present disclosure
  • Figure 2 shows a connection diagram of an environment sensing device and a positioning and navigation device provided by an embodiment of the present disclosure
  • Figure 3 shows a block diagram of a robot system provided by an embodiment of the present disclosure.
  • Figure 4 shows a functional diagram of a robot system provided by an embodiment of the present disclosure.
  • the embodiments of the present disclosure provide a robot system and a positioning and navigation method, which can be applied to a robot for autonomously constructing a map and performing path planning and navigation.
  • the robot system includes a positioning and navigation device, a driving device, a control device, a human-computer interaction device, and an environment sensing device; the positioning and navigation device is connected to the driving device, the control device, the human-computer interaction device, and the environment sensing device respectively.
  • the driving device is configured to provide driving force;
  • the control device is configured to control the robot;
  • the human-computer interaction device is configured to send operation instructions to the positioning and navigation device and receive feedback information from the positioning and navigation device;
  • the environment sensing device is configured to receive data-collection instructions sent by the positioning and navigation device, and to send the collected environment data to the positioning and navigation device;
  • the positioning and navigation device is configured to construct a map and perform path planning.
  • the robot navigation process is: the user sends a task instruction to the positioning and navigation device through the human-computer interaction device; the positioning and navigation device derives navigation information from the task instruction and sends it to the driving device so that the driving device drives the robot to move.
  • while the robot moves, the positioning and navigation device receives the environment data sent by the environment sensing device and updates the navigation information in real time according to that data to control the robot's movement until the task ends.
  • Figure 2 shows the equipment connected to the positioning and navigation device.
  • the positioning and navigation device is connected to audio equipment through an I2S interface, to the infrared sensor through an AD interface or a serial port, to the anti-drop sensor through an AD interface, to the ultrasonic sensor through an RS485 interface, to the lidar through a network port, to the depth camera and the IMU (Inertial Measurement Unit) through USB, and to the collision switch and the anti-collision sensor through an IO interface.
  • the positioning and navigation device includes a variety of hardware interfaces, which are suitable for connecting with various hardware.
  • the hardware interfaces include: 6 network ports, CAN interface, IO interface, RS485 or RS232, USB2.0 or USB3.0, WIFI or 4G.
  • the environment sensing device includes detection devices for building a map: ultrasonic sensors, lidars, and depth cameras; the positioning and navigation device is connected to the ultrasonic sensors, the lidars, and the depth cameras respectively. The ultrasonic sensor is configured to perform ultrasonic ranging and is suitable for accurately measuring the distance between stationary and moving objects; the lidar is configured for omnidirectional scanning ranging; the depth camera is configured to obtain three-dimensional orientation, resists ambient-light interference well, and offers high measurement accuracy, making it suitable for outdoor scenes with limited direct sunlight.
  • the robot can work with many types of lidar, including single-line sparse radar, single-line dense radar, multi-line dense radar, and multi-line sparse radar.
  • Ultrasonic sensors are sensors that convert ultrasonic signals into other energy signals (usually electrical signals).
  • Ultrasound is a mechanical wave with a vibration frequency above 20 kHz. It has high frequency, short wavelength, little diffraction, and particularly good directivity, so it can propagate directionally like a ray. Ultrasound penetrates liquids and solids strongly, especially solids that are opaque to light. When an ultrasonic wave hits an impurity or an interface it produces a significant reflection that forms an echo, and when it hits a moving object it produces a Doppler effect.
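  • As an illustration of the ranging arithmetic (a minimal sketch, not taken from the patent), assuming a sensor driver that reports the echo's round-trip time:

```python
# Minimal ultrasonic time-of-flight ranging sketch (illustrative only).
# Assumes the sensor driver reports the echo round-trip time in seconds.

def speed_of_sound(temp_celsius: float) -> float:
    """Approximate speed of sound in air (m/s) as a function of temperature."""
    return 331.3 + 0.606 * temp_celsius

def ultrasonic_distance(round_trip_s: float, temp_celsius: float = 20.0) -> float:
    """Distance to the reflecting object: the pulse travels out and back,
    so the one-way distance is half the round-trip path."""
    return speed_of_sound(temp_celsius) * round_trip_s / 2.0

# Example: a 5.8 ms echo at 20 degC corresponds to roughly 1 m.
print(f"{ultrasonic_distance(5.8e-3):.2f} m")
```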
  • Lidar is a radar system that emits laser beams to detect characteristic quantities of a target such as its position and speed.
  • the lidar transmits a detection signal (a laser beam) toward the target, compares the signal reflected back from the target (the target echo) with the transmitted signal, and, after suitable processing, obtains the relevant information about the target.
  • a radar that works in the infrared and visible-light bands and uses a laser as its working beam is called a lidar. It consists of a laser transmitter, an optical receiver, a turntable, and an information processing system.
  • the laser converts electrical pulses into light pulses and emits them.
  • the optical receiver restores the light pulses reflected from the target to electrical pulses and sends them to the display.
  • the depth camera is highly resistant to ambient light interference, and its measurement accuracy is high, reaching the millimeter level.
  • the depth camera can use binocular matching for depth detection.
  • the process relies on the triangulation principle: the disparity, i.e. the difference between the horizontal coordinates at which a target point is imaged in the left and right views, is inversely proportional to the distance from the target point to the imaging plane, which yields the depth information.
  • binocular matching applies this triangulation principle purely through image processing: matching points are obtained by finding the same feature points in the two images, from which the depth values are computed.
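  • To make the inverse proportionality concrete, here is a minimal depth-from-disparity sketch; the focal length and baseline values are illustrative assumptions:

```python
import numpy as np

# Minimal binocular-triangulation sketch: Z = f * B / d, where f is the focal
# length in pixels, B the baseline between the two cameras, and d the disparity
# (difference of the point's horizontal image coordinates in left/right views).

def depth_from_disparity(disparity_px: np.ndarray,
                         focal_px: float,
                         baseline_m: float) -> np.ndarray:
    """Convert a disparity map (pixels) to a depth map (meters)."""
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(d, np.inf)
    valid = d > 0                      # zero disparity means "at infinity"
    depth[valid] = focal_px * baseline_m / d[valid]
    return depth

# Example: f = 700 px, baseline = 6 cm, disparity 42 px -> Z = 1.0 m.
print(depth_from_disparity(np.array([42.0]), focal_px=700.0, baseline_m=0.06))
```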
  • while constructing a map, the robot uses ultrasonic sensors, lidar, and depth cameras simultaneously for ranging, and builds a planar map.
  • robots currently tend to build maps using lidar ranging alone, but that approach suits only some robots.
  • this disclosure combines depth cameras, lidars, and ultrasonic sensors to build maps in complex environments; it is suitable for outdoor and indoor use and for various kinds of robots, such as household sweeping robots, business service robots, industrial unmanned forklifts, and automated guided vehicles.
  • the environment sensing device further includes devices for navigation: an infrared sensor, an anti-drop sensor, an inertial measurement unit, and a collision sensor; the positioning and navigation device is connected to the anti-drop sensor, the inertial measurement unit, and the collision sensor respectively.
  • the infrared sensor is configured to perform infrared ranging and is suitable for long-distance ranging
  • the anti-drop sensor is configured to detect the ground height and adjust the robot's forward direction based on the detected ground height, realizing the anti-fall function
  • the IMU (inertial measurement unit) is configured to measure the robot's three-axis attitude angles and acceleration and to compute its posture, which helps the robot decelerate in time when it encounters an obstacle
  • the collision sensor is configured to detect a collision signal and transmit it to the positioning and navigation device, so that the robot stops in time after a collision and is protected from damage.
  • an infrared sensor is a sensing device that uses infrared light as the measuring medium; it has a wide measuring range and a short response time.
  • the infrared sensor has a pair of infrared transmitting and receiving diodes. The infrared ranging sensor emits a beam of infrared light, which is reflected when it strikes an object; the reflected signal is received by the sensor, CCD (Charge-Coupled Device) image processing captures the time difference between transmission and reception, and a signal processor then computes the distance to the object. This works not only on natural surfaces but also with reflectors; it measures over long distances with high frequency response, suiting harsh industrial environments.
  • IMU is a device that measures the three-axis attitude angle (or angular rate) and acceleration of an object.
  • an IMU contains three single-axis accelerometers and three single-axis gyroscopes.
  • the accelerometers detect the acceleration signals of the object along the three independent axes of the carrier coordinate system, while the gyroscopes detect the angular velocity signals of the carrier relative to the navigation coordinate system. By measuring an object's angular velocity and acceleration in three-dimensional space and solving for its posture from these measurements, the IMU has very important application value in navigation.
  • while the robot is running, if it encounters steps or uneven ground, the anti-drop sensor detects the ground height difference and adjusts the robot's forward direction when that difference exceeds a preset height, preventing the robot from falling; if the robot detects an obstacle within the safe distance during operation, it decelerates according to the three-axis attitude angles and acceleration measured by the inertial measurement unit, so that it does not hit the obstacle at excessive speed; if the robot does collide, the collision sensor detects the collision signal and sends it to the positioning and navigation device, so the robot stops moving in time and further damage is reduced.
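  • The safety behavior just described can be summarized as one decision per control cycle; the following is an illustrative sketch whose thresholds and sensor interfaces are assumptions, not values from the patent:

```python
# Illustrative one-cycle safety decision (thresholds are assumptions).

def safety_step(ground_drop_m: float, obstacle_dist_m: float, collided: bool,
                speed_mps: float,
                max_drop_m: float = 0.05, safe_dist_m: float = 0.5) -> tuple[str, float]:
    """Return (action, commanded_speed) for one control cycle."""
    if collided:                       # collision sensor fired: stop at once
        return "stop", 0.0
    if ground_drop_m > max_drop_m:     # anti-drop sensor sees a ledge: turn away
        return "turn", min(speed_mps, 0.2)
    if obstacle_dist_m < safe_dist_m:  # obstacle inside safe distance: decelerate
        return "decelerate", speed_mps * obstacle_dist_m / safe_dist_m
    return "cruise", speed_mps

print(safety_step(0.02, 0.3, False, 0.8))   # -> ('decelerate', 0.48)
```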
  • the positioning and navigation device further includes: a collision switch configured to open the circuit after the robot collides, stopping the robot's movement; and an emergency stop module configured to disconnect the circuit's power supply to stop the robot's movement.
  • Figure 3 is a system block diagram of the robot.
  • the system includes a human-computer interaction layer, hardware modules, core algorithms, chassis and external equipment.
  • the human-computer interaction layer includes visualization software modules and SDK (Software Development Kit).
  • the hardware modules are set on the ARM (Advanced RISC Machines) hardware platform.
  • the core algorithm functions include navigation and motion control, positioning, and mapping.
  • the chassis includes the driving device
  • the external equipment includes the environment sensing device.
  • the hardware module receives task instructions issued by the human-computer interaction layer through a communication interface; the core algorithm presets an obstacle safety distance, analyzes and executes the corresponding task information, and returns the analysis data to the visualization software module to display task status and progress.
  • the hardware module interacts with the motor drive module in the drive device through the serial port, and sends the navigation information analyzed by the core algorithm to the motor drive module, so that the motor drive module controls the movement of the robot.
  • as the robot starts moving, the autonomous navigation control module receives the environment data collected by the environment sensing device and updates the navigation information in real time according to that data to control the robot's movement until the navigation task ends.
  • Figure 4 shows a functional diagram of the system of the present disclosure; each module in the figure is described in detail below.
  • the positioning and navigation device includes a positioning module and a navigation module, connected to each other; the positioning module is configured to perform pose detection and relocation, and the navigation module is configured to perform obstacle avoidance, speed control, path smoothness control, path deduction, and 3D obstacle detection.
  • the positioning module is configured to detect the position and pose of the mobile robot in the map and judge whether the positioning information is accurate; when the positioning information is wrong, the mobile robot is repositioned according to a preset relocation algorithm to obtain the correct pose. This provides pose-loss protection, improves the robustness of the mapping and positioning algorithms, and extends the validity period of the map.
  • the process of pose tracking and detection is: given the pose at the previous moment, try to track the robot's pose (coordinates and orientation) at the current moment.
  • there are many ways to estimate the robot's relative pose, of which dead reckoning is the most classic.
  • in dead reckoning, the robot uses internal sensors such as odometers to estimate its current position by computing the displacement of the two wheels relative to the position at the previous moment.
  • dead reckoning was often used for robot positioning in the early days because the algorithm is simple and the sensors are cheap.
  • the basic idea of dead reckoning is to compute the robot's relative position from the robot's own sensor information, such as odometer readings; the pose is not corrected by external environment information, so over time wheel slip and similar effects make the accumulated error grow, and the method is unsuitable for long-distance pose estimation. A sketch of the wheel-odometry update follows.
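  • A minimal dead-reckoning sketch for a differential-drive robot (the wheel-base value and the odometry readings below are assumptions for illustration):

```python
import math

# Pose is propagated purely from the displacements of the two wheels
# (odometry), with no external correction -- which is why error accumulates.

def dead_reckoning_step(x: float, y: float, theta: float,
                        d_left: float, d_right: float,
                        wheel_base: float) -> tuple[float, float, float]:
    """Update pose (x, y, theta) from left/right wheel displacements."""
    d_center = (d_left + d_right) / 2.0        # forward motion of the midpoint
    d_theta = (d_right - d_left) / wheel_base  # change of heading
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

pose = (0.0, 0.0, 0.0)
for _ in range(10):                            # ten identical odometry readings
    pose = dead_reckoning_step(*pose, d_left=0.10, d_right=0.12, wheel_base=0.3)
print(pose)
```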
  • therefore, on top of dead reckoning, the present disclosure adds external sensors to the robot to perceive the surrounding environment and uses Kalman filter technology to estimate the robot's pose.
  • compared with other filters, the Kalman filter is an optimal recursive estimation algorithm with a very simple principle: it only needs to know the mean and variance of the noise to solve iteratively.
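  • As a toy illustration of that point (a sketch under stated assumptions, not the patent's filter), a one-dimensional Kalman filter fusing an odometry prediction with a noisy range measurement needs only the two noise variances:

```python
# Toy 1-D Kalman filter: fuses an odometry-predicted position with a noisy
# external measurement. Only the noise variances q (process) and r
# (measurement) are needed, as the passage above notes.

def kalman_step(x: float, p: float, u: float, z: float,
                q: float, r: float) -> tuple[float, float]:
    # Predict: move by odometry increment u; uncertainty grows by q.
    x_pred, p_pred = x + u, p + q
    # Update: blend prediction with measurement z using the Kalman gain k.
    k = p_pred / (p_pred + r)
    return x_pred + k * (z - x_pred), (1.0 - k) * p_pred

x, p = 0.0, 1.0
for z in [0.11, 0.22, 0.30, 0.41]:      # measurements after each 0.1 m step
    x, p = kalman_step(x, p, u=0.1, z=z, q=0.01, r=0.04)
print(x, p)
```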
  • the navigation module is configured to avoid obstacles during navigation, control the travel speed, evaluate the smoothness of the robot's motion with the Floyd path smoothing algorithm, deduce the robot's real-time trajectory from the parsed navigation information, and detect 3D obstacles from point cloud data (a smoothing sketch follows the next bullet).
  • the user can set the minimum size of the area the robot passes through to prevent the robot from getting stuck.
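  • A sketch of line-of-sight shortcutting in the spirit of the Floyd smoothing step mentioned above; the grid representation and the sampled collision test are illustrative assumptions:

```python
from typing import List, Tuple

Point = Tuple[int, int]

def line_is_free(a: Point, b: Point, grid: List[List[int]]) -> bool:
    """Sample the segment a-b and check every sampled cell is free (0)."""
    steps = max(abs(b[0] - a[0]), abs(b[1] - a[1]), 1)
    for i in range(steps + 1):
        x = round(a[0] + (b[0] - a[0]) * i / steps)
        y = round(a[1] + (b[1] - a[1]) * i / steps)
        if grid[x][y] != 0:
            return False
    return True

def smooth_path(path: List[Point], grid: List[List[int]]) -> List[Point]:
    """Greedily shortcut waypoints that are connected by a free straight line."""
    if len(path) < 3:
        return path
    smoothed, i = [path[0]], 0
    while i < len(path) - 1:
        j = len(path) - 1
        while j > i + 1 and not line_is_free(path[i], path[j], grid):
            j -= 1                      # back off until the shortcut is free
        smoothed.append(path[j])
        i = j
    return smoothed
```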
  • after the robot receives the task instruction sent by the human-computer interaction device, the positioning module obtains the destination address and at the same time locates the robot itself.
  • the positioning process is: detecting the position of the mobile robot on the map and judging whether the positioning information is accurate.
  • the mobile robot is repositioned according to the preset repositioning algorithm to obtain the correct pose of the robot.
  • the positioning information is then sent to the navigation module, which performs path navigation according to the received positioning information. During navigation it avoids obstacles, preventing collisions that would damage the robot and shorten its life; it controls the travel speed so that the robot is never moving too fast to decelerate in time before an obstacle; it evaluates the smoothness of the robot's motion with the Floyd path smoothing algorithm; it deduces the robot's real-time trajectory from the parsed navigation information; and it detects 3D obstacles from point cloud data.
  • Navigation algorithms can include fuzzy algorithms, neural network algorithms, fuzzy neural networks, genetic algorithms, and evolutionary neural networks.
  • Navigation methods include inertial navigation, magnetic navigation, visual navigation, navigation based on sensor data, satellite navigation, etc.
  • the navigation principle is: the robot adopts a combined positioning system of lidar, encoder and IMU.
  • the two encoders are respectively installed on the robot's axles, which can record the walking distance of the wheels in real time, and the heading of the vehicle body during walking is determined by the IMU.
  • a set of ultrasonic sensors is arranged around the vehicle body to detect the various obstacles in the work area; the robot starts from a base point and travels along the planned trajectory.
  • the encoders and the IMU measure the real-time left and right wheel rotations and the heading while the robot travels, and these data are recorded by the data acquisition system; from them, the distance traveled by the left and right wheels of the vehicle body in a unit time can each be calculated (a sketch of this encoder arithmetic follows).
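  • The encoder arithmetic reduces to converting tick counts into rolled distance; a minimal sketch with assumed encoder resolution and wheel radius:

```python
import math

# Convert each encoder's tick increment over one sampling period into the
# distance rolled by its wheel (resolution and radius are assumptions).

TICKS_PER_REV = 1024     # encoder resolution (assumed)
WHEEL_RADIUS_M = 0.08    # wheel radius (assumed)

def wheel_distance(delta_ticks: int) -> float:
    """Distance rolled by one wheel for a given tick increment."""
    revolutions = delta_ticks / TICKS_PER_REV
    return revolutions * 2.0 * math.pi * WHEEL_RADIUS_M

# Left and right distances over one sampling period:
d_left, d_right = wheel_distance(512), wheel_distance(540)
print(d_left, d_right)   # ~0.251 m and ~0.265 m
```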
  • the method of detecting 3D obstacles is: determine the second 2D feature vectors corresponding to the 2D image regions, in the 2D image, of the current obstacles detected from the 3D point cloud and 2D image of the current frame; compare each second 2D feature vector with each first 2D feature vector in an obstacle feature vector set to obtain multiple difference feature vectors, where the set stores the first 2D feature vectors characterizing previously detected obstacles; perform deep-learning computation on the multiple difference feature vectors to generate multiple corresponding probability values, each indicating the probability that a current obstacle and a previous obstacle are the same obstacle; and determine the correspondence between the current obstacles and the previous obstacles according to the multiple probability values, thereby achieving obstacle tracking (a schematic sketch follows).
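  • Schematically, the association step can be sketched as follows; a simple logistic score stands in here for the deep-learning computation described in the text:

```python
import numpy as np

# Each current obstacle's 2D feature vector is compared against stored vectors
# of previous obstacles; a scoring function maps each difference vector to a
# same-obstacle probability (a stand-in for the learned scorer in the text).

def association_probabilities(current: np.ndarray, previous: np.ndarray) -> np.ndarray:
    """current: (m, d) feature vectors; previous: (n, d). Returns (m, n) probs."""
    # Difference feature vectors for every (current, previous) pair.
    diffs = current[:, None, :] - previous[None, :, :]
    # Stand-in scorer: probability decays with the difference's norm.
    return 1.0 / (1.0 + np.linalg.norm(diffs, axis=-1))

cur = np.array([[0.9, 0.1], [0.2, 0.8]])
prev = np.array([[1.0, 0.0], [0.0, 1.0]])
probs = association_probabilities(cur, prev)
matches = probs.argmax(axis=1)       # best previous obstacle for each current one
print(probs.round(2), matches)       # current 0 -> previous 0, current 1 -> previous 1
```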
  • the positioning and navigation device includes a mapping module; the mapping module is configured to perform map optimization loop processing and build a map.
  • the present disclosure rapidly optimizes the accumulated pose error through a loop-closure optimization model combined with the g2o graph optimization method, further improving the accuracy and efficiency of autonomous positioning; this raises the autonomous positioning and mapping accuracy of mobile robots while maintaining good real-time performance.
  • the positioning and navigation device is also configured to construct a map to assist the robot in positioning and navigation, and to process the edge of the map to make it smoother.
  • the constructed map may have various forms, including a grid map and a topological map.
  • the present disclosure describes the construction of a topological map.
  • a topological map describes the environment through nodes and the connection relationships between them, without requiring very precise geometric information.
  • the topological map is built on top of the grid map: after the grid map is split, it is divided into different blocks, and these blocks together generate a graph whose nodes correspond to the adjacent regions connected by arcs in the block map. This graph is the topological map of the environment, so a topological map is defined as a graph data structure.
  • the topological map uses nodes to represent some important location points in the environment, and the path information in the environment is represented by the lines between the nodes. In this way, the robot can express navigation between two nodes through some intermediate nodes.
  • the construction process of the topological map can be divided into the following steps (a minimal sketch follows): first, rasterization; second, construction of the Voronoi (Thiessen polygon) diagram; third, search for key points; fourth, search for key routes; and fifth, construction of the topological graph.
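  • A rough end-to-end sketch of those five steps on a toy occupancy grid; the grid, resolution, and free-space test are simplified assumptions, not the patent's procedure:

```python
import numpy as np
from scipy.spatial import Voronoi

# Steps 1-5 in miniature: rasterize, build a Voronoi diagram from obstacle
# cells, keep Voronoi vertices lying in free space as key points, and connect
# them along Voronoi ridges to form the topological graph.

grid = np.zeros((20, 20), dtype=int)          # step 1: rasterized map, 0 = free
grid[5:15, 9] = 1                             # a wall of obstacle cells
grid[5, 3:9] = 1                              # a second wall (non-degenerate input)

obstacles = np.argwhere(grid == 1)            # obstacle cell coordinates
vor = Voronoi(obstacles)                      # step 2: Voronoi diagram

def in_free_space(p) -> bool:
    x, y = int(round(p[0])), int(round(p[1]))
    return 0 <= x < 20 and 0 <= y < 20 and grid[x][y] == 0

nodes = {i: v for i, v in enumerate(vor.vertices) if in_free_space(v)}  # step 3
edges = [(a, b) for a, b in vor.ridge_vertices                          # steps 4-5
         if a in nodes and b in nodes]
print(len(nodes), "key points,", len(edges), "edges")
```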
  • the positioning and navigation device includes a security module and a micro-control unit, and the security module is connected to the micro-control unit.
  • the security module is configured for main-controller heartbeat detection and ultrasonic collision detection, and to set up a lidar fence; the micro-control unit is configured to control the movement of the driving device.
  • the security module receives information from a sending unit; if no information is received within a preset threshold, the connection with the sending unit is considered broken and corresponding measures are taken. It is also used to set up a virtual wall, and to detect the round-trip time of ultrasonic waves in order to calculate the distance to an obstacle (a minimal sketch of the heartbeat check follows).
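  • A minimal sketch of such a heartbeat timeout check; the threshold value is an assumption:

```python
import time

# If no message arrives from the sending unit within a preset threshold, the
# link is treated as broken and a protective action is taken.

class HeartbeatMonitor:
    def __init__(self, timeout_s: float = 1.0):
        self.timeout_s = timeout_s
        self.last_seen = time.monotonic()

    def on_message(self) -> None:
        """Call whenever a heartbeat message is received."""
        self.last_seen = time.monotonic()

    def connected(self) -> bool:
        return time.monotonic() - self.last_seen < self.timeout_s

monitor = HeartbeatMonitor(timeout_s=1.0)
monitor.on_message()
if not monitor.connected():
    print("link lost: stopping the drive as a protective measure")
```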
  • for the micro-control unit, this disclosure can use a C51+AVR control board, a single-chip microcomputer control board with both C51 and AVR functions.
  • the C51 part uses an AT89S52 microcontroller
  • the AVR part uses an ATmega8 microcontroller.
  • the micro-control unit is configured to receive task instructions sent by the human-computer interaction device, and control the movement of the driving device according to the task instructions, and receive environmental data sent by the environment detection device, and control the movement of the driving device according to the environmental data.
  • the positioning and navigation device includes a monitoring module configured for power detection, positioning accuracy detection, and safety warning.
  • the monitoring module is configured to detect the battery level and display the remaining power in the visualization software, and to check whether the robot is positioned correctly and send the detection result to the positioning module. It can also be configured for safety early warning, for example providing an all-around warning function: once an object appears within the warning range, the robot immediately slows down to avoid a hard collision.
  • the human-computer interaction device includes: a visualization software module and a software development kit module, the visualization software module and the software development kit module are respectively connected to the positioning and navigation device; the visualization software module is configured to set inspection points, Manage maps, set modes and real-time monitoring; software development kit modules, configured to send mapping instructions, navigation instructions, and set virtual walls and obtain locations.
  • the essence of the visualization software module is to use the display of a computer monitor to simulate the control panel of a traditional instrument and express and output the test results in various forms; to use the computer's powerful software capabilities for the calculation, analysis, and processing of signals; and to use I/O interface devices to complete signal acquisition and conditioning, thereby forming a computer test system with various test functions.
  • the user uses the mouse, keyboard or touch screen to operate the virtual panel, just like using a dedicated measuring instrument to achieve the required measurement target.
  • the visualization software module is configured to enable the user to set information and view the robot status.
  • the user can set the inspection point, and the robot will inspect the inspection point regularly to realize the unmanned and automated inspection.
  • the user can set the position of the charging station.
  • the user can perform map management, such as setting the name of each area, and the user can also set the robot working mode.
  • the visualization software module can display the running status of the robot, so that the user can easily know the status of the robot, such as working or charging, and can also display the current position of the robot so that the user can know the current position of the robot.
  • the user sets the robot tasks through the visualization software module, such as: first perform map management, for example, set the name of each area, and then set the inspection point.
  • the visualization software module sends task instructions to the positioning and navigation device, and the positioning navigation device performs positioning and navigation according to the task instructions, and feeds back the running status of the robot to the visualization software module so that the user can view the status of the robot.
  • an SDK is a collection of development tools used by software engineers to build application software for specific software packages, software frameworks, hardware platforms, operating systems, and the like; it may include components for communicating with a particular embedded system, as well as utility tools for debugging and other purposes.
  • the SDK often includes sample code, supporting technical notes, or other supporting documents that clarify questions in the basic reference materials.
  • the SDK may carry a license that prevents software from being developed under an incompatible license.
  • the software development kit module supports secondary development by users. It is configured to send map-construction instructions and navigation instructions and to obtain the robot's current position and destination, so that the robot can navigate to the destination according to the constructed map; it can also issue speed commands so the robot decelerates in time to avoid colliding with obstacles and being damaged; navigation parameters can be set; and virtual walls can be set up, making it easy to designate no-entry areas for the robot and protect the robot and user privacy.
  • sensor data can be obtained, including the data collected by the anti-drop sensor, inertial measurement unit, collision sensor, infrared sensor, ultrasonic sensor, lidar, and depth camera; the robot can also be set to recharge automatically when the battery level is not higher than a preset level (a hypothetical usage sketch follows).
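  • Purely as a hypothetical illustration of this kind of secondary development: none of the class or method names below come from the patent or any real SDK; they only mirror the capabilities listed above:

```python
# Hypothetical client interface mirroring the listed SDK capabilities.
class RobotSDK:
    def build_map(self) -> None: ...
    def navigate_to(self, x: float, y: float) -> None: ...
    def set_speed_limit(self, mps: float) -> None: ...
    def add_virtual_wall(self, p1: tuple, p2: tuple) -> None: ...
    def set_auto_recharge(self, threshold_pct: int) -> None: ...

sdk = RobotSDK()
sdk.build_map()                          # send a map-construction instruction
sdk.add_virtual_wall((0, 0), (0, 5))     # designate a no-entry boundary
sdk.set_speed_limit(0.6)                 # slow down to avoid obstacle collisions
sdk.set_auto_recharge(threshold_pct=20)  # recharge when battery <= 20%
sdk.navigate_to(3.2, 4.5)                # navigate on the constructed map
```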
  • the positioning and navigation device includes a recording module configured to record working data and send the working data to the human-computer interaction device, where the working data includes running data, abnormal data, and operation data.
  • the recording module records the data generated while the robot runs and sends it to the control module so that the human-computer interaction device can view it.
  • the data includes the robot's running data, abnormal data, and operation data.
  • the robot can be adapted to a variety of wheel configurations, including two-wheel differential, front single steering wheel, rear-drive single steering wheel, four-wheel-drive Mecanum wheels, dual steering wheels, and three-wheel omnidirectional.
  • the robot includes a variety of network functions, such as firewall, NTP (Network Time Protocol), DHCP (Dynamic Host Configuration Protocol), router, port mapping, and 4G, facilitating operation under different networks.
  • the robot also supports cloud-based and OTA (over-the-air technology) online updates, making it easy to update the robot system in time and improving the user experience.
  • the present disclosure also provides a positioning and navigation method.
  • the navigation process is as follows: the robot receives the operation information sent by the user through the human-computer interaction device and controls the movement of the driving device accordingly; while moving, it builds a map from the environment data detected by the ultrasonic sensors, lidar, and depth camera, and performs positioning and navigation using the infrared sensor, anti-drop sensor, inertial measurement unit, and collision sensor; according to the positioning and navigation information, the driving device changes its direction and speed of movement, realizing automatic positioning and navigation.

Abstract

A robot system and a positioning and navigation method. The robot system includes: a positioning and navigation device, a driving device, a control device, a human-computer interaction device, and an environment sensing device. The environment sensing device includes at least an ultrasonic sensor, a lidar, and a depth camera, and the positioning and navigation device is connected to the driving device, the control device, the human-computer interaction device, the ultrasonic sensor, the lidar, and the depth camera respectively.

Description

Robot system and positioning and navigation method
Reference to related application
The present disclosure claims the full benefit of the invention patent application No. 202010560848.5, entitled "Robot system and positioning and navigation method", filed with the China National Intellectual Property Administration on June 18, 2020, the entire contents of which are incorporated herein by reference.
Field
The present disclosure relates generally to the field of artificial intelligence technology and, more specifically, to a robot system and a positioning and navigation method.
Background
Mobile robots are now widely used in daily work and production. Mobile robots can construct maps and plan paths during use, but current mobile robots do not construct maps accurately enough, their path planning is not always reasonable, collisions with obstacles occur frequently, and their autonomous navigation and obstacle avoidance are not robust enough, which reduces work efficiency and shortens the service life of mobile robots.
Summary
In one aspect, the present disclosure provides a robot system. The system includes: a positioning and navigation device, a driving device, a control device, a human-computer interaction device, and an environment sensing device. The environment sensing device includes at least an ultrasonic sensor, a lidar, and a depth camera, and the positioning and navigation device is connected to the driving device, the control device, the human-computer interaction device, the ultrasonic sensor, the lidar, and the depth camera respectively;
the driving device is configured to provide driving force;
the control device is configured to control the robot;
the human-computer interaction device is configured to send operation instructions to the positioning and navigation device and to receive feedback information from the positioning and navigation device;
the environment sensing device is configured to receive data-collection instructions sent by the positioning and navigation device and to send the collected environment data to the positioning and navigation device; and
the positioning and navigation device is configured to construct a map and perform path planning.
In some embodiments, the environment sensing device further includes an infrared sensor, an anti-drop sensor, an inertial measurement unit, and a collision sensor, and the positioning and navigation device is connected to the infrared sensor, the anti-drop sensor, the inertial measurement unit, and the collision sensor respectively;
the infrared sensor is configured to perform infrared ranging;
the anti-drop sensor is configured to detect the ground height and adjust the robot's forward direction based on the detected ground height;
the inertial measurement unit is configured to measure the robot's three-axis attitude angles and acceleration; and
the collision sensor is configured to detect a collision signal and transmit the collision signal to the positioning and navigation device.
In some embodiments, the positioning and navigation device includes a positioning module and a navigation module, the positioning module being connected to the navigation module;
the positioning module is configured to perform pose detection and relocation; and
the navigation module is configured to perform obstacle avoidance, speed control, path smoothness control, path deduction, and 3D obstacle detection.
In some embodiments, the positioning and navigation device includes a mapping module;
the mapping module is configured to perform graph-optimization loop-closure processing and to construct a map.
In some embodiments, the positioning and navigation device includes a security module and a micro-control unit, the security module being connected to the micro-control unit;
the security module is configured for main-controller heartbeat detection and ultrasonic collision detection, and to set up a lidar fence; and
the micro-control unit is configured to control the movement of the driving device.
In some embodiments, the positioning and navigation device includes a monitoring module;
the monitoring module is configured for battery-level detection, positioning-accuracy detection, and safety early warning.
In some embodiments, the human-computer interaction device includes a visualization software module and a software development kit module, each connected to the positioning and navigation device;
the visualization software module is configured to set inspection points, manage maps, set modes, and perform real-time monitoring; and
the software development kit module is configured to send mapping instructions and navigation instructions, to set virtual walls, and to obtain positions.
In some embodiments, the positioning and navigation device includes:
a collision switch configured to open the circuit after the robot collides; and
an emergency stop module configured to disconnect the circuit's power supply so as to stop the robot's movement.
In some embodiments, the positioning and navigation device includes:
a recording module configured to record working data and send the working data to the human-computer interaction device;
wherein the working data includes running data, abnormal data, and operation data.
In another aspect, the present disclosure provides a positioning and navigation method applied to any one of the robot systems described above. The method includes:
receiving, through the human-computer interaction device, operation information sent by a user;
controlling the movement of the driving device according to the operation information;
detecting environment data through the environment sensing device; and
performing positioning and navigation processing through the positioning and navigation device according to the environment data.
The robot system provided by some embodiments of the present disclosure includes a positioning and navigation device, a driving device, a control device, a human-computer interaction device, and an environment sensing device. The environment sensing device includes at least an infrared sensor, an ultrasonic sensor, a lidar, and a depth camera, and the positioning and navigation device is connected to the driving device, the control device, the human-computer interaction device, the infrared sensor, the ultrasonic sensor, the lidar, and the depth camera respectively. By fusing multiple environment detectors and connecting the positioning and navigation device to all of them, some embodiments of the present disclosure enable the robot to construct maps and plan paths more accurately, achieving intelligent navigation.
Of course, implementing any product or method of the present disclosure does not necessarily require all of the advantages described above to be achieved at the same time.
Brief description of the drawings
To explain the technical solutions of the present disclosure more clearly, the drawings needed in the description are briefly introduced below; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Figure 1 shows a structural diagram of a robot system provided by an embodiment of the present disclosure;
Figure 2 shows a connection diagram of an environment sensing device and a positioning and navigation device provided by an embodiment of the present disclosure;
Figure 3 shows a block diagram of a robot system provided by an embodiment of the present disclosure; and
Figure 4 shows a functional diagram of a robot system provided by an embodiment of the present disclosure.
Detailed description
The technical solutions in the embodiments of the present disclosure are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present disclosure; based on them, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the scope of protection of the present disclosure.
The embodiments of the present disclosure provide a robot system and a positioning and navigation method, which can be applied to a robot for autonomously constructing a map and performing path planning and navigation.
The robot system and the positioning and navigation method provided by the embodiments of the present disclosure are described in detail below. As shown in Figure 1, the robot system includes a positioning and navigation device, a driving device, a control device, a human-computer interaction device, and an environment sensing device; the positioning and navigation device is connected to the driving device, the control device, the human-computer interaction device, and the environment sensing device respectively. The driving device is configured to provide driving force; the control device is configured to control the robot; the human-computer interaction device is configured to send operation instructions to the positioning and navigation device and receive feedback information from it; the environment sensing device is configured to receive data-collection instructions from the positioning and navigation device and send the collected environment data back to it; and the positioning and navigation device is configured to construct a map and perform path planning.
The robot navigation process is as follows: the user sends a task instruction to the positioning and navigation device through the human-computer interaction device; the positioning and navigation device derives navigation information from the task instruction and sends it to the driving device so that the driving device drives the robot to move; while the robot moves, the positioning and navigation device receives the environment data sent by the environment sensing device and updates the navigation information in real time according to that data to control the robot's movement until the task ends.
As shown in Figure 2, Figure 2 shows the equipment connected to the positioning and navigation device. The positioning and navigation device is connected to audio equipment through an I2S interface, to the infrared sensor through an AD interface or a serial port, to the anti-drop sensor through an AD interface, to the ultrasonic sensor through an RS485 interface, to the lidar through a network port, to the depth camera and the IMU (inertial measurement unit) through USB, and to the collision switch and the anti-collision sensor through an IO interface. The positioning and navigation device includes a variety of hardware interfaces suitable for connecting various hardware, including six network ports, a CAN interface, IO interfaces, RS485 or RS232, USB 2.0 or USB 3.0, and WIFI or 4G.
The environment sensing device includes detection devices for constructing a map: ultrasonic sensors, lidars, and depth cameras, each connected to the positioning and navigation device. The ultrasonic sensor is configured for ultrasonic ranging and is suitable for accurately measuring the distance between stationary and moving objects. The lidar is configured for omnidirectional scanning ranging. The depth camera is configured to obtain three-dimensional orientation; it resists ambient-light interference well and has high measurement accuracy, making it suitable for outdoor scenes with limited direct sunlight. The robot can work with many types of lidar, including single-line sparse radar, single-line dense radar, multi-line dense radar, and multi-line sparse radar.
The sensors mentioned above are introduced below.
An ultrasonic sensor converts ultrasonic signals into other energy signals (usually electrical signals). Ultrasound is a mechanical wave with a vibration frequency above 20 kHz. It has high frequency, short wavelength, little diffraction, and particularly good directivity, so it can propagate directionally like a ray. Ultrasound penetrates liquids and solids strongly, especially solids that are opaque to light. When an ultrasonic wave hits an impurity or an interface, it produces a significant reflection that forms an echo, and when it hits a moving object it produces a Doppler effect.
Lidar is a radar system that emits laser beams to detect characteristic quantities of a target such as its position and speed. The lidar transmits a detection signal (a laser beam) toward the target, compares the received signal reflected from the target (the target echo) with the transmitted signal, and, after suitable processing, obtains the relevant information about the target. A radar that works in the infrared and visible-light bands and uses a laser as its working beam is called a lidar. It consists of a laser transmitter, an optical receiver, a turntable, and an information processing system; the laser converts electrical pulses into light pulses and emits them, and the optical receiver restores the light pulses reflected from the target into electrical pulses and sends them to the display.
The depth camera resists ambient-light interference well, and its measurement accuracy is high, reaching the millimeter level. The depth camera can use binocular matching for depth detection. The process relies on the triangulation principle: the disparity between the horizontal coordinates at which a target point is imaged in the left and right views is inversely proportional to the distance from the target point to the imaging plane, yielding the depth information. Binocular matching applies this triangulation principle purely through image processing, obtaining matching points by finding the same feature points in the two images and thereby the depth values.
While constructing a map, the robot uses ultrasonic sensors, lidar, and depth cameras simultaneously for ranging, and builds a planar map. Robots currently tend to build maps using lidar ranging alone, but that approach suits only some robots. The present disclosure fuses depth cameras, lidars, and ultrasonic sensors so that maps can be built in complex environments, outdoors and indoors, for various kinds of robots such as household sweeping robots, business service robots, industrial unmanned forklifts, and automated guided vehicles.
In some embodiments, the environment sensing device further includes devices for navigation: an infrared sensor, an anti-drop sensor, an inertial measurement unit, and a collision sensor; the positioning and navigation device is connected to the anti-drop sensor, the inertial measurement unit, and the collision sensor respectively.
The infrared sensor is configured for infrared ranging and suits long-distance ranging. The anti-drop sensor is configured to detect the ground height and adjust the robot's forward direction based on the detected ground height, realizing the anti-fall function. The IMU (inertial measurement unit) is configured to measure the robot's three-axis attitude angles and acceleration and compute the robot's posture, which helps the robot decelerate in time when it encounters an obstacle. The collision sensor is configured to detect a collision signal and pass it to the positioning and navigation device, so that the robot stops in time after a collision and is protected from damage.
An infrared sensor is a sensing device that uses infrared light as the measuring medium; it has a wide measuring range and a short response time. The infrared sensor has a pair of infrared transmitting and receiving diodes. The infrared ranging sensor emits a beam of infrared light, which is reflected after striking an object; the reflected signal is received by the sensor, CCD (charge-coupled device) image processing captures the time difference between transmission and reception, and a signal processor then computes the distance to the object. This works not only on natural surfaces but also with reflectors; it measures over long distances with high frequency response, suiting harsh industrial environments.
An IMU is a device that measures an object's three-axis attitude angles (or angular rates) and acceleration. Generally, an IMU contains three single-axis accelerometers and three single-axis gyroscopes: the accelerometers detect the acceleration signals of the object along the three independent axes of the carrier coordinate system, while the gyroscopes detect the angular velocity signals of the carrier relative to the navigation coordinate system. By measuring an object's angular velocity and acceleration in three-dimensional space and solving for its posture, the IMU has very important application value in navigation.
While the robot is running, if it encounters steps or uneven ground, the anti-drop sensor detects the ground height difference and adjusts the robot's forward direction when that difference exceeds a preset height, preventing the robot from falling. If the robot detects an obstacle within the safe distance during operation, it decelerates according to the three-axis attitude angles and acceleration from the inertial measurement unit, avoiding hitting the obstacle at excessive speed. If the robot collides, the collision sensor detects the collision signal and sends it to the positioning and navigation device, so the robot stops moving in time and further damage is reduced.
In some embodiments, the positioning and navigation device further includes: a collision switch configured to open the circuit after the robot collides so that the robot stops moving; and an emergency stop module configured to disconnect the circuit's power supply so as to stop the robot's movement.
Figure 3 is a system block diagram of the robot. The system includes a human-computer interaction layer, hardware modules, core algorithms, and a chassis with external equipment. The human-computer interaction layer includes the visualization software module and an SDK (software development kit); the hardware modules sit on an ARM (Advanced RISC Machines) hardware platform; the functions of the core algorithms include navigation and motion control, positioning, and mapping; the chassis includes the driving device; and the external equipment includes the environment sensing device.
The hardware module receives task instructions issued by the human-computer interaction layer through a communication interface; the core algorithm presets an obstacle safety distance, analyzes and executes the corresponding task information, and returns analysis data to the visualization software module to display the task status and progress. The hardware module exchanges data with the motor drive module in the driving device through a serial port and sends the navigation information produced by the core algorithm to the motor drive module so that it controls the robot's movement. As the robot starts moving, the autonomous navigation control module receives the environment data collected by the environment sensing device and updates the navigation information in real time according to that data to control the robot's movement until the navigation task ends.
Figure 4 shows a functional diagram of the system of the present disclosure; each module in the figure is described in detail below.
In some embodiments, the positioning and navigation device includes a positioning module and a navigation module, connected to each other. The positioning module is configured to perform pose detection and relocation; the navigation module is configured to perform obstacle avoidance, speed control, path smoothness control, path deduction, and 3D obstacle detection.
The positioning module is configured to detect the mobile robot's pose in the map and judge whether the positioning information is accurate; when the positioning information is wrong, the mobile robot is repositioned according to a preset relocation algorithm to obtain the correct pose. This provides pose-loss protection, improves the robustness of the mapping and positioning algorithms, and extends the validity period of the map.
In some embodiments, pose tracking and detection proceeds as follows: given the pose at the previous moment, it tries to track the robot's pose (coordinates and orientation) at the current moment. There are many ways to estimate the robot's relative pose, the most classic being dead reckoning. In dead reckoning, the robot uses internal sensors such as odometers to estimate the current position by computing the displacement of the two wheels relative to the position at the previous moment. Because the algorithm is simple and the sensors are cheap, dead reckoning was often used for robot positioning in the early days. Its basic idea is to compute the robot's relative position from the robot's own sensor information, such as odometer readings; the robot's pose is not corrected by external environment information, so over time wheel slip and similar effects make the accumulated error grow larger and larger, and the method is unsuitable for long-distance pose estimation. Consequently, on top of dead reckoning, the present disclosure adds external sensors to the robot to perceive the surrounding environment and uses Kalman filter technology to estimate the robot's pose. Compared with other filters, the Kalman filter is an optimal recursive estimation algorithm with a very simple principle: the filter only needs to know the mean and variance of the noise to solve iteratively.
The navigation module is configured to avoid obstacles during navigation, control the travel speed, evaluate the smoothness of the robot's motion with the Floyd path smoothing algorithm, deduce the robot's real-time trajectory from the parsed navigation information, and detect 3D obstacles from point cloud data. The user can set the minimum size of the area the robot passes through, preventing the robot from getting stuck.
After the robot receives a task instruction sent by the human-computer interaction device, the positioning module obtains the destination address and at the same time locates the robot itself. The positioning process is: detect the mobile robot's pose in the map and judge whether the positioning information is accurate; when the positioning information is wrong, reposition the mobile robot according to the preset relocation algorithm to obtain the correct pose. The positioning information is then sent to the navigation module, which performs path navigation accordingly: it avoids obstacles during navigation, preventing collisions that would damage the robot and shorten its life; it controls the travel speed so that the robot is never moving too fast to decelerate in time before an obstacle; it evaluates the smoothness of the robot's motion with the Floyd path smoothing algorithm; it deduces the robot's real-time trajectory from the parsed navigation information; and it detects 3D obstacles from point cloud data.
Navigation algorithms can include fuzzy algorithms, neural network algorithms, fuzzy neural networks, genetic algorithms, and evolutionary neural networks. Navigation methods include inertial navigation, magnetic navigation, visual navigation, navigation based on sensor data, satellite navigation, and so on. The navigation principle is as follows: the robot adopts a combined lidar, encoder, and IMU positioning system. Two encoders are mounted on the robot's axles and record the wheels' travel distance in real time, while the heading of the vehicle body during travel is determined by the IMU. A set of ultrasonic sensors is arranged around the vehicle body to detect the various obstacles in the work area. The robot starts from a base point and travels along the planned trajectory; the encoders and the IMU measure the real-time left and right wheel rotations and the heading while the robot travels, and a data acquisition system records these data. From them, the distances traveled by the left and right wheels of the vehicle body in a unit time can each be computed.
In some embodiments, the method of detecting 3D obstacles is: determine the second 2D feature vectors corresponding to the 2D image regions, in the 2D image, of the current obstacles detected from the 3D point cloud and 2D image of the current frame; compare each of the second 2D feature vectors with each first 2D feature vector in an obstacle feature vector set to obtain multiple difference feature vectors, where the set stores the first 2D feature vectors characterizing previously detected obstacles; perform deep-learning computation on the multiple difference feature vectors to generate multiple corresponding probability values, each indicating the probability that a current obstacle and a previous obstacle are the same obstacle; and determine the correspondence between the current obstacles and the previous obstacles according to the multiple probability values, thereby achieving obstacle tracking.
In some embodiments, the positioning and navigation device includes a mapping module configured to perform graph-optimization loop-closure processing and to construct a map.
The present disclosure rapidly optimizes the accumulated pose error through a loop-closure optimization model combined with the g2o graph optimization method, further improving the accuracy and efficiency of autonomous positioning; this raises the autonomous positioning and mapping accuracy of mobile robots while maintaining good real-time performance. The positioning and navigation device is also configured to construct a map to assist the robot's positioning and navigation, and to process the map's edge lines to make them smoother.
In some embodiments, the constructed map can take various forms, including grid maps and topological maps; the present disclosure describes the construction of a topological map.
A topological map describes the environment through nodes and the connection relationships between them, without requiring very precise geometric information. The topological map is built on the basis of the grid map: after the grid map is split, it can be divided into different blocks, and these blocks together generate a graph in which the nodes correspond to the adjacent regions connected by arcs in the block map. This graph is the topological map of the environment, so a topological map is defined as a graph data structure. A topological map uses nodes to represent important location points in the environment, while the path information in the environment is represented by the lines between nodes; in this way the robot can express navigation between two nodes through some intermediate nodes. The construction of the topological map can be divided into the following steps: first, rasterization; second, construction of the Voronoi (Thiessen polygon) diagram; third, search for key points; fourth, search for key routes; and fifth, construction of the topological graph.
In some embodiments, the positioning and navigation device includes a security module and a micro-control unit, connected to each other. The security module is configured for main-controller heartbeat detection and ultrasonic collision detection, and to set up a lidar fence; the micro-control unit is configured to control the movement of the driving device.
The security module receives information from a sending unit; if no information is received within a preset threshold, the connection with the sending unit is considered broken and corresponding measures are taken. It is also used to set up a virtual wall and to detect the round-trip time of ultrasonic waves so as to calculate the distance to an obstacle.
For the micro-control unit, the present disclosure can use a C51+AVR control board, a single-chip microcomputer control board with both C51 and AVR functions; the C51 part uses an AT89S52 microcontroller and the AVR part uses an ATmega8 microcontroller. The micro-control unit is configured to receive task instructions sent by the human-computer interaction device and control the movement of the driving device accordingly, and to receive environment data sent by the environment detection device and control the movement of the driving device according to that data.
In some embodiments, the positioning and navigation device includes a monitoring module configured for battery-level detection, positioning-accuracy detection, and safety early warning.
The monitoring module is configured to detect the battery level and display the remaining power in the visualization software, and to check whether the robot's positioning is accurate and send the result to the positioning module. It can also be configured for safety early warning, for example providing an all-around warning function: once an object appears within the warning range, the robot immediately slows down to avoid a hard collision.
In some embodiments, the human-computer interaction device includes a visualization software module and a software development kit module, each connected to the positioning and navigation device. The visualization software module is configured to set inspection points, manage maps, set modes, and perform real-time monitoring; the software development kit module is configured to send mapping instructions and navigation instructions, to set virtual walls, and to obtain positions.
The essence of the visualization software module is to use the display of a computer monitor to simulate the control panel of a traditional instrument and to express and output the test results in various forms; to use the computer's powerful software capabilities for the calculation, analysis, and processing of signals; and to use I/O interface devices to complete signal acquisition and conditioning, thereby forming a computer test system with various test functions. The user operates the virtual panel with a mouse, keyboard, or touch screen, just as if using a dedicated measuring instrument, to achieve the required measurement goal.
The visualization software module is configured to let the user set information and view the robot's status. In some embodiments, the user can set inspection points that the robot inspects regularly, making inspection unmanned and automated. The user can set the position of the charging station. The user can manage maps, for example naming each area, and can also set the robot's working mode. The visualization software module can display the robot's running status, so that the user conveniently knows whether the robot is, for example, working or charging, and can also display the robot's current position so that the user knows where the robot is.
The user sets robot tasks through the visualization software module: for example, first manage the map, such as naming each area, and then set the inspection points. The visualization software module sends the task instruction to the positioning and navigation device, which performs positioning and navigation according to the instruction and feeds the robot's running status back to the visualization software module so that the user can view it.
An SDK is a collection of development tools used by software engineers to build application software for specific software packages, software frameworks, hardware platforms, operating systems, and the like; it may include components for communicating with a particular embedded system and utility tools for debugging and other purposes. An SDK often also includes sample code, supporting technical notes, or other supporting documents that clarify questions in the basic reference materials, and it may carry a license that prevents software from being developed under an incompatible license.
The software development kit module supports secondary development by users. It is configured to send map-construction instructions and navigation instructions and to obtain the robot's current position and destination, so that the robot navigates to the destination according to the constructed map; it can also issue speed commands so the robot decelerates in time to avoid colliding with obstacles and being damaged; navigation parameters can be set; and virtual walls can be set up, making it convenient to designate no-entry areas for the robot and protect the robot and user privacy. Sensor data can be obtained, including the data collected by the anti-drop sensor, inertial measurement unit, collision sensor, infrared sensor, ultrasonic sensor, lidar, and depth camera; the robot can also be set to recharge automatically when the battery level is not higher than a preset level.
In some embodiments, the positioning and navigation device includes a recording module configured to record working data and send it to the human-computer interaction device, where the working data includes running data, abnormal data, and operation data.
The recording module records the data generated while the robot runs and sends it to the control module so that the human-computer interaction device can view it; the data includes the robot's running data, abnormal data, and operation data.
The robot can be adapted to a variety of wheel configurations, including two-wheel differential, front single steering wheel, rear-drive single steering wheel, four-wheel-drive Mecanum wheels, dual steering wheels, and three-wheel omnidirectional.
The robot includes a variety of network functions, such as firewall, NTP (Network Time Protocol), DHCP (Dynamic Host Configuration Protocol), router, port mapping, and 4G, facilitating operation under different networks.
The robot also supports cloud-based and OTA (over-the-air technology) online updates, making it easy to update the robot system in time and improving the user experience.
The present disclosure also provides a positioning and navigation method. The navigation process is as follows: the robot receives, through the human-computer interaction device, operation information sent by the user and controls the movement of the driving device accordingly; while moving, it builds a map from the environment data detected by the ultrasonic sensors, lidar, and depth camera, performs positioning and navigation using the infrared sensor, anti-drop sensor, inertial measurement unit, and collision sensor, and, according to the positioning and navigation information, makes the driving device change its direction and speed of movement, realizing automatic positioning and navigation.
It should be noted that, in this document, relational terms such as "first" and "second" are used only to distinguish one entity or operation from another and do not necessarily require or imply any such actual relationship or order between these entities or operations. Moreover, the terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a list of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes the element.
The above are only specific implementations of the present disclosure, enabling those skilled in the art to understand or implement the present disclosure. Various modifications to these embodiments will be obvious to those skilled in the art, and the general principles defined herein can be implemented in other embodiments without departing from the spirit or scope of the present disclosure. Therefore, the present disclosure is not limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

  1. A robot system, comprising: a positioning and navigation device, a driving device, a control device, a human-computer interaction device, and an environment sensing device, wherein the environment sensing device includes at least an ultrasonic sensor, a lidar, and a depth camera, and the positioning and navigation device is connected to the driving device, the control device, the human-computer interaction device, the ultrasonic sensor, the lidar, and the depth camera respectively;
    the driving device is configured to provide driving force;
    the control device is configured to control the robot;
    the human-computer interaction device is configured to send operation instructions to the positioning and navigation device and to receive feedback information from the positioning and navigation device;
    the environment sensing device is configured to receive data-collection instructions sent by the positioning and navigation device and to send the collected environment data to the positioning and navigation device; and
    the positioning and navigation device is configured to construct a map and perform path planning.
  2. The system of claim 1, wherein the environment sensing device further includes an infrared sensor, an anti-drop sensor, an inertial measurement unit, and a collision sensor, and the positioning and navigation device is connected to the infrared sensor, the anti-drop sensor, the inertial measurement unit, and the collision sensor respectively;
    the infrared sensor is configured to perform infrared ranging;
    the anti-drop sensor is configured to detect the ground height and adjust the robot's forward direction based on the detected ground height;
    the inertial measurement unit is configured to measure the robot's three-axis attitude angles and acceleration; and
    the collision sensor is configured to detect a collision signal and transmit the collision signal to the positioning and navigation device.
  3. The system of claim 1 or 2, wherein the positioning and navigation device includes a positioning module and a navigation module, the positioning module being connected to the navigation module;
    the positioning module is configured to perform pose detection and relocation; and
    the navigation module is configured to perform obstacle avoidance, speed control, path smoothness control, path deduction, and 3D obstacle detection.
  4. The system of any one of claims 1 to 3, wherein the positioning and navigation device includes a mapping module;
    the mapping module is configured to perform graph-optimization loop-closure processing and to construct a map.
  5. The system of any one of claims 1 to 4, wherein the positioning and navigation device includes a security module and a micro-control unit, the security module being connected to the micro-control unit;
    the security module is configured for main-controller heartbeat detection and ultrasonic collision detection, and to set up a lidar fence; and
    the micro-control unit is configured to control the movement of the driving device.
  6. The system of any one of claims 1 to 5, wherein the positioning and navigation device includes a monitoring module;
    the monitoring module is configured for battery-level detection, positioning-accuracy detection, and safety early warning.
  7. The system of any one of claims 1 to 6, wherein the human-computer interaction device includes a visualization software module and a software development kit module, each connected to the positioning and navigation device;
    the visualization software module is configured to set inspection points, manage maps, set modes, and perform real-time monitoring; and
    the software development kit module is configured to send mapping instructions and navigation instructions, to set virtual walls, and to obtain positions.
  8. The system of any one of claims 1 to 7, wherein the positioning and navigation device includes:
    a collision switch configured to open the circuit after the robot collides; and
    an emergency stop module configured to disconnect the circuit's power supply so as to stop the robot's movement.
  9. The system of any one of claims 1 to 8, wherein the positioning and navigation device includes:
    a recording module configured to record working data and send the working data to the human-computer interaction device;
    wherein the working data includes running data, abnormal data, and operation data.
  10. A positioning and navigation method applied to the robot system of any one of claims 1 to 9, the method comprising:
    receiving, through the human-computer interaction device, operation information sent by a user;
    controlling the movement of the driving device according to the operation information;
    detecting environment data through the environment sensing device; and
    performing positioning and navigation processing through the positioning and navigation device according to the environment data.
PCT/CN2021/100278 2020-06-18 2021-06-16 Robot system and positioning and navigation method WO2021254367A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010560848.5A CN111531549A (zh) 2020-06-18 2020-06-18 Robot system and positioning and navigation method
CN202010560848.5 2020-06-18

Publications (1)

Publication Number Publication Date
WO2021254367A1 true WO2021254367A1 (zh) 2021-12-23

Family

ID=71971180

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/100278 WO2021254367A1 (zh) Robot system and positioning and navigation method 2020-06-18 2021-06-16

Country Status (2)

Country Link
CN (1) CN111531549A (zh)
WO (1) WO2021254367A1 (zh)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114518102A (zh) * 2022-02-14 2022-05-20 中建八局第三建设有限公司 Building measurement method and system for realizing interaction between a measuring instrument and a mobile robot
CN114526725A (zh) * 2022-02-21 2022-05-24 山东新一代信息产业技术研究院有限公司 Hyper-converged navigation system based on a system-on-chip
CN114577206A (zh) * 2022-03-08 2022-06-03 宁波诺丁汉大学 Ultrasound-based indoor inertial-navigation mapping method and system
CN114856422A (zh) * 2022-05-07 2022-08-05 中国矿业大学 Fully autonomous mobile chassis control system and control method for a drilling robot
CN115793649A (zh) * 2022-11-29 2023-03-14 硕能(上海)自动化科技有限公司 Automatic cable-trench inspection device and inspection method

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111531549A (zh) 2020-06-18 2020-08-14 北京海益同展信息科技有限公司 Robot system and positioning and navigation method
CN112220399A (zh) 2020-09-04 2021-01-15 南京蹑波物联网科技有限公司 Global positioning system, intelligent sweeping robot having the same, and working method thereof
CN111966109B (zh) 2020-09-07 2021-08-17 中国南方电网有限责任公司超高压输电公司天生桥局 Positioning method and device for an inspection robot based on the valve hall of a flexible DC converter station
CN112305547B (zh) 2020-10-20 2022-05-13 山东新一代信息产业技术研究院有限公司 Robot anti-fall detection method
CN112520299A (zh) 2020-11-23 2021-03-19 山东建筑大学 Steel-pipe warehouse in/out management and guidance system and ground-rail inspection robot
CN112604211A (zh) 2020-12-16 2021-04-06 北京中电飞华通信有限公司 Fire-fighting robot for substations and fire-extinguishing system of the fire-fighting robot
CN113485381A (zh) 2021-08-24 2021-10-08 山东新一代信息产业技术研究院有限公司 Multi-sensor-based robot movement system and method
CN114720663A (zh) 2022-04-25 2022-07-08 国网陕西省电力有限公司电力科学研究院 Laser, photosensitive, and ultrasonic three-mode fusion measuring-pin device and method
CN116360466B (zh) 2023-05-31 2023-09-15 天津博诺智创机器人技术有限公司 Depth-camera-based robot operation obstacle-avoidance system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110231061A1 (en) * 2009-09-17 2011-09-22 Reeve David R Gnss integrated multi-sensor control system and method
CN108073167A (zh) * 2016-11-10 2018-05-25 深圳灵喵机器人技术有限公司 Positioning and navigation method based on a depth camera and a lidar
CN109828587A (zh) * 2019-03-08 2019-05-31 南京康尼智控技术有限公司 Obstacle avoidance system and obstacle avoidance method
CN109917786A (zh) * 2019-02-04 2019-06-21 浙江大学 Robot perception system for operation in complex environments and method of operating the system
KR20200011344A (ko) * 2018-06-28 2020-02-03 바이두 유에스에이 엘엘씨 Autonomous driving vehicle equipped with redundant ultrasonic radar
CN111531549A (zh) * 2020-06-18 2020-08-14 北京海益同展信息科技有限公司 Robot system and positioning and navigation method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20000011344U (ko) * 1998-12-01 2000-07-05 구자홍 Knob structure for a microwave oven

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110231061A1 (en) * 2009-09-17 2011-09-22 Reeve David R Gnss integrated multi-sensor control system and method
CN108073167A (zh) * 2016-11-10 2018-05-25 深圳灵喵机器人技术有限公司 Positioning and navigation method based on a depth camera and a lidar
KR20200011344A (ko) * 2018-06-28 2020-02-03 바이두 유에스에이 엘엘씨 Autonomous driving vehicle equipped with redundant ultrasonic radar
CN109917786A (zh) * 2019-02-04 2019-06-21 浙江大学 Robot perception system for operation in complex environments and method of operating the system
CN109828587A (zh) * 2019-03-08 2019-05-31 南京康尼智控技术有限公司 Obstacle avoidance system and obstacle avoidance method
CN111531549A (zh) * 2020-06-18 2020-08-14 北京海益同展信息科技有限公司 Robot system and positioning and navigation method

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114518102A (zh) * 2022-02-14 2022-05-20 中建八局第三建设有限公司 Building measurement method and system for realizing interaction between a measuring instrument and a mobile robot
CN114518102B (zh) * 2022-02-14 2023-01-03 中建八局第三建设有限公司 Building measurement method and system for realizing interaction between a measuring instrument and a mobile robot
CN114526725A (zh) * 2022-02-21 2022-05-24 山东新一代信息产业技术研究院有限公司 Hyper-converged navigation system based on a system-on-chip
CN114526725B (zh) * 2022-02-21 2023-11-24 山东新一代信息产业技术研究院有限公司 Hyper-converged navigation system based on a system-on-chip
CN114577206A (zh) * 2022-03-08 2022-06-03 宁波诺丁汉大学 Ultrasound-based indoor inertial-navigation mapping method and system
CN114577206B (zh) * 2022-03-08 2023-10-31 宁波诺丁汉大学 Ultrasound-based indoor inertial-navigation mapping method and system
CN114856422A (zh) * 2022-05-07 2022-08-05 中国矿业大学 Fully autonomous mobile chassis control system and control method for a drilling robot
CN115793649A (zh) * 2022-11-29 2023-03-14 硕能(上海)自动化科技有限公司 Automatic cable-trench inspection device and inspection method
CN115793649B (zh) * 2022-11-29 2023-09-01 硕能(上海)自动化科技有限公司 Automatic cable-trench inspection device and inspection method

Also Published As

Publication number Publication date
CN111531549A (zh) 2020-08-14

Similar Documents

Publication Publication Date Title
WO2021254367A1 (zh) Robot system and positioning and navigation method
Gao et al. Review of wheeled mobile robots’ navigation problems and application prospects in agriculture
AU2019210565B2 (en) Moving robot, method for controlling moving robot, and moving robot system
CN111522339A (zh) Automatic path planning and positioning method and device for a livestock-house inspection robot
CN111308490B (zh) Indoor positioning and navigation system for a balance vehicle based on single-line lidar
CN114468898B (zh) Robot voice control method and apparatus, robot, and medium
CN109917786A (zh) Robot perception system for operation in complex environments and method of operating the system
CN113189977B (zh) Intelligent navigation path planning system and method for a robot
CN214520204U (zh) Port-area intelligent inspection robot based on a depth camera and lidar
CN108890611A (zh) SLAM-based binocular-vision obstacle-avoidance wheeled robot
CN112518739A (zh) Intelligent autonomous navigation method for reconnaissance by a tracked-chassis robot
CN113093756A (zh) Indoor navigation robot based on laser SLAM on the Raspberry Pi platform
Csaba et al. Mobil robot navigation using 2D LIDAR
WO2018129648A1 (zh) Robot and method thereof for building a map using a depth camera and an obstacle avoidance system
CN110658828A (zh) Autonomous landform detection method and unmanned aerial vehicle
Beom et al. Mobile robot localization using a single rotating sonar and two passive cylindrical beacons
CN113566808A (zh) Navigation path planning method, apparatus, and device, and readable storage medium
CN114527763A (zh) Intelligent inspection system and method based on object detection and SLAM mapping
WO2016158683A1 (ja) Map creation device, autonomous traveling body, autonomous traveling body system, portable terminal, map creation method, map creation program, and computer-readable recording medium
CN116352722A (zh) Multi-sensor-fusion mine inspection and rescue robot and control method thereof
CN208854616U (zh) SLAM-based binocular-vision dynamic obstacle-avoidance wheeled robot
Yee et al. Autonomous mobile robot navigation using 2D LiDAR and inclined laser rangefinder to avoid a lower object
WO2022121392A1 (zh) Parking control method, control system, mobile robot, and storage medium
EP2836853B1 (en) Apparatus and method for determining reference elements of an environment
Park et al. Multilevel localization for mobile sensor network platforms

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21825509

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21825509

Country of ref document: EP

Kind code of ref document: A1