CN108710376A - Mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion - Google Patents

Mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion

Info

Publication number
CN108710376A
CN108710376A
Authority
CN
China
Prior art keywords
servo
module
chassis body
sensor
mobile robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810623638.9A
Other languages
Chinese (zh)
Inventor
刘玉斌
朱文浩
臧希喆
赵杰
缪文良
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Original Assignee
Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology filed Critical Harbin Institute of Technology
Priority to CN201810623638.9A priority Critical patent/CN108710376A/en
Publication of CN108710376A publication Critical patent/CN108710376A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
        • G05: CONTROLLING; REGULATING
            • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
                • G05D 1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
                    • G05D 1/02: Control of position or course in two dimensions
                        • G05D 1/021: specially adapted to land vehicles
                            • G05D 1/0212: with means for defining a desired trajectory
                                • G05D 1/0214: in accordance with safety or protection criteria, e.g. avoiding hazardous areas
                                • G05D 1/0221: involving a learning process
                                • G05D 1/0223: involving speed control of the vehicle
                            • G05D 1/0231: using optical position detecting means
                                • G05D 1/0238: using obstacle or wall sensors
                                    • G05D 1/024: in combination with a laser
                                • G05D 1/0242: using non-visible light signals, e.g. IR or UV signals
                                • G05D 1/0246: using a video camera in combination with image processing means
                                    • G05D 1/0251: extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
                            • G05D 1/0255: using acoustic signals, e.g. ultra-sonic signals
                            • G05D 1/0257: using a radar
                            • G05D 1/0276: using signals provided by a source external to the vehicle
    • B: PERFORMING OPERATIONS; TRANSPORTING
        • B62: LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
            • B62D: MOTOR VEHICLES; TRAILERS
                • B62D 61/00: Motor vehicles or trailers, characterised by the arrangement or number of wheels, not otherwise provided for, e.g. four wheels in diamond pattern
                    • B62D 61/10: with more than four wheels

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mechanical Engineering (AREA)
  • Optics & Photonics (AREA)
  • Transportation (AREA)
  • Combustion & Propulsion (AREA)
  • Chemical & Material Sciences (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention provides a mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion that improves mapping accuracy and achieves accurate SLAM and obstacle avoidance, and belongs to the field of robot technology. The invention comprises: an environment perception module for collecting environmental data around the chassis body through a laser sensor and a visual sensor; a bottom-layer sensing module for collecting the heading angular displacement of the chassis body and information about surrounding obstacles; a control module for processing the environmental data to build a map, determining the positions of obstacles from the obstacle information, generating motion control commands for the chassis body and sending these commands to the servo drive module to achieve obstacle avoidance, and also for receiving the linear displacement of the chassis body returned by the servo drive module and combining it with the heading angular displacement to obtain the current pose of the chassis body; and a servo drive module for driving the chassis body according to the motion control commands, performing speed control, and acquiring the linear displacement of the chassis body.

Description

Mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion
Technical field
The present invention relates to a mobile chassis for carrying a mobile dual-arm robot, and in particular to a mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion, and belongs to the field of robot technology.
Background technology
In homes or factory workshops the environment is complex, aisles are narrow and crowded, and goods are placed irregularly, so a mobile chassis working in such environments must be highly maneuverable. Common chassis drive schemes include differential drive and omnidirectional drive with Mecanum wheels: the former can achieve a zero turning radius, while the latter can move flexibly with three planar degrees of freedom, but its shock absorption and load-bearing capacity are poor, which makes it unsuitable for a mobile chassis that carries manipulators. Considering tasks such as grasping and handling, the chassis also needs sufficient load capacity and anti-tipping ability so that it can move safely and steadily. Furthermore, people or other transport vehicles may be present in such workspaces, which are therefore dynamic working environments; both the static and dynamic obstacles in them, as well as obstacles that may exist at the height of the manipulator's motion, must be taken into account, so the chassis needs the ability to interact with a dynamic environment.
A mobile robot gains an understanding of the real world by building a map of its surroundings, and localization and navigation in the real world require a detailed and accurate map. Lidar, as a sensor that matured early, offers high accuracy, good stability and long detection range, and has always been the first choice for map building. However, lidar can only acquire data in a two-dimensional plane and build a two-dimensional map, which ignores the height information of obstacles; it therefore cannot detect abrupt environmental changes, which can cause navigation and obstacle avoidance to fail, and when the obstacles in the environment are uncertain, local path planning cannot reach the required real-time performance. A depth camera, on the other hand, can detect the environmental information within its field of view and obtain raw data in the form of dense point clouds, but using it alone as the SLAM (simultaneous localization and mapping) sensor strongly affects the real-time performance of the algorithm and increases the sensing load. Using encoder data alone for pose estimation of the mobile chassis is affected by long-term accumulated encoder drift, which leads to low mapping accuracy over large areas.
Summary of the invention
In view of the above deficiencies, the present invention provides a mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion that improves mapping accuracy and achieves accurate obstacle avoidance.
A mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion according to the present invention comprises a mobile robot chassis body, a control module, a servo drive module, an environment perception module and a bottom-layer sensing module;
the environment perception module is used to collect environmental data around the mobile robot chassis body through a laser sensor and a visual sensor;
the bottom-layer sensing module is used to collect the current heading angular displacement of the mobile robot chassis body and information about surrounding obstacles;
the control module is connected to the environment perception module, the bottom-layer sensing module and the servo drive module respectively; it processes the environmental data collected by the environment perception module to build a map, determines the positions of obstacles according to the obstacle information collected by the bottom-layer sensing module, and generates motion control commands for the mobile robot chassis body, which are sent to the servo drive module to achieve obstacle avoidance; it also receives the linear displacement of the mobile robot chassis body returned by the servo drive module and combines it with the heading angular displacement of the current mobile robot chassis body collected by the bottom-layer sensing module to obtain the current pose of the mobile robot chassis body;
the servo drive module drives the mobile robot chassis body according to the motion control commands, performs speed control, and at the same time acquires the linear displacement of the mobile robot chassis body.
Preferably, the control module obtains a local laser map from the environmental data collected by the laser sensor and a three-dimensional local environment map from the environmental data collected by the visual sensor, converts the three-dimensional local environment map into a projected environment map using the Bayesian formula, and fuses the local laser map with the projected environment map to obtain a global grid map.
Preferably, the bottom-layer sensing module comprises an inertial measurement unit, ultrasonic obstacle-avoidance sensors, safety touch-edge sensors, an infrared ranging sensor and a microprocessor;
the inertial measurement unit detects the pose of the mobile robot chassis body and sends it to the microprocessor;
the ultrasonic obstacle-avoidance sensors measure the distances to obstacles in front of, to the left of and to the right of the mobile robot chassis body and send them to the microprocessor;
the safety touch-edge sensors detect whether there is contact with an obstacle and send this information to the microprocessor;
the infrared ranging sensor detects the distance between the bottom surface of the mobile robot chassis body and the ground and sends it to the microprocessor;
the microprocessor obtains the current heading angular displacement of the mobile robot chassis body and the obstacle information from the received data, converts them into switching signals and sends them to the control module.
Preferably, the mobile robot chassis body comprises a chassis frame, two independent suspension devices and six wheels; the six wheels include four universal wheels and two driving wheels, wherein the four universal wheels are fixed to the bottom of the chassis frame and the two driving wheels are each connected to the bottom of the chassis frame through one independent suspension device.
Preferably, the servo drive module comprises a servo controller, two servo drivers, two servo motors and two encoders; each servo driver is connected in series with one servo motor, one encoder and one driving wheel in turn, forming a mechatronic unit.
The speed control commands issued by the servo controller are sent over the EtherCAT bus to the No. 1 servo driver, which performs motion control of its driving wheel through the connected servo motor; at the same time, the No. 1 servo driver forwards the speed commands to the No. 2 servo driver, which likewise performs motion control of its driving wheel through the connected servo motor.
The two encoders measure the travel of their respective servo motors and send the encoder-disc data to the servo drivers to which they are connected; the encoder-disc data of the No. 1 servo driver are sent to the No. 2 servo driver, and the No. 2 servo driver sends the received encoder-disc data to the servo controller.
Preferably, the linear displacement of the mobile robot chassis body includes the longitudinal and lateral linear displacements of the mobile robot chassis body;
odometry counts are calculated from the encoder-disc data of the encoders, and the servo controller calculates the longitudinal and lateral linear displacements of the mobile robot chassis body from the odometry counts.
Preferably, the laser sensor transmits the collected environmental data to the control module through an Ethernet communication interface, while the visual sensor transmits the collected environmental data to the control module through a USB communication interface.
The above technical features may be combined in various suitable ways or replaced by equivalent technical features, as long as the purpose of the present invention can be achieved.
The beneficial effects of the present invention are as follows: the present invention builds a map from the environmental data around the chassis collected by the laser sensor and the visual sensor, which enhances the perception of the external environment by the mobile robot chassis body, provides sufficient data for navigation or task execution, and improves mapping accuracy; at the same time, the information collected by the bottom-layer sensing module and the linear displacement of the mobile robot chassis body returned by the servo drive module are used to determine the positions of obstacles and the pose of the mobile robot chassis body, thereby achieving localization and obstacle avoidance and increasing safety.
Description of the drawings
Fig. 1 is a schematic diagram of the mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion according to the present invention;
Fig. 2 is a schematic diagram of the pose estimation of the present invention;
Fig. 3 is a schematic diagram of the detection ranges of the lidar and the depth camera;
Fig. 4 is a structural schematic diagram of the top of the chassis frame;
Fig. 5 is a structural schematic diagram of the bottom of the chassis frame;
Fig. 6 is a structural schematic diagram of a driving wheel;
Fig. 7 is an exploded view of a driving wheel;
Fig. 8 is a diagram of the motion model of the two-wheel differential drive chassis;
Fig. 9 is a flow diagram of the control module fusing the data of the laser sensor and the visual sensor.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings in the embodiments of the present invention. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
It should be noted that, as long as there is no conflict, the embodiments of the present invention and the features in the embodiments may be combined with each other.
The present invention is further described below with reference to the drawings and specific embodiments, which, however, do not limit the present invention.
As shown in Fig. 1, a mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion according to the present invention comprises a mobile robot chassis body, a control module, a servo drive module, an environment perception module and a bottom-layer sensing module;
the environment perception module is used to collect environmental data around the mobile robot chassis body through a laser sensor and a visual sensor;
the bottom-layer sensing module is used to collect the current heading angular displacement of the mobile robot chassis body and information about surrounding obstacles;
the control module is connected to the environment perception module, the bottom-layer sensing module and the servo drive module respectively; it processes the environmental data collected by the environment perception module to build a map, determines the positions of obstacles according to the obstacle information collected by the bottom-layer sensing module, and generates motion control commands for the mobile robot chassis body, which are sent to the servo drive module to achieve obstacle avoidance; it also receives the linear displacement of the mobile robot chassis body returned by the servo drive module and combines it with the heading angular displacement of the current mobile robot chassis body collected by the bottom-layer sensing module to obtain the current pose of the mobile robot chassis body;
the servo drive module drives the mobile robot chassis body according to the motion control commands, performs speed control, and at the same time acquires the linear displacement of the mobile robot chassis body.
This embodiment builds a map from the environmental data around the chassis collected by the laser sensor and the visual sensor, which enhances the perception of the external environment by the mobile robot chassis body, provides sufficient data for navigation or task execution, and improves mapping accuracy; at the same time, the information collected by the bottom-layer sensing module and the linear displacement of the mobile robot chassis body returned by the servo drive module are used to determine the positions of obstacles and the pose of the mobile robot chassis body, thereby achieving localization and obstacle avoidance and increasing safety.
In a preferred embodiment, as shown in Fig. 2, the control module of this embodiment obtains a local laser map from the environmental data collected by the laser sensor and a three-dimensional local environment map from the environmental data collected by the visual sensor, converts the three-dimensional local environment map into a projected environment map using the Bayesian formula, and fuses the local laser map with the projected environment map to obtain a global grid map.
The laser sensor of this embodiment is implemented with a planar lidar with a 270° detection angle, and the visual sensor is implemented with a depth camera; together they perceive the external environment around the chassis body. The lidar can detect object features within a range of 0.1-10 m in the scanning plane and provides data for the map building and navigation of the chassis body. The information collected by the depth camera includes image information and infrared depth information, so existing three-dimensional obstacles can be recognized and dynamic objects such as pedestrians can also be identified; this allows the chassis body to detect obstacles of a certain height in three-dimensional space and provides data to support navigation and interaction of the mobile chassis in a dynamic environment. The detection ranges of the lidar and the depth camera are shown in Fig. 3.
In this embodiment, the data of the laser sensor are sent to the control module through Ethernet, and the data of the visual sensor are transmitted to the control module through a USB communication interface. To compensate for the fact that the lidar can only build a two-dimensional map and ignores the height information of obstacles in the environment, which prevents the mobile chassis from detecting abrupt environmental changes and causes navigation and obstacle avoidance to fail, the data of the laser sensor and the visual sensor are fused to provide complete environmental information; this enlarges the vertical observation region of the SLAM algorithm, increases the robustness of the algorithm and improves positioning accuracy. The Bayesian iteration formula is used to build a grid map that fuses the depth camera and the lidar: the data update is completed from the prior model and the most recent conditional probability estimate, and the grid states, data associations and the Bayesian formula are used again to obtain a new estimate, thus completing the data fusion process.
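For illustration only, the following minimal Python sketch shows a Bayesian (log-odds) occupancy update of the kind described above, in which a lidar observation and a projected depth-camera observation are fused cell by cell; the function names, inverse sensor model and probability values are assumptions chosen for the example and are not taken from the patent.
```python
import numpy as np

def logit(p):
    """Convert a probability to log-odds."""
    return np.log(p / (1.0 - p))

def fuse_observations(prior, laser_hits, depth_hits, p_hit=0.7, p_miss=0.4):
    """Bayesian occupancy update for a grid map.

    prior        -- 2D array of prior occupancy probabilities
    laser_hits   -- boolean 2D array, True where the lidar saw an obstacle
    depth_hits   -- boolean 2D array, True where the projected depth map saw one
    p_hit/p_miss -- assumed inverse sensor model probabilities
    Returns the posterior occupancy probability per cell.
    """
    log_odds = logit(np.clip(prior, 1e-3, 1 - 1e-3))
    for hits in (laser_hits, depth_hits):
        # Each sensor contributes an independent log-odds increment.
        log_odds += np.where(hits, logit(p_hit), logit(p_miss))
    return 1.0 / (1.0 + np.exp(-log_odds))

# Example: a 3x3 patch where both sensors agree on the centre cell.
prior = np.full((3, 3), 0.5)
laser = np.zeros((3, 3), dtype=bool); laser[1, 1] = True
depth = np.zeros((3, 3), dtype=bool); depth[1, 1] = True
posterior = fuse_observations(prior, laser, depth)
print(posterior[1, 1])  # centre cell probability rises above 0.5
```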
In a preferred embodiment, as shown in Fig. 1, the bottom-layer sensing module of this embodiment comprises an inertial measurement unit, ultrasonic obstacle-avoidance sensors, safety touch-edge sensors, an infrared ranging sensor and a microprocessor;
the inertial measurement unit detects the pose of the mobile robot chassis body and sends it to the microprocessor;
the ultrasonic obstacle-avoidance sensors measure the distances to obstacles in front of, to the left of and to the right of the mobile robot chassis body and send them to the microprocessor;
the safety touch-edge sensors detect whether there is contact with an obstacle and send this information to the microprocessor;
the infrared ranging sensor detects the distance between the bottom surface of the mobile robot chassis body and the ground and sends it to the microprocessor;
the microprocessor obtains the current heading angular displacement of the mobile robot chassis body and the obstacle information from the received data, converts them into switching signals and sends them to the control module.
This embodiment improves the safety of the robot during operation. The ultrasonic obstacle-avoidance sensors, used for emergency obstacle avoidance, measure the distance to obstacles ahead and compensate for the detection blind zone of the lidar and the depth camera in the low space region. The safety touch-edge sensors ensure that, when the other sensors fail or an unforeseen collision occurs, the robot can cushion the impact and transmit the collision signal to the control module, so that the control module responds in time and issues an emergency braking command to avoid a dangerous situation. The infrared ranging sensor is mounted at the bottom of the mobile chassis; when its reading exceeds a limit value, the planned path can be changed in time to avoid damage to the mobile chassis and the manipulators caused by running into a step ahead. Since the outputs of the ultrasonic obstacle-avoidance sensors and the infrared ranging sensor are distance values while the safety touch-edge sensors output switching signals, feeding them directly to the control module would confuse the priority of the control module's threads, and the control module does not have enough interfaces for them; therefore a bottom-layer microprocessor first integrates the sensor information. The ultrasonic obstacle-avoidance sensors are connected to the microprocessor through an IIC interface, which reads the obstacle distances detected by the three ultrasonic sensors in the front, left and right directions; the infrared ranging sensor is connected to the microprocessor through a USART and reports the distance between the mobile chassis and the ground; the safety touch-edge sensors are connected directly to the I/O ports of the microprocessor and output switching signals. The microprocessor then processes the three classes of sensor signals, converts whether an obstacle ahead lies within the safety warning region into a switching signal, and transmits it to the host computer through RS232. A microcontroller of model STM32F407ZGT6 is used as the microprocessor; its hardware resources, such as IIC interfaces, USART interfaces, I/O ports and A/D converters, meet the requirements.
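A minimal illustration of this bottom-layer integration logic is sketched below in Python, run on a host for clarity rather than on the STM32 itself; the read_* stub functions, the thresholds and the status-byte layout are assumptions for the example and are not the patent's firmware interface.
```python
# Hypothetical stand-in for the microcontroller loop: read the three sensor
# classes, reduce them to switching flags, and pack them for the RS232 link.
WARN_DISTANCE_M = 0.35    # assumed safety-warning radius for the ultrasonic sensors
STEP_LIMIT_M = 0.08       # assumed ground-clearance limit for the infrared sensor

def read_ultrasonic():
    """Stub: front/left/right obstacle distances in metres (IIC in firmware)."""
    return {"front": 0.30, "left": 1.20, "right": 0.90}

def read_infrared():
    """Stub: chassis-bottom-to-ground distance in metres (USART in firmware)."""
    return 0.05

def read_touch_edges():
    """Stub: True if any touch-edge strip is pressed (I/O port in firmware)."""
    return False

def build_status_byte():
    """Pack the switching signals into one byte for transmission."""
    us = read_ultrasonic()
    flags = 0
    if us["front"] < WARN_DISTANCE_M: flags |= 0x01   # obstacle ahead
    if us["left"]  < WARN_DISTANCE_M: flags |= 0x02   # obstacle to the left
    if us["right"] < WARN_DISTANCE_M: flags |= 0x04   # obstacle to the right
    if read_infrared() > STEP_LIMIT_M: flags |= 0x08  # step / drop-off ahead
    if read_touch_edges():             flags |= 0x10  # collision contact
    return flags

print(f"status byte: 0x{build_status_byte():02X}")
```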
In this embodiment, odometry data are obtained from the encoder-disc data of the encoders fixed to the motor shaft ends in the servo drive module; from the odometry counts the displacement and heading change of the robot are computed, and thus its current position and heading angle.
The inertial measurement unit (IMU) comprises a three-axis gyroscope, a three-axis accelerometer and a magnetometer. The inertial measurement unit used here is an MPU9250, which contains the required gyroscope, accelerometer and magnetometer. It is connected to the microcontroller through IIC communication for data transfer, so that the IMU data can be read, written and processed. The MPU9250 acts as the slave device on the bus and the microprocessor as the master; IIC communication is carried out over two lines, the data line SDA and the clock line SCL. The microprocessor sends the slave address on the SDA line, and the slave responds after recognizing its own address, after which the microprocessor can read the raw data of the MPU9250 sensors.
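For illustration, a minimal sketch of reading the MPU9250 raw data over the IIC bus is given below, assuming a Linux host with the smbus2 package (on the STM32 the same register reads would go through its hardware I2C peripheral); the register addresses follow the public MPU9250 register map, and scaling and calibration are omitted.
```python
from smbus2 import SMBus

MPU9250_ADDR = 0x68      # I2C address with AD0 pulled low
PWR_MGMT_1   = 0x6B
ACCEL_XOUT_H = 0x3B      # accel X/Y/Z, temperature, gyro X/Y/Z follow in 14 bytes

def to_int16(hi, lo):
    """Combine two register bytes into a signed 16-bit value."""
    value = (hi << 8) | lo
    return value - 65536 if value & 0x8000 else value

with SMBus(1) as bus:
    bus.write_byte_data(MPU9250_ADDR, PWR_MGMT_1, 0x00)   # wake the device
    raw = bus.read_i2c_block_data(MPU9250_ADDR, ACCEL_XOUT_H, 14)
    accel = [to_int16(raw[i], raw[i + 1]) for i in (0, 2, 4)]
    gyro  = [to_int16(raw[i], raw[i + 1]) for i in (8, 10, 12)]
    print("raw accel:", accel, "raw gyro:", gyro)
```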
In a preferred embodiment, as shown in Figs. 5 and 6, the mobile robot chassis body comprises a chassis frame 12, two independent suspension devices 15 and six wheels; the six wheels include four universal wheels 13 and two driving wheels 14, wherein the four universal wheels 13 are fixed to the bottom of the chassis frame 12 and the two driving wheels 14 are each connected to the bottom of the chassis frame 12 through one independent suspension device 15.
The two driving wheels 14 are mounted on both sides of the middle of the chassis frame 12, and the four universal wheels 13 are mounted at the four corners of the chassis frame 12; their mounting height can be adjusted according to the compression of the suspension while maintaining stability, so that the steering of the mobile chassis is more flexible and adapts to different working environments.
As shown in Fig. 4, in this embodiment the top of the chassis frame carries a lithium battery 1, a manipulator mounting base 2, a driver 3, a control module 4, a switch 5, a depth camera 6, a lidar 7, a sensor support 8, a rectifier transformer 9, a wire duct 10 and a power switch 11.
The sensor support 8 is mounted at the edge of the frame and supports the depth camera 6 and the lidar 7, so that the 270° measurement advantage of the lidar can be fully exploited.
The independent suspension device of this embodiment comprises a connector, a spring guide rod, a damping spring and a limit assembly. The connector connects the independent suspension device to the chassis frame and to the limit assembly; the limit mechanism and lock nut in the limit assembly limit the travel of the suspension and also connect the driving wheel; the damping spring, a compression spring mounted between the limit assemblies, adjusts the height of the driving wheel adaptively to the ground and provides buffering and pressure relief; the spring guide rod is fixed in the plane of the limit assembly to constrain the extension and compression direction of the spring.
As shown in Fig. 7, the driving wheel of this embodiment comprises a fixed frame 141, an encoder 142, a bearing 143, a servo motor 144, a No. 1 shaft 145, a sealing ring 146, a bearing 147, a wheel 148, a No. 2 shaft 149, a harmonic gear reducer 150 and a sealing plate 151.
The encoder 142, the servo motor 144, the wheel 148 and the harmonic gear reducer 150 are connected together through the No. 1 shaft 145 and the No. 2 shaft 149, and the wheel 148 is then connected to the frame through the fixed frame 141. The harmonic gear reducer 150 is placed inside the wheel 148, and its flexspline is connected directly to the wheel hub, which eliminates the transmission shaft and reduces the installation volume; at the same time, the motor 144 is a hollow-cup type, which makes the structure more compact.
By adopting a MacPherson-like suspension design, this embodiment allows the wheel on each side to be suspended below the frame by its suspension system, which reduces the impact on the vehicle body and improves the adhesion of the wheels to the ground, so that the whole mobile chassis also has good passability on uneven road surfaces. At the same time, the two driving wheels use a two-wheel independent drive arrangement and bounce independently without interfering with each other, which ensures that the track and wheelbase of the mobile chassis do not change, effectively reduces the roll and vibration of the vehicle body, increases control accuracy, avoids course deviation, and improves the vertical positioning accuracy of the mobile platform; this makes the chassis suitable for carrying dual manipulators and gives it a certain load-bearing and anti-tipping capability.
In a preferred embodiment, the servo drive module of this embodiment comprises a servo controller, two servo drivers, two servo motors and two encoders; each servo driver is connected in series with one servo motor, one encoder and one driving wheel in turn, forming a mechatronic unit.
The speed control commands issued by the servo controller are sent over the EtherCAT bus to the No. 1 servo driver, which performs motion control of its driving wheel through the connected servo motor; at the same time, the No. 1 servo driver forwards the speed commands to the No. 2 servo driver, which likewise performs motion control of its driving wheel through the connected servo motor.
The two encoders measure the travel of their respective servo motors and send the encoder-disc data to the servo drivers to which they are connected; the encoder-disc data of the No. 1 servo driver are sent to the No. 2 servo driver, and the No. 2 servo driver sends the received encoder-disc data to the servo controller.
The control module of this embodiment is implemented with an industrial PC; it mainly processes the information collected by the sensors and controls the servo drive module. The sensor data describe the perceived external environment and the detected state of the robot itself, while control of the servo drive module consists of converting the target body velocity of the robot into the rotational speed of each driving wheel and sending it to the No. 1 servo drive unit through the ModBus communication protocol. Since the mobile chassis uses a two-wheel differential drive system, linear motion and rotation of the vehicle body can be realized simply by changing the speeds of the two driving wheels; motion control therefore converts the target travel speed of the robot into a speed difference between the two driving wheels, and the SLAM algorithm in turn needs the relationship between the two wheel speeds and the linear and angular velocity of the vehicle body. This relationship is derived as follows. Let the target travel speed of the mobile chassis be V_c, and let V_1 and V_2 be the speeds of the centers of the left and right driving wheels; assuming the chassis travels under ideal conditions and effects such as slipping are neglected, the speed of the center of mass is
V_c = (V_1 + V_2) / 2.
When the mobile chassis rotates clockwise about its instantaneous center of velocity, the angular velocity satisfies
ω = V_1 / (R + B/2) = V_2 / (R - B/2),
and therefore the angular velocity of the mobile chassis is
ω = (V_1 - V_2) / B,
where B is the distance between the two driving wheels and R is the distance from the instantaneous center of velocity of the mobile chassis in its current state to the center of mass of the mobile chassis, as shown in Fig. 8.
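A short Python sketch of this two-wheel differential-drive relationship is given below for illustration; it applies the formulas above in both directions (body velocity to wheel speeds and back), uses an assumed track width, and is not code from the patent.
```python
# Differential-drive kinematics from the formulas above:
# V_c = (V1 + V2) / 2,  omega = (V1 - V2) / B  (clockwise positive, V1 = left wheel).
B = 0.5  # assumed track width between the two driving wheels, in metres

def body_to_wheels(v_c, omega, track=B):
    """Target body speed and yaw rate -> left/right wheel speeds (m/s)."""
    v_left = v_c + omega * track / 2.0
    v_right = v_c - omega * track / 2.0
    return v_left, v_right

def wheels_to_body(v_left, v_right, track=B):
    """Left/right wheel speeds -> body speed and yaw rate."""
    v_c = (v_left + v_right) / 2.0
    omega = (v_left - v_right) / track
    return v_c, omega

v1, v2 = body_to_wheels(0.4, 0.2)   # 0.4 m/s forward while turning
print(v1, v2)                       # 0.45, 0.35
print(wheels_to_body(v1, v2))       # (0.4, 0.2) recovered
```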
In a preferred embodiment, the linear displacement of the mobile robot chassis body of this embodiment includes the longitudinal and lateral linear displacements of the mobile robot chassis body;
odometry counts are calculated from the encoder-disc data of the encoders, and the servo controller calculates the longitudinal and lateral linear displacements of the mobile robot chassis body from the odometry counts.
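For illustration, the following Python sketch shows how such odometry counts could be integrated into longitudinal and lateral displacement; the ticks per revolution, wheel radius and track width are assumed example values, not parameters from the patent.
```python
import math

# Dead-reckoning sketch: encoder counts -> planar displacement and heading.
TICKS_PER_REV = 4096   # assumed encoder resolution
WHEEL_RADIUS = 0.08    # assumed wheel radius, m
TRACK = 0.5            # assumed distance between driving wheels, m

def update_pose(x, y, theta, d_ticks_left, d_ticks_right):
    """Integrate one encoder sample into the planar pose (x, y, theta)."""
    d_left = 2 * math.pi * WHEEL_RADIUS * d_ticks_left / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS * d_ticks_right / TICKS_PER_REV
    d_centre = (d_left + d_right) / 2.0
    d_theta = (d_left - d_right) / TRACK
    # Longitudinal/lateral displacement expressed in the world frame.
    x += d_centre * math.cos(theta + d_theta / 2.0)
    y += d_centre * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

pose = (0.0, 0.0, 0.0)
for left, right in [(120, 100), (120, 100), (110, 110)]:
    pose = update_pose(*pose, left, right)
print(pose)
```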
As shown in Fig. 9, the gyroscope of the inertial measurement unit measures the angular velocities about the three axes, from which the change of attitude is obtained by integration, and the accelerometer outputs the three-axis acceleration of the sensor, from which the velocity and displacement of the platform are likewise obtained by integration. Through this solution, complementary filtering and updating, position information is obtained, i.e. the displacement of the mobile robot chassis body in the heading direction; the odometry information obtained from the encoder-disc data yields the longitudinal and lateral linear displacements of the mobile robot chassis body, and the control module estimates the pose of the mobile robot chassis body from this displacement information. The displacement information from the IMU helps to reduce the accumulated error of the odometry calculation. This embodiment uses multi-sensor fusion: fusing the odometry data with the data of the inertial measurement unit improves the pose estimation and positioning accuracy of the mobile chassis.
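The heading part of such a fusion can be illustrated with the minimal complementary-filter sketch below; the blend factor, sample period and sample data are assumptions for the example rather than values from the patent.
```python
# Complementary-filter sketch for the heading estimate: the gyroscope yaw rate
# is integrated for short-term accuracy and blended with the odometry heading,
# which drifts slowly but is not noisy over a single step.
ALPHA = 0.98   # assumed weight given to the integrated gyro heading
DT = 0.02      # assumed sample period, seconds

def fuse_heading(theta_est, gyro_yaw_rate, theta_odom):
    """One complementary-filter update of the heading estimate (radians)."""
    theta_gyro = theta_est + gyro_yaw_rate * DT      # short-term: integrate gyro
    return ALPHA * theta_gyro + (1.0 - ALPHA) * theta_odom

theta = 0.0
samples = [(0.10, 0.001), (0.10, 0.003), (0.09, 0.006)]  # (yaw rate rad/s, odom heading rad)
for yaw_rate, theta_odom in samples:
    theta = fuse_heading(theta, yaw_rate, theta_odom)
print(theta)
```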
Although the present invention is described herein with reference to specific embodiments, it should be understood that these embodiments are merely examples of the principles and applications of the present invention. It should therefore be understood that many modifications may be made to the exemplary embodiments, and that other arrangements may be devised, without departing from the spirit and scope of the present invention as defined by the appended claims. It should also be understood that the features described herein may be combined with dependent claims other than those stated in the original claims, and that features described in connection with separate embodiments may be used in other described embodiments.

Claims (7)

1. A mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion, characterized in that the mobile chassis comprises a mobile robot chassis body, a control module, a servo drive module, an environment perception module and a bottom-layer sensing module;
the environment perception module is used to collect environmental data around the mobile robot chassis body through a laser sensor and a visual sensor;
the bottom-layer sensing module is used to collect the current heading angular displacement of the mobile robot chassis body and information about surrounding obstacles;
the control module is connected to the environment perception module, the bottom-layer sensing module and the servo drive module respectively; it processes the environmental data collected by the environment perception module to build a map, determines the positions of obstacles according to the obstacle information collected by the bottom-layer sensing module, and generates motion control commands for the mobile robot chassis body, which are sent to the servo drive module to achieve obstacle avoidance; it also receives the linear displacement of the mobile robot chassis body returned by the servo drive module and combines it with the heading angular displacement of the current mobile robot chassis body collected by the bottom-layer sensing module to obtain the current pose of the mobile robot chassis body;
the servo drive module drives the mobile robot chassis body according to the motion control commands, performs speed control, and at the same time acquires the linear displacement of the mobile robot chassis body.
2. The mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion according to claim 1, characterized in that the control module obtains a local laser map from the environmental data collected by the laser sensor and a three-dimensional local environment map from the environmental data collected by the visual sensor, converts the three-dimensional local environment map into a projected environment map using the Bayesian formula, and fuses the local laser map with the projected environment map to obtain a global grid map.
3. The mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion according to claim 2, characterized in that the laser sensor transmits the collected environmental data to the control module through an Ethernet communication interface, while the visual sensor transmits the collected environmental data to the control module through a USB communication interface.
4. The mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion according to claim 2, characterized in that the bottom-layer sensing module comprises an inertial measurement unit, ultrasonic obstacle-avoidance sensors, safety touch-edge sensors, an infrared ranging sensor and a microprocessor;
the inertial measurement unit detects the pose of the mobile robot chassis body and sends it to the microprocessor;
the ultrasonic obstacle-avoidance sensors measure the distances to obstacles in front of, to the left of and to the right of the mobile robot chassis body and send them to the microprocessor;
the safety touch-edge sensors detect whether there is contact with an obstacle and send this information to the microprocessor;
the infrared ranging sensor detects the distance between the bottom surface of the mobile robot chassis body and the ground and sends it to the microprocessor;
the microprocessor obtains the current heading angular displacement of the mobile robot chassis body and the obstacle information from the received data, converts them into switching signals and sends them to the control module.
5. The mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion according to claim 4, characterized in that the mobile robot chassis body comprises a chassis frame, two independent suspension devices and six wheels; the six wheels include four universal wheels and two driving wheels, wherein the four universal wheels are fixed to the bottom of the chassis frame and the two driving wheels are each connected to the bottom of the chassis frame through one independent suspension device.
6. The mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion according to claim 5, characterized in that the servo drive module comprises a servo controller, two servo drivers, two servo motors and two encoders; each servo driver is connected in series with one servo motor, one encoder and one driving wheel in turn, forming a mechatronic unit;
the speed control commands issued by the servo controller are sent over the EtherCAT bus to the No. 1 servo driver, which performs motion control of its driving wheel through the connected servo motor; at the same time, the No. 1 servo driver forwards the speed commands to the No. 2 servo driver, which likewise performs motion control of its driving wheel through the connected servo motor;
the two encoders measure the travel of their respective servo motors and send the encoder-disc data to the servo drivers to which they are connected; the encoder-disc data of the No. 1 servo driver are sent to the No. 2 servo driver, and the No. 2 servo driver sends the received encoder-disc data to the servo controller.
7. The mobile chassis for SLAM and obstacle avoidance based on multi-sensor fusion according to claim 6, characterized in that the linear displacement of the mobile robot chassis body includes the longitudinal and lateral linear displacements of the mobile robot chassis body;
odometry counts are calculated from the encoder-disc data of the encoders, and the servo controller calculates the longitudinal and lateral linear displacements of the mobile robot chassis body from the odometry data.
CN201810623638.9A 2018-06-15 2018-06-15 The mobile chassis of SLAM and avoidance based on Multi-sensor Fusion Pending CN108710376A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810623638.9A CN108710376A (en) 2018-06-15 2018-06-15 The mobile chassis of SLAM and avoidance based on Multi-sensor Fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810623638.9A CN108710376A (en) 2018-06-15 2018-06-15 The mobile chassis of SLAM and avoidance based on Multi-sensor Fusion

Publications (1)

Publication Number Publication Date
CN108710376A true CN108710376A (en) 2018-10-26

Family

ID=63871828

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810623638.9A Pending CN108710376A (en) 2018-06-15 2018-06-15 The mobile chassis of SLAM and avoidance based on Multi-sensor Fusion

Country Status (1)

Country Link
CN (1) CN108710376A (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004059900A2 (en) * 2002-12-17 2004-07-15 Evolution Robotics, Inc. Systems and methods for visual simultaneous localization and mapping
CN101504546A (en) * 2008-12-12 2009-08-12 北京科技大学 Children robot posture tracking apparatus
CN101786272A (en) * 2010-01-05 2010-07-28 深圳先进技术研究院 Multisensory robot used for family intelligent monitoring service
CN102662400A (en) * 2012-05-10 2012-09-12 慈溪思达电子科技有限公司 Path planning algorithm of mowing robot
CN105352508A (en) * 2015-10-22 2016-02-24 深圳创想未来机器人有限公司 Method and device of robot positioning and navigation
US20170374342A1 (en) * 2016-06-24 2017-12-28 Isee, Inc. Laser-enhanced visual simultaneous localization and mapping (slam) for mobile devices
CN106182027A (en) * 2016-08-02 2016-12-07 西南科技大学 A kind of open service robot system
CN106276009A (en) * 2016-08-11 2017-01-04 中国科学院宁波材料技术与工程研究所 Omni-mobile transfer robot
CN107065863A (en) * 2017-03-13 2017-08-18 山东大学 A kind of guide to visitors based on face recognition technology explains robot and method
CN107088869A (en) * 2017-04-20 2017-08-25 哈尔滨工业大学 A kind of modularization all directionally movable robot for environment sensing
CN206990800U (en) * 2017-07-24 2018-02-09 宗晖(上海)机器人有限公司 A kind of alignment system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张毅 等: "一种融合激光和深度视觉传感器的SLAM地图创建方法" [A SLAM map construction method fusing laser and depth vision sensors], 《计算机应用研究》 [Application Research of Computers] *

Cited By (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111114292A (en) * 2018-10-31 2020-05-08 深圳市优必选科技有限公司 Chassis structure of mute machine
CN109375629A (en) * 2018-12-05 2019-02-22 苏州博众机器人有限公司 A kind of cruiser and its barrier-avoiding method that navigates
CN109506661A (en) * 2019-01-11 2019-03-22 轻客小觅智能科技(北京)有限公司 A kind of localization method of robot, device, robot and storage medium
CN109613875A (en) * 2019-01-24 2019-04-12 上海思岚科技有限公司 A kind of robot chassis control system
CN109782768A (en) * 2019-01-26 2019-05-21 哈尔滨玄智科技有限公司 A kind of autonomous navigation system adapting to expert's planetary compound gear train transfer robot
CN109596078A (en) * 2019-01-28 2019-04-09 吉林大学 Multi-information fusion spectrum of road surface roughness real-time testing system and test method
CN109917786A (en) * 2019-02-04 2019-06-21 浙江大学 A kind of robot tracking control and system operation method towards complex environment operation
CN109828587A (en) * 2019-03-08 2019-05-31 南京康尼智控技术有限公司 A kind of obstacle avoidance system and barrier-avoiding method
CN113678082A (en) * 2019-03-25 2021-11-19 索尼集团公司 Mobile body, control method for mobile body, and program
CN109940587A (en) * 2019-03-26 2019-06-28 盐城工学院 A kind of robot being loaded with intelligent vehicle chassis system
CN110104070A (en) * 2019-04-28 2019-08-09 北京云迹科技有限公司 Robot chassis and robot
CN110221607A (en) * 2019-05-22 2019-09-10 北京德威佳业科技有限公司 A kind of control system and control method holding formula vehicle access AGV
CN110143396A (en) * 2019-06-27 2019-08-20 广东利元亨智能装备股份有限公司 Intelligent cruise vehicle
CN112223348A (en) * 2019-06-28 2021-01-15 坎德拉(深圳)科技创新有限公司 Indoor distribution robot
CN110286686A (en) * 2019-07-23 2019-09-27 中科新松有限公司 Mobile robot
CN110286685A (en) * 2019-07-23 2019-09-27 中科新松有限公司 A kind of mobile robot
CN110695956A (en) * 2019-10-21 2020-01-17 东北农业大学 STM 32-based forest information acquisition robot
CN110823211A (en) * 2019-10-29 2020-02-21 珠海市一微半导体有限公司 Multi-sensor map construction method, device and chip based on visual SLAM
CN110764511A (en) * 2019-11-13 2020-02-07 苏州大成有方数据科技有限公司 Mobile robot with multi-sensor fusion and control method thereof
CN110861605A (en) * 2019-11-29 2020-03-06 中汽研(常州)汽车工程研究院有限公司 Large-scale vehicle blind area composite monitoring device and method
CN110888443A (en) * 2019-12-04 2020-03-17 上海大学 Rotary obstacle avoidance method and system for mobile robot
WO2021139536A1 (en) * 2020-01-08 2021-07-15 京东数科海益信息科技有限公司 Robot control system
CN111090087A (en) * 2020-01-21 2020-05-01 广州赛特智能科技有限公司 Intelligent navigation machine, laser radar blind area compensation method and storage medium
CN111251271A (en) * 2020-03-17 2020-06-09 青岛大学 SLAM robot for constructing and positioning rotary laser radar and indoor map
CN111251271B (en) * 2020-03-17 2023-02-21 青岛聚远网络科技有限公司 SLAM robot for constructing and positioning rotary laser radar and indoor map
CN111290403A (en) * 2020-03-23 2020-06-16 内蒙古工业大学 Transport method for transporting automated guided vehicle and automated guided vehicle
CN111664843A (en) * 2020-05-22 2020-09-15 杭州电子科技大学 SLAM-based intelligent storage checking method
CN111708368A (en) * 2020-07-07 2020-09-25 上海工程技术大学 Intelligent wheelchair based on fusion of laser and visual SLAM
CN111708368B (en) * 2020-07-07 2023-03-10 上海工程技术大学 Intelligent wheelchair based on fusion of laser and visual SLAM
CN114035561A (en) * 2020-07-29 2022-02-11 四川鼎鸿智电装备科技有限公司 Construction machine
CN112486190A (en) * 2020-10-16 2021-03-12 北京电子工程总体研究所 Comprehensive test system for realizing attitude control
CN112817315A (en) * 2020-12-31 2021-05-18 江苏集萃智能制造技术研究所有限公司 Obstacle avoidance method and system for unmanned cleaning vehicle in dynamic environment
AU2021266203B2 (en) * 2021-01-25 2023-01-19 Shandong Alesmart Intelligent Technology Co., Ltd. Semantic laser-based multilevel obstacle avoidance system and method for mobile robot
CN112859873A (en) * 2021-01-25 2021-05-28 山东亚历山大智能科技有限公司 Semantic laser-based mobile robot multi-stage obstacle avoidance system and method
CN112947426A (en) * 2021-02-01 2021-06-11 南京抒微智能科技有限公司 Cleaning robot motion control system and method based on multi-sensing fusion
CN113534810A (en) * 2021-07-22 2021-10-22 乐聚(深圳)机器人技术有限公司 Logistics robot and logistics robot system
CN113689502A (en) * 2021-09-01 2021-11-23 南京信息工程大学 Multi-information fusion obstacle measuring method
CN113689502B (en) * 2021-09-01 2023-06-30 南京信息工程大学 Multi-information fusion obstacle measurement method
CN114403760A (en) * 2021-12-22 2022-04-29 天津希格玛微电子技术有限公司 Movable carrier positioning method and device and sweeping robot
CN115235036A (en) * 2022-08-03 2022-10-25 合肥工业大学 Mobile air purifier and control method thereof

Similar Documents

Publication Publication Date Title
CN108710376A (en) The mobile chassis of SLAM and avoidance based on Multi-sensor Fusion
CN110262495B (en) Control system and method capable of realizing autonomous navigation and accurate positioning of mobile robot
WO2021254367A1 (en) Robot system and positioning navigation method
JP6868028B2 (en) Autonomous positioning navigation equipment, positioning navigation method and autonomous positioning navigation system
US11216006B2 (en) Robot and method for localizing a robot
CN110174903B (en) System and method for controlling a movable object within an environment
CN202255404U (en) Binocular vision navigation system of indoor mobile robot
US20070150111A1 (en) Embedded network-controlled omni-directional motion system with optical flow based navigation
US11633848B2 (en) Independent pan of coaxial robotic arm and perception housing
CN208953962U (en) A kind of robot tracking control and robot
KR20150038776A (en) Auto parking system using infra sensors
KR20200080421A (en) Disaster relief robot and operating method of thereof
Chen et al. Collision-free UAV navigation with a monocular camera using deep reinforcement learning
Guo et al. Navigation and positioning system applied in underground driverless vehicle based on IMU
CN111376263B (en) Human-computer cooperation system of compound robot and cross coupling force control method thereof
Aref et al. Position-based visual servoing for pallet picking by an articulated-frame-steering hydraulic mobile machine
EP4261113A1 (en) Robotic vehicle and a support assembly for a wheel thereof
CN209674238U (en) A kind of high loading trolley of high speed of more sense fusions
CN112318507A (en) Robot intelligent control system based on SLAM technology
Liu et al. Fuzzy logic-based navigation controller for an autonomous mobile robot
CN111736599A (en) AGV navigation obstacle avoidance system, method and equipment based on multiple laser radars
CN117234203A (en) Multi-source mileage fusion SLAM downhole navigation method
JP2019197241A5 (en)
CN210321764U (en) Positioning navigation system of exhibition room robot
WO2019176278A1 (en) Information processing device, information processing method, program, and mobile body

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20181026)