CN113759787A - Unmanned robot for closed park and working method

Unmanned robot for closed park and working method

Info

Publication number
CN113759787A
Authority
CN
China
Prior art keywords
robot
controller
laser radar
frame
front side
Prior art date
Legal status
Pending
Application number
CN202111065179.5A
Other languages
Chinese (zh)
Inventor
杨鸿城
文浩
吴迪
岳峥嵘
Current Assignee
Wuhan Langyi Robot Co., Ltd.
Original Assignee
Wuhan Langyi Robot Co., Ltd.
Priority date
2021-09-11
Filing date
2021-09-11
Publication date
2021-12-07
Application filed by Wuhan Langyi Robot Co., Ltd.
Priority to CN202111065179.5A
Publication of CN113759787A
Legal status: Pending

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/04 Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0423 Input/output
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/25 Pc structure of the system
    • G05B2219/25257 Microcontroller

Abstract

The invention provides an unmanned robot for a closed park, comprising a frame and travelling wheels arranged below the frame. A front door and a rear door are provided on the front and rear sides of the frame respectively; a front camera and a front laser radar scanning port are also provided on the front side of the frame; a multi-line laser radar, a top single-line laser radar and a human-computer interaction interface are arranged on top of the frame; and the front laser radar is a rotatable single-line laser radar that scans through the front laser radar scanning port. The robot can deliver express parcels, and without any structural change it can serve other purposes by adding simple equipment: fitted with several night-vision cameras it can act as a campus security patrol robot, and fitted with a sweeping tool at its bottom it can act as an unmanned sweeping robot, giving it a wide range of applications.

Description

Unmanned robot for closed park and working method
Technical Field
The invention belongs to the field of intelligent robots, and particularly relates to an unmanned robot for a closed park and a working method.
Background
In large closed campuses such as industrial parks and university campuses, express pickup points are commonly set up, and recipients have to collect their parcels there. This not only takes time, but is also very inconvenient when a parcel is heavy or the weather is bad, and packaging or goods are easily damaged while being carried. With the prevalence of online shopping, this problem urgently needs to be solved.
Disclosure of Invention
In view of the above technical problems, the present invention aims to provide an unmanned robot for a closed park, and a working method thereof, that is simple in structure and requires no manual operation.
To this end, the invention adopts the following technical solution: an unmanned robot for a closed park comprises a frame and travelling wheels arranged below the frame. A front door and a rear door are provided on the front and rear sides of the frame respectively, and a front camera and a front laser radar scanning port are also provided on the front side of the frame. A multi-line laser radar and/or a top single-line laser radar, together with a human-computer interaction interface, are arranged on top of the frame. At least one load-bearing partition that can move up and down is provided inside the frame, which also houses a front laser radar, a controller, a USB expander, an attitude sensor, a circuit board, a motor driver and a battery. The front laser radar is a rotatable single-line laser radar that scans through the front laser radar scanning port;
the image data output of the front camera is connected to the front image data input of the controller; the data output of the multi-line laser radar is connected to the first radar data input of the controller; the data output of the top single-line laser radar is connected to the second radar data input of the controller; the data interaction end of the human-computer interaction interface is connected to the data interaction end of the controller; the data output of the front laser radar is connected to the third radar data input of the controller; the data end of the USB expander is connected to the USB data end of the controller; the data output of the attitude sensor is connected to the attitude data input of the controller; and the drive signal input of the motor driver is connected to the drive signal output of the controller.
With this structure, the front camera captures images ahead of the robot so that it can avoid obstacles and plan its route; the multi-line laser radar and the top single-line laser radar build a 3D model of the surroundings and locate the robot precisely; the human-computer interaction interface displays information for convenient user operation; the load-bearing partition carries express parcels; the front laser radar measures the distance to obstacles ahead; the controller integrates the information from all sensors and issues instructions; the USB expander extends the controller's USB ports; the attitude sensor measures the robot's heading, speed, gradient and similar data; and the motor driver controls the motors.
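To make this division of labour concrete, the sketch below shows, in Python, how one controller cycle might aggregate these inputs and turn them into a drive command. It is only an illustrative reading of the paragraph above; every name in it (SensorFrame, Controller.step, the 0.8 m threshold) is an assumption and not taken from the patent.

```python
# Minimal sketch of the controller's role described above: gather the sensor
# inputs, decide, and hand a command to the motor driver. All names and the
# 0.8 m threshold are illustrative assumptions, not values from the patent.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SensorFrame:
    front_image: bytes = b""                                   # front camera frame
    multiline_points: List[Tuple[float, float, float]] = field(default_factory=list)  # 3D cloud
    top_singleline_points: List[Tuple[float, float]] = field(default_factory=list)    # 2D cloud
    front_lidar_range_m: float = float("inf")                  # distance to obstacle ahead
    attitude_euler_deg: Tuple[float, float, float] = (0.0, 0.0, 0.0)  # roll, pitch, yaw
    accel_mps2: Tuple[float, float, float] = (0.0, 0.0, 0.0)

class Controller:
    """Integrates the sensor information and issues drive commands."""

    def step(self, frame: SensorFrame) -> Tuple[float, float]:
        # Perceive: here only the front-lidar range is used for a stop/go check.
        blocked = frame.front_lidar_range_m < 0.8        # assumed safety threshold
        # Decide and act: return (linear m/s, angular rad/s) for the motor driver.
        return (0.0, 0.0) if blocked else (1.0, 0.0)

ctrl = Controller()
print(ctrl.step(SensorFrame(front_lidar_range_m=0.5)))   # -> (0.0, 0.0): stop
print(ctrl.step(SensorFrame(front_lidar_range_m=5.0)))   # -> (1.0, 0.0): go
```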
Preferably, the front camera comprises one or any combination of a wide-angle camera, a far-focus camera and a near-focus camera;
the image data output end of the wide-angle camera is connected to the wide-angle image data input end of the controller, the image output end of the far-focus camera is connected to the far-focus image data input end of the controller, and the image data output end of the near-focus camera is connected to the near-focus image data input end of the controller. With this arrangement, the wide-angle camera widens the field of view while the far-focus camera picks up distant obstacles, so the robot can avoid obstacles in time.
Preferably, the multi-line laser radar and the top single-line laser radar are mounted one above the other on top of the frame. This arrangement keeps the surroundings of both laser radars free of obstructions that would interfere with their normal operation.
Preferably, the front laser radar scanning port is a horizontally extending rectangular opening, and a horizontally extending, inwardly recessed tapered groove is formed in the front side of the frame; the depth of the groove increases gradually from both ends towards the middle, and the scanning port sits exactly in the middle of the groove. This keeps rainwater, leaves and other debris out of the frame without restricting the detection range of the front laser radar.
Preferably, a tool placing groove is also provided on top of the frame, giving a convenient place for tools when the robot is serviced.
Preferably, an electrical component mounting rack is arranged in the frame, the controller, the USB expander and the attitude sensor are mounted on the electrical component mounting rack, and the battery is mounted below the electrical component mounting rack. By adopting the structure, the electrical component mounting frame is convenient for the mounting and the arrangement of the electrical components.
Preferably, the robot further comprises a GPS module arranged in the frame, the position data output of which is connected to the position data input of the controller; the GPS module acquires the longitude and latitude of the robot's position in order to locate it.
And/or an ultrasonic sensor is arranged in the frame, with its data output connected to the distance data input of the controller; the ultrasonic sensor measures the distance between the robot and an obstacle and can detect obstacles within about 1 metre.
And/or the travelling wheels are provided with an encoder whose data output is connected to the encoded data input of the controller; the encoder measures the wheel speed of the travelling wheels.
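As a small worked example of what the encoder contributes, the sketch below converts a tick count into a linear wheel speed; the encoder resolution and wheel radius are assumed values, since the patent does not give them.

```python
import math

def wheel_speed(delta_ticks: int, dt_s: float,
                ticks_per_rev: int = 1024,      # assumed encoder resolution
                wheel_radius_m: float = 0.10):  # assumed wheel radius
    """Linear wheel speed (m/s) from the tick count over one sampling interval."""
    revolutions = delta_ticks / ticks_per_rev
    return revolutions * 2.0 * math.pi * wheel_radius_m / dt_s

# Example: 512 ticks in 0.1 s with the assumed parameters -> about 3.14 m/s
print(round(wheel_speed(512, 0.1), 2))
```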
The invention also discloses a working method of the unmanned robot in the closed park, which comprises the following steps:
S1, the robot acquires the position of the target location and its own current position;
S2, the robot drives to the target location (an illustrative skeleton of this flow is sketched after step S27 below).
In a preferred embodiment of the present invention, step S2 includes the following steps:
S21, the robot senses the surrounding environment with the multi-line laser radar, obtains point cloud data, and derives the precise distance to pedestrians and/or vehicles with a 3D obstacle perception algorithm;
S22, the robot obtains images over a longer range with the far-focus camera, identifies traffic lights in the environment and decides whether to start or stop according to their colour:
if the far-focus camera sees a red light, the robot stops;
if the far-focus camera sees a green light, the robot continues;
S23, the robot obtains images over a short range with the near-focus camera, detects pedestrians and/or vehicles in the environment with a target recognition algorithm, and obtains their position relative to the robot;
S24, the robot plans its driving path with a dynamic path planning algorithm using the data from one or any combination of steps S21 to S23;
S25, the robot measures the Euler angles and three-axis acceleration of its attitude with the attitude sensor, measures wheel speed with the encoder mounted on it, and acquires the longitude and latitude of its position with the GPS module; these three sources are fused to obtain accurate position and speed information, and the robot decides its driving action from the position, the speed and the global path information;
S26, the robot obtains its distance to obstacles from the ultrasonic sensor and a two-dimensional point cloud from the top single-line laser radar, fuses the two, and avoids obstacles with a close-range obstacle avoidance algorithm;
S27, a motion command is generated from the dynamic path of step S24, the driving action of step S25 and the close-range obstacle avoidance of step S26, and the motion command controls the motion of the drive-by-wire chassis.
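Read as a whole, S1 and S2 describe a perceive-plan-act loop. The skeleton below is only one possible reading of that flow; every function in it is a placeholder standing in for the modules described above, not an interface defined by the patent.

```python
# Illustrative skeleton of the working method: S1 obtains the goal and the
# robot's own position, S2 drives there through the S21-S27 perceive/plan/act
# cycle. Every function here is a placeholder, not an interface from the patent.
import time

def get_goal():            return (120.0, 45.0)      # S1: target position (assumed local frame)
def localize():            return (0.0, 0.0)         # S1/S25: fused GPS + IMU + encoder pose
def perceive():            return {"obstacles": []}  # S21-S23: laser radar + camera perception
def plan(pose, goal, env): return [pose, goal]       # S24: dynamic path planning
def follow(path, env):     return (0.5, 0.0)         # S25-S27: (v, w) motion command
def send_to_chassis(cmd):  pass                      # drive-by-wire chassis interface

def reached(pose, goal, tol=0.5):
    return abs(pose[0] - goal[0]) < tol and abs(pose[1] - goal[1]) < tol

def run_delivery(max_cycles: int = 3) -> None:
    goal = get_goal()                                 # S1
    for _ in range(max_cycles):                       # S2 loop (bounded: the stubs never move)
        pose = localize()
        if reached(pose, goal):
            break
        env = perceive()
        path = plan(pose, goal, env)
        send_to_chassis(follow(path, env))
        time.sleep(0.1)                               # control period (assumed)

run_delivery()
```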
The invention has the following beneficial effects: express parcels can be delivered to the recipient automatically, saving the recipient time and effort while avoiding damage to packaging or contents; while transporting a parcel the robot plans its own driving route, navigates autonomously to the destination from map data, avoids obstacles on the way, positions itself automatically, and interacts with users through the human-computer interaction interface, giving it a high degree of intelligence.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic structural diagram of the present invention.
Fig. 2 is a schematic structural diagram of the invention (with the left and right baffles removed).
Fig. 3 is a front view of the present invention.
Fig. 4 is a rear view of the present invention.
Detailed Description
The invention will be further illustrated by the following examples in conjunction with the accompanying drawings:
as shown in fig. 1 and fig. 2, a closed park unmanned robot mainly comprises a vehicle frame 1, traveling wheels 2, a front-side camera 3, a front-side laser radar scanning port 4, a multi-line laser radar 5, a top single-line laser radar 6, a man-machine interaction interface 7, a bearing partition 8, a front-side laser radar 9, a controller 10, a USB expander 11, a posture sensor 12, a circuit board 13, a motor driver 14, a battery 15 and an electrical component mounting rack 16, wherein an image data output end of the front-side camera 3 is connected with a front-side image data input end of the controller 10, a data output end of the multi-line laser radar 5 is connected with a first radar data input end of the controller 10, a data output end of the top single-line laser radar 6 is connected with a second radar data input end of the controller 10, a data interaction end of the man-machine interaction interface 7 is connected with a data interaction end of the controller 10, the data output end of the front laser radar 9 is connected with the radar third data input end of the controller 10, the data end of the USB expander 11 is connected with the USB data end of the controller 10, the data output end of the attitude sensor 12 is connected with the attitude data input end of the controller 10, and the driving signal input end of the motor driver 14 is connected with the driving signal output end of the controller 10; the power drive end of the motor drive 14 is connected to the road wheels 2. The motor driver 14 and the road wheels 2 are components of a drive-by-wire chassis. The frame 1 comprises a rectangular frame, a top plate, a bottom plate, a front baffle plate, a rear baffle plate, a left baffle plate and a right baffle plate which are enclosed around the frame, the frame is fixed on the bottom plate, four traveling wheels 2 are arranged below the bottom plate in a rectangular shape, the bearing baffle plate 8 is fixed on the frame through bolts, adjusting bolt holes which are arranged at intervals in the vertical direction are arranged on the frame, the height of the bearing baffle plate 8 is convenient to adjust, a front door 101 and a rear door 102 are respectively arranged on the front baffle plate and the rear baffle plate of the frame 1, a tool placing groove 104 is also arranged on the top of the frame 1, a front camera 3 and a front laser radar scanning port 4 are also arranged on the front baffle plate of the frame 1, the front camera 3, the front door 101 and the front laser radar scanning port 4 are sequentially distributed from top to bottom, a multi-line laser radar 5, a top single-line laser radar 6 and a human-computer interaction interface 7 are arranged on the top of the frame 1, at least one bearing baffle plate 8 which can move up and down is arranged in the frame 1, bear baffle 8 and be used for bearing the weight of the express delivery, still be provided with front side laser radar 9, controller 10, USB expander 11, attitude sensor 12, circuit board 13, motor drive 14, battery 15 in frame 1, front side laser radar 9 is revolvable single line laser radar to scan through front side laser radar scanning mouth 4. 
The controller 10 is an autonomous-driving industrial personal computer; the multi-line laser radar 5 is a LeiShen C16 laser radar; the top single-line laser radar 6 and the front laser radar 9 are Slamtec A2M6 laser radars; the front camera 3 is a 1080P camera; and the attitude sensor 12 is an LPMS-IG1 9-axis attitude sensor. A GPS module 17 is further arranged on the frame 1, its position data output connected to the position data input of the controller 10; the GPS module 17 is a DOVE-E4plus-4G differential GPS. Two ultrasonic modules are arranged in the frame 1, with two ultrasonic probes mounted on each of the front door 101 and the rear door 102 to measure the distance to objects in front of and behind the robot. The two probes on the front door 101 are a first ultrasonic probe and a second ultrasonic probe, whose data outputs are connected to the first and second distance data inputs of the controller 10 respectively; the two probes on the rear door 102 are a third ultrasonic probe and a fourth ultrasonic probe, whose data outputs are connected to the third and fourth distance data inputs of the controller 10 respectively. The sensing range of each sensor (the multi-line laser radar, the ultrasonic probes and the front laser radar) is determined accordingly.
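Assuming each ultrasonic probe simply reports a range in metres, the front/rear clearance check this paragraph implies could look like the sketch below; the probe names, the read function and the roughly 1 m threshold are illustrative assumptions rather than interfaces defined by the patent.

```python
from typing import Callable, Dict

def min_clearance(read_probe: Callable[[str], float], probes) -> float:
    """Smallest range (m) reported by the given ultrasonic probes."""
    return min(read_probe(name) for name in probes)

# Fake readings standing in for the two probes on each door (assumed values).
fake_ranges: Dict[str, float] = {"front_1": 0.8, "front_2": 2.5,
                                 "rear_1": 3.0, "rear_2": 0.6}
read = fake_ranges.__getitem__

front = min_clearance(read, ("front_1", "front_2"))
rear = min_clearance(read, ("rear_1", "rear_2"))
# The description says obstacles within roughly 1 m can be judged.
print(front < 1.0, rear < 1.0)   # -> True True
```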
As shown in fig. 1 and fig. 3, the front camera 3 comprises a wide-angle camera 301, a far-focus camera 302 and a near-focus camera; the image data output of the wide-angle camera 301 is connected to the wide-angle image data input of the controller 10, the image output of the far-focus camera 302 is connected to the far-focus image data input of the controller 10, and the image data output of the near-focus camera is connected to the near-focus image data input of the controller 10. The wide-angle camera 301 and the far-focus camera 302 lie on the same horizontal line; the wide-angle camera 301 covers a wide field of view and the far-focus camera 302 reaches far ahead, which effectively prevents the robot from colliding with objects in front of it while it moves.
As shown in figs. 1 to 4, the multi-line laser radar 5 and the top single-line laser radar 6 are mounted one above the other on top of the frame 1, a reasonable arrangement that keeps their laser beams unobstructed. The robot scans a 3D model of its surroundings with the multi-line laser radar 5; by comparing the change in the environment between the previous and the next frame with a suitable algorithm, it readily detects surrounding vehicles and pedestrians while building a map at the same time, and comparing the global map obtained in real time with feature objects in a high-precision map improves the robot's positioning accuracy and enables autonomous navigation. The top single-line laser radar 6 can only scan in a plane and cannot measure object height, but compared with the multi-line laser radar 5 it responds faster in angular frequency and sensitivity, so it measures the distance to surrounding obstacles more accurately; combining the two therefore positions the robot more accurately and realises its autonomous navigation.
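The patent does not name the frame-to-frame comparison algorithm; one common choice for matching consecutive single-line scans is ICP (iterative closest point), sketched below in NumPy purely as an assumed illustration of how the motion between two frames can be recovered.

```python
import numpy as np

def icp_2d(src: np.ndarray, dst: np.ndarray, iters: int = 20):
    """Align 2D scan `src` (N,2) to `dst` (M,2); return rotation R and translation t.

    Brute-force nearest neighbours plus closed-form (SVD/Kabsch) rigid alignment.
    A didactic sketch only: no outlier rejection, no convergence test.
    """
    R_tot, t_tot, cur = np.eye(2), np.zeros(2), src.copy()
    for _ in range(iters):
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        nn = dst[d.argmin(axis=1)]                 # nearest-neighbour correspondences
        mu_c, mu_n = cur.mean(0), nn.mean(0)
        H = (cur - mu_c).T @ (nn - mu_n)           # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                   # keep a proper rotation
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_n - R @ mu_c
        cur = cur @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t
    return R_tot, t_tot

# Example: the "next frame" is the "previous frame" shifted by (0.10, -0.05) m.
xs = np.arange(0.0, 5.0, 0.25)
prev = np.array([(x, y) for x in xs for y in xs])
new = prev + np.array([0.10, -0.05])
R, t = icp_2d(prev, new)
print(np.round(t, 2))   # -> [ 0.1  -0.05]: the estimated motion between frames
```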
As shown in fig. 1 and 3, the front side lidar scanning port 4 is a horizontally extending rectangular opening, a horizontally extending and inwardly recessed tapered groove 103 is provided at the front side of the frame 1, the width of the tapered groove 103 is consistent, the depth of the tapered groove 103 is gradually deepened from two ends to the middle, and the front side lidar scanning port 4 is just located at the middle of the tapered groove 103. The taper groove 103 does not affect the scanning of the front side laser radar 9, while preventing foreign matter or rainwater from entering the inside of the robot.
As shown in fig. 2, an electrical component mounting rack 16 is provided in the vehicle body frame 1, the controller 10, the USB extender 11, and the attitude sensor 12 are mounted on the electrical component mounting rack 16, the battery 15 is mounted below the electrical component mounting rack 16, and the load-bearing partition 8 is located above the electrical component mounting rack 16.
The robot can be used to deliver express parcels, and without changing its structure it can serve other purposes with only simple additional equipment: with several night-vision cameras mounted on it, it can act as a campus security patrol robot; with a sweeping tool added at its bottom, it can act as an unmanned sweeping robot. Its applications are therefore wide.
The invention also discloses a working method of the unmanned robot in the closed park, which comprises the following steps:
S1, the robot acquires the position of the target location and its own current position;
S2, the robot drives to the target location.
In a preferred embodiment of the present invention, step S2 includes the following steps:
S21, the robot senses the surrounding environment with the multi-line laser radar 5, obtains point cloud data, and derives the precise distance to pedestrians and/or vehicles with a 3D obstacle perception algorithm (an illustrative sketch of this pipeline is given after step S27). The 3D obstacle perception algorithm first filters the 3D point cloud: points outside a preset region of interest (ROI) are discarded, removing background objects such as roadside trees and buildings. The Z-axis (vertical) information is then removed from the remaining points, projecting them onto a 2D plane; the points are clustered on that plane, and each cluster is framed by a rectangular box representing an obstacle in the robot's path. The distance and bearing of each obstacle box relative to the robot are computed, providing accurate information for obstacle avoidance.
S22, the robot obtains images over a longer range with the far-focus camera, identifies traffic lights in the environment and decides whether to start or stop according to their colour: the left-turn, right-turn and straight-ahead lights are recognised with a YOLO target detection algorithm, and each of the three directions is judged to be red or green (a sketch of the resulting start/stop decision is given after step S27).
If the far-focus camera sees a red light, the robot stops;
if the far-focus camera sees a green light, the robot continues;
S23, the robot obtains images over a short range with the near-focus camera, detects pedestrians and/or vehicles in the environment with a target recognition algorithm, and obtains their position relative to the robot; pedestrians and vehicles are detected with a YOLO target recognition algorithm.
S24, the robot plans its driving path with a dynamic path planning algorithm using the data from one or any combination of steps S21 to S23; the driving path is planned with a D* (D Star) dynamic path planning algorithm.
S25, the robot measures the Euler angles and three-axis acceleration of its attitude with the attitude sensor, measures wheel speed with the encoder mounted on it, and acquires the longitude and latitude of its position with the GPS module; these three sources are fused to obtain accurate position and speed information (a simple fusion sketch is given after step S27), and the robot decides its driving action from the position, the speed and the global path information.
S26, the robot obtains its distance to obstacles from the ultrasonic sensor and a two-dimensional point cloud from the top single-line laser radar, fuses the two, and avoids obstacles with a close-range obstacle avoidance algorithm.
S27, a motion command is generated from the dynamic path of step S24, the driving action of step S25 and the close-range obstacle avoidance of step S26, and the motion command controls the motion of the drive-by-wire chassis (a combined avoidance-and-command sketch follows).
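The S21 pipeline above, read literally (ROI filter, drop Z, cluster in the plane, box the clusters), could be prototyped as below; the thresholds, the single-linkage clustering and the toy point cloud are all assumptions, not values from the patent.

```python
import numpy as np
from collections import deque

def perceive_obstacles(cloud, roi=((-5.0, 5.0), (0.0, 10.0), (-0.2, 2.0)),
                       cluster_eps=0.5, min_pts=5):
    """S21-style pipeline on an (N,3) cloud: ROI filter, drop Z, cluster in the
    plane, return 2D bounding boxes with range and bearing. Thresholds are assumed."""
    (xmin, xmax), (ymin, ymax), (zmin, zmax) = roi
    keep = ((cloud[:, 0] >= xmin) & (cloud[:, 0] <= xmax) &
            (cloud[:, 1] >= ymin) & (cloud[:, 1] <= ymax) &
            (cloud[:, 2] >= zmin) & (cloud[:, 2] <= zmax))
    pts = cloud[keep][:, :2]                        # project onto the 2D plane

    # Single-linkage clustering by BFS over the eps-neighbourhood graph (O(N^2)).
    unvisited, clusters = set(range(len(pts))), []
    while unvisited:
        seed = unvisited.pop()
        comp, queue = [seed], deque([seed])
        while queue:
            i = queue.popleft()
            near = [j for j in list(unvisited)
                    if np.linalg.norm(pts[i] - pts[j]) <= cluster_eps]
            for j in near:
                unvisited.remove(j)
                comp.append(j)
                queue.append(j)
        if len(comp) >= min_pts:
            clusters.append(pts[comp])

    boxes = []
    for c in clusters:                              # frame each cluster with a box
        lo, hi = c.min(axis=0), c.max(axis=0)
        centre = (lo + hi) / 2.0
        boxes.append({"box": (tuple(lo), tuple(hi)),
                      "range_m": float(np.linalg.norm(centre)),
                      # bearing relative to the +y (forward) axis, an assumed convention
                      "bearing_rad": float(np.arctan2(centre[0], centre[1]))})
    return boxes

# Toy cloud: a pedestrian-sized blob about 4 m ahead plus scattered background points.
rng = np.random.default_rng(0)
blob = rng.normal([0.5, 4.0, 1.0], 0.1, size=(40, 3))
noise = rng.uniform([-5.0, 0.0, 0.0], [5.0, 10.0, 2.0], size=(30, 3))
print(perceive_obstacles(np.vstack([blob, noise])))
```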
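For S22, the patent only says that YOLO supplies the traffic-light colours; assuming the detector returns labelled boxes, the start/stop rule might reduce to the small function below (the detection format and the biggest-box-is-nearest heuristic are assumptions).

```python
from typing import Dict, List

def start_stop_decision(detections: List[Dict]) -> str:
    """Stop on red, go on green, otherwise keep going. The detection format
    ({"label": ..., "area": ...}) is an assumption; the patent only states
    that a YOLO detector classifies the lights."""
    lights = [d for d in detections if d["label"] in ("red", "green")]
    if not lights:
        return "continue"
    nearest = max(lights, key=lambda d: d["area"])   # biggest box taken as the relevant light
    return "stop" if nearest["label"] == "red" else "go"

print(start_stop_decision([{"label": "red", "area": 900},
                           {"label": "green", "area": 200}]))   # -> stop
```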
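For S25, the patent states that IMU, encoder and GPS data are fused but not how. The sketch below uses a simple complementary blend purely for illustration; a production system would more likely run an extended Kalman filter, and the gain and example trajectory are assumptions.

```python
import math

class PoseFuser:
    """Tiny complementary fusion of the three S25 sources: IMU yaw, encoder
    wheel speed, and a GPS fix already converted to local x/y metres. It only
    dead-reckons with yaw + wheel speed and nudges the estimate toward GPS."""

    def __init__(self, x=0.0, y=0.0, gps_gain=0.2):
        self.x, self.y = x, y
        self.gps_gain = gps_gain          # 0 = ignore GPS, 1 = trust GPS fully (assumed gain)

    def update(self, yaw_rad, wheel_speed_mps, gps_xy, dt_s):
        # Dead reckoning from encoder speed and IMU heading.
        self.x += wheel_speed_mps * math.cos(yaw_rad) * dt_s
        self.y += wheel_speed_mps * math.sin(yaw_rad) * dt_s
        # Blend toward the noisier but drift-free GPS fix.
        self.x += self.gps_gain * (gps_xy[0] - self.x)
        self.y += self.gps_gain * (gps_xy[1] - self.y)
        return self.x, self.y, wheel_speed_mps

fuser = PoseFuser()
for k in range(10):                       # drive straight along +x at 1 m/s for 1 s
    t = 0.1 * (k + 1)                     # GPS agrees, with a small 2 cm offset in y
    pose = fuser.update(yaw_rad=0.0, wheel_speed_mps=1.0, gps_xy=(t, 0.02), dt_s=0.1)
print([round(v, 2) for v in pose])        # -> [1.0, 0.02, 1.0]
```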
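For S26 and S27, one minimal way to fuse the ultrasonic range with the top single-line laser radar's 2D points and emit a chassis command is sketched below; the corridor width, speed thresholds and steering rule are assumptions, not figures from the patent.

```python
import math
from typing import List, Tuple

def avoid_and_command(ultrasonic_m: float,
                      scan_xy: List[Tuple[float, float]],
                      cruise_mps: float = 1.0,
                      stop_m: float = 0.5, slow_m: float = 1.5):
    """Fuse the ultrasonic range with the 2D laser radar points and emit a
    (v, w) command for the drive-by-wire chassis. Thresholds are assumed."""
    # Closest obstacle in a 0.6 m wide corridor directly ahead (+x is forward).
    ahead = [(x, y) for x, y in scan_xy if x > 0.0 and abs(y) < 0.3]
    lidar_min = min((math.hypot(x, y) for x, y in ahead), default=float("inf"))
    nearest = min(ultrasonic_m, lidar_min)        # fusion here = take the worst case

    if nearest < stop_m:                          # too close: stop
        return 0.0, 0.0
    if nearest < slow_m:                          # close: slow down and steer away
        side = sum(y for _, y in ahead)           # obstacle mostly left (+y) or right (-y)?
        return 0.3 * cruise_mps, (-0.5 if side > 0 else 0.5)
    return cruise_mps, 0.0                        # clear: cruise

scan = [(1.2, 0.1), (1.3, 0.15), (4.0, -1.0)]     # toy laser radar points (metres)
# -> (0.3, -0.5): slow down and turn away (sign convention for w is assumed)
print(avoid_and_command(ultrasonic_m=2.0, scan_xy=scan))
```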

Claims (10)

1. A closed park unmanned robot, comprising a frame (1) and travelling wheels (2) arranged below the frame (1), characterized in that a front door (101) and a rear door (102) are provided on the front side and the rear side of the frame (1) respectively; a front camera (3) and a front laser radar scanning port (4) are further provided on the front side of the frame (1); a multi-line laser radar (5) and/or a top single-line laser radar (6) and a human-computer interaction interface (7) are provided on top of the frame (1); at least one load-bearing partition (8) capable of moving up and down is provided in the frame (1); a front laser radar (9), a controller (10), a USB expander (11), an attitude sensor (12), a circuit board (13), a motor driver (14) and a battery (15) are further provided in the frame (1); and the front laser radar (9) is a rotatable single-line laser radar that scans through the front laser radar scanning port (4);
the image data output end of the front camera (3) is connected to the front image data input end of the controller (10), the data output end of the multi-line laser radar (5) is connected to the first radar data input end of the controller (10), the data output end of the top single-line laser radar (6) is connected to the second radar data input end of the controller (10), the data interaction end of the human-computer interaction interface (7) is connected to the data interaction end of the controller (10), the data output end of the front laser radar (9) is connected to the third radar data input end of the controller (10), the data end of the USB expander (11) is connected to the USB data end of the controller (10), the data output end of the attitude sensor (12) is connected to the attitude data input end of the controller (10), and the drive signal input end of the motor driver (14) is connected to the drive signal output end of the controller (10).
2. The closed park unmanned robot of claim 1, wherein the front camera (3) comprises one or any combination of a wide-angle camera (301), a far-focus camera (302) and a near-focus camera;
the image data output end of the wide-angle camera (301) is connected with the wide-angle image data input end of the controller (10), the image output end of the far-focus camera (302) is connected with the far-focus image data input end of the controller (10), and the image data output end of the near-focus camera is connected with the near-focus image data input end of the controller (10).
3. The closed park unmanned robot of claim 1, wherein the multi-line laser radar (5) and the top single-line laser radar (6) are mounted one above the other on top of the frame (1).
4. The unmanned robot for closed park according to claim 1, wherein the front lidar scanning port (4) is a horizontally extending rectangular opening, a horizontally extending and inwardly recessed tapered groove (103) is provided at the front side of the frame (1), the depth of the tapered groove (103) is gradually deepened from two ends to the middle, and the front lidar scanning port (4) is located at the middle of the tapered groove (103).
5. The closed park unmanned robot of claim 1, wherein a tool placing groove (104) is also provided on top of the frame (1).
6. The closed park unmanned robot of claim 1, wherein an electrical component mounting bracket (16) is provided within the frame (1), the controller (10), the USB extender (11), the attitude sensor (12) are mounted on the electrical component mounting bracket (16), and the battery (15) is mounted below the electrical component mounting bracket (16).
7. The closed park unmanned robot of claim 1, further comprising a GPS module disposed within the vehicle frame (1), a position data output of the GPS module being connected to a position data input of the controller (10).
8. The closed park unmanned robot of claim 1, wherein an ultrasonic sensor is provided within the frame (1), a data output of the ultrasonic sensor being connected to a distance data input of the controller (10).
9. The closed park unmanned robot of claim 1, further comprising an encoder disposed on the travelling wheels (2), a data output of the encoder being connected to a coded data input of the controller (10).
10. A working method of an unmanned robot for a closed park, characterized by comprising the following steps:
S1, the robot acquires the position of the target location and its own current position;
S2, the robot drives to the target location.
CN202111065179.5A, filed 2021-09-11 (priority 2021-09-11): Unmanned robot for closed park and working method. Status: Pending. Published as CN113759787A.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111065179.5A CN113759787A (en) 2021-09-11 2021-09-11 Unmanned robot for closed park and working method

Publications (1)

Publication Number Publication Date
CN113759787A 2021-12-07

Family

ID=78794926

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111065179.5A Pending CN113759787A (en) 2021-09-11 2021-09-11 Unmanned robot for closed park and working method

Country Status (1)

Country Link
CN (1) CN113759787A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107807652A (en) * 2017-12-08 2018-03-16 灵动科技(北京)有限公司 Merchandising machine people, the method for it and controller and computer-readable medium
CN108303985A (en) * 2018-03-07 2018-07-20 广东工业大学 A kind of unmanned transport vehicle system in garden
CN108469820A (en) * 2018-03-19 2018-08-31 徐州艾奇机器人科技有限公司 A kind of round-the-clock unmanned cruiser system of two-wheel drive low speed
US20210132625A1 (en) * 2018-05-31 2021-05-06 Carla R Gillett Modular delivery vehicle system
CN112684784A (en) * 2019-10-17 2021-04-20 武汉小狮科技有限公司 Low-speed unmanned driving system
CN211565918U (en) * 2020-02-11 2020-09-25 深圳市欧拉智造科技有限公司 Intelligent logistics distribution robot
CN212623747U (en) * 2020-07-02 2021-02-26 长沙智能驾驶研究院有限公司 Vehicle with a steering wheel
CN213876030U (en) * 2020-12-09 2021-08-03 江西赛特智能科技有限公司 Unmanned robot

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115933634A (en) * 2022-10-12 2023-04-07 海南大学 Unknown environment exploration method, unknown environment exploration system, mobile robot and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination