CN110865636A - Cloud robot navigation system based on Docker container and working method thereof - Google Patents


Info

Publication number
CN110865636A
CN110865636A (application CN201810906387.5A)
Authority
CN
China
Prior art keywords
robot
docker
container
cloud
map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810906387.5A
Other languages
Chinese (zh)
Inventor
王鲁佳
陈明
刘延东
须成忠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN201810906387.5A priority Critical patent/CN110865636A/en
Publication of CN110865636A publication Critical patent/CN110865636A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 — Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 — Control of position or course in two dimensions
    • G05D1/021 — Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 — Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221 — Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0231 — Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0242 — Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G05D1/0246 — Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253 — Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G05D1/0276 — Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/028 — Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using a RF signal

Abstract

The invention provides a cloud robot navigation system based on a Docker container and a working method thereof. The cloud robot navigation system comprises a cloud control platform and a mobile robot. The mobile robot comprises a first wireless transceiver, a mobile body for realizing its own movement, an environment perception component for collecting image data of the surrounding environment of the mobile robot, and a data processor for drawing a regional map from the image data. The cloud control platform comprises a second wireless transceiver communicating with the first wireless transceiver, and a server host for receiving the collected data of the mobile robot through the second wireless transceiver and sending control instructions to the data processor; the server host is provided with a Docker ROS container, and the data processor receives the control instructions through the first wireless transceiver. By adopting the Docker container cloud control platform, the system has the advantages of high portability, an application-centered design, automated builds, bandwidth savings, component reuse, image sharing, a tool ecosystem, and the like.

Description

Cloud robot navigation system based on Docker container and working method thereof
Technical Field
The invention relates to the field of cloud platform robot control, in particular to a cloud robot navigation system based on a Docker container and a working method thereof.
Background
At present, the navigation technologies applied to the service robot are various, and the most common navigation technologies include magnetic navigation, inertial navigation, sensor navigation, satellite navigation and visual navigation.
Magnetic navigation: the magnetic navigation of the robot is mainly realized by embedding a structure (such as an electrified lead or a magnet) capable of generating a magnetic field under a driving route, detecting the magnetic field through a magnetic sensor arranged on the robot, and guiding the robot to drive according to a preset track for navigation.
Inertial navigation: the inertial navigation system is mainly applied to the field of aerospace, and is divided into a platform type inertial navigation system and a strapdown type inertial navigation system, wherein the platform type inertial navigation system has a physical platform, and an inertial element is arranged on the physical platform to measure the angular velocity and the acceleration of the platform relative to an inertial space; the strapdown inertial navigation system adopts a digital platform to replace a traditional physical platform, and an inertial device is directly and fixedly connected on a carrier.
Sensor data navigation: positioning and navigation are carried out through non-visual sensors; commonly used sensor navigation technologies include infrared navigation, ultrasonic navigation and laser navigation. Infrared navigation uses an infrared sensor to measure distance and determine the robot's position in the environment; the device has a simple structure and fast response, but is easily affected by light, color, shape and the like. Ultrasonic navigation is the most widely applied sensor navigation technology: distance measurement is realized through an ultrasonic sensor to complete navigation. The method is low in cost, simple in structure and unaffected by light, but it is susceptible to the surface shape of objects, which reduces navigation precision, and it cannot detect distant objects. Laser navigation measures distance with a laser sensor; the principle is basically the same as that of infrared and ultrasonic navigation, but because laser signals have high energy density, high brightness and pure color, laser navigation offers higher precision, longer range and better resolution, at a relatively higher cost.
Satellite navigation: satellite navigation is initially applied to the military field, and is completed by installing a satellite signal receiving system for a robot and utilizing information such as position, speed, time and the like provided by a global navigation satellite system. Then, the precision of civil satellite navigation is gradually improved, and the satellite navigation is not influenced by terrain, environment and the like, and can provide global navigation, so the satellite navigation has a wide application range, but the navigation precision is not high.
Visual navigation: machine vision techniques have been widely applied to service robot navigation systems. The visual navigation mainly comprises the steps of loading a camera on the robot, obtaining visual information of the surrounding environment of the robot, completing the identification of obstacles and road signs through image processing, obtaining navigation parameters and completing navigation. According to the difference of the number of the data cameras, the visual navigation is divided into monocular visual navigation, binocular visual navigation and multi-view visual navigation. The monocular vision system has small calculation amount, the algorithm is mature, but the visual field range is limited; the binocular vision system can obtain more comprehensive environmental information, and can obtain depth information in a scene through stereo matching, so that accurate three-dimensional positioning is realized; the multi-vision system can observe different directions of the environment, but the structure is complex, the information amount required to be processed is too large, and the research aiming at the multi-vision is relatively less at present.
The current navigation technologies mainly have the following disadvantages:
Magnetic navigation: poor changeability and maintainability; obstacle-avoiding movement cannot be realized;
Inertial navigation: errors accumulate as distance increases, reducing accuracy;
Sensor navigation: sensors interfere with each other, and obstacles that are strongly absorptive or transparent cannot be reliably identified;
Satellite navigation: positioning precision is low, and indoor navigation is impossible;
Visual navigation: the computational load is large, and precision is low at long distances.
Disclosure of Invention
The embodiment of the invention provides a cloud robot navigation system based on a Docker container and a working method thereof, which solve at least one of the above problems and offer the advantages of high portability, an application-centered design, automated builds, bandwidth savings, component reuse, image sharing, and the like.
A Docker container-based cloud robot navigation system comprising: the system comprises a cloud control platform and more than one mobile robot, wherein the mobile robots are connected with the cloud control platform by adopting a wireless communication channel;
the mobile robot comprises a first wireless transceiver, a mobile body for realizing self movement, an environment sensing assembly for acquiring image data of the surrounding environment of the mobile robot, and a data processor for drawing a regional map of the image data, wherein the data processor is respectively electrically connected with the mobile body and the environment sensing assembly;
the cloud control platform comprises a second wireless transceiver communicated with the first wireless transceiver, and a server host used for receiving the collected data of the mobile robot through the second wireless transceiver and sending a control instruction to the data processor, wherein the server host is provided with a Docker ROS container, and the data processor receives the control instruction through the first wireless transceiver.
As an optional solution, the data processor is disposed inside the mobile body, and the environment sensing component is disposed on the mobile body.
As an optional solution, the control instruction includes a mapping instruction and a navigation instruction, and the data processor is further configured to control the mobile body to move in the surrounding environment and draw an area map according to the mapping instruction, and determine a traveling direction and a traveling route of the mobile body in the drawn area map according to the navigation instruction.
As an alternative, the mobile body is further used for navigation according to the travel direction and the travel route.
As an optional scheme, the system further comprises a storage device for providing a pre-stored pre-drawn map, wherein the storage device is in wireless connection with the mobile robot.
As an alternative, the mobile robot further comprises a memory for providing a pre-stored pre-rendered map, the memory being electrically connected to the data processor.
As an optional solution, the mobile robot further includes a rotation motion component for mounting the environment sensing component and realizing rotation, the rotation motion component is mounted on the mobile body, and the rotation motion component is electrically connected with the data processor.
As an alternative, the pre-drawn map is a map of the area that the mobile robot produces by moving within the environmental area and mapping it by means of a SLAM algorithm.
Optionally, the environment sensing component includes at least one of a camera, an infrared sensor, and a laser sensor.
In order to solve the above technical problem, the present invention further provides a method for operating a navigation system, including:
s1, configuring a communication network of the cloud robot on the server host;
s2, appointing a communication address (IP address) of each cloud robot on the server host;
s3, running a map building program of the SLAM on the mobile robot;
s4, running the rivz software on the server host to synchronously display the construction of the map, and storing the map after the construction is finished;
and S5, opening the constructed map on the server host, and designating a destination on the map, wherein the mobile robot can automatically navigate to the destination.
The invention provides a cloud robot navigation system based on a Docker container and a working method thereof. The cloud robot navigation system comprises a cloud control platform and a mobile robot, the mobile robot being connected with the cloud control platform through a wireless communication channel. The mobile robot comprises a first wireless transceiver, a mobile body for realizing its own movement, an environment perception component for acquiring image data of the surrounding environment of the mobile robot, and a data processor for drawing an area map from the image data; the data processor is electrically connected with the mobile body and the environment perception component respectively. The cloud control platform comprises a second wireless transceiver communicating with the first wireless transceiver, and a server host for receiving the acquired data of the mobile robot through the second wireless transceiver and sending control commands to the data processor; the server host is provided with a Docker ROS container, and the data processor receives the control commands through the first wireless transceiver. By adopting the Docker container cloud control platform and combining cloud robot platform technology with SLAM visual mapping technology, the method can overcome many of the difficulties of single-robot mapping and navigation, and has the advantages of high portability, an application-centered design, automated builds, bandwidth savings, component reuse, image sharing, a tool ecosystem, and the like.
Drawings
Fig. 1 is a structural block diagram of a cloud robot navigation system based on a Docker container according to an embodiment of the present invention;
fig. 2 is a work flow diagram of a cloud control platform in a cloud robot navigation system based on a Docker container according to an embodiment of the present invention;
fig. 3 is a flowchart of a work flow of a server host in a cloud robot navigation system based on a Docker container according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Referring to fig. 1, an embodiment of the present invention provides a cloud robot navigation system based on a Docker container, comprising a cloud control platform and a mobile robot, the mobile robot being connected with the cloud control platform through a wireless communication channel. The mobile robot comprises a first wireless transceiver, a mobile body for realizing its own movement, an environment perception component for acquiring image data of the surrounding environment of the mobile robot, and a data processor for drawing an area map from the image data; the data processor is electrically connected with the mobile body and the environment perception component respectively. The cloud control platform comprises a second wireless transceiver communicating with the first wireless transceiver, and a server host for receiving the acquired data of the mobile robot through the second wireless transceiver and sending control commands to the data processor; the server host is provided with a Docker ROS container, and the data processor receives the control commands through the first wireless transceiver. By adopting the Docker container cloud control platform and combining cloud robot platform technology with SLAM visual mapping technology, the system can overcome many of the difficulties of single-robot mapping and navigation, and has the advantages of high portability, an application-centered design, automated builds, bandwidth savings, component reuse, image sharing, a tool ecosystem, and the like.
The mobile body may be provided with a shell to protect the internal components and to give an attractive overall appearance. The data processor is arranged inside the mobile body, i.e. within the shell, and the environment sensing assembly is arranged on the mobile body, i.e. on the shell; those of ordinary skill in the art can make flexible choices here, without limitation.
Specifically, the control instruction may include a mapping instruction and a navigation instruction, the data processor may control the mobile body to move in the surrounding environment and draw an area map according to the mapping instruction, and determine a traveling direction and a traveling route of the mobile body in the drawn area map according to the navigation instruction, and the mobile body is further configured to navigate according to the traveling direction and the traveling route.
For the pre-stored pre-drawn map, two ways are provided in this embodiment. One is external storage: specifically, the system further includes a storage device for providing the pre-stored pre-drawn map, where the storage device is wirelessly connected with the mobile robot, for example via Bluetooth or near-field communication, without specific limitation.
As an alternative, the mobile robot further comprises a memory for providing a pre-stored pre-rendered map, the memory being electrically connected to the data processor.
In order for the environment sensing assembly to scan the surrounding environment, the mobile robot further includes a rotary motion assembly on which the environment sensing assembly is mounted. The rotary motion assembly is mounted on the mobile body and is electrically connected with the data processor; under the control of the data processor, it drives the environment sensing assembly to translate and turn in space, so that the assembly can capture the surrounding environment. In this embodiment the rotary motion assembly may adopt a servo motor structure.
For drawing the pre-drawn map, the mobile robot moves in an environment area and maps the area through a SLAM algorithm. SLAM (simultaneous localization and mapping) performs positioning and map building in real time: the robot starts from an unknown place in an unknown environment, localizes its own position and posture through repeatedly observed map features (such as wall corners and columns) during movement, and builds the map incrementally according to its own position, thereby achieving simultaneous localization and mapping.
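A minimal sketch of such SLAM map building in ROS might look as follows. The package names (gmapping, a TurtleBot-style base) and file paths are assumptions for illustration; the patent does not name a specific SLAM implementation:

```shell
# Hypothetical robot-side commands for building a map with the ROS
# gmapping SLAM package; requires a real robot base and range sensor.

# Start the robot base driver and the laser/RGB-D sensor driver.
roslaunch turtlebot_bringup minimal.launch &

# Launch the gmapping node, which fuses odometry and range scans into
# an incrementally built occupancy-grid map while the robot moves.
roslaunch turtlebot_navigation gmapping_demo.launch &

# After the robot has covered the environment, save the map as a .pgm
# occupancy grid plus a .yaml metadata file for later navigation.
rosrun map_server map_saver -f ~/maps/area_map
```

These commands are an ops sketch rather than a verified script, since they only run against actual robot hardware.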
In this embodiment, the environment sensing component includes at least one or more of a camera, an infrared sensor, and a laser sensor, and a person skilled in the art can flexibly select the environment sensing component, which is not limited to this.
As an alternative, the system includes at least one mobile robot, and the cloud control platform can simultaneously control a plurality of robots.
As shown in fig. 2, for the cloud control platform, the working method may be as follows:
s1, configuring a communication network of the cloud robot on the server host;
s2, appointing a communication address (IP address) of each cloud robot on the server host;
s3, running a map building program of the SLAM on the mobile robot;
s4, running the rivz software on the server host to synchronously display the construction of the map, and storing the map after the construction is finished;
and S5, opening the constructed map on the server host, and designating a destination on the map, wherein the mobile robot can automatically navigate to the destination.
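The server-host side of steps S1–S5 can be sketched as shell commands. The IP addresses, map names and package choices are assumptions for illustration only:

```shell
# Hypothetical server-host sketch of steps S1–S5 (assumed addresses/names).

# S1–S2: point every machine at the same ROS master and give each cloud
# robot a fixed address on the shared network.
export ROS_MASTER_URI=http://192.168.1.100:11311   # server host (assumed IP)
export ROS_IP=192.168.1.100

# S4: run rviz on the server host to watch the map being built by the
# robot-side SLAM node, then save the finished map.
rosrun rviz rviz &
rosrun map_server map_saver -f ~/maps/office

# S5: serve the saved map so the navigation stack can drive the robot
# to a goal designated on the map (e.g. via the rviz "2D Nav Goal" tool).
rosrun map_server map_server ~/maps/office.yaml
```

This is a configuration sketch; a running ROS master and robot are required for any of it to take effect.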
As shown in fig. 3, the network setup at the server host may include:
s11, starting a route forwarding function of the gateway server at the server host, and setting net.ipv4.ip _ forward to 1;
s12, starting a Docker container (ros (robot Operating system) as a start-up image at the server host), setting the network to be in a none mode, closing a bridge mode, and setting a container environment variable (— env ═ DISPLAY), (-env ═ QT _ X11_ NO _ mitsehm ═ 1") to be in a data volume sharing variable (— volume ═ tmp/. X11-unix:/. X11-unix: rw");
s13, setting authority (xhost + local: root) of X server host in the terminal window outside the container of the server host;
s14, respectively setting the IP addresses of containers (ROS as a starting mirror image) by a pipeline tool at a container external terminal window of the server host;
s15, bridging an eth0 network card of the server host to a br0 network bridge (sudo brctl addif br0 eth 0);
s16, the server host sets the IP address of bridge br0 (IP addr add < IP address > dev br 0).
Experiments show good results. In the experiments, a Docker ROS container was opened, and cross-host communication between Docker containers, control of the mobile robot, and transmission of environmental data collected by the camera to the Docker container cloud control platform were achieved. The Docker container cloud control platform draws a local area map from the RGB-D data collected by the camera, the infrared sensor and the like; a reasonable target point is set on the local area map of the Docker container cloud control platform, and the remote mobile robot then moves to the corresponding target point according to the target instruction.
The mobile robot provided by the invention offloads the robot's processing tasks to the cloud, leveraging the strong computing capability of cloud computing. In cloud computing, the technology suited to large-scale scenarios is containerized clustering, and the virtualization technology currently applied to cloud robots is mainly LXC (Linux Containers); with the emergence of the Docker platform, whose lower layer was originally implemented with LXC, this has changed. LXC sandboxes Linux processes so that they are isolated from each other and the resource allocation of each process can be controlled. Docker builds on LXC and provides a range of more powerful functions:
(1) Portability: Docker defines a new format that packages an application and its dependent environment into a single object that can be shared on any machine with Docker installed, and executing this object on any machine yields the same result. LXC only provides process sandboxing and does not allow migration across different machines; Docker abstracts all of the application's configuration and packages it into a container, making the container portable.
(2) Application-centered design: Docker optimizes the deployment of applications, which is reflected in its API, user interface, design principles and documentation, whereas LXC treats the container merely as a lightweight server.
(3) Automated builds: Docker supports the Dockerfile; all of an application's dependencies, build tools and packages are written into the Dockerfile in source form, and Docker then builds an image from it. The image works equally well on any machine.
(4) Version control: Docker provides a GitHub-like version control function for containers, supporting version rollback and similar operations. Docker also implements incremental upload and download, saving bandwidth.
(5) Component reuse: one image can serve as a base image for creating more specialized images, and multi-layer reuse is supported between images.
(6) Image sharing: Docker operates Docker Hub, which contains a variety of common images and is very convenient to use; users can upload their own images, and can also build a private Docker registry to share images internally.
(7) Tool ecosystem: Docker defines an API for automating and customizing the creation and deployment of containers, and a large number of tools already integrate Docker, such as Deis, Mesos, docker-ui and Jenkins.
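The automated-build and image-sharing workflow described above can be sketched as shell commands that write a minimal Dockerfile and build from it. The base image tag, package names and repository name are illustrative assumptions:

```shell
# Hypothetical sketch of the Dockerfile-based automated build: the ROS
# distribution tag, navigation packages and repository name are assumed.
cat > Dockerfile <<'EOF'
FROM ros:kinetic
# Each RUN line becomes a cached image layer, enabling component reuse
# and incremental upload/download of only the changed layers.
RUN apt-get update && apt-get install -y \
    ros-kinetic-gmapping \
    ros-kinetic-navigation \
 && rm -rf /var/lib/apt/lists/*
CMD ["bash", "-c", "source /opt/ros/kinetic/setup.bash && roscore"]
EOF

# Build the image from the Dockerfile, then share it through a registry
# such as Docker Hub (or a private registry for internal sharing).
docker build -t myrepo/ros-nav .
docker push myrepo/ros-nav
```

Because the Dockerfile pins the whole dependent environment in source form, the resulting image behaves identically on any machine running Docker, which is the portability property the text emphasizes.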
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable storage medium, and the storage medium may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic or optical disk, or the like.
While the cloud robot navigation system based on the Docker container provided by the present invention has been described in detail, a person skilled in the art may change the specific implementation and the application scope according to the ideas of the embodiments of the present invention; in summary, the content of this description should not be construed as limiting the present invention.

Claims (10)

1. A cloud robot navigation system based on a Docker container, comprising: the system comprises a cloud control platform and more than one mobile robot, wherein the mobile robots are connected with the cloud control platform by adopting a wireless communication channel;
the mobile robot comprises a first wireless transceiver, a mobile body for realizing self movement, an environment sensing assembly for acquiring image data of the surrounding environment of the mobile robot, and a data processor for drawing a regional map of the image data, wherein the data processor is respectively electrically connected with the mobile body and the environment sensing assembly;
the cloud control platform comprises a second wireless transceiver communicated with the first wireless transceiver, and a server host used for receiving the collected data of the mobile robot through the second wireless transceiver and sending a control instruction to the data processor, wherein the server host is provided with a Docker ROS container, and the data processor receives the control instruction through the first wireless transceiver.
2. The Docker-container-based cloud robot navigation system of claim 1, wherein the data processor is disposed inside the mobile body, and the environment sensing component is disposed on the mobile body.
3. The Docker-container-based cloud robot navigation system according to claim 1, wherein the control instructions include mapping instructions and navigation instructions, and the data processor is further configured to control the mobile body to move in the surrounding environment and draw a regional map according to the mapping instructions, and determine a traveling direction and a traveling route of the mobile body in the drawn regional map according to the navigation instructions.
4. The Docker-container-based cloud robot navigation system of claim 3, wherein the mobile body is further configured to navigate according to the travel direction and the travel route.
5. The Docker-container-based cloud robot navigation system of claim 1, further comprising a storage device for providing a pre-stored pre-rendered map, the storage device being in wireless connection with the mobile robot.
6. The Docker-container-based cloud robot navigation system of claim 1, wherein the mobile robot further comprises a memory for providing a pre-stored pre-rendered map, the memory being electrically connected to the data processor.
7. The Docker-container-based cloud robot navigation system of claim 1, wherein the mobile robot further comprises a rotational motion component for mounting the environment sensing component and enabling rotation, the rotational motion component being mounted on the mobile body, the rotational motion component being electrically connected to the data processor.
8. The Docker-container-based cloud robot navigation system of claim 6 or 7, wherein the pre-rendered map is a map of the area drawn by a SLAM algorithm while the mobile robot moves within the environmental area.
9. The Docker-container-based cloud robot navigation system of claim 1, wherein the environment sensing component comprises one or more of a camera, an infrared sensor and a laser sensor.
10. A method of operating a Docker container-based cloud robotic navigation system as claimed in any of claims 1 to 9, comprising the steps of:
s1, configuring a communication network of the cloud robot on the server host;
s2, appointing a communication address (IP address) of each cloud robot on the server host;
s3, running a map building program of the SLAM on the mobile robot;
s4, running the rivz software on the server host to synchronously display the construction of the map, and storing the map after the construction is finished;
and S5, opening the constructed map on the server host, and designating a destination on the map, wherein the mobile robot can automatically navigate to the destination.
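Steps S1 and S2 above amount to giving each robot a fixed, reachable address so that the server host (and the Docker ROS container it runs) can address the robots individually. As a minimal illustrative sketch only, not part of the patent — the helper name and the convention of reserving the first host address for the server are assumptions — the per-robot addresses could be derived from a subnet like this:

```python
import ipaddress

def assign_robot_addresses(subnet: str, num_robots: int, reserved: int = 1):
    """Assign one static IP per cloud robot from the given subnet.

    The first `reserved` host addresses are kept back, e.g. for the
    server host running the Docker ROS container (hypothetical convention).
    """
    hosts = list(ipaddress.ip_network(subnet).hosts())
    if num_robots > len(hosts) - reserved:
        raise ValueError("subnet too small for the requested robot count")
    return [str(h) for h in hosts[reserved:reserved + num_robots]]
```

For example, `assign_robot_addresses("192.168.1.0/24", 3)` reserves 192.168.1.1 for the server host and yields 192.168.1.2 through 192.168.1.4 for three robots.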
CN201810906387.5A 2018-08-10 2018-08-10 Cloud robot navigation system based on Docker container and working method thereof Pending CN110865636A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810906387.5A CN110865636A (en) 2018-08-10 2018-08-10 Cloud robot navigation system based on Docker container and working method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810906387.5A CN110865636A (en) 2018-08-10 2018-08-10 Cloud robot navigation system based on Docker container and working method thereof

Publications (1)

Publication Number Publication Date
CN110865636A true CN110865636A (en) 2020-03-06

Family

ID=69650823

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810906387.5A Pending CN110865636A (en) 2018-08-10 2018-08-10 Cloud robot navigation system based on Docker container and working method thereof

Country Status (1)

Country Link
CN (1) CN110865636A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111805557A (en) * 2020-07-22 2020-10-23 上海上实龙创智能科技股份有限公司 Indoor explanation system and method based on humanoid robot
CN114043486A (en) * 2021-12-09 2022-02-15 东北大学 Distributed SLAM robot control strategy and system based on cloud service

Similar Documents

Publication Publication Date Title
JP6868028B2 (en) Autonomous positioning navigation equipment, positioning navigation method and autonomous positioning navigation system
CN111968262B (en) Semantic intelligent substation inspection operation robot navigation system and method
CN111448476B (en) Technique for sharing mapping data between unmanned aerial vehicle and ground vehicle
CN110160542B (en) Method and device for positioning lane line, storage medium and electronic device
CN109737981B (en) Unmanned vehicle target searching device and method based on multiple sensors
CN112161618B (en) Storage robot positioning and map construction method, robot and storage medium
US20210131821A1 (en) Techniques for collaborative map construction between an unmanned aerial vehicle and a ground vehicle
US20210365038A1 (en) Local sensing based autonomous navigation, and associated systems and methods
CN104180814A (en) Navigation method in live-action function on mobile terminal, and electronic map client
US11069080B1 (en) Collaborative airborne object tracking systems and methods
KR20140144921A (en) Simulation system for autonomous vehicle using virtual reality
CN108458712A (en) Unmanned trolley navigation system and air navigation aid, unmanned trolley
CN109459029A (en) It is a kind of for determining the method and apparatus of the navigation routine information of target object
US20210208608A1 (en) Control method, control apparatus, control terminal for unmanned aerial vehicle
CN109978954A (en) The method and apparatus of radar and camera combined calibrating based on cabinet
CN208937980U (en) Cloud Algorithms of Robots Navigation System based on Docker container
CN110794844A (en) Automatic driving method, device, electronic equipment and readable storage medium
CN114485619A (en) Multi-robot positioning and navigation method and device based on air-ground cooperation
CN111026107A (en) Method and system for determining the position of a movable object
CN110865636A (en) Cloud robot navigation system based on Docker container and working method thereof
CN112447058B (en) Parking method, parking device, computer equipment and storage medium
CN115435772A (en) Method and device for establishing local map, electronic equipment and readable storage medium
CN114127738A (en) Automatic mapping and positioning
CN113253719B (en) Intelligent mobile device based on ROS (reactive oxygen species) operating system and communication establishment method
CN112154480B (en) Positioning method and device for movable platform, movable platform and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination