CN111352425A - Navigation system, method, device, electronic equipment and medium - Google Patents

Navigation system, method, device, electronic equipment and medium Download PDF

Info

Publication number
CN111352425A
Authority
CN
China
Prior art keywords
robots
sensor data
robot
computing platform
sent
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010182901.2A
Other languages
Chinese (zh)
Other versions
CN111352425B (en)
Inventor
王兵
Current Assignee
Beijing Orion Star Technology Co Ltd
Original Assignee
Beijing Orion Star Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Orion Star Technology Co Ltd
Priority to CN202010182901.2A
Publication of CN111352425A
Application granted
Publication of CN111352425B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0287 Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching


Abstract

Embodiments of the invention provide a navigation system, method, apparatus, electronic device, and medium, relating to the field of robotics. The system includes a computing platform and a plurality of robots, each of which is equipped with sensors. The robots send the sensor data collected by their own sensors to the computing platform; the computing platform receives the sensor data sent by the robots, creates a map based on it, and positions and/or repositions any of the robots. The system can reduce the difficulty and cost of deploying robots.

Description

Navigation system, method, device, electronic equipment and medium
Technical Field
The present invention relates to the field of robotics, and in particular, to a navigation system, method, apparatus, electronic device, and medium.
Background
Simultaneous localization and mapping (SLAM) refers to a robot starting from an unknown position in an unknown environment, localizing itself during movement from pose estimates and the map built so far, and incrementally constructing the map on the basis of that self-localization, thereby achieving autonomous localization and mapping.
In the related art, a robot may collect information about its surroundings through multiple sensors and then perform local computation on the data from each sensor to complete tasks such as mapping and localization. However, if the robot carries many sensors, it must process every type of data they collect, which requires a large amount of computation. To provide enough computing resources, the robot must be fitted with a very powerful computing chip, which makes deploying the robot difficult and expensive.
Disclosure of Invention
Embodiments of the invention aim to provide a navigation system, method, apparatus, electronic device, and medium that reduce the difficulty and cost of deploying robots. The specific technical solutions are as follows:
in a first aspect, an embodiment of the present application provides a navigation system, including: the system comprises a computing platform and a plurality of robots, wherein the plurality of robots are provided with sensors; wherein:
the robots are used for sending sensor data acquired by the sensors of the robots to the computing platform;
and the computing platform is used for receiving the sensor data sent by the plurality of robots, creating a map based on the sensor data sent by the plurality of robots, and positioning and/or repositioning any robot in the plurality of robots.
In one possible implementation, the system further includes a clock synchronization server connected to the plurality of robots, the robots sharing the time of the clock synchronization server;
the multiple robots are specifically used for sending sensor data which are acquired by sensors of the robots and carry timestamps to the computing platform;
the computing platform is specifically configured to receive sensor data with timestamps sent by the multiple robots, perform data alignment on the sensor data sent by the multiple robots according to the timestamps, create a map based on the sensor data after the data alignment, and perform positioning and/or repositioning on any one of the multiple robots.
In a possible implementation, the computing platform is further configured to, after creating a map based on the aligned sensor data: if timestamped sensor data is received again, determine the position corresponding to the newly received data and retrieve the historical sensor data for the same position; and, if the confidence of the newly received data is greater than that of the historical data, align the new data by timestamp, create a map based on it, and update the existing map at that position with the newly created one.
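As a rough illustration of this confidence-gated update, the sketch below stores one historical reading per position and replaces the map patch only when a newer reading arrives with higher confidence. The grid-cell positions, confidence scores, and update rule are assumptions for the example, not the patent's actual implementation.

```python
# Hypothetical sketch of the confidence-gated map update described above.
# Positions, confidence values, and the replacement rule are illustrative.

def update_map(existing_map, history, position, new_data, new_confidence):
    """Rebuild the map at `position` only when the newly received
    sensor data is more trustworthy than the stored historical data."""
    old = history.get(position)
    if old is not None and new_confidence <= old["confidence"]:
        return existing_map  # keep the existing map for this position
    # Newly received data wins: remember it and patch the map in place.
    history[position] = {"data": new_data, "confidence": new_confidence}
    existing_map = dict(existing_map)
    existing_map[position] = new_data
    return existing_map

# Example: a higher-confidence scan replaces the old map patch,
# while a later lower-confidence scan is ignored.
m = {(2, 3): "old_scan"}
hist = {(2, 3): {"data": "old_scan", "confidence": 0.6}}
m = update_map(m, hist, (2, 3), "new_scan", 0.9)    # replaces
m = update_map(m, hist, (2, 3), "stale_scan", 0.5)  # ignored
```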
In a possible implementation manner, the computing platform is specifically configured to perform data alignment on sensor data sent by multiple robots in different areas according to timestamps, and create a map based on the sensor data after the data alignment; and/or,
the computing platform is specifically used for performing data alignment on sensor data sent by multiple robots in the same area according to the timestamp, performing data fusion on the sensor data subjected to data alignment, and creating a map based on the sensor data subjected to data fusion.
In a possible implementation manner, the multiple robots are specifically configured to acquire a picture including a mapping guidance object through a camera, move along with the mapping guidance object, and send sensor data acquired by their own sensors to the computing platform in a moving process.
In one possible implementation manner, the computing platform is further configured to send a positioning failure message to a first robot if the first robot cannot be positioned based on sensor data sent by the first robot before any one of the plurality of robots is positioned based on the sensor data after data alignment, where the first robot is any one of the plurality of robots;
the first robot is further configured to receive a positioning failure message sent by the computing platform, and send a co-location request to the computing platform;
the computing platform is further used for receiving a co-location request sent by the first robot;
the computing platform is specifically configured to, in response to the co-location request, determine, based on a location result obtained by last locating the plurality of robots, a second robot whose distance from the first robot is smaller than a preset distance threshold, acquire sensor data sent by the second robot from sensor data with timestamps sent by the plurality of robots, perform data alignment on the sensor data sent by the first robot and the sensor data sent by the second robot according to the timestamps, and locate the first robot based on the sensor data after data alignment.
In a possible implementation, the computing platform is specifically configured to: if a first robot, which is any one of the plurality of robots, cannot be positioned based on the sensor data it sent, determine a second robot whose distance from the first robot is smaller than a preset distance threshold, based on the result of the last positioning of the plurality of robots; acquire the second robot's data from the timestamped sensor data sent by the plurality of robots; align the sensor data sent by the first robot and the second robot according to the timestamps; and position the first robot based on the aligned sensor data.
In a possible implementation, a third robot, which is any one of the plurality of robots, is further configured to send a relocation request to the computing platform, the request carrying timestamped sensor data collected by the third robot's sensors from its surroundings;
the computing platform is specifically configured to receive a relocation request sent by the third robot, search sensor data that is the same as or similar to the sensor data carried in the relocation request from stored sensor data, perform data alignment on the searched sensor data based on a timestamp, and determine the position of the third robot based on a position corresponding to the sensor data after the data alignment.
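A minimal sketch of this relocation step might look as follows: the platform keeps stored readings keyed by position and returns the position whose reading is most similar to the query. The feature-vector representation and Euclidean similarity are assumptions for the example; a real system would compare laser scans or visual features.

```python
# Illustrative sketch of relocation by matching stored sensor data.
# Feature vectors and the distance metric are assumptions.

def relocate(stored, query, max_distance=1.0):
    """Return the stored position whose sensor reading is most similar
    to `query`, or None when nothing is close enough."""
    best_pos, best_d = None, max_distance
    for pos, reading in stored.items():
        # Simple Euclidean distance between feature vectors.
        d = sum((a - b) ** 2 for a, b in zip(reading, query)) ** 0.5
        if d < best_d:
            best_pos, best_d = pos, d
    return best_pos

stored = {(0, 0): [1.0, 2.0, 3.0], (5, 5): [9.0, 9.0, 9.0]}
pos = relocate(stored, [1.1, 2.0, 2.9])  # closest to the scan at (0, 0)
```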
In one possible implementation, the computing platform is a cloud computing platform, an edge computing platform, or a designated robot.
In a second aspect, an embodiment of the present application provides a navigation method, where the method is applied to a computing platform in a navigation system, and the navigation system further includes multiple robots, where the multiple robots are all provided with sensors; the method comprises the following steps:
receiving sensor data sent by the plurality of robots;
creating a map, locating and/or repositioning any of the plurality of robots based on sensor data sent by the plurality of robots.
In one possible implementation manner, the navigation system further comprises a clock synchronization server connected with the plurality of robots, and the plurality of robots share the time of the clock synchronization server;
the receiving sensor data that the multiple robots sent includes:
receiving sensor data which are sent by the plurality of robots and carry timestamps;
the creating a map, locating and/or repositioning any robot of the plurality of robots based on sensor data sent by the plurality of robots includes:
and performing data alignment on the sensor data sent by the plurality of robots according to the timestamps, creating a map based on the sensor data after the data alignment, and positioning and/or repositioning any robot in the plurality of robots.
In one possible implementation, after the creating a map based on the data-aligned sensor data, the method further includes:
if the sensor data carrying the timestamp are received again, determining the position corresponding to the sensor data received this time, and acquiring historical sensor data corresponding to the same position;
if the confidence coefficient of the sensor data received this time is greater than that of the historical sensor data, performing data alignment on the sensor data received this time according to the timestamp, and creating a map based on the sensor data received this time after the data alignment;
and updating the existing map of the same position based on the newly created map.
In one possible implementation manner, the data aligning the sensor data sent by the multiple robots according to the time stamps and creating a map based on the sensor data after the data aligning includes:
performing data alignment on sensor data sent by a plurality of robots in different areas according to the timestamps, and creating a map based on the sensor data after the data alignment; and/or,
and performing data alignment on sensor data sent by a plurality of robots in the same area according to the time stamp, performing data fusion on the sensor data subjected to data alignment, and creating a map based on the sensor data subjected to data fusion.
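The two strategies above can be sketched together: readings from robots in different areas are kept as separate sub-maps, while readings from robots in the same area are fused. The area labels and the simple averaging fusion below are illustrative assumptions, not the patent's fusion method.

```python
# Hedged sketch: group aligned readings by area, fuse within each area.
from collections import defaultdict

def build_maps(readings):
    """readings: list of (area, robot_id, value) tuples after alignment.
    Returns one fused map value per area (here: a simple average)."""
    by_area = defaultdict(list)
    for area, robot_id, value in readings:
        by_area[area].append(value)
    # Fuse same-area data; different areas stay separate sub-maps.
    return {area: sum(vals) / len(vals) for area, vals in by_area.items()}

maps = build_maps([
    ("lobby", "A", 2.0), ("lobby", "B", 4.0),  # same area: fused
    ("hall",  "C", 7.0),                       # different area: own map
])
```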
In one possible implementation, before positioning any one of the plurality of robots based on the data-aligned sensor data, the method further includes:
if the first robot cannot be positioned based on the sensor data sent by the first robot, sending a positioning failure message to the first robot, wherein the first robot is any one of the plurality of robots;
receiving a co-location request sent by the first robot;
the data alignment of the sensor data sent by the plurality of robots according to the timestamps and the positioning of any one of the plurality of robots based on the sensor data after the data alignment comprises:
determining a second robot, the distance between which and the first robot is smaller than a preset distance threshold, based on a positioning result of the last positioning of the plurality of robots, in response to the co-positioning request;
acquiring sensor data sent by the second robot from the sensor data with the timestamps sent by the plurality of robots;
and performing data alignment on the sensor data sent by the first robot and the sensor data sent by the second robot according to the time stamp, and positioning the first robot based on the sensor data after the data alignment.
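The neighbor-selection step of this co-location flow can be sketched as below: on a positioning failure, the platform picks another robot whose last known position lies within the preset threshold, so that robot's data can be combined with the first robot's. The function and record names are hypothetical.

```python
# Minimal sketch of choosing the "second robot" for co-location.

def find_second_robot(last_positions, first_id, threshold=5.0):
    """Return the id of the nearest other robot closer than `threshold`
    to the first robot's last known position, or None."""
    fx, fy = last_positions[first_id]
    best_id, best_d = None, threshold
    for rid, (x, y) in last_positions.items():
        if rid == first_id:
            continue
        d = ((x - fx) ** 2 + (y - fy) ** 2) ** 0.5
        if d < best_d:
            best_id, best_d = rid, d
    return best_id

last_positions = {"r1": (0.0, 0.0), "r2": (1.0, 2.0), "r3": (30.0, 40.0)}
helper = find_second_robot(last_positions, "r1")  # r2 is closest
```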
In one possible implementation manner, the data aligning the sensor data sent by the plurality of robots according to the time stamps and locating any one of the plurality of robots based on the sensor data after the data aligning includes:
if the first robot cannot be positioned based on sensor data sent by the first robot, determining a second robot of which the distance from the first robot is smaller than a preset distance threshold value based on a positioning result of the last positioning of the plurality of robots, wherein the first robot is any one of the plurality of robots;
acquiring sensor data sent by the second robot from the sensor data with the timestamps sent by the plurality of robots;
and performing data alignment on the sensor data sent by the first robot and the sensor data sent by the second robot according to the time stamp, and positioning the first robot based on the sensor data after the data alignment.
In one possible implementation, data alignment is performed on sensor data sent by the plurality of robots according to timestamps, and any one of the plurality of robots is repositioned based on the sensor data after data alignment, including:
receiving a relocation request sent by a third robot, where the request carries timestamped sensor data collected by the third robot's sensors from its surroundings, the third robot being any one of the plurality of robots;
and searching sensor data which is the same as or similar to the sensor data carried in the repositioning request from the stored sensor data, performing data alignment on the searched sensor data based on the timestamp, and determining the position of the third robot based on the position corresponding to the sensor data after the data alignment.
In one possible implementation, the computing platform is a cloud computing platform, an edge computing platform, or a designated robot.
In a third aspect, an embodiment of the present application provides a navigation device, where the navigation device is applied to a computing platform in a navigation system, where the navigation system further includes multiple robots, and each of the multiple robots is provided with a sensor; the device comprises:
the receiving module is used for receiving the sensor data sent by the plurality of robots;
and the navigation module is used for creating a map based on the sensor data sent by the plurality of robots and positioning and/or relocating any one of the plurality of robots.
In one possible implementation manner, the navigation system further comprises a clock synchronization server connected with the plurality of robots, and the plurality of robots share the time of the clock synchronization server;
the receiving module is specifically used for receiving sensor data which are sent by the plurality of robots and carry timestamps;
the navigation module is specifically configured to perform data alignment on sensor data sent by the plurality of robots according to the timestamps, create a map based on the sensor data after the data alignment, and perform positioning and/or repositioning on any one of the plurality of robots.
In one possible implementation, the apparatus further includes:
the acquisition module is used for determining the position corresponding to the sensor data received at this time and acquiring historical sensor data corresponding to the same position if the receiving module receives the sensor data carrying the timestamp again after the navigation module creates a map based on the sensor data after data alignment;
the navigation module is further configured to perform data alignment on the sensor data received this time according to the timestamp if the confidence of the sensor data received this time by the receiving module is greater than the confidence of the historical sensor data, and create a map based on the sensor data received this time after the data alignment;
and the updating module is used for updating the existing map at the same position based on the newly created map.
In a possible implementation manner, the navigation module is specifically configured to:
performing data alignment on sensor data sent by a plurality of robots in different areas according to the timestamps, and creating a map based on the sensor data after the data alignment; and/or,
and performing data alignment on sensor data sent by a plurality of robots in the same area according to the time stamp, performing data fusion on the sensor data subjected to data alignment, and creating a map based on the sensor data subjected to data fusion.
In one possible implementation, the apparatus further includes:
a sending module, configured to send a positioning failure message to a first robot if the navigation module cannot position the first robot based on sensor data sent by the first robot, where the first robot is any one of the plurality of robots;
the receiving module is further configured to receive a co-location request sent by the first robot;
the navigation module is specifically configured to:
determining a second robot, the distance between which and the first robot is smaller than a preset distance threshold, based on a positioning result of the last positioning of the plurality of robots, in response to the co-positioning request;
acquiring sensor data sent by the second robot from the sensor data with the timestamps sent by the plurality of robots;
and performing data alignment on the sensor data sent by the first robot and the sensor data sent by the second robot according to the time stamp, and positioning the first robot based on the sensor data after the data alignment.
In a possible implementation manner, the navigation module is specifically configured to:
if the first robot cannot be positioned based on sensor data sent by the first robot, determining a second robot of which the distance from the first robot is smaller than a preset distance threshold value based on a positioning result of the last positioning of the plurality of robots, wherein the first robot is any one of the plurality of robots;
acquiring sensor data sent by the second robot from the sensor data with the timestamps sent by the plurality of robots;
and performing data alignment on the sensor data sent by the first robot and the sensor data sent by the second robot according to the time stamp, and positioning the first robot based on the sensor data after the data alignment.
In a possible implementation, the receiving module is further configured to receive a relocation request sent by a third robot, where the request carries timestamped sensor data collected by the third robot's sensors from its surroundings, and the third robot is any one of the plurality of robots;
the navigation module is specifically configured to search sensor data that is the same as or similar to the sensor data carried in the relocation request from the stored sensor data, perform data alignment on the searched sensor data based on the timestamp, and determine the position of the third robot based on the position corresponding to the sensor data after the data alignment.
In one possible implementation, the computing platform is a cloud computing platform, an edge computing platform, or a designated robot.
In a fourth aspect, an embodiment of the present application provides an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory communicate with one another through the communication bus;
a memory for storing a computer program;
a processor, configured to implement the method steps of the second aspect when executing the program stored in the memory.
In a fifth aspect, the present invention further provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the method steps of the second aspect.
In a sixth aspect, embodiments of the present invention also provide a computer program product including instructions, which when executed on a computer, cause the computer to perform the method steps of the second aspect.
With the above technical solution, the plurality of robots send the sensor data collected by their own sensors to the computing platform; the computing platform receives that data, creates a map based on it, and positions and/or repositions any of the robots. The robots themselves do not need to perform mapping, positioning, or repositioning on the collected sensor data, which reduces the amount of computation they require. Because the computing platform has far more computing resources, it performs the computation on the sensor data sent by all the robots, completing mapping and positioning and/or repositioning for any of them. The robots therefore do not need to be fitted with complex computing chips, which reduces the difficulty and cost of deploying them.
Of course, not all of the advantages described above need to be achieved simultaneously by any one product or method practicing the invention.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of a navigation system according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of another navigation system provided in an embodiment of the present application;
fig. 3 is a flowchart of a navigation method according to an embodiment of the present application;
FIG. 4 is a flow chart of another navigation method provided by an embodiment of the present application;
FIG. 5 is a flow chart of another navigation method provided by an embodiment of the present application;
fig. 6 is a schematic structural diagram of a navigation device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of another navigation device according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of another navigation device according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. It is obvious that the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments that a person skilled in the art can derive from them without creative effort shall fall within the protection scope of the present invention.
An embodiment of the present application provides a navigation system, as shown in fig. 1, including: the system comprises a computing platform and a plurality of robots, wherein the plurality of robots are provided with sensors; wherein:
the system comprises a plurality of robots, a computing platform and a plurality of sensors, wherein the robots are used for sending sensor data acquired by own sensors to the computing platform;
and the computing platform is used for receiving the sensor data sent by the plurality of robots, creating a map based on the sensor data sent by the plurality of robots, and positioning and/or relocating any one of the plurality of robots.
With this navigation system, the plurality of robots send the sensor data collected by their own sensors to the computing platform; the computing platform receives that data, creates a map based on it, and positions and/or repositions any of the robots. The robots themselves do not need to perform mapping, positioning, or repositioning on the collected sensor data, which reduces the amount of computation they require. Because the computing platform has far more computing resources, it performs the computation on the sensor data sent by all the robots, completing mapping and positioning and/or repositioning for any of them. The robots therefore do not need to be fitted with complex computing chips, which reduces the difficulty and cost of deploying them.
In addition, the robot does not need to perform complex calculation, so the endurance time of the robot can be prolonged, the complexity of the robot is reduced, and the maintenance cost of the robot is reduced.
Specifically, a robot in the embodiments of the present application may include multiple sensors, such as a monocular camera, a binocular camera, a depth camera, an inertial measurement unit (IMU), a wheel encoder (code disc), an ultrasonic sensor, and an infrared sensor.
The computing platform in the embodiments of the present application may be, but is not limited to, a cloud computing platform, an edge computing platform, or a designated robot. Optionally, the computing platform is a heterogeneous platform containing different computing resources, for example a central processing unit (CPU), a graphics processing unit (GPU), and an embedded neural-network processing unit (NPU). Different processors can process different types of sensor data and work cooperatively, which improves computing efficiency and, in turn, the efficiency of mapping and positioning.
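The routing of sensor data to different compute resources on such a heterogeneous platform could be as simple as a lookup table. The mapping below (which sensor types go to which processor) is purely an illustrative assumption; the patent does not specify the assignment.

```python
# Toy sketch of dispatching sensor data types to heterogeneous
# compute resources. The routing table is an assumption.

ROUTING = {
    "image": "GPU",    # vision workloads suit the GPU
    "neural": "NPU",   # neural-network inference suits the NPU
    "lidar": "CPU",    # general-purpose processing on the CPU
}

def route(sensor_type):
    """Pick the processor that handles this kind of sensor data,
    defaulting to the CPU for unknown types."""
    return ROUTING.get(sensor_type, "CPU")
```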
In the embodiments of the present application, because the robots send a large volume of sensor data to the computing platform, the communication between them must offer high bandwidth and low latency. To achieve this, the computing platform and the robots can communicate over a 5th-generation (5G) mobile network or Wi-Fi 6. Communication over 5G or Wi-Fi 6 can reach speeds above 1 Gbps with latency below 10 milliseconds, ensuring that the computing platform can position and/or reposition the robots in real time.
In one embodiment of the present application, as shown in fig. 2, the navigation system further includes a clock synchronization server connected to the plurality of robots, and the plurality of robots share the time of the clock synchronization server.
The multiple robots and the clock synchronization server can be connected through a low-delay network such as a local area network to ensure clock synchronization among the robots.
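One generic way each robot could share the server's clock is an NTP-style offset estimate computed from request/response timestamps, sketched below. The patent does not specify the synchronization protocol, so the symmetric-delay assumption and function names here are illustrative only.

```python
# Sketch of an NTP-style clock offset estimate between a robot and the
# clock synchronization server. Assumes symmetric network delay.

def clock_offset(t_send, t_server, t_recv):
    """Estimate the robot-to-server clock offset: server time minus the
    midpoint of the local send/receive times."""
    return t_server - (t_send + t_recv) / 2.0

def to_server_time(local_t, offset):
    """Convert a local sensor timestamp to shared server time."""
    return local_t + offset

# Robot clock is 2.0 s behind the server; round trip takes 0.2 s.
off = clock_offset(t_send=10.0, t_server=12.1, t_recv=10.2)
stamped = to_server_time(10.5, off)
```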
The multiple robots are specifically used for sending sensor data which are acquired by sensors of the robots and carry timestamps to the computing platform.
The computing platform is specifically used for receiving sensor data which are sent by the multiple robots and carry timestamps, performing data alignment on the sensor data sent by the multiple robots according to the timestamps, creating a map based on the sensor data after the data alignment, and positioning and/or repositioning any robot in the multiple robots.
Optionally, the sensor data sent by a robot further carries information such as a robot ID and a sensor ID, so that the computing platform can distinguish data from different sensors of different robots.
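One way to tag each reading so the platform can tell robots and sensors apart, as just described, is a small message record. The field layout below is an assumption for illustration; the patent does not define a message format.

```python
# Hypothetical message record for timestamped, ID-tagged sensor data.
from dataclasses import dataclass

@dataclass(frozen=True)
class SensorReading:
    robot_id: str      # which robot sent the data
    sensor_id: str     # which of its sensors produced it
    timestamp: float   # shared clock time, in seconds
    payload: bytes     # raw sensor data

r = SensorReading("robot-7", "lidar-0", 12.5, b"\x01\x02")
```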
Because the timestamped sensor data sent by the robots is affected by factors such as network transmission, the data may not arrive at the computing platform in chronological order. For example, at some moment the computing platform may first receive the radar data robot A collected at the 0.2th second and the video data robot A collected at the 0.1th second, and only later receive the radar data robot B collected at the 0.1th second and the video data robot B collected at the 0.2th second. Obviously, if the computing platform used radar data and video data collected at different times for mapping, positioning, and/or repositioning, the results would be inaccurate. The computing platform therefore needs to align the sensor data sent by robot A and robot B according to the timestamps, so that mapping, positioning, and/or repositioning is performed with the radar and video data both robots collected at the 0.1th second, and likewise with the data they collected at the 0.2th second.
Optionally, in this embodiment of the application, when the timestamps are used to align the sensor data, the time granularity used may be at the second level or the millisecond level, or of course at another level, and may be set according to actual needs, which is not limited in this embodiment of the application.
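The alignment step can be sketched as bucketing messages by timestamp at the chosen granularity. This is a minimal illustration under assumed tuple layouts, not the patent's implementation:

```python
from collections import defaultdict

def align_by_timestamp(messages, granularity_ms=100):
    # Group (robot_id, sensor_id, timestamp_ms, payload) tuples into time
    # buckets; flooring the timestamp means data captured within the same
    # interval is processed together regardless of network arrival order.
    buckets = defaultdict(list)
    for robot_id, sensor_id, ts, payload in messages:
        buckets[ts // granularity_ms].append((robot_id, sensor_id, ts, payload))
    return dict(buckets)

# Out-of-order arrival: robot A's radar from 200 ms lands before its video from 100 ms.
incoming = [
    ("A", "radar", 200, "rA2"), ("A", "video", 100, "vA1"),
    ("B", "radar", 100, "rB1"), ("B", "video", 200, "vB2"),
]
aligned = align_by_timestamp(incoming)
print(sorted(aligned))  # [1, 2]
```

After bucketing, bucket 1 holds everything captured in the 100-199 ms interval and bucket 2 everything from 200-299 ms, so mapping never mixes data from different capture instants.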
By adopting this embodiment of the application, after the timestamped sensor data sent by the plurality of robots is received, the sensor data is aligned according to the timestamps. Creating a map based on the aligned sensor data improves the accuracy of the created map, and positioning and/or repositioning any robot of the plurality based on the aligned sensor data improves the accuracy of positioning and/or repositioning.
In one implementation, the movement routes of the plurality of robots may be configured in advance, and the plurality of robots are specifically used for moving based on the pre-configured routes and sending the sensor data acquired by their own sensors to the computing platform during the movement. Alternatively, the robots can autonomously explore an unknown area and send the sensor data to the computing platform, so that the computing platform creates a map based on the sensor data, realizing automatic map creation.
In another implementation manner, the multiple robots are specifically configured to acquire a picture including the mapping guide object through cameras, move along with the mapping guide object, and send sensor data acquired by their own sensors to the computing platform in the moving process.
The mapping guide object may be a person or another piece of movable equipment. Taking the case that the mapping guide object is a person as an example, the robot can determine the person's movement track from the pictures acquired by its camera and move as the person moves, so that the robot's mapping route can be guided manually.
In another embodiment of the application, the computing platform is specifically configured to perform data alignment on sensor data sent by multiple robots located in different areas according to timestamps, and create a map based on the sensor data after the data alignment; and/or,
and the computing platform is specifically used for performing data alignment on sensor data sent by a plurality of robots in the same area according to the timestamp, performing data fusion on the sensor data subjected to data alignment, and creating a map based on the sensor data subjected to data fusion.
In the first implementation, a plurality of robots may be configured to collect sensor data in different areas, respectively. After receiving the sensor data collected by the multiple robots, the computing platform aligns the sensor data based on the timestamps, splices the sensor data from the multiple robots, and creates a map based on the spliced sensor data. Alternatively, after receiving the sensor data collected by the multiple robots, the computing platform aligns the sensor data based on the timestamps, creates a map from the aligned sensor data of each robot separately to obtain a map of the area where each robot is located, and then splices the created maps to obtain a complete map.
For example, to create an internal map of an office, 3 robots may be configured, namely robot 1, robot 2, and robot 3, and the office is divided into 3 areas, namely area 1, area 2, and area 3. Robot 1, robot 2, and robot 3 are respectively used to collect radar data and image data in area 1, area 2, and area 3 and send them to the computing platform. After receiving the radar data and image data sent by robot 1, robot 2, and robot 3, the computing platform aligns the received radar data and image data based on the timestamps, for example, sequentially grouping the radar data and image data of the 1st second, the 2nd second, and so on up to the nth second. It then creates a map of area 1 based on the aligned radar data and image data corresponding to robot 1, a map of area 2 based on the aligned radar data and image data corresponding to robot 2, and a map of area 3 based on the aligned radar data and image data corresponding to robot 3, and splices the map of area 1, the map of area 2, and the map of area 3 to obtain an internal map of the whole office.
It can be understood that there are boundaries among area 1, area 2, and area 3. Taking the boundary between area 1 and area 2 as an example, at the boundary the sensor of robot 1 can acquire part of the radar data and image data in area 2, and similarly the sensor of robot 2 can acquire part of the radar data and image data in area 1. Optionally, when creating the map of area 1, the computing platform may align, based on the timestamps, the radar data and image data in area 1 acquired by robot 2 with the radar data and image data in area 1 acquired by robot 1, and then create the map of area 1 based on the aligned data. Likewise, the maps of area 2 and area 3 may be created in the same way, and details are not repeated here.
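The splicing of per-area maps into one office map can be sketched as follows, assuming each area map is a set of occupancy cells in a shared world frame (a simplification of whatever map representation the platform actually uses):

```python
def stitch_maps(area_maps):
    # Merge per-area occupancy maps into one global map. area_maps maps an
    # area name to {(x, y): occupied} cells in a shared world frame. At
    # area boundaries the same cell may appear in two maps; a cell is
    # marked occupied if any contributing map saw it occupied.
    global_map = {}
    for cells in area_maps.values():
        for cell, occupied in cells.items():
            global_map[cell] = global_map.get(cell, False) or occupied
    return global_map

office = stitch_maps({
    "area_1": {(0, 0): True, (1, 0): False},
    "area_2": {(1, 0): True, (2, 0): False},  # boundary cell (1, 0) overlaps area 1
    "area_3": {(3, 0): True},
})
print(office[(1, 0)])  # True: the boundary cell is occupied in at least one map
```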
In the second embodiment, multiple robots may be configured to move in the same area, and the acquisition ranges of the sensors of the multiple robots may partially overlap.
For example, suppose there are two robots in area 1, robot A and robot B. Robot A and robot B move in area 1 and send the radar data and image data they acquire to the computing platform. After receiving the radar data and image data sent by robot A and robot B, the computing platform sequentially groups the received radar data and image data of the 1st second, the 2nd second, and so on up to the nth second, thereby aligning the radar data and image data sent by robot A and robot B. It then performs data fusion on the aligned radar data and image data sent by robot A and robot B, and creates a map of area 1 based on the fused data.
Data fusion refers to combining data from multiple sensors, eliminating redundancy and contradictions between the data, and keeping complementary data. For example, if robot A acquires sensor data within range 1 and robot B acquires sensor data within range 2, the two sets of data are spliced together. If robot A and robot B both acquire sensor data within range 3, the data they acquired within range 3 is analyzed, redundant data is deleted, and complementary data is kept, yielding sensor data more complete than what a single robot could acquire within range 3. For instance, if an obstruction at some position in range 3 means robot A can only collect the unobstructed image data, while robot B collects image data of the obstructed part from another angle, fusing the image data of robot A and robot B yields the complete image data of range 3. For the specific data fusion method, reference may be made to sensor data fusion techniques in the related art.
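A minimal sketch of this fusion rule, under the assumption that readings are keyed by cell and a blocked observation is represented as None (both assumptions are illustrative, not from the patent):

```python
def fuse(readings_a, readings_b):
    # Fuse two robots' readings over partially overlapping coverage.
    # Cells seen by only one robot are spliced in; where both observed a
    # cell, duplicates collapse to one value and a None (blocked view)
    # from one robot is filled by the other's reading: complementary data
    # is kept, redundant data is dropped.
    fused = {}
    for cell in readings_a.keys() | readings_b.keys():
        a, b = readings_a.get(cell), readings_b.get(cell)
        fused[cell] = a if a is not None else b
    return fused

# Robot A is blocked at cell (2, 2); robot B sees it from another angle.
fused = fuse({(1, 1): "wall", (2, 2): None}, {(2, 2): "door", (3, 3): "wall"})
print(fused[(2, 2)])  # door
```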
In the third embodiment, the first embodiment described above may be combined with the second embodiment. That is, some robots are configured to move in the same area and some robots move in different areas.
For example, to create an internal map of an office, robot 2 and robot 3 may be configured to collect sensor data in area 2 and area 3, respectively, and robot A and robot B may be configured to collect sensor data in area 1. The maps of area 2 and area 3 are obtained as in the example of the first implementation, the map of area 1 as in the example of the second implementation, and the maps of area 1, area 2, and area 3 are then spliced to obtain an internal map of the entire office.
By adopting the above three implementations, the computing platform can realize distributed mapping based on the sensor data sent by the multiple robots. Because the sensor data collected by a single robot may have blind areas or partially poor quality, the map built by the computing platform from the sensor data of multiple robots is more accurate and complete than a map built by a single robot.
In another embodiment of the present application, after the map is created, in order to make the map more complete, or when the environmental information in the area corresponding to the map changes, for example, when the placement position of an article in a room changes, the robots may be configured to move again in the area covered by the created map.
The computing platform is further configured to, after creating a map based on the aligned sensor data: if timestamped sensor data is received again, determine the position corresponding to the newly received sensor data and acquire the historical sensor data corresponding to the same position; and if the confidence of the newly received sensor data is greater than the confidence of the historical sensor data, align the newly received sensor data according to the timestamps, create a map based on the aligned newly received data, and update the existing map at the same position based on the newly created map.
After the computing platform receives timestamped sensor data again, the confidence of the sensor data can be determined from the similarity between sensor data at adjacent moments. Optionally, the similarity between sensor data at adjacent moments may be determined based on the covariance matrix between them, and the confidence is then determined from the similarity: the higher the similarity between the sensor data, the greater the confidence.
If the confidence of the newly received sensor data is greater than that of the historical sensor data, a map created from the newly received sensor data will be more accurate. A map can therefore be created from the newly received sensor data and replace the existing map at the same position, realizing dynamic updating of the map and improving its accuracy.
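The update decision can be sketched as follows. For simplicity the similarity here is derived from the mean absolute difference between adjacent readings rather than the covariance matrix the embodiment mentions, but the shape of the rule (higher similarity, higher confidence; update only when the new confidence exceeds the historical one) is the same:

```python
def confidence(readings):
    # Confidence of a sequence of sensor readings from the similarity of
    # adjacent-time readings: the closer consecutive readings are, the
    # higher the confidence. Mean absolute difference is a stand-in for
    # the covariance-based similarity described in the embodiment.
    sims = []
    for prev, cur in zip(readings, readings[1:]):
        diff = sum(abs(p - c) for p, c in zip(prev, cur)) / len(prev)
        sims.append(1.0 / (1.0 + diff))  # similarity in (0, 1]
    return sum(sims) / len(sims)

def should_update_map(new_readings, historical_readings):
    # Rebuild the map for this position only if the new data is more trustworthy.
    return confidence(new_readings) > confidence(historical_readings)

steady = [[1.0, 2.0], [1.0, 2.1], [1.0, 2.0]]  # consistent scans of one position
noisy = [[1.0, 2.0], [5.0, 9.0], [0.0, 1.0]]   # erratic scans of the same position
print(should_update_map(steady, noisy))  # True
```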
In another embodiment of the present application, the computing platform has a co-location function, where co-location refers to using the sensor data of multiple robots to locate any one of them. However, to save computation overhead, the computing platform may perform co-location only when a robot cannot be located based on its own sensor data.
In one implementation, the computing platform is further configured to, before positioning any one of the plurality of robots based on the aligned sensor data, send a positioning failure message to the first robot if the first robot cannot be positioned based on the sensor data sent by the first robot.
Wherein the first robot is any one of the plurality of robots.
If data is missing from the sensor data sent by the first robot, or if the reliability of the acquired sensor data is poor for environmental reasons (for example, no valid data is acquired because an obstacle is present, or the acquired data is unusable because the light is too dark or too bright), the computing platform cannot position the first robot based on that robot's aligned sensor data alone.
And the first robot is also used for receiving the positioning failure message sent by the computing platform and sending a cooperative positioning request to the computing platform.
The co-location request is used for requesting the computing platform to locate the first robot based on the co-location function.
And the computing platform is also used for receiving the cooperative positioning request sent by the first robot.
The computing platform is specifically used for, in response to the cooperative positioning request: determining a second robot whose distance from the first robot is smaller than a preset distance threshold based on the positioning result of the last positioning of the plurality of robots; acquiring the sensor data sent by the second robot from the timestamped sensor data sent by the plurality of robots; performing data alignment on the sensor data sent by the first robot and the sensor data sent by the second robot according to the timestamps; and positioning the first robot based on the aligned sensor data.
For example, suppose the plurality of robots includes robot A, robot B, and robot C, and the first robot is robot A. If the computing platform receives a cooperative positioning request from robot A, it determines the positions of robot A, robot B, and robot C from the positioning result of the last positioning. If the distance between robot B and robot A is smaller than the preset distance threshold, robot B is near robot A, meaning robot B can acquire sensor data around robot A, so robot A can be positioned based on the sensor data sent by robot B together with the sensor data sent by robot A.
Specifically, if the computing platform receives the radar data and image data sent by robot A but cannot position robot A because image data within a certain range is missing, the computing platform may acquire the image data within that range from the image data sent by robot B, align the image data acquired from robot B with the image data and radar data sent by robot A based on the timestamps, and position robot A.
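The selection of the second robot can be sketched as a nearest-neighbor query over the last positioning result, with the preset distance threshold as a cutoff. Function and variable names are illustrative assumptions:

```python
import math

def find_helper_robot(target_id, last_positions, distance_threshold):
    # Pick a second robot whose last known (x, y) position lies within the
    # preset distance threshold of the first (unlocatable) robot; its
    # sensor data can then be used for cooperative positioning.
    tx, ty = last_positions[target_id]
    candidates = [
        (math.hypot(x - tx, y - ty), rid)
        for rid, (x, y) in last_positions.items() if rid != target_id
    ]
    dist, rid = min(candidates)
    return rid if dist < distance_threshold else None

positions = {"A": (0.0, 0.0), "B": (1.0, 0.0), "C": (10.0, 10.0)}
print(find_helper_robot("A", positions, distance_threshold=3.0))  # B
```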
In another implementation, if the computing platform cannot locate the first robot based on the sensor data sent by the first robot, the computing platform may actively perform cooperative positioning, thereby implementing the location of the first robot.
Specifically, the computing platform is configured to, if the first robot cannot be positioned based on the sensor data transmitted by the first robot, determine a second robot having a distance from the first robot smaller than a preset distance threshold value based on a positioning result of the previous positioning of the plurality of robots, acquire the sensor data transmitted by the second robot from the sensor data with time stamps transmitted by the plurality of robots, perform data alignment on the sensor data transmitted by the first robot and the sensor data transmitted by the second robot according to the time stamps, and position the first robot based on the sensor data after the data alignment. Wherein the first robot is any one of the plurality of robots.
By adopting the embodiment of the application, the computing platform can perform cooperative positioning when needed, and does not perform cooperative positioning in other time, so that the computing overhead of the computing platform can be reduced. In addition, when the computing platform cannot position the robot based on sensor data sent by the robot, the robot is positioned in a cooperative positioning mode in time, the success rate of positioning the robot can be improved, and the condition that the robot cannot be positioned is avoided.
Optionally, in another embodiment of the present application, the robot may have a computing resource for implementing a basic positioning function, and after the sensor of the robot acquires the sensor data, the sensor data may be stored for a specified duration, so that after the robot is disconnected from the computing platform due to an emergency, the robot may use the sensor data to perform positioning.
In another embodiment of the present application, the computing platform may enable repositioning of any of the plurality of robots. Repositioning refers to repositioning the robot by the computing platform based on the sensor data stored in the computing platform when the robot cannot determine its position.
And the third robot in the multiple robots is also used for sending a repositioning request to the computing platform, wherein the repositioning request carries sensor data which is acquired by the sensors of the third robot around the third robot and carries time stamps, and the third robot is any one of the multiple robots.
The computing platform is specifically configured to receive a relocation request sent by a third robot, search sensor data that is the same as or similar to the sensor data carried in the relocation request from the stored sensor data, perform data alignment on the searched sensor data based on the timestamp, and determine the position of the third robot based on the position corresponding to the sensor data after the data alignment.
In this embodiment of the application, the computing platform can receive and store the sensor data of the plurality of robots. When the computing platform finds stored sensor data that is the same as or similar to the sensor data carried in the repositioning request, this indicates that another robot was once at the same position as the requesting robot, and the computing platform once located that robot based on the found sensor data. The computing platform can therefore determine the position corresponding to the found sensor data, and the position of the requesting robot can be determined from it, completing the repositioning of the robot.
For example, suppose the third robot is moved to an unknown position while powered off. When the third robot is powered on again and cannot determine its position, it may rotate in place or move randomly and send a repositioning request carrying the sensor data acquired by its sensors, including image data and radar data, to the computing platform. After receiving the repositioning request, the computing platform can search the sensor data stored in the computing platform for sensor data that is the same as or similar to the third robot's image data and radar data.
The found image data and radar data are then aligned based on the timestamps, and the positions corresponding to the aligned data are determined. For example, suppose the found image data and radar data come from robot A. From the aligned image data and radar data it can be determined that robot A stood in front of a wall with a painting on it, and the relative positional relationship between robot A and the painting can be determined. Based on the image data and radar data sent by the third robot, it can be determined that the third robot stands in front of the same wall with the same painting, and the relative positional relationship between the third robot and the painting can be determined. The relative positional relationship between robot A and the third robot is then determined from their respective relationships to the painting, and the position of the third robot is determined from that relationship and the position of robot A.
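The repositioning lookup can be sketched as a similarity search over the stored, already-localized sensor data. Here readings are simplified to feature vectors and similarity to Euclidean distance, which stand in for whatever comparison the platform actually performs:

```python
def relocate(query, stored):
    # Find the stored, already-localized sensor reading most similar to
    # the one carried in the repositioning request and reuse its
    # associated position. `stored` is a list of (reading_vector, position)
    # pairs accumulated from all robots.
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    _, position = min(stored, key=lambda entry: dist(entry[0], query))
    return position

stored = [([0.9, 0.1], (2.0, 3.0)), ([0.1, 0.9], (7.0, 8.0))]
print(relocate([0.85, 0.15], stored))  # (2.0, 3.0)
```

A real system would also need a rejection threshold so that a query with no sufficiently similar stored reading reports failure instead of a spurious position.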
By adopting the embodiment of the application, under the condition that the robot cannot acquire the position of the robot, the computing platform can quickly position the robot based on the sensor data stored by the computing platform, so that the robot can acquire the position of the robot in time.
Based on the navigation system, an embodiment of the present application further provides a navigation method, where the method is applied to a computing platform in the navigation system, the navigation system further includes a plurality of robots, each of the plurality of robots is provided with a sensor, and as shown in fig. 3, the method includes:
S301, receiving the sensor data sent by the plurality of robots.
S302, a map is created based on sensor data sent by the multiple robots, and any one of the multiple robots is positioned and/or repositioned.
By adopting the navigation method, the computing platform can receive the sensor data sent by the plurality of robots, create a map based on that sensor data, and position and/or reposition any one of the plurality of robots. That is, the robots do not need to perform mapping, positioning and/or repositioning on the acquired sensor data themselves, which reduces the amount of computation required of the robots. Because the computing platform has more computing resources, its computing resources are used to process the sensor data sent by the multiple robots and complete the mapping and the positioning and/or repositioning of any one of them. Therefore, with this embodiment of the application, the robots do not need to be equipped with complex computing chips, reducing the deployment difficulty and cost of the robots.
In an implementation manner of the embodiment of the application, the navigation system further includes a clock synchronization server connected to the plurality of robots, and the plurality of robots share the time of the clock synchronization server, and on this basis, the S301 receiving the sensor data sent by the plurality of robots may specifically be implemented as:
receiving the timestamped sensor data sent by the plurality of robots.
Accordingly, the step S302 of creating a map based on the sensor data sent by the multiple robots, and positioning and/or repositioning any robot in the multiple robots may specifically be implemented as:
performing data alignment on the sensor data sent by the multiple robots according to the timestamps, creating a map based on the sensor data after the data alignment, and positioning and/or repositioning any robot of the multiple robots.
By adopting the embodiment of the application, the computing platform aligns the sensor data sent by the multiple robots according to the timestamps after receiving the sensor data which are sent by the multiple robots and carry the timestamps, the accuracy of the created map can be improved by creating the map based on the sensor data after the data alignment, and the accuracy of positioning and/or repositioning can be improved by positioning and/or repositioning any robot in the multiple robots based on the sensor data after the data alignment.
The following describes methods of creating a map, positioning, and repositioning in the embodiments of the present application.
As shown in fig. 4, the method for creating a graph in the embodiment of the present application specifically includes the following steps:
S401, receiving the timestamped sensor data sent by the multiple robots.
S402, aligning the sensor data sent by the robots in different areas according to the time stamps, and creating a map based on the sensor data after data alignment; and/or performing data alignment on sensor data sent by a plurality of robots in the same area according to the time stamp, performing data fusion on the sensor data subjected to data alignment, and creating a map based on the sensor data subjected to data fusion.
Alternatively, after the map is created, the map may also be updated based on the actual situation, so after S402 described above, S403 to S405 may also be performed.
S403, if timestamped sensor data is received again, determining the position corresponding to the sensor data received this time, and acquiring the historical sensor data corresponding to the same position.
S404, if the confidence coefficient of the sensor data received this time is greater than that of the historical sensor data, performing data alignment on the sensor data received this time according to the timestamp, and creating a map based on the sensor data received this time after the data alignment.
It will be appreciated that if the confidence level of the sensor data received this time is less than or equal to the confidence level of the historical sensor data, no update to the map that has been created is necessary.
S405, updating the existing map of the same position based on the newly created map.
By adopting the method, the computing platform can realize distributed map building based on the sensor data sent by the multiple robots, and because the sensor data collected by a single robot has a blind area or the quality of part of the data is poor, compared with a map built by a single robot, the map built by the computing platform based on the sensor data sent by the multiple robots is more accurate and perfect. And after the map is created, the dynamic update of the map can be realized, so that the accuracy of the map is higher.
In an implementation manner, as shown in fig. 5, the positioning method in the embodiment of the present application specifically includes the following steps:
S501, receiving the timestamped sensor data sent by the plurality of robots.
S502, if the first robot cannot be positioned based on the sensor data sent by the first robot, sending a positioning failure message to the first robot.
Wherein the first robot is any one of the plurality of robots.
S503, receiving a cooperative positioning request sent by the first robot.
After receiving the co-location request, the computing platform may implement positioning of the first robot based on the co-location function, that is, perform data alignment on sensor data sent by the multiple robots according to the timestamps, and locate any one of the multiple robots based on the sensor data after the data alignment, which may be specifically implemented as the following S504 to S506.
S504, in response to the cooperative positioning request, determining a second robot whose distance from the first robot is smaller than a preset distance threshold based on the positioning result of the last positioning of the plurality of robots.
S505, acquiring the sensor data sent by the second robot from the timestamped sensor data sent by the plurality of robots.
S506, performing data alignment on the sensor data sent by the first robot and the sensor data sent by the second robot according to the time stamp, and positioning the first robot based on the sensor data after the data alignment.
It is to be appreciated that after the first robot is located, the computing platform can send the location results to the first robot.
In the flow of fig. 5, the computing platform co-locates the robot as needed, that is, the computing platform generally locates the robot based on the sensor data of the robot, and when the robot cannot be located based on the sensor data of the robot, the computing platform co-locates the robot as needed through the flow of fig. 5. In another embodiment, the computing platform may also perform cooperative positioning in real time, and after receiving the sensor data with the time stamp sent by the multiple robots, the computing platform may directly perform cooperative positioning on the first robot based on the sensor data with the time stamp sent by the multiple robots.
By adopting this method, the computing platform does not need to cooperatively position all robots in real time, which saves computing resources. When the first robot cannot be positioned based on its own sensor data, it can be cooperatively positioned at its own request, avoiding the situation where the robot cannot be positioned. This both saves the computing resources of the computing platform and preserves the accuracy and success rate of positioning the robot.
In another implementation manner, the computing platform may actively perform cooperative positioning, that is, perform data alignment on sensor data sent by the multiple robots according to timestamps, and position any one of the multiple robots based on the sensor data after the data alignment, which may specifically be implemented as:
if the first robot cannot be positioned based on the sensor data sent by the first robot, determining a second robot whose distance from the first robot is smaller than a preset distance threshold based on the positioning result of the last positioning of the plurality of robots, where the first robot is any one of the plurality of robots.
Then acquiring the sensor data sent by the second robot from the timestamped sensor data sent by the plurality of robots.
Then performing data alignment on the sensor data sent by the first robot and the sensor data sent by the second robot according to the timestamps, and positioning the first robot based on the aligned sensor data.
In another embodiment of the present application, aligning sensor data sent by a plurality of robots according to timestamps, and repositioning any robot in the plurality of robots based on the sensor data after data alignment may specifically be implemented as:
receiving a repositioning request sent by the third robot, where the repositioning request carries timestamped sensor data acquired by the third robot's sensors from its surroundings, and the third robot is any one of the plurality of robots.
Then searching the stored sensor data for sensor data that is the same as or similar to the sensor data carried in the repositioning request, performing data alignment on the found sensor data based on the timestamps, and determining the position of the third robot based on the position corresponding to the aligned sensor data.
By adopting the embodiment of the application, under the condition that the robot cannot acquire the position of the robot, the computing platform can quickly position the robot based on the sensor data stored by the computing platform, so that the robot can acquire the position of the robot in time.
Corresponding to the method embodiment, the embodiment of the application also provides a navigation device, the device is applied to a computing platform in a navigation system, the navigation system also comprises a plurality of robots, and the plurality of robots are provided with sensors; as shown in fig. 6, the apparatus includes:
a receiving module 601, configured to receive sensor data sent by the multiple robots;
a navigation module 602, configured to create a map, locate and/or reposition any one of the plurality of robots based on the sensor data sent by the plurality of robots.
Optionally, the navigation system further comprises a clock synchronization server connected to the plurality of robots, the plurality of robots sharing time of the clock synchronization server;
a receiving module 601, specifically configured to receive sensor data with timestamps sent by the multiple robots;
the navigation module 602 is specifically configured to perform data alignment on the sensor data sent by the multiple robots according to the timestamps, create a map based on the sensor data after the data alignment, and position and/or reposition any one of the multiple robots.
Optionally, as shown in fig. 7, the apparatus further includes: an acquisition module 701 and an update module 702.
An obtaining module 701, configured to: after the navigation module 602 creates a map based on the aligned sensor data, if the receiving module 601 again receives sensor data carrying timestamps, determine the position corresponding to the newly received sensor data, and obtain the historical sensor data corresponding to that same position;
the navigation module 602 is further configured to, if the confidence of the newly received sensor data is greater than the confidence of the historical sensor data, align the newly received sensor data according to the timestamps and create a map based on the aligned newly received sensor data;
an updating module 702, configured to update the existing map of the same position based on the newly created map.
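The confidence-gated update performed by modules 701 and 702 could be sketched as follows; the tile store, the confidence scale, and the position keys are hypothetical, since the patent does not define how confidence is computed or how the map is keyed:

```python
def maybe_update_map(map_store, position, new_data, new_conf):
    """Rebuild the map entry for `position` only when the newly received
    sensor data is more trustworthy than the historical data recorded there.
    Returns True if the map was updated, False if the old data was kept."""
    old = map_store.get(position)
    if old is not None and new_conf <= old["conf"]:
        return False  # historical data wins; keep the existing map
    map_store[position] = {"data": new_data, "conf": new_conf}
    return True

tiles = {(3, 4): {"data": "old_scan", "conf": 0.6}}
assert maybe_update_map(tiles, (3, 4), "new_scan", 0.9) is True    # higher confidence: update
assert maybe_update_map(tiles, (3, 4), "worse_scan", 0.5) is False  # lower confidence: keep
```

The strict "greater than" comparison mirrors the text: equal-confidence data does not overwrite the existing map.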
Optionally, the navigation module 602 is specifically configured to:
aligning, according to the timestamps, sensor data sent by robots located in different areas, and creating a map based on the aligned sensor data; and/or
aligning, according to the timestamps, sensor data sent by robots located in the same area, fusing the aligned sensor data, and creating a map based on the fused sensor data.
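A toy sketch of the two strategies above: robots in different areas each contribute their own sub-map, stitched side by side, while readings from robots sharing an area are fused before mapping. Fusion by simple averaging, the grid-of-floats map, and all names are assumptions the patent does not mandate:

```python
def build_map(frames_by_area):
    """Combine aligned per-area readings into one world map.
    frames_by_area: area name -> {cell: [reading, ...]} where a cell with
    several readings means several robots observed it in the same area."""
    world = {}
    for area, frames in frames_by_area.items():
        for cell, values in frames.items():
            # Same area, several robots: fuse by averaging their readings.
            # A single-robot area passes through unchanged (stitching).
            world[(area, cell)] = sum(values) / len(values)
    return world

frames_by_area = {
    "lobby":   {(0, 0): [1.0, 1.2], (0, 1): [0.0, 0.2]},  # two robots, fused
    "hallway": {(5, 0): [2.0]},                            # one robot, stitched as-is
}
world = build_map(frames_by_area)
```

Real multi-robot SLAM would fuse poses and scans probabilistically rather than averaging cell values, but the split — stitch across areas, fuse within an area — is the structure the text describes.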
Optionally, as shown in fig. 8, the apparatus further includes: a sending module 801.
A sending module 801, configured to send a positioning failure message to a first robot if the navigation module cannot locate the first robot based on the sensor data sent by the first robot, where the first robot is any one of the plurality of robots;
a receiving module 601, configured to receive a co-location request sent by the first robot;
the navigation module 602 is specifically configured to:
in response to the co-positioning request, determining, based on the result of the most recent positioning of the plurality of robots, a second robot whose distance from the first robot is smaller than a preset distance value;
acquiring the sensor data sent by the second robot from the timestamped sensor data sent by the plurality of robots;
aligning the sensor data sent by the first robot and the sensor data sent by the second robot according to the timestamps, and locating the first robot based on the aligned sensor data.
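The second-robot lookup — pick a helper whose last known position lies within the preset distance of the first robot's last known position — might look like the sketch below; the robot IDs, coordinates, and the threshold of 2.0 are made-up values:

```python
import math

def find_helper_robot(last_positions, first_robot, max_dist=2.0):
    """Return the nearest robot whose last recorded position is within
    `max_dist` of the first robot's last recorded position, or None.
    The helper's sensor data would then be aligned with the first robot's
    by timestamp to locate the first robot."""
    fx, fy = last_positions[first_robot]
    best, best_d = None, max_dist
    for rid, (x, y) in last_positions.items():
        if rid == first_robot:
            continue
        d = math.hypot(x - fx, y - fy)
        if d < best_d:  # strictly smaller than the preset value, per the text
            best, best_d = rid, d
    return best

last_positions = {"r1": (0.0, 0.0), "r2": (1.0, 1.0), "r3": (8.0, 8.0)}
helper = find_helper_robot(last_positions, "r1")
```

Note that the search uses the first robot's *last* positioning result: the robot's current position is unknown (that is the failure being recovered from), but its most recent known position bounds where a useful helper can be.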
Optionally, the navigation module 602 is specifically configured to:
if the first robot cannot be located based on the sensor data sent by the first robot, determining, based on the result of the most recent positioning of the plurality of robots, a second robot whose distance from the first robot is smaller than a preset distance threshold, where the first robot is any one of the plurality of robots;
acquiring the sensor data sent by the second robot from the timestamped sensor data sent by the plurality of robots;
aligning the sensor data sent by the first robot and the sensor data sent by the second robot according to the timestamps, and locating the first robot based on the aligned sensor data.
Optionally, the receiving module 601 is further configured to receive a repositioning request sent by a third robot, where the repositioning request carries timestamped sensor data collected by the third robot's sensors from its surroundings, and the third robot is any one of the plurality of robots;
the navigation module 602 is specifically configured to search the stored sensor data for data that is the same as or similar to the sensor data carried in the repositioning request, align the found sensor data based on the timestamps, and determine the position of the third robot based on the position corresponding to the aligned sensor data.
Optionally, the computing platform in the embodiment of the present application may be specifically, but not limited to, a cloud computing platform, an edge computing platform, or a designated robot.
The specific implementation of each module in the embodiment of the present application may refer to the above method embodiment, and is not described in detail herein.
An embodiment of the present invention further provides an electronic device, embodied as the computing platform in the foregoing embodiments. As shown in fig. 9, the device includes a processor 901, a communication interface 902, a memory 903, and a communication bus 904, where the processor 901, the communication interface 902, and the memory 903 communicate with one another through the communication bus 904:
a memory 903 for storing computer programs;
the processor 901 is configured to implement the method steps in the above-described method embodiments when executing the program stored in the memory 903.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The memory may include random access memory (RAM) or non-volatile memory (NVM), for example at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the processor.
The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
In a further embodiment of the present invention, a computer-readable storage medium is further provided, in which a computer program is stored, which when executed by a processor implements the steps of any of the above-mentioned navigation methods.
In a further embodiment provided by the present invention, there is also provided a computer program product containing instructions which, when run on a computer, cause the computer to perform any of the navigation methods of the above embodiments.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center via a wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) connection. The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a/an ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the method and apparatus embodiments, since they are substantially similar to the system embodiments, the description is relatively simple, and reference may be made to some descriptions of the system embodiments for relevant points.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.

Claims (10)

1. A navigation system, comprising: the system comprises a computing platform and a plurality of robots, wherein the plurality of robots are provided with sensors; wherein:
the plurality of robots are configured to send sensor data collected by their sensors to the computing platform;
and the computing platform is configured to receive the sensor data sent by the plurality of robots, create a map based on the sensor data sent by the plurality of robots, and locate and/or reposition any one of the plurality of robots.
2. The navigation system of claim 1, further comprising: the clock synchronization server is connected with the plurality of robots, and the plurality of robots share the time of the clock synchronization server;
the multiple robots are specifically used for sending sensor data which are acquired by sensors of the robots and carry timestamps to the computing platform;
the computing platform is specifically configured to receive sensor data with timestamps sent by the multiple robots, perform data alignment on the sensor data sent by the multiple robots according to the timestamps, create a map based on the sensor data after the data alignment, and perform positioning and/or repositioning on any one of the multiple robots.
3. The navigation system of claim 2,
the computing platform is further configured to: after creating a map based on the aligned sensor data, if timestamped sensor data is received again, determine the position corresponding to the newly received sensor data and obtain the historical sensor data corresponding to that same position; and, if the confidence of the newly received sensor data is greater than the confidence of the historical sensor data, align the newly received sensor data according to the timestamps, create a map based on the aligned newly received sensor data, and update the existing map of the same position based on the newly created map.
4. The navigation system of any one of claims 1-3, wherein the computing platform is a cloud computing platform, an edge computing platform, or a designated robot.
5. A navigation method, characterized in that the method is applied to a computing platform in a navigation system, the navigation system further comprising a plurality of robots, and the plurality of robots being provided with sensors; the method comprising:
receiving sensor data sent by the plurality of robots;
creating a map, locating and/or repositioning any of the plurality of robots based on sensor data sent by the plurality of robots.
6. The method of claim 5, wherein the navigation system further comprises a clock synchronization server connected to the plurality of robots, the plurality of robots sharing a time of the clock synchronization server;
wherein the receiving sensor data sent by the plurality of robots comprises:
receiving sensor data which are sent by the plurality of robots and carry timestamps;
the creating a map, locating and/or repositioning any robot of the plurality of robots based on sensor data sent by the plurality of robots includes:
and performing data alignment on the sensor data sent by the plurality of robots according to the timestamps, creating a map based on the sensor data after the data alignment, and positioning and/or repositioning any robot in the plurality of robots.
7. A navigation apparatus, characterized in that the apparatus is applied to a computing platform in a navigation system, the navigation system further comprising a plurality of robots, and the plurality of robots being provided with sensors; the apparatus comprising:
the receiving module is used for receiving the sensor data sent by the plurality of robots;
and the navigation module is used for creating a map based on the sensor data sent by the plurality of robots and positioning and/or relocating any one of the plurality of robots.
8. The apparatus of claim 7, wherein the navigation system further comprises a clock synchronization server connected to the plurality of robots, the plurality of robots sharing a time of the clock synchronization server;
the receiving module is specifically used for receiving sensor data which are sent by the plurality of robots and carry timestamps;
the navigation module is specifically configured to perform data alignment on sensor data sent by the plurality of robots according to the timestamps, create a map based on the sensor data after the data alignment, and perform positioning and/or repositioning on any one of the plurality of robots.
9. An electronic device, characterized by comprising a processor, a communication interface, a memory, and a communication bus, wherein the processor, the communication interface, and the memory communicate with one another through the communication bus;
a memory for storing a computer program;
a processor for implementing the method steps of any of claims 5 to 6 when executing a program stored in the memory.
10. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when being executed by a processor, carries out the method steps of any one of claims 5 to 6.
CN202010182901.2A 2020-03-16 2020-03-16 Navigation system, method, device, electronic equipment and medium Active CN111352425B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010182901.2A CN111352425B (en) 2020-03-16 2020-03-16 Navigation system, method, device, electronic equipment and medium

Publications (2)

Publication Number Publication Date
CN111352425A true CN111352425A (en) 2020-06-30
CN111352425B CN111352425B (en) 2024-02-09

Family

ID=71194596

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010182901.2A Active CN111352425B (en) 2020-03-16 2020-03-16 Navigation system, method, device, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN111352425B (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111890368A (en) * 2020-08-06 2020-11-06 深圳优地科技有限公司 Position calibration method, device and system based on robot and storage medium
CN111970351A (en) * 2020-08-11 2020-11-20 震坤行工业超市(上海)有限公司 Data alignment-based multi-dimensional sensing optimization method and system for Internet of things
CN112666942A (en) * 2020-12-15 2021-04-16 美智纵横科技有限责任公司 Self-moving robot and path planning method, device, equipment and storage medium thereof
CN113814997A (en) * 2021-10-18 2021-12-21 上海擎朗智能科技有限公司 Robot repositioning method and device, electronic equipment and storage medium
CN114413903A (en) * 2021-12-08 2022-04-29 上海擎朗智能科技有限公司 Positioning method for multiple robots, robot distribution system, and computer-readable storage medium

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040073337A1 (en) * 2002-09-06 2004-04-15 Royal Appliance Sentry robot system
CN107544515A (en) * 2017-10-10 2018-01-05 苏州中德睿博智能科技有限公司 Multirobot based on Cloud Server builds figure navigation system and builds figure air navigation aid
US20180147721A1 (en) * 2016-11-28 2018-05-31 Brain Corporation Systems and methods for remote operating and/or monitoring of a robot
CN108801254A (en) * 2017-05-02 2018-11-13 北京米文动力科技有限公司 A kind of method for relocating and robot
CN109074407A (en) * 2018-07-23 2018-12-21 深圳前海达闼云端智能科技有限公司 Multi-source data mapping method, related device and computer-readable storage medium
CN109084786A (en) * 2018-08-09 2018-12-25 北京智行者科技有限公司 A kind of processing method of map datum
CN109141393A (en) * 2018-07-02 2019-01-04 北京百度网讯科技有限公司 Method for relocating, equipment and storage medium
CN109725327A (en) * 2019-03-07 2019-05-07 山东大学 A kind of method and system of multimachine building map
CN109935077A (en) * 2017-12-15 2019-06-25 百度(美国)有限责任公司 System for constructing vehicle and cloud real-time traffic map for automatic driving vehicle
CN110553652A (en) * 2019-10-12 2019-12-10 上海高仙自动化科技发展有限公司 robot multi-sensor fusion positioning method and application thereof
CN110673614A (en) * 2019-10-25 2020-01-10 湖南工程学院 Mapping system and mapping method of small robot group based on cloud server
CN110686676A (en) * 2019-09-12 2020-01-14 深圳市银星智能科技股份有限公司 Robot repositioning method and device and robot

Also Published As

Publication number Publication date
CN111352425B (en) 2024-02-09

Similar Documents

Publication Publication Date Title
CN111352425B (en) Navigation system, method, device, electronic equipment and medium
CN108828527B (en) Multi-sensor data fusion method and device, vehicle-mounted equipment and storage medium
EP3505869B1 (en) Method, apparatus, and computer readable storage medium for updating electronic map
CN109901138B (en) Laser radar calibration method, device, equipment and storage medium
KR102548282B1 (en) High-precision mapping method and device
US9058538B1 (en) Bundle adjustment based on image capture intervals
CN107845114B (en) Map construction method and device and electronic equipment
WO2019126950A1 (en) Positioning method, cloud server, terminal, system, electronic device and computer program product
EP3621286B1 (en) Method, and apparatus for clock synchronization, device, storage medium and vehicle
US11281228B2 (en) Method and device for determining a position of a transportation vehicle
CN109814137B (en) Positioning method, positioning device and computing equipment
CN105229490A (en) Use the positional accuracy of satellite visibility data for promoting
CN111624550B (en) Vehicle positioning method, device, equipment and storage medium
CN110146086B (en) Method and device for generating indoor map
WO2022099482A1 (en) Exposure control method and apparatus, mobile platform, and computer-readable storage medium
CN112689234A (en) Indoor vehicle positioning method and device, computer equipment and storage medium
CN108776333B (en) Data secondary cascade fusion method and system, vehicle-mounted equipment and storage medium
CN112815962A (en) Calibration method and device for parameters of combined application sensor
CN112284405B (en) Method, apparatus, computing device and computer readable medium for navigation
CN114580537A (en) Point cloud data processing method and device
CN113192335A (en) Map sharing method and device, vehicle and cloud server
CN111381587B (en) Following method and device for following robot
CN113074751B (en) Visual positioning error detection method and device
CN113534156B (en) Vehicle positioning method, device and equipment based on vehicle millimeter wave radar
EP4310780A1 (en) Object positioning method, electronic apparatus, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant