CN115200568A - Navigation map adjusting method and device applied to robot and electronic equipment - Google Patents

Navigation map adjusting method and device applied to robot and electronic equipment

Info

Publication number
CN115200568A
CN115200568A
Authority
CN
China
Prior art keywords
navigation map
robot
obstacle
information
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210824756.2A
Other languages
Chinese (zh)
Inventor
齐心
赵博学
支涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Yunji Technology Co Ltd
Original Assignee
Beijing Yunji Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Yunji Technology Co Ltd filed Critical Beijing Yunji Technology Co Ltd
Priority to CN202210824756.2A priority Critical patent/CN115200568A/en
Publication of CN115200568A publication Critical patent/CN115200568A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3807 Creation or updating of map data characterised by the type of data
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 Creation or updating of map data
    • G01C21/3833 Creation or updating of map data characterised by the source of data

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The disclosure relates to the technical field of robot navigation map adjustment, and provides a navigation map adjustment method and device applied to a robot, and an electronic device. The method comprises the following steps: controlling the robot to download a navigation map; controlling the robot to travel according to the navigation map; controlling the robot to collect image data while travelling; and adjusting the navigation map based on the image data to obtain a new navigation map. In these embodiments, image data are collected while the robot carries out its task according to the downloaded navigation map, and the navigation map is adjusted based on those image data, completing the map update.

Description

Navigation map adjusting method and device applied to robot and electronic equipment
Technical Field
The present disclosure relates to the field of robot navigation map adjustment technology, and in particular to a navigation map adjustment method and apparatus applied to a robot, and an electronic device.
Background
In recent years, intelligent robots have gradually been applied in fields such as disinfection and delivery. An intelligent robot can reach a target location and complete a work task according to input instructions; the prerequisite for this capability is that the robot can travel to the target location on its own, so navigation has become an essential function of intelligent robots. However, as the environment changes, the travel route changes as well, and how to adjust the navigation map when the environment changes is the problem that must be solved first.
Disclosure of Invention
In view of this, the embodiments of the present disclosure provide a method and an apparatus for adjusting a navigation map applied to a robot, and an electronic device, so as to solve the problem in the prior art of how to adjust the navigation map when the environment changes.
In a first aspect of the embodiments of the present disclosure, a navigation map adjusting method applied to a robot is provided, including: controlling the robot to download a navigation map; controlling the robot to travel according to the navigation map; controlling the robot to collect image data while travelling; and adjusting the navigation map based on the image data to obtain a new navigation map.
In a second aspect of the embodiments of the present disclosure, a navigation map adjusting apparatus applied to a robot is provided, including: a first control unit configured to control the robot to download the navigation map; a second control unit configured to control the robot to travel according to the navigation map; an acquisition unit configured to control the robot to collect image data while travelling; and an adjusting unit configured to adjust the navigation map based on the image data to obtain a new navigation map.
In a third aspect of the disclosed embodiments, an electronic device is provided, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor implements the steps of the above method when executing the computer program.
In a fourth aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided, which stores a computer program, which when executed by a processor, implements the steps of the above-mentioned method.
Compared with the prior art, the embodiments of the present disclosure have the following beneficial effects: first, the robot is controlled to download a navigation map; then, the robot is controlled to travel according to the navigation map; image data are collected while the robot travels; and finally, the navigation map is adjusted based on the image data to obtain a new navigation map. The method can collect image data while the robot performs its task according to the downloaded navigation map, and then adjust the navigation map based on those image data to complete the map update.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings needed for describing the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present disclosure, and that other drawings can be obtained from them by those skilled in the art without inventive effort.
Fig. 1 is a schematic diagram of one application scenario of a navigation map adjustment method applied to a robot according to some embodiments of the present disclosure;
FIG. 2 is a flow diagram of some embodiments of a navigation map adjustment method applied to a robot according to the present disclosure;
FIG. 3 is a schematic structural diagram of some embodiments of a navigation map adjustment apparatus applied to a robot according to the present disclosure;
FIG. 4 is a schematic block diagram of an electronic device suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It should be noted that the modifiers "a", "an", and "the" in this disclosure are intended to be illustrative rather than limiting; those skilled in the art will understand them as meaning "one or more" unless the context clearly indicates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 is a schematic diagram of one application scenario of a navigation map adjustment method applied to a robot according to some embodiments of the present disclosure.
In the application scenario of fig. 1, first, the computing device 101 may control the robot to download the navigation map 102. Then, the computing device 101 may control the robot to travel according to the navigation map 102, as indicated by reference numeral 103. Thereafter, during the travel, the computing device 101 may control the robot to capture image data 104. Finally, based on the image data 104, the computing device 101 may adjust the navigation map 102 to obtain a new navigation map 105.
The computing device 101 may be hardware or software. When the computing device 101 is hardware, it may be implemented as a distributed cluster of multiple servers or terminal devices, or as a single server or a single terminal device (e.g., a robot or an intelligent robot). When the computing device 101 is embodied as software, it may be installed in the hardware devices listed above. It may be implemented, for example, as multiple pieces of software or software modules for providing distributed services, or as a single piece of software or software module. No particular limitation is imposed here.
It should be understood that the number of computing devices in FIG. 1 is merely illustrative. There may be any number of computing devices, as implementation needs dictate.
Fig. 2 is a flow diagram of some embodiments of a navigation map adjustment method applied to a robot according to the present disclosure. The navigation map adjustment method applied to the robot of fig. 2 may be performed by the computing device 101 of fig. 1. As shown in fig. 2, the navigation map adjusting method applied to the robot includes:
and step S201, controlling the robot to download the navigation map.
In some embodiments, an executing body of the navigation map adjusting method applied to the robot (such as the computing device 101 shown in fig. 1) may control the robot to connect to the robot map management platform and download the navigation map. Here, the navigation map may be generated according to the following steps:
First, the test robot is controlled to travel along a preset route while collecting environmental image data. As an example, the preset route may take the test robot's current position as both start point and end point, a target range as the travel area, and a robot-passable route within that area as the route to follow; the environmental image data may then be obtained by capturing images with a camera mounted on the test robot.
Second, by analyzing the environmental image data, the executing body may determine obstacle information for at least one obstacle to obtain an obstacle information set. Here, the obstacle information includes obstacle feature data, obstacle shape and volume information, and obstacle position information. As an example, the obstacle feature data may be obtained by extracting features of an obstacle from the image data, the shape and volume information may be point cloud data or contour features of the obstacle extracted from the image data, and the position information may be the position of the obstacle relative to the test robot. Specifically, the executing body may use an existing edge detection algorithm to extract the obstacle contour and thereby obtain its shape and volume information.
Third, the executing body may establish a rectangular spatial coordinate system with the position of the test robot as the origin, due north as the positive direction of the longitudinal axis, due east as the positive direction of the transverse axis, and the upward direction of the test robot relative to the ground as the positive direction of the vertical axis.
Fourth, based on the shape and volume information and position information in each item of the obstacle information set, the executing body may place the obstacles represented by that information into the rectangular spatial coordinate system and generate the navigation map, as sketched below.
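As a minimal sketch of this map-generation procedure, the navigation map could be represented as a set of obstacle entries placed in a robot-centred Cartesian frame. The class and field names below are illustrative assumptions, not structures defined by the patent.

```python
import numpy as np
from dataclasses import dataclass, field

@dataclass
class Obstacle:
    features: np.ndarray   # obstacle feature data (e.g. a descriptor array)
    size: tuple            # rough shape/volume information, e.g. (width, depth, height) in metres
    position: np.ndarray   # position in the map frame (x = due east, y = due north, z = up)

@dataclass
class NavigationMap:
    origin: np.ndarray                            # test robot position used as the coordinate origin
    obstacles: list = field(default_factory=list)

def build_navigation_map(robot_position, obstacle_infos):
    """Generate a navigation map from an obstacle information set.

    `obstacle_infos` is an iterable of (features, size, relative_position) tuples,
    where relative_position is the obstacle position in the robot frame, using the
    same axes as the text (transverse = due east, longitudinal = due north, vertical = up).
    """
    nav_map = NavigationMap(origin=np.asarray(robot_position, dtype=float))
    for features, size, rel_pos in obstacle_infos:
        # Convert the robot-relative position into map-frame coordinates.
        map_pos = nav_map.origin + np.asarray(rel_pos, dtype=float)
        nav_map.obstacles.append(Obstacle(features=features, size=size, position=map_pos))
    return nav_map
```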
Step S202: controlling the robot to travel according to the navigation map.
In some embodiments, the executing body may generate travel path information based on the navigation map, and then control the robot to travel according to that travel path information. Here, the travel path information represents a closed-loop route that takes the position of the robot as both start point and end point and passes through the passable area of the navigation map.
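The patent does not prescribe how the passing points are ordered; the sketch below assumes a simple greedy nearest-neighbour ordering over a given set of passable points, purely as an illustration of building such a closed-loop route.

```python
import numpy as np

def generate_closed_loop_path(robot_position, passable_points):
    """Order passable points into a closed-loop route that starts and ends
    at the robot's current position (greedy nearest-neighbour ordering)."""
    start = np.asarray(robot_position, dtype=float)
    remaining = [np.asarray(p, dtype=float) for p in passable_points]
    path = [start]
    current = start
    while remaining:
        # Visit the closest not-yet-visited passable point next.
        idx = int(np.argmin([np.linalg.norm(p - current) for p in remaining]))
        current = remaining.pop(idx)
        path.append(current)
    path.append(start)  # close the loop by returning to the start point
    return path
```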
Step S203: controlling the robot to collect image data while travelling.
In some embodiments, the executing body may control the robot to capture images of the area covered by the navigation map using a camera mounted on the robot, so as to obtain the image data.
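A minimal sketch of grabbing frames from an onboard camera during travel is shown below, assuming the camera is accessible through OpenCV; the camera index and frame count are illustrative.

```python
import cv2

def capture_frames(camera_index=0, num_frames=5):
    """Grab a few frames from the robot's onboard camera while it travels."""
    cap = cv2.VideoCapture(camera_index)
    frames = []
    try:
        for _ in range(num_frames):
            ok, frame = cap.read()
            if ok:
                frames.append(frame)
    finally:
        cap.release()  # always free the camera handle
    return frames
```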
Step S204: adjusting the navigation map based on the image data to obtain a new navigation map.
In some embodiments, based on the image data, the executing body may adjust the navigation map to obtain a new navigation map through the following steps:
In the first step, the executing body may perform obstacle feature extraction on the image data to obtain at least one obstacle feature extraction result. As an example, the executing body may extract features from the image data using the Speeded Up Robust Features (SURF) algorithm, a robust image recognition and description algorithm.
In the second step, for each of the at least one obstacle feature extraction result, the executing body may compare that result with the obstacle feature data of the corresponding obstacle in the navigation map. The comparison may be performed by using the position of the extracted obstacle relative to the robot to look up the obstacle with the same position information in the navigation map and retrieving its stored feature data.
In the third step, in response to determining that the comparison indicates a mismatch, the executing body may determine the shape and volume information and the position information corresponding to the obstacle feature extraction result.
In the fourth step, based on that shape and volume information and position information, the executing body may update the obstacle represented by the feature extraction result into the navigation map, completing the adjustment and obtaining a new navigation map. A minimal sketch of these four steps follows.
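The sketch below illustrates the extract-compare-update loop. Because SURF is only available in opencv-contrib builds, ORB is used here as a freely available stand-in; the dict-based map keyed by obstacle position and the match threshold are assumptions made for illustration, not details taken from the patent.

```python
import cv2
import numpy as np

orb = cv2.ORB_create()                                     # stand-in for SURF
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def extract_obstacle_descriptors(image):
    """Detect keypoints and compute binary descriptors for obstacles in the image."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY) if image.ndim == 3 else image
    _, descriptors = orb.detectAndCompute(gray, None)
    return descriptors

def obstacle_changed(new_descriptors, stored_descriptors, min_matches=10):
    """Return True when the newly observed obstacle no longer matches the
    feature data stored in the map for the same position."""
    if new_descriptors is None or stored_descriptors is None:
        return True
    matches = matcher.match(new_descriptors, stored_descriptors)
    return len(matches) < min_matches

def adjust_map(nav_map, image, observed_position, observed_size):
    """Steps 1-4: extract features, compare against the map entry at the same
    position, and update the map entry when the comparison fails."""
    new_desc = extract_obstacle_descriptors(image)
    key = tuple(observed_position)
    stored = nav_map.get(key)                              # nav_map: dict keyed by position (assumed)
    if stored is None or obstacle_changed(new_desc, stored["features"]):
        nav_map[key] = {
            "features": new_desc,
            "size": observed_size,
            "position": observed_position,
        }
    return nav_map
```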
In some optional implementations of some embodiments, the method further comprises: transmitting the new navigation map to a robot map management platform; and controlling the robot map management platform to replace the navigation map with the new navigation map. Here, one function of the robot map management platform is to provide a downloadable navigation map to any robot that establishes a connection with it.
In some optional implementations of some embodiments, the method further comprises: generating prompt information indicating that the adjustment and update of the navigation map are complete; and sending the prompt information to the robots with a historical download record in the robot map management platform. Generating and issuing the prompt information notifies robots that have previously downloaded the navigation map that it has been updated, so that they can download the new navigation map, achieving a unified adjustment of the navigation map.
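The patent does not define the platform's interface; as an illustration only, the interaction might resemble the HTTP sketch below, where the platform address and endpoint paths are entirely hypothetical.

```python
import requests

PLATFORM_URL = "http://robot-map-platform.example.com"  # hypothetical platform address

def push_new_map_and_notify(map_id, new_map_bytes):
    """Upload the new navigation map, replace the old one on the platform, and
    notify robots that have a historical download record for this map."""
    # Replace the stored navigation map with the new version.
    requests.put(f"{PLATFORM_URL}/maps/{map_id}", data=new_map_bytes, timeout=10)
    # Ask the platform to send the "adjustment and update completed" prompt
    # to every robot with a download record for this map.
    requests.post(
        f"{PLATFORM_URL}/maps/{map_id}/notify",
        json={"message": "navigation map adjustment and update completed"},
        timeout=10,
    )
```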
Compared with the prior art, the embodiments of the present disclosure have the following beneficial effects: first, the robot is controlled to download a navigation map; then, the robot is controlled to travel according to the navigation map; image data are collected while the robot travels; and finally, the navigation map is adjusted based on the image data to obtain a new navigation map. The method can collect image data while the robot performs its task according to the downloaded navigation map, determine from the image data whether an obstacle with the same position information in the navigation map has changed, and then adjust the navigation map accordingly, thereby updating the navigation map in real time. In addition, transmitting the new navigation map to the robot map management platform and replacing the original map ensures that robots download the latest map in the future, and generating prompt information and sending it to robots with a historical download record notifies them that the navigation map has been updated so that they can download the new one. This achieves a unified update of the navigation map, prevents robots from taking wrong routes and wasting time by following the old map, improves working efficiency, and indirectly improves user experience.
All the above optional technical solutions may be combined arbitrarily to form optional embodiments of the present application, and are not described in detail herein.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods. For details not disclosed in the embodiments of the apparatus of the present disclosure, refer to the embodiments of the method of the present disclosure.
Fig. 3 is a schematic structural diagram of some embodiments of a navigation map adjusting apparatus applied to a robot according to the present disclosure. As shown in fig. 3, the navigation map adjusting apparatus applied to the robot includes: a first control unit 301, a second control unit 302, a collecting unit 303 and an adjusting unit 304. Wherein the first control unit 301 is configured to control the robot to download the navigation map; a second control unit 302 configured to control the robot to travel according to the navigation map; an acquisition unit 303 configured to control the robot to acquire image data during the traveling; an adjusting unit 304 is configured to adjust the navigation map based on the image data to obtain a new navigation map.
In some optional implementations of some embodiments, the navigation map is generated according to the following steps: controlling the test robot to travel along a preset route and collecting environmental image data; analyzing the environmental image data and determining obstacle information of at least one obstacle to obtain an obstacle information set, wherein the obstacle information comprises obstacle feature data, obstacle shape and volume information, and obstacle position information; establishing a rectangular spatial coordinate system with the position of the test robot as the origin, due north as the positive direction of the longitudinal axis, due east as the positive direction of the transverse axis, and the upward direction of the test robot relative to the ground as the positive direction of the vertical axis; and placing the obstacles represented by the obstacle information into the rectangular spatial coordinate system based on the shape and volume information and position information in each item of the obstacle information set, thereby generating the navigation map.
In some optional implementations of some embodiments, the second control unit 302 of the navigation map adjusting apparatus applied to the robot is further configured to: generate, based on the navigation map, travel path information representing a closed-loop route that takes the position of the robot as both start point and end point and passes through the passable area of the navigation map; and control the robot to travel according to the travel path information.
In some optional implementations of some embodiments, the acquisition unit 303 of the navigation map adjusting apparatus applied to the robot is further configured to: control the robot, while travelling, to photograph the area covered by the navigation map to obtain the image data.
In some optional implementations of some embodiments, the adjusting unit 304 of the navigation map adjusting apparatus applied to the robot is further configured to: extract obstacle features from the image data to obtain at least one obstacle feature extraction result; for each of the at least one obstacle feature extraction result, compare that result with the obstacle feature data of the corresponding obstacle in the navigation map; in response to determining that the comparison indicates a mismatch, determine the shape and volume information and position information corresponding to the obstacle feature extraction result; and update the obstacle represented by the feature extraction result into the navigation map based on that shape and volume information and position information, completing the adjustment of the navigation map to obtain a new navigation map.
In some optional implementations of some embodiments, the navigation map adjusting device applied to the robot is further configured to: transmitting the new navigation map to a robot map management platform; and controlling the robot map management platform to replace the navigation map with the new navigation map.
In some optional implementations of some embodiments, the navigation map adjusting device applied to the robot is further configured to: generating prompt information for representing the completion of the adjustment and the update of the navigation map; and sending the prompt information to the robot with the history downloading record in the robot map management platform.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present disclosure.
Referring now to FIG. 4, a block diagram of an electronic device (e.g., computing device 101 of FIG. 1) 400 suitable for use in implementing some embodiments of the present disclosure is shown. The electronic device shown in fig. 4 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present disclosure.
As shown in fig. 4, electronic device 400 may include a processing device (e.g., central processing unit, graphics processor, etc.) 401 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 402 or a program loaded from a storage device 408 into a Random Access Memory (RAM) 403. In the RAM 403, various programs and data necessary for the operation of the electronic device 400 are also stored. The processing device 401, the ROM 402, and the RAM 403 are connected to each other through a bus 404. An input/output (I/O) interface 405 is also connected to bus 404.
Generally, the following devices may be connected to the I/O interface 405: input devices 406 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 407 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 408 including, for example, tape, hard disk, etc.; and a communication device 409. The communication means 409 may allow the electronic device 400 to communicate wirelessly or by wire with other devices to exchange data. While fig. 4 illustrates an electronic device 400 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 4 may represent one device or may represent multiple devices as desired.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the method illustrated by the flow chart. In some such embodiments, the computer program may be downloaded and installed from a network through the communication device 409, or from the storage device 408, or from the ROM 402. The computer program, when executed by the processing device 401, performs the above-described functions defined in the methods of some embodiments of the present disclosure.
It should be noted that the computer readable medium described above in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may interconnect with any form or medium of digital data communication (e.g., a communications network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the apparatus; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: controlling the robot to download a navigation map; controlling the robot to move according to the navigation map; controlling the robot to collect image data in the advancing process; and adjusting the navigation map based on the image data to obtain a new navigation map.
Computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by software, and may also be implemented by hardware. The described units may also be provided in a processor, which may be described as: a processor includes a first control unit, a second control unit, a collection unit, and an adjustment unit. Where the names of the units do not in some cases constitute a limitation of the unit itself, for example, the first control unit may also be described as a "unit controlling the robot to download the navigation map".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), Systems on a Chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
The foregoing description covers only preferred embodiments of the present disclosure and illustrates the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combinations of the above features; it also covers other technical solutions formed by arbitrarily combining the above features or their equivalents without departing from the inventive concept, for example technical solutions formed by replacing the above features with (but not limited to) technical features having similar functions disclosed in the embodiments of the present disclosure.

Claims (10)

1. A navigation map adjusting method applied to a robot is characterized by comprising the following steps:
controlling the robot to download a navigation map;
controlling the robot to travel according to the navigation map;
controlling the robot to collect image data during the traveling process;
and adjusting the navigation map based on the image data to obtain a new navigation map.
2. The method of claim 1, wherein the navigation map is generated according to the following steps:
controlling the test robot to travel along a preset route and collecting environmental image data;
analyzing the environmental image data and determining obstacle information of at least one obstacle to obtain an obstacle information set, wherein the obstacle information comprises obstacle feature data, obstacle shape and volume information, and obstacle position information;
establishing a rectangular spatial coordinate system with the position of the test robot as the origin, due north as the positive direction of the longitudinal coordinate axis, due east as the positive direction of the transverse coordinate axis, and the upward direction of the test robot relative to the ground as the positive direction of the vertical coordinate axis;
and placing the obstacles represented by the obstacle information into the rectangular spatial coordinate system based on the shape and volume information and position information in each item of the obstacle information set, thereby generating the navigation map.
3. The method of claim 1, wherein the controlling the robot to travel according to the navigation map comprises:
generating travel path information based on the navigation map, wherein the travel path information represents closed loop path information which takes the position information of the robot as a starting point and an end point and takes a passable area in the navigation map as a passing point;
and controlling the robot to travel according to the travel path information.
4. The method as claimed in claim 3, wherein the controlling the robot to collect the image data during the traveling comprises:
controlling the robot, while travelling, to photograph the area covered by the navigation map to obtain the image data.
5. The method of claim 2, wherein the adjusting the navigation map based on the image data to obtain a new navigation map comprises:
extracting obstacle features of the image data to obtain at least one obstacle feature extraction result;
for each obstacle feature extraction result in the at least one obstacle feature extraction result, comparing the obstacle feature extraction result with obstacle feature data of obstacles in the navigation map;
in response to determining that the comparison indicates a mismatch, determining the shape and volume information and position information corresponding to the obstacle feature extraction result;
and updating the obstacle represented by the obstacle feature extraction result into the navigation map based on the shape and volume information and the position information, completing the adjustment of the navigation map to obtain a new navigation map.
6. The method of adjusting a navigation map applied to a robot according to claim 1, further comprising:
transmitting the new navigation map to a robot map management platform;
and controlling the robot map management platform to replace the navigation map with the new navigation map.
7. The method of adjusting a navigation map applied to a robot according to claim 6, further comprising:
generating prompt information for representing the completion of the adjustment and the update of the navigation map;
and sending the prompt information to the robot with the history downloading record in the robot map management platform.
8. A navigation map adjusting device applied to a robot is characterized by comprising:
a first control unit configured to control the robot to download the navigation map;
a second control unit configured to control the robot to travel according to the navigation map;
an acquisition unit configured to control the robot to acquire image data during the travel;
and the adjusting unit is configured to adjust the navigation map based on the image data to obtain a new navigation map.
9. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN202210824756.2A 2022-07-13 2022-07-13 Navigation map adjusting method and device applied to robot and electronic equipment Pending CN115200568A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210824756.2A CN115200568A (en) 2022-07-13 2022-07-13 Navigation map adjusting method and device applied to robot and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210824756.2A CN115200568A (en) 2022-07-13 2022-07-13 Navigation map adjusting method and device applied to robot and electronic equipment

Publications (1)

Publication Number Publication Date
CN115200568A true CN115200568A (en) 2022-10-18

Family

ID=83579473

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210824756.2A Pending CN115200568A (en) 2022-07-13 2022-07-13 Navigation map adjusting method and device applied to robot and electronic equipment

Country Status (1)

Country Link
CN (1) CN115200568A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination