CN115371686A - Method and related device for instantly positioning robot - Google Patents

Method and related device for instantly positioning robot

Info

Publication number
CN115371686A
Authority
CN
China
Prior art keywords
robot
preset
environment
positioning
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211315053.3A
Other languages
Chinese (zh)
Other versions
CN115371686B (en)
Inventor
李卫
李强
徐刚
刘建强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SY Technology Engineering and Construction Co Ltd
Original Assignee
SY Technology Engineering and Construction Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SY Technology Engineering and Construction Co Ltd filed Critical SY Technology Engineering and Construction Co Ltd
Priority to CN202211315053.3A
Publication of CN115371686A
Application granted
Publication of CN115371686B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/38: Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804: Creation or updating of map data
    • G01C21/3833: Creation or updating of map data characterised by the source of data
    • G01C21/3837: Data obtained from a single source

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The application discloses a method and a related device for the instant positioning of a robot, addressing the low accuracy and reliability of instant positioning, map construction and path planning of inspection robots in the related art. The method comprises: sending a starting instruction to the robot, instructing it to acquire initial environment information and to perform positioning and mapping according to the initial environment feature information in that information; periodically acquiring, through an image acquisition device, a first environment image containing the robot; determining, based on the first environment image, a first position of the robot and a preset area formed by a preset boundary line; and judging whether the position of the robot is located outside the preset area formed by the preset boundary line and, if so, sending a route correction instruction to the robot, so that the robot stops moving and performs positioning and mapping again, thereby continuously correcting the planned route.

Description

Method and related device for instantly positioning robot
Technical Field
The application relates to the technical field of inspection robots, in particular to a method and a related device for real-time positioning of a robot.
Background
At present, robot inspection is more and more widely applied in the industrial field, and positioning and navigation are its key technologies. In terms of navigation mode, rail-type inspection robots, or schemes that pre-lay magnetic tracks and the like, give the robot highly reliable positioning and navigation, but the track has to be laid in advance, is difficult to change afterwards, and therefore imposes great limitations. In outdoor places where the positioning requirement is not high, GPS (Global Positioning System) navigation can be adopted, but GPS positioning accuracy is limited and easily affected by the surrounding electromagnetic environment, so navigation sometimes fails or loses its way. In recent years, SLAM (Simultaneous Localization and Mapping) has become a widely used technical means for solving the autonomous navigation problem of robots; SLAM is mainly realized based on the various sensors installed on the inspection robot.
In the related art, an inspection robot adopting the SLAM technology can exploit the advantages of robot inspection well. However, even though current SLAM technology adopts multi-sensor fusion, it still cannot economically and effectively achieve accurate and reliable instant positioning, map construction and path planning during inspection. The inspection robot in the related art therefore suffers from low accuracy and reliability of instant positioning, map construction and path planning.
Disclosure of Invention
The embodiment of the application provides a method and a related device for real-time positioning of a robot, which are used for solving the problems of low accuracy and reliability of real-time positioning, map construction and path planning of an inspection robot in the related technology.
In a first aspect, the present application provides a method for instantly positioning a robot, the method comprising:
sending a starting instruction to the robot, wherein the starting instruction is used for instructing the robot to acquire initial environment information and to perform positioning and mapping according to initial environment feature information in the initial environment information;
periodically acquiring a first environment image containing a robot, and determining a first position of the robot and a preset area formed by a preset boundary line based on the first environment image;
and if it is determined that the first position of the robot is located outside a preset area formed by a preset boundary line, sending a route correction instruction to the robot, wherein the route correction instruction is used for instructing the robot to stop moving and to perform positioning and mapping again according to first environment feature information in the environment information corresponding to the first position, the first environment feature information containing more content than the second environment feature information in the environment information corresponding to the first position before the robot stopped moving.
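The two instructions named in the first aspect can be pictured with the following minimal sketch; the class names, field names and the string encoding of the correction direction are assumptions made for illustration, since the application only specifies what each instruction is used to indicate.

```python
# Illustrative sketch only; the classes and fields are assumed, not specified by the application.
from dataclasses import dataclass
from typing import Optional


@dataclass
class StartInstruction:
    """Instructs the robot to acquire initial environment information and to
    perform positioning and mapping from the initial environment feature information."""
    route_id: str  # assumed identifier of the planned inspection route


@dataclass
class RouteCorrectionInstruction:
    """Instructs the robot to stop moving and to perform positioning and mapping
    again from the richer first environment feature information at its current position."""
    correction_direction: Optional[str] = None  # e.g. "left"/"right"; optional, see the route correction direction below
```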
In a possible implementation, after determining, based on the first environment image, the first position of the robot and the preset area formed by the preset boundary line, the method further includes:
if the first position of the robot is determined to be located in the preset area, periodically acquiring a second environment image containing the robot, and determining, based on the second environment image, a second position of the robot, a preset center position and a preset positioning line;
and if the second position of the robot is determined to be located within the preset positioning line and the distance between the second position of the robot and the preset center position exceeds a preset distance, sending a position correction instruction containing a position correction direction to the robot, wherein the position correction instruction is used for instructing the robot to stop moving and to perform positioning and mapping again according to third environment feature information in the environment information corresponding to the second position and the position correction direction, the third environment feature information containing more content than the fourth environment feature information in the environment information corresponding to the second position before the robot stopped moving.
In a possible embodiment, the determining a second position of the robot, a preset center position and a preset positioning line based on the second environment image includes:
identifying feature points of the robot in the second environment image, and determining a second position of the robot based on the identified feature points;
and determining a preset center according to the second vertical distance from the feature point to the horizontal ground and the position of the preset center in the second environment image, and determining a preset positioning line according to the second vertical distance from the feature point to the horizontal ground and the position of a preset positioning line in the second environment image, wherein the second position, the preset center and the preset positioning line of the robot are in the same plane.
In a possible embodiment, the determining a preset area composed of a first position of the robot and a preset boundary line based on the first environment image includes:
identifying feature points of the robot in the first environment image, and determining a first position of the robot based on the identified feature points;
and determining a preset area formed by preset boundary lines according to the first vertical distance from the feature point to the horizontal ground and the position of the preset boundary lines in the first environment image, wherein the first position of the robot and the preset area formed by the preset boundary lines are on the same horizontal plane.
In a possible embodiment, after determining that the first position of the robot is located outside the preset area formed by the preset boundary line and before sending a route correction instruction to the robot, the method further includes:
determining a route correction direction based on the positional relationship between the robot and the preset area formed by the preset boundary line;
and placing the route correction direction in the route correction instruction, so that the robot performs positioning and mapping again based on the first environment feature information in the environment information corresponding to the first position and on the route correction direction.
In one possible embodiment, the method further comprises:
if the number of times the route correction instruction has been sent to the robot exceeds a first preset number of times, or the time for which the first position of the robot remains outside the preset area formed by the preset boundary line exceeds a first preset duration, it is determined that the robot cannot move into the preset area, and a first fault signal is sent out according to a preset alarm mode.
In a possible implementation manner, after determining that the second position of the robot is located within the preset positioning line and deviates from the preset center position, the method further includes:
and if it is determined that the number of times the position correction instruction has been sent to the robot exceeds a second preset number of times, or that the second position of the robot is located within the preset positioning line and has deviated from the preset center position for longer than a second preset duration, determining that the robot cannot move to the preset center position, and sending a second fault signal according to a preset alarm mode.
In a second aspect, the present application provides a device for instantly positioning a robot, the device comprising:
a starting instruction sending module configured to send a starting instruction to the robot, where the starting instruction is used to instruct the robot to acquire initial environment information and perform positioning and mapping according to initial environment feature information in the initial environment information;
the robot positioning system comprises an environment image acquisition module, a positioning module and a positioning module, wherein the environment image acquisition module is configured to periodically acquire a first environment image containing a robot and determine a first position of the robot and a preset area formed by a preset boundary line based on the first environment image;
the correction instruction sending module is configured to send a route correction instruction to the robot if it is determined that the first position of the robot is located outside a preset area formed by a preset boundary line, wherein the route correction instruction is used for instructing the robot to stop moving, and carrying out positioning and drawing again according to first environment feature information in environment information corresponding to the first position, and the content of the first environment feature information is more than that of second environment feature information in the environment information corresponding to the first position before the robot stops moving.
In a possible implementation, after the first position of the robot and the preset area formed by the preset boundary line are determined based on the first environment image:
the environment image acquisition module is configured to periodically acquire a second environment image containing the robot if the first position of the robot is determined to be located in a preset area, and determine a second position, a preset center and a preset positioning line of the robot based on the second environment image;
a correction instruction sending module configured to send a position correction instruction including a position correction direction to the robot if it is determined that the second position of the robot is located in a preset positioning line and the distance between the second position of the robot and a preset center position exceeds a preset distance, where the position correction instruction is used to instruct the robot to stop moving and to perform positioning and mapping again according to third environment feature information in environment information corresponding to the second position and the position correction direction, and the content of the third environment feature information is more than fourth environment feature information in environment information corresponding to the second position before the robot stops moving.
In a possible implementation, when determining the second position of the robot, the preset center position and the preset positioning line based on the second environment image, the environment image acquisition module is configured to:
identifying feature points of the robot in the second environment image and determining a second position of the robot based on the identified feature points;
and determining a preset center according to the second vertical distance from the feature point to the horizontal ground and the position of the preset center in the second environment image, and determining a preset positioning line according to the second vertical distance from the feature point to the horizontal ground and the position of a preset positioning line in the second environment image, wherein the second position, the preset center and the preset positioning line of the robot are in the same plane.
In a possible embodiment, when determining, based on the first environment image, the first position of the robot and the preset area formed by the preset boundary line, the environment image acquisition module is configured to:
identifying feature points of the robot in the first environment image, and determining a first position of the robot based on the identified feature points;
and determining a preset area formed by preset boundary lines according to the first vertical distance from the feature point to the horizontal ground and the position of the preset boundary lines in the first environment image, wherein the first position of the robot and the preset area formed by the preset boundary lines are on the same horizontal plane.
In a possible implementation, after determining that the first position of the robot is located outside the preset area formed by the preset boundary line and before sending the route correction instruction to the robot, the correction instruction sending module is further configured to:
determine a route correction direction based on the positional relationship between the robot and the preset area formed by the preset boundary line;
and place the route correction direction in the route correction instruction, so that the robot performs positioning and mapping again based on the first environment feature information in the environment information corresponding to the first position and on the route correction direction.
In one possible embodiment, the apparatus further comprises:
a first fault signal sending module configured to determine that the robot cannot move into the preset area if the number of times the route correction instruction has been sent to the robot exceeds a first preset number of times, or the time for which the first position of the robot remains outside the preset area formed by the preset boundary line exceeds a first preset duration, and to send a first fault signal according to a preset alarm mode.
In a possible implementation manner, after determining that the second position of the robot is located within the preset positioning line and deviates from the preset center position, the apparatus further includes:
and a second fault signal sending module configured to determine that the robot cannot move to the preset center position if it is determined that the number of times the position correction instruction has been sent to the robot exceeds a second preset number of times, or that the second position of the robot is located within the preset positioning line and has deviated from the preset center position for longer than a second preset duration, and to send a second fault signal according to a preset alarm mode.
In a third aspect, the present application further provides an electronic device, including:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement any of the methods as provided in the first aspect of the application.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium storing instructions which, when executed by a processor of an electronic device, enable the electronic device to perform any one of the methods as provided in the first aspect of the present application.
In a fifth aspect, an embodiment of the present application provides a computer program product comprising a computer program that, when executed by a processor, performs any of the methods as provided in the first aspect of the present application.
The technical scheme provided by the embodiment of the application at least has the following beneficial effects:
according to the embodiment of the application, the starting instruction is sent to the robot, the robot is instructed to acquire initial environment information and is positioned and built according to initial environment characteristic information in the initial environment information, a first environment image containing the robot is acquired periodically through the image acquisition equipment, a first position of the robot and a preset area formed by preset boundary lines are determined based on the first environment image, whether the position of the robot is located outside the preset area formed by the preset boundary lines is judged, if the position of the robot is determined to be located outside the preset area formed by the preset boundary lines, a route correction instruction is sent to the robot, the robot stops moving, the positioning and the building are carried out again, and the planned route is corrected continuously. In summary, in the present application, according to an environment image including a robot, if it is determined that a position of the robot is located outside a preset area, a correction instruction is transmitted to the robot, so that the robot can use more and more detailed feature information for positioning and mapping, a planned route of the robot is more accurate, and the route correction is implemented. It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required in the embodiments are briefly described below. It is obvious that the drawings described below show only some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an application scenario of a method for instantly positioning a robot according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a method for instantly positioning a robot according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a predetermined area formed by a predetermined boundary line according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a first environment image including a robot according to an embodiment of the present disclosure;
FIG. 5 is a flowchart of step 202 provided by an embodiment of the present application;
fig. 6 is a schematic view of a positional relationship between a robot, a camera, and a preset boundary line according to an embodiment of the present disclosure;
FIG. 7 is a schematic diagram illustrating a flow chart of generating a route correction command according to an embodiment of the present application;
FIG. 8 is a schematic flow chart illustrating a process for correcting positioning errors according to an embodiment of the present disclosure;
FIG. 9 is a flowchart illustrating a step 801 provided in an embodiment of the present application;
FIG. 10 is a schematic diagram illustrating an image of a second position and a preset center position of a robot according to an exemplary embodiment;
FIG. 11 is a block diagram illustrating an apparatus for robotic instant positioning in accordance with an exemplary embodiment;
fig. 12 is a schematic structural diagram of an electronic device illustrating a method for instantly positioning a robot according to an exemplary embodiment.
Detailed Description
In order to make the technical solutions of the present application better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the accompanying drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. The implementations described in the following exemplary examples do not represent all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
Hereinafter, some terms in the embodiments of the present application are explained to facilitate understanding by those skilled in the art.
(1) In the embodiments of the present application, the term "plurality" means two or more, and other terms are similar thereto.
(2) "and/or" describes the association relationship of the associated object, indicating that there may be three relationships, for example, a and/or B, which may indicate: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
(3) The server serves the cameras, for example by providing resources to them and storing the data they collect; the server corresponds to the application program installed on the camera and operates in cooperation with that application program.
Robot inspection of electronic workshops such as semiconductor projects, liquid crystal panel projects and data centers can greatly improve inspection quality and reduce inspection cost, and an inspection robot adopting the SLAM technology can exploit these advantages of robot inspection particularly well. However, although current SLAM technology adopts multi-sensor fusion, it still cannot economically and effectively achieve accurate and reliable instant positioning, map construction and path planning during inspection, mainly for the following reasons. The environment in an electronic factory building lacks structural features (the same type of equipment is arranged periodically in a workshop, long corridors exist, there are large areas of white wall, and so on), which makes laser SLAM or visual SLAM prone to losing tracking. Combining an IMU with SLAM can compensate for the missing structural features, but only for a short time, because the random deviation of the IMU grows as time goes on, and without correction the accuracy of instant positioning eventually becomes unacceptable. Using multiple SLAM techniques simultaneously greatly improves the performance of the system, but does not solve the problem completely; a probability of SLAM failure remains. Because an electronics plant contains a large amount of critical equipment, as well as many lines and devices involving toxic and hazardous gases, the unpredictable consequences that a SLAM failure could cause are unacceptable.
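As a rough illustration of why the uncorrected inertial error grows (standard inertial-navigation kinematics rather than anything stated in this application), a constant accelerometer bias b_a, integrated twice, and a constant gyroscope bias b_ω, which mis-resolves gravity g, give position errors of roughly

```latex
% back-of-the-envelope drift under assumed constant biases b_a and b_omega
\delta p_{\mathrm{accel}}(t) \approx \tfrac{1}{2}\, b_a\, t^{2},
\qquad
\delta p_{\mathrm{gyro}}(t) \approx \tfrac{1}{6}\, g\, b_{\omega}\, t^{3}
```

so the drift grows polynomially with time, and an external correction such as the camera-based boundary check described below is needed for long inspection runs.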
In view of this, in order to improve accuracy and reliability of the robot instant positioning, the map construction, and the path planning, the present application provides a method and a related apparatus for the robot instant positioning.
The method for the instant positioning of the robot comprises: sending a starting instruction to the robot, instructing it to acquire initial environment information and to perform positioning and mapping according to the initial environment feature information in that information; periodically acquiring, through the image acquisition device, a first environment image containing the robot; determining, based on the first environment image, a first position of the robot and a preset area formed by a preset boundary line; and judging whether the position of the robot is located outside the preset area formed by the preset boundary line and, if so, sending a route correction instruction to the robot, so that the robot stops moving, performs positioning and mapping again, and the planned route is continuously corrected. In summary, in the present application, if the position of the robot is determined, from an environment image containing the robot, to be located outside the preset area, a correction instruction is transmitted to the robot, so that the robot can use more, and more detailed, feature information for positioning and mapping, the planned route of the robot becomes more accurate, and route correction is realized.
After introducing the design concept of the embodiment of the present application, some simple descriptions are provided below for application scenarios to which the technical solution of the embodiment of the present application can be applied, and it should be noted that the application scenarios described below are only used for describing the embodiment of the present application and are not limited. In specific implementation, the technical scheme provided by the embodiment of the application can be flexibly applied according to actual needs.
Fig. 1 is a schematic view of an application scenario of a method for instantly positioning a robot according to an embodiment of the present disclosure.
Fig. 1 includes: a network 10, a server 20, a memory 30, cameras, and a robot. The server 20 is connected to a plurality of cameras through the network. With the method provided by the embodiment of the application, the plurality of cameras can acquire images of the robot in the electronic factory building, so that the motion state and trajectory of the robot can be monitored.
This application describes only a single server and camera in detail, but those skilled in the art will understand that the cameras, robot, network 10, server 20 and memory 30 shown are intended to represent the electronic devices, robots, servers and memories involved in the solution of the present application. The detailed description of a single server and memory is merely for convenience of description and implies no limitation on the number, type or location of cameras and servers. It should be noted that the underlying concepts of the example embodiments of the present application are not altered if additional modules are added to or removed from the illustrated environment. In addition, although fig. 1 shows a bidirectional arrow from the memory 30 to the server 20 for convenience of explanation, it will be understood by those skilled in the art that this data transmission and reception also needs to be implemented through the network 10.
It should be noted that the storage in the embodiment of the present application may be, for example, a cache system, or a hard disk storage, a memory storage, and the like. Certainly, the method provided in the embodiment of the present application is not limited to the application scenario shown in fig. 1, and may also be used in other possible application scenarios, and the embodiment of the present application is not limited. The functions that can be implemented by each device in the application scenario shown in fig. 1 will be described in the following method embodiments, and will not be described in detail herein.
To further illustrate the technical solutions provided by the embodiments of the present application, the following detailed description is made with reference to the accompanying drawings and the detailed description. Although the embodiments of the present application provide method operation steps as shown in the following embodiments or figures, more or fewer operation steps may be included in the methods based on conventional or non-inventive efforts. In steps where no necessary causal relationship exists logically, the order of execution of the steps is not limited to that provided by the embodiments of the present application.
Referring to fig. 2, a flow chart of a method for instantly positioning a robot according to an embodiment of the present application is schematically shown, including the following steps:
in step 201, a starting instruction is sent to the robot, and the starting instruction is used for instructing the robot to acquire initial environment information and perform positioning and mapping according to initial environment feature information in the initial environment information.
In a possible embodiment, after the robot receives the start instruction, it starts to move, acquires environment information while moving, and performs simultaneous localization and mapping (SLAM) using the acquired environment information. A robot that has just received the start instruction therefore acquires initial environment information and performs positioning and mapping according to the initial environment feature information in that initial environment information.
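A hypothetical robot-side sketch of step 201 follows; the sensors and slam objects are placeholder interfaces assumed for illustration, not a specific library, and the method names are invented for this sketch.

```python
class PatrolRobot:
    """Hypothetical robot-side handling of the start instruction (step 201)."""

    def __init__(self, sensors, slam):
        self.sensors = sensors  # assumed wrapper around lidar/camera/IMU
        self.slam = slam        # any SLAM backend exposing initialize()/update()

    def on_start_instruction(self, instruction):
        # Acquire the initial environment information and extract its features,
        # then perform positioning and mapping from them before setting off.
        initial_scan = self.sensors.capture()
        initial_features = self.sensors.extract_features(initial_scan)
        self.slam.initialize(initial_features)
        self.start_moving()

    def start_moving(self):
        ...  # follow the planned inspection route, updating SLAM along the way
```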
In step 202, a first environment image including the robot is periodically acquired, and a first position of the robot and a preset area composed of preset boundary lines are determined based on the first environment image.
In a possible embodiment, in order to overcome the low accuracy and reliability of instant positioning, mapping and path planning while the robot is moving, the embodiment of the present application first arranges a plurality of visual imaging devices (such as the cameras in fig. 1) at certain locations in the inspection environment (such as the electronic factory building in fig. 1) and obtains predictable, determinate environment map information through these cameras. An existing or imaginary boundary line can therefore be set in the field environment and represented, by an imaging technology or an image processing technology, in the two-dimensional imaging of the environment by the cameras.
For example, a schematic diagram of a preset area formed by the preset boundary lines is shown in fig. 3. The two black oval solid lines in the figure are the preset boundary lines, the annular area between them is the preset area, and the robot performs mobile inspection within the preset area formed by these two boundary lines.
Secondly, a certain feature of the robot, in particular a feature point, can also be extracted or identified in the imaging of the vision device. A first environment image containing the robot is shown in fig. 4; the central point of the five-pointed star in the figure represents the feature point of the robot in the first environment image, and this feature point can be extracted from the image by image processing.
In a possible implementation, in step 202, the first position of the robot and the preset area formed by the preset boundary line are determined based on the first environment image, as shown in fig. 5, which includes the following:
in step 501, feature points of the robot in the first environment image are identified, and a first position of the robot is determined based on the identified feature points. As shown in fig. 4, the central point of the five-pointed star is the representation of the feature point of the robot in the first environment image, and according to the position of the feature point in the first environment image, the embodiment of the present application may determine the first position of the robot.
Furthermore, because of the perspective of imaging, if the true position of the boundary line in fig. 4 does not lie in the same plane as the true position of the feature point representing the robot, a feature point that is already judged, in a vertical top view, to lie within the preset area formed by the two preset boundary lines may still appear outside that area in an oblique view, and the camera usually obtains the first environment image at an oblique angle. To avoid this, the embodiment of the present application adopts the following step 502 so that the first position of the robot and the preset area formed by the preset boundary lines lie in the same horizontal plane:
in step 502, a preset area formed by preset boundary lines is determined according to a first vertical distance from the feature point to the horizontal ground and positions of the preset boundary lines in the first environment image, wherein the first position of the robot and the preset area formed by the preset boundary lines are on the same horizontal plane.
As shown in fig. 6, 1 is the robot, 2 is a spherical positioning body (from which the feature point of the robot is determined), 3 is a camera, and 4 and 5 are the positions of the preset boundary lines. The preset boundary lines 4 and 5 are coplanar and parallel to the ground, and the center of the spherical positioning body (i.e. the feature point of the robot) lies in the plane of 4 and 5. By adopting the positional relationship between the first position of the robot and the preset boundary lines shown in fig. 6, misjudgment of the relationship between the robot's feature point and the preset boundary line caused by the perspective of the image is avoided.
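Under the coplanarity constraint of fig. 6, the in/out decision of the next step reduces to a planar point-in-region test in image coordinates. The sketch below is one possible way to express it, assuming the two preset boundary lines have already been lifted to the plane of the feature point and are given as pixel polygons; the application itself prescribes no particular algorithm.

```python
def point_in_polygon(pt, polygon):
    """Ray-casting test: is point pt=(x, y) inside the polygon (list of (x, y) vertices)?"""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray through pt
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


def in_preset_area(feature_point_px, outer_boundary_px, inner_boundary_px):
    """The preset area of fig. 3 is the ring between the two preset boundary lines;
    both polygons are assumed to be drawn in the plane of the robot's feature point."""
    return (point_in_polygon(feature_point_px, outer_boundary_px)
            and not point_in_polygon(feature_point_px, inner_boundary_px))
```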
After the first environment image containing the robot has been periodically acquired, in step 203, if it is determined that the first position of the robot is located outside the preset area formed by the preset boundary line, a route correction instruction is sent to the robot. The route correction instruction is used for instructing the robot to stop moving and to perform positioning and mapping again according to the first environment feature information in the environment information corresponding to the first position; the first environment feature information contains more content than the second environment feature information in the environment information corresponding to the first position before the robot stopped moving.
In a possible implementation, the robot stops moving after receiving the route correction instruction, and the moment at which it receives the instruction and the moment at which it comes to a stop are treated as having no time interval between them, so the environment information of the first position during movement (i.e. at the moment the route correction instruction is received) is the same as the environment information of the first position after the robot has stopped. However, during inspection, factors such as moving speed, inspection efficiency and the computational efficiency of positioning and mapping mean that less feature information is used for positioning and mapping while the robot is moving. The route correction instruction instructs the robot to stop moving, and in the static state the robot can use more, and more detailed, feature information for positioning and mapping, so that the planned route becomes more accurate. Therefore, although the environment information before and after stopping is the same, the content of the first environment feature information is greater than that of the second environment feature information in the environment information corresponding to the first position before the robot stopped, and the robot performs positioning and mapping with more feature information, thereby correcting the route.
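A hypothetical robot-side counterpart of step 203, continuing the placeholder interfaces assumed above, is sketched below; the idea that the stationary robot uses "more content" is modelled here simply as a larger feature budget, and the max_features value and method names are assumptions.

```python
def handle_route_correction(robot, instruction):
    """Hypothetical handling of a route correction instruction on the robot."""
    robot.stop_moving()
    scan = robot.sensors.capture()
    # While moving, a small feature budget keeps SLAM fast; once stationary the
    # robot can afford a denser extraction, which stands in here for the richer
    # first environment feature information of the first position.
    dense_features = robot.sensors.extract_features(scan, max_features=2000)
    robot.slam.relocalize_and_remap(dense_features,
                                    hint_direction=instruction.correction_direction)
    robot.resume_route()
```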
It should be added that, after it is determined that the first position of the robot is located outside the preset area formed by the preset boundary line and before the route correction instruction is sent to the robot, the method further includes the following steps; the flow of generating the route correction instruction is shown in fig. 7:
in step 701, a route correction direction is determined based on a positional relationship between the robot and a preset area formed by a preset boundary line.
In step 702, the route correction direction is set in the route correction instruction, so that the robot performs the positioning and mapping again based on the first environmental characteristic information in the environmental information corresponding to the first position and the route correction direction.
For example, if the robot is located to the left of the preset area formed by the preset boundary line, the route correction direction indicates that the robot should move to the right; the route correction direction is placed in the route correction instruction, so that the robot can perform positioning and mapping again based on the first environment feature information in the environment information corresponding to the first position and on the route correction direction.
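One simple stand-in for steps 701-702 (an assumption for illustration; the application does not mandate how the direction is computed) is to point from the robot's first position towards the nearest point of the preset boundary line in the image plane and quantise that vector to a direction label:

```python
import math

def route_correction_direction(robot_px, boundary_px):
    """Derive a coarse correction direction from the robot's image position and
    the preset boundary line given as a list of (x, y) pixel vertices."""
    # The nearest boundary vertex is a coarse but adequate stand-in for the
    # nearest boundary point in this sketch.
    nearest = min(boundary_px, key=lambda v: math.dist(v, robot_px))
    dx, dy = nearest[0] - robot_px[0], nearest[1] - robot_px[1]
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

The resulting label would then be placed in the route correction instruction, e.g. RouteCorrectionInstruction(correction_direction="right") in the earlier sketch.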
In a possible implementation, in order to help the robot correct a positioning deviation (for example, the robot is located within the preset area but not at its center), after the first position of the robot and the preset area formed by the preset boundary line have been determined based on the first environment image, the method further includes the following steps; the flow of correcting the positioning deviation is shown in fig. 8:
in step 801, if it is determined that the first position of the robot is located in the preset area, a second environment image including the robot is periodically acquired, and a second position, a preset center, and a preset positioning line of the robot are determined based on the second environment image.
In one possible embodiment, in order to avoid misjudgment of the positions of the robot's feature point and the preset boundary line caused by the perspective of imaging, the second position of the robot, the preset center position and the preset positioning line are determined in step 801 based on the second environment image; a flowchart of this step is shown in fig. 9 and includes the following steps:
in step 901, feature points of the robot in the second environment image are identified, and a second position of the robot is determined based on the identified feature points.
In step 902, a preset center is determined according to a second vertical distance from the feature point to the horizontal ground and the position of the preset center in the second environment image, and a preset positioning line is determined according to the second vertical distance from the feature point to the horizontal ground and the position of the preset positioning line in the second environment image, wherein the second position of the robot, the preset center and the preset positioning line are on the same plane. That is, through these steps the second position of the robot, the preset center and the preset positioning line are placed in the same plane, which avoids misjudgment of the positions of the robot's feature point and the preset boundary line caused by the imaging perspective.
In step 802, if it is determined that the second position of the robot is located within the preset positioning line and the distance between the second position of the robot and the preset center position exceeds the preset distance, a position correction instruction containing a position correction direction is sent to the robot. The position correction instruction is used for instructing the robot to stop moving and to perform positioning and mapping again according to the third environment feature information in the environment information corresponding to the second position and the position correction direction; the third environment feature information contains more content than the fourth environment feature information in the environment information corresponding to the second position before the robot stopped moving.
For example, the second position and the preset center position of the robot are shown in fig. 10, where the unfilled five-pointed star is the second position of the robot, the filled five-pointed star is the preset center position, and the ellipse is the preset positioning line. After it is determined that the distance between the second position of the robot and the preset center position exceeds the preset distance, a position correction instruction containing a position correction direction is sent to the robot. After receiving it, the robot stops moving and performs positioning and mapping again according to the third environment feature information in the environment information corresponding to the second position and the position correction direction, until the robot can move to the preset center position, thereby correcting the positioning deviation.
It should be noted that the content of the third environment feature information is greater than that of the fourth environment feature information in the environment information corresponding to the second position before the movement is stopped, so the robot performs positioning and mapping with more feature information, and the positioning correction is realized.
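The camera-end check of step 802 can be sketched as follows, reusing the point_in_polygon helper from the earlier sketch and the same image-plane assumptions; the direction quantisation mirrors the route correction sketch and is, again, only an assumed illustration.

```python
import math

def position_correction_direction(second_position_px, preset_center_px,
                                  preset_distance_px, positioning_line_px):
    """Return a correction direction towards the preset center, or None if the
    robot is outside the preset positioning line or already close enough."""
    if not point_in_polygon(second_position_px, positioning_line_px):
        return None  # outside the positioning line: handled by the route correction logic
    if math.dist(second_position_px, preset_center_px) <= preset_distance_px:
        return None  # within the preset distance of the preset center, no correction needed
    dx = preset_center_px[0] - second_position_px[0]
    dy = preset_center_px[1] - second_position_px[1]
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```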
It should be added that the camera end transmits the route correction instruction and the position correction instruction by wireless communication.
In a possible implementation, if the number of times the route correction instruction has been sent to the robot exceeds a first preset number of times, or the time for which the first position of the robot remains outside the preset area formed by the preset boundary line exceeds a first preset duration, it is determined that the robot cannot move into the preset area, and a first fault signal is sent according to a preset alarm mode.
In another possible implementation, after it is determined that the second position of the robot is located within the preset positioning line and deviates from the preset center position, if the number of times the position correction instruction has been sent to the robot exceeds a second preset number of times, or the second position of the robot remains within the preset positioning line but deviated from the preset center position for longer than a second preset duration, it is determined that the robot cannot move to the preset center position, and a second fault signal is sent according to a preset alarm mode.
It should be noted that the preset alarm mode may be an audible and visual alarm, a short message, an e-mail, or the like; sending the first fault signal or the second fault signal notifies the relevant technicians to intervene manually and move the robot back within the preset boundary line or to the preset center position.
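A minimal sketch of this supervision logic at the camera end follows; the callables, the polling period and the return convention are placeholders assumed for illustration.

```python
import time

def supervise_corrections(robot_needs_correction, send_correction, raise_alarm,
                          max_attempts, max_duration_s, poll_period_s=1.0):
    """Keep issuing correction instructions; if the attempt count exceeds the
    preset number, or the robot stays uncorrected longer than the preset
    duration, declare that it cannot recover and send a fault signal."""
    attempts = 0
    started = time.monotonic()
    while robot_needs_correction():
        if attempts >= max_attempts or time.monotonic() - started > max_duration_s:
            raise_alarm()   # e.g. audible/visual alarm, short message or e-mail
            return False    # manual intervention required
        send_correction()
        attempts += 1
        time.sleep(poll_period_s)
    return True
```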
In summary, according to the embodiments of the application, if the position of the robot is determined, from an environment image containing the robot, to be located outside the preset area, a correction instruction is transmitted to the robot, so that the robot can use more, and more detailed, feature information for positioning and mapping, the planned route of the robot becomes more accurate, and route correction is realized.
The embodiment of the application also provides a device for the instant positioning of the robot based on the same inventive concept. Fig. 11 is a block diagram illustrating an apparatus for immediate positioning of a robot according to an exemplary embodiment, and referring to fig. 11, the apparatus 1100 includes:
a starting instruction sending module 1101 configured to send a starting instruction to the robot, where the starting instruction is used to instruct the robot to acquire initial environment information and perform positioning and mapping according to initial environment feature information in the initial environment information;
an environment image acquiring module 1102 configured to periodically acquire a first environment image including a robot, and determine a first position of the robot and a preset area composed of preset boundary lines based on the first environment image;
a correction instruction sending module 1103 configured to send a route correction instruction to the robot if it is determined that the first position of the robot is located outside the preset area formed by the preset boundary line, where the route correction instruction is used to instruct the robot to stop moving and to perform positioning and mapping again according to first environment feature information in the environment information corresponding to the first position, the first environment feature information containing more content than the second environment feature information in the environment information corresponding to the first position before the robot stopped moving.
In a possible implementation, after the first position of the robot and the preset area formed by the preset boundary line are determined based on the first environment image:
the environment image acquisition module is configured to periodically acquire a second environment image containing the robot if the first position of the robot is determined to be located in a preset area, and determine a second position, a preset center and a preset positioning line of the robot based on the second environment image;
and the correction instruction sending module is configured to send a position correction instruction containing a position correction direction to the robot if it is determined that the second position of the robot is located within the preset positioning line and the distance between the second position of the robot and the preset center position exceeds a preset distance, wherein the position correction instruction is used for instructing the robot to stop moving and to perform positioning and mapping again according to third environment feature information in the environment information corresponding to the second position and the position correction direction, the third environment feature information containing more content than the fourth environment feature information in the environment information corresponding to the second position before the robot stopped moving.
In a possible implementation, when determining the second position of the robot, the preset center position and the preset positioning line based on the second environment image, the environment image acquisition module is configured to:
identifying feature points of the robot in the second environment image and determining a second position of the robot based on the identified feature points;
and determining a preset center according to the second vertical distance from the feature point to the horizontal ground and the position of the preset center in the second environment image, and determining a preset positioning line according to the second vertical distance from the feature point to the horizontal ground and the position of a preset positioning line in the second environment image, wherein the second position, the preset center and the preset positioning line of the robot are in the same plane.
In a possible implementation, when determining, based on the first environment image, the first position of the robot and the preset area formed by the preset boundary line, the environment image acquisition module is configured to:
identifying feature points of the robot in the first environment image, and determining a first position of the robot based on the identified feature points;
and determining a preset area formed by preset boundary lines according to the first vertical distance from the feature point to the horizontal ground and the position of the preset boundary lines in the first environment image, wherein the first position of the robot and the preset area formed by the preset boundary lines are on the same horizontal plane.
In a possible implementation manner, after determining that the first position of the robot is located outside the preset area formed by the preset boundary line and before sending the route correction instruction to the robot, the correction instruction sending module is further configured to:
determine a route correction direction based on the positional relationship between the robot and the preset area formed by the preset boundary line;
and place the route correction direction in the route correction instruction, so that the robot performs positioning and mapping again based on the first environment feature information in the environment information corresponding to the first position and on the route correction direction.
In a possible embodiment, the apparatus further comprises:
the first fault signal sending module is configured to determine that the robot cannot move into a preset area if the number of times of sending the route correction instruction to the robot exceeds a first preset number of times or the time of the first position of the robot outside the preset area formed by the preset boundary lines exceeds a first preset duration, and send a first fault signal according to a preset alarm mode.
In a possible implementation manner, after determining that the second position of the robot is located within the preset positioning line and the second position of the robot is deviated from the preset central position, the apparatus further includes:
and the second fault signal sending module is configured to determine that the robot cannot move to the preset central position if the number of times of sending the position correction instruction to the robot is determined to exceed a second preset number of times, or the second position of the robot is located in a preset positioning line, and the time of the second position of the robot deviating from the preset central position exceeds a second preset time length, and send a second fault signal according to a preset alarm mode.
Having described the method and apparatus for immediate positioning of a robot according to an exemplary embodiment of the present application, an electronic device according to another exemplary embodiment of the present application is described next.
As will be appreciated by one skilled in the art, aspects of the present application may be embodied as a system, a method or a program product. Accordingly, various aspects of the present application may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit", a "module" or a "system".
In some possible implementations, an electronic device according to the present application may include at least one processor and at least one memory. The memory stores program code which, when executed by the processor, causes the processor to perform the method for the instant positioning of a robot according to the various exemplary embodiments of the present application described above in this specification. For example, the processor may perform the steps of the method for the instant positioning of a robot.
The electronic device 130 according to this embodiment of the present application is described below with reference to fig. 12. The electronic device 130 shown in fig. 12 is only an example, and should not bring any limitation to the functions and the use range of the embodiment of the present application.
As shown in fig. 12, the electronic device 130 is represented in the form of a general electronic device. The components of the electronic device 130 may include, but are not limited to: the at least one processor 131, the at least one memory 132, and a bus 133 that connects the various system components (including the memory 132 and the processor 131).
Bus 133 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a processor, or a local bus using any of a variety of bus architectures.
The memory 132 may include readable media in the form of volatile memory, such as Random Access Memory (RAM) 1321 and/or cache memory 1322, and may further include Read Only Memory (ROM) 1323.
Memory 132 may also include a program/utility 1325 having a set (at least one) of program modules 1324, such program modules 1324 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which or some combination thereof may comprise an implementation of a network environment.
The electronic device 130 may also communicate with one or more external devices 134 (e.g., keyboard, pointing device, etc.), with one or more devices that enable a user to interact with the electronic device 130, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 130 to communicate with one or more other electronic devices. Such communication may occur via input/output (I/O) interfaces 135. Also, the electronic device 130 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the internet) via the network adapter 136. As shown, the network adapter 136 communicates with the other modules of the electronic device 130 over the bus 133. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 130, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
In an exemplary embodiment, a computer-readable storage medium comprising instructions, such as the memory 132 comprising instructions, executable by the processor 131 of the electronic device 130 to perform the above-described method of robot instant positioning is also provided. Alternatively, the storage medium may be a non-transitory computer readable storage medium, which may be, for example, a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program product is also provided, comprising a computer program which, when executed by the processor 131, implements any of the methods of robot instant positioning as provided herein.
In an exemplary embodiment, the various aspects of a method for robot instant positioning provided by the present application may also be implemented in the form of a program product comprising program code for causing a computer device to perform the steps of the method for robot instant positioning according to various exemplary embodiments of the present application described above in this specification when the program product is run on the computer device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The program product for the method of robot instant positioning of the embodiments of the present application may employ a portable compact disc read-only memory (CD-ROM), include program code, and be executed on an electronic device. However, the program product of the present application is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++ and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the consumer electronic device, partly on the consumer electronic device, as a stand-alone software package, partly on the consumer electronic device and partly on a remote electronic device, or entirely on the remote electronic device or server. In the case of a remote electronic device, the remote electronic device may be connected to the consumer electronic device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external electronic device (for example, through the internet using an internet service provider).
It should be noted that although in the above detailed description several units or sub-units of the apparatus are mentioned, such a division is merely exemplary and not mandatory. Indeed, the features and functions of two or more units described above may be embodied in one unit, according to embodiments of the application. Conversely, the features and functions of one unit described above may be further divided into embodiments by a plurality of units.
Further, while the operations of the methods of the present application are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in this particular order, or that all of the illustrated operations must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process, such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.
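For readability, the periodic monitoring flow described above is condensed into the following sketch. The convex point-in-area test, the Observation container, and the send_instruction callback are hypothetical placeholders introduced only for this example; they are not components disclosed by this application.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    first_position: tuple   # (x, y) of the robot in the image plane
    boundary_polygon: list  # vertices of the preset boundary line, assumed convex

def point_in_convex_area(p, polygon):
    # Convex point-in-polygon test: the point is inside if all edge cross
    # products share a sign (points on the boundary count as inside).
    sign = 0
    n = len(polygon)
    for i in range(n):
        (x1, y1), (x2, y2) = polygon[i], polygon[(i + 1) % n]
        cross = (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1)
        if cross != 0:
            if sign == 0:
                sign = 1 if cross > 0 else -1
            elif (cross > 0) != (sign > 0):
                return False
    return True

def monitoring_cycle(observation, send_instruction):
    # One supervision period: if the robot's first position lies outside the
    # preset area, issue a route correction instruction over a placeholder channel.
    if not point_in_convex_area(observation.first_position, observation.boundary_polygon):
        send_instruction({"type": "ROUTE_CORRECTION", "stop_moving": True})
        return "route_correction_sent"
    return "inside_area"

obs = Observation(first_position=(12.0, 3.0),
                  boundary_polygon=[(0.0, 0.0), (10.0, 0.0), (10.0, 6.0), (0.0, 6.0)])
print(monitoring_cycle(obs, send_instruction=print))  # -> route_correction_sent
```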

Claims (10)

1. A method of instant positioning of a robot, the method comprising:
sending a starting instruction to the robot, wherein the starting instruction is used for instructing the robot to acquire initial environment information and to perform positioning and mapping according to initial environment characteristic information in the initial environment information;
periodically acquiring a first environment image containing the robot, and determining a first position of the robot and a preset area formed by a preset boundary line based on the first environment image;
and if it is determined that the first position of the robot is located outside the preset area formed by the preset boundary line, sending a route correction instruction to the robot, wherein the route correction instruction is used for instructing the robot to stop moving and to perform positioning and mapping again according to first environment characteristic information in the environment information corresponding to the first position, and the first environment characteristic information contains more content than second environment characteristic information in the environment information corresponding to the first position acquired before the robot stops moving.
2. The method of claim 1, wherein after determining the first position of the robot and the preset area formed by the preset boundary line based on the first environment image, the method further comprises:
if it is determined that the first position of the robot is located within the preset area, periodically acquiring a second environment image containing the robot, and determining a second position of the robot, a preset center position and a preset positioning line based on the second environment image;
and if it is determined that the second position of the robot is located within the preset positioning line and the distance between the second position of the robot and the preset center position exceeds a preset distance, sending a position correction instruction containing a position correction direction to the robot, wherein the position correction instruction is used for instructing the robot to stop moving and to perform positioning and mapping again according to the position correction direction and third environment characteristic information in the environment information corresponding to the second position, and the third environment characteristic information contains more content than fourth environment characteristic information in the environment information corresponding to the second position acquired before the robot stops moving.
3. The method of claim 2, wherein determining the second position of the robot, the preset center position and the preset positioning line based on the second environment image comprises:
identifying feature points of the robot in the second environment image, and determining the second position of the robot based on the identified feature points;
and determining the preset center position according to a second vertical distance from the feature points to the horizontal ground and the position of the preset center position in the second environment image, and determining the preset positioning line according to the second vertical distance from the feature points to the horizontal ground and the position of the preset positioning line in the second environment image, wherein the second position of the robot, the preset center position and the preset positioning line are in the same plane.
4. The method of claim 1, wherein determining the first position of the robot and the preset area formed by the preset boundary line based on the first environment image comprises:
identifying feature points of the robot in the first environment image, and determining the first position of the robot based on the identified feature points;
and determining the preset area formed by the preset boundary line according to a first vertical distance from the feature points to the horizontal ground and the position of the preset boundary line in the first environment image, wherein the first position of the robot and the preset area formed by the preset boundary line are on the same horizontal plane.
5. The method of claim 1, wherein after determining that the first position of the robot is located outside the preset area formed by the preset boundary line and before sending the route correction instruction to the robot, the method further comprises:
determining a route correction direction based on the positional relation between the robot and the preset area formed by the preset boundary line;
and placing the route correction direction in the route correction instruction, so that the robot performs positioning and mapping again based on the route correction direction and the first environment characteristic information in the environment information corresponding to the first position.
6. The method of claim 1, further comprising:
if the number of times the route correction instruction has been sent to the robot exceeds a first preset number, or the time for which the first position of the robot has been located outside the preset area formed by the preset boundary line exceeds a first preset duration, determining that the robot cannot move into the preset area, and sending a first fault signal in a preset alarm mode.
7. The method of claim 2, wherein after it is determined that the second position of the robot is located within the preset positioning line and the second position of the robot deviates from the preset center position, the method further comprises:
and if it is determined that the number of times the position correction instruction has been sent to the robot exceeds a second preset number, or that the second position of the robot has remained within the preset positioning line but deviated from the preset center position for longer than a second preset duration, determining that the robot cannot move to the preset center position, and sending a second fault signal in a preset alarm mode.
8. An apparatus for real-time positioning of a robot, the apparatus comprising:
a starting instruction sending module configured to send a starting instruction to the robot, wherein the starting instruction is used for instructing the robot to acquire initial environment information and to perform positioning and mapping according to initial environment characteristic information in the initial environment information;
the robot positioning system comprises an environment image acquisition module, a positioning module and a control module, wherein the environment image acquisition module is configured to periodically acquire a first environment image containing the robot and determine a first position of the robot and a preset area formed by a preset boundary line based on the first environment image;
and a correction instruction sending module configured to send a route correction instruction to the robot if it is determined that the first position of the robot is located outside the preset area formed by the preset boundary line, wherein the route correction instruction is used for instructing the robot to stop moving and to perform positioning and mapping again according to first environment characteristic information in the environment information corresponding to the first position, and the first environment characteristic information contains more content than second environment characteristic information in the environment information corresponding to the first position acquired before the robot stops moving.
9. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the method of robot instant positioning as claimed in any of claims 1-7.
10. A computer-readable storage medium, wherein instructions in the computer-readable storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the method of robot instant positioning of any one of claims 1-7.
CN202211315053.3A 2022-10-26 2022-10-26 Method and related device for real-time positioning of robot Active CN115371686B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211315053.3A CN115371686B (en) 2022-10-26 2022-10-26 Method and related device for real-time positioning of robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211315053.3A CN115371686B (en) 2022-10-26 2022-10-26 Method and related device for real-time positioning of robot

Publications (2)

Publication Number Publication Date
CN115371686A true CN115371686A (en) 2022-11-22
CN115371686B CN115371686B (en) 2023-01-31

Family

ID=84072699

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211315053.3A Active CN115371686B (en) 2022-10-26 2022-10-26 Method and related device for real-time positioning of robot

Country Status (1)

Country Link
CN (1) CN115371686B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103279949A (en) * 2013-05-09 2013-09-04 浙江大学 Operation method of self-positioning robot-based multi-camera parameter automatic calibration system
CN107671831A (en) * 2017-08-02 2018-02-09 国网浙江省电力公司紧水滩水力发电厂 A kind of power station subregion intelligent inspection system and method
CN110618436A (en) * 2019-04-04 2019-12-27 中国石油大学(北京) Inspection method, device and equipment based on instant positioning and map construction
US20200215694A1 (en) * 2019-01-03 2020-07-09 Ecovacs Robotics Co., Ltd. Dynamic region division and region passage identification methods and cleaning robot
CN111399502A (en) * 2020-03-09 2020-07-10 惠州拓邦电气技术有限公司 Mobile robot and drawing establishing method and device thereof
CN113888584A (en) * 2021-08-04 2022-01-04 北京化工大学 Robot wheelchair tracking system based on omnibearing vision and control method
CN114489058A (en) * 2022-01-13 2022-05-13 深圳市优必选科技股份有限公司 Sweeping robot, path planning method and device thereof and storage medium

Also Published As

Publication number Publication date
CN115371686B (en) 2023-01-31

Similar Documents

Publication Publication Date Title
US10406687B2 (en) Layered multi-agent coordination
CN110146098B (en) Robot map extension method and device, control equipment and storage medium
CN109986561B (en) Robot remote control method, device and storage medium
CN108235736B (en) Positioning method, cloud server, terminal, system, electronic device and computer program product
CN111988524A (en) Unmanned aerial vehicle and camera collaborative obstacle avoidance method, server and storage medium
EP3559771A1 (en) Multi-agent coordination under sparse networking
CN113031591B (en) Exception handling method and device for material pushing robot, server and storage medium
US20160109883A1 (en) Method and apparatus for offboard navigation of a robotic device
CN104144326A (en) Robot monitoring system with image recognition and automatic patrol route setting function
CN111352425A (en) Navigation system, method, device, electronic equipment and medium
CN115371686B (en) Method and related device for real-time positioning of robot
CN113094132A (en) Remote checking robot history backtracking method, device, terminal and storage medium
CN116974291A (en) Control error determining method and device for master-slave cooperative navigation agricultural machinery
CN103714439A (en) Train overhaul process monitoring system and monitoring method
CN116626700A (en) Robot positioning method and device, electronic equipment and storage medium
CN113762140A (en) Robot-based mapping method, electronic device and storage medium
CN113741529A (en) Remote guidance method and remote guidance device for spacecraft and intersection part
CN220271771U (en) Track inspection robot based on map
CN112183524A (en) Robot wired network docking method, system, terminal device and storage medium
CN112579423A (en) Equipment monitoring method and device
CN110595480A (en) Navigation method, device, equipment and storage medium
CN113776516B (en) Method and device for adding barriers, electronic equipment and storage medium
CN110261098A (en) A kind of engineering machinery pitching reversal valve dynamic failure detection system and method
CN110726564A (en) System and method for simulating automatic driving of vehicle
US20230359219A1 (en) Method and system for environment maintenance

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant