CN110873875B - Robot positioning method, robot and post-working device for controlling robot - Google Patents

Robot positioning method, robot, and device for controlling a robot to work on duty

Info

Publication number
CN110873875B
CN110873875B (application CN201911172328.0A)
Authority
CN
China
Prior art keywords
robot
charging pile
positioning result
positioning
scene information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911172328.0A
Other languages
Chinese (zh)
Other versions
CN110873875A (en)
Inventor
胡佳文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Everest Shenzhen Technology Co ltd
Original Assignee
Everest Shenzhen Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Everest Shenzhen Technology Co ltd filed Critical Everest Shenzhen Technology Co ltd
Priority to CN201911172328.0A priority Critical patent/CN110873875B/en
Publication of CN110873875A publication Critical patent/CN110873875A/en
Application granted granted Critical
Publication of CN110873875B publication Critical patent/CN110873875B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06Systems determining position data of a target
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T90/00Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation
    • Y02T90/10Technologies relating to charging of electric vehicles
    • Y02T90/16Information or communication technologies improving the operation of electric vehicles

Abstract

The application relates to a robot positioning method and device, and a method and device for controlling a robot to work on duty. The method comprises: acquiring scene information within a preset range of the robot, receiving a radar radiation signal, and performing fusion positioning on the robot according to the scene information and the radar radiation signal to determine the position of the robot. The position of the robot can thus be determined more accurately, avoiding positioning errors or failures.

Description

Robot positioning method, robot, and device for controlling a robot to work on duty
Technical Field
The present disclosure relates to the field of robots, and in particular, to a robot positioning method, a robot, and a device for controlling the robot to work on duty.
Background
In the field of service robots, autonomous navigation gives a robot the ability to move on its own, so that it can show users the way and guide them along routes much like on-site staff. When its battery is low, the robot must autonomously find a charging pile and charge; once charging is complete, it automatically returns to its work post.
In practical applications, multiple robots often need to work on a site at the same time, so the site requires multiple charging piles, each at a different position. Having multiple robots charge at multiple charging piles poses a great challenge to the positioning function.
The prior art therefore faces the technical problem that positioning is difficult when multiple robots charge at multiple charging piles.
Disclosure of Invention
In view of the above, it is necessary to provide a robot positioning method and apparatus, and a method and apparatus for controlling a robot to work on duty, that can achieve accurate positioning.
In a first aspect:
a robot positioning method, the method comprising:
after the robot is started, scene information within a preset range from the robot is acquired;
receiving a radar radiation signal;
and performing fusion positioning on the robot according to the scene information and the radar radiation signal to determine the position of the robot.
In one embodiment, the performing fusion positioning on the robot according to the scene information and the radar radiation signal to determine the position of the robot includes:
carrying out coarse positioning on the robot according to the scene information to obtain a first positioning result;
radar positioning is carried out on the robot according to the radar radiation signals, and a second positioning result is obtained;
and determining the position of the robot according to the first positioning result and the second positioning result.
In one embodiment, after the robot is powered on, acquiring scene information within a preset range from the robot includes:
after the robot is started, judging whether the robot is on a charging pile or not;
if the robot is on the charging pile, leaving the pile and acquiring scene information within a preset range from the robot;
and if the robot is not on the charging pile, acquiring scene information within a preset range from the robot.
In one embodiment, the determining the position of the robot further comprises:
judging whether the robot has a locking relation with a charging pile or not;
and if the locking relation exists, sending a state updating instruction to the server, so that the server updates the state of the charging pile locked with the robot into an idle state according to the state updating instruction.
In a second aspect:
a method of controlling a robot to work on duty, the method comprising:
after the robot is started, acquiring scene information within a preset range from the robot;
receiving a radar radiation signal;
according to the scene information and the radar radiation signals, performing fusion positioning on the robot, and determining the position of the robot;
and controlling the robot to move to a specified position for operation according to the position of the robot and a preset operation position.
In a third aspect:
a robot positioning device, characterized in that the positioning device comprises:
the acquisition module is used for acquiring scene information within a preset range from the robot after the robot is started;
the receiving module is used for receiving radar radiation signals;
and the fusion positioning module is used for performing fusion positioning on the robot according to the scene information and the radar radiation signal and determining the position of the robot.
In a fourth aspect:
a robot, characterized in that the robot comprises a robot positioning device according to the third aspect.
In a fifth aspect:
a device for controlling a robot to work on duty, the device comprising:
the acquisition module is used for acquiring scene information within a preset range from the robot after the robot is started;
the receiving module is used for receiving radar radiation signals;
the fusion positioning module is used for performing fusion positioning on the robot according to the scene information and the radar radiation signal and determining the position of the robot;
and the operation module is used for controlling the robot to move to a specified position for operation according to the position of the robot and a preset operation position.
A sixth aspect:
a robot comprising a device for controlling a robot to work on duty as described in the fifth aspect.
Seventh aspect:
a computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
after the robot is started, scene information within a preset range from the robot is acquired;
receiving a radar radiation signal;
and performing fusion positioning on the robot according to the scene information and the radar radiation signal to determine the position of the robot.
An eighth aspect:
a computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
after the robot is started, scene information within a preset range from the robot is acquired;
receiving a radar radiation signal;
according to the scene information and the radar radiation signals, performing fusion positioning on the robot, and determining the position of the robot;
and controlling the robot to move to a specified position for operation according to the position of the robot and a preset operation position.
According to the above robot positioning method and device and the method and device for controlling a robot to work on duty, after the robot is started, scene information within a preset range from the robot is acquired, a radar radiation signal is received, and fusion positioning is performed on the robot according to the scene information and the radar radiation signal to determine the position of the robot. Because the robot is positioned by fusing two modes, the scene information within the preset range and the radar radiation signal, its position can be determined more accurately, avoiding positioning errors or failures.
Drawings
FIG. 1 is a schematic flow chart diagram of a method for positioning a robot in one embodiment;
FIG. 2 is a schematic flow chart illustrating the step of refining step S13 in one embodiment;
FIG. 3 is a schematic flow chart illustrating a method for controlling a robot to perform work on duty in one embodiment;
FIG. 4 is a block diagram of a robotic positioning device in accordance with an embodiment;
FIG. 5 is a block diagram of an embodiment of a device for controlling a robot to work on duty.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In one embodiment, as shown in fig. 1, there is provided a robot positioning method, the execution subject of the method being a robot positioning device, the method comprising the steps of:
step S11, after the robot is started, scene information within a preset range from the robot is acquired;
in the embodiment of the invention, the robot detects the starting instruction to start, and after the robot is started, the scene information within a preset range from the robot is acquired, wherein the scene information comprises the two-dimensional code containing the ID and/or the ORB data of the charging pile.
Step S12, receiving radar radiation signals;
in the embodiment of the present invention, the content of receiving the radar radiation signal is consistent with the prior art, and will not be described herein again.
And step S13, performing fusion positioning on the robot according to the scene information and the radar radiation signal, and determining the position of the robot.
In this embodiment of the invention, fusion positioning means positioning in different modes according to different information: scene positioning is performed according to the scene information, and radar positioning is performed according to the radar radiation signal.
According to the above robot positioning method, after the robot is started, scene information within a preset range from the robot is acquired, a radar radiation signal is received, and fusion positioning is performed on the robot according to the scene information and the radar radiation signal to determine the position of the robot. Because the robot is positioned by fusing the two modes, its position can be determined more accurately, avoiding positioning errors or failures.
In an embodiment, as shown in fig. 2, a schematic flow chart of the refining step of step S13 specifically includes:
step S131, roughly positioning the robot according to the scene information to obtain a first positioning result;
step S132, radar positioning is carried out on the robot according to the radar radiation signals, and a second positioning result is obtained;
in the embodiments of the present invention, radar positioning belongs to the prior art, and is not described herein.
And S133, determining the position of the robot according to the first positioning result and the second positioning result.
In the prior art, robot positioning is usually done by radar alone, but radar positioning by itself is prone to error. Therefore, coarse positioning is first performed from the scene information to obtain a first positioning result, and radar positioning is then performed to obtain a fine second positioning result. The two results are then compared: if the first positioning result contains the second positioning result, the second positioning result is accurate and is determined to be the position of the robot; if the first positioning result does not contain the second positioning result, the second positioning result is inaccurate, and the robot must be relocated before its position is determined.
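The containment check described above can be sketched as follows. This is a minimal illustration under the assumption that the coarse result is an axis-aligned region and the radar fix is a point (the patent does not specify the geometry); all names are hypothetical:

```python
# Sketch of the fusion check: accept the fine radar fix only if it lies
# inside the coarse scene-based region; otherwise relocalization is needed.
from dataclasses import dataclass

@dataclass
class Region:
    """Coarse positioning result: an area the robot may occupy (assumed shape)."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def fuse(coarse: Region, radar_fix: tuple):
    """Return the radar fix if consistent with the coarse result, else None."""
    x, y = radar_fix
    if coarse.contains(x, y):
        return radar_fix   # second result is accurate -> robot position
    return None            # inconsistent -> robot must be relocated
```

For example, with a 5 m by 5 m coarse region anchored at the origin, a radar fix of (2, 3) would be accepted, while (9, 9) would trigger relocalization.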
In an embodiment, after the robot is powered on in step S11, the acquiring scene information within a preset range from the robot specifically includes:
after the robot is started, judging whether the robot is on a charging pile; if the robot is on the charging pile, leaving the pile and acquiring scene information within a preset range from the robot; and if the robot is not on the charging pile, directly acquiring scene information within a preset range from the robot.
In this embodiment of the invention, after the robot is started, it judges whether it is on a charging pile. If it is, it leaves the pile and then acquires scene information within a preset range; if not, it acquires the scene information directly. Coarse positioning is performed from the scene information to obtain a first positioning result, radar positioning is then performed to obtain a fine second positioning result, and the position of the robot is determined from the two results. In the prior art, the robot records its position when powered off and directly reuses that stored position when powered on, but this may be inaccurate: the robot is powered off while charging and may be pushed by hand while off, so the position stored before power-off may have changed without the robot knowing. The robot therefore needs to be repositioned by the above method after power-on, so that its position is known more accurately.
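The startup sequence above can be sketched in Python with a stub robot. All class and method names here are hypothetical stand-ins for behavior the patent leaves unspecified, including the preset scan range:

```python
class Robot:
    """Minimal stand-in for the patent's robot; all names are hypothetical."""
    def __init__(self, docked: bool):
        self.docked = docked
        self.log = []

    def is_on_charging_pile(self) -> bool:
        return self.docked

    def undock(self):
        self.docked = False
        self.log.append("undock")

    def scan_scene(self, radius_m: float) -> dict:
        self.log.append("scan")
        return {"qr": "ID-A"}          # e.g. a decoded two-dimensional code

    def receive_radar_signal(self):
        self.log.append("radar")
        return object()                # placeholder radar signal

    def fuse_localize(self, scene, radar) -> tuple:
        return (1.0, 2.0)              # placeholder pose

def startup_localize(robot: Robot) -> tuple:
    # Relocalize from scratch: the pose saved before power-off may be stale,
    # e.g. if the robot was pushed while powered down.
    if robot.is_on_charging_pile():
        robot.undock()                          # leave the pile first
    scene = robot.scan_scene(radius_m=5.0)      # preset range: assumed value
    radar = robot.receive_radar_signal()
    return robot.fuse_localize(scene, radar)
```

A docked robot would thus undock before scanning; an undocked one scans immediately.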
In one embodiment, the determining the position of the robot further comprises:
judging whether the robot has a locking relation with a charging pile or not;
and if the locking relation exists, sending a state updating instruction to the server, so that the server updates the state of the charging pile locked with the robot into an idle state according to the state updating instruction.
In this embodiment of the invention, when a robot is charging, for example charging pile A charging robot A, the server stores the state of charging pile A as occupied, and a locking relationship exists between charging pile A and robot A; the server and/or the robot stores this locking relationship. When robot A finishes charging on charging pile A and no longer needs to charge, the server updates the state of charging pile A to idle. If, however, robot A is pushed off the pile during charging (charging incomplete), charging pile A remains locked. Therefore, after robot A is started and its position is determined, it judges whether it still has a locking relationship with charging pile A; if the locking relationship exists, it sends a state update instruction to the server, so that the server updates the state of the charging pile locked to the robot to idle according to the instruction.
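The lock-release step can be illustrated with an in-memory stand-in for the server's state table. The dictionary layout and all names are assumptions for illustration, not the patent's actual protocol:

```python
# Hypothetical in-memory model of the server's charging-pile bookkeeping.
pile_states = {"pile-A": "occupied", "pile-B": "idle"}
locks = {"robot-A": "pile-A"}   # robot -> pile locking relationship

def release_stale_lock(robot_id: str) -> None:
    """After relocating at startup, free any pile still locked to this robot,
    e.g. because the robot was pushed off mid-charge and the pile stayed
    'occupied'. This models the 'state update instruction' to the server."""
    pile = locks.pop(robot_id, None)
    if pile is not None:
        pile_states[pile] = "idle"
```

Calling `release_stale_lock("robot-A")` would mark pile-A idle again and drop the stale lock; a robot with no lock is a no-op.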
In one embodiment, as shown in fig. 3, there is provided a method for controlling a robot to work on duty, the method being performed by controlling a robot working device on duty, the method comprising the steps of:
step S31, after the robot is started, scene information within a preset range from the robot is acquired;
step S32, receiving radar radiation signals;
step S33, according to the scene information and the radar radiation signal, performing fusion positioning on the robot, and determining the position of the robot;
in the embodiment of the present invention, the content described in the above steps S31-S33 is the same as that described in the embodiment shown in fig. 1, and will not be described again here.
And step S34, controlling the robot to move to a designated position for working according to the position of the robot and a preset working position.
In this embodiment of the invention, each robot is preset with a position at which it should work, and the robot is controlled to move to the specified position and work there according to its own position and the preset work position.
According to the above method for controlling a robot to work on duty, after the robot is started, scene information within a preset range from the robot is acquired, a radar radiation signal is received, fusion positioning is performed on the robot according to the scene information and the radar radiation signal to determine the position of the robot, and the robot is controlled to move to a specified position and work there according to its position and the preset work position. Because the robot is positioned by fusing the two modes, its position can be determined more accurately, avoiding positioning errors or failures. Further, the robot can be guided to the designated working position.
In one embodiment, there is provided a robot charging method, the execution subject of the method being a robot charging device, the method including the steps of:
step A: after a charging function is started, acquiring a charging pile in an idle state from a server;
in the embodiment of the present invention, the robot detects its own power, and starts the charging function when the power is lower than a preset threshold (for example, the power is lower than 10%). The server records the states of the charging piles, including idle states and occupied states. After the robot starts the charging function, the robot acquires the charging pile state from the server and determines the idle charging pile according to the charging pile state.
And B, step B: selecting a target charging pile from the charging piles in an idle state, and receiving identification information which is sent by the server and is associated with the target charging pile;
in the embodiment of the invention, in a specific charging pile in the target charging pile finger-shaped idle charging piles, in the server, each charging pile stores corresponding identification information, and the identification information is used for uniquely identifying the charging pile and playing a role of identity identification, wherein the identification information can be an ID of the charging pile, a two-dimensional code containing the ID, ORB data of the charging pile, and the like.
In the embodiment of the invention, after the robot selects the target charging pile A from the idle charging piles, the robot receives the identification information which is sent by the server and is associated with the target charging pile A.
And C: moving to the position of the target charging pile, and acquiring at least one piece of scene information of the position of the target charging pile;
in the embodiment of the present invention, the location refers to a vicinity of the target charging pile, and for example, the location of the target charging pile can be within 1 meter of the target charging pile.
In this embodiment of the invention, when the robot moves to the position of the target charging pile, it detects, by itself or via the server, how many charging piles are at that position. If there is only one, it acquires the scene information of that pile; if there are several, it acquires the scene information corresponding to each pile. The scene information includes a two-dimensional code containing an ID and/or ORB data of the charging pile.
Step D: establishing connection with the target charging pile according to the at least one piece of scene information and the identification information associated with the target charging pile;
in the embodiment of the invention, if the robot moves to the position of the target charging pile and finds that a plurality of charging piles exist, the robot does not know that the robot should be connected with the charging pile, the robot receives the identification information which is sent by the server and is associated with the target charging pile in step S12, and acquires at least one piece of scene information of the position of the target charging pile in step S13, and the robot matches each piece of scene information with the identification information associated with the target charging pile, so that the target charging pile matched with the target charging pile can be determined and is connected with the target charging pile.
Step E: and after the robot is successfully connected with the target charging pile, charging the robot through the target charging pile.
In the embodiment of the invention, after the robot is successfully connected with the target charging pile, the robot is shut down, and the target charging pile charges the robot.
According to the above robot charging method, after the charging function is started, the idle charging piles are acquired from the server, a target charging pile is selected from them, the identification information associated with the target pile is received from the server, the robot moves to the target pile's position, acquires at least one piece of scene information there, establishes a connection with the target pile according to the scene information and the associated identification information, and, after the connection succeeds, is charged by the target pile. Because the idle piles are acquired from the server, the robot avoids connecting to a pile in the occupied state, improving the accuracy of the connection.
In one embodiment, the step B of selecting a target charging pile from the idle charging piles specifically includes:
step B1: scene information within a preset range from the robot is obtained, and the robot is roughly positioned according to the scene information to obtain a first positioning result;
in the embodiment of the invention, after the robot acquires the charging pile in an idle state from the server, scene information within a preset range from the robot is acquired, the scene information comprises a two-dimensional code containing an ID and/or ORB data of the charging pile, and the robot is coarsely positioned according to the scene information to obtain a first positioning result.
Step B2: radar positioning is carried out on the robot, and a second positioning result is obtained;
in the embodiments of the present invention, radar positioning belongs to the prior art, and is not described herein.
Step B3: determining the position of the robot according to the first positioning result and the second positioning result;
in the prior art, radar positioning is usually performed in robot positioning, but errors often occur only by using radar positioning, so that coarse positioning is performed through scene information to obtain a first positioning result, then radar positioning is performed to obtain a fine second positioning result, then the first positioning result and the second positioning result are analyzed, if the first positioning result contains the second positioning result, the second positioning result is accurate, the second positioning result is determined as the position of the robot, and if the first positioning result does not contain the second positioning result, the second positioning result is inaccurate, repositioning is required, and the position of the robot is determined.
Step B4: and selecting the target charging pile according to the position of the robot and the position of the idle charging pile in the state.
In this embodiment of the invention, when the robot acquires the idle charging piles from the server, it can also acquire their positions. After determining its own position, the robot calculates its distance to each idle pile from its position and the piles' positions, and takes the pile with the shortest distance as the target charging pile. The robot thus selects the charging pile nearest to it and can move to it more quickly for charging.
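The nearest-pile selection can be sketched as follows, assuming straight-line distance (the patent does not specify the distance metric) and hypothetical names:

```python
import math

def nearest_pile(robot_pos: tuple, pile_positions: dict) -> str:
    """Return the idle pile with the shortest straight-line distance
    from the robot; pile_positions maps pile id -> (x, y)."""
    return min(
        pile_positions,
        key=lambda pid: math.dist(robot_pos, pile_positions[pid]),
    )
```

For example, a robot at the origin with idle piles at (0, 4) and (1, 1) would pick the latter. In a real deployment, path length through the map would likely be a better metric than Euclidean distance.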
In one embodiment, step D specifically includes:
step D1: matching the identification information associated with the target charging pile with the at least one piece of scene information; wherein, one scene information corresponds to one charging pile;
step D2: determining the charging pile corresponding to the matched scene information as the target charging pile;
step D3: and establishing connection with the target charging pile.
In this embodiment of the invention, if the robot moves to the position of the target charging pile and finds several piles there, it does not know which one to connect to. After receiving the identification information associated with the target pile from the server and acquiring at least one piece of scene information at that position, the robot matches each piece of scene information against the identification information, determines the matching pile as the target, and connects to it. For example, suppose the identification information associated with the target pile is ID-a, and the scene information includes that of pile a (for example, a two-dimensional code containing ID-a) and that of pile b (for example, a two-dimensional code containing ID-b). The robot parses each piece of scene information to obtain the identification of the corresponding pile, matches ID-a against these, finds the matching scene information, and determines the pile corresponding to it as the target charging pile to connect to.
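The matching step can be sketched as follows, assuming each piece of scene information has already been decoded to a pile identification (for example, the ID parsed from a two-dimensional code); all names are hypothetical:

```python
def match_target_pile(target_id: str, scene_infos: dict):
    """scene_infos maps pile -> decoded identification (one scene info per
    pile). Return the pile whose identification matches the server-sent
    target id, or None if nothing matches."""
    for pile, decoded_id in scene_infos.items():
        if decoded_id == target_id:
            return pile
    return None   # no match -> connection cannot be established
```

With `{"pile-a": "ID-a", "pile-b": "ID-b"}` and target `"ID-a"`, the robot would connect to pile-a; an unknown target id yields `None`, triggering the failure path described below.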
In one embodiment, the method further comprises:
and if the connection with the target charging pile fails, returning to execute the step of acquiring the idle charging pile from the server.
In this embodiment of the invention, if the connection with target charging pile A fails, pile A cannot charge the robot, and the idle charging piles must be acquired from the server again. Note that before doing so, target charging pile A should be removed from the set of idle piles, so that it is not selected again.
In one embodiment, after the connection with the target charging pile is successful, the method further includes:
and sending a connection success instruction to the server, so that the server updates the state of the target charging pile into an occupied state according to the connection success instruction.
In this embodiment of the invention, after the robot is successfully connected with target charging pile A, it sends a connection success instruction to the server, so that the server updates the state of pile A to occupied according to the instruction, preventing other robots from selecting pile A while it is charging.
It should be understood that although the various steps in the flow charts of fig. 1-3 are shown sequentially as indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated otherwise herein, there is no strict order restriction on the performance of these steps, and they may be performed in other orders. Moreover, at least some of the steps in fig. 1-3 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and the sub-steps or stages are not necessarily performed sequentially but may be performed in turn or alternately with other steps, or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 4, there is provided a robot positioning device, the device comprising:
the acquiring module 41 is used for acquiring scene information within a preset range from the robot after the robot is started;
a receiving module 42, configured to receive a radar radiation signal;
and a fusion positioning module 43, configured to perform fusion positioning on the robot according to the scene information and the radar radiation signal, and determine the position of the robot.
In one embodiment, the fusion localization module 43 is configured to:
performing coarse positioning on the robot according to the scene information to obtain a first positioning result;
performing radar positioning on the robot according to the radar radiation signal to obtain a second positioning result;
and determining the position of the robot according to the first positioning result and the second positioning result.
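One plausible reading of this fusion rule, spelled out more fully in claim 1, is that the coarse scene-based result yields a region and the radar result is accepted only if it falls inside that region; otherwise relocalization is required. The sketch below works under that assumption, using a rectangular region and a 2-D point as illustrative representations.

```python
def fuse_position(coarse_region, radar_point):
    """coarse_region: (xmin, ymin, xmax, ymax) from scene-based coarse
    positioning; radar_point: (x, y) from radar positioning.
    Returns the accepted position, or None if relocalization is needed."""
    xmin, ymin, xmax, ymax = coarse_region
    x, y = radar_point
    if xmin <= x <= xmax and ymin <= y <= ymax:
        return radar_point  # second result lies within the first: accept it
    return None             # inconsistent results: reposition and retry

print(fuse_position((0, 0, 10, 10), (3.2, 4.5)))   # -> (3.2, 4.5)
print(fuse_position((0, 0, 10, 10), (15.0, 4.5)))  # -> None
```

A real implementation would compare a pose distribution against a map region rather than a point against a rectangle, but the accept/relocalize decision has the same shape.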
In one embodiment, the obtaining module 41 is configured to:
after the robot is started, judging whether the robot is on a charging pile or not;
if the robot is on the charging pile, the robot moves off the pile and then acquires scene information within the preset range from the robot;
and if the robot is not on the charging pile, acquiring scene information within a preset range from the robot.
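The start-up branch handled by the acquiring module can be sketched as follows. All three callables are hypothetical stand-ins for the robot's dock sensor, undocking motion, and perception routine; only the branching logic comes from the text.

```python
def startup_scene_acquisition(on_pile, undock, get_scene_info):
    """If the robot wakes up docked on a charging pile, move off the pile
    first, then acquire scene information within the preset range."""
    if on_pile():
        undock()  # leave the pile before sensing
    return get_scene_info()

# Usage: a docked robot undocks exactly once, then acquires scene info.
events = []
scene = startup_scene_acquisition(
    on_pile=lambda: True,
    undock=lambda: events.append("undocked"),
    get_scene_info=lambda: ["pile_A_qr"],
)
print(events, scene)  # -> ['undocked'] ['pile_A_qr']
```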
In one embodiment, the device further comprises a state updating module for judging whether the robot has a locking relationship with the charging pile; and if the locking relation exists, sending a state updating instruction to the server, so that the server updates the state of the charging pile locked with the robot into an idle state according to the state updating instruction.
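The state-updating module's effect on the server's pile table — releasing a stale lock by marking the pile idle again — can be sketched as follows. The dictionary-based state table and function name are illustrative assumptions.

```python
def release_lock_if_any(locked_pile_id, pile_states):
    """If the robot still holds a lock on a charging pile (e.g. after a
    restart), send the equivalent of a state-update instruction: mark
    that pile idle again so other robots can select it."""
    if locked_pile_id is not None and locked_pile_id in pile_states:
        pile_states[locked_pile_id] = "idle"

states = {"A": "occupied", "B": "idle"}
release_lock_if_any("A", states)   # robot was locked to pile A
print(states["A"])  # -> idle
```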
For specific limitations on the robot positioning device and the robot, reference may be made to the above limitations on the robot positioning method, which are not repeated here. Each module of the robot positioning device and the robot may be implemented in whole or in part by software, by hardware, or by a combination thereof. The modules may be embedded in or independent of a processor in a computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke them and perform the operations corresponding to each module.
In one embodiment, a robot is provided, the robot comprising the robot positioning device of the above embodiments.
In one embodiment, as shown in fig. 5, there is provided an on-duty operation device for controlling a robot, the device comprising:
the acquiring module 51 is used for acquiring scene information within a preset range from the robot after the robot is started;
a receiving module 52, configured to receive a radar radiation signal;
the fusion positioning module 53 is configured to perform fusion positioning on the robot according to the scene information and the radar radiation signal, and determine the position of the robot;
and the operation module 54 is used for controlling the robot to move to a specified position for operation according to the position of the robot and a preset operation position.
In one embodiment, the fusion localization module 53 is configured to:
performing coarse positioning on the robot according to the scene information to obtain a first positioning result;
performing radar positioning on the robot according to the radar radiation signal to obtain a second positioning result;
and determining the position of the robot according to the first positioning result and the second positioning result.
In one embodiment, the obtaining module 51 is configured to:
after the robot is started, judging whether the robot is on a charging pile or not;
if the robot is on the charging pile, the robot moves off the pile and then acquires scene information within the preset range from the robot;
and if the robot is not on the charging pile, acquiring scene information within a preset range from the robot.
In one embodiment, the device further comprises a state updating module for judging whether the robot has a locking relationship with the charging pile; and if the locking relation exists, sending a state updating instruction to the server, so that the server updates the state of the charging pile locked with the robot into an idle state according to the state updating instruction.
In one embodiment, a robot is provided, and the robot comprises the on-duty operation device for controlling a robot according to the above embodiments.
For specific limitations on the on-duty operation device for controlling a robot and on the robot, reference may be made to the above limitations on the method for controlling a robot to perform on-duty operation, which are not repeated here. Each module of the operation device and the robot may be implemented in whole or in part by software, by hardware, or by a combination thereof. The modules may be embedded in or independent of a processor in a computer device in hardware form, or stored in a memory of the computer device in software form, so that the processor can invoke them and perform the operations corresponding to each module.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
after the robot is started, scene information within a preset range from the robot is acquired;
receiving a radar radiation signal;
and performing fusion positioning on the robot according to the scene information and the radar radiation signal to determine the position of the robot.
In one embodiment, the computer program when executed by the processor further performs the steps of:
performing coarse positioning on the robot according to the scene information to obtain a first positioning result;
performing radar positioning on the robot according to the radar radiation signal to obtain a second positioning result;
and determining the position of the robot according to the first positioning result and the second positioning result.
In one embodiment, the computer program when executed by the processor further performs the steps of:
after the robot is started, judging whether the robot is on a charging pile or not;
if the robot is on the charging pile, the robot moves off the pile and then acquires scene information within the preset range from the robot;
and if the robot is not on the charging pile, acquiring scene information within a preset range from the robot.
In one embodiment, the computer program when executed by the processor further performs the steps of:
judging whether the robot has a locking relation with a charging pile or not;
and if the locking relation exists, sending a state updating instruction to the server, so that the server updates the state of the charging pile locked with the robot into an idle state according to the state updating instruction.
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon, which when executed by a processor, performs the steps of:
after the robot is started, scene information within a preset range from the robot is acquired;
receiving a radar radiation signal;
according to the scene information and the radar radiation signals, performing fusion positioning on the robot, and determining the position of the robot;
and controlling the robot to move to a specified position for operation according to the position of the robot and a preset operation position.
In one embodiment, the computer program when executed by the processor further performs the steps of:
performing coarse positioning on the robot according to the scene information to obtain a first positioning result;
performing radar positioning on the robot according to the radar radiation signal to obtain a second positioning result;
and determining the position of the robot according to the first positioning result and the second positioning result.
In one embodiment, the computer program when executed by the processor further performs the steps of:
after the robot is started, judging whether the robot is on a charging pile or not;
if the robot is on the charging pile, the robot moves off the pile and then acquires scene information within the preset range from the robot;
and if the robot is not on the charging pile, acquiring scene information within a preset range from the robot.
In one embodiment, the computer program when executed by the processor further performs the steps of:
judging whether the robot has a locking relation with a charging pile or not;
and if the locking relation exists, sending a state updating instruction to the server, so that the server updates the state of the charging pile locked with the robot into an idle state according to the state updating instruction.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware; the program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), and direct Rambus dynamic RAM (DRDRAM).
All possible combinations of the technical features of the above embodiments have not been described for the sake of brevity; however, as long as there is no contradiction in a combination of these technical features, it should be considered to be within the scope of this specification.
The above-mentioned embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (5)

1. A robot positioning method, characterized in that the method comprises:
after the robot is started and a charging function is started, acquiring a charging pile in an idle state from a server;
selecting a target charging pile from the charging piles in an idle state, and receiving identification information which is sent by the server and is associated with the target charging pile;
moving to the position of the target charging pile, and acquiring at least one piece of scene information at the position of the target charging pile; the position refers to the area near the target charging pile; the scene information comprises a charging pile ID, a two-dimensional code encoding a charging pile ID, or charging pile ORB data;
receiving a radar radiation signal;
performing coarse positioning on the robot according to the scene information to obtain a first positioning result;
performing radar positioning on the robot according to the radar radiation signal to obtain a second positioning result;
determining the position of the robot according to the first positioning result and the second positioning result; wherein the first positioning result and the second positioning result are analyzed: if the first positioning result contains the second positioning result, the second positioning result is determined to be accurate and is determined as the position of the robot; if the first positioning result does not contain the second positioning result, the second positioning result is determined to be inaccurate, and repositioning is required to confirm the position of the robot;
if the robot moves to the position where the target charging pile is located and a plurality of charging piles are found, the robot cannot determine which charging pile to connect with, and after receiving identification information which is sent by a server and associated with the target charging pile and acquiring at least one piece of scene information of the position where the target charging pile is located, the robot matches each piece of scene information with the identification information associated with the target charging pile, determines a matched target charging pile from the plurality of charging piles, and establishes connection with the target charging pile.
2. The method of claim 1, wherein after the determining of the position of the robot, the method further comprises:
judging whether the robot has a locking relation with a charging pile or not;
and if the locking relation exists, sending a state updating instruction to the server, so that the server updates the state of the charging pile locked with the robot into an idle state according to the state updating instruction.
3. An on-duty operation device for controlling a robot, characterized in that the on-duty operation device comprises:
the acquisition module is used for acquiring a charging pile in an idle state from the server after the robot is started and the charging function is activated; selecting a target charging pile from the charging piles in the idle state, and receiving identification information associated with the target charging pile sent by the server; moving to the position of the target charging pile, and acquiring at least one piece of scene information at the position of the target charging pile; the position refers to the area near the target charging pile; the scene information comprises a charging pile ID, a two-dimensional code encoding a charging pile ID, or charging pile ORB data;
the receiving module is used for receiving radar radiation signals;
the fusion positioning module is used for: performing coarse positioning on the robot according to the scene information to obtain a first positioning result; performing radar positioning on the robot according to the radar radiation signal to obtain a second positioning result; determining the position of the robot according to the first positioning result and the second positioning result; wherein the first positioning result and the second positioning result are analyzed: if the first positioning result contains the second positioning result, the second positioning result is determined as the position of the robot; if the first positioning result does not contain the second positioning result, returning to the step of acquiring the scene information within the preset range from the robot;
if the robot moves to the position where the target charging pile is located and a plurality of charging piles are found, the robot cannot determine which charging pile to connect with, and after receiving identification information which is sent by a server and associated with the target charging pile and acquiring at least one piece of scene information of the position where the target charging pile is located, the robot matches each piece of scene information with the identification information associated with the target charging pile, determines a matched target charging pile from the plurality of charging piles, and establishes connection with the target charging pile.
4. A robot, characterized in that the robot comprises the on-duty operation device for controlling a robot according to claim 3.
5. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method of claim 1 or 2.
CN201911172328.0A 2019-11-26 2019-11-26 Robot positioning method, robot and post-working device for controlling robot Active CN110873875B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911172328.0A CN110873875B (en) 2019-11-26 2019-11-26 Robot positioning method, robot and post-working device for controlling robot

Publications (2)

Publication Number Publication Date
CN110873875A CN110873875A (en) 2020-03-10
CN110873875B true CN110873875B (en) 2022-08-16

Family

ID=69718159

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911172328.0A Active CN110873875B (en) 2019-11-26 2019-11-26 Robot positioning method, robot and post-working device for controlling robot

Country Status (1)

Country Link
CN (1) CN110873875B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114454764B (en) * 2022-04-12 2022-09-06 北京京东乾石科技有限公司 Charging method, charging station, system and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104571133A (en) * 2014-12-03 2015-04-29 嘉兴市德宝威微电子有限公司 Indoor charging method and system and indoor moving method for robot
CN106451635A (en) * 2016-11-02 2017-02-22 深圳乐行天下科技有限公司 Intelligent recharging method and apparatus
CN108955667A (en) * 2018-08-02 2018-12-07 苏州中德睿博智能科技有限公司 A kind of complex navigation method, apparatus and system merging laser radar and two dimensional code
CN109084732A (en) * 2018-06-29 2018-12-25 北京旷视科技有限公司 Positioning and air navigation aid, device and processing equipment
CN109807911A (en) * 2019-03-14 2019-05-28 湖南超能机器人技术有限公司 Based on GNSS, UWB, IMU, laser radar, code-disc the multi-environment joint positioning method of outdoor patrol robot
CN109993794A (en) * 2019-03-29 2019-07-09 北京猎户星空科技有限公司 A kind of robot method for relocating, device, control equipment and storage medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5669168B2 (en) * 2009-03-05 2015-02-12 日本電気株式会社 Distance measuring system and distance measuring method
DE102016202052B3 (en) * 2016-02-11 2017-04-06 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method and arrangement for high-precision positioning of a robot-controlled interaction device by means of radar
WO2018140748A1 (en) * 2017-01-26 2018-08-02 The Regents Of The University Of Michigan Localization using 2d maps which capture vertical structures in 3d point data
CN107462869B (en) * 2017-06-27 2020-04-24 深圳市优必选科技有限公司 Robot recharging alignment method, robot, system and storage medium
CN107612879B (en) * 2017-07-21 2023-12-01 山东省科学院自动化研究所 Wall penetrating radar remote assistance system and method based on WiFi (wireless fidelity) safety communication
CN110412530B (en) * 2018-04-27 2021-09-17 深圳市优必选科技有限公司 Method and device for identifying charging pile and robot
CN110238850A (en) * 2019-06-13 2019-09-17 北京猎户星空科技有限公司 A kind of robot control method and device
CN110491060B (en) * 2019-08-19 2021-09-17 深圳市优必选科技股份有限公司 Robot, safety monitoring method and device thereof, and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant