CN111474928A - Robot control method, robot, electronic device, and readable storage medium - Google Patents

Robot control method, robot, electronic device, and readable storage medium

Info

Publication number
CN111474928A
CN111474928A (application CN202010254980.3A)
Authority
CN
China
Prior art keywords
robot
pose
preset
error
controlling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010254980.3A
Other languages
Chinese (zh)
Other versions
CN111474928B (en)
Inventor
徐恩科
霍峰
卜大鹏
陈侃
秦宝星
程昊天
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Gaussian Automation Technology Development Co Ltd
Original Assignee
Shanghai Gaussian Automation Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Gaussian Automation Technology Development Co Ltd filed Critical Shanghai Gaussian Automation Technology Development Co Ltd
Priority to CN202010254980.3A priority Critical patent/CN111474928B/en
Publication of CN111474928A publication Critical patent/CN111474928A/en
Application granted granted Critical
Publication of CN111474928B publication Critical patent/CN111474928B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0223Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0242Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/60Other road transportation technologies with climate change mitigation effect
    • Y02T10/70Energy storage systems for electromobility, e.g. batteries

Abstract

The application discloses a robot control method, a robot, an electronic device and a readable storage medium. The robot control method includes: acquiring the position of a charging pile and controlling the robot to move to a predetermined position directly facing the charging pile; controlling the robot to advance and/or retreat to adjust the pose of the robot to a preset pose; and controlling the robot to move to the charging pile for charging. The robot and the robot control method of the embodiments of the application adopt coarse-adjustment logic: by moving back and forth while adjusting its pose, the robot gains a longer actual travel distance over which the pose error can converge, so that only a small adjustment distance is needed in front of the charging pile. This effectively saves space while still ensuring sufficient pile-alignment accuracy, thereby reducing the robot's requirements on the site.

Description

Robot control method, robot, electronic device, and readable storage medium
Technical Field
The present application relates to the field of robot intelligent control technologies, and more particularly, to a robot control method, a robot, an electronic device, and a readable storage medium.
Background
In the related art, robots can generally perform tasks automatically through positioning and navigation technologies, which reduces labor cost. To achieve accurate pile alignment, a robot usually requires a large space in front of the charging pile: it must start advancing or retreating from a position far in front of the charging point and, while driving, continuously adjust the body posture and angle through front-wheel steering. Automatic pile alignment therefore places high demands on the space available to the robot, which greatly limits where the robot's charging pile can be deployed.
Disclosure of Invention
The present invention is directed to solving, at least to some extent, one of the technical problems in the related art.
Therefore, one object of the invention is to provide a robot control method that enables a robot to dock with and charge at a charging pile automatically.
Another object of the invention is to provide a robot, an electronic device and a readable storage medium that enable automatic docking and charging.
In order to achieve the above object, an embodiment of the present application provides a robot control method, including: acquiring the position of a charging pile and controlling the robot to move to a preset position right facing the charging pile; controlling the robot to advance and/or retreat to adjust the pose of the robot to a preset pose; and controlling the robot to move to the charging pile for charging.
In the robot control method of the embodiments of the application, coarse-adjustment logic is adopted: after navigating to the predetermined position, the robot retreats, advances again and retreats again, gaining a longer actual travel distance over which the pose error can converge while the pose is adjusted. The robot therefore does not need a large adjustment distance in front of the charging pile, which effectively saves space while still ensuring sufficient pile-alignment accuracy, thereby reducing the robot's requirements on the site.
In some embodiments, the step of acquiring a position of a charging post and controlling the robot to move to a predetermined position facing the charging post comprises: establishing a coordinate system based on the center of the charging pile and determining the pose of the robot; and controlling the robot to move to the preset position according to the pose of the robot. In this way, the pose of the robot is determined through establishment of the coordinate system so as to guide the robot to move to a predetermined position.
In some embodiments, the step of controlling the robot to advance and/or retreat to adjust the pose of the robot to a preset pose includes: acquiring the current pose of the robot in real time, and judging whether the error between the current pose and the preset pose is smaller than a set value; controlling the robot to retreat and adjusting the pose of the robot during the retreat when the error between the current pose and the preset pose is not less than the set value; and determining that the pose of the robot has been adjusted to the preset pose when the error between the current pose and the preset pose is smaller than the set value. In this way, whether the error between the current pose and the preset pose is smaller than the set value is judged in real time, and whether to continue adjusting the pose of the robot is decided according to the result.
In some embodiments, the pose of the robot includes a coordinate value and a direction angle, the set value includes a first set value and a second set value, and the step of acquiring the current pose of the robot in real time and judging whether the error between the current pose and the preset pose is smaller than the set value includes: judging that the error between the current pose and the preset pose is smaller than the set value when the coordinate value error between the current pose and the preset pose is smaller than the first set value and the direction angle error between the current pose and the preset pose is smaller than the second set value; or judging that the error between the current pose and the preset pose is not less than the set value when the coordinate value error is not less than the first set value and/or the direction angle error is not less than the second set value. Adjusting the pose of the robot means driving the Y-axis coordinate error and the direction angle error of the robot to converge toward zero. In this way, the robot is determined to meet the pose-adjustment requirement only when the coordinate value error and the direction angle error are both smaller than the corresponding set values.
In some embodiments, the step of controlling the robot to retreat and adjusting the pose of the robot during the retreat includes: controlling the robot to advance and adjusting the pose of the robot during the advance when the robot has retreated to a retreat end position and the error between the current pose and the preset pose is still not less than the set value, the distance from the retreat end position to the charging pile being smaller than the distance from the predetermined position to the charging pile; controlling the robot to retreat and adjusting the pose of the robot during the retreat when the robot has advanced back to the predetermined position and the error between the current pose and the preset pose is still not less than the set value; and repeating the above steps until the pose of the robot is adjusted to the preset pose. Through this cyclic adjustment of retreating, advancing again and retreating again within the range between the retreat end position and the predetermined position, the robot can reach the preset pose without needing a large adjustment distance in front of the charging pile, which effectively saves space while still ensuring sufficient pile-alignment accuracy, thereby reducing the robot's requirements on the site.
In some embodiments, the step of controlling the robot to advance and/or retreat to adjust the pose of the robot to a preset pose includes: acquiring the current pose of the robot in real time, and calculating the current error between the current pose and the preset pose; and generating a control command according to the current error, the reference speed of the robot and the control coefficients so as to control the robot to advance and/or retreat. In this way, the robot combines the calculated error with its own performance parameters to generate the corresponding control command, so that the robot is controlled and its pose adjustment achieves a better effect.
In certain embodiments, the control instructions include linear velocity instructions and angular velocity instructions. Therefore, when the pose of the robot is adjusted, the linear velocity and the angular velocity need to be controlled in a combined manner to realize accurate adjustment.
The embodiment of the application provides a robot, which comprises an acquisition module, a pose adjusting module and a pile alignment control module, wherein the acquisition module is used for acquiring the position of a charging pile and controlling the robot to move to a preset position just opposite to the charging pile; the pose adjusting module is used for controlling the robot to move forwards and/or backwards so as to adjust the pose of the robot to a preset pose; and the pile control module is used for controlling the robot to move to the charging pile for charging.
The robot of the embodiments of the application adopts coarse-adjustment logic: after navigating to the predetermined position, the robot retreats, advances again and retreats again, gaining a longer actual travel distance over which the pose error can converge while the pose is adjusted. The robot therefore does not need a large adjustment distance in front of the charging pile, which effectively saves space while still ensuring sufficient pile-alignment accuracy, thereby reducing the robot's requirements on the site.
In some embodiments, the acquisition module includes a first determination unit configured to establish a coordinate system based on a center of the charging pile and determine a pose of the robot; the first control unit is used for controlling the robot to move to the preset position according to the pose of the robot. In this way, the pose of the robot is determined through establishment of the coordinate system so as to guide the robot to move to a predetermined position.
In some embodiments, the pose adjusting module includes a pose determining unit, a second control unit, and a second determining unit. The pose determining unit is configured to acquire the current pose of the robot in real time and judge whether the error between the current pose and the preset pose is smaller than a set value; the second control unit is configured to control the robot to retreat and adjust the pose of the robot during the retreat when the error between the current pose and the preset pose is not less than the set value; and the second determining unit is configured to determine that the pose of the robot has been adjusted to the preset pose when the error between the current pose and the preset pose is smaller than the set value. In this way, whether the error between the current pose and the preset pose is smaller than the set value is judged in real time, and whether to continue adjusting the pose of the robot is decided according to the result.
In some embodiments, the pose of the robot includes a coordinate value and a direction angle, the set value includes a first set value and a second set value, and the pose determination unit is configured to judge that the error between the current pose and the preset pose is smaller than the set value when the coordinate value error between the current pose and the preset pose is smaller than the first set value and the direction angle error between the current pose and the preset pose is smaller than the second set value, or to judge that the error between the current pose and the preset pose is not less than the set value when the coordinate value error is not less than the first set value and/or the direction angle error is not less than the second set value. Adjusting the pose of the robot means driving the Y-axis coordinate error and the direction angle error of the robot to converge toward zero. In this way, the robot is determined to meet the pose-adjustment requirement only when the coordinate value error and the direction angle error are both smaller than the corresponding set values.
In some embodiments, the second control unit is configured to control the robot to advance and adjust the pose of the robot during the advance when the robot has retreated to a retreat end position and the error between the current pose and the preset pose is still not less than the set value, the distance from the retreat end position to the charging pile being smaller than the distance from the predetermined position to the charging pile; to control the robot to retreat and adjust the pose of the robot during the retreat when the robot has advanced back to the predetermined position and the error between the current pose and the preset pose is still not less than the set value; and to repeat the above steps until the pose of the robot is adjusted to the preset pose. Through this cyclic adjustment of retreating, advancing again and retreating again within the range between the retreat end position and the predetermined position, the robot can reach the preset pose without needing a large adjustment distance in front of the charging pile, which effectively saves space while still ensuring sufficient pile-alignment accuracy, thereby reducing the robot's requirements on the site.
In some embodiments, the pose adjustment module includes a calculation unit and a processing unit. The calculation unit is configured to acquire the current pose of the robot in real time and calculate the current error between the current pose and the preset pose; and the processing unit is configured to generate a control command according to the current error, the reference speed of the robot and the control coefficients so as to control the robot to advance and/or retreat. In this way, the robot combines the calculated error with its own performance parameters to generate the corresponding control command, so that the robot is controlled and its pose adjustment achieves a better effect.
The embodiment of the present application provides a robot, which includes a processor, a readable storage medium, and computer-executable instructions stored on the readable storage medium and executable on the processor, wherein when the computer-executable instructions are executed by the processor, the processor is caused to execute the robot control method according to any one of the above embodiments.
The robot of the embodiments of the application executes the computer-executable instructions through the processor and adopts coarse-adjustment logic: after being controlled to navigate to the predetermined position, the robot retreats, advances again and retreats again, gaining a longer actual travel distance over which the pose error can converge while the pose is adjusted. The robot therefore does not need a large adjustment distance in front of the charging pile, which effectively saves space while still ensuring sufficient pile-alignment accuracy, thereby reducing the robot's requirements on the site.
The present embodiments provide a non-transitory computer-readable storage medium including computer-executable instructions that, when executed by one or more processors, cause the processors to perform the robot control method of the above embodiments.
With the readable storage medium of the embodiments of the application, the processor executes the computer-executable instructions and adopts coarse-adjustment logic: the robot is controlled to navigate to the predetermined position and then retreats, advances again and retreats again, gaining a longer actual travel distance over which the pose error can converge while the pose is adjusted. The robot therefore does not need a large adjustment distance in front of the charging pile, which effectively saves space while still ensuring sufficient pile-alignment accuracy, thereby reducing the robot's requirements on the site.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a flowchart illustrating a robot control method according to an embodiment of the present invention.
Fig. 2 is a coordinate schematic diagram of a charging pile according to an embodiment of the present application.
Fig. 3 is a schematic diagram of a robot module according to an embodiment of the present application.
Fig. 4 is a schematic view of an application scenario of a robot according to an embodiment of the present application.
Fig. 5 is another flowchart illustrating a robot control method according to an embodiment of the present application.
Fig. 6 is a further flowchart illustrating a robot control method according to an embodiment of the present invention.
Fig. 7 is a further flowchart of the robot control method according to the embodiment of the present application.
Fig. 8 is a further flowchart of the robot control method according to the embodiment of the present application.
Fig. 9 is a further flowchart of the robot control method according to the embodiment of the present application.
Fig. 10 is a further flowchart of the robot control method according to the embodiment of the present application.
Fig. 11 is a logic diagram of pile charging control by the robot according to the embodiment of the present application.
Fig. 12 is another block diagram of a robot according to an embodiment of the present disclosure.
Fig. 13 is a schematic view of a further module of a robot according to an embodiment of the present application.
Description of the main element symbols:
the robot 10, the acquisition module 11, the first determination unit 112, the first control unit 114, the pose adjustment module 12, the pose determination unit 121, the second control unit 122, the second determination unit 123, the calculation unit 124, the processing unit 125, the pair pile control module 13, the third control unit 132, the pair pile determination unit 134, the charging unit 136, the laser transmitter 14, the laser sensor 15, the charging pile 20, the predetermined position 30, the retreat termination position 40, the terminal device 50, the server 60, the electronic device 60, the processor 62, the readable storage medium 64, and the computer-executable instructions 642.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative and are only for the purpose of explaining the present application and are not to be construed as limiting the present application.
Referring to fig. 1, 2 and 3, a robot control method according to an embodiment of the present disclosure is used to enable an Ackermann-type robot 10 to align with and charge at a charging pile automatically. The Ackermann-type robot 10 uses an Ackermann steering mechanism: when the robot 10 turns, a four-bar linkage with equal-length cranks makes the steering angle of the inner wheel larger than that of the outer wheel by a certain amount, so that the centers of the circles traced by the four wheels approximately meet at an instantaneous steering center on the extension line of the rear axle, allowing the robot 10 to turn smoothly. The robot 10 includes, but is not limited to, mobile robots such as sweeping robots, transport robots and patrol robots.
In some embodiments, a robotic control method comprises:
step S1, acquiring the position of the charging pile 20 and controlling the robot 10 to move to the preset position 30 opposite to the charging pile 20;
step S2, controlling the robot 10 to advance and/or retreat to adjust the pose of the robot 10 to a preset pose; and
in step S3, the robot 10 is controlled to move to the charging pile 20 for charging.
Specifically, the robot 10 includes an acquisition module 11, a pose adjustment module 12, and a pair-pile control module 13, and step S1 may be implemented by the acquisition module 11, step S2 may be implemented by the pose adjustment module 12, and step S3 may be implemented by the pair-pile control module 13. That is, the acquiring module 11 may be used to acquire the position of the charging pile 20 and control the robot 10 to move to the predetermined position 30 opposite to the charging pile 20. The pose adjustment module 12 may be used to control the robot 10 to advance and/or retreat to adjust the pose of the robot 10 to a preset pose. The pile control module 13 may be used to control the robot 10 to move to the charging pile 20 for charging.
In the robot 10 and the robot control method of the embodiments of the application, coarse-adjustment logic is adopted: after navigating to the predetermined position 30, the robot 10 retreats, advances again and retreats again, gaining a longer actual travel distance over which the pose error can converge while the pose of the robot 10 is adjusted. The robot 10 therefore does not need a large adjustment distance in front of the charging pile 20, which effectively saves space while still ensuring sufficient pile-alignment accuracy, thereby reducing the requirements of the robot 10 on the site.
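As a rough illustration, the three steps above might be organized as in the following sketch. The helper methods (navigate_to, adjust_pose, dock_and_charge) and the 0.4 m distance are illustrative assumptions, not part of the patent itself:

```python
# Minimal sketch of steps S1-S3; all helper methods are hypothetical.
def auto_charge(robot, x1=0.4):
    # Step S1: navigate to the predetermined position (x1, 0) directly facing the pile.
    robot.navigate_to((x1, 0.0))
    # Step S2: advance and/or retreat until the pose reaches the preset pose (x, 0, 0).
    robot.adjust_pose(target=(x1, 0.0, 0.0))
    # Step S3: retreat in a straight line onto the charging pile and start charging.
    robot.dock_and_charge()
```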
In some embodiments, the robot 10 may execute the robot control method of the present application after receiving the charging instruction to control the robot 10 to automatically charge the pile.
The user may send a charging instruction to the robot 10 as needed, or the robot 10 may automatically generate the charging instruction when its battery level falls below a threshold, or after it completes a job task.
In some embodiments, the robot 10 may obtain the location of the charging pile 20 according to the scanned environment map.
It will be appreciated that when entering a new job scenario, the robot 10 needs to scan a map of the scenario and then perform the corresponding tasks automatically depending on its location in the map. Specifically, the robot 10 may pre-scan the environment through a simultaneous localization and mapping (SLAM) technique to obtain an environment map and store it, so that when the robot 10 performs a task, the position of the charging pile 20 can be obtained from the pre-stored environment map.
Referring to fig. 4, in some embodiments, the robot 10 may store the scanned environment map in the robot 10, the terminal device 50, and/or the server 60, and the robot 10, the terminal device 50, and/or the server 60 may be connected via wired and/or wireless communication to guide the robot 10 to move. The wireless communication connection includes, but is not limited to, WiFi, Bluetooth, Zigbee, Narrow Band Internet of Things (NB-IoT), and the like.
Specifically, when the map is scanned, the contour of the charging pile 20 can be completely scanned, and the position of the charging pile 20 in the environment map is edited in the related application of the terminal device 50.
In some embodiments, the robot 10 includes a laser emitter 14 and a laser sensor 15, the laser emitter 14 may be used to emit laser light to the environment surrounding the robot 10, and the laser sensor 15 may be used to receive laser light reflected by the environment surrounding the robot 10 to obtain laser point cloud data of the surrounding environment.
Further, the robot 10 may establish a map according to the acquired point cloud data of the surrounding environment to realize navigation of the robot 10, so as to position the robot 10 and avoid collision of the robot 10 with an obstacle.
In one example, the laser emitter 14 may be an infrared laser emitter 14 and the laser sensor 15 may be an infrared laser sensor 15. The infrared laser transmitter 14 may be configured to transmit infrared laser light to the environment around the robot 10, and the infrared laser sensor 15 may be configured to receive infrared laser light reflected by the environment around the robot 10 to obtain infrared laser point cloud data of the surrounding environment.
In this way, the robot 10 can reduce the influence of visible light, so that the infrared laser point cloud data acquired by the infrared laser sensor 15 is more accurate. The infrared laser sensor 15 can filter out visible light through an infrared filter arranged inside, and allow infrared laser with corresponding wavelength to pass through.
In some embodiments, the charging post 20 has a characteristic identifier, and the robot 10 may determine the location of the charging post 20 according to the characteristic identifier of the charging post 20 in the scanned environment map.
In this way, the robot 10 can determine the relative position between the robot 10 and the charging pile 20 in real time according to the characteristic identifier of the charging pile 20, and, combined with the pre-stored environment map, the positioning accuracy of the robot 10 can be further ensured and the accuracy of automatic pile alignment improved.
Referring to fig. 5, in some embodiments, step S1 includes:
step S11, establishing a coordinate system based on the center of the charging pile 20 and determining the pose of the robot 10; and
in step S12, the robot 10 is controlled to move to the predetermined position 30 according to the pose of the robot 10.
Specifically, the obtaining module 11 includes a first determining unit 112 and a first controlling unit 114, and step S11 may be implemented by the first determining unit 112, and step S12 may be implemented by the first controlling unit 114. That is, the first determination unit 112 may be configured to establish a coordinate system based on the center of the charging pile 20 and determine the pose of the robot 10. The first control unit 114 may be used to control the robot 10 to move to the predetermined position 30 according to the pose of the robot 10.
In step S11, when the coordinate system is established based on the center of the charging pile 20, the origin of coordinates is the center of the charging pile 20, the direction directly facing outward from the charging pile 20 is the positive X-axis direction, and the positive Y-axis direction is then determined according to the right-hand rule. In this way, the pose of the robot 10 can be expressed as (x, y, θ) from the coordinate values of the robot 10 and a direction angle θ, where the direction angle θ may be the angle between the direction of straight-line motion of the robot 10 and the X axis. It can be understood that when the coordinate values and the direction angle θ of the pose of the robot 10 are all zero, the robot is accurately aligned with the charging pile 20.
In step S12, the predetermined position 30 is the position directly facing the charging pile 20 at a distance x1 from it, i.e. the coordinates of the predetermined position 30 are (x1, 0). The pose after the robot 10 moves to the predetermined position 30 is (x1, 0, θ), where x1 may be flexibly configured according to the parameters (e.g., shape, size, speed control, etc.) of the robot 10. In one example, x1 may take a value in [30 cm, 100 cm]. For example, x1 may be 30 cm, 50 cm, 70 cm or 100 cm. For a typical small robot performing pile alignment, the X-axis coordinate value x1 of the predetermined position may be 40 cm.
In this manner, the pose of the robot 10 is determined by the establishment of the coordinate system to guide the robot 10 to move to the predetermined position 30.
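As an illustration of this coordinate convention, the sketch below converts a robot pose given in the map frame into the pile-centered frame. The frame convention and helper names are assumptions made for this example, not details fixed by the patent:

```python
import math

def pose_in_pile_frame(robot_pose, pile_pose):
    rx, ry, r_theta = robot_pose        # robot pose in the map frame
    px, py, p_theta = pile_pose         # charging-pile pose in the map frame (assumed known)
    dx, dy = rx - px, ry - py
    # Rotate the offset into the pile frame (X axis points straight out of the pile).
    x = math.cos(p_theta) * dx + math.sin(p_theta) * dy
    y = -math.sin(p_theta) * dx + math.cos(p_theta) * dy
    # Direction angle relative to the pile's X axis, wrapped to [-pi, pi).
    theta = (r_theta - p_theta + math.pi) % (2 * math.pi) - math.pi
    return x, y, theta
```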
Referring to fig. 6, in some embodiments, step S2 includes:
step S21, acquiring the current pose of the robot 10 in real time, and judging whether the error between the current pose and the preset pose is smaller than a set value;
step S22, controlling the robot 10 to retreat and adjusting the pose of the robot 10 in the process of retreating under the condition that the error between the current pose and the preset pose is not less than a set value; and
step S23, determining that the pose of the robot 10 is adjusted to the preset pose when the error between the current pose and the preset pose is smaller than a set value.
Specifically, the pose adjustment module 12 includes the pose determination unit 121, the second control unit 122, and the second determination unit 123, and step S21 may be implemented by the pose determination unit 121, step S22 may be implemented by the second control unit 122, and step S23 may be implemented by the second determination unit 123. That is, the pose determination unit 121 may be configured to acquire the current pose of the robot 10 in real time, and determine whether an error between the current pose and a preset pose is smaller than a set value. The second control unit 122 may be configured to control the robot 10 to move backward and adjust the pose of the robot 10 during the moving backward, when an error between the current pose and the preset pose is not less than a set value. The second determination unit 123 may be configured to determine that the pose of the robot 10 is adjusted to the preset pose in a case where an error of the current pose from the preset pose is smaller than a set value.
In this way, whether the error between the current pose and the preset pose and the size of the set value meet the requirements is judged in real time, so that whether the pose of the robot 10 is continuously adjusted according to the judgment result.
In the embodiment of the present application, controlling the robot 10 to retreat refers to controlling the robot 10 to move in a direction close to the charging pile 20, and, relatively, controlling the robot 10 to advance refers to controlling the robot 10 to move in a direction away from the charging pile 20.
Referring to fig. 7, in some embodiments, the pose of the robot 10 includes coordinate values and a direction angle, the set value includes a first set value and a second set value, and step S21 includes:
step S211, in a case where the coordinate value error between the current pose and the preset pose is smaller than the first set value and the direction angle error between the current pose and the preset pose is smaller than the second set value, determining that the error between the current pose and the preset pose is smaller than the set value; or
step S212, in a case where the coordinate value error between the current pose and the preset pose is not less than the first set value and/or the direction angle error between the current pose and the preset pose is not less than the second set value, determining that the error between the current pose and the preset pose is not less than the set value.
Specifically, step S211 and step S212 may be implemented by the pose determination unit 121. That is, the pose determination unit 121 may be configured to determine that the error between the current pose and the preset pose is smaller than the set value when the coordinate value error between the current pose and the preset pose is smaller than the first set value and the direction angle error between the current pose and the preset pose is smaller than the second set value, or to determine that the error between the current pose and the preset pose is not less than the set value when the coordinate value error is not less than the first set value and/or the direction angle error is not less than the second set value.
It can be understood that, in the pile-alignment process, if the Y-axis coordinate value and the direction angle of the robot 10 are both zero, i.e. the pose of the robot 10 has been adjusted to the preset pose (x, 0, 0), the robot 10 can simply advance or retreat straight to the charging pile 20 to achieve accurate pile alignment. Adjusting the pose of the robot 10 therefore means driving the Y-axis coordinate error and the direction angle error of the robot 10 toward zero. In this way, the robot 10 is determined to meet the pose-adjustment requirement only when the coordinate value error and the direction angle error are both smaller than the corresponding set values.
In one example, the first set value may lie in the range (0, 10 cm] and the second set value in the range (0, 15°]. For example, the first set value may be 1 cm, 3 cm or 5 cm, and the second set value may be 1°, 3° or 5°. Preferably, the first set value and the second set value may converge toward zero.
Of course, in other embodiments, the first set value and the second set value can be set according to the situation; robots 10 of different sizes, types and functions may adopt first and second set values of different magnitudes, which are not specifically limited herein.
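For illustration, the judgment of steps S211/S212 might look like the following sketch; the concrete thresholds of 5 cm and 5° are example values chosen here, not values prescribed by the patent:

```python
import math

# Illustrative thresholds only (any values within the ranges above could be used).
FIRST_SET_VALUE = 0.05                  # coordinate-value threshold, metres
SECOND_SET_VALUE = math.radians(5.0)    # direction-angle threshold, radians

def pose_error_small_enough(y_error, theta_error):
    # Step S211: both errors below their set values -> pose adjusted.
    # Step S212: otherwise -> keep adjusting.
    return abs(y_error) < FIRST_SET_VALUE and abs(theta_error) < SECOND_SET_VALUE
```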
Referring to fig. 8, in some embodiments, step S22 includes:
step S221, when the robot 10 retreats to reach the retreating termination position 40 and the error between the current pose and the preset pose is not less than a set value, controlling the robot 10 to advance and adjusting the pose of the robot 10 in the advancing process, wherein the distance from the retreating termination position 40 to the charging pile 20 is less than the distance from the preset position 30 to the charging pile 20;
step S222, controlling the robot 10 to retreat and adjusting the pose of the robot 10 in the process of retreating under the condition that the robot 10 advances to reach the preset position 30 and the error between the current pose and the preset pose is not less than a set value; and
and step S223, repeating the above steps until the pose of the robot 10 is adjusted to the preset pose.
Specifically, step S221, step S222, and step S223 may be implemented by the second control unit 122. That is, the second control unit 122 may be configured to control the robot 10 to advance and adjust the pose of the robot 10 during the advance in a case where the robot 10 retreats to reach the retreat end position 40 and an error of the current pose from the preset pose is not less than a set value, and configured to control the robot 10 to retreat and adjust the pose of the robot 10 during the retreat in a case where the robot 10 advances to reach the predetermined position 30 and an error of the current pose from the preset pose is not less than a set value, and configured to repeat the above steps until the pose of the robot 10 is adjusted to the preset pose.
Therefore, through the cyclic adjustment of retreating, advancing again and retreating again within the range between the retreat end position 40 and the predetermined position 30, the robot 10 can bring its pose to the preset pose without needing a large adjustment distance in front of the charging pile 20, which effectively saves space while still ensuring sufficient pile-alignment accuracy, thereby reducing the requirements of the robot 10 on the site. While the robot 10 advances or retreats, whether the predetermined position 30 or the retreat end position 40 has been reached can be determined from the X-axis coordinate value of the robot 10.
In one example, the retreat end position 40 is (x2, y), i.e. the retreat end position 40 is located at a distance x2 in front of the charging pile 20. In the embodiment of the present application, after the robot 10 navigates to the predetermined position 30, the pose of the robot 10 is adjusted by retreating, and the distance x2 from the retreat end position 40 to the charging pile 20 is smaller than the distance x1 from the predetermined position 30 to the charging pile 20. Here, x2 may be flexibly configured according to the parameters (e.g., shape, size, speed control, etc.) of the robot 10. In one example, x2 may take a value in [10 cm, 30 cm]. For example, x2 may be 10 cm, 20 cm or 30 cm. For a typical small robot performing pile alignment, the X-axis coordinate value x2 of the retreat end position may be 10 cm.
In one example, the distance |x1 - x2| between the predetermined position 30 and the retreat end position 40 may be larger than the dimension of the robot 10 along its direction of straight-line motion. This makes it easier to continuously adjust the body posture and angle through front-wheel steering while the robot 10 travels between the predetermined position 30 and the retreat end position 40.
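A minimal sketch of this bounded back-and-forth adjustment is given below; get_pose_in_pile_frame, drive and pose_error_small_enough are hypothetical helpers assumed for the example, and the x1/x2 values are illustrative:

```python
def coarse_adjust(robot, x1=0.4, x2=0.1):
    direction = -1                               # -1: retreat toward the pile, +1: advance away from it
    while True:
        x, y, theta = robot.get_pose_in_pile_frame()
        if pose_error_small_enough(y, theta):
            return                               # preset pose reached (step S23)
        if direction < 0 and x <= x2:
            direction = +1                       # step S221: reached the retreat end position, advance
        elif direction > 0 and x >= x1:
            direction = -1                       # step S222: reached the predetermined position, retreat
        robot.drive(direction)                   # keep adjusting the pose while moving (step S223 repeats)
```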
In other embodiments, after the robot 10 navigates to the predetermined position 30, the pose of the robot 10 may instead be adjusted by advancing first; in this case, the robot 10 may advance at most to an advance end position, and the distance from the advance end position to the charging pile 20 is greater than the distance x1 from the predetermined position 30 to the charging pile 20.
Referring to fig. 9, in some embodiments, step S2 includes:
step S24, acquiring the current pose of the robot 10 in real time, and calculating the current error between the current pose and the preset pose; and
in step S25, a control command is generated to control the robot 10 to move forward and/or backward according to the current error, the reference speed of the robot 10, and the control coefficient.
Specifically, the pose adjustment module 12 includes a calculation unit 124 and a processing unit 125, and step S24 may be implemented by the calculation unit 124 and step S25 may be implemented by the processing unit 125. That is, the calculation unit 124 may be configured to acquire the current pose of the robot 10 in real time and calculate the current error between the current pose and the preset pose. The processing unit 125 may be configured to generate control instructions to control the robot 10 to move forward and/or backward based on the current error, the reference speed of the robot 10, and the control coefficients.
Therefore, the robot 10 generates a corresponding control instruction by combining the performance parameters of the robot 10 through error calculation, so as to control the robot 10, and the pose adjustment of the robot 10 can achieve a better effect.
In one example, the reference speed is given in advance. The reference speed depends on the type of the robot, the space it occupies and its steering mechanism; for example, the reference speed of a typical small robot may be Vr = 0.1 m/s.
In certain embodiments, the control instructions include linear velocity instructions and angular velocity instructions.
Since the Ackermann-type robot 10 cannot turn in place, the coordinate values and the direction angle of the robot 10 change simultaneously while the robot 10 is steering; therefore, the linear velocity and the angular velocity need to be controlled jointly to adjust the pose of the robot 10 accurately.
In one example, step S25 may calculate the current error between the current pose and the preset pose by the following expressions:
ey = -y * cosθ;
eθ = -θ;
where ey is the current error between the Y-axis coordinate value of the current pose of the robot 10 and that of the preset pose, and eθ is the current error between the direction angle of the current pose of the robot 10 and that of the preset pose.
Step S25 may then generate the control quantities by the following expressions:
v = Vr * cos(eθ);
w = ky * Vr * ey + kθ * Vr * sin(eθ);
where v is the linear velocity for pose adjustment at the current pose of the robot 10, and w is the angular velocity for pose adjustment at the current pose of the robot 10. ky and kθ are control coefficients and are both real numbers greater than zero. The values of ky and kθ can be tuned according to the parameters and performance of the robot 10, so that the generated control quantities achieve a better effect.
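The control law above can be written out directly; the sketch below assumes y and θ are the robot's current Y coordinate and direction angle in the pile-centered frame, and the gain values k_y, k_theta and the reference speed are illustrative choices only:

```python
import math

def control_command(y, theta, v_ref=0.1, k_y=1.0, k_theta=2.0):
    # Current errors with respect to the preset pose (x, 0, 0).
    e_y = -y * math.cos(theta)
    e_theta = -theta
    # Linear-velocity and angular-velocity commands per the expressions above.
    v = v_ref * math.cos(e_theta)
    w = k_y * v_ref * e_y + k_theta * v_ref * math.sin(e_theta)
    return v, w
```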
Referring to fig. 10, in some embodiments, step S3 includes:
step S32, determining the pile-alignment speed of the robot 10 and controlling the robot 10 to retreat to align with the pile;
step S34, acquiring the current pose of the robot 10 in real time, and judging whether the robot 10 successfully aligns the pile according to the current pose; and
in step S36, when the robot 10 succeeds in pile driving, the robot 10 is controlled to charge.
Specifically, the pair pile control module 13 includes a third control unit 132, a pair pile determination unit 134, and a charging unit 136, and step S32 may be implemented by the third control unit 132, step S34 may be implemented by the pair pile determination unit 134, and step S36 may be implemented by the charging unit 136. That is, the third control unit 132 may be configured to determine the pile-alignment speed of the robot 10 and control the robot 10 to retreat for pile alignment. The pair pile determination unit 134 may be configured to acquire the current pose of the robot 10 in real time and judge, according to the current pose, whether the robot 10 has successfully aligned with the pile. The charging unit 136 may be used to control the robot 10 to charge once the robot 10 has successfully aligned with the pile.
Thus, after the pose of the robot 10 is adjusted, the robot can directly move to the charging pile 20 along a straight line by retreating so as to charge the pile.
Further, referring to fig. 11, in step S34, the pair pile determination unit 134 may judge whether the robot 10 has aligned with the pile according to the X-axis coordinate value in the current pose of the robot 10. Specifically, when the X-axis coordinate value x in the current pose of the robot 10 is less than or equal to 0, it is determined that the robot 10 has successfully aligned with the pile. At this time, the charging unit 136 controls the robot 10 to enter a charging mode.
Accordingly, the value of the pile-alignment speed depends on the type of the robot 10, the space it occupies and its steering mechanism. In one example, the pile-alignment speed of a small robot may be V0 = 0.1 m/s.
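A minimal sketch of steps S32, S34 and S36 under the X-coordinate criterion above; the helper method names and the loop structure are assumptions for illustration:

```python
def dock_and_charge(robot, v0=0.1):
    while True:
        x, _, _ = robot.get_pose_in_pile_frame()
        if x <= 0.0:                   # step S34: pile alignment succeeded
            robot.start_charging()     # step S36: enter charging mode
            return
        robot.retreat(speed=v0)        # step S32: keep backing up toward the pile
```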
In the above embodiment, the robot 10 retreats straight toward the charging pile 20 for pile alignment after its pose has been adjusted. Of course, in other embodiments, in step S32, the pose of the robot 10 may also be adjusted in real time while retreating for pile alignment, so that the accurate-alignment requirement is met once the robot 10 reaches the charging pile 20.
For example, when the first set value used to judge whether the robot 10 has reached the preset pose is 10 cm, a certain deviation may still exist between the Y-axis coordinate value of the robot 10 after pose adjustment and the Y-axis coordinate value of the pose (0, 0, 0) for accurate pile alignment; the robot 10 may then adjust its Y-axis coordinate value again while retreating for pile alignment, so that the Y-axis offset converges toward zero by the time the robot 10 reaches the charging pile 20. Similarly, when the second set value is 15°, the robot 10 can adjust its direction angle again while retreating for pile alignment, so that the direction-angle offset converges toward zero by the time the robot 10 reaches the charging pile 20.
Referring to fig. 12, an electronic device 60 provided in an embodiment of the present disclosure includes a processor 62, a readable storage medium 64, and computer-executable instructions 642 stored on the readable storage medium 64 and executable on the processor 62, where the computer-executable instructions 642, when executed by the processor 62, cause the processor 62 to perform the robot control method according to any one of the embodiments.
In one example, the computer-executable instructions 642, when executed by the processor 62, cause the processor 62 to perform the steps of:
step S1, acquiring the position of the charging pile 20 and controlling the robot 10 to move to the preset position 30 opposite to the charging pile 20;
step S2, controlling the robot 10 to advance and/or retreat to adjust the pose of the robot 10 to a preset pose; and
in step S3, the robot 10 is controlled to move to the charging pile 20 for charging.
The electronic device 60 of the embodiments of the application executes the computer-executable instructions 642 through the processor 62 and adopts coarse-adjustment logic: the robot 10 is controlled to navigate to the predetermined position 30 and then retreats, advances again and retreats again, gaining a longer actual travel distance over which the pose error can converge while the pose of the robot 10 is adjusted. The robot 10 therefore does not need a large adjustment distance in front of the charging pile 20, which effectively saves space while still ensuring sufficient pile-alignment accuracy, thereby reducing the requirements of the robot 10 on the site.
Embodiments of the present application also provide a non-transitory computer-readable storage medium 64, where the readable storage medium 64 includes computer-executable instructions 642, and when the computer-executable instructions 642 are executed by one or more processors 62, the processor 62 is caused to execute the robot control method of any one of the above embodiments.
Referring to fig. 13, one or more processors 62 may be coupled to a readable storage medium 64 through a bus, and the readable storage medium 64 stores computer-executable instructions 642, which are processed by the processors 62 to perform the robot control method according to the embodiment of the present disclosure, so that the robot 10 automatically charges the pile. The electronic device 60 may also be connected to a network via a communication module to enable communication with the server 60 and/or the terminal device 50, and connected to an input/output device via an input/output interface to collect environmental information or output control status signals.
In the description herein, reference to the term "one embodiment," "some embodiments," or "an example" etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist physically on its own, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If implemented in the form of a software functional module and sold or used as a stand-alone product, the integrated module may also be stored in a computer-readable storage medium. The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (10)

1. A robot control method, characterized by comprising:
acquiring the position of a charging pile and controlling the robot to move to a preset position directly facing the charging pile;
controlling the robot to advance and/or retreat to adjust the pose of the robot to a preset pose; and
controlling the robot to move to the charging pile for charging.
2. The robot control method of claim 1, wherein the step of acquiring the position of the charging pile and controlling the robot to move to the preset position directly facing the charging pile comprises:
establishing a coordinate system based on the center of the charging pile and determining the pose of the robot; and
controlling the robot to move to the preset position according to the pose of the robot.
3. The robot control method according to claim 1, wherein the step of controlling the robot to advance and/or retreat to adjust the pose of the robot to a preset pose comprises:
acquiring the current pose of the robot in real time, and judging whether the error between the current pose and the preset pose is smaller than a set value;
under the condition that the error between the current pose and the preset pose is not smaller than the set value, controlling the robot to retreat and adjusting the pose of the robot in the process of retreating; and
determining that the pose of the robot has been adjusted to the preset pose under the condition that the error between the current pose and the preset pose is smaller than the set value.
4. The robot control method according to claim 3, wherein the pose of the robot includes a coordinate value and a direction angle, the set value includes a first set value and a second set value, and the step of acquiring the current pose of the robot in real time and judging whether the error between the current pose and the preset pose is smaller than the set value comprises:
under the condition that the error between the coordinate values of the current pose and the preset pose is smaller than the first set value and the error between the direction angles of the current pose and the preset pose is smaller than the second set value, judging that the error between the current pose and the preset pose is smaller than the set value; or
under the condition that the error between the coordinate values of the current pose and the preset pose is not smaller than the first set value and/or the error between the direction angles of the current pose and the preset pose is not smaller than the second set value, judging that the error between the current pose and the preset pose is not smaller than the set value.
5. The robot control method according to claim 3, wherein the step of controlling the robot to retreat and adjusting the pose of the robot during the retreat comprises:
under the condition that the robot retreats to reach a retreat end position and the error between the current pose and the preset pose is not smaller than the set value, controlling the robot to advance and adjusting the pose of the robot in the process of advancing, wherein the distance from the retreat end position to the charging pile is smaller than the distance from the preset position to the charging pile;
under the condition that the robot advances to reach the preset position and the error between the current pose and the preset pose is not smaller than the set value, controlling the robot to retreat and adjusting the pose of the robot in the process of retreating; and
repeating the above steps until the pose of the robot is adjusted to the preset pose.
6. The robot control method according to claim 1, wherein the step of controlling the robot to advance and/or retreat to adjust the pose of the robot to a preset pose comprises:
acquiring the current pose of the robot in real time, and calculating the current error between the current pose and the preset pose; and
generating a control instruction according to the current error, the reference speed of the robot, and a control coefficient, so as to control the robot to advance and/or retreat.
7. The robot control method according to claim 6, wherein the control command includes a linear velocity command and an angular velocity command.
8. A robot, characterized in that the robot comprises:
an acquisition module for acquiring the position of a charging pile and controlling the robot to move to a preset position directly facing the charging pile;
a pose adjustment module for controlling the robot to advance and/or retreat to adjust the pose of the robot to a preset pose; and
a pile control module for controlling the robot to move to the charging pile for charging.
9. An electronic device comprising a processor, a readable storage medium, and computer-executable instructions stored on the readable storage medium and executable on the processor, the computer-executable instructions, when executed by the processor, causing the processor to perform the robot control method of any of claims 1-7.
10. A non-transitory computer-readable storage medium, comprising computer-executable instructions that, when executed by one or more processors, cause the processors to perform the robot control method of any one of claims 1-7.
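
Illustrative sketch (not part of the claims): the pose-adjustment and docking procedure recited in claims 1 and 3 to 7 can be outlined in Python. Everything below is an assumption introduced only for clarity — the numerical thresholds, the control coefficients, the proportional form of the control law, and the robot interface (pose(), at(), send_velocity(), stop()) are hypothetical and are not taken from the disclosure. The sketch only shows the flow: the two-part error test of claim 4, the back-and-forth adjustment between the retreat end position and the preset position of claims 3 and 5, and the generation of linear- and angular-velocity commands from the current error, a reference speed, and a control coefficient as in claims 6 and 7.

import math

# Illustrative thresholds and gains; the patent does not specify numerical values.
FIRST_SET_VALUE = 0.01    # coordinate-value error threshold (assumed, e.g. metres)
SECOND_SET_VALUE = 0.02   # direction-angle error threshold (assumed, e.g. radians)

def pose_error(current, preset):
    """Split the pose error into a coordinate error and a wrapped direction-angle error.
    A pose is taken as a tuple (x, y, theta)."""
    coord_err = math.hypot(current[0] - preset[0], current[1] - preset[1])
    angle_err = math.atan2(math.sin(preset[2] - current[2]),
                           math.cos(preset[2] - current[2]))
    return coord_err, angle_err

def within_set_value(current, preset):
    """Claim 4: the error is 'smaller than the set value' only when both the
    coordinate error and the direction-angle error are below their thresholds."""
    coord_err, angle_err = pose_error(current, preset)
    return coord_err < FIRST_SET_VALUE and abs(angle_err) < SECOND_SET_VALUE

def velocity_command(current, preset, reference_speed, k_lin, k_ang, direction):
    """Claims 6 and 7: build a linear-velocity command and an angular-velocity
    command from the current error, a reference speed, and control coefficients.
    A simple proportional law is assumed here."""
    coord_err, angle_err = pose_error(current, preset)
    linear = direction * min(reference_speed, k_lin * coord_err)
    angular = k_ang * angle_err
    return linear, angular

def adjust_pose(robot, preset_pose, retreat_end, preset_position):
    """Claims 3 and 5: shuttle between the retreat end position (closer to the
    charging pile) and the preset position, adjusting the pose while moving,
    until the pose error falls below the set value.
    'robot' is a hypothetical interface with pose(), at(), send_velocity(), stop()."""
    direction = -1  # start by retreating towards the charging pile
    while not within_set_value(robot.pose(), preset_pose):
        v, w = velocity_command(robot.pose(), preset_pose,
                                reference_speed=0.1, k_lin=1.0, k_ang=1.5,
                                direction=direction)
        robot.send_velocity(v, w)
        if direction < 0 and robot.at(retreat_end):
            direction = +1      # reached the retreat end position: advance
        elif direction > 0 and robot.at(preset_position):
            direction = -1      # reached the preset position: retreat again
    robot.stop()                # pose adjusted to the preset pose; ready to dock

Once adjust_pose returns, the robot would move straight onto the charging pile to charge, corresponding to the final step of claim 1.
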
CN202010254980.3A 2020-04-02 2020-04-02 Robot control method, robot, electronic device, and readable storage medium Active CN111474928B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010254980.3A CN111474928B (en) 2020-04-02 2020-04-02 Robot control method, robot, electronic device, and readable storage medium

Publications (2)

Publication Number Publication Date
CN111474928A true CN111474928A (en) 2020-07-31
CN111474928B CN111474928B (en) 2023-08-01

Family

ID=71750106

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010254980.3A Active CN111474928B (en) 2020-04-02 2020-04-02 Robot control method, robot, electronic device, and readable storage medium

Country Status (1)

Country Link
CN (1) CN111474928B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100809352B1 (en) * 2006-11-16 2008-03-05 삼성전자주식회사 Method and apparatus of pose estimation in a mobile robot based on particle filter
CN105375574A (en) * 2015-12-01 2016-03-02 纳恩博(北京)科技有限公司 Charging system and charging method
CN106406316A (en) * 2016-10-26 2017-02-15 山东大学 Autonomous charging system and charging method thereof for intelligent home accompanying robot
CN109974727A (en) * 2017-12-28 2019-07-05 深圳市优必选科技有限公司 A kind of robot charging method, device and robot
CN109648602A (en) * 2018-09-11 2019-04-19 深圳优地科技有限公司 Automatic recharging method, device and terminal device
JP6631823B1 (en) * 2018-09-25 2020-01-15 ネクストヴイピーユー(シャンハイ)カンパニー リミテッドNextvpu(Shanghai)Co.,Ltd. Robot and its automatic docking and charging method, system, electronic device, and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
周祖德 (ZHOU Zude) et al., Huazhong University of Science and Technology Press *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112346453A (en) * 2020-10-14 2021-02-09 深圳市杉川机器人有限公司 Automatic robot recharging method and device, robot and storage medium
CN112388638A (en) * 2020-11-12 2021-02-23 沈阳建筑大学 Construction robot control method
CN112684813A (en) * 2020-11-23 2021-04-20 深圳拓邦股份有限公司 Docking method and device for robot and charging pile, robot and readable storage medium
CN112684813B (en) * 2020-11-23 2024-04-02 深圳拓邦股份有限公司 Butt joint method and device of robot and charging pile, robot and readable storage medium
CN115373375A (en) * 2021-05-18 2022-11-22 未岚大陆(北京)科技有限公司 Method and device for returning robot to charging pile, robot and storage medium
CN115373375B (en) * 2021-05-18 2023-08-18 未岚大陆(北京)科技有限公司 Method and device for returning charging pile of robot, robot and storage medium
CN113359777A (en) * 2021-07-20 2021-09-07 山东新一代信息产业技术研究院有限公司 Automatic pile returning and charging method and system for robot
CN113703456A (en) * 2021-08-30 2021-11-26 山东新一代信息产业技术研究院有限公司 Automatic pile returning and charging method of robot based on multiple sensors
CN114296467A (en) * 2021-12-31 2022-04-08 福建汉特云智能科技有限公司 Method for automatically finding and aligning piles for robot charging pile
CN114296467B (en) * 2021-12-31 2023-06-06 福建汉特云智能科技有限公司 Automatic pile finding and pile aligning method for robot charging pile

Also Published As

Publication number Publication date
CN111474928B (en) 2023-08-01

Similar Documents

Publication Publication Date Title
CN111474928A (en) Robot control method, robot, electronic device, and readable storage medium
CN106956270B (en) Six-degree-of-freedom mechanical arm for automatic charging pile of electric automobile and control method of six-degree-of-freedom mechanical arm
RU2720138C2 (en) Method of automatic supply to loading-unloading platform for use in large-capacity trucks
CN109974727B (en) Robot charging method and device and robot
US10444764B2 (en) Self-position estimating apparatus and self-position estimating method
EP2437130B1 (en) Control apparatus for autonomous operating vehicle
CN108575095B (en) Self-moving equipment and positioning system, positioning method and control method thereof
JP2017088112A (en) Steering control device for vehicle
CN104238557A (en) Automated Guided Vehicle And Method Of Operating An Automated Guided Vehicle
JP2017049933A (en) Autonomous travelling vehicle system
CN111026102A (en) Mobile robot autonomous recharging method and system based on upper and lower computer collaborative planning
KR101662081B1 (en) A vacuum cleaner
CN111830984B (en) Multi-machine cooperative car washing system and method based on unmanned car washing equipment
CN112684813B (en) Butt joint method and device of robot and charging pile, robot and readable storage medium
US20210214008A1 (en) Transverse steering method and transverse steering device for moving a vehicle into a target position, and vehicle for this purpose
CN112285739B (en) Data processing method, device, equipment and storage medium
JP7369626B2 (en) Vehicle control system, vehicle control method and program
CN111090284A (en) Method for returning from traveling equipment to base station and self-traveling equipment
US20230333568A1 (en) Transport vehicle system, transport vehicle, and control method
US20190302783A1 (en) System and method for autonomous work vehicle operations
CN110837257B (en) AGV composite positioning navigation system based on iGPS and vision
CN115993089B (en) PL-ICP-based online four-steering-wheel AGV internal and external parameter calibration method
KR20210046501A (en) unmanned mowing robot and automatic driving method thereof
JP2021047724A (en) Work system, autonomous work machine, and control method and program of autonomous work machine
CN114625113A (en) Automatic calibration method, control system and storage medium for AGV steering wheel

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant