CN116852345A - Robot, motion control method and device thereof, electronic equipment and storage medium - Google Patents


Publication number
CN116852345A
Authority
CN
China
Prior art keywords
robot
target object
charging pile
charging
controlling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210316335.9A
Other languages
Chinese (zh)
Inventor
李东方
Current Assignee
Beijing Xiaomi Robot Technology Co ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202210316335.9A priority Critical patent/CN116852345A/en
Publication of CN116852345A publication Critical patent/CN116852345A/en
Pending legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by motion, path, trajectory planning
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/021 Optical sensing devices
    • B25J19/023 Optical sensing devices including video camera means

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present disclosure relates to a robot and to a motion control method, apparatus, electronic device, and storage medium therefor. The method includes: acquiring communication data between a UWB base station end of a robot and a UWB tag end of a target object; determining position information of the target object according to the communication data; and controlling the robot to move into a preset range around the target object according to the position information of the target object.

Description

Robot, motion control method and device thereof, electronic equipment and storage medium
Technical Field
The disclosure relates to the technical field of robot control, and in particular relates to a robot, a motion control method and device thereof, electronic equipment and a storage medium.
Background
In recent years, robot technology has developed rapidly, and robots can realize more and more functions, performing work in various fields in place of human beings and freeing them from heavy labor. Motion control is an important part of robot control technology: it means controlling a robot to move to a destination, for example controlling the robot to move to the position of a charging pile for charging. In the related art, motion control often requires the robot to construct a navigation map of its movement area, which makes the motion control inconvenient and inefficient.
Disclosure of Invention
To overcome the problems in the related art, embodiments of the present disclosure provide a robot and a motion control method, motion control apparatus, electronic device, and storage medium therefor, which address the drawbacks of the related art.
According to a first aspect of an embodiment of the present disclosure, there is provided a motion control method of a robot, including:
acquiring communication data between a UWB base station end of a robot and a UWB tag end of a target object;
determining the position information of the target object according to the communication data;
and controlling the robot to move to a preset range around the target object according to the position information of the target object.
In one embodiment, the UWB base station end of the robot includes at least one UWB antenna array.
In one embodiment, the determining the location information of the target object according to the communication data includes:
determining the distance between the target object and the robot according to the communication data;
determining an angle between a connecting line of the target object and the robot and a preset direction according to the communication data;
and determining the coordinate position of the target object in the coordinate system of the robot according to the distance and the angle.
In one embodiment, the robot includes a legged robot, and the target object includes an asymmetric charging pile;
after controlling the robot to move within a preset range around the target object according to the position information of the target object, the method further comprises:
acquiring an image to be detected acquired by a camera of the robot, wherein the charging pile exists in the image to be detected;
performing identification processing on the image to be detected, and determining a direction identifier on the charging pile in the image to be detected;
and determining azimuth information of the robot relative to the charging pile according to the direction identifier, and controlling the robot to move to a preset azimuth of the charging pile according to the azimuth information.
In one embodiment, at least one side surface of the charging pile is provided with a corresponding direction identifier;
the controlling the robot to move to the preset azimuth of the charging pile according to the azimuth information comprises the following steps:
controlling the robot to remain stationary in place when the azimuth information indicates that the robot is at the preset azimuth of the charging pile;
and controlling the robot to move to the preset azimuth of the charging pile when the azimuth information indicates that the robot is not at the preset azimuth of the charging pile.
In one embodiment, after the robot moves to the preset azimuth of the charging pile, the method further includes:
acquiring an image to be detected acquired by a camera of the robot, wherein a direction identifier of the charging pile exists in the image to be detected, and at least one positioning point is arranged in the direction identifier;
according to camera parameters of the camera, converting each positioning point in the image to be detected into a corresponding three-dimensional space coordinate point;
and controlling the robot to move to a preset charging position according to the coordinate position of at least one three-dimensional space coordinate point in the space coordinate system of the camera, so that the charging coil of the robot and the charging coil of the charging pile are in a matching position.
In one embodiment, the robot includes a legged robot, and the target object includes a symmetrical charging pile;
after controlling the robot to move within a preset range around the target object according to the position information of the target object, the method further comprises:
acquiring an image to be detected acquired by a camera of the robot, wherein the charging pile exists in the image to be detected;
performing identification processing on the image to be detected, and determining the position of a center mark on the charging pile in the image to be detected;
and controlling the robot to move to a position opposite to the center of the charging pile according to the position of the center mark on the charging pile, and controlling the robot to move forward to the charging pile so as to enable the charging coil of the robot and the charging coil of the charging pile to be in a pairing position.
In one embodiment, after the robot moves to the preset azimuth of the charging pile, the method further includes:
and controlling the robot to execute a preset charging action so that a charging coil of the robot and a charging coil of the charging pile are in a pairing position.
In one embodiment, the acquiring the communication data between the UWB base station end of the robot and the UWB tag end of the target object includes:
acquiring communication data between a UWB base station end of the robot and a UWB tag end of a target object under at least one of the following conditions:
the navigation map corresponding to the robot movement area cannot be obtained;
the position of the robot in the navigation map cannot be acquired;
the position of the target object within the navigation map cannot be acquired.
In one embodiment, the position information of the target object includes a coordinate position of the target object within a coordinate system of the robot;
the controlling the robot to move to a preset range around the target object according to the position information of the target object includes:
in a case where the position of one of the robot and the target object in the navigation map is successfully acquired and the position of the other in the navigation map cannot be acquired, determining the position of the other in the navigation map according to the acquired position and the position information of the target object;
and controlling the robot to move to a preset range around the target object according to the positions of the robot and the target object in the navigation map.
According to a second aspect of the embodiments of the present disclosure, there is provided a motion control apparatus of a robot, including:
the acquisition module is used for acquiring communication data between the UWB base station end of the robot and the UWB tag end of the target object;
the positioning module is used for determining the position information of the target object according to the communication data;
and the first movement module is used for controlling the robot to move to a preset range around the target object according to the position information of the target object.
In one embodiment, the UWB base station end of the robot includes at least one UWB antenna array.
In one embodiment, the positioning module is specifically configured to:
determining the distance between the target object and the robot according to the communication data;
determining an angle between a connecting line of the target object and the robot and a preset direction according to the communication data;
and determining the coordinate position of the target object in the coordinate system of the robot according to the distance and the angle.
In one embodiment, the robot includes a legged robot, and the target object includes an asymmetric charging pile;
the device also comprises a second motion module for:
after the robot is controlled to move to a preset range around the target object according to the position information of the target object, acquiring an image to be detected, which is acquired by a camera of the robot, wherein the charging pile exists in the image to be detected;
performing identification processing on the image to be detected, and determining a direction identifier on the charging pile in the image to be detected;
and determining azimuth information of the robot relative to the charging pile according to the direction identifier, and controlling the robot to move to a preset azimuth of the charging pile according to the azimuth information.
In one embodiment, at least one side surface of the charging pile is provided with a corresponding direction identifier;
the second movement module is used for controlling the robot to move to the preset azimuth of the charging pile according to the azimuth information, and is specifically used for:
controlling the robot to remain stationary in place when the azimuth information indicates that the robot is at the preset azimuth of the charging pile;
and controlling the robot to move to the preset azimuth of the charging pile when the azimuth information indicates that the robot is not at the preset azimuth of the charging pile.
In one embodiment, the apparatus further comprises a third motion module for:
after the robot moves to the preset azimuth of the charging pile, acquiring an image to be detected, which is acquired by a camera of the robot, wherein a direction identifier of the charging pile is present in the image to be detected, and at least one positioning point is arranged in the direction identifier;
according to camera parameters of the camera, converting each positioning point in the image to be detected into a corresponding three-dimensional space coordinate point;
and controlling the robot to move to a preset charging position according to the coordinate position of at least one three-dimensional space coordinate point in the space coordinate system of the camera, so that the charging coil of the robot and the charging coil of the charging pile are in a matching position.
In one embodiment, the robot includes a legged robot, and the target object includes a symmetrical charging pile;
the device further comprises a fourth motion module for:
after the robot is controlled to move to a preset range around the target object according to the position information of the target object, acquiring an image to be detected, which is acquired by a camera of the robot, wherein the charging pile exists in the image to be detected;
performing identification processing on the image to be detected, and determining the position of a center mark on the charging pile in the image to be detected;
and controlling the robot to move to a position opposite to the center of the charging pile according to the position of the center mark on the charging pile, and controlling the robot to move forward to the charging pile so as to enable the charging coil of the robot and the charging coil of the charging pile to be in a pairing position.
In one embodiment, the apparatus further includes a charging action module configured to:
and after the robot moves to the preset direction of the charging pile, controlling the robot to execute a preset charging action so as to enable the charging coil of the robot and the charging coil of the charging pile to be in a pairing position.
In one embodiment, the obtaining module is specifically configured to:
acquiring communication data between a UWB base station end of the robot and a UWB tag end of a target object under at least one of the following conditions:
the navigation map corresponding to the robot movement area cannot be obtained;
the position of the robot in the navigation map cannot be acquired;
the position of the target object within the navigation map cannot be acquired.
In one embodiment, the position information of the target object includes a coordinate position of the target object within a coordinate system of the robot;
the first motion module is specifically configured to:
in a case where the position of one of the robot and the target object in the navigation map is successfully acquired and the position of the other in the navigation map cannot be acquired, determining the position of the other in the navigation map according to the acquired position and the position information of the target object;
and controlling the robot to move into a preset range around the target object according to the positions of the robot and the target object in the navigation map.
According to a third aspect of the embodiments of the present disclosure, there is provided a robot configured to perform the motion control method of the robot according to the first aspect.
According to a fourth aspect of the embodiments of the present disclosure, there is provided an electronic device including a memory and a processor, the memory being used for storing computer instructions executable on the processor, and the processor being used for executing the computer instructions to perform the motion control method of the robot according to the first aspect.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of the first aspect.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects:
according to the method, the communication data between the UWB base station end of the robot and the UWB tag end of the target object are obtained, so that the position information of the target object can be determined according to the communication data, and finally the robot is controlled to move to a preset range around the target object according to the position information of the target object. The UWB communication data exist between the robot and the target object, so that the position relation between the robot and the target object can be accurately determined, the position information of the target object can be determined by taking the robot as a reference object, the robot is controlled to move by taking the position information as a destination, and a navigation map in a moving area is not needed in the control process, so that the convenience and the efficiency of the movement control can be improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a flow chart of a method of controlling motion of a robot shown in an exemplary embodiment of the present disclosure;
fig. 2 is a flowchart of a motion control method of a robot shown in another exemplary embodiment of the present disclosure;
fig. 3 is a flowchart of a motion control method of a robot shown in yet another exemplary embodiment of the present disclosure;
fig. 4 is a schematic structural view of a motion control apparatus of a robot shown in an exemplary embodiment of the present disclosure;
fig. 5 is a block diagram of an electronic device shown in an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings denote the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the present disclosure as recited in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in this disclosure to describe various information, the information should not be limited by these terms. These terms are only used to distinguish information of the same type from one another. For example, first information may also be referred to as second information and, similarly, second information may be referred to as first information, without departing from the scope of the present disclosure. Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", or "in response to determining".
In a first aspect, at least one embodiment of the present disclosure provides a motion control method of a robot. Referring to fig. 1, which illustrates the flow of the method, the method includes steps S101 to S103.
The method can be applied to a robot, and in particular can be executed by a processor of the robot. The robot may be a wheeled robot (such as a sweeping robot), a tracked robot, or a legged robot (such as a robot dog).
The method can be applied to a scene that the robot moves to a specified destination, for example, the robot moves to a charging pile for charging.
In step S101, communication data between the UWB base station end of the robot and the UWB tag end of the target object is acquired.
The target object may be a charging stake or the like.
UWB (Ultra-Wideband) is a wireless carrier communication technology that transmits data using non-sinusoidal narrow pulses on the nanosecond scale, and therefore occupies a wide frequency spectrum. UWB has the advantages of low system complexity, low power spectral density of the transmitted signal, insensitivity to channel fading, low probability of interception, and high positioning accuracy, and is particularly suitable for high-speed wireless access indoors and in other dense multipath environments.
And communication data interaction can be carried out between the UWB base station end and the UWB tag end. The UWB base station may include at least one UWB antenna array, for example, an antenna array is disposed on each of the front, rear, left, and right sides of the robot, so that the robot can communicate with the UWB tag at any angle of 360 ° and further perform positioning. The UWB tag may include at least one UWB antenna.
When the robot and the target object are both powered on and the robot is within the signal range of the UWB tag end, the UWB base station end of the robot can connect to and communicate with the UWB tag end of the target object.
It can be understood that in this step, communication data between the UWB base station end of the robot and the UWB tag end of the target object may be obtained in real time, or communication data between the UWB base station end of the robot and the UWB tag end of the target object may be obtained according to a certain frequency.
In step S102, position information of the target object is determined from the communication data.
Each message in the communication data carries a timestamp. The distance between the target object and the robot can first be determined from the communication data, for example by completing the ranging with a time-difference-of-arrival algorithm from UWB ranging technology. The angle between the line connecting the target object and the robot and a preset direction can then be determined from the communication data, by comparing the information received and transmitted by the different antennas of the antenna array, for example completing the angle measurement with a PDoA or TDoA algorithm from UWB angle measurement technology. Finally, the coordinate position of the target object in the robot's coordinate system is determined from the distance and the angle; the robot's coordinate system may take the robot's position as the origin, with the X axis and Y axis along two preset directions relative to the robot.
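As an illustration of the computation described above (a sketch, not the patent's implementation), the following converts a two-way-ranging time measurement and a phase-difference-of-arrival measurement into a coordinate in the robot's frame; the function names, the reply-delay parameter, and the two-antenna PDoA formula are assumptions:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def distance_from_timestamps(t_round: float, t_reply: float) -> float:
    """Two-way ranging: the one-way time of flight is half of the
    round-trip time minus the tag's reply delay (both from timestamps)."""
    tof = (t_round - t_reply) / 2.0
    return tof * C

def angle_from_pdoa(phase_diff: float, wavelength: float,
                    antenna_spacing: float) -> float:
    """Phase-difference-of-arrival between two antennas of the array,
    converted into an arrival angle in radians."""
    path_diff = phase_diff * wavelength / (2.0 * math.pi)
    # clamp to [-1, 1] to guard against measurement noise
    return math.asin(max(-1.0, min(1.0, path_diff / antenna_spacing)))

def tag_position_in_robot_frame(distance: float, angle: float):
    """Polar (distance, angle) -> Cartesian (x, y) in the robot's own
    coordinate system: origin at the robot, X axis along the preset direction."""
    return distance * math.cos(angle), distance * math.sin(angle)
```

For example, a tag 2 m away along the preset direction maps to the robot-frame coordinate (2.0, 0.0).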
It can be understood that, in this step, the position information of the target object may be determined in real time, or the position information of the target object may be determined according to a certain frequency.
In step S103, the robot is controlled to move within a preset range around the target object according to the position information of the target object.
Optionally, taking the position information of the target object as the destination, a SLAM (Simultaneous Localization and Mapping) algorithm may be used to control the robot to move into the preset range around the target object. That is, a map of the local area around the robot may be constructed in real time from images acquired by the robot's camera, and the robot may be controlled to move toward the destination according to the road conditions in that map: if the map indicates no obstacle in the direction of the destination, the robot moves directly toward it; if the map indicates an obstacle in that direction, the robot moves toward the destination after completing obstacle avoidance.
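The local decision rule described above can be sketched roughly as follows; the `is_blocked` occupancy query and the detour step size are illustrative stand-ins for the local-map lookup, which the patent does not specify:

```python
import math

def next_heading(robot_xy, dest_xy, is_blocked, detour_step=math.radians(15)):
    """Return a heading (radians) toward dest_xy. If the straight-line
    direction is blocked in the local map, try detour headings alternating
    left and right in detour_step increments."""
    dx, dy = dest_xy[0] - robot_xy[0], dest_xy[1] - robot_xy[1]
    heading = math.atan2(dy, dx)
    for i in range(12):
        for sign in (+1, -1):
            candidate = heading + sign * i * detour_step
            if not is_blocked(candidate):
                return candidate
    return heading  # no free direction found; the caller should stop and replan
```

With an empty map the direct heading is returned; when the direct heading is blocked, the nearest free detour heading is chosen.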
When the robot moves to a preset range around the target object, the robot can be further controlled to complete other operations. For example, when the target object is a charging pile of the robot, the robot may further move to a preset charging position of the charging pile for charging when moving to a preset range around the charging pile.
According to the above method, communication data between the UWB base station end of the robot and the UWB tag end of the target object is acquired, the position information of the target object is determined from the communication data, and the robot is then controlled to move into a preset range around the target object according to that position information. Because UWB communication takes place between the robot and the target object, the positional relationship between them can be determined accurately; the position information of the target object can thus be determined with the robot itself as the reference, and the robot can be controlled to move with that position as its destination. No navigation map of the movement area is needed in this control process, so the convenience and efficiency of motion control are improved, and the robot can also adapt to more complex motion environments.
In some embodiments of the present disclosure, the robot includes a legged robot and the target object includes an asymmetric charging pile, such as a riding-type charging pile (i.e., the legged robot needs to sit astride the charging pile to charge). Since an asymmetric charging pile requires the legged robot to charge from a specific orientation, after the robot is controlled to move into a preset range around the target object according to the position information of the target object, the robot may be controlled to move to a preset azimuth of the charging pile in the manner shown in fig. 2, which includes steps S201 to S203.
In step S201, an image to be detected acquired by a camera of the robot is acquired, where the charging pile exists in the image to be detected.
The robot has a camera, such as a monocular or binocular camera, and may have one camera or several. If the robot has one camera, the camera's acquisition direction may be set to the front of the robot; if the robot has several cameras, their acquisition directions may be set to cover several directions including the front of the robot.
The camera can acquire images in real time or at a certain frequency. Because the robot is within the preset range around the charging pile, the charging pile may appear in the acquired images. If the charging pile is not present in an image acquired by the camera, that image is not used as the image to be detected in this step; if the charging pile is present, the image is used as the image to be detected. It can be understood that if the charging pile is not present in the acquired images, the robot can be controlled to rotate in place, acquiring images during the rotation, and stop rotating once the target object appears in an acquired image.
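The rotate-and-look behavior described above can be sketched as a simple loop; `capture_image`, `detect_pile`, and `rotate_in_place` are hypothetical stand-ins for the robot's camera and motion primitives, which the patent does not name:

```python
def search_for_pile(capture_image, detect_pile, rotate_in_place,
                    step_deg=30, max_steps=12):
    """Rotate in place, capturing an image after each rotation step,
    until the charging pile appears in a captured image. Returns the
    frame containing the pile, or None after a full revolution."""
    for _ in range(max_steps):
        image = capture_image()
        if detect_pile(image):
            return image  # use this frame as the image to be detected
        rotate_in_place(step_deg)
    return None
```

The step size and step count (30 degrees, 12 steps for one full revolution) are illustrative choices.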
In step S202, the image to be detected is identified, and a direction identifier on the charging pile in the image to be detected is determined.
The charging pile comprises a charging pile body, and at least one side face of the charging pile body is provided with a corresponding direction identifier: for example, the left side face carries a left identifier, the right side face a right identifier, the front side face a front identifier, and the rear side face a rear identifier. A direction identifier can be a character, an arrow, a two-dimensional code, or the like, and is used to indicate the orientation of the side face of the charging pile it is on; for example, the left identifier indicates that the side face carrying it is the left side face of the charging pile.
A neural network model for recognizing the direction identifier may be trained in advance, and in this step the model is used to perform recognition on the image to be detected to obtain the direction identifier.
In step S203, the azimuth information of the robot and the charging pile is determined according to the direction identifier, and the robot is controlled to move to the preset azimuth of the charging pile according to the azimuth information.
The azimuth information of the robot and the charging pile indicates in which azimuth of the charging pile the robot is located. The direction identifiers on the charging pile correspond one-to-one to the possible azimuths of the robot relative to the charging pile, so the direction identifier visible on the charging pile in the image to be detected determines the robot's azimuth at the moment the image was acquired. For example, if the direction identifier on the charging pile in the image to be detected is the left identifier, the robot was on the left side of the charging pile when it acquired the image; that is, the azimuth information is that the robot is on the left side of the charging pile.
Specifically, the robot is controlled to remain motionless in place when the azimuth information characterizes that the robot is in the preset azimuth of the charging pile; when the azimuth information characterizes that the robot is not in the preset azimuth of the charging pile (that is, the robot is in another azimuth of the charging pile), the robot is controlled to move to the preset azimuth of the charging pile according to the relation between its current azimuth and the preset azimuth. For example, if the preset azimuth is the front: when the azimuth information indicates that the robot is in front of the charging pile, the robot is controlled to remain in place; when the azimuth information indicates that the robot is not in front of the charging pile, the robot is controlled to move to the front of the charging pile, e.g. if the robot is behind the charging pile, it needs to travel around the pile through 180° to reach the front.
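The decision described above can be sketched as follows, assuming the preset azimuth is the front; the arc angles for moving around the pile are illustrative (the disclosure only gives the 180° rear-to-front example):

```python
# Arc (degrees) to travel around the charging pile from each azimuth
# to the front; values other than 180 for "rear" are assumptions.
ARC_TO_FRONT = {"front": 0, "left": 90, "right": 90, "rear": 180}

def plan_move(detected_azimuth: str, preset: str = "front") -> dict:
    """Stay in place at the preset azimuth; otherwise return how far
    to travel around the charging pile to reach it."""
    if detected_azimuth == preset:
        return {"action": "stay_in_place", "arc_deg": 0}
    return {"action": "move_around_pile",
            "arc_deg": ARC_TO_FRONT[detected_azimuth]}
```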
In addition, after the robot moves to the preset azimuth of the charging pile, the robot can be controlled to dock onto the pile, that is, to enter a preset charging position, in the following manner: firstly, an image to be detected collected by a camera of the robot is acquired, wherein a direction identifier of the charging pile exists in the image to be detected and at least one positioning point is arranged in the direction identifier; then, according to camera parameters of the camera, each positioning point in the image to be detected is converted into a corresponding three-dimensional space coordinate point; finally, the robot is controlled to move to the preset charging position according to the coordinate position of the at least one three-dimensional space coordinate point in the space coordinate system of the camera, so that the charging coil of the robot and the charging coil of the charging pile are in a pairing position.
The positioning points in the direction identifier may be pixel points at specific positions in a two-dimensional code identifier, for example, its four corner points. For the preset charging position, the coordinates of each positioning point in the space coordinate system of the camera can be calibrated in advance; when the robot is controlled to move to the preset charging position, the three-dimensional space coordinate point obtained by converting each positioning point can then be aligned with the corresponding calibrated coordinates. When the robot reaches the preset charging position, the charging coil of the robot and the charging coil of the charging pile are in a pairing position; at this point the coils identify each other and interact through the charging protocol, so that the charging pile charges the robot. In addition, if the robot is further provided with a preset charging action, the robot can be controlled to execute the preset charging action after reaching the preset charging position so as to bring the two charging coils into the pairing position; for example, if the preset charging action of a robot dog is a lying-down action, the robot dog can be controlled to lie down after reaching the preset charging position, thereby completing the docking action and bringing its charging coil and that of the charging pile into the pairing position.
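The disclosure does not spell out how a positioning point is converted into a three-dimensional coordinate point. One minimal sketch, assuming the camera intrinsics (fx, fy, cx, cy) are known and the depth of each positioning point can be estimated (e.g. from the known physical size of the two-dimensional code), is pinhole back-projection; in practice the four corner points would more likely be fed to a full pose solver such as OpenCV's solvePnP:

```python
def pixel_to_camera_point(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) at a known depth (metres) into a
    three-dimensional point in the camera's space coordinate system,
    using the pinhole camera model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

A point at the principal point maps onto the camera's optical axis, which is a quick sanity check for the calibration values.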
In some embodiments of the present disclosure, the robot comprises a legged robot and the target object comprises a symmetrical charging pile, such as a disc-type charging pile. Since a legged robot can dock onto a disc-type charging pile from any direction, after the robot is controlled to move within the preset range around the target object according to the position information of the target object, the robot is controlled to move onto the charging pile (i.e. to dock) for charging in the following manner: firstly, an image to be detected collected by a camera of the robot is acquired, wherein the charging pile exists in the image to be detected; then, identification processing is performed on the image to be detected to determine the position of a center mark on the charging pile within the image; finally, the robot is controlled to move to a position directly facing the center of the charging pile according to the position of the center mark, and then to move forward onto the charging pile, so that the charging coil of the robot and the charging coil of the charging pile are in a pairing position.
The position of the center mark on the charging pile in the image to be detected may be, for example, its position along the width direction of the image; a pre-trained neural network model may be used to identify this position. On this basis, when the center mark lies at the center of the image in the width direction, the robot is directly facing the center of the charging pile, which makes it convenient to control the robot to move into that facing position.
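A minimal sketch of this centring logic, assuming the detector reports the centre mark's horizontal pixel position; the tolerance and command names are hypothetical:

```python
def centering_command(mark_u: float, image_width: int, tol: float = 5.0) -> str:
    """Choose a sideways step so the centre mark moves to the middle of
    the image width, i.e. the robot ends up facing the pile's centre."""
    offset = mark_u - image_width / 2
    if abs(offset) <= tol:
        return "aligned"  # directly facing the centre of the charging pile
    return "step_right" if offset > 0 else "step_left"
```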
The robot moves forward onto the charging pile from the position directly facing its center, so that each foot of the robot can land on the charging pile and the robot is conveniently positioned over the pile's center. The robot is controlled to stop moving after each foot has landed on the charging pile; at this point, the charging coil of the robot and the charging coil of the charging pile are in the pairing position, identify each other and interact through the charging protocol, and the charging pile then charges the robot. In addition, if the robot is further provided with a preset charging action, the robot can be controlled to execute the preset charging action after moving onto the charging pile so as to bring the two charging coils into the pairing position; for example, if the preset charging action of a robot dog is a lying-down action, the robot dog can be controlled to lie down after moving onto the charging pile, thereby completing the docking action and bringing its charging coil and that of the charging pile into the pairing position.
In this embodiment, a docking action matched with the structural features of the disc-type charging pile enables the robot, after moving within the preset range around the charging pile, to move onto the charging pile simply and conveniently and begin charging, which improves the convenience and efficiency of charging the robot.
In some embodiments of the present disclosure, a start condition may be set for step S101 in fig. 1 (acquiring communication data between the UWB base station end of the robot and the UWB tag end of the target object); that is, step S101 is performed only when the start condition is satisfied.
In one possible embodiment, the communication data between the UWB base station end of the robot and the UWB tag end of the target object is acquired in at least one of the following cases: the navigation map corresponding to the robot movement area cannot be obtained; the position of the robot in the navigation map cannot be acquired; the position of the target object within the navigation map cannot be acquired. That is, in the case that the movement path cannot be planned by using the navigation map, the robot is controlled to move within a preset range around the target object by the method shown in fig. 1.
Based on this, in step S103, when the robot is controlled to move within the preset range around the target object according to the position information of the target object (the position information including the coordinate position of the target object in the coordinate system of the robot): if the position of one of the robot and the target object in the navigation map is successfully acquired while the position of the other cannot be acquired, the position of the other in the navigation map can be determined from the position of the one and the position information of the target object; the robot is then controlled to move to the preset range around the target object according to the positions of the robot and the target object in the navigation map. That is, a motion path is planned on the navigation map and the robot is controlled to move to the preset range of the target object along that path.
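For instance, if the robot's pose in the navigation map is known but the target object's map position is not, the target's map position follows from a 2-D rigid transform of its coordinates in the robot's frame. A sketch under the assumption that the robot's map pose is (x, y, yaw):

```python
import math

def target_in_map(robot_pose, target_in_robot):
    """Transform the target object's coordinates from the robot's
    coordinate system into the navigation map, given the robot's map
    pose (x, y, yaw in radians)."""
    x, y, yaw = robot_pose
    tx, ty = target_in_robot
    map_x = x + tx * math.cos(yaw) - ty * math.sin(yaw)
    map_y = y + tx * math.sin(yaw) + ty * math.cos(yaw)
    return (map_x, map_y)
```

The inverse case (target known in the map, robot not) uses the same transform solved for the robot's pose, with the heading taken from the UWB angle measurement.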
In another possible embodiment, the target object is a charging pile, such as a riding-type charging pile or a disc-type charging pile. The communication data between the UWB base station end of the robot and the UWB tag end of the target object may be acquired under at least one of the following conditions: the robot receives a charging instruction (such as a voice instruction) sent by a user; the electric quantity of the robot is lower than a preset low-battery threshold; or the movement time of the robot reaches a preset time threshold.
In this embodiment, setting the start condition for step S101 allows the method to be combined with navigation-map-based motion control, making the motion control more stable and reliable. In addition, setting start conditions for the charging scenario makes the method applicable to automatic-recharging scenarios and improves its adaptability.
Referring to fig. 3, a specific flow of a charging scenario of automatic recharging of a robot by combining the motion control method of the robot and the motion control of a navigation map is shown in an exemplary manner.
Firstly, automatic recharging is triggered in one of three ways (APP trigger, voice trigger, or low-battery trigger), which starts the normal recharging procedure. The robot is then controlled to move to the position directly in front of the charging pile using either the map mode or the UWB recharging mode, the map mode being preferred. In the map mode, if the robot is successfully localized and the map coordinates of the charging pile are successfully acquired, the robot is navigated on the map to a position directly in front of the charging pile; if the map coordinates of the charging pile cannot be acquired, AI recognition can be used to search for the charging pile, and if the search succeeds the robot is navigated on the map to the front of the pile, while if it fails a voice prompt reports that the charging pile was not found and map recharging fails. If the robot cannot be localized in the map mode, the UWB recharging mode can be started: when the robot is paired with the UWB of the charging pile and the data are normal, the robot is controlled to avoid obstacles autonomously and move to the position point provided by the UWB data, where AI recognition identifies the position of the charging pile; on success, the robot is controlled to move directly in front of the charging pile, and a corresponding voice reminder can be given if any link of the UWB recharging mode fails. Finally, the robot can be controlled to precisely locate the charging pile through the N-Tag (i.e. the plurality of direction identifiers), to advance after the precise positioning, and to dock onto the charging pile once the advance succeeds, with a corresponding voice reminder given if any link fails.
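The fallback order of fig. 3 can be condensed into a small mode-selection sketch; the flag and mode names are illustrative, not terms from the disclosure:

```python
def choose_recharge_mode(map_localized: bool, pile_coords_known: bool,
                         uwb_paired: bool) -> str:
    """Prefer map navigation, fall back to AI search for the pile,
    then to UWB recharging, and finally report failure by voice."""
    if map_localized:
        return "map_navigation" if pile_coords_known else "ai_search"
    if uwb_paired:
        return "uwb_recharge"
    return "voice_prompt_failure"
```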
According to a second aspect of the embodiments of the present disclosure, there is provided a motion control apparatus for a robot, referring to fig. 4, including:
an acquisition module 401, configured to acquire communication data between a UWB base station end of the robot and a UWB tag end of the target object;
a positioning module 402, configured to determine location information of the target object according to the communication data;
the first movement module 403 is configured to control the robot to move within a preset range around the target object according to the position information of the target object.
In some embodiments of the present disclosure, the UWB base station end of the robot includes at least one UWB antenna array.
In some embodiments of the present disclosure, the positioning module is specifically configured to:
determining the distance between the target object and the robot according to the communication data;
determining an angle between a connecting line of the target object and the robot and a preset direction according to the communication data;
and determining the coordinate position of the target object in the coordinate system of the robot according to the distance and the angle.
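The two UWB measurements above determine the target's coordinates by a polar-to-Cartesian conversion; a sketch assuming the preset direction is the robot's x-axis and the angle is given in radians:

```python
import math

def target_coordinates(distance: float, angle: float):
    """Convert the UWB-derived distance and the angle between the
    robot-target line and the preset direction into the target's
    coordinate position in the robot's coordinate system."""
    return (distance * math.cos(angle), distance * math.sin(angle))
```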
In some embodiments of the disclosure, the robot comprises a legged robot, and the target object comprises an asymmetric charging pile;
The device also comprises a second motion module for:
after the robot is controlled to move to a preset range around the target object according to the position information of the target object, acquiring an image to be detected, which is acquired by a camera of the robot, wherein the charging pile exists in the image to be detected;
performing identification processing on the image to be detected, and determining a direction mark on the charging pile in the image to be detected;
and determining the azimuth information of the robot and the charging pile according to the direction identification, and controlling the robot to move to the preset azimuth of the charging pile according to the azimuth information.
In some embodiments of the present disclosure, at least one side of the charging pile is provided with a corresponding direction identifier;
the second movement module is used for controlling the robot to move to the preset azimuth of the charging pile according to the azimuth information, and is specifically used for:
controlling the robot to remain motionless in place under the condition that the azimuth information characterizes that the robot is in the preset azimuth of the charging pile;
and controlling the robot to move to the preset azimuth of the charging pile under the condition that the azimuth information characterizes that the robot is not in the preset azimuth of the charging pile.
In one embodiment, the apparatus further comprises a third motion module for:
after the robot moves to the preset azimuth of the charging pile, acquiring an image to be detected, which is acquired by a camera of the robot, wherein a direction mark of the charging pile exists in the image to be detected, and at least one positioning point is arranged in the direction mark;
according to camera parameters of the camera, converting each positioning point in the image to be detected into a corresponding three-dimensional space coordinate point;
and controlling the robot to move to a preset charging position according to the coordinate position of at least one three-dimensional space coordinate point in the space coordinate system of the camera, so that the charging coil of the robot and the charging coil of the charging pile are in a pairing position.
In one embodiment, the robot comprises a legged robot and the target object comprises a symmetrical charging pile;
the device further comprises a fourth motion module for:
after the robot is controlled to move to a preset range around the target object according to the position information of the target object, acquiring an image to be detected, which is acquired by a camera of the robot, wherein the charging pile exists in the image to be detected;
Performing identification processing on the image to be detected, and determining the position of a center mark on the charging pile in the image to be detected;
and controlling the robot to move to a position opposite to the center of the charging pile according to the position of the center mark on the charging pile, and controlling the robot to move forward to the charging pile so as to enable the charging coil of the robot and the charging coil of the charging pile to be in a pairing position.
In one embodiment, the apparatus further comprises a charging action module configured to:
and after the robot moves to the preset azimuth of the charging pile, controlling the robot to execute a preset charging action so that the charging coil of the robot and the charging coil of the charging pile are in a pairing position.
In one embodiment, the obtaining module is specifically configured to:
acquiring communication data between a UWB base station end of the robot and a UWB tag end of a target object under at least one of the following conditions:
the navigation map corresponding to the robot movement area cannot be obtained;
the position of the robot in the navigation map cannot be acquired;
the position of the target object within the navigation map cannot be acquired.
In some embodiments of the disclosure, the position information of the target object includes a coordinate position of the target object within a coordinate system of the robot;
the first motion module is specifically configured to:
determining the position of the other of the robot and the target object in the navigation map according to the position of one of the robot and the target object in the navigation map and the position information of the target object, under the condition that the position of that one in the navigation map is successfully acquired and the position of the other in the navigation map cannot be acquired;
and controlling the robot to move to a preset range around the target object according to the positions of the robot and the target object in the navigation map.
The specific manner in which the various modules perform the operations in relation to the apparatus of the above embodiments has been described in detail in relation to the embodiments of the method of the first aspect and will not be described in detail here.
According to a third aspect of embodiments of the present disclosure, a robot for performing the motion control method of the robot of the first aspect is provided.
In accordance with a fourth aspect of embodiments of the present disclosure, reference is made to fig. 5, which schematically illustrates a block diagram of an electronic device. For example, the apparatus 500 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, or the like.
Referring to fig. 5, an apparatus 500 may include one or more of the following components: a processing component 502, a memory 504, a power supply component 506, a multimedia component 508, an audio component 510, an input/output (I/O) interface 512, a sensor component 514, and a communication component 516.
The processing component 502 generally controls overall operation of the apparatus 500, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing element 502 may include one or more processors 520 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 502 can include one or more modules that facilitate interactions between the processing component 502 and other components. For example, the processing component 502 may include a multimedia module to facilitate interaction between the multimedia component 508 and the processing component 502.
Memory 504 is configured to store various types of data to support operations at device 500. Examples of such data include instructions for any application or method operating on the apparatus 500, contact data, phonebook data, messages, pictures, videos, and the like. The memory 504 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power component 506 provides power to the various components of the device 500. The power components 506 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 500.
The multimedia component 508 includes a screen that provides an output interface between the device 500 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 508 includes a front-facing camera and/or a rear-facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the apparatus 500 is in an operational mode, such as a photographing mode or a video mode. Each front and rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 510 is configured to output and/or input audio signals. For example, the audio component 510 includes a Microphone (MIC) configured to receive external audio signals when the device 500 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 504 or transmitted via the communication component 516. In some embodiments, the audio component 510 further comprises a speaker for outputting audio signals.
The I/O interface 512 provides an interface between the processing component 502 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 514 includes one or more sensors for providing status assessment of various aspects of the apparatus 500. For example, the sensor assembly 514 may detect the on/off state of the device 500 and the relative positioning of components such as the display and keypad of the device 500; the sensor assembly 514 may also detect a change in position of the device 500 or of a component of the device 500, the presence or absence of user contact with the device 500, the orientation or acceleration/deceleration of the device 500, and a change in temperature of the device 500. The sensor assembly 514 may also include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 514 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 514 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 516 is configured to facilitate communication between the apparatus 500 and other devices in a wired or wireless manner. The apparatus 500 may access a wireless network based on a communication standard, such as WiFi,2G or 3G,4G or 5G, or a combination thereof. In one exemplary embodiment, the communication part 516 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 516 further includes a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, ultra Wideband (UWB) technology, bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 500 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements, for performing the motion control method of the robot described above.
In a fifth aspect, the present disclosure also provides, in an exemplary embodiment, a non-transitory computer-readable storage medium, such as the memory 504, comprising instructions executable by the processor 520 of the apparatus 500 to perform the motion control method of the robot described above. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following its general principles and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (23)

1. A method of controlling movement of a robot, comprising:
acquiring communication data between a UWB base station end of a robot and a UWB tag end of a target object;
determining the position information of the target object according to the communication data;
and controlling the robot to move to a preset range around the target object according to the position information of the target object.
2. The method of claim 1, wherein the UWB base station end of the robot includes at least one UWB antenna array.
3. The method according to claim 1, wherein the determining the position information of the target object from the communication data includes:
determining the distance between the target object and the robot according to the communication data;
determining an angle between a connecting line of the target object and the robot and a preset direction according to the communication data;
and determining the coordinate position of the target object in the coordinate system of the robot according to the distance and the angle.
4. The method of claim 1, wherein the robot comprises a legged robot and the target object comprises an asymmetric charging pile;
after controlling the robot to move within a preset range around the target object according to the position information of the target object, the method further comprises:
acquiring an image to be detected acquired by a camera of the robot, wherein the charging pile exists in the image to be detected;
Performing identification processing on the image to be detected, and determining a direction mark on the charging pile in the image to be detected;
and determining the azimuth information of the robot and the charging pile according to the direction identification, and controlling the robot to move to the preset azimuth of the charging pile according to the azimuth information.
5. The method according to claim 4, wherein at least one side of the charging pile is provided with a corresponding direction mark;
the controlling the robot to move to the preset azimuth of the charging pile according to the azimuth information comprises the following steps:
controlling the robot to remain motionless in place under the condition that the azimuth information characterizes that the robot is in the preset azimuth of the charging pile;
and controlling the robot to move to the preset azimuth of the charging pile under the condition that the azimuth information characterizes that the robot is not in the preset azimuth of the charging pile.
6. The method of claim 4, further comprising, after the robot moves to the preset azimuth of the charging pile:
acquiring an image to be detected acquired by a camera of the robot, wherein a direction identifier of the charging pile exists in the image to be detected, and at least one positioning point is arranged in the direction identifier;
According to camera parameters of the camera, converting each positioning point in the image to be detected into a corresponding three-dimensional space coordinate point;
and controlling the robot to move to a preset charging position according to the coordinate position of at least one three-dimensional space coordinate point in the space coordinate system of the camera, so that the charging coil of the robot and the charging coil of the charging pile are in a pairing position.
7. The method of claim 1, wherein the robot comprises a legged robot and the target object comprises a symmetrical charging pile;
after controlling the robot to move within a preset range around the target object according to the position information of the target object, the method further comprises:
acquiring an image to be detected acquired by a camera of the robot, wherein the charging pile exists in the image to be detected;
performing identification processing on the image to be detected, and determining the position of a center mark on the charging pile in the image to be detected;
and controlling the robot to move to a position opposite to the center of the charging pile according to the position of the center mark on the charging pile, and controlling the robot to move forward to the charging pile so as to enable the charging coil of the robot and the charging coil of the charging pile to be in a pairing position.
8. The method of controlling movement of a robot according to any one of claims 4 to 7, further comprising, after the robot moves to the preset azimuth of the charging pile:
and controlling the robot to execute a preset charging action so that a charging coil of the robot and a charging coil of the charging pile are in a pairing position.
9. The method according to claim 1, wherein the step of acquiring communication data between the UWB base station terminal of the robot and the UWB tag terminal of the target object comprises:
acquiring communication data between a UWB base station end of the robot and a UWB tag end of a target object under at least one of the following conditions:
the navigation map corresponding to the robot movement area cannot be obtained;
the position of the robot in the navigation map cannot be acquired;
the position of the target object within the navigation map cannot be acquired.
10. The method according to claim 9, wherein the positional information of the target object includes a coordinate position of the target object within a coordinate system of the robot;
the controlling the robot to move to a preset range around the target object according to the position information of the target object includes:
determining, in a case where the position of one of the robot and the target object in the navigation map is successfully acquired and the position of the other of the robot and the target object in the navigation map cannot be acquired, the position of the other in the navigation map according to the position of the one in the navigation map and the position information of the target object;
and controlling the robot to move to a preset range around the target object according to the positions of the robot and the target object in the navigation map.
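In the case where the robot is localized in the navigation map but the target object is not (the reverse case is symmetric), the target's map position can be recovered by composing the robot's map pose with the UWB-derived relative position. A minimal SE(2) sketch; the pose convention (x, y, heading) and function name are assumptions, not from the patent:

```python
import math

def robot_frame_to_map(robot_pose, target_in_robot):
    """Transform the target's UWB-derived (x, y) in the robot's own
    frame into map coordinates, given the robot's map pose (x, y, yaw)."""
    rx, ry, yaw = robot_pose
    tx, ty = target_in_robot
    mx = rx + tx * math.cos(yaw) - ty * math.sin(yaw)
    my = ry + tx * math.sin(yaw) + ty * math.cos(yaw)
    return mx, my

# Robot at map (1, 1) facing +y; a target 2 m straight ahead in the
# robot frame therefore lies at approximately map (1, 3).
mx, my = robot_frame_to_map((1.0, 1.0, math.pi / 2), (2.0, 0.0))
```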
11. A motion control apparatus of a robot, comprising:
an acquisition module for acquiring communication data between a UWB base station end of the robot and a UWB tag end of a target object;
a positioning module for determining position information of the target object according to the communication data;
and a first motion module for controlling the robot to move to a preset range around the target object according to the position information of the target object.
12. The motion control apparatus of claim 11, wherein the UWB base station end of the robot includes at least one UWB antenna array.
13. The motion control apparatus of claim 11, wherein the positioning module is specifically configured to:
determining the distance between the target object and the robot according to the communication data;
determining an angle between a connecting line of the target object and the robot and a preset direction according to the communication data;
and determining the coordinate position of the target object in the coordinate system of the robot according to the distance and the angle.
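The last step of the claim above is a polar-to-Cartesian conversion: the UWB-derived distance and angle directly give the target's coordinates in the robot's coordinate system. An illustrative sketch, assuming the angle is measured counter-clockwise from the robot's forward axis (a convention the patent does not specify):

```python
import math

def uwb_to_robot_frame(distance_m: float, angle_rad: float):
    """Convert a UWB range/bearing measurement into (x, y) coordinates
    in the robot's own coordinate system."""
    return distance_m * math.cos(angle_rad), distance_m * math.sin(angle_rad)

# A tag 2 m away, 90 degrees to the robot's left, lies on the y-axis.
x, y = uwb_to_robot_frame(2.0, math.pi / 2)
```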
14. The motion control apparatus of claim 11, wherein the robot comprises a legged robot and the target object comprises an asymmetric charging pile;
the apparatus further comprises a second motion module for:
after the robot is controlled to move to a preset range around the target object according to the position information of the target object, acquiring an image to be detected that is acquired by a camera of the robot, wherein the charging pile is present in the image to be detected;
performing identification processing on the image to be detected, and determining a direction mark on the charging pile in the image to be detected;
and determining azimuth information of the robot relative to the charging pile according to the direction mark, and controlling the robot to move to a preset azimuth of the charging pile according to the azimuth information.
15. The motion control apparatus of claim 14, wherein at least one side of the charging pile is provided with a corresponding direction mark;
the second motion module, when controlling the robot to move to the preset azimuth of the charging pile according to the azimuth information, is specifically configured to:
controlling the robot to remain in place in a case where the azimuth information characterizes that the robot is in the preset azimuth of the charging pile;
and controlling the robot to move to the preset azimuth of the charging pile in a case where the azimuth information characterizes that the robot is not in the preset azimuth of the charging pile.
16. The motion control apparatus of claim 14, further comprising a third motion module for:
after the robot moves to the preset azimuth of the charging pile, acquiring an image to be detected that is acquired by a camera of the robot, wherein a direction mark of the charging pile is present in the image to be detected, and at least one positioning point is arranged in the direction mark;
according to camera parameters of the camera, converting each positioning point in the image to be detected into a corresponding three-dimensional space coordinate point;
and controlling the robot to move to a preset charging position according to the coordinate position of the at least one three-dimensional space coordinate point in the space coordinate system of the camera, so that the charging coil of the robot and the charging coil of the charging pile are in a paired position.
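The positioning-point conversion in claim 16 matches the standard pinhole back-projection: given the camera intrinsics and a depth estimate for a pixel (obtained separately, e.g. from the known physical size of the direction mark — the patent does not say how), the pixel maps to a point in the camera's space coordinate system. A sketch under those assumptions:

```python
def backproject(u: float, v: float, depth: float,
                fx: float, fy: float, cx: float, cy: float):
    """Back-project pixel (u, v) at the given depth into a 3-D point in
    the camera's coordinate system, using pinhole intrinsics fx, fy
    (focal lengths in pixels) and cx, cy (principal point)."""
    return (u - cx) * depth / fx, (v - cy) * depth / fy, depth

# A pixel at the principal point projects onto the optical axis.
point = backproject(320.0, 240.0, 1.5, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
```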
17. The motion control apparatus of claim 11, wherein the robot comprises a legged robot and the target object comprises a symmetric charging pile;
the apparatus further comprises a fourth motion module for:
after the robot is controlled to move to a preset range around the target object according to the position information of the target object, acquiring an image to be detected that is acquired by a camera of the robot, wherein the charging pile is present in the image to be detected;
performing identification processing on the image to be detected, and determining the position of a center mark on the charging pile in the image to be detected;
and controlling the robot to move to a position directly facing the center of the charging pile according to the position of the center mark on the charging pile, and controlling the robot to advance toward the charging pile, so that the charging coil of the robot and the charging coil of the charging pile are in a paired position.
18. The motion control apparatus according to any one of claims 14 to 17, further comprising a charging action module for:
and after the robot moves to the preset azimuth of the charging pile, controlling the robot to execute a preset charging action, so that the charging coil of the robot and the charging coil of the charging pile are in a paired position.
19. The motion control apparatus of claim 11, wherein the acquisition module is specifically configured to:
acquiring communication data between a UWB base station end of the robot and a UWB tag end of a target object under at least one of the following conditions:
the navigation map corresponding to the robot movement area cannot be obtained;
the position of the robot in the navigation map cannot be acquired;
the position of the target object within the navigation map cannot be acquired.
20. The motion control apparatus of claim 19, wherein the position information of the target object includes a coordinate position of the target object within a coordinate system of the robot;
the first motion module is specifically configured to:
determining, in a case where the position of one of the robot and the target object in the navigation map is successfully acquired and the position of the other of the robot and the target object in the navigation map cannot be acquired, the position of the other in the navigation map according to the position of the one in the navigation map and the position information of the target object;
and controlling the robot to move to a preset range around the target object according to the positions of the robot and the target object in the navigation map.
21. A robot, configured to perform the motion control method of a robot according to any one of claims 1 to 10.
22. An electronic device, comprising a memory and a processor, wherein the memory is configured to store computer instructions executable on the processor, and the processor is configured to execute the computer instructions to implement the motion control method of a robot according to any one of claims 1 to 10.
23. A computer-readable storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method according to any one of claims 1 to 10.
CN202210316335.9A 2022-03-28 2022-03-28 Robot, motion control method and device thereof, electronic equipment and storage medium Pending CN116852345A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210316335.9A CN116852345A (en) 2022-03-28 2022-03-28 Robot, motion control method and device thereof, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN116852345A true CN116852345A (en) 2023-10-10

Family

ID=88225536




Legal Events

Date Code Title Description
PB01 Publication
TA01 Transfer of patent application right

Effective date of registration: 20231007

Address after: Room 602, 6th Floor, Building 5, Building 15, Kechuang 10th Street, Beijing Economic and Technological Development Zone, Daxing District, Beijing, 100176

Applicant after: Beijing Xiaomi Robot Technology Co.,Ltd.

Address before: No.018, 8th floor, building 6, No.33 yard, middle Xierqi Road, Haidian District, Beijing 100085

Applicant before: BEIJING XIAOMI MOBILE SOFTWARE Co.,Ltd.

SE01 Entry into force of request for substantive examination