WO2023025028A1 - Charging method, charging apparatus and robot - Google Patents

Charging method, charging apparatus and robot

Info

Publication number
WO2023025028A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
point cloud
cloud data
pose
charging stand
Prior art date
Application number
PCT/CN2022/113273
Other languages
English (en)
Chinese (zh)
Inventor
张新静
田丰溥
Original Assignee
追觅创新科技(苏州)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 追觅创新科技(苏州)有限公司 filed Critical 追觅创新科技(苏州)有限公司
Publication of WO2023025028A1 publication Critical patent/WO2023025028A1/fr

Links

Images

Classifications

    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02JCIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J7/00Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries
    • H02J7/00032Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries characterised by data exchange
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002Installations of electric equipment
    • A47L11/4005Arrangements of batteries or cells; Electric power supply arrangements
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • HELECTRICITY
    • H02GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02JCIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J7/00Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries
    • H02J7/0047Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries with monitoring or indicating devices or circuits

Definitions

  • the present invention relates to the communication field, in particular, to a charging method, a charging device and a robot.
  • the second approach is to give the charging stand a specific concave-convex structure, or to add reflective stickers of different intensities, scan that structure with a detection sensor, and then match the scanned structure against the preset structure.
  • This approach can improve recognition accuracy, but it places high demands on the structure of the charging stand: the concave-convex structure may trap dirt, and the reflective stickers are consumables whose wear weakens the features.
  • the autonomous recharging methods in the related art therefore suffer from relatively low accuracy in identifying the charging stand, and the identification method is not universal.
  • Embodiments of the present invention provide a charging method, a charging device, and a robot, so as to at least solve the problems in the related art that the accuracy of identifying the charging stand is relatively low and the identification method is not universal.
  • a charging method is provided, including: when it is determined that a target position relationship is satisfied between the robot and the charging stand, acquiring point cloud data obtained by the robot scanning a target area; calculating the relative pose of the robot with respect to the charging stand based on the point cloud data; determining a second global pose of the charging stand based on a first global pose of the robot and the relative pose; and controlling the robot, based on the second global pose, to drive into the entrance of the charging stand so that the charging stand charges the robot.
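The composition of the robot's first global pose with the measured relative pose into the stand's second global pose is standard rigid-body arithmetic. The sketch below illustrates it for a planar (x, y, θ) model; the 2D representation and the function name are assumptions for illustration, not the claimed implementation.

```python
import numpy as np

def compose_se2(global_pose, relative_pose):
    """Compose an SE(2) global pose (x, y, theta) with a relative pose
    expressed in the first pose's frame, returning a global pose.

    Here `global_pose` stands for the robot's first global pose and
    `relative_pose` for the charging stand's pose relative to the robot;
    the result is the stand's second global pose (assumed 2D model).
    """
    x, y, th = global_pose
    dx, dy, dth = relative_pose
    cos_t, sin_t = np.cos(th), np.sin(th)
    return (
        x + cos_t * dx - sin_t * dy,               # rotate the relative offset into the world frame
        y + sin_t * dx + cos_t * dy,
        (th + dth + np.pi) % (2 * np.pi) - np.pi,  # wrap heading to [-pi, pi)
    )

# Example: robot at (1.0 m, 2.0 m) heading 90 deg, stand 0.5 m straight ahead facing the robot
print(compose_se2((1.0, 2.0, np.pi / 2), (0.5, 0.0, np.pi)))  # -> (1.0, 2.5, -pi/2)
```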
  • calculating the relative pose of the robot with respect to the charging stand based on the point cloud data includes: determining a contour bitmap of the charging stand based on the information of each sampling point included in the point cloud data and the predetermined structural information of the charging stand; taking a target point on the charging stand as the origin and determining a template point cloud of the charging stand based on the contour bitmap; selecting a predetermined number of target frames of point cloud data from the multiple frames included in the point cloud data and stitching them to obtain a stitched frame point cloud; and determining the relative pose of the robot with respect to the charging stand based on the template point cloud and the stitched frame point cloud; wherein the point cloud data includes the multiple frames of point cloud data collected by the robot while rotating through a third angle.
  • selecting a predetermined number of target frames of point cloud data from the multiple frames included in the point cloud data includes: determining the last predetermined number of frames as the target frame point cloud data; or selecting the target frame point cloud data from the multiple frames at a predetermined selection interval.
  • stitching the predetermined number of target frames of point cloud data to obtain the stitched frame point cloud includes: taking the last frame included in the target frame point cloud data as the reference frame, using the global pose differences between the other target frames and the reference frame as a prior, and performing matching with a nearest-neighbour iterative algorithm to obtain the stitched frame point cloud.
  • determining the relative pose of the robot with respect to the charging stand based on the template point cloud and the stitched frame point cloud includes: calculating the relative position of the stitched frame point cloud and the template point cloud with the nearest-neighbour iterative algorithm, and taking that relative position as the relative pose of the robot and the charging stand.
  • controlling the robot to drive to the location of the charging stand based on the second global pose includes: determining a first pose point of the robot based on the first global pose and a second pose point of the charging stand based on the second global pose; determining a first distance from the first pose point to the mid-perpendicular of the charging stand, and a second distance indicating the length of the line between the first pose point and the second pose point; and controlling the robot to travel to the location of the charging stand based on the first distance and the second distance.
  • controlling the robot to travel to the location of the charging stand based on the first distance and the second distance includes: determining a first length of the first distance; when the first length exceeds a first length threshold, repeating the following operations until the first length is less than or equal to the first length threshold, and then executing the target processing so that the robot travels to the location of the charging stand: controlling the robot to turn towards the foot of the perpendicular from the first pose point to the mid-perpendicular, walk the first length, and then turn towards the second pose point; and if it is determined that the first length is less than or equal to the first length threshold, executing the target processing so that the robot travels to the location of the charging stand.
  • the target processing includes: determining the included angle between the line connecting the first pose point and the second pose point and the mid-perpendicular; determining, from a weighted value of the included angle and the first length, the distance by which the robot deviates from the mid-perpendicular, and continuously correcting the pose of the robot based on that deviation until the length of the line between the first pose point and the second pose point is less than a second length threshold; and controlling the robot to rotate a fourth angle while adjusting its angular velocity in real time based on the included angle until a predetermined component of the robot touches the charging stand.
  • a charging device is provided, including: an acquisition module, configured to acquire the point cloud data obtained by the robot scanning the target area when it is determined that the target position relationship is satisfied between the robot and the charging stand; a calculation module, configured to calculate the relative pose of the robot with respect to the charging stand based on the point cloud data; a first determination module, configured to determine the second global pose of the charging stand based on the first global pose of the robot and the relative pose; and a control module, configured to control the robot, based on the second global pose, to drive into the entrance where the charging stand is located so that the charging stand charges the robot.
  • a robot is provided, including: a scanning component, configured to scan the target area to obtain point cloud data; a control component, including the above-mentioned charging device; and a charging component, configured to charge the robot.
  • the global pose of the charging stand can be determined in combination with the point cloud data, and then the robot is controlled to perform the recharging operation based on the global pose of the charging stand.
  • since the point cloud data represents accurate spatial data, an accurate global pose of the charging stand can be obtained from the point cloud data.
  • as a result, the robot can accurately drive to the entrance where the charging stand is located, which effectively improves the universality of recharging and solves the problems in the related art that the accuracy of identifying the charging stand is relatively low and the identification method is not universal.
  • Fig. 1 is a block diagram of the hardware structure of a mobile robot for a charging method according to an embodiment of the present invention
  • Fig. 2 is a first schematic diagram of the relative positions of the charging stand and the robot according to the embodiment of the present invention
  • Fig. 3 is a second schematic diagram of the relative positions of the charging stand and the robot according to an embodiment of the present invention.
  • FIG. 4 is a flowchart of a charging method according to an embodiment of the present invention.
  • Fig. 5 is a schematic structural diagram of a charging stand according to an embodiment of the present invention.
  • Fig. 6 is a schematic diagram of docking according to an embodiment of the present invention.
  • Fig. 7 is a structural block diagram of a charging device according to an embodiment of the present invention.
  • FIG. 1 is a block diagram of the hardware structure of a mobile robot for a charging method according to an embodiment of the present invention.
  • the mobile robot can include one or more processors 102 (only one is shown in Figure 1; the processor 102 can include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)) and a memory 104 for storing data.
  • the above-mentioned mobile robot may also include a transmission device 106 and an input and output device 108 for communication functions.
  • the mobile robot may also include more or fewer components than those shown in FIG. 1, or have a different configuration with equivalent or greater functionality than that shown in FIG. 1.
  • the memory 104 can be used to store computer programs, for example, software programs and modules of application software, such as the computer program corresponding to the charging method of the mobile robot in the embodiment of the present invention; the processor 102 runs the computer program stored in the memory 104, thereby executing various functional applications and data processing, that is, realizing the above-mentioned method.
  • the memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • the memory 104 may further include memory located remotely from the processor 102, and these remote memories may be connected to the mobile robot through a network. Examples of the aforementioned networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
  • the transmission device 106 is used to receive or transmit data via a network.
  • a specific example of the above-mentioned network may include a wireless network provided by a mobile robot's communication provider.
  • the transmission device 106 includes a network interface controller (NIC for short), which can be connected to other network devices through a base station so as to communicate with the Internet.
  • the transmission device 106 may be a radio frequency (Radio Frequency, referred to as RF) module, which is used to communicate with the Internet in a wireless manner.
  • the sweeping robot is currently a typical representative of the smart home; it performs automatic cleaning operations, and these operations require a certain amount of energy.
  • the energy source of the sweeping robot is its built-in battery, which supports all of the robot's operating actions; if the battery is insufficient, the sweeping robot cannot work normally.
  • the following explains how the sweeping robot is charged: the sweeping robot and the charging stand are each equipped with a charging sheet; the charging sheet can be set on the side, top, etc. of the robot, and it can be embedded in a surface of the robot.
  • the charging sheet on the sweeping robot and the charging sheet on the charging stand are arranged so as to correspond to each other.
  • the sweeping robot can then be charged; in addition, the sweeping robot can also be charged wirelessly.
  • the electromagnetic induction principle (electricity generating magnetism and magnetism generating electricity) can be used to charge the sweeping robot wirelessly.
  • the sweeping robot needs to drive to the location of the charging stand for charging, but in practical applications the areas where the sweeping robot and the charging stand are located may not be the same. As shown in Figure 2, the sweeping robot is in room A and the charging stand is in room B, so the robot is far away from the charging stand. In this case, the sweeping robot needs to walk to the vicinity of the charging stand based on the map of the charging stand's location saved on the robot, or by means of infrared.
  • a charging method is provided in this embodiment; as shown in Figure 4, the method includes the following steps:
  • the execution subject of the above operations may be an intelligent robot (for example, a sweeper), or a processor provided in the intelligent robot, or other devices with similar processing capabilities.
  • the above-mentioned target area is the area where the robot and the charging stand are located.
  • the target area can be indoor places such as bedrooms, living rooms, study rooms, restaurants, and offices.
  • the above-mentioned target area can also be part of the outdoor area, such as a fitness square.
  • satisfying the target positional relationship between the robot and the charging base may include many situations.
  • the target positional relationship may be that the distance between the robot and the charging stand is less than a certain distance; or that the robot has travelled to the area corresponding to the location of the charging stand (for example, when the charging stand is set in the bedroom, the robot has travelled to the entrance of the bedroom; when the charging stand is set in the living room, the robot has travelled to the sofa in the living room, etc.); or that the robot has driven to the front of the charging stand, and so on.
  • the point cloud data can be obtained by sensing the surrounding environment through the ranging sensor of the robot.
  • the ranging sensor includes a two-dimensional laser ranging unit and a three-dimensional ranging unit based on ToF or structured light.
  • Point cloud data includes the angle, distance and spatial height information of each sampling point.
  • when sensing the surrounding environment, the robot can be controlled to scan point cloud data continuously while moving, so as to obtain more complete information. For example, the robot can rotate 360° (or 180°, 270°, 200°, etc.; in practical applications the rotation need not be in place) to collect multiple frames of point cloud data, which reduces the influence of the supporting pillars of the ranging unit's protective cover and provides more constraint information.
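A minimal illustration of how one frame of range samples could be turned into Cartesian points while the robot rotates, assuming a planar 2D laser model and a per-frame robot pose from odometry; the function and argument names are illustrative only, not part of the application.

```python
import numpy as np

def frame_to_world_points(angles, ranges, robot_pose):
    """Convert one frame of polar samples (sensor frame) into 2D world points.

    angles, ranges: 1-D arrays of per-sample bearing (rad) and distance (m).
    robot_pose: (x, y, theta) of the robot when this frame was captured,
                e.g. from odometry while the robot rotates in place.
    """
    angles = np.asarray(angles, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    valid = np.isfinite(ranges) & (ranges > 0.0)      # drop invalid returns
    # points in the sensor/robot frame
    xs = ranges[valid] * np.cos(angles[valid])
    ys = ranges[valid] * np.sin(angles[valid])
    x, y, th = robot_pose
    rot = np.array([[np.cos(th), -np.sin(th)],
                    [np.sin(th),  np.cos(th)]])
    return (rot @ np.vstack([xs, ys])).T + np.array([x, y])

# Accumulating the frames captured during a 360-degree in-place rotation:
# cloud = np.vstack([frame_to_world_points(a, r, pose) for a, r, pose in frames])
```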
  • the global pose of the charging stand can thus be determined in combination with the point cloud data, and the robot can then be controlled to perform the recharging operation based on the global poses of the robot and the charging stand. Since point cloud data represents accurate spatial data, an accurate global pose of the charging stand can be obtained from it.
  • in this way, the robot can accurately drive to the entrance of the charging stand, which effectively improves the versatility and success rate of recharging and solves the problems in the related art that the accuracy of identifying the charging stand is relatively low and the identification method is not universal.
  • before the point cloud data obtained by the robot scanning the target area is acquired, the method further includes: controlling the robot to rotate by a first angle from an initial position, and determining, during this rotation, the start angle and the end angle at which each signal receiving head provided on the robot receives a first signal, wherein the start angle and the end angle are determined based on the initial position and the first signal is a signal sent by the charging stand; determining the target orientation of the charging stand based on the start angle and the end angle; and controlling the robot to move towards the target orientation until the robot and the charging stand satisfy the target positional relationship.
  • the first angle can be set flexibly, for example 360°, 270°, 180°, etc., and it can also be adjusted according to the environment where the robot is located. For example, in a bedroom, because of occlusion by the bed, the charging stand may only be placed at a specific corner of the wall, so the first angle in this scene can be set to 180°; in an open area with fewer occlusions, the position of the charging stand may not be fixed, so the first angle in this scene may be set to 360°. The above-mentioned signal receiving head may be an infrared receiving head.
  • the target orientation determined from the start angle and the end angle is only an approximate orientation, not an exact one. After the approximate orientation has been determined, the exact orientation still needs to be determined; the subsequent embodiments describe how this is done.
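One plausible reading of determining the target orientation from the start and end angles is to take the angular midpoint of the arc over which the first signal was received. The sketch below assumes that interpretation (it is not stated explicitly in the text) and handles wrap-around at 360°.

```python
import math

def approximate_bearing(start_angle, end_angle):
    """Midpoint of the arc [start_angle, end_angle] swept while the first
    signal was received, measured from the robot's initial heading (rad).

    This is only a rough bearing toward the charging stand, consistent with
    the text above that calls the result approximate.
    """
    span = (end_angle - start_angle) % (2 * math.pi)   # arc length, CCW from start
    mid = start_angle + span / 2.0
    return (mid + math.pi) % (2 * math.pi) - math.pi   # wrap to [-pi, pi)

print(approximate_bearing(math.radians(350), math.radians(20)))  # ~0.087 rad (about 5 deg)
```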
  • controlling the robot to move towards the target orientation until the robot and the charging stand satisfy the target positional relationship includes: controlling the robot to move towards the target orientation until the predetermined number of signal receiving heads provided on the robot can all receive a second signal sent by the charging stand.
  • the second signal sent by the charging stand may be a field signal, or called a guide signal
  • the guide signal may be an invisible light signal, for example, a specific pulse infrared signal.
  • after the above-mentioned target orientation indicating the approximate direction has been determined, the robot can be controlled to turn towards it and rely on the signal received by the infrared receiving head at the front of the machine to guide the recharge until the field signal is detected; if an obstacle or a collision is encountered while driving, the robot tries to go around or navigate past the obstacle and continue recharging.
  • after controlling the robot to move towards the target orientation, the method further includes: when it is determined that the predetermined number of signal receiving heads provided on the robot cannot all receive the second signal sent by the charging stand, repeating the following operations until the predetermined number of signal receiving heads can all receive the second signal: controlling the robot to rotate by a second angle in a first direction; controlling the robot to travel a first distance; and controlling the robot to rotate by the second angle in a second direction, wherein the first direction and the second direction are opposite.
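The recovery manoeuvre described above maps naturally onto a small retry loop. The `robot` object and all of its methods in the sketch are hypothetical placeholders for whatever motion and receiver interfaces the machine exposes; only the rotate / travel / rotate-back pattern comes from the text.

```python
def search_for_guide_signal(robot, second_angle, first_distance, max_attempts=20):
    """Repeat the rotate / travel / rotate-back manoeuvre until the predetermined
    number of front receiving heads all report the second (guide) signal.

    `robot` is a hypothetical interface with rotate(angle_rad), travel(dist_m),
    front_heads_receiving() -> int and required_heads; none of these names
    come from the patent application.
    """
    for _ in range(max_attempts):                      # bounded for safety in this sketch
        if robot.front_heads_receiving() >= robot.required_heads:
            return True                                # all required heads see the signal
        robot.rotate(+second_angle)                    # turn by the second angle in the first direction
        robot.travel(first_distance)                   # travel the first distance
        robot.rotate(-second_angle)                    # turn back in the opposite direction
    return False
```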
  • once the robot can receive the field signal, it can judge whether it has reached the position directly in front of the charging stand from the infrared signals received by the two receiving heads (or 3 receiving heads, 4 receiving heads, etc.) at the front of the machine.
  • if both receiving heads receive the centring signal (a modulated signal), the robot has reached the position directly in front of the charging stand; if not, the machine needs to rotate by a certain angle towards the centre line of the charging stand (that is, the second angle mentioned above, for example 45 degrees, 90 degrees, 120 degrees, etc.), drive a certain distance (for example along a straight line or an arc), turn back towards the charging stand, and then again check whether the receiving heads at the front of the robot (two, 3, 4, etc.) receive the infrared signal.
  • the robot can also walk a certain distance (for example, 5 cm, 10 cm, 20 cm, etc.) so that its head can face the charging stand.
  • determining the first global pose of the robot and calculating the relative pose of the robot with respect to the charging stand based on the point cloud data includes: determining the first global pose of the robot and the contour bitmap of the charging stand based on the information of each sampling point included in the point cloud data and the predetermined structural information of the charging stand; taking a target point on the charging stand as the origin and determining the template point cloud of the charging stand based on the contour bitmap, that is, the point cloud used to indicate the complete contour of the charging stand; selecting a predetermined number of target frames of point cloud data from the multiple frames included in the point cloud data and stitching them to obtain a stitched frame point cloud; and determining the relative pose of the robot with respect to the charging stand based on the template point cloud and the stitched frame point cloud; wherein the point cloud data includes the multiple frames of point cloud data collected by the robot while rotating through the third angle.
  • the point cloud data is a set of points obtained by acquiring the spatial coordinates of sampling points on the surface of an object, and each point contains information such as the coordinates of the sampling point; that is, the above-mentioned sampling-point information includes the coordinate information of the sampling points.
  • the predetermined structural information of the charging stand can be the contour information inside the charging stand.
  • the structural information can also be the external contour information of the charging stand, or it can also be the contour information of a part of the charging stand.
  • the target point on the charging stand can be the center point of the charging stand port, or the center point of the front end of the charging stand, or the center point of the rear side of the charging stand, or the center of gravity of the charging stand, or other types of points.
  • the structural diagram of the charging stand can be referred to in Fig. 5 .
  • selecting a predetermined number of target frames of point cloud data from the multiple frames included in the point cloud data includes: determining the last predetermined number of frames as the target frame point cloud data; or selecting the target frame point cloud data from the multiple frames at a predetermined selection interval.
  • the actually scanned point cloud data may consist of multiple frames, and only part of them may be used in the application. The frames can be selected in a specific way, for example by taking the last few frames (for example, the last 4 or 5 frames), by taking the middle frames, by taking frames at a certain interval (for example, skipping 4 frames and keeping one, that is, keeping the last frame of every 5 frames), or by some other selection method.
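Both selection strategies mentioned above reduce to simple slicing if the frames are held in a list; a minimal sketch, assuming `frames` is a Python list of per-frame point arrays (the names are illustrative only):

```python
def select_last_frames(frames, n):
    """Keep only the last n frames, e.g. the last 4 or 5 frames of the scan."""
    return frames[-n:]

def select_every_kth(frames, k):
    """Keep the last frame of every group of k frames (a fixed selection interval),
    e.g. k=5 keeps frames 4, 9, 14, ... (0-based)."""
    return frames[k - 1::k]
```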
  • before the target frame point cloud data is selected, isolated-noise removal can be performed on each frame of point cloud data to remove obviously problematic points; an isolated point is a single point with no adjacent points within a certain range (the range can be set to 10 cm, 20 cm, etc.).
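One common way to implement the isolated-point filter described above is a radius query against a k-d tree: a point with no neighbour within the chosen range is dropped. The sketch assumes 2D NumPy point arrays and uses 10 cm as the range, one of the examples given in the text.

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_isolated_points(points, radius=0.10):
    """Drop points that have no other point within `radius` metres.

    points: (N, 2) array of 2D samples from one frame of point cloud data.
    """
    if len(points) == 0:
        return points
    tree = cKDTree(points)
    neighbours = tree.query_ball_point(points, r=radius)
    # each query result includes the point itself, so an isolated point has length 1
    counts = np.array([len(n) for n in neighbours])
    return points[counts > 1]
```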
  • stitching the predetermined number of target frames of point cloud data to obtain the stitched frame point cloud includes: taking the last frame included in the target frame point cloud data as the reference frame, using the global pose differences between the other target frames and the reference frame as a prior, and performing matching with a nearest-neighbour iterative algorithm to obtain the stitched frame point cloud.
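To make the stitching step concrete, the sketch below reimplements the idea in plain Python/NumPy: the global pose difference between each selected frame and the reference (last) frame provides the initial transform, and a few nearest-neighbour iterations (ICP-style) refine it before the points are stacked. This is an illustrative sketch under a 2D assumption, not the applicant's code; all function names are invented for the example.

```python
import numpy as np
from scipy.spatial import cKDTree

def pose_to_matrix(pose):
    """(x, y, theta) -> 3x3 homogeneous transform."""
    x, y, th = pose
    c, s = np.cos(th), np.sin(th)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

def apply_transform(T, pts):
    """Apply a 3x3 homogeneous transform to an (N, 2) point array."""
    return pts @ T[:2, :2].T + T[:2, 2]

def best_fit_transform(src, dst):
    """Least-squares rigid transform (SVD / Kabsch) mapping src onto dst."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # keep a proper rotation (no reflection)
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    T = np.eye(3)
    T[:2, :2], T[:2, 2] = R, t
    return T

def icp_2d(src, dst, init=np.eye(3), iters=30, tol=1e-6):
    """Nearest-neighbour iterative (ICP) alignment of src onto dst, seeded with init."""
    T = init.copy()
    tree = cKDTree(dst)
    prev_err = np.inf
    for _ in range(iters):
        moved = apply_transform(T, src)
        dists, idx = tree.query(moved)          # nearest-neighbour correspondences
        T = best_fit_transform(src, dst[idx])   # re-fit from the original source points
        err = dists.mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return T

def stitch_frames(frames):
    """frames: list of (points (N, 2) in the robot frame, global_pose (x, y, theta)).
    The last frame is the reference; pose differences act as the ICP prior."""
    ref_pts, ref_pose = frames[-1]
    T_ref_inv = np.linalg.inv(pose_to_matrix(ref_pose))
    stitched = [ref_pts]
    for pts, pose in frames[:-1]:
        prior = T_ref_inv @ pose_to_matrix(pose)     # odometry-based initial transform
        T = icp_2d(pts, ref_pts, init=prior)         # refine with nearest-neighbour iterations
        stitched.append(apply_transform(T, pts))
    return np.vstack(stitched)
```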
  • determining the relative pose of the robot with respect to the charging stand based on the template point cloud and the stitched frame point cloud includes: calculating the relative position of the stitched frame point cloud and the template point cloud with the nearest-neighbour iterative algorithm, and taking that relative position as the relative pose of the robot and the charging stand.
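Continuing the previous sketch (it reuses `icp_2d` and `pose_to_matrix` defined there), one way to read the matching step is to align the template point cloud, whose origin is the target point on the charging stand, onto the stitched frame point cloud; the resulting transform is the stand's pose in the robot frame, which can then be composed with the robot's first global pose. The helper below is an assumption-laden usage example, not the claimed method.

```python
import numpy as np

def matrix_to_pose(T):
    """3x3 homogeneous transform -> (x, y, theta)."""
    return (T[0, 2], T[1, 2], float(np.arctan2(T[1, 0], T[0, 0])))

def charging_stand_global_pose(template_pts, stitched_pts, robot_pose):
    """template_pts: (M, 2) contour points with the target point as origin.
    stitched_pts: output of stitch_frames(), in the robot (reference) frame.
    robot_pose:   the robot's first global pose at the reference frame."""
    T_rel = icp_2d(template_pts, stitched_pts)        # stand frame -> robot frame
    relative_pose = matrix_to_pose(T_rel)             # relative pose of the stand
    T_global = pose_to_matrix(robot_pose) @ T_rel     # compose with the robot's global pose
    return relative_pose, matrix_to_pose(T_global)    # second global pose of the stand
```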
  • controlling the robot to drive to the location of the charging stand based on the second global pose includes: determining a first pose point of the robot based on the first global pose (the first pose point is effectively the robot's coordinate origin; it can be, for example, a point near the robot's forward direction, a point near its backward direction, the robot's centre point, etc.) and determining a second pose point of the charging stand based on the second global pose; determining a first distance from the first pose point to the mid-perpendicular of the charging stand, and a second distance indicating the length of the line between the first pose point and the second pose point; and controlling the robot to travel to the location of the charging stand based on the first distance and the second distance.
  • controlling the robot to travel to the location of the charging stand based on the first distance and the second distance includes: determining a first length of the first distance; when the first length exceeds a first length threshold (for example, 3 cm, 5 cm, etc.), repeating the following operations until the first length is less than or equal to the first length threshold, and then performing target processing so that the robot travels to the location of the charging stand: controlling the robot to turn towards the foot of the perpendicular from the first pose point to the mid-perpendicular, walk the first length, and then turn towards the second pose point; and if it is determined that the first length is less than or equal to the first length threshold, executing the target processing so that the robot travels to the location of the charging stand.
  • the target processing includes: determining the included angle between the line connecting the first pose point and the second pose point and the mid-perpendicular; determining, from a weighted value of the included angle and the first length, the distance by which the robot deviates from the mid-perpendicular, and continuously correcting the pose of the robot based on that deviation until the length of the line between the first pose point and the second pose point is less than a second length threshold (for example, 2 cm, 4 cm, etc.); and controlling the robot to rotate a fourth angle (for example, 120°, 180°, 200°, etc.) while adjusting its angular velocity in real time based on the included angle until a predetermined component of the robot touches the charging stand.
  • from the pose point p1 of the charging stand (which can be located at a specific position on the charging stand, or be an artificially specified point) and the pose point p2 of the sweeping machine, the machine can compute the mid-perpendicular L1 of the charging stand and the line segment L2 connecting p1 and p2 (corresponding to the aforementioned second distance).
  • a perpendicular is dropped from the point p2 onto the line L1 to obtain a line segment L3 (corresponding to the aforementioned first distance), whose foot on L1 is the point p3; the length of L3 is d1, and the angle between L1 and L2 is θ.
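The quantities just introduced (the foot point p3 on L1, the lengths d1 and d2, and the angle θ) follow from elementary plane geometry. The sketch below assumes that L1 is the stand's centre line through p1 along a known heading `stand_theta`; that heading parameter is an assumption added for the example, not a value defined in the text.

```python
import math

def docking_geometry(p1, stand_theta, p2):
    """Compute p3 (foot of the perpendicular from the robot point p2 onto the
    stand's centre line L1), d1 = |p2 p3|, d2 = |p1 p2| and the angle theta
    between L1 and the segment L2 = p1 -> p2.

    p1: (x, y) pose point of the charging stand; stand_theta: direction of L1.
    p2: (x, y) pose point of the robot.  This is an illustrative model of the
    description above, not the claimed implementation.
    """
    ux, uy = math.cos(stand_theta), math.sin(stand_theta)    # unit vector along L1
    vx, vy = p2[0] - p1[0], p2[1] - p1[1]                    # vector p1 -> p2
    s = vx * ux + vy * uy                                    # projection of p2 onto L1
    p3 = (p1[0] + s * ux, p1[1] + s * uy)                    # foot of the perpendicular
    d1 = math.hypot(p2[0] - p3[0], p2[1] - p3[1])            # first distance (to L1)
    d2 = math.hypot(vx, vy)                                  # second distance |p1 p2|
    theta = math.acos(min(1.0, max(-1.0, abs(s) / d2))) if d2 > 0 else 0.0
    return p3, d1, d2, theta
```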
  • the compensation action is: the machine turns to the direction of point p3, walks a distance of d1, and then turns to the direction of p1.
  • the machine determines how far it deviates from the mid-perpendicular from the weighted value of the included angle θ and d1, computes an angular velocity in real time based on that weighted value to correct the machine's pose, and calculates the length d2 of L2 in real time during this process.
  • when d2 is less than a preset threshold, it is determined that the machine has walked close enough to the charging stand; at this point the machine rotates 180 degrees (or another angle, such as 160 degrees, 150 degrees, 100 degrees, etc.) and adjusts its own angular velocity in real time according to the included angle θ to ensure accurate docking, until the charging pad of the machine touches the charging stand.
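Putting the pieces together, the docking behaviour described in the last few items can be sketched as three phases: lateral compensation, approach with continuous correction, and the final turn-and-dock. It reuses `docking_geometry` from the sketch above; the `robot` interface, the gain, and the thresholds are all assumptions, not values taken from the application.

```python
import math

def dock(robot, p1, stand_theta, d1_threshold=0.05, d2_threshold=0.02, gain=1.0):
    """Illustrative docking loop.  `robot` is a hypothetical interface
    (pose(), turn_towards(), travel(), set_angular_velocity(), rotate(),
    charging_pad_contact()); the weighting of theta and d1 is assumed."""
    # Phase 1, lateral compensation: while the robot is too far from the centre
    # line L1, turn toward the foot point p3, walk d1, then turn back toward p1.
    while True:
        p2 = robot.pose()[:2]
        p3, d1, d2, theta = docking_geometry(p1, stand_theta, p2)
        if d1 <= d1_threshold:
            break
        robot.turn_towards(p3)
        robot.travel(d1)
        robot.turn_towards(p1)

    # Phase 2, approach: correct the heading from a weighted combination of
    # theta and d1 until the robot is close enough (d2 below the threshold).
    while True:
        p2 = robot.pose()[:2]
        _, d1, d2, theta = docking_geometry(p1, stand_theta, p2)
        if d2 < d2_threshold:
            break
        robot.set_angular_velocity(gain * (theta + d1))   # assumed weighting
        robot.travel(0.01)                                # small forward step

    # Phase 3, final docking: rotate (e.g. 180 degrees) and keep adjusting the
    # angular velocity from theta until the charging pad touches the stand.
    robot.rotate(math.pi)
    while not robot.charging_pad_contact():
        _, _, _, theta = docking_geometry(p1, stand_theta, robot.pose()[:2])
        robot.set_angular_velocity(gain * theta)
```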
  • in this way, the pose of the charging stand can be accurately identified and the recharging success rate improved.
  • a charging device is also provided, which is used to implement the above embodiments and preferred implementation modes, and what has already been described will not be repeated.
  • the term "module” may be a combination of software and/or hardware that realizes a predetermined function.
  • although the devices described in the following embodiments are preferably implemented in software, implementations in hardware, or in a combination of software and hardware, are also possible and contemplated.
  • Fig. 7 is a structural block diagram of a charging device according to an embodiment of the present invention. As shown in Fig. 7, the device includes:
  • An acquisition module 72 configured to acquire point cloud data obtained by scanning the target area by the robot when it is determined that the target position relationship is satisfied between the robot and the charging stand;
  • a calculation module 74 configured to calculate the relative pose of the robot relative to the charging stand based on the point cloud data
  • a first determining module 76 configured to determine a second global pose of the charging stand based on the first global pose of the robot and the relative pose;
  • the control module 78 is configured to control the robot to drive into the entrance of the charging stand based on the second global pose, so that the charging stand can charge the robot.
  • the above-mentioned device also includes:
  • a second determination module, configured to, before the point cloud data obtained by the robot scanning the target area is acquired, control the robot to rotate by the first angle from the initial position and determine the start angle and the end angle at which each signal receiving head provided on the robot receives the first signal, wherein the start angle and the end angle are determined based on the initial position, and the first signal is a signal sent by the charging stand;
  • a third determination module, configured to determine the target orientation of the charging stand based on the start angle and the end angle;
  • the control module 78 is further configured to control the robot to move towards the target orientation until the robot and the charging stand satisfy the target positional relationship.
  • the control module 78 may control the robot to move towards the target orientation until the robot and the charging stand satisfy the target position relationship in the following manner: controlling the robot to move towards the target orientation until the predetermined number of signal receiving heads provided on the robot can all receive the second signal sent by the charging stand.
  • the device is further configured to, after controlling the robot to move towards the target orientation: when it is determined that the predetermined number of signal receiving heads provided on the robot cannot all receive the second signal sent by the charging stand, repeat the following operations until the predetermined number of signal receiving heads can all receive the second signal: controlling the robot to rotate by the second angle in the first direction; controlling the robot to travel the first distance; and controlling the robot to rotate by the second angle in the second direction, wherein the first direction and the second direction are opposite.
  • the calculation module 74 can determine the first global pose of the robot and the relative pose of the robot with respect to the charging stand in the following manner: determining the first global pose of the robot and the contour bitmap of the charging stand based on the information of each sampling point included in the point cloud data and the predetermined structural information of the charging stand; taking the target point on the charging stand as the origin and determining the template point cloud of the charging stand based on the contour bitmap; selecting the predetermined number of target frames of point cloud data from the multiple frames included in the point cloud data and stitching them to obtain the stitched frame point cloud; and determining the relative pose of the robot with respect to the charging stand based on the template point cloud and the stitched frame point cloud; wherein the point cloud data includes the multiple frames of point cloud data collected by the robot while rotating through the third angle.
  • the device may select the predetermined number of target frames of point cloud data from the multiple frames included in the point cloud data in the following manner: determining the last predetermined number of frames as the target frame point cloud data; or selecting the target frame point cloud data from the multiple frames at the predetermined selection interval.
  • the device may stitch the predetermined number of target frames of point cloud data to obtain the stitched frame point cloud in the following manner: the last frame included in the target frame point cloud data is used as the reference frame point cloud data, the global pose differences between the other target frames and the reference frame are used as a prior, and the nearest-neighbour iterative algorithm is used to perform the matching operation to obtain the stitched frame point cloud.
  • the device can determine the relative pose of the robot with respect to the charging stand based on the template point cloud and the stitched frame point cloud in the following manner: using the nearest-neighbour iterative algorithm to calculate the relative position of the stitched frame point cloud and the template point cloud, and determining that relative position as the relative pose of the robot and the charging stand.
  • the control module 78 can control the robot to drive to the location of the charging stand based on the second global pose in the following manner: determining the first pose point of the robot based on the first global pose and the second pose point of the charging stand based on the second global pose; determining the first distance from the first pose point to the mid-perpendicular of the charging stand, and the second distance used to indicate the length of the line between the first pose point and the second pose point; and controlling the robot to travel to the location of the charging stand based on the first distance and the second distance.
  • the control module 78 may control the robot to travel to the location of the charging stand based on the first distance and the second distance in the following manner: determining the first length of the first distance; when it is determined that the first length exceeds the first length threshold, repeatedly performing the following operations until the first length is less than or equal to the first length threshold, and then performing the target processing so that the robot travels to the location of the charging stand: controlling the robot to turn towards the foot of the perpendicular from the first pose point to the mid-perpendicular, walk the first length, and then turn towards the second pose point; and if it is determined that the first length is less than or equal to the first length threshold, executing the target processing so that the robot travels to the location of the charging stand.
  • the target processing includes: determining the included angle between the line connecting the first pose point and the second pose point and the mid-perpendicular; determining, from the weighted value of the included angle and the first length, the distance by which the robot deviates from the mid-perpendicular, and continuously correcting the pose of the robot based on that deviation until the length of the line between the first pose point and the second pose point is less than the second length threshold; and controlling the robot to rotate the fourth angle while adjusting the angular velocity of the robot in real time based on the included angle until the predetermined component of the robot touches the charging stand.
  • in an optional embodiment, a robot includes: a scanning component, configured to scan the target area to obtain point cloud data; a control component, including the charging device described in any one of the above embodiments; and a charging component, configured to charge the robot.
  • the above-mentioned modules can be realized by software or hardware; in the latter case this can be done, without limitation, in the following manner: the above-mentioned modules are all located in the same processor, or the above-mentioned modules are distributed across different processors in any combination.
  • Embodiments of the present invention also provide a computer-readable storage medium, in which a computer program is stored, wherein the computer program is set to execute the steps in any one of the above method embodiments when running.
  • the above-mentioned computer-readable storage medium may be configured to store a computer program for performing the following steps:
  • the above-mentioned computer-readable storage medium may include, but is not limited to: a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or other media that can store computer programs.
  • An embodiment of the present invention also provides an electronic device, including a memory and a processor, where a computer program is stored in the memory, and the processor is configured to run the computer program to perform the steps in any one of the above method embodiments.
  • the electronic device may further include a transmission device and an input and output device, wherein the transmission device is connected to the processor, and the input and output device is connected to the processor.
  • the above-mentioned processor may be configured to execute the following steps through a computer program:
  • each module or step of the present invention described above can be realized by a general-purpose computing device; the modules or steps can be concentrated on a single computing device or distributed over a network formed by multiple computing devices, and they can be implemented as program code executable by a computing device, so that they can be stored in a storage device and executed by a computing device. In some cases, the steps shown or described can be executed in an order different from the one given here, or they can be fabricated into individual integrated circuit modules, or multiple modules or steps among them can be fabricated into a single integrated circuit module. As such, the present invention is not limited to any specific combination of hardware and software.

Landscapes

  • Engineering & Computer Science (AREA)
  • Power Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

Embodiments of the present invention relate to a charging method, a charging apparatus and a robot. The charging method comprises the following steps: when it is determined that a robot and a charging base satisfy a target position relationship, acquiring point cloud data obtained by the robot scanning a target area; calculating, on the basis of the point cloud data, a relative pose of the robot with respect to the charging base; determining, on the basis of a first global pose of the robot and the relative pose, a second global pose of the charging base; and, on the basis of the second global pose, controlling the robot to drive into an entrance of the charging base so that the charging base charges the robot. The solution solves the problems in the related art of the relatively low accuracy of recognizing a charging base and of a non-universal recognition method.
PCT/CN2022/113273 2021-08-23 2022-08-18 Procédé de charge, appareil de charge et robot WO2023025028A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110969837.7A CN113675923B (zh) 2021-08-23 2021-08-23 充电方法、充电装置及机器人
CN202110969837.7 2021-08-23

Publications (1)

Publication Number Publication Date
WO2023025028A1 true WO2023025028A1 (fr) 2023-03-02

Family

ID=78545358

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/113273 WO2023025028A1 (fr) 2021-08-23 2022-08-18 Procédé de charge, appareil de charge et robot

Country Status (2)

Country Link
CN (1) CN113675923B (fr)
WO (1) WO2023025028A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113675923B (zh) * 2021-08-23 2023-08-08 追觅创新科技(苏州)有限公司 充电方法、充电装置及机器人
CN114355889A (zh) * 2021-12-08 2022-04-15 上海擎朗智能科技有限公司 控制方法、机器人、机器人充电座及计算机可读存储介质
CN114983273A (zh) * 2022-06-01 2022-09-02 深圳市倍思科技有限公司 一种清洁装置的回充定位方法及清洁系统
CN116501070B (zh) * 2023-06-30 2023-09-19 深圳市欢创科技有限公司 回充方法、机器人及存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110632915A (zh) * 2018-06-21 2019-12-31 科沃斯机器人股份有限公司 机器人回充路径规划方法、机器人及充电系统
CN111625005A (zh) * 2020-06-10 2020-09-04 浙江欣奕华智能科技有限公司 一种机器人充电方法、机器人充电的控制装置及存储介质
CN112346453A (zh) * 2020-10-14 2021-02-09 深圳市杉川机器人有限公司 机器人自动回充方法、装置、机器人和存储介质
US20210060801A1 (en) * 2019-08-27 2021-03-04 Lg Electronics Inc. Method and system for charging robot
CN112792820A (zh) * 2021-03-16 2021-05-14 千里眼(广州)人工智能科技有限公司 机器人自动回充方法、装置及机器人系统
CN112826377A (zh) * 2021-02-23 2021-05-25 美智纵横科技有限责任公司 扫地机的回充对准方法、装置及扫地机
CN113675923A (zh) * 2021-08-23 2021-11-19 追觅创新科技(苏州)有限公司 充电方法、充电装置及机器人

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109407073B (zh) * 2017-08-15 2020-03-10 百度在线网络技术(北京)有限公司 反射值地图构建方法和装置
CN108363386A (zh) * 2017-12-30 2018-08-03 杭州南江机器人股份有限公司 基于二维码和激光的室内机器人定位方法、装置及系统
CN111413721B (zh) * 2020-01-14 2022-07-19 华为技术有限公司 车辆定位的方法、装置、控制器、智能车和系统
CN112086010B (zh) * 2020-09-03 2022-03-18 中国第一汽车股份有限公司 地图生成方法、装置、设备及存储介质

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110632915A (zh) * 2018-06-21 2019-12-31 科沃斯机器人股份有限公司 机器人回充路径规划方法、机器人及充电系统
US20210060801A1 (en) * 2019-08-27 2021-03-04 Lg Electronics Inc. Method and system for charging robot
CN111625005A (zh) * 2020-06-10 2020-09-04 浙江欣奕华智能科技有限公司 一种机器人充电方法、机器人充电的控制装置及存储介质
CN112346453A (zh) * 2020-10-14 2021-02-09 深圳市杉川机器人有限公司 机器人自动回充方法、装置、机器人和存储介质
CN112826377A (zh) * 2021-02-23 2021-05-25 美智纵横科技有限责任公司 扫地机的回充对准方法、装置及扫地机
CN112792820A (zh) * 2021-03-16 2021-05-14 千里眼(广州)人工智能科技有限公司 机器人自动回充方法、装置及机器人系统
CN113675923A (zh) * 2021-08-23 2021-11-19 追觅创新科技(苏州)有限公司 充电方法、充电装置及机器人

Also Published As

Publication number Publication date
CN113675923A (zh) 2021-11-19
CN113675923B (zh) 2023-08-08

Similar Documents

Publication Publication Date Title
WO2023025028A1 (fr) Procédé de charge, appareil de charge et robot
CN108247647B (zh) 一种清洁机器人
CN109683605B (zh) 机器人及其自动回充方法、系统、电子设备、存储介质
KR102242713B1 (ko) 이동 로봇 및 그 제어방법, 및 단말기
CN109730590B (zh) 清洁机器人以及清洁机器人自动返回充电的方法
EP3603370B1 (fr) Robot mobile, procédé de commande d'un robot mobile et système de robot mobile
KR102403504B1 (ko) 이동 로봇 및 그 제어 방법
CA2428360C (fr) Systeme robotique multiplateforme autonome
US9820433B2 (en) Auto mowing system
US7054716B2 (en) Sentry robot system
CN105700522B (zh) 一种机器人充电方法及其充电系统
US20190254490A1 (en) Vacuum cleaner and travel control method thereof
CN109669457B (zh) 一种基于视觉标识的机器人回充方法及芯片
US20020095239A1 (en) Autonomous multi-platform robot system
CN105119338A (zh) 移动机器人充电控制系统及方法
US20110046784A1 (en) Asymmetric stereo vision system
EP2296072A2 (fr) Système de vision stéréoscopique asymétrique
US20190220033A1 (en) Moving robot and controlling method for the moving robot
US11564348B2 (en) Moving robot and method of controlling the same
JP2002182742A (ja) モービルロボット及びその経路補正方法
CN111067432B (zh) 扫地机充电桩的充电工作区域的确定方法及扫地机
CN205081492U (zh) 移动机器人充电控制系统
CN211022482U (zh) 清洁机器人
WO2018228254A1 (fr) Dispositif électronique mobile et procédé pour utilisation dans un dispositif électronique mobile
US20200057449A1 (en) Vacuum cleaner

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22860374

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE