CN110861094A - Robot control method, robot, and readable storage medium - Google Patents

Robot control method, robot, and readable storage medium

Info

Publication number
CN110861094A
Authority
CN
China
Prior art keywords
elevator
robot
waiting
target position
waiting area
Prior art date
Legal status
Granted
Application number
CN201911252566.2A
Other languages
Chinese (zh)
Other versions
CN110861094B (en)
Inventor
徐恩科
霍峰
陈侃
卜大鹏
秦宝星
程昊天
Current Assignee
Shanghai Gaoxian Automation Technology Co Ltd
Original Assignee
Shanghai Gaoxian Automation Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Gaoxian Automation Technology Co Ltd
Priority to CN201911252566.2A
Publication of CN110861094A
Application granted
Publication of CN110861094B
Legal status: Active (current)
Anticipated expiration

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1679 Programme controls characterised by the tasks executed
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1602 Programme controls characterised by the control system, structure, architecture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J 9/1666 Avoiding collision or forbidden zones

Abstract

The application discloses a robot control method, a robot and a readable storage medium. The robot control method includes: after receiving the elevator taking task, determining an elevator waiting area of the robot according to the elevator entering point, wherein the distance from the elevator waiting area to the elevator is greater than the distance from the elevator entering point to the elevator; detecting the elevator waiting environment information of the elevator waiting area; and determining the target position of the robot waiting for the elevator in the elevator waiting area according to the elevator waiting environment information and the central position of the elevator waiting area, and controlling the robot to move to the target position. When the robot executes the elevator taking task and many passengers are waiting in front of the elevator, the robot can still find a suitable elevator waiting position according to the elevator waiting environment information of the elevator waiting area, so that it can enter the elevator as conveniently as possible while keeping a safe distance from the passengers.

Description

Robot control method, robot, and readable storage medium
Technical Field
The present application relates to the field of robot intelligent control technology, and more particularly, to a robot control method, a robot, and a readable storage medium.
Background
With the development of automation technology and artificial intelligence, intelligent robots bring great convenience to people's lives and services. The application scenarios of intelligent robots are becoming more and more abundant, and automatic elevator taking by robots is one of the new scenarios. At present, when an intelligent robot automatically executes an elevator taking task and many passengers are waiting in front of the elevator, the elevator waiting position pre-selected by the robot may be blocked by the passengers or other objects and become unreachable, causing the robot's navigation to fail. This greatly limits the application scenarios of the robot; for example, many robots can execute elevator taking tasks only at night when passenger traffic is low.
Disclosure of Invention
The present invention is directed to solving, at least to some extent, one of the technical problems in the related art.
Therefore, an object of the invention is to provide a robot control method by which a robot can safely execute an elevator taking task: the method realizes safe navigation for taking the elevator, automatically determines an elevator waiting area, and selects a suitable elevator waiting position, so that the robot can enter the elevator as conveniently as possible while keeping a safe distance from passengers.
Another object of the present invention is to provide a robot and a readable storage medium for safely performing a boarding task.
In order to achieve the above object, an embodiment of the present invention provides a robot control method, including: after receiving a boarding task, determining a waiting area of the robot according to an elevator entering point, wherein the distance from the waiting area to an elevator is greater than the distance from the elevator entering point to the elevator; detecting the elevator waiting environment information of the elevator waiting area; and determining a target position of the robot waiting for the elevator in the elevator waiting area according to the elevator waiting environment information and the central position of the elevator waiting area, and controlling the robot to move to the target position.
According to the robot control method, when the robot executes the elevator taking task, the robot can wait in an elevator waiting area far away from the elevator; after the elevator arrives, the robot first navigates to the elevator entering point close to the elevator and then advances in a straight line from the elevator entering point toward the elevator, so that it can enter the elevator safely, realizing safe navigation of the elevator taking task. When many passengers are waiting in front of the elevator, the robot can also find a suitable waiting position according to the waiting environment information of the elevator waiting area, so that it can enter the elevator as conveniently as possible while keeping a safe distance from the passengers.
In some embodiments, the step of determining the waiting area of the robot according to the elevator entering point comprises the following steps: determining the elevator position of the corresponding floor according to the scanned floor map; and marking the position at which the robot is right in front of the elevator door and facing the elevator as the elevator entering point, wherein the distance from the elevator entering point to the elevator door is 0.1-0.3 m. In this way, the elevator entering point of a given floor can be designated by marking; because the elevator entering point is close to the elevator door, the robot can accurately grasp the position of the elevator door from the designated elevator entering point, determine the elevator waiting area according to the elevator entering point, and safely and accurately enter the elevator along a straight line from the elevator entering point when the elevator arrives.
In some embodiments, the step of determining the waiting area of the robot according to the elevator entering point comprises the following steps: determining the size and the position of the elevator waiting area according to configuration parameters and the position of the elevator entering point, wherein the configuration parameters comprise the distance between the elevator waiting area and the elevator entering point, the shape and the size of the elevator waiting area, and the direction angle of the central position relative to the elevator entering point. The robot automatically generates the elevator waiting area according to the configuration parameters and the position of the elevator entering point, a user does not need to manually determine the elevator waiting area, and the workload of the user is reduced.
In some embodiments, the elevator waiting environment information includes obstacle information in the elevator waiting area, and the step of determining the target position of the robot waiting for the elevator in the elevator waiting area based on the elevator waiting environment information and the center position of the elevator waiting area includes: selecting the central position as the target position, and judging whether it is safe for the robot to move to the target position according to the obstacle information; when it is safe for the robot to move to the target position, controlling the robot to move to the target position; or reselecting the target position when it is unsafe for the robot to move to the target position. The center of the elevator waiting area is a relatively convenient position from which the robot can enter the elevator, so the robot preferentially waits at the center of the elevator waiting area; by accounting for obstacles in the elevator waiting area, the robot can enter the elevator as conveniently as possible while ensuring that entering the elevator is safe.
In some embodiments, the step of reselecting the target location when it is unsafe for the robot to move to the target location comprises: generating a plurality of elevator waiting points in the elevator waiting area; judging whether it is safe for the robot to move to each elevator waiting point according to the obstacle information; determining the weight of each elevator waiting point whose judgment result is safe according to the current position of the robot and the central position; and selecting the corresponding elevator waiting point as the target position according to the weight, and controlling the robot to move to the target position. The robot generates a plurality of elevator waiting points and performs safety judgment and weight analysis on them, so that the target position is determined again and the robot can flexibly select a suitable position at which to wait for the elevator.
In some embodiments, the weight is the sum of the distance from the current position of the robot to each elevator waiting point whose determination result is safe and the distance from the corresponding elevator waiting point to the center position, and selecting the corresponding elevator waiting point as the target position according to the weight includes: selecting the elevator waiting point with the minimum weight as the target position. In this way, the reselected target position is close to the center position of the elevator waiting area and the distance the robot must move to the target position is short, which facilitates movement and navigation and helps the robot wait for and enter the elevator.
In some embodiments, the step of selecting a corresponding elevator waiting point as the target position according to the weight and controlling the robot to move to the target position includes: controlling the robot to move and detecting the obstacle information of the elevator waiting area in real time; judging whether it is safe for the robot to move to the target position according to the obstacle information; and re-determining the target position when the target position is detected to be unsafe during the movement of the robot. Whether an obstacle appears at the target position is detected in real time while the robot moves, so that the movement of a waiting passenger or an obstacle does not compromise the safety of the robot moving to the target position.
The embodiment of the application provides a robot, which comprises an elevator waiting determination module, a detection module and a control module. The elevator waiting determination module is used for determining an elevator waiting area of the robot according to an elevator entering point after an elevator taking task is received, wherein the distance from the elevator waiting area to the elevator is greater than the distance from the elevator entering point to the elevator; the detection module is used for detecting the elevator waiting environment information of the elevator waiting area; the control module is used for determining a target position of the robot waiting for the elevator in the elevator waiting area according to the elevator waiting environment information and the central position of the elevator waiting area, and controlling the robot to move to the target position.
When the robot of the embodiment of the application executes the elevator taking task, the robot can wait in the elevator waiting area far away from the elevator; after the elevator arrives, the robot first navigates to the elevator entering point near the elevator and then advances in a straight line from the elevator entering point toward the elevator, which ensures that the robot can enter the elevator safely and realizes safe navigation of the elevator taking task. When many passengers are waiting in front of the elevator, the robot can also find a suitable waiting position according to the waiting environment information of the elevator waiting area, so that it can enter the elevator as conveniently as possible while keeping a safe distance from the passengers.
The embodiment of the application provides a robot, which comprises a processor, a readable storage medium and computer-executable instructions stored on the readable storage medium and capable of running on the processor, wherein when the computer-executable instructions are executed by the processor, the processor is enabled to execute the control method of any one of the above embodiments.
In the robot of the embodiment of the application, the processor executes the computer-executable instructions; when the elevator taking task is executed, the robot can wait in an elevator waiting area far away from the elevator, after the elevator arrives the robot first navigates to the elevator entering point close to the elevator, and it then advances in a straight line from the elevator entering point toward the elevator so that it can safely enter the elevator, realizing safe navigation of the elevator taking task. When many passengers are waiting in front of the elevator, the robot can also find a suitable waiting position according to the waiting environment information of the elevator waiting area, so that it can enter the elevator as conveniently as possible while keeping a safe distance from the passengers.
The present embodiments provide a non-transitory computer-readable storage medium including computer-executable instructions that, when executed by one or more processors, cause the processors to perform the robot control method of the above embodiments.
In the readable storage medium of the embodiment of the application, the processor executes the computer-executable instructions, so that when an elevator taking task is executed the robot can wait in a waiting area far away from the elevator, navigate first to the elevator entering point near the elevator after the elevator arrives, and then advance in a straight line from the elevator entering point toward the elevator to enter the elevator safely, realizing safe navigation of the elevator taking task. When many passengers are waiting in front of the elevator, the robot can also find a suitable waiting position according to the waiting environment information of the elevator waiting area, so that it can enter the elevator as conveniently as possible while keeping a safe distance from the passengers.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a block schematic diagram of a robot according to an embodiment of the present application.
Fig. 2 is a flowchart illustrating a robot control method according to an embodiment of the present invention.
Fig. 3 is a schematic view of the robot riding an elevator according to the embodiment of the present application.
Fig. 4 is a schematic view of an application scenario of a robot according to an embodiment of the present application.
Fig. 5 is another flowchart illustrating a robot control method according to an embodiment of the present application.
Fig. 6 is a further flowchart illustrating a robot control method according to an embodiment of the present invention.
Fig. 7 is a further flowchart of the robot control method according to the embodiment of the present application.
Fig. 8 is a further flowchart of the robot control method according to the embodiment of the present application.
Fig. 9 is a further flowchart of the robot control method according to the embodiment of the present application.
Fig. 10 is a further flowchart of the robot control method according to the embodiment of the present application.
Fig. 11 is a further flowchart of the robot control method according to the embodiment of the present application.
Fig. 12 is a further flowchart of the robot control method according to the embodiment of the present application.
Fig. 13 is another block schematic diagram of a robot according to an embodiment of the present application.
Fig. 14 is a schematic configuration diagram of a robot according to an embodiment of the present application.
Description of the main element symbols:
robot 10, elevator determination module 11, marking module 12, elevator waiting determination module 13, control module 14, communication module 15, determination module 16, detection module 17, processor 18, readable storage medium 19, computer executable instructions 191, elevator 20, elevator doors 22, elevator entry point 30, elevator waiting area 40, center location 42, elevator waiting point 44, terminal device 50, server 60.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative and are only for the purpose of explaining the present application and are not to be construed as limiting the present application.
Referring to fig. 1 to 4, a method for controlling a robot 10 according to an embodiment of the present disclosure is used to control the robot 10 to perform a task, for example, to control the robot 10 to perform an elevator riding task, so that the robot 10 can move between different floors to perform work. The robot 10 includes, but is not limited to, a cleaning robot 10, a meal delivery robot 10, a medicine delivery robot 10, and the like.
In some embodiments, the robot 10 control method includes:
step S1, after receiving the boarding task, determining the waiting area 40 of the robot 10 according to the boarding point 30, wherein the distance from the waiting area 40 to the elevator 20 is greater than the distance from the boarding point 30 to the elevator 20;
step S2, detecting the elevator waiting environment information of the elevator waiting area 40;
step S3, determining the target position of the robot 10 waiting for the elevator 20 in the waiting area 40 according to the waiting environment information and the center position 42 of the waiting area 40, and controlling the robot 10 to move to the target position.
Wherein the robot 10 waits for the elevator 20 in the waiting area 40 and moves to the entry point 30 to enter the elevator 20 from the entry point 30 when the elevator 20 arrives and opens the door.
Specifically, the robot 10 may include the elevator waiting determination module 13, the detection module 17, and the control module 14; step S1 may be implemented by the elevator waiting determination module 13, step S2 may be implemented by the detection module 17, and step S3 may be implemented by the elevator waiting determination module 13 and the control module 14. That is, the elevator waiting determination module 13 may be configured to determine the elevator waiting area 40 of the robot 10 according to the elevator entering point 30 after receiving the elevator taking task. The detection module 17 may be configured to detect the elevator waiting environment information of the elevator waiting area 40. The elevator waiting determination module 13 is further configured to determine a target position of the robot 10 waiting for the elevator 20 in the elevator waiting area 40 according to the elevator waiting environment information and the center position 42 of the elevator waiting area 40. The control module 14 is used to control the robot 10 to move to the target position.
In the method for controlling the robot 10 according to the embodiment of the present application, the robot 10 may wait in the waiting area 40 far from the elevator 20 when performing the elevator taking task; after the elevator 20 arrives, the robot 10 first navigates to the elevator entering point 30 near the elevator 20 and then advances in a straight line from the elevator entering point 30 toward the elevator 20, so that the robot 10 can be ensured to enter the elevator 20 safely, realizing safe navigation of the elevator taking task. When many passengers are waiting in front of the elevator 20, the robot 10 can find a suitable elevator waiting position according to the elevator waiting environment information of the elevator waiting area 40, so that it can enter the elevator as conveniently as possible while keeping a safe distance from the passengers.
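To make the division of steps concrete, the following is a minimal Python sketch of the flow just described; all method names (`determine_waiting_area`, `detect_environment`, `select_target_position`, `wait_for_door_open`, `enter_straight`) are illustrative assumptions rather than the patent's actual interfaces.

```python
# A minimal sketch of the flow above (steps S1-S3 plus the entry phase).
# The robot/elevator method names are illustrative assumptions, not the
# patent's actual implementation.

def take_elevator(robot, elevator, entry_point):
    # Step S1: derive the waiting area from the entry point; the area lies
    # farther from the elevator than the entry point does.
    waiting_area = robot.determine_waiting_area(entry_point)

    # Step S2: detect waiting-environment (obstacle) information in the area.
    env_info = robot.detect_environment(waiting_area)

    # Step S3: pick a target position from the environment information and
    # the area's center, then move there and wait.
    target = robot.select_target_position(env_info, waiting_area.center)
    robot.move_to(target)

    # Once the elevator arrives and opens its door, navigate to the entry
    # point and advance in a straight line into the elevator car.
    if robot.wait_for_door_open(elevator):
        robot.move_to(entry_point)
        robot.enter_straight(entry_point)
```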
In some embodiments, the detection module 17 may be a camera, and the robot 10 may acquire images of the elevator waiting area 40, including but not limited to color images, grayscale images, depth images, infrared images, and the like, through the camera, so as to acquire elevator waiting environment information of the elevator waiting area 40.
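One possible (assumed) way to turn the camera output into obstacle information for the waiting area is to project detections onto the floor plane and keep only the points that fall inside the area's footprint; the rectangular-area assumption and the function name below are illustrative.

```python
def obstacles_in_area(points_xy, area_center, length, width):
    """Keep only obstacle points (floor-plane XY coordinates, e.g. projected
    from a depth image) that lie inside a rectangular waiting area."""
    cx, cy = area_center
    return [(x, y) for (x, y) in points_xy
            if abs(x - cx) <= length / 2 and abs(y - cy) <= width / 2]
```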
Referring to fig. 5, in some embodiments, the robot 10 control method includes:
step S01, determining the position of the elevator 20 corresponding to the floor according to the scanned floor map; and
in step S02, the position where the robot 10 is located right in front of the elevator doors 22 and facing the elevator 20 is marked as the entry point 30.
Specifically, the robot 10 may include an elevator determination module 11 and a marking module 12; step S01 may be implemented by the elevator determination module 11 and step S02 may be implemented by the marking module 12. That is, the elevator determination module 11 may be configured to determine the position of the elevator 20 of the corresponding floor according to the scanned floor map. The marking module 12 may be used to mark the position at which the robot 10 is directly in front of the elevator doors 22 and directly facing the elevator 20 as the entry point 30.
It will be appreciated that when entering a new work scenario, the robot 10 needs to scan a map of the scenario and then perform the corresponding tasks for automatic operation depending on the location of the robot 10 in the map. Before the robot 10 performs work on a floor, relevant personnel may scan the floor map and identify the position of each area in the floor map, for example room positions, the position of the elevator 20, and the positions of entrances and exits; the scanned floor map may be stored in the robot 10, the terminal device 50, and/or the server 60, and the robot 10, the terminal device 50, and/or the server 60 may be connected through wired and/or wireless communication, thereby guiding the robot 10 to move. The wireless communication connection includes, but is not limited to, WiFi, Bluetooth, Zigbee, Narrowband Internet of Things (NB-IoT), and the like. Specifically, when the map is scanned, the contour of the elevator 20 can be scanned completely, the scanned map can be edited in the relevant application of the terminal device 50, and the contour of the elevator 20 can be drawn in the floor map, for example by drawing a polygon that fits the contour of the elevator 20 as closely as possible to identify its position. In this manner, in step S01, the position of the elevator 20 of the corresponding floor can be determined from the scanned floor map.
After obtaining the floor map and determining the position of the elevator 20, the user can control the robot 10 to move to the position right in front of the elevator door 22 and to face the direction of the elevator 20; at this time the current position of the robot 10 is localized and marked as the elevator entering point 30, so that the robot 10 can navigate to the elevator entering point 30 when performing an elevator taking task and enter the elevator 20 directly from the elevator entering point 30.
In step S02, the robot 10 may mark the position at which it is directly in front of the elevator doors 22 and facing the elevator 20 as the entry point 30 according to the user's instruction. As shown in fig. 4, the robot 10 may be in communication connection with the elevator 20 and the terminal device 50; the user may view the floor map and the current position of the robot 10 in the relevant terminal application (APP) through the terminal device 50, and when the user confirms that the robot 10 is located right in front of the elevator door 22 at a suitable distance and facing the elevator 20, the terminal device 50 may be operated to send a command to the robot 10, and the robot 10 marks the current position as the entry point 30. In one example, the terminal device 50 includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, and the like.
Of course, the user may also directly operate the robot 10 to generate instructions from which the robot 10 marks the current position as the landing point 30.
In some embodiments, the entry point 30 may be stored in the robot 10, the terminal device 50, and/or the server 60. After the entry point 30 is marked in this way, the robot 10 can directly enter the elevator 20 according to the stored position of the entry point 30 when subsequently performing an elevator boarding task at the floor.
In the present embodiment, the entry point 30 is located close to the elevator door 22, and once the robot 10 reaches the entry point 30 it can travel straight toward the elevator, thereby ensuring safe entry into the elevator 20. In step S1, the robot 10 may determine the waiting area 40 according to the position of the entry point 30. The distance between the elevator waiting area 40 and the elevator door 22 is greater than the distance between the elevator entering point 30 and the elevator door 22; that is, the robot 10 waits for the elevator 20 in the elevator waiting area 40, which is farther from the elevator door 22, so that while waiting the robot 10 does not block passengers getting on or off the elevator 20 or waiting for the elevator, ensuring the passengers' elevator riding experience.
In some embodiments, the distance d1 from the access point 30 to the elevator door 22 may be 0.1 meters to 0.5 meters.
Therefore, the elevator entering point 30 is close to the elevator door 22, and the robot 10 can directly enter the elevator 20 along a straight line after navigating to the elevator entering point 30, so that the situation that the robot 10 deviates from the elevator door 22 due to insufficient direction control precision when entering the elevator is avoided.
In one example, the distance d1 from the access point 30 to the elevator door 22 may be 0.1 meters to 0.3 meters.
In certain embodiments, step S1 includes:
the size and the position of the waiting area 40 are determined according to configuration parameters and the position of the elevator entering point 30, wherein the configuration parameters can comprise the distance between the waiting area 40 and the elevator entering point 30, the shape and the size of the waiting area 40 and the direction angle of the central position 42 of the waiting area 40 relative to the elevator entering point 30.
Specifically, the elevator waiting determination module 13 may be configured to determine the size and the position of the elevator waiting area 40 according to the configuration parameters and the position of the elevator entering point 30.
In this way, the robot 10 may store the configuration parameters of the waiting area 40, and the robot 10 may determine the size and the position of the waiting area 40 according to the configuration parameters. It can be understood that after relevant personnel scan the floor map, the location of each area in the floor can be determined by establishing a world coordinate system, and after the entry point 30 is calibrated, the robot 10 can determine the coordinates of the entry point 30. In general, the waiting area 40 has a regular shape and the center position 42 is the geometric center of the waiting area 40; when the waiting area 40 is determined, the coordinates of its center position 42 can be determined according to the coordinates of the entry point 30 and the configuration parameters.
In one example, the central position 42 of the landing area 40 faces the elevator doors 22, the landing area 40 may be rectangular and one side may be parallel to the elevator doors 22, the distance between the landing area 40 and the landing point 30 (i.e., the minimum distance between the landing point 30 and the landing area 40) L1 may be 0.5 m, the length L2 of the landing area 40 may be 1 m, the width L3 may be 1 m, the coordinates of the landing point 30 may be (0.2, 0.5), the direction angle θ of the central position 42 of the landing area 40 relative to the landing point 30 may be 0, and the connecting line of the landing point 30 and the central position 42 of the landing area 40 is perpendicular to one side of the landing area 40 close to the landing point 30. At this time, the distance between the approach point 30 and the central position 42 of the waiting area 40 is 1 m (L1+ L2/2), and thus the coordinates of the central position 42 of the waiting area 40 can be calculated by the following formula:
X = X0 + (L1 + L2/2)cos(θ),
Y = Y0 + (L1 + L2/2)sin(θ);
where X0 and Y0 are the abscissa and ordinate of the entry point 30, respectively.
That is, X = 0.2 + (0.5 + 1/2)cos(0) = 1.2 and Y = 0.5 + (0.5 + 1/2)sin(0) = 0.5, so the coordinates of the central position 42 of the waiting area 40 may be (1.2, 0.5). In this way, the robot 10 can take the coordinates of the center position 42 as the geometric center to define the range of the waiting area 40.
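The calculation can be expressed as a small helper; the function name and argument order are assumptions for illustration, while the numbers reproduce the example above.

```python
import math

def waiting_area_center(x0, y0, l1, l2, theta):
    """Center of the waiting area from the entry-point coordinates (x0, y0),
    the gap l1 between entry point and area, the area length l2, and the
    direction angle theta of the center relative to the entry point."""
    d = l1 + l2 / 2.0
    return (x0 + d * math.cos(theta), y0 + d * math.sin(theta))

# With the example values in the text: entry point (0.2, 0.5), L1 = 0.5 m,
# L2 = 1 m, theta = 0  ->  center (1.2, 0.5).
print(waiting_area_center(0.2, 0.5, 0.5, 1.0, 0.0))  # (1.2, 0.5)
```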
Of course, in other embodiments, the configuration parameters may not be limited to the above discussed embodiments, the ladder waiting area 40 may also be selected from a circle, an ellipse, a regular pentagon, or other polygons according to actual needs, and the size of the ladder waiting area 40 may be flexibly configured according to actual needs, and is not specifically limited herein.
In the present embodiment, the configuration parameters may be set by relevant staff, for example, a test engineer, according to the model and size of the robot 10, the size of the space available for the floor to wait for the elevator, and the like, and stored in the robot 10. When the configuration parameters need to be modified due to the change of the use environment in the use process, the configuration parameters can be modified by an after-sales engineer according to the actual needs.
Referring to fig. 6, in some embodiments, the elevator waiting environment information includes obstacle information in the elevator waiting area 40, and step S3 includes:
step S32, selecting the center position 42 as a target position, and determining whether it is safe for the robot 10 to move to the target position according to the obstacle information;
step S34, when it is safe for the robot 10 to move to the target position, controlling the robot 10 to move to the target position; or
In step S36, when it is not safe to move the robot 10 to the target position, the target position is reselected.
In particular, the robot 10 may include a determination module 16. The step S32 may be implemented by the determination module 16, the step S34 may be implemented by the control module 14, and the step S36 may be implemented by the determination module 16. That is, the determination module 16 may be configured to select the center position 42 as the target position, and determine whether it is safe for the robot 10 to move to the target position according to the obstacle information. The control module 14 may be used to control the robot 10 to move to the target position when it is safe for the robot 10 to move to the target position. The determination module 16 may be used to reselect the target location when it is unsafe for the robot 10 to move to the target location.
In this embodiment, the determination module 16 may determine whether the elevator waiting area 40 is safe by analyzing images to determine whether an obstacle exists in the elevator waiting area 40. The central position 42 of the waiting area 40 faces the elevator door 22, and the robot 10 can conveniently enter the elevator 20 from the central position 42 of the waiting area 40. In this way, the robot 10 preferentially takes the center position 42 of the waiting area 40 as the target position and tries to navigate to it; the determination module 16 determines, based on the obstacle information in the waiting area 40, whether it is safe for the robot 10 to move to the center position 42 of the waiting area 40, that is, whether the center position 42 is reachable. When it is safe, the robot 10 moves to the center position 42 of the waiting area 40 and waits for the elevator 20 facing the entry point 30. When it is unsafe for the robot 10 to move to the center position 42 of the waiting area 40, the target position is reselected and a suitable waiting position is found.
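A hedged sketch of this center-first selection is shown below; representing obstacles as 2D points and using a fixed clearance radius are assumptions made for the example.

```python
import math

def is_position_safe(position, obstacles, clearance=0.5):
    """A position is treated as safe if every detected obstacle point keeps
    at least `clearance` metres away from it (clearance is an assumed value)."""
    return all(math.dist(position, obs) >= clearance for obs in obstacles)

def choose_target(center, obstacles):
    # Prefer the center of the waiting area; reselect only if it is unsafe.
    if is_position_safe(center, obstacles):
        return center
    return None  # triggers the reselection procedure (step S36)
```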
Referring to fig. 7, in some embodiments, step S36 includes:
step S361, generating a plurality of waiting points 44 in the waiting area 40;
step S362, determining whether it is safe for the robot 10 to move to each of the waiting spots 44 according to the obstacle information;
step S363, determining the weight of each elevator waiting point 44 whose determination result is safe according to the current position of the robot 10 and the center position 42; and
and step S364, selecting the corresponding waiting point 44 as the target position according to the weight, and controlling the robot 10 to move to the target position.
Specifically, step S361, step S362 and step S363 may be implemented by the determination module 16, and step S364 may be implemented by the determination module 16 and the control module 14. That is, the determination module 16 may be configured to generate a plurality of elevator waiting points 44 in the elevator waiting area 40; to judge, according to the obstacle information, whether it is safe for the robot 10 to move to each elevator waiting point 44; and to determine the weight of each elevator waiting point 44 whose determination result is safe, based on the current position of the robot 10 and the center position 42. The determination module 16 may be further configured to select the corresponding waiting point 44 as the target position according to the weight, and the control module 14 may be configured to control the robot 10 to move to the target position.
In this manner, a plurality of waiting points 44 are generated in the waiting area 40, and the safety of the robot 10 moving to each waiting point 44 is determined. When it is judged safe for the robot 10 to move to a certain elevator waiting point 44, the judgment result of that waiting point 44 is recorded as safe; when it is judged unsafe, the judgment result of that waiting point 44 is recorded as unsafe. At a waiting point 44 whose judgment result is safe, the contour of the robot 10 does not interfere with an obstacle or keeps a certain distance from it. The determination module 16 may determine the weight of each safe elevator waiting point 44 using the current position of the robot 10 and the central position 42 of the elevator waiting area 40, and select the corresponding elevator waiting point 44 as the target position according to the weights, which both ensures the safety of waiting and facilitates the entry of the robot 10 into the elevator.
In certain embodiments, step S36 further includes:
in step S365, when the determination results at the plurality of elevator waiting spots 44 are all unsafe, it is determined that the elevator boarding task fails and the elevator boarding task is abandoned.
Specifically, step S365 may be implemented by the determination module 16. That is, the determination module 16 may be configured to determine that the elevator taking task fails and abandon the elevator taking task when the determination results of the plurality of elevator waiting points 44 are all unsafe.
It can be understood that, when the judgment results of the plurality of elevator waiting points 44 are all unsafe, the elevator waiting area 40 is occupied by the obstacle, the robot 10 cannot reach the elevator waiting area 40 to wait for the elevator, and the robot 10 can judge that the task fails and abandon the elevator taking task.
In some embodiments, the robot 10 includes the communication module 15, and when the robot 10 fails to take the elevator for a task, the communication module 15 may upload the task status to the server 60 and/or the terminal device 50, so that the relevant personnel can know about the operation of the robot 10, so that the relevant personnel can solve the elevator taking problem of the robot 10 or schedule the subsequent tasks of the robot 10.
In some embodiments, the robot 10 may send a notification signal when the determination results of the plurality of elevator waiting points 44 are all unsafe.
The robot 10 may include a speaker and/or a display screen, and the control module 14 may control the speaker and/or the display screen to emit a prompt signal. For example, the control module 14 may control the speaker to play a prompt such as "Please leave the elevator waiting area" or "Please remove the object from the elevator waiting area" to prompt passengers within the waiting area 40 to leave the waiting area 40 or to prompt relevant personnel to move obstacles out of the waiting area 40. Likewise, the control module 14 may control the display screen to display "Please leave the elevator waiting area" or "Please remove the object from the elevator waiting area" to the same effect.
Of course, the manner in which the robot 10 sends the prompting signal is not limited to the above-discussed embodiments, and an appropriate prompting manner may be selected according to actual needs.
In some embodiments, the weight is the sum of the distance s1 from the current position of the robot 10 to each elevator waiting point 44 whose determination result is safe and the distance s2 from the corresponding elevator waiting point 44 to the center position 42, and step S364 includes:
the elevator waiting spot 44 with the smallest weight is selected as the target position.
Specifically, the determining module 16 may be configured to select the elevator waiting spot 44 with the smallest weight as the target location.
Selecting the elevator waiting point 44 with the smallest weight, that is, the smallest sum of the distance s1 from the current position of the robot 10 to a safe elevator waiting point 44 and the distance s2 from that waiting point 44 to the center position 42, makes the reselected target position closer to the center position 42 of the waiting area 40 and shortens the distance the robot 10 must move to the target position, which facilitates the movement and navigation of the robot 10 and helps it wait for and enter the elevator.
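A compact sketch of the reselection procedure (generating candidate waiting points, filtering out unsafe ones, and picking the minimum-weight point) might look as follows; the grid spacing, clearance radius, and function names are assumptions.

```python
import math

def generate_waiting_points(center, length, width, step=0.25):
    """Candidate waiting points on a grid inside a rectangular waiting area
    centred at `center`; the 0.25 m spacing is an assumed value."""
    cx, cy = center
    xs = [cx - length / 2 + i * step for i in range(int(length / step) + 1)]
    ys = [cy - width / 2 + j * step for j in range(int(width / step) + 1)]
    return [(x, y) for x in xs for y in ys]

def reselect_target(current_pos, center, candidates, obstacles, clearance=0.5):
    safe = [p for p in candidates
            if all(math.dist(p, o) >= clearance for o in obstacles)]
    if not safe:
        return None  # all candidates unsafe: the boarding task fails (step S365)
    # weight = distance from the robot to the point + distance from the point to the center
    return min(safe, key=lambda p: math.dist(current_pos, p) + math.dist(p, center))
```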
Referring to fig. 8, in some embodiments, step S364 includes:
step S3642, controlling the robot 10 to move and detecting the obstacle information of the elevator waiting area 40 in real time;
step S3644, it is determined whether it is safe for the robot 10 to move to the target position based on the obstacle information; and
in step S3646, when it is detected that the target position is unsafe during the movement of the robot 10, the target position is newly determined.
Specifically, step S3642 may be implemented by the control module 14 and the detection module 17, and step S3644 and step S3646 may be implemented by the determination module 16. That is, the control module 14 may be used to control the robot 10 to move, and the detection module 17 may be used to detect the obstacle information of the waiting area 40 in real time. The determination module 16 may be configured to determine whether it is safe for the robot 10 to move to the target position according to the obstacle information, and to re-determine the target position when it is detected that the target position is unsafe during the movement of the robot 10.
In this way, the robot 10 detects in real time whether the target position is safe while moving toward the elevator waiting area 40, and reselects a suitable waiting position when the target position becomes unsafe, ensuring that the robot 10 waits safely and avoiding situations in which the movement of waiting passengers or obstacles affects the waiting of the robot 10.
Referring to fig. 9, in some embodiments, the control method includes:
step S5, the robot 10 sends an elevator boarding request to the elevator 20 after moving to the target position;
step S6, when the elevator 20 arrives and opens the door within the first predetermined time, controlling the robot 10 to move to the entry point 30 so that the robot 10 can enter the elevator 20 from the entry point 30; or
In step S7, when the elevator 20 does not reach or open the door within the first predetermined time, it is determined that the boarding task failed and the boarding task is abandoned.
Specifically, step S5 may be implemented by the control module 14 and the communication module 15, step S6 may be implemented by the control module 14, and step S7 may be implemented by the determination module 16. That is, the control module 14 may be used to control the robot 10 to move to the waiting area 40 after receiving the boarding task, and the communication module 15 may be used to transmit a boarding request to the elevator 20. The determination module 16 may be configured to determine whether the elevator 20 arrives and opens the door within the first predetermined time, and the control module 14 may be further configured to control the robot 10 to move to the entry point 30 when the elevator 20 arrives and opens the door within the first predetermined time, so that the robot 10 can enter the elevator 20 from the entry point 30. The determining module 16 may be configured to determine that the elevator taking task fails and abandon the elevator taking task when the elevator 20 does not arrive or is not opened within the first predetermined time.
In this way, the robot 10 can wait in the waiting area 40 far from the elevator 20 when performing the boarding task, and after the elevator 20 arrives, the robot 10 first navigates to the boarding point 30 near the elevator 20 and then moves straight in the direction of the elevator 20 along the boarding point 30, thereby ensuring that the robot 10 can safely enter the elevator 20. In the case where the elevator 20 does not arrive for a long time or the door is not opened, the robot 10 may determine that the task has failed and abandon the boarding task.
In some embodiments, the first preset time may be 5 minutes. That is, when the elevator 20 does not reach the floor where the robot 10 is located within 5 minutes or the door is not opened, it is determined that the robot 10 has failed the boarding task of this time. Of course, in other embodiments, the first preset time may be flexibly configured according to actual needs, and is not specifically limited herein.
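A possible sketch of this request-and-wait logic with the first preset time is given below; `send_boarding_request`, `door_is_open`, and the one-second polling interval are assumed names and values.

```python
import time

FIRST_PRESET_TIME_S = 5 * 60  # 5 minutes, as in the example above

def request_and_wait(robot, elevator, poll_interval_s=1.0):
    robot.send_boarding_request(elevator)          # step S5
    deadline = time.monotonic() + FIRST_PRESET_TIME_S
    while time.monotonic() < deadline:
        if elevator.door_is_open():                # elevator arrived and opened
            return True                            # proceed to the entry point (step S6)
        time.sleep(poll_interval_s)
    return False                                   # task fails and is abandoned (step S7)
```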
In some embodiments, when the robot 10 fails to take the elevator for a task, the communication module 15 may upload the task status to the server 60 and/or the terminal device 50, so that the relevant personnel can know the operation condition of the robot 10, so that the relevant personnel can solve the elevator taking problem of the robot or arrange the subsequent tasks of the robot. In other embodiments, after the robot 10 uploads the task status to the server 60 and/or the terminal device 50, if the user instruction is not received within a certain period of time, the robot 10 may call the elevator 20 again, automatically restart the elevator taking task, and send an elevator taking request to the elevator 20 through the communication module 15.
Referring to fig. 10, in some embodiments, step S6 includes:
step S62, controlling the robot 10 to move to the landing point 30;
step S64, planning a boarding route along a straight line according to the boarding point 30; and
in step S66, robot 10 is controlled to enter elevator 20 along the approach route.
Specifically, steps S62, S64, and S66 may be implemented by the control module 14. That is, the control module 14 may be used for controlling the robot 10 to move to the entry point 30, and for planning an entry route along a straight line according to the entry point 30, and for controlling the robot 10 to enter the elevator 20 along the entry route.
While waiting in the waiting area 40, the robot 10 may face the entry point 30, and when the elevator 20 reaches the floor where the robot 10 is located and opens the door, the robot 10 first navigates to the entry point 30, then plans an entry route in a straight line from the entry point 30 toward the elevator door 22, and enters the elevator 20 along the entry route.
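A minimal sketch of planning the straight entry route from the entry point toward the elevator door; the waypoint spacing and the way the door position is supplied are assumptions.

```python
def plan_entry_route(entry_point, door_point, spacing=0.1):
    """Waypoints on the straight segment from the entry point toward a point
    at the elevator door; the robot then follows them in order."""
    (x0, y0), (x1, y1) = entry_point, door_point
    dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    n = max(1, int(dist / spacing))
    return [(x0 + (x1 - x0) * i / n, y0 + (y1 - y0) * i / n) for i in range(n + 1)]
```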
In one example, when the elevator 20 arrives at the floor where the robot 10 is located and the door is opened, the detection module 17 captures an image of the elevator door 22 and determines through image processing whether the elevator 20 is open. In another example, when the elevator 20 reaches the floor where the robot 10 is located and the door is opened, the elevator 20 may send a door-opening signal to the robot 10 according to the elevator taking request of the robot 10, and the robot 10 starts to navigate to the entry point 30 after receiving the door-opening signal.
In some embodiments, step S62 may be performed by the robot 10 first detecting whether there is an obstacle at the landing point 30 through the detection module 17, and determining whether the landing point 30 is safe through the determination module 16, and controlling the robot 10 to navigate to the landing point 30 when the landing point 30 is safe. Thus, the robot 10 is ensured to safely take the elevator.
Accordingly, when there is an obstacle at the entry point 30, the control module 14 may control the robot 10 to send a prompt signal, and the robot 10 may interrupt the waiting task so that the passenger at the entry point 30 leaves the entry point 30 or prompt the relevant person to move the obstacle out of the entry point 30. In the case where the unsafe situation at the landing point 30 is continuously detected for a period of time, the determining module 16 may determine that the elevator mission has failed and abandon the elevator boarding mission.
Referring to fig. 11, in some embodiments, step S66 includes:
step S662, detecting an approach route to determine whether an obstacle continuously exists in the approach route within a second preset time;
step S664, when it is detected within a second preset time that there is no obstacle in the approach route, controlling the robot 10 to enter the elevator 20 along the approach route; or
Step S666, when it is continuously detected that there is an obstacle on the approach route within the second preset time, the robot 10 is controlled to return to the waiting area 40, and the elevator taking task is determined to fail and is abandoned.
Specifically, step S662 may be implemented by the detection module 17 and the determination module 16, step S664 may be implemented by the control module 14, and step S666 may be implemented by both the control module 14 and the determination module 16. That is, the detection module 17 may be used to detect an approach route. The judging module 16 may be configured to judge whether the obstacle continuously exists on the approach route within the second preset time. The control module 14 may be configured to control the robot 10 to enter the elevator 20 along the approach path when it is detected that there is no obstacle in the approach path within a second preset time. When an obstacle is continuously detected on the approach route within the second preset time, the control module 14 may be configured to control the robot 10 to return to the waiting area 40, and the determination module 16 may be configured to determine that the elevator taking task fails and abandon the elevator taking task.
In this way, the robot 10 can safely enter the elevator by detecting whether an obstacle exists in the elevator entering route. Likewise, the robot 10 may capture an image of the inside of the elevator 20 through a camera, and determine whether an obstacle exists along the approach route from the image. When there is an obstacle in the approach path, robot 10 may issue a prompt to prompt passengers in elevator 20 to avoid. When an obstacle is continuously detected on the approach route within the second preset time, which indicates that the elevator 20 is fully loaded or the space in the elevator is insufficient, the control module 14 can control the robot 10 to return to the waiting area 40, and the judgment module 16 can judge that the elevator taking task fails and abandon the elevator taking task. The elevator riding control is realized through the elevator entering state of the robot 10 in the specified second preset time, and meanwhile, the phenomenon that the operation of the elevator 20 is influenced due to the overlong elevator entering time of the robot 10 is avoided.
In certain embodiments, the second preset time may be 1 minute. That is, when the robot 10 does not enter the elevator 20 and reach the target position within 1 minute, the robot 10 returns to the waiting area 40, and it is determined that the robot 10 has failed this elevator boarding task. Of course, in other embodiments, the second preset time may be flexibly configured according to actual needs, and is not specifically limited herein.
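The entry attempt bounded by the second preset time could be sketched as below; the sensing and motion method names and the prompt text are assumptions.

```python
import time

SECOND_PRESET_TIME_S = 60  # 1 minute, as in the example above

def try_enter_elevator(robot, entry_route):
    deadline = time.monotonic() + SECOND_PRESET_TIME_S
    while time.monotonic() < deadline:
        if robot.route_is_clear(entry_route):          # no obstacle on the route
            robot.follow(entry_route)                  # step S664: enter along the route
            return True
        robot.prompt("Please make way for the robot")  # assumed prompt text
        time.sleep(1.0)
    robot.return_to_waiting_area()                     # step S666: retreat and abandon
    return False
```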
Referring to fig. 12, in some embodiments, step S6 further includes:
in step S68, when the robot 10 enters the elevator 20 and the distance d2 from the robot 10 to the elevator door 22 is greater than a predetermined value, it is confirmed that the boarding task is completed.
Specifically, step S68 may be implemented by the determination module 16. That is, the determination module 16 may confirm that the boarding task is completed when the robot 10 enters the elevator 20 and the distance d2 from the robot 10 to the elevator door 22 is greater than a predetermined value.
In this way, after the robot 10 enters the elevator 20, a certain predetermined distance, preferably 0.6 m, needs to be maintained between the robot 10 and the elevator door 22, so that the elevator door 22 is prevented from interfering with the robot 10 when the elevator 20 closes, which helps ensure the safety of the robot 10 when the door closes.
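A small check matching this completion criterion; the 0.6 m clearance comes from the text above, while the function name is an assumption.

```python
def boarding_complete(distance_to_door_m, min_clearance_m=0.6):
    """Boarding is considered complete once the robot is inside the elevator
    and at least `min_clearance_m` from the elevator door (step S68)."""
    return distance_to_door_m > min_clearance_m
```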
In some embodiments, the number of the robots 10 may be multiple, and when multiple robots 10 perform the elevator taking tasks, the elevators may be entered in sequence according to the order of the elevator taking tasks and/or the task priority.
Specifically, a plurality of robots 10 may communicate with each other to determine an elevator entering sequence, or a plurality of robots 10 may communicate with the server 60 and/or the terminal device 50, when the robots 10 perform an elevator taking task, elevator taking task states may be sent to the server 60 and/or the terminal device 50 in real time, and the server 60 and/or the terminal device 50 may arrange the elevator entering sequence of the robots 10 according to the elevator taking task states of the plurality of robots 10, thereby ensuring the elevator entering safety of the robots 10.
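One possible (assumed) way for a server or the robots themselves to order entry by task priority and request order:

```python
def entry_order(robots):
    """Sort waiting robots: higher task priority first, earlier request first.
    Each robot is assumed to carry `task_priority` and `request_time` fields."""
    return sorted(robots, key=lambda r: (-r.task_priority, r.request_time))
```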
In some embodiments, there may be multiple elevators 20 per floor, and the elevator determination module 11 may determine the positions of the multiple elevators of the corresponding floor from the scanned floor map. The marking module 12 may mark the location at which the robot 10 is directly in front of each elevator door 22 and directly facing the elevator 20 as the entry point 30 of that elevator 20. The elevator waiting determination module 13 can determine the elevator waiting area 40 according to the entry points 30 corresponding to the plurality of elevators 20.
Specifically, after relevant personnel scan the floor map, the plurality of elevators 20 may be numbered and marked, and the robot 10 may automatically identify the corresponding elevator 20, thereby performing the elevator riding task.
In one example, the floors that can be reached by the plurality of elevators 20 are different, and the robot 10 can automatically recognize that the corresponding elevator 20 is taken according to the destination floor of the boarding task. In another example, the plurality of elevators 20 may be divided into passenger elevators, robot elevators, and/or shared elevators, etc., and likewise, the robot 10 may automatically recognize that the corresponding elevator 20 is taken according to actual needs.
In some embodiments, the elevator waiting area 40 may be determined by determining one elevator waiting area 40 for each entry point 30; for example, when the floors reached by different elevators 20 are different, each entry point 30 corresponds to one elevator waiting area 40, so that robots 10 with different destination floors can wait for the elevators 20 in different elevator waiting areas 40, ensuring that the robots 10 enter the elevators safely and in an orderly manner. Alternatively, one elevator waiting area 40 can be jointly determined from a plurality of entry points 30; after an elevator 20 arrives, the robot 10 can identify the corresponding elevator 20 and obtain the position of the corresponding entry point 30, so that the corresponding elevator 20 can be automatically selected and taken according to actual needs.
Referring to fig. 13, a robot 10 provided in an embodiment of the present disclosure includes a processor 18, a readable storage medium 19, and computer-executable instructions 191 stored on the readable storage medium 19 and executable on the processor 18, where when the computer-executable instructions 191 are executed by the processor 18, the processor 18 is caused to execute the control method of any one of the above embodiments.
In one example, the computer-executable instructions 191, when executed by the processor 18, cause the processor 18 to perform the steps of:
step S1, after receiving the boarding task, determining a waiting area of the robot according to the elevator entering point, wherein the distance from the waiting area to the elevator is greater than the distance from the elevator entering point to the elevator;
step S2, detecting the elevator waiting environment information of the elevator waiting area;
and step S3, determining the target position of the robot waiting for the elevator in the elevator waiting area according to the elevator waiting environment information and the central position 42 of the elevator waiting area, and controlling the robot to move to the target position.
By executing the computer-executable instructions 191 through the processor 18, the robot 10 of the embodiment of the application can wait in an elevator waiting area far from the elevator; after the elevator arrives, the robot first navigates to the elevator entering point near the elevator and then advances in a straight line from the elevator entering point towards the elevator, which ensures that the robot can enter the elevator safely and thus realizes safe navigation for the elevator taking task. When many passengers are waiting in front of the elevator, the robot can also find a suitable elevator waiting position according to the elevator waiting environment information in the elevator waiting area, so that it can enter the elevator as conveniently as possible while keeping a safe distance from the passengers.
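The three steps S1 to S3 can be condensed into the sketch below; determine_waiting_area(), detect_waiting_environment(), choose_target_position() and move_to() are hypothetical calls standing in for the elevator waiting determination, detection and control modules described in the embodiments.

    def run_boarding_task(robot, entering_point):
        # S1: derive an elevator waiting area farther from the elevator than the entering point.
        waiting_area = robot.determine_waiting_area(entering_point)
        # S2: sense obstacles and passengers inside the elevator waiting area.
        environment = robot.detect_waiting_environment(waiting_area)
        # S3: pick a target position from the environment information and the area's
        #     central position, then move there and wait for the elevator.
        target = robot.choose_target_position(environment, waiting_area.central_position)
        robot.move_to(target)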
The embodiments of the present application also provide a non-volatile computer-readable storage medium 19, where the readable storage medium 19 includes computer-executable instructions 191, and when the computer-executable instructions 191 are executed by one or more processors 18, the processor 18 is caused to execute the robot control method of any one of the above embodiments.
Referring to fig. 14, one or more processors 18 may be coupled to the readable storage medium 19 through a bus, and the readable storage medium 19 stores the computer-executable instructions 191, which are executed by the processors 18 to perform the robot control method of the embodiments of the present disclosure, so that the robot 10 can safely perform the elevator riding task. The robot 10 may also be connected to a network via the communication module 15 to communicate with the server 60 and/or the terminal device 50, and connected to input/output devices via an input/output interface to collect environment information or output control status signals.
In the description herein, reference to the term "one embodiment," "some embodiments," or "an example" etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium. The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (10)

1. A robot control method, comprising:
after receiving a boarding task, determining a waiting area of the robot according to an elevator entering point, wherein the distance from the waiting area to an elevator is greater than the distance from the elevator entering point to the elevator;
detecting the elevator waiting environment information of the elevator waiting area;
and determining a target position of the robot waiting for the elevator in the elevator waiting area according to the elevator waiting environment information and the central position of the elevator waiting area, and controlling the robot to move to the target position.
2. The robot control method according to claim 1, characterized by comprising:
determining the elevator position of the corresponding floor according to the scanned floor map; and
marking the position where the robot is directly in front of the elevator door and facing the elevator as the elevator entering point, wherein the distance from the elevator entering point to the elevator door is 0.1-0.5 m.
3. The robot control method according to claim 1, wherein the step of determining the waiting area of the robot according to the elevator entering point comprises:
determining the size and the position of the elevator waiting area according to configuration parameters and the position of the elevator entering point, wherein the configuration parameters comprise the distance between the elevator waiting area and the elevator entering point, the shape and the size of the elevator waiting area, and the direction angle of the central position relative to the elevator entering point.
4. The robot control method according to claim 1, wherein the elevator waiting environment information includes obstacle information in the elevator waiting area, and the step of determining the target position of the robot waiting for the elevator in the elevator waiting area according to the elevator waiting environment information and the central position of the elevator waiting area comprises:
selecting the central position as the target position, and judging, according to the obstacle information, whether the robot can move to the target position safely;
when the robot can move to the target position safely, controlling the robot to move to the target position; or
reselecting the target position when the robot cannot move to the target position safely.
5. The robot control method according to claim 4, wherein the step of reselecting the target position when the robot cannot move to the target position safely comprises:
generating a plurality of elevator waiting points in the elevator waiting area;
judging whether the robot moves to each elevator waiting point safely or not according to the obstacle information;
determining a weight for each elevator waiting point judged to be safe, according to the current position of the robot and the central position; and
selecting a corresponding elevator waiting point as the target position according to the weight, and controlling the robot to move to the target position.
6. The robot control method according to claim 5, wherein the weight is the sum of the distance from the current position of the robot to each elevator waiting point judged to be safe and the distance from the corresponding elevator waiting point to the central position, and the step of selecting a corresponding elevator waiting point as the target position according to the weight comprises:
selecting the elevator waiting point with the smallest weight as the target position.
7. The robot control method according to claim 5, wherein the step of selecting a corresponding elevator waiting point as the target position according to the weight and controlling the robot to move to the target position comprises:
controlling the robot to move and detecting the obstacle information of the elevator waiting area in real time;
judging, according to the obstacle information, whether the robot can move to the target position safely; and
re-determining the target position when moving to the target position is detected to be unsafe while the robot is moving.
8. A robot, characterized by comprising:
an elevator waiting determination module, configured to determine an elevator waiting area of the robot according to an elevator entering point after an elevator taking task is received, wherein the distance from the elevator waiting area to an elevator is greater than the distance from the elevator entering point to the elevator;
a detection module, configured to detect elevator waiting environment information of the elevator waiting area; and
a control module, wherein the elevator waiting determination module is further configured to determine a target position of the robot waiting for the elevator in the elevator waiting area according to the elevator waiting environment information and the central position of the elevator waiting area, and the control module is configured to control the robot to move to the target position.
9. A robot comprising a processor, a readable storage medium, and computer-executable instructions stored on the readable storage medium and executable on the processor, the computer-executable instructions, when executed by the processor, causing the processor to perform the control method of any of claims 1-7.
10. A non-transitory computer-readable storage medium, comprising computer-executable instructions that, when executed by one or more processors, cause the processors to perform the robot control method of any one of claims 1-7.
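To illustrate the weight rule of claims 5 and 6, the sketch below selects, among the elevator waiting points judged to be safe, the point that minimizes the sum of the distance from the robot's current position to the point and the distance from the point to the central position. The 2D coordinate tuples, the Euclidean metric and the is_safe predicate are assumptions made for illustration only.

    import math

    def dist(a, b):
        """Euclidean distance between two 2D points given as (x, y) tuples."""
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def pick_target(current_pos, central_pos, waiting_points, is_safe):
        """Return the safe elevator waiting point with the smallest weight, or None."""
        safe_points = [p for p in waiting_points if is_safe(p)]
        if not safe_points:
            return None
        # weight = distance(robot, point) + distance(point, central position)
        return min(safe_points, key=lambda p: dist(current_pos, p) + dist(p, central_pos))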
CN201911252566.2A 2019-12-09 2019-12-09 Robot control method, robot, and readable storage medium Active CN110861094B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911252566.2A CN110861094B (en) 2019-12-09 2019-12-09 Robot control method, robot, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911252566.2A CN110861094B (en) 2019-12-09 2019-12-09 Robot control method, robot, and readable storage medium

Publications (2)

Publication Number Publication Date
CN110861094A true CN110861094A (en) 2020-03-06
CN110861094B CN110861094B (en) 2021-03-23

Family

ID=69657660

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911252566.2A Active CN110861094B (en) 2019-12-09 2019-12-09 Robot control method, robot, and readable storage medium

Country Status (1)

Country Link
CN (1) CN110861094B (en)

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111728532A (en) * 2020-06-01 2020-10-02 珠海市一微半导体有限公司 Control method for robot to enter elevator, laser robot and chip
CN111882710A (en) * 2020-07-21 2020-11-03 上海木木聚枞机器人科技有限公司 Robot-based access control exception handling method, device and system
CN112069072A (en) * 2020-09-07 2020-12-11 上海高仙自动化科技发展有限公司 Robot simulation control system, method, server and storage medium
CN112130574A (en) * 2020-09-30 2020-12-25 拉扎斯网络科技(上海)有限公司 Robot control method, device, electronic device and computer storage medium
CN112207832A (en) * 2020-10-16 2021-01-12 拉扎斯网络科技(上海)有限公司 Method for entering bearing equipment, mobile equipment, electronic equipment and storage medium
CN112405548A (en) * 2021-01-22 2021-02-26 苏州优智达机器人有限公司 Method and system for interaction between robot and elevator and robot
CN112537705A (en) * 2020-03-31 2021-03-23 深圳优地科技有限公司 Robot elevator taking scheduling method and device, terminal equipment and storage medium
CN112537703A (en) * 2020-03-31 2021-03-23 深圳优地科技有限公司 Robot elevator taking method and device, terminal equipment and storage medium
CN112537704A (en) * 2020-03-31 2021-03-23 深圳优地科技有限公司 Robot elevator taking scheduling method, terminal and computer readable storage medium
CN112612273A (en) * 2020-12-21 2021-04-06 南方电网电力科技股份有限公司 Routing inspection robot obstacle avoidance path planning method, system, equipment and medium
CN112743549A (en) * 2021-02-02 2021-05-04 苏州优智达机器人有限公司 Robot control method, system, robot and program product
CN113093751A (en) * 2021-04-02 2021-07-09 北京云迹科技有限公司 Position control method and device for elevators such as robot and storage medium
CN113307113A (en) * 2021-05-26 2021-08-27 福建汉特云智能科技有限公司 Method for robot to take elevator automatically and storage medium
CN113370216A (en) * 2021-06-29 2021-09-10 上海有个机器人有限公司 Ladder waiting method and device for robot to take ladder, robot and storage medium
CN113401737A (en) * 2020-03-16 2021-09-17 奥的斯电梯公司 Control system and method for elevator waiting position of machine passenger
EP3882193A1 (en) * 2020-03-16 2021-09-22 Otis Elevator Company Interaction safety control between an elevator system and a machine passenger
CN114237245A (en) * 2021-12-14 2022-03-25 北京云迹科技股份有限公司 Method and device for avoiding elevator by robot
CN114397885A (en) * 2021-12-16 2022-04-26 北京三快在线科技有限公司 Method for assisting mobile robot in taking elevator, electronic device and storage medium
CN115043277A (en) * 2022-05-20 2022-09-13 浙江孚宝智能科技有限公司 Control method and system for robot to take elevator and storage medium
CN115366127A (en) * 2022-10-24 2022-11-22 上海思岚科技有限公司 Method and equipment for robot taking elevator for distribution
WO2023216755A1 (en) * 2022-05-11 2023-11-16 深圳市普渡科技有限公司 Robot control method and apparatus, and robot, storage medium and program product

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100080278A (en) * 2009-03-04 2010-07-08 한국생산기술연구원 Robot, robot control method, elevator system linked with robot and control method thereof
CN105858368A (en) * 2016-05-31 2016-08-17 北京云迹科技有限公司 System and method for automatically calling elevator
CN105881553A (en) * 2016-05-26 2016-08-24 深圳市神州云海智能科技有限公司 Method and device for realizing autonomous operation of corridor cleaning robot and robot
JP2017033450A (en) * 2015-08-05 2017-02-09 日本精工株式会社 Moving robot
JP2017220123A (en) * 2016-06-09 2017-12-14 パナソニックIpマネジメント株式会社 Mobile robot
CN108002154A (en) * 2017-11-22 2018-05-08 上海思岚科技有限公司 The method that control robot is moved across floor
CN109205413A (en) * 2018-08-07 2019-01-15 北京云迹科技有限公司 Across floor paths planning method and system
US20190084792A1 (en) * 2016-05-17 2019-03-21 Mitsubishi Electric Corporation Elevator system

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100080278A (en) * 2009-03-04 2010-07-08 한국생산기술연구원 Robot, robot control method, elevator system linked with robot and control method thereof
JP2017033450A (en) * 2015-08-05 2017-02-09 日本精工株式会社 Moving robot
US20190084792A1 (en) * 2016-05-17 2019-03-21 Mitsubishi Electric Corporation Elevator system
CN105881553A (en) * 2016-05-26 2016-08-24 深圳市神州云海智能科技有限公司 Method and device for realizing autonomous operation of corridor cleaning robot and robot
CN105858368A (en) * 2016-05-31 2016-08-17 北京云迹科技有限公司 System and method for automatically calling elevator
JP2017220123A (en) * 2016-06-09 2017-12-14 パナソニックIpマネジメント株式会社 Mobile robot
CN108002154A (en) * 2017-11-22 2018-05-08 上海思岚科技有限公司 The method that control robot is moved across floor
CN109205413A (en) * 2018-08-07 2019-01-15 北京云迹科技有限公司 Across floor paths planning method and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
范文杰: "Research on Autonomous Elevator-Riding Control Method for Robots Based on Cloud Model (基于云模型的机器人自主乘梯控制方法研究)", 《微计算机信息》 (Microcomputer Information) *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3882193A1 (en) * 2020-03-16 2021-09-22 Otis Elevator Company Interaction safety control between an elevator system and a machine passenger
CN113401737A (en) * 2020-03-16 2021-09-17 奥的斯电梯公司 Control system and method for elevator waiting position of machine passenger
EP3882196A1 (en) * 2020-03-16 2021-09-22 Otis Elevator Company Control system and method for elevator-waiting position of robotic passenger
CN112537705A (en) * 2020-03-31 2021-03-23 深圳优地科技有限公司 Robot elevator taking scheduling method and device, terminal equipment and storage medium
CN112537704B (en) * 2020-03-31 2022-10-25 深圳优地科技有限公司 Robot elevator taking scheduling method, terminal and computer readable storage medium
CN112537703A (en) * 2020-03-31 2021-03-23 深圳优地科技有限公司 Robot elevator taking method and device, terminal equipment and storage medium
CN112537704A (en) * 2020-03-31 2021-03-23 深圳优地科技有限公司 Robot elevator taking scheduling method, terminal and computer readable storage medium
CN112537703B (en) * 2020-03-31 2023-01-20 深圳优地科技有限公司 Robot elevator taking method and device, terminal equipment and storage medium
CN112537705B (en) * 2020-03-31 2023-04-11 深圳优地科技有限公司 Robot elevator taking scheduling method and device, terminal equipment and storage medium
CN111728532A (en) * 2020-06-01 2020-10-02 珠海市一微半导体有限公司 Control method for robot to enter elevator, laser robot and chip
CN111882710A (en) * 2020-07-21 2020-11-03 上海木木聚枞机器人科技有限公司 Robot-based access control exception handling method, device and system
CN112069072A (en) * 2020-09-07 2020-12-11 上海高仙自动化科技发展有限公司 Robot simulation control system, method, server and storage medium
CN112130574A (en) * 2020-09-30 2020-12-25 拉扎斯网络科技(上海)有限公司 Robot control method, device, electronic device and computer storage medium
CN112207832A (en) * 2020-10-16 2021-01-12 拉扎斯网络科技(上海)有限公司 Method for entering bearing equipment, mobile equipment, electronic equipment and storage medium
CN112207832B (en) * 2020-10-16 2021-10-08 拉扎斯网络科技(上海)有限公司 Method for entering bearing equipment, mobile equipment, electronic equipment and storage medium
CN112612273B (en) * 2020-12-21 2021-08-24 南方电网电力科技股份有限公司 Routing inspection robot obstacle avoidance path planning method, system, equipment and medium
CN112612273A (en) * 2020-12-21 2021-04-06 南方电网电力科技股份有限公司 Routing inspection robot obstacle avoidance path planning method, system, equipment and medium
CN112405548A (en) * 2021-01-22 2021-02-26 苏州优智达机器人有限公司 Method and system for interaction between robot and elevator and robot
CN112743549A (en) * 2021-02-02 2021-05-04 苏州优智达机器人有限公司 Robot control method, system, robot and program product
CN113093751A (en) * 2021-04-02 2021-07-09 北京云迹科技有限公司 Position control method and device for elevators such as robot and storage medium
CN113093751B (en) * 2021-04-02 2023-03-14 北京云迹科技股份有限公司 Position control method and device for elevators such as robot and storage medium
CN113307113A (en) * 2021-05-26 2021-08-27 福建汉特云智能科技有限公司 Method for robot to take elevator automatically and storage medium
CN113370216A (en) * 2021-06-29 2021-09-10 上海有个机器人有限公司 Ladder waiting method and device for robot to take ladder, robot and storage medium
CN114237245A (en) * 2021-12-14 2022-03-25 北京云迹科技股份有限公司 Method and device for avoiding elevator by robot
CN114397885A (en) * 2021-12-16 2022-04-26 北京三快在线科技有限公司 Method for assisting mobile robot in taking elevator, electronic device and storage medium
WO2023216755A1 (en) * 2022-05-11 2023-11-16 深圳市普渡科技有限公司 Robot control method and apparatus, and robot, storage medium and program product
CN115043277A (en) * 2022-05-20 2022-09-13 浙江孚宝智能科技有限公司 Control method and system for robot to take elevator and storage medium
CN115366127A (en) * 2022-10-24 2022-11-22 上海思岚科技有限公司 Method and equipment for robot taking elevator for distribution

Also Published As

Publication number Publication date
CN110861094B (en) 2021-03-23

Similar Documents

Publication Publication Date Title
CN110861094B (en) Robot control method, robot, and readable storage medium
CN110861095B (en) Robot control method, robot, and readable storage medium
JP6931504B2 (en) Flight management system for flying objects
CN104737085B (en) Robot for automatically detecting or handling ground and method
WO2018195955A1 (en) Aircraft-based facility detection method and control device
US9434074B2 (en) Autonomous running control method and device of autonomous running apparatus, and program for autonomous running control device
WO2020182470A1 (en) Route planning in an autonomous device
KR102337531B1 (en) Method and system for specifying node for robot path plannig
US20220357174A1 (en) Stand-alone self-driving material-transport vehicle
CN111728532A (en) Control method for robot to enter elevator, laser robot and chip
CN109311622B (en) Elevator system and car call estimation method
CN109753074A (en) A kind of robot cruise control method, device, control equipment and storage medium
KR20190085545A (en) Vehicle control method and apparatus, and storage medium
JPH10149217A (en) Automatic conveyance device
CN111427374B (en) Airplane berth guiding method, device and equipment
CN112590773B (en) Automatic vehicle warehousing method and device, computer equipment and storage medium
JP6354296B2 (en) Automated guided vehicle traveling system
KR101333496B1 (en) Apparatus and Method for controlling a mobile robot on the basis of past map data
KR102541959B1 (en) Elevator control system and method for controlling elevator which robot and human board
CN115026833A (en) Multilayer map creation method and device and robot
CN114339912A (en) Network connection method, device, equipment and storage medium
CN114199247A (en) Method and device for positioning floor by mobile robot
CN112265647A (en) Remote automatic control device, medium and electronic equipment of boarding bridge
CN112008727A (en) Elevator-taking robot key control method based on bionic vision and elevator-taking robot
CN111538342A (en) Method and device for adjusting robot travel route, robot and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant