CN117519195A - Robot bin gate control method, electronic equipment and storage medium

Robot bin gate control method, electronic equipment and storage medium

Info

Publication number
CN117519195A
CN117519195A
Authority
CN
China
Prior art keywords
door
robot
state
point cloud
cloud data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311635859.5A
Other languages
Chinese (zh)
Inventor
夏舸
张美妙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Uditech Co Ltd
Original Assignee
Uditech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Uditech Co Ltd filed Critical Uditech Co Ltd
Priority to CN202311635859.5A
Publication of CN117519195A


Abstract

The invention relates to the technical field of robots, and in particular to a robot bin gate control method, electronic equipment and a storage medium. The method comprises: during execution of a delivery task by a robot, after the robot reaches a target position, determining a state change of a door associated with the target position, the state change comprising a change from an open state to a closed state and a change from the closed state to the open state; and controlling the opening and closing of the robot bin gate based on the state change of the door. By controlling the robot bin gate to open and close in a timely manner according to the state change of the door, the method improves delivery efficiency, improves the user experience and guarantees delivery safety.

Description

Robot bin gate control method, electronic equipment and storage medium
Technical Field
The present invention relates to the field of robots, and in particular to a robot bin gate control method, an electronic device, and a storage medium.
Background
With the rapid development of artificial intelligence technology, robots are increasingly widely applied across industries, gradually replacing manual labor, operating efficiently, improving production and working efficiency, and saving labor costs. For example, in a hotel, a delivery robot delivers items to a designated place such as a user's room.
Delivery robots used in many industries typically have a warehouse (storage bin), and to meet the demands of delivering and retrieving items, the bin gate of the warehouse often needs to be opened or closed. In the prior art, after the delivery robot arrives at a designated place, the user must manually press the relevant buttons to open and close the bin gate, thereby controlling the bin gate of the delivery robot. However, if the user forgets to press the door-closing button, the delivery robot cannot proceed to the next delivery task in time, which reduces delivery efficiency and leads to a poor user experience. In addition, if the bin gate remains open for a long time, the delivery robot may develop a bin gate fault, affecting delivery safety.
Disclosure of Invention
In view of the above, the embodiment of the invention provides a control method for a robot door, electronic equipment and a storage medium.
The embodiment of the invention provides the following technical scheme:
in a first aspect, an embodiment of the present invention provides a method for controlling a robot door, including:
determining a state change of a door associated with a target position after reaching the target position in the process of executing a delivery task by a robot, wherein the state change comprises a change from an open state to a closed state and a change from the closed state to the open state;
and controlling the opening and closing of the robot bin gate based on the state change of the door.
In some embodiments, the determining a change in status of a door associated with the target location includes:
acquiring first point cloud data of the surrounding environment of the target position, and determining the position of a door associated with the target position according to the first point cloud data;
continuously scanning the door to obtain continuous multi-frame second point cloud data;
and determining the state change of the door according to the change among the continuous multi-frame second point cloud data.
In some embodiments, the acquiring the first point cloud data of the environment surrounding the target location, determining the location of the door associated with the target location according to the first point cloud data, includes:
scanning the surrounding environment of the target position to obtain first point cloud data;
determining track characteristics formed by point clouds in the first point cloud data;
and when the track characteristic is matched with the preset door frame characteristic, determining the position of the door associated with the target position.
In some embodiments, the determining the status change of the door according to the change between the second point cloud data of consecutive frames includes:
Determining the state of the door in the second point cloud data according to the second point cloud data of each frame, wherein in one frame of the second point cloud data, if the distance difference between the actual distance and the theoretical distance between the adjacent point clouds is smaller than or equal to a preset difference value, the state of the door is determined to be a closed state, and if at least one group of distance differences between the actual distance and the theoretical distance between the adjacent point clouds is larger than the preset difference value, the state of the door is determined to be an open state;
if the door is determined to be in a closed state according to the second point cloud data of the current frame and is determined to be in an open state according to the second point cloud data of the next frame, determining that the state of the door is changed from the closed state to the open state;
and if the door is determined to be in an open state according to the second point cloud data of the current frame and the door is determined to be in a closed state according to the second point cloud data of the next frame, determining that the state of the door is changed from the open state to the closed state.
In some embodiments, the controlling the opening and closing of the robot door based on the state change of the door includes:
If the door is changed from a closed state to an open state, opening a robot cabin door;
and if the door is changed from the open state to the closed state, closing the robot cabin door.
In some embodiments, the method further comprises:
if an opening instruction generated by a user's trigger operation on a robot display device or a terminal APP is received, opening the robot bin gate; or,
and if a closing instruction generated by a user's trigger operation on a robot display device or a terminal APP is received, closing the robot bin gate and completing the delivery task.
In some embodiments, in the process of performing the delivery task by the robot, after reaching the target position, before determining the state change of the door associated with the target position, the method further includes:
when objects to be distributed exist in the robot warehouse, determining that the robot is in the process of executing a distribution task;
and when no object to be distributed exists in the robot warehouse, determining that the robot is in a non-execution distribution task.
In some embodiments, the method further comprises:
when the robot is not executing a delivery task and a delivery instruction generated by a user's trigger operation on a robot display device or a terminal APP is received, detecting whether an object to be delivered exists in the robot warehouse;
When objects to be distributed exist in the robot warehouse, converting the state of the robot into a state for executing a distribution task;
when no object to be distributed exists in the robot warehouse, the state of the robot is kept in a state of not executing the distribution task.
In a second aspect, an embodiment of the present invention provides an electronic device, including:
a processor and a memory communicatively coupled to the processor;
the memory stores computer program instructions executable by the processor, which when invoked by the processor, cause the processor to perform the steps of any one of the robotic door control methods set forth in the first aspect, or any one of the possible implementation manners of any one of the robotic door control methods set forth in the first aspect.
In a third aspect, an embodiment of the present invention provides a computer readable storage medium, where computer program instructions are stored on the computer readable storage medium, where the computer program instructions are configured to cause a computer device to perform the method of controlling a door of any one of the robots proposed in the first aspect, or the steps in any one of the possible implementation manners of the method of controlling a door of any one of the robots proposed in the first aspect.
The embodiments of the invention have the following beneficial effects: unlike the prior art, the robot bin gate control method provided by the embodiments of the invention comprises: during execution of a delivery task by a robot, after the robot reaches a target position, determining a state change of a door associated with the target position, the state change comprising a change from an open state to a closed state and a change from the closed state to the open state; and controlling the opening and closing of the robot bin gate based on the state change of the door. By controlling the robot bin gate to open and close in a timely manner according to the state change of the door, the method improves delivery efficiency, improves the user experience and guarantees delivery safety.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the description of the embodiments or the prior art will be briefly described below, it being obvious that the drawings described below only illustrate certain embodiments of the present invention and therefore should not be considered as limiting the scope of protection, and other related drawings may be obtained according to these drawings without inventive effort to a person skilled in the art.
Fig. 1 is a schematic view of an application scenario of a robot door control method according to some embodiments of the present invention;
FIG. 2 is a schematic diagram of an electronic device according to some embodiments of the present invention;
FIG. 3 is a flow chart of a robot door control method provided by some embodiments of the present invention;
FIG. 4 is a schematic view of a sub-process of step S100 in the robot door control method of the embodiment of FIG. 3;
FIG. 5 is a schematic view of a sub-process of step S110 in the robot door control method of the embodiment of FIG. 4;
fig. 6 is a set of neighboring point clouds in a frame of second point cloud data in some embodiments of the invention.
Detailed Description
In order to make the objects and advantages of the embodiments of the present invention easier to understand, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings of the embodiments of the present invention. It is apparent that the described embodiments are only some embodiments of the present invention, but not all embodiments, and the following detailed description of the embodiments of the present invention in the accompanying drawings does not limit the scope of the claimed invention, but only represent selected embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that, if no conflict arises, the technical features of the embodiments of the present invention described below may be combined with each other and all are within the protection scope of the present application. In addition, although functional block division is performed in a device or a structural diagram, a logical order is shown in a flowchart, in some cases, steps shown or described may be performed in a different order than block division in a device or from the order in a flowchart. Furthermore, the use of "first," "second," "third," and other similar expressions, herein do not limit the data and execution sequence, but merely facilitate description and distinguish between identical items or similar items that have substantially the same function and effect, without regard to indicating or implying a relative importance or implying a number of technical features.
Unless defined otherwise, technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. It should be understood that the term "and/or" as used in this specification includes any and all combinations of one or more of the associated listed items.
Referring to fig. 1, fig. 1 is a schematic diagram of an application scenario of a robot bin gate control method according to some embodiments of the present invention. This application scenario includes an electronic device 100 (e.g., an unmanned vehicle or a robot) and a door 200. In the embodiments of the present invention, the electronic device 100 is taken to be a robot as an example, and the robot bin gate control method provided by the embodiments of the present invention is described in detail on that basis; when the method is applied to other types of electronic devices, reference may be made to the implementation for the robot.
Specifically, the robot has a bin (not shown in fig. 1) for receiving the objects to be dispensed, wherein the bin includes a bin door 150, and the bin door 150 is used for opening or closing a storage space inside the bin, in which the objects to be dispensed can be received. In the process of executing the delivery task, the robot senses and acquires the state change of the door 200 associated with the target position through various sensors equipped with the robot after navigating to the corresponding target position, and then opens or closes the door 150 based on the state change of the door 200 to complete the corresponding delivery task.
In some embodiments, when the robot is in the delivery process and monitoring determines that the door 200 associated with the target location has changed from a closed state to an open state, this indicates that the user has opened the door 200 in preparation for receiving the items to be delivered that are housed in the robot's warehouse, and the robot opens the bin door 150 to hand the carried items to the user. If monitoring determines that the door 200 associated with the target location has changed from an open state to a closed state, indicating that the user has successfully received the items to be delivered, the robot closes the bin door 150; the delivery task is then complete, and the robot either performs the next task or returns to the waiting area to await the next task.
It will be appreciated by those skilled in the art that the application scenario shown in the embodiment of fig. 1 only schematically illustrates one situation in which the electronic device 100 performs the distribution task, and does not impose any limitation on the structure and type of the electronic device 100 and the opening and closing states of the door 150, the opening and closing control of the door 150, the opening and closing states of the door 200, etc. of the warehouse of the electronic device 100 in other application scenarios.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an electronic device 100 according to some embodiments of the present invention. The electronic device 100 may be a robot or a terminal device mounted to the robot. Wherein the electronic device 100 comprises at least one processor 110 and a memory 120 in communication connection (in fig. 2, a bus system connection, one processor is an example), the various components in the electronic device 100 are coupled together by a bus system 130, the bus system 130 being adapted to enable a connection communication between these components. It will be readily appreciated that the bus system 130 includes, in addition to the data bus, a power bus, a control bus, a status signal bus, and the like. Various buses are labeled as bus system 130 in fig. 2 for clarity and conciseness. It will be appreciated that the configuration shown in the embodiment of fig. 2 is merely illustrative and is not intended to limit the configuration of the electronic device 100 described above. For example, the electronic device 100 described above may also include more or fewer components than shown in fig. 2, or have a different configuration than shown in fig. 2.
In particular, the processor 110 is configured to provide computing and control capabilities to control the electronic device 100 to perform corresponding tasks. For example, the above-mentioned robot is controlled to execute the steps in any one of the robot door control methods provided by the embodiments of the present invention, or any one of the possible implementation manners of any one of the robot door control methods provided by the embodiments of the present invention. Those skilled in the art will appreciate that the processor 110 may be a general purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), etc.; but also digital signal processors (Digital Signal Processing, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components.
The memory 120, as a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules corresponding to the robot door control method in the embodiments of the present invention. In some embodiments, memory 120 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created from the use of the processor 110, etc. The steps of any one of the robot door control methods provided by the embodiments of the present invention, or any one of the possible implementation manners of any one of the robot door control methods provided by the embodiments of the present invention, may be implemented by the processor 110 executing non-transitory software programs, instructions, and modules stored in the memory 120, thereby performing various functional applications and data processing of the robot. Those skilled in the art will appreciate that memory 120 may include high-speed random access memory, and may also include non-transitory memory. Such as at least one disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, memory 120 may also include memory located remotely from processor 110, which may be connected to processor 110 via a communication network. It will be appreciated that examples of the above-described communication networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
It will be appreciated from the foregoing that the method for controlling a robot door according to the embodiments of the present invention may be implemented by various suitable types of electronic devices 100 having certain computing and control capabilities, for example, by the above-described robot implementation. The following describes in detail the method for controlling the cabin door of the robot according to the embodiment of the present invention in conjunction with exemplary applications and implementations of the robot according to the embodiment of the present invention.
Referring to fig. 3, fig. 3 is a flow chart of a robot door control method according to some embodiments of the present invention. Obviously, the method for controlling the cabin door of the robot provided by the embodiment of the invention can be applied to the electronic equipment, for example, the robot. The robot door control method includes, but is not limited to, the following steps S100-S200:
s100: in the process of the robot executing the delivery task, after reaching a target position, determining the state change of a door associated with the target position, wherein the state change comprises changing from an open state to a closed state and changing from the closed state to the open state.
Wherein a robot is a multi-functional automated device or system capable of executing programs or tasks. Robots are typically designed to perform tasks that humans are reluctant or unable to perform, or to perform repetitive, mechanical tasks for efficiency and accuracy. The robot has the basic characteristics of perception, decision making, execution and the like, can assist or even replace human beings to finish dangerous, heavy and complex work, improves the working efficiency and quality, serves the life of the human beings, and enlarges or extends the activity and capacity range of the human beings. Robots have been widely used in many fields such as manufacturing, healthcare, military, service, construction, etc., to provide assistance and ease of burden to humans and to play an important role in some hazardous or harsh environments.
Illustratively, in the service industry, robots are typically used to perform delivery tasks, delivering items to users, and therefore typically have a warehouse (storage bin) for holding items. Specifically, when a delivery task is received, the robot goes to a warehouse or other pickup location to collect the corresponding items to be delivered and, after receiving them, places them in its own warehouse. While the robot navigates to the target position, the bin gate of the warehouse remains closed to prevent the items inside from falling out and being lost or damaged. After the robot reaches the target position, the bin gate is opened so that the user can conveniently collect the items and the delivery task can be completed.
Specifically, after the robot receives the delivery task and accommodates the corresponding to-be-delivered object in the warehouse, the robot starts to execute the delivery task. The robot senses and acquires environmental data of the surrounding environment through various sensors such as cameras, laser radars, infrared sensors and the like, so as to navigate to a target position corresponding to a delivery task. After reaching the target position, the robot detects and determines the state information or state change of the door associated with the target position by various sensors provided in the robot. For example, whether the door is in an open or closed state, whether the door is changed from an open state to a closed state or from a closed state to an open state, and the like are detected.
In some embodiments, referring to fig. 4, fig. 4 is a schematic flow chart illustrating a sub-process of step S100 in a robot door control method according to some embodiments of the present invention. Specifically, the determining a status change of the door associated with the target location includes, but is not limited to, the following steps S110-S130:
S110: Acquiring first point cloud data of the surrounding environment of the target position, and determining the position of a door associated with the target position according to the first point cloud data.
Illustratively, point cloud data is a type of data generated by a lidar, depth camera, or other three-dimensional sensing device, and is a collection of points, often numbering in the tens of thousands or millions, each having three-dimensional coordinate information (x, y, z). The three-dimensional coordinates represent the position of the object surface in three-dimensional space. The density and accuracy of the point cloud data depend largely on the sensor and scanning method employed. A high-density point cloud typically contains a large number of points and can provide detailed environmental information, while a low-density point cloud may contain only a few points and provides more generalized information. It is readily understood that the point cloud data may also include other attributes, such as color or reflection intensity.
Specifically, after reaching the target position, the robot acquires first point cloud data of the surrounding environment of the target position through sensors such as a laser radar and a depth camera which are arranged on the robot. Wherein the first point cloud data represents in the form of points the spatial structure of the environment surrounding the target location, e.g. walls, objects, doors, etc. It will be appreciated that the first point cloud data collected by the robot will typically contain points from a plurality of objects and environmental elements, and that it is necessary to perform a point cloud analysis and filtering process on the first point cloud data to remove irrelevant points and reduce noise before determining the position of the door associated with the target location from the first point cloud data.
In some embodiments, the points of the first point cloud data may be from different doors, walls, or other objects, and after the filtering process, the robot needs to perform point cloud clustering to group the points onto different objects or doors for point cloud analysis to extract point cloud data related to the doors from the first point cloud data. Since doors typically have a specific geometry, such as a rectangle or polygon, feature points or features of the specific geometry, such as frames, edges, corner points or other door-related features of the door, may be identified in the obtained first point cloud data using a feature extraction algorithm to determine the position of the door associated with the target position.
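By way of a non-limiting illustration of the filtering and clustering described above, the following Python sketch shows one possible way to preprocess the first point cloud before door detection. The function name, thresholds and the grid-based clustering are assumptions introduced for illustration only and are not specified in this disclosure.

```python
import numpy as np

def preprocess_first_point_cloud(points, z_min=0.05, z_max=2.5, voxel=0.05):
    """Filter and coarsely cluster raw points (N x 3 array).

    Hypothetical helper: keeps points in a plausible door height range,
    downsamples them onto a voxel grid to reduce noise, and groups the
    survivors with a simple grid-based clustering.
    """
    # 1. Height filter: discard floor/ceiling returns that cannot belong to a door.
    mask = (points[:, 2] >= z_min) & (points[:, 2] <= z_max)
    pts = points[mask]

    # 2. Voxel downsampling: keep one representative point per occupied voxel.
    keys = np.floor(pts / voxel).astype(np.int64)
    _, idx = np.unique(keys, axis=0, return_index=True)
    pts = pts[np.sort(idx)]

    # 3. Naive clustering by 2-D grid cell (x, y) -- stands in for a real
    #    Euclidean clustering step when grouping points onto objects or doors.
    cells = np.floor(pts[:, :2] / 0.5).astype(np.int64)
    labels = {tuple(c): i for i, c in enumerate(np.unique(cells, axis=0))}
    cluster_ids = np.array([labels[tuple(c)] for c in cells])
    return pts, cluster_ids
```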
It will be appreciated that, when delivering an item, the robot first reaches the target position and then hands the item to the user at that position. In a hotel scenario, the target position is set near or beside the doorway of each room and may be marked on a map in advance; that is, the doorway of each room is associated with a corresponding target position.
In some embodiments, referring to fig. 5, fig. 5 is a schematic flow chart of step S110 in a robot door control method according to some embodiments of the present invention. Specifically, the acquiring the first point cloud data of the surrounding environment of the target location, determining the location of the door associated with the target location according to the first point cloud data, including but not limited to the following steps S111-S113:
S111: The surroundings of the target location are scanned to obtain first point cloud data.
Illustratively, the robot needs to determine a scanning mode before scanning the surroundings of the target location, where the scanning mode includes an omnidirectional scan, a directional scan and the like. An omnidirectional scan provides more comprehensive environmental information, whereas a directional scan acquires information about a specific area in a more targeted manner; a suitable scanning mode can be selected according to the actual requirements of the task, the environmental characteristics and the like.
After determining the scanning mode, the robot scans the surroundings of the target location using a laser radar, a depth camera, a structured light sensor, etc. provided in itself, to acquire first point cloud data of the surroundings of the target location. It will be appreciated that during the scanning process, the robot may adjust the scanning range to ensure that the surroundings of the target location are covered, and obtain enough first point cloud data to better understand the structure and characteristics of the surroundings of the target location.
S112: and determining the track characteristics formed by the point clouds in the first point cloud data.
Specifically, after the first point cloud data is obtained, the first point cloud data is analyzed to identify key feature points representing trajectory features of objects, doors, or other structures in the surrounding environment of the target location. The robot acquires association relations, attribute information and the like between adjacent point clouds to determine track features or edge features of different objects, doors or other structures formed by the point clouds in the surrounding environment of the target position. In some embodiments, the robot extracts trajectory features of various objects, doors, or other structures formed by the point cloud by employing a feature extraction algorithm to determine the position of the door next to the target position.
S113: and when the track characteristic is matched with the preset door frame characteristic, determining the position of the door associated with the target position.
Illustratively, the robot uses a matching algorithm or model to compare and match the extracted track features with the preset door frame features, for example by comparing features of the track such as shape, size and orientation, and performs a similarity analysis against the preset door frame.
And if the extracted track characteristics are matched with the preset door frame characteristics, determining the position of the door associated with the target position. In some embodiments, the robot may need to perform feature verification, such as verifying the relative position between the track and the door frame, consistency of key points, etc., to ensure accuracy of the matching results.
In some embodiments, if the matching result of the extracted trajectory feature and the preset door frame feature does not meet the expectation, which indicates that the robot does not find the position of the door associated with the target position, the robot may need to perform a corresponding anomaly handling policy, such as rescanning the surrounding environment of the target position, re-matching, or other handling policy, and so on.
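As a non-limiting illustration of matching the track features against preset door frame features, the following Python sketch compares a candidate cluster's bounding dimensions with a door-frame template. The template values, tolerance and helper name are assumptions; a practical system would also verify edges, corner points and orientation as described above.

```python
import numpy as np

def matches_door_frame(cluster, frame_width=0.9, frame_height=2.0, tol=0.15):
    """Check whether a point-cloud cluster (N x 3) resembles a door frame.

    Hypothetical matcher: compares the cluster's bounding-box width and
    height against a preset door-frame template within a tolerance.
    """
    if len(cluster) == 0:
        return False, None
    mins, maxs = cluster.min(axis=0), cluster.max(axis=0)
    width = np.hypot(maxs[0] - mins[0], maxs[1] - mins[1])   # horizontal extent
    height = maxs[2] - mins[2]                               # vertical extent
    ok = abs(width - frame_width) <= tol and abs(height - frame_height) <= tol * 2
    # Door position: centre of the frame opening in the robot's frame of reference.
    door_position = (mins + maxs) / 2.0
    return ok, door_position if ok else None
```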
S120: and continuously scanning the door to obtain continuous multi-frame second point cloud data.
Specifically, after determining the position of the door, the robot continuously scans the door using a laser radar, a depth camera, a structured light sensor, and the like equipped with itself according to a set scanning frequency and time interval to obtain continuous multi-frame second point cloud data. The robot can detect and analyze whether the state of the door changes and the state change process of the door by comparing continuous multi-frame second point cloud data, and timely open or close the door to efficiently finish the distribution task.
S130: and determining the state change of the door according to the change among the continuous multi-frame second point cloud data.
Specifically, the point clouds in the continuous multi-frame second point cloud data are processed and analyzed in real time, and differences among the point clouds in the second point cloud data of each frame, such as differences in the density, shape, outline and the like of the point clouds, are detected to identify and determine the state change of the door.
Generally, a change in the state of a door is accompanied by a change in the door's position. The robot detects whether the position of the door has changed by using geometric analysis techniques.
In some embodiments, specifically, the determining the status change of the door according to the change between the continuous multi-frame second point cloud data includes, but is not limited to, the following steps S131-S133:
S131: and determining the state of the door in the second point cloud data according to the second point cloud data of each frame, wherein in one frame of the second point cloud data, if the distance difference between the actual distance and the theoretical distance between the adjacent point clouds is smaller than or equal to a preset difference value, the state of the door is determined to be a closed state, and if at least one group of distance differences between the actual distance and the theoretical distance between the adjacent point clouds is larger than the preset difference value, the state of the door is determined to be an open state.
Illustratively, after the continuous multi-frame second point cloud data is obtained, for each frame of second point cloud data, the differences in shape, density, outline and the like of point clouds in each frame of second point cloud data are compared to identify and determine the state of a door in the second point cloud data.
Referring to fig. 6, specifically, in one frame of second point cloud data, if the distance difference between the actual distance and the theoretical distance for each pair of adjacent point clouds is smaller than or equal to the preset difference value, it is determined that the door is in a closed state. The distance difference is the difference between the actual distance and the theoretical distance between adjacent point clouds. The actual distance between adjacent point clouds can be calculated by the first formula, i.e. the Euclidean distance
√((x₁ − x₂)² + (y₁ − y₂)² + (z₁ − z₂)²),
wherein x₁, y₁ and z₁ are the X-axis, Y-axis and Z-axis coordinates of one of the adjacent point clouds, and x₂, y₂ and z₂ are the X-axis, Y-axis and Z-axis coordinates of the other of the adjacent point clouds. Substituting the coordinates of the two adjacent point clouds yields the actual distance between them.
The theoretical distance between adjacent point clouds can be calculated by a second formula based on the sensor's angular resolution, wherein d is the actual detection distance of one of the two adjacent point clouds, and θ is the included angle formed by the two adjacent point clouds with the laser emission point, i.e. the angular resolution with which sensors such as a laser radar or a depth camera continuously scan the door.
Referring to fig. 6, fig. 6 illustrates a group of adjacent point clouds in one frame of second point cloud data acquired in some embodiments of the present invention. The group of adjacent point clouds consists of point cloud A and point cloud B, and the actual distance and the theoretical distance between point cloud A and point cloud B can be calculated by the first formula and the second formula, respectively.
The actual distance between point cloud A and point cloud B is obtained from the first formula, wherein x₁, y₁ and z₁ are the X-axis, Y-axis and Z-axis coordinates of point cloud A, and x₂, y₂ and z₂ are the X-axis, Y-axis and Z-axis coordinates of point cloud B. Substituting the coordinates of point cloud A and point cloud B yields the actual distance between them.
The theoretical distance between point cloud A and point cloud B is obtained from the second formula, wherein d is the actual detection distance of point cloud A, i.e. the distance between point cloud A and the laser emission point, and θ is the included angle formed by point cloud A and point cloud B with the laser emission point, i.e. the angular resolution with which sensors such as a laser radar or a depth camera continuously scan the door. Substituting the monitored actual detection distance of point cloud A yields the theoretical distance between point cloud A and point cloud B.
After the distance difference between the actual distance and the theoretical distance has been calculated for each group of adjacent point clouds in one frame of second point cloud data, if every such distance difference in that frame is smaller than or equal to the preset difference value, the state of the door can be determined to be the closed state. The preset difference value can be set in advance according to actual requirements, environmental characteristics and the like.
If, in that frame, the distance difference between the actual distance and the theoretical distance of at least one group of adjacent point clouds is larger than the preset difference value, the state of the door can be determined to be the open state.
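As a non-limiting illustration of the per-frame judgment in step S131, the following Python sketch classifies one frame of second point cloud data as open or closed by comparing the actual and theoretical distances between adjacent point clouds. Because the second formula is not reproduced in this translation, the theoretical spacing is approximated here by the chord 2·d·sin(θ/2); this approximation, together with the parameter names, is an assumption made for illustration.

```python
import math

def door_state_from_frame(scan, angular_resolution, preset_diff=0.10):
    """Classify the door as 'closed' or 'open' from one frame of second point cloud data.

    scan: list of (x, y, z) points ordered by scan angle across the door.
    Assumption (not taken from the patent text): the theoretical spacing of
    adjacent returns at range d is approximated by the chord 2*d*sin(theta/2),
    with d measured from the sensor origin to the first point of the pair.
    """
    for (x1, y1, z1), (x2, y2, z2) in zip(scan, scan[1:]):
        actual = math.dist((x1, y1, z1), (x2, y2, z2))            # first formula (Euclidean)
        d = math.sqrt(x1 * x1 + y1 * y1 + z1 * z1)                # detection distance of the first point
        theoretical = 2.0 * d * math.sin(angular_resolution / 2)  # assumed second formula
        if abs(actual - theoretical) > preset_diff:
            # At least one adjacent pair deviates from the expected spacing,
            # which the method treats as evidence that the door is open.
            return "open"
    return "closed"
```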
S132: and if the door is determined to be in a closed state according to the second point cloud data of the current frame and is determined to be in an open state according to the second point cloud data of the next frame, determining that the state of the door is changed from the closed state to the open state.
S133: and if the door is determined to be in an open state according to the second point cloud data of the current frame and the door is determined to be in a closed state according to the second point cloud data of the next frame, determining that the state of the door is changed from the open state to the closed state.
Illustratively, in obtaining consecutive multi-frame second point cloud data, a status of a door in each frame of second point cloud data may be determined according to the foregoing method.
Specifically, if it is determined that the state of the door in the second point cloud data of the current frame is a closed state and it is determined that the state of the door in the second point cloud data of the next frame is an open state, it may be determined that the state of the door is changed from the closed state to the open state.
If the state of the door in the second point cloud data of the current frame is determined to be an open state, and the state of the door in the second point cloud data of the next frame is determined to be a closed state, it can be determined that the state of the door is changed from the open state to the closed state.
By tracking the state of the door in the second point cloud data of adjacent frames, the transition of the door from closed to open or from open to closed is captured. In some embodiments, the robot checks each group of adjacent frames of second point cloud data to determine the state change of the door, thereby verifying the persistence of the door state across the continuous multi-frame second point cloud data; this ensures that a detected state change is a real state transition rather than an instantaneous fluctuation, and helps eliminate misjudgments caused by noise or short-term interference.
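As a non-limiting illustration of tracking the door state across consecutive frames and filtering out transient changes, the following Python sketch reports a transition only after the new state has persisted for several frames. The class name and the persistence threshold are assumptions.

```python
from collections import deque

class DoorStateTracker:
    """Track the door state across consecutive frames and report transitions.

    Hypothetical debouncing scheme: a new state must persist for
    `persistence` consecutive frames before a transition is reported,
    which filters out flicker caused by noise or brief occlusions.
    """

    def __init__(self, persistence=3):
        self.persistence = persistence
        self.confirmed = None                    # last confirmed state: "open"/"closed"
        self.recent = deque(maxlen=persistence)  # sliding window of per-frame states

    def update(self, frame_state):
        """Feed the per-frame state; return a transition string or None."""
        self.recent.append(frame_state)
        stable = (len(self.recent) == self.persistence
                  and len(set(self.recent)) == 1)
        if stable and frame_state != self.confirmed:
            previous, self.confirmed = self.confirmed, frame_state
            if previous is not None:
                return f"{previous}->{frame_state}"   # e.g. "closed->open"
        return None
```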
S200: and on the basis of the state change of the door, controlling the opening and closing of the robot door.
Illustratively, when the robot determines that the door associated with the target location has undergone a state change, the opening or closing of its bin gate is controlled accordingly. The bin gate may be driven by a motor, a pneumatic device or other driving components under the control of a program in the robot, and may also be controlled based on feedback from external sensors, so as to open and close the robot's bin gate automatically.
In some embodiments, the controlling the opening and closing of the door of the robot based on the state change of the door specifically includes, but is not limited to, the following steps S210-S220:
S210: And if the door is changed from the closed state to the open state, opening the robot cabin door.
S220: and if the door is changed from the open state to the closed state, closing the robot cabin door.
In some embodiments, when the robot senses a state change of the door associated with the target position through a sensor, for example, when the door is changed from a closed state to an open state, the automation control system controls the robot door to be opened after receiving the state change information of the door. It should be appreciated that the automation control system needs to consider the safety of the surrounding environment of the target location, and when opening the robot door, it needs to determine whether there is an obstacle in front of the robot door to ensure that the opening of the robot door is not obstructed, avoid damage or danger to the surrounding environment of the target location or pedestrians, and can take avoidance or stopping measures if necessary to ensure delivery safety.
In some embodiments, the robotic door is controlled to close if the door changes from an open state to a closed state. Similarly, when closing the robot door, it is necessary to determine whether an obstacle is located in the path of the door closure, ensure that no damage or danger is caused to the surrounding environment of the target location or pedestrians, and possibly take avoidance or stopping measures as necessary to ensure safe delivery.
It can be appreciated that, in the event of an abnormality of the robot's bin gate, the automation control system needs to implement a corresponding exception handling strategy. For example, when the bin gate cannot be opened or closed, an alarm message is generated and emergency measures are taken to prevent task interruption or other safety problems.
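As a non-limiting illustration of step S200, including the obstacle check and the exception handling strategy mentioned above, the following Python sketch maps a confirmed door transition to a bin gate command. The bin_gate, obstacle_ahead and alarm interfaces are hypothetical placeholders for the robot's actuator, obstacle detection and alerting facilities.

```python
def control_bin_gate(transition, bin_gate, obstacle_ahead, alarm):
    """Open or close the robot's bin gate according to the door transition.

    Sketch only: `bin_gate`, `obstacle_ahead` and `alarm` stand for the
    robot's actuator, obstacle-detection and alerting interfaces, which
    are not specified in this disclosure.
    """
    if transition == "closed->open":
        action = bin_gate.open
    elif transition == "open->closed":
        action = bin_gate.close
    else:
        return  # no confirmed transition, nothing to do

    if obstacle_ahead():
        # Safety first: do not move the gate into an obstacle or a pedestrian.
        alarm("bin gate movement blocked, waiting for clearance")
        return
    try:
        action()
    except RuntimeError as exc:
        # Exception handling strategy: raise an alarm rather than stall silently.
        alarm(f"bin gate fault: {exc}")
```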
The robot bin gate control method provided by the embodiments of the invention comprises: during execution of a delivery task by a robot, after the robot reaches a target position, determining a state change of a door associated with the target position, the state change comprising a change from an open state to a closed state and a change from the closed state to the open state; and controlling the opening and closing of the robot bin gate based on the state change of the door. By controlling the robot bin gate to open and close in a timely manner according to the state change of the door, the method improves delivery efficiency, improves the user experience and guarantees delivery safety.
In some embodiments, the robot bin gate may also be controlled to open or close upon receipt of a corresponding opening or closing instruction.
In the method for controlling the robot door according to other embodiments of the present invention, the method for controlling the robot door further includes, but is not limited to, the following steps S300 to S400:
S300: If an opening instruction generated by a user's trigger operation on the robot's display device or on a terminal APP is received, opening the robot bin gate.
The robot is provided with a display device, and the display device is used for providing information feedback and an interactive interface for a user or an operator. By way of example, the display means may be a screen or other type of display device comprising control elements such as touch screens, buttons, etc. that may be used to display information content such as text, images and charts. For example, the display device may display important information such as a current state of the robot, a current task to be performed, sensor data, etc., to the user, so that the user can know the operation condition of the robot in real time. The user can also interact with the robot through the display device, so that the user can perform visual interface operation with the robot.
In some embodiments, when the robot receives an instruction or request triggered by a user on the robot's display device or on a terminal APP, corresponding feedback may be provided through the display device to confirm that the instruction or request has been received or executed.
Specifically, after the robot travels to the target position, the user may operate the display device of the robot to trigger generation of a door opening instruction and send to the robot. After the robot receives the door opening command, the robot can feed back the received door opening command to a user through interface feedback, voice prompt or other modes, and then execute corresponding operation, namely, the door of the robot is opened.
In some embodiments, the user may also operate the terminal APP to trigger generation of a door opening instruction and send to the robot. After the robot receives the door opening command, the robot can feed back the received door opening command to the user through interface feedback, voice prompt or other modes, and then the door of the robot is opened. It is easy to understand that the terminal can be electronic equipment such as a smart phone, a tablet personal computer and the like, wherein a corresponding application program APP is installed in the terminal, and the terminal is communicated with the robot.
S400: If a closing instruction generated by a user's trigger operation on the robot's display device or on a terminal APP is received, closing the robot bin gate and completing the delivery task.
Specifically, after the user collects the articles, the user can operate the display device or the terminal APP of the robot to trigger generation of a door closing instruction, and send the door closing instruction to the robot. After the robot receives the door closing instruction, the robot can feed back to the user that the door closing instruction is received through interface feedback, voice prompt or other modes, and then the door of the robot is closed. After the bin gate is closed, the robot sends feedback of the completion of the delivery task to the user, and the completion of the delivery task is confirmed.
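As a non-limiting illustration of steps S300 and S400, the following Python sketch handles opening and closing instructions triggered on the display device or the terminal APP. The instruction strings and callback names are assumptions, since this disclosure does not define a concrete message format.

```python
def handle_user_instruction(instruction, bin_gate, notify):
    """React to an instruction triggered on the display device or terminal APP.

    Hypothetical command strings and callbacks; `bin_gate` is the gate
    actuator interface and `notify` provides interface/voice feedback.
    """
    if instruction == "OPEN_BIN_GATE":
        notify("open instruction received")      # acknowledge the instruction
        bin_gate.open()
    elif instruction == "CLOSE_BIN_GATE":
        notify("close instruction received")
        bin_gate.close()
        notify("delivery task completed")        # confirm completion to the user
    else:
        notify(f"unknown instruction: {instruction}")
```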
In some embodiments, before the robot performs the delivery task, it may be determined whether the robot is in the process of performing the delivery task by determining whether there is an item to be delivered in the warehouse.
In the method for controlling a door of a robot according to other embodiments of the present invention, after reaching a target position, before determining a state change of a door associated with the target position, the method for controlling a door of a robot further includes, but is not limited to, steps S101 to S102:
S101: When objects to be distributed exist in the robot warehouse, determining that the robot is in the process of executing a distribution task.
S102: and when no object to be distributed exists in the robot warehouse, determining that the robot is in a non-execution distribution task.
Specifically, whether the robot is in the process of performing the delivery task is determined by detecting the articles in the warehouse and analyzing the current state of the robot. The robot may detect and determine whether the goods to be distributed are located in the warehouse by using sensors or vision systems, such as cameras, lidar, etc. to acquire the position and state information of the goods in the warehouse.
If the fact that the goods to be distributed exist in the goods warehouse of the robot is detected, the fact that the robot is in a state of executing the distribution task can be determined, the robot provides immediate feedback for a user, and the user is informed that the distribution task is in progress through a display screen, a terminal APP notification and the like.
If no articles to be distributed exist in the robot warehouse, the robot can be determined to be in a state of not executing the distribution task, the robot provides instant feedback for management personnel, the management personnel is informed of the fact that the robot is in an idle or standby state through a display screen, a terminal APP notification and the like, and then corresponding measures are taken according to a preset standby strategy, such as staying at a specified position, converting into an energy consumption reduction mode or performing self-checking and maintenance.
In some embodiments, when the robot is in a state of not performing a delivery task, the resources occupied by the delivery task are released, for example by applying power management policies and suspending path planning, so as to reduce power consumption while remaining ready to receive a new delivery task.
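As a non-limiting illustration of steps S101 and S102 and of the standby behaviour described above, the following Python sketch derives the robot's task state from the contents of its warehouse. The robot interface and its methods are hypothetical placeholders.

```python
def update_task_state(bin_has_items, robot):
    """Decide the robot's task state from the contents of its warehouse.

    `robot` is a hypothetical interface exposing state, notification and
    standby hooks; none of these names come from the patent.
    """
    if bin_has_items:
        robot.state = "EXECUTING_DELIVERY"
        robot.notify_user("delivery in progress")
    else:
        robot.state = "IDLE"
        robot.notify_manager("robot idle / standby")
        # Standby policy: release task resources and reduce power consumption.
        robot.suspend_path_planning()
        robot.enter_low_power_mode()
```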
In some embodiments, when the robot is not performing a delivery task and a delivery instruction triggered by the user on the robot's display device or on a terminal APP is received, the robot needs to detect, after receiving the instruction and before executing the task, whether an object to be delivered exists in its warehouse. This avoids directly executing the delivery task without the object having been loaded and ensures that the robot actually carries the object to be delivered.
In the method for controlling a robot door according to other embodiments of the present invention, the method for controlling a robot door further includes, but is not limited to, the following steps S103 to S105:
S103: When the robot is not executing a delivery task and a delivery instruction generated by a user's trigger operation on the robot's display device or on a terminal APP is received, detecting whether an object to be delivered exists in the robot warehouse.
Specifically, if the robot is not executing a delivery task, the user may operate the robot's display device to trigger generation of a delivery instruction, which is sent to the robot. When the robot receives a delivery instruction generated by the user's operation of its display device, it detects, via an onboard sensor or vision system, whether goods to be delivered exist in its warehouse, so that the delivery task is executed only after the corresponding goods have been loaded, ensuring successful execution of the delivery task.
In some embodiments, the user may also operate the terminal APP to trigger generation of a delivery instruction, which is sent to the robot. When the robot receives a delivery instruction generated by the user's operation of the terminal APP, it likewise detects whether goods to be delivered exist in its warehouse via an onboard sensor or vision system, such as a camera or a laser radar.
S104: when the objects to be distributed exist in the robot warehouse, the state of the robot is converted into the state of executing the distribution task.
If the fact that the to-be-delivered objects exist in the robot warehouse is detected, the fact that the to-be-delivered objects are loaded in the robot warehouse can be determined, the corresponding delivery tasks can be started to be executed, the state of the robot is converted from the state of not executing the delivery tasks to the state of executing the delivery tasks, and then immediate feedback is provided for a user in a mode of a display screen, a terminal APP notification and the like so as to inform the user that the robot is ready to execute the corresponding delivery tasks.
S105: when no object to be distributed exists in the robot warehouse, the state of the robot is kept in a state of not executing the distribution task.
If no object to be delivered exists in the robot warehouse, it can be determined that the warehouse has not been loaded with the object to be delivered and that the robot cannot execute the corresponding delivery task; the state of the robot therefore remains in the state of not executing a delivery task. Immediate feedback is then provided to the manager, for example via the display screen or a terminal APP notification, informing the manager that the object to be delivered is missing from the robot's warehouse and that the corresponding delivery task cannot be executed, and the robot waits to be handled.
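As a non-limiting illustration of steps S103 to S105, the following Python sketch handles a delivery instruction received while the robot is idle, accepting it only if an object to be delivered is detected in the warehouse. The helper names and state strings are assumptions.

```python
def on_delivery_instruction(robot, detect_items_in_bin):
    """Handle a delivery instruction received while no task is being executed.

    Sketch of steps S103-S105: the instruction is accepted only if an item
    to be delivered is actually detected in the warehouse; `robot` and
    `detect_items_in_bin` are hypothetical interfaces.
    """
    if robot.state != "IDLE":
        return  # handling shown here only for the idle case
    if detect_items_in_bin():                      # sensor / vision check (S103)
        robot.state = "EXECUTING_DELIVERY"         # S104: switch to executing
        robot.notify_user("ready to deliver")
    else:
        # S105: stay idle and alert staff that nothing has been loaded.
        robot.notify_manager("no item loaded; delivery task cannot start")
```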
It can be appreciated that in the embodiment of the invention, when the robot is in the process of executing the delivery task, the robot can switch and control the robot door according to the state change of the door near the target position. However, when the robot is not performing the delivery task, the robot does not perform the opening and closing control on the robot door according to the state change of the door, so that misjudgment and frequent execution of the door opening or closing operation during the normal operation of the robot are avoided.
In summary, the robot bin gate control method provided by the embodiments of the invention comprises: during execution of a delivery task by a robot, after the robot reaches a target position, determining a state change of a door associated with the target position, the state change comprising a change from an open state to a closed state and a change from the closed state to the open state; and controlling the opening and closing of the robot bin gate based on the state change of the door. By controlling the robot bin gate to open and close in a timely manner according to the state change of the door, the method improves delivery efficiency, improves the user experience and guarantees delivery safety.
An embodiment of the present invention provides a computer readable storage medium, where computer program instructions are stored on the computer readable storage medium, where the computer program instructions are configured to cause an electronic device to execute steps in any one of the robot door control methods provided by the embodiment of the present invention, or any one of possible implementation manners of any one of the robot door control methods provided by the embodiment of the present invention.
In some embodiments, the storage medium may be flash memory, a hard disk, an optical disk, a register, magnetic surface memory, a removable magnetic disk, a CD-ROM, random Access Memory (RAM), read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, or any other form of storage medium known in the art, and may be a variety of devices including one or any combination of the aforementioned storage media.
In some embodiments, the computer program instructions may be written, in the form of programs, software modules, scripts or code, in any programming language, including compiled, interpreted, declarative or procedural languages, and they may be deployed in any form, including as a stand-alone program or as a module, component, subroutine or other unit suitable for use in a computing environment.
As an example, the computer program instructions may, but need not, correspond to files in a file system; they may be stored in a portion of a file that holds other programs or data, for example in one or more scripts in a Hypertext Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
By way of example, the computer program instructions may be deployed to be executed on one computing device (e.g., a smart terminal or a server), or on multiple computing devices located at one site or distributed across multiple sites and interconnected by a communication network. It will be readily understood that all or part of the steps of the methods described in the embodiments of the present invention may be implemented directly by electronic hardware, by processor-executable computer program instructions, or by a combination of both.
It will be appreciated by those skilled in the art that the embodiments provided in the present invention are merely illustrative. The written order of the steps in the methods of the embodiments does not imply a strict order of execution and does not limit the implementation process; the order may be adjusted, combined or pruned according to actual needs, and the modules, sub-modules, units or sub-units in the apparatus or system of the embodiments may likewise be combined, divided or deleted according to actual needs. For example, the division into units is merely a division by logical function; in an actual implementation there may be another division manner, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed.
It should be noted that the foregoing embodiments are merely illustrative of the technical concept and features of the present invention and are intended to enable those skilled in the art to understand and implement the invention; they are not intended to limit the scope of protection. All equivalent changes and modifications made according to the claims of the present invention shall fall within the scope of the claims of the present invention.

Claims (10)

1. A robot bin gate control method, comprising:
determining a state change of a door associated with a target position after reaching the target position in the process of executing a delivery task by a robot, wherein the state change comprises a change from an open state to a closed state and a change from the closed state to the open state;
and on the basis of the state change of the door, controlling the opening and closing of a robot bin gate.
2. The robot bin gate control method of claim 1, wherein the determining a state change of a door associated with the target position comprises:
acquiring first point cloud data of the surrounding environment of the target position, and determining the position of a door associated with the target position according to the first point cloud data;
continuously scanning the door to obtain continuous multi-frame second point cloud data;
and determining the state change of the door according to the change among the continuous multi-frame second point cloud data.
3. The method of claim 2, wherein the acquiring first point cloud data of the surrounding environment of the target position, and determining the position of the door associated with the target position according to the first point cloud data, comprises:
scanning the surrounding environment of the target position to obtain first point cloud data;
determining a track characteristic formed by the point clouds in the first point cloud data;
and when the track characteristic matches a preset door frame characteristic, determining the position of the door associated with the target position.
4. The method of claim 2, wherein the determining the state change of the door according to the change among the continuous multi-frame second point cloud data comprises:
determining the state of the door according to each frame of the second point cloud data, wherein, in one frame of the second point cloud data, if the difference between the actual distance and the theoretical distance between every pair of adjacent point clouds is smaller than or equal to a preset difference value, the state of the door is determined to be a closed state, and if the difference between the actual distance and the theoretical distance between at least one pair of adjacent point clouds is larger than the preset difference value, the state of the door is determined to be an open state;
if the door is determined to be in the closed state according to the second point cloud data of a current frame and in the open state according to the second point cloud data of a next frame, determining that the state of the door is changed from the closed state to the open state;
and if the door is determined to be in the open state according to the second point cloud data of a current frame and in the closed state according to the second point cloud data of a next frame, determining that the state of the door is changed from the open state to the closed state.
5. The method according to any one of claims 1 to 4, wherein the controlling of the opening and closing of the robot bin gate based on the state change of the door comprises:
if the door is changed from the closed state to the open state, opening the robot bin gate;
and if the door is changed from the open state to the closed state, closing the robot bin gate.
6. The robot bin gate control method of claim 5, wherein the method further comprises:
if an opening instruction generated by a trigger operation of a user on a robot display device or a terminal APP is received, opening the robot bin gate; or,
if a closing instruction generated by a trigger operation of a user on a robot display device or a terminal APP is received, closing the robot bin gate and completing the delivery task.
7. The method according to any one of claims 1 to 4, wherein, in the process of executing the delivery task by the robot, after reaching the target position and before determining the state change of the door associated with the target position, the method further comprises:
when an object to be delivered exists in the robot warehouse, determining that the robot is in the process of executing a delivery task;
and when no object to be delivered exists in the robot warehouse, determining that the robot is not executing a delivery task.
8. The robot bin gate control method according to any one of claims 1-4, wherein the method further comprises:
when the robot is not executing a delivery task and a delivery instruction generated by a trigger operation of a user on a robot display device or a terminal APP is received, detecting whether an object to be delivered exists in the robot warehouse;
when an object to be delivered exists in the robot warehouse, switching the state of the robot to the state of executing a delivery task;
and when no object to be delivered exists in the robot warehouse, keeping the state of the robot in the state of not executing a delivery task.
9. An electronic device, comprising:
a processor and a memory communicatively coupled to the processor;
the memory stores computer program instructions executable by the processor, and the computer program instructions, when invoked by the processor, cause the processor to perform the robot bin gate control method according to any one of claims 1-8.
10. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon computer program instructions for causing a computer device to execute the robot bin gate control method according to any one of claims 1-8.
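Claims 2 to 4 above describe locating the door from first point cloud data and determining its state from consecutive frames of second point cloud data. The sketch below is one possible reading of the frame classification in claim 4, written in Python with NumPy: a frame is classified as closed when the difference between the actual and theoretical distance of every pair of adjacent points stays within a preset difference value, and as open otherwise, and a state change is reported when two consecutive frames disagree. The frame format, the theoretical spacing and the tolerance are assumptions made only for illustration.

    import numpy as np

    def classify_frame(points_xy, theoretical_gap, preset_difference):
        """Classify one frame of second point cloud data as 'open' or 'closed'.

        points_xy: (N, 2) array of scan points ordered along the door plane.
        """
        # Actual distances between adjacent point clouds in this frame.
        gaps = np.linalg.norm(np.diff(points_xy, axis=0), axis=1)
        # Closed: every |actual - theoretical| difference is within the preset value.
        if np.all(np.abs(gaps - theoretical_gap) <= preset_difference):
            return "closed"
        # Open: at least one adjacent distance deviates by more than the preset value.
        return "open"

    def detect_state_change(prev_frame, curr_frame, theoretical_gap, preset_difference):
        """Return 'closed_to_open', 'open_to_closed', or None for two consecutive frames."""
        prev_state = classify_frame(prev_frame, theoretical_gap, preset_difference)
        curr_state = classify_frame(curr_frame, theoretical_gap, preset_difference)
        if prev_state == "closed" and curr_state == "open":
            return "closed_to_open"
        if prev_state == "open" and curr_state == "closed":
            return "open_to_closed"
        return None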
CN202311635859.5A 2023-11-30 2023-11-30 Robot bin gate control method, electronic equipment and storage medium Pending CN117519195A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311635859.5A CN117519195A (en) 2023-11-30 2023-11-30 Robot bin gate control method, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117519195A 2024-02-06

Family

ID=89753015

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311635859.5A Robot bin gate control method, electronic equipment and storage medium 2023-11-30 2023-11-30 Pending CN117519195A (en)

Country Status (1)

Country Link
CN (1) CN117519195A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination