CN114770504B - Robot control method, device, robot and storage medium - Google Patents

Robot control method, device, robot and storage medium Download PDF

Info

Publication number
CN114770504B
CN114770504B (application CN202210447069.3A)
Authority
CN
China
Prior art keywords
palm
user
mechanical arm
storage bin
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210447069.3A
Other languages
Chinese (zh)
Other versions
CN114770504A (en)
Inventor
夏舸
梁朋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Uditech Co Ltd
Original Assignee
Uditech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Uditech Co Ltd filed Critical Uditech Co Ltd
Priority to CN202210447069.3A priority Critical patent/CN114770504B/en
Publication of CN114770504A publication Critical patent/CN114770504A/en
Application granted granted Critical
Publication of CN114770504B publication Critical patent/CN114770504B/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a robot control method, a device, a robot and a storage medium. The robot is provided with a storage bin, and a mechanical arm is arranged in the storage bin. The method comprises the following steps: after reaching the delivery destination of an article to be delivered in the storage bin, sending the grasped article out of the storage bin through the mechanical arm; detecting, through a sensor arranged in the mechanical arm, whether a user's palm exists in the environment outside the storage bin; when the presence of the user's palm is determined, acquiring the position of the user's palm through the sensor; and controlling the mechanical arm to move to that position so as to place the grasped article on the user's palm. The robot thus delivers the article directly into the user's hand: the user only needs to stretch out a hand to receive it, rather than reaching into the storage bin, which simplifies the pickup flow, improves the intelligence of the delivery robot, and improves the user's pickup experience.

Description

Robot control method, device, robot and storage medium
Technical Field
The present invention relates to the field of robots, and in particular to a robot control method, a robot control device, a robot, and a storage medium.
Background
Currently, as robot technology matures and develops, robots are increasingly applied in various fields to replace manual work. For example, robots may be used to provide intelligent item delivery services to customers in hotels, restaurants, and the like. However, when a robot delivers items, the user usually has to verify their identity, then bend down and reach into the robot's storage bin to take the item out; this operation is tedious, time-consuming, and labor-intensive.
Disclosure of Invention
The invention mainly aims to provide a robot control method, a robot control device, a robot and a storage medium, and aims to solve the technical problem of how to improve the intelligence of the robot in the process of delivering objects.
In order to achieve the above object, the present invention provides a robot control method, which is applied to a robot, wherein the robot is provided with a storage bin, and a mechanical arm is arranged in the storage bin, and the method comprises:
after reaching the delivery destination of the articles to be delivered in the storage bin, the grasped articles to be delivered are sent out of the storage bin through the mechanical arm;
detecting whether a user palm exists in the environment outside the storage bin or not through a sensor arranged in the mechanical arm;
When the existence of the palm of the user is determined, acquiring the position of the palm of the user through the sensor;
and controlling the mechanical arm to move to the position of the palm of the user so as to place the grabbed articles to be distributed on the palm of the user.
Optionally, the detecting, by a sensor provided in the mechanical arm, whether a palm of a user exists in an environment outside the storage bin includes:
acquiring sensor data through a sensor arranged in the mechanical arm, wherein the sensor is a depth camera and/or a radar sensor;
extracting environmental object outline features from the sensor data, and comparing the environmental object outline features with preset palm outline features;
and if the contour features of the environment objects are consistent with the contour features of the palm, determining that the palm of the user exists in the environment outside the storage bin.
Optionally, after reaching the delivery destination of the to-be-delivered object in the storage bin, before the mechanical arm sends the grasped to-be-delivered object out of the storage bin, the method further includes:
after the to-be-dispensed objects are determined, the to-be-dispensed objects are grabbed and fixed from the storage bin through the mechanical arm.
Optionally, the grabbing and fixing the to-be-dispensed objects from the storage bin through the mechanical arm includes:
scanning, through a graphic code scanning device arranged in the mechanical arm, and determining the position of the article to be distributed in the storage bin according to the scanned graphic code of the article; or communicating, through a near field communication device arranged in the mechanical arm, with each article in the storage bin, and determining the position of the article to be distributed according to the communication result;
and controlling the mechanical arm to move to the position of the article to be distributed in the storage bin, and then grabbing and fixing the article to be distributed.
Optionally, before the detecting, by the sensor disposed in the mechanical arm, whether the palm of the user exists in the environment outside the storage bin, the method further includes:
detecting whether a pickup object exists or not through the sensor;
if the goods taking object is determined to exist, detecting the height characteristics of the goods taking object through the sensor;
and controlling the mechanical arm to carry the to-be-dispensed object to hover at a height position corresponding to the height characteristic.
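The height-adaptive hover above can be sketched as follows. The 0.6 ratio and the clamping bounds are illustrative assumptions; the patent does not specify how the detected height characteristic maps to a hover position:

```python
def handover_height(person_height_m: float) -> float:
    # Hover at roughly waist-to-chest level of the detected person;
    # the 0.6 ratio is an assumed, tunable value.
    target = 0.6 * person_height_m
    # Clamp to an assumed reachable range of the mechanical arm (metres).
    return max(0.5, min(target, 1.3))
```

A 1.75 m adult and a 1.1 m child would thus receive the article at different heights, while the clamp keeps the target within the arm's workspace.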
Optionally, when the sensor includes an imaging device, the detecting, by the sensor, whether the pick-up object exists includes:
Acquiring environment image data through the camera device, and identifying the environment image data to obtain identity verification information;
matching the identity verification information with the identity information of the goods taking object corresponding to the goods to be distributed;
and when the identity verification information is matched and consistent with the identity information of the goods taking object, determining that the goods taking object exists.
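The identity-matching step might be sketched as below. The `(identity, confidence)` pair format and the 0.8 threshold are assumptions, since the patent leaves the recognition pipeline unspecified:

```python
def pickup_person_present(matches, expected_id, threshold=0.8):
    # `matches`: (identity, confidence) pairs produced by an upstream
    # recognition model run on the environment image data; the pickup
    # person is present when any sufficiently confident match equals
    # the identity registered for the delivery order.
    return any(ident == expected_id and conf >= threshold
               for ident, conf in matches)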
Optionally, the controlling the mechanical arm to move to a position where the palm of the user is located, so as to place the gripped object to be dispensed on the palm of the user includes:
controlling the mechanical arm to move to the position where the palm of the user is;
detecting pressure values of contact points through electronic skin arranged on the contact surface of the mechanical arm grabbing component and the to-be-dispensed object;
when each pressure value is detected to be in accordance with a preset pressure value distribution state, controlling the mechanical arm to release the to-be-dispensed article so as to place the grasped article on the palm of the user.
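One way to read the pressure check is a per-contact comparison against a preset distribution; the readings, expected values, and 20% tolerance below are illustrative assumptions:

```python
def user_is_holding(pressures, expected, tolerance=0.2):
    # The electronic skin reports one pressure value per contact point;
    # release only when every point is within `tolerance` of the preset
    # distribution that indicates the user is supporting the item.
    if len(pressures) != len(expected):
        return False
    return all(abs(p - e) <= tolerance * e
               for p, e in zip(pressures, expected))
```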
In order to achieve the above object, the present invention further provides a robot control device, the device is deployed on a robot, the robot is provided with a storage bin, a mechanical arm is provided in the storage bin, the device includes:
The grabbing module is used for sending the grabbed objects to be distributed out of the storage bin through the mechanical arm after reaching the distribution destination of the objects to be distributed in the storage bin;
the detection module is used for detecting whether a user palm exists in the environment outside the storage bin or not through a sensor arranged in the mechanical arm;
the acquisition module is used for acquiring the position of the palm of the user through the sensor when the palm of the user is determined to exist;
and the delivery module is used for controlling the mechanical arm to move to the position of the palm of the user so as to place the grabbed objects to be distributed on the palm of the user.
To achieve the above object, the present invention further provides a robot, the robot including: a memory, a processor, and a robot control program stored on the memory and executable on the processor, wherein the robot control program, when executed by the processor, implements the steps of the robot control method described above.
In addition, in order to achieve the above object, the present invention also proposes a computer-readable storage medium having stored thereon a robot control program which, when executed by a processor, implements the steps of the robot control method as described above.
According to the invention, a storage bin is arranged in the robot, and a mechanical arm is arranged in the storage bin. After the robot reaches the delivery destination corresponding to the article to be delivered, the mechanical arm grasps the article in the bin and sends it out of the bin; a sensor arranged in the mechanical arm detects whether a user's palm exists in the external environment; when the palm is detected, its position is obtained through the sensor; and the mechanical arm is controlled to move to that position so as to place the grasped article on the user's palm. The robot thereby delivers the article directly into the user's hand: the user only needs to stretch out a hand to receive it and does not need to reach into the storage bin, which simplifies the user's pickup procedure, improves the intelligence of the delivery robot, and improves the user's pickup experience.
Drawings
FIG. 1 is a schematic diagram of a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a flow chart of a first embodiment of the robot control method of the present invention;
fig. 3 is a schematic view of a scenario in which a robotic arm grips an object to be dispensed according to an embodiment of the present invention;
fig. 4 is a schematic functional block diagram of a robot control device according to a preferred embodiment of the present invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Referring to fig. 1, fig. 1 is a schematic device structure of a hardware running environment according to an embodiment of the present invention.
It should be noted that the device in the embodiment of the present invention may be any device with data processing capability, such as a smart phone, a personal computer, or a server, and may be deployed in a mobile robot; this is not limited herein. The robot is provided with a storage bin, and a mechanical arm is arranged in the storage bin.
As shown in fig. 1, the apparatus (terminal or robot) may include: a processor 1001, such as a CPU, a memory 1002, and a communication bus 1003, wherein the communication bus 1003 is used to enable connection and communication between these components. The memory 1002 may be a high-speed RAM or a non-volatile memory, such as a disk memory. The memory 1002 may alternatively be a storage device separate from the processor 1001.
It will be appreciated by those skilled in the art that the device structure shown in fig. 1 is not limiting of the device and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
As shown in fig. 1, an operating system and a robot control program may be included in a memory 1002 as one type of computer storage medium. The operating system is a program that manages and controls the hardware and software resources of the device, supporting the operation of the robot control program and other software or programs. In the apparatus shown in fig. 1, a processor 1001 may be used to call a robot control program stored in a memory 1002 and perform the following operations:
after reaching the delivery destination of the articles to be delivered in the storage bin, the grasped articles to be delivered are sent out of the storage bin through the mechanical arm;
detecting whether a user palm exists in the environment outside the storage bin or not through a sensor arranged in the mechanical arm;
when the existence of the palm of the user is determined, acquiring the position of the palm of the user through the sensor;
and controlling the mechanical arm to move to the position of the palm of the user so as to place the grabbed articles to be distributed on the palm of the user.
Further, detecting whether a user palm exists in the environment outside the storage bin through the sensor arranged in the mechanical arm comprises:
acquiring sensor data through a sensor arranged in the mechanical arm, wherein the sensor is a depth camera and/or a radar sensor;
Extracting environmental object outline features from the sensor data, and comparing the environmental object outline features with preset palm outline features;
and if the contour features of the environment objects are consistent with the contour features of the palm, determining that the palm of the user exists in the environment outside the storage bin.
Further, the processor 1001 may be further configured to invoke a robot control program stored in the memory 1002, to perform the following operations after reaching the delivery destination of the objects to be delivered in the storage bin, before the gripped objects to be delivered are sent out of the storage bin by the mechanical arm:
after the to-be-dispensed objects are determined, the to-be-dispensed objects are grabbed and fixed from the storage bin through the mechanical arm.
Further, the grabbing and fixing the to-be-dispensed objects from the storage bin through the mechanical arm comprises:
scanning, through a graphic code scanning device arranged in the mechanical arm, and determining the position of the article to be distributed in the storage bin according to the scanned graphic code of the article; or communicating, through a near field communication device arranged in the mechanical arm, with each article in the storage bin, and determining the position of the article to be distributed according to the communication result;
And controlling the mechanical arm to move to the position of the article to be distributed in the storage bin, and then grabbing and fixing the article to be distributed.
Further, before the sensor provided in the mechanical arm detects whether the palm of the user exists in the environment outside the storage compartment, the processor 1001 may be further configured to invoke the robot control program stored in the memory 1002 to perform the following operations:
detecting whether a pickup object exists or not through the sensor;
if the goods taking object is determined to exist, detecting the height characteristics of the goods taking object through the sensor;
and controlling the mechanical arm to carry the to-be-dispensed object to hover at a height position corresponding to the height characteristic.
Further, when the sensor includes an imaging device, the detecting, by the sensor, whether the pickup object exists includes:
acquiring environment image data through the camera device, and identifying the environment image data to obtain identity verification information;
matching the identity verification information with the identity information of the goods taking object corresponding to the goods to be distributed;
and when the identity verification information is matched and consistent with the identity information of the goods taking object, determining that the goods taking object exists.
Further, the controlling the mechanical arm to move to the position of the palm of the user so as to place the gripped objects to be dispensed on the palm of the user includes:
controlling the mechanical arm to move to the position where the palm of the user is;
detecting pressure values of contact points through electronic skin arranged on the contact surface of the mechanical arm grabbing component and the to-be-dispensed object;
when each pressure value is detected to be in accordance with the preset pressure value distribution state, the mechanical arm is controlled to loosen the to-be-dispensed objects so as to place the grasped to-be-dispensed objects on the palm of the user.
Based on the above-described structure, various embodiments of a robot control method are presented.
Referring to fig. 2, fig. 2 is a flowchart illustrating a first embodiment of a robot control method according to the present invention.
The embodiments of the present invention provide a robot control method. It should be noted that although a logical sequence is shown in the flowchart, in some cases the steps shown or described may be performed in a different order. The execution body of each embodiment of the robot control method may be a robot controlled by an automatic control program; the type and specific implementation details of the robot are not limited in the embodiments. In this embodiment, the robot control method includes the following steps S10 to S40:
Step S10, after reaching a delivery destination of the objects to be delivered in the storage bin, delivering the grabbed objects to be delivered out of the storage bin through the mechanical arm;
In this embodiment, the robot is provided with a storage bin for holding the articles to be delivered, and a mechanical arm is arranged in the storage bin for grasping articles in the bin and sending them out of the bin. In a specific embodiment, the storage bin may be provided with a bin door, or the bin door may be omitted; a bin door can be provided to improve the safety of the delivered articles, and the robot can control the bin door to open or close according to scene requirements.
In a specific embodiment, the articles to be delivered may be placed into the storage bin of the robot manually by a worker, or picked up automatically by the robot in an automatic pickup area; this is not limited in this embodiment.
The robot delivers according to the delivery destination of the article to be delivered; after reaching the destination, it controls the mechanical arm to grasp the article in the storage bin and send it out of the bin. It should be noted that the method by which the robot controls the pose of the mechanical arm is not limited in this embodiment. For example, in one implementation, the target pose (including position and attitude) of the mechanical arm may be determined according to the specific scene; the control parameters of each component of the mechanical arm are then calculated from the arm's current pose, the target pose, and the arm's control algorithm, and each component is driven according to these parameters so that the arm moves from its current pose to the target pose.
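The pose-adjustment loop described above can be sketched as a simple per-joint proportional controller. This is a stand-in for the patent's unspecified control algorithm, with an assumed gain of 0.5 and three illustrative joints:

```python
def step_toward(current, target, gain=0.5):
    # Each joint moves a fraction `gain` of its remaining error toward
    # the target pose computed for the current scene.
    return [c + gain * (t - c) for c, t in zip(current, target)]

pose = [0.0, 0.0, 0.0]      # current joint angles (radians, assumed)
target = [1.0, -0.5, 0.2]   # target pose derived from the scene
for _ in range(20):
    pose = step_toward(pose, target)
```

After a few iterations the remaining error shrinks geometrically, so the arm settles at the target pose.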
Further, in an embodiment, when the robot delivers a plurality of articles at once, it may maintain a correspondence between each article to be delivered and its delivery destination. In a specific embodiment, when the article is known, its destination may be looked up from the correspondence and the robot navigates there to deliver it; conversely, when the destination is known, the article to be delivered there may be looked up from the correspondence and handed to the customer once the robot arrives.
In the present embodiment, the manner in which the robot obtains the correspondence between an article and its delivery destination is not limited. For example, in one embodiment, when a worker places the article into the storage bin, the worker may input its delivery destination; the robot recognizes the placed article through a recognition device in the storage bin and binds the input destination to the recognized article. In another embodiment, when the robot picks up articles in an automatic pickup area, it may control the mechanical arm to extend out of the bin and use a camera arranged in the arm to scan a graphic code attached to the article, identifying the code to obtain the delivery destination; alternatively, a near field communication device arranged in the arm may communicate with a near field communication device in the article to obtain the destination. The article and destination are then bound, and the arm grasps the article and places it in the storage bin.
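The binding step can be sketched as a small registry keyed by the article's code (from the scanned graphic code or the NFC read). The class and method names are illustrative, not from the patent:

```python
class DeliveryRegistry:
    """Binds an article code to its delivery destination."""

    def __init__(self):
        self._dest_by_item = {}

    def bind(self, item_code, destination):
        # Called when a worker inputs a destination, or when the arm's
        # scanner / NFC reader identifies the article.
        self._dest_by_item[item_code] = destination

    def destination_of(self, item_code):
        # Known article -> where to navigate.
        return self._dest_by_item.get(item_code)

    def items_for(self, destination):
        # Known destination -> which articles to hand over on arrival.
        return [i for i, d in self._dest_by_item.items() if d == destination]
```

Either lookup direction described in the text then reduces to a dictionary query.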
In a specific embodiment, the robot may distinguish different articles by their unique codes, or a camera may be provided in the robot so that image features of the articles are identified from the captured images and different articles are distinguished by those features; the way articles are distinguished is not limited in this embodiment. When articles are distinguished by their unique codes, a code may be carried in a graphic code attached to the article, preset in a near field communication device in the article, or entered into the robot by a worker.
It should be noted that the robot may grasp the article through the mechanical arm and send it out of the bin after reaching the delivery destination, or it may pre-grasp the article in the storage bin before arrival so that the article can be sent out quickly once the destination is reached.
Step S20, detecting whether a user palm exists in the environment outside the storage bin or not through a sensor arranged in the mechanical arm;
The mechanical arm may be provided with a sensor for sensing the external environment; the sensor may specifically be a camera device, a radar sensor, an infrared sensor, a thermal imaging sensor, or the like, and a plurality of sensors may be provided at the same time, which is not limited in this embodiment.
The robot detects whether a user's palm exists in the environment outside the storage bin through the sensor arranged in the mechanical arm, specifically by analyzing the data the sensor acquires. In specific embodiments, different sensor types call for different analysis methods. In general, data is collected by the sensor in advance while a user's palm is present in the external environment, and features related to the palm, such as contour features, are extracted from that data and stored. When the robot later needs to determine whether a palm is present, it extracts features from the newly collected sensor data and compares them with the pre-stored palm-related features; if they are consistent, the presence of a user's palm can be determined.
Step S30, when the existence of the palm of the user is determined, acquiring the position of the palm of the user through the sensor;
When it is determined that a user's palm exists in the external environment, the robot acquires the position of the palm through the sensor; the position may be expressed by data such as the distance and azimuth of the palm relative to the mechanical arm. In a specific embodiment, the position may be determined from the data already analyzed when detecting the palm, or the sensor data may be analyzed again to obtain the position once the palm has been detected.
In a specific embodiment, different sensor types also call for different methods of determining the palm position from the collected data. Specifically, when the features extracted from the sensor data are compared with the pre-stored palm-related features, the position of the palm is determined from the position information carried by the sensor data corresponding to the features that matched.
It will be appreciated that when a user's palm is detected in the environment, it indicates that the user intends to reach out and receive the article to be delivered.
Further, in an embodiment, when it is determined that the palm of the user does not exist in the external environment, the robot may adjust the position of the mechanical arm or adjust the direction that the robot faces in its entirety, and after the adjustment, data is collected by a sensor in the mechanical arm to perform analysis and judgment on whether the palm of the user exists.
Step S40, controlling the mechanical arm to move to the position of the palm of the user so as to place the grabbed objects to be distributed on the palm of the user.
After the position of the user's palm is determined, the robot controls the mechanical arm to move to that position so as to place the grasped article on the user's palm. For the specific implementation of moving the arm to the palm's position, reference may be made to the implementation of controlling the arm to send the article out of the bin, which is not repeated here.
In particular embodiments, the mechanical arm may release the article once it detects that the user is holding it, thereby delivering the article to the user. There are various ways to detect whether the user is holding the article; for example, when a confirmation instruction input by the user is detected (via a touch screen, a touch key, voice, etc.), it can be determined that the user is holding the article.
It can be appreciated that in this embodiment, a storage bin is provided on the robot and a mechanical arm is arranged in the bin. After the robot reaches the delivery destination corresponding to the article to be delivered, the arm grasps the article in the bin and sends it out; a sensor arranged in the arm detects whether a user's palm exists in the external environment, the palm's position is obtained through the sensor once it is detected, and the arm is controlled to move to that position to place the grasped article on the user's palm. The robot thus delivers the article directly into the user's hand: the user only needs to stretch out a hand to receive it and does not need to reach into the storage bin, which simplifies the pickup flow, improves the intelligence of the delivery robot, and improves the user's pickup experience.
Further, based on the first embodiment, a second embodiment of the robot control method of the present invention is provided, in this embodiment, step S20 detects, through a sensor provided in the mechanical arm, whether a palm of a user exists in an environment outside the storage compartment, including steps S201 to S203:
step S201, sensor data are acquired through a sensor arranged in the mechanical arm, wherein the sensor is a depth camera and/or a radar sensor;
In this embodiment, a sensor is provided in the mechanical arm, and the sensor may be a depth camera and/or a radar sensor. The image data captured by the depth camera records, for each pixel, the distance from the corresponding point to the camera and its position relative to the camera; the point cloud data measured by the radar sensor records, for each point in space, its distance from and position relative to the radar sensor.
The robot collects data through the sensor; the acquired data is referred to as sensor data. When the sensor comprises a depth camera, the sensor data comprises image data; when the sensor comprises a radar sensor, the sensor data comprises point cloud data.
Step S202, extracting environmental object outline features from the sensor data, and comparing the environmental object outline features with preset palm outline features;
the robot may extract contour features from the sensor data (hereinafter referred to as environmental object contour features, to distinguish them). There are various methods for extracting contour features from the image data of a depth camera, and various methods for extracting contour features from the point cloud data of a radar sensor; neither is limited in this embodiment. Palm contour features may be preset in the robot; they may be obtained by analyzing sensor data collected by the sensor while a user's palm is present in the external environment. The robot compares the environmental object contour features with the palm contour features.
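As an illustrative sketch of such a comparison (not the patent's implementation), a contour can be reduced to a fixed-length, scale-normalized descriptor and matched against a preset palm descriptor by a distance threshold; the descriptor form, sample count, and threshold below are all assumptions:

```python
import math

def radial_descriptor(contour, samples=16):
    """Describe a closed contour by centroid-to-boundary distances,
    resampled to a fixed length and normalized by the maximum radius
    so the comparison is scale-invariant."""
    cx = sum(x for x, _ in contour) / len(contour)
    cy = sum(y for _, y in contour) / len(contour)
    dists = [math.hypot(x - cx, y - cy) for x, y in contour]
    step = len(dists) / samples
    sampled = [dists[int(i * step)] for i in range(samples)]
    peak = max(sampled) or 1.0
    return [d / peak for d in sampled]

def is_consistent_with_palm(env_contour, palm_descriptor, threshold=0.15):
    """'Consistent with the palm contour feature' here means the mean
    absolute descriptor difference falls under a preset threshold."""
    desc = radial_descriptor(env_contour, samples=len(palm_descriptor))
    diff = sum(abs(a - b) for a, b in zip(desc, palm_descriptor))
    return diff / len(palm_descriptor) < threshold
```

A real system would obtain `env_contour` from depth-image edges or clustered radar points; here any closed polyline of (x, y) points works.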
Step S203, if the environmental object contour features are consistent with the palm contour features, determining that a user's palm exists in the environment outside the storage bin.
If the environmental object contour features are consistent with the palm contour features, it can be determined that a user's palm exists in the environment outside the storage bin. It should be noted that, because obstacles other than the user's palm may exist in the environment outside the storage bin, the environmental object contour features may also include the contour features of those obstacles; when some of the environmental object contour features are consistent with the palm contour features, it can be determined that a user's palm exists in the environment outside the storage bin.
Further, in a specific embodiment, when the contour features of the environment object are inconsistent with the contour features of the palm, it may be determined that the palm of the user does not exist in the environment outside the storage bin, or other detection methods may be further adopted to detect whether the palm of the user exists, for example, an image matching method.
Further, in an embodiment, when the environmental object contour features are consistent with the palm contour features, the portion consistent with the palm contour features may be extracted from the environmental object contour features, and the distance and azimuth information corresponding to that portion may be extracted from the sensor data to determine the distance and azimuth of the user's palm relative to the sensor; the distance and azimuth may then be converted into the distance and azimuth of the user's palm relative to the mechanical arm, so as to obtain the position of the user's palm.
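The conversion from a sensed range and bearing to a position relative to the mechanical arm can be sketched as follows, assuming the sensor is rigidly mounted at a known offset from the arm origin with aligned axes (the offset and angle conventions are illustrative):

```python
import math

def palm_position_in_arm_frame(distance, azimuth_deg, elevation_deg,
                               sensor_offset=(0.0, 0.0, 0.05)):
    """Convert a palm reading (range, azimuth, elevation relative to the
    sensor) into Cartesian coordinates in the mechanical arm's frame.
    Assumes the sensor axes are aligned with the arm axes and displaced
    by `sensor_offset` metres."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    # Spherical-to-Cartesian in the sensor frame.
    x = distance * math.cos(el) * math.cos(az)
    y = distance * math.cos(el) * math.sin(az)
    z = distance * math.sin(el)
    ox, oy, oz = sensor_offset
    return (x + ox, y + oy, z + oz)
```

A palm seen 1 m straight ahead of the sensor lands at (1.0, 0.0, 0.05) in the arm frame under these assumptions.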
Further, in an embodiment, the robot may also be provided with an ordinary camera to capture an image, match the captured image with preset user palm images, and determine from the matching result whether the environment outside the storage bin contains a user's palm. The user palm images may be images containing palms obtained by shooting in advance; specifically, the palms of different people may be shot from different angles and distances. During matching, the captured image is compared with each user palm image in turn: if it matches one of them successfully, it can be determined that a user's palm exists in the environment outside the storage bin; otherwise, it can be determined that no user's palm exists there. Further, when the captured image matches a user palm image successfully, the partial image matching the palm image can be extracted from the captured image; the position of the partial image within the whole captured image can be converted into the azimuth of the user's palm relative to the mechanical arm, and the area occupied by the partial image within the whole captured image can be converted into the distance of the user's palm relative to the mechanical arm, thereby obtaining the position of the user's palm.
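The area-to-distance conversion mentioned above can be sketched under a pinhole-camera assumption, where the apparent area of the matched palm patch scales with the inverse square of its distance; the reference calibration values are hypothetical:

```python
def palm_distance_from_area(patch_area_px, ref_area_px, ref_distance_m):
    """Under a pinhole-camera model the apparent area of the palm patch
    scales with 1 / distance^2, so a single calibrated reference pair
    (ref_area_px observed at ref_distance_m) fixes the estimate."""
    return ref_distance_m * (ref_area_px / patch_area_px) ** 0.5
```

For example, a patch one quarter the calibrated area implies twice the calibrated distance.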
Further, in an embodiment, before the step S10 of sending the gripped article to be delivered out of the storage bin through the mechanical arm after reaching the delivery destination of the article, the method further includes:
Step S50, after determining the article to be delivered, grasping and fixing the article to be delivered from the storage bin through the mechanical arm.
After the robot determines the article to be delivered, the article can be grasped and fixed from the storage bin through the mechanical arm to achieve a pre-grasping effect. During the robot's travel, the mechanical arm keeps the article fixed so that it is not affected by jolting while the robot moves, ensuring the safety of the article to be delivered.
Further, in an embodiment, the step S50, after determining the to-be-dispensed object, of grabbing and fixing the to-be-dispensed object from the storage bin through the mechanical arm, includes:
step S501, scanning by using a graphic code scanning device arranged in the mechanical arm, determining a position of the to-be-dispensed object in the storage bin according to the scanned graphic code of the to-be-dispensed object, or communicating with each object in the storage bin by using a near field communication device arranged in the mechanical arm, and determining the position of the to-be-dispensed object in the storage bin according to a communication result;
In this embodiment, when a plurality of articles are placed in the storage bin, the mechanical arm needs to identify the article to be delivered before grasping it. In one embodiment, a graphic code can be attached to each article, or associated with it in some other way; the graphic code carries the identification information of the article to distinguish it from the others, and a graphic code scanning device can be provided in the mechanical arm. The mechanical arm activates the scanning device inside the storage bin, determines from the information carried in each scanned graphic code which one belongs to the article to be delivered, and determines the position of the article in the storage bin from the position of its graphic code.
In another embodiment, a near field communication device may be placed in each article and in the mechanical arm, and the identification information of each article may be stored in its near field communication device to distinguish it from the others. The mechanical arm communicates with each article through the near field communication device inside the storage bin, determines which near field communication signal was sent by the device of the article to be delivered, and determines the position of the article in the storage bin from the sending position of that signal.
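Both identification routes, graphic code scanning and near field communication, reduce to "an identifier observed at a position inside the bin", so locating the article to be delivered can be sketched as a simple lookup (the data shapes are assumptions):

```python
def locate_article(target_id, observations):
    """observations: (article_id, position) pairs produced either by the
    graphic-code scanner or by near field communication -- both channels
    yield an identifier observed at a position inside the storage bin.
    Returns the target article's position, or None if it was not seen."""
    for article_id, position in observations:
        if article_id == target_id:
            return position
    return None
```

The controller would then move the arm to the returned position before grasping (step S502).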
Step S502, controlling the mechanical arm to move to the position of the to-be-dispensed object in the storage bin, and then grabbing and fixing the to-be-dispensed object.
After the robot determines the position of the to-be-dispensed object in the storage bin, the robot can control the mechanical arm to move to the position and grasp and fix the to-be-dispensed object.
Further, in an embodiment, the robot may capture an image of the articles in the storage bin through a camera disposed in the storage bin or on the mechanical arm and output the image to the robot's display screen, so that the user can select their own article in the image; the robot takes the selected article as the article to be delivered and converts the position of the article selected in the image into its position in the storage bin.
Further, based on the first and/or second embodiments, a third embodiment of the robot control method of the present invention is provided. In this embodiment, before the step S20 of detecting, through a sensor provided in the mechanical arm, whether a palm of a user exists in the environment outside the storage bin, the method further includes:
Step S60, detecting whether a pick-up object exists through the sensor;
The robot may detect whether a pick-up object exists through the sensor provided in the mechanical arm. In a specific embodiment, the condition for judging whether a pick-up object exists may be set as needed, and different judging conditions call for different detection methods. For example, in an embodiment where the user takes the article from outside the storage bin, whether a person exists in the environment outside the storage bin can be determined by analyzing the data collected by the sensor; for the specific detection manner, reference may be made to the embodiment of detecting whether a user's palm exists through the sensor, which is not repeated here.
Step S70, if it is determined that a pick-up object exists, detecting the height feature of the pick-up object through the sensor;
if it is determined that a pick-up object exists, the robot can detect the height feature of the pick-up object through the sensor. The height feature may be represented by the height value of the object, or by other data capable of characterizing the object's height; for example, since people of different ages tend to have different heights, the height feature may also be represented by the age of the object.
When the height feature is represented by the height value of the pick-up object, the specific implementation of detecting it through the sensor may refer to the implementation of detecting the position of the user's palm through the sensor: after the distance and azimuth of the pick-up object's head and feet relative to the mechanical arm are detected, the height value of the pick-up object can be obtained by conversion.
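A minimal sketch of that conversion: given the range and elevation angle of the head and of the feet as seen from the same sensor origin, the height value is the difference of their vertical components (illustrative geometry only):

```python
import math

def pickup_object_height(head_range_m, head_elev_deg,
                         foot_range_m, foot_elev_deg):
    """Height of the pick-up object as the difference between the
    vertical components of the head and foot readings, both measured
    from the same sensor origin (range in metres, elevation in degrees)."""
    head_z = head_range_m * math.sin(math.radians(head_elev_deg))
    foot_z = foot_range_m * math.sin(math.radians(foot_elev_deg))
    return head_z - foot_z
```

With the head 1 m away at +30 degrees and the feet 2 m away at -30 degrees, the estimated height is 1.5 m.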
When the height feature is represented by age, the sensor may specifically be a camera device, and image classification of the image data captured by the camera device yields the age bracket of the pick-up object.
Step S80, controlling the mechanical arm to carry the article to be delivered and hover at a height position corresponding to the height feature.
After the robot detects the height feature of the pick-up object, it can control the mechanical arm to carry the article to be delivered and hover at the height position corresponding to that feature. The height positions corresponding to different height features can be preset in the robot, relative to the coordinate system of the mechanical arm, and can be set as needed, for example so that the mechanical arm hovers at the chest level of the pick-up object. By detecting the height feature of the pick-up object and controlling the mechanical arm to hover accordingly, the pick-up object can readily understand that the robot is handing over the article and reach out to receive it, which in turn allows the mechanical arm to locate the user's palm quickly and complete the delivery quickly.
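The preset mapping from height feature to hover height might be sketched as follows; the age brackets, preset heights, and chest-level ratio are illustrative assumptions, not values from the patent:

```python
def hover_height_m(height_feature):
    """Map a height feature to an arm hover height in metres. The
    feature is either an age-bracket label or a numeric body height;
    the brackets, presets, and chest-level ratio are illustrative."""
    if isinstance(height_feature, str):
        # Age-bracket classification result from the camera device.
        presets = {"child": 0.9, "teenager": 1.1, "adult": 1.3}
        return presets.get(height_feature, 1.3)
    # Numeric height value: hover at roughly chest level.
    return round(0.72 * height_feature, 2)
```

Either representation of the height feature resolves to one hover target in the arm's coordinate system.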
Further, in an embodiment, the step S60 of detecting whether the pick-up object exists by the sensor includes:
step S601, acquiring environment image data through the camera device, and identifying the environment image data to obtain identity verification information;
in this embodiment, the sensor provided in the mechanical arm may include a camera device. The robot can acquire image data (hereinafter referred to as environmental image data) through the camera device and recognize it to obtain authentication information. The authentication information may be face information obtained by recognition, or pick-up information in a verification graphic code presented by the user, among others; it is not limited in this embodiment.
Step S602, matching the identity verification information with the identity information of the picking object corresponding to the to-be-delivered object;
the robot may obtain the identity information of the pick-up object corresponding to the article to be delivered in advance. Specifically, the identity information may be input by a worker, obtained by the robot identifying the graphic code of the article to be delivered, or obtained by other means, which is not limited in this embodiment.
The robot matches the authentication information with the identity information of the pick-up object of the article to be dispensed to determine whether the authentication information is the authentication information of the pick-up object.
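A hedged sketch of this matching step, supporting either a pick-up code comparison or a face-feature distance check (the field names and the distance threshold are assumptions):

```python
def identity_matches(auth_info, expected):
    """auth_info / expected: dicts that may carry a 'code' (pick-up code
    recognized from a presented verification graphic code) and/or a
    'face' (feature vector). A match on either channel suffices."""
    if auth_info.get("code") is not None and auth_info["code"] == expected.get("code"):
        return True
    fa, fe = auth_info.get("face"), expected.get("face")
    if fa is not None and fe is not None:
        # Euclidean distance between face feature vectors.
        dist = sum((a - b) ** 2 for a, b in zip(fa, fe)) ** 0.5
        return dist < 0.6
    return False
```

When this returns True, step S603 concludes that the pick-up object exists and the handover proceeds.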
Step S603, when the identity verification information matches with the identity information of the picking object, determining that the picking object exists.
When the identity verification information is matched with the identity information of the goods taking object, the robot can determine that the goods taking object exists, and further can conduct subsequent delivery operation of the goods to be delivered.
Further, in an embodiment, the robot may determine that the pick-up object is not present when the identity verification information is inconsistent with the pick-up object identity information. When it is determined that no pick-up object exists, the robot may adjust the position of the robot arm or adjust the orientation of the robot, and analyze and determine whether or not the pick-up object exists based on image data of other angles captured by the image capturing device in the robot arm.
By acquiring the authentication information, matching it against the identity information of the pick-up object, and performing the subsequent delivery operation only when they match, the accuracy and safety of the delivery of the article are improved and delivery errors are avoided.
Further, in an embodiment, the step S40 of controlling the mechanical arm to move to the position where the palm of the user is located, so as to place the gripped object to be dispensed on the palm of the user includes:
step S401, controlling the mechanical arm to move to the position where the palm of the user is located;
step S402, detecting pressure values of contact points through electronic skin arranged on the contact surface of the mechanical arm grabbing component and the to-be-dispensed object;
in this embodiment, an electronic skin is provided on the contact surface between the gripping member of the robot arm and the article, and whether or not the article to be dispensed is pulled by a person can be determined based on the pressure values of the contact points detected by the electronic skin.
The electronic skin may be realized by a plurality of pressure sensors, which may detect the pressure values to which they are subjected. After the robot moves the mechanical arm to the position of the palm of the user, the pressure value of each contact point can be detected through the electronic skin.
Step S403, when it is detected that each pressure value accords with the preset pressure value distribution state, controlling the mechanical arm to loosen the to-be-dispensed object, so as to place the grasped to-be-dispensed object on the palm of the user.
The robot detects whether the pressure value of each contact point conforms to a preset pressure value distribution state. The preset distribution state may specify the pressure value or pressure value range that each contact point should exhibit when the article to be delivered is being pulled. It may be obtained at the laboratory stage by having the mechanical arm grasp an article while a person or a test tool pulls it, measuring the pressure value of each contact point under that condition, and setting the preset distribution state according to the measured values.
When the robot detects that each pressure value conforms to the preset pressure value distribution state, the mechanical arm can be controlled to release the article to be delivered, placing it on the user's palm so that the user can take it away.
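The release decision of steps S402 and S403 can be sketched as a per-contact-point range check against the preset distribution (the data shapes are assumptions):

```python
def pull_detected(pressures, preset_ranges):
    """pressures: the current reading of each electronic-skin contact
    point; preset_ranges: the (low, high) bounds measured for each point
    in the lab while a gripped article was pulled. The article counts
    as pulled only when every contact point is inside its preset range."""
    if len(pressures) != len(preset_ranges):
        return False
    return all(lo <= p <= hi
               for p, (lo, hi) in zip(pressures, preset_ranges))
```

When this returns True, the controller would command the gripping member to release the article onto the user's palm.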
In addition, an embodiment of the present invention further provides a robot control device, where the device is deployed on a robot, and the robot is characterized in that a storage bin is provided, and a mechanical arm is provided in the storage bin, and referring to fig. 4, the device includes:
the grabbing module 10 is used for sending the grabbed objects to be distributed out of the storage bin through the mechanical arm after reaching a distribution destination of the objects to be distributed in the storage bin;
The detection module 20 is configured to detect whether a palm of a user exists in an environment outside the storage bin through a sensor provided in the mechanical arm;
an obtaining module 30, configured to obtain, when it is determined that the palm of the user exists, a position where the palm of the user is located through the sensor;
and the delivery module 40 is used for controlling the mechanical arm to move to the position of the palm of the user so as to place the grabbed objects to be distributed on the palm of the user.
Further, the detection module 20 is further configured to:
acquiring sensor data through a sensor arranged in the mechanical arm, wherein the sensor is a depth camera and/or a radar sensor;
extracting environmental object outline features from the sensor data, and comparing the environmental object outline features with preset palm outline features;
and if the contour features of the environment objects are consistent with the contour features of the palm, determining that the palm of the user exists in the environment outside the storage bin.
Further, the grabbing module 10 is further configured to:
after the to-be-dispensed objects are determined, the to-be-dispensed objects are grabbed and fixed from the storage bin through the mechanical arm.
Further, the grabbing module 10 is further configured to:
The method comprises the steps of scanning through a graphic code scanning device arranged in the mechanical arm, determining the position of an object to be distributed in the storage bin according to the scanned graphic code of the object to be distributed, or communicating with each object in the storage bin through a near field communication device arranged in the mechanical arm, and determining the position of the object to be distributed in the storage bin according to a communication result;
and controlling the mechanical arm to move to the position of the article to be distributed in the storage bin, and then grabbing and fixing the article to be distributed.
Further, the detection module 20 is further configured to:
detecting whether a pickup object exists or not through the sensor;
if the goods taking object is determined to exist, detecting the height characteristics of the goods taking object through the sensor;
the delivery module 40 is further configured to: and controlling the mechanical arm to carry the to-be-dispensed object to hover at a height position corresponding to the height characteristic.
Further, when the sensor includes an imaging device, the detection module 20 is further configured to:
acquiring environment image data through the camera device, and identifying the environment image data to obtain identity verification information;
Matching the identity verification information with the identity information of the goods taking object corresponding to the goods to be distributed;
and when the identity verification information is matched and consistent with the identity information of the goods taking object, determining that the goods taking object exists.
Further, the delivery module 40 is further configured to:
controlling the mechanical arm to move to the position where the palm of the user is;
detecting pressure values of contact points through electronic skin arranged on the contact surface of the mechanical arm grabbing component and the to-be-dispensed object;
when each pressure value is detected to be in accordance with the preset pressure value distribution state, the mechanical arm is controlled to loosen the to-be-dispensed objects so as to place the grasped to-be-dispensed objects on the palm of the user.
The specific implementations of the robot control device are substantially the same as the embodiments of the robot control method described above and are not repeated here.
In addition, an embodiment of the present invention further provides a computer-readable storage medium on which a robot control program is stored; when executed by a processor, the robot control program implements the steps of the robot control method described above.
Embodiments of the robot and the computer readable storage medium of the present invention may refer to embodiments of the robot control method of the present invention, and will not be described herein.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to perform the method according to the embodiments of the present invention.
The foregoing description is only of the preferred embodiments of the present invention, and is not intended to limit the scope of the invention, but rather is intended to cover any equivalents of the structures or equivalent processes disclosed herein or in the alternative, which may be employed directly or indirectly in other related arts.

Claims (8)

1. The robot control method is applied to a robot, and is characterized in that the robot is provided with a storage bin, and a mechanical arm is arranged in the storage bin, and the method comprises the following steps:
after reaching the delivery destination of the articles to be delivered in the storage bin, the grasped articles to be delivered are sent out of the storage bin through the mechanical arm;
detecting whether a goods taking object exists or not through a sensor arranged in the mechanical arm;
if the goods taking object is determined to exist, detecting the height characteristics of the goods taking object through the sensor;
controlling the mechanical arm to hover at a height position corresponding to the height characteristic with the article to be distributed;
detecting whether a user palm exists in the environment outside the storage bin or not through a sensor arranged in the mechanical arm;
when the existence of the palm of the user is determined, acquiring the position of the palm of the user through the sensor;
Controlling the mechanical arm to move to the position of the palm of the user so as to place the grabbed articles to be distributed on the palm of the user;
detecting whether a user palm exists in the environment outside the storage bin through a sensor arranged in the mechanical arm comprises:
acquiring sensor data through a sensor arranged in the mechanical arm, wherein the sensor is a depth camera and/or a radar sensor;
extracting environmental object outline features from the sensor data, and comparing the environmental object outline features with preset palm outline features;
and if the contour features of the environment objects are consistent with the contour features of the palm, determining that the palm of the user exists in the environment outside the storage bin.
2. The robot control method according to claim 1, wherein after reaching a delivery destination of the articles to be delivered in the storage compartment, before the gripped articles to be delivered are sent out of the storage compartment by the mechanical arm, further comprising:
after the to-be-dispensed objects are determined, the to-be-dispensed objects are grabbed and fixed from the storage bin through the mechanical arm.
3. The robotic control method of claim 2, wherein the grasping and securing the item to be dispensed from within the storage bin by the robotic arm comprises:
The method comprises the steps of scanning through a graphic code scanning device arranged in the mechanical arm, determining the position of an object to be distributed in the storage bin according to the scanned graphic code of the object to be distributed, or communicating with each object in the storage bin through a near field communication device arranged in the mechanical arm, and determining the position of the object to be distributed in the storage bin according to a communication result;
and controlling the mechanical arm to move to the position of the article to be distributed in the storage bin, and then grabbing and fixing the article to be distributed.
4. The robot control method according to claim 1, wherein when the sensor includes an image pickup device, the detecting by the sensor whether or not there is a pickup object includes:
acquiring environment image data through the camera device, and identifying the environment image data to obtain identity verification information;
matching the identity verification information with the identity information of the goods taking object corresponding to the goods to be distributed;
and when the identity verification information is matched and consistent with the identity information of the goods taking object, determining that the goods taking object exists.
5. The robot control method according to any one of claims 1 to 4, wherein the controlling the movement of the robot arm to the position of the palm of the user to place the gripped object to be dispensed on the palm of the user comprises:
Controlling the mechanical arm to move to the position where the palm of the user is;
detecting pressure values of contact points through electronic skin arranged on the contact surface of the mechanical arm grabbing component and the to-be-dispensed object;
when each pressure value is detected to be in accordance with the preset pressure value distribution state, the mechanical arm is controlled to loosen the to-be-dispensed objects so as to place the grasped to-be-dispensed objects on the palm of the user.
6. A robot control device, the device is disposed on a robot, wherein the robot is provided with a storage bin, and a mechanical arm is disposed in the storage bin, the device comprising:
the grabbing module is used for sending the grabbed objects to be distributed out of the storage bin through the mechanical arm after reaching the distribution destination of the objects to be distributed in the storage bin;
the detection module is used for detecting whether a user palm exists in the environment outside the storage bin or not through a sensor arranged in the mechanical arm, and particularly, the detection module is used for acquiring sensor data through the sensor arranged in the mechanical arm, wherein the sensor is a depth camera and/or a radar sensor; extracting environmental object outline features from the sensor data, and comparing the environmental object outline features with preset palm outline features; if the contour features of the environment objects are consistent with the contour features of the palm, determining that a palm of a user exists in the environment outside the storage bin, and detecting whether a pickup object exists or not through the sensor by the detection module; if the goods taking object is determined to exist, detecting the height characteristics of the goods taking object through the sensor;
The acquisition module is used for acquiring the position of the palm of the user through the sensor when the palm of the user is determined to exist;
the delivery module is used for controlling the mechanical arm to move to the position where the palm of the user is located so as to place the grabbed objects to be distributed on the palm of the user, and is also used for controlling the mechanical arm to carry the objects to be distributed to hover at the height position corresponding to the height characteristics.
7. A robot, the robot comprising: a memory, a processor and a robot control program stored on the memory and executable on the processor, which when executed by the processor, implements the steps of the robot control method according to any one of claims 1 to 5.
8. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a robot control program, which when executed by a processor, implements the steps of the robot control method according to any one of claims 1 to 5.
CN202210447069.3A 2022-04-26 2022-04-26 Robot control method, device, robot and storage medium Active CN114770504B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210447069.3A CN114770504B (en) 2022-04-26 2022-04-26 Robot control method, device, robot and storage medium

Publications (2)

Publication Number Publication Date
CN114770504A CN114770504A (en) 2022-07-22
CN114770504B true CN114770504B (en) 2024-01-30

Family

ID=82433904

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210447069.3A Active CN114770504B (en) 2022-04-26 2022-04-26 Robot control method, device, robot and storage medium

Country Status (1)

Country Link
CN (1) CN114770504B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115922712A (en) * 2022-12-02 2023-04-07 深圳优地科技有限公司 Robot distribution method and robot

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201044114Y (en) * 2006-08-23 2008-04-02 浦比俊引特艾克堤夫科技公司 Automatic sale machine with midair display system
JP2013146389A (en) * 2012-01-19 2013-08-01 Panasonic Corp Hand drying apparatus
WO2018127880A2 (en) * 2018-03-14 2018-07-12 Logic Studio Method and apparatus for giving and receiving objects
CA2997849A1 (en) * 2017-03-09 2018-09-09 Memic Innovative Surgery Ltd. Control console for surgical device with mechanical arms
CN108711086A (en) * 2018-05-09 2018-10-26 连云港伍江数码科技有限公司 Man-machine interaction method, device, article-storage device and storage medium in article-storage device
CN110271800A (en) * 2019-03-14 2019-09-24 金树玉 A kind of cargo collator and its working method for intelligent storage
CN111993978A (en) * 2020-07-14 2020-11-27 嘉善新石器智牛科技有限公司 Automatic driving vending cart and man-machine interaction method thereof
CN112036644A (en) * 2020-09-01 2020-12-04 北京京东振世信息技术有限公司 Method and apparatus for distributing courier boxes
CN112276956A (en) * 2020-10-30 2021-01-29 北京市商汤科技开发有限公司 Article distribution method, device and equipment and storage medium
CN113867398A (en) * 2017-04-28 2021-12-31 深圳市大疆创新科技有限公司 Control method for palm landing of unmanned aerial vehicle and unmanned aerial vehicle
CN114004329A (en) * 2020-07-28 2022-02-01 辉达公司 Machine learning control of object hand-off

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4456561B2 (en) * 2005-12-12 2010-04-28 本田技研工業株式会社 Autonomous mobile robot
US20080051933A1 (en) * 2006-08-23 2008-02-28 Provision Interactive Technologies, Inc Vending machine having aerial display system
US9469028B2 (en) * 2014-09-30 2016-10-18 Toyota Jidosha Kabushiki Kaisha Robotic handover system natural for humans


Also Published As

Publication number Publication date
CN114770504A (en) 2022-07-22

Similar Documents

Publication Publication Date Title
JP3834297B2 (en) Image processing device
JP5806301B2 (en) Method for physical object selection in robotic systems
JP6529302B2 (en) INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM
CN111590611B (en) Article classification and recovery method based on multi-mode active perception
JP4226623B2 (en) Work picking device
US20140180479A1 (en) Bagging With Robotic Arm
CN111496770A (en) Intelligent carrying mechanical arm system based on 3D vision and deep learning and use method
EP1385122A1 (en) Object taking-out apparatus
CN114770504B (en) Robot control method, device, robot and storage medium
US11232589B2 (en) Object recognition device and object recognition method
CN114041096A (en) Baggage transportation method, transportation system, robot, terminal device, and storage medium
US20190126473A1 (en) Information processing apparatus and robot arm control system
US9361695B2 (en) Method of recognizing a position of a workpiece from a photographed image
KR20150106718A (en) Object peaking system, object detecting device and method thereof
CN114029243B (en) Soft object grabbing and identifying method for sorting robot
CN113927601B (en) Method and system for realizing precise picking of mechanical arm based on visual recognition
JP2020163502A (en) Object detection method, object detection device, and robot system
US20230297068A1 (en) Information processing device and information processing method
CN110026976A (en) The robot of elevator and the method using the robot sending and receiving article above and below energy
EP1480169A1 (en) Image processing apparatus
CN111476840B (en) Target positioning method, device, equipment and computer readable storage medium
CN113269112A (en) Method and device for identifying capture area, electronic equipment and storage medium
JP6041710B2 (en) Image recognition method
KR102317041B1 (en) Gripping System of object and method thereof
JP7481867B2 (en) Control device and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant