WO2019037013A1 - Method for a robot to stack goods and robot - Google Patents
Method for a robot to stack goods and robot
- Publication number
- WO2019037013A1 WO2019037013A1 PCT/CN2017/098781 CN2017098781W WO2019037013A1 WO 2019037013 A1 WO2019037013 A1 WO 2019037013A1 CN 2017098781 W CN2017098781 W CN 2017098781W WO 2019037013 A1 WO2019037013 A1 WO 2019037013A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor
- end effector
- goods
- target state
- robot
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
Definitions
- The invention belongs to the field of robot technology, and in particular relates to a method for a robot to stack goods and to a robot.
- The invention provides a method for a robot to stack goods and a robot, aiming to perform a placing or holding operation according to the sensing parameters of a sensor as the robot approaches the position of the goods to be held or the target position at which the goods are to be placed, thereby solving the problem that goods cannot be accurately held and placed.
- A first aspect of the embodiments of the present invention provides a method for a robot to stack goods, the robot comprising an end effector disposed at the operating end of the robot. The method comprises: continuously acquiring sensing parameters through a sensor; controlling the end effector of the robot to approach a target state; and, if the sensing parameters meet a preset condition, controlling the end effector to perform a holding or placing operation.
- A second aspect of the embodiments of the present invention provides a robot, including: an end effector disposed at the operating end of the robot; a sensor configured to acquire sensing parameters; and a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method for a robot to stack goods provided by the first aspect of the embodiments of the present invention.
- The end effector of the robot is controlled to approach the target state while sensing parameters are continuously acquired through the sensor, and the robot is controlled to place or hold the goods according to the continuously acquired sensing parameters, so that the robot accurately completes the holding and placing operations.
- FIG. 1 is a schematic diagram of an application environment of a method for a robot to stack goods according to an embodiment of the present invention;
- FIG. 2 is a schematic flowchart of a method for a robot to stack goods according to a first embodiment of the present invention;
- FIG. 3 is a schematic flowchart of a method for a robot to stack goods according to a second embodiment of the present invention;
- FIG. 4 is a schematic flowchart of a method for a robot to stack goods according to a third embodiment of the present invention;
- FIG. 5 is a schematic structural diagram of a robot according to a fourth embodiment of the present invention.
- FIG. 1 is a schematic diagram of the application environment of a method for a robot to stack goods according to an embodiment of the present invention.
- The robot 10 exchanges data with the server 80 by wire or wirelessly and, according to an instruction sent by the server 80, proceeds to the cargo bay 30 to perform an unloading or loading operation.
- For example, the goods 60 are loaded into the cargo bay 30 from a location outside the cargo bay 30 or from the conveyor 40.
- Alternatively, the goods 60 are unloaded from the cargo bay 30 onto the conveyor 40 or carried to a location outside the cargo bay 30.
- The robot 10 may be a single robot or a robot cluster composed of a plurality of robots.
- Specifically, the robot 10 includes a robot arm and an end effector coupled to the operating end of the robot arm. End effectors include, but are not limited to, suction cups, robotic hands, and gripping devices.
- The robotic hands include dexterous hands.
- FIG. 2 is a schematic flowchart of a method for a robot to stack goods according to a first embodiment of the present invention; the method includes the following steps: S201, acquiring sensing parameters through a sensor; S202, controlling the end effector of the robot to approach a target state and, if the sensing parameters meet a preset condition, controlling the end effector to perform a holding or placing operation.
- In this embodiment, the sensor may include, but is not limited to, one of, or a combination of several of, the following: a visual sensor, a ranging sensor, a proximity sensor, a tactile sensor, a vacuum sensor, a force feedback sensor, a torque sensor, and a force and torque sensor.
- The target state includes a target position and/or a target posture.
- Specifically, the target state includes a first target state corresponding to performing a placing operation, or a second target state corresponding to performing a holding operation.
- Accordingly, when the end effector of the robot is controlled to approach the first target state and the sensing parameters meet a first preset condition, a placing operation is performed to place the goods to be placed.
- When the end effector of the robot is controlled to approach the second target state and the sensing parameters meet a second preset condition, a holding operation is performed to acquire the goods to be held.
- In the embodiments of the present invention, in the process of approaching the target state, sensing parameters are acquired through the sensor, and the robot is controlled to place or hold the goods according to the sensing parameters, which allows the robot to accurately place and hold the goods.
- When applied to the field of logistics and warehousing, and in particular to goods-stacking schemes, the spacing between goods can be kept appropriate, the goods can be placed neatly, placement space is saved, and the stack of goods is stable.
- When unloading, the unloading task can be completed stably and efficiently.
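As a minimal illustration of the first embodiment, the Python sketch below shows the sense-approach-act loop described above. The `Robot` interface and the names `read_sensors`, `step_towards`, `stop` and `perform_operation` are hypothetical placeholders introduced for the example, not part of the patent.

```python
# Minimal sketch (assumed interfaces) of the first embodiment:
# continuously read sensing parameters while the end effector approaches
# the target state; once they meet the preset condition, stop the
# approach and perform the hold/place action.

def stack_step(robot, target_state, meets_preset_condition, operation):
    """operation is either 'hold' or 'place'."""
    while not robot.at(target_state):
        params = robot.read_sensors()           # S201: acquire sensing parameters
        if meets_preset_condition(params):      # e.g. contact force inside a region
            robot.stop()                        # stop the approach motion
            robot.perform_operation(operation)  # S202: hold or place the goods
            return True
        robot.step_towards(target_state)        # keep approaching the target state
    return False
```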
- FIG. 3 is a schematic flowchart of a method for a robot to stack goods according to a second embodiment of the present invention. This embodiment is described from the viewpoint of placing goods; as shown in FIG. 3, the method includes the following steps:
- S301, acquiring sensing parameters through a sensor;
- S302, according to the first target state and the current state of the end effector, planning a first execution path along which the end effector travels from the current state to a third target state, the third target state being at a first distance from the first target state;
- S304, controlling the end effector to reach the third target state according to the first execution path;
- S305, when the end effector is controlled to approach the first target state, if the sensing parameters meet a first preset condition, controlling the end effector to place the held goods to be placed.
- The first target state corresponds to performing a placing operation. Based on the first target state of the goods and the current state of the end effector, a first execution path is planned along which the end effector travels from its current state to a third target state that is at a first distance from the first target state. The end effector of the robot is controlled to reach the third target state according to the first execution path, then approaches the first target state, and when the sensing parameters meet the first preset condition, the end effector is controlled to place the goods to be placed.
- The current state of the end effector includes the current position and/or current posture of the held goods to be placed.
- The third target state is a feasible position and/or feasible posture close to the first target state.
- The first target state is approached via the third target state.
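The two-phase strategy above (plan a path to a nearby intermediate state, then close the remaining distance under sensor supervision) can be sketched as follows. `plan_path`, `execute_path`, `offset` and the state objects are assumed helpers introduced for illustration, not defined by the patent.

```python
# Sketch of the second embodiment under assumed helper functions:
# first move along a planned path to the third target state (a feasible
# pose a first distance away from the placement pose), then approach the
# first target state while checking the first preset condition.

def place_goods(robot, first_target_state, first_preset_condition, standoff_distance):
    current_state = robot.current_state()                  # position and/or posture
    third_target_state = first_target_state.offset(standoff_distance)
    first_execution_path = plan_path(current_state, third_target_state)
    robot.execute_path(first_execution_path)               # reach the third target state
    while not robot.at(first_target_state):
        params = robot.read_sensors()
        if first_preset_condition(params):                 # e.g. external force in first region
            robot.release()                                # place the held goods
            return True
        robot.step_towards(first_target_state)             # continue the approach
    return False
```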
- In the approaching process, whether to perform the placing operation is judged in combination with the sensing parameters of the sensor. For example, in a logistics and warehousing application scenario, a robot that already holds the goods to be placed judges, according to a force and torque sensor, that the held goods have not yet touched any surrounding surface, where such surfaces include the ground, the bulkheads of the cargo bay and walls, as well as goods already stacked on the lower, left, right and rear sides.
- As the end effector gradually approaches the first target state, the goods to be placed come into contact with the bearing surface below, such as the aforementioned ground, the bottom of the cargo bay or goods underneath, and an external force is generated; this proves that the intended bottom placement surface has been reached, and it can serve as the judgment condition for the robot to release the goods to be placed.
- Of course, in scenarios where there are also goods or obstacles near surfaces other than the bottom surface, the condition for the robot to judge whether placement is possible may be that external forces are generated on the rear side and the lower side of the goods, and it may additionally require that an external force is also generated on the left side or the right side, in which case tight stacking can be judged to have been achieved.
- External forces generated on other sides, such as the left and right sides, may also be used as adjustment parameters.
- For example, if an external force is detected on the left side during the approach to the target position, this indicates that the goods need to be shifted to the right before being placed.
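A compact sketch of the force-based release test just described is given below; the per-side force representation, the axis convention and the threshold value are assumptions made for the example.

```python
# Sketch (assumed sensor interface): decide whether to release the goods
# based on contact forces on the bottom/rear sides, and use lateral forces
# only to adjust the placement position, as described above.

CONTACT_THRESHOLD = 5.0  # newtons; assumed example value

def placement_decision(wrench):
    """wrench: dict of contact force magnitudes per side, e.g. derived from a force/torque sensor."""
    if wrench["left"] > CONTACT_THRESHOLD:
        return ("adjust", "shift_right")      # left-side force: shift right, then place
    if wrench["right"] > CONTACT_THRESHOLD:
        return ("adjust", "shift_left")
    if wrench["bottom"] > CONTACT_THRESHOLD and wrench["rear"] > CONTACT_THRESHOLD:
        return ("release", None)              # tight against bottom and rear: place
    return ("approach", None)                 # keep approaching the first target state
```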
- Optionally, an external force is detected through the sensor to acquire an external force parameter.
- The sensing parameters include the external force parameter acquired by the sensor.
- The first preset condition includes: the external force parameter belongs to a first region parameter.
- The region covered by the first region parameter can be set flexibly according to the constraints of the scenario requirements.
- For example, the region may be set to the external force parameters acquired by the sensor when an external force is detected in the direction corresponding to the first target state; it may also be the external force parameters acquired by the sensor when an external force is detected in the direction of approach to the first target state; or it may be the external force parameters acquired by the sensor when an external force is detected in a specific direction and the force is greater than a given value.
- It should be understood that the specific direction may include constraints in multiple directions.
- Under these constraints, the range of external force parameters acquired by the sensor that meets the requirements is the first region parameter.
- The first region parameter is a sensing parameter region corresponding to the first target state, set according to the needs of the specific application scenario. For example, if the first target state required by the scenario includes the placement position and placement posture of the goods to be placed, then the placing operation can be performed when, in the process of approaching the first target state, the tightness requirement for stacking the goods is satisfied.
- In this example, whether contact has occurred can be judged by detecting the reaction force exerted on the goods by other goods or by a bulkhead, and the force may even be required to reach a certain range; the region of sensing parameters that meets the tightness requirement is the first region parameter required for this scenario.
- When, in the process of approaching the first target state, the sensing parameters belong to the first region parameter corresponding to the first target state, the placing operation is performed.
- For example, the first region parameter corresponds to the reaction force on any one or several of the bottom, rear, left and right sides of the goods to be placed, such as when the goods contact other goods, the ground or a bulkhead of the cargo bay.
- The reaction force on the goods is in turn transmitted, through the end effector holding the goods to be placed, to the sensor connected to the end effector.
- The connection includes both a direct connection and an indirect connection.
- Optionally, the sensor may include a force feedback sensor and/or a torque sensor, and may also be a force/torque sensor (Force/Torque Sensor).
- The sensor can be disposed at the joint where the operating end of the robot connects to the end effector, or it can be disposed directly on the end effector.
- A force sensor (Force Sensor) is a device that converts the magnitude of a force into a related electrical signal; force is the direct cause of changes in the motion of matter.
- A force and torque sensor (Force/Torque Sensor) is used to measure the interaction force and torque between a support and a supported object, and can be disposed at the joint where the operating end of the robot connects to the end effector.
- Further, the sensor may also include a visual sensor.
- In another embodiment of the present invention, the movable space of the goods to be placed held by the end effector, as the end effector approaches the first target state, can be acquired through the visual sensor.
- The sensing parameters may include the movable space, acquired by the visual sensor, of the goods to be placed approaching the first target state.
- Specifically, the visual sensor includes a three-dimensional camera device, such as an RGB-depth (RGBD) device or binocular stereo vision (Binocular Stereo Vision).
- In this case the sensing parameter is the spatial data acquired by the three-dimensional camera device. A two-dimensional camera device combined with a depth-sensing device may also be used, so that two-dimensional images and depth data, and thus spatial data, can be obtained.
- The sensing parameters meeting the first preset condition then means that the movable space belongs to a preset first spatial range. That is, when the end effector of the robot, holding the goods to be placed, is controlled to approach the first target state, if the movable space of the goods to be placed acquired by the visual sensor belongs to the preset first spatial range, the robot places the goods to be placed.
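One simple way to evaluate the movable-space condition from 3-D camera data is sketched below; the clearance estimate along the approach axis and the range values are illustrative assumptions, not the patent's prescribed computation.

```python
import numpy as np

# Sketch (illustrative assumptions): place the goods when the movable space
# of the held goods towards the first target state falls inside a preset
# first spatial range, here approximated by the nearest obstacle distance
# along the approach direction measured from RGB-D / stereo points.

FIRST_SPATIAL_RANGE = (0.0, 0.01)   # metres of remaining clearance; assumed example values

def movable_space_condition(obstacle_points, goods_front_face_z):
    """obstacle_points: Nx3 array from a 3-D camera, expressed in the approach frame."""
    ahead = obstacle_points[obstacle_points[:, 2] > goods_front_face_z]
    if ahead.size == 0:
        return False                                     # nothing ahead yet, keep approaching
    clearance = float(np.min(ahead[:, 2]) - goods_front_face_z)
    low, high = FIRST_SPATIAL_RANGE
    return low <= clearance <= high                      # movable space within the first range
```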
- Further, the sensor may also include a ranging sensor.
- In another embodiment of the present invention, a first distance by which the goods to be placed approach other objects can be detected through the ranging sensor.
- The sensing parameters may include the distance, detected by the ranging sensor, by which the goods to be placed approach other objects, and the sensing parameters meeting the first preset condition means that this distance belongs to a preset first threshold range. That is, when the end effector of the robot, holding the goods to be placed, is controlled to approach the first target state, if the distance to other objects detected by the ranging sensor belongs to the preset first threshold range, the robot places the goods to be placed.
- In general, a ranging sensor can obtain one-dimensional data.
- With three ranging sensors, the normal vector of the measured surface can be calculated.
- The distance between the end effector and the measured surface can then be calculated from the normal vector.
- The posture of the end effector can further be adjusted accordingly, so as to approach accurately.
- It should be noted that when, during the approach to the first target state through the visual sensor, the detected object falls outside the detection range of the visual sensor, the ranging sensor can also be used to measure sensing parameters required by the scenario, such as the distance between the goods to be placed and other goods.
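As an illustration of the three-rangefinder idea, the sketch below estimates the measured surface's normal and distance from three distance readings; the sensor layout (three known mounting points measuring along the tool z-axis) is an assumption made for the example.

```python
import numpy as np

# Sketch (assumed sensor layout): three 1-D ranging sensors mounted at known
# points on the end effector, all measuring along the tool z-axis. The three
# hit points define the measured surface; their cross product gives its
# normal, from which distance and posture corrections can follow.

SENSOR_MOUNTS = np.array([[0.05, 0.0, 0.0],        # assumed mounting positions (metres)
                          [-0.025, 0.043, 0.0],
                          [-0.025, -0.043, 0.0]])

def surface_normal_and_distance(ranges):
    """ranges: three distance readings along the tool z-axis, in metres."""
    hits = SENSOR_MOUNTS + np.array([[0.0, 0.0, r] for r in ranges])
    normal = np.cross(hits[1] - hits[0], hits[2] - hits[0])
    normal /= np.linalg.norm(normal)                   # unit normal of the measured surface
    distance = abs(np.dot(hits.mean(axis=0), normal))  # distance from the tool origin to the surface
    return normal, distance
```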
- Further, the sensor may also include a proximity sensor.
- In another embodiment of the present invention, first proximity data between the goods to be placed and other objects being approached may be detected through the proximity sensor.
- The sensing parameters may include the first proximity data, detected by the proximity sensor, between the goods to be placed and other objects being approached, and the sensing parameters meeting the first preset condition means that the first proximity data belongs to a preset third threshold range. That is, when the end effector of the robot, holding the goods to be placed, is controlled to approach the first target state, if the first proximity data between the goods to be placed and the other objects being approached belongs to the preset third threshold range, the robot places the goods to be placed.
- The proximity sensor can be used to detect specific proximity parameters, or it can be used only to detect whether proximity data is produced at all.
- When the proximity sensor only detects whether proximity data is produced, the position of the proximity sensor on the robot can be set according to the detection parameters of the proximity sensor. For example, if the triggering distance is 55 mm, proximity data is produced when the distance drops to 55 mm or less, and otherwise there is no data.
- The proximity sensor is then disposed at a position whose distance to the goods falls within 55 mm when the end effector is in a state in which it can hold the goods. When the end effector approaches the second target state, the proximity sensor produces proximity data, which means the end effector is in a state in which it can hold the goods, and the holding is performed.
- Further, the sensor may also include a tactile sensor.
- In another embodiment of the present invention, contact data between the end effector and the goods to be placed can be acquired through the tactile sensor.
- The sensing parameters include the contact data between the end effector and the goods to be placed.
- The sensing parameters meeting the first preset condition includes: the sensing parameters detected by the tactile sensor belong to a third region parameter.
- When the end effector of the robot is controlled to approach the first target state, if it is detected that the sensing parameters detected by the tactile sensor belong to the third region parameter, the goods to be placed are placed.
- The third region is the combination of sensing parameters acquired by all the tactile sensors that corresponds to the first target state, set according to the needs of the specific application scenario.
- For example, if the first target state required by the scenario includes the placement position and placement posture of the goods to be placed, then in the process of approaching the first target state the placing operation may be performed if the acquired contact data satisfies the tightness requirement for stacking the goods.
- In this example, whether contact has occurred can be judged by detecting the reaction force exerted on the goods by other goods or bulkheads, and the force may even be required to reach a certain range; the sensing parameters that meet the tightness requirement are then those detected by the sensor that fall within the third region under the constraints of the scenario requirements.
- When, in the process of approaching the first target state, the sensing parameters fall within the range of the third region corresponding to the first target state, the placing is performed.
- For example, the third region parameter corresponds to the reaction force on any one or several of the bottom, rear, left and right sides of the goods to be placed, such as when the goods contact other goods, the ground or a bulkhead of the cargo bay.
- The reaction force on the goods is in turn transmitted, through the end effector holding the goods to be placed, to the tactile sensor connected to the end effector.
- Specifically, the tactile sensor can be disposed in the area that contacts the goods when the end effector holds the goods.
- Further, the tactile sensor can also detect the distribution of forces.
- The setting of the third region may also include contact-data constraints on the contact distribution between the end effector and the goods. For example, when the end effector is a robotic hand, the tactile sensors are disposed in the areas where the fingers of the hand contact the goods; the setting of the third region then includes the range of contact data defined by the contact distribution of each finger with the goods. Suppose the contact distribution is defined such that the contact area of each finger with the goods is larger than a first area. It should be understood that, depending on the scenario requirements, the specific contact-data range of the third region may further include constraints on the magnitude of the force.
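A small sketch of the per-finger contact check just described is given below; the tactile array layout, the pressure-to-area conversion and all numeric values are illustrative assumptions.

```python
import numpy as np

# Sketch (assumed data layout): each finger carries a small tactile array;
# a taxel counts as "in contact" above a pressure threshold. The third
# region here requires every finger's contact area to exceed a first area,
# optionally with a bound on total force, as described above.

TAXEL_AREA_M2 = 1.0e-5          # assumed area represented by one taxel
PRESSURE_THRESHOLD = 2.0e3      # Pa; assumed contact threshold
FIRST_AREA_M2 = 4.0e-4          # the "first area" each finger must exceed (assumed value)

def in_third_region(finger_pressure_maps, max_total_force=None):
    """finger_pressure_maps: list of 2-D numpy arrays of taxel pressures, one per finger."""
    for pressures in finger_pressure_maps:
        contact_area = np.count_nonzero(pressures > PRESSURE_THRESHOLD) * TAXEL_AREA_M2
        if contact_area <= FIRST_AREA_M2:
            return False                                 # some finger not sufficiently in contact
    if max_total_force is not None:
        total_force = sum(float(np.sum(p)) * TAXEL_AREA_M2 for p in finger_pressure_maps)
        if total_force > max_total_force:
            return False                                 # force-magnitude constraint violated
    return True
```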
- In another embodiment of the present invention, the end effector of the robot includes a suction cup.
- The sensor may also include a vacuum sensor, and the sensing parameters include a negative pressure parameter.
- The first preset condition includes: the value of the negative pressure parameter produces a preset fluctuation.
- The preset fluctuation corresponds to the goods contacting the placement surface, such as other goods, the ground or a bulkhead of the cargo bay, which generates a reaction force on the goods; this reaction is transmitted, through the suction cup holding the goods to be placed, to the vacuum sensor connected to the end effector.
- The vacuum sensor then registers a change in negative pressure, and when the change conforms to the preset fluctuation, the first preset condition is met and the placing operation is performed.
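One way to detect the "preset fluctuation" in the suction-cup vacuum reading is a simple change-over-a-window test, sketched below; the window length and fluctuation amplitude are assumed example values, not figures from the patent.

```python
from collections import deque

# Sketch (assumed parameters): monitor the negative-pressure signal in a
# short sliding window; a swing larger than a preset amplitude is treated
# as the "preset fluctuation" indicating that the held goods have touched
# the placement surface, so the placing operation may proceed.

WINDOW = 20                      # samples; assumed
FLUCTUATION_THRESHOLD_KPA = 3.0  # assumed example amplitude

class VacuumFluctuationDetector:
    def __init__(self):
        self.samples = deque(maxlen=WINDOW)

    def update(self, negative_pressure_kpa: float) -> bool:
        self.samples.append(negative_pressure_kpa)
        if len(self.samples) < WINDOW:
            return False
        swing = max(self.samples) - min(self.samples)
        return swing >= FLUCTUATION_THRESHOLD_KPA    # preset fluctuation detected
```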
- In the embodiments of the present invention, in the process of approaching the target state, sensing parameters are acquired through the sensor and the robot is controlled to place the goods according to the sensing parameters.
- This allows the robot to accurately complete the placement of the goods.
- When applied to the field of logistics and warehousing, and in particular to goods-stacking schemes, the spacing between goods can be kept appropriate, the goods can be placed neatly, placement space is saved, and the stack of goods is stable.
- When unloading, the unloading task can be completed stably and efficiently.
- FIG. 4 is a schematic flowchart of a method for a robot to stack goods according to a third embodiment of the present invention. This embodiment is described from the viewpoint of holding goods; as shown in FIG. 4, the method includes the following steps:
- S401, acquiring sensing parameters through a sensor;
- S402, according to the second target state and the current state of the end effector, planning a second execution path along which the end effector travels from the current state to a fourth target state, the fourth target state being at a second distance from the second target state;
- S403, controlling the end effector to reach the fourth target state according to the second execution path;
- S404, when the end effector approaches the second target state, if the sensing parameters meet a second preset condition, controlling the end effector to perform the operation of holding the goods to be held.
- It should be noted that planning according to the target state in which the goods are to be held and the current state of the end effector includes: according to the second target state and the current state of the end effector, planning a second execution path along which the end effector travels from the current state to a fourth target state at a second distance from the second target state; after the end effector of the robot reaches the fourth target state according to the second execution path, it approaches the second target state, and when the sensing parameters meet the second preset condition, the end effector is controlled to hold the goods to be held.
- The current state of the end effector includes the current position and/or current posture of the end effector.
- The fourth target state is a feasible position and/or feasible posture close to the second target state; the second target state is approached by reaching the fourth target state.
- Specifically, acquiring sensing parameters through the sensor includes: detecting an external force through the sensor and acquiring an external force parameter.
- The second preset condition then includes: the external force parameter belongs to a second region parameter.
- The second region parameter is a sensing parameter region corresponding to the second target state, set according to the needs of the specific application scenario.
- For example, if the second target state required by the scenario includes the holding position and holding posture for holding the goods to be held, then in the process of approaching the second target state the holding can be performed once the end effector is in contact with the goods to be held.
- In this example, whether the end effector has contacted the goods to be held can be detected, and the interaction force with the goods may even be required to reach a certain range; the region of sensing parameters that meets this requirement is the second region parameter.
- The second region parameter corresponds, for example, to contacting, or even slightly squeezing, the goods to be held; the goods to be held exert a reaction force on the end effector, which in turn produces a force on the sensor connected to the end effector, yielding the sensing parameters.
- The connection includes both a direct connection and an indirect connection.
- The sensor may include a force feedback sensor and/or a torque sensor, and may also be a force/torque sensor (Force/Torque Sensor).
- The sensor can be disposed at the joint where the operating end connects to the end effector, or it can be disposed directly on the end effector.
- Further, the sensor includes a visual sensor.
- Specifically, the visual sensor includes a three-dimensional camera device, such as an RGB-depth device or binocular stereo vision.
- In this case the sensing parameter is the spatial data acquired by the three-dimensional camera device. A two-dimensional camera device combined with a depth-sensing device may also be used, so that two-dimensional images and depth data, and thus spatial data, can be obtained.
- In another embodiment of the present invention, the relative state of the end effector and the goods to be held is acquired through the visual sensor.
- The sensing parameters may include the relative state of the end effector and the goods to be held.
- The second preset condition includes: the relative state of the end effector and the goods to be held belongs to a preset holdable state. For example, the second target state may be an initial holding position and/or holding posture for holding the goods to be held.
- When the end effector of the robot is controlled to approach the second target state, that is, to approach the goods to be held, more accurate spatial data of the goods to be held is obtained from the visual data acquired by the visual sensor, so that an optimized holding position and/or holding posture can be obtained; when the end effector is in the optimized holding position and/or holding posture, the goods to be held are acquired.
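The refinement step just described (start from an initial hold pose, then correct it using the camera's close-range estimate of the goods) can be sketched as below; the centroid-based correction, the pre-grasp offset and the tolerance are illustrative assumptions, not the patent's prescribed algorithm.

```python
import numpy as np

# Sketch (illustrative assumptions): refine the holding position by
# re-centring on the goods' centroid estimated from the 3-D camera's
# point cloud once the goods are close enough to be seen well.

HOLDABLE_OFFSET_M = 0.005   # assumed tolerance for the "holdable" relative state

def refine_hold_position(end_effector_position, goods_points):
    """goods_points: Nx3 point cloud of the goods to be held, in the robot frame."""
    centroid = goods_points.mean(axis=0)
    optimized = centroid + np.array([0.0, 0.0, 0.02])    # assumed pre-grasp offset above the goods
    relative = np.linalg.norm(np.asarray(end_effector_position) - optimized)
    holdable = relative < HOLDABLE_OFFSET_M               # relative state within holdable range
    return optimized, holdable
```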
- Further, the sensor may also include a ranging sensor.
- In another embodiment of the present invention, a second distance between the end effector and the goods to be held is acquired through the ranging sensor.
- The second preset condition is that the second distance belongs to a second threshold range within which the end effector can hold the goods to be held. That is, when the end effector of the robot is controlled to approach the goods to be held, i.e. to approach the second target state, if the distance to the goods to be held, judged according to the sensing parameters of the ranging sensor, belongs to the threshold range within which the end effector can hold the goods, the goods to be held are acquired.
- The threshold range includes contact with the goods to be held as well as non-contact states. For example, when the end effector is a suction cup, it suffices that the distance between the suction cup and the goods to be held falls within the suction range of the suction cup.
- In general, a ranging sensor can obtain one-dimensional data.
- With three ranging sensors, the normal vector of the measured surface can be calculated.
- The distance between the end effector and the measured surface can then be calculated from the normal vector.
- The posture of the end effector can further be adjusted accordingly, so as to approach accurately.
- It should be noted that when, during the approach to the second target state through the visual sensor, the detected object falls outside the detection range of the visual sensor, the ranging sensor can also be used to measure sensing parameters required by the scenario, such as the distance between the end effector and the goods to be held, or the distance between the goods to be held and surrounding objects.
- Further, the sensor may also include a proximity sensor.
- In another embodiment of the present invention, second proximity data between the end effector and the goods to be held is acquired through the proximity sensor, and the sensing parameters may further include this second proximity data acquired by the proximity sensor.
- The second preset condition is that the second proximity data belongs to a preset fourth threshold range. That is, when the end effector of the robot is controlled to approach the goods to be held, i.e. to approach the second target state, if the detected second proximity data between the robot and the goods to be held belongs to the preset fourth threshold range, the goods to be held are acquired.
- The proximity sensor may be able to detect specific proximity parameters, or it may only be able to detect whether proximity data is produced.
- When the proximity sensor only detects whether proximity data is produced, the position of the proximity sensor on the robot can be set according to the detection parameters of the proximity sensor. For example, if the triggering distance is 55 mm, proximity data is produced when the distance drops to 55 mm or less, and otherwise there is no data.
- The proximity sensor is then disposed at a position whose distance to the goods falls within 55 mm when the end effector is in a state in which it can hold the goods. When the end effector approaches the second target state, the proximity sensor produces proximity data, which means the end effector is in a state in which it can hold the goods, and the holding operation is performed.
- Further, the sensor may also include a tactile sensor.
- In another embodiment of the present invention, contact data between the end effector and the goods to be held may be acquired through the tactile sensor.
- The second preset condition includes: the contact data between the end effector and the goods to be held belongs to a fourth region parameter. That is, when the end effector of the robot is controlled to approach the goods to be held, i.e. to approach the second target state, if it is detected that the contact data between the end effector and the goods to be held belongs to the fourth region parameter, the goods to be held are held. Since the tactile sensor can detect the magnitude of force, the setting of the fourth region may also include constraints on the magnitude of the force exerted on the goods.
- Further, the tactile sensor can also detect the distribution of forces.
- The setting of the fourth region may also include contact-data constraints on the contact distribution between the end effector and the goods.
- For example, when the end effector is a robotic hand, the tactile sensors are disposed in the areas where the fingers of the hand contact the goods.
- The setting of the fourth region then includes the range of contact data defined by the contact distribution of each finger with the goods. Suppose the contact distribution is defined such that the contact area of each finger with the goods is larger than a first area.
- The robotic hand approaching the second target state includes the hand closing in to hold the goods to be held, for example gradually closing the fingers until the contact data obtained by the tactile sensor of each finger corresponds to a contact area with the goods to be held larger than the first area, at which point the holding is performed (a sketch of this closing loop is given below).
- It should be understood that, depending on the scenario requirements, the specific contact-data range of the fourth region may further include constraints on the magnitude of the force.
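A minimal sketch of the finger-closing loop referenced above follows; the gripper interface, the contact-area computation and all numeric values are assumptions made for illustration.

```python
import numpy as np

# Sketch (assumed gripper interface): close the fingers in small steps
# until the tactile array of every finger reports a contact area with the
# goods larger than the first area, then stop and hold.

TAXEL_AREA_M2 = 1.0e-5
PRESSURE_THRESHOLD = 2.0e3      # Pa; assumed contact threshold
FIRST_AREA_M2 = 4.0e-4          # assumed "first area" per finger
CLOSE_STEP = 0.001              # metres per closing step; assumed

def close_until_held(gripper, max_steps=50):
    for _ in range(max_steps):
        maps = gripper.read_tactile_maps()            # one pressure array per finger
        areas = [np.count_nonzero(m > PRESSURE_THRESHOLD) * TAXEL_AREA_M2 for m in maps]
        if all(a > FIRST_AREA_M2 for a in areas):     # fourth-region contact condition met
            gripper.hold()                            # maintain the current grip; goods are held
            return True
        gripper.close_by(CLOSE_STEP)                  # gather the fingers a little more
    return False
```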
- Further, the contact data from the tactile sensor can also be used to monitor, while the goods are held, whether the goods are slipping. If they slip, the end effector is controlled to increase the holding force. It may also include adjusting the posture of the goods while they are held.
- Further, the texture of the goods can be obtained from the contact data.
- For example, the second target state is a holding position and/or holding posture corresponding to an area of the goods that carries a texture feature. Then, as the end effector approaches the second target state, whether the end effector has contacted the area with the texture feature is judged from the contact data; when the features of the contact data between the end effector and the goods to be held match the features of the contact data corresponding to the end effector contacting the textured area of the goods to be held, it is confirmed that the second preset condition is met, and the holding operation is performed.
- In another feasible implementation, contact with the goods to be held is first determined from the contact data in the manner described above, and the holding is then performed.
- After holding, the end effector is controlled to adjust the posture of the goods. For example, if it is known from the visual sensor or from known goods model data that the texture feature is located above the area where the end effector holds the goods, the holding force of the end effector can be reduced while contact is maintained, i.e. the force between the end effector and the goods is controlled according to the contact data.
- In the contact state friction still exists, and the goods are allowed to slide down under control until the texture feature reaches the holding area of the end effector; that is, during the sliding phase it is judged from the contact data whether the texture feature is present, and if so the end effector is controlled to increase the holding force, so that the goods are held stably in the end effector with the held area lying in the textured area of the goods.
- Specifically, when the texture feature includes goods information, the texture feature can be obtained through the tactile sensor, and the information of the goods can thereby be obtained.
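The slip-monitoring behaviour described above can be sketched as a simple feedback rule; the slip metric (frame-to-frame change in the tactile image) and the gain are illustrative assumptions, not the patent's prescribed method.

```python
import numpy as np

# Sketch (assumed slip metric): while the goods are held, treat a rapid
# change between consecutive tactile images as evidence of slipping and
# increase the holding force; otherwise keep the current force.

SLIP_THRESHOLD = 0.15     # normalised frame-to-frame change; assumed
FORCE_INCREMENT_N = 2.0   # assumed increase per detected slip event

def update_grip_force(previous_map, current_map, grip_force):
    """previous_map / current_map: tactile pressure arrays from the same finger."""
    denom = np.linalg.norm(previous_map) + 1e-9
    change = np.linalg.norm(current_map - previous_map) / denom
    if change > SLIP_THRESHOLD:                # goods appear to be slipping
        grip_force += FORCE_INCREMENT_N        # control the end effector to grip harder
    return grip_force
```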
- Further, the sensor may also include a vacuum sensor.
- In another embodiment of the present invention, the end effector of the robot includes a suction cup.
- The sensor may also include a vacuum sensor, and the sensing parameters include a negative pressure parameter.
- The second preset condition includes: the value of the negative pressure parameter is greater than a preset negative pressure threshold.
- The preset negative pressure threshold corresponds to the negative pressure value required for the suction cup to hold the goods to be held.
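For the suction-cup case, this holding-side test is a simple threshold on the negative-pressure reading, the counterpart of the fluctuation test sketched earlier for placing; the threshold value below is an assumed example.

```python
# Sketch: the suction cup is considered to have acquired the goods once the
# measured negative pressure exceeds the preset threshold required to hold
# them (assumed example value).

PRESET_NEGATIVE_PRESSURE_KPA = 40.0   # assumed threshold for a secure hold

def suction_hold_condition(negative_pressure_kpa: float) -> bool:
    return negative_pressure_kpa > PRESET_NEGATIVE_PRESSURE_KPA
```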
- The first region parameter, the second region parameter, the third region parameter and the fourth region parameter each comprise a set of multiple sensing parameters corresponding to a multi-dimensional space, such as a one-dimensional interval, a two-dimensional region, a three-dimensional space or a six-dimensional space.
- The specific values of the multiple sensing parameters in such a set may be continuous or discontinuous. For example, when two sensors each capable of measuring six dimensions are included, 12-dimensional spatial data can be obtained to describe the state measured by the sensors.
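A region parameter can be represented very simply as per-dimension bounds, as sketched below for the 12-dimensional case mentioned above (two 6-axis force/torque readings); the bound values are placeholders, not taken from the patent.

```python
import numpy as np

# Sketch: represent a region parameter as an axis-aligned box of lower and
# upper bounds in the concatenated sensor space (here 12-D: two 6-axis
# force/torque readings), and test membership of a sensing-parameter vector.

REGION_LOWER = np.full(12, -1.0)   # placeholder bounds
REGION_UPPER = np.full(12, 1.0)

def in_region(sensing_parameters):
    """sensing_parameters: length-12 vector of concatenated sensor readings."""
    x = np.asarray(sensing_parameters, dtype=float)
    return bool(np.all(x >= REGION_LOWER) and np.all(x <= REGION_UPPER))
```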
- In the embodiments of the present invention, in the process of approaching the target state, sensing parameters are acquired through the sensor and the robot is controlled to hold the goods according to the sensing parameters, which allows the robot to accurately complete the holding of the goods.
- When applied to the field of logistics and warehousing, and in particular to goods-stacking schemes, the spacing between goods can be kept appropriate, the goods can be placed neatly, placement space is saved, and the stack of goods is stable.
- In the above embodiments, after reaching the third target state close to the first target state, or the fourth target state close to the second target state, the sensing parameters acquired by the sensor during the approach to the first target state or the second target state are used to decide whether to perform the placing or holding operation. This avoids the problem that, because of errors in the robot's calculations and mechanical operation, a single plan to reach the target state and perform the placing or holding would squeeze other goods or a wall, or fail to acquire the goods, causing the task to fail, and thereby increases the probability of keeping the goods intact.
- FIG. 5 shows a robot according to a fourth embodiment of the present invention. The robot is the robot 10 in FIG. 1, and the robot 10 includes:
- an end effector 101, disposed at the operating end of the robot 10; the operating end may be the executing end of a robot arm, and the end effector 101 is disposed at that executing end;
- a sensor 102, configured to acquire sensing parameters;
- a memory 103, a processor 104 and a computer program stored in the memory 103 and executable on the processor 104, wherein the processor 104, when executing the computer program, implements the method for a robot to stack goods described in the embodiments shown in FIG. 2 to FIG. 4.
- the end effector 101, the sensor 102, the memory 103, and the processor 104 are communicatively coupled, including but not limited to being connected by a bus 105.
- The memory 103 may be a high-speed random access memory (RAM), or a non-volatile memory such as a disk memory.
- The memory 103 is used to store a set of executable program code, and the processor 104 is coupled to the memory 103.
- In some embodiments, the memory 103 may optionally include memory located remotely from the processor 104, and such remote memory may be connected to the robot via a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
- Further, the sensor 102 includes one or more of: a visual sensor, a ranging sensor, a proximity sensor, a tactile sensor, a vacuum sensor, a force feedback sensor, a torque sensor, and a force and torque sensor.
- The force feedback sensor is disposed on the end effector, and the force and torque sensor is disposed at the joint where the operating end connects to the end effector.
- Specifically, the force feedback sensor is used to detect an external force.
- The force and torque sensor is used to detect external forces and torques.
- The visual sensor is used to acquire the movable space of the goods to be placed as they approach the first target state.
- The visual sensor is also used to acquire data of the goods to be held, which is used to obtain the distance to the goods to be held.
- the ranging sensor is used to detect the distance that the goods to be placed approach other objects.
- the ranging sensor is also used to detect the distance from the goods to be held.
- The proximity sensor is used to detect first proximity data between the goods to be placed and other objects being approached.
- The proximity sensor is also used to detect second proximity data with respect to the goods to be held.
- the tactile sensor is used to obtain contact data generated when the end effector is in contact with the goods to be placed or the goods to be held.
- the vacuum sensor is used to obtain the negative pressure parameter. Further, after determining whether to suck the object according to the negative pressure parameter, the vacuum sensor can also be used to adjust the magnitude of the suction.
- the vacuum sensor is also used to measure fluctuations in negative pressure.
- According to the sensor parameters detected by the above sensors, the goods to be placed are placed when it is confirmed that the sensing parameters obtained through the sensors meet the first preset condition, and the goods to be held are acquired when it is confirmed that the sensing parameters obtained through the sensors meet the second preset condition.
- the robot further includes: a moving mechanism, a robot arm, a body, and a power supply.
- the bottom of the moving mechanism is provided with a plurality of wheels, and the robot is moved in all directions by driving the wheels to rotate.
- the processor 104 and the memory 103 described above are disposed in the body.
- In the embodiments of the present invention, at least one sensor is disposed on the robot; sensing parameters are acquired through the sensor and the robot is controlled to place or hold the goods according to the sensing parameters, so that the robot accurately completes the operations on the goods when stacking them, keeping the spacing between goods appropriate, placing the goods neatly, saving placement space and keeping the stack of goods stable.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
Abstract
A method for a robot to stack goods and a robot (10). The method comprises: acquiring sensing parameters through a sensor (102); controlling an end effector (101) of the robot (10) to approach a target state; and, if the sensing parameters meet a preset condition, controlling the end effector (101) to perform a holding or placing operation. The method and the robot can accurately complete the placing and holding of goods when stacking them, so that the stack is stable and placement space is saved.
Description
The present invention belongs to the field of robot technology, and in particular relates to a method for a robot to stack goods and to a robot.
The rapid development of technology has brought tremendous changes to people's lives and has also improved the working efficiency of enterprises. The vigorous growth of e-commerce has likewise placed unprecedented demands for speed on domestic warehousing and logistics. The concept of applying intelligent robots to logistics is increasingly being raised and researched in multiple directions. Among these, in the loading, unloading and handling of goods, how to maintain the stability of loading and unloading and the quality of stacking has become an important technical problem.
In the prior art, a robot usually has to stack goods at designated positions during loading and unloading. However, in the current robotics field, hardware inevitably suffers from mechanical errors and calculation errors, so that in actual operation the placement of goods cannot be completed accurately. The distances between goods then vary and are uneven, which easily makes the stack unstable, leads to the collapse of the stacked goods, and makes it difficult to meet stacking-quality requirements. At the same time, placement space is wasted, making it difficult to meet stacking-rate requirements. In addition, owing to the particularities of the logistics and warehousing field, one difficulty in applying robots is that differences in the shape, size and weight of goods make it even harder to satisfy high-quality, efficient stacking requirements.
Therefore, how to solve the problem of accurate holding and placing in the robotics field, and further how to apply this to the logistics and warehousing field to stack goods effectively, are problems to be solved urgently.
Summary of the Invention
The present invention provides a method for a robot to stack goods and a robot, aiming to perform a placing or holding operation according to the sensing parameters of a sensor when the robot approaches the position at which goods are to be placed or the target position of the goods to be held, thereby solving the problem that goods cannot be accurately held and placed.
A first aspect of the embodiments of the present invention provides a method for a robot to stack goods, the robot comprising an end effector disposed at the operating end of the robot. The method comprises: continuously acquiring sensing parameters through a sensor; controlling the end effector of the robot to approach a target state; and, if the sensing parameters meet a preset condition, controlling the end effector to perform a holding or placing operation.
A second aspect of the embodiments of the present invention provides a robot, including: an end effector disposed at the operating end of the robot; a sensor configured to acquire sensing parameters; and a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method for a robot to stack goods provided by the first aspect of the embodiments of the present invention.
It can be seen from the above embodiments of the present invention that the end effector of the robot is controlled to approach the target state while sensing parameters are continuously acquired through the sensor, and the robot is controlled to place or hold the goods according to the continuously acquired sensing parameters, so that the robot accurately completes the holding and placing operations.
In order to explain the technical solutions in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings described below illustrate merely some embodiments of the present invention.
FIG. 1 is a schematic diagram of an application environment of a method for a robot to stack goods according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of a method for a robot to stack goods according to a first embodiment of the present invention;
FIG. 3 is a schematic flowchart of a method for a robot to stack goods according to a second embodiment of the present invention;
FIG. 4 is a schematic flowchart of a method for a robot to stack goods according to a third embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a robot according to a fourth embodiment of the present invention.
In order to make the objects, features and advantages of the present invention more obvious and comprehensible, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only some of the embodiments of the present invention rather than all of them. Based on the embodiments of the present invention, all other embodiments obtained by those skilled in the art without creative effort fall within the protection scope of the present invention.
For ease of understanding, the solution of the present invention for accurate holding and placing by a robot is explained below with reference to an example of a specific application environment in the logistics and warehousing field.
Referring to FIG. 1, FIG. 1 is a schematic diagram of an application environment of a method for a robot to stack goods according to an embodiment of the present invention. The robot 10 exchanges data with the server 80 by wire or wirelessly and, according to an instruction sent by the server 80, proceeds to the cargo bay 30 to perform an unloading or loading operation; for example, the goods 60 are loaded into the cargo bay 30 from a location outside the cargo bay 30 or from the conveyor 40, or the goods 60 are unloaded from the cargo bay 30 onto the conveyor 40 or carried to a location outside the cargo bay 30. The robot 10 may be a single robot or a robot cluster composed of a plurality of robots. Specifically, the robot 10 includes a robot arm and an end effector coupled to the operating end of the robot arm. End effectors include, but are not limited to, suction cups, robotic hands and gripping devices; the robotic hands include dexterous hands.
Referring to FIG. 2, FIG. 2 is a schematic flowchart of a method for a robot to stack goods according to a first embodiment of the present invention. The method includes the following steps:
S201, acquiring sensing parameters through a sensor.
S202, controlling the end effector of the robot to approach a target state and, if the sensing parameters meet a preset condition, controlling the end effector to perform a holding or placing operation.
In this embodiment, the sensor may include, but is not limited to, one of, or a combination of several of, the following: a visual sensor, a ranging sensor, a proximity sensor, a tactile sensor, a vacuum sensor, a force feedback sensor, a torque sensor, and a force and torque sensor. The target state includes a target position and/or a target posture.
Specifically, the target state includes a first target state corresponding to performing a placing operation, or a second target state corresponding to performing a holding operation. Accordingly, when the end effector of the robot is controlled to approach the first target state and the sensing parameters meet a first preset condition, a placing operation is performed to place the goods to be placed; when the end effector of the robot is controlled to approach the second target state and the sensing parameters meet a second preset condition, a holding operation is performed to hold the goods to be held.
In the embodiments of the present invention, in the process of approaching the target state, sensing parameters are acquired through the sensor and the robot is controlled to place or hold the goods according to the sensing parameters, which allows the robot to accurately complete the placing and holding of goods. When applied to the logistics and warehousing field, and in particular to goods-stacking schemes, the spacing between goods can be kept appropriate, the goods can be placed neatly, placement space is saved and the stack is stable; when unloading, the unloading task can be completed stably and efficiently.
Referring to FIG. 3, FIG. 3 is a schematic flowchart of a method for a robot to stack goods according to a second embodiment of the present invention. This embodiment is described from the viewpoint of placing goods; as shown in FIG. 3, the method includes the following steps:
S301, acquiring sensing parameters through a sensor;
S302, according to the first target state and the current state of the end effector, planning a first execution path along which the end effector travels from the current state to a third target state, the third target state being at a first distance from the first target state;
S304, controlling the end effector to reach the third target state according to the first execution path;
S305, when the end effector is controlled to approach the first target state, if the sensing parameters meet a first preset condition, controlling the end effector to place the held goods to be placed.
The first target state corresponds to performing a placing operation. According to the first target state of the goods and the current state of the end effector, a first execution path is planned along which the end effector travels from its current state to a third target state at a first distance from the first target state. The end effector of the robot is controlled to reach the third target state according to the first execution path, then approaches the first target state, and when the sensing parameters meet the first preset condition, the end effector is controlled to place the goods to be placed. The current state of the end effector includes the current position and/or current posture of the held goods to be placed.
The third target state is a feasible position and/or feasible posture close to the first target state, and the first target state is approached via the third target state. In the approaching process, whether to perform the placing operation is judged in combination with the sensing parameters of the sensor. For example, in a logistics and warehousing application scenario, a robot that already holds the goods to be placed judges, according to a force and torque sensor, that the held goods have not yet touched any surrounding surface, where such surfaces include the ground, bulkheads and walls, as well as goods already stacked on the lower, left, right and rear sides. As the end effector gradually approaches the first target state, the goods to be placed come into contact with the bearing surface below, such as the aforementioned ground, the bottom of the cargo bay or goods underneath, and an external force is generated; this proves that the intended bottom placement surface has been reached and can serve as the judgment condition for the robot to release the goods to be placed. Of course, in scenarios where there are also goods or obstacles near surfaces other than the bottom surface, the condition for the robot to judge whether placement is possible may be that external forces are generated on the rear side and the lower side of the goods, and it may additionally be required that an external force is also generated on the left side or the right side, in which case tight stacking can be judged to have been achieved. External forces generated on other sides, such as the left and right sides, may also be used as adjustment parameters; for example, if an external force is detected on the left side during the approach to the target position, the goods need to be shifted to the right before being placed.
Optionally, an external force is detected through the sensor to acquire an external force parameter. The sensing parameters include the external force parameter acquired by the sensor, and the first preset condition includes: the external force parameter belongs to a first region parameter. The region covered by the first region parameter can be set flexibly according to the constraints of the scenario requirements. For example, it may be set to the external force parameters acquired by the sensor when an external force is detected in the direction corresponding to the first target state, or to those acquired when an external force is detected in the direction of approach to the first target state, or to those acquired when an external force is detected in a specific direction and the force is greater than a given value. It should be understood that the specific direction may include constraints in multiple directions; under these constraints, the range of external force parameters acquired by the sensor that meets the requirements is the first region parameter. The first region parameter is a sensing parameter region corresponding to the first target state, set according to the needs of the specific application scenario. For example, if the first target state required by the scenario includes the placement position and placement posture of the goods to be placed, then the placing can be performed when, in the process of approaching the first target state, the tightness requirement for stacking the goods is satisfied. In this example, whether contact has occurred can be judged by detecting the reaction force exerted on the goods by other goods or bulkheads, and the force may even be required to reach a certain range; the region of sensing parameters meeting the tightness requirement is the first region parameter required for this scenario. When, in the process of approaching the first target state, the sensing parameters belong to the first region parameter corresponding to the first target state, the placing is performed. For example, the first region parameter corresponds to the reaction force on any one or several of the bottom, rear, left and right sides of the goods to be placed, such as when the goods contact other goods, the ground or a bulkhead of the cargo bay; the reaction force on the goods is in turn transmitted, through the end effector holding the goods to be placed, to the sensor connected to the end effector. The connection includes both direct and indirect connections.
Optionally, the sensor may include a force feedback sensor and/or a torque sensor, and may also be a force/torque sensor (Force/Torque Sensor). The sensor can be disposed at the joint where the operating end of the robot connects to the end effector, or directly on the end effector.
A force sensor (Force Sensor) is a device that converts the magnitude of a force into a related electrical signal; force is the direct cause of changes in the motion of matter. A force and torque sensor (Force/Torque Sensor) is used to measure the interaction force and torque between a support and a supported object, and can be disposed at the joint where the operating end of the robot connects to the end effector.
Further, the sensor also includes a visual sensor.
In another embodiment of the present invention, the movable space of the goods to be placed held by the end effector as the end effector approaches the first target state can be acquired through the visual sensor. The sensing parameters may include the movable space, acquired by the visual sensor, of the goods to be placed approaching the first target state. Specifically, the visual sensor includes a three-dimensional camera device, such as an RGB-depth (RGBD) device or binocular stereo vision (Binocular Stereo Vision). The sensing parameter is the spatial data acquired by the three-dimensional camera device; a two-dimensional camera device combined with a depth-sensing device may also be used, so that two-dimensional images and depth data, and thus spatial data, can be obtained.
The sensing parameters meeting the first preset condition then means that the movable space belongs to a preset first spatial range. That is, when the end effector of the robot, holding the goods to be placed, is controlled to approach the first target state, if the movable space of the goods to be placed acquired by the visual sensor belongs to the preset first spatial range, the robot places the goods to be placed.
Further, the sensor also includes a ranging sensor.
In another embodiment of the present invention, a first distance by which the goods to be placed approach other objects can be detected through the ranging sensor. The sensing parameters may include this distance, and the sensing parameters meeting the first preset condition means that the distance belongs to a preset first threshold range. That is, when the end effector of the robot, holding the goods to be placed, is controlled to approach the first target state, if the distance to other objects detected by the ranging sensor belongs to the preset first threshold range, the robot places the goods to be placed.
In general, a ranging sensor obtains one-dimensional data. With three ranging sensors, the normal vector of the measured surface can be calculated; the distance between the end effector and the measured surface can then be calculated from the normal vector, and the posture of the end effector can further be adjusted so as to approach accurately.
It should be noted that when, during the approach to the first target state through the visual sensor, the detected object falls outside the detection range of the visual sensor, the ranging sensor can also be used to measure sensing parameters required by the scenario, such as the distance between the goods to be placed and other goods.
Further, the sensor also includes a proximity sensor.
In another embodiment of the present invention, first proximity data between the goods to be placed and other objects being approached can be detected through the proximity sensor. The sensing parameters may include the first proximity data, and the sensing parameters meeting the first preset condition means that the first proximity data belongs to a preset third threshold range. That is, when the end effector of the robot, holding the goods to be placed, is controlled to approach the first target state, if the first proximity data between the goods to be placed and other objects being approached belongs to the preset third threshold range, the robot places the goods to be placed.
The proximity sensor can be used to detect specific proximity parameters, or only to detect whether proximity data is produced. When the proximity sensor only detects whether proximity data is produced, the position of the proximity sensor on the robot can be set according to its detection parameters. For example, if the triggering distance is 55 mm, proximity data is produced when the distance drops to 55 mm or less, and otherwise there is no data. The proximity sensor is then disposed at a position whose distance to the goods falls within 55 mm when the end effector is in a state in which it can hold the goods; when the end effector approaches the second target state, the proximity sensor produces proximity data, which means the end effector can hold the goods, and the holding is performed.
Further, the sensor also includes a tactile sensor.
In another embodiment of the present invention, contact data between the end effector and the goods to be placed can be acquired through the tactile sensor. The sensing parameters include this contact data, and the sensing parameters meeting the first preset condition includes: the sensing parameters detected by the tactile sensor belong to a third region parameter. When the end effector of the robot is controlled to approach the first target state, if it is detected that the sensing parameters detected by the tactile sensor belong to the third region parameter, the goods to be placed are placed. The third region is the combination of sensing parameters acquired by all the tactile sensors that corresponds to the first target state, set according to the needs of the specific application scenario. For example, if the first target state required by the scenario includes the placement position and placement posture of the goods to be placed, the placing may be performed if, in the process of approaching the first target state, the acquired contact data satisfies the tightness requirement for stacking the goods. In this example, whether contact has occurred can be judged by detecting the reaction force exerted on the goods by other goods or bulkheads, and the force may even be required to reach a certain range; the sensing parameters meeting the tightness requirement are then those that fall within the third region under the constraints of the scenario requirements. When, in the process of approaching the first target state, the sensing parameters fall within the range of the third region corresponding to the first target state, the placing is performed. For example, the third region parameter corresponds to the reaction force on any one or several of the bottom, rear, left and right sides of the goods to be placed, such as when the goods contact other goods, the ground or a bulkhead of the cargo bay; the reaction force is in turn transmitted, through the end effector holding the goods to be placed, to the tactile sensor connected to the end effector. Specifically, the tactile sensor can be disposed in the area that contacts the goods when the end effector holds the goods.
Further, the tactile sensor can also detect the distribution of forces. The setting of the third region may also include contact-data constraints on the contact distribution between the end effector and the goods. For example, when the end effector is a robotic hand, the tactile sensors are disposed in the areas where the fingers contact the goods, and the setting of the third region includes the range of contact data defined by the contact distribution of each finger with the goods; suppose the contact distribution is defined such that the contact area of each finger with the goods is larger than a first area. It should be understood that, depending on the scenario requirements, the specific contact-data range of the third region may further include constraints on the magnitude of the force.
In another embodiment of the present invention, the end effector of the robot includes a suction cup. The sensor may also include a vacuum sensor, and the sensing parameters include a negative pressure parameter. The first preset condition includes: the value of the negative pressure parameter produces a preset fluctuation. The preset fluctuation corresponds to the goods contacting the placement surface, such as other goods, the ground or a bulkhead of the cargo bay, which generates a reaction force on the goods; this reaction is transmitted, through the suction cup holding the goods to be placed, to the vacuum sensor connected to the end effector, which registers a change in negative pressure, and when the change conforms to the preset fluctuation the first preset condition is met and the placing operation is performed.
In the embodiments of the present invention, in the process of approaching the target state, sensing parameters are acquired through the sensor and the robot is controlled to place the goods according to the sensing parameters, which allows the robot to accurately complete the placement of goods. When applied to the logistics and warehousing field, and in particular to goods-stacking schemes, the spacing between goods can be kept appropriate, the goods can be placed neatly, placement space is saved and the stack is stable; when unloading, the unloading task can be completed stably and efficiently.
Referring to FIG. 4, FIG. 4 is a schematic flowchart of a method for a robot to stack goods according to a third embodiment of the present invention. This embodiment is described from the viewpoint of holding goods; as shown in FIG. 4, the method includes the following steps:
S401, acquiring sensing parameters through a sensor.
S402, according to the second target state and the current state of the end effector, planning a second execution path along which the end effector travels from the current state to a fourth target state, the fourth target state being at a second distance from the second target state;
S403, controlling the end effector to reach the fourth target state according to the second execution path;
S404, when the end effector approaches the second target state, if the sensing parameters meet a second preset condition, controlling the end effector to perform the operation of holding the goods to be held.
It should be noted that planning according to the target state in which the goods are to be held and the current state of the end effector includes: according to the second target state and the current state of the end effector, planning a second execution path along which the end effector travels from the current state to a fourth target state at a second distance from the second target state. After the end effector of the robot reaches the fourth target state according to the second execution path, it approaches the second target state, and when the sensing parameters meet the second preset condition, the end effector is controlled to hold the goods. The current state of the end effector includes its current position and/or current posture. The fourth target state is a feasible position and/or feasible posture close to the second target state, and the second target state is approached by reaching the fourth target state.
Specifically, acquiring sensing parameters through the sensor includes: detecting an external force through the sensor and acquiring an external force parameter. The second preset condition then includes: the external force parameter belongs to a second region parameter. The second region parameter is a sensing parameter region corresponding to the second target state, set according to the needs of the specific application scenario. For example, if the second target state required by the scenario includes the holding position and holding posture for holding the goods to be held, then in the process of approaching the second target state the holding can be performed once the end effector is in contact with the goods to be held. In this example, whether the end effector has contacted the goods to be held can be detected, and the interaction force with the goods may even be required to reach a certain range; the region of sensing parameters that meets the holding requirement is the second region parameter. When, in the process of approaching the second target state, the sensing parameters belong to the second region parameter corresponding to the second target state, the holding is performed. For example, the second region parameter corresponds to contacting, or even slightly squeezing, the goods to be held; the goods exert a reaction force on the end effector, which in turn produces a force on the sensor connected to the end effector, yielding the sensing parameters. The connection includes both direct and indirect connections.
The sensor may include a force feedback sensor and/or a torque sensor, and may also be a force/torque sensor (Force/Torque Sensor). The sensor can be disposed at the joint where the operating end connects to the end effector, or directly on the end effector.
Further, the sensor includes a visual sensor. Specifically, the visual sensor includes a three-dimensional camera device, such as an RGB-depth device or binocular stereo vision. The sensing parameter is the spatial data acquired by the three-dimensional camera device; a two-dimensional camera device combined with a depth-sensing device may also be used, so that two-dimensional images and depth data, and thus spatial data, can be obtained.
In another embodiment of the present invention, the relative state of the end effector and the goods to be held is acquired through the visual sensor. The sensing parameters may include this relative state, and the second preset condition includes: the relative state belongs to a preset holdable state. For example, the second target state may be an initial holding position and/or holding posture for holding the goods to be held. When the end effector of the robot is controlled to approach the second target state, that is, to approach the goods to be held, more accurate spatial data of the goods is obtained from the visual data acquired by the visual sensor, so that an optimized holding position and/or holding posture can be obtained; when the end effector is in the optimized holding position and/or posture, the goods to be held are held.
Further, the sensor also includes a ranging sensor.
In another embodiment of the present invention, a second distance between the end effector and the goods to be held is acquired through the ranging sensor. The second preset condition is that the second distance belongs to a second threshold range within which the end effector can hold the goods to be held. That is, when the end effector of the robot is controlled to approach the goods to be held, i.e. to approach the second target state, if the distance to the goods, judged according to the sensing parameters of the ranging sensor, belongs to the threshold range within which the end effector can hold the goods, the goods are held. The threshold range includes contact with the goods as well as non-contact states; for example, when the end effector is a suction cup, it suffices that the distance between the suction cup and the goods to be held falls within the suction range of the suction cup.
In general, a ranging sensor obtains one-dimensional data. With three ranging sensors, the normal vector of the measured surface can be calculated; the distance between the end effector and the measured surface can then be calculated from the normal vector, and the posture of the end effector can further be adjusted so as to approach accurately.
It should be noted that when, during the approach to the second target state through the visual sensor, the detected object falls outside the detection range of the visual sensor, the ranging sensor can also be used to measure sensing parameters required by the scenario, such as the distance between the end effector and the goods to be held, or the distance between the goods to be held and surrounding objects.
Further, the sensor also includes a proximity sensor.
In another embodiment of the present invention, second proximity data between the end effector and the goods to be held is acquired through the proximity sensor, and the sensing parameters may further include this second proximity data. The second preset condition is that the second proximity data belongs to a preset fourth threshold range. That is, when the end effector of the robot is controlled to approach the goods to be held, i.e. to approach the second target state, if the detected second proximity data between the robot and the goods belongs to the preset fourth threshold range, the goods are held.
The proximity sensor may be able to detect specific proximity parameters, or only whether proximity data is produced. When the proximity sensor only detects whether proximity data is produced, the position of the proximity sensor on the robot can be set according to its detection parameters. For example, if the triggering distance is 55 mm, proximity data is produced when the distance drops to 55 mm or less, and otherwise there is no data. The proximity sensor is then disposed at a position whose distance to the goods falls within 55 mm when the end effector is in a state in which it can hold the goods; when the end effector approaches the second target state, the proximity sensor produces proximity data, which means the end effector can hold the goods, and the holding operation is performed.
Further, the sensor also includes a tactile sensor.
In another embodiment of the present invention, contact data between the end effector and the goods to be held can be acquired through the tactile sensor. The second preset condition includes: the contact data between the end effector and the goods to be held belongs to a fourth region parameter. That is, when the end effector of the robot is controlled to approach the goods to be held, i.e. to approach the second target state, if it is detected that the contact data belongs to the fourth region parameter, the goods are held. Since the tactile sensor can detect the magnitude of force, the setting of the fourth region may also include constraints on the magnitude of the force exerted on the goods.
Further, the tactile sensor can also detect the distribution of forces. The setting of the fourth region may also include contact-data constraints on the contact distribution between the end effector and the goods. For example, when the end effector is a robotic hand, the tactile sensors are disposed in the areas where the fingers contact the goods, and the setting of the fourth region includes the range of contact data defined by the contact distribution of each finger with the goods; suppose the contact distribution is defined such that the contact area of each finger with the goods is larger than a first area. The robotic hand approaching the second target state includes the hand closing in to hold the goods, for example gradually closing the fingers until the contact data obtained by the tactile sensor of each finger corresponds to a contact area with the goods larger than the first area, at which point the holding is performed. It should be understood that, depending on the scenario requirements, the specific contact-data range of the fourth region may further include constraints on the magnitude of the force.
Further, the contact data from the tactile sensor can also be used to monitor, while the goods are held, whether the goods are slipping; if they slip, the end effector is controlled to increase the holding force. It may also include adjusting the posture of the held goods while they are held.
Further, the texture of the goods can be obtained from the contact data. For example, the second target state is a holding position and/or holding posture corresponding to an area of the goods that carries a texture feature. Then, as the end effector approaches the second target state, whether the end effector has contacted the textured area is judged from the contact data; when the features of the contact data between the end effector and the goods match the features of the contact data corresponding to the end effector contacting the textured area of the goods, the second preset condition is confirmed to be met and the holding operation is performed.
In another feasible implementation, contact with the goods to be held is first determined from the contact data in the manner described above, and the holding is performed. After holding, the end effector is controlled to adjust the posture of the goods. For example, if it is known from the visual sensor or from known goods model data that the texture feature is located above the area where the end effector holds the goods, the holding force of the end effector can be reduced while contact is maintained, i.e. the force between the end effector and the goods is controlled according to the contact data. In the contact state friction still exists, and the goods are allowed to slide down under control until the texture feature reaches the holding area of the end effector; that is, during the sliding phase it is judged from the contact data whether the texture feature is present, and if so the end effector is controlled to increase the holding force, so that the goods are held stably in the end effector with the held area lying in the textured area of the goods. Specifically, when the texture feature includes goods information, the texture feature can be obtained through the tactile sensor, and the information of the goods can thereby be obtained.
Further, the sensor also includes a vacuum sensor.
In another embodiment of the present invention, the end effector of the robot includes a suction cup. The sensor may also include a vacuum sensor, and the sensing parameters include a negative pressure parameter. The second preset condition includes: the value of the negative pressure parameter is greater than a preset negative pressure threshold. The preset negative pressure threshold corresponds to the negative pressure value required for the suction cup to hold the goods to be held.
It should be understood that performing the holding or placing and judging the sensing parameters can be completed synchronously; that is, once the preset condition is judged to be met, the end effector stops the approach motion.
The first region parameter, the second region parameter, the third region parameter and the fourth region parameter each comprise a set of multiple sensing parameters corresponding to a multi-dimensional space, such as a one-dimensional interval, a two-dimensional region, a three-dimensional space or a six-dimensional space. The specific values of the multiple sensing parameters in such a set may be continuous or discontinuous; for example, when two sensors each capable of measuring six dimensions are included, 12-dimensional spatial data can be obtained to describe the state measured by the sensors.
In the embodiments of the present invention, in the process of approaching the target state, sensing parameters are acquired through the sensor and the robot is controlled to hold the goods according to the sensing parameters, which allows the robot to accurately complete the holding of goods. When applied to the logistics and warehousing field, and in particular to goods-stacking schemes, the spacing between goods can be kept appropriate, the goods can be placed neatly, placement space is saved and the stack is stable.
In the above embodiments, after reaching the third target state close to the first target state, or the fourth target state close to the second target state, the sensing parameters acquired by the sensor during the approach to the first or second target state are used to decide whether to perform the placing or holding. This avoids the problem that, because of errors in the robot's calculations and mechanical operation, a single plan to reach the target state and perform the placing or holding would squeeze other goods or a wall, or fail to hold the goods, causing the task to fail, and thereby increases the probability of keeping the goods intact.
Referring to FIG. 5, FIG. 5 shows a robot according to a fourth embodiment of the present invention. The robot is the robot 10 in FIG. 1, and the robot 10 includes:
an end effector 101, disposed at the operating end of the robot 10; the operating end may be the executing end of a robot arm, and the end effector 101 is disposed at that executing end;
a sensor 102, configured to acquire sensing parameters;
a memory 103, a processor 104 and a computer program stored in the memory 103 and executable on the processor 104, wherein the processor 104, when executing the computer program, implements the method for a robot to stack goods described in the embodiments shown in FIG. 2 to FIG. 4.
The end effector 101, the sensor 102, the memory 103 and the processor 104 are communicatively connected, including but not limited to being connected by a bus 105.
The memory 103 may be a high-speed random access memory (RAM), or a non-volatile memory such as a disk memory. The memory 103 is used to store a set of executable program code, and the processor 104 is coupled to the memory 103. In some embodiments, the memory 103 may optionally include memory located remotely from the processor 104, and such remote memory may be connected to the robot via a network; examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks and combinations thereof.
Further, the sensor 102 includes one or more of: a visual sensor, a ranging sensor, a proximity sensor, a tactile sensor, a vacuum sensor, a force feedback sensor, a torque sensor, and a force and torque sensor.
The force feedback sensor is disposed on the end effector, and the force and torque sensor is disposed at the joint where the operating end connects to the end effector.
Specifically, the force feedback sensor is used to detect an external force.
The force and torque sensor is used to detect external forces and torques.
The visual sensor is used to acquire the movable space of the goods to be placed as they approach the first target state.
The visual sensor is also used to acquire data of the goods to be held, which is used to obtain the distance to the goods to be held.
The ranging sensor is used to detect the distance by which the goods to be placed approach other objects.
The ranging sensor is also used to detect the distance to the goods to be held.
The proximity sensor is used to detect first proximity data between the goods to be placed and other objects being approached.
The proximity sensor is also used to detect second proximity data with respect to the goods to be held.
The tactile sensor is used to acquire contact data produced when the end effector contacts the goods to be placed or the goods to be held.
The vacuum sensor is used to acquire a negative pressure parameter. Further, after judging according to the negative pressure parameter whether the object has been sucked up, the vacuum sensor can also be used to adjust the magnitude of the suction.
The vacuum sensor is also used to measure fluctuations of the negative pressure.
According to the sensor parameters detected by the above sensors, the goods to be placed are placed when it is confirmed that the sensing parameters obtained through the sensors meet the first preset condition, and the goods to be held are held when it is confirmed that the sensing parameters meet the second preset condition. For details, reference is made to the description of the embodiments shown in FIG. 2 to FIG. 4, which is not repeated here.
Further, the robot also includes: a moving mechanism, a robot arm, a body and a power supply.
The bottom of the moving mechanism is provided with a plurality of wheels, and the robot is moved in all directions by driving the wheels to rotate.
The processor 104 and the memory 103 described above are disposed in the body.
For technical details not described exhaustively in this embodiment, reference may be made to the methods provided in the first to third embodiments of the present invention.
In the embodiments of the present invention, at least one sensor is disposed on the robot; sensing parameters are acquired through the sensor and the robot is controlled to place or hold the goods according to the sensing parameters, so that the robot accurately completes the operations on the goods when stacking them, keeping the spacing between goods appropriate, placing the goods neatly, saving placement space and keeping the stack stable.
It should be noted that, for the sake of brevity, each of the foregoing method embodiments is described as a series of action combinations; however, those skilled in the art should understand that the present invention is not limited by the described order of actions, because according to the present invention some steps may be performed in other orders or simultaneously.
Furthermore, those skilled in the art should also understand that the embodiments described in the specification are preferred embodiments, and the actions and modules involved are not necessarily all required by the present invention.
In the above embodiments, the description of each embodiment has its own emphasis; for parts not detailed in a particular embodiment, reference may be made to the related descriptions of the other embodiments.
The above is a description of the method for a robot to stack goods and the robot provided by the present invention. Those skilled in the art may make changes to the specific implementation and application scope according to the ideas of the embodiments of the present invention; in summary, the content of this specification should not be construed as limiting the present invention.
Claims (20)
- A method for a robot to stack goods, wherein the robot comprises an end effector disposed at the operating end of the robot, and the method comprises: acquiring sensing parameters through a sensor; controlling the end effector of the robot to approach a target state; and, if the sensing parameters meet a preset condition, controlling the end effector to perform a holding or placing operation.
- The method according to claim 1, wherein the target state comprises a first target state corresponding to performing a placing operation, and controlling the end effector of the robot to approach the target state and, if the sensing parameters meet the preset condition, controlling the end effector to perform the holding or placing operation comprises: according to the first target state and the current state of the end effector, planning a first execution path along which the end effector travels from the current state to a third target state, the third target state being at a first distance from the first target state; controlling the end effector to reach the third target state according to the first execution path; and, when the end effector is controlled to approach the first target state, if the sensing parameters meet a first preset condition, controlling the end effector to perform an operation of placing the held goods to be placed.
- The method according to claim 1, wherein the target state comprises a second target state corresponding to performing a holding operation, and controlling the end effector of the robot to approach the target state and, if the sensing parameters meet the preset condition, controlling the end effector to perform the holding or placing operation comprises: according to the second target state and the current state of the end effector, planning a second execution path along which the end effector travels from the current state to a fourth target state, the fourth target state being at a second distance from the second target state; controlling the end effector to reach the fourth target state according to the second execution path; and, when the end effector approaches the second target state, if the sensing parameters meet a second preset condition, controlling the end effector to perform an operation of holding the goods to be held.
- The method according to claim 2, wherein acquiring sensing parameters through the sensor comprises: detecting an external force through the sensor and acquiring an external force parameter; and the first preset condition comprises: the external force parameter belongs to a first region parameter.
- The method according to claim 3, wherein acquiring sensing parameters through the sensor comprises: detecting an external force through the sensor and acquiring an external force parameter; and the second preset condition comprises: the external force parameter belongs to a second region parameter.
- The method according to claim 2, wherein acquiring sensing parameters through the sensor comprises: acquiring, through a visual sensor, the movable space of the goods to be placed held by the end effector when the end effector approaches the first target state; and the first preset condition comprises: the movable space belongs to a preset first spatial range.
- The method according to claim 3, wherein acquiring sensing parameters through the sensor comprises: acquiring, through a visual sensor, the relative state of the end effector and the goods to be held; and the second preset condition comprises: the relative state belongs to a holdable state.
- The method according to claim 2, wherein acquiring sensing parameters through the sensor comprises: detecting, through a ranging sensor, a first distance by which the goods to be placed approach other objects; and the first preset condition comprises: the first distance belongs to a preset first threshold range.
- The method according to claim 3, wherein acquiring sensing parameters through the sensor comprises: detecting, through a ranging sensor, a second distance between the end effector and the goods to be held; and the second preset condition comprises: the second distance belongs to a second threshold range within which the end effector can hold the goods.
- The method according to claim 2, wherein acquiring sensing parameters through the sensor comprises: acquiring, through a proximity sensor, first proximity data between the goods to be placed and other objects being approached; and the first preset condition comprises: the first proximity data belongs to a preset third threshold range.
- The method according to claim 3, wherein acquiring sensing parameters through the sensor comprises: acquiring, through a proximity sensor, second proximity data between the end effector and the goods to be held; and the second preset condition comprises: the second proximity data belongs to a preset fourth threshold range.
- The method according to claim 2, wherein acquiring sensing parameters through the sensor comprises: acquiring, through a tactile sensor, contact data between the end effector and the goods to be placed; and the first preset condition comprises: the contact data between the end effector and the goods to be placed belongs to a third region parameter.
- The method according to claim 3, wherein acquiring sensing parameters through the sensor comprises: acquiring, through a tactile sensor, contact data between the end effector and the goods to be held; and the second preset condition comprises: the contact data between the end effector and the goods to be held belongs to a fourth region parameter.
- The method according to claim 13, wherein acquiring sensing parameters through the sensor comprises: acquiring, through a tactile sensor, contact data between the end effector and the goods to be held; and the second preset condition comprises: the features of the contact data between the end effector and the goods to be held match the features of the contact data corresponding to the end effector contacting an area of the goods to be held that has a texture feature.
- The method according to claim 13 or 14, wherein, after controlling the end effector to perform the operation of holding the goods to be held, the method comprises: according to the contact data between the end effector and the goods to be held, analysing whether the held goods slip while they are held; and, if slipping occurs, controlling the end effector to increase the holding force.
- The method according to claim 13 or 14, wherein, after controlling the end effector to perform the operation of holding the goods to be held, the method comprises: while the goods are held, adjusting the posture of the held goods according to the contact data between the end effector and the goods to be held.
- The method according to claim 2 or 3, wherein the method further comprises: in the process in which the visual sensor approaches the second target state, if the detected object falls outside the detection range of the visual sensor, measuring the distance between the end effector and the goods to be held through a ranging sensor, and measuring the distance between the goods to be held and surrounding objects through the ranging sensor.
- The method according to claim 1, wherein the end effector is a suction cup, the sensor comprises a vacuum sensor, and the sensing parameters comprise a negative pressure parameter; and controlling the end effector to perform the holding or placing operation if the sensing parameters meet the preset condition comprises: if the value of the negative pressure parameter produces a preset fluctuation, controlling the end effector to perform the holding operation; and, if the value of the negative pressure parameter is greater than a preset negative pressure threshold, controlling the end effector to perform the placing operation.
- A robot, comprising: an end effector disposed at the operating end of the robot; a sensor configured to acquire sensing parameters; and a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method for a robot to stack goods according to any one of claims 1 to 18.
- The robot according to claim 19, wherein the sensor comprises one of, or a combination of several of, the following: a visual sensor, a ranging sensor, a proximity sensor, a tactile sensor, a vacuum sensor, a force feedback sensor, a torque sensor, and a force and torque sensor.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/098781 WO2019037013A1 (zh) | 2017-08-24 | 2017-08-24 | 机器人码放货物的方法及机器人 |
CN201780006693.7A CN108698225B (zh) | 2017-08-24 | 2017-08-24 | 机器人码放货物的方法及机器人 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/098781 WO2019037013A1 (zh) | 2017-08-24 | 2017-08-24 | 机器人码放货物的方法及机器人 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019037013A1 true WO2019037013A1 (zh) | 2019-02-28 |
Family
ID=63843794
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/098781 WO2019037013A1 (zh) | 2017-08-24 | 2017-08-24 | 机器人码放货物的方法及机器人 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108698225B (zh) |
WO (1) | WO2019037013A1 (zh) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114671256A (zh) * | 2020-12-24 | 2022-06-28 | 未势能源科技有限公司 | 电堆物料堆叠控制方法、装置及物料拿取机构 |
CN116187908A (zh) * | 2023-03-21 | 2023-05-30 | 江陵县百顺通达物流有限公司 | 一种基于图像识别的仓储物流智能管理方法及系统 |
CN118220723A (zh) * | 2024-05-22 | 2024-06-21 | 菲特(天津)检测技术有限公司 | 基于机器视觉的精确码垛方法及系统 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109648562B (zh) * | 2018-12-29 | 2021-01-26 | 深圳蓝胖子机器人有限公司 | 箱体抓取控制方法、箱体放置控制方法、相关装置及系统 |
CN110509067B (zh) * | 2019-07-31 | 2021-06-29 | 清华大学 | 一种大型复杂构件原位加工多机器人系统装备 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020154986A1 (en) * | 2001-03-30 | 2002-10-24 | Axium Automation Inc. | Gripping and transport clamp mounted at the end of a robotic arm and method for operating the same |
CN102902271A (zh) * | 2012-10-23 | 2013-01-30 | 上海大学 | 基于双目视觉的机器人目标识别与抓取系统及方法 |
CN103043359A (zh) * | 2011-10-17 | 2013-04-17 | 株式会社安川电机 | 机器人系统、机器人以及已分拣物品的制造方法 |
CN205555541U (zh) * | 2016-03-16 | 2016-09-07 | 广州圣益龙自动控制技术有限公司 | 码垛、拆垛机器人 |
CN106610666A (zh) * | 2015-10-22 | 2017-05-03 | 沈阳新松机器人自动化股份有限公司 | 一种基于双目视觉的助理机器人及其控制方法 |
CN106695792A (zh) * | 2017-01-05 | 2017-05-24 | 中国计量大学 | 基于机器视觉的码垛机器人跟踪监控系统及方法 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FI20105732A0 (fi) * | 2010-06-24 | 2010-06-24 | Zenrobotics Oy | Menetelmä fyysisten kappaleiden valitsemiseksi robottijärjestelmässä |
US9102055B1 (en) * | 2013-03-15 | 2015-08-11 | Industrial Perception, Inc. | Detection and reconstruction of an environment to facilitate robotic interaction with the environment |
DE102013106819B4 (de) * | 2013-06-28 | 2018-07-12 | Ferrobotics Compliant Robot Technology Gmbh | Verfahren zum robotergestützten Stapeln von Gegenständen |
CN103978474B (zh) * | 2014-05-14 | 2015-09-23 | 湖南大学 | 一种面向极端环境的特种作业机器人 |
CN104626171A (zh) * | 2015-01-07 | 2015-05-20 | 北京卫星环境工程研究所 | 基于六维力传感器的机械臂碰撞检测与响应方法 |
CN104786220A (zh) * | 2015-03-20 | 2015-07-22 | 江苏南铸科技股份有限公司 | 一种用于搬运液晶屏的机器人手臂 |
CN205060979U (zh) * | 2015-10-08 | 2016-03-02 | 胡雨滨 | 码垛机器人系统 |
CN105692198B (zh) * | 2016-03-11 | 2019-01-01 | 青岛创想智能技术有限公司 | 一种用于控制夹具的控制系统 |
CN106346510B (zh) * | 2016-10-11 | 2018-08-14 | 佛山科学技术学院 | 一种具有触觉感知功能的柔顺型三指夹持器 |
CN106671112B (zh) * | 2016-12-13 | 2018-12-11 | 清华大学 | 一种基于触觉阵列信息的机械手抓取稳定性判断方法 |
-
2017
- 2017-08-24 WO PCT/CN2017/098781 patent/WO2019037013A1/zh active Application Filing
- 2017-08-24 CN CN201780006693.7A patent/CN108698225B/zh active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020154986A1 (en) * | 2001-03-30 | 2002-10-24 | Axium Automation Inc. | Gripping and transport clamp mounted at the end of a robotic arm and method for operating the same |
CN103043359A (zh) * | 2011-10-17 | 2013-04-17 | 株式会社安川电机 | 机器人系统、机器人以及已分拣物品的制造方法 |
CN102902271A (zh) * | 2012-10-23 | 2013-01-30 | 上海大学 | 基于双目视觉的机器人目标识别与抓取系统及方法 |
CN106610666A (zh) * | 2015-10-22 | 2017-05-03 | 沈阳新松机器人自动化股份有限公司 | 一种基于双目视觉的助理机器人及其控制方法 |
CN205555541U (zh) * | 2016-03-16 | 2016-09-07 | 广州圣益龙自动控制技术有限公司 | 码垛、拆垛机器人 |
CN106695792A (zh) * | 2017-01-05 | 2017-05-24 | 中国计量大学 | 基于机器视觉的码垛机器人跟踪监控系统及方法 |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114671256A (zh) * | 2020-12-24 | 2022-06-28 | 未势能源科技有限公司 | 电堆物料堆叠控制方法、装置及物料拿取机构 |
CN114671256B (zh) * | 2020-12-24 | 2024-05-24 | 未势能源科技有限公司 | 电堆物料堆叠控制方法、装置及物料拿取机构 |
CN116187908A (zh) * | 2023-03-21 | 2023-05-30 | 江陵县百顺通达物流有限公司 | 一种基于图像识别的仓储物流智能管理方法及系统 |
CN116187908B (zh) * | 2023-03-21 | 2023-12-22 | 岳阳礼一科技股份有限公司 | 一种基于图像识别的仓储物流智能管理方法及系统 |
CN118220723A (zh) * | 2024-05-22 | 2024-06-21 | 菲特(天津)检测技术有限公司 | 基于机器视觉的精确码垛方法及系统 |
Also Published As
Publication number | Publication date |
---|---|
CN108698225A (zh) | 2018-10-23 |
CN108698225B (zh) | 2022-05-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110465960B (zh) | 具有物件丢失管理机制的机器人系统 | |
WO2019037013A1 (zh) | 机器人码放货物的方法及机器人 | |
US11046518B2 (en) | Controller and control method for robot system | |
JP2024133556A (ja) | ロボットシステムの制御装置及び制御方法 | |
US9498887B1 (en) | Two-faced linearly actuated gripper | |
US9205558B1 (en) | Multiple suction cup control | |
JP7175487B1 (ja) | 画像ベースのサイジングメカニズムを備えたロボットシステム及びロボットシステムを操作するための方法 | |
US11981518B2 (en) | Robotic tools and methods for operating the same | |
US20230158676A1 (en) | Controlling multiple robots to cooperatively pick and place items | |
JP2023524607A (ja) | ロボット多面グリッパアセンブリ及びその操作方法 | |
WO2023187006A1 (en) | Controlling a robotic manipulator for packing an object | |
JP7264387B2 (ja) | 開閉式物体用のロボットグリッパアセンブリ及び物体をピッキングするための方法 | |
US20240228192A9 (en) | Robotic systems with dynamic motion planning for transferring unregistered objects | |
US20240293936A1 (en) | Use of robotic arm to achieve packing density | |
CN115609569A (zh) | 具有基于图像的尺寸确定机制的机器人系统及其操作方法 | |
WO2024186375A1 (en) | Systems and methods for grasping and placing multiple objects with a robotic gripper | |
CN114683299A (zh) | 机器人工具及其操作方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17922540 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17922540 Country of ref document: EP Kind code of ref document: A1 |