CN111203849A - Mobile robot grabbing operation system and control method - Google Patents

Mobile robot grabbing operation system and control method

Info

Publication number
CN111203849A
CN111203849A
Authority
CN
China
Prior art keywords
coordinate system
mobile platform
mechanical arm
sensing unit
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010021047.1A
Other languages
Chinese (zh)
Inventor
王滔
张雲策
葛鸿昌
朱世强
祝义朋
胡纪远
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN202010021047.1A priority Critical patent/CN111203849A/en
Publication of CN111203849A publication Critical patent/CN111203849A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00 Manipulators mounted on wheels or on carriages
    • B25J5/007 Manipulators mounted on wheels or on carriages mounted on wheels
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J15/00 Gripping heads and other end effectors
    • B25J15/02 Gripping heads and other end effectors servo-actuated
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J15/00 Gripping heads and other end effectors
    • B25J15/08 Gripping heads and other end effectors having finger members
    • B25J15/12 Gripping heads and other end effectors having finger members with flexible finger members
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 Sensing devices
    • B25J19/021 Optical sensing devices
    • B25J19/022 Optical sensing devices using lasers
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Optics & Photonics (AREA)
  • Manipulator (AREA)

Abstract

The invention discloses a mobile robot grabbing operation system and a control method. The system comprises a system control unit, a flexible grabbing unit, an intelligent sensing unit and a mobile platform, wherein the flexible grabbing unit comprises a mechanical arm and a soft manipulator; the intelligent sensing unit comprises a basic sensing unit, a pan-tilt, a processor and an embedded computing platform, and is used for identifying and positioning objects; the flexible grabbing unit is used for safely grabbing objects, with the soft manipulator driven by an air pump; the system control unit communicates with the intelligent sensing unit, the flexible grabbing unit and the mobile platform over communication cables, and after receiving sensing information from the intelligent sensing unit sends control signals to the mobile platform and the flexible grabbing unit to complete the corresponding operation; the mobile platform carries the intelligent sensing unit, the flexible grabbing unit, the system control unit and a power module, the power module supplies power to all units, and the platform provides movement during operation.

Description

Mobile robot grabbing operation system and control method
Technical Field
The invention relates to grabbing operation systems, and in particular to a grabbing operation system and control method for a mobile robot, suitable for autonomous grabbing operations of intelligent service robots.
Background
Today, large numbers of robots are deployed in human social life and production to assist or replace humans in repetitive or service work, mostly in structured working environments such as manufacturing and construction. As robot intelligence improves, more and more intelligent mobile robots appear in various industries, particularly the service industry. The intelligent mobile robot plays an increasingly important role in social production and life and brings great innovation to the service industry.
The intelligent mobile robot is a comprehensive system integrating environment perception, dynamic decision-making and planning, and behavior control and execution. Behavior control and execution is the link through which the robot exerts its operating capability and is a vital component of the robot system, and grabbing is a central part of this link. At present, research on mobile robot grabbing systems focuses on grabbing in fixed scenes or on non-autonomous grabbing, which greatly limits the service capacity and operating efficiency of intelligent mobile robots in the service industry and lowers their level of autonomy; as a result, service robots depend on structured environments and require a degree of human cooperation to operate normally. An intelligent mobile robot adopting such a grabbing system can hardly replace humans in autonomously completing service work in real service-industry scenarios.
Disclosure of Invention
The invention aims to provide a mobile robot grabbing operation system and control method that can effectively and autonomously complete grabbing operations in service-industry working environments. The system has strong environmental adaptability and can perform grabbing operations in unstructured environments; it has autonomous operating capability and can complete the whole grabbing process on its own; and the whole system has a simple, reliable hardware structure, a high level of integration and system stability, is competent for grabbing operations in most mobile robot working scenarios, and lends itself to popularization and application.
A mobile robot grabbing operation system comprises a system control unit and a flexible grabbing unit, the flexible grabbing unit comprising a mechanical arm and a soft manipulator, and further comprises: an intelligent sensing unit and a mobile platform;
the intelligent sensing unit comprises a basic sensing unit, a pan-tilt, a processor and an embedded computing platform, and is used for identifying and positioning objects;
the flexible grabbing unit realizes safe grabbing of articles, the soft manipulator being driven by an air pump;
the system control unit communicates with the intelligent sensing unit, the flexible grabbing unit and the mobile platform over communication cables, and after receiving sensing information from the intelligent sensing unit sends control signals to the mobile platform and the flexible grabbing unit to complete the corresponding operation;
the mobile platform carries the intelligent sensing unit, the flexible grabbing unit, the system control unit and a power module; the power module supplies power to all units, and the platform provides the movement of the whole grabbing operation system during operation.
Furthermore, the basic sensing unit comprises a monocular camera and a single-line laser ranging module, the laser ranging module being driven by the pan-tilt; the processor handles the laser ranging module data, pan-tilt motor control and computation of the objects' three-dimensional coordinates; and the embedded computing platform runs the object recognition algorithm and outputs the recognition and positioning result.
Further, the three-dimensional coordinates are calculated by the following steps (a short code sketch of the pipeline follows step S27):
S21, performing object detection with an object recognition neural network on the environment image acquired by the monocular camera; if the object to be grabbed is not detected, controlling the mobile platform to rotate or move to change the system field of view for further detection; if it is detected, calculating the two-dimensional coordinates of the object center point in the pixel coordinate system from the two-dimensional coordinates of the object detection frame vertices in the pixel coordinate system;
S22, converting the two-dimensional coordinates of the object center point obtained in S21 from the pixel coordinate system into three-dimensional coordinates in the camera coordinate system; formula (1) gives the conversion between the pixel coordinate system and the camera coordinate system, where $(u, v)$ are the pixel coordinates of the object center point, $f_x, f_y$ are the focal lengths in pixels and $(u_0, v_0)$ is the principal point; when only the pixel coordinates of the target are known, infinitely many world points correspond to it, all lying on the line through the target point and the camera optical center; for any given $Z_c$, the conversion yields the coordinates $(X_c, Y_c, Z_c)$ of the object center point in the camera coordinate system;
$$ Z_c\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=\begin{bmatrix}f_x & 0 & u_0\\ 0 & f_y & v_0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}\tag{1}$$
S23, from the coordinates $(X_c, Y_c, Z_c)$ of the object center point obtained in S22, solving for the pitch angle $\alpha$ and heading angle $\beta$ of the line connecting the object center point and the camera optical center $O_c$ in the camera coordinate system;
S24, according to the pitch angle $\alpha$ and heading angle $\beta$ obtained in S23, controlling the pan-tilt to drive the laser ranging module to rotate to the corresponding angles and perform a preliminary ranging of the object, the measured distance being $d$;
S25, correcting the pitch angle with the distance $d$ preliminarily measured in S24; the corrected pitch angle $\alpha'$ is given by formula (2), where $c$ is the spacing between the camera optical center and the laser ranging module optical center along the $Y_c$ axis;
$$ \alpha'=\arctan\!\left(\frac{d\sin\alpha-c}{d\cos\alpha}\right)\tag{2}$$
S26, readjusting the laser ranging module to the corrected pitch angle $\alpha'$ and accurately measuring the distance $d'$ to the object center point;
S27, from the distance $d'$ measured in S26 and the actual rotation angles of the laser ranging module about the heading and pitch axes, formula (3) gives by calculation and conversion the three-dimensional coordinates $(X_c, Y_c, Z_c)$ of the object center point in the laser ranging module coordinate system, i.e. the system world coordinate system, completing the identification and positioning of the object.
$$ X_c=d'\cos\alpha'\sin\beta,\qquad Y_c=d'\sin\alpha',\qquad Z_c=d'\cos\alpha'\cos\beta\tag{3}$$
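The pipeline of steps S21 to S27 can be summarized in a short sketch. The following Python fragment assumes a standard pinhole camera model; the intrinsic parameters $f_x, f_y, u_0, v_0$, the sign conventions and all function names are illustrative assumptions, not values or interfaces taken from the patent:

```python
import math

def pixel_to_angles(u, v, fx, fy, u0, v0):
    """S22-S23: back-project pixel (u, v) to a ray through the optical
    centre and return the (pitch, heading) angles of that ray."""
    x = (u - u0) / fx                    # normalized image coordinates
    y = (v - v0) / fy
    heading = math.atan(x)               # beta, rotation about the vertical axis
    pitch = math.atan(y / math.sqrt(1.0 + x * x))  # alpha, elevation of the ray
    return pitch, heading

def corrected_pitch(alpha, d, c):
    """S25: correct the pitch for the offset c between the camera and
    rangefinder optical centres along the Yc axis, given coarse range d."""
    return math.atan2(d * math.sin(alpha) - c, d * math.cos(alpha))

def range_to_xyz(d, pitch, heading):
    """S27: convert the accurate range and pan-tilt angles to Cartesian
    coordinates in the rangefinder (system world) frame, as in formula (3)."""
    x = d * math.cos(pitch) * math.sin(heading)
    y = d * math.sin(pitch)
    z = d * math.cos(pitch) * math.cos(heading)
    return x, y, z
```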
Further, after the system control unit receives the three-dimensional coordinates $(X_c, Y_c, Z_c)$ in the intelligent sensing unit coordinate system, it converts them by formula (4) into three-dimensional coordinates $(X_p, Y_p, Z_p)$ in the mobile platform coordinate system, where $R_{pc}$ is the rotation matrix and $T_{pc}$ the translation vector between the intelligent sensing unit coordinate system and the mobile platform coordinate system;
$$ \begin{bmatrix}X_p\\ Y_p\\ Z_p\end{bmatrix}=R_{pc}\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}+T_{pc}\tag{4}$$
As in S31, the three-dimensional coordinates $(X_p, Y_p, Z_p)$ of the object in the mobile platform coordinate system serve as the control feedback of the mobile platform, the control input is the target coordinate of the mobile platform, and a PID controller with velocity as its output is designed to drive the mobile platform to the specified position; a minimal controller sketch follows.
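A hedged sketch of such a controller, with one instance per motion axis; the class, its interface and any gains are assumptions for illustration, not part of the patent:

```python
class PID:
    """Feedback is the object's coordinate in the platform frame, the
    setpoint is the target coordinate, and the output is a velocity command."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, feedback):
        error = setpoint - feedback
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

In practice the output would be clamped to the platform's velocity limits, with one controller run per axis.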
Further, after the system control unit receives the three-dimensional coordinates $(X_c, Y_c, Z_c)$ in the intelligent sensing unit coordinate system, it converts them by formula (5) into three-dimensional coordinates in the mechanical arm coordinate system, where $(X_c, Y_c, Z_c)$ are the object's coordinates in the intelligent sensing unit coordinate system, $(X_j, Y_j, Z_j)$ are its coordinates in the mechanical arm coordinate system, and $R_{jc}$ and $T_{jc}$ are the rotation matrix and translation vector between the intelligent sensing unit coordinate system and the mechanical arm coordinate system;
$$ \begin{bmatrix}X_j\\ Y_j\\ Z_j\end{bmatrix}=R_{jc}\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}+T_{jc}\tag{5}$$
According to the converted three-dimensional coordinates $(X_j, Y_j, Z_j)$ of the object in the mechanical arm coordinate system, the system control unit sends control instructions to the mechanical arm and the solenoid valve to complete the static grabbing of the object; both frame conversions reduce to the same rigid transform, sketched below.
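Formulas (4) and (5) are instances of the same rigid transform, so a single helper can serve both conversions; this NumPy sketch assumes calibrated $R$ and $T$ and is illustrative only:

```python
import numpy as np

def transform_point(p_c, R, T):
    """Rigid transform p = R @ p_c + T, the common form of formulas (4) and (5);
    R (3x3) and T (3,) come from extrinsic calibration between the frames."""
    return np.asarray(R) @ np.asarray(p_c, dtype=float) + np.asarray(T, dtype=float)

# Hypothetical usage: the same helper serves both conversions.
# p_platform = transform_point(p_sensing, R_pc, T_pc)   # formula (4)
# p_arm      = transform_point(p_sensing, R_jc, T_jc)   # formula (5)
```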
A control method of a mobile robot grabbing operation system, the system comprising a system control unit, a flexible grabbing unit, an intelligent sensing unit and a mobile platform, the flexible grabbing unit comprising a mechanical arm and a soft manipulator; the control method comprises the following steps (a high-level sketch of the loop follows step S5):
S1, specifying the type of object to be grabbed and inputting it into the system control unit;
S2, the system autonomously searches for and positions the object to be grabbed: if the intelligent sensing unit does not find the object in the system's current field of view, the system control unit controls the mobile platform to rotate or move to adjust the field of view; once the intelligent sensing unit recognizes the object, it performs three-dimensional positioning to obtain the object's three-dimensional coordinates in the intelligent sensing unit coordinate system, and after recognition and positioning it sends the object identification information and the three-dimensional coordinates to the system control unit over the communication cable;
S3, after receiving the object identification information and three-dimensional coordinates, the system control unit converts the object's coordinates into the mobile platform coordinate system and, according to these coordinates, sends control instructions that move the mobile platform to a specified position within grabbing range; throughout this process the system control unit stays in communication with the intelligent sensing unit, and the object's three-dimensional coordinates are updated in real time as the feedback for controlling the mobile platform;
S4, when the mobile platform enters the grabbing operation range, it stops moving and remains stationary; the system control unit reads the object's three-dimensional coordinates from the intelligent sensing unit, converts them into the mechanical arm coordinate system, and instructs the mechanical arm to move directly above the object to be grabbed; the system control unit then opens the soft manipulator by controlling the solenoid valve; once the soft manipulator is open, the mechanical arm drives it downward, and when it reaches the position given by the object's three-dimensional coordinates, the system control unit actuates the solenoid valve again to close the soft manipulator; after closing, the mechanical arm drives the soft manipulator to move and grab the object;
S5, after the object is grabbed successfully, the system control unit controls the mobile platform to move to the designated position and then controls the mechanical arm and soft manipulator to place the object at the corresponding location, completing the whole grabbing operation of the system.
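A high-level sketch of the S1-S5 loop, with every robot interface (perception, platform, arm, gripper valve, frame converters) left as a hypothetical stand-in to be supplied by the integrator; none of these names appears in the patent:

```python
def grasp_mission(locate, in_range, platform, arm, gripper,
                  to_platform_frame, to_arm_frame):
    obj = locate()                       # S2: recognize and localize the target
    while obj is None:
        platform.rotate_in_place()       # adjust the field of view and retry
        obj = locate()
    while not in_range(obj):             # S3: approach under visual feedback
        platform.move_toward(to_platform_frame(obj))
        obj = locate()                   # coordinates updated in real time
    platform.stop()                      # S4: static grasp from a standstill
    goal = to_arm_frame(obj)
    arm.move_above(goal)
    gripper.open()                       # solenoid valve opens the soft fingers
    arm.descend_to(goal)
    gripper.close()                      # valve closes; fingers envelop the object
    arm.lift()
    platform.move_to_dropoff()           # S5: deliver and release the object
    arm.move_to_place_pose()
    gripper.open()
```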
Further, the step S2 specifically includes:
S21, performing object detection with an object recognition neural network on the environment image acquired by the monocular camera; if the object to be grabbed is not detected, controlling the mobile platform to rotate or move to change the system field of view for further detection; if it is detected, calculating the two-dimensional coordinates of the object center point in the pixel coordinate system from the two-dimensional coordinates of the object detection frame vertices in the pixel coordinate system;
S22, converting the two-dimensional coordinates of the object center point obtained in S21 from the pixel coordinate system into three-dimensional coordinates in the monocular camera coordinate system; formula (1) gives the conversion between the pixel coordinate system and the monocular camera coordinate system, where $(u, v)$ are the pixel coordinates of the object center point, $f_x, f_y$ are the focal lengths in pixels and $(u_0, v_0)$ is the principal point; when only the pixel coordinates of the target are known, infinitely many world points correspond to it, all lying on the line through the target point and the monocular camera optical center; for any given $Z_c$, the conversion yields the coordinates $(X_c, Y_c, Z_c)$ of the object center point in the monocular camera coordinate system;
$$ Z_c\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=\begin{bmatrix}f_x & 0 & u_0\\ 0 & f_y & v_0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}\tag{1}$$
S23, from the coordinates $(X_c, Y_c, Z_c)$ of the object center point obtained in S22, solving for the pitch angle $\alpha$ and heading angle $\beta$ of the line connecting the object center point and the camera optical center $O_c$ in the camera coordinate system;
S24, according to the pitch angle $\alpha$ and heading angle $\beta$ obtained in S23, controlling the pan-tilt to drive the single-line laser ranging module to rotate to the corresponding angles and perform a preliminary ranging of the object, the measured distance being $d$;
S25, correcting the pitch angle with the distance $d$ preliminarily measured in S24; the corrected pitch angle $\alpha'$ is given by formula (2), where $c$ is the spacing between the monocular camera optical center and the single-line laser ranging module optical center along the $Y_c$ axis;
$$ \alpha'=\arctan\!\left(\frac{d\sin\alpha-c}{d\cos\alpha}\right)\tag{2}$$
S26, readjusting the laser ranging module to the corrected pitch angle $\alpha'$ and accurately measuring the distance $d'$ to the object center point;
S27, from the distance $d'$ of the object center point measured by the laser ranging module in S26 and the actual rotation angles of the laser ranging module about the heading and pitch axes, formula (3) gives by calculation and conversion the three-dimensional coordinates $(X_c, Y_c, Z_c)$ of the object center point in the laser ranging module coordinate system, i.e. the system world coordinate system, completing the identification and positioning of the object.
$$ X_c=d'\cos\alpha'\sin\beta,\qquad Y_c=d'\sin\alpha',\qquad Z_c=d'\cos\alpha'\cos\beta\tag{3}$$
Further, the step S3 specifically includes:
S31, converting the received three-dimensional coordinates of the object from the intelligent sensing unit coordinate system into the mobile platform coordinate system by formula (4), where $(X_c, Y_c, Z_c)$ are the object's coordinates in the intelligent sensing unit coordinate system, $(X_p, Y_p, Z_p)$ are its coordinates in the mobile platform coordinate system, and $R_{pc}$ and $T_{pc}$ are the rotation matrix and translation vector between the intelligent sensing unit coordinate system and the mobile platform coordinate system;
$$ \begin{bmatrix}X_p\\ Y_p\\ Z_p\end{bmatrix}=R_{pc}\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}+T_{pc}\tag{4}$$
S32, with the three-dimensional coordinates $(X_p, Y_p, Z_p)$ of the object in the mobile platform coordinate system from S31 as the control feedback and the target coordinate of the mobile platform as the control input, a PID controller with velocity as its output is designed to drive the mobile platform to the designated position;
Further, the step S4 specifically includes:
S41, converting the received three-dimensional coordinates of the object from the intelligent sensing unit coordinate system into the mechanical arm coordinate system by formula (5), where $(X_c, Y_c, Z_c)$ are the object's coordinates in the intelligent sensing unit coordinate system, $(X_j, Y_j, Z_j)$ are its coordinates in the mechanical arm coordinate system, and $R_{jc}$ and $T_{jc}$ are the rotation matrix and translation vector between the intelligent sensing unit coordinate system and the mechanical arm coordinate system;
$$ \begin{bmatrix}X_j\\ Y_j\\ Z_j\end{bmatrix}=R_{jc}\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}+T_{jc}\tag{5}$$
S42, according to the converted three-dimensional coordinates $(X_j, Y_j, Z_j)$ of the object in the mechanical arm coordinate system, the system control unit sends control instructions to the mechanical arm and the solenoid valve to complete the static grabbing of the object.
Compared with the prior art, the invention has the following beneficial effects:
The mobile robot grabbing operation system can effectively perform autonomous grabbing operations in real service-industry working scenarios, greatly improving the autonomous operating capability and service capability of the mobile robot. The system has strong environmental adaptability and can perform grabbing operations in unstructured environments; it can autonomously complete the whole grabbing process; and the whole system has a simple, reliable hardware structure, a high level of integration and system stability, is competent for grabbing operations in most mobile robot working scenarios, and lends itself to popularization and application.
Drawings
FIG. 1 is a schematic diagram of the system of the present invention;
FIG. 2 is a flow chart of the grabbing operation of the present invention;
FIG. 3 is a schematic diagram of the structure of the intelligent sensing unit in the invention;
FIG. 4 is a schematic diagram of the positioning principle of the intelligent sensing unit in the invention;
wherein: the system comprises a mobile platform 1, a soft manipulator 2, a mechanical arm 3, an embedded computing platform 4, a solenoid valve 5, a solenoid valve driver 6, an inverter 7, a processor 8, an air pump 9, an intelligent sensing unit 10, a monocular camera 11, a laser ranging module 12 and a pan-tilt 13.
Detailed Description
The technical solution of the invention is explained in further detail below with reference to the drawings.
As shown in fig. 1 to 4, the present invention provides a mobile robot grabbing operation system, which includes the following components:
the intelligent sensing unit 10 consists of a basic sensing unit, a two-axis cradle head 13, a processor 8 and an embedded computing platform 4, wherein the basic sensing unit consists of a monocular camera 11 and a single-line laser ranging module 12, and the laser ranging module 12 is driven by the small two-axis cradle head 13; the processor 8 adopts a single chip microcomputer and is used for processing the information of the laser ranging module 12, the control of a holder motor and the calculation of the three-dimensional coordinate of the object; the embedded computing platform 4 is respectively connected with the monocular camera 11 and the processor 8, and is used for processing the object recognition algorithm and outputting the recognition positioning result.
The flexible grabbing unit comprises a mechanical arm 3, a fluid-driven soft mechanical arm 2, an inverter 7, an air pump 9, an electromagnetic valve 5 and an electromagnetic valve driver 6, wherein the soft mechanical arm 2 is driven by a small air pump 9, so that safe grabbing of food, fragile articles and the like can be realized, and the soft mechanical arm 2 has strong adaptability. The air pump, the electromagnetic valve and the manipulator body form a pneumatic circuit, the inverter raises the voltage of the lithium battery to 220v to supply power to the air pump, the air pump 9 serves as an air source to drive the soft manipulator 2 to work, the electromagnetic valve driver 6 controls the electromagnetic valve 5 to act, and the opening and closing of the electromagnetic valve 5 controls the action of the manipulator. The safe grabbing operation can be performed without depending on force feedback to a certain extent.
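A minimal sketch of the valve logic in this pneumatic circuit: energizing or releasing the solenoid valve opens or closes the soft manipulator fed by the air pump. The digital-output channel, its wiring and the timing constant are hypothetical placeholders for the actual valve driver:

```python
import time

class SoftGripper:
    """Wraps the solenoid valve that gates the air-pump line to the soft
    manipulator; valve_output is an assumed relay/GPIO-style channel."""
    def __init__(self, valve_output, actuation_delay_s=0.5):
        self._valve = valve_output
        self._delay = actuation_delay_s   # time for the fingers to actuate

    def open(self):
        self._valve.set(True)             # energize the valve: fingers open
        time.sleep(self._delay)

    def close(self):
        self._valve.set(False)            # release the valve: fingers close
        time.sleep(self._delay)
```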
The system control unit comprises the controller of the mobile platform 1, the controller of the flexible grabbing unit and a main controller. The main controller is the system host computer, specifically an industrial PC; it communicates with the intelligent sensing unit 10, the flexible grabbing unit and the mobile platform 1 over communication cables, and after receiving sensing information from the intelligent sensing unit 10 sends control signals to the mobile platform 1 and the flexible grabbing unit to complete the corresponding operation.
The mobile platform 1 serves as the platform of the whole system; besides carrying all the units it integrates a power module that supplies the power required by each unit, and the wheels beneath it provide the movement of the grabbing operation system during operation.
The specific operation process comprises the following steps:
S1, specifying the type of object to be grabbed and inputting it into the system control unit;
S2, the system autonomously searches for and positions the object to be grabbed: if the intelligent sensing unit 10 does not find the object in the system's current field of view, the system control unit controls the mobile platform 1 to rotate or move to adjust the field of view; once the intelligent sensing unit 10 recognizes the object, it performs three-dimensional positioning to obtain the object's three-dimensional coordinates in the intelligent sensing unit 10 coordinate system, and after recognition and positioning it sends the object identification information and the three-dimensional coordinates to the system control unit over the communication cable;
S3, after receiving the object identification information and three-dimensional coordinates, the system control unit converts the object's coordinates into the mobile platform 1 coordinate system and, according to these coordinates, sends control instructions that move the mobile platform 1 to a specified position within grabbing range; throughout this process the system control unit stays in communication with the intelligent sensing unit 10, and the object's three-dimensional coordinates are updated in real time as the feedback for controlling the mobile platform;
S4, when the mobile platform 1 enters the grabbing operation range, it stops moving and remains stationary; the system control unit reads the object's three-dimensional coordinates from the intelligent sensing unit 10, converts them into the mechanical arm 3 coordinate system, and instructs the mechanical arm 3 to move directly above the object to be grabbed; the system control unit then opens the soft manipulator 2 by controlling the solenoid valve 5; once the soft manipulator 2 is open, the mechanical arm 3 drives it downward, and when it reaches the position given by the object's three-dimensional coordinates, the system control unit actuates the solenoid valve 5 again to close the soft manipulator 2; after closing, the mechanical arm 3 drives the soft manipulator 2 to move and grab the object;
S5, after the object is grabbed successfully, the system control unit controls the mobile platform 1 to move to the designated position and then controls the mechanical arm 3 and soft manipulator 2 to place the object at the corresponding location, completing the whole grabbing operation of the system.
The step S2 specifically includes:
S21, the embedded computing platform 4 performs object detection with an object recognition neural network on the environment image acquired by the monocular camera 11; if the object to be grabbed is not detected, the mobile platform 1 is controlled to rotate or move to change the system field of view for further detection; if it is detected, the two-dimensional coordinates of the object center point in the pixel coordinate system are calculated from the two-dimensional coordinates of the object detection frame vertices in the pixel coordinate system;
S22, converting the two-dimensional coordinates of the object center point obtained in S21 from the pixel coordinate system into three-dimensional coordinates in the monocular camera 11 coordinate system; formula (1) gives the conversion between the pixel coordinate system and the monocular camera 11 coordinate system, where $(u, v)$ are the pixel coordinates of the object center point, $f_x, f_y$ are the focal lengths in pixels and $(u_0, v_0)$ is the principal point; when only the pixel coordinates of the target are known, infinitely many world points correspond to it, all lying on the line through the target point and the monocular camera 11 optical center; for any given $Z_c$, the conversion yields the coordinates $(X_c, Y_c, Z_c)$ of the object center point in the monocular camera 11 coordinate system;
$$ Z_c\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=\begin{bmatrix}f_x & 0 & u_0\\ 0 & f_y & v_0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}\tag{1}$$
S23, from the coordinates $(X_c, Y_c, Z_c)$ of the object center point obtained in S22, solving for the pitch angle $\alpha$ and heading angle $\beta$ of the line connecting the object center point and the camera optical center $O_c$ in the camera coordinate system;
S24, according to the pitch angle $\alpha$ and heading angle $\beta$ obtained in S23, controlling the two-axis pan-tilt 13 to drive the single-line laser ranging module 12 to rotate to the corresponding angles and perform a preliminary ranging of the object, the measured distance being $d$;
S25, correcting the pitch angle with the distance $d$ preliminarily measured in S24; the corrected pitch angle $\alpha'$ is given by formula (2), where $c$ is the spacing between the monocular camera 11 optical center and the single-line laser ranging module 12 optical center along the $Y_c$ axis;
$$ \alpha'=\arctan\!\left(\frac{d\sin\alpha-c}{d\cos\alpha}\right)\tag{2}$$
S26, readjusting the laser ranging module 12 to the corrected pitch angle $\alpha'$ and accurately measuring the distance $d'$ to the object center point;
S27, from the distance $d'$ measured by the laser ranging module 12 in S26 and the actual rotation angles of the laser ranging module 12 about the heading and pitch axes, formula (3) gives by calculation and conversion the three-dimensional coordinates $(X_c, Y_c, Z_c)$ of the object center point in the laser ranging module 12 coordinate system, i.e. the system world coordinate system, completing the identification and positioning of the object.
$$ X_c=d'\cos\alpha'\sin\beta,\qquad Y_c=d'\sin\alpha',\qquad Z_c=d'\cos\alpha'\cos\beta\tag{3}$$
The step S3 specifically includes:
S31, the controller of the mobile platform 1 converts the received three-dimensional coordinates of the object from the intelligent sensing unit 10 coordinate system into the mobile platform 1 coordinate system by formula (4), where $(X_c, Y_c, Z_c)$ are the object's coordinates in the intelligent sensing unit 10 coordinate system, $(X_p, Y_p, Z_p)$ are its coordinates in the mobile platform 1 coordinate system, and $R_{pc}$ and $T_{pc}$ are the rotation matrix and translation vector between the intelligent sensing unit 10 coordinate system and the mobile platform 1 coordinate system;
$$ \begin{bmatrix}X_p\\ Y_p\\ Z_p\end{bmatrix}=R_{pc}\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}+T_{pc}\tag{4}$$
S32, with the three-dimensional coordinates $(X_p, Y_p, Z_p)$ of the object in the mobile platform 1 coordinate system from S31 as the control feedback of the mobile platform 1 and the target coordinate of the mobile platform 1 as the control input, a PID controller with velocity as its output is designed to drive the mobile platform 1 to the designated position;
The step S4 specifically includes:
S41, the controller of the flexible grabbing unit converts the received three-dimensional coordinates of the object from the intelligent sensing unit 10 coordinate system into the mechanical arm 3 coordinate system by formula (5), where $(X_c, Y_c, Z_c)$ are the object's coordinates in the intelligent sensing unit 10 coordinate system, $(X_j, Y_j, Z_j)$ are its coordinates in the mechanical arm 3 coordinate system, and $R_{jc}$ and $T_{jc}$ are the rotation matrix and translation vector between the intelligent sensing unit 10 coordinate system and the mechanical arm 3 coordinate system;
$$ \begin{bmatrix}X_j\\ Y_j\\ Z_j\end{bmatrix}=R_{jc}\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}+T_{jc}\tag{5}$$
S42, according to the converted three-dimensional coordinates $(X_j, Y_j, Z_j)$ of the object in the mechanical arm 3 coordinate system, the system control unit sends control instructions to the mechanical arm 3 and the solenoid valve 5 to complete the static grabbing of the object.
The above description is only a preferred embodiment of the present invention, and the protection scope of the invention is not limited to this embodiment; all technical solutions within the concept of the invention fall within its protection scope. It should be noted that modifications and refinements made by those skilled in the art without departing from the principle of the invention are also considered to fall within the protection scope of the invention.

Claims (9)

1. A mobile robot grabbing operation system, comprising a system control unit and a flexible grabbing unit, the flexible grabbing unit comprising a mechanical arm (3) and a soft manipulator (2), characterized by further comprising: an intelligent sensing unit (10) and a mobile platform (1);
the intelligent sensing unit (10) comprises a basic sensing unit, a pan-tilt (13), a processor (8) and an embedded computing platform (4), and is used for identifying and positioning objects;
the flexible grabbing unit realizes safe grabbing of articles, the soft manipulator (2) being driven by an air pump (9);
the system control unit communicates with the intelligent sensing unit (10), the flexible grabbing unit and the mobile platform (1) over communication cables, and after receiving sensing information from the intelligent sensing unit (10) sends control signals to the mobile platform (1) and the flexible grabbing unit to complete the corresponding operation;
the mobile platform (1) carries the intelligent sensing unit (10), the flexible grabbing unit, the system control unit and a power module; the power module supplies power to all units, and the platform provides the movement of the whole grabbing operation system during operation.
2. The mobile robot grabbing operation system according to claim 1, wherein the basic sensing unit comprises a monocular camera (11) and a single-line laser ranging module (12), the laser ranging module (12) being driven by the pan-tilt (13); the processor (8) handles the laser ranging module (12) data, pan-tilt motor control and computation of the objects' three-dimensional coordinates; and the embedded computing platform (4) runs the object recognition algorithm and outputs the recognition and positioning result.
3. The system according to claim 2, wherein the three-dimensional coordinates are calculated by:
S21, performing object detection with an object recognition neural network on the environment image acquired by the monocular camera; if the object to be grabbed is not detected, controlling the mobile platform to rotate or move to change the system field of view for further detection; if it is detected, calculating the two-dimensional coordinates of the object center point in the pixel coordinate system from the two-dimensional coordinates of the object detection frame vertices in the pixel coordinate system;
S22, converting the two-dimensional coordinates of the object center point obtained in S21 from the pixel coordinate system into three-dimensional coordinates in the camera coordinate system; formula (1) gives the conversion between the pixel coordinate system and the camera coordinate system, where $(u, v)$ are the pixel coordinates of the object center point, $f_x, f_y$ are the focal lengths in pixels and $(u_0, v_0)$ is the principal point; when the coordinates of the target in the pixel coordinate system are known, infinitely many world points correspond to it, all lying on the line through the target point and the camera optical center; for any given $Z_c$, the conversion yields the coordinates $(X_c, Y_c, Z_c)$ of the object center point in the camera coordinate system;
$$ Z_c\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=\begin{bmatrix}f_x & 0 & u_0\\ 0 & f_y & v_0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}\tag{1}$$
S23, from the coordinates $(X_c, Y_c, Z_c)$ of the object center point obtained in S22, solving for the pitch angle $\alpha$ and heading angle $\beta$ of the line connecting the object center point and the camera optical center $O_c$ in the camera coordinate system;
S24, according to the pitch angle $\alpha$ and heading angle $\beta$ obtained in S23, controlling the pan-tilt to drive the laser ranging module (12) to rotate to the corresponding angles and perform a preliminary ranging of the object, the measured distance being $d$;
S25, correcting the pitch angle with the distance $d$ preliminarily measured in S24; the corrected pitch angle $\alpha'$ is given by formula (2), where $c$ is the spacing between the camera optical center and the laser ranging module (12) optical center along the $Y_c$ axis;
$$ \alpha'=\arctan\!\left(\frac{d\sin\alpha-c}{d\cos\alpha}\right)\tag{2}$$
S26, readjusting the laser ranging module (12) to the corrected pitch angle $\alpha'$ and accurately measuring the distance $d'$ to the object center point;
S27, from the distance $d'$ of the object center point measured by the laser ranging module (12) in S26 and the actual rotation angles of the laser ranging module (12) about the heading and pitch axes, formula (3) gives by calculation and conversion the three-dimensional coordinates $(X_c, Y_c, Z_c)$ of the object center point in the laser ranging module (12) coordinate system, i.e. the system world coordinate system, completing the identification and positioning of the object.
$$ X_c=d'\cos\alpha'\sin\beta,\qquad Y_c=d'\sin\alpha',\qquad Z_c=d'\cos\alpha'\cos\beta\tag{3}$$
4. The system according to claim 3, wherein after the system control unit receives the three-dimensional coordinates $(X_c, Y_c, Z_c)$ in the intelligent sensing unit coordinate system, it converts them by formula (4) into three-dimensional coordinates $(X_p, Y_p, Z_p)$ in the mobile platform coordinate system, where $R_{pc}$ is the rotation matrix and $T_{pc}$ the translation vector between the intelligent sensing unit coordinate system and the mobile platform coordinate system;
$$ \begin{bmatrix}X_p\\ Y_p\\ Z_p\end{bmatrix}=R_{pc}\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}+T_{pc}\tag{4}$$
with the three-dimensional coordinates $(X_p, Y_p, Z_p)$ of the object in the mobile platform coordinate system from S31 as the control feedback of the mobile platform and the target coordinate of the mobile platform as the control input, a PID controller with velocity as its output is designed to drive the mobile platform to the specified position.
5. The system according to claim 3, wherein after the system control unit receives the three-dimensional coordinates $(X_c, Y_c, Z_c)$ in the intelligent sensing unit coordinate system, it converts them by formula (5) into three-dimensional coordinates in the mechanical arm coordinate system, where $(X_c, Y_c, Z_c)$ are the object's coordinates in the intelligent sensing unit coordinate system, $(X_j, Y_j, Z_j)$ are its coordinates in the mechanical arm coordinate system, and $R_{jc}$ and $T_{jc}$ are the rotation matrix and translation vector between the intelligent sensing unit coordinate system and the mechanical arm coordinate system;
$$ \begin{bmatrix}X_j\\ Y_j\\ Z_j\end{bmatrix}=R_{jc}\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}+T_{jc}\tag{5}$$
according to the converted three-dimensional coordinates $(X_j, Y_j, Z_j)$ of the object in the mechanical arm coordinate system, the system control unit sends control instructions to the mechanical arm and the solenoid valve to complete the static grabbing of the object.
6. A control method of a mobile robot grabbing operation system, the system comprising a system control unit, a flexible grabbing unit, an intelligent sensing unit (10) and a mobile platform (1), the flexible grabbing unit comprising a mechanical arm (3) and a soft manipulator (2); the control method being characterized by comprising the following steps:
S1, specifying the type of object to be grabbed and inputting it into the system control unit;
S2, the system autonomously searches for and positions the object to be grabbed: if the intelligent sensing unit (10) does not find the object in the system's current field of view, the system control unit controls the mobile platform (1) to rotate or move to adjust the field of view; once the intelligent sensing unit (10) recognizes the object, it performs three-dimensional positioning to obtain the object's three-dimensional coordinates in the intelligent sensing unit (10) coordinate system, and after recognition and positioning it sends the object identification information and the three-dimensional coordinates to the system control unit over the communication cable;
S3, after receiving the object identification information and three-dimensional coordinates, the system control unit converts the object's coordinates into the mobile platform (1) coordinate system and, according to these coordinates, sends control instructions that move the mobile platform (1) to a specified position within grabbing range; throughout this process the system control unit stays in communication with the intelligent sensing unit (10), and the object's three-dimensional coordinates are updated in real time as the feedback for controlling the mobile platform;
S4, when the mobile platform (1) enters the grabbing operation range, it stops moving and remains stationary; the system control unit reads the object's three-dimensional coordinates from the intelligent sensing unit (10), converts them into the mechanical arm (3) coordinate system, and instructs the mechanical arm (3) to move directly above the object to be grabbed; the system control unit then opens the soft manipulator (2) by controlling the solenoid valve (5); once the soft manipulator (2) is open, the mechanical arm (3) drives it downward, and when it reaches the position given by the object's three-dimensional coordinates, the system control unit actuates the solenoid valve (5) again to close the soft manipulator (2); after closing, the mechanical arm (3) drives the soft manipulator (2) to move and grab the object;
S5, after the object is grabbed successfully, the system control unit controls the mobile platform (1) to move to the designated position and then controls the mechanical arm (3) and soft manipulator (2) to place the object at the corresponding location, completing the whole grabbing operation of the system.
7. The control method of a mobile robot grabbing operation system according to claim 6, wherein the step S2 specifically comprises:
S21, performing object detection with an object recognition neural network on the environment image acquired by the monocular camera (11); if the object to be grabbed is not detected, controlling the mobile platform (1) to rotate or move to change the system field of view for further detection; if it is detected, calculating the two-dimensional coordinates of the object center point in the pixel coordinate system from the two-dimensional coordinates of the object detection frame vertices in the pixel coordinate system;
S22, converting the two-dimensional coordinates of the object center point obtained in S21 from the pixel coordinate system into three-dimensional coordinates in the monocular camera (11) coordinate system; formula (1) gives the conversion between the pixel coordinate system and the monocular camera (11) coordinate system, where $(u, v)$ are the pixel coordinates of the object center point, $f_x, f_y$ are the focal lengths in pixels and $(u_0, v_0)$ is the principal point; when the coordinates of the target in the pixel coordinate system are known, infinitely many world points correspond to it, all lying on the line through the target point and the monocular camera (11) optical center; for any given $Z_c$, the conversion yields the coordinates $(X_c, Y_c, Z_c)$ of the object center point in the monocular camera (11) coordinate system;
$$ Z_c\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=\begin{bmatrix}f_x & 0 & u_0\\ 0 & f_y & v_0\\ 0 & 0 & 1\end{bmatrix}\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}\tag{1}$$
S23, from the coordinates $(X_c, Y_c, Z_c)$ of the object center point obtained in S22, solving for the pitch angle $\alpha$ and heading angle $\beta$ of the line connecting the object center point and the camera optical center $O_c$ in the camera coordinate system;
S24, according to the pitch angle $\alpha$ and heading angle $\beta$ obtained in S23, controlling the pan-tilt (13) to drive the single-line laser ranging module (12) to rotate to the corresponding angles and perform a preliminary ranging of the object, the measured distance being $d$;
S25, correcting the pitch angle with the distance $d$ preliminarily measured in S24; the corrected pitch angle $\alpha'$ is given by formula (2), where $c$ is the spacing between the monocular camera (11) optical center and the single-line laser ranging module (12) optical center along the $Y_c$ axis;
$$ \alpha'=\arctan\!\left(\frac{d\sin\alpha-c}{d\cos\alpha}\right)\tag{2}$$
S26, readjusting the laser ranging module (12) to the corrected pitch angle $\alpha'$ and accurately measuring the distance $d'$ to the object center point;
S27, from the distance $d'$ of the object center point measured by the laser ranging module (12) in S26 and the actual rotation angles of the laser ranging module (12) about the heading and pitch axes, formula (3) gives by calculation and conversion the three-dimensional coordinates $(X_c, Y_c, Z_c)$ of the object center point in the laser ranging module (12) coordinate system, i.e. the system world coordinate system, completing the identification and positioning of the object.
$$ X_c=d'\cos\alpha'\sin\beta,\qquad Y_c=d'\sin\alpha',\qquad Z_c=d'\cos\alpha'\cos\beta\tag{3}$$
8. The control method of a mobile robot grabbing operation system according to claim 6, wherein the step S3 specifically comprises:
S31, converting the received three-dimensional coordinates of the object from the intelligent sensing unit (10) coordinate system into the mobile platform (1) coordinate system by formula (4), where $(X_c, Y_c, Z_c)$ are the object's coordinates in the intelligent sensing unit (10) coordinate system, $(X_p, Y_p, Z_p)$ are its coordinates in the mobile platform (1) coordinate system, and $R_{pc}$ and $T_{pc}$ are the rotation matrix and translation vector between the intelligent sensing unit (10) coordinate system and the mobile platform (1) coordinate system;
$$ \begin{bmatrix}X_p\\ Y_p\\ Z_p\end{bmatrix}=R_{pc}\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}+T_{pc}\tag{4}$$
S32, with the three-dimensional coordinates $(X_p, Y_p, Z_p)$ of the object in the mobile platform (1) coordinate system from S31 as the control feedback of the mobile platform (1) and the target coordinate of the mobile platform (1) as the control input, a PID controller with velocity as its output is designed to drive the mobile platform (1) to the specified position.
9. The control method of a mobile robot grabbing operation system according to claim 6, wherein the step S4 specifically comprises:
S41, converting the received three-dimensional coordinates of the object from the intelligent sensing unit (10) coordinate system into the mechanical arm (3) coordinate system by formula (5), where $(X_c, Y_c, Z_c)$ are the object's coordinates in the intelligent sensing unit (10) coordinate system, $(X_j, Y_j, Z_j)$ are its coordinates in the mechanical arm (3) coordinate system, and $R_{jc}$ and $T_{jc}$ are the rotation matrix and translation vector between the intelligent sensing unit (10) coordinate system and the mechanical arm (3) coordinate system;
$$ \begin{bmatrix}X_j\\ Y_j\\ Z_j\end{bmatrix}=R_{jc}\begin{bmatrix}X_c\\ Y_c\\ Z_c\end{bmatrix}+T_{jc}\tag{5}$$
S42, according to the three-dimensional coordinates $(X_j, Y_j, Z_j)$ of the object in the mechanical arm (3) coordinate system obtained in S41, the system control unit sends control instructions to the mechanical arm (3) and the solenoid valve (5) to complete the static grabbing of the object.
CN202010021047.1A 2020-01-08 2020-01-08 Mobile robot grabbing operation system and control method Pending CN111203849A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010021047.1A CN111203849A (en) 2020-01-08 2020-01-08 Mobile robot grabbing operation system and control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010021047.1A CN111203849A (en) 2020-01-08 2020-01-08 Mobile robot grabbing operation system and control method

Publications (1)

Publication Number Publication Date
CN111203849A true CN111203849A (en) 2020-05-29

Family

ID=70781003

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010021047.1A Pending CN111203849A (en) 2020-01-08 2020-01-08 Mobile robot grabbing operation system and control method

Country Status (1)

Country Link
CN (1) CN111203849A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111618835A (en) * 2020-06-12 2020-09-04 山东科曼智能科技有限公司 Intelligent charging robot system for port shore power and operation method
CN113021341A (en) * 2021-03-18 2021-06-25 深圳市科服信息技术有限公司 Robot based on 5G article identification and automatic transfer transportation
CN113084808A (en) * 2021-04-02 2021-07-09 上海智能制造功能平台有限公司 Monocular vision-based 2D plane grabbing method for mobile mechanical arm
CN113146576A (en) * 2021-02-04 2021-07-23 合肥工业大学 Medicine taking system, robot based on medicine taking system and control method
CN114473999A (en) * 2022-01-14 2022-05-13 浙江工业大学 Intelligent service robot system capable of automatically changing infusion bottles
CN114700956A (en) * 2022-05-20 2022-07-05 江苏金和美机器人科技有限公司 Identification, positioning and gripping device and method for robot-oriented article gripping operation
CN115919472A (en) * 2023-01-09 2023-04-07 北京云力境安科技有限公司 Mechanical arm positioning method and related system, device, equipment and medium
WO2024067727A1 (en) * 2022-09-30 2024-04-04 广州明珞装备股份有限公司 Drilling method and drilling device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103522291A (en) * 2013-10-29 2014-01-22 中国人民解放军总装备部军械技术研究所 Target capturing system and method of explosive ordnance disposal robot
CN107589749A (en) * 2017-09-19 2018-01-16 浙江大学 Underwater robot autonomous positioning and node map construction method
US20180154525A1 (en) * 2015-05-01 2018-06-07 General Electric Company Systems and methods for control of robotic manipulation
CN108161887A (en) * 2018-01-22 2018-06-15 东莞理工学院 Two-wheeled vision robot with manipulator
CN108942862A (en) * 2018-07-16 2018-12-07 汕头大学 A kind of compound mobile robot

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103522291A (en) * 2013-10-29 2014-01-22 中国人民解放军总装备部军械技术研究所 Target capturing system and method of explosive ordnance disposal robot
US20180154525A1 (en) * 2015-05-01 2018-06-07 General Electric Company Systems and methods for control of robotic manipulation
CN107589749A (en) * 2017-09-19 2018-01-16 浙江大学 Underwater robot autonomous positioning and node map construction method
CN108161887A (en) * 2018-01-22 2018-06-15 东莞理工学院 Two-wheeled vision robot with manipulator
CN108942862A (en) * 2018-07-16 2018-12-07 汕头大学 A kind of compound mobile robot

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wan Gang et al. (eds.), "UAV Surveying and Mapping Technology and Application", Surveying and Mapping Press, 31 December 2015 *
Wang Xingsong, "Principles and Applications of Mecanum-Wheel Omnidirectional Mobile Robots", Southeast University Press, 30 June 2018 *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111618835A (en) * 2020-06-12 2020-09-04 山东科曼智能科技有限公司 Intelligent charging robot system for port shore power and operation method
CN113146576A (en) * 2021-02-04 2021-07-23 合肥工业大学 Medicine taking system, robot based on medicine taking system and control method
CN113021341A (en) * 2021-03-18 2021-06-25 深圳市科服信息技术有限公司 Robot based on 5G article identification and automatic transfer transportation
CN113084808A (en) * 2021-04-02 2021-07-09 上海智能制造功能平台有限公司 Monocular vision-based 2D plane grabbing method for mobile mechanical arm
CN113084808B (en) * 2021-04-02 2023-09-22 上海智能制造功能平台有限公司 Monocular vision-based 2D plane grabbing method for mobile mechanical arm
CN114473999A (en) * 2022-01-14 2022-05-13 浙江工业大学 Intelligent service robot system capable of automatically changing infusion bottles
CN114473999B (en) * 2022-01-14 2023-09-29 浙江工业大学 Intelligent service robot system capable of automatically replacing infusion bottles
CN114700956A (en) * 2022-05-20 2022-07-05 江苏金和美机器人科技有限公司 Identification, positioning and gripping device and method for robot-oriented article gripping operation
WO2024067727A1 (en) * 2022-09-30 2024-04-04 广州明珞装备股份有限公司 Drilling method and drilling device
CN115919472A (en) * 2023-01-09 2023-04-07 北京云力境安科技有限公司 Mechanical arm positioning method and related system, device, equipment and medium
CN115919472B (en) * 2023-01-09 2023-05-05 北京云力境安科技有限公司 Mechanical arm positioning method and related system, device, equipment and medium

Similar Documents

Publication Publication Date Title
CN111203849A (en) Mobile robot grabbing operation system and control method
CN107433573B (en) Intelligent binocular automatic grabbing mechanical arm
CN205219101U (en) Service robot of family
CN100352623C (en) Control device and method for intelligent mobile robot capable of picking up article automatically
CN111055281A (en) ROS-based autonomous mobile grabbing system and method
CN106113067B (en) A kind of Dual-Arm Mobile Robot system based on binocular vision
CN111243017A (en) Intelligent robot grabbing method based on 3D vision
CN111015649B (en) Driving and controlling integrated control system
CN112873163A (en) Automatic material carrying robot system and control method thereof
CN109877827B (en) Non-fixed point material visual identification and gripping device and method of connecting rod manipulator
CN111203880B (en) Image visual servo control system and method based on data driving
CN112873164A (en) Automatic material handling robot
CN114505840A (en) Intelligent service robot of autonomous operation box type elevator
CN114770461B (en) Mobile robot based on monocular vision and automatic grabbing method thereof
CN114473998B (en) Intelligent service robot system capable of automatically opening door
CN110207619B (en) Measuring system and method for carrying cooperative mechanical arm based on omnibearing mobile platform
CN2810918Y (en) Intelligent mobile robot controller capable of collecting articles automatically
CN116100565A (en) Immersive real-time remote operation platform based on exoskeleton robot
Chang et al. Hybrid fuzzy control of an eye-to-hand robotic manipulator for autonomous assembly tasks
CN207408790U (en) A kind of copline cricket experimental system based on image procossing
Zhou et al. Visual servo control system of 2-DOF parallel robot
Bi et al. Intelligent Logistics Handling Robot: Design, Control, and Recognition
CN115741665A (en) Intelligent disassembling robot system and control method
Gao et al. Deep-learning based robotic manipulation of flexible PCBs
CN113352314A (en) Robot motion control system and method based on closed-loop feedback

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20200529