CN108724183B - Control method, system and related device of carrying mechanical arm - Google Patents


Info

Publication number
CN108724183B
Authority
CN
China
Prior art date
Legal status
Active
Application number
CN201810532037.7A
Other languages
Chinese (zh)
Other versions
CN108724183A (en
Inventor
谭天诺
刘建群
黄威威
任亮亮
蔡建林
郑允泽
赵壬博
Current Assignee
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date
Filing date
Publication date
Application filed by Guangdong University of Technology
Priority to CN201810532037.7A
Publication of CN108724183A
Application granted
Publication of CN108724183B
Legal status: Active

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control; multi-sensor controlled systems; sensor fusion
    • B25J9/1697: Vision controlled systems
    • B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02: Sensing devices
    • B25J19/021: Optical sensing devices
    • B25J19/023: Optical sensing devices including video camera means

Landscapes

  • Engineering & Computer Science
  • Robotics
  • Mechanical Engineering
  • Multimedia
  • Manipulator
  • Image Analysis

Abstract

The application discloses a control method for a handling mechanical arm, comprising the following steps: when an object-handling command is received, capturing environment images with a multi-view depth camera and judging from the images whether an object to be handled is present; if so, performing a three-dimensional reconstruction on the environment images to determine the image depth information of the object to be handled; and generating a motion trajectory from the image depth information and controlling the handling mechanical arm to carry the object, along that trajectory, to the position specified by the command. The method enables the handling mechanical arm to determine the specific position of the object to be handled and to carry out the handling operation. The application also discloses a control system for a handling mechanical arm, a computer-readable storage medium, and a handling mechanical arm, which share the same beneficial effects.

Description

Control method, system and related device of carrying mechanical arm
Technical Field
The present invention relates to the field of mechanical arms, and in particular to a control method and system for a handling mechanical arm, a computer-readable storage medium, and a handling mechanical arm.
Background
In recent years, mechanical arms have been widely used in industrial production, and industrial mechanical arms now come in many varieties with many functions. A mechanical arm typically has multiple degrees of freedom, allowing it to grasp or move an object (a tool or a workpiece). An industrial mechanical arm can be understood as a mechanical and electronic device with the functions of an anthropomorphic arm, wrist and hand: it can move any object or tool through a time-varying sequence of spatial poses (position and orientation) so as to meet the requirements of a given industrial operation. Examples include gripping welding tongs or a welding gun to spot-weld or arc-weld an automobile or motorcycle body, carrying die-cast or stamped parts, performing laser cutting and spraying, and assembling mechanical parts.
Handling mechanical arms in the prior art are mainly applied in the field of mechanical assembly, where they execute fixed, mechanized procedures: the position of the object to be handled and the destination are both fixed, so every motion path and handling operation can be planned in advance and only infrared laser calibration is required. Such arms were not, however, developed for people's daily life, where there is a real need for them, for example helping elderly or disabled people hold a cup or pick up a pen from a table top. When a handling mechanical arm is used in daily life, the position of the object to be handled is arbitrary, so a prior-art arm with a fixed motion path and fixed handling operations cannot perform the tasks in these application scenarios.
Therefore, how to use a handling mechanical arm to determine the specific position of an object to be handled and carry out the handling operation is a technical problem that those skilled in the art currently need to solve.
Disclosure of Invention
The aim of the invention is to provide a control method and system for a handling mechanical arm, a computer-readable storage medium, and a handling mechanical arm, all of which can determine the specific position of an object to be handled and carry out the handling operation.
To solve the above technical problem, the present application provides a control method for a handling mechanical arm, the method comprising:
when an object-handling command is received, capturing environment images with a multi-view depth camera and judging from the images whether an object to be handled is present;
if so, performing a three-dimensional reconstruction on the environment images and determining the image depth information of the object to be handled;
and generating a motion trajectory from the image depth information, and controlling the handling mechanical arm to carry the object to be handled, along that trajectory, to the position specified by the object-handling command.
Optionally, capturing the environment images with the multi-view depth camera after the object-handling command is received, and judging from the environment images whether an object to be handled is present, comprises:
when an object-handling command is received, capturing environment images with the multi-view depth camera;
extracting the feature points in all the environment images, and extracting the image contours in all the environment images with the Canny operator;
obtaining an object image to be identified according to the density of feature points falling inside an image contour;
judging whether the number of feature points and the peak-to-noise ratio in the object image to be identified meet preset standards, the preset standards comprising the standard feature-point count and the standard peak-to-noise ratio of a reference image of the object to be handled;
if so, the object to be handled is present in the environment images.
Optionally, performing the three-dimensional reconstruction on the environment images and determining the image depth information of the object to be handled comprises:
acquiring the internal parameters of the multi-view depth camera, the camera having been calibrated with the Zhang Zhengyou calibration method;
comparing the similarity of the feature points across all the environment images to obtain M groups of similar feature points whose similarity exceeds a preset value, and selecting an image to be reconstructed from the environment images containing the object to be handled;
calculating the three-dimensional coordinates of the similar feature points on the image to be reconstructed from their two-dimensional coordinates and the internal parameters;
and performing a depth-interpolation operation on the three-dimensional coordinates of the similar feature points on the image to be reconstructed to determine the image depth information of the object to be handled.
Optionally, the feature points are SURF feature points.
Optionally, the method further comprises:
receiving voice information input by a user, and generating the object-handling command from the voice information.
The application also provides a control system for a handling mechanical arm, the system comprising:
an image-capturing module, configured to capture environment images with a multi-view depth camera after an object-handling command is received, and to judge from the images whether an object to be handled is present;
a three-dimensional reconstruction module, configured to perform a three-dimensional reconstruction on the environment images when an object to be handled is judged to be present, and to determine the image depth information of the object to be handled;
and a handling module, configured to generate a motion trajectory from the image depth information and to control the handling mechanical arm to carry the object to be handled, along that trajectory, to the position specified by the object-handling command.
Optionally, the image-capturing module comprises:
a capturing unit, configured to capture a preset number of environment images from different angles after an object-handling command is received;
a feature-extraction unit, configured to extract the feature points in all the environment images and to extract the image contours in all the environment images with the Canny operator;
an image-determining unit, configured to obtain an object image to be identified according to the density of feature points falling inside an image contour;
and a judging unit, configured to judge whether the number of feature points and the peak-to-noise ratio in the object image to be identified meet the preset standards, the preset standards comprising the standard feature-point count and the standard peak-to-noise ratio of a reference image of the object to be handled; if so, the object to be handled is present in the environment images.
Optionally, the three-dimensional reconstruction module comprises:
an internal-parameter acquisition unit, configured to acquire the internal parameters of the multi-view depth camera, the camera having been calibrated with the Zhang Zhengyou calibration method;
a similar-feature-point determining unit, configured to compare the similarity of the feature points across all the environment images to obtain M groups of similar feature points whose similarity exceeds a preset value, and to select an image to be reconstructed from the environment images containing the object to be handled;
a coordinate-calculation unit, configured to calculate the three-dimensional coordinates of the similar feature points on the image to be reconstructed from their two-dimensional coordinates and the internal parameters;
and a depth-information determining unit, configured to perform a depth-interpolation operation on the three-dimensional coordinates of the similar feature points on the image to be reconstructed to determine the image depth information of the object to be handled.
The application also provides a computer-readable storage medium on which a computer program is stored; when executed, the program implements the steps of the above control method for a handling mechanical arm.
The application also provides a handling mechanical arm, comprising: a multi-view depth camera for capturing images of an object to be handled; a microcomputer, connected to the multi-view depth camera, for generating image depth information from the captured images and generating control instructions from the image depth information; and a mechanical arm assembly, connected to the microcomputer through a control circuit, for handling the object to be handled according to the control instructions.
The invention provides a control method for a handling mechanical arm: after an object-handling command is received, environment images are captured with a multi-view depth camera and used to judge whether an object to be handled is present; if so, a three-dimensional reconstruction is performed on the environment images to determine the image depth information of the object to be handled; and a motion trajectory is generated from the image depth information, along which the handling mechanical arm carries the object to be handled to the position specified by the object-handling command.
In the prior art, the spatial position of the object handled by a mechanical arm is fixed and only a fixed operation needs to be executed; when the position of the object is not fixed, a prior-art handling mechanical arm cannot perform the handling operation normally. This application therefore captures environment images with a multi-view depth camera, performs a three-dimensional reconstruction on them to determine the spatial position of the object to be handled, and generates the corresponding control instructions from that actual spatial position to complete the handling operation. With this scheme, the handling mechanical arm can determine the specific position of the object to be handled and carry out the handling operation. The application also provides a control system for a handling mechanical arm, a computer-readable storage medium, and a handling mechanical arm, which share the same beneficial effects and are not described again here.
Drawings
For a clearer description of the embodiments of the present application, the drawings needed for the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; from them, those of ordinary skill in the art can obtain other drawings without inventive effort.
Fig. 1 is a flowchart of a control method of a handling mechanical arm according to an embodiment of the present application;
fig. 2 is a flowchart of another control method of a handling mechanical arm according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a control system of a handling mechanical arm according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of the mechanical arm assembly of a desktop-grade handling mechanical arm according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained from them by those of ordinary skill in the art without inventive effort fall within the scope of protection of the present application.
Referring to fig. 1, fig. 1 is a flowchart of a control method of a handling robot according to an embodiment of the present application.
The specific steps may include:
s101: when an object carrying command is received, shooting an environment image by utilizing a multi-view depth camera, and judging whether an object to be carried exists or not according to the environment image; if yes, enter S102; if not, ending the flow;
in this embodiment, there is default an operation of sending an object handling command, where the object handling command may include a type of an object to be handled, and a spatial position where the object to be handled needs to be handled. The object handling command may be sent by the user in the form of voice, touch screen, wireless signals, etc. The multi-view depth camera is composed of a plurality of sub cameras, for example, a binocular depth camera composed of two 150-degree wide-angle cameras can be adopted. The multi-view depth camera can shoot a plurality of environment images and judge whether an object to be carried exists in the environment images.
It should be noted that, in this embodiment, feature information about a standard picture or a picture of an object to be carried exists by default, and whether the object to be carried exists in the environmental image may be determined according to a similarity comparison method. In this embodiment, the sending operation of the object handling command may also be implemented by means of voice input, touch screen, remote control, etc.
S102: performing a three-dimensional reconstruction on the environment images and determining the image depth information of the object to be handled;
This step builds on the determination in S101 that an object to be handled is present in the environment images, and further determines the spatial relationship between the object and the handling mechanical arm. A single planar image captured by the camera yields only the two-dimensional coordinates of the object to be handled; its three-dimensional coordinates cannot be obtained directly. Because the multi-view depth camera can measure the range of object distances along the imaging axis at which the image is in focus, a three-dimensional reconstruction can be performed on the object in the two-dimensional environment images to obtain the three-dimensional coordinates of all feature points on the object, i.e. the image depth information of the object to be handled.
S103: and generating a motion track according to the image depth information, and controlling the carrying mechanical arm to carry the object to be carried to a position corresponding to the object carrying command according to the motion track.
After the image depth information is obtained, the geometric centre and contour of the object to be handled can be computed; the handling mechanical arm can then generate a motion trajectory from its own position and the position of the object, and carry the object to the position specified by the object-handling command.
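As a concrete illustration of this trajectory-generation step, the sketch below linearly interpolates a Cartesian path from the gripper's current position to the recovered geometric centre of the object. This is only a minimal placeholder for real trajectory planning, which would also account for joint limits and obstacle avoidance; the function name and the coordinates are hypothetical.

```python
def plan_trajectory(start, goal, steps=5):
    """Linear Cartesian interpolation from the gripper's current position
    to the target (e.g. the object's geometric centre obtained from the
    depth information). Returns steps+1 way-points, both endpoints included."""
    return [
        tuple(s + (g - s) * t / steps for s, g in zip(start, goal))
        for t in range(steps + 1)
    ]

# move from a hypothetical rest pose (metres) to the object's centre
path = plan_trajectory((0.0, 0.0, 0.2), (0.3, 0.1, 0.05), steps=3)
```

Each way-point would then be converted to joint angles by the arm's inverse kinematics before being sent to the servos.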
In this embodiment, environment images are captured with a multi-view depth camera and a three-dimensional reconstruction is performed on them to determine the spatial position of the object to be handled; the corresponding control instructions are then generated from that actual spatial position to complete the handling operation. With this scheme, the handling mechanical arm can determine the specific position of the object to be handled and carry out the handling operation.
Referring to fig. 2, fig. 2 is a flowchart of another control method of a handling mechanical arm according to an embodiment of the present application; the specific steps may include:
s201: when an object carrying command is received, shooting an environment image by using a multi-view depth camera;
s202: extracting feature points in all the environment images, and extracting image contours in all the environment images through a canny operator;
wherein, the canny operator is fully called: the canny edge detection operator is a multi-stage edge detection algorithm. Here, the types of feature points are not limited, and surf feature points may be extracted, or HARRIS corner points may be extracted, but only one feature point may be extracted.
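In practice the Canny operator is invoked through a library call (for example OpenCV's `cv2.Canny`). Purely to illustrate the gradient stage at its core, the self-contained sketch below marks pixels whose central-difference gradient magnitude exceeds a threshold; the real operator adds Gaussian smoothing, non-maximum suppression and hysteresis thresholding on top of this.

```python
def gradient_edges(img, thresh):
    """Crude edge map: a pixel is an edge when the magnitude of its
    central-difference intensity gradient exceeds `thresh`.
    `img` is a list of rows of grey values; the one-pixel border is
    left untouched. Only the gradient stage of Canny is shown."""
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]   # horizontal gradient
            gy = img[y + 1][x] - img[y - 1][x]   # vertical gradient
            if (gx * gx + gy * gy) ** 0.5 > thresh:
                edges[y][x] = 1
    return edges

# a vertical intensity step produces a vertical edge next to the step
step_img = [[0, 0, 0, 90, 90, 90] for _ in range(4)]
edge_map = gradient_edges(step_img, thresh=50)
```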
S203: obtaining an object image to be identified according to the density of feature points falling inside an image contour;
An object image to be identified is obtained when the density of feature points falling inside an image contour exceeds a certain value. Note that the image determined in this step is only the image of some object present in the environment image; it is not necessarily an image of the object to be handled, which is why the judgment in S204 is required.
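The density test of S203 can be sketched as follows: count how many extracted feature points fall inside a closed contour (here with a simple ray-casting point-in-polygon test) and accept the region as an object image to be identified when the count reaches a threshold. The function names and the threshold are illustrative only.

```python
def point_in_polygon(pt, poly):
    """Ray-casting test: is point `pt` inside the closed polygon `poly`
    (a list of (x, y) vertices, e.g. an extracted image contour)?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):            # edge crosses the scan line
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def candidate_object(keypoints, contour, min_count):
    """Treat the contour as an object image to be identified when enough
    of the extracted feature points fall inside it."""
    inside = sum(point_in_polygon(kp, contour) for kp in keypoints)
    return inside >= min_count

square_contour = [(0, 0), (10, 0), (10, 10), (0, 10)]
is_candidate = candidate_object([(1, 1), (2, 2), (20, 20)],
                                square_contour, min_count=2)
```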
S204: judging whether the number of feature points and the peak-to-noise ratio in the object image to be identified meet the preset standards; if yes, proceed to S205; if not, return to S201;
The preset standards comprise the standard feature-point count and the standard peak-to-noise ratio of a reference image of the object to be handled. The judgment in S204 is specifically: judge whether the number of feature points in the object image to be identified exceeds the standard feature-point count, and whether its peak-to-noise ratio exceeds the standard peak-to-noise ratio; only if both judgments hold does the object image to be identified meet the standards, otherwise it does not. If the preset standards are not met, the flow returns to S201 to re-capture the environment images.
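The retry logic of S201 to S204 can be condensed into a small loop: keep capturing until a candidate satisfies both preset standards, or give up after a bounded number of attempts. `capture` and `extract` below are placeholders for the camera and the feature-extraction pipeline, and the numeric standards are hypothetical.

```python
def recognise(capture, extract, std_points, std_ratio, max_tries=5):
    """Re-capture environment images until a candidate object image
    meets both preset standards, mirroring the S204 -> S201 loop."""
    for _ in range(max_tries):
        candidate = extract(capture())
        if (candidate["points"] > std_points
                and candidate["peak_noise_ratio"] > std_ratio):
            return candidate
    return None  # nothing recognisable: no object to be handled found

# simulated pipeline: the first capture fails the standards, the second passes
frames = iter([{"points": 3, "peak_noise_ratio": 1.0},
               {"points": 50, "peak_noise_ratio": 8.0}])
result = recognise(lambda: None, lambda img: next(frames),
                   std_points=10, std_ratio=5.0)
```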
S205: acquiring the internal parameters of the multi-view depth camera, the camera having been calibrated with the Zhang Zhengyou calibration method;
s206: comparing the similarity of the feature points of all the environmental images to obtain M groups of similar feature points with the similarity larger than a preset value, and determining an image to be restored in all the environmental images of the object to be carried;
the multi-view depth camera can shoot the environmental images with the same number as that of the sub cameras at one time, so that the similarity comparison is carried out on the characteristic points in all the environmental images, M groups of similar characteristic points can be obtained, the specific number of M is not limited here, and M is any positive integer larger than 1. The purpose of determining the images to be restored in this step is to select the pictures from all the environmental images to perform three-dimensional coordinate calculation, and the number of the images to be restored is not limited herein, and 1 image to be restored may be selected as a preferred embodiment. For example, feature points X1, X2, X3, X4 and X5 are extracted through the environmental image X, feature points Y1, Y2, Y3, Y4 and Y5 are extracted through the environmental image Y, and by comparing the similarity of all feature points of the environmental image in this step, it is determined that the similarity of X1 and Y1 is larger than a preset value, the similarity of X2 and Y2 is larger than a preset value, the similarity of X3 and Y3 is larger than other similarity of the preset value and is not larger than the preset value, and 3 groups of similar feature points are obtained at this time. And selecting the environment image X as an image to be restored, wherein X1, X2 and X3 in the image to be restored are similar feature points on the image to be restored.
S207: calculating the three-dimensional coordinates of the similar feature points on the image to be reconstructed from their two-dimensional coordinates and the internal parameters;
The internal parameters of a camera describe the mapping between two-dimensional picture coordinates and three-dimensional coordinates, so the three-dimensional coordinates of the similar feature points can be determined from their two-dimensional coordinates and the internal parameters. Since the similar feature points are not all of the feature points in the image to be reconstructed, the operation in S208 is needed to determine the three-dimensional coordinates of the remaining feature points.
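For the binocular configuration mentioned earlier, this 2-D-to-3-D mapping can be sketched as rectified-stereo back-projection: depth follows from the disparity between a matched pair of feature points, and the X and Y coordinates follow from inverting the pinhole model with the internal parameters fx, fy, cx, cy obtained from calibration. The patent does not spell out its exact formulas, so this is a standard-model sketch with hypothetical numbers.

```python
def back_project(u, v, disparity, fx, fy, cx, cy, baseline):
    """Camera-frame 3-D coordinate of a matched feature point from its
    pixel position (u, v) in the left image, its disparity to the right
    image, and the internal parameters. For a rectified stereo pair:
    Z = fx * baseline / disparity, then invert the pinhole projection."""
    z = fx * baseline / disparity
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# hypothetical calibration: fx = fy = 500 px, principal point (320, 240),
# 10 cm baseline; a feature at pixel (420, 240) with 50 px disparity
pt3d = back_project(420, 240, 50, fx=500, fy=500, cx=320, cy=240, baseline=0.1)
```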
S208: performing a depth-interpolation operation on the three-dimensional coordinates of the similar feature points on the image to be reconstructed to determine the image depth information of the object to be handled.
S209: generating a motion trajectory from the image depth information, and controlling the handling mechanical arm to carry the object to be handled, along that trajectory, to the position specified by the object-handling command.
Referring to fig. 3, fig. 3 is a schematic structural diagram of a control system of a handling mechanical arm according to an embodiment of the present application;
the system may include:
an image-capturing module 100, configured to capture environment images with a multi-view depth camera after an object-handling command is received, and to judge from the images whether an object to be handled is present;
a three-dimensional reconstruction module 200, configured to perform a three-dimensional reconstruction on the environment images when an object to be handled is judged to be present, and to determine the image depth information of the object to be handled;
and a handling module 300, configured to generate a motion trajectory from the image depth information and to control the handling mechanical arm to carry the object to be handled, along that trajectory, to the position specified by the object-handling command.
Further, the image-capturing module 100 comprises:
a capturing unit, configured to capture a preset number of environment images from different angles after an object-handling command is received;
a feature-extraction unit, configured to extract the feature points in all the environment images and to extract the image contours in all the environment images with the Canny operator;
an image-determining unit, configured to obtain an object image to be identified according to the density of feature points falling inside an image contour;
and a judging unit, configured to judge whether the number of feature points and the peak-to-noise ratio in the object image to be identified meet the preset standards, the preset standards comprising the standard feature-point count and the standard peak-to-noise ratio of a reference image of the object to be handled; if so, the object to be handled is present in the environment images.
Further, the three-dimensional reconstruction module 200 comprises:
an internal-parameter acquisition unit, configured to acquire the internal parameters of the multi-view depth camera, the camera having been calibrated with the Zhang Zhengyou calibration method;
a similar-feature-point determining unit, configured to compare the similarity of the feature points across all the environment images to obtain M groups of similar feature points whose similarity exceeds a preset value, and to select an image to be reconstructed from the environment images containing the object to be handled;
a coordinate-calculation unit, configured to calculate the three-dimensional coordinates of the similar feature points on the image to be reconstructed from their two-dimensional coordinates and the internal parameters;
and a depth-information determining unit, configured to perform a depth-interpolation operation on the three-dimensional coordinates of the similar feature points on the image to be reconstructed to determine the image depth information of the object to be handled.
Further, the control system further comprises:
a voice-control module, configured to receive voice information input by a user and to generate the object-handling command from the voice information.
Since the embodiments of the system portion correspond to those of the method portion, reference is made to the description of the method embodiments, which is not repeated here.
The present application also provides a computer-readable storage medium on which a computer program is stored; when executed, the program implements the steps provided by the above embodiments. The storage medium may include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.
The application also provides a handling mechanical arm, comprising: a multi-view depth camera for capturing images of an object to be handled;
a microcomputer, connected to the multi-view depth camera, for generating image depth information from the captured images of the object to be handled and generating control instructions from the image depth information; and a mechanical arm assembly, connected to the microcomputer through a control circuit, for handling the object to be handled according to the control instructions. The handling mechanical arm may of course also include various network interfaces, a power supply, and other components. Referring to fig. 4, fig. 4 is a schematic structural diagram of the mechanical arm assembly of a desktop-grade handling mechanical arm according to an embodiment of the present application. The mechanical arm assembly includes:
a mechanical arm turntable 402, connected to a first servo 401 and used to rotate the driving mechanical arm 404 about the rotation axis in the horizontal plane;
a driving mechanical arm 404, rotatably connected to the mechanical arm turntable 402 and to a driven mechanical arm 405, and connected to a second servo assembly 403, used to drive the motion of the driven mechanical arm 405;
and a clamping device 406, arranged at the end of the driven mechanical arm 405 and connected to a third servo 407, used to clamp the object to be handled.
Further, the driving mechanical arm 404 comprises:
a first link, rotatably connected to the mechanical arm turntable and to a first connection point of the driven mechanical arm;
and a second link, rotatably connected to the mechanical arm turntable and to a second connection point of the driven mechanical arm.
Correspondingly, the second servo assembly comprises a main servo and an auxiliary servo; the main servo is connected to the first link and the auxiliary servo to the second link. The first link joins the first connection point of the driven mechanical arm and the second link joins its second connection point; a certain distance is kept between the two connection points so that the driven mechanical arm can be controlled.
The mechanical arm assembly has five degrees of freedom in total: one degree of freedom of the mechanical arm turntable 402, one degree of freedom of the first connecting rod, one degree of freedom of the second connecting rod, and two degrees of freedom of the clamping device 406 for clamping and tilting.
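The five-degree-of-freedom layout above resembles a common parallel-link desktop arm, but the patent does not give its kinematics. The following is therefore only an illustrative sketch of two-link planar inverse kinematics plus a turntable angle; the link lengths `L1` and `L2` and the function names are hypothetical, not taken from the patent.

```python
import math

def two_link_ik(x, z, L1, L2):
    """Inverse kinematics sketch for a planar two-link arm.

    Given a target (x, z) in the vertical plane of the arm and assumed
    link lengths L1, L2, return (shoulder, elbow) joint angles in
    radians, or None if the target is out of reach.
    """
    d2 = x * x + z * z
    d = math.sqrt(d2)
    if d > L1 + L2 or d < abs(L1 - L2):
        return None  # target outside the reachable annulus
    # Law of cosines gives the elbow angle.
    cos_elbow = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: direction to the target minus the inner triangle angle.
    shoulder = math.atan2(z, x) - math.atan2(
        L2 * math.sin(elbow), L1 + L2 * math.cos(elbow))
    return shoulder, elbow

def turntable_angle(x, y):
    """Horizontal rotation of the turntable toward the target (x, y)."""
    return math.atan2(y, x)
```

In this sketch the turntable aligns the arm's vertical plane with the target, the two link angles place the gripper, and the remaining two servos would handle clamping and tilting.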
Further, the carrying mechanical arm may also include:
a voice recognition component, connected with the microcomputer, for receiving voice information and converting the voice information into an object carrying command.
The voice information may be collected through a microphone array and stored in the PCM format; the voice information is then uploaded to a network platform for voice recognition, and keywords of the control instruction are extracted from the returned information to generate the object carrying command.
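The patent uploads the PCM audio to an unspecified network platform and parses the returned transcript, so the recognition call itself cannot be shown here. The sketch below covers only the final keyword-extraction step; the command vocabulary and function name are hypothetical.

```python
# Keyword extraction from a recognition transcript.  The network
# recognition service is stubbed out; only the matching step is shown.
# The vocabularies below are illustrative placeholders.

OBJECTS = {"cup", "box", "bottle"}
PLACES = {"shelf", "table", "tray"}

def parse_transcript(transcript: str):
    """Extract an (object, destination) carrying command from a transcript,
    or return None when no complete command is recognized."""
    words = transcript.lower().split()
    obj = next((w for w in words if w in OBJECTS), None)
    place = next((w for w in words if w in PLACES), None)
    if obj is None or place is None:
        return None
    return {"action": "carry", "object": obj, "to": place}
```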
Further, the carrying mechanical arm may also include:
a wireless signal transceiver connected with the microcomputer.
In this description, the embodiments are described in a progressive manner, each embodiment focusing on its differences from the others; for identical or similar parts among the embodiments, reference may be made to one another. Since the system disclosed in the embodiments corresponds to the method disclosed in the embodiments, its description is relatively brief, and the relevant points can be found in the description of the method. It should be noted that those skilled in the art can make various improvements and modifications to the present application without departing from its principles, and such improvements and modifications fall within the scope of the claims of the present application.
It should also be noted that in this specification, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.

Claims (8)

1. A control method of a carrying mechanical arm, characterized by comprising the following steps:
when an object carrying command is received, shooting an environment image by using a multi-view depth camera;
extracting feature points in all the environment images, and extracting image contours in all the environment images through the Canny operator;
obtaining an object image to be identified according to the density of the feature points falling inside the image contours;
judging whether the number of the feature points and the peak-to-noise ratio in the object image to be identified meet preset standards, the preset standards comprising a standard number of feature points and a standard peak-to-noise ratio of a standard image of the object to be carried; if yes, determining that the object to be carried exists;
if the object to be carried exists, performing a three-dimensional restoration operation on the environment images, and determining image depth information of the object to be carried;
and generating a motion track according to the image depth information, and controlling the carrying mechanical arm to carry the object to be carried to a position corresponding to the object carrying command according to the motion track.
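The existence check in claim 1 accepts a candidate object image only when its feature-point count and peak-to-noise ratio both reach the preset standards. A minimal sketch of that decision follows; the standard values and the tolerance factor are hypothetical, since the patent leaves them unspecified.

```python
# Sketch of the existence check: a candidate object image is accepted
# only if its feature-point count and peak-to-noise ratio both reach
# preset standards.  The numeric values below are illustrative only.

STANDARD_FEATURE_COUNT = 50    # standard number of feature points
STANDARD_PEAK_TO_NOISE = 0.8   # standard peak-to-noise ratio
TOLERANCE = 0.8                # accept candidates within 80% of standard

def object_present(feature_count: int, peak_to_noise: float) -> bool:
    """Return True when the candidate image meets the preset standards."""
    return (feature_count >= STANDARD_FEATURE_COUNT * TOLERANCE
            and peak_to_noise >= STANDARD_PEAK_TO_NOISE * TOLERANCE)
```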
2. The control method according to claim 1, wherein performing the three-dimensional restoration operation on the environment images and determining the image depth information of the object to be carried comprises:
acquiring the internal parameters of the multi-view depth camera, the multi-view depth camera being calibrated by the Zhang Zhengyou calibration method;
comparing the similarity of the feature points of all the environment images to obtain M groups of similar feature points whose similarity is larger than a preset value, and determining images to be restored among all the environment images of the object to be carried;
calculating three-dimensional coordinates of the similar feature points on the images to be restored according to the two-dimensional coordinates of the similar feature points on the images to be restored and the internal parameters;
and performing a depth interpolation operation according to the three-dimensional coordinates of the similar feature points on the images to be restored to determine the image depth information of the object to be carried.
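Claim 2 does not spell out how three-dimensional coordinates are computed from matched feature points and the calibrated parameters. A standard way to do this for two views is linear (DLT) triangulation; the sketch below is that generic technique under assumed projection matrices, not the patent's specific algorithm.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one matched feature point.

    P1, P2 : 3x4 projection matrices (intrinsics times extrinsics) of
             two views, obtained from camera calibration.
    x1, x2 : (u, v) pixel coordinates of the matched feature point in
             each view.
    Returns the 3-D point in the world frame.
    """
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

Depth interpolation would then fill in the surface between these sparse reconstructed points; the claim does not specify the interpolation scheme.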
3. The control method according to claim 1, characterized in that the feature points are specifically SURF feature points.
4. The control method according to claim 1, characterized by further comprising:
and receiving voice information input by a user, and generating the object carrying command according to the voice information.
5. A control system of a carrying mechanical arm, characterized by comprising:
the image shooting module is used for shooting an environment image by utilizing the multi-view depth camera after receiving an object carrying command and judging whether an object to be carried exists or not according to the environment image;
the three-dimensional restoration module, used for performing the three-dimensional restoration operation on the environment images when it is judged from the environment images that the object to be carried exists, and determining image depth information of the object to be carried;
the carrying module is used for generating a motion track according to the image depth information and controlling the carrying mechanical arm to carry the object to be carried to a position corresponding to the object carrying command according to the motion track;
wherein, the image shooting module includes:
the shooting unit is used for shooting a preset number of environmental images with different shooting angles after receiving the object carrying command;
the feature extraction unit, used for extracting feature points in all the environment images and extracting image contours in all the environment images through the Canny operator;
the image determining unit, used for obtaining an object image to be identified according to the density of the feature points falling inside the image contours;
and the judging unit, used for judging whether the number of the feature points and the peak-to-noise ratio in the object image to be identified meet preset standards, the preset standards comprising a standard number of feature points and a standard peak-to-noise ratio of a standard image of the object to be carried; if yes, the object to be carried exists.
6. The control system of claim 5, wherein the three-dimensional restoration module comprises:
an internal parameter obtaining unit, configured to obtain the internal parameters of the multi-view depth camera, the multi-view depth camera being calibrated by the Zhang Zhengyou calibration method;
the similar feature point determining unit, used for comparing the similarity of the feature points of all the environment images to obtain M groups of similar feature points whose similarity is larger than a preset value, and determining images to be restored among all the environment images of the object to be carried;
the coordinate calculation unit, used for calculating three-dimensional coordinates of the similar feature points on the images to be restored according to the two-dimensional coordinates of the similar feature points on the images to be restored and the internal parameters;
and the depth information determining unit, used for performing a depth interpolation operation according to the three-dimensional coordinates of the similar feature points on the images to be restored to determine the image depth information of the object to be carried.
7. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, which when executed by a processor, implements the steps of the control method of the handling robot according to any one of claims 1 to 4.
8. A handling robot arm, comprising:
a multi-view depth camera for shooting images of objects to be carried;
the microcomputer is connected with the multi-view depth camera and used for generating image depth information according to a shot image of an object to be carried and generating a control instruction according to the image depth information;
the mechanical arm assembly is connected with the microcomputer through a control circuit and is used for carrying the object to be carried according to the control instruction;
wherein the carrying mechanical arm is also used for extracting feature points in all the environment images and extracting image contours in all the environment images through the Canny operator; is also used for obtaining an object image to be identified according to the density of the feature points falling inside the image contours; and is also used for judging whether the number of the feature points and the peak-to-noise ratio in the object image to be identified meet preset standards, the preset standards comprising a standard number of feature points and a standard peak-to-noise ratio of a standard image of the object to be carried; if yes, the object to be carried exists.
CN201810532037.7A 2018-05-29 2018-05-29 Control method, system and related device of carrying mechanical arm Active CN108724183B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810532037.7A CN108724183B (en) 2018-05-29 2018-05-29 Control method, system and related device of carrying mechanical arm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810532037.7A CN108724183B (en) 2018-05-29 2018-05-29 Control method, system and related device of carrying mechanical arm

Publications (2)

Publication Number Publication Date
CN108724183A CN108724183A (en) 2018-11-02
CN108724183B true CN108724183B (en) 2024-03-26

Family

ID=63936575

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810532037.7A Active CN108724183B (en) 2018-05-29 2018-05-29 Control method, system and related device of carrying mechanical arm

Country Status (1)

Country Link
CN (1) CN108724183B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113034668B (en) * 2021-03-01 2023-04-07 中科数据(青岛)科技信息有限公司 AR-assisted mechanical simulation operation method and system
CN113344952B (en) * 2021-07-28 2021-11-19 浙江华睿科技股份有限公司 Trimming method and device and electronic equipment
CN113858199B (en) * 2021-09-27 2023-02-28 河南松源企业管理咨询有限公司 Furniture transfer robot

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06262568A (en) * 1993-03-08 1994-09-20 Nippondenso Co Ltd Recognition method for three-dimensional position and attitude based on visual sensation and device thereof
DE202017003104U1 (en) * 2017-06-13 2017-09-15 Gmt Global Inc. Image recognition function supporting motion control module for work machines
CN107886044A (en) * 2016-09-29 2018-04-06 发那科株式会社 Object detector and object identification method
CN107914272A (en) * 2017-11-20 2018-04-17 北京科技大学 A kind of method of seven freedom robot assemblies crawl target object
CN107972026A (en) * 2016-10-25 2018-05-01 深圳光启合众科技有限公司 Robot, mechanical arm and its control method and device
CN208262843U (en) * 2018-05-29 2018-12-21 广东工业大学 A kind of desktop grade handling machinery arm

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06262568A (en) * 1993-03-08 1994-09-20 Nippondenso Co Ltd Recognition method for three-dimensional position and attitude based on visual sensation and device thereof
CN107886044A (en) * 2016-09-29 2018-04-06 发那科株式会社 Object detector and object identification method
CN107972026A (en) * 2016-10-25 2018-05-01 深圳光启合众科技有限公司 Robot, mechanical arm and its control method and device
DE202017003104U1 (en) * 2017-06-13 2017-09-15 Gmt Global Inc. Image recognition function supporting motion control module for work machines
CN107914272A (en) * 2017-11-20 2018-04-17 北京科技大学 A kind of method of seven freedom robot assemblies crawl target object
CN208262843U (en) * 2018-05-29 2018-12-21 广东工业大学 A kind of desktop grade handling machinery arm

Also Published As

Publication number Publication date
CN108724183A (en) 2018-11-02

Similar Documents

Publication Publication Date Title
CN108724183B (en) Control method, system and related device of carrying mechanical arm
CN105269578B (en) Pointing device and robot system
JP4226623B2 (en) Work picking device
US10306149B2 (en) Image processing apparatus, robot system, robot, and image processing method
CN111319044B (en) Article grabbing method and device, readable storage medium and grabbing robot
JP2010012567A (en) Article picking device
CN116157837A (en) Calibration method and device for robot
CN116958146B (en) Acquisition method and device of 3D point cloud and electronic device
US11470259B2 (en) Systems and methods for sampling images
JP2006224291A (en) Robot system
JP2018122376A (en) Image processing device, robot control device, and robot
JP2003089086A (en) Robot controller
JP2006021300A (en) Predicting device and holding device
JP2011093058A (en) Target object holding area extraction apparatus and robot system using the same
CN112338922A (en) Five-axis mechanical arm grabbing and placing method and related device
JP6067547B2 (en) Object recognition device, robot, and object recognition method
JP2018146347A (en) Image processing device, image processing method, and computer program
KR101820241B1 (en) Apparatus for motion estimation of object using gripper and method thereof
JP2020040143A (en) Manipulator and mobile robot
JP2000254883A (en) Tool device, tool recognizing method, and automatic operation device
JP2015157343A (en) Robot, robot system, control device, and control method
JP2018017610A (en) Three-dimensional measuring device, robot, robot controlling device, and robot system
JP6123364B2 (en) Robot control system, robot, program, and robot control method
CN117283555B (en) Method and device for autonomously calibrating tool center point of robot
KR20230175122A (en) Method for controlling a robot for manipulating, in particular picking up, an object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant