CN112157653A - Shielding detection method and device - Google Patents



Publication number
CN112157653A
CN112157653A
Authority
CN
China
Prior art keywords
information
shielding
mechanical arm
joint
parameter information
Prior art date
Legal status
Granted
Application number
CN202010954855.3A
Other languages
Chinese (zh)
Other versions
CN112157653B (en)
Inventor
孙雷
李�赫
Current Assignee
Beijing Ruying Intelligent Technology Co ltd
Original Assignee
Beijing Ruying Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Ruying Intelligent Technology Co ltd filed Critical Beijing Ruying Intelligent Technology Co ltd
Priority claimed from CN202010954855.3A
Publication of CN112157653A
Application granted
Publication of CN112157653B
Legal status: Active


Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 — Programme-controlled manipulators
    • B25J 9/16 — Programme controls
    • B25J 9/1656 — characterised by programming, planning systems for manipulators
    • B25J 9/1664 — characterised by motion, path, trajectory planning
    • B25J 19/00 — Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J 19/02 — Sensing devices
    • B25J 19/021 — Optical sensing devices
    • B25J 19/023 — Optical sensing devices including video camera means

Abstract

The invention discloses an occlusion detection method and device for detecting whether a mechanical arm occludes a target object. The method comprises the following steps: acquiring joint parameter information of the mechanical arm; obtaining coordinate information of each joint of the mechanical arm according to the joint parameter information and a preset coordinate system; generating spatial distribution information of the mechanical arm according to the coordinate information; in a pre-constructed orthographic projection model, obtaining an occlusion region according to a known shooting point and the spatial distribution information; and obtaining occlusion information according to the occlusion region and the known region where the target object is located.

Description

Occlusion detection method and device
Technical Field
The present invention relates to the field of computer and communication technologies, and in particular to an occlusion detection method and apparatus.
Background
With the development of artificial intelligence, robot and mechanical-arm technology has advanced rapidly. A mechanical arm can imitate human behaviour and take over certain tasks. The camera serves as the eyes of the mechanical arm: during operation the arm relies on the images the camera captures. However, while moving, the mechanical arm may block the camera so that it cannot capture the target object, which disrupts the arm's operation. Detecting whether the mechanical arm occludes the camera is therefore a problem in urgent need of a solution.
Disclosure of Invention
The invention provides an occlusion detection method and device for detecting whether a mechanical arm occludes a target object.
The occlusion detection method provided by the invention comprises the following steps:
acquiring joint parameter information of the mechanical arm;
obtaining coordinate information of each joint of the mechanical arm according to the joint parameter information and a preset coordinate system;
generating spatial distribution information of the mechanical arm according to the coordinate information;
in a pre-constructed orthographic projection model, obtaining an occlusion region according to a known shooting point and the spatial distribution information;
and obtaining occlusion information according to the occlusion region and the known region where the target object is located.
The technical solution provided by this embodiment of the invention can have the following beneficial effects: by establishing the projection relationship between the mechanical arm and the shooting point, the embodiment obtains occlusion information for the target object, making it convenient to determine whether the target object is occluded and providing an information basis for subsequent processing; the processing is simple and fast.
Optionally, obtaining the coordinate information of each joint of the mechanical arm according to the joint parameter information and the preset coordinate system includes:
determining reference points for each joint of the mechanical arm according to the joint parameter information;
and obtaining the coordinate information of the reference points according to the joint parameter information and the preset coordinate system.
The technical solution provided by this embodiment can have the following beneficial effects: the mechanical arm is abstracted into reference points, which simplifies subsequent computation.
Optionally, determining the reference points for each joint of the mechanical arm according to the joint parameter information includes:
determining the articulation point of each joint of the mechanical arm as a reference point according to the joint parameter information;
determining the two end points of the mechanical arm as reference points according to the joint parameter information;
and inserting reference points between adjacent reference points according to the width parameter in the joint parameter information.
The technical solution provided by this embodiment can have the following beneficial effects: the mechanical arm is characterized by a set of reference points, which facilitates subsequent computation.
Optionally, generating the spatial distribution information of the mechanical arm according to the coordinate information includes:
filling a preset three-dimensional unit model at the coordinate information to generate the spatial distribution information of the mechanical arm.
The technical solution provided by this embodiment can have the following beneficial effects: filling with three-dimensional unit models quickly approximates the spatial distribution of the mechanical arm.
Optionally, the width of the three-dimensional unit model is consistent with the width parameter in the joint parameter information.
The technical solution provided by this embodiment can have the following beneficial effects: the three-dimensional unit model closely approximates the spatial distribution of the mechanical arm.
Optionally, the method further includes:
comparing the occlusion information with a preset occlusion threshold;
when the occlusion information is below the occlusion threshold, acquiring the image captured at the shooting point;
and when the occlusion information is not below the occlusion threshold, discarding the image captured at the shooting point.
The technical solution provided by this embodiment can have the following beneficial effects: the occlusion information provides a quality assessment of the captured image and spares an image-recognition step.
Optionally, the method further includes:
planning the motion path of the mechanical arm according to the occlusion information so as to reduce the occlusion.
The technical solution provided by this embodiment can have the following beneficial effects: the motion path of the mechanical arm is re-planned based on the occlusion information to obtain a better path.
Optionally, the method further includes:
acquiring occlusion information for a plurality of shooting points of the target object, and acquiring the image captured at the shooting point with the lowest occlusion.
The technical solution provided by this embodiment can have the following beneficial effects: an image of better quality can be selected based on the occlusion information, sparing an image-recognition step.
The invention provides an occlusion detection device, comprising:
a first acquisition module, configured to acquire joint parameter information of the mechanical arm;
a coordinate module, configured to obtain coordinate information of each joint of the mechanical arm according to the joint parameter information and a preset coordinate system;
a space module, configured to generate spatial distribution information of the mechanical arm according to the coordinate information;
a projection module, configured to obtain an occlusion region in a pre-constructed orthographic projection model according to a known shooting point and the spatial distribution information;
and an occlusion module, configured to obtain occlusion information according to the occlusion region and the known region where the target object is located.
Optionally, the coordinate module includes:
a reference point submodule, configured to determine reference points for each joint of the mechanical arm according to the joint parameter information;
and a coordinate submodule, configured to obtain the coordinate information of the reference points according to the joint parameter information and the preset coordinate system.
Optionally, the reference point submodule determines the articulation point of each joint of the mechanical arm as a reference point according to the joint parameter information; determines the two end points of the mechanical arm as reference points according to the joint parameter information; and inserts reference points between adjacent reference points according to the width parameter in the joint parameter information.
Optionally, the space module includes:
a space submodule, configured to fill a preset three-dimensional unit model at the coordinate information to generate the spatial distribution information of the mechanical arm.
Optionally, the width of the three-dimensional unit model is consistent with the width parameter in the joint parameter information.
Optionally, the device further comprises:
a comparison module, configured to compare the occlusion information with a preset occlusion threshold;
an image module, configured to acquire the image captured at the shooting point when the occlusion information is below the occlusion threshold;
and a discarding module, configured to discard the image captured at the shooting point when the occlusion information is not below the occlusion threshold.
Optionally, the device further comprises:
a planning module, configured to plan the motion path of the mechanical arm according to the occlusion information so as to reduce the occlusion.
Optionally, the device further comprises:
a second acquisition module, configured to acquire occlusion information for a plurality of shooting points of the target object and acquire the image captured at the shooting point with the lowest occlusion.
The invention provides an occlusion detection device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquire joint parameter information of the mechanical arm;
obtain coordinate information of each joint of the mechanical arm according to the joint parameter information and a preset coordinate system;
generate spatial distribution information of the mechanical arm according to the coordinate information;
in a pre-constructed orthographic projection model, obtain an occlusion region according to a known shooting point and the spatial distribution information;
and obtain occlusion information according to the occlusion region and the known region where the target object is located.
The invention provides a computer-readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the above method.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings:
FIG. 1 is a flowchart of a method for occlusion detection in an embodiment of the present invention;
FIG. 2 is a schematic view of a projection in an embodiment of the invention;
FIG. 3 is a schematic illustration of spatial distribution in an embodiment of the present invention;
FIG. 4 is a flowchart of a method of occlusion detection in an embodiment of the present invention;
FIG. 5 is a flowchart of a method for occlusion detection in an embodiment of the present invention;
FIG. 6 is a diagram of a device structure for occlusion detection according to an embodiment of the present invention;
FIG. 7 is a block diagram of a coordinate module in an embodiment of the invention;
FIG. 8 is a block diagram of a space module in an embodiment of the present invention;
FIG. 9 is a diagram of an apparatus for occlusion detection in an embodiment of the present invention;
FIG. 10 is a diagram of an apparatus for occlusion detection in an embodiment of the present invention;
FIG. 11 is a diagram illustrating an exemplary embodiment of an occlusion detection apparatus.
Detailed Description
The preferred embodiments of the present invention are described below in conjunction with the accompanying drawings; it should be understood that they serve to illustrate and explain the invention, not to limit it.
In the related art, the camera serves as the eyes of the mechanical arm, which relies on the images the camera captures during operation. However, while moving, the mechanical arm may block the camera so that it cannot capture the target object, which disrupts the arm's operation. Detecting whether the mechanical arm occludes the camera is therefore a problem in urgent need of a solution. One possible solution is to run image recognition on the captured image to determine whether the target object is occluded. Another is to capture images with multiple cameras and fuse them, so that a single blocked camera is compensated for. Both image recognition and image fusion are complex to process and slow.
To solve the above problem, this embodiment generates spatial distribution information of the mechanical arm and constructs the projection relationship between the shooting point and the arm, thereby obtaining occlusion information for the target object. The processing is simple and fast.
Referring to fig. 1, the occlusion detection method in this embodiment includes:
Step 101: acquiring joint parameter information of the mechanical arm.
Step 102: obtaining coordinate information of each joint of the mechanical arm according to the joint parameter information and a preset coordinate system.
Step 103: generating spatial distribution information of the mechanical arm according to the coordinate information.
Step 104: in a pre-constructed orthographic projection model, obtaining an occlusion region according to a known shooting point and the spatial distribution information.
Step 105: obtaining occlusion information according to the occlusion region and the known region where the target object is located.
This embodiment can be implemented by a mechanical arm control system. The base of the mechanical arm is fixedly mounted, so its coordinates in the preset coordinate system are known. The camera at the shooting point is also fixedly mounted, so the coordinates of the shooting point in the preset coordinate system are known. The motion path of the mechanical arm can be planned before it moves, so this embodiment can be carried out both before and during the motion.
In step 101, the joint parameter information of the mechanical arm at each node on the motion path may be acquired at a preset sampling frequency, defined by a preset sampling duration, sampling distance, or the like. A camera usually captures images more slowly than the control system samples the joint parameters, and since this embodiment is concerned with occlusion at the shooting point, the joint parameter information may instead be acquired at the capture frequency of the camera at the shooting point. The joint parameter information, also called DH (Denavit-Hartenberg) parameter information, includes the length and width of each joint of the mechanical arm (the cross-section of the arm may be circular and represented by a radius), the link twist (the angle between two adjacent joint axes), the link offset, the joint angle (the angle between two adjacent links about a common axis), and so on.
Because the coordinates of the base of the mechanical arm are known and the joint parameter information of each joint is known, the coordinates of each joint can be obtained by a forward kinematics algorithm.
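The forward-kinematics step can be sketched as follows. This is a minimal illustration using the standard DH convention; the function names and the `(theta, d, a, alpha)` parameter ordering are assumptions for the sketch, not the patent's implementation:

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def joint_positions(dh_params, base=np.eye(4)):
    """Forward kinematics: 3-D position of the base and of each joint frame origin."""
    T = base.copy()
    points = [T[:3, 3].copy()]           # base position (known, fixed mounting)
    for theta, d, a, alpha in dh_params:
        T = T @ dh_transform(theta, d, a, alpha)
        points.append(T[:3, 3].copy())
    return points
```

With the base at the origin, two straight links of length 1 along x put the joints at (1, 0, 0) and (2, 0, 0), as expected from chaining the transforms.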
The coordinate information only describes points, while the mechanical arm is a three-dimensional structure, so this embodiment generates spatial distribution information from the coordinates; the spatial distribution information represents the three-dimensional structure of the arm, i.e. the three-dimensional space the arm occupies is known.
As shown in fig. 2, the position P of the shooting point is taken as the projection origin and the shooting direction as the projection direction. Circle Q represents the projection of the mechanical arm as seen from point P, circle K represents that projection carried to the distance of the target object, i.e. the occlusion region, and circle J represents the region where the target object is located. The overlap between circle K and circle J constitutes the occlusion information, which may include the overlap ratio, the overlap area, the coordinates of the overlap region, and so on.
The region where the target object is located can be known in advance, i.e. the distance from that region to point P is known, so circle K can be obtained.
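With circle K (the occlusion region) and circle J (the target region) both represented as circles in the projection plane, the overlap ratio follows from the closed-form circle-intersection area. A sketch, with names that are illustrative rather than from the patent:

```python
import math

def circle_overlap_area(c1, r1, c2, r2):
    """Area of the intersection of two circles with centres c1, c2 and radii r1, r2."""
    d = math.dist(c1, c2)
    if d >= r1 + r2:                      # disjoint
        return 0.0
    if d <= abs(r1 - r2):                 # one circle inside the other
        return math.pi * min(r1, r2) ** 2
    a1 = math.acos((d * d + r1 * r1 - r2 * r2) / (2 * d * r1))
    a2 = math.acos((d * d + r2 * r2 - r1 * r1) / (2 * d * r2))
    # sum of the two circular segments on either side of the chord
    return (r1 * r1 * (a1 - math.sin(2 * a1) / 2)
            + r2 * r2 * (a2 - math.sin(2 * a2) / 2))

def occlusion_ratio(shadow_center, shadow_r, target_center, target_r):
    """Fraction of the target region (circle J) covered by the shadow (circle K)."""
    overlap = circle_overlap_area(shadow_center, shadow_r, target_center, target_r)
    return overlap / (math.pi * target_r ** 2)
```

A fully covered target gives a ratio of 1.0, disjoint circles give 0.0, and partial overlap falls in between.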
The region where the target object is located may be a relatively wide area, which is not accurate enough: when the mechanical arm operates on the target object, its precise position must be known, and that position has to be obtained from the captured image, so whether the image captures the target object clearly and completely is the concern of this embodiment. The region may instead be a relatively narrow range, i.e. a fairly specific position, but that position may be inaccurate or may change; before the arm contacts the target object, its precise position may need to be obtained many times, again from the captured image, so whether the image captures the target object clearly and completely remains the concern of this embodiment.
By constructing a projection model, this embodiment obtains the occlusion relationship between the mechanical arm and the target object without image recognition or analysis; the processing is simple and fast, and suitable for real-time occlusion detection.
There may be n shooting points and m mechanical arms; occlusion detection may be performed for each shooting point and each arm, i.e. n × m detections.
Optionally, step 102 includes steps A1 to A2.
Step A1: determining reference points for each joint of the mechanical arm according to the joint parameter information.
Step A2: obtaining the coordinate information of the reference points according to the joint parameter information and the preset coordinate system.
In this embodiment, each joint of the mechanical arm is abstracted into several reference points and represented by them, which simplifies the model of the arm and greatly reduces the subsequent amount of computation.
Optionally, step A1 includes steps A11 to A13.
Step A11: determining the articulation point of each joint of the mechanical arm as a reference point according to the joint parameter information.
Step A12: determining the two end points of the mechanical arm as reference points according to the joint parameter information.
Step A13: inserting reference points between adjacent reference points according to the width parameter in the joint parameter information.
Steps A11 and A12 may be performed in parallel.
For example, if the mechanical arm comprises 6 joints, there are 5 articulations, i.e. connecting axes, which are also the inflection points in the arm's pose. The two end points of the arm are its base and its end-effector jaw. These 7 points accurately mark the key positions of the joints; additional reference points are then interpolated between adjacent reference points, so that the arm is represented accurately.
Reference points may be inserted at midpoints, at a preset step length, and so on. In this embodiment they are inserted according to the width parameter in the joint parameter information, which makes the subsequent three-dimensional unit model easier to build and allows a simpler one: the distance between adjacent reference points matches the width of the arm, and the width of the three-dimensional unit model also matches the width of the arm. The length and width of the unit model can be equal, so simple, regular shapes such as cubes or spheres can be used, reducing the amount of computation.
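The interpolation step above can be sketched as follows; `densify` and its at-most-`spacing` rule are illustrative assumptions standing in for the patent's width-based insertion:

```python
import numpy as np

def densify(points, spacing):
    """Insert reference points so that adjacent points are at most `spacing` apart.

    `points` are the key reference points (articulations plus the two end points);
    `spacing` would be the arm-width parameter from the joint parameter information.
    """
    out = [np.asarray(points[0], dtype=float)]
    for p, q in zip(points, points[1:]):
        p, q = np.asarray(p, float), np.asarray(q, float)
        # number of equal sub-segments needed to respect the spacing
        n = max(1, int(np.ceil(np.linalg.norm(q - p) / spacing)))
        for i in range(1, n + 1):
            out.append(p + (q - p) * i / n)
    return out
```

Densifying a unit segment at a spacing of 0.25 yields five points, 0.25 apart, ending exactly at the far end point.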
Optionally, step 103 includes step B1.
Step B1: filling a preset three-dimensional unit model at the coordinate information to generate the spatial distribution information of the mechanical arm.
The mechanical arm is a three-dimensional structure, and characterizing it by reference points alone may not be accurate enough, so this embodiment builds a spatial model of the arm. Arms of different models may differ in shape, and obtaining the true shape of an arm would greatly increase the amount of computation. Instead, three-dimensional unit models are filled at the coordinate information (i.e. at the reference points), which amounts to approximating the arm's outline with a number of regular unit models: a three-dimensional structure is obtained while the computation stays small.
The three-dimensional unit model can be chosen according to the shape of the arm. For example, as shown in fig. 3, the arm has a circular cross-section, so a sphere may be used, characterized by a centre and a radius. The centre coincides with the reference point, and the radius may be set empirically so that the width of the unit model is close to the width of the corresponding joint. Filling a sphere at each reference point yields a chain of bounding spheres, which correspondingly yields circle K in the orthographic projection model.
Optionally, the width of the three-dimensional unit model is consistent with the width parameter in the joint parameter information.
When the unit model is a sphere, its width is the sphere's diameter. When that width matches the width parameter in the joint parameter information, the simulated shape of the arm is closer to its real shape.
The widths of the joints of the arm may differ, in which case the radius of the unit model at the corresponding position changes accordingly, so the unit models simulate the arm more accurately.
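One way to carry a bounding sphere into the projection model is central projection from the shooting point. A sketch under simplifying assumptions (camera at the origin looking along +z; the shadow of an off-axis sphere is treated as a circle), with names that are illustrative rather than from the patent:

```python
import math

def sphere_shadow(center, radius, plane_dist):
    """Project a bounding sphere from the shooting point onto the target plane.

    The camera sits at the origin and looks along +z; the target plane is
    perpendicular to the view axis at distance `plane_dist`. Returns the
    shadow circle (centre offset in the plane, radius) cast by one sphere.
    """
    cx, cy, cz = center
    dist = math.sqrt(cx * cx + cy * cy + cz * cz)
    half_angle = math.asin(min(1.0, radius / dist))   # silhouette cone half-angle
    scale = plane_dist / cz                           # similar triangles
    return (cx * scale, cy * scale), plane_dist * math.tan(half_angle)
```

A sphere of radius 0.1 one unit in front of the camera, projected onto a plane two units away, casts a shadow slightly larger than 0.2 (the cone widens faster than the naive r·D/L estimate).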
Optionally, the method further includes steps C1 to C3.
Step C1: comparing the occlusion information with a preset occlusion threshold.
Step C2: when the occlusion information is below the occlusion threshold, acquiring the image captured at the shooting point.
Step C3: when the occlusion information is not below the occlusion threshold, discarding the image captured at the shooting point.
The occlusion threshold may be configured to about 50%, and the occlusion information may be an occlusion ratio. Comparing the two quickly determines whether the captured image is usable, without image recognition.
When the image is usable, it can be recognised to obtain the precise position of the target object. When it is not, the motion path of the arm can be re-planned, another arm can be used, or images from other shooting points can be used.
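The threshold gate in steps C1 to C3 can be sketched as follows (the 50% default is the value the text suggests, not a fixed constant of the patent):

```python
OCCLUSION_THRESHOLD = 0.5  # "about 50%", per the text; tune per deployment

def image_usable(occlusion_ratio, threshold=OCCLUSION_THRESHOLD):
    """Keep the image only when the occluded fraction is strictly below the threshold."""
    return occlusion_ratio < threshold
```

Note the strict comparison: a ratio exactly at the threshold is "not below" it, so the image is discarded, matching step C3.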
Optionally, the method further includes step D1.
Step D1: planning the motion path of the mechanical arm according to the occlusion information so as to reduce the occlusion.
Whatever the occlusion ratio, the motion path of the arm can be re-planned according to the occluded range to reduce occlusion.
Optionally, the method further includes step E1.
Step E1: acquiring occlusion information for a plurality of shooting points of the target object, and acquiring the image captured at the shooting point with the lowest occlusion.
This selects a high-quality image without image recognition or fusion: the lower the occlusion information, the smaller the occluded area and the higher the image quality.
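Selecting the least-occluded shooting point reduces to taking a minimum over the per-point occlusion ratios. A sketch (the mapping from point ids to ratios is an assumed representation):

```python
def best_shooting_point(occlusion_by_point):
    """Return the id of the shooting point with the lowest occlusion ratio.

    `occlusion_by_point` maps a shooting-point id to its occlusion ratio,
    e.g. {"p1": 0.4, "p2": 0.1}; the image from the returned point is kept.
    """
    return min(occlusion_by_point, key=occlusion_by_point.get)
```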
The implementation is described in detail below through several embodiments.
Referring to fig. 4, the occlusion detection method in this embodiment includes:
Step 401: acquiring joint parameter information of the mechanical arm.
Step 402: determining reference points for each joint of the mechanical arm according to the joint parameter information.
Step 403: obtaining the coordinate information of the reference points according to the joint parameter information and a preset coordinate system.
Step 404: filling a preset three-dimensional unit model at the reference points to generate the spatial distribution information of the mechanical arm.
Step 405: in a pre-constructed orthographic projection model, obtaining an occlusion region according to a known shooting point and the spatial distribution information.
Step 406: obtaining occlusion information according to the occlusion region and the known region where the target object is located.
Referring to fig. 5, the occlusion detection method in this embodiment includes:
Step 501: acquire joint parameter information of the mechanical arm.
Step 502: obtain the coordinate information of each joint of the mechanical arm according to the joint parameter information and a preset coordinate system.
Step 503: generate the spatial distribution information of the mechanical arm according to the coordinate information.
Step 504: in a pre-constructed orthographic projection model, obtain the occlusion region according to the known shooting point and the spatial distribution information.
Step 505: obtain the occlusion information according to the occlusion region and the known region where the target object is located.
Step 506: compare the occlusion information with a preset occlusion threshold. When the occlusion information is below the occlusion threshold, continue with step 507; when it is not below the occlusion threshold, continue with step 508.
Step 507: acquire the image shot at the shooting point.
Step 508: discard the image shot at the shooting point.
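The threshold gate of steps 506 through 508 reduces to a single comparison. A minimal sketch, with `capture_decision` as a hypothetical helper name and the comparison taken as strictly-below per step 507:

```python
def capture_decision(occlusion_info, occlusion_threshold):
    """Steps 506-508: acquire the image when the occlusion information is
    strictly below the preset threshold, otherwise discard it."""
    return "acquire" if occlusion_info < occlusion_threshold else "discard"
```

For example, with a threshold of 0.3, a shooting point whose occlusion information is 0.1 yields "acquire", while 0.3 or above yields "discard".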
The above embodiments can be freely combined according to actual needs.
The occlusion detection method described above may be implemented by a device; the internal structure and functions of such a device are described below.
Referring to fig. 6, the occlusion detection apparatus in this embodiment includes: a first acquisition module 601, a coordinate module 602, a space module 603, a projection module 604, and an occlusion module 605.
The first acquisition module 601 is configured to acquire joint parameter information of the mechanical arm.
The coordinate module 602 is configured to obtain the coordinate information of each joint of the mechanical arm according to the joint parameter information and a preset coordinate system.
The space module 603 is configured to generate the spatial distribution information of the mechanical arm according to the coordinate information.
The projection module 604 is configured to obtain, in a pre-constructed orthographic projection model, the occlusion region according to the known shooting point and the spatial distribution information.
The occlusion module 605 is configured to obtain the occlusion information according to the occlusion region and the known region where the target object is located.
Optionally, as shown in fig. 7, the coordinate module 602 includes a reference point submodule 701 and a coordinate submodule 702.
The reference point submodule 701 is configured to determine the reference points of each joint of the mechanical arm according to the joint parameter information.
The coordinate submodule 702 is configured to obtain the coordinate information of the reference points according to the joint parameter information and a preset coordinate system.
Optionally, the reference point submodule 701 determines the connection point of each joint of the mechanical arm as a reference point according to the joint parameter information; determines the two end points of the mechanical arm as reference points according to the joint parameter information; and inserts reference points between two adjacent reference points according to the width parameter information in the joint parameter information.
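One plausible reading of inserting reference points "according to the width parameter information" is to subdivide each link so that consecutive reference points are no farther apart than the link width; unit models of that width placed at each point then cover the link without gaps. A hedged sketch, with `densify_reference_points` as a hypothetical helper name:

```python
import numpy as np

def densify_reference_points(p0, p1, width):
    """Insert reference points between two adjacent reference points so that
    consecutive points are at most `width` apart (one reading of the width
    parameter rule; not necessarily the patent's exact scheme)."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    dist = np.linalg.norm(p1 - p0)
    n = max(int(np.ceil(dist / width)), 1)   # number of sub-segments
    ts = np.linspace(0.0, 1.0, n + 1)
    # Linear interpolation between the two endpoints, endpoints included.
    return p0 + ts[:, None] * (p1 - p0)
```

For a link of length 1.0 and width 0.3, this yields five reference points spaced 0.25 apart.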
Optionally, as shown in fig. 8, the space module 603 includes a space submodule 801.
The space submodule 801 is configured to fill a preset three-dimensional unit model at the positions given by the coordinate information to generate the spatial distribution information of the mechanical arm.
Optionally, the width of the three-dimensional unit model is consistent with the width parameter information in the joint parameter information.
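Filling a unit model at each reference point can be sketched by placing an axis-aligned cube of side equal to the width parameter at each point and sampling it. This is an illustrative approximation, and `fill_unit_cubes` is a hypothetical helper name:

```python
import numpy as np

def fill_unit_cubes(ref_points, width):
    """Place an axis-aligned cube of side `width` at each reference point and
    return sampled points approximating the arm's spatial distribution
    (a sketch of step 404; each cube is sampled at 27 lattice points)."""
    offsets = np.array([[dx, dy, dz]
                        for dx in (-0.5, 0.0, 0.5)
                        for dy in (-0.5, 0.0, 0.5)
                        for dz in (-0.5, 0.0, 0.5)]) * width
    ref = np.asarray(ref_points, float)
    # Broadcast: one set of 27 offsets around every reference point.
    return (ref[:, None, :] + offsets[None, :, :]).reshape(-1, 3)
```

The resulting point set is exactly the kind of input the projection step consumes: a discretised volume occupied by the arm.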
Optionally, as shown in fig. 9, the apparatus further includes a comparison module 901, an image module 902, and a discarding module 903.
The comparison module 901 is configured to compare the occlusion information with a preset occlusion threshold.
The image module 902 is configured to acquire the image shot at the shooting point when the occlusion information is below the occlusion threshold.
The discarding module 903 is configured to discard the image shot at the shooting point when the occlusion information is not below the occlusion threshold.
Optionally, as shown in fig. 10, the apparatus further includes a planning module 1001.
The planning module 1001 is configured to plan the motion path of the mechanical arm according to the occlusion information so as to reduce the occlusion information.
Optionally, as shown in fig. 11, the apparatus further includes a second acquisition module 1101.
The second acquisition module 1101 is configured to obtain the occlusion information for a plurality of shooting points of the target object and to acquire the image shot at the shooting point with the lowest occlusion information.
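Selecting the least-occluded of several candidate shooting points is an argmin over their occlusion information. A minimal sketch, with `best_shooting_point` as a hypothetical helper name operating on a mapping from shooting-point identifier to occlusion information:

```python
def best_shooting_point(occlusion_by_point):
    """Return the shooting point with the lowest occlusion information,
    i.e. the point whose image the second acquisition module would keep."""
    return min(occlusion_by_point, key=occlusion_by_point.get)
```

For example, given `{"A": 0.4, "B": 0.1, "C": 0.7}`, the image shot at point `"B"` would be acquired.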
An apparatus for occlusion detection, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring joint parameter information of the mechanical arm;
obtaining the coordinate information of each joint of the mechanical arm according to the joint parameter information and a preset coordinate system;
generating the spatial distribution information of the mechanical arm according to the coordinate information;
obtaining, in a pre-constructed orthographic projection model, the occlusion region according to the known shooting point and the spatial distribution information;
and obtaining the occlusion information according to the occlusion region and the known region where the target object is located.
A computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the method.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (18)

1. A method of occlusion detection, comprising:
acquiring joint parameter information of a mechanical arm;
obtaining coordinate information of each joint of the mechanical arm according to the joint parameter information and a preset coordinate system;
generating spatial distribution information of the mechanical arm according to the coordinate information;
obtaining, in a pre-constructed orthographic projection model, an occlusion region according to a known shooting point and the spatial distribution information;
and obtaining occlusion information according to the occlusion region and a known region where a target object is located.
2. The method of claim 1, wherein obtaining the coordinate information of each joint of the mechanical arm according to the joint parameter information and a preset coordinate system comprises:
determining reference points of each joint of the mechanical arm according to the joint parameter information;
and obtaining the coordinate information of the reference points according to the joint parameter information and a preset coordinate system.
3. The method of claim 2, wherein determining the reference points of each joint of the mechanical arm according to the joint parameter information comprises:
determining the connection point of each joint of the mechanical arm as a reference point according to the joint parameter information;
determining the two end points of the mechanical arm as reference points according to the joint parameter information;
and inserting reference points between two adjacent reference points according to width parameter information in the joint parameter information.
4. The method of claim 1, wherein generating the spatial distribution information of the mechanical arm according to the coordinate information comprises:
filling a preset three-dimensional unit model at the positions given by the coordinate information to generate the spatial distribution information of the mechanical arm.
5. The method of claim 4, wherein the width of the three-dimensional unit model is consistent with the width parameter information in the joint parameter information.
6. The method of claim 1, further comprising:
comparing the occlusion information with a preset occlusion threshold;
when the occlusion information is below the occlusion threshold, acquiring the image shot at the shooting point;
and when the occlusion information is not below the occlusion threshold, discarding the image shot at the shooting point.
7. The method of claim 1, further comprising:
planning a motion path of the mechanical arm according to the occlusion information so as to reduce the occlusion information.
8. The method of claim 1, further comprising:
obtaining occlusion information for a plurality of shooting points of the target object, and acquiring the image shot at the shooting point with the lowest occlusion information.
9. An apparatus for occlusion detection, comprising:
a first acquisition module, configured to acquire joint parameter information of a mechanical arm;
a coordinate module, configured to obtain coordinate information of each joint of the mechanical arm according to the joint parameter information and a preset coordinate system;
a space module, configured to generate spatial distribution information of the mechanical arm according to the coordinate information;
a projection module, configured to obtain, in a pre-constructed orthographic projection model, an occlusion region according to a known shooting point and the spatial distribution information;
and an occlusion module, configured to obtain occlusion information according to the occlusion region and a known region where a target object is located.
10. The apparatus of claim 9, wherein the coordinate module comprises:
a reference point submodule, configured to determine reference points of each joint of the mechanical arm according to the joint parameter information;
and a coordinate submodule, configured to obtain the coordinate information of the reference points according to the joint parameter information and a preset coordinate system.
11. The apparatus of claim 10, wherein the reference point submodule determines the connection point of each joint of the mechanical arm as a reference point according to the joint parameter information; determines the two end points of the mechanical arm as reference points according to the joint parameter information; and inserts reference points between two adjacent reference points according to width parameter information in the joint parameter information.
12. The apparatus of claim 9, wherein the space module comprises:
a space submodule, configured to fill a preset three-dimensional unit model at the positions given by the coordinate information to generate the spatial distribution information of the mechanical arm.
13. The apparatus of claim 12, wherein the width of the three-dimensional unit model is consistent with the width parameter information in the joint parameter information.
14. The apparatus of claim 9, further comprising:
a comparison module, configured to compare the occlusion information with a preset occlusion threshold;
an image module, configured to acquire the image shot at the shooting point when the occlusion information is below the occlusion threshold;
and a discarding module, configured to discard the image shot at the shooting point when the occlusion information is not below the occlusion threshold.
15. The apparatus of claim 9, further comprising:
a planning module, configured to plan a motion path of the mechanical arm according to the occlusion information so as to reduce the occlusion information.
16. The apparatus of claim 9, further comprising:
a second acquisition module, configured to obtain occlusion information for a plurality of shooting points of the target object and to acquire the image shot at the shooting point with the lowest occlusion information.
17. An apparatus for occlusion detection, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to:
acquiring joint parameter information of a mechanical arm;
obtaining coordinate information of each joint of the mechanical arm according to the joint parameter information and a preset coordinate system;
generating spatial distribution information of the mechanical arm according to the coordinate information;
obtaining, in a pre-constructed orthographic projection model, an occlusion region according to a known shooting point and the spatial distribution information;
and obtaining occlusion information according to the occlusion region and a known region where a target object is located.
18. A computer-readable storage medium having stored thereon computer instructions, which when executed by a processor, implement the steps of the method of any one of claims 1 to 8.
CN202010954855.3A 2020-09-11 2020-09-11 Shielding detection method and device Active CN112157653B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010954855.3A CN112157653B (en) 2020-09-11 2020-09-11 Shielding detection method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010954855.3A CN112157653B (en) 2020-09-11 2020-09-11 Shielding detection method and device

Publications (2)

Publication Number Publication Date
CN112157653A true CN112157653A (en) 2021-01-01
CN112157653B CN112157653B (en) 2022-02-01

Family

ID=73858038

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010954855.3A Active CN112157653B (en) 2020-09-11 2020-09-11 Shielding detection method and device

Country Status (1)

Country Link
CN (1) CN112157653B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6448107A (en) * 1987-08-19 1989-02-22 Hitachi Ltd Discriminating system for interference between objects in space
CN101435975A (en) * 2007-11-14 2009-05-20 中强光电股份有限公司 Projector and operation method thereof
CN101866425A (en) * 2010-06-02 2010-10-20 北京交通大学 Human body detection method based on fish-eye camera
CN102687174A (en) * 2010-01-12 2012-09-19 皇家飞利浦电子股份有限公司 Determination of a position characteristic for an object
CN104850699A (en) * 2015-05-19 2015-08-19 天津市天锻压力机有限公司 Anti-collision control method of transfer robots of stamping production line
CN106142083A (en) * 2016-07-21 2016-11-23 河北工业大学 The method of the three-dimensional motion emulation of high-altitude curtain wall mounting robot
US20170305010A1 (en) * 2014-11-26 2017-10-26 Irobot Corporation Systems and Methods for Performing Occlusion Detection
CN107791248A (en) * 2017-09-28 2018-03-13 浙江理工大学 Control method based on the six degree of freedom serial manipulator for being unsatisfactory for pipper criterions
US10192195B1 (en) * 2016-10-25 2019-01-29 Amazon Technologies, Inc. Techniques for coordinating independent objects with occlusions
CN110116410A (en) * 2019-05-28 2019-08-13 中国科学院自动化研究所 Mechanical arm target guiding system, the method for view-based access control model servo
CN110647818A (en) * 2019-08-27 2020-01-03 北京易华录信息技术股份有限公司 Identification method and device for shielding target object
CN111582196A (en) * 2020-02-13 2020-08-25 牧今科技 Method and system for determining occlusions within a camera field of view

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WU XIN: "Research on Virtual-Real Fusion and Multi-level Occlusion Detection Algorithms in Teleoperated Binocular Stereo Augmented Reality", China Master's Theses Full-text Database, Information Science and Technology *

Also Published As

Publication number Publication date
CN112157653B (en) 2022-02-01

Similar Documents

Publication Publication Date Title
CN108369643B (en) Method and system for 3D hand skeleton tracking
Lee et al. Camera-to-robot pose estimation from a single image
US8780119B2 (en) Reconstruction render farm used in motion capture
CN109155078B (en) Method and device for generating set of sample images, electronic equipment and storage medium
CN108446585A (en) Method for tracking target, device, computer equipment and storage medium
CN109410316B (en) Method for three-dimensional reconstruction of object, tracking method, related device and storage medium
CN110083157B (en) Obstacle avoidance method and device
CN113492393A (en) Robot teaching demonstration by human
CN109816730A (en) Workpiece grabbing method, apparatus, computer equipment and storage medium
CN109191593A (en) Motion control method, device and the equipment of virtual three-dimensional model
CN107766851A (en) A kind of face key independent positioning method and positioner
US11108964B2 (en) Information processing apparatus presenting information, information processing method, and storage medium
CN113449570A (en) Image processing method and device
CN104318235B (en) A kind of spot center extracting method and device based on intensity profile modeling
CN112258589A (en) Hand-eye calibration method and device
CN112801064A (en) Model training method, electronic device and storage medium
CN112416133A (en) Hand motion capture method and device, electronic equipment and storage medium
Rodrigues et al. Robot trajectory planning using OLP and structured light 3D machine vision
CN109255801A (en) The method, apparatus, equipment and storage medium of three-dimension object Edge Following in video
CN112157653B (en) Shielding detection method and device
Costa et al. Modeling of video projectors in OpenGL for implementing a spatial augmented reality teaching system for assembly operations
Jørgensen et al. Simulation-based Optimization of Camera Placement in the Context of Industrial Pose Estimation.
KR20190050819A (en) A method for performing calibration using measured data without an assumed calibration model and a three-dimensional scanner calibration system
Dyrstad et al. Bin picking of reflective steel parts using a dual-resolution convolutional neural network trained in a simulated environment
CN116489516A (en) Specific object tracking shooting method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant