CN115648226B - Safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring - Google Patents
- Publication number
- CN115648226B
- Authority
- CN
- China
- Prior art keywords
- robot
- distance
- task space
- data
- space
- Prior art date
- Legal status
- Active
Abstract
The invention provides a safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring, which comprises the following steps: acquiring robot data with a sensor system, preprocessing the acquired data, and removing outliers; processing the outlier-free data with an octree filter to generate point clouds, which includes segmenting the robot's three-dimensional image data with the octree filter and recording the robot's point-cloud distribution in the task space, thereby segmenting the task space; fusing the point clouds to generate the task space once their number reaches a preset value; setting a distance threshold according to the task space; detecting the distance between an object or person and the robot with an ultrasonic distance sensor; and judging whether that distance is below the distance threshold and, if so, controlling the robot to perform an emergency stop or a safe stop.
Description
Technical Field
The invention relates to the technical field of industrial robots, and in particular to a safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring.
Background
The concept of man-machine cooperation has been widely applied in robotics in recent years, and safety remains one of its major challenges. Several man-machine cooperation schemes are common at present: lightweight robot mechanism design, control based on torque sensors, or the addition of pressure-sensitive surfaces. These approaches have produced a variety of new robots, commonly referred to as cooperative robots. Statistically, however, cooperative robots account for less than 10% of the robots in the world, while traditional industrial robots make up the vast majority of shipments. In addition, cooperative robots still lag well behind industrial robots in performance. Taken together, these two points call for a safety solution built on top of existing industrial robots to meet the requirements of man-machine cooperation.
Disclosure of Invention
The object of the present invention is to solve at least one of the technical drawbacks mentioned above.
Therefore, the invention aims to provide a safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring.
In order to achieve the above object, an embodiment of the present invention provides a safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring, including the following steps:
Step S1: acquire robot data with a sensor system, preprocess the acquired data, and remove outliers;
Step S2: process the outlier-free data with an octree filter to generate point clouds, which includes: segmenting the robot's three-dimensional image data with the octree filter and recording the robot's point-cloud distribution in the task space, thereby segmenting the task space;
Step S3: when the number of point clouds is judged to have reached a preset value, fuse them to generate the task space;
Step S4: set a distance threshold according to the task space;
Step S5: detect the distance between an object or person and the robot with an ultrasonic distance sensor;
Step S6: judge whether the distance between the object or person and the robot is below the distance threshold; if so, execute step S7, otherwise return to step S5;
Step S7: control the robot to perform an emergency stop or a safe stop.
Further, the sensor system includes a 2D image acquisition system and a depth sensor, wherein the 2D image data obtained by the 2D image acquisition system is combined with the depth information from the depth sensor to generate a point cloud.
Further, in step S1, preprocessing the data includes performing down-sampling and noise reduction on the data.
Further, in step S3, when the number of point clouds is judged not to have reached the preset value, the method returns to step S1.
Further, a distance monitoring system is formed by a plurality of ultrasonic distance sensors mounted in different directions together with a central controller. The ultrasonic distance sensors detect the distance between an object or person and the robot, and when that distance is judged to be below the distance threshold, the central controller sends an instruction to the robot to perform an emergency stop or a safe stop.
Further, the distance thresholds set for the ultrasonic distance sensors are independent of one another, and each threshold's magnitude is determined by the task space.
According to the safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring, a 2D image sensor and a depth sensor are used to segment the robot's task space, and a multi-ultrasonic-sensor distance monitoring system cooperates with it to achieve safe man-machine cooperation. The invention requires no change to the existing robot body or control system and is applicable to industrial robots.
The invention uses task space segmentation and can identify the task space actually in use without any prior knowledge of the robot's internal program. As a further step, a distance monitoring system composed of individually configurable ultrasonic modules is integrated, enabling safe or emergency stops with different distance thresholds depending on the direction from which an object or person approaches. When the robot's task changes, the task space can easily be re-segmented, so the method adapts to complex and changeable industrial scenarios. The invention is completely independent of the robot: it requires neither modification of the robot body nor knowledge of its internal control system, and can therefore be retrofitted to the industrial robots that currently make up the majority of shipments.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a flow diagram of a method for secure human-machine collaboration based on task space segmentation and dynamic distance monitoring in accordance with an embodiment of the present invention;
FIG. 2 is a diagram of a robot task space segmentation effect according to an embodiment of the present invention;
FIG. 3 is a flowchart of a task space partitioning and dynamic distance monitoring based secure human-machine collaboration system according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
A robot at work usually performs repetitive operations, so its task space occupies only part of its reachable space. Carving the task space out of the reachable space therefore enables more accurate safety judgments.
The safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring achieves both robot task space segmentation and distance monitoring.
As shown in fig. 1 to fig. 3, the safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring according to the embodiment of the present invention includes the following steps:
and S1, acquiring robot data by using a sensor system, preprocessing the acquired robot data, and removing abnormal values.
Specifically, a sensor system is employed to externally register any robot while it operates. The sensor system includes a 2D image acquisition system and a depth sensor; the 2D image data from the former is combined with the depth information from the latter to generate a series of point clouds, from which the task space can then be reconstructed.
The robot data is the fusion of 2D image data and depth information, i.e., depth information is attached to each pixel of the 2D image. The fused data therefore carries three-dimensional spatial information, namely the spatial positions of the entities in the scene.
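Attaching depth to each pixel amounts to a standard pinhole back-projection. The patent gives no formulas, so the intrinsics (fx, fy, cx, cy) and the code below are illustrative assumptions, a minimal sketch rather than the patented implementation:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into an Nx3 point cloud
    using a pinhole camera model. Intrinsics fx, fy, cx, cy are
    assumed known from camera calibration."""
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.ravel()
    valid = z > 0                      # drop pixels with no depth return
    x = (us.ravel() - cx) * z / fx
    y = (vs.ravel() - cy) * z / fy
    return np.stack([x, y, z], axis=1)[valid]

# Toy 2x2 depth image, principal point at (0.5, 0.5), unit focal length
cloud = depth_to_point_cloud(np.array([[1.0, 1.0], [0.0, 2.0]]), 1.0, 1.0, 0.5, 0.5)
```

The pixel with zero depth is discarded, so the toy image yields three 3D points.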
First, the data obtained by the sensor system is preprocessed: a down-sampling algorithm and noise reduction are applied, and outliers are then filtered out.
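The patent does not specify the down-sampling or noise-reduction algorithms. A minimal sketch under the assumption of voxel-grid down-sampling and statistical outlier removal, both common choices for point-cloud preprocessing, could look like:

```python
import numpy as np

def voxel_downsample(points, voxel):
    """Down-sample a point cloud by replacing all points in each
    voxel with their centroid."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    inv = inv.ravel()                      # guard against numpy version quirks
    counts = np.bincount(inv).astype(float)
    out = np.zeros((len(counts), 3))
    for d in range(3):
        out[:, d] = np.bincount(inv, weights=points[:, d]) / counts
    return out

def remove_outliers(points, k=3, std_ratio=2.0):
    """Statistical outlier removal: drop points whose mean distance to
    their k nearest neighbours is more than std_ratio standard deviations
    above the average (O(n^2), fine for a sketch)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)  # column 0 is the self-distance 0
    keep = mean_knn <= mean_knn.mean() + std_ratio * mean_knn.std()
    return points[keep]
```

Production systems would typically use a point-cloud library's built-in equivalents of these two filters rather than the O(n²) neighbour search shown here.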
Step S2: process the outlier-free data with an octree filter to generate point clouds. This includes segmenting the robot's three-dimensional image data with the octree filter and recording the robot's point-cloud distribution in the task space, thereby segmenting the task space.
The operating robot is segmented using a modified octree filter. By continuously monitoring the robot with this process, its point-cloud distribution within the task space can be recorded, and the task space is thereby segmented. FIG. 2 illustrates the task space segmentation result.
Specifically, segmenting the robot's three-dimensional image data with the octree filter works as follows: the three-dimensional data acquired in the previous step contains both the robot and other surrounding objects, and segmentation means processing each frame of the three-dimensional image to separate the robot from the rest of the scene. In the subsequent steps, the per-frame robot point clouds are fused to obtain the task space.
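The patent does not disclose the internals of its modified octree filter. At a single octree depth, stripping the robot from its surroundings can be approximated by voxel-occupancy background subtraction; the assumption of an available background scan (robot absent) is ours, purely for illustration:

```python
import numpy as np

def occupied_voxels(points, voxel):
    """Set of occupied voxel indices at one resolution (one octree level)."""
    return {tuple(k) for k in np.floor(points / voxel).astype(np.int64)}

def segment_moving(points, background, voxel=0.05):
    """Keep points falling outside the static background occupancy.
    A coarse stand-in for the patent's 'modified octree filter':
    the background scan (robot absent) is an assumption, not from the patent."""
    bg = occupied_voxels(background, voxel)
    keys = np.floor(points / voxel).astype(np.int64)
    mask = np.array([tuple(k) not in bg for k in keys])
    return points[mask]
```

A full octree would refine occupied voxels recursively; this flat version keeps the idea visible in a few lines.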
Step S3: when the number of point clouds is judged to have reached the preset value, fuse them to generate the task space. When the number has not reached the preset value, the method returns to step S1.
It should be noted that the preset value can use point-cloud density as its criterion: the higher the density, the more accurate the resulting task space.
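One way to realise this accumulate-then-fuse loop is an occupancy counter over voxels, declaring a voxel part of the task space once it has been observed often enough. The voxel size and per-voxel count threshold below are illustrative stand-ins for the patent's "preset value":

```python
import numpy as np

class TaskSpaceBuilder:
    """Accumulate per-frame robot point clouds until the occupancy map
    is dense enough, then report the fused task space as a voxel set.
    Voxel size and count threshold are assumed values, not from the patent."""

    def __init__(self, voxel=0.1, min_points_per_voxel=3):
        self.voxel = voxel
        self.min_points = min_points_per_voxel
        self.counts = {}

    def add_frame(self, points):
        """Register one frame's robot point cloud in the occupancy counter."""
        for key in map(tuple, np.floor(points / self.voxel).astype(np.int64)):
            self.counts[key] = self.counts.get(key, 0) + 1

    def task_space(self):
        """Voxels observed often enough to count as task space."""
        return {k for k, c in self.counts.items() if c >= self.min_points}
```

Higher `min_points_per_voxel` demands denser evidence per voxel, matching the remark that a denser cloud yields a more accurate task space.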
Step S4: set distance thresholds according to the task space.
Step S5: detect the distance between an object or person and the robot with the ultrasonic distance sensors.
Step S6: judge whether the distance between the object or person and the robot is below the distance threshold; if so, execute step S7, otherwise return to step S5.
Step S7: control the robot to perform an emergency stop or a safe stop.
Specifically, after the task space has been segmented, distance detection in different directions is achieved by forming a distance monitoring system from a plurality of ultrasonic distance sensors mounted in different directions together with a central controller. The ultrasonic distance sensors detect the distance between an object or person and the robot, and when that distance is judged to be below the distance threshold, the central controller sends an instruction to the robot commanding an emergency stop or a safe stop.
In the embodiment of the invention, the distance thresholds set for the plurality of ultrasonic distance sensors are independent of one another, and each threshold's magnitude is determined by the task space.
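How each threshold "is determined by the task space" is left open in the patent. One plausible rule, assumed here rather than prescribed by the source, is to add a fixed safety margin to the task space's reach along the sensor's facing direction:

```python
import numpy as np

def sensor_threshold(direction, task_voxels, voxel, margin=0.5):
    """Per-sensor distance threshold from the fused task space: the farther
    the robot's task space reaches in the sensor's facing direction, the
    larger the threshold. 'margin' is an assumed safety buffer (meters),
    not a value given in the patent."""
    d = np.asarray(direction, float)
    d /= np.linalg.norm(d)
    centers = (np.array(sorted(task_voxels), float) + 0.5) * voxel
    reach = np.max(centers @ d)          # workspace extent along direction
    return max(reach, 0.0) + margin
```

With this rule, a sensor facing a direction in which the robot sweeps far out gets a proportionally larger stopping distance.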
The detection zone of an ultrasonic distance sensor is a sector-shaped region with a certain opening angle, so several sensors are used to monitor distances in different directions. An emergency stop is triggered whenever any one of them fails its distance requirement.
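The central controller's check, any sensor below its own threshold triggers a stop, can be sketched as follows. Sensor names, the dict-based interface, and the stop callback are illustrative, not from the patent:

```python
def check_sensors(readings, thresholds):
    """Return the sensors whose measured distance is below their
    individually configured threshold; any hit should trigger a stop."""
    return [name for name, dist in readings.items()
            if dist < thresholds[name]]

def monitor_step(readings, thresholds, stop_robot):
    """One pass of the central controller's monitoring loop: halt the
    robot if any per-sensor threshold is violated."""
    violated = check_sensors(readings, thresholds)
    if violated:
        stop_robot(violated)   # safe stop or emergency stop command
        return False           # robot halted
    return True                # keep running
```

In a real controller this step would run on every sensor polling cycle (step S5/S6 of the method).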
According to the safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring, a 2D image sensor and a depth sensor are used to segment the robot's task space, and a multi-ultrasonic-sensor distance monitoring system cooperates with it to achieve safe man-machine cooperation. The invention requires no change to the existing robot body or control system and is applicable to industrial robots.
The invention uses task space segmentation and can identify the task space actually in use without any prior knowledge of the robot's internal program. As a further step, a distance monitoring system composed of individually configurable ultrasonic modules is integrated, enabling safe or emergency stops with different distance thresholds depending on the direction from which an object or person approaches. When the robot's task changes, the task space can easily be re-segmented, so the method adapts to complex and changeable industrial scenarios. The invention is completely independent of the robot: it requires neither modification of the robot body nor knowledge of its internal control system, and can therefore be retrofitted to the industrial robots that currently make up the majority of shipments.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made in the above embodiments by those of ordinary skill in the art without departing from the principle and spirit of the present invention. The scope of the invention is defined by the appended claims and their full range of equivalents.
Claims (3)
1. A safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring, characterized by comprising the following steps:
Step S1: acquire robot data with a sensor system, preprocess the acquired data, and remove outliers; a sensor system is employed to externally register any robot while it operates, the sensor system comprising a 2D image acquisition system and a depth sensor, wherein the 2D image data from the 2D image acquisition system is combined with the depth information from the depth sensor to generate a series of point clouds, from which the task space can then be reconstructed; the robot data is the fusion of 2D image data and depth information, i.e., depth information is attached to each pixel of the 2D image; the fused data carries three-dimensional spatial information, namely the spatial positions of the entities in the scene;
Step S2: process the outlier-free data with an octree filter to generate point clouds, which includes: segmenting the robot's three-dimensional image data with the octree filter and recording the robot's point-cloud distribution in the task space, thereby segmenting the task space;
Step S3: when the number of point clouds is judged to have reached a preset value, fuse them to generate the task space;
Step S4: set a distance threshold according to the task space;
Step S5: detect the distance between an object or person and the robot with ultrasonic distance sensors; a plurality of ultrasonic distance sensors mounted in different directions, together with a central controller, form a distance monitoring system for detecting the distance between an object or person and the robot; the distance thresholds set for the ultrasonic distance sensors are independent of one another, and each threshold's magnitude is determined by the task space; Step S6: judge whether the distance between the object or person and the robot is below the distance threshold; if so, execute step S7, otherwise return to step S5; wherein, when the distance between the object or person and the robot is judged to be below the distance threshold, the central controller sends an instruction to the robot;
Step S7: control the robot to perform an emergency stop or a safe stop.
2. The safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring according to claim 1, wherein in step S1, preprocessing the data comprises performing down-sampling and noise reduction on the data.
3. The safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring according to claim 1, wherein in step S3, when the number of point clouds is judged not to have reached the preset value, the method returns to step S1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211679848.2A CN115648226B (en) | 2022-12-27 | 2022-12-27 | Safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115648226A CN115648226A (en) | 2023-01-31 |
CN115648226B true CN115648226B (en) | 2023-03-21 |
Family
ID=85022323
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211679848.2A Active CN115648226B (en) | 2022-12-27 | 2022-12-27 | Safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115648226B (en) |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102004041821A1 (en) * | 2004-08-27 | 2006-03-16 | Abb Research Ltd. | Device and method for securing a machine-controlled handling device |
US9694497B2 (en) * | 2012-07-10 | 2017-07-04 | Siemens Aktiengesellschaft | Robot arrangement and method for controlling a robot |
DE102014012563B4 (en) * | 2014-08-04 | 2018-10-25 | Abb Schweiz Ag | Proximity sensor system for a robot |
DE102016007519A1 (en) * | 2016-06-20 | 2017-12-21 | Kuka Roboter Gmbh | Monitoring a system with at least one robot |
EP3572971B1 (en) * | 2018-05-22 | 2021-02-24 | Sick Ag | Securing a surveillance area with at least one machine |
CN112991356B (en) * | 2019-12-12 | 2023-08-01 | 中国科学院沈阳自动化研究所 | Rapid segmentation method of mechanical arm in complex environment |
CN112621754A (en) * | 2020-12-11 | 2021-04-09 | 中国科学院沈阳计算技术研究所有限公司 | Design method for multi-robot-cooperated assembly line safety layout |
EP4070921A1 (en) * | 2021-04-08 | 2022-10-12 | Collobro AB | A safety system for a collaborative robot |
- 2022-12-27: application CN202211679848.2A granted as patent CN115648226B (CN, active)
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | ||
Effective date of registration: 20231013 Address after: Room 204, 2nd Floor, Building 23, No. 79 Shuangying West Road, Science and Technology Park, Changping District, Beijing 102200 Patentee after: Luoshi (Beijing) Robotics Co.,Ltd. Address before: 100097 1-01, floor 7, building a, Beijing Haiqing Shuguang real estate development center (Office) and postal branch project, East wangfuyuan District, Haidian District, Beijing Patentee before: ROKAE, Inc. |