CN115648226A - Safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring


Info

Publication number
CN115648226A
Authority
CN
China
Prior art keywords
robot
distance
task space
data
method based
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211679848.2A
Other languages
Chinese (zh)
Other versions
CN115648226B (en)
Inventor
万昊
刘勇志
庹华
韩峰涛
于文进
张航
马建涛
Current Assignee
Luoshi Beijing Robotics Co ltd
Original Assignee
Rokae Inc
Priority date
Filing date
Publication date
Application filed by Rokae Inc filed Critical Rokae Inc
Priority to CN202211679848.2A priority Critical patent/CN115648226B/en
Publication of CN115648226A publication Critical patent/CN115648226A/en
Application granted granted Critical
Publication of CN115648226B publication Critical patent/CN115648226B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Manipulator (AREA)

Abstract

The invention provides a safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring, which comprises the following steps: acquiring robot data with a sensor system, preprocessing the acquired data, and removing outliers; processing the outlier-free data with an octree filter to generate point clouds, namely segmenting the robot's three-dimensional image data with the octree filter and recording the robot's point-cloud distribution in the task space, thereby segmenting the task space; fusing the point clouds into the task space once their number reaches a preset value; setting distance thresholds according to the task space; detecting the distance between an object or person and the robot with ultrasonic distance sensors; and judging whether that distance is below the distance threshold and, if so, commanding the robot to perform an emergency stop or a safe stop.

Description

Safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring
Technical Field
The invention relates to the technical field of industrial robots, in particular to a safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring.
Background
The concept of man-machine cooperation has been widely applied in the robotics field in recent years, and safety remains one of its major challenges. Common man-machine cooperation schemes today include lightweight robot mechanism design, control based on torque sensors, and the addition of pressure-sensitive surfaces. These approaches have produced a variety of new robots, commonly referred to as collaborative robots. Statistically, however, collaborative robots account for less than 10% of robots worldwide, while traditional industrial robots make up the vast majority of shipments. Moreover, collaborative robots still lag well behind industrial robots in performance. For these two reasons, a safety solution built on top of existing industrial robots is needed to meet the demands of man-machine cooperation.
Disclosure of Invention
The object of the present invention is to solve at least one of the technical drawbacks mentioned above.
Therefore, the invention aims to provide a safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring.
In order to achieve the above object, an embodiment of the present invention provides a safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring, comprising the following steps:
Step S1: acquire robot data with a sensor system, preprocess the acquired data, and remove outliers;
Step S2: process the outlier-free data with an octree filter and generate point clouds, namely: segment the robot's three-dimensional image data with the octree filter and record the robot's point-cloud distribution in the task space, thereby segmenting the task space;
Step S3: when the number of point clouds is judged to have reached a preset value, fuse them to generate the task space;
Step S4: set distance thresholds according to the task space;
Step S5: detect the distance between an object or person and the robot with ultrasonic distance sensors;
Step S6: judge whether the distance between the object or person and the robot is below the distance threshold; if so, go to step S7, otherwise return to step S5;
Step S7: command the robot to perform an emergency stop or a safe stop.
Further, the sensor system comprises a 2D image acquisition system and a depth sensor; the 2D image data obtained by the 2D image acquisition system is combined with the depth information from the depth sensor to generate point clouds.
Further, in step S1, preprocessing the data comprises down-sampling and denoising the data.
Further, in step S3, when the number of point clouds is judged not to have reached the preset value, the method returns to step S1.
Further, a distance monitoring system is formed from a plurality of ultrasonic distance sensors installed in different directions together with a central controller. The ultrasonic distance sensors detect the distance between an object or person and the robot; when that distance is judged to be below the distance threshold, the central controller sends a command to the robot to perform an emergency stop or a safe stop.
Further, the distance thresholds of the individual ultrasonic distance sensors are set independently, and each threshold's value depends on the task space.
According to the safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring, a 2D image sensor and a depth sensor are used to segment the robot's task space, and a multi-ultrasonic-sensor distance monitoring system is combined with this to realize safe man-machine cooperation. The invention requires no change to the existing robot body or control system and is directly applicable to industrial robots.
The invention uses task space segmentation to identify the actually used task space without a priori knowledge of the robot's internal program. In addition, a distance monitoring system composed of individually configurable ultrasonic modules is integrated, enabling safe or emergency stops with different distance thresholds depending on the direction from which an object or person approaches. When the robot's task changes, the task space can easily be re-segmented, so the method adapts to complex and changing industrial scenarios. The invention is fully independent of the robot: it requires neither modification of the robot body nor knowledge of its internal control system, and can therefore be fitted to the industrial robots that make up the bulk of current shipments.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a flow chart of a method for secure human-machine collaboration based on task space segmentation and dynamic distance monitoring in accordance with an embodiment of the present invention;
FIG. 2 is a diagram illustrating the effect of partitioning the task space of a robot according to an embodiment of the present invention;
FIG. 3 is a flowchart of a task space partitioning and dynamic distance monitoring based secure human-machine collaboration system according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or to elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary, intended to explain the invention, and are not to be construed as limiting it.
A robot usually performs repetitive operations during work, so its task space occupies only a part of its reachable space. Segmenting the task space out of the reachable space therefore enables more precise safety judgments.
The safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring achieves both robot task-space segmentation and distance monitoring.
As shown in fig. 1 to fig. 3, the secure human-computer cooperation method based on task space segmentation and dynamic distance monitoring according to the embodiment of the present invention includes the following steps:
and S1, acquiring robot data by using a sensor system, preprocessing the acquired robot data, and removing abnormal values.
Specifically, a sensor system is used to observe, from the outside, any robot running in its operating mode. The sensor system comprises a 2D image acquisition system and a depth sensor; the 2D image data obtained by the 2D image acquisition system is combined with the depth information from the depth sensor to generate a series of point clouds, from which the task space can later be reconstructed.
Here, robot data refers to the fusion of the 2D image data with the depth information, i.e., attaching a depth value to every pixel of the 2D image. The fused data carries three-dimensional spatial information, specifically the spatial positions of the entities in the scene.
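The patent does not specify how the depth value attached to each pixel becomes a 3D point; a standard approach is pinhole back-projection. The sketch below is an assumption, not the patent's implementation, and the camera intrinsics `fx`, `fy`, `cx`, `cy` are hypothetical parameters of the 2D acquisition system:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (H x W, metres) into an N x 3 point cloud
    using the pinhole camera model. Pixels with zero depth are dropped."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    valid = depth > 0
    z = depth[valid]
    x = (u[valid] - cx) * z / fx   # X = (u - cx) * Z / fx
    y = (v[valid] - cy) * z / fy   # Y = (v - cy) * Z / fy
    return np.stack([x, y, z], axis=1)
```

Each frame processed this way yields one point cloud of the scene, which the later steps segment and fuse.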
First, the data obtained by the sensor system is preprocessed: it is down-sampled and denoised, and outliers are then filtered out.
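The patent names down-sampling and noise reduction without fixing the algorithms. A minimal sketch under assumed choices — voxel-grid down-sampling and a centroid-distance outlier cut; both function names and parameter values are invented here:

```python
import numpy as np

def voxel_downsample(points, voxel=0.05):
    """Keep one representative point per occupied voxel (coarse down-sampling)."""
    keys = np.floor(points / voxel).astype(np.int64)
    _, idx = np.unique(keys, axis=0, return_index=True)  # first point per voxel
    return points[np.sort(idx)]

def remove_outliers(points, k=2.0):
    """Drop points farther than mean + k*std from the cloud centroid."""
    d = np.linalg.norm(points - points.mean(axis=0), axis=1)
    return points[d <= d.mean() + k * d.std()]
```

In practice a statistical k-nearest-neighbour filter would be more robust, but the centroid cut illustrates the idea of discarding abnormal values before segmentation.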
Step S2: process the outlier-free data with an octree filter and generate point clouds, namely: segment the robot's three-dimensional image data with the octree filter and record the robot's point-cloud distribution in the task space, thereby segmenting the task space.
The operating robot is segmented using a modified octree filter. By continuously monitoring the robot with this procedure, its point-cloud distribution in the task space is recorded, completing the task-space segmentation. FIG. 2 illustrates the resulting segmentation.
Specifically, segmenting the robot's three-dimensional image data with the octree filter works as follows: the three-dimensional data acquired in the previous step contains both the robot and other surrounding objects, and segmentation means processing each frame of the 3D image to separate the robot from the rest of the scene. In the following steps, the robot point clouds from multiple frames are fused to obtain the task space.
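The patent does not spell out its octree data structure. The following sketch records robot points as occupied leaves of a fixed-depth octree grid — the class name, bounds, and depth are illustrative assumptions — which is enough to accumulate per-frame robot clouds into a task-space estimate:

```python
import numpy as np

class OctreeOccupancy:
    """Minimal octree over the cubic region [origin, origin+size)^3 that
    records which leaf cells at a fixed depth have been occupied by any
    point -- a stand-in for the patent's octree filter."""

    def __init__(self, origin, size, depth):
        self.origin = np.asarray(origin, float)
        self.size = float(size)
        self.depth = depth
        self.leaves = set()            # occupied leaf indices (ix, iy, iz)

    def insert(self, points):
        points = np.asarray(points, float)
        n = 2 ** self.depth            # leaf cells per axis
        cell = self.size / n
        idx = np.floor((points - self.origin) / cell).astype(int)
        inside = np.all((idx >= 0) & (idx < n), axis=1)  # ignore out-of-bounds points
        for key in map(tuple, idx[inside]):
            self.leaves.add(key)

    def volume(self):
        """Total volume of occupied leaves (a task-space estimate)."""
        return len(self.leaves) * (self.size / 2 ** self.depth) ** 3
```

Repeated `insert` calls over many frames grow the occupied-leaf set, so the recorded leaves converge on the region the robot actually sweeps.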
Step S3: when the number of point clouds is judged to have reached a preset value, fuse them to generate the task space; when it has not, return to step S1.
Note that the preset value can be expressed in terms of point-cloud density: the higher the density, the more accurate the reconstructed task space.
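The density criterion can be illustrated as follows — the voxel size and points-per-voxel preset below are made-up numbers, since the patent leaves the preset value open:

```python
import numpy as np

def fuse_until_dense(frames, voxel=0.05, min_pts_per_voxel=4.0):
    """Accumulate per-frame robot point clouds; stop once the mean point
    density per occupied voxel reaches the preset value (step S3)."""
    acc = []
    for cloud in frames:
        acc.append(cloud)
        pts = np.vstack(acc)
        voxels = np.unique(np.floor(pts / voxel).astype(np.int64), axis=0)
        if len(pts) / len(voxels) >= min_pts_per_voxel:
            return pts          # dense enough: fused task-space cloud
    return None                 # preset density not reached (back to S1)
```

A `None` result corresponds to the patent's branch back to step S1 for more data.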
Step S4: set distance thresholds according to the task space.
Step S5: detect the distance between an object or person and the robot with ultrasonic distance sensors.
Step S6: judge whether the distance between the object or person and the robot is below the distance threshold; if so, go to step S7, otherwise return to step S5.
Step S7: command the robot to perform an emergency stop or a safe stop.
Specifically, after the task space has been segmented, distance detection in different directions is achieved by forming a distance monitoring system from a plurality of ultrasonic distance sensors installed in different directions plus a central controller. The ultrasonic sensors detect the distance between an object or person and the robot; when that distance is judged to be below the distance threshold, the central controller sends a command to the robot to perform an emergency stop or a safe stop.
In the embodiment of the invention, the distance thresholds of the individual ultrasonic distance sensors are set independently, and each threshold's value depends on the task space.
Each ultrasonic distance sensor covers a sector with a certain opening angle, so using several sensors allows distances to be monitored in different directions. An emergency stop is triggered whenever any single sensor's reading violates its distance requirement.
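A hypothetical central-controller decision rule is sketched below. The two-tier thresholds and the sensor-id dictionaries are illustrative assumptions; the patent specifies only that each sensor has its own threshold and that a violation triggers a safe or emergency stop:

```python
def decide_action(readings, safe_thresh, estop_thresh):
    """Per-direction check over sensor readings (metres, keyed by sensor id):
    'estop' if any sensor reads inside its emergency-stop distance,
    'safe_stop' if only the (larger) safe-stop distance is violated,
    'run' otherwise."""
    action = "run"
    for sid, dist in readings.items():
        if dist < estop_thresh[sid]:
            return "estop"            # tightest violation wins immediately
        if dist < safe_thresh[sid]:
            action = "safe_stop"
    return action
```

The controller would evaluate this rule on every sensor polling cycle and forward the resulting command to the robot.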
According to the safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring, a 2D image sensor and a depth sensor are used to segment the robot's task space, and a multi-ultrasonic-sensor distance monitoring system is combined with this to realize safe man-machine cooperation. The invention requires no change to the existing robot body or control system and is directly applicable to industrial robots.
The invention uses task space segmentation to identify the actually used task space without a priori knowledge of the robot's internal program. In addition, a distance monitoring system composed of individually configurable ultrasonic modules is integrated, enabling safe or emergency stops with different distance thresholds depending on the direction from which an object or person approaches. When the robot's task changes, the task space can easily be re-segmented, so the method adapts to complex and changing industrial scenarios. The invention is fully independent of the robot: it requires neither modification of the robot body nor knowledge of its internal control system, and can therefore be fitted to the industrial robots that make up the bulk of current shipments.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made in the above embodiments by those of ordinary skill in the art without departing from the principle and spirit of the present invention. The scope of the invention is defined by the appended claims and their full range of equivalents.

Claims (6)

1. A safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring, characterized by comprising the following steps:
Step S1: acquiring robot data with a sensor system, preprocessing the acquired data, and removing outliers;
Step S2: processing the outlier-free data with an octree filter and generating point clouds, namely: segmenting the robot's three-dimensional image data with the octree filter and recording the robot's point-cloud distribution in the task space, thereby segmenting the task space;
Step S3: when the number of point clouds is judged to have reached a preset value, fusing them to generate the task space;
Step S4: setting distance thresholds according to the task space;
Step S5: detecting the distance between an object or person and the robot with ultrasonic distance sensors;
Step S6: judging whether the distance between the object or person and the robot is below the distance threshold; if so, executing step S7, otherwise returning to step S5;
Step S7: commanding the robot to perform an emergency stop or a safe stop.
2. The safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring according to claim 1, characterized in that the sensor system comprises a 2D image acquisition system and a depth sensor, the 2D image data obtained by the 2D image acquisition system being combined with the depth information from the depth sensor to generate point clouds.
3. The safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring according to claim 1, characterized in that, in step S1, preprocessing the data comprises down-sampling and denoising the data.
4. The safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring according to claim 1, characterized in that, in step S3, when the number of point clouds is judged not to have reached the preset value, the method returns to step S1.
5. The safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring according to claim 1, characterized in that a distance monitoring system is formed from a plurality of ultrasonic distance sensors installed in different directions and a central controller; the ultrasonic distance sensors detect the distance between an object or person and the robot, and when that distance is judged to be below the distance threshold, the central controller sends a command to the robot to perform an emergency stop or a safe stop.
6. The safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring according to claim 5, characterized in that the distance thresholds of the plurality of ultrasonic distance sensors are set independently, and each threshold's value depends on the task space.
CN202211679848.2A 2022-12-27 2022-12-27 Safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring Active CN115648226B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211679848.2A CN115648226B (en) 2022-12-27 2022-12-27 Safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring

Publications (2)

Publication Number Publication Date
CN115648226A true CN115648226A (en) 2023-01-31
CN115648226B CN115648226B (en) 2023-03-21

Family

ID=85022323

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211679848.2A Active CN115648226B (en) 2022-12-27 2022-12-27 Safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring

Country Status (1)

Country Link
CN (1) CN115648226B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101018973A * 2004-08-27 2007-08-15 ABB Research Ltd. Device and method for safeguarding a machine-controlled handling device
CN104428107A * 2012-07-10 2015-03-18 Siemens AG Robot arrangement and method for controlling a robot
WO2016020052A1 * 2014-08-04 2016-02-11 gomtec GmbH Proximity sensor system for a robot
WO2017220186A1 * 2016-06-20 2017-12-28 Kuka Roboter Gmbh Monitoring a plant equipped with at least one robot
US20190362487A1 * 2018-05-22 2019-11-28 Sick Ag Securing a Monitored Zone Comprising at Least One Machine
CN112621754A * 2020-12-11 2021-04-09 Shenyang Institute of Computing Technology, Chinese Academy of Sciences Design method for a multi-robot cooperative assembly line safety layout
CN112991356A * 2019-12-12 2021-06-18 Shenyang Institute of Automation, Chinese Academy of Sciences Rapid segmentation method for a robotic arm in a complex environment
EP4070921A1 * 2021-04-08 2022-10-12 Collobro AB A safety system for a collaborative robot


Also Published As

Publication number Publication date
CN115648226B (en) 2023-03-21

Similar Documents

Publication Publication Date Title
TWI547355B (en) Safety monitoring system of human-machine symbiosis and method using the same
EP3102367B1 (en) Controlling a robot in the presence of a moving object
AU2013273831B2 (en) A method for improving tracking using dynamic background compensation with centroid compensation
CN110069990B (en) Height limiting rod detection method and device and automatic driving system
CN107530881B (en) Robot system and method for operating a robot
CN109305165A (en) Intelligent ultrasound system, rear of vehicle collision-warning device and its control method
JP2012108785A (en) Person counting device, person counting method, and person counting program
CN106680832A (en) Obstacle detection method and device of mobile robot and mobile robot
CN108021899A (en) Vehicle intelligent front truck anti-collision early warning method based on binocular camera
CN109591009A (en) Robot system
CN112883790A (en) 3D object detection method based on monocular camera
CN115648226B (en) Safe man-machine cooperation method based on task space segmentation and dynamic distance monitoring
Schneider et al. Unsupervised anomaly detection from time-of-flight depth images
TW201228766A (en) Interference preventing method and device
CN112348837B (en) Point-line detection fusion object edge detection method and system
CN113110515A (en) Mobile robot obstacle avoidance method, device, equipment and storage medium
KR102270827B1 (en) Generating Joint Cameraand LiDAR Features Using Cross-View Spatial Feature Mapping for 3D Object Detection
CN114815821B (en) Indoor self-adaptive panoramic obstacle avoidance method and system based on multi-line laser radar
JP6941226B2 (en) Object recognition device
Nagata et al. Generation of triangulated patches smoothed from original point cloud data with noise and its application to robotic machining
CN111445519A (en) Industrial robot three-dimensional attitude estimation method and device and storage medium
CN113689468A (en) Method and apparatus for controlling workpiece processing apparatus, and computer-readable storage medium
CN113523902B (en) Five-axis linkage fork type swing head anti-collision control method
CN108427931B (en) The detection method of barrier before a kind of mine locomotive based on machine vision
CN106740495A (en) A kind of rearview mirror adjusting method, device and automobile

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231013

Address after: Room 204, 2nd Floor, Building 23, No. 79 Shuangying West Road, Science and Technology Park, Changping District, Beijing 102200

Patentee after: Luoshi (Beijing) Robotics Co.,Ltd.

Address before: 100097 1-01, floor 7, building a, Beijing Haiqing Shuguang real estate development center (Office) and postal branch project, East wangfuyuan District, Haidian District, Beijing

Patentee before: ROKAE, Inc.