EP2046537A2 - Verfahren zur beobachtung einer person in einem industriellen umfeld - Google Patents
Verfahren zur Beobachtung einer Person in einem industriellen Umfeld (Method for observing a person in an industrial environment)
- Publication number
- EP2046537A2 (application EP07723978A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- person
- movement
- image data
- machine
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
Classifications
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F16—ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
- F16P—SAFETY DEVICES IN GENERAL; SAFETY DEVICES FOR PRESSES
- F16P3/00—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body
- F16P3/12—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine
- F16P3/14—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine the means being photocells or other devices sensitive without mechanical contact
- F16P3/142—Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine the means being photocells or other devices sensitive without mechanical contact using image capturing devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40202—Human robot coexistence
Definitions
- The invention relates to a method for observing a person in an industrial environment according to the preambles of claims 1 and 3.
- The corresponding approaches can be divided into two-dimensional methods, either with explicit shape models or model-free, and into three-dimensional methods.
- Windows of different sizes are moved over the source image, and the corresponding image regions are subjected to a Haar wavelet transformation.
- The wavelet coefficients are obtained by applying differential operators of different scale and orientation at different positions of the image region. From this potentially very large set of features, a small subset of the coefficients is selected "on the fly" based on their magnitude and their local distribution in the image.
- This reduced set of features is fed to a support vector machine (SVM) for classification.
- For detection, windows of different sizes are slid over the image, the corresponding features are extracted from these image regions, and the SVM then decides whether or not the corresponding window contains a person (illustrated in the sketch below).
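A minimal sketch of this sliding-window detection, assuming a pre-trained SVM classifier and a crude Haar-like feature extractor; the window sizes, stride and helper names are illustrative assumptions, not part of the patent disclosure.

```python
# Sliding-window person detection with an SVM classifier (illustrative sketch only).
import numpy as np
from sklearn.svm import SVC

def haar_features(patch: np.ndarray) -> np.ndarray:
    """Crude Haar-like responses: differences of mean intensities of patch halves."""
    h, w = patch.shape
    left, right = patch[:, : w // 2].mean(), patch[:, w // 2 :].mean()
    top, bottom = patch[: h // 2, :].mean(), patch[h // 2 :, :].mean()
    return np.array([left - right, top - bottom, patch.mean()])

def detect_persons(image: np.ndarray, svm: SVC, sizes=(64, 128), stride=16):
    """Slide windows of several sizes over the image and classify each window."""
    detections = []
    for size in sizes:
        for y in range(0, image.shape[0] - size, stride):
            for x in range(0, image.shape[1] - size, stride):
                feats = haar_features(image[y : y + size, x : x + size])
                if svm.predict(feats.reshape(1, -1))[0] == 1:  # 1 = "person"
                    detections.append((x, y, size))
    return detections
```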
- [5] combines temporal sequences of two-dimensional Haar wavelet features into high-dimensional feature vectors and classifies them with SVMs, resulting in a gain in recognition performance over the purely frame-based approach.
- The method of chamfer matching has been applied to the detection of pedestrian outlines in road traffic scenes using a non-stationary camera.
- The chamfer matching technique is combined with a stereo image processing system and a neural network with local receptive fields used as a texture classifier according to [8], in order to achieve a reliable and robust classification result.
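A minimal sketch of chamfer matching via a distance transform, as referenced above; the edge image, the outline template, the score threshold and the coarse search grid are illustrative assumptions.

```python
# Chamfer matching of a binary outline template against an edge image (sketch only).
import numpy as np
from scipy import ndimage

def chamfer_matches(edge_image: np.ndarray, template: np.ndarray,
                    threshold: float = 2.0, stride: int = 8):
    """Return positions where the mean distance of template edge pixels to the
    nearest image edge pixel falls below the threshold (lower = better match)."""
    # Distance of every pixel to the nearest edge pixel.
    dist = ndimage.distance_transform_edt(edge_image == 0)
    h, w = template.shape
    matches = []
    for y in range(0, edge_image.shape[0] - h, stride):
        for x in range(0, edge_image.shape[1] - w, stride):
            window = dist[y : y + h, x : x + w]
            score = float(window[template > 0].mean())
            if score < threshold:
                matches.append((x, y, score))
    return matches
```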
- Another group of methods for person detection are model-based techniques that use explicit prior knowledge about the appearance of persons in the form of a model. Since occlusion of body parts is problematic here, many systems also require prior knowledge about the nature of the movements to be detected and about the camera perspective.
- The persons are segmented, for example, by background subtraction, which requires a stationary camera and a background that does not change or changes only slowly.
- The models used consist, for example, of straight rods ("stick figures"), in which the individual body parts are approximated by ellipsoids.
- A method for 3D modeling of a person from 2D image data is described in [30].
- Image data of a person are acquired with the aid of a multi-camera system, and the person's body parts are identified in the 2D image data, in particular by means of template matching.
- The identified body parts are then modeled by dynamic template matching using 3D templates. In this way, persons can be identified quickly and continuously even if they are partially occluded or temporarily not captured by the multi-camera system.
- The identified persons are then tracked in the image data.
- The prior art described above demonstrates that a variety of image-processing-based methods are known for recognizing persons in complex environments, for recognizing body parts and their movements, and for detecting complex, composite objects and the corresponding assembly activities.
- The applicability of these algorithms, however, is often described only for purely academic applications.
- The object of the invention is to make camera-based person recognition and modeling usable in an industrial environment.
- Image data of the person are acquired by means of a multi-camera system.
- These image data are then examined for the presence of a person; when a person has been detected in the image data, an articulated, virtual 3D model of the human body is fitted to this person hypothesis. Subsequently, this virtual body model is continuously adapted to the movement behavior of the person detected in the image data.
- The position and/or the movement behavior of a person located in the vicinity of the machine or of a machine element is determined. Based on this, a hazard potential can be determined from the position and the movement behavior of the virtual body model in space. The hazard potential determined in this way is subjected to a threshold comparison in order to influence the motion control of the machine or of the machine part when this threshold is exceeded. In a particularly advantageous manner, this intervention in the motion control of the machine or the machine part causes it to shut down or to slow its movement. If only a slowdown is effected, the machine or its movable machine element is given the opportunity to continue operation while the hazard potential is reduced (see the sketch below).
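A minimal sketch of such a threshold-based intervention, assuming the hazard potential is derived from the distance between tracked body-model points and a machine element and from the approach speed; the distance measure, the speed weighting and both thresholds are illustrative assumptions, not values from the patent.

```python
# Hazard potential from body-machine distance and approach speed, with a
# threshold comparison driving the machine control (illustrative sketch only).
import numpy as np

def hazard_potential(body_points: np.ndarray, machine_point: np.ndarray,
                     body_velocity: np.ndarray, speed_weight: float = 0.5) -> float:
    """Higher values mean higher risk: small distance and fast approach raise the potential."""
    distances = np.linalg.norm(body_points - machine_point, axis=1)
    closest = float(distances.min())
    idx = int(distances.argmin())
    # Velocity component of the closest body point towards the machine element.
    direction = (machine_point - body_points[idx]) / max(closest, 1e-6)
    approach_speed = max(float(body_velocity[idx] @ direction), 0.0)
    return 1.0 / max(closest, 1e-6) + speed_weight * approach_speed

def control_action(potential: float, slow_threshold: float = 2.0,
                   stop_threshold: float = 5.0) -> str:
    """Threshold comparison: continue, slow down, or shut down the machine."""
    if potential >= stop_threshold:
        return "stop"
    if potential >= slow_threshold:
        return "slow_down"
    return "continue"
```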
- Data are continuously derived from the current shape and position of the virtual body model and are correlated with the data of a database.
- The database contains a multiplicity of data records that were determined in advance from the shape and position of a body model during a multiplicity of movement phases describing a movement sequence of a person.
- A movement phase is then regarded as having been performed by the observed person if the data derived from the person's current body model have a certain degree of similarity to the data stored in the database for this movement phase. If a certain sequence of movement phases stored in the database is detected, the movement sequence is considered to have been completed by the observed person. If, however, the movement sequence is judged to be incomplete, a corresponding signal is issued (a sketch of this phase matching is given below).
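A minimal sketch of this phase matching and sequence check; the feature vectors, the similarity threshold and the contents of the phase database are made-up placeholders.

```python
# Matching body-model features against stored movement phases and checking
# whether the required phase sequence was completed (illustrative sketch only).
import numpy as np

PHASE_DB = {                      # reference feature vector per movement phase
    "reach":  np.array([0.9, 0.1, 0.0]),
    "grasp":  np.array([0.5, 0.8, 0.1]),
    "insert": np.array([0.1, 0.4, 0.9]),
}
REQUIRED_SEQUENCE = ["reach", "grasp", "insert"]

def classify_phase(features: np.ndarray, threshold: float = 0.2):
    """Return the most similar stored phase, or None if nothing is close enough."""
    best, best_dist = None, np.inf
    for name, ref in PHASE_DB.items():
        dist = float(np.linalg.norm(features - ref))
        if dist < best_dist:
            best, best_dist = name, dist
    return best if best_dist <= threshold else None

def sequence_complete(observed_phases: list[str]) -> bool:
    """Check that the required phases occur in order within the observed phases."""
    it = iter(observed_phases)
    return all(phase in it for phase in REQUIRED_SEQUENCE)

# Example: signal when the observed sequence lacks a required phase.
observed = ["reach", "grasp"]          # "insert" was never detected
if not sequence_complete(observed):
    print("movement sequence incomplete - signalling")
```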
- Another advantageous application of this alternative embodiment of the invention is the monitoring of newly trained operating personnel. For example, many production errors occur when new workers have to be trained at short notice during the holiday season.
- The work processes of newly trained operators can thus be observed. An indication can then be given when it is detected that necessary movement phases within a movement sequence to be performed were not carried out, so that it must be assumed that a work process was not performed correctly.
- When a movement sequence is signaled as incomplete, an indication is given of at least one of the movement phases that was considered not to have been carried out when checking for a correct sequence. In this way, it is particularly easy for the observed person to recognize the errors in their movement or in their work execution.
- A trainer can recognize which sections of the learned activity are still difficult for the instructed person and may require additional explanations or further training.
- Ergonomically problematic movement phases can advantageously be detected within a whole sequence of movements and optionally optimized by changing the order of the movement phases or by modifying the equipment or objects handled by the observed person.
- The amount of data to be managed in the database, as well as the processing effort, can also be reduced by subjecting the stored image data to a transformation, in particular a principal axis transformation.
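A minimal sketch of such a principal axis transformation (PCA) for data reduction; the feature dimensionality and the number of retained components are illustrative assumptions.

```python
# Principal axis transformation (PCA) to reduce stored feature data (sketch only).
import numpy as np

def principal_axis_transform(data: np.ndarray, n_components: int = 3):
    """Project mean-centred data onto its strongest principal axes."""
    centred = data - data.mean(axis=0)
    covariance = np.cov(centred, rowvar=False)
    eigenvalues, eigenvectors = np.linalg.eigh(covariance)
    order = np.argsort(eigenvalues)[::-1][:n_components]   # strongest axes first
    axes = eigenvectors[:, order]
    return centred @ axes, axes                            # reduced data + basis

# Example: compress 100 stored body-model feature vectors of dimension 20 to 3.
reduced, axes = principal_axis_transform(np.random.rand(100, 20), n_components=3)
```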
- The creation of the 3D model of the person is based on 3D point data.
- These point data can be created by multi-image analysis, in particular stereo image analysis.
- For example, by using a stereo method based on spatiotemporal features (such as in [28]), information is obtained for each 3D point in space that goes beyond its location coordinates (x, y, z), such as its velocity or acceleration.
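A minimal sketch of how velocity and acceleration could be attached to reconstructed 3D points by finite differences between consecutive frames; the frame rate and the nearest-neighbour association are illustrative assumptions and do not reproduce the spatiotemporal stereo method of [28].

```python
# Attach velocity/acceleration estimates to 3D points via finite differences
# over consecutive reconstructions (illustrative sketch only).
import numpy as np

def enrich_points(prev_pts: np.ndarray, curr_pts: np.ndarray,
                  prev_vel: np.ndarray, dt: float = 1 / 25):
    """For each current 3D point, estimate velocity and acceleration from the
    nearest point of the previous frame (crude temporal association)."""
    velocities, accelerations = [], []
    for p in curr_pts:
        idx = int(np.linalg.norm(prev_pts - p, axis=1).argmin())
        v = (p - prev_pts[idx]) / dt
        a = (v - prev_vel[idx]) / dt
        velocities.append(v)
        accelerations.append(a)
    return np.array(velocities), np.array(accelerations)
```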
- The segmentation of a plurality of 3D point data (a 3D point cloud) is advantageously performed by means of a clustering method, in particular agglomerative clustering.
- The convex hull is then determined.
- Simple features, in particular their height or volume, are first determined for each cluster. In this way, invalid or meaningless clusters can then be discarded, in particular on the basis of a priori knowledge about the properties of a natural person (see the clustering sketch below).
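A minimal sketch of agglomerative clustering of a 3D point cloud followed by discarding implausible clusters; the merge distance and the person-like height and volume ranges are illustrative assumptions.

```python
# Agglomerative clustering of a 3D point cloud and filtering of clusters by
# a-priori person properties (illustrative sketch only).
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial import ConvexHull

def person_candidate_clusters(points: np.ndarray, merge_dist: float = 0.3):
    """Cluster 3D points (N x 3) and keep clusters with person-like height/volume."""
    labels = fcluster(linkage(points, method="single"),
                      t=merge_dist, criterion="distance")
    candidates = []
    for label in np.unique(labels):
        cluster = points[labels == label]
        if len(cluster) < 4:                       # too few points for a 3D hull
            continue
        height = float(np.ptp(cluster[:, 2]))      # extent along the vertical axis
        volume = ConvexHull(cluster).volume
        if 1.0 <= height <= 2.2 and 0.05 <= volume <= 1.5:   # a-priori person limits
            candidates.append(cluster)
    return candidates
```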
- For the articulated 3D model of the human body, it is advantageous to model the body parts by interconnected cylinders.
- The posture of the person is given in this model as a vector of the joint angles of the model.
- The evaluation of a posture is preferably done by determining the discrepancy between the features derived from the 3D point cloud and the images of the scene on the one hand and the appearance of the model at the given posture on the other, thereby determining a likelihood that the given posture reproduces the measured shape of the person.
- A probabilistic approach for exploring the search space is a kernel-based particle filter [29] (a sketch of a basic particle-filter step is given below).
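A minimal sketch of a plain particle filter over joint-angle vectors; the likelihood function, the motion noise and the particle count are illustrative assumptions, and the kernel-based refinement of [29] is not reproduced here.

```python
# Basic particle filter over joint-angle vectors (illustrative sketch only).
import numpy as np

def particle_filter_step(particles: np.ndarray, weights: np.ndarray,
                         likelihood, motion_noise: float = 0.05):
    """One predict/update/resample cycle over N particles (joint-angle vectors)."""
    # Predict: diffuse the joint angles with Gaussian noise.
    particles = particles + np.random.normal(0.0, motion_noise, particles.shape)
    # Update: weight each posture hypothesis by how well it explains the measurement.
    weights = np.array([likelihood(p) for p in particles])
    weights = weights / weights.sum()
    # Resample: draw particles proportionally to their weights.
    idx = np.random.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

def estimate_posture(particles: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Weighted mean of the joint-angle vectors as the current posture estimate."""
    return (particles * weights[:, None]).sum(axis=0)
```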
- The detected movements of the body parts are represented by motion templates.
- These motion templates contain representative movement patterns, obtained from 3D measurements of typical human motion sequences, that narrow the space of possible joint angles and joint angular velocities of the person model. In this way, a biologically realistic extrapolation of the movements of the person is possible, especially with the aim of detecting an imminent collision between human and machine.
- A movement sequence can here be regarded as a composite sequence of movement phases.
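A minimal sketch of extrapolating the body model under template constraints and checking for an imminent collision; the joint-angle and velocity limits, the prediction horizon and the forward-kinematics stub are illustrative assumptions.

```python
# Template-constrained extrapolation of the body model and collision check
# against a machine safety volume (illustrative sketch only).
import numpy as np

ANGLE_LIMITS = (np.full(10, -1.5), np.full(10, 1.5))   # template joint-angle bounds (rad)
VELOCITY_LIMIT = 2.0                                    # template angular-speed bound (rad/s)

def extrapolate_posture(angles: np.ndarray, angular_vel: np.ndarray,
                        horizon: float = 0.5) -> np.ndarray:
    """Predict the posture a short time ahead, clipped to the template constraints."""
    vel = np.clip(angular_vel, -VELOCITY_LIMIT, VELOCITY_LIMIT)
    return np.clip(angles + vel * horizon, *ANGLE_LIMITS)

def collision_imminent(predicted_angles: np.ndarray, forward_kinematics,
                       machine_center: np.ndarray, safety_radius: float = 0.5) -> bool:
    """True if any predicted body-model point enters the safety volume of the machine."""
    body_points = forward_kinematics(predicted_angles)  # user-supplied model geometry
    distances = np.linalg.norm(body_points - machine_center, axis=1)
    return bool((distances < safety_radius).any())
```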
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- General Physics & Mathematics (AREA)
- Psychiatry (AREA)
- Human Computer Interaction (AREA)
- Social Psychology (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Mechanical Engineering (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Emergency Alarm Devices (AREA)
Abstract
Description
Claims
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102006036400 | 2006-08-02 | ||
DE102006048166A DE102006048166A1 (de) | 2006-08-02 | 2006-10-10 | Verfahren zur Beobachtung einer Person in einem industriellen Umfeld |
PCT/EP2007/003037 WO2008014831A2 (de) | 2006-08-02 | 2007-04-04 | Verfahren zur beobachtung einer person in einem industriellen umfeld |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2046537A2 true EP2046537A2 (de) | 2009-04-15 |
Family
ID=38885059
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP07723978A Ceased EP2046537A2 (de) | 2006-08-02 | 2007-04-04 | Verfahren zur beobachtung einer person in einem industriellen umfeld |
Country Status (6)
Country | Link |
---|---|
US (1) | US8154590B2 (de) |
EP (1) | EP2046537A2 (de) |
JP (1) | JP2009545789A (de) |
CN (1) | CN101511550B (de) |
DE (1) | DE102006048166A1 (de) |
WO (1) | WO2008014831A2 (de) |
Families Citing this family (87)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070134652A1 (en) * | 2005-11-09 | 2007-06-14 | Primera Biosystems, Inc. | Multiplexed quantitative detection of pathogens |
FR2927444B1 (fr) * | 2008-02-12 | 2013-06-14 | Cliris | Procede pour generer une image de densite d'une zone d'observation |
EP2364243B1 (de) * | 2008-12-03 | 2012-08-01 | ABB Research Ltd. | Robotersicherheitssystem und verfahren |
AT508094B1 (de) * | 2009-03-31 | 2015-05-15 | Fronius Int Gmbh | Verfahren und vorrichtung zur bedienung einer mit einem handbetätigten arbeitsgerät verbundenen stromquelle |
US8503720B2 (en) | 2009-05-01 | 2013-08-06 | Microsoft Corporation | Human body pose estimation |
EP2430614B1 (de) * | 2009-05-11 | 2013-09-18 | Universität zu Lübeck | Verfahren zur echtzeitfähigen, rechnergestützten analyse einer eine veränderliche pose enthaltenden bildsequenz |
US20110026770A1 (en) * | 2009-07-31 | 2011-02-03 | Jonathan David Brookshire | Person Following Using Histograms of Oriented Gradients |
US8963829B2 (en) | 2009-10-07 | 2015-02-24 | Microsoft Corporation | Methods and systems for determining and tracking extremities of a target |
US8564534B2 (en) | 2009-10-07 | 2013-10-22 | Microsoft Corporation | Human tracking system |
US8867820B2 (en) * | 2009-10-07 | 2014-10-21 | Microsoft Corporation | Systems and methods for removing a background of an image |
US7961910B2 (en) | 2009-10-07 | 2011-06-14 | Microsoft Corporation | Systems and methods for tracking a model |
DE102009046107A1 (de) * | 2009-10-28 | 2011-05-05 | Ifm Electronic Gmbh | System und Verfahren für eine Interaktion zwischen einer Person und einer Maschine |
US9189949B2 (en) | 2010-12-09 | 2015-11-17 | Sealed Air Corporation (Us) | Automated monitoring and control of contamination in a production area |
US9406212B2 (en) | 2010-04-01 | 2016-08-02 | Sealed Air Corporation (Us) | Automated monitoring and control of contamination activity in a production area |
US9143843B2 (en) | 2010-12-09 | 2015-09-22 | Sealed Air Corporation | Automated monitoring and control of safety in a production area |
DE102010017857B4 (de) | 2010-04-22 | 2019-08-08 | Sick Ag | 3D-Sicherheitsvorrichtung und Verfahren zur Absicherung und Bedienung mindestens einer Maschine |
US9011607B2 (en) | 2010-10-07 | 2015-04-21 | Sealed Air Corporation (Us) | Automated monitoring and control of cleaning in a production area |
DE102010061382B4 (de) * | 2010-12-21 | 2019-02-14 | Sick Ag | Optoelektronischer Sensor und Verfahren zur Erfassung und Abstandsbestimmung von Objekten |
US8942917B2 (en) | 2011-02-14 | 2015-01-27 | Microsoft Corporation | Change invariant scene recognition by an agent |
US20130070056A1 (en) * | 2011-09-20 | 2013-03-21 | Nexus Environmental, LLC | Method and apparatus to monitor and control workflow |
US8724906B2 (en) * | 2011-11-18 | 2014-05-13 | Microsoft Corporation | Computing pose and/or shape of modifiable entities |
CN104039513B (zh) * | 2012-01-13 | 2015-12-23 | 三菱电机株式会社 | 风险测定系统 |
DE102012102236A1 (de) | 2012-03-16 | 2013-09-19 | Pilz Gmbh & Co. Kg | Verfahren und Vorrichtung zum Absichern eines gefährlichen Arbeitsbereichs einer automatisiert arbeitenden Maschine |
DE102012103163A1 (de) * | 2012-04-12 | 2013-10-17 | Steinel Gmbh | Vorrichtung zur Steuerung eines Gebäudeaggregats |
CN104428107B (zh) | 2012-07-10 | 2016-06-29 | 西门子公司 | 机器人布置和用于控制机器人的方法 |
WO2014036549A2 (en) * | 2012-08-31 | 2014-03-06 | Rethink Robotics, Inc. | Systems and methods for safe robot operation |
US10776734B2 (en) * | 2012-09-10 | 2020-09-15 | The Boeing Company | Ergonomic safety evaluation with labor time standard |
US9857470B2 (en) | 2012-12-28 | 2018-01-02 | Microsoft Technology Licensing, Llc | Using photometric stereo for 3D environment modeling |
US9940553B2 (en) | 2013-02-22 | 2018-04-10 | Microsoft Technology Licensing, Llc | Camera/object pose from predicted coordinates |
US9498885B2 (en) | 2013-02-27 | 2016-11-22 | Rockwell Automation Technologies, Inc. | Recognition-based industrial automation control with confidence-based decision support |
US9393695B2 (en) * | 2013-02-27 | 2016-07-19 | Rockwell Automation Technologies, Inc. | Recognition-based industrial automation control with person and object discrimination |
US9804576B2 (en) * | 2013-02-27 | 2017-10-31 | Rockwell Automation Technologies, Inc. | Recognition-based industrial automation control with position and derivative decision reference |
US9798302B2 (en) | 2013-02-27 | 2017-10-24 | Rockwell Automation Technologies, Inc. | Recognition-based industrial automation control with redundant system input support |
US9427871B2 (en) | 2013-05-06 | 2016-08-30 | Abb Technology Ag | Human safety provision in mobile automation environments |
DE102014209337A1 (de) | 2013-05-17 | 2014-11-20 | Ifm Electronic Gmbh | System und Verfahren zur Erfassung eines Gefährdungsbereichs |
DE102013110905A1 (de) * | 2013-10-01 | 2015-04-02 | Daimler Ag | MRK Planungs- und Überwachungstechnologie |
US20150092040A1 (en) * | 2013-10-01 | 2015-04-02 | Broadcom Corporation | Gesture-Based Industrial Monitoring |
US9452531B2 (en) | 2014-02-04 | 2016-09-27 | Microsoft Technology Licensing, Llc | Controlling a robot in the presence of a moving object |
DE102014202733B4 (de) * | 2014-02-14 | 2022-09-01 | Homag Plattenaufteiltechnik Gmbh | Verfahren zum Betreiben einer Maschine, insbesondere einer Plattenaufteilanlage |
JP5785284B2 (ja) * | 2014-02-17 | 2015-09-24 | ファナック株式会社 | 搬送対象物の落下事故を防止するロボットシステム |
US9921300B2 (en) | 2014-05-19 | 2018-03-20 | Rockwell Automation Technologies, Inc. | Waveform reconstruction in a time-of-flight sensor |
US9256944B2 (en) * | 2014-05-19 | 2016-02-09 | Rockwell Automation Technologies, Inc. | Integration of optical area monitoring with industrial machine control |
US9696424B2 (en) | 2014-05-19 | 2017-07-04 | Rockwell Automation Technologies, Inc. | Optical area monitoring with spot matrix illumination |
US11243294B2 (en) | 2014-05-19 | 2022-02-08 | Rockwell Automation Technologies, Inc. | Waveform reconstruction in a time-of-flight sensor |
US9625108B2 (en) | 2014-10-08 | 2017-04-18 | Rockwell Automation Technologies, Inc. | Auxiliary light source associated with an industrial application |
US10198706B2 (en) * | 2015-07-31 | 2019-02-05 | Locus Robotics Corp. | Operator identification and performance tracking |
US10414047B2 (en) | 2015-09-28 | 2019-09-17 | Siemens Product Lifecycle Management Software Inc. | Method and a data processing system for simulating and handling of anti-collision management for an area of a production plant |
DE102015225587A1 (de) * | 2015-12-17 | 2017-06-22 | Volkswagen Aktiengesellschaft | Interaktionssystem und Verfahren zur Interaktion zwischen einer Person und mindestens einer Robotereinheit |
DE102016200455A1 (de) * | 2016-01-15 | 2017-07-20 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Sicherheitsvorrichtung und -verfahren zum sicheren Betrieb eines Roboters |
US10924881B2 (en) * | 2016-03-03 | 2021-02-16 | Husqvarna Ab | Device for determining construction device and worker position |
WO2017198342A1 (de) * | 2016-05-18 | 2017-11-23 | Bobst Grenchen Ag | Kontrollsystem für einen funktionsabschnitt einer papierverarbeitungsvorrichtung |
DE102016212695B4 (de) * | 2016-05-31 | 2019-02-21 | Siemens Aktiengesellschaft | Industrieroboter |
JP6703691B2 (ja) * | 2016-06-02 | 2020-06-03 | コマツ産機株式会社 | コントローラ、鍛圧機械、および制御方法 |
WO2018018574A1 (zh) * | 2016-07-29 | 2018-02-01 | 罗伯特·博世有限公司 | 人员保护系统及其运行方法 |
US11000953B2 (en) * | 2016-08-17 | 2021-05-11 | Locus Robotics Corp. | Robot gamification for improvement of operator performance |
US11518051B2 (en) | 2017-02-07 | 2022-12-06 | Veo Robotics, Inc. | Dynamic, interactive signaling of safety-related conditions in a monitored environment |
US12103170B2 (en) | 2017-01-13 | 2024-10-01 | Clara Vu | Dynamic, interactive signaling of safety-related conditions in a monitored environment |
US11541543B2 (en) | 2017-02-07 | 2023-01-03 | Veo Robotics, Inc. | Dynamic, interactive signaling of safety-related conditions in a monitored environment |
US11820025B2 (en) | 2017-02-07 | 2023-11-21 | Veo Robotics, Inc. | Safe motion planning for machinery operation |
CN110494900A (zh) | 2017-02-07 | 2019-11-22 | 韦奥机器人股份有限公司 | 工作空间安全监控和设备控制 |
US11373076B2 (en) | 2017-02-20 | 2022-06-28 | 3M Innovative Properties Company | Optical articles and systems interacting with the same |
WO2019064108A1 (en) | 2017-09-27 | 2019-04-04 | 3M Innovative Properties Company | PERSONAL PROTECTIVE EQUIPMENT MANAGEMENT SYSTEM USING OPTICAL PATTERNS FOR EQUIPMENT AND SECURITY MONITORING |
DE102017221305A1 (de) * | 2017-11-23 | 2019-05-23 | Robert Bosch Gmbh | Verfahren zum Betreiben eines kollaborativen Roboters |
US12097625B2 (en) | 2018-02-06 | 2024-09-24 | Veo Robotics, Inc. | Robot end-effector sensing and identification |
US12049014B2 (en) | 2018-02-06 | 2024-07-30 | Veo Robotics, Inc. | Workplace monitoring and semantic entity identification for safe machine operation |
DE102018109320A1 (de) * | 2018-04-19 | 2019-10-24 | Gottfried Wilhelm Leibniz Universität Hannover | Verfahren zur Erkennung einer Intention eines Partners gegenüber einer mehrgliedrigen aktuierten Kinematik |
JP2019200560A (ja) | 2018-05-16 | 2019-11-21 | パナソニックIpマネジメント株式会社 | 作業分析装置および作業分析方法 |
CN108846891B (zh) * | 2018-05-30 | 2023-04-28 | 广东省智能制造研究所 | 一种基于三维骨架检测的人机安全协作方法 |
DE102018114156B3 (de) * | 2018-06-13 | 2019-11-14 | Volkswagen Aktiengesellschaft | Verfahren zur Steuerung eines Roboters, insbesondere eines Industrieroboters, sowie Vorrichtung zur Steuerung des Roboters |
WO2020031333A1 (ja) * | 2018-08-09 | 2020-02-13 | 株式会社Fuji | シミュレーション方法およびシミュレーションシステム |
KR102085168B1 (ko) * | 2018-10-26 | 2020-03-04 | 오토아이티(주) | 인체추적 기반 위험지역 안전관리 방법 및 장치 |
DE102019103349B3 (de) | 2019-02-11 | 2020-06-18 | Beckhoff Automation Gmbh | Industrierobotersystem und Verfahren zur Steuerung eines Industrieroboters |
AU2020270998A1 (en) * | 2019-04-12 | 2021-12-02 | University Of Iowa Research Foundation | System and method to predict, prevent, and mitigate workplace injuries |
DE102019207144A1 (de) * | 2019-05-16 | 2020-11-19 | Robert Bosch Gmbh | Verfahren zur Erkennung eines Bedieners einer Arbeitsmaschine |
EP3761193A1 (de) * | 2019-07-04 | 2021-01-06 | Siemens Aktiengesellschaft | Sicherheitsanalyse von technischen systemen mit menschlichen objekten |
DE102019216405A1 (de) * | 2019-10-24 | 2021-04-29 | Robert Bosch Gmbh | Verfahren zur Verhinderung von Personenschäden bei einem Betrieb einer mobilen Arbeitsmaschine |
IT201900021108A1 (it) * | 2019-11-13 | 2021-05-13 | Gamma System S R L | Sistema di sicurezza per un macchinario industriale |
CN113033242A (zh) * | 2019-12-09 | 2021-06-25 | 上海幻电信息科技有限公司 | 动作识别方法及系统 |
CN111275941A (zh) * | 2020-01-18 | 2020-06-12 | 傲通环球环境控制(深圳)有限公司 | 工地安全管理系统 |
EP3865257A1 (de) | 2020-02-11 | 2021-08-18 | Ingenieurbüro Hannweber GmbH | Einrichtung und verfahren zur überwachung und steuerung eines technischen arbeitssystems |
CN111553264B (zh) * | 2020-04-27 | 2023-04-18 | 中科永安(安徽)科技有限公司 | 一种适用于中小学生的校园非安全行为检测及预警方法 |
CN111726589B (zh) * | 2020-07-07 | 2022-01-28 | 山东天原管业股份有限公司 | 一种阀体的生产加工方法 |
EP4016376A1 (de) * | 2020-12-18 | 2022-06-22 | Toyota Jidosha Kabushiki Kaisha | Computerimplementiertes prozessüberwachungsverfahren |
AT17459U1 (de) * | 2021-01-21 | 2022-05-15 | Altendorf Gmbh | Sicherheitseinrichtung für Werkzeugmaschinen |
CN112936267B (zh) * | 2021-01-29 | 2022-05-27 | 华中科技大学 | 一种人机协作智能制造方法及系统 |
EP4170438A1 (de) * | 2022-06-07 | 2023-04-26 | Pimu Llc | Sicherheitssteuerungssystem |
EP4401045A1 (de) | 2023-01-10 | 2024-07-17 | Sick Ag | Konfiguration eines 3d-sensors für eine sichere objektverfolgung |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0901105A1 (de) * | 1997-08-05 | 1999-03-10 | Canon Kabushiki Kaisha | Bildverarbeitungsvorrichtung |
EP1061487A1 (de) * | 1999-06-17 | 2000-12-20 | Istituto Trentino Di Cultura | Verfahren und Vorrichtung zur automatischen Kontrolle einer Raumregion |
US6347261B1 (en) * | 1999-08-04 | 2002-02-12 | Yamaha Hatsudoki Kabushiki Kaisha | User-machine interface system for enhanced interaction |
US6980690B1 (en) * | 2000-01-20 | 2005-12-27 | Canon Kabushiki Kaisha | Image processing apparatus |
DE10245720A1 (de) * | 2002-09-24 | 2004-04-01 | Pilz Gmbh & Co. | Verfahren un Vorrichtung zum Absichern eines Gefahrenbereichs |
US7729511B2 (en) | 2002-09-24 | 2010-06-01 | Pilz Gmbh & Co. Kg | Method and device for safeguarding a hazardous area |
ATE335958T1 (de) * | 2002-09-24 | 2006-09-15 | Pilz Gmbh & Co Kg | Verfahren und vorrichtung zum absichern eines gefahrenbereichs |
DE10259698A1 (de) * | 2002-12-18 | 2004-07-08 | Pilz Gmbh & Co. | Darstellungsbereich eines automobilen Nachtsichtsystems |
JP4066168B2 (ja) * | 2003-03-13 | 2008-03-26 | オムロン株式会社 | 侵入物監視装置 |
ITUD20030118A1 (it) * | 2003-05-29 | 2004-11-30 | Casagrande Spa | Dispositivo di sicurezza per macchine operatrici e metodo di riconoscimento della presenza di persone, utilizzante tale dispositivo di sicurezza. |
US6956469B2 (en) * | 2003-06-13 | 2005-10-18 | Sarnoff Corporation | Method and apparatus for pedestrian detection |
US6950733B2 (en) * | 2003-08-06 | 2005-09-27 | Ford Global Technologies, Llc | Method of controlling an external object sensor for an automotive vehicle |
SE526119C2 (sv) * | 2003-11-24 | 2005-07-05 | Abb Research Ltd | Metod och system för programmering av en industrirobot |
JP4811019B2 (ja) * | 2005-01-17 | 2011-11-09 | 株式会社豊田中央研究所 | 衝突挙動制御装置 |
DE102006013598A1 (de) | 2005-09-01 | 2007-03-15 | Daimlerchrysler Ag | Verfahren und Vorrichtung zur Korrespondenzbestimmung, vorzugsweise zur dreidimensionalen Rekonstruktion einer Szene |
KR100722229B1 (ko) * | 2005-12-02 | 2007-05-29 | 한국전자통신연구원 | 사용자 중심형 인터페이스를 위한 가상현실 상호작용 인체모델 즉석 생성/제어 장치 및 방법 |
KR100682987B1 (ko) * | 2005-12-08 | 2007-02-15 | 한국전자통신연구원 | 선형판별 분석기법을 이용한 3차원 동작인식 장치 및 그방법 |
GB0603106D0 (en) * | 2006-02-16 | 2006-03-29 | Virtual Mirrors Ltd | Design and production of garments |
-
2006
- 2006-10-10 DE DE102006048166A patent/DE102006048166A1/de not_active Withdrawn
-
2007
- 2007-04-04 JP JP2009522108A patent/JP2009545789A/ja active Pending
- 2007-04-04 EP EP07723978A patent/EP2046537A2/de not_active Ceased
- 2007-04-04 CN CN2007800332128A patent/CN101511550B/zh active Active
- 2007-04-04 WO PCT/EP2007/003037 patent/WO2008014831A2/de active Application Filing
-
2009
- 2009-01-30 US US12/362,745 patent/US8154590B2/en not_active Expired - Fee Related
Non-Patent Citations (3)
Title |
---|
See also references of WO2008014831A2 * |
THEODORIDIS, SERGIOS; KOUTROUMBAS, KONSTANTINOS: "Pattern Recognition", 1999, ACADEMIC PRESS, San Diego, ISBN: 978-0-12-686140-2, article "Chapter 13: Clustering Algorithms II: Hierarchical Algorithms", pages: 403 - 440, 276850 * |
THEODORIDIS, SERGIOS; KOUTROUMBAS, KONSTANTINOS: "Pattern Recognition", 1999, ACADEMIC PRESS, San Diego, USA, ISBN: 978-0-12-686140-2, article "Chapter 11: Clustering: Basic Concepts", pages: 351 - 382, 276850 * |
Also Published As
Publication number | Publication date |
---|---|
CN101511550B (zh) | 2013-12-18 |
US8154590B2 (en) | 2012-04-10 |
US20090237499A1 (en) | 2009-09-24 |
WO2008014831A2 (de) | 2008-02-07 |
WO2008014831A3 (de) | 2008-04-03 |
JP2009545789A (ja) | 2009-12-24 |
DE102006048166A1 (de) | 2008-02-07 |
CN101511550A (zh) | 2009-08-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2008014831A2 (de) | Verfahren zur beobachtung einer person in einem industriellen umfeld | |
DE102014105351B4 (de) | Detektion von menschen aus mehreren ansichten unter verwendung einer teilumfassenden suche | |
EP3682367B1 (de) | Gestensteuerung zur kommunikation mit einem autonomen fahrzeug auf basis einer einfachen 2d kamera | |
EP3701340B1 (de) | Überwachungsvorrichtung, industrieanlage, verfahren zur überwachung sowie computerprogramm | |
EP2344980B1 (de) | Vorrichtung, verfahren und computerprogramm zur erkennung einer geste in einem bild, sowie vorrichtung, verfahren und computerprogramm zur steuerung eines geräts | |
DE102006048163B4 (de) | Kamerabasierte Überwachung bewegter Maschinen und/oder beweglicher Maschinenelemente zur Kollisionsverhinderung | |
DE102014106210A1 (de) | Probabilistische Personennachführung unter Verwendung der Mehr- Ansichts-Vereinigung | |
DE102014106211A1 (de) | Sichtbasierte Mehrkamera-Fabriküberwachung mit dynamischer Integritätsbewertung | |
EP2174260A2 (de) | Vorrichtung zur erkennung und/oder klassifizierung von bewegungsmustern in einer bildsequenz von einer überwachungsszene, verfahren sowie computerprogramm | |
DE102007007576A1 (de) | Verfahren und Vorrichtung zum Sichern eines Arbeitsraums | |
DE10325762A1 (de) | Bildverarbeitungssystem für ein Fahrzeug | |
EP1586805A1 (de) | Verfahren zur Überwachung eines Überwachungsbereichs | |
EP0973121A2 (de) | Bildverarbeitungsverfahren und Vorrichtungen zur Erkennung von Objekten im Verkehr | |
DE10215885A1 (de) | Automatische Prozesskontrolle | |
DE102019211770B3 (de) | Verfahren zur rechnergestützten Erfassung und Auswertung eines Arbeitsablaufs, bei dem ein menschlicher Werker und ein robotisches System wechselwirken | |
EP3664973B1 (de) | Handhabungsanordnung mit einer handhabungseinrichtung zur durchführung mindestens eines arbeitsschritts sowie verfahren und computerprogramm | |
Morales-Álvarez et al. | Automatic analysis of pedestrian’s body language in the interaction with autonomous vehicles | |
DE102009026091A1 (de) | Verfahren und System zur Überwachung eines dreidimensionalen Raumbereichs mit mehreren Kameras | |
DE102020201939A1 (de) | Verfahren und Vorrichtung zur Bewertung eines Bildklassifikators | |
DE102008060768A1 (de) | Verfahren zur Klassifizierung von detektierten artikulierten Objekten und/oder Teilen des artikulierten Objektes | |
DE102009031804A1 (de) | Verfahren zur Objekterkennung und Objektverfolgung | |
WO2007048674A1 (de) | System und verfahren für ein kamerabasiertes tracking | |
DE102019009080A1 (de) | Einrichten von Sensororten in kamerabasierten Sensoren | |
EP3968298A1 (de) | Modellierung einer situation | |
DE102020209983A1 (de) | Verfahren zum Erkennen eines Objekts aus Eingabedaten unter Verwendung von relationalen Attributen |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20090212 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR |
|
AX | Request for extension of the european patent |
Extension state: AL BA HR MK RS |
|
17Q | First examination report despatched |
Effective date: 20090511 |
|
DAX | Request for extension of the european patent (deleted) | ||
REG | Reference to a national code |
Ref country code: HK Ref legal event code: DE Ref document number: 1129631 Country of ref document: HK |
|
RIN1 | Information on inventor provided before grant (corrected) |
Inventor name: SAGERER, GERHARD Inventor name: OTT, RAINER Inventor name: SCHMIDT, JOACHIM Inventor name: KUMMERT, FRANZ Inventor name: WOEHLER, CHRISTIAN Inventor name: PROGSCHA, WERNER Inventor name: KRUEGER, LARS Inventor name: KRESSEL, ULRICH |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R003 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
|
18R | Application refused |
Effective date: 20130614 |
|
REG | Reference to a national code |
Ref country code: HK Ref legal event code: WD Ref document number: 1129631 Country of ref document: HK |