WO2018008702A1 - Behavior detection system - Google Patents

Behavior detection system

Info

Publication number
WO2018008702A1
Authority
WO
WIPO (PCT)
Prior art keywords
motion
recognition
action
detection system
unit
Prior art date
Application number
PCT/JP2017/024714
Other languages
English (en)
Japanese (ja)
Inventor
洋登 永吉
大介 勝又
孝史 野口
健太郎 大西
Original Assignee
株式会社日立システムズ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立システムズ filed Critical 株式会社日立システムズ
Publication of WO2018008702A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion

Definitions

  • The present invention relates to a behavior detection system, and more particularly to a technique effective for motion analysis that recognizes and analyzes human motion.
  • Patent Document 1 describes, for example, comparing a feature quantity extracted from a sensor attached to a person with feature quantities stored in a feature quantity database to recognize the person's action.
  • An object of the present invention is to provide a technique capable of identifying a motion with high accuracy even when the number of types of motion to be recognized increases.
  • A typical behavior detection system recognizes an elemental motion of a recognition target and analyzes a motion meaning indicating what the recognized elemental motion means.
  • This behavior detection system includes a motion model storage unit, a photographing unit, a position recognition unit, and a motion recognition unit.
  • The motion model storage unit stores motion models representing the elemental motions to be recognized as numerical information.
  • The photographing unit captures the work motion of the recognition target.
  • The position recognition unit recognizes the position of the recognition target from the image information acquired by the photographing unit.
  • The behavior detection system also has a work position motion table that associates a position of the recognition target, the elemental motions performed at that position, and the motion meanings of those elemental motions.
  • The motion recognition unit refers to the work position motion table, acquires the elemental motions associated with the position recognized by the position recognition unit, and extracts the motion models corresponding to those elemental motions from the motion model storage unit.
  • The motion recognition unit recognizes the elemental motion of the recognition target by comparing the extracted motion models with the motion of the recognition target. It then extracts the motion meaning from the recognition target's position and the recognized elemental motion, again with reference to the work position motion table.
  • The motion recognition unit generates recognition information that associates the motion meaning corresponding to the acquired motion model, the position detected by the position recognition unit, and the time at which the elemental motion of the recognition target was recognized.
  • Needless to say, the constituent elements are not necessarily indispensable unless otherwise specified or clearly essential in principle.
  • FIG. 1 is an explanatory diagram illustrating an example of the configuration of a behavior detection system 10 according to an embodiment.
  • The behavior detection system 10 recognizes the position and elemental motion of a worker, who is the recognition target, and stores the meaning of the recognized elemental motion, the time, and the like in association with one another. As shown in FIG. 1, the behavior detection system 10 includes a video photographing device 11, a recognition processing unit 12, and a display device 13.
  • The video photographing device 11, serving as the photographing unit, is, for example, a web camera or surveillance camera capable of capturing color images, a depth sensor capable of measuring the distance to the subject, or the like.
  • The recognition processing unit 12 extracts information related to a person's movement (hereinafter, motion-related information) from the image information captured by the video photographing device 11.
  • The motion-related information is, for example, time-series information of human joint positions, or information abstracted from it.
  • The abstracted information is, for example, the speed of joint movement, the distance between joints, a frequency-domain representation of joint movement speed, or a combination of these; all are expressed as numerical information. A feature extraction of this kind is sketched below.
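  • As a concrete illustration, the following is a minimal sketch of such an abstraction, assuming joint positions arrive as a (frames x joints x 3) array; the function name, frame rate, and exact feature choices are illustrative assumptions, not specified by the patent.

```python
import numpy as np

def motion_features(joints: np.ndarray, fps: float = 30.0) -> np.ndarray:
    """Abstract a joint-position time series into a numerical feature vector:
    per-joint speeds, inter-joint distances, and a frequency-domain summary
    of joint movement speed.

    joints: array of shape (frames, n_joints, 3), positions in meters.
    """
    # Speed of joint movement: frame-to-frame displacement times frame rate.
    speeds = np.linalg.norm(np.diff(joints, axis=0), axis=2) * fps
    # Distance between joints: pairwise distances averaged over time.
    diffs = joints[:, :, None, :] - joints[:, None, :, :]
    pair_dists = np.linalg.norm(diffs, axis=3).mean(axis=0)
    upper = np.triu_indices(joints.shape[1], k=1)   # each pair counted once
    # Frequency representation: magnitude spectrum of the speed signal.
    spectrum = np.abs(np.fft.rfft(speeds, axis=0)).mean(axis=1)
    # Combine everything into one numerical feature vector.
    return np.concatenate([speeds.mean(axis=0), pair_dists[upper], spectrum])
```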
  • The display device 13 is, for example, a display such as a liquid crystal monitor, and shows recognition information (described later) under the control of the recognition processing unit 12.
  • The recognition processing unit 12 includes a position recognition unit 15, a motion recognition unit 16, and a storage unit 17.
  • The position recognition unit 15 recognizes the position of the worker, that is, where the worker is in the workplace, from the image information captured by the video photographing device 11.
  • This position recognition can be performed, for example, by giving in advance the angle of view of the video photographing device 11, its distance to the floor, and its angle relative to the floor. If the photographing device is an ordinary color camera, the position in the workplace can be recognized from the size and position of the subject in the image. With a depth camera, depth information can also be used, so the position can be recognized with higher accuracy. A geometric sketch follows.
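  • The following is a minimal sketch of such camera-geometry-based position recognition for an ordinary color camera, assuming a pinhole model, a level floor, and the subject's feet visible at pixel (u, v); the patent does not prescribe a particular method, and the lateral-offset computation here is a small-angle approximation.

```python
import math

def pixel_to_floor(u, v, img_w, img_h, fov_h_deg, fov_v_deg,
                   cam_height, pitch_deg):
    """Project the image position of a subject's feet (u, v) onto floor
    coordinates (forward, lateral) in meters, given the camera's height
    above the floor, its downward pitch, and its field of view."""
    # Angle of the viewing ray below horizontal, from the pixel row.
    ray_down = math.radians(pitch_deg) + math.atan(
        (v - img_h / 2) / (img_h / 2) * math.tan(math.radians(fov_v_deg) / 2))
    if ray_down <= 0:
        raise ValueError("ray does not intersect the floor")
    forward = cam_height / math.tan(ray_down)   # distance along the floor
    # Lateral offset from the pixel column (ignores pitch coupling).
    ray_side = math.atan(
        (u - img_w / 2) / (img_w / 2) * math.tan(math.radians(fov_h_deg) / 2))
    lateral = forward * math.tan(ray_side)
    return forward, lateral

# Example: 640x480 camera, 60x45 degree FOV, 3 m high, tilted 30 degrees down.
print(pixel_to_floor(400, 300, 640, 480, 60.0, 45.0, 3.0, 30.0))
```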
  • The motion recognition unit 16 recognizes the worker's elemental motion from the image information of the video photographing device 11.
  • The storage unit 17 is a nonvolatile storage device such as a hard disk drive (HDD) or flash memory, and stores various types of information.
  • The storage unit 17 holds a place motion table 20, a motion model storage unit 21, and a recognition information storage unit 22.
  • The place motion table 20, which serves as the work position motion table, associates each position with the elemental motions of the worker performed at that position.
  • The motion model storage unit 21 stores the motion models of the elemental motions to be detected. A motion model is a series of motion-related information captured while, for example, a worker performs a standard work motion.
  • FIG. 2 is an explanatory diagram illustrating an example of the data structure of the place motion table 20 in the storage unit 17 of FIG. 1.
  • For each elemental motion of the worker to be recognized, the place motion table 20 stores the area in which the elemental motion can be performed and the meaning the motion carries in that area. As shown in FIG. 2, the place motion table 20 consists of "place", "motion ID", and "motion meaning" fields; a data-structure sketch follows below.
  • "Place" indicates a position or area of the workplace, represented by coordinates, for example.
  • "Motion ID" identifies the elemental motion of the worker to be recognized at that "place"; in the example of FIG. 2 it is a number.
  • "Motion meaning" indicates the meaning of the worker's elemental motion to be recognized at that place.
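  • As an illustration, the table might be held as follows; the areas, IDs, and meanings are invented examples (only "turn the lever" appears in the text), not the patent's actual data.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PlaceMotionRow:
    place: tuple        # workplace area as ((x0, y0), (x1, y1)) corners
    motion_id: int      # elemental motion to recognize in this area
    meaning: str        # what that motion means at this place

PLACE_MOTION_TABLE = [
    PlaceMotionRow(((0.0, 0.0), (2.0, 2.0)), 1, "turn the lever"),
    PlaceMotionRow(((0.0, 0.0), (2.0, 2.0)), 2, "press the button"),
    PlaceMotionRow(((2.0, 0.0), (4.0, 2.0)), 3, "inspect the gauge"),
]

def motions_at(x: float, y: float) -> list[PlaceMotionRow]:
    """Return the rows whose area contains the worker's position (x, y)."""
    return [r for r in PLACE_MOTION_TABLE
            if r.place[0][0] <= x < r.place[1][0]
            and r.place[0][1] <= y < r.place[1][1]]
```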
  • FIG. 3 is an explanatory diagram showing another example of the data structure of the place motion table 20 of FIG. 2.
  • FIG. 4 is an explanatory diagram showing an example of the data structure of a motion model stored in the motion model storage unit 21 in the storage unit 17 of FIG. 1.
  • As shown in FIG. 4, a motion model stored in the motion model storage unit 21 consists of a "motion ID" and a "motion model".
  • The "motion model" is the numerical representation, described above, of an elemental motion of the worker to be recognized.
  • Motion-related information captured once may be used directly as the motion model, or motion-related information captured multiple times may be used. In the latter case, the individual captures may be stored as motion models as they are, or their average may be used, as sketched below.
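  • A minimal sketch of model construction under the averaging variant; the feature length and example vectors are placeholders standing in for real captures.

```python
import numpy as np

def build_motion_model(captures: list[np.ndarray]) -> np.ndarray:
    """Build a motion model from one or more captures of a standard work
    motion, each already abstracted into a feature vector: a single capture
    is used as-is, multiple captures are averaged."""
    if len(captures) == 1:
        return captures[0]
    return np.mean(np.stack(captures), axis=0)

# Keyed by "motion ID", as in FIG. 4.
MOTION_MODELS = {
    1: build_motion_model([np.array([0.1, 0.5]), np.array([0.3, 0.7])]),
    2: build_motion_model([np.array([0.9, 0.2])]),
}
```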
  • The recognition information storage unit 22 stores the recognition information generated by the motion recognition unit 16.
  • The recognition information stored in the recognition information storage unit 22 consists of, for example, "time", "place", "motion ID", and "motion meaning".
  • "Time" is, for example, the time at which the worker started the work; alternatively, it may span from the start of the work to its end. In practice it is, for example, the time at which the motion recognition unit 16 recognized the worker's elemental motion.
  • "Place" indicates the position where the worker is working, "motion ID" identifies the recognized elemental motion of the worker, and "motion meaning" indicates what that "motion ID" means at the corresponding "place". A record sketch follows.
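  • A minimal sketch of one recognition information record; the field types are assumptions.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class RecognitionInfo:
    time: datetime     # when the elemental motion was recognized
    place: tuple       # worker position, e.g. floor coordinates (x, y)
    motion_id: int     # recognized elemental motion
    meaning: str       # what the motion means at that place

# Stands in for the recognition information storage unit 22.
RECOGNITION_STORE: list[RecognitionInfo] = []
```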
  • FIG. 5 is a flowchart showing an example of the operation of the behavior detection system 10 of FIG. 1. The behavior recognition processing by the behavior detection system 10 is described with reference to FIG. 5.
  • The behavior recognition process recognizes the worker's work position and elemental motion, generates recognition information, and stores it.
  • FIG. 5 illustrates the case where the behavior recognition process described below is executed by hardware such as the position recognition unit 15 and the motion recognition unit 16. Alternatively, the process may be executed as software, stored in program form in a program storage memory (not shown) provided in the recognition processing unit 12 of FIG. 1. When executed as software, it is run by, for example, a CPU (Central Processing Unit) (not shown) included in the recognition processing unit 12.
  • The position recognition unit 15 recognizes the position of the worker who started the work, for example by the method described above (step S101).
  • The position recognition unit 15 outputs the recognized worker position to the motion recognition unit 16.
  • The motion recognition unit 16 refers to the place motion table 20 of FIG. 2 and acquires the recognition targets corresponding to the position recognized by the position recognition unit 15, that is, the elemental motions of the worker (step S102).
  • The motion recognition unit 16 reads all the motion models corresponding to the "motion IDs" acquired in step S102 from the motion model storage unit 21 of the storage unit 17 (step S103).
  • The motion recognition unit 16 calculates motion-related information from the image information of the worker captured by the video photographing device 11 (step S104). It then obtains, as the recognition result, the "motion ID" of the motion model most similar to the motion-related information calculated in step S104, among the motion models read in step S103 (step S105).
  • Here, for example, the motion recognition result is "1".
  • In this comparison, the motion-related information calculated in step S104 is compared with the motion-related information of each motion model, and the most similar motion model is adopted as the match; one possible implementation is sketched below.
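  • A minimal sketch of step S105, assuming motion-related information is a feature vector; the patent does not fix a similarity measure, so plain Euclidean distance is used here.

```python
import numpy as np

def recognize_motion(observed: np.ndarray,
                     candidates: dict[int, np.ndarray]) -> int:
    """Among the candidate motion models (those whose motion IDs are listed
    for the worker's position), return the motion ID whose model is closest
    to the observed motion-related information."""
    return min(candidates,
               key=lambda mid: np.linalg.norm(observed - candidates[mid]))

# Usage with the MOTION_MODELS sketch above:
# motion_id = recognize_motion(np.array([0.2, 0.6]), MOTION_MODELS)
```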
  • The motion recognition unit 16 acquires from the place motion table 20 the motion meaning corresponding to the recognition result of step S105 (step S106), generates recognition information, and stores it in the recognition information storage unit 22 of the storage unit 17 (step S107).
  • Here, the motion meaning is, for example, "turn the lever".
  • As described above, the recognition information generated by the motion recognition unit 16 associates the time the work was started, the position of the worker, the motion recognition result, and the motion meaning.
  • Because only the motion models associated with the recognized position are compared, the load of the recognition processing on the motion recognition unit 16 can be reduced.
  • As a result, the recognition processing time can be shortened and the performance of the behavior detection system 10 improved.
  • The recognition information stored in the recognition information storage unit 22 may be displayed on the display device 13.
  • In that case, the motion recognition unit 16 reads the recognition information from the recognition information storage unit 22 and causes the display device 13 to display it.
  • In some workplaces, a predetermined work (hereinafter, prescribed work) is arranged to be performed in a predetermined time zone.
  • In such a case, the motion recognition unit 16 may determine whether the prescribed work is being performed and output an alert to the display device 13 or the like when the worker is not performing it.
  • For this purpose, the storage unit 17 includes a prescribed work information storage unit.
  • The prescribed work information storage unit stores prescribed work information indicating the position, time zone, and motion meaning of each prescribed work.
  • The motion recognition unit 16 searches the recognition information stored in the recognition information storage unit 22 and determines whether the motion corresponding to the motion meaning defined in the prescribed work information is being performed at the defined position; as examples, the order of work steps and any excess or shortage of work can be determined.
  • The motion recognition unit 16 may use the time field of the recognition information stored in the recognition information storage unit 22 to extract the corresponding prescribed work information, and determine whether the motion corresponding to the motion meaning defined in that prescribed work information is performed at the defined position.
  • If it is not, the motion recognition unit 16 determines that the prescribed work has not been performed and outputs an alert to that effect to the display device 13 or the like; a sketch of this check follows.
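  • A minimal, self-contained sketch of the prescribed-work check; the record type mirrors the RecognitionInfo sketch above, and all field values are invented examples.

```python
from dataclasses import dataclass
from datetime import datetime, time

@dataclass
class Record:               # one row of the stored recognition information
    time: datetime
    place: tuple
    meaning: str

@dataclass
class PrescribedWork:       # one row of the prescribed work information
    place: tuple
    start: time             # start of the prescribed time zone
    end: time               # end of the prescribed time zone
    meaning: str            # required motion meaning

def prescribed_work_done(work: PrescribedWork, records: list[Record]) -> bool:
    """True if some recognition record matches the prescribed work's place,
    time zone, and motion meaning."""
    return any(r.place == work.place
               and work.start <= r.time.time() <= work.end
               and r.meaning == work.meaning
               for r in records)

# Usage: alert when the 09:00-10:00 lever turn at position (1, 1) is missing.
job = PrescribedWork((1, 1), time(9, 0), time(10, 0), "turn the lever")
log = [Record(datetime(2017, 7, 5, 10, 30), (1, 1), "turn the lever")]
if not prescribed_work_done(job, log):
    print("ALERT: prescribed work not performed")
```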
  • This allows a supervisor to check the worker's operations efficiently.
  • In the configuration of FIG. 1, the motion model storage unit 21 is included in the storage unit 17.
  • Alternatively, the motion model storage unit 21 may be connected via, for example, the Internet.
  • FIG. 6 is an explanatory diagram showing another configuration example of the behavior detection system 10 of FIG. 1.
  • In this configuration, the motion model storage unit 21 consists of, for example, storage on the cloud, so-called cloud storage.
  • The motion model storage unit 21 is connected to the recognition processing unit 12 through a communication line 30 such as the Internet.
  • Motion models may be uploaded to the motion model storage unit 21 through the communication line 30, which makes it unnecessary to input motion models individually in advance at each workplace. This reduces the man-hours required when deploying the behavior detection system 10; a download sketch follows.
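  • A minimal sketch of fetching shared motion models from such cloud storage; the endpoint URL and JSON layout are hypothetical, since the patent only says the unit is reached over a communication line such as the Internet.

```python
import json
import urllib.request

import numpy as np

# Hypothetical endpoint serving {"motion ID": [feature vector], ...} as JSON.
MODEL_STORE_URL = "https://example.com/motion-models.json"

def fetch_motion_models(url: str = MODEL_STORE_URL) -> dict[int, np.ndarray]:
    """Download the shared motion models once at startup, so each workplace
    does not have to capture and register its own."""
    with urllib.request.urlopen(url) as resp:
        raw = json.load(resp)
    return {int(mid): np.asarray(vec) for mid, vec in raw.items()}
```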
  • A part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to that of one embodiment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The purpose of the present invention is to identify an action with high accuracy even when the number of types of actions to be recognized has increased. The behavior detection system comprises a motion model storage unit, a video photographing device, a position recognition section, and a motion recognition section. The motion model storage unit stores motion models representing the elemental motions to be detected. The video photographing device 11 photographs a work activity of the subject to be recognized. The position recognition section 15 detects the position of the subject from the image information captured by the video photographing device 11. The motion recognition section 16 recognizes the elemental motion of the subject. Furthermore, on acquiring a motion model similar to the subject's recognized elemental motion by searching the motion models stored in the motion model storage unit 21, the motion recognition section 16 acquires the motion meaning corresponding to the acquired motion model.
PCT/JP2017/024714 2016-07-07 2017-07-05 Behavior detection system WO2018008702A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016134827A JP6841608B2 (ja) 2016-07-07 2016-07-07 Behavior detection system
JP2016-134827 2016-07-07

Publications (1)

Publication Number Publication Date
WO2018008702A1 (fr) 2018-01-11

Family

ID=60912874

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/024714 WO2018008702A1 (fr) 2016-07-07 2017-07-05 Behavior detection system

Country Status (2)

Country Link
JP (1) JP6841608B2 (fr)
WO (1) WO2018008702A1 (fr)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7254546B2 (ja) * 2018-02-27 2023-04-10 キヤノン株式会社 情報処理装置、情報処理方法及びプログラム
WO2019167775A1 (fr) * 2018-02-27 2019-09-06 キヤノン株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JP7332465B2 (ja) * 2019-12-26 2023-08-23 株式会社日立製作所 動作認識システム、動作認識装置、および領域設定方法
WO2022009489A1 (fr) * 2020-07-10 2022-01-13 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ Dispositif et procédé de reconnaissance comportementale et programme
US20230307106A1 (en) * 2020-08-13 2023-09-28 Hyungsook KIM Movement code-based emotional behavior analysis system
WO2024048741A1 (fr) * 2022-09-01 2024-03-07 味の素株式会社 Dispositif d'estimation de mouvement de cuisson, procédé d'estimation de mouvement de cuisson, et programme d'estimation de mouvement de cuisson

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003167613A (ja) * 2001-11-30 2003-06-13 Sharp Corp 作業管理システム、作業管理方法、及びその方法を実現するためのプログラムを記憶した記録媒体
JP2005202653A (ja) * 2004-01-15 2005-07-28 Canon Inc 動作認識装置及び方法、動物体認識装置及び方法、機器制御装置及び方法、並びにプログラム
WO2013145631A1 (fr) * 2012-03-30 2013-10-03 日本電気株式会社 Dispositif, système, programme et procédé d'analyse de données de ligne de production

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015043141A (ja) * 2013-08-26 2015-03-05 キヤノン株式会社 ジェスチャ認識装置および制御プログラム

Also Published As

Publication number Publication date
JP6841608B2 (ja) 2021-03-10
JP2018005752A (ja) 2018-01-11


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17824305

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17824305

Country of ref document: EP

Kind code of ref document: A1