EP2327057A2 - A system and a method for identifying human behavioural intention based on an effective motion analysis - Google Patents
A system and a method for identifying human behavioural intention based on an effective motion analysis
- Publication number
- EP2327057A2 (application EP09816484A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- component
- activity
- distance
- movement
- gait
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19613—Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
- G08B13/19615—Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion wherein said pattern is defined by the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
- G06V40/25—Recognition of walking or running movements, e.g. gait recognition
Definitions
- the present invention relates to a system and a method for identifying human behavioral intention based on effective human motion analysis.
- Passive reflective markers are placed on subjects, in this case humans, at specific anatomical landmarks. As the subjects walk through a lab, the three-dimensional location of each marker is detected by multiple infrared cameras. A biomechanics model is applied to the marker series to calculate the three-dimensional motion of each body segment. The processed data generates a graphical representation of each joint in all three planes, expressed in terms of a gait cycle.
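As an illustration of the marker-based biomechanics step described above, a joint angle in one plane can be derived from three 3-D marker positions; this is a minimal sketch (the helper name and inputs are hypothetical — the patent does not prescribe a particular computation):

```python
import numpy as np

def joint_angle(proximal, joint, distal):
    """Angle (degrees) at `joint` between the two body-segment vectors.

    Each argument is a 3-D marker position, e.g. hip, knee and ankle
    markers to obtain the knee angle.
    """
    u = np.asarray(proximal, dtype=float) - np.asarray(joint, dtype=float)
    v = np.asarray(distal, dtype=float) - np.asarray(joint, dtype=float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip guards against floating-point values just outside [-1, 1].
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# A fully extended leg: hip, knee and ankle are collinear -> 180 degrees.
straight = joint_angle([0, 0, 1.0], [0, 0, 0.5], [0, 0, 0.0])
# A right-angle knee flexion.
bent = joint_angle([0, 0, 0.5], [0, 0, 0.0], [0.5, 0, 0.0])
```

Tracking such angles frame by frame over one gait cycle yields the per-joint curves the lab systems plot.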
- the present invention provides a system for identifying a behavioral intention of a human being based on an effective motion analysis
- the system includes an image acquisition component, wherein the image acquisition component acquires a plurality of object images in sequence
- an activity enrollment component includes a background and foreground detection component, an object detection component, an object partitioning component, a key point extraction component, a gait feature extraction component, an activity registration component and an activity storage means
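The background and foreground detection component named above is commonly realised by frame differencing (the classification G08B13/19602 mentions frame subtraction). A minimal sketch, assuming greyscale images as nested lists and a hypothetical threshold value:

```python
def foreground_mask(frame, background, threshold=25):
    """Frame-differencing sketch of background/foreground detection.

    `frame` and `background` are greyscale images as nested lists of
    0-255 ints; a pixel is marked foreground (1) when its absolute
    difference from the background model exceeds `threshold`.
    """
    return [
        [1 if abs(p - b) > threshold else 0 for p, b in zip(frow, brow)]
        for frow, brow in zip(frame, background)
    ]

background = [[10, 10, 10], [10, 10, 10]]
frame      = [[10, 200, 10], [10, 210, 12]]   # a bright moving object in column 1
mask = foreground_mask(frame, background)
```

The resulting binary mask is what the downstream object detection and partitioning components would operate on.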
- an activity detection component includes a background and foreground detection component, an object detection component, an object partitioning component, a key point extraction component, a gait feature extraction component, an activity matching component and an activity storage means, characterized in that the gait feature extraction component computes features related to the movement of an upper limb of the human in relation to the movement of a lower limb in a gait cycle, wherein the gait cycle consists of a stance phase and a swing phase.
- the present invention also provides a method for identifying a behavioral intention of a human being based on an effective motion analysis
- the method includes acquiring a plurality of object images in sequence, enrolling data in a background and foreground detection component, an object detection component, an object partitioning component, a key point extraction component, a gait feature extraction component, an activity registration component and an activity storage means, and detecting and matching features using a background and foreground detection component, an object detection component, an object partitioning component, a key point extraction component, a gait feature extraction component, an activity matching component and an activity storage means, characterized in that the method further includes calculating features related to the movement of an upper limb of the human in relation to the movement of a lower limb in a gait cycle using the gait feature extraction components, wherein the gait cycle consists of a stance phase and a swing phase.
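The enrollment and detection phases above share most components and differ only in whether gait features are registered or matched. A minimal sketch of the activity storage means (the class name, feature encoding, and mean-absolute-difference metric with a tolerance are illustrative assumptions — the patent does not specify a matching metric):

```python
from dataclasses import dataclass, field

@dataclass
class ActivityDatabase:
    """Activity storage means: registered gait features keyed by intention label."""
    entries: dict = field(default_factory=dict)

    def register(self, label, features):
        """Activity registration component: store features for a labelled activity."""
        self.entries[label] = list(features)

    def match(self, features, tol=0.5):
        """Activity matching component: nearest registered activity, or None.

        Uses mean absolute difference between feature vectors and accepts
        the best match only when it is within `tol`.
        """
        best, best_d = None, float("inf")
        for label, ref in self.entries.items():
            d = sum(abs(a - b) for a, b in zip(features, ref)) / len(ref)
            if d < best_d:
                best, best_d = label, d
        return best if best_d <= tol else None

db = ActivityDatabase()
db.register("loitering", [0.2, 0.1, 0.3])    # enrollment phase
found = db.match([0.25, 0.1, 0.28])          # detection phase
```

In the enrollment phase only `register` is called; the detection phase calls `match` on features freshly extracted from live video.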
- Fig. 1 is a block diagram representation of a system and a method for identifying behavioral intention of a human being based on an effective motion analysis based on the preferred embodiment of the present invention
- Fig. 2 shows a representation of an upper limb motion in a stance phase of a gait cycle
- Fig. 3 shows a representation of an upper limb motion in a swing phase of a gait cycle
- Fig. 4 is a comparison table between upper limb movement and lower limb movement in a typical gait cycle
- Fig. 10 is a diagram illustrating the computation of a distance of an upper limb movement in a swing phase
- Fig. 11 is a diagram illustrating the computation of a distance of a lower limb movement in a swing phase.
- an object partitioning component (52) divides the object of interest into four main parts: head, torso, arms and legs. From the arms and legs, a key point extraction component (53) computes important points on these two parts.
- The important points may include the corner points, high curvature points and joining points that are detected from the outline of the arms and legs.
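The corner and high-curvature points mentioned above can be approximated by a turning-angle test along the outline polygon; this is a generic sketch (the function name and threshold are assumptions, not the patent's algorithm):

```python
import math

def high_curvature_points(contour, angle_threshold_deg=45.0):
    """Return contour vertices where the outline turns sharply.

    `contour` is a closed polygon as a list of (x, y) points; a vertex is
    kept when the direction change between its two incident edges exceeds
    the threshold - a simple stand-in for corner / high-curvature detection.
    """
    n = len(contour)
    keypoints = []
    for i in range(n):
        x0, y0 = contour[i - 1]
        x1, y1 = contour[i]
        x2, y2 = contour[(i + 1) % n]
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        turn = abs(math.degrees(a2 - a1))
        turn = min(turn, 360.0 - turn)   # wrap to [0, 180]
        if turn > angle_threshold_deg:
            keypoints.append((x1, y1))
    return keypoints

# A square outline: every corner turns 90 degrees, so all four are kept.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
corners = high_curvature_points(square)
```

Applied to the silhouette outline of an arm or leg, such points would approximate elbows, wrists, knees and ankles.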
- the gait feature extraction component (54) computes the features related to movement of the arms and legs.
- the computed gait features from the sequence of images are registered to a particular activity or intention through an activity registration component (55).
- the gait features and the registered activity or intention are stored in the activity database (56).
- Fig. 6 depicts a detailed architecture of the activity detection component with an assumption that the raw video images (68) from the image acquisition component are available in real time.
- the computed gait features from the sequence of video images (68) are compared with the registered gait features in the activity database (67). A matching process is then performed.
- Table 2: Six main features in the gait feature extraction component.
- the distance of arm movement in one gait cycle is computed by adding the circular distance of feature 1 to that of feature 3.
- the distance of leg movement in one gait cycle is computed by adding the horizontal distance of feature 2 to that of feature 4.
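The two per-cycle distance computations can be sketched as follows, with arc length standing in for the "circular distance" of the arm about the shoulder (all numbers and helper names are hypothetical; Figs. 10 and 11 define the actual geometry):

```python
import math

def arm_swing_distance(radius, angle_rad):
    """Circular (arc-length) distance travelled by the wrist about the shoulder."""
    return radius * angle_rad

def arm_distance_per_cycle(stance_arc, swing_arc):
    """Feature 1 (stance-phase arc) plus feature 3 (swing-phase arc)."""
    return stance_arc + swing_arc

def leg_distance_per_cycle(stance_dx, swing_dx):
    """Feature 2 plus feature 4: horizontal ankle displacement per gait cycle."""
    return stance_dx + swing_dx

# Hypothetical numbers: a 0.6 m arm swinging 30 degrees in each phase,
# and the ankle covering 0.4 m in stance and 0.7 m in swing.
arc = arm_swing_distance(0.6, math.radians(30))
arm_total = arm_distance_per_cycle(arc, arc)
leg_total = leg_distance_per_cycle(0.4, 0.7)
```

The ratio of `arm_total` to `leg_total` is the kind of upper-limb-versus-lower-limb relation the gait feature extraction component is said to compute.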
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Social Psychology (AREA)
- Psychiatry (AREA)
- Health & Medical Sciences (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Analysis (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
MYPI20083761A MY159289A (en) | 2008-09-24 | 2008-09-24 | A system and a method for identifying human behavioural intention based on an effective motion analysis |
PCT/MY2009/000153 WO2010036091A2 (en) | 2008-09-24 | 2009-09-18 | A system and a method for identifying human behavioural intention based on an effective motion analysis |
Publications (2)
Publication Number | Publication Date |
---|---|
EP2327057A2 true EP2327057A2 (en) | 2011-06-01 |
EP2327057A4 EP2327057A4 (en) | 2017-11-22 |
Family
ID=42060321
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP09816484.1A Withdrawn EP2327057A4 (en) | 2008-09-24 | 2009-09-18 | A system and a method for identifying human behavioural intention based on an effective motion analysis |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP2327057A4 (en) |
CN (1) | CN102224526A (en) |
MY (1) | MY159289A (en) |
WO (1) | WO2010036091A2 (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8744522B2 (en) * | 2009-10-21 | 2014-06-03 | Xerox Corporation | Portable security system built into cell phones |
CN102119877B (en) * | 2010-12-15 | 2012-11-07 | 河北工业大学 | Method for creating expert knowledge base for automatically training lower artificial limbs |
US20150030252A1 (en) * | 2011-12-16 | 2015-01-29 | The Research Foundation For The State University Of New York | Methods of recognizing activity in video |
JP5686108B2 (en) * | 2012-02-24 | 2015-03-18 | 株式会社ダイフク | Sorting equipment provided with an erroneous work prevention device and an erroneous work prevention device |
CN102881100B (en) * | 2012-08-24 | 2017-07-07 | 济南纳维信息技术有限公司 | Entity StoreFront anti-thefting monitoring method based on video analysis |
CN103886588B (en) * | 2014-02-26 | 2016-08-17 | 浙江大学 | A kind of feature extracting method of 3 D human body attitude projection |
CN107423730B (en) * | 2017-09-20 | 2024-02-13 | 湖南师范大学 | Human gait behavior active detection and recognition system and method based on semantic folding |
CN108021865B (en) | 2017-11-03 | 2020-02-21 | 阿里巴巴集团控股有限公司 | Method and device for identifying illegal behaviors in unattended scene |
US10387737B1 (en) * | 2018-02-02 | 2019-08-20 | GM Global Technology Operations LLC | Rider rating systems and methods for shared autonomous vehicles |
JP7283037B2 (en) * | 2018-07-26 | 2023-05-30 | ソニーグループ株式会社 | Information processing device, information processing method, and program |
CN112464734B (en) * | 2020-11-04 | 2023-09-15 | 昆明理工大学 | Automatic identification method for walking motion characteristics of quadruped based on vision |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6369794B1 (en) * | 1998-09-09 | 2002-04-09 | Matsushita Electric Industrial Co., Ltd. | Operation indication outputting device for giving operation indication according to type of user's action |
US7330566B2 (en) * | 2003-05-15 | 2008-02-12 | Microsoft Corporation | Video-based gait recognition |
US7212651B2 (en) * | 2003-06-17 | 2007-05-01 | Mitsubishi Electric Research Laboratories, Inc. | Detecting pedestrians using patterns of motion and appearance in videos |
JP5028751B2 (en) * | 2005-06-09 | 2012-09-19 | ソニー株式会社 | Action recognition device |
-
2008
- 2008-09-24 MY MYPI20083761A patent/MY159289A/en unknown
-
2009
- 2009-09-18 CN CN2009801470533A patent/CN102224526A/en active Pending
- 2009-09-18 EP EP09816484.1A patent/EP2327057A4/en not_active Withdrawn
- 2009-09-18 WO PCT/MY2009/000153 patent/WO2010036091A2/en active Application Filing
Non-Patent Citations (1)
Title |
---|
See references of WO2010036091A2 * |
Also Published As
Publication number | Publication date |
---|---|
EP2327057A4 (en) | 2017-11-22 |
MY159289A (en) | 2016-12-30 |
CN102224526A (en) | 2011-10-19 |
WO2010036091A3 (en) | 2010-06-24 |
WO2010036091A2 (en) | 2010-04-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2010036091A2 (en) | A system and a method for identifying human behavioural intention based on an effective motion analysis | |
Bouchrika et al. | On using gait in forensic biometrics | |
CN111881887A (en) | Multi-camera-based motion attitude monitoring and guiding method and device | |
US9330470B2 (en) | Method and system for modeling subjects from a depth map | |
US9235753B2 (en) | Extraction of skeletons from 3D maps | |
CN108875507B (en) | Pedestrian tracking method, apparatus, system, and computer-readable storage medium | |
Kovač et al. | Human skeleton model based dynamic features for walking speed invariant gait recognition | |
Dhulekar et al. | Motion estimation for human activity surveillance | |
CN113196283A (en) | Attitude estimation using radio frequency signals | |
El-Sallam et al. | A low cost 3D markerless system for the reconstruction of athletic techniques | |
CN111062295A (en) | Area positioning method and device, and storage medium | |
CN115953838A (en) | Gait image tracking and identifying system based on MLP-Yolov5 network | |
Liu et al. | Analysis of human walking posture using a wearable camera | |
Krzeszowski et al. | The application of multiview human body tracking on the example of hurdle clearance | |
CN107045725A (en) | A kind of human body motion capture method | |
Bouchrika et al. | Recognizing people in non-intersecting camera views | |
CN114372996A (en) | Pedestrian track generation method oriented to indoor scene | |
Hachaj et al. | How Repetitive are Karate Kicks Performed by Skilled Practitioners? | |
Goffredo et al. | Performance analysis for gait in camera networks | |
Lok et al. | Model-based human motion analysis in monocular video | |
Xu et al. | A video tracking system for limb motion measurement in small animals | |
Priydarshi et al. | Speed invariant, human gait based recognition system for video surveillance security | |
Mukhtar et al. | RETRACTED: Gait Analysis of Pedestrians with the Aim of Detecting Disabled People | |
CN111435535A (en) | Method and device for acquiring joint point information | |
Bouchrika et al. | Markerless extraction of gait features using haar-like template for view-invariant biometrics |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| | 17P | Request for examination filed | Effective date: 20110316 |
| | AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR |
| | AX | Request for extension of the european patent | Extension state: AL BA RS |
| | DAX | Request for extension of the european patent (deleted) | |
| | A4 | Supplementary search report drawn up and despatched | Effective date: 20171023 |
| | RIC1 | Information provided on ipc code assigned before grant | Ipc: G06K 9/00 20060101AFI20171017BHEP; Ipc: G08B 13/196 20060101ALI20171017BHEP |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| | 18D | Application deemed to be withdrawn | Effective date: 20180523 |