EP2327057A2 - A system and a method for identifying human behavioural intention based on an effective motion analysis - Google Patents

A system and a method for identifying human behavioural intention based on an effective motion analysis

Info

Publication number
EP2327057A2
EP2327057A2 (application EP09816484A)
Authority
EP
European Patent Office
Prior art keywords
component
activity
distance
movement
gait
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP09816484A
Other languages
German (de)
French (fr)
Other versions
EP2327057A4 (en)
Inventor
Mei Kuan Lim
Kim Meng Liang
Tomas Henrique Maul
Weng Kin Lai
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mimos Bhd
Original Assignee
Mimos Bhd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mimos Bhd filed Critical Mimos Bhd
Publication of EP2327057A2 (en)
Publication of EP2327057A4 (en)

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19613Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
    • G08B13/19615Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion wherein said pattern is defined by the user
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/23Recognition of whole body movements, e.g. for sport training
    • G06V40/25Recognition of walking or running movements, e.g. gait recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Health & Medical Sciences (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Analysis (AREA)

Abstract

With the growing market for video surveillance in security area, there is a need for an automated system which provides a way to track and detect human intention based on a particular human motion. The present invention relates to a system and a method for identifying human behavioral intention based on effective motion analysis wherein, the system obtains a sequence of raw images taken from live scene and processes the raw images in an activity analysis component. The activity analysis component is further provided with an activity enrollment component and activity detection component.

Description

A SYSTEM AND A METHOD FOR IDENTIFYING HUMAN BEHAVIOURAL INTENTION BASED ON AN EFFECTIVE MOTION ANALYSIS
FIELD OF INVENTION
The present invention relates to a system and a method for identifying human behavioral intention based on effective human motion analysis.
BACKGROUND OF INVENTION
There is a growing interest in activity analysis and in predicting human intention from human motion, owing to the immense need for automated video surveillance systems that go beyond merely identifying simple objects. The capability to automatically monitor human motion using computers in security-sensitive areas such as airports, borders and building lobbies is of great interest to security personnel, e.g. police and military.
A recent market study from IMS Research has found that the shift from analogue CCTV to network video surveillance is in full swing. The world market for network video surveillance products increased by an impressive 41.9% in 2006 and is forecast to continue growing strongly for many years to come. By 2010, the combined market for network cameras, video servers and NVRs is forecast to exceed US$2.6 billion.
In China, the market for video servers (video encoders) used in security applications increased by a massive 60% in 2007, according to a report titled "The Chinese Market for CCTV and Video Surveillance Equipment" by IMS Research. The market is forecast to continue growing rapidly over the coming years to exceed US$150 million by 2011. This dramatic growth is primarily attributed to the strong demand for IP-based video surveillance systems in China. More and more users of security systems are choosing networked solutions based on video servers instead of traditional analogue CCTV systems, with growth of 30.3% forecast for video camera servers. Together, these markets will be worth some EUR151.1 million by 2008.
These are summarized in table 1.
Traditionally, motion analysis of humans has used markers attached to appropriate parts of the human body to highlight the movement of these points and how they relate to each motion sequence. Such markers were used extensively in sports to enhance the performance of athletes.
Passive reflective markers are placed on the subjects (in this case, humans) at specific anatomical landmarks. As the subjects walk through a lab, the three-dimensional location of each marker is detected by multiple infrared cameras. A biomechanics model is applied to the marker series to calculate the three-dimensional motion of each body segment. The processed data generates a graphical representation of each joint in all three planes, expressed in terms of a gait cycle.
This analysis of the lower limbs is better known as gait analysis.
S.S. Intille, J.W. Davis and A.F. Bobick, "Real-time closed-world tracking", in Proc. of the IEEE Conf. on Computer Vision and Pattern Recognition, pages 697-703, Los Alamitos, CA, June 1997. IEEE Computer Society Press.
SUMMARY OF INVENTION
Accordingly, the present invention provides a system for identifying a behavioral intention of a human being based on an effective motion analysis. The system includes an image acquisition component, wherein the image acquisition component acquires a plurality of object images in sequence; an activity enrollment component that includes a background and foreground detection component, an object detection component, an object partitioning component, a key point extraction component, a gait feature extraction component, an activity registration component and an activity storage means; and an activity detection component that includes a background and foreground detection component, an object detection component, an object partitioning component, a key point extraction component, a gait feature extraction component, an activity matching component and an activity storage means, characterized in that the gait feature extraction components compute features related to the movement of an upper limb of the human in relation to the movement of a lower limb in a gait cycle, wherein the gait cycle consists of a stance phase and a swing phase.
Furthermore, the present invention also provides a method for identifying a behavioral intention of a human being based on an effective motion analysis. The method includes acquiring a plurality of object images in sequence; enrolling data in a background and foreground detection component, an object detection component, an object partitioning component, a key point extraction component, a gait feature extraction component, an activity registration component and an activity storage means; and detecting features and matching features using a background and foreground detection component, an object detection component, an object partitioning component, a key point extraction component, a gait feature extraction component, an activity matching component and an activity storage means, characterized in that the method further includes calculating features related to the movement of an upper limb of the human in relation to the movement of a lower limb in a gait cycle using the gait feature extraction components, wherein the gait cycle consists of a stance phase and a swing phase.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be fully understood from the detailed description given herein below and the accompanying drawings, which are given by way of illustration only and thus are not limitative of the present invention, wherein:
Fig. 1 is a block diagram representation of a system and a method for identifying behavioral intention of a human being based on an effective motion analysis based on the preferred embodiment of the present invention;
Fig. 2 shows a representation of an upper limb motion in a stance phase of a gait cycle;
Fig. 3 shows a representation of an upper limb motion in a swing phase of a gait cycle;
Fig. 4 is a comparison table between upper limb movement and lower limb movement in a typical gait cycle;
… stance phase;
… of a lower limb movement in a stance phase;
Fig. 10 is a diagram illustrating the computation of a distance of an upper limb movement in a swing phase;
Fig. 11 is a diagram illustrating the computation of a distance of a lower limb movement in a swing phase.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
The background and foreground detection component (50) determines the presence of moving objects in the raw video images, and the object detection component (51) highlights the moving objects as objects of interest.
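The patent does not give an implementation of the background and foreground detection component; a minimal frame-differencing sketch (the function name, threshold value and toy frames below are illustrative assumptions, not the patent's actual method) might look like:

```python
import numpy as np

def detect_moving_pixels(frame, background, threshold=25):
    """Flag pixels whose absolute difference from a static background
    model exceeds a threshold (a basic background-subtraction sketch)."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold

# Toy 4x4 greyscale frames: a bright 2x2 "object" enters the scene.
background = np.zeros((4, 4), dtype=np.uint8)
frame = background.copy()
frame[1:3, 1:3] = 200

mask = detect_moving_pixels(frame, background)
print(mask.sum())  # 4 moving pixels
```

A production system would normally use an adaptive background model rather than a single static frame, but the thresholded difference captures the role the component plays in the pipeline.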
From the detected object of interest, an object partitioning component (52) divides the object of interest into four main parts: head, torso, arms and legs. From the arms and legs, a key point extraction component (53) computes important points on these two parts to be used in a gait feature extraction component (54). The important points may include the corner points, high curvature points and joining points that are detected from the outline of the arms and legs.
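One common way to find high-curvature points on an outline, of the kind the key point extraction component relies on, is to threshold the turn angle between successive edges of the contour polygon. This sketch is an illustrative stand-in, not the patent's disclosed algorithm:

```python
import numpy as np

def curvature_keypoints(outline, angle_thresh_deg=60.0):
    """Return indices of points on a closed outline where the turn
    angle between the incoming and outgoing edge exceeds a threshold."""
    pts = np.asarray(outline, dtype=float)
    n = len(pts)
    keypoints = []
    for i in range(n):
        prev_pt, cur, nxt = pts[i - 1], pts[i], pts[(i + 1) % n]
        v1, v2 = cur - prev_pt, nxt - cur
        cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
        if angle > angle_thresh_deg:
            keypoints.append(i)
    return keypoints

# A square outline: the four corners turn 90 degrees, edge midpoints turn 0.
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
print(curvature_keypoints(square))  # [0, 2, 4, 6]
```

On a real silhouette the same idea picks out elbows, wrists, knees and ankles, which is what makes such points useful for gait features.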
The gait feature extraction component (54) computes the features related to movement of the arms and legs. The computed gait features from the sequence of images are registered to a particular activity or intention through an activity registration component (55). The gait features and the registered activity or intention are stored in the activity database (56).
Fig. 6 depicts a detailed architecture of the activity detection component, with the assumption that the raw video images (68) from the image acquisition component are available in real time. The video images (68) are fed into the same components that are described in the activity enrollment component (refer to Fig. 5), except that the activity registration component (55) is replaced with an activity matching component (65). In the activity matching component (65), the computed gait features from the sequence of video images (68) are compared with the registered gait features in the activity database (67). A matching process is conducted to match the activity or intention that has been registered with the gait features detected, and vice versa.
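The patent leaves the matching rule unspecified; one plausible realisation is a nearest-neighbour comparison between the query's gait-feature vector and each registered vector. The database contents and activity labels below are hypothetical:

```python
import numpy as np

def match_activity(query_features, activity_database):
    """Return the registered activity whose stored gait-feature vector
    is closest (Euclidean distance) to the query features."""
    best_label, best_dist = None, float("inf")
    for label, stored in activity_database.items():
        dist = float(np.linalg.norm(np.asarray(query_features)
                                    - np.asarray(stored)))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Hypothetical database: activity label -> six gait features
# (arm/leg distances per phase, plus per-cycle totals).
db = {
    "walking":   [10, 30, 12, 32, 22, 62],
    "loitering": [2, 4, 2, 4, 4, 8],
}
print(match_activity([9, 29, 13, 31, 22, 60], db))  # walking
```

A deployed matcher would likely add a rejection threshold so that feature vectors far from every registered activity report "unknown" rather than the least-bad label.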
Table 2: Six main features in the gait extraction component.
For feature 5, the distance of arm movement in one gait cycle is computed by adding the circular distance from feature 1 to that of feature 3. Basically,

Feature 5 = D_arm^stance + D_arm^swing

where
D_arm^stance : distance of arm movement in a stance phase
D_arm^swing : distance of arm movement in a swing phase

For feature 6, the distance of leg movement in one gait cycle is computed by adding the horizontal distance from feature 2 to that of feature 4. Basically,

Feature 6 = D_leg^stance + D_leg^swing

where
D_leg^stance : distance of leg movement in a stance phase
D_leg^swing : distance of leg movement in a swing phase
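The two per-cycle features are simple sums of the per-phase distances. A direct transcription (the function name and the sample distance values are illustrative):

```python
def gait_cycle_distances(d_arm_stance, d_arm_swing, d_leg_stance, d_leg_swing):
    """Combine per-phase distances into the per-cycle features:
    Feature 5 = arm distance over one gait cycle (stance + swing),
    Feature 6 = leg distance over one gait cycle (stance + swing)."""
    feature5 = d_arm_stance + d_arm_swing
    feature6 = d_leg_stance + d_leg_swing
    return feature5, feature6

# Illustrative per-phase distances (units arbitrary, e.g. pixels).
print(gait_cycle_distances(10.0, 12.0, 30.0, 32.0))  # (22.0, 62.0)
```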

Claims

1. A system for identifying a behavioral intention of a human being based on an effective motion analysis, the system includes:
a. an image acquisition component, wherein the image acquisition component acquires a plurality of object images in sequence;
b. an activity enrollment component includes a background and foreground detection component (50), an object detection component (51), an object partitioning component (52), a key point extraction component (53), a gait feature extraction component (54), an activity registration component (55) and an activity storage means (56); and
c. an activity detection component includes a background and foreground detection component (60), an object detection component (61), an object partitioning component (62), a key point extraction component (63), a gait feature extraction component (64), an activity matching component (65) and an activity storage means (67),
characterized in that
the gait feature extraction components (54, 64) compute features related to the movement of an upper limb of the human in relation to the movement of a lower limb in a gait cycle, wherein the gait cycle consists of a stance phase and a swing phase.
2. The system as claimed in claim 1, wherein the background and foreground components (50, 60) determine presence of moving objects.
3. The system as claimed in claim 2, wherein the object detection components (51, 61) highlight the moving object as an object of interest.
4. The system as claimed in claim 3, wherein the object partitioning components (52, 62) divide the object of interest into four main parts: 1) head, 2) torso, 3) upper limbs and 4) lower limbs.
5. The system as claimed in claim 1, wherein the key point extraction components (53, 63) compute important points of the four main parts of the divided object from the object partitioning component.
6. The system as claimed in claim 1, wherein the gait feature extraction components (54, 64) extract one or more of the following features:
a. distance of upper limb movement in stance phase;
b. distance of lower limb movement in stance phase;
c. distance of upper limb movement in swing phase;
d. distance of lower limb movement in swing phase;
e. distance of upper limb movement in one gait cycle; or
f. distance of lower limb movement in one gait cycle.
7. The system as claimed in claim 1, wherein a plurality of object images in sequence is acquired by means of a video camera or a still camera.
8. The system as claimed in claim 1, wherein the computation processes are done using a computing means such as a computer.
9. A method for identifying a behavioral intention of a human being based on an effective motion analysis, the method includes:
a. acquiring a plurality of object images in sequence,
b. enrolling data in a background and foreground detection component (50), an object detection component (51), an object partitioning component (52), a key point extraction component (53), a gait feature extraction component (54), an activity registration component (55) and an activity storage means (56) and,
c. detecting features and matching features using a background and foreground detection component (60), an object detection component (61), an object partitioning component (62), a key point extraction component (63), a gait feature extraction component (64), an activity matching component (65) and an activity storage means (67),
characterized in that
the method further includes calculating features related to the movement of an upper limb of the human in relation to the movement of a lower limb in a gait cycle using the gait feature extraction components (54, 64), wherein the gait cycle consists of a stance phase and a swing phase.
10. The method as claimed in claim 9, wherein the background and foreground components (50, 60) determine presence of moving objects.
11. The method as claimed in claim 9, wherein the object detection components (51, 61) highlight the moving object as an object of interest.
12. The method as claimed in claim 9, wherein the object partitioning components (52, 62) divide the object of interest into four main parts: 1) head, 2) torso, 3) upper limbs and 4) lower limbs.
13. The method as claimed in claim 9, wherein the key point extraction components (53, 63) compute important points of the four main parts of the divided object from the object partitioning component.
14. The method as claimed in claim 9, wherein the activity enrollment component is performed in offline mode.
15. The method as claimed in claim 9, wherein the gait feature extraction components (54, 64) extract one or more of the following features:
a. distance of upper limb movement in stance phase;
b. distance of lower limb movement in stance phase;
c. distance of upper limb movement in swing phase;
d. distance of lower limb movement in swing phase;
e. distance of upper limb movement in one gait cycle; or
f. distance of lower limb movement in one gait cycle.
EP09816484.1A 2008-09-24 2009-09-18 A system and a method for identifying human behavioural intention based on an effective motion analysis Withdrawn EP2327057A4 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
MYPI20083761A MY159289A (en) 2008-09-24 2008-09-24 A system and a method for identifying human behavioural intention based on an effective motion analysis
PCT/MY2009/000153 WO2010036091A2 (en) 2008-09-24 2009-09-18 A system and a method for identifying human behavioural intention based on an effective motion analysis

Publications (2)

Publication Number Publication Date
EP2327057A2 true EP2327057A2 (en) 2011-06-01
EP2327057A4 EP2327057A4 (en) 2017-11-22

Family

ID=42060321

Family Applications (1)

Application Number Title Priority Date Filing Date
EP09816484.1A Withdrawn EP2327057A4 (en) 2008-09-24 2009-09-18 A system and a method for identifying human behavioural intention based on an effective motion analysis

Country Status (4)

Country Link
EP (1) EP2327057A4 (en)
CN (1) CN102224526A (en)
MY (1) MY159289A (en)
WO (1) WO2010036091A2 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8744522B2 (en) * 2009-10-21 2014-06-03 Xerox Corporation Portable security system built into cell phones
CN102119877B (en) * 2010-12-15 2012-11-07 河北工业大学 Method for creating expert knowledge base for automatically training lower artificial limbs
US20150030252A1 (en) * 2011-12-16 2015-01-29 The Research Foundation For The State University Of New York Methods of recognizing activity in video
JP5686108B2 (en) * 2012-02-24 2015-03-18 株式会社ダイフク Sorting equipment provided with an erroneous work prevention device and an erroneous work prevention device
CN102881100B (en) * 2012-08-24 2017-07-07 济南纳维信息技术有限公司 Entity StoreFront anti-thefting monitoring method based on video analysis
CN103886588B (en) * 2014-02-26 2016-08-17 浙江大学 A kind of feature extracting method of 3 D human body attitude projection
CN107423730B (en) * 2017-09-20 2024-02-13 湖南师范大学 Human gait behavior active detection and recognition system and method based on semantic folding
CN108021865B (en) 2017-11-03 2020-02-21 阿里巴巴集团控股有限公司 Method and device for identifying illegal behaviors in unattended scene
US10387737B1 (en) * 2018-02-02 2019-08-20 GM Global Technology Operations LLC Rider rating systems and methods for shared autonomous vehicles
JP7283037B2 (en) * 2018-07-26 2023-05-30 ソニーグループ株式会社 Information processing device, information processing method, and program
CN112464734B (en) * 2020-11-04 2023-09-15 昆明理工大学 Automatic identification method for walking motion characteristics of quadruped based on vision

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6369794B1 (en) * 1998-09-09 2002-04-09 Matsushita Electric Industrial Co., Ltd. Operation indication outputting device for giving operation indication according to type of user's action
US7330566B2 (en) * 2003-05-15 2008-02-12 Microsoft Corporation Video-based gait recognition
US7212651B2 (en) * 2003-06-17 2007-05-01 Mitsubishi Electric Research Laboratories, Inc. Detecting pedestrians using patterns of motion and appearance in videos
JP5028751B2 (en) * 2005-06-09 2012-09-19 ソニー株式会社 Action recognition device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2010036091A2 *

Also Published As

Publication number Publication date
EP2327057A4 (en) 2017-11-22
MY159289A (en) 2016-12-30
CN102224526A (en) 2011-10-19
WO2010036091A3 (en) 2010-06-24
WO2010036091A2 (en) 2010-04-01

Similar Documents

Publication Publication Date Title
WO2010036091A2 (en) A system and a method for identifying human behavioural intention based on an effective motion analysis
Bouchrika et al. On using gait in forensic biometrics
CN111881887A (en) Multi-camera-based motion attitude monitoring and guiding method and device
US9330470B2 (en) Method and system for modeling subjects from a depth map
US9235753B2 (en) Extraction of skeletons from 3D maps
CN108875507B (en) Pedestrian tracking method, apparatus, system, and computer-readable storage medium
Kovač et al. Human skeleton model based dynamic features for walking speed invariant gait recognition
Dhulekar et al. Motion estimation for human activity surveillance
CN113196283A (en) Attitude estimation using radio frequency signals
El-Sallam et al. A low cost 3D markerless system for the reconstruction of athletic techniques
CN111062295A (en) Area positioning method and device, and storage medium
CN115953838A (en) Gait image tracking and identifying system based on MLP-Yolov5 network
Liu et al. Analysis of human walking posture using a wearable camera
Krzeszowski et al. The application of multiview human body tracking on the example of hurdle clearance
CN107045725A (en) A kind of human body motion capture method
Bouchrika et al. Recognizing people in non-intersecting camera views
CN114372996A (en) Pedestrian track generation method oriented to indoor scene
Hachaj et al. How Repetitive are Karate Kicks Performed by Skilled Practitioners?
Goffredo et al. Performance analysis for gait in camera networks
Lok et al. Model-based human motion analysis in monocular video
Xu et al. A video tracking system for limb motion measurement in small animals
Priydarshi et al. Speed invariant, human gait based recognition system for video surveillance security
Mukhtar et al. RETRACTED: Gait Analysis of Pedestrians with the Aim of Detecting Disabled People
CN111435535A (en) Method and device for acquiring joint point information
Bouchrika et al. Markerless extraction of gait features using haar-like template for view-invariant biometrics

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20110316

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

AX Request for extension of the european patent

Extension state: AL BA RS

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20171023

RIC1 Information provided on ipc code assigned before grant

Ipc: G06K 9/00 20060101AFI20171017BHEP

Ipc: G08B 13/196 20060101ALI20171017BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20180523