WO2005069222A1 - Information recognition device, information recognition method, information recognition program, and alarm system - Google Patents
Information recognition device, information recognition method, information recognition program, and alarm system
- Publication number
- WO2005069222A1 (PCT/JP2005/000315; application JP2005000315W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pattern model
- information
- information recognition
- operation pattern
- thermal radiation
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/10—Radiation pyrometry, e.g. infrared or optical thermometry using electric radiation detectors
- G01J5/34—Radiation pyrometry, e.g. infrared or optical thermometry using electric radiation detectors using capacitors, e.g. pyroelectric capacitors
Definitions
- Information recognition device, information recognition method, information recognition program, and alarm system
- The present invention relates to information processing using a thermal radiation sensor, and in particular to an information recognition device, an information recognition method, and an information recognition program capable of recognizing predetermined information on an object to be detected based on the output of the thermal radiation sensor for that object and on operation pattern models of a plurality of targets prepared in advance using a predetermined modeling method, as well as to an alarm system including the information recognition device.
- In the prior art, when one human body detection means produces a detection output, the first detection control means invalidates the detection output of the other human body detection means for a fixed time, and the second detection control means does likewise. This enables quick and accurate notification of the moving direction of the human body detected in the human body detection range.
- Patent Document 1 Japanese Patent No. 2766820
- The present inventors used a pyroelectric infrared sensor to detect the thermal radiation emitted from a large number of detected objects performing the same action within the sensor's detection range, and found that the output of the pyroelectric infrared sensor differs both between types of detected object (person, animal, gender, etc.) and between individuals of the same type (for example, for humans, between persons A and B).
- An object of the present invention is to provide an information recognition device, an information recognition method, an information recognition program, and an alarm system including the information recognition device, capable of recognizing the predetermined information.
- To achieve the above object, the information recognition device according to claim 1 of the present invention comprises:
- Thermal radiation detection means for detecting, with a thermal radiation sensor, thermal radiation emitted from a detection target existing within a detection range;
- Operation pattern model storage means for storing an operation pattern model obtained by modeling in advance, according to a predetermined modeling method, the output of the thermal radiation sensor corresponding to an operation pattern of the detected object; and
- Information recognition means for recognizing predetermined information relating to the detected object within the detection range based on the detection result of the thermal radiation detection means and the operation pattern model stored in the operation pattern model storage means.
- With such a configuration, the thermal radiation detection means can detect, with the thermal radiation sensor, the thermal radiation emitted from the detection object existing within the detection range, and the operation pattern model storage means can store an operation pattern model in which the output of the thermal radiation sensor corresponding to the operation pattern of the detected object is modeled in advance according to a predetermined modeling method.
- Based on the detection result of the thermal radiation detection means and the operation pattern model stored in the operation pattern model storage means, the information recognition means can then recognize predetermined information relating to the detected object present in the detection range.
- That is, since the predetermined information on the detected object is recognized based on the detection result of the thermal radiation sensor and the operation pattern model, various kinds of information, such as behavior patterns of the detected object and attributes of the detected object, can be recognized.
- the object to be detected includes anything such as humans, animals other than humans, creatures such as insects, and inanimate objects as long as they emit thermal radiation.
- The thermal radiation sensor may be any type of sensor that detects the heat released from an object; a typical example is an infrared sensor that detects infrared rays emitted from the object.
- Such sensors include quantum sensors using the photovoltaic effect or the photoconductive effect, and thermal sensors using the thermoelectromotive effect, the pyroelectric effect, or the thermal conduction effect.
- The predetermined modeling method includes, for example, well-known modeling methods such as an HMM or a neural network.
- The predetermined information relating to the detected object includes, for example, the operation content of the detected object within the detection range and attribute information of the detected object.
- the operation pattern model storage unit stores a plurality of operation pattern models respectively corresponding to a plurality of types of operation patterns.
- The invention is further characterized by comprising operation pattern model generation means for generating the operation pattern model of the detected object by the predetermined modeling method based on the output of the thermal radiation sensor.
- With such a configuration, the operation pattern model generation means can generate the operation pattern model of the detected object by the predetermined modeling method based on the output of the thermal radiation sensor.
- the invention according to claim 4 is the information recognition device according to any one of claims 1 to 3, wherein the thermal radiation sensor is a thermal sensor.
- With such a configuration, the thermal radiation sensor is a thermal sensor, such as a sensor using the thermoelectromotive effect with a thermopile or the like, a sensor using the pyroelectric effect with PZT (lead zirconate titanate) or LiTaO3 (lithium tantalate), or a sensor using the thermal conduction effect with a thermistor, bolometer, or the like.
- The invention according to claim 5 is the information recognition device according to any one of claims 1 to 3, wherein the thermal radiation sensor is a quantum sensor.
- With such a configuration, the thermal radiation sensor is a quantum sensor, such as a sensor using the photovoltaic effect with a photodiode, phototransistor, photo IC, solar cell, or the like; a sensor using the photoconductive effect with a CdS cell, CdSe cell, PbS cell, or the like; or a sensor using the photoelectron emission effect with a phototube, photomultiplier tube, or the like.
- The invention according to claim 6 is the information recognition device according to claim 4, wherein the thermal sensor is a pyroelectric infrared sensor that detects, using the pyroelectric effect, infrared rays emitted from the object to be detected.
- Since the pyroelectric infrared sensor is used as the thermal radiation sensor, a moving object within the detection range can be detected easily.
- The invention according to claim 7 is the information recognition device according to any one of claims 1 to 6, characterized in that the predetermined modeling method is an HMM (Hidden Markov Model).
- The invention according to claim 8 is the information recognition device according to any one of claims 1 to 7, wherein the predetermined information includes the action content and the size of the object to be detected.
- the output of the thermal radiation sensor changes depending on the action content, moving speed, size, etc. of the detected object.
- the action content of the detected object is, for example, a movement in a certain direction, an operation of a part of the body such as a hand or a foot (a gesture or the like), and the like.
- The size refers to the height, width, length, surface area, volume, etc. of the detected object, and is not limited to the entire detected object; the size of a part of the detected object is also included.
- the predetermined information includes attribute information of the detected object.
- the information recognizing means can recognize the attribute information of the detected object within the detection range.
- Broadly, the attribute information is information on the type of the detected object: living things that emit heat, such as humans, animals (mammals) other than humans, and insects, and inanimate objects that emit heat, such as cars, motorcycles, curtains, sunlight, lights, and air-conditioners emitting hot or cold air.
- The attribute information also includes information on the movement of inanimate objects that do not themselves emit heat, such as the swaying of curtains or of tree branches and leaves. Recognition of such predetermined information about non-heat-emitting inanimate objects can be performed in combination with an object that does emit heat. For example, if a thermal radiation sensor is on one side of a curtain and a heat source on the other, the heat radiated by the source reaches the sensor only weakly while the source is covered by the curtain; when the curtain sways and the heat source is exposed, the radiated heat is detected directly by the sensor. By comparing such detection results with the operation patterns, it is possible to determine, for example, whether a curtain swayed or whether a person has entered the building.
- If the attribute information is individual information, it is possible to identify which person it is in the case of a human, or which individual it is in the case of an insect or animal.
- The invention according to claim 10 is the information recognition device according to any one of claims 1 to 9, wherein the information recognition means extracts feature amount data from the detection result of the thermal radiation detection means, calculates a likelihood between the feature amount data and the operation pattern model based on the feature amount data and the operation pattern model stored in the operation pattern model storage means, and recognizes the predetermined information relating to the detected object based on the likelihood.
- With such a configuration, the likelihood between the feature data and the motion pattern model is calculated, and the predetermined information on the detected object is recognized based on the calculated likelihood.
- The invention according to claim 11 is the information recognition device according to claim 10, wherein the feature amount data includes first feature amount data, which is the spectrum, in frame units, of the detection result of the thermal radiation detection means, and second feature amount data, which is the average amplitude value in frame units.
- With such a configuration, the predetermined information is recognized using the first feature data (the frame-by-frame spectrum of the detection result) and the second feature data (the frame-by-frame average amplitude value).
- The invention according to claim 12 is the information recognition device according to claim 11, wherein the first feature amount data is obtained by converting the value of the spectrum into a common logarithmic value.
- With such a configuration, a value obtained by converting the frame-unit spectrum value into a common logarithm is used as the first feature amount data; for spectrum values of 1 or more this narrows the range of dispersion, which, depending on conditions, can further improve the recognition accuracy of the predetermined information.
- The invention according to claim 13 is characterized in that the feature amount data further includes third feature amount data, which is the difference between the feature amount indicated by the first feature amount data of a selected frame and the feature amount indicated by the first feature amount data of the frame immediately before the selected frame.
- With such a configuration, the predetermined information is recognized using the third feature data, the difference between the first feature data of the selected frame and that of the immediately preceding frame, so that the recognition accuracy of the predetermined information can be further improved.
- Similarly, the invention according to claim 14 is characterized in that the feature amount data further includes fourth feature amount data, which is the difference between the feature amount indicated by the second feature amount data of a selected frame and the feature amount indicated by the second feature amount data of the frame immediately before the selected frame.
- With such a configuration, the predetermined information is recognized using the fourth feature data, the difference between the second feature data of the selected frame and that of the immediately preceding frame, so that the recognition accuracy of the predetermined information can be further improved.
- The invention according to claim 15 is characterized in that the feature amount data has four or more dimensions, and in that the information recognition device further comprises:
- Feature quantity data display means for displaying the feature quantity data corresponding to each action pattern model stored in the action pattern model storage means as coordinate points in a two-dimensional or three-dimensional space;
- Detection result display means for displaying a coordinate point corresponding to the detection result of the thermal radiation detection means on the space where the coordinate points of the feature amount data are displayed.
- With such a configuration, the feature data display means can display the feature data corresponding to each motion pattern model stored in the motion pattern model storage means as coordinate points in a two-dimensional or three-dimensional space, and the detection result display means can display the coordinate point corresponding to the detection result of the thermal radiation detection means on the space where the coordinate points of the feature data are displayed.
- On the other hand, to achieve the above object, the information recognition method according to claim 16 of the present invention detects, with a thermal radiation sensor, thermal radiation emitted from a detection target existing within a detection range;
- prepares operation pattern models in which the outputs of the thermal radiation sensor corresponding to a plurality of types of operation patterns of a plurality of detected objects are modeled in advance according to a predetermined modeling method; and
- recognizes predetermined information relating to the detected object existing within the detection range based on the detection result of the thermal radiation sensor and the operation pattern models.
- This invention can be realized by the information recognition device described in claim 1 or the like; since its effects duplicate those already described, further description is omitted.
- the information recognition program according to claim 17 of the present invention includes a thermal radiation detecting step of detecting thermal radiation emitted from a detection target existing within a detection range by a thermal radiation sensor,
- and an information recognition step of recognizing predetermined information related to the detected object within the detection range based on the detection result of the thermal radiation detection step and an operation pattern model stored in an operation pattern model storage step.
- This invention is a program applicable to the information recognition device described in claim 1; since its effects duplicate those already described, further description is omitted.
- an alarm system according to claim 18 includes an information recognition device according to any one of claims 1 to 15,
- Judging means for judging whether or not the detected object is a person based on a recognition result of the information recognition device; and
- Alarming means for issuing an alarm when the judging means judges that the object to be detected is a person.
- Here, the above-mentioned "alarming" includes both threats issued to an intruder, such as playing a warning message by voice from a speaker or the like and continuously sounding a distinctive sound such as a buzzer from a speaker, and alerts to the system user about the danger, which directly notify the system user of the intrusion of a person into the building by voice or screen display.
- The alarm for the intruder and the alert for the system user may both be provided, or either one of them may be provided.
- FIG. 1 is a block diagram showing a configuration of an information recognition device according to the present invention.
- FIG. 2(a) is a diagram showing the installation position of the information recognition device 1, (b) is a diagram showing the detection range of the pyroelectric infrared sensor 10a, and (c) is a diagram showing the operation patterns of a detection target.
- FIG. 3 is a diagram showing a relationship between an output waveform of a pyroelectric infrared sensor 10a and an operation pattern model.
- FIG. 4 is a flowchart showing an operation process of the infrared detection unit 10.
- FIG. 5 is a flowchart showing an operation process of an operation pattern model generation unit 11.
- FIG. 6 is a flowchart showing an operation process of a recognition processing unit 13.
- FIG. 7 is a diagram showing a recognition result of an operation direction in the embodiment.
- FIG. 8 is a diagram showing an example in which a detection range 20 is finely divided into small ranges.
- FIGS. 9 (a) and 9 (b) are diagrams showing information on a dog as a detection target at the time of identification.
- FIG. 10(a) is a diagram showing recognition results for humans (distinguishing adults and children) and dogs (distinguishing large and small dogs), and (b) is a diagram showing recognition results for humans (no distinction between adults and children) and dogs (no distinction between large and small dogs).
- FIG. 11 is a diagram showing a recognition result of an operation direction in a third example.
- FIGS. 12(a) and (b) are diagrams showing recognition results of an operation direction in a fourth example.
- FIG. 13 is a diagram showing a display example of a motion pattern model projected two-dimensionally.
- FIG. 14 is a flowchart showing an operation process of the two-dimensional projection unit 14.
- FIG. 15 is a block diagram showing a configuration of an alarm system according to a fourth embodiment of the present invention.
- FIG. 16 is a flowchart showing an operation process of the alarm control unit 50.
- FIG. 1 to FIG. 6 are diagrams showing a first embodiment of the information recognition device according to the present invention.
- FIG. 1 is a block diagram showing the configuration of the information recognition device according to the first embodiment of the present invention.
- As shown in FIG. 1, the information recognition device 1 comprises an infrared detection unit 10, an operation pattern model generation unit 11, an operation pattern model storage unit 12, and a recognition processing unit 13.
- the infrared detector 10 has a configuration including a pyroelectric infrared sensor 10a and a signal processor 10b.
- The pyroelectric infrared sensor 10a is a sensor that detects, using the pyroelectric effect, infrared rays emitted from a detected object existing within the detection range.
- The signal processing unit 10b performs signal processing such as sampling and FFT (Fast Fourier Transform) on the analog detection signal output from the pyroelectric infrared sensor 10a and calculates feature amount data of the detection result.
- the operation pattern model generation unit 11 has a function of generating a motion pattern model by modeling feature amount data obtained from the infrared detection unit 10 using an HMM.
- the operation pattern model storage unit 12 has a function of storing the generated operation pattern model.
- The recognition processing unit 13 has the function of recognizing the movement pattern information and attribute information of a detection target existing within the detection range of the pyroelectric infrared sensor 10a, based on the contents stored in the operation pattern model storage unit 12 and the feature amount data of the infrared detection result obtained from the infrared detection unit 10.
- Although not shown, the information recognition device 1 includes a processor, a random access memory (RAM), and a storage medium storing a dedicated program, and the above components are controlled by executing the dedicated program.
- The storage medium may be a semiconductor storage medium such as a RAM or ROM, a magnetic storage medium such as an FD or HD, an optically readable storage medium such as a CD, CDV, LD, or DVD, or a magneto-optical storage medium such as an MO; it includes any storage medium readable by a computer, regardless of whether the reading method is electronic, magnetic, or optical.
- FIG. 2(a) is a diagram showing the installation position of the information recognition device 1, FIG. 2(b) is a diagram showing the detection range of the pyroelectric infrared sensor 10a, and FIG. 2(c) is a diagram showing the operation patterns of a detection target.
- FIG. 3 is a diagram showing a relationship between an output waveform of the pyroelectric infrared sensor 10a and an operation pattern model.
- As shown in FIG. 2(a), the information recognition device 1 is installed with its component pyroelectric infrared sensor 10a mounted on a ceiling, for example of a room, so as to detect infrared rays emitted from a detected object passing through its detection range 20. From the detection result of the infrared rays detected from the detected object passing through the detection range, the operation pattern and attributes of the detected object are recognized.
- The pyroelectric infrared sensor 10a used here has four pyroelectric elements whose fields of view are projected by 16 Fresnel lenses to expand the detection range; with the horizontal axis as the x axis and the vertical axis as the y axis, the detection range 20 covers about 6 m in the x direction and about 7 m in the y direction. That is, as shown in FIG. 2(b), infrared rays can be detected from a detected object passing through any of the plurality of detection zones within this range.
- Here, consider the case where, as shown in FIG. 2(c), the detected object walks from outside the detection range 20 and passes through it in each of the directions (1) to (8).
- In this embodiment, a plurality of detected objects are first made to perform the actions of the above eight motion patterns (for example, each action is performed five times by the same person).
- The signal processing unit 10b performs signal processing on the detection results output by the pyroelectric infrared sensor 10a for these actions and calculates feature amount data, and the operation pattern model generation unit 11 models the feature data corresponding to each motion pattern with an HMM.
- Specifically, as shown in FIG. 3, the analog output signal 30 from the pyroelectric infrared sensor 10a, with a data time length of 10 s, is sampled at 100 ms intervals, and the sampled data is A/D-converted to turn the analog output signal 30 into digital data.
- The 100 ms-interval sampling data is divided into a plurality of frames 31 of 1.6 s each; each frame 31 corresponds to 16 sampling points, with an overlap of 12 sampling points between adjacent frames.
- An FFT is performed on the sampling data of each frame 31, expanding it into a Fourier series to calculate the spectrum of each harmonic (spectrum 32 in FIG. 3).
- Of these spectrum values, the first eight are used as the first feature data, and the average amplitude level calculated for each frame is used as the second feature data. A minimal sketch of this pipeline follows.
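The following is a minimal sketch of the feature extraction just described, assuming the sensor output has already been A/D-converted into a one-dimensional array sampled every 100 ms (so a 10 s record is 100 samples); the function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

FRAME_LEN = 16    # 1.6 s frame = 16 samples at 100 ms intervals
FRAME_SHIFT = 4   # 12-sample overlap between adjacent frames

def extract_features(samples):
    """Return per-frame (first, second) feature data:
    first  - spectrum of the first eight harmonics of each frame,
    second - average amplitude level of each frame."""
    first, second = [], []
    for start in range(0, len(samples) - FRAME_LEN + 1, FRAME_SHIFT):
        frame = samples[start:start + FRAME_LEN]
        spectrum = np.abs(np.fft.rfft(frame))   # Fourier-series magnitudes
        first.append(spectrum[1:9])             # harmonics 1 to 8
        second.append(np.mean(np.abs(frame)))   # average amplitude level
    return np.array(first), np.array(second)

# Example: a 10 s record sampled at 100 ms intervals is 100 samples.
first, second = extract_features(np.random.randn(100))
```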
- The operation pattern model generation unit 11 obtains the first and second feature data from the infrared detection unit 10 and uses them to create an HMM 33 as shown in FIG. 3.
- The HMM 33 uses the first feature data as its first parameter and the second feature data as its second parameter; the number of internal states is set to five, S1 to S5, and each parameter is represented in each state by a probability distribution.
- The motion pattern model generated by the motion pattern model generation unit 11 is stored in the motion pattern model storage unit 12 in association with the attribute (for example, the name) of the detected object and the content of the motion pattern. A training sketch is given below.
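As a sketch of this step, the widely used hmmlearn library (an assumption here; the patent does not name an implementation) can train one Gaussian-output HMM per motion pattern from the per-frame feature vectors:

```python
import numpy as np
from hmmlearn import hmm  # third-party library, illustrative choice

def train_pattern_model(training_sequences, n_states=5):
    """Train one Gaussian HMM for one motion pattern.
    Each training sequence is an (n_frames, 9) array: the eight
    spectrum values (first feature data) plus the average amplitude
    (second feature data) of each frame."""
    X = np.vstack(training_sequences)
    lengths = [len(seq) for seq in training_sequences]
    model = hmm.GaussianHMM(n_components=n_states,
                            covariance_type="diag", n_iter=100)
    model.fit(X, lengths)
    return model

# e.g. five walks in direction (1) by subject A -> one stored model,
# keyed by attribute and pattern content (hypothetical bookkeeping):
# models[("A", "direction (1)")] = train_pattern_model(five_sequences)
```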
- After the motion pattern models of the plurality of detection targets have been generated in this way, the recognition processing unit 13 thereafter performs recognition processing of the motion pattern and attributes of a detected object based on the signal processing results from the infrared detection unit 10.
- For example, when detection target A passes through the detection range 20, the pyroelectric infrared sensor 10a detects the infrared rays of A and outputs an analog signal corresponding to the detection result.
- This analog signal is input to the signal processing unit 10b, where the signal processing described above is performed, and the processing result is input to the recognition processing unit 13.
- The recognition processing unit 13 likewise extracts feature data from the signal processing result for the motion of detection target A, and recognizes the motion pattern and attributes of A based on the feature data and the motion pattern models stored in the motion pattern model storage unit 12.
- Specifically, from among the motion pattern models stored in the motion pattern model storage unit 12, the model having the state transition sequence that generates the feature amount data sequence (observation sequence) for the motion of detection target A with the highest probability is selected, and the motion pattern and attribute of A are recognized from it.
- A well-known method, the Viterbi algorithm, is used to detect this maximum-probability model.
- When the motion pattern model corresponding to the maximum-probability state transition sequence is detected with the Viterbi algorithm, the operation content of the detected object that passed through the detection range (such as walking in direction (6)) can be recognized, because, as described above, the operation content and attributes are associated with each motion pattern model in advance; it can likewise be recognized that the detected object is A. A sketch of this selection follows.
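Continuing the hmmlearn-based sketch above (the library and the dictionary-of-models bookkeeping are assumptions, not the patent's own interface), recognition reduces to scoring the observed feature sequence against every stored model with the Viterbi algorithm and returning the best-scoring model's label:

```python
import numpy as np

def recognize(models, observation):
    """Return the (attribute, pattern content) key of the stored model
    whose Viterbi path generates the observed feature sequence
    (an (n_frames, 9) array) with the highest log-probability."""
    best_key, best_logprob = None, -np.inf
    for key, model in models.items():
        logprob, _state_path = model.decode(observation, algorithm="viterbi")
        if logprob > best_logprob:
            best_key, best_logprob = key, logprob
    return best_key

# attribute, pattern = recognize(models, features_for_target_A)
```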
- This recognition result is output to, for example, a display processing unit that displays the recognition result on a display unit, or an information processing unit such as an application program that performs some processing using the recognition result.
- FIG. 4 is a flowchart showing the operation processing of the infrared ray detection unit 10.
- First, in step S100, the analog output signal of the pyroelectric infrared sensor 10a is input to the signal processing unit 10b, and the process proceeds to step S102.
- In step S102, the signal processing unit 10b performs sampling processing on the obtained analog output signal at predetermined time intervals (for example, 100 ms), and the process proceeds to step S104.
- In step S104, the signal processing unit 10b performs A/D conversion processing on the sampling result, and the process proceeds to step S106.
- In step S106, the signal processing unit 10b determines, based on the sampled and A/D-converted output signal of the pyroelectric infrared sensor 10a, whether the output signal has changed. If it is determined that it has changed (Yes), the process proceeds to step S108; otherwise (No), the process proceeds to step S110.
- In step S108, the signal processing unit 10b stores the A/D-converted output signal in a storage unit (not shown) such as a RAM, and the process returns to step S100.
- In step S110, the signal processing unit 10b determines whether there is data stored in the storage unit. If so (Yes), the process proceeds to step S112; if not (No), the process returns to step S100.
- In step S112, the signal processing unit 10b divides the data stored in the storage unit into frames of a predetermined time unit (for example, 1.6 s), and the process proceeds to step S114.
- In step S114, the signal processing unit 10b performs an FFT for each frame, calculates the spectrum of each harmonic from the FFT result, further calculates the average amplitude for each frame, and the process proceeds to step S116.
- In step S116, the infrared detection unit 10 determines whether the operation mode is the operation pattern model generation mode. If so (Yes), the process proceeds to step S118; otherwise (No), the process proceeds to step S120.
- In the information recognition device 1, two operation modes can be set: an operation pattern model generation mode and an information recognition mode. When the operation pattern model generation mode is set, the signal processing result of the infrared detection unit 10 is input to the operation pattern model generation unit 11; when the information recognition mode is set, it is input to the recognition processing unit 13.
- In step S118, the infrared detection unit 10 inputs the signal processing result to the operation pattern model generation unit 11, and the process returns to step S100.
- In step S120, the infrared detection unit 10 inputs the signal processing result to the recognition processing unit 13, and the process returns to step S100. A condensed sketch of this loop follows.
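The mode-dependent dispatch of FIG. 4 can be condensed as below; the sensor and unit objects and their method names are hypothetical stand-ins for the components described in the text:

```python
def detection_loop(sensor, signal_processor, model_generator, recognizer,
                   mode="recognize"):
    stored = []                               # buffered A/D-converted samples
    while True:
        sample = sensor.sample()              # S100/S102: sample every 100 ms
        digital = signal_processor.ad_convert(sample)      # S104
        if signal_processor.output_changed(digital):       # S106
            stored.append(digital)            # S108: store while changing
        elif stored:                          # S110: buffered data exists
            frames = signal_processor.split_frames(stored)    # S112: 1.6 s
            features = signal_processor.fft_features(frames)  # S114
            if mode == "generate":            # S116/S118: model generation
                model_generator.process(features)
            else:                             # S120: information recognition
                recognizer.process(features)
            stored = []
```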
- FIG. 5 is a flowchart showing the operation process of the operation pattern model generation unit 11.
- First, in step S200, it is determined whether a signal processing result from the infrared detection unit 10 has been obtained. If so (Yes), the process proceeds to step S202; if not (No), the process waits until one is obtained.
- In step S202, an operation pattern model is generated with the HMM based on the obtained signal processing result, and the process proceeds to step S204.
- In step S204, the operation content and attribute information are associated with the generated operation pattern model, and the process proceeds to step S206.
- In step S206, the operation pattern model associated with the operation content and attribute information is stored in the operation pattern model storage unit 12, and the process ends.
- FIG. 6 is a flowchart showing the operation processing of the recognition processing unit 13.
- First, in step S300, it is determined whether a signal processing result has been obtained from the infrared detection unit 10. If so (Yes), the process proceeds to step S302; if not (No), the process waits until one is obtained.
- In step S302, the operation pattern models are read from the operation pattern model storage unit 12, and the process proceeds to step S304.
- In step S304, based on the read operation pattern models and the obtained signal processing result, the operation pattern model having the maximum-probability state transition sequence is detected using the Viterbi algorithm, and the process proceeds to step S306.
- In step S306, recognition processing is performed based on the detected motion pattern model, and the process proceeds to step S308. As described above, the recognition processing reads the operation content and attribute information associated with the motion pattern model.
- In step S308, the recognition result is output to information processing means such as an application program, and the process ends.
- As described above, in the first embodiment, the infrared detection unit 10 can detect the infrared rays of a plurality of detection targets within the detection range 20 and perform signal processing on the output signals of the detection results; the operation pattern model generation unit 11 can generate, with an HMM, a motion pattern model corresponding to the motion pattern content and attribute of each detected object and store it in the motion pattern model storage unit 12.
- The recognition processing unit 13 can then recognize the motion pattern and attributes of a detected object based on the infrared detection result of the object operating within the detection range 20 of the infrared detection unit 10 and the operation pattern models stored in the operation pattern model storage unit 12.
- In FIG. 1, the infrared detection unit 10 corresponds to the thermal radiation detection means according to any one of claims 1, 10, and 11; the operation pattern model generation unit 11 corresponds to the operation pattern model generation means described in claim 3; the operation pattern model storage unit 12 corresponds to the operation pattern model storage means described in any one of claims 1, 2, and 10; and the recognition processing unit 13 corresponds to the information recognition means described in claim 1 or 10.
- FIG. 7 is a diagram illustrating a recognition result of the operation direction in the first embodiment.
- FIG. 8 is a diagram showing an example in which the detection range 20 is subdivided into small ranges.
- In this example, a 5-state HMM is generated using the same feature parameters as those in the first embodiment.
- The HMM is generated using data obtained by having the 17 subjects A to Q perform the operations (1) to (8) in the eight directions five times each.
- In generating the motion pattern model for each direction, the attributes of the detected objects are ignored and all trial data for that direction are used (17 persons × 5 trials = 85 data).
- It would also be possible to generate an HMM dedicated to each subject using the five data per attribute; here, however, an HMM corresponding to each direction of motion of an unspecified number of detected objects is generated using all the data of the 17 persons for each direction.
- FIG. 7 shows the average recognition rate of the movement direction of detection targets A to Q when they pass through the detection range 20, using the generated movement pattern models: 73.7% when errors between directions on the same straight line are counted, and 88.7% when such errors are ignored.
- In this example, an operation pattern model is generated for the entire detection range 20, so the eight directions (1) to (8) are recognized.
- Alternatively, the detection range 20 may be finely divided into small areas, as shown in FIG. 8, and an operation pattern model for each direction generated for each small area; by combining these, various operation contents of a detection target within the detection range 20 can be recognized.
- FIGS. 9 (a) and 9 (b) are diagrams showing information of a dog as a detection target at the time of identification.
- FIG. 10(a) is a diagram showing recognition results for humans (distinguishing adults and children) and dogs (distinguishing large and small dogs), and FIG. 10(b) is a diagram showing recognition results for humans (no distinction between adults and children) and dogs (no distinction between large and small dogs).
- An HMM is generated using data obtained by having each detected object perform the operations (1) to (8) in the eight directions of the first embodiment 50 times each.
- As feature parameters, a value obtained by converting the first feature parameter of the first embodiment into a common logarithm and the second feature parameter of the first embodiment are used.
- The number of internal states of the HMM was set to 7.
- Among the dogs selected as detected objects, dogs at least as large as a Labrador retriever with a body height of 63 cm and a body length of 80 cm were designated large dogs, while dogs no larger than a toy poodle with a body height of 40 cm and a body length of 40 cm were designated small dogs.
- Here, body height is the height above the ground of the highest part of the body when the dog is standing, and body length is the length from the tip of the nose to the tail.
- Four types of models are used: an adult motion pattern model generated from the motion data of 36 adults, a child motion pattern model generated from the motion data of 6 children, a large-dog motion pattern model generated from the motion data of 5 large dogs, and a small-dog motion pattern model generated from the motion data of 7 small dogs.
- Each motion pattern model consists of eight HMMs, one for each behavior pattern. In generating each model, only 10 of the 50 motion data per behavior pattern were used for learning; the remaining 40 were used as evaluation data in the recognition processing.
- Using these models, the information recognition device 1 obtained an average recognition rate of 99.6% for adults and 98.4% for children; when large and small dogs were distinguished, the average recognition rate was 96.4% for large dogs and 94.8% for small dogs, and when they were not distinguished, as shown in FIG. 10(b), the average recognition rate for dogs was 97.3%. The results show that the lower recognition rate for large dogs when the two were distinguished was mostly due to mutual misrecognition between large and small dogs; removing the distinction raised the recognition rate for dogs dramatically, bringing their overall average up to 97.3%. From this it was found that the information recognition device 1 according to the present invention can discriminate between a human and a dog (an animal other than a human) with high probability.
- FIG. 11 and FIG. 12 are diagrams showing the results of the information recognition device according to the second embodiment of the present invention.
- In the second embodiment, in addition to the first and second feature data of the first embodiment, third and fourth feature amount data calculated from those feature data are used for modeling and recognition processing.
- Therefore, while the configuration is similar to that of the first embodiment, the generation method and recognition processing method of the motion pattern model differ in part.
- The same parts as those in the first embodiment are denoted by the same reference numerals, and the description below, given with reference to the drawings, covers only the different parts.
- Specifically, as shown in FIG. 3, the signal processing unit 10b samples the analog output signal 30 from the pyroelectric infrared sensor 10a, with a data time length of 10 s, at 100 ms intervals, and A/D-converts the sampled data to turn the analog output signal 30 into digital data.
- The 100 ms-interval sampling data is divided into a plurality of frames 31 of 1.6 s each; each frame 31 corresponds to 16 sampling points, with an overlap of 12 sampling points between adjacent frames.
- An FFT is performed on the sampling data of each frame 31, expanding it into a Fourier series to calculate the spectrum of each harmonic (spectrum 32 in FIG. 3).
- In this embodiment, the values of the first half of the spectrum 32 of each frame 31 converted into common logarithms (log N) are used as the first feature data, and the average amplitude level calculated for each frame is used as the second feature data.
- Furthermore, for each selected frame 31, the difference between the value of the first feature data for that frame and the value of the first feature data for the immediately preceding frame 31 is used as the third feature data, and the difference between the value of the second feature data for that frame and the value of the second feature data for the immediately preceding frame 31 is used as the fourth feature data, over all frames 31.
- The operation pattern model generation unit 11 acquires the first to fourth feature data from the infrared detection unit 10 and creates an HMM using these feature data.
- The HMM uses the first to fourth feature data as its first to fourth parameters; the number of internal states is set (seven states, S1 to S7, in the examples described later), and each parameter is represented by a probability distribution.
- The motion pattern model generated by the motion pattern model generation unit 11 is associated with the attribute (for example, the name) of the detected object and the content of the motion pattern, and is stored in the motion pattern model storage unit 12.
- Thereafter, the recognition processing unit 13 performs recognition processing of the motion pattern and attributes of detected objects based on the signal processing results from the infrared detection unit 10.
- Here, only step S114 of the flowchart shown in FIG. 4, which differs from the first embodiment, is described.
- In step S114, the signal processing unit 10b performs an FFT for each frame, calculates the spectrum of each harmonic from the FFT result, calculates the first to fourth feature amount data based on the calculated spectra, and the process proceeds to step S116.
- Specifically, the spectrum values of the harmonics are converted into common logarithmic values to generate the first feature data, and the average amplitude of each frame is calculated as the second feature data.
- Furthermore, for all frames, the difference between the first feature data of the selected frame and the first feature data of the immediately preceding frame is calculated as the third feature data, and the difference between the second feature data of the selected frame and the second feature data of the immediately preceding frame is calculated as the fourth feature data. A sketch of this feature computation follows.
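A minimal sketch of the four feature streams of this embodiment, given the per-frame spectra and average amplitudes computed as in the first embodiment (the small epsilon guard before the logarithm is an assumption, added only to keep log10 defined for zero values):

```python
import numpy as np

def first_to_fourth_features(spectra, amplitudes):
    """spectra: (n_frames, n_harmonics) array of per-frame spectrum values;
    amplitudes: (n_frames,) array of per-frame average amplitude levels."""
    first = np.log10(np.maximum(spectra, 1e-12))  # common-log spectrum
    second = np.asarray(amplitudes)
    third = np.diff(first, axis=0)    # difference from the previous frame
    fourth = np.diff(second)          # same, for the average amplitude
    # drop the first frame so all four streams cover the same frames
    return first[1:], second[1:], third, fourth
```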
- As described above, in the second embodiment, the infrared detection unit 10 can detect the infrared rays of a plurality of detected objects within the detection range 20 and perform signal processing on the output signals of the detection results; from the signal processing results, the operation pattern model generation unit 11 can generate, with an HMM, a motion pattern model corresponding to the motion pattern content and attribute of each detected object and store it in the motion pattern model storage unit 12.
- In doing so, the operation pattern model generation unit 11 can generate the motion pattern model using the first to fourth feature amount data.
- The recognition processing unit 13 can then recognize the motion pattern and attributes of a detected object based on the infrared detection result of the object operating within the detection range 20 of the infrared detection unit 10 and the operation pattern models stored in the operation pattern model storage unit 12.
- In FIG. 1, the infrared detection unit 10 corresponds to the thermal radiation detection means according to any one of claims 1, 10, and 11; the operation pattern model generation unit 11 corresponds to the operation pattern model generation means described in claim 3; the operation pattern model storage unit 12 corresponds to the operation pattern model storage means described in any one of claims 1, 2, and 10; and the recognition processing unit 13 corresponds to the information recognition means described in claim 1 or 10.
- FIG. 11 is a diagram illustrating the recognition result of the operation direction in the third example.
- In this example, an HMM having seven internal states is generated using the same feature parameters as in the second embodiment.
- The HMM is generated using data obtained by having the 17 detected objects A to Q perform the operations (1) to (8) in the eight directions five times each.
- In generating the motion pattern model for each direction, the attributes of the detected objects are ignored and all trial data for that direction are used (17 persons × 5 trials = 85 data).
- It would also be possible to generate an HMM dedicated to each object using the five data per attribute; here, however, an HMM corresponding to each direction of motion of an unspecified number of detected objects is generated using all the data of the 17 persons for each direction.
- Using the generated motion pattern models, the information recognition device 1 performed recognition of the motion direction as detection targets A to Q passed through the detection range 20.
- As shown in FIG. 11, the average recognition rate is 90.3% when errors between directions on the same straight line are counted, and 97.0% when such errors are ignored.
- In the recognition result of the first embodiment shown in FIG. 7, these figures were 73.7% and 88.7%, respectively.
- It can therefore be said that introducing the third and fourth parameters described in the second embodiment significantly increased the recognition rate.
- FIGS. 12(a) and 12(b) are diagrams showing the recognition results of the operation direction in the fourth example.
- an HMM having seven internal states is generated using the same feature parameters as those in the second embodiment.
- An HMM is generated using data obtained from three persons A to C, selected from the 17 detected objects A to Q, each performing the operations (1) to (8) in the eight directions 50 times. In this example, an HMM is generated for each operation pattern of each person.
- As shown in FIG. 12(a), when the motion directions of detected objects A to C passing through the detection range 20 were recognized, the average recognition rate was 96.3% for A, 93.5% for B, and 90.5% for C, and the overall average recognition rate was 93.4%.
- The recognition rate is thus high, 90% or more on average, showing that the present invention is effective even when identification of each individual is added to identification of each movement direction.
- For personal identification, as shown in FIG. 12(b), the average recognition rate for A was 96.%, the average recognition rate for B was 97.8%, and the average recognition rate for C was 96.8%; the overall average recognition rate was 96.8%.
- It can therefore be said that the present invention is also highly effective for personal identification.
- The heights and weights of the detected objects A to C are A: 165 cm, 64 kg; B: 177 cm, 68 kg; and C: 182 cm, 68 kg. It is thought that individual differences appear in the sensor output depending on body type and manner of walking.
- FIG. 13 and FIG. 14 are diagrams showing a third embodiment of the information recognition device according to the present invention.
- In this embodiment, the information recognition device 1 of the first and second embodiments is configured so that the feature data for the motion of a recognition target and the feature data used when generating the motion pattern models stored in the operation pattern model storage unit 12 can be displayed as coordinate points in a space.
- In this case, for example, a two-dimensional projection unit and an information display unit are added to the information recognition device 1 shown in FIG. 1 of the first and second embodiments.
- The two-dimensional projection unit calculates the mathematical distance between the feature amount data used when generating the motion pattern models (hereinafter, first feature amount data) and the feature amount data obtained from the infrared detection unit 10 (hereinafter, second feature amount data), as well as the mathematical distances among the first feature amount data themselves.
- the feature data has multi-dimensional (four or more) information.
- The two-dimensional projection unit uses the well-known Sammon method (Jon W. Sammon, Jr., "A Nonlinear Mapping for Data Structure Analysis," IEEE Transactions on Computers, Vol. C-18, No. 5, May 1969) to project the multidimensional information into two-dimensional information.
- the information display unit has a function of displaying information of the projection result of the two-dimensional projection unit.
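Sammon's mapping minimizes a stress that weights each pairwise-distance error by the original distance. The sketch below uses plain gradient descent rather than the Newton-style update of Sammon's paper, a simplification for illustration:

```python
import numpy as np

def sammon_projection(X, n_iter=300, lr=0.3, eps=1e-9, seed=0):
    """Project n points in d dimensions (X: n x d array) into 2-D while
    preserving the pairwise-distance relationships."""
    n = X.shape[0]
    D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    D = np.maximum(D, eps)
    c = D[np.triu_indices(n, 1)].sum()     # normalizing constant of the stress
    np.fill_diagonal(D, 1.0)               # neutral diagonal (never used)
    Y = np.random.default_rng(seed).normal(size=(n, 2))  # random 2-D start
    for _ in range(n_iter):
        d = np.sqrt(((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1))
        d = np.maximum(d, eps)
        np.fill_diagonal(d, 1.0)
        ratio = (D - d) / (d * D)          # zero on the diagonal
        grad = (-2.0 / c) * (ratio[:, :, None] *
                             (Y[:, None, :] - Y[None, :, :])).sum(axis=1)
        Y -= lr * grad                     # gradient-descent step
    return Y
```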
- FIG. 13 is a diagram illustrating a display example of feature amount data that has been two-dimensionally projected.
- As the first feature amount data, the data of each of the five trials of the (1) to (8) behavior patterns by each of the individuals A to Q in the first embodiment is used. Therefore, for a given behavior pattern, five feature data per person (coordinate points of the same shape in FIG. 13) are two-dimensionally projected and displayed for each of A to Q.
- The two-dimensional projection unit first calculates the mathematical distances between the first feature data for the five trials of A to Q (calculated for each trial) and stores them in a data storage unit (not shown). Then, when a signal processing result (second feature data) is obtained from the infrared detection unit 10, the mathematical distances between the second feature data and the first feature data are calculated from these feature data. The stored mathematical distances among the first feature data for A to Q are then read, and, using the Sammon method described above, all feature data are two-dimensionally projected while these mathematical distance relationships are maintained. The coordinate information generated by the two-dimensional projection is input to the information display unit.
- The information display unit displays the acquired coordinate information as coordinate points having a different shape for each attribute, as shown in FIG. 13.
- In FIG. 13, the star-shaped coordinate point 40 represents the coordinates of the second feature data, and the correspondence between the shape of each coordinate point and A to Q is shown in frame 41.
- In this display, the coordinate point of the detection result lies closest to the coordinate points of attribute A. That is, by looking at the displayed coordinate points of the two-dimensional projection, an operator can recognize or predict the attribute (here, A) of the recognition target crossing the detection range 20.
- FIG. 14 is a flowchart showing the operation processing of the two-dimensional projection unit.
- step S400 the process proceeds to step S400, and it is determined whether or not the signal processing result of the infrared detection unit 10 has been acquired. If it is determined that the result has been acquired (Yes), the process proceeds to step S402. If not, move to step S410.
- In step S402, the first feature amount data is read from the motion pattern model storage unit 12, and the process proceeds to step S404. That is, the motion pattern model storage unit 12 also stores the first feature amount data.
- In step S404, the mathematical distance between the two is calculated based on the read feature amount data and the feature amount data resulting from the signal processing, and the process proceeds to step S406.
- In step S406, based on the mathematical distances between the feature data stored in advance in the data storage unit and the mathematical distance calculated in step S404, each feature data is two-dimensionally projected using the Sammon method while maintaining these mathematical distance relationships, and the process proceeds to step S408.
- In step S408, information on the projection result is input to the information display unit, and the flow returns to step S400.
- In step S410, reached when no signal processing result was acquired in step S400, it is determined whether or not the first feature amount data has been acquired. If it is determined that the data has been acquired (Yes), the process moves to step S412; if not (No), the process returns to step S400. In step S412, the mathematical distances between the first feature data are calculated, and the process proceeds to step S414.
- In step S414, the calculated mathematical distances are stored in the data storage unit, and the flow returns to step S400.
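The flow of FIG. 14 is essentially a polling loop that stores the first feature data and their mutual distances as they arrive (S410 to S414) and re-projects everything whenever a new signal processing result arrives (S402 to S408). A minimal sketch of that control flow, reusing the hypothetical `sammon_projection` above (the class and method names are illustrative, not from the patent):

```python
import numpy as np

class TwoDimensionalProjectionUnit:
    """Mimics the loop of FIG. 14: cache first feature data, then project
    all feature data to 2-D whenever a detection result arrives."""

    def __init__(self, display):
        self.first_features = []   # first feature data (S412/S414)
        self.display = display     # hands coordinates to the display unit

    def on_first_feature(self, vec):
        # S410 -> S412 -> S414 (the pairwise distances are recomputed
        # inside sammon_projection rather than cached, to keep this short)
        self.first_features.append(np.asarray(vec, dtype=float))

    def on_detection_result(self, second_feature):
        # S402 -> S408: project the stored first feature data together
        # with the new second feature data; the last row of the result
        # is the detection result's coordinate point.
        feats = np.vstack(self.first_features
                          + [np.asarray(second_feature, dtype=float)])
        self.display(sammon_projection(feats))
```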
- the display processing of the two-dimensional coordinates by the two-dimensional projection unit and the information display unit described above corresponds to the feature amount data display means and the detection result display means.
- FIG. 15 and FIG. 16 are diagrams showing a fourth embodiment of the information recognition apparatus according to the present invention.
- This embodiment is an alarm system that includes the information recognition device 1 of the first and second embodiments.
- the information recognition device 1 recognizes whether a detected object is a person or another moving object. This is an embodiment of a system that issues an alarm when it is determined from the recognition result that the object that has entered the detection range of the sensor is a person; that is, an application system using the recognition result of the information recognition device 1 of the first and second embodiments described above. Therefore, the same parts as those in the first and second embodiments are denoted by the same reference numerals and their description is omitted; only the different parts will be described.
- FIG. 15 is a block diagram showing the configuration of the alarm system according to the fourth embodiment of the present invention.
- the alarm system 2 includes the information recognition device 1, an alarm control unit 50 for controlling alarm generation based on the recognition result from the information recognition device 1, an alarm unit 51 for issuing an alarm in response to an alarm command from the alarm control unit 50, and a notifying unit 52 for notifying the system user of the contents of the alarm in response to a notification command from the alarm control unit 50.
- the motion pattern model storage unit 12 stores a motion pattern model generated for human motion patterns and a motion pattern model generated for the motion patterns of non-human animals such as dogs and cats.
- the recognition processing unit 13 has a function of recognizing the motion pattern information and attribute information of a detected body existing within the detection range of the pyroelectric infrared sensor 10a, based on the contents stored in the operation pattern model storage unit 12 and the feature amount data of the infrared detection result obtained from the infrared detection unit 10, and a function of transmitting the recognition result to the alarm control unit 50.
- the feature data is compared with the motion pattern models stored in the motion pattern model storage unit 12 to recognize whether or not the detected object is a person.
- the alarm control unit 50 has a function of determining whether or not the detected object is a person and, when it is determined that the detected object is a person, transmitting an instruction to the alarm unit 51 to issue an alarm and an instruction to the notification unit 52 to notify the system user of the contents of the alarm (for example, the intrusion of a person into the building). On the other hand, if the alarm control unit 50 determines that the detected object is other than a person, it transmits neither the alarm instruction nor the notification instruction to the alarm unit 51 and the notification unit 52, so no alarm or notification is issued.
- when recognition results for the same detected object arrive as a sequence such as "person, person, other than person, other than person, person, person, other than person, person, person, person, ...", the detected object is determined to be a person when, for example, "person" is recognized a predetermined number of times in succession (for example, three times or more). This makes it possible to reduce errors in the recognition result.
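A minimal sketch of such a decision rule (hypothetical; the patent gives only the consecutive-count example) might be:

```python
def is_person(recognition_results, required_consecutive=3):
    """Decide 'person' only when 'person' was recognized at least
    `required_consecutive` times in a row, smoothing out isolated
    misrecognitions in the result sequence."""
    run = 0
    for result in recognition_results:
        run = run + 1 if result == "person" else 0
        if run >= required_consecutive:
            return True
    return False

# An isolated misrecognition does not trigger an alarm:
assert not is_person(["person", "other", "person", "other", "person"])
assert is_person(["other", "person", "person", "person"])
```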
- the alarm unit 51 has a function of outputting a predetermined alarm sound, together with a voice alarm message, from a speaker (not shown) in response to an alarm command from the alarm control unit 50.
- the notifying unit 52 has a function of notifying the system user of the contents of the alarm via a network (not shown) in response to a notification command from the alarm control unit 50.
- apart from the information recognition device 1, the alarm system 2 includes a processor (not shown), a RAM (Random Access Memory), and a storage medium storing a dedicated program.
- the processor controls each of the above units by executing the dedicated program.
- the storage medium includes semiconductor storage media such as RAM and ROM, magnetic storage media such as FD and HD, optically readable storage media such as CD, CDV, LD, and DVD, and magnetically/optically readable storage media such as MO, and encompasses any computer-readable storage medium, regardless of whether the reading method is electronic, magnetic, optical, or otherwise.
- in the alarm system 2 described above, the pyroelectric infrared sensor 10a provided in the information recognition device 1 is installed near an entrance of a building such as an art museum or a jewelry store (on a route that must be passed when entering the building). The information recognition device 1 recognizes whether or not the detected body that has entered the building is a person, and based on that recognition result the alarm control unit 50 further determines whether or not the detected object is a person. If it is determined that the detected object is a person, an alarm is issued by the alarm unit 51, and the notification unit 52 notifies the system user of the contents of the alarm.
- when a "person" such as a thief invades the building, the intruder can be recognized as a person, and a warning and notification can be given by the alarm of the alarm unit 51 and the notification of the notification unit 52.
- when something other than a person, such as a dog or a cat, enters the building, it can be determined not to be a person, so unnecessary alarms and notifications are not issued.
- FIG. 16 is a flowchart showing the operation process of the alarm control unit 50.
- In step S500, it is determined whether or not a recognition result has been received from the recognition processing unit 13. If it is determined that a result has been received (Yes), the process proceeds to step S502; otherwise (No), the determination is repeated until a result is received.
- In step S502, the recognition result received in step S500 is stored in a storage unit (not shown), and the process proceeds to step S504.
- In step S504, it is determined whether a predetermined time has elapsed since the reception of the first recognition result, or whether the number of recognition results received for the same detected object has exceeded a predetermined number. If so (Yes), the process proceeds to step S506; otherwise (No), the process returns to step S500.
- In step S506, a process of determining whether or not the detected object is a person is performed based on the recognition results for the same detected object stored in the storage unit, and the process proceeds to step S508.
- The process of determining whether or not the detected object is a person is performed based on, for example, the number of times it has been consecutively recognized as a "person", as described above.
- In step S508, if the result of the determination in step S506 is that the detected object is a person (Yes), the process proceeds to step S510; if not (No), it proceeds to step S512.
- In step S510, an alarm instruction is transmitted to the alarm unit 51 and a notification instruction is transmitted to the notification unit 52, and the process proceeds to step S512.
- In step S512, the recognition results stored in the storage unit are deleted, the determination process ends, and the flow returns to step S500.
- In this way, the alarm control unit 50 determines whether the detected object is a person; if it is a person, the alarm unit 51 issues an alarm and the notification unit 52 notifies the system user of the contents of the alarm. When it is determined that the detected object is other than a person, the alarm unit 51 issues no alarm and the notification unit 52 sends no notification, so that unnecessary alarms and notifications can be avoided.
- the alarm control section 50 shown in FIG. 15 corresponds to the determining means described in claim 18, and the alarm section 51 and the notifying section 52 correspond to the alarm means described in claim 18.
- In the above embodiments, living bodies that emit thermal radiation, including those other than human beings, are the objects to be detected; however, the invention is not limited to this, and any other object that emits thermal radiation may be used as a detection target.
- Further, the present invention is not limited to the eight movement directions described above; recognition processing of other operation patterns may also be performed.
- the name of the detected object is described as an example of the attribute associated with the motion pattern model.
- as the attribute, another element such as gender, age, height, or weight may be associated, or a plurality of elements may be arbitrarily combined and associated.
- multidimensional feature data is projected onto two-dimensional coordinate information.
- however, the present invention is not limited to this, and the multidimensional feature data may be projected onto three-dimensional coordinate information.
- since the predetermined information on the detection target is recognized based on the detection result of the thermal radiation sensor and a plurality of types of operation pattern models, it is possible to recognize various information such as complex behavior patterns of the detected object and attributes of the detected object.
- recognition processing can be performed based on the detection result and a plurality of motion pattern models respectively corresponding to a plurality of types of motion patterns, and thus it is possible to recognize various types of information on the detected object within the detection range.
- according to the information recognition device of the third aspect, in addition to the effect of the first or second aspect, it is easy to add a new operation pattern model, and since an operation pattern model can be generated according to given conditions, it is possible to flexibly cope with changes in the operation pattern models caused by changes in the recognition content.
- since a pyroelectric infrared sensor is used as the thermal radiation sensor, it is possible to easily detect a moving object within the detection range.
- a behavior pattern is modeled using an HMM, which is a probabilistic model of time-series signals.
- since the output of the thermal radiation sensor changes according to the action content, moving speed, size, etc. of the detected object, it is possible to recognize the action content, moving speed, size, etc. of the detected object by generating and preparing in advance operation pattern models corresponding to these.
- operation pattern models may be provided for a plurality of types of detected objects, and therefore the information recognizing means can recognize the type of the detected object within the detection range.
- since the likelihood between the feature data and each motion pattern model is calculated and the predetermined information relating to the detected object is recognized based on the calculated likelihood, the predetermined information can be recognized easily.
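As an illustration of this likelihood-based recognition (a hypothetical sketch; the patent does not specify the HMM topology or emission model, so a discrete-emission HMM is assumed here), the forward algorithm scores a feature sequence against each stored motion pattern model, and the label of the best-scoring model is taken as the recognition result:

```python
import numpy as np

def log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm for a discrete-emission HMM.
    obs: observation symbol indices; pi: (n,) initial probabilities;
    A: (n, n) transition matrix; B: (n, m) emission matrix.
    Returns log P(obs | model)."""
    alpha = pi * B[:, obs[0]]
    log_p = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # forward recursion
        s = alpha.sum()                 # scaling avoids underflow
        log_p += np.log(s)
        alpha /= s
    return log_p

def recognize(obs, models):
    """`models` maps a label (e.g. 'person', 'dog') to its (pi, A, B);
    return the label whose motion pattern model best explains obs."""
    return max(models, key=lambda label: log_likelihood(obs, *models[label]))
```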
- since the likelihood with the motion pattern model is calculated for both the first feature amount data, composed of the frame-by-frame spectrum of the detection result, and the second feature amount data, composed of the frame-by-frame average amplitude value, and the predetermined information relating to the detection target is recognized based on the calculation result, it is possible to improve the recognition accuracy of the predetermined information.
- according to the information recognition apparatus of claim 12, since a value obtained by converting the frame-by-frame spectrum value into a common logarithmic value is used as the first feature value data, it is possible to improve the recognition accuracy of the predetermined information.
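A hypothetical sketch of this frame-by-frame feature extraction (frame length and hop are illustrative assumptions, not values from the patent):

```python
import numpy as np

def frame_features(signal, frame_len=64, hop=32):
    """Split the sensor output into frames and compute, per frame, the
    first feature data (common-logarithm amplitude spectrum) and the
    second feature data (average amplitude value)."""
    first, second = [], []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = np.asarray(signal[start:start + frame_len], dtype=float)
        spectrum = np.abs(np.fft.rfft(frame))
        first.append(np.log10(spectrum + 1e-12))  # common log of the spectrum
        second.append(np.mean(np.abs(frame)))     # average amplitude
    return np.array(first), np.array(second)
```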
- since the first and second feature data are calculated and the predetermined information is recognized using third feature amount data, which is the difference between the feature amount indicated by the first feature amount data of a selected frame and the feature amount indicated by the first feature amount data of the frame immediately preceding it, it is possible to further improve the recognition accuracy of the predetermined information.
- likewise, since the predetermined information is recognized using fourth feature amount data, which is the difference between the feature amount indicated by the second feature amount data of a selected frame and the feature amount indicated by the second feature amount data of the frame immediately preceding it, it is possible to further improve the recognition accuracy of the predetermined information.
- in addition to the effects of any one of claims 1 to 8, the feature amount data of the detection result can be displayed in comparison with the feature amount data corresponding to the motion pattern models of a plurality of other detected objects, making it possible to grasp the feature amount data visually and to recognize the predetermined information visually.
- the information recognition method described in claim 16 is realized by the information recognition device described in claim 1 and the like, and since its effects overlap with those of that device, description thereof is omitted.
- the information recognition program described in claim 17 is a program applicable to the information recognition device described in claim 1, and since its effects overlap with those of that device, description thereof is omitted.
- according to the crime prevention system of claim 18 of the present invention, humans can be distinguished from animals other than humans based on the recognition result of an information recognition device capable of recognizing various information such as complex motion patterns of the detected object and attributes of the detected object. When the system is used for building security, it is therefore possible to reduce false alarms that would otherwise occur when an animal other than a human entering the building is judged to be a human.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP05703554A EP1705612A4 (en) | 2004-01-15 | 2005-01-13 | INFORMATION DETECTION DEVICE, INFORMATION RECOGNITION PROCEDURE, INFORMATION RECOGNITION PROGRAM AND ALARM SYSTEM |
JP2005517059A JP4180600B2 (ja) | 2004-01-15 | 2005-01-13 | 情報認識装置、情報認識方法、情報認識プログラム及び警報システム |
US10/585,823 US20070241863A1 (en) | 2004-01-15 | 2005-01-13 | Information Recognition Device, Information Recognition Method, Information Recognition Program, and Alarm System |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-008240 | 2004-01-15 | ||
JP2004008240 | 2004-01-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005069222A1 (ja) | 2005-07-28 |
Family
ID=34792217
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/000315 WO2005069222A1 (ja) | 2004-01-15 | 2005-01-13 | 情報認識装置、情報認識方法、情報認識プログラム及び警報システム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20070241863A1 (ja) |
EP (1) | EP1705612A4 (ja) |
JP (1) | JP4180600B2 (ja) |
CN (1) | CN100527167C (ja) |
WO (1) | WO2005069222A1 (ja) |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8482613B2 (en) * | 2007-09-10 | 2013-07-09 | John Kempf | Apparatus and method for photographing birds |
US8142023B2 (en) * | 2007-12-21 | 2012-03-27 | Honda Motor Co., Ltd. | Optimized projection pattern for long-range depth sensing |
WO2010055205A1 (en) * | 2008-11-11 | 2010-05-20 | Reijo Kortesalmi | Method, system and computer program for monitoring a person |
JP5340899B2 (ja) * | 2009-11-27 | 2013-11-13 | 綜合警備保障株式会社 | 警備装置およびセンサ反応要因の推定方法 |
US8462002B2 (en) | 2010-06-18 | 2013-06-11 | The Invention Science Fund I, Llc | Personal telecommunication device with target-based exposure control |
US8686865B2 (en) | 2010-06-18 | 2014-04-01 | The Invention Science Fund I, Llc | Interactive technique to reduce irradiation from external source |
US8810425B2 (en) | 2010-06-18 | 2014-08-19 | The Invention Science Fund I, Llc | Travel route mapping based on radiation exposure risks |
US8463288B2 (en) | 2010-06-18 | 2013-06-11 | The Invention Science Fund I, Llc | Irradiation self-protection from user telecommunication device |
TWI421477B (zh) * | 2010-08-30 | 2014-01-01 | Emcom Technology Inc | 溫度變化感應裝置及其方法 |
CN102466524B (zh) * | 2010-11-09 | 2015-03-04 | 好庆科技企业股份有限公司 | 温度变化感应装置及其方法 |
US9460350B2 (en) * | 2011-07-01 | 2016-10-04 | Washington State University | Activity recognition in multi-entity environments |
US9600744B2 (en) | 2012-04-24 | 2017-03-21 | Stmicroelectronics S.R.L. | Adaptive interest rate control for visual search |
US8829439B2 (en) * | 2012-10-16 | 2014-09-09 | The United States Of America As Represented By The Secretary Of The Army | Target detector with size detection and method thereof |
US9380275B2 (en) * | 2013-01-30 | 2016-06-28 | Insitu, Inc. | Augmented video system providing enhanced situational awareness |
US10482759B2 (en) * | 2015-05-13 | 2019-11-19 | Tyco Safety Products Canada Ltd. | Identified presence detection in and around premises |
JP6646549B2 (ja) * | 2016-08-30 | 2020-02-14 | アズビル株式会社 | 監視装置、監視方法、およびプログラム。 |
US10712204B2 (en) | 2017-02-10 | 2020-07-14 | Google Llc | Method, apparatus and system for passive infrared sensor framework |
US11179293B2 (en) | 2017-07-28 | 2021-11-23 | Stryker Corporation | Patient support system with chest compression system and harness assembly with sensor system |
CN112419637B (zh) * | 2019-08-22 | 2024-05-14 | 北京奇虎科技有限公司 | 安防图像数据的处理方法及装置 |
CN111557647B (zh) * | 2020-03-23 | 2024-02-27 | 未来穿戴技术有限公司 | 体温检测的方法、颈部按摩仪和装置 |
US20220101494A1 (en) * | 2020-09-30 | 2022-03-31 | Nvidia Corporation | Fourier transform-based image synthesis using neural networks |
CN112598865B (zh) * | 2020-12-14 | 2023-03-03 | 深圳供电局有限公司 | 一种电缆线路防外力破坏的监控方法及系统 |
CN113037311A (zh) * | 2021-02-24 | 2021-06-25 | 重庆工程职业技术学院 | 一种用于室内定位导航的手环 |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH03278696A (ja) * | 1990-03-28 | 1991-12-10 | Toshiba Corp | リモートコントロール装置 |
EP0566852B1 (en) * | 1992-04-21 | 1998-08-26 | Mitsubishi Denki Kabushiki Kaisha | Human body detection system |
JPH06117925A (ja) * | 1992-09-30 | 1994-04-28 | Horiba Ltd | 差出力型デュアルツイン焦電検出器 |
JPH06266840A (ja) * | 1993-03-11 | 1994-09-22 | Hitachi Ltd | 移動物体の状態検出装置 |
JPH0784592A (ja) * | 1993-09-14 | 1995-03-31 | Fujitsu Ltd | 音声認識装置 |
JPH07288875A (ja) * | 1994-04-14 | 1995-10-31 | Matsushita Electric Ind Co Ltd | 人体動作認識センサ及び非接触型操作装置 |
JPH08161292A (ja) * | 1994-12-09 | 1996-06-21 | Matsushita Electric Ind Co Ltd | 混雑度検知方法およびそのシステム |
JPH08305853A (ja) * | 1995-04-28 | 1996-11-22 | Mitsubishi Electric Corp | 対象の認識およびそれにもとづく意思決定の方法およびその装置 |
DE69616191T2 (de) * | 1995-07-19 | 2002-03-14 | Matsushita Electric Industrial Co., Ltd. | Bewegungsmuster-Erkennungseinrichtung zum Bestimmen der Bewegung von Menschen sowie zum Zählen vorbeigegangener Personen |
JP3497632B2 (ja) * | 1995-09-20 | 2004-02-16 | セコム株式会社 | 検出器 |
JP3086406B2 (ja) * | 1995-10-04 | 2000-09-11 | オプテックス株式会社 | 受動型赤外線式人体検知装置 |
JPH09101204A (ja) * | 1995-10-06 | 1997-04-15 | Matsushita Electric Ind Co Ltd | 焦電型赤外線検出装置 |
JP3279175B2 (ja) * | 1996-04-30 | 2002-04-30 | 松下電工株式会社 | 赤外線検出装置 |
JPH10160856A (ja) * | 1996-11-28 | 1998-06-19 | Nec Robotics Eng Ltd | 焦電型人体検知装置 |
JPH1172386A (ja) * | 1997-08-29 | 1999-03-16 | Matsushita Electric Works Ltd | 人体検知センサ |
US6092192A (en) * | 1998-01-16 | 2000-07-18 | International Business Machines Corporation | Apparatus and methods for providing repetitive enrollment in a plurality of biometric recognition systems based on an initial enrollment |
CN1297343A (zh) * | 1998-05-06 | 2001-05-30 | 松下电器产业株式会社 | 耳式妇女体温计 |
JP4053188B2 (ja) * | 1999-07-06 | 2008-02-27 | 富士通株式会社 | パターン切り出し装置及びパターン認識装置 |
JP4404329B2 (ja) * | 1999-12-28 | 2010-01-27 | ホーチキ株式会社 | 炎検出装置 |
JP2003030240A (ja) * | 2001-07-13 | 2003-01-31 | Ntt Data Corp | データ検索装置、データ検索方法、及びデータ検索プログラム |
US20030058111A1 (en) * | 2001-09-27 | 2003-03-27 | Koninklijke Philips Electronics N.V. | Computer vision based elderly care monitoring system |
JP2003281543A (ja) * | 2002-03-26 | 2003-10-03 | Namco Ltd | 動作認識装置、動作検出方法および情報記録媒体 |
- 2005
- 2005-01-13 US US10/585,823 patent/US20070241863A1/en not_active Abandoned
- 2005-01-13 JP JP2005517059A patent/JP4180600B2/ja not_active Expired - Fee Related
- 2005-01-13 WO PCT/JP2005/000315 patent/WO2005069222A1/ja not_active Application Discontinuation
- 2005-01-13 CN CNB2005800018076A patent/CN100527167C/zh not_active Expired - Fee Related
- 2005-01-13 EP EP05703554A patent/EP1705612A4/en not_active Withdrawn
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2766820B2 (ja) | 1991-09-20 | 1998-06-18 | セイコープレシジョン株式会社 | 人体移動方向判別装置 |
JPH06251159A (ja) * | 1993-03-01 | 1994-09-09 | Nippon Telegr & Teleph Corp <Ntt> | 動作認識装置 |
JPH0755573A (ja) * | 1993-08-20 | 1995-03-03 | Matsushita Electric Ind Co Ltd | 車搭載用人体検知センサー及び人体検知連動装置 |
JPH0933215A (ja) * | 1995-07-19 | 1997-02-07 | Matsushita Electric Ind Co Ltd | 移動パターン認識装置 |
JPH0942924A (ja) * | 1995-07-31 | 1997-02-14 | Matsushita Electric Works Ltd | 熱画像センサシステム |
JP2001304973A (ja) * | 2000-04-26 | 2001-10-31 | Denso Corp | 赤外線イメージセンサ |
Non-Patent Citations (2)
Title |
---|
JON W. SAMMON, JR.: "A Nonlinear Mapping for Data Structure Analysis", IEEE TRANS. COMPUTERS, vol. C-18, no. 5, May 1969 (1969-05-01) |
See also references of EP1705612A4 |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006346180A (ja) * | 2005-06-16 | 2006-12-28 | Asahi Kasei Corp | 出力時間変化値生成装置、出力時間変化値生成方法及びプログラム |
JP4667130B2 (ja) * | 2005-06-16 | 2011-04-06 | 旭化成株式会社 | 出力時間変化値生成装置、出力時間変化値生成方法及びプログラム |
JP2008247194A (ja) * | 2007-03-30 | 2008-10-16 | Kenwood Corp | カーセキュリティ装置 |
JP2009281981A (ja) * | 2008-05-26 | 2009-12-03 | Hitachi Plant Technologies Ltd | 人体検出装置、および、それを用いた人体検出システム |
JP2010071984A (ja) * | 2008-08-21 | 2010-04-02 | Asahi Kasei Corp | 検知装置 |
CN102176067A (zh) * | 2010-12-29 | 2011-09-07 | 神华集团有限责任公司 | 获取地下煤火变化信息的方法 |
JP2012181630A (ja) * | 2011-02-28 | 2012-09-20 | Sogo Keibi Hosho Co Ltd | 警備装置および警備動作切替え方法 |
JP2014153246A (ja) * | 2013-02-12 | 2014-08-25 | Mega Chips Corp | センサ装置およびセンサ応用機器 |
JP2016537698A (ja) * | 2013-09-25 | 2016-12-01 | フィリップス ライティング ホールディング ビー ヴィ | 検出システム及び方法、並びにこのような検出システムを用いる空間制御システム |
JP2017523376A (ja) * | 2014-06-03 | 2017-08-17 | ザ・セキュリティ・オラクル・インク | 防御及び拒絶システム |
JP2017004106A (ja) * | 2015-06-05 | 2017-01-05 | トヨタ自動車株式会社 | 車両の衝突回避支援装置 |
JP2017215668A (ja) * | 2016-05-30 | 2017-12-07 | Necプラットフォームズ株式会社 | 警報通知装置、警報通知システム及び警報通知プログラム |
CN108616725A (zh) * | 2018-05-21 | 2018-10-02 | 佛山科学技术学院 | 一种人工智能精确捕鱼装置及其工作系统和方法 |
CN110487410A (zh) * | 2019-07-31 | 2019-11-22 | 上海电力大学 | 多模态图像特征融合的电力设备构件温度提取方法及装置 |
CN110487410B (zh) * | 2019-07-31 | 2021-03-02 | 上海电力大学 | 多模态图像特征融合的电力设备构件温度提取方法及装置 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2005069222A1 (ja) | 2008-04-24 |
EP1705612A1 (en) | 2006-09-27 |
JP4180600B2 (ja) | 2008-11-12 |
EP1705612A8 (en) | 2006-11-02 |
CN1906638A (zh) | 2007-01-31 |
CN100527167C (zh) | 2009-08-12 |
US20070241863A1 (en) | 2007-10-18 |
EP1705612A4 (en) | 2012-07-04 |
Legal Events
Code | Title | Description
---|---|---
WWE | Wipo information: entry into national phase | Ref document number: 200580001807.6; Country of ref document: CN
AK | Designated states | Kind code of ref document: A1; Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW
AL | Designated countries for regional patents | Kind code of ref document: A1; Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
DPEN | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101) |
WWE | Wipo information: entry into national phase | Ref document number: 2005517059; Country of ref document: JP
WWE | Wipo information: entry into national phase | Ref document number: 2005703554; Country of ref document: EP; Ref document number: 10585823; Country of ref document: US
NENP | Non-entry into the national phase | Ref country code: DE
WWW | Wipo information: withdrawn in national office | Ref document number: DE
WWP | Wipo information: published in national office | Ref document number: 2005703554; Country of ref document: EP
WWP | Wipo information: published in national office | Ref document number: 10585823; Country of ref document: US