WO2010103584A1 - Entry/exit detection device, monitoring device, and entry/exit detection method - Google Patents
Entry/exit detection device, monitoring device, and entry/exit detection method
- Publication number
- WO2010103584A1 (PCT/JP2009/005379)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- positioning
- area
- target
- person
- entry
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B29/00—Checking or monitoring of signalling or alarm systems; Prevention or correction of operating errors, e.g. preventing unauthorised operation
- G08B29/18—Prevention or correction of operating errors
- G08B29/185—Signal analysis techniques for reducing or preventing false alarms or for enhancing the reliability of the system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
Definitions
- the present invention relates to an entry / exit detection device, a monitoring device, and an entry / exit detection method, and to a technique for detecting entry and / or exit of an object to a specific area (that is, entry / exit).
- the first type of device is of a type that detects the entry / exit of a person to / from a specific area provided with a locking device, an entrance / exit and / or a gate.
- the second type of device is a type that can detect even a person entering and leaving a so-called open specific area (hereinafter referred to as an open area) that is not provided with a locking device, a doorway, or a gate.
- Examples of the first type device include those disclosed in Patent Document 1, Patent Document 2, Patent Document 3, and the like.
- Examples of the second type apparatus include those disclosed in Patent Document 4, Patent Document 5, and the like.
- the second type of device has the advantage that its application range is wider than that of the first type of device, in that it can be applied to an open area without a locking device, an entrance / exit, or a gate.
- Patent Document 4 discloses an apparatus in which a person is detected by a monitoring camera, and whether or not a suspicious person has entered the specific area is determined based on the detected position of the person, the position of the specific area, and the staying time of the person.
- Patent Document 5 discloses an apparatus that determines whether a mobile terminal (person) is inside or outside a specific area based on the relative positional relationship between the positioning result of the mobile terminal and the boundary of the specific area.
- Patent Document 5 describes an entry / exit determination method that takes a positioning error into consideration.
- However, the entry / exit determination disclosed in Patent Document 5 merely suppresses, for example, repeated entry and exit determinations caused by positioning errors while the mobile terminal (person) remains stopped near the boundary of the area; it is not intended to fundamentally improve the accuracy of entry / exit detection.
- Here, detection accuracy means that entry / exit detection can be performed in accordance with the requirements of the supervisor, for example the requirement to reduce detection omissions (false negatives) and false detections (false positives).
- An object of the present invention is to provide an entry / exit detection device, a monitoring device, and an entry / exit detection method that can accurately detect entry / exit of a detection target into / from a specific area even when a positioning error occurs.
- One aspect of the entry / exit detection device of the present invention includes: a positioning reliability detection unit that detects the positioning reliability of a positioning unit; an existence area determination unit that determines, based on the positioning result obtained by the positioning unit and the positioning reliability obtained by the positioning reliability detection unit, an existence area in which the object measured by the positioning unit may exist; and a determination unit that determines, based on the overlap between the existence area and the specific area, whether the target enters and / or leaves the specific area.
- One aspect of the monitoring device of the present invention includes: the above entry / exit detection device; an alarm output unit that outputs an alarm based on the determination result obtained by the determination unit; and a visual recognition region forming unit that determines the visual recognition region of a target. When the determination unit determines that a first target has entered the specific area and the positioning position of the first target is included in the visual recognition region of a second target different from the first target, the alarm output unit, which includes a warning necessity determination unit, does not output a warning about the first target.
- FIG. 4 is a diagram for explaining how to obtain the existence possible area
- FIG. 4A is a diagram illustrating an example in which a 95% confidence interval area in the probability distribution is set as a possible area
- FIG. 4B is a diagram showing an area having a certain probability or more.
- FIG. 5A-1 is a diagram showing a possible area set when the positioning reliability is low
- FIG. 5B-1 is a diagram showing a possible area set when the positioning reliability is high
- FIGS. 5A-2 and 5B-2 are diagrams illustrating that, even if the position of the positioning point is the same, the existence possible area may or may not overlap the monitoring area due to a difference in positioning reliability.
- Block diagram showing the configuration of the entry / exit detection device according to Embodiment 1 of the present invention
- Diagram showing an example of person position and reliability
- Diagram showing an example of the monitoring area information stored in the monitoring area DB
- Flow chart showing the processing procedure of the possible area creation unit
- Flowchart showing the processing procedure of the entry determination unit
- FIG. 15A is a diagram showing a case where a non-authorized person's positioning point is detected in the viewing area
- FIG. 15B is a diagram showing a case where an unauthorized person's positioning point is not detected in the viewing area.
- FIG. 17A is a diagram showing an example of person position and reliability for an intruder without entry authority
- FIG. 17B is a diagram showing a first example of person position and reliability for a person with entry authority
- FIG. 17C is a diagram showing a second example of person position and reliability for a person with entry authority
- FIG. 17D is a diagram showing an example of monitoring area information stored in the monitoring area DB.
- FIG. 20A is a diagram showing an example where no alarm is required
- FIG. 20B is a diagram showing an example where an alarm is necessary.
- Block diagram showing the configuration of the monitoring system of the third embodiment
- FIG. 24A is a diagram showing an example of the person position and reliability for an intruder without entry authority
- FIG. 24B is a diagram showing a first example of the person position and reliability for a person with entry authority
- FIG. 24C is a diagram showing a second example of person position and reliability for a person with entry authority
- FIG. 24D is a diagram showing an example of monitoring area information stored in the monitoring area DB.
- FIG. 27A is a diagram showing an example where no alarm is required
- FIG. 27B is a diagram showing an example where an alarm is necessary
- Diagram for explaining collision determination considering the reliability of positioning points
- FIG. 30A is a diagram showing an example of a person position and reliability for an intruder without entry authority
- FIG. 30B is a diagram showing a first example of a person position and reliability for a person with entry authority
- FIG. 30C is a diagram showing a second example of person position and reliability for a person with entry authority
- FIG. 30D is a diagram showing an example of monitoring area information stored in the monitoring area DB.
- the inventors of the present invention have focused on the fact that the reliability of positioning results changes due to differences in positioning environment, etc., even when the same positioning results are obtained.
- For example, when positioning is performed using camera images, the occlusion (concealment) situation between persons and the degree of matching with a person template change with each positioning, so the positioning error takes a different value each time, and the reliability of the positioning result also changes.
- Similarly, in the case of radio positioning, the positioning environment changes due to the influence of radio wave absorbers such as the human body, which contains a lot of moisture, and conductors such as metal, so the positioning error also takes a different value each time, and the reliability of the positioning result also changes.
- Therefore, in the present invention, an existence possible area of the target that reflects the reliability of the positioning result is set, and entry / exit of the target with respect to the specific area is determined based on the overlapping state of the existence possible area and the specific area. As a result, even when a positioning error occurs, entry / exit of the detection target into / from the specific area can be detected accurately.
- FIG. 1 shows an image of entry / exit detection according to the present embodiment.
- the black circle indicated by the reference symbol P1 indicates a positioning point that is a result of positioning the target.
- a circle indicated by the reference symbol AR1 and centered on the positioning point P1 indicates a region where the object can exist.
- Reference symbol L0 indicates a flow line that is an actual movement trajectory of the object, and reference symbol L1 indicates a flow line that connects the positioning points P1 at a plurality of times.
- the possible area AR1 is a smaller circle as the reliability of the positioning result is higher, and conversely, it is a larger circle as the reliability of the positioning result is lower.
- In one determination method, the existence possible area AR1 is defined as a probability density, and when the integrated value (cumulative probability density) of the probability density over the portion where one existence possible area AR1 overlaps the monitoring area AR0 is equal to or greater than a threshold value, it is determined that the target has entered the monitoring area AR0.
- In another determination method, the existence possible area AR1 is defined as a probability density, and when the sum of the integrated values of the probability densities over the portions where the existence possible areas AR1 of a plurality of consecutive positioning points P1 within a predetermined period overlap the monitoring area AR0 is equal to or greater than a threshold value, it is determined that the target has entered the monitoring area AR0.
- Method (ii) has the advantage over (i) that the accuracy of the entry / exit determination can easily be changed by changing the threshold value, although a calculation for obtaining the overlapping area is required. Method (iii) has the advantage over (ii) that a more accurate entry / exit determination can be obtained based on the existence probability, although a complicated integration of the probability density function is required. Which of (i), (ii), and (iii) to use may therefore be decided by a trade-off between calculation amount / calculation time and determination accuracy.
- Methods (iv), (v), and (vi) use a plurality of positioning points and therefore improve the accuracy of the entry / exit determination compared with (i), (ii), and (iii), respectively. However, since it takes time to collect data for a plurality of positioning points, which determination method to use may be decided according to whether an immediate determination is required.
- the determination method and threshold value may be selected as follows according to the required detection accuracy.
- For example, when the overlap between the monitoring area AR0 and the existence possible area AR1 is used as in (ii) and the threshold is set to 0.5, the overlapping portion becomes 0.5 for a positioning point on the boundary line of the monitoring area AR0 (a slight deviation may occur depending on the shape of the monitoring area), so the detection accuracy is equivalent to that obtained using only conventional positioning points.
- By setting the threshold value to 0.01, for example, it is possible to determine entry even in cases where there is only a possibility that the target has entered the monitoring area AR0 when the error is taken into consideration.
- By setting the threshold value to 0.99, for example, it is possible to determine entry only when the target is certainly inside the monitoring area AR0 even when the error is taken into consideration.
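- As a concrete illustration of the threshold-based determination described above, the sketch below estimates by Monte Carlo how much of the positioning distribution of Expression (1) falls inside a rectangular monitoring area and compares the result with a threshold. The coordinates, the σ value, and the 0.5 threshold are illustrative assumptions, not values taken from the patent.

```python
# Monte Carlo sketch of the cumulative-probability entry test described above.
# Assumptions: isotropic 2D normal positioning error (Expression (1)) and a
# rectangular monitoring area; all numbers below are illustrative.
import numpy as np

def entry_probability(mu_x, mu_y, sigma, area, n_samples=100_000, seed=0):
    """Estimate the probability mass of the positioning distribution that
    falls inside the rectangular monitoring area (x_min, y_min, x_max, y_max)."""
    rng = np.random.default_rng(seed)
    xs = rng.normal(mu_x, sigma, n_samples)
    ys = rng.normal(mu_y, sigma, n_samples)
    x_min, y_min, x_max, y_max = area
    inside = (xs >= x_min) & (xs <= x_max) & (ys >= y_min) & (ys <= y_max)
    return inside.mean()

monitoring_area = (0.0, 0.0, 500.0, 300.0)      # cm, illustrative rectangle
p = entry_probability(mu_x=10.0, mu_y=150.0, sigma=35.0, area=monitoring_area)
print("entry" if p >= 0.5 else "no entry", f"(cumulative probability = {p:.2f})")
```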
- The area of the overlapping region between the existence possible area AR1 and the monitoring area AR0 is the region indicated by hatching on the lower side of FIG. 2 (the portion of the existence possible area AR1 that lies within the monitoring area AR0).
- The integrated value of the probability density over the portion where the existence possible area AR1 and the monitoring area AR0 overlap is the integral of the probability density, indicated by hatching on the upper side of FIG. 2, over that overlapping portion.
- the overlapping portion of the possible area AR1 of the plurality of positioning points P1 within the predetermined period and the monitoring area AR0 is a portion indicated by hatching in FIG.
- A value σ² corresponding to the variance of the positioning result (positioning data) is obtained as the reliability from the environment at the time of positioning, the positioning situation, and the like.
- The same variance σ² is assumed in the x-axis direction and the y-axis direction, and the covariance is 0 (the positioning accuracy in the x-axis direction and the y-axis direction is independent).
- A probability distribution (for example, a normal distribution) is then obtained by introducing a probability density function of a normal distribution with the positioning coordinates (μx, μy) as the mean and the reliability σ² as the variance, as in the following equation.
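- Consistent with this description (mean (μx, μy), a common variance σ², and zero covariance), Expression (1) corresponds to the standard isotropic two-dimensional normal density. The form below is reconstructed from the surrounding definitions rather than copied from the patent drawings:

```latex
f(x, y) \;=\; \frac{1}{2\pi\sigma^{2}}
\exp\!\left(-\,\frac{(x-\mu_{x})^{2}+(y-\mu_{y})^{2}}{2\sigma^{2}}\right)
\tag{1}
```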
- The x-axis and y-axis variances are the same σ² and the covariance is 0, so the existence possible area AR1 is a circle.
- The percentage of the confidence interval may be changed according to the security level of the monitoring area AR0: if the existence-probability criterion is raised, the existence possible area AR1 for the same positioning result becomes larger and the security level can be increased (that is, a suspicious positioning result near the boundary just outside the monitoring area AR0 is more likely to be determined as an entry).
- The two-dimensional normal distribution (probability distribution) used when setting the existence possible area AR1 may be obtained by the following equation, Expression (2), instead of Expression (1), where:
- μx is the x coordinate of the positioning result,
- μy is the y coordinate of the positioning result,
- σx² is the variance of x,
- σy² is the variance of y, and
- σxy is the covariance of x and y.
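- With these symbols, Expression (2) corresponds to the general bivariate normal density. Writing ρ = σxy / (σx σy) for the correlation coefficient, a form consistent with the definitions above (again a reconstruction, not copied from the patent drawings) is:

```latex
f(x, y) \;=\; \frac{1}{2\pi\sigma_{x}\sigma_{y}\sqrt{1-\rho^{2}}}
\exp\!\left(-\,\frac{1}{2(1-\rho^{2})}
\left[\frac{(x-\mu_{x})^{2}}{\sigma_{x}^{2}}
-\frac{2\rho\,(x-\mu_{x})(y-\mu_{y})}{\sigma_{x}\sigma_{y}}
+\frac{(y-\mu_{y})^{2}}{\sigma_{y}^{2}}\right]\right),
\qquad \rho=\frac{\sigma_{xy}}{\sigma_{x}\sigma_{y}}
\tag{2}
```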
- Equation (1) may be applied when the x component and the y component can be measured independently and have the same accuracy.
- Expression (2) is preferably applied when the positioning of the x component and the y component is correlated and the positioning accuracy is different.
- the existence possible area AR1 set using Expression (2) is an ellipse.
- FIG. 5 shows the state of entry / exit detection according to the present embodiment.
- FIG. 5A-1 shows the possible area AR1 set when the positioning reliability is low
- FIG. 5B-1 shows the possible area AR1 set when the positioning reliability is high.
- As shown in FIGS. 5A-2 and 5B-2, even if the position of the positioning point P1 is the same, a part of the existence possible area AR1 falls within the monitoring area AR0 when the positioning reliability is low (FIG. 5A-2), whereas it does not when the positioning reliability is high (FIG. 5B-2).
- In this way, the existence possible area AR1 of the target reflecting the reliability of the positioning result is set, and entry and / or exit (that is, entry / exit) with respect to the monitoring area AR0 is determined based on the overlapping state of the existence possible area AR1 and the monitoring area AR0.
- FIG. 6 shows the configuration of the entry / exit detection device of the present embodiment.
- In the following, the entry / exit detection device 100 of the present embodiment is described as applied to a monitoring system; however, the entry / exit detection device 100 is not limited to a monitoring system and can be applied to various uses, for example a content distribution system that wirelessly distributes content related to a specific area only to persons detected as having entered that area.
- a case where the target for entry / exit detection is a person will be described.
- The detection target is not limited to a person; the invention can also be applied to articles, vehicles, mobile monitoring robots, and other movable objects capable of being positioned.
- The monitoring system 110 captures video including the target (person) 130 with the imaging unit 121 of the camera 120, and sends the captured video to the person position calculation unit 101 and the reliability extraction unit 102 of the entry / exit detection device 100 as well as to the determination result notification unit 103.
- The person position calculation unit 101 calculates the position of the person 130 in the video acquired from the imaging unit 121 and sends the calculated position information to the existence possible area creation unit 104 as positioning information.
- the reliability extraction unit 102 extracts the reliability of positioning in the person position calculation unit 101 based on the video acquired from the imaging unit 121, and sends the extracted reliability to the possible area creation unit 104.
- the reliability of positioning is a value corresponding to a dispersion value of positioning results (positioning data) that changes in accordance with, for example, the environment at the time of positioning or the positioning situation as described above.
- The reliability extraction unit 102 extracts the reliability (variance value) based on positioning-status parameters such as the number of pixels constituting the person 130 in the video, the occlusion (hiding) status between persons, and the sharpness of the contour, for example. Specifically, the reliability (variance value) can be obtained by referring to a reliability table prepared in advance (a table indicating the correspondence between positioning-status parameters and reliability or variance values). The reliability table may be derived from a theoretical model of the image positioning method, or obtained from actual measurements by pre-sampling.
- For example, the reliability extraction unit 102 outputs a higher reliability (smaller variance value) as the number of pixels constituting the person 130 increases (that is, as the person 130 appears larger), as the occlusion (concealment) between persons decreases, or as the contour of the person 130 becomes sharper.
- In the above description, the positioning reliability is extracted based on the video acquired from the imaging unit 121; alternatively, the reliability may be extracted based on the degree of matching with the person template in the person position calculation unit 101. That is, the higher the matching degree, the higher the reliability (the smaller the variance value).
- the reliability (dispersion value) can be obtained by preparing a reliability table (a table indicating the correspondence between the matching degree with the template and the reliability or the dispersion value) in advance. The method of extracting the reliability is not limited to these.
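- A minimal sketch of such a table lookup is shown below; the parameter bins and variance values are hypothetical placeholders, since a real table would come from the theoretical model or pre-sampling mentioned above.

```python
# Illustrative reliability table: maps positioning-status parameters to a
# variance value sigma^2 (smaller variance = higher reliability).
# The bins and values are hypothetical, not figures from the patent.
def lookup_variance(person_pixels, occlusion_ratio, template_match):
    """Return sigma^2 in cm^2 for one positioning result."""
    if person_pixels > 5000 and occlusion_ratio < 0.1 and template_match > 0.9:
        return 15.0 ** 2      # large, unoccluded, well-matched person -> high reliability
    if person_pixels > 2000 and occlusion_ratio < 0.3:
        return 35.0 ** 2      # intermediate case
    return 60.0 ** 2          # small or heavily occluded person -> low reliability

print(lookup_variance(person_pixels=6500, occlusion_ratio=0.05, template_match=0.95))
```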
- FIG. 7 shows an example of the person position calculated by the person position calculation unit 101 and the reliability extracted by the reliability extraction unit 102. For each person (person ID), a time at which the person is detected, the coordinates of the person position, and a probability density function representing the reliability are obtained.
- a variance value of a two-dimensional normal distribution is used as an index indicating how reliable the positioning result is.
- The probability density functions f1(x, y), g1(x, y), and so on that appear in the present embodiment are each two-dimensional normal distributions expressed by Expression (1); different function names or subscripts indicate different values of the variance σ², which is the parameter of the two-dimensional normal distribution.
- The existence possible area creation unit 104 creates the existence possible area AR1 (see FIGS. 1 to 5) based on the positioning information acquired from the person position calculation unit 101 and the positioning reliability acquired from the reliability extraction unit 102, and sends the created existence possible area AR1 to the entry determination unit 105.
- the method of creating the existence possible area AR1 is as described above with reference to FIG.
- the monitoring area database (DB) 106 holds area information indicating the monitoring area AR0, and provides monitoring area information to the approach determination unit 105.
- FIG. 8 shows an example of monitoring area information stored in the monitoring area DB 106. In the example of FIG. 8, two monitoring areas (Nos. 1 and 2) are stored. In the example shown in the figure, the monitoring area is represented by rectangular coordinates. The monitoring area may be set by the user using, for example, GUI (Graphical User Interface).
- The entry determination unit 105 performs any one of the determinations (i) to (vi) described above to determine whether or not the person 130 has entered the monitoring area AR0, and sends the entry determination result to the determination result notification unit 103.
- the determination result notification unit 103 acquires the determination result from the entry determination unit 105, and when the determination result indicates entry, the determination result notification unit 103 marks the video acquired from the imaging unit 121 so that the intruder can be recognized and outputs the marking to the monitor 141. Thus, the manager 142 is informed that the person 130 has entered the management area AR0.
- FIG. 9 shows an example of an approach warning image displayed on the monitor 141.
- the person 130 determined to enter the management area AR0 is surrounded by a frame.
- In addition, the characters “finding an intruder!” are displayed. Thereby, the administrator 142 can identify the person 130 that has entered the management area AR0.
- FIG. 10 shows the processing procedure of the possible area creation unit 104.
- the existable area creation unit 104 acquires positioning information from the person position calculation unit 101 in step ST11, and acquires reliability from the reliability extraction unit 102 in step ST12.
- the possible area creation unit 104 creates a 95% confidence area for the positioning point as the possible area AR1 from the positioning information and the reliability, for example, as shown in FIG. 4A.
- the possible existence area AR1 to be created is not limited to an area having a confidence interval of 95%.
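- For the isotropic distribution of Expression (1), the radius of a p-confidence circle follows in closed form from the chi-square distribution with two degrees of freedom, so the existence possible area AR1 can be computed as sketched below. The 95 % figure matches the example of FIG. 4A; the σ value is illustrative.

```python
# Radius of the p-confidence circle for an isotropic 2D normal (Expression (1)):
# P(distance from mean <= r) = 1 - exp(-r^2 / (2 sigma^2))
#   =>  r = sigma * sqrt(-2 * ln(1 - p))
import math

def confidence_radius(sigma, p=0.95):
    return sigma * math.sqrt(-2.0 * math.log(1.0 - p))

sigma = 35.0                        # reliability (standard deviation), illustrative
print(confidence_radius(sigma))     # ~85.7 -> radius of existence possible area AR1
```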
- FIG. 11 shows a processing procedure of the entry determination unit 105.
- The entry determination unit 105 acquires information on the existence possible area AR1 from the existence possible area creation unit 104, and in step ST22 acquires information on the monitoring area AR0 from the monitoring area DB 106.
- Here, a case where the determination (i) described above is performed in step ST23 is shown, but any one of (ii) to (vi) may be adopted instead of determination (i).
- If a negative result is obtained in step ST23 (step ST23; No), it is determined that the person 130 has not entered the monitoring area AR0; if a positive result is obtained in step ST23 (step ST23; Yes), the process moves to step ST25 and it is determined that the person 130 has entered the monitoring area AR0.
- For example, if the radius of the existence possible area generated from f1(x, y), which is the reliability of person ID 001, is 35 and the radius of the existence possible area generated from f2(x, y), which is the reliability of person ID 003, is 15, then under determination (i) the existence possible area of person ID 001 overlaps monitoring area No. 1 and is therefore determined as an entry, while the existence possible area of person ID 003 does not overlap any monitoring area and is therefore not determined as an entry.
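- A minimal sketch of determination (i) for such a case is given below, assuming circular existence possible areas and rectangular monitoring areas; all coordinates and radii are illustrative, not the patent's values.

```python
# Determination (i): does the circular existence possible area overlap the
# rectangular monitoring area at all?  All values below are illustrative.
def circle_overlaps_rect(cx, cy, r, rect):
    x_min, y_min, x_max, y_max = rect
    # Closest point of the rectangle to the circle centre.
    nearest_x = min(max(cx, x_min), x_max)
    nearest_y = min(max(cy, y_min), y_max)
    return (cx - nearest_x) ** 2 + (cy - nearest_y) ** 2 <= r ** 2

monitoring_area_1 = (0.0, 0.0, 500.0, 300.0)
print(circle_overlaps_rect(-20.0, 150.0, 35.0, monitoring_area_1))   # radius 35 -> True (entry)
print(circle_overlaps_rect(-20.0, 150.0, 15.0, monitoring_area_1))   # radius 15 -> False (no entry)
```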
- As described above, according to the present embodiment, by providing the reliability extraction unit 102 as a positioning reliability detection unit, the existence possible area creation unit 104 that determines the existence possible area AR1 based on the positioning result and the reliability (positioning accuracy), and the entry determination unit 105 that determines entry and / or exit of the target with respect to the monitoring area AR0 based on the overlap between the existence possible area AR1 and the monitoring area AR0, entry / exit of the detection target 130 into / from the monitoring area AR0 can be detected accurately even when a positioning error occurs.
- the positioning method is not limited to this.
- positioning may be performed using a signal from a wireless tag attached to the object.
- Any wireless tag may be used as long as the tag allows the entry / exit detection device 100 to identify the object and to position the object in space.
- For example, an RFID (Radio Frequency IDentification) tag may be used as the wireless tag.
- Alternatively, a positioning result that the wireless tag attached to the target has autonomously measured by GPS (Global Positioning System) or the like may be transmitted to the entry / exit detection device 100.
- the positioning reliability may be detected based on the radio wave reception status from the wireless tag.
- The radio wave reception status is, for example, RSSI (Received Signal Strength Indicator), WEI (Word Error Indicator), BEI (Bit Error Indicator), BER (Bit Error Rate), or SNR (Signal to Noise Ratio).
- In this case, the reliability (variance value) can be obtained by referring to a reliability table prepared in advance (a table showing the correspondence between radio wave reception status parameters and the reliability or variance value).
- The positioning data may also be obtained, or calculated, from sensing signals of a sensor group embedded in the ground, floor, table, or the like, for example.
- In this embodiment, a monitoring device is presented that, after performing accurate entry / exit determination without omission, issues only necessary and sufficient alarm notifications without issuing unnecessary ones.
- FIG. 12 shows an image of this embodiment.
- a black circle indicated by reference symbol P1 indicates a positioning point of a person who is an entry detection target and is not authorized to enter the monitoring area AR0.
- Reference symbol L1 indicates a flow line connecting the positioning points P1 at a plurality of times.
- Reference numerals P2-1 and P2-2 indicate positioning points of persons (for example, employees) who have authority to enter the monitoring area AR0.
- a positioning point P2-1 of an employee whose person ID is “001” and a positioning point P2-2 of an employee whose person ID is “002” are shown.
- the areas indicated by the reference signs F1 and F2 are the visual recognition areas, the visual recognition area of the employee with the person ID “001” is F1, and the visual recognition area of the employee with the person ID “002” is F2.
- FIG. 13 explains how to obtain the visual recognition area.
- the gaze direction is obtained by connecting the positioning point at time t and the positioning point at time t-1 and extending the line to the time t side.
- An area extending Y cm from the positioning point at time t is set as the visual recognition area.
- the line-of-sight direction is obtained from a line (flow line) connecting positioning points at different times, but the method of obtaining the line-of-sight direction is not limited to this.
- a plurality of wireless tags may be attached to a person, and the line-of-sight direction may be obtained based on differences in tag positioning results.
- the direction of the face may be determined by performing image processing on the camera image, and the line-of-sight direction may be obtained.
- the line-of-sight direction may be directly acquired using a gyro sensor or a geomagnetic sensor.
- In the above, a fixed visual recognition area is created, but the visual recognition area may be changed according to the person's surrounding situation (for example, the presence of a shielding object) and person attributes (such as visual acuity).
- the peripheral situation may be acquired with reference to a peripheral situation database storing three-dimensional arrangement data such as an obstacle.
- the person attribute may be obtained by referring to a person attribute database in which the position calculation unit or the photographing unit obtains a person ID and stores the person ID and attribute data in association with each other.
- In the above, the visual recognition area is created based only on the positioning point. However, the visual recognition area is premised on the assumption that an employee or the like can monitor a suspicious person, and therefore must be an area that can be visually recognized with certainty.
- FIG. 14 shows an example of how to obtain such a visual recognition area. That is, an example of how to obtain a region that falls within the field of view regardless of the positioning error is shown.
- Straight line 1 and straight line 2 are the inscribed lines of the area where the authorized person can exist at time t1 and the area where the authorized person can exist at time t2.
- the region 1 is a region having a viewing angle of 60 degrees ( ⁇ 30 degrees to the left and right of the straight line 1) and a distance of 3 m with respect to the direction of the straight line 1 from the contact point between the straight line 1 and the possible region at time t2.
- Area 2 is an area created for line 2 in the same manner as area 1.
- Region 3 is a region with a viewing angle of 60 degrees and a distance of 3 m centered on the direction of straight line 3. The area where all three regions overlap is set as the visual recognition area.
- FIG. 15 shows an example of a monitoring state in the present embodiment.
- As shown in FIG. 15A, even when the positioning point P1 of a non-authorized person is detected in the monitoring area AR0, an alarm (entry notification alarm) is not output if the positioning point P1 is present in the visual recognition area F1 of an authorized person such as an employee.
- As shown in FIG. 15B, when the positioning point P1 of the non-authorized person is detected in the monitoring area AR0 and the positioning point P1 does not exist in the visual recognition area F1, an alarm indicating that there is an intruder (entry notification alarm) is output.
- FIG. 16 is a diagram showing the configuration of the monitoring system according to the present embodiment, in which the same reference numerals are assigned to the corresponding components in FIG.
- the monitoring system 210 includes a camera 120, a monitoring device 200, and a monitor 141.
- the monitoring device 200 has the same configuration as the entrance / exit detection device 100 of FIG. 6 except that it includes a positioning information history database (DB) 201, a visual recognition area forming unit 202, and an alarm necessity determination unit 203.
- the positioning information history DB 201 holds the positioning information from the person position calculation unit 101 and provides the positioning history to the visual recognition area forming unit 202.
- the visual recognition area forming unit 202 inputs a positioning point of a person without entry authority determined to have entered from the entry determination unit 105, and also inputs a history of positioning points of a person with entry authority from the positioning information history DB 201. From these, the visual recognition area of the person with the entry authority located in the vicinity of the person without the entry authority determined to have entered the monitoring area AR0 is formed, and the visual recognition area is sent to the alarm necessity determination unit 203.
- the visual recognition area forming unit 202 may obtain the visual recognition area as shown in FIG. 13, or may obtain the visual recognition area in consideration of the reliability of positioning as shown in FIG.
- In the latter case, the reliability extraction unit 102 and the existence possible area creation unit 104 obtain the existence possible area of the person with entry authority, and this existence possible area is input to the visual recognition area forming unit 202.
- the alarm necessity determination unit 203 inputs the positioning point of the person without entry authority determined to have entered from the entry determination unit 105, and also inputs the visual recognition area from the visual recognition area forming unit 202. It is determined whether or not the person having the entry authority has visually recognized the entry. When it is determined that a person with the entry authority is viewing, the alarm necessity determination unit 203 outputs a determination result indicating that an alarm is unnecessary to the determination result notification unit 103. On the other hand, the alarm necessity determination unit 203 outputs a determination result indicating that an alarm is necessary to the determination result notification unit 103 when it is determined that a person with entry authority is not visually recognized.
- FIG. 17A shows an example of a person position and reliability for an intruder without entry authority.
- FIG. 17B shows a first example of person position and reliability for a person with entry authority.
- FIG. 17C shows a second example of the person position and the reliability for a person with entry authority.
- FIG. 17D shows an example of monitoring area information stored in the monitoring area DB 106.
- In step ST31, the visual recognition area forming unit 202 acquires, from the entry determination unit 105, information on the positioning point of the person without entry authority who has been determined to have entered.
- In step ST32, the visual recognition area forming unit 202 refers to the positioning information history DB 201 and searches for a person with entry authority within a range of, for example, a radius of 300 [cm] centered on the positioning point acquired in step ST31.
- If the visual recognition area forming unit 202 determines in step ST33 that there is no person with entry authority within the radius of 300 [cm] (step ST33; No), in step ST34 it notifies the alarm necessity determination unit 203 that the intruder is an alarm target, and the alarm necessity determination unit 203 determines that the intruder is an alarm target.
- On the other hand, if the visual recognition area forming unit 202 determines in step ST33 that there is a person with entry authority within the radius of 300 [cm] (step ST33; Yes), the process proceeds to step ST35, and for the corresponding person with entry authority, an extension of the line connecting the positioning point at the corresponding time and the previous positioning point is calculated as the line-of-sight direction (see FIG. 13).
- In step ST36, the visual recognition area forming unit 202 forms, as the visual recognition area of the person with entry authority, for example an area extending 300 [cm] in the line-of-sight direction and ±30° from the line-of-sight direction (see FIG. 13).
- In step ST37, the alarm necessity determination unit 203 determines whether or not the positioning point of the intruder without entry authority exists in the visual recognition area. If a negative result is obtained in step ST37 (see FIG. 15B), the process moves to step ST38 and the intruder is determined to be an alarm target. On the other hand, if a positive result is obtained in step ST37 (see FIG. 15A), the process moves to step ST39 and the intruder is determined not to be an alarm target.
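- The sector test of steps ST35 to ST37 can be sketched as follows; the ±30°, 300 [cm] geometry and the line-of-sight taken from the two latest positioning points follow the steps above, while the sample coordinates are illustrative.

```python
# Visual recognition area test (steps ST35-ST37): the authorised person's
# line of sight runs from the previous to the current positioning point;
# the visual area is a sector of +/-30 degrees and 300 cm along that
# direction.  Sample coordinates are illustrative.
import math

def in_visual_area(prev_pt, curr_pt, target_pt, max_dist=300.0, half_angle_deg=30.0):
    gaze = (curr_pt[0] - prev_pt[0], curr_pt[1] - prev_pt[1])
    to_target = (target_pt[0] - curr_pt[0], target_pt[1] - curr_pt[1])
    dist = math.hypot(*to_target)
    if dist > max_dist or dist == 0.0 or gaze == (0.0, 0.0):
        return dist == 0.0   # a point on top of the observer counts as visible
    angle = math.degrees(math.atan2(to_target[1], to_target[0])
                         - math.atan2(gaze[1], gaze[0]))
    angle = (angle + 180.0) % 360.0 - 180.0     # wrap to [-180, 180)
    return abs(angle) <= half_angle_deg

employee_prev, employee_curr = (100.0, 100.0), (150.0, 100.0)        # moving along +x
print(in_visual_area(employee_prev, employee_curr, (350.0, 150.0)))  # True: ~206 cm, ~14 deg
print(in_visual_area(employee_prev, employee_curr, (150.0, 400.0)))  # False: 90 deg off the gaze
```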
- As described above, according to the present embodiment, by providing the visual recognition area forming unit 202 and the alarm necessity determination unit 203, even when a target without authority to enter the monitoring area AR0 is determined to have entered the monitoring area AR0, no alarm is output if the positioning point of the entering target is included in the visual recognition area of a person authorized to enter the monitoring area AR0, so that unnecessary alarms can be prevented from being output.
- the request (2) (that is, reducing unnecessary alarm output by avoiding unnecessary entry determination) can be more fully satisfied.
- In this embodiment as well, an apparatus and method are presented; that is, a monitoring device is provided that can perform only necessary and sufficient alarm notification, without unnecessary alarm notification, after performing accurate entry / exit determination without omission.
- FIG. 19 shows an image of this embodiment.
- In this embodiment, a warning is not output if the person detected as entering is accompanied by a person with entry authority.
- a black circle indicated by reference symbol P1 indicates a positioning point of a person who is an entry detection target and is not authorized to enter the monitoring area AR0.
- Reference symbol P2 indicates a positioning point of a person (for example, an employee) who has authority to enter the monitoring area AR0.
- A line connecting the positioning point at which the person without entry authority entered the monitoring area AR0 and several preceding positioning points is used as the similarity determination flow line L1.
- A line connecting the positioning points of the person with entry authority at the same times as the positioning points on which the similarity determination flow line L1 is based is used as the similarity determination flow line L2.
- In the present embodiment, the following methods (i) to (iv) are available for determining that a person without entry authority is accompanied by a person with entry authority; however, the present invention is not limited to these methods.
- The distance between the similarity determination flow lines L1 and L2 referred to below is the distance, at the same time, between the positioning points of the person with entry authority and the person without entry authority on which the similarity determination flow lines L1 and L2 are based; in practice, an average value over a plurality of positioning points may be used.
- In one method, it is determined that the person is accompanied when the distance between the similarity determination flow lines L1 and L2 is equal to or less than a predetermined value and the difference in length between the similarity determination flow lines L1 and L2 (or the difference in speed between the person with entry authority and the person without entry authority) is equal to or less than a predetermined value.
- In another method, it is determined that the person is accompanied when the angle between the similarity determination flow lines L1 and L2 is equal to or less than a predetermined value and the difference in length between the similarity determination flow lines L1 and L2 (or the difference in speed between the person with entry authority and the person without entry authority) is equal to or less than a predetermined value.
- FIG. 20 shows an image of necessity determination of alarm according to the present embodiment.
- When determination criterion (iv) among the criteria (i) to (iv) is used, in FIG. 20A the distance between the similarity determination flow lines L1 and L2 is equal to or less than the predetermined value and the angle between the similarity determination flow lines L1 and L2 is equal to or less than the predetermined value, so it is determined that the person without entry authority is accompanied by a person with entry authority and that an alarm is unnecessary.
- On the other hand, in FIG. 20B, the difference in length between the similarity determination flow lines L1 and L2 is equal to or less than the predetermined value, but the angle between the similarity determination flow lines L1 and L2 is larger than the predetermined value, so it is determined that the person without entry authority is not accompanied by a person with entry authority and that an alarm is necessary.
- the similarity determination may be performed in consideration of the reliability of the positioning point without being limited to the similarity determination based only on the positioning point.
- Circles 11 and 12 are set as the existence possible areas at times t1 and t2 of the determination target person (the person without entry authority), and circles 21 and 22 are set as the existence possible areas at times t1 and t2 of the accompanying candidate (the person with entry authority).
- tangent lines 11 and 12 that are inscribed lines of the circle 11 and the circle 12 are drawn, and tangent lines 21 and 22 that are inscribed lines of the circle 21 and the circle 22 are drawn.
- the possible range of similarity flow line L1 of a person without entry authority is between tangent line 11 and tangent line 12.
- the possible range of similarity flow line L2 of a person with entry authority is between tangent line 21 and tangent line 22. Therefore, among the combinations of the tangent line 11 and the tangent line 21, the tangent line 11 and the tangent line 22, the tangent line 12 and the tangent line 21, and the tangent line 12 and the tangent line 22, the angle based on the combination of the tangents having the largest angle is defined as the determination target angle. . That is, the angle under the condition that the positioning error is maximum is used as the determination target angle.
- Assuming that the distance between the centers of circle 11 and circle 12 is r1, the distance between the centers of circle 21 and circle 22 is r2, the sum of the radii of the existence possible areas of circle 11 and circle 12 is a1, and the sum of the radii of the existence possible areas of circle 21 and circle 22 is a2, the larger of the difference between r1 + a1 and r2 - a2 and the difference between r1 - a1 and r2 + a2 is taken as the length difference of the segments to be judged.
- That is, the length difference between the similarity determination flow lines L1 and L2 under the condition that the positioning error is maximum occurs when one of the flow lines is at its shortest and the other is at its longest. Specifically, the maximum length difference is the larger of the difference between the longest length (r1 + a1) of the similarity determination flow line L1 of the person without entry authority and the shortest length (r2 - a2) of the similarity determination flow line L2 of the person with entry authority, and the difference between the shortest length (r1 - a1) of L1 and the longest length (r2 + a2) of L2; this maximum length difference may be used as the determination target.
- the difference in length of one section has been described, but it may be determined based on the difference in length of a plurality of sections.
- In this way, it is determined that the person without entry authority is accompanied only when the person with entry authority is certainly accompanying him or her. As a result, it is possible to avoid determining, because of positioning error, that a person is being accompanied when he or she actually is not, so that missed alarms can be avoided.
- FIG. 23 shows the configuration of the monitoring system of this embodiment.
- the monitoring system 310 includes a camera 120, a monitoring device 300, and a monitor 141.
- the monitoring device 300 has the same configuration as the entry / exit detection device 100 of FIG. 6 except that it includes a positioning information history database (DB) 301 and a similarity determination unit 302.
- the positioning information history DB 301 holds positioning information from the person position calculation unit 101 and provides a positioning history to the similarity determination unit 302.
- The similarity determination unit 302 receives, from the entry determination unit 105, the positioning point of the person without entry authority who has been determined to have entered, and receives the history of positioning points of the person with entry authority and the person without entry authority from the positioning information history DB 301.
- the similarity determination unit 302 uses the determination criteria (i) to (iv) described above to determine whether a person with entry authority is accompanied by a person without entry authority. If the similarity determination unit 302 determines that a person with entry authority is accompanied, the similarity determination unit 302 outputs a determination result indicating that an alarm is unnecessary to the determination result notification unit 103. On the other hand, the similarity determination unit 302 outputs a determination result indicating that an alarm is necessary to the determination result notification unit 103 when it is determined that the person with the entry authority is not accompanied.
- FIG. 24A shows an example of a person position and reliability for an intruder without entry authority.
- FIG. 24B shows a first example of person position and reliability for a person with entry authority.
- FIG. 24C shows a second example of person position and reliability for a person with entry authority.
- FIG. 24D shows an example of monitoring area information stored in the monitoring area DB 106. If there is a person who has the entry authority of the positioning point data of FIG. 24B with respect to the intruder of the positioning point data of FIG. 24A, the similarity determination unit 302 determines that an alarm is unnecessary. On the other hand, when there is a person who has the entry authority of the positioning point data of FIG. 24C for the intruder of the positioning point data of FIG. 24A, the similarity determination unit 302 determines that an alarm is necessary.
- In step ST41, the similarity determination unit 302 acquires, from the entry determination unit 105, information on the positioning point of the person without entry authority who has been determined to have entered.
- In step ST42, the similarity determination unit 302 refers to the positioning information history DB 301 and searches for a person with entry authority within a radius of, for example, 100 [cm] centered on the positioning point acquired in step ST41.
- If the similarity determination unit 302 determines in step ST43 that there is no person with entry authority within the radius of 100 [cm] (step ST43; No), in step ST44 it determines that the intruder is an alarm target.
- On the other hand, if the similarity determination unit 302 determines in step ST43 that there is a person with entry authority within the radius of 100 [cm] (step ST43; Yes), the process proceeds to step ST45, and for the corresponding person with entry authority, the line connecting the positioning point at the corresponding time and the four preceding positioning points is calculated as the similarity determination flow line L2 (see FIG. 19).
- In step ST46, the similarity determination unit 302 calculates, as the similarity determination flow line L1, the line connecting the positioning point of the entering person without entry authority and the four preceding positioning points (see FIG. 19).
- In step ST47, the similarity determination unit 302 determines whether the difference between the lengths of the two similarity determination flow lines L1 and L2 is within 100 [cm] and whether the angle between them, when their start points are aligned, is within 30°. If a negative result is obtained in step ST47, the process moves to step ST48 and the intruder is determined to be an alarm target. On the other hand, if a positive result is obtained in step ST47, the process moves to step ST49 and the intruder is determined not to be an alarm target.
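- A minimal sketch of the flow-line comparison of steps ST45 to ST47 is shown below. The 100 [cm] / 30° thresholds follow the steps above; the sample trajectories are illustrative, and the angle is taken between the overall directions of the two flow lines.

```python
# Similarity determination (steps ST45-ST47): build the two similarity
# determination flow lines from five consecutive positioning points each and
# treat the intruder as accompanied (no alarm) when the length difference is
# within 100 cm and the angle between the overall directions is within 30 deg.
import math

def flow_line_length(points):
    return sum(math.dist(points[i], points[i + 1]) for i in range(len(points) - 1))

def flow_line_angle_deg(points):
    dx, dy = points[-1][0] - points[0][0], points[-1][1] - points[0][1]
    return math.degrees(math.atan2(dy, dx))

def is_accompanied(intruder_pts, employee_pts, max_len_diff=100.0, max_angle=30.0):
    len_diff = abs(flow_line_length(intruder_pts) - flow_line_length(employee_pts))
    angle_diff = abs(flow_line_angle_deg(intruder_pts) - flow_line_angle_deg(employee_pts))
    angle_diff = min(angle_diff, 360.0 - angle_diff)
    return len_diff <= max_len_diff and angle_diff <= max_angle

intruder = [(0, 0), (60, 5), (120, 10), (180, 20), (240, 25)]       # cm, illustrative
employee = [(30, 40), (90, 45), (150, 55), (210, 60), (270, 70)]
print(is_accompanied(intruder, employee))   # True -> alarm unnecessary
```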
- FIG. 26 shows an image of this embodiment.
- In this embodiment, even when it is detected that a person without entry authority has entered the monitoring area AR0, no alarm is output if it is determined that the entry was made to avoid a collision with another person.
- a black circle indicated by reference symbol P1 indicates a positioning point of a person who is an entry detection target and is not authorized to enter the monitoring area AR0.
- Reference symbol P2 indicates a positioning point of another person existing in the vicinity of the entry detection target.
- For the person without entry authority, a line obtained by extending the line connecting that person's recent positioning points in the direction of travel is used as the collision determination line L1′.
- a collision determination line L2 ' is formed based on a positioning point at the same time as the positioning point that formed the collision determination line L1' for other persons existing in the vicinity of the entering person.
- FIG. 27 shows an image of determining necessity of alarm according to the present embodiment.
- In FIG. 27A, since the collision determination line L1′ and the collision determination line L2′ intersect each other, it is determined that the entry was made to avoid a collision, and it is determined that a warning is unnecessary.
- In FIG. 27B, since the collision determination line L1′ and the collision determination line L2′ do not intersect, it is determined that the entry was not made for collision avoidance, and it is determined that a warning is necessary.
- the collision determination may be performed in consideration of the reliability of the positioning point without being limited to the collision determination based only on the positioning point. The method will be described with reference to FIG.
- circles 11 and 12 are set, which are regions where positioning points can exist at times t1 and t2 of the person to be judged (person without entry authority), and the inscribed lines 11 and 12 of the circle 11 and the circle 12 are drawn.
- The contact points between the inscribed lines 11 and 12 and the circles 11 and 12 are defined as shown in the figure.
- Let the distance between contact point 11 and contact point 21 be r, and take a point on the inscribed line 11, on the opposite side of contact point 11, at a distance a × r (where a is a constant) from contact point 21.
- Here, a is a parameter indicating how far ahead a collision is taken into account: if the value of a is small, only an imminent collision is judged, and if the value of a is large, collisions further ahead are also judged.
- Among the arcs centered on the intersection of the inscribed line 11 and the inscribed line 12 with a radius of "the distance to the contact point 21 + the distance a × r", the shorter arc enclosed by the inscribed line 11 and the inscribed line 12 is set as arc 1.
- The area surrounded by the longer of the arcs divided by the contact point 21 and the contact point 22 is defined as the collision determination area.
- the collision determination area is similarly obtained for the positioning point of the person to be collided (other persons existing in the vicinity of the entering person).
- the collision determination is performed by determining whether or not there is an overlap in each of the collision determination areas of the determination target and the collision target obtained in this way.
- the collision determination method may be a determination method in which the overlapping area is equal to or greater than a certain value, instead of whether there is an overlapping region.
- FIG. 29 shows the configuration of the monitoring system of the present embodiment.
- the monitoring system 410 includes a camera 120, a monitoring device 400, and a monitor 141.
- the monitoring device 400 has the same configuration as the entry / exit detection device 100 of FIG. 6 except that it includes a positioning information history database (DB) 401 and a collision determination unit 402.
- the positioning information history DB 401 holds the positioning information from the person position calculation unit 101 and provides the positioning history to the collision determination unit 402.
- The collision determination unit 402 receives, from the entry determination unit 105, the positioning point of the person without entry authority who has been determined to have entered, and receives the history of positioning points of the person with entry authority and the person without entry authority from the positioning information history DB 401.
- The collision determination unit 402 then determines whether the entry of the person without entry authority, determined by the entry determination unit 105, was made for collision avoidance. When the collision determination unit 402 determines that the entry is for collision avoidance, it outputs a determination result indicating that an alarm is unnecessary to the determination result notification unit 103. On the other hand, when the collision determination unit 402 determines that the entry is not for collision avoidance, it outputs a determination result indicating that an alarm is necessary to the determination result notification unit 103.
- FIG. 30A shows an example of the person position and reliability for an intruder without entry authority.
- FIG. 30B shows a first example of person position and reliability for a person with entry authority.
- FIG. 30C shows a second example of person position and reliability for a person with entry authority.
- FIG. 30D shows an example of monitoring area information stored in the monitoring area DB 106.
- In step ST51, the collision determination unit 402 acquires, from the entry determination unit 105, information on the positioning point of the person without entry authority who has been determined to have entered.
- In step ST52, the collision determination unit 402 searches the positioning information history DB 401 to find out whether another person exists within a range of, for example, a radius of 100 [cm] centered on the positioning point acquired in step ST51.
- If the collision determination unit 402 determines in step ST53 that there is no other person within the radius of 100 [cm] (step ST53; No), in step ST54 it determines that the intruder is an alarm target.
- On the other hand, if the collision determination unit 402 determines in step ST53 that another person exists within the radius of 100 [cm] (step ST53; Yes), the process moves to step ST55, and the collision determination line L2′ is calculated by extending the line connecting that other person's previous positioning point and second preceding positioning point, at the corresponding times, toward the previous positioning point side by the length of two segments.
- In step ST56, the collision determination unit 402 calculates the collision determination line L1′ by extending the line connecting the previous positioning point and the second preceding positioning point of the entering person without entry authority toward the previous positioning point side by the length of two segments.
- in step ST57, the collision determination unit 402 determines whether or not the two collision determination lines L1' and L2' intersect.
- when the two collision determination lines do not intersect (step ST57; No), the collision determination unit 402 proceeds to step ST58 and determines that the intruder is an alarm target.
- when the two collision determination lines intersect (step ST57; Yes), the collision determination unit 402 proceeds to step ST59 and determines that the intruder is not an alarm target; a minimal code sketch of this trajectory check follows this list.
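The following Python sketch illustrates the check of steps ST51 to ST59. It is only a minimal illustration, not the implementation of the embodiment: the 2-D point representation, the helper names, the 100 [cm] default radius, and the extension factor of two are assumptions made for this example, and collinear (merely touching) trajectories are ignored for brevity.

```python
import math

def nearby_others(intruder_point, other_histories, radius=100.0):
    """ST52/ST53: persons whose latest positioning point lies within
    `radius` [cm] of the intruder's positioning point."""
    return [h for h in other_histories
            if len(h) >= 2 and math.dist(intruder_point, h[-1]) <= radius]

def collision_line(second_prev, prev, factor=2.0):
    """ST55/ST56: segment from `prev` extended past it, along the direction
    second_prev -> prev, by `factor` times the segment length."""
    dx, dy = prev[0] - second_prev[0], prev[1] - second_prev[1]
    return prev, (prev[0] + factor * dx, prev[1] + factor * dy)

def _cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def segments_intersect(p1, p2, q1, q2):
    """ST57: strict segment intersection test (collinear touching ignored)."""
    d1, d2 = _cross(q1, q2, p1), _cross(q1, q2, p2)
    d3, d4 = _cross(p1, p2, q1), _cross(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0

def intruder_is_alarm_target(intruder_hist, other_histories):
    """ST53-ST59: the intruder is not an alarm target only when a nearby person
    exists and the extrapolated trajectories L1' and L2' intersect."""
    l1 = collision_line(intruder_hist[-2], intruder_hist[-1])        # L1'
    for other in nearby_others(intruder_hist[-1], other_histories):
        l2 = collision_line(other[-2], other[-1])                    # L2'
        if segments_intersect(*l1, *l2):
            return False   # entry judged to be for collision avoidance
    return True            # ST54 / ST58: alarm target
```

With positioning histories ordered oldest-first, `intruder_is_alarm_target(hist_intruder, [hist_a, hist_b])` returns False only when an extrapolated trajectory of a nearby person crosses that of the intruder.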
- the entry / exit detection device 100 and the monitoring devices 200, 300, and 400 described in the first to fourth embodiments can be implemented by a general-purpose computer such as a personal computer. In that case, each process is realized by reading out from the computer's memory a software program corresponding to the process of each processing unit and executing that software program on the CPU. Alternatively, the entry / exit detection device 100 and the monitoring devices 200, 300, and 400 may be realized by dedicated devices equipped with LSI chips corresponding to the respective processing units.
- the entry / exit detection device and the monitoring device of the above-described embodiments are characterized in that entry into and / or exit from the target monitoring area (specific area) AR0 can be accurately detected; how the detection result is used is not particularly limited. For example:
- a target (for example, a portable wireless terminal) that is determined by the entry determination unit 105 to have entered may be wirelessly notified that it has entered the specific area.
- content relating to the specific area may be distributed only to a target (for example, a portable wireless terminal) that is determined by the entry determination unit 105 to have entered.
- the entry / exit detection device and the monitoring device of the above-described embodiments may also be built into a mobile terminal, so that the entry / exit detection result and the monitoring result are displayed on the monitor of the mobile terminal, or so that a warning corresponding to the entry / exit detection result and the monitoring result is notified to the user holding the mobile terminal.
- the present invention has an effect of accurately detecting the entry / exit of a detection target to / from a specific area even when a positioning error occurs, and is suitable for a monitoring system, for example.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- Image Analysis (AREA)
- Alarm Systems (AREA)
- Burglar Alarm Systems (AREA)
Abstract
Description
[1] Principle
First, before describing the embodiments, the background that led to the present invention and the principle of the present embodiment are explained.
FIG. 6 shows the configuration of the entry / exit detection device of the present embodiment. In the following, the case where the entry / exit detection device 100 of the present embodiment is applied to a monitoring system is described; however, the entry / exit detection device 100 is not limited to monitoring systems and can be applied to various other uses, for example a content distribution system that wirelessly distributes content related to a specific area only to persons detected to have entered the specific area. Also, although the following description assumes that the target whose entry and exit are detected is a person, the detection target is not limited to a person and may be any movable object that can be positioned, such as an article, a vehicle, or a mobile surveillance robot.
Next, the operation of the present embodiment is described. Since the entry / exit detection device 100 of the present embodiment is characterized by the processing of the possible existence region creation unit 104 and the entry determination unit 105, their processing procedures are described here.
As described above, according to the present embodiment, providing the reliability extraction unit 102 as a positioning reliability detection unit, the possible existence region creation unit 104 that determines the possible existence region AR1 based on the positioning result and the reliability (positioning accuracy), and the entry determination unit 105 that determines entry into and / or exit from the monitoring area AR0 of the target based on the overlap between the possible existence region AR1 and the monitoring area AR0 makes it possible to accurately detect the entry / exit of the detection target 130 into / from the monitoring area AR0 even when a positioning error occurs.
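As a concrete illustration of the overlap test only, and not of the embodiment's actual procedure, the sketch below approximates the possible existence region AR1 as a circle whose radius grows as the positioning reliability falls, and declares entry when the overlap with the monitoring area AR0 exceeds a chosen fraction of AR1. The radius model, the 0.5 threshold, and the use of the third-party shapely geometry library are assumptions made for this example; the embodiment may instead work with a probability density distribution as described in the claims.

```python
# Requires: pip install shapely
from shapely.geometry import Point, Polygon

def possible_existence_region(position, reliability, base_radius=50.0):
    """Circle around the positioning result; lower reliability -> larger radius
    (assumed error model for this sketch only)."""
    return Point(position).buffer(base_radius / max(reliability, 1e-6))

def has_entered(position, reliability, monitoring_area_ar0, overlap_ratio=0.5):
    """Declare entry when the overlap of AR1 with AR0 exceeds `overlap_ratio`
    of the area of AR1."""
    ar1 = possible_existence_region(position, reliability)
    return ar1.intersection(monitoring_area_ar0).area >= overlap_ratio * ar1.area

# Usage: a rectangular monitoring area AR0, coordinates in centimetres
ar0 = Polygon([(0, 0), (0, 500), (500, 500), (500, 0)])
print(has_entered(position=(80, 120), reliability=0.8, monitoring_area_ar0=ar0))
```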
[1] Principle
For a monitoring device, it is important to (1) detect the entry of suspicious persons without omission, and (2) reduce wasteful alarm outputs by avoiding unnecessary entry determinations. The configuration and method of Embodiment 1 can satisfy both requirement (1) and requirement (2), but the present embodiment presents a device and a method that can satisfy requirement (2) even more fully.
FIG. 16, in which parts corresponding to those in FIG. 6 are denoted by the same reference numerals, shows the configuration of the monitoring system of the present embodiment. The monitoring system 210 includes a camera 120, a monitoring device 200, and a monitor 141. The monitoring device 200 has the same configuration as the entry / exit detection device 100 of FIG. 6 except that it includes a positioning information history database (DB) 201, a visible region formation unit 202, and an alarm necessity determination unit 203.
Next, the operation of the present embodiment is described. Since the monitoring device 200 of the present embodiment is characterized by the processing of the visible region formation unit 202 and the alarm necessity determination unit 203, their processing procedures are described here with reference to FIG. 18.
As described above, according to the present embodiment, the visible region formation unit 202 and the alarm necessity determination unit 203 are provided, and even when a target without authority to enter the monitoring area AR0 is determined to have entered the monitoring area AR0, no alarm is output if the positioning point of the entering target is included in the visible region of a person with authority to enter the monitoring area AR0. This prevents alarms that are unnecessary in practice from being output.
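The following sketch shows one way such a visibility check could look; it is an assumption-laden illustration, not the visible-region formation method of FIG. 18. The visible region is modelled here as a sector oriented along the authorised person's last two positioning points, with an illustrative 120-degree field of view and 300 [cm] viewing range.

```python
import math

def in_visible_region(intruder_point, guard_prev, guard_now,
                      fov_deg=120.0, view_range=300.0):
    """True if `intruder_point` lies inside the sector in front of the guard."""
    heading = math.atan2(guard_now[1] - guard_prev[1],
                         guard_now[0] - guard_prev[0])
    dx, dy = intruder_point[0] - guard_now[0], intruder_point[1] - guard_now[1]
    if math.hypot(dx, dy) > view_range:
        return False
    # smallest angular difference between the guard's heading and the intruder
    diff = abs((math.atan2(dy, dx) - heading + math.pi) % (2 * math.pi) - math.pi)
    return diff <= math.radians(fov_deg) / 2.0

def alarm_required(intruder_point, authorised_histories):
    """Suppress the alarm when the intruder's positioning point is visible
    to at least one person with authority to enter."""
    return not any(in_visible_region(intruder_point, h[-2], h[-1])
                   for h in authorised_histories if len(h) >= 2)
```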
[1] Principle
In the present embodiment, as in Embodiment 2, a device and a method are presented that can satisfy requirement (2) above (that is, reducing wasteful alarm outputs by avoiding unnecessary entry determinations) even more fully. In other words, the present embodiment presents a monitoring device that performs accurate entry / exit determination without omission while issuing only the necessary and sufficient alarm notifications and no unnecessary ones.
FIG. 23, in which parts corresponding to those in FIG. 6 are denoted by the same reference numerals, shows the configuration of the monitoring system of the present embodiment. The monitoring system 310 includes a camera 120, a monitoring device 300, and a monitor 141. The monitoring device 300 has the same configuration as the entry / exit detection device 100 of FIG. 6 except that it includes a positioning information history database (DB) 301 and a similarity determination unit 302.
Next, the operation of the present embodiment is described. Since the monitoring device 300 of the present embodiment is characterized by the processing of the similarity determination unit 302, the processing procedure of the similarity determination unit 302 is described here with reference to FIG. 25. In the following, the case of using criterion (iv) among the above determination criteria (i) to (iv) is taken as an example.
As described above, according to the present embodiment, the similarity determination unit 302 is provided, and even when a target without authority to enter the monitoring area AR0 is determined to have entered the monitoring area AR0, no alarm is output if it is determined that a person with authority to enter the monitoring area AR0 is accompanying the target. This prevents alarms that are unnecessary in practice from being output.
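Since the determination criteria (i) to (iv) themselves are not reproduced in this excerpt, the sketch below shows only one plausible accompaniment test, not the criterion used by the similarity determination unit 302: two positioning histories sampled at the same times are treated as accompanying when their mean separation over a recent window stays below a threshold. The window length and the 150 [cm] threshold are illustrative assumptions.

```python
import math

def is_accompanying(intruder_hist, authorised_hist, window=10, threshold=150.0):
    """Mean separation over the last `window` synchronised positioning points."""
    pairs = list(zip(intruder_hist[-window:], authorised_hist[-window:]))
    if not pairs:
        return False
    mean_sep = sum(math.dist(a, b) for a, b in pairs) / len(pairs)
    return mean_sep <= threshold

def alarm_required(intruder_hist, authorised_histories):
    """Suppress the alarm when any authorised person is judged to accompany the intruder."""
    return not any(is_accompanying(intruder_hist, h) for h in authorised_histories)
```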
[1] Principle
In the present embodiment, as in Embodiments 2 and 3, a device and a method are presented that can satisfy requirement (2) above (that is, reducing wasteful alarm outputs by avoiding unnecessary entry determinations) even more fully. In other words, the present embodiment presents a monitoring device that performs accurate entry / exit determination without omission while issuing only the necessary and sufficient alarm notifications and no unnecessary ones.
FIG. 29, in which parts corresponding to those in FIG. 6 are denoted by the same reference numerals, shows the configuration of the monitoring system of the present embodiment. The monitoring system 410 includes a camera 120, a monitoring device 400, and a monitor 141. The monitoring device 400 has the same configuration as the entry / exit detection device 100 of FIG. 6 except that it includes a positioning information history database (DB) 401 and a collision determination unit 402.
Next, the operation of the present embodiment is described. Since the monitoring device 400 of the present embodiment is characterized by the processing of the collision determination unit 402, the processing procedure of the collision determination unit 402 is described here with reference to FIG. 31.
As described above, according to the present embodiment, the collision determination unit 402 is provided, and even when a target without authority to enter the monitoring area AR0 is determined to have entered the monitoring area AR0, no alarm is output if it is determined that the target entered in order to avoid a collision with another person. This prevents alarms that are unnecessary in practice from being output.
Claims (15)
- 1. An entry / exit detection device comprising:
a positioning reliability detection unit that detects a positioning reliability of a positioning unit;
a possible existence region determination unit that determines, based on a positioning result obtained by the positioning unit and the positioning reliability obtained by the positioning reliability detection unit, a possible existence region in which a target positioned by the positioning unit may exist; and
a determination unit that determines entry of the target into and / or exit of the target from a specific area based on an overlap between the possible existence region and the specific area.
- 2. The entry / exit detection device according to claim 1, wherein the determination unit determines entry of the target into and / or exit of the target from the specific area based on the area over which the possible existence region and the specific area overlap.
- 3. The entry / exit detection device according to claim 1, wherein the positioning reliability detection unit obtains, as the positioning reliability, a probability density distribution of the position of the target.
- 4. The entry / exit detection device according to claim 3, wherein the possible existence region determination unit determines, as the possible existence region, a region in which the probability density of the probability density distribution is equal to or greater than a predetermined value, or a region in which the cumulative probability of the probability density distribution substantially matches a predetermined value.
- 5. The entry / exit detection device according to claim 4, wherein the determination unit determines entry of the target into and / or exit of the target from the specific area based on an integrated value of the probability density over the region where the specific area overlaps the region in which the probability density of the probability density distribution is equal to or greater than the predetermined value or the region in which the cumulative probability of the probability density distribution substantially matches the predetermined value.
- 6. A monitoring device comprising:
the entry / exit detection device according to claim 1;
an alarm output unit that outputs an alarm based on a determination result obtained by the determination unit;
a visible region formation unit that obtains a visible region of a target; and
an alarm necessity determination unit that, even when the determination unit determines that a first target has entered the specific area, does not cause the alarm output unit to output an alarm for the first target if the positioning position of the first target is included in the visible region of a second target that is a target different from the first target.
- 7. The monitoring device according to claim 6, wherein the visible region formation unit obtains the visible region based on a positioning history of the target.
- 8. The monitoring device according to claim 6, wherein the first target is a target without authority to enter the specific area, and the second target is a target with authority to enter the specific area.
- 9. A monitoring device comprising:
the entry / exit detection device according to claim 1;
an alarm output unit that outputs an alarm based on a determination result obtained by the determination unit;
a positioning history recording unit that records a history of positioning results obtained by the positioning unit; and
an alarm necessity determination unit that detects a relative situation between a first target and a second target from the positioning history of the first target and the positioning history of the second target recorded in the positioning history recording unit, and that, even when the determination unit determines that the first target has entered the specific area, does not cause the alarm output unit to output an alarm for the first target if the relative situation between the first target and the second target satisfies a predetermined criterion.
- 10. The monitoring device according to claim 9, wherein the alarm necessity determination unit uses the relative situation and the predetermined criterion to determine whether or not the first target and the second target are accompanying each other, and does not cause the alarm output unit to output an alarm for the first target when it determines that they are accompanying each other.
- 11. The monitoring device according to claim 10, wherein the first target is a target without authority to enter the specific area, and the second target is a target with authority to enter the specific area.
- 12. The monitoring device according to claim 9, wherein the alarm necessity determination unit uses the relative situation and the predetermined criterion to determine whether or not there was a possibility that the first target and the second target would collide, and does not cause the alarm output unit to output an alarm for the first target when it determines that there was a possibility of collision.
- 13. The monitoring device according to claim 12, wherein the first target is a target without authority to enter the specific area, and the second target is a target with authority to enter the specific area.
- 14. An entry / exit detection method comprising:
a positioning reliability detection step of detecting a positioning reliability;
a possible existence region determination step of determining, based on a positioning result and the positioning reliability, a possible existence region in which a positioned target may exist; and
a determination step of determining entry of the target into and / or exit of the target from a specific area based on an overlap between the possible existence region and the specific area.
- 15. A program causing a computer to execute:
a step of detecting a positioning reliability;
a step of determining, based on a positioning result and the positioning reliability, a possible existence region in which a positioned target may exist; and
a step of determining entry of the target into and / or exit of the target from a specific area based on an overlap between the possible existence region and the specific area.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/254,963 US8717171B2 (en) | 2009-03-09 | 2009-10-15 | Device for detecting entry and/or exit, monitoring device, and method for detecting entry and/or exit including a possible existing region |
CN200980157998.3A CN102349096B (zh) | 2009-03-09 | 2009-10-15 | 出入检测装置、监视装置以及出入检测方法 |
JP2011503571A JP5432983B2 (ja) | 2009-03-09 | 2009-10-15 | 入退検出装置、監視装置及び入退検出方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009055591 | 2009-03-09 | ||
JP2009-055591 | 2009-03-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010103584A1 true WO2010103584A1 (ja) | 2010-09-16 |
Family
ID=42727893
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/005379 WO2010103584A1 (ja) | 2009-03-09 | 2009-10-15 | 入退検出装置、監視装置及び入退検出方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US8717171B2 (ja) |
JP (1) | JP5432983B2 (ja) |
CN (1) | CN102349096B (ja) |
WO (1) | WO2010103584A1 (ja) |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
SG11201508085QA (en) * | 2013-03-29 | 2015-11-27 | Nec Corp | Target object identifying device, target object identifying method and target object identifying program |
US9384607B1 (en) * | 2014-12-03 | 2016-07-05 | Tyco Fire & Security Gmbh | Access control system |
US9831724B2 (en) | 2014-12-02 | 2017-11-28 | Tyco Fire & Security Gmbh | Access control system using a wearable access sensory implementing an energy harvesting technique |
CN107209869B (zh) | 2014-12-02 | 2020-10-27 | 泰科消防及安全有限公司 | 具有使用亚阈值技术的集成电路的无源rfid标签 |
US9384608B2 (en) * | 2014-12-03 | 2016-07-05 | Tyco Fire & Security Gmbh | Dual level human identification and location system |
CN106454717B (zh) * | 2015-08-13 | 2020-01-17 | 株式会社理光 | 位置判断方法、位置判断装置和电子设备 |
US9710978B1 (en) | 2016-03-15 | 2017-07-18 | Tyco Fire & Security Gmbh | Access control system using optical communication protocol |
US9824559B2 (en) | 2016-04-07 | 2017-11-21 | Tyco Fire & Security Gmbh | Security sensing method and apparatus |
CN107884795B (zh) * | 2016-09-30 | 2021-06-29 | 厦门雅迅网络股份有限公司 | 基于gps的进出区域的判断方法及其系统 |
WO2018066059A1 (ja) * | 2016-10-04 | 2018-04-12 | 三菱電機株式会社 | 入退管理システム |
JP6917804B2 (ja) * | 2017-06-28 | 2021-08-11 | アズビル株式会社 | 人検知結果判定方法および装置 |
CN110942540A (zh) * | 2019-11-29 | 2020-03-31 | 中核第四研究设计工程有限公司 | 核安保监控报警方法及装置 |
CN111489455B (zh) * | 2020-03-27 | 2021-09-28 | 中科车港(深圳)实业股份有限公司 | 融合北斗etc有源射频识别的三合一车载单元 |
CN114005210A (zh) * | 2021-09-24 | 2022-02-01 | 珠海格力电器股份有限公司 | 一种安全防护方法及安全防护装置 |
US11495025B1 (en) * | 2021-10-25 | 2022-11-08 | Motorola Solutions, Inc. | Method and apparatus for increasing security at an entrance point |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2659666B2 (ja) | 1993-06-23 | 1997-09-30 | 東京電力株式会社 | 禁止区域の侵入監視方法及び装置 |
KR20000026757A (ko) * | 1998-10-22 | 2000-05-15 | 국필종 | 동작 검출 시스템 및 방법 |
JP3523795B2 (ja) | 1998-11-19 | 2004-04-26 | 沖電気工業株式会社 | 入退室管理システム |
JP3997911B2 (ja) | 2002-12-27 | 2007-10-24 | セイコーエプソン株式会社 | 領域判定装置、領域判定方法、領域判定機能を発揮させるプログラム及び、領域判定機能を発揮させるプログラムを記録したコンピュータ読み取り可能な情報記録媒体 |
WO2006022594A1 (en) * | 2004-08-27 | 2006-03-02 | Singapore Technologies Dynamics Pte Ltd | Multi-sensor intrusion detection system |
JP4538299B2 (ja) | 2004-11-11 | 2010-09-08 | セコム株式会社 | 出入管理システム |
JP4525351B2 (ja) | 2005-01-05 | 2010-08-18 | オムロン株式会社 | セキュリティ装置 |
JP2008047074A (ja) | 2006-08-21 | 2008-02-28 | Sogo Keibi Hosho Co Ltd | 警備装置、警備方法および警備プログラム |
-
2009
- 2009-10-15 WO PCT/JP2009/005379 patent/WO2010103584A1/ja active Application Filing
- 2009-10-15 JP JP2011503571A patent/JP5432983B2/ja not_active Expired - Fee Related
- 2009-10-15 US US13/254,963 patent/US8717171B2/en not_active Expired - Fee Related
- 2009-10-15 CN CN200980157998.3A patent/CN102349096B/zh not_active Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11203567A (ja) * | 1998-01-13 | 1999-07-30 | Mitsubishi Electric Corp | 監視用画像処理装置 |
JP2000295600A (ja) * | 1999-04-08 | 2000-10-20 | Toshiba Corp | 監視装置 |
JP2003051083A (ja) * | 2001-08-07 | 2003-02-21 | Omron Corp | 情報収集装置、情報収集方法、情報収集プログラム、情報収集プログラムを記録した記録媒体および情報収集システム |
JP2004276154A (ja) * | 2003-03-13 | 2004-10-07 | Omron Corp | 侵入物監視装置 |
JP2005250989A (ja) * | 2004-03-05 | 2005-09-15 | Sony Corp | 移動物体追跡方法及び画像処理装置 |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012042864A1 (ja) * | 2010-10-01 | 2012-04-05 | パナソニック株式会社 | 物体位置推定システム、物体位置推定装置、物体位置推定方法、及び物体位置推定プログラム |
US9880604B2 (en) | 2011-04-20 | 2018-01-30 | Microsoft Technology Licensing, Llc | Energy efficient location detection |
US9998866B2 (en) | 2013-06-14 | 2018-06-12 | Microsoft Technology Licensing, Llc | Detecting geo-fence events using varying confidence levels |
JP2016528481A (ja) * | 2013-06-14 | 2016-09-15 | マイクロソフト テクノロジー ライセンシング,エルエルシー | 変化する信頼度レベルを用いたジオフェンスイベントの検出 |
US9820231B2 (en) | 2013-06-14 | 2017-11-14 | Microsoft Technology Licensing, Llc | Coalescing geo-fence events |
JP2016530640A (ja) * | 2013-09-06 | 2016-09-29 | ローベルト ボッシュ ゲゼルシャフト ミット ベシュレンクテル ハフツング | 画像情報の中の物体を認識する方法および制御装置 |
US9842262B2 (en) | 2013-09-06 | 2017-12-12 | Robert Bosch Gmbh | Method and control device for identifying an object in a piece of image information |
JP2017015562A (ja) * | 2015-07-01 | 2017-01-19 | パナソニックIpマネジメント株式会社 | 屋内測位システムのセットアップシステムおよびセットアップ方法 |
JP2019139275A (ja) * | 2018-02-06 | 2019-08-22 | 株式会社デンソーウェーブ | 共連れ検出装置 |
JP2021148646A (ja) * | 2020-03-19 | 2021-09-27 | セコム株式会社 | 判定システム及び判定装置 |
JP2021148647A (ja) * | 2020-03-19 | 2021-09-27 | セコム株式会社 | 判定システム及び判定装置 |
JP7402089B2 (ja) | 2020-03-19 | 2023-12-20 | セコム株式会社 | 判定システム及び判定装置 |
JP7402090B2 (ja) | 2020-03-19 | 2023-12-20 | セコム株式会社 | 判定システム及び判定装置 |
Also Published As
Publication number | Publication date |
---|---|
US8717171B2 (en) | 2014-05-06 |
JPWO2010103584A1 (ja) | 2012-09-10 |
JP5432983B2 (ja) | 2014-03-05 |
US20110316700A1 (en) | 2011-12-29 |
CN102349096A (zh) | 2012-02-08 |
CN102349096B (zh) | 2014-04-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5432983B2 (ja) | 入退検出装置、監視装置及び入退検出方法 | |
JP4924607B2 (ja) | 不審行動検知装置および方法、プログラムおよび記録媒体 | |
CN111741884B (zh) | 交通遇险和路怒症检测方法 | |
US20190259284A1 (en) | Pedestrian detection for vehicle driving assistance | |
EP2801956B1 (en) | Passenger counter | |
US9928404B2 (en) | Determination device, determination method, and non-transitory storage medium | |
US9524643B2 (en) | Orientation sensitive traffic collision warning system | |
JP2014219704A (ja) | 顔認証システム | |
WO2019220589A1 (ja) | 映像解析装置、映像解析方法、及びプログラム | |
US8873804B2 (en) | Traffic monitoring device | |
Pech et al. | Head tracking based glance area estimation for driver behaviour modelling during lane change execution | |
CN111339901B (zh) | 基于图像的入侵检测方法、装置、电子设备及存储介质 | |
US20140002658A1 (en) | Overtaking vehicle warning system and overtaking vehicle warning method | |
WO2020167155A1 (ru) | Способ и система выявления тревожных событий при взаимодействии с устройством самообслуживания | |
CN112633150B (zh) | 一种基于目标轨迹分析的滞留徘徊行为识别方法和系统 | |
CN112699802A (zh) | 一种驾驶员微表情检测装置及方法 | |
JP5143780B2 (ja) | 監視装置及び監視方法 | |
CN118379800A (zh) | 遮挡条件下的人体跌倒检测方法、装置、设备及存储介质 | |
CN111178194A (zh) | 入侵检测方法、装置和设备 | |
CN111104845B (zh) | 检测设备,控制方法及计算机可读记录介质 | |
JP6978986B2 (ja) | 警報システム、警報制御装置及び警報方法 | |
JP2012115570A (ja) | 通過者識別装置 | |
Hariyono et al. | Estimation of collision risk for improving driver's safety | |
CN115249400A (zh) | 用于主动入侵检测的非接触式警报系统 | |
JP7208051B2 (ja) | 状態認識装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980157998.3 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09841416 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2011503571 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13254963 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 09841416 Country of ref document: EP Kind code of ref document: A1 |