WO2017104835A1 - Intrusion detection device, setting support device, intrusion detection method, setting support method, and program recording medium - Google Patents
- Publication number: WO2017104835A1 (international application PCT/JP2016/087649)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- intrusion
- coordinates
- area
- predetermined time
- video
- Prior art date
Classifications
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19608—Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
- G08B13/19639—Details of the system layout
- G08B13/19652—Systems using zones in a single scene defined for different treatment, e.g. outer zone gives pre-alarm, inner zone gives alarm
- G08B13/19678—User interface
- G08B13/1968—Interfaces for setting up or customising the system
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a single remote source
Definitions
- the present invention relates to video surveillance technology, and more particularly to intrusion detection technology.
- This video surveillance technology includes, for example, a warning line technology that detects an object that has passed a line on the video, and a warning region technology that detects an object that has entered a specific area on the video.
- FIG. 5 is an example of a warning line technique. In this example, an object passing through a line segment connecting walls is detected on the video, and a warning is given when the object is a monitoring target. This line segment is sometimes called a warning line.
- FIG. 6 is an example of the alert area technology. In this example, an object entering a specific area is detected on the video, and a warning is given when the object is a monitoring target. This specific area may be called a warning area.
- some systems further set an auxiliary warning area that detects a monitoring target approaching the warning line or the warning area before its passage is detected.
- the auxiliary warning area is set in a wide range including a warning line or a warning area. After detecting an object that has entered the auxiliary warning area as a monitoring target, a warning is given when the monitoring target is within the auxiliary warning area for a predetermined time or more. This predetermined time may be referred to as intrusion duration time.
- the auxiliary alert area is limited to the imaging area. For this reason, the auxiliary warning area often has a complicated shape in accordance with the shooting range and the shape of the site.
- FIG. 7 shows an example of the auxiliary alert area.
- an object entering a specific area is detected on the video, and a warning is given when the object is a monitoring target. This specific area may be called an auxiliary alert area.
- routes i and ii connecting the warning line and the auxiliary warning area have different actual distances. For example, if the intrusion duration of the auxiliary alert area is set assuming route ii, an object that has entered via route i, whose actual distance is shorter than that of route ii, may reach the warning line before the set intrusion duration has elapsed, so the intrusion may be detected too late.
- An object of the present invention is to provide an intrusion detection device, a setting support device, an intrusion detection method, a setting support method, and a program recording medium that can perform video monitoring according to the position at which an object intrudes into a specific area on a video.
- An intrusion detection apparatus of the present invention includes a detection unit that detects the intrusion position of an object that has entered a specific area on a video, and a control unit that associates the intrusion position with a predetermined time. A warning is given when the object stays in the specific area on the video for the predetermined time or longer.
- the intrusion detection method of the present invention detects the intrusion position of an object that has entered a specific area on a video, associates the intrusion position with a predetermined time, and warns when the object stays in the specific area on the video for the predetermined time or longer.
- the program recording medium of the present invention records, in a computer-readable manner, a program that causes a computer to function as a detection unit that detects the intrusion position of an object that has entered a specific area on a video, a control unit that associates the intrusion position with a predetermined time, and a warning unit that warns when the object stays in the specific area on the video for the predetermined time or longer.
- the setting support apparatus includes an acquisition unit that acquires coordinates specified by a user on a video of a three-dimensional space, a calculation unit that calculates the coordinates in the video of a position at a predetermined distance from the position in the three-dimensional space corresponding to the acquired coordinates, and a determination unit that determines the area to be set for the acquired coordinates based on the calculated coordinates.
- the setting support method of the present invention acquires coordinates specified by a user on a video of a three-dimensional space, calculates the coordinates in the video of a position at a predetermined distance from the position in the three-dimensional space corresponding to the acquired coordinates, and determines the area to be set for the acquired coordinates based on the calculated coordinates.
- Another program recording medium of the present invention records, in a computer-readable manner, a program that causes a computer to execute a process of acquiring coordinates designated by a user on a video of a three-dimensional space, a process of calculating the coordinates in the video of a position at a predetermined distance from the position in the three-dimensional space corresponding to the acquired coordinates, and a process of determining the area to be set for the acquired coordinates based on the calculated coordinates.
- According to the intrusion detection device, the setting support device, the intrusion detection method, the setting support method, and the program recording medium of the present invention, it is possible to perform video monitoring according to the position at which an object intrudes into a specific area.
- FIG. 1 is a block diagram showing an example of means for setting an auxiliary alert area according to the first embodiment of the present invention.
- FIG. 2 is a block diagram showing an example of means for setting an auxiliary alert area according to the second embodiment of the present invention.
- FIG. 3 is a block diagram showing an example of means for setting an auxiliary alert area according to the third embodiment of the present invention.
- FIG. 4 is a block diagram showing an example of means for setting an auxiliary alert area according to the fourth embodiment of the present invention.
- FIG. 5 is an example of a warning line technique.
- FIG. 6 is an example of the alert area technology.
- FIG. 7 is an example of the auxiliary alert area.
- FIG. 8 is a block diagram illustrating an example of a hardware configuration of a computer that implements the intrusion detection device according to the first embodiment of the present invention.
- FIG. 9 is a block diagram showing an example of the configuration of a setting support apparatus according to the fifth embodiment of the present invention.
- FIG. 10A is a first schematic diagram for explaining coordinates calculated by a calculation unit.
- FIG. 10B is a second schematic diagram for explaining coordinates calculated by the calculation unit.
- FIG. 10C is a third schematic diagram for explaining coordinates calculated by the calculation unit.
- FIG. 11 is a flowchart illustrating an example of processing executed by the setting support apparatus.
- FIG. 12 is a block diagram illustrating an example of the configuration of the intrusion detection system according to the sixth embodiment.
- FIG. 13 is a block diagram illustrating an example of a hardware configuration of the information processing apparatus.
- FIG. 14 is a flowchart illustrating an example of processing executed by the information processing apparatus.
- FIG. 15 is a flowchart illustrating an example of the setting process.
- FIG. 16A is a diagram illustrating screen transition of the setting screen.
- FIG. 16B is another diagram illustrating the screen transition of the setting screen.
- FIG. 17A is a first diagram illustrating a coordinate calculation method.
- FIG. 17B is a second diagram illustrating a coordinate calculation method.
- FIG. 17C is a third diagram illustrating a coordinate calculation method.
- FIG. 18A is a diagram illustrating a first auxiliary alert area.
- FIG. 18B is a diagram illustrating a second auxiliary alert area.
- FIG. 19 is a diagram illustrating a third auxiliary alert area.
- FIG. 20 is a diagram illustrating a fourth auxiliary alert area.
- FIG. 21 is a flowchart illustrating an example of the detection process and the notification process.
- FIG. 22 is a diagram illustrating the fifth and sixth auxiliary alert areas.
- FIG. 1 is a block diagram showing an example of means for setting an auxiliary alert area according to the first embodiment of the present invention.
- the intrusion detection device 1 includes a detection unit 2 and a control unit 3.
- the detection means 2 detects the position at which an object on the video intruded into the specific area (auxiliary warning area).
- the control means 3 associates positions on the video with predetermined times (intrusion duration times). The detection means 2 then warns the operator when an object stays in the specific area (auxiliary warning area) for at least the predetermined time (intrusion duration) associated with the position that matches the detected intrusion position.
- the predetermined time here is a time determined for each position on the video, and is defined by the operator, for example.
- the detection means 2 detects an object that has entered the auxiliary warning area, and when the object is a monitoring target, specifies the intrusion position of the monitoring target into the auxiliary warning area. Furthermore, the detection means 2 warns the operator when the monitoring target stays in the auxiliary warning area for a predetermined time (intrusion duration) associated with the specified intrusion position.
- the control unit 3 associates the position on the video with a predetermined time (intrusion duration), and informs the detection unit 2 of a set of the associated position and the predetermined time (intrusion duration).
- the detection means 2 warns the operator based on the intrusion duration of the transmitted set whose associated position matches the intrusion position specified by the detection means 2.
- the detection unit 2 may also warn as soon as an object enters the auxiliary alert area.
- the control means 3 may perform the association at regular intervals.
- the control means 3 may perform the association when requested by the detection means 2.
- the control means 3 may hold the sets of associated positions and predetermined times (intrusion durations).
- As an effect, it is possible to perform video monitoring according to the position at which an object intrudes into a specific area on the video.
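As a concrete illustration of the first embodiment, the association and lookup could be sketched as follows. This is a minimal hypothetical sketch, not code from the patent: the class and function names are invented, the control unit holds position-duration pairs, and the detection unit matches a detected intrusion position against them before deciding to warn.

```python
import math

class ControlUnit:
    """Associates positions on the video with intrusion durations (hypothetical)."""
    def __init__(self):
        # (x, y) image coordinate -> intrusion duration in seconds
        self.duration_by_position = {}

    def associate(self, position, duration_s):
        self.duration_by_position[position] = duration_s

class DetectionUnit:
    """Looks up the duration for a detected intrusion position (hypothetical)."""
    def __init__(self, pairs, match_radius_px=20.0):
        self.pairs = pairs                  # dict received from the control unit
        self.match_radius_px = match_radius_px

    def duration_for(self, intrusion_pos):
        # Find the registered position closest to the detected intrusion point.
        best = min(self.pairs, key=lambda p: math.dist(p, intrusion_pos))
        if math.dist(best, intrusion_pos) <= self.match_radius_px:
            return self.pairs[best]
        return None

    def should_warn(self, intrusion_pos, stay_time_s):
        duration = self.duration_for(intrusion_pos)
        return duration is not None and stay_time_s >= duration

control = ControlUnit()
control.associate((100, 50), 10.0)   # route with a long actual distance
control.associate((400, 50), 3.0)    # route with a short actual distance

detector = DetectionUnit(control.duration_by_position)
print(detector.should_warn((402, 48), 5.0))   # True: stayed 5 s >= 3 s
print(detector.should_warn((102, 52), 5.0))   # False: 5 s < 10 s
```

Because each entry point carries its own duration, the short route i and the long route ii of FIG. 7 can trigger warnings at different times.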
- FIG. 2 is a block diagram showing an example of means for setting an auxiliary alert area according to the second embodiment of the present invention.
- the intrusion detection device 1 further includes, in addition to the detection means 2 and the control means 3, an input means 4 that receives an input of an intrusion position into the auxiliary alert area and an input of a predetermined time.
- the intrusion position is, for example, coordinates on the image.
- the intrusion position is input, for example, by the input unit 4 displaying the video on a display, accepting points and lines drawn on the displayed video, and calculating coordinates on the video from the drawn points and lines.
- the input for a predetermined time is, for example, a numerical value input.
- the numerical value may be input using a numeric keypad or may be input using another input method.
- the unit may be seconds or minutes.
- the control means 3 is means for associating the intrusion position with the predetermined time based on the accepted input of the intrusion position and the input for a predetermined time.
- the control means 3 may be realized by a CPU (Central Processing Unit) executing a predetermined program.
- the input unit 4 may receive the input of the intrusion position and the input of the predetermined time as a set, and the control unit 3 may associate the intrusion position and the predetermined time received as that set.
- Alternatively, the control unit 3 may let the user select, while entering a predetermined time, the intrusion position input to be associated with it. Or the control means 3 may perform the association based on the respective order of multiple intrusion position inputs and multiple predetermined time inputs. For example, the control means 3 associates the first intrusion position input with the first predetermined time input, and the second intrusion position input with the second predetermined time input.
- the control means 3 transmits the set of the intrusion position input and the predetermined time input to the detection means 2.
- the control unit 3 may transmit the plurality of sets to the detection unit 2.
- the detection means 2 identifies the intrusion position of the monitoring target into the auxiliary alert area, refers to the predetermined time associated with that intrusion position as the intrusion duration, and warns the operator when the monitoring target stays in the auxiliary alert area for the intrusion duration or longer.
- the detection means 2 is notified from the control means 3 of one or more pairs of the input of the intrusion position and the input for a predetermined time that are associated with each other in the control means 3.
- the detection means 2 may store one or more of these transmitted sets in a storage means (not shown).
- the detection means 2 can specify the intrusion position into the auxiliary alert area to be monitored.
- the detection means 2 may specify the position where the monitoring object crosses the boundary line of the auxiliary warning area as the intrusion position using the warning line technique.
- the detection unit 2 searches the one or more sets transmitted from the control unit 3 for the intrusion position input that indicates the specified intrusion position. Thereafter, the detection unit 2 refers to the predetermined time associated with the found intrusion position input as the intrusion duration.
- the detection unit 2 compares the time during which the monitoring target stays in the auxiliary warning area with the intrusion duration time, and warns the operator when the monitoring target stays longer than the intrusion duration time.
- the input means 4 accepts the input of the intrusion position into the auxiliary alert area and the input for a predetermined time.
- the input means 4 notifies the control means 3 of the input of the accepted intrusion position and the input for a predetermined time.
- control means 3 associates the intrusion position with the predetermined time based on the received intrusion position input and the predetermined time input.
- the method of associating is as described above.
- the control means 3 associates the intrusion position with the predetermined time based on the received entry of the intrusion position and the input for a predetermined time, and informs the detection means 2 of the set of the associated intrusion position and the predetermined time.
- the detecting means 2 identifies the intrusion position into the monitored auxiliary alert area asynchronously with the transmission of the intrusion position and the predetermined time by the control means 3. This identification may be performed at regular intervals, for example, every 30 seconds or every minute.
- When the detection means 2 detects an intrusion into the monitored auxiliary warning area and specifies the intrusion position, it searches the one or more sets transmitted from the control means 3 for the intrusion position input indicating the specified intrusion position. Thereafter, the detection unit 2 refers to the predetermined time associated with the found intrusion position input as the intrusion duration. Then, the detection unit 2 compares the time during which the monitoring target has stayed in the auxiliary warning area with the intrusion duration, and warns the operator when the monitoring target stays longer than the intrusion duration.
- In the above description, the detection unit 2 specifies the intrusion position asynchronously with the transmission from the control unit 3 to the detection unit 2, but the order of the two is not limited to this. For example, the detection means 2 may start specifying the intrusion position when the intrusion position and the predetermined time are transmitted from the control means 3. Conversely, the control means 3 may transmit the intrusion position and the predetermined time when the detection means 2 has identified the intrusion position.
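The intrusion position that the detection means identifies, i.e. the point where the target's track crosses the boundary of the auxiliary alert area, as in the warning line technique, could be computed with a standard segment intersection test. The following is a minimal sketch with assumed coordinates, not an implementation from the patent:

```python
def segment_intersection(p1, p2, q1, q2):
    """Return the intersection point of segments p1-p2 and q1-q2, or None."""
    (x1, y1), (x2, y2) = p1, p2
    (x3, y3), (x4, y4) = q1, q2
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:
        return None                      # parallel or degenerate
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    u = ((x1 - x3) * (y1 - y2) - (y1 - y3) * (x1 - x2)) / denom
    if 0.0 <= t <= 1.0 and 0.0 <= u <= 1.0:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None

# One boundary edge of the auxiliary alert area and two consecutive
# track points of the monitoring target (illustrative values).
boundary = ((0, 100), (200, 100))
track_prev, track_now = (50, 120), (50, 80)   # the target crossed the edge

intrusion_pos = segment_intersection(track_prev, track_now, *boundary)
print(intrusion_pos)   # (50.0, 100.0)
```

In practice this test would run for each edge of the area boundary against each consecutive pair of tracked positions; the first edge crossed gives the intrusion position to look up.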
- As an effect, it is possible to perform video monitoring according to the position at which an object intrudes into a specific area on the video.
- FIG. 3 is a block diagram showing an example of means for setting an auxiliary alert area according to the third embodiment of the present invention.
- the intrusion detection device 1 further includes an input unit 5 that receives an input of an intrusion position into the auxiliary alert area in addition to the detection unit 2 and the control unit 3.
- the input means 5 receives an input of an intrusion position into the auxiliary alert area. Further, the input means 5 transmits the received entry of the intrusion position to the control means 3. The control means 3 sets a predetermined time based on the input of the intrusion position.
- the setting of the predetermined time of the control means 3 can be performed by the following method, for example.
- the control means 3 acquires a warning line, and calculates the shortest distance between the acquired warning line and the input intrusion position.
- the control means 3 acquires the moving speed of the object.
- the control means 3 calculates a predetermined time from the acquired moving speed of the object and the calculated shortest distance.
- Acquisition of the moving speed of the object of the control means 3 can be performed by the following method, for example.
- the control means 3 accepts the input as numerical values.
- the control unit 3 calculates the moving speed of the object from the image.
- the calculation of the predetermined time of the control means 3 can be performed by the following method, for example.
- the control means 3 obtains a value obtained by dividing the shortest distance calculated by the above method by the moving speed calculated by the above method.
- the control means 3 may also add a margin to the obtained value and use the sum as the predetermined time.
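The duration calculation described above (the shortest distance from the warning line to the intrusion position, divided by the object's moving speed, optionally plus a margin) can be sketched as follows; the function names and values are illustrative, and the geometry is assumed to already be in metres:

```python
import math

def shortest_distance_to_segment(point, seg_start, seg_end):
    """Shortest distance from a point to a line segment (all in metres)."""
    px, py = point
    ax, ay = seg_start
    bx, by = seg_end
    abx, aby = bx - ax, by - ay
    ab2 = abx * abx + aby * aby
    # Clamp the projection parameter so the closest point stays on the segment.
    t = 0.0 if ab2 == 0 else max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / ab2))
    cx, cy = ax + t * abx, ay + t * aby
    return math.hypot(px - cx, py - cy)

def intrusion_duration(intrusion_pos, warning_line, speed_m_s, margin_s=0.0):
    """Predetermined time = shortest distance / moving speed (+ optional margin)."""
    d = shortest_distance_to_segment(intrusion_pos, *warning_line)
    return d / speed_m_s + margin_s

warning_line = ((0.0, 0.0), (10.0, 0.0))
print(intrusion_duration((5.0, 6.0), warning_line, speed_m_s=1.5))  # 4.0
```

Setting the duration this way makes the warning fire only if the target lingers longer than a direct approach would take, for every entry point individually.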
- effect it is possible to perform video monitoring according to an intrusion position of a specific area on an image of an object.
- FIG. 4 is a block diagram showing an example of means for setting an auxiliary alert area according to the fourth embodiment of the present invention.
- the intrusion detection apparatus 1 further includes an input unit 6 that receives an input for a predetermined time in addition to the detection unit 2 and the control unit 3.
- the input means 6 receives an input for a predetermined time. Further, the input means 6 transmits the received input for a predetermined time to the control means 3.
- the control means 3 sets a specific area (auxiliary warning area) based on the accepted input for a predetermined time.
- the setting of the auxiliary warning area of the control means 3 can be performed by the following method, for example. For example, the control means 3 acquires the moving speed of the object, and calculates the moving distance of the object in a predetermined time from the acquired moving speed of the object and an input for a predetermined time.
- the control means 3 receives an input of a warning line, calculates, as object intrusion positions, the coordinates at the calculated moving distance from the received warning line, and sets the calculated object intrusion positions. There may be a plurality of such intrusion positions.
- the set of intrusion positions of the plurality of objects forms a line segment. This line segment is an auxiliary warning line.
- the area surrounded by the auxiliary warning line is the auxiliary warning area.
- an auxiliary warning line can be generated by inputting a predetermined time.
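The area-setting method above (offsetting the warning line by moving speed × predetermined time to obtain an auxiliary warning line) might be sketched like this, assuming the geometry is expressed in world coordinates; all names and values are illustrative, not from the patent:

```python
import math

def offset_line(seg_start, seg_end, distance):
    """Offset a segment perpendicular to itself by `distance` (world units)."""
    (ax, ay), (bx, by) = seg_start, seg_end
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    nx, ny = -dy / length, dx / length      # unit normal to the segment
    return ((ax + nx * distance, ay + ny * distance),
            (bx + nx * distance, by + ny * distance))

speed_m_s = 1.2        # assumed moving speed of the object
duration_s = 5.0       # the input predetermined time

warning_line = ((0.0, 0.0), (10.0, 0.0))
aux_line = offset_line(*warning_line, speed_m_s * duration_s)
print(aux_line)   # ((0.0, 6.0), (10.0, 6.0))
```

The region between `warning_line` and `aux_line` then serves as the auxiliary warning area: any point inside it is, by construction, at most `speed × duration` away from the warning line.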
- FIG. 8 is a block diagram showing an example of a hardware configuration of a computer that realizes the intrusion detection apparatus according to the first embodiment of the present invention.
- the computer 600 includes a processor 610, a memory 620, a storage 630, and an interface 640.
- the processor 610 is, for example, a CPU (Central Processing Unit).
- the memory 620 corresponds to a main storage device.
- the storage 630 corresponds to an auxiliary storage device.
- the storage 630 is configured by, for example, a hard disk or a flash memory.
- the storage 630 may include a reader / writer of a removable recording medium such as an optical disk or a USB (Universal Serial Bus) flash drive.
- the interface 640 transmits / receives data to / from an external device.
- the processor 610 can function as the detection unit 2 and the control unit 3 of the intrusion detection device 1 by executing a program stored in the memory 620 or the storage 630.
- the present invention can provide an intrusion detection method in addition to an intrusion detection device.
- the present invention can also be provided in the form of a program for causing a computer to function as an intrusion detection device or a computer-readable recording medium (such as an optical disk, a magnetic disk, or a semiconductor memory) that records the program.
- the program according to the present invention may be downloaded to a certain device via a network and cause the device to function as an intrusion detection device.
- FIG. 9 is a block diagram showing the configuration of the setting support apparatus 100 according to the fifth embodiment of the present invention.
- the setting support apparatus 100 is an information processing apparatus for supporting (facilitating) setting of an area performed by a user based on video.
- the setting support apparatus 100 includes at least an acquisition unit 110, a calculation unit 120, and a determination unit 130. Note that the hardware configuration of the setting support apparatus 100 may be the same as that of the computer 600 illustrated in FIG.
- the acquisition unit 110 acquires coordinates in the video.
- This coordinate represents a line (straight line, curved line, broken line, etc.) or area designated by the user with respect to an image of a three-dimensional space (that is, a real space) photographed by a photographing device such as a monitoring camera.
- the coordinates acquired by the acquisition unit 110 are represented by, for example, a two-dimensional coordinate system having a predetermined position (end point, center, etc.) of the video as an origin. It can be said that the video imaged by the imaging device in the present embodiment is a video image with depth.
- the number of coordinates acquired by the acquisition unit 110 is not limited to a specific number as long as a line can be defined.
- the acquisition unit 110 may acquire the coordinates of the end points (start point and end point) of the line segment.
- the acquisition unit 110 may acquire the coordinates of the end points of a plurality of line segments that form the broken line.
- the designation of coordinates is performed via an input device such as a mouse or a touch screen display, for example.
- the acquisition unit 110 acquires each coordinate on the hand-drawn line.
- the calculation unit 120 calculates the coordinates of a position at a predetermined distance from the position in the three-dimensional space corresponding to the coordinates acquired by the acquisition unit 110.
- the position here is different from the coordinates in the (two-dimensional) image in that it is a position on a plane in an actual three-dimensional space. More specifically, the calculation unit 120 calculates the coordinates in the video at a position at a predetermined distance from the position in the three-dimensional space of the line specified by the user.
- FIG. 10A, 10B, and 10C are schematic diagrams for explaining coordinates calculated by the calculation unit 120.
- FIG. 10A is a diagram illustrating a line L1 designated for an image.
- FIG. 10B is a diagram illustrating a line L1a in the three-dimensional space corresponding to the line L1 and a line L2a connecting positions that are equidistant from the line L1a.
- FIG. 10C is a diagram illustrating the line L1 and the line L2 on the video corresponding to the line L2a.
- FIGS. 10A and 10C are different in perspective in the y 1 axis direction in the drawings. That is, it is assumed that the images illustrated in FIGS. 10A and 10C represent positions that are farther away as coordinates having a larger y 1 -axis component. Also, the object in the video is photographed smaller in the video as the distance from the photographing device is longer. This is because the magnification of the object in the video is inversely proportional to the distance between the object and the imaging device. Therefore, in the images illustrated in FIGS. 10A and 10C, an object having the same size is photographed smaller as the y 1 -axis component is at a larger coordinate.
- The x1y1 coordinate system in FIGS. 10A and 10C is a coordinate system (screen coordinate system) defined for an image.
- Coordinates based on this coordinate system, that is, coordinates on the image, are numerical values indicating the position of each pixel relative to a pixel at a predetermined position of the image (the origin).
- The x2y2 coordinate system in FIG. 10B is a coordinate system (world coordinate system) corresponding to the existing three-dimensional space, and is different from the coordinate system in FIGS. 10A and 10C.
- For example, the x2-axis component and the y2-axis component in FIG. 10B correspond to latitude and longitude.
- The line L2 in FIG. 10C does not have the same shape as the line L2a in FIG. 10B. This is because an apparent distortion (deformation) due to perspective occurs in the images represented by FIGS. 10A and 10C. Therefore, even though the line L2a is equidistant from the line L1a in the actual three-dimensional space, the corresponding line L2 is not equidistant from the line L1 in the video. More specifically, the line L2 is closer to the line L1 on the image where the y1-axis component is larger.
- the calculation unit 120 calculates the coordinates of each point on the line L2 in FIG. 10C using a predetermined function.
- This function can be defined based on, for example, calibration executed in advance. The calibration is performed by photographing reference objects of known sizes (bars of a predetermined length, marks of a predetermined size, etc.) and associating their apparent size in the video (for example, the number of pixels) with their actual size.
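One common way to realize such a calibrated function is a ground-plane homography that maps screen coordinates to world coordinates, from which real distances follow. The sketch below assumes this approach; the matrix `H_id` is a placeholder, not an actual calibration result:

```python
import math

# Sketch of a calibrated distance function: a 3x3 ground-plane
# homography H (obtained beforehand, e.g. by photographing reference
# objects of known size) maps image coordinates to world coordinates;
# the real distance is then measured between the mapped points.

def to_world(H, p):
    """Apply homography H to screen point p = (x, y)."""
    x, y = p
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

def real_distance(H, p_a, p_b):
    """Actual three-dimensional-space distance between two points
    given by their on-screen coordinates."""
    xa, ya = to_world(H, p_a)
    xb, yb = to_world(H, p_b)
    return math.hypot(xa - xb, ya - yb)

# With the identity homography, image and world coordinates coincide:
H_id = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
assert real_distance(H_id, (0, 0), (3, 4)) == 5.0
```

A real calibration would estimate `H` from the photographed reference objects; here the identity matrix merely demonstrates the interface.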
- the determining unit 130 determines an area set for the coordinates acquired by the acquiring unit 110 based on the coordinates calculated by the calculating unit 120. For example, the determination unit 130 determines a region (for example, a region inside the line L2 in FIG. 10C) surrounded by the closed curve represented by the coordinates calculated by the calculation unit 120 as a region corresponding to the line L1.
- the area determined by the determination unit 130 is also referred to as a “setting area”.
- the determining unit 130 may set an area partially different from the area surrounded by the closed curve represented by the coordinates calculated by the calculating unit 120 as the setting area. For example, the determination unit 130 may set a part of the region surrounded by the closed curve represented by the coordinates calculated by the calculation unit 120 as the setting region. In other words, the determination unit 130 may set a region excluding a part of the region surrounded by the closed curve represented by the coordinates calculated by the calculation unit 120 as the setting region. At this time, the determination unit 130 may determine an area to be excluded from the setting area based on other information. The other information here is, for example, coordinates acquired by the acquisition unit 110, features extracted from the video, predetermined rules, and the like. The determination unit 130 may determine an area to be excluded from the setting area based on a user operation.
- FIG. 11 is a flowchart showing processing executed by the setting support apparatus 100.
- In step S11, the acquisition unit 110 acquires the coordinates that the user specified with a line on a video of a three-dimensional space. At this time, the video is shot by the shooting device and displayed on the display device. The user designates coordinates on the image displayed by the display device using an input device such as a mouse. Taking FIG. 10A as an example, the acquisition unit 110 acquires the coordinates defining the line L1 (for example, the start point and the end point of the line L1).
- In step S12, the calculation unit 120 calculates the coordinates of positions at a predetermined distance from the position in the three-dimensional space corresponding to the line represented by the coordinates acquired in step S11. Taking FIG. 10C as an example, the calculation unit 120 calculates each coordinate on the line L2 based on the coordinates defining the line L1.
- In step S13, the determination unit 130 determines a setting area based on the coordinates calculated in step S12.
- the determination unit 130 determines the setting area so as to include at least a part of the area surrounded by the line L2.
- the determination unit 130 may set a part or all of the region surrounded by the line L2 as the setting region.
- the determination unit 130 may determine the setting region so as to include not only the region surrounded by the line L2 but also other regions using the other information described above.
- the setting support apparatus 100 has a configuration in which the setting area is determined based on the line specified by the user. With this configuration, the user only needs to specify a line when setting the setting area, and does not need to input the setting area itself. Therefore, according to the setting support apparatus 100 of the present embodiment, it becomes easy for the user to accurately set the setting area for a video having a depth. That is, the setting support apparatus 100 can support the setting work by the user.
- FIG. 12 is a block diagram illustrating a configuration of an intrusion detection system 200 according to the sixth embodiment.
- the intrusion detection system 200 is an information processing system for detecting an intrusion of an object.
- the object here is a human such as a suspicious person.
- the object here may be an animal other than a human being or a movable machine such as an automobile or a robot. In the following, it is assumed that the object detected by the intrusion detection system 200 is a human.
- “Intrusion” here refers to an intrusion of an object into a specific area that may be fraudulent. However, in each embodiment of the present invention, it does not have to be a problem whether an object that has entered a specific area actually has an unauthorized purpose. For example, whether or not an object that has entered a specific area has an illegal purpose may be determined using a system different from the intrusion detection system 200 or may be determined by a human. In other words, the intrusion detection system 200 can be said to be a system for detecting signs or possibilities of intrusion or a system for detecting the entry of an object (regardless of whether it is illegal or not).
- the intrusion detection system 200 includes at least an information processing device 210, a photographing device 220, an input device 230, and a display device 240.
- the intrusion detection system 200 may include a plurality of information processing devices 210, imaging devices 220, input devices 230, and display devices 240.
- the information processing apparatus 210, the imaging apparatus 220, the input apparatus 230, and the display apparatus 240 may be configured such that a part or all of these are configured as a single apparatus.
- the intrusion detection system 200 may include other configurations in addition to the information processing device 210, the imaging device 220, the input device 230, and the display device 240.
- For example, the intrusion detection system 200 may include a device (speaker, siren, warning light, etc.) for notifying of the detection of an intrusion.
- the information processing apparatus 210 detects a person using video. In addition, the information processing apparatus 210 supports settings performed by a user (operator) to detect a person.
- the information processing apparatus 210 is a computer apparatus such as a personal computer, for example.
- the information processing device 210 is communicably connected to the imaging device 220, the input device 230, and the display device 240. Communication by the information processing apparatus 210 may be either wired or wireless, and may be performed via another apparatus (that is, indirectly).
- the imaging device 220 captures an image.
- the imaging device 220 is, for example, a surveillance camera that is installed in a certain place and continuously captures a specific area.
- the imaging device 220 captures an area to be monitored and generates video data representing the video in the area.
- the imaging device 220 supplies this video data to the information processing device 210.
- the input device 230 accepts user operations.
- the input device 230 is, for example, a mouse or a keyboard.
- the input device 230 may be a touch screen display that is configured integrally with the display device 240.
- the input device 230 supplies input data representing a user operation to the information processing device 210.
- Display device 240 displays an image.
- the display device 240 is, for example, a liquid crystal display.
- the display device 240 displays an image corresponding to the image data supplied from the information processing device 210.
- the display device 240 may display an image captured by the image capturing device 220.
- the display device 240 may display a screen (hereinafter also referred to as “setting screen”) for the user to execute various settings related to monitoring.
- the intrusion detection system 200 may be configured to include a display device that displays video captured by the imaging device 220 and another display device that displays a setting screen.
- FIG. 13 is a block diagram illustrating a hardware configuration of the information processing apparatus 210.
- the information processing apparatus 210 includes a control unit 211, a storage unit 212, and an interface unit 213.
- the information processing apparatus 210 corresponds to an example of the setting support apparatus 100 described in the fifth embodiment. More specifically, the information processing apparatus 210 can realize a function corresponding to the setting support apparatus 100 when the control unit 211 executes a predetermined program.
- the control unit 211 includes a processor (arithmetic processing unit) such as a CPU (Central Processing Unit) and a main memory (main storage device).
- The control unit 211 may be configured to include a plurality of processors, such as a GPU (Graphics Processing Unit) in addition to the CPU.
- the control unit 211 implements several functions related to person detection by executing a program.
- the storage unit 212 stores data used for the control unit 211.
- the storage unit 212 may store a program executed by the control unit 211.
- the storage unit 212 includes a storage device such as a hard disk drive. Further, the storage unit 212 may include a reader / writer of a storage medium (memory card or the like) that can be attached to and detached from the information processing apparatus 210. Data exchange in the information processing apparatus 210 may be performed via this removable storage medium.
- the interface unit 213 exchanges data with the imaging device 220, the input device 230, and the display device 240.
- the interface unit 213 can transmit and receive data in accordance with a predetermined standard such as USB (Universal Serial Bus) or HDMI (High-Definition Multimedia Interface).
- the interface unit 213 may include an interface connected to a network such as the Internet.
- the configuration of the intrusion detection system 200 is as described above. With this configuration, the intrusion detection system 200 detects a person based on the video imaged by the imaging device 220.
- the information processing apparatus 210 performs the following processing as processing related to person detection.
- FIG. 14 is a flowchart showing an outline of processing executed by the information processing apparatus 210.
- the processing executed by the information processing apparatus 210 is roughly divided into setting processing (step S21), detection processing (step S22), and notification processing (step S23).
- the detection process does not necessarily have to be executed following the setting process. Since it is sufficient that the setting process is executed at least once in advance, it is not a necessary process every time the detection process is executed.
- the setting process is a process for setting a warning line and an auxiliary warning area.
- the alert line of the present embodiment is a straight line set by a user operation.
- the auxiliary warning area of this embodiment is an area set based on the warning line, and corresponds to an example of a setting area in the fifth embodiment.
- the setting process of the present embodiment includes a process for assisting the user in setting an auxiliary alert area.
- the detection process is a process for detecting a person entering the auxiliary alert area.
- the detection process may further include a process for detecting passage of a warning line by a person.
- the detection process may include a process for detecting a person staying in the auxiliary warning area, that is, a person staying in the auxiliary warning area for a predetermined time or more.
- the person detected by the detection process is also referred to as “a person who needs attention”.
- the notification process is a process for notifying the detection result of the detection process.
- In the notification process, for example, entry into or stay in the auxiliary alert area, or passage of the alert line, by a person requiring attention is notified.
- the notification by the notification process may be executed by the display device 240 or may be executed by a siren or a warning light.
- FIG. 15 is a flowchart showing details of the setting process.
- In step S211, the control unit 211 causes the display device 240 to display a setting screen. More specifically, the control unit 211 supplies image data for displaying the setting screen to the display device 240 via the interface unit 213.
- FIG. 16A and 16B are diagrams illustrating screen transitions of the setting screen displayed in step S211.
- FIG. 16A is a diagram illustrating an example of a setting screen.
- the setting screen SC1 includes at least an image shot by the shooting device 220.
- the setting screen may include a message for prompting the user to input, such as “Please enter a warning line”.
- the user inputs a warning line using the input device 230.
- the video illustrated in the present embodiment includes a description that is emphasized, exaggerated, or simplified for easy understanding.
- In step S212, the control unit 211 acquires coordinates. More specifically, the control unit 211 acquires coordinates by acquiring input data from the input device 230 via the interface unit 213.
- the coordinates defining the warning line are the two end points of the line segment that is the warning line.
- FIG. 16B is a diagram illustrating a warning line L21 specified for the video illustrated in FIG. 16A.
- the warning line L21 is a line segment connecting the coordinates P21 and the coordinates P22.
- the user can set the warning line L21 by designating the coordinates P21 and P22 using the input device 230.
- In step S213, the control unit 211 calculates the coordinates of positions at a predetermined distance (for example, 100 m) from the warning line.
- the control unit 211 calculates the coordinates of a position where the distance from the warning line in the existing space is constant.
- the control unit 211 calculates coordinates according to the following equation (1).
- R_dist = f(P_a, P_b) … (1)
- Here, f(P_a, P_b) is a function that returns the actual distance between the coordinates P_a and P_b on the video (that is, the actual distance in the three-dimensional space).
- R_dist represents a constant corresponding to the predetermined distance.
- The function f(P_a, P_b) can be defined in advance by calibration using the video imaged by the imaging device 220.
- For example, the function f(P_a, P_b) can be implemented as a function that converts the coordinates P_a and P_b in the screen coordinate system into two coordinates in the world coordinate system and calculates the distance between them.
- the control unit 211 performs such calculation for all the coordinates on the video included in the warning line L21. For example, the control unit 211 calculates coordinates as follows.
- FIG. 17A is a diagram illustrating a curve C1 which is a set of coordinates whose actual distance from the coordinate P21 is equal.
- the curve C1 is a closed curve approximated to a circle centered on the coordinate P21, but is not strictly a perfect circle.
- This is because the apparent distance from the coordinate P21 becomes smaller as the actual distance from the imaging device 220 increases.
- FIG. 17B is a diagram illustrating a curve C2 that is a set of coordinates in which the actual distance from the coordinate P22 is the same distance in addition to the curve C1.
- the curve C2 is a circular figure similar to the curve C1, but the apparent size is smaller than the curve C1. This is because the coordinate P22 is farther from the imaging device 220 than the coordinate P21. Note that the control unit 211 calculates similar curves not only for the coordinates P21 and P22 but also for all the coordinates included in the warning line L21.
- FIG. 17C is a diagram illustrating an area A1 serving as a basis for determining the auxiliary alert area.
- The area A1 is the set of coordinates P_a satisfying R_dist ≥ f(P_a, P_b), where P_b is an arbitrary coordinate on the warning line L21. The boundary line of the area A1 is the set of coordinates whose actual distance from the warning line L21 is equal to R_dist.
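The set R_dist ≥ f(P_a, P_b) can be approximated by brute force over the pixels, as in the following sketch. Plain Euclidean distance stands in for the calibrated function f, and the grid size and R_dist value are illustrative:

```python
import math

# Brute-force sketch of the area A1: the set of pixels P_a whose
# distance to some point P_b on the warning line is at most r_dist.

def area_mask(line_pts, r_dist, width, height, dist=math.dist):
    """Return the set of grid coordinates within r_dist of the line,
    where line_pts is the set of points making up the warning line."""
    mask = set()
    for x in range(width):
        for y in range(height):
            if any(dist((x, y), q) <= r_dist for q in line_pts):
                mask.add((x, y))
    return mask

# A horizontal warning line from (2, 5) to (7, 5), r_dist = 2:
line = [(x, 5) for x in range(2, 8)]
mask = area_mask(line, 2.0, 10, 10)
```

In the actual system the per-pixel distance would come from the calibrated f of Expression (1) rather than `math.dist`, so the resulting mask would be the perspective-distorted region A1 of FIG. 17C instead of a uniform band.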
- In step S214, the control unit 211 determines an auxiliary alert area based on the coordinates calculated in step S213.
- the control unit 211 can determine the auxiliary alert area by any of the following methods.
- the control unit 211 may use the area A1 in the example of FIG. 17C as an auxiliary warning area as it is. This method requires the least amount of calculation compared to other methods described later.
- The control unit 211 may determine the auxiliary warning area based on the direction in which the person crosses the warning line. This method can be said to determine the auxiliary alert area based on the moving direction of the person.
- the direction of movement of the person is predetermined with respect to the warning line.
- the movement direction of the person may be set by the user via the setting screen.
- the movement direction of the person may be determined based on the actual movement of the person detected from the video.
- the movement direction of the person may be patterned in several typical directions (for example, two directions).
- FIG. 18A and FIG. 18B are diagrams exemplifying auxiliary alert areas determined based on the movement direction of the person.
- the auxiliary alert areas A2 and A3 are both determined based on the area A1 in FIG. 17C.
- the auxiliary warning area A2 is an auxiliary warning area when the moving direction of the person is the direction of the arrow D1.
- the auxiliary warning area A3 is an auxiliary warning area when the moving direction of the person is the direction of the arrow D2.
- The control unit 211 sets, as the auxiliary warning area, the part of the area A1 that remains after excluding the area beyond the warning line L21 as seen from the side the person approaches. More specifically, the control unit 211 specifies the intersections of the straight line including the warning line L21 with the boundary line of the area A1, and, based on the moving direction of the person, sets one of the two parts of the area A1 divided by that straight line as the auxiliary alert area.
- the control unit 211 may determine an auxiliary warning area according to the moving direction of the person. For example, when a warning line is set at a place where the moving direction is limited to one direction, the control unit 211 may determine an auxiliary warning area as shown in FIGS. 18A and 18B.
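One way to realize the exclusion described above is a signed-area (cross-product) test that keeps only the side of the warning line from which a person moving in the given direction would arrive. The following is a sketch under that assumption; the function names are illustrative:

```python
def cross(o, a, b):
    """z-component of the cross product (a - o) x (b - o)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def approach_side(points, p1, p2, direction):
    """Keep only the points of a candidate area lying on the side of
    the warning line (p1 -> p2) from which a person moving along
    `direction` would arrive; the area 'beyond' the line is dropped."""
    dx, dy = direction
    # Sign of the line vector crossed with the movement direction:
    line_vs_dir = (p2[0] - p1[0]) * dy - (p2[1] - p1[1]) * dx
    # A point on the approach side has the opposite sign.
    return [p for p in points if cross(p1, p2, p) * line_vs_dir < 0]

# Warning line from (0, 0) to (4, 0); the person moves upward (0, 1):
pts = [(2, -1), (2, 1)]
kept = approach_side(pts, (0, 0), (4, 0), (0, 1))
```

Here the point below the line (where the moving person starts) is kept and the point beyond it is excluded, matching the areas A2 and A3 of FIGS. 18A and 18B for opposite movement directions.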
- The auxiliary warning area determined in this way makes it possible to reduce false detections of intrusion (that is, detections not intended by the user) by excluding from the area A1 an area unnecessary for detection. In this way, the user can use a more suitable auxiliary warning area.
- The control unit 211 may determine the auxiliary alert area based on a user operation. For example, the control unit 211 may determine the auxiliary alert area based on the coordinates acquired in step S212, that is, the coordinates designated by the user as the end points of the alert line. Alternatively, the control unit 211 may determine the auxiliary alert area as in the following example.
- FIG. 19 is a diagram showing another example of the auxiliary alert area.
- the auxiliary alert area A4 is determined based on the area A1 in FIG. 17C.
- The control unit 211 specifies the intersections of the perpendiculars to the warning line L21 passing through its end points with the boundary line of the area A1, and sets the area surrounded by the boundary line, the perpendiculars, and the warning line L21 as the auxiliary warning area.
- the auxiliary warning area determined in this way can also reduce false detection of intrusion, as in the example of FIGS. 18A and 18B.
- The control unit 211 may determine the auxiliary warning area using features extracted from the video imaged by the imaging device 220.
- the feature here is, for example, an edge or HOG (Histograms of Oriented Gradients) feature.
- the control unit 211 can extract such features from the video imaged by the imaging device 220, and can determine an auxiliary warning area based on the extracted features.
- FIG. 20 is a diagram showing still another example of the auxiliary alert area.
- the auxiliary alert area A5 is determined based on the area A1 in FIG. 17C.
- the control unit 211 detects edges E1 and E2 in the vicinity of the warning line L21.
- the edges E1 and E2 are, for example, a pixel group in which the change in brightness in a specific direction in the video is larger than a predetermined threshold value.
- the control unit 211 sets an area surrounded by the warning line L21, the edges E1 and E2, and the boundary line of the area A1 as an auxiliary warning area.
- the auxiliary warning area determined in this way can reduce false detection of intrusion.
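The edge criterion described above (brightness change in a specific direction exceeding a threshold) can be sketched on a toy brightness array as follows; the direction is assumed horizontal for illustration:

```python
# Minimal sketch of the edge criterion in this embodiment: pixels
# whose brightness change in a specific direction (here, horizontal)
# exceeds a predetermined threshold.

def horizontal_edges(img, threshold):
    """Return the set of (x, y) positions where the brightness jump
    between horizontally adjacent pixels exceeds the threshold."""
    edges = set()
    for y, row in enumerate(img):
        for x in range(len(row) - 1):
            if abs(row[x + 1] - row[x]) > threshold:
                edges.add((x, y))
    return edges

# A dark-to-bright step between columns 1 and 2 yields an edge there:
img = [[10, 10, 200, 200],
       [10, 10, 200, 200]]
found = horizontal_edges(img, 50)
```

A production system would use a proper gradient operator on both axes, but the principle of thresholding a directional brightness change is the same as for the edges E1 and E2 in FIG. 20.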
- control unit 211 may be configured to select one of a plurality of auxiliary warning area candidates.
- the control unit 211 may display a plurality of auxiliary warning area candidates on the display device 240 together with the video imaged by the imaging device 220, and select any of them according to the user's operation. For example, the user confirms a candidate for the auxiliary warning area displayed superimposed on the video, and selects any desired one.
- the control unit 211 may display a plurality of candidates with different R dist on the display device 240 and allow the user to select them.
- In step S215, the control unit 211 records setting information in the storage unit 212.
- the setting information includes information indicating a warning line and information indicating an auxiliary warning area.
- the setting information stored in the storage unit 212 is used in the detection process.
- the setting information is, for example, coordinates indicating the warning line and the boundary of the auxiliary warning area.
- FIG. 21 is a flowchart showing details of the detection process (steps S221 to S224) and the notification process (steps S231 to S232).
- the control unit 211 starts executing a series of processes shown in FIG. 21 at the timing of starting monitoring by video.
- the processing start timing by the control unit 211 is, for example, timing when the imaging device 220 starts imaging or timing instructed by the user.
- the control unit 211 executes the following processing for each frame of the video. For convenience of explanation, it is assumed in the following that no more than one person is detected in each frame.
- step S221 the control unit 211 determines whether a person is recognized from the video.
- the control unit 211 can recognize a person using a known object recognition technique. Recognition by the control unit 211 may be either general object recognition or specific object recognition. That is, the control unit 211 may recognize an object having a characteristic like a person, or may recognize a person having a specific characteristic recorded in a database (so-called black list) in advance.
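As a rough illustration of the HOG-type features usable for such recognition, the following computes a magnitude-weighted histogram of gradient orientations over a toy image. Real detectors (for example, OpenCV's HOGDescriptor) add per-cell histograms and block normalization that this sketch omits:

```python
import math

# Toy histogram-of-oriented-gradients computation over a brightness
# array: central-difference gradients, unsigned orientation in
# [0, pi), magnitude-weighted voting into n_bins bins.

def orientation_histogram(img, n_bins=9):
    hist = [0.0] * n_bins
    h, w = len(img), len(img[0])
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = img[y][x + 1] - img[y][x - 1]
            gy = img[y + 1][x] - img[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.atan2(gy, gx) % math.pi   # unsigned orientation
            hist[min(int(ang / math.pi * n_bins), n_bins - 1)] += mag
    return hist

# A purely vertical edge puts all weight in the 0-radian bin:
img = [[0, 0, 9, 9]] * 4
h = orientation_histogram(img)
```

A classifier trained on such descriptors (as in the original HOG person detector) would then decide whether a window contains a person; this sketch only shows the feature side.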
- When no person is recognized from the video (S221: NO), the control unit 211 ends the process without executing the notification process.
- In step S222, the control unit 211 determines whether the person recognized in step S221 has entered the auxiliary alert area. At this time, the control unit 211 specifies the coordinates of the auxiliary alert area based on the setting information recorded in the setting process. The control unit 211 may determine that the person has entered the area when at least a part of the recognized person is included in the auxiliary alert area, or only when the whole person is included in it. When the person recognized in step S221 has not entered the auxiliary alert area (S222: NO), the control unit 211 ends the process without executing the notification process.
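If the auxiliary alert area is stored as a polygon of boundary coordinates, the entry determination of step S222 can be sketched with a standard ray-casting point-in-polygon test. Using the person's feet as the reference point is an assumption for illustration:

```python
# Ray-casting point-in-polygon test: cast a ray to the right of the
# point and count crossings with polygon edges; an odd count means
# the point is inside.

def inside(point, polygon):
    x, y = point
    n = len(polygon)
    result = False
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):                    # edge spans the ray
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                result = not result
    return result

# A square auxiliary alert area and a candidate foot position:
square = [(0, 0), (10, 0), (10, 10), (0, 10)]
```

In practice the polygon would be the auxiliary alert area boundary recorded in the setting information, and the tested point a representative coordinate of the recognized person.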
- When the person recognized in step S221 has entered the auxiliary alert area (S222: YES), the control unit 211 further executes a measurement process in step S223.
- the measurement process is a process of measuring the length of time during which a certain person is detected in the auxiliary alert area (hereinafter also referred to as “detection time”).
- If the person recognized in step S221 was not detected in the auxiliary alert area in the immediately preceding frame, the control unit 211 starts measuring the detection time.
- Otherwise, the control unit 211 adds one frame's worth of time to the detection time already measured.
- The control unit 211 resets the detection time when the person is no longer detected in the auxiliary alert area.
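The measurement process can be sketched as a per-frame dwell timer, where one frame counts as 1/fps seconds. The class name and frame rate below are illustrative, not part of the embodiment:

```python
# Per-frame accounting of how long a person has stayed inside the
# auxiliary alert area (the "detection time" of the measurement
# process).

class DwellTimer:
    def __init__(self, fps=30.0):
        self.frame_time = 1.0 / fps
        self.detection_time = 0.0

    def update(self, in_area: bool) -> float:
        """Call once per frame with whether the person was detected
        inside the area; returns the accumulated detection time."""
        if in_area:
            self.detection_time += self.frame_time  # add one frame
        else:
            self.detection_time = 0.0               # person left: reset
        return self.detection_time

timer = DwellTimer(fps=10.0)
for _ in range(5):
    t = timer.update(True)   # 5 consecutive frames inside the area
```

Comparing `detection_time` against a threshold then corresponds to the determination in step S225.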
- In step S224, the control unit 211 determines whether the person recognized in step S221 has passed the warning line.
- When the person has passed the warning line (S224: YES), the control unit 211 executes a first notification process (step S231).
- In step S225, the control unit 211 determines whether the detection time measured by the measurement process is equal to or greater than a predetermined threshold. When the detection time is less than the predetermined threshold (S225: NO), the control unit 211 ends the process without executing the notification process. On the other hand, when the detection time is equal to or greater than the predetermined threshold (S225: YES), the control unit 211 executes a second notification process (step S232).
- the first notification process is, for example, a process for displaying a message such as “A person requiring attention has passed the warning line” on the display device 240.
- the second notification process is a process for displaying, for example, a message such as “A person requiring attention has entered the auxiliary alert area” on the display device 240.
- the control unit 211 may notify the measured detection time together.
- the first notification process and the second notification process may be the same process.
- According to the intrusion detection system 200 of the present embodiment, it is possible to execute everything from the setting of the warning line and the auxiliary warning area to the detection and notification of a person.
- In addition, like the setting support apparatus 100 of the fifth embodiment, the information processing apparatus 210 enables the user to easily set the auxiliary warning area (that is, the setting area).
- an image obtained by photographing a three-dimensional space includes positions with different distances (that is, depths) from the photographing device 220.
- A video photographed with the imaging device 220 produces a sense of perspective for the viewer (namely, the user). Therefore, when an object is included in such a video, the appearance of the object in the video varies depending on the position of the object (in other words, its distance from the imaging device 220).
- the user may want to set the auxiliary alert area within the equidistant range from the alert line.
- the auxiliary warning area is often set as “a range of 100 m from the warning line”.
- Since an object can enter from any position on the boundary of the auxiliary warning area, such an equidistant range setting can be said to be reasonable.
- However, the video has depth. The apparent distance in the video therefore does not always match the actual distance in the three-dimensional space. For this reason, it is generally difficult for the user to manually input a line like the boundary of the auxiliary alert area.
- the warning line is set at a place where a person enters and exits, such as the entrance of a facility (building, park, etc.) to be monitored.
- Such places usually have an appearance that is distinguishable from other places. For example, there is a gate at the entrance of a facility, or objects that obstruct the passage of people (such as walls or fences) are absent there. Therefore, manually setting the alert line can be said to be relatively easy for the user compared to manually setting an area such as the auxiliary alert area.
- In other words, the user can set an auxiliary warning area that is relatively difficult to set manually merely by specifying a warning line that is relatively easy to set manually.
- In the auxiliary alert area determined by the intrusion detection system 200, the actual distance from the alert line is constant; the actual distance from the alert line is therefore not shortened contrary to the user's intention. With such an auxiliary warning area, it is possible to appropriately determine an intrusion regardless of the position from which a person enters.
- the user can easily change (edit) the auxiliary alert area into a more preferable shape as necessary.
- the intrusion detection system 200 can suppress the detection contrary to the user's will.
- the intrusion detection system 200 can reduce the possibility of detecting a person who does not need to be detected originally.
- the warning line does not have to be a straight line.
- the warning line may be a broken line or a curve, or a combination of a straight line and a curve.
- the coordinates acquired by the information processing apparatus 210 via the input device 230 are not limited to the examples in the present embodiment, and may be the same as the coordinates acquired by the acquisition unit 110 in the fifth embodiment.
- the information processing apparatus 210 sets a warning line based on the coordinates of a broken line or a curve acquired via the input device 230.
- FIG. 22 is a diagram illustrating an auxiliary alert area when the alert line is a curve.
- the curve C3 represents a set of coordinates whose actual distance from the warning line L31 is equal.
- The control unit 211 calculates tangents T1 and T2 at the end points P31 and P32 of the warning line L31 and sets, among the areas surrounded by the curve C3, the areas A6 and A7 surrounded by the warning line L31, the tangents T1 and T2, and the curve C3 as the auxiliary warning area.
- The tangents T1 and T2 here may be rephrased as straight lines whose slopes are the right-hand or left-hand derivatives at the end points P31 and P32.
- the setting process may be executed by a device different from the one that executes the detection process and the notification process.
- for example, the intrusion detection system 200 may be configured to include one information processing device that performs the setting process and another information processing device that performs the detection process and the notification process.
- the configuration of either information processing device may be the same as that of the information processing apparatus 210 in FIG.
- the information processing apparatus 210 may detect a plurality of types of objects; for example, it may perform human detection and automobile detection simultaneously. However, the average moving speed differs between humans and automobiles. Therefore, when detecting a plurality of types of objects, the information processing apparatus 210 varies R_dist in expression (1) according to the type of each object. For example, the information processing apparatus 210 may store, in the storage unit 212, a table in which object types are associated with values of R_dist, and determine the auxiliary warning area according to the object type.
- R_dist may be specified by the user.
- the user may specify the moving speed of the object and a desired time instead of R_dist.
- in this case, the control unit 211 can calculate R_dist by multiplying the moving speed by the time.
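The per-type table and the speed-times-time calculation above can be sketched as follows. The table values and function names are illustrative assumptions, not taken from the embodiment.

```python
# Illustrative per-object-type table for R_dist in expression (1); the
# numeric values (metres) are assumptions for this sketch.
R_DIST_BY_TYPE = {
    "person": 3.0,   # humans move relatively slowly
    "car": 25.0,     # cars move faster, so a wider auxiliary warning area
}

def r_dist_for(object_type):
    """Look up R_dist according to the detected object's type."""
    return R_DIST_BY_TYPE[object_type]

def r_dist_from_speed(speed, time):
    """R_dist obtained by multiplying a user-specified speed by a time."""
    return speed * time
```

For example, a user who specifies a moving speed of 1.5 m/s and a desired lead time of 2 s obtains an R_dist of 3 m.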
- the intrusion detection system 200 does not need to detect objects from the video in real time; the video here may be recorded in advance and stored in a storage device or the like. Further, the information processing apparatus 210 may be located at a remote place away from the other apparatuses; for example, it may be realized using so-called cloud computing technology.
- (Appendix 1) An intrusion detection apparatus comprising: detection means for detecting the position at which an object that has entered a specific area on a video intruded into the specific area; and control means for associating positions on the video with predetermined times, wherein the apparatus issues a warning when the object stays in the specific area on the video for at least the predetermined time associated with the position matching the detected intrusion position.
- (Appendix 2) The intrusion detection apparatus according to appendix 1, further comprising input means for receiving an input of an intrusion position and an input of a predetermined time, wherein the control means associates the intrusion position with the predetermined time based on the received inputs.
- (Appendix 3) The intrusion detection apparatus according to appendix 1, further comprising input means for receiving an input of an intrusion position, wherein the control means sets the predetermined time based on the received input of the intrusion position.
- (Appendix 4) The intrusion detection apparatus according to appendix 1, further comprising input means for receiving an input of a predetermined time, wherein the control means sets the specific area based on the received input of the predetermined time.
- (Appendix 5) An intrusion detection method comprising: detecting the position at which an object that has entered a specific area on a video intruded into the specific area; associating the intrusion position with a predetermined time; and issuing a warning when the object stays in the specific area on the video for at least the predetermined time.
- (Appendix 6) A computer-readable program recording medium storing a program for causing a computer to function as: detection means for detecting the position at which an object that has entered a specific area on a video intruded into the specific area; control means for associating the intrusion position with a predetermined time; and means for issuing a warning when the object stays in the specific area on the video for at least the predetermined time.
- (Appendix 7) A setting support apparatus comprising: acquisition means for acquiring coordinates designated by a user on a video of a three-dimensional space; calculation means for calculating coordinates on the video of positions at a predetermined distance from the position in the three-dimensional space corresponding to the acquired coordinates; and determination means for determining, based on the calculated coordinates, an area to be set for the designated line.
- (Appendix 8) The setting support apparatus according to appendix 7, wherein the determination means determines the area based on the direction in which an object crosses the position in the three-dimensional space corresponding to the acquired coordinates.
- (Appendix 9) The setting support apparatus according to appendix 8, wherein the determination means determines a different area depending on the direction.
- (Appendix 10) The setting support apparatus according to any one of appendices 7 to 9, wherein the determination means determines the area using the acquired coordinates.
- (Appendix 11) The setting support apparatus according to any one of appendices 7 to 10, wherein the determination means determines the area using features extracted from the video.
- (Appendix 12) The setting support apparatus according to any one of appendices 7 to 11, further comprising selection means for selecting one of a plurality of candidates for the area.
- (Appendix 13) The setting support apparatus according to appendix 12, further comprising display means for displaying the plurality of candidates together with the video, wherein the selection means selects one of the plurality of candidates displayed by the display means in accordance with a user operation.
- (Appendix 14) The setting support apparatus according to any one of appendices 7 to 13, further comprising detection means for detecting the entry of an object in the three-dimensional space into the determined area.
- (Appendix 15) A setting support method comprising: acquiring coordinates designated by a user on a video of a three-dimensional space; calculating coordinates on the video of positions at a predetermined distance from the position in the three-dimensional space corresponding to the acquired coordinates; and determining, based on the calculated coordinates, an area to be set for the designated line.
- (Appendix 16) A computer-readable program recording medium recording a program for causing a computer to execute: a process of acquiring coordinates designated by a user on a video of a three-dimensional space; a process of calculating coordinates on the video of positions at a predetermined distance from the position in the three-dimensional space corresponding to the acquired coordinates; and a process of determining, based on the calculated coordinates, an area to be set for the designated line.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Alarm Systems (AREA)
- Closed-Circuit Television Systems (AREA)
- Burglar Alarm Systems (AREA)
- Image Analysis (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
(Configuration)
FIG. 1 is a block diagram showing an example of means for setting an auxiliary warning area according to the first embodiment of the present invention. In this embodiment, an intrusion detection device 1 includes detection means 2 and control means 3.
The detection means 2 detects the position at which an object that has entered a specific area (auxiliary warning area) on a video intruded into that area. The control means 3 associates positions on the video with predetermined times (intrusion duration times). Further, the detection means 2 warns an operator when the object stays in the specific area (auxiliary warning area) on the video for at least the predetermined time (intrusion duration) associated with the position matching the detected intrusion position. The predetermined time here is a time determined for each position on the video and is defined, for example, by the operator.
According to this embodiment, video monitoring can be performed in accordance with the position at which an object intrudes into a specific area on the video.
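The behaviour of the detection means 2 and the control means 3 in the first embodiment, in which each position on the video carries its own dwell-time threshold, might be sketched as follows. The class and method names are assumptions for illustration; positions are coarsened to grid cells for simplicity.

```python
# Minimal sketch (assumed names) of position-dependent dwell-time warning:
# each grid cell of the video is associated with its own threshold, and a
# warning is due once an object has stayed at least the threshold tied to
# the cell through which it entered the specific area.
class IntrusionDetector:
    def __init__(self, threshold_map, default_threshold=5.0):
        # threshold_map: {(grid_x, grid_y): seconds}, operator-defined
        self.threshold_map = threshold_map
        self.default = default_threshold
        self.entries = {}  # object_id -> (entry_time, intrusion_cell)

    def on_enter(self, object_id, cell, time_s):
        """Record the intrusion position (cell) and entry time."""
        self.entries[object_id] = (time_s, cell)

    def should_warn(self, object_id, now_s):
        """True if the object's dwell time reached its cell's threshold."""
        entry_time, cell = self.entries[object_id]
        limit = self.threshold_map.get(cell, self.default)
        return (now_s - entry_time) >= limit

det = IntrusionDetector({(0, 0): 2.0, (0, 1): 10.0})
det.on_enter("obj1", (0, 0), time_s=100.0)
```

An object entering through cell (0, 0) triggers a warning after 2 seconds, while one entering through cell (0, 1) is tolerated for 10 seconds, illustrating monitoring that depends on the intrusion position.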
(Configuration)
FIG. 2 is a block diagram showing an example of means for setting an auxiliary warning area according to the second embodiment of the present invention. In addition to the detection means 2 and the control means 3, the intrusion detection device 1 further includes input means 4 for receiving an input of an intrusion position into the auxiliary warning area and an input of a predetermined time.
Next, an example of the operation of the intrusion detection device 1 according to the second embodiment of the present invention will be described.
According to this embodiment, video monitoring can be performed in accordance with the position at which an object intrudes into a specific area on the video.
(Configuration)
FIG. 3 is a block diagram showing an example of means for setting an auxiliary warning area according to the third embodiment of the present invention. In addition to the detection means 2 and the control means 3, the intrusion detection device 1 further includes input means 5 for receiving an input of an intrusion position into the auxiliary warning area.
The input means 5 receives an input of an intrusion position into the auxiliary warning area and conveys the received input to the control means 3. The control means 3 sets the predetermined time based on the input of the intrusion position.
According to this embodiment, video monitoring can be performed in accordance with the position at which an object intrudes into a specific area on the video.
(Configuration)
FIG. 4 is a block diagram showing an example of means for setting an auxiliary warning area according to the fourth embodiment of the present invention. In addition to the detection means 2 and the control means 3, the intrusion detection device 1 further includes input means 6 for receiving an input of a predetermined time.
The input means 6 receives an input of a predetermined time and conveys the received input to the control means 3. The control means 3 sets a specific area (auxiliary warning area) based on the received input of the predetermined time. The control means 3 can set the auxiliary warning area, for example, as follows. The control means 3 acquires the moving speed of an object and calculates, from the acquired moving speed and the input predetermined time, the distance the object moves within the predetermined time. The control means 3 then receives an input of a warning line, calculates coordinates at the calculated moving distance from the received warning line as object intrusion positions, and sets the calculated intrusion positions. A plurality of such intrusion positions can exist, and the set of these intrusion positions forms a line segment: an auxiliary warning line. The area enclosed by the auxiliary warning line is the auxiliary warning area.
According to this embodiment, video monitoring can be performed in accordance with the position at which an object intrudes into a specific area on the video.
Furthermore, according to this embodiment, an auxiliary warning line can be generated from an input of a predetermined time.
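For a straight warning line, the construction of an auxiliary warning line at the object's movement distance (speed multiplied by the predetermined time) can be sketched as a parallel offset of the segment. The function name and numeric values below are illustrative assumptions.

```python
# Hypothetical sketch of the fourth embodiment's construction: from a
# straight warning line and a movement distance (speed x predetermined
# time), compute the parallel auxiliary warning line offset to one side.
def offset_segment(p1, p2, distance):
    """Return the segment (p1, p2) shifted perpendicularly by `distance`."""
    (x1, y1), (x2, y2) = p1, p2
    dx, dy = x2 - x1, y2 - y1
    length = (dx * dx + dy * dy) ** 0.5
    nx, ny = -dy / length, dx / length  # unit normal to the segment
    return ((x1 + nx * distance, y1 + ny * distance),
            (x2 + nx * distance, y2 + ny * distance))

speed, lead_time = 1.5, 2.0  # m/s and s (illustrative values)
aux_line = offset_segment((0.0, 0.0), (10.0, 0.0), speed * lead_time)
```

The set of points at the movement distance from the warning line forms the auxiliary warning line, and the band between the two lines is the auxiliary warning area; offsetting toward the other side handles approach from the opposite direction.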
FIG. 9 is a block diagram showing the configuration of a setting support device 100 according to the fifth embodiment of the present invention. The setting support device 100 is an information processing device for supporting (facilitating) the setting of an area performed by a user based on a video. The setting support device 100 includes at least an acquisition unit 110, a calculation unit 120, and a determination unit 130. The hardware configuration of the setting support device 100 may be the same as that of the computer 600 illustrated in FIG. 8.
FIG. 12 is a block diagram showing the configuration of an intrusion detection system 200 according to the sixth embodiment. The intrusion detection system 200 is an information processing system for detecting the intrusion of objects. In some aspects, the object here is a human such as a suspicious person. However, the object may also be an animal other than a human, or a movable machine such as an automobile or a robot. In the following, the object detected by the intrusion detection system 200 is assumed to be a human.
In expression (1), f(Pa, Pb) is a function that returns the real distance (that is, the actual distance in the three-dimensional space) between coordinates Pa and Pb on the video, and R_dist is a constant corresponding to a predetermined distance. The function f(Pa, Pb) can be defined in advance by calibration using video captured by the imaging device 220: it converts the screen-coordinate values Pa and Pb into two world-coordinate values and computes the distance between the two, and can be calculated by well-known techniques.
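A minimal sketch of such a function f(Pa, Pb), assuming the calibration is expressed as a ground-plane homography H mapping screen coordinates to world coordinates. The matrix below is an illustrative identity, not a real calibration result, and the function names are assumptions.

```python
# Sketch of f(Pa, Pb): screen coordinates are mapped to world (ground-plane)
# coordinates through a calibrated 3x3 homography H, and the real distance
# is the Euclidean distance between the two mapped points.
def apply_homography(H, p):
    """Map screen point p = (x, y) to world coordinates via H."""
    x, y = p
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

def real_distance(H, pa, pb):
    """f(Pa, Pb): actual 3D-space distance between two screen points."""
    (xa, ya), (xb, yb) = apply_homography(H, pa), apply_homography(H, pb)
    return ((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5

# Identity homography: screen units equal world metres (illustration only;
# a real H would come from calibration with the imaging device's video).
H_identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
```

With this formulation, evaluating whether f(Pa, Pb) equals R_dist reduces to two homography applications and one Euclidean distance, which matches the equidistant-curve construction used for the auxiliary warning area.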
The following modifications can be applied to the sixth embodiment.
Some or all of the embodiments of the present invention may also be described as in the following supplementary notes, but are not limited thereto.
(Appendix 1)
An intrusion detection device comprising:
detection means for detecting the position at which an object that has entered a specific area on a video intruded into the specific area; and
control means for associating positions on the video with predetermined times,
wherein the detection means issues a warning when the object stays in the specific area on the video for at least the predetermined time associated with the position matching the detected intrusion position.
(Appendix 2)
The intrusion detection device according to appendix 1, further comprising input means for receiving an input of an intrusion position and an input of a predetermined time,
wherein the control means associates the intrusion position with the predetermined time based on the received input of the intrusion position and the received input of the predetermined time.
(Appendix 3)
The intrusion detection device according to appendix 1, further comprising input means for receiving an input of an intrusion position,
wherein the control means sets the predetermined time based on the received input of the intrusion position.
(Appendix 4)
The intrusion detection device according to appendix 1, further comprising input means for receiving an input of a predetermined time,
wherein the control means sets the specific area based on the received input of the predetermined time.
(Appendix 5)
An intrusion detection method comprising:
detecting the position at which an object that has entered a specific area on a video intruded into the specific area;
associating the intrusion position with a predetermined time; and
issuing a warning when the object stays in the specific area on the video for at least the predetermined time.
(Appendix 6)
A computer-readable program recording medium storing a program for causing a computer to function as:
detection means for detecting the position at which an object that has entered a specific area on a video intruded into the specific area;
control means for associating the intrusion position with a predetermined time; and
means for issuing a warning when the object stays in the specific area on the video for at least the predetermined time.
(Appendix 7)
A setting support device comprising:
acquisition means for acquiring coordinates designated by a user on a video of a three-dimensional space;
calculation means for calculating coordinates on the video of positions at a predetermined distance from the position in the three-dimensional space corresponding to the acquired coordinates; and
determination means for determining, based on the calculated coordinates, an area to be set for the designated line.
(Appendix 8)
The setting support device according to appendix 7, wherein the determination means determines the area based on the direction in which an object crosses the position in the three-dimensional space corresponding to the acquired coordinates.
(Appendix 9)
The setting support device according to appendix 8, wherein the determination means determines a different area depending on the direction.
(Appendix 10)
The setting support device according to any one of appendices 7 to 9, wherein the determination means determines the area using the acquired coordinates.
(Appendix 11)
The setting support device according to any one of appendices 7 to 10, wherein the determination means determines the area using features extracted from the video.
(Appendix 12)
The setting support device according to any one of appendices 7 to 11, further comprising selection means for selecting one of a plurality of candidates for the area.
(Appendix 13)
The setting support device according to appendix 12, further comprising display means for displaying the plurality of candidates together with the video,
wherein the selection means selects one of the plurality of candidates displayed by the display means in accordance with a user operation.
(Appendix 14)
The setting support device according to any one of appendices 7 to 13, further comprising detection means for detecting the entry of an object in the three-dimensional space into the determined area.
(Appendix 15)
A setting support method comprising:
acquiring coordinates designated by a user on a video of a three-dimensional space;
calculating coordinates on the video of positions at a predetermined distance from the position in the three-dimensional space corresponding to the acquired coordinates; and
determining, based on the calculated coordinates, an area to be set for the designated line.
(Appendix 16)
A computer-readable program recording medium recording a program for causing a computer to execute:
a process of acquiring coordinates designated by a user on a video of a three-dimensional space;
a process of calculating coordinates on the video of positions at a predetermined distance from the position in the three-dimensional space corresponding to the acquired coordinates; and
a process of determining, based on the calculated coordinates, an area to be set for the designated line.
2 Detection means
3 Control means
4, 5, 6 Input means
Claims (16)
- An intrusion detection device comprising: detection means for detecting the position at which an object that has entered a specific area on a video intruded into the specific area; and control means for associating positions on the video with predetermined times, wherein the detection means issues a warning when the object stays in the specific area on the video for at least the predetermined time associated with the position matching the detected intrusion position.
- The intrusion detection device according to claim 1, further comprising input means for receiving an input of an intrusion position and an input of a predetermined time, wherein the control means associates the intrusion position with the predetermined time based on the received input of the intrusion position and the received input of the predetermined time.
- The intrusion detection device according to claim 1, further comprising input means for receiving an input of an intrusion position, wherein the control means sets the predetermined time based on the received input of the intrusion position.
- The intrusion detection device according to claim 1, further comprising input means for receiving an input of a predetermined time, wherein the control means sets the specific area based on the received input of the predetermined time.
- An intrusion detection method comprising: detecting the position at which an object that has entered a specific area on a video intruded into the specific area; associating the intrusion position with a predetermined time; and issuing a warning when the object stays in the specific area on the video for at least the predetermined time.
- A computer-readable program recording medium storing a program for causing a computer to function as: detection means for detecting the position at which an object that has entered a specific area on a video intruded into the specific area; control means for associating the intrusion position with a predetermined time; and means for issuing a warning when the object stays in the specific area on the video for at least the predetermined time.
- A setting support device comprising: acquisition means for acquiring coordinates designated by a user on a video of a three-dimensional space; calculation means for calculating coordinates on the video of positions at a predetermined distance from the position in the three-dimensional space corresponding to the acquired coordinates; and determination means for determining, based on the calculated coordinates, an area to be set for the acquired coordinates.
- The setting support device according to claim 7, wherein the determination means determines the area based on the direction in which an object crosses the position in the three-dimensional space corresponding to the acquired coordinates.
- The setting support device according to claim 8, wherein the determination means determines a different area depending on the direction.
- The setting support device according to any one of claims 7 to 9, wherein the determination means determines the area using the acquired coordinates.
- The setting support device according to any one of claims 7 to 10, wherein the determination means determines the area using features extracted from the video.
- The setting support device according to any one of claims 7 to 11, further comprising selection means for selecting one of a plurality of candidates for the area.
- The setting support device according to claim 12, further comprising display means for displaying the plurality of candidates together with the video, wherein the selection means selects one of the plurality of candidates displayed by the display means in accordance with a user operation.
- The setting support device according to any one of claims 7 to 13, further comprising detection means for detecting the entry of an object in the three-dimensional space into the determined area.
- A setting support method comprising: acquiring coordinates designated by a user on a video of a three-dimensional space; calculating coordinates on the video of positions at a predetermined distance from the position in the three-dimensional space corresponding to the acquired coordinates; and determining, based on the calculated coordinates, an area to be set for the acquired coordinates.
- A computer-readable program recording medium recording a program for causing a computer to execute: a process of acquiring coordinates designated by a user on a video of a three-dimensional space; a process of calculating coordinates on the video of positions at a predetermined distance from the position in the three-dimensional space corresponding to the acquired coordinates; and a process of determining, based on the calculated coordinates, an area to be set for the acquired coordinates.
Priority Applications (11)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201680074334.0A CN108431876B (zh) | 2015-12-16 | 2016-12-16 | 设置辅助设备、设置辅助方法和程序记录介质 |
EP16875809.2A EP3392852A4 (en) | 2015-12-16 | 2016-12-16 | INTRUSION DETECTION DEVICE, SETTING ASSIST DEVICE, INTRUSION DETECTION METHOD, SETTING ASSIST METHOD, AND PROGRAM RECORDING MEDIUM |
US15/778,334 US11049376B2 (en) | 2015-12-16 | 2016-12-16 | Setting assistance device, setting assistance method, and program recording medium |
CA3008594A CA3008594C (en) | 2015-12-16 | 2016-12-16 | Setting assistance device, setting assistance method, and program recording medium |
JP2017556481A JP6708218B2 (ja) | 2015-12-16 | 2016-12-16 | 情報処理装置、侵入検知方法及びコンピュータプログラム |
HK18115541.2A HK1256410A1 (zh) | 2015-12-16 | 2018-12-05 | 入侵檢測設備、設置輔助設備、入侵檢測方法、設置輔助方法和程序記錄介質 |
US16/292,820 US20190197847A1 (en) | 2015-12-16 | 2019-03-05 | Setting assistance device, setting assistance method, and program recording medium |
US16/292,885 US10832541B2 (en) | 2015-12-16 | 2019-03-05 | Setting assistance device, setting assistance method, and program recording medium |
US17/322,219 US11468753B2 (en) | 2015-12-16 | 2021-05-17 | Intrusion detection system, intrusion detection method, and computer-readable medium |
US17/900,163 US11783685B2 (en) | 2015-12-16 | 2022-08-31 | Intrusion detection system, intrusion detection method, and computer-readable medium |
US18/460,099 US20230410621A1 (en) | 2015-12-16 | 2023-09-01 | Intrusion detection system, intrusion detection method, and computer-readable medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-245497 | 2015-12-16 | ||
JP2015245497 | 2015-12-16 |
Related Child Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/778,334 A-371-Of-International US11049376B2 (en) | 2015-12-16 | 2016-12-16 | Setting assistance device, setting assistance method, and program recording medium |
US16/292,820 Continuation US20190197847A1 (en) | 2015-12-16 | 2019-03-05 | Setting assistance device, setting assistance method, and program recording medium |
US16/292,885 Continuation US10832541B2 (en) | 2015-12-16 | 2019-03-05 | Setting assistance device, setting assistance method, and program recording medium |
US17/322,219 Continuation US11468753B2 (en) | 2015-12-16 | 2021-05-17 | Intrusion detection system, intrusion detection method, and computer-readable medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017104835A1 true WO2017104835A1 (ja) | 2017-06-22 |
Family
ID=59056637
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/087649 WO2017104835A1 (ja) | 2015-12-16 | 2016-12-16 | 侵入検出装置、設定支援装置、侵入検出方法、設定支援方法及びプログラム記録媒体 |
Country Status (7)
Country | Link |
---|---|
US (6) | US11049376B2 (ja) |
EP (1) | EP3392852A4 (ja) |
JP (4) | JP6708218B2 (ja) |
CN (1) | CN108431876B (ja) |
CA (1) | CA3008594C (ja) |
HK (1) | HK1256410A1 (ja) |
WO (1) | WO2017104835A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019135640A (ja) * | 2017-12-19 | 2019-08-15 | アクシス アーベー | 徘徊イベントを検出するための方法、デバイスおよびシステム |
JP2020013289A (ja) * | 2018-07-17 | 2020-01-23 | キヤノン株式会社 | 画像処理装置、画像処理方法およびプログラム |
JP2020024669A (ja) * | 2018-08-07 | 2020-02-13 | キヤノン株式会社 | 検知装置およびその制御方法 |
TWI719766B (zh) * | 2019-12-19 | 2021-02-21 | 國立臺北科技大學 | 警戒區域設定系統及其方法 |
US11049376B2 (en) | 2015-12-16 | 2021-06-29 | Nec Corporation | Setting assistance device, setting assistance method, and program recording medium |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10671857B2 (en) * | 2015-06-17 | 2020-06-02 | Zhejiang Dahua Technology Co., Ltd. | Methods and systems for video surveillance |
JP7146402B2 (ja) * | 2018-01-18 | 2022-10-04 | キヤノン株式会社 | 情報処理装置、および情報処理方法 |
US10848539B2 (en) * | 2018-09-20 | 2020-11-24 | Cisco Technology, Inc. | Genlock mechanism for software pacing of media constant bit rate streams |
CN109345747A (zh) * | 2018-09-26 | 2019-02-15 | 深圳市敢为特种设备物联网技术有限公司 | 防入侵系统及其控制方法和计算机可读存储介质 |
CN109727426A (zh) * | 2019-01-23 | 2019-05-07 | 南京市特种设备安全监督检验研究院 | 一种机械式车库人员误入监测识别预警系统及检测方法 |
CN110070687A (zh) * | 2019-03-18 | 2019-07-30 | 苏州凸现信息科技有限公司 | 一种基于动作分析的智能安防监控系统及其工作方法 |
JP7127592B2 (ja) * | 2019-03-27 | 2022-08-30 | オムロン株式会社 | 報知システム |
CN110689694B (zh) * | 2019-10-17 | 2021-02-19 | 重庆工商职业学院 | 基于图像处理的智能监控系统及方法 |
CN116030423B (zh) * | 2023-03-29 | 2023-06-16 | 浪潮通用软件有限公司 | 一种区域边界侵入检测方法、设备及介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008225803A (ja) | 2007-03-12 | 2008-09-25 | Saxa Inc | 監視領域設定装置 |
JP2010102511A (ja) | 2008-10-23 | 2010-05-06 | Panasonic Corp | 動的エリア監視装置、動的エリア監視システム、動的エリア監視用表示装置、及び方法 |
JP2010146290A (ja) * | 2008-12-18 | 2010-07-01 | Toyota Home Kk | 防犯システム |
JP2012058880A (ja) | 2010-09-07 | 2012-03-22 | Optex Co Ltd | 位置検出部付き監視装置 |
JP2013065351A (ja) * | 2012-12-14 | 2013-04-11 | Hitachi Kokusai Electric Inc | 画像処理装置および画像処理方法 |
Family Cites Families (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0950585A (ja) * | 1995-08-07 | 1997-02-18 | Hitachi Ltd | 侵入者監視装置 |
US6445409B1 (en) | 1997-05-14 | 2002-09-03 | Hitachi Denshi Kabushiki Kaisha | Method of distinguishing a moving object and apparatus of tracking and monitoring a moving object |
US6185314B1 (en) | 1997-06-19 | 2001-02-06 | Ncr Corporation | System and method for matching image information to object model information |
JP3567066B2 (ja) | 1997-10-31 | 2004-09-15 | 株式会社日立製作所 | 移動体組合せ検出装置および方法 |
JP2001069268A (ja) | 1999-08-27 | 2001-03-16 | Horiba Ltd | 通信装置 |
US6970083B2 (en) * | 2001-10-09 | 2005-11-29 | Objectvideo, Inc. | Video tripwire |
US7577199B1 (en) * | 2003-06-19 | 2009-08-18 | Nvidia Corporation | Apparatus and method for performing surveillance using motion vectors |
US8558892B2 (en) * | 2004-01-20 | 2013-10-15 | Honeywell International Inc. | Object blocking zones to reduce false alarms in video surveillance systems |
TW200634674A (en) * | 2005-03-28 | 2006-10-01 | Avermedia Tech Inc | Surveillance system having multi-area motion-detection function |
US7327253B2 (en) * | 2005-05-04 | 2008-02-05 | Squire Communications Inc. | Intruder detection and warning system |
JP4754283B2 (ja) | 2005-06-30 | 2011-08-24 | セコム株式会社 | 監視システム及び設定装置 |
JP2007249722A (ja) * | 2006-03-17 | 2007-09-27 | Hitachi Ltd | 物体検知装置 |
JP4201025B2 (ja) | 2006-06-30 | 2008-12-24 | ソニー株式会社 | 監視装置、監視システム及びフィルタ設定方法、並びに監視プログラム |
WO2008008505A2 (en) | 2006-07-14 | 2008-01-17 | Objectvideo, Inc. | Video analytics for retail business process monitoring |
JP2008181347A (ja) * | 2007-01-25 | 2008-08-07 | Meidensha Corp | 侵入監視システム |
CN100531373C (zh) * | 2007-06-05 | 2009-08-19 | 西安理工大学 | 基于双摄像头联动结构的视频运动目标特写跟踪监视方法 |
JP5013319B2 (ja) * | 2007-09-28 | 2012-08-29 | サクサ株式会社 | ステレオ画像処理装置及び同画像処理用プログラム |
DE112009000480T5 (de) | 2008-03-03 | 2011-04-07 | VideoIQ, Inc., Bedford | Dynamische Objektklassifikation |
WO2009126151A1 (en) | 2008-04-09 | 2009-10-15 | Utc Fire & Security Corporation | Video content analysis |
CN101635835A (zh) | 2008-07-25 | 2010-01-27 | 深圳市信义科技有限公司 | 智能视频监控方法及系统 |
JP5405340B2 (ja) * | 2010-02-10 | 2014-02-05 | セコム株式会社 | 画像監視装置および監視システム |
JP5511615B2 (ja) * | 2010-09-30 | 2014-06-04 | インターナショナル・ビジネス・マシーンズ・コーポレーション | 作業指示に関連付けられた資産又は当該資産に関連付けられた要素を管理する方法、並びにそのシステム及びコンピュータ・プログラム |
EP2687547B1 (en) | 2011-03-14 | 2017-02-08 | Sumitomo Seika Chemicals CO. LTD. | Polyrotaxane composition |
JP2013008298A (ja) * | 2011-06-27 | 2013-01-10 | Secom Co Ltd | 警備システム |
JP5880199B2 (ja) * | 2012-03-27 | 2016-03-08 | ソニー株式会社 | 表示制御装置、表示制御方法およびプログラム |
JP5964636B2 (ja) * | 2012-03-30 | 2016-08-03 | セコム株式会社 | 侵入監視装置 |
KR20130127822A (ko) | 2012-05-15 | 2013-11-25 | 한국전자통신연구원 | 도로상 물체 분류 및 위치검출을 위한 이종 센서 융합처리 장치 및 방법 |
JP6150419B2 (ja) | 2012-09-20 | 2017-06-21 | 株式会社ニコンシステム | カメラおよびプログラム |
JP6181925B2 (ja) | 2012-12-12 | 2017-08-16 | キヤノン株式会社 | 画像処理装置、画像処理装置の制御方法およびプログラム |
JP6226539B2 (ja) * | 2013-03-15 | 2017-11-08 | キヤノン株式会社 | 情報処理装置、情報処理装置の制御方法、およびプログラム |
AU2014240669B2 (en) | 2013-03-29 | 2016-06-23 | Nec Corporation | Object monitoring system, object monitoring method, and monitoring target extraction project |
CN103414870B (zh) | 2013-07-16 | 2016-05-04 | 南京师范大学 | 一种多模式警戒分析方法 |
TWI508027B (zh) | 2013-08-08 | 2015-11-11 | Huper Lab Co Ltd | 三維偵測裝置及其偵測影像之方法 |
JP6326847B2 (ja) * | 2014-02-14 | 2018-05-23 | 富士通株式会社 | 画像処理装置、画像処理方法および画像処理プログラム |
WO2015132271A1 (en) * | 2014-03-03 | 2015-09-11 | Vsk Electronics Nv | Intrusion detection with directional sensing |
US10334150B2 (en) | 2014-05-14 | 2019-06-25 | Hanwha Aerospace Co., Ltd. | Camera system and method of tracking object using the same |
KR20150132693A (ko) | 2014-05-15 | 2015-11-26 | 에스케이텔레콤 주식회사 | 보안 감시장치 및 그 방법 |
US10127783B2 (en) * | 2014-07-07 | 2018-11-13 | Google Llc | Method and device for processing motion events |
KR102335045B1 (ko) | 2014-10-07 | 2021-12-03 | 주식회사 케이티 | 깊이 카메라 기반 사람 객체를 판별하는 방법 및 장치 |
JP6650677B2 (ja) * | 2015-02-26 | 2020-02-19 | キヤノン株式会社 | 映像処理装置、映像処理方法、およびプログラム |
TWI541767B (zh) | 2015-04-07 | 2016-07-11 | 群暉科技股份有限公司 | 藉助於自動產生之巡邏路徑控制一監視系統之方法與裝置 |
US10671857B2 (en) * | 2015-06-17 | 2020-06-02 | Zhejiang Dahua Technology Co., Ltd. | Methods and systems for video surveillance |
EP3392852A4 (en) * | 2015-12-16 | 2019-08-21 | Nec Corporation | INTRUSION DETECTION DEVICE, SETTING ASSIST DEVICE, INTRUSION DETECTION METHOD, SETTING ASSIST METHOD, AND PROGRAM RECORDING MEDIUM |
CN105828045B (zh) | 2016-05-12 | 2019-03-08 | 浙江宇视科技有限公司 | 一种利用空间信息实现目标追踪的方法及装置 |
-
2016
- 2016-12-16 EP EP16875809.2A patent/EP3392852A4/en not_active Ceased
- 2016-12-16 WO PCT/JP2016/087649 patent/WO2017104835A1/ja active Application Filing
- 2016-12-16 US US15/778,334 patent/US11049376B2/en active Active
- 2016-12-16 CN CN201680074334.0A patent/CN108431876B/zh active Active
- 2016-12-16 JP JP2017556481A patent/JP6708218B2/ja active Active
- 2016-12-16 CA CA3008594A patent/CA3008594C/en active Active
-
2018
- 2018-12-05 HK HK18115541.2A patent/HK1256410A1/zh unknown
-
2019
- 2019-03-05 US US16/292,820 patent/US20190197847A1/en not_active Abandoned
- 2019-03-05 US US16/292,885 patent/US10832541B2/en active Active
-
2020
- 2020-05-20 JP JP2020087798A patent/JP2020149705A/ja active Pending
-
2021
- 2021-05-17 US US17/322,219 patent/US11468753B2/en active Active
- 2021-10-29 JP JP2021177345A patent/JP7468493B2/ja active Active
-
2022
- 2022-08-31 US US17/900,163 patent/US11783685B2/en active Active
-
2023
- 2023-09-01 US US18/460,099 patent/US20230410621A1/en active Pending
-
2024
- 2024-03-25 JP JP2024047504A patent/JP2024071506A/ja active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008225803A (ja) | 2007-03-12 | 2008-09-25 | Saxa Inc | 監視領域設定装置 |
JP2010102511A (ja) | 2008-10-23 | 2010-05-06 | Panasonic Corp | 動的エリア監視装置、動的エリア監視システム、動的エリア監視用表示装置、及び方法 |
JP2010146290A (ja) * | 2008-12-18 | 2010-07-01 | Toyota Home Kk | 防犯システム |
JP2012058880A (ja) | 2010-09-07 | 2012-03-22 | Optex Co Ltd | 位置検出部付き監視装置 |
JP2013065351A (ja) * | 2012-12-14 | 2013-04-11 | Hitachi Kokusai Electric Inc | 画像処理装置および画像処理方法 |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11049376B2 (en) | 2015-12-16 | 2021-06-29 | Nec Corporation | Setting assistance device, setting assistance method, and program recording medium |
JP2019135640A (ja) * | 2017-12-19 | 2019-08-15 | アクシス アーベー | 徘徊イベントを検出するための方法、デバイスおよびシステム |
JP2020013289A (ja) * | 2018-07-17 | 2020-01-23 | キヤノン株式会社 | 画像処理装置、画像処理方法およびプログラム |
JP7140583B2 (ja) | 2018-07-17 | 2022-09-21 | キヤノン株式会社 | 画像処理装置、画像処理方法およびプログラム |
JP2020024669A (ja) * | 2018-08-07 | 2020-02-13 | キヤノン株式会社 | 検知装置およびその制御方法 |
JP7378223B2 (ja) | 2018-08-07 | 2023-11-13 | キヤノン株式会社 | 検知装置およびその制御方法 |
TWI719766B (zh) * | 2019-12-19 | 2021-02-21 | 國立臺北科技大學 | 警戒區域設定系統及其方法 |
Also Published As
Publication number | Publication date |
---|---|
JP6708218B2 (ja) | 2020-06-10 |
US20180350212A1 (en) | 2018-12-06 |
HK1256410A1 (zh) | 2019-09-20 |
JPWO2017104835A1 (ja) | 2018-11-22 |
JP7468493B2 (ja) | 2024-04-16 |
EP3392852A4 (en) | 2019-08-21 |
US11468753B2 (en) | 2022-10-11 |
CN108431876B (zh) | 2021-03-12 |
US20190206209A1 (en) | 2019-07-04 |
CA3008594A1 (en) | 2017-06-22 |
US20230410621A1 (en) | 2023-12-21 |
CA3008594C (en) | 2021-09-21 |
US20220415145A1 (en) | 2022-12-29 |
JP2022016458A (ja) | 2022-01-21 |
US20190197847A1 (en) | 2019-06-27 |
US10832541B2 (en) | 2020-11-10 |
US11783685B2 (en) | 2023-10-10 |
JP2020149705A (ja) | 2020-09-17 |
JP2024071506A (ja) | 2024-05-24 |
US20210280026A1 (en) | 2021-09-09 |
EP3392852A1 (en) | 2018-10-24 |
CN108431876A (zh) | 2018-08-21 |
US11049376B2 (en) | 2021-06-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017104835A1 (ja) | 侵入検出装置、設定支援装置、侵入検出方法、設定支援方法及びプログラム記録媒体 | |
JP2009175802A (ja) | 物体検出装置 | |
JP6638527B2 (ja) | 車両用装置、車両用プログラム | |
JP6610994B2 (ja) | 障害物検出装置、及び、障害物検出方法 | |
JP2011053005A (ja) | 監視システム | |
JP6405606B2 (ja) | 画像処理装置、画像処理方法、および画像処理プログラム | |
JP5012522B2 (ja) | 路側境界面検出装置 | |
JP2008165595A (ja) | 障害物検出方法、障害物検出装置、障害物検出システム | |
US20160335916A1 (en) | Portable device and control method using plurality of cameras | |
JP2019200718A (ja) | 監視装置、監視方法及びプログラム | |
JP2011134119A (ja) | 車両周辺監視装置 | |
JP2011203766A (ja) | 車両用画像処理装置 | |
JP2008146132A (ja) | 画像検出装置、プログラム及び画像検出方法 | |
JP2007280151A (ja) | 画像解析装置、画像解析方法及びプログラム | |
JP4013872B2 (ja) | 障害物検出装置 | |
JP2008028857A (ja) | 障害物検出システム、及び障害物検出方法 | |
JP3398937B2 (ja) | 車両用監視方法及び装置 | |
JP2011113410A (ja) | 幹線路合流情報提供システム | |
JP2010093570A (ja) | 車両周辺監視装置 | |
JP2006072491A (ja) | 移動位置予測装置、移動位置予測方法、衝突判定装置、及び衝突判定方法 | |
JP2006088940A (ja) | 車両用衝突警報装置及び車両用衝突警報方法 | |
JP2017142661A (ja) | 車両周辺監視システム及びコンピュータプログラム | |
JP2019082822A (ja) | 情報処理装置、情報処理方法、およびプログラム | |
JP2006262527A (ja) | 障害物検出装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16875809 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017556481 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 3008594 Country of ref document: CA |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016875809 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2016875809 Country of ref document: EP Effective date: 20180716 |