WO2018159144A1 - Three-dimensional intrusion detection system and three-dimensional intrusion detection method - Google Patents
Three-dimensional intrusion detection system and three-dimensional intrusion detection method
- Publication number
- WO2018159144A1 (PCT/JP2018/001492)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- dimensional
- image
- area
- monitoring
- intrusion detection
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19678—User interface
- G08B13/19691—Signalling events for better perception by user, e.g. indicating alarms by making display brighter, adding text, creating a sound
- G08B13/19693—Signalling events for better perception by user, e.g. indicating alarms by making display brighter, adding text, creating a sound using multiple video sources viewed on a single or compound screen
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
- G01C11/06—Interpretation of pictures by comparison of two or more pictures of the same area
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19613—Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19639—Details of the system layout
- G08B13/19641—Multiple cameras having overlapping views on a single scene
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/18—Status alarms
- G08B21/24—Reminder alarms, e.g. anti-loss alarms
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
Definitions
- three-dimensional information of a monitoring area is acquired from a plurality of camera images obtained by photographing the monitoring area with at least two cameras that are spaced apart, and an object that has entered the monitoring area is detected based on the three-dimensional information.
- the present invention relates to a three-dimensional intrusion detection system and a three-dimensional intrusion detection method.
- Intrusion detection systems that install a camera to capture a surveillance area and detect objects such as persons entering the area by image processing of the camera image are in widespread use.
- In such an intrusion detection system, false detections frequently occur due to environmental changes such as changes in brightness, so a technique for robust intrusion detection that is less susceptible to environmental changes is desired.
- The main object of the present disclosure is to provide a three-dimensional intrusion detection system and a three-dimensional intrusion detection method that enable a monitor to perform monitoring work efficiently.
- The three-dimensional intrusion detection system of the present disclosure acquires three-dimensional information of a monitoring area from a plurality of camera images obtained by photographing the monitoring area with at least two cameras that are spaced apart, and detects an object that has entered the monitoring area based on the three-dimensional information.
- The system includes: an image acquisition unit that acquires the plurality of camera images; a three-dimensional measurement unit that performs three-dimensional measurement to measure the three-dimensional position of an object in the monitoring area based on the plurality of camera images and outputs three-dimensional information of the monitoring area; an intrusion detection unit that detects an object that has entered the monitoring area based on the change state of the three-dimensional information; and a screen generation unit that generates a map image visualizing the three-dimensional information and a mark image indicating the object that has entered the monitoring area, and outputs a monitoring screen that displays the mark image together with at least one of the camera image and the map image selected by a user input operation.
- The three-dimensional intrusion detection method of the present disclosure acquires three-dimensional information of a monitoring area from a plurality of camera images obtained by capturing the monitoring area with at least two cameras that are spaced apart, and causes an information processing device to perform processing for detecting an object that has entered the monitoring area based on the three-dimensional information.
- The method acquires the plurality of camera images; performs three-dimensional measurement to measure the three-dimensional position of an object in the monitoring area based on the plurality of camera images and generates three-dimensional information of the monitoring area; detects an object that has entered the monitoring area based on the change state of the three-dimensional information; generates a map image that visualizes the three-dimensional information and a mark image that indicates the object that has entered the monitoring area; and outputs a monitoring screen that displays the mark image together with at least one of the camera image and the map image selected by a user input operation.
- According to the present disclosure, the monitor can check for false detections by using the camera image, which captures the actual situation of the monitoring area, and can check whether intrusion detection based on the three-dimensional information is operating normally by using the map image, which visualizes that three-dimensional information.
- In addition, the monitor can customize which images are displayed on the monitoring screen.
- As a result, the monitor can easily check, as needed, whether a detection is a false detection and whether intrusion detection is operating normally, and can perform the monitoring work efficiently.
- FIG. 1 is an overall configuration diagram of a three-dimensional intrusion detection system according to the present embodiment.
- FIG. 2 is an explanatory diagram illustrating an example of a detection area and a gaze area set on a camera image.
- FIG. 3 is an explanatory diagram showing an outline of processing performed in the server 2.
- FIG. 4 is a block diagram illustrating a schematic configuration of the server 2.
- FIG. 5 is an explanatory diagram showing a monitoring screen in the two-split display mode.
- FIG. 6A is an explanatory diagram illustrating a monitoring screen in a single image display mode.
- FIG. 6B is an explanatory diagram illustrating a monitoring screen in the single image display mode.
- FIG. 7 is an explanatory diagram showing a monitoring screen in the 4-split display mode.
- According to the first aspect of the invention, three-dimensional information of a monitoring area is acquired from a plurality of camera images obtained by photographing the monitoring area with at least two cameras that are spaced apart, and an object that has entered the monitoring area is detected based on the three-dimensional information.
- The system includes: an image acquisition unit that acquires the plurality of camera images; a three-dimensional measurement unit that performs three-dimensional measurement to measure the three-dimensional position of an object and outputs three-dimensional information of the monitoring area; an intrusion detection unit that detects an object that has entered the monitoring area based on the change state of the three-dimensional information; and a screen generation unit that generates a map image visualizing the three-dimensional information and a mark image indicating the object that has entered the monitoring area, and outputs a monitoring screen that displays the mark image together with at least one of the camera image and the map image selected by the user's input operation.
- According to this aspect, the monitor can check for false detections by using the camera image, which shows the actual situation of the monitoring area, and can check whether intrusion detection based on the three-dimensional information is operating normally by using the map image, which visualizes that information.
- In addition, the monitor can customize which images are displayed on the monitoring screen. As a result, the monitor can easily check, as needed, whether a detection is a false detection and whether intrusion detection is operating normally, and can perform the monitoring work efficiently.
- The second aspect of the invention further includes an area setting unit that sets a gaze area on the camera image in accordance with a user input operation, and the screen generation unit displays at least one of the map image and the camera image on the monitoring screen with the display range limited to the gaze area.
- By limiting the display range of the camera image and the map image to a gaze area that is important for the monitoring work, the visibility of both images is improved and the monitoring work can be performed efficiently.
- In the third aspect of the invention, the area setting unit sets the measurement area targeted by three-dimensional measurement to a range that includes the detection area targeted by intrusion detection and that coincides with the gaze area.
- Because the measurement area includes the detection area, intrusion detection can be performed appropriately based on the three-dimensional information generated by the three-dimensional measurement.
- Because the measurement area coincides with the gaze area, only the three-dimensional information needed to display the map image of the gaze area has to be calculated, which speeds up processing and reduces the cost of the apparatus.
- In the fourth aspect of the invention, the screen generation unit outputs, as the monitoring screen, a two-split display in which one of the plurality of camera images and the map image are displayed side by side.
- Because the number of images displayed on the monitoring screen is reduced and the camera image and map image can be displayed at a large size, their visibility is improved and the monitoring work can be performed efficiently.
- In a further aspect of the invention, the screen generation unit displays on the monitoring screen an operation unit for switching the screen, and, in response to the user operating the operation unit, switches between a screen displaying only a single camera image, a screen displaying only the map image, and the two-split display screen.
- The monitor can thus switch the monitoring screen according to the purpose.
- In a further aspect of the invention, three-dimensional information of a monitoring area is acquired from a plurality of camera images obtained by photographing the monitoring area with at least two cameras that are spaced apart from each other, and an information processing device is caused to perform processing for detecting an object that has entered the monitoring area based on the three-dimensional information.
- The method acquires the plurality of camera images, performs three-dimensional measurement to measure the three-dimensional position of an object in the monitoring area based on the plurality of camera images, generates three-dimensional information of the monitoring area, detects an object that has entered the monitoring area based on the change state of the three-dimensional information, generates a map image visualizing the three-dimensional information and a mark image indicating the object that has entered the monitoring area, and outputs a monitoring screen that displays the mark image together with at least one of the camera image and the map image selected by a user input operation.
- According to this aspect, the monitor can easily perform operations such as confirming whether a detection is a false detection and confirming whether intrusion detection is operating normally, and can perform the monitoring work efficiently.
- FIG. 1 is an overall configuration diagram of a three-dimensional intrusion detection system according to the present embodiment.
- the 3D intrusion detection system includes a pair of left and right cameras 1 and a server 2 (3D intrusion detection device, information processing device).
- Camera 1 captures the surveillance area.
- A synchronization signal is output from one camera 1 to the other so that the left and right cameras 1 shoot at the same timing.
- Based on the left and right camera images output from the left and right cameras 1, the server 2 performs three-dimensional measurement to measure the three-dimensional position of objects shown in the camera images, and detects objects such as persons that have entered the monitoring area based on the three-dimensional information of the monitoring area acquired by the three-dimensional measurement.
- Each camera 1 is a monocular camera, and the two cameras are installed a predetermined distance apart on the left and right. With this configuration, a large interval (baseline) between the two cameras 1 can be secured, so that three-dimensional information with depth can be acquired, which is suitable for wide-area monitoring.
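The stereo principle behind this wide-baseline configuration can be sketched as follows. This is an illustrative numpy sketch, not code from the patent; the focal length and baseline values are assumptions chosen for the example.

```python
import numpy as np

# Hypothetical stereo parameters -- the patent does not specify
# focal length or baseline values.
FOCAL_PX = 800.0      # focal length in pixels
BASELINE_M = 0.5      # spacing between the two cameras in meters

def depth_from_disparity(disparity_px: np.ndarray) -> np.ndarray:
    """Convert a disparity map (pixels) to a depth map (meters).

    Standard pinhole stereo relation: depth = focal * baseline / disparity.
    A wider baseline yields a larger disparity at the same depth, i.e.
    finer depth resolution, which is why spacing the cameras apart
    suits wide-area monitoring.
    """
    disparity = np.asarray(disparity_px, dtype=np.float64)
    depth = np.full_like(disparity, np.inf)
    valid = disparity > 0          # zero disparity: no match / at infinity
    depth[valid] = FOCAL_PX * BASELINE_M / disparity[valid]
    return depth

disp = np.array([[4.0, 8.0], [0.0, 16.0]])
print(depth_from_disparity(disp))
# an object with 8 px disparity lies at 800 * 0.5 / 8 = 50 m
```

In practice the disparity map itself would come from stereo matching of the left and right camera images; only the disparity-to-depth conversion is shown here.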
- The server 2 may be connected to the cameras 1 via a network, in which case intrusion detection can be performed by a server 2 installed at a remote location.
- the configuration of a pair of left and right cameras is illustrated as the camera 1, but the camera 1 can also be configured with three or more cameras. In that case, more accurate three-dimensional information can be acquired for the monitoring area.
- FIG. 2 is an explanatory diagram illustrating an example of a detection area and a gaze area.
- In the present embodiment, three-dimensional measurement is performed to measure the three-dimensional position of objects shown in the camera images, and intrusion into the detection area is detected based on the three-dimensional information acquired by that measurement.
- This detection area is a three-dimensional space in which an object such as a person to be detected exists, and is a box-shaped (polyhedron) space defined by a bottom surface (floor surface) such as the ground and a height.
- a gaze area that should be watched by the supervisor is set.
- This gaze area is a range of the camera image displayed on the monitoring screen. This gaze area is set so as to include the detection area.
- the measurement area that is the target of 3D measurement is set.
- the measurement area is set to the same range as the gaze area.
- This detection area and gaze area are set according to the user's input operation that specifies each range. Note that the user may specify the range of the detection area, and the gaze area (measurement area) may be automatically set so as to include the detection area.
- In FIG. 2, the rectangular gaze area is set so that its left and right sides coincide with those of the camera image, but the gaze area can be set at an arbitrary position on the camera image.
- the shape of the gaze area is not limited to a rectangle, and the gaze area can be set to an arbitrary shape.
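The automatic derivation of a gaze area (measurement area) that includes a user-specified detection area, mentioned above, could be sketched as a simple bounding-rectangle computation. The function name, margin, and image dimensions below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def circumscribing_gaze_area(detection_polygon, margin=20,
                             image_w=1920, image_h=1080):
    """Derive a rectangular gaze area that circumscribes the detection area.

    detection_polygon: list of (x, y) vertices of the detection area's
    footprint on the camera image. Returns (x0, y0, x1, y1), expanded
    by an illustrative margin and clamped to the image bounds, so the
    gaze area always contains the detection area.
    """
    pts = np.asarray(detection_polygon)
    x0 = max(int(pts[:, 0].min()) - margin, 0)
    y0 = max(int(pts[:, 1].min()) - margin, 0)
    x1 = min(int(pts[:, 0].max()) + margin, image_w)
    y1 = min(int(pts[:, 1].max()) + margin, image_h)
    return x0, y0, x1, y1

print(circumscribing_gaze_area([(100, 200), (400, 180), (350, 500)]))
# every detection-area vertex lies inside the returned rectangle
```

Keeping the gaze area a tight circumscribing rectangle limits the pixels fed to three-dimensional measurement, which is the computational saving the patent notes.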
- FIG. 3 is an explanatory diagram showing an outline of processing performed in the server 2.
- In the server 2, the left and right camera images (frames) output from the left and right cameras 1 are acquired, and the gaze area (measurement area) is cut out from each to obtain partial camera images. Three-dimensional measurement is then performed using the partial camera images, and three-dimensional information is generated at each time corresponding to a frame. Note that three-dimensional information may be generated by thinning out frames as appropriate.
- intrusion detection is performed to detect an object that has entered the detection area based on the change state of the three-dimensional information at each time.
- Specifically, the area of the intruding object is detected by comparing the three-dimensional information at each time with the three-dimensional information of the background acquired in the absence of any intruding object, and the position information (three-dimensional position) of the intruding object is acquired.
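The background-comparison step above can be sketched in numpy as follows. The depth-change threshold, minimum region size, and function name are illustrative assumptions; the patent specifies only the idea of comparing current three-dimensional information with background three-dimensional information.

```python
import numpy as np

DEPTH_CHANGE_M = 0.5   # illustrative threshold; the patent gives no value
MIN_PIXELS = 4         # minimum changed-region size to count as an object

def detect_intrusion(depth_now, depth_background):
    """Flag pixels whose depth moved toward the camera versus the
    intruder-free background depth map, and return the bounding box
    (x0, y0, x1, y1) of the changed region, or None if nothing large
    enough changed."""
    closer = (depth_background - depth_now) > DEPTH_CHANGE_M
    if closer.sum() < MIN_PIXELS:
        return None
    ys, xs = np.nonzero(closer)
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

bg = np.full((6, 6), 10.0)          # empty scene: everything 10 m away
cur = bg.copy()
cur[2:5, 1:4] = 3.0                 # an object appears at 3 m
print(detect_intrusion(cur, bg))    # bounding box of the intruding object
```

The returned bounding box is exactly the position information a frame image (mark image) would be drawn from in the screen generation step.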
- the intrusion detection may be executed in combination with a detection function from a captured image provided in each camera 1.
- A partial depth map that visualizes the three-dimensional information of the gaze area is generated. Further, based on the position information of the intruding object acquired by intrusion detection, a frame image (mark image) surrounding the intruding object is generated, and image composition is performed to superimpose the frame image on the position of the intruding object in the partial camera image. A monitoring screen that displays the composited partial camera image and the partial depth map side by side is then generated.
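The generation of a depth-map visualization with a superimposed frame image might be sketched as follows. The grayscale mapping (nearer is brighter) and the 1-pixel frame are illustrative conventions, not specified by the patent.

```python
import numpy as np

def render_depth_map(depth, box=None):
    """Visualize a depth map as an 8-bit grayscale image and optionally
    draw a frame image around an intruding object.

    box is (x0, y0, x1, y1) as produced by intrusion detection.
    Infinite (unmatched) depths are clamped to the farthest finite value.
    """
    finite = np.isfinite(depth)
    d = np.where(finite, depth, depth[finite].max())
    lo, hi = d.min(), d.max()
    scale = float(hi - lo) or 1.0
    img = ((hi - d) / scale * 255).astype(np.uint8)  # nearer -> brighter
    if box is not None:
        x0, y0, x1, y1 = box
        img[y0, x0:x1 + 1] = 255      # top edge of the frame image
        img[y1, x0:x1 + 1] = 255      # bottom edge
        img[y0:y1 + 1, x0] = 255      # left edge
        img[y0:y1 + 1, x1] = 255      # right edge
    return img
```

A monitoring screen in the two-split display mode would then place this rendering next to the composited partial camera image.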
- the left partial camera image is displayed on the monitoring screen, but the right partial camera image may be displayed on the monitoring screen.
- The gaze area (measurement area) is set so as to include the detection area. When the gaze area (measurement area) becomes large, the amount of calculation in three-dimensional measurement and the like increases; the gaze area (measurement area) may therefore be set to a rectangle circumscribing the detection area.
- the measurement area is set to the same range as the gaze area, but the measurement area may be set to a range different from the gaze area.
- the gaze area may be set to include the detection area
- the measurement area may be set to include the gaze area.
- FIG. 4 is a block diagram illustrating a schematic configuration of the server 2.
- the server 2 includes an image input unit 11 (image acquisition unit), a control unit 12, a storage unit 13, a display unit 14 (display device), and an operation input unit 15.
- the left and right camera images output from the left and right cameras 1 are input to the image input unit 11.
- the storage unit 13 stores a camera image input to the image input unit 11, a depth map generated by the control unit 12, and the like.
- the storage unit 13 stores a program executed by the control unit 12.
- the display unit 14 includes a display device such as a liquid crystal display.
- the operation input unit 15 includes an input device such as a keyboard and a mouse.
- the control unit 12 includes an area setting unit 21, a three-dimensional measurement unit 22, an intrusion detection unit 23, and a screen generation unit 24.
- the control unit 12 includes a processor, and each unit of the control unit 12 is realized by executing a program stored in the storage unit 13.
- the region setting unit 21 sets a detection region and a gaze region in accordance with a user input operation at the operation input unit 15.
- The ranges of the detection area and the gaze area may each be individually designated by the user, or the user may designate only the range of the detection area and the area setting unit 21 may set the range of the gaze area based on it.
- The three-dimensional measurement unit 22 performs three-dimensional measurement, based on the left and right camera images input to the image input unit 11, to measure the three-dimensional position of objects in the gaze area (measurement area) set by the area setting unit 21, and generates three-dimensional information of the gaze area.
- the intrusion detection unit 23 detects an intruding object that has entered the detection region set by the region setting unit 21 based on the three-dimensional information acquired by the three-dimensional measurement unit 22.
- The screen generation unit 24 generates the monitoring screen displayed on the display unit 14 based on the three-dimensional information acquired by the three-dimensional measurement unit 22, the detection result of the intrusion detection unit 23, and the gaze area set by the area setting unit 21. It also switches the display mode of the monitoring screen in accordance with user input operations on the operation input unit 15, and generates a monitoring screen corresponding to the selected display mode.
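The display-mode switching performed by the screen generation unit 24 could be organized as a simple mode-to-layout table. The mode names below mirror the tabs shown in FIGs. 5-7 ("2 division", "camera", "depth", "4 division"), but the pane lists and function names are illustrative assumptions, not taken from the patent.

```python
# Illustrative mapping from display-mode tab to the panes the screen
# generation unit would render on the image display unit.
LAYOUTS = {
    "2 division": ["partial camera image", "partial depth map"],
    "camera":     ["partial camera image"],
    "depth":      ["partial depth map"],
    "4 division": ["left camera image", "right camera image",
                   "depth map", "information display column"],
}

def generate_monitoring_screen(mode: str):
    """Return the panes to render for the selected tab."""
    if mode not in LAYOUTS:
        raise ValueError(f"unknown display mode: {mode!r}")
    return LAYOUTS[mode]

print(generate_monitoring_screen("2 division"))
```

Operating a tab on the monitoring screen would simply re-invoke this lookup with the newly selected mode and redraw the listed panes.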
- FIG. 5 is an explanatory diagram showing a monitoring screen in the two-split display mode.
- FIGS. 6A and 6B are explanatory diagrams illustrating monitoring screens in the single image display modes (camera image display mode and depth map display mode).
- FIG. 7 is an explanatory diagram showing a monitoring screen in the 4-split display mode.
- the monitoring screen is provided with tabs 31 to 34 (operation units) of “2 division”, “camera”, “depth”, and “4 division”.
- the partial camera image 41 and the partial depth map 42 are displayed side by side on the image display unit 35.
- the partial camera image 41 is obtained by cutting out the gaze area from the camera image acquired from the camera 1.
- an intruding object that has entered the monitoring area is shown, and a frame image 43 (mark image) indicating the intruding object is displayed based on the detection result of the intrusion detection.
- the partial depth map 42 is obtained by visualizing the three-dimensional information of the gaze area generated by the three-dimensional measurement unit 22 and is displayed in a state limited to the gaze area, like the partial camera image 41.
- information related to intrusion detection such as the shooting time and shooting location may be displayed on the monitoring screen.
- Such information may be displayed in a blank portion of the screen, or may be displayed superimposed on the partial camera image 41 or the partial depth map 42.
- the partial camera image 41 and the partial depth map 42 are displayed simultaneously.
- the observer can visually confirm the intruding object.
- By observing the partial camera image 41, the monitor can judge whether a false detection, in which an object that is not a detection target is detected, has occurred.
- By observing the partial depth map 42, the monitor can visually confirm whether intrusion detection is being performed normally; if the partial depth map 42 is abnormal, intrusion detection performed based on the underlying three-dimensional information will also be abnormal. Further, by visually comparing the partial camera image 41 and the partial depth map 42, the monitor can infer the cause of a false detection.
- On the monitoring screen in the camera image display mode, only the camera image 44 is displayed on the image display unit 35.
- Here, instead of the partial camera image 41 limited to the gaze area, the camera image acquired from the camera 1 is displayed as it is, so that the entire imaging area can be observed.
- a frame image 43 indicating an intruding object is displayed.
- In the depth map display mode, the monitoring screen shown in FIG. 6B is displayed.
- the partial depth map 42 is displayed on the image display unit 35.
- This partial depth map 42 is displayed limited to the gaze area, as in the two-split display mode shown in FIG. 5. Further, around the partial depth map 42, the area of the imaging area outside the gaze area is displayed grayed out. The monitor can thereby confirm the position of the gaze area within the imaging area.
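The gray-out of the area outside the gaze area can be sketched as a masked dimming operation; the 50% dimming factor below is an illustrative assumption, as the patent does not specify how the gray-out is rendered.

```python
import numpy as np

def gray_out_outside_gaze(image, gaze):
    """Dim everything outside the gaze rectangle so the monitor can see
    where the gaze area sits within the full imaging area.

    gaze is (x0, y0, x1, y1) in pixel coordinates.
    """
    x0, y0, x1, y1 = gaze
    out = (image.astype(np.float32) * 0.5).astype(image.dtype)  # dim all
    out[y0:y1, x0:x1] = image[y0:y1, x0:x1]                     # restore gaze
    return out

frame = np.full((8, 8), 200, dtype=np.uint8)
shown = gray_out_outside_gaze(frame, (2, 2, 6, 6))
print(shown[0, 0], shown[4, 4])   # outside dimmed, inside unchanged
```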
- a frame image 43 indicating an intruding object is displayed on the partial depth map 42.
- the left camera image 44, the right camera image 45, the depth map 46 (map image), and the information display column 47 are displayed on the image display unit 35.
- the left camera image 44 and the right camera image 45 are obtained from the left and right cameras 1.
- the depth map 46 is generated based on the three-dimensional information acquired by the three-dimensional measurement for the entire imaging region.
- In the information display column 47, character information related to intrusion detection (detection information), such as the shooting time and shooting location, is displayed.
- a “setting” button 36 is provided on the monitoring screen.
- When this “setting” button 36 is operated, the display transitions to a setting screen (not shown).
- the user can specify various setting items. For example, the user can specify the range of the detection area and the gaze area (see FIG. 2).
- In the present embodiment, the display mode of the monitoring screen can be switched according to the purpose.
- For normal monitoring, the monitoring screen in the two-split display mode shown in FIG. 5 may be displayed.
- On this screen, the partial camera image 41 and the partial depth map 42 limited to the gaze area are displayed, and the number of images displayed is smaller than in the 4-split display mode shown in FIG. 7.
- Because each image is displayed at a large size, the status of the monitoring area can be observed in detail, and the monitoring work can be performed efficiently.
- In this mode, only the partial depth map 42 limited to the gaze area is displayed, but since the gaze area (measurement area) includes the detection area, the region subject to intrusion detection is always shown.
- In the 4-split display mode, the depth map 46 of the entire imaging area is displayed, so when switching to this mode the three-dimensional measurement process must be performed again for the entire imaging area. A partial depth map limited to the gaze area may therefore be used instead, as in FIG. 6B.
- the left camera image is displayed on the monitoring screen in the two-split display mode or the camera image display mode, but the right camera image may be displayed.
- an operation unit such as a camera image switching button may be provided on the monitoring screen so that the left camera image and the right camera image can be switched.
- In the present embodiment, the frame image 43 indicating the intruding object detected by intrusion detection is displayed on the camera image, but it may instead be displayed on the depth map, or on both the camera image and the depth map.
- the partial depth map 42 limited to the gaze area is displayed in the depth map display mode.
- the depth map for the entire imaging area is displayed. You may make it do.
- the display mode is switched, the three-dimensional measurement process is performed again for the entire imaging region.
- the display mode can be switched by operating the tabs 31 to 34 (operation unit) displayed on the monitoring screen.
- the display mode may be switched using the input device.
- As described above, the embodiment has been described as an example of the technique disclosed in the present application.
- However, the technique of the present disclosure is not limited to this, and can also be applied to embodiments in which changes, replacements, additions, omissions, and the like have been made.
- In the above embodiment, a pair of left and right cameras 1 is installed, but the number of cameras is not limited to this; at least two (a plurality of) cameras are sufficient. That is, three or more cameras may be installed and the three-dimensional information generated from three or more camera images, thereby improving the accuracy of the three-dimensional information.
- In the above embodiment, a rectangular frame image 43 surrounding the intruding object is displayed on the monitoring screen as the mark image indicating the intruding object, but the mark image may take various other shapes. Further, the mark image is not limited to a form surrounding the intruding object; the intruding object may instead be indicated by an arrow image or the like.
- As described above, with the three-dimensional intrusion detection system and three-dimensional intrusion detection method according to the present disclosure, operations by the monitor, such as confirming whether there is a false detection and confirming whether intrusion detection is operating normally, are simple, and the monitor can perform the monitoring work efficiently. They are therefore useful as a three-dimensional intrusion detection system and three-dimensional intrusion detection method that acquire three-dimensional information of a monitoring area from a plurality of camera images obtained by photographing the monitoring area with at least two spaced-apart cameras and detect, based on that three-dimensional information, an object that has intruded into the monitoring area.
- Description of reference signs: 1 Camera, 2 Server (information processing device), 11 Image input unit (image acquisition unit), 12 Control unit, 13 Storage unit, 14 Display unit, 15 Operation input unit, 21 Area setting unit, 22 Three-dimensional measurement unit, 23 Intrusion detection unit, 24 Screen generation unit, 41 Partial camera image, 42 Partial depth map (map image), 43 Frame image (mark image), 44 Camera image, 45 Camera image, 46 Depth map, 47 Information display field
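The description above notes that limiting the depth map to the gaze area avoids re-running the three-dimensional measurement for the entire imaging region, and that intrusion is detected from changes in the three-dimensional information. As an illustration only — the patent discloses no source code, and the function name `detect_intrusion`, the background-comparison approach, and the threshold value are assumptions, not the patented implementation — ROI-limited depth-change detection might be sketched as:

```python
import numpy as np

def detect_intrusion(background_depth, current_depth, roi, threshold=0.5):
    """Flag pixels inside the gaze-area ROI whose measured depth differs
    from the background by more than `threshold` (metres), and return a
    bounding box for the frame (mark) image, or None if nothing changed.

    roi is (x, y, w, h) in full-image coordinates; restricting the
    comparison to this rectangle mirrors limiting the measurement area
    to the gaze area so the whole imaging region need not be processed.
    """
    x, y, w, h = roi
    bg = background_depth[y:y + h, x:x + w]
    cur = current_depth[y:y + h, x:x + w]
    changed = np.abs(cur - bg) > threshold  # depth-change mask
    if not changed.any():
        return None
    ys, xs = np.nonzero(changed)
    # Convert back to full-image coordinates for the overlay rectangle.
    return (x + xs.min(), y + ys.min(), x + xs.max(), y + ys.max())
```

An object appearing closer than the background inside the gaze area then yields a rectangle that a screen-generation step could draw over the camera image or the depth map as the mark image.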
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Emergency Management (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- Human Computer Interaction (AREA)
- Closed-Circuit Television Systems (AREA)
- Alarm Systems (AREA)
- Emergency Alarm Devices (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Image Processing (AREA)
- Burglar Alarm Systems (AREA)
Claims (6)
- A three-dimensional intrusion detection system that acquires three-dimensional information of a monitoring area from a plurality of camera images obtained by photographing the monitoring area with at least two spaced-apart cameras, and detects an object that has intruded into the monitoring area based on the three-dimensional information, the system comprising: an image acquisition unit that acquires the plurality of camera images; a three-dimensional measurement unit that performs three-dimensional measurement to measure the three-dimensional positions of objects in the monitoring area based on the plurality of camera images and outputs the three-dimensional information of the monitoring area; an intrusion detection unit that detects an object that has intruded into the monitoring area based on changes in the three-dimensional information; and a screen generation unit that generates a map image visualizing the three-dimensional information and a mark image indicating the object that has intruded into the monitoring area, and outputs a monitoring screen displaying the mark image together with at least one of the camera image and the map image selected by a user's input operation.
- The three-dimensional intrusion detection system according to claim 1, further comprising an area setting unit that sets a gaze area on the camera image in response to a user's input operation, wherein the screen generation unit displays at least one of the map image and the camera image on the monitoring screen with the display range limited to the gaze area.
- The three-dimensional intrusion detection system according to claim 2, wherein the area setting unit sets the measurement area targeted by the three-dimensional measurement to a range that includes the detection area targeted by the intrusion detection and is identical to the gaze area.
- The three-dimensional intrusion detection system according to claim 2, wherein the screen generation unit outputs, as the monitoring screen, a screen in a two-split display state in which one of the plurality of camera images and the map image are displayed side by side.
- The three-dimensional intrusion detection system according to claim 4, wherein the screen generation unit displays an operation unit for switching the monitoring screen on the monitoring screen and, in response to the user operating this operation unit, switches between a screen displaying only a single camera image, a screen displaying only the map image, and the screen in the two-split display state.
- A three-dimensional intrusion detection method that causes an information processing device to perform processing of acquiring three-dimensional information of a monitoring area from a plurality of camera images obtained by photographing the monitoring area with at least two spaced-apart cameras and detecting an object that has intruded into the monitoring area based on the three-dimensional information, the method comprising: acquiring the plurality of camera images; performing three-dimensional measurement to measure the three-dimensional positions of objects in the monitoring area based on the plurality of camera images, thereby generating the three-dimensional information of the monitoring area; detecting an object that has intruded into the monitoring area based on changes in the three-dimensional information; and generating a map image visualizing the three-dimensional information and a mark image indicating the object that has intruded into the monitoring area, and outputting a monitoring screen displaying the mark image together with at least one of the camera image and the map image selected by a user's input operation.
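Claim 1's three-dimensional measurement unit measures 3-D positions from images taken by at least two spaced-apart cameras. The claims do not prescribe a particular algorithm; a standard stereo approach triangulates per-pixel depth from disparity as Z = f·B/d. A minimal sketch of that step — with `FOCAL_PX` and `BASELINE_M` as hypothetical calibration values, not figures from the patent:

```python
import numpy as np

# Hypothetical calibration constants (not from the patent): a real
# system would obtain these by calibrating the two spaced-apart cameras.
FOCAL_PX = 700.0     # focal length in pixels
BASELINE_M = 0.12    # camera separation (baseline) in metres

def depth_from_disparity(disparity):
    """Triangulate per-pixel depth in metres from a disparity map using
    Z = f * B / d; pixels with no stereo match (d <= 0) map to infinity."""
    d = np.asarray(disparity, dtype=float)
    depth = np.full(d.shape, np.inf)
    valid = d > 0
    depth[valid] = FOCAL_PX * BASELINE_M / d[valid]
    return depth
```

Rendered as an image, such a depth map corresponds to the "map image visualizing the three-dimensional information" recited in the claims.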
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/488,640 US20210142636A1 (en) | 2017-03-01 | 2018-01-19 | Three-dimensional intrusion detection system and three-dimensional intrusion detection method |
GB1911226.7A GB2572933B (en) | 2017-03-01 | 2018-01-19 | Three-dimensional intrusion detection system and three-dimensional intrusion detection method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-038136 | 2017-03-01 | ||
JP2017038136A JP6365906B1 (ja) | 2017-03-01 | 2017-03-01 | 3次元侵入検知システムおよび3次元侵入検知方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018159144A1 true WO2018159144A1 (ja) | 2018-09-07 |
Family
ID=63036761
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/001492 WO2018159144A1 (ja) | 2017-03-01 | 2018-01-19 | 3次元侵入検知システムおよび3次元侵入検知方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20210142636A1 (ja) |
JP (1) | JP6365906B1 (ja) |
GB (1) | GB2572933B (ja) |
WO (1) | WO2018159144A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110398199A (zh) * | 2019-07-05 | 2019-11-01 | 内蒙古能建数字信息科技有限公司 | 一种建筑限界检测方法 |
WO2020054110A1 (ja) * | 2018-09-12 | 2020-03-19 | コニカミノルタ株式会社 | 物体検出システム、および物体検出方法 |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA3120621A1 (en) * | 2018-11-22 | 2020-05-28 | Presien Pty Ltd | System and method for identifying a defined object and alerting a user |
JP7272128B2 (ja) * | 2019-06-14 | 2023-05-12 | オムロン株式会社 | 情報処理装置、情報処理方法、情報処理プログラム、および記録媒体 |
JP7405395B2 (ja) * | 2019-10-07 | 2023-12-26 | 日本電気通信システム株式会社 | 物体検知装置、システム、方法、及びプログラム |
CN113724478A (zh) * | 2021-08-31 | 2021-11-30 | 上海中通吉网络技术有限公司 | 基于边缘计算的智能安检系统 |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09282459A (ja) * | 1996-04-18 | 1997-10-31 | Matsushita Electric Ind Co Ltd | 物体検出装置 |
JP2001204007A (ja) * | 2000-01-19 | 2001-07-27 | Meidensha Corp | 監視カメラの監視エリア設定装置及びその方法 |
JP2005080156A (ja) * | 2003-09-03 | 2005-03-24 | Hitachi Kokusai Electric Inc | 映像監視システム |
JP2005269397A (ja) * | 2004-03-19 | 2005-09-29 | D & M Holdings Inc | 遠隔監視システム |
JP2007116666A (ja) * | 2005-09-20 | 2007-05-10 | Fujinon Corp | 監視カメラ装置及び監視カメラシステム |
JP2007257122A (ja) * | 2006-03-22 | 2007-10-04 | Hitachi Kokusai Electric Inc | 監視システム |
JP2010277262A (ja) * | 2009-05-27 | 2010-12-09 | Konica Minolta Holdings Inc | 画像処理装置および方法 |
JP2012175631A (ja) * | 2011-02-24 | 2012-09-10 | Mitsubishi Electric Corp | 映像監視装置 |
JP2016174252A (ja) * | 2015-03-16 | 2016-09-29 | キヤノン株式会社 | 画像処理装置、画像処理システム、画像処理方法及びコンピュータプログラム |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080192118A1 (en) * | 2006-09-22 | 2008-08-14 | Rimbold Robert K | Three-Dimensional Surveillance Toolkit |
US20090322874A1 (en) * | 2007-04-23 | 2009-12-31 | Mark Knutson | System and method for remote surveillance |
KR20170059760A (ko) * | 2015-11-23 | 2017-05-31 | 엘지전자 주식회사 | 이동단말기 및 그 제어방법 |
2017
- 2017-03-01 JP JP2017038136A patent/JP6365906B1/ja active Active

2018
- 2018-01-19 US US16/488,640 patent/US20210142636A1/en not_active Abandoned
- 2018-01-19 GB GB1911226.7A patent/GB2572933B/en active Active
- 2018-01-19 WO PCT/JP2018/001492 patent/WO2018159144A1/ja active Application Filing
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09282459A (ja) * | 1996-04-18 | 1997-10-31 | Matsushita Electric Ind Co Ltd | 物体検出装置 |
JP2001204007A (ja) * | 2000-01-19 | 2001-07-27 | Meidensha Corp | 監視カメラの監視エリア設定装置及びその方法 |
JP2005080156A (ja) * | 2003-09-03 | 2005-03-24 | Hitachi Kokusai Electric Inc | 映像監視システム |
JP2005269397A (ja) * | 2004-03-19 | 2005-09-29 | D & M Holdings Inc | 遠隔監視システム |
JP2007116666A (ja) * | 2005-09-20 | 2007-05-10 | Fujinon Corp | 監視カメラ装置及び監視カメラシステム |
JP2007257122A (ja) * | 2006-03-22 | 2007-10-04 | Hitachi Kokusai Electric Inc | 監視システム |
JP2010277262A (ja) * | 2009-05-27 | 2010-12-09 | Konica Minolta Holdings Inc | 画像処理装置および方法 |
JP2012175631A (ja) * | 2011-02-24 | 2012-09-10 | Mitsubishi Electric Corp | 映像監視装置 |
JP2016174252A (ja) * | 2015-03-16 | 2016-09-29 | キヤノン株式会社 | 画像処理装置、画像処理システム、画像処理方法及びコンピュータプログラム |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020054110A1 (ja) * | 2018-09-12 | 2020-03-19 | コニカミノルタ株式会社 | 物体検出システム、および物体検出方法 |
JPWO2020054110A1 (ja) * | 2018-09-12 | 2021-08-30 | コニカミノルタ株式会社 | 物体検出システム、および物体検出方法 |
JP7184087B2 (ja) | 2018-09-12 | 2022-12-06 | コニカミノルタ株式会社 | 物体検出システム、および物体検出方法 |
CN110398199A (zh) * | 2019-07-05 | 2019-11-01 | 内蒙古能建数字信息科技有限公司 | 一种建筑限界检测方法 |
Also Published As
Publication number | Publication date |
---|---|
US20210142636A1 (en) | 2021-05-13 |
JP2018147015A (ja) | 2018-09-20 |
GB2572933B (en) | 2022-05-18 |
JP6365906B1 (ja) | 2018-08-01 |
GB2572933A (en) | 2019-10-16 |
GB201911226D0 (en) | 2019-09-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6365906B1 (ja) | 3次元侵入検知システムおよび3次元侵入検知方法 | |
JP2007249722A (ja) | 物体検知装置 | |
US10019217B2 (en) | Visual focus-aware techniques for visualizing display changes | |
EP3675066A1 (en) | Information processing device, system, image processing method, computer program, and storage medium | |
JP6332568B2 (ja) | 情報処理装置、制御方法、及びプログラム | |
JP2019071578A (ja) | 物体検知装置、物体検知システムおよび物体検知方法 | |
JPWO2016151950A1 (ja) | 監視システム、監視方法および監視プログラム | |
KR101256894B1 (ko) | 3d이미지 및 사진이미지를 이용한 실시간 설비 모니터링 장치 | |
CN114726978A (zh) | 信息处理装置、信息处理方法以及程序 | |
US11327292B2 (en) | Method of operating observation device, observation device, and recording medium | |
JP6620846B2 (ja) | 3次元侵入検知システムおよび3次元侵入検知方法 | |
JP6664078B2 (ja) | 3次元侵入検知システムおよび3次元侵入検知方法 | |
US10068375B2 (en) | Information processing apparatus, information processing method, and recording medium | |
JP2017025503A (ja) | 建設機器操作アシスト表示システム | |
JP2019032713A (ja) | 情報処理装置、情報処理方法及びプログラム | |
JP6581280B1 (ja) | 監視装置、監視システム、監視方法、監視プログラム | |
KR20170134021A (ko) | 무인항공기 영상분석자료 기반 재난시각화 시스템 | |
KR20180119344A (ko) | 영역 감시 장치 및 이에 의한 영역 감시 방법 | |
JP2012096637A (ja) | 車両用表示装置 | |
JP2020128908A (ja) | ガス漏れ位置特定システム及びガス漏れ位置特定プログラム | |
JP7395137B2 (ja) | ヘッドマウント型温度分布認識装置 | |
JP7418734B2 (ja) | 煙検出装置 | |
US20220113260A1 (en) | Image processing apparatus | |
JP2007288661A (ja) | 監視画像表示システム | |
KR101265224B1 (ko) | 카메라 영상을 이용한 회전체의 진단방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18761505; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 201911226; Country of ref document: GB; Kind code of ref document: A; Free format text: PCT FILING DATE = 20180119 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18761505; Country of ref document: EP; Kind code of ref document: A1 |