WO2014069088A1 - Information Processing System, Information Processing Method, and Program - Google Patents
Information Processing System, Information Processing Method, and Program
- Publication number
- WO2014069088A1 (PCT application PCT/JP2013/073358)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video
- imaging device
- time
- information processing
- person
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19608—Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19639—Details of the system layout
- G08B13/19645—Multiple cameras, each having view on one of a plurality of scenes, e.g. multiple cameras for multi-room surveillance or for tracking an object by view hand-over
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- Some aspects of the present invention relate to an information processing system, an information processing method, and a program.
- Some aspects of the present invention have been made in view of the above-described problems, and one of their purposes is to provide an information processing system, an information processing method, and a program capable of suitably monitoring a moving body across a plurality of imaging devices.
- An information processing system according to the present invention includes: input means for receiving input of videos captured by a plurality of imaging devices; detection means for detecting a moving body appearing in the video captured by a first imaging device among the plurality of imaging devices, the videos being input by the input means; prediction means for predicting a time zone in which the moving body appears in the video of a second imaging device among the plurality of imaging devices, based on a time transition of the probability that the moving body detected in the video of the first imaging device appears in the video captured by the second imaging device, and on the time elapsed after the moving body detected by the first imaging device leaves the imaging range; and notification means for notifying the time zone in which the moving body is predicted to appear in the video of the second imaging device.
- An information processing method according to the present invention is one in which an information processing system performs: a step of receiving input of videos captured by a plurality of imaging devices; a step of detecting a moving body appearing in the video captured by a first imaging device among the plurality of imaging devices; a step of predicting a time zone in which the moving body appears in the video of a second imaging device among the plurality of imaging devices, based on a time transition of the probability that the moving body detected in the video of the first imaging device appears in the video captured by the second imaging device, and on the time elapsed after the moving body detected by the first imaging device leaves the imaging range; and a step of notifying the time zone in which the moving body is predicted to appear in the video of the second imaging device.
- A program according to the present invention causes a computer to execute: a process of receiving input of videos captured by a plurality of imaging devices; a process of detecting a moving body appearing in the video captured by a first imaging device among the plurality of imaging devices; a process of predicting a time zone in which the moving body appears in the video of a second imaging device among the plurality of imaging devices, based on a time transition of the probability that the moving body detected in the video of the first imaging device appears in the video captured by the second imaging device, and on the time elapsed after the moving body detected by the first imaging device leaves the imaging range; and a process of notifying the time zone in which the moving body is predicted to appear in the video of the second imaging device.
- In this specification, the terms “part”, “means”, “apparatus”, and “system” do not simply mean physical means; they also include the case where the functions of a “part”, “means”, “apparatus”, or “system” are realized by software. Further, the functions of one “part”, “means”, “apparatus”, or “system” may be realized by two or more physical means or devices, and conversely, the functions of two or more “parts”, “means”, “apparatuses”, or “systems” may be realized by a single physical means or device.
- According to the present invention, it is possible to provide an information processing system, an information processing method, and a program capable of suitably monitoring a moving body across a plurality of imaging devices.
- FIG. 1 is a block diagram showing a system configuration of the monitoring system 1.
- The monitoring system 1 broadly comprises an information processing server 100, a plurality of video cameras 200 that capture videos (moving images) (video cameras 200A to 200N, collectively referred to as the video camera 200), a display device 300, and an input device 400.
- The monitoring system 1 will be described as a system for monitoring (tracking) a person photographed by the video camera 200, but the monitoring target may also be various other moving objects such as cars, bicycles, and motorcycles.
- The video camera 200, which is an imaging device, shoots a video, determines whether or not a person is present in the shot video, and sends information such as the position and feature amount of that person, together with the shot video, to the information processing server 100.
- The video camera 200 can also track a person in the video by comparing shot frames. Processes such as person detection, feature extraction, and in-camera person tracking may instead be performed on, for example, the information processing server 100 or another information processing apparatus (not shown).
- the information processing server 100 performs various processes such as detection of a person, registration of a person to be tracked, and tracking of a registered person by analyzing video captured by the video camera 200.
- In person monitoring, the information processing server 100 outputs a monitoring screen to the display device 300 and receives, from the input device 400, operation signals related to various operation inputs for person monitoring (person tracking). More specifically, on the monitoring screen displayed on the display device 300 (a specific example is shown in FIG. 2, described later), a plurality of videos input from the video camera 200 are displayed, so that even when the person to be monitored is not currently shown, the monitoring user can grasp in which video camera 200 the person is likely to appear next. For this purpose, the information processing server 100 has a function of predicting in which video camera 200 a person will appear next (next-camera prediction).
- When the information processing server 100 indicates (notifies) in which video camera 200 the person to be monitored is likely to appear next, it can also dynamically indicate when the person is likely to appear. For example, when there is a high possibility of the person appearing 3 to 5 seconds from now, displaying that fact on the display screen lets the monitoring user recognize the time zone to which attention should be paid. Furthermore, since the certainty of which video camera 200 the person appears in next changes with the time elapsed since the person framed out of the video of a video camera 200, the information processing server 100 dynamically calculates the probability of the person appearing in each video camera 200 and performs a display according to that probability. Details of this processing will be described later with reference to FIGS.
- When the monitoring user, looking at the display device 300, sees that a person to be monitored shown in the video (moving image) of one video camera 200 has framed out of that camera and then appeared in the video of another video camera 200, the user can operate the input device 400 to associate the two persons as the same person.
- The information processing server 100 may also associate persons without human intervention, based on information such as the feature amounts of the photographed persons.
- The display device 300 is, for example, a display using liquid crystal, organic EL (Electro Luminescence), or the like.
- the display device 300 displays the monitoring screen output from the information processing server 100.
- the input device 400 is a device for a user (monitor) to input various information.
- a pointing device such as a mouse, a touch pad, or a touch panel, a keyboard, and the like correspond to the input device 400.
- Various processes, such as registering a person to be monitored (tracked) and associating a registered person with a person newly appearing in the video camera 200 (associating them as the same person), are performed based on the user's operation of the input device 400.
- The display device 300 and the input device 400 may be realized as a single client, or the functions of the information processing server 100, the display device 300, and the input device 400 may be realized by four or more information processing apparatuses.
- the client may have some functions of the information processing server 100 according to the present embodiment.
- the information processing server 100 includes an input unit 110, a person detection unit 120, a probability information generation unit 130, a time prediction unit 140, a display control unit 150, and a database (DB) 160.
- the functions of the information processing server 100 may be realized by a plurality of information processing apparatuses (computers).
- The input unit 110 outputs the video received from the video camera 200 to the display control unit 150 so that the video is displayed on the display device 300, and also registers the information on the person detection results received from the video camera 200 in the DB 160 as detected person information 163 and the like.
- the detected person information 163 registered in the DB 160 by the input unit 110 includes information on the feature amount of the person detected by the video camera 200.
- Based on the person detection results that the input unit 110 receives from the video camera 200, the person detection unit 120 detects that the person to be monitored appears in the video of a video camera 200 and that the person has disappeared from it (framed out). As described above, in the present embodiment the video camera 200 performs person detection, but the present invention is not limited to this, and the person detection unit 120 itself may perform person detection.
- The probability information generation unit 130 generates information related to the time transition of the probability that a person who appeared in one video camera 200 and then framed out appears in the video of another video camera 200 (which may also take into account the case where the person returns (makes a U-turn) to the same video camera 200), and registers it as the probability information 161 of the DB 160.
- The registration of the probability information 161 may be performed in advance as pre-processing, or may be updated dynamically while person detection is performed sequentially.
- The time transition, generated by the probability information generation unit 130, of the probability that a person appears in another video camera 200 after framing out of the video of one video camera 200 will be described later with reference to FIGS.
- The time prediction unit 140 predicts the time (time zone) at which a person shown in one video camera 200 appears in another camera after framing out, based on the time elapsed since the frame-out and the probability information 161. This process will be described later with reference to FIG.
- the display control unit 150 causes the display device 300 to display various display screens such as a monitoring screen.
- the display control unit 150 includes a video display unit 151 and a UI generation unit 153.
- the video display unit 151 displays the captured video input by the input unit 110 on the display device 300.
- a video area to be displayed by the video display unit 151 is provided in a part of the monitoring screen.
- The video displayed by the video display unit 151 on the display device 300 need not be a real-time video. In the case of displaying recorded video on the display device 300, the video input from the video camera 200 is stored in a storage medium (not shown), and the video display unit 151 reads it out and displays it on the display device 300.
- the UI generation unit 153 causes the display device 300 to display various display screens such as a monitoring screen whose specific example is shown in FIG.
- The display screens generated by the UI generation unit 153 are presented as a GUI (Graphical User Interface).
- As a result of the next-camera prediction by the probability information generation unit 130 and the time prediction unit 140, the UI generation unit 153 notifies the user of information on the video camera 200 in which the person to be monitored is estimated to appear and on the corresponding time zone.
- the DB 160 is constructed on various storage devices such as an HDD (not shown). The DB 160 manages probability information 161 and detected person information 163.
- FIG. 2 is a diagram illustrating a specific example of a display screen (hereinafter also referred to as the monitoring screen 20) that the display device 300 displays for person monitoring.
- the example of the monitoring screen 20 in FIG. 2 includes video areas 21A to 21D (hereinafter also collectively referred to as video areas 21) for displaying captured videos input from a plurality of video cameras 200, respectively.
- the video area 21 displays multi-camera video input from a plurality of video cameras 200 as described above.
- The video of the video camera 200 displayed in each video area 21 is switched as needed (dynamically). For example, when the person to be monitored moves out of the display area, the UI generation unit 153 and the video display unit 151 switch the video displayed in the video area 21 to the video of the video camera 200 in which the time prediction unit 140 predicts the person will appear next, in accordance with the movement of the person.
- FIG. 2A shows a specific example of the monitoring screen 20 when the person P moving in the traveling direction a is shown in the video in the video area 21A (video of the video camera 200 having the identifier “Camera001”).
- FIG. 2B is a diagram illustrating a specific example of the monitoring screen 20 in a case where the person P has framed out of the video in the video area 21A and is not yet detected by any of the plurality of video cameras 200.
- The video areas 21B to 21D of the monitoring screen 20 after the person P frames out include time zone display areas 23B to 23D (hereinafter also collectively referred to as the time zone display area 23) and appearance position suggestion images 25B to 25D (hereinafter also collectively referred to as the appearance position suggestion image 25).
- The time zone display area 23 indicates the time zone in which the person P is likely to appear in each video. For example, “3s to 5s” displayed in the time zone display area 23 of the video area 21B indicates that, if the person P appears in that video, it is highly likely to be during the time zone 3 to 5 seconds from the current time at which the monitoring screen 20 is displayed. Similarly, “12s to 16s” displayed in the time zone display area 23C of the video area 21C indicates that, if the person P appears in the video of the video area 21C, it is highly likely to be during the time zone 12 to 16 seconds from the current time, and “30s to 40s” displayed in the time zone display area 23D of the video area 21D indicates that, if the person P appears in the video of the video area 21D, it is highly likely to be during the time zone 30 to 40 seconds from the current time.
- the appearance position suggestion image 25 indicates a position on the video where the person P to be monitored is likely to appear.
- The appearance position suggestion image 25B is shown in a darker color than the appearance position suggestion images 25C and 25D. This indicates that, at the current time, the probability that the person P appears in the video of the video area 21B is higher than the probability that the person P appears in the video areas 21C and 21D. In this way, by changing the color of the appearance position suggestion image 25 according to the appearance probability at the current time, or by blinking the appearance position suggestion image 25, the user can be made to recognize both the possibility that the person P will appear and the time zone in which the person P is likely to appear.
- Further, by arranging the video areas 21B, 21C, and 21D in the order in which the person P is likely to appear soonest, it becomes easy for the user to understand which video area 21 to pay attention to.
- Since the user only needs to check each video area 21 during the notified time zone, there is no need to watch the video before the start of that time zone (range) or after its end, so the monitoring load on the user can be reduced.
- Alternatively, without changing the position of the video displayed in each video area 21, the priority of the videos that the monitoring user should watch may be expressed by changing how each video area 21 is emphasized, for example by color, according to how likely the person P is to appear.
- the time zone in which the person P is likely to appear is notified by the time zone display area 23 and the appearance position suggestion image 25, but the present invention is not limited to this.
- For example, the video area 21 and its surroundings may be blinked, or their luminance changed, according to the possibility (probability) that the person P appears.
- FIG. 3A shows the state at time T after the person P shown in FIG. 2 frames out of the video area 21A, and FIG. 3B shows the state 2 seconds after FIG. 3A. In FIGS. 3A and 3B, a display example of the video area 21 is shown on the left side, and the time transition of the probability that the person P appears in the video area 21 is shown on the right side.
- In the state of FIG. 3A, the time zone in which the probability that the person P appears exceeds the threshold X is 3 to 5 seconds from the current time, so the UI generation unit 153 displays “3s to 5s” in the time zone display area 23 of the video area 21.
- In the state of FIG. 3B, two seconds later, the time zone in which the probability that the person P appears exceeds the threshold X is 1 to 3 seconds from the current time, so the UI generation unit 153 displays “1s to 3s” in the time zone display area 23 of the video area 21.
- In this way, the notification of the time zone in which the person P appears in the video area 21 changes dynamically with the passage of time. The appearance position suggestion image 25 can also be changed dynamically with the passage of time, in the same way as the time zone display area 23.
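The threshold-based time-zone update described above can be sketched in code. The following is a minimal illustration (the function name and the discretized per-second probability curve are assumptions for illustration, not part of the embodiment): given the time transition of the probability that the person P appears and the time elapsed since frame-out, it returns the time zone, relative to the current time, in which the probability exceeds the threshold X.

```python
def appearance_time_zone(prob, threshold, elapsed):
    """prob[t] is the probability that the person appears t seconds after
    framing out; return the (start, end) of the time zone, in seconds from
    the current time, where the probability exceeds the threshold, or None
    if no remaining time exceeds it."""
    over = [t for t, p in enumerate(prob) if p > threshold and t >= elapsed]
    if not over:
        return None
    return (over[0] - elapsed, over[-1] - elapsed)

# A curve peaking 3-5 seconds after frame-out (cf. FIG. 3):
curve = [0.1, 0.2, 0.3, 0.6, 0.7, 0.6, 0.3]
print(appearance_time_zone(curve, 0.5, 0))  # at frame-out: (3, 5)
print(appearance_time_zone(curve, 0.5, 2))  # two seconds later: (1, 3)
```

As in FIGS. 3A and 3B, the displayed time zone shifts from “3s to 5s” to “1s to 3s” as two seconds elapse, without the underlying curve changing.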
- μ is the average appearance time until the tracking target person, having framed out of camera A, appears in camera B.
- The average appearance time may be corrected according to the moving speed of the monitoring target person in the video of camera A. For example, when the monitoring target person is moving at twice the average speed in the video of camera A, it is conceivable to halve the average appearance time. Alternatively, attributes such as the gender and height of the person to be monitored may be identified in advance, and an average appearance time corresponding to those attributes may be adopted.
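The speed-based correction above can be sketched as follows (a hedged illustration; the function name and the linear inverse-proportionality are assumptions): a person moving at twice the average speed in camera A's video is expected to reach camera B in half the average time.

```python
def corrected_mean_appearance_time(mu, average_speed, observed_speed):
    """Scale the average appearance time mu (camera A to camera B) by the
    ratio of the average moving speed to the observed moving speed."""
    return mu * (average_speed / observed_speed)

# Moving twice as fast halves the expected appearance time:
print(corrected_mean_appearance_time(10.0, 1.0, 2.0))  # 5.0
```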
- a(t) is a histogram, and a specific example is shown in FIG.
- The distribution a(t) may be statistically created and updated by the probability information generation unit 130 as part of the probability information 161, based on the movements of persons monitored by the monitoring system 1, or it may be prepared in advance as part of the probability information 161.
- Information related to a(t) is generated for each pair of cameras between which a person can move.
- Alternatively, a common function may be adopted. More specifically, for example, it is conceivable to adopt a BPT (Brownian Passage Time) distribution expressed by the following equation.
- α is a predetermined parameter representing the degree of variation in appearance time.
- This equation assumes that the movement of the person is a Brownian motion, but other heuristic functions, such as approximating the BPT distribution by a piecewise linear function, are also conceivable.
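The equation itself is not reproduced in this text, but the BPT distribution referred to above is commonly written as the inverse-Gaussian density f(t) = sqrt(μ / (2π α² t³)) · exp(−(t − μ)² / (2 μ α² t)), where μ is the mean appearance time and α the variation parameter. A sketch under that assumption:

```python
import math

def bpt_density(t, mu, alpha):
    """Brownian Passage Time (inverse-Gaussian) density: mu is the mean
    appearance time, alpha the degree of variation in appearance time."""
    if t <= 0.0:
        return 0.0
    return math.sqrt(mu / (2.0 * math.pi * alpha ** 2 * t ** 3)) * \
        math.exp(-(t - mu) ** 2 / (2.0 * mu * alpha ** 2 * t))
```

The density is concentrated around t = μ, with α controlling how sharply, and it integrates to 1 over t > 0.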
- Further, when camera A has a plurality of exits, a distribution for each exit may be provided so that the correspondence of each exit is examined separately.
- The probability that the monitoring target person appears in the video of camera B at time T can be calculated by the following formula, taking into account the certainty that the person does not appear in cameras other than camera B. This value is close to 1 if the possibility of the person appearing in a video camera other than camera B is 0, and close to 0 if the possibility of appearing in a camera other than camera B is high.
- The probability calculation may also take into account whether or not a person has actually appeared, based on the feature amounts of persons appearing in camera B or another video camera 200. For example, if the probability, computed from feature amounts, that a person appearing in camera B is the person to be monitored is g, the probability can be calculated by the following formula.
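The exact formulas are not reproduced in this text, but the idea of normalizing camera B's likelihood against the possibility of appearing in other cameras, optionally weighted by the feature-based match probability g, can be sketched as follows (the function name and the weighting scheme are illustrative assumptions, not the patented formula):

```python
def camera_appearance_probability(likelihood, target, match=None):
    """likelihood: per-camera appearance likelihood at the current time.
    match: optional per-camera probability (g) that a person actually seen
    there is the monitoring target, from feature-amount comparison.
    Returns the normalized probability of appearing at `target`, which is
    close to 1 when the likelihood for every other camera is near 0."""
    if match is not None:
        likelihood = {c: v * match.get(c, 1.0) for c, v in likelihood.items()}
    total = sum(likelihood.values())
    return likelihood[target] / total if total > 0.0 else 0.0

# Only camera B is plausible -> probability 1.0:
print(camera_appearance_probability({'B': 0.4, 'C': 0.0, 'D': 0.0}, 'B'))
```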
- FIG. 5 is a flowchart showing the flow of processing of the information processing server 100 according to the present embodiment.
- Each processing step described below can be executed in an arbitrary order or in parallel as long as no contradiction arises in the processing contents, and other steps may be added between the processing steps. Further, a step described as a single step for convenience may be executed by being divided into a plurality of steps, and steps described as divided for convenience may be executed as one step.
- First, the person detection unit 120 determines whether or not a person, as a detection target object, appears in the video captured by the video camera 200 (S501).
- the video camera 200 is referred to as “camera A”.
- The person detection unit 120 then determines whether or not the person has framed out of camera A (S503).
- When the person has framed out of camera A (Yes in S503), the time prediction unit 140 first calculates the time elapsed since the person framed out of camera A (S505). The time prediction unit 140 further refers to the probability information 161 in the DB 160 (S507) and, for each video camera 200 at a position to which the person can move from camera A, generates the time transition of the probability that the person appears in the video of that video camera 200 (for example, information corresponding to the graph on the right side of FIG. 3).
- Based on this, the time prediction unit 140 obtains the time zone in which the person is likely to appear in each video camera 200 (S509). More specifically, for example, in the time transition of the probability shown on the right side of FIG. 3, the time prediction unit 140 can specify the time zone in which the probability exceeds the threshold X as the time zone in which the person is likely to appear in that video camera 200.
- The UI generation unit 153 then, for example, rearranges the videos in the video areas 21 of the monitoring screen 20 in the order of the time zones in which the time prediction unit 140 predicts the person is likely to move from camera A to each video camera 200, and displays each time zone in the time zone display area 23 (S511). As described above, the method of notifying the user of the time zone is not limited to this.
- Thereafter, the person detection unit 120 determines whether or not the tracking target person who framed out of camera A has been detected by any of the video cameras 200 (S513). If the person has not been detected (No in S513), the information processing server 100 repeats the processing from S505. On the other hand, if the tracking target person is detected by one of the video cameras 200 (Yes in S513), the notification to the user is ended (S515).
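Steps S505 to S511 of the loop above can be sketched for all candidate cameras at once (a schematic under assumed data structures, not the server's actual implementation): compute each camera's time zone from its probability transition and the elapsed time, then order the cameras by earliest predicted appearance for display.

```python
def predict_all_time_zones(prob_curves, threshold, elapsed):
    """prob_curves: camera name -> per-second appearance probabilities after
    frame-out (playing the role of the probability information 161).
    Returns (camera, (start, end)) pairs, relative to the current time,
    sorted by earliest predicted appearance (cf. S509-S511)."""
    zones = {}
    for camera, prob in prob_curves.items():
        over = [t for t, p in enumerate(prob) if p > threshold and t >= elapsed]
        if over:
            zones[camera] = (over[0] - elapsed, over[-1] - elapsed)
    return sorted(zones.items(), key=lambda item: item[1][0])

curves = {
    'Camera002': [0.0, 0.1, 0.2, 0.6, 0.7, 0.6, 0.2],
    'Camera003': [0.0] * 12 + [0.6, 0.7, 0.6, 0.7, 0.6],
}
print(predict_all_time_zones(curves, 0.5, 0))
# [('Camera002', (3, 5)), ('Camera003', (12, 16))]
```

Re-running this with a growing `elapsed` value reproduces the repetition of S505 onward until the person is detected again.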
- the information processing server 100 includes a processor 601, a memory 603, a storage device 605, an input interface (I / F) 607, a data I / F 609, a communication I / F 611, and a display device 613.
- the processor 601 controls various processes in the information processing server 100 by executing a program stored in the memory 603.
- The processing related to the input unit 110, the person detection unit 120, the probability information generation unit 130, the time prediction unit 140, and the display control unit 150 described in FIG. 1 can be realized mainly as a program that is temporarily stored in the memory 603 and then runs on the processor 601.
- the memory 603 is a storage medium such as a RAM (Random Access Memory).
- the memory 603 temporarily stores a program code of a program executed by the processor 601 and data necessary for executing the program. For example, in the storage area of the memory 603, a stack area necessary for program execution is secured.
- the storage device 605 is a non-volatile storage medium such as a hard disk or a flash memory.
- The storage device 605 stores an operating system, various programs for realizing the input unit 110, the person detection unit 120, the probability information generation unit 130, the time prediction unit 140, and the display control unit 150, and various data including the probability information 161 and the detected person information 163 stored as the DB 160. Programs and data stored in the storage device 605 are loaded into the memory 603 as necessary and referred to by the processor 601.
- the input I / F 607 is a device for receiving input from the user.
- the input device 400 described in FIG. 1 can also be realized by the input I / F 607.
- Specific examples of the input I / F 607 include a keyboard, a mouse, a touch panel, and various sensors.
- the input I / F 607 may be connected to the information processing server 100 via an interface such as USB (Universal Serial Bus).
- the data I / F 609 is a device for inputting data from outside the information processing server 100.
- Specific examples of the data I / F 609 include a drive device for reading data stored in various storage media.
- the data I / F 609 may be provided outside the information processing server 100. In this case, the data I / F 609 is connected to the information processing server 100 via an interface such as a USB.
- the communication I / F 611 is a device for performing data communication with an external device of the information processing server 100, such as a video camera 200, by wire or wireless.
- the communication I / F 611 may be provided outside the information processing server 100. In that case, the communication I / F 611 is connected to the information processing server 100 via an interface such as a USB.
- the display device 613 is a device for displaying various information.
- the display device 300 described in FIG. 1 can also be realized by the display device 613.
- Specific examples of the display device 613 include a liquid crystal display and an organic EL (Electro-Luminescence) display.
- the display device 613 may be provided outside the information processing server 100. In that case, the display device 613 is connected to the information processing server 100 via, for example, a display cable.
- As described above, in the monitoring system 1 according to the present embodiment, the user is notified of the time zone (time interval) in which a person is likely to move from the shooting area of one video camera 200 to the shooting area of each other video camera 200, so that the monitoring user can grasp when and where attention should be paid.
- FIG. 7 is a block diagram illustrating a functional configuration of the monitoring apparatus 700 that is an information processing system.
- the monitoring device 700 includes an input unit 710, a detection unit 720, a prediction unit 730, and a notification unit 740.
- The input unit 710 can receive input of videos captured by video cameras (imaging devices) (not shown).
- The detection unit 720 detects a moving body appearing in the video captured by at least one of the plurality of video cameras whose videos are input by the input unit 710.
- specific examples of the moving body include a human, a car, a bicycle, a motorcycle, and the like.
- The prediction unit 730 predicts the time zone in which a moving body detected by one video camera appears in the video of another video camera, based on the time transition of the probability that the moving body appears in that other video camera and on the time elapsed after the moving body leaves the imaging range of the first video camera. The notification unit 740 notifies the time zone in which the moving body is predicted to appear in the video of the other video camera.
- (Appendix 1) An information processing system comprising: input means for receiving video captured by a plurality of imaging devices; detection means for detecting a moving body appearing in video captured by a first imaging device among the plurality of imaging devices, as input by the input means; prediction means for predicting a time zone in which the moving body will appear in the video of a second imaging device among the plurality of imaging devices, based on the time transition of the probability that the moving body detected in the video of the first imaging device appears in video captured by the second imaging device, and on the time elapsed after the moving body detected by the first imaging device leaves the imaging range; and notification means for notifying the time zone in which the moving body is predicted to appear in the video of the second imaging device.
- (Appendix 2) The prediction means predicts, as the time zone in which the moving body will appear at the second imaging device, the time zone in which the probability that the moving body detected by the first imaging device appears in video captured by the second imaging device exceeds a threshold.
- (Appendix 7) The information processing method according to appendix 5 or appendix 6, wherein the time zone in which the moving body is predicted to appear is displayed over the video captured by the second imaging device, thereby notifying the time zone.
- A program that causes a computer to execute: a process of predicting the time zone in which the moving body will appear in the video of the second imaging device, based on the time elapsed after the moving body leaves the imaging range; and a process of notifying the time zone in which the moving body is predicted to appear in the video of the second imaging device.
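The threshold rule of Appendix 2 above can be illustrated with a short sketch. Everything below is an assumption for illustration only: the appearance probability is modeled with a BPT-style (inverse Gaussian) density, and the names `bpt_density`, `predicted_time_zone`, `mu`, and `alpha` do not come from the patent.

```python
import math

def bpt_density(t, mu, alpha):
    # BPT (inverse Gaussian) density; mu and alpha are illustrative parameters.
    if t <= 0:
        return 0.0
    return math.sqrt(mu / (2.0 * math.pi * alpha ** 2 * t ** 3)) * \
        math.exp(-((t - mu) ** 2) / (2.0 * mu * alpha ** 2 * t))

def predicted_time_zone(mu, alpha, threshold, t_max=120.0, dt=0.05):
    """Return (start, end) of the time zone in which the appearance
    probability density exceeds `threshold`, or None if it never does."""
    times = [i * dt for i in range(1, int(t_max / dt))]
    above = [t for t in times if bpt_density(t, mu, alpha) > threshold]
    return (above[0], above[-1]) if above else None
```

With, say, mu = 10 s and alpha = 0.5, the returned interval brackets the mean appearance time, matching the idea that the notification covers exactly the period in which the probability exceeds the threshold.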
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Psychiatry (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Social Psychology (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Closed-Circuit Television Systems (AREA)
- Alarm Systems (AREA)
Abstract
Description
FIGS. 1 to 6 are diagrams for explaining the first embodiment. This embodiment is described below with reference to these drawings, in the following order. First, "1.1" gives an overview of the functional configuration of the system, and "1.2" outlines its operation using concrete examples of display screens. Then, "1.3" shows the flow of processing, and "1.4" gives a concrete example of a hardware configuration capable of realizing this system. Finally, "1.5" and the subsequent sections describe the effects of this embodiment.
(1.1.1 System Configuration Overview)
The system configuration of the monitoring system 1, an information processing system according to this embodiment, is described with reference to FIG. 1. FIG. 1 is a block diagram showing the system configuration of the monitoring system 1.
Note that processes such as person detection, feature extraction, and within-camera person tracking may be performed, for example, on the information processing server 100 or on another information processing apparatus not shown.
The configuration of the information processing server 100 according to this embodiment is described below. As shown in FIG. 1, the information processing server 100 includes an input unit 110, a person detection unit 120, a probability information generation unit 130, a time prediction unit 140, a display control unit 150, and a database (DB) 160. The functions of the information processing server 100 may be realized by a plurality of information processing apparatuses (computers).
The display control unit 150 causes the display device 300 to display various screens such as a monitoring screen. The display control unit 150 includes a video display unit 151 and a UI generation unit 153.
The DB 160 is built on various storage devices, such as an HDD, not shown. The DB 160 manages probability information 161 and detected-person information 163.
The functions and operations of the monitoring system 1 are described below with reference to FIGS. 2 to 4.
First, a concrete example of the display screen shown by the display device 300 is described with reference to FIG. 2. FIG. 2 is a diagram showing a concrete example of the display screen (hereinafter also referred to as the monitoring screen 20) that the display device 300 displays for person monitoring.
Next, changes in the user notification over time are described with reference to FIG. 3. FIG. 3(a) shows the state at time T after the person P shown in FIG. 2 has left the frame of the video area 21A, and FIG. 3(b) shows the state two seconds after FIG. 3(a). In each of FIGS. 3(a) and 3(b), the left side shows a display example of the video area 21, and the right side shows the time transition of the probability that the person P appears in that video area 21.
In this way, the method of notifying the time zone in which the person P will appear in the video area 21 changes dynamically with the passage of time.
Next, the method of calculating the probability is described with reference to FIG. 4. The probability f(t|A,B) that a person appears in the video of the next video camera 200 (referred to here as "camera B") t seconds after leaving the frame of a given video camera 200 (referred to here as "camera A") can be expressed by the following formula.
This formula assumes that the person's movement is Brownian motion, but other heuristic functions may also be adopted, such as expressing the BPT distribution by a piecewise linear function.
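A piecewise-linear stand-in for the BPT distribution, as suggested above, might look like the following sketch; the breakpoint values are illustrative assumptions, not taken from the patent.

```python
def piecewise_linear_density(t, points):
    """Linearly interpolate between (time, density) breakpoints;
    outside the covered range the density is taken to be zero."""
    for (t0, y0), (t1, y1) in zip(points, points[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return y0 + w * (y1 - y0)
    return 0.0

# Illustrative breakpoints placing the peak near t = 10 seconds.
bpt_approx = [(0.0, 0.0), (6.0, 0.02), (10.0, 0.08), (18.0, 0.01), (40.0, 0.0)]
```

Such a table of breakpoints is cheap to evaluate and can be fitted to observed transit times between a camera pair instead of assuming a closed-form distribution.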
As can be seen from the above formula and FIG. 4, the probability that the tracked person appears at camera B after leaving camera A is highest around the mean appearance time μ.
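The formula itself is not reproduced in this excerpt, but the BPT (Brownian Passage Time) distribution referred to above is the inverse Gaussian distribution. The sketch below is offered under that reading; the parameter names `mu` (mean appearance time) and `alpha` (shape/spread) are assumptions, not names from the patent.

```python
import math

def bpt_density(t, mu, alpha):
    """BPT (inverse Gaussian) density for the transit time t.

    mu    : mean appearance time after leaving camera A (assumed name)
    alpha : shape parameter controlling the spread (assumed name)
    """
    if t <= 0:
        return 0.0
    return math.sqrt(mu / (2.0 * math.pi * alpha ** 2 * t ** 3)) * \
        math.exp(-((t - mu) ** 2) / (2.0 * mu * alpha ** 2 * t))
```

Consistent with the text, the density integrates to 1 and concentrates its mass around the mean appearance time mu.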
Alternatively, assuming there are N cameras in total, denoted Ci (i = 1, 2, …, N), the probability that the monitored person appears in the video of camera B at time T may be computed by the following formula, which treats appearance at camera B as more plausible the less likely the person is to appear at any camera other than B.
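The formula referred to above is likewise not reproduced in this excerpt. Purely as an assumption, one plausible reading discounts camera B's appearance density by the probability that the person has not already appeared at any other camera; the sketch below implements that reading with hypothetical per-camera BPT parameters (`mu`, `alpha`), none of which are taken from the patent.

```python
import math

def bpt_density(t, mu, alpha):
    # BPT (inverse Gaussian) density; parameter names are assumptions.
    if t <= 0:
        return 0.0
    return math.sqrt(mu / (2.0 * math.pi * alpha ** 2 * t ** 3)) * \
        math.exp(-((t - mu) ** 2) / (2.0 * mu * alpha ** 2 * t))

def bpt_cdf(t, mu, alpha, dt=0.01):
    # Numerical CDF: probability the person has appeared by time t.
    return sum(bpt_density((i + 0.5) * dt, mu, alpha)
               for i in range(int(t / dt))) * dt

def appearance_score(T, target, cameras):
    """Score for the monitored person appearing at `target` at time T,
    discounted by not having appeared at any other camera.

    cameras: {name: (mu, alpha)} with hypothetical per-camera parameters.
    """
    mu, alpha = cameras[target]
    score = bpt_density(T, mu, alpha)
    for name, (m, a) in cameras.items():
        if name != target:
            score *= 1.0 - bpt_cdf(T, m, a)
    return score
```

Under this reading, a camera whose mean transit time matches the elapsed time T scores higher than one whose transit-time mass lies elsewhere.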
Next, the processing flow of the information processing server 100 is described with reference to FIG. 5. FIG. 5 is a flowchart showing the processing flow of the information processing server 100 according to this embodiment.
An example of the hardware configuration for realizing the above-described information processing server 100 with a computer is described below with reference to FIG. 6. As noted above, the functions of the information processing server 100 can also be realized by a plurality of information processing apparatuses.
As described above, the monitoring system 1 according to this embodiment notifies the user of the time zones (time intervals) during which the tracked person is likely to move from one video camera 200 into the imaging area of each other video camera 200. This allows the user acting as the monitor to clearly grasp the time periods during which each video should be watched, reducing the burden of person monitoring.
The second embodiment is described below with reference to FIG. 7. FIG. 7 is a block diagram showing the functional configuration of a monitoring apparatus 700, which is an information processing system. As shown in FIG. 7, the monitoring apparatus 700 includes an input unit 710, a detection unit 720, a prediction unit 730, and a notification unit 740.
The input unit 710 can receive video captured by video cameras (imaging devices), not shown.
The notification unit 740 notifies the time zone in which the moving body is predicted to appear in the video of the other video camera.
With this implementation, the monitoring apparatus 700 according to this embodiment can suitably monitor moving bodies across a plurality of imaging devices.
The configurations of the above-described embodiments may be combined, or some of their components may be replaced. Furthermore, the configuration of the present invention is not limited to the above-described embodiments, and various modifications may be made without departing from the gist of the present invention.
(Appendix 1) An information processing system comprising: input means for receiving video captured by a plurality of imaging devices; detection means for detecting a moving body appearing in video captured by a first imaging device among the plurality of imaging devices, as input by the input means; prediction means for predicting a time zone in which the moving body will appear in the video of a second imaging device among the plurality of imaging devices, based on the time transition of the probability that the moving body detected in the video of the first imaging device appears in video captured by the second imaging device, and on the time elapsed after the moving body detected by the first imaging device leaves the imaging range; and notification means for notifying the time zone in which the moving body is predicted to appear in the video of the second imaging device.
(Appendix 2) The information processing system according to appendix 1, wherein the prediction means predicts, as the time zone in which the moving body will appear at the second imaging device, the time zone in which the probability that the moving body detected by the first imaging device appears in video captured by the second imaging device exceeds a threshold.
(Appendix 3) The information processing system according to appendix 1 or appendix 2, wherein the notification means notifies the time zone by displaying, over the video captured by the second imaging device, the time zone in which the moving body is predicted to appear.
(Appendix 4) The information processing system according to appendix 1 or appendix 2, wherein the notification means notifies the time zone by changing the display of the video captured by the second imaging device during the time zone in which the moving body is predicted to appear.
(Appendix 5) An information processing method performed by an information processing system, comprising: a step of receiving video captured by a plurality of imaging devices; a step of detecting a moving body appearing in video captured by a first imaging device among the plurality of imaging devices; a step of predicting a time zone in which the moving body will appear in the video of a second imaging device among the plurality of imaging devices, based on the time transition of the probability that the moving body detected in the video of the first imaging device appears in video captured by the second imaging device, and on the time elapsed after the moving body detected by the first imaging device leaves the imaging range; and a step of notifying the time zone in which the moving body is predicted to appear in the video of the second imaging device.
(Appendix 6) The information processing method according to appendix 5, wherein the time zone in which the probability that the moving body detected by the first imaging device appears in video captured by the second imaging device exceeds a threshold is predicted as the time zone in which the moving body will appear at the second imaging device.
(Appendix 7) The information processing method according to appendix 5 or appendix 6, wherein the time zone in which the moving body is predicted to appear is displayed over the video captured by the second imaging device, thereby notifying the time zone.
(Appendix 8) The information processing method according to appendix 5 or appendix 6, wherein the display of the video captured by the second imaging device is changed during the time zone in which the moving body is predicted to appear, thereby notifying the time zone.
(Appendix 9) A program that causes a computer to execute: a process of receiving video captured by a plurality of imaging devices; a process of detecting a moving body appearing in video captured by a first imaging device among the plurality of imaging devices; a process of predicting a time zone in which the moving body will appear in the video of a second imaging device among the plurality of imaging devices, based on the time transition of the probability that the moving body detected in the video of the first imaging device appears in video captured by the second imaging device, and on the time elapsed after the moving body detected by the first imaging device leaves the imaging range; and a process of notifying the time zone in which the moving body is predicted to appear in the video of the second imaging device.
(Appendix 10) The program according to appendix 9, wherein the time zone in which the probability that the moving body detected by the first imaging device appears in video captured by the second imaging device exceeds a threshold is predicted as the time zone in which the moving body will appear at the second imaging device.
(Appendix 11) The program according to appendix 9 or appendix 10, wherein the time zone in which the moving body is predicted to appear is displayed over the video captured by the second imaging device, thereby notifying the time zone.
(Appendix 12) The program according to appendix 9 or appendix 10, wherein the display of the video captured by the second imaging device is changed during the time zone in which the moving body is predicted to appear, thereby notifying the time zone.
Claims (6)
- An information processing system comprising:
input means for receiving video captured by a plurality of imaging devices;
detection means for detecting a moving body appearing in video captured by a first imaging device among the plurality of imaging devices, as input by the input means;
prediction means for predicting a time zone in which the moving body will appear in the video of a second imaging device among the plurality of imaging devices, based on the time transition of the probability that the moving body detected in the video of the first imaging device appears in video captured by the second imaging device, and on the time elapsed after the moving body detected by the first imaging device leaves the imaging range; and
notification means for notifying the time zone in which the moving body is predicted to appear in the video of the second imaging device.
- The information processing system according to claim 1, wherein the prediction means predicts, as the time zone in which the moving body will appear at the second imaging device, the time zone in which the probability that the moving body detected by the first imaging device appears in video captured by the second imaging device exceeds a threshold.
- The information processing system according to claim 1 or claim 2, wherein the notification means notifies the time zone by displaying, over the video captured by the second imaging device, the time zone in which the moving body is predicted to appear.
- The information processing system according to claim 1 or claim 2, wherein the notification means notifies the time zone by changing the display of the video captured by the second imaging device during the time zone in which the moving body is predicted to appear.
- An information processing method performed by an information processing system, comprising:
a step of receiving video captured by a plurality of imaging devices;
a step of detecting a moving body appearing in video captured by a first imaging device among the plurality of imaging devices;
a step of predicting a time zone in which the moving body will appear in the video of a second imaging device among the plurality of imaging devices, based on the time transition of the probability that the moving body detected in the video of the first imaging device appears in video captured by the second imaging device, and on the time elapsed after the moving body detected by the first imaging device leaves the imaging range; and
a step of notifying the time zone in which the moving body is predicted to appear in the video of the second imaging device.
- A program that causes a computer to execute:
a process of receiving video captured by a plurality of imaging devices;
a process of detecting a moving body appearing in video captured by a first imaging device among the plurality of imaging devices;
a process of predicting a time zone in which the moving body will appear in the video of a second imaging device among the plurality of imaging devices, based on the time transition of the probability that the moving body detected in the video of the first imaging device appears in video captured by the second imaging device, and on the time elapsed after the moving body detected by the first imaging device leaves the imaging range; and
a process of notifying the time zone in which the moving body is predicted to appear in the video of the second imaging device.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2013339935A AU2013339935A1 (en) | 2012-10-29 | 2013-08-30 | Information processing system, information processing method, and program |
JP2014544362A JP6233721B2 (ja) | 2012-10-29 | 2013-08-30 | 情報処理システム、情報処理方法及びプログラム |
EP13850572.2A EP2913997B1 (en) | 2012-10-29 | 2013-08-30 | Information processing system, information processing method, and program |
US14/439,312 US9633253B2 (en) | 2012-10-29 | 2013-08-30 | Moving body appearance prediction information processing system, and method |
ZA2015/03400A ZA201503400B (en) | 2012-10-29 | 2015-05-15 | Information processing system, information processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-237987 | 2012-10-29 | ||
JP2012237987 | 2012-10-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014069088A1 true WO2014069088A1 (ja) | 2014-05-08 |
Family
ID=50627009
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/073358 WO2014069088A1 (ja) | 2012-10-29 | 2013-08-30 | 情報処理システム、情報処理方法及びプログラム |
Country Status (6)
Country | Link |
---|---|
US (1) | US9633253B2 (ja) |
EP (1) | EP2913997B1 (ja) |
JP (1) | JP6233721B2 (ja) |
AU (1) | AU2013339935A1 (ja) |
WO (1) | WO2014069088A1 (ja) |
ZA (1) | ZA201503400B (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016178543A (ja) * | 2015-03-20 | 2016-10-06 | 国立大学法人岐阜大学 | 画像処理装置及び画像処理プログラム |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017163282A1 (ja) * | 2016-03-25 | 2017-09-28 | パナソニックIpマネジメント株式会社 | 監視装置及び監視システム |
CN109013677A (zh) * | 2018-08-04 | 2018-12-18 | 石修英 | 一种土壤有机污染环境监控系统 |
CN111291585B (zh) * | 2018-12-06 | 2023-12-08 | 杭州海康威视数字技术股份有限公司 | 一种基于gps的目标跟踪系统、方法、装置及球机 |
US11954941B2 (en) * | 2020-01-14 | 2024-04-09 | EMC IP Holding Company LLC | Facial recognition IOT application for multi-stream video using forecasting tools |
US11031044B1 (en) * | 2020-03-16 | 2021-06-08 | Motorola Solutions, Inc. | Method, system and computer program product for self-learned and probabilistic-based prediction of inter-camera object movement |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006081053A (ja) * | 2004-09-13 | 2006-03-23 | Mitsubishi Electric Corp | 移動体追跡支援システム |
JP2008219570A (ja) * | 2007-03-06 | 2008-09-18 | Matsushita Electric Ind Co Ltd | カメラ間連結関係情報生成装置 |
JP2009017416A (ja) | 2007-07-09 | 2009-01-22 | Mitsubishi Electric Corp | 監視装置及び監視方法及びプログラム |
JP2010161732A (ja) * | 2009-01-09 | 2010-07-22 | Toshiba Corp | 人物監視装置、人物監視方法、及び、人物監視システム |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5570100A (en) * | 1994-03-10 | 1996-10-29 | Motorola, Inc. | Method for providing a communication unit's estimated time of arrival |
JP2000069346A (ja) * | 1998-06-12 | 2000-03-03 | Canon Inc | カメラ制御装置、方法、カメラ、追尾カメラシステム及びコンピュ―タ読み取り可能な記憶媒体 |
GB2378339A (en) * | 2001-07-31 | 2003-02-05 | Hewlett Packard Co | Predictive control of multiple image capture devices. |
AU2006338248B2 (en) * | 2005-03-25 | 2011-01-20 | Sensormatic Electronics, LLC | Intelligent camera selection and object tracking |
US8212877B2 (en) | 2007-03-02 | 2012-07-03 | Fujifilm Corporation | Image capturing system, image capturing method, and computer program product at which an image is captured at a predetermined time |
TWI391801B (zh) * | 2008-12-01 | 2013-04-01 | Inst Information Industry | 接手視訊監控方法與系統以及電腦裝置 |
TWI405457B (zh) * | 2008-12-18 | 2013-08-11 | Ind Tech Res Inst | 應用攝影機換手技術之多目標追蹤系統及其方法,與其智慧節點 |
EP2418849B1 (en) * | 2009-04-10 | 2013-10-02 | Omron Corporation | Monitoring system, and monitoring terminal |
GB2485969A (en) * | 2010-11-12 | 2012-06-06 | Sony Corp | Video surveillance with anticipated arrival time of object in another camera view |
WO2014045843A1 (ja) * | 2012-09-19 | 2014-03-27 | 日本電気株式会社 | 画像処理システム、画像処理方法及びプログラム |
JPWO2014050432A1 (ja) * | 2012-09-27 | 2016-08-22 | 日本電気株式会社 | 情報処理システム、情報処理方法及びプログラム |
-
2013
- 2013-08-30 US US14/439,312 patent/US9633253B2/en active Active
- 2013-08-30 AU AU2013339935A patent/AU2013339935A1/en not_active Abandoned
- 2013-08-30 WO PCT/JP2013/073358 patent/WO2014069088A1/ja active Application Filing
- 2013-08-30 JP JP2014544362A patent/JP6233721B2/ja active Active
- 2013-08-30 EP EP13850572.2A patent/EP2913997B1/en active Active
-
2015
- 2015-05-15 ZA ZA2015/03400A patent/ZA201503400B/en unknown
Non-Patent Citations (1)
Title |
---|
See also references of EP2913997A4 |
Also Published As
Publication number | Publication date |
---|---|
EP2913997A1 (en) | 2015-09-02 |
JPWO2014069088A1 (ja) | 2016-09-08 |
US20150294140A1 (en) | 2015-10-15 |
US9633253B2 (en) | 2017-04-25 |
EP2913997A4 (en) | 2016-07-06 |
ZA201503400B (en) | 2016-11-30 |
EP2913997B1 (en) | 2021-09-29 |
JP6233721B2 (ja) | 2017-11-22 |
AU2013339935A1 (en) | 2015-05-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6741130B2 (ja) | 情報処理システム、情報処理方法及びプログラム | |
JP6213843B2 (ja) | 画像処理システム、画像処理方法及びプログラム | |
JP6347211B2 (ja) | 情報処理システム、情報処理方法及びプログラム | |
JP6233721B2 (ja) | 情報処理システム、情報処理方法及びプログラム | |
JP6406241B2 (ja) | 情報処理システム、情報処理方法及びプログラム | |
JP7131599B2 (ja) | 情報処理システム、情報処理方法及びプログラム | |
US9298987B2 (en) | Information processing apparatus, information processing method, program, and information processing system | |
JP6233624B2 (ja) | 情報処理システム、情報処理方法及びプログラム | |
JP6210234B2 (ja) | 画像処理システム、画像処理方法及びプログラム | |
JP2011048736A (ja) | 監視制御装置及び監視システム | |
WO2014050432A1 (ja) | 情報処理システム、情報処理方法及びプログラム | |
WO2014045670A1 (ja) | 画像処理システム、画像処理方法及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13850572 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2014544362 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 2013850572 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: DE |
WWE | Wipo information: entry into national phase |
Ref document number: 14439312 Country of ref document: US |
ENP | Entry into the national phase |
Ref document number: 2013339935 Country of ref document: AU Date of ref document: 20130830 Kind code of ref document: A |