WO2014021004A1 - Image processing system, image processing method, and program - Google Patents
- Publication number
- WO2014021004A1 (PCT/JP2013/066565)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- person
- input
- moving image
- image processing
- video camera
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/103—Static body considered as a whole, e.g. static pedestrian or occupant recognition
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/4316—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
Definitions
- Some aspects according to the present invention relate to an image processing system, an image processing method, and a program.
- Patent Document 1 discloses an apparatus that can appropriately track (monitor) a person across cameras using connection relationship information between the cameras. This apparatus obtains the correspondence between persons according to the similarity of person feature amounts between the point where a person appears in a camera's field of view (In point) and the point where the person disappears from it (Out point).
- One object of the present invention is to provide an image processing system capable of suppressing confusion related to the identification of a target person when performing person tracking.
- One image processing system includes an input unit that receives input of moving images captured by a plurality of video cameras, a registration unit that can register one or more persons appearing in the moving images input from the input unit, and a display control unit that displays the moving images input from the video cameras in a manner switchable for each person registered by the registration unit.
- An image processing method according to the present invention includes a step in which the image processing system receives input of moving images captured by a plurality of video cameras, a step of registering one or more persons appearing in the input moving images, and a step of displaying the moving images input from the video cameras in a manner switchable for each registered person.
- The program according to the present invention causes a computer to execute a process of receiving input of moving images captured by a plurality of video cameras, a process of registering one or more persons appearing in the input moving images, and a process of displaying the moving images input from the video cameras in a manner switchable for each registered person.
- In the present invention, “part”, “means”, “apparatus”, and “system” do not simply mean physical means; the functions of a “part”, “means”, “apparatus”, or “system” may also be realized by software. Further, the functions of one “part”, “means”, “apparatus”, or “system” may be realized by two or more physical means or devices, and the functions of two or more “parts”, “means”, “apparatuses”, or “systems” may be realized by a single physical means or device.
- According to the present invention, it is possible to provide an image processing system, an image processing method, and a program capable of suppressing confusion related to the identification of a target person when performing person tracking.
- (1. First embodiment) 1 to 5 are diagrams for explaining the first embodiment.
- the first embodiment will be described along the following flow with reference to these drawings.
- Section 1.1 describes the functional configuration of the system as a whole.
- Section 1.2 gives a specific example of the display screen, thereby providing an overview of the entire first embodiment.
- Section 1.3 shows a specific example of a hardware configuration that can realize the system, and Section 1.4 shows the flow of processing.
- Section 1.5 and subsequent sections describe the effects and the like of the present embodiment.
- FIG. 1 is a block diagram showing a functional configuration of the monitoring system 1.
- the monitoring system 1 is roughly composed of an information processing server 100 and a plurality of video cameras 200 that capture moving images (video cameras 200A to 200N are collectively referred to as video cameras 200).
- The video camera 200 captures (shoots) a moving image. Further, the video camera 200 determines whether or not a person appears in the captured moving image, and sends information such as the position and feature amount of that person in the moving image to the information processing server 100, together with the captured moving image.
- the video camera 200 can also track a person in the captured moving image. It should be noted that processes such as person detection, person feature extraction, and camera tracking may be performed on the information processing server 100 or other information processing apparatus (not shown).
- the information processing server 100 performs various processes such as detection of a person, registration of a person to be tracked, and tracking of a registered person by analyzing a moving image captured by the video camera 200.
- a case where person monitoring is performed based on a real-time moving image captured by the video camera 200 will be described as an example.
- the present invention is not limited to this.
- For example, it is also conceivable to monitor (analyze) moving images captured in the past by the video camera 200.
- The information processing server 100 includes a camera control unit 110, a similarity calculation unit 120, a tracking person registration unit 130, a next camera prediction unit 140, a display screen generation unit 150, an input device 160, a display device 170, and a database (DB) 180.
- the function of the information processing server 100 may be realized by a plurality of devices such as a server and a client.
- For example, the camera control (camera control unit 110), the registration processing of the person to be tracked (monitored) (tracking person registration unit 130), the prediction of the video camera 200 in which the person to be tracked will appear next (next camera prediction unit 140), the generation of the display screen (display screen generation unit 150), and the like may be performed on the server side, while the input by the user (monitor) (input device 160), the output of the display screen (display device 170), and the like may be performed on the client side.
- Various methods of sharing processing by the server / client can be considered.
- the camera control unit 110 controls the video camera 200. More specifically, based on a user instruction or the like input from the input device 160, a command for zooming in or zooming out, changing the shooting direction up, down, left, or right is transmitted to the video camera 200. Also, the moving image and the person detection information received from the video camera 200 are registered in the DB 180 as the captured moving image 181 and the detected person information 183.
- The similarity calculation unit 120 performs the process of detecting the monitored person by calculating the similarity between a person shown in the moving image input from the video camera 200 and the persons registered in the person tracking information 185. At this time, the similarity calculation unit 120 selects, from a plurality of person images related to each registered person (person images of the same person at a plurality of timings), a person image whose posture is similar to that of the person in the moving image input from the video camera 200, and then calculates the similarity. This makes it possible to improve the accuracy of the similarity calculation.
- Here, a similar posture specifically means, for example, a posture that can be discriminated to be the same or similar with respect to whether the person faces front, back, right, or left, whether the person is bending, whether the person overlaps with another person, and so on (that is, the parameters for discriminating these are close).
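The posture-matched selection described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the feature vectors, posture parameters, and distance/similarity functions are all assumptions.

```python
import math

def cosine_similarity(a, b):
    """Appearance similarity between two feature vectors (illustrative choice)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def posture_distance(p, q):
    """Euclidean distance between posture parameter vectors (illustrative)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(p, q)))

def similarity_to_registered(query_feat, query_posture, gallery):
    """gallery: list of (feature, posture) pairs for one registered person,
    taken at multiple timings.  Pick the gallery image whose posture is
    closest to the query's, then compare appearance features only against
    that image, as the text above describes."""
    best_feat, _ = min(gallery, key=lambda g: posture_distance(g[1], query_posture))
    return cosine_similarity(query_feat, best_feat)
```

Comparing against a posture-matched image rather than all stored images is what lets a front-facing query avoid being penalized for differing from a back-facing stored image of the same person.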
- The tracking person registration unit 130 registers, in the person tracking information 185 of the DB 180, a person shown in the captured moving image input from the video camera 200 as a target of monitoring/tracking, based on a user instruction input from the input device 160 or the like. When the user determines that a person shown in the captured moving image input from the video camera 200 is the same person as a person already registered in the person tracking information 185, the tracking person registration unit 130 can also register that association in the person tracking information 185.
- the next camera prediction unit 140 predicts to which video camera 200 a person who is reflected in a certain video camera 200 is reflected next.
- Various prediction methods are conceivable. For example, the prediction may be calculated based on the installation distance between the video cameras 200, the structure of the building, the walking speed of the person, and so on. Alternatively, it is conceivable to predict probabilistically by statistically processing information such as in which video camera 200 a person who left a given camera has appeared next in the past.
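The statistical variant mentioned above might be sketched as follows. The transition-time records and the scoring function are illustrative assumptions, not taken from the specification.

```python
def predict_next_camera(current_cam, elapsed_s, transitions):
    """transitions: {(from_cam, to_cam): [observed transit times in seconds]},
    accumulated from past tracking records.  Returns candidate cameras
    ranked by how well the elapsed time since the person left current_cam
    matches the historical mean transit time."""
    scores = {}
    for (src, dst), times in transitions.items():
        if src != current_cam or not times:
            continue
        mean = sum(times) / len(times)
        # Simple score: the closer to the historical mean, the higher.
        scores[dst] = 1.0 / (1.0 + abs(elapsed_s - mean))
    return sorted(scores, key=scores.get, reverse=True)
```

A production system would likely model full transit-time distributions rather than means, but the ranking idea is the same.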
- the display screen generation unit 150 generates a display screen to be displayed on the display device 170 as illustrated in FIGS. 2 and 3 to be described later.
- On this display screen, a window 21 is displayed for each person to be tracked, and the windows 21 can be switched by the tabs 25.
- In each window 21, moving images of the video cameras 200 in which the person to be tracked is shown, or in which the person is predicted to be shown in the near future, are arranged.
- The display screen generation unit 150 also causes the display device 170 to display a GUI (Graphical User Interface) with which the user can newly register a person shown in the moving image as a person to be tracked, or associate the person with an already registered person.
- the input device 160 is a device for a user (monitor) to input various information.
- a pointing device such as a mouse, a touch pad, or a touch panel, a keyboard, and the like correspond to the input device 160.
- Various processes such as registration of the target person by the tracking person registration unit 130, association with the registered person, and switching of the tab 25 are performed based on operations on the input device 160.
- The display device 170 is a display, such as a liquid crystal display or an organic EL (Electro Luminescence) display, that displays images.
- the display device 170 displays the display screen created by the display screen generation unit 150.
- The DB 180 is constructed on various storage devices such as an HDD (Hard Disk Drive) (not shown).
- the DB 180 manages the captured moving image 181, the detected person information 183, and the person tracking information 185.
- the captured moving image 181 is a moving image input from the video camera 200.
- Of the captured moving image 181, portions for which a certain period has elapsed after shooting, or portions in which it can be determined that no person is shown, may be deleted.
- The detected person information 183 is information such as the feature amount of a person detected by the video camera 200, the shooting time in the captured moving image 181, and a person image.
- The person tracking information 185 is information on persons registered as tracking targets by the tracking person registration unit 130, among the persons detected as the detected person information 183.
- When persons shown in the videos of a plurality of video cameras 200 are associated as the same person by the tracking person registration unit 130, that information is also registered in the person tracking information 185.
- FIGS. 2 and 3 are diagrams illustrating specific examples of a display screen (hereinafter also referred to as a monitoring screen 20) that the display device 170 displays for person monitoring. First, FIG. 2 will be described.
- The monitoring screen 20 includes a window 21 containing moving image display areas 23A to 23D (hereinafter also collectively referred to as the moving image display area 23), which display captured moving images input from a plurality of video cameras 200, and tabs 25A to 25C (hereinafter also collectively referred to as the tabs 25) for switching the windows 21.
- the moving image display area 23 arranged on the window 21 displays multi-camera images input from a plurality of video cameras 200 as described above.
- The video of the video camera 200 displayed in each moving image display area 23 may be switched at any time. For example, it is conceivable that, when the person being monitored moves out of a display area, the display switches to the video of the video camera 200 in which the person is predicted to appear next, or in which the person actually appears as he or she moves.
- the tab 25 is for switching the window 21.
- The windows 21 that can be switched by the tabs 25 are provided according to the persons to be monitored. In the example of FIG. 2, windows 21 (including the corresponding moving image display areas 23) are set for the three monitored persons registered by the tracking person registration unit 130, and can be switched by the tabs 25.
- When a monitored person other than the one currently presented to the user in the window 21 (the monitored person corresponding to the person image 25A1 in the example of FIG. 2) is shown in the moving image (video) from any of the video cameras 200, or when the next camera prediction unit 140 predicts that such a person will appear in the near future, the monitoring screen 20 prompts the user to switch the window 21. In the example of FIG. 2, the color of the tab 25B changes or blinks to prompt the user to switch to the tab 25B.
- FIG. 3 is a specific example when the window 21 is switched by the tab 25B.
- In the example of FIG. 3, a person P2 who is a monitored person (the monitored person corresponding to the person image 25B1) is shown in the moving image display area 23C.
- The method by which the display screen generation unit 150 notifies the user (monitor) of a change in the state of a monitored person (detection of a new monitored person, or a prediction that a monitored person will be detected in the near future) is not limited to this. For example, the notification may be made by displaying a window message or by voice. Alternatively, it is conceivable that the window 21 is forcibly switched, without any user operation, upon detection of a new monitored person or a prediction that a monitored person will be detected.
- FIG. 4 is a flowchart showing a processing flow of the information processing server 100 according to the present embodiment.
- Each processing step described below can be executed in any order or in parallel as long as no contradiction arises in the processing contents, and other steps may be added between the processing steps. Further, a step described as a single step for convenience can be executed divided into a plurality of steps, and steps described as divided into a plurality of steps for convenience can be executed as one step.
- First, the similarity calculation unit 120 determines whether or not a monitored person related to a window 21 other than the one displayed on the monitoring screen 20 has been detected (S401). For example, in the example of FIG. 2, the similarity calculation unit 120 determines whether or not the monitored persons related to the person images 25B1 and 25C1 have been detected.
- If such a person has been detected (Yes in S401), the display screen generation unit 150 prompts the user to switch the window 21 (switch the tab 25), for example by changing the color of the tab 25 or making it blink (S405).
- Likewise, if the next camera prediction unit 140 predicts that a monitored person related to another window 21 will appear in the near future (for example, within five seconds) (Yes in S403), the display screen generation unit 150 proceeds to S405 and prompts the user to switch the window 21.
- In S405, the display screen generation unit 150 highlights the tab 25 by changing its color or making it blink. Thereby, even when there are a plurality of persons to be tracked, each person can be monitored on a screen divided per person by the tabs 25, so that confusion can be prevented.
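On the display side, the S401 to S405 flow reduces to choosing which tabs to highlight. A minimal sketch follows; the window and person identifiers are illustrative.

```python
def tabs_to_highlight(active_tab, windows, detected, predicted):
    """windows: tab id -> monitored person id.
    detected:  person ids currently shown in some camera (S401).
    predicted: person ids expected to appear soon (S403).
    Returns the inactive tabs whose person warrants a switch prompt (S405)."""
    return [tab for tab, person in windows.items()
            if tab != active_tab and (person in detected or person in predicted)]
```

The active tab is excluded because its person is already on screen; only background windows need a prompt.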
- the function of the information processing server 100 can be realized by a plurality of information processing apparatuses (for example, a server and a client).
- the information processing server 100 includes a processor 501, a memory 503, a storage device 505, an input interface (I / F) 507, a data I / F 509, a communication I / F 511, and a display device 513.
- the processor 501 controls various processes in the information processing server 100 by executing a program stored in the memory 503. For example, the processes related to the camera control unit 110, the similarity calculation unit 120, the tracking person registration unit 130, the next camera prediction unit 140, and the display screen generation unit 150 described in FIG. 1 are temporarily stored in the memory 503. It can be realized as a program that mainly operates on the processor 501.
- the memory 503 is a storage medium such as a RAM (Random Access Memory).
- the memory 503 temporarily stores a program code of a program executed by the processor 501 and data necessary for executing the program. For example, a stack area necessary for program execution is secured in the storage area of the memory 503.
- the storage device 505 is a nonvolatile storage medium such as an HDD (Hard Disk Drive) or a flash memory.
- The storage device 505 stores an operating system, various programs for realizing the camera control unit 110, the similarity calculation unit 120, the tracking person registration unit 130, the next camera prediction unit 140, and the display screen generation unit 150, and various data managed as the DB 180, including the captured moving image 181, the detected person information 183, and the person tracking information 185.
- Programs and data stored in the storage device 505 are referred to by the processor 501 by being loaded into the memory 503 as necessary.
- the input I / F 507 is a device for receiving input from the user.
- the input device 160 described in FIG. 1 is realized by the input I / F 507.
- Specific examples of the input I / F 507 include a keyboard, a mouse, a touch panel, and various sensors.
- the input I / F 507 may be connected to the information processing server 100 via an interface such as USB (Universal Serial Bus).
- the data I / F 509 is a device for inputting data from outside the information processing server 100.
- Specific examples of the data I / F 509 include a drive device for reading data stored in various storage media.
- the data I / F 509 may be provided outside the information processing server 100. In this case, the data I / F 509 is connected to the information processing server 100 via an interface such as a USB.
- the communication I / F 511 is a device for performing data communication with a device external to the information processing server 100, such as a video camera 200, by wire or wireless. It is conceivable that the communication I / F 511 is provided outside the information processing server 100. In this case, the communication I / F 511 is connected to the information processing server 100 via an interface such as a USB.
- the display device 513 is a device for displaying various kinds of information such as the monitoring screen 20, for example, and is a liquid crystal display, an organic EL (Electro-Luminescence) display, or the like.
- the display device 513 may be provided outside the information processing server 100. In that case, the display device 513 is connected to the information processing server 100 via, for example, a display cable.
- the information processing server 100 displays the moving images input from the plurality of video cameras 200 on the window 21 generated for each monitoring target person.
- The windows 21 can each be switched by the tabs 25. When a person to be tracked (monitored person) whose window is not currently displayed appears in the video of a video camera 200, or when it can be predicted that such a person is likely to appear in the near future, the display screen generation unit 150 highlights the corresponding tab 25 by changing its color or making it blink. Thereby, even when there are a plurality of persons to be tracked, each person can be monitored on a screen divided per person by the tabs 25, so that confusion can be prevented.
- FIG. 6 is a block diagram illustrating a functional configuration of the monitoring apparatus 600 that is an image processing system.
- the monitoring device 600 includes an input unit 610, a registration unit 620, and a display control unit 630.
- the input unit 610 receives input of moving images captured by a plurality of video cameras.
- the registration unit 620 can register one or more persons shown in the moving image input from the input unit 610.
- the display control unit 630 displays the moving image input from the video camera in a switchable manner for each person registered by the registration unit 620.
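The three-unit structure of the monitoring device 600 described above can be sketched as a minimal class; the method names, data layout, and view representation are illustrative assumptions, not from the specification.

```python
class MonitoringDevice:
    """Sketch of the input / registration / display-control split
    (input unit 610, registration unit 620, display control unit 630)."""

    def __init__(self):
        self.registered = []   # persons registered as tracking targets
        self.frames = {}       # camera id -> latest frame from that camera

    def input_frame(self, camera_id, frame):
        # Input unit 610: receive moving images from a plurality of cameras.
        self.frames[camera_id] = frame

    def register(self, person_id):
        # Registration unit 620: register one or more persons in the video.
        if person_id not in self.registered:
            self.registered.append(person_id)

    def view_for(self, person_id):
        # Display control unit 630: one switchable view per registered person.
        if person_id not in self.registered:
            raise KeyError(person_id)
        return {"person": person_id, "cameras": dict(self.frames)}
```

Switching between registered persons then amounts to calling `view_for` with a different person id, mirroring the tab switching of the first embodiment.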
- According to the monitoring apparatus 600 of the present embodiment, it is possible to suppress confusion related to the identification of the target person when performing person tracking.
- the monitoring system 1 according to the third embodiment will be described.
- the same components as those in the first embodiment are denoted by the same reference numerals and description thereof is omitted.
- Description of points that are the same as in the first embodiment is omitted as appropriate.
- The following description will focus mainly on the differences between the monitoring system 1 according to the third embodiment and the monitoring system 1 according to the first embodiment.
- The functional configuration of the monitoring system 1 according to the present embodiment is basically the same as that of the first embodiment illustrated in FIG. 1, but differs in the display screen generated by the display screen generation unit 150 and displayed on the display device 170.
- FIG. 7 shows a specific example of a display screen generated by the display screen generation unit 150 according to the present embodiment.
- On the display screen generated by the display screen generation unit 150 according to the present embodiment, each window 31 contains moving images of the video cameras 200 in which the tracking target person is shown, or in which the person is predicted to appear in the near future, together with a time chart image.
- a specific example of a display screen displayed by the display device 170 in the present embodiment will be described with reference to FIGS. 7 and 8.
- In each window, a time chart image for person monitoring is also arranged (hereinafter, this display screen is also referred to as the monitoring screen 30).
- The monitoring screen 30 differs from the monitoring screen 20 in that it has moving image display areas 33A to 33D (hereinafter also collectively referred to as the moving image display area 33) in place of the moving image display area 23.
- The monitoring screen 30 also differs from the monitoring screen 20 in that the time chart image, instead of a moving image, is arranged in the moving image display area 33D.
- the time chart image shows in which time zone of each camera the person P2 corresponding to the tab 25B is detected.
- FIG. 8 is an enlarged view of the time chart image displayed in the moving image display area 33D.
- T1 is a time chart showing in which time zone of each camera the person corresponding to the tab of the currently displayed window is detected.
- numbers in the left column in the figure represent camera numbers
- T2 represents a time axis. For example, it is assumed that the interval between the scales is 5 seconds at T2, and the leftmost scale currently represents 10 o'clock.
- FIG. 8 shows the detection status of the person P2 corresponding to the tab 25B.
- T1 indicates that the person P2 was shown in the camera 1 from 10:00:05 to 10:00:10.
- Similarly, T1 indicates that the person P2 was shown in the camera 2 from 10:00:00 to 10:00:10.
- T3 is a knob for sliding the whole of T1 and T2 to the right or left (toward the future or the past).
- T4 is a group of buttons for selecting the monitoring images to be displayed in the moving image display areas 33A to 33C. In the examples of FIGS. 7 and 8, the monitoring images corresponding to the cameras 2, 4, and 6 are displayed in the moving image display areas 33A, 33B, and 33C, respectively.
- the button switching process at T4 is performed based on an operation on the input device 160. For example, when the input device 160 is a mouse, the display image may be switched by placing the cursor on the corresponding button and clicking. Alternatively, when the input device 160 is a touch panel, the display image may be switched by the user directly touching the button.
- The display screen generation unit 150 may generate the time chart image using the similarity calculation unit 120, the tracking person registration unit 130, the next camera prediction unit 140, the captured moving image 181, the detected person information 183, the person tracking information 185, and the like.
- For example, the display screen generation unit 150 may fill in the time slots of T1 during which the corresponding person was detected.
- Further, the display screen generation unit 150 may display the detection status for the current time slot in the second column from the right, and may display in the rightmost column the status predicted by the next camera prediction unit 140.
- the display screen generation unit 150 may display T1 in the moving image display area 33D so as to flow from right to left in real time.
- the display screen generation unit 150 may generate T1 not for real-time captured moving images but for accumulated past captured moving images (offline moving image data).
- As described above, the display screen generation unit 150 according to the present embodiment arranges the time chart image on the display screen. Thereby, it is possible to confirm at a glance in which time zone and by which camera the person to be monitored was detected.
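The T1 rows described above can be derived from per-camera detection records. The following sketch assumes detections arrive as (camera, start, end) tuples in seconds and fills 5-second slots as in FIG. 8; this data layout is an assumption for illustration.

```python
def time_chart(detections, slot_s=5, slots=8):
    """detections: list of (camera, start_s, end_s) for one monitored person.
    Returns {camera: [bool per slot]}: True where the detection interval
    overlaps the slot, mirroring the filled cells of T1 in FIG. 8."""
    chart = {}
    for cam, start, end in detections:
        row = chart.setdefault(cam, [False] * slots)
        for i in range(slots):
            s, e = i * slot_s, (i + 1) * slot_s
            if start < e and end > s:   # interval overlap test
                row[i] = True
    return chart
```

The real-time variant would simply shift the slot window forward each tick, which is how the chart could "flow from right to left" as the text suggests.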
- Further, the display screen generation unit 150 according to the present embodiment displays buttons with which the user can switch which camera's monitoring image is displayed in the moving image display area 33. Thereby, the user can arbitrarily switch the displayed image while checking the time chart.
- The functional configuration of the monitoring system 1 according to the present embodiment is basically the same as that of the first embodiment illustrated in FIG. 1, but differs in the display screen generated by the display screen generation unit 150 and displayed on the display device 170.
- FIG. 9 shows a specific example of a display screen generated by the display screen generation unit 150 according to the present embodiment.
- On the display screen (hereinafter also referred to as the monitoring screen 40) generated by the display screen generation unit 150 according to the present embodiment, each window 41 contains a map image together with moving images of the video cameras 200 in which the tracking target person is shown, or in which the person is predicted to be shown in the near future.
- The monitoring screen 40 differs from the monitoring screen 20 in that the moving image display area 23 is replaced with moving image display areas 43A to 43D (hereinafter also collectively referred to as the moving image display area 43), and in that a map image, rather than a moving image, is arranged in the moving image display area 43D.
- The map image shows the loci (movement trajectories) of the person corresponding to the tab 25A and the person corresponding to the tab 25B.
- the display screen generation unit 150 changes the map image in real time according to the detection and tracking results.
- the display screen generation unit 150 may display the color of the tab and the color of the locus displayed on the map image in a similar color for each monitoring target person.
- For example, the color of the tab 25A and the color of the locus of the person corresponding to the tab 25A in the map image are red, while the color of the tab 25B and the color of the locus of the person corresponding to the tab 25B are blue.
- In the example of FIG. 9, the person corresponding to the tab 25C is not detected on the map image; when that person is detected, the color of the tab 25C and the color of the locus of the person corresponding to the tab 25C in the map image may be, for example, yellow.
- Further, the display screen generation unit 150 may notify the user of the detection of a person by surrounding the person on the moving image with a rectangle of the corresponding similar color.
- the display screen generation unit 150 arranges the map image on the display screen and displays the movement trajectory of the monitoring target person. Thereby, the user can confirm the movement trajectory of the monitoring target person at a glance.
- the display screen generation unit 150 displays the tab color and the locus color displayed on the map image in similar colors for each person to be monitored. Thereby, the user can confirm the movement trajectory of the plurality of monitoring target persons more easily at a glance.
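The consistent per-person coloring described above amounts to assigning each monitored person one color and reusing it for the tab, the map trajectory, and the bounding rectangle. A minimal sketch; the palette and identifiers are illustrative.

```python
# Illustrative palette; the specification only gives red/blue/yellow as examples.
PALETTE = ["red", "blue", "yellow", "green", "purple"]

def assign_colors(person_ids):
    """Give each monitored person a single color, reused everywhere that
    person is drawn (tab, map locus, rectangle) so the UI elements match."""
    return {pid: PALETTE[i % len(PALETTE)] for i, pid in enumerate(person_ids)}
```

Deriving every UI element's color from one mapping, rather than styling each widget independently, is what guarantees the tab and trajectory never drift out of sync.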
- FIG. 10 is a diagram for explaining the fifth embodiment.
- the monitoring system 1 displays a monitoring screen (in the example of FIG. 10, a display screen similar to the monitoring screen 20 according to the first embodiment) on the mobile terminal 1000.
- the mobile terminal 1000 includes a notebook computer, a tablet terminal, a PDA, a mobile phone, a smartphone, a portable game machine, and the like.
- in this embodiment, the display of the mobile terminal 1000 is configured as a touch panel.
- the monitoring system 1 realizes, among the functions of the information processing server 100 illustrated in FIG. 1, at least the functions of the input device 160 and the display device 170 on the mobile terminal 1000.
- by linking the information processing server 100 and the mobile terminal 1000, the same functions as in the first embodiment can be realized.
- when the user traces the display by hand from the bottom of the screen toward the top (in the direction of the arrow in the figure), the monitoring system 1 may provide a user interface that switches the displayed window to that of the next monitored person in tab order.
- alternatively, a tab may be selected by directly touching the image area of that tab.
- by having a guard carry the mobile terminal 1000, the tracking status of a monitored person can be confirmed even outdoors, for example.
- DESCRIPTION OF SYMBOLS: 1 ... monitoring system, 20 ... monitoring screen, 21 ... window, 23A, 23B, 23C, 23D ... moving image display area, 25A, 25B, 25C ... tab, 25A1, 25B1, 25C1 ... person image, 100 ... information processing server, 110 ... camera control unit, 120 ... similarity calculation unit, 130 ... tracking person registration unit, 140 ... next camera prediction unit, 150 ... display screen generation unit, 160 ... input device, 170 ... display device, 180 ... database (DB), 181 ... moving image, 183 ... detected person information, 185 ... person tracking information, 200 ... video camera, P1, P2 ... person
Description
(1. First embodiment)
FIGS. 1 to 5 are diagrams for explaining the first embodiment. Hereinafter, the first embodiment will be described along the following flow with reference to these drawings. First, "1.1" shows the functional configuration of the entire system, and "1.2" gives a specific example of the display screen, thereby providing an overview of the entire first embodiment. "1.3" then shows the flow of processing, and "1.4" shows a specific example of a hardware configuration that can realize it. Finally, the effects and the like according to the present embodiment are described from "1.5" onward.
(1.1 Functional configuration)
The functional configuration of the monitoring system 1, which is an information processing system according to the present embodiment, will be described with reference to FIG. 1. FIG. 1 is a block diagram showing the functional configuration of the monitoring system 1.
The video camera 200 captures a moving image, determines whether or not a person appears in the captured moving image, and sends information on that person, such as its position and feature amount within the moving image, to the information processing server 100 together with the captured moving image. The video camera 200 can also track a person within the moving image it captures.
Note that processes such as person detection, extraction of person feature amounts, and person tracking within a camera may also be performed, for example, on the information processing server 100 or on another information processing apparatus not shown.
The captured moving image 181 is the moving image captured by the video camera 200.
The detected person information 183 is information such as the feature amount of a person detected by the video camera 200, the capture time within the captured moving image 181, and a person image.
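As a rough sketch of one record in the detected person information 183 store described above, the following illustrates the kind of per-detection entry it holds. The class and field names are illustrative assumptions, not taken from the publication:

```python
from dataclasses import dataclass, field

@dataclass
class DetectedPerson:
    """Hypothetical per-detection record: feature amount, capture time
    within the stored moving image 181, and a cropped person image."""
    person_id: int                 # identifier assigned by the tracking process
    camera_id: int                 # video camera 200 that produced the detection
    capture_time: float            # time (seconds) within captured moving image 181
    feature: list[float] = field(default_factory=list)  # appearance feature amount
    image_patch: bytes = b""       # cropped person image (e.g. JPEG bytes)

record = DetectedPerson(person_id=1, camera_id=3, capture_time=12.5,
                        feature=[0.12, 0.80, 0.33])
assert record.camera_id == 3
```

A real store would also persist these records in the database (DB) 180 alongside the moving image 181.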
(1.2 Specific examples of the display screen)
Specific examples of the display screen shown by the display device 170 will now be described with reference to FIGS. 2 and 3. FIGS. 2 and 3 are diagrams showing specific examples of a display screen (hereinafter also referred to as the monitoring screen 20) that the display device 170 displays for person monitoring. First, FIG. 2 will be described.
Alternatively, it is also conceivable to forcibly switch the window 21, without any user operation, upon detection of a new monitored person or upon prediction of such a detection.
(1.3 Process flow)
Next, the processing flow of the information processing server 100 will be described with reference to FIG. 4. FIG. 4 is a flowchart showing the processing flow of the information processing server 100 according to the present embodiment.
(1.4 Specific example of the hardware configuration)
An example of the hardware configuration of the information processing server 100 described above will now be given with reference to FIG. 5. As noted earlier, the functions of the information processing server 100 can also be realized by a plurality of information processing apparatuses (for example, a server and a client).
(1.5 Effects according to this embodiment)
As described above, the information processing server 100 according to the present embodiment displays the moving images input from the plurality of video cameras 200 on windows 21 generated for each monitored person, and the windows 21 can each be switched by means of the tabs 25. If a tracked person (monitored person) who is not on the main display appears in the video of a video camera 200, or can be predicted to appear there in the near future, the display screen generation unit 150 highlights the corresponding tab 25 by changing its color or making it blink. Thus, even when there are a plurality of tracked persons, each person can be monitored on a screen separated per person by the tabs 25, which prevents confusion.
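The tab-highlighting behavior described above can be sketched roughly as follows. This is assumed logic for illustration, not the publication's implementation; the style names and helper signature are hypothetical:

```python
def tab_style(person_id, detections, predictions, active_person):
    """Return a display style for the tab of `person_id`.

    detections   : set of person ids currently captured by some camera
    predictions  : set of person ids predicted to appear in the near future
    active_person: person whose window 21 is currently on the main display
    """
    if person_id == active_person:
        return "normal"            # already on the main display, no emphasis
    if person_id in detections:
        return "highlight"         # person is on camera now: change tab color
    if person_id in predictions:
        return "blink"             # expected soon: make the tab blink
    return "normal"

assert tab_style(2, {2, 3}, set(), active_person=1) == "highlight"
assert tab_style(4, set(), {4}, active_person=1) == "blink"
```

The prediction set here would be fed by something like the next camera prediction unit 140.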
(2 Second embodiment)
The second embodiment will be described below with reference to FIG. 6. FIG. 6 is a block diagram showing the functional configuration of a monitoring device 600, which is an image processing system. As shown in FIG. 6, the monitoring device 600 includes an input unit 610, a registration unit 620, and a display control unit 630.
The input unit 610 receives input of moving images captured by video cameras. The registration unit 620 can register one or more persons appearing in the moving images input from the input unit 610.
The display control unit 630 displays the moving images input from the video cameras in a switchable manner for each person registered by the registration unit 620.
(3 Third embodiment)
Next, the monitoring system 1 according to the third embodiment will be described. In the following description, configurations similar to those of the first embodiment are given the same reference numerals and their description is omitted. Likewise, description of operations and effects similar to those of the first embodiment is omitted where appropriate. The same applies to the fourth and fifth embodiments.
The description below focuses on the differences between the monitoring system 1 according to the third embodiment and the monitoring system 1 according to the first embodiment.
The display screen generation unit 150 according to the present embodiment also displays buttons for switching which camera's monitoring image is shown in the moving image display area 33. This allows the user to freely switch the displayed image while checking the time chart.
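A minimal sketch of those camera-selection buttons, under assumed names (the class and handler below are illustrative, not from the publication): pressing a button changes which camera's feed is rendered in the moving image display area 33 while the time chart stays visible.

```python
class VideoArea:
    """Hypothetical model of moving image display area 33 with per-camera buttons."""

    def __init__(self, cameras):
        self.cameras = cameras              # camera_id -> feed object/URL
        self.current = next(iter(cameras))  # initially show the first camera

    def on_button(self, camera_id):
        """Handler attached to the button for `camera_id`."""
        if camera_id in self.cameras:
            self.current = camera_id        # area 33 now renders this camera's feed
        return self.cameras[self.current]   # unknown ids leave the display unchanged

area = VideoArea({1: "cam1-feed", 2: "cam2-feed"})
assert area.on_button(2) == "cam2-feed"
assert area.on_button(99) == "cam2-feed"    # unrecognized button: display unchanged
```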
(4 Fourth embodiment)
Next, the monitoring system 1 according to the fourth embodiment will be described. The functional configuration of the monitoring system 1 according to the present embodiment is basically the same as that of the first embodiment shown in FIG. 1, but the display screen that the display screen generation unit 150 generates and causes the display device 170 to display differs. FIG. 9 shows a specific example of the display screen generated by the display screen generation unit 150 according to the present embodiment.
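The per-person color coordination described earlier for this embodiment's screen (a similar color shared by each person's tab, map trajectory, and rectangle in the moving image) can be sketched as below. The palette and the assignment order are assumptions chosen to match the FIG. 9 example (tab 25A red, tab 25B blue, tab 25C yellow):

```python
PALETTE = ["red", "blue", "yellow", "green", "purple"]  # assumed palette

def assign_colors(person_ids):
    """Map each monitored person to one fixed color, reused for that person's
    tab, map-image trajectory, and bounding rectangle."""
    return {pid: PALETTE[i % len(PALETTE)] for i, pid in enumerate(person_ids)}

colors = assign_colors(["25A", "25B", "25C"])
assert colors["25A"] == "red"      # tab 25A, its trajectory, and its rectangle
assert colors["25B"] == "blue"
assert colors["25C"] == "yellow"
```

Keeping the mapping fixed per person is what lets the user match a tab to a trajectory at a glance.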
(5 Fifth embodiment)
FIG. 10 is a diagram for explaining the fifth embodiment. As shown in FIG. 10, the monitoring system 1 according to the present embodiment displays a monitoring screen (in the example of FIG. 10, a display screen similar to the monitoring screen 20 according to the first embodiment) on a mobile terminal 1000. Here, the mobile terminal 1000 includes a notebook computer, a tablet terminal, a PDA, a mobile phone, a smartphone, a portable game machine, and the like. In the present embodiment, the display of the mobile terminal 1000 is configured as a touch panel.
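The touch interactions described for this embodiment, switching to the next monitored person's window when the user traces the screen from bottom to top, or selecting a tab by touching it directly, can be sketched as follows. This is illustrative assumed logic only; the class and method names are not from the publication:

```python
class TabSwitcher:
    """Hypothetical controller for tab switching on the touch panel."""

    def __init__(self, tabs):
        self.tabs = tabs        # tab order, e.g. ["25A", "25B", "25C"]
        self.index = 0          # currently displayed monitored person's tab

    def on_swipe_up(self):
        """A bottom-to-top trace advances to the next tab in order."""
        self.index = (self.index + 1) % len(self.tabs)
        return self.tabs[self.index]

    def on_touch_tab(self, tab):
        """Directly touching a tab's image area selects that tab."""
        self.index = self.tabs.index(tab)
        return self.tabs[self.index]

ui = TabSwitcher(["25A", "25B", "25C"])
assert ui.on_swipe_up() == "25B"       # swipe: 25A -> 25B
assert ui.on_touch_tab("25A") == "25A" # direct touch selects 25A
```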
(6 Appendix)
Note that the configurations of the above-described embodiments may be combined, or some of their components may be replaced. Moreover, the configuration of the present invention is not limited to the above-described embodiments, and various modifications may be made without departing from the gist of the present invention.
A part or all of each of the above-described embodiments can also be described as in the following supplementary notes, but is not limited to the following.
(Appendix 1)
An image processing system comprising: input means for receiving input of moving images captured by a plurality of video cameras; registration means capable of registering one or more persons appearing in the moving images input from the input means; and display control means for displaying the moving images input from the video cameras in a switchable manner for each person registered by the registration means.
(Appendix 2)
The image processing system according to Appendix 1, wherein the display control means displays the moving images input from the video cameras such that the windows respectively associated with the persons registered by the registration means can be switched.
(Appendix 3)
The image processing system according to Appendix 2, wherein the display control means prompts switching of the window when a person registered by the registration means is captured in a moving image input from the video camera.
(Appendix 4)
The image processing system according to Appendix 2 or Appendix 3, wherein the display control means prompts switching of the window when it can be predicted that a person registered by the registration means will appear in a moving image input from the video camera.
(Appendix 5)
The image processing system according to Appendix 2 or Appendix 3, wherein a plurality of moving images input from the input means are arranged on each of the windows.
(Appendix 6)
An image processing system comprising: input means for receiving input of a moving image captured by a video camera; and display control means for displaying the moving image input from the video camera in a switchable manner for each registered person, wherein the display control means displays, for each registered person, information on the video camera that captured the person and the time of capture.
(Appendix 7)
An image processing system comprising: input means for receiving input of a moving image captured by a video camera; and display control means for displaying the moving image input from the video camera in a switchable manner for each registered person, wherein the display control means displays map information representing the locus of movement of each registered person.
(Appendix 8)
The image processing system according to any one of Appendix 1 to Appendix 7, wherein the display control means displays information on a mobile terminal having a touch panel as its interface.
(Appendix 9)
An image processing method in which an image processing system performs: receiving input of moving images captured by a plurality of video cameras; registering one or more persons appearing in the input moving images; and displaying the moving images input from the video cameras in a switchable manner for each registered person.
(Appendix 10)
The image processing method according to Appendix 9, wherein the moving images input from the video cameras are displayed such that the windows respectively associated with the registered persons can be switched.
(Appendix 11)
The image processing method according to Appendix 10, wherein switching of the window is prompted when a person registered by the registration means is captured in a moving image input from the video camera.
(Appendix 12)
The image processing method according to Appendix 10 or Appendix 11, wherein switching of the window is prompted when it can be predicted that a person registered by the registration means will appear in a moving image input from the video camera.
(Appendix 13)
The image processing method according to Appendix 10 or Appendix 11, wherein a plurality of moving images input from the input means are arranged on each of the windows.
(Appendix 14)
An image processing method in which an image processing system performs receiving input of a moving image captured by a video camera and displaying the moving image input from the video camera in a switchable manner for each registered person, the image processing system displaying, for each registered person, information on the video camera that captured the person and the time of capture.
(Appendix 15)
An image processing method in which an image processing system performs receiving input of a moving image captured by a video camera and displaying the moving image input from the video camera in a switchable manner for each registered person, the image processing system displaying map information representing the locus of movement of each registered person.
(Appendix 16)
The image processing method according to any one of Appendix 9 to Appendix 15, wherein information is displayed on a mobile terminal having a touch panel as its interface.
(Appendix 17)
A program causing a computer to execute: a process of receiving input of moving images captured by a plurality of video cameras; a process of registering one or more persons appearing in the input moving images; and a process of displaying the moving images input from the video cameras in a switchable manner for each registered person.
(Appendix 18)
The program according to Appendix 17, wherein the moving images input from the video cameras are displayed such that the windows respectively associated with the registered persons can be switched.
(Appendix 19)
The program according to Appendix 18, which prompts switching of the window when a person registered by the registration means is captured in a moving image input from the video camera.
(Appendix 20)
The program according to Appendix 18 or Appendix 19, which prompts switching of the window when it can be predicted that a person registered by the registration means will appear in a moving image input from the video camera.
(Appendix 21)
The program according to Appendix 18 or Appendix 19, wherein a plurality of moving images input from the input means are arranged on each of the windows.
(Appendix 22)
A program causing a computer to execute a process of receiving input of a moving image captured by a video camera and a process of displaying the moving image input from the video camera in a switchable manner for each registered person, the program displaying, for each registered person, information on the video camera that captured the person and the time of capture.
(Appendix 23)
A program causing a computer to execute a process of receiving input of a moving image captured by a video camera and a process of displaying the moving image input from the video camera in a switchable manner for each registered person, the program displaying map information representing the locus of movement of each registered person.
(Appendix 24)
The program according to any one of Appendix 17 to Appendix 22, wherein information is displayed on a mobile terminal having a touch panel as its interface.
Claims (7)
- An image processing system comprising:
input means for receiving input of moving images captured by a plurality of video cameras;
registration means capable of registering one or more persons appearing in the moving images input from the input means; and
display control means for displaying the moving images input from the video cameras in a switchable manner for each person registered by the registration means.
- The image processing system according to claim 1, wherein the display control means displays the moving images input from the video cameras such that the windows respectively associated with the persons registered by the registration means can be switched.
- The image processing system according to claim 2, wherein the display control means prompts switching of the window when a person registered by the registration means is captured in a moving image input from the video camera.
- The image processing system according to claim 2 or claim 3, wherein the display control means prompts switching of the window when it can be predicted that a person registered by the registration means will appear in a moving image input from the video camera.
- The image processing system according to claim 2 or claim 3, wherein a plurality of moving images input from the input means are arranged on each of the windows.
- An image processing method in which an image processing system performs:
receiving input of moving images captured by a plurality of video cameras;
registering one or more persons appearing in the input moving images; and
displaying the moving images input from the video cameras in a switchable manner for each registered person.
- A program causing a computer to execute:
a process of receiving input of moving images captured by a plurality of video cameras;
a process of registering one or more persons appearing in the input moving images; and
a process of displaying the moving images input from the video cameras in a switchable manner for each registered person.
Priority Applications (12)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201380040754.3A CN104718749A (en) | 2012-07-31 | 2013-06-17 | Image processing system, image processing method, and program |
RU2015106938A RU2015106938A (en) | 2012-07-31 | 2013-06-17 | IMAGE PROCESSING SYSTEM, IMAGE PROCESSING METHOD AND PROGRAM |
US14/416,716 US10841528B2 (en) | 2012-07-31 | 2013-06-17 | Systems, methods and apparatuses for tracking persons by processing images |
JP2014528039A JP6332833B2 (en) | 2012-07-31 | 2013-06-17 | Image processing system, image processing method, and program |
BR112015001949-8A BR112015001949B1 (en) | 2012-07-31 | 2013-06-17 | IMAGE PROCESSING SYSTEM, IMAGE PROCESSING METHOD, AND STORAGE MEDIA |
SG11201500693QA SG11201500693QA (en) | 2012-07-31 | 2013-06-17 | Image processing system, image processing method, and program |
MX2015001292A MX2015001292A (en) | 2012-07-31 | 2013-06-17 | Image processing system, image processing method, and program. |
US16/286,430 US10778931B2 (en) | 2012-07-31 | 2019-02-26 | Image processing system, image processing method, and program |
US16/286,449 US10999635B2 (en) | 2012-07-31 | 2019-02-26 | Image processing system, image processing method, and program |
US16/378,081 US10750113B2 (en) | 2012-07-31 | 2019-04-08 | Image processing system, image processing method, and program |
US16/925,183 US11343575B2 (en) | 2012-07-31 | 2020-07-09 | Image processing system, image processing method, and program |
US17/563,261 US20220124410A1 (en) | 2012-07-31 | 2021-12-28 | Image processing system, image processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012170406 | 2012-07-31 | ||
JP2012-170406 | 2012-07-31 |
Related Child Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/416,716 A-371-Of-International US10841528B2 (en) | 2012-07-31 | 2013-06-17 | Systems, methods and apparatuses for tracking persons by processing images |
US16/286,430 Continuation US10778931B2 (en) | 2012-07-31 | 2019-02-26 | Image processing system, image processing method, and program |
US16/286,449 Continuation US10999635B2 (en) | 2012-07-31 | 2019-02-26 | Image processing system, image processing method, and program |
US16/378,081 Continuation US10750113B2 (en) | 2012-07-31 | 2019-04-08 | Image processing system, image processing method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014021004A1 true WO2014021004A1 (en) | 2014-02-06 |
Family
ID=50027694
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/066565 WO2014021004A1 (en) | 2012-07-31 | 2013-06-17 | Image processing system, image processing method, and program |
Country Status (10)
Country | Link |
---|---|
US (6) | US10841528B2 (en) |
JP (1) | JP6332833B2 (en) |
CN (1) | CN104718749A (en) |
AR (1) | AR091912A1 (en) |
BR (1) | BR112015001949B1 (en) |
MX (1) | MX2015001292A (en) |
MY (1) | MY171395A (en) |
RU (1) | RU2015106938A (en) |
SG (1) | SG11201500693QA (en) |
WO (1) | WO2014021004A1 (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015154465A (en) * | 2014-02-19 | 2015-08-24 | キヤノン株式会社 | Display control device, display control method, and program |
JP2016184229A (en) * | 2015-03-25 | 2016-10-20 | 東芝テック株式会社 | Demand prediction device and program |
CN107431786A (en) * | 2015-03-16 | 2017-12-01 | 佳能株式会社 | Image processing equipment, image processing system, image processing method and computer program |
JP2020191645A (en) * | 2020-07-16 | 2020-11-26 | 日本電気株式会社 | Information processing device, control method, and program |
JP2022033600A (en) * | 2020-08-17 | 2022-03-02 | 横河電機株式会社 | Device, system, method, and program |
WO2022154387A1 (en) * | 2021-01-13 | 2022-07-21 | 삼성전자 주식회사 | Electronic device and operation method therefor |
US11532160B2 (en) | 2016-11-07 | 2022-12-20 | Nec Corporation | Information processing apparatus, control method, and program |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017046192A (en) * | 2015-08-26 | 2017-03-02 | 株式会社リコー | Information processing device, program, and information processing method |
US10699422B2 (en) | 2016-03-18 | 2020-06-30 | Nec Corporation | Information processing apparatus, control method, and program |
US20180176512A1 (en) * | 2016-10-26 | 2018-06-21 | Ring Inc. | Customizable intrusion zones associated with security systems |
JP6873740B2 (en) * | 2017-02-24 | 2021-05-19 | ソニー・オリンパスメディカルソリューションズ株式会社 | Endoscope device and edge detection method |
EP3379471A1 (en) * | 2017-03-21 | 2018-09-26 | Canon Kabushiki Kaisha | Image processing apparatus, method of controlling image processing apparatus, and storage medium |
CN110798653B (en) * | 2018-08-01 | 2021-09-14 | 华为技术有限公司 | Image processing method and video monitoring system based on multi-machine cooperation |
JP6573346B1 (en) * | 2018-09-20 | 2019-09-11 | パナソニック株式会社 | Person search system and person search method |
CN110944109B (en) * | 2018-09-21 | 2022-01-14 | 华为技术有限公司 | Photographing method, device and equipment |
WO2021033703A1 (en) * | 2019-08-22 | 2021-02-25 | 日本電気株式会社 | Display control device, display control method, program, and display control system |
CN112579593A (en) * | 2019-09-30 | 2021-03-30 | 华为技术有限公司 | Population database sorting method and device |
US11954941B2 (en) * | 2020-01-14 | 2024-04-09 | EMC IP Holding Company LLC | Facial recognition IOT application for multi-stream video using forecasting tools |
JP7064642B1 (en) * | 2021-07-14 | 2022-05-10 | 日本コンピュータビジョン株式会社 | Information processing equipment, information processing methods and information processing programs |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08221592A (en) * | 1995-02-16 | 1996-08-30 | Matsushita Electric Ind Co Ltd | Interactive information providing device |
JP2001236514A (en) * | 2000-02-24 | 2001-08-31 | Nippon Hoso Kyokai <Nhk> | Automatic figure index generator |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7304662B1 (en) * | 1996-07-10 | 2007-12-04 | Visilinx Inc. | Video surveillance system and method |
US6061055A (en) | 1997-03-21 | 2000-05-09 | Autodesk, Inc. | Method of tracking objects with an imaging device |
US20030025599A1 (en) * | 2001-05-11 | 2003-02-06 | Monroe David A. | Method and apparatus for collecting, sending, archiving and retrieving motion video and still images and notification of detected events |
US6720990B1 (en) * | 1998-12-28 | 2004-04-13 | Walker Digital, Llc | Internet surveillance system and method |
JP4672104B2 (en) | 2000-04-26 | 2011-04-20 | パナソニック株式会社 | Digital image recording / playback device for monitoring |
JP4195991B2 (en) * | 2003-06-18 | 2008-12-17 | パナソニック株式会社 | Surveillance video monitoring system, surveillance video generation method, and surveillance video monitoring server |
US8174572B2 (en) * | 2005-03-25 | 2012-05-08 | Sensormatic Electronics, LLC | Intelligent camera selection and object tracking |
US7843491B2 (en) | 2005-04-05 | 2010-11-30 | 3Vr Security, Inc. | Monitoring and presenting video surveillance data |
US9036028B2 (en) * | 2005-09-02 | 2015-05-19 | Sensormatic Electronics, LLC | Object tracking and alerts |
JP2007241377A (en) | 2006-03-06 | 2007-09-20 | Sony Corp | Retrieval system, imaging apparatus, data storage device, information processor, picked-up image processing method, information processing method, and program |
JP4881766B2 (en) | 2007-03-06 | 2012-02-22 | パナソニック株式会社 | Inter-camera link relation information generation device |
JP4933354B2 (en) | 2007-06-08 | 2012-05-16 | キヤノン株式会社 | Information processing apparatus and information processing method |
US20090153654A1 (en) | 2007-12-18 | 2009-06-18 | Enge Amy D | Video customized to include person-of-interest |
US8601494B2 (en) | 2008-01-14 | 2013-12-03 | International Business Machines Corporation | Multi-event type monitoring and searching |
DK2260646T3 (en) * | 2008-03-28 | 2019-04-23 | On Net Surveillance Systems Inc | METHOD AND SYSTEMS FOR VIDEO COLLECTION AND ANALYSIS THEREOF |
JP5027759B2 (en) | 2008-08-19 | 2012-09-19 | 本田技研工業株式会社 | Visual support device for vehicle |
MX2012009579A (en) * | 2010-02-19 | 2012-10-01 | Toshiba Kk | Moving object tracking system and moving object tracking method. |
US9252897B2 (en) * | 2010-11-10 | 2016-02-02 | Verizon Patent And Licensing Inc. | Multi-feed event viewing |
GB2485969A (en) * | 2010-11-12 | 2012-06-06 | Sony Corp | Video surveillance with anticipated arrival time of object in another camera view |
US8908034B2 (en) | 2011-01-23 | 2014-12-09 | James Bordonaro | Surveillance systems and methods to monitor, recognize, track objects and unusual activities in real time within user defined boundaries in an area |
CN102170560A (en) * | 2011-02-25 | 2011-08-31 | 李兆全 | Radio-frequency-identification (RFID)-based closed circuit television system and monitoring method |
- 2013
  - 2013-06-17 CN CN201380040754.3A patent/CN104718749A/en active Pending
  - 2013-06-17 BR BR112015001949-8A patent/BR112015001949B1/en active IP Right Grant
  - 2013-06-17 SG SG11201500693QA patent/SG11201500693QA/en unknown
  - 2013-06-17 MX MX2015001292A patent/MX2015001292A/en unknown
  - 2013-06-17 JP JP2014528039A patent/JP6332833B2/en active Active
  - 2013-06-17 US US14/416,716 patent/US10841528B2/en active Active
  - 2013-06-17 RU RU2015106938A patent/RU2015106938A/en not_active Application Discontinuation
  - 2013-06-17 MY MYPI2015700249A patent/MY171395A/en unknown
  - 2013-06-17 WO PCT/JP2013/066565 patent/WO2014021004A1/en active Application Filing
  - 2013-07-26 AR ARP130102666A patent/AR091912A1/en active IP Right Grant
- 2019
  - 2019-02-26 US US16/286,449 patent/US10999635B2/en active Active
  - 2019-02-26 US US16/286,430 patent/US10778931B2/en active Active
  - 2019-04-08 US US16/378,081 patent/US10750113B2/en active Active
- 2020
  - 2020-07-09 US US16/925,183 patent/US11343575B2/en active Active
- 2021
  - 2021-12-28 US US17/563,261 patent/US20220124410A1/en active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08221592A (en) * | 1995-02-16 | 1996-08-30 | Matsushita Electric Ind Co., Ltd. | Interactive information providing device |
JP2001236514A (en) * | 2000-02-24 | 2001-08-31 | Nippon Hoso Kyokai <NHK> | Automatic figure index generator |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015154465A (en) * | 2014-02-19 | 2015-08-24 | キヤノン株式会社 | Display control device, display control method, and program |
CN107431786A (en) * | 2015-03-16 | 2017-12-01 | 佳能株式会社 | Image processing equipment, image processing system, image processing method and computer program |
US10572736B2 (en) | 2015-03-16 | 2020-02-25 | Canon Kabushiki Kaisha | Image processing apparatus, image processing system, method for image processing, and computer program |
JP2016184229A (en) * | 2015-03-25 | 2016-10-20 | 東芝テック株式会社 | Demand prediction device and program |
US11532160B2 (en) | 2016-11-07 | 2022-12-20 | Nec Corporation | Information processing apparatus, control method, and program |
JP2020191645A (en) * | 2020-07-16 | 2020-11-26 | 日本電気株式会社 | Information processing device, control method, and program |
JP7052833B2 (en) | 2020-07-16 | 2022-04-12 | 日本電気株式会社 | Information processing equipment, control methods, and programs |
JP2022033600A (en) * | 2020-08-17 | 2022-03-02 | 横河電機株式会社 | Device, system, method, and program |
US11657515B2 (en) | 2020-08-17 | 2023-05-23 | Yokogawa Electric Corporation | Device, method and storage medium |
JP7415848B2 (en) | 2020-08-17 | 2024-01-17 | 横河電機株式会社 | Apparatus, system, method and program |
WO2022154387A1 (en) * | 2021-01-13 | 2022-07-21 | Samsung Electronics Co., Ltd. | Electronic device and operation method therefor |
Also Published As
Publication number | Publication date |
---|---|
US20150208015A1 (en) | 2015-07-23 |
US20200344436A1 (en) | 2020-10-29 |
US10778931B2 (en) | 2020-09-15 |
BR112015001949B1 (en) | 2023-04-25 |
MY171395A (en) | 2019-10-10 |
US20190199957A1 (en) | 2019-06-27 |
JPWO2014021004A1 (en) | 2016-07-21 |
US10841528B2 (en) | 2020-11-17 |
US10750113B2 (en) | 2020-08-18 |
JP6332833B2 (en) | 2018-05-30 |
US10999635B2 (en) | 2021-05-04 |
AR091912A1 (en) | 2015-03-11 |
US20190238786A1 (en) | 2019-08-01 |
US11343575B2 (en) | 2022-05-24 |
US20220124410A1 (en) | 2022-04-21 |
CN104718749A (en) | 2015-06-17 |
SG11201500693QA (en) | 2015-04-29 |
MX2015001292A (en) | 2015-04-08 |
BR112015001949A2 (en) | 2017-07-04 |
RU2015106938A (en) | 2016-09-20 |
US20190199956A1 (en) | 2019-06-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6332833B2 (en) | Image processing system, image processing method, and program | |
US9870684B2 (en) | Information processing apparatus, information processing method, program, and information processing system for achieving a surveillance camera system | |
JP6210234B2 (en) | Image processing system, image processing method, and program | |
WO2014050432A1 (en) | Information processing system, information processing method and program | |
US9298987B2 (en) | Information processing apparatus, information processing method, program, and information processing system | |
JP6347211B2 (en) | Information processing system, information processing method, and program | |
US10623659B2 (en) | Image processing system, image processing method, and program | |
JPWO2014045670A1 (en) | Image processing system, image processing method, and program | |
JP2014170367A (en) | Object detection device, object detection method, object detection system and program | |
KR20210012634A (en) | Computer device to communicate with network system including plurality of cameras and method of operating thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 13825914 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14416716 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2014528039 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2015/001292 Country of ref document: MX |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2015106938 Country of ref document: RU Kind code of ref document: A |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112015001949 Country of ref document: BR |
|
122 | Ep: PCT application non-entry in European phase |
Ref document number: 13825914 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 112015001949 Country of ref document: BR Kind code of ref document: A2 Effective date: 20150128 |