WO2014045670A1 - Image processing system, image processing method, and program - Google Patents
Image processing system, image processing method, and program
- Publication number
- WO2014045670A1 (PCT/JP2013/068016)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- moving body
- video
- input
- registration
- person
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/14—Transformations for image registration, e.g. adjusting or mapping for alignment of images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- Some aspects according to the present invention relate to an image processing system, an image processing method, and a program.
- Patent Document 1 discloses an apparatus that can appropriately track (monitor) a person across cameras using information on the connection relationships between cameras. This apparatus obtains the correspondence between persons according to the similarity of person feature amounts between the point where a person appears in a camera's field of view (In point) and the point where a person disappears from a camera's field of view (Out point).
- An object of the present invention is to provide an image processing system, an image processing method, and a program capable of correcting the association of a person afterwards, even when the association between cameras turns out to be incorrect.
- An image processing system according to the present invention includes: an input unit that receives input of videos captured by a plurality of video cameras; a first registration unit capable of registering one or more moving bodies appearing in the videos input from the input unit; a first display control unit that displays the video input by the input unit on a display device; a second registration unit capable of registering that a moving body appearing in the displayed video and a moving body registered by the first registration unit are the same moving body; and a correction unit capable of correcting the correspondence relationship of moving bodies by associating a moving body registered as the same by the second registration unit with another, different moving body.
- In an image processing method according to the present invention, an image processing system performs: a step of receiving input of videos captured by a plurality of video cameras; a step of enabling registration of one or more moving bodies appearing in the input videos; a step of displaying the input video on a display device; a step of enabling registration that a moving body appearing in the displayed video and a registered moving body are the same moving body; and a step of correcting the correspondence relationship of moving bodies by associating a moving body registered as the same with another, different moving body.
- A program according to the present invention causes a computer to execute: a process of receiving input of videos captured by a plurality of video cameras; a process of enabling registration of one or more moving bodies appearing in the input videos; a process of displaying the input video on a display device; a process of enabling registration that a moving body appearing in the displayed video and a registered moving body are the same moving body; and a process of enabling correction of the correspondence relationship of moving bodies registered as the same.
- In the present invention, “unit”, “means”, “apparatus”, and “system” do not simply mean physical means; they also include cases where the functions of the “unit”, “means”, “apparatus”, or “system” are realized by software. Further, the functions of one “unit”, “means”, “apparatus”, or “system” may be realized by two or more physical means or devices, and conversely the functions of two or more “units”, “means”, “apparatuses”, or “systems” may be realized by a single physical means or device.
- According to the present invention, it is possible to provide an image processing system, an image processing method, and a program capable of correcting the association of a person across cameras afterwards, even when the correspondence between persons is wrong.
- It is a flowchart showing the processing flow of the information processing server shown in FIG. 1. It is a block diagram showing the hardware configuration on which the information processing server shown in FIG. 1 can be implemented. It is a functional block diagram showing the schematic configuration of the monitoring device according to the second embodiment.
- FIG. 1 is a block diagram showing a system configuration of the monitoring system 1.
- The monitoring system 1 broadly comprises an information processing server 100; a plurality of video cameras 200 (video cameras 200A to 200N, collectively referred to as the video camera 200) that capture video (moving images); a display device 300; and an input device 400.
- Although the monitoring system 1 is described here as a system for monitoring persons photographed by the video camera 200, the monitoring target is not limited to persons and may be any of various moving bodies, such as cars, bicycles, and motorcycles.
- The video camera 200 captures video (moving images), determines whether a person appears in the captured video, and transmits information such as that person's position and feature amount, together with the captured video, to the information processing server 100.
- The video camera 200 can also track a person within the video by comparing frames of the captured video.
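- The frame-to-frame comparison mentioned above can be sketched roughly as nearest-feature matching. The function below is an illustrative assumption, not the patent's actual algorithm: it greedily assigns each new detection to the closest existing track, or opens a new track when nothing is close enough.

```python
# Hypothetical sketch of in-camera tracking: each detection in the new
# frame is matched to the existing track whose last feature vector is
# closest, using Euclidean distance on fixed-length feature vectors.
# Greedy assignment is a simplification; real trackers solve this jointly.

def track_in_camera(tracks, detections, max_dist=0.5):
    """Assign each detection to the nearest existing track, or start a new one.

    tracks:     dict track_id -> last feature vector (list of floats)
    detections: list of feature vectors from the current frame
    Returns an updated dict of track_id -> feature vector.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    next_id = max(tracks, default=-1) + 1
    updated = dict(tracks)
    for feat in detections:
        # Find the closest known track (matched against the previous frame).
        best = min(tracks, key=lambda t: dist(tracks[t], feat), default=None)
        if best is not None and dist(tracks[best], feat) <= max_dist:
            updated[best] = feat          # continue the existing track
        else:
            updated[next_id] = feat       # a new person entered the view
            next_id += 1
    return updated
```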
- Processes such as person detection, feature amount extraction, and in-camera person tracking may be performed not by the video camera 200 but by, for example, the information processing server 100 or another information processing apparatus (not shown).
- the information processing server 100 performs various processes such as registration of the person to be tracked and tracking of the registered person by analyzing the video imaged by the video camera 200.
- The video processed by the information processing server 100 is not limited to real-time video captured by the video camera 200; it is also conceivable to track (analyze) video that was captured by the video camera 200 and then stored in a storage device (for example, an HDD (Hard Disk Drive) or a VCR (Video Cassette Recorder)). In particular, in order to make it possible to correct the registered correspondence of persons as described later, it is necessary to record the captured video and to redo the tracking process based on the recorded video.
- It is also conceivable that the video stored in the storage device is played back in reverse order (reverse playback) and tracked. Normally, when a person takes a suspicious action, it is necessary to examine by what route the person reached that point and what actions the person took, so enabling tracking by reverse playback is extremely important.
- The information processing server 100 outputs various display screens, such as a monitoring screen, to the display device 300, and accepts operation signals for the various operation inputs related to person monitoring from the input device 400. More specifically, the monitoring screen displayed on the display device 300 (a specific example is shown in FIG. 2, described later) displays a plurality of videos input from the video camera 200, so that the user acting as the monitor can grasp where the person to be monitored is now.
- When the user acting as the monitor looks at the display device 300 and sees the person to be monitored, who appeared in the video (moving image) of one video camera 200, appear in the video of another video camera 200, the user operates the input device 400 to associate the two persons as the same person.
- the user can operate the input device 400 to correct the association.
- The display device 300 is, for example, a display that shows images on a liquid crystal panel, an organic EL (Electro-Luminescence) panel, or the like.
- the display device 300 displays the monitoring screen output from the information processing server 100.
- the input device 400 is a device for a user (monitor) to input various information.
- a pointing device such as a mouse, a touch pad, or a touch panel, a keyboard, and the like correspond to the input device 400.
- Various processes, such as registering a person to be monitored, associating a registered person with a person appearing on the video camera 200 as the same person, and correcting that association, are performed based on the user's operations on the input device 400.
- The display device 300 and the input device 400 may be realized as a single client, or the functions of the information processing server 100, the display device 300, and the input device 400 may be realized by three or more information processing devices.
- the client may have some functions of the information processing server 100 according to the present embodiment.
- FIG. 2 is a diagram illustrating a specific example of a display screen (hereinafter, also referred to as a monitoring screen 20) that the display device 300 displays for person monitoring.
- The monitoring screen 20 includes video areas 21A to 21D (hereinafter also collectively referred to as the video area 21), which display the captured videos input from the plurality of video cameras 200, and a monitoring target display area 23, which displays the persons to be monitored.
- the video area 21 displays multi-camera video input from a plurality of video cameras 200 as described above.
- The video of the video camera 200 displayed in each video area 21 may be switched at any time. For example, when the person to be monitored moves out of the displayed area, control may be performed to switch to the video of the video camera 200 in which the person is predicted to appear next.
- The monitoring target display area 23 is an area in which persons selected by the monitoring user from among the persons shown in the video area 21 are registered as monitoring targets. As in the example of FIG. 2, a plurality of persons can be selected as monitoring targets (three in the example of FIG. 2). For each person, a plurality of person images with different postures, such as front and back views (two in the example of FIG. 2), are displayed in the monitoring target display area 23.
- a pop-up window 25 is displayed above the person P shown in the video area 21D.
- The pop-up window 25 is used to select whether or not to monitor the person P, and whether or not the person P is the same person as a person already registered as a monitoring target in the monitoring target display area 23.
- The pop-up window 25 is displayed in the vicinity of the newly detected person P, and its position moves as the person P moves. Arranging the pop-up window 25 near the target person P makes it easy for the user to identify the person P to be newly registered and to compare the person P with the registered monitoring targets, and also narrows the required operation range.
- FIG. 3 is a diagram illustrating a specific example of the pop-up window 25.
- In the pop-up window 25, person images 31A to 31C (hereinafter collectively referred to as the person image 31), which are thumbnails of the monitoring targets assigned ID1, ID2, and ID3, are lined up together with a blank image 33 assigned ID4.
- If one of the person images 31 is selected, the information processing server 100 associates the monitoring target corresponding to that person image 31 with the person P shown in the video as the same person. If the blank image 33 is selected, the information processing server 100 registers the person P appearing in the video as a new monitoring target with ID4.
- The pop-up window 25 also includes registration buttons 35A to 35C (indicated as “rem”, meaning remove, in the figure; hereinafter collectively referred to as the registration button 35), with which the user can register that the already registered monitoring targets corresponding to ID1, ID2, and ID3 and the person P shown in the video are not the same person.
- The persons arranged in the pop-up window 25 may be ordered by descending likelihood of matching the person P, or only the N most likely candidates may be shown. The likelihood of matching the person P can be calculated by the association estimation described later.
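- As a rough illustration of this ordering, candidates could be ranked by their estimated match probability and truncated to the top N. The function name and data layout below are assumptions for illustration:

```python
# Hypothetical sketch of ordering the pop-up candidates: monitoring
# targets are sorted by their estimated probability of matching the
# detected person P, and only the top N are shown.

def top_candidates(match_prob, n=3):
    """match_prob: dict person_id -> probability of matching person P.
    Returns the n person ids with the highest probability, best first."""
    ranked = sorted(match_prob, key=match_prob.get, reverse=True)
    return ranked[:n]
```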
- the monitoring system 1 provides a user interface (UI) that enables the registered association to be corrected.
- FIGS. 4 and 5 are diagrams showing an example of a user interface for the user to register corrections.
- The video area 21 in FIGS. 4 and 5 corresponds to the video area 21 of the monitoring screen 20 described in FIG. 2. The interfaces shown in FIGS. 4 and 5 let the user input two kinds of information: whether correction is necessary, and which moving body was erroneously tracked.
- In the example of FIG. 4, after the user selects an image 41 identifying the person to be monitored, a correction registration button 43 for correcting the association of that person is displayed on the display screen. The example of FIG. 5 reverses this order: it shows a specific example of the display screen when the user first selects the correction registration button 43, which is always displayed on the monitoring screen 20, and then selects the monitoring target whose association is to be corrected.
- The information processing server 100 then causes the display device 300 to display a screen that allows the user to select where the association for that person should be corrected.
- FIGS. 6 and 7 are diagrams illustrating specific examples of an input screen on which the user inputs how far back the association should be corrected (i.e., inputs the correction time).
- In the example of FIG. 6, a plurality of person images associated with the person P designated as the correction target by the user are displayed in time series. Five images are arranged going back from the current time, and the shooting time and the camera that captured each image are shown at the bottom of each image.
- the example in FIG. 7 shows the movement path estimated based on the current association information and the captured image for the person P designated as the correction target by the user.
- In FIG. 7, the movement path obtained by tracking within a camera is indicated by a solid line, the movement path between cameras, estimated from the user's association of the person, is indicated by a broken line, and the hatched areas indicate the shooting ranges of the video cameras 200.
- FIG. 8 shows a screen for registering the person to be correctly associated after the correction.
- On this screen, a plurality of images of persons that could be associated (five in the example of FIG. 8) are arranged in descending order of likelihood. For each person, a plurality of person images with different shooting directions can be arranged (three in the example of FIG. 8, for example, selected from among left, right, front, and back).
- FIG. 9 is a functional block diagram showing a functional configuration of the information processing server 100 according to the present embodiment.
- the information processing server 100 roughly includes an image input unit 110, a registration / correction instruction input unit 120, a person tracking processing unit 130, and a display control unit 140.
- the image input unit 110 sequentially receives input of images taken by the video camera 200 and the result of person detection.
- The person detection result received as input includes, for each person extracted as an object, information on the movement trajectory tracked within the same angle of view (time-series data of the person's position, together with feature amount information), obtained by comparing the feature amounts of persons detected in the video across time-series images (frames).
- The registration/correction instruction input unit 120 receives various inputs, such as monitoring target registration instructions, monitoring target association instructions, and association correction instructions, based on operation signals received from the input device 400 in response to the user's operations. At the time of a correction, for example, information on the person (moving body) at the point where tracking went wrong and information on the person who is the correct tracking target, as described above, are input via the registration/correction instruction input unit 120.
- The display control unit 140 displays various display screens on the display device 300 based on the results of the person tracking processing performed by the person tracking processing unit 130, thereby providing the user interfaces illustrated in FIGS. 2 to 8 to the monitoring user.
- the display control unit 140 includes a moving image display unit 141 and a movement history generation unit 143.
- the moving image display unit 141 reproduces the video input from the image input unit 110 and the captured video 134A recorded in the DB 134, and displays the video on the video area 21 illustrated in a specific example in FIG.
- the movement history generation unit 143 generates a movement history area included in the display screen of FIG. 7 based on the tracking result of the person analyzed by the moving body tracking unit 136.
- The person tracking processing unit 130 performs processing related to person tracking based on the video and person detection results input from the image input unit 110 and the user instructions input from the registration/correction instruction input unit 120. As shown in FIG. 9, the person tracking processing unit 130 includes a feature amount extraction unit 131, a feature amount selection unit 132, an association estimation unit 133, a database (DB) 134, an information deletion unit 135, and a moving body tracking unit 136.
- the feature amount extraction unit 131 extracts information related to the feature amount of the person from the video and the person detection result input from the image input unit 110.
- the feature amount selection unit 132 stores the feature amount extracted by the feature amount extraction unit 131 in the DB 134 as feature amount information 134D as necessary.
- In response to an operation input from the registration/correction instruction input unit 120, the feature amount selection unit 132 also reads from the DB 134 information on persons that are highly likely to correspond to the feature amount input from the feature amount extraction unit 131, and outputs it to the moving body tracking unit 136.
- The association estimation unit 133 registers information on the operations input from the registration/correction instruction input unit 120 in the DB 134 as association operation information 134B. The association estimation unit 133 also estimates person correspondences (combinations of associations) based on information such as the feature amounts stored in the DB 134. In addition to the association information input by the user, the correspondence can be estimated based on the similarity of the feature amounts of persons appearing in the videos captured by the respective video cameras 200, and on a comparison between the elapsed time actually taken to move between two video cameras and the time normally required for that movement.
- For example, if the feature amounts are similar, it is highly likely that the two persons correspond (are the same person). Likewise, if the difference between the time normally required to move from the shooting range of the video camera 200A to the shooting range of the video camera 200N (which may be given in advance, or calculated statistically as an average value) and the time actually taken for the movement (the difference between time t+1 and time t) is small, it is considered likely that the two persons correspond.
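- A minimal sketch of combining these two cues might look as follows. The multiplicative form, the scoring functions, and the decay constant are illustrative assumptions; the patent does not specify a formula:

```python
# Hypothetical association score combining the two cues described above:
# feature-amount similarity, and agreement between the observed
# inter-camera travel time and the time normally required.

import math

def association_score(feat_a, feat_b, observed_time, expected_time,
                      time_scale=10.0):
    """Higher score = more likely the two observations are the same person."""
    # Feature similarity: inverse of Euclidean distance, in (0, 1].
    d = math.sqrt(sum((x - y) ** 2 for x, y in zip(feat_a, feat_b)))
    feat_sim = 1.0 / (1.0 + d)
    # Travel-time agreement: decays as the observed travel time deviates
    # from the average time required to move between the two cameras.
    time_sim = math.exp(-abs(observed_time - expected_time) / time_scale)
    return feat_sim * time_sim
```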
- The association estimation unit 133 calculates the probability of each association by evaluating, for each person, the likelihood of each candidate association and whether a consistent correspondence relationship combining those associations can be established, and registers the result in the DB 134 as association probability information 134C.
- When the association estimation unit 133 receives a person-correspondence correction instruction from the registration/correction instruction input unit 120, it recalculates the association probability information 134C based on that instruction. The recalculated probabilities can then be reflected, for example, in the display of the pop-up window 25 described with reference to FIG. 3 or in the display screen described with reference to FIG. 8.
- When there is a contradiction in the associations (including corrected associations) input by the user, the association estimation unit 133 notifies the user of that fact via a message display or an audio notification through the display control unit 140. A contradictory correspondence arises, for example, when person A appears on camera 1 at time t, person B appears on camera 2 at time t, person C appears on camera 3 at time t+1, and the user associates both person A and person B with person C.
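- The contradiction in this example can be detected mechanically: the same registered person cannot appear on two different cameras at the same time. A minimal sketch, with an assumed record layout of (time, camera, person):

```python
# Hypothetical consistency check: if two detections made at the same
# time on different cameras are both associated with the same registered
# person, the associations contradict each other.

def find_contradictions(associations):
    """associations: list of (time, camera, registered_person_id).
    Returns registered person ids that appear on two or more cameras
    at the same time."""
    seen = {}                      # (time, person) -> set of cameras
    for t, cam, person in associations:
        seen.setdefault((t, person), set()).add(cam)
    return sorted({p for (t, p), cams in seen.items() if len(cams) > 1})
```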
- The information deletion unit 135 deletes, as needed, information that can be determined to be unnecessary from the various information registered in the DB 134. Conditions for deletion include, for example, feature amount information 134D for which a certain time has passed since extraction, and association probability information 134C and association operation information 134B for which a certain time has passed without correction. In addition, when a correspondence correction is instructed by the user, the information deletion unit 135 deletes the association operation information 134B and association probability information 134C from before the correction.
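- The time-based deletion condition could be sketched as a simple retention check; the record layout and retention period here are illustrative assumptions:

```python
# Hypothetical sketch of the age-based deletion condition: records whose
# age exceeds a retention period are purged from the store.

def purge_expired(records, now, max_age):
    """records: list of dicts with a 'created' timestamp.
    Keeps only records younger than max_age at time `now`."""
    return [r for r in records if now - r["created"] < max_age]
```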
- The moving body tracking unit 136 performs person tracking on the video input from the image input unit 110, based on the feature amounts output from the feature amount selection unit 132. If a correspondence correction is instructed by the user, the moving body tracking unit 136 redoes the person tracking process from the correction time to the current time based on the corrected correspondence.
- FIG. 10 is a flowchart showing a processing flow of the information processing server 100 according to the present embodiment.
- Each processing step described below can be executed in any order, or in parallel, as long as the processing contents are not contradicted, and other steps may be added between processing steps. Further, a step described as a single step for convenience can be executed divided into a plurality of steps, and steps described as divided for convenience can be executed as a single step.
- First, the registration/correction instruction input unit 120 determines whether a correction instruction has been input by the user via the input device 400 (S1001). This corresponds to the state in which the user has selected the image 41 and the correction registration button 43 on the display screens of FIGS. 4 and 5.
- If a correction instruction has been input, the display control unit 140 reads the association operation information 134B, which is the past association history, from the DB 134 (S1003), and causes the display device 300 to display the person images associated so far (S1005). Specific examples of this interface are those described above with reference to FIGS. 6 and 7.
- The display control unit 140 also reads the association probability information 134C from the DB 134 and causes the display device 300 to display, on the display screen illustrated in FIG. 8, correction candidates with a high association probability for the person and time to be corrected.
- The association estimation unit 133 refers to the association operation information 134B and checks whether there is any inconsistency among the associations (S1015). If there is a contradiction (Yes in S1017), the display control unit 140 displays a message on the display device 300 indicating that the correspondence needs to be corrected (raises an alert) (S1019). If a correction instruction is then input by the user (Yes in S1021), the processing from S1003 onward is performed for the person causing the contradiction.
- After the correction, the association estimation unit 133 recalculates the association probabilities based on the corrected correspondence (S1023) and registers them in the DB 134 as association probability information 134C (S1025). At this time, the video from the correction time to the current time may be played back at higher speed and displayed in the video area 21 so that monitoring at the current time can be resumed quickly.
- the information processing server 100 includes a processor 1101, a memory 1103, a storage device 1105, an input interface (I / F) 1107, a data I / F 1109, a communication I / F 1111, and a display device 1113.
- the processor 1101 controls various processes in the information processing server 100 by executing a program stored in the memory 1103.
- The processing of units such as the display control unit 140 can be realized as programs that are temporarily stored in the memory 1103 and then run mainly on the processor 1101.
- the memory 1103 is a storage medium such as a RAM (Random Access Memory).
- the memory 1103 temporarily stores a program code of a program executed by the processor 1101 and data necessary for executing the program. For example, in the storage area of the memory 1103, a stack area necessary for program execution is secured.
- the storage device 1105 is a non-volatile storage medium such as a hard disk or a flash memory.
- The storage device 1105 stores an operating system; various programs for realizing the registration/correction instruction input unit 120, the image input unit 110, the feature amount extraction unit 131, the feature amount selection unit 132, the association estimation unit 133, the information deletion unit 135, the moving body tracking unit 136, and the display control unit 140; and various data included in the DB 134, such as the captured video 134A, the association operation information 134B, the association probability information 134C, and the feature amount information 134D.
- Programs and data stored in the storage device 1105 are loaded into the memory 1103 as necessary and referred to by the processor 1101.
- the input I / F 1107 is a device for receiving input from the user.
- the input device 400 described in FIG. 1 can also be realized by the input I / F 1107.
- Specific examples of the input I / F 1107 include a keyboard, a mouse, a touch panel, and various sensors.
- the input I / F 1107 may be connected to the information processing server 100 via an interface such as USB (Universal Serial Bus), for example.
- The data I/F 1109 is a device for inputting data from outside the information processing server 100.
- Specific examples of the data I/F 1109 include drive devices for reading data stored in various storage media.
- The data I/F 1109 may be provided outside the information processing server 100; in that case, the data I/F 1109 is connected to the information processing server 100 via an interface such as USB.
- The communication I/F 1111 is a device for wired or wireless data communication with devices external to the information processing server 100, such as the video camera 200.
- The communication I/F 1111 may be provided outside the information processing server 100; in that case, the communication I/F 1111 is connected to the information processing server 100 via an interface such as USB.
- The display device 1113 is a device for displaying various kinds of information.
- The display device 300 described with reference to FIG. 1 can also be realized by the display device 1113.
- Specific examples of the display device 1113 include a liquid crystal display and an organic EL (Electro-Luminescence) display.
- The display device 1113 may be provided outside the information processing server 100; in that case, the display device 1113 is connected to the information processing server 100 via, for example, a display cable.
- As described above, in the monitoring system 1 according to the present embodiment, monitored persons can be associated with one another by receiving input from the user, such as information indicating that two persons are the same person. Furthermore, when the user notices that an association is incorrect, the user can retroactively modify the association for that person, so an erroneous correspondence can be repaired after the fact.
- FIG. 12 is a block diagram illustrating the functional configuration of a monitoring device 1200, which is an image processing system.
- The monitoring device 1200 includes an input unit 1210, a first registration unit 1220, a display control unit 1230, a second registration unit 1240, and a correction unit 1250.
- The input unit 1210 receives input of video captured by a plurality of video cameras (not shown).
- The first registration unit 1220 can register one or more moving bodies appearing in the video input from the input unit 1210. This registration is performed, for example, based on a user instruction.
- The display control unit 1230 displays the video input by the input unit 1210 on a display device (not shown).
- The second registration unit 1240 can register that a moving body appearing in the video displayed by the display control unit 1230 and a moving body registered by the first registration unit 1220 are the same moving body. This registration is performed, for example, based on a user instruction.
- The correction unit 1250 makes the correspondence between moving bodies correctable by associating one moving body of a correspondence registered as the same moving body by the second registration unit 1240 with another moving body different from the other moving body of that correspondence.
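The division of labor among the five units above can be illustrated with a minimal sketch. This is an illustrative assumption and not the patent's actual implementation; the class and method names (`MonitoringDevice`, `register`, `mark_same`, `correct`) are invented for the example, and display/input handling is reduced to plain dictionaries.

```python
class MonitoringDevice:
    """Hypothetical sketch of the registration/correction units of device 1200."""

    def __init__(self):
        self._registered = {}    # first registration unit: body id -> label
        self._associations = {}  # second registration unit: track id -> body id

    def register(self, body_id, label):
        # First registration unit 1220: register a moving body seen in the video.
        self._registered[body_id] = label

    def mark_same(self, track_id, registered_id):
        # Second registration unit 1240: record that an observed track shows
        # the same moving body as a previously registered one.
        if registered_id not in self._registered:
            raise KeyError(f"unknown registered body {registered_id}")
        self._associations[track_id] = registered_id

    def correct(self, track_id, new_registered_id):
        # Correction unit 1250: retroactively re-associate a track with a
        # different registered moving body, fixing an erroneous correspondence.
        if track_id not in self._associations:
            raise KeyError(f"track {track_id} has no association to correct")
        self.mark_same(track_id, new_registered_id)


dev = MonitoringDevice()
dev.register("person_A", "person in grey coat")
dev.register("person_B", "person in blue coat")
dev.mark_same("cam2_track7", "person_A")  # user: "this is the same person"
dev.correct("cam2_track7", "person_B")    # user later notices the mistake
print(dev._associations)                  # {'cam2_track7': 'person_B'}
```

The point of keeping `correct` as a thin wrapper over `mark_same` is that a correction is just a replacement association applied after the fact, which matches the retroactive repair described above.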
- (Appendix 1) An image processing system comprising: input means for receiving input of video captured by a plurality of video cameras; first registration means capable of registering one or more moving bodies appearing in the video input from the input means; first display control means for displaying the video input by the input means on a display device; second registration means capable of registering that a moving body appearing in the displayed video and a moving body registered by the first registration means are the same moving body; and correction means for making the correspondence between moving bodies correctable by associating one moving body of a correspondence registered as the same moving body by the second registration means with another moving body different from the other moving body.
- (Appendix 2) The image processing system according to appendix 1, further comprising calculation means for calculating a feature amount of a moving body appearing in the video input by the input means, wherein the second registration means enables, according to the feature amount calculated by the calculation means, registration that the moving body appearing in the displayed video and the moving body registered by the first registration means are the same moving body.
- (Appendix 3) The image processing system according to appendix 2, further comprising: storage means for storing the feature amount; and deletion means for deleting the feature amount stored in the storage means according to a condition.
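The storage and conditional-deletion means of appendices 2 and 3 can be sketched as a small feature store. This is a hedged illustration under assumed names (`FeatureStore`, `store`, `delete_where`); the patent does not specify the deletion condition, so an age threshold is used purely as an example.

```python
import time


class FeatureStore:
    """Illustrative store of per-track feature amounts with
    condition-based deletion (names are assumptions, not the patent's)."""

    def __init__(self):
        self._features = {}  # track_id -> (timestamp, feature vector)

    def store(self, track_id, feature, now=None):
        # Storage means: keep the feature amount with the time it was computed.
        self._features[track_id] = (now if now is not None else time.time(), feature)

    def delete_where(self, condition):
        # Deletion means: remove every entry whose (timestamp, feature)
        # pair satisfies the supplied condition; return how many were removed.
        doomed = [k for k, v in self._features.items() if condition(*v)]
        for k in doomed:
            del self._features[k]
        return len(doomed)


store = FeatureStore()
store.store("t1", [0.1, 0.9], now=100.0)
store.store("t2", [0.4, 0.2], now=500.0)
# Example condition: delete feature amounts computed before time 200.
removed = store.delete_where(lambda ts, feat: ts < 200.0)
print(removed, sorted(store._features))  # 1 ['t2']
```

Passing the condition as a callable keeps the deletion rule separate from the storage itself, mirroring the claim's "according to a condition" phrasing.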
- (Appendix 4) The image processing system according to any one of appendix 1 to appendix 3, further comprising second display control means for displaying, on a display device, the movement history of the moving body registered by the first registration means according to the correspondences registered by the second registration means and the correspondences corrected by the correction means.
- (Appendix 5) The image processing system according to any one of appendix 1 to appendix 4, wherein the correction means makes the correspondence between moving bodies correctable by displaying the moving bodies to be corrected on the display device in time series.
- (Appendix 6) The image processing system according to any one of appendix 1 to appendix 4, wherein the correction means makes the correspondence between moving bodies correctable by displaying the movement history of the moving body to be corrected on a display device.
- (Appendix 8) An image processing method in which an image processing system performs: a step of receiving input of video captured by a plurality of video cameras; a step of enabling registration of one or more moving bodies appearing in the input video; a step of displaying the input video on a display device; a step of enabling registration that a moving body appearing in the displayed video and a registered moving body are the same moving body; and a step of making the correspondence between moving bodies correctable by associating one moving body of a correspondence registered as the same moving body with another moving body different from the other moving body.
- (Appendix 9) The image processing method according to appendix 8, further comprising a step of calculating a feature amount of a moving body appearing in the input video, wherein registration that the moving body appearing in the displayed video and the registered moving body are the same moving body is enabled according to the calculated feature amount.
- (Appendix 11) The image processing method according to appendix 9 or appendix 10, further comprising a step of displaying, on a display device, the movement history of the registered moving body according to the registered correspondences and the corrected correspondences of the moving body.
- (Appendix 12) The image processing method according to any one of appendix 8 to appendix 11, wherein the moving bodies to be corrected are displayed on the display device in time series, thereby making the correspondence between moving bodies correctable.
- (Appendix 15) A program causing a computer to execute: a process of receiving input of video captured by a plurality of video cameras; a process of enabling registration of one or more moving bodies appearing in the input video; a process of displaying the input video on a display device; a process of enabling registration that a moving body appearing in the displayed video and a registered moving body are the same moving body; and a process of making the correspondence between moving bodies correctable by associating one moving body of a correspondence registered as the same moving body with another moving body different from the other moving body.
- (Appendix 16) The program according to appendix 15, further comprising a process of calculating a feature amount of a moving body appearing in the input video, wherein registration that the moving body appearing in the displayed video and the registered moving body are the same moving body is enabled according to the calculated feature amount.
- (Appendix 17) The program according to appendix 16, further comprising: a process of storing the feature amount; and a process of deleting the stored feature amount according to a condition.
- (Appendix 19) The program according to any one of appendix 15 to appendix 18, wherein the moving bodies to be corrected are displayed on the display device in time series, thereby making the correspondence between moving bodies correctable.
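The "time series" display that several of the appendices rely on for correction can be sketched as follows. This is a hedged illustration: the record fields (`camera`, `time`, `track`) and the `timeline` helper are assumptions invented for the example, standing in for whatever presentation the system actually uses.

```python
# Appearances of the moving body under correction, as they might arrive
# from several cameras in arbitrary order.
appearances = [
    {"camera": 3, "time": "09:12:40", "track": "c3_t2"},
    {"camera": 1, "time": "09:10:05", "track": "c1_t9"},
    {"camera": 2, "time": "09:11:30", "track": "c2_t4"},
]


def timeline(apps):
    """Return the appearances sorted by timestamp for display,
    so the operator can see where the chain of associations broke."""
    return sorted(apps, key=lambda a: a["time"])


for a in timeline(appearances):
    print(f'{a["time"]}  camera {a["camera"]}  track {a["track"]}')
```

Ordering the candidate appearances chronologically is what lets the user spot the point at which a wrong association was made and pick the appearance to re-associate.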
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Analysis (AREA)
Abstract
Description
FIGS. 1 to 11 are diagrams for explaining a first embodiment. Hereinafter, the present embodiment is described with reference to these drawings in the following order. First, an overview of the system configuration is given in "1.1", and an overview of the operation is given in "1.2" together with specific examples of display screens. The functional configuration of the system is then described in "1.3", the flow of processing in "1.4", and a specific example of a hardware configuration capable of realizing the system in "1.5". Finally, the effects and other aspects of the present embodiment are described from "1.6" onward.
The system configuration of the monitoring system 1, which is the image processing system according to the present embodiment, is described with reference to FIG. 1. FIG. 1 is a block diagram showing the system configuration of the monitoring system 1.
(1.2.1 Registration of correspondences)
Hereinafter, specific examples of the display screen that the display device 300 displays for person monitoring are described with reference to FIGS. 2 and 3. FIG. 2 is a diagram showing a specific example of a display screen (hereinafter also referred to as the monitoring screen 20) that the display device 300 displays for person monitoring.
Through the operations described in "1.2.1", the user registers persons to be monitored and associates observations of a monitored person with one another. However, when another person resembles the monitored person in appearance, or because of an operation mistake, the user may associate the wrong person. The monitoring system 1 according to the present embodiment therefore provides a user interface (UI) that allows a registered association to be corrected.
Next, the functional configuration of the information processing server 100 according to the present embodiment is described with reference to FIG. 9. FIG. 9 is a functional block diagram showing the functional configuration of the information processing server 100 according to the present embodiment.
Hereinafter, the flow of processing performed by the information processing server 100 when correcting an association is described with reference to FIG. 10. FIG. 10 is a flowchart showing the flow of processing of the information processing server 100 according to the present embodiment.
Hereinafter, an example of a hardware configuration for realizing the above-described information processing server 100 by a computer is described with reference to FIG. 11. As noted above, the functions of the information processing server 100 can also be realized by a plurality of information processing apparatuses.
As described above, in the monitoring system 1 according to the present embodiment, monitored persons can be associated with one another by receiving input from the user, such as information indicating that two persons are the same person. Furthermore, when the user notices that an association is incorrect, the user can retroactively modify the association for that person, so an erroneous correspondence can be repaired after the fact.
Hereinafter, a second embodiment is described with reference to FIG. 12. FIG. 12 is a block diagram showing the functional configuration of a monitoring device 1200, which is an image processing system. As shown in FIG. 12, the monitoring device 1200 includes an input unit 1210, a first registration unit 1220, a display control unit 1230, a second registration unit 1240, and a correction unit 1250.
The configurations of the above-described embodiments may be combined with one another, or some of their components may be replaced. The configuration of the present invention is not limited to the above-described embodiments, and various modifications may be made without departing from the gist of the present invention.
(Appendix 1) An image processing system comprising: input means for receiving input of video captured by a plurality of video cameras; first registration means capable of registering one or more moving bodies appearing in the video input from the input means; first display control means for displaying the video input by the input means on a display device; second registration means capable of registering that a moving body appearing in the displayed video and a moving body registered by the first registration means are the same moving body; and correction means for making the correspondence between moving bodies correctable by associating one moving body of a correspondence registered as the same moving body by the second registration means with another moving body different from the other moving body.
(Appendix 2) The image processing system according to appendix 1, further comprising calculation means for calculating a feature amount of a moving body appearing in the video input by the input means, wherein the second registration means enables, according to the feature amount calculated by the calculation means, registration that the moving body appearing in the displayed video and the moving body registered by the first registration means are the same moving body.
(Appendix 3) The image processing system according to appendix 2, further comprising: storage means for storing the feature amount; and deletion means for deleting the feature amount stored in the storage means according to a condition.
(Appendix 4) The image processing system according to any one of appendix 1 to appendix 3, further comprising second display control means for displaying, on a display device, the movement history of the moving body registered by the first registration means according to the correspondences registered by the second registration means and the correspondences corrected by the correction means.
(Appendix 5) The image processing system according to any one of appendix 1 to appendix 4, wherein the correction means makes the correspondence between moving bodies correctable by displaying the moving bodies to be corrected on a display device in time series.
(Appendix 6) The image processing system according to any one of appendix 1 to appendix 4, wherein the correction means makes the correspondence between moving bodies correctable by displaying the movement history of the moving body to be corrected on a display device.
(Appendix 7) The image processing system according to any one of appendix 1 to appendix 6, further comprising notification means for notifying the user when a correction of a correspondence by the correction means causes a contradiction in the corrected correspondences.
(Appendix 8) An image processing method in which an image processing system performs: a step of receiving input of video captured by a plurality of video cameras; a step of enabling registration of one or more moving bodies appearing in the input video; a step of displaying the input video on a display device; a step of enabling registration that a moving body appearing in the displayed video and a registered moving body are the same moving body; and a step of making the correspondence between moving bodies correctable by associating one moving body of a correspondence registered as the same moving body with another moving body different from the other moving body.
(Appendix 9) The image processing method according to appendix 8, further comprising a step of calculating a feature amount of a moving body appearing in the input video, wherein registration that the moving body appearing in the displayed video and the registered moving body are the same moving body is enabled according to the calculated feature amount.
(Appendix 10) The image processing method according to appendix 9, further comprising: a step of storing the feature amount; and a step of deleting the stored feature amount according to a condition.
(Appendix 11) The image processing method according to appendix 9 or appendix 10, further comprising a step of displaying, on a display device, the movement history of the registered moving body according to the registered correspondences and the corrected correspondences of the moving body.
(Appendix 12) The image processing method according to any one of appendix 8 to appendix 11, wherein the moving bodies to be corrected are displayed on a display device in time series, thereby making the correspondence between moving bodies correctable.
(Appendix 13) The image processing method according to any one of appendix 8 to appendix 11, wherein the movement history of the moving body to be corrected is displayed on a display device, thereby making the correspondence between moving bodies correctable.
(Appendix 14) The image processing method according to any one of appendix 8 to appendix 13, further comprising a step of notifying the user when a correction of a correspondence causes a contradiction in the corrected correspondences.
(Appendix 15) A program causing a computer to execute: a process of receiving input of video captured by a plurality of video cameras; a process of enabling registration of one or more moving bodies appearing in the input video; a process of displaying the input video on a display device; a process of enabling registration that a moving body appearing in the displayed video and a registered moving body are the same moving body; and a process of making the correspondence between moving bodies correctable by associating one moving body of a correspondence registered as the same moving body with another moving body different from the other moving body.
(Appendix 16) The program according to appendix 15, further comprising a process of calculating a feature amount of a moving body appearing in the input video, wherein registration that the moving body appearing in the displayed video and the registered moving body are the same moving body is enabled according to the calculated feature amount.
(Appendix 17) The program according to appendix 16, further comprising: a process of storing the feature amount; and a process of deleting the stored feature amount according to a condition.
(Appendix 18) The program according to appendix 16 or appendix 17, further comprising a process of displaying, on a display device, the movement history of the registered moving body according to the registered correspondences and the corrected correspondences of the moving body.
(Appendix 19) The program according to any one of appendix 15 to appendix 18, wherein the moving bodies to be corrected are displayed on a display device in time series, thereby making the correspondence between moving bodies correctable.
(Appendix 20) The program according to any one of appendix 15 to appendix 18, wherein the movement history of the moving body to be corrected is displayed on a display device, thereby making the correspondence between moving bodies correctable.
(Appendix 21) The program according to any one of appendix 15 to appendix 20, further comprising a process of notifying the user when a correction of a correspondence causes a contradiction in the corrected correspondences.
Claims (9)
- An image processing system comprising:
input means for receiving input of video captured by a plurality of video cameras;
first registration means capable of registering one or more moving bodies appearing in the video input from the input means;
first display control means for displaying the video input by the input means on a display device;
second registration means capable of registering that a moving body appearing in the displayed video and a moving body registered by the first registration means are the same moving body; and
correction means for making the correspondence between moving bodies correctable by associating one moving body of a correspondence registered as the same moving body by the second registration means with another moving body different from the other moving body.
- The image processing system according to claim 1, further comprising calculation means for calculating a feature amount of a moving body appearing in the video input by the input means, wherein the second registration means enables, according to the feature amount calculated by the calculation means, registration that the moving body appearing in the displayed video and the moving body registered by the first registration means are the same moving body.
- The image processing system according to claim 2, further comprising: storage means for storing the feature amount; and deletion means for deleting the feature amount stored in the storage means according to a condition.
- The image processing system according to any one of claims 1 to 3, further comprising second display control means for displaying, on a display device, the movement history of the moving body registered by the first registration means according to the correspondences registered by the second registration means and the correspondences corrected by the correction means.
- The image processing system according to any one of claims 1 to 4, wherein the correction means makes the correspondence between moving bodies correctable by displaying the moving bodies to be corrected on a display device in time series.
- The image processing system according to any one of claims 1 to 4, wherein the correction means makes the correspondence between moving bodies correctable by displaying the movement history of the moving body to be corrected on a display device.
- The image processing system according to any one of claims 1 to 6, further comprising notification means for notifying the user when a correction of a correspondence by the correction means causes a contradiction in the corrected correspondences.
- An image processing method in which an image processing system performs: a step of receiving input of video captured by a plurality of video cameras; a step of enabling registration of one or more moving bodies appearing in the input video; a step of displaying the input video on a display device; a step of enabling registration that a moving body appearing in the displayed video and a registered moving body are the same moving body; and a step of making the correspondence between moving bodies correctable by associating one moving body of a correspondence registered as the same moving body with another moving body different from the other moving body.
- A program causing a computer to execute: a process of receiving input of video captured by a plurality of video cameras; a process of enabling registration of one or more moving bodies appearing in the input video; a process of displaying the input video on a display device; a process of enabling registration that a moving body appearing in the displayed video and a registered moving body are the same moving body; and a process of making the correspondence between moving bodies correctable by associating one moving body of a correspondence registered as the same moving body with another moving body different from the other moving body.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/428,692 US9396538B2 (en) | 2012-09-19 | 2013-07-01 | Image processing system, image processing method, and program |
JP2014536633A JPWO2014045670A1 (ja) | 2012-09-19 | 2013-07-01 | 画像処理システム、画像処理方法及びプログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012205909 | 2012-09-19 | ||
JP2012-205909 | 2012-09-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014045670A1 true WO2014045670A1 (ja) | 2014-03-27 |
Family
ID=50340996
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/068016 WO2014045670A1 (ja) | 2012-09-19 | 2013-07-01 | 画像処理システム、画像処理方法及びプログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US9396538B2 (ja) |
JP (1) | JPWO2014045670A1 (ja) |
WO (1) | WO2014045670A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016132705A1 (ja) * | 2015-02-20 | 2016-08-25 | パナソニックIpマネジメント株式会社 | 追跡支援装置、追跡支援システムおよび追跡支援方法 |
JP2016201758A (ja) * | 2015-04-14 | 2016-12-01 | パナソニックIpマネジメント株式会社 | 施設内人物捜索支援装置、施設内人物捜索支援システムおよび施設内人物捜索支援方法 |
WO2017043056A1 (ja) * | 2015-09-07 | 2017-03-16 | パナソニックIpマネジメント株式会社 | 運転支援方法およびそれを利用した運転支援装置とプログラム |
WO2018198373A1 (ja) * | 2017-04-28 | 2018-11-01 | 株式会社日立国際電気 | 映像監視システム |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6210234B2 (ja) * | 2012-09-19 | 2017-10-11 | 日本電気株式会社 | 画像処理システム、画像処理方法及びプログラム |
JP6897691B2 (ja) * | 2016-12-22 | 2021-07-07 | 日本電気株式会社 | 追跡支援装置、端末、追跡支援システム、追跡支援方法及びプログラム |
JP7094702B2 (ja) * | 2018-01-12 | 2022-07-04 | キヤノン株式会社 | 画像処理装置及びその方法、プログラム |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006155402A (ja) * | 2004-11-30 | 2006-06-15 | Sanyo Electric Co Ltd | セキュリティシステム |
JP2006229465A (ja) * | 2005-02-16 | 2006-08-31 | Matsushita Electric Ind Co Ltd | 監視装置、監視方法、監視用プログラム |
JP2006236255A (ja) * | 2005-02-28 | 2006-09-07 | Mitsubishi Electric Corp | 人物追跡装置および人物追跡システム |
JP2009098774A (ja) * | 2007-10-15 | 2009-05-07 | Mitsubishi Electric Corp | 人物追跡システム及び人物追跡方法及び人物追跡プログラム |
JP2010080993A (ja) * | 2008-09-23 | 2010-04-08 | Brother Ind Ltd | インターホンシステム |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE69738287T2 (de) * | 1996-09-20 | 2008-06-12 | Hitachi, Ltd. | Verfahren zum Anzeigen eines sich bewegenden Objekts, dessen Bahn zu identifizieren ist, Anzeigesystem unter Verwendung dieses Verfahrens und Programmaufzeichnungsmedium dafür |
US6445409B1 (en) * | 1997-05-14 | 2002-09-03 | Hitachi Denshi Kabushiki Kaisha | Method of distinguishing a moving object and apparatus of tracking and monitoring a moving object |
US7787013B2 (en) * | 2004-02-03 | 2010-08-31 | Panasonic Corporation | Monitor system and camera |
JP4685465B2 (ja) * | 2005-02-01 | 2011-05-18 | パナソニック株式会社 | 監視記録装置 |
KR100883065B1 (ko) * | 2007-08-29 | 2009-02-10 | 엘지전자 주식회사 | 모션 검출에 의한 녹화 제어장치 및 방법 |
JP5173915B2 (ja) | 2009-04-08 | 2013-04-03 | 三洋電機株式会社 | 画像処理装置及び撮像装置 |
KR101434768B1 (ko) * | 2010-02-19 | 2014-08-27 | 가부시끼가이샤 도시바 | 이동 물체 추적 시스템 및 이동 물체 추적 방법 |
US10645344B2 (en) * | 2010-09-10 | 2020-05-05 | Avigilion Analytics Corporation | Video system with intelligent visual display |
- 2013
- 2013-07-01 JP JP2014536633A patent/JPWO2014045670A1/ja active Pending
- 2013-07-01 US US14/428,692 patent/US9396538B2/en active Active
- 2013-07-01 WO PCT/JP2013/068016 patent/WO2014045670A1/ja active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006155402A (ja) * | 2004-11-30 | 2006-06-15 | Sanyo Electric Co Ltd | セキュリティシステム |
JP2006229465A (ja) * | 2005-02-16 | 2006-08-31 | Matsushita Electric Ind Co Ltd | 監視装置、監視方法、監視用プログラム |
JP2006236255A (ja) * | 2005-02-28 | 2006-09-07 | Mitsubishi Electric Corp | 人物追跡装置および人物追跡システム |
JP2009098774A (ja) * | 2007-10-15 | 2009-05-07 | Mitsubishi Electric Corp | 人物追跡システム及び人物追跡方法及び人物追跡プログラム |
JP2010080993A (ja) * | 2008-09-23 | 2010-04-08 | Brother Ind Ltd | インターホンシステム |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016132705A1 (ja) * | 2015-02-20 | 2016-08-25 | パナソニックIpマネジメント株式会社 | 追跡支援装置、追跡支援システムおよび追跡支援方法 |
JP2016154306A (ja) * | 2015-02-20 | 2016-08-25 | パナソニックIpマネジメント株式会社 | 追跡支援装置、追跡支援システムおよび追跡支援方法 |
GB2550509A (en) * | 2015-02-20 | 2017-11-22 | Panasonic Ip Man Co Ltd | Tracking assistance device, tracking assistance system, and tracking assistance method |
US10181197B2 (en) | 2015-02-20 | 2019-01-15 | Panasonic Intellectual Property Management Co., Ltd. | Tracking assistance device, tracking assistance system, and tracking assistance method |
RU2696855C2 (ru) * | 2015-02-20 | 2019-08-07 | Панасоник Интеллекчуал Проперти Менеджмент Ко., Лтд. | Устройство поддержки отслеживания, система поддержки отслеживания и способ поддержки отслеживания |
GB2550509B (en) * | 2015-02-20 | 2020-08-26 | Panasonic Ip Man Co Ltd | Tracking assistance device, tracking assistance system, and tracking assistance method |
JP2016201758A (ja) * | 2015-04-14 | 2016-12-01 | パナソニックIpマネジメント株式会社 | 施設内人物捜索支援装置、施設内人物捜索支援システムおよび施設内人物捜索支援方法 |
WO2017043056A1 (ja) * | 2015-09-07 | 2017-03-16 | パナソニックIpマネジメント株式会社 | 運転支援方法およびそれを利用した運転支援装置とプログラム |
JP2017055181A (ja) * | 2015-09-07 | 2017-03-16 | パナソニックIpマネジメント株式会社 | 運転支援方法およびそれを利用した運転支援装置 |
WO2018198373A1 (ja) * | 2017-04-28 | 2018-11-01 | 株式会社日立国際電気 | 映像監視システム |
JPWO2018198373A1 (ja) * | 2017-04-28 | 2019-11-14 | 株式会社日立国際電気 | 映像監視システム |
Also Published As
Publication number | Publication date |
---|---|
JPWO2014045670A1 (ja) | 2016-08-18 |
US9396538B2 (en) | 2016-07-19 |
US20150248751A1 (en) | 2015-09-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014045670A1 (ja) | 画像処理システム、画像処理方法及びプログラム | |
US10750113B2 (en) | Image processing system, image processing method, and program | |
JP6210234B2 (ja) | 画像処理システム、画像処理方法及びプログラム | |
US11908293B2 (en) | Information processing system, method and computer readable medium for determining whether moving bodies appearing in first and second videos are the same or not using histogram | |
JP6347211B2 (ja) | 情報処理システム、情報処理方法及びプログラム | |
US9589192B2 (en) | Information processing system, information processing method, and program | |
US20210329175A1 (en) | Image processing system, image processing method, and program | |
EP2913997B1 (en) | Information processing system, information processing method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13838741 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014536633 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14428692 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13838741 Country of ref document: EP Kind code of ref document: A1 |