WO2010016175A1 - Object Detection Device and Object Detection Method - Google Patents
- Publication number
- WO2010016175A1 (PCT/JP2009/002315)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- feature information
- unit
- identifier
- association
- object detection
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01V—GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
- G01V8/00—Prospecting or detecting by optical means
- G01V8/10—Detecting, e.g. by using light barriers
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
Definitions
- the present invention relates to a target detection apparatus and a target detection method for associating a moving object such as a person detected in a captured image with an identifier of a wireless terminal.
- Patent Document 1 discloses a technique for associating a photographed moving object, such as a person, with the identifier of a wireless terminal held by that moving object, in a space such as an open area that has no clear entrance and no entrance gate.
- the position history of the moving object (hereinafter referred to as “position history” as appropriate).
- an identifier of the wireless terminal (hereinafter referred to as “identifier” as appropriate).
- the position history is a history of the position of a moving object photographed by a camera or the like and its detection start time.
- the conventional moving object detection system associates each identifier with the position history whose detection start time is closest to the reception start time of that identifier.
- suppose that, as shown in FIG., the conventional moving object detection system acquires two position histories P1 and P2 and one identifier ID1.
- the system calculates the difference between each detection start time and the reception start time of ID1.
- since the detection start time of P2 is closest to the reception start time of ID1, the conventional moving object detection system associates the position history P2 with the identifier ID1.
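- the prior-art rule described above can be sketched as follows; the function name and the concrete start times are illustrative assumptions, not values from the document:

```python
def associate_nearest_time(position_histories, identifiers):
    """Prior-art association: match each identifier to the position history
    whose detection start time is closest to the identifier's reception
    start time.  Both arguments map a name to a start time."""
    result = {}
    for ident, t_recv in identifiers.items():
        nearest = min(position_histories,
                      key=lambda p: abs(position_histories[p] - t_recv))
        result[ident] = nearest
    return result

# Two position histories P1, P2 and one identifier ID1, as in the example:
# ID1's reception start time is closest to P2's detection start time.
print(associate_nearest_time({"P1": 10.0, "P2": 12.0}, {"ID1": 11.5}))
# {'ID1': 'P2'}
```

as the document explains next, this rule breaks down as soon as two start times are nearly simultaneous or there are more identifiers than position histories.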
- this is a case where two moving objects A and B move while maintaining a short distance.
- the detection start times T_P1 and T_P2 of the two position histories P1 and P2 and the reception start times T_ID1 and T_ID2 of the two identifiers ID1 and ID2 are almost simultaneous.
- the conventional moving object detection system cannot associate the position history with the identifier.
- the detection range of the position of the moving object may differ from the reception range of the identifier. Therefore, for example, of two moving objects A and B, there is a case where the moving object A moves in an area that is outside the position detection range but within the identifier reception range, while the moving object B moves in an area that is within both the position detection range and the identifier reception range.
- one position history P1 and two identifiers ID1 and ID2 are acquired.
- the conventional moving object detection system cannot perform the association at either of the reception start times T_ID1 and T_ID2 of the two identifiers ID1 and ID2.
- the reason is that the position history P1 has the detection start time T_P1 closest to both reception start times, and it cannot be determined which of the two identifiers P1 should be associated with.
- An object of the present invention is to provide a target detection apparatus and a target detection method that can associate a moving object in a captured image with an identifier even when reception of a plurality of identifiers starts within a short time, or when the number of received identifiers is larger than the number of detected position histories.
- An object detection apparatus includes: a feature information extraction unit that extracts feature information of at least one moving object shown in a captured image; a reading unit that reads an identifier of a wireless terminal held by the moving object; a history management unit that stores the feature information and the identifier in association with each time; and an associating unit that associates the feature information and the identifier stored in the history management unit based on the similarity between pieces of feature information and on the association between feature information and identifiers.
- An object detection method includes: a feature information extraction step of extracting feature information of at least one moving object shown in a captured image; a reading step of reading an identifier of a wireless terminal held by the moving object; a history management step of associating the feature information with the identifier for each time and storing them in a memory; and an associating step of associating the feature information and the identifier stored in the memory based on the similarity between pieces of feature information stored in the memory and on the association between feature information and identifiers.
- according to the present invention, by obtaining the identifier corresponding to feature information based on the similarity between pieces of feature information, the moving object in a captured image can easily be associated with an identifier even when reception of a plurality of identifiers starts within a short time, or when the number of received identifiers is larger than the number of detected position histories.
- a figure explaining the association of position histories and identifiers in the prior art
- figures explaining the problems in the prior art
- a block diagram showing the configuration of the target detection apparatus according to Embodiment 1 of the present invention
- a figure showing an example of the history management unit according to Embodiment 1 of the present invention
- a figure showing an example of the display image according to Embodiment 1 of the present invention
- the wireless terminal is, for example, an RFID (Radio Frequency Identification) tag, a non-contact IC card, a wireless tag, a Bluetooth (registered trademark) device, a wireless LAN (local area network) device, a millimeter wave tag, or the like.
- FIG. 4 is a block diagram showing a configuration of the target detection apparatus according to Embodiment 1 of the present invention.
- FIG. 5 is a diagram showing a system configuration including the target detection apparatus according to Embodiment 1 of the present invention.
- the object detection apparatus 100 includes a photographing unit 101, a person cutout unit 102, a feature information extraction unit 103, a reading unit 104, a history management unit 105, an association unit 106, an abnormality output unit 107, an image storage unit 108, and a display unit 109.
- the association unit 106 includes an association determination unit 151, a clustering unit 152, a similarity calculation unit 153, and a cluster association unit 154.
- the photographing unit 101 photographs a predetermined area using a camera, and outputs the photographed image to the person cutout unit 102, the feature information extraction unit 103, and the image storage unit 108.
- the person cutout unit 102 detects an area (hereinafter referred to as “person area”) in which the person 201 is reflected from the captured image output from the imaging unit 101 at a predetermined timing.
- when the number of persons 201 shown in the captured image increases or decreases, the person cutout unit 102 notifies the reading unit 104 of information indicating that fact.
- the person cutout unit 102 outputs information indicating the position of the person area to the feature information extraction unit 103.
- the detection of the person area is realized by using an existing method. For example, the person cutout unit 102 detects a person region based on a difference between image information (background information) of a past frame and image information of the current frame, using the input captured image.
- the feature information extraction unit 103 refers to the information output from the person cutout unit 102, extracts the feature information of every person 201 shown in the captured image output from the photographing unit 101, and outputs the extracted feature information to the history management unit 105.
- the feature information is, for example, N-dimensional data using a color feature amount, a luminance gradient feature amount, or the like.
- the feature vector will be described as being two-dimensional (for example, a first principal component and a second principal component obtained by principal component analysis of an N-dimensional vector).
- the feature information is, for example, feature information indicating the feature of a person's face or feature information indicating a feature of a person's shape.
- the reading unit 104 has a receiving antenna, and receives radio waves transmitted from the wireless terminal 202 held by the person 201 at the timing of notification from the person cutout unit 102.
- the notification from the person cutout unit 102 is a notification of information indicating that the number of persons 201 shown in the captured image has increased or decreased as described above.
- the reading unit 104 may inquire of the wireless terminal 202 for the identifier in response to the notification from the person cutout unit 102. Specifically, it suffices for the reading unit 104 to transmit an inquiry signal to the wireless terminal 202, receive the radio wave that the wireless terminal 202 transmits in response, and read the identifier superimposed on the received radio wave.
- the reading unit 104 outputs all acquired identifiers to the history management unit 105. As shown in FIG. 5, for example, the reception range 203 of the reading unit 104 is set equal to or wider than the shooting range 204 of the shooting unit 101.
- the history management unit 105 receives the feature information extracted by the feature information extraction unit 103 and the identifier read by the reading unit 104, associates the feature information with the identifier for each time, and manages them as history information.
- the association unit 106 associates the feature information and the identifier managed as history information by the history management unit 105. Then, the association unit 106 outputs information indicating the result of the association to the display unit 109.
- the abnormality output unit 107 outputs information indicating the determination result to the display unit 109 when the associating unit 106 determines that an abnormality has occurred.
- the image storage unit 108 stores the captured image output from the imaging unit 101 with time information.
- the display unit 109 displays the captured image stored in the image storage unit 108 with an identifier corresponding to the feature information of the person shown in the captured image superimposed.
- the display unit 109 may display the information indicating this abnormality superimposed on the captured image.
- the association determination unit 151 determines whether or not the feature information and the identifier in the history information can be associated one-to-one.
- specifically, when the number of pieces of feature information and the number of identifiers in the history information at a given time are one each, the association determination unit 151 associates that feature information and that identifier one-to-one.
- otherwise, the association determination unit 151 judges that the feature information and the identifier cannot be associated one-to-one.
- in that case, the association determination unit 151 issues a trigger to the clustering unit 152 to perform clustering processing (processing that classifies the feature information into subsets). Note that when a past clustering process has already associated the target feature information with an identifier, the association determination unit 151 may use that result to make the one-to-one association.
- the association determination unit 151 associates the feature information with the identifier based on the result.
- the association determination unit 151 outputs information indicating the result of the association to the display unit 109.
- when the clustering unit 152 receives a trigger from the association determination unit 151, it reads the feature information of the history information managed by the history management unit 105 and clusters the feature information based on similarity. Specific processing of the clustering unit 152 will be described later.
- the similarity calculation unit 153 calculates the similarity between the feature information when the clustering unit 152 performs clustering.
- here, the distance between vectors is used as the similarity, but the method for calculating the similarity is not limited to this.
- the cluster association unit 154 associates the cluster generated by the clustering unit 152 with the identifier.
- the cluster association unit 154 outputs information indicating the result to the association determination unit 151.
- the cluster association unit 154 outputs information indicating the result to the abnormality output unit 107.
- FIG. 6 is a diagram illustrating an example of history information managed by the history management unit 105.
- at time t1, the feature information of the two persons is (100, 10) and (15, 150), and the identifiers of the wireless terminals detected at that time are ID11 and ID22.
- at time t2, the feature information of the two persons is (95, 15) and (80, 90), and the identifiers of the wireless terminals detected at that time are ID22 and ID33.
- FIG. 7 is a flowchart showing an operation procedure of the association unit 106 according to the present embodiment.
- Step 11 Cluster feature information.
- the clustering unit 152 performs clustering so that the feature information whose vector distance calculated by the similarity calculation unit 153 is shorter than a predetermined threshold is classified into the same cluster.
- here, a method is used that classifies feature information whose mutual distance is equal to or less than a certain threshold into one cluster.
- the clustering method is not limited to this method.
- hierarchical methods such as the shortest distance method and partition optimization methods such as the k-means method are known.
- since the feature information (100, 10) at time t1 and the feature information (95, 15) at time t2 are close to each other, the clustering unit 152 classifies them into cluster 1. Further, the clustering unit 152 classifies the feature information (15, 150) at time t1 into cluster 2, and the feature information (80, 90) at time t2 into cluster 3.
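- step 11 can be sketched as a greedy single-link clustering over the FIG. 6 feature vectors; the strategy and the threshold value of 30 are illustrative assumptions (the document only requires that vectors closer than a threshold share a cluster):

```python
import math

def cluster_by_threshold(vectors, threshold):
    """Assign each vector to the first cluster containing a member closer
    than `threshold`; otherwise start a new cluster (single-link, greedy)."""
    clusters = []
    for v in vectors:
        for c in clusters:
            if any(math.dist(v, m) < threshold for m in c):
                c.append(v)
                break
        else:
            clusters.append([v])
    return clusters

# Feature vectors from FIG. 6 (times t1 and t2 combined).
features = [(100, 10), (15, 150), (95, 15), (80, 90)]
print(cluster_by_threshold(features, threshold=30))
# [[(100, 10), (95, 15)], [(15, 150)], [(80, 90)]]  -> clusters 1, 2, 3
```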
- Step 12 Create a logical expression.
- the cluster associating unit 154 creates a logical expression that represents the allocation status of each cluster and an identifier at each time, with the cluster as a variable.
- that is, the cluster association unit 154 creates the following expressions (1) to (4).
- the feature information (100, 10) at time t1 belongs to cluster 1, and its identifier may be either ID11 or ID22. This can be expressed as the following expression (1).
- Cluster 1 ∈ {ID11, ID22} … (1)
- the feature information (15, 150) at time t1 belongs to cluster 2, and its identifier may be either ID11 or ID22. This can be expressed as the following expression (2).
- Cluster 2 ∈ {ID11, ID22} … (2)
- the feature information (95, 15) at time t2 belongs to cluster 1, and its identifier may be either ID22 or ID33; the feature information (80, 90) belongs to cluster 3, and its identifier may likewise be ID22 or ID33.
- This can be expressed as the following expressions (3) and (4).
- Cluster 1 ∈ {ID22, ID33} … (3)
- Cluster 3 ∈ {ID22, ID33} … (4)
- Step 13 Solve the logical expression.
- the cluster associating unit 154 solves the logical expression created in step 12 using logical expression conversion.
- Known methods for solving logical expressions include solving a constraint satisfaction problem (backtracking method, etc.), its approximation algorithm, full search, and the like.
- the cluster association unit 154 solves expressions (1) to (4) to obtain the identifier corresponding to each cluster.
- from expressions (1) and (3), the identifier that cluster 1 can take is limited to ID22. This can be expressed as the following expression (5).
- Cluster 1 ∈ {ID22} … (5)
- from expressions (2) and (5), the identifier that cluster 2 can take is limited to ID11. Likewise, from expressions (4) and (5), the identifier that cluster 3 can take is limited to ID33.
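- steps 12 and 13 amount to a small constraint-satisfaction problem. The sketch below solves expressions (1) to (4) by intersection and elimination; this is one illustrative strategy among those the document mentions (backtracking, approximation, full search), and the names are hypothetical:

```python
def solve_cluster_ids(observations):
    """Each observation states that a cluster's identifier lies in a set.
    Intersect the sets per cluster, then eliminate identifiers already
    pinned to another cluster until nothing changes."""
    candidates = {}
    for cluster, ids in observations:
        candidates[cluster] = candidates.get(cluster, set(ids)) & set(ids)
    changed = True
    while changed:
        changed = False
        for cluster, ids in candidates.items():
            if len(ids) == 1:
                pinned = next(iter(ids))
                for other, other_ids in candidates.items():
                    if other != cluster and pinned in other_ids:
                        # An identifier belongs to exactly one cluster.
                        other_ids.discard(pinned)
                        changed = True
    return candidates

obs = [("cluster1", {"ID11", "ID22"}),   # expression (1), time t1
       ("cluster2", {"ID11", "ID22"}),   # expression (2), time t1
       ("cluster1", {"ID22", "ID33"}),   # expression (3), time t2
       ("cluster3", {"ID22", "ID33"})]   # expression (4), time t2
print(solve_cluster_ids(obs))
# {'cluster1': {'ID22'}, 'cluster2': {'ID11'}, 'cluster3': {'ID33'}}
```

an empty candidate set after propagation corresponds to the "cannot associate" branch handled as an abnormality in step 16.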
- Step 14 Associating clusters with identifiers.
- the cluster association unit 154 associates a cluster with an identifier based on the result of solving the logical expression.
- when each cluster can take only one identifier, the cluster and the identifier are associated one-to-one (step 14; YES).
- there are also cases where, even after the logical expression is solved, an identifier cannot be associated with a cluster (step 14; NO).
- Step 15 Associating the feature information with the identifier.
- the association determination unit 151 associates the identifier with the feature information from the correspondence relationship between the cluster and the identifier. Further, the association determination unit 151 outputs information indicating the association result to the display unit 109.
- in this example, the cluster association unit 154 associates ID22 with the feature information (100, 10) and the feature information (95, 15) belonging to cluster 1. Further, it associates ID11 with the feature information (15, 150) belonging to cluster 2, and ID33 with the feature information (80, 90) belonging to cluster 3.
- Step 16 Perform processing when an abnormality occurs.
- when an identifier cannot be associated with a cluster in step 14 (step 14; NO), the abnormality output unit 107 determines that there is an abnormality and outputs the determination result to the display unit 109. As a result, abnormalities such as a person not carrying a wireless terminal, or wireless terminals having been exchanged, can be detected.
- the feature information extraction unit 103 extracts the position of the object in the image in addition to the feature vector when extracting the feature information of the person object from the captured image. Then, the feature information extraction unit 103 registers the feature vector and the position of the object in the history management unit 105.
- FIG. 9 is a diagram illustrating an example of feature information including object position information in an image.
- it indicates that two objects, with feature information ((100, 10), (300, 400)) and ((15, 150), (600, 450)), were detected from the captured image at time t1.
- the meaning of the feature information ((100, 10), (300, 400)) is that (100, 10) is the feature vector of the object, and (300, 400) is the position of the object in the image.
- the display unit 109 performs the following processing when displaying specific feature information: (1) it acquires the image at the time corresponding to the feature information from the image storage unit 108; (2) it displays that image with the identifier superimposed at the position coordinates of the object obtained from the feature information.
- FIG. 10 is a diagram illustrating an example of a display image displayed by superimposing an identifier on the captured image at time t1 in FIG.
- the display unit 109 uses the result obtained by the associating unit 106 to perform display so that the identifier of the wireless terminal held by the person can be known around the person area captured in the captured image.
- the display unit 109 displays “ID11” and an arrow around the person area corresponding to the feature information (15, 150), and around the person area corresponding to the feature information (100, 10). “ID22” and an arrow are displayed.
- as described above, in the present embodiment, the feature information is clustered to obtain the identifier corresponding to each cluster.
- in the prior art, the wireless terminal must constantly transmit radio waves so that the position history can be grasped, resulting in increased power consumption.
- in the present embodiment, since the reading unit 104 only needs to communicate with the wireless terminal 202 when the person cutout unit 102 detects an increase or decrease in the number of persons appearing in the captured image, the wireless terminal 202 need not transmit radio waves constantly. For this reason, the wireless terminal 202 does not communicate with the reading unit 104 in the normal state, and is activated by receiving radio waves from the reading unit 104.
- the wireless terminal 202 can be a semi-passive wireless terminal (semi-passive tag) that uses its own power after activation and transmits an identifier superimposed on a radio wave.
- the wireless terminal 202 can use a passive wireless terminal (passive tag) that superimposes an identifier on a radio wave obtained by reflecting a part of the radio wave from the reading unit 104. Thereby, the power consumption of the wireless terminal can be suppressed as compared with the conventional technology.
- the associating unit 106 may have a configuration that includes a cluster number determination unit 155, as shown in FIG.
- the cluster number determination unit 155 calculates the total number of identifiers of history information managed by the history management unit 105 and outputs the result to the clustering unit 152 as the maximum number of clusters.
- the maximum number of clusters is set to “3”.
- alternatively, the cluster number determination unit 155 may obtain the total number of identifiers detected during the corresponding period and output that number to the clustering unit 152 as the maximum number of clusters.
- the clustering unit 152 sets the upper limit of the number of clusters to the maximum number of clusters at the time of clustering in step 11 of FIG. This makes it possible to reduce the search range in the clustering process, reduce the amount of calculation for clustering, and improve the accuracy of clustering.
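- a minimal illustration of how the maximum number of clusters can be derived from the history information; the dictionary layout is a hypothetical stand-in for the data held by the history management unit 105:

```python
# Hypothetical history layout: time -> (feature vectors, identifiers read).
history = {
    "t1": ([(100, 10), (15, 150)], ["ID11", "ID22"]),
    "t2": ([(95, 15), (80, 90)], ["ID22", "ID33"]),
}
# Each identifier corresponds to at most one moving object, so the number
# of distinct identifiers is an upper bound on the number of clusters.
max_clusters = len({i for _, ids in history.values() for i in ids})
print(max_clusters)  # 3
```

this bound can then be passed to the clustering step as the upper limit on the number of clusters, as the document describes.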
- an identifier change detection unit 111 may be used instead of the person cutout unit 102, as shown in FIG.
- the identifier change detection unit 111 monitors the reading result of the reading unit 104 and, when the number of identifiers increases or decreases, instructs the feature information extraction unit 103 and the reading unit 104 to register the feature information and the identifier in the history management unit 105.
- the identifier change detecting unit 111 also instructs the photographing unit 101 to start photographing. However, the identifier change detection unit 111 issues an instruction to start shooting before issuing an instruction to the feature information extraction unit 103.
- the associating unit 106 may output the result of the association as association information to the history management unit 105 to manage the association information.
- the association unit 106 can reduce the amount of calculation for clustering and association by using the association information when performing association for the second time and thereafter. Hereinafter, this case will be described.
- FIG. 13 is a diagram illustrating an example of history data managed by the history management unit 105.
- the data reflects the result of the association performed by the association unit 106.
- the first line of the data table represents that the feature information (100, 10) is associated with ID22.
- feature information (15, 150) is associated with ID11
- feature information (95, 15) is associated with ID22
- feature information (80, 90) is associated with ID33.
- in this case, the association determination unit 151 can see that the feature information (100, 10) at time t1 has already been associated. For this reason, the association result that the identifier is ID22 can be obtained without performing the processing from the clustering unit 152 onward.
- in the above description, the association determination unit 151 first determines whether or not the feature information and the identifier can be associated at a single time.
- the present invention is not limited to this, and the association determination unit 151 may be deleted from FIG. 4 and clustering may be performed in all cases.
- FIG. 14 is a block diagram showing the configuration of the target detection apparatus according to Embodiment 2 of the present invention.
- in FIG. 14, the same components as those in the target detection apparatus 100 shown in FIG. 4 are given the same reference numerals, and their descriptions are omitted.
- Embodiment 2 differs from Embodiment 1 in the procedure for associating feature information with an identifier.
- the object detection device 300 illustrated in FIG. 14 differs from the object detection device 100 illustrated in FIG. 4 in that the association unit 306 has a different internal configuration from the association unit 106.
- the association unit 306 includes an association determination unit 351, a neighborhood feature information addition unit 352, and a similarity calculation unit 153, and obtains an identifier associated with the specified feature information.
- the association determination unit 351 determines whether or not one or more identifiers that can be associated with one or more pieces of feature information to be determined can be limited to one, and outputs the result.
- the neighborhood feature information adding unit 352 obtains feature information that lies in the neighborhood of the specified feature information in terms of similarity, and outputs it to the association determination unit 351.
- how the associating unit 306 obtains the identifier associated with designated feature information from the feature information managed by the history management unit 105 will now be described specifically.
- suppose the history information managed in the history management unit 105 is the same as that of FIG. 6 in the first embodiment, and the associating unit 306 is requested to obtain the identifier of the feature information (100, 10).
- FIG. 15 is a flowchart showing an operation procedure of the associating unit 306 according to the present embodiment.
- Step 21 Determine whether association is possible.
- the association determination unit 351 determines whether or not the number of identifiers that may be associated with feature information can be limited to one.
- here, there are two identifiers, ID11 and ID22, that may correspond to the feature information (100, 10), so the association determination unit 351 cannot limit the identifier to one (step 21; NO).
- Step 22 Select neighborhood feature information. If the identifier cannot be limited to one in step 21 (step 21; NO), the neighboring feature information adding unit 352 selects neighboring feature information based on the similarity calculated by the similarity calculating unit 153.
- here, the neighborhood feature information adding unit 352 selects the feature information (95, 15) that is nearest to the feature information (100, 10).
- Step 23 Determine whether association is possible.
- the association determination unit 351 adds the feature information selected by the neighborhood feature information adding unit 352 to the determination targets, and determines whether the identifiers common to all the feature information under determination can be limited to one.
- here, the identifiers that may correspond to the feature information (95, 15) are ID22 and ID33. Therefore, the only identifier consistent with both the feature information (100, 10) and the feature information (95, 15) is ID22, and the association determination unit 351 can limit the identifier to one (step 23; YES).
- Step 24 Associate feature information with an identifier.
- if the identifier can be limited to one, the association determination unit 351 associates the obtained identifier with the specified feature information, and outputs information indicating the association result to the display unit 109.
- in this example, the association determination unit 351 associates ID22 with the feature information (100, 10) and the feature information (95, 15).
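- steps 21 to 24 can be read as successive set intersection with the nearest remaining feature information; the sketch below is one illustrative reading of the procedure with hypothetical names (returning None stands for the abnormality path of steps 25 and 26):

```python
import math

def resolve_identifier(target, candidates_of, feature_pool):
    """Intersect the target's candidate identifiers with those of
    successively nearer feature vectors until exactly one remains."""
    common = set(candidates_of[target])
    neighbors = sorted((f for f in feature_pool if f != target),
                       key=lambda f: math.dist(f, target))
    for neighbor in neighbors:
        if len(common) == 1:
            break
        common &= set(candidates_of[neighbor])
    return common.pop() if len(common) == 1 else None

cands = {(100, 10): {"ID11", "ID22"},   # time t1
         (15, 150): {"ID11", "ID22"},   # time t1
         (95, 15): {"ID22", "ID33"}}    # time t2
print(resolve_identifier((100, 10), cands, list(cands)))  # ID22
```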
- Step 25 An abnormality is determined.
- if the identifier cannot be limited to one, the association determination unit 351 determines whether or not the number of identifiers common to the one or more pieces of feature information under determination is zero.
- if it is not zero, the neighborhood feature information adding unit 352 and the association determination unit 351 repeat the processing of steps 22 and 23 (step 25; NO).
- Step 26 Perform processing when an abnormality occurs.
- when a person who does not carry a wireless terminal enters the shooting range, or when wireless terminals have been exchanged, there may be no common identifier (step 25; YES).
- in that case, the abnormality output unit 107 determines that there is an abnormality and outputs the determination result to the display unit 109. As a result, abnormalities such as a person not carrying a wireless terminal, or wireless terminals having been exchanged, can be detected.
- FIG. 16 is a diagram showing an example of a display image in which the identifier of the specified feature information is superimposed on the captured image at time t1 in FIG.
- the display unit 109 uses the result obtained by the associating unit 306 to display the identifier of the wireless terminal held by the person around the person area of the specified feature information captured in the captured image.
- the display unit 109 displays “ID22” and an arrow around the person area corresponding to the feature information (100, 10).
- Embodiment 1 and Embodiment 2 can be used selectively according to conditions.
- the method for associating the feature information with the identifier mentioned in the first embodiment or the second embodiment may be switched as appropriate.
- FIG. 19 is a diagram showing a system configuration including an object detection apparatus according to Embodiment 3 of the present invention.
- in the target detection apparatus 400 shown in FIG. 17, the same components as those in the target detection apparatus 100 shown in FIG. 4 are given the same reference numerals, and their descriptions are omitted.
- in the object detection device 500 shown in FIG. 18, the same components as those in the object detection device 300 shown in FIG. 14 are given the same reference numerals, and their descriptions are omitted.
- FIG. 19 shows a configuration in which one target detection device 400 includes a plurality of imaging units 1001 to 1003, a plurality of reading units 1011 to 1013, and the target detection device 100.
- the target detection apparatus 100 associates feature information and identifiers from the plurality of imaging units 1001 to 1003 and the plurality of reading units 1011 to 1013 with each other.
- the object detection device 500 illustrated in FIG. 18 has a configuration in which a similarity weighting unit 156 is added inside the associating unit 306, as compared with the object detection device 300 illustrated in FIG.
- the similarity weighting unit 156 calculates a weighting coefficient and outputs it to the similarity calculation unit 153.
- the similarity calculation unit 153 corrects the inter-vector distance by multiplying the calculated inter-vector distance of the feature information by the weighting coefficient output from the similarity weighting unit 156.
- for example, the similarity weighting unit 156 calculates the weighting coefficient so that the corrected inter-vector distance becomes smaller the closer the times at which the two pieces of feature information being compared were captured.
- shooting conditions such as exposure differ between images shot with different cameras. Therefore, even feature information of the same person may have a large inter-vector distance when extracted from captured images of different cameras.
- the similarity weighting unit 156 calculates a weighting coefficient based on the shooting conditions.
- the shooting conditions include, for example, a camera ID and white balance information at the time of shooting.
- the feature information extraction unit 103 acquires the camera ID and white balance information from the photographing unit 101 and registers them in the history management unit 105 together with the feature information.
- An example of the data format managed by the history management unit 105 including the camera ID and white balance information is shown in FIG.
- The similarity weighting unit 156 makes it possible to perform the estimation using only history information from a time period in which the person's appearance does not change significantly. For this reason, the similarity weighting unit 156 sets the weighting coefficient to the maximum when the difference between the capture times of the feature information being compared exceeds a predetermined threshold.
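As an illustration, the correction performed by the similarity weighting unit 156 can be sketched as follows. This is a reconstruction, not the patented implementation: the linear time penalty, the cross-camera discount, and every name and constant here are assumptions; only the overall behavior (distance grows with the capture-time difference, a maximum coefficient beyond a threshold, and a compensation for different cameras) follows the text above.

```python
def weighted_distance(dist, t1, t2, cam1, cam2,
                      time_scale=60.0, max_gap=300.0,
                      cross_camera_discount=0.8, max_weight=1e9):
    """Correct a raw inter-vector distance with a weighting coefficient.

    dist       -- raw inter-vector distance between two feature vectors
    t1, t2     -- capture times (seconds) of the two pieces of feature information
    cam1, cam2 -- IDs of the cameras that captured them (shooting condition)
    """
    gap = abs(t1 - t2)
    if gap > max_gap:
        # Beyond the threshold the person's appearance may have changed too
        # much, so the coefficient is set to the maximum and the pair is
        # effectively excluded from the estimation.
        return dist * max_weight
    # The distance grows as the capture times move farther apart.
    weight = 1.0 + gap / time_scale
    if cam1 != cam2:
        # Images from different cameras are shot under different conditions,
        # so a larger raw distance is expected even for the same person;
        # the discount partially compensates for this.
        weight *= cross_camera_discount
    return dist * weight
```

In this sketch, the similarity calculation unit 153 would use `weighted_distance` in place of the raw inter-vector distance when comparing feature vectors.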
- a plurality of sets of the imaging unit 101 and the reading unit 104 that receives radio waves in a range including the imaging range of the imaging unit 101 may be used.
- The history management unit 105 manages one or more pieces of feature information and one or more identifiers as history information, after attaching additional information indicating that the data comes from an imaging unit 101 and a reading unit 104 that capture and receive the same range.
- the present invention is not limited to this and can be applied to a moving object other than a person.
- the present invention can be applied to a digital camera.
- By providing the digital camera with the function of the object detection device used in each of the above embodiments, the ID of a person captured at the time of shooting can be identified.
- The present invention can increase the added value of the digital camera by combining it with an application that automatically sends the captured photograph to the address corresponding to the ID of the person in the photograph.
- The present invention can also be applied to marketing applications, such as those used for marketing activities in supermarkets and department stores.
- One application example is to deploy the object detection device so that it detects a user's ID and records it in association with that user's action history.
- the object detection device used in each of the above embodiments can be implemented by a general-purpose computer such as a personal computer.
- Each process, including the process of the associating unit, may be implemented as software running on such a computer, or may be realized by a dedicated device equipped with a corresponding LSI chip.
- the present invention is suitable for use in a target detection apparatus that associates a moving object such as a person detected in a captured image with an identifier of a wireless terminal.
Abstract
Description
When a plurality of moving objects move while remaining close to one another, the detection start time of each moving object and the reception start time of the identifier of the wireless terminal held by each moving object become almost simultaneous (within the margin of error).
For example, the detection range for the position of a moving object may differ from the reception range for identifiers. Consequently, of two moving objects A and B, moving object A may move through an area that is outside the position detection range but inside the identifier reception range, while moving object B moves through an area that is inside both the position detection range and the identifier reception range.
FIG. 4 is a block diagram showing the configuration of the target detection apparatus according to Embodiment 1 of the present invention. FIG. 5 is a diagram showing a system configuration including the target detection apparatus according to Embodiment 1 of the present invention.
The clustering unit 152 performs clustering so that pieces of feature information whose inter-vector distance, as calculated by the similarity calculation unit 153, is shorter than a predetermined threshold are classified into the same cluster. In this embodiment, feature information lying within a fixed threshold distance is grouped into one cluster, but the present invention does not limit the clustering technique to this method. Other known clustering techniques include hierarchical methods such as the single-linkage method and partitional optimization methods such as the k-means method.
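The threshold-based grouping performed by the clustering unit 152 can be sketched as follows. This is a minimal illustration, not the patented implementation: the Euclidean inter-vector distance follows the n-dimensional vector representation of the feature information, while the union-find grouping and all names are assumptions.

```python
import math

def distance(u, v):
    # Inter-vector distance between two n-dimensional feature vectors.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def cluster(features, threshold):
    """Group feature vectors so that every pair whose inter-vector distance
    is below `threshold` ends up in the same cluster (union-find grouping)."""
    parent = list(range(len(features)))

    def find(i):
        # Find the representative of i's group, with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(len(features)):
        for j in range(i + 1, len(features)):
            if distance(features[i], features[j]) < threshold:
                parent[find(i)] = find(j)  # merge the two groups

    groups = {}
    for i in range(len(features)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())
```

For example, four feature vectors forming two tight pairs would yield two clusters of two indices each.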
The cluster associating unit 154 creates a logical formula that expresses, using the clusters as variables, the assignment state between each cluster and the identifiers at each time.
The cluster associating unit 154 solves the logical formula created in step 12 by applying transformations of the logical formula. The present invention does not restrict the method used to solve the logical formula; known methods include solvers for constraint satisfaction problems (such as backtracking), their approximation algorithms, and exhaustive search.
The cluster associating unit 154 associates the clusters with the identifiers based on the result of solving the logical formula.
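The cluster-to-identifier assignment of steps 12 to 14 can be sketched as a small constraint-satisfaction search. This is an illustrative reconstruction under assumptions: the encoding as per-time candidate sets and the backtracking order are choices made here, not the logical-formula representation used by the cluster associating unit 154.

```python
def assign(clusters_seen, ids_seen):
    """Find a one-to-one mapping from clusters to identifiers that is
    consistent with every time step, or return None (abnormality).

    clusters_seen -- list of sets: clusters observed at each time step
    ids_seen      -- list of sets: identifiers received at the same steps
    """
    clusters = sorted(set().union(*clusters_seen))
    # A cluster's candidates are the identifiers that were received at
    # every time at which the cluster was observed.
    candidates = {}
    for c in clusters:
        cand = None
        for seen, ids in zip(clusters_seen, ids_seen):
            if c in seen:
                cand = set(ids) if cand is None else cand & ids
        candidates[c] = cand or set()

    def backtrack(i, used, mapping):
        if i == len(clusters):
            return dict(mapping)
        c = clusters[i]
        for ident in sorted(candidates[c] - used):
            mapping[c] = ident
            result = backtrack(i + 1, used | {ident}, mapping)
            if result is not None:
                return result
            del mapping[c]
        return None  # no consistent assignment from this branch

    return backtrack(0, set(), {})
```

A `None` result corresponds to the case where no cluster–identifier assignment is possible, which the abnormality output unit 107 would report.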
When a cluster and an identifier are associated one-to-one in step 14 (step 14: YES), the association determination unit 151 associates the identifier with the feature information based on the correspondence between the cluster and the identifier. Furthermore, the association determination unit 151 outputs information indicating the result of the association to the display unit 109.
When no identifier can be associated with a cluster in step 14 (step 14: NO), the abnormality output unit 107 determines that an abnormality has occurred and outputs the determination result to the display unit 109. This makes it possible to detect abnormalities such as a wireless terminal not being carried or being exchanged.
FIG. 14 is a block diagram showing the configuration of the target detection apparatus according to Embodiment 2 of the present invention. In the target detection apparatus 300 shown in FIG. 14, the components shared with the target detection apparatus 100 shown in FIG. 4 are assigned the same reference numerals as in FIG. 4, and their detailed descriptions are omitted.
The association determination unit 351 determines whether the number of identifiers that may be associated with a piece of feature information can be limited to one.
If the identifier cannot be limited to one in step 21 (step 21: NO), the neighboring feature information addition unit 352 selects neighboring feature information based on the similarity calculated by the similarity calculation unit 153.
The association determination unit 351 adds the feature information selected by the neighboring feature information addition unit 352 to the determination targets, and determines whether the identifiers common to all the feature information under determination can be limited to one.
If the identifier can be limited to one in step 21 or step 23 (step 21: YES, or step 23: YES), the association determination unit 351 associates the obtained identifier with the designated feature information. Furthermore, the association determination unit 351 outputs information indicating the result of the association to the display unit 109.
If the identifier cannot be limited to one in step 23 (step 23: NO), the association determination unit 351 determines whether the number of identifiers common to the one or more pieces of feature information has become zero (step 25).
When a person who does not carry a wireless terminal enters the imaging range, or when wireless terminals have been exchanged, no common identifier may exist (step 25: YES). In this case, the abnormality output unit 107 determines that an abnormality has occurred and outputs the determination result to the display unit 109. This makes it possible to detect abnormalities such as a wireless terminal not being carried or being exchanged.
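The narrowing performed by the association determination unit 351 in steps 21 to 25 can be sketched as repeated set intersection. This is a minimal sketch under assumptions: candidate identifiers are represented as plain sets, and the neighbors are supplied in descending order of the similarity computed by the similarity calculation unit 153.

```python
def narrow_identifier(target_candidates, neighbors_by_similarity):
    """Narrow the identifiers that may correspond to a designated piece of
    feature information down to one.

    target_candidates       -- set of identifiers observed together with the
                               designated feature information
    neighbors_by_similarity -- candidate identifier sets of other feature
                               information, most similar first
    Returns the unique identifier, or None when no common identifier remains
    (abnormality) or the candidates cannot be narrowed to one.
    """
    common = set(target_candidates)
    if len(common) == 1:                     # step 21: YES
        return next(iter(common))
    for cand in neighbors_by_similarity:     # add the next-nearest neighbor
        common &= cand
        if len(common) == 1:                 # step 23: YES
            return next(iter(common))
        if not common:                       # step 25: abnormality
            return None
    return None
```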
FIGS. 17 and 18 are block diagrams showing configurations of target detection apparatuses according to Embodiment 3 of the present invention. FIG. 19 is a diagram showing a system configuration including the target detection apparatus according to Embodiment 3 of the present invention. In the target detection apparatus 400 shown in FIG. 17, the components shared with the target detection apparatus 100 shown in FIG. 4 are assigned the same reference numerals as in FIG. 4, and their detailed descriptions are omitted. Likewise, in the target detection apparatus 500 shown in FIG. 18, the components shared with the target detection apparatus 300 shown in FIG. 14 are assigned the same reference numerals as in FIG. 14, and their detailed descriptions are omitted. FIG. 19 shows a configuration in which one target detection apparatus 400 includes a plurality of imaging units 1001 to 1003, a plurality of reading units 1011 to 1013, and the target detection apparatus 100. The target detection apparatus 100 associates the feature information from the plurality of imaging units 1001 to 1003 with the identifiers from the plurality of reading units 1011 to 1013.
101, 1001, 1003 Imaging unit
102 Person extraction unit
103 Feature information extraction unit
104, 1011, 1013 Reading unit
105 History management unit
106, 306 Associating unit
107 Abnormality output unit
108 Image storage unit
109 Display unit
151, 351 Association determination unit
152 Clustering unit
153 Similarity calculation unit
154 Cluster associating unit
155 Cluster count determination unit
156 Similarity weighting unit
352 Neighboring feature information addition unit
Claims (15)
- A target detection apparatus comprising: a feature information extraction unit that extracts feature information of at least one moving object appearing in a captured image; a reading unit that reads an identifier of a wireless terminal held by the moving object; a history management unit that stores the feature information and the identifier in association with each other for each time; and an associating unit that associates the feature information stored in the history management unit with the identifier, based on a similarity between pieces of the feature information and on the association between the feature information and the identifier.
- The target detection apparatus according to claim 1, wherein the associating unit classifies the plurality of pieces of feature information stored in the history management unit into subsets based on the similarity between the pieces of feature information, and associates the subsets with the identifiers based on the association between the feature information and the identifiers.
- The target detection apparatus according to claim 2, wherein, when only one identifier is associated with a subset, the associating unit associates that identifier with all pieces of feature information belonging to the subset.
- The target detection apparatus according to claim 2, wherein the associating unit determines that an abnormality has occurred when there is a subset associated with a plurality of identifiers, or a subset associated with no identifier.
- The target detection apparatus according to claim 2, further comprising a subset count determination unit that sets the number of distinct identifiers stored in the history management unit as an upper limit on the number of subsets.
- The target detection apparatus according to claim 1, wherein the associating unit: when only one identifier is associated with the subset to which a designated piece of feature information belongs, associates the designated feature information with that identifier; and when a plurality of identifiers are associated with the subset to which the designated feature information belongs, extracts, from the other feature information stored in the history management unit, the feature information having the highest similarity to the designated feature information, and identifies the identifier corresponding to the designated feature information by finding the identifier common to the designated feature information and the extracted feature information.
- The target detection apparatus according to claim 6, wherein the associating unit determines that an abnormality has occurred when, in finding an identifier common to the one or more identifiers included with the extracted feature information, no common identifier exists.
- The target detection apparatus according to claim 1, wherein, when calculating the similarity between pieces of feature information, the associating unit weights the similarity so that the similarity becomes smaller as the difference between the times at which the moving objects corresponding to the pieces of feature information were captured becomes larger.
- The target detection apparatus according to claim 1, wherein, when calculating the similarity between pieces of feature information, the associating unit weights the similarity according to differences in imaging parameters representing the imaging conditions or imaging environment of the imaging units that captured the moving objects corresponding to the pieces of feature information.
- The target detection apparatus according to claim 1, wherein the associating unit performs the association using only the feature information and identifiers from a specific time period, out of all the feature information and identifiers stored in the history management unit.
- The target detection apparatus according to claim 1, further comprising an image state detection unit that detects the number of moving objects appearing in the captured image, wherein the reading unit performs reading when the image state detection unit detects an increase or decrease in the number of moving objects appearing in the captured image.
- The target detection apparatus according to claim 1, further comprising an identifier state detection unit that detects the number of identifiers read by the reading unit, wherein the feature information extraction unit extracts feature information when the identifier state detection unit detects an increase or decrease in the number of identifiers.
- The target detection apparatus according to claim 1, further comprising a display unit that displays, superimposed on the captured image, the identifier corresponding to the feature information of a moving object captured in the captured image by the imaging unit.
- The target detection apparatus according to claim 1, wherein the feature information is expressed as an n-dimensional vector consisting of n component values representing features of the moving object in the image, and the associating unit calculates the similarity between pieces of feature information as the inter-vector distance between the pieces of feature information expressed as n-dimensional vectors.
- A target detection method comprising: a feature information extraction step of extracting feature information of at least one moving object appearing in a captured image; a reading step of reading an identifier of a wireless terminal held by the moving object; a history management step of storing the feature information and the identifier in a memory in association with each other for each time; and an associating step of associating the feature information stored in the memory with the identifier, based on a similarity between the pieces of feature information stored in the memory and on the association between the feature information and the identifier.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/056,188 US8472670B2 (en) | 2008-08-08 | 2009-05-26 | Target detection device and target detection method |
CN2009801305019A CN102119343B (zh) | 2008-08-08 | 2009-05-26 | 对象检测装置和对象检测方法 |
JP2009543302A JP5450089B2 (ja) | 2008-08-08 | 2009-05-26 | Target detection device and target detection method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008206111 | 2008-08-08 | ||
JP2008-206111 | 2008-08-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010016175A1 true WO2010016175A1 (ja) | 2010-02-11 |
Family
ID=41663404
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/002315 WO2010016175A1 (ja) | 2008-08-08 | 2009-05-26 | Target detection device and target detection method |
Country Status (4)
Country | Link |
---|---|
US (1) | US8472670B2 (ja) |
JP (1) | JP5450089B2 (ja) |
CN (1) | CN102119343B (ja) |
WO (1) | WO2010016175A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012113355A (ja) * | 2010-11-19 | 2012-06-14 | Japan Research Institute Ltd | Advertisement information providing system, advertisement information providing method, and advertisement information providing program |
JP2016080619A (ja) * | 2014-10-21 | 2016-05-16 | アズビル株式会社 | Human detection system and method |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013133057A1 (ja) * | 2012-03-07 | 2013-09-12 | ソニー株式会社 | Image processing apparatus and method, and program |
US9275285B2 (en) | 2012-03-29 | 2016-03-01 | The Nielsen Company (Us), Llc | Methods and apparatus to count people in images |
US8761442B2 (en) * | 2012-03-29 | 2014-06-24 | The Nielsen Company (Us), Llc | Methods and apparatus to count people in images |
US8660307B2 (en) | 2012-03-29 | 2014-02-25 | The Nielsen Company (Us), Llc | Methods and apparatus to count people in images |
US9092675B2 (en) | 2012-03-29 | 2015-07-28 | The Nielsen Company (Us), Llc | Methods and apparatus to count people in images |
US9473353B2 (en) | 2014-06-23 | 2016-10-18 | International Business Machines Corporation | Cluster reconfiguration management |
US9658897B2 (en) | 2014-06-23 | 2017-05-23 | International Business Machines Corporation | Flexible deployment and migration of virtual machines |
CN107064917A (zh) * | 2017-03-30 | 2017-08-18 | 上海斐讯数据通信技术有限公司 | 一种微波定位方法及系统 |
JP6921317B2 (ja) * | 2018-05-18 | 2021-08-18 | 三菱電機株式会社 | Information collection device |
JP7068595B1 (ja) * | 2020-11-09 | 2022-05-17 | ダイキン工業株式会社 | Management device |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006098214A (ja) * | 2004-09-29 | 2006-04-13 | Mitsubishi Electric Corp | Moving object detection system |
JP2006236183A (ja) * | 2005-02-28 | 2006-09-07 | Nec Engineering Ltd | Entrance and exit management system |
JP2006268577A (ja) * | 2005-03-24 | 2006-10-05 | Fuji Xerox Co Ltd | Authentication device, authentication system, and image forming apparatus |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1515152A1 (de) * | 2003-09-12 | 2005-03-16 | Leica Geosystems AG | Method for determining the direction to an object to be measured |
US7843471B2 (en) * | 2006-03-09 | 2010-11-30 | International Business Machines Corporation | Persistent authenticating mechanism to map real world object presence into virtual world object awareness |
US8064655B2 (en) | 2007-04-20 | 2011-11-22 | Panasonic Corporation | Face image detection device, face image detection method and imaging apparatus |
2009
- 2009-05-26 US US13/056,188 patent/US8472670B2/en active Active
- 2009-05-26 CN CN2009801305019A patent/CN102119343B/zh active Active
- 2009-05-26 JP JP2009543302A patent/JP5450089B2/ja not_active Expired - Fee Related
- 2009-05-26 WO PCT/JP2009/002315 patent/WO2010016175A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US20110216940A1 (en) | 2011-09-08 |
CN102119343B (zh) | 2013-04-03 |
JP5450089B2 (ja) | 2014-03-26 |
JPWO2010016175A1 (ja) | 2012-01-12 |
CN102119343A (zh) | 2011-07-06 |
US8472670B2 (en) | 2013-06-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5450089B2 (ja) | Target detection device and target detection method | |
JP5740210B2 (ja) | Face image search system and face image search method | |
US9626551B2 (en) | Collation apparatus and method for the same, and image searching apparatus and method for the same | |
JP7132387B2 (ja) | Image processing apparatus, image processing method, and program | |
CN105139040B (zh) | 一种排队状态信息检测方法及其系统 | |
TWI430186B (zh) | Image processing device and image processing method | |
US8929611B2 (en) | Matching device, digital image processing system, matching device control program, computer-readable recording medium, and matching device control method | |
US10747991B2 (en) | People stream analysis method, people stream analysis apparatus, and people stream analysis system | |
JP2011165008A (ja) | Image recognition apparatus and method | |
JP2014182480A (ja) | Person recognition apparatus and method | |
WO2020195732A1 (ja) | Image processing device, image processing method, and recording medium storing program | |
CN112912888A (zh) | 识别视频活动的设备和方法 | |
US20200050838A1 (en) | Suspiciousness degree estimation model generation device | |
US20190147251A1 (en) | Information processing apparatus, monitoring system, method, and non-transitory computer-readable storage medium | |
JP2022003526A (ja) | Information processing device, detection system, information processing method, and program | |
JP6289308B2 (ja) | Information processing device and program | |
KR102250712B1 (ko) | Electronic device and control method | |
JP5680954B2 (ja) | Dwell time measurement device, dwell time measurement system, and dwell time measurement method | |
WO2020115910A1 (ja) | Information processing system, information processing device, information processing method, and program | |
KR102300500B1 (ko) | Product three-dimensional cut image processing method, apparatus, and system | |
El-Din et al. | Adversarial unsupervised domain adaptation guided with deep clustering for face presentation attack detection | |
JP2022058833A (ja) | Information processing system, information processing device, information processing method, and program | |
JP6658402B2 (ja) | Frame rate determination device, frame rate determination method, and computer program for frame rate determination | |
JP2017005582A (ja) | Image processing apparatus, image processing method, and program | |
JP6112346B2 (ja) | Information collection system, program, and information collection method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980130501.9 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2009543302 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09804662 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13056188 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 09804662 Country of ref document: EP Kind code of ref document: A1 |