WO2016114134A1 - Movement State Estimation Device, Movement State Estimation Method, and Program Recording Medium - Google Patents
Movement state estimation device, movement state estimation method, and program recording medium
- Publication number
- WO2016114134A1 (PCT/JP2016/000146; JP2016000146W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- movement
- monitoring target
- state
- estimated
- monitoring
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/254—Analysis of motion involving subtraction of images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G06V20/53—Recognition of crowd images, e.g. recognition of crowd congestion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30242—Counting objects in image
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
Definitions
- the present invention relates to a movement situation estimation device, a movement situation estimation method, and a program recording medium.
- Patent Document 1 describes a person counting device that measures the number of people from images of a crowd.
- the person counting device described in Patent Document 1 extracts the heads of persons included in an image based on a head model, associates the head positions of the same person across frames using feature information such as position and color distribution, connects the associated head positions, and measures the number of people from the connection result.
- Non-Patent Document 1 describes a method for estimating the number of crowds.
- the method described in Non-Patent Document 1 captures the crowd state, including overlaps between persons, with a "crowd patch" representing a local image, performs regression learning on the number of people in the patch, and estimates the number of people from still images.
- Patent Document 2 describes a traffic measurement system capable of obtaining traffic data at a survey target point.
- the system described in Patent Document 2 identifies passersby in a predetermined survey target area from an image of that area and counts the number of passersby.
- by using the crowd patch described in Non-Patent Document 1, the crowd in an image can be recognized without depending on the frame rate.
- with the crowd patch described in Non-Patent Document 1, it is possible to estimate the number of people that can exist in a predetermined region of the image, but it is difficult to estimate in which direction, and in what quantity, the observation targets are moving.
- An exemplary object of the present invention is to provide a movement situation estimation apparatus, a movement situation estimation method, and a movement situation estimation program that can accurately estimate the movement situation of a monitoring target even in a congested environment.
- the movement state estimation device comprises quantity estimation means for estimating, using a plurality of temporally continuous images, the quantity of monitoring targets for each local region of each image, and movement state estimation means for estimating the movement state of the monitoring targets from the time-series change of the quantity estimated in each local region.
- the movement state estimation method uses a plurality of temporally continuous images to estimate the quantity of monitoring targets for each local region of each image, and estimates the movement state of the monitoring targets from the time-series change of the quantity estimated in each local region.
- the program recording medium records a program that causes a computer to execute a quantity estimation process of estimating, using a plurality of temporally continuous images, the quantity of monitoring targets for each local region of the images, and a movement state estimation process of estimating the movement state of the monitoring targets from the time-series change of the quantity estimated in each local region.
- FIG. 1 is a block diagram illustrating an embodiment of a movement status estimation apparatus.
- FIG. 2 is an explanatory diagram illustrating an example of processing for estimating the quantity to be monitored.
- FIG. 3 is an explanatory diagram illustrating an example of the movement status of the monitoring target.
- FIG. 4 is an explanatory diagram showing a relationship between a local region in which the quantity to be monitored is estimated and particles present in the local region.
- FIG. 5 is an explanatory diagram illustrating an example of processing for updating the weight value when the local regions overlap.
- FIG. 6 is an explanatory diagram illustrating an example of a situation where the detection probabilities are different.
- FIG. 7 is an explanatory diagram illustrating an example of processing for calculating the quantity of monitoring targets that have passed a predetermined position.
- FIG. 8 is a flowchart illustrating an operation example of the movement state estimation apparatus.
- FIG. 9 is a block diagram illustrating an overview of the movement status estimation apparatus.
- FIG. 10 is a block diagram illustrating a hardware configuration of a computer apparatus that implements the movement state estimation apparatus.
- FIG. 1 is a block diagram showing an embodiment of a movement situation estimation apparatus according to the present invention.
- the movement state estimation device 100 of this embodiment includes an image input unit 11, a number-of-people estimation unit 12, a flow calculation unit 13, a state storage device 14, a state prediction unit 15, a staying information calculation unit 16, an individual person detection unit 17, a state update unit 18, and a number-of-people output unit 19.
- the arrows shown in the figure show an example of the data flow.
- the flow of data in the movement status estimation device 100 is not limited to a specific direction.
- the image input unit 11 acquires an image at a certain processing time from a video (moving image).
- the image acquired by the image input unit 11 is referred to as a “target image”.
- the image input unit 11 receives input of a plurality of target images that are temporally continuous.
- the number estimation unit 12 estimates the number of photographed people for each local region in the target image. That is, the number estimating unit 12 estimates the number of persons (the number of monitoring targets) for each local region of the input target image.
- the method for estimating the number of people by the number-of-people estimation unit 12 is not particularly limited.
- the number-of-people estimation unit 12 may estimate the number of people by comparing the crowd patch described in Non-Patent Document 1 with a local region of the target image, or, as in the method described in Patent Document 1, may estimate the number of people using a plurality of images including the target image.
- FIG. 2 is an explanatory diagram showing an example of processing for estimating the number of people.
- the number estimating unit 12 extracts a local region 21 from the target image 20 and estimates the number of persons included in the local region 21. In the example shown in FIG. 2, the number estimating unit 12 estimates that there are four people in the local region 21.
- the number-of-people output unit 19 may output an image representing the number of persons estimated for each local region in a mode (color, shading, etc.) corresponding to that number. The specific processing executed by the number-of-people output unit 19 will be described later.
- the number-of-people estimation unit 12 can capture the time-series change of the number of people in each local region. Particularly in a crowded environment, people can be expected to move as a group. Therefore, the number-of-people estimation unit 12 estimates the movement state of persons from the transition of the number of persons in each local region.
- the number estimating unit 12 predicts the future position of the person from the movement of the person when the number of persons is estimated.
- the number estimating unit 12 may assume that the monitoring target moves in all directions at an equal probability and at a constant speed in the initial state. Further, the number estimating unit 12 estimates the future number of persons in each local region based on the predicted position of the future person.
- the estimation of the future number of people by the number-of-people estimation unit 12 is hereinafter referred to as "prediction".
- the number-of-people estimation unit 12 may assume that each person moves at a constant speed, or may use the prediction result of the state prediction unit 15 described later.
- the number-of-people estimation unit 12 compares the number of people predicted in advance for each local region at a certain future time with the number of people estimated for each local region from the target image at that time. The number-of-people estimation unit 12 then estimates the movement state of persons with emphasis on the local regions where the difference in the number of persons is smaller. Specifically, the number-of-people estimation unit 12 may estimate that a person who was in a specific local region has moved to the local region, among a plurality of local regions close to that region, where the difference from the number of persons estimated from the target image is smallest. A code sketch of this comparison follows.
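- as a concrete illustration of this comparison, the following is a minimal sketch (not taken from the patent; the function and variable names are assumptions) of picking, among the local regions near a source region, the one whose estimated count best matches its prediction:

```python
def best_matching_region(predicted, estimated, neighbors, region):
    """Pick the neighbor of `region` whose estimated head count is
    closest to its predicted head count (illustrative helper).

    predicted, estimated: dicts mapping region id -> head count
    neighbors: dict mapping region id -> list of nearby region ids
    """
    # A smaller |predicted - estimated| gap means the region better
    # explains the observation, so it is emphasized in the estimate.
    return min(neighbors[region],
               key=lambda r: abs(predicted[r] - estimated[r]))
```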
- FIG. 3 is an explanatory diagram illustrating an example of a movement situation of a person.
- the density map illustrated in FIG. 3 indicates that more people are present in regions with lighter colors.
- suppose the number-of-people estimation unit 12 estimates that many persons are present in regions 31a and 31b of the target image at a certain time, and that many persons are present in regions 32a and 32b of the target image at a later time. From this time-series change, the movement of persons from region 31a to region 32a (arrow 33a) and from region 31b to region 32b (arrow 33b) can be grasped.
- the number-of-people estimation unit 12 may determine whether a person included in the target image at a certain time is a person who has newly appeared at that time or an existing person who appeared in the target image before that time (that is, a person who has moved within the photographed area). For example, when the presence of a person is estimated at a position where no movement of a person was predicted, the number-of-people estimation unit 12 may determine that the person is a newly appearing person.
- the flow calculation unit 13 calculates an optical flow for the target image.
- the method for calculating the optical flow is not particularly limited.
- the flow calculation unit 13 may calculate the optical flow using feature points as in the Lucas-Kanade method, or may calculate the optical flow using a variational method as in the Horn-Schunck method.
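- as one concrete possibility (an illustrative sketch, not the patent's prescribed implementation; the file names are placeholders), OpenCV provides both styles of flow computation mentioned above:

```python
import cv2

prev_gray = cv2.imread("frame_t0.png", cv2.IMREAD_GRAYSCALE)
next_gray = cv2.imread("frame_t1.png", cv2.IMREAD_GRAYSCALE)

# Sparse flow at feature points (Lucas-Kanade style).
pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                              qualityLevel=0.01, minDistance=7)
new_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, pts, None)

# Dense flow over the whole image (variational methods such as
# Horn-Schunck are similar in spirit; Farneback is what OpenCV ships).
flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)
```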
- the state storage device 14 stores the past state of the person.
- the state of the person includes the position, speed, and likelihood (weight value) of the person at the past time point.
- the person's state may include the person's operation state (moving state or stationary state).
- the state storage device 14 may hold, as part of each particle's state, a variable s_i indicating whether the person is in a moving state or a stationary state.
- the sum of the particle weight values w_i corresponds to the quantity of monitoring targets, that is, the number of people. For example, when 100 particles are newly scattered per monitoring target in the PHD particle filter, the weight value of each particle set at that time is "0.01".
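- the following is a minimal sketch of such a particle set (the dict layout and all names are illustrative assumptions, reused by the later sketches in this description):

```python
import numpy as np

def spawn_particles(position, n_particles=100):
    """Scatter particles for one monitoring target. The weights w_i of
    one target's particles sum to 1 (0.01 each for 100 particles)."""
    return {
        "pos": np.random.normal(position, 5.0, size=(n_particles, 2)),
        "vel": np.random.normal(0.0, 1.0, size=(n_particles, 2)),
        "moving": np.ones(n_particles, dtype=bool),    # s_i flag
        "w": np.full(n_particles, 1.0 / n_particles),  # w_i = 0.01
    }
```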
- the state prediction unit 15 uses the past person state stored in the state storage device 14 to predict the state of the person at the time when the image input unit 11 acquires the image.
- the state prediction unit 15 may predict, in addition to the person's position, speed, and weight value, whether the person is in the moving state or the stationary state.
- the state prediction unit 15 predicts the future state of the person using a plurality of particles that represent the state of the person.
- a method in which the state prediction unit 15 predicts the state of an object using a PHD particle filter will be described.
- the state prediction unit 15 predicts the state of the person at the time when the image input unit 11 acquires the target image.
- one monitoring target state is expressed by a plurality of particles.
- the method by which the state prediction unit 15 predicts the position and speed is the same as the prediction method performed by a general particle filter.
- when the position of a particle at a certain time is x and its velocity is v, the position of the particle after time dt has elapsed is represented by x + v·dt + e, and its velocity is represented by v + f. Here, e and f represent noise that cannot be expressed by the constant-velocity motion model, and are, for example, random values generated from normal distributions with predetermined standard deviations. A sketch of this step appears below.
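- a sketch of this prediction step, under the illustrative particle layout introduced earlier (the noise scales are assumptions):

```python
import numpy as np

def predict(particles, dt, pos_noise=1.0, vel_noise=0.5):
    """Constant-velocity prediction: x <- x + v*dt + e, v <- v + f,
    with e and f drawn from zero-mean normal distributions."""
    n = len(particles["w"])
    e = np.random.normal(0.0, pos_noise, size=(n, 2))
    f = np.random.normal(0.0, vel_noise, size=(n, 2))
    particles["pos"] += particles["vel"] * dt + e
    particles["vel"] += f
```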
- the state prediction unit 15 predicts the operation state.
- the motion state of the person transitions from the stationary state to the moving state according to a predetermined probability P
- the motion state transitions from the moving state to the stationary state based on a predetermined probability Q.
- when s_i represents the stationary state, the state prediction unit 15 generates a uniform random number from 0 to 1 and, if the value is P or less, changes s_i to a value indicating the moving state. Conversely, when s_i represents the moving state, the state prediction unit 15 generates a uniform random number from 0 to 1 and, if the value is Q or less, changes s_i to a value indicating the stationary state.
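- a sketch of this stochastic transition under the same illustrative layout (the values of P and Q are assumptions):

```python
import numpy as np

def transition_motion_state(particles, p=0.1, q=0.1):
    """Flip s_i: stationary -> moving with probability P,
    moving -> stationary with probability Q."""
    u = np.random.uniform(0.0, 1.0, size=len(particles["w"]))
    start = ~particles["moving"] & (u <= p)  # stationary -> moving
    stop = particles["moving"] & (u <= q)    # moving -> stationary
    particles["moving"][start] = True
    particles["moving"][stop] = False
```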
- the state prediction unit 15 may predict the state of the variable s_i based on past history and statistical results. For example, suppose the tracking target has changed from the stationary state to the moving state, so that s_i indicates the moving state. If this transition happened recently and a predetermined period has not yet elapsed, it can be assumed that the tracking target does not immediately return to the stationary state. In this case, the state prediction unit 15 may therefore hold the motion state in the moving state for a certain period.
- similarly, when the tracking target has recently changed to the stationary state, the state prediction unit 15 may hold the motion state in the stationary state for a certain period.
- the state prediction unit 15 may also vary the transition behavior of the motion state according to the location in the image. For example, the state prediction unit 15 may set the transition probability Q from the moving state to the stationary state small in regions of the image showing a passage, and set Q large in regions showing a place where many people wait.
- the staying information calculation unit 16 uses the target image acquired by the image input unit 11 to extract regions determined to contain staying objects.
- for example, the staying information calculation unit 16 may determine for each pixel whether it belongs to a temporarily stationary object, using a method for detecting objects that have been stationary for a long period, and detect the regions determined to be temporarily stationary objects by a labeling process.
- the individual person detection unit 17 individually detects persons from the target image. For example, a person shown in the foreground of an image is rarely occluded by other objects and can often be detected by a general detection method. Therefore, the individual person detection unit 17 sets a region in which persons can be detected individually (hereinafter referred to as an individual detection region), and detects persons from the individual detection region of the target image.
- the state update unit 18 updates the state of the person predicted by the state prediction unit 15 based on the estimation result of the number estimation unit 12. Further, the state update unit 18 may update the state of the person based on the processing results of the flow calculation unit 13, the stay information calculation unit 16, and the individual person detection unit 17. Hereinafter, a method in which the state update unit 18 updates the monitoring target will be specifically described.
- the state update unit 18 updates the state of the person in the corresponding region according to the number of people in each local region estimated by the number estimation unit 12.
- the state update unit 18 updates the weight values w_i of the particles predicted to exist in each local region according to the number of people estimated for that region by the number-of-people estimation unit 12. For example, the state update unit 18 updates the weight value w_i of each particle so that the sum of the weights in a local region equals the estimated number of people.
- alternatively, to make the change of the weight values w_i gradual, the state update unit 18 may use a predetermined value α between 0 and 1 and update the weight value w_i of each particle so that the updated sum of the weights equals (1 − α) × (previous sum of the weights) + α × (estimated number of people). In this way, the state update unit 18 may update the particle weight values so that their sum approaches the number of people in the local region. A sketch of this update follows.
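- the sketch below implements this smoothed update for one local region (`in_region` is an assumed boolean mask selecting the particles predicted to lie in the region):

```python
def update_region_weights(particles, in_region, est_count, alpha=0.5):
    """Scale w_i so that the region's weight sum moves toward the
    estimated count: new_sum = (1 - alpha) * old_sum + alpha * est_count."""
    old_sum = particles["w"][in_region].sum()
    if old_sum > 0:
        new_sum = (1.0 - alpha) * old_sum + alpha * est_count
        particles["w"][in_region] *= new_sum / old_sum
```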
- FIG. 4 is an explanatory diagram showing the relationship between the local area where the number of persons is estimated and the particles existing in the local area.
- white or black circles indicate particles.
- Black particles indicate particles whose moving direction (arrow in the figure) is the same as or within a predetermined angle with a specific direction.
- for example, if two persons are estimated in the local region 41c, the state update unit 18 updates the weight value of each of the three particles included in the local region 41c so that the sum of their weight values becomes 2.
- the state update unit 18 may update the particle weight values individually for each local region, or may combine overlapping local regions and update the particle weight values collectively.
- FIG. 5 is an explanatory diagram showing an example of processing for updating the weight value when the local regions overlap.
- the number of persons is estimated for three local regions that partially overlap.
- the state update unit 18 may update the weight values of the particles in a lump for a region where the local region 42a, the local region 42b, and the local region 42c are combined. Further, the state updating unit 18 may individually update the weights of particles included in each local region for each of the local region 42a, the local region 42b, and the local region 42c.
- the state update unit 18 may update all weight values of the particles included in a local region at the same rate, or at a different rate for each particle. For example, the state update unit 18 may change the updated weight value according to the detection probability of a person. Specifically, when the detection probability of each particle is P_i and the estimated number of people is H, the state update unit 18 may calculate the updated weight value of each particle as w_i ← (1 − P_i)·w_i + H·(P_i·w_i)/Σ(P_i·w_i).
- with this update, the higher a particle's detection probability, the more its weight value is affected by the estimated number of people H; where a person cannot be detected, the detection probability is 0 and the weight value does not change.
- in other words, the update can give priority to information from regions with a higher detection probability. A sketch of this formula follows.
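- the sketch below implements the per-particle formula above (the array names are assumptions):

```python
import numpy as np

def update_by_detectability(w, p_det, est_count):
    """w_i <- (1 - P_i) * w_i + H * (P_i * w_i) / sum(P_i * w_i).
    A particle with P_i = 0 keeps its weight unchanged."""
    contrib = p_det * w
    total = contrib.sum()
    if total == 0.0:
        return w.copy()
    return (1.0 - p_det) * w + est_count * contrib / total
```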
- FIG. 6 is an explanatory diagram showing an example of a situation where the detection probabilities are different.
- the far side of the target image is often captured at a shallow depression angle, so the detection probability there tends to be lower.
- in such a case, the state update unit 18 updates the weights of particles on the near side to higher values and the weights of particles on the far side to lower values.
- when a person is estimated at a position where no particles exist (for example, a newly appearing person), the state update unit 18 newly generates particles at that position. Specifically, the state update unit 18 may generate new particles randomly in the vicinity of the position according to a normal distribution with a predetermined standard deviation.
- the state update unit 18 may also update the weight values w_i of particles whose motion is close to the optical flow of the target image so that they increase. At this time, the state update unit 18 updates the particle weight values w_i so as not to change the sum of the weights.
- let L_i be a value that increases as the particle's motion is closer to the optical flow, and let S be the sum of the weight values before the update.
- the state update unit 18 may then update the weight value of each particle to L_i·w_i/Σ(L_i·w_i) × S. With this update, the weight value w_i increases as the particle's motion is closer to the optical flow.
- the method of updating the weight value of the particle that moves close to the optical flow is not limited to the above method.
- for example, the state update unit 18 may simply multiply a particle's weight value by a positive constant when the angle θ formed by the optical flow vector and the particle's motion vector is equal to or smaller than a threshold, and then normalize each weight value so that the sum of the multiplied weight values equals the original sum S.
- alternatively, the state update unit 18 may determine that a particle whose motion vector is within a threshold distance of the optical flow vector is a particle with close motion. A sketch of the proximity-based update follows.
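- the sketch below implements the proximity-based update, using the inverse of the distance between the particle's velocity and the local flow vector as one possible choice of L_i (the text only requires that L_i grow as the motions become closer):

```python
import numpy as np

def update_by_flow(w, vel, flow_vec):
    """w_i <- L_i * w_i / sum(L_i * w_i) * S, leaving the total S unchanged."""
    s_total = w.sum()
    dist = np.linalg.norm(vel - flow_vec, axis=1)
    l_i = 1.0 / (1.0 + dist)  # larger when motion is closer to the flow
    scaled = l_i * w
    return scaled / scaled.sum() * s_total
```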
- the state update unit 18 may also update the particle weight values according to the proximity between the regions determined by the staying information calculation unit 16 to contain staying objects and particles predicted to be stationary. Specifically, the state update unit 18 may increase the weight value of a particle whose motion state is predicted to be stationary the closer the particle is to such a region. At this time, the state update unit 18 may update the weight values using, for example, a method similar to the one used for particles whose motion is close to the optical flow.
- when the individual person detection unit 17 detects a person in the individual detection region, the state update unit 18 may update the weight values of the particles existing in the region corresponding to the detected person by the method used in a general PHD particle filter. That is, for the range in which the individual person detection unit 17 detects persons, the state update unit 18 may update the weight values of the particles included in each local region by a method other than using the estimated number of people in each local region. Selecting the more suitable update method according to the area of the captured image in this way can improve the accuracy of tracking the monitoring targets.
- the state update unit 18 deletes particles whose weight value w_i is equal to or smaller than a threshold. Further, the state update unit 18 updates the position information of persons in the same manner as general tracking processing, and records the updated person states in the state storage device 14. The state update unit 18 may also perform resampling to re-scatter particles according to each particle's weight value.
- the number-of-people output unit 19 outputs the number of people within the shooting range based on the state of the persons. Specifically, the number-of-people output unit 19 outputs the number of people included in the target image using the particle weight values w_i updated by the state update unit 18.
- the number-of-people output unit 19 may calculate the sum of the particle weight values w_i and output it as the number of people within the imaging range. When outputting the number of people present in a predetermined area, the number-of-people output unit 19 may identify the particles existing in that area and calculate the sum of the weight values w_i of those particles.
- the number-of-people output unit 19 may also, for example, calculate the sum of the weight values w_i of the particles in the stationary state and output it as the number of people staying within the imaging range.
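- under the illustrative particle layout used in the earlier sketches, these counts reduce to simple weight sums:

```python
def total_count(particles):
    """People in the imaging range: sum of all w_i."""
    return particles["w"].sum()

def staying_count(particles):
    """People staying: sum of w_i over stationary particles."""
    return particles["w"][~particles["moving"]].sum()
```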
- the number-of-people output unit 19 may also use the current and past particle states to output the number of people who have moved in a specific direction, or the number of people who have crossed a line defined at a predetermined position in the imaging range (that is, the number of people who have passed a predetermined line).
- the number-of-people output unit 19 outputs the number of people who have passed a predetermined position based on the weight of the particles that have passed that position.
- for example, the number-of-people output unit 19 may identify particles for which the direction connecting the past position and the current position matches a specific direction, or falls within a predetermined angle of it, and calculate the sum of the weight values of those particles.
- when calculating the sum of the weight values, the number-of-people output unit 19 may use the past particle weight values, the current particle weight values, or the average of the two.
- when outputting the number of people who have passed a predetermined line, the number-of-people output unit 19 identifies the particles that crossed the line in moving from their previous position to their current position, and calculates the sum of their weight values. When only the particles moving across the line in a specific direction are targeted, the number-of-people output unit 19 may target the particles for which the inner product of a normal vector of the line and a vector indicating the particle's movement direction is 0 or more. For example, if the predetermined line is a horizontal straight line, the particles crossing it include both particles moving from above the line to below it and particles moving from below to above. To target only the particles moving downward, the number-of-people output unit 19 calculates the inner product of the downward normal vector of the line and each particle's movement vector, and targets the particles whose inner product is 0 or more.
- the number-of-people output unit 19 can output the number of people who have passed the predetermined line during a predetermined period by accumulating the sums of the weight values within that period. As when counting people who have moved in a specific direction, the past weight values, the current weight values, or both can be used. A sketch of the directional crossing count follows.
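- the sketch below implements the directional crossing count for a horizontal line (assuming image coordinates in which y grows downward; all names are illustrative):

```python
import numpy as np

def passed_count(prev_pos, cur_pos, w, line_y):
    """Sum the weights of particles that crossed y = line_y moving
    downward, i.e. whose motion vector has a non-negative inner
    product with the line's downward normal (0, 1)."""
    crossed = (prev_pos[:, 1] < line_y) & (cur_pos[:, 1] >= line_y)
    # Inner product with the normal (0, 1) reduces to the y displacement.
    downward = (cur_pos - prev_pos)[:, 1] >= 0.0
    return w[crossed & downward].sum()
```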
- FIG. 7 is an explanatory diagram showing an example of processing for calculating the quantity of monitoring targets that have passed a predetermined position.
- a dotted circle illustrated in FIG. 7 indicates a past particle, and a solid circle indicates a current particle.
- the arrow illustrated in FIG. 7 indicates the movement state of particles from the past to the present.
- the number-of-persons output unit 19 identifies particles that have passed through the solid line 52 from above to below, and calculates the sum of the weights of the identified particles.
- the particle 50b, the particle 50c, and the particle 50d pass through the solid line 52 from the upper direction to the lower direction. Therefore, the number-of-people output unit 19 calculates the sum of the weights of the particles 50b, 50c, and 50d and outputs it as the number of people passing through. For example, when it is desired to output the number of passing people within a certain period, the number output unit 19 may add up the sum of the weights of particles within the certain period.
- the image input unit 11, the number of people estimation unit 12, the flow calculation unit 13, the state prediction unit 15, the staying information calculation unit 16, the individual person detection unit 17, the state update unit 18, and the number of people output unit 19 may be realized by a program. These units can be realized by a processor of a computer that operates according to the program.
- FIG. 10 is a block diagram illustrating a hardware configuration of the computer apparatus 200 that implements the movement state estimation apparatus 100.
- the computer apparatus 200 includes a CPU (Central Processing Unit) 201, a ROM (Read Only Memory) 202, a RAM (Random Access Memory) 203, a storage device 204, a drive device 205, a communication interface 206, and an input/output interface 207.
- the movement state estimation device 100 can be realized by the configuration (or part thereof) shown in FIG.
- the CPU 201 executes the program 208 using the RAM 203.
- the program 208 may be stored in the ROM 202.
- the program 208 may be recorded on a recording medium 209 such as a flash memory and read by the drive device 205 or transmitted from an external device via the network 210.
- the communication interface 206 exchanges data with an external device via the network 210.
- the input / output interface 207 exchanges data with peripheral devices (such as an input device and a display device).
- the communication interface 206 and the input / output interface 207 can function as means for acquiring or outputting data.
- the movement status estimation apparatus 100 may be configured by a single circuit (such as a processor) or may be configured by a combination of a plurality of circuits.
- the circuit here may be either dedicated or general purpose.
- the CPU 201 may function as the image input unit 11, the number-of-people estimation unit 12, the flow calculation unit 13, the state prediction unit 15, the staying information calculation unit 16, the individual person detection unit 17, the state update unit 18, and the number-of-people output unit 19 according to the program 208.
- alternatively, the image input unit 11, the number-of-people estimation unit 12, the flow calculation unit 13, the state prediction unit 15, the staying information calculation unit 16, the individual person detection unit 17, the state update unit 18, and the number-of-people output unit 19 may each be realized by dedicated hardware. Further, the state storage device 14 may be realized by the storage device 204, or may be an external device connected via the communication interface 206.
- FIG. 8 is a flowchart illustrating an operation example of the movement state estimation apparatus 100 according to the present embodiment.
- the number of persons estimation unit 12 estimates the number of persons for each local region of each image using a plurality of temporally continuous images (step S11). Then, the number estimating unit 12 estimates the movement situation of the person from the time series change of the quantity estimated in each local region (step S12).
- the number estimation unit 12 predicts the future number of persons in each local region by predicting the future position of the person from the estimated movement state of the person.
- the movement status of the person can be determined from the state of the particles representing the person, for example.
- the number-of-people estimation unit 12 can predict the future number of people, for example, from the positions of future particles predicted by the state prediction unit 15. The number-of-people estimation unit 12 then estimates the movement state with emphasis on the local regions where the difference between the predicted and the estimated numbers of people is small.
- as described above, in this embodiment the number-of-people estimation unit 12 estimates the number of people for each local region of each image using a plurality of temporally continuous images, and estimates the movement state of persons from the time-series change of the number of people estimated in each local region. Therefore, according to this embodiment, the movement state of persons can be accurately estimated even in a crowded environment where tracking individual persons is difficult.
- further, in this embodiment, the state prediction unit 15 predicts the future state of a person using a plurality of particles expressing the person's state, and the number-of-people output unit 19 calculates the number of people from the weighted particles. That is, according to this embodiment, the number of persons who have passed a specific location can be measured by tracking the state of the monitoring targets using particles with weight values. Moreover, not only the number of people who have passed a specific location but also the number of people moving in a specific direction can be measured. As a result, not only the degree of congestion at a certain place but also the flow rate (flow of people) at that place can be measured.
- FIG. 9 is a block diagram showing an outline of the movement status estimation apparatus in the present embodiment.
- the movement state estimation apparatus shown in FIG. 9 includes a quantity estimation unit 81 that estimates, using a plurality of temporally continuous images, the quantity of monitoring targets for each local region of each image, and a movement state estimation unit 82 that estimates the movement state of the monitoring targets from the time-series change of the quantity estimated in each local region.
- the quantity estimation unit 81 and the movement state estimation unit 82 correspond to the number-of-people estimation unit 12 in the embodiment described above.
- the movement status estimation apparatus having such a configuration can accurately estimate the movement status of the monitoring target even in a congested environment.
- the movement state estimation unit 82 may predict the future position of the monitoring target based on the movement state of the monitoring target at the estimated time point to predict the future quantity of the monitoring target in each local region.
- the movement state estimation unit 82 may compare the future quantity of monitoring targets predicted for each local region with the quantity of monitoring targets estimated for each local region from the target image at that future time, and estimate the movement state of the monitoring targets with emphasis on the local regions where the difference between the quantities is small.
- the movement state estimation unit 82 may determine whether a monitoring target captured in the target image is a new monitoring target or one that has moved within the target image, and estimate the movement state of the monitoring target accordingly.
- the movement state estimation apparatus may include a prediction unit (for example, the state prediction unit 15) that predicts the future state of the monitoring target using a plurality of particles that express the state of the monitoring target.
- the movement state estimation device may include an update unit (for example, the state update unit 18) that updates the weight values set for the particles predicted to be included in each local region according to the estimated quantity of monitoring targets in each local region.
- the movement state estimation unit 82 may estimate the movement state of the monitoring target from the time series change of the sum of the weight values of the particles included in each local region. According to such a configuration, it is possible to cope with various movement situations performed by the monitoring target.
- the update unit may update the weight values set for the particles so that the sum of the weight values of the particles predicted to be included in a local region approaches the estimated quantity of monitoring targets in the corresponding local region.
- a weight value is set so that the total is 1 for the particles expressing the state of one monitoring target. Further, at least the position and speed of the monitoring target are set for the particles expressing the state of the monitoring target. The prediction unit predicts the future position of the monitoring target based on the position and velocity set for the particles.
- the movement state estimation device may include a quantity output unit (for example, the number output unit 19) that outputs the quantity to be monitored according to the weight set for the particles.
- the quantity output unit may output the quantity to be monitored that has passed a predetermined position based on the weight of the particles that have passed that position.
- the movement state estimation apparatus may include a flow calculation unit (for example, the flow calculation unit 13) that calculates the optical flow of the target image. Then, the updating unit may update so as to increase the weight value of particles that move close to the optical flow. According to such a configuration, it is possible to estimate the movement state with emphasis on particles close to the motion estimated from the image.
- the movement state estimation device may include a staying information calculation unit (for example, the staying information calculation unit 16) that extracts from the target image the regions determined to contain staying objects. The prediction unit then predicts the future motion state of the monitoring target, and the update unit may update the weight value of a particle whose motion state is predicted to be stationary so that it increases the closer the particle is to a region determined to contain staying objects. With such a configuration, the state of a monitoring target that has stopped moving can be determined appropriately.
- the movement state estimation device may include a monitoring target detection unit that detects monitoring targets from the target image (specifically, from an individual detection region set in the target image as a range in which monitoring targets can be detected individually). For the range in which the monitoring target detection unit detects monitoring targets, the update unit may then update the weight values set for the particles predicted to be included in each local region by a method other than using the estimated quantity of monitoring targets in each local region.
- an example of such a method is the weight update performed by a general PHD particle filter.
- the present invention is preferably applied to a movement state estimation device that estimates the number of moving objects.
- for example, in a monitoring system using fixed cameras, the present invention is suitably applied to an apparatus that estimates, from images captured by the cameras, the flow of objects such as persons or cars, or the number of objects passing through a specific location.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Human Computer Interaction (AREA)
- Image Analysis (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Hereinafter, embodiments of the present invention will be described with reference to the drawings. Although a person is used as the example of the monitoring target in the following description of the embodiments, the monitoring target may be an object other than a person, such as a bicycle or an automobile.
The movement state estimation device 100 may be configured by a single circuit (such as a processor) or by a combination of a plurality of circuits. The circuitry here may be either dedicated or general-purpose.
Next, an overview of an embodiment of the present invention will be described. FIG. 9 is a block diagram showing an overview of the movement state estimation apparatus in this embodiment. The movement state estimation apparatus shown in FIG. 9 includes a quantity estimation unit 81 that estimates, using a plurality of temporally continuous images, the quantity of monitoring targets for each local region of each image, and a movement state estimation unit 82 that estimates the movement state of the monitoring targets from the time-series change of the quantity estimated in each local region. The quantity estimation unit 81 and the movement state estimation unit 82 correspond to the number-of-people estimation unit 12 in the embodiment described above.
12 number-of-people estimation unit
13 flow calculation unit
14 state storage device
15 state prediction unit
16 staying information calculation unit
17 individual person detection unit
18 state update unit
19 number-of-people output unit
20 target image
21, 41a–41c, 42a–42c, 43 local regions
50a–50f, 51a–51f particles
Claims (14)
1. A movement state estimation device comprising: quantity estimation means for estimating, using a plurality of temporally continuous images, a quantity of monitoring targets for each local region of each of the images; and movement state estimation means for estimating a movement state of the monitoring targets from a time-series change of the quantity estimated in each of the local regions.
2. The movement state estimation device according to claim 1, wherein the movement state estimation means predicts future positions of the monitoring targets from the movement state of the monitoring targets at the time of estimation to predict a future quantity of the monitoring targets in each of the local regions, compares the future quantity of the monitoring targets predicted for each local region with the quantity of the monitoring targets estimated for each local region from a target image at the future time, and estimates the movement state of the monitoring targets with emphasis on the local regions where the difference between the quantities is smaller.
3. The movement state estimation device according to claim 1 or 2, wherein the movement state estimation means determines whether a monitoring target included in the images is a new monitoring target or an existing monitoring target, and estimates the movement state of the monitoring target.
4. The movement state estimation device according to any one of claims 1 to 3, further comprising: prediction means for predicting a future state of the monitoring targets using a plurality of particles expressing the state of the monitoring targets; and update means for updating weight values set for the particles predicted to be included in each of the local regions according to the estimated quantity of monitoring targets in each of the local regions, wherein the movement state estimation means estimates the movement state of the monitoring targets from a time-series change of the sum of the weight values of the particles included in each of the local regions.
5. The movement state estimation device according to claim 4, wherein the update means updates the weight values so that the sum of the weight values set for the particles predicted to be included in a local region approaches the estimated quantity of monitoring targets in the corresponding local region.
6. The movement state estimation device according to claim 4 or 5, further comprising quantity output means for outputting the quantity of monitoring targets according to the weights set for the particles.
7. The movement state estimation device according to claim 6, wherein the quantity output means outputs the quantity of monitoring targets that have passed a predetermined position based on the weights of the particles that have passed that position.
8. The movement state estimation device according to any one of claims 4 to 7, further comprising flow calculation means for calculating an optical flow of the images, wherein the update means increases the weight values of the particles whose motion is close to the optical flow.
9. The movement state estimation device according to any one of claims 4 to 8, further comprising staying information calculation means for extracting, from the target image, a region determined to contain staying objects, wherein the prediction means predicts a future motion state of the monitoring targets, and the update means increases the weight value of a particle whose motion state is predicted to be stationary the closer the particle is to the region determined to contain staying objects.
10. The movement state estimation device according to any one of claims 4 to 9, further comprising monitoring target detection means for detecting the monitoring targets from the target image, wherein, for the range in which the monitoring target detection means has detected the monitoring targets, the update means updates the weight values set for the particles predicted to be included in each of the local regions by a method other than using the estimated quantity of monitoring targets in each of the local regions.
11. A movement state estimation method comprising: estimating, using a plurality of temporally continuous images, a quantity of monitoring targets for each local region of each of the images; and estimating a movement state of the monitoring targets from a time-series change of the quantity estimated in each of the local regions.
12. The movement state estimation method according to claim 11, further comprising: predicting future positions of the monitoring targets from the movement state of the monitoring targets at the time of estimation to predict a future quantity of the monitoring targets in each of the local regions; comparing the future quantity of the monitoring targets predicted for each local region with the quantity of the monitoring targets estimated for each local region from the target image at the future time; and estimating the movement state of the monitoring targets with emphasis on the local regions where the difference between the quantities is smaller.
13. A program recording medium recording a program for causing a computer to execute: a quantity estimation process of estimating, using a plurality of temporally continuous images, a quantity of monitoring targets for each local region of the plurality of images; and a movement state estimation process of estimating a movement state of the monitoring targets from a time-series change of the quantity estimated in each of the local regions.
14. The program recording medium according to claim 13, wherein the program causes the computer, in the movement state estimation process, to predict future positions of the monitoring targets from the movement state of the monitoring targets at the time of estimation to predict a future quantity of the monitoring targets in each of the local regions, to compare the future quantity of the monitoring targets predicted for each local region with the quantity of the monitoring targets estimated for each local region from the target image at the future time, and to estimate the movement state of the monitoring targets with emphasis on the local regions where the difference between the quantities is smaller.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/543,408 US10325160B2 (en) | 2015-01-14 | 2016-01-13 | Movement state estimation device, movement state estimation method and program recording medium |
JP2016569286A JP6969871B2 (ja) | 2015-01-14 | 2016-01-13 | 移動状況推定装置、移動状況推定方法およびプログラム |
US16/296,468 US10657386B2 (en) | 2015-01-14 | 2019-03-08 | Movement state estimation device, movement state estimation method and program recording medium |
US16/296,516 US10755108B2 (en) | 2015-01-14 | 2019-03-08 | Movement state estimation device, movement state estimation method and program recording medium |
US16/921,447 US20200334472A1 (en) | 2015-01-14 | 2020-07-06 | Movement state estimation device, movement state estimation method and program recording medium |
US17/849,211 US20220327839A1 (en) | 2015-01-14 | 2022-06-24 | Movement state estimation device, movement state estimation method and program recording medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-004963 | 2015-01-14 | ||
JP2015004963 | 2015-01-14 |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/543,408 A-371-Of-International US10325160B2 (en) | 2015-01-14 | 2016-01-13 | Movement state estimation device, movement state estimation method and program recording medium |
US16/296,468 Continuation US10657386B2 (en) | 2015-01-14 | 2019-03-08 | Movement state estimation device, movement state estimation method and program recording medium |
US16/296,516 Continuation US10755108B2 (en) | 2015-01-14 | 2019-03-08 | Movement state estimation device, movement state estimation method and program recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016114134A1 true WO2016114134A1 (ja) | 2016-07-21 |
Family
ID=56405682
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/000146 WO2016114134A1 (ja) | 2015-01-14 | 2016-01-13 | 移動状況推定装置、移動状況推定方法およびプログラム記録媒体 |
Country Status (3)
Country | Link |
---|---|
US (5) | US10325160B2 (ja) |
JP (4) | JP6969871B2 (ja) |
WO (1) | WO2016114134A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018025831A1 (ja) * | 2016-08-04 | 2018-02-08 | 日本電気株式会社 | 人流推定装置、表示制御装置、人流推定方法および記録媒体 |
WO2018051944A1 (ja) * | 2016-09-13 | 2018-03-22 | 日本電気株式会社 | 人流推定装置、人流推定方法および記録媒体 |
JP2018116511A (ja) * | 2017-01-18 | 2018-07-26 | 日本放送協会 | 状態推定器、及びプログラム |
JP2018180619A (ja) * | 2017-04-04 | 2018-11-15 | キヤノン株式会社 | 情報処理装置、情報処理方法及びプログラム |
JP2019128605A (ja) * | 2018-01-19 | 2019-08-01 | 日本電信電話株式会社 | 予測装置、予測方法及びコンピュータプログラム |
WO2019229979A1 (ja) * | 2018-06-01 | 2019-12-05 | 日本電気株式会社 | 情報処理装置、制御方法、及びプログラム |
WO2019244627A1 (ja) * | 2018-06-22 | 2019-12-26 | 日本電信電話株式会社 | 推定方法、推定装置及び推定プログラム |
JPWO2021181612A1 (ja) * | 2020-03-12 | 2021-09-16 |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10789484B2 (en) | 2016-03-07 | 2020-09-29 | Nec Corporation | Crowd type classification system, crowd type classification method and storage medium for storing crowd type classification program |
WO2018011944A1 (ja) * | 2016-07-14 | 2018-01-18 | 三菱電機株式会社 | 群集監視装置、および、群集監視システム |
US10839552B2 (en) * | 2017-06-01 | 2020-11-17 | Nec Corporation | Image processing apparatus, tracking method, and program |
SG10201802673VA (en) * | 2018-03-29 | 2019-10-30 | Nec Asia Pacific Pte Ltd | Method and system for integration and automatic switching of crowd estimation techniques |
JP7216487B2 (ja) * | 2018-06-21 | 2023-02-01 | キヤノン株式会社 | 画像処理装置およびその制御方法 |
US20220343712A1 (en) * | 2019-08-22 | 2022-10-27 | Nippon Telegraph And Telephone Corporation | Number of people estimation device, number of people estimation method, and number of people estimation program |
JP7443002B2 (ja) * | 2019-09-13 | 2024-03-05 | キヤノン株式会社 | 画像解析装置、画像解析方法、及びプログラム |
JP7533571B2 (ja) * | 2020-03-27 | 2024-08-14 | 日本電気株式会社 | 人流予測システム、人流予測方法および人流予測プログラム |
US11373425B2 (en) * | 2020-06-02 | 2022-06-28 | The Nielsen Company (U.S.), Llc | Methods and apparatus for monitoring an audience of media based on thermal imaging |
US11595723B2 (en) | 2020-08-20 | 2023-02-28 | The Nielsen Company (Us), Llc | Methods and apparatus to determine an audience composition based on voice recognition |
US11553247B2 (en) | 2020-08-20 | 2023-01-10 | The Nielsen Company (Us), Llc | Methods and apparatus to determine an audience composition based on thermal imaging and facial recognition |
US11763591B2 (en) | 2020-08-20 | 2023-09-19 | The Nielsen Company (Us), Llc | Methods and apparatus to determine an audience composition based on voice recognition, thermal imaging, and facial recognition |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012234285A (ja) * | 2011-04-28 | 2012-11-29 | Dainippon Printing Co Ltd | 画像解析装置、画像解析方法、画像解析プログラム及び記録媒体 |
WO2014112407A1 (ja) * | 2013-01-16 | 2014-07-24 | 日本電気株式会社 | 情報処理システム、情報処理方法及びプログラム |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0460880A (ja) * | 1990-06-29 | 1992-02-26 | Shimizu Corp | 動体識別解析管理システム |
JP2855157B2 (ja) * | 1990-07-17 | 1999-02-10 | 清水建設株式会社 | 群衆歩行シミュレーションシステム |
JP2001076291A (ja) | 1999-09-02 | 2001-03-23 | Nri & Ncc Co Ltd | 通行量測定システム |
US6633232B2 (en) * | 2001-05-14 | 2003-10-14 | Koninklijke Philips Electronics N.V. | Method and apparatus for routing persons through one or more destinations based on a least-cost criterion |
US7123918B1 (en) * | 2001-08-20 | 2006-10-17 | Verizon Services Corp. | Methods and apparatus for extrapolating person and device counts |
JP2006031645A (ja) * | 2004-07-12 | 2006-02-02 | Nariyuki Mitachi | 動的群集密度のリアルタイム推定方法及び群集事故防止システム |
JP2006270865A (ja) * | 2005-03-25 | 2006-10-05 | Victor Co Of Japan Ltd | 画像監視装置 |
JP2007243342A (ja) * | 2006-03-06 | 2007-09-20 | Yokogawa Electric Corp | 画像監視装置及び画像監視システム |
WO2008058296A2 (en) * | 2006-11-10 | 2008-05-15 | Verificon Corporation | Method and apparatus for analyzing activity in a space |
JP4624396B2 (ja) | 2007-10-26 | 2011-02-02 | パナソニック株式会社 | 状況判定装置、状況判定方法、状況判定プログラム、異常判定装置、異常判定方法および異常判定プログラム |
US20090158309A1 (en) | 2007-12-12 | 2009-06-18 | Hankyu Moon | Method and system for media audience measurement and spatial extrapolation based on site, display, crowd, and viewership characterization |
CN102007516A (zh) | 2008-04-14 | 2011-04-06 | 汤姆森特许公司 | 自动跟踪对象的技术 |
JP2009294887A (ja) * | 2008-06-05 | 2009-12-17 | Vector Research Institute Inc | 建築設備制御システムおよびプログラム |
JP2010198566A (ja) | 2009-02-27 | 2010-09-09 | Nec Corp | 人数計測装置、方法及びプログラム |
WO2012111138A1 (ja) * | 2011-02-18 | 2012-08-23 | 株式会社日立製作所 | 歩行者移動情報検出装置 |
JP5680524B2 (ja) * | 2011-12-09 | 2015-03-04 | 株式会社日立国際電気 | 画像処理装置 |
US9165190B2 (en) | 2012-09-12 | 2015-10-20 | Avigilon Fortress Corporation | 3D human pose and shape modeling |
JP2014106879A (ja) * | 2012-11-29 | 2014-06-09 | Railway Technical Research Institute | 人の分布状況推定システム |
US9955124B2 (en) * | 2013-06-21 | 2018-04-24 | Hitachi, Ltd. | Sensor placement determination device and sensor placement determination method |
EP3312770B1 (en) * | 2013-06-28 | 2023-05-10 | NEC Corporation | Crowd state recognition device, method, and program |
JP5613815B1 (ja) * | 2013-10-29 | 2014-10-29 | パナソニック株式会社 | 滞留状況分析装置、滞留状況分析システムおよび滞留状況分析方法 |
JP6331785B2 (ja) * | 2014-07-08 | 2018-05-30 | 日本電気株式会社 | 物体追跡装置、物体追跡方法および物体追跡プログラム |
JP5854098B2 (ja) * | 2014-08-08 | 2016-02-09 | 大日本印刷株式会社 | 情報表示装置及び情報表示用プログラム |
-
2016
- 2016-01-13 JP JP2016569286A patent/JP6969871B2/ja active Active
- 2016-01-13 US US15/543,408 patent/US10325160B2/en active Active
- 2016-01-13 WO PCT/JP2016/000146 patent/WO2016114134A1/ja active Application Filing
-
2019
- 2019-03-08 US US16/296,468 patent/US10657386B2/en active Active
- 2019-03-08 US US16/296,516 patent/US10755108B2/en active Active
-
2020
- 2020-07-06 US US16/921,447 patent/US20200334472A1/en not_active Abandoned
- 2020-10-14 JP JP2020172914A patent/JP7163945B2/ja active Active
-
2022
- 2022-06-24 US US17/849,211 patent/US20220327839A1/en active Pending
- 2022-08-03 JP JP2022123727A patent/JP7428213B2/ja active Active
-
2024
- 2024-01-22 JP JP2024007200A patent/JP2024041997A/ja active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012234285A (ja) * | 2011-04-28 | 2012-11-29 | Dainippon Printing Co Ltd | 画像解析装置、画像解析方法、画像解析プログラム及び記録媒体 |
WO2014112407A1 (ja) * | 2013-01-16 | 2014-07-24 | 日本電気株式会社 | 情報処理システム、情報処理方法及びプログラム |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11106920B2 (en) | 2016-08-04 | 2021-08-31 | Nec Corporation | People flow estimation device, display control device, people flow estimation method, and recording medium |
US11074461B2 (en) | 2016-08-04 | 2021-07-27 | Nec Corporation | People flow estimation device, display control device, people flow estimation method, and recording medium |
US10936882B2 (en) | 2016-08-04 | 2021-03-02 | Nec Corporation | People flow estimation device, display control device, people flow estimation method, and recording medium |
WO2018025831A1 (ja) * | 2016-08-04 | 2018-02-08 | 日本電気株式会社 | 人流推定装置、表示制御装置、人流推定方法および記録媒体 |
JPWO2018025831A1 (ja) * | 2016-08-04 | 2019-06-13 | 日本電気株式会社 | 人流推定装置、人流推定方法およびプログラム |
US10970559B2 (en) | 2016-09-13 | 2021-04-06 | Nec Corporation | People flow estimation device, people flow estimation method, and recording medium |
US10970558B2 (en) | 2016-09-13 | 2021-04-06 | Nec Corporation | People flow estimation device, people flow estimation method, and recording medium |
JPWO2018051944A1 (ja) * | 2016-09-13 | 2019-07-04 | 日本電気株式会社 | 人流推定装置、人流推定方法およびプログラム |
WO2018051944A1 (ja) * | 2016-09-13 | 2018-03-22 | 日本電気株式会社 | 人流推定装置、人流推定方法および記録媒体 |
US10810442B2 (en) | 2016-09-13 | 2020-10-20 | Nec Corporation | People flow estimation device, people flow estimation method, and recording medium |
JP2018116511A (ja) * | 2017-01-18 | 2018-07-26 | 日本放送協会 | 状態推定器、及びプログラム |
US11450114B2 (en) | 2017-04-04 | 2022-09-20 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and computer-readable storage medium, for estimating state of objects |
JP2018180619A (ja) * | 2017-04-04 | 2018-11-15 | キヤノン株式会社 | 情報処理装置、情報処理方法及びプログラム |
JP2019128605A (ja) * | 2018-01-19 | 2019-08-01 | 日本電信電話株式会社 | 予測装置、予測方法及びコンピュータプログラム |
JPWO2019229979A1 (ja) * | 2018-06-01 | 2021-05-13 | 日本電気株式会社 | 情報処理装置、制御方法、及びプログラム |
JP7006782B2 (ja) | 2018-06-01 | 2022-01-24 | 日本電気株式会社 | 情報処理装置、制御方法、及びプログラム |
WO2019229979A1 (ja) * | 2018-06-01 | 2019-12-05 | 日本電気株式会社 | 情報処理装置、制御方法、及びプログラム |
US12039451B2 (en) | 2018-06-01 | 2024-07-16 | Nec Corporation | Information processing device, control method, and program |
WO2019244627A1 (ja) * | 2018-06-22 | 2019-12-26 | 日本電信電話株式会社 | 推定方法、推定装置及び推定プログラム |
JPWO2021181612A1 (ja) * | 2020-03-12 | 2021-09-16 | ||
JP7327645B2 (ja) | 2020-03-12 | 2023-08-16 | 日本電気株式会社 | 画像処理装置、画像処理システム、画像処理方法、および画像処理プログラム |
Also Published As
Publication number | Publication date |
---|---|
JP7163945B2 (ja) | 2022-11-01 |
US10325160B2 (en) | 2019-06-18 |
US20190205660A1 (en) | 2019-07-04 |
US20190220672A1 (en) | 2019-07-18 |
US20220327839A1 (en) | 2022-10-13 |
JPWO2016114134A1 (ja) | 2017-10-26 |
JP7428213B2 (ja) | 2024-02-06 |
JP6969871B2 (ja) | 2021-11-24 |
JP2024041997A (ja) | 2024-03-27 |
US10657386B2 (en) | 2020-05-19 |
JP2021036437A (ja) | 2021-03-04 |
JP2022166067A (ja) | 2022-11-01 |
US10755108B2 (en) | 2020-08-25 |
US20180005046A1 (en) | 2018-01-04 |
US20200334472A1 (en) | 2020-10-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7163945B2 (ja) | 移動状況推定装置、移動状況推定方法およびプログラム記録媒体 | |
JP6561830B2 (ja) | 情報処理システム、情報処理方法及びプログラム | |
JP6213843B2 (ja) | 画像処理システム、画像処理方法及びプログラム | |
CN104103030B (zh) | 图像分析方法、照相机装置、控制装置及控制方法 | |
CN105144705B (zh) | 对象监视系统、对象监视方法和用于提取待监视对象的程序 | |
US9940633B2 (en) | System and method for video-based detection of drive-arounds in a retail setting | |
US9858486B2 (en) | Device and method for detecting circumventing behavior and device and method for processing cause of circumvention | |
JP2020149704A (ja) | ビデオデータを用いた活動モニタリングのためのシステム及び方法 | |
JP6120404B2 (ja) | 移動体行動分析・予測装置 | |
JP5271227B2 (ja) | 群衆監視装置および方法ならびにプログラム | |
US20150146006A1 (en) | Display control apparatus and display control method | |
KR102584708B1 (ko) | 과소 및 과밀 환경을 지원하는 군중위험관리시스템 및 방법 | |
KR20140132140A (ko) | 군중 궤적 추출을 이용한 비정상 행동 검출에 기초한 영상 감시 방법 및 영상 감시 장치 | |
JP2021149687A (ja) | 物体認識装置、物体認識方法及び物体認識プログラム | |
JP5864231B2 (ja) | 移動方向識別装置 | |
JP5599228B2 (ja) | 繁忙検知システム及び繁忙検知プログラム | |
Li et al. | A video-based algorithm for moving objects detection at signalized intersection | |
CN116704394A (zh) | 视频流中的异常行为检测 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16737205 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016569286 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15543408 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16737205 Country of ref document: EP Kind code of ref document: A1 |