WO2011151999A1 - Flow line detection system, flow line detection method, and flow line detection program - Google Patents
- Publication number
- WO2011151999A1 (PCT/JP2011/002930)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- score
- information
- correspondence information
- time
- flow line
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0631—Resource planning, allocation, distributing or scheduling for enterprises or organisations
- G06Q10/06311—Scheduling, planning or task assignment for a person or group
- G06Q10/063114—Status monitoring or status determination for a person or group
Definitions
- the present invention relates to a flow line detection system that determines, based on the detection results of the position of a movable object and of identification information unique to the movable object, which position in a tracking region corresponds to which identification information, and also relates to a flow line detection method and a flow line detection program.
- the detection of a flow line of a moving object is realized by associating the position of the moving object with identification information (hereinafter referred to as ID) of the moving object.
- by using the ID, a moving body can be uniquely identified and its flow line can be detected.
- Various techniques relating to such flow line detection have been proposed (see, for example, Patent Documents 1 and 2).
- the flow line of a moving body is information showing the path along which the moving body has moved, whereas a trajectory means a connected series of position coordinates detected continuously for a certain moving body. Accordingly, when the detection of the position coordinates of the moving object is interrupted, the trajectory is also interrupted.
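To make the definition concrete, a trajectory can be sketched as a time-ordered list of position samples that splits wherever detection is interrupted. The Python representation below, including the `max_gap` threshold, is an illustrative assumption and not part of the patent:

```python
def split_trajectory(samples, max_gap=1.0):
    """Split (time, x, y) samples into separate trajectories wherever
    the gap between consecutive detection times exceeds max_gap,
    mirroring how an interruption in detection interrupts the trajectory."""
    trajectories = []
    current = []
    prev_t = None
    for t, x, y in samples:
        if prev_t is not None and t - prev_t > max_gap:
            trajectories.append(current)
            current = []
        current.append((t, x, y))
        prev_t = t
    if current:
        trajectories.append(current)
    return trajectories
```

With this sketch, a detection gap between 0.5 s and 3.0 s yields two distinct trajectories from one moving body, which is exactly the situation the later discussion of trajectory 1 and trajectory 2 describes.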
- the moving body tracking system described in Patent Document 1 includes a surveillance camera that captures an image of a space and an IC tag reader.
- the moving body tracking system described in Patent Document 1 obtains the position coordinates of each moving body based on the output of the surveillance camera, and manages, in a tracking information table, first identification information (camera tracking ID) unique to each moving body in association with that body's position coordinates.
- this mobile body tracking system also reads unique second identification information (object ID) from each mobile body having an IC tag, and manages the second identification information in association with the tag position coordinates in a read information table.
- the first identification information and the position information are associated with the third identification information (position management ID) and managed by the position management table.
- the moving body tracking system described in Patent Document 1 further has a position estimation table that manages the second identification information and the third identification information in association with each other for a moving body recognized within a predetermined error range at the same time.
- the mobile object is tracked using the position management table and the position estimation table. That is, when the position of a moving body at a certain time detected by the monitoring camera and the position at which an object ID is detected by the IC tag reader are within a predetermined error range, the moving body's position information at that time and the object ID are associated and managed by the position management table and the position estimation table.
- the moving body tracking system described in Patent Document 1 integrates detection results obtained by a plurality of sensors and tracks the moving body.
- in other words, the moving body tracking system described in Patent Document 1 recognizes a moving body recognized by the first identification means (the means using the camera) and one recognized by the second identification means (the means using the IC tag) as the same moving body if their recognition times and recognition positions are substantially the same, and integrates the information obtained by each identification means, so that the flow line of the moving body can be detected based on position information obtained by a plurality of sensors with different detection mechanisms.
- in the monitoring system using a plurality of cameras described in Patent Document 2, a plurality of cameras are installed in the monitoring target space, and a feature quantity extraction means is provided that extracts a moving body and its feature amount information from the video using image recognition technology.
- the video imaged by the camera and the feature amount information of the moving body are sent to the collation and tracking means of the moving body via the communication network.
- the verification tracking means represents the entire monitoring target space as a three-dimensional model, further has a monitoring space database centered on a network that expresses the connectivity of the spaces in which a mobile body can move, and accumulates the transmitted feature information. Since a plurality of moving bodies generally exist in one camera image, they are separated into individual moving bodies.
- a route calculation unit in the verification tracking unit obtains a movement route candidate between the camera photographing ranges in the monitoring target space. Then, the moving object matching degree calculation means in the verification tracking means calculates the matching degree between the feature quantity sets of the two moving objects, and determines whether the two moving objects match using the matching degree.
- the verification tracking means includes personal identification information matching means such as card authentication and biometric authentication installed on the door. Then, the vicinity of the door with the card authentication means is photographed by the monitoring camera, the feature amount information of the appearance is extracted, and at the same time, the card information is associated with the owner information.
- the monitoring system using a plurality of cameras described in Patent Document 2 connects the trajectories of the moving bodies detected across the monitoring areas, and by associating a trajectory with the personal identification information detected by personal identification information matching means such as card authentication or biometric authentication, can generate a flow line with personal identification information attached.
- the luminance values of the lower and upper body of the person are used as the feature amount.
- Patent Document 3 describes an image processing apparatus that can track a moving object that appears on a video image even when occlusion occurs.
- each of a plurality of feature points is tracked from the previous frame to the current frame, the movement of the tracking region is predicted based on the tracking result of each feature point, and the position of the tracking region in the current frame is located.
- a reliability indicating how likely it is that each feature point exists on the moving object is calculated, and the movement of the tracking area is calculated using the reliability.
- as the reliability, the number of frames over which the feature point has been tracked successfully (its tracking history) or the like is used.
- JP 2006-146378 A (paragraphs 0007 to 0017, 0024)
- JP 2005-250989 A (paragraphs 0010, 0016, 0017, 0074, 0075)
- a moving body tracking system using a camera can track a moving body, but cannot uniquely identify a moving body.
- ID detection devices such as card authentication devices, biometric authentication devices, and RFID (Radio Frequency IDentification) readers can detect the unique ID of a moving object. However, the unique ID cannot be detected for a moving object located away from the ID detection device, and the position of such a moving object cannot be determined.
- tracking may be interrupted when a plurality of moving objects overlap at the same place or move from one camera field of view to another camera field of view.
- the system described in Patent Document 1 can detect a flow line when an ID can be detected by a sensor. However, if the tracking is interrupted frequently, the flow line detection process is restarted from the newly acquired sensor information, so the accuracy of flow line detection falls. For example, suppose that a moving object A exists in the tracking area and a trajectory 1 is detected from it. When the trajectory 1 passes through the ID detection area and the ID detection apparatus detects an ID (referred to as IDa), the trajectory 1 can be associated with IDa. Suppose now that the detection of the trajectory 1 is interrupted and a trajectory 2 is detected from the moving object A at the next time.
- in that case, the system of Patent Document 1 resumes the flow line detection process for the moving object A using only the newly acquired trajectory 2; since the trajectory 2 does not pass through the ID detection area, no ID is associated with it. Thus, every time the tracking or the ID detection is interrupted, the sensor information acquired in the past (the trajectory 1 and IDa in the above example) cannot be used, and flow line detection is resumed based only on the sensor information newly acquired after the interruption (the trajectory 2 in the above example), so the obtained trajectory cannot be accurately associated with an ID.
- the ID is detected only in limited situations, such as when a moving object is present in an area where the sensor can detect a wireless tag or the like, and cannot be acquired frequently. For this reason, when tracking is frequently interrupted, many trajectories are not associated with the ID of any moving object. In addition, a trajectory of an object that is not a tracking target may be detected, and such a trajectory becomes noise. The presence of such noise also increases the number of trajectories that are not associated with IDs.
- in the system of Patent Document 2, the timing of the trajectory connection process becomes important. For example, when a moving body appears in a tracking area A, the degree of matching between the trajectory of the moving body and a group of trajectories detected in the past in the adjacent tracking area B or tracking area C is calculated, and based on the result a pair of trajectories is connected.
- the luminance of the upper and lower bodies of a moving body obtained from a camera is used as the feature amount of a trajectory, and the degree of matching of the feature amounts of the two trajectories to be connected is used as the degree of matching.
- the feature amount of the moving object is accurately detected only when the position and size of the moving object satisfy certain conditions within the angle of view of the camera, and may otherwise include an error. Therefore, when the trajectory connection process is performed at an early stage, the degree of matching may be calculated without the trajectory's feature amount having been obtained accurately, and the trajectories may not be connected correctly.
- a method can be considered in which the degree of matching is not calculated all at once; instead, the degree of matching between the trajectories is recalculated at regular time intervals using the latest feature amounts, and the connection results are updated sequentially.
- however, when the number of monitoring targets and the number of tracking areas increase, the number of trajectories that become connection candidates increases, the number of combinations grows, and in practice it is considered difficult to perform the flow line detection process in real time.
- in Patent Document 3, the number of frames over which a feature point has been tracked successfully in succession is used as its reliability, and the movement of the tracking area is calculated using that reliability. It is conceivable to use such a reliability in a system that calculates a flow line by detecting information such as the position and ID of a moving body. However, since the same amount of sensor information is not always obtained at each time, even if an index value indicating the possibility that a moving object exists is obtained, the accuracy of the index varies from time to time, and there are cases where the flow line cannot be calculated accurately.
- an object of the present invention is to provide a flow line detection system, a flow line detection method, and a flow line detection program capable of accurately determining the position of the moving body for each piece of identification information and detecting its flow line even when the position and identification information of the moving body cannot be detected frequently.
- the flow line detection system according to the present invention includes: position score correspondence information generating means for generating, for each piece of moving body identification information, position score correspondence information in which a score indicating how likely it is that the moving body having that identification information exists is determined for each position in the tracking area of the moving body; state storage means for storing the position score correspondence information for each time; and flow line identification means for selecting, for each piece of moving body identification information, position score correspondence information satisfying a predetermined criterion from among the position score correspondence information stored in the state storage means as definite position score correspondence information, reflecting the definite position score correspondence information in the position score correspondence information at the time closest to it, repeatedly treating the resulting information as definite position score correspondence information, and identifying the flow line of the moving body from the scores in the position score correspondence information at each time.
- the flow line detection method according to the present invention generates, for each piece of moving body identification information, position score correspondence information in which a score indicating how likely it is that the moving body having that identification information exists is determined for each position in the tracking area of the moving body; stores the position score correspondence information for each time in state storage means; and, for each piece of moving body identification information, selects position score correspondence information satisfying a predetermined criterion from among the stored position score correspondence information as definite position score correspondence information, reflects the definite position score correspondence information in the position score correspondence information at the time closest to it, repeatedly treats the resulting information as definite position score correspondence information, and identifies the flow line of the moving body from the scores in the position score correspondence information at each time.
- the flow line detection program according to the present invention causes a computer to execute: position score correspondence information generation processing for generating, for each piece of moving body identification information, position score correspondence information in which a score indicating how likely it is that the moving body having that identification information exists is determined for each position in the tracking area of the moving body; state storage processing for storing the position score correspondence information for each time in state storage means; and flow line identification processing for selecting, for each piece of moving body identification information, position score correspondence information satisfying a predetermined criterion from among the stored position score correspondence information as definite position score correspondence information, reflecting it in the position score correspondence information at the closest time, repeatedly treating the result as definite position score correspondence information, and identifying the flow line of the moving body from the scores in the position score correspondence information at each time.
- according to the present invention, the position of the moving body for each piece of identification information can be determined with high accuracy, and its flow line can be detected.
- FIG. 1 is a block diagram showing an example of a flow line detection system according to the first embodiment of the present invention.
- the flow line detection system of the present invention includes a position information input unit 1, an ID information input unit 2, a flow line detection unit 3, and a flow line output unit 4.
- the flow line detection system of the present invention acquires the position and ID (identification information) of the moving object, and derives, for each position in the predetermined tracking area 50, a score indicating how likely it is that the moving object exists there. This score is calculated for each moving object ID.
- the flow line detection system of the present invention corrects the score at the time closest to a time point that satisfies a predetermined criterion, and then sequentially corrects the scores at the times closest to each newly corrected time in the same way. The flow line of the moving body is then identified using the corrected scores.
- Each moving body P freely moves in the tracking area 50. Further, the moving body P may come out of the tracking area 50.
- the type of the moving body is not particularly limited, and may be a human, an animal, or a thing.
- the position information input unit 1 is a device that detects the position coordinates of the moving body in the tracking area 50 and inputs the position coordinates and the detection time to the flow line detection unit 3.
- the position information input unit 1 inputs the two-dimensional position coordinates of the moving object in the tracking region 50 and the detection time of the position coordinates to the flow line detection unit 3 as a set.
- a set of the position coordinates of the moving body detected by the position information input unit 1 and the detection time of those position coordinates is hereinafter referred to as position information.
- the position information input unit 1 may be any device that can detect the position coordinates of the moving body in the tracking area 50 and specify the detection time. Further, the position information input unit 1 does not need to detect an ID unique to each mobile object.
- the position information input unit 1 may be realized by, for example, a moving body tracking system using a camera, a floor pressure sensor, a laser range finder, or a radar. In the case where the position information input unit 1 is realized in such a manner, the moving body does not have to hold a device necessary for detecting itself. In addition, the position information input unit 1 may detect the position coordinates of the moving body in such a manner that the moving body holds equipment necessary for coordinate detection.
- the position information input unit 1 may be realized by a mobile tracking system using a wireless communication device such as GPS (Global Positioning System) or an ultrasonic transceiver. Even in a mode in which the mobile body holds devices for coordinate detection, the position information input unit 1 does not need to acquire an ID unique to the mobile body from these devices.
- preferably, the position information input unit 1 is installed so that the entire tracking area 50 can be detected without a blind spot, but a partial blind spot may exist. This is because the flow line detection unit 3, described later, can identify the position coordinates of a moving object and the ID associated with them and generate a continuous flow line even if some of the input position information is missing.
- the ID information input unit 2 is a device that acquires the ID of a moving object in the tracking area 50.
- the ID information input unit 2 is not always able to detect the ID when the moving object exists, and whether or not the ID can be detected depends on the position of the moving object in the tracking region 50. For example, the detection probability of an ID of a mobile object existing in a place close to the ID information input unit 2 is high, and the detection probability of an ID of a mobile object existing in a place away from the ID information input unit 2 is low.
- FIG. 1 shows one ID information input unit 2, but a plurality of ID information input units 2 may be installed in the tracking area 50.
- Each ID information input unit 2 is assigned in advance an ID information input unit ID (that is, identification information of the ID information input unit) for uniquely identifying the ID information input unit 2.
- the ID information input unit ID is used to determine which ID information input unit 2 has detected the ID of the moving object.
- in the following description, the ID information input unit ID is represented by a number and is referred to as an ID information input unit number; however, the ID information input unit ID may be represented by something other than a number.
- the ID information input unit 2 inputs the ID of the detected mobile object, its detection time, and the ID information input unit number of the ID information input unit 2 itself to the flow line detection unit 3.
- a set of the ID of the mobile object, its detection time, and the ID information input unit number of the ID information input unit 2 that detected the ID is hereinafter referred to as ID information.
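The two kinds of input records defined above can be sketched as simple data structures. The field names below are illustrative assumptions; the patent only specifies which items each record groups together:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PositionInfo:
    """Position information: the position coordinates of a moving body
    in the tracking area paired with their detection time."""
    time: float
    x: float
    y: float

@dataclass
class IDInfo:
    """ID information: a detected moving-body ID (None when detection
    failed), the detection time, and the ID information input unit
    number of the unit that detected it."""
    moving_body_id: Optional[str]
    time: float
    input_unit_number: int
```

Note that `PositionInfo` deliberately carries no ID: the position information input unit 1 does not detect IDs, while `IDInfo` carries no coordinates beyond the identity of the detecting unit.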
- when the ID information input unit 2 tries to detect the ID of a moving object but cannot detect it, the ID information input unit 2 may set the ID of the moving object to "none" and input it, together with the time and its ID information input unit number, to the flow line detection unit 3.
- alternatively, when the ID information input unit 2 inputs no ID information to the flow line detection unit 3, the flow line detection unit 3 may determine that no moving body ID was detected at that time.
- the ID information input unit 2 may be any device that can detect an ID unique to a mobile body and specify the detection time and the ID information input unit number of the ID information input unit 2 itself.
- an RFID reader may be used as the ID information input unit 2.
- the moving body has an IC card and the identification information of the IC card is used as the ID of the moving body, an IC card reader may be used as the ID information input unit 2.
- an access point may be used as the ID information input unit 2.
- a barcode reader may be used as the ID information input unit 2.
- when the moving body is a person, the person's face, fingerprint, vein, or the like may be used as the person's ID, and the ID information input unit 2 may be a device that reads these IDs.
- ID information input units 2 with different detection targets, such as a face authentication device and an RFID reader, may be used together.
- when a plurality of ID information input units 2 are installed, their detection areas may be arranged so as to overlap each other, or so as not to overlap each other.
- the detection of the position coordinates of the moving object by the position information input unit 1 and the detection of the ID of the moving object by the ID information input unit 2 are performed at the same time.
- when the position information input unit 1 and the ID information input unit 2 detect position coordinates and IDs asynchronously, the flow line detection unit 3 may buffer the input position information and ID information for a certain period and use the position information and ID information accumulated in the buffer at regular intervals.
- when the time information is not synchronized between the position information input unit 1 and the ID information input unit 2, the same detection time may be set for the position information and ID information input to the flow line detection unit 3.
- the flow line detection unit 3 uses the position information input from the position information input unit 1 and the ID information input from the ID information input unit 2 to calculate, for each position in the tracking area 50, a score representing how likely it is that the moving body corresponding to the ID exists there, and creates, for each time, information (hereinafter referred to as position score correspondence information) in which each position in the tracking area 50 is associated with its score. The flow line detection unit 3 then determines, based on a predetermined criterion, position score correspondence information whose scores for the positions in the tracking area 50 can be regarded as definite. The time at which that position score correspondence information was created is called a definite time point. The criterion for determining position score correspondence information that can be regarded as definite will be described later.
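One possible concrete form of the position score correspondence information, assuming the grid-cell representation described later in this embodiment (the function name and normalization are illustrative assumptions), is a mapping from each cell to a score, initialized uniformly when nothing is yet known about the moving body's position:

```python
def make_uniform_score_map(width, height):
    """Build an initial position score map over a width x height grid of
    cells: with no evidence yet, every cell gets the same score,
    normalized so the scores sum to 1."""
    n = width * height
    return {(x, y): 1.0 / n
            for x in range(width)
            for y in range(height)}
```

One such map would be held per moving-body ID and per time, matching the description of a score calculated for each mobile object ID.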
- the flow line detection unit 3 corrects the score at the time closest to a definite time point based on the score at that definite time point. The time whose score has been corrected is then treated as a definite time point, and the correction of the score at the next closest time is repeated in the same way.
- in the following, a time that is not a definite time point may be referred to as an indefinite time point.
- in other words, the flow line detection unit 3 determines a definite time point and corrects the scores at the indefinite time points while following them one after another starting from the one closest to the definite time point. The flow line detection unit 3 then detects the flow line of the moving body using the corrected scores.
- the flow line output unit 4 is an output device that outputs a flow line detected by the flow line detection unit 3.
- the output mode of the flow line is not particularly limited.
- a display device may be used as the flow line output unit 4.
- a case where the flow line output unit 4 displays a flow line is taken as an example.
- FIG. 2 is a block diagram illustrating a configuration example of the flow line detection unit 3.
- the position information input unit 1, the ID information input unit 2, and the flow line output unit 4 are also illustrated.
- the flow line detection unit 3 includes a state storage unit 31, a state update unit 32, and a flow line identification unit 33.
- the state storage unit 31 is a storage device that stores the score state of each position in the tracking area 50 at each time for each ID of the moving object. Specifically, the state storage unit 31 stores a set of the time, the ID of the moving object, and the position score correspondence information regarding the moving object at that time. A set of time, ID, and position score correspondence information is hereinafter referred to as state information.
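A minimal sketch of the state storage unit, assuming state information is keyed by time and moving-body ID (the class and method names are illustrative, not taken from the patent):

```python
class StateStorage:
    """Sketch of the state storage unit 31: holds state information,
    i.e. (time, moving-body ID, position score map) triples."""

    def __init__(self):
        # (time, moving_body_id) -> {cell: score}
        self._states = {}

    def store(self, time, moving_body_id, score_map):
        """Store position score correspondence information for one
        moving-body ID at one time."""
        self._states[(time, moving_body_id)] = score_map

    def load(self, time, moving_body_id):
        """Return the stored score map, or None if absent."""
        return self._states.get((time, moving_body_id))

    def latest_time(self, moving_body_id):
        """Return the most recent time for which state information
        exists for the given ID (the 'previous time' of the update)."""
        times = [t for (t, mid) in self._states if mid == moving_body_id]
        return max(times) if times else None
```

The `latest_time` helper corresponds to the "previous time" the state update unit 32 reads when producing the state at the current time.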
- for each moving object ID, the state update unit 32 reads the state information generated at the previous time from the state storage unit 31 and updates the position score correspondence information of the previous time based on that state information, a movement model, and the detection results of the moving object's position coordinates and ID at the current time.
- more precisely, the state update unit 32 leaves the state information of the previous time as it is, creates new position score correspondence information at the current time from the position score correspondence information included in that state information, and stores new state information (the state information at the current time) consisting of the current time, the ID of interest, and the new position score correspondence information in the state storage unit 31.
- the state update unit 32 can also be called a state information generation unit.
- the previous time is the latest time in the past when the state information was created as viewed from the current time.
- specifically, the state update unit 32 propagates the position scores indicated by the position score correspondence information at the previous time to nearby positions, reflects the position information and ID information at the current time in the result, and thereby generates the position score correspondence information at the current time.
- the movement model is a rule that defines the manner of propagation of the score. Specific examples of the movement model and details of the process of generating the position score correspondence information at the current time by propagating the score at the previous time will be described later.
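As one hypothetical movement model, each cell's score can be spread to its four neighbouring cells, with some fraction remaining in place. The patent deliberately leaves the concrete model open, so this random-walk-style propagation and its `stay_prob` parameter are assumed examples only:

```python
def propagate_scores(score_map, stay_prob=0.6):
    """Propagate each cell's score to its 4-neighbours under a simple
    random-walk movement model: a fraction stay_prob stays put and the
    remainder is shared equally among neighbours present in the grid."""
    new_map = {cell: 0.0 for cell in score_map}
    for (x, y), score in score_map.items():
        neighbours = [c for c in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1))
                      if c in score_map]
        new_map[(x, y)] += score * stay_prob
        if neighbours:
            share = score * (1.0 - stay_prob) / len(neighbours)
            for c in neighbours:
                new_map[c] += share
        else:
            # isolated cell: nowhere to move, the whole score stays
            new_map[(x, y)] += score * (1.0 - stay_prob)
    return new_map
```

Because the shares always sum to the original score, the total score is conserved by the propagation, which keeps the score map interpretable as a (possibly unnormalized) existence distribution.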
- in this way, at each time, one piece of position score correspondence information is generated for each ID by the state update unit 32 and stored in the state storage unit 31.
- the state update unit 32 may store all the state information generated for each ID in the state storage unit 31 at each time. In addition, the state update unit 32 may sequentially delete state information generated before a certain time in the past from the state storage unit 31 as viewed from the current time.
- the flow line specifying unit 33 selects an ID and determines, for that ID, state information whose score at each position can be regarded as definite.
- the time indicated by that state information is a definite time point.
- the flow line specifying unit 33 corrects the scores indicated by the position score correspondence information at the indefinite time points while following the indefinite time points one after another starting from the one adjacent to the definite time point. The flow line specifying unit 33 then refers to the corrected position score correspondence information at each time for the selected ID and, by tracing the position of the score peak at each time, specifies the flow line of the moving object indicated by that ID.
- the flow line specifying means 33 sequentially selects the IDs of the moving objects and repeats the same processing.
- by specifying a flow line for every ID, the flow line specifying means 33 can specify a flow line for each moving body, identified by its ID, existing in the tracking area 50 (see FIG. 1).
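The peak-tracing step can be sketched as follows, assuming the corrected score maps for the selected ID are available per time (the dict-based representation is an assumption, not the patent's data format):

```python
def flow_line_from_scores(score_maps_by_time):
    """Trace the peak-score cell at each time to form the flow line.

    score_maps_by_time: dict mapping time -> {cell: score}, i.e. the
    corrected position score correspondence information for one ID.
    Returns a time-ordered list of (time, peak cell) pairs.
    """
    flow_line = []
    for t in sorted(score_maps_by_time):
        score_map = score_maps_by_time[t]
        peak_cell = max(score_map, key=score_map.get)
        flow_line.append((t, peak_cell))
    return flow_line
```

Running this once per moving-body ID yields one flow line per ID, matching the repetition over IDs described above.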
- the position score correspondence information described above may be any information in which each position in the tracking region 50 is associated with a score quantifying the probability that the moving body identified by an ID exists at that position.
- the tracking area 50 is divided into grids, and the divided area is called a cell.
- the position score correspondence information is represented as, for example, a set of information in which the coordinates of individual cells are associated with the scores.
- alternatively, the tracking area 50 may be divided into certain areas with a score determined for each divided area, and the position score correspondence information may be represented by a network in which each area is a node and the adjacency between areas is a link.
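- The cell-based representation above can be illustrated with a minimal Python sketch (the function name `make_position_scores` and the list-of-lists layout are assumptions made for illustration, not part of the embodiment):

```python
# Position score correspondence information as a cell map: the tracking
# area is divided into grid cells, and each cell c(m, n) is associated
# with a score (stored here as scores[n][m]).

def make_position_scores(width, height, initial=0.0):
    """Create a width x height cell map with every score set to `initial`."""
    return [[initial for _ in range(width)] for _ in range(height)]

# Example: a 12 x 8 cell map, matching the tracking area 50 of FIG. 9.
scores = make_position_scores(12, 8)
scores[0][5] = 1.0  # a score peak: the moving body is likely near c(5, 0)
```

The set of (cell coordinate, score) pairs is then simply the contents of this nested list.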
- FIG. 3 is a block diagram illustrating a configuration example of the state update unit 32.
- the state update unit 32 includes a state prediction unit 321 and an observation information reflection unit 322.
- the state predicting means 321 generates new position score correspondence information by propagating the score of each position indicated by the position score correspondence information at the previous time to nearby positions according to a predetermined movement model. This position score correspondence information can be said to be a prediction of the position score correspondence information at the current time.
- the observation information reflecting means 322 reflects the position information and ID information at the current time, input from the position information input unit 1 and the ID information input unit 2, in the new position score correspondence information (the prediction of the position score correspondence information at the current time) created by the state prediction means 321, and thereby determines the position score correspondence information at the current time.
- the input position information and ID information can be called observation information.
- the state prediction unit 321 and the observation information reflection unit 322 will be described in more detail.
- the state prediction unit 321 reads the position score correspondence information included in the state information of each ID generated at the previous time from the state storage unit 31, and newly creates a copy of the position score correspondence information of each ID. Then, in each copied position score correspondence information, a process of propagating the score to a nearby position is performed according to a predetermined movement model. By the process of propagating the score, the newly created position score correspondence information is updated.
- the state prediction unit 321 inputs the position score correspondence information that has been subjected to this processing to the observation information reflection unit 322.
- the movement model is a rule that defines a mode in which the score in the position score correspondence information is propagated to a nearby position.
- 4 and 5 are explanatory diagrams showing specific examples of the movement model.
- as described above, the position score correspondence information is a set of information in which the coordinates of individual cells are associated with scores. In FIGS. 4 and 5, the position score correspondence information is illustrated as a map.
- FIG. 4A schematically shows a movement model in which the score of the cell 71 of interest in the position score correspondence information is propagated to each of the upper, lower, left, and right cells for each time step.
- the cell from which the score is propagated is indicated by a dotted pattern, and the cell to which the score is propagated is indicated by a thin oblique line.
- the movement model shown in FIG. 4A is a movement model determined from the viewpoint that a moving body existing in a cell at the previous time can move to a cell adjacent in the vertical and horizontal directions on the map.
- according to this movement model, the state predicting unit 321 performs, in the position score correspondence information copied from the previous time's position score correspondence information, the process of propagating each cell's score to the cells above, below, left, and right of it, and thereby updates the position score correspondence information.
- each cell has a score indicating the high possibility that a moving object exists.
- the state prediction unit 321 propagates the score while paying attention to each cell in turn; that is, the score is propagated for every individual cell. As a result, scores may be propagated to a single cell from a plurality of cells. For a cell to which scores from a plurality of cells are propagated, the state predicting unit 321 sets the maximum of the propagated scores as the new score of that cell. This is the same for movement models other than that of FIG. 4A.
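- The propagation step with the maximum-value rule can be sketched as follows (a hedged Python sketch of the FIG. 4A model; the function name and the assumption that a cell also retains its own previous score are illustrative choices, not mandated by the text):

```python
def propagate(scores):
    """One time step of the four-neighbour movement model: each cell's
    score is propagated to the cells above, below, left, and right of it.
    A cell that receives scores from several source cells takes the
    maximum of the propagated values (here it also keeps its own
    previous score, an assumption made for this sketch)."""
    h, w = len(scores), len(scores[0])
    new = [row[:] for row in scores]  # start from a copy of the map
    for y in range(h):
        for x in range(w):
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    new[ny][nx] = max(new[ny][nx], scores[y][x])
    return new
```

Working on a copy mirrors the description below of leaving the previous time's information intact.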
- FIG. 4B schematically shows a movement model in which the score of the cell 71 of interest is propagated up to n cells ahead in the vertical, horizontal, and diagonal directions for each time step. Such a movement model may also be defined.
- alternatively, a movement model that propagates the score in the traveling direction of the moving object may be defined, using the moving direction and moving speed of the moving object at the previous time as parameters. In the movement models of FIG. 5, the score of the cell 71 of interest is propagated to the cells 72 in the traveling direction, and not to the cells around the cell 71 that lie outside the traveling direction.
- FIG. 5A shows an example when the moving speed of the moving body is slow
- FIG. 5B shows an example when the moving speed of the moving body is high.
- the range of the cell 72 in the traveling direction may be determined in advance according to the moving direction and moving speed of the moving body. Further, as shown in FIG. 5B, when the speed is high, the range of the cells 72 in the traveling direction may be widened.
- in addition, the state predicting means 321 may store in advance the coordinates where obstacles exist, and refrain from propagating the score to positions where an obstacle exists, since the moving body cannot move there. Alternatively, the extra time required for the moving object to move around the obstacle may be set as a cost for the area where the obstacle exists, and the state predicting unit 321 may propagate the score using a movement model that takes this cost into consideration.
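- A direction- and speed-dependent model with simple obstacle handling could be sketched as below (assumptions made here: the `(dx, dy)` direction tuple, the `speed` cell count controlling the propagation range, and the choice of simply blocking obstacle cells rather than assigning them a cost):

```python
def propagate_with_model(scores, direction, speed, obstacles=frozenset()):
    """Propagate each cell's score up to `speed` cells ahead in the
    traveling `direction` (dx, dy), widening the reach for faster moving
    bodies as in FIG. 5B; cells listed in `obstacles` receive no score,
    since the moving body cannot enter them."""
    h, w = len(scores), len(scores[0])
    new = [row[:] for row in scores]
    dx, dy = direction
    for y in range(h):
        for x in range(w):
            for step in range(1, speed + 1):
                nx, ny = x + dx * step, y + dy * step
                if 0 <= ny < h and 0 <= nx < w and (nx, ny) not in obstacles:
                    new[ny][nx] = max(new[ny][nx], scores[y][x])
    return new
```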
- the observation information reflecting means 322 receives a set of position coordinates of the moving body and detection times of the position coordinates as position information from the position information input unit 1.
- further, a set of the ID of the moving object, the ID information input unit number of the ID information input unit 2, and the detection time of the ID is input to the observation information reflecting means 322 from the ID information input unit 2 as ID information.
- the position score correspondence information after the process of propagating the score is also input from the state prediction unit 321 to the observation information reflecting unit 322. This position score correspondence information is input to the observation information reflecting means 322 for each moving body ID.
- the observation information reflecting means 322 reflects the position information and ID information at the current time in the input position score correspondence information. This process will be described.
- the observation information reflecting means 322 stores an ID detection area corresponding to the ID information input unit number in advance.
- the ID detection area is an area that is determined in advance as an area in which the ID information input unit 2 can be regarded as detecting a moving body ID with a detection probability equal to or higher than a predetermined probability.
- the ID detection area is determined in advance, for example, by the administrator of the flow line detection system. Since it is difficult to strictly define, as an area, the region in which an ID is detected with a detection probability equal to or higher than a predetermined probability, the ID detection area may be an area that the administrator or the like determines can be regarded as detecting an ID with such a probability.
- the ID detection area is predetermined for each ID information input unit 2 (that is, for each ID information input unit number).
- the observation information reflecting means 322 pays attention to the ID of each mobile object and, in the position score correspondence information of the focused ID (the position score correspondence information input from the state predicting means 321), updates the scores so that the score increase in the ID detection region corresponding to the ID information input unit number of the ID information input unit 2 that detected the focused ID becomes larger than the score increase in the other regions. The ID information input unit number of the ID information input unit 2 that detected the ID may be determined based on the ID information input from the ID information input unit 2: if the ID information input unit number paired with the focused ID has been input, the score increase of the corresponding ID detection region is made larger than that of the other regions.
- for example, the observation information reflecting means 322 may add a predetermined value to the score in the ID detection area where the ID was detected and add a smaller value to the score in the areas where the ID was not detected. Alternatively, the score in the areas where the ID was not detected may be kept as it is (that is, not changed), or may be reduced by subtraction.
- alternatively, the observation information reflecting means 322 may keep the score of the ID detection area where the ID was detected unchanged and, for the areas where the ID was not detected, subtract from the score or divide the score by a value of 1 or more.
- the observation information reflecting unit 322 may also multiply the score by a value larger than 1 for the ID detection region where the ID was detected and multiply the score by a smaller value for the regions where the ID was not detected. Alternatively, for the regions where the ID was not detected, the score may be kept as it is, or divided by a value of 1 or more.
- the score may be calculated using a predetermined coefficient, or using a coefficient that is dynamically determined according to the ID detection status. For example, near the boundary of the ID detection area, false detections and missed detections are more likely to occur than at its center. Therefore, even within the ID detection area, the score increase may be differentiated by using different coefficients for the center of the ID detection area and for the vicinity of its boundary.
- that is, the score increase may be made small in ID detection areas where false detections are likely, and made large in ID detection areas where false detections are unlikely, so as to differentiate the score increases.
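- The additive variant of reflecting an ID observation can be sketched as follows (the coefficients +2.0 inside the detection area and -1.0 outside, and the set-of-cells representation of the area, are assumptions chosen for illustration):

```python
def reflect_id_observation(scores, detection_area, delta_in=2.0, delta_out=-1.0):
    """Add `delta_in` to the score of every cell inside the ID detection
    area of the reader that detected the focused ID, and the smaller
    `delta_out` (here negative) to every other cell, so that the score
    increase inside the area exceeds the increase elsewhere."""
    h, w = len(scores), len(scores[0])
    return [[scores[y][x] + (delta_in if (x, y) in detection_area else delta_out)
             for x in range(w)]
            for y in range(h)]
```

Swapping the additions for multiplications gives the multiplicative variants described above.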
- similarly, for the position score correspondence information of each ID (the position score correspondence information input from the state prediction means 321), the observation information reflecting means 322 updates the scores so that the score increase at the position coordinates input from the position information input unit 1 is higher than the score increase in the other regions. As long as the score increase at the input position coordinates is higher than that of the other regions, the score increase in the other regions may even be negative.
- suppose, for example, that the position information input from the position information input unit 1 indicates that two position coordinates (x1, y1) and (x2, y2) were detected at a certain time.
- in this case, the observation information reflecting unit 322 updates the scores so that the score increase at (x1, y1) and (x2, y2) is higher than the score increase in the other regions in the position score correspondence information of ID1, and performs the same update on the position score correspondence information of each other ID.
- for example, the observation information reflecting means 322 may add a predetermined value to the score in the area of the detected position coordinates and add a smaller value to the score in the other areas. Alternatively, for the areas other than the detected position coordinates, the score may be kept as it is without being changed, or may be reduced by subtraction.
- alternatively, the observation information reflecting unit 322 may keep the score of the detected position coordinate area unchanged and, for the other areas, subtract from the score or divide the score by a value of 1 or more.
- the observation information reflecting unit 322 may also multiply the score by a value larger than 1 for the detected position coordinate region and multiply the score by a smaller value for the other regions. Alternatively, for the areas other than the detected position coordinates, the score may be kept as it is, or divided by a value of 1 or more.
- the score may also be calculated using a coefficient that is dynamically determined based on the detected position coordinates. For example, if the detection error of the position coordinates is known in advance, the scores may be changed so that the score increase is larger than in the other areas over the whole area, centered on the detected position coordinates, within which the detection error falls.
- further, for position coordinates of the moving body detected at a position where the detection error is large, the score may be changed over a wide area including the surroundings of the detected coordinates, and the score increase there may be made smaller than the score increase for position coordinates detected at other positions.
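- One way to reflect detected position coordinates, including a detection-error margin, might look like this (the `radius` parameter and the choice of leaving other cells unchanged are assumed variants picked from the options above):

```python
def reflect_position_observation(scores, detected, radius=0,
                                 delta_in=2.0, delta_out=0.0):
    """Raise the score at the detected cell and, to allow for detection
    error, at every cell within `radius` cells of it; all other cells
    get `delta_out` (0.0 here, i.e. they keep their score)."""
    h, w = len(scores), len(scores[0])
    px, py = detected
    return [[scores[y][x] + (delta_in
                             if abs(x - px) <= radius and abs(y - py) <= radius
                             else delta_out)
             for x in range(w)]
            for y in range(h)]
```

A larger `radius` with a smaller `delta_in` would model the wide, weak update described for positions with large detection error.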
- after performing the process of changing the scores on the position score correspondence information of each moving body ID input from the state prediction unit 321, the observation information reflecting unit 322 stores, in the state storage unit 31, state information in which the ID, the position score correspondence information, and the detection time (the current time) obtained from the position information or the ID information are associated.
- the observation information reflecting unit 322 stores state information in the state storage unit 31 for each ID.
- FIG. 6 is a block diagram showing a configuration example of the flow line specifying means 33.
- the flow line specifying unit 33 includes a confirmed state selecting unit 331, a confirmed state reflecting unit 332, and a moving body position detecting unit 333.
- the confirmed state selection means 331 acquires, from the state storage unit 31, the state information of each ID generated during a fixed past period ending at the current time.
- the confirmed state selection means 331 extracts, from that state information, the state information containing the ID of the moving body whose flow line is to be derived; the state information at each time is extracted as the state information of this ID.
- the confirmed state selection means 331 then selects, from the state information of this ID at each time, the state information at a definite time point. Specifically, the confirmed state selection means 331 determines, based on a predetermined criterion, state information in which the score of each position in the tracking region 50 can be said to be definite; the time indicated by that state information is the definite time point. Hereinafter, state information in which the score of each position in the tracking area 50 can be said to be definite is referred to as definite state information.
- for example, a criterion that "the state information generated at the latest time is the definite state information" may be adopted. In the state information at the latest time, the score of each position in the tracking area has been updated based on the most observation results of position coordinates and IDs. Based on this criterion, the confirmed state selecting unit 331 may determine the state information generated at the latest time to be the definite state information.
- alternatively, as a criterion for determining the definite state information, a criterion that "the state information at a time when both the position coordinates and the ID of the moving object were detected is the definite state information" may be adopted. At a time when both the position coordinates and the ID of the moving object are detected, a more prominent peak appears in the scores in the tracking area than at times when they are not detected. Accordingly, the confirmed state selecting unit 331 may determine a time at which both the position coordinates and the ID of the moving object were detected to be a definite time point, and determine the state information at that time to be the definite state information.
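- The second criterion can be sketched in a few lines (the dictionary shape `time -> (position_detected, id_detected)` and the choice of returning the latest qualifying time are assumptions of this sketch):

```python
def select_definite_time(observations):
    """Return a definite time point: a time at which both the position
    coordinates and the ID of the moving body were detected (the latest
    such time here), or None if no time qualifies."""
    candidates = [t for t, (pos_seen, id_seen) in observations.items()
                  if pos_seen and id_seen]
    return max(candidates) if candidates else None
```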
- the confirmed state reflecting means 332 corrects the scores indicated by the position score correspondence information at the indeterminate time points while following the indeterminate time points nearest the definite time point.
- the confirmed state reflecting means 332 sequentially follows the indeterminate time points from the definite time point toward the past.
- alternatively, the confirmed state reflecting means 332 may take the definite time point as a starting point and sequentially follow the indeterminate time points toward the future rather than toward the past.
- when correcting the position score correspondence information at the indeterminate time point immediately preceding the definite time point, the confirmed state reflecting unit 332 first propagates the score of each position according to the movement model with respect to the position score correspondence information at the definite time point. This process is the same as the score-propagation process of the state predicting unit 321: the confirmed state reflecting unit 332 creates a copy of the position score correspondence information at the definite time point and, in that copy, performs the process of propagating the score of each position to nearby positions.
- the confirmed state reflecting means 332 then reflects the position score correspondence information to which this propagation process has been applied in the position score correspondence information at the nearest indeterminate time point.
- for example, the confirmed state reflecting unit 332 may add or multiply the score of each cell in the propagated position score correspondence information of the definite time point, as it is, into the score of the corresponding cell in the position score correspondence information at the nearest indeterminate time point.
- alternatively, the confirmed state reflecting unit 332 may multiply the score of each cell in the position score correspondence information at the definite time point by a coefficient, and add or multiply the result into the score of the corresponding cell in the position score correspondence information at the nearest indeterminate time point.
- the confirmed state reflecting means 332 then treats the position score correspondence information at the indeterminate time point on which the above calculation has been performed as the position score correspondence information at a definite time point, and sequentially repeats the same processing.
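- The backward reflection loop can be sketched as follows (additive variant; the four-neighbour propagation, the list-of-maps argument ordered from the time just before the definite point toward the past, and all names are assumptions of this sketch):

```python
def reflect_definite_state(definite, history):
    """Starting from the score map at the definite time point, repeatedly
    (1) propagate its scores to neighbouring cells and (2) add the result
    into the score map of the next older indeterminate time point, which
    then serves as the definite map for the following step."""
    def propagate(scores):
        h, w = len(scores), len(scores[0])
        new = [row[:] for row in scores]
        for y in range(h):
            for x in range(w):
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        new[ny][nx] = max(new[ny][nx], scores[y][x])
        return new

    corrected = []
    current = definite
    for older in history:
        merged = [[a + b for a, b in zip(row_old, row_prop)]
                  for row_old, row_prop in zip(older, propagate(current))]
        corrected.append(merged)
        current = merged  # the corrected map becomes the new "definite" map
    return corrected
```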
- the moving body position detecting means 333 detects the position of the moving body at each time from the position score correspondence information at each time after the processing by the confirmed state reflecting means 332.
- the moving body position detection unit 333 detects the position where the score is a peak as the position of the moving body from the position score correspondence information at each time.
- the mode of detecting the position where the score peaks may be, for example, detecting the position where the score is maximum, or detecting the center of gravity of the positions whose scores are equal to or greater than a fixed value.
- the moving body position detection means 333 determines that the position detected from the position score correspondence information at a certain time is the position of the moving body of the ID of interest at that time.
- the moving body position detecting means 333 detects the position where the score is peaked from the position score correspondence information for each time, and uses the position information in time series order as a flow line. As a result, the flow line of the moving body corresponding to the ID selected by the confirmed state selection unit 331 is obtained.
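- The peak-tracing step can be sketched as follows (maximum-score variant; a centroid of the cells above a threshold would be the other option mentioned; the names are assumptions):

```python
def detect_position(scores):
    """Return the cell (x, y) where the score peaks (simple maximum)."""
    best, best_cell = None, None
    for y, row in enumerate(scores):
        for x, s in enumerate(row):
            if best is None or s > best:
                best, best_cell = s, (x, y)
    return best_cell

def flow_line(score_maps):
    """A flow line: the time-ordered sequence of peak positions taken
    over the corrected score maps of the focused ID."""
    return [detect_position(m) for m in score_maps]
```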
- the state update unit 32 and the flow line specifying unit 33 may operate synchronously, with the flow line specifying unit 33 specifying the flow line each time the state update unit 32 generates the state information of each ID.
- alternatively, the state update unit 32 and the flow line specifying unit 33 may operate asynchronously: the state update unit 32 performs its processing every time position information and ID information are input, while the flow line specifying unit 33 performs the flow line detection process at a cycle different from the input cycle of the position information and ID information.
- the flow line specifying means 33 may perform the flow line detection process non-periodically when it becomes necessary, not periodically. For example, the flow line detection process may be performed when an instruction to detect a flow line is input by the administrator of the flow line detection system.
- the state update means 32 (state prediction means 321 and observation information reflection means 322) and the flow line identification means 33 (confirmed state selection means 331, confirmed state reflection means 332, and moving body position detection means 333) are realized, for example, by a CPU of a computer that operates according to a flow line detection program.
- for example, a program storage device of the computer stores the flow line detection program, and the CPU reads the program and, in accordance with it, operates as the state update means 32 (state prediction means 321 and observation information reflection means 322) and the flow line specifying means 33 (confirmed state selection means 331, confirmed state reflection means 332, and moving body position detection means 333).
- the state updating unit 32 and the flow line specifying unit 33 may also be realized by separate pieces of hardware. Likewise, the state prediction unit 321 and the observation information reflection unit 322 may be realized by separate hardware, and the confirmed state selecting unit 331, the confirmed state reflecting unit 332, and the moving body position detecting unit 333 may each be realized by separate hardware.
- FIG. 7 and FIG. 8 are flowcharts illustrating an example of processing progress of the flow line detection unit 3 according to the first embodiment.
- hereinafter, an example of the processing progress of the first embodiment will be described with reference to the specific examples of FIGS. 9 to 16.
- FIG. 9 is an explanatory diagram showing an example of cells defined by dividing the tracking area 50 into cells for defining scores.
- FIG. 9 also shows the positions of the two ID information input units 2a and 2b installed in the tracking area 50.
- the lower left of the map of the tracking area 50 is the origin (0, 0)
- arbitrary position coordinates on the map are represented in the format p (x, y).
- the coordinates of an arbitrary cell on the map are expressed in a format c (m, n).
- the number of cell divisions may be set arbitrarily; in this example, the cell coordinates are assumed to range from 0 to 11 in the x-axis direction and from 0 to 7 in the y-axis direction.
- the ID information input units 2a and 2b are arranged at c (0, 7) and c (11, 0), respectively.
- an ID detection area is defined in advance for each of the ID information input units 2a and 2b.
- a rectangle having c (0, 5) as the lower left and c (2, 7) as the upper right is defined as the ID detection area Ra of the ID information input unit 2a.
- a rectangle having c (9, 0) at the lower left and c (11, 2) at the upper right is defined as an ID detection region Rb of the ID information input unit 2b.
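- The example layout of FIG. 9 can be written down directly (the helper `rect_cells` and the set-of-cells representation of an ID detection area are assumptions made for illustration):

```python
def rect_cells(lower_left, upper_right):
    """All cell coordinates c(m, n) in the rectangle spanned by its
    lower-left and upper-right cells, both inclusive."""
    (m0, n0), (m1, n1) = lower_left, upper_right
    return {(m, n) for m in range(m0, m1 + 1) for n in range(n0, n1 + 1)}

# ID detection areas of the two ID information input units in FIG. 9.
Ra = rect_cells((0, 5), (2, 7))    # around unit 2a at c(0, 7)
Rb = rect_cells((9, 0), (11, 2))   # around unit 2b at c(11, 0)
```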
- FIG. 10 is a diagram illustrating an example of the detection positions at times t1 to t10 for the two moving bodies a and b that move in the tracking area 50. The true state is that the positions detected from the moving object a are p1a to p10a and the positions detected from the moving object b are p1b to p10b.
- the mobile object a is associated with ID1 and the mobile object b is associated with ID2 respectively.
- the fact that p5a, p6a and p5b, p6b are not shown indicates that no position coordinates were detected at times t5 and t6.
- in each detection position reference sign, the subscript number indicates at which of the times t1 to t10 the position was detected, and the subscript a or b indicates which of the moving objects a and b was detected.
- FIG. 11 shows the presence or absence of ID detection for each ID information input unit between times t1 and t10.
- FIG. 11 indicates that the ID information input unit 2a detected "ID1" and "ID2" at times t1 and t2, and that the ID information input unit 2b detected "ID1" at time t10.
- a blank part indicates that the ID of the moving object has not been detected.
- in this example, it is assumed that an ID is reliably detected when the moving body exists within the ID detection area Ra or Rb, and that no ID is detected when the moving body is outside the ID detection areas (see FIGS. 10 and 11).
- the subscript number added to the reference numeral of the detection position of the moving object shown in FIG. 10 represents the detection time of the detection position.
- for example, at time t1, it is assumed that the position information input unit 1 (not shown in FIG. 10) detects the position coordinates p1a and p1b, and that the ID information input unit 2a detects ID1 and ID2.
- similarly, at time t10, the position information input unit 1 detects p10a and p10b, and the ID information input unit 2b detects ID1.
- FIGS. 12 to 15 are explanatory diagrams showing specific examples of situations in which the score of the position score correspondence information is updated.
- the position score correspondence information is schematically represented as a cell map, and the score of each cell is distinguished by a pattern in the drawings.
- FIG. 16 is an explanatory diagram showing the score values corresponding to the cell patterns shown in FIGS. 12 to 15. As shown in FIG. 16, white cells indicate a score of 0 or less; in the position score correspondence information shown in FIGS. 12 to 15, the white cells are assumed to have negative scores.
- the state update unit 32 acquires ID information (that is, a set of the ID of the moving object, the ID information input unit number, and the detection time of the ID) from the ID information input unit 2 (step S1).
- at time t10, for example, the state update unit 32 receives the ID information {"ID1", "ID information input unit 2b", "t10"} from the ID information input unit 2. Here, "ID information input unit 2b" is the ID information input unit number.
- the state update unit 32 also acquires position information (that is, a set of the position coordinates of the moving object and the detection time of the position coordinates) from the position information input unit 1 (step S2). At time t10, as shown in FIG. 10, the position information {"p10a", "t10"} and {"p10b", "t10"} is input.
- as long as the state predicting means 321 can refer to the position information and the observation information reflecting means 322 can refer to the ID information and the position information, the ID information and the position information may be input to either the state prediction unit 321 or the observation information reflection unit 322.
- next, the state predicting unit 321 of the state updating unit 32 acquires, from among the state information stored in the state storage unit 31, the state information of the latest time (that is, the state information generated at the time immediately before the current time) (step S3).
- the state information is a set of time, the ID of the moving object, and position score correspondence information regarding the moving object at that time.
- the state prediction unit 321 may check the time of each piece of state information stored in the state storage unit 31, select the state information group with the latest time, and read from the state storage unit 31.
- in this example, the state predicting means 321 reads the state information group generated at the previous time t9.
- next, the state update unit 32 determines whether the state information group of the previous time acquired from the state storage unit 31 includes state information for which state information of the current time (t10) has not yet been created, that is, state information on which the processing of steps S5 to S7 has not yet been performed (step S4).
- if such state information exists, the state prediction means 321 selects one piece of unprocessed state information from the state information at the previous time read in step S3. Here, it is assumed that the state information of ID1 is selected. Note that the determination in step S4 and the means for performing this selection may be provided in the state update unit 32 separately from the state prediction unit 321, and that unit may input the selected state information to the state prediction unit 321.
- the state prediction means 321 calculates the elapsed time from the time of the selected state information to the current time, determines the propagation range of the score based on a predefined movement model of the moving object, and predicts the state of the score at the current time by propagating the score in the position score correspondence information included in the selected state information (step S5).
- at this time, instead of overwriting the score in the position score correspondence information included in the selected state information, the state predicting unit 321 creates a copy of that position score correspondence information and performs, on the copy, the process of propagating the score to nearby positions according to the predetermined movement model. As a result, the position score correspondence information at the previous time is left as it is, and position score correspondence information at the current time is newly created.
- in this example, a movement model (see FIG. 4A) is assumed in which the score of each cell is propagated to the adjacent upper, lower, left, and right cells at each time step, and the period at which the position information input unit 1 and the ID information input unit 2 detect the position coordinates and ID and input them to the state update unit 32 is regarded as one time step.
- the state predictor 321 propagates the score set in each cell at time t9 to the adjacent upper, lower, left, and right cells, and the result of this propagation is the prediction result of the score of each position at time t10 (the prediction result of the position score correspondence information).
- the state predicting unit 321 propagates the scores to the upper, lower, left, and right cells for every individual cell. Since scores are propagated to each cell from its upper, lower, left, and right neighbors, up to four scores are propagated to one cell; the largest of these values is set as the score of the cell.
- the movement model is not limited to the above movement model, and may be appropriately defined according to, for example, the movement characteristics of the moving object to be tracked.
- FIGS. 12A and 13A both relate to the selected ID1.
- FIG. 14A and FIG. 15A described later relate to ID1.
- the state predicting unit 321 inputs the prediction result (see FIG. 13A) of the position score correspondence information at the current time generated in step S5 to the observation information reflecting unit 322. Specifically, a set (that is, state information) of the prediction result of the position score correspondence information, the previous time, and the selected ID (here, ID1) is input to the observation information reflecting unit 322.
- the observation information reflecting unit 322 updates the predicted position score correspondence information at the current time, input from the state prediction unit 321, based on the ID information input from the ID information input unit 2 and the position information input from the position information input unit 1. In this case, the observation information reflecting unit 322 overwrites and updates the prediction result input from the state prediction unit 321.
- the observation information reflecting unit 322 updates the position score correspondence information at the current time predicted by the state predicting unit 321 based on the ID information at the current time input from the ID information input unit 2 (step S6).
- in step S1, {"ID1", "ID information input unit 2b", "t10"} is input as the ID information observed at the current time t10. This ID information means that the ID information input unit 2b detected "ID1" at time t10.
- in step S6, based on the ID information acquired from the ID information input unit 2, the observation information reflecting unit 322 updates the position score correspondence information at the current time predicted by the state prediction unit 321 so that the score increase of the cells corresponding to the ID detection area defined for the ID information input unit 2 that detected the ID of the mobile body is larger than the score increase of the other cells.
- for example, the observation information reflecting unit 322 adds 0.3 to the score of each cell corresponding to the ID detection region Rb of the ID information input unit 2b in the position score correspondence information shown in FIG. 13A, and leaves the scores of the other cells as they are.
- as a result, the position score correspondence information shown in FIG. 13A is updated as shown in FIG. 14A: the scores of the cells corresponding to the ID detection region Rb are larger than in FIG. 13A, and the scores of the other cells are the same as in FIG. 13A.
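- the update of step S6 can be illustrated as follows, assuming the same nested-list grid representation; the +0.3 increment follows the example in the text, and the set of detection-area cells is an assumed input:

```python
def reflect_id_observation(grid, detection_cells, bonus=0.3):
    """Step S6 sketch: raise the scores of the cells in the ID detection
    area of the unit that observed the ID (e.g. region Rb of unit 2b),
    leaving all other cells unchanged.  `detection_cells` is a set of
    (row, col) pairs; a new grid is returned."""
    return [
        [score + bonus if (i, j) in detection_cells else score
         for j, score in enumerate(row)]
        for i, row in enumerate(grid)
    ]
```
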
- next, the observation information reflection unit 322 updates the position score correspondence information after the process of step S6 (see FIG. 14A) based on the position information input from the position information input unit 1 (step S7).
- as the position information observed at the current time t10, {"p10a", "t10"} and {"p10b", "t10"} are input.
- the observation information reflecting means 322 performs the update process in step S7 using all the position information at the current time input in step S2.
- the observation information reflecting means 322 uses the position coordinates included in the position information acquired from the position information input unit 1 to determine which cell in the tracking area each position coordinate is included in.
- the observation information reflecting section 322 determines that the cell containing the position coordinates p10a is c(9,1), and that the cell containing the position coordinates p10b is c(10,7).
- the observation information reflecting unit 322 updates the position score correspondence information so that the score increase amount in the score of the cell corresponding to the detected position coordinate is larger than the score increase amount of other cells.
- specifically, the observation information reflecting means 322 leaves the scores of the cells c(9,1) and c(10,7), where position coordinates were detected, as they are, and subtracts 0.3 from the scores of the other cells, where no position coordinates were detected.
- as a result, the position score correspondence information shown in FIG. 14A is updated as shown in FIG. 15A.
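- the update of step S7 can be illustrated in the same way; the 0.3 decrement follows the example in the text, and the set of cells containing detected position coordinates is an assumed input:

```python
def reflect_position_observation(grid, detected_cells, penalty=0.3):
    """Step S7 sketch: keep the scores of cells where a position
    coordinate was detected (e.g. c(9,1) and c(10,7)) and subtract the
    penalty from every other cell.  `detected_cells` is a set of
    (row, col) pairs; a new grid is returned."""
    return [
        [score if (i, j) in detected_cells else score - penalty
         for j, score in enumerate(row)]
        for i, row in enumerate(grid)
    ]
```
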
- observation information reflecting means 322 updates the time included in the state information to the current time in addition to updating the position score correspondence information.
- here, the case where the observation information reflecting means 322 updates the time included in the state information to the current time is shown, but this update may instead be performed by the state prediction means 321.
- the state prediction unit 321 may update the time in the state information to the current time after performing the process of propagating the score in step S5.
- the observation information reflecting unit 322 creates the state information at the current time for the selected ID1 by performing the update processing of the position score correspondence information (steps S6 and S7) on the state information input from the state prediction unit 321.
- the state update means 32 has now created the current time state information for ID1 among the state information group acquired in step S3.
- steps S5 to S7 described above are processes for creating the current time state information for one ID based on the previous time state information.
- after step S7, the state update unit 32 returns to step S4 and determines whether unprocessed state information remains for steps S5 to S7.
- the processing of steps S5 to S7 is performed on the status information of ID2 as in the case of ID1.
- FIG. 12B is an example of the position score correspondence information of ID2 generated at the previous time t9.
- the state prediction means 321 performs the process of propagating the score of each cell indicated by this position score correspondence information, yielding the position score correspondence information shown in FIG. 13B. This is the prediction result of the position score correspondence information at time t10.
- the observation information reflection unit 322 updates the prediction result of the position score correspondence information based on the ID information (step S6).
- the result is shown in FIG. 14B.
- the observation information reflecting unit 322 then updates the position score correspondence information after step S6 based on the position information (step S7). That is, as in the case of ID1, the scores of the cells c(9,1) and c(10,7), where position coordinates were detected, are left as they are, and the scores of the other cells, where no position coordinates were detected, are reduced. The result is shown in FIG. 15B.
- after step S7, the state update unit 32 returns to step S4; when it determines that no unprocessed state information remains for steps S5 to S7 (No in step S4), the observation information reflection unit 322 stores each piece of current time state information, created for each ID by repeating the processes of steps S5 to S7, in the state storage unit 31 (step S8).
- each piece of state information contains the detection time included in the position information or ID information acquired from the position information input unit 1 or the ID information input unit 2, the ID of the moving object, and the position score correspondence information created in the processing of steps S5 to S7.
- the confirmed state selecting unit 331 of the flow line specifying unit 33 reads the state information of each ID for a certain past time from the current time from the state storage unit 31 (step S9).
- here, it is assumed that the confirmed state selecting unit 331 reads all the state information stored in the state storage unit 31 (the state information from time t1 to time t10).
- the specification of the time range in which the flow line is to be specified may be received, and the confirmed state selection unit 331 may read the state information corresponding to the specified time range.
- the confirmed state selecting unit 331 determines whether the state information read from the state storage unit 31 contains a state information group for an ID for which the process of reflecting the score at a definite point in time (specifically, steps S11 to S14) has not yet been performed (step S10).
- the state information read from the state storage unit 31 includes a state information group related to ID1 and a state information group related to ID2.
- the confirmed state selection unit 331 selects one ID and selects the confirmed state information for that ID (step S11).
- ID1 is selected as the ID.
- a criterion that "the state information generated at the latest time is the confirmed state information" is set in advance, and in accordance with this criterion the confirmed state selection unit 331 selects, from the state information group of ID1, the state information generated at the latest time as the confirmed state information. Therefore, in this case, the ID1 state information including the position score correspondence information at time t10 shown in FIG. 15A is selected as the confirmed state information.
- the time t10 included in the confirmed state information is a definite time; the other times t1 to t9 correspond to uncertain times.
- the confirmed state reflecting unit 332 determines whether state information at an uncertain time (hereinafter referred to as uncertain state information) exists in the selected state information group of ID1 (step S12).
- when uncertain state information exists (Yes in step S12), the confirmed state reflecting unit 332 performs the process of propagating the score of each cell in the position score correspondence information included in the confirmed state information, thereby predicting the state of the score at the uncertain time (t9) nearest to the definite time (t10) (step S13).
- the process of propagating the score is the same as in step S5: the confirmed state reflecting means 332 creates a copy of the position score correspondence information at the definite time (t10) and performs the process of propagating the score to nearby positions on the copy. The result is the position score correspondence information at the nearest uncertain time predicted based on the position score correspondence information at the definite time.
- next, the confirmed state reflecting means 332 selects the state information at the uncertain time (t9 in this example) nearest to the definite time, and reflects the position score correspondence information predicted in step S13 in the position score correspondence information included in that uncertain state information (step S14).
- specifically, the score of each cell in the position score correspondence information predicted in step S13 may be added to the score of the corresponding cell in the position score correspondence information included in the selected uncertain state information, or the corresponding scores may be multiplied together. Alternatively, the addition or multiplication may be performed after multiplying the score of each cell predicted in step S13 by a weighting factor. The result of this calculation is used as the score of each cell at the uncertain time nearest to the definite time, and the position score correspondence information at that uncertain time is updated accordingly.
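- the alternatives of step S14 (addition, multiplication, optional weighting) can be sketched as follows, again assuming a nested-list grid; the `mode` and `weight` parameters are illustrative names for the choices listed in the text:

```python
def reflect_confirmed_state(uncertain_grid, predicted_grid, mode="add", weight=1.0):
    """Step S14 sketch: combine the scores predicted from the confirmed
    state (step S13) with the scores at the nearest uncertain time,
    either by addition or by multiplication, optionally after applying a
    weighting factor to the predicted scores."""
    result = []
    for row_u, row_p in zip(uncertain_grid, predicted_grid):
        if mode == "add":
            result.append([u + weight * p for u, p in zip(row_u, row_p)])
        else:  # multiply the corresponding (weighted) scores
            result.append([u * (weight * p) for u, p in zip(row_u, row_p)])
    return result
```
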
- the confirmed state reflecting means 332 then treats the state information on which the process of step S14 has been performed as confirmed state information. As a result, time t9 is treated as a definite time.
- the flow line specifying means 33 returns to step S12 again, and determines whether or not indeterminate state information exists in the selected state information group of “ID1”.
- at this point, the state information at times t1 to t8 corresponds to the uncertain state information. Therefore, the confirmed state reflecting means 332 performs the process of propagating the score of each cell in the position score correspondence information at time t9, thereby predicting the state of the score at the uncertain time (t8) nearest to the definite times (step S13). Then, the confirmed state reflecting means 332 selects the state information at the uncertain time (t8) nearest to the definite times (t10, t9), and reflects the position score correspondence information predicted in step S13 in the position score correspondence information included in that uncertain state information (step S14). The state information at time t8 then becomes confirmed state information.
- the confirmed state reflecting means 332 similarly repeats the processes of steps S13 and S14 until the indeterminate state information no longer exists.
- when all the state information related to the selected ID "ID1" has become confirmed state information (No in step S12), the process returns to step S10. Since the processes of steps S11 to S14 have not been performed for "ID2", the process proceeds to step S11, and the confirmed state selecting unit 331 selects ID2 and the confirmed state information for ID2 (step S11). Then, as when ID1 was selected, the processing of steps S12 to S14 is repeated, and the score at the definite time is reflected in the scores at the uncertain times.
- when no uncertain state information remains for ID2, the process returns to step S10. When the confirmed state selection unit 331 determines that there is no ID for which the processing of steps S11 to S14 has not been performed (No in step S10), the moving body position detection unit 333 identifies a flow line using the state information group of each ID (step S15).
- step S15 the moving body position detecting means 333 specifies a flow line for each ID.
- as a method for determining the order in which the flow lines of the IDs are identified, an ID may be selected at random.
- alternatively, the flow lines may be identified by selecting IDs in descending order of how strongly the peak of the score appears during a certain past period from the current time.
- after selecting an ID, the mobile body position detection means 333 refers to the position score correspondence information at each time for that ID, and detects the cell in which a score peak appears at each time.
- as the cell in which the score peaks, the moving body position detecting unit 333 may simply detect the cell having the highest score, or may detect the cell corresponding to the center of gravity of the cells whose scores are equal to or greater than a fixed value (threshold).
- the process of detecting the cell in which the peak of the score appears may be performed in order from any time.
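- the two alternatives for detecting the peak cell can be sketched as follows; rounding the center of gravity to a cell index is an assumption for illustration, and the threshold variant assumes at least one cell reaches the threshold:

```python
def detect_peak_cell(grid, threshold=None):
    """Detect the cell where the score peaks at one time.  With no
    threshold, return the cell with the highest score; with a threshold,
    return the cell nearest the centre of gravity of all cells whose
    score is at least the threshold (the two alternatives in the text)."""
    if threshold is None:
        return max(
            ((i, j) for i, row in enumerate(grid) for j in range(len(row))),
            key=lambda c: grid[c[0]][c[1]],
        )
    cells = [(i, j) for i, row in enumerate(grid)
             for j, s in enumerate(row) if s >= threshold]
    total = sum(grid[i][j] for i, j in cells)
    ci = sum(i * grid[i][j] for i, j in cells) / total  # score-weighted row
    cj = sum(j * grid[i][j] for i, j in cells) / total  # score-weighted column
    return (round(ci), round(cj))
```
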
- the moving object position detecting means 333 determines that the moving object corresponding to the selected ID exists in the detected cell at each time, and determines the combination of each time and the position coordinates of the corresponding cell as information representing the flow line of the moving object (step S15).
- the flow line is represented by the position coordinates of the cells obtained by dividing the tracking area into grids. That is, the resolution of the position coordinate of the flow line depends on the resolution of the cell determined by grid division.
- in step S15, when IDs are selected in descending order of how strongly the score peak appears during a certain past period from the current time, it is preferable that a flow line generated later select cells other than the cells of the flow lines generated earlier. By selecting cells in this way, the accuracy of the flow lines can be improved.
- for example, suppose that peaks with the same score appear for the moving objects a and b in every state from time t1 to time t10. In that case, after the flow line of ID1 has been identified, the flow line of the other ID may be identified by tracing cells other than the cells selected for ID1.
- the moving body position detecting means 333 causes the flow line output unit 4 to display the flow line connecting the cell position coordinates in order of time (step S16).
- the moving body position detection unit 333 may display a flow line that connects the cell regions selected at each time in time series. Or you may display the flow line which connected the gravity center position of the cell selected in each time in time series.
- the moving body position detection unit 333 displays the ID corresponding to the flow line on the flow line output unit 4 together with the flow line.
- in this example, it becomes possible to determine which ID should be assigned to which moving body only when time t10 is reached. That is, for the flow lines from time t1 to time t9, the correct ID cannot be uniquely identified and assigned from the observations alone.
- in the present embodiment, however, the confirmed state information is selected in step S11, and the score at the definite time is reflected in the scores at the uncertain times (steps S13 and S14). As a result, even between time t1 and time t9, a difference arises between the scores in the position score correspondence information of ID1 and those of ID2, and the flow lines can be identified with high accuracy.
- a procedure for detecting a flow line of a moving body having no ID will be described.
- hereinafter, a mobile object having no ID is referred to as "unknown", and the position score correspondence information of a mobile object having no ID is referred to as the unknown position score correspondence information.
- the unknown position score correspondence information is stored collectively for each time. That is, even if there are a plurality of unknown objects, only one piece of unknown position score correspondence information needs to be prepared at each time.
- first, the state prediction unit 321 of the state update unit 32 acquires the unknown state information generated at the previous time from the state storage unit 31 (step S3). Then, the state prediction unit 321 propagates the score according to the movement model of the moving body, as for the other moving bodies having IDs, and predicts the position score correspondence information at the current time from that at the previous time (step S5). The observation information reflecting means 322 skips the process of step S6, because there is no ID observation information, and updates the state using the position observation information (step S7). Next, after the flow lines of all the moving bodies having IDs have been detected (steps S10 to S15), the flow line specifying means 33 performs the unknown flow line detection process.
- the score of the position score correspondence information at a definite time point is reflected on the score of the state at an uncertain time point (steps S13 and S14).
- the peak of the score at each time is selected to generate a flow line (step S15).
- that is, among the group of cells in which score peaks appear in the unknown position score correspondence information at each time, the mobile object position detection means 333 excludes the cells selected at each time for the mobile objects having IDs, and determines that the remaining cells are the positions of the unknown objects. By connecting these cells in time series, a flow line of a moving body having no ID can be generated.
- when detecting the peaks, a cell having a score exceeding a predefined threshold may be selected as a peak, or the cells whose scores are within the top n may be selected as peaks.
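- the selection of unknown cells at one time can be sketched as follows, using the top-n peak variant; the parameter `top_n` and the representation of the already-assigned ID cells as a set are assumptions for illustration:

```python
def unknown_cells_at_time(unknown_grid, id_cells, top_n=3):
    """Pick the top-n peak cells in the unknown position score
    correspondence information at one time, then exclude the cells
    already assigned to ID-bearing moving bodies; the remaining cells
    are taken as the positions of the unknown moving bodies."""
    ranked = sorted(
        ((i, j) for i, row in enumerate(unknown_grid) for j in range(len(row))),
        key=lambda c: unknown_grid[c[0]][c[1]],
        reverse=True,
    )
    peaks = set(ranked[:top_n])
    return peaks - set(id_cells)
```
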
- as described above, in the present embodiment, the state update unit 32 acquires the state information generated at the previous time from the state storage unit 31, creates the state information at the current time based on that state information, the movement model of the moving body, and the observation information of position and ID, and stores it in the state storage unit 31. Further, the flow line specifying unit 33 acquires the state information for a certain past period from the current time from the state storage unit 31, and generates a flow line by reflecting the score from the time at which the moving object identified by the ID is definite to the times at which it is uncertain.
- in the present embodiment, the position information and ID information observed by the sensors are converted into a state in which the likelihood of the moving object existing at each position in the tracking region is represented as a score. Even if sensor information is temporarily missing, the sensor information (position information and ID information) observed in the past is reflected in the scores, so the flow line can be detected robustly.
- further, since the flow line of each moving body can be uniquely determined by following the peak of the score at each time in the state information group related to a certain ID, it is not necessary to generate every combination of the position information and ID information detected during a certain past period as a flow line hypothesis and to calculate a likelihood for each hypothesis in order to estimate a flow line. Therefore, according to the present embodiment, even if the number of moving objects to be tracked increases, the number of combinations to be considered as flow line hypotheses does not increase, and processing remains feasible.
- Embodiment 2. The second embodiment is an embodiment in which the resolution of the divided regions (for example, cells) of the tracking region 50 in which scores are set is not fixed but variable.
- FIG. 17 is a block diagram showing an example of a flow line detection system according to the second embodiment of the present invention.
- the flow line detection system of the second embodiment includes a position information input unit 1, an ID information input unit 2, a flow line detection unit 3 b, and a flow line output unit 4.
- the position information input unit 1, the ID information input unit 2, and the flow line output unit 4 are the same as those in the first embodiment, and detailed description thereof is omitted.
- the flow line detection unit 3b includes a state update unit 32b, a state storage unit 31, and a flow line identification unit 33.
- the state storage unit 31 is the same as that of the first embodiment, and detailed description thereof is omitted.
- FIG. 18 is a block diagram illustrating a configuration example of the state update unit 32b according to the second embodiment.
- the state update unit 32b includes a resolution control unit 323, a state prediction unit 321b, and an observation information reflection unit 322b.
- the state predicting unit 321b and the observation information reflecting unit 322b are the same as the state predicting unit 321 and the observation information reflecting unit 322 in the first embodiment.
- the resolution control means 323 calculates distances between the position coordinates of the moving bodies input from the position information input unit 1, and controls the resolution of the divided areas according to those distances. That is, the resolution control means 323 redefines the division of the tracking area 50 (see FIG. 1) so as to change the size of the divided areas, and sets scores again for each redefined area.
- the tracking area 50 is grid-divided into cells as illustrated in FIG. 9 as an example.
- when the moving bodies approach each other, the resolution control means 323 reduces the cell size (for example, the length of each side) and resets the scores of the cells; when the moving bodies move apart, it increases the cell size (for example, the length of each side).
- the resolution control means 323 controls the resolution of the position score correspondence information according to the distance between the moving bodies at each time so that the flow line specifying means 33 can detect the flow lines while separating the moving bodies.
- in the present embodiment, the position score correspondence information represents the correspondence between cells, identified by two-dimensional coordinates, and scores.
- the resolution control unit 323 selects two pieces of the position information of the moving bodies input from the position information input unit 1, and calculates the distance in the x-axis direction and the distance in the y-axis direction between the position coordinates included in the selected position information.
- the resolution control unit 323 sequentially selects combinations of two position information, and calculates the distance in the x-axis direction and the distance in the y-axis direction between the position coordinates for each combination.
- the resolution control means 323 calculates the distances in the x-axis and y-axis directions for all combinations of two points, and sets the resolution of the cells so that the resolution is higher than the shortest of these distances. Here, a resolution higher than the shortest distance means, for example, setting the length of one side of a cell to be shorter than the shortest distance.
- the distance between moving bodies may temporarily approach 0, for example when moving bodies pass each other.
- if the resolution were changed using the distance calculated at each single time step, position score correspondence information of the highest resolution would have to be generated every time the moving bodies approach each other. Therefore, instead of changing the resolution using the distance calculated in one time step, the resolution may be increased (the divided areas made smaller) on the condition that the state in which the shortest distance between moving bodies is shorter than the resolution at the previous time has continued for a certain period or longer, and decreased (the divided areas made larger) on the condition that the state in which the shortest distance between moving bodies is longer than the resolution at the previous time has continued for a certain period or longer.
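- the hysteresis described above can be sketched as follows; the hold period and the shrink/grow factors are illustrative assumptions, not values from the text, and the caller maintains the comparison history across time steps:

```python
def choose_cell_size(shortest_distance, prev_cell_size, history,
                     hold_steps=5, shrink=0.5, grow=2.0):
    """Decide the cell side length for the current time step with
    hysteresis: shrink only after the shortest inter-object distance has
    stayed below the previous cell size for hold_steps consecutive
    steps, and grow only after it has stayed above for hold_steps steps.
    This avoids regenerating a finest-resolution grid every time two
    objects merely pass each other."""
    history.append("below" if shortest_distance < prev_cell_size else "above")
    recent = history[-hold_steps:]
    if len(recent) == hold_steps and all(r == "below" for r in recent):
        history.clear()                    # start a fresh observation window
        return shortest_distance * shrink  # one side shorter than the shortest distance
    if len(recent) == hold_steps and all(r == "above" for r in recent):
        history.clear()
        return prev_cell_size * grow
    return prev_cell_size
```
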
- when the resolution control unit 323 has determined the resolution to be used at the current time using the position information acquired from the position information input unit 1, it updates the resolution of the state information group at the previous time, acquired from the state storage unit 31, to the determined resolution.
- specifically, the resolution control means 323 determines the score of each cell at the new resolution based on the scores in the position score correspondence information included in the state information group at the previous time, thereby regenerating the position score correspondence information of the previous time in accordance with the resolution of the current time. This process can be said to update the resolution of the position score correspondence information at the previous time.
- at this time, the position score correspondence information of the previous time is not updated by overwriting; instead, position score correspondence information corresponding to the resolution of the current time is newly created.
- the resolution control means 323 inputs this position score correspondence information to the state prediction means 321b, and the state prediction means 321b and the observation information reflection means 322b process it, whereby the position score correspondence information at the current time is completed.
- the resolution control unit 323 may create a copy of the position score correspondence information at the previous time and input the copy to the state prediction unit 321b.
- the resolution control unit 323 inputs the position score correspondence information, obtained by updating that of the previous time according to the resolution of the current time, to the state prediction means 321b as state information combined with the time included in the state information of the previous time (that is, the previous time) and the ID.
- that is, the position score correspondence information is input to the state prediction unit 321b as state information together with the time and the ID.
- the resolution control unit 323 also inputs the position information at the current time input from the position information input unit 1 together with the state information to the state prediction unit 321b.
- when the state information and the position information of the moving body are input from the resolution control unit 323, the state prediction unit 321b performs the process of propagating the scores in the position score correspondence information included in the state information according to the movement model. This process is the same as the process of the state prediction unit 321 in the first embodiment, and can be said to predict the position score correspondence information at the current time.
- the state prediction unit 321b inputs the state information, including the position score correspondence information on which the score propagation process has been performed, and the position information of the moving object to the observation information reflection unit 322b.
- the observation information reflecting means 322b receives ID information at the current time from the ID information input unit 2.
- the observation information reflecting means 322b uses the current time ID information, the state information input from the state prediction means 321b, and the current time position information to create the current time state information, and stores it in the state storage unit 31.
- the observation information reflecting means 322b reflects the current time ID information and the current time position information in the position score correspondence information included in the input state information. This process is the same as the process of the observation information reflecting unit 322 in the first embodiment.
- the observation information reflecting means 322b causes the state storage unit 31 to store state information including ID information at the current time and position score correspondence information reflecting the position information as state information at the current time.
- FIG. 18 shows the case where the position information is input to the resolution control means 323 and the ID information is input to the observation information reflection means 322b. However, as long as the resolution control means 323 and the state prediction means 321b can refer to the position information, and the observation information reflection means 322b can refer to the position information and the ID information, the position information and the ID information may be input to any means of the state update means 32b.
- the flow line specifying means 33 includes the confirmed state selecting means 331, the confirmed state reflecting means 332, and the moving body position detecting means 333, as in the first embodiment, and each of these means is the same as in the first embodiment. Therefore, the confirmed state reflecting unit 332 propagates the score of each position according to the movement model with respect to the position score correspondence information at a definite time, and reflects the position score correspondence information after this processing in the position score correspondence information at the nearest uncertain time. However, in the second embodiment, the resolution may differ between pieces of position score correspondence information. In this case, the confirmed state reflecting means 332 reflects the position score correspondence information after the score propagation process in the position score correspondence information at the nearest uncertain time by a method that accounts for the difference in resolution.
- the other points are the same as in the first embodiment: the flow line specifying means 33 reads, from the state storage unit 31, the state information of each ID generated during a certain past period up to the current time, and identifies the flow line for each ID.
- the state update means 32b (state prediction means 321b, observation information reflection means 322b, and resolution control means 323) and the flow line identification means 33 (confirmed state selection means 331, confirmed state reflection means 332, and moving body position detection means 333) are realized by, for example, a CPU of a computer that operates according to a flow line detection program.
- a computer program storage device (not shown) stores the flow line detection program, and the CPU reads the program and, in accordance with it, operates as the state update means 32b (state prediction means 321b, observation information reflection means 322b, and resolution control means 323) and the flow line specifying means 33 (confirmed state selecting means 331, confirmed state reflecting means 332, and moving body position detecting means 333).
- the state update unit 32b and the flow line specifying unit 33 may be realized by different hardware.
- the state prediction unit 321b, the observation information reflection unit 322b, and the resolution control unit 323 may also be realized by different hardware.
- FIGS. 19 and 20 are flowcharts illustrating an example of processing progress of the flow line detection unit 3b according to the second embodiment.
- an example of processing progress of the second embodiment will be described with reference to FIGS. 21 and 22.
- the same processing as that in the first embodiment is denoted by the same reference numerals as those in FIGS. 7 and 8, and detailed description thereof is omitted.
- the state update unit 32b acquires ID information (that is, a set of the ID of the moving object, the ID information input unit number, and the detection time of the ID) from the ID information input unit 2 (step S1).
- the state update unit 32b acquires position information (that is, a set of the position coordinates of the moving object and the detection time of the position coordinates) from the position information input unit 1 (step S2).
- the resolution control unit 323 of the state update unit 32b acquires, from the state storage unit 31, the state information of the latest time stored in the state storage unit 31 (that is, the state information generated at the time immediately before the current time) (step S3).
- the resolution control means 323 determines the resolution at the time of generating the position score correspondence information at the current time using the position information of each moving body acquired in step S2 (step S23).
- the resolution control unit 323 then creates position score correspondence information that contains the identification information (for example, cell coordinates) of each cell obtained by dividing the tracking region in accordance with the resolution, with the score associated with the coordinates of each cell left in an undetermined state.
- a specific example of the processing of step S23 will be described with reference to FIGS. 21 and 22.
- the resolution control unit 323 calculates the distance in the x-axis direction and the distance in the y-axis direction between pa and pb, between pb and pc, and between pc and pa, respectively.
- the distance in the x-axis direction between pa and pb is represented as ab_x, and the distance in the y-axis direction as ab_y; the distances between pb and pc are represented as bc_x and bc_y; and the distances between pc and pa are represented as ac_x and ac_y.
- the resolution control unit 323 calculates the distance in the x-axis direction and the distance in the y-axis direction between each pair of position coordinates as described above, and selects the shortest distance. In the example shown in FIG. 21, ab_x is assumed to be the shortest distance, and in step S23, as shown in FIG. 22, this distance ab_x is determined as the new resolution.
- the upper limit and lower limit of the resolution may be defined in advance.
- the upper limit value of the resolution is a value when the resolution becomes the highest, and is the minimum value as the resolution such as the cell size.
- the lower limit value of the resolution is a value when the resolution is the lowest, and is the maximum value as the resolution such as the cell size.
- when the selected distance is shorter than the upper limit of the resolution, the upper limit is set as the resolution at the current time, and when the distance between the two position coordinates is longer than the lower limit of the resolution, the lower limit value may be used as the resolution at the current time.
- the upper limit value of the resolution may be determined based on the resolution of the position coordinates input from the position information input unit 1. Further, the lower limit value of the resolution may be determined on the basis that each ID detection area in the tracking area belongs to a different cell.
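- the resolution selection of step S23 and the clamping described above can be sketched as follows (a minimal illustration; the function name, the default limit values, and the list-of-coordinates representation are assumptions of this sketch, not part of the patent):

```python
def determine_resolution(positions, upper_limit=0.1, lower_limit=2.0):
    """Choose the cell size (resolution) for the current time.

    positions: list of (x, y) coordinates of the moving bodies.
    upper_limit: minimum cell size (highest resolution), e.g. derived
        from the sensor resolution -- an illustrative value.
    lower_limit: maximum cell size (lowest resolution) -- illustrative.
    """
    # Collect the x-axis and y-axis distances of every pair
    # (ab_x, ab_y, bc_x, ... in the text) and pick the shortest.
    distances = []
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            distances.append(abs(positions[i][0] - positions[j][0]))
            distances.append(abs(positions[i][1] - positions[j][1]))
    if not distances:
        return lower_limit  # a single body: the coarsest cells suffice
    shortest = min(distances)
    # Clamp: shorter than the upper limit -> use the upper limit;
    # longer than the lower limit -> use the lower limit.
    return min(max(shortest, upper_limit), lower_limit)
```

the clamp keeps the cell size between the sensor-driven minimum and the separability-driven maximum, as described above.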
- next, it is determined whether state information at the current time with updated scores has not yet been created for some state information in the previous-time state information group acquired from the state storage unit 31 (specifically, whether the processing of steps S24, S5, S6, and S7 remains to be performed) (step S4).
- if such state information exists, the resolution control means 323 selects one piece of unprocessed state information from the state information group at the previous time read in step S3. Then, the resolution control unit 323 generates position score correspondence information that matches the position score correspondence information of the previous time included in that state information with the resolution determined in step S23 (step S24). Specifically, the resolution control unit 323 sets the score corresponding to the identification information of each cell in the position score correspondence information created in step S23 based on the position score correspondence information of the previous time. That is, in the position score correspondence information created in step S23, the resolution is the resolution at the current time but the scores are undetermined, and the scores are set in step S24.
- the procedure of step S24 will be specifically described with reference to FIGS. 21 and 22. First, a case where the resolution is updated from a fine one to a coarse one will be described.
- FIG. 21 represents the cell state at the previous time
- FIG. 22 represents the cell state at the current time.
- at this point, each cell shown in FIG. 22 has been defined, but the score corresponding to each cell is undetermined.
- the resolution control means 323 determines, for each cell obtained by dividing the tracking area with the resolution determined in step S23, which of the cells indicated by the position score correspondence information at the previous time it includes. For example, among the cells shown in FIG. 22 (the cells having the resolution at the current time determined in step S23), the cell c(0,0) is determined to include the cells c(0,0), c(0,1), c(1,0), and c(1,1) indicated by the position score correspondence information at the previous time (the cells shown in FIG. 21). Then, the resolution control means 323 calculates, for example, the average of the scores of the previous-time cells c(0,0), c(0,1), c(1,0), and c(1,1) included in the cell c(0,0) (see FIG. 21), and determines that value as the score corresponding to c(0,0) in the position score correspondence information created in step S23.
- the resolution control means 323 calculates the score for each cell having the resolution at the current time as described above, and includes it in the position score correspondence information.
- the average value of the score of each cell at the previous time included in the cell at the current time is determined as the score at the current time.
- alternatively, the resolution control unit 323 may determine the highest value among the scores of the previous-time cells included in a current-time cell as the score of that cell at the current time. For example, the highest score among the cells c(0,0), c(0,1), c(1,0), and c(1,1) shown in FIG. 21 may be defined as the score of the cell c(0,0) at the current time shown in FIG. 22.
- the resolution control means 323 determines the score for each cell having the resolution at the current time, and includes it in the position score correspondence information created in step S23, thereby adapting the position score correspondence information at the previous time to the resolution at the current time. Position score correspondence information can be obtained.
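- for the fine-to-coarse case, the averaging and maximum variants described above might look like the following sketch (the dictionary-of-cells representation and the grouping factor are assumptions of this illustration; a factor of 2 mirrors the FIG. 21 to FIG. 22 example):

```python
def coarsen_scores(fine_scores, factor=2, use_max=False):
    """Aggregate a fine cell grid into coarser cells.

    fine_scores: dict mapping (row, col) of a fine cell to its score.
    factor: how many fine cells fit along one side of a coarse cell.
    use_max: False -> average of the contained fine-cell scores (the
        first method in the text); True -> highest contained score
        (the alternative method).
    """
    groups = {}
    for (r, c), score in fine_scores.items():
        # Integer division maps each fine cell to its coarse parent.
        groups.setdefault((r // factor, c // factor), []).append(score)
    if use_max:
        return {cell: max(s) for cell, s in groups.items()}
    return {cell: sum(s) / len(s) for cell, s in groups.items()}
```

for example, the four previous-time cells grouped under c(0,0) would yield either their average or their maximum as the current-time score.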
- next, a case where the resolution is updated from a coarse one to a fine one will be described; in this case, FIG. 22 represents the cell state at the previous time, and FIG. 21 represents the cell state at the current time.
- the resolution control means 323 determines, for each cell obtained by dividing the tracking area with the resolution determined in step S23, which of the cells indicated by the position score correspondence information at the previous time includes it.
- the resolution control means 323 determines the score of the previous-time cell that includes a current-time cell as the score of that current-time cell, and includes it in the position score correspondence information. For example, each of the cells shown in FIG. 21 (in this example, the cells having the resolution at the current time determined in step S23), such as c(0,0), is included in the cell c(0,0) indicated by the position score correspondence information at the previous time (in this example, the cell shown in FIG. 22). Therefore, the resolution control means 323 determines the score of the cell c(0,0) shown in FIG. 22 as the score of the current-time cell c(0,0).
- the resolution control means 323 determines the score as described above for each cell having the resolution at the current time and includes it in the position score correspondence information, whereby position score correspondence information in which the position score correspondence information of the previous time is adapted to the resolution of the current time can be obtained.
- a cell defined at the current time may straddle a plurality of cells defined at the previous time.
- the resolution control unit 323 may use the average value of the scores set in the cell at the previous time that the cell defined at the current time straddles as the score of the cell at the current time. Also, the resolution control means 323 weights the score by the area ratio that the cell at the current time straddles for each cell at the previous time that the cell at the current time straddles, and calculates the average value of the weighted scores at the current time It may be used as the score of the cell.
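- when cells straddle, the area-ratio weighting suggested above can be sketched as follows (a square tracking area and a dictionary-of-cells grid representation are assumptions of this illustration):

```python
def rescale_scores(old_scores, old_res, new_res, size):
    """Area-weighted resampling of cell scores onto a new resolution.

    old_scores: dict mapping (row, col) in the old grid to a score.
    old_res / new_res: cell edge lengths of the two grids.
    size: edge length of the (square) tracking area.
    Each new cell's score is the average of the old scores weighted
    by the overlap area, as suggested for straddling cells.
    """
    def overlap(a0, a1, b0, b1):
        # Length of the intersection of intervals [a0, a1] and [b0, b1].
        return max(0.0, min(a1, b1) - max(a0, b0))

    n_new = int(round(size / new_res))
    n_old = int(round(size / old_res))
    new_scores = {}
    for r in range(n_new):
        for c in range(n_new):
            total, weight = 0.0, 0.0
            for ro in range(n_old):
                for co in range(n_old):
                    # Overlap area of the new cell and the old cell.
                    w = (overlap(r * new_res, (r + 1) * new_res,
                                 ro * old_res, (ro + 1) * old_res) *
                         overlap(c * new_res, (c + 1) * new_res,
                                 co * old_res, (co + 1) * old_res))
                    if w > 0.0:
                        total += w * old_scores.get((ro, co), 0.0)
                        weight += w
            new_scores[(r, c)] = total / weight if weight else 0.0
    return new_scores
```

this single routine covers both directions: when the grids nest exactly it reduces to the average (coarsening) or the copy (refining) described above.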
- the resolution control unit 323 newly generates, as state information, a set of the time (the previous time) and the ID included in the state information selected in step S24 and the position score correspondence information created in step S24, and inputs it into the state prediction means 321b. The resolution control unit 323 also inputs the position information of the moving body acquired in step S2 to the state prediction unit 321b.
- the state predicting unit 321b calculates the elapsed time from the time (the previous time) of the state information input from the resolution control unit 323 to the current time, and predicts the current state of the scores by propagating the scores in the position score correspondence information included in the input state information in accordance with a predefined movement model of the moving object (step S5).
- the process of propagating the score in the position score correspondence information is the same as step S5 in the first embodiment.
- since the resolution of the position score correspondence information is variable in the second embodiment, it is desirable not to include the resolution in the parameters of the movement model when defining the movement model of the moving object.
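- a resolution-independent movement model as recommended above might be sketched like this: the model parameter is a physical maximum speed (an assumed, illustrative propagation mode, not the patent's own), and it is converted into a reach in cells only at the moment the propagation is applied, so the model itself never depends on the current resolution:

```python
def propagate_scores(scores, grid_size, elapsed, max_speed, resolution):
    """Spread each cell's score to cells reachable within the elapsed time.

    scores: dict mapping (row, col) to a score.
    max_speed: movement-model parameter in physical units per second
        (an assumed, illustrative model).
    resolution: current cell edge length; the reach in cells is derived
        from it here, so the model stays resolution independent.
    """
    reach = int(max_speed * elapsed / resolution)  # reach in cells
    new_scores = {}
    for r in range(grid_size):
        for c in range(grid_size):
            # Each cell takes the best score reachable from it, i.e.
            # the score peak can move by at most `reach` cells.
            best = 0.0
            for dr in range(-reach, reach + 1):
                for dc in range(-reach, reach + 1):
                    best = max(best, scores.get((r + dr, c + dc), 0.0))
            new_scores[(r, c)] = best
    return new_scores
```

with a coarser grid the same physical speed translates into a smaller reach in cells, which is exactly why the model parameters should not be expressed in cells.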
- after performing the process of propagating the score in step S5, the state prediction unit 321b updates the time (the previous time) in the input state information to the current time.
- the time included in the position information of the moving body may be used as the current time.
- alternatively, the process of updating the time in the state information to the current time may be performed by the observation information reflecting unit 322b.
- the observation information reflecting unit 322b may update the time included in the state information to the current time in step S7 and the like.
- the state prediction unit 321b inputs, to the observation information reflecting unit 322b, the state information whose position score correspondence information was updated in step S5 and the current position information of the moving body input from the resolution control unit 323.
- the observation information reflecting unit 322b updates the position score correspondence information included in the state information by using the state information and the moving body position information input from the state prediction unit 321b and the current-time ID information input from the ID information input unit 2 (steps S6 and S7).
- the processes in steps S6 and S7 may be the same as steps S6 and S7 in the first embodiment.
- the increase amount of the score of the cell partially overlapping with the ID detection region may be the same as or less than the increase amount of the score of the cell included in the ID detection region.
- the state information obtained as a result of step S7 is state information at the current time.
- the process up to step S7 is the process of creating the current time status information based on the previous time status information for one ID.
- the state update means 32b regards the state information at the current time as already created for the IDs for which the processing of steps S24, S5, S6, and S7 has been performed.
- the state update unit 32b returns to step S4 again after step S7, and determines whether or not unprocessed state information exists for steps S24, S5, S6, and S7.
- when it is determined that there is no unprocessed state information for steps S24, S5, S6, and S7 (No in step S4), the observation information reflecting unit 322b stores, in the state storage unit 31, each piece of state information at the current time created by repeating the processes of steps S24, S5, S6, and S7 for each ID (step S8).
- the flow line specifying means 33 acquires the state information of each ID for a certain past time from the current time from the state storage unit 31, and specifies the flow line (steps S9 to S16).
- the flow line generation procedure may be the same as the method described in steps S9 to S16 of the first embodiment.
- in step S13, the resolution of the position score correspondence information at the most recent time predicted based on the position score correspondence information at the definite time point may differ from the resolution of the position score correspondence information in the indeterminate state information at that most recent time.
- the confirmed state reflecting unit 332 may perform the process of step S14 as follows.
- the position score correspondence information in the indeterminate state information is simply referred to as indeterminate state information.
- the confirmed state reflecting means 332 specifies, among the cells of the position score correspondence information obtained in step S13, the cells spanned by one cell of interest in the indeterminate state information, and calculates the average value of the scores of those cells. Then, the confirmed state reflecting means 332 updates the score of the cell of interest in the indeterminate state information by adding the average value to that score or multiplying that score by it. The confirmed state reflecting means 332 may perform this process for each cell of the indeterminate state information.
- when obtaining the average value, the confirmed state reflecting means 332 may, for each cell of the position score correspondence information obtained in step S13 that one cell of the indeterminate state information spans, weight the score by the area ratio that the indeterminate-state cell occupies in it, and obtain the average of the weighted scores. Then, the confirmed state reflecting means 332 may add that average value to the score of the cell of interest in the indeterminate state information, or multiply the score by it.
- alternatively, the confirmed state reflecting means 332 may identify, among the cells of the position score correspondence information obtained in step S13, the cells spanned by one cell of the indeterminate state information, specify the maximum value among the scores of those cells, and add it to the score of the cell of interest in the indeterminate state information or multiply that score by it.
- further, the confirmed state reflecting means 332 may specify, among the cells of the position score correspondence information obtained in step S13, the cell that includes one cell of the indeterminate state information, and may add the score of that cell to the score of the cell of interest in the indeterminate state information or multiply that score by it. The confirmed state reflecting means 332 may perform this process for each cell of the indeterminate state information.
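- the step S14 variant described above, for the case where the definite-time grid is finer than the indeterminate-time grid by an integer factor, might be sketched as follows (the grid representation and the integer factor are assumptions of this illustration):

```python
def reflect_definite(indet_scores, definite_scores, factor,
                     use_max=False, multiply=True):
    """Reflect a finer definite-time grid into a coarser indeterminate grid.

    indet_scores: dict (row, col) -> score of the indeterminate grid.
    definite_scores: dict (row, col) -> score of the (finer) definite
        grid after score propagation (the result of step S13).
    factor: how many definite cells one indeterminate cell spans per axis.
    use_max / multiply: choose between the averaging and maximum
        variants, and between adding and multiplying, as in the text.
    """
    updated = {}
    for (r, c), score in indet_scores.items():
        # Scores of the definite cells spanned by this indeterminate cell.
        spanned = [definite_scores.get((r * factor + dr, c * factor + dc), 0.0)
                   for dr in range(factor) for dc in range(factor)]
        value = max(spanned) if use_max else sum(spanned) / len(spanned)
        updated[(r, c)] = score * value if multiply else score + value
    return updated
```

applied per indeterminate cell, this implements the "average the spanned cells, then add or multiply" update described above.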
- the position of the moving body can be specified for each ID, and the flow line can be specified robustly.
- the resolution control unit 323 controls the resolution of the position score correspondence information generated at each time according to the closeness of the distances between the position coordinates of the moving bodies, so that a flow line can be specified for each moving body (that is, for each ID) with the minimum necessary amount of calculation.
- Embodiment 3. In the third embodiment, the score peak position is detected from the position score correspondence information at each time, and the flow line is determined by specifying the position coordinates of the moving object closest to the score peak position based on the position information input from the position information input unit 1.
- FIG. 23 is a block diagram showing an example of a flow line detection system according to the third embodiment of the present invention.
- the flow line detection system of the third embodiment includes a position information input unit 1, an ID information input unit 2, a flow line detection unit 3 c, and a flow line output unit 4.
- the position information input unit 1, the ID information input unit 2, and the flow line output unit 4 are the same as those in the first embodiment, and detailed description thereof is omitted.
- the flow line detection unit 3c includes a state update unit 32c, a state storage unit 31, a flow line identification unit 33c, and a position information storage unit 34.
- the state storage unit 31 is a storage device that stores state information at each time.
- the state storage unit 31 is the same as that of the first embodiment, and detailed description thereof is omitted.
- FIG. 24 is a block diagram illustrating a configuration example of the state update unit 32c according to the third embodiment.
- the state update unit 32c includes a state prediction unit 321c and an observation information reflection unit 322c.
- the state prediction means 321c propagates the score indicated by the position score correspondence information included in the state information at the previous time read from the state storage unit 31 according to a predetermined movement model.
- the observation information reflecting unit 322c updates the position score correspondence information after the process of propagating the score based on the position information input from the position information input unit 1 and the ID information input from the ID information input unit 2.
- the state predicting unit 321c and the observation information reflecting unit 322c are the same as the state predicting unit 321 and the observation information reflecting unit 322 in the first embodiment, and detailed description thereof is omitted.
- the observation information reflecting unit 322c stores the position information input from the position information input unit 1 in the position information storage unit 34.
- the process of storing the position information in the position information storage unit 34 may be performed by the state prediction unit 321c.
- as long as the state predicting unit 321c and the observation information reflecting unit 322c can refer to the ID information and the position information, the ID information and the position information may be input to either the state predicting unit 321c or the observation information reflecting unit 322c.
- FIG. 25 is a block diagram showing a configuration example of the flow line specifying means 33c in the third embodiment.
- the flow line specifying unit 33c includes a confirmed state selecting unit 331, a confirmed state reflecting unit 332, and a moving body position detecting unit 333c.
- the confirmed state selection means 331 acquires, from the state storage unit 31, the state information of each ID generated during a certain past period up to the current time, and selects from it the state information including the ID of the moving body whose flow line is to be derived. Then, the confirmed state selecting unit 331 selects state information at a definite time from among the state information at each time for that ID. The confirmed state reflecting means 332 corrects the scores indicated by the position score correspondence information at the indeterminate times while following the indeterminate times in order starting from the one nearest to the definite time.
- the confirmed state selecting unit 331 and the confirmed state reflecting unit 332 are the same as those in the first embodiment, and a description thereof will be omitted.
- the moving body position detecting means 333c detects the position of the moving body at each time from the position score correspondence information at each time after the processing by the confirmed state reflecting means 332.
- the mobile object position detection unit 333c in the third embodiment does not simply detect the position where the score peaks as the position of the mobile object from the position score correspondence information; rather, among the position coordinates detected at the time corresponding to the position score correspondence information, it specifies the position closest to the position where the score peaks.
- the moving body position detection unit 333c performs this process on the position score correspondence information at each time.
- the moving body position detection unit 333c uses the position coordinates in time series order as the flow line. For example, a line connecting position coordinates in time series is displayed on the flow line output unit 4 as a flow line.
- the state update means 32c (state prediction means 321c and observation information reflection means 322c) and the flow line identification means 33c (confirmed state selection means 331, confirmed state reflection means 332, and moving body position detection means 333c) are realized by, for example, a CPU of a computer that operates according to a flow line detection program.
- a program storage device (not shown) stores the flow line detection program, and the CPU may read the program and, in accordance with it, operate as the state update means 32c (state prediction means 321c and observation information reflection means 322c) and the flow line identification means 33c.
- the state update unit 32c and the flow line specifying unit 33c may be realized by different hardware.
- the state prediction unit 321c and the observation information reflection unit 322c may also be realized by different hardware.
- the confirmed state selecting means 331, the confirmed state reflecting means 332, and the moving body position detecting means 333c may also be realized by other hardware.
- FIG. 26 and FIG. 27 are flowcharts illustrating an example of processing progress of the flow line detection unit 3c of the third embodiment.
- the same processing as in the first embodiment is denoted by the same reference numerals as in FIGS. 7 and 8, and detailed description thereof is omitted.
- the process (steps S1 to S8) in which the state update unit 32c generates the state information at the current time using the state information at the previous time and stores it in the state storage unit 31 is the same as steps S1 to S8 in the first embodiment.
- the state update unit 32c stores the position information (a set of the position coordinates of the moving body and its detection time) acquired from the position information input unit 1 in Step S2 in the position information storage unit 34 (Step S38).
- the flow line specifying means 33c reads state information for a certain past period up to the current time from the state storage unit 31, selects the confirmed state information, and, based on the score indicated by the confirmed state information, predicts the score at the most recent indeterminate time point and reflects it in the indeterminate state information at that time point (steps S9 to S14). If there is no state information for which the processing of steps S11 to S14 has not yet been performed among the state information read from the state storage unit 31 (No in step S10), the process proceeds to step S314. At this point, the scores of the state information at the definite time points have been sequentially reflected in the state information at the indeterminate time points.
- next, the moving body position detecting unit 333c reads each piece of position information for a certain past period up to the current time from the position information storage unit 34 (step S314).
- this past time width is the same as the time width used when the confirmed state selection means 331 reads the state information for a certain past period up to the current time in step S9.
- the moving body position detecting means 333c specifies a flow line for each ID using the state information at each time and the position information acquired in step S314 (step S15c).
- the moving object position detection unit 333c sequentially selects the IDs of the moving objects. Then, the mobile object position detection unit 333c refers to the position score correspondence information included in the state information at each time for the selected ID, and detects the cell in which the peak of the score appears at each time. The mobile body position detection unit 333c further specifies, among the position coordinates included in the position information at the time corresponding to the position score correspondence information in which the cell was detected, the position coordinate closest to the detected cell (that is, the cell in which the peak of the score appears).
- the moving body position detecting means 333c specifies the position coordinates closest to the cell in which the peak of the score appears in the same manner for each time with respect to the selected ID. Information representing the flow line is obtained by specifying the identified coordinates in chronological order. If the position coordinates at each time are specified for one ID, the mobile object position detection unit 333c selects the next ID and similarly specifies the position coordinates at each time. The moving body position detecting unit 333c selects each ID and repeats the same processing, whereby the flow line of each ID is obtained.
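- the peak-snapping of step S15c might be sketched as follows (the grid representation, the cell-centre convention, and the function name are assumptions of this illustration):

```python
def snap_peak_to_position(scores, positions, resolution):
    """Pick the observed coordinate closest to the score-peak cell.

    scores: dict (row, col) -> score for one time step.
    positions: list of (x, y) coordinates detected at that time.
    resolution: cell edge length, used to locate the cell centre.
    Returns the detected coordinate nearest to the centre of the cell
    where the score peaks, as described for step S15c.
    """
    peak_row, peak_col = max(scores, key=scores.get)
    # Centre of the peak cell in tracking-area coordinates.
    cx = (peak_col + 0.5) * resolution
    cy = (peak_row + 0.5) * resolution
    # Nearest detected coordinate by squared Euclidean distance.
    return min(positions,
               key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)
```

repeating this per time step and per ID, and connecting the returned coordinates in time order, yields a flow line whose resolution is that of the position sensor rather than that of the cell grid.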
- when the moving body position detecting means 333c has specified the position coordinates at each time for each ID, it displays, for each ID, the flow line connecting the position coordinates in time series on the flow line output unit 4 (step S16).
- the position of the moving body can be specified for each ID, and the flow line can be estimated robustly.
- since the resolution of the generated flow line does not depend on the resolution of the position score correspondence information, a fine flow line can be generated even if the state is generated with the minimum resolution necessary to separate the moving objects.
- the third embodiment may be combined with the second embodiment, and the state update unit may include a resolution control unit in addition to the state prediction unit and the observation information reflection unit. In this case, both the effects of the second embodiment and the effects of the third embodiment are obtained.
- FIG. 28 is a block diagram showing an example of the minimum configuration of the flow line detection system of the present invention.
- the flow line detection system of the present invention includes position score correspondence information generating means 81, state storage means 82, and flow line specifying means 83.
- the position score correspondence information generating unit 81 generates, for each piece of moving body identification information, position score correspondence information in which a score indicating the possibility that a moving body with specific identification information exists is defined for each position (for example, for each cell) in the tracking area of the moving body.
- State storage means 82 (for example, state storage unit 31) stores position score correspondence information for each time.
- the flow line specifying unit 83 selects, for each piece of moving body identification information, position score correspondence information satisfying a predetermined criterion from among the position score correspondence information stored in the state storage means as definite position score correspondence information, reflects the definite position score correspondence information in the position score correspondence information at the time closest to the time corresponding to the definite position score correspondence information, repeats determining that position score correspondence information as definite position score correspondence information, and specifies the flow line of the moving object from the scores in the position score correspondence information at each time.
- the position of the moving body of each identification information can be determined with high accuracy and the flow line can be detected.
- (Supplementary note 1) A flow line detection system comprising: position score correspondence information generating means for generating, for each piece of moving body identification information, position score correspondence information in which a score indicating the possibility that a moving body with unique identification information exists is defined for each position in the tracking area of the moving body; state storage means for storing the position score correspondence information for each time; and flow line specifying means for selecting, for each piece of moving body identification information, position score correspondence information satisfying a predetermined criterion from among the position score correspondence information stored in the state storage means as definite position score correspondence information, reflecting the definite position score correspondence information in the position score correspondence information at the time closest to the time corresponding to the definite position score correspondence information, repeating the determination of that position score correspondence information as definite position score correspondence information, and identifying the flow line of the moving body from the scores in the position score correspondence information at each time.
- (Supplementary note 2) The flow line detection system according to supplementary note 1, wherein the position score correspondence information generating means includes: score propagation means for propagating the score of each position in the position score correspondence information created at a time before the current time as the score of a nearby position according to a predetermined score propagation mode; and observation information reflecting means for generating the position score correspondence information at the current time by updating the scores of the position score correspondence information propagated by the score propagation means based on the detection area where the identification information was detected at the current time and the position coordinates of the moving object detected at the current time.
- The flow line detection system according to supplementary note 1, wherein the position-score correspondence information generating means includes: resolution control means for determining, based on the distances between the position coordinates of the moving objects detected at the current time, the individual areas in the tracking area to which scores are assigned, and generating position-score correspondence information that defines a score for each of those individual areas based on the scores in the position-score correspondence information created at a time before the current time; score propagation means for propagating each score in the position-score correspondence information generated by the resolution control means as scores of nearby positions according to a predetermined score propagation mode; and observation information reflection means for generating the position-score correspondence information at the current time by updating the scores of the position-score correspondence information to which the scores have been propagated by the score propagation means, based on the detection area in which identification information was detected at the current time and the position coordinates of the moving objects detected at the current time.
- The flow line detection system according to any one of supplementary notes 1 to 3, wherein the observation information reflecting means includes: definitive information selection means for selecting, for each piece of moving-object identification information, position-score correspondence information satisfying a predetermined criterion from among the position-score correspondence information stored in the state storage means as definitive position-score correspondence information; score reflection means for propagating each score in the definitive position-score correspondence information as scores of nearby positions according to a predetermined score propagation mode, reflecting each score of the definitive position-score correspondence information after propagation in each score of the position-score correspondence information at the time nearest to the time corresponding to that definitive position-score correspondence information, and repeatedly designating the position-score correspondence information at that nearest time as definitive position-score correspondence information; and moving-object position specifying means for identifying the flow line of each moving object by specifying the position of the moving object at each time from the scores in the definitive position-score correspondence information at each time.
- The flow line detection system wherein the moving-object position specifying means specifies the position of the moving object at each time based on the position at which a peak score appears in the definitive position-score correspondence information at each time.
- A flow line detection method characterized by including: a position-score correspondence information generation step of generating, for each piece of moving-object identification information, position-score correspondence information that defines, for each position in a tracking area of moving objects, a score indicating the likelihood that a moving object having unique identification information exists at that position; a state storage step of storing the position-score correspondence information for each time in state storage means; and a flow line identification step of selecting, for each piece of moving-object identification information, position-score correspondence information satisfying a predetermined criterion from among the position-score correspondence information stored in the state storage means as definitive position-score correspondence information, reflecting the definitive position-score correspondence information in the position-score correspondence information at the time nearest to the time corresponding to that definitive position-score correspondence information, repeatedly designating that position-score correspondence information as definitive position-score correspondence information, and identifying the flow line of each moving object from the scores in the position-score correspondence information at each time.
- The flow line detection method according to supplementary note 7, wherein the position-score correspondence information generation step includes: a score propagation step of propagating the score of each position in the position-score correspondence information created at a time before the current time as scores of nearby positions according to a predetermined score propagation mode; and an observation information reflection step of generating the position-score correspondence information at the current time by updating the scores of the position-score correspondence information to which the scores were propagated in the score propagation step, based on the detection area in which identification information was detected at the current time and the position coordinates of the moving objects detected at the current time.
- The flow line detection method according to supplementary note 7, wherein the position-score correspondence information generation step includes: a resolution control step of determining, based on the distances between the position coordinates of the moving objects detected at the current time, the individual areas in the tracking area to which scores are assigned, and generating position-score correspondence information that defines a score for each of those individual areas based on the scores in the position-score correspondence information created at a time before the current time; a score propagation step of propagating each score in the position-score correspondence information generated in the resolution control step as scores of nearby positions according to a predetermined score propagation mode; and an observation information reflection step of generating the position-score correspondence information at the current time by updating the scores of the position-score correspondence information to which the scores were propagated in the score propagation step, based on the detection area in which identification information was detected at the current time and the position coordinates of the moving objects detected at the current time.
- The flow line detection method according to any one of supplementary notes 7 to 9, wherein the observation information reflection step includes: a definitive information selection step of selecting, for each piece of moving-object identification information, position-score correspondence information satisfying a predetermined criterion from among the position-score correspondence information stored in the state storage means as definitive position-score correspondence information; a score reflection step of propagating each score in the definitive position-score correspondence information as scores of nearby positions according to a predetermined score propagation mode, reflecting each score of the definitive position-score correspondence information after propagation in each score of the position-score correspondence information at the time nearest to the time corresponding to that definitive position-score correspondence information, and repeatedly designating the position-score correspondence information at that nearest time as definitive position-score correspondence information; and a moving-object position specifying step of identifying the flow line of each moving object by specifying the position of the moving object at each time from the scores in the definitive position-score correspondence information at each time.
- The flow line detection method according to supplementary note 10, further comprising a position information storage step of storing the position coordinates of the moving objects detected at each time in position information storage means, wherein the moving-object position specifying step specifies the position at which a peak score appears in the definitive position-score correspondence information at each time and specifies, among the position coordinates of the moving objects detected at that same time, the position coordinates closest to the position at which the peak score appears as the position of the moving object.
- A flow line detection program for causing a computer to execute: position-score correspondence information generation processing of generating, for each piece of moving-object identification information, position-score correspondence information that defines, for each position in a tracking area of moving objects, a score indicating the likelihood that a moving object having unique identification information exists at that position; state storage processing of storing the position-score correspondence information for each time in state storage means; and flow line identification processing of selecting, for each piece of moving-object identification information, position-score correspondence information satisfying a predetermined criterion from among the position-score correspondence information stored in the state storage means as definitive position-score correspondence information, reflecting the definitive position-score correspondence information in the position-score correspondence information at the time nearest to the time corresponding to that definitive position-score correspondence information, repeatedly designating that position-score correspondence information as definitive position-score correspondence information, and identifying the flow line of each moving object from the scores in the position-score correspondence information at each time.
- The flow line detection program according to supplementary note 13, which causes the computer to execute, in the position-score correspondence information generation processing: score propagation processing of propagating the score of each position in the position-score correspondence information created at a time before the current time as scores of nearby positions according to a predetermined score propagation mode; and observation information reflection processing of generating the position-score correspondence information at the current time by updating the scores of the position-score correspondence information to which the scores were propagated in the score propagation processing, based on the detection area in which identification information was detected at the current time and the position coordinates of the moving objects detected at the current time.
- The flow line detection program according to supplementary note 13, which causes the computer to execute: resolution control processing of determining, based on the distances between the position coordinates of the moving objects detected at the current time, the individual areas in the tracking area to which scores are assigned, and generating position-score correspondence information that defines a score for each of those individual areas based on the scores in the position-score correspondence information created at a time before the current time; score propagation processing of propagating each score in the position-score correspondence information generated in the resolution control processing as scores of nearby positions according to a predetermined score propagation mode; and observation information reflection processing of generating the position-score correspondence information at the current time by updating the scores of the position-score correspondence information to which the scores were propagated in the score propagation processing, based on the detection area in which identification information was detected at the current time and the position coordinates of the moving objects detected at the current time.
- The flow line detection program according to any one of supplementary notes 13 to 15, which causes the computer to execute, in the observation information reflection processing: definitive information selection processing of selecting, for each piece of moving-object identification information, position-score correspondence information satisfying a predetermined criterion from among the position-score correspondence information stored in the state storage means as definitive position-score correspondence information; score reflection processing of propagating each score in the definitive position-score correspondence information as scores of nearby positions according to a predetermined score propagation mode, reflecting each score of the definitive position-score correspondence information after propagation in each score of the position-score correspondence information at the time nearest to the time corresponding to that definitive position-score correspondence information, and repeatedly designating the position-score correspondence information at that nearest time as definitive position-score correspondence information; and moving-object position specifying processing of identifying the flow line of each moving object by specifying the position of the moving object at each time from the scores in the definitive position-score correspondence information at each time.
- The flow line detection program according to supplementary note 16, which causes the computer, in the moving-object position specifying processing, to specify the position of the moving object at each time based on the position at which a peak score appears in the definitive position-score correspondence information at each time.
- The flow line detection program which causes the computer to execute position information storage processing of storing the position coordinates of the moving objects detected at each time in position information storage means, and, in the moving-object position specifying processing, to specify the position at which a peak score appears in the definitive position-score correspondence information at each time and to specify, among the position coordinates of the moving objects detected at that same time, the position coordinates closest to the position at which the peak score appears as the position of the moving object.
- A flow line detection system characterized by comprising: a position-score correspondence information generation unit that generates, for each piece of moving-object identification information, position-score correspondence information defining, for each position in a tracking area of moving objects, a score indicating the likelihood that a moving object having unique identification information exists at that position; a state storage unit that stores the position-score correspondence information for each time; and a flow line identification unit that selects, for each piece of moving-object identification information, position-score correspondence information satisfying a predetermined criterion from among the position-score correspondence information stored in the state storage unit as definitive position-score correspondence information, reflects the definitive position-score correspondence information in the position-score correspondence information at the time nearest to the time corresponding to that definitive position-score correspondence information, repeatedly designates that position-score correspondence information as definitive position-score correspondence information, and identifies the flow line of each moving object from the scores in the position-score correspondence information at each time.
- The flow line detection system according to supplementary note 19, wherein the position-score correspondence information generation unit includes: a score propagation unit that propagates the score of each position in the position-score correspondence information created at a time before the current time as scores of nearby positions according to a predetermined score propagation mode; and an observation information reflection unit that generates the position-score correspondence information at the current time by updating the scores of the position-score correspondence information to which the scores have been propagated by the score propagation unit, based on the detection area in which identification information was detected at the current time and the position coordinates of the moving objects detected at the current time.
- The flow line detection system according to supplementary note 19, wherein the position-score correspondence information generation unit includes: a resolution control unit that determines, based on the distances between the position coordinates of the moving objects detected at the current time, the individual areas in the tracking area to which scores are assigned, and generates position-score correspondence information that defines a score for each of those individual areas based on the scores in the position-score correspondence information created at a time before the current time; a score propagation unit that propagates each score in the position-score correspondence information generated by the resolution control unit as scores of nearby positions according to a predetermined score propagation mode; and an observation information reflection unit that generates the position-score correspondence information at the current time by updating the scores of the position-score correspondence information to which the scores have been propagated by the score propagation unit, based on the detection area in which identification information was detected at the current time and the position coordinates of the moving objects detected at the current time.
- The flow line detection system according to any one of supplementary notes 19 to 21, wherein the observation information reflection unit includes: a definitive information selection unit that selects, for each piece of moving-object identification information, position-score correspondence information satisfying a predetermined criterion from among the position-score correspondence information stored in the state storage unit as definitive position-score correspondence information; a score reflection unit that propagates each score in the definitive position-score correspondence information as scores of nearby positions according to a predetermined score propagation mode, reflects each score of the definitive position-score correspondence information after propagation in each score of the position-score correspondence information at the time nearest to the time corresponding to that definitive position-score correspondence information, and repeatedly designates the position-score correspondence information at that nearest time as definitive position-score correspondence information; and a moving-object position specifying unit that identifies the flow line of each moving object by specifying the position of the moving object at each time from the scores in the definitive position-score correspondence information at each time.
- The moving-object position specifying unit specifies the position of the moving object at each time based on the position at which a peak score appears in the definitive position-score correspondence information at each time.
- The flow line detection system further comprising a position information storage unit that stores the position coordinates of the moving objects detected at each time, wherein the moving-object position specifying unit specifies the position at which a peak score appears in the definitive position-score correspondence information at each time and specifies, among the position coordinates of the moving objects detected at that same time, the position coordinates closest to the position at which the peak score appears as the position of the moving object.
- The present invention is suitably applied to a flow line detection system that associates an ID with each moving object and identifies the flow line of the moving object.
- For example, when a flow line is detected by associating the position of a person working in an office or factory with that person's unique employee number, the resulting flow line can be used for security purposes, such as determining, according to each person's security authority, whether the person may enter a given area, and issuing alerts as necessary.
- Likewise, by detecting a flow line that associates the position of a person shopping in a shopping center with that person's unique membership number, the invention can also be applied to marketing purposes, such as measuring shoppers' flow lines.
Landscapes
- Business, Economics & Management (AREA)
- Human Resources & Organizations (AREA)
- Engineering & Computer Science (AREA)
- Strategic Management (AREA)
- Economics (AREA)
- Entrepreneurship & Innovation (AREA)
- Educational Administration (AREA)
- Game Theory and Decision Science (AREA)
- Development Economics (AREA)
- Marketing (AREA)
- Operations Research (AREA)
- Quality & Reliability (AREA)
- Tourism & Hospitality (AREA)
- Physics & Mathematics (AREA)
- General Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
- Mobile Radio Communication Systems (AREA)
Abstract
Description
FIG. 1 is a block diagram showing an example of the flow line detection system according to the first embodiment of the present invention. The flow line detection system of the present invention comprises a position information input unit 1, an ID information input unit 2, a flow line detection unit 3, and a flow line output unit 4. The system acquires the position and ID (identification information) of each moving object and derives, for each position in a predetermined tracking area 50, a score indicating the likelihood that a moving object exists there. This score is computed for each moving-object ID. The system then uses the score at a point in time that satisfies a predetermined criterion to correct the score at the time nearest to that time, and further corrects, in turn, the scores at the times nearest to each corrected time. The flow line of the moving object is identified using the corrected scores.
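As a sketch of the backward correction loop described above, the following hypothetical Python fragment keeps one score map per time step for a single ID, treats a map whose top score clearly dominates the runner-up as satisfying the "predetermined criterion", and blends it backwards into the map at the nearest earlier time. The dominance test, the blending weight, and all names are illustrative assumptions, not the patent's actual formulas.

```python
def is_definitive(score_map, ratio=2.0):
    """A map counts as 'definitive' here if its best score clearly
    dominates the runner-up (an assumed stand-in for the patent's
    'predetermined criterion')."""
    ordered = sorted(score_map.values(), reverse=True)
    return len(ordered) < 2 or ordered[0] >= ratio * ordered[1]

def reflect_backwards(score_maps, weight=0.5):
    """score_maps: one {cell: score} dict per time step, for one ID.
    Starting from the latest definitive map, blend each map into the map
    at the nearest earlier time, then treat that earlier map as definitive."""
    candidates = [t for t, m in enumerate(score_maps) if is_definitive(m)]
    if not candidates:
        return score_maps
    for t in range(max(candidates), 0, -1):
        prev, cur = score_maps[t - 1], score_maps[t]
        for cell in prev:
            # pull the earlier, ambiguous score toward the definitive one
            prev[cell] = (1 - weight) * prev[cell] + weight * cur.get(cell, 0.0)
    return score_maps

maps = [
    {(0, 0): 0.4, (1, 0): 0.4},   # t=0: two candidate cells, ambiguous
    {(0, 0): 0.9, (1, 0): 0.1},   # t=1: clearly resolved
]
reflect_backwards(maps)
print(maps[0])  # the t=0 ambiguity is now resolved toward (0, 0)
```

In this toy run, the ambiguity at t=0 (equal scores of 0.4) is resolved because the definitive map at t=1 is blended backwards, which is the effect the correction loop is meant to achieve.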
The observation information reflection means 322 stores in advance the ID detection area corresponding to each ID information input unit number. An ID detection area is an area predetermined as one in which the ID information input unit 2 can be regarded as detecting the ID of a moving object with at least a predetermined detection probability. It is set in advance, for example, by the administrator of the flow line detection system. Since it is difficult to define strictly the area in which an ID is detected with at least the predetermined probability, the ID detection area may be an area judged by the administrator or the like to be one in which IDs can be regarded as being detected with at least that probability. An ID detection area is predetermined for each ID information input unit 2 (that is, for each ID information input unit number).
For each ID's position-score correspondence information (the position-score correspondence information input from the state prediction means 321), the observation information reflection means 322 updates the scores so that the increase in the score corresponding to the position coordinates input from the position information input unit 1 is larger than the increase in the scores of other areas. As long as the increase at the input position coordinates exceeds the increase elsewhere, the increments themselves may be negative.
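The update just described only constrains the *relative* increments, which the following illustrative sketch makes concrete: every cell's score receives a negative increment, but the cell containing the observed position coordinate receives a smaller one, so it still gains relative to every other cell. The specific increment values and the cell representation are assumptions, not the patented formula.

```python
def reflect_observation(score_map, observed_cell,
                        delta_elsewhere=-0.2, delta_at_observed=-0.05):
    """score_map: {cell: score} for one ID. Both increments are negative,
    yet the observed cell still gains relative to every other cell,
    which is all the update described above requires."""
    for cell in score_map:
        delta = delta_at_observed if cell == observed_cell else delta_elsewhere
        score_map[cell] = max(0.0, score_map[cell] + delta)
    return score_map

m = reflect_observation({(0, 0): 0.5, (1, 0): 0.5}, observed_cell=(0, 0))
print(m)  # the observed cell now outranks the other
```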
FIGS. 7 and 8 are flowcharts showing an example of the processing flow of the flow line detection unit 3 in the first embodiment. The processing of the first embodiment is described below with reference to the concrete examples of FIGS. 9 to 16.
First, the state update means 32 acquires ID information (that is, a set of a moving object's ID, an ID information input unit number, and the detection time of that ID) from the ID information input unit 2 (step S1). In this example, the current time is assumed to be time t10 shown in FIG. 11. In this case, the state update means 32 receives the ID information {"ID1", "ID information input unit 2b", "t10"} from the ID information input unit 2, where "ID information input unit 2b" is an ID information input unit number.
The second embodiment makes the resolution of the divided regions (e.g., cells) in the tracking area 50 to which scores are assigned variable rather than fixed.
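One plausible way to realize such a variable resolution, sketched below under stated assumptions, is to derive the cell size from the spacing of the position coordinates detected at the current time, so that the two closest moving objects never fall into the same cell. The halving factor and the `cell_size` helper are illustrative choices, not taken from the patent.

```python
from itertools import combinations
from math import dist

def cell_size(coords, default=1.0):
    """coords: [(x, y), ...] position coordinates detected at the current
    time. Cells are sized to half the smallest pairwise distance, so the
    two closest moving objects never share a cell."""
    if len(coords) < 2:
        return default
    return min(dist(a, b) for a, b in combinations(coords, 2)) / 2.0

print(cell_size([(0.0, 0.0), (4.0, 0.0), (4.0, 3.0)]))  # → 1.5
```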
The third embodiment determines the flow line by detecting the score peak position in the position-score correspondence information at each time and then, based on the position information input from the position information input unit 1, identifying the position coordinates of the moving object closest to that score peak position.
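The two steps of the third embodiment can be sketched as follows, assuming score maps keyed by cell-center coordinates; the data shapes and function name are illustrative, not the patent's definitions.

```python
from math import dist

def position_at_time(score_map, detected_coords):
    """score_map: {(cx, cy): score} for one time; detected_coords:
    [(x, y), ...] reported by the position input at that same time.
    Step 1: find the score peak. Step 2: pick the detected coordinate
    closest to that peak as the moving object's position."""
    peak_cell = max(score_map, key=score_map.get)
    return min(detected_coords, key=lambda p: dist(p, peak_cell))

pos = position_at_time({(0.0, 0.0): 0.2, (5.0, 5.0): 0.8},
                       [(0.4, 0.1), (5.2, 4.9)])
print(pos)  # → (5.2, 4.9)
```

Repeating this per time step yields the sequence of positions that constitutes the flow line.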
2 ID information input unit
3 Flow line detection unit
4 Flow line output unit
31 State storage unit
32 State update means
33 Flow line identification means
321 State prediction means
322 Observation information reflection means
323 Resolution control means
331 Definitive state selection means
332 Definitive state reflection means
333 Moving object position detection means
Claims (8)
- A flow line detection system characterized by comprising: position-score correspondence information generating means for generating, for each piece of moving-object identification information, position-score correspondence information that defines, for each position in a tracking area of moving objects, a score indicating the likelihood that a moving object having unique identification information exists at that position; state storage means for storing the position-score correspondence information for each time; and flow line identification means for selecting, for each piece of moving-object identification information, position-score correspondence information satisfying a predetermined criterion from among the position-score correspondence information stored in the state storage means as definitive position-score correspondence information, reflecting the definitive position-score correspondence information in the position-score correspondence information at the time nearest to the time corresponding to that definitive position-score correspondence information, repeatedly designating that position-score correspondence information as definitive position-score correspondence information, and identifying the flow line of each moving object from the scores in the position-score correspondence information at each time.
- The flow line detection system according to claim 1, wherein the position-score correspondence information generating means includes: score propagation means for propagating the score of each position in the position-score correspondence information created at a time before the current time as scores of nearby positions according to a predetermined score propagation mode; and observation information reflection means for generating the position-score correspondence information at the current time by updating the scores of the position-score correspondence information to which the scores have been propagated by the score propagation means, based on the detection area in which identification information was detected at the current time and the position coordinates of the moving objects detected at the current time.
- The flow line detection system according to claim 1, wherein the position-score correspondence information generating means includes: resolution control means for determining, based on the distances between the position coordinates of the moving objects detected at the current time, the individual areas in the tracking area to which scores are assigned, and generating position-score correspondence information that defines a score for each of those individual areas based on the scores in the position-score correspondence information created at a time before the current time; score propagation means for propagating each score in the position-score correspondence information generated by the resolution control means as scores of nearby positions according to a predetermined score propagation mode; and observation information reflection means for generating the position-score correspondence information at the current time by updating the scores of the position-score correspondence information to which the scores have been propagated by the score propagation means, based on the detection area in which identification information was detected at the current time and the position coordinates of the moving objects detected at the current time.
- The flow line detection system according to any one of claims 1 to 3, wherein the observation information reflection means includes: definitive information selection means for selecting, for each piece of moving-object identification information, position-score correspondence information satisfying a predetermined criterion from among the position-score correspondence information stored in the state storage means as definitive position-score correspondence information; score reflection means for propagating each score in the definitive position-score correspondence information as scores of nearby positions according to a predetermined score propagation mode, reflecting each score of the definitive position-score correspondence information after propagation in each score of the position-score correspondence information at the time nearest to the time corresponding to that definitive position-score correspondence information, and repeatedly designating the position-score correspondence information at that nearest time as definitive position-score correspondence information; and moving-object position specifying means for identifying the flow line of each moving object by specifying the position of the moving object at each time from the scores in the definitive position-score correspondence information at each time.
- The flow line detection system according to claim 4, wherein the moving-object position specifying means specifies the position of the moving object at each time based on the position at which a peak score appears in the definitive position-score correspondence information at each time.
- The flow line detection system according to claim 4, further comprising position information storage means for storing the position coordinates of the moving objects detected at each time, wherein the moving-object position specifying means specifies the position at which a peak score appears in the definitive position-score correspondence information at each time and specifies, among the position coordinates of the moving objects detected at that same time, the position coordinates closest to the position at which the peak score appears as the position of the moving object.
- A flow line detection method characterized by: generating, for each piece of moving-object identification information, position-score correspondence information that defines, for each position in a tracking area of moving objects, a score indicating the likelihood that a moving object having unique identification information exists at that position; storing the position-score correspondence information for each time in state storage means; and, for each piece of moving-object identification information, selecting position-score correspondence information satisfying a predetermined criterion from among the position-score correspondence information stored in the state storage means as definitive position-score correspondence information, reflecting the definitive position-score correspondence information in the position-score correspondence information at the time nearest to the time corresponding to that definitive position-score correspondence information, repeatedly designating that position-score correspondence information as definitive position-score correspondence information, and identifying the flow line of each moving object from the scores in the position-score correspondence information at each time.
- A flow line detection program for causing a computer to execute: position-score correspondence information generation processing of generating, for each piece of moving-object identification information, position-score correspondence information that defines, for each position in a tracking area of moving objects, a score indicating the likelihood that a moving object having unique identification information exists at that position; state storage processing of storing the position-score correspondence information for each time in state storage means; and flow line identification processing of selecting, for each piece of moving-object identification information, position-score correspondence information satisfying a predetermined criterion from among the position-score correspondence information stored in the state storage means as definitive position-score correspondence information, reflecting the definitive position-score correspondence information in the position-score correspondence information at the time nearest to the time corresponding to that definitive position-score correspondence information, repeatedly designating that position-score correspondence information as definitive position-score correspondence information, and identifying the flow line of each moving object from the scores in the position-score correspondence information at each time.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP11789413.9A EP2579191A4 (en) | 2010-05-31 | 2011-05-26 | FLOW LINE DETECTION SYSTEM, FLOW LINE DETECTION METHOD, AND FLOW LINE DETECTION PROGRAM |
CN201180026941.7A CN102939611B (zh) | 2010-05-31 | 2011-05-26 | 流动线检测系统、流动线检测方法和流动线检测程序 |
US13/695,489 US8731829B2 (en) | 2010-05-31 | 2011-05-26 | Flow line detection system, flow line detection method, and flow line detection program |
JP2012518228A JP5807635B2 (ja) | 2010-05-31 | 2011-05-26 | 動線検出システム、動線検出方法および動線検出プログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010125079 | 2010-05-31 | ||
JP2010-125079 | 2010-05-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011151999A1 true WO2011151999A1 (ja) | 2011-12-08 |
Family
ID=45066392
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/002930 WO2011151999A1 (ja) | 2010-05-31 | 2011-05-26 | 動線検出システム、動線検出方法および動線検出プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US8731829B2 (ja) |
EP (1) | EP2579191A4 (ja) |
JP (1) | JP5807635B2 (ja) |
CN (1) | CN102939611B (ja) |
WO (1) | WO2011151999A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017049729A (ja) * | 2015-08-31 | 2017-03-09 | 綜合警備保障株式会社 | 警備装置 |
JPWO2020261378A1 (ja) * | 2019-06-25 | 2020-12-30 |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102597797B (zh) * | 2009-06-19 | 2015-02-11 | 科达无线私人有限公司 | 无线通信系统中的环境估计 |
US9245268B1 (en) * | 2014-07-10 | 2016-01-26 | Bank Of America Corporation | Dynamic card validation |
US11175142B2 (en) * | 2014-07-31 | 2021-11-16 | Honeywell International Inc. | Updating intensities in a PHD filter based on a sensor track ID |
US10605607B2 (en) | 2014-07-31 | 2020-03-31 | Honeywell International Inc. | Two step pruning in a PHD filter |
JP5720843B1 (ja) * | 2014-09-22 | 2015-05-20 | 富士ゼロックス株式会社 | 位置変換プログラム及び情報処理装置 |
TWI605252B (zh) * | 2016-11-16 | 2017-11-11 | 中原大學 | 用以量測目標物運動狀態及其磁性粒子含量之磁泳量測系統 |
EP3493102B1 (en) | 2017-11-30 | 2020-04-29 | Axis AB | A method and system for tracking a plurality of objects in a sequence of images |
CN109272351B (zh) * | 2018-08-31 | 2022-02-01 | 京东方科技集团股份有限公司 | 客流动线以及客流热区确定方法及装置 |
CN111609865B (zh) * | 2020-05-25 | 2022-04-26 | 广州市建筑科学研究院有限公司 | 一种基于无线网络的装配式自动导航盲道系统 |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005031955A (ja) | 2003-07-11 | 2005-02-03 | Kddi Corp | 移動体追跡システム |
JP2005250989A (ja) | 2004-03-05 | 2005-09-15 | Sony Corp | 移動物体追跡方法及び画像処理装置 |
JP2006146378A (ja) | 2004-11-17 | 2006-06-08 | Hitachi Ltd | 複数カメラを用いた監視システム |
JP2008014742A (ja) * | 2006-07-05 | 2008-01-24 | Japan Advanced Institute Of Science & Technology Hokuriku | 移動体位置推定システム、及び、移動体位置推定方法 |
JP2008014743A (ja) * | 2006-07-05 | 2008-01-24 | Japan Advanced Institute Of Science & Technology Hokuriku | 移動体位置推定システム、及び、移動体位置推定方法 |
JP2008122093A (ja) * | 2006-11-08 | 2008-05-29 | Mitsubishi Electric Corp | 多目標追尾装置 |
JP2008175786A (ja) * | 2007-01-22 | 2008-07-31 | Zhencheng Hu | 移動体位置検出方法および移動体位置検出装置 |
JP2009176031A (ja) * | 2008-01-24 | 2009-08-06 | Toyota Motor Corp | 自律移動体,自律移動体制御システムおよび自律移動体の自己位置推定方法 |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3567066B2 (ja) * | 1997-10-31 | 2004-09-15 | 株式会社日立製作所 | 移動体組合せ検出装置および方法 |
US6567116B1 (en) * | 1998-11-20 | 2003-05-20 | James A. Aman | Multiple object tracking system |
US7319479B1 (en) * | 2000-09-22 | 2008-01-15 | Brickstream Corporation | System and method for multi-camera linking and analysis |
JP2003022309A (ja) * | 2001-07-06 | 2003-01-24 | Hitachi Ltd | 動線情報を基にした施設管理装置 |
JP3900870B2 (ja) * | 2001-08-07 | 2007-04-04 | オムロン株式会社 | 情報収集装置、情報収集方法、および情報収集システム |
JP4493953B2 (ja) * | 2003-08-22 | 2010-06-30 | 富士通テン株式会社 | 移動体位置提供装置および移動体位置提供システム |
JP2005250692A (ja) * | 2004-03-02 | 2005-09-15 | Softopia Japan Foundation | 物体の同定方法、移動体同定方法、物体同定プログラム、移動体同定プログラム、物体同定プログラム記録媒体、移動体同定プログラム記録媒体 |
WO2007033286A2 (en) * | 2005-09-13 | 2007-03-22 | Verificon Corporation | System and method for object tracking and activity analysis |
JP4984728B2 (ja) * | 2006-08-07 | 2012-07-25 | パナソニック株式会社 | 被写体照合装置および被写体照合方法 |
US7929804B2 (en) * | 2007-10-03 | 2011-04-19 | Mitsubishi Electric Research Laboratories, Inc. | System and method for tracking objects with a synthetic aperture |
JP4510112B2 (ja) * | 2008-04-11 | 2010-07-21 | 東芝テック株式会社 | 動線解析装置 |
JP2010002997A (ja) * | 2008-06-18 | 2010-01-07 | Toshiba Tec Corp | 人物行動分析装置及び人物行動分析プログラム |
US8564534B2 (en) * | 2009-10-07 | 2013-10-22 | Microsoft Corporation | Human tracking system |
US20110181720A1 (en) * | 2010-01-25 | 2011-07-28 | Edgeworth Christopher M | System, method, and computer program product for tracking mobile objects from an aerial vehicle |
JP5488076B2 (ja) * | 2010-03-15 | 2014-05-14 | オムロン株式会社 | 対象物追跡装置、対象物追跡方法、および制御プログラム |
-
2011
- 2011-05-26 WO PCT/JP2011/002930 patent/WO2011151999A1/ja active Application Filing
- 2011-05-26 EP EP11789413.9A patent/EP2579191A4/en not_active Withdrawn
- 2011-05-26 US US13/695,489 patent/US8731829B2/en active Active
- 2011-05-26 CN CN201180026941.7A patent/CN102939611B/zh active Active
- 2011-05-26 JP JP2012518228A patent/JP5807635B2/ja active Active
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005031955A (ja) | 2003-07-11 | 2005-02-03 | Kddi Corp | 移動体追跡システム |
JP2005250989A (ja) | 2004-03-05 | 2005-09-15 | Sony Corp | 移動物体追跡方法及び画像処理装置 |
JP2006146378A (ja) | 2004-11-17 | 2006-06-08 | Hitachi Ltd | 複数カメラを用いた監視システム |
JP2008014742A (ja) * | 2006-07-05 | 2008-01-24 | Japan Advanced Institute Of Science & Technology Hokuriku | 移動体位置推定システム、及び、移動体位置推定方法 |
JP2008014743A (ja) * | 2006-07-05 | 2008-01-24 | Japan Advanced Institute Of Science & Technology Hokuriku | 移動体位置推定システム、及び、移動体位置推定方法 |
JP2008122093A (ja) * | 2006-11-08 | 2008-05-29 | Mitsubishi Electric Corp | 多目標追尾装置 |
JP2008175786A (ja) * | 2007-01-22 | 2008-07-31 | Zhencheng Hu | 移動体位置検出方法および移動体位置検出装置 |
JP2009176031A (ja) * | 2008-01-24 | 2009-08-06 | Toyota Motor Corp | 自律移動体,自律移動体制御システムおよび自律移動体の自己位置推定方法 |
Non-Patent Citations (2)
Title |
---|
See also references of EP2579191A4 * |
YUKIE MORIGUCHI: "Ishu Sensor Togo ni yoru Sensor Data no Ketsuraku ni Ganken na Jinbutsu Dosen Kenshutsu Hoho", DAI 72 KAI (HEISEI 22 NEN) ZENKOKU TAIKAI KOEN RONBUNSHU, 8 March 2010 (2010-03-08), pages 3.89 - 3.90, XP008169472 * |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017049729A (ja) * | 2015-08-31 | 2017-03-09 | 綜合警備保障株式会社 | 警備装置 |
JPWO2020261378A1 (ja) * | 2019-06-25 | 2020-12-30 | ||
WO2020261378A1 (ja) * | 2019-06-25 | 2020-12-30 | 日本電気株式会社 | 軌跡連結装置、軌跡連結方法、及び、プログラムが格納された非一時的なコンピュータ可読媒体 |
JP7164040B2 (ja) | 2019-06-25 | 2022-11-01 | 日本電気株式会社 | 軌跡連結装置、軌跡連結方法、及び、プログラム |
Also Published As
Publication number | Publication date |
---|---|
JP5807635B2 (ja) | 2015-11-10 |
CN102939611B (zh) | 2016-08-03 |
US20130054142A1 (en) | 2013-02-28 |
EP2579191A4 (en) | 2014-04-09 |
US8731829B2 (en) | 2014-05-20 |
CN102939611A (zh) | 2013-02-20 |
EP2579191A1 (en) | 2013-04-10 |
JPWO2011151999A1 (ja) | 2013-07-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5807635B2 (ja) | 動線検出システム、動線検出方法および動線検出プログラム | |
JP5682563B2 (ja) | 移動体軌跡識別システム、移動体軌跡識別方法および移動体軌跡識別プログラム | |
JP7004017B2 (ja) | 物体追跡システム、物体追跡方法、プログラム | |
JP6070689B2 (ja) | 動線情報生成システム、動線情報生成方法および動線情報生成プログラム | |
JP6455113B2 (ja) | 物体追跡方法と装置 | |
US8995714B2 (en) | Information creation device for estimating object position and information creation method and program for estimating object position | |
CN110751674A (zh) | 多目标跟踪方法及相应视频分析系统 | |
JP6405778B2 (ja) | 対象追跡方法及び対象追跡装置 | |
CN110753953A (zh) | 用于自动驾驶车辆中经由交叉模态验证的以物体为中心的立体视觉的方法和系统 | |
CN110869936A (zh) | 用于自动驾驶车辆中的分布式学习与适应的方法和系统 | |
KR20190106853A (ko) | 텍스트 인식 장치 및 방법 | |
WO2012098853A1 (ja) | 動線検出処理データ分散システム、動線検出処理データ分散方法およびプログラム | |
CN101930611A (zh) | 多视图面部追踪 | |
JP2017168029A (ja) | 行動価値によって調査対象の位置を予測する装置、プログラム及び方法 | |
JP7065557B2 (ja) | 人物を追跡する映像解析装置、プログラム及び方法 | |
CN110992424A (zh) | 基于双目视觉的定位方法和系统 | |
US11948312B2 (en) | Object detection/tracking device, method, and program recording medium | |
US20220292397A1 (en) | Recognition system, model processing apparatus, model processing method, and recording medium | |
JP7224592B2 (ja) | 情報処理装置、情報処理方法、およびプログラム | |
KR20100041172A (ko) | 영상 추적 장치의 이동표적 움직임 추적 방법 | |
JP2020052822A (ja) | 情報処理装置、認証システムおよびそれらの制御方法、プログラム | |
KR101595334B1 (ko) | 농장에서의 움직임 개체의 이동 궤적 트래킹 방법 및 장치 | |
Wu et al. | Indoor surveillance video based feature recognition for pedestrian dead reckoning | |
JP2019144900A (ja) | 状態推定装置及びプログラム | |
CN114964204A (zh) | 地图构建方法、地图使用方法、装置、设备和存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180026941.7 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11789413 Country of ref document: EP Kind code of ref document: A1 |
|
REEP | Request for entry into the european phase |
Ref document number: 2011789413 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011789413 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012518228 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13695489 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 9936/CHENP/2012 Country of ref document: IN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |