WO2009113265A1 - Tag sensor system and sensor device, and object position estimation device and object position estimation method - Google Patents
Tag sensor system and sensor device, and object position estimation device and object position estimation method
- Publication number
- WO2009113265A1 (PCT/JP2009/000915)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- image
- positioning
- probability density
- tag
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/01—Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/13—Receivers
- G01S19/23—Testing, monitoring, correcting or calibrating of receiver elements
Definitions
- the present invention relates to a position specifying system including a wireless terminal device and a sensor device, and particularly to a technique for specifying a position of a monitoring target using wireless communication with the wireless terminal device and a captured image of the monitoring target.
- the present invention also relates to an object position estimation apparatus and method for estimating the position of an object based on the probability density of positioning coordinates.
- Japanese Patent Laid-Open No. 2006-311111 discloses such a system.
- In this system, an authorized person carries an active tag.
- An area to be monitored is photographed with a security camera, and active tags are detected with an RFID positioning device.
- The position of a suspicious person in the area is specified by comparing the camera positioning data acquired by the security camera with the RFID positioning data acquired by the RFID positioning device.
- However, in this system, when the monitoring target cannot be captured by the security camera, the position of the monitoring target in the image cannot be obtained, which made it difficult to track the movement of the monitoring target.
- Japanese Patent Laid-Open No. 2005-141687 describes obtaining a likelihood distribution F by integrating a probability density distribution P_t^V of a target position at time t obtained from image information and a probability density distribution P_t^A of the target position at time t obtained from sound information, using a weighting factor k_1(t) (Expression 1 in that publication).
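- Expression 1 itself is not reproduced in this text (it appears as an image in the original publication). A plausible linear-weighting form consistent with the surrounding description, offered only as an assumption, would be:

```latex
F(x, t) = k_1(t)\,P_t^V(x) + \bigl(1 - k_1(t)\bigr)\,P_t^A(x)
```

where the weighting factor k_1(t) controls how strongly the image-based distribution contributes relative to the sound-based one.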
- That is, a probability density distribution is obtained by integrating the probability density distribution of the image information and the probability density distribution of the audio information, using the maximum value of each probability density as a measure of its reliability. Then, by estimating the position of the object from the integrated probability density distribution, the estimation accuracy of the object position can be improved.
- FIG. 30A shows an example of a probability density distribution when one position coordinate is obtained as observation position data (that is, as a measurement result for a certain target).
- FIG. 30B shows an example of a probability density distribution when, as observation position data, the position coordinates of two points at which the target is equally likely to exist are obtained.
- Strictly speaking, observation position data is obtained on a two-dimensional plane, and the probability density distribution obtained from the observation position data is likewise distributed over the two-dimensional plane.
- However, FIG. 30 shows an example of positioning on a one-dimensional line segment for simplicity. In the following, positioning on a two-dimensional plane may likewise be described using an example of positioning on a one-dimensional line segment.
- When observation position data as shown in FIG. 30A is obtained for both the image information and the audio information, and the probability density distribution shown in FIG. 30A is obtained from each, the probability density distribution based on the image information and the probability density distribution based on the voice information are integrated using substantially the same weighting factors.
- Suppose instead that the probability density distribution based on the image information is as shown in FIG. 30A, while observation position data as shown in FIG. 30B is obtained from the voice information, so that the two-peaked probability density distribution of FIG. 30B is obtained from it.
- In that case, when the two probability density distributions are integrated, the weighting factor applied to the probability density distribution obtained from the speech information is set smaller than the weighting factor applied to the probability density distribution obtained from the image information, since the speech-based result is the less reliable of the two.
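- To make the weighting concrete, the following is a minimal sketch of this background technique, assuming the linear combination given above and discrete densities on a one-dimensional line segment; the function names, grid, and weight values are illustrative and not taken from the patent:

```python
import numpy as np

def integrate_densities(p_image, p_audio, w_image, w_audio):
    """Linearly combine two discrete probability densities and renormalize."""
    fused = w_image * p_image + w_audio * p_audio
    return fused / fused.sum()

# One-dimensional line segment discretized into 100 points.
x = np.linspace(0.0, 10.0, 100)
p_image = np.exp(-0.5 * ((x - 4.0) / 0.5) ** 2)       # single peak, as in FIG. 30A
p_image /= p_image.sum()
p_audio = (np.exp(-0.5 * ((x - 3.0) / 0.5) ** 2) +
           np.exp(-0.5 * ((x - 7.0) / 0.5) ** 2))      # two equal peaks, as in FIG. 30B
p_audio /= p_audio.sum()

# The two-peaked audio density is the less reliable one, so it gets the smaller weight.
fused = integrate_densities(p_image, p_audio, w_image=0.8, w_audio=0.2)
print(x[np.argmax(fused)])   # estimated position = mode of the fused density
```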
- The present invention has been made against the above background. An object of the present invention is to provide a position specifying system that, while suppressing the power consumption of a wireless communication terminal such as an active tag, can improve monitoring accuracy and reliably track a monitoring target using positioning information obtained by wireless communication even when positioning by a captured image fails.
- Another object of the present invention is to provide an object position estimation device and an object position estimation method capable of improving position estimation accuracy when estimating the position of an object using a probability density distribution.
- The position specifying system includes a wireless terminal device held by a monitoring target, and a sensor device including a wireless communication unit that wirelessly communicates with the wireless terminal device and a photographing unit that captures an image of the monitoring target.
- The wireless terminal device includes an identification information holding unit that holds identification information unique to the wireless terminal device, and a transmission unit that transmits a detection signal for detecting the position of the wireless terminal device together with the identification information.
- The sensor device includes a wireless positioning unit that detects the position of the wireless terminal device based on the detection signal including the identification information received by the wireless communication unit, an image positioning unit that calculates the position of the monitoring target based on the image captured by the photographing unit, and an integrated position specifying unit that specifies the position of the monitoring target by linking the position calculated by the image positioning unit with the position and identification information detected by the wireless positioning unit.
- Another aspect of the present invention is a sensor device including: a wireless communication unit that wirelessly communicates with a wireless terminal device that is held by a monitoring target and transmits identification information together with a detection signal; a photographing unit that captures an image of the monitoring target; a wireless positioning unit that detects the position of the wireless terminal device based on the detection signal including the identification information received by the wireless communication unit; an image positioning unit that calculates the position of the monitoring target based on the image captured by the photographing unit; and an integrated position specifying unit that specifies the position of the monitoring target by linking the position calculated by the image positioning unit with the position and identification information detected by the wireless positioning unit.
- Another aspect of the present invention is an object position estimation device including: a first probability density distribution forming unit that forms a first probability density distribution for the coordinates of a target based on a captured image captured by a camera; a second probability density distribution forming unit that forms a second probability density distribution for the coordinates of the target based on a signal of a sensor attached to the target; a probability density integration unit that integrates the first probability density distribution and the second probability density distribution; a detection unit that detects the surrounding situation of the target based on the captured image; and a probability density distribution changing unit that, according to the detected surrounding situation, changes the probability density distribution in the second probability density distribution forming unit or the weighting factor for the second probability density distribution in the probability density integration unit.
- The object position estimation method of the present invention includes: forming a first probability density distribution of a target from a captured image including the target; forming a second probability density distribution of the target based on a signal of a sensor attached to the target; detecting the surrounding situation of the target based on the captured image; changing the second probability density distribution according to the detected surrounding situation; and integrating the first probability density distribution and the second probability density distribution.
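- The following is a minimal sketch of the claimed flow, under several assumptions not fixed by the text: Gaussian densities, a one-dimensional line segment, multiplicative integration, obstacle detection reduced to a boolean, and the distribution change realized as a variance change (the claims allow changing either the distribution or the weighting factor):

```python
import numpy as np

def gaussian(x, mean, sigma):
    p = np.exp(-0.5 * ((x - mean) / sigma) ** 2)
    return p / p.sum()

def estimate_position(x, camera_obs, tag_obs, obstacle_detected):
    """Sketch of the claimed estimation flow (all parameters hypothetical)."""
    p1 = gaussian(x, camera_obs, sigma=0.3)   # first density: from the captured image
    # Second density: from the wireless-tag signal. The surrounding situation
    # detected in the image changes this distribution: an obstacle near the
    # target degrades radio positioning, so the variance is widened.
    sigma_tag = 1.5 if obstacle_detected else 0.5
    p2 = gaussian(x, tag_obs, sigma=sigma_tag)
    fused = p1 * p2                           # integration of the two densities
    fused /= fused.sum()
    return float(x[np.argmax(fused)])

x = np.linspace(0.0, 10.0, 200)
print(estimate_position(x, camera_obs=4.2, tag_obs=4.8, obstacle_detected=True))
```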
- FIG. 1 is a block diagram showing a configuration of a position specifying system in the present embodiment.
- FIG. 2 is a diagram for explaining the outline of the position specifying system in the present embodiment.
- FIG. 3 is a block diagram of each configuration of the sensor device according to the present embodiment.
- FIG. 4 is an explanatory diagram of the first identification information interpolation process.
- FIG. 5 is an explanatory diagram of image interpolation processing.
- FIG. 6 is an explanatory diagram of control for pausing the tag device.
- FIG. 7 is an explanatory diagram of control for starting the tag device.
- FIG. 8 is an explanatory diagram of the timing of tag positioning and image positioning.
- FIG. 9 is an explanatory diagram of the second identification information interpolation process.
- FIG. 10 is a block diagram illustrating a configuration of a position specifying unit in the position specifying system according to another embodiment.
- FIG. 11 is a diagram illustrating an example of processing in the position specifying unit at time t1.
- FIG. 12 is a diagram illustrating an example of processing in the position specifying unit at time t2.
- FIG. 13 is a diagram showing an example of creation of candidate information with likelihood
- FIG. 14 is a diagram illustrating an example of association based on candidate information with likelihood
- FIG. 15 is a block diagram illustrating a configuration of a first modification of the position specifying system according to another embodiment.
- FIG. 16 is a block diagram illustrating a configuration of a second modification of the position specifying system according to another embodiment.
- FIG. 17 is a block diagram illustrating a configuration of a third modification of the position specifying system according to another embodiment.
- FIG. 18 is a diagram for explaining a modification of the image position vicinity search unit.
- FIG. 19 is a block diagram showing the configuration of the object position estimation apparatus according to the embodiment of the present invention.
- FIG. 20A is a diagram showing the existence probability when three target candidates are detected from the captured image, and
- FIG. 20B is a diagram showing the probability density distribution.
- FIG. 21A is a diagram showing an example in which one target candidate is detected from a captured image
- FIG. 21B is a diagram showing a probability density distribution.
- FIG. 22 is a diagram showing a probability density distribution when two candidate positions (observation positions) with different existence probabilities are detected in the captured image.
- FIG. 23A shows an example in which the variance value of the probability density distribution is reduced because there is no obstacle
- FIG. 23B shows an example in which the variance value of the probability density distribution is increased because there is an obstacle.
- FIG. 24A is a diagram illustrating a situation where the wireless tag is attached to the subject's head, and
- FIG. 24B is a diagram illustrating a situation where the wireless tag is hung from the subject's neck.
- FIG. 25A is a diagram illustrating an image of a captured image captured by the camera
- FIG. 25B is a diagram illustrating an image obtained by converting the captured image into a planar map coordinate system.
- FIG. 26 is a diagram showing a probability density distribution in which the probability density at x = 20, the position of the obstacle, is 0.
- FIG. 27 is a diagram showing a probability density distribution when there is no obstacle.
- FIG. 28 is a diagram showing an image in which the object is located in the passage.
- FIG. 29 is a diagram showing a probability density distribution in which the probability density other than the passage is zero.
- FIG. 30A is a diagram showing an example of a probability density distribution when there is one observation position (that is, one measurement result for a certain target), and
- FIG. 30B is a diagram showing an example of a probability density distribution when there are two observation positions.
- The position specifying system of the present invention includes a wireless terminal device held by a monitoring target, and a sensor device including a wireless communication unit that wirelessly communicates with the wireless terminal device and a photographing unit that captures an image of the monitoring target.
- the wireless terminal device includes an identification information holding unit that holds identification information unique to the wireless terminal device, and a transmission unit that transmits a detection signal for detecting the position of the wireless terminal device together with the identification information.
- The sensor device includes a wireless positioning unit that detects the position of the wireless terminal device based on the detection signal including the identification information received by the wireless communication unit, an image positioning unit that calculates the position of the monitoring target based on the image captured by the photographing unit, and an integrated position specifying unit that specifies the position of the monitoring target by linking the position calculated by the image positioning unit with the position and identification information detected by the wireless positioning unit.
- In this configuration, a position measured based on the captured image (also referred to as an image position) and a position detected based on the detection signal of the wireless terminal device (also referred to as a tag position) are linked, and the position of the monitoring target (a monitored person, a monitored object, and so on) is thereby specified.
- Here, the monitoring target holds the wireless terminal device. Therefore, once the position of the monitoring target has been specified, the position of the monitoring target can be detected based on the detection signal of the wireless terminal device and its movement tracked, even if the monitoring target cannot be photographed thereafter.
- The sensor device may include a position specification success/failure determination unit that determines the success or failure of position specification of the monitoring target by the integrated position specifying unit, and a suspension request unit that, when it is determined that the position of the monitoring target has been specified successfully, transmits to the wireless terminal device a transmission suspension request for lowering the transmission frequency of the detection signal.
- In this configuration, when the position of the monitoring target is specified by linking (also referred to as integrating) the image position and the tag position, the wireless terminal device suspends transmission of the detection signal. As a result, power is saved in the wireless terminal device. Even in this case, the movement of the monitoring target can be tracked, because the position of the monitoring target is calculated based on the captured image. In the conventional system, the power consumption of the active tag is not taken into consideration, so that the battery of the active tag is quickly exhausted when a person is tracked by constantly comparing the camera positioning data with the RFID positioning data. In contrast, according to the position specifying system of the present invention, the power consumption of the wireless terminal device can be suppressed as much as possible.
- The sensor device may include an image positioning success/failure determination unit that determines the success or failure of image positioning of the monitoring target by the image positioning unit, and a first transmission request unit that transmits a detection signal transmission request to the wireless terminal device when it is determined that the image positioning of the monitoring target has failed.
- In this configuration, when image positioning fails, the wireless terminal device resumes transmitting the detection signal.
- As a result, even when the monitoring target cannot be photographed, the position of the monitoring target is detected based on the detection signal of the wireless terminal device, so that tracking of its movement can continue.
- The sensor device may include a second transmission request unit that transmits a detection signal transmission request to the wireless terminal device when, after the transmission suspension request has been transmitted, the position specification success/failure determination unit determines that position specification of the monitoring target has failed.
- In this configuration, when position specification of the monitoring target fails, the detection signal is transmitted again by the wireless terminal device, and the position of the monitoring target is detected based on that detection signal, so that tracking of its movement can continue.
- The sensor device may include a first identification information interpolation unit that, when the position of the monitoring target is specified again by the integrated position specifying unit after the position specification success/failure determination unit has determined that position specification of the monitoring target failed, retroactively assigns the identification information of the wireless terminal device associated with the monitoring target to the images of the monitoring target from the period during which position specification was failing.
- The sensor device may also include an image interpolation unit that, when image positioning succeeds again after image positioning of the monitoring target has failed, interpolates the image of the monitoring target for the period during which image positioning was failing, based on the position at which the previous image positioning succeeded and the position at which the current image positioning succeeded.
- The frequency of detection signal transmission by the wireless terminal device, and the frequency of wireless positioning of the wireless terminal device by the wireless positioning unit, may be set lower than the frequency of image positioning of the monitoring target by the image positioning unit.
- In this case, the sensor device may include a second identification information interpolation unit that, when or after the integrated position specifying unit specifies the position of the monitoring target, retroactively assigns the identification information of the wireless terminal device to the monitoring target in the images captured by the photographing unit before the position was specified.
- This configuration reduces the power consumption of the wireless terminal device by reducing the frequency of transmission of its detection signal.
- On the other hand, before the position of the monitoring target is specified (before the image position is linked to the tag position and identification information), the tag position may not yet be detected and only the image position may be calculated. Even then, the identification information is retroactively assigned to the monitoring target in the earlier images, so that the movement of the monitoring target can be grasped continuously.
- The position specifying system of the present invention may include an associating unit that associates the image positioning result of the monitoring target with the wireless positioning result based on the distance between the position calculated by the image positioning unit and the position detected by the wireless positioning unit.
- In this configuration, the image positioning result and the wireless positioning result of the monitoring target are appropriately associated with each other.
- That is, the image positioning result and the wireless positioning result are associated as results for the same monitoring target.
- The associating unit may associate the image positioning and wireless positioning results of the monitoring target with each other when the position detected by the wireless positioning unit exists within a predetermined search area centered on the position calculated by the image positioning unit.
- Conversely, the associating unit may associate the image positioning and wireless positioning results of the monitoring target with each other when the position calculated by the image positioning unit exists within a predetermined search area centered on the position detected by the wireless positioning unit.
- The position specifying system of the present invention may further include a positioning accuracy comparison unit that compares the positioning accuracies of image positioning and wireless positioning of the monitoring target. Based on the comparison result, when the positioning accuracy of image positioning is the higher, the associating unit associates the image positioning and wireless positioning results of the monitoring target if the position detected by the wireless positioning unit exists within a predetermined search area centered on the position calculated by the image positioning unit; when the positioning accuracy of wireless positioning is the higher, the associating unit associates the results if the position calculated by the image positioning unit exists within a predetermined search area centered on the position detected by the wireless positioning unit.
- In this configuration, the positioning accuracies of image positioning and wireless positioning are compared, and the search is performed within the predetermined search area (for example, within a predetermined radius) centered on the position measured with the higher positioning accuracy. Thereby, the image positioning and wireless positioning results of the monitoring target are appropriately associated as the same monitoring target.
- the size of the predetermined search area centered on the position calculated by the image positioning unit may be set according to the positioning accuracy of the image positioning to be monitored.
- In this configuration, the size of the search area centered on the position calculated by the image positioning unit is set appropriately according to the positioning accuracy of image positioning of the monitoring target: for example, when the positioning accuracy is high, the search area is made small, and when the positioning accuracy is low, the search area is made large.
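- As an illustration only (the constants and the linear relation are assumptions, not from the patent), such a rule could look like:

```python
def search_radius(positioning_error_m, base_radius_m=0.5, scale=2.0):
    """Hypothetical rule: low accuracy (large error) -> larger search area."""
    return base_radius_m + scale * positioning_error_m

def within_search_area(center_xy, candidate_xy, radius_m):
    dx = candidate_xy[0] - center_xy[0]
    dy = candidate_xy[1] - center_xy[1]
    return dx * dx + dy * dy <= radius_m * radius_m

# An image position measured with ±0.2 m error gets a tight search area;
# one measured with ±1.0 m error gets a wide one.
print(search_radius(0.2), search_radius(1.0))   # 0.9 2.5
```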
- The position specifying system of the present invention may include a history holding unit that, when a plurality of positions detected by the wireless positioning unit exist within the predetermined search area centered on the position calculated by the image positioning unit, holds information on the plurality of positions as candidate information.
- In this case, the associating unit may associate the image positioning and wireless positioning results of the monitoring target based on the candidate information held in the history holding unit.
- Even when the results cannot immediately be associated one-to-one, the image positioning and wireless positioning results of the monitoring target can be appropriately associated later based on the candidate information held in the history holding unit.
- The size of the predetermined search area centered on the position detected by the wireless positioning unit may be set according to the positioning accuracy of wireless positioning of the monitoring target.
- In this configuration, the size of the search area centered on the position detected by the wireless positioning unit is set appropriately according to the positioning accuracy of wireless positioning of the monitoring target: for example, when the positioning accuracy is high, the search area is made small, and when the positioning accuracy is low, the search area is made large.
- Similarly, the position specifying system of the present invention may include a history holding unit that, when a plurality of positions calculated by the image positioning unit exist within the predetermined search area centered on the position detected by the wireless positioning unit, holds information on the plurality of positions as candidate information.
- In this case, the associating unit may associate the wireless positioning and image positioning results of the monitoring target based on the candidate information held in the history holding unit.
- Even then, the wireless positioning and image positioning results of the monitoring target can be appropriately associated based on the candidate information held in the history holding unit.
- The associating unit may include a combination calculation unit that calculates the combination that minimizes the sum of squares of the distances between the positions calculated by the image positioning unit and the positions detected by the wireless positioning unit.
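- A brute-force sketch of such a combination calculation, assuming equal numbers of image and tag positions (a stand-in for whatever optimization the combination calculation unit actually uses):

```python
from itertools import permutations

def best_assignment(image_positions, tag_positions):
    """Find the pairing that minimizes the sum of squared distances."""
    def sq_dist(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

    best, best_cost = None, float("inf")
    for perm in permutations(range(len(tag_positions))):
        cost = sum(sq_dist(image_positions[i], tag_positions[j])
                   for i, j in enumerate(perm))
        if cost < best_cost:
            best, best_cost = perm, cost
    return best   # best[i] = index of the tag matched to image position i

print(best_assignment([(0, 0), (5, 5)], [(4.8, 5.2), (0.3, -0.1)]))   # (1, 0)
```

For more than a handful of positions, an assignment algorithm such as the Hungarian method would replace the exhaustive search.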
- The integrated position specifying unit may determine the average of the position calculated by the image positioning unit and the position detected by the wireless positioning unit as the position of the monitoring target.
- In this configuration, the average of the image position and the tag position is determined as the position of the monitoring target, so that the position of the monitoring target is appropriately determined.
- The average may be a weighted average corresponding to the positioning accuracies of image positioning and wireless positioning of the monitoring target.
- In this configuration, the position of the monitoring target is appropriately determined by using a weighted average corresponding to the positioning accuracies of image positioning and wireless positioning, as sketched below.
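- A minimal sketch of such an accuracy-weighted average (the accuracy measure and the weighting scheme are illustrative assumptions):

```python
def weighted_position(image_xy, tag_xy, image_accuracy, tag_accuracy):
    """Accuracy-weighted average of the image position and the tag position."""
    w_img = image_accuracy / (image_accuracy + tag_accuracy)
    w_tag = 1.0 - w_img
    return (w_img * image_xy[0] + w_tag * tag_xy[0],
            w_img * image_xy[1] + w_tag * tag_xy[1])

# Image positioning twice as accurate as tag positioning pulls the
# integrated position toward the image position.
print(weighted_position((2.0, 3.0), (2.6, 3.6), image_accuracy=2.0, tag_accuracy=1.0))
# approximately (2.2, 3.2)
```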
- In the position specifying system of the present invention, the position calculated by the image positioning unit and the position detected by the wireless positioning unit may be arranged in a cell space divided into a plurality of cells.
- The system may further include an associating unit that associates the image positioning and wireless positioning results of the monitoring target based on the positional relationship between the cell to which the position calculated by the image positioning unit belongs and the cell to which the position detected by the wireless positioning unit belongs.
- In this configuration, the image positioning and wireless positioning results of the monitoring target can be appropriately associated: for example, when the image position and the tag position belong to the same cell, the two results are associated as the same monitoring target. As a result, the amount of calculation can be reduced and the processing speed increased compared with calculating distances for all pairs of image positions and tag positions.
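- A sketch of this cell-space association, assuming a uniform square grid (the cell size and the same-cell rule are illustrative; positions near a cell boundary would in practice also need neighboring cells checked):

```python
from collections import defaultdict

CELL_SIZE = 1.0   # metres; an assumed grid resolution

def cell_of(xy):
    return (int(xy[0] // CELL_SIZE), int(xy[1] // CELL_SIZE))

def associate_by_cell(image_positions, tag_positions):
    """Pair image positions with tag positions that fall into the same grid cell."""
    tags_by_cell = defaultdict(list)
    for tag_id, xy in tag_positions.items():
        tags_by_cell[cell_of(xy)].append(tag_id)
    # Only the occupied cells are inspected, so no all-pairs distance
    # calculations are needed.
    return {i: tags_by_cell.get(cell_of(xy), [])
            for i, xy in enumerate(image_positions)}

print(associate_by_cell([(1.2, 0.7), (4.9, 3.3)],
                        {"ID:1": (1.4, 0.9), "ID:2": (7.0, 2.0)}))
# {0: ['ID:1'], 1: []}
```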
- The position specifying system of the present invention may include a warning unit that performs a process of issuing a warning when no position detected by the wireless positioning unit exists within the predetermined search area centered on the position calculated by the image positioning unit.
- The sensor device of the present invention includes: a wireless communication unit that wirelessly communicates with a wireless terminal device that is held by a monitoring target and transmits identification information together with a detection signal; a photographing unit that captures an image of the monitoring target; a wireless positioning unit that detects the position of the wireless terminal device based on the detection signal including the identification information received by the wireless communication unit; an image positioning unit that calculates the position of the monitoring target based on the image captured by the photographing unit; and an integrated position specifying unit that specifies the position of the monitoring target by linking the position calculated by the image positioning unit with the position and identification information detected by the wireless positioning unit.
- In this configuration, as in the system described above, the position calculated based on the captured image (image position) and the position detected based on the detection signal of the wireless terminal device (tag position) are linked, and the position of the monitoring target (a monitored person, a monitored object, and so on) is specified.
- Here, the monitoring target holds the wireless terminal device. Therefore, once the position of the monitoring target has been specified, the position of the monitoring target can be detected based on the detection signal of the wireless terminal device and its movement tracked, even if the monitoring target cannot be photographed thereafter.
- According to the present invention, by associating the position calculated based on the captured image (image position) with the position detected based on the detection signal of the wireless terminal device (tag position), the position of the monitoring target can be specified and its movement tracked even when an image of the monitoring target cannot be captured.
- One aspect of the object position estimation apparatus of the present invention employs a configuration including: a first probability density distribution forming unit that forms a first probability density distribution for the coordinates of a target based on a captured image captured by a camera; a second probability density distribution forming unit that forms a second probability density distribution for the coordinates of the target based on a signal of a sensor attached to the target; a probability density integration unit that integrates the first probability density distribution and the second probability density distribution; a detection unit that detects the surrounding situation of the target based on the captured image; and a probability density distribution changing unit that, according to the detected surrounding situation, changes the probability density distribution in the second probability density distribution forming unit or the weighting factor for the second probability density distribution in the probability density integration unit.
- One aspect of the object position estimation method of the present invention includes: forming a first probability density distribution of a target from a captured image including the target; forming a second probability density distribution of the target based on a signal of a sensor attached to the target; detecting the surrounding situation of the target based on the captured image; changing the second probability density distribution according to the detected surrounding situation; and integrating the first probability density distribution and the second probability density distribution.
- According to the present invention, the surrounding situation of the target is detected from the captured image, and the second probability density distribution is changed according to the detected surrounding situation.
- Thereby, the second probability density distribution itself can be made closer to reality. As a result, the position estimation accuracy can be improved.
- This location system is used, for example, as a factory work efficiency improvement system that improves work efficiency by analyzing the flow of people in a factory, as a distribution warehouse loss monitoring system that constantly monitors the movement of people in a warehouse area for early detection, suppression, and recording of loss accidents and misdeliveries, and as an office entrance/exit management system that automatically records office entry and exit histories; however, the application is not limited to these.
- FIG. 1 is a block diagram showing the configuration of the position specifying system.
- FIG. 2 is an explanatory diagram showing an outline of the position specifying system.
- the position specifying system 1 includes a tag device 2 as a wireless terminal device held by a monitoring target (for example, a monitoring target person), and a sensor device 3 installed in a monitoring area.
- the sensor device 3 is connected to a monitoring control device 4 and a storage device 5 installed in a monitoring room.
- the sensor device 3 has, for example, a circular monitoring area (also referred to as a sensing area) having a radius of 10 m.
- the monitoring control device 4 is, for example, a control computer, and the storage device 5 is, for example, an HDD.
- the sensor device 3 may include an internal memory device (including a recording medium such as a memory card) in addition to the external memory device such as the storage device 5.
- this position specifying system 1 includes a plurality of tag devices 2 and a plurality of sensor devices 3.
- a plurality of persons to be monitored are each provided with the tag device 2, and a plurality of sensor devices 3 are respectively installed at a plurality of locations in the monitoring area.
- the tag device 2 includes a wireless communication unit 6 for performing wireless communication with the sensor device 3 and a power supply control unit 7 for performing power control of the power supply.
- the wireless communication unit 6 corresponds to the wireless communication unit of the present invention.
- a signal (detection signal) for detecting the position of the tag device 2 is transmitted from the wireless communication unit 6. Therefore, the wireless communication unit 6 also corresponds to the transmission unit of the present invention.
- the tag device 2 is, for example, an active tag or a semi-passive tag.
- the tag device 2 includes a memory 8 that holds a unique identifier (also referred to as an identification ID).
- the tag device 2 reads the identification ID held in the memory 8 in response to a transmission request from the sensor device 3, and stores the identification ID in the sensor device 3.
- a detection signal including an identification ID can be transmitted.
- a signal for notifying only the identification ID is provided separately from the detection signal, and the signal for notifying the identification ID in response to the identification ID transmission request received from the sensor device 3 is set in response to the detection signal transmission request.
- the detection signal may be transmitted.
- Thereby, the supervisor can recognize the attribute and position of the owner of the tag device 2 (the person to be monitored) based on the signal received by the sensor device 3 from the tag device 2.
- The sensor device 3 includes a wireless communication unit 9 that performs wireless communication with the tag device 2 and performs wireless positioning, a camera unit 10 that captures an image of the person to be monitored and performs positioning based on the image, an integration processing unit 11 that performs integration processing (described later) of the positions of the person to be monitored measured by the wireless communication unit 9 and the camera unit 10, and a communication IF unit 12 that outputs the data of the integration processing result to external devices (such as the monitoring control device 4 and the storage device 5).
- FIG. 3 is a block diagram for explaining each configuration of the sensor device 3 in more detail.
- the wireless communication unit 9 includes a transmission / reception unit 13 and a wireless positioning unit 14.
- the transmission / reception unit 13 transmits a detection signal transmission request to the tag device 2 and receives a detection signal (a signal including the identification ID of the tag device 2) transmitted by the tag device 2 that has received the transmission request.
- The wireless positioning unit 14 measures the relative three-dimensional position of the tag device 2 that transmitted the detection signal, using information obtained from the detection signal received by the transmission/reception unit 13, for example, the radio wave intensity, the radio wave arrival direction, and the radio wave propagation time. It then outputs the three-dimensional coordinate data as the positioning result together with the identification ID of the tag device 2 extracted from the detection signal.
- Specifically, the wireless communication unit 9 outputs the following data to the integration processing unit 11: (1) three-dimensional coordinate data obtained by tag positioning; (2) time data (for example, date, hour, minute, second); (3) identification ID of the tag device.
- the camera unit 10 includes a photographing unit 15 and an image positioning unit 16.
- As the imaging unit 15, for example, a stereo camera that can capture an ultra-wide-angle image (e.g., 360°) within a certain radius (e.g., 10 m) can be used.
- The image positioning unit 16 detects and extracts the image areas of a plurality of persons (persons to be monitored) in the image captured by the imaging unit 15, measures the three-dimensional coordinates from the sensor device 3 to each person, and outputs them.
- As a positioning method using an image, for example, a depth estimation method based on binocular parallax information obtained from a stereo image can be used.
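- For reference, the depth estimate from binocular parallax follows the classic stereo relation Z = f·B/d; the rig parameters below are hypothetical:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic stereo relation: depth Z = f * B / d (binocular parallax)."""
    return focal_px * baseline_m / disparity_px

# Hypothetical stereo rig: 700 px focal length, 12 cm baseline.
# A person whose image shifts 21 px between the two views is 4 m away.
print(depth_from_disparity(700.0, 0.12, 21.0))   # 4.0
```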
- the camera unit 10 outputs the following data to the integration processing unit 11.
- (1) Image data; (2) image area of a person object in the image (for example, the area surrounded by the outline of the person in the image, or a rectangular area circumscribing the person in the image); (3) representative point of the image area of the person object (for example, the center of gravity of the image area); (4) time data (for example, the frame number of the captured image); (5) three-dimensional coordinate data obtained by image positioning (three-dimensional position coordinates in real space).
- the integration processing unit 11 includes a position specifying unit 17, an interpolation processing unit 18, a determination unit 19, and a request unit 20.
- The integration processing unit 11 integrates the data output from the wireless communication unit 9 and the camera unit 10, and executes the process of specifying the position and identification information of the monitoring target while controlling it so as to suppress the power consumption of the tag device 2 as much as possible.
- The position specifying unit 17 associates the position data of the tag device 2 output from the wireless communication unit 9 with the position data of the image area of the person object in the image output from the camera unit 10, thereby linking the identification ID of the tag device 2 to the image area of the person object and specifying the position and identification information of the person to be monitored.
- the interpolation processing unit 18 includes an image interpolation unit 21 that interpolates the position and image of the monitoring subject, and an identification information interpolation unit 22 that interpolates the identification information of the monitoring subject. The details of these image interpolation processing and identification information interpolation processing will be described later.
- The determination unit 19 includes a position specification success/failure determination unit 23 that determines whether or not the position of the person to be monitored has been specified by the position specifying unit 17, and an image positioning success/failure determination unit 24 that determines whether or not image positioning of the position of the monitoring target by the camera unit 10 has succeeded.
- The request unit 20 includes a pause request unit 25 that transmits a pause request (a detection signal transmission stop request) to the tag device 2 when the position specification success/failure determination unit 23 determines that position specification has succeeded, and a transmission request unit 26 that transmits a transmission request (a detection signal transmission request) to the tag device 2 when the image positioning success/failure determination unit 24 determines that image positioning of the position of the person to be monitored by the camera unit 10 has failed.
- the integration processing unit 11 outputs the following data to the communication IF unit 12.
- (1) Identification ID of the sensor device; (2) time data (for example, date, hour, minute, second); (3) frame image data; (4) image area of the person object; (5) representative point of the image area of the person object; (6) identification ID of the tag device; (7) degree of integration (for example, T: tag measurement coordinates only, D: image calculation representative point only, C: both tag measurement coordinates and image calculation representative point, P: interpolated or estimated coordinates). The data (2) to (7) indicate attributes of the image frame.
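- For illustration, one record of this output could be modeled as follows; the field names and types are invented, and only the degree-of-integration codes come from the list above:

```python
from dataclasses import dataclass
from enum import Enum

class IntegrationDegree(Enum):
    T = "tag measurement coordinates only"
    D = "image calculation representative point only"
    C = "both tag and image coordinates"
    P = "interpolated or estimated coordinates"

@dataclass
class PersonObjectRecord:
    """Hypothetical shape of one record sent to the communication IF unit."""
    sensor_id: str
    time: str                     # e.g. "2009-03-11T10:15:30"
    frame_image: bytes
    image_area: tuple             # bounding rectangle (x, y, w, h)
    representative_point: tuple   # e.g. centroid of the image area
    tag_id: str
    degree: IntegrationDegree
```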
- the monitoring control device 4 includes a monitor 27 for displaying a monitoring target.
- The monitoring control device 4 has a function of generating flow line images based on the person object information output from the sensor device 3, a function of setting and controlling the operation of the sensor device 3, and a function of executing various applications (for example, a factory work efficiency improvement system, a distribution warehouse loss monitoring system, and an office entrance/exit management system).
- FIGS. 4 and 5 are diagrams (schematic diagrams of the display screen of the monitor 27) for explaining the interpolation processing.
- FIGS. 4 and 5 show a situation in which one person to be monitored, who carries the tag device 2 with "ID: 1", gradually moves from the left side of the screen, through the center of the screen, to the right side of the screen.
- The person to be monitored whose position has been successfully specified by the position specifying unit 17, that is, the person to whom the identification ID of the tag device 2 has been assigned, is shown hatched, and the person to be monitored whose position specification has failed for some reason is shown with a broken line. In this example, position specification succeeds while the person is on the left and right sides of the screen, and fails while the person is at the center of the screen.
- The success or failure of position specification of the monitoring target is determined by the position specification success/failure determination unit 22.
- If, after position specification has failed, the position of the person to be monitored photographed by the camera unit 10 and the position of the person detected by wireless communication with the tag device 2 can be integrated and linked again, the identification information of the tag device 2 can be assigned retroactively to the images of the monitoring target from the period during which integration was failing (first identification information interpolation process), so that tracking can be carried back over that period.
- This first identification information interpolation process is executed by the identification information interpolation unit 22. Therefore, the identification information interpolation unit 22 corresponds to the first identification information interpolation unit of the present invention.
- In addition, by image interpolation processing, the image of the person to be monitored that the camera unit 10 lost sight of (the image of the person at the center of the screen) is generated based on the images before and after losing sight (the images of the person on the left and right sides of the screen).
- This image interpolation processing is executed by the image interpolation unit 21. Therefore, the image interpolation unit 21 corresponds to the image interpolation unit of the present invention.
- FIG. 6 is a diagram (schematic diagram of a display screen of the monitor 27) for explaining suspension control of the tag device 2.
- Once the position specifying unit 17 has associated the position of the person to be monitored photographed by the camera unit 10 (the photographed position) with the position of the person detected by wireless communication with the tag device 2 (the tag position), the position of the person to be monitored can be tracked even while the tag device 2 is dormant.
- the tag device 2 in the dormant state is indicated by a broken line with a square identification ID.
- This process is executed by the pause request unit 24. Specifically, when the position specification success/failure determination unit 22 determines that position specification has succeeded, the pause request unit 24 transmits a pause request signal to the tag device 2 via the wireless communication unit 9, whereby the transmission of the detection signal from the tag device 2 is paused.
- Here, pausing detection signal transmission means lowering the transmission frequency of the detection signal; this includes stopping transmission entirely (setting the transmission frequency to zero).
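- A minimal sketch of this pause semantics on the tag side (the message shape and the interval handling are assumptions):

```python
from typing import Optional

def apply_pause_request(current_interval_s: float,
                        requested_interval_s: Optional[float]) -> Optional[float]:
    """'Pausing' means lowering the transmission frequency; None means stop
    entirely (transmission frequency set to zero) until a new request arrives."""
    if requested_interval_s is None:
        return None
    return max(current_interval_s, requested_interval_s)

print(apply_pause_request(1.0, 30.0))   # transmit every 30 s instead of every 1 s
print(apply_pause_request(1.0, None))   # stop transmitting entirely
```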
- When the camera unit 10 loses sight of the person to be monitored, that is, when the image positioning success/failure determination unit 23 determines that image positioning has failed, the transmission request unit 25 transmits a transmission request to the tag device 2 and requests it to transmit the detection signal.
- As a result, the wireless positioning information from the wireless positioning unit 14 and the image positioning information from the image positioning unit 16 of the camera unit 10 are integrated by the position specifying unit 17, so that the position of the person being monitored can be specified and tracking can continue.
- Such processing (first transmission request processing) is executed by the transmission request unit 25. Therefore, the transmission request unit 25 corresponds to the first transmission request unit of the present invention.
- Further, when position specification of the monitoring target fails after the pause request, the transmission request unit 26 transmits a transmission request to the tag device 2 and requests it to transmit the detection signal without waiting for the next transmission timing (instead of the transmission request, a request to increase the transmission frequency of the detection signal may be sent).
- As a result, the wireless positioning information from the wireless positioning unit 14 and the image positioning information from the image positioning unit 16 of the camera unit 10 are integrated by the position specifying unit 17, so that the position of the person being monitored can be specified and tracking can continue.
- Such processing (second transmission request processing) is executed by the transmission request unit 26. The transmission request unit 26 corresponds to the second transmission request unit of the present invention.
- the power saving control (1) of the tag device described here is particularly effective when the tag device 2 is an active tag.
- When the tag device 2 is a semi-passive tag, the tag device 2 does not transmit the detection signal spontaneously (or periodically), but transmits it only when it receives a transmission request from the sensor device 3; therefore, the control of pausing the tag device 2 by a pause request from the sensor device 3 is unnecessary.
- In this case, a power saving effect is obtained by having the sensor device 3 send a transmission request to the tag device 2, and the tag device 2 transmit the detection signal, only when needed.
- Next, a case will be described in which the frequency of positioning of the tag device 2 (tag positioning) by the wireless communication unit 9 is set lower than the frequency of positioning of the person to be monitored (image positioning) by the camera unit 10.
- FIG. 8 is an explanatory diagram of the timing of tag positioning and image positioning.
- the frequency of tag positioning is set smaller than the frequency of image positioning.
- the power consumption of the tag device 2 can be suppressed.
- When the tag device 2 is an active tag, this is realized by setting the frequency at which the wireless communication unit 6 of the tag device 2 periodically transmits the detection signal.
- When the tag device 2 is a semi-passive tag, this can be realized by making transmission requests from the sensor device 3 to the tag device 2 at a frequency lower than the frequency of image positioning.
- When the person to be monitored carrying the tag device 2 enters the monitoring area of the sensor device 3 (at the time of entering the area), tag positioning and image positioning are not synchronized, so in some cases only image positioning is performed and tag positioning is not. Even in such a case, at an image positioning occasion a predetermined number of times later, the timing aligns with tag positioning and matching is performed.
- Matching refers to the process of determining the image area of the person object that coincides with the tag positioning position and assigning the tag information (identification ID) to that image area of the person object. Matching is the process executed by the position specifying unit 17 described above.
- After the matching, based on the tag positioning data and on estimation from the image positioning data, a process of assigning the tag information (identification ID) retroactively to the image areas of the person object obtained before the matching (second identification information interpolation process) is possible.
- By this second identification information interpolation process, the tag information "ID: 1" is assigned to the image area of the person object before the matching (on the left side of the screen).
- The second identification information interpolation process is executed by the identification information interpolation unit 22. Therefore, it can be said that the identification information interpolation unit 22 corresponds to the second identification information interpolation unit of the present invention.
- As described above, according to the present embodiment, the position of the monitoring target can be specified and the movement of the monitoring target tracked even when an image of the monitoring target cannot be captured.
- In the present embodiment, the position calculated based on the photographed image (image position) and the position detected based on the detection signal of the tag device 2 (tag position) are linked, and the position of the monitoring target (the person to be monitored) is thereby specified.
- Here, the monitoring target holds the tag device 2. Therefore, as shown in FIG. 5, once the position of the monitoring target has been specified, the position of the monitoring target can be detected based on the detection signal of the tag device 2 and its movement tracked, even if the monitoring target cannot be captured thereafter.
- In the present embodiment, when the position of the monitoring target is specified by linking (integrating) the image position and the tag position, the tag device 2 pauses transmission of the detection signal. Thereby, power is saved in the tag device 2. Even in this case, the movement of the monitoring target can be tracked, because the position of the monitoring target is calculated based on the captured image.
- Conversely, when the position of the monitoring target cannot be calculated based on the captured image (when the monitoring target is lost), the tag device 2 resumes transmission of the detection signal. Thereby, while the monitoring target cannot be photographed, the position of the monitoring target is detected based on the detection signal of the tag device 2, so that tracking of its movement can continue.
- Furthermore, in the present embodiment, the identification information of the tag device 2, assigned to the image when the position of the monitoring target is specified, is applied retroactively to interpolate the missing identification information for the monitoring target, so that the movement of the monitoring target can be grasped continuously.
- In the present embodiment, even during a period when the position of the monitoring target cannot be calculated based on the captured image, the image area of the target is interpolated as shown in FIG. 5. Thereby, the movement of the monitoring target can be grasped continuously.
- Before the position of the monitoring target is specified (before the image position and the tag position are linked), the tag position may not be detected and only the image position may be calculated.
- In this case, tag information including the identification ID of the tag device 2 is assigned retroactively to the images of the monitoring target captured before the position specification. Thereby, the movement of the monitoring target can be grasped continuously.
- The position specifying system according to another embodiment is characterized by the operation of linking (matching) the image position and the tag position. Therefore, the configuration and operation of the position specifying unit, which is the feature of this embodiment, will be mainly described here. Unless otherwise mentioned, the configuration and operation of the position specifying system of this embodiment are the same as those of the above embodiment.
- FIG. 10 is a block diagram illustrating a configuration of a position specifying unit in the position specifying system according to another embodiment.
- As shown in FIG. 10, the position specifying unit 17 includes an associating unit 28 that associates the image position (the result of image positioning) with the tag position (the result of wireless positioning) based on the distance between the position calculated by the image positioning unit 16 (image position) and the position detected by the wireless positioning unit 14 (tag position).
- the position specifying unit 17 includes an image position vicinity searching unit 29 that searches for a tag position within a circular area (search area) having a predetermined radius centered on the image position.
- the image position vicinity search unit 29 receives an image position and a tag position at a certain time, and performs a process of searching for a tag position in a predetermined search area centered on the image position, as will be described later.
- the association unit 28 associates the image position with the tag position based on the search result of the image position vicinity search unit 29 (see FIG. 11).
- The position specifying unit 17 also includes a search area setting unit 30 that sets the size and shape of the search area. Further, the position specifying unit 17 includes a positioning accuracy detection unit 31 that detects the positioning accuracy of image positioning, and the search area setting unit 30 sets the size of the search area according to the positioning accuracy of image positioning detected by the positioning accuracy detection unit 31. For example, the accuracy of image positioning is determined based on the degree of dispersion when a person (monitoring target) is detected or identified by image processing (the matching accuracy when matching against a "person template"; when the matching accuracy is high, the accuracy of image positioning is also taken to be high).
- When the positioning accuracy of image positioning is high, the size (radius) of the search area is set small; when the positioning accuracy of image positioning is low, the size (radius) of the search area is set large.
- Note that although a circular area having a predetermined radius is described here as an example of the search area, the shape of the search area is not limited to this and may be, for example, a quadrangle or a hexagon.
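As an illustration of this accuracy-to-radius rule, the following minimal sketch in Python maps a matching accuracy to a search radius; the function name, the accuracy range of 0 to 1, and the radius bounds are illustrative assumptions, not values from this disclosure:

```python
def search_radius(matching_accuracy: float,
                  r_min: float = 0.5, r_max: float = 3.0) -> float:
    """Map image-positioning accuracy in [0, 1] to a search radius.

    Assumed linear rule: high matching accuracy -> small radius,
    low matching accuracy -> large radius.
    """
    a = min(max(matching_accuracy, 0.0), 1.0)  # clamp to [0, 1]
    return r_max - a * (r_max - r_min)
```

For example, `search_radius(0.9)` yields a radius near `r_min`, while `search_radius(0.2)` yields one near `r_max`.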
- The positioning accuracy detection unit 31 may also have a function of detecting the positioning accuracy of the wireless positioning. In this case, the search area setting unit 30 sets the size of the search area according to the positioning accuracy of the wireless positioning detected by the positioning accuracy detection unit 31. The accuracy of the wireless positioning is determined based on, for example, the reception quality (received radio wave intensity and error rate) of the detection signal of the tag device 2; when the reception quality is high, the accuracy of the wireless positioning is also taken to be high.
- Further, when a plurality of tag positions exist within the search area, the position specifying unit 17 includes a history holding unit 32 that holds information on the plurality of positions (for example, the time, the image position, and the tag positions corresponding to that image position) as candidate information (see FIG. 11).
- the association unit 28 has a function of associating the image position with the tag position based on the candidate information held in the history holding unit 32.
- Further, the position specifying unit 17 includes a history correction unit 33 that corrects the candidate information held in the history holding unit 32. For example, as will be described later, even when a plurality of image positions cannot be associated one-to-one with tag positions at time t1, each image position may be associated one-to-one with a tag position at a later time t2. In such a case, the history correction unit 33 rewrites the information in the history holding unit 32 (see FIG. 12).
- Further, the position specifying unit 17 includes a warning unit 34 that generates a warning sound or a warning display when no tag position exists in the search area centered on an image position. In this case, a monitoring target that does not carry the tag device 2 may have been found, so the warning (warning sound, warning display, etc.) is issued to alert the user.
- FIG. 11 and FIG. 12 are diagrams illustrating an example of processing in the position specifying unit.
- In the example of FIG. 11, three image positions G1 to G3 are calculated by the image positioning unit 16 and three tag positions T1 to T3 are detected by the wireless positioning unit 14.
- First, the radius of the search area is set based on the positioning accuracy of the image positioning, and tag positions are searched for within the search area centered on each image position. In FIG. 11, the three search areas centered on the three image positions are indicated by broken lines. In this case, two tag positions T1 and T2 exist in the search area of the image position G1, so the image position and the tag position cannot be associated one-to-one. Therefore, candidate information indicating that the tag positions corresponding to the image position G1 at time t1 are T1 and T2 is recorded in the history holding unit 32. Similarly, candidate information indicating that the tag positions corresponding to the image position G2 at time t1 are T2 and T3, and that the tag positions corresponding to the image position G3 at time t1 are T1 and T3, is recorded in the history holding unit 32.
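A minimal sketch of this search and candidate recording, assuming positions are stored as (x, y) tuples keyed by identifiers (the dictionary layout and the example coordinates are hypothetical):

```python
from math import hypot

def find_candidates(image_positions, tag_positions, radius):
    """For each image position, list the tags whose positions fall
    inside the circular search area centered on it."""
    candidates = {}
    for g_id, (gx, gy) in image_positions.items():
        candidates[g_id] = [t_id for t_id, (tx, ty) in tag_positions.items()
                            if hypot(tx - gx, ty - gy) <= radius]
    return candidates

# History holding unit 32 (sketch): time -> candidate table.
history = {}
history["t1"] = find_candidates({"G1": (0.0, 0.0)},
                                {"T1": (0.5, 0.6), "T2": (1.0, -0.5)},
                                radius=2.0)
# -> {"G1": ["T1", "T2"]}: two candidates, so no one-to-one match yet.
```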
- At time t2, the image positioning unit 16 calculates two image positions (G1 and G2) and the wireless positioning unit 14 detects two tag positions (T1 and T2). Again, the radius of the search area is set based on the positioning accuracy of the image positioning, and tag positions are searched for within the search area centered on each image position. In FIG. 12, the two search areas centered on the two image positions are indicated by broken lines. In this case, only one tag position T1 exists in the search area of the image position G1, so the image position G1 and the tag position T1 are associated one-to-one. Then, the history correction unit 33 narrows down the past candidate information using the association result obtained at time t2. Specifically, for the candidate information (T1 and T2) corresponding to the image position G1 at time t1, the fact that G1 was associated with the tag position T1 at time t2 is used to narrow the tag position corresponding to the image position G1 at time t1 down to T1. Similarly, the tag position corresponding to the image position G2 at time t1 is narrowed down to T2. The history correction unit 33 then corrects the data in the history holding unit 32 from the pre-correction state shown in FIG. 12 to the post-correction state using these results.
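The correction step can be sketched as follows, assuming the later one-to-one results are available as a mapping from image position to tag position (data layout as in the previous sketch; hypothetical):

```python
def correct_history(history, resolved):
    """History correction (unit 33): once an image position has been
    resolved one-to-one later (e.g. resolved = {"G1": "T1"}), narrow any
    earlier candidate list for that image position to the resolved tag."""
    for table in history.values():
        for g_id, cands in table.items():
            tag = resolved.get(g_id)
            if tag is not None and tag in cands and len(cands) > 1:
                table[g_id] = [tag]  # rewrite the stored candidate info

history = {"t1": {"G1": ["T1", "T2"], "G2": ["T2", "T3"]}}
correct_history(history, {"G1": "T1", "G2": "T2"})
# -> {"t1": {"G1": ["T1"], "G2": ["T2"]}}
```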
- Note that the association between the image position and the tag position may also be performed using likelihoods.
- association using likelihood will be described in detail with reference to FIGS. 13 and 14.
- FIG. 13 is a diagram illustrating an example of creation of candidate information with likelihood.
- As shown in FIG. 13, assume that at a certain time two image positions G1 and G2 are calculated by the image positioning unit 16 and two tag positions (T1 and T2) are detected by the wireless positioning unit 14. In this case, the likelihood of each tag position corresponding to an image position is obtained from the reciprocal of the distance between them and then normalized. For example, candidate information with likelihood indicating that the tag positions corresponding to the image position G1 at time t1 are T1 (75%) and T2 (25%) is recorded in the history holding unit 32.
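A sketch of the reciprocal-distance likelihood rule (function name and data layout hypothetical; the normalization matches the 75%/25% example above):

```python
from math import hypot

def likelihood_candidates(image_pos, tag_positions):
    """Likelihood of each tag for one image position, proportional to the
    reciprocal of the distance and normalized to sum to 1."""
    eps = 1e-9  # guard against a zero distance
    inv = {t_id: 1.0 / (hypot(tx - image_pos[0], ty - image_pos[1]) + eps)
           for t_id, (tx, ty) in tag_positions.items()}
    total = sum(inv.values())
    return {t_id: w / total for t_id, w in inv.items()}

# A tag three times closer gets likelihood 0.75 versus 0.25:
print(likelihood_candidates((0.0, 0.0), {"T1": (1.0, 0.0), "T2": (3.0, 0.0)}))
```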
- FIG. 14 is a diagram illustrating an example of association based on a history of candidate information with likelihood. As shown in FIG. 14, assume that candidate information with likelihood at times t1 and t2 is recorded. In this case, the association at each time can be obtained by Bayesian estimation. In the example of FIG. 14, the conditional probability (posterior probability) P(T1 | X) is updated using observed data X, where T1 is the event that the image position G1 is associated with the tag position T1, T2 is the event that the image position G1 is associated with the tag position T2, and P(T1) is the probability (prior probability) that the image position G1 is associated with the tag position T1. P(T1) is updated using the data; when P(T1) exceeds a predetermined upper threshold (for example, 0.95), it is determined that the image position G1 is associated with the tag position T1, and when it falls below a predetermined lower threshold (for example, 0.05), it is determined that the image position G1 and the tag position T1 are not associated. In the example of FIG. 14, the posterior probability using the data X1 is calculated as P(T1 | X1) = 0.70, the posterior probability using the data X2 as P(T1 | X2) = 0.78, the posterior probability using the data X3 as P(T1 | X3) = 0.93, and the posterior probability using the data X4 as P(T1 | X4) = 0.97. At this point, the posterior probability exceeds the upper threshold (0.95), and the tag position T1 and the image position G1 are associated.
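The update itself can be sketched as a two-hypothesis Bayesian filter; the per-step likelihood values below are assumptions chosen only so that the sketch reproduces the 0.70 / 0.78 / 0.93 / 0.97 sequence of FIG. 14, since the text does not specify the likelihood model:

```python
def bayes_update(prior, lik_t1, lik_t2):
    """P(T1 | X) for the event 'G1 is associated with T1', assuming the
    only alternative is T2 (so P(T2) = 1 - P(T1))."""
    num = lik_t1 * prior
    return num / (num + lik_t2 * (1.0 - prior))

UPPER, LOWER = 0.95, 0.05
p = 0.5  # uninformative prior P(T1)
for lik_t1, lik_t2 in [(0.7, 0.3), (0.6, 0.4), (0.8, 0.2), (0.7, 0.3)]:
    p = bayes_update(p, lik_t1, lik_t2)   # 0.70, 0.78, 0.93, 0.97
    if p > UPPER:
        print("associate G1 with T1")     # reached at the fourth update
        break
    elif p < LOWER:
        print("G1 and T1 are not associated")
        break
```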
- As described above, in this position specifying system, the results of the image positioning and the wireless positioning of the monitoring target can be appropriately associated based on the distance between the image position and the tag position. For example, when the image position and the tag position are close, they can be associated as the same monitoring target. Concretely, when a tag position exists within a predetermined search area (for example, within a predetermined radius) centered on the image position, the image position and the tag position can be appropriately associated as one and the same monitoring target. Thereby, the accuracy of specifying the position of the monitoring target is improved.
- the search area setting unit 30 can appropriately set the size of the search area centered on the image position in accordance with the positioning accuracy of the image positioning of the monitoring target. For example, when the positioning accuracy is high, the size of the search area is set small, and when the positioning accuracy is low, the size of the search area is set large.
- Further, when a plurality of tag positions exist within the search area, the history holding unit 32 holds information on the plurality of positions (for example, the time, the image position, and the tag positions corresponding to the image position) as candidate information, and the association unit 28 can appropriately associate the results of the image positioning and the wireless positioning of the monitoring target based on that candidate information. When no tag position exists within the search area, the warning unit 34 issues a warning (warning sound, warning display, etc.) to alert the user.
- Note that the association unit 28 may associate the image position with the tag position based on the search result of a tag position vicinity search unit 35 (described later in Modification 1) instead of the image position vicinity search unit 29. In that case, when the tag position vicinity search unit 35 performs a search, the size of the search area may be set according to the positioning accuracy of the wireless positioning, and when a plurality of image positions exist within the search area, they may be held in the history holding unit 32 as candidate information. The association unit 28 may then associate the image positioning of the monitoring target with the result of the wireless positioning based on the candidate information obtained by the tag position vicinity search unit 35 and held in the history holding unit 32. That is, when an image position exists within a predetermined search area (for example, within a predetermined radius) centered on a tag position, the results of the image positioning and the wireless positioning of the monitoring target can be associated as one and the same monitoring target.
- FIG. 15 is a block diagram illustrating a configuration of Modification 1 of the position specifying system according to another embodiment.
- Here, the description focuses on the configuration and operation in which Modification 1 differs from the other embodiments. That is, unless otherwise specified, the configuration and operation of Modification 1 are the same as those of the other embodiments described above.
- In Modification 1, in addition to the image position vicinity search unit 29, the position specifying unit 17 includes a tag position vicinity search unit 35 that searches whether an image position exists within a circular area (search area) with a predetermined radius centered on a tag position. The tag position vicinity search unit 35 receives a tag position and an image position at a certain time, and performs a process of searching for an image position within the predetermined search area centered on the tag position. The position specifying unit 17 further includes a search method changing unit 36 that changes the search method (whether the image position vicinity search unit 29 or the tag position vicinity search unit 35 is used) according to the positioning accuracies of the image positioning and the wireless positioning.
- In Modification 1, the positioning accuracy detection unit 31 of the position specifying unit 17 has a function of detecting the positioning accuracy of both the image positioning unit 16 and the wireless positioning unit 14. As above, the accuracy of the image positioning is determined based on the degree of matching (the matching accuracy when matching with a "person template") obtained when a person (monitoring target) is detected or identified by image processing; when the matching accuracy is high, the accuracy of the image positioning is also taken to be high. The accuracy of the wireless positioning is determined based on, for example, the reception quality (received radio wave intensity and error rate) of the detection signal of the tag device 2; when the reception quality is high, the accuracy of the wireless positioning is also taken to be high.
- the position specifying unit 17 includes a positioning accuracy comparison unit 37 that compares the positioning accuracy of image positioning and wireless positioning.
- The search method changing unit 36 changes the search method based on the comparison result of the positioning accuracies. Specifically, when the positioning accuracy of the image positioning is higher, the image position vicinity search unit 29 is used to search whether a tag position exists in the search area centered on the image position; when the positioning accuracy of the wireless positioning is higher, the tag position vicinity search unit 35 is used to search whether an image position exists in the search area centered on the tag position.
- In this way, the positioning accuracies of the image positioning and the wireless positioning are compared, and when the other position exists within a predetermined search area (for example, within a predetermined radius) centered on the position obtained with the higher positioning accuracy, the results of the image positioning and the wireless positioning of the monitoring target can be appropriately associated as one and the same monitoring target. Specifically, when the positioning accuracy of the image positioning is higher and a tag position exists in the search area centered on the image position, the image position and the tag position are appropriately associated as a single monitoring target; when the positioning accuracy of the wireless positioning is higher and an image position exists in the search area centered on the tag position, the image position and the tag position are likewise appropriately associated.
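Combining the two search directions, the accuracy-driven switch might be sketched as follows (the `search_radius` helper is the hypothetical one introduced earlier; the distance test itself is symmetric, so what the switch really selects is whose position anchors the search area and whose accuracy sizes it):

```python
from math import hypot

def search_radius(accuracy, r_min=0.5, r_max=3.0):
    """Assumed linear accuracy-to-radius rule (as sketched earlier)."""
    a = min(max(accuracy, 0.0), 1.0)
    return r_max - a * (r_max - r_min)

def associated(image_pos, tag_pos, acc_image, acc_radio):
    """Search method changing unit 36 (sketch): center the search area on
    the position from the more accurate modality."""
    if acc_image >= acc_radio:
        center, other, r = image_pos, tag_pos, search_radius(acc_image)
    else:
        center, other, r = tag_pos, image_pos, search_radius(acc_radio)
    return hypot(other[0] - center[0], other[1] - center[1]) <= r
```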
- FIG. 16 is a block diagram illustrating a configuration of a second modification of the position specifying system according to another embodiment.
- Here, the description focuses on the configuration and operation in which Modification 2 differs from the other embodiments. That is, unless otherwise specified, the configuration and operation of Modification 2 are the same as those of the other embodiments described above.
- The position specifying unit 17 of Modification 2 includes, instead of the image position vicinity search unit 29, a combination calculation unit 38 that calculates the combination minimizing the sum of squares of the distances between the image positions and the tag positions. The combination calculation unit 38 receives the tag positions and the image positions at a certain time, and calculates the optimal combination of tag positions and image positions. As the calculation method, for example, a full search method or a local search method is used. In Modification 2, the optimal combination that minimizes the sum of squares of the distances between the image positions and the tag positions is calculated, and the image positions and the tag positions are associated based on the result. Thereby, the image positions and the tag positions can be associated appropriately.
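A full-search sketch of the combination calculation unit 38 (names and data layout hypothetical; a full search over permutations is exponential, so it is only practical for a handful of targets — the local search method mentioned above, or the Hungarian algorithm, scales better):

```python
from itertools import permutations
from math import hypot

def best_assignment(image_positions, tag_positions):
    """Try every assignment of tags to image positions and keep the one
    minimizing the sum of squared image-tag distances."""
    g_ids, t_ids = list(image_positions), list(tag_positions)
    best, best_cost = None, float("inf")
    for perm in permutations(t_ids, len(g_ids)):
        cost = sum(hypot(image_positions[g][0] - tag_positions[t][0],
                         image_positions[g][1] - tag_positions[t][1]) ** 2
                   for g, t in zip(g_ids, perm))
        if cost < best_cost:
            best, best_cost = dict(zip(g_ids, perm)), cost
    return best

print(best_assignment({"G1": (0.0, 0.0), "G2": (4.0, 0.0)},
                      {"T1": (0.4, 0.3), "T2": (3.8, 0.2)}))
# -> {'G1': 'T1', 'G2': 'T2'}
```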
- FIG. 17 is a block diagram illustrating a configuration of Modification 3 of the position specifying system according to another embodiment.
- Here, the description focuses on the configuration and operation in which Modification 3 differs from the other embodiments. That is, unless otherwise specified, the configuration and operation of Modification 3 are the same as those of the other embodiments described above.
- the position specifying unit 17 of the third modification includes an average value calculating unit 39 that calculates the average position of the associated image position and tag position as the position to be monitored.
- The average value calculation unit 39 receives the positioning accuracies of the image positioning and the wireless positioning, and calculates a weighted average according to the positioning accuracies as the average position of the monitoring target using the following Equation 3:

  x = (a_v · x_v + a_r · x_t) / (a_v + a_r)   (Equation 3)

  where x is the average position of the monitoring target (integrated position coordinates), x_v is the image position (camera position coordinates), x_t is the tag position (tag position coordinates), a_v is the positioning accuracy of the image positioning (camera accuracy), and a_r is the positioning accuracy of the wireless positioning (tag accuracy).
- In Modification 3, the average position of the image position and the tag position is determined as the position of the monitoring target. Thereby, the position of the monitoring target can be appropriately specified.
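A sketch of the accuracy-weighted average of Equation 3 as reconstructed above (per-axis; the function name is hypothetical):

```python
def integrated_position(xv, xt, a_v, a_r):
    """Weighted average of the image position xv and the tag position xt,
    weighted by the camera accuracy a_v and the tag accuracy a_r."""
    return tuple((a_v * v + a_r * t) / (a_v + a_r) for v, t in zip(xv, xt))

# A camera fix trusted twice as much as the tag fix:
print(integrated_position((2.0, 4.0), (3.0, 2.0), a_v=2.0, a_r=1.0))
# -> (2.333..., 3.333...)
```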
- FIG. 18 is a diagram illustrating a modification of the image position vicinity search unit.
- In this modification, as shown in FIG. 18, the image positions and the tag positions are arranged in a cell space divided into a plurality of cells (FIG. 18 illustrates 16 cells, A1 to D4).
- three image positions G1 to G3 and three tag positions T1 to T3 are respectively arranged in the cell space.
- In the example of FIG. 18, the image position G1 is arranged in the cell B3 and the tag position T1 is arranged in the cell B4; the image position G2 and the tag position T2 are arranged in the cell C2; and the image position G3 is arranged in the cell C1 and the tag position T3 is arranged in the cell D1.
- In this modification, the image positions and the tag positions are associated based on the positional relationship between the cell to which an image position belongs and the cell to which a tag position belongs. For example, since no tag position belongs to the cell B3 to which the image position G1 belongs, the search range for the tag position is widened. Specifically, the search range is expanded to the cells around the cell B3 (cells A2, B2, C2, A3, C3, A4, B4, and C4), and those cells are searched for tag positions. As a result, the tag position T1, which is close to the image position G1, is determined as the tag position corresponding to the image position G1 (the image position G1 and the tag position T1 are associated).
- Next, the tag position T2 belongs to the same cell C2 as the image position G2. Therefore, in this case, the tag position T2 is determined as the tag position corresponding to the image position G2 (the image position G2 and the tag position T2 are associated).
- Further, since no tag position belongs to the cell C1 to which the image position G3 belongs, the search range for the tag position is widened. Specifically, the search range is expanded to the cells around the cell C1 (cells B1, D1, B2, C2, and D2), and those cells are searched for tag positions. In this case, the tag position T3 belonging to the cell D1 and the tag position T2 belonging to the cell C2 are found. Then, the tag position T3, which is close to the image position G3, is determined as the tag position corresponding to the image position G3 (the image position G3 and the tag position T3 are associated). In this case, since the tag position T2 is already associated with the image position G2, the tag position T3 may instead be determined as the tag position corresponding to the image position G3 based on that information alone (without calculating the distance).
- the image position and the tag position can be appropriately associated based on the positional relationship between the cell to which the image position belongs and the cell to which the tag position belongs. For example, when the image position and the tag position belong to the same cell, the image position and the tag position are associated as the same monitoring target (without calculating the distance between the image position and the tag position). Thereby, compared with the case where distance calculation is performed for all image positions and tag positions, the amount of calculation can be greatly reduced, and the processing speed can be significantly increased.
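A sketch of the cell-based association (the grid layout, cell size, and tie-breaking are assumptions; the point is that a same-cell hit avoids any distance calculation and the neighbor search limits the rest):

```python
from math import hypot

def cell_of(pos, cell_size):
    """Integer cell index of a coordinate (assumed uniform square grid)."""
    return (int(pos[0] // cell_size), int(pos[1] // cell_size))

def associate_by_cells(image_positions, tag_positions, cell_size=1.0):
    buckets = {}
    for t_id, pos in tag_positions.items():
        buckets.setdefault(cell_of(pos, cell_size), []).append(t_id)
    matched, result = set(), {}
    for g_id, gpos in image_positions.items():
        cx, cy = cell_of(gpos, cell_size)
        same = [t for t in buckets.get((cx, cy), []) if t not in matched]
        if len(same) == 1:                     # same cell: no distance needed
            result[g_id] = same[0]
            matched.add(same[0])
            continue
        near = [t for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                for t in buckets.get((cx + dx, cy + dy), [])
                if t not in matched]           # widen to the 8 neighbor cells
        if near:
            best = min(near, key=lambda t: hypot(tag_positions[t][0] - gpos[0],
                                                 tag_positions[t][1] - gpos[1]))
            result[g_id] = best
            matched.add(best)
    return result
```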
- FIG. 19 shows the configuration of the object position estimation apparatus according to the embodiment of the present invention.
- the object position estimation apparatus 100 inputs a captured image signal captured by the camera 101 to the captured image acquisition unit 102.
- the captured image acquisition unit 102 sends the acquired captured image to the image coordinate probability density distribution forming unit 103.
- The image coordinate probability density distribution forming unit 103 forms the probability density distribution P_t^V of the coordinates of the target 200 (a person in the figure) whose position is to be estimated, from the captured image at time t.
- the image coordinate probability density distribution forming unit 103 has a preset probability density distribution model.
- the image coordinate probability density distribution forming unit 103 detects a location where an object may exist in the image based on the captured image, and obtains the existence probability at each location. This existence probability may be obtained from the degree of matching between the template for the target 200 and the target 200 detected from the captured image.
- Then, using the preset probability density distribution model, the image coordinate probability density distribution forming unit 103 creates a probability density distribution whose area corresponds to the degree of matching at each detected location, and adds them together to form the probability density distribution P_t^V.
- FIG. 20 is an example in which three candidates for the target 200 are detected from the captured image. In this case, assuming that the degrees of matching with the template for the detected candidates are 5%, 75%, and 20% as shown in FIG. 20, the image coordinate probability density distribution forming unit 103 uses the probability density distribution model to create probability density distributions with an area ratio of 5:75:20 at the detected positions, and outputs their sum as the probability density distribution P_t^V.
- For example, the probability density distribution model is a normal distribution; the mean may be set to the detection position, and the normal distribution may be scaled according to the degree of matching.
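A one-dimensional sketch of this construction (the grid, sigma, and positions are illustrative; the area ratio 5:75:20 matches the FIG. 20 example):

```python
import numpy as np

def image_pdf(detections, sigma=0.5, grid=None):
    """P_t^V as a sum of normals, one per detected candidate, scaled so
    their areas follow the matching degrees."""
    if grid is None:
        grid = np.linspace(0.0, 10.0, 501)
    weights = np.array([w for _, w in detections], dtype=float)
    weights /= weights.sum()                 # total area is 1
    pdf = np.zeros_like(grid)
    for (mu, _), w in zip(detections, weights):
        pdf += w * np.exp(-0.5 * ((grid - mu) / sigma) ** 2) \
               / (sigma * np.sqrt(2.0 * np.pi))
    return grid, pdf

# Three candidates with matching degrees 5%, 75%, 20%:
grid, pdf = image_pdf([(2.0, 0.05), (5.0, 0.75), (8.0, 0.20)])
```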
- FIG. 21A shows an example in which one candidate for the target 200 is detected from the captured image.
- In this case, the image coordinate probability density distribution forming unit 103 creates a probability density distribution as shown in FIG. 21B using the probability density distribution model, and outputs this as the probability density distribution P_t^V.
- When the image coordinate probability density distribution forming unit 103 detects two candidate positions of the target 200 having the same existence probability in the captured image, it outputs a probability density distribution as shown in FIG.
- When two candidate positions of the target 200 having different existence probabilities are detected in the captured image, the image coordinate probability density distribution forming unit 103 outputs a probability density distribution as shown in FIG. 22.
- The probability density distribution P_t^V formed by the image coordinate probability density distribution forming unit 103 is sent to the weight determination unit 121 of the probability density integration unit 120.
- the object position estimation apparatus 100 inputs the signal received by the tag signal receiver 111 to the tag information acquisition unit 112.
- the tag information acquisition unit 112 extracts a signal indicating the coordinates of the tag from the acquired tag information, and sends this to the tag coordinate probability density distribution forming unit 113.
- The tag coordinate probability density distribution forming unit 113 forms the probability density distribution P_t^T of the coordinates of the tag 201 from the tag coordinates at time t.
- Like the image coordinate probability density distribution forming unit 103, the tag coordinate probability density distribution forming unit 113 has a preset probability density distribution model; here, the probability density distribution model is a normal distribution.
- As its basic processing, the tag coordinate probability density distribution forming unit 113 creates a probability density distribution by setting the mean of the normal distribution to the tag coordinates, and outputs this as the probability density distribution P_t^T.
- In addition to this basic processing, the tag coordinate probability density distribution forming unit 113 of the present embodiment performs a probability density distribution changing process, the details of which will be described later.
- The probability density distribution P_t^T formed by the tag coordinate probability density distribution forming unit 113 is sent to the weight determination unit 121 of the probability density integration unit 120.
- The integration unit 122 of the probability density integration unit 120 obtains a probability density distribution L by integrating the probability density distribution P_t^V of the position of the target 200 at time t, output from the image coordinate probability density distribution forming unit 103, with the probability density distribution P_t^T of the position of the tag 201 at time t, output from the tag coordinate probability density distribution forming unit 113, using the weight coefficient k(t) determined by the weight determination unit 121.
- The weight determination unit 121 sets the weighting coefficient k(t) to a value expressed by the following equation, so that a distribution whose maximum probability density is larger receives a larger weight when the probability density distributions P_t^V and P_t^T are weighted.
- Note that Equations 4 and 5 are merely examples of the probability density distribution integration method and the weight determination method. The present invention can also be implemented by applying various other conventionally proposed methods as the integration method and the weight determination method, without affecting the effect of the present invention. For example, a fixed value may be used as the weighting factor k(t), and the probability density distributions P_t^V and P_t^T may simply be integrated (summed or multiplied) without weighting.
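Since Equations 4 and 5 themselves are not reproduced in this text, the following sketch assumes one common form that is consistent with the description: a convex combination whose weight k(t) grows with the maximum density of the image-based distribution:

```python
import numpy as np

def integrate(pdf_v, pdf_t):
    """Probability density integration unit 120 (sketch). The weight rule
    below is an assumption, not necessarily Equation 5 itself."""
    k = pdf_v.max() / (pdf_v.max() + pdf_t.max())  # higher peak -> more weight
    fused = k * pdf_v + (1.0 - k) * pdf_t
    return fused / fused.sum()                     # renormalize on the grid

# Determination unit 130 (sketch): the argmax of the fused distribution
# is taken as the position estimation result, e.g.
# position_index = int(np.argmax(integrate(pdf_v, pdf_t)))
```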
- the probability density distribution obtained by the integration unit 122 is input to the determination unit 130.
- the determination unit 130 determines that the position having the maximum probability density in the probability density distribution is the position of the target 200, and outputs this position as a position estimation result.
- the object position estimation apparatus 100 includes a tag status detection unit 131 and a probability density distribution change control unit 132.
- the tag status detection unit 131 inputs the captured image output from the captured image acquisition unit 102.
- the tag status detection unit 131 detects the surrounding status of the target 200 using the captured image.
- the surrounding situation of the target 200 can be rephrased as the surrounding situation of the wireless tag 201.
- The surrounding situation here is whether an obstacle exists around the target 200 and, if one exists, where it is.
- An obstacle is a structure or object that makes movement of the target 200 impossible or difficult. An obstacle may also be something that exists within the radio wave transmission range of the wireless tag 201 and the detection range of the tag signal receiver 111 and degrades communication between the wireless tag 201 and the tag signal receiver 111.
- In particular, since an object existing between the wireless tag 201 and the tag signal receiver 111 is likely to degrade their communication, it is effective to detect, as an obstacle, an object existing between the wireless tag 201 and the tag signal receiver 111.
- the detection result obtained by the tag status detection unit 131 is sent to the probability density distribution change control unit 132.
- The probability density distribution change control unit 132 performs control to change the probability density distribution in the tag coordinate probability density distribution forming unit 113 according to the surrounding situation detected by the tag status detection unit 131. Specifically, according to the obstacle detection result, the probability density distribution change control unit 132 changes the variance of the normal distribution, the mean of the normal distribution, or the probability density of a partial region of the normal distribution corresponding to the detection result.
- For example, when an obstacle is detected, the probability density distribution change control unit 132 causes the tag coordinate probability density distribution forming unit 113 to set the probability density at the position where the obstacle is detected to 0 and to rescale the other portions so that the integral of the probability density remains 1. Since the probability density at a position where the target 200 cannot actually exist becomes zero, the probability density distribution P_t^T can be made closer to reality.
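A one-dimensional sketch of this zero-and-renormalize change (the grid, variance, and obstacle region are illustrative; the example region x >= 20 anticipates the FIG. 25/26 walk-through below):

```python
import numpy as np

def zero_out_obstacle(grid, pdf, obstacle_mask):
    """Set the density to 0 where an obstacle was detected, then rescale
    so the remaining distribution integrates to 1."""
    pdf = np.where(obstacle_mask, 0.0, pdf)
    dx = grid[1] - grid[0]            # uniform grid assumed
    return pdf / (pdf.sum() * dx)     # renormalize to integrate to 1

x = np.linspace(0.0, 40.0, 801)
pdf = np.exp(-0.5 * ((x - 17.0) / 3.0) ** 2) / (3.0 * np.sqrt(2.0 * np.pi))
pdf = zero_out_obstacle(x, pdf, x >= 20.0)
# The peak at x = 17 is now higher than in the untruncated normal.
```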
- Further, when the probability density distribution change control unit 132 determines that an obstacle that degrades communication between the wireless tag 201 and the tag signal receiver 111 exists around them, it instructs the tag coordinate probability density distribution forming unit 113 to form a probability density distribution P_t^T in which the variance of the normal distribution is increased compared with the case where no obstacle is detected. Thereby, a probability density distribution P_t^T commensurate with the wireless communication quality can be formed.
- the type of obstacle may be detected from the captured image, and the variance value may be changed according to the type of obstacle. For example, when an obstacle such as a metal that easily interferes with radio waves is detected, the variance value may be set larger than when an obstacle that does not easily interfere with radio waves is detected.
- Alternatively, the probability density distribution change control unit 132 may instruct the tag coordinate probability density distribution forming unit 113 to form P_t^T with the mean of the normal distribution changed between the case where an obstacle is detected and the case where no obstacle is detected. A suitable amount of change may be obtained in advance.
- FIG. 23 shows an example in which the variance of the probability density distribution is changed depending on whether an obstacle exists around the wireless tag 201 and the tag signal receiver 111.
- When no obstacle exists, the tag coordinate probability density distribution forming unit 113 forms and outputs a probability density distribution with a small variance, as shown in FIG. 23A.
- When an obstacle exists, the tag coordinate probability density distribution forming unit 113 forms and outputs a probability density distribution with a large variance, as shown in FIG. 23B.
- The probability density distributions in FIGS. 23A and 23B are both normal distributions.
- Incidentally, as seen from the wireless tag 201, the target 200 itself is also an obstacle. It is therefore also effective for the tag status detection unit 131 to detect how the wireless tag 201 is worn by the target 200, as shown in FIG. 24, and for the tag coordinate probability density distribution forming unit 113 to output a probability density distribution P_t^T according to the detection result. That is, when the tag status detection unit 131 detects that the wireless tag 201 is attached above the target 200, it is assumed that no obstacle (that is, the person) exists between the wireless tag 201 and the tag signal receiver 111, and the tag coordinate probability density distribution forming unit 113 outputs a probability density distribution with a small variance.
- Conversely, when the tag status detection unit 131 detects that the wireless tag 201 is hung from the neck of the target 200, it is assumed that an obstacle (that is, the person) may exist between the wireless tag 201 and the tag signal receiver 111, and the tag coordinate probability density distribution forming unit 113 outputs a probability density distribution with a large variance.
- Next, an example will be described in which the tag coordinate probability density distribution forming unit 113 outputs a probability density distribution P_t^T in which the probability density at positions where an obstacle is detected is set to 0.
- FIG. 25A shows a captured image captured by the camera 101.
- FIG. 25B shows a captured image converted into a plane map coordinate system.
- Assume that the tag status detection unit 131 converts the coordinate system of the captured image shown in FIG. 25A and obtains a detection result as shown in FIG. 25B, and that the position measured by the tag is (17, 15). Then, based on the tag positioning result and the image-based detection result, the probability density distribution change control unit 132 outputs to the tag coordinate probability density distribution forming unit 113 a probability density distribution change signal instructing it to treat (17, 15) as the position of the target 200 and to set the probability density of the region x ≥ 20, which corresponds to the obstacle, to 0. Then, as shown in FIG. 26, the tag coordinate probability density distribution forming unit 113 outputs a probability density distribution in which the coordinates (17, 15) are the mean of the normal distribution and the probability density for x ≥ 20 is 0.
- FIG. 27 shows the probability density distribution output from the tag coordinate probability density distribution forming unit 113 when no obstacle exists. The coordinates (17, 15) are the mean of the normal distribution, as in FIG. 26, but since the obstacle is not taken into account, the probability density in the range x ≥ 20 is not zero. Even when the probability density for x ≥ 20 is set to 0, as in FIG. 26, the integral of the probability density must still be 1, just as in FIG. 27. Therefore, in the example of FIG. 26, the probability density (z-axis value) at the coordinates (17, 15) is larger than the probability density (z-axis value) at the mean of the normal distribution in FIG. 27.
- FIG. 28 and FIG. 29 show another example, different from those of FIG. 25 and FIG. 26. FIG. 28 shows an image in which the target 200 is located in a passage with obstacles on both sides.
- In FIG. 28, a chair and a desk are shown as the obstacles, but the obstacles may of course be walls or the like.
- In this case, the tag coordinate probability density distribution forming unit 113 outputs a probability density distribution as shown in FIG. 29.
- The probability density distribution of FIG. 29 has a shape in which the normal distribution is stretched along the passage direction and compressed in the direction perpendicular to the passage. As is clear from FIG. 29, the probability density at positions corresponding to the obstacles is reduced, and the probability density at positions corresponding to the passage is increased accordingly.
- As described above, according to the present embodiment, there are provided: the image coordinate probability density distribution forming unit 103, which forms the first probability density distribution P_t^V for the coordinates of the target 200 based on the captured image; the tag coordinate probability density distribution forming unit 113, which forms the second probability density distribution P_t^T for the coordinates of the target 200 based on a signal from the wireless tag 201 attached to the target 200; the tag status detection unit 131, which detects the surrounding situation of the wireless tag 201 based on the captured image; the probability density distribution change control unit 132, which changes the probability density distribution P_t^T in the tag coordinate probability density distribution forming unit 113 according to the detected surrounding situation of the wireless tag 201; and the probability density integration unit 120, which integrates the first probability density distribution P_t^V and the second probability density distribution P_t^T. The captured image is thereby used effectively, and an object position estimation apparatus 100 that can improve the position estimation accuracy compared with the conventional art can be realized.
- That is, since the probability density distribution itself is changed according to the detection result of the tag status detection unit 131, the probability density distribution becomes closer to reality, and as a result the position estimation accuracy improves.
- In the embodiment described above, the probability density distribution in the tag coordinate probability density distribution forming unit 113 is changed according to the detected surrounding situation. Instead, the weighting factor applied to the probability density distribution P_t^T obtained from the tag coordinate probability density distribution forming unit 113 may be changed according to the detected surrounding situation. Of course, both may be combined: the weighting factor may be changed while the probability density distribution in the tag coordinate probability density distribution forming unit 113 is also changed as in the above-described embodiment.
- the present invention can also be applied to an object position estimation apparatus using a particle filter.
- the tag coordinate probability density distribution forming unit 113 is configured by a particle filter, and likelihoods used in the particle filter and / or particles to be excluded are selected according to the obstacle detected from the captured image.
- A particle filter estimates the true value from observed values containing noise. To do so, the particle filter sets a large number of particles, each of which is given a likelihood, and estimates the true value by repeating the following processes (1) and (2): (1) the likelihood of each particle at time t is estimated from its likelihood at time t-1; (2) particles that do not match the observed value at time t are excluded using the estimated likelihood at time t.
- In the present configuration, the likelihood estimated in process (1) is selected according to the obstacle detected from the captured image, and/or the particles to be excluded in process (2) are changed according to the obstacle detected from the captured image. Specifically, the likelihood corresponding to the position of the obstacle is lowered and/or the particles corresponding to the position of the obstacle are excluded. Thereby, the noise rejection capability of the particle filter improves.
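A one-dimensional particle filter sketch with this obstacle handling (the motion and observation models, noise levels, and obstacle region are assumptions; the text fixes only the idea of lowering the likelihood at, or excluding particles from, obstacle positions):

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_step(particles, observation, in_obstacle, motion_noise=0.5, sigma=1.0):
    """One predict/update/resample cycle.

    (1) predict: diffuse particles with assumed motion noise;
        particles landing inside a detected obstacle get zero weight.
    (2) update: weight the rest by a Gaussian likelihood of the
        observation, then resample."""
    particles = particles + rng.normal(0.0, motion_noise, size=particles.shape)
    weights = np.where(in_obstacle(particles), 0.0, 1.0)
    weights *= np.exp(-0.5 * ((particles - observation) / sigma) ** 2)
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]

particles = rng.uniform(0.0, 40.0, 500)
particles = pf_step(particles, observation=17.0, in_obstacle=lambda x: x >= 20.0)
```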
- In the above description, the case where the sensor attached to the target 200 is the wireless tag 201 has been described; however, the sensor may instead be an acoustic tag, an ultrasonic tag, an infrared tag, or the like. Further, the target 200 may be positioned remotely using radio, sound, ultrasonic waves, infrared rays, or the like, without attaching a device such as a tag.
- As described above, the position specifying system according to the present invention has the effect that the position of the monitoring target can be specified and the movement of the monitoring target can be tracked even when an image of the monitoring target cannot be captured, and it is useful as a factory work efficiency system, a distribution warehouse loss monitoring system, an office entrance/exit management system, and the like.
- the present invention has an effect of improving the accuracy of position estimation when estimating the position of an object using a probability density distribution, and is suitable for application to an object tracking system, for example.
Description
2 Tag device
3 Sensor device
6 Wireless communication unit
7 Power supply control unit
8 Memory
9 Wireless communication unit
10 Camera unit
11 Integration processing unit
14 Wireless positioning unit
16 Image positioning unit
17 Position specifying unit
18 Interpolation processing unit
21 Image interpolation unit
22 Identification information interpolation unit
23 Position specification success/failure determination unit
24 Image positioning success/failure determination unit
25 Pause request unit
26 Transmission request unit
100 Object position estimation apparatus
101 Camera
102 Captured image acquisition unit
103 Image coordinate probability density distribution forming unit
111 Tag signal receiver
112 Tag information acquisition unit
113 Tag coordinate probability density distribution forming unit
120 Probability density integration unit
121 Weight determination unit
122 Integration unit
130 Determination unit
131 Tag status detection unit
132 Probability density distribution change control unit
200 Target
201 Wireless tag
Hereinafter, a position specifying system according to an embodiment of the present invention will be described with reference to the drawings. This position specifying system can be used, for example, as a factory work efficiency system that analyzes the flow lines of people in a factory to improve work efficiency, a distribution warehouse loss monitoring system that constantly monitors the movement of people in a warehouse area to detect, suppress, and record loss and misdelivery accidents at an early stage, an office entrance/exit management system that automatically records entrance/exit histories in an office, and the like, but its applications are not limited to these.
(1) Time data at which the detection signal was received from the tag device (for example, date, hour, minute, second)
(2) Identification ID of the tag device
(3) Three-dimensional coordinate data of the positioned tag device (three-dimensional position coordinates in real space)
(1) Image data
(2) Image area of the person object in the image (for example, the area enclosed by the outline of the person in the image, or a rectangular area circumscribing the person in the image)
(3) Representative point of the image area of the person object (for example, the center of gravity of the image area)
(4) Time data (for example, the frame number of the captured image)
(5) Three-dimensional coordinate data obtained by image positioning (three-dimensional position coordinates in real space)
(1) Identification ID of the sensor device
(2) Time data (for example, date, hour, minute, second)
(3) Frame image data
(4) Image area of the person object
(5) Representative point of the image area of the person object
(6) Identification ID of the tag device
(7) Degree of integration (for example, T: tag-measured coordinates only; D: representative point calculated from the image only; C: both the tag-measured coordinates and the representative point calculated from the image; P: interpolated or estimated coordinates)
First, the interpolation processing of the position of the person to be monitored will be described. FIG. 4 and FIG. 5 are diagrams (schematic views of the display screen of the monitor 27) for explaining the interpolation processing. In the examples shown in FIGS. 4 and 5, one person to be monitored (carrying the tag device 2 with "ID: 1") gradually moves from the left side of the screen to the center and then to the right side of the screen.
Next, pause control of the tag device 2 will be described as an example of the power saving control of the tag device 2. FIG. 6 is a diagram (schematic view of the display screen of the monitor 27) for explaining the pause control of the tag device 2. As described above, once the position specifying unit 17 has linked the position of the person to be monitored photographed by the camera unit 10 (photographed position) with the position of the person detected by wireless communication with the tag device 2 (tag position), tracking of the position of the person to be monitored by the tag device 2 is possible.
Next, as another example of the power saving control of the tag device 2, a case will be described in which the frequency of positioning of the tag device 2 by the wireless communication unit 9 (tag positioning) is lower than the frequency of positioning of the person to be monitored by the camera unit 10 (image positioning).
Claims (30)
- A position specifying system comprising: a wireless terminal device held by a monitoring target; and a sensor device comprising a wireless communication unit that wirelessly communicates with the wireless terminal device and a photographing unit that photographs an image of the monitoring target, wherein the wireless terminal device comprises an identification information holding unit that holds identification information unique to the wireless terminal device, and a transmission unit that transmits a detection signal for detecting the position of the wireless terminal device together with the identification information; and the sensor device comprises a wireless positioning unit that detects the position of the wireless terminal device based on the detection signal including the identification information received by the wireless communication unit, an image positioning unit that calculates the position of the monitoring target based on the image photographed by the photographing unit, and an integrated position specifying unit that specifies the position of the monitoring target by linking the position calculated by the image positioning unit with the position detected by the wireless positioning unit and the identification information.
- The position specifying system according to claim 1, wherein the sensor device comprises: a position specification success/failure determination unit that determines whether the specification of the position of the monitoring target by the integrated position specifying unit has succeeded; and a pause request unit that transmits, to the wireless terminal device, a transmission pause request for lowering the transmission frequency of the detection signal when it is determined that the specification of the position of the monitoring target has succeeded.
- The position specifying system according to claim 2, wherein the sensor device comprises: an image positioning success/failure determination unit that determines whether the image positioning of the monitoring target by the image positioning unit has succeeded; and a first transmission request unit that transmits a transmission request for the detection signal to the wireless terminal device when the image positioning success/failure determination unit determines that the image positioning of the monitoring target has failed.
- The position specifying system according to claim 2, wherein the sensor device comprises a second transmission request unit that transmits a transmission request for the detection signal to the wireless terminal device when, after the transmission pause request has been transmitted, the position specification success/failure determination unit determines that the specification of the position of the monitoring target has failed.
- The position specifying system according to claim 3, wherein the sensor device comprises a first identification information interpolation unit that, when the position specification success/failure determination unit determines that the specification of the position of the monitoring target by the integrated position specifying unit has failed and subsequently determines that the specification of the position of the monitoring target by the integrated position specifying unit has succeeded again, gives, to the images of the monitoring target captured while the position specification was failing, the identification information of the wireless terminal device linked again to the monitoring target by the integrated position specifying unit.
- The position specifying system according to claim 3, wherein the sensor device comprises an image interpolation unit that, when the image positioning succeeds again after the image positioning of the monitoring target has failed, interpolates the images of the monitoring target for the period during which the image positioning was failing, based on the previously image-positioned position and the currently image-positioned position.
- The position specifying system according to claim 1, wherein the transmission frequency of the detection signal of the wireless terminal device and the frequency of the wireless positioning of the wireless terminal device by the wireless positioning unit are set lower than the frequency of the image positioning of the monitoring target by the image positioning unit, and the sensor device comprises a second identification information interpolation unit that gives, to the images of the monitoring target in images photographed by the photographing unit before the position of the monitoring target is specified by the integrated position specifying unit, the identification information of the wireless terminal device linked to the image of the monitoring target by the integrated position specifying unit at or after the time of the position specification.
- The position specifying system according to claim 1, comprising an association unit that associates the results of the image positioning and the wireless positioning of the monitoring target based on the distance between the position calculated by the image positioning unit and the position detected by the wireless positioning unit.
- The position specifying system according to claim 8, wherein the association unit associates the results of the image positioning and the wireless positioning of the monitoring target when the position detected by the wireless positioning unit exists within a predetermined search area centered on the position calculated by the image positioning unit.
- The position specifying system according to claim 8, wherein the association unit associates the results of the image positioning and the wireless positioning of the monitoring target when the position calculated by the image positioning unit exists within a predetermined search area centered on the position detected by the wireless positioning unit.
- The position specifying system according to claim 8, comprising a positioning accuracy comparison unit that compares the positioning accuracies of the image positioning and the wireless positioning of the monitoring target, wherein, based on the comparison result of the positioning accuracies, the association unit associates the results of the image positioning and the wireless positioning of the monitoring target when the positioning accuracy of the image positioning is higher and the position detected by the wireless positioning unit exists within a predetermined search area centered on the position calculated by the image positioning unit, and associates the results of the image positioning and the wireless positioning of the monitoring target when the positioning accuracy of the wireless positioning is higher and the position calculated by the image positioning unit exists within a predetermined search area centered on the position detected by the wireless positioning unit.
- The position specifying system according to claim 9, wherein the size of the predetermined search area centered on the position calculated by the image positioning unit is set according to the positioning accuracy of the image positioning of the monitoring target.
- The position specifying system according to claim 10, wherein the size of the predetermined search area centered on the position calculated by the wireless positioning unit is set according to the positioning accuracy of the wireless positioning of the monitoring target.
- The position specifying system according to claim 9, comprising a history holding unit that, when a plurality of positions detected by the wireless positioning unit exist within the predetermined search area centered on the position calculated by the image positioning unit, holds information on the plurality of positions as candidate information.
- The position specifying system according to claim 10, comprising a history holding unit that, when a plurality of positions detected by the image positioning unit exist within the predetermined search area centered on the position calculated by the wireless positioning unit, holds information on the plurality of positions as candidate information.
- The position specifying system according to claim 14, wherein the association unit associates the results of the image positioning and the wireless positioning of the monitoring target based on the candidate information held in the history holding unit.
- The position specifying system according to claim 15, wherein the association unit associates the results of the image positioning and the wireless positioning of the monitoring target based on the candidate information held in the history holding unit.
- The position specifying system according to claim 8, wherein the association unit comprises a combination calculation unit that calculates the combination that minimizes the sum of squares of the differences in distance between the positions calculated by the image positioning unit and the positions detected by the wireless positioning unit.
- The position specifying system according to claim 1, wherein the integrated position specifying unit determines the average position of the position calculated by the image positioning unit and the position detected by the wireless positioning unit as the position of the monitoring target.
- The position specifying system according to claim 19, wherein the average is a weighted average according to the positioning accuracies of the image positioning and the wireless positioning of the monitoring target.
- The position specifying system according to claim 1, wherein the position calculated by the image positioning unit and the position detected by the wireless positioning unit are arranged in a cell space divided into a plurality of cells, and the position specifying system comprises an association unit that associates the results of the image positioning and the wireless positioning of the monitoring target based on the positional relationship between the cell to which the position calculated by the image positioning unit belongs and the cell to which the position detected by the wireless positioning unit belongs.
- The position specifying system according to claim 1, comprising a warning unit that performs processing to issue a warning when no position detected by the wireless positioning unit exists within a predetermined search area centered on the position calculated by the image positioning unit.
- A sensor device comprising: a wireless communication unit that wirelessly communicates with a wireless terminal device that is held by a monitoring target and transmits identification information together with a detection signal; a photographing unit that photographs an image of the monitoring target; a wireless positioning unit that detects the position of the wireless terminal device based on the detection signal including the identification information received by the wireless communication unit; an image positioning unit that calculates the position of the monitoring target based on the image photographed by the photographing unit; and an integrated position specifying unit that specifies the position of the monitoring target by linking the position calculated by the image positioning unit with the position detected by the wireless positioning unit and the identification information.
- An object position estimation apparatus comprising: a first probability density distribution forming unit that forms a first probability density distribution for the coordinates of a target based on a captured image captured by a camera; a second probability density distribution forming unit that forms a second probability density distribution for the coordinates of the target based on a signal from a sensor attached to the target; a probability density integration unit that integrates the first probability density distribution and the second probability density distribution; a detection unit that detects the surrounding situation of the target based on the captured image; and a probability density distribution changing unit that changes, according to the detected surrounding situation, the probability density distribution in the second probability density distribution forming unit or a weighting factor for the second probability density distribution in the probability density integration unit.
- The object position estimation apparatus according to claim 24, wherein the second probability density distribution is a normal distribution, and the probability density distribution changing unit changes, according to the detection result, the variance of the normal distribution, the mean of the normal distribution, or the probability density of a partial region within the normal distribution corresponding to the detection result.
- The object position estimation apparatus according to claim 25, wherein the detection unit detects, as the surrounding situation, an obstacle around the target, and the probability density distribution changing unit sets the probability density at the position where the obstacle is detected to 0.
- The object position estimation apparatus according to claim 25, wherein the sensor attached to the target is a wireless tag, the detection unit detects whether an obstacle exists between the wireless tag and the apparatus itself, and the probability density distribution changing unit makes the variance of the normal distribution larger when an obstacle is detected than when no obstacle is detected.
- The object position estimation apparatus according to claim 25, wherein the sensor attached to the target is a wireless tag, the detection unit detects, as the surrounding situation, an obstacle around the target, and the probability density distribution changing unit changes the mean of the normal distribution between the case where an obstacle is detected and the case where no obstacle is detected.
- The object position estimation apparatus according to claim 24, wherein the second probability density distribution forming unit has a particle filter, and the probability density distribution changing unit changes, according to the detection result, the likelihood of the particle filter and/or the particles to be excluded.
- An object position estimation method comprising: forming a first probability density distribution of a target from a captured image including the target; forming a second probability density distribution of the target based on a signal from a sensor attached to the target; detecting the surrounding situation of the target based on the captured image; changing the second probability density distribution according to the detected surrounding situation; and integrating the first probability density distribution and the second probability density distribution.
Families Citing this family (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007136745A2 (en) | 2006-05-19 | 2007-11-29 | University Of Hawaii | Motion tracking system for real time adaptive imaging and spectroscopy |
AU2008200926B2 (en) * | 2008-02-28 | 2011-09-29 | Canon Kabushiki Kaisha | On-camera summarisation of object relationships |
WO2009151778A2 (en) * | 2008-04-14 | 2009-12-17 | Mojix, Inc. | Radio frequency identification tag location estimation and tracking system and method |
JP5017392B2 (ja) * | 2010-02-24 | 2012-09-05 | クラリオン株式会社 | 位置推定装置および位置推定方法 |
US9762976B2 (en) | 2010-04-14 | 2017-09-12 | Mojix, Inc. | Systems and methods for detecting patterns in spatio-temporal data collected using an RFID system |
US20110298930A1 (en) * | 2010-06-04 | 2011-12-08 | Polaris Wireless, Inc. | Integrated Wireless Location and Surveillance System |
EP2747641A4 (en) | 2011-08-26 | 2015-04-01 | Kineticor Inc | METHOD, SYSTEMS AND DEVICES FOR SCAN INTERNAL MOTION CORRECTION |
KR101340287B1 (ko) * | 2012-04-13 | 2013-12-10 | 경기대학교 산학협력단 | 스마트 홈에서 마이닝 기반의 패턴 분석을 이용한 침입탐지 시스템 |
US9143741B1 (en) * | 2012-08-17 | 2015-09-22 | Kuna Systems Corporation | Internet protocol security camera connected light bulb/system |
EP2704055A1 (en) * | 2012-08-31 | 2014-03-05 | Layar B.V. | Determining space to display content in augmented reality |
GB201220584D0 (en) * | 2012-11-15 | 2013-01-02 | Roadpixel Ltd | A tracking or identification system |
US10327708B2 (en) | 2013-01-24 | 2019-06-25 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
US9717461B2 (en) | 2013-01-24 | 2017-08-01 | Kineticor, Inc. | Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan |
US9305365B2 (en) | 2013-01-24 | 2016-04-05 | Kineticor, Inc. | Systems, devices, and methods for tracking moving targets |
WO2014120734A1 (en) | 2013-02-01 | 2014-08-07 | Kineticor, Inc. | Motion tracking system for real time adaptive motion compensation in biomedical imaging |
US9576213B2 (en) * | 2013-02-08 | 2017-02-21 | Chuck Fung | Method, system and processor for instantly recognizing and positioning an object |
US9836028B2 (en) | 2013-02-08 | 2017-12-05 | Chuck Fung | Method, system and processor for instantly recognizing and positioning an object |
US9111156B2 (en) | 2013-03-15 | 2015-08-18 | Mojix, Inc. | Systems and methods for compressive sensing ranging evaluation |
CN103491352A (zh) * | 2013-10-08 | 2014-01-01 | 尹梦寒 | 智慧社区的无线定位监控系统 |
CN103607538A (zh) | 2013-11-07 | 2014-02-26 | 北京智谷睿拓技术服务有限公司 | 拍摄方法及拍摄装置 |
CN104808227A (zh) * | 2014-01-28 | 2015-07-29 | 纳米新能源(唐山)有限责任公司 | 用于士兵定位的无线定位装置和无线定位系统 |
EP3157422A4 (en) | 2014-03-24 | 2018-01-24 | The University of Hawaii | Systems, methods, and devices for removing prospective motion correction from medical imaging scans |
US9721445B2 (en) * | 2014-06-06 | 2017-08-01 | Vivint, Inc. | Child monitoring bracelet/anklet |
CN106714681A (zh) | 2014-07-23 | 2017-05-24 | 凯内蒂科尔股份有限公司 | 用于在医学成像扫描期间追踪和补偿患者运动的系统、设备和方法 |
EP4343728A3 (en) | 2014-12-30 | 2024-06-19 | Alarm.com Incorporated | Digital fingerprint tracking |
US10310080B2 (en) * | 2015-02-25 | 2019-06-04 | The Boeing Company | Three dimensional manufacturing positioning system |
US9883337B2 (en) | 2015-04-24 | 2018-01-30 | Mijix, Inc. | Location based services for RFID and sensor networks |
US9943247B2 (en) | 2015-07-28 | 2018-04-17 | The University Of Hawai'i | Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan |
KR102432806B1 (ko) * | 2015-10-26 | 2022-08-12 | 한화테크윈 주식회사 | 감시 시스템 및 그 제어 방법 |
EP3380007A4 (en) | 2015-11-23 | 2019-09-04 | Kineticor, Inc. | SYSTEMS, DEVICES, AND METHODS FOR MONITORING AND COMPENSATING A MOVEMENT OF A PATIENT DURING MEDICAL IMAGING SCAN |
US10007991B2 (en) | 2016-01-29 | 2018-06-26 | International Business Machines Corporation | Low-cost method to reliably determine relative object position |
DE102016213234A1 (de) * | 2016-02-12 | 2017-08-31 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Device for displaying user information and corresponding method |
CN105828296A (zh) * | 2016-05-25 | 2016-08-03 | 武汉域讯科技有限公司 | Indoor positioning method based on fusion of image matching and Wi-Fi |
WO2018222532A1 (en) * | 2017-06-01 | 2018-12-06 | Vid Scale, Inc. | Rfid based zoom lens tracking of objects of interest |
US10791425B2 (en) * | 2017-10-04 | 2020-09-29 | Enlighted, Inc. | Mobile tag sensing and location estimation |
US10785957B2 (en) * | 2017-12-06 | 2020-09-29 | Trupanion, Inc. | Motion powered pet tracker system and method |
WO2019187905A1 (ja) * | 2018-03-26 | 2019-10-03 | Alps Alpine Co., Ltd. | Position estimation device, position estimation system, position estimation method, and program |
CN110047262A (zh) * | 2019-05-20 | 2019-07-23 | 帷幄匠心科技(杭州)有限公司 | Infrared positioning and detection system for store SKUs |
CN111208581B (zh) * | 2019-12-16 | 2022-07-05 | Changchun University of Science and Technology | Multi-dimensional unmanned aerial vehicle identification system and method |
SG10201913005YA (en) * | 2019-12-23 | 2020-09-29 | Sensetime Int Pte Ltd | Method, apparatus, and system for recognizing target object |
JP7310718B2 (ja) * | 2020-05-27 | 2023-07-19 | Toyota Motor Corp | Road obstacle detection device, road obstacle detection method, and road obstacle detection program |
TWI774140B (zh) * | 2020-11-26 | 2022-08-11 | 聚眾聯合科技股份有限公司 | Two-way signal positioning method and two-way signal positioning system |
DE102020133787A1 (de) * | 2020-12-16 | 2022-06-23 | Sick Ag | Safety system and method with a safety system |
CN113610993B (zh) * | 2021-08-05 | 2022-05-17 | Nanjing Normal University | 3D map building annotation method based on candidate label evaluation |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2002255750B2 (en) * | 2001-03-12 | 2005-09-15 | Eureka Technologies Partners, Llc | Article locator system |
US8184154B2 (en) * | 2006-02-27 | 2012-05-22 | Texas Instruments Incorporated | Video surveillance correlating detected moving objects and RF signals |
2009
- 2009-02-27: JP application JP2010502708A filed; granted as JP5385893B2 (status: Expired - Fee Related)
- 2009-02-27: US application US12/673,087 filed; granted as US8243136B2 (status: Expired - Fee Related)
- 2009-02-27: PCT application PCT/JP2009/000915 filed as WO2009113265A1 (status: Application Filing)
- 2009-02-27: EP application EP09721115.5A filed; published as EP2264679A4 (status: Withdrawn)
- 2009-02-27: CN application CN200980000398A filed; published as CN101681549A (status: Pending)
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001359147A (ja) * | 2000-04-14 | 2001-12-26 | Miwa Science Kenkyusho KK | Position monitoring system for mobile objects in the area near a specific point |
JP2004030022A (ja) * | 2002-06-24 | 2004-01-29 | Shimizu Corp | Intrusion detection system |
JP2005056213A (ja) * | 2003-08-06 | 2005-03-03 | Matsushita Electric Ind Co Ltd | Information providing system, information providing server, and information providing method |
JP2005141687A (ja) | 2003-11-10 | 2005-06-02 | Nippon Telegraph & Telephone Corp (NTT) | Object tracking method, object tracking device, object tracking system, program, and recording medium |
JP2006127240A (ja) * | 2004-10-29 | 2006-05-18 | Secom Co Ltd | Suspicious person detection system and suspicious person detection program |
JP2006164199A (ja) * | 2004-12-10 | 2006-06-22 | Hitachi Maxell Ltd | Theft detection device and anti-theft system |
JP2006311111A (ja) | 2005-04-27 | 2006-11-09 | Daikin Ind Ltd | Position detection system and position detection method |
JP2007188279A (ja) * | 2006-01-13 | 2007-07-26 | Fukushima Prefecture | Tag information read/write system for wireless IC tags for identifying tag information and position information of an object |
JP2007309757A (ja) * | 2006-05-17 | 2007-11-29 | Toyota Motor Corp | Object recognition device |
WO2007138811A1 (ja) * | 2006-05-31 | 2007-12-06 | Nec Corporation | Suspicious behavior detection device and method, program, and recording medium |
JP2007328747A (ja) * | 2006-06-09 | 2007-12-20 | Sony Computer Entertainment Inc | Feature point search device, image analysis device, and nearest-neighbor feature point detection method |
JP2008070206A (ja) * | 2006-09-13 | 2008-03-27 | Toyota Motor Corp | Object recognition device |
JP2007207260A (ja) * | 2007-02-26 | 2007-08-16 | Denso Corp | Image server |
Non-Patent Citations (1)
Title |
---|
See also references of EP2264679A4 |
Cited By (51)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102043157B (zh) * | 2009-10-15 | 2013-08-14 | Electronics and Telecommunications Research Institute | Apparatus and method for estimating position of object |
JPWO2011061905A1 (ja) * | 2009-11-20 | 2013-04-04 | NEC Corporation | Object region extraction device, object region extraction method, and program |
JP2013513336A (ja) * | 2009-12-08 | 2013-04-18 | TruePosition, Inc. | Multi-sensor localization and identification |
JP2011128107A (ja) * | 2009-12-21 | 2011-06-30 | Mitsubishi Electric Corp | Moving object management device |
JPWO2011101945A1 (ja) * | 2010-02-19 | 2013-06-17 | Panasonic Corporation | Object position correction apparatus, object position correction method, and object position correction program |
US8401234B2 (en) | 2010-02-19 | 2013-03-19 | Panasonic Corporation | Object position correction apparatus, object position correction method, and object position correction program |
WO2011101945A1 (ja) * | 2010-02-19 | 2011-08-25 | Panasonic Corporation | Object position correction apparatus, object position correction method, and object position correction program |
JP4875228B2 (ja) * | 2010-02-19 | 2012-02-15 | Panasonic Corporation | Object position correction apparatus, object position correction method, and object position correction program |
JP2011215829A (ja) * | 2010-03-31 | 2011-10-27 | Hitachi Ltd | Monitoring device and suspicious behavior detection method |
EP2641235A4 (en) * | 2010-11-19 | 2014-08-06 | Isolynx Llc | SYSTEMS AND METHODS FOR ASSOCIATIVE OBJECT TRACKING |
EP2641235A1 (en) * | 2010-11-19 | 2013-09-25 | Isolynx, LLC | Associative object tracking systems and methods |
WO2012068582A1 (en) | 2010-11-19 | 2012-05-24 | Isolynx, Llc | Associative object tracking systems and methods |
JP2012129858A (ja) * | 2010-12-16 | 2012-07-05 | Mitsubishi Electric Corp | Video monitoring device and video monitoring system |
JP2012159957A (ja) * | 2011-01-31 | 2012-08-23 | Secom Co Ltd | Moving object tracking device |
JP2013050926A (ja) * | 2011-08-31 | 2013-03-14 | Secom Co Ltd | Suspicious object monitoring system |
JP2013058167A (ja) * | 2011-09-09 | 2013-03-28 | Secom Co Ltd | Moving object monitoring system |
JP2013072858A (ja) * | 2011-09-29 | 2013-04-22 | Panasonic Corp | Moving object position estimation device, moving object position estimation method, and moving object position estimation program |
JP2013114348A (ja) * | 2011-11-25 | 2013-06-10 | Secom Co Ltd | Moving object monitoring system |
JP2013257715A (ja) * | 2012-06-12 | 2013-12-26 | Fujitsu Advanced Engineering Ltd | Work record management device, method, and program |
JP2015108999A (ja) * | 2013-12-05 | 2015-06-11 | Mitsubishi Electric Corp | Dynamic state management system |
JP2016042298A (ja) * | 2014-08-18 | 2016-03-31 | Toyota Central R&D Labs Inc | Accident information calculation device and program |
US9752880B2 (en) | 2015-01-09 | 2017-09-05 | Fujitsu Limited | Object linking method, object linking apparatus, and storage medium |
US9824189B2 (en) | 2015-01-23 | 2017-11-21 | Panasonic Intellectual Property Management Co., Ltd. | Image processing apparatus, image processing method, image display system, and storage medium |
JP2016154329A (ja) * | 2015-02-09 | 2016-08-25 | Ricoh Co Ltd | Method and device for identifying a user carrying a mobile terminal |
US10123158B2 (en) | 2015-02-09 | 2018-11-06 | Ricoh Company, Ltd. | Method and device for recognizing user of mobile device |
JP2016192764A (ja) * | 2015-03-30 | 2016-11-10 | International Business Machines Corporation | Method, computer system, and computer program for including an identifier as metadata associated with a captured image |
JP2017046023A (ja) * | 2015-08-24 | 2017-03-02 | Mitsubishi Electric Corp | Moving object tracking device, moving object tracking method, and moving object tracking program |
WO2018135095A1 (ja) * | 2017-01-20 | 2018-07-26 | Sony Corporation | Information processing apparatus, information processing method, and information processing system |
US11721026B2 (en) | 2017-01-20 | 2023-08-08 | Sony Corporation | Information processing apparatus, information processing method, and information processing system |
JP7036036B2 (ja) | 2017-01-20 | 2022-03-15 | Sony Group Corporation | Information processing apparatus, information processing method, and information processing system |
JPWO2018135095A1 (ja) * | 2017-01-20 | 2019-11-07 | Sony Corporation | Information processing apparatus, information processing method, and information processing system |
EP3573326A4 (en) * | 2017-01-20 | 2019-11-27 | Sony Corporation | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING SYSTEM |
JP2019009562A (ja) * | 2017-06-22 | 2019-01-17 | Toppan Printing Co Ltd | Surveillance video display system, surveillance video display device, surveillance information management server, and surveillance video display method |
JP2019020360A (ja) * | 2017-07-21 | 2019-02-07 | Nippon Telegraph & Telephone Corp (NTT) | Object recognition device, object recognition method, and program |
JP2021505898A (ja) * | 2017-12-11 | 2021-02-18 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Method for determining the current position of an object, positioning system, tracker, and computer program |
US11662456B2 (en) | 2017-12-11 | 2023-05-30 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Method to determine a present position of an object, positioning system, tracker and computer program |
JP7191104B2 (ja) | 2017-12-11 | 2022-12-16 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Method for determining the current position of an object, positioning system, tracker, and computer program |
US11308346B2 (en) | 2018-02-02 | 2022-04-19 | Nec Corporation | Sensor information integration system, sensor information integration method, and program |
WO2020026480A1 (ja) * | 2018-07-31 | 2020-02-06 | Shimizu Corporation | Position detection system and position detection method |
US11898847B2 (en) | 2018-07-31 | 2024-02-13 | Shimizu Corporation | Position detecting system and position detecting method |
JP2020020645A (ja) * | 2018-07-31 | 2020-02-06 | Shimizu Corporation | Position detection system and position detection method |
JP7257752B2 (ja) | 2018-07-31 | 2023-04-14 | Shimizu Corporation | Position detection system |
US11930476B2 (en) | 2018-10-29 | 2024-03-12 | Nec Corporation | Sensor information integration system, sensor information integration method, program, and recording medium |
JP2022513511A (ja) * | 2018-12-18 | 2022-02-08 | Robert Bosch GmbH | Method for determining an integrity range |
JP7284268B2 (ja) | 2018-12-18 | 2023-05-30 | Robert Bosch GmbH | Method for determining an integrity range |
JP2020102817A (ja) * | 2018-12-25 | 2020-07-02 | Toppan Printing Co Ltd | Monitoring target identification device, monitoring target identification system, and monitoring target identification method |
JP2020118619A (ja) * | 2019-01-25 | 2020-08-06 | Konica Minolta Inc | Moving object tracking system and moving object tracking method |
JP7567328B2 (ja) | 2020-09-30 | 2024-10-16 | Toshiba Lighting & Technology Corp | Information processing system |
WO2022201682A1 (ja) * | 2021-03-25 | 2022-09-29 | Panasonic Intellectual Property Management Co., Ltd. | Position detection system, position detection method, and program |
JP7565504B2 (ja) | 2021-03-25 | 2024-10-11 | Panasonic Intellectual Property Management Co., Ltd. | Position detection system, position detection method, and program |
WO2022239644A1 (ja) * | 2021-05-12 | 2022-11-17 | Denso Corp | Tracking device |
Also Published As
Publication number | Publication date |
---|---|
JP5385893B2 (ja) | 2014-01-08 |
JPWO2009113265A1 (ja) | 2011-07-21 |
EP2264679A1 (en) | 2010-12-22 |
CN101681549A (zh) | 2010-03-24 |
US20110205358A1 (en) | 2011-08-25 |
US8243136B2 (en) | 2012-08-14 |
EP2264679A4 (en) | 2013-08-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5385893B2 (ja) | Position specifying system and sensor device | |
US10402984B2 (en) | Monitoring | |
US9989626B2 (en) | Mobile robot and sound source position estimation system | |
CN106341661B (zh) | Patrol robot | |
WO2018215829A1 (en) | Systems and methods for user detection, identification, and localization within a defined space | |
JP5147761B2 (ja) | Image monitoring device | |
JP7095746B2 (ja) | Sensor information integration system and sensor information integration method | |
JP5807635B2 (ja) | Flow line detection system, flow line detection method, and flow line detection program | |
JP2009295140A (ja) | Intruder detection system and method | |
JPWO2011021588A1 (ja) | Moving object trajectory identification system | |
JP6588413B2 (ja) | Monitoring device and monitoring method | |
CN114446026B (zh) | Article-left-behind reminder method, corresponding electronic device, and apparatus | |
CN110209281B (zh) | Method for processing motion signals, electronic device, and medium | |
US10741031B2 (en) | Threat detection platform with a plurality of sensor nodes | |
JP2021077295A (ja) | Monitoring device, monitoring method, and program | |
US20170118446A1 (en) | Surveillance system and method of controlling the same | |
JP7023803B2 (ja) | Monitoring system | |
CN103675821B (zh) | Positioning method for a camera indoor positioning system based on an ultrasonic sensor array | |
KR101518314B1 (ko) | Video surveillance method and apparatus using an unmanned aerial surveillance device | |
CN104243894A (zh) | Audio-video fusion monitoring method | |
US11209796B2 (en) | Surveillance system with intelligent robotic surveillance device | |
JP6967868B2 (ja) | Monitoring system, monitoring program, and storage medium | |
CN111966126A (zh) | Unmanned aerial vehicle patrol method and device, and unmanned aerial vehicle | |
JP2021149687A (ja) | Object recognition device, object recognition method, and object recognition program | |
CN113923599B (zh) | VSLAM loop closure detection method based on fused wireless signals | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 200980000398.6; Country of ref document: CN |
| ENP | Entry into the national phase | Ref document number: 2010502708; Country of ref document: JP; Kind code of ref document: A |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09721115; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 12673087; Country of ref document: US |
| WWE | Wipo information: entry into national phase | Ref document number: 2009721115; Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: DE |