WO2011068184A1 - Positioning Device for a Moving Body - Google Patents
Positioning Device for a Moving Body
- Publication number
- WO2011068184A1 (PCT/JP2010/071637)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- processing device
- moving body
- output
- moving
- outputs
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S11/00—Systems for determining distance or velocity not using reflection or reradiation
- G01S11/12—Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
- G06V40/25—Recognition of walking or running movements, e.g. gait recognition
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/005—Traffic control systems for road vehicles including pedestrian guidance indicator
Definitions
- the present invention relates to a positioning device for a moving body that uses an external surveillance camera, an indispensable element when monitoring and tracking the moving body.
- Non-Patent Document 1 describes Pedestrian Dead Reckoning (PDR).
- in PDR, the output of a self-contained sensor is analyzed under the constraints of human walking motion, and the pedestrian's movement vector or moving speed vector is estimated and accumulated step by step.
- the measurement error gradually accumulates with the progress of movement.
- the person in the video taken from the surveillance camera must be associated with the person being tracked by the PDR.
- An object of the present invention is to solve these problems by providing a positioning device for a moving body that uses an external surveillance camera, an indispensable element when monitoring and tracking the moving body.
- the first problem, the accumulation of PDR measurement errors, is solved by combining the analysis result of the video from an externally installed surveillance camera with the PDR measurement result and correcting the latter. That is, the person in the surveillance camera video is associated with the person tracked by the PDR, and the PDR position is corrected to the position of the person in the video.
- for the second problem, the positioning device tracks the position of the person in the video by analyzing the surveillance camera video; since the moving speed vector can be estimated from this, the individual difference parameter is estimated by jointly analyzing that result and the PDR sensor data.
- the third problem, associating the person in the surveillance camera video with the person tracked by the PDR, is solved by the following procedure. First, the video of a surveillance camera whose installation relationship to the floor on which the pedestrian moves is known is analyzed: the foreground person image is cut out by focusing on the difference image from the background, and the positions of the person's head and feet are estimated. Next, the moving speed vector or movement vector is estimated by tracking the foot position step by step. By comparing this with the moving speed vector or movement vector output by the PDR, the PDR output is associated with the person tracked by the surveillance camera.
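The foreground extraction and foot-position step of the procedure above can be sketched as follows. This is a minimal illustration on grayscale pixel grids; the function names and the fixed difference threshold are hypothetical, not taken from the patent:

```python
def extract_foreground(frame, background, thresh=30):
    """Binary foreground mask from the absolute difference image
    between the current frame and the background (both grayscale grids)."""
    return [[abs(f - b) > thresh for f, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

def foot_position(mask):
    """Crude foot estimate: the lowest foreground row (feet touch the floor),
    with the column taken as the mean of that row's foreground pixels.
    Returns (row, col) or None when no foreground exists."""
    for row in range(len(mask) - 1, -1, -1):
        cols = [c for c, on in enumerate(mask[row]) if on]
        if cols:
            return row, sum(cols) // len(cols)
    return None
```

Tracking this foot position frame by frame, and differentiating it over time, yields the camera-side moving speed vector sequence that the patent compares against the PDR output.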
- a positioning device for a moving body according to the present invention includes an internal world observation apparatus that is provided in the moving body and that measures and outputs the movement of the moving body, and estimates the position of the moving body based on the output of the internal world observation apparatus.
- in one configuration, the positioning device for a moving body includes: a self-contained sensor group that outputs the moving speed vector of the moving body using an acceleration sensor provided in the moving body; a moving speed vector estimation processing device that measures the moving speed vector based on the output of the self-contained sensor group and outputs an output sequence of moving speed vectors together with the measured times; a surveillance camera that images the moving body from the outside; a video analysis processing device that analyzes the video of the surveillance camera, measures the position of the feet of the moving body, measures the moving speed vector at that foot position, and outputs an output sequence of moving speed vectors together with the measured times; and a moving speed vector collation processing device that collates the output sequence of the moving speed vector estimation processing device with the output sequence of the video analysis processing device and outputs the collation result as a TRUE or FALSE signal.
- in another configuration, the positioning device includes: a self-contained sensor group that outputs the movement vector of the moving body by accumulating the moving speed vector obtained with an acceleration sensor provided in the moving body; a movement vector estimation processing device that measures the movement vector based on the output of the self-contained sensor group and outputs an output sequence of movement vectors together with the measured times; a surveillance camera that images the moving body from the outside; a video analysis processing device that analyzes the video of the surveillance camera, measures the position of the feet of the moving body in the video, measures the movement vector of that foot position, and outputs an output sequence of movement vectors together with the measured times; and a movement vector collation processing device that collates the output sequence of the movement vector estimation processing device with the output sequence of the video analysis processing device and outputs the collation result as a TRUE or FALSE signal.
- in yet another configuration, the positioning device includes: a self-contained sensor group that outputs the movement vector of the moving body by integrating the moving speed vector obtained with an acceleration sensor provided in the moving body; a self-contained-sensor-based motion identification processing device that identifies the motion type of the moving body based on the output of the self-contained sensor group and outputs the identification result together with the measured times; a surveillance camera that images the moving body from the outside; a video analysis processing device that analyzes the video of the surveillance camera, identifies the motion type of the moving body in the video, and outputs the identification result together with the measured times; and a motion type collation processing device that collates the output of the self-contained-sensor-based motion identification processing device with the output of the video analysis processing device and outputs a TRUE or FALSE signal as the collation result.
- when collating the output sequence from the moving speed vector estimation processing device with the output sequence from the video analysis processing device, the moving speed vector collation processing device may also perform the collation by determining a weighting factor according to the area of a unit pixel of the surveillance camera video projected onto the floor surface at the estimated foot position of the moving body. The same applies when the movement vector collation processing device collates the output sequence from the movement vector estimation processing device with the output sequence from the video analysis processing device. In this collation, the weighting factor is decreased as the projected unit pixel area becomes larger, and increased as it becomes smaller.
- the positioning device may further include a position correction signal output device. When the moving speed vector collation processing device outputs a TRUE signal, the position correction signal output device outputs a signal for correcting the position of the moving body on which the self-contained sensor group is mounted to the position coordinates of the moving body output from the video analysis processing device.
- the positioning device may further include a walking parameter estimation processing device, with the moving speed vector estimation processing device operating on a preset individual difference parameter. When the moving speed vector collation processing device outputs a TRUE signal, the walking parameter estimation processing device re-estimates the individual difference parameter, and the individual difference parameter in the moving speed vector estimation processing device is reset accordingly.
- the positioning device may further include an identification information storage/display device that stores and displays identification information for identifying the moving body. When the moving speed vector collation processing device outputs a TRUE signal, the identification information of the moving body wearing the self-contained sensor is stored and displayed as information indicating that moving body in the surveillance camera video. When the collation processing device outputs a FALSE signal, identification information indicating that the self-contained sensor is not worn is stored and displayed as information indicating the moving body in the surveillance camera video.
- according to the present invention, the wearer of an internal world observation device, for example a PDR sensor serving as the self-contained sensor group of the positioning device, can be identified and displayed in the video of an external world observation device such as a surveillance camera.
- in addition, the individual difference parameter can be estimated, and a calibration operation can be performed in advance based on the estimated individual difference parameter.
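The individual difference parameter estimation mentioned above can be sketched as a least-squares fit between camera-derived and PDR-derived speeds. This is a simplified illustration: reducing the individual difference to a single scalar scale factor, and the function name, are assumptions rather than the patent's formulation:

```python
def estimate_stride_scale(camera_speeds, pdr_speeds):
    """Least-squares scale k minimizing sum((c - k*p)^2) over paired
    speed samples taken at the same times: k = sum(c*p) / sum(p*p).

    camera_speeds: speeds observed by video analysis (reference)
    pdr_speeds:    speeds estimated by PDR with the default parameter
    """
    num = sum(c * p for c, p in zip(camera_speeds, pdr_speeds))
    den = sum(p * p for p in pdr_speeds)
    return num / den
```

Once collation succeeds (TRUE), pairs of simultaneous speed samples become available, so such a fit can replace a manual calibration walk.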
- FIG. 1 is a diagram illustrating a schematic configuration of the positioning device for a moving body according to the present invention.
- FIG. 2 is a block diagram illustrating the configuration of the basic processing elements of the positioning device.
- an external environment observation apparatus 11 is an apparatus that can observe the movement of a plurality of moving bodies in a predetermined space (external world) such as a road, a plaza, and the sea.
- Examples of the external observation apparatus 11 include a monitoring camera, a Z value sensor, and a laser range finder.
- the inner world observation device 12 is a device attached to a moving body, and is a device that measures the movement of the moving body.
- a combination of a sensor for measuring movement such as an acceleration sensor, a magnetic sensor, an angular velocity sensor, and an atmospheric pressure sensor and a timepiece may be considered.
- a temperature sensor may be used in combination with an acceleration sensor, a magnetic sensor, an angular velocity sensor, and an atmospheric pressure sensor.
- Examples of the moving body include pedestrians, bicycles, automobiles, airplanes, and ships.
- the external world observation device 11 observes both a moving body equipped with the internal world observation device 12 (moving body 2, a pedestrian in the example of this figure) and a moving body that is not equipped with an internal world observation device, or that does not transmit its information to the outside (moving body 1, likewise a pedestrian in this example).
- the movement of each of the plurality of moving bodies to be observed is estimated at each discrete time as a moving speed vector, a movement vector, a motion identification result, or the like.
- for example, the external world observation device 11 can be realized as a combination of a surveillance camera that images the moving body and a moving speed vector estimation processing device that tracks the moving body in the camera video based on features such as shape and color and estimates its moving speed vector.
- alternatively, the external world observation device 11 may be a combination with a Z value analysis device that uses a sensor measuring the depth (Z value) of the external world to detect the moving body to be imaged based on depth and estimate its moving speed vector. As a sensor for measuring the Z value, a depth sensor using a stereo camera is conceivable as a passive sensor, and a device that measures distance (Z value) by projecting infrared light and measuring its time of flight is conceivable as an active sensor.
- a laser range finder (LRF) that irradiates the external world with a laser beam and measures the distance to the moving body from the reflected light may also be used as the external world observation device 11, with the moving body detected from the distance measurement result.
- the internal world observation device 12 estimates, at each discrete time, the movement of the moving body equipped with it as a moving speed vector, a movement vector, or a motion identification result.
- the time-series data of the moving speed vectors observed in the external world and the internal world are collected in one place by a communication means (for example, a wireless data communication network). Collation processing (A) is then performed between the time-series moving speed vectors of one or more moving bodies captured by the external world observation device 11 and the moving speed vector output from the internal world observation device 12; when the collation succeeds, that is, when the two can be regarded as time series of moving speed vectors of the same moving body, a TRUE signal is output, otherwise a FALSE signal is output.
- FIG. 2 is a block diagram for explaining the configuration of basic processing elements of the mobile positioning apparatus according to the present invention.
- the external world observation device 21 is a device that, like the external world observation device 11 of FIG. 1, detects one or more moving bodies and outputs the result with time data.
- the output data 30 differs depending on the external world observation device used: video data with time stamps for a camera, a set of Z values with time stamps for a Z value sensor, and a set of distance data with time stamps for a laser range finder.
- the external world observation data processing device 24 detects the moving body based on the output data 30 and outputs time-series data of the moving speed vector or movement vector, or the motion identification result of the moving body. Specifically, it is a computer having a memory, a processor, and an input/output interface; for video analysis, it can be realized by, for example, a process that detects the target moving body (for example, a person) based on feature quantities such as shape and color.
- the internal world observation device 22 is a device that, like the internal world observation device 12 of FIG. 1, is mounted on the moving body and measures its movement. It can be realized by, for example, a self-contained sensor (a combination of acceleration, magnetic, angular velocity, atmospheric pressure, and temperature sensors) and a clock (such as a real-time clock).
- the internal world observation data processing device 23 obtains the acceleration vector in a fixed (three-dimensional) world coordinate system based on the time-stamped output of the self-contained sensor, and estimates the moving speed vector by integrating the acceleration. Specifically, it is a computer having a memory, a processor, and an input/output interface.
- the inner-field observation data processing device 23 outputs time-series data (33) of the moving speed vector, the moving vector, or the movement identification result of the moving object.
- the collation processing device 25 compares the time-series data (31) of the moving speed vector, movement vector, or motion identification result obtained based on the external world observation device 21 with the time-series data (33) of the moving speed vector, movement vector, or motion identification result obtained based on the internal world observation device 22; if they can be collated, a TRUE signal is output, otherwise a FALSE signal is output (34).
- the verification processing device 25 is also a computer including a memory, a processor, an input / output interface, and the like, and the data received from the external observation data processing device and the internal observation data processing device is subjected to verification processing by the processor.
- in the collation, the moving speed vector time-series data (31) obtained from the external world observation device 21 and the moving speed vector time-series data (33) obtained from the internal world observation device 22 are used. A distance measure between the two time series of moving speed vectors is defined in advance; when the distance falls below a certain threshold, the two are judged to be collatable and a TRUE signal is output, otherwise a FALSE signal is output (34).
- as the distance measure, one that tolerates a certain jitter (time difference) between the time stamps of the two time series is selected.
- the surveillance camera 101 is installed so that its arrangement relative to the floor surface is known, and the video taken by the surveillance camera 101 shows passages and open spaces through which people often pass.
- the self-contained sensor 102 is a group of sensors that can operate without infrastructure support from the outside. For example, an acceleration sensor, a gyro sensor, a magnetic sensor, an atmospheric pressure sensor, or the like is used.
- a camera model, including the translation/rotation parameters and the focal length/scale coefficients of the surveillance camera 101 with respect to the environment in which it is installed, is defined relative to an environment model created with an appropriately set scale.
- an environment model can be interactively created by executing a modeler application program (modeler application) using the method described in Non-Patent Document 3 or the like.
- the modeler application performs a procedure for obtaining vanishing points in the environment based on user instructions (specifically, the user selects two pairs of parallel lines in the environment).
- as shown in FIG. 3, it is detected that a moving pedestrian wearing the self-contained sensor 102 appears in the video of the surveillance camera 101, and a process (A) for collating the person in the video of the surveillance camera 101 with the pedestrian wearing the self-contained sensor 102 is performed.
- 201 is a surveillance camera
- 202 is a self-contained sensor group
- 203 is a moving speed vector estimation processing device
- 204 is a video analysis processing device
- 205 is a moving speed vector collation processing device.
- the video (the frame image at each time) taken by the surveillance camera 201 is output to the video analysis processing device 204, where candidate images of moving persons are selected, the foot position is estimated, and, by time differentiation, the moving speed vector at each time is estimated; an output sequence of moving speed vectors is then output.
- data may be acquired by using a stereo camera depth measurement device or a laser range finder, and the movement speed vector may be estimated.
- the moving body will be described as a person.
- 210 represents information of each frame image of the video of the surveillance camera
- 211 represents information of the result of analyzing each frame image.
- as the analysis result information 211, data representing the floor-position coordinates of the person's feet is output when a person is present in the video, and data representing a signal indicating the absence of a person is output otherwise.
- the output data 212 of the self-contained sensor group 202 (acceleration vector, angular velocity vector, magnetic vector, and atmospheric pressure data) is passed to the moving speed vector estimation processing device 203, which estimates the moving speed vector of the wearer (person) at each time based on it.
- the output data 212 is output of sensor data from the sensors included in the self-contained sensor group 202 as described above.
- the self-contained sensor group 202 has a timer, and outputs the acceleration vector, angular velocity vector, magnetic vector, and atmospheric pressure data with time stamp information added at acquisition time.
- the moving speed vector is calculated from the sensor data of the output data 212 by, for example, the data processing method described in Non-Patent Document 1.
- Data 213 output from the moving speed vector estimation processing device 203 is data of an output series of moving speed vectors of a moving body (person) wearing the self-contained sensor group 202.
- the moving speed vector collation processing device 205 accumulates the output sequence of moving speed vectors from the video analysis processing device 204 and the output sequence from the moving speed vector estimation processing device 203 for a certain period of time, and judges whether they match. For example, as described in Non-Patent Document 4, this judgment computes the sum of the magnitudes of the difference vectors between corresponding moving speed vectors, normalizes it by the length of time or the number of samples, compares the result with a predetermined threshold, and regards the sequences as collated if it is equal to or less than the threshold.
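The judgment described above — the sum of difference-vector magnitudes, normalized by the number of samples and compared with a threshold — can be sketched as follows. The threshold value and function name are illustrative assumptions:

```python
import math

def collate_speed_sequences(seq_a, seq_b, threshold=0.2):
    """Collate two time-aligned sequences of 2-D moving speed vectors.

    Computes the sum of the magnitudes of the difference vectors,
    normalizes by the number of samples, and compares against a
    threshold.  Returns "TRUE" when the normalized sum is at or
    below the threshold, "FALSE" otherwise.
    """
    n = min(len(seq_a), len(seq_b))
    total = sum(math.hypot(ax - bx, ay - by)
                for (ax, ay), (bx, by) in zip(seq_a, seq_b))
    return "TRUE" if total / n <= threshold else "FALSE"
```

In practice one sequence would come from the video analysis (foot-position differentiation) and the other from the PDR-side estimation, resampled onto common time stamps.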
- the output data 214 of the moving speed vector collation processing device 205 is the result of collating two moving speed vector time series: the one estimated by the moving speed vector estimation processing device 203 from the sensor data of the self-contained sensor group 202, and the one estimated by the video analysis processing device 204 from the floor-position coordinates of the person's feet obtained by analyzing each frame image of the surveillance camera 201. When the two are collated, a signal indicating TRUE is output; otherwise, a signal indicating FALSE is output.
- the positioning device of the above embodiment performs the collation process using output sequences of moving speed vectors, but the collation performed with moving speed vectors can also be performed using movement vectors (that is, relative position vectors). An embodiment with such a configuration is described next.
- FIG. 5 is a block diagram for explaining the configuration of another embodiment of the mobile positioning apparatus of the present invention.
- 301 is a surveillance camera
- 302 is a self-contained sensor group
- 303 is a motion vector estimation processing device
- 304 is a video analysis processing device
- 305 is a motion vector matching processing device.
- the surveillance camera 301 is the same as the surveillance camera 201 described above.
- a frame image signal 310 is output from the monitoring camera 301.
- when the video analysis processing device 304 receives the frame image signal 310 and determines that a person image (moving body) exists in the frame image, it outputs an output sequence of the movement vector 311 of the person image.
- Output data 312 from the self-contained sensor group 302 is an output series of sensor data such as an acceleration vector, an angular velocity vector, a magnetic vector, and atmospheric pressure data to which a time stamp is added.
- when the sensor data of the output data 312 is given as input from the self-contained sensor group 302, the movement vector estimation processing device 303 estimates and outputs the relative movement vector of the wearer of the self-contained sensor (the moving body). For this estimation, data processing that estimates the moving speed vector and then applies first-order integration with respect to time can be used.
- the movement vector estimation processing device 303 outputs the obtained output sequence of relative movement vectors as output data 313.
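The first-order integration of the moving speed vector into a relative movement vector can be sketched with trapezoidal integration over time-stamped samples. This is a simplified 2-D illustration; the `(t, vx, vy)` sample format is an assumption:

```python
def integrate_velocity(samples):
    """Relative movement vector from time-stamped 2-D velocity samples.

    samples: iterable of (t, vx, vy) tuples in time order.
    Applies trapezoidal (first-order) integration over time and
    returns the accumulated displacement (dx, dy).
    """
    dx = dy = 0.0
    for (t0, vx0, vy0), (t1, vx1, vy1) in zip(samples, samples[1:]):
        dt = t1 - t0
        dx += 0.5 * (vx0 + vx1) * dt
        dy += 0.5 * (vy0 + vy1) * dt
    return dx, dy
```

The same integration applied to the camera-side speed sequence yields the movement vector that the collation device 305 compares against output data 313.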
- the movement vector collation processing device 305 collates the time-series data of the movement vector output data 311 obtained by video analysis with the time-series data of the relative movement vector output data 313 obtained by movement vector estimation from the sensor data. As the method for collating the two movement vector time series, the sum of the magnitudes of the difference vectors between corresponding movement vectors is computed, normalized by the length of time or the number of samples, and compared with a threshold; if it is less than the threshold, the sequences are judged to be collated. When the movement vector collation processing device 305 judges that the sequences are collated, it outputs a TRUE signal as output data 314; otherwise, it outputs a FALSE signal.
- incidentally, the image-based position detection accuracy when the person image appears small in the video of the surveillance camera 301 is lower than when the person image appears large. This means that when a unit pixel of the surveillance camera is back-projected onto the floor surface, the reliability changes in inverse proportion to the projected area.
- therefore, when collating the output sequence from the movement vector estimation processing device 303 with the output sequence from the video analysis processing device 304, the movement vector collation processing device 305 performs the collation by determining a weighting factor according to the area of a unit pixel of the surveillance camera video projected onto the floor surface at the foot position of the moving body estimated from the video.
- the weighting factor is decreased as the unit pixel area is projected larger, and the weighting factor is increased as the unit pixel area is projected smaller.
- such processing can be performed in the same manner in the embodiment of the positioning device described earlier. In that case, when collating the output sequence from the moving speed vector estimation processing device 203 with the output sequence from the video analysis processing device 204, the moving speed vector collation processing device 205 determines the weighting factor according to the area of a unit pixel of the surveillance camera video projected onto the floor surface at the foot position of the moving body estimated from the video, and performs the collation.
- FIG. 6 is a block diagram for explaining the configuration of another embodiment of the mobile positioning apparatus of the present invention.
- in the embodiment shown in FIG. 6, based on the observation data from the internal world observation device and the external world observation device, a self-contained-sensor-based motion type processing device serving as the internal world observation data processing device and a video analysis processing device serving as the external world observation data processing device each identify the motion type of the moving body, and the collation processing device collates the time-series data of the identification results.
- 3201 is a monitoring camera
- 3202 is a self-contained sensor group
- 3203 is a self-contained-sensor-based motion type processing device
- 3204 is a video analysis processing device (motion type processing device)
- 3205 is a motion type collation processing device.
- the monitoring camera 3201 corresponds to the external observation apparatus 21 in FIG. 2 and is the same as the monitoring cameras 201 and 301.
- a video signal 3210 with time is output from the monitoring camera 3201.
- the video analysis processing device (motion type processing device) 3204 receives the video signal 3210 as input, analyzes the motion of a moving body (such as a person) present in the video, and recognizes/identifies the motion type, for example using a Hidden Markov Model (HMM).
- the self-contained sensor group 3202 corresponds to the inner field observation device 22 in FIG. 2 and is the same as the self-contained sensor group 202 and 302.
- Output data 3212 from the self-contained sensor group 3202 is an output series of sensor data such as an acceleration vector, an angular velocity vector, a magnetic vector, atmospheric pressure data, and temperature data to which a time stamp is added.
- the self-contained base action type processing device 3203 identifies and recognizes the wearer's action type based on the output data 3212.
- Such an operation type processing device converts, for example, a feature vector in the time domain of time series data such as an acceleration vector and an angular velocity vector, which are a part of outputs of the self-contained sensor group, and Fourier transform of the time series data.
- the feature quantity vector in the frequency domain is calculated.
- Identification of the action type corresponding to the combination of these two kinds of feature vectors, in the time domain and in the frequency domain, can be realized by a machine learning method using a computer: the input time-series sensor data is collated with a learned model (or model data), which assigns a specific identification result to the input sensor data.
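As an illustration of the feature extraction just described, the sketch below computes a time-domain feature vector (per-axis mean and standard deviation) and a frequency-domain feature vector (low-order FFT magnitudes) from one window of tri-axial accelerometer samples. The specific features, window length, and sampling rate are assumptions, not the patent's exact choices.

```python
import numpy as np

def extract_features(window):
    """Time- and frequency-domain features from one window of tri-axial
    accelerometer samples (shape: [n_samples, 3]). Illustrative choices."""
    # Time-domain features: per-axis mean and standard deviation.
    time_feats = np.concatenate([window.mean(axis=0), window.std(axis=0)])
    # Frequency-domain features: magnitudes of the first few FFT bins
    # of the mean-removed acceleration norm (DC bin dropped).
    norm = np.linalg.norm(window, axis=1)
    spectrum = np.abs(np.fft.rfft(norm - norm.mean()))
    freq_feats = spectrum[1:6]
    return np.concatenate([time_feats, freq_feats])

# Example: a 2-second window sampled at 50 Hz (synthetic data).
rng = np.random.default_rng(0)
window = rng.normal(size=(100, 3))
features = extract_features(window)
print(features.shape)  # (11,)
```

The resulting feature vector is what a trained classifier (the "learned model" in the text) would map to an action type label.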
- The action type collation processing device 3205 corresponds to the collation processing device 25 in FIG. 2 and, based on the output 3211 of the action type processing device 3204 and the output 3213 of the self-contained base action type processing device 3203, collates the two time-series identification results. A distance measure between the time-series action type recognition results is defined; when the distance falls below a certain threshold, the two time series are judged to match and a TRUE (true) signal is output, and otherwise a FALSE (false) signal is output (3214).
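The thresholded distance test on the two identification-result time series can be sketched as follows, assuming for illustration that the two series are time-aligned and that the distance is the fraction of disagreeing labels; the patent leaves the actual distance measure and threshold unspecified.

```python
def collate_action_series(series_a, series_b, threshold=0.3):
    """Compare two time-aligned action type label series and return the
    TRUE/FALSE collation result (sketch). Distance here is the fraction
    of time steps whose labels disagree -- an assumed measure."""
    assert len(series_a) == len(series_b)
    disagreements = sum(a != b for a, b in zip(series_a, series_b))
    distance = disagreements / len(series_a)
    return distance < threshold  # TRUE when the two series agree closely

camera_labels = ["walk", "walk", "stand", "walk", "walk"]   # from video
sensor_labels = ["walk", "walk", "stand", "stand", "walk"]  # from sensors
print(collate_action_series(camera_labels, sensor_labels))  # True (distance 0.2)
```

A TRUE result here plays the role of the signal 3214 that links the person in the video with the wearer of the sensors.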
- FIG. 7 is a flowchart explaining the processing flow for setting the weighting coefficient used in the collation processing performed by the moving speed vector collation processing device or the movement vector collation processing device. This will be described with reference to FIG. 7.
- In step S401, the position on the floor where the person stands is obtained from the video of the monitoring camera; then, in step S402, the area onto which a unit pixel of the monitoring camera is projected on the floor at that position is obtained.
- In step S403, the weighting coefficient at that position is determined according to the size of the obtained area, and in step S404 the weighting coefficient for each point on the trajectory is stored. The collation processing is then performed using the weighting coefficients determined in this way.
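The steps above can be sketched as follows. The monotone rule (the larger the floor area covered by one camera pixel, the smaller the weight) comes from the text and claim 6; the specific inverse-area form is an assumption for illustration.

```python
def pixel_weight(projected_area, alpha=1.0):
    """Weighting coefficient for one trajectory point (sketch).
    Points whose unit pixel projects onto a larger floor area are less
    reliable, so they get a smaller weight. Inverse-area is an assumed form."""
    return alpha / projected_area

def weighted_distance(errors, areas):
    """Weighted mean of per-point position errors along a trajectory."""
    weights = [pixel_weight(a) for a in areas]
    total = sum(w * e for w, e in zip(weights, errors))
    return total / sum(weights)

# A point far from the camera (large projected area) contributes little
# to the matching distance even if its error is large.
errors = [0.1, 0.1, 0.5]    # position differences, metres
areas = [0.01, 0.01, 0.25]  # floor area per pixel, m^2
print(round(weighted_distance(errors, areas), 3))  # 0.108
```

The weighted distance would then be compared against the collation threshold as in the TRUE/FALSE judgment described earlier.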
- The moving body positioning device of the present invention can also be modified to improve the accuracy of the estimation processing by setting individual difference parameters.
- In this modification, the individual difference parameters that characterize the walking motion of the person wearing the self-contained sensors are estimated using the moving speed vector of the person image in the video together with the output series of the self-contained sensors, and the estimated parameters are then set in the device.
- As in Non-Patent Document 1, the amplitude of the vertical component obtained by decomposing the output of the acceleration sensor is linked with the magnitude of the moving speed vector of the person image in the video, and the individual difference parameters (in this case, the slope and the intercept that determine a straight line) are estimated from the linked data.
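The linear individual-difference model (a slope and an intercept) can be estimated by ordinary least squares from paired amplitude/speed samples, for example as below. The numeric values are made up for illustration.

```python
import numpy as np

# Per-step amplitude of the vertical acceleration component (sensor side)
# and the matching speed magnitudes from the video (camera side).
# Values are illustrative only.
amplitudes = np.array([1.0, 1.5, 2.0, 2.5, 3.0])
speeds = np.array([0.8, 1.0, 1.2, 1.4, 1.6])

# Fit speed = slope * amplitude + intercept by least squares, giving the
# two individual-difference parameters mentioned in the text.
slope, intercept = np.polyfit(amplitudes, speeds, 1)
print(f"{slope:.3f} {intercept:.3f}")  # 0.400 0.400

# The fitted line can then convert sensor amplitude into an estimated
# walking speed for this particular wearer.
```

When the collation device outputs TRUE, newly linked (amplitude, speed) pairs can be used to refit these two parameters, which is the resetting behavior described for the walking parameter estimation processing device.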
- FIG. 8 is a block diagram for explaining the configuration of another embodiment of the mobile positioning apparatus of the present invention.
- 201 is a surveillance camera
- 202 is a self-contained sensor group
- 203 is a moving speed vector estimation processing device
- 204 is a video analysis processing device
- 205 is a moving speed vector collation processing device.
- The walking parameter estimation processing device 505 receives the sensor data 503 output from the self-contained sensor group 202, the moving speed vector 501 at the foot position of the person image output from the video analysis processing device 204, and the collation result (TRUE/FALSE) signal 502 output from the moving speed vector collation processing device 205 as a result of the collation processing, and estimates and outputs the walking parameters.
- The estimated walking parameters are input to the moving speed vector estimation processing device 203 and are used when it executes the moving speed vector estimation processing.
- The walking parameter estimation processing performed by the walking parameter estimation processing device 505 works as follows: when the moving speed vector of the person image in the video matches the moving speed vector based on the sensor data of the self-contained sensors (that is, when the signal 502 is TRUE), the device estimates the walking parameters that characterize the walking motion of the sensor wearer from the self-contained sensor data 503 and the moving speed vector 501 linked at that time, and outputs the walking parameter data.
- The output walking parameters, which characterize the wearer's individual differences, are used when the moving speed vector estimation processing device 203 executes the moving speed vector estimation processing.
- In other words, when the moving speed vector collation processing device 205 outputs a TRUE signal as the collation result 214, the walking parameter estimation processing device 505 provided in the moving body positioning device shown in FIG. 8 resets the individual difference parameters (walking parameters) in the moving speed vector estimation processing device 203, using the paired information of the time-series data of the collated moving speed vectors and the sensor data output series of the self-contained sensors. The moving speed vector estimation processing device 203 then corrects its moving speed vector output series based on the newly set parameters and outputs the corrected series.
- The moving body positioning device can also be modified so that, when the video of the surveillance camera and the output of the self-contained sensors are successfully collated, the identification information of the sensor wearer is stored and displayed in association with the person image in the video. An embodiment having such a configuration will be described with reference to FIG. 9.
- FIG. 9 is a block diagram for explaining the configuration of another embodiment of the mobile positioning apparatus of the present invention.
- 201 is a surveillance camera
- 202 is a self-contained sensor group
- 203 is a moving speed vector estimation processing device
- 205 is a moving speed vector collation processing device.
- 601 is a wearer identification information storage device
- 604 is an identification information storage/display device
- 605 is a video analysis processing device.
- the wearer identification information storage device 601 is a device that stores and outputs identification information (information such as individual ID and name) of the wearer of the self-contained sensor group.
- the identification information output 603 is identification information of the wearer of the self-contained sensor group 202.
- The video analysis processing device 605 here analyzes the video from the monitoring camera 201, estimates the moving speed vector at the foot position of the person image, and outputs it as output data 211; it also outputs data 602 containing the region information of where the person image exists.
- The region information output data 602 is information representing the region of the person image in the video. In addition, output data 606, a video frame image, is extracted from the output data 210 of the monitoring camera 201 and input to the identification information storage/display device 604.
- The identification information storage/display device 604 receives the frame image output 606, the region information 602 of the person image in the video, the identification information 603 of the wearer of the self-contained sensor group 202, and the TRUE (true)/FALSE (false) signal 214 that is the result of the collation judgment processing of the two moving speed vectors by the moving speed vector collation processing device 205. When the collation judgment result signal is TRUE, the device stores the frame image 606 and the region information 602 of the surveillance camera in association with the wearer's identification information 603, and displays the identification information 603 on the image region indicated by the region information in the frame image.
- When the moving speed vector (or movement vector) of the person image in the video of the surveillance camera matches the one estimated from the sensor data of the self-contained sensor group, the person in the video and the identification information of the wearer of the self-contained sensor group can be linked and displayed.
- In summary, the moving body positioning device of this embodiment is provided with an identification information storage/display device 604 that stores and displays identification information identifying the moving body. When the moving speed vector collation processing device 205 outputs a TRUE (true) signal, the device 604 stores and displays the identification information of the moving body wearing the self-contained sensor group 202 as information indicating the moving body in the video of the monitoring camera 201; when the collation processing device 205 outputs a FALSE (false) signal, it stores and displays identification information indicating that no self-contained sensor is mounted.
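The TRUE/FALSE branching for what identification information is stored and displayed can be sketched as follows; the record structure and label strings are assumptions for illustration.

```python
def label_for_region(collation_true, wearer_id):
    """Choose the label stored and displayed for a person region,
    following the TRUE/FALSE rule in the text (sketch; the FALSE-case
    label text is an assumption)."""
    if collation_true:
        return wearer_id            # e.g. the wearer's ID/name string
    return "no self-contained sensor"

records = []  # stand-in for the storage part of device 604

def store_and_display(frame_no, region, collation_true, wearer_id):
    """Store the frame/region/label association and return the label
    that would be overlaid on the image region."""
    label = label_for_region(collation_true, wearer_id)
    records.append({"frame": frame_no, "region": region, "label": label})
    return label

print(store_and_display(10, (120, 40, 60, 180), True, "ID-0042"))
print(store_and_display(11, (300, 42, 58, 175), False, "ID-0042"))
```

On a real system the returned label would be rendered over the image region given by the region information 602.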
- When the moving speed vector or movement vector of the person image in the video of the surveillance camera matches the one estimated from the sensor data of the self-contained sensor group, it can be estimated with a certain degree of confidence that the wearer of the self-contained sensor group is at the position where the person image is found by the video analysis of the surveillance camera.
- the position correction signal used here is output by the position correction signal output device 702.
- FIG. 10 is a block diagram for explaining the configuration of another embodiment of the mobile positioning apparatus of the present invention.
- 201 is a surveillance camera
- 202 is a self-contained sensor group
- 203 is a moving speed vector estimation processing device
- 204 is a video analysis processing device
- 205 is a moving speed vector collation processing device.
- Reference numeral 702 denotes a position correction signal output device.
- The position correction signal output device 702 receives the foot position information 701 of the person image (one of the outputs of the video analysis processing device 204) and the collation result signal (TRUE/FALSE) 214, which is the output of the moving speed vector collation processing device 205, and outputs a position correction signal 703.
- the position correction signal 703 is used as a correction signal for the video analysis signal 701 output from the video analysis processing device 204.
- The position correction signal output device 702 is configured such that, when the moving speed vector collation processing device 205 outputs a TRUE signal, it outputs a signal for correcting the position coordinates of the moving body output from the video analysis processing device 204 to be the position of the moving body on which the self-contained sensor group is mounted.
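The correction rule can be sketched as a simple selection between the two position estimates; the function and variable names are illustrative, not from the patent.

```python
def corrected_position(collation_true, sensor_pos, camera_pos):
    """Position correction rule from the text (sketch): when the collation
    device outputs TRUE, the camera-derived position replaces (corrects)
    the dead-reckoned position of the sensor wearer."""
    if collation_true:
        return camera_pos   # trust the surveillance-camera estimate
    return sensor_pos       # otherwise keep the self-contained estimate

print(corrected_position(True, (5.2, 3.1), (5.0, 3.0)))   # (5.0, 3.0)
print(corrected_position(False, (5.2, 3.1), (5.0, 3.0)))  # (5.2, 3.1)
```

This mirrors how the position correction signal 703 suppresses accumulated dead-reckoning drift whenever the video and sensor trajectories are successfully matched.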
- With the moving body positioning device of the present invention, the position estimation result of the internal observation device provided on the moving body can be corrected by an external observation device such as a surveillance camera, so that the movement of the moving body can be measured, analyzed, and estimated more accurately; the invention is therefore industrially useful.
Abstract
Description
12 Internal observation device
21 External observation device
22 Internal observation device
23 Internal observation data processing device
24 External observation data processing device
25 Collation processing device
101 Monitoring camera
102 Self-contained sensor group
201 Monitoring camera
202 Self-contained sensor group
203 Moving speed vector estimation processing device
204 Video analysis processing device
205 Moving speed vector collation processing device
301 Monitoring camera
302 Self-contained sensor group
303 Movement vector estimation processing device
304 Video analysis processing device
305 Movement vector collation processing device
505 Walking parameter estimation processing device
601 Wearer identification information storage device
604 Identification information storage/display device
605 Video analysis processing device
702 Position correction signal output device
3201 Monitoring camera
3202 Self-contained sensor group
3203 Self-contained base action type processing device
3204 Video analysis processing device
3205 Action type collation processing device
Claims (9)
- 1. A positioning device for a moving body, comprising:
an internal observation device that is provided on a moving body and measures and outputs the movement of the moving body;
an internal observation data processing device that estimates the movement of the moving body based on the output of the internal observation device;
an external observation device that observes the movements of a plurality of moving bodies;
an external observation data processing device that estimates the movement of the moving body from the observation results of the external observation device; and
a collation processing device that collates the output from the internal observation data processing device with the output from the external observation data processing device and outputs the collation result as a TRUE (true) or FALSE (false) signal.
- 2. The positioning device for a moving body according to claim 1, wherein:
the internal observation device has a self-contained sensor group that outputs the moving speed vector of the moving body by means of an acceleration sensor provided on the moving body;
the internal observation data processing device is a moving speed vector estimation processing device that measures the moving speed vector based on the output of the self-contained sensor group and outputs an output series of moving speed vectors together with the measurement times;
the external observation device has a monitoring camera that photographs the moving body from outside;
the external observation data processing device analyzes the video of the monitoring camera, measures the position of the feet of the moving body in the video, measures the moving speed vector at that foot position, and outputs an output series of moving speed vectors together with the measurement times; and
the collation processing device is a moving speed vector collation processing device that collates the output series of the moving speed vector estimation processing device with the output series of the video analysis processing device and outputs the collation result as a TRUE (true) or FALSE (false) signal.
- 3. The positioning device for a moving body according to claim 1, wherein:
the internal observation device has a self-contained sensor group that outputs the movement vector of the moving body, obtained by integrating moving speed vectors, by means of an acceleration sensor provided on the moving body;
the internal observation data processing device is a movement vector estimation processing device that measures the movement vector based on the output of the self-contained sensor group and outputs an output series of movement vectors together with the measurement times;
the external observation device has a monitoring camera that photographs the moving body from outside;
the external observation data processing device is a video analysis processing device that analyzes the video of the monitoring camera, measures the position of the feet of the moving body in the video, measures the movement vector at that foot position, and outputs an output series of movement vectors together with the measurement times; and
the collation processing device is a movement vector collation processing device that collates the output series of the movement vector estimation processing device with the output series of the video analysis processing device and outputs the collation result as a TRUE (true) or FALSE (false) signal.
- 4. The positioning device for a moving body according to claim 1, wherein:
the internal observation device has a self-contained sensor group that outputs the movement vector of the moving body, obtained by integrating moving speed vectors, by means of an acceleration sensor provided on the moving body;
the internal observation data processing device is a self-contained sensor-based action identification processing device that identifies the action type of the moving body based on the output of the self-contained sensor group and outputs the identification result together with the measurement times;
the external observation device has a monitoring camera that photographs the moving body from outside;
the external observation data processing device is a video analysis processing device that analyzes the video of the monitoring camera, identifies the action type of the moving body in the video, and outputs the identification result together with the measurement times; and
the collation processing device is an action type collation processing device that collates the output of the self-contained sensor-based action identification processing device with the output of the video analysis processing device and outputs the collation result as a TRUE (true) or FALSE (false) signal.
- 5. The positioning device for a moving body according to claim 2 or claim 3, wherein, when collating the output series from the internal observation data processing device with the output series from the external observation data processing device, the collation processing device determines a weighting coefficient according to the size of the area onto which a unit pixel of the monitoring camera's video is projected on the floor surface at the foot position of the moving body estimated from the video, and performs the collation processing accordingly.
- 6. The positioning device for a moving body according to claim 5, wherein, in the collation processing, the weighting coefficient is made smaller as the projected area of a unit pixel becomes larger, and larger as the projected area becomes smaller.
- 7. The positioning device for a moving body according to claim 2 or claim 3, further comprising a position correction signal output device, wherein, when the collation processing device outputs a TRUE (true) signal, the position correction signal output device outputs a signal for correcting the position coordinates of the moving body output from the external observation data processing device to be the position of the moving body on which the self-contained sensor group is mounted.
- 8. The positioning device for a moving body according to claim 2 or claim 3, further comprising a walking parameter estimation processing device, wherein the internal observation data processing device corrects and outputs the output series of moving speed vectors based on preset individual difference parameters, and wherein, when the collation processing device outputs a TRUE (true) signal, the walking parameter estimation processing device resets the individual difference parameters in the internal observation data processing device using the paired information of the time-series data of the collated moving speed vectors and the sensor data output series of the self-contained sensors.
- 9. The positioning device for a moving body according to claim 2 or claim 3, further comprising an identification information storage and display device that stores and displays identification information identifying the moving body, wherein, when the collation processing device outputs a TRUE (true) signal, the identification information storage and display device stores and displays, as information indicating the moving body in the video of the monitoring camera, the identification information of the moving body on which the self-contained sensors are mounted, and when the collation processing device outputs a FALSE (false) signal, it stores and displays identification information indicating that no self-contained sensor is mounted.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011544301A JPWO2011068184A1 (ja) | 2009-12-03 | 2010-12-03 | 移動体の測位装置 |
US13/512,696 US8983124B2 (en) | 2009-12-03 | 2010-12-03 | Moving body positioning device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009275754 | 2009-12-03 | ||
JP2009-275754 | 2009-12-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011068184A1 true WO2011068184A1 (ja) | 2011-06-09 |
Family
ID=44115030
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/071637 WO2011068184A1 (ja) | 2009-12-03 | 2010-12-03 | 移動体の測位装置 |
Country Status (3)
Country | Link |
---|---|
US (1) | US8983124B2 (ja) |
JP (2) | JPWO2011068184A1 (ja) |
WO (1) | WO2011068184A1 (ja) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012103169A (ja) * | 2010-11-11 | 2012-05-31 | Fujitsu Ltd | 移動物特定システム、移動物特定装置および移動物特定プログラム |
JP2013251800A (ja) * | 2012-06-01 | 2013-12-12 | Sony Corp | 情報処理装置、情報処理方法およびプログラム |
JP2015041194A (ja) * | 2013-08-21 | 2015-03-02 | 株式会社Nttドコモ | ユーザ観察システム |
JP2015184248A (ja) * | 2014-03-26 | 2015-10-22 | 大日本印刷株式会社 | 屋内測位サーバ、屋内測位方法、プログラム、及び、屋内測位システム |
JP2017067735A (ja) * | 2015-10-02 | 2017-04-06 | 株式会社電通国際情報サービス | 測位システム |
WO2017141375A1 (ja) * | 2016-02-17 | 2017-08-24 | 三菱電機株式会社 | 危険予測装置、移動端末及び危険予測方法 |
JP2017224071A (ja) * | 2016-06-14 | 2017-12-21 | 株式会社東芝 | 情報処理装置、および、情報処理方法 |
JP2018011125A (ja) * | 2016-07-11 | 2018-01-18 | カシオ計算機株式会社 | 移動体識別装置、移動体識別システム、移動体識別方法及びプログラム |
WO2018100679A1 (ja) * | 2016-11-30 | 2018-06-07 | 株式会社オプティム | コンピュータシステム、教師データ取引方法及びプログラム |
JP2019045178A (ja) * | 2017-08-30 | 2019-03-22 | 沖電気工業株式会社 | 情報処理装置、およびプログラム |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9369677B2 (en) * | 2012-11-30 | 2016-06-14 | Qualcomm Technologies International, Ltd. | Image assistance for indoor positioning |
JP6148505B2 (ja) * | 2013-03-21 | 2017-06-14 | 株式会社東芝 | 在室確率推定装置およびその方法、ならびにプログラム |
US9984519B2 (en) * | 2015-04-10 | 2018-05-29 | Google Llc | Method and system for optical user recognition |
JP6705972B2 (ja) | 2016-05-20 | 2020-06-03 | サイトセンシング株式会社 | 姿勢推定装置、姿勢推定方法、制御プログラム、および記録媒体 |
JP6611257B2 (ja) * | 2016-07-19 | 2019-11-27 | 日本電信電話株式会社 | 行動認識装置、および、行動認識方法 |
JP6577424B2 (ja) * | 2016-07-19 | 2019-09-18 | 日本電信電話株式会社 | 行動認識装置、および、行動認識方法 |
WO2018025531A1 (ja) * | 2016-08-05 | 2018-02-08 | ソニー株式会社 | 情報処理装置、情報処理方法、およびプログラム |
JP6804908B2 (ja) * | 2016-09-13 | 2020-12-23 | 株式会社東芝 | 推定装置、推定方法及びコンピュータプログラム |
JP6588413B2 (ja) * | 2016-10-04 | 2019-10-09 | 日本電信電話株式会社 | 監視装置および監視方法 |
FR3069317B1 (fr) * | 2017-07-21 | 2020-10-16 | Sysnav | Procede d'estimation du mouvement d'un objet evoluant dans un environnement et un champ magnetique |
WO2019093934A1 (en) * | 2017-11-10 | 2019-05-16 | Telefonaktiebolaget Lm Ericsson (Publ) | A radio access network node, wireless devices, methods and software for device-to-device communication |
CN111542858B (zh) * | 2018-01-04 | 2023-09-08 | 株式会社索思未来 | 动态图像解析装置、系统、方法、以及存储介质 |
JP2019139570A (ja) * | 2018-02-13 | 2019-08-22 | 株式会社東芝 | 判別装置、判別方法およびプログラム |
JP7078116B2 (ja) * | 2018-08-02 | 2022-05-31 | 日本電気株式会社 | 屋内位置推定装置、ユーザー端末、屋内位置推定方法及びプログラム |
JP7078117B2 (ja) * | 2018-08-02 | 2022-05-31 | 日本電気株式会社 | 屋内位置推定装置、屋内位置推定方法及びプログラム |
WO2021075004A1 (ja) * | 2019-10-16 | 2021-04-22 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置、情報処理装置の制御方法、及びプログラム |
US11682272B2 (en) * | 2020-07-07 | 2023-06-20 | Nvidia Corporation | Systems and methods for pedestrian crossing risk assessment and directional warning |
CN112598709B (zh) * | 2020-12-25 | 2022-11-01 | 之江实验室 | 一种基于视频流的行人运动速度智能感知方法 |
JP2022112897A (ja) * | 2021-01-22 | 2022-08-03 | トヨタ自動車株式会社 | 情報処理装置及び情報処理方法 |
CN114533040B (zh) * | 2022-01-12 | 2024-04-09 | 北京京仪仪器仪表研究总院有限公司 | 一种固定空间内人员特定活跃度的监测方法 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004096501A (ja) * | 2002-08-30 | 2004-03-25 | Ntt Advanced Technology Corp | 移動体の位置検出システム、移動体の位置検出方法、及びプログラム |
JP2004219332A (ja) * | 2003-01-16 | 2004-08-05 | National Institute Of Advanced Industrial & Technology | 位置情報処理装置 |
JP2005275912A (ja) * | 2004-03-25 | 2005-10-06 | Institute Of Physical & Chemical Research | 行動分析方法及びシステム |
JP2008016042A (ja) * | 2002-03-26 | 2008-01-24 | Toshiba Corp | 監視システム、装置および方法 |
JP2008026272A (ja) * | 2006-07-25 | 2008-02-07 | Kddi Corp | 自らの位置を検出する移動端末、カメラ及びプログラム |
WO2009007917A2 (en) * | 2007-07-10 | 2009-01-15 | Koninklijke Philips Electronics N.V. | Object motion capturing system and method |
Family Cites Families (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5892454A (en) * | 1993-12-21 | 1999-04-06 | Trimble Navigation Ltd. | Hybrid monitoring of location of a site confinee |
US5828306A (en) * | 1996-04-15 | 1998-10-27 | Curran; Brendan Joseph | Location detector and monitor and method of using the same |
US6176837B1 (en) * | 1998-04-17 | 2001-01-23 | Massachusetts Institute Of Technology | Motion tracking system |
US6330356B1 (en) * | 1999-09-29 | 2001-12-11 | Rockwell Science Center Llc | Dynamic visual registration of a 3-D object with a graphical model |
US6474159B1 (en) * | 2000-04-21 | 2002-11-05 | Intersense, Inc. | Motion-tracking |
US6554706B2 (en) * | 2000-05-31 | 2003-04-29 | Gerard Jounghyun Kim | Methods and apparatus of displaying and evaluating motion data in a motion game apparatus |
US7319479B1 (en) * | 2000-09-22 | 2008-01-15 | Brickstream Corporation | System and method for multi-camera linking and analysis |
US20090231436A1 (en) * | 2001-04-19 | 2009-09-17 | Faltesek Anthony E | Method and apparatus for tracking with identification |
JP2008090861A (ja) * | 2002-03-26 | 2008-04-17 | Toshiba Corp | 監視システム、装置および方法 |
US7123126B2 (en) * | 2002-03-26 | 2006-10-17 | Kabushiki Kaisha Toshiba | Method of and computer program product for monitoring person's movements |
JP2004005511A (ja) * | 2002-03-26 | 2004-01-08 | Toshiba Corp | 監視システム、監視方法および監視プログラム |
JP2004007496A (ja) * | 2002-03-26 | 2004-01-08 | Toshiba Corp | 状態検出システム及び状態検出方法、並びに監視方法及び監視システムとコンピュータプログラム |
US6873256B2 (en) * | 2002-06-21 | 2005-03-29 | Dorothy Lemelson | Intelligent building alarm |
WO2004042662A1 (en) * | 2002-10-15 | 2004-05-21 | University Of Southern California | Augmented virtual environments |
JP2004274101A (ja) * | 2003-03-05 | 2004-09-30 | Shigeo Kaneda | 移動体識別システム |
US7305303B2 (en) * | 2004-03-02 | 2007-12-04 | Honeywell International Inc. | Personal navigation using terrain-correlation and/or signal-of-opportunity information |
JP4431513B2 (ja) * | 2005-03-16 | 2010-03-17 | 株式会社日立製作所 | セキュリティシステム |
WO2007044301A2 (en) * | 2005-10-04 | 2007-04-19 | Intersense, Inc. | Tracking objects with markers |
US7733224B2 (en) * | 2006-06-30 | 2010-06-08 | Bao Tran | Mesh network personal emergency response appliance |
US20080062120A1 (en) * | 2006-09-11 | 2008-03-13 | Lorraine Wheeler | Location tracking system |
US8696458B2 (en) * | 2008-02-15 | 2014-04-15 | Thales Visionix, Inc. | Motion tracking system and method using camera and non-camera sensors |
US20130100268A1 (en) * | 2008-05-27 | 2013-04-25 | University Health Network | Emergency detection and response system and method |
US8761434B2 (en) * | 2008-12-17 | 2014-06-24 | Sony Computer Entertainment Inc. | Tracking system calibration by reconciling inertial data with computed acceleration of a tracked object in the three-dimensional coordinate system |
2010
- 2010-12-03 WO PCT/JP2010/071637 patent/WO2011068184A1/ja active Application Filing
- 2010-12-03 JP JP2011544301A patent/JPWO2011068184A1/ja active Pending
- 2010-12-03 US US13/512,696 patent/US8983124B2/en active Active

2015
- 2015-06-10 JP JP2015117868A patent/JP2016001875A/ja active Pending
Non-Patent Citations (1)
Title |
---|
MASAKATSU KOUROGI ET AL.: "Indoor positioning system using a self-contained sensor module for pedestrian navigation and its evaluation", MOBILE KENKYU RONBUNSHU, vol. 2008, 3 July 2008 (2008-07-03), pages 151 - 156 * |
Also Published As
Publication number | Publication date |
---|---|
JP2016001875A (ja) | 2016-01-07 |
US20120237086A1 (en) | 2012-09-20 |
US8983124B2 (en) | 2015-03-17 |
JPWO2011068184A1 (ja) | 2013-04-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011068184A1 (ja) | 移動体の測位装置 | |
Elloumi et al. | Indoor pedestrian localization with a smartphone: A comparison of inertial and vision-based methods | |
US20160061582A1 (en) | Scale estimating method using smart device and gravity data | |
US9465979B2 (en) | Measurement-target-selecting device, face-shape-estimating device, method for selecting measurement target, and method for estimating face shape | |
Kneip et al. | Closed-form solution for absolute scale velocity determination combining inertial measurements and a single feature correspondence | |
US10830606B2 (en) | System and method for detecting non-meaningful motion | |
TW201425971A (zh) | 圖資校正裝置、系統和方法 | |
US20160034817A1 (en) | Method and apparatus for categorizing device use case | |
JP5742794B2 (ja) | 慣性航法装置及びプログラム | |
US11113894B1 (en) | Systems and methods for GPS-based and sensor-based relocalization | |
JP4845068B2 (ja) | デッドレコニング装置 | |
CN108256563B (zh) | 基于距离度量的视觉词典闭环检测方法与装置 | |
JP3968429B2 (ja) | 位置情報処理装置 | |
JP2006090957A (ja) | 移動体の周囲物体検出装置及び移動体の周囲物体検出方法 | |
JP6698430B2 (ja) | 測定装置、測定方法およびプログラム | |
JP4714853B2 (ja) | デッドレコニング装置 | |
He et al. | WiFi based indoor localization with adaptive motion model using smartphone motion sensors | |
JP7098972B2 (ja) | 行動認識装置、行動認識システム、行動認識方法およびプログラム | |
Hnatiuc et al. | Path recognition using mobile phone | |
WO2019087581A1 (ja) | 情報処理装置と情報処理方法およびプログラム | |
Jiang et al. | Combining passive visual cameras and active imu sensors to track cooperative people | |
Yuan et al. | Visual Heading aided Pedestrian Navigation Method Based on Factor Graph in Indoor Environment | |
KR101376536B1 (ko) | 센서 융합을 이용한 모바일 객체의 위치 인식방법 및 그 장치 | |
Wang et al. | Posture recognition and adaptive step detection based on hand-held terminal | |
JP7309097B2 (ja) | 位置検知装置、位置検知方法、及び位置検知プログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10834634 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13512696 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011544301 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 10834634 Country of ref document: EP Kind code of ref document: A1 |