WO2011101945A1 - Object position correction apparatus, object position correction method, and object position correction program - Google Patents
Object position correction apparatus, object position correction method, and object position correction program
- Publication number
- WO2011101945A1 (PCT/JP2010/007469)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- observation
- estimated
- correction
- gravity
- center
- Prior art date
Links
Images
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/002—Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B21/00—Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
- G01B21/02—Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness
- G01B21/04—Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness by measuring coordinates of points
- G01B21/045—Correction of measurements
Definitions
- The present invention relates to an object position correction apparatus, an object position correction method, and an object position correction program for displaying the position of an observation target to a user.
- a camera may be used as a sensor that can detect the position of an object.
- The camera's ID identification accuracy for an object (identifying the object ID from image features such as shape or color obtained from the camera) cannot be 100%. Even if the identification result from the camera is object A, an object other than object A (object B or object C) may actually have been observed. In such a case, the object identified by the camera can, for example, be expressed as having an 80% probability of being object A, a 10% probability of being object B, and a 10% probability of being object C. Furthermore, the identification rate for objects with similar image features is low; for example, it is difficult to identify objects with similar colors or shapes, such as tomatoes and apples, with high accuracy. In addition, although there are some differences depending on the performance or arrangement of the camera, the observation position (positioning result) usually contains a certain amount of error. The identification ID and the observation position of an article are collectively called an observation value.
- Non-Patent Document 1 describes a technique for probabilistically estimating the position of an object in a Bayesian estimation framework by integrating a plurality of observation values from sensors whose object identification IDs or observation positions are ambiguous, thereby compensating for the lack of observation accuracy.
- In Non-Patent Document 1, since even a small probability (in the above example, the probability that the object identified by the camera is object B) is used in the object position estimation process, the estimation result is affected by the observation values of other objects. An example is shown in FIG. 19. As a result of object identification, observation value 1 has a 90% probability of being object A and a 10% probability of being object B. As a result of object identification, observation value 2 has a 10% probability of being object A and a 90% probability of being object B. When position estimation is performed in such an observation situation, the estimated position of object A is slightly affected by observation value 2 and lies at a position slightly shifted from the position of observation value 1 in the direction of observation value 2.
- An estimated position (for example, the mean of a Gaussian distribution) shifted in this way may be at a location that looks visually unnatural to the user. For example, if the observation target is a car, the estimated position of the car may not be on the road; if the observation target is a person, the estimated position of the person may be on a table.
- As a technique for correcting such a deviation of the estimated position, there is a technique using map matching (Patent Document 1). Since vehicle position information acquired by GPS (Global Positioning System) contains errors, the information presented to the user is displayed using map matching based on the outputs of the accelerator sensor, brake sensor, and turn-signal sensor, so that it can be adjusted flexibly.
- GPS Global Positioning System
- Hirofumi Kanazaki, Takehisa Yairi, Kazuo Machida, Kenji Kondo, Yoshihiko Matsukawa, "Variational Approximation Data Association Filter"
- In Patent Document 1, however, it is necessary to create a map for map matching in advance.
- Accordingly, an object of the present invention is to provide an object position correction apparatus, an object position correction method, and an object position correction program that correct the estimated position of an observation target to a position that does not look unnatural to the user, without creating in advance a map in which environment information is recorded.
- the present invention is configured as follows.
- According to one aspect of the present invention, there is provided an object position correction apparatus comprising:
- an object position estimation unit that estimates the ID and position of an object to obtain an estimated position of the object, based on (1) an ID likelihood and (2) an observation position of each object, acquired by observing a plurality of objects existing in an environment with an observation device, and (3) a predicted distribution representing the existence probability at the position of each object, created based on the estimation result of the ID and position of each object obtained from the ID likelihood and observation position of each object acquired at the previous observation;
- centroid position calculating means for calculating the centroid position of the observation positions; and
- object position correcting means for correcting the estimated position of the object based on the distance and direction from the centroid position calculated by the centroid position calculating means.
- According to another aspect of the present invention, there is provided an object position correction method in which: the ID and position of an object are estimated to obtain an estimated position of the object, based on (1) an ID likelihood and (2) an observation position of each object, acquired by observing a plurality of objects existing in an environment with an observation device, and (3) a predicted distribution representing the existence probability at the position of each object, created by an object position estimation unit based on the estimation result of the ID and position of each object obtained from the ID likelihood and observation position acquired at the previous observation;
- the centroid position of the observation positions is calculated by centroid position calculating means; and
- the estimated position of the object is corrected by object position correcting means based on the distance and direction from the centroid position calculated by the centroid position calculating means.
- According to still another aspect of the present invention, there is provided an object position correction program for causing a computer to realize: a function of estimating the ID and position of an object to obtain an estimated position of the object, based on (1) an ID likelihood and (2) an observation position of each object, acquired by observing a plurality of objects existing in an environment with an observation device, and (3) a predicted distribution representing the existence probability at the position of each object, created by an object position estimation unit based on the estimation result of the ID and position of each object obtained from the ID likelihood and observation position acquired at the previous observation;
- a function of calculating the centroid position of the observation positions by centroid position calculating means; and
- a function of correcting the estimated position of the object by object position correcting means based on the distance and direction from the centroid position calculated by the centroid position calculating means.
- According to the present invention, the estimated position of an object can be corrected based on the positional relationship of the observation positions of the objects detected by the observation device. Therefore, the object position estimation result can be corrected to a position that does not look unnatural to the user, without using a map in which environment information is recorded.
- FIG. 1 is a block diagram showing a configuration of an object position correcting apparatus according to the first embodiment of the present invention.
- FIG. 2A is a diagram for explaining an observation situation in a room as a living space that is an environment where an observation target exists in the object position correction apparatus according to the first embodiment of the present invention;
- FIG. 2B is a block diagram illustrating a configuration of a camera that is an example of an observation device of the object position correction device according to the first embodiment;
- FIG. 3 is a diagram showing an example of the estimation history of the object position estimation means recorded in the position estimation history database of the object position correction apparatus according to the first embodiment of the present invention
- FIG. 4 is a diagram showing a simple example of calculating the centroid position of the observed value by the centroid position calculating means in the object position correcting apparatus according to the first embodiment of the present invention
- FIG. 5 shows an outline of the correction of the estimated position of the article by the object position correction unit based on the barycentric position calculated by the barycentric position calculation unit in the object position correction device according to the first embodiment of the present invention.
- FIG. 6 is a diagram illustrating a state in which the estimated position (average position of the distribution) of the object is corrected based on the distance and the direction calculated in FIG. 12 in the object position correction apparatus according to the first embodiment of the present invention.
- FIG. 7 is a flowchart showing the overall processing of the object position correcting apparatus according to the first embodiment of the present invention
- FIG. 8 is a diagram illustrating an example of object template data recorded in the observation device of the object position correction device according to the first embodiment of the present invention
- FIG. 9 is a diagram showing an example (sensor model related to ID) of an ID likelihood conversion table of the object position correction apparatus according to the first embodiment of the present invention.
- FIG. 10 is a diagram illustrating an example of an observation history of an article by a camera recorded in an observation history database of the object position correction apparatus according to the first embodiment of the present invention
- FIG. 11 shows an observed value obtained at time 12:00:03 and an estimated position of each article obtained at time 12:00:02 in the object position correcting apparatus according to the first embodiment of the present invention.
- FIG. 12 is a diagram illustrating an operation example of the Kalman filter
- FIG. 13 is a diagram illustrating an example of an object position estimation situation in the object position correction apparatus according to the first embodiment of the present invention
- FIG. 14 is a diagram illustrating an example of the result of clustering in the object position correction apparatus according to the first embodiment of the present invention.
- FIG. 15 is a graph showing an example of the result of the object position estimating means in the object position correcting apparatus according to the first embodiment of the present invention
- FIG. 16 is a graph showing an example of the result of the object position correcting unit in the object position correcting apparatus according to the first embodiment of the present invention
- FIG. 17 is a diagram showing, for explanation, the true positions of the movement trajectories of the object (1, 1) and the object (1, 2) in the object position correcting apparatus according to the first embodiment of the present invention, in a case where the true position of each object is not known to the apparatus
- FIG. 18 is a view showing a display example of the estimated position and the corrected position of the object shown in FIGS. 15 and 16 in the object position correcting apparatus according to the first embodiment of the present invention.
- FIG. 19 is a diagram illustrating an object position estimation situation in the prior art
- FIG. 20 is a diagram illustrating an observation state in a room as a living space, which is an environment in which a plurality of persons exist, as another example of an observation target in the object position correction apparatus according to the first embodiment of the present invention.
- According to a first aspect of the present invention, there is provided an object position correction apparatus comprising:
- an object position estimation unit that estimates the ID and position of an object to obtain an estimated position of the object, based on (1) an ID likelihood and (2) an observation position of each object, acquired by observing a plurality of objects existing in an environment with an observation device, and (3) a predicted distribution representing the existence probability at the position of each object, created based on the estimation result of the ID and position of each object obtained from the ID likelihood and observation position of each object acquired at the previous observation;
- centroid position calculating means for calculating the centroid position of the observation positions; and
- object position correcting means for correcting the estimated position of the object based on the distance and direction from the centroid position calculated by the centroid position calculating means.
- According to a second aspect of the present invention, there is provided the object position correction apparatus according to the first aspect, wherein the object position correcting means moves the estimated position of the object, in the direction from the centroid position toward the estimated position of the object, by a correction distance calculated by weighting the distance from the centroid position to the estimated position of the object.
- According to a third aspect of the present invention, there is provided the object position correction apparatus according to the second aspect, wherein the object position correcting means further moves the estimated position of the object, in the direction from the centroid position toward the estimated position of the object, by a distance obtained by adding, to the correction distance, a value obtained by weighting the number of observation values output by the observation device.
- According to a fourth aspect of the present invention, there is provided the object position correction apparatus according to the second aspect, wherein the object position correcting means determines the weighting ratio for obtaining the correction distance based on the ID identification performance of the observation device.
- According to a fifth aspect of the present invention, there is provided the object position correction apparatus according to the second aspect, wherein the object position correcting means determines the weighting ratio for obtaining the correction distance based on the size of the sensing area of the observation device.
- According to a sixth aspect of the present invention, there is provided the object position correction apparatus according to the first aspect, wherein the object position estimation unit comprises: an object position estimation history database that records the estimation result of the ID and position of the object; predicted distribution creating means for creating the predicted distribution representing the existence probability at the position of the object based on the estimation result of the ID and position of the object; and object position estimating means for estimating the ID and position of the object based on the predicted distribution, the ID likelihood, and the observation position.
- According to a seventh aspect of the present invention, there is provided the object position correction apparatus according to the sixth aspect, further comprising the observation device that detects the plurality of objects existing in the environment and acquires the ID likelihood and the observation position of each object.
- According to an eighth aspect of the present invention, there is provided the object position correction apparatus according to any one of the first to seventh aspects, further comprising display means for displaying the ID of the object and the result of the corrected position.
- According to a ninth aspect of the present invention, there is provided the object position correction apparatus according to any one of the first to seventh aspects, wherein the centroid position calculating means calculates a centroid position for each cluster of observation positions clustered based on position.
- According to a tenth aspect of the present invention, there is provided the object position correction apparatus according to any one of the first to seventh aspects, wherein the object position correcting means further corrects the estimated position of the object based on the number of observation positions used by the centroid position calculating means.
- According to an eleventh aspect of the present invention, there is provided the object position correction apparatus according to the eighth aspect, wherein the display means displays the correction result of the object position correcting means overlaid on the estimation result of the object position estimating means.
- According to a twelfth aspect of the present invention, there is provided an object position correction method in which: the ID and position of an object are estimated to obtain an estimated position of the object, based on (1) an ID likelihood and (2) an observation position of each object, acquired by observing a plurality of objects existing in an environment with an observation device, and (3) a predicted distribution representing the existence probability at the position of each object, created by an object position estimation unit based on the estimation result of the ID and position of each object obtained from the ID likelihood and observation position acquired at the previous observation;
- the centroid position of the observation positions is calculated by centroid position calculating means; and
- the estimated position of the object is corrected by object position correcting means based on the distance and direction from the centroid position calculated by the centroid position calculating means.
- According to a thirteenth aspect of the present invention, there is provided an object position correction program for causing a computer to realize: a function of estimating the ID and position of an object to obtain an estimated position of the object, based on (1) an ID likelihood and (2) an observation position of each object, acquired by observing a plurality of objects existing in an environment with an observation device, and (3) a predicted distribution representing the existence probability at the position of each object, created by an object position estimation unit based on the estimation result of the ID and position of each object obtained from the ID likelihood and observation position acquired at the previous observation;
- a function of calculating the centroid position of the observation positions by centroid position calculating means; and
- a function of correcting the estimated position of the object by object position correcting means based on the distance and direction from the centroid position calculated by the centroid position calculating means.
- FIG. 1 is a diagram showing a configuration of an object position correcting apparatus according to the first embodiment of the present invention.
- As shown in FIG. 1, the object position correction apparatus includes an observation apparatus 101, an observation history database 102, a position estimation history database 103, predicted distribution creating means 104, object position estimating means 105, centroid position calculating means 106, object position correcting means 107, and display means 108.
- the position estimation history database 103, the predicted distribution creation means 104, and the object position estimation means 105 may be configured as a single object position estimation unit 120.
- FIG. 2A shows a room 201 as a specific example of a closed environment.
- the room 201 includes one or a plurality of cameras 202 as an example of the observation apparatus 101 that is a component of the object position correction apparatus according to the first embodiment of the present invention.
- One camera 202 is installed near the center of the ceiling of the room 201.
- the number of cameras 202 is not limited to one, and a plurality of cameras 202 may be installed.
- the article 203A, the article 203B, the article 203C, the article 203D, and the article 203E exist on a floor or a table as an example of an object to be observed.
- Each article has an ID as unique identification information.
- the article 203A is a plastic bottle
- the article 203B is a wallet
- the article 203C is a book
- the article 203D is a mobile phone
- the article 203E is a watch.
- the camera 202 as an example of the observation apparatus 101 observes the inside of the room 201 and detects an article 203 existing in the room 201. That is, as will be described later, the camera 202 detects the article 203 by performing image processing on image data acquired by imaging the room 201 using a background subtraction method or the like.
- An observation ID (a unique ID attached to each piece of data or information every time an observation is made with the camera 202, used to distinguish it from other observation data or information), the identification ID of the detected article 203, and the observation position are acquired and recorded in the observation history database 102.
- the identification ID can be converted into an ID likelihood according to the ID likelihood conversion table.
- the ID likelihood is a probabilistic representation of which ID (article) the detected object (in the present embodiment, the article 203) is likely to be.
- the ID identification accuracy of the object of the camera 202 cannot be 100%.
- For example, the ID likelihood assigns a probability of 0.8 of being object A, a probability of 0.1 of being object B, and a probability of 0.1 of being object C; probabilities are assigned to all objects present in the room 201 (or that may be present). This is only an example of determining the ID likelihood, and the present invention is not limited to this.
- an observation ID is an object identification result of the object detected by the observation apparatus 101.
- Each observation device 101, for example the camera 202, is provided with a timer for acquiring information on the observation period and time, so that the time when the article 203 is detected can be output from the camera 202.
- In the position estimation history database 103, the average value and variance-covariance matrix of the article 203, which are the output results of the object position estimating means 105, and the time when the last observation value used by the object position estimating means 105 was obtained, are recorded.
- FIG. 3 shows an example of the position estimation history database 103.
- For example, an article with the article identification ID Obj001 is recorded at time 2008/09/02_12:00:01.
- The predicted distribution creating means 104 estimates the probability density distribution of the position of the article 203 based on the past estimated positions of the article 203 recorded in the position estimation history database 103, and outputs the result to the object position estimating means 105.
- The position of the article 203 for which the probability density distribution is estimated by the predicted distribution creating means 104 is the position at the time when the observation value used by the object position estimating means 105 for estimating the position of the article 203 was obtained. Normally, the object position estimated using the previous observation value (the latest estimated position) should be used, but the prediction can also be made using an older estimated position.
- The object position estimating means 105 estimates the ID and position of the detected article 203 based on the ID likelihood and the observation position recorded in the observation history database 102, and on the predicted distribution.
- the object position estimation unit 105 includes an association unit 109 that calculates an association value.
- For the position estimation, the observation value of the observation device 101 such as the camera 202 and the predicted position (predicted distribution) of the object at the time when the observation device 101 observed the object are required.
- moving the predicted position in the direction of the observed value based on the likelihood information can be said to be object position estimation processing.
- the predicted position is calculated based on the estimated position of the object at the time when the observation apparatus 101 observed the object last time.
- the association value is a value representing an association between an observed value (information on ID likelihood and observed position) and an actual object.
- The ID likelihood and the observation position of the object received from the observation history database 102 indicate which object a detection is likely to correspond to; in other words, they represent the certainty that each observation value was obtained by observing a particular object, and the association value is calculated from the ID likelihood and the position likelihood.
- the position likelihood is a value calculated based on the distance between the observed position and the average position of the predicted distribution.
- The position likelihood increases as the distance decreases, and conversely, decreases as the distance increases.
- The position likelihood may also be obtained based on the Mahalanobis distance, taking into account the position error characteristics of the observation apparatus 101 and the variance-covariance matrix of the predicted distribution. In this case as well, the position likelihood increases as the Mahalanobis distance becomes shorter and decreases as it becomes longer.
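- As a concrete illustration, the position likelihood described above can be sketched as follows. This is a minimal sketch assuming a Gaussian-shaped function of the Mahalanobis distance; the function name and the `sensor_cov` parameter (modeling the position error characteristic of the observation apparatus 101) are illustrative, not taken from the patent.

```python
import numpy as np

def position_likelihood(observed_pos, predicted_mean, predicted_cov, sensor_cov):
    """Return a likelihood that decreases as the Mahalanobis distance between
    the observed position and the mean of the predicted distribution grows."""
    d = np.asarray(observed_pos, dtype=float) - np.asarray(predicted_mean, dtype=float)
    S = np.asarray(predicted_cov, dtype=float) + np.asarray(sensor_cov, dtype=float)
    mahalanobis_sq = float(d @ np.linalg.inv(S) @ d)   # squared Mahalanobis distance
    return float(np.exp(-0.5 * mahalanobis_sq))        # shorter distance -> higher likelihood
```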
- the gravity center position calculation means 106 calculates the gravity center position of the observation value based on the information recorded in the observation history database 102.
- the observation value used to calculate the position of the center of gravity is only the observation value used by the object position estimation unit 105 to estimate the position of the article 203 last time. It is assumed that the information regarding the observed value used by the gravity center position calculating unit 106 is obtained from the object position estimating unit 105.
- the object position correcting means 107 corrects the estimated position of the article 203 based on the center of gravity position calculated by the gravity center position calculating means 106 and the estimated position of the article 203 from the object position estimating means 105.
- The corrected estimated position of object A is denoted CEP_A, and the corrected estimated position of object B is denoted CEP_B.
- The estimated position of object B, which is the same as the centroid position, is not corrected. More detailed contents will be described later.
- the display unit 108 is configured by a monitor or the like that presents the estimated position corrected by the object position correcting unit 107 to the user.
- FIG. 7 is a flowchart showing the overall processing of the object position correcting apparatus according to the first embodiment of the present invention. Hereinafter, the detailed operation of the object position correcting apparatus will be described with reference to the flowchart of FIG.
- In step S301, the interior of the room 201 is observed with the camera 202, and processing for detecting the article 203 from the image captured by the camera 202 is performed. Specific examples are described below.
- For the detection, a background subtraction method can be used.
- A background image of the room 201 when the article 203 does not exist is captured in advance by the imaging unit 202a of the camera 202 and stored in the internal storage unit 202b built into the camera 202.
- The background image data of the room 201 when the article 203 does not exist is compared with the current image data captured by the camera 202 by the image processing unit 202c built into the camera 202.
- an area with different pixel values is extracted as a difference area by the image processing unit 202c. This difference area corresponds to the detected article 203.
- If the difference area is sufficiently small with respect to the article 203, the image processing unit 202c may determine that the difference area is not the article 203.
- The case where the difference area is sufficiently small with respect to the article 203 may be, for example, a case where the number of pixels in the difference area is equal to or less than a threshold set in advance based on the minimum number of pixels that can be recognized as the article 203.
- Otherwise, the image processing unit 202c determines that the difference area is the detected article 203.
- the detected observation position of the article 203 can be, for example, the barycentric position of the difference area.
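- A minimal sketch of this detection step is given below, assuming grayscale image arrays; the threshold values `pixel_diff_threshold` and `min_pixels` are illustrative, since the text does not specify them.

```python
import numpy as np

def detect_article(background, current, pixel_diff_threshold=30, min_pixels=50):
    """Compare the current frame with the stored background image, extract the
    difference area, and return its barycentric position (x, y); return None
    when the area is too small to be regarded as the article 203."""
    diff = np.abs(current.astype(np.int16) - background.astype(np.int16))
    ys, xs = np.nonzero(diff > pixel_diff_threshold)   # pixels whose values differ
    if xs.size <= min_pixels:                          # below the minimum recognizable size
        return None
    return float(xs.mean()), float(ys.mean())          # barycentric position of the difference area
```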
- the image processing unit 202c can identify the ID of the article detected by the camera 202. It is assumed that the template image for matching is recorded in advance in the internal storage unit 202b of the camera 202.
- FIG. 8 shows an example of image data of the article template recorded in the internal storage unit 202b of the camera 202.
- The information recorded in the internal storage unit 202b of the camera 202 consists of the object identification IDs Obj001 to Obj005 and a template image for each of the five objects with those IDs. After the ID of an object has been identified, the ID likelihood is next determined according to an ID likelihood conversion table.
- the object identification ID is an ID number that can uniquely identify the object.
- FIG. 9 shows an example of the ID likelihood conversion table.
- the probability that the object detected by the camera 202 has the object identification ID Obj001 is 0.80,
- the probability that it has the object identification ID Obj002 is 0.05,
- the probability that it has the object identification ID Obj003 is 0.10,
- the probability that it has the object identification ID Obj004 is 0.03, and
- the probability that it has the object identification ID Obj005 is 0.02. It is assumed that the ID likelihood conversion table is also recorded in the internal storage unit 202b of the camera 202.
- The ID likelihood conversion table is created in advance based on the tendency of ID identification errors, for example by imaging the objects Obj001 to Obj005 with the imaging unit 202a of the camera 202 a plurality of times while changing their postures and performing ID identification.
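- The conversion can be pictured as a simple table lookup. The sketch below shows only the row corresponding to an identification result of Obj001, using the values of FIG. 9; the dictionary layout and function name are illustrative.

```python
# Row for an identification result of "Obj001", taken from the values of FIG. 9.
# Rows for Obj002 to Obj005 would be built the same way from repeated imaging
# experiments with varied object postures.
ID_LIKELIHOOD_TABLE = {
    "Obj001": {"Obj001": 0.80, "Obj002": 0.05, "Obj003": 0.10,
               "Obj004": 0.03, "Obj005": 0.02},
}

def to_id_likelihood(identified_id):
    """Convert a template-matching result into an ID likelihood over all objects."""
    return ID_LIKELIHOOD_TABLE[identified_id]
```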
- The observation cycle of the camera 202 is 1 second as an example.
- the observation period of the camera 202 is not limited to 1 second, and may be a predetermined period.
- The process in which the image processing unit 202c of the camera 202 obtains the barycentric position of the background difference area, and the process of obtaining the ID likelihood by performing ID identification of the article 203 by template matching, correspond to the processing of step S302 in the flowchart of FIG. 7.
- In step S303, the image processing unit 202c of the camera 202 records the observation position and ID likelihood of the article 203 detected by the camera 202 in the observation history database 102.
- FIG. 10 shows an example of the observation history database 102.
- The time when the camera 202 detected the article 203, the observation position, the identification ID of the article, and the observation ID can be recorded by the camera 202 in the observation history database 102 of FIG. 10.
- an object other than the article 203 such as a wall or a column
- In step S304, the predicted distribution creating means 104 creates a predicted distribution of the article 203 based on the estimated position of the article 203 recorded in the position estimation history database 103. Specific examples are described below.
- The predicted distribution may be the same as the Gaussian distribution recorded in the position estimation history database 103.
- When the position estimation target is an object such as a car, an airplane, or a robot, a predicted distribution can be created based on the equation of motion of the object. For example, assume that a toy car is moving in the +X direction at a speed of 30 cm per second in the room 201.
- Given the estimated position recorded in the position estimation history database 103, the predicted distribution has an average position of (130, 450), obtained by moving the average position of the object 30 cm in the +X direction, together with a corresponding variance-covariance matrix.
- When the moving direction or speed of the position estimation target object is unknown (for example, a human or an animal whose equation of motion is unknown), the predicted distribution can still be created by the predicted distribution creating means 104.
- In this case, for example, the predicted distribution creating means 104 randomly determines the average value of the predicted distribution, and the variance is preferably set so that a distance of 1σ corresponds to the length of one side of the cubic room 201.
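- A minimal sketch of this prediction step, under the assumption of a 1-second observation cycle and two-dimensional positions, is shown below; the `process_noise` value and the fallback behaviour are illustrative choices, not taken from the patent.

```python
import numpy as np

def create_predicted_distribution(mean, cov, velocity=None, room_side=None,
                                  process_noise=25.0, dt=1.0):
    """Return the mean and variance-covariance matrix of the predicted distribution."""
    mean = np.asarray(mean, dtype=float)
    cov = np.asarray(cov, dtype=float)
    if velocity is not None:
        # Equation of motion known (e.g. the toy car moving +30 cm per second in X):
        # shift the mean by velocity * dt and inflate the covariance slightly.
        return mean + np.asarray(velocity, dtype=float) * dt, cov + process_noise * np.eye(2)
    if room_side is not None:
        # Equation of motion unknown: the mean may be chosen freely (kept as-is here)
        # and the variance is set so that 1 sigma equals one side of the room.
        return mean, np.eye(2) * float(room_side) ** 2
    return mean, cov   # otherwise reuse the recorded Gaussian distribution
```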
- In step S305, the object position estimating means 105 performs object position estimation processing based on the observation value of the camera 202 and the predicted distribution.
- When the object ID likelihood and the observation position are received from the observation history database 102, the association means 109 first calculates an association value.
- Next, the position of the article 203 is estimated using the association value calculated by the association means 109. Specifically, the position of the article 203 can be estimated (the estimated position can be updated from the previous estimated position) using a Bayesian estimation framework represented by the Kalman filter or the like; the position is updated based on the ID likelihood and position likelihood of the detected article 203. At this time, the position of the article 203 is updated using only observation values whose association value exceeds a threshold value. The threshold value needs to be set in advance, estimated from a preliminary experiment or the like. As a tendency, it is desirable to set a low threshold value when identification errors due to image processing are likely to occur.
- That identification errors are likely to occur means that the ID likelihood of the detected object tends to be small, and therefore the association value also tends to be small.
- Alternatively, all the observation values may be used regardless of the magnitude of the association value, but in this case it is desirable to weight the update amount of the position of the article 203 with the association value. That is, the update amount of the position of the object increases as the association value increases; observation data that is more likely to be observation data of a given object contributes more to the position update.
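- A sketch of the association step is shown below. It assumes that the association value is the product of the ID likelihood and the position likelihood and that a threshold of 0.01 is used; both are illustrative choices consistent with, but not dictated by, the description above.

```python
def association_value(id_likelihood, object_id, pos_likelihood):
    """Association value = ID likelihood for this object x position likelihood."""
    return id_likelihood.get(object_id, 0.0) * pos_likelihood

def observations_for_update(observations, object_id, threshold=0.01):
    """Return (weight, position) pairs for the observation values whose
    association value exceeds the threshold; the association value itself is
    reused as the weight on the position-update amount."""
    selected = []
    for id_likelihood, pos_likelihood, position in observations:
        a = association_value(id_likelihood, object_id, pos_likelihood)
        if a > threshold:
            selected.append((a, position))
    return selected
```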
- FIG. 11 shows an example of the observed value obtained at time 12:00:03 and the estimated position of each article obtained at time 12:00:02.
- The association value of the observation value OBS013 is 0.032, while the association values of the other observation values are less than 0.001.
- The Kalman filter estimates the most likely state of the system handled by the object position correction device (for example, the position of the object in the first embodiment of the present invention), based on the assumption that noise is included both in the state information and in the observation data (observation information) of the observation device 101.
- FIG. 12 shows an example in which a Kalman filter is used for object position estimation processing.
- the vertical axis represents the probability, and the horizontal axis represents the position of the object.
- When the object moves according to the motion model of (Expression 1), the observation apparatus 101 can obtain the observed value 903 given by (Expression 2).
- A represents a motion model of an object
- x represents an object position
- v represents process noise generated during movement
- y represents an observation value
- H represents an observation model that associates the object position x with the observation value y
- w represents observation noise
- t represents time.
- N (0, Q) represents a Gaussian distribution with an average of 0 and a variance of Q.
- N (0, R) similarly represents a Gaussian distribution with an average of 0 and a variance of R.
- The prior probability distribution 901 (hereinafter referred to as the "prior distribution") relating to the currently obtained position of the object is updated by the object position estimating means 105 to obtain the predicted probability distribution 902 (hereinafter referred to as the "predicted distribution").
- The average (position) of the predicted distribution 902 can be obtained by the object position estimating means 105 using (Expression 5), and the variance of the predicted distribution 902 using (Expression 6).
- The notation x_(a|b) represents the estimated value of x at time a based on the information up to time b; for example, x_(t|t-1) in (Expression 5) represents the estimated value of the object position x at time t based on the information up to time t-1.
- P represents the variance of the distribution.
- the posterior distribution 904 is obtained by the object position estimating means 105 from the observed value 903 and the predicted distribution 902.
- The average (position) of the posterior distribution can be obtained by the object position estimating means 105 using (Expression 7), and the variance of the posterior distribution using (Expression 8).
- K is a value called the Kalman gain, and is obtained by (Expression 9).
- the Kalman gain is a value that determines the update amount. When the accuracy of the observed value is good (dispersion R is very small), the value of the Kalman gain increases to increase the update amount. On the contrary, when the accuracy of the prior distribution is good (the variance P is very small), the value of the Kalman gain is small in order to reduce the update amount.
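- A self-contained sketch of one Kalman predict/update cycle consistent with the roles described for (Expression 5) to (Expression 10) is given below; the correspondence to each expression number is inferred from the surrounding description rather than reproduced from the patent figures, and weighting the gain by the association value is shown as an option.

```python
import numpy as np

def kalman_step(x_prev, P_prev, y, Q, R, A=None, H=None, assoc=1.0):
    """One predict/update cycle for the object position.

    x_prev, P_prev : posterior mean and covariance at time t-1
    y              : observed value at time t
    Q, R           : process and observation noise covariances
    assoc          : association value D used to weight the gain (optional)
    """
    x_prev = np.asarray(x_prev, dtype=float)
    P_prev = np.asarray(P_prev, dtype=float)
    n = x_prev.shape[0]
    A = np.eye(n) if A is None else np.asarray(A, dtype=float)
    H = np.eye(n) if H is None else np.asarray(H, dtype=float)
    # Prediction: mean and variance of the predicted distribution 902
    x_pred = A @ x_prev
    P_pred = A @ P_prev @ A.T + Q
    # Kalman gain, optionally weighted by the association value
    K = assoc * (P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R))
    # Posterior distribution 904: mean and variance
    x_post = x_pred + K @ (np.asarray(y, dtype=float) - H @ x_pred)
    P_post = (np.eye(n) - K @ H) @ P_pred
    return x_post, P_post
```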
- FIG. 13 shows an example of the estimation result of the object position estimation means 105.
- FIG. 13 shows the result of object position estimation using the observation values at time 2008/09/02_12:00:03. Comparing FIG. 13 and FIG. 10, it can be seen that the estimated position of the object is not at the same position as the observation position of the camera 202.
- When weighting the update amount of the position of the article 203 with the association value, (Expression 9) may be replaced with (Expression 10).
- D represents an association value for the article 203.
- the weighting information based on the association value is output from the association unit 109 to the object position estimation unit 105.
- the position of the article 203 in the position estimation history database 103 is updated by the object position estimation means 105.
- In step S306, the centroid position calculating means 106 obtains the centroid position of the observation values used by the object position estimating means 105.
- the gravity center position of the observation value is calculated by the gravity center position calculation means 106.
- The observation values used for calculating the centroid position are only the observation values used by the object position estimating means 105; the information regarding those observation values is assumed to be obtained from the object position estimating means 105. That is, when the estimation result of the object position estimating means 105 shown in FIG. 13 is corrected, the observation values used to obtain the centroid position are the five observation values (OBS011 to OBS015) observed at time 2008/09/02_12:00:03.
- the estimated position is affected by the observation values in the vicinity thereof, but the influence is not constant and is related to the magnitude of the association value.
- the association value is related to the size of the position likelihood. That is, there is a high possibility that the estimated position is affected by an observation value that is present closer. Therefore, the centroid position calculation means 106 may cluster the observation values with respect to the position, and obtain the centroid position among the clustered observation values. Then, the estimated position, which will be described later, may be corrected using the position of the center of gravity that is closest to each estimated position.
- As a clustering method, for example, the k-means method can be used.
- First, an arbitrary number (for example, two) of representative values is prepared.
- The number of representative values can be set, for example, to the number of places where an object is likely to stay.
- For each observation value, the distance to each representative value is measured, and the representative value with the shortest distance determines the cluster to which the observation value belongs.
- Next, the centroid of each cluster is set as a new representative value, and the distance to each representative value is again measured for all observation values; each observation value is assigned to the cluster of the representative value with the shortest distance.
- The creation of representative values and the assignment of observation values are repeated, and the process ends when there is no change in the cluster to which each observation value belongs. All of these processes are performed by the centroid position calculating means 106.
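- A minimal k-means sketch matching the procedure above (choose representative values, assign each observation value to the nearest one, recompute the cluster centroids, stop when the assignments no longer change) might look as follows; the random initialization and iteration cap are illustrative.

```python
import numpy as np

def kmeans(points, k=2, max_iter=100, seed=0):
    """Cluster observation positions; returns (labels, centroids).
    Empty clusters are not handled in this sketch."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    reps = pts[rng.choice(len(pts), size=k, replace=False)]   # initial representative values
    labels = None
    for _ in range(max_iter):
        dists = np.linalg.norm(pts[:, None, :] - reps[None, :, :], axis=2)
        new_labels = dists.argmin(axis=1)        # nearest representative = assigned cluster
        if labels is not None and np.array_equal(new_labels, labels):
            break                                # assignments unchanged: finished
        labels = new_labels
        reps = np.array([pts[labels == c].mean(axis=0) for c in range(k)])  # new centroids
    return labels, reps
```

- In the example of FIG. 14, the positions of OBS011 to OBS015 would be passed as `points` with k = 2, and the centroid of each resulting cluster would then be used for the correction.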
- FIG. 14 shows an example of the clustering result.
- the number of representative values can be determined as 2 in advance.
- It is shown that OBS011, OBS012, and OBS013 are observation values belonging to cluster A (region 1902 in FIG. 14), and that OBS014 and OBS015 are observation values belonging to cluster B (region 1903 in FIG. 14).
- In step S307, the object position correcting means 107 corrects the estimated position of the article 203 based on the number of observation values and their positional relationship.
- the direction from the position of the center of gravity to the estimated position is determined by the object position correcting means 107 as the direction to be corrected.
- For example, the direction in which the estimated position of the object Obj004 is corrected is the direction from the centroid position toward the estimated position of the object Obj004.
- the distance to be corrected is determined by the object position correction means 107 based on the distance from the center of gravity position to the estimated position.
- the estimated position is usually affected by all observation values. For this reason, the estimated position existing near the center of gravity of the observed value is more likely to remain in the vicinity of the center of gravity as a result because the influence received from the surrounding observed values is offset.
- The degree of influence (the amount by which the estimated position deviates) is therefore related to the distance from the centroid position.
- weighting is performed by the object position correcting means 107 based on the distance from the center of gravity position to the estimated position and the number of observation values used by the object position estimating means 105 as in (Expression 11) and (Expression 12).
- The correction distance is calculated by the object position correcting means 107. That is, the object position correcting means 107 moves the estimated position of the object, in the direction from the centroid position toward the estimated position of the object, by a distance obtained by adding, to the correction distance, a value obtained by weighting the number of observation values output from the observation apparatus 101.
- D is a correction distance
- A is a distance between the center of gravity position and the observation position
- α is a weighting factor.
- The weighting factor α (in other words, the weighting ratio for obtaining the correction distance) is determined based on the size of the environment (in other words, the size of the sensing area of the observation device 101) and the object identification performance of the observation device 101. When the object identification performance of the observation device 101 is high, the likelihood assigned to an incorrect object ID is reduced, and the influence of updating the estimated position of the wrong object ID is also reduced (see the description of the object position estimating means 105). That is, the higher the object identification performance of the observation device 101, the smaller the value of the weighting factor α should be.
- (Expression 13) and (Expression 14) show how to calculate the correction distance D using the number of observation values.
- B represents the number of observation values
- β represents a weighting coefficient.
- The weighting factor β is determined based on the size of the environment and the object identification performance of the observation apparatus 101, in the same way as the weighting factor α.
- When the object identification performance of the observation apparatus 101 is high, the likelihood assigned to an incorrect object ID is reduced, and the influence of updating the estimated position of the wrong object ID is also reduced (see the description of the object position estimating means 105). That is, the higher the object identification performance of the observation apparatus 101, the smaller the value of the weighting factor β should be.
- The correction direction and distance are obtained in this way for the estimated position of each object, and each estimated position is corrected by the object position correcting means 107.
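- A hedged sketch of this correction step is shown below. The correction direction runs from the centroid toward the estimated position; the exact forms of (Expression 11) to (Expression 14) are not reproduced in this text, so the `correction_distance` function is only an illustrative placeholder using the example parameters α = 700 and β = 2.

```python
import numpy as np

def correction_distance(dist_to_centroid, num_observations, alpha=700.0, beta=2.0):
    """Illustrative placeholder combining a distance term (weight alpha) and an
    observation-count term (weight beta); the exact forms of (Expression 11)
    to (Expression 14) are not reproduced in this text."""
    return alpha / (dist_to_centroid + 1.0) + beta * num_observations

def correct_position(estimated, centroid, num_observations):
    """Move the estimate along the centroid -> estimate direction by the
    correction distance; an estimate that coincides with the centroid is left
    unchanged, as in the CEP_B example described earlier."""
    estimated = np.asarray(estimated, dtype=float)
    centroid = np.asarray(centroid, dtype=float)
    diff = estimated - centroid
    a = float(np.linalg.norm(diff))
    if a == 0.0:
        return estimated
    return estimated + (diff / a) * correction_distance(a, num_observations)
```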
- FIG. 15 shows an example of the result of the object position estimating unit 105
- FIG. 16 shows an example of the result of the object position correcting unit 107.
- the history of the estimation results of the positions of the two objects, the object (1, 1) and the object (1, 2), is shown as a trajectory.
- FIG. 17 shows the true positions of the movement trajectories of the object (1, 1) and the object (1, 2).
- However, the object position correction apparatus according to this first embodiment is assumed not to know the true position of each object.
- The object (1, 1) and the object (1, 2) move in parallel from the left end to the right end of FIG. 17.
- In step S308, the estimated position corrected by the object position correcting means 107 is presented to the user on the display means 108.
- the display unit 108 presents the estimated position corrected by the object position correcting unit 107 to the user.
- the covariance calculated by the object position estimating unit 105 may be simultaneously presented to the user by the display unit 108 at the estimated position. Further, the estimated position (estimated position before correction) calculated by the object position estimating means 105 may be simultaneously presented to the user on the display means 108.
- FIG. 18 shows a display example of the estimated position and the corrected position of the object shown in FIG. 15 and FIG.
- In FIG. 18, an obstacle 1402, a trajectory indicated by a solid line connecting the estimated object positions before correction, and a trajectory indicated by a dotted line connecting the corrected estimated object positions with straight lines are shown.
- The corrected trajectory is displayed overlaid on the estimated trajectory.
- An example of the obstacle 1402 is a bookshelf or a table, for example.
- As described above, the estimated position of an object can be corrected based on the positional relationship and the number of the observation values of the objects detected by the observation apparatus 101. Accordingly, the result of the object position estimation can be corrected to a position that does not look unnatural to the user, without using a map in which environment information is recorded.
- a plurality of people 212 may be observed with the camera 202 instead of the article 203.
- Each of the object position estimation unit 120, the centroid position calculating means 106, and the object position correcting means 107, or any part of them, may itself be configured as software. Therefore, for example, a computer program having steps constituting the control operation of each embodiment of the present specification may be stored readably in a recording medium such as a storage device (a hard disk or the like), and each function or step described above can be executed by reading the computer program into a temporary storage device of the computer (a semiconductor memory or the like) and executing it with a CPU.
- The object position correction apparatus, the object position correction method, and the object position correction program according to the present invention can correct the object position estimation result to a position that does not look unnatural to the user, without using a map in which environment information is recorded. Therefore, the present invention is particularly useful for object monitoring and display devices or methods in places where it is difficult to create a map in advance, or where environmental changes may occur (the home, office, factory, etc.).
Abstract
Description
centroid position calculating means for calculating the centroid position of the observation positions; and
object position correcting means for correcting the estimated position of the object based on the distance and direction from the centroid position calculated by the centroid position calculating means.
An object position correction apparatus comprising the above is provided.
The centroid position of the observation positions is calculated by centroid position calculating means, and
an object position correction method is provided in which the estimated position of the object is corrected by object position correcting means based on the distance and direction from the centroid position calculated by the centroid position calculating means.
A function of estimating the ID and position of an object to obtain an estimated position of the object, based on (1) the ID likelihood and (2) the observation position of each object acquired by observing a plurality of objects existing in an environment with an observation device, and (3) a predicted distribution representing the existence probability at the position of each object, created by an object position estimation unit based on the estimation result of the ID and position of each object obtained from the ID likelihood and observation position acquired at the previous observation;
a function of calculating the centroid position of the observation positions by centroid position calculating means; and
a function of correcting the estimated position of the object by object position correcting means based on the distance and direction from the centroid position calculated by the centroid position calculating means.
An object position correction program for realizing the above is provided.
centroid position calculating means for calculating the centroid position of the observation positions; and
object position correcting means for correcting the estimated position of the object based on the distance and direction from the centroid position calculated by the centroid position calculating means.
An object position correction apparatus comprising the above is provided.
An object position estimation history database that records the estimation result of the ID and position of the object;
predicted distribution creating means for creating the predicted distribution representing the existence probability at the position of the object based on the estimation result of the ID and position of the object; and
object position estimating means for estimating the ID and position of the object based on the predicted distribution, the ID likelihood, and the observation position. The object position correction apparatus according to the first aspect, in which the object position estimation unit comprises the above, is provided.
The centroid position of the observation positions is calculated by centroid position calculating means, and
an object position correction method is provided in which the estimated position of the object is corrected by object position correcting means based on the distance and direction from the centroid position calculated by the centroid position calculating means.
A function of estimating the ID and position of an object to obtain an estimated position of the object, based on (1) the ID likelihood and (2) the observation position of each object acquired by observing a plurality of objects existing in an environment with an observation device, and (3) a predicted distribution representing the existence probability at the position of each object, created by an object position estimation unit based on the estimation result of the ID and position of each object obtained from the ID likelihood and observation position acquired at the previous observation;
a function of calculating the centroid position of the observation positions by centroid position calculating means; and
a function of correcting the estimated position of the object by object position correcting means based on the distance and direction from the centroid position calculated by the centroid position calculating means.
An object position correction program for realizing the above is provided.
FIG. 1 is a diagram showing the configuration of the object position correction apparatus according to the first embodiment of the present invention.
The estimated position recorded in the position estimation history database 103 is
That is, when correcting the estimation result of the object position estimating means 105 shown in FIG. 13, the observation values used to obtain the centroid position are the five observation values (OBS011 to OBS015) observed at time 2008/09/02_12:00:03. The centroid position is (x, y) = (300, 310).
For explanation, FIG. 17 shows the true positions of the movement trajectories of the object (1, 1) and the object (1, 2), but the object position correction apparatus according to this first embodiment is assumed not to know the true position of each object. The object (1, 1) and the object (1, 2) move in parallel from the left end to the right end of FIG. 17 while keeping an interval of 200 cm. Specifically, the object (1, 1) moves from the coordinates (-500, 800) to the coordinates (500, 800), and the object (1, 2) moves from the coordinates (-500, 600) to the coordinates (500, 600). Each object starts moving at the same time and moves at the same speed. Looking at the result of the object position estimating means 105 shown in FIG. 15, the position estimates begin to be affected by each other's observation values immediately after the start of movement, and at a point about 200 cm along, the object (1, 1) and the object (1, 2) have each been drawn about 80 cm toward the other. In contrast, looking at the result of the object position correcting means 107 shown in FIG. 16, the object positions are estimated (corrected) while the interval between the object (1, 1) and the object (1, 2) is maintained. In this example, the parameters are set to α = 700 and β = 2.
Claims (13)
- An object position correction apparatus comprising: an object position estimation unit that estimates the ID and position of an object to obtain an estimated position of the object, based on (1) an ID likelihood and (2) an observation position of each object, acquired by observing a plurality of objects existing in an environment with an observation device, and (3) a predicted distribution representing the existence probability at the position of each object, created based on the estimation result of the ID and position of each object obtained from the ID likelihood and observation position of each object acquired at the previous observation;
centroid position calculating means for calculating the centroid position of the observation positions; and
object position correcting means for correcting the estimated position of the object based on the distance and direction from the centroid position calculated by the centroid position calculating means.
- The object position correction apparatus according to claim 1, wherein the object position correcting means moves the estimated position of the object, in the direction from the centroid position toward the estimated position of the object, by a correction distance calculated by weighting the distance from the centroid position to the estimated position of the object.
- The object position correction apparatus according to claim 2, wherein the object position correcting means further moves the estimated position of the object, in the direction from the centroid position toward the estimated position of the object, by a distance obtained by adding, to the correction distance, a value obtained by weighting the number of observation values output by the observation device.
- The object position correction apparatus according to claim 2, wherein the object position correcting means determines the weighting ratio for obtaining the correction distance based on the ID identification performance of the observation device.
- The object position correction apparatus according to claim 2, wherein the object position correcting means determines the weighting ratio for obtaining the correction distance based on the size of the sensing area of the observation device.
- The object position correction apparatus according to claim 1, wherein the object position estimation unit comprises:
an object position estimation history database that records the estimation result of the ID and position of the object;
predicted distribution creating means for creating the predicted distribution representing the existence probability at the position of the object based on the estimation result of the ID and position of the object; and
object position estimating means for estimating the ID and position of the object based on the predicted distribution, the ID likelihood, and the observation position. - The object position correction apparatus according to claim 1 or 6, further comprising the observation device that detects the plurality of objects existing in the environment and acquires the ID likelihood and the observation position of each object.
- The object position correction apparatus according to any one of claims 1 to 7, further comprising display means for displaying the ID of the object and the result of the corrected position.
- The object position correction apparatus according to any one of claims 1 to 7, wherein the centroid position calculating means calculates a centroid position for each cluster of observation positions clustered based on position.
- The object position correction apparatus according to any one of claims 1 to 7, wherein the object position correcting means further corrects the estimated position of the object based on the number of observation positions used by the centroid position calculating means.
- The object position correction apparatus according to claim 8, wherein the display means displays the correction result of the object position correcting means overlaid on the estimation result of the object position estimating means.
- An object position correction method in which: the ID and position of an object are estimated to obtain an estimated position of the object, based on (1) an ID likelihood and (2) an observation position of each object, acquired by observing a plurality of objects existing in an environment with an observation device, and (3) a predicted distribution representing the existence probability at the position of each object, created by an object position estimation unit based on the estimation result of the ID and position of each object obtained from the ID likelihood and observation position acquired at the previous observation;
the centroid position of the observation positions is calculated by centroid position calculating means; and
the estimated position of the object is corrected by object position correcting means based on the distance and direction from the centroid position calculated by the centroid position calculating means.
- An object position correction program for causing a computer to realize:
a function of estimating the ID and position of an object to obtain an estimated position of the object, based on (1) an ID likelihood and (2) an observation position of each object, acquired by observing a plurality of objects existing in an environment with an observation device, and (3) a predicted distribution representing the existence probability at the position of each object, created by an object position estimation unit based on the estimation result of the ID and position of each object obtained from the ID likelihood and observation position acquired at the previous observation;
a function of calculating the centroid position of the observation positions by centroid position calculating means; and
a function of correcting the estimated position of the object by object position correcting means based on the distance and direction from the centroid position calculated by the centroid position calculating means.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2010800234023A CN102449427A (zh) | 2010-02-19 | 2010-12-24 | 物体位置修正装置、物体位置修正方法及物体位置修正程序 |
JP2011539579A JP4875228B2 (ja) | 2010-02-19 | 2010-12-24 | 物体位置補正装置、物体位置補正方法、及び物体位置補正プログラム |
US13/238,030 US8401234B2 (en) | 2010-02-19 | 2011-09-21 | Object position correction apparatus, object position correction method, and object position correction program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-034309 | 2010-02-19 | ||
JP2010034309 | 2010-02-19 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/238,030 Continuation US8401234B2 (en) | 2010-02-19 | 2011-09-21 | Object position correction apparatus, object position correction method, and object position correction program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011101945A1 true WO2011101945A1 (ja) | 2011-08-25 |
Family
ID=44482569
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/007469 WO2011101945A1 (ja) | 2010-02-19 | 2010-12-24 | 物体位置補正装置、物体位置補正方法、及び物体位置補正プログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US8401234B2 (ja) |
JP (1) | JP4875228B2 (ja) |
CN (1) | CN102449427A (ja) |
WO (1) | WO2011101945A1 (ja) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5017392B2 (ja) * | 2010-02-24 | 2012-09-05 | クラリオン株式会社 | 位置推定装置および位置推定方法 |
TWI485421B (zh) * | 2012-12-17 | 2015-05-21 | Ind Tech Res Inst | 圖資校正裝置、系統和方法 |
JP6271953B2 (ja) * | 2013-11-05 | 2018-01-31 | キヤノン株式会社 | 画像処理装置、画像処理方法 |
CN106323275B (zh) * | 2015-06-17 | 2019-06-04 | 中国科学院上海高等研究院 | 基于贝叶斯估计和地图辅助校准的步行者室内定位方法 |
CN107742303B (zh) * | 2017-09-29 | 2021-05-25 | 南京阿凡达机器人科技有限公司 | 一种基于机器人的目标图像显示处理方法及系统 |
CN110232713B (zh) * | 2019-06-13 | 2022-09-20 | 腾讯数码(天津)有限公司 | 一种图像目标定位修正方法及相关设备 |
JP2021103347A (ja) * | 2019-12-24 | 2021-07-15 | キヤノン株式会社 | 情報処理装置、情報処理方法及びプログラム |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5499306A (en) * | 1993-03-08 | 1996-03-12 | Nippondenso Co., Ltd. | Position-and-attitude recognition method and apparatus by use of image pickup means |
JP3398038B2 (ja) | 1998-03-19 | 2003-04-21 | アイシン・エィ・ダブリュ株式会社 | ナビゲーション出力装置 |
US7845560B2 (en) * | 2004-12-14 | 2010-12-07 | Sky-Trax Incorporated | Method and apparatus for determining position and rotational orientation of an object |
JP2006338079A (ja) | 2005-05-31 | 2006-12-14 | Nissan Motor Co Ltd | 画像補正装置 |
JP4874607B2 (ja) | 2005-09-12 | 2012-02-15 | 三菱電機株式会社 | 物体測位装置 |
US7869649B2 (en) * | 2006-05-08 | 2011-01-11 | Panasonic Corporation | Image processing device, image processing method, program, storage medium and integrated circuit |
EP2407081A4 (en) * | 2009-03-10 | 2013-03-13 | Olympus Medical Systems Corp | POSITION DETECTING SYSTEM AND POSITION DETECTING METHOD |
JP5114514B2 (ja) * | 2010-02-25 | 2013-01-09 | 株式会社日立製作所 | 位置推定装置 |
- 2010-12-24 WO PCT/JP2010/007469 patent/WO2011101945A1/ja active Application Filing
- 2010-12-24 JP JP2011539579A patent/JP4875228B2/ja active Active
- 2010-12-24 CN CN2010800234023A patent/CN102449427A/zh active Pending
- 2011-09-21 US US13/238,030 patent/US8401234B2/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04264207A (ja) * | 1991-02-19 | 1992-09-21 | Nippon Telegr & Teleph Corp <Ntt> | 多視点ステレオ画像計測方法 |
JP2005141687A (ja) * | 2003-11-10 | 2005-06-02 | Nippon Telegr & Teleph Corp <Ntt> | 物体追跡方法、物体追跡装置、物体追跡システム、プログラム、および、記録媒体 |
JP2007249309A (ja) * | 2006-03-13 | 2007-09-27 | Toshiba Corp | 障害物追跡装置及びその方法 |
JP2007303886A (ja) * | 2006-05-09 | 2007-11-22 | Sony Corp | 位置推定装置、位置推定方法及びプログラム記録媒体 |
JP2009211122A (ja) * | 2008-02-29 | 2009-09-17 | Toshiba Teli Corp | 画像処理装置およびオブジェクト推定プログラム。 |
WO2009113265A1 (ja) * | 2008-03-11 | 2009-09-17 | パナソニック株式会社 | タグセンサシステムおよびセンサ装置、ならびに、物体位置推定装置および物体位置推定方法 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016091202A (ja) * | 2014-10-31 | 2016-05-23 | 株式会社豊田中央研究所 | 自己位置推定装置及び自己位置推定装置を備えた移動体 |
WO2020013298A1 (ja) * | 2018-07-12 | 2020-01-16 | Groove X株式会社 | 発信元方向推定装置、発信元方向推定システム、赤外線発光装置、ロボット、発信元方向推定方法およびプログラム、対象物存在方向推定システム |
JPWO2020013298A1 (ja) * | 2018-07-12 | 2021-08-05 | Groove X株式会社 | 発信元方向推定装置、発信元方向推定システム、赤外線発光装置、ロボット、発信元方向推定方法およびプログラム、対象物存在方向推定システム |
JP7473202B2 (ja) | 2018-07-12 | 2024-04-23 | Groove X株式会社 | 発信元方向推定装置、発信元方向推定システム、赤外線発光装置、ロボット、発信元方向推定方法およびプログラム、対象物存在方向推定システム |
DE112021001527T5 (de) | 2020-03-06 | 2023-01-19 | Sony Group Corporation | Informationsverarbeitungsvorrichtung und informationsverarbeitungsverfahren |
WO2022239644A1 (ja) * | 2021-05-12 | 2022-11-17 | 株式会社デンソー | 追跡装置 |
Also Published As
Publication number | Publication date |
---|---|
US8401234B2 (en) | 2013-03-19 |
JPWO2011101945A1 (ja) | 2013-06-17 |
US20120008831A1 (en) | 2012-01-12 |
JP4875228B2 (ja) | 2012-02-15 |
CN102449427A (zh) | 2012-05-09 |
Similar Documents
Publication | Title |
---|---|
JP4875228B2 (ja) | 物体位置補正装置、物体位置補正方法、及び物体位置補正プログラム |
US8073200B2 (en) | Information processing apparatus, information processing method, and computer program |
US9927814B2 (en) | System and method for localization of robots |
US9989626B2 (en) | Mobile robot and sound source position estimation system |
KR101708061B1 (ko) | 제어 장치, 제어 방법 및 기록 매체 |
JP5873864B2 (ja) | オブジェクト追跡及び認識方法及び装置 |
US9576191B2 (en) | Posture estimation device, posture estimation method, and posture estimation program |
EP2618232A1 (en) | Map generation device, map generation method, method for moving mobile body, and robot device |
CN107053166B (zh) | 自主移动装置、自主移动方法以及存储介质 |
WO2015008432A1 (ja) | 物体追跡装置、物体追跡方法および物体追跡プログラム |
JP4978099B2 (ja) | 自己位置推定装置 |
JP2012163495A5 (ja) | |
US11298050B2 (en) | Posture estimation device, behavior estimation device, storage medium storing posture estimation program, and posture estimation method |
WO2013032192A2 (ko) | 천장 임의 형상 특성 활용 이동 로봇 위치 인식 방법 |
WO2019225746A1 (ja) | ロボットシステム及び追加学習方法 |
JP2007240295A (ja) | 位置推定装置、位置推定方法及び位置推定プログラム |
KR101460313B1 (ko) | 시각 특징과 기하 정보를 이용한 로봇의 위치 추정 장치 및 방법 |
JPWO2011027557A1 (ja) | 位置校正情報収集装置、位置校正情報収集方法、及び位置校正情報収集プログラム |
JP2017084335A (ja) | ユーザーインターフェースのリアルタイムインタラクティブ操作のためのシステム及び方法 |
Fetzer et al. | On Monte Carlo smoothing in multi sensor indoor localisation |
JP6349272B2 (ja) | 移動物体追跡装置 |
Nobre et al. | Drift-correcting self-calibration for visual-inertial SLAM |
KR100998709B1 (ko) | 물체의 공간적 의미정보를 이용한 로봇의 자기위치 추정 방법 |
CN113327270A (zh) | 视觉惯导方法、装置、设备及计算机可读存储介质 |
JP7096176B2 (ja) | 物体位置推定装置およびその方法 |
Legal Events
Code | Title | Description |
---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 201080023402.3; Country of ref document: CN |
WWE | Wipo information: entry into national phase | Ref document number: 2011539579; Country of ref document: JP |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10846085; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 10846085; Country of ref document: EP; Kind code of ref document: A1 |