WO2014076769A1 - Detection device, method, and program - Google Patents
Detection device, method, and program
- Publication number
- WO2014076769A1 (PCT/JP2012/079411)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- existence probability
- estimation
- road surface
- projection position
- projection
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/586—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of parking space
- Embodiments described herein relate generally to a detection apparatus, a method, and a program.
- Conventionally, a technique is known in which a spatial position is estimated using a plurality of images captured by a plurality of cameras installed in a vehicle and the parallax (correspondence relationship) between the plurality of images, and the boundary between an object (obstacle) standing on the road and the road surface is detected based on the estimated spatial position.
- The problem to be solved by the present invention is to provide a detection apparatus, method, and program capable of detecting the boundary between an object and a road surface even if the parallax and the depth between the plurality of images are not in a fixed relationship.
- the detection device of the embodiment includes a first estimation unit, a second estimation unit, a third estimation unit, a projection unit, a calculation unit, and a detection unit.
- the first estimation unit estimates a spatial first positional relationship between different imaging positions.
- the second estimation unit estimates, using a plurality of captured images captured at the different imaging positions and the first positional relationship, a spatial position in an estimation captured image that is one of the plurality of captured images, and an error position of that spatial position.
- the third estimation unit estimates a second positional relationship between the imaging position of the estimation captured image and the road surface.
- the projection unit uses the second positional relationship to project the imaging position of the estimation captured image onto the road surface to obtain a first projection position, projects the spatial position onto the road surface to obtain a second projection position, and projects the error position onto the road surface to obtain a third projection position.
- the calculation unit calculates the existence probability of an object by updating it, on the straight line passing through the first projection position and the second projection position, so that the existence probability becomes higher between the second projection position and the third projection position than between the first projection position and the third projection position.
- the detection unit detects a boundary between the road surface and the object using the existence probability.
- A block diagram showing an example of the detection device of the first embodiment.
- A flowchart showing an example of the detection process of the first embodiment.
- An explanatory diagram of an example of the arrangement, movement, and imaging timing of the imaging unit of the first embodiment.
- A flowchart showing an example of the estimation process of the first embodiment.
- An enlarged view of a corresponding point in the captured image of the first embodiment.
- A diagram showing an example of the captured image in which the initial region of the first embodiment is set.
- A diagram showing an example of the relationship between the position and attitude of the imaging unit of the first embodiment and the road surface.
- A flowchart showing an example of the projection process and the existence probability calculation process of the first embodiment.
- A block diagram showing an example of the detection device of the second embodiment.
- A flowchart showing an example of the detection process of the second embodiment.
- FIG. 1 is a configuration diagram illustrating an example of the detection device 10 according to the first embodiment.
- the detection device 10 includes an imaging unit 11, a first estimation unit 13, a second estimation unit 15, a third estimation unit 17, a projection unit 19, a calculation unit 21, a detection unit 23, an output control unit 25, and an output unit 27.
- the imaging unit 11 can be realized by an imaging device such as a digital camera, for example.
- the first estimation unit 13, the second estimation unit 15, the third estimation unit 17, the projection unit 19, the calculation unit 21, the detection unit 23, and the output control unit 25 may be realized by causing a processing device such as a CPU (Central Processing Unit) to execute a program, that is, by software; by hardware such as an IC (Integrated Circuit); or by a combination of software and hardware.
- the output unit 27 can be realized by, for example, a display device such as a liquid crystal display or a touch panel display, an audio device such as a speaker, or a combination of these devices.
- In the first embodiment, an example is described in which the detection device 10 is applied to an automobile and an obstacle on the road surface is detected. However, the detection device 10 can be applied to any moving body, and may be applied to, for example, an autonomous mobile robot.
- FIG. 2 is a flowchart illustrating an example of a flow of a detection process performed by the detection apparatus 10 according to the first embodiment.
- the imaging unit 11 inputs captured images captured in time series and internal parameters of the imaging unit 11 (camera).
- the internal parameters are the focal length of the lens, the size per pixel, the image center, and the like.
- the internal parameters are represented by a matrix of 3 rows and 3 columns.
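- For illustration, a matrix of this form can be assembled as in the sketch below; the function name and the numeric values are placeholders, not values taken from this patent.

```python
# A minimal sketch of the 3x3 internal parameter (intrinsic) matrix K built from the
# focal length, size per pixel, and image center; all numbers are illustrative.
import numpy as np

def intrinsic_matrix(f=4.0e-3, sx=3.0e-6, sy=3.0e-6, cx=640.0, cy=360.0):
    # f: focal length [m]; sx, sy: size per pixel [m/px]; (cx, cy): image center [px].
    return np.array([[f / sx, 0.0,    cx],
                     [0.0,    f / sy, cy],
                     [0.0,    0.0,    1.0]])
```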
- FIG. 3 is an explanatory diagram illustrating an example of the arrangement, movement, and imaging timing of the imaging unit 11 according to the first embodiment.
- In the example of FIG. 3, the imaging unit 11 (camera) is disposed on the rear of the vehicle 104 so that its optical axis coincides with the traveling direction (backward direction), and the vehicle 104 moves rearward in that direction.
- the imaging unit 11 first captures and inputs an image when it is positioned at the position 100, and then captures and inputs an image when it is positioned at the position 102.
- FIG. 4A is a diagram illustrating an example of the captured image 103 captured from the position 102 by the imaging unit 11 of the first embodiment, and FIG. 4B is a diagram illustrating an example of the captured image 101 captured from the position 100 by the imaging unit 11 of the first embodiment.
- Since the person 105 stands in the traveling direction of the vehicle 104, the person 105 appears in both captured images 101 and 103. Since the position 102 is closer to the person 105 than the position 100, the person 105 appears larger in the captured image 103 than in the captured image 101.
- the arrangement, movement, and input timing of the captured image of the imaging unit 11 are not limited to this, and three or more captured images may be captured.
- the imaging unit 11 inputs internal parameters together with the captured image.
- In the first embodiment, a plurality of captured images are captured by one imaging unit 11 (one camera). Therefore, the internal parameters do not change between the captured images and are constant values. For this reason, the imaging unit 11 may input the internal parameters in advance rather than each time a captured image is input.
- Next, the first estimation unit 13 estimates a spatial positional relationship (an example of the first positional relationship) between different imaging positions. Specifically, the first estimation unit 13 uses the time-series captured images and the internal parameters input from the imaging unit 11 to estimate a rotation matrix and a translation vector that express the relative positional relationship between the imaging positions of the time-series captured images.
- FIG. 5 is a flowchart showing an example of the flow of the estimation process performed by the first estimation unit 13 of the first embodiment.
- In FIG. 5, an example in which the rotation matrix and the translation vector between the captured images 101 and 103 are estimated is described.
- the first estimation unit 13 extracts feature points from each of the captured images 101 and 103 (step S201). For example, a Harris operator may be used to extract feature points.
- Next, the first estimation unit 13 associates each feature point extracted from the captured image 103 with a feature point of the captured image 101 (step S203). For example, the first estimation unit 13 calculates a descriptor describing a feature from each feature point of the captured image 103, obtains the similarity to the feature points of the captured image 101 according to the descriptors, and takes pairs with high similarity (for example, similarity equal to or higher than a threshold) as corresponding point pairs.
- The descriptor can use, for example, the luminance in a small region set around the feature point. Since a small region includes a plurality of luminance values, the luminance in the small region is represented as a vector. The similarity between feature points is evaluated by applying SAD (sum of absolute differences) or the like to the luminance vectors of the small regions, and a corresponding point set with high similarity is obtained.
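- A minimal sketch of this feature extraction and matching (steps S201 and S203) might look as follows, assuming OpenCV and NumPy; the function names, window size, and threshold are illustrative, not taken from the patent.

```python
# Harris feature points and SAD matching over small luminance regions;
# img101 and img103 are assumed to be grayscale NumPy uint8 arrays.
import cv2
import numpy as np

def harris_points(gray, max_pts=500):
    # Feature points by the Harris operator (step S201).
    pts = cv2.goodFeaturesToTrack(gray, maxCorners=max_pts, qualityLevel=0.01,
                                  minDistance=8, useHarrisDetector=True, k=0.04)
    return [] if pts is None else [tuple(p) for p in pts.reshape(-1, 2).astype(int)]

def descriptor(gray, p, r=7):
    # Luminance vector of the small region set around a feature point.
    x, y = p
    win = gray[y - r:y + r + 1, x - r:x + r + 1]
    return win.astype(np.float32).ravel() if win.shape == (2 * r + 1, 2 * r + 1) else None

def match_sad(img103, img101, sad_thresh=2000.0):
    # For each feature point of image 103, take the image-101 point with the
    # smallest SAD as its corresponding point (step S203); low SAD = high similarity.
    pairs = []
    desc101 = [(p, descriptor(img101, p)) for p in harris_points(img101)]
    for p in harris_points(img103):
        d = descriptor(img103, p)
        if d is None:
            continue
        cands = [(float(np.abs(d0 - d).sum()), p0) for p0, d0 in desc101 if d0 is not None]
        if cands:
            sad, p0 = min(cands)
            if sad < sad_thresh:
                pairs.append((p, p0))
    return pairs
```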
- Next, the first estimation unit 13 calculates an essential matrix using the obtained corresponding point set (step S205).
- The first estimation unit 13 can obtain the essential matrix by the known five-point method or the like. For example, the first estimation unit 13 calculates the essential matrix using Equation (1): x'^T E x = 0.
- Here, E represents the essential matrix, x represents the normalized image coordinates, in a homogeneous coordinate system, of a point of the corresponding point set in one image, x' represents the normalized image coordinates of the corresponding point in the other image, and T represents transposition. Note that the normalized image coordinates x and x' of the corresponding point set can be obtained by converting the corresponding point set using the internal parameters.
- Next, the first estimation unit 13 decomposes the calculated essential matrix by singular value decomposition, and calculates the rotation matrix and translation vector between the captured images 101 and 103 (step S207). At this time, a fourfold ambiguity arises in the rotation matrix and translation vector, but the first estimation unit 13 may narrow the four candidates down to one by using at least one of the facts that the rotational change between the imaging positions is small and that the three-dimensional positions of the corresponding points lie in front of the cameras.
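- The sketch below reproduces steps S205 and S207 with OpenCV's RANSAC-based five-point solver; cv2.recoverPose resolves the fourfold ambiguity by requiring the triangulated points to lie in front of the cameras, which matches the second narrowing-down criterion above. The parameter values are illustrative.

```python
# Essential matrix from the corresponding point set and decomposition into R, t.
import cv2
import numpy as np

def relative_pose(pts103, pts101, K):
    # pts103, pts101: (N, 2) float arrays of corresponding pixel coordinates; K: 3x3.
    E, mask = cv2.findEssentialMat(pts103, pts101, K, method=cv2.RANSAC,
                                   prob=0.999, threshold=1.0)
    # Equation (1), x'^T E x = 0, holds for the inliers in normalized image coordinates.
    _, R, t, _ = cv2.recoverPose(E, pts103, pts101, K, mask=mask)
    return R, t   # t has unit length; the absolute scale is unobservable from two views
```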
- Next, the second estimation unit 15 estimates, using the plurality of captured images captured at the different imaging positions and the positional relationship estimated by the first estimation unit 13, a spatial position in the estimation captured image, which is one of the plurality of captured images, and the error position of that spatial position.
- Specifically, the second estimation unit 15 uses the time-series captured images input from the imaging unit 11 and the rotation matrix and translation vector estimated by the first estimation unit 13 to estimate the three-dimensional position of each point in the estimation captured image and the error position of the three-dimensional position.
- FIG. 6A is a diagram illustrating an example of the captured image 103 according to the first embodiment
- FIG. 6B is a diagram illustrating an example of the captured image 101 according to the first embodiment.
- the three-dimensional position is estimated by associating the captured images 101 and 103 with each other and estimating the parallax.
- Specifically, the second estimation unit 15 calculates an epipolar line 301 on the captured image 101 using the rotation matrix and translation vector calculated by the first estimation unit 13, sets arbitrary points on the epipolar line 301 (points 302 and 303 are illustrated in FIG. 6B), and sets a small region around each point. The second estimation unit 15 then obtains the similarity between the luminance pattern of the small region around each point set on the epipolar line 301 and the luminance pattern of the small region around the point 300, and takes the point with the most similar luminance pattern (point 303 in FIG. 6B) as the corresponding point of the point 300, thereby obtaining the correspondence between the captured images 101 and 103.
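- For illustration, the epipolar line 301 can be computed from the estimated rotation matrix and translation vector as sketched below; the fundamental-matrix construction F = K^{-T} [t]_x R K^{-1} is a standard identity assumed here, and whether F or its transpose applies depends on the ordering of the two images.

```python
# Epipolar line in captured image 101 for a point m300 of captured image 103.
import numpy as np

def epipolar_line(K, R, t, m300):
    t = np.ravel(t)
    t_cross = np.array([[0.0, -t[2], t[1]],
                        [t[2], 0.0, -t[0]],
                        [-t[1], t[0], 0.0]])        # [t]_x, the cross-product matrix
    F = np.linalg.inv(K).T @ (t_cross @ R) @ np.linalg.inv(K)
    a, b, c = F @ np.array([m300[0], m300[1], 1.0])
    return a, b, c                                   # line: a*x + b*y + c = 0
```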
- the second estimation unit 15 calculates the three-dimensional position of the point 300 using the point 300 and the obtained corresponding point 303.
- With the homogeneous coordinates of the three-dimensional position represented by Equation (2), the homogeneous coordinates of the point 300 represented by Equation (3), and the projection matrices P = K(I | 0) and P' = K(R | t), the three-dimensional position of the point 300 is expressed by Equation (5).
- Further, the second estimation unit 15 determines the error δ of the corresponding point 303, and calculates the three-dimensional position of the point 300 when the corresponding point 303 is shifted by the error δ.
- For example, the second estimation unit 15 may determine the error δ as a fixed value of one pixel or less, or from the step width used for the search on the epipolar line 301 when obtaining the corresponding point 303.
- FIG. 7 is an enlarged view of the corresponding point 303 in the captured image 101 of the first embodiment. The corresponding points 304 and 305 that take the error δ into consideration, shown in FIG. 7, are expressed by Expression (6).
- For the corresponding points 304 and 305, the second estimation unit 15 calculates the respective three-dimensional positions by applying the homogeneous coordinates of the corresponding points 304 and 305 to Equation (5) in place of the homogeneous coordinates of the corresponding point 303.
- the second estimation unit 15 estimates the three-dimensional position at each position of the captured image 103 by performing the above-described three-dimensional position estimation process for each point other than the point 300 in the captured image 103.
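- A sketch of this three-dimensional position estimation is shown below, using OpenCV's linear triangulation with the projection matrices P = K(I | 0) and P' = K(R | t); the shift direction along the epipolar line and the value of δ are passed in as assumptions.

```python
# Linear triangulation of the point 300 with its corresponding point 303, and of the
# error points 304 and 305 obtained by shifting m303 by +/- delta along the line.
import cv2
import numpy as np

def triangulate(K, R, t, m, m_corr):
    P0 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])   # P  = K(I | 0)
    P1 = K @ np.hstack([R, np.reshape(t, (3, 1))])      # P' = K(R | t)
    Xh = cv2.triangulatePoints(P0, P1,
                               np.asarray(m, float).reshape(2, 1),
                               np.asarray(m_corr, float).reshape(2, 1))
    return (Xh[:3] / Xh[3]).ravel()   # homogeneous -> Euclidean 3D position

def triangulate_with_error(K, R, t, m300, m303, line_dir, delta=0.5):
    m303 = np.asarray(m303, float)
    line_dir = np.asarray(line_dir, float)
    X300 = triangulate(K, R, t, m300, m303)                      # nominal position
    X304 = triangulate(K, R, t, m300, m303 + delta * line_dir)   # error point 304
    X305 = triangulate(K, R, t, m300, m303 - delta * line_dir)   # error point 305
    return X300, X304, X305
```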
- In the first embodiment, the captured image 103 is used as the estimation captured image and the three-dimensional position of each point in the captured image 103 is estimated, but the embodiment is not limited to this; the captured image 101 may be used as the estimation captured image, and the three-dimensional position of each point in the captured image 101 may be estimated.
- In step S107, the third estimation unit 17 estimates the positional relationship (an example of the second positional relationship) between the imaging position of the estimation captured image and the road surface.
- Specifically, the third estimation unit 17 estimates the relationship between the position and attitude of the imaging unit 11 and the road surface using the spatial positions (three-dimensional positions) estimated by the second estimation unit 15.
- However, since the three-dimensional positions estimated by the second estimation unit 15 include estimation errors due to incorrect correspondences, it cannot be determined directly which points in the captured image 103 belong to the road surface.
- Therefore, as shown in FIG. 8, the third estimation unit 17 sets an initial region 400 in the lower part of the captured image 103 and estimates a plane (the road surface) in the three-dimensional space using the three-dimensional position of each point included in the initial region 400.
- Since points that do not belong to the road surface may be included in the initial region 400, it is desirable that the third estimation unit 17 use RANSAC (RANdom Sample Consensus) and perform the estimation while randomly selecting points included in the initial region 400.
- the third estimation unit 17 thus obtains the road surface normal vector n and the distance d between the imaging unit 11 and the road surface by estimating the road surface in the three-dimensional space.
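- A minimal RANSAC plane fit over the three-dimensional positions of the initial region 400 might look as follows; the iteration count and inlier tolerance are illustrative.

```python
# RANSAC plane fit: repeatedly fit a plane to three random points of the initial
# region 400 and keep the plane supported by the most inliers.
import numpy as np

def fit_road_plane(points, iters=200, inlier_tol=0.05, seed=0):
    # points: (M, 3) array of three-dimensional positions; returns (n, d) with the
    # plane n.X + d = 0; since the camera sits at the origin, |d| is its road distance.
    rng = np.random.default_rng(seed)
    best = (None, None, -1)
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue                      # degenerate (collinear) sample
        n = n / norm
        d = -n @ p0
        count = int(np.sum(np.abs(points @ n + d) < inlier_tol))
        if count > best[2]:
            best = (n, d, count)
    return best[0], best[1]
```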
- In the above, the example in which the three-dimensional positions estimated by the second estimation unit 15 are used to obtain the road surface normal vector n and the distance d between the imaging unit 11 and the road surface has been described, but it is not always necessary to use the three-dimensional positions.
- For example, the normal vector n and the distance d between the imaging unit 11 and the road surface may be measured in advance when the imaging unit 11 is installed.
- the normal vector n and the distance d between the imaging unit 11 and the road surface may be estimated using a gyro sensor, an acceleration sensor, or the like.
- Next, the third estimation unit 17 estimates a rotation matrix R_p and a translation vector t_p for the coordinate transformation from the coordinate system X_w centered on the imaging unit 11 to the coordinate system X_r referenced to the road surface.
- In the coordinate transformation from the coordinate system X_w to the coordinate system X_r, the optical axis Z_w of the imaging unit 11 is converted to the Z_r axis, the direction perpendicular to the road surface to the Y_r axis, and the axis orthogonal to the Z_r axis and the Y_r axis to the X_r axis, and the Y_r coordinate becomes 0 on the road surface.
- That is, the third estimation unit 17 determines the coordinate transformation from the coordinate system X_w to the coordinate system X_r so that the direction of the optical axis Z_w of the imaging unit 11 projected onto the road surface does not change after the transformation, and so that the Y_r coordinate becomes 0 on the road surface.
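- The sketch below constructs R_p and t_p from the road surface normal n and the distance d as just described; the sign conventions for n and t_p are assumptions.

```python
# Build the rotation R_p and translation t_p from the camera frame X_w to the
# road-referenced frame X_r (Y_r perpendicular to the road, Z_r along the optical
# axis projected onto the road, Y_r = 0 on the road surface).
import numpy as np

def road_frame(n, d):
    y_r = n / np.linalg.norm(n)          # Y_r: road-surface normal direction
    z_w = np.array([0.0, 0.0, 1.0])      # optical axis Z_w of the imaging unit
    z_r = z_w - (z_w @ y_r) * y_r        # project Z_w onto the road plane
    z_r /= np.linalg.norm(z_r)           # Z_r keeps the optical-axis direction
    x_r = np.cross(y_r, z_r)             # X_r: orthogonal to Y_r and Z_r
    R_p = np.vstack([x_r, y_r, z_r])     # rows map X_w components to X_r components
    t_p = np.array([0.0, d, 0.0])        # shifts the road surface to Y_r = 0
    return R_p, t_p                      # X_r = R_p @ X_w + t_p
```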
- In step S109, the projection unit 19 uses the second positional relationship estimated by the third estimation unit 17 to project the imaging position of the estimation captured image onto the road surface to obtain a first projection position, projects the spatial position onto the road surface to obtain a second projection position, and projects the error position onto the road surface to obtain a third projection position.
- Then, on the straight line passing through the first projection position and the second projection position, the calculation unit 21 calculates the existence probability by updating it so that the existence probability of an object becomes higher between the second projection position and the third projection position than between the first projection position and the third projection position.
- In the first embodiment, the existence probability of an object is calculated using a grid image 500 (an example of an existence probability calculation image) that is associated in positional relationship with the captured image 103 and divided into small regions of N pixels each.
- In FIG. 10, the grid image 500 is superimposed on the captured image 103 to make the correspondence between the positions of the captured image 103 and the grid image 500 easy to understand, but it suffices that the positional relationship between the captured image 103 and the grid image 500 is associated.
- FIG. 11 is a flowchart illustrating an example of a procedure flow of the projection process performed by the projection unit 19 and the existence probability calculation process performed by the calculation unit 21 according to the first embodiment.
- First, the projection unit 19 calculates the position at which the imaging position (specifically, the optical center) of the captured image 103 is projected onto the road surface using Equations (8) and (9), converts the calculated position into the coordinate system of the grid image 500 by multiplying it by 1/N, and estimates the position 501 (step S301). In this way, the correspondence between the coordinate system of the captured image 103 and the coordinate system of the grid image 500 can be uniquely determined.
- Next, the projection unit 19 calculates the horizontal line on the captured image 103, converts the calculated horizontal line into the coordinate system of the grid image 500, and estimates the horizontal line 502 (step S302). Specifically, the projection unit 19 calculates the positions of two points on the captured image 103 using Equations (8) and (9) by setting +∞ and -∞ in Equation (10).
- The projection unit 19 obtains the horizontal line on the captured image 103 by deriving the equation of the straight line from the calculated positions of the two points, and obtains the horizontal line 502 by converting it into the coordinate system of the grid image 500.
- the projecting unit 19 checks whether or not there is an unprocessed attention point on the captured image 103 (step S303).
- the attention point is each point on the captured image 103 used for estimation of the three-dimensional position on the captured image 103 (for example, the point 300 in FIG. 6A).
- If there is an unprocessed attention point, the projection unit 19 sets one of the unprocessed attention points (the attention point 300 in FIG. 10) as the processing target, and the process proceeds to step S305. If there is no unprocessed attention point, the process ends.
- Next, the projection unit 19 calculates a point 503, which is the position on the road surface of the attention point 300 set as the processing target (step S305). Specifically, the projection unit 19 uses the rotation matrix R_p and the translation vector t_p estimated by the third estimation unit 17 to transform the three-dimensional position X_w_300 of the attention point 300 into the road-referenced coordinate system X_r, obtaining X_r_300.
- The projection unit 19 then sets the Y_r component of X_r_300 to 0, transforms it back into the coordinate system X_w centered on the imaging unit 11 to obtain the position X_w'_300 on the road, and calculates the point 503 by applying the point X_w'_300 to Equation (5).
- Next, the projection unit 19 calculates the assumed error of the attention point 300 set as the processing target, and calculates points 504 and 505, which are the error positions on the road surface of the attention point 300 (step S307). Specifically, the projection unit 19 converts the three-dimensional positions of the point 300 obtained when the corresponding point 303 estimated by the second estimation unit 15 is shifted by the error δ (for example, the corresponding points 304 and 305 in FIG. 7) according to the procedure described for step S305, and calculates the points 504 and 505. The projection unit 19 also calculates the magnitude of the error using Equation (11).
- Here, ed represents the magnitude of the error, and m_50* represents the two-dimensional coordinates of the point 50* on the grid image 500, where * is 3, 4, or 5.
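- The sketch below condenses steps S305 and S307: the attention point's three-dimensional position and its error positions are flattened onto the road (Y_r = 0) and scaled by 1/N onto the grid image 500. Mapping the road-plane coordinates directly to grid cells stands in for the projection through Equation (5), and error_magnitude is one plausible reading of Equation (11), not its confirmed form.

```python
# Condensed projection of a 3D position (and its error positions) onto grid image 500.
import numpy as np

def road_grid_point(X_w, R_p, t_p, N=10):
    X_r = R_p @ np.asarray(X_w, float) + t_p   # camera frame -> road-referenced frame
    X_r[1] = 0.0                               # the Y_r size becomes 0 on the road
    return np.array([X_r[0], X_r[2]]) / N      # road-plane coordinates -> grid cells

def error_magnitude(m504, m505):
    # ed from the grid coordinates of the error points 504 and 505.
    return float(np.linalg.norm(np.asarray(m504, float) - np.asarray(m505, float)))
```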
- Next, the calculation unit 21 updates the obstacle existence probability on the grid image 500 to calculate the obstacle existence probability (step S309), and the process returns to step S303. Specifically, the calculation unit 21 updates the obstacle existence probability on the grid image 500 using Equations (12) and (13).
- Here, P(E) represents the existence probability of an obstacle. P(E|O) represents the update equation of the obstacle existence probability for a grid in which the existence of an obstacle has been confirmed, and P(E|Ō) represents the update equation for a grid in which the existence of an obstacle has not been confirmed; both correspond to obtaining the obstacle existence probability P(E) from the conditional probability of whether or not the existence of an obstacle has been confirmed.
- P(E_P) represents the prior probability of an obstacle, and the existence probability of each grid before the update is used for it. However, in the first round of processing, an initial value (for example, 0.5, indicating an intermediate state between presence and absence of an obstacle) is used.
- P(O|E) represents the probability (detection rate) that an obstacle is observed when an obstacle exists, and the constant Po is used for it; the probability that an obstacle is observed when no obstacle exists is likewise represented by the constant Pe.
- Specifically, as shown in FIG. 10, the calculation unit 21 updates the existence probability for each of the grids located on the straight line 506 passing through the position 501 and the point 503.
- For the grids belonging to the section 507, on the near side of the error interval (from the position 501 to the point 504), the calculation unit 21 updates P(E) using Equation (13); for the grids belonging to the section 508, between the points 504 and 505, the calculation unit 21 updates P(E) using Equation (12).
- As a result, P(E) is updated so as to become smaller in the grids belonging to the section 507 and larger in the grids belonging to the section 508.
- The grids belonging to the section 509, which connects the point 505 of the straight line 506 and the horizontal line 502, correspond to a region that cannot be observed because the attention point 300 is observed in front of it; nevertheless, an obstacle may exist there. Therefore, the calculation unit 21 may update P(E) for the section 509 using Equation (12). In this case, since an obstacle is not directly observed there, the calculation unit 21 uses a value smaller than Po for P(O|E), thereby making the update width of P(E) smaller; alternatively, P(O|Ē) may be made smaller than Pe.
- Thereby, in a region where the error is small, the update width of the obstacle existence probability is large and the presence or absence of an obstacle appears clearly, whereas in a region where the error is large, the update width of the obstacle existence probability is small and the presence or absence of an obstacle remains undetermined, so the possibility of erroneous detection can be reduced.
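- A minimal sketch of this update is given below; the constants Po and Pe, the initial value 0.5, and the assignment of the sections 507 and 508 follow the text, while the exact forms of Equations (12) and (13) are assumed to be the standard binary-Bayes update.

```python
# Binary Bayes update of P(E) along the ray 506 (step S309).
import numpy as np

def bayes_update(prior, p_obs_e, p_obs_not_e):
    # P(E|O) = P(O|E) P(E_P) / (P(O|E) P(E_P) + P(O|notE) (1 - P(E_P)))
    num = p_obs_e * prior
    return num / (num + p_obs_not_e * (1.0 - prior))

def cells_between(a, b):
    # Integer grid cells sampled uniformly along the segment from a to b.
    a, b = np.asarray(a, float), np.asarray(b, float)
    n = int(np.ceil(np.abs(b - a).max())) + 1
    return {tuple(np.round(a + (b - a) * s).astype(int)) for s in np.linspace(0.0, 1.0, n)}

def update_ray(grid, m501, m504, m505, Po=0.7, Pe=0.3):
    # grid: 2D array of P(E) initialized to the prior 0.5, indexed by grid coordinates.
    for c in cells_between(m501, m504):                 # section 507: observed as free
        grid[c] = bayes_update(grid[c], Pe, 1.0 - Pe)   # P(E) decreases (Equation (13))
    for c in cells_between(m504, m505):                 # section 508: obstacle observed
        grid[c] = bayes_update(grid[c], Po, 1.0 - Po)   # P(E) increases (Equation (12))
    # Section 509 (from m505 toward the horizontal line 502) could be updated the same
    # way with a value smaller than Po, shrinking the update width as described above.
```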
- In step S111, the detection unit 23 detects the boundary between the road surface and the object using the existence probability of the object calculated by the calculation unit 21. Specifically, the detection unit 23 detects the boundary position between the obstacle and the road surface using the obstacle existence probability calculated by the calculation unit 21.
- FIG. 12 is a diagram showing an example of the obstacle existence probability of the first embodiment on the grid image 500.
- the grid of the grid image 500 is not shown.
- In FIG. 12, the region represented by the lattice pattern has a P(E) value close to 1.0, and an obstacle very likely exists there; in the other regions, the value of P(E) is close to 0.0, meaning an obstacle very likely does not exist, or close to 0.5, meaning it is undetermined whether an obstacle exists.
- Since the existence probability changes where an obstacle meets the road, the detection unit 23 detects the position where the obstacle touches the road surface (the boundary position between the obstacle and the road surface) by searching for grid positions where the existence probability changes between adjacent regions.
- Specifically, the detection unit 23 searches for the grid position that changes from a grid in which no obstacle exists to a grid in which an obstacle exists, and detects it as the position where the obstacle touches the road surface (the boundary position between the obstacle and the road surface).
- More specifically, the detection unit 23 considers the direction from the end 600 of the grid image 500 on the position 501 side toward the horizontal line 502, that is, the direction 601 orthogonal to the end 600, and sets a grid 602 and a grid 603 along the direction 601, with the grid 602 on the side closer to the end 600.
- The grid 602 and the grid 603 may be adjacent to each other, or may be set apart within a predetermined range of N grids.
- the detection unit 23 detects the boundary position between the obstacle and the road surface when the obstacle existence probability G (X) in the grid X satisfies at least one of the equations (14) to (16).
- Here, TH1 (an example of the first threshold value) is a positive value, TH2 (an example of the second threshold value) is 0.5 or less, and TH3 (an example of the third threshold value) is 0.5 or more; G(602) is an example of the first existence probability, and G(603) is an example of the second existence probability.
- Equation (14) represents the condition that the likelihood of being a boundary increases as the difference between the grid 602 and the grid 603 increases, Equation (15) represents the condition that the grid 602 is a region in which no obstacle exists, and Equation (16) represents the condition that the grid 603 is a region in which an obstacle exists.
- the detection unit 23 performs the same processing while changing the position along the edge 600, and detects the boundary at each position.
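- The boundary test of Equations (14) to (16) can be sketched as follows; the threshold values are illustrative and only respect the stated constraints (TH1 positive, TH2 at most 0.5, TH3 at least 0.5).

```python
# Boundary test applied to the existence probabilities G1 = G(602) and G2 = G(603)
# taken along direction 601; per the text, satisfying at least one condition suffices.
def is_boundary(G1, G2, TH1=0.3, TH2=0.4, TH3=0.6):
    cond14 = (G2 - G1) >= TH1   # Equation (14): large change between the two grids
    cond15 = G1 <= TH2          # Equation (15): grid 602 is a region with no obstacle
    cond16 = G2 >= TH3          # Equation (16): grid 603 is a region with an obstacle
    return cond14 or cond15 or cond16
```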
- In step S113, the output control unit 25 causes the output unit 27 to output at least one of the existence probability of the object calculated by the calculation unit 21 and the boundary between the road surface and the object detected by the detection unit 23.
- For example, the output control unit 25 may cause the output unit 27 to display an image in which a boundary line 611 indicating the boundary between the road surface and the object detected by the detection unit 23 is superimposed on the captured image 103.
- In addition, the output control unit 25 may cause the output unit 27 to display a warning display 610, or to output a warning by sound such as a warning tone.
- the output control unit 25 may change the display mode such as the color, thickness, transparency, and blinking interval of the boundary line 611 according to the distance between the vehicle 104 and the obstacle.
- For example, the output control unit 25 may display the boundary line 611 thicker for an obstacle closer to the vehicle 104, or with higher transparency for an obstacle farther away, so that obstacles closer to the vehicle 104 alert the driver more strongly.
- The output control unit 25 may also change at least one of the color, thickness, transparency, and blinking interval of the boundary line 611 so that the larger the difference of probability values or the probability value itself according to Equations (14) to (16), the more easily the boundary line attracts attention.
- the output control unit 25 may cause the output unit 27 to display and output an image in which the existence probability of the obstacle is identifiable on the captured image 103.
- The identifiable display may be visualization by color; for example, the brightness may be increased as the existence probability becomes higher, or blinking may be used.
- As described above, according to the first embodiment, the boundary between an object and the road surface can be detected with simple advance preparation, such as arranging a single imaging unit on a moving body.
- In the first embodiment, an example in which the detection result of the boundary is notified to the operator (driver) of the moving body has been described. However, the detection result of the boundary can also be used for control of the moving body, which likewise serves safe driving support. In this case, the output of the existence probability and the boundary by the output unit 27 may be omitted.
- FIG. 14 is a configuration diagram illustrating an example of the detection apparatus 1010 according to the second embodiment. As shown in FIG. 14, the detection apparatus 1010 of the second embodiment is different from the first embodiment in that a conversion unit 18 is provided.
- FIG. 15 is a flowchart illustrating an example of a flow of a detection process performed by the detection apparatus 1010 according to the second embodiment.
- The detection process shown in FIG. 15 is first performed on the captured image at time t1 and the captured image at time t2, subsequently on the captured image at time t2 and the captured image at time t3, and in general on the captured image at time tn (n is a natural number) and the captured image at time tn+1. The estimation captured image is assumed to be the captured image at the latest time.
- That is, when the detection process is performed on the captured image at time t1 and the captured image at time t2, the captured image at time t2 is the estimation captured image, and when it is performed on the captured image at time t2 and the captured image at time t3, the captured image at time t3 is the estimation captured image.
- The processing from step S401 to step S407 is the same as the processing from step S101 to step S107 in FIG. 2.
- In step S409, the conversion unit 18 converts the existence probability on the grid image 500 using a projective transformation matrix, with respect to the road surface, between the estimation captured image (for example, the captured image at time t2) and a new estimation captured image that is later in the time series than that estimation captured image (for example, the captured image at time t3).
- When the existence probability has not been calculated at the previous time, the conversion unit 18 does not convert the existence probability and instead sets the initial value of the obstacle prior probability P(E_P) on the grid image 500.
- The setting of the prior probability of the obstacle is the same as that performed by the calculation unit 21 in the first embodiment.
- For example, when the detection process shown in FIG. 15 is performed on the captured image at time t1 and the captured image at time t2, the existence probability has not been calculated on the grid image 500 at time t1, which is the previous time, so the conversion unit 18 does not convert the existence probability.
- On the other hand, when the detection process shown in FIG. 15 is performed on the captured image at time t2 and the captured image at time t3, the existence probability has been calculated on the grid image 500 at time t2, which is the previous time, so the conversion unit 18 converts the existence probability.
- Specifically, the conversion unit 18 calculates the projective transformation matrix H shown in Equation (17) using the rotation matrix R and the translation vector t estimated by the first estimation unit 13, the road surface normal vector n and the distance d between the imaging unit 11 and the road surface estimated by the third estimation unit 17, and the internal parameters K.
- The projective transformation matrix H uniquely determines the correspondence of the road surface between the images. The conversion unit 18 then uses the projective transformation matrix H to convert the existence probability calculated on the grid image 500 at the previous time (for example, time t2) into the existence probability (specifically, the obstacle prior probability P(E_P)) on the grid image 500 at the current time (for example, time t3).
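- A sketch of Equation (17) and the accompanying warp is given below; H = K(R - t n^T / d) K^{-1} is the standard road-plane homography assumed here, and for the grid image 500 the matrix is additionally conjugated by the 1/N scaling.

```python
# Road-surface homography between the previous and current frames, and the warp of
# the prior probabilities P(E_P) on grid image 500.
import cv2
import numpy as np

def road_homography(K, R, t, n, d):
    # H = K (R - t n^T / d) K^{-1}; n: road normal, d: camera-to-road distance.
    return K @ (R - np.outer(np.ravel(t), np.ravel(n)) / d) @ np.linalg.inv(K)

def warp_prior(prob_grid, H, N=10):
    # Conjugate H by the 1/N grid scaling, then carry the previous probabilities over;
    # cells with no previous observation fall back to the neutral prior 0.5.
    S = np.diag([1.0 / N, 1.0 / N, 1.0])
    Hg = S @ H @ np.linalg.inv(S)
    h, w = prob_grid.shape
    return cv2.warpPerspective(prob_grid.astype(np.float32), Hg, (w, h), borderValue=0.5)
```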
- In step S411, the projection unit 19 performs the same processing as in step S109 of FIG. 2. The calculation unit 21 also performs the same processing as in step S109 of FIG. 2, except that it does not set the prior probability of the obstacle and retains the existence probability calculated on the grid image 500 for the conversion.
- The processing from step S413 to step S415 is the same as the processing from step S111 to step S113 in FIG. 2.
- FIG. 16 is a block diagram illustrating an example of the hardware configuration of the detection devices 10 and 1010 of the above embodiments.
- As shown in FIG. 16, the detection devices 10 and 1010 of the above embodiments include a control device 91 such as a CPU, a storage device 92 such as a ROM (Read Only Memory) and a RAM (Random Access Memory), an external storage device 93 such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive), a display device 94 such as a display, an input device 95 such as key switches, a communication I/F 96, and an imaging device 97 such as an in-vehicle camera, and can be realized with a hardware configuration using an ordinary computer.
- the programs executed by the detection devices 10 and 1010 of the above embodiments are provided by being incorporated in advance in a ROM or the like.
- The programs executed by the detection devices 10 and 1010 of the above embodiments may instead be provided as files in an installable or executable format stored on a computer-readable storage medium such as a CD-ROM, CD-R, memory card, DVD, or flexible disk (FD).
- the program executed by the detection devices 10 and 1010 of each of the above embodiments may be provided by being stored on a computer connected to a network such as the Internet and downloaded via the network.
- the program executed by the detection devices 10 and 1010 of each of the above embodiments may be provided or distributed via a network such as the Internet.
- the program executed by the detection devices 10 and 1010 of each of the above embodiments has a module configuration for realizing the above-described units on a computer.
- When the control device 91 reads the program from the external storage device 93 onto the storage device 92 and executes it, the above-described units are realized on the computer.
- the boundary between the object and the road surface can be detected even if the parallax and the depth between the plurality of images are not in a fixed relationship.
- the present invention is not limited to the above-described embodiments as they are, and can be embodied by modifying the constituent elements without departing from the scope of the invention in the implementation stage.
- Various inventions can be formed by appropriately combining a plurality of constituent elements disclosed in the above embodiments. For example, some components may be deleted from all the components shown in the embodiment. Furthermore, the constituent elements over different embodiments may be appropriately combined.
Description
In the second embodiment, an example is described in which captured images captured in time series are sequentially input and the processing described in the first embodiment is applied to them continuously in time. In the second embodiment, for example, when the captured image at time t1, the captured image at time t2, and the captured image at time t3 are sequentially input, the processing described in the first embodiment is first applied to the captured image at time t1 and the captured image at time t2, and then to the captured image at time t2 and the captured image at time t3. The following description mainly covers the differences from the first embodiment; components having the same functions as in the first embodiment are given the same names and reference numerals, and their description is omitted.
11 imaging unit
13 first estimation unit
15 second estimation unit
17 third estimation unit
18 conversion unit
19 projection unit
21 calculation unit
23 detection unit
25 output control unit
27 output unit
Claims (10)
- A detection device comprising: a first estimation unit configured to estimate a spatial first positional relationship between different imaging positions; a second estimation unit configured to estimate, using a plurality of captured images captured at the different imaging positions and the first positional relationship, a spatial position in an estimation captured image, which is one of the plurality of captured images, and an error position of the spatial position; a third estimation unit configured to estimate a second positional relationship between the imaging position of the estimation captured image and a road surface; a projection unit configured to, using the second positional relationship, project the imaging position of the estimation captured image onto the road surface to obtain a first projection position, project the spatial position onto the road surface to obtain a second projection position, and project the error position onto the road surface to obtain a third projection position; a calculation unit configured to calculate an existence probability of an object by updating the existence probability so that, on a straight line passing through the first projection position and the second projection position, the existence probability becomes higher between the second projection position and the third projection position than between the first projection position and the third projection position; and a detection unit configured to detect a boundary between the road surface and the object using the existence probability.
- The detection device according to claim 1, wherein the calculation unit updates the existence probability with a smaller update width as the interval between the second projection position and the third projection position becomes larger.
- The detection device according to claim 1 or 2, wherein the calculation unit updates the existence probability on an existence probability calculation image whose positional relationship is associated with the estimation captured image.
- The detection device according to claim 3, wherein the detection unit detects the boundary when, for a first existence probability on the end side and a second existence probability on the horizontal line side, taken from the end of the existence probability calculation image on the first projection position side toward the horizontal line of the existence probability calculation image, at least one of the following is satisfied: the difference between the second existence probability and the first existence probability is equal to or larger than a first threshold value; the first existence probability is equal to or larger than a second threshold value; and the second existence probability is equal to or larger than a third threshold value.
- The detection device according to claim 3 or 4, further comprising a conversion unit configured to convert the existence probability on the existence probability calculation image using a projective transformation matrix, with respect to the road surface, between the estimation captured image and a new estimation captured image that is later in time series than the estimation captured image, wherein the calculation unit updates the existence probability on the converted existence probability calculation image whose positional relationship is associated with the new estimation captured image.
- The detection device according to any one of claims 3 to 5, wherein the existence probability calculation image is configured in units of a grid divided into small regions.
- The detection device according to any one of claims 1 to 6, wherein, on the straight line passing through the first projection position and the second projection position, the calculation unit updates the existence probability with a smaller update width between the second projection position or the third projection position and a horizontal line position than between the first projection position and the second projection position or the third projection position.
- The detection device according to any one of claims 1 to 7, further comprising an output unit configured to output at least one of the existence probability and the boundary.
- A detection method comprising: a first estimation step in which a first estimation unit estimates a spatial first positional relationship between different imaging positions; a second estimation step in which a second estimation unit estimates, using a plurality of captured images captured at the different imaging positions and the first positional relationship, a spatial position in an estimation captured image, which is one of the plurality of captured images, and an error position of the spatial position; a third estimation step in which a third estimation unit estimates a second positional relationship between the imaging position of the estimation captured image and a road surface; a projection step in which a projection unit, using the second positional relationship, projects the imaging position of the estimation captured image onto the road surface to obtain a first projection position, projects the spatial position onto the road surface to obtain a second projection position, and projects the error position onto the road surface to obtain a third projection position; a calculation step in which a calculation unit calculates an existence probability of an object by updating the existence probability so that, on a straight line passing through the first projection position and the second projection position, the existence probability becomes higher between the second projection position and the third projection position than between the first projection position and the third projection position; and a detection step in which a detection unit detects a boundary between the road surface and the object using the existence probability.
- A program for causing a computer to execute: a first estimation step of estimating a spatial first positional relationship between different imaging positions; a second estimation step of estimating, using a plurality of captured images captured at the different imaging positions and the first positional relationship, a spatial position in an estimation captured image, which is one of the plurality of captured images, and an error position of the spatial position; a third estimation step of estimating a second positional relationship between the imaging position of the estimation captured image and a road surface; a projection step of, using the second positional relationship, projecting the imaging position of the estimation captured image onto the road surface to obtain a first projection position, projecting the spatial position onto the road surface to obtain a second projection position, and projecting the error position onto the road surface to obtain a third projection position; a calculation step of calculating an existence probability of an object by updating the existence probability so that, on a straight line passing through the first projection position and the second projection position, the existence probability becomes higher between the second projection position and the third projection position than between the first projection position and the third projection position; and a detection step of detecting a boundary between the road surface and the object using the existence probability.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12871600.8A EP2922045A1 (en) | 2012-11-13 | 2012-11-13 | Detection device, method, and program |
PCT/JP2012/079411 WO2014076769A1 (ja) | 2012-11-13 | 2012-11-13 | Detection device, method, and program |
JP2013514456A JP5710752B2 (ja) | 2012-11-13 | 2012-11-13 | Detection device, method, and program |
CN201280013215.6A CN103988241A (zh) | 2012-11-13 | 2012-11-13 | Detection device, method, and program |
US14/064,799 US9122936B2 (en) | 2012-11-13 | 2013-10-28 | Detecting device, detection method, and computer program product |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2012/079411 WO2014076769A1 (ja) | 2012-11-13 | 2012-11-13 | Detection device, method, and program |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/064,799 Continuation US9122936B2 (en) | 2012-11-13 | 2013-10-28 | Detecting device, detection method, and computer program product |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014076769A1 true WO2014076769A1 (ja) | 2014-05-22 |
Family
ID=50681721
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/079411 WO2014076769A1 (ja) | 2012-11-13 | 2012-11-13 | Detection device, method, and program |
Country Status (5)
Country | Link |
---|---|
US (1) | US9122936B2 (ja) |
EP (1) | EP2922045A1 (ja) |
JP (1) | JP5710752B2 (ja) |
CN (1) | CN103988241A (ja) |
WO (1) | WO2014076769A1 (ja) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5706874B2 * | 2010-03-03 | 2015-04-22 | Honda Motor Co., Ltd. | Vehicle periphery monitoring device |
EP2669845A3 (en) * | 2012-06-01 | 2014-11-19 | Ricoh Company, Ltd. | Target recognition system, target recognition method executed by the target recognition system, target recognition program executed on the target recognition system, and recording medium storing the target recognition program |
JP6075331B2 * | 2014-06-13 | 2017-02-08 | Toyota Motor Corporation | Vehicle lighting device |
US9849784B1 (en) * | 2015-09-30 | 2017-12-26 | Waymo Llc | Occupant facing vehicle display |
US10552966B2 (en) * | 2016-03-07 | 2020-02-04 | Intel Corporation | Quantification of parallax motion |
JP6595401B2 * | 2016-04-26 | 2019-10-23 | Soken, Inc. | Display control device |
US10160448B2 (en) * | 2016-11-08 | 2018-12-25 | Ford Global Technologies, Llc | Object tracking using sensor fusion within a probabilistic framework |
JP6794243B2 * | 2016-12-19 | 2020-12-02 | Hitachi Automotive Systems, Ltd. | Object detection device |
JP7528009B2 * | 2021-03-17 | 2024-08-05 | Toshiba Corporation | Image processing device and image processing method |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001331787A * | 2000-05-19 | 2001-11-30 | Toyota Central Res & Dev Lab Inc | Road shape estimation device |
JP2007241326A * | 2003-10-17 | 2007-09-20 | Matsushita Electric Ind Co Ltd | Moving body motion calculation device |
JP4406381B2 | 2004-07-13 | 2010-01-27 | Toshiba Corporation | Obstacle detection device and method |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1580687A4 (en) | 2003-10-17 | 2006-01-11 | Matsushita Electric Ind Co Ltd | METHOD FOR DETERMINING MOBILE UNIT MOTION, DEVICE AND NAVIGATION SYSTEM |
WO2008065717A1 (fr) * | 2006-11-29 | 2008-06-05 | Fujitsu Limited | Système et procédé de détection de piéton |
US20110187844A1 (en) * | 2008-09-12 | 2011-08-04 | Kabushiki Kaisha Toshiba | Image irradiation system and image irradiation method |
JP4631096B2 * | 2008-10-20 | 2011-02-16 | Honda Motor Co., Ltd. | Vehicle periphery monitoring device |
JP5440461B2 * | 2010-09-13 | 2014-03-12 | Ricoh Co., Ltd. | Calibration device, distance measurement system, calibration method, and calibration program |
- 2012-11-13 WO PCT/JP2012/079411 patent/WO2014076769A1/ja active Application Filing
- 2012-11-13 EP EP12871600.8A patent/EP2922045A1 not_active Withdrawn
- 2012-11-13 CN CN201280013215.6A patent/CN103988241A/zh active Pending
- 2012-11-13 JP JP2013514456A patent/JP5710752B2/ja active Active
- 2013-10-28 US US14/064,799 patent/US9122936B2/en active Active
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022147040A (ja) * | 2021-03-23 | 2022-10-06 | Honda Motor Co., Ltd. | Display device |
JP7264930B2 (ja) | 2021-03-23 | 2023-04-25 | Honda Motor Co., Ltd. | Display device |
Also Published As
Publication number | Publication date |
---|---|
US20140133700A1 (en) | 2014-05-15 |
JPWO2014076769A1 (ja) | 2016-09-08 |
CN103988241A (zh) | 2014-08-13 |
JP5710752B2 (ja) | 2015-04-30 |
EP2922045A1 (en) | 2015-09-23 |
US9122936B2 (en) | 2015-09-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5710752B2 (ja) | Detection device, method, and program | |
CN110582798B (zh) | 用于虚拟增强视觉同时定位和地图构建的系统和方法 | |
US10097812B2 (en) | Stereo auto-calibration from structure-from-motion | |
CN112652016B (zh) | 点云预测模型的生成方法、位姿估计方法及其装置 | |
WO2015037178A1 (ja) | Posture estimation method and robot | |
EP3968266B1 (en) | Obstacle three-dimensional position acquisition method and apparatus for roadside computing device | |
JP5754470B2 (ja) | Road surface shape estimation device | |
US10438412B2 (en) | Techniques to facilitate accurate real and virtual object positioning in displayed scenes | |
JP2007263669A (ja) | Three-dimensional coordinate acquisition device | |
JP5756709B2 (ja) | Height estimation device, height estimation method, and height estimation program | |
JP2008113296A (ja) | Vehicle periphery monitoring device | |
JP2014101075A (ja) | Image processing device, image processing method, and program | |
JP2006252473A (ja) | Obstacle detection device, calibration device, calibration method, and calibration program | |
JP4941565B2 (ja) | Corresponding point search device and corresponding point search method | |
WO2020195875A1 (ja) | Information processing device, information processing method, and program | |
CN109447901B (zh) | Panoramic imaging method and device | |
KR20150120866A (ko) | Device and method for correcting display errors in an IID | |
JP6188860B1 (ja) | Object detection device | |
JP6610994B2 (ja) | Obstacle detection device and obstacle detection method | |
JP2011209070A (ja) | Image processing device | |
JP5959682B2 (ja) | System and method for calculating the distance between an object and a vehicle | |
JP2017117038A (ja) | Road surface estimation device | |
JP2021043486A (ja) | Position estimation device | |
JP6027705B2 (ja) | Image processing device, method, and program therefor | |
JP2015215646A (ja) | Analysis device, analysis method, and program | |
Legal Events

Date | Code | Title | Description
---|---|---|---
| ENP | Entry into the national phase | Ref document number: 2013514456; Country of ref document: JP; Kind code of ref document: A
| REEP | Request for entry into the european phase | Ref document number: 2012871600; Country of ref document: EP
| WWE | Wipo information: entry into national phase | Ref document number: 2012871600; Country of ref document: EP
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12871600; Country of ref document: EP; Kind code of ref document: A1
| NENP | Non-entry into the national phase | Ref country code: DE