WO2015125298A1 - Self-position calculation device and self-position calculation method - Google Patents
- Publication number
- WO2015125298A1 (application PCT/JP2014/054313)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- road surface
- vehicle
- self
- posture
- surface state
- Prior art date: 2014-02-24
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
Definitions
- The present invention relates to a self-position calculation device and a self-position calculation method.
- A technique is known in which an image of the vicinity of a vehicle is captured by a camera mounted on the vehicle and the movement amount of the vehicle is obtained based on changes in the image (see Patent Document 1).
- In Patent Document 1, so that the movement amount can be obtained accurately even when the vehicle moves at low speed and by small amounts, a feature point is detected from the image, the position of the feature point is obtained, and the movement amount of the vehicle is obtained from the moving direction and moving distance (movement amount) of the feature point.
- However, the conventional technology described above has a problem in that the position of the vehicle cannot be calculated accurately if there are irregularities or steps on the road surface around the vehicle.
- The present invention has been proposed in view of the above circumstances, and an object thereof is to provide a self-position calculation device and a self-position calculation method capable of accurately calculating the self-position of the vehicle even when there are irregularities or steps on the road surface around the vehicle.
- To solve the above problem, the self-position calculation device captures an image of the road surface around the vehicle onto which pattern light is projected, and calculates the posture angle of the vehicle with respect to the road surface from the position of the pattern light in the acquired image.
- The self-position calculation device calculates the posture change amount of the vehicle based on temporal changes of a plurality of feature points on the road surface in the acquired images, and calculates the current position and posture angle of the vehicle by successively adding the posture change amount to the initial position and posture angle of the vehicle.
- When the self-position calculation device determines that the road surface state around the vehicle has changed by a threshold value or more, it calculates the current position and posture angle of the vehicle by adding the posture change amount to the current position and posture angle of the vehicle calculated in the previous information processing cycle.
- FIG. 1 is a block diagram showing the overall configuration of the self-position calculation apparatus according to the first embodiment.
- FIG. 2 is an external view showing an example of a method for mounting a projector and a camera on a vehicle.
- FIG. 3A is a diagram illustrating how the position on the road surface irradiated with each spot light is calculated using the projector and the camera, and FIG. 3B is a diagram illustrating how the moving direction of the camera is obtained from temporal changes of feature points detected in a region different from the region irradiated with the pattern light.
- FIG. 4 shows images of the pattern light obtained by binarizing an image acquired by the camera: FIG. 4A shows the entire pattern light, FIG. 4B is an enlarged view of one spot light, and FIG. 4C shows the position of the center of gravity of the spot light.
- FIG. 5 is a schematic diagram for explaining a method of calculating the change amount of the distance and the posture angle.
- FIG. 6 shows feature points detected on the image: FIG. 6A shows the first frame (image) acquired at time t, and FIG. 6B shows the second frame acquired at time t + Δt.
- FIG. 7 is a diagram for explaining a method of estimating the amount of change in the road surface height from the position of the pattern light.
- FIG. 8 is a time chart showing processing for determining a change in road surface state by the self-position calculation apparatus according to the first embodiment.
- FIG. 9 is a flowchart illustrating a processing procedure of self-position calculation processing by the self-position calculation apparatus according to the first embodiment.
- FIG. 10 is a flowchart showing a detailed processing procedure of step S09 of FIG. 9 by the self-position calculating apparatus according to the first embodiment.
- FIG. 11 is a diagram illustrating an example in which estimation errors occur in the roll angle and the movement amount of the vehicle.
- FIG. 12 is a time chart illustrating a process of determining a change in road surface state by the self-position calculating apparatus according to the second embodiment.
- FIG. 13 is a flowchart showing a detailed processing procedure of step S09 of FIG. 9 by the self-position calculation apparatus according to the second embodiment.
- The self-position calculation device includes a projector 11, a camera 12, and an engine control unit (ECU) 13.
- The projector 11 is mounted on the vehicle and projects pattern light onto the road surface around the vehicle.
- The camera 12 is an example of an imaging unit that is mounted on the vehicle and captures images of the road surface around the vehicle, including the area onto which the pattern light is projected.
- The ECU 13 is an example of a control unit that controls the projector 11 and executes a series of information processing cycles for calculating the self-position of the vehicle from the images acquired by the camera 12.
- The camera 12 is a digital camera using a solid-state imaging element such as a CCD or CMOS sensor, and acquires digital images that can be processed.
- The imaging target of the camera 12 is the road surface around the vehicle, which includes the road surface at the front, rear, sides, and underneath the vehicle.
- For example, the camera 12 can be mounted on the front portion of the vehicle 10, specifically on the front bumper.
- The height and direction in which the camera 12 is installed are adjusted so that the feature points (texture) on the road surface 31 in front of the vehicle 10 and the pattern light 32b projected from the projector 11 can be imaged; the focus and aperture of the camera lens are also adjusted automatically.
- The camera 12 repeatedly captures images at a predetermined time interval, acquiring a series of images (frames). The image data acquired by the camera 12 is transferred to the ECU 13 each time an image is captured and stored in a memory provided in the ECU 13.
- The projector 11 projects pattern light 32b having a predetermined shape, including a square or rectangular lattice pattern, toward the road surface 31 within the imaging range of the camera 12.
- The camera 12 images the pattern light irradiated onto the road surface 31.
- The projector 11 includes, for example, a laser pointer and a diffraction grating.
- The projector 11 diffracts the laser beam emitted from the laser pointer with the diffraction grating to generate, as shown in FIGS. 2 to 4, pattern light (32b, 32a) consisting of a plurality of spot lights arranged in a matrix.
- In the illustrated example, pattern light 32a consisting of 5 × 7 spot lights is generated.
- The ECU 13 includes a microcontroller having a CPU, a memory, and an input/output unit, and constitutes a plurality of information processing units that function as the self-position calculation device by executing a computer program installed in advance.
- The ECU 13 repeatedly executes, for each image (frame), a series of information processing cycles for calculating the self-position of the vehicle from the images acquired by the camera 12.
- The ECU 13 may also serve as an ECU used for other controls of the vehicle 10.
- The plurality of information processing units includes a pattern light extraction unit 21, a posture angle calculation unit 22, a feature point detection unit 23, a posture change amount calculation unit 24, a self-position calculation unit 26, a pattern light control unit 27, and a road surface state determination unit 30.
- The posture change amount calculation unit 24 includes the feature point detection unit 23.
- The pattern light extraction unit 21 reads an image acquired by the camera 12 from the memory and extracts the position of the pattern light from this image. As shown in FIG. 3A, for example, pattern light 32a consisting of a plurality of spot lights arranged in a matrix is projected from the projector 11 toward the road surface 31, and the pattern light 32a reflected by the road surface 31 is detected by the camera 12. The pattern light extraction unit 21 extracts only the images of the spot lights Sp, as shown in FIGS. 4A and 4B, by binarizing the image acquired by the camera 12.
- Then, as shown in FIG. 4C, the pattern light extraction unit 21 extracts the position of the pattern light 32a by calculating the position He of the center of gravity of each spot light Sp, that is, the coordinates (Uj, Vj) of each spot light Sp on the image.
- The coordinates are expressed in units of pixels of the image sensor of the camera 12; in the case of the 5 × 7 spot lights Sp, "j" is a natural number from 1 to 35.
- The coordinates (Uj, Vj) of the spot lights Sp on the image are stored in the memory as data indicating the position of the pattern light 32a.
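- As an illustrative sketch, the binarization and center-of-gravity extraction described above could be implemented as follows (Python with OpenCV; the function name `extract_spot_centroids`, the threshold of 200, and the minimum blob area are assumptions, not values from the patent):

```python
# Hedged sketch of the spot-light extraction by the pattern light extraction
# unit 21: binarize the frame, then take the centroid (Uj, Vj) of each blob.
# The threshold and minimum blob area below are illustrative assumptions.
import cv2

def extract_spot_centroids(gray, thresh=200, min_area=3):
    """Return the centroid (Uj, Vj) of each detected spot light Sp."""
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    n, _, stats, centroids = cv2.connectedComponentsWithStats(binary)
    # Label 0 is the background; keep only blobs large enough to be spots.
    return [tuple(centroids[i]) for i in range(1, n)
            if stats[i, cv2.CC_STAT_AREA] >= min_area]
```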
- The posture angle calculation unit 22 reads the data indicating the position of the pattern light 32a from the memory and calculates the distance and posture angle of the vehicle 10 with respect to the road surface 31 from the position of the pattern light 32a in the image acquired by the camera 12. For example, as shown in FIG. 3A, the position on the road surface 31 irradiated with each spot light is calculated as a relative position with respect to the camera 12 from the baseline length Lb between the projector 11 and the camera 12 and the coordinates (Uj, Vj) of each spot light on the image.
- Then, the posture angle calculation unit 22 calculates, from the relative position of each spot light with respect to the camera 12, the plane equation of the road surface 31 onto which the pattern light 32a is projected, that is, the distance and posture angle (normal vector) of the camera 12 with respect to the road surface 31.
- The distance and posture angle of the camera 12 with respect to the road surface 31 are calculated as an example of the distance and posture angle of the vehicle 10 with respect to the road surface 31.
- In other words, the distance between the road surface 31 and the vehicle 10, and the posture angle of the vehicle 10 with respect to the road surface 31, can be obtained.
- Specifically, the posture angle calculation unit 22 uses the principle of triangulation to calculate the position on the road surface 31 irradiated with each spot light as a relative position (Xj, Yj, Zj) with respect to the camera 12 from the coordinates (Uj, Vj) of each spot light on the image.
- Hereinafter, the distance and posture angle of the camera 12 with respect to the road surface 31 are abbreviated as "distance and posture angle".
- The distance and posture angle calculated by the posture angle calculation unit 22 are stored in the memory.
- When the road surface state determination unit 30 determines that the road surface state has changed by a threshold value or more, the posture angle calculation unit 22 stops calculating the distance and posture angle of the vehicle.
- Note that the relative positions (Xj, Yj, Zj) of the spot lights with respect to the camera 12 often do not lie on the same plane, because the relative position of each spot light changes according to the unevenness of the asphalt exposed on the road surface 31. Therefore, a plane equation that minimizes the sum of squares of the distance errors to the spot lights may be obtained by the least squares method.
- The distance and posture angle data calculated in this way are used by the self-position calculation unit 26 shown in FIG. 1.
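- A minimal sketch of this least-squares plane fit, assuming the relative spot positions (Xj, Yj, Zj) have already been triangulated into an N × 3 array, might look as follows; parameterizing the plane as Z = aX + bY + c is one common choice, and the patent does not prescribe a particular formulation:

```python
# Least-squares fit of the road plane to the triangulated spot positions,
# yielding the normal vector (posture angle) and camera-to-road distance.
import numpy as np

def fit_road_plane(points):
    """points: (N, 3) array of (Xj, Yj, Zj); returns (unit normal, distance)."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    (a, b, c), *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    normal = np.array([a, b, -1.0])
    normal /= np.linalg.norm(normal)
    # Distance from the camera origin to the plane a*X + b*Y - Z + c = 0.
    distance = abs(c) / np.hypot(np.hypot(a, b), 1.0)
    return normal, distance
```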
- The feature point detection unit 23 reads an image acquired by the camera 12 from the memory and detects feature points on the road surface 31 from the image read from the memory.
- To detect feature points on the road surface 31, the feature point detection unit 23 can use, for example, the method described in "D. G. Lowe, 'Distinctive Image Features from Scale-Invariant Keypoints,' Int. J. Comput. Vis., vol. 60, no. 2, pp. 91-110, Nov. 2004".
- Alternatively, the feature point detection unit 23 can use the method described in "Yasushi Kanazawa, Kenichi Kanatani, 'Detection of Feature Points for Computer Vision,' IEICE Journal, vol. 87, no. 12, pp. 1043-1048, Dec. 2004".
- Specifically, the feature point detection unit 23 detects, as feature points, points whose luminance values change greatly compared with their surroundings, such as the vertices of objects, using, for example, the Harris operator or the SUSAN operator.
- Alternatively, the feature point detection unit 23 may detect, as feature points, points around which the luminance values change with a certain regularity, using SIFT (Scale-Invariant Feature Transform) feature amounts.
- The feature point detection unit 23 counts the total number N of feature points detected from one image and assigns an identification number i (1 ≤ i ≤ N) to each feature point.
- The position (Ui, Vi) of each feature point on the image is stored in the memory in the ECU 13.
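- As one concrete realization of the Harris-based detection mentioned above, OpenCV's corner detector could be used as in the sketch below; the parameter values and the dictionary return format are illustrative assumptions:

```python
# Sketch of the feature point detection unit 23 using a Harris-based corner
# detector; returns identification number i -> position (Ui, Vi).
import cv2

def detect_road_features(gray, max_points=100):
    corners = cv2.goodFeaturesToTrack(
        gray, maxCorners=max_points, qualityLevel=0.05,
        minDistance=10, useHarrisDetector=True, k=0.04)
    if corners is None:
        return {}
    return {i + 1: tuple(pt.ravel()) for i, pt in enumerate(corners)}
```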
- FIGS. 6A and 6B show examples of feature points Te detected from images acquired by the camera 12, with the change direction and change amount of each feature point Te shown as a vector Dte.
- In the present embodiment, the feature points on the road surface 31 are mainly assumed to be grains of asphalt mixture with a size of 1 cm to 2 cm.
- To detect these feature points, the resolution of the camera 12 is VGA (approximately 300,000 pixels).
- The distance of the camera 12 to the road surface 31 is approximately 70 cm.
- The imaging direction of the camera 12 is inclined about 45 degrees from the horizontal plane toward the road surface 31.
- The luminance values of the images acquired by the camera 12 and transferred to the ECU 13 are in the range of 0 to 255 (0: darkest, 255: brightest).
- The posture change amount calculation unit 24 reads from the memory the position coordinates (Ui, Vi) on the image of the plurality of feature points included in the previous frame (time t), among the frames captured in each fixed information processing cycle. Further, the posture change amount calculation unit 24 reads from the memory the position coordinates (Ui, Vi) on the image of the plurality of feature points included in the current frame (time t + Δt). Then, the posture change amount calculation unit 24 obtains the posture change amount of the vehicle based on the temporal position changes of the plurality of feature points on the images.
- Here, the "posture change amount of the vehicle" includes both the "change amounts of the distance and posture angle" of the vehicle with respect to the road surface and the "movement amount of the vehicle" on the road surface.
- Hereinafter, a method of calculating the "change amounts of the distance and posture angle" and the "movement amount of the vehicle" will be described.
- FIG. 6A shows an example of the first frame (image) 38 (FIG. 5) acquired at time t.
- As shown in FIGS. 5 and 6A, consider a case where, for example, the relative positions (Xi, Yi, Zi) of three feature points Te1, Te2, and Te3 are calculated in the first frame 38.
- In this case, the plane G (see FIG. 6A) specified by the feature points Te1, Te2, and Te3 can be regarded as the road surface. Accordingly, the posture change amount calculation unit 24 can obtain the distance and posture angle (normal vector) of the camera 12 with respect to the road surface (plane G) from the relative positions (Xi, Yi, Zi) of the feature points.
- Furthermore, the posture change amount calculation unit 24 can obtain the distances (l1, l2, l3) between the feature points Te1, Te2, and Te3 and the angles formed by the straight lines connecting them, using a known camera model.
- The camera 12 in FIG. 5 shows the position of the camera when the first frame was captured.
- In FIG. 5, as the three-dimensional coordinate system expressing the relative positions with respect to the camera 12, the Z axis is set in the imaging direction of the camera 12, and the mutually orthogonal X and Y axes are set in a plane that contains the camera 12 and has the imaging direction as its normal.
- On the image, on the other hand, the horizontal direction and the vertical direction are set to the V axis and the U axis, respectively.
- FIG. 6B shows the second frame 38' acquired at time (t + Δt), after time Δt has elapsed from time t.
- The camera 12' in FIG. 5 indicates the position of the camera when the second frame 38' was captured.
- The camera 12' captures the feature points Te1, Te2, and Te3 in the second frame 38', and the feature point detection unit 23 detects them.
- From the relative positions (Xi, Yi, Zi) of the feature points Te1 to Te3 at time t, the positions P1(Ui, Vi) of the feature points on the second frame 38', and the camera model of the camera 12, the posture change amount calculation unit 24 can calculate the movement amount ΔL of the camera 12 during the time Δt. Accordingly, the movement amount of the vehicle can be calculated; furthermore, the change amounts of the distance and posture angle can also be calculated.
- That is, the posture change amount calculation unit 24 can calculate the movement amount (ΔL) of the camera 12 (vehicle) and the change amounts of the distance and posture angle.
- Note that equation (1) below models the camera 12 as an ideal pinhole camera with no distortion or optical axis deviation, where λi is a constant and f is the focal length.
- The camera model parameters may be calibrated in advance.
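- The body of equation (1) does not survive in this text; under the ideal pinhole model just described, the projection relation it denotes would conventionally be written as follows (a reconstruction from the surrounding definitions, not a quotation of the patent):

```latex
% Ideal pinhole projection of a feature point's relative position
% (x_i, y_i, z_i) onto image coordinates (u_i, v_i); lambda_i is the scale
% constant and f the focal length named in the text.
\lambda_i \begin{pmatrix} u_i \\ v_i \\ 1 \end{pmatrix}
= \begin{pmatrix} f & 0 & 0 \\ 0 & f & 0 \\ 0 & 0 & 1 \end{pmatrix}
  \begin{pmatrix} x_i \\ y_i \\ z_i \end{pmatrix}
```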
- Note that instead of using all of the feature points whose relative positions are calculated in the images detected at time t and time t + Δt, the posture change amount calculation unit 24 may select optimal feature points based on the positional relationships among the feature points.
- As a selection method, for example, the epipolar geometry described in "R. I. Hartley, 'A linear method for reconstruction from lines and points,' Proc. 5th International Conference on Computer Vision, pp. 882-887 (1995)" can be used.
- In this case as well, the feature point detection unit 23 detects, from the frame image 38' at time t + Δt, the feature points Te1, Te2, and Te3 whose relative positions (Xi, Yi, Zi) were calculated in the frame image 38 at time t.
- The posture change amount calculation unit 24 can then calculate the "posture angle change amount of the vehicle" from the temporal changes of the relative positions (Xi, Yi, Zi) of the plurality of feature points on the road surface and of the positions (Ui, Vi) of the feature points on the image.
- Moreover, the movement amount of the vehicle can be calculated.
- That is, by continuing the process of adding the change amounts of the distance and posture angle (integration calculation), the distance and posture angle can be continuously updated without using the pattern light 32a.
- For the first information processing cycle, the distance and posture angle calculated using the pattern light 32a, or a predetermined initial distance and initial posture angle, may be used. That is, the distance and posture angle serving as the starting point of the integration calculation may be calculated using the pattern light 32a, or predetermined initial values may be used. It is desirable that the predetermined initial distance and initial posture angle take into account at least the occupants and load of the vehicle 10.
- For example, the pattern light 32a may be projected while the ignition switch of the vehicle 10 is on and the shift position is in parking, and the distance and posture angle calculated from the pattern light 32a may be used as the predetermined initial distance and initial posture angle. Thereby, a distance and posture angle unaffected by roll or pitch motion due to turning or acceleration/deceleration of the vehicle 10 can be obtained.
- To establish correspondence between feature points in successive frames, for example, an image of a small area around each detected feature point may be recorded in the memory, and the correspondence may be determined from the similarity of luminance and color information.
- Specifically, the ECU 13 records in the memory a 5 × 5 (horizontal × vertical) pixel image centered on each detected feature point.
- If the similarity is sufficiently high, the posture change amount calculation unit 24 determines that the feature point can be correlated between the previous and current frames. The posture change amount obtained by the above processing is then used when the self-position calculation unit 26 in the subsequent stage calculates the self-position of the vehicle.
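- A small sketch of this patch-based correspondence test follows; the sum-of-squared-differences measure and its threshold are assumptions, since the text only specifies the 5 × 5 patch and a similarity comparison:

```python
# Match feature points between frames by comparing the stored 5x5 patches.
import numpy as np

def patch(gray, u, v, half=2):
    """Extract the 5x5 patch centered on integer pixel position (u, v)."""
    return gray[v - half:v + half + 1, u - half:u + half + 1].astype(float)

def can_correlate(prev_gray, cur_gray, p_prev, p_cur, max_ssd=500.0):
    """True if the two detections look like the same road-surface feature."""
    a = patch(prev_gray, *p_prev)
    b = patch(cur_gray, *p_cur)
    return a.shape == b.shape == (5, 5) and np.sum((a - b) ** 2) <= max_ssd
```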
- The self-position calculation unit 26 calculates the current distance and posture angle of the vehicle from the "change amounts of the distance and posture angle" calculated by the posture change amount calculation unit 24. In addition, it calculates the self-position of the vehicle from the "movement amount of the vehicle" calculated by the posture change amount calculation unit 24.
- Specifically, using the distance and posture angle calculated by the posture angle calculation unit 22 as a starting point, the self-position calculation unit 26 sequentially adds (integration calculation) the change amounts of the distance and posture angle calculated by the posture change amount calculation unit 24 for each frame, updating the distance and posture angle to the latest values.
- Further, the self-position calculation unit 26 calculates the self-position of the vehicle by taking the vehicle position at the time the distance and posture angle were calculated by the posture angle calculation unit 22 as a starting point (initial position of the vehicle) and sequentially adding the movement amount of the vehicle from this initial position. For example, by setting a starting point (the initial position of the vehicle) matched against a position on a map, the current position of the vehicle on the map can be calculated sequentially.
- In this way, the posture change amount calculation unit 24 can calculate the self-position of the vehicle by obtaining the movement amount (ΔL) of the camera 12 during the time Δt. Furthermore, since the change amounts of the distance and posture angle can be calculated at the same time, the posture change amount calculation unit 24 can calculate the movement amount (ΔL) in six degrees of freedom (front-rear, left-right, up-down, yaw, pitch, and roll) with high accuracy, taking the change amounts of the distance and posture angle of the vehicle into account. That is, even if the distance and posture angle change due to roll or pitch motion caused by turning or acceleration/deceleration of the vehicle 10, the estimation error of the movement amount (ΔL) can be suppressed.
- In the present embodiment, the movement amount (ΔL) of the camera 12 is calculated while the change amounts of the distance and posture angle are calculated and the distance and posture angle are updated; however, only the change amount of the posture angle may be calculated and updated.
- In this case, the distance between the road surface 31 and the camera 12 may be assumed to be constant. This makes it possible to reduce the calculation load of the ECU 13 and improve the calculation speed while suppressing the estimation error of the movement amount (ΔL) by taking the change amount of the posture angle into account.
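- The integration (dead reckoning) performed by the self-position calculation unit 26 can be pictured with the following planar sketch; a full six-degree-of-freedom version would compose rotations as well, so the (x, y, yaw) state here is a deliberate simplification:

```python
# Accumulate per-cycle movement amounts (ΔL) onto a starting pose.
import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float = 0.0    # position [m]
    y: float = 0.0
    yaw: float = 0.0  # heading [rad]

def integrate(pose, delta_l, delta_yaw):
    """Add one information processing cycle's pose change to the pose."""
    pose.yaw += delta_yaw
    pose.x += delta_l * math.cos(pose.yaw)
    pose.y += delta_l * math.sin(pose.yaw)
    return pose
```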
- The pattern light control unit 27 controls the projection of the pattern light 32a by the projector 11. For example, when the ignition switch of the vehicle 10 is turned on and the self-position calculation device is activated, the pattern light control unit 27 starts projecting the pattern light 32a. Thereafter, the pattern light control unit 27 projects the pattern light 32a continuously until the self-position calculation device stops. Alternatively, the projection may be switched on and off repeatedly at a predetermined time interval.
- The road surface state determination unit 30 detects changes in the road surface state around the vehicle and determines whether the road surface state has changed by a threshold value or more. If it is determined that the road surface state has changed by the threshold value or more, the self-position calculation unit 26 fixes the starting point of the integration calculation at the current position of the vehicle 10, the distance to the road surface, and the posture angle calculated in the previous information processing cycle. Thereby, the posture angle calculation unit 22 stops calculating the distance and posture angle of the vehicle 10 with respect to the road surface.
- Then, the self-position calculation unit 26 calculates the current position of the vehicle, the distance to the road surface, and the posture angle by adding the posture change amount to the current position of the vehicle 10, the distance to the road surface, and the posture angle calculated in the previous information processing cycle.
- In the present embodiment, 35 (5 × 7) spot lights of the pattern light 32a are projected onto the road surface. Therefore, for example, when only 80% or less of the 35 spot lights, that is, 28 or fewer, can be detected on the image of the camera 12, the road surface state determination unit 30 determines that the steps and unevenness of the road surface have become severe and that the road surface state has changed by the threshold value or more.
- Alternatively, the change in the road surface state may be estimated from the amount of change in the road surface height.
- The amount of change in the road surface height can be detected from the vibration of the detection values of stroke sensors attached to the suspension of each wheel of the vehicle. For example, when the vibration of the stroke sensor detection values reaches 1 Hz or more, the road surface state determination unit 30 estimates that the steps and unevenness of the road surface have become severe and determines that the road surface state has changed by the threshold value or more.
- Alternatively, the detection values of an acceleration sensor that measures vertical acceleration may be integrated to calculate the vertical velocity; when the change in the direction of this velocity reaches 1 Hz or more, the road surface state determination unit 30 may determine that the steps and unevenness of the road surface have become severe and that the road surface state has changed by the threshold value or more.
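- A sketch of this accelerometer-based criterion, integrating vertical acceleration into vertical velocity and measuring how often the velocity changes direction (the rectangular integration and the sample period `dt` are assumptions):

```python
# Estimate the rate of vertical-velocity direction changes from an
# acceleration sensor, and compare it against the 1 Hz criterion.
import numpy as np

def velocity_direction_change_rate(accel_z, dt):
    """accel_z: vertical acceleration samples [m/s^2]; returns rate in Hz."""
    velocity = np.cumsum(accel_z) * dt            # rectangular integration
    flips = np.count_nonzero(np.diff(np.sign(velocity)))
    duration = len(accel_z) * dt
    return flips / duration if duration > 0 else 0.0

def surface_changed(accel_z, dt, threshold_hz=1.0):
    return velocity_direction_change_rate(accel_z, dt) >= threshold_hz
```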
- Alternatively, the amount of change in the road surface height may be estimated from the position of the pattern light 32a in the image captured by the camera 12.
- For example, pattern light 32a as shown in FIG. 7 is projected onto the road surface 31. In this case, a line segment 71 connecting the spot lights of the pattern light 32a in the X direction and a line segment 73 connecting them in the Y direction can be drawn.
- When the slope of these line segments changes by 15 degrees or more, the road surface state determination unit 30 estimates that the steps and unevenness of the road surface have become severe and determines that the road surface state has changed by the threshold value or more. Alternatively, as shown in FIG. 7, when the difference between the adjacent spot light intervals d1 and d2 changes by 50% or more, the road surface state determination unit 30 may determine that the road surface state has changed by the threshold value or more.
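- The two pattern-geometry criteria could be sketched as follows; the 15-degree and 50% thresholds come from the text, while the least-squares line fit and the exact interpretation of the 50% interval change are assumptions:

```python
# Geometry-based criteria: slope of a line through one row of spot
# centroids, and relative difference of adjacent spot intervals d1, d2.
import math
import numpy as np

def row_slope_deg(centroids):
    """Fit a line through (U, V) centroids of one row; return slope [deg]."""
    u, v = np.asarray(centroids, dtype=float).T
    slope, _ = np.polyfit(u, v, 1)
    return math.degrees(math.atan(slope))

def geometry_changed(slope_now_deg, slope_ref_deg, d1, d2):
    slope_jump = abs(slope_now_deg - slope_ref_deg) >= 15.0
    interval_jump = abs(d1 - d2) / max(d1, d2) >= 0.5
    return slope_jump or interval_jump
```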
- When it is determined that the road surface state has changed by the threshold value or more, the self-position calculation unit 26 fixes the starting point at the current position of the vehicle 10, the distance to the road surface, and the posture angle calculated in the previous information processing cycle. Accordingly, the posture angle calculation unit 22 stops calculating the distance and posture angle of the vehicle 10 with respect to the road surface, and the self-position calculation unit 26 calculates the current position of the vehicle 10, the distance to the road surface, and the posture angle by adding the posture change amount to the current position, the distance to the road surface, and the posture angle calculated in the previous information processing cycle.
- In the example of the time chart of FIG. 8, the road surface state determination unit 30 monitors the number of detected spot lights, with the threshold set to 28, which is 80% of the 35 spot lights. The road surface state determination unit 30 sets the posture angle calculation flag to "ON" while more than 28 spot lights can be detected.
- While the flag is "ON", the posture angle calculation unit 22 calculates the distance and posture angle of the vehicle 10 with respect to the road surface, and the self-position calculation unit 26 calculates the current distance and posture angle using the distance and posture angle calculated by the posture angle calculation unit 22, and calculates the current position of the vehicle by adding the movement amount of the vehicle to the current position of the vehicle 10 calculated in the previous information processing cycle (continuing the integration calculation).
- When the number of detected spot lights falls to 28 or fewer, the posture angle calculation flag is switched to "OFF".
- As a result, the starting point, namely the current position of the vehicle 10, the distance to the road surface, and the posture angle, is fixed at the current position, the distance to the road surface, and the posture angle calculated in the previous information processing cycle, and the calculation of the distance and posture angle of the vehicle 10 is stopped.
- Then, the self-position calculation unit 26 calculates the current position of the vehicle, the distance to the road surface, and the posture angle by adding the posture change amount to the current position of the vehicle 10, the distance to the road surface, and the posture angle calculated in the previous information processing cycle.
- When more than 28 spot lights can be detected again, the self-position calculation unit 26 resumes calculating the current distance and posture angle of the vehicle 10 using the distance and posture angle calculated by the posture angle calculation unit 22.
- As described above, when the road surface state changes greatly, the self-position calculation device does not use the distance and posture angle of the vehicle 10 calculated by the posture angle calculation unit 22, but instead uses the current position of the vehicle 10, the distance to the road surface, and the posture angle calculated in the previous information processing cycle. Therefore, the self-position of the vehicle 10 can be calculated accurately and stably even if the road surface state changes greatly.
- Next, the information processing cycle repeatedly executed by the ECU 13 will be described with reference to FIGS. 9 and 10. This information processing cycle is an example of a self-position calculation method for calculating the self-position of the vehicle 10 from the images 38 acquired by the camera 12.
- The information processing cycle shown in the flowchart of FIG. 9 is started when the ignition switch of the vehicle 10 is turned on and the self-position calculation device is activated, and is executed repeatedly until the self-position calculation device stops.
- First, the pattern light control unit 27 controls the projector 11 to project the pattern light 32a onto the road surface 31 around the vehicle. In the flowchart of FIG. 9, the pattern light 32a is projected continuously.
- Proceeding to step S03, the ECU 13 controls the camera 12 to capture an image 38 of the road surface 31 around the vehicle, including the area onto which the pattern light 32a is projected. The ECU 13 stores the image data acquired by the camera 12 in the memory.
- Note that the ECU 13 can control the aperture of the camera 12 automatically. The aperture of the camera 12 may be feedback-controlled so that the average luminance of the image 38 acquired in the previous information processing cycle becomes an intermediate value between the maximum and minimum luminance values. Since the luminance is high in the area onto which the pattern light 32a is projected, the ECU 13 may obtain the average luminance from an area excluding the portion from which the pattern light 32a is extracted.
- Proceeding to step S05, first, the pattern light extraction unit 21 reads the image 38 acquired by the camera 12 from the memory and extracts the position of the pattern light 32a from the image 38, as shown in FIG. 4A.
- The pattern light extraction unit 21 stores the coordinates (Uj, Vj) of the spot lights Sp on the image, calculated as data indicating the position of the pattern light 32a, in the memory.
- Further, in step S05, the posture angle calculation unit 22 reads the data indicating the position of the pattern light 32a from the memory, calculates the distance and posture angle of the vehicle 10 with respect to the road surface 31 from the position of the pattern light 32a, and stores them in the memory.
- Proceeding to step S07, the ECU 13 detects feature points from the image 38, extracts feature points that can be correlated between the previous and current information processing cycles, and calculates the change amounts of the distance and posture angle from the positions (Ui, Vi) of the feature points on the image. In addition, the movement amount of the vehicle is calculated.
- Specifically, the feature point detection unit 23 reads the image 38 acquired by the camera 12 from the memory, detects feature points on the road surface 31 from the image 38, and stores the position (Ui, Vi) of each feature point on the image in the memory.
- The posture change amount calculation unit 24 reads the position (Ui, Vi) of each feature point on the image from the memory, and calculates the relative position (Xi, Yi, Zi) of each feature point with respect to the camera 12 from the distance and posture angle calculated by the posture angle calculation unit 22 and the position (Ui, Vi) of the feature point on the image.
- Note that the posture change amount calculation unit 24 uses the distance and posture angle set in step S09 of the previous information processing cycle. The posture change amount calculation unit 24 then stores the relative position (Xi, Yi, Zi) of each feature point with respect to the camera 12 in the memory.
- Then, the posture change amount calculation unit 24 reads from the memory the position (Ui, Vi) of each feature point on the image and the relative position (Xi, Yi, Zi) of each feature point calculated in step S07 of the previous information processing cycle.
- The posture change amount calculation unit 24 calculates the change amounts of the distance and posture angle using the relative positions (Xi, Yi, Zi) and the positions (Ui, Vi) on the image of the feature points that can be correlated between the previous and current information processing cycles. Further, the posture change amount calculation unit 24 calculates the movement amount of the vehicle from the previous relative positions (Xi, Yi, Zi) and the current relative positions (Xi, Yi, Zi) of the feature points, and stores it in the memory.
- The "change amounts of the distance and posture angle" and the "movement amount of the vehicle" calculated in step S07 are used in the processing of step S11.
- Proceeding to step S09, the ECU 13 sets the starting point of the integration calculation for calculating the self-position according to the change in the road surface state around the vehicle. Details will be described later with reference to FIG. 10.
- Proceeding to step S11, the self-position calculation unit 26 calculates the self-position of the vehicle 10 from the starting point of the integration calculation set in step S09 and the movement amount and change amounts of the distance and posture angle of the vehicle 10 calculated in the processing of step S07.
- In this way, the self-position calculation device can calculate the self-position of the vehicle 10 by repeatedly executing the above series of information processing cycles and integrating the movement amounts of the vehicle 10.
- Next, the detailed procedure of step S09 in FIG. 9 will be described with reference to the flowchart of FIG. 10. In step S101, the road surface state determination unit 30 detects changes in the road surface state around the vehicle. Specifically, the road surface state determination unit 30 detects the number of spot lights of the pattern light 32a, or detects the vibration of the detection values of the stroke sensors attached to the wheels. The road surface state determination unit 30 may also integrate the detection values of an acceleration sensor capable of measuring the vertical acceleration of the vehicle to calculate the vertical velocity, or may detect the position of the pattern light 32a.
- The road surface state determination unit 30 then determines whether the road surface state around the vehicle has changed by the threshold value or more. For example, when detecting the number of spot lights of the pattern light 32a, the road surface state determination unit 30 determines that the steps and unevenness of the road surface have become severe and that the road surface state has changed by the threshold value or more when only 28 or fewer of the 35 spot lights can be detected on the camera image.
- When using the stroke sensors, the road surface state determination unit 30 determines that the road surface state has changed by the threshold value or more when the vibration of the stroke sensor detection values reaches 1 Hz or more. Furthermore, when using the acceleration sensor, the detection values of the acceleration sensor are integrated to calculate the vertical velocity, and the road surface state determination unit 30 determines that the road surface state has changed by the threshold value or more when the change in the direction of the velocity reaches 1 Hz or more.
- When detecting the position of the pattern light 32a, the road surface state determination unit 30 determines that the road surface state has changed by the threshold value or more when the slope of a line segment connecting the spot lights changes by 15 degrees or more.
- Alternatively, the road surface state determination unit 30 may determine that the road surface state has changed by the threshold value or more when the difference between adjacent spot light intervals changes by 50% or more.
- In step S103, the road surface state determination unit 30 determines whether the road surface state around the vehicle has changed by the threshold value or more. If it determines that the road surface state has changed by the threshold value or more (YES in step S103), the process proceeds to step S105. If it determines that the change is less than the threshold value (NO in step S103), the process proceeds to step S107.
- In step S105, the self-position calculation unit 26 fixes the current position of the vehicle 10, the distance to the road surface, and the posture angle at the current position, the distance to the road surface, and the posture angle calculated in the previous information processing cycle. That is, the self-position calculation unit 26 sets the current position of the vehicle 10, the distance to the road surface, and the posture angle calculated in the previous information processing cycle as the starting point of the integration calculation.
- Thereby, the posture angle calculation unit 22 stops calculating the distance and posture angle of the vehicle 10 with respect to the road surface, and the self-position calculation unit 26 calculates the current position of the vehicle 10, the distance to the road surface, and the posture angle by adding the posture change amount to the current position, the distance to the road surface, and the posture angle calculated in the previous information processing cycle.
- In step S107, the self-position calculation unit 26 sets the current position of the vehicle 10, the distance to the road surface, and the posture angle calculated in step S05 of the current information processing cycle as the starting point of the integration calculation.
- Then, the self-position calculation unit 26 calculates the current position of the vehicle 10, the distance to the road surface, and the posture angle by adding the posture change amount to the current position, the distance to the road surface, and the posture angle calculated in the current information processing cycle.
- Step S09 then ends, and the process proceeds to step S11 in FIG. 9.
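- The whole of step S09 thus reduces to choosing the base of the integration, as in this condensed sketch (`pose` here stands for the triple of current position, distance to the road surface, and posture angle):

```python
# Step S09 branch (FIG. 10): choose the starting point of the integration.
def select_integration_start(surface_changed, pose_prev_cycle, pose_step_s05):
    """Previous cycle's pose if the surface changed, else the fresh pose."""
    return pose_prev_cycle if surface_changed else pose_step_s05
```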
- FIG. 11 illustrates examples of estimation errors in the roll angle (an example of the posture angle) and the movement amount (in the vehicle width direction) of the vehicle 10.
- FIG. 11A shows the change over time of the calculated roll angle when the vehicle 10 is traveling straight without generating a roll angle, and FIG. 11B shows the change over time of the calculated movement amount of the vehicle 10 in the same case.
- "P1" and "P2" indicate values obtained when the movement amount of the vehicle is calculated by a conventional method, and "Q1" and "Q2" indicate the true values.
- As described above, in the self-position calculation device according to the present embodiment, when it is determined that the road surface state around the vehicle has changed by the threshold value or more, the calculation of the current position of the vehicle 10, the distance to the road surface, and the posture angle is stopped, and the current position of the vehicle 10, the distance to the road surface, and the posture angle calculated in the previous information processing cycle are used as the starting point. By using the values of the previous information processing cycle, the error can be prevented from growing as shown by P1 and P2 in FIG. 11, and the self-position of the vehicle 10 can be calculated with high accuracy.
- Further, in the self-position calculation device according to the present embodiment, the change in the road surface state is estimated from the amount of change in the road surface height. Since the unevenness and steps of the road surface can thereby be detected, the self-position of the vehicle can be calculated with high accuracy.
- Furthermore, the amount of change in the road surface height is estimated from the position of the pattern light. Since changes in the road surface state can thus be detected without using sensors mounted on the vehicle, the self-position of the vehicle can be calculated with high accuracy by a simple method.
- [Second Embodiment] Next, a self-position calculation device according to a second embodiment of the present invention will be described with reference to the drawings. The configuration of the self-position calculation device according to this embodiment is the same as that of the first embodiment shown in FIG. 1, so a detailed description is omitted.
- While the first embodiment calculates the distance and posture angle of the vehicle 10 in every information processing cycle, the present embodiment differs in that it calculates the distance and posture angle of the vehicle 10 at a constant period.
- Specifically, as shown in FIG. 12, a period count pulse is generated at a constant period, for example every 8 frames, and the posture angle calculation flag is set to "ON" at that timing. Thereby, the distance and posture angle of the vehicle 10 can be calculated at a constant period.
- However, when the number of detected spot lights is equal to or less than the threshold value, the posture angle calculation flag is not set to "ON" but is kept "OFF". For example, between times t1 and t2, the number of detected spot lights is equal to or less than the threshold value, so the period count pulse is generated but the posture angle calculation flag is kept "OFF" instead of being set to "ON".
- In this case, the starting point is fixed at the current position of the vehicle, the distance to the road surface, and the posture angle calculated in the previous information processing cycle; the posture angle calculation unit 22 stops calculating the distance and posture angle of the vehicle, and the self-position calculation unit 26 calculates the current position of the vehicle 10, the distance to the road surface, and the posture angle by adding the posture change amount to the current position of the vehicle 10, the distance to the road surface, and the posture angle calculated in the previous information processing cycle.
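- The flag logic of this embodiment can be summarized in a short sketch; the 8-frame period and the 28-spot threshold are the example values given in the text, while the modulo pulse generator is an assumption:

```python
# Second embodiment: recalculate the posture angle from the pattern light
# only on the period count pulse, and only if enough spots are detected.
def posture_angle_flag(frame_index, spots_detected, period=8, threshold=28):
    """True when the distance and posture angle should be recalculated."""
    pulse = (frame_index % period == 0)  # period count pulse
    return pulse and spots_detected > threshold
```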
- Next, the detailed procedure of step S09 in FIG. 9 according to the present embodiment will be described with reference to the flowchart of FIG. 13. In step S201, the road surface state determination unit 30 determines whether a predetermined period has elapsed. As described with reference to FIG. 12, the road surface state determination unit 30 monitors whether a period count pulse has occurred; if it has, the unit determines that the predetermined period has elapsed and the process proceeds to step S203. If the period count pulse has not occurred, the unit determines that the predetermined period has not elapsed and the process proceeds to step S207.
- Since the processing in steps S203 to S209 is the same as the processing in steps S101 to S107 in FIG. 10, a detailed description is omitted.
- As described above in detail, the self-position calculation device according to the present embodiment calculates the distance and posture angle of the vehicle at a constant period. When it is determined at that time that the road surface state around the vehicle has changed by the threshold value or more, the self-position calculation device calculates the current position of the vehicle, the distance to the road surface, and the posture angle by adding the posture change amount to the current position of the vehicle, the distance to the road surface, and the posture angle calculated in the previous information processing cycle.
- Thereby, the frequency with which the posture angle calculation unit 22 calculates the distance and posture angle of the vehicle can be reduced, so the calculation load on the ECU 13 can be reduced and the calculation speed improved.
11 Projector
12 Camera (imaging unit)
13 ECU
21 Pattern light extraction unit
22 Posture angle calculation unit
23 Feature point detection unit
24 Posture change amount calculation unit
26 Self-position calculation unit
27 Pattern light control unit
30 Road surface state determination unit
31 Road surface
32a, 32b Pattern light
Te Feature point
Claims (4)
- A self-position calculation device comprising: a projector that projects pattern light onto a road surface around a vehicle; an imaging unit, mounted on the vehicle, that captures an image of the road surface around the vehicle including the area onto which the pattern light is projected; a posture angle calculation unit that calculates a posture angle of the vehicle with respect to the road surface from the position of the pattern light in the image acquired by the imaging unit; a posture change amount calculation unit that calculates a posture change amount of the vehicle based on temporal changes of a plurality of feature points on the road surface in the images acquired by the imaging unit; a self-position calculation unit that calculates the current position and posture angle of the vehicle by successively adding the posture change amount to an initial position and posture angle of the vehicle; and a road surface state determination unit that detects a change in the road surface state around the vehicle and determines whether the road surface state has changed by a threshold value or more, wherein, when the road surface state determination unit determines that the road surface state has changed by the threshold value or more, the self-position calculation unit calculates the current position and posture angle of the vehicle by adding the posture change amount to the current position and posture angle of the vehicle calculated in the previous information processing cycle.
- The self-position calculation device according to claim 1, wherein the road surface state determination unit estimates the change in the road surface state from an amount of change in the road surface height.
- The self-position calculation device according to claim 2, wherein the road surface state determination unit estimates the amount of change in the road surface height from the position of the pattern light.
- A self-position calculation method comprising: a procedure of projecting pattern light onto a road surface around a vehicle; a procedure of capturing an image of the road surface around the vehicle including the area onto which the pattern light is projected; a procedure of calculating a posture angle of the vehicle with respect to the road surface from the position of the pattern light in the image; a procedure of calculating a posture change amount of the vehicle based on temporal changes of a plurality of feature points on the road surface in the images; a self-position calculation procedure of calculating the current position and posture angle of the vehicle by successively adding the posture change amount to an initial position and posture angle of the vehicle; and a road surface state determination procedure of detecting a change in the road surface state around the vehicle and determining whether the road surface state has changed by a threshold value or more, wherein, when it is determined by the road surface state determination procedure that the road surface state has changed by the threshold value or more, the self-position calculation procedure calculates the current position and posture angle of the vehicle by adding the posture change amount to the current position and posture angle of the vehicle calculated in the previous information processing cycle.
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/054313 WO2015125298A1 (ja) | 2014-02-24 | 2014-02-24 | 自己位置算出装置及び自己位置算出方法 |
RU2016137966A RU2621823C1 (ru) | 2014-02-24 | 2014-02-24 | Устройство вычисления собственного местоположения и способ вычисления собственного местоположения |
EP14882984.9A EP3113146B1 (en) | 2014-02-24 | 2014-02-24 | Location computation device and location computation method |
BR112016019548-5A BR112016019548B1 (pt) | 2014-02-24 | 2014-02-24 | Dispositivo de cálculo de auto-localização e metodo de cálculo de auto-localização |
US15/119,161 US9721170B2 (en) | 2014-02-24 | 2014-02-24 | Self-location calculating device and self-location calculating method |
JP2016503897A JP6237876B2 (ja) | 2014-02-24 | 2014-02-24 | 自己位置算出装置及び自己位置算出方法 |
MX2016010742A MX349024B (es) | 2014-02-24 | 2014-02-24 | Dispositivo de calculo de la ubicacion propia y metodo de calculo de la ubicacion propia. |
CN201480075546.1A CN105993041B (zh) | 2014-02-24 | 2014-02-24 | 自身位置计算装置以及自身位置计算方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015125298A1 true WO2015125298A1 (ja) | 2015-08-27 |
Family
ID=53877831
Country Status (8)
Country | Link |
---|---|
US (1) | US9721170B2 (ja) |
EP (1) | EP3113146B1 (ja) |
JP (1) | JP6237876B2 (ja) |
CN (1) | CN105993041B (ja) |
BR (1) | BR112016019548B1 (ja) |
MX (1) | MX349024B (ja) |
RU (1) | RU2621823C1 (ja) |
WO (1) | WO2015125298A1 (ja) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
BR112016019589B1 (pt) * | 2014-02-24 | 2022-06-14 | Nissan Motor Co. Ltd | Dispositivo de cálculo de autolocalização e método de cálculo de autolocalização |
JP6707378B2 (ja) * | 2016-03-25 | 2020-06-10 | 本田技研工業株式会社 | 自己位置推定装置および自己位置推定方法 |
WO2018020589A1 (ja) * | 2016-07-26 | 2018-02-01 | 日産自動車株式会社 | 自己位置推定方法及び自己位置推定装置 |
JP6838340B2 (ja) * | 2016-09-30 | 2021-03-03 | アイシン精機株式会社 | 周辺監視装置 |
US9896107B1 (en) * | 2016-09-30 | 2018-02-20 | Denso International America, Inc. | Digital lane change confirmation projection systems and methods |
CN107063228B (zh) * | 2016-12-21 | 2020-09-04 | 上海交通大学 | 基于双目视觉的目标姿态解算方法 |
US20210403452A1 (en) | 2018-11-02 | 2021-12-30 | Celgene Corporation | Solid forms of 2-methyl-1-[(4-[6-(trifluoromethyl) pyridin-2-yl]-6-{[2-(trifluoromethyl)pyridin-4-yl]amino}-1,3,5-triazin-2-yl) amino]propan-2-ol |
US20220017490A1 (en) | 2018-11-02 | 2022-01-20 | Celgene Corporation | Co-crystals of 2-methyl-1 -[(4-[6-(trifluoromethyl)pyridin-2-yl]-6-{[2-(trifluoromethyl) pyridin-4-yl]amino}-1,3,5-triazin-2-yl)amino]propan-2-ol, compositions and methods of use thereof |
US20220017489A1 (en) | 2018-11-02 | 2022-01-20 | Celgene Corporation | Solid dispersions for treatment of cancer |
WO2020137315A1 (ja) * | 2018-12-28 | 2020-07-02 | パナソニックIpマネジメント株式会社 | 測位装置及び移動体 |
US11900679B2 (en) * | 2019-11-26 | 2024-02-13 | Objectvideo Labs, Llc | Image-based abnormal event detection |
CN111862474B (zh) * | 2020-05-20 | 2022-09-20 | 北京骑胜科技有限公司 | 共享车辆处理方法、装置、设备及计算机可读存储介质 |
CN116252581B (zh) * | 2023-03-15 | 2024-01-16 | 吉林大学 | 直线行驶工况车身垂向及俯仰运动信息估算系统及方法 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06325298A (ja) * | 1993-05-13 | 1994-11-25 | Yazaki Corp | 車両周辺監視装置 |
JP2007256090A (ja) * | 2006-03-23 | 2007-10-04 | Nissan Motor Co Ltd | 車両用環境認識装置及び車両用環境認識方法 |
JP2010101683A (ja) * | 2008-10-22 | 2010-05-06 | Nissan Motor Co Ltd | 距離計測装置および距離計測方法 |
WO2012172870A1 (ja) * | 2011-06-14 | 2012-12-20 | 日産自動車株式会社 | 距離計測装置及び環境地図生成装置 |
JP2013147114A (ja) * | 2012-01-18 | 2013-08-01 | Toyota Motor Corp | 周辺環境取得装置およびサスペンション制御装置 |
JP2013187862A (ja) * | 2012-03-09 | 2013-09-19 | Topcon Corp | 画像データ処理装置、画像データ処理方法および画像データ処理用のプログラム |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2901112B2 (ja) * | 1991-09-19 | 1999-06-07 | 矢崎総業株式会社 | 車両周辺監視装置 |
RU2247921C2 (ru) * | 2002-06-26 | 2005-03-10 | Анцыгин Александр Витальевич | Способ ориентирования на местности и устройство для его осуществления |
JP2004198211A (ja) | 2002-12-18 | 2004-07-15 | Aisin Seiki Co Ltd | 移動体周辺監視装置 |
JP2004198212A (ja) * | 2002-12-18 | 2004-07-15 | Aisin Seiki Co Ltd | 移動体周辺監視装置 |
JP4914726B2 (ja) | 2007-01-19 | 2012-04-11 | クラリオン株式会社 | 現在位置算出装置、現在位置算出方法 |
JP5414714B2 (ja) * | 2011-03-04 | 2014-02-12 | 日立オートモティブシステムズ株式会社 | 車戴カメラ及び車載カメラシステム |
JP6092530B2 (ja) * | 2012-06-18 | 2017-03-08 | キヤノン株式会社 | 画像処理装置、画像処理方法 |
EP2881703B1 (en) * | 2012-08-02 | 2021-02-24 | Toyota Jidosha Kabushiki Kaisha | Road surface condition acquisition device and suspension system |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108419441A (zh) * | 2016-03-14 | 2018-08-17 | 欧姆龙株式会社 | 路面形状测定装置、测定方法及程序 |
CN108419441B (zh) * | 2016-03-14 | 2020-06-02 | 欧姆龙株式会社 | 路面形状测定装置、测定方法及计算机可读存储介质 |
JP2019091102A (ja) * | 2017-11-10 | 2019-06-13 | 株式会社デンソーアイティーラボラトリ | 位置姿勢推定システム及び位置姿勢推定装置 |
JP2020003458A (ja) * | 2018-07-02 | 2020-01-09 | 株式会社Subaru | 自車位置検出装置 |
WO2020137110A1 (ja) * | 2018-12-27 | 2020-07-02 | クラリオン株式会社 | 移動量推定装置 |
JP2020106357A (ja) * | 2018-12-27 | 2020-07-09 | クラリオン株式会社 | 移動量推定装置 |
JP7171425B2 (ja) | 2018-12-27 | 2022-11-15 | フォルシアクラリオン・エレクトロニクス株式会社 | 移動量推定装置 |
US11814056B2 (en) | 2018-12-27 | 2023-11-14 | Faurecia Clarion Electronics Co., Ltd. | Travel amount estimation apparatus |
Also Published As
Publication number | Publication date |
---|---|
BR112016019548B1 (pt) | 2022-01-11 |
MX2016010742A (es) | 2016-10-26 |
EP3113146A4 (en) | 2017-05-03 |
EP3113146A1 (en) | 2017-01-04 |
US20170024617A1 (en) | 2017-01-26 |
MX349024B (es) | 2017-07-07 |
EP3113146B1 (en) | 2018-07-04 |
US9721170B2 (en) | 2017-08-01 |
JP6237876B2 (ja) | 2017-11-29 |
BR112016019548A2 (pt) | 2017-08-15 |
JPWO2015125298A1 (ja) | 2017-03-30 |
CN105993041B (zh) | 2017-10-31 |
CN105993041A (zh) | 2016-10-05 |
RU2621823C1 (ru) | 2017-06-07 |