JP3958133B2 - Vehicle position measuring apparatus and method - Google Patents

Publication number: JP3958133B2
Application number: JP2002203355A
Authority: JP (Japan)
Prior art keywords: road, vehicle position, vehicle, error, step
Legal status: Expired - Fee Related (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Japanese (ja)
Other versions: JP2004045227A
Inventor: 幸一 佐藤
Original assignee: アルパイン株式会社
Events: application filed by アルパイン株式会社; priority to JP2002203355A; publication of JP2004045227A; application granted; publication of JP3958133B2; anticipated expiration

Description

[0001]
BACKGROUND OF THE INVENTION
The present invention relates to a vehicle position measuring apparatus and method used for a navigation apparatus or the like.
[0002]
[Prior art]
In a navigation device, measurement of the vehicle position is an important function, and its result is used for navigation operations such as displaying a map image around the vehicle position, route calculation, and route guidance. Vehicle position detection using GPS is common. However, because the detection accuracy of conventional GPS is limited, DGPS (Differential GPS), which refines the GPS result using correction data, has recently been used to improve the detection accuracy of the vehicle position.
[0003]
As another conventional technique for improving the detection accuracy of the vehicle position, the "vehicle traveling guide device" disclosed in Japanese Patent Laid-Open No. 6-68391 is known. In that device, a road-surface mark such as a pedestrian crossing painted on the road is detected, and the detected vehicle position is corrected using position data for the crossing that is contained in the map data. This makes it possible to correct, with high accuracy, the vehicle position obtained from GPS or autonomous navigation sensors.
[0004]
[Problems to be solved by the invention]
However, even with the improvement described above, the detection accuracy of the conventional GPS- or DGPS-based methods is still only on the order of several meters. As a result, when two roads run in parallel, such as an expressway and an adjacent surface road, it may be impossible to determine accurately which road the vehicle is traveling on, and inconveniences such as delayed intersection guidance can occur. Moreover, since GPS and DGPS must receive radio waves transmitted from satellites, detection accuracy degrades further in places where reception is difficult, such as built-up districts and mountainous areas.
[0005]
Further, the conventional system of Japanese Patent Laid-Open No. 6-68391 requires the position data of road-surface marks to be stored in the map data, but existing map data contains no such data. To detect the vehicle position with that method, position data for road-surface marks would have to be collected, and map data produced, for the entire area targeted for map display, which is impractical considering the cost.
[0006]
The present invention has been made in view of these points, and its object is to provide a vehicle position measuring apparatus and method capable of detecting the vehicle position with high accuracy and at low cost.
[0007]
[Means for Solving the Problems]
To solve the above problem, a vehicle position measurement device according to the present invention comprises: vehicle position detection means for detecting the vehicle position; a camera for photographing an in-road specific object whose relative position with respect to a point with specified absolute coordinates is determined; first error calculation means for calculating, based on the image of the in-road specific object photographed by the camera, the error of the vehicle position detected by the vehicle position detection means at the time of photographing; and vehicle position correction means for correcting the vehicle position detected by the vehicle position detection means using the error calculated by the first error calculation means. This makes it possible to calculate the error and correct the vehicle position from the camera image of the in-road specific object, so the vehicle position can be detected with high accuracy. In addition, because the in-road specific object's position is determined relative to a point whose absolute coordinates are specified, conventional map data can be used as-is, and an increase in cost can be suppressed.
[0008]
In particular, it is desirable that the position of the above in-road specific object be determined by law. On the road there are in-road specific objects whose installation positions are prescribed by laws and regulations such as the "Order on Road Signs, Lane Markings and Road Markings"; by using such an object, its coordinates can be obtained, the error described above can be calculated, and the vehicle position can be corrected with that error.
[0009]
It is also desirable that the in-road specific object be a guide sign that includes an intersection name. From the intersection name on the sign, the nearby point whose absolute coordinates are specified can easily be identified.
It is further preferable that the first error calculation means calculate the error by computing the distance from the camera's shooting position to the guide sign based on the size of the sign. Alternatively, the first error calculation means may compute that distance and additionally use the width data of the other road that crosses, at the intersection named on the sign, the road on which the vehicle is traveling. If the size (height or width) of the in-road specific object is known, the distance from the shooting position to the object can be calculated from the size of its image; with this distance, optionally combined with the crossing road's width data, the shooting position can easily be identified and the error contained in the vehicle position detected by the vehicle position detection means can be calculated.
[0010]
Further, it is desirable that the first error calculation means calculate the error of the vehicle position along the road being traveled. Using the camera image, the position along the traveled road is easily corrected by photographing the in-road specific object and identifying its position.
[0011]
Further, it is desirable that the point with specified absolute coordinates be a node, with the node coordinates contained in the map data used as the absolute coordinates. Many roadside objects such as guide signs and traffic lights stand near intersections, and every intersection has a node whose absolute coordinates are stored in the map data; using the node as the reference point makes it easy to determine the position of an in-road specific object in its vicinity.
[0012]
The apparatus preferably further comprises second error calculation means for calculating, based on the white-line image on the road photographed by the camera, the error of the vehicle position detected by the vehicle position detection means at the time of photographing, with the vehicle position correction means correcting the detected vehicle position using the errors calculated by both the first and second error calculation means. In particular, it is desirable that the second error calculation means calculate the error by determining, from the photographed white-line image, the position in the width direction of the road being traveled. By detecting the white lines drawn on the road, the number of lanes and the lane in which the vehicle is traveling can easily be determined, so the position in the width direction is obtained accurately and the vehicle position can be corrected accordingly. In particular, by combining the along-road correction based on the in-road specific object's image with the width-direction correction based on the white-line image, the vehicle position can be detected accurately.
[0013]
A vehicle position measuring method according to the present invention comprises: a first step of detecting the vehicle position; a second step of photographing, with a camera installed on the vehicle, an in-road specific object whose relative position with respect to a point with specified absolute coordinates is known; a third step of calculating, based on the image taken in the second step, a first error contained in the vehicle position detected in the first step; and a fourth step of correcting the vehicle position detected in the first step using the first error calculated in the third step. This makes it possible to calculate the error and correct the vehicle position from the camera image of the in-road specific object, so the vehicle position can be detected with high accuracy. In addition, because the in-road specific object's position is determined relative to a point whose absolute coordinates are specified, conventional map data can be used as-is, and an increase in cost can be suppressed.
[0014]
The method desirably further includes a fifth step of calculating a second error contained in the vehicle position detected in the first step, based on the white-line image on the road contained in the image photographed by the camera, with the fourth step correcting the vehicle position using both the first and second errors calculated in the third and fifth steps. By combining the along-road correction based on the in-road specific object's image with the width-direction correction based on the white-line image, the vehicle position can be detected accurately.
[0015]
DETAILED DESCRIPTION OF THE INVENTION
Hereinafter, a navigation device according to an embodiment to which the present invention is applied will be described with reference to the drawings.
FIG. 1 is a diagram illustrating the configuration of a navigation device according to an embodiment. The navigation device shown in FIG. 1 includes a navigation controller 1, a DVD 2, a disc reader 3, a remote control unit 4, a vehicle position detection unit 5, a display device 6, an audio unit 7, and a camera 8.
[0016]
The navigation controller 1 controls the overall operation of the navigation device. The navigation controller 1 realizes its function by executing a predetermined operation program using a CPU, ROM, RAM, or the like. The detailed configuration of the navigation controller 1 will be described later.
[0017]
The DVD 2 is an information recording medium that stores map data necessary for map display, facility search, route search, and the like. The DVD 2 stores map data in units of rectangular sheets divided by longitude and latitude into appropriate sizes; the map data for any sheet can be read by specifying its sheet number.
[0018]
The disc reader 3 can be loaded with one or more DVDs 2 and reads map data from any of the DVDs 2 under the control of the navigation controller 1. The loaded disc is not necessarily a DVD but may be a CD. Further, both DVD and CD may be selectively loaded.
[0019]
The remote control unit 4 includes a joystick for designating directions such as up, down, left, and right, and various operation keys such as a numeric keypad for entering numbers and a "decision" key for confirming settings, and outputs a signal corresponding to the operated key to the navigation controller 1.
[0020]
The vehicle position detection unit 5 includes, for example, a GPS receiver, an orientation sensor, a distance sensor, and the like, detects the vehicle position (longitude, latitude) at a predetermined timing, and outputs a detection result.
The display device 6 displays various images such as a map image around the own vehicle position and an intersection guide image based on the drawing data output from the navigation controller 1. The audio unit 7 outputs a guidance voice or the like generated based on the voice signal input from the navigation controller 1 into the vehicle interior.
[0021]
The camera 8 is installed at a predetermined position on the vehicle and photographs the subjects within its viewing angle. In the present embodiment, the camera 8 is installed at the front of the vehicle so that it can photograph white lines drawn on the road, traffic lights installed near intersections, and the like.
Next, a detailed configuration of the navigation controller 1 will be described. As shown in FIG. 1, the navigation controller 1 includes a map buffer 10, a map readout control unit 12, a map drawing unit 14, a vehicle position calculation unit 20, a vehicle position correction unit 22, a route search processing unit 24, a guidance route drawing unit 26, a voice guidance unit 28, a facility search unit 30, an input processing unit 40, and a display processing unit 50.
[0022]
The map buffer 10 temporarily stores map data read from the DVD 2 by the disc reader 3. The map readout control unit 12 outputs a request to read a predetermined range of map data to the disc reader 3 in accordance with the vehicle position calculated by the vehicle position calculation unit 20. Based on the map data stored in the map buffer 10, the map drawing unit 14 performs the drawing processing necessary to display a map image and creates map image drawing data.
[0023]
The vehicle position calculation unit 20 calculates an approximate vehicle position based on the detection data output from the vehicle position detection unit 5, and corrects this position using the correction value output from the vehicle position correction unit 22. The vehicle position correction unit 22 calculates, based on the approximate vehicle position calculated by the vehicle position calculation unit 20 and the image photographed by the camera 8, the error contained in that approximate position, and outputs this error to the vehicle position calculation unit 20 as a correction value.
[0024]
The route search processing unit 24 searches for a travel route connecting the departure point and the destination according to a predetermined search condition as a guide route. The guide route drawing unit 26 generates guide route drawing data for displaying the guide route obtained by the route search by the route search processing unit 24 on the map. The voice guidance unit 28 outputs a voice signal such as intersection guidance necessary for guiding the vehicle along the guidance route obtained by the route search processing by the route search processing unit 24.
[0025]
The facility search unit 30 searches for a facility that satisfies the search condition specified by the user, or extracts detailed information on the facility obtained by the search.
The input processing unit 40 outputs, to each unit in the navigation controller 1, instructions for performing the operations corresponding to the various operation instructions input from the remote control unit 4. The display processing unit 50 receives the map image drawing data created by the map drawing unit 14, the guidance route drawing data created by the guidance route drawing unit 26, and so on, and displays a map image of a predetermined range on the screen of the display device 6 based on these drawing data.
[0026]
FIG. 2 is a diagram illustrating a detailed configuration of the vehicle position correction unit 22. As shown in FIG. 2, the vehicle position correction unit 22 includes an image capture unit 120, a white line extraction unit 121, an in-road specific object extraction unit 122, a template storage unit 123, a lateral relative position calculation unit 124, a vertical relative position calculation unit 125, and a correction value determination unit 126.
[0027]
The image capture unit 120 captures the image data output from the camera 8 at predetermined time intervals. The white line extraction unit 121 extracts the white lines drawn on the road based on the image data captured by the image capture unit 120. In this specification, all lines drawn on the road to indicate the separation of lanes, including dotted lines and yellow lines, are referred to as "white lines".
[0028]
The in-road specific object extraction unit 122 extracts the in-road specific objects contained in the image photographed by the camera 8, based on the image data captured by the image capture unit 120 and the specific-object templates stored in the template storage unit 123. In this embodiment, for example, a traffic light or a guide sign attached to a traffic light is extracted as the in-road specific object.
[0029]
The lateral relative position calculation unit 124 calculates the lateral relative position of the vehicle (shooting position) based on the white lines extracted by the white line extraction unit 121 and the image data captured by the image capture unit 120. Here, the lateral direction means the width direction of the road being traveled, that is, the direction perpendicular to the link corresponding to that road.
[0030]
The vertical relative position calculation unit 125 calculates the vertical relative position of the vehicle (shooting position) based on the in-road specific object extracted by the in-road specific object extraction unit 122 and the image data captured by the image capture unit 120. Here, the vertical direction means the direction along the road being traveled, that is, the direction along the link corresponding to that road.
[0031]
The correction value determination unit 126 obtains the error of the vehicle position calculated by the vehicle position calculation unit 20 using the results of the lateral relative position calculation unit 124 and the vertical relative position calculation unit 125, and sets this error as the correction value.
The vehicle position detection unit 5 described above corresponds to the vehicle position detection means; the image capture unit 120, in-road specific object extraction unit 122, vertical relative position calculation unit 125, and correction value determination unit 126 correspond to the first error calculation means; the image capture unit 120, white line extraction unit 121, lateral relative position calculation unit 124, and correction value determination unit 126 correspond to the second error calculation means; and the vehicle position calculation unit 20 corresponds to the vehicle position correction means.
[0032]
The navigation device of this embodiment is configured as described above; its operation is described next.
FIG. 3 is a flowchart showing an operation procedure for correcting the vehicle position in the navigation device of the present embodiment.
[0033]
When the front of the vehicle is photographed by the camera 8 attached to the vehicle and the resulting image data is captured (step 100), the in-road specific object extraction unit 122 determines whether an image of an in-road specific object exists within the photographed range (step 101). In this embodiment, for example, a guide sign containing an intersection name is the in-road specific object to be detected; using the guide-sign template stored in the template storage unit 123, it is determined whether the photographed image contains a guide-sign image. If it does not, a negative determination is made, the process returns to step 100 without proceeding further, and the photographing and image-capture operations of the camera 8 are repeated. Also, considering a vehicle traveling toward a guide sign installed along the road, the sign that was visible in the distance gradually appears larger as the vehicle advances; because image-processing error is large while the sign still appears small, only guide signs larger than a predetermined size are treated as detected.
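The patent does not specify a matching algorithm, so the template check of step 101 can only be illustrated hypothetically. The sketch below uses a sliding-window sum-of-absolute-differences comparison; the metric, the score threshold, and the size gate are all assumptions, not taken from the specification:

```python
def match_template(image, template):
    """Return (best_score, row, col) for the best placement of `template`
    inside `image`, scored by sum of absolute differences (lower = better).
    Both arguments are 2-D lists of grayscale pixel values."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best = (float("inf"), -1, -1)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            sad = sum(abs(image[r + i][c + j] - template[i][j])
                      for i in range(th) for j in range(tw))
            if sad < best[0]:
                best = (sad, r, c)
    return best

# Hypothetical size gate: signs whose image is still small (i.e. far away)
# are ignored, as the text above requires.  A real system would match the
# template at multiple scales; here the template height stands in for the
# height of the detected sign image.
MIN_SIGN_HEIGHT_PX = 24

def sign_detected(image, template, max_sad=500):
    score, _, _ = match_template(image, template)
    return score <= max_sad and len(template) >= MIN_SIGN_HEIGHT_PX
```

A production system would use scale-robust matching such as multi-scale normalized cross-correlation, but the gate on apparent size corresponds directly to the "larger than a predetermined size" condition above.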
[0034]
If the photographed image contains a guide sign of the predetermined size or larger, an affirmative determination is made in step 101, and the in-road specific object extraction unit 122 then identifies the intersection name contained in the guide sign based on its image (step 102). Since the background color of a guide sign and the shapes and colors of its characters are predetermined by law, the intersection name can easily be identified using conventional character recognition.
[0035]
Next, the vertical relative position calculation unit 125 calculates the distance from the shooting position to the guide sign based on the image of the guide sign (step 103). Since the size (in particular, the height) of a guide sign is fixed, this distance can easily be calculated from the height of the sign's image in the photographed image.
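Under a simple pinhole-camera model, step 103 reduces to a single division. The sign height and focal length below are illustrative assumptions; the statutory sign height and the calibration of camera 8 are not given in this text:

```python
SIGN_HEIGHT_M = 1.2       # assumed real-world height of the guide sign, meters
FOCAL_LENGTH_PX = 1000.0  # assumed focal length of camera 8, in pixels

def distance_to_sign(sign_height_px):
    """Pinhole-model estimate of the distance h (meters) from the shooting
    position to the guide sign, given the height of the sign's image in
    pixels: real height / image height = distance / focal length."""
    return FOCAL_LENGTH_PX * SIGN_HEIGHT_M / sign_height_px
```

As the text notes, the smaller the sign appears, the larger the relative effect of a one-pixel measurement error, which is why only signs above a minimum apparent size are used.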
[0036]
Next, the vertical relative position calculation unit 125 calculates the relative distance from the intersection node identified by the intersection name on the guide sign to the shooting position (step 104).
FIG. 4 is a diagram showing a specific example of how guide signs are installed. In the example of FIG. 4, another road crosses the road on which the vehicle G is traveling, and the intersection corresponds to intersection node N. Focusing on the road traveled by the vehicle G, a traffic light S1 is installed at the position where the vehicle enters the intersection, and a traffic light S2 at the position where it exits, that is, on the far side of the crossing road. A guide sign H, a signboard showing the intersection name, is assumed to be attached to at least one of the traffic lights S1 and S2.
[0037]
FIG. 5 is a diagram illustrating an image photographed by the camera 8 of the vehicle G approaching the intersection shown in FIG. 4. As shown in FIG. 5, the image photographed by the camera 8 contains white-line images such as the center line a, the vehicle outermost line b, and the boundary line c drawn on the road being traveled, as well as images of the traffic lights S1 and S2 and of the guide sign H attached to one of them.
[0038]
Since the height of the guide sign H is determined by law, the distance from the shooting position to the guide sign H can be obtained from the height of the image of the guide sign H in the photographed image; the vertical relative position calculation unit 125 calculates this distance h. When guide signs H are attached to both traffic lights S1 and S2, the distance h is calculated with attention to either one of them.
[0039]
The guide sign H for which the distance h has been calculated may be attached either to the traffic light S1 on the near side of the intersection, as shown in FIG. 4, or to the traffic light S2 on the far side. Therefore, to calculate the relative distance from the intersection node to the shooting position, it must first be determined to which traffic light the guide sign H is attached.
[0040]
In this embodiment, for example, the diameter of each traffic-light lamp in the photographed image is measured, the positions of the two traffic lights S1 and S2 are estimated from those sizes, and the installation position of the guide sign H is determined from the result. That is, since the actual lamp size is fixed, the lamp images of the two traffic lights S1 and S2 differ in size according to their distances, and the distance to each traffic light can be calculated from the size of its lamp image.
[0041]
As shown in FIG. 4, when the distances to the two traffic lights S1 and S2 are h1 and h2, the vertical relative position calculation unit 125 compares the distance h2 from the vehicle G to the far traffic light S2 with the distance h to the guide sign H. If
h2 ≤ h
the guide sign H is judged to be attached to the traffic light S2, and the value obtained by subtracting half the width of the crossing road (W1 / 2) from the distance h between the shooting position and the guide sign H is calculated as the distance from the intersection node to the shooting position. The width W1 of the crossing road is read from the map data stored in the map buffer 10 after the link L2 corresponding to that road is identified from the intersection name specified in step 102.
[0042]
In contrast, if the vertical relative position calculation unit 125 finds
h2 > h
it judges that the guide sign H is attached to the traffic light S1, and calculates as the distance from the intersection node to the shooting position the value obtained by adding half the width of the crossing road (W1 / 2) to the distance h between the shooting position and the guide sign H.
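The two cases of step 104 can be combined into a single function. This is a direct sketch of the comparison described above, with all distances in meters:

```python
def node_distance(h, h2, w1):
    """Distance from the intersection node to the shooting position.

    h  : distance from the shooting position to the guide sign H
    h2 : distance from the shooting position to the far traffic light S2
    w1 : width W1 of the crossing road (read from the map data)
    """
    if h2 <= h:
        # Sign hangs on S2, beyond the node: subtract half the crossing
        # road's width.
        return h - w1 / 2.0
    # Sign hangs on S1, before the node: add half the crossing road's width.
    return h + w1 / 2.0
```

For example, with a 10 m wide crossing road, a sign 50 m away on the far light S2 places the node 45 m ahead, while the same sign on the near light S1 would place it 55 m ahead.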
[0043]
In the above example, the lamp images of the traffic lights are used to calculate the distances h1 and h2 from the shooting position to the traffic lights S1 and S2, but these distances may be obtained by other methods. For example, if the mounting height of the traffic lights is fixed, the height of each traffic light can be determined from the photographed image and the distance to it calculated from that result.
[0044]
Further, a method other than estimating the distances to the traffic lights S1 and S2 may be used to determine whether the guide sign H is on the near or far side of the intersection. For example, when the vehicle G passes straight through an intersection, the positional relationship between the guide sign H and the center of the intersection may be judged from the arrangement of the center line a, the boundary line c, and so on, which are interrupted across the intersection.
[0045]
After the relative distance from the intersection node to the shooting position has been calculated in this way, or in parallel with the processing of steps 102 to 104, the white line extraction unit 121 extracts images of white lines such as the center line a and the boundary line c shown in FIG. 5 (step 105). The lateral relative position calculation unit 124 then calculates, based on the extracted white-line images, the relative position of the shooting position from the center of the road, that is, the vehicle's relative position in the width direction of the road being traveled (step 106).
[0046]
For example, the lateral relative position calculation unit 124 obtains from the white-line image the number of lanes T on the traveling side, the number N of the current lane (numbered 1, 2, ... in order of proximity to the road shoulder), and the relative position P within the lane. Assuming the camera 8 is mounted at the center of the front of the vehicle, P is the relative position of the lower center point o shown in FIG. 5 between the white line on its right (boundary line c) and the white line on its left (vehicle outermost line b), expressed as a percentage: P = 100 × w2 / (w1 + w2), where w1 and w2 are the distances from o to the two white lines. The distance from the center of the road to the shooting position is then calculated using the following formula.
[0047]
W2 (T − N) / (2T) + (W2 / (2T)) × (P / 100)
Here, W2 is the width of the road being traveled; the link L1 corresponding to this road is identified from the intersection name specified in step 102, and W2 is read from the map data stored in the map buffer 10.
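The lateral computation of step 106 maps directly onto the formula above. As a sketch, with lane widths assumed uniform (as the formula itself implies):

```python
def lateral_offset(W2, T, N, P):
    """Distance from the road center to the shooting position.

    W2 : width of the road being traveled (from the map data)
    T  : number of lanes on the traveling side
    N  : current lane number, 1..T counted from the road shoulder
    P  : relative position within the lane, percent (100 * w2 / (w1 + w2))
    """
    lane_width = W2 / (2.0 * T)          # each lane is W2 / (2T) wide
    full_lanes = lane_width * (T - N)    # whole lanes between lane N and the center
    within_lane = lane_width * (P / 100.0)
    return full_lanes + within_lane
```

For instance, on a 14 m road with two lanes per side (W2 = 14, T = 2, lane width 3.5 m), a vehicle halfway across the inner lane (N = 2, P = 50) is 1.75 m from the road center.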
[0048]
Next, based on the distance from the intersection node to the shooting position calculated in step 104 and the distance from the road center to the shooting position calculated in step 106, the correction value determination unit 126 determines the correction value for the vehicle position calculated by the vehicle position calculation unit 20 (step 107). That is, since these calculation results specify the absolute coordinates of the shooting position, the detection error, which is the difference between the vehicle position calculated by the vehicle position calculation unit 20 at the shooting time and the absolute coordinates of the shooting position, is determined as the correction value. Once the correction value is determined, the vehicle position calculation unit 20 corrects the vehicle position using it (step 108).
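Steps 107 and 108 amount to a vector subtraction. A minimal sketch in planar (x, y) coordinates, with purely hypothetical coordinate values:

```python
def correct_position(detected, absolute):
    """Steps 107-108 as a pair: `detected` is the vehicle position from the
    vehicle position calculation unit 20 at the shooting time, `absolute`
    is the shooting position recovered from the image.  Returns the
    detection error (the correction value) and the corrected position."""
    error = (detected[0] - absolute[0], detected[1] - absolute[1])
    # At the shooting instant the corrected position is exactly the
    # recovered absolute position; the same correction value can then be
    # applied to subsequent detections until the next sign is observed.
    corrected = (detected[0] - error[0], detected[1] - error[1])
    return error, corrected
```

With a detected position of (105, 203) and a recovered shooting position of (100, 200), the correction value is (5, 3) and the corrected position is (100, 200).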
[0049]
As described above, in the navigation device of this embodiment, the vehicle position along the traveled road can be corrected based on the guide-sign image photographed by the camera 8, and the position in the width direction of that road can be corrected based on the white-line images, so the vehicle position can be detected with high accuracy.
[0050]
In particular, by using a guide sign whose position is determined relative to an intersection node with specified absolute coordinates, conventional map data can be used as-is, and an increase in cost can be suppressed.
In addition, because the intersection name contained in the guide sign identifies the road being traveled, the one road on which the vehicle is traveling can be determined accurately even when several roads run very close together or overlap vertically.
[0051]
The present invention is not limited to the above embodiment, and various modifications are possible within the scope of its gist. For example, in the above embodiment the vehicle position is corrected using a guide sign attached to a traffic light, but a guide sign attached to another specific object in the road whose installation position is determined by law and whose relative position from the node can therefore be accurately calculated (for example, a crosswalk) may be used instead.
[0052]
Alternatively, the nearby intersection may be identified using the intersection name contained in the guide sign, while the distance from the shooting position to a specific object in the road other than the guide sign, whose size is stipulated by law, is calculated based on the image of that object.
[0053]
In the above embodiment, the vehicle position is corrected in the simplest case, a crossroads where one other road intersects the road on which the vehicle is traveling, but the vehicle position may also be corrected at other intersections such as T-junctions and five-way junctions. In that case, the relative positional relationship between the installation position of the guide sign and the intersection node must be determined in consideration of each intersection shape.
[0054]
Alternatively, since it is not necessary to correct the vehicle position at every intersection, the vehicle position may be corrected only when passing through an intersection having a standard shape as shown in FIG.
In the embodiment described above, the distance from the shooting position to the intersection node is obtained by first calculating the distance from the shooting position to the guide sign and then adding or subtracting half the width of the crossing road. However, the vehicle position may instead be corrected on the assumption that the installation position of the guide sign coincides with the position of the intersection node. This admits an error of up to half the width of the crossing road, but that error is smaller than the detection error of the conventional method using GPS or the like, so the detection accuracy of the vehicle position is still improved. Moreover, the process of determining where at the intersection the guide sign is installed can be omitted, which simplifies the processing.
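As a sketch of the geometry involved (pinhole camera model; all names and the side-of-intersection flag are assumptions for illustration): the distance to the guide sign follows from its legally stipulated size and its apparent size in the image, and the distance to the intersection node is then obtained by adding or subtracting half the crossing road's width, or, in the simplified variant above, by taking the sign distance directly.

```python
# Hypothetical sketch: range to the guide sign from its known physical
# size (pinhole model), then the along-road distance to the intersection
# node. Whether half the crossing-road width is added or subtracted
# depends on which side of the intersection the sign stands; the flag
# below is an illustrative assumption.

def distance_to_sign(focal_px, sign_height_m, sign_height_px):
    """Pinhole model: range = focal length * real height / image height."""
    return focal_px * sign_height_m / sign_height_px

def distance_to_node(d_sign, crossing_road_width_m, sign_beyond_node):
    """Add or subtract half the crossing road's width."""
    half = crossing_road_width_m / 2.0
    return d_sign - half if sign_beyond_node else d_sign + half
```

For example, a 0.5 m sign imaged at 50 pixels by a camera with a 1000-pixel focal length is 10 m away; if the sign stands on the far side of a 4 m crossing road, the node is about 8 m away.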
[0055]
In the above embodiment, the vehicle position is corrected by measuring both the distance along the road on which the vehicle is traveling and the distance in the width direction of that road, but only the distance along the road may be used. Since the road being traveled can be identified from the intersection name contained in the guide sign, the vehicle position can be detected almost accurately even when only the distance along the road is corrected.
[0056]
[Effect of the invention]
As described above, according to the present invention, an error can be calculated and the vehicle position corrected based on an image of a specific object in the road photographed by a camera, so the vehicle position can be detected with high accuracy. In addition, by using a specific object in the road whose relative position is determined with respect to a point whose absolute coordinates are specified, conventional map data can be used as is, and an increase in cost can be suppressed.
[Brief description of the drawings]
FIG. 1 is a diagram illustrating a configuration of a navigation device according to an embodiment.
FIG. 2 is a diagram showing a detailed configuration of a vehicle position correction unit.
FIG. 3 is a flowchart showing an operation procedure for correcting the vehicle position in the navigation device of the present embodiment.
FIG. 4 is a diagram showing a specific example of an installation state of guide signs.
FIG. 5 is a diagram showing an image taken by the camera of a vehicle approaching the intersection shown in FIG. 4.
[Explanation of symbols]
1 Navigation controller
5 Vehicle position detector
8 Camera
10 Map buffer
12 Map readout controller
14 Map drawing part
20 Vehicle position calculator
22 Vehicle position correction unit
120 Image capture unit
121 White line extraction unit
122 In-road specific object extraction unit
124 Horizontal relative position calculation unit
125 Vertical relative position calculation unit
126 Correction value determination unit

Claims (6)

  1. A vehicle position measuring apparatus comprising: vehicle position detecting means for detecting a vehicle position;
    a camera that is installed in a vehicle and photographs a specific object in a road whose relative position is known with respect to a point whose absolute coordinates are specified;
    first error calculating means for calculating an error in the vehicle position detected by the vehicle position detecting means at the time of photographing, based on the image of the specific object in the road photographed by the camera; and
    vehicle position correcting means for correcting the vehicle position detected by the vehicle position detecting means using the error calculated by the first error calculating means;
    wherein the specific object in the road is a guide sign whose relative position is determined by law and which contains an intersection name, and
    the first error calculating means calculates the distance from the shooting position of the camera to the guide sign based on the size of the guide sign, and calculates the error along the road on which the vehicle is traveling using data on the width of another road that intersects that road at the intersection whose name is contained in the guide sign.
  2. In claim 1,
    The point whose absolute coordinates are specified is a node,
    A vehicle position measuring apparatus, wherein node coordinates included in map data are used as the absolute coordinates.
  3. In claim 1 or 2,
    second error calculating means for calculating an error in the vehicle position detected by the vehicle position detecting means at the time of photographing, based on the image of a white line on the road photographed by the camera;
    wherein the vehicle position correcting means corrects the vehicle position detected by the vehicle position detecting means using the errors calculated by the first and second error calculating means.
  4. In claim 3,
    A vehicle position measuring apparatus, wherein the second error calculating means calculates the error by calculating the position in the width direction of the road on which the vehicle is traveling based on the photographed white line image.
  5. A first step of detecting a vehicle position;
    a second step of photographing, with a camera installed in the vehicle, a specific object in a road whose relative position is known with respect to a point whose absolute coordinates are specified;
    a third step of calculating a first error included in the vehicle position detected in the first step, based on the image photographed in the second step; and
    a fourth step of correcting the vehicle position detected in the first step using the first error calculated in the third step;
    wherein the specific object in the road is a guide sign whose relative position is determined by law and which contains an intersection name, and
    in the third step, the distance from the shooting position of the camera to the guide sign is calculated based on the size of the guide sign, and the first error along the road on which the vehicle is traveling is calculated using data on the width of another road that intersects that road at the intersection whose name is contained in the guide sign.
  6. In claim 5,
    a fifth step of calculating a second error included in the vehicle position detected in the first step, based on an image of a white line on the road included in the image photographed by the camera;
    wherein the correction of the vehicle position in the fourth step is performed using both the first and second errors calculated in the third and fifth steps.
JP2002203355A 2002-07-12 2002-07-12 Vehicle position measuring apparatus and method Expired - Fee Related JP3958133B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2002203355A JP3958133B2 (en) 2002-07-12 2002-07-12 Vehicle position measuring apparatus and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2002203355A JP3958133B2 (en) 2002-07-12 2002-07-12 Vehicle position measuring apparatus and method

Publications (2)

Publication Number Publication Date
JP2004045227A JP2004045227A (en) 2004-02-12
JP3958133B2 true JP3958133B2 (en) 2007-08-15

Family

ID=31709237

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2002203355A Expired - Fee Related JP3958133B2 (en) 2002-07-12 2002-07-12 Vehicle position measuring apparatus and method

Country Status (1)

Country Link
JP (1) JP3958133B2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101026778B1 (en) 2011-01-26 2011-04-11 주식회사보다텍 Vehicle image detection apparatus

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4557288B2 (en) * 2005-01-28 2010-10-06 アイシン・エィ・ダブリュ株式会社 Image recognition device, image recognition method, position specifying device using the same, vehicle control device, and navigation device
JP2006287435A (en) * 2005-03-31 2006-10-19 Inkurimento P Kk Information processing apparatus, system thereof, method thereof, program thereof, and recording medium with the program recorded thereon
EP1876411A4 (en) 2005-04-25 2011-06-29 Geo Technical Lab Co Ltd Imaging position analyzing method
JP4903426B2 (en) * 2005-11-30 2012-03-28 アイシン・エィ・ダブリュ株式会社 Image recognition apparatus and method, and vehicle position recognition apparatus and method
JP4731380B2 (en) * 2006-03-31 2011-07-20 アイシン・エィ・ダブリュ株式会社 Self-vehicle position recognition device and self-vehicle position recognition method
JP4975350B2 (en) * 2006-03-31 2012-07-11 三菱電機株式会社 Car navigation system
JP5010844B2 (en) * 2006-03-31 2012-08-29 アイシン・エィ・ダブリュ株式会社 Feature information output device, image recognition device, and own vehicle position recognition device
JP4702149B2 (en) * 2006-04-06 2011-06-15 株式会社日立製作所 Vehicle positioning device
JP2007309670A (en) * 2006-05-16 2007-11-29 Aisin Aw Co Ltd Vehicle position detector
JP4600357B2 (en) * 2006-06-21 2010-12-15 トヨタ自動車株式会社 Positioning device
EP1906339B1 (en) * 2006-09-01 2016-01-13 Harman Becker Automotive Systems GmbH Method for recognizing an object in an image and image recognition device
KR100815153B1 (en) * 2006-11-08 2008-03-19 한국전자통신연구원 Apparatus and method for guiding a cross road of car navigation using a camera
JP4703544B2 (en) * 2006-11-21 2011-06-15 アイシン・エィ・ダブリュ株式会社 Driving assistance device
JP4875509B2 (en) * 2007-02-14 2012-02-15 アイシン・エィ・ダブリュ株式会社 Navigation device and navigation method
JP4953903B2 (en) * 2007-04-26 2012-06-13 三菱電機株式会社 Car navigation system
JP4817019B2 (en) * 2007-06-29 2011-11-16 アイシン・エィ・ダブリュ株式会社 Own vehicle position recognition device and own vehicle position recognition program
JP4902453B2 (en) * 2007-07-27 2012-03-21 インクリメント・ピー株式会社 Traffic regulation information generation apparatus and traffic regulation information generation program
JP4953015B2 (en) * 2007-10-30 2012-06-13 アイシン・エィ・ダブリュ株式会社 Own vehicle position recognition device, own vehicle position recognition program, and navigation device using the same
KR100887721B1 (en) * 2007-11-26 2009-03-12 한국전자통신연구원 Image car navigation system and method
WO2010004689A1 (en) 2008-07-07 2010-01-14 三菱電機株式会社 Vehicle traveling environment detection device
JP2010139478A (en) * 2008-12-15 2010-06-24 Clarion Co Ltd Navigation device
JP5786603B2 (en) * 2011-09-28 2015-09-30 アイシン・エィ・ダブリュ株式会社 Moving body position detection system, moving body position detection apparatus, moving body position detection method, and computer program
KR101919366B1 (en) * 2011-12-22 2019-02-11 한국전자통신연구원 Apparatus and method for recognizing vehicle location using in-vehicle network and image sensor
JP2013145176A (en) * 2012-01-13 2013-07-25 Toshiba Corp Road guide system
JP6031915B2 (en) * 2012-09-26 2016-11-24 株式会社バッファロー Image processing apparatus and program
JP2018030495A (en) * 2016-08-25 2018-03-01 トヨタ自動車株式会社 Vehicle control apparatus
WO2018212302A1 (en) * 2017-05-19 2018-11-22 パイオニア株式会社 Self-position estimation device, control method, program, and storage medium

Also Published As

Publication number Publication date
JP2004045227A (en) 2004-02-12

Similar Documents

Publication Publication Date Title
US6018697A (en) Navigation system for vehicles
US8085984B2 (en) Image recognizing apparatus and method, and position determining apparatus, vehicle controlling apparatus and navigation apparatus using the image recognizing apparatus or method
EP0798539B1 (en) Navigation device
JP4327389B2 (en) Travel lane recognition device
CN100595811C (en) Pavement marking recognition system
EP1975565B1 (en) Road surface feature information collecting apparatus and method
JP4277717B2 (en) Vehicle position estimation device and driving support device using the same
DE69633202T2 (en) Automatic course control system for a vehicle
DE69628102T2 (en) Car navigation system
JP4861850B2 (en) Lane determination device and lane determination method
EP2601480B1 (en) Method and device for determining the position of a vehicle on a carriageway and motor vehicle having such a device
JP4321821B2 (en) Image recognition apparatus and image recognition method
US6385536B2 (en) Navigation apparatus, method for map matching performed in the navigation apparatus, and computer-readable medium storing a program for executing the method
EP1991973B1 (en) Image processing system and method
DE69734736T2 (en) Road sensor and navigation system using this sensor
US6249214B1 (en) Image processing apparatus, image processing method, navigation apparatus, program storage device and computer data signal embodied in carrier wave
US20060235597A1 (en) Driving support method and device
JP2007004669A (en) Vehicle and lane recognizing device
DE19836156B4 (en) Vehicle navigation system and storage medium
US20060233424A1 (en) Vehicle position recognizing device and vehicle position recognizing method
US6173232B1 (en) Vehicle navigation system and a recording medium
CN101675442B (en) Lane determining device, lane determining method and navigation apparatus using the same
JP4513740B2 (en) Route guidance system and route guidance method
DE112005001307B4 (en) Built-in navigation device and method for correcting one's own vehicle position
US8213682B2 (en) Feature information collecting apparatuses, methods, and programs

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20050328

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20070126

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20070206

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20070329

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20070508

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20070509

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110518

Year of fee payment: 4

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120518

Year of fee payment: 5

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130518

Year of fee payment: 6

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140518

Year of fee payment: 7

LAPS Cancellation because of no payment of annual fees