WO2007018145A1 - Road Marking Recognition System - Google Patents
Road Marking Recognition System
- Publication number
- WO2007018145A1 (PCT/JP2006/315491, JP2006315491W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- road marking
- road
- vehicle
- marking
- distance
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096716—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information does not generate an automatic action on the vehicle control
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0244—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using reflecting strips
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G06V20/582—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of traffic signs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096725—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information generates an automatic action on the vehicle control
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096733—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place
- G08G1/09675—Systems involving transmission of highway information, e.g. weather, speed limits where a selection of the information might take place where a selection from the received information takes place in the vehicle
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096783—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a roadside individual element
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
Definitions
- The present invention relates to a road marking recognition system that recognizes road markings formed on a road surface based on images captured by an imaging means, and in particular to a road marking recognition system that can reduce processing load by detecting only road markings whose state satisfies a predetermined criterion.
- Conventionally, an imaging means such as a camera is provided at the front of the vehicle, and driving assistance is performed based on the captured image.
- For example, Patent Document 1 describes a vehicle driving assistance device that detects, from image data captured by a CCD camera installed at the front of the vehicle, a temporary stop line formed on the road on which the host vehicle travels, and performs driving assistance at an intersection.
- Patent Document 1: Japanese Patent Application Laid-Open No. 2004-86363 (pages 8 to 10; FIG. 4)
- However, in the device described above, a temporary stop line to be controlled is always detected directly by a detection means such as a CCD camera, and guidance to the detected control object is provided. Meanwhile, a road marking that is the object of detection is formed on the road surface on which the vehicle travels, and over time part of its paint peels off or becomes thin due to various factors such as friction with tires. If such a road marking is detected from the image picked up by the camera, it is difficult to grasp its outline correctly, and there is a high possibility of erroneous recognition.
- The present invention has been made to solve the above conventional problems. Its purpose is to provide a road marking recognition system that, by excluding in advance road markings that are difficult to detect from among the road markings formed on the road surface, reduces the false recognition rate while performing only the necessary processing, thereby reducing the processing load on the equipment and allowing the system to be configured inexpensively.
- In order to achieve the above object, a road marking recognition system according to claim 1 comprises an imaging means arranged on a vehicle for imaging the periphery of the vehicle, a marking state storage means that stores the state of road markings formed on the road surface, and a road marking detection means that detects, based on the image captured by the imaging means, only road markings whose stored state satisfies a predetermined criterion.
- Here, “road marking” refers to lines, letters, and symbols placed on the road surface in a fixed form using paint, road studs, or the like, for the guidance, warning, regulation, instruction, and other needs of road traffic, such as stop lines and pedestrian crossings.
- The “road marking state” includes the outline of the road marking, its blurring, paint density, color, and the like.
- A road marking recognition system according to claim 2 is the road marking recognition system according to claim 1, further comprising a marking position storage means that stores position information of road markings formed on the road surface, and a current position detection means that detects the current position of the vehicle. The road marking detection means comprises a marking information extracting means that, based on the detection result of the current position detection means and the position information stored in the marking position storage means, extracts from the marking state storage means the information of road markings located within a predetermined range of the current position, and a marking state determining means that determines whether or not a road marking whose information has been extracted by the marking information extracting means is a road marking whose state satisfies a predetermined criterion; the road marking is detected when the marking state determining means determines that a road marking satisfying the criterion exists.
- A road marking recognition system according to claim 3 is the road marking recognition system according to claim 1 or 2, further comprising a distance calculation means that calculates the distance between the road marking detected by the road marking detection means and the vehicle. The road marking has a plurality of measurement start points used when the distance calculation means calculates the distance, and the distance calculation means calculates the distance between the road marking and the vehicle by calculating the distance to the vehicle from a predetermined measurement start point selected, from among the plurality of measurement start points, based on the state of the road marking detected by the road marking detection means.
- A road marking recognition system according to claim 4 is the road marking recognition system according to claim 1 or claim 2, wherein the road marking whose state satisfies the predetermined criterion is a road marking of a lane boundary line located at a connection between a main line of an expressway and an access road, the lane boundary line being in a state of being divided in the length direction.
- A road marking recognition system according to claim 5 is the road marking recognition system according to claim 4, further comprising a marking position storage means that stores position information of road markings formed on the road surface, a current position detection means that detects the current position of the vehicle, and an image recognition means that recognizes the shape of road markings formed on the road surface based on the image captured by the imaging means. The road marking detection means includes a marking information extracting means that, based on the detection result of the current position detection means and the position information stored in the marking position storage means, extracts from the marking state storage means the information of road markings located within a predetermined range of the current position. When the image recognition means recognizes that double lines parallel to each other at a predetermined interval are formed on the road surface, and the marking information extracting means does not extract a road marking corresponding to the double lines, the double lines are detected as part or all of a road marking of a lane boundary line.
- Effects of the Invention [0010]
- In the road marking recognition system according to claim 1, among the road markings formed on the road surface on which the vehicle travels, only a road marking whose state satisfies a predetermined criterion is detected based on the captured image. It is therefore possible to exclude in advance, from the detection targets, road markings in a state that is difficult to detect, and to reduce the false recognition rate in road marking recognition.
- In the road marking recognition system according to claim 2, the road marking is detected only when a road marking whose state satisfies a predetermined criterion exists within a predetermined range of the current position of the vehicle. The detection process using the imaging means can thus be performed only at appropriate timing, minimizing the processing load of the device in detecting road markings. Therefore, an inexpensive system that does not require a separate control unit for image processing can be configured.
- In the road marking recognition system according to claim 3, the distance between the road marking and the vehicle is calculated from a predetermined measurement start point, selected from among the plurality of measurement start points of the road marking based on the detected state of the marking, to the vehicle. A part of the road marking in good condition can thus be selected as the starting point for distance measurement, improving the accuracy of the distance measurement.
- In the road marking recognition system according to claim 4, the road marking of the lane boundary line located at the connection between the main line of the expressway and the access road is detected. Lane boundary markings, which are likely to separate into double lines as the paint peels off, can thus be detected as reliably as possible, and various services such as driving assistance and information provision can be offered using the detection results.
- In the road marking recognition system according to claim 5, when it is recognized from the captured image that double lines parallel to each other at a predetermined interval are formed on the road surface, and no corresponding road marking exists within the predetermined range of the current position of the vehicle, the recognized double lines are detected as part or all of a road marking whose lane boundary line is in a state of being divided in the length direction. Even a road marking that has split into double lines due to peeling of the paint or the like can therefore be detected as the correct type of road marking, improving the detection accuracy.
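As a non-authoritative illustration, the double-line fallback described in this effect might be sketched as follows; the class, the function names, and the spacing threshold are hypothetical, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class DetectedLine:
    lateral_offset_m: float  # lateral position of the line relative to the vehicle

def classify_double_line(lines, db_has_marking_nearby, max_gap_m=0.35):
    """Treat two close parallel lines as a lengthwise-split lane boundary,
    but only when the marking DB has no entry near the current position."""
    if len(lines) != 2 or db_has_marking_nearby:
        return None
    gap = abs(lines[0].lateral_offset_m - lines[1].lateral_offset_m)
    return "lane_boundary" if gap <= max_gap_m else None

pair = [DetectedLine(1.50), DetectedLine(1.70)]
print(classify_double_line(pair, db_has_marking_nearby=False))  # lane_boundary
```

If the DB does hold a marking nearby, the double lines are left to the normal DB-guided detection path instead.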
- FIG. 1 is a schematic configuration diagram of a driving support apparatus according to a first embodiment.
- FIG. 2 is a block diagram schematically showing a control system of the driving support apparatus according to the first embodiment.
- FIG. 3 is a diagram showing a storage area of a road marking DB according to the first embodiment.
- FIG. 4 is an explanatory view showing blur patterns of the “pedestrian crossing ahead” road marking in particular, among the blur patterns used in the driving support apparatus according to the first embodiment.
- FIG. 5 is an explanatory view showing blur patterns of the “pedestrian crossing ahead” road marking in particular, among the blur patterns used in the driving support apparatus according to the first embodiment.
- FIG. 6 is an overhead view showing a vehicle that images road markings.
- FIG. 7 is a side view showing a vehicle that images road markings.
- FIG. 8 is a schematic diagram showing a captured image captured by the rear camera of the vehicle in the state of FIGS. 6 and 7.
- FIG. 9 is a schematic diagram showing a method for calculating the distance from the vehicle to a control object when a road marking is imaged by the rear camera of the vehicle.
- FIG. 10 is a flowchart of a driving support processing program in the driving support apparatus according to the first embodiment.
- FIG. 11 is a bird's-eye view showing a case where a “pedestrian crossing ahead” road marking whose blur pattern is classified as pattern 4 is formed around the vehicle.
- FIG. 12 is a bird's-eye view showing a case where a “pedestrian crossing ahead” road marking whose blur pattern is classified as pattern 8 is formed around the vehicle.
- FIG. 13 is an explanatory view showing blur patterns of the “thick broken line” road marking used in the driving support apparatus according to the second embodiment.
- FIG. 14 is a flowchart of a driving support processing program in the driving support apparatus according to the second embodiment.
- FIG. 15 is an overhead view showing a case where a “thick broken line” road marking whose blur pattern is classified as pattern 4 is formed in front of the vehicle.
- FIG. 16 is a bird's-eye view showing a case where a “thick broken line” road marking whose blur pattern is classified as pattern 5 is formed around the vehicle.
- FIG. 17 is a schematic configuration diagram of a driving support apparatus according to the third embodiment.
- FIG. 1 is a schematic configuration diagram of a driving support apparatus 1 according to the first embodiment.
- The driving support apparatus 1 includes a rear camera (imaging means) 3, a navigation device 4, a vehicle ECU 5, and the like installed on a vehicle 2.
- The rear camera 3 uses a solid-state image sensor such as a CCD, is attached near the upper center of the license plate mounted at the rear of the vehicle 2, and is installed with its line of sight directed 45 degrees downward from the horizontal. When the vehicle is being parked, it images the area behind the vehicle, which is the traveling direction of the vehicle 2, and the captured image (hereinafter referred to as a BGM (back guide monitor) image) is displayed on the liquid crystal display 7 of the navigation device. During normal traveling, on the other hand, it images road markings such as stop lines, pedestrian crossings, and maximum speed limits formed on the road surface around the vehicle 2, and as described later, the distance from the vehicle 2 to a control object that is the target of driving guidance and vehicle control, such as a stop line, an intersection, or a curve entrance, is indirectly calculated based on the captured road marking image.
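The indirect distance calculation mentioned above depends on the camera's known mounting geometry. A minimal flat-ground sketch, assuming a simple pinhole model with a hypothetical mounting height and focal parameters (the patent states only the 45-degree downward line of sight), might look like:

```python
import math

CAM_HEIGHT_M = 0.8                 # assumed mounting height near the license plate
CAM_TILT_RAD = math.radians(45.0)  # depression angle from the horizontal
FOCAL_PX = 600.0                   # assumed focal length in pixels
CY_PX = 240.0                      # assumed optical-centre image row

def row_to_ground_distance(row_px):
    """Distance (m) along the ground from the camera foot to the road point
    imaged at row_px, assuming a flat road surface."""
    angle = CAM_TILT_RAD + math.atan((row_px - CY_PX) / FOCAL_PX)
    if angle <= 0:
        raise ValueError("row maps above the horizon")
    return CAM_HEIGHT_M / math.tan(angle)

# The optical-centre row sees the ground at h / tan(45 deg) = h.
print(round(row_to_ground_distance(240.0), 3))  # 0.8
```

Rows closer to the top of the image map to larger angles above the 45-degree axis and thus to points farther from the vehicle.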
- The navigation device 4 includes a navigation ECU (electronic control unit) 6; a liquid crystal display 7 provided on the center console or panel surface of the vehicle 2 that displays a map and the searched route to the destination; a speaker 8 that outputs voice guidance related to route guidance; a current location detection unit (current position detection means) 9 that identifies the current position and traveling direction of the vehicle 2 on the map; a data recording unit 10 that stores map data and information related to the type and position of road markings formed on the road surface; and a communication device 13 for communicating with an information center or the like.
- In addition to normal route search and route guidance processing, the navigation ECU (road marking detection means, marking information extraction means, marking state determination means, distance calculation means) 6 performs a detection process that detects road markings formed on the road surface on which the vehicle travels based on the captured image from the rear camera 3, a calculation process that indirectly calculates, from the detected road marking, the distance from the vehicle 2 to a control object such as a stop line, an intersection, or a curve entrance, and gives instructions for driving control of the vehicle 2 and route guidance processing based on the calculated distance. The detailed configuration of the navigation ECU 6 will be described later.
- The vehicle ECU 5 is an electronic control unit of the vehicle 2 that controls the operation of the engine, transmission, accelerator, brake, and the like, and is connected to the brake actuator 11 and the accelerator actuator 12. When a predetermined condition is satisfied, the navigation ECU 6 sends a control signal to the brake actuator 11 and the accelerator actuator 12 via the vehicle ECU 5, changing the brake pressure and the amount of air taken into the engine so that the braking force is controlled automatically.
- FIG. 2 is a block diagram schematically showing a control system of the driving support apparatus 1 according to the first embodiment.
- The control system of the driving support apparatus 1 is configured around the navigation device 4 and the vehicle ECU 5, and predetermined peripheral devices are connected to each control means.
- The current location detection unit 9 includes a GPS 31, a geomagnetic sensor 32, a gyro sensor 33, a steering sensor 34, a distance sensor 35, and an altimeter (not shown), and can detect the current position and direction of the vehicle as well as the distance traveled from a predetermined point.
- Specifically, the GPS 31 detects the current position of the vehicle on the earth and the current time by receiving radio waves from artificial satellites, and the geomagnetic sensor 32 detects the direction of the vehicle by measuring geomagnetism.
- The gyro sensor 33 detects the turning angle of the host vehicle.
- As the gyro sensor 33, a gas rate gyro, a vibration gyro, or the like is used. Further, by integrating the turning angle detected by the gyro sensor 33, the vehicle direction can be detected.
- The steering sensor 34 detects the steering angle of the host vehicle.
- As the steering sensor 34, for example, an optical rotation sensor or rotation resistance sensor attached to a rotating portion of the steering wheel (not shown), or an angle sensor attached to a wheel, is used.
- The distance sensor 35 detects the travel distance (and hence the moving speed) based on vehicle speed pulses generated at every constant travel distance.
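The sensor outputs above can be combined for dead reckoning between GPS fixes. The following is an illustrative sketch only; the per-pulse distance and all names are assumptions, not values from the patent:

```python
import math

M_PER_PULSE = 0.4  # assumed travel distance per vehicle speed pulse

def dead_reckon(x, y, heading_rad, pulses, yaw_rate_rad_s, dt_s):
    """Advance the estimated pose by one sensor interval, using the gyro's
    turning rate and the distance sensor's pulse count."""
    heading_rad += yaw_rate_rad_s * dt_s   # integrate turning rate into heading
    dist = pulses * M_PER_PULSE            # odometry from the pulse count
    x += dist * math.cos(heading_rad)
    y += dist * math.sin(heading_rad)
    return x, y, heading_rad

# Drive straight along the x-axis for 10 pulses with no turning.
print(dead_reckon(0.0, 0.0, 0.0, 10, 0.0, 0.1))  # (4.0, 0.0, 0.0)
```

In the device itself, such estimates would be corrected against the GPS and geomagnetic sensor readings.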
- The data recording unit 10 includes a hard disk (not shown) as an external storage device and recording medium, on which are recorded a predetermined program, a map DB 41 that stores route information such as map data and information necessary for map display, and a road marking DB 42 that stores information related to road markings (marking state storage means, marking position storage means), together with a recording head (not shown).
- In the first embodiment, a hard disk is used as the external storage device and storage medium of the data recording unit 10, but besides the hard disk, a magnetic disk such as a flexible disk can be used as the external storage device.
- Memory cards, magnetic tapes, magnetic drums, CDs, MDs, DVDs, optical disks, MOs, IC cards, optical cards, and the like can also be used as external storage devices.
- In the map DB 41, map data for displaying a map, intersection data regarding each intersection, node data regarding node points, road data regarding roads, search data for searching for routes, facility data regarding facilities, point search data for searching for points, and the like are recorded.
- In the road marking DB 42, the type of each road marking (for example, stop line, pedestrian crossing, maximum speed), a blur pattern indicating the paint state of the road marking, specific information for identifying the type of the detected road marking, and coordinate data for specifying the position of the road marking on the map are recorded.
- The road marking DB 42 will be described in detail later with reference to FIG. 3.
- The navigation ECU 6 includes, in addition to a CPU serving as the arithmetic device and control device that controls the entire navigation device 4, internal storage devices such as a RAM, used as working memory when the CPU performs various arithmetic processes and for storing route data when a route is searched, and a ROM recording a control program, a route guidance processing program that searches for a route to the destination and guides along the searched route, and a driving support processing program (see FIG. 10), described later, that calculates the distance to a control object (stop line, intersection, curve entrance, etc.) based on the image captured by the rear camera 3 and supports driving. As the RAM, ROM, and the like, semiconductor memories, magnetic cores, and so on are used.
- As the arithmetic device and control device, an MPU or the like can be used instead of the CPU.
- Further, the navigation ECU 6 includes a GUI control unit 51, a location unit 52, and a route search and guidance processing unit 53, and performs various controls based on information acquired from the rear camera 3, the current location detection unit 9, the data recording unit 10, and each peripheral device.
- The GUI control unit 51, using the map data read from the map DB 41 and the current position obtained by the location unit 52, displays an appropriate map image around the vehicle on the liquid crystal display 7; when route guidance is required, it combines icons, guidance screens, the searched route, and the like with the map image and displays them on the liquid crystal display 7.
- The location unit 52 detects the current absolute position (latitude and longitude) of the vehicle 2. Based on the detected current position and the information stored in the road marking DB 42, if a road marking in good condition, whose paint wear satisfies the predetermined standard, exists within the predetermined range of the vehicle 2 (from 30 m ahead to 20 m behind), the image captured by the rear camera 3 is taken in and analyzed, and the road marking on the road surface is detected and recognized. The distance between the road marking detected from the captured image and the vehicle 2 is then calculated, and from this distance the distance to the control object associated with the road marking is further calculated. According to the calculated distance, the driving of the vehicle 2 is controlled by controlling the brake actuator 11 and the accelerator actuator 12, or driving guidance is provided via the liquid crystal display 7 and the speaker 8.
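The location unit's gating of image analysis described above can be sketched as follows; the record layout, the set of acceptable blur patterns, and all names are illustrative assumptions rather than the patent's actual implementation:

```python
GOOD_PATTERNS = {1, 2, 7}            # assumed blur patterns with usable outlines
RANGE_AHEAD_M, RANGE_BEHIND_M = 30.0, 20.0  # range stated in the description

def should_run_detection(vehicle_pos_m, markings):
    """markings: list of (position_along_road_m, blur_pattern) DB records.
    Returns True only when a well-preserved marking lies within the range
    from 30 m ahead to 20 m behind the vehicle."""
    for pos, pattern in markings:
        offset = pos - vehicle_pos_m
        if -RANGE_BEHIND_M <= offset <= RANGE_AHEAD_M and pattern in GOOD_PATTERNS:
            return True   # capture and analyse the rear camera image
    return False          # skip image processing to save load

db = [(120.0, 8), (150.0, 2)]
print(should_run_detection(130.0, db))  # True: pattern-2 marking 20 m ahead
```

Running the camera analysis only when this check passes is what keeps the image-processing load low enough to avoid a dedicated control unit.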
- The route search and guidance processing unit 53 searches for a route from the current position to the destination based on the node point data and search data stored in the data recording unit 10, and provides guidance along the set guidance route using the liquid crystal display 7 and the speaker 8.
- The navigation ECU 6 is electrically connected to peripheral devices such as the liquid crystal display 7, the speaker 8, and the communication device 13.
- The liquid crystal display 7 displays operation guidance, operation menus, key guidance, the guidance route from the current position to the destination, guidance information along the guidance route, traffic information, news, weather forecasts, the time, mail, TV programs, BGM images captured by the rear camera 3, and the like.
- Instead of the liquid crystal display 7, it is also possible to use a CRT display, a plasma display, or the like, or a hologram device that projects a hologram onto the windshield of the vehicle.
- the speaker 8 outputs voice guidance for guiding traveling along the guidance route based on an instruction from the navigation ECU 6.
- As the voice guidance, for example, “Turn right at the intersection 200 m ahead” or “National Highway No. 00 ahead is congested” is output.
- The voice output from the speaker 8 may include various sound effects and various kinds of guidance information recorded in advance on a tape, in a memory, or the like.
- Guidance on the control object (for example, a warning that a stop line is approaching) is also provided via the liquid crystal display 7 and the speaker 8.
- The communication device 13 is a beacon receiver that receives traffic information, consisting of various information such as traffic jam information, regulation information, parking lot information, traffic accident information, and service area congestion status transmitted from an information center such as a VICS (registered trademark: Vehicle Information and Communication System) center, as radio beacons, optical beacons, and the like via radio beacon devices and optical beacon devices arranged along the road.
- The communication device 13 may also be a network device capable of communicating over communication systems such as a LAN, WAN, intranet, mobile phone network, telephone network, public communication network, or dedicated communication network, and over communication networks such as the Internet.
- Furthermore, the communication device 13 includes an FM receiver that receives FM multiplex information, including information such as news and weather forecasts, as FM multiplex broadcasts via FM broadcasting stations.
- The beacon receiver and the FM receiver are integrated and arranged as a VICS receiver, but they can also be arranged separately.
- The navigation device 4 according to the first embodiment is connected to the information center via the communication device 13 and updates the information stored in the map DB 41 and the road marking DB 42.
- FIG. 3 is a diagram showing a storage area of the road marking DB 42 according to the first embodiment.
- The storage area of the road marking DB 42 consists of the coordinates (position) of the road marking on the map data, the type of the road marking, the blur pattern indicating the paint state of the road marking, the control object associated with the road marking, and the distance from the measurement start point of the road marking (the measurement start point closest to the control object, if there are multiple) to the control object.
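As a rough illustration, one entry of such a storage area might be modeled as below; the field names and types are assumptions for the sketch, since the patent does not specify a concrete schema:

```python
from dataclasses import dataclass

@dataclass
class RoadMarkingRecord:
    coord: tuple            # (latitude, longitude) of the marking on the map data
    marking_type: str       # e.g. "stop line", "pedestrian crossing ahead"
    blur_pattern: int       # 1..8 classification of the paint state
    control_object: str     # e.g. "stop line", "intersection", "curve entrance"
    dist_to_object_m: float # measurement start point -> control object distance

rec = RoadMarkingRecord(
    coord=(35.0, 135.0),
    marking_type="pedestrian crossing ahead",
    blur_pattern=4,
    control_object="stop line",
    dist_to_object_m=12.5,
)
print(rec.blur_pattern)  # 4
```

Storing the blur pattern alongside the position is what lets the system decide, before any image processing, whether a marking is worth detecting.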
- the blur pattern of the road marking stored in the road marking DB will be described with reference to FIG. 4 and FIG. FIG. 4 and FIG. 5 are explanatory diagrams showing the blur pattern of the road marking 60 of “with a pedestrian crossing” in particular, among the blur patterns used in the driving support apparatus 1 according to the first embodiment.
- the blur patterns defined for the road marking 60 of "with a pedestrian crossing" comprise a total of eight patterns, which are classified based on the location and range where the paint is blurred.
- pattern 1 is classified as a pattern with no significant paint blur and a clear overall shape.
- pattern 2 is classified as a state in which a portion 61 where the paint is missing or thinned exists, but the outer shape remains.
- pattern 3 is classified as a state in which there is a blurred portion 61 where the paint is missing or thinned on the front side with respect to the traveling direction of the vehicle.
- pattern 4 is classified as a state in which there is a blurred portion 61 where the paint is missing or thinned on the back side with respect to the traveling direction of the vehicle.
- pattern 5 is classified as a state in which there is a blurred portion 61 where the paint is missing or thinned on the left side with respect to the traveling direction of the vehicle.
- pattern 6 is classified as a state in which there is a blurred portion 61 where the paint is missing or thinned on the right side with respect to the traveling direction of the vehicle.
- Pattern 7 is categorized as a pattern in which part of the paint is missing due to obstacles 62 such as manhole covers and road joints.
- Pattern 8 is classified as a pattern that does not retain its outer shape due to missing or thin paint.
- measurement start points 60A to 60D are set at different positions on the road marking 60 in each of the blur patterns.
- the measurement start points 60A to 60D are provided at the corners and tips of the lines (boundary lines) forming the road marking 60, and are used to calculate the distance between the vehicle 2 and the road marking, as described later.
- four measurement start points 60A to 60D are defined for the road marking 60 of "with a pedestrian crossing".
- for each blur pattern, the measurement start points to be used are set in advance so that only the measurement start points in portions with less blur are used.
- for pattern 1, all of the measurement start points 60A to 60D are set as measurement start points to be used.
- for pattern 2, only the measurement start points 60A and 60B, which can be detected from the boundary line, are set as measurement start points to be used.
- for pattern 3, only the measurement start points 60A and 60C, which can be detected from the boundary line, are set as measurement start points to be used.
- for pattern 4, only the measurement start points 60B and 60D, which can be detected from the boundary line, are set as measurement start points to be used.
- for pattern 5, only the measurement start points 60A and 60B, which can be detected from the boundary line, are set as measurement start points to be used.
- for pattern 6, only the measurement start points 60A and 60B, which can be detected from the boundary line, are set as measurement start points to be used.
- for pattern 7, only the measurement start points 60A and 60B, which can be detected from the boundary line, are set as measurement start points to be used. For pattern 8, no usable measurement start point is set.
- the navigation ECU 6 reads from the road marking DB 42 the blur pattern of the road marking to be detected, selects one measurement start point from the set measurement start points to be used as the starting point of the distance measurement, and calculates the distance from the vehicle 2 to the measurement start point closest in the traveling direction (measurement start point 60A for the road marking 60 of "with a pedestrian crossing" shown in FIGS. 4 and 5) (S7, S8 in FIG. 10). As a result, a portion with less blur can be used as the starting point of the distance measurement, improving the accuracy of the distance measurement. A specific method for calculating the distance will be described later.
- for road markings classified as pattern 8, that is, "road markings that do not retain their outer shape due to missing or thin paint", control is performed so that the road marking detection process is not executed. As a result, road markings that are difficult to detect can be excluded from the detection targets in advance, the misrecognition rate in road marking recognition is reduced, and since only the necessary processing is performed, the processing load of the navigation ECU 6 can be reduced.
- although FIGS. 4 and 5 have been described using the blur patterns of only the road marking "with a pedestrian crossing" as an example, pattern 1 to pattern 8 are set in the same manner for the other road markings (for example, "stop line", "arrow", "pedestrian crossing"), and each road marking recorded in the road marking DB 42 is classified into one of these patterns. The measurement start points to be used are defined for each pattern, and road markings classified into pattern 8 are controlled so that the road marking detection process is not performed.
- for example, a road marking of "with a pedestrian crossing" with blur pattern 2 is formed at coordinates (x1, y1), and a "stop line" road marking is associated with that road marking as a control object 60 m ahead.
- the coordinates (x2, y2) have an "arrow" road marking with blur pattern 8, and an "intersection (intersection node)" is associated with that road marking as a control object 54 m ahead.
- the coordinates (x3, y3) have a "maximum speed" road marking with blur pattern 1, and a "corner (node at the corner starting point)" is associated with that road marking as a control object 72 m ahead.
- the coordinates (x4, y4) have a "pedestrian crossing" road marking with blur pattern 3, and an "intersection (intersection node)" is associated with that road marking as a control object 89 m ahead.
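- as an illustration, the record structure of the road marking DB 42 described above might be sketched as follows (a minimal sketch; the class and field names are assumptions for illustration only, while the example rows reproduce the four entries just described):

```python
from dataclasses import dataclass

@dataclass
class RoadMarkingRecord:
    # Field names are illustrative; the description only lists the stored items.
    coords: tuple          # coordinates (position) of the marking on the map data
    marking_type: str      # type of road marking
    blur_pattern: int      # blur pattern (1-8) indicating the paint state
    control_object: str    # control object associated with the marking
    distance_m: float      # distance from the measurement start point to the control object

# The four example entries described above:
road_marking_db = [
    RoadMarkingRecord(("x1", "y1"), "with a pedestrian crossing", 2, "stop line", 60.0),
    RoadMarkingRecord(("x2", "y2"), "arrow", 8, "intersection (intersection node)", 54.0),
    RoadMarkingRecord(("x3", "y3"), "maximum speed", 1, "corner (node at the corner starting point)", 72.0),
    RoadMarkingRecord(("x4", "y4"), "pedestrian crossing", 3, "intersection (intersection node)", 89.0),
]

# Pattern-8 markings are excluded from the detection targets in advance:
detectable = [r for r in road_marking_db if r.blur_pattern != 8]
```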
- the control object is a target of travel guidance and vehicle control, and is a node point or another road marking located in the traveling direction of the road on which the road marking is formed, within a predetermined section (for example, 10 m to 200 m ahead).
- the navigation ECU 6 indirectly calculates, from the captured image, the distance to the associated control object when the rear camera 3 images one of the road markings recorded in the road marking DB 42, and performs drive control and travel guidance of the vehicle 2 when the distance reaches a predetermined distance.
- the content of the drive control and travel guidance of the vehicle 2 varies depending on the type of the associated control object. For example, when a "stop line" is associated as the control object, when the distance to the stop line reaches 50 m, the character string "Stop line is approaching", indicating that the stop line is approaching, is displayed on the liquid crystal display 7, and a warning sound of the same content is output from the speaker 8. Further, if deceleration is not performed at that time, the brake actuator 11 is controlled to perform deceleration control so that the vehicle 2 stops before the stop line. [0049] If an "intersection" is associated as the control object, route guidance is performed according to the set guidance route when the distance to the node at the relevant intersection reaches 10 m.
- a guidance display indicating a left turn is displayed on the liquid crystal display 7, and a guidance voice “Please turn left at the next intersection” is output from the speaker 8.
- if no guidance route has been set, the guidance display and guidance voice output are not performed.
- next, a case will be described in which the road marking 60 of "with a pedestrian crossing" of blur pattern 1, with which the road marking 69 of a stop line is associated as the control object, is imaged.
- FIG. 6 is a bird's-eye view showing the vehicle 2 imaging the road marking 60, and FIG. 7 is a side view showing the vehicle 2 imaging the road marking 60.
- FIG. 8 is a schematic diagram illustrating a captured image 70 captured by the rear camera 3 of the vehicle 2 in the state of FIGS. 6 and 7.
- the rear camera 3 is mounted near the rear bumper 71 of the vehicle 2 so that it can image the area behind the vehicle, with the optical axis L directed 45 degrees downward from the horizontal direction, and its imaging range is fixed. Therefore, the distance to a subject can be calculated from the position of the image data (specifically, the number of pixels from the lower edge) in the captured image shown in FIG. 8 captured by the rear camera 3.
- on each road marking, measurement start points for measuring the distance to the vehicle 2 are defined in advance at a plurality of locations, and the measurement start points to be used as the starting points of the measurement are further set according to the blur pattern (see FIG. 4 and FIG. 5).
- accordingly, the distance D1 between the vehicle 2 and the measurement start point can be calculated from the position of the measurement start point in the captured image (specifically, the number of pixels from the lower edge to the measurement start point).
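- since the optical axis is fixed at 45 degrees downward and the imaging range is fixed, the pixel count from the lower edge can be mapped to a ground distance. The following is a minimal sketch of such a mapping, assuming a simple pinhole camera over a flat road surface; the camera height, field of view, and resolution values are illustrative assumptions, not values from this description:

```python
import math

def pixel_row_to_distance(pixels_from_lower_edge, cam_height_m=1.0,
                          tilt_deg=45.0, vfov_deg=40.0, image_height_px=480):
    """Map the number of pixels from the lower image edge to a ground distance.

    Assumes a pinhole camera tilted tilt_deg below horizontal; all parameter
    values are illustrative, not taken from the specification.
    """
    # Downward angle of the viewing ray for this pixel row: the lower edge of
    # the image corresponds to the steepest (nearest) ray.
    deg_per_px = vfov_deg / image_height_px
    ray_down_deg = tilt_deg + vfov_deg / 2.0 - pixels_from_lower_edge * deg_per_px
    # Intersect the ray with the flat road surface to obtain the distance.
    return cam_height_m / math.tan(math.radians(ray_down_deg))
```

with these assumed values, a point imaged at the vertical center of the image (on the 45-degree optical axis) lies at a ground distance equal to the camera height, and the distance grows monotonically toward the upper part of the image.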
- the measurement start point to which the distance is calculated is determined for each road marking. For the road marking 60, the distance from the measurement start point 60A is calculated. If the measurement start point 60A cannot be specified, the distance to the measurement start point 60B is determined first, and the distance to the measurement start point 60A is calculated indirectly using the known distance between the measurement start point 60A and the measurement start point 60B. If the measurement start point 60B cannot be specified, the measurement start point 60C is used, and if the measurement start point 60C also cannot be specified, the measurement start point 60D is used.
- in the road marking 60 of "with a pedestrian crossing" of blur pattern 4, the measurement start points 60B and 60D are set as the measurement start points to be used as the starting points of the distance measurement. In this case, the distance to the measurement start point 60B is detected, and the distance to the measurement start point 60A is calculated indirectly. If the measurement start point 60B cannot be specified for some reason (for example, when a part of the white line is hidden by an obstacle such as sand or a puddle), the measurement start point 60D is used. In this manner, it is determined to which of the plurality of measurement start points set to be used the distance is calculated.
- FIG. 9 shows a case where the vehicle 2 detects the road marking 60 of "with a pedestrian crossing" with the rear camera 3, and the road marking 69 of a "stop line" is associated with the road marking 60 as a control object at a distance D2 ahead.
- the navigation ECU 6 calculates the travel distance S of the vehicle 2 with the distance sensor 35, based on the vehicle speed pulses generated at every constant travel distance. Then, the remaining distance (D2 - D1 - S) from the traveling vehicle 2 to the control object can be calculated by subtracting the travel distance S from the distance (D2 - D1) from the vehicle 2 to the control object.
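- the remaining-distance computation itself is simple arithmetic; a minimal sketch (the function name is an assumption for illustration):

```python
def remaining_distance_to_control_object(d1_m, d2_m, travel_s_m):
    """Remaining distance from the moving vehicle to the control object.

    d1_m: distance D1 from the vehicle to the measurement start point (from the image)
    d2_m: distance D2 from the measurement start point to the control object (road marking DB)
    travel_s_m: travel distance S accumulated from vehicle speed pulses since detection
    """
    return (d2_m - d1_m) - travel_s_m
```

for example, with the control object 60 m beyond the measurement start point, the vehicle 8 m short of that start point at detection, and 30 m travelled since, 22 m remain.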
- then, by controlling the brake actuator 11 based on the calculated distance (D2 - D1 - S) to the road marking 69 of the "stop line", the brake pressure can be adjusted so that the vehicle 2 stops at the stop line.
- in this way, by indirectly calculating the distance to the control object ahead from the road marking detected by the rear camera 3, without directly recognizing the control object, the distance (D2 - D1 - S) to the control object can be calculated more accurately and at an earlier stage. Based on the calculated accurate distance (D2 - D1 - S) to the control object, appropriate vehicle control and travel guidance can be performed at a more appropriate timing.
- FIG. 10 is a flowchart of the driving support processing program in the driving support device 1 according to the first embodiment.
- the driving support processing program detects a road marking from the captured image captured by the rear camera 3 when the vehicle 2 travels on the road surface, detects the distance between the vehicle and the control object from the detected road marking, and performs control to assist the user's driving based on that distance.
- the program shown in the flowchart of FIG. 10 below is stored in the ROM or RAM provided in the navigation ECU 6 and is executed by the CPU.
- in step (hereinafter abbreviated as S) 1, the navigation ECU 6 reads, from the road marking DB 42, information on the road markings located in the vicinity of the vehicle 2 (in the first embodiment, from 2000 m forward to 500 m behind the vehicle 2), based on the current location information of the vehicle 2 detected by the current location detection unit 9 and the position information of the road markings recorded in the road marking DB 42 (see FIG. 3).
- in S2, it is determined whether or not there is a road marking located within a predetermined range of the vehicle 2 (from 30 m forward to 20 m behind the vehicle 2) among the road markings read in S1. If it is determined that there is a road marking located within the predetermined range of the vehicle 2 (S2: YES), the process proceeds to S3, and the blur pattern of the road marking located within the predetermined range of the vehicle 2 is read from the road marking DB 42. On the other hand, if it is determined that there is no road marking located within the predetermined range of the vehicle 2 (S2: NO), the process returns to S1, and the road marking information is read again based on the current location. This S3 corresponds to the processing of the sign information extraction means.
- in S4, it is determined whether or not the blur pattern read in S3 is a blur pattern to be detected by the rear camera 3.
- in the first embodiment, eight blur patterns, pattern 1 to pattern 8, are provided (see FIGS. 4 and 5). Road markings classified into the blur patterns of pattern 1 to pattern 7 are road markings that can still be recognized by the navigation ECU 6 even if some paint blurring has occurred, and are determined to be road markings to be detected by the rear camera 3.
- on the other hand, road markings classified into the blur pattern of pattern 8 are road markings that are difficult to recognize because the outer shape cannot be correctly detected due to the blur, and are determined to be road markings that are not detected by the rear camera 3.
- this S4 corresponds to the processing of the marking state determination means.
- FIG. 11 is an overhead view showing a case where a road marking 60 of "with a pedestrian crossing" whose blur pattern is classified as pattern 4 is formed around the vehicle 2. In this case, the image recognition processing of the road marking 60 is performed based on the image captured by the rear camera 3.
- FIG. 12 shows a case where a road marking 60 of "with a pedestrian crossing" whose blur pattern is classified as pattern 8 is formed around the vehicle 2. In this case, the image recognition processing of the road marking 60 based on the image captured by the rear camera 3 is not performed.
- in S5, an image captured by the rear camera 3 is input using analog communication means such as NTSC or digital communication means such as i link, and is converted into a digital image format such as JPEG or MPEG.
- next, luminance correction is performed based on the luminance difference between the road surface on which the road marking in the captured image is drawn and the other road surface.
- then, binarization processing to separate the target road marking from the image, geometric processing to correct distortion, smoothing processing to remove image noise, and the like are performed, and the boundary line between the road marking and the other road surface, as well as the measurement start points, are detected.
- then, the type of the detected road marking is specified from the arrangement of the detected boundary line and the specified measurement start points, and it is further determined whether or not the specified road marking type matches the type of the road marking determined in S2 to be present within the predetermined range of the vehicle.
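- the binarization step of S5, and the extraction of the pixel count from the lower edge used later for the distance calculation, can be illustrated with a minimal sketch (the threshold value and helper names are assumptions; a real implementation would also apply the luminance correction, geometric, and smoothing processing described above):

```python
def binarize(gray_image, threshold=128):
    """Separate bright paint pixels from the darker road surface.

    gray_image: 2-D list of luminance values (0-255); threshold is illustrative.
    Returns a 2-D list of 0/1 values (1 = candidate paint pixel).
    """
    return [[1 if px >= threshold else 0 for px in row] for row in gray_image]

def lowest_paint_row(binary_image):
    """Row index counted from the lower image edge of the nearest detected
    paint pixel, i.e. the pixel count later used for the distance calculation.
    Returns None when no paint pixel is found."""
    for i, row in enumerate(reversed(binary_image)):
        if any(row):
            return i
    return None
```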
- in S6, it is determined whether or not the road marking has been recognized by the image recognition processing in S5. If it is determined that the road marking has been recognized (S6: YES), that is, if the road marking is detected in the captured image and the detected road marking matches the type of the road marking determined in S2 to be at this position, the process proceeds to S7. On the other hand, if it is determined that the road marking has not been recognized (S6: NO), that is, if the road marking has not been detected in the captured image or the detected road marking does not match that type, the process returns to S1, and the road marking information is read again based on the current location.
- S1 to S6 correspond to the processing of the road marking detection means.
- in S7, a measurement start point is selected and detected. For example, when the road marking 60 of "with a pedestrian crossing" is recognized as a road marking located around the vehicle 2, the measurement start point 60A is first selected as the starting point of the distance measurement and is detected based on the boundary line of the road marking. However, if the measurement start point 60A cannot be detected due to an obstacle such as sand or a puddle, another measurement start point is selected and detected in the priority order of the measurement start points 60B, 60C, and 60D. As shown in FIG. 4, when the road marking 60 of "with a pedestrian crossing" classified as blur pattern 4 is recognized as a road marking located around the vehicle 2, the measurement start point 60B is first selected as the starting point of the distance measurement and is detected based on the road marking boundary line. However, if the measurement start point 60B cannot be detected due to an obstacle such as sand or a puddle, the measurement start point 60D is selected and detected.
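- the priority-ordered fallback between measurement start points described above can be sketched as follows (the per-pattern table reproduces the usable start points set in FIGS. 4 and 5; the function and variable names are assumptions for illustration):

```python
# Measurement start points to be used, per blur pattern, for the "with a
# pedestrian crossing" marking (from FIGS. 4 and 5); pattern 8 has none.
USABLE_START_POINTS = {
    1: ["60A", "60B", "60C", "60D"],
    2: ["60A", "60B"],
    3: ["60A", "60C"],
    4: ["60B", "60D"],
    5: ["60A", "60B"],
    6: ["60A", "60B"],
    7: ["60A", "60B"],
    8: [],
}

def select_start_point(blur_pattern, detectable_points):
    """Pick the first usable start point, in priority order, that was actually
    detected in the image (detectable_points: points not hidden by obstacles
    such as sand or puddles). Returns None when no start point can be used."""
    for point in USABLE_START_POINTS.get(blur_pattern, []):
        if point in detectable_points:
            return point
    return None
```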
- in S8, the distance between the detected road marking and the vehicle 2 is calculated. Specifically, the distance D1 between the vehicle 2 and the measurement start point is calculated from the position of the measurement start point detected in S7 (specifically, the number of pixels from the lower edge to the measurement start point) in the captured image of the road marking (see FIG. 8).
- the above S7 to S8 correspond to the processing of the distance calculation means.
- next, the distance (D2 - D1) from the vehicle 2 to the control object associated with the detected road marking is calculated from the distance D1 between the vehicle 2 and the measurement start point calculated in S8, and from the distance D2 to the control object associated with the detected road marking (the value of D2 is stored in advance in the road marking DB 42; see FIG. 3) (see FIG. 9).
- subsequently, the distance sensor 35 calculates the travel distance S of the vehicle 2 from the detection point of the road marking, and based on the previously calculated distance (D2 - D1) between the vehicle 2 and the control object, the remaining distance (D2 - D1 - S) from the traveling vehicle 2 to the control object is calculated (see FIG. 9).
- next, it is determined whether or not the vehicle 2 has reached the guidance or control start point set for each type of control object. For example, if the control object is a "stop line" road marking, it is determined that the guidance or control start point has been reached when the remaining distance is within 50 m. If the control object is an "intersection", it is determined that the guidance or control start point has been reached when the remaining distance is within 10 m. Furthermore, if the control object is a "corner", it is determined that the guidance or control start point has been reached when the remaining distance is within 50 m.
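- this per-type determination can be sketched as a simple lookup (the names are assumptions for illustration; the distances are the example values given above for the first embodiment):

```python
# Guidance/control start thresholds per control object type, as described
# above for the first embodiment.
GUIDANCE_START_DISTANCE_M = {
    "stop line": 50.0,
    "intersection": 10.0,
    "corner": 50.0,
}

def reached_guidance_start(control_object, remaining_m):
    """True when the remaining distance (D2 - D1 - S) is within the guidance
    or control start threshold set for this type of control object."""
    threshold = GUIDANCE_START_DISTANCE_M.get(control_object)
    return threshold is not None and remaining_m <= threshold
```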
- route guidance is performed according to the set guidance route when the distance to the node at the relevant intersection reaches 10 m. For example, a guidance display indicating a left turn is displayed on the liquid crystal display 7, and a guidance voice "Please turn left at the next intersection" is output from the speaker 8.
- as described above, in the driving support device 1 according to the first embodiment, when it is determined that a road marking classified into a blur pattern to be detected exists within a predetermined range from the vehicle 2 (S4: YES), the road marking is recognized from the image captured by the rear camera 3 (S5), the distance from the vehicle 2 to the control object associated with the recognized road marking is calculated (S7 to S10), and when it is determined that the distance to the control object has reached a predetermined distance (S11: YES), travel guidance or vehicle control is performed according to the type of the associated control object (S14). Therefore, it is not necessary to directly detect control objects such as stop lines and intersections, and the distance from the host vehicle to the control object can be calculated indirectly, based on the road marking detection result, at an early stage while the distance to the control object is still long.
- furthermore, since an appropriate measurement start point is selected based on the measurement start points set for each classified blur pattern, and the distance to the measurement start point closest in the traveling direction from the vehicle 2 (measurement start point 60A for the road marking 60 of "with a pedestrian crossing" in FIGS. 4 and 5) is calculated, a portion of the road marking with less blur can be used as the starting point of the distance measurement, and the accuracy of the distance measurement can be improved.
- next, a driving support apparatus according to a second embodiment will be described with reference to the drawings.
- in the following description, the same reference numerals as those of the driving support device 1 according to the first embodiment in FIGS. 1 to 12 denote parts that are the same as or equivalent to those of the driving support device 1 according to the first embodiment.
- the schematic configuration of the driving support apparatus according to the second embodiment is substantially the same as that of the driving support device 1 according to the first embodiment, and the various control processes are also substantially the same.
- however, the second embodiment detects, as a road marking, a lane boundary line consisting of a thick broken line located at the connecting part (branch point, junction) between the main road of an expressway and an access road, and performs control with respect to the associated control object.
- first, the road marking DB 42, in which information related to road markings is stored in the data recording unit 10, will be described with reference to FIG. 3.
- in the second embodiment, the road marking DB 42 is configured by information on road markings of the lane boundary line consisting of a thick broken line located at the connecting part (branch point, junction) between the main road of the expressway and the access road.
- the storage area of the road marking DB 42 consists of the coordinates (position) of the road marking on the map data, the type of road marking, the blur pattern indicating the paint blur state of the road marking, the control object associated with the road marking, and the distance from the road marking to the control object (see FIG. 3).
- FIG. 13 is an explanatory diagram showing, among the blur patterns of the road markings used in the driving support apparatus according to the second embodiment, the blur patterns of the road marking 80 of "thick broken line" in particular.
- the blur patterns defined for the road marking 80 of "thick broken line" in the driving support apparatus comprise a total of five patterns, which are classified based on the location and range where the paint is blurred.
- pattern 1 is classified as a pattern in which there is no large paint blur, the rectangular frame remains completely, and the paint inside is uniform.
- Pattern 2 is classified as a pattern with uneven interior paint, although the rectangular frame remains completely.
- pattern 3 is classified as a state in which there is a blurred portion 81 where the paint is missing or thinned, and a part of the rectangular frame is missing.
- pattern 4 is classified as a pattern that does not appear as a thick broken line due to missing or thin paint.
- pattern 5 is classified as a pattern in which the broken line paint is divided into left and right parts by a crack 82 in the length direction and appears as a double line.
- when the road marking 80 of "thick broken line" is detected from the image captured by the rear camera 3, the navigation ECU 6 can extract from the road marking DB 42 the distance from the detected road marking 80 to the associated control object, and can accurately identify the current position of the vehicle.
- various services such as driving assistance and information provision can then be provided based on the identified accurate current location of the vehicle.
- when there is a blurred portion 81 classified as pattern 3, where the paint is missing or thinned and a part of the rectangular frame is missing, forcibly attempting to recognize the "thick broken line" may cause other types of road markings formed elsewhere to be mistakenly recognized as the "thick broken line" road marking 80. Therefore, the "thick broken line" road marking 80 classified as pattern 3 is controlled so that the detection process is not performed. As a result, road markings that are difficult to detect can be excluded from the detection targets in advance, the misrecognition rate in road marking recognition is reduced, and since only the necessary processing is performed, the processing load of the navigation ECU 6 can be reduced.
- the "thick broken line" road marking classified as pattern 4, whose outer shape is lost due to missing or thin paint, is difficult to recognize from the image captured by the rear camera 3. Therefore, the "thick broken line" road marking 80 classified as pattern 4 is controlled so that the detection process is not performed. As a result, road markings that are difficult to detect can be excluded from the detection targets in advance, the misrecognition rate in road marking recognition is reduced, and since only the necessary processing is performed, the processing load of the navigation ECU 6 can be reduced.
- FIG. 14 is a flowchart of the driving support processing program in the driving support apparatus according to the second embodiment.
- the driving support processing program detects the "thick broken line" road marking from the captured image captured by the rear camera 3 when the vehicle 2 travels on the road surface, detects the distance between the vehicle and the control object from the detected road marking, and performs control to assist the user's driving based on the detected distance.
- the program shown in the flowchart in FIG. 14 below is stored in ROM or RAM provided in the navigation ECU 6 and is executed by the CPU.
- in S101, the navigation ECU 6 reads information on the road markings located in the vicinity of the vehicle 2 from the road marking DB 42, based on the current location information of the vehicle 2 detected by the current location detection unit 9 and the position information of the road markings recorded in the road marking DB 42 (see FIG. 3).
- next, in S102, it is determined whether or not there is a "thick broken line" road marking 80 located within a predetermined range of the vehicle 2 (from 30 m forward to 20 m behind the vehicle 2). If it is determined that there is a "thick broken line" road marking 80 located within the predetermined range of the vehicle 2 (S102: YES), the process proceeds to S103, and the blur pattern of the "thick broken line" road marking 80 located within the predetermined range of the vehicle 2 is read from the road marking DB 42.
- next, it is determined whether or not the blur pattern read in S103 is a blur pattern to be detected by the rear camera 3.
- in the second embodiment, five patterns, pattern 1 to pattern 5, are provided as the blur patterns of the "thick broken line" road marking 80 (see FIG. 13).
- road markings classified into pattern 1 and pattern 2 are road markings that can still be recognized by the navigation ECU 6 even if some paint blurring has occurred, and are determined to be road markings to be detected by the rear camera 3.
- on the other hand, road markings classified into pattern 3 and pattern 4 are road markings that are difficult to recognize because the outer shape cannot be detected correctly due to blurring, and are determined to be road markings that are not detected by the rear camera 3.
- FIG. 15 is an overhead view showing a case where a “thick broken line” road marking in which the blur pattern is classified as pattern 4 is formed in front of the vehicle 2.
- as shown in FIG. 15, when the vehicle 2 is traveling on the main road 85 of the expressway and there is an access road 86 that branches off from the main line 85 in the traveling direction of the vehicle 2, the information of the "thick broken line" road marking 87 formed at the connecting part between the main line 85 and the access road 86 is read from the road marking DB 42 (S101). If the read "thick broken line" road marking 87 is a blurred "thick broken line" road marking classified as pattern 4, as shown in FIG. 15, the subsequent image recognition process (S105) of the road marking 87 based on the image captured by the rear camera 3 is not performed.
- subsequently, in S105, an image of the environment behind the vehicle 2 captured by the rear camera 3 is captured and analyzed, a road marking formed on the road surface on which the vehicle travels is detected, and the type of the detected road marking is determined.
- specifically, the image captured by the rear camera 3 is first input using analog communication means such as NTSC or digital communication means such as i link, and is converted into a digital image format such as JPEG or MPEG.
- next, luminance correction is performed based on the luminance difference between the road surface on which the road marking in the captured image is drawn and the other road surface.
- then, binarization processing to separate the target road marking from the image, geometric processing to correct distortion, smoothing processing to remove image noise, and the like are performed, and the boundary line between the road marking and the other road surface is detected.
- then, the type of the detected road marking is specified from the shape of the detected boundary line.
- S105 corresponds to the processing of the image recognition means.
- next, in S106, it is determined whether or not the "thick broken line" road marking 80 has been recognized by the image recognition processing in S105.
- if it is determined that the "thick broken line" road marking 80 has been recognized (S106: YES), that is, if the road marking is detected in the captured image and the detected road marking matches the "thick broken line" road marking 80 determined in S102 to be located around the host vehicle, the process proceeds to S109.
- on the other hand, if it is determined that the "thick broken line" road marking 80 has not been recognized (S106: NO), the process proceeds to S107.
- in S107, it is determined whether or not a double line, in which lines run parallel at a predetermined interval, has been recognized by the image recognition processing in S105. If it is determined that a double line has been recognized (S107: YES), that is, if a road marking is detected in the captured image and the lines of the detected road marking run parallel at a predetermined interval, the process proceeds to S108. On the other hand, if it is determined that a double line has not been recognized (S107: NO), the process returns to S105, and the image recognition processing based on the image captured by the rear camera 3 is performed again.
- in S108, the navigation ECU 6 determines whether or not the recognized double line can be replaced with the "thick broken line" road marking 80. Specifically, it is first determined whether or not a road marking having another double line exists within a predetermined range of the current position of the vehicle 2 (for example, from 30 m forward to 20 m behind the vehicle 2). If it is determined that there is no road marking with another double line, it is determined that the recognized double line can be replaced with the "thick broken line" road marking 80. As a result, the recognized double line is detected as part or all of the "thick broken line" road marking 80.
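- the substitution decision of S107 to S108 can be sketched as follows (a minimal illustration; the function name and boolean inputs are assumptions):

```python
def treat_double_line_as_thick_broken_line(double_line_recognized,
                                           other_double_line_in_range):
    """S108 sketch: a recognized double line is treated as the "thick broken
    line" road marking 80 only when no other double-line road marking exists
    within the predetermined range of the vehicle's current position."""
    return double_line_recognized and not other_double_line_in_range
```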
- FIG. 16 is a bird's-eye view showing a case where a “thick broken line” road marking whose blur pattern is classified as pattern 5 is formed ahead of the vehicle 2.
- When the vehicle 2 is traveling on the main line 91 of an expressway and there is an access road 92 branching from the main line 91 in the traveling direction of the vehicle 2, the information on the “thick broken line” road marking 93 formed at the connection between the main line 91 and the access road 92 is read from the road marking DB 42 (S101).
- The read “thick broken line” road marking 93 is classified into blur pattern 5, in which the paint is divided in the length direction, as shown in FIG.
- Therefore, when the image recognition processing of the marking 93 is performed, the “thick broken line” road marking 93 is treated as recognized even if only a double line is recognized. As a result, even when the marking has worn down into a double line, the “thick broken line” road marking 93 can be detected accurately.
- The above S101 to S108 correspond to the processing of the road marking detection means.
- In S109, the navigation ECU 6 performs determination processing for detecting whether or not the vehicle 2 has straddled the “thick broken line” road marking 80 detected in S101 to S108.
- For this determination, the image captured by the rear camera 3, the vehicle speed sensor, and the like are used.
- In S111, the navigation ECU 6 reads from the road marking DB 42 the distance to the control object associated with the “thick broken line” road marking 80 that was detected by the rear camera 3 and determined to have been straddled, and calculates the distance from the vehicle 2 to that control object. Then, based on the vehicle speed pulses generated by the distance sensor 35 at every fixed travel distance, the travel distance of the vehicle 2 from the point where the road marking was straddled is calculated, and the remaining distance from the traveling vehicle 2 to the control object is obtained. [0099] In S112, based on the remaining distance to the control object calculated in S111, it is determined whether or not the vehicle 2 has reached the guidance or control start point set for each type of control object.
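The remaining-distance computation of S111 can be sketched as below. The patent only states that the distance sensor 35 generates a pulse at every fixed travel distance; the per-pulse distance `PULSE_DISTANCE_M` and the function name are assumptions for illustration.

```python
PULSE_DISTANCE_M = 0.4  # assumed distance per vehicle-speed pulse (illustrative value)

def remaining_distance_m(db_distance_to_object_m, pulses_since_straddle):
    """Remaining distance from the vehicle to the control object (S111):
    the marking-to-object distance stored in the road marking DB, minus
    the distance travelled since the marking was straddled, counted in
    vehicle-speed pulses."""
    travelled = pulses_since_straddle * PULSE_DISTANCE_M
    return max(0.0, db_distance_to_object_m - travelled)
```

For example, with 100 m stored in the DB and 50 pulses counted, 20 m have been travelled and 80 m remain.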
- When the control object is a “stop line” road marking, it is determined that the guidance or control start point has been reached when the remaining distance is within 50 m.
- When the control object is an “intersection” road marking, it is determined that the guidance or control start point has been reached when the remaining distance is within 10 m.
- When the control object is a “corner” road marking, route guidance according to the set guidance route is performed when the distance to the node of the relevant intersection reaches 10 m. For example, a guidance display indicating a left turn is shown on the liquid crystal display 7, and the guidance voice “Please turn left at the next intersection” is output from the speaker 8.
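The per-type start points above can be expressed as a small lookup. The 50 m and 10 m figures come from the text; the dictionary keys and function name are illustrative.

```python
# Guidance/control start distances per control-object type, from the text:
# 50 m for a stop line, 10 m for an intersection or corner.
START_DISTANCE_M = {
    "stop_line": 50.0,
    "intersection": 10.0,
    "corner": 10.0,
}

def reached_start_point(object_type, remaining_m):
    """True when the remaining distance has fallen within the guidance or
    control start distance set for this type of control object (S112)."""
    return remaining_m <= START_DISTANCE_M[object_type]
```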
- In S116, it is determined whether or not the remaining distance to the control object calculated in S111 has become 0, that is, whether or not the vehicle 2 has reached the position of the control object. When it is determined that the position of the control object has been reached (S116: YES), the driving support processing is terminated. In contrast, when it is determined that the position of the control object has not been reached (S116: NO), the process returns to S111, and the remaining distance from the current vehicle 2 to the position of the control object is calculated again.
- As described above, in the second embodiment, when it is determined that a “thick broken line” road marking forming the lane boundary is located at the connection (branch or merge) between the main line and the access road within the predetermined range from the vehicle 2 (S102: YES), and it is further determined that the “thick broken line” road marking is classified into a blur pattern to be detected (S104: YES), the road marking is recognized from the image captured by the rear camera 3 (S105). When it is determined that the vehicle 2 has straddled the recognized “thick broken line” road marking, the distance to the control object associated with the marking is calculated (S111), and when it is determined that the distance to the control object has reached the predetermined distance (S112: YES), driving guidance or vehicle control is performed according to the type of the associated control object (S115).
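The decision chain just summarized (S102, S104, S105, S109, S111, S112, S115) can be written as one pass of a loop like the following. Every callable here is a stand-in for the corresponding flowchart step, not an actual API of the device.

```python
def driving_support_pass(marking_nearby, pattern_detectable, recognize,
                         straddled, distance_to_object, start_distance_m,
                         act):
    """One pass of the driving-support decision chain; each argument is a
    callable standing in for a step of the flowchart (S102..S115)."""
    if not marking_nearby():           # S102: marking within range of vehicle?
        return "no_marking"
    if not pattern_detectable():       # S104: blur pattern detectable?
        return "undetectable_pattern"
    if not recognize():                # S105/S106: recognized in captured image?
        return "not_recognized"
    if not straddled():                # S109: vehicle straddled the marking?
        return "not_straddled"
    remaining = distance_to_object()   # S111: remaining distance to object
    if remaining <= start_distance_m:  # S112: start point reached?
        act()                          # S115: guidance or vehicle control
        return "acted"
    return "waiting"
```

A pass that fails any early check simply returns without acting, matching the flowchart's returns to earlier steps.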
- Consequently, there is no need to directly detect control objects such as stop lines and intersections; the distance from the vehicle to the control object can be calculated accurately, though indirectly, from the road marking detection result at an early stage while the distance is still long. It is therefore possible to reliably perform control on the control object without requiring an expensive device, such as an imaging device using a front camera, for imaging distant locations.
- The schematic configuration of the driving support device 100 according to the third embodiment is substantially the same as that of the driving support device 1 according to the first embodiment, and its various control processes are also almost the same.
- Whereas the driving support device 1 according to the first embodiment is provided with the rear camera 3, which captures the environment behind the vehicle, as its imaging unit, recognizes road markings based on the image captured by the rear camera, and performs control on the control object, the driving support device 100 is provided with a front camera 101, which captures the environment ahead of the vehicle 2, as its imaging unit. It differs from the driving support device 1 according to the first embodiment in that it recognizes road markings based on the image captured by the front camera 101 and performs control on the control object.
- FIG. 17 is a schematic configuration diagram of the driving support apparatus 100 according to the third embodiment.
- The driving support device 100 according to the third embodiment includes a front camera 101, a rear camera 3, a navigation device 4, a vehicle ECU 5, and the like installed on the vehicle 2.
- The front camera 101 uses a solid-state image sensor such as a CCD, is attached near the upper center of the license plate on the front of the vehicle 2, and is installed with its line of sight directed slightly downward from the horizontal. It images traffic lights, road signs, road markings, and the like located ahead of the vehicle 2.
- The configuration of the rear camera 3, the navigation device 4, and the vehicle ECU 5, other than the front camera 101, is the same as in the driving support device 1 according to the first embodiment described above, and its description is omitted.
- In the driving support device 100, based on the image captured by the front camera 101, it becomes possible to enlarge the range of control objects and to improve the recognition rate of road markings, as follows.
- For example, in addition to performing driving guidance and vehicle drive control (S11 to S14, S112 to S115) according to a control object such as the “intersection” described above, the device can warn that the traffic light at the intersection is red and control the brake actuator 11 so as to stop the vehicle 2 before the intersection.
- As described above, in the driving support device 100, when it is determined that a road marking classified into a blur pattern to be detected exists within the predetermined range from the vehicle 2, the distance from the vehicle 2 to the control object associated with the recognized road marking is calculated, and when the distance to the control object reaches the predetermined distance, driving guidance or vehicle control is performed according to the type of the associated control object. There is therefore no need to directly detect control objects such as stop lines and intersections.
- The distance from the vehicle to the control object can thus be calculated accurately, though indirectly, from the road marking detection result at an early stage while the control object is still far away.
- Furthermore, since road markings are recognized in advance by the front camera 101, the recognition rate of road markings can be improved even when the rear camera 3, with its narrow field of view, is used.
- In the above embodiments, the control object is a stop line, an intersection, or a corner entrance, but the control object is not limited to these; road markings such as pedestrian crossings, or facilities such as interchanges, may also be used.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/922,779 US8600655B2 (en) | 2005-08-05 | 2006-08-04 | Road marking recognition system |
DE112006001703T DE112006001703T5 (de) | 2005-08-05 | 2006-08-04 | Straßenmarkierungserkennungssystem |
CN200680021501A CN100595811C (zh) | 2005-08-05 | 2006-08-04 | 路面标示识别系统 |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005228900 | 2005-08-05 | ||
JP2005-228900 | 2005-08-05 | ||
JP2006212481A JP4820712B2 (ja) | 2005-08-05 | 2006-08-03 | 路面標示認識システム |
JP2006-212481 | 2006-08-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007018145A1 true WO2007018145A1 (ja) | 2007-02-15 |
Family
ID=37727328
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/315491 WO2007018145A1 (ja) | 2005-08-05 | 2006-08-04 | 路面標示認識システム |
Country Status (5)
Country | Link |
---|---|
US (1) | US8600655B2 (ja) |
JP (1) | JP4820712B2 (ja) |
CN (1) | CN100595811C (ja) |
DE (1) | DE112006001703T5 (ja) |
WO (1) | WO2007018145A1 (ja) |