JP2008165610A - Road section line recognition device - Google Patents

Road section line recognition device

Info

Publication number
JP2008165610A
JP2008165610A (application number JP2006356220A)
Authority
JP
Japan
Prior art keywords
target
road
recognition
imaging
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2006356220A
Other languages
Japanese (ja)
Inventor
Mayumi Kato
真弓 加藤
Original Assignee
Toyota Motor Corp
トヨタ自動車株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toyota Motor Corp, トヨタ自動車株式会社 filed Critical Toyota Motor Corp
Priority to JP2006356220A priority Critical patent/JP2008165610A/en
Publication of JP2008165610A publication Critical patent/JP2008165610A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06K: RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00: Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/00624: Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
    • G06K 9/00791: Recognising scenes perceived from the perspective of a land vehicle, e.g. recognising lanes, obstacles or traffic signs on road scenes
    • G06K 9/00798: Recognition of lanes or road borders, e.g. of lane markings, or recognition of the driver's driving pattern in relation to lanes perceived from the vehicle; analysis of car trajectory relative to detected road

Abstract

A road lane marking recognition device capable of appropriately recognizing road lane markings while reducing the processing load.
The device includes an imaging means that captures an image of the periphery of a vehicle and recognizes road lane markings by analyzing the captured image of the imaging means. A first target position specifying means specifies the position of a first target laid on the road. On a road demarcated by the first target and a second target different from the first target, when recognizing a road lane marking based on the position of the second target in the captured image of the imaging means, the device sets a second target recognition image analysis region for recognizing the second target on the near side or the far side of the position on the captured image that corresponds to the position of the first target specified by the first target position specifying means.
[Selection] Figure 1

Description

  The present invention relates to a road lane marking recognition device that recognizes road lane markings by analyzing a captured image of an imaging means that images the periphery of a vehicle.
  In recent years, control systems that perform automatic steering control to keep the vehicle in its traveling lane have become known under names such as LKA (Lane Keeping Assist) (see, for example, Non-Patent Document 1). A key requirement of such a control system is to accurately recognize the road lane markings (lane markers) that demarcate the traveling lane and to grasp the positional relationship between the traveling lane and the host vehicle.
  Road lane markings are recognized mainly by analyzing the image of a camera or similar device that captures the periphery of the vehicle. For example, when recognizing a solid or broken line drawn in white or yellow paint (hereinafter, a white line or the like), a predetermined area in the lower part of the image is analyzed (excluding the upper part of the image and the left and right edges, where the road is not imaged). Points with a strong luminance change, which appear at the boundary between the white line and the road surface, are extracted, and a straight line through the extracted points is recognized as the outline of the road lane marking. In practice, instead of evaluating the luminance change over the entire predetermined area, it suffices to set a plurality of scanning lines in the image width direction at appropriate intervals within the area and evaluate the luminance change only along those scanning lines. By setting the scanning line interval appropriately, it is therefore possible to reduce the processing load of the apparatus while maintaining recognition accuracy. Because in-vehicle devices face severe constraints on installation space, cost, and weight, it is difficult to mount a large processing device, and reducing the processing load is highly important.
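The scanning-line approach described above can be sketched as follows. This is an illustrative fragment, not the patented implementation: the function name, the synthetic image, and the threshold value are all assumptions made for the example.

```python
import numpy as np

def extract_feature_points(image, scan_rows, threshold):
    """Scan selected rows of a grayscale image and return (row, col)
    points where the horizontal luminance change exceeds `threshold`."""
    points = []
    for r in scan_rows:
        row = image[r].astype(np.int32)
        diff = np.abs(np.diff(row))          # luminance change between horizontal neighbours
        cols = np.flatnonzero(diff >= threshold)
        points.extend((r, c) for c in cols)
    return points

# Synthetic 8x8 image with a bright vertical "white line" at columns 3-4
img = np.zeros((8, 8), dtype=np.uint8)
img[:, 3:5] = 200
pts = extract_feature_points(img, scan_rows=[2, 5], threshold=100)
```

Only the two chosen scan rows are examined, so the cost scales with the number of scanning lines rather than with the full area of the region.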
On the other hand, an object detection device has been disclosed that includes a millimeter-wave radar (a type of radar device) and a camera, limits the image recognition area based on the output of the millimeter-wave radar, and detects objects by performing image processing within that area (see, for example, Patent Document 1).
Patent Document 1: JP 2001-296357 A
Non-Patent Document 1: Toyota Motor Corporation, "Crown Majesta New Model Car Description (Part No. 7109100)", Toyota Motor Corporation Service Department, July 5, 2004, Chapter 10 Body & Electrical, pp. 10-287 to 10-306
  Incidentally, besides white lines and the like, there are dot-type road lane markings such as Botts' dots and cat's eyes. Botts' dots are ceramic discs about 10 cm in diameter, used mainly in North America, that are embedded in the road at intervals. A cat's eye is a reflector embedded in the road at intervals, which has the property of reflecting incident light back in the direction from which it came.
  When recognizing Botts' dots and the like by image analysis, one possible method is to perform pattern matching and morphological operations and then recognize a virtual straight line (or curve) connecting the resulting positions of the Botts' dots as the road lane marking. However, Botts' dots and cat's eyes are laid on the road intermittently and do not appear as densely in the image as white lines do. Therefore, if scanning lines in the image width direction are set at appropriate intervals, as when recognizing a white line, many Botts' dots and cat's eyes fall between the scanning lines, and a sufficient number of their positions may not be obtained. In that case a virtual straight line (or curve) cannot be generated with sufficient accuracy; yet if pattern matching and the like are instead performed over the entire predetermined area, the processing load of the apparatus increases.
  Here, one might consider reducing the processing load by applying the device described in Patent Document 1 to a recognition device for Botts' dots and the like, thereby limiting the image analysis region.
  However, the device described in Patent Document 1 is mainly intended to recognize so-called obstacles, such as other vehicles around the host vehicle, and it is inappropriate to divert its method of setting the image analysis region to the recognition of road lane markings. This is because setting the image analysis region based on the output of the millimeter-wave radar limits the region in the azimuth direction as viewed from the vehicle, that is, in the image width direction, whereas for road lane markings the region should instead be limited in the longitudinal direction of the road, that is, in the depth direction of the image.
  The present invention has been made to solve this problem, and its object is to provide a road lane marking recognition device that can appropriately recognize road lane markings while reducing the processing load.
  One aspect of the present invention for achieving the above object is a road lane marking recognition device that includes an imaging means for imaging the periphery of a vehicle and recognizes road lane markings by analyzing the captured image of the imaging means. The device comprises a first target position specifying means for specifying the position of a first target laid on the road. On a road demarcated by the first target and a second target different from the first target, when recognizing a road lane marking based on the position of the second target in the captured image, the device sets a second target recognition image analysis region for recognizing the second target on the near side or the far side of the position in the captured image that corresponds to the position of the first target specified by the first target position specifying means. Here, "target" is a concept that does not include a solid or broken line drawn in white or yellow paint.
  According to this aspect, when recognizing a target for which processing by setting scanning lines is unsuitable, the image analysis region for recognizing the second target is set on the near side or the far side of the position in the captured image corresponding to the position of the first target specified by the first target position specifying means, so the road lane marking can be recognized appropriately while the processing load is reduced.
  In one aspect of the present invention, on a road demarcated by the first target and a second target different from the first target, when recognizing a road lane marking based on the position of the second target in the captured image and the first target position specifying means specifies the positions of a plurality of first targets in the road longitudinal direction, the image analysis region for recognizing the second target may be set between the positions in the captured image corresponding to the positions of the plurality of first targets.
  In one aspect of the present invention, on a road demarcated by the first target and a second target different from the first target, when recognizing a road lane marking based on the position of the first target in the captured image, an image analysis region for recognizing the first target may be set on the captured image, centered on the position corresponding to the position of the first target specified by the first target position specifying means.
  Further, in one aspect of the present invention, the imaging means and the first target position specifying means may operate periodically, and a captured image taken at a timing corresponding to a timing at which the first target position specifying means did not specify the position of the first target may be excluded from analysis.
  According to the present invention, a road lane marking recognition device that can appropriately recognize road lane markings while reducing the processing load can be provided.
  Hereinafter, the best mode for carrying out the present invention will be described with reference to the accompanying drawings.
[Configuration]
Hereinafter, a road lane marking recognition device 1 according to an embodiment of the present invention will be described. FIG. 1 shows an example of the overall configuration of the road lane marking recognition device 1. The device includes, as its main components, a front camera 10, a radar device 20, an LKA ECU (Electronic Control Unit) 30, and a main switch 40. A steering device 50, which uses the output of this device, is also illustrated. The arrows in the figure show the flow of the main information exchanged within the device over a multiplex communication line or the like, using an appropriate communication protocol such as CAN (Controller Area Network), BEAN, AVC-LAN, or FlexRay.
  The front camera 10 is, for example, a camera using an image sensor such as a CCD or CMOS, disposed at the upper center of the windshield with its optical axis directed obliquely downward ahead of the vehicle, and it images the road ahead of the vehicle. The captured image of the front camera 10 is transmitted to the LKA ECU 30 as an image signal generated by an interlaced method such as NTSC (National Television Standards Committee).
  The radar device 20 is, for example, a millimeter-wave radar device disposed behind the front grille; it detects the distance, direction, and speed of an object from the time until the reflected millimeter wave returns, the angle of the reflected wave, and the frequency shift. The radar device 20 performs such detection periodically and transmits information on detected objects, in particular the position of cat's eyes laid on the road (the relative position as seen from the host vehicle), to the LKA ECU 30. The radar device 20 may be shared with a radar device used for well-known inter-vehicle distance control or collision prediction control, or may be provided exclusively for the road lane marking recognition device 1 of this embodiment. Besides a millimeter-wave radar device, a laser radar, an infrared radar, a sound wave radar (sonar), a stereo camera device, or the like is conceivable as a means of detecting the distance to an object.
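As a rough illustration of how a radar return might be turned into the relative position the ECU receives, assuming a simple time-of-flight and azimuth model (the function and its inputs are hypothetical, not taken from the patent):

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def radar_to_relative_position(round_trip_time_s, azimuth_rad):
    """Convert a millimeter-wave round-trip time and reflected-wave angle
    into a (forward, lateral) position relative to the host vehicle."""
    rng = C * round_trip_time_s / 2.0        # one-way range [m]
    forward = rng * math.cos(azimuth_rad)
    lateral = rng * math.sin(azimuth_rad)
    return forward, lateral

# A reflector 30 m dead ahead: round trip of roughly 0.2 microseconds
fwd, lat = radar_to_relative_position(2 * 30.0 / C, 0.0)
```

A real sensor would add frequency-shift (Doppler) processing for relative speed, which this sketch omits.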
  The LKA ECU 30 is, for example, a computer unit in which a ROM, a RAM, and the like are connected to one another via a bus centered on a CPU, and it further includes storage media such as an HDD (Hard Disk Drive) or DVD (Digital Versatile Disc), I/O ports, timers, counters, and so on. The ROM stores the programs and data executed by the CPU. The LKA ECU 30 is activated, for example, by a user operation of the main switch 40 disposed at the side of the steering column, and transmits a steering signal to the steering device 50 based on analysis of the captured image from the front camera 10. The specifics of the steering signal transmission are described later.
  The LKA ECU 30 includes an image recognition unit 32 and a steering signal generation unit 34 as its main functional blocks, which function when the CPU loads and executes a program stored in the ROM.
  The image recognition unit 32 uses the cat's eye positions transmitted from the radar device 20 as auxiliary information, analyzes the image signal transmitted from the front camera 10, and recognizes road lane markings, that is, the positional relationship between the road lane markings and the host vehicle. The recognition processing differs in analysis region and method depending on the type of road lane marking. This embodiment can handle both road lane markings in which the road is demarcated by a white or yellow solid or broken line (hereinafter, a white line or the like) and road lane markings in which the road is demarcated by Botts' dots and cat's eyes laid together. The latter reflects the fact that Botts' dots are relatively hard to see at night, so cat's eyes, which are easily visible at night because they reflect the light of the headlights, are often provided alongside them.
[1. Recognition method when the road is demarcated by a white line or the like]
In this case, the image recognition unit 32 first sets a basic analysis region in the lower part of the captured image of the front camera 10. FIG. 2 illustrates the basic analysis region set on the captured image of the front camera 10 when recognizing a white line or the like. The basic analysis region is set with enough depth to recognize the road lane marking sufficiently accurately, starting from the front boundary line, which is the nearest line at which the road can be imaged given the presence of the vehicle's nose (for example, the region on the image corresponding to several meters to several tens of meters ahead of the host vehicle in the real coordinate system, that is, the coordinate system viewed from above). In addition, the left and right edges are narrowed slightly toward the center to reduce the influence of noise elements unnecessary for image recognition, such as pedestrians and buildings.
  Once the basic analysis region is set, a plurality of scanning lines in the image width direction are set at appropriate intervals within the region, and points whose horizontal luminance change is greater than or equal to a threshold (feature points) are extracted on each scanning line (see FIG. 3). Here, "appropriate intervals" may be equal intervals on the image, or intervals on the image corresponding to equal intervals on the road. Among the feature points, those aligned in a straight line or curve for at least a predetermined length, extracted using a line extraction method such as the Hough transform, are recognized as a road lane marking. Then, through conversion from the image coordinate system to the real coordinate system, and curvature estimation in the case of a curved road, the positional relationship between the traveling lane and the host vehicle is derived (represented, for example, by the yaw angle and the offset). Here, the yaw angle is the deviation between the direction of the traveling lane and the heading of the host vehicle, and the offset is the deviation of the vehicle center from the traveling lane center line (see FIG. 4).
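The line-extraction step can be illustrated with a minimal Hough transform over extracted feature points. This is a generic sketch of the standard (theta, rho) voting scheme, not the patent's implementation; the bin counts and the synthetic points are arbitrary choices for the example.

```python
import numpy as np

def hough_lines(points, img_diag, n_theta=180, n_rho=200):
    """Minimal Hough transform: vote each (row, col) point into a
    (theta, rho) accumulator and return the best-supported line."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_theta, n_rho), dtype=np.int32)
    for r, c in points:
        # rho = x*cos(theta) + y*sin(theta), with x = col and y = row
        rhos = c * np.cos(thetas) + r * np.sin(thetas)
        idx = np.round((rhos + img_diag) * (n_rho - 1) / (2 * img_diag)).astype(int)
        acc[np.arange(n_theta), idx] += 1
    t, q = np.unravel_index(acc.argmax(), acc.shape)
    rho = q * 2 * img_diag / (n_rho - 1) - img_diag
    return thetas[t], rho, acc.max()

# Collinear feature points on the vertical line col == 4
pts = [(r, 4) for r in range(10)]
theta, rho, votes = hough_lines(pts, img_diag=20.0)
```

All ten collinear points vote for the same (theta, rho) cell, so the strongest accumulator cell recovers the line despite any outliers that might also be present.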
  Thus, when recognizing a white line or the like, feature points are extracted on the set scanning lines instead of scanning the entire basic analysis region. Since a white line or the like has sufficient length in the image depth direction, a sufficient number of feature points can be extracted even with this method. It is therefore possible both to reduce the processing load of the apparatus and to maintain recognition accuracy.
[2. Recognition method when the road is demarcated by Botts' dots and cat's eyes]
However, it is difficult to apply the method of [1] as-is to roads demarcated by Botts' dots and cat's eyes. Because Botts' dots and cat's eyes are laid on the road intermittently and do not appear as densely in the image as white lines, if scanning lines in the image width direction are set at appropriate intervals as in [1], many Botts' dots and cat's eyes fall between them and a sufficient number of positions cannot be obtained. Therefore, analysis processing must in principle be performed over the whole of a given image analysis region without setting scanning lines. However, if analysis processing is performed over a relatively wide area such as the basic analysis region, the processing burden on the image recognition unit 32 becomes large.
  Therefore, in this embodiment, attention is paid to the laying pattern of roads provided with Botts' dots and cat's eyes, and the analysis region for Botts' dots is set based on the position of the cat's eye detected by the radar device 20. On such roads, Botts' dots and cat's eyes are usually laid in a fixed pattern (see, for example, FIG. 5), so if the position of a cat's eye can be recognized, a region where Botts' dots are highly likely to exist (hereinafter, the "Botts' dots analysis region") can be specified. In the figure, α is the interval between a cat's eye and the adjacent group of Botts' dots (for example, a value of several meters), which is specific to the country or region in which the vehicle travels. The value α is assumed to be stored in the ROM or the like at the time of shipment, according to the country or region in which the device is used.
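The idea of deriving a depth band for the Botts' dots analysis region from the radar-measured cat's eye distance and the stored interval α might be sketched as follows (the function name and numeric values are assumptions made for illustration):

```python
def botts_dot_search_range(cats_eye_distance_m, alpha_m, near_side=True):
    """Given the radar-measured distance to a cat's eye and the known
    cat's-eye-to-dot-group spacing alpha, return the (near, far) distance
    band, in meters ahead of the vehicle, where Botts' dots are expected."""
    if near_side:
        return (cats_eye_distance_m - alpha_m, cats_eye_distance_m)
    return (cats_eye_distance_m, cats_eye_distance_m + alpha_m)

# Cat's eye detected 25 m ahead, dot group expected within 3 m on the near side
near, far = botts_dot_search_range(25.0, alpha_m=3.0, near_side=True)
```

Only this narrow depth band, rather than the whole basic analysis region, would then be handed to the image-analysis stage.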
  Here, the "cat's eye position" refers to the distance from the host vehicle, that is, the position in the image depth direction, and the "Botts' dots analysis region" is, as shown in each panel of FIG. 6, identical to the basic analysis region of [1] in the width direction, but consists of one or more regions that are more limited than the basic analysis region in the image depth direction.
  The Botts' dots analysis region may be on the near side of the cat's eye in the image (FIG. 6(a)) or on the far side (FIG. 6(b)). For example, when the cat's eye is at a deep position within the basic analysis region, the Botts' dots analysis region may be set on the near side of the cat's eye, and when the cat's eye is at a near position, the region may be set on its far side. In the figure, β is obtained by converting α into the image coordinate system and slightly reducing it; the conversion is determined by the distance of the reference cat's eye from the vehicle and the camera installation parameters (roll, pan, pitch, installation height, focal length, etc.).
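The conversion of the real-world interval α into an on-image extent β can be illustrated with a flat-road pinhole projection that ignores roll, pan, and pitch. The camera parameters below are placeholder values, not ones given in the patent:

```python
def ground_row(distance_m, cam_height_m=1.3, focal_px=700.0, horizon_row=240.0):
    """Image row of a ground point `distance_m` ahead, for a pinhole camera
    at `cam_height_m` looking along a flat road (zero pitch and roll)."""
    return horizon_row + focal_px * cam_height_m / distance_m

def beta_pixels(cats_eye_distance_m, alpha_m, **cam):
    """Image-row extent (beta) corresponding to the real-world interval
    alpha on the near side of the cat's eye."""
    return (ground_row(cats_eye_distance_m - alpha_m, **cam)
            - ground_row(cats_eye_distance_m, **cam))

# 3 m interval on the near side of a cat's eye 25 m ahead
b = beta_pixels(25.0, 3.0)
```

Because image rows compress with distance, the same α maps to fewer pixels the farther away the cat's eye is, which is why β must be recomputed from the measured distance each cycle.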
Moreover, the Botts' dots analysis region may be set on both the near side and the far side. In addition, when two or more cat's eyes are recognized at positions corresponding to the basic analysis region, it is appropriate to set a Botts' dots analysis region between them (FIG. 6(c)). In the figure, β1 and β2 are different intervals.
  Note that the analysis region is not particularly limited in the image width direction, because the on-image positions of Botts' dots and the like fluctuate relatively greatly with curves and with the offset.
  Further, in consideration of the spacing between Botts' dots, a Botts' dots analysis region obtained by further subdividing the regions illustrated in FIG. 6 may be set (see FIG. 7).
  As shown in FIGS. 6 and 7, the Botts' dots analysis region is narrower than the basic analysis region used when detecting a white line or the like, so the processing load can be reduced.
  When the Botts' dots analysis region has been set, processing such as pattern matching and morphological operations is performed within the region to recognize Botts' dots. Then, among the image elements recognized as Botts' dots, a virtual straight line (or curve) connecting the elements aligned in a straight line or curve, extracted through a Hough transform or a voting process, is recognized as the road lane marking. The subsequent recognition of the positional relationship between the traveling lane and the host vehicle is the same as in [1].
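A morphological opening, one of the operations mentioned, can be sketched in plain NumPy: erosion removes specks smaller than the 3x3 structuring element, and dilation restores the surviving dot-sized blobs. The synthetic mask and element size are assumptions made for the example:

```python
import numpy as np

def binary_erode(mask):
    """3x3 binary erosion via shifted logical ANDs (no SciPy needed)."""
    m = np.pad(mask, 1, constant_values=False)
    out = np.ones_like(mask, dtype=bool)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            out &= m[1 + dr : 1 + dr + mask.shape[0], 1 + dc : 1 + dc + mask.shape[1]]
    return out

def binary_dilate(mask):
    """3x3 binary dilation via shifted logical ORs."""
    m = np.pad(mask, 1, constant_values=False)
    out = np.zeros_like(mask, dtype=bool)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            out |= m[1 + dr : 1 + dr + mask.shape[0], 1 + dc : 1 + dc + mask.shape[1]]
    return out

def open_mask(mask):
    """Morphological opening: erosion then dilation, removing specks
    smaller than the structuring element while keeping dot-sized blobs."""
    return binary_dilate(binary_erode(mask))

img = np.zeros((9, 9), dtype=bool)
img[3:6, 3:6] = True      # a 3x3 "Botts' dot"
img[0, 0] = True          # single-pixel noise
cleaned = open_mask(img)
```

After opening, the isolated noise pixel is gone while the 3x3 dot survives, so a subsequent connected-component or matching step sees only plausible dot candidates.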
  As described above, in the road lane marking recognition device 1 of this embodiment, when recognizing Botts' dots, for which processing by setting scanning lines is not appropriate, the Botts' dots analysis region is set based on the position of the cat's eye detected by the radar device 20 and on the laying pattern, so the processing load can be reduced. Moreover, since a sufficient number of Botts' dots can be expected within the region, the road lane marking can be appropriately recognized based on their positions.
  Since the position of the cat's eye in the image depth direction is recognized, a cat's eye analysis region centered on the position of the cat's eye detected by the radar device 20 may additionally be set, as shown in FIG. 8, and used for road lane marking recognition. In this way, road lane markings can be recognized even more appropriately.
  As for switching between [1] and [2], for example, when feature points continuous for at least a predetermined length are extracted, it may be determined that the road is demarcated by a white line or the like; when such feature points are not extracted, it may be determined that the road is not demarcated by a white line or the like, and the method may be switched accordingly.
[Use of the recognized road lane markings]
The positional relationship between the traveling lane and the host vehicle derived by the image recognition unit 32 is output to the steering signal generation unit 34. Based on the input positional relationship, when it is predicted that the vehicle will depart from the traveling lane after a predetermined time (for example, a few tenths of a second to several seconds), the steering signal generation unit 34 sounds a buzzer warning and generates a steering signal so as to output a small auxiliary steering force for a short time, transmitting it to the steering device 50 (lane departure warning control). It also generates and transmits a steering signal so as to output a small auxiliary steering force continuously so that the host vehicle travels stably near the center of the traveling lane (lane keeping assist control). These controls suppress departure of the host vehicle from the traveling lane.
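The departure prediction can be illustrated with a simple constant-heading model: the lateral speed implied by the yaw angle determines when the vehicle center reaches the lane edge. The half-lane width and the numeric inputs below are assumptions, not values from the patent:

```python
import math

def time_to_lane_crossing(offset_m, yaw_rad, speed_mps, half_lane_m=1.75):
    """Time until the vehicle center crosses the lane edge, assuming the
    current yaw angle and speed are held (straight-road approximation).

    offset_m : lateral deviation from the lane center (positive = left)
    yaw_rad  : angle between lane direction and heading (positive = drifting left)
    """
    lateral_speed = speed_mps * math.sin(yaw_rad)
    if lateral_speed == 0.0:
        return math.inf                      # tracking the lane exactly
    margin = half_lane_m - offset_m if lateral_speed > 0 else half_lane_m + offset_m
    return margin / abs(lateral_speed)

# 0.3 m left of center, drifting left at 2 degrees, 25 m/s (90 km/h)
t = time_to_lane_crossing(0.3, math.radians(2.0), 25.0)
```

A result in the range of a second or two would fall inside the warning window described above, triggering the buzzer and the brief auxiliary steering force.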
  Strictly speaking, the steering signal generation unit 34 is a functional block beyond the scope of the "road lane marking recognition device" in the claims, and the device 1 of this embodiment is, more precisely, a "steering control system including a road lane marking recognition device".
  The steering device 50 is, for example, an electric power steering device and includes a steering angle sensor, a torque sensor, an assist motor, a controller, and the like. In normal operation, when no steering signal is transmitted from the LKA ECU 30, the controller of the steering device 50 outputs a control signal to the drive circuit of the assist motor so as to produce the torque required for steering, based on the steering torque signal from the torque sensor and other vehicle state signals (vehicle speed, yaw rate, etc.). When a steering signal is transmitted from the LKA ECU 30, the assist motor is controlled based on that signal in addition to (or instead of) the normal assist control described above.
  According to the road lane marking recognition device 1 of this embodiment, road lane markings can be appropriately recognized while the processing load is reduced.
  The best mode for carrying out the present invention has been described above with reference to an embodiment. However, the present invention is not limited to this embodiment, and various modifications and substitutions can be made without departing from the scope of the present invention.
  For example, imaging by the front camera 10 is usually performed at a rate of about several tens of frames per second. The imaging timing may be synchronized with the detection timing of the radar device 20, and an image (frame) captured at a timing corresponding to a detection cycle in which the radar device 20 detected no cat's eye may be excluded from road lane marking recognition. This is because, if no cat's eye is detected, the Botts' dots analysis region cannot be set appropriately and analysis processing would have to be performed over the entire basic analysis region.
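This frame-gating idea, analyzing only frames whose synchronized radar cycle detected a cat's eye, can be sketched as a simple filter (the names and data are purely illustrative):

```python
def frames_to_analyze(frames, cats_eye_detections):
    """Pair each camera frame with its synchronized radar result and keep
    only frames whose radar cycle detected at least one cat's eye."""
    return [f for f, detected in zip(frames, cats_eye_detections) if detected]

# Four frames; the radar found a cat's eye only in cycles 0 and 2
kept = frames_to_analyze(["f0", "f1", "f2", "f3"], [True, False, True, False])
```

Frames without a radar hit are simply dropped, avoiding the full-region analysis those frames would otherwise require.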
  The present invention can be used in the automobile manufacturing industry, the automobile parts manufacturing industry, and the like.
FIG. 1 shows an example of the overall configuration of the road lane marking recognition device 1.
FIG. 2 shows the basic analysis region set on the captured image of the front camera 10 when recognizing a white line or the like.
FIG. 3 shows scanning lines being set within the basic analysis region and feature points being extracted on each scanning line.
FIG. 4 shows the yaw angle and the offset.
FIG. 5 shows an example of the laying pattern of Botts' dots and cat's eyes.
FIG. 6 shows an example of the Botts' dots analysis region set on the captured image of the front camera 10.
FIG. 7 shows another example of the Botts' dots analysis region set on the captured image of the front camera 10.
FIG. 8 shows a cat's eye analysis region being set on the captured image of the front camera 10 in addition to the Botts' dots analysis region.
Explanation of symbols
1 road lane marking recognition device, 10 front camera, 20 radar device, 30 LKA ECU
32 image recognition unit, 34 steering signal generation unit, 40 main switch, 50 steering device

Claims (4)

  1. A road lane marking recognition device that includes an imaging means for imaging the periphery of a vehicle and recognizes a road lane marking by analyzing the captured image of the imaging means,
    the device comprising first target position specifying means for specifying the position of a first target laid on the road,
    wherein, when recognizing a road lane marking on a road demarcated by the first target and a second target different from the first target, based on the position of the second target in the captured image of the imaging means,
    a second target recognition image analysis region for recognizing the second target is set on the near side and/or the far side of the position on the captured image of the imaging means corresponding to the position of the first target specified by the first target position specifying means.
  2. The road lane marking recognition device according to claim 1,
    wherein, when recognizing a road lane marking on a road demarcated by the first target and the second target, based on the position of the second target in the captured image of the imaging means,
    and the first target position specifying means specifies the positions of a plurality of the first targets in the road longitudinal direction,
    the second target recognition image analysis region for recognizing the second target is set between the positions on the captured image of the imaging means corresponding to the positions of the plurality of first targets specified by the first target position specifying means.
  3. The road lane marking recognition device according to claim 1 or 2,
    wherein, when recognizing a road lane marking on a road demarcated by the first target and the second target, based on the position of the first target in the captured image of the imaging means,
    a first target recognition image analysis region for recognizing the first target is set on the captured image of the imaging means, centered on the position corresponding to the position of the first target specified by the first target position specifying means.
  4. The road lane marking recognition device according to any one of claims 1 to 3,
    wherein the imaging means and the first target position specifying means are means that operate periodically, and
    a captured image of the imaging means captured at a timing corresponding to a timing at which the first target position specifying means did not specify the position of the first target is not analyzed.
JP2006356220A 2006-12-28 2006-12-28 Road section line recognition device Pending JP2008165610A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006356220A JP2008165610A (en) 2006-12-28 2006-12-28 Road section line recognition device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2006356220A JP2008165610A (en) 2006-12-28 2006-12-28 Road section line recognition device

Publications (1)

Publication Number Publication Date
JP2008165610A true JP2008165610A (en) 2008-07-17

Family

ID=39694991

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2006356220A Pending JP2008165610A (en) 2006-12-28 2006-12-28 Road section line recognition device

Country Status (1)

Country Link
JP (1) JP2008165610A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010064513A (en) * 2008-09-08 2010-03-25 Toyota Motor Corp Road surface division mark recognizing apparatus and lane departure preventing apparatus
US8265872B2 (en) 2008-09-08 2012-09-11 Toyota Jidosha Kabushiki Kaisha Road surface division mark recognition apparatus, and lane departure prevention apparatus
US8392115B2 (en) 2008-09-08 2013-03-05 Toyota Jidosha Kabushiki Kaisha Road surface division mark recognition apparatus, and lane departure prevention apparatus
JP2010130133A (en) * 2008-11-26 2010-06-10 Kobe Steel Ltd Radio communication terminal, and radio communication system
JP2013037193A (en) * 2011-08-08 2013-02-21 Stanley Electric Co Ltd Optical device
JP2013186655A (en) * 2012-03-07 2013-09-19 Toyota Central R&D Labs Inc Road sign detection device and program
JP2016206721A (en) * 2015-04-15 2016-12-08 日産自動車株式会社 Road mark detection apparatus and road mark detection method
JP2019007790A (en) * 2017-06-22 2019-01-17 本田技研工業株式会社 Vehicle position determination device
US20190077459A1 (en) * 2017-09-11 2019-03-14 Honda Motor Co., Ltd. Vehicle control device, vehicle control method, and recording medium

Similar Documents

Publication Publication Date Title
JP6353525B2 (en) Method for controlling the speed of a host vehicle and system for controlling the speed of a host vehicle
US9499171B2 (en) Driving support apparatus for vehicle
JP6202367B2 (en) Image processing device, distance measurement device, mobile device control system, mobile device, and image processing program
US20160210853A1 (en) Vehicle vision system with traffic monitoring and alert
JP6383661B2 (en) Device for supporting a driver when driving a car or driving a car autonomously
US9713983B2 (en) Lane boundary line recognition apparatus and program for recognizing lane boundary line on roadway
EP2993654B1 (en) Method and system for forward collision warning
KR102178433B1 (en) Auto emergency braking system and method for recognizing pedestrian of the same
CN106463064B (en) Object recognition device and vehicle travel control device using same
JP5892129B2 (en) Road shape recognition method, road shape recognition device, program, and recording medium
EP1982906B1 (en) Vehicle and steering control device for vehicle
CN104115198B (en) Vehicle collaborates accessory system and method
US6911642B2 (en) Object presence detection method and device
JP3864945B2 (en) Road lane detection device
KR101276871B1 (en) Method and apparatus for collision avoidance of vehicle
JP5711721B2 (en) Vehicle driving support control device
US10855953B2 (en) Vehicular control system with forward viewing camera and beam emitting antenna array
US10297156B2 (en) Driving support apparatus for a vehicle
JP6344638B2 (en) Object detection apparatus, mobile device control system, and object detection program
JP3925488B2 (en) Image processing apparatus for vehicle
US9623869B2 (en) Vehicle driving support control apparatus
US8615109B2 (en) Moving object trajectory estimating device
US20150073705A1 (en) Vehicle environment recognition apparatus
JP5679461B2 (en) Method and apparatus for determining valid lane markings
US8311283B2 (en) Method for detecting lane departure and apparatus thereof