WO2018066510A1 - Image processing device - Google Patents

Image processing device Download PDF

Info

Publication number
WO2018066510A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
marker
image processing
processing apparatus
lane
Prior art date
Application number
PCT/JP2017/035816
Other languages
French (fr)
Japanese (ja)
Inventor
祐司 石田
謙二 岡野
Original Assignee
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー filed Critical 株式会社デンソー
Priority to DE112017005114.2T priority Critical patent/DE112017005114T5/en
Priority to CN201780061868.4A priority patent/CN109844801A/en
Priority to US16/339,613 priority patent/US10926700B2/en
Publication of WO2018066510A1 publication Critical patent/WO2018066510A1/en

Links

Images

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/27Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/20Drawing from basic elements, e.g. lines or circles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/167Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/307Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing virtually distinguishing relevant parts of a scene from the background of the scene
    • B60R2300/308Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing virtually distinguishing relevant parts of a scene from the background of the scene by overlaying the real scene, e.g. through a head-up display on the windscreen
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/60Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
    • B60R2300/607Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/804Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for lane monitoring
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256Lane; Road marking

Definitions

  • the present disclosure relates to an image processing apparatus.
  • Patent Document 1 discloses the following technique. First, a rear captured image captured by a rear camera is converted into a rear bird's-eye view image. Next, the converted rear bird's-eye view image is shifted toward the front of the vehicle, displacing the positional relationship between the rear bird's-eye view image and the vehicle position, and an estimated bird's-eye view image that estimates the areas to the side of and ahead of the vehicle is created.
  • That is, it is assumed that the white line included in the rear bird's-eye view image continues in the same way to the side of and ahead of the vehicle, and the estimated bird's-eye view image is created on that basis.
  • With this technique, the estimated bird's-eye view image is displayed on a monitor or the like together with an image showing the position of the host vehicle, so that the position of the vehicle relative to the white line can be easily grasped.
  • In Patent Document 1, when the white line on the road is missing, or the white line is unclear at night, the white line included in the rear bird's-eye view image is likewise missing or unclear. Consequently, in the estimated bird's-eye view image created from the rear bird's-eye view image, the white line image is also missing or unclear. As a result of detailed study, the inventors found that the driver therefore has difficulty visually recognizing the white line even when looking at the monitor on which the estimated bird's-eye view image is displayed, making it difficult to grasp the positional relationship between the vehicle and the white line.
  • One aspect of the present disclosure is to provide an image processing device that makes it easy for a driver to grasp the positional relationship between a vehicle and a lane marker that defines a traveling path of the vehicle.
  • One aspect of the present disclosure is an image processing apparatus that includes an image acquisition unit, a detection unit, and a display processing unit.
  • The image acquisition unit is configured to acquire a captured image from an imaging device that images the surroundings of the vehicle.
  • the detection unit is configured to detect a lane marker that defines the traveling path of the vehicle from the captured image acquired by the image acquisition unit.
  • The display processing unit is configured to display, on the display device, a superimposed image, that is, a captured image on which a marker image for supporting visual recognition of the lane marker is superimposed on the lane marker detected by the detection unit.
  • the driver can easily grasp the positional relationship between the vehicle and the lane marker by displaying the superimposed image on the display device. Further, for example, even when the lane marker is partially missing or the lane marker is unclear at night, the lane marker is complemented by the marker image, so that the driver can easily recognize the lane marker.
  • FIG. 5 is a superimposed image after the image processing of the first embodiment is performed on the bird's-eye view image shown in FIG. 4. FIG. 6 is a diagram showing a state in which a white line is unclear at night.
  • FIG. 7 is a superimposed image after the image processing of the first embodiment is performed on the bird's-eye view image shown in FIG. 6. FIG. 8 is a flowchart showing the superimposition process of the second embodiment.
  • The configuration of the image processing apparatus 1 will be described with reference to the drawings.
  • the image processing device 1 is an in-vehicle device mounted on a vehicle.
  • the vehicle on which the image processing apparatus 1 is mounted is referred to as the own vehicle.
  • the image processing device 1 is an ECU, and an imaging device 2 and a display device 3 are connected to the image processing device 1.
  • the ECU is an abbreviation for “Electronic Control Unit”, that is, an abbreviation for an electronic control unit.
  • the imaging device 2 includes a front camera, a left side camera, a right side camera, and a rear camera.
  • the front camera is installed in the host vehicle so that the road surface ahead of the host vehicle is within the imaging range.
  • the left side camera is installed in the host vehicle so that the road surface on the left side of the host vehicle is within the imaging range.
  • the right side camera is installed in the host vehicle so that the road surface on the right side of the host vehicle is within the imaging range.
  • The rear camera is installed in the host vehicle so that the road surface behind the host vehicle is within the imaging range.
  • Each camera repeatedly captures images at a preset time interval, for example, at 1/15 second intervals, and outputs the captured images to the image processing apparatus 1.
  • The display device 3 is a display unit having a display screen such as a liquid crystal or organic EL panel.
  • the display device displays an image according to a signal input from the image processing device 1.
  • the image processing apparatus 1 is mainly configured by a known microcomputer including a CPU, a RAM, a ROM, and a semiconductor memory such as a flash memory.
  • The various functions of the image processing apparatus 1 are realized by the CPU executing a program stored in a non-transitory tangible recording medium.
  • The semiconductor memory corresponds to the non-transitory tangible recording medium storing the program. By executing this program, a method corresponding to the program is executed.
  • the number of microcomputers constituting the image processing apparatus 1 may be one or more.
  • The image processing apparatus 1 includes, as functional configurations realized by the CPU executing a program, an image acquisition processing unit 4, a video conversion processing unit 5, a lane marker detection processing unit 6, and a detection result output processing unit 7.
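The four functional units above can be pictured as a small pipeline. The following is a hypothetical Python skeleton, not Denso's implementation; the class name, callables, and their signatures are all illustrative assumptions.

```python
class ImageProcessor:
    """Sketch of the image processing apparatus 1 as four pluggable units."""

    def __init__(self, acquire, to_birds_eye, detect_markers, output):
        self.acquire = acquire                # image acquisition processing unit 4
        self.to_birds_eye = to_birds_eye      # video conversion processing unit 5
        self.detect_markers = detect_markers  # lane marker detection processing unit 6
        self.output = output                  # detection result output processing unit 7

    def run_once(self):
        """One pass of the periodic image processing loop."""
        frames = self.acquire()               # step 1: grab camera frames
        bird = self.to_birds_eye(frames)      # step 2: synthesize bird's-eye view
        markers = self.detect_markers(bird)   # step 3: find lane markers
        return self.output(bird, markers)     # steps 4-5: superimpose and display


# Wiring with stand-in lambdas in place of real camera I/O and detection:
processor = ImageProcessor(
    acquire=lambda: ["front", "left", "right", "rear"],
    to_birds_eye=lambda frames: "birds_eye_image",
    detect_markers=lambda bird: ["white_line_1", "white_line_2"],
    output=lambda bird, markers: (bird, len(markers)),
)
```

In a real system each hook would wrap the camera drivers, the bird's-eye conversion, the detector, and the display output, and `run_once` would be called on the 1/15-second cycle described above.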
  • The method for realizing these elements constituting the image processing apparatus 1 is not limited to software; some or all of the elements may be realized using hardware such as a combination of logic circuits and analog circuits.
  • The image processing executed by the image processing apparatus 1 will be described with reference to the flowcharts.
  • the image processing is executed at a predetermined time interval such as 1/15 second, for example, while the ignition switch of the host vehicle is on.
  • In step 1, the image processing apparatus 1 performs a process of acquiring captured images from the front camera, the left side camera, the right side camera, and the rear camera and converting them into digital signals.
  • the image processing apparatus 1 functions as the image acquisition processing unit 4 by executing the process of step 1.
  • In step 2, the image processing apparatus 1 performs a process of converting the four captured images, now digital signals, into bird's-eye images viewed from a preset virtual viewpoint and synthesizing them into a single bird's-eye view image showing the surroundings of the host vehicle. Specifically, the image processing apparatus 1 applies well-known bird's-eye conversion to the four captured images, thereby converting each captured image into a bird's-eye image, that is, a viewpoint image looking down from above the host vehicle. The image processing apparatus 1 functions as the video conversion processing unit 5 by executing the process of step 2.
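The "well-known bird's-eye conversion" of step 2 is typically an inverse perspective mapping through a camera-to-ground homography. Here is a minimal pure-Python sketch; the 3x3 matrix `H` would come from camera calibration in practice, and the nearest-neighbour sampling is a simplification.

```python
def apply_homography(H, x, y):
    """Map an image pixel (x, y) through the 3x3 homography H."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    gx = (H[0][0] * x + H[0][1] * y + H[0][2]) / w
    gy = (H[1][0] * x + H[1][1] * y + H[1][2]) / w
    return gx, gy

def warp_to_birds_eye(image, H_inv, out_w, out_h):
    """Build a bird's-eye grid by sampling the source image at the pixel
    that projects onto each ground-plane cell (nearest neighbour)."""
    out = [[0] * out_w for _ in range(out_h)]
    for v in range(out_h):
        for u in range(out_w):
            sx, sy = apply_homography(H_inv, u, v)
            sx, sy = int(round(sx)), int(round(sy))
            if 0 <= sy < len(image) and 0 <= sx < len(image[0]):
                out[v][u] = image[sy][sx]
    return out
```

A production system would compute `H` from the camera's extrinsics and intrinsics and use bilinear interpolation; the four per-camera bird's-eye images are then stitched around an icon of the host vehicle.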
  • In step 3, the image processing apparatus 1 performs a process of detecting the lane marker 30 from the bird's-eye view image generated in step 2.
  • the lane marker 30 defines the traveling path of the host vehicle.
  • The lane marker 30 is marked on the road surface; a white line that defines the traveling road and a road edge, which is mainly a boundary line between the roadway and the sidewalk, are detected.
  • white lines include continuous lines and broken lines.
  • road edges include curbs, gutters, and guardrails.
  • two white lines are detected as the lane marker 30.
  • the detection of the lane marker 30 can be performed using a known technique.
  • the detection of the white line can be performed by calculating the luminance in the bird's-eye view image and extracting the edge from the luminance-converted image.
  • the detection of the road edge can be performed by using information such as luminance, color, or shape of the bird's-eye view image as a clue.
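The luminance-edge approach to white line detection mentioned above can be sketched in a few lines. This is an illustrative simplification, assuming an 8-bit grayscale scanline of the bird's-eye image; the threshold and maximum line width are made-up values, not from the patent.

```python
def detect_white_line_edges(row, threshold=50):
    """Return column indices where the horizontal luminance jump exceeds
    the threshold: rising at the left edge of a white line, falling at
    the right edge."""
    edges = []
    for x in range(len(row) - 1):
        if abs(row[x + 1] - row[x]) > threshold:
            edges.append(x)
    return edges

def pair_edges_to_lines(row, edges, max_width=10):
    """Pair a rising edge with the next falling edge to get candidate
    white-line segments (start, end) no wider than max_width pixels."""
    lines = []
    i = 0
    while i + 1 < len(edges):
        start, end = edges[i], edges[i + 1]
        rising = row[start + 1] > row[start]
        if rising and end - start <= max_width:
            lines.append((start, end))
            i += 2
        else:
            i += 1
    return lines
```

A real detector would run this per scanline, fit lines through the candidates across rows, and add the color/shape cues mentioned for road edges.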
  • the image processing apparatus 1 functions as the lane marker detection processing unit 6 by executing the process of step 3.
  • In step 4, the image processing apparatus 1 performs a superimposition process of superimposing a marker image 31, which assists visual recognition of the lane marker 30, on the lane marker 30 detected in step 3.
  • the marker image 31 is a band-like image for clearly showing the lane marker 30 as shown in FIGS. 5 and 7, for example.
  • FIG. 5 is a superimposed image in which two marker images 31 are superimposed on two white lines on the bird's-eye view image shown in FIG. 4.
  • The marker image 31 is a band-shaped image that is arranged so as to overlap the lane marker 30 on the bird's-eye view image and extends from end to end of the bird's-eye view image. Therefore, even when the lane marker 30 is partially missing as shown in FIG. 4, the position of the lane marker 30 can be clearly shown by superimposing the marker image 31.
  • FIG. 7 is a superimposed image in which two marker images 31 are superimposed on two white lines on the bird's-eye view image shown in FIG. 6. As shown in FIG. 6, even when the lane marker 30 is unclear at night, the position of the lane marker 30 can be clearly shown by superimposing the marker image 31.
  • In step 11, the image processing apparatus 1 performs a process of setting the initial value of the variable N to 0.
  • N is a natural number.
  • In step 12, the image processing apparatus 1 performs a process of determining whether the number of lane markers 30 detected in step 3 is greater than the variable N. If an affirmative determination is made in step 12, the process proceeds to step 13, in which the image processing apparatus 1 adds 1 to the variable N.
  • In step 14, the image processing apparatus 1 performs a process of acquiring, from the bird's-eye image, the distance from the host vehicle to the Nth lane marker 30 detected in step 3 and the inclination of that lane marker 30.
  • The distance from the host vehicle to the lane marker 30 is the distance from the host vehicle to the nearer of the lines indicating the longitudinal direction of the band-shaped lane marker 30.
  • In step 15, the image processing apparatus 1 performs a process of determining the coordinate position of the marker image 31 to be superimposed on the Nth lane marker 30 based on the result obtained in step 14. Specifically, the image processing apparatus 1 determines the position of the line, on the side nearer the host vehicle, indicating the longitudinal direction of the band-shaped marker image 31. This line extends from end to end of the bird's-eye view image.
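The near-side boundary line of step 15 can be sketched from the distance and inclination obtained in step 14. This is a geometric illustration only; the coordinate convention (x across the bird's-eye image, y down it) and the linear model are assumptions, not the patent's actual computation.

```python
def marker_near_line(vehicle_x, distance, slope, image_height):
    """Sample the near-side boundary of the band-shaped marker image as
    (x, y) points running from top to bottom of the bird's-eye image.

    vehicle_x    -- x coordinate of the host vehicle's side (pixels)
    distance     -- lateral distance to the lane marker (pixels)
    slope        -- inclination of the marker: dx per row of the image
    image_height -- number of rows; the line spans all of them (end to end)
    """
    x0 = vehicle_x + distance
    return [(x0 + slope * y, y) for y in range(image_height)]
```

The far-side boundary would then be placed at `x0 + width`, with `width` chosen in step 19 according to the risk level.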
  • In step 16, the image processing apparatus 1 performs a process of determining whether the distance from the host vehicle to the Nth lane marker 30 is equal to or less than a threshold value. If an affirmative determination is made in step 16, the image processing apparatus 1 sets the risk level to high in step 17. If a negative determination is made in step 16, the image processing apparatus 1 sets the risk level to low in step 18. The processes of steps 16, 17, and 18 are collectively referred to as the risk determination process.
  • In step 19, the image processing apparatus 1 performs a process of setting the width of the marker image 31 according to the risk level set in step 17 or 18. Specifically, the image processing apparatus 1 sets the display mode of the marker image 31 so that its width is wide when the risk level is high and narrow when the risk level is low. The image processing apparatus 1 sets the width of the marker image 31 by determining the distance from the position of the line of the marker image 31 determined in step 15 to the side farther from the host vehicle.
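Steps 16 through 19 amount to a threshold test followed by a width lookup. A minimal sketch, with the caveat that the threshold distance and pixel widths below are illustrative placeholders; the patent does not give numeric values.

```python
def risk_level(distance_m, threshold_m=0.5):
    """Steps 16-18: risk is high when the lane marker is at or inside the
    threshold distance from the host vehicle, low otherwise.
    The 0.5 m default is an assumed value for illustration."""
    return "high" if distance_m <= threshold_m else "low"

def marker_width_px(risk, wide_px=12, narrow_px=4):
    """Step 19: draw a wider band when the risk is high so the marker
    stands out; the pixel widths are placeholder choices."""
    return wide_px if risk == "high" else narrow_px
```

A vehicle drifting toward the line thus sees the band thicken, which is the highlighting effect described in the first embodiment's effects below.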
  • In step 20, the image processing apparatus 1 performs a process of determining whether the Nth lane marker 30 detected in step 3 is a white line. If an affirmative determination is made in step 20, a process of drawing a linear marker image 31 representing a white line on the bird's-eye view image is performed in step 21. If a negative determination is made in step 20, a process of drawing a three-dimensional marker image 31 representing a road edge on the bird's-eye view image is performed in step 22. The processes of steps 20, 21, and 22 are collectively referred to as the superimposition drawing process.
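The drawing in step 21 is, at its simplest, painting a band of the chosen width into the bird's-eye grid. The sketch below assumes a vertical (untilted) marker and a grayscale grid for brevity; the real drawing follows the tilted line from step 15 and renders in color.

```python
def draw_band(image, x_start, width, value=255):
    """Step 21 sketch: paint a vertical band of the given width into a
    bird's-eye grid (list of rows) to represent the linear marker image.
    Mutates and returns the grid."""
    for row in image:
        for x in range(x_start, min(x_start + width, len(row))):
            row[x] = value
    return image
```

The three-dimensional road-edge marker of step 22 would instead render a shaded or extruded shape, which is beyond this sketch.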
  • After step 21 or step 22, the process returns to step 12.
  • The processes of steps 13 to 22 are repeated while an affirmative determination is made in step 12, and the superimposition process ends when a negative determination is made.
  • In step 5, the image processing apparatus 1 performs a process of outputting the superimposed image, on which the marker image 31 was superimposed in step 4, to the display device 3. The image processing apparatus 1 functions as the detection result output processing unit 7 by executing the processes of steps 4 and 5.
  • After step 5, the process returns to step 1, and the image processing apparatus 1 performs image processing on the newly acquired captured images.
  • When the ignition switch is turned off, the image processing apparatus 1 ends the image processing.
  • As described above, the image processing apparatus 1 performs a process of causing the display device 3 to display a superimposed image on which the marker image 31 is superimposed, so the position of the lane marker 30 can be clearly shown and the driver can easily grasp the positional relationship between the host vehicle and the lane marker 30. For example, even when the lane marker 30 is partially missing or unclear at night, the lane marker 30 is complemented by the marker image 31, so the driver can easily recognize the lane marker 30.
  • The image processing apparatus 1 performs a process of determining the display mode of the marker image 31, so the display mode can be changed and a superimposed image with a marker image 31 in an arbitrary display mode can be shown on the display device 3.
  • The image processing apparatus 1 performs a process of changing the width of the marker image according to the risk level, so the driver can recognize the risk level and grasp the position of the host vehicle in the width direction of the travel path.
  • The image processing apparatus 1 performs a process of increasing the width of the marker image 31 when the risk level is high, so the marker image 31 can be highlighted when the risk is high. The driver can thus easily recognize that the risk level is high, that is, that the host vehicle is about to depart from its travel path.
  • the image processing apparatus 1 performs a process of changing the display mode of the marker image 31 according to a white line or a road edge. Therefore, even if the marker image 31 is superimposed and the lane marker 30 is hidden in the superimposed image, the driver can recognize the type of the lane marker 30.
  • S1 corresponds to processing as an image acquisition unit
  • S3 corresponds to processing as a detection unit
  • S11 to S15 and S5 correspond to processing as a display processing unit
  • S16 to S22 correspond to processing as a determination unit.
  • the image processing apparatus 1 performs a process of causing the display device 3 to display the marker image 31 that is superimposed on all the detected lane markers 30.
  • In the second embodiment, when the image processing apparatus 1 determines that the host vehicle is changing lanes, it performs a process of not displaying the marker image 31 superimposed on the lane marker 30 that the host vehicle is expected to cross due to the lane change.
  • In step 46, the image processing apparatus 1 performs the risk determination process.
  • The risk determination process is the same as the processes of steps 16, 17, and 18 described above.
  • In step 47, the image processing apparatus 1 performs a process of determining whether the position of the Nth lane marker 30 detected in step 3 is on the right side of the host vehicle and the right turn signal is on. If an affirmative determination is made in step 47, the process returns to step 42. For example, as shown in FIGS. 9 and 11, when the position of the white line is on the right side of the host vehicle and the right turn signal is on, that is, when the host vehicle is moving to the right lane, the marker image 31 is not superimposed on that white line, as shown in FIGS. 10 and 12. If a negative determination is made in step 47, the process proceeds to step 48.
  • In step 48, the image processing apparatus 1 performs a process of determining whether the position of the Nth lane marker 30 detected in step 3 is on the left side of the host vehicle and the left turn signal is on. If an affirmative determination is made in step 48, the process returns to step 42. That is, if it is determined that the host vehicle is moving to the left lane, the marker image 31 is not superimposed on the lane marker 30. If a negative determination is made in step 48, the process proceeds to step 49. The processes of steps 47 and 48 are collectively referred to as the non-display determination process.
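The non-display determination of steps 47 and 48 reduces to: hide the marker when it lies on the same side as the active turn signal. A minimal sketch, with the string-based side/signal encoding as an illustrative assumption:

```python
def hide_marker(marker_side, turn_signal):
    """Steps 47-48: suppress the marker the vehicle is about to cross.

    marker_side -- 'left' or 'right' (side of the host vehicle)
    turn_signal -- 'left', 'right', or 'off'
    Returns True when the marker image should not be drawn.
    """
    return turn_signal != "off" and marker_side == turn_signal
```

In the flow above, a `True` result corresponds to returning to step 42 without drawing; `False` lets the process continue to steps 49 and 50.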
  • In step 49, the image processing apparatus 1 performs a process of setting the width of the marker image 31 according to the risk level set in step 46.
  • In step 50, the image processing apparatus 1 performs the superimposition drawing process.
  • The superimposition drawing process is the same as the processes of steps 20, 21, and 22 described above.
  • After step 50, the process returns to step 42. If a negative determination is made in step 42, the superimposition process ends. [2-3. Effects] The second embodiment described in detail above provides the effects of the first embodiment described above and, in addition, the following effects.
  • The image processing apparatus 1 performs the non-display determination process, so the marker image 31 superimposed on the lane marker 30 that the host vehicle is expected to cross due to the lane change is not displayed on the display device 3. If the marker image 31 were displayed during a lane change, the driver would feel a reluctance to change lanes; since it is not displayed, the driver can change lanes without that resistance. Moreover, if the marker image 31 were displayed during a lane change, it would indicate a high risk level brought about by the lane change itself; by not displaying it, the presentation of high-risk information to the driver can be deliberately suppressed.
  • S41 to S45 correspond to processing as a display processing unit
  • S46, S49, and S50 correspond to processing as a determination unit
  • S47 and S48 correspond to processing as a determination unit.
  • The setting of the display mode of the marker image is not limited to this.
  • Examples of the display mode of the marker image include the color of the marker image, the transmittance, the blinking speed, and the like. Specifically, when the degree of danger increases, the marker image may gradually become red, the transmittance may decrease, or the blinking speed may increase.
  • The superimposition drawing process may change the display mode of the marker image according to the type of white line or road edge. For example, when the white line is a broken line, the marker image may also be drawn as a broken line.
  • the risk level determination process is not limited to this.
  • The risk level may be set according to the type of the lane marker. For example, when a lane change is permitted, as indicated by a broken line or the like, the risk level may be set to low; when a lane change is prohibited, as indicated by a yellow line or the like, the risk level may be set to high.
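This type-based variant of the risk determination can be sketched as a simple lookup. The marker-type labels and the default for unknown types are illustrative assumptions, not values from the patent.

```python
def risk_from_marker_type(marker_type):
    """Variant risk setting: broken lines (lane change permitted) get a
    low risk level; yellow lines (lane change prohibited) get a high one.
    Unknown types default to low here as an arbitrary assumed choice."""
    return {"broken": "low", "yellow": "high"}.get(marker_type, "low")
```

This rule could replace, or be combined with, the distance-based test of steps 16 to 18.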
  • the image processing apparatus 1 may determine the brightness of the bird's-eye view image and change the marker image to an optimum brightness according to the determined brightness.
  • For example, the marker image may be set to a luminance close to that of the bird's-eye view image.
  • By displaying a superimposed image with a small luminance difference on the display device, a superimposed image that is easy on the driver's eyes can be provided.
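The brightness adaptation described above could be implemented by blending the marker's base luminance toward the mean luminance of the bird's-eye image. A minimal sketch, assuming 8-bit grayscale rows; the 0.5 blend factor and the function name are illustrative, not from the patent.

```python
def adapt_marker_luminance(image_rows, base_luminance=255, blend=0.5):
    """Pull the marker's luminance toward the bird's-eye image's mean
    luminance so the overlay is less harsh in dark scenes.

    image_rows     -- grayscale bird's-eye image as a list of rows
    base_luminance -- nominal marker brightness (0-255)
    blend          -- 0 keeps the base value, 1 matches the image mean
    """
    pixels = [p for row in image_rows for p in row]
    mean = sum(pixels) / len(pixels)
    return round(base_luminance * (1 - blend) + mean * blend)
```

At night (low mean luminance) the marker dims accordingly; in daylight it stays close to its nominal brightness.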
  • In the above embodiment, image processing is performed over the entire bird's-eye view image, but the image processing range is not limited to this. For example, the range of image processing may be limited to the portion of the bird's-eye view image close to the vehicle.
  • In the above-described embodiment, an example in which a front camera, a left side camera, a right side camera, and a rear camera are used as the imaging device 2 has been described. However, the imaging device is not limited to this; one or more cameras may be used.
  • A plurality of functions of one constituent element in the above embodiment may be realized by a plurality of constituent elements, or a single function of one constituent element may be realized by a plurality of constituent elements. A plurality of functions of a plurality of constituent elements may be realized by one constituent element, or one function realized by a plurality of constituent elements may be realized by one constituent element. A part of the configuration of the above embodiment may also be omitted.
  • at least a part of the configuration of the above embodiment may be added to or replaced with the configuration of the other embodiment.
  • all the aspects included in the technical idea specified from the wording described in the claims are embodiments of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The image processing device (1) is provided with an image acquisition unit (1, S1), a detection unit (1, S3), and a display processing unit (1, S11-S15, S41-S45, S5). The image acquisition unit is configured to acquire captured images from an imaging device (2) that captures images of the surroundings of a vehicle. The detection unit is configured to detect, from the captured images acquired by the image acquisition unit, lane markers (30) that define the vehicle's traveling path. The display processing unit causes a display device (3) to display a superimposed image, that is, a captured image on which marker images (31) for assisting visual recognition of the lane markers are superimposed over the lane markers detected by the detection unit.

Description

Image processing device

Cross-reference of related applications
This international application claims priority based on Japanese Patent Application No. 2016-199184, filed with the Japan Patent Office on October 7, 2016, the entire contents of which are incorporated herein by reference.
The present disclosure relates to an image processing apparatus.
In order to make it easy to grasp the positional relationship between a vehicle and a white line, a device that processes a captured image acquired from an imaging device and displays it on a display device is known. For example, Patent Document 1 discloses the following technique. First, a rear captured image captured by a rear camera is converted into a rear bird's-eye view image. Next, the converted rear bird's-eye view image is shifted toward the front of the vehicle, displacing the positional relationship between the rear bird's-eye view image and the vehicle position, and an estimated bird's-eye view image that estimates the areas to the side of and ahead of the vehicle is created. That is, the white line included in the rear bird's-eye view image is assumed to continue in the same way to the side of and ahead of the vehicle, and the estimated bird's-eye view image is created on that basis. With this technique, particularly while the vehicle moves forward, the estimated bird's-eye view image is displayed on a monitor or the like together with an image showing the position of the host vehicle, so that the position of the vehicle relative to the white line can be easily grasped.
JP 2003-006621 A
In Patent Document 1, when, for example, a white line on the road is partially missing, or a white line is indistinct at night, the white line contained in the rear bird's-eye image is likewise missing or indistinct. Consequently, in the estimated bird's-eye image created from the rear bird's-eye image, the image of that white line is also missing or indistinct. As a result of detailed study, the inventors found the problem that, even when the driver looks at a monitor on which this estimated bird's-eye image is displayed, the white line is hard to see and the positional relationship between the vehicle and the white line is hard to grasp.
One aspect of the present disclosure is to provide an image processing apparatus that makes it easy for a driver to grasp the positional relationship between a vehicle and a lane marker that defines the traveling path of the vehicle.

One aspect of the present disclosure is an image processing apparatus comprising an image acquisition unit, a detection unit, and a display processing unit. The image acquisition unit is configured to acquire a captured image from an imaging device that images the surroundings of a vehicle. The detection unit is configured to detect, from the captured image acquired by the image acquisition unit, a lane marker that defines the traveling path of the vehicle. The display processing unit is configured to cause a display device to display a superimposed image, which is the captured image with a marker image superimposed on it, such that the marker image, which assists visual recognition of the lane marker, is superimposed on the lane marker detected by the detection unit.
According to the image processing apparatus of the present disclosure, displaying the superimposed image on the display device makes it easy for the driver to grasp the positional relationship between the host vehicle and the lane marker. In addition, even when, for example, the lane marker is partially missing or is indistinct at night, the lane marker is complemented by the marker image, so the driver can recognize the lane marker easily.
The reference signs in parentheses in the claims indicate, as one aspect, correspondence with specific means described in the embodiments below, and do not limit the technical scope of the present disclosure.
FIG. 1 is a block diagram showing the configuration of the image processing apparatus.
FIG. 2 is a flowchart showing the image processing of the first embodiment.
FIG. 3 is a flowchart showing the superimposition processing of the first embodiment.
FIG. 4 is a bird's-eye image showing a state in which a white line is partially missing.
FIG. 5 is a superimposed image after the image processing of the first embodiment has been performed on the bird's-eye image shown in FIG. 4.
FIG. 6 is a diagram showing a state in which a white line is indistinct at night.
FIG. 7 is a superimposed image after the image processing of the first embodiment has been performed on the bird's-eye image shown in FIG. 6.
FIG. 8 is a flowchart showing the superimposition processing of the second embodiment.
FIG. 9 is a diagram showing a state in which a white line is partially missing and the host vehicle is moving into the right lane.
FIG. 10 is a diagram after the image processing of the second embodiment has been performed in the state in which a white line is partially missing and the host vehicle is moving into the right lane.
FIG. 11 is a diagram showing a state in which a white line is indistinct at night and the host vehicle is moving into the right lane.
FIG. 12 is a diagram after the image processing of the second embodiment has been performed in the state in which a white line is indistinct at night and the host vehicle is moving into the right lane.
Hereinafter, embodiments for carrying out the present disclosure will be described with reference to the drawings.

[1. First Embodiment]

[1-1. Configuration of the image processing apparatus 1]

The configuration of the image processing apparatus 1 will be described with reference to FIG. 1. The image processing apparatus 1 is an in-vehicle apparatus mounted on a vehicle. The vehicle on which the image processing apparatus 1 is mounted is referred to below as the host vehicle. The image processing apparatus 1 is an ECU, and an imaging device 2 and a display device 3 are connected to it. Note that ECU is an abbreviation of "Electronic Control Unit".
The imaging device 2 includes a front camera, a left side camera, a right side camera, and a rear camera. The front camera is installed on the host vehicle so that the road surface ahead of the host vehicle falls within its imaging range. The left side camera is installed on the host vehicle so that the road surface to the left of the host vehicle falls within its imaging range. The right side camera is installed on the host vehicle so that the road surface to the right of the host vehicle falls within its imaging range. The rear camera is installed on the host vehicle so that the road surface behind the host vehicle falls within its imaging range. Each camera captures images repeatedly at a preset time interval, for example every 1/15 second, and outputs the captured images to the image processing apparatus 1.
The display device 3 is a display device having a display screen such as a liquid crystal or organic EL screen. The display device displays images in accordance with signals input from the image processing apparatus 1.

The image processing apparatus 1 is configured mainly of a well-known microcomputer including a CPU and semiconductor memories such as a RAM, a ROM, and a flash memory. The various functions of the image processing apparatus 1 are realized by the CPU executing a program stored in a non-transitory tangible recording medium. In this example, the semiconductor memory corresponds to the non-transitory tangible recording medium storing the program. By executing this program, a method corresponding to the program is performed. The number of microcomputers constituting the image processing apparatus 1 may be one or more.
As the configuration of functions realized by the CPU executing the program, the image processing apparatus 1 includes an image acquisition processing unit 4, a video conversion processing unit 5, a lane marker detection processing unit 6, and a detection result output processing unit 7. The method of realizing these elements constituting the image processing apparatus 1 is not limited to software; some or all of the elements may be realized using hardware combining logic circuits, analog circuits, and the like.
[1-2. Image processing]

The image processing executed by the image processing apparatus 1 will be described with reference to the flowcharts of FIGS. 2 and 3. The image processing is executed at a predetermined time interval, for example every 1/15 second, while the ignition switch of the host vehicle is on.
As shown in FIG. 2, in step 1, the image processing apparatus 1 performs processing to acquire captured images from the front camera, the left side camera, the right side camera, and the rear camera and convert them into digital signals. By executing the processing of step 1, the image processing apparatus 1 functions as the image acquisition processing unit 4.
In the following step 2, the image processing apparatus 1 performs processing to convert the four digitized captured images into bird's-eye images viewed from a preset virtual viewpoint and combine them, generating a bird's-eye image showing the surroundings of the host vehicle. Specifically, the image processing apparatus 1 applies well-known bird's-eye conversion to the four captured images, thereby converting the captured images into a bird's-eye image, which is an image from a viewpoint looking down on the host vehicle from above, and combining them. By executing the processing of step 2, the image processing apparatus 1 functions as the video conversion processing unit 5.
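The embodiment treats the bird's-eye conversion of step 2 as a well-known technique and gives no formula. As a non-authoritative sketch, such a conversion is commonly modeled as a planar homography that maps camera-image pixels onto top-down road-plane coordinates; the function and the matrices below are illustrative only and are not calibration data from the embodiment.

```python
def warp_point(H, x, y):
    """Map pixel (x, y) through a 3x3 homography H (row-major nested lists)
    to bird's-eye coordinates, dividing out the homogeneous scale w."""
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return u / w, v / w

# The identity homography leaves every point unchanged; a real system would
# obtain H from the camera's extrinsic/intrinsic calibration.
I = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
```

In practice one homography per camera would be applied to all pixels (or, more efficiently, the inverse mapping would be sampled), and the four warped images stitched into a single composite around the host-vehicle icon.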
In the following step 3, the image processing apparatus 1 performs processing to detect lane markers 30 from the bird's-eye image generated in step 2. Here, a lane marker 30 is something that defines the traveling path of the host vehicle. In step 3, white lines marked on the road surface that define the traveling path, and road edges, which are mainly boundaries between the roadway and the sidewalk, are detected as lane markers 30. Examples of white lines include continuous lines and broken lines. Examples of road edges include curbs, gutters, and guardrails. For example, in the bird's-eye images shown in FIGS. 4 and 6, two white lines are detected as lane markers 30.
The lane markers 30 can be detected using known techniques. For example, white lines can be detected by calculating the luminance in the bird's-eye image and extracting edges from the luminance-converted image. Road edges can be detected using information such as the luminance, color, or shape in the bird's-eye image as cues. By executing the processing of step 3, the image processing apparatus 1 functions as the lane marker detection processing unit 6.
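The edge extraction mentioned above is not specified further in the embodiment. A minimal sketch, assuming a single 1-D luminance row and a hand-picked gradient threshold (both invented for illustration), is: a white line appears as a bright stripe on a darker road surface, so its boundaries are the positions where the luminance jumps.

```python
def detect_edges(row, threshold):
    """Return indices where the horizontal luminance gradient of one image
    row exceeds the threshold (candidate white-line boundaries)."""
    return [i for i in range(1, len(row))
            if abs(row[i] - row[i - 1]) > threshold]

# A dark road surface (luminance 30) with a bright white-line stripe (200).
row = [30, 30, 200, 200, 200, 30, 30]
```

A full detector would apply this over every row and fit lines (or a band) through the edge candidates to obtain each marker's position and inclination.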
In the following step 4, the image processing apparatus 1 performs superimposition processing to superimpose, on each lane marker 30 detected in step 3, a marker image 31 for assisting visual recognition of that lane marker 30. Specifically, a marker image 31 is a band-shaped image for showing a lane marker 30 clearly, as shown in FIGS. 5 and 7, for example. FIG. 5 is a superimposed image in which two marker images 31 are superimposed on the two white lines in the bird's-eye image shown in FIG. 4. As shown in FIG. 5, each marker image 31 is a band-shaped image arranged so as to overlap a lane marker 30 in the bird's-eye image and extending from edge to edge of the bird's-eye image. Therefore, even when a lane marker 30 is partially missing as shown in FIG. 4, superimposing such a marker image 31 can show the position of the lane marker 30 clearly. FIG. 7 is a superimposed image in which two marker images 31 are superimposed on the two white lines in the bird's-eye image shown in FIG. 6. Even when the lane markers 30 are indistinct at night as shown in FIG. 6, superimposing the marker images 31 can show the positions of the lane markers 30 clearly.
The specific details of the superimposition processing will be described with reference to the flowchart of FIG. 3.

In step 11, the image processing apparatus 1 performs processing to set the initial value of a variable N to 0. N is a natural number.
In the following step 12, the image processing apparatus 1 performs processing to determine whether the number of lane markers 30 detected in step 3 is greater than the variable N. If an affirmative determination is made in step 12, the process proceeds to step 13. In step 13, the image processing apparatus 1 performs processing to add 1 to the variable N.
In the following step 14, the image processing apparatus 1 performs processing to acquire, from the bird's-eye image, the distance from the host vehicle to the N-th lane marker 30 detected in step 3 and the inclination of that lane marker 30. The distance from the host vehicle to the lane marker 30 is the distance from the host vehicle to the line on the side nearer the host vehicle indicating the longitudinal direction of the band-shaped lane marker 30.
In the following step 15, the image processing apparatus 1 performs processing to determine, based on the results acquired in step 14, the coordinate position of the marker image 31 to be superimposed on the N-th lane marker 30. Specifically, the image processing apparatus 1 performs processing to determine the position of the line on the side nearer the host vehicle indicating the longitudinal direction of the band-shaped marker image 31. This line extends from edge to edge of the bird's-eye image.
In the following step 16, the image processing apparatus 1 performs processing to determine whether the distance from the host vehicle to the N-th lane marker 30 is equal to or less than a threshold value. If an affirmative determination is made in step 16, the image processing apparatus 1 performs processing in step 17 to set the risk level to strong. If a negative determination is made in step 16, the image processing apparatus 1 performs processing in step 18 to set the risk level to weak. The processing of steps 16, 17, and 18 is referred to as risk determination processing.
After step 17 or step 18, the process proceeds to step 19. In step 19, the image processing apparatus 1 performs processing to set the width of the marker image 31 according to the risk level set in step 17 or step 18. Specifically, the image processing apparatus 1 sets the display mode of the marker image 31 so that the marker image 31 is wide when the risk level is strong, and so that the marker image 31 is narrow when the risk level is weak. The image processing apparatus 1 performs processing to set the width of the marker image 31 by determining the distance from the position of the line of the marker image 31 determined in step 15 to the side farther from the host vehicle.
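Steps 16 to 19 reduce to a threshold test followed by a width choice. The following sketch makes that explicit; the threshold and the two width values are hypothetical, since the embodiment specifies no concrete numbers.

```python
def marker_width(distance_to_marker, threshold=0.5, wide=0.30, narrow=0.10):
    """Steps 16-19 in outline: a lateral distance at or below the threshold
    means a strong risk level and a wide marker band; otherwise the risk
    level is weak and the band is narrow. All numeric values (meters) are
    illustrative only, not taken from the embodiment."""
    risk = "strong" if distance_to_marker <= threshold else "weak"
    return risk, wide if risk == "strong" else narrow
```

For example, a vehicle drifting to within 0.4 m of a marker would get a wide, emphasized band, while one 1.2 m away would get a narrow one.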
In the following step 20, the image processing apparatus 1 performs processing to determine whether the N-th lane marker 30 detected in step 3 is a white line. If an affirmative determination is made in step 20, processing is performed in step 21 to draw a straight-line-shaped marker image 31 representing a white line on the bird's-eye image. If a negative determination is made in step 20, processing is performed in step 22 to draw a three-dimensionally shaped marker image 31 representing a road edge on the bird's-eye image. The processing of steps 20, 21, and 22 is referred to as superimposition drawing processing.
After step 21 or step 22, the process returns to step 12. The processing of steps 13 to 22 is repeated as long as an affirmative determination is made in step 12; when a negative determination is made, the superimposition processing ends.
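The drawing of step 21, a band running from edge to edge of the bird's-eye image, can be sketched under the assumption of a simple row-major pixel buffer (an invented representation, not the embodiment's internal image format):

```python
def draw_band(image, x_start, width, value=255):
    """Fill columns [x_start, x_start + width) over the full height of the
    image, i.e. a vertical marker band extending from the top edge to the
    bottom edge of the bird's-eye image, clipped at the right border."""
    for row in image:
        for x in range(x_start, min(x_start + width, len(row))):
            row[x] = value
    return image

img = [[0] * 6 for _ in range(2)]   # a tiny 2-row, 6-column grayscale image
draw_band(img, 2, 2)                # band over columns 2-3 of every row
```

Step 19's risk-dependent width would be passed in as `width`, and step 22 would substitute a routine that renders a shaded, three-dimensional-looking shape instead of a flat band.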
After step 4, the process proceeds to step 5. In step 5, the image processing apparatus 1 performs processing to output the superimposed image, on which the marker images 31 were superimposed in step 4, to the display device 3. By executing the processing of steps 4 and 5, the image processing apparatus 1 functions as the detection result output processing unit 7.
After step 5, the process returns to step 1, and the image processing apparatus 1 performs the image processing on the next acquired captured images. The image processing apparatus 1 ends the image processing when the ignition switch is turned off.
[1-3. Effects]

According to the first embodiment described in detail above, the following effects are obtained.

(1a) The image processing apparatus 1 of the first embodiment performs processing to cause the display device 3 to display the superimposed image on which the marker images 31 are superimposed. The positions of the lane markers 30 can therefore be shown clearly, and the driver can easily grasp the positional relationship between the host vehicle and the lane markers 30. In addition, even when, for example, a lane marker 30 is partially missing or is indistinct at night, the lane marker 30 is complemented by the marker image 31, so the driver can recognize the lane marker 30 easily.
(1b) The image processing apparatus 1 of the first embodiment performs processing to determine the display mode of the marker image 31. The display mode of the marker image 31 can therefore be varied, and a superimposed image with a marker image 31 in any desired display mode can be displayed on the display device 3.
(1c) The image processing apparatus 1 of the first embodiment performs processing to change the width of the marker image according to the risk level. The driver can therefore recognize the risk level, and thus grasp the position of the host vehicle in the width direction of the traveling path.
(1d) The image processing apparatus 1 of the first embodiment performs processing to widen the marker image 31 when the risk level is strong. The marker image 31 can therefore be displayed with emphasis when the risk level is strong, making it easier for the driver to recognize that the risk level is high, that is, that the host vehicle is about to deviate from the traveling path.
(1e) The image processing apparatus 1 of the first embodiment performs processing to change the display mode of the marker image 31 according to whether the lane marker is a white line or a road edge. Therefore, even if a marker image 31 superimposed in the superimposed image hides the lane marker 30, the driver can recognize the type of the lane marker 30.
In the first embodiment, S1 corresponds to the processing of the image acquisition unit, S3 corresponds to the processing of the detection unit, S11 to S15 and S5 correspond to the processing of the display processing unit, and S16 to S22 correspond to the processing of the determination unit.
[2. Second Embodiment]

[2-1. Differences from the first embodiment]

Since the basic configuration of the second embodiment is the same as that of the first embodiment, the differences are described below. Reference signs that are the same as in the first embodiment denote the same configurations, and the preceding description applies to them.
In the first embodiment described above, the image processing apparatus 1 performs processing to cause the display device 3 to display marker images 31 superimposed on all detected lane markers 30. In contrast, in the second embodiment, when the image processing apparatus 1 determines that the host vehicle is in a state of making a lane change, it performs processing not to display the marker image 31 superimposed on the lane marker 30 that the host vehicle is expected to cross as a result of the lane change.
[2-2. Image processing]

The superimposition processing that the image processing apparatus 1 of the second embodiment executes in place of the superimposition processing of the first embodiment shown in FIG. 3 will be described with reference to the flowchart of FIG. 8. The processing of steps 41 to 45 in FIG. 8 is the same as the processing of steps 11 to 15 in FIG. 3, so its description is partly simplified.
After step 45, the process proceeds to step 46. In step 46, the image processing apparatus 1 performs risk determination processing. The risk determination processing is the same as the processing of steps 16, 17, and 18 in FIG. 3.
In the following step 47, the image processing apparatus 1 determines whether the N-th lane marker 30 detected in step 3 is located to the right of the host vehicle and the right turn signal is on. If an affirmative determination is made in step 47, the process returns to step 42. For example, as shown in FIGS. 9 and 11, when a white line is located to the right of the host vehicle and the right turn signal is on, that is, when the host vehicle is moving into the right lane, no marker image 31 is superimposed on that white line, as shown in FIGS. 10 and 12. If a negative determination is made in step 47, the process proceeds to step 48.
In step 48, the image processing apparatus 1 determines whether the N-th lane marker 30 detected in step 3 is located to the left of the host vehicle and the left turn signal is on. If an affirmative determination is made in step 48, the process returns to step 42. That is, when it is determined that the host vehicle is moving into the left lane, no marker image 31 is superimposed on that lane marker 30. If a negative determination is made in step 48, the process proceeds to step 49. The processing of steps 47 and 48 is referred to as non-display determination processing.
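The non-display determination of steps 47 and 48 reduces to a simple predicate on the side on which the marker lies and the turn-signal state. The function below is an illustrative outline of that logic, not the claimed implementation.

```python
def hide_marker(marker_side, right_winker_on, left_winker_on):
    """Steps 47-48 in outline: suppress the marker image for a lane marker
    on the side toward which the turn signal indicates a lane change.
    marker_side is "right" or "left" relative to the host vehicle."""
    if marker_side == "right" and right_winker_on:
        return True   # step 47 affirmative: skip drawing, return to step 42
    if marker_side == "left" and left_winker_on:
        return True   # step 48 affirmative: skip drawing, return to step 42
    return False      # otherwise proceed to width setting and drawing
```

With the right turn signal on, only the marker to the right is suppressed; markers on the opposite side continue to be drawn as in the first embodiment.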
In step 49, the image processing apparatus 1 performs processing to set the width of the marker image 31 according to the risk level set in step 46.

In the following step 50, the image processing apparatus 1 performs superimposition drawing processing. The superimposition drawing processing is the same as the processing of steps 20, 21, and 22 in FIG. 3.
After step 50, the process returns to step 42. When a negative determination is made in step 42, the superimposition processing ends.

[2-3. Effects]

According to the second embodiment described in detail above, the effects of the first embodiment described above are obtained, and the following effect is further obtained.
(2a) The image processing apparatus 1 of the second embodiment performs the non-display determination processing. Therefore, the marker image 31 superimposed on the lane marker 30 that the host vehicle is expected to cross as a result of a lane change is not displayed on the display device 3. If a marker image 31 were displayed during a lane change, the driver would feel that changing lanes is difficult; since the marker image 31 is not displayed, the driver can change lanes without a sense of resistance. In addition, if a marker image 31 were displayed during a lane change, a marker image 31 indicating a strong risk level would be displayed as the lane change proceeds; since the marker image 31 is not displayed, the apparatus can refrain from actively notifying the driver that the risk level is strong.
In the second embodiment, S41 to S45 correspond to the processing of the display processing unit, S46, S49, and S50 correspond to the processing of the determination unit, and S47 and S48 correspond to the processing of the judgment unit.
[3. Other Embodiments]

Although embodiments of the present disclosure have been described above, the present disclosure is not limited to the above embodiments and can be carried out with various modifications.
(3a) In the above embodiments, an example was shown in which image processing is performed on the bird's-eye image; however, the target of the image processing is not limited to this, and the image processing may be performed on the captured images.
(3b) In the above embodiments, an example was shown in which the image processing apparatus 1 performs processing to set the width of the marker image 31 according to the risk level; however, the setting of the display mode of the marker image is not limited to this. Examples of the display mode of the marker image include the color, transmittance, and blinking speed of the marker image. Specifically, as the risk level increases, the marker image may turn red in stages, its transmittance may decrease, or its blinking speed may increase.
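One way the staged color, transmittance, and blinking-speed variants suggested in (3b) could be organized is a lookup from a discrete risk level to a style record. Every concrete value below is invented for illustration; the disclosure names the attributes but not their values.

```python
def marker_style(risk_level):
    """Map a discrete risk level (1 = low, 3 = high) to display attributes,
    as one hypothetical realization of modification (3b): higher risk shifts
    the marker toward red, raises opacity (lowers transmittance), and speeds
    up blinking."""
    styles = {
        1: {"color": "yellow", "opacity": 0.4, "blink_hz": 0.0},
        2: {"color": "orange", "opacity": 0.7, "blink_hz": 1.0},
        3: {"color": "red",    "opacity": 0.9, "blink_hz": 3.0},
    }
    return styles[risk_level]
```

The width-based scheme of the embodiments and this attribute-based scheme are not exclusive; a renderer could apply both to the same band.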
(3c) In the above embodiments, an example of superimposition drawing processing was shown that changes the display mode of the marker image 31 according to whether the lane marker 30 is a white line or a road edge; however, the superimposition drawing processing is not limited to this. The superimposition drawing processing may change the display mode of the marker image according to the type of white line or road edge. For example, when a white line is a broken line, the marker image may be a broken line.
(3d) In the above embodiments, an example of risk determination processing was shown that sets the risk level according to the distance between the host vehicle and the lane marker 30; however, the risk determination processing is not limited to this. The risk determination processing may set the risk level according to the type of lane marker. For example, the risk level may be set to weak when a lane change is permitted, as indicated by a broken line or the like, and set to strong when a lane change is not permitted, as indicated by a yellow line or the like.
 (3e) The image processing apparatus 1 of the above embodiment may determine the luminance of the bird's-eye view image and adjust the marker image to an optimal luminance accordingly, for example by giving the marker image roughly the same luminance as the bird's-eye view image. Because the resulting superimposed image has little luminance difference, a superimposed image that is easy on the driver's eyes can be provided.
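One simple way to realize this luminance matching is to scale the marker image so its mean brightness approaches that of the bird's-eye view image. The sketch below assumes single-channel 8-bit images and uses mean luminance as the matching criterion, which is an assumption, not a detail from the embodiment.

```python
import numpy as np

def match_marker_luminance(marker: np.ndarray, bird_eye: np.ndarray) -> np.ndarray:
    """Scale a grayscale (uint8) marker image so that its mean luminance
    roughly matches the bird's-eye image, reducing the brightness gap in
    the final overlay, as suggested in variation (3e)."""
    target = bird_eye.mean()
    current = marker.mean()
    if current == 0:
        return marker  # all-black marker: nothing to scale
    scaled = marker.astype(np.float64) * (target / current)
    return np.clip(scaled, 0, 255).astype(np.uint8)
```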
 (3f) In the above embodiment, image processing is performed on the entire bird's-eye view image; however, the range of image processing is not limited to this. The range may, for example, be limited to the portion of the bird's-eye view image close to the vehicle. This reduces processing time compared with performing image processing on the entire bird's-eye view image.
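Restricting the processing range amounts to cropping a region of interest before the heavier processing stages run. The sketch below assumes the rows nearest the bottom of the bird's-eye image correspond to the area closest to the vehicle; the 50% fraction is an arbitrary illustrative choice.

```python
import numpy as np

def near_vehicle_roi(bird_eye: np.ndarray, fraction: float = 0.5) -> np.ndarray:
    """Return only the lower `fraction` of the bird's-eye image rows
    (assumed to be the area nearest the vehicle), so that subsequent
    detection stages process fewer pixels, per variation (3f)."""
    h = bird_eye.shape[0]
    return bird_eye[int(h * (1.0 - fraction)):, ...]
```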
 (3g) In the above embodiment, four cameras, namely a front camera, a left side camera, a right side camera, and a rear camera, are used as the imaging device 2; however, the imaging device is not limited to this configuration, and one or more cameras may be used.
 (3h) A plurality of functions of one constituent element in the above embodiment may be realized by a plurality of constituent elements, or a single function of one constituent element may be realized by a plurality of constituent elements. Likewise, a plurality of functions of a plurality of constituent elements may be realized by one constituent element, or a single function realized by a plurality of constituent elements may be realized by one constituent element. Part of the configuration of the above embodiment may be omitted, and at least part of the configuration of the above embodiment may be added to or substituted for the configuration of another embodiment. All aspects included in the technical idea specified by the wording of the claims are embodiments of the present disclosure.
 (3i) Besides the image processing apparatus 1 described above, the present disclosure can also be realized in various forms, such as a system including the image processing apparatus 1 as a constituent element, a program for causing a computer to execute the image processing described above, a non-transitory tangible recording medium such as a semiconductor memory storing the program, and an image processing method.

Claims (6)

  1.  An image processing device (1) comprising:
      an image acquisition unit (1, S1) configured to acquire a captured image from an imaging device (2) that captures the periphery of a vehicle;
      a detection unit (1, S3) configured to detect, from the captured image acquired by the image acquisition unit, a lane marker (30) that defines a travel path of the vehicle; and
      a display processing unit (1, S11 to S15, S41 to S45, S5) configured to cause a display device (3) to display a superimposed image, which is the captured image on which a marker image (31) for assisting visual recognition of the lane marker is superimposed so as to overlap the lane marker detected by the detection unit.
  2.  The image processing device according to claim 1, further comprising:
      a determination unit (1, S16 to S22, S46, S49, S50) configured to determine a display mode of the marker image,
      wherein the display processing unit is configured to cause the display device to display the superimposed image based on the display mode determined by the determination unit.
  3.  The image processing device according to claim 2, wherein the determination unit (1, S16 to S19, S46, S49) is configured to calculate a distance between the vehicle and the lane marker and to change the display mode of the marker image according to the calculated distance.
  4.  The image processing device according to claim 3, wherein the determination unit is configured to change the display mode of the marker image such that the width of the marker image when the calculated distance is equal to or less than a predetermined value is wider than the width of the marker image when the calculated distance exceeds the predetermined value.
  5.  The image processing device according to any one of claims 2 to 4, wherein the determination unit (1, S20 to S22, S50) is configured to change the display mode of the marker image according to the type of the lane marker detected by the detection unit.
  6.  The image processing device according to any one of claims 1 to 5, further comprising:
      a judgment unit (1, S47, S48) configured to judge whether the vehicle is in a state of changing lanes,
      wherein the display processing unit is configured not to display the marker image superimposed on the lane marker that the vehicle is expected to cross when the judgment unit judges that the vehicle is in a state of changing lanes.
PCT/JP2017/035816 2016-10-07 2017-10-02 Image processing device WO2018066510A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
DE112017005114.2T DE112017005114T5 (en) 2016-10-07 2017-10-02 Image processing device
CN201780061868.4A CN109844801A (en) 2016-10-07 2017-10-02 Image processing apparatus
US16/339,613 US10926700B2 (en) 2016-10-07 2017-10-02 Image processing device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-199184 2016-10-07
JP2016199184A JP6607163B2 (en) 2016-10-07 2016-10-07 Image processing device

Publications (1)

Publication Number Publication Date
WO2018066510A1 true WO2018066510A1 (en) 2018-04-12

Family

ID=61831831

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/035816 WO2018066510A1 (en) 2016-10-07 2017-10-02 Image processing device

Country Status (5)

Country Link
US (1) US10926700B2 (en)
JP (1) JP6607163B2 (en)
CN (1) CN109844801A (en)
DE (1) DE112017005114T5 (en)
WO (1) WO2018066510A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11341615B2 (en) * 2017-09-01 2022-05-24 Sony Corporation Image processing apparatus, image processing method, and moving body to remove noise in a distance image

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06253311A (en) * 1993-02-22 1994-09-09 Isuzu Motors Ltd Lane deviation warning device
JP2006350617A (en) * 2005-06-15 2006-12-28 Denso Corp Vehicle driving support apparatus
JP2007512636A (en) * 2003-12-01 2007-05-17 ボルボ テクノロジー コーポレイション Method and system for supporting route control

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4637408B2 (en) 2001-06-20 2011-02-23 株式会社デンソー Vehicle periphery image processing apparatus and recording medium
US7526103B2 (en) * 2004-04-15 2009-04-28 Donnelly Corporation Imaging system for vehicle
US8559675B2 (en) * 2009-04-23 2013-10-15 Panasonic Corporation Driving support device, driving support method, and program
JP6142784B2 (en) * 2013-11-27 2017-06-07 株式会社デンソー Driving assistance device
JP6084598B2 (en) * 2014-11-17 2017-02-22 本田技研工業株式会社 Sign information display system and sign information display method
JP2016199184A (en) 2015-04-13 2016-12-01 株式会社タンクテック Tank lorry


Also Published As

Publication number Publication date
US20200039437A1 (en) 2020-02-06
JP6607163B2 (en) 2019-11-20
US10926700B2 (en) 2021-02-23
DE112017005114T5 (en) 2019-06-27
CN109844801A (en) 2019-06-04
JP2018060459A (en) 2018-04-12

Similar Documents

Publication Publication Date Title
WO2016002163A1 (en) Image display device and image display method
EP3125212B1 (en) Vehicle warning device
CN107848416B (en) Display control device, display device, and display control method
JP5966965B2 (en) Lane boundary departure control device and lane boundary departure suppression method
JP6307895B2 (en) Vehicle periphery monitoring device
CN106470877B (en) Vehicle display device and vehicle display method
JP6330908B2 (en) Display device for vehicle and display method for vehicle
JP6457278B2 (en) Object detection apparatus and object detection method
US20110228980A1 (en) Control apparatus and vehicle surrounding monitoring apparatus
EP2928182B1 (en) Onboard image processing system
JP6586849B2 (en) Information display device and information display method
JP2009116723A (en) Lane change support system
WO2012141053A1 (en) Image processing device
JP2009226978A (en) Vehicular circumference monitoring device
JP2018097431A (en) Driving support apparatus, driving support system and driving support method
JP6152261B2 (en) Car parking frame recognition device
US10740909B2 (en) Drive recorder
JP4986070B2 (en) Ambient monitoring device for vehicles
JP2014106739A (en) In-vehicle image processing device
WO2018066510A1 (en) Image processing device
JP6032141B2 (en) Travel road marking detection device and travel road marking detection method
JP2006350571A (en) Lane detection device
JP2008042759A (en) Image processing apparatus
JP2019140518A (en) Image generating apparatus
JP2008230358A (en) Display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17858348

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17858348

Country of ref document: EP

Kind code of ref document: A1