JP4882571B2 - Vehicle monitoring device - Google Patents

Vehicle monitoring device

Info

Publication number
JP4882571B2
Authority
JP
Japan
Prior art keywords
image
vehicle
monitoring device
background
object
Prior art date
Legal status
Expired - Fee Related
Application number
JP2006198307A
Other languages
Japanese (ja)
Other versions
JP2008027138A (en)
Inventor
勉 川野
柳  拓良
達美 柳井
壮一郎 森
政美 舟川
Original Assignee
Nissan Motor Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Nissan Motor Co., Ltd.
Priority to JP2006198307A
Publication of JP2008027138A
Application granted
Publication of JP4882571B2
Application status is Expired - Fee Related
Anticipated expiration

Description

  The present invention relates to a vehicle monitoring apparatus that displays an image around a vehicle on a monitor.

  Conventionally, an apparatus has been known that displays an image of the surroundings captured by a camera on a monitor so that the area behind or to the rear side of the vehicle can be checked (see, for example, Patent Document 1). In that apparatus, the object to be monitored is emphasized on the monitor so that the observer can recognize it more easily.

[Patent Document 1] JP-A-9-48282

  However, because the surrounding image captured by the camera is displayed on the monitor reduced to a field of view narrower than the actual one, a large amount of visual information is packed into a small area. Even if the object to be noted is emphasized as in the conventional apparatus, the total amount of information does not change, and this excess information hinders recognition and judgment of the object.

The present invention is applied to a vehicle monitoring device that displays, on a display device, an image of the area behind and to the rear side of the vehicle captured by an imaging device. The device comprises dividing means that receives a frame image captured by the imaging device and divides it into target image regions, each containing a target object that is a candidate for monitoring, and a background image region obtained by removing the target image regions from the frame image; and image processing means that applies a blurring process to those target objects which, based on their distance and relative speed with respect to the host vehicle, are determined not to be approaching the host vehicle, and to the background above the horizon in the background image region.

  According to the present invention, the amount of unnecessary information in the displayed image is reduced, so that objects to be noted, such as an approaching moving body, are recognized and judged more easily.

  Hereinafter, the best mode for carrying out the present invention will be described with reference to the drawings. FIG. 1 is a block diagram showing an embodiment of a vehicle monitoring apparatus according to the present invention. The monitoring apparatus of this embodiment includes a camera 1, a video processing unit 2, and a monitor 3. The camera 1 is an imaging apparatus using an image sensor such as a CCD; it is installed at approximately the center of the rear of the host vehicle and continuously captures the view behind the vehicle.

  Images captured continuously by the camera 1 are taken into the video processing unit 2 frame by frame, each frame covering a predetermined time interval. The video processing unit 2 includes a CPU that performs the image processing and ROM and RAM that store the control programs and various data; the RAM temporarily stores the input image data and the output image data to be displayed on the monitor 3. The captured image is subjected to predetermined video processing in the video processing unit 2 and sent to the monitor 3, where the processed rear-view video is displayed as the monitoring image. The video processing unit 2 also receives vehicle information on the state of the host vehicle and surrounding information on the situation around the host vehicle.

  The vehicle information includes the vehicle speed and lateral acceleration from the vehicle speed sensor and acceleration sensor, and navigation information (the current position of the host vehicle and the shape of the surrounding roads) from the navigation device. The surrounding information includes position and speed information on surrounding vehicles obtained from radar and sonar mounted on the vehicle. Radar provides the position and speed of relatively distant vehicles, while sonar provides the position and relative speed of nearby vehicles (for example, a parked vehicle being passed or a following motorcycle).

  FIG. 2 shows an outline of the processing procedure performed in the video processing unit 2; this processing starts when the vehicle monitoring device is turned on. In step S1, the image data of the latest frame is captured from the camera 1. FIG. 3 shows an example of the captured image data, in which a following vehicle, a stopped vehicle, the road surface, buildings, the sky, and so on appear. In step S2, the image is divided into a plurality of regions by edge detection, optic flow detection, detection of areas of continuous identical color, and the like. Details of the region extraction processing used in this division are described later.

  In step S3, image processing such as blurring is performed on each divided region. The details are described later, but for each divided region it is determined whether or not it should be blurred, and the blurring process is applied to the regions so determined. In step S4, the divided regions are integrated to create the display image data, and in step S5 the integrated image data is output to the monitor. In step S6, it is determined whether or not the power switch is off; if it is on, the process returns to step S1, and if it is off, the series of processing ends.
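As an illustration of this loop, the following is a minimal sketch in Python with OpenCV. It assumes a fixed horizon row and stands in for the full division of steps S2 to S4 with a single operation that blurs everything above the horizon; the function names, the horizon row, and the kernel size are illustrative assumptions, not part of the embodiment.

```python
# Minimal, self-contained sketch of the FIG. 2 loop (steps S1-S6).
# The "blur everything above an assumed horizon row" step is a stand-in
# for the full region division and per-region processing of steps S2-S4.
import cv2

HORIZON_ROW = 240          # assumed pixel row of the horizon in a 640x480 image

def process_frame(frame):
    out = frame.copy()
    # Stand-in for steps S2-S4: blur only the background above the horizon.
    out[:HORIZON_ROW] = cv2.blur(frame[:HORIZON_ROW], (15, 15))
    return out

def monitoring_loop(camera_index=0):
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:                                   # S6: run until the user quits
            ok, frame = cap.read()                    # S1: capture the latest frame
            if not ok:
                break
            display = process_frame(frame)            # S2-S4: divide, blur, integrate
            cv2.imshow("monitor", display)            # S5: output to the monitor
            if cv2.waitKey(1) & 0xFF == ord('q'):     # stand-in for the power switch
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()
```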

[Description of area extraction processing]
Next, the outline of the extraction processing used in the division step will be described with reference to FIGS. 4 and 5. Here, to simplify the description, the image shown in FIG. 4 is used instead of the image shown in FIG. 3. In the extraction processing, the video processing unit 2 captures the latest image 1 of frame N (FIG. 4(b)) and the image 2 of the immediately preceding frame N-1 (FIG. 4(a)).

  Both the target object 10, which is a candidate for monitoring, and the remaining background appear in the images of frames N-1 and N shown in FIGS. 4(a) and 4(b). The background includes a building 20, a utility pole 21, a white line 22 on the road, and so on. As shown by the broken lines in FIG. 4(b), the building 20, the utility pole 21, and the white line 22 in image 1 have moved rearward relative to image 2 as time has passed. The object 10 is a vehicle approaching the host vehicle and moves forward on the screen.

  Next, the optic flow is detected. The optic flow represents the motion of an object as a vector: typically, the direction of movement of a point on the object and the magnitude of its velocity are indicated simultaneously by the direction and length of an arrow. To detect the optic flow, feature points of image 1 and image 2 are first extracted, for example using the corner detection algorithm described in C. Harris and M. Stephens, "A combined corner and edge detector," Proc. Alvey Vision Conference, pp. 147-151, 1988.

  Once the feature points of images 1 and 2 have been extracted, matching between the feature points of image 1 and those of image 2 is performed, and for each feature point of image 1 the corresponding feature point of image 2 is detected. For example, for a feature point in image 1, the feature point with the closest color is selected from among the feature points of image 2 that lie in its vicinity and whose color difference is smaller than a predetermined threshold.
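For illustration, a sketch of this feature extraction and matching in Python with OpenCV follows. The embodiment describes Harris-type corner detection and colour-based nearest-neighbour matching; the sketch substitutes OpenCV's Shi-Tomasi corner detector (goodFeaturesToTrack) and pyramidal Lucas-Kanade tracking (calcOpticalFlowPyrLK) as off-the-shelf stand-ins, and all parameter values are illustrative.

```python
# Sketch of feature extraction and matching between frame N-1 (image 2) and
# frame N (image 1), yielding per-feature displacement vectors (the optic flow).
import cv2
import numpy as np

def optic_flow_vectors(img_prev, img_curr, max_corners=200):
    gray_prev = cv2.cvtColor(img_prev, cv2.COLOR_BGR2GRAY)
    gray_curr = cv2.cvtColor(img_curr, cv2.COLOR_BGR2GRAY)
    # Extract feature points (corners) from the previous frame (image 2).
    pts_prev = cv2.goodFeaturesToTrack(gray_prev, maxCorners=max_corners,
                                       qualityLevel=0.01, minDistance=7)
    if pts_prev is None:
        return np.empty((0, 2)), np.empty((0, 2))
    # Match each feature point to its position in the current frame (image 1).
    pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(gray_prev, gray_curr,
                                                   pts_prev, None)
    good = status.ravel() == 1
    p0 = pts_prev.reshape(-1, 2)[good]
    p1 = pts_curr.reshape(-1, 2)[good]
    return p0, p1   # the arrow from p0 (image 2) to p1 (image 1) is the optic flow
```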

  FIG. 5(a) shows the feature points of image 1 (black circles) and the corresponding feature points of image 2 (black triangles) superimposed on image 1; for clarity, only some of the many feature points are shown. For the feature points on the object 10, the building 20, and the utility pole 21, which move between images 1 and 2, the black circles and black triangles are separated, whereas for the feature points in the sky region indicated by the arrow R the black circles and black triangles lie in almost the same positions. For ease of understanding, broken lines similar to those in FIG. 4(b) are also shown in FIG. 5(a).

  An optic flow is obtained by drawing an arrow from a feature point of image 2 to the corresponding feature point of image 1. FIG. 5(b) shows part of the optic flow. The optic flows OF1, OF2, and OF3 of the building 20, the utility pole 21, and the white line 22 on the road are arrows pointing rearward, whereas the optic flow OF4 of the object 10 approaching the host vehicle from behind is an arrow directed forward (slightly downward in the drawing). In a region where the feature points of images 1 and 2 coincide, such as the sky region in FIG. 5(a), the optic flow OF5 is represented by a point. At this stage, as shown in FIG. 5(b), only the optic flow over the screen has been determined; the division into regions has not yet been performed.

  Next, edge detection is performed on image 1, using, for example, a known image processing method such as edge enhancement by differentiation. Since FIG. 4 is a simplified schematic of the actual video, the edge image after edge detection is almost identical to the original drawing. If, for example, the boundaries between the building 20 or the utility pole 21 and the ground are not detected, the edge image becomes as shown in FIG. 5(c); there, the edge of the pillar at the rear of the vehicle, where the color change is small, and the distant white line are not detected.

  This edge detection divides the video into a plurality of regions. For each region bounded by edges, the direction and amount of its movement relative to the previous frame are obtained from a representative optic flow (the movement and direction of the feature points) within the region; for example, these values are obtained from the optic flow OF4 for the object 10 and from the optic flow OF1 for the building 20.
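A hedged sketch of this step follows: edges split the frame into connected regions, and each region's motion is summarised by the mean of the flow vectors that start inside it. The Canny thresholds and the use of connected components are illustrative choices, not taken from the embodiment.

```python
# Sketch of edge-based division and per-region representative flow.
import cv2
import numpy as np

def divide_and_summarise(img_curr, p0, p1):
    gray = cv2.cvtColor(img_curr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                        # edge detection
    # Regions are the connected areas that the edges do not cut through.
    num_labels, labels = cv2.connectedComponents(255 - edges)
    flow = p1 - p0                                          # per-feature motion vectors
    summary = {}
    for lab in range(1, num_labels):
        mask = labels == lab
        # Collect the flow vectors whose start points fall inside this region.
        xs = p0.astype(int)
        inside = mask[np.clip(xs[:, 1], 0, mask.shape[0] - 1),
                      np.clip(xs[:, 0], 0, mask.shape[1] - 1)]
        if inside.any():
            summary[lab] = flow[inside].mean(axis=0)        # representative direction/amount
        else:
            summary[lab] = np.zeros(2)                      # no features: handled separately
    return labels, summary
```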

  As shown in FIGS. 5(a) and 5(b), the feature points of the sky region do not move, and the sky is detected as a continuous region in which such non-moving feature points exist. When a divided region contains almost no feature points, attention is instead paid to the direction and amount of movement of the feature points on the edges surrounding it; for example, in a region with no color variation (such as the white line 22 on the road), the feature points concentrate on the edges. Note that in a region with little variation, such as the road surface, the feature point matching between images 1 and 2 may fail, so that some feature points are incorrectly regarded as not moving; such points are included in part of the region.

  In this way, edge detection and optic flow detection identify the object 10 as an object to be monitored. The remaining regions are divided into those whose feature points move toward the vanishing point FOC (Focus Of Compression) near the center of the screen, as shown by the optic flows OF1 to OF3, and those whose feature points do not move, such as the sky region. The regions other than the object 10 are further divided into a background A below the horizon and a background B above the horizon; the sky region is included in background B. From the tendency of the optic flows OF1 to OF3, it can be seen that the direction of background motion during traveling is as shown in FIG. 6.
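A rough sketch of this three-way split follows, assuming a known horizon row: a region whose representative flow points away from the vanishing point (forward on the screen) is treated as a target-object candidate, other regions below the horizon as background A, and other regions above it as background B. The horizon row and the flow test are illustrative simplifications.

```python
# Sketch of the split into target objects, background A (below the horizon)
# and background B (above the horizon).
import numpy as np

def classify_regions(region_flows, region_centroids, foc_xy, horizon_row):
    classes = {}
    for label, flow in region_flows.items():
        cx, cy = region_centroids[label]
        to_foc = np.array([foc_xy[0] - cx, foc_xy[1] - cy])
        # In a rear view, the stationary background flows toward FOC; an object
        # approaching the host vehicle moves away from FOC (forward on screen).
        if np.dot(flow, to_foc) < 0 and np.linalg.norm(flow) > 1.0:
            classes[label] = "object"
        elif cy >= horizon_row:
            classes[label] = "background_A"   # below the horizon (road surface)
        else:
            classes[label] = "background_B"   # above the horizon (sky, buildings)
    return classes
```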

[Explanation of blur processing]
By the extraction process described above, the video shown in FIG. 3 is divided into five regions, as shown in FIGS. 7 and 8. FIGS. 7(a) to 7(c) show the divided regions for the three objects 11, 12, and 13, and FIGS. 8(a) and 8(b) show the divided regions for backgrounds A and B. In step S3 of FIG. 2, image processing including blurring is performed on the images of these divided regions.

  FIG. 9 is a diagram explaining the outline of the image processing for each divided region and shows the flow of that processing. First, the processing of the images of the objects 11 to 13 is described. The reason image processing is performed for each divided region in this embodiment is to draw attention to a vehicle (target object) approaching the host vehicle from behind. Therefore, when processing the objects 11 to 13, the processing is determined from their distance and relative speed with respect to the host vehicle. These distances and relative speeds may be obtained from the radar or sonar mounted on the vehicle, or may be estimated from the detected optic flows.

  In the case of the object 11, which is a moving body that is faster than the host vehicle and is rapidly approaching it, emphasis processing is performed so that it attracts attention easily; for example, image processing that increases its contrast. The optic flow of the object 11 points in the direction of travel, like the optic flow OF4 in FIG. 5(b), and the object is recognized as an approaching moving body from this flow.

  In the case of the object 12, it is determined from its distance and relative speed that it is a vehicle parked on the side of the road, and it is estimated not to require attention. The optic flow of the stationary object 12 is the same as that of the background, like the optic flows OF1 and OF2 of the building 20 and the utility pole 21 in FIG. 5(b). The blurring process is thus applied to a stationary object such as the object 12, and also to a moving vehicle that travels slowly or is moving away from the host vehicle.

  When performing the blurring process, the blur direction is set to the direction of the optic flow of the object 12, and the blur amount (degree of blurring) is increased as the relative speed increases. When blurring along the optic flow, the blur is produced by averaging pixels along the flow. The blur amount may also be increased when the object is close and decreased when it appears small. As a result, the blurred image looks natural for the traveling speed, and the visibility of the regions that are not blurred improves.

  The blurring process averages each pixel in the target region with its surrounding pixels within the same frame. The blur amount is adjusted by how many pixels away the averaging extends, and the blur direction is changed by choosing the direction in which pixels are averaged; the weights used in the averaging may also be varied. The blurring could instead use image data from different frames, but by performing spatial averaging within a single frame, an easy-to-view video is obtained without introducing delay.
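The following sketch illustrates such an in-frame directional averaging: a line-shaped averaging kernel is oriented along the optic flow vector, and its length (the blur amount) grows with the relative speed. The mapping from speed to kernel length is an illustrative assumption.

```python
# Sketch of directional blur along the optic flow within a single frame.
import cv2
import numpy as np

def directional_blur(img, flow_vec, relative_speed, max_len=31):
    length = int(np.clip(3 + relative_speed, 3, max_len)) | 1   # odd kernel length
    kernel = np.zeros((length, length), np.float32)
    angle = np.arctan2(flow_vec[1], flow_vec[0])
    c = (length - 1) / 2
    dx, dy = np.cos(angle), np.sin(angle)
    # Draw a 1-pixel line through the kernel centre in the flow direction.
    pt1 = (int(round(c - c * dx)), int(round(c - c * dy)))
    pt2 = (int(round(c + c * dx)), int(round(c + c * dy)))
    cv2.line(kernel, pt1, pt2, 1.0, 1)
    kernel /= kernel.sum()                                      # averaging weights
    return cv2.filter2D(img, -1, kernel)
```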

  In the case of the object 13, which is relatively close and gradually approaching, it is judged to be an object that should be noted and is not blurred. Even without applying emphasis processing to such an object, blurring its surroundings makes it stand out, giving much the same effect as emphasis.
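The per-object decisions described for the objects 11 to 13 can be summarised roughly as in the sketch below; the thresholds and the sign convention for the closing speed are illustrative assumptions rather than values from the embodiment.

```python
# Sketch of the per-object decision: emphasise / blur / leave untouched.
def decide_processing(distance_m, closing_speed_mps):
    # closing_speed > 0 means the object is getting nearer to the host vehicle.
    if closing_speed_mps <= 0:
        return "blur"        # stationary in the scene or moving away (object 12)
    if closing_speed_mps > 3.0 and distance_m < 30.0:
        return "emphasise"   # fast, close approach: e.g. raise contrast (object 11)
    return "none"            # gradually approaching: leave sharp (object 13)
```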

  In the image of background A, the background below the horizon, most of the area is road surface. Because a uniformly textured road surface carries little information and does not interfere with object recognition, it is left unblurred so that the image remains natural. However, when white lines 22 are present on the road surface, they change periodically as they move rearward; to suppress this unnecessary stimulus and reduce the observer's recognition load, blurring is applied along the direction in which the white lines are arranged. The periodically spaced white lines then appear as a single continuous line, and the annoyance of the periodic change is reduced. Although the white line on the road surface has been described here, lighting inside a tunnel, for example, produces a similar periodic image and is blurred in the same way.

  For the image of background B, the background above the horizon, blurring is performed by the following procedure. As the vehicle travels, buildings and other scenery in background B shrink while moving toward the vanishing point FOC shown in FIG. 6. The optic flow of FIG. 6 is called a global flow because it represents the motion pattern of the entire stationary background that results from the travel of the host vehicle. Background B is blurred on the basis of this global flow pattern.

  A plurality of global flow patterns corresponding to different traveling patterns are stored in advance in the memory (RAM or ROM) of the video processing unit 2, and one of them is selected for use according to the vehicle information describing the current traveling state. The global flow pattern shown in FIG. 9 corresponds to straight running; other patterns exist for curve running (right curve, left curve) and so on. The vehicle information includes the vehicle speed, the steering angle, the lateral acceleration from the acceleration sensor, and the like. When the navigation device is operating, the global flow pattern may also be selected by estimating the curvature of the road from the navigation map information.
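A minimal sketch of this pattern selection follows, assuming the stored patterns are keyed by driving state and that the steering angle alone distinguishes straight running from right and left curves; the pattern names and the threshold are illustrative assumptions.

```python
# Sketch of selecting a pre-stored global-flow pattern from the vehicle state.
def select_global_flow_pattern(patterns, steering_angle_deg, threshold_deg=5.0):
    if steering_angle_deg > threshold_deg:
        return patterns["right_curve"]
    if steering_angle_deg < -threshold_deg:
        return patterns["left_curve"]
    return patterns["straight"]
```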

  Once a global flow pattern has been selected, it is corrected to match the current state on the basis of the detected optic flow, and blurring is performed using the corrected pattern; the correction also takes the camera orientation and the vehicle body posture into account. The blur direction follows the flow direction of the global flow pattern, and the blur amount increases with distance from the vanishing point FOC. That is, the blur amount is set according to a concentric blur pattern as shown in the figure, in which the numbers represent the degree of blurring. The blur amounts of the pattern may also be changed according to the speed of the host vehicle; for example, at high vehicle speed the levels 1, 2, 3, 4 are raised to 2, 3, 4, 5, and so on.
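As an illustration of the concentric blur amounts, the sketch below builds a per-pixel blur-level map that grows with distance from the vanishing point FOC and is shifted up by one level at high vehicle speed; the four ring levels and the 60 km/h threshold are illustrative assumptions.

```python
# Sketch of a concentric blur-amount map centred on the vanishing point FOC.
import numpy as np

def blur_amount_map(shape, foc_xy, vehicle_speed_kmh):
    h, w = shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(xs - foc_xy[0], ys - foc_xy[1])
    # Quantise distance into concentric rings with blur levels 1..4.
    levels = np.clip((4 * dist / (dist.max() or 1.0)).astype(int) + 1, 1, 4)
    if vehicle_speed_kmh > 60:          # higher speed: shift every level up by one
        levels = levels + 1
    return levels
```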

  When image processing has been performed on all the divided regions in this way, they are integrated into display data as described above and output to the monitor 3. FIG. 10 shows the integrated video. Thanks to the appropriate blurring, the objects to be noted, objects 11 and 13 (see FIG. 7), are conspicuous at a glance. In addition, because the surrounding background and the parked vehicle (object 12) near object 13 are blurred, the amount of unnecessary information is reduced and the image is easy to view. When the host vehicle is stopped, the blurring described above is not applied, and the captured image is displayed as it is.

As described above, the vehicle monitoring apparatus according to the present embodiment can provide the following operational effects.
(1) The captured image is acquired frame by frame, and the dividing unit divides it, on the basis of changes between frames, into divided image areas 11 to 13 containing moving bodies and divided image areas A and B containing the background. Because the image processing unit blurs the divided image area 12 containing a moving body that need not be noted and the divided image area B containing the background, the amount of information is reduced and a display image with high visibility is obtained.
(2) Because the blurring is performed by averaging each pixel to be blurred with its surrounding pixels within the same frame, an easily recognizable image is obtained while the delay of the video is kept to a minimum.
(3) By increasing the degree of blur as the relative speed of a receding moving body or background increases, or by changing it according to their distance, a natural image can be realized that matches the observer's perception as it changes with the driving situation. Furthermore, by increasing the degree of blur as the speed of the host vehicle increases, the annoyance of images such as parked vehicles and roadside signs can be reduced.
(4) By dividing the image, on the basis of the optic flow, into divided image areas containing moving bodies and divided image areas containing the background, appropriate division can be performed more easily, and the blurring of the background also becomes simple.
(5) By averaging each pixel to be blurred with its surrounding pixels along the direction of the optic flow, so that the blur follows the optic flow, or by increasing the degree of blur with distance from the vanishing point of the optic flow, the natural blur caused by the vehicle's movement can be reproduced with a small amount of computation, yielding an image with high visibility and little sense of incongruity.

[Brief description of the drawings]
FIG. 1 is a block diagram showing an embodiment of a vehicle monitoring apparatus according to the present invention.
FIG. 2 is a flowchart illustrating the outline of the processing procedure performed in the video processing unit 2.
FIG. 3 is a diagram showing an example of the captured image data.
FIG. 4 is a diagram explaining the region extraction processing: (a) shows image 2 and (b) shows image 1.
FIG. 5 is a diagram explaining the region extraction processing: (a) shows the feature points, (b) shows the optic flow, and (c) shows the edge image.
FIG. 6 is a diagram showing a global flow pattern.
FIG. 7 is a diagram showing divided regions for the video of FIG. 3: (a) to (c) show the divided regions for the objects 11 to 13, respectively.
FIG. 8 is a diagram showing divided regions for the video of FIG. 3: (a) shows the divided region for background B and (b) shows the divided region for background A.
FIG. 9 is a diagram explaining the outline of the image processing for each divided region.
FIG. 10 is a diagram showing the integrated video.

Explanation of symbols

  1: Camera, 2: Video processing unit, 3: Monitor, 10: Object (moving body), 20: Building, 21: Utility pole, 22: White line, OF1 to OF5: Optic flow, FOC: Vanishing point

Claims (10)

  1. A vehicle monitoring device that displays, on a display device, a captured image of the area behind the vehicle taken by an imaging device, comprising:
    dividing means that receives a frame image captured by the imaging device and divides the frame image into target image regions, each including a target object that is a candidate for monitoring, and a background image region obtained by removing the target image regions from the frame image; and
    image processing means that performs a blurring process on those target objects which, among the target image regions divided by the dividing means, are determined not to be approaching the host vehicle on the basis of the distance and relative speed between the target object included in each target image region and the host vehicle, and on the background above the horizon in the background image region.
  2. The vehicle monitoring device according to claim 1,
    wherein the image processing means performs the blurring process by averaging a pixel to be blurred and its surrounding pixels within the same frame.
  3. The vehicle monitoring device according to claim 1 or 2,
    wherein the image processing means increases the degree of blurring, for an object moving away from the host vehicle among the target objects and for the background above the horizon in the background image region, as the relative speed between that moving body or background and the host vehicle increases.
  4. The vehicle monitoring device according to any one of claims 1 to 3,
    wherein the image processing means increases the degree of blurring of an image located farther from the vanishing point, with respect to the background above the horizon excluding objects behind the host vehicle in the frame image.
  5. The vehicle monitoring device according to any one of claims 1 to 4,
    wherein the image processing means increases the degree of blurring as the speed of the host vehicle increases, and stops the blurring process when the host vehicle is stopped.
  6. The vehicle monitoring device according to any one of claims 1 to 5, further comprising:
    computation means for calculating an optic flow based on the captured image,
    wherein the dividing means divides the frame image, on the basis of the optic flow, into the target image regions each including a target object that is a candidate for monitoring and the background image region obtained by removing the target image regions from the frame image.
  7. The vehicle monitoring device according to claim 6,
    wherein the image processing means averages a pixel to be blurred and its surrounding pixels along the direction of the optic flow, so that the blurring is performed in the direction of the optic flow.
  8. The vehicle monitoring device according to claim 6 or 7,
    wherein the image processing means increases the degree of blur as the distance from the vanishing point of the optic flow increases.
  9. The vehicle monitoring device according to any one of claims 1 to 8,
    wherein, with respect to the background below the horizon excluding objects behind the host vehicle in the frame image, the image processing means performs the blurring process on an image that changes periodically while moving toward the rear of the host vehicle, and does not apply the blurring process to other images.
  10. The vehicle monitoring device according to claim 9,
    wherein, as images that change periodically while moving toward the rear of the host vehicle, the image processing means blurs a white line on the road surface and lighting inside a tunnel along the direction in which they extend.
JP2006198307A 2006-07-20 2006-07-20 Vehicle monitoring device Expired - Fee Related JP4882571B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006198307A JP4882571B2 (en) 2006-07-20 2006-07-20 Vehicle monitoring device


Publications (2)

Publication Number Publication Date
JP2008027138A JP2008027138A (en) 2008-02-07
JP4882571B2 2012-02-22

Family

ID=39117711

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2006198307A Expired - Fee Related JP4882571B2 (en) 2006-07-20 2006-07-20 Vehicle monitoring device

Country Status (1)

Country Link
JP (1) JP4882571B2 (en)


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3298851B2 (en) * 1999-08-18 2002-07-08 松下電器産業株式会社 Image display method of the multi-function onboard camera system and a multi-function onboard camera
JP3941926B2 (en) * 2002-03-12 2007-07-11 松下電器産業株式会社 Vehicle periphery monitoring device
JP3776094B2 (en) * 2002-05-09 2006-05-17 松下電器産業株式会社 Monitoring device, monitoring method and monitoring program
JP3897305B2 (en) * 2004-02-06 2007-03-22 シャープ株式会社 Vehicle periphery monitoring device, vehicle periphery monitoring method, control program, and readable recording medium
JP2005346177A (en) * 2004-05-31 2005-12-15 Nissan Motor Co Ltd Information presenting device for vehicle

Also Published As

Publication number Publication date
JP2008027138A (en) 2008-02-07


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20090326

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110524

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110707

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110823

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20111019

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20111108

A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20111121

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20141216

Year of fee payment: 3

LAPS Cancellation because of no payment of annual fees