WO2011108217A1 - Vehicle periphery monitoring device - Google Patents
Vehicle periphery monitoring device
- Publication number
- WO2011108217A1 (PCT/JP2011/000948)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- vehicle
- display
- display device
- display image
- Prior art date
Links
- 238000012806 monitoring device Methods 0.000 title claims description 9
- 238000003384 imaging method Methods 0.000 claims abstract description 15
- 238000001514 detection method Methods 0.000 claims description 9
- 238000012544 monitoring process Methods 0.000 claims description 5
- 238000000034 method Methods 0.000 description 41
- 230000008569 process Effects 0.000 description 35
- 238000012545 processing Methods 0.000 description 19
- 238000013459 approach Methods 0.000 description 10
- 241001465754 Metazoa Species 0.000 description 7
- 230000005484 gravity Effects 0.000 description 7
- 238000004364 calculation method Methods 0.000 description 6
- 238000012937 correction Methods 0.000 description 5
- 238000005070 sampling Methods 0.000 description 5
- 230000001133 acceleration Effects 0.000 description 3
- 230000003247 decreasing effect Effects 0.000 description 3
- 238000010586 diagram Methods 0.000 description 3
- 230000008901 benefit Effects 0.000 description 2
- 238000004891 communication Methods 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 238000004519 manufacturing process Methods 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 230000008859 change Effects 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 230000003111 delayed effect Effects 0.000 description 1
- 230000002250 progressing effect Effects 0.000 description 1
- 230000004044 response Effects 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/30—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing vision in the non-visible spectrum, e.g. night or infrared vision
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/24—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T5/94—Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
- G08G1/163—Decentralised systems, e.g. inter-vehicle communication involving continuous checking
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/304—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
- B60R2300/305—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images merging camera image with lines or icons
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/307—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing virtually distinguishing relevant parts of a scene from the background of the scene
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8033—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for pedestrian protection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8093—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/165—Anti-collision systems for passive traffic, e.g. including static obstacles, trees
Definitions
- the present invention relates to an apparatus for monitoring the periphery of a vehicle, and more specifically to an apparatus for detecting and displaying an object around the vehicle.
- A system has been proposed in which a head-up display (HUD) is provided, an object around the vehicle is detected using an infrared camera, an object existing in an approach determination area set in the traveling direction of the vehicle is highlighted in the center area of the display screen of the head-up display, and icons are displayed in the right and left areas of the display screen for objects existing in an intrusion determination area set outside the approach determination area.
- The so-called head-up display described in the above document is provided in front of the driver and has the advantage that the amount of line-of-sight movement required for the driver to visually recognize the display screen is relatively small.
- On the other hand, display devices attached to dashboards are becoming widespread.
- As navigation devices become more popular, such display devices are increasingly used to present various information, such as the map information provided by a navigation function.
- According to one aspect of the present invention, a vehicle periphery monitoring device comprises imaging means for imaging the periphery of the vehicle using a camera mounted on the vehicle,
- means for acquiring, through imaging by the imaging means, a grayscale image having luminance values according to the temperature of objects, and object detection means for detecting a predetermined object present around the vehicle from the grayscale image,
- display image generation means for generating, based on the grayscale image, a display image to be displayed on a display device mounted on the vehicle, and means for displaying the generated display image on the display device.
- the display image generation means generates the display image by reducing the luminance of a region other than the detected object in the grayscale image.
- The display image is generated by reducing the luminance of the region other than the object in the grayscale image, and this image is displayed on the display device. An image with high contrast, as if a spotlight were applied only to the object, is thus shown on the display device, and the driver who views the display image can instantly recognize the presence of the object.
- According to one embodiment of the present invention, the display device is provided at a position that is visible to the driver of the vehicle and that is separated by a predetermined distance in the vehicle width direction from a line that passes through the center of rotation of the steering wheel of the vehicle and extends in the front-rear direction of the vehicle.
- Such a display device is not a HUD with a small amount of line-of-sight movement as described above, but a general display device attached, for example, to the left or right of the steering wheel, and the use of such devices is growing. According to the present invention, the time required for the driver to recognize the object can be shortened by the high-contrast display image described above, so that even with such a general display device the driver can instantly recognize the object.
- According to one embodiment of the present invention, the object detection unit determines whether or not the vehicle is highly likely to collide with the object.
- The display image generation unit generates the display image, in which the luminance of the region other than the detected object is reduced, when it is determined that the vehicle is highly likely to collide with the object.
- a high contrast display image is generated as if the object is spotlighted.
- the driver can instantly recognize the presence of an object.
- According to one embodiment of the present invention, the display image generation unit further superimposes an image simulating the object at the position where the object exists in the grayscale image, and the display unit displays the superimposed display image on the display device.
- Since the pseudo image is superimposed and displayed at the position of the object in the grayscale image obtained by imaging the periphery of the vehicle, the driver's attention can be directed to the position where the pseudo image is superimposed.
- In addition, since the image of the object itself is hidden by the pseudo image, the driver can be prompted to watch the road ahead.
- the display device is a display device of a navigation device.
- the display device of the navigation device can be effectively used to notify the driver of the presence of an object existing around the vehicle.
- A block diagram showing the configuration of a vehicle periphery monitoring apparatus according to one embodiment of the present invention.
- A flowchart illustrating the process in the image processing unit according to an embodiment of the present invention.
- A flowchart illustrating the alarm determination process according to an embodiment of the present invention.
- A figure for explaining the generation of a display image according to one embodiment of the present invention.
- A figure showing an example of the display image according to one embodiment of the present invention.
- A flowchart illustrating the alarm determination process according to another embodiment of the present invention.
- A figure for explaining the generation of a display image according to another embodiment of the present invention.
- FIG. 1 is a block diagram showing a configuration of a vehicle periphery monitoring device using a display device of a navigation device according to an embodiment of the present invention
- FIG. 2 shows attachment of the display device and a camera to the vehicle.
- the vehicle is equipped with a navigation device, and the navigation device includes a navigation unit 5 and a display device 4.
- The display device 4 is attached so as to be visible to the driver, at a position separated by a predetermined distance in the vehicle width direction from a line that passes through the center of the vehicle steering wheel 21 and extends in the front-rear direction of the vehicle.
- the display device 4 is embedded in the dashboard 23 of the vehicle.
- the navigation unit 5 is realized by a computer having a central processing unit (CPU) and a memory.
- The navigation unit 5 receives, via a communication device (not shown) provided in the navigation unit 5, a GPS signal for measuring the position of the vehicle 10 using artificial satellites, and detects the current position of the vehicle 10 based on the GPS signal.
- The navigation unit 5 superimposes the current position on map information around the vehicle (which can be stored in a storage device of the navigation device or received from a predetermined server via the communication device) and displays it on the display screen 25 of the display device 4.
- The display screen 25 of the display device 4 constitutes a touch panel, and the occupant can input a destination to the navigation unit 5 via the touch panel or another input device 27 such as keys or buttons.
- the navigation unit 5 can calculate the optimal route of the vehicle to the destination, superimpose an image showing the optimal route on the map information, and display it on the display screen 25 of the display device 4.
- navigation devices are equipped with various other functions such as providing traffic information and facility guidance in the vicinity of the vehicle.
- Any appropriate navigation device can be used.
- The vehicle periphery monitoring device includes two infrared cameras 1R and 1L that are mounted on the vehicle and can detect far infrared rays, an image processing unit 2 that detects objects near the vehicle based on the image data captured by the cameras 1R and 1L, and a speaker 3 that generates an alarm by sound or voice based on the detection result.
- the display device 4 is used for displaying an image obtained through the imaging of the camera 1R or 1L and for causing the driver to recognize the presence of an object around the vehicle.
- The periphery monitoring device also includes a yaw rate sensor 6 that detects the yaw rate of the vehicle and a vehicle speed sensor 7 that detects the traveling speed (vehicle speed) of the vehicle, and the detection results of these sensors are sent to the image processing unit 2.
- The cameras 1R and 1L are arranged at the front portion of the vehicle 10, at positions symmetrical with respect to the central axis passing through the center of the vehicle width, so as to image the area in front of the vehicle 10.
- the two cameras 1R and 1L are fixed to the vehicle so that their optical axes are parallel to each other and their height from the road surface is equal.
- the infrared cameras 1R and 1L have a characteristic that the level of the output signal becomes higher (that is, the luminance in the captured image becomes higher) as the temperature of the object is higher.
- The image processing unit 2 includes an A/D conversion circuit that converts input analog signals into digital signals, an image memory that stores the digitized image signals, a central processing unit (CPU) that performs various arithmetic processing, a RAM (Random Access Memory) used by the CPU to store data during computation, a ROM (Read Only Memory) that stores the programs to be executed by the CPU and the data to be used (including tables and maps), and an output circuit that outputs driving signals for the speaker 3, display signals for the display device 4, and the like.
- the output signals of the cameras 1R and 1L are converted into digital signals and input to the CPU.
- In this embodiment, the display device 4 of the navigation device is used both to display to the driver an image obtained through imaging by the cameras 1R and 1L and to display a notification (alarm) of the presence of a predetermined object detected from that image.
- As described above, unlike a HUD, the display device 4 is provided at a position separated by a predetermined distance from the steering wheel 21 in the vehicle width direction, so the amount of line-of-sight movement needed for the driver to visually recognize the screen of the display device 4 is larger, and the time required for visual recognition is accordingly longer.
- Therefore, a display that is easier to understand than with a HUD, that is, a display that can be recognized in a shorter time, is desired.
- The present invention makes such a display possible; in short, the image is displayed in a form in which a spotlight appears to be applied to the object. The specific method is described below.
- FIG. 3 is a flowchart showing a process executed by the image processing unit 2. The process is performed at predetermined time intervals. Details of the processing of steps S11 to S23 are described in Japanese Patent Laid-Open No. 2001-6096, and will be described briefly here.
- the output signals of the cameras 1R and 1L (that is, captured image data) are received as input, A / D converted, and stored in the image memory.
- the stored image data is a gray scale image including luminance information.
- In step S14, the right image captured by the camera 1R is used as the reference image (alternatively, the left image may be used), and its image signal is binarized. Specifically, regions brighter than a luminance threshold value ITH are set to "1" (white) and darker regions are set to "0" (black). By this binarization processing, an object whose temperature is higher than a predetermined temperature, such as a living body, is extracted as a white region.
- the luminance threshold value ITH can be determined by any appropriate technique.
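- The binarization of step S14 can be sketched in a few lines; the Python snippet below is illustrative only (the numpy-based array handling and the function name are assumptions, not part of the patent).

    import numpy as np

    def binarize(gray: np.ndarray, ith: int) -> np.ndarray:
        # Pixels brighter than the luminance threshold ITH become 1 (white),
        # darker pixels become 0 (black), as described for step S14.
        return (gray > ith).astype(np.uint8)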
- In step S15, the binarized image data is converted into run-length data.
- The run-length data is represented, for each pixel row, by the coordinates of the start point of each white run (referred to as a line), that is, its leftmost pixel, and by the length (expressed in number of pixels) from the start point to the end point, that is, the rightmost pixel of the line.
- the y-axis is taken in the vertical direction in the image
- the x-axis is taken in the horizontal direction.
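- As an illustration of the run-length conversion of step S15, the sketch below scans each pixel row of the binarized image and records each white run as (y, start x, length); the data layout and function name are assumptions made for illustration.

    def to_run_length(binary):
        # binary: 2-D array-like of 0/1 values, indexed as binary[y][x].
        runs = []
        for y, row in enumerate(binary):
            x, width = 0, len(row)
            while x < width:
                if row[x]:
                    start = x
                    while x < width and row[x]:
                        x += 1
                    # start point (leftmost pixel) and length up to the run's rightmost pixel
                    runs.append((y, start, x - start))
                else:
                    x += 1
        return runs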
- In steps S16 and S17, objects are labeled and extracted. That is, among the lines converted into run-length data, lines having portions overlapping in the y direction are regarded as one object, and a label is assigned to it. In this way, one or more objects are extracted.
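- A minimal sketch of the labeling in steps S16 and S17: runs in adjacent rows whose x ranges overlap are merged into one object with a small union-find. The data layout follows the run-length sketch above and is an assumption.

    def label_runs(runs):
        parent = list(range(len(runs)))

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i

        def union(i, j):
            ri, rj = find(i), find(j)
            if ri != rj:
                parent[rj] = ri

        # Merge runs that overlap in the y direction (adjacent rows, overlapping x ranges).
        for i, (yi, xi, li) in enumerate(runs):
            for j, (yj, xj, lj) in enumerate(runs):
                if yj == yi + 1 and xi < xj + lj and xj < xi + li:
                    union(i, j)

        labels, labeled = {}, []
        for i, run in enumerate(runs):
            root = find(i)
            labels.setdefault(root, len(labels))
            labeled.append((labels[root], run))   # (object label, run)
        return labeled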
- In step S18, the center of gravity G and the area S of each extracted object, and the aspect ratio ASPECT of the rectangle circumscribing the object (the circumscribed rectangle), are calculated.
- the area S is calculated by integrating the lengths of run length data for the same object.
- the coordinates of the center of gravity G are calculated as the x coordinate of a line that bisects the area S in the x direction and the y coordinate of a line that bisects the area S in the y direction.
- the aspect ratio ASPECT is calculated as a ratio Dy / Dx between the length Dy in the y direction and the length Dx in the x direction of the circumscribed square. Note that the position of the center of gravity G may be substituted by the position of the center of gravity of the circumscribed rectangle.
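- The quantities of step S18 can be computed directly from the run-length data of one labelled object, as in the sketch below (the circumscribed-rectangle centre is used in place of the centre of gravity G, which the text explicitly allows; names are illustrative).

    def object_features(obj_runs):
        # obj_runs: list of (y, x_start, length) belonging to one labelled object.
        S = sum(length for _, _, length in obj_runs)               # area: sum of run lengths
        x_min = min(x for _, x, _ in obj_runs)
        x_max = max(x + length - 1 for _, x, length in obj_runs)
        y_min = min(y for y, _, _ in obj_runs)
        y_max = max(y for y, _, _ in obj_runs)
        dx, dy = x_max - x_min + 1, y_max - y_min + 1
        aspect = dy / dx                                           # ASPECT = Dy / Dx
        g = ((x_min + x_max) / 2.0, (y_min + y_max) / 2.0)         # rectangle centre as stand-in for G
        return S, (x_min, y_min, x_max, y_max), aspect, g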
- In step S19, tracking of the object over time, that is, recognition of the same object at every predetermined sampling period, is performed.
- the sampling period may be the same as the period in which the process of FIG. 3 is performed.
- When the time obtained by discretizing the analog time t with the sampling period is denoted k and an object A is extracted at time k, the identity between the object A and an object B extracted at the next sampling time (k+1) is determined.
- The identity determination can be performed according to predetermined conditions.
- For example, if the differences between the x and y coordinates of the position of the center of gravity G of the objects A and B on the image are smaller than predetermined allowable values, if the ratio of the area of the object B on the image to the area of the object A on the image is within a predetermined allowable range, and if the ratio of the aspect ratio of the circumscribed rectangle of the object B to that of the object A is within a predetermined allowable range, the objects A and B can be determined to be the same.
- the position of the object (in this embodiment, the position coordinates of the center of gravity G) is stored in the memory as time-series data together with the assigned label.
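- The identity check of step S19 might look like the sketch below; the tolerance values and the exact form of the area and aspect-ratio comparisons are assumptions, since the text only states that predetermined allowable values are used.

    def same_object(g_a, s_a, asp_a, g_b, s_b, asp_b,
                    d_tol=10.0, area_tol=0.5, aspect_tol=0.5):
        # Centre-of-gravity displacement between time k and k+1 must stay small.
        if abs(g_b[0] - g_a[0]) > d_tol or abs(g_b[1] - g_a[1]) > d_tol:
            return False
        # Area and aspect ratio must remain comparable between the two samples.
        if abs(s_b / s_a - 1.0) > area_tol:
            return False
        if abs(asp_b / asp_a - 1.0) > aspect_tol:
            return False
        return True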
- In step S20, the vehicle speed VCAR detected by the vehicle speed sensor 7 and the yaw rate YR detected by the yaw rate sensor 6 are read, and the yaw rate YR is integrated over time to calculate the turning angle θr of the vehicle 10.
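- The turning angle of step S20 is obtained by integrating the yaw rate over time; a one-line discrete sketch (the function name and the sampling period dt are illustrative assumptions):

    def update_turning_angle(theta_r, yaw_rate, dt):
        # Discrete-time integration of the yaw rate YR over one sampling period.
        return theta_r + yaw_rate * dt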
- In steps S31 to S33, a process of calculating the distance z from the vehicle 10 to the object is performed in parallel with the processing of steps S19 and S20. Since this calculation takes longer than steps S19 and S20, it may be executed in a longer cycle (for example, a cycle about three times the execution cycle of steps S11 to S20).
- In step S31, one of the objects tracked in the binarized reference image (in this example, the right image) is selected, and the image region surrounded by its circumscribed rectangle is set as a search image R1.
- In step S32, an image of the same object as the search image R1 (hereinafter referred to as the corresponding image) is searched for in the left image. Specifically, this can be done by executing a correlation calculation between the search image R1 and the left image according to formula (1). This correlation calculation is performed using the grayscale images, not the binary images.
- Here, the search image R1 has M × N pixels, IR(m, n) is the luminance value at the coordinates (m, n) in the search image R1, and IL(a + m − M, b + n − N) is the luminance value at the coordinates (m, n) in a local area having the same shape as the search image R1, with the predetermined coordinates (a, b) in the left image as a base point.
- the position of the corresponding image is specified by determining the position where the luminance difference sum C (a, b) is minimized by changing the coordinates (a, b) of the base point.
- a region to be searched may be set in advance, and a correlation calculation may be performed between the search image R1 and the region.
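- A brute-force sketch of the correlation search of step S32: the luminance difference sum C(a, b) is minimised over base points (a, b) inside a given search region. Formula (1) itself is not reproduced on this page, so the sum-of-absolute-differences form below, like the numpy usage and the names, is an assumption consistent with the surrounding description.

    import numpy as np

    def find_corresponding_image(search_img, left_img, search_region):
        # search_img: grayscale search image R1 taken from the right image (height N, width M).
        N, M = search_img.shape
        a0, b0, a1, b1 = search_region            # assumed (x0, y0, x1, y1) window in the left image
        ref = search_img.astype(np.int32)
        best_c, best_ab = None, None
        for b in range(b0, b1 - N + 1):
            for a in range(a0, a1 - M + 1):
                local = left_img[b:b + N, a:a + M].astype(np.int32)
                c = np.abs(local - ref).sum()     # luminance difference sum C(a, b)
                if best_c is None or c < best_c:
                    best_c, best_ab = c, (a, b)
        return best_ab, best_c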
- In step S33, the distance dR (in pixels) between the centroid position of the search image R1 and the image center line LCTR (the line that bisects the captured image in the x direction), and the distance dL (in pixels) between the centroid position of the corresponding image and the image center line LCTR, are obtained and applied to equation (2) to calculate the distance z from the vehicle 10 to the object.
- B is the base line length, that is, the distance in the x direction (horizontal direction) between the center position of the image sensor of the camera 1R and the center position of the image sensor of the camera 1L (that is, the distance between the optical axes of both cameras).
- F represents the focal length of the lenses provided in the cameras 1R and 1L
- p represents the pixel interval of the image sensors of the cameras 1R and 1L.
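- Equation (2) itself is not reproduced on this page; the sketch below uses the standard stereo relation that follows from the quantities just defined (parallax dR + dL in pixels, baseline B, focal length F, pixel interval p), and its exact correspondence to equation (2) is an assumption.

    def stereo_distance(d_r, d_l, B, F, p):
        # z = B * F / ((dR + dL) * p): the distance grows as the parallax shrinks.
        return B * F / ((d_r + d_l) * p)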
- In step S21, the coordinates (x, y) of the position of the object in the image (as described above, the position of the center of gravity G in this embodiment) and the distance z calculated by equation (2) are applied to expression (3) to convert them into real space coordinates (X, Y, Z).
- the real space coordinates (X, Y, Z) are, as shown in FIG. 4A, the origin O as the midpoint position (position fixed to the vehicle) of the camera 1R and 1L attachment positions.
- it is expressed in a coordinate system in which the X axis is defined in the vehicle width direction of the vehicle 10
- the Y axis is defined in the vehicle height direction
- the Z axis is defined in the traveling direction of the vehicle 10.
- FIG. 4B the coordinates on the image are represented by a coordinate system in which the center of the image is the origin, the horizontal direction is the x axis, and the vertical direction is the y axis.
- Here, (xc, yc) are coordinates obtained by converting the coordinates (x, y) on the right image into coordinates in a virtual image whose center coincides with the real space origin O, based on the relative positional relationship between the mounting position of the camera 1R and the origin O of the real space coordinate system.
- f is the ratio between the focal length F and the pixel interval p.
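- Expression (3) is likewise not reproduced here; a sketch consistent with equations (14) and (15), which divide the image coordinate by f = F/p and multiply by the distance, would be (names are illustrative):

    def image_to_real_space(xc, yc, z, f):
        # (xc, yc): virtual-image coordinates centred on the real-space origin O; z: distance; f = F / p.
        X = xc * z / f       # vehicle width direction
        Y = yc * z / f       # vehicle height direction
        Z = z                # traveling direction
        return X, Y, Z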
- In step S22, a turning angle correction is performed to correct the positional deviation on the image caused by the turning of the vehicle 10.
- When the vehicle 10 turns, for example, by a turning angle θr to the left, the image obtained by the camera is shifted in the x direction (positive direction) by Δx, and this shift is corrected.
- the real space coordinates (X, Y, Z) are applied to Equation (4) to calculate the corrected coordinates (Xr, Yr, Zr).
- the calculated real space position data (Xr, Yr, Zr) is stored in the memory in time series in association with each object.
- In the following, the corrected coordinates are indicated as (X, Y, Z).
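- Equation (4) is not shown on this page either; the correction amounts to rotating the real-space position about the height (Y) axis by the turning angle θr, as in the sketch below. The sign convention chosen here is an assumption.

    import math

    def turning_angle_correction(X, Y, Z, theta_r):
        # Rotate (X, Z) about the Y axis by the turning angle to cancel the turn-induced shift.
        Xr = X * math.cos(theta_r) - Z * math.sin(theta_r)
        Zr = X * math.sin(theta_r) + Z * math.cos(theta_r)
        return Xr, Y, Zr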
- In step S23, an approximate straight line LMV corresponding to the relative movement trajectory of the object with respect to the vehicle 10 is calculated from the time-series position data. Using a direction vector L = (lx, ly, lz) (|L| = 1) indicating the direction of the approximate straight line LMV, the straight line represented by equation (5) is obtained.
- u is a parameter that takes an arbitrary value.
- Xav, Yav, and Zav are the average value of the X coordinate, the average value of the Y coordinate, and the average value of the Z coordinate of the real space position data string, respectively.
- FIG. 5 is a diagram for explaining the approximate straight line LMV.
- P (0), P (1), P (2),..., P (N-2), P (N-1) represent time-series data after turning angle correction
- the numerical value in () attached to P indicating the coordinates of each data point indicates that the data is past data as the value increases.
- P (0) indicates the latest position coordinates
- P (1) indicates the position coordinates one sample period before
- P (2) indicates the position coordinates two sample periods before.
- The coordinates of these data points are denoted X(j), Y(j), Z(j), and the like in the following description.
- a more detailed method for calculating the approximate straight line LMV is described in Japanese Patent Laid-Open No. 2001-6096.
- The corrected position coordinates Pv(0) = (Xv(0), Yv(0), Zv(0)) and Pv(N-1) = (Xv(N-1), Yv(N-1), Zv(N-1)) are obtained by using expression (6), which is obtained by applying the Z coordinates Z(0) and Z(N-1) to expression (5a).
- a vector from the position coordinates Pv (N ⁇ 1) calculated by Expression (6) toward Pv (0) is calculated as a relative movement vector.
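- One common way to realise the fit of the approximate straight line LMV and the resulting relative movement vector is a principal-component fit of the corrected time-series positions, as sketched below; the patent defers the exact calculation to JP 2001-6096, so this realisation is an assumption.

    import numpy as np

    def relative_movement_vector(points):
        # points: N rows of corrected real-space positions [X, Y, Z], newest first (P(0) is row 0).
        pts = np.asarray(points, dtype=float)
        mean = pts.mean(axis=0)                      # (Xav, Yav, Zav)
        _, _, vt = np.linalg.svd(pts - mean)
        L = vt[0]                                    # direction vector of LMV, |L| = 1
        # Project the newest and oldest samples onto the line to obtain Pv(0) and Pv(N-1).
        pv0 = mean + np.dot(pts[0] - mean, L) * L
        pvN = mean + np.dot(pts[-1] - mean, L) * L
        return pv0 - pvN                             # vector from Pv(N-1) towards Pv(0)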
- In step S24, an alarm determination process is executed.
- FIG. 6 shows the alarm determination process, which will be described with reference to FIG.
- FIG. 7 shows an imaging range AR0 that is an area that can be imaged by the cameras 1R and 1L.
- the processing in steps S11 to S23 in FIG. 3 is performed on the captured image corresponding to the imaging range AR0.
- The area AR1 corresponds to a range obtained by adding a margin β (for example, about 50 to 100 cm) on both sides of the vehicle width α of the vehicle 10; in other words, it is an area having a width of (α/2 + β) on both sides of the central axis of the vehicle 10 in the vehicle width direction, and is an approach determination area where the possibility of a collision is high if the object remains there.
- The areas AR2 and AR3 are areas where the absolute value of the X coordinate is larger than in the approach determination area (laterally outside the approach determination area), and are intrusion determination areas from which an object may enter the approach determination area. These areas AR1 to AR3 have a predetermined height H in the Y direction and extend to a predetermined distance Z1 in the Z direction.
- In step S41, the possibility of a collision with the vehicle is determined for each object. Specifically, objects existing in the areas AR1 to AR3 are extracted. To this end, the relative speed Vs of each object in the Z direction with respect to the vehicle is calculated by equation (7), and objects that satisfy expressions (8) and (9) are then extracted.
- Vs = (Zv(N−1) − Zv(0)) / ΔT (7)
- Zv(0) / Vs ≦ T (8)
- |Yv(0)| ≦ H (9)
- Here, Zv(0) is the latest distance detection value (the subscript v indicates data after correction using the approximate straight line LMV, although the Z coordinate has the same value as before the correction).
- Zv(N−1) is the distance detection value a time ΔT earlier.
- T is an allowance time and is intended to determine the possibility of a collision by a time T before the predicted collision time.
- T is set to, for example, about 2 to 5 seconds.
- Vs × T corresponds to the predetermined distance Z1 of the areas AR1 to AR3 described above.
- H defines a range in the Y direction, that is, the height direction, and is set to about twice the vehicle height of the host vehicle 10, for example. This indicates the predetermined height H of the above-mentioned areas AR1 to AR3.
- In this way, objects in the areas AR1 to AR3, which are limited to the predetermined height H in the vertical direction and to the predetermined distance Z1 in the distance direction, are determined to have a possibility of collision and are extracted.
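- The extraction of step S41 reduces to checking expressions (7) to (9) for each object, roughly as below; the guard against a non-approaching object (Vs ≦ 0) is an added assumption, not stated in the text.

    def collision_candidate(zv0, zvN1, yv0, dT, T, H):
        vs = (zvN1 - zv0) / dT          # equation (7): relative speed towards the vehicle
        if vs <= 0:
            return False                # object not approaching (assumption for this sketch)
        return zv0 / vs <= T and abs(yv0) <= H   # expressions (8) and (9)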
- In step S42, an approach determination process is performed to determine whether or not each of the objects thus extracted is within the approach determination area AR1. Specifically, it is determined whether or not the X coordinate Xv(0) of the position Pv(0) of each object lies within the area AR1. For an object for which this determination is Yes, it is determined that the possibility of a collision is high, and the process of step S44 is performed immediately. An object for which this determination is No exists in the area AR2 or AR3, and the intrusion determination process of step S43 is performed for it.
- In the intrusion determination process of step S43, it is determined whether or not the difference between the latest x coordinate xc(0) of the object on the image (the subscript c indicates a coordinate that has been corrected so that the center position of the image coincides with the real space origin O, as described above) and the x coordinate xc(N−1) a time ΔT earlier satisfies expression (10).
- If there is an object that satisfies expression (10), it is determined that the object is likely to enter the approach determination area AR1 through its movement and collide with the vehicle 10 (Yes in S43), and the process proceeds to step S44.
- Otherwise, in step S48, normal display is performed, in which the grayscale image acquired in step S13 is output to the display device 4 as the display image.
- Xv(N−1) = xc(N−1) × Zv(N−1) / f (15)
- In step S44, an alarm output determination process is performed to determine whether or not an alarm should actually be output to the driver for an object determined to have a high possibility of collision in the approach determination process or the intrusion determination process.
- The alarm output determination process decides whether an alarm is actually output according to the brake operation. Specifically, it is determined from the output of a brake sensor (not shown) whether or not the driver of the vehicle 10 is operating the brake. If the brake operation is not being performed, it is determined that an alarm should be output (Yes in S44), and the process proceeds to step S45.
- If the brake operation is being performed, the acceleration Gs generated by the brake operation (positive in the deceleration direction) is calculated.
- A threshold GTH is calculated for each object as shown in expression (17), and when there is an object for which the acceleration Gs is equal to or less than the threshold GTH (Gs ≦ GTH), it is determined that an alarm should be output (Yes in S44), and the process proceeds to step S45.
- The threshold GTH of expression (17) corresponds to the condition that the vehicle 10 stops within a travel distance equal to or less than the distance Zv(0) when the braking acceleration Gs is maintained as it is.
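- Expression (17) itself is not reproduced on this page. From the stated stopping-distance condition, one plausible form is GTH = Vs² / (2 · Zv(0)), sketched below; treat this derivation, like the names, as an assumption.

    def should_warn_despite_braking(gs, vs, zv0):
        # The vehicle stops within Zv(0) only if the deceleration Gs reaches at least Vs^2 / (2 * Zv(0)).
        gth = vs ** 2 / (2.0 * zv0)
        return gs <= gth                # alarm when the actual deceleration does not reach GTH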
- Alternatively, the process may proceed to step S45 simply in response to the presence of an object determined to have a high possibility of collision in steps S41 to S43.
- In step S45, in the grayscale image acquired in step S13, the luminance values of the region other than the region corresponding to the object determined to have a high possibility of collision in the processing of steps S41 to S43 are decreased.
- FIG. 9 (a) schematically shows the grayscale image acquired in step S13 via the infrared cameras 1R and 1L.
- An object (in this example, a pedestrian) 101 is captured; the luminance value of the object 101 is I1, and the luminance value of the area other than the object is I2.
- In step S45, as shown in (b), a display image is generated in which the luminance of the entire area B2 other than a predetermined area B1 including the object (for example, the rectangular area circumscribing the object set in step S18 of FIG. 3) is lowered.
- The luminance values of all the pixels included in the predetermined area B1 including the object are held as they are (therefore, the luminance values of the pixels of the object 101 are held at I1), and the luminance values of all the pixels included in the other area B2 are set to I3, which is lower than I2 by a predetermined value.
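- The display image generation of step S45 can be sketched as follows; the fixed darkening amount and the numpy-based layout are illustrative assumptions.

    import numpy as np

    def make_spotlight_image(gray, box, dim=80):
        # box: (x0, y0, x1, y1), the predetermined area B1 (e.g. the circumscribed rectangle).
        x0, y0, x1, y1 = box
        out = gray.astype(np.int16) - dim                        # lower the luminance of area B2 ...
        out[y0:y1 + 1, x0:x1 + 1] = gray[y0:y1 + 1, x0:x1 + 1]   # ... while holding area B1 unchanged
        return np.clip(out, 0, 255).astype(np.uint8)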
- In step S47, the display image thus generated is displayed on the display device 4.
- FIG. 10 shows, as an example, (a) an actually acquired grayscale image and (b) a display image generated by the processing in step S45 described above.
- a pedestrian 103 is captured in the gray scale image. Assume that the pedestrian 103 is an object that has been determined to have a high possibility of collision as described above.
- For reference, (c) shows a conventional alarm form in which the object in the grayscale image is highlighted with a frame 111.
- Since various objects other than the pedestrian 103 are also captured in the grayscale image, the driver may not be able to instantly understand where to pay attention even if such highlighting is performed.
- In particular, when the display device 4 installed at a predetermined distance from the steering wheel in the vehicle width direction is used, recognition of the object may be delayed with an image such as (c).
- In contrast, in the display image of (b), the regions other than the pedestrian 103 are darkened so that only the pedestrian 103 appears spotlighted, and the time required for the driver to recognize the pedestrian 103 can therefore be shortened.
- In the example above, the luminance of the area other than the predetermined area surrounding the object is lowered; alternatively, the luminance of the area other than the object 101 itself may be lowered without setting an area surrounding the object.
- In the example above, the luminance of the region other than the object (or the predetermined region surrounding the object) is lowered by decreasing the luminance value of each pixel by a predetermined value.
- an alarm sound may be output through the speaker 3 together with the display image output to the display device 4 in step S47.
- the alarm sound may be arbitrary, may be a simple buzzer sound, or may be a voice message.
- In the above embodiment, the display image is generated so that the luminance values are held for objects determined to have a high possibility of collision in the processing of steps S41 to S43. Alternatively, when it is determined in the alarm output determination process of step S44 that the brake operation is being performed, the display image may be generated so that the luminance values are held only for objects that satisfy Gs ≦ GTH, treating such objects as having a high possibility of collision.
- As a result, the driver can be made to recognize only the objects requiring particular attention.
- FIG. 11 is a flowchart of the alarm determination process executed in step S24 of FIG. 3 according to another embodiment of the present invention. The difference from FIG. 6 is that step S46 is added, and this point will be described with reference to FIG.
- a pseudo image of the object is stored in advance in the storage device of the image processing unit 2.
- a pseudo image is an icon image that simulates an object.
- Since the object in this example is a pedestrian, an icon 105 that simulates a pedestrian is stored.
- In step S46, the pseudo image is read from the storage device and superimposed at the position of the object (the object whose luminance was not reduced, as described above) in the image obtained as a result of the process in step S45, to generate the display image.
- FIG. 12 shows in (c) an image obtained by superimposing the pseudo image 105 of (a) on the position of the object 103 in the image of (b) obtained in step S45 (which is the same as (b) of FIG. 10).
- In step S47, the image on which the pseudo image is superimposed is output on the display device 4.
- the pseudo image is an image that is superimposed in order to visually distinguish the target object from other areas in the displayed image. Therefore, the pseudo image is preferably generated in a color (for example, red or yellow) that has a high luminance value and can attract the driver's attention.
- It is preferable that the luminance value of the pseudo image be set higher by a predetermined amount than the luminance value of the region other than the object, that is, the luminance value decreased in the process of step S45, so that a high-contrast display image is obtained.
- The luminance value of the pseudo image may be determined in advance or may be variable. In the latter case, for example, the average luminance value of the pixels in the region other than the object, as reduced in step S45, may be obtained, and a predetermined value may be added to it to calculate the luminance value for the pseudo image.
- a display image is generated by superimposing the pseudo image of the calculated luminance value.
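- A sketch of the pseudo-image superimposition of step S46 with a variable luminance: the icon level is set a fixed margin above the mean luminance of the darkened image (the whole-image mean is used here as a stand-in for the darkened-region mean); the margin value, mask convention and names are assumptions.

    import numpy as np

    def overlay_pseudo_image(display, icon, pos, margin=40):
        # display: image produced in step S45; icon: 2-D mask of the pseudo image (non-zero = icon pixel);
        # pos: (x, y) top-left corner at the detected object position.
        x, y = pos
        h, w = icon.shape
        level = min(255, int(display.mean()) + margin)   # luminance a predetermined value above the background
        out = display.copy()
        region = out[y:y + h, x:x + w]
        out[y:y + h, x:x + w] = np.where(icon > 0, level, region)
        return out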
- the pseudo image may be highlighted by surrounding it with a frame 107 as shown in FIG.
- the frame 107 is preferably displayed in a color that is easy to call the driver's attention and has a relatively high luminance value.
- an image with a high contrast form is displayed such that the spotlight is only applied to the pseudo image indicating the position of the object.
- The pseudo image is an animation-like image, unlike the captured images of real objects. Therefore, the driver can instantly recognize that the pseudo image has been output, and thus can instantly recognize the presence of an object requiring attention. Further, since the captured image of the actual object is hidden by the pseudo image, the driver can be prompted to gaze at the road ahead.
- an image simulating a pedestrian and an image simulating an animal may be generated and stored in advance.
- In that case, a process for determining whether the object is a pedestrian or an animal is provided, for example before the pseudo image is superimposed. Any appropriate determination process can be used. If the object is determined to be a pedestrian, a pseudo image corresponding to a pedestrian is read and superimposed in step S46; if it is determined to be an animal, a pseudo image corresponding to an animal is read and superimposed. In this way, the driver can instantly recognize whether the object requiring attention is a pedestrian or an animal.
- In the above embodiments, the approach determination area and the intrusion determination area are used to determine the possibility of collision; however, the determination method is not limited to this form, and any other appropriate collision possibility determination method can be used.
- In the above embodiments, a display device of a navigation device is used as the display device 4, so that an existing display device installed to the driver's left or right can be utilized. However, other display devices may be used, and the display form of the present invention may also be applied to a conventional head-up display.
- The present invention can also be applied to other cameras (for example, a visible-light camera).
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
- Closed-Circuit Television Systems (AREA)
- Image Processing (AREA)
- Navigation (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
Abstract
Description
(X-Xav)/lx = (Y-Yav)/ly = (Z-Zav)/lz (5a)
Vs=(Zv(N-1)―Zv(0))/ΔT (7)
Zv(0)/Vs≦T (8)
|Yv(0)|≦H (9)
-α/2≦XCL≦α/2 (11)
Xv(0)=xc(0)×Zv(0)/f (14)
Xv(N-1)=xc(N-1)×Zv(N-1)/f (15)
2 Image processing unit
3 Speaker
4 Display device
Claims (5)
- 1. A vehicle periphery monitoring device comprising: imaging means for imaging the periphery of a vehicle using a camera mounted on the vehicle; means for acquiring, through imaging by the imaging means, a grayscale image having luminance values according to the temperature of an object; object detection means for detecting a predetermined object present around the vehicle from the grayscale image; display image generation means for generating, based on the grayscale image, a display image to be displayed on a display device mounted on the vehicle; and means for displaying the generated display image on the display device, wherein the display image generation means generates the display image by reducing the luminance of a region of the grayscale image other than the detected object.
- 2. The periphery monitoring device according to claim 1, wherein the display device is provided at a position visible to the driver of the vehicle and separated by a predetermined distance in the vehicle width direction from a line that passes through the center of rotation of the steering wheel of the vehicle and extends in the front-rear direction of the vehicle.
- 3. The periphery monitoring device according to claim 1 or 2, wherein the object detection means determines whether or not the vehicle is highly likely to collide with the object, and the display image generation means generates the display image in which the luminance of the region other than the detected object is reduced when it is determined that the vehicle is highly likely to collide with the object.
- 4. The periphery monitoring device according to any one of claims 1 to 3, wherein the display image generation means further superimposes an image simulating the object at the position where the object exists in the grayscale image, and the display means displays the superimposed display image on the display device.
- 5. The periphery monitoring device according to any one of claims 1 to 4, wherein the display device is a display device of a navigation device.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/579,754 US8953840B2 (en) | 2010-03-01 | 2011-02-21 | Vehicle perimeter monitoring device |
CN201180011858.2A CN102783144B (zh) | 2010-03-01 | 2011-02-21 | 车辆的周围监测装置 |
JP2012502992A JP5503728B2 (ja) | 2010-03-01 | 2011-02-21 | 車両の周辺監視装置 |
EP11750331.8A EP2544449B1 (en) | 2010-03-01 | 2011-02-21 | Vehicle perimeter monitoring device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-044465 | 2010-03-01 | ||
JP2010044465 | 2010-03-01 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011108217A1 true WO2011108217A1 (ja) | 2011-09-09 |
Family
ID=44541887
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/000948 WO2011108217A1 (ja) | 2010-03-01 | 2011-02-21 | 車両の周辺監視装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8953840B2 (ja) |
EP (1) | EP2544449B1 (ja) |
JP (1) | JP5503728B2 (ja) |
CN (1) | CN102783144B (ja) |
WO (1) | WO2011108217A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014116756A (ja) * | 2012-12-07 | 2014-06-26 | Toyota Motor Corp | 周辺監視システム |
CN107820116A (zh) * | 2017-11-14 | 2018-03-20 | 优酷网络技术(北京)有限公司 | 视频播放方法及装置 |
JP2018097000A (ja) * | 2018-01-05 | 2018-06-21 | 株式会社 ミックウェア | ナビゲーションシステム |
JP2018132528A (ja) * | 2013-06-17 | 2018-08-23 | ソニー株式会社 | 画像処理装置、画像処理方法及びプログラム |
WO2021161712A1 (ja) * | 2020-02-14 | 2021-08-19 | ソニーグループ株式会社 | 撮像装置及び車両制御システム |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011132388A1 (ja) * | 2010-04-19 | 2011-10-27 | 本田技研工業株式会社 | 車両の周辺監視装置 |
JP2013253961A (ja) * | 2012-05-07 | 2013-12-19 | Denso Corp | 画像表示システム |
KR102092625B1 (ko) * | 2013-10-15 | 2020-04-14 | 현대모비스 주식회사 | 차량 상태 경보 방법 및 이를 위한 장치 |
CN104079881B (zh) * | 2014-07-01 | 2017-09-12 | 中磊电子(苏州)有限公司 | 监控装置与其相关的监控方法 |
CN104269070B (zh) * | 2014-08-20 | 2017-05-17 | 东风汽车公司 | 一种车辆主动安全预警方法和运用该方法的安全预警系统 |
KR102149276B1 (ko) * | 2014-10-23 | 2020-08-28 | 한화테크윈 주식회사 | 영상 정합 방법 |
KR102225617B1 (ko) * | 2014-11-03 | 2021-03-12 | 한화테크윈 주식회사 | 영상 정합 알고리즘을 설정하는 방법 |
JP6432332B2 (ja) * | 2014-12-15 | 2018-12-05 | 株式会社リコー | 光電変換素子、画像読取装置及び画像形成装置 |
JP6402684B2 (ja) * | 2015-06-10 | 2018-10-10 | トヨタ自動車株式会社 | 表示装置 |
EP3396410A4 (en) * | 2015-12-21 | 2019-08-21 | Koito Manufacturing Co., Ltd. | IMAGE ACQUISITION DEVICE FOR VEHICLES, CONTROL DEVICE, VEHICLE HAVING IMAGE ACQUISITION DEVICE FOR VEHICLES AND CONTROL DEVICE, AND IMAGE ACQUISITION METHOD FOR VEHICLES |
WO2017110414A1 (ja) | 2015-12-21 | 2017-06-29 | 株式会社小糸製作所 | 車両用画像取得装置およびそれを備えた車両 |
US11187805B2 (en) | 2015-12-21 | 2021-11-30 | Koito Manufacturing Co., Ltd. | Image acquiring apparatus for vehicle, control device, vehicle having image acquiring apparatus for vehicle or control device, and image acquiring method for vehicle |
JP6851985B2 (ja) | 2015-12-21 | 2021-03-31 | 株式会社小糸製作所 | 車両用画像取得装置、制御装置、車両用画像取得装置または制御装置を備えた車両および車両用画像取得方法 |
DE102016220479A1 (de) * | 2016-10-19 | 2018-04-19 | Robert Bosch Gmbh | Verfahren und Vorrichtung zum Generieren eines Notrufs für ein Fahrzeug |
JP7143728B2 (ja) * | 2017-11-07 | 2022-09-29 | 株式会社アイシン | 重畳画像表示装置及びコンピュータプログラム |
CN109935107B (zh) * | 2017-12-18 | 2023-07-14 | 姜鹏飞 | 一种提升交通通视范围的方法及装置 |
KR20200005282A (ko) * | 2018-07-06 | 2020-01-15 | 현대모비스 주식회사 | 미러리스 자동차의 측방 영상 처리 장치 및 방법 |
CN115210790A (zh) * | 2020-08-28 | 2022-10-18 | Jvc建伍株式会社 | 目标识别控制装置以及目标识别方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08305999A (ja) * | 1995-05-11 | 1996-11-22 | Hitachi Ltd | 車載用カメラシステム |
JP2001006096A (ja) | 1999-06-23 | 2001-01-12 | Honda Motor Co Ltd | 車両の周辺監視装置 |
JP2005318408A (ja) * | 2004-04-30 | 2005-11-10 | Nissan Motor Co Ltd | 車両周囲監視装置および方法 |
JP4334686B2 (ja) | 1999-07-07 | 2009-09-30 | 本田技研工業株式会社 | 車両の画像表示装置 |
JP2010044561A (ja) * | 2008-08-12 | 2010-02-25 | Panasonic Corp | 乗物搭載用監視装置 |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3987013B2 (ja) * | 2003-09-01 | 2007-10-03 | 本田技研工業株式会社 | 車両周辺監視装置 |
JP4128562B2 (ja) * | 2004-11-30 | 2008-07-30 | 本田技研工業株式会社 | 車両周辺監視装置 |
WO2006098356A1 (ja) * | 2005-03-15 | 2006-09-21 | Omron Corporation | 画像処理装置および画像処理方法、プログラム、並びに、記録媒体 |
JP4426535B2 (ja) * | 2006-01-17 | 2010-03-03 | 本田技研工業株式会社 | 車両の周辺監視装置 |
JP4456086B2 (ja) * | 2006-03-09 | 2010-04-28 | 本田技研工業株式会社 | 車両周辺監視装置 |
US7671725B2 (en) * | 2006-03-24 | 2010-03-02 | Honda Motor Co., Ltd. | Vehicle surroundings monitoring apparatus, vehicle surroundings monitoring method, and vehicle surroundings monitoring program |
JP4171501B2 (ja) * | 2006-04-25 | 2008-10-22 | 本田技研工業株式会社 | 車両の周辺監視装置 |
JP4173902B2 (ja) * | 2006-05-19 | 2008-10-29 | 本田技研工業株式会社 | 車両周辺監視装置 |
JP4173901B2 (ja) * | 2006-05-19 | 2008-10-29 | 本田技研工業株式会社 | 車両周辺監視装置 |
US7741961B1 (en) * | 2006-09-29 | 2010-06-22 | Canesta, Inc. | Enhanced obstacle detection and tracking for three-dimensional imaging systems used in motor vehicles |
DE102007011180A1 (de) * | 2007-03-06 | 2008-09-11 | Daimler Ag | Rangierhilfe und Verfahren für Fahrer von Fahrzeugen bzw. Fahrzeuggespannen, welche aus gegeneinander knickbare Fahrzeugelementen bestehen |
US7936923B2 (en) * | 2007-08-31 | 2011-05-03 | Seiko Epson Corporation | Image background suppression |
DE102007044535B4 (de) * | 2007-09-18 | 2022-07-14 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren zur Fahrerinformation in einem Kraftfahrzeug |
EP2401176B1 (en) * | 2009-02-27 | 2019-05-08 | Magna Electronics | Alert system for vehicle |
US8164543B2 (en) * | 2009-05-18 | 2012-04-24 | GM Global Technology Operations LLC | Night vision on full windshield head-up display |
-
2011
- 2011-02-21 JP JP2012502992A patent/JP5503728B2/ja not_active Expired - Fee Related
- 2011-02-21 EP EP11750331.8A patent/EP2544449B1/en active Active
- 2011-02-21 WO PCT/JP2011/000948 patent/WO2011108217A1/ja active Application Filing
- 2011-02-21 US US13/579,754 patent/US8953840B2/en active Active
- 2011-02-21 CN CN201180011858.2A patent/CN102783144B/zh active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08305999A (ja) * | 1995-05-11 | 1996-11-22 | Hitachi Ltd | 車載用カメラシステム |
JP2001006096A (ja) | 1999-06-23 | 2001-01-12 | Honda Motor Co Ltd | 車両の周辺監視装置 |
JP4334686B2 (ja) | 1999-07-07 | 2009-09-30 | 本田技研工業株式会社 | 車両の画像表示装置 |
JP2005318408A (ja) * | 2004-04-30 | 2005-11-10 | Nissan Motor Co Ltd | 車両周囲監視装置および方法 |
JP2010044561A (ja) * | 2008-08-12 | 2010-02-25 | Panasonic Corp | 乗物搭載用監視装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2544449A4 |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014116756A (ja) * | 2012-12-07 | 2014-06-26 | Toyota Motor Corp | 周辺監視システム |
JP2018132528A (ja) * | 2013-06-17 | 2018-08-23 | ソニー株式会社 | 画像処理装置、画像処理方法及びプログラム |
US10677596B2 (en) | 2013-06-17 | 2020-06-09 | Sony Corporation | Image processing device, image processing method, and program |
CN107820116A (zh) * | 2017-11-14 | 2018-03-20 | 优酷网络技术(北京)有限公司 | 视频播放方法及装置 |
WO2019095813A1 (zh) * | 2017-11-14 | 2019-05-23 | 优酷网络技术(北京)有限公司 | 视频播放方法及装置 |
CN107820116B (zh) * | 2017-11-14 | 2020-02-18 | 优酷网络技术(北京)有限公司 | 视频播放方法及装置 |
JP2018097000A (ja) * | 2018-01-05 | 2018-06-21 | 株式会社 ミックウェア | ナビゲーションシステム |
WO2021161712A1 (ja) * | 2020-02-14 | 2021-08-19 | ソニーグループ株式会社 | 撮像装置及び車両制御システム |
US12108172B2 (en) | 2020-02-14 | 2024-10-01 | Sony Group Corporation | Vehicle control system using imaging device capable of object detection |
Also Published As
Publication number | Publication date |
---|---|
US8953840B2 (en) | 2015-02-10 |
CN102783144B (zh) | 2016-06-29 |
CN102783144A (zh) | 2012-11-14 |
JP5503728B2 (ja) | 2014-05-28 |
EP2544449A4 (en) | 2014-01-01 |
US20130004021A1 (en) | 2013-01-03 |
EP2544449B1 (en) | 2016-03-16 |
EP2544449A1 (en) | 2013-01-09 |
JPWO2011108217A1 (ja) | 2013-06-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5503728B2 (ja) | 車両の周辺監視装置 | |
JP5706874B2 (ja) | 車両の周辺監視装置 | |
JP4410292B1 (ja) | 車両の周辺監視装置 | |
JP4456086B2 (ja) | 車両周辺監視装置 | |
WO2011108218A1 (ja) | 車両の周辺監視装置 | |
JP2003150938A (ja) | 画像認識装置 | |
JP4528283B2 (ja) | 車両周辺監視装置 | |
JP2003134508A (ja) | 車両用情報提供装置 | |
JP2008027309A (ja) | 衝突判定システム、及び衝突判定方法 | |
JP5292047B2 (ja) | 車両の周辺監視装置 | |
JP5192007B2 (ja) | 車両の周辺監視装置 | |
JP5192009B2 (ja) | 車両の周辺監視装置 | |
JP3919975B2 (ja) | 車両の周辺監視装置 | |
JP4176558B2 (ja) | 車両周辺表示装置 | |
JP4823753B2 (ja) | 車両の周辺監視装置 | |
JP3949628B2 (ja) | 車両の周辺監視装置 | |
JP4629638B2 (ja) | 車両の周辺監視装置 | |
JP2003151096A (ja) | 進入警報装置 | |
JP2003157498A (ja) | 障害物警報装置 | |
JP2007168547A (ja) | 障害物衝突判定システム、障害物衝突判定方法及びコンピュータプログラム | |
JP4876117B2 (ja) | 画像情報処理システム | |
JP4472623B2 (ja) | 車両周辺監視装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180011858.2 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11750331 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13579754 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012502992 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011750331 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |