EP2571005A1 - Vehicle surroundings monitoring device - Google Patents

Vehicle surroundings monitoring device

Info

Publication number
EP2571005A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
distance
image
physical body
image portion
Prior art date
Legal status
Granted
Application number
EP11821400A
Other languages
German (de)
French (fr)
Other versions
EP2571005B1 (en)
EP2571005A4 (en)
Inventor
Kodai Matsuda
Nobuharu Nagaoka
Makoto Aimura
Current Assignee
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Publication of EP2571005A1 publication Critical patent/EP2571005A1/en
Publication of EP2571005A4 publication Critical patent/EP2571005A4/en
Application granted granted Critical
Publication of EP2571005B1 publication Critical patent/EP2571005B1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems
    • G08G1/166: Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/50: Context or environment of the image
    • G06V20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58: Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

Definitions

  • The vehicle surroundings monitoring device 10 is an electronic unit configured from a CPU, a memory (not shown), and the like. It converts the video signal output from the infrared camera 2 into digital data, takes the data into an image memory (not shown), and performs various arithmetic processing with the CPU on the captured image of the area in front of the vehicle 1 held in the image memory.
  • The CPU functions as an object extracting unit 11 which extracts an image portion satisfying a predetermined condition from the image captured by the infrared camera 2, a physical body type judging unit 12 which judges the type of the physical body in real space corresponding to the extracted image portion, a distance calculating unit 13 which calculates the distance between the physical body and the vehicle 1, a real space position calculating unit 14 which calculates the real space position of the physical body, a motion vector calculating unit 15 which calculates the motion vector of the physical body in real space, and a warning judging unit 16 which determines whether or not to make the physical body a target of warning on the basis of the motion vector.
  • the vehicle surroundings monitoring device 10 executes a process by the flow chart shown in Fig. 3 every predetermined control cycle, and monitors the surroundings of the vehicle 1.
  • In STEP 1, the vehicle surroundings monitoring device 10 inputs the video signal output from the infrared camera 2, and takes into the image memory a grayscale image obtained by converting the video signal into digital gradation (luminance) data.
  • Subsequent STEP 2 is a process by the object extracting unit 11.
  • The object extracting unit 11 obtains a binary image by performing, for each pixel in the grayscale image, a binarization process that sets "1" (white) for a pixel with a luminance equal to or greater than a predetermined threshold value, and "0" (black) for a pixel with a luminance smaller than the threshold value. Thereafter, the object extracting unit 11 calculates run-length data for each white region in the binary image, performs a labeling process and the like, and extracts the image portion of the physical body.
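The binarization and run-length extraction described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the threshold value and image contents are invented for the example.

```python
import numpy as np

def binarize(gray, threshold):
    """Set 1 (white) where luminance >= threshold, else 0 (black)."""
    return (gray >= threshold).astype(np.uint8)

def run_lengths(binary):
    """Collect run-length data: (row, start_col, length) for each white run."""
    runs = []
    for r, row in enumerate(binary):
        c, n = 0, len(row)
        while c < n:
            if row[c] == 1:
                start = c
                while c < n and row[c] == 1:
                    c += 1
                runs.append((r, start, c - start))
            else:
                c += 1
    return runs

# Illustrative 3x4 grayscale image with one warm (bright) region.
gray = np.array([[10, 200, 210, 10],
                 [10, 220, 230, 10],
                 [10,  10,  10, 10]], dtype=np.uint8)
binary = binarize(gray, 128)
print(run_lengths(binary))  # → [(0, 1, 2), (1, 1, 2)]
```

In the device, adjacent runs would then be grouped by the labeling process into one labeled region per physical body.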
  • Subsequent STEP 3 is a process by the physical body type judging unit 12.
  • the physical body type judging unit 12 determines whether or not the physical body extracted in STEP 2 (hereinafter referred to as an object) is a pedestrian crossing a road (hereinafter referred to as a crossing pedestrian), on the basis of an amount of characteristics of the image portion.
  • Fig. 4 illustrates the image of the crossing pedestrian
  • The physical body type judging unit 12 searches the upper regions M(1), M(2) of an image portion Tk estimated as the head of the crossing pedestrian and the lower regions M(3) through M(8), and recognizes the characteristics of each region. Thereafter, the physical body type judging unit 12 judges the type of the object as the crossing pedestrian when the upper regions M(1), M(2) of the image portion Tk do not contain a characteristic portion, and a reversed V shape (two slanted edges crossing at the top), which is a characteristic specific to the crossing pedestrian, is recognized at the bottom end in the lower regions M(3) through M(8). Further, the physical body type judging unit 12 sets an object region Rk extending from the image portion Tk estimated as the head through the lower region M(7).
  • The judging of the crossing pedestrian may also be performed on the basis of the complexity of the image, a change in luminance dispersion, its periodicity, and the like.
  • Subsequent STEP 4 through STEP 5 and STEP 20 are processes by the distance calculating unit 13.
  • In STEP 4, the distance calculating unit 13 determines whether or not the type of the object has been judged as the crossing pedestrian, and when it has (in this case, it is assumed that the change in shape of the object and its image portion is large and exceeds a predetermined level), branches to STEP 20 and executes a first distance calculating process.
  • Otherwise, the distance calculating unit 13 proceeds to STEP 5 and executes a second distance calculating process.
  • the second distance calculating process is, as is explained later, for calculating the distance between the object and the vehicle 1, on the basis of a rate of change of a size of the image portions of an identical object that are extracted from time-series images captured by the infrared camera 2.
  • The crossing pedestrian crosses in front of the vehicle 1 while swinging both legs and arms widely, as shown in Ik1 through Ik4 in Fig. 5. Therefore, the shape (width W and the like) of the image portion of the crossing pedestrian in the image captured by the infrared camera 2 changes greatly. As such, there are cases where it is not possible to extract the image portions of the crossing pedestrian as those of an identical physical body between the time-series images captured by the infrared camera 2, and even when extraction succeeds, it is difficult to obtain the rate of change of the size with good accuracy.
  • the distance calculating unit 13 calculates in STEP 20 the distance between the object and the vehicle 1, on the basis of a height of the image portion of the object in a single captured image.
  • Specifically, the distance calculating unit 13 calculates the distance between the object and the vehicle 1 in real space by taking the height of the pedestrian in real space as constant, and applying the height of the image portion of the object in the captured image to a map or correlation equation that sets the correlative relationship between the distance L from the vehicle 1 to the pedestrian in real space and the height H of the image portion of the pedestrian in the captured image.
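Under the pinhole model, this correlative relationship reduces to an inverse proportion between distance and image height. A minimal sketch, assuming an illustrative focal length in pixels and an assumed constant pedestrian height (neither value comes from the patent):

```python
# f_pixels corresponds to f = F / p as defined for expression (1);
# 400.0 is an illustrative value (e.g. F = 8 mm, p = 20 um).
f_pixels = 400.0
H_PED = 1.70  # assumed constant pedestrian height in real space [m]

def first_distance(h_img_pixels):
    """First distance calculating process: distance Z [m] from the
    height of the pedestrian's image portion in pixels."""
    return f_pixels * H_PED / h_img_pixels

print(first_distance(34.0))  # → 20.0
```

A taller image portion thus maps directly to a shorter distance, with no need to track the object across frames.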
  • the distance calculating unit 13 calculates in STEP 5 the distance between the object and the vehicle 1 by executing the second distance calculating process.
  • The second distance calculating process is a process of tracking the image portions of the identical object between an image Im2 captured at the previous control cycle (capturing time point t11) and an image Im3 captured at the current control cycle (capturing time point t12).
  • the tracking process is explained in detail for example in Japanese Patent Laid-Open No. 2007-213561 , so that explanation thereof is omitted in this specification.
  • The distance calculating unit 13 calculates a rate of change Rate by dividing the width w11 of an image portion 31 in the image Im2 by the width w12 of an image portion 32 in the image Im3, according to the following expression (1).
  • a relative speed Vs between the vehicle 1 and the object is approximated by the traveling speed of the vehicle detected by the vehicle speed sensor 4.
  • w11: width of the image portion of the object at the previous image capturing (capturing time point t11)
  • w12: width of the image portion of the object at the current image capturing (capturing time point t12)
  • f: F (focal length of the infrared camera 2) / p (pixel pitch of the captured image)
  • W: width of the object in real space
  • Z1: distance from the vehicle 1 to the object at the previous image capturing (capturing time point t11)
  • Z2: distance from the vehicle 1 to the object at the current image capturing (capturing time point t12)
  • Vs: relative speed between the vehicle and the object
  • dT: image capturing interval
  • T: arrival time of the self vehicle (estimated time until the object reaches the vehicle 1)
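The expressions themselves are rendered as images in the source; from the variable list above they can presumably be reconstructed as Rate = w11 / w12 = Z2 / Z1 (since each width is f * W / Z), and with Z1 = Z2 + Vs * dT this gives Z2 = Rate * Vs * dT / (1 - Rate) and T = Z2 / Vs. A sketch under that reconstruction, with illustrative numbers:

```python
def second_distance(w11, w12, Vs, dT):
    """Second distance calculating process, reconstructed from the
    variable legend: returns (Rate, Z2, T)."""
    rate = w11 / w12                    # expression (1): w shrinks as Z grows
    Z2 = rate * Vs * dT / (1.0 - rate)  # current distance to the object [m]
    T = Z2 / Vs                         # estimated arrival time [s]
    return rate, Z2, T

# Object approaching at relative speed Vs = 10 m/s, capture interval 0.1 s,
# image width growing from 30 to 31 pixels between captures:
rate, Z2, T = second_distance(30.0, 31.0, 10.0, 0.1)
# Z2 ≈ 30.0 m, T ≈ 3.0 s
```

Note that the formula degenerates as Rate approaches 1, which is consistent with the text: when the size barely changes (distant or slow objects), the estimate is sensitive to measurement noise in w11 and w12.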
  • Subsequent STEP 6 is a process by the real space position calculating unit 14.
  • The real space position calculating unit 14 calculates the distance Z1 from the vehicle 1 to the object at the previous image capturing by the following expression (3).
  • Z1: distance from the vehicle 1 to the object at the previous image capturing
  • Z2: distance from the vehicle 1 to the object at the current image capturing
  • Vj: traveling speed of the vehicle 1
  • dT: image capturing interval
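Expression (3) is also rendered as an image in the source; from its variable list it is presumably the constant-speed relation between the previous and current distances:

```latex
Z_1 = Z_2 + V_j \cdot dT
```

That is, the previous distance is recovered by adding the ground covered by the vehicle during one capturing interval to the current distance.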
  • the real space position calculating unit 14 calculates the real space positions of the object in the current and the previous image capturing, from the position of the image portion of the object in the current and the previous binary images.
  • Fig. 8(a) shows the position Pi_2 (x12, y12) of the current image portion of the object and the position Pi_1 (x11, y11) of the previous image portion of the object in the binary image Im4, wherein the ordinate y is set to the vertical direction of the image and the abscissa x to the horizontal direction of the image.
  • Fig. 8(b) shows a moving state of the object in real space, wherein Z axis is set to a traveling direction of the vehicle 1, and X axis is set to a direction orthogonal to the Z axis.
  • Pr_2 (X12, Y12, Z12) shows the position of the object at the current image capturing, and Pr_1 (X11, Y11, Z11) shows the position of the object at the previous image capturing.
  • Vm is the motion vector of the object in real space estimated from Pr_2 and Pr_1.
  • The real space position calculating unit 14 calculates the real space coordinate Pr_2 (X12, Y12, Z12) of the object at the current image capturing from the following expression (4), and the real space coordinate Pr_1 (X11, Y11, Z11) of the object at the previous image capturing from the following expression (5), with Z11 = Z1 and Z12 = Z2.
  • X12, Y12: real space coordinate values of the object at the current image capturing
  • x12, y12: coordinate values of the image portion of the object in the current binary image
  • Z2: distance from the vehicle to the object at the current image capturing
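Expressions (4) and (5) are shown only as images; given the definition f = F / p used for expression (1), they are presumably the standard perspective back-projection, with image coordinates taken relative to the image center. A sketch with illustrative values:

```python
# f_pixels corresponds to f = F / p; 400.0 is an illustrative value.
f_pixels = 400.0

def to_real_space(x_img, y_img, Z):
    """Back-project an image point (relative to the image center, in
    pixels) at known distance Z [m] to real-space (X, Y) [m]."""
    X = x_img * Z / f_pixels
    Y = y_img * Z / f_pixels
    return X, Y

print(to_real_space(40.0, -20.0, 20.0))  # → (2.0, -1.0)
```

Applying this at both capturing time points with Z = Z2 and Z = Z1 yields Pr_2 and Pr_1 respectively.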
  • The real space position calculating unit 14 performs a turning angle correction, correcting the position deviation on the image caused by the turning of the vehicle 1, on the basis of the turning angle recognized from the detected signal YR of the yaw rate sensor 3. Specifically, when the turning angle of the vehicle 1 from the previous image capturing to the current image capturing is θr, the real space coordinate values are corrected with the following expression (6).
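Expression (6) is not reproduced in the source; since yawing rotates the camera frame about the vertical Y axis, the correction is presumably a planar rotation of the real-space coordinates by θr in the X-Z plane. A sketch under that assumption (the sign convention is a guess):

```python
import math

def turn_correct(X, Z, theta_r):
    """Rotate a real-space point by the turning angle theta_r [rad]
    in the X-Z plane to undo the vehicle's yaw between captures."""
    Xc = X * math.cos(theta_r) - Z * math.sin(theta_r)
    Zc = X * math.sin(theta_r) + Z * math.cos(theta_r)
    return Xc, Zc
```

With θr = 0 the correction is the identity, and for any θr the distance from the origin is preserved, as a pure rotation should.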
  • Subsequent STEP 7 is a process by the motion vector calculating unit 15.
  • the motion vector calculating unit 15 obtains an approximation straight line Vm corresponding to a relative motion vector between the object and the self vehicle 1, from the real space position Pr_1 of the previous image capturing and the real space position Pr_2 of the current image capturing for the identical object.
  • The relative motion vector may be obtained using the real space positions of the object at plural time points in the past. Further, a specific calculation process of the approximation straight line may be, for example, the method disclosed in Japanese Patent Laid-Open No. 2001-6096.
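The cited publication gives the patent's own line-fitting method; as a generic stand-in, the approximation straight line through several past real-space positions can be sketched as a least-squares fit via PCA (mean point plus principal direction from an SVD):

```python
import numpy as np

def fit_motion_line(points):
    """Fit an approximation straight line through real-space positions
    (oldest first). Returns (mean point, unit direction vector)."""
    pts = np.asarray(points, dtype=float)
    mean = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - mean)
    direction = vt[0]  # principal direction of the point cloud
    # Orient the direction from the oldest toward the newest position.
    if np.dot(direction, pts[-1] - pts[0]) < 0:
        direction = -direction
    return mean, direction

# Illustrative positions (X, Y, Z) of an approaching object over 3 cycles:
mean, d = fit_motion_line([(2.0, 0.0, 30.0),
                           (1.5, 0.0, 25.0),
                           (1.0, 0.0, 20.0)])
# d points in the negative-Z direction: the object is closing on the vehicle.
```

The fitted direction vector serves as the relative motion vector Vm used by the warning judgment.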
  • In STEP 9, when the object exists within an approach judging region in front of the vehicle, or when the motion vector of the object is oriented toward the inside of the approach judging region, the warning judging unit 16 sets the object as a target of warning.
  • the warning judging unit 16 further determines whether or not a braking operation is performed by the driver from the output from the brake sensor 5.
  • When a braking operation is performed and the acceleration of the vehicle 1 (here, the decelerating direction is positive) is larger than a predetermined acceleration threshold value (that is, an appropriate braking operation by the driver is assumed), the warning judging unit 16 determines that no warning output is necessary because an avoiding operation is being performed, and proceeds to STEP 10.
  • Otherwise, the process branches to STEP 30.
  • the warning judging unit 16 outputs alarming sound from the loudspeaker 6 in STEP 30, and also displays a highlighted image of the object to the HUD 7 in STEP 31, and proceeds to STEP 10.
  • the distance calculating unit 13 calculated, in the second distance calculating process, the rate of change Rate by a time tracking operation of the image portion of the identical object between the binary images shown in Fig. 7 .
  • the rate of change Rate may be calculated by a correlation calculation of the image portion of the object shown in Fig. 9 .
  • Im5 is a grayscale image at the previous image capturing, and 51 shows the image portion of the object.
  • Im6 is a grayscale image at the current image capturing, and 52 shows the image portion of the object.
  • The distance calculating unit 13 reduces (in the case where the object is approaching the self vehicle) or expands (in the case where the object is departing from the self vehicle) the size of the image portion 52 of the object in the current grayscale image Im6 by affine transformation, and calculates the degree of correlation with the image portion 51 of the object at the previous image capturing. Specifically, as shown in the figure, the degree of correlation is calculated between the image portion 51 and each of an image 60 obtained by scaling the image portion 52 to 150%, an image 61 scaled to 125%, an image 62 scaled to 100%, an image 63 scaled to 75%, and an image 64 scaled to 50%. Thereafter, the distance calculating unit 13 takes the magnification of the image portion 52 having the highest degree of correlation as the rate of change Rate.
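This scale-search variant can be sketched as follows. Nearest-neighbour zoom stands in for the affine transformation, zero-mean normalised correlation for the degree of correlation, and the candidate magnifications are those named in the figure; the test pattern is invented for the example, and the sketch only handles scaled patches at least as large as the previous one:

```python
import numpy as np

def nn_resize(img, out_h, out_w):
    """Nearest-neighbour resize of a 2-D array."""
    rows = np.arange(out_h) * img.shape[0] // out_h
    cols = np.arange(out_w) * img.shape[1] // out_w
    return img[np.ix_(rows, cols)]

def center_crop(img, h, w):
    r0 = (img.shape[0] - h) // 2
    c0 = (img.shape[1] - w) // 2
    return img[r0:r0 + h, c0:c0 + w]

def ncc(a, b):
    """Zero-mean normalised cross-correlation of equal-sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def best_rate(prev_patch, cur_patch, scales=(1.5, 1.25, 1.0, 0.75, 0.5)):
    """Magnification of cur_patch best matching prev_patch = Rate."""
    h, w = prev_patch.shape
    best_c, best_s = -2.0, None
    for s in scales:
        sh = int(round(cur_patch.shape[0] * s))
        sw = int(round(cur_patch.shape[1] * s))
        if sh < h or sw < w:
            continue  # skip candidates smaller than the previous patch
        cand = center_crop(nn_resize(cur_patch, sh, sw), h, w)
        c = ncc(prev_patch, cand)
        if c > best_c:
            best_c, best_s = c, s
    return best_s

# Illustrative pattern: the current patch is the previous one doubled in
# size (object approaching), so the best magnification should be 0.5.
prev = np.zeros((6, 6))
prev[1, 1:5] = 1.0
prev[1:5, 2] = 1.0
cur = nn_resize(prev, 12, 12)
print(best_rate(prev, cur))  # → 0.5
```

The winning magnification plugs directly into the second distance calculating process as the rate of change Rate.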
  • In the present embodiment, a configuration for capturing the front of the vehicle is shown; however, other directions, such as the rear or the sides of the vehicle, may be captured to determine the contact possibility with a monitored object.
  • The present invention is also applicable to other types of physical bodies, such as a large-sized wild animal crossing the road, whose image portion is assumed to change in shape between time-series captured images to a degree that makes it difficult to extract as the image portion of an identical physical body, by assuming the size (height, width, and the like) of the object type in advance and setting the correlative relationship between the distance from the vehicle 1 and the size of the image portion of the physical body in the captured image.
  • the first distance calculating process and the second distance calculating process are switched after the determination of the crossing pedestrian.
  • Alternatively, a determination may be performed on whether or not the physical body is one with small change in shape between the time-series images (a vehicle, a predetermined stationary object, and the like); the distance between the vehicle 1 and the physical body may then be calculated using the second distance calculating process when the physical body is determined to have small change in shape, and using the first distance calculating process otherwise.
  • In this case, the determination on whether or not the physical body has small change in shape is performed in place of the determination of the crossing pedestrian in STEP 4 of Fig. 3: the process proceeds to STEP 5 and calculates the distance between the vehicle 1 and the physical body by the second distance calculating process when the physical body has small change in shape, and branches to STEP 20 and calculates the distance by the first distance calculating process when it does not.
  • According to the vehicle surroundings monitoring device of the present invention, it becomes possible to suppress the decrease in the accuracy of calculating the distance between a physical body and the vehicle on the basis of images captured by a single camera; it is therefore useful for monitoring the surroundings of a vehicle by calculating the distances between the vehicle and the physical bodies in its surroundings.

Abstract

If an object type determining unit (12) determines that the type of an object in a real space corresponding to an image portion extracted from a captured image is a pedestrian crossing a road, a distance calculating unit (13) performs first distance calculation processing of calculating the distance between the object in the real space corresponding to the image portion extracted from the captured image and a vehicle (1) according to the height of the image portion on the basis of correlation between the distance from the vehicle (1) in the real space and the height of the image portion in the captured image set for an assumed height of the pedestrian. If the object type determining unit (12) determines that the type of the object is not a pedestrian crossing a road, the distance calculating unit (13) performs second distance calculation processing of calculating the distance between the object and the vehicle on the basis of a change in the area of image portions of the object extracted from time-series images captured by an infrared camera (2).

Description

    Technical Field
  • The present invention relates to a vehicle surroundings monitoring device which monitors the surroundings of a vehicle using images captured by a camera mounted on the vehicle.
  • Background Art
  • Conventionally, there is known a vehicle surroundings monitoring device which monitors the surroundings of a vehicle with a single camera mounted on the vehicle, by extracting image portions assumed to belong to an identical physical body from time-series images captured by the camera, and calculating the distance between the physical body and the vehicle from the rate of change of their size (for example, refer to Patent Document 1).
  • Prior Art Documents Patent Documents
  • Patent Document 1: Japanese Patent No. 4267657
  • Summary of the Invention Problems to be Solved by the Invention
  • According to the vehicle surroundings monitoring device disclosed in Patent Document 1 mentioned above, it becomes possible to calculate the distance between the physical body and the vehicle comparatively accurately, by extracting the image portions of the identical physical body from the time-series images.
  • However, in the case where the physical body as the target of distance calculation is a pedestrian, a wild animal, or the like crossing a road, and the posture change accompanying its movement as seen from the vehicle is large, the change in shape of the image portions of the physical body between the time-series captured images becomes large. Therefore, there is a risk that the extraction of the image portions of the identical physical body from the time-series images becomes difficult, and the calculation accuracy of the distance between the vehicle and the physical body decreases.
  • The present invention has been made in view of the above-mentioned background, and aims to provide a vehicle surroundings monitoring device capable of suppressing the decrease in the accuracy of calculating the distance between a physical body and the vehicle on the basis of images captured by a single camera.
  • Means for Solving the Problems
  • The present invention has been made in order to achieve the object mentioned above, and relates to a vehicle surroundings monitoring device, comprising: a distance calculating unit which calculates, on a basis of a captured image by a single camera mounted on a vehicle, a distance between a vehicle and a physical body in real space corresponding to an image portion extracted from the captured image; and a physical body type judging unit which judges a type of the physical body in real space corresponding to the image portion.
  • And, the distance calculating unit determines a change in shape of the image portion or a change in shape of the physical body in real space corresponding to the image portion, in a predetermined period, executes a first distance calculating process, when the change in shape exceeds a predetermined level, of calculating a distance between the physical body and the vehicle, on the basis of a correlative relationship between a distance from the vehicle in real space set on assumption of the type of the physical body and a size of the image portion in the captured image, according to the size of the image portion of the physical body extracted from the captured image, and executes a second distance calculating process, when the change in shape is equal to or less than the predetermined level, of calculating the distance between the physical body and the vehicle, on the basis of the change in size of image portions of the physical body extracted from time-series captured images by the camera (a first aspect of the invention).
  • According to the first aspect of the invention, the distance calculating unit determines the change in shape of the image portion, or of the physical body in real space corresponding to the image portion, in a predetermined period, and when the change in shape exceeds the predetermined level, performs the first distance calculating process and calculates the distance between the vehicle and the physical body. With such a configuration, by using the first distance calculating process for a physical body with a large change in shape, it becomes possible to suppress the decrease in the calculation accuracy of the distance between the physical body and the vehicle.
  • Further, in a case where the physical body is a pedestrian crossing a road, the distance calculating unit calculates, in the first distance calculating process, the distance between the object in real space corresponding to the image portion and the vehicle, on the basis of a correlative relationship between the distance from the vehicle in real space and the height of the image portion in the captured image, set given that the height of the pedestrian is a predetermined value, according to the height of the image portion extracted from the captured image (a second aspect of the invention).
  • According to the second aspect of the invention, when it is determined by the physical body type judging unit that the type of the physical body in real space corresponding to the image portion extracted from the captured image is the pedestrian, the distance calculating unit may calculate the distance between the vehicle and the pedestrian crossing the road captured by the camera, on the basis of the correlative relationship between the height of the image portion in the captured image and the distance from the vehicle in real space, which is set given that the height of the pedestrian is the predetermined value.
  • Brief Description of the Drawings
  • Fig. 1 is an explanatory view of a mounting manner of a vehicle surroundings monitoring device to a vehicle;
    • Fig. 2 is a configuration view of the vehicle surroundings monitoring device;
    • Fig. 3 is an operational flow chart of the vehicle surroundings monitoring device;
    • Fig. 4 is an explanatory view of a pedestrian determination;
    • Fig. 5 is an explanatory view of a motion of the pedestrian crossing a road;
    • Fig. 6 is an explanatory view of a correlative relationship between a distance and a height of an image portion of the pedestrian;
    • Fig. 7 is an explanatory view of a change in size of the image portion of an object in time-series images;
    • Fig. 8 is an explanatory view of an estimating process of a motion vector of the object in real space; and
    • Fig. 9 is an explanatory view of a calculating process of a rate of change of the image portion of a physical body by pattern matching.
    Mode for Carrying out the Invention
  • An embodiment of the present invention will be explained with reference to Fig. 1 through Fig. 9. With reference to Fig. 1, a vehicle surroundings monitoring device 10 of the present embodiment is used by being mounted on a vehicle 1. The vehicle 1 is equipped with an infrared camera 2 (corresponding to a camera of the present invention) capable of detecting far infrared rays.
  • The infrared camera 2 is fixed to a front side of the vehicle 1 to capture images in front of the vehicle 1. A real space coordinate system is defined taking a front side of the vehicle 1 as an origin O, the lateral direction of the vehicle 1 as the X axis, the vertical direction as the Y axis, and the anteroposterior direction as the Z axis. Alternatively, a camera having sensitivity in another wavelength band, such as visible light, may be used in place of the infrared camera 2.
  • Next, with reference to Fig. 2, the vehicle 1 is connected with a yaw rate sensor 3 for detecting a yaw rate of the vehicle 1, a vehicle speed sensor 4 which detects a traveling speed of the vehicle 1, a brake sensor 5 which detects a manipulation amount of a brake by a driver, a loudspeaker 6 for performing attention-seeking by voice and the like, and a head up display 7 (hereinafter referred to as HUD 7) for displaying an image captured by the infrared camera 2 and performing display to make the driver visualize a physical body having high possibility of coming into contact with the vehicle 1. As is shown in Fig. 1, the HUD 7 is provided so as to display a screen 7a on a windshield of the vehicle 1 at a forefront position on the driver's side.
  • The vehicle surroundings monitoring device 10 is an electronic unit configured from a CPU, a memory (not shown) and the like, and has functions of converting a video signal output from the infrared camera 2 into digital data and taking it into an image memory (not shown), and of performing, with the CPU, various arithmetic processing on the captured image in front of the vehicle 1 taken into the image memory.
  • Thereafter, by making the CPU execute a control program of the vehicle surroundings monitoring device 10, the CPU functions as an object extracting unit 11 which extracts an image portion having a predetermined condition from the image captured by the infrared camera 2, a physical body type judging unit 12 which judges a type of the physical body in real space corresponding to the extracted image portion, a distance calculating unit 13 which calculates a distance between the physical object and the vehicle 1, a real space position calculating unit 14 which calculates a real space position of the physical body, a motion vector calculating unit 15 which calculates a motion vector of the physical body in the real space, and a warning judging unit 16 which determines whether or not to make the physical body a target of warning on the basis of the motion vector.
  • Next, according to a flow chart shown in Fig. 3, a series of a vehicle surroundings monitoring process by the vehicle surroundings monitoring device 10 will be explained. The vehicle surroundings monitoring device 10 executes a process by the flow chart shown in Fig. 3 every predetermined control cycle, and monitors the surroundings of the vehicle 1.
  • In STEP 1, the vehicle surroundings monitoring device 10 inputs the video signal output from the infrared camera 2, and takes into the image memory a gray scale image obtained by converting the video signal into digital gradation (luminance) data.
  • Subsequent STEP 2 is a process by the object extracting unit 11. The object extracting unit 11 obtains a binary image by performing, for each pixel in the gray scale image, a binarization process of setting "1" (white) for a pixel with the luminance of a predetermined threshold value or more, and "0" (black) for a pixel with the luminance smaller than the threshold value. Thereafter, the object extracting unit 11 calculates a run length data of each white region in the binary image, performs a labeling process and the like, and extracts the image portion of the physical body.
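The binarization and labeling in STEP 2 can be sketched as follows. This is a minimal Python/NumPy illustration, not the patent's implementation: the function name and the 4-connectivity flood fill are our assumptions (the patent groups pixels via run-length data), but the effect is the same, each connected white region yielding a bounding box as a candidate image portion.

```python
import numpy as np

def extract_candidate_regions(gray, threshold):
    """Binarize a grayscale image ("1" for luminance >= threshold,
    "0" otherwise) and label 4-connected white regions, returning the
    binary image and one bounding box (i0, i1, j0, j1) per region."""
    binary = (gray >= threshold).astype(np.uint8)
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=int)
    boxes, next_label = [], 0
    for i in range(h):
        for j in range(w):
            if binary[i, j] and not labels[i, j]:
                next_label += 1
                labels[i, j] = next_label
                stack = [(i, j)]
                i0 = i1 = i
                j0 = j1 = j
                while stack:  # flood fill one white region
                    y, x = stack.pop()
                    i0, i1 = min(i0, y), max(i1, y)
                    j0, j1 = min(j0, x), max(j1, x)
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny, nx] and not labels[ny, nx]):
                            labels[ny, nx] = next_label
                            stack.append((ny, nx))
                boxes.append((i0, i1, j0, j1))
    return binary, boxes
```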
  • Subsequent STEP 3 is a process by the physical body type judging unit 12. The physical body type judging unit 12 determines whether or not the physical body extracted in STEP 2 (hereinafter referred to as an object) is a pedestrian crossing a road (hereinafter referred to as a crossing pedestrian), on the basis of an amount of characteristics of the image portion.
  • Fig. 4 illustrates the image of the crossing pedestrian. The physical body type judging unit 12 searches upper regions M(1), M(2) of an image portion Tk estimated as the head of the crossing pedestrian and lower regions M(3) through M(8), and recognizes the characteristics of each region. Thereafter, the physical body type judging unit 12 judges the type of the object as the crossing pedestrian when the upper regions M(1), M(2) of the image portion Tk have no characteristic portion, and a reversed V shape (two slant edges crossing at the top), which is a characteristic specific to the crossing pedestrian, is recognized at the bottom end in the lower regions M(3) through M(8). Further, the physical body type judging unit 12 sets an object region Rk including the image portion Tk estimated as the head through the lower end M(7).
  • The judging of the crossing pedestrian may also be performed on the basis of the complexity of the image, a change in luminance dispersion, a periodicity of the image, and the like.
  • Subsequent STEP 4 through STEP 5 and STEP 20 are processes performed by the distance calculating unit 13. In STEP 4, the distance calculating unit 13 determines whether or not the type of the object has been judged as the crossing pedestrian. When it is determined as the crossing pedestrian (in this case, it is assumed that the change in shape of the object and the image portion is large and exceeds a predetermined level), the process branches to STEP 20 and the first distance calculating process is executed. On the other hand, when it is judged that the type of the object is not the crossing pedestrian (in this case, it is assumed that the change in shape of the object and the image portion is small, at or below the predetermined level), the distance calculating unit 13 proceeds to STEP 5 and executes the second distance calculating process.
  • The second distance calculating process, as explained later, calculates the distance between the object and the vehicle 1 on the basis of the rate of change of the size of the image portions of an identical object extracted from time-series images captured by the infrared camera 2.
  • As is shown in Ik1 through Ik4 in Fig. 5, the crossing pedestrian crosses in front of the vehicle 1 while moving both legs and arms widely. Therefore, the shape of the image portion of the crossing pedestrian (width W and the like) in the image captured by the infrared camera 2 changes greatly. As such, there are cases where it is not possible to extract the image portions of the crossing pedestrian as image portions of an identical physical body between the time-series images captured by the infrared camera 2, and even when extraction succeeds, it is difficult to obtain the rate of change of size with good accuracy.
  • Therefore, when it is judged that the type of the object is the crossing pedestrian, the distance calculating unit 13 calculates in STEP 20 the distance between the object and the vehicle 1, on the basis of a height of the image portion of the object in a single captured image.
  • As is shown in Fig. 6, in the case where the height of a pedestrian is taken as constant (for example, 170 cm), the height H of the image portion of the pedestrian in a captured image Im1 by the infrared camera 2 becomes lower as the distance between the pedestrian and the vehicle 1 becomes longer. Therefore, the distance calculating unit 13 calculates the distance between the object and the vehicle 1 in real space by taking the height of the pedestrian in real space as constant and applying the height of the image portion of the object in the captured image to a map, or a correlation equation, that sets the correlative relationship between the distance L between the pedestrian and the vehicle 1 in real space and the height H of the image portion of the pedestrian in the captured image.
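In the simplest case, the correlative relationship of the first distance calculating process reduces to the pinhole-camera relation: with the real-space pedestrian height fixed at a constant H (1.7 m, as in the example above) and f denoting the focal length divided by the pixel pitch, the image height is h = f·H/Z, so Z = f·H/h. A minimal sketch of this, with a hypothetical function name and an illustrative value of f (neither is from the patent):

```python
def pedestrian_distance(height_px, f, assumed_height_m=1.7):
    """First distance calculating process (sketch): distance to a
    pedestrian from the height of its image portion alone, assuming a
    fixed real-world pedestrian height. From h = f * H / Z it follows
    that Z = f * H / h."""
    return f * assumed_height_m / height_px
```

For example, with f = 1000 (pixels) an 85-pixel-tall image portion corresponds to a distance of 20 m.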
  • Further, when it is judged that the type of the object is not the crossing pedestrian, the distance calculating unit 13 calculates in STEP 5 the distance between the object and the vehicle 1 by executing the second distance calculating process. As is shown in Fig. 7, in the second distance calculating process, a process of tracking the image portions of the identical object between an image Im2 captured at the previous control cycle (capturing time point t11) and an image Im3 captured at the current control cycle (capturing time point t12) is performed. The tracking process is explained in detail, for example, in Japanese Patent Laid-Open No. 2007-213561, so the explanation thereof is omitted in this specification.
  • Then, the distance calculating unit 13 calculates a rate of change Rate by dividing a width w11 of an image portion 31 in the image Im2 by a width w12 of an image portion 32 in the image Im3, according to the following expression (1). A relative speed Vs between the vehicle 1 and the object is approximated by the traveling speed of the vehicle detected by the vehicle speed sensor 4.
  • [Expression 1] Rate = w11 (previous) / w12 (current) = (f·W/Z1) / (f·W/Z2) = Z2 / Z1 = Z2 / (Z2 + Vs·dT) = Vs·T / (Vs·T + Vs·dT) = 1 / (1 + dT/T)
  • Where w11: width of image portion of object at previous image capturing (capturing time point t11), w12: width of image portion of object at current image capturing (capturing time point t12), f: f = F (focal length of infrared camera 2) / p (pixel pitch of captured image), W: width of object in real space, Z1: distance from vehicle 1 to object at previous image capturing (capturing time point t11), Z2: distance from vehicle 1 to object at current image capturing (capturing time point t12), Vs: relative speed between vehicle and object, dT: image capturing interval, T: arrival time of self vehicle (estimated time until object reaches vehicle 1).
  • Subsequently, the distance calculating unit 13 calculates the distance Z2 between the vehicle 1 and the object at the current image capturing with expression (2) below, which is obtained by substituting the relative speed Vs between the vehicle 1 and the object (= the traveling speed Vj of the vehicle 1 + the traveling speed Vd of the object) in the above-mentioned expression (1) with the traveling speed Vj of the vehicle 1, assuming that the traveling speed Vj of the vehicle 1 is sufficiently higher than the traveling speed Vd of the object, and solving for Z2.
  • [Expression 2] Z2 = Rate · Vj · dT / (1 − Rate)
  • Where Z2: distance from vehicle 1 to object at current image capturing, Rate: rate of change, Vj: traveling speed of vehicle 1, dT: image capturing interval.
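Expressions (1) and (2) can be sketched as two small helper functions (the names are ours, not from the patent); the test of consistency is that widths proportional to 1/Z, per the pinhole relation in expression (1), must reproduce the true current distance:

```python
def rate_of_change(w_prev, w_curr):
    """Rate = w11 / w12, per expression (1): the width of the image
    portion at the previous capture over the width at the current one."""
    return w_prev / w_curr

def distance_from_rate(rate, vj, dt):
    """Z2 = Rate * Vj * dT / (1 - Rate), per expression (2), under the
    stated assumption that the relative speed Vs is approximated by the
    vehicle's own traveling speed Vj."""
    return rate * vj * dt / (1.0 - rate)
```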
  • Subsequent STEP 6 is a process by the real space position calculating unit 14. The real space position calculating unit 14 calculates the distance Z1 from the vehicle 1 to the object at the previous image capturing by the following expression (3).
  • [Expression 3] Z1 = Z2 + Vj · dT
  • Where Z1: distance from vehicle 1 to object at previous image capturing, Z2: distance from vehicle 1 to object at current image capturing, Vj: traveling speed of vehicle 1, dT: image capturing interval.
  • Then, the real space position calculating unit 14 calculates the real space positions of the object in the current and the previous image capturing, from the position of the image portion of the object in the current and the previous binary images.
  • Fig. 8(a) shows a position Pi_2 (x12, y12) of the current image portion of the object, and a position Pi_1 (x11, y11) of the previous image portion of the object in the binary image Im4, wherein an axis of ordinate y is set to a vertical direction of the image, and an axis of abscissas x is set to a horizontal direction of the image.
  • Further, Fig. 8(b) shows a moving state of the object in real space, wherein Z axis is set to a traveling direction of the vehicle 1, and X axis is set to a direction orthogonal to the Z axis. In the figure, Pr_2 (X12, Y12, Z12) shows a position of the object at the current image capturing, and Pr_1 (X11, Y11, Z11) shows a position of the object at the previous image capturing. Further, Vm is the motion vector of the object in real space estimated from Pr_2 and Pr_1.
  • The real space position calculating unit 14 calculates a real space coordinate Pr_2 (X12, Y12, Z12) of the object at current image capturing from following expression (4), and calculates a real space coordinate Pr_1 (X11, Y11, Z11) of the object at previous image capturing from following expression (5). Here, Z11=Z1, and Z12=Z2.
  • [Expression 4] X12 = x12 · Z2 / f, Y12 = y12 · Z2 / f
  • Where X12, Y12: real space coordinate values of object at current image capturing, x12, y12: coordinate values of image portion of object at current binary image, Z2: distance from vehicle to object at current image capturing, f: f=F (focal length of infrared camera) / p (pixel pitch of captured image).
  • [Expression 5] X11 = x11 · Z1 / f, Y11 = y11 · Z1 / f
  • Where X11, Y11: real space coordinate values of object at previous image capturing, x11, y11: coordinate values of image portion of object at previous binary image, Z1: distance from vehicle to object at previous image capturing, f: f=F (focal length of infrared camera) / p (pixel pitch of captured image).
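Expressions (4) and (5) are the same back-projection at two time points, so one helper suffices; this is an illustrative sketch (the function name is ours), taking the image coordinates relative to the image center:

```python
def image_to_real(x_px, y_px, z, f):
    """Expressions (4)/(5): back-project image coordinates (x, y),
    relative to the image center, to real-space X, Y at a known
    distance Z, with f = focal length / pixel pitch."""
    return x_px * z / f, y_px * z / f
```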
  • Further, the real space position calculating unit 14 performs a turning angle correction, correcting the position deviation on the image caused by the turning of the vehicle 1, on the basis of a turning angle recognized from the detection signal YR of the yaw rate sensor 3. Specifically, when the turning angle of the vehicle 1 from the previous image capturing to the current image capturing is θr, the real space coordinate values are corrected with the following expression (6).
  • [Expression 6] [Xr Yr Zr]ᵀ = [cos θr 0 −sin θr; 0 1 0; sin θr 0 cos θr] · [Xo Yo Zo]ᵀ
  • Where Xr, Yr, Zr: real space coordinate values after turning angle correction, θr: turning angle, Xo, Yo, Zo: real space coordinate values before turning angle correction.
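Expression (6) is a rotation about the vertical (Y) axis by the turning angle θr, which can be sketched directly (the function name is ours):

```python
import math

def turning_angle_correction(p, theta_r):
    """Expression (6): rotate the real-space coordinates (Xo, Yo, Zo)
    about the Y axis by the turning angle theta_r accumulated between
    the previous and the current image capturing."""
    x, y, z = p
    c, s = math.cos(theta_r), math.sin(theta_r)
    return (c * x - s * z,  # Xr = cos(θr)·Xo − sin(θr)·Zo
            y,              # Yr = Yo
            s * x + c * z)  # Zr = sin(θr)·Xo + cos(θr)·Zo
```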
  • Subsequent STEP 7 is a process by the motion vector calculating unit 15. As is shown in Fig. 8(b), the motion vector calculating unit 15 obtains an approximation straight line Vm corresponding to a relative motion vector between the object and the self vehicle 1, from the real space position Pr_1 of the previous image capturing and the real space position Pr_2 of the current image capturing for the identical object.
  • The relative motion vector may also be obtained using the real space positions of the object at plural time points in the past. Further, for a specific calculation process of the approximation straight line, for example, the method disclosed in Japanese Patent Laid-Open No. 2001-6096 may be used.
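The patent defers the exact line-fitting method to Japanese Patent Laid-Open No. 2001-6096; as one common alternative (our choice, not the patent's), a least-squares approximation straight line through the past real-space positions can be obtained from the first principal direction of the centered points via SVD:

```python
import numpy as np

def relative_motion_vector(positions):
    """Fit an approximation straight line through past real-space
    positions (least squares via SVD) and return its unit direction,
    oriented from the oldest position toward the newest one."""
    pts = np.asarray(positions, dtype=float)
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    direction = vt[0]  # first principal direction of the point cloud
    # SVD fixes the direction only up to sign; orient it chronologically.
    if np.dot(direction, pts[-1] - pts[0]) < 0:
        direction = -direction
    return direction
```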
  • Subsequent STEP 9 and STEP 30 through STEP 31 are processes performed by the warning judging unit 16. In STEP 9, when the object exists within an approach judging region in front of the vehicle, or when the motion vector of the object is oriented into the approach judging region, the warning judging unit 16 sets the object as a target of warning.
  • Thereafter, if the object is set as the warning target, the warning judging unit 16 further determines, from the output of the brake sensor 5, whether or not a braking operation is performed by the driver. When the braking operation is performed, and the acceleration of the vehicle 1 (here, the decelerating direction is positive) is larger than a predetermined acceleration threshold value (that is, an appropriate braking operation is assumed to be performed by the driver), the warning judging unit 16 determines that no warning output is necessary because an avoiding operation is being performed, and proceeds to STEP 10.
  • On the other hand, when no braking operation is performed, or when the acceleration of the vehicle 1 is equal to or less than the acceleration threshold value, the process branches to STEP 30. The warning judging unit 16 outputs an alarm sound from the loudspeaker 6 in STEP 30, displays a highlighted image of the object on the HUD 7 in STEP 31, and proceeds to STEP 10.
  • In the present embodiment, the distance calculating unit 13 calculated the rate of change Rate in the second distance calculating process by tracking the image portion of the identical object over time between the binary images, as shown in Fig. 7. However, the rate of change Rate may also be calculated by a correlation calculation on the image portion of the object, as shown in Fig. 9. With reference to Fig. 9, Im5 is the grayscale image at the previous image capturing, in which 51 shows the image portion of the object; Im6 is the grayscale image at the current image capturing, in which 52 shows the image portion of the object.
  • The distance calculating unit 13 reduces (in the case where the object is approaching the self vehicle) or expands (in the case where the object is departing from the self vehicle) the size of the image portion 52 of the object in the current grayscale image Im6 by affine transformation, and calculates the degree of correlation with the image portion 51 of the object at the previous image capturing. Specifically, as shown in the figure, the degree of correlation is calculated between the image portion 51 and an image 60 obtained by scaling the image portion 52 to 150%, an image 61 obtained by scaling to 125%, an image 62 obtained by scaling to 100%, an image 63 obtained by scaling to 75%, and an image 64 obtained by scaling to 50%. Thereafter, the distance calculating unit 13 determines the magnification of the image portion 52 which gives the highest degree of correlation as the rate of change Rate.
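A hedged sketch of this correlation-based search follows. The details are our simplifications, not the patent's: a nearest-neighbour resize stands in for the affine transformation, normalized cross-correlation serves as the degree of correlation, and candidates are compared against the previous image portion over a central crop (candidates smaller than the previous portion are skipped):

```python
import numpy as np

def resize_nn(img, shape):
    """Nearest-neighbour resize, standing in for the affine transform."""
    h, w = shape
    rows = np.arange(h) * img.shape[0] // h
    cols = np.arange(w) * img.shape[1] // w
    return img[np.ix_(rows, cols)]

def central_crop(img, shape):
    """Central crop of img to a smaller-or-equal shape."""
    h, w = shape
    y0 = (img.shape[0] - h) // 2
    x0 = (img.shape[1] - w) // 2
    return img[y0:y0 + h, x0:x0 + w]

def ncc(a, b):
    """Normalized cross-correlation between two equally sized patches."""
    a = np.asarray(a, dtype=float); a = a - a.mean()
    b = np.asarray(b, dtype=float); b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def rate_by_matching(prev_patch, curr_patch,
                     magnifications=(0.5, 0.75, 1.0, 1.25, 1.5)):
    """Scale the current image portion by each candidate magnification,
    compare with the previous image portion, and return the
    magnification with the highest degree of correlation as Rate."""
    best_mag, best_corr = None, -2.0
    for m in magnifications:
        h = max(1, int(round(curr_patch.shape[0] * m)))
        w = max(1, int(round(curr_patch.shape[1] * m)))
        if h < prev_patch.shape[0] or w < prev_patch.shape[1]:
            continue  # too small to compare in this simplified scheme
        scaled = resize_nn(curr_patch, (h, w))
        corr = ncc(central_crop(scaled, prev_patch.shape), prev_patch)
        if corr > best_corr:
            best_mag, best_corr = m, corr
    return best_mag
```

For an object that has doubled in apparent size between captures, scaling the current portion to 50% recovers the previous portion, so Rate = 0.5 is selected.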
  • Further, in the present embodiment, a configuration for capturing the front of the vehicle is shown. However, other directions, such as the rear or a side of the vehicle, may be captured to determine the contact possibility with a monitored object.
  • Further, in the present embodiment, a case where the specified target shape of the present invention is the crossing pedestrian is shown. However, the present invention is also applicable to other types of physical bodies, such as a large-sized wild animal crossing the road, whose image portion is assumed to change in shape between time-series captured images to a degree that makes it difficult to extract as the image portion of an identical physical body, by assuming the size of the object type (height, width and the like) in advance and setting the correlative relationship between the distance from the vehicle 1 and the size of the image portion of the physical body in the captured image.
  • Further, in the present embodiment, the first distance calculating process and the second distance calculating process are switched according to the determination of the crossing pedestrian. However, a determination may instead be made on whether or not the physical body is one with a small change of shape between the time-series images (a vehicle, a predetermined stationary object, and the like); the distance between the vehicle 1 and the physical body may then be calculated using the second distance calculating process when the physical body is determined to have a small change of shape, and using the first distance calculating process when it is not.
  • In this case, the determination on whether or not the physical body has a small change of shape is performed in place of the determination of the crossing pedestrian in STEP 4 of Fig. 3; the process proceeds to STEP 5 and calculates the distance between the vehicle 1 and the physical body by the second distance calculating process when the change of shape is small, and branches to STEP 20 and calculates the distance by the first distance calculating process when it is not.
  • Industrial Applicability
  • As explained above, according to the vehicle surroundings monitoring device of the present invention, it becomes possible to suppress the decrease in the calculation accuracy of the distance between a physical body and the vehicle on the basis of the image captured by a single camera; therefore, it is useful for monitoring the surroundings of the vehicle by calculating the distance between the vehicle and the physical bodies in its surroundings.
  • Description of Reference Numerals
  • 1... vehicle, 2... infrared camera (image capturing means), 3... yaw rate sensor, 4... vehicle speed sensor, 5... brake sensor, 6... loudspeaker, 7... HUD, 10... vehicle surroundings monitoring device, 11... object extracting unit, 12... physical body type judging unit, 13... distance calculating unit, 14... real space position calculating unit, 15... motion vector calculating unit, 16... warning judging unit.

Claims (2)

  1. A vehicle surroundings monitoring device, comprising:
    a distance calculating unit which calculates, on a basis of a captured image by a single camera mounted on a vehicle, a distance between the vehicle and a physical body in real space corresponding to an image portion extracted from the captured image; and
    a physical body type judging unit which judges a type of the physical body in real space corresponding to the image portion;
    wherein the distance calculating unit
    determines a change in shape of the image portion or a change in shape of the physical body in real space corresponding to the image portion, in a predetermined period,
    executes a first distance calculating process, when the change in shape exceeds a predetermined level, of calculating a distance between the physical body and the vehicle, on a basis of a correlative relationship between a distance from the vehicle in real space set on assumption of the type of the physical body and a size of the image portion in the captured image, according to the size of the image portion of the physical body extracted from the captured image, and
    executes a second distance calculating process, when the change in shape is equal to or less than the predetermined level, of calculating the distance between the physical body and the vehicle, on the basis of the change in size of image portions of the physical body extracted from time-series captured images by the camera.
  2. The vehicle surroundings monitoring device according to claim 1,
    wherein the physical body is a pedestrian crossing a road, and
    the distance calculating unit, in the first distance calculating process, calculates the distance between the physical body in real space corresponding to the image portion and the vehicle, according to the height of the image portion extracted from the captured image, on a basis of a correlative relationship between the distance from the vehicle in real space and a height of the image portion in the captured image, set given that a height of the pedestrian is a predetermined value.
EP11821400.6A 2010-08-31 2011-06-22 Vehicle surroundings monitoring device Active EP2571005B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010194971 2010-08-31
PCT/JP2011/064267 WO2012029382A1 (en) 2010-08-31 2011-06-22 Vehicle surroundings monitoring device

Publications (3)

Publication Number Publication Date
EP2571005A1 true EP2571005A1 (en) 2013-03-20
EP2571005A4 EP2571005A4 (en) 2014-03-26
EP2571005B1 EP2571005B1 (en) 2016-07-27

Family

ID=45772494

Family Applications (1)

Application Number Title Priority Date Filing Date
EP11821400.6A Active EP2571005B1 (en) 2010-08-31 2011-06-22 Vehicle surroundings monitoring device

Country Status (5)

Country Link
US (1) US8965056B2 (en)
EP (1) EP2571005B1 (en)
JP (1) JP5687702B2 (en)
CN (1) CN102985958B (en)
WO (1) WO2012029382A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016195566A1 (en) * 2015-06-04 2016-12-08 Scania Cv Ab Method and control unit for avoiding an accident at a crosswalk

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5603835B2 (en) * 2011-06-27 2014-10-08 クラリオン株式会社 Vehicle perimeter monitoring device
JP5740241B2 (en) * 2011-08-03 2015-06-24 株式会社東芝 Vehicle detection device
WO2013047088A1 (en) * 2011-09-28 2013-04-04 本田技研工業株式会社 Biometric device
KR102021152B1 (en) * 2013-05-07 2019-09-11 현대모비스 주식회사 Method for detecting pedestrians based on far infrared ray camera at night
CN103345840B (en) * 2013-05-28 2015-09-23 南京正保通信网络技术有限公司 Road incidents video detecting method is crossed at a kind of cross channel crossing
JP6473571B2 (en) * 2014-03-24 2019-02-20 アルパイン株式会社 TTC measuring device and TTC measuring program
JP6386837B2 (en) * 2014-09-02 2018-09-05 任天堂株式会社 Image processing program, information processing system, information processing apparatus, and image processing method
EP2993645B1 (en) 2014-09-02 2019-05-08 Nintendo Co., Ltd. Image processing program, information processing system, information processing apparatus, and image processing method
EP3694206A1 (en) * 2017-10-02 2020-08-12 Sony Corporation Image processing device and image processing method
EP4283575A3 (en) * 2017-10-12 2024-02-28 Netradyne, Inc. Detection of driving actions that mitigate risk
CN109934079A (en) * 2017-12-18 2019-06-25 华创车电技术中心股份有限公司 Low lighting environment object monitoring device and its monitoring method
EP4078088A2 (en) * 2020-01-03 2022-10-26 Mobileye Vision Technologies Ltd. Vehicle navigation with pedestrians and determining vehicle free space
JP7309630B2 (en) 2020-02-12 2023-07-18 日立Astemo株式会社 Image processing device
CN113246931B (en) * 2021-06-11 2021-09-28 创新奇智(成都)科技有限公司 Vehicle control method and device, electronic equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1918876A1 (en) * 2006-10-31 2008-05-07 HONDA MOTOR CO., Ltd. Vehicle environment monitoring apparatus
WO2009064227A1 (en) * 2007-11-12 2009-05-22 Autoliv Development Ab A vehicle safety system

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999030280A1 (en) * 1997-12-05 1999-06-17 Dynamic Digital Depth Research Pty. Ltd. Improved image conversion and encoding techniques
JP4267171B2 (en) * 1999-05-10 2009-05-27 本田技研工業株式会社 Pedestrian detection device
JP3515926B2 (en) 1999-06-23 2004-04-05 本田技研工業株式会社 Vehicle periphery monitoring device
CN1332556C (en) * 2002-12-26 2007-08-15 三菱电机株式会社 Image processor
JP3922245B2 (en) * 2003-11-20 2007-05-30 日産自動車株式会社 Vehicle periphery monitoring apparatus and method
WO2006014974A2 (en) * 2004-07-26 2006-02-09 Automotive Systems Laboratory, Inc. Vulnerable road user protection system
JP2007015525A (en) 2005-07-07 2007-01-25 Denso Corp Output device for outputting signal for coping with danger approach between preceding vehicle and own vehicle based on front picture photographed by camera and program for the output device
JP2007156626A (en) 2005-12-01 2007-06-21 Nissan Motor Co Ltd Object type determination device and object type determination method
US7623681B2 (en) * 2005-12-07 2009-11-24 Visteon Global Technologies, Inc. System and method for range measurement of a preceding vehicle
US8164628B2 (en) * 2006-01-04 2012-04-24 Mobileye Technologies Ltd. Estimating distance to an object using a sequence of images recorded by a monocular camera
JP4970926B2 (en) 2006-01-16 2012-07-11 本田技研工業株式会社 Vehicle periphery monitoring device
US8126210B2 (en) * 2007-04-27 2012-02-28 Honda Motor Co., Ltd. Vehicle periphery monitoring device, vehicle periphery monitoring program, and vehicle periphery monitoring method
JP2009156626A (en) * 2007-12-25 2009-07-16 Toyota Motor Corp Method and device for detecting drawing-out of air inside car compartment
JP4359710B2 (en) * 2008-02-04 2009-11-04 本田技研工業株式会社 Vehicle periphery monitoring device, vehicle, vehicle periphery monitoring program, and vehicle periphery monitoring method
JP4271720B1 (en) 2008-04-24 2009-06-03 本田技研工業株式会社 Vehicle periphery monitoring device
JP4486997B2 (en) * 2008-04-24 2010-06-23 本田技研工業株式会社 Vehicle periphery monitoring device
JP2010079569A (en) * 2008-09-25 2010-04-08 Canon Inc Information processing apparatus, processing method thereof and program
JP4943403B2 (en) * 2008-10-10 2012-05-30 本田技研工業株式会社 Vehicle periphery monitoring device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1918876A1 (en) * 2006-10-31 2008-05-07 HONDA MOTOR CO., Ltd. Vehicle environment monitoring apparatus
WO2009064227A1 (en) * 2007-11-12 2009-05-22 Autoliv Development Ab A vehicle safety system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2012029382A1 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016195566A1 (en) * 2015-06-04 2016-12-08 Scania Cv Ab Method and control unit for avoiding an accident at a crosswalk

Also Published As

Publication number Publication date
WO2012029382A1 (en) 2012-03-08
JP5687702B2 (en) 2015-03-18
CN102985958B (en) 2015-04-01
EP2571005B1 (en) 2016-07-27
US8965056B2 (en) 2015-02-24
JPWO2012029382A1 (en) 2013-10-28
EP2571005A4 (en) 2014-03-26
CN102985958A (en) 2013-03-20
US20130243261A1 (en) 2013-09-19

Similar Documents

Publication Publication Date Title
EP2571005B1 (en) Vehicle surroundings monitoring device
EP1918876B1 (en) Vehicle environment monitoring apparatus
US8766816B2 (en) System for monitoring the area around a vehicle
JP5904925B2 (en) Vehicle periphery monitoring device
US6531959B1 (en) Position detecting device
US8670590B2 (en) Image processing device
CN106470877B (en) Vehicle display device and vehicle display method
US7969466B2 (en) Vehicle surroundings monitoring apparatus
EP2270765B1 (en) Vehicle periphery monitoring device
WO2011108217A1 (en) Vehicle perimeter monitoring device
EP2323098A1 (en) Vehicle periphery monitoring device
JP2008027309A (en) Collision determination system and collision determination method
US20150169980A1 (en) Object recognition device
CN107451539B (en) Lane departure early warning method and system
EP3211368B1 (en) Stereo camera apparatus and vehicle provided with stereo camera apparatus
EP2312550B1 (en) Vehicle periphery monitoring device
US7526104B2 (en) Vehicle surroundings monitoring apparatus
EP3207523B1 (en) Obstacle detection apparatus and method
JP5430213B2 (en) Vehicle periphery monitoring device
US7545955B2 (en) Vehicle surroundings monitoring apparatus
JP4975776B2 (en) Vehicle periphery monitoring device
JP4496383B2 (en) Vehicle periphery monitoring device, vehicle periphery monitoring method, vehicle periphery monitoring program

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20121214

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAX Request for extension of the european patent (deleted)

A4 Supplementary search report drawn up and despatched

Effective date: 20140220

RIC1 Information provided on ipc code assigned before grant

Ipc: G06K 9/00 20060101ALI20140214BHEP

Ipc: G08G 1/16 20060101AFI20140214BHEP

17Q First examination report despatched

Effective date: 20141219

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20160315

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 816339

Country of ref document: AT

Kind code of ref document: T

Effective date: 20160815

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602011028654

Country of ref document: DE

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

REG Reference to a national code

Ref country code: NL

Ref legal event code: MP

Effective date: 20160727

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 816339

Country of ref document: AT

Kind code of ref document: T

Effective date: 20160727

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161127

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161027

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161028

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

Ref country code: BE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161128

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602011028654

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20161027

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20170502

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20170622

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20180228

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170622

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170622

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170630

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170630

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170622

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20170622

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20110622

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20160727

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

REG Reference to a national code

Ref country code: DE

Ref legal event code: R084

Ref document number: 602011028654

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160727

REG Reference to a national code

Ref country code: DE

Ref legal event code: R081

Ref document number: 602011028654

Country of ref document: DE

Owner name: ARRIVER SOFTWARE AB, SE

Free format text: FORMER OWNER: HONDA MOTOR CO., LTD., TOKYO, JP

Ref country code: DE

Ref legal event code: R082

Ref document number: 602011028654

Country of ref document: DE

Representative's name: BARDEHLE PAGENBERG PARTNERSCHAFT MBB PATENTANW, DE

Ref country code: DE

Ref legal event code: R082

Ref document number: 602011028654

Country of ref document: DE

Representative's name: MUELLER VERWEYEN PATENTANWAELTE PARTNERSCHAFT , DE

Ref country code: DE

Ref legal event code: R081

Ref document number: 602011028654

Country of ref document: DE

Owner name: VEONEER SWEDEN AB, SE

Free format text: FORMER OWNER: HONDA MOTOR CO., LTD., TOKYO, JP

REG Reference to a national code

Ref country code: DE

Ref legal event code: R085

Ref document number: 602011028654

Country of ref document: DE

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 602011028654

Country of ref document: DE

Representative's name: BARDEHLE PAGENBERG PARTNERSCHAFT MBB PATENTANW, DE

REG Reference to a national code

Ref country code: DE

Ref legal event code: R081

Ref document number: 602011028654

Country of ref document: DE

Owner name: ARRIVER SOFTWARE AB, SE

Free format text: FORMER OWNER: VEONEER SWEDEN AB, VARGARDA, SE

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20230509

Year of fee payment: 13