EP2639742A2 - Vehicle periphery monitoring apparatus - Google Patents

Vehicle periphery monitoring apparatus

Info

Publication number
EP2639742A2
Authority
EP
European Patent Office
Prior art keywords
vehicle
candidate
pedestrian
image
pedestrian head
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP13158347.8A
Other languages
English (en)
French (fr)
Inventor
Nobuharu Nagaoka
Makoto Aimura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honda Motor Co Ltd filed Critical Honda Motor Co Ltd
Publication of EP2639742A2
Legal status: Withdrawn (Current)

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/30Transforming light or analogous information into electric information
    • H04N5/33Transforming infrared radiation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/584Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of vehicle lights or traffic lights
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads

Definitions

  • the present invention relates to a vehicle periphery monitoring apparatus for monitoring the periphery of a vehicle based on an image captured by an infrared camera mounted on the vehicle, and more particularly to a vehicle periphery monitoring apparatus suitable for use when the vehicle is driven at night or in dark places.
  • as disclosed in Japanese Laid-Open Patent Publication No. 2003-284057 (hereinafter referred to as "JP2003-284057A"), there has heretofore been known a vehicle periphery monitoring apparatus incorporated in a driver's own vehicle which detects an object such as a pedestrian that could possibly contact the driver's own vehicle from images (a grayscale image and its binarized image) of the periphery of the driver's own vehicle captured by infrared cameras, and provides the driver with information about the detected object.
  • the vehicle periphery monitoring apparatus disclosed in JP2003-284057A detects a high-temperature area in the two images captured by a pair of left and right infrared cameras (stereo camera system) as an object, and calculates the distance to the object by determining the parallax of the object between the two images.
  • the vehicle periphery monitoring apparatus detects an object such as a pedestrian or the like that is likely to affect the traveling of the driver's own vehicle, i.e., that could possibly contact the driver's own vehicle, from the moving direction and position of the object detected in the captured images (see paragraphs [0014], [0018] of JP2003-284057A ).
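The stereo ranging step above reduces to the standard triangulation relation Z = f·B/d. Below is a minimal sketch, assuming an ideal pinhole camera pair; the focal length, baseline, and pixel pitch values are illustrative assumptions, as the patent does not specify them.

```python
def distance_from_parallax(parallax_px: float, focal_length_mm: float = 8.0,
                           baseline_m: float = 0.35,
                           pixel_pitch_mm: float = 0.006) -> float:
    """Estimate the distance Z to an object from stereo parallax:
    Z = f * B / d, with focal length f, camera baseline B, and
    parallax d. All default values are illustrative assumptions."""
    if parallax_px <= 0:
        raise ValueError("parallax must be positive for a finite distance")
    parallax_m = parallax_px * pixel_pitch_mm / 1000.0
    return (focal_length_mm / 1000.0) * baseline_m / parallax_m

# Example: a 12-pixel parallax yields roughly 39 m with the assumed optics.
print(f"{distance_from_parallax(12):.1f} m")
```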
  • a vehicle periphery monitoring apparatus disclosed in Japanese Patent No. 4521642 (hereinafter referred to as "JP4521642B2") employs a single vehicle-mounted infrared camera which captures at least two images (two frames) of an object in the periphery of a vehicle at a given interval of time.
  • as the object ahead comes closer to the vehicle, the size of its image in the later-captured frame changes more greatly from its size in the earlier-captured frame, and the more quickly the image grows, the shorter the period of time in which the object reaches the vehicle. This period of time is referred to as TTC (Time To Contact or Time to Collision).
  • the vehicle periphery monitoring apparatus judges whether an object imaged at different times is a person or a vehicle by dividing the object into local areas appropriate to each type of object, i.e., a person or a vehicle, scaling the images of the object captured at the different times to the same size, and deciding that the object is a person or a vehicle if the degree of correlation between the corresponding local areas is equal to or greater than a threshold value.
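As an illustration of that correlation test, here is a minimal sketch; it assumes the local areas have already been cut out and rescaled to a common size as described above, and the 0.8 threshold is an assumption, not a value taken from the patent.

```python
import numpy as np

def correlation_degree(patch_a: np.ndarray, patch_b: np.ndarray) -> float:
    """Zero-mean normalized cross-correlation of two equally sized
    grayscale patches; the result lies in [-1, 1]."""
    a = patch_a.astype(np.float64) - patch_a.mean()
    b = patch_b.astype(np.float64) - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def same_type(local_areas_t0, local_areas_t1, threshold: float = 0.8) -> bool:
    """Decide that two detections belong to the assumed object type when
    every pair of corresponding local areas correlates at or above the
    threshold (an assumed value)."""
    return all(correlation_degree(a, b) >= threshold
               for a, b in zip(local_areas_t0, local_areas_t1))
```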
  • when the vehicle periphery monitoring apparatus of the related art detects a person, i.e., a pedestrian, at night or in dark places, it can easily identify the shape of the person's head from the image captured by the infrared camera, because the head is exposed, has a high surface temperature, and has a round shape.
  • the infrared camera of the vehicle periphery monitoring apparatus of the related art captures the front end of another vehicle, e.g., an oncoming vehicle, at night, it can easily identify the headlights thereof that are positioned at respective ends in the transverse directions of the other vehicle.
  • the infrared camera of the vehicle periphery monitoring apparatus captures the rear end of another vehicle, e.g., a preceding vehicle running ahead in the same direction, at night, it can easily identify the taillights thereof that are positioned at respective ends in the transverse directions of the other vehicle.
  • the vehicle periphery monitoring apparatus of the related art finds it difficult to distinguish between the headlights or taillights of other vehicles and the heads of pedestrians. Furthermore, as described later, the vehicle periphery monitoring apparatus of the related art occasionally fails to decide that there are two headlights or taillights on other vehicles on account of heat emitted by the exhaust pipes, etc. of the other vehicles and spread to the vehicle bodies of the other vehicles.
  • a vehicle periphery monitoring apparatus for detecting an object in the periphery of a vehicle based on an image captured by an infrared camera mounted on the vehicle, and determining the type of the detected object, comprising a pedestrian head candidate extractor for extracting a pedestrian head candidate from the image, an other vehicle candidate detector for detecting a high-luminance area which is greater in area than the pedestrian head candidate and has a horizontal length equal to or greater than a prescribed width, within a prescribed range beneath or below the extracted pedestrian head candidate, and an other vehicle determiner for determining the pedestrian head candidate as part of another vehicle when the other vehicle candidate detector detects the high-luminance area.
  • a vehicle periphery monitoring apparatus for detecting an object in the periphery of a vehicle based on an image captured by an infrared camera mounted on the vehicle, and determining the type of the detected object, comprising pedestrian head candidate extracting means for extracting a pedestrian head candidate from the image, other vehicle candidate detecting means for detecting a high-luminance area which is greater in area than the pedestrian head candidate and has a horizontal length equal to or greater than a prescribed width, within a prescribed range beneath the extracted pedestrian head candidate, and other vehicle determining means for determining the pedestrian head candidate as part of another vehicle when the other vehicle candidate detecting means detects the high-luminance area.
  • a method of determining a type of an object for use in a vehicle periphery monitoring apparatus for detecting an object in the periphery of a vehicle based on an image captured by an infrared camera mounted on the vehicle comprising a pedestrian head candidate extracting step of extracting a pedestrian head candidate from the image, an other vehicle candidate detecting step of detecting a high-luminance area which is greater in area than the pedestrian head candidate and has a horizontal length equal to or greater than a prescribed width, within a prescribed range beneath the extracted pedestrian head candidate, and an other vehicle determining step of determining the pedestrian head candidate as part of another vehicle when the high-luminance area is detected in the other vehicle candidate detecting step.
  • when a high-luminance area which is greater in area than the pedestrian head candidate and has a horizontal length equal to or greater than a prescribed width is detected in a prescribed range beneath the pedestrian head candidate that is extracted from the image acquired by the infrared camera, the pedestrian head candidate is determined as part of another vehicle. Consequently, the other vehicle and a pedestrian can be distinguished from each other highly accurately.
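The claimed decision rule can be sketched as follows. This is only an illustration of the stated conditions, not the patent's implementation; search_depth_px and min_width_px stand in for the unspecified "prescribed range" and "prescribed width", and a 0/1 binarized image is assumed.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Box:
    x: int   # left edge (pixels)
    y: int   # top edge (pixels, y grows downward)
    w: int
    h: int

    @property
    def area(self) -> int:
        return self.w * self.h

def is_part_of_other_vehicle(binary_img: np.ndarray, head: Box,
                             search_depth_px: int = 120,
                             min_width_px: int = 40) -> bool:
    """Treat the head candidate as part of another vehicle when a
    high-luminance area below it is larger in area than the head
    candidate and at least min_width_px wide."""
    y0 = head.y + head.h
    strip = binary_img[y0:y0 + search_depth_px, :]   # prescribed range below
    bright_cols = strip.any(axis=0)                  # columns with bright pixels
    run = best_run = 0
    for c in bright_cols:                            # longest horizontal run
        run = run + 1 if c else 0
        best_run = max(best_run, run)
    bright_area = int(strip.sum())                   # bright-pixel count (0/1 image)
    return bright_area > head.area and best_run >= min_width_px
```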
  • when the pedestrian head candidate belongs to an object that has been determined as a rigid body, the other vehicle determiner may determine the rigid body as the other vehicle.
  • the other vehicle candidate detector may further detect an engine exhaust pipe candidate or a tire candidate in the image, and when the other vehicle candidate detector detects the high-luminance area above the engine exhaust pipe candidate or the tire candidate, the other vehicle determiner may determine an object including the pedestrian head candidate and the engine exhaust pipe candidate or an object including the pedestrian head candidate and the tire candidate as the other vehicle.
  • when a high-luminance area or a low-luminance area of a prescribed area or greater is detected above the pedestrian head candidate, the other vehicle determiner may determine the pedestrian head candidate as part of the other vehicle regardless of whether or not the other vehicle candidate detector detects the high-luminance area which is greater in area than the pedestrian head candidate and has the horizontal length.
  • if the temperature outside of the vehicle is equal to or higher than a first temperature, the other vehicle determiner may judge whether or not there is a low-luminance area which is greater in area than the pedestrian head candidate, above the pedestrian head candidate; and if the temperature outside of the vehicle is equal to or lower than a second temperature which is lower than the first temperature, then the other vehicle determiner may judge whether or not there is a high-luminance area which is greater in area than the pedestrian head candidate, above the pedestrian head candidate. In this manner also, the other vehicle can be detected.
  • FIG. 1 shows in block form a vehicle periphery monitoring apparatus 10 according to an embodiment of the present invention
  • FIG. 2 shows in perspective a vehicle (hereinafter also referred to as "driver's own vehicle") 12 which incorporates the vehicle periphery monitoring apparatus 10 shown in FIG. 1 .
  • the vehicle periphery monitoring apparatus 10 includes an image processing unit 14 for controlling the vehicle periphery monitoring apparatus 10, a single (monocular) infrared camera 16 (image capturing device) connected to the image processing unit 14, a vehicle speed sensor 18 for detecting a vehicle speed Vs of the vehicle 12, a brake sensor 20 for detecting a depressed amount (brake depressed amount) Br of a brake pedal which is operated by the driver of the vehicle 12, a yaw rate sensor 22 for detecting a yaw rate Yr of the vehicle 12, a speaker 24 for outputting a warning sound or the like, and an image display unit 26 comprising a HUD (Head Up Display) 26a for displaying an image captured by the infrared camera 16 to make the driver of the vehicle 12 recognize an object (moving object, target object to be monitored) such as a pedestrian or the like that is likely to contact the vehicle 12.
  • the image display unit 26 is not limited to the HUD 26a, but may be a display unit for displaying a map, etc. of a navigation system incorporated in the vehicle 12 or a display unit (multi-information display unit) disposed in a meter unit for displaying fuel consumption information, etc.
  • the image processing unit 14 detects a target object to be monitored, such as a pedestrian or the like, in front of the vehicle 12, from an infrared image of the periphery of the vehicle 12 and signals indicative of a traveling state of the vehicle 12, i.e., signals representing the vehicle speed Vs, the brake depressed amount Br, and the yaw rate Yr.
  • if the image processing unit 14 decides that it is highly likely for the vehicle 12 to collide with the target object to be monitored, then the image processing unit 14 outputs a warning sound, e.g., a succession of blips, from the speaker 24, and highlights the target object to be monitored in a captured image displayed as a grayscale image on the HUD 26a, by surrounding the target object with a bright color frame such as a yellow or red frame, thereby arousing the attention of the driver.
  • the image processing unit 14 includes an input circuit comprising an A/D converting circuit for converting analog signals input thereto into digital signals, an image memory (storage unit 14m) for storing digital image signals, a CPU (Central Processing Unit) 14c for performing various processing operations, a storage unit 14m including a RAM (Random Access Memory) for storing data being processed by the CPU 14c and a ROM (Read Only Memory) for storing a program executed by the CPU 14c, tables, maps, and templates (pedestrian (human body) shape templates, vehicle shape templates, etc.), a clock (clock section) and a timer (time measuring section), and an output circuit for outputting a drive signal for the speaker 24 and a display signal for the image display unit 26.
  • Output signals from the infrared camera 16, the yaw rate sensor 22, the vehicle speed sensor 18, and the brake sensor 20 are converted by the A/D converting circuit into digital signals, which are then input to the CPU 14c.
  • the CPU 14c of the image processing unit 14 reads the supplied digital signals and executes the program while referring to the tables, the maps, and the templates, thereby functioning as various functioning means (also referred to as "functioning sections"), described below, to send the drive signal (e.g., sound signal, display signal) to the speaker 24 and the display signal to the image display unit 26.
  • the functions of these functioning means may alternatively be implemented by dedicated hardware.
  • the functioning sections of the image processing unit 14 include a pedestrian head candidate extractor 101, an other vehicle candidate detector 102, an other vehicle determiner 103 functioning as a target object determiner, a contact possibility determiner 106, and an attention seeking output generation determiner 108.
  • the pedestrian head candidate extractor 101 extracts a pedestrian head candidate from an image (captured image) acquired by the infrared camera 16
  • the pedestrian head candidate extractor 101 also extracts a pedestrian candidate including a head candidate.
  • the pedestrian head candidate extractor 101 also functions as a pedestrian candidate extractor.
  • the image processing unit 14 basically executes an object recognizing (distinguishing) program (object detecting program) for recognizing (distinguishing) an object by comparing an image captured by the infrared camera 16 with pattern templates representing human body shapes, animal shapes, vehicle shapes, and artificial structure shapes such as columns or the like including utility poles, which are stored in the storage unit 14m.
  • the infrared camera 16 is mounted in a front bumper of the vehicle 12 with an optical axis thereof extending parallel to the longitudinal axis of the vehicle 12.
  • the infrared camera 16 has such characteristics that its output signal (imaging signal) has a higher level (a higher luminance level) as the temperature of a target object imaged thereby is higher.
  • the HUD 26a is positioned to display its display screen on the front windshield of the vehicle 12 at a position where it does not obstruct the driver's forward field of vision.
  • the image processing unit 14 converts a video signal output from the infrared camera 16 into digital data at frame clock intervals of several tens of milliseconds, e.g., 1/30 second (about 33 ms) per frame, and stores the digital data in the storage unit 14m (image memory).
  • the image processing unit 14 includes the above functioning means to perform various processing operations on an image of an area in front of the vehicle 12 which is represented by the digital data stored in the storage unit 14m.
  • the pedestrian head candidate extractor 101 extracts an image portion of a target object to be monitored, such as a pedestrian, a vehicle (another vehicle), etc., from the image of the area in front of the vehicle 12 which is stored in the storage unit 14m, and extracts a pedestrian head candidate having a prescribed size based on the extracted image portion.
  • the other vehicle candidate detector 102 detects a high-luminance area, to be described later, having an area greater than the area of the pedestrian head candidate detected by the pedestrian head candidate extractor 101 and a horizontal length equal to or greater than a prescribed width, within a prescribed range below the pedestrian head candidate.
  • when such a high-luminance area is detected, the other vehicle determiner 103 determines the pedestrian head candidate as part of the other vehicle.
  • the attention seeking output generation determiner 108 calculates a rate of change Rate of the size of the image portion of the target object to be monitored between images that are captured at the above frame clock intervals/periods (prescribed time intervals), estimates a period of time T which the target object to be monitored takes to reach the vehicle 12 using the rate of change Rate, calculates the position of the target object to be monitored in the actual space, and calculates a motion vector in the actual space of the target object to be monitored.
  • strictly, the vehicle speed Vs should be replaced with a relative speed between the target object to be monitored and the vehicle 12; when the target object is at rest, the relative speed is equal to the vehicle speed Vs.
  • the attention seeking output generation determiner 108 calculates a positional change ⁇ x (horizontal) and a positional change ⁇ y (vertical) of the image portion of the target object to be monitored between the images that are captured at the prescribed time intervals, and determines a contact possibility that the target object to be monitored and the vehicle 12 will contact each other, based on the determined period of time TTC and the calculated positional changes (motion vector) ⁇ x, ⁇ y.
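A minimal sketch of the size-change principle behind the TTC estimate follows, assuming a pinhole camera and a constant closing speed, so that image width scales as 1/distance; the patent's exact expression for Rate is not reproduced here.

```python
def time_to_contact(width_prev_px: float, width_now_px: float,
                    frame_interval_s: float = 1.0 / 30.0) -> float:
    """With image width w proportional to 1/Z and constant closing speed:
        Rate = w_now / w_prev = Z_prev / Z_now = (TTC + dt) / TTC
    so  TTC  = dt / (Rate - 1)."""
    rate = width_now_px / width_prev_px
    if rate <= 1.0:
        return float("inf")   # the object is not approaching
    return frame_interval_s / (rate - 1.0)

# Example: the image widens from 40 px to 42 px over one 33 ms frame.
print(f"TTC ~ {time_to_contact(40, 42):.2f} s")   # ~0.67 s
```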
  • the vehicle periphery monitoring apparatus 10 is basically constructed as described above. An operation sequence of the vehicle periphery monitoring apparatus 10 will be described in detail below with reference to a flowchart shown in FIG. 3 .
  • in step S1 shown in FIG. 3, the image processing unit 14 judges whether the vehicle 12 is traveling or at rest based on, for example, the vehicle speed Vs detected by the vehicle speed sensor 18. If the vehicle 12 is at rest (S1: NO), then the operation sequence is stopped.
  • in step S2, the image processing unit 14 acquires an infrared image of an area within a given angle of view in front of the vehicle 12, which is represented by an output signal from the infrared camera 16 in each frame, converts the infrared image into a digital grayscale image, stores the digital grayscale image in the image memory (storage unit 14m), and binarizes the stored grayscale image.
  • the image processing unit 14 performs a binarizing process on the grayscale image by converting areas brighter than a luminance threshold value for determining a human luminance level into "1" (white) and areas darker than the luminance threshold value into "0" (black) to generate a binarized image in each frame, and stores the binarized image in the storage unit 14m.
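The binarizing process amounts to a single threshold operation on the grayscale frame; a minimal sketch follows, where the threshold value 190 is an illustrative assumption, the patent only speaking of a threshold "for determining a human luminance level".

```python
import numpy as np

def binarize(gray: np.ndarray, threshold: int = 190) -> np.ndarray:
    """Pixels brighter than the threshold become 1 (white), the rest
    0 (black); gray is assumed to be an 8-bit infrared frame."""
    return (gray > threshold).astype(np.uint8)
```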
  • in step S3, the image processing unit 14 detects (extracts), as shown in FIG. 4, a pedestrian candidate PCX comprising a head candidate 50, a body candidate 52 including arms, and two leg candidates 54 of a pedestrian Pa.
  • a substantially horizontal line interconnecting the lower ends of the two leg candidates 54 is regarded as a point of contact with a road surface 56.
  • the head candidate 50 can easily be extracted from the binarized image which corresponds to the grayscale image converted from the infrared image captured by the infrared camera 16.
  • the binarized image in each frame is stored in the storage unit 14m.
  • since the pedestrian candidate PCX is walking with its arms swinging and its legs moving up and down, its shape changes from frame to frame, as can be confirmed from the image in each frame.
  • the pedestrian candidate PCX is thus not detected as a rigid body, such as another vehicle, whose shape remains unchanged between images in respective frames.
  • in step S3, when the height of an object having a head candidate 50 above the road surface 56 is within a prescribed height range, the object is estimated as a pedestrian candidate PCX, and its image is stored as being labeled as run-length data, i.e., a labeling process is performed on its image.
  • the image thus processed is a large quadrangle-shaped image including a quadrangle circumscribing the pedestrian candidate PCX. If necessary, large quadrangle-shaped images including quadrangles circumscribing pedestrian candidates PCX are converted into images of one size in respective frames for easier image processing.
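The labeling and circumscribing-quadrangle step can be sketched with connected-component labeling, an equivalent formulation of the run-length labeling named above; the min_pixels noise floor is an assumption, and SciPy is used for brevity.

```python
import numpy as np
from scipy import ndimage

def circumscribing_quadrangles(binary_img: np.ndarray, min_pixels: int = 20):
    """Label connected high-luminance regions and return the quadrangle
    circumscribing each one as (x, y, w, h)."""
    labeled, num = ndimage.label(binary_img)
    boxes = []
    for idx, sl in enumerate(ndimage.find_objects(labeled), start=1):
        region = labeled[sl] == idx
        if region.sum() >= min_pixels:          # discard tiny noise blobs
            ys, xs = sl
            boxes.append((xs.start, ys.start,
                          xs.stop - xs.start, ys.stop - ys.start))
    return boxes
```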
  • the image of another vehicle Car shown in FIG. 5 is processed as follows: Lights 70a, 70b on laterally spaced left and right end portions of the other vehicle Car, such as headlights (oncoming car) or taillights (preceding car), a front grill (oncoming car) or an exhaust pipe 72 (preceding car) on a lower central portion of the other vehicle Car, and left and right tires 74a, 74b of the other vehicle Car are indicated as hatched regions because of their higher luminance level.
  • the road surface 56 can be detected based on a horizontal line interconnecting the lower ends of the tires 74a, 74b.
  • a quadrangular mask having a prescribed area and extending horizontally is applied to the image of the other vehicle Car and moved vertically above the lights 70a, 70b. For example, the mask has a horizontal width greater than the horizontal width of the other vehicle Car, generally covering the distance from the left end of the light 70a to the right end of the light 70b, and a vertical width slightly greater than the vertical width of the lights 70a, 70b. An area having a succession of identical pixel values within the grayscale image in the mask can be detected (extracted) as a roof (and a roof edge).
  • another quadrangular mask extending vertically, which, for example, has a horizontal width comparable to the horizontal width of the lights 70a, 70b and a vertical width 1 to 2 times the vertical width of the lights 70a, 70b, is applied laterally of the lights 70a, 70b; an area having a succession of identical pixel values within the grayscale image in the mask can be detected (extracted) as a pillar (and a pillar edge) or a fender (and a fender edge).
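Both searches rest on the same primitive: testing whether the grayscale values inside a quadrangular mask form a nearly uniform region. A minimal sketch follows; the tolerance and fill fraction are assumptions, since the text speaks only of "a succession of identical pixel values". Moving the mask upward row by row above the lights and returning the first position where the test succeeds would sketch the roof search.

```python
import numpy as np

def mask_is_uniform(gray: np.ndarray, x: int, y: int,
                    mask_w: int, mask_h: int,
                    tolerance: int = 4, min_fraction: float = 0.9) -> bool:
    """True when almost all pixels inside the mask share one grayscale
    value; gray is assumed to be an 8-bit image."""
    window = gray[y:y + mask_h, x:x + mask_w]
    if window.size == 0:
        return False
    dominant = np.bincount(window.ravel()).argmax()      # most common value
    close = np.abs(window.astype(int) - int(dominant)) <= tolerance
    return close.mean() >= min_fraction
```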
  • the other vehicle Car thus detected has its lights 70a, 70b whose vertical height from the road surface 56 is within a height range that could possibly be detected in error as a head 50. Therefore, the other vehicle Car is temporarily estimated as a pedestrian candidate PCY, and its image is stored as being labeled as run-length data, i.e., a labeling process is performed on its image in step S3.
  • the image thus processed is a large quadrangle-shaped image including a quadrangle circumscribing the pedestrian candidate PCY. If necessary, large quadrangle-shaped images including quadrangles circumscribing pedestrian candidates PCY are converted into images of one size in respective frames for easier image processing.
  • the processing of steps S2, S3 is carried out by the pedestrian head candidate extractor 101.
  • the pedestrian head candidate extractor 101 (pedestrian head candidate extracting means, pedestrian head candidate extracting step) thus extracts a pedestrian candidate PCX (see FIG. 4 ) including a head candidate 50 as a pedestrian head candidate, and a pedestrian candidate PCY (see FIG. 5 ) including lights 70a, 70b as a pedestrian head candidate.
  • in step S4, the other vehicle determiner 103, which also functions as a target object determiner, performs a target object determining process on the pedestrian candidate PCX ( FIG. 4 ) and the pedestrian candidate PCY ( FIG. 5 ) using the other vehicle candidate detector 102, i.e., judges whether each of the pedestrian candidate PCX ( FIG. 4 ) and the pedestrian candidate PCY ( FIG. 5 ) is a pedestrian Pa that is actually walking or another vehicle Car.
  • the other vehicle determiner 103 determines the pedestrian candidate PCY as another vehicle Car because the pedestrian candidate PCY is actually a rigid body whose image remains unchanged in shape but changes in size only with time and whose image includes long straight edges (roof and fender), etc.
  • the other vehicle determiner 103 determines the shape of the other vehicle Car, i.e., judges whether it is a rigid body or not, by converting the image thereof into a circumscribed quadrangle of one size and analyzing the converted images. Since the shape of the image of the other vehicle Car remains unchanged, e.g., the distance between the lights 70a, 70b remains unchanged and the distance between the tires 74a, 74b remains unchanged, the other vehicle determiner 103 determines the pedestrian candidate PCY as a rigid body, i.e., another vehicle Car.
  • the other vehicle determiner 103 which functions as a target object determiner determines the pedestrian candidate PCX shown in FIG. 4 as a pedestrian Pa when it detects that the horizontal width (width of the body 52) and height (height from the road surface 56 to the top of the head) of the pedestrian candidate PCX fall within a human range, and that the pedestrian candidate PCX is not a rigid body but an object which changes in shape.
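The rigidity test used in the preceding items can be sketched as a shape-constancy check on size-normalized silhouettes: a vehicle keeps its shape between frames, while a walking pedestrian's swinging arms and legs lower the overlap. The overlap threshold and normalized size below are assumptions, not values from the patent.

```python
import numpy as np

def shape_is_rigid(silhouette_t0: np.ndarray, silhouette_t1: np.ndarray,
                   size=(64, 64), min_overlap: float = 0.9) -> bool:
    """Rescale two binary silhouettes to one size and compare their
    intersection-over-union against an assumed threshold."""
    def resize_nn(img: np.ndarray) -> np.ndarray:
        # nearest-neighbour resize, avoiding an external dependency
        h, w = img.shape
        ys = (np.arange(size[0]) * h // size[0]).astype(int)
        xs = (np.arange(size[1]) * w // size[1]).astype(int)
        return img[np.ix_(ys, xs)]
    a = resize_nn(silhouette_t0) > 0
    b = resize_nn(silhouette_t1) > 0
    union = np.logical_or(a, b).sum()
    return bool(union) and np.logical_and(a, b).sum() / union >= min_overlap
```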
  • a large high-luminance area 76 indicated by the thick solid lines may occasionally be detected on a pedestrian candidate PCYa.
  • the high-luminance area 76 represents a combination (an area obtained by logical addition in binary images) of a heated area 78, indicated by the dot-and-dash lines, which represents a portion of the car body heated by heat-generating components such as the exhaust pipe 72 and the engine; a heated area representing the light 70b; a heated area representing the exhaust pipe 72; and a heated area representing the tire 74b.
  • the pedestrian candidate PCYa cannot be determined as another vehicle Car since it is difficult to detect the lights 70a, 70b as a pair of lights on account of the appearance of the high-luminance area 76.
  • when the pedestrian head candidate extractor 101 extracts the light 70a as a pedestrian head candidate in step S3, the other vehicle determiner 103 performs a process of determining the light 70a extracted as a pedestrian head candidate as part of the other vehicle Car if the other vehicle candidate detector 102 detects a high-luminance area 76 that is greater in area than the light 70a and has a horizontal length equal to or greater than a prescribed width, within a prescribed range (e.g., a range from the upper end of the lights 70a, 70b to the lower end of the tires 74a, 74b) beneath the horizontal position of the light 70a, in the target object determining process in step S4 which is performed by the other vehicle candidate detector 102 and the other vehicle determiner 103.
  • since the target object determining process is included, when the exhaust pipe 72, i.e., a high-luminance area which is greater in area than the light 70a or 70b extracted as a pedestrian head candidate and has a horizontal length equal to or greater than a prescribed width, is detected within a prescribed range beneath the horizontal position of the light 70a or 70b on the pedestrian candidate PCY shown in FIG. 5 , which is free of the heated area 78, the pedestrian candidate PCY can be determined as another vehicle Car.
  • the other vehicle determiner 103 determines the light 70a extracted as a pedestrian head candidate from an image acquired by the infrared camera 16, as part of the other vehicle Car if the other vehicle candidate detector 102 (other vehicle candidate detecting means, other vehicle candidate detecting step) detects a high-luminance area 76 (which may represent the exhaust pipe 72 only) that is greater in area than the light 70a and has a horizontal length equal to or greater than a prescribed width, within a prescribed range beneath the horizontal position of the light 70a. Consequently, the other vehicle Car and the pedestrian Pa can be distinguished from each other highly accurately.
  • the other vehicle determiner 103 may determine the pedestrian candidate PCY as another vehicle Car based on the positional relationship between the two lights 70a, 70b and the exhaust pipe 72 which represents a high-luminance area, or when a mask covering the two lights 70a, 70b is provided such that the mask extends toward the lower side of the two lights 70a, 70b, and a high-luminance area which is greater than the areas of the lights 70a, 70b by a prescribed area or more is detected.
  • the other vehicle determiner 103 may also determine the pedestrian candidate PCY as another vehicle Car provided that the exhaust pipe 72 which represents a high-luminance area has a horizontal width (lateral width) Hwb that is smaller than the horizontal width (lateral width) Hwa of a region interconnecting the lights 70a, 70b detected as pedestrian head candidates.
  • the other vehicle candidate detector 102 may determine the pedestrian candidate PCY as another vehicle Car based on the shape of a high-luminance area relevant to the end 73 of the exhaust pipe 72, e.g., a shape considered to be correlated to a reference pattern (a reference pattern for the exhaust pipe 72) extending substantially concentrically outwardly from the end 73.
  • when the other vehicle candidate detector 102 detects the lights 70a, 70b as pedestrian head candidates in the quadrangle circumscribing the pedestrian candidate PCY, which has been determined as a rigid body (an object which remains unchanged in shape with time) in the image shown in FIG. 5 , the other vehicle determiner 103 may determine the pedestrian candidate PCY determined as the rigid body, as another vehicle Car.
  • the other vehicle determiner 103 may determine an object including the lights 70a, 70b as the pedestrian head candidates, as another vehicle Car.
  • if the other vehicle candidate detector 102 detects the light 70a, and detects the exhaust pipe 72 as an engine exhaust pipe candidate or the tires 74a, 74b as tire candidates, in the image shown in FIG. 6 , then, when the other vehicle candidate detector 102 detects a high-luminance area 76 that is greater than a prescribed area of the light 70a and that includes a heated area 78 representing a portion of the car body heated by the exhaust pipe 72, above the exhaust pipe 72 or the tires 74a, 74b, the other vehicle determiner 103 may determine an object including the light 70a as the pedestrian head candidate and the high-luminance area 76 representing the exhaust pipe 72, as another vehicle Car.
  • similarly, when the other vehicle candidate detector 102 detects a high-luminance area 76 above the exhaust pipe 72 or the tires 74a, 74b, an object including the light 70a as the pedestrian head candidate and the tires 74a, 74b may be determined as another vehicle Car.
  • when the other vehicle candidate detector 102 detects a pedestrian candidate PCYb having another high-luminance area 92h of a prescribed area or greater (representing, for example, a windshield of another vehicle Car whose passenger compartment is warmed in a cold climate) or a low-luminance area 92l of a prescribed area or greater (representing, for example, a windshield of another vehicle Car whose passenger compartment is cooled in a warm climate), above the light 70a and/or 70b as a pedestrian head candidate in the image, the other vehicle determiner 103 may determine the light 70a and/or 70b as part of the other vehicle Car regardless of whether the other vehicle candidate detector 102 detects the horizontal high-luminance area 76, the heated area 78, or the exhaust pipe 72.
  • if the temperature outside the vehicle is equal to or higher than a preset first temperature at which the passenger compartment needs to be cooled, then it is judged from a grayscale image, for example, whether or not there is a low-luminance area 92l which is greater in area than the light 70a or 70b, above the light 70a and/or 70b; if there is such a low-luminance area 92l, it is determined that the light 70a and/or 70b is part of the other vehicle Car.
  • if the temperature outside the vehicle is equal to or lower than a preset second temperature (lower than the first temperature) at which the passenger compartment needs to be warmed, then it is judged from a grayscale image, for example, whether or not there is a high-luminance area 92h which is greater in area than the light 70a or 70b, above the light 70a and/or 70b; if there is such a high-luminance area 92h, it is determined that the light 70a and/or 70b is part of the other vehicle Car.
  • the temperature outside the vehicle can be detected based on the luminance of a grayscale image which corresponds to a temperature (prescribed temperature) of the head 50 which has been measured in advance.
  • the temperature outside the vehicle may be detected by a temperature sensor (ambient air temperature sensor), not shown.
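A sketch of this temperature-dependent windshield test follows; the 25 °C and 5 °C values are illustrative stand-ins for the patent's unspecified first and second temperatures.

```python
def windshield_indicates_vehicle(has_bright_area_above: bool,
                                 has_dark_area_above: bool,
                                 outside_temp_c: float,
                                 first_temp_c: float = 25.0,
                                 second_temp_c: float = 5.0) -> bool:
    """Warm weather: a cooled passenger compartment reads dark in the
    infrared image, so look for a low-luminance area above the head
    candidate. Cold weather: a heated compartment reads bright, so
    look for a high-luminance area."""
    if outside_temp_c >= first_temp_c:
        return has_dark_area_above
    if outside_temp_c <= second_temp_c:
        return has_bright_area_above
    return False
```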
  • FIGS. 8A and 8B show another vehicle Cara which is different in shape from the other vehicle Car shown in FIGS. 5 through 7 .
  • FIG. 8A shows a present image Ipr and FIG. 8B shows a past image Ips.
  • FIG. 9 shows a table 90 of the horizontal and vertical coordinates of rectangular frames (see FIGS. 8A and 8B ) in the present and past images Ipr, Ips, with their origin at the lower left end of each image.
  • the present and past images Ipr, Ips shown in FIGS. 8A and 8B include binarized high-luminance areas which are shown hatched.
  • the high-luminance areas include an area representing a light 70aa extracted as a pedestrian head candidate, which is the heat source to be processed, and a high-luminance area 76a disposed beneath the light 70aa, greater in area than the light 70aa, and having a horizontal length equal to or greater than a prescribed width.
  • only the light 70aa and the high-luminance area 76a have a luminance level "1" (white) brighter than the luminance threshold value for determining the human luminance level.
  • the following conditions 1, 2 are used as conditions for determining a vehicle.
  • Condition 1 concerns a polygon, i.e., a quadrangle 84 (84pr, 84ps), drawn in the present and past images Ipr, Ips.
  • Condition 2: present and past straight lines 86pr, 86ps are drawn between the light 70aa as a pedestrian head candidate and either one of the other masks, i.e., between the coordinates T (T1, T0) and the coordinates R (R1, R0) in FIGS. 8A and 8B , and the length Lpr of the present straight line 86pr and the length Lps of the past straight line 86ps are compared with each other.
  • if Lps ≧ Lpr, i.e., if the length Lps of the past straight line 86ps is equal to or greater than the length Lpr of the present straight line 86pr, then the other vehicle Cara is recognized as a preceding vehicle followed by the driver's own vehicle or a preceding vehicle moving apart from the driver's own vehicle (overtaking vehicle).
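Condition 2 thus reduces to a frame-to-frame comparison of one distance. A minimal sketch follows, with coordinates given as (x, y) pixel pairs; the class labels are descriptive names chosen here, not terms from the patent.

```python
import math

def classify_by_line_length(head_past, mask_past,
                            head_present, mask_present) -> str:
    """Compare the line joining the head candidate (the light) to another
    mask between the past and present frames: a constant or shrinking
    line suggests a preceding or receding vehicle, a growing line an
    approaching one."""
    lps = math.dist(head_past, mask_past)          # past length Lps
    lpr = math.dist(head_present, mask_present)    # present length Lpr
    return "preceding_or_receding" if lps >= lpr else "approaching"
```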
  • after step S4, the other vehicle determiner 103 judges in step S5 whether each of the pedestrian candidates PCY ( FIG. 5 ), PCYa ( FIG. 6 ), PCYb ( FIG. 7 ), and PCYc ( FIGS. 8A, 8B ) is a pedestrian Pa as shown in FIG. 4 . If the other vehicle determiner 103 decides that a pedestrian candidate is another vehicle Car, Cara and not a pedestrian Pa, then the candidate is excluded from the subsequent processing sequence (S5: NO).
  • if a pedestrian candidate is determined as a pedestrian Pa (S5: YES), the contact possibility determiner 106 determines a contact possibility that the pedestrian Pa and the driver's own vehicle 12 will contact each other in step S6.
  • the contact possibility determiner 106 determines a contact possibility in view of the period of time TTC according to the expression (1) and each motion vector of the pedestrian Pa (possibly also the distance Z), and also based on the brake depressed amount Br, the vehicle speed Vs, and the yaw rate Yr represented by the output signals respectively from the brake sensor 20, the vehicle speed sensor 18, and the yaw rate sensor 22. If the contact possibility determiner 106 decides that the driver's own vehicle 12 will possibly contact the pedestrian Pa (S6: YES), then the attention seeking output generation determiner 108 generates an attention seeking output signal, thereby arousing attention of the driver, e.g., providing the driver with information, in step S7.
  • the attention seeking output generation determiner 108 highlights the pedestrian Pa in the grayscale image on the HUD 26a with a surrounding frame in a bright color such as red or yellow, and produces a warning sound from the speaker 24, thereby arousing the attention of the driver of the vehicle 12.
  • the present invention is not limited to the above embodiment, but may adopt various arrangements based on the disclosure of the present description.
  • a vehicle periphery monitoring apparatus 10A may include a pair of left and right infrared cameras 16L, 16R which are incorporated in a vehicle 12A.
  • the infrared cameras 16L, 16R which are combined into a stereo camera system, are mounted in a front bumper of the vehicle 12A at respective positions that are substantially symmetric with respect to a transversely central portion of the vehicle 12A.
  • the cameras 16L, 16R have respective optical axes parallel to each other and are located at equal heights from the road surface.
  • the vehicle periphery monitoring apparatus which includes the left and right infrared cameras 16L, 16R handles a high-temperature area as a target object in left and right images of the periphery of the vehicle 12A captured by the infrared cameras 16R, 16L, calculates the distance up to the target object according to the principles of triangulation based on the parallax of the target object in the left and right images, detects an object that is likely to affect the traveling of the vehicle (driver's own vehicle) 12A, from the moving direction (motion vector) and position of the target object, and outputs an attention seeking output signal to seek attention of the driver of the vehicle 12A.
  • a pedestrian head candidate extractor (101) extracts a light (70a) as a pedestrian head candidate
  • an other vehicle candidate detector (102) detects a horizontal high-luminance area (76) which is greater in area than the light (70a) extracted as the pedestrian head candidate, within a prescribed range beneath the horizontal position of the light (70a)
  • an other vehicle determiner (103) determines the light (70a) extracted as the pedestrian head candidate as part of another car (Car).
EP13158347.8A 2012-03-12 2013-03-08 Vehicle periphery monitoring apparatus Withdrawn EP2639742A2 (de)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2012053962A JP5529910B2 (ja) 2012-03-12 2012-03-12 Vehicle periphery monitoring apparatus

Publications (1)

Publication Number Publication Date
EP2639742A2 true EP2639742A2 (de) 2013-09-18

Family

ID=47900687

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13158347.8A 2012-03-12 2013-03-08 Vehicle periphery monitoring apparatus Withdrawn EP2639742A2 (de)

Country Status (4)

Country Link
US (1) US20130235202A1 (de)
EP (1) EP2639742A2 (de)
JP (1) JP5529910B2 (de)
CN (1) CN103303235A (de)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003284057A (ja) 2002-01-18 2003-10-03 Honda Motor Co Ltd Vehicle periphery monitoring device
JP4521642B2 (ja) 2008-02-13 2010-08-11 Honda Motor Co., Ltd. Vehicle periphery monitoring device, vehicle, and vehicle periphery monitoring program


Also Published As

Publication number Publication date
JP5529910B2 (ja) 2014-06-25
JP2013186848A (ja) 2013-09-19
CN103303235A (zh) 2013-09-18
US20130235202A1 (en) 2013-09-12


Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20130308

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20161108