US10150415B2 - Method and apparatus for detecting a pedestrian by a vehicle during night driving - Google Patents


Info

Publication number
US10150415B2
Authority
US
United States
Prior art keywords
image
pedestrian
camera
area
pedestrian area
Prior art date
Legal status
Active, expires
Application number
US15/391,310
Other versions
US20170106798A1 (en)
Inventor
Young Chul Oh
Myung Seon Heo
Wan Jae Lee
Byung Yong YOU
Current Assignee
Hyundai Motor Co
Original Assignee
Hyundai Motor Co
Priority date
Filing date
Publication date
Application filed by Hyundai Motor Co
Priority to US15/391,310
Publication of US20170106798A1
Application granted
Publication of US10150415B2
Legal status: Active, expiration adjusted

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • B60K35/22
    • B60K35/28
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/30Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing vision in the non-visible spectrum, e.g. night or infrared vision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/04Mounting of cameras operative during drive; Arrangement of controls thereof relative to the vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • G06K9/00369
    • G06K9/00805
    • G06K9/4604
    • G06K9/6201
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/97Determining parameters from multiple pictures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • H04N23/11Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
    • H04N5/332
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/188Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
    • B60K2350/106
    • B60K2350/1084
    • B60K2350/2013
    • B60K2360/179
    • B60K2360/21
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/102Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using 360 degree surveillance camera system
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/105Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/106Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using night vision cameras
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/20Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
    • B60R2300/205Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used using a head-up display
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/303Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/307Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing virtually distinguishing relevant parts of a scene from the background of the scene
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/8033Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for pedestrian protection
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/8053Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for bad weather conditions or night vision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60YINDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
    • B60Y2400/00Special features of vehicle units
    • B60Y2400/92Driver displays


Abstract

A method and an apparatus for detecting a pedestrian by a vehicle during night driving are provided, in which the apparatus includes: a first camera configured to take a first image including color information of a vicinity of the vehicle during night driving; a second camera configured to take a second image including thermal distribution information of the vicinity of the vehicle; a pedestrian detector configured to detect a non-pedestrian area by using the color information from the first image and detect a pedestrian area by excluding the non-pedestrian area from the second image; and a display configured to match and display the pedestrian area on the second image.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a continuation of U.S. patent application Ser. No. 14/562,686, filed Dec. 6, 2014, which is based on and claims priority under 35 U.S.C. § 119(a) from Korean Patent Application No. 10-2014-0110943, filed on Aug. 25, 2014 in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference.
BACKGROUND
(a) Field of the Invention
The present invention relates to a method and apparatus for detecting a pedestrian by a vehicle during night driving, in which a color image and an infrared image are obtained, and a pedestrian area is detected by excluding a non-pedestrian area from the infrared image using information of the color image.
(b) Description of the Related Art
A vehicle vision system takes a color image of the vicinity of a vehicle through a camera mounted on the vehicle and displays it. For detecting an object (a vehicle, a pedestrian, etc.), the color image provides adequate performance during the daytime, but the performance for detecting objects such as pedestrians or animals drops significantly at nighttime.
In order to solve this problem, a vehicle night vision system has conventionally been proposed that uses a far-infrared camera, a near-infrared camera, or the like to take a thermal image based on the heat emitted by the human body. However, because the thermal image is captured as a black-and-white image, substantial information may be lost and thus erroneous detections increase.
SUMMARY
A method and an apparatus for detecting a pedestrian by a vehicle during night driving includes obtaining a color image and an infrared image, and detecting a pedestrian area by excluding a non-pedestrian area from the infrared image using information of the color image.
According to an embodiment of the present invention, an apparatus for detecting a pedestrian by a vehicle during night driving includes: a first camera configured to take a first image including color information of a vicinity of the vehicle during night driving; a second camera configured to take a second image including thermal distribution information of the vicinity of the vehicle; a pedestrian detector configured to detect a non-pedestrian area by using the color information from the first image and detect a pedestrian area by excluding the non-pedestrian area from the second image; and a display configured to match and display the pedestrian area on the second image.
Also, the second camera is implemented by one of an infrared camera, a far-infrared camera, and a near-infrared camera.
Also, the pedestrian detector includes an image matcher configured to match the first image and the second image, a non-pedestrian area detector configured to detect an area in which a color value is more than a reference value as the non-pedestrian area from the first image based on the color information of the first image, an attention area extractor configured to extract an attention area by excluding the non-pedestrian area from the second image, and a pedestrian area extractor configured to extract the pedestrian area from the attention area.
Also, the image matcher calculates a real coordinate of an object from a coordinate of the first image by using inside and outside parameters of the first camera and a real distance between a virtual starting point and the object.
Also, the image matcher calculates a corresponding coordinate of the second image corresponding to the coordinate of the first image by using the real coordinate of the object, inside and outside parameters of the second camera, and the real distance.
Also, the virtual starting point is the central point between the points representing the locations of the first and second cameras, at which a vertical line from the starting points of the first camera and the second camera meets the planes of the first image and the second image.
A method for detecting a pedestrian by a vehicle during night driving according to an embodiment of the present invention includes the steps of: taking a first image and a second image of a vicinity of the vehicle through a first camera and a second camera, respectively, during night driving; matching the first image and the second image; detecting a non-pedestrian area by using color information of the first image; excluding the non-pedestrian area from the second image;
detecting a pedestrian area from the second image excluding the non-pedestrian area; and matching and displaying the pedestrian area on the first image.
Also, the step of matching the first image and the second image includes steps of: calculating a real coordinate of an object from a coordinate of the first image by using inside and outside parameters of the first camera and a real distance between a virtual starting point and the object, and calculating a corresponding coordinate of the second image corresponding to a coordinate of the first image by using the real coordinate of the object, inside and outside parameters of the second camera and the real distance.
Also, the virtual starting point is the central point between the points at which a vertical line from the starting points of the first camera and the second camera meets the planes of the first image and the second image.
Also, the step of detecting a non-pedestrian area detects an area in which a color value is more than a reference value from the first image as the non-pedestrian area.
Since the present invention obtains a color image and an infrared image and detects a pedestrian area by excluding a non-pedestrian area from the infrared image using information of the color image, the calculation amount and the probability of an erroneous detection can be reduced and thus reliable information can be provided.
Further, the present invention can be combined with the existing front camera and thus the performance is improved without any additional cost.
Further, since the present invention excludes the non-pedestrian area from the image, the existing pedestrian detection algorithm can be applied and the calculation speed of a pedestrian detection can be improved.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other objects, features and advantages of the present invention will be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a block configuration diagram of an apparatus for detecting a pedestrian by a vehicle during night driving according to an embodiment of the present invention.
FIG. 2a is a diagram showing the geometric relation between two images obtained by two cameras related to the present invention.
FIG. 2b is a flow chart showing a process for calculating a corresponding coordinate between cameras of the image matcher shown in FIG. 1.
FIG. 3 is a flow chart showing a method for detecting a pedestrian by a vehicle during night driving according to an embodiment of the present invention.
FIG. 4a to FIG. 4d are exemplary diagrams showing the image processing result for each of steps of a night pedestrian detection process.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Hereinafter, the embodiments of the present invention will be described in detail with reference to the drawings.
It is understood that the term “vehicle” or “vehicular” or other similar term as used herein is inclusive of motor vehicles in general such as passenger automobiles including sports utility vehicles (SUV), buses, trucks, various commercial vehicles, watercraft including a variety of boats and ships, aircraft, and the like, and includes hybrid vehicles, electric vehicles, plug-in hybrid electric vehicles, hydrogen-powered vehicles and other alternative fuel vehicles (e.g. fuels derived from resources other than petroleum). As referred to herein, a hybrid vehicle is a vehicle that has two or more sources of power, for example both gasoline-powered and electric-powered vehicles.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
Further, the control logic of the present invention may be embodied as non-transitory computer readable media on a computer readable medium containing executable program instructions executed by a processor, controller or the like. Examples of computer readable media include, but are not limited to, ROM, RAM, compact disc (CD)-ROMs, magnetic tapes, floppy disks, flash drives, smart cards and optical data storage devices. The computer readable medium can also be distributed in network coupled computer systems so that the computer readable media is stored and executed in a distributed fashion, e.g., by a telematics server or a Controller Area Network (CAN).
FIG. 1 is a block configuration diagram of an apparatus for detecting a pedestrian by a vehicle during night driving according to an embodiment of the present invention, FIG. 2a is a diagram showing the geometric relation between two images obtained by two cameras related to the present invention, and FIG. 2b is a flow chart showing a process for calculating a corresponding coordinate between cameras of the image matcher shown in FIG. 1.
Referring to FIG. 1, an apparatus for detecting a pedestrian according to the present invention includes a first camera 10, a second camera 20, a pedestrian detector 30 and a display 40.
The first camera 10 takes a first image including color information of the vicinity of a vehicle during night driving. The first camera 10 can be implemented by a CCD (Charge Coupled Device) camera and the like.
The second camera 20 takes a second image including thermal distribution information of the vicinity of the vehicle. The second camera 20 can be implemented by an infrared camera, a far-infrared camera, a near-infrared camera, and the like.
The first camera 10 and the second camera 20 are mounted in pairs on at least one of the front, rear, and sides of the vehicle. The first camera 10 and the second camera 20 are arranged at two different points on the same plane (for example, the front). In particular, the first camera 10 and the second camera 20 obtain images of the same scene from different viewpoints.
The pedestrian detector 30 detects a non-pedestrian area from the first image and detects the pedestrian area by excluding the non-pedestrian area from the second image. This pedestrian detector 30 can be implemented by an image processor.
The pedestrian detector 30 includes an image matcher 31, a non-pedestrian area detector 32, an attention area extractor 33, and a pedestrian area extractor 34.
The image matcher 31 matches the first image and the second image by using viewpoint change technology. In other words, the image matcher 31 mutually matches the coordinates of the first image and the second image, which are obtained from different viewpoints.
The process by which the image matcher 31 calculates the corresponding coordinates between the first image and the second image will be described with reference to FIG. 2a and FIG. 2b.
Referring to FIG. 2a, one point P in three-dimensional space is projected onto an image coordinate p in the first image, and onto an image coordinate p′ in the second image. Also, the central point between the points at which a vertical line from the starting points of the two cameras meets the planes of the first image and the second image is taken as a virtual starting point, and the distance from the virtual starting point to the point P is taken as a real distance Z.
When the image coordinate p(x, y) of the first image is inputted, in order to find the corresponding coordinate p′ in the second image, the image matcher 31 first calculates the real coordinate (X, Y, Z) of the point P by using the inside and outside parameters of the first camera 10 and the real distance Z.
The image matcher 31 then calculates the image coordinate p′(u, v) of the second image corresponding to the image coordinate p(x, y) of the first image by using the real coordinate (X, Y, Z) of the point P, the inside and outside parameters of the second camera 20, and the real distance Z.
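The two-step mapping above can be sketched with a standard pinhole-camera model. This is a hedged illustration, not the patent's implementation: the function name and matrix conventions are assumptions, and the patent's "inside and outside parameters" are taken to correspond to the intrinsic matrix K and the extrinsic rotation/translation R, t of each camera.

```python
import numpy as np

def map_point_between_cameras(p, K1, R1, t1, K2, R2, t2, Z):
    """Map image point p=(x, y) of the first image to the corresponding
    point p'(u, v) of the second image, given the real distance Z.

    K*: 3x3 intrinsic ("inside") matrices; R*, t*: extrinsic
    ("outside") rotation and translation of each camera, with the
    convention X_cam = R @ X_world + t.
    """
    # Back-project p through camera 1 to get a viewing ray.
    ray = np.linalg.inv(K1) @ np.array([p[0], p[1], 1.0])
    # Scale the ray so its depth equals the known real distance Z.
    cam1_point = ray * (Z / ray[2])
    # Camera-1 coordinates -> real coordinate (X, Y, Z) of point P.
    world = R1.T @ (cam1_point - t1)
    # Real coordinate -> camera-2 coordinates -> image coordinate p'.
    cam2_point = R2 @ world + t2
    proj = K2 @ cam2_point
    return proj[:2] / proj[2]
```

With two identical, ideally placed cameras the mapping returns the input point unchanged; a translation between the cameras shifts the projected coordinate by the translation scaled by the depth.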
The non-pedestrian area detector 32 detects the non-pedestrian area from the first image by using the color information (hue information) of the first image. At this time, the non-pedestrian area detector 32 detects an area in which the color value is more than the reference value from the first image as the non-pedestrian area.
The attention area extractor 33 excludes the non-pedestrian area detected by the non-pedestrian area detector 32 from the second image. In particular, the second image excluding the non-pedestrian area becomes the attention area, within which the pedestrian area can be detected.
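Taken together, the non-pedestrian area detection and the attention area extraction can be sketched as follows. This is a minimal sketch under assumptions: it presumes the color and thermal images are already matched pixel-for-pixel by the image matcher, and the array names and the threshold value are illustrative, not from the patent.

```python
import numpy as np

def extract_attention_area(color_value, thermal, reference=200):
    """Return the attention area: the thermal (second) image with the
    non-pedestrian area zeroed out.

    color_value: HxW array of color values from the first image,
    matched pixel-for-pixel to the thermal image. Bright regions
    (e.g. street lamps, headlights) exceed the reference value and
    are treated as non-pedestrian areas.
    """
    non_pedestrian = color_value > reference   # step S13: threshold color image
    attention = thermal.copy()
    attention[non_pedestrian] = 0              # step S14: delete those regions
    return attention, non_pedestrian
```

Because the non-pedestrian mask is computed once on the color image and applied as a boolean index, the later pedestrian search never revisits the excluded pixels, which is how the scheme reduces both computation and false detections.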
The pedestrian area extractor 34 extracts the pedestrian area from the second image excluding the non-pedestrian area. At this time, the pedestrian area extractor 34 extracts the pedestrian area from the second image by using a feature detection and learning algorithm (a pedestrian detection algorithm).
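The patent names only "a feature detection and learning algorithm" for this stage; real systems typically score sliding windows with a trained classifier (e.g. HOG features with an SVM). The toy stand-in below illustrates only the window-scanning structure over the attention area, with a mean-heat threshold in place of a learned classifier; the function name, window size, and threshold are all assumptions.

```python
import numpy as np

def extract_pedestrian_areas(attention, win=(4, 2), heat_threshold=0.5):
    """Toy stand-in for the pedestrian area extractor: slide a
    pedestrian-shaped window over the attention area and keep windows
    whose mean heat exceeds a threshold. A real implementation would
    replace the mean test with a trained pedestrian classifier.
    """
    h, w = win
    boxes = []
    for y in range(attention.shape[0] - h + 1):
        for x in range(attention.shape[1] - w + 1):
            if attention[y:y + h, x:x + w].mean() > heat_threshold:
                boxes.append((x, y, w, h))   # (left, top, width, height)
    return boxes
```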
The display 40 matches and displays the pedestrian area extracted by the pedestrian area extractor 34 on the second image. The display 40 can be implemented by an LCD (Liquid Crystal Display), an LED (Light Emitting Diode) display, a HUD (Head-Up Display), a transparent display, and the like.
FIG. 3 is a flow chart showing a method for detecting a pedestrian by a vehicle during night driving according to an embodiment of the present invention, and FIG. 4a to FIG. 4d are exemplary diagrams showing the image processing result for each of steps of a night pedestrian detection process.
First, the pedestrian detector 30 of the pedestrian detection apparatus obtains the first image and the second image of the vicinity (for example, front, rear, or side) of a vehicle through the first camera 10 and the second camera 20 during night driving (S11). At this time, the first image and the second image are images taken from different viewpoints; the first image (the color image) includes the color information for the vicinity of the vehicle, and the second image includes the thermal distribution information for the vicinity of the vehicle.
The image matcher 31 of the pedestrian detector 30 matches the image coordinate of the second image corresponding to the image coordinate of the first image (S12). In other words, the image matcher 31 calculates the real coordinate of the object from the coordinate of the first image by using the inside and outside parameters of the first camera 10 and the real distance between the virtual starting point and the object, and calculates the corresponding coordinate of the second image corresponding to the coordinate of the first image by using the real coordinate of the object, the inside and outside parameters of the second camera 20, and the real distance.
The non-pedestrian area detector 32 detects the non-pedestrian area from the first image by using the color information of the first image (S13). At this time, the non-pedestrian area detector 32 detects an area in which the color value is more than the reference value as the non-pedestrian area in the first image. In particular, the non-pedestrian area detector 32 detects an area that is brighter than its vicinity by more than the reference as the non-pedestrian area, as shown in FIG. 4a.
The attention area extractor 33 excludes the non-pedestrian area from the second image (S14). For example, the attention area extractor 33 detects the area of the far-infrared image corresponding to the non-pedestrian area detected from the color image and deletes that area, as shown in FIG. 4b.
The pedestrian area extractor 34 detects the pedestrian area from the second image excluding the non-pedestrian area (S15). Here, the pedestrian area extractor 34 detects the pedestrian area in the far-infrared image excluding the non-pedestrian area by using a feature detection and learning algorithm (a pedestrian detection algorithm), as shown in FIG. 4c.
The display 40 matches and displays the pedestrian area detected by the pedestrian detector 30 on the first image (S16). For example, the display 40 matches and displays the pedestrian area detected from the far-infrared image on the color image, as shown in FIG. 4d. Although the present invention has been described through specific exemplary embodiments hereinabove, it may be variously modified without departing from the scope of the present invention. Accordingly, the scope of the present invention is not to be construed as being limited to the above-mentioned exemplary embodiments, but is to be defined by the following claims and equivalents thereto. In view of the foregoing, the present invention is to be considered to include modifications and alterations thereof as long as such modifications and alterations are within the scope of the following claims and equivalents thereto.
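The display step (S16) reduces to drawing the detected boxes, mapped back into color-image coordinates, on a copy of the color frame. A minimal sketch using one-pixel rectangle outlines:

```python
import numpy as np

def overlay_boxes(color_img, boxes, color=(0, 255, 0)):
    """Draw a 1-px rectangle outline for each (x, y, w, h) box
    on a copy of the color image and return it."""
    out = color_img.copy()
    for x, y, w, h in boxes:
        out[y, x:x + w] = color              # top edge
        out[y + h - 1, x:x + w] = color      # bottom edge
        out[y:y + h, x] = color              # left edge
        out[y:y + h, x + w - 1] = color      # right edge
    return out
```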

Claims (10)

What is claimed is:
1. An apparatus for detecting a pedestrian by a vehicle during night driving, comprising:
a first camera configured to take a first image including color information of a vicinity of the vehicle during night driving;
a second camera configured to take a second image including thermal distribution information of the vicinity of the vehicle;
a pedestrian detector configured to detect a non-pedestrian area by using the color information from the first image and detect a pedestrian area by excluding the non-pedestrian area from the second image; and
a display configured to match and display the pedestrian area on the second image,
wherein the pedestrian detector comprises:
an image matcher configured to match the first image and the second image by matching corresponding coordinates of the first image and the second image;
a non-pedestrian area detector configured to detect an area in which a color value is more than a reference value as the non-pedestrian area from the first image based on the color information of the first image; and
an attention area extractor configured to extract an attention area by excluding the non-pedestrian area from the second image.
2. The apparatus of claim 1, wherein the pedestrian detector further comprises a pedestrian area extractor configured to extract the pedestrian area from the attention area.
3. The apparatus of claim 1, wherein the first camera includes one or more of an infrared camera, a far-infrared camera, and a near-infrared camera.
4. The apparatus of claim 1, wherein the image matcher calculates a real coordinate of an object from a coordinate of the first image by using inside and outside parameters of the first camera and a real distance between a virtual starting point and the object.
5. The apparatus of claim 4, wherein the virtual starting point is a central point between points representing locations of the first and second cameras, in which a vertical line from the points of the first camera and the second camera and planes of the first image and the second image meet.
6. The apparatus of claim 2, wherein the image matcher calculates a corresponding coordinate of the second image corresponding to the coordinate of the first image by using the real coordinate of the object, inside and outside parameters of the second camera, and the real distance.
7. A method for detecting a pedestrian by a vehicle during night driving, comprising:
taking a first image and a second image of a vicinity of the vehicle through a first camera and a second camera, respectively, during night driving;
matching the first image and the second image by matching corresponding coordinates of the first image and the second image;
detecting a non-pedestrian area by using color information of the first image;
excluding the non-pedestrian area from the second image;
detecting a pedestrian area from the second image excluding the non-pedestrian area; and
matching and displaying the pedestrian area on the first image,
wherein the detecting the non-pedestrian area includes detecting an area in which a color value is more than a reference value as the non-pedestrian area from the first image.
8. The method according to claim 7, wherein the matching the first image and the second image comprises calculating a real coordinate of an object from a coordinate of the first image by using inside and outside parameters of the first camera and a real distance between a virtual starting point and the object.
9. The method according to claim 8, wherein the matching the first image and the second image further comprises calculating a corresponding coordinate of the second image corresponding to the coordinate of the first image by using a real coordinate of the object, inside and outside parameters of the second camera, and the real distance.
10. The method according to claim 8, wherein the virtual starting point is a central point between points representing locations of the first and second cameras, in which a vertical line from starting points of the first camera and the second camera and planes of the first image and the second image meet.
US15/391,310 2014-08-25 2016-12-27 Method and apparatus for detecting a pedestrian by a vehicle during night driving Active 2035-01-14 US10150415B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/391,310 US10150415B2 (en) 2014-08-25 2016-12-27 Method and apparatus for detecting a pedestrian by a vehicle during night driving

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR1020140110943A KR101601475B1 (en) 2014-08-25 2014-08-25 Pedestrian detection device and method for driving vehicle at night
KR10-2014-0110943 2014-08-25
US14/562,686 US10391937B2 (en) 2014-08-25 2014-12-06 Method and apparatus for detecting a pedestrian by a vehicle during night driving
US15/391,310 US10150415B2 (en) 2014-08-25 2016-12-27 Method and apparatus for detecting a pedestrian by a vehicle during night driving

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/562,686 Continuation US10391937B2 (en) 2014-08-25 2014-12-06 Method and apparatus for detecting a pedestrian by a vehicle during night driving

Publications (2)

Publication Number Publication Date
US20170106798A1 US20170106798A1 (en) 2017-04-20
US10150415B2 true US10150415B2 (en) 2018-12-11

Family

ID=55352139

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/562,686 Active 2035-08-25 US10391937B2 (en) 2014-08-25 2014-12-06 Method and apparatus for detecting a pedestrian by a vehicle during night driving
US15/391,310 Active 2035-01-14 US10150415B2 (en) 2014-08-25 2016-12-27 Method and apparatus for detecting a pedestrian by a vehicle during night driving

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/562,686 Active 2035-08-25 US10391937B2 (en) 2014-08-25 2014-12-06 Method and apparatus for detecting a pedestrian by a vehicle during night driving

Country Status (3)

Country Link
US (2) US10391937B2 (en)
KR (1) KR101601475B1 (en)
CN (1) CN105787917B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9971958B2 (en) * 2016-06-01 2018-05-15 Mitsubishi Electric Research Laboratories, Inc. Method and system for generating multimodal digital images
US10552706B2 (en) * 2016-10-24 2020-02-04 Fujitsu Ten Limited Attachable matter detection apparatus and attachable matter detection method
DE102017216016A1 (en) * 2017-09-12 2019-03-14 Robert Bosch Gmbh Method and device for detecting an environment of a vehicle
KR102614494B1 (en) * 2019-02-01 2023-12-15 엘지전자 주식회사 Non-identical camera based image processing device
WO2021260598A1 (en) * 2020-06-23 2021-12-30 Immervision Inc. Infrared wide-angle camera
US11924527B2 (en) * 2022-05-10 2024-03-05 GM Global Technology Operations LLC Optical sensor activation and fusion

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6327536B1 (en) * 1999-06-23 2001-12-04 Honda Giken Kogyo Kabushiki Kaisha Vehicle environment monitoring system
JP2004254044A (en) 2003-02-19 2004-09-09 Denso Corp Driving supporting method, display system and notification system
JP3857698B2 (en) 2004-04-05 2006-12-13 株式会社日立製作所 Driving environment recognition device
JP2007087337A (en) 2005-09-26 2007-04-05 Toyota Motor Corp Vehicle peripheral information display device
JP2007139564A (en) 2005-11-17 2007-06-07 Sumitomo Electric Ind Ltd Obstacle detection system, obstacle detection method and computer program
JP2008230358A (en) 2007-03-19 2008-10-02 Honda Motor Co Ltd Display device
JP2009301494A (en) 2008-06-17 2009-12-24 Sumitomo Electric Ind Ltd Image processing unit and image processing method
JP2010020557A (en) 2008-07-10 2010-01-28 Sumitomo Electric Ind Ltd Image processor and image processing method
JP4611919B2 (en) 2006-03-23 2011-01-12 本田技研工業株式会社 Pedestrian recognition device
JP2011087006A (en) 2009-10-13 2011-04-28 Denso Corp Display device for vehicle
JP2012164026A (en) 2011-02-03 2012-08-30 Nippon Soken Inc Image recognition device and display device for vehicle
JP2013247514A (en) 2012-05-25 2013-12-09 Canon Inc Image processor, image processing method and program
JP2014067198A (en) 2012-09-26 2014-04-17 Hitachi Automotive Systems Ltd Moving object recognition device

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6496594B1 (en) * 1998-10-22 2002-12-17 Francine J. Prokoski Method and apparatus for aligning and comparing images of the face and body from different imagers
WO2002005013A2 (en) * 2000-07-10 2002-01-17 Ophir Optronics Ltd. Impaired vision assist system and method
DE102005006290A1 (en) * 2005-02-11 2006-08-24 Bayerische Motoren Werke Ag Method and device for visualizing the surroundings of a vehicle by fusion of an infrared and a visual image
JP4263737B2 (en) * 2006-11-09 2009-05-13 トヨタ自動車株式会社 Pedestrian detection device
US7961906B2 (en) * 2007-01-03 2011-06-14 Science Applications International Corporation Human detection with imaging sensors
CN101226597B (en) * 2007-01-18 2010-04-14 中国科学院自动化研究所 Method and system for recognizing nights pedestrian based on thermal infrared gait
JP2008230348A (en) 2007-03-19 2008-10-02 Toyota Motor Corp Mounting structure of base plate for window regulator
JP4315991B2 (en) * 2007-04-20 2009-08-19 本田技研工業株式会社 Vehicle periphery monitoring device, vehicle periphery monitoring method, vehicle periphery monitoring program
US8546100B2 (en) * 2007-10-03 2013-10-01 3M Innovative Properties Company Microorganism concentration process and agent
KR20100006700A (en) * 2008-07-10 2010-01-21 현대자동차주식회사 Night-vision system for a vehicle
CN101408984B (en) * 2008-10-07 2010-09-29 西北工业大学 Method for detecting synergic movement target
CN101477631B (en) * 2009-01-20 2011-01-19 深圳先进技术研究院 Method, equipment for extracting target from image and human-machine interaction system
CN102542242B (en) * 2010-12-27 2017-08-08 北京北科慧识科技股份有限公司 The biological characteristic area positioning method and device of contactless collection image
EP2719168A2 (en) * 2011-06-10 2014-04-16 Flir Systems, Inc. Systems and methods for intelligent monitoring of thoroughfares using thermal imaging
CN202077119U (en) * 2011-06-29 2011-12-14 广东好帮手电子科技股份有限公司 Vehicle-mounted infrared night vision system with pedestrian identification function
CN102435174B (en) * 2011-11-01 2013-04-10 清华大学 Method and device for detecting barrier based on hybrid binocular vision
KR101247497B1 (en) * 2012-02-29 2013-03-27 주식회사 슈프리마 Apparatus and method for recongnizing face based on environment adaption
JP5991224B2 (en) * 2013-02-15 2016-09-14 オムロン株式会社 Image processing apparatus, image processing method, and image processing program
JP5991648B2 (en) * 2013-03-28 2016-09-14 株式会社デンソー Display control device for vehicle
CN103390281B (en) * 2013-07-29 2016-04-13 西安科技大学 A kind of two spectrum night vision device onboard systems and two spectrum Combined design method
CN103778618A (en) * 2013-11-04 2014-05-07 国家电网公司 Method for fusing visible image and infrared image


Also Published As

Publication number Publication date
US20160052452A1 (en) 2016-02-25
US20170232894A9 (en) 2017-08-17
US20170106798A1 (en) 2017-04-20
CN105787917B (en) 2021-08-06
KR101601475B1 (en) 2016-03-21
US10391937B2 (en) 2019-08-27
CN105787917A (en) 2016-07-20
KR20160024297A (en) 2016-03-04

Similar Documents

Publication Publication Date Title
US10150415B2 (en) Method and apparatus for detecting a pedestrian by a vehicle during night driving
US20180136332A1 (en) Method and system to annotate objects and determine distances to objects in an image
KR102240197B1 (en) Tracking objects in bowl-shaped imaging systems
US9224055B2 (en) Exterior environment recognition device
US10866636B2 (en) Virtual touch recognition apparatus and method for correcting recognition error thereof
US20130286205A1 (en) Approaching object detection device and method for detecting approaching objects
US9183449B2 (en) Apparatus and method for detecting obstacle
US9810787B2 (en) Apparatus and method for recognizing obstacle using laser scanner
US9606623B2 (en) Gaze detecting apparatus and method
JP2007508624A (en) Moving object detection using computer vision adaptable to low illumination depth
US20140152697A1 (en) Method and apparatus for providing augmented reality
TWI595450B (en) Object detection system
TWI744610B (en) Scene reconstructing system, scene reconstructing method and non-transitory computer-readable medium
US20150293585A1 (en) System and method for controlling heads up display for vehicle
EP3690393B1 (en) Information processing device, information processing method, control device, and image processing device
US11250279B2 (en) Generative adversarial network models for small roadway object detection
CN111369617A (en) 3D target detection method of monocular view based on convolutional neural network
US20140294241A1 (en) Vehicle having gesture detection system and method
US20190026947A1 (en) Vehicle backup safety mapping
WO2018222122A1 (en) Methods for perspective correction, computer program products and systems
CN113165653A (en) Following vehicle
US11580695B2 (en) Method for a sensor-based and memory-based representation of a surroundings, display device and vehicle having the display device
Hosseini et al. A system design for automotive augmented reality using stereo night vision
US11377027B2 (en) Image processing apparatus, imaging apparatus, driving assistance apparatus, mobile body, and image processing method
US11100702B2 (en) 3D image labeling method based on labeling information of 2D image and 3D image labeling device

Legal Events

Date Code Title Description
STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4