JP2007076378A - Lighting apparatus and method for vehicle - Google Patents


Info

Publication number
JP2007076378A
Authority
JP
Japan
Prior art keywords
area
region
moving object
temperature
extracting
Prior art date
Legal status
Pending
Application number
JP2005262690A
Other languages
Japanese (ja)
Inventor
Yohei Aragaki
Tomoko Shimomura
Original Assignee
Nissan Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Nissan Motor Co Ltd
Priority to JP2005262690A
Publication of JP2007076378A


Abstract

PROBLEM TO BE SOLVED: To provide a vehicle lighting device that illuminates the direction of an area detected as one in which a pedestrian may exist or appear.
SOLUTION: The vehicle lighting device includes an infrared camera 11 that acquires a thermal image of the vehicle's surroundings; a feature amount extraction part 131 that extracts feature amounts in the image from the acquired thermal image; a feature amount distribution calculation part 132 that obtains the distribution of the extracted feature amounts; and a moving object presence possibility area extraction part 133 that extracts, from the obtained feature amount distribution, an area where a moving object such as a pedestrian is highly likely to exist or appear. A light irradiation part then directs light toward the area where the moving object may exist or appear.
COPYRIGHT: (C)2007, JPO&INPIT

Description

  The present invention relates to a lighting device and a lighting method for a vehicle.

  Conventionally, as an apparatus that changes the direction of the light emitted by a vehicle headlamp, a technique is known in which, for example, road width information is acquired and the illumination range is controlled by changing the headlamp irradiation direction or irradiation width (light distribution pattern) based on the acquired road width, vehicle steering information, and the like (see Patent Document 1).

  As another conventional technique for changing the irradiation direction of the vehicle headlamp, there is a technique that determines the region in which the vehicle is travelling (highway, general road, or urban area) and controls the irradiation area and glare amount accordingly (see Patent Document 2).

In addition, as a technique for notifying the driver of the presence of pedestrians and the like while the vehicle is running, a technique is known in which pedestrians are detected using infrared images, the result is displayed on a head-up display, and a warning sound is emitted (see Patent Document 3).
Patent Document 1: JP-A-8-207656; Patent Document 2: JP 2002-254980 A; Patent Document 3: JP 2004-303219 A

  However, all of these technologies that change the irradiation direction and irradiation range of the headlamps merely adapt the direction and range of illumination to the road surface conditions and the surrounding environment; it was difficult to use them to call the driver's attention to pedestrians and the like.

  On the other hand, with the technique of detecting a pedestrian from an infrared image and notifying the driver, the driver can be alerted but cannot confirm the detection result in the real field of view. The driver therefore has to move the line of sight, check the output on the head-up display, and then assess the situation in the real field of view, which places a heavy burden on the driver. In addition, since a result is output only when a pedestrian is actually detected, there is no warning sound or head-up display output until just before the pedestrian approaches the vehicle, and when a pedestrian is hidden behind a structure on the road, the technique cannot cope at all.

  In order to solve the above-described problems, a vehicle lighting device according to the present invention comprises: thermal image acquisition means for acquiring a thermal image around the vehicle; feature amount extraction means for extracting a feature amount in the image from the thermal image acquired by the thermal image acquisition means; feature amount distribution calculation means for obtaining a distribution of the feature amount extracted by the feature amount extraction means; moving object existence possibility area extraction means for extracting, from the distribution of the feature amount obtained by the feature amount distribution calculation means, an area where a moving object is highly likely to exist or appear; and light irradiation means for irradiating light toward the area where the moving object extracted by the moving object existence possibility area extraction means is likely to exist or appear.

  In order to solve the above problems, the gist of the lighting method of the present invention is that it includes a step of acquiring a thermal image around a vehicle, a step of extracting a feature amount in the image from the thermal image, a step of obtaining a distribution of the feature amount, a step of extracting a region where a moving object is likely to exist or appear from the distribution of the feature amount, and a step of irradiating light toward the region where the moving object is likely to exist or appear.

  According to the present invention, a region where a moving object is highly likely to exist, or to appear, is detected and light is irradiated toward that region. The driver can therefore be shown, within the field of view, the area where an object that may become an obstacle to driving could appear, which improves the usefulness of the night-time field of view.

  Embodiments of a vehicle lighting device and a lighting method according to the present invention will be described below with reference to the drawings.

  FIG. 1 is a block diagram illustrating the configuration of the vehicular illumination apparatus according to the first embodiment. The vehicular lighting device 1 according to the present embodiment includes an infrared camera 11 (thermal image acquisition means) that acquires a thermal image, a storage unit 12 that stores the acquired image and calculation results, a calculation unit 13 that extracts from the image regions where a moving object may appear, and a light irradiation unit 14 (light irradiation means) that irradiates the target region with light from a headlamp (light).

  The infrared camera 11 acquires a thermal image having thermal data around the host vehicle (at least in a range in front of the host vehicle). The acquired thermal image is transferred to and stored in the image memory 121 in the storage unit 12.

  The calculation unit 13 performs a calculation based on the information stored in the image memory 121 and executes a process for extracting a region where a moving object may appear in front of the host vehicle.

  FIGS. 2A and 2B are explanatory views showing the installation state of the infrared camera 11: FIG. 2A shows the installation as viewed from the side of the vehicle, and FIG. 2B as viewed from above the vehicle.

  As shown in the figure, the infrared camera 11 is installed in the grille of the vehicle and arranged so that an image in front of the vehicle can be acquired in a format that includes thermal data. The installation position and shooting direction of the infrared camera 11 are therefore not limited to those shown in FIG. 2; any other installation position and shooting direction may be used as long as an image in front of the vehicle can be acquired in a format that includes thermal data.

  Here, an example of an image acquired by the infrared camera 11 will be described. FIG. 5 is an explanatory diagram illustrating an example of an image acquired by the infrared camera 11.

  As shown in the figure, the infrared camera 11 according to the present embodiment images the area ahead of the vehicle and acquires, for example, a thermal image containing a person G1, a utility pole G2, a preceding vehicle G3, a vending machine G4, a street tree G5, the road, and so on. This thermal image is composed of a plurality of pixels, each of which contains coordinate data and thermal data.

  Returning to FIG. 1: in addition to the image memory 121 that stores images as described above, the storage unit 12 includes a calculation memory unit 122 that stores calculation results and serves as a work area.

  In addition, the calculation unit 13 includes a feature amount extraction unit 131 (feature amount extraction means), a feature amount distribution calculation unit 132 (feature amount distribution calculation means), and a moving object existence possibility region extraction unit 133 (moving object existence possibility region extraction means).

  The functions of the calculation unit 13 and the storage unit 12 configured as described above are realized by a so-called computer; the computer executes a program implementing the procedure of the illumination method according to the present invention, and thereby achieves the function of each unit. The number of computers used to achieve these functions need not be one; for example, the processing may be shared among a plurality of microcomputers.

  The image memory 121 stores thermal image data acquired from the infrared camera 11 in time series. That is, the image memory 121 stores the thermal image as shown in FIG. 5 including the coordinate data and thermal data of each pixel.

  The feature amount extraction unit 131 extracts, as feature points, pixels whose luminance differs greatly from that of the surrounding pixels in the thermal data of the thermal image acquired by the infrared camera 11, and also extracts regions belonging to a predetermined temperature zone.

  The feature amount distribution calculation unit 132 calculates the distribution of the feature points extracted by the feature amount extraction unit 131 from one or more of their number, density, intensity, and mutual distance, and also calculates the temperature distribution and area inside each region belonging to the predetermined temperature zone obtained by the feature amount extraction unit 131.

  The moving object existence possibility area extracting unit 133 extracts an area where the moving object is likely to appear from the feature point distribution state calculated by the feature amount distribution calculating unit 132.

  In addition, the moving object existence possibility area extraction unit 133 also extracts a road surface area from the thermal image. Therefore, here, the moving object existence possibility area extraction unit 133 also has a function as a road surface area extraction unit.

  The light irradiation unit 14 irradiates light toward the region extracted by the moving object existence possibility region extraction unit 133. For example, a movable light known as AFS (Adaptive Frontlighting System) is used. The AFS moves its optical axis with a motor, which makes it possible, for example, to change the light irradiation direction according to the steering angle. Here, this function is used to change the AFS irradiation direction so that the region extracted by the moving object existence possibility region extraction unit 133 is illuminated.

  The AFS is usually installed near the headlamps of the vehicle, but a light whose irradiation direction can be changed may instead be attached to the front grille, an upper front position, or the like, as long as it can illuminate the area ahead of the vehicle.

Next, the operation of the vehicular illumination device 1 configured as described above will be described. FIG. 3 is a flowchart showing the procedure of the illumination processing performed by the vehicular illumination device. FIGS. 5 to 12 are explanatory diagrams used to explain feature amount extraction.

  First, the calculation unit 13 instructs the infrared camera 11 to capture an image, the infrared camera 11 acquires a thermal image of a scene such as that shown in FIG. 5, and the image is stored in the image memory 121 (step S1). FIG. 5 shows the various objects G1 to G5 described above.

  Next, the computing unit 13 extracts feature points from the thermal image once stored in the image memory 121 (step S2).

  The feature point extraction is performed by the feature amount extraction unit 131. As feature points, for example, the corners of object regions in the image are extracted. For this purpose, a so-called corner filter commonly used in image processing can be applied; specifically, for example, the method disclosed in Jianbo Shi and Carlo Tomasi, "Good Features to Track", IEEE Conference on Computer Vision and Pattern Recognition (CVPR '94), pp. 593-600, can be used.

  This document describes a technique for extracting pixels that are suitable for tracking from an image; what is extracted in this embodiment is the corners of imaged object regions, so feature points can be extracted with a method such as the one proposed in that document. Of course, other corner detection algorithms may be used. Although corners have been described here as an example of feature points, parts of edges obtained from the thermal image (for example, edge portions that contain no straight-line element) may be used instead.
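  As a concrete illustration of steps S1 and S2, the sketch below applies the Shi-Tomasi "Good Features to Track" detector (available in OpenCV as goodFeaturesToTrack) to a grayscale thermal image. The file name and the detector parameters are illustrative assumptions, not values taken from this publication.

```python
import cv2
import numpy as np

# Minimal sketch of step S2: extract corner-like feature points from a
# thermal image with the Shi-Tomasi detector cited above.
# File name and parameter values are illustrative assumptions.
thermal = cv2.imread("thermal_frame.png", cv2.IMREAD_GRAYSCALE)  # 8-bit thermal image

corners = cv2.goodFeaturesToTrack(
    thermal,
    maxCorners=200,     # upper bound on the number of feature points
    qualityLevel=0.05,  # relative threshold on the corner response
    minDistance=5,      # minimum spacing between feature points (pixels)
)
feature_points = (corners.reshape(-1, 2).astype(int)
                  if corners is not None else np.empty((0, 2), int))
print(len(feature_points), "feature points extracted")
```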

  FIG. 6 shows the result of extracting feature points from the image as shown in FIG.

  The feature points are obtained as a set of output points, indicated by the circles H1 in FIG. 6. Feature points obtained in this way hardly appear in regions such as the road surface, where the temperature changes continuously across the image. In contrast, many feature points appear in areas containing obstacles, which tend to produce changes in luminance value on the thermal image because of dense temperature changes, non-uniform surface composition, uneven infrared radiation directions, and so on. By using feature points, that is, pixels with a large luminance change in the image, the influence of seasonal temperature variation on the thermal image can thus be reduced.

  In addition, in the process of extracting feature points, it is desirable to calculate the strength of feature points together.

  FIGS. 7A and 7B are explanatory diagrams illustrating the strength of feature points.

  As shown in FIG. 7, the intensity of a feature point is defined by a luminance difference between a pixel that is a target feature point and its surroundings. For example, when i1 in FIG. 7A is compared with i2 in FIG. 7B, i1 has a larger luminance difference from surrounding pixels, and thus becomes a stronger feature point than i2.

  Subsequently, the calculation unit 13 calculates the density of the feature points (step S3). This density is calculated by the feature amount distribution calculation unit 132.

  To calculate the density of feature points, the entire image is divided into a plurality of strip-shaped regions, as shown at H2 in FIG. 6, the number of feature points in each strip is counted, and strips with a high count are taken as regions with a high feature point density. At this time, the count for each strip may be weighted according to the strength of its feature points.
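  A minimal sketch of this strip-based density calculation is given below; the number of strips, the use of feature-point strengths as weights, and the density threshold mentioned in the comment are assumptions made for illustration.

```python
import numpy as np

def strip_densities(feature_points, strengths, image_width, n_strips=16):
    """Step S3 sketch: split the image into vertical strips and sum the
    (optionally strength-weighted) feature-point counts per strip.
    The strip count and the weighting scheme are assumptions."""
    edges = np.linspace(0, image_width, n_strips + 1)
    density = np.zeros(n_strips)
    for (x, _y), w in zip(feature_points, strengths):
        idx = min(int(np.searchsorted(edges, x, side="right")) - 1, n_strips - 1)
        density[idx] += w
    return density

# Strips whose density exceeds a threshold are treated as regions with a
# high feature point density (the threshold of 20 is an assumption):
# dense = np.where(strip_densities(pts, weights, width) > 20)[0]
```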

  Subsequently, the calculation unit 13 extracts a region having a predetermined temperature zone (referred to as a predetermined temperature zone region) (step S4). The feature amount distribution calculation unit 132 also performs extraction of the predetermined temperature zone region.

  In the extraction of the predetermined temperature zone area, for example, an area having a temperature equivalent to a pedestrian is extracted as a predetermined temperature zone area in the scene of FIG. FIG. 8 is an explanatory diagram for explaining extraction of such a predetermined temperature zone region.

  As shown in FIG. 8, here, various objects (see FIG. 5) are extracted as regions J1 to J5 having temperatures equivalent to pedestrians.

  Subsequently, the calculation unit 13 calculates a temperature distribution inside the region for each obtained region (step S5). The feature amount distribution calculation unit 132 also calculates the temperature distribution.

  The temperature distribution is calculated by obtaining the area and internal temperature distribution of each of the extracted regions J1 to J5 shown in FIG. 8 and evaluating the uniformity of the internal temperature distribution. At this stage, regions whose area falls below a lower limit or exceeds an upper limit are excluded.
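  The following sketch combines steps S4 and S5: it thresholds a per-pixel temperature map into a pedestrian-equivalent band, labels connected regions, discards regions whose area is outside a plausible range, and reports each remaining region's area and internal temperature spread. The temperature band and the area limits are illustrative assumptions.

```python
import cv2
import numpy as np

def pedestrian_temperature_regions(temp_map, t_low=25.0, t_high=37.0,
                                   min_area=50, max_area=5000):
    """Steps S4-S5 sketch: connected regions in a pedestrian-equivalent
    temperature band, filtered by area, with an internal-uniformity
    measure. Band limits and area limits are assumptions."""
    band = ((temp_map >= t_low) & (temp_map <= t_high)).astype(np.uint8)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(band)
    regions = []
    for i in range(1, n):                       # label 0 is the background
        area = stats[i, cv2.CC_STAT_AREA]
        if not (min_area <= area <= max_area):  # exclude implausible areas
            continue
        pixels = temp_map[labels == i]
        regions.append({
            "bbox": tuple(stats[i, :4]),        # x, y, width, height
            "area": int(area),
            "temp_std": float(pixels.std()),    # uniformity of the internal distribution
        })
    return regions
```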

  Subsequently, the calculation unit 13 compares the feature point densities calculated in step S3 with the predetermined temperature zone regions extracted in step S4, obtains the feature point density of each region, and determines the areas where an obstacle exists or is likely to appear (step S6). This processing is performed by the moving object existence possibility area extraction unit 133.

  Specifically, the two rectangular areas with a high feature point density indicated by H3 in FIG. 6 overlap regions having a pedestrian-equivalent temperature; they correspond to J1 and J4 in FIG. 8. Here, J1 is the region corresponding to the pedestrian and J4 the region corresponding to the vending machine. If a region in a certain temperature zone also contains a region with a high feature point density, the probability is high that the region is an obstacle on the road, or that a pedestrian who could become an obstacle is present there. Such a region is therefore treated as an area where an obstacle exists or is likely to appear.
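  A simple way to express this comparison in code is to keep every temperature-band region whose bounding box overlaps one of the high-density strips; the overlap test below is an assumption about how the comparison is carried out, not a prescription from this publication.

```python
def candidate_areas(regions, dense_strip_bounds):
    """Step S6 sketch: temperature-band regions overlapping a strip of
    high feature-point density are treated as areas where an obstacle
    exists or is likely to appear."""
    candidates = []
    for r in regions:
        x, _y, w, _h = r["bbox"]
        for x0, x1 in dense_strip_bounds:   # horizontal extents of dense strips
            if x < x1 and x + w > x0:       # horizontal overlap test
                candidates.append(r)
                break
    return candidates
```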

  If the road surface area has been extracted in advance, then by comparing the shape of the road surface area with the areas where an obstacle is likely to exist or appear, an area extracted inside the road surface area can be judged highly likely to be an obstacle.

  In addition, when the road surface area is extracted in this way, a feature point region extracted in contact with the edge of the road can be regarded as a structure standing along the road area. Using the shape of the road surface area, it is therefore possible to estimate positional relationships such as the distance from the host vehicle to an area where a detected obstacle is likely to appear, and to improve the area detection performance.

  The road surface area can be extracted, for example, by recognizing in an image the roadside band line (the solid white line on the road) that indicates the roadway area. This, however, requires a visible-light video camera (separate from the infrared camera 11) and an image recognition device; in that case these constitute road surface area extraction means that exist separately from the vehicle lighting device, and information on the road surface area is obtained from that separate means.

  However, this enlarges the device configuration and increases cost, so in this embodiment the road surface area is extracted using the thermal image obtained by the infrared camera 11. To do so, edges having a straight-line component are first extracted from the thermal image. In the thermal image the curb of the road is measured at a higher temperature than the road surface, and the boundary between the curb and the road surface (or between the road surface and a structure such as a building) generally forms an edge with a straight-line component. By extracting such edges with a straight-edge extraction method such as the Hough transform, the road surface area can be separated from the other areas. This is possible because far-infrared radiation emitted from objects at the same physical temperature is measured as different temperatures depending on the angle (radiation angle) at which it is observed.

  In addition, when extracting edges having a straight-line component, the edge angle can be restricted to reduce the amount of calculation: for example, taking the 12 o'clock direction as 0°, edges within 0 ± 10° are not used, so that vertical edges of utility poles and the like are excluded, and edges within about ±5° of horizontal are not used, to avoid extracting stop lines and the like on the road.
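  A sketch of this angle-restricted straight-edge extraction is shown below, using a Canny edge map and the probabilistic Hough transform; the Canny and Hough parameters are illustrative assumptions, and the angle limits follow the example in the text.

```python
import cv2
import numpy as np

def road_boundary_segments(thermal, canny_lo=50, canny_hi=150):
    """Extract straight edges from the thermal image, discarding segments
    within 10 degrees of vertical (utility poles etc.) and within about
    5 degrees of horizontal (stop lines etc.). Thresholds are assumptions."""
    edges = cv2.Canny(thermal, canny_lo, canny_hi)
    segs = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                           minLineLength=40, maxLineGap=10)
    kept = []
    for x1, y1, x2, y2 in (segs.reshape(-1, 4) if segs is not None else []):
        # angle measured from the vertical (12 o'clock = 0 degrees)
        ang = np.degrees(np.arctan2(abs(x2 - x1), abs(y2 - y1)))
        if ang < 10 or ang > 85:
            continue
        kept.append((int(x1), int(y1), int(x2), int(y2)))
    return kept
```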

  In this way, by extracting the road surface area from the thermal image acquired by the infrared camera 11, there is no need to install a video camera or the like in addition to the infrared camera 11, and the system can be provided at low cost.

  The calculation unit 13 then changes the irradiation direction of the light irradiation unit 14 so that light is directed at the area, obtained in step S6, where an obstacle exists or a person is highly likely to appear, and irradiates the light (step S7).

  As a result, the visibility of an area where a moving object such as a person exists, or where a person is likely to appear, can be improved.

  After the light irradiation direction has been changed, it is preferable not to keep that direction illuminated until the next area to be illuminated is detected, but to stop illuminating the area where the moving object is likely to exist or appear after a while. Once such an area has been illuminated for a certain time, the driver's attention has already been drawn to it, so there is no need to keep illuminating the same place indefinitely.

  For example, consider a vehicle travelling along an alley at about 20 km/h. Given the angle of view of the infrared camera, if the detectable distance of a pedestrian ahead of the vehicle is 10 m or more, the vehicle travels roughly 15 m between the moment the pedestrian comes too close to be detected and the moment it actually reaches the pedestrian's position. Converted into time, this is about 3 seconds. Using this as a reference, it is preferable to turn the light off about 3 seconds after the pedestrian is no longer detected.
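  The timing can be reproduced with a short calculation; the 5 m margin added to the 10 m detection range (to give the roughly 15 m quoted above) is an assumption used only to make the arithmetic explicit.

```python
def light_hold_time_s(detect_range_m=10.0, margin_m=5.0, speed_kmh=20.0):
    """Time the vehicle needs to cover the distance from the point where
    the pedestrian can no longer be detected to the pedestrian's position."""
    speed_ms = speed_kmh / 3.6          # km/h -> m/s
    return (detect_range_m + margin_m) / speed_ms

print(round(light_hold_time_s(), 1), "s")  # about 2.7 s, i.e. roughly 3 seconds
```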

  Hereinafter, estimation methods for various obstacles will be described.

  First, whether an object is a vending machine can be determined from the shape of the road surface area and the height of the temperature distribution area. The heights of vending machines are stipulated by the Japan Vending Machine Industry Association at 1830 mm and 2007 mm. Therefore, when the height of the selected area can be estimated to be about 1.8 m to 2 m, the area is estimated to be a vending machine.

  Since there is a high possibility that a person is near a vending machine, it is preferable to irradiate that area with light. In other words, a direction determined to be a vending machine should be illuminated on the assumption that a pedestrian exists there or may emerge from there.

  Next, a parked vehicle can be estimated by extracting, as the predetermined temperature zone region in step S4, a region that is clearly hotter than a pedestrian (for example, 40 °C or higher; referred to as a high-temperature region).

  Specifically, for example, the road surface area of the road is extracted, the height of the extracted high-temperature region is estimated from the shape of the road surface area, and if that height is in the range of 1 m or less, the region is taken to correspond to the vehicle muffler and its surroundings are estimated to be a parked vehicle.

  However, a vehicle may have been parked for a long time, so its muffler may already be cold. In such a vehicle a considerable time has passed since it was parked, and it is unlikely that a person will now get out of it. A vehicle whose muffler has not yet cooled, on the other hand, has only just been parked, so there is a high possibility that a person getting out of it will appear. For this reason, it is preferable to detect vehicles immediately after parking, whose muffler temperature is still high, and to direct light toward that area on the assumption that a person is likely to appear from it.

  In addition, as another method for estimating a parked vehicle, attention is paid to the predetermined temperature range and the appearance density of feature points.

  While a vehicle is travelling, the areas around its tires, wheels, and front fenders become hot, so in an infrared image the vehicle can be extracted as a region having a pedestrian-equivalent temperature. Moreover, since its shape and temperature distribution are not uniform, it produces many feature points.

  Therefore, the feature point densities extracted in step S3 are compared with the predetermined temperature zone regions extracted in step S4, and regions that have a pedestrian-equivalent temperature and a high feature point density are extracted as candidate regions.

  Subsequently, the road surface area of the road is extracted, and from the candidate regions those lying within a predetermined range from the edge of the road surface area are selected. The distance to the target region is estimated from the shape of the road surface area, and from it the height of the selected region is estimated. A region whose height is about 1 m to 1.5 m can be estimated to be a parked vehicle.

  Next, a method will be described for extracting a planted area alongside a road as an area where an obstacle is likely to exist or appear.

  Although an obstacle is unlikely to exist inside the planting itself, a pedestrian may, for example, dart out from a place where the planting is interrupted. For this reason, the planted region is extracted, the places where it is interrupted, or the gaps between one planting and the next, are extracted, and those places are set as areas where an obstacle is likely to appear.

  To detect the planted region, a temperature region corresponding to the outside air temperature is extracted in the temperature region extraction of step S4.

  Subsequently, the density of the feature points calculated in step S3 is examined within the temperature region corresponding to the outside air temperature, and regions are extracted in which the feature point density is equal to or higher than a predetermined value and its distribution within the region is uniform.

  Here, when calculating the density of feature points, the calculation may be limited to feature points with low strength. Inside a planted region the temperature distribution is relatively uniform, so there are many feature points whose luminance differs little from their surroundings, that is, feature points with low strength. By restricting the density calculation to low-strength feature points, it therefore becomes easier to capture the distribution of feature points that appear on the leaves of the planting.

  Subsequently, the road surface area of the road is extracted, the distance distribution is estimated from its shape, and the distance to the corresponding region is estimated. From this estimated distance the height of the region is estimated; when this height is 1 m or less, the region is estimated to be planting.

  Light is then directed toward the region estimated to be planting in this way. This makes it possible to detect planting at the roadside, to extract places where a pedestrian or the like is likely to appear, such as breaks in the planting, and to improve visibility in that direction.

  Next, a method for extracting signboards existing on the road will be described.

  Obstacles may appear from behind signboards standing on the road. On an infrared image a signboard is measured at a temperature equal to or lower than the air temperature, and its surroundings have a straight-line structure. Therefore, if a region colder than the air temperature is extracted and straight edges are observed around it, the region is assumed to be a signboard.

  Specifically, a region having a temperature lower than the air temperature is extracted by extraction of the temperature region, which is the procedure of step S4.

  Then straight edges are extracted, and from the extracted regions those surrounded by observable straight edges are further selected. The road surface area of the road is extracted, and the height of the selected region is estimated from its shape. When this height is in the range of 1 m or less, the selected region is assumed to be a signboard, and light may be irradiated in that direction.

  As a result, a signboard on the road can be detected, so that it is possible to extract a region where a pedestrian may be behind, and to improve visibility in that direction.

  Next, a method for estimating artificial structures will be described.

  Artificial structures generally have straight edges. Therefore, in this embodiment as well, artificial structures are estimated by extracting such straight edges.

  FIG. 4 is a flowchart showing a processing procedure for detecting a linear edge and changing the illumination direction.

  First, the calculation unit 13 stores the image acquired by the infrared camera 11 in the image memory 121 (step S11).

  Subsequently, the computing unit 13 extracts a vertical straight edge (vertical edge) as shown in FIG. 9 from the thermal image (scene as shown in FIG. 5) once stored (step S12). This process is performed by the feature amount extraction unit 131.

  In this straight-edge extraction, for example, a standard Hough transform is used as the vertical edge extraction method: taking the vertical direction of the image as 0°, only edges falling within the range 0 ± 3° are extracted. Of course, the vertical edge extraction method is not limited to this, and other methods may be used.
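  A sketch of this vertical-edge extraction is shown below; it mirrors the earlier straight-edge sketch but keeps only segments within 0 ± 3° of the image's vertical direction. The Canny and Hough parameters are again illustrative assumptions.

```python
import cv2
import numpy as np

def vertical_edges(thermal, tol_deg=3.0):
    """Step S12 sketch: keep only near-vertical straight edges."""
    edges = cv2.Canny(thermal, 50, 150)
    segs = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                           minLineLength=30, maxLineGap=5)
    vertical = []
    for x1, y1, x2, y2 in (segs.reshape(-1, 4) if segs is not None else []):
        ang = np.degrees(np.arctan2(abs(x2 - x1), abs(y2 - y1)))  # 0 deg = vertical
        if ang <= tol_deg:
            vertical.append((int(x1), int(y1), int(x2), int(y2)))
    return vertical
```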

  Subsequently, the calculation unit 13 extracts regions having a predetermined temperature, as shown in FIG. 8 (step S13). The calculation unit 13 then excludes temperature regions whose area is outside a predetermined range and calculates the temperature distribution inside each remaining temperature region (step S14). These steps are performed by the moving object existence possibility area extraction unit 133.

  Subsequently, the calculation unit 13 groups the vertical edges calculated in step S12 with the temperature regions calculated in step S13 (step S15). This processing is performed by the moving object existence possibility area extraction unit 133.

  FIG. 10 is an explanatory diagram for explaining grouping.

  In the grouping, as shown in FIG. 10, vertical edges such as L2 whose distance L3 on the image from a temperature region is equal to or less than a predetermined value are grouped with that temperature region. The road surface area of the road is then extracted, and groups of temperature regions and vertical edges lying along the edge of the road surface area are extracted.

  The extracted group is presumed to be a building facing the road, for example a house, a structure such as an office building, or a fence.

  As shown in FIG. 10, the distance between the temperature region and a vertical edge need not be calculated only in the horizontal direction of the screen; the shortest distance on the image plane may be used instead.
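  The grouping of step S15 can be sketched as below: every vertical edge whose shortest image-plane distance to a temperature region's bounding box falls under a threshold is attached to that region. Approximating the region by its bounding box, measuring the distance only at the segment endpoints, and the 20-pixel threshold are all assumptions.

```python
import numpy as np

def group_region_with_edges(region_bbox, vertical_segs, max_dist_px=20):
    """Step S15 sketch: attach nearby vertical edges to a temperature region."""
    x, y, w, h = region_bbox
    grouped = []
    for x1, y1, x2, y2 in vertical_segs:
        dists = []
        for px, py in ((x1, y1), (x2, y2)):
            dx = max(x - px, 0, px - (x + w))   # horizontal gap to the box
            dy = max(y - py, 0, py - (y + h))   # vertical gap to the box
            dists.append(np.hypot(dx, dy))
        if min(dists) <= max_dist_px:
            grouped.append((x1, y1, x2, y2))
    return grouped
```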

  Subsequently, in the moving object existence possibility area extraction unit 133, the calculation unit 13 treats the ends of a region estimated to be a building, that is, the surroundings of its vertical edges, as places where the building is interrupted, and extracts them as areas where a moving object exists or is likely to appear. This makes it possible to extract the parts of a building from which a pedestrian or the like is likely to emerge.

  Light from the light irradiation unit 14 is then directed in the estimated direction (step S17). As a result, the visibility of places where a person is likely to emerge from a break between buildings can be improved.

  The estimation of other artificial structures will now be explained. Here, regions assumed to be utility poles or roadside trees, near which a pedestrian or the like may be present, are extracted, and their surroundings are set as areas where an obstacle is highly likely to exist or appear.

  First, the method for extracting utility poles and roadside trees uses the grouping result of temperature regions and vertical edges calculated in step S15; a group whose temperature region has a vertically elongated aspect ratio (for example, vertical:horizontal = 5:1 or more) is estimated to be a utility pole or a roadside tree. The visibility of that part is then improved by irradiating light in the direction of the estimated region.

  Here, in addition to the aspect ratio, the height of the grouped result estimated from the road surface shape may also be used, by means of the road surface area extraction result. For example, when the upper end of a region grouped by aspect ratio is at a height of 2 m or more, the region is estimated to be a utility pole or a roadside tree. Roadside trees are not artificial structures, but they can be estimated in the same way.

  In addition, there are cases where a billboard or the like is pasted on a part of the telephone pole or street tree. FIG. 11 is an explanatory diagram for explaining a state in which a signboard or the like is attached to a utility pole or a roadside tree.

  As with N1 and N2 in FIG. 11, a temperature region that should originally be measured continuously in the vertical direction may be measured as divided partway up. For this reason, when two such regions are grouped with vertical edges, such as N3 and N4, lying in the same left-right positions, and the difference N5 - N6 between the widths of the regions is within a predetermined range, it is determined that a single region has been measured as divided, and the two are treated as the same region.

Here, the predetermined range is defined, for example, as follows. When the spacing between the vertical edges N3 and N4 is so small that a variation of only one pixel changes the measurement greatly (when N3 - N4 is 10 pixels or less), the two are determined to be the same region if
|N3 - N4| > 2 × |N5 - N6|
holds. When the spacing between N3 and N4 is sufficiently large, for example 10 pixels or more, they are determined to be the same region if
1/4 × |N3 - N4| > |N5 - N6|
holds.
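  The two conditions can be written compactly as below; the text leaves the behaviour at exactly 10 pixels ambiguous, so treating that case with the first rule is an assumption.

```python
def same_divided_region(n3_n4_width, n5_n6_diff, narrow_px=10):
    """Decide whether two vertically separated temperature regions that
    share grouped vertical edges are one region split by a signboard."""
    w = abs(n3_n4_width)   # spacing between the vertical edges N3 and N4
    d = abs(n5_n6_diff)    # difference between the region widths N5 and N6
    if w <= narrow_px:     # narrow regions: single-pixel jitter matters
        return w > 2 * d
    return w / 4 > d       # wide regions: stricter relative tolerance
```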

  In this way, by extracting the trunk portion of a utility pole or roadside tree as it is measured on the screen, it is possible to extract an area from which a pedestrian or the like may emerge from its shadow, and to improve visibility in that direction.

  For the region estimated to be a building extracted in the procedure of step S6, the vertical edges belonging to the ends of the region are estimated to be the edges of the building. By detecting these building edges, it is determined whether a plurality of buildings are present.

  Here, a case where there are a plurality of buildings will be described.

  FIG. 12 is an explanatory diagram illustrating a case where there are a plurality of buildings.

  When there are a plurality of buildings, as shown in FIG. 12, as a result of the grouping, a situation such as groups M1 and M2 is obtained.

  Here, M3 is the distance on the image between the temperature-region-and-vertical-edge groups M1 and M2. If the road surface area of the road has been obtained in advance, the distance distribution on the image can be estimated from the road surface shape, so the moving object existence possibility region extraction unit 133 estimates the actual distance corresponding to M3 from that distribution. Taking the actual distance of M3 as the distance between the buildings, it can then be determined whether the region M4 between the buildings is one where an obstacle is highly likely to appear.

  Specifically, the distance distribution on the image can be calculated from the road surface shape by perspective transformation under the assumption that the road surface is a plane: the angle formed by the two edges of the road surface is obtained, and from it the distance distribution is calculated and estimated. This distance distribution associates each vertical position on the screen with a distance, assuming the road surface is a plane.

  The distance of M3 is then estimated from the obtained road distance distribution, using the vertical position of M3 and its width M3′.
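  One common way to realize such a flat-road distance estimate is the pinhole-camera relation sketched below; the camera height, pitch, focal length, and principal point are illustrative assumptions and not values taken from this publication.

```python
import numpy as np

def row_to_distance(v_row, cam_height_m=0.6, pitch_rad=0.02,
                    focal_px=800.0, v_center=240.0):
    """Flat-road sketch: map an image row below the horizon to a ground
    distance from the camera. All camera parameters are assumptions."""
    # angle of the viewing ray below horizontal for this row
    angle = np.arctan2(v_row - v_center, focal_px) + pitch_rad
    if angle <= 0:
        return np.inf                    # at or above the horizon
    return cam_height_m / np.tan(angle)

# The gap M3 can then be converted from pixels to metres with the usual
# pinhole relation: gap_m = row_to_distance(v_of_M3) * m3_width_px / 800.0
```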

  For example, when the distance of M3 obtained in this way is a width of about 1 to 2 m, it can be estimated that the area of M4 is a doorway or a gate of a building.

  Accordingly, since there is a possibility that a person may come out from the entrance of such a building, in such a case, the visibility is improved by irradiating light in that direction.

  Further, when the estimated distance of M3 further increases and is 3 m or more, it can be estimated that the region of M4 is an intersection. In such a case, it is possible to determine that the region M4 is a region where there is a high possibility that an obstacle will appear, and it is possible to improve visibility by irradiating light.

  Next, a method for estimating a glass door such as a store will be described.

  First, the region estimated by the procedure of step S6 is observed, and it is confirmed whether a straight edge can be extracted from a portion other than the vertical edge measured.

  Specifically, for example, straight edges are extracted at the top and bottom in addition to the region where the vertical edges were measured.

  Here, if straight edges are also extracted at the top and bottom of the vertical edges, the region is bounded by straight edges on all sides; in such a case the region is estimated to be a glass surface. The road surface area of the road is then extracted, and the size of the glass surface is estimated by comparing the road surface shape with the glass surface region.

  For example, the distance distribution on the image can be obtained from the road surface shape in the same way as in the distance distribution calculation for the road surface area. From this distance distribution, the estimated distance from the vehicle is first obtained using the position of the lower part of the presumed glass surface, where it meets the road surface area. Next, the distance between the vertical edges of the glass surface can be calculated and the height of the glass surface estimated; the size of the region can thus be estimated.

  For example, if the glass surface is calculated to be 2 m or more in height and to extend continuously for 1 m or more in the horizontal direction as well, it is estimated to be part of a store, and the area around it is assumed to be one where obstacles exist or are likely to appear. Visibility can then be improved by irradiating light in that direction.

  As described above, according to the present embodiment, regions where a moving object that could become a driving obstacle, typified by a pedestrian or a bicycle, exists or is likely to appear are extracted and light is directed toward them, so night-time visibility can be improved. In addition, the driver only has to watch the road ahead as in normal driving and does not need to look at a special display or the like, so the visual load on the driver can be reduced and convenience improved.

  Although embodiments to which the present invention is applied have been described above, it goes without saying that the present invention is not limited to these embodiments and that various modifications are possible. In particular, the above description treats the scenes where a pedestrian or the like may exist or appear as separate cases, but these may all be processed at once so that a plurality of detected areas are illuminated, or only one of them may be detected.

  In the embodiment the infrared camera 11 is built into the vehicle, but, for example, the infrared camera 11 and the light irradiation unit 14 may be mounted in a single case together with a microcomputer comprising the calculation unit 13 and the storage unit, forming an integrated device that can be attached to and detached from the car.

  The present invention does not particularly restrict the speed of the vehicle; however, since the driver must look at the illuminated area and judge it appropriately, a greater effect can be expected when the vehicle speed is not too high.

FIG. 1 is a block diagram showing the configuration of the vehicular lighting device.
FIGS. 2A and 2B are explanatory drawings showing the installation of the infrared camera.
FIG. 3 is a flowchart showing the procedure of the illumination processing by the vehicular lighting device.
FIG. 4 is a flowchart showing the processing procedure for detecting straight edges and changing the illumination direction.
FIG. 5 is an explanatory drawing showing an example of an image acquired by the infrared camera.
FIG. 6 is an explanatory drawing explaining feature amount extraction.
FIGS. 7A and 7B are explanatory drawings explaining the strength of feature points.
FIG. 8 is an explanatory drawing explaining extraction of a predetermined temperature zone region.
FIG. 9 is an explanatory drawing explaining vertical edge extraction.
FIG. 10 is an explanatory drawing explaining grouping.
FIG. 11 is an explanatory drawing explaining a state in which a signboard or the like is attached to a utility pole or roadside tree.
FIG. 12 is an explanatory drawing explaining the case where there are a plurality of buildings.

Explanation of symbols

DESCRIPTION OF SYMBOLS
1 Vehicle illumination device
11 Infrared camera
12 Storage unit
13 Calculation unit
14 Light irradiation unit
121 Image memory
122 Calculation memory unit
131 Feature amount extraction unit
132 Feature amount distribution calculation unit
133 Moving object existence possibility area extraction unit

Claims (34)

  1. Thermal image acquisition means for acquiring a thermal image around the vehicle;
    Feature quantity extraction means for extracting a feature quantity in the image from the thermal image acquired by the thermal image acquisition means;
    Feature quantity distribution calculating means for obtaining a distribution of the feature quantity extracted by the feature quantity extracting means;
    A moving object existence possibility region extracting unit that extracts a region where a moving object is highly likely to exist or appear from the distribution of the feature amount obtained by the feature amount distribution calculating unit;
    A light irradiating means for irradiating light toward an area where the moving object extracted by the moving object existence possibility area extracting means is likely to exist or appear;
    A vehicular lighting device comprising:
  2.   The vehicle lighting device according to claim 1, wherein the moving object existence possibility area extracting unit includes a road surface area extracting unit that extracts a road surface area.
  3.   The vehicular lighting device according to claim 2, wherein the road surface area extraction unit extracts the road surface area from the thermal image acquired by the thermal image acquisition unit.
  4. The feature amount extraction means extracts a pixel having a large luminance change with surrounding pixels as a feature point, and extracts a region belonging to a predetermined temperature zone,
    The feature amount distribution calculating means calculates the distribution of the feature points from any one or more of the number, density, intensity, and distance of the feature points, and calculates the temperature distribution inside the region belonging to the predetermined temperature zone. Calculate the area,
    The moving object existence possibility area extracting means refers to the distribution of the feature points obtained by the feature amount distribution calculating means and the temperature distribution inside the area belonging to the predetermined temperature zone, and the moving object exists. The vehicular lighting device according to any one of claims 1 to 3, wherein an area having high possibility is extracted.
  5. Whether the moving object existence possibility area extracting unit is the same area as the area where the feature point density calculated by the feature quantity distribution calculating unit is high and the area having a predetermined area at a predetermined temperature. Determine
    The temperature area is estimated as a vending machine when the density of the determined feature points is high and the area having a temperature equivalent to a pedestrian is 1.8 to 2 m high from the road surface area. The vehicle lighting device according to any one of claims 2 to 4.
  6.   The moving object existence possibility area extracting means has a temperature equal to or higher than the pedestrian equivalent temperature and has a high feature point density calculated by the feature quantity distribution calculating means, and is about 1 m or less from the road surface area. When it is calculated that it has a temperature region higher than a pedestrian at the position of the height, it is estimated that the temperature region is a parked vehicle. The vehicle lighting device according to Item.
  7.   The moving object existence possibility area extracting means has a temperature equal to or higher than a pedestrian equivalent temperature, and is 1 to 1 from the road surface area among areas having a high density of the feature points calculated by the feature amount distribution calculating means. The vehicle lighting device according to any one of claims 2 to 5, wherein the temperature range is estimated to be a parked vehicle when the height is 1.5 m.
  8.   The moving object existence possibility region extracting means has a temperature distribution of a region having a predetermined area at a predetermined temperature substantially equal to the ambient temperature, and the density of feature points inside the region having the predetermined temperature is uniform. The vehicular lighting device according to any one of claims 2 to 7, wherein the temperature region is estimated to be planted when the height from the road surface region is lower than 1 m in such a region.
  9. The feature amount extraction unit further extracts a straight edge from the thermal image acquired by the thermal image acquisition unit,
    The moving object existence possibility region extracting means is configured such that a temperature distribution of a region having a predetermined area at the predetermined temperature is measured at a temperature lower than the ambient temperature, and the extracted straight edge exists around the region. An area is extracted, and the temperature area having the straight edge around it is estimated to be a signboard on a road when the height from the road surface area is 1 to 1.5 m. Item 9. The vehicle illumination device according to any one of Items 2 to 8.
  10. The feature amount extraction unit further extracts a vertical edge from the thermal image acquired by the thermal image acquisition unit,
    The moving object existence possibility area extraction unit is an area in which an end of an area having a high temperature distribution uniformity within an area having a predetermined area at a predetermined temperature is the vertical edge as the moving object existence possibility area. The vehicle lighting device according to any one of claims 2 to 9, wherein the vehicle lighting device is extracted.
  11.   The moving object existence possibility area extraction unit estimates that the area is a power pole and a roadside tree when the extracted aspect ratio of the area serving as the vertical edge is a vertically long value greater than or equal to a predetermined value. The vehicle lighting device according to claim 10.
  12.   The moving object existence possibility area extraction unit estimates that the extracted area serving as the vertical edge continuously exists up to a height of 2 m or more from the road surface area as a power pole or a roadside tree. The vehicle lighting device according to claim 10 or 11, wherein the lighting device for a vehicle is used.
  13. The moving object existence possibility area extracting means, when the extracted area serving as the vertical edge exists in contact with an end of the road surface area extracted by the road surface area extracting means, The area around which the vertical edge is extracted at the edge of the area is estimated to be the edge of the building,
    Assuming that there are multiple buildings when there are building edges, the distance between each building is estimated from the plurality of building edges and the shape of the road surface area,
    The vehicle lighting device according to any one of claims 10 to 12, wherein when the estimated distance between the buildings is within a predetermined range, the region between the buildings is determined to be a region where a moving object is likely to appear.
  14.   When the estimated distance between the buildings is a width of 1 to 2 m, the moving object existence possibility area extracting unit assumes that the moving object is an entrance / exit of the building, and an area where the moving object is likely to appear The vehicle lighting device according to claim 13.
  15.   The moving object existence possibility area extracting means determines that the moving object is likely to appear as an intersection of a road when the estimated distance between buildings is a width of 3 m or more. The vehicle lighting device according to claim 13.
  16. The moving object existence possibility area extracting unit detects whether the area including the vertical edge has a straight edge in a portion other than the vertical edge being measured,
    Among the detected areas, if the entire periphery including the upper part of the vertical edge area is composed of straight edges, it is assumed that the area is a glass surface and that there is a store around the glass surface. The vehicular illumination device according to any one of claims 10 to 15.
  17.   The light irradiation means irradiates light to an area where the moving object is highly likely to exist or appear, and then illuminates an area where the moving object is likely to exist or appear after a certain period of time. The vehicle lighting device according to any one of claims 1 to 16, wherein the vehicle lighting device is stopped.
  18. Acquiring a thermal image around the vehicle;
    Extracting a feature amount in the image from the thermal image;
    Obtaining a distribution of the feature quantity;
    Extracting a region where a moving object is likely to exist or appear from the distribution of the feature amount;
    Irradiating light toward an area where the moving object is likely to exist or appear; and
    An illumination method comprising:
  19.   The illumination method according to claim 18, further comprising a step of extracting a road surface area.
  20.   The illumination method according to claim 19, wherein the step of extracting the road surface area includes extracting the road surface area from the thermal image.
  21. In the step of extracting the feature amount, a pixel having a large luminance change with surrounding pixels is extracted as a feature point, and an area belonging to a predetermined temperature zone is extracted.
    The step of obtaining the distribution of the feature amount includes calculating a feature point distribution from any one or more of the number, density, intensity, and distance of the feature points, and a temperature distribution inside the region belonging to the predetermined temperature zone. Calculate the area,
    The step of extracting a region where the moving object is highly likely to exist or appear refers to extracting a region where the moving object is highly likely to exist with reference to the distribution of the feature points and the temperature region distribution. The illumination method according to any one of claims 19 to 20, characterized in that
  22. In the step of extracting an area where the moving object is likely to exist or appear,
    It is determined whether the region having a high density of feature points and the region having a predetermined area at a predetermined temperature are the same region,
    Estimating the temperature region as a vending machine when the identified density of the feature points is high and the region having a temperature corresponding to a pedestrian is 1.8 to 2 m high from the road surface region. The illumination method according to any one of claims 19 to 21, wherein the illumination method is characterized in that
  23. In the step of extracting an area where the moving object is likely to exist or appear,
    It is calculated that a temperature region higher than the pedestrian is present at a position having a temperature of about 1 m or less from the road surface region in the region having a temperature equal to or higher than the pedestrian equivalent temperature and having a high feature point density. The lighting method according to any one of claims 19 to 22, wherein the temperature region is estimated to be a parked vehicle when the temperature is set.
  24. In the step of extracting an area where the moving object is likely to exist or appear,
    If the temperature is higher than the pedestrian equivalent temperature and has a high density of feature points, the temperature region is estimated to be a parked vehicle when the height is 1 to 1.5 m from the road surface region. The illumination method according to any one of claims 19 to 22, wherein
  25. In the step of extracting an area where the moving object is likely to exist or appear,
    The temperature distribution of a region having a predetermined area at a predetermined temperature is almost equal to the ambient temperature, and the density of feature points inside the region having the predetermined temperature is uniform. The illumination method according to any one of claims 19 to 24, wherein the temperature region is estimated to be implanted when the height is lower than 1 m.
  26. Further comprising extracting straight edges from the thermal image;
    In the step of extracting an area where the moving object is likely to exist or appear,
    The temperature distribution of a region having a predetermined area at the predetermined temperature is measured to be lower than the ambient temperature, and a region where the extracted straight edge exists around the region is extracted, and the straight edge is surrounded. 26. The temperature region according to claim 19, wherein the temperature region is estimated as a signboard on a road when the height from the road surface region is 1 to 1.5 m. Lighting method.
  27. The method further comprises a step of extracting vertical edges from the thermal image;
    in the step of extracting a region where the moving object is likely to exist or appear, a region in which the temperature distribution inside a region having a predetermined area at a predetermined temperature is highly uniform and whose edges form the vertical edges is extracted. The illumination method according to any one of claims 19 to 25.
  28. In the step of extracting an area where the moving object is likely to exist or appear,
    when a region whose interior temperature distribution, within the region having the predetermined area at the predetermined temperature, is highly uniform and whose edges are extracted as the vertical edges has a vertically long aspect ratio equal to or greater than a predetermined value, the region is estimated to be a utility pole or a roadside tree. The lighting method according to claim 27.
  29. In the step of extracting an area where the moving object is likely to exist or appear,
    when a region whose interior temperature distribution, within the region having the predetermined area at the predetermined temperature, is highly uniform and whose edges are extracted as the vertical edges is continuously present up to a height of 2 m or more from the road surface region, the region is estimated to be a utility pole or a roadside tree. The lighting method according to claim 27 or 28.
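  Claims 27 to 29 can be read as one combined test: among uniform-temperature regions bounded by vertical edges, a strongly vertical shape or a continuous extent of 2 m or more above the road marks a pole or roadside tree. A sketch with illustrative field names:

        def looks_like_pole_or_tree(region, road_y, metres_per_pixel, aspect_thresh=3.0):
            width_px = max(region["right_x"] - region["left_x"], 1)
            height_px = region["bottom_y"] - region["top_y"]
            aspect = height_px / width_px                          # > 1 means vertically long
            extent_m = (road_y - region["top_y"]) * metres_per_pixel
            candidate = region["temp_uniform"] and region["vertical_edges"]
            return candidate and (aspect >= aspect_thresh or extent_m >= 2.0)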
  30. In the step of extracting an area where the moving object is likely to exist or appear,
    when a region whose interior temperature distribution, within the region having the predetermined area at the predetermined temperature, is highly uniform and whose edges are extracted as the vertical edges exists in contact with the end of the road surface region, the region is judged to be a building, and the portion of the region boundary where the vertical edges are extracted is estimated to be the edge of the building;
    when a plurality of buildings are estimated, the distance between the buildings is estimated from the edges of the plurality of buildings and the shape of the extracted road surface region; and
    when the estimated distance between the buildings is within a predetermined range, the area between the buildings is regarded as a region where the moving object is highly likely to appear. The illumination method according to any one of claims 27 to 29.
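  The building-gap logic of claim 30 reduces to measuring the spacing between estimated building edges along the road. A sketch that converts the spacing to metres under a flat-ground assumption (the patent derives the distance from the road surface shape, which is not reproduced here):

        def building_gaps(building_edge_xs, metres_per_pixel, min_gap_m=1.0, max_gap_m=5.0):
            """building_edge_xs: sorted image columns of estimated building edges."""
            gaps = []
            for left, right in zip(building_edge_xs, building_edge_xs[1:]):
                gap_m = (right - left) * metres_per_pixel
                if min_gap_m <= gap_m <= max_gap_m:        # predetermined range
                    gaps.append((left, right, gap_m))      # candidate appearance points
            return gaps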
  31. In the step of extracting an area where the moving object is likely to exist or appear,
    when the estimated distance between the buildings is a width of 1 to 2 m, the gap is regarded as an entrance or exit of a building and as a region where the moving object is highly likely to appear. The lighting method according to claim 30.
  32. In the step of extracting an area where the moving object is likely to exist or appear,
    when the estimated distance between the buildings is a width of 3 m or more, the gap is regarded as a road intersection and as a region where the moving object is highly likely to appear. The illumination method according to claim 30.
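  Claims 31 and 32 then split the detected gaps by width; the labels in this sketch follow the claimed thresholds:

        def classify_gap(gap_m):
            if 1.0 <= gap_m <= 2.0:
                return "entrance"        # doorway-sized gap between buildings
            if gap_m >= 3.0:
                return "intersection"    # road-sized gap between buildings
            return "other"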
  33. In the step of extracting an area where the moving object is likely to exist or appear,
    for a region whose interior temperature distribution, within the region having the predetermined area at the predetermined temperature, is highly uniform and whose edges are extracted as the vertical edges, it is detected whether the other parts of its periphery consist of straight edges; and
    when, among the detected regions, the entire periphery including the upper part of the vertical-edge region consists of straight edges, the region is estimated to be a glass surface and a store is assumed to exist around the glass surface. The illumination method according to any one of claims 27 to 32.
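  A sketch of the claim-33 glass-front test; 'straight_edge_fraction' over the whole perimeter and 'top_is_straight_edge' are illustrative stand-ins for the claimed condition that the entire periphery, including the top, consists of straight edges:

        def looks_like_glass_front(region, straight_thresh=0.9):
            framed = (region["vertical_edges"]
                      and region["straight_edge_fraction"] >= straight_thresh
                      and region["top_is_straight_edge"])
            return region["temp_uniform"] and framed       # glass surface, store assumed behind it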
  34. The method further comprises, after the step of irradiating light toward the region where the moving object is likely to exist or appear, a step of stopping the illumination of that region after a certain period of time has elapsed. The illumination method according to any one of claims 18 to 33.
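  Claim 34 only adds a hold time to the spot illumination. A blocking sketch with assumed lamp-control callbacks (a real controller would more likely use a non-blocking timer):

        import time

        def illuminate_with_timeout(turn_on, turn_off, hold_seconds=3.0):
            turn_on()                    # aim and light the candidate region
            time.sleep(hold_seconds)     # keep it lit for the predetermined period
            turn_off()                   # then stop illuminating that region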
JP2005262690A 2005-09-09 2005-09-09 Lighting apparatus and method for vehicle Pending JP2007076378A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005262690A JP2007076378A (en) 2005-09-09 2005-09-09 Lighting apparatus and method for vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2005262690A JP2007076378A (en) 2005-09-09 2005-09-09 Lighting apparatus and method for vehicle

Publications (1)

Publication Number Publication Date
JP2007076378A true JP2007076378A (en) 2007-03-29

Family

ID=37937108

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005262690A Pending JP2007076378A (en) 2005-09-09 2005-09-09 Lighting apparatus and method for vehicle

Country Status (1)

Country Link
JP (1) JP2007076378A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010277384A (en) * 2009-05-29 2010-12-09 Nissan Motor Co Ltd Device and method for supporting traveling
WO2012164729A1 (en) 2011-06-02 2012-12-06 トヨタ自動車株式会社 Vehicular field of view assistance device
US9230178B2 (en) 2011-06-02 2016-01-05 Toyota Jidosha Kabushiki Kaisha Vision support apparatus for vehicle
WO2017169704A1 (en) * 2016-04-01 2017-10-05 日立オートモティブシステムズ株式会社 Environment recognition device
