JP4830604B2 - Object detection method and object detection apparatus - Google Patents


Info

Publication number
JP4830604B2
JP4830604B2 (application JP2006112877A)
Authority
JP
Japan
Prior art keywords
edge
detection
determination
vector
preset threshold
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP2006112877A
Other languages
Japanese (ja)
Other versions
JP2007288460A (en)
Inventor
Noriko Shimomura (下村倫子)
Original Assignee
Nissan Motor Co., Ltd. (日産自動車株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co., Ltd.
Priority to JP2006112877A
Publication of JP2007288460A
Application granted
Publication of JP4830604B2
Legal status: Expired - Fee Related
Anticipated expiration


Description

  The present invention relates to an object detection method and an object detection apparatus that detect an object using sensors such as a radar and a camera.

  Conventionally, driving support devices such as collision prevention devices, inter-vehicle distance control devices, and following travel devices have been proposed for vehicles. These driving support devices use an object detection device that detects an object (obstacle), such as a vehicle traveling in front of the host vehicle.

As such an object detection device, a technique has been proposed in which a camera and a radar are provided as detection sensors, and a vehicle is accurately identified based on the object edges extracted from the camera image and the object detection information from a laser radar (see, for example, Patent Document 1).
JP 2005-157875 A

  However, since the above-described prior art relies on the symmetrically arranged reflectors and detects only features specific to a four-wheeled vehicle, it cannot be applied to the detection of objects other than four-wheeled vehicles, and thus cannot meet the demand for detecting such objects.

  Accordingly, an object of the present invention is to provide an object detection method and an object detection apparatus that can detect an object other than a vehicle with high accuracy.

  The present invention has been made in view of the above circumstances. It provides an object detection method that uses, as information for identifying an object, at least one of the edge direction vector, the edge direction vector variance, the edge strength, and the edge strength variance, together with the actual speed of the detection target, and determines the type of the object based on these.

  The shape characteristics, movement characteristics, and actual speed of an object differ according to its type. Therefore, in the present invention, by using at least one of the edge direction vector, the edge direction vector variance, the edge strength, and the edge strength variance, together with the actual speed of the detection target, the type of an object can be determined with high accuracy.

  Hereinafter, embodiments of the present invention will be described with reference to the drawings.

  The object detection apparatus of this embodiment includes input means (10) for obtaining information on objects existing in the outside world, and object detection processing means (CU) for detecting an object based on the information obtained from the input means (10). The input means (10) includes means (1) for inputting at least image information obtained by imaging the detection target object and means (2) for inputting distance information to the detection target object. The object detection processing means (CU) performs an object type determination process that determines the type of the object based on the distance to the detection target object and at least one of the edge direction vector, the edge direction vector variance, the edge strength, and the edge strength variance calculated from the image information.

  An object detection apparatus according to Example 1 of the best mode for carrying out the present invention will be described with reference to FIGS.

  As shown in FIG. 1, the object detection apparatus according to the first embodiment is mounted on a vehicle MB and includes a camera 1, a radar 2, and a vehicle speed sensor 3 as input means, and a control unit CU as object detection processing means.

  For example, the camera 1 is mounted in the vicinity of the rearview mirror (not shown) in the passenger compartment. As the camera 1, at least one of a luminance camera that captures a luminance image, such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) camera, and an infrared camera that captures an infrared image can be used. In the first embodiment, a luminance camera is used.

  As the radar 2, a millimeter wave radar, a laser radar, or an ultrasonic radar can be used. In the first embodiment, a laser radar is used. In the case of laser radar, the distance and the reflection intensity of the laser beam can be obtained as information on the detected object. In the case of millimeter wave radar, distance, radio wave reflection intensity, and relative speed between the vehicle MB and the detected object can be obtained as detected object information.

  The vehicle speed sensor 3 is a known sensor and outputs a signal corresponding to the rotational speed of the wheel.

  As shown in FIG. 2, the control unit CU receives signals from an in-vehicle sensor group 10 comprising the camera 1, the radar 2, and the vehicle speed sensor 3, and performs object detection processing that detects an object and further determines its type. It is a well-known device including RAM (Random Access Memory), ROM (Read Only Memory), a CPU (Central Processing Unit), and the like.

  FIG. 2 functionally represents the configuration for performing the object detection processing in the control unit CU. The control unit CU includes a memory 11, an object presence determination processing unit 12, an object speed calculation processing unit 13, and an object type determination processing unit 14.

The memory 11 stores the luminance image information captured by the camera 1 and the detection information from the radar 2, namely the distance from the radar 2 to the object and the reflection intensity of the laser light for each horizontal scan resolution (angle) of the radar 2.
An example of luminance image information obtained by the camera 1 will be described with reference to FIG.
FIG. 3 shows an example in which a vehicle (other vehicle) AB, a person (pedestrian) PE, and a wall WO as a road structure exist in front of the vehicle (host vehicle) MB; the scene is projected onto the imaging surface 1a of the camera 1 as luminance image information. FIG. 3A shows the scene viewed from the side, FIG. 3B shows it viewed from above, and FIG. 3C shows an infrared image obtained when an infrared camera is used as the camera 1. On the other vehicle AB, MP in the drawing indicates a portion (the muffler) that is imaged as hot.

  In the following description, the vehicle AB serving as a detection target is referred to as the other vehicle AB in order to distinguish it from the vehicle MB on which the object detection device of the first embodiment is mounted, and the vehicle MB is referred to as the host vehicle.

  In FIG. 3, z denotes the distance from the point PA, at which the camera 1 is vertically projected onto the road surface, to the end point PF on the host vehicle MB side of the vertical projection of the person (pedestrian) PE onto the road surface; zc likewise denotes the distance to the other vehicle AB; and xs denotes the distance between the point PA and the point PF in the x-axis direction.

Therefore, when the reference coordinate system has its origin at the center of the lens 1b of the camera 1, the position of the point PF is expressed as (xs, -H, z), and the coordinates (xc, yc) at which the point PF appears on the image on the imaging surface 1a are given by the following equations (1) and (2), where f is the focal length.
xc = xs · f / z (1)
yc = −H · f / z (2)
Luminance image information is stored for each such x-, y-, and z-axis coordinate.
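The two projection equations can be transcribed directly into code. The following is a minimal sketch, with the symbols as defined above (H is the camera height above the road surface):

```python
def project_ground_point(xs, z, H, f):
    """Project the road-surface point (xs, -H, z) onto the image plane."""
    xc = xs * f / z   # equation (1)
    yc = -H * f / z   # equation (2)
    return xc, yc
```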

Next, an example of the detection information from the radar 2 is shown in FIG. 4.
FIG. 4 shows a detection example a short time after the scene described in FIG. 3; (a) shows the luminance image information and (b) shows the scene viewed from above. As shown in this figure, the presence of objects can be detected by sensing the laser light reflected by the other vehicle AB, the person (pedestrian) PE, and the wall (road structure) WO existing in front of the host vehicle MB. Note that qP, qA, and qW indicated by circles in FIG. 4B are the measurement points of the respective objects, and the size of each circle represents the reflection intensity described later.

  Therefore, the memory 11 stores the distance to the measurement point and the reflection intensity thereof for each direction in which the reflected wave is observed.

  Next, the object presence determination processing unit 12 will be described. First, the edge detection method used in its determination, the method of obtaining the edge direction and the edge direction vector variance, and the method of obtaining the optical flow will be explained.

  That is, in the first embodiment, the luminance image information obtained from the camera 1 is subjected to edge detection processing that extracts vertical edges, horizontal edges, and edge strength; direction vector calculation processing that extracts direction vectors; and optical flow processing that calculates the optical flow.

First, the edge detection processing and the way the edge direction is determined will be described with reference to FIGS. 5 to 7.
FIGS. 5A and 5B show vertical edge detection filters, and FIGS. 5C and 5D show horizontal edge detection filters. In these filters, black cells take negative values and white cells take positive values, and the values are set so that the sum of all coefficients in each filter is zero.

  Edge detection uses these edge detection filters: by shifting them over the detection region one pixel at a time, vertically and horizontally, and performing a convolution at each position, the strengths of the vertical and horizontal edges are calculated at every pixel position in the detection region.

Edge detection using these filters will be described further. FIG. 7A conceptually shows edge detection in the region Rab where the other vehicle AB is imaged; (e) in this figure is an example in which the horizontal edge detection filter is applied, and the same processing is performed with the vertical edge detection filter. FIG. 7B illustrates the principle of the edge calculation for the region where the wall WO is imaged.
For example, when the filters are convolved at positions (g) and (h) in FIG. 7, where a vertical edge exists on the image, the vertical edge detection filter yields a large value at position (g), while the horizontal edge detection filter yields a value close to zero at position (h). At position (j), where no edge exists, the value is close to zero whichever filter is applied. At a position with an oblique edge, as at (i), both filters yield values, but neither is as large as at position (g) (the absolute values are roughly half).

  As described above, in edge detection using these filters, both convolution values are small at a position where no edge exists, while at a position where an edge exists, the filter whose orientation is closer to that of the edge yields the larger convolution value. The sum of the absolute values of the convolution values with the vertical and horizontal edge detection filters indicates the strength of the edge.

  Next, a specific example of a method for calculating a direction vector of an edge in a portion where an edge exists on the image will be described.

  As described above, the vertical edge detection filter shows a high value at a vertical edge, and the horizontal edge detection filter shows a high value at a horizontal edge. In the case of an oblique edge, the convolutions with both filters yield values; if the edge is inclined closer to the vertical, the value from the vertical edge detection filter is larger than that from the horizontal edge detection filter.

  Further, comparing the convolutions at position (k) and position (g) in FIG. 7, the signs are reversed while the absolute values are almost the same.

FIG. 6 shows the relationship between the edge direction and the sign and absolute value of the convolution values. For example, if Dx and Dy are the convolution values obtained with the vertical and horizontal edge detection filters of FIGS. 5A and 5C, the angle of the edge is atan(Dx/Dy). By performing this calculation for each pixel in the detection region, a fine-grained edge direction vector can be calculated within the detection region.
Further, the above-mentioned edge strength can be obtained as |Dx| + |Dy|.
Note that the edge direction vector variance used in the object type determination process described later can be obtained by computing the variance of the direction vectors obtained by the above method.
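As a concrete illustration of this processing, the following is a minimal sketch assuming simple Prewitt-like 3x3 kernels in place of the FIG. 5 filters, whose exact coefficients are not reproduced here:

```python
import numpy as np
from scipy.signal import convolve2d

KX = np.array([[-1, 0, 1],
               [-1, 0, 1],
               [-1, 0, 1]])  # vertical-edge detection filter (coefficients sum to zero)
KY = KX.T                    # horizontal-edge detection filter

def edge_features(gray):
    """Per-pixel edge strength and direction, plus the direction vector variance."""
    dx = convolve2d(gray, KX, mode="same")  # convolution value Dx
    dy = convolve2d(gray, KY, mode="same")  # convolution value Dy
    strength = np.abs(dx) + np.abs(dy)      # edge strength |Dx| + |Dy|
    angle = np.arctan2(dx, dy)              # edge direction, cf. atan(Dx/Dy)
    mask = strength > 0                     # pixels where some edge response exists
    variance = np.var(angle[mask]) if mask.any() else 0.0
    return strength, angle, variance
```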

Next, the optical flow will be described.
The optical flow is an arrow connecting the position (xc, yc) at which a feature appears on the image at a certain time to the position at which the same feature appears Δt seconds later; in other words, it represents the movement on the image of a point on a captured object. Such an optical flow can be obtained by any conventionally proposed method, such as block matching or the gradient method.
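For reference, block matching, one of the conventional methods mentioned above, can be sketched as follows (a deliberately simple, unoptimized version; the block size and search range are assumed values):

```python
import numpy as np

def block_matching_flow(prev, curr, p=8, search=6):
    """One flow vector per p x p block, found by sum of absolute differences."""
    prev = prev.astype(float)
    curr = curr.astype(float)
    h, w = prev.shape
    flow = np.zeros((h // p, w // p, 2))
    for by in range(h // p):
        for bx in range(w // p):
            y0, x0 = by * p, bx * p
            block = prev[y0:y0 + p, x0:x0 + p]
            best, best_dv = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y1, x1 = y0 + dy, x0 + dx
                    if 0 <= y1 <= h - p and 0 <= x1 <= w - p:
                        sad = np.abs(curr[y1:y1 + p, x1:x1 + p] - block).sum()
                        if sad < best:
                            best, best_dv = sad, (dx, dy)
            flow[by, bx] = best_dv  # displacement over the frame interval Δt
    return flow
```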

This optical flow will be described concretely with reference to FIGS. 4 and 8, which show a case where the person PE is standing still and the other vehicle AB is moving forward in the same direction as the host vehicle MB; FIG. 8 shows the state Δt seconds after the time shown in FIG. 4. In both figures, (a) is the image from the camera 1 and (b) shows the detection area of the radar 2 viewed from above.

  Here, in equations (1) and (2) above, only z, the denominator, decreases during the Δt seconds leading to FIG. 8 as the host vehicle MB moves forward, so the values xc1, yc1, and hc1 describing the person PE in FIG. 4 become larger. Accordingly, the optical flow arrows grow longer in the direction away from the vanishing point VP (the point at which the forward point at infinity is imaged; when the optical axis LZ of the camera 1 is parallel to the road surface RS, the vanishing point VP lies at the center of the image).

  Similarly, since the points on the wall WO are also stationary, their optical flow becomes long. These flows form arrows radiating outward from the vanishing point VP toward the edges of the image.

  Therefore, the optical flow of the person PE shown in FIG. 8A points down and to the right at the feet, and to the right near the head, which is close to the center of the image.

  On the other hand, if the other vehicle AB is moving at roughly the same speed as the host vehicle MB, the distance relationship remains substantially constant and the value of z in equations (1) and (2) does not change, so its optical flow becomes shorter.

  In addition, the object presence determination processing unit 12 also performs processing for obtaining reflection intensity and the like when determining the presence of an object based on detection information from the radar 2.

  This process will be described. As described above, the sizes of the circles at the measurement points qP, qA, and qW shown in FIG. 8 indicate the strength of the reflection intensity.

  In general, a laser radar observes strong reflection intensity when it detects an object whose surface reflects light well (such as a metal signboard or a reflector). A vehicle (other vehicle AB) is generally provided with two or more reflectors (reflecting plates attached to aid nighttime visibility by efficiently returning light that strikes them at night). At least one reflector is always attached to a two-wheeled vehicle MS. Reflectors are also often installed along roads, on side walls or on center poles at curves. Furthermore, road signs and signboards are made of metal plates and generally reflect the laser radar's irradiation light well.

  On the other hand, natural objects such as the person PE and animals have low reflection efficiency. Therefore, among the measurement points qA of the other vehicle AB shown in FIG. 8B, points of strong reflection intensity are observed at the positions corresponding to the reflectors and are drawn as large circles, whereas the measurement points qP and qW of the person PE and the wall WO have relatively low reflection intensity and are drawn as small circles.

  Therefore, processing to obtain the intensity distribution of the radar 2 is performed. As shown in FIG. 8, the measurement points on one object are grouped into a cluster for each detection target. Since each measurement point carries intensity information, feature amounts characterizing the object are obtained for each group: the average, variance, and maximum value of the reflection intensity of the measurement points belonging to the group, as well as the number of measurement points. These are used in the object type determination process described later.

  Normally, the reflection intensity from the same object becomes weaker as the distance increases. Thus, in the first embodiment, the reflection intensity is stored as a distance-compensated value: either the reflection intensity is obtained for each distance, or a reference distance is determined in advance and the measured value is multiplied by the fourth power of (measured distance / reference distance) before being stored. This is because the reflection intensity has the characteristic of weakening in proportion to the fourth power of the distance.
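The fourth-power compensation can be written compactly. A minimal sketch, where the reference distance of 10 m is an assumed value not taken from the patent:

```python
def distance_compensated_intensity(measured, distance, ref_distance=10.0):
    """Normalize a reflection intensity to the reference distance."""
    # Reflection intensity weakens roughly with the fourth power of distance,
    # so multiplying by (distance / ref_distance) ** 4 undoes that falloff.
    return measured * (distance / ref_distance) ** 4
```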

Returning to the description of the object presence determination processing unit 12 in FIG. 2:
The object presence determination processing unit 12 determines the presence of an object using both the luminance image from the camera 1 and the detection information from the radar 2. That is, when the presence of an object is confirmed by both the luminance image information and the detection information, the object is determined to be present. Even when only one of the two indicates an object, it is still determined to be present if, for example, the object is detected continuously in time series, or the absolute value of the edge strength or reflection intensity indicating the object is large.
Specifically, the object presence determination using the image applies conventionally proposed processing. In the first embodiment, a vehicle (other vehicle AB), a person PE, a two-wheeled vehicle MS, and a road structure (wall WO) are set as detection targets, and their presence is judged based on the edges and the optical flow obtained from the image as described above.

  For example, the vehicle (other vehicle AB) is characterized by horizontal edges corresponding to the number plate 31 and the roof portion 32. An area where these horizontal edges are detected continuously over time is therefore determined to be an area where an object exists.

  For the person (pedestrian) PE, an area in which oblique or vertical edges corresponding to the limbs characteristic of a person are detected, and in which those edges are observed to move with the period of the limbs' motion, is determined to be a region where an object exists.

  As described above, the object presence determination processing unit 12 determines the area having the detection target feature as the area where the object exists.

On the other hand, the object presence determination using the radar is performed as follows.
For example, as shown in FIG. 4B, when an object is present it has a certain size, so a plurality of measurement points qP, qA, qW are detected close together. Therefore, by grouping the measurement points detected as one cluster, each object (person PE, other vehicle AB, wall WO) can be identified.

  Further, in the first embodiment, in order to remove noise components and improve the object detection accuracy, the groups of measurement points qP, qA, qW are observed in time series, and an object is determined to exist only if an object of the same size continues to be observed for at least a certain time.

  Here, in the first embodiment, this fixed time is defined as n consecutive observations in the time series. With n on the order of 2 to 4, this means that detection must continue for 2×Δt to 4×Δt seconds if the scanning period of the radar 2 is Δt seconds.

  Furthermore, the number n of consecutive detections may be varied with conditions: a small value (for example, n = 2) can be used when a measurement point group contains many points or the intensity is strong, while a larger value (for example, n = 4 to 6) can be used when the group contains few points or the intensity is weak, making the object presence determination more reliable, as sketched below.
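A minimal sketch of this n-consecutive-detections rule; the point-count and intensity thresholds used to switch n are assumed values for illustration:

```python
class PresenceFilter:
    """Accept an object only after n consecutive detections."""
    def __init__(self):
        self.streak = 0

    def update(self, detected, num_points, intensity):
        # Large, strongly reflecting groups need fewer confirmations (n = 2);
        # small or weak ones need more (n = 4). Thresholds are assumptions.
        required = 2 if (num_points >= 10 or intensity >= 0.5) else 4
        self.streak = self.streak + 1 if detected else 0
        return self.streak >= required
```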

  As described above, the object presence determination processing unit 12 performs object presence determination based on the luminance image information from the camera 1 and based on the detection information from the radar 2. When both determinations indicate an object in the same direction, it is concluded that an object exists in that direction.

  Further, in the first embodiment, it is confirmed that the two determinations refer to the same object by comparing the relative speed of the object found to be present in the luminance image information with the relative speed of the object found to be present in the detection information.

  That is, since the radar 2 can measure the distance to the object, the relative speed can be obtained from the change in that distance by observing the distance to the measurement point in time series. For example, FIG. 9 plots the distance to an object over successive detections; the relative velocity is given by the slope of the line connecting these points, that is, the change in distance divided by the observation time at the measurement point in the same position.
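This slope can be computed, for instance, with a least-squares fit; a minimal sketch:

```python
import numpy as np

def relative_speed_from_ranges(times, ranges):
    """Relative speed as the slope of range vs. time (cf. FIG. 9)."""
    # A first-order fit stands in for "the slope of the line connecting
    # these plots"; negative values mean the object is approaching.
    slope, _intercept = np.polyfit(times, ranges, 1)
    return slope
```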

On the other hand, from the luminance image information the relative speed is calculated as described below.
The size of an object imaged on the image is inversely proportional to its actual distance. For example, as shown in FIG. 3, if Wc is the actual width of the other vehicle AB, Wc0 the width of the other vehicle AB on the image, f the focal length of the camera 1, and zc the horizontal distance from the camera 1 to the other vehicle AB, then the relationship of the following equation (3) holds.
Wc0 = Wc × f / zc (3)

  Here, the distance zc to the other vehicle AB, which is the distance to the detection target, can be obtained by the radar 2. The width Wc0 on the image can also be obtained by processing on the image.

Therefore, the width Wc of the other vehicle AB can be obtained by the following equation (4).
Wc = Wc0 × zc / f (4)

  Therefore, in the first embodiment, when the relative velocity is obtained from the luminance image information, a rectangular area is extracted from the region where the object exists, and the relative speed is calculated based on the spacing between the edges within that rectangular area.

  That is, when the detection target has substantially parallel edges, the relative speed is obtained from the change in the interval between those edges. Even when there are no such edges, if a repeated pattern or texture is observed on the detection target, the change in size is obtained from the frequency components of the observed pattern or texture.

This will be described with reference to FIGS.
First, a case where the relative speed is obtained based on the edge interval will be described.
For example, take the other vehicle AB shown in FIG. 10. In this case, a region Rnp containing the number plate 31, a region rich in edges on the image, is extracted from the other vehicle AB. In this extracted region Rnp, as shown in FIG. 11, the observed edge intervals are the vertical interval wh1 of the number plate 31 and the horizontal intervals ww1 and ww2 formed by the lower end of the number plate 31 and the trunk lid 33.

  These intervals wh1, ww1, and ww2 are inversely proportional to the distance from the camera 1, as described above. FIG. 11 shows the distance to the other vehicle AB decreasing from the state shown in (a) to the state shown in (b); as the distance changes, the intervals wh1, ww1, and ww2 change to afwh1, afww1, and afww2, respectively. Therefore, the change in distance can be obtained from the changes in these intervals, and the relative speed can be obtained from the time the change takes.
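Combining this with equation (3), the relative speed follows from two observations of one interval; a minimal sketch (the real-world width of the observed feature must be known or estimated via equation (4)):

```python
def relative_speed_from_edge_interval(w0, w1, dt, focal_len, real_width):
    """Relative speed from the change of an on-image edge interval."""
    z0 = real_width * focal_len / w0  # distance at the first observation
    z1 = real_width * focal_len / w1  # distance dt seconds later
    return (z1 - z0) / dt             # negative while closing in
```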

Next, a method for obtaining a relative speed using frequency components will be described.
For example, when repeated diagonal edges are observed, as on the wall WO of FIG. 10, the interval between the repeated patterns in the region Rw extracted from part of the wall WO appears narrow at the distance of state (a), whereas the interval widens as the distance decreases to that of state (b).

  Such a change in the interval of the repeated pattern is obtained as a change in the frequency of the edges in the region Rw. The frequency components of the repetitive edge pattern are therefore obtained by a wavelet transform, an FFT (fast Fourier transform) of the image, or the like.

  In general, when there are many similar edges, as in a repetitive pattern, the pairing of edges can be mistaken, and it is difficult to compute the object's size change reliably from the edge-interval change described above. Therefore, in the first embodiment, for a repetitive pattern with short edge-to-edge spacing, the change in the frequency f is obtained instead, and by observing the reciprocal of the value at which the frequency peaks, the change in the object's size on the image, and hence the change in speed, can be observed.
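A minimal sketch of tracking the pattern scale through the FFT peak; `edge_profile` is assumed to be a 1-D slice of edge strength across the region Rw:

```python
import numpy as np

def pattern_scale(edge_profile):
    """Dominant spatial period (pixels) of a repeated edge pattern."""
    spectrum = np.abs(np.fft.rfft(edge_profile - np.mean(edge_profile)))
    freqs = np.fft.rfftfreq(len(edge_profile))
    peak = freqs[1:][np.argmax(spectrum[1:])]  # skip the DC bin
    # The reciprocal of the peak frequency tracks the on-image scale of the
    # pattern; its change over time reflects the relative speed.
    return 1.0 / peak
```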

  Next, returning to FIG. 2, the object speed calculation processing unit 13 obtains the speed of the detection target based on the detection information from the radar 2 and the host vehicle speed obtained from the vehicle speed sensor 3. That is, the relative velocity with respect to the object is obtained from the detection information of the radar 2, and the actual speed of the object is calculated by adding this relative speed to the host vehicle speed.

Next, the object type determination process performed in the object type determination processing unit 14 will be described.
In this object type determination process, the type of the object is determined based on the actual velocity Vn of the object, the edge direction vector variance Ev, the reflection intensity variance Br, and the reflection intensity.

That is, when the following conditions (a1) to (a4) are satisfied, the object is determined to be a traveling other vehicle AB (a code sketch of all four condition sets follows the lists below).
(a1) The actual speed Vn of the object is greater than or equal to a preset actual speed threshold Vs.
(a2) The edge direction vector variance Ev is less than a preset vector variance threshold Evs.
(a3) The reflection intensity variance Br is greater than or equal to a preset intensity variance threshold Brs.
(a4) The maximum value Rmax of the distance-compensated reflection intensity is greater than or equal to a preset maximum reflection intensity threshold MRs.
When the following conditions (b1) to (b5) are satisfied, the object is determined to be a moving two-wheeled vehicle MS.
(b1) The actual speed Vn of the object is greater than or equal to the actual speed threshold Vs.
(b2) The edge direction vector variance Ev is greater than or equal to the vector variance threshold Evs.
(b3) The reflection intensity variance Br is greater than or equal to the intensity variance threshold Brs.
(b4) The maximum value Rmax of the distance-compensated reflection intensity is greater than or equal to the maximum reflection intensity threshold MRs.
(b5) The number Kn of measurement points whose distance-compensated reflection intensity is greater than or equal to a preset reflection intensity threshold Rs is less than a preset reflection intensity measurement point threshold KPs.

When the following conditions (c1) to (c3) are satisfied, the object is determined to be a road structure (such as the wall WO).
(c1) The actual speed of the object is zero.
(c2) The edge direction vector variance Ev is less than the vector variance threshold Evs.
(c3) The number Kn of measurement points whose distance-compensated reflection intensity is greater than or equal to the preset reflection intensity threshold Rs is greater than or equal to a preset reflection intensity measurement point threshold KKs, or Kn = 0. Note that KKs > KPs.

When the following conditions (d1) to (d4) are satisfied, the object is determined to be a person (pedestrian) PE.
(d1) The actual speed Vn of the object is less than the actual speed threshold Vs.
(d2) The edge direction vector variance Ev is greater than or equal to the vector variance threshold Evs.
(d3) The maximum value Rmax of the distance-compensated reflection intensity is less than the maximum reflection intensity threshold MRs.
(d4) The number Kn of measurement points whose distance-compensated reflection intensity is greater than or equal to the reflection intensity threshold Rs is 0.

  An object that satisfies none of the above conditions but is nevertheless determined to exist from the luminance image information of the camera 1 and the detection information of the radar 2 is treated as another object.
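The four condition sets can be read directly as a rule cascade. A minimal sketch; all threshold defaults except Vs (which the flowchart sets near 8 km/h, about 2.2 m/s) are invented placeholders, since the patent states only that they are preset from measurements:

```python
def classify(vn, ev, br, rmax, kn,
             Vs=2.2, Evs=0.5, Brs=0.3, MRs=0.8, Rs=0.6, KPs=3, KKs=8):
    """Rule-based object type determination, conditions (a1)-(d4)."""
    if vn >= Vs and ev < Evs and br >= Brs and rmax >= MRs:
        return "traveling other vehicle AB"        # (a1)-(a4)
    if vn >= Vs and ev >= Evs and br >= Brs and rmax >= MRs and kn < KPs:
        return "moving two-wheeled vehicle MS"     # (b1)-(b5)
    if vn == 0 and ev < Evs and (kn >= KKs or kn == 0):
        return "road structure"                    # (c1)-(c3)
    if vn < Vs and ev >= Evs and rmax < MRs and kn == 0:
        return "person PE"                         # (d1)-(d4)
    return "other object"
```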

  The flow of the object type determination process in the object type determination processing unit is shown in the flowchart of FIG. 13. In this flowchart, the actual speed threshold Vs is set to a speed higher than the walking speed of a person PE (for example, about 8 km/h). The other thresholds Evs, Brs, Rs, and KPs are set to optimum values based on actual measurements.

  As described above, in the object detection apparatus according to the first embodiment, the object presence determination processing unit 12 first determines the presence or absence of an object based on the luminance image information obtained from the camera 1, and in parallel also determines the presence or absence of an object from the detection information obtained from the radar 2.

  If both determinations find an object in the same direction, the relative velocity is further calculated both from the image of the object found on the luminance image information and from the detection information of the object found by the radar, and it is confirmed from these relative speeds that the same object is being detected.

  Thereafter, the object speed calculation processing unit 13 calculates the actual speed Vn of the object confirmed to exist.

  Then, the object type determination processing unit 14 determines the object type based on the actual speed Vn of the object determined to exist, its edge direction vector variance Ev, its reflection intensity, and its reflection intensity variance Br. That is, the object is determined to be one of a traveling other vehicle AB, a moving two-wheeled vehicle MS, a person PE, a road structure, or another object. The determination result of the object type determination processing unit 14 is output to a driving support device or the like.

  As described above, in the object detection apparatus according to the first embodiment, when detecting an object, the object presence determination processing unit 12 performs the presence determination on the luminance image information and on the detection information separately, and concludes that an object exists when both determinations agree, so the detection accuracy can be improved. Moreover, in this presence determination an object is accepted only when it is observed continuously in time series for at least a certain time, so erroneous detections due to noise can be eliminated.

  In addition, in the first embodiment, the relative speed of the object determined to be present on the image of the camera 1 and the relative speed of the object determined to be present from the detection information of the radar 2 are obtained, and their agreement confirms that the same object is being detected. For this reason, the detection accuracy can be further improved without mistaking the background of the object on the image, or other objects adjacent to it, for the object itself.

  Further, in the first embodiment, for an object determined to exist, the object type determination processing unit 14 can determine its type as "traveling other vehicle AB", "moving two-wheeled vehicle MS", "person PE", "road structure (wall WO)", or "other object". Compared with an object detection device that detects only "vehicles", the range of possible uses and applications for the detection data is therefore expanded.

  In addition, the object type determination processing unit 14 uses the actual velocity Vn of the object, the edge direction vector variance Ev, the reflection intensity variance Br, and the reflection intensity in the type determination. These quantities capture the characteristics of the "traveling vehicle (other vehicle AB)", "moving two-wheeled vehicle MS", "person PE", and "road structure" mentioned above, and allow these types to be distinguished with high accuracy.

  That is, the actual speed Vn of the object is high for a traveling vehicle (AB) or a moving two-wheeled vehicle MS, whereas it is low or zero for a person PE or a road structure (WO), which makes it effective for distinguishing between them.

  The edge direction vector variance Ev tends to be low for artifacts such as the other vehicle AB and road structures, whose shapes contain many straight lines and usually remain constant. The person PE, on the other hand, is a natural object whose shape contains curves and changes as the limbs move, so the edge direction vector variance Ev tends to be high. Note that when a person PE rides a two-wheeled vehicle MS, the two-wheeled vehicle takes on the person's characteristics, so its edge direction vector variance Ev also tends to be high.

  As for the reflection intensity and its variance, the other vehicle AB and the two-wheeled vehicle MS carry reflectors as described above, so their reflection intensity tends to be high and its variance across measurement points also tends to be high. The person PE, by contrast, has few reflective parts and tends to give low reflection intensity. For a road structure, the reflection intensity depends on the material: a metal signboard reflects strongly, while a concrete wall or a tree reflects weakly. Furthermore, a road structure such as the wall WO either covers a large area with many high-intensity measurement points or, conversely, has no high-intensity measurement points at all.

  As described above, the characteristics differ with the type of the object, and by using the actual velocity Vn of the object, the edge direction vector variance Ev, the reflection intensity variance Br, and the reflection intensity for the type determination, the features specific to each type can be captured and the type of the object can be determined with high accuracy.

  Further, in the first embodiment, the data used for the type determination in the object type determination processing unit 14 are taken from the region where the relative velocities agree between the luminance image information and the detection information. This prevents information from the background or from objects adjacent to the determination target from being mixed into the type determination, so the type of the object can be determined with high accuracy.

  Next, an object detection apparatus according to Example 2 of the embodiment of the present invention will be described. In the description of the second embodiment, parts that are the same as or equivalent to those of the first embodiment are either omitted from the drawings or denoted by the same reference numerals, and their description is omitted.

  The second embodiment is different from the first embodiment in the contents of luminance image information processing in the object presence determination processing unit 12 and object type determination processing in the object type determination processing unit 14.

  In the second embodiment, when the edge direction vector variance is calculated in the object presence determination processing unit 12, two kinds are computed: an edge direction vector variance VA (first direction vector variance) and an edge direction vector variance VB (second direction vector variance).

  The former, the edge direction vector variance VA, is calculated based only on edges whose edge strength is greater than or equal to a preset threshold. That is, the number NA of edges whose strength exceeds the threshold is counted, and VA is calculated over those edges alone.

  The latter, the edge direction vector variance VB, is calculated based only on edges whose edge strength is less than the threshold, and at the same time the number NB of such edges is also counted.

  The threshold is set, based on experimental results, to be lower than the edge strength observed on a vehicle, which is an artifact, and higher than the edge strength observed on a natural object such as the person PE.
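Continuing the earlier sketch, the two variances can be computed from the per-pixel strength and direction arrays; a minimal sketch under the same assumptions:

```python
import numpy as np

def split_direction_variances(strength, angle, thresh):
    """Example 2's VA (strong edges) and VB (weak edges), plus NA and NB."""
    strong = strength >= thresh          # artifact-like edges
    weak = (strength > 0) & ~strong      # natural-object-like edges
    VA = np.var(angle[strong]) if strong.any() else 0.0  # for vehicle judgment
    VB = np.var(angle[weak]) if weak.any() else 0.0      # for pedestrian judgment
    return VA, VB, int(strong.sum()), int(weak.sum())
```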

  In the object type determination process, the edge direction vector variance VA is used for determining the vehicle (other vehicle AB), and the edge direction vector variance VB is used for determining the person (pedestrian) PE. Specifically, in step S2 for vehicle determination shown in FIG. 13, the variance VA of the edges above the threshold is compared with the edge variance threshold Evs, and in step S10 for person determination, the variance VB of the edges below the threshold is compared with the edge variance threshold Evs.

  As a result, in the determination of the other vehicle AB or a road structure, noise on the surface of the object is removed, and only the edge components characteristic of the object's contour, shape, and interior structure (for example, the bumper of the other vehicle AB) are extracted, free from the influence of noise.

  In this way, since only the direction component that characterizes the type of the object can be extracted, more reliable type determination is possible.

  In addition, a person PE, or a two-wheeled vehicle MS ridden by a person PE, usually has a complicated shape and is smaller than the other vehicle AB, so there is a high possibility that edges of other objects behind the person PE are included in the detection region of the image. For this reason, when the edge direction vectors are obtained over a region containing the whole person PE, they may be extracted together with edges of objects in the background. By extracting only the edges below the threshold, edges likely to belong to artifacts in the background are excluded, and only the edges of the person PE, or of the person riding the two-wheeled vehicle MS, can be extracted.

  As described above, in the second embodiment, the noise component can be removed in the determination of the object type based on the edge direction vector variance, and the object type determination accuracy can be improved.

  Next, an object detection apparatus according to Example 3 of the embodiment of the present invention will be described. In the description of the third embodiment, parts that are the same as or equivalent to those of the first embodiment are either omitted from the drawings or denoted by the same reference numerals, and their description is omitted.

  The third embodiment differs from the first embodiment in the object type determination processing of the object type determination processing unit 14. In the third embodiment, the type of the object is determined without using the actual velocity Vn, using the edge direction vector variance Ev obtained from the luminance image information together with the reflection intensity and distance of the measurement points obtained from the detection information of the radar 2.

That is, in Example 3, when the following conditions (a31) to (a33) are satisfied, the object is determined to be an other vehicle AB.
(a31) The edge direction vector variance Ev is less than a preset vector variance threshold Evs.
(a32) The reflection intensity variance Br is greater than or equal to a preset intensity variance threshold Brs.
(a33) The maximum value Rmax of the distance-compensated reflection intensity is greater than or equal to a preset maximum reflection intensity threshold MRs.

When the following conditions (b31) to (b34) are satisfied, the object is determined to be a two-wheeled vehicle MS.
(b31) The edge direction vector variance Ev is greater than or equal to the vector variance threshold Evs.
(b32) The reflection intensity variance Br is greater than or equal to the intensity variance threshold Brs.
(b33) The maximum value Rmax of the distance-compensated reflection intensity is greater than or equal to the maximum reflection intensity threshold MRs.
(b34) The number Kn of measurement points whose distance-compensated reflection intensity is greater than or equal to a preset reflection intensity threshold Rs is less than the preset reflection intensity measurement point threshold KPs.

When the following conditions (c31) and (c32) are satisfied, the object is determined to be a road structure (such as the wall WO).
(c31) The edge direction vector variance Ev is less than the vector variance threshold Evs.
(c32) The number Kn of measurement points whose distance-compensated reflection intensity is greater than or equal to the preset reflection intensity threshold Rs is greater than or equal to a preset reflection intensity measurement point threshold KKs, or Kn = 0. Note that KKs > KPs.

When the following conditions (d31) to (d33) are satisfied, the object is determined to be a person (pedestrian) PE.
(d31) The edge direction vector variance Ev is greater than or equal to the vector variance threshold Evs.
(d32) The maximum value Rmax of the distance-compensated reflection intensity is less than the maximum reflection intensity threshold MRs.
(d33) The number Kn of measurement points whose distance-compensated reflection intensity is greater than or equal to the reflection intensity threshold Rs is 0.

  An object that satisfies none of the above conditions but is nevertheless determined to exist from the luminance image information of the camera 1 and the detection information of the radar 2 is treated as another object.

  The flow of the object type determination process in the object type determination processing unit 14 is shown in the flowchart of FIG.

  In the third embodiment, since the actual speed Vn is not calculated, the amount of computation can be reduced and the determination made faster, and the configuration can be simplified, which reduces cost.

  The other vehicle AB and the two-wheeled vehicle MS can be detected regardless of whether they are traveling or stopped.

  Next, an object detection apparatus according to Example 4 of the embodiment of the present invention will be described with reference to FIGS. In the description of the fourth embodiment, parts that are the same as or equivalent to those of the first embodiment are either omitted from the drawings or denoted by the same reference numerals, and their description is omitted.

  The fourth embodiment is an example in which only the camera 1 is used as means for detecting a detection target object, as shown in FIG.

  When the control unit CU of the fourth embodiment is represented functionally, as shown in FIG. 16, it comprises a memory 411, an object presence determination processing unit 412, an object speed calculation processing unit 413, and an object type determination processing unit 414.

  The memory 411 stores luminance image information from the camera 1 as in the first embodiment.

  In the fourth embodiment, the object presence determination processing unit 412 determines the presence of an object based only on luminance image information from the camera 1. The determination of the presence of the object based on the luminance image information is performed based on the edge and the optical flow as in the first embodiment. As in the first embodiment, the edge direction vector variance is also calculated at this time.

  In the fourth embodiment, the object speed calculation processing unit 413 calculates the actual speed Vn of the object from speed information input from an external communication device 415. This speed information is, for example, speed information obtained from the other vehicle AB or the two-wheeled vehicle MS using a communication technology such as inter-vehicle communication, or speed information obtained from a speed detection device on the road using a communication technology such as road-to-vehicle communication.

  Further, in the fourth embodiment, in parallel with the speed input from outside, the actual speed of the other vehicle AB and the two-wheeled vehicle MS is calculated from the edge intervals described in the first embodiment. For these two types of objects, the calculated actual speed Vn is checked against the actual speed Vn obtained from the external communication device 415.

  Note that the types of objects whose actual speed is determined from the luminance image information are limited, for the following reason. When the speed is obtained from the edge interval, the first embodiment calculates the object's actual speed using the distance to the object measured by the radar 2 together with the edge interval; the fourth embodiment, however, has no radar 2 and cannot obtain this distance.

  Therefore, in the fourth embodiment, the distance to the object is calculated from the size at which a known real-world dimension appears on the image. In Example 4, the dimensions of the number plate 31 of the other vehicle AB or the two-wheeled vehicle MS are used as this known dimension.

  For example, for the other vehicle AB, the number plate 31 is detected in the region Rnp described with reference to FIG. 10 by finding a rectangle whose aspect ratio matches the 1:2 aspect ratio of the number plate 31. Such a rectangle is detected, for example, by extracting many edges with a Hough transform restricted to vertical and horizontal edges, and then finding a pair of two vertical edges and a pair of two horizontal edges whose spacings stand in the fixed ratio of 1:2. For other types of objects, only the actual speed Vn obtained from the external communication device 415 is used.
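Once the plate is located, the distance follows from equation (3) rearranged; a minimal sketch, where the plate width of 0.33 m is an assumed illustrative value (the patent states only that the plate's known dimensions are used):

```python
def distance_from_plate(plate_width_px, focal_len_px, real_plate_width_m=0.33):
    """Distance to a vehicle from the on-image width of its number plate."""
    # Equation (3), Wc0 = Wc * f / zc, rearranged to zc = Wc * f / Wc0.
    return real_plate_width_m * focal_len_px / plate_width_px
```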

  Next, the object type determination processing unit 414 determines the other vehicle AB that is traveling, the two-wheeled vehicle MS that is traveling, the road structure, and the person PE as the types of objects.

That is, in Example 4, when the following conditions (a41) and (a42) are satisfied, the object is determined to be an other vehicle AB.
(a41) The actual speed Vn is greater than or equal to a preset speed threshold Vs.
(a42) The edge direction vector variance Ev is less than a preset vector variance threshold Evs.

When the following conditions (b41) and (b42) are satisfied, the object is determined to be a traveling two-wheeled vehicle MS.
(b41) The actual speed Vn is greater than or equal to the speed threshold Vs.
(b42) The edge direction vector variance Ev is greater than or equal to the vector variance threshold Evs.

When the following conditions (c41) and (c42) are satisfied, the object is determined to be a road structure.
(c41) The actual speed Vn is less than the speed threshold Vs.
(c42) The edge direction vector variance Ev is less than the vector variance threshold Evs.

When the following conditions (d41) and (d42) are satisfied, the object is determined to be a person PE.
(d41) The actual speed Vn is less than the speed threshold Vs.
(d42) The edge direction vector variance Ev is greater than or equal to the vector variance threshold Evs.

  The flow of the object type determination processing in the object type determination processing unit 414 is shown in the flowchart of FIG.

  As described above, the other vehicle AB, the two-wheeled vehicle MS, the road structure, and the person PE show different combinations of the actual speed Vn and the edge direction vector variance Ev, so their types can be determined from these two features.

  As described above, in the fourth embodiment, only the camera 1 is used as the input means for obtaining information about objects, so the overall configuration can be simplified and the manufacturing cost and vehicle weight can be kept down. Further, since the object type determination is based on only two elements, the actual speed Vn and the edge direction vector variance Ev, the processing can be simplified, which also keeps down the manufacturing cost and shortens the processing time.

  Next, an object detection apparatus according to Example 5 of the embodiment of the present invention will be described with reference to FIGS. In the description of the fifth embodiment, parts that are the same as or equivalent to those of the first embodiment are either omitted from the drawings or denoted by the same reference numerals, and their description is omitted.

  The fifth embodiment is a modification of the fourth embodiment in which the edge strength and a histogram of the edge direction vectors are added to the determination elements of the object type determination process.

  Here, as described in the first embodiment, the edge strength can be obtained as |Dx| + |Dy|; the histogram of edge direction vectors is described below.

  When tracking an object in time series, an artificial object such as another vehicle AB or a road structure does not change its shape, but a non-artificial object such as a person PE changes its own shape.

  Accordingly, when the histogram of the edge direction vectors is computed, the time-series histogram of the region Rab containing the other vehicle AB in FIG. 18A has a narrow distribution, as shown in FIG. 18B. On the other hand, the time-series histogram of the region Rpe containing the person PE, a non-artifact, has a wide distribution range.

  Therefore, in the fifth embodiment, the type of the object is determined based on the edge strength, the edge direction vector variance, the histogram distribution representing the degree of time-series change of the edges, and the actual speed of the object.
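One simple way to quantify how wide the time-series histogram is: a minimal sketch in which the occupied-bin ratio is an assumed metric, chosen here only for illustration:

```python
import numpy as np

def direction_histogram_spread(angles_over_time, bins=18):
    """Spread of the time-series edge-direction histogram."""
    hist, _ = np.histogram(np.concatenate(angles_over_time),
                           bins=bins, range=(-np.pi, np.pi))
    # Share of occupied bins: near 1.0 means a wide, person-like
    # distribution; small values suggest a rigid artifact.
    return np.count_nonzero(hist) / bins
```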

That is, the object is determined to be an other vehicle AB when the following conditions (a51) to (a54) are satisfied.
(a51) The actual speed Vn is greater than or equal to a preset speed threshold Vs.
(a52) The edge direction vector variance Ev is less than a preset vector variance threshold Evs.
(a53) The edge strength EPn is greater than or equal to a preset strength threshold EPs.
(a54) The distribution range of the edge direction vector histogram is narrower than the set value.

When the following conditions (b51) and (b52) are satisfied, the object is determined to be a traveling two-wheeled vehicle MS.
(b51) The actual speed Vn is greater than or equal to the speed threshold Vs.
(b52) The edge direction vector variance Ev is greater than or equal to the vector variance threshold Evs.

Further, the object is determined to be a road structure when the following conditions (c51) to (c53) are satisfied.
(C51) The actual speed Vn is less than the speed threshold Vs.
(C52) The edge direction vector variance Ev is less than the vector variance threshold Evs.
(C53) The distribution range of the edge direction vector histogram is narrower than the set value.

Further, the object is determined to be a person PE when the following conditions (d51) to (d54) are satisfied.
(D51) The actual speed Vn is less than the speed threshold Vs.
(D52) The edge strength EPn is less than a preset strength threshold value EPs.
(D53) The edge direction vector variance Ev is equal to or greater than the vector variance threshold value Evs.
(D54) The distribution range of the edge direction vector histogram is wider than the set value.

  If none of the above determination conditions is satisfied, the object is determined to be another object. The flow of the object type determination process in the object type determination processing unit 414 described above is shown in the flowchart of FIG. A sketch of this combined decision rule is given below.
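
  The following minimal Python sketch encodes conditions (a51) through (d54); the preset values VS, EVS, EPS, and RANGE_SET are assumptions, and hist_range could be the distribution range computed as in the earlier histogram sketch.

    VS, EVS, EPS, RANGE_SET = 5.0, 0.3, 100.0, 4  # assumed preset values

    def classify_example5(vn, ev, epn, hist_range):
        # vn: actual speed Vn, ev: edge direction vector variance Ev,
        # epn: edge strength EPn, hist_range: distribution range of the
        # time-series edge direction vector histogram
        if vn >= VS and ev < EVS and epn >= EPS and hist_range < RANGE_SET:
            return "other vehicle AB"        # (a51)-(a54)
        if vn >= VS and ev >= EVS:
            return "two-wheeled vehicle MS"  # (b51), (b52)
        if vn < VS and ev < EVS and hist_range < RANGE_SET:
            return "road structure"          # (c51)-(c53)
        if vn < VS and epn < EPS and ev >= EVS and hist_range > RANGE_SET:
            return "person PE"               # (d51)-(d54)
        return "other object"                # no condition set satisfied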

  As described above, in the fifth embodiment, the edge strength EPn and the histogram of the edge direction vector are added to the determination conditions of the fourth embodiment in the object type determination. This makes it clearer whether the object is artificial, so the determination accuracy of the type can be improved. That is, an artificial object often has linear contours and therefore a high edge strength EPn, whereas a natural object such as the person PE tends to have curved contours and a weak edge strength EPn. In addition, an artifact usually keeps a fixed shape, so its histogram distribution range is narrow, whereas a natural object such as the person PE changes shape and therefore has a wide histogram distribution range. Adding the edge strength EPn and the edge direction vector histogram to the determination elements thus makes the difference between artificial and natural objects clearly determinable and improves the determination accuracy of the type.

  The embodiment of the present invention and Examples 1 to 5 have been described above in detail with reference to the drawings. However, the specific configuration is not limited to this embodiment and Examples 1 to 5; design changes that do not depart from the gist of the present invention are also included in the present invention.

  For example, in the first to fifth embodiments, the object detection method and the object detection apparatus of the present invention are mounted on a vehicle. However, the present invention is not limited to this and can also be applied to machines other than vehicles, such as industrial robots.

  In the first embodiment, the object presence determination processing unit 12 detects an object based on the edges and the optical flow when determining the presence of an object from the luminance image information. However, the present invention is not limited thereto. For example, when the detection target is known in advance, it may be detected by template matching against a template representing the target. Alternatively, features of the detection target may be learned in advance, and a learning-based detection method such as a neural network may be applied.
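
  As one illustration of the template matching alternative, a minimal OpenCV sketch (the template image, the score threshold, and all names are assumptions):

    import cv2

    def detect_by_template(gray, template, score_min=0.8):
        # slide the template of the target over the image and keep the best match
        result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        # return the top-left corner of the match, or None if the match is weak
        return max_loc if max_val >= score_min else None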

  Further, in the first embodiment, when the object presence determination processing unit 12 determines the presence of an object based on the detection information, the groups of measurement points qP, qA, and qW are observed in time series in order to remove noise components and improve the object detection accuracy, and an object is determined to exist when an object of the same size is observed continuously for a certain period of time or more; a sketch of this persistence check is given below. Doing so improves the accuracy of the presence determination, but the determination method is not limited to this; it may instead be determined immediately that an object exists at any measurement point whose reflection intensity is equal to or greater than a threshold value. In this case, the time required for the determination can be shortened.
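
  A sketch of this time-series persistence check (the frame count and the size tolerance are assumed parameters):

    from collections import deque

    class PersistenceFilter:
        # confirm an object only after it has been observed with a
        # consistent size for n_frames consecutive observations
        def __init__(self, n_frames=5, size_tol=0.2):
            self.history = deque(maxlen=n_frames)
            self.size_tol = size_tol

        def update(self, size):
            self.history.append(size)
            if len(self.history) < self.history.maxlen:
                return False  # not yet observed long enough
            mean = sum(self.history) / len(self.history)
            return all(abs(s - mean) <= self.size_tol * mean for s in self.history)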

  In the first embodiment, when the object presence determination processing unit 12 determines that an object exists in the same direction in both the luminance image information and the detection information, it not only determines that an object exists but also confirms that the two detections are the same object by checking whether the relative speed of the object detected from the luminance image information matches the relative speed of the object detected from the detection information. This makes the presence determination extremely reliable, but the confirmation is not essential and can be omitted.
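
  A one-function sketch of this relative speed cross-check (the agreement tolerance is an assumed parameter):

    def same_object(v_rel_image, v_rel_radar, tol=1.0):
        # treat the image-based and radar-based detections as the same
        # object when their relative speeds agree within tol [m/s]
        return abs(v_rel_image - v_rel_radar) <= tol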

  Furthermore, an example was shown in which the object presence determination processing unit 12 increases the object detection accuracy by determining that an object exists only when it is judged to exist in the same direction in both the luminance image information and the detection information. The present invention is not limited to this; an object may be determined to exist when its presence is confirmed with more than a certain degree of certainty. The certainty can be judged by whether the average value or the maximum value of the edge intensity or the reflection intensity is equal to or greater than a predetermined value, or by whether the object is observed for a predetermined time as described above.
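
  The certainty test can be sketched as follows; whether the average or the maximum is used, and the threshold value itself, are design choices left open by the text.

    def certainty_ok(values, threshold, use_max=False):
        # values: edge intensities or reflection intensities observed for
        # the candidate object over the region or the observation period
        v = max(values) if use_max else sum(values) / len(values)
        return v >= threshold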

  As described above, when the presence determination is allowed on either sensor alone, the object can still be detected by the radar 2 even during heavy fog or heavy rain, in which the detection accuracy of the camera 1 is lowered, so a decrease in detection accuracy due to such an external environment can be suppressed.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic showing the vehicle MB carrying the object detection apparatus of Example 1 of the embodiment of the present invention; (a) is a view from the side, and (b) is a view from above.
FIG. 2 is a block diagram functionally showing the control unit CU of the object detection apparatus of Example 1.
FIG. 3 is a diagram explaining the image information of the camera 1 in the object detection apparatus of Example 1; (a) shows the state seen from the side, (b) shows the state seen from above, (c) shows an infrared image when an infrared camera is used as the camera 1, and (d) shows a luminance image projected on the imaging surface 1a of the camera 1.
FIG. 4 is a comparative example of luminance image information and detection information in the object detection apparatus of Example 1; (a) shows the image information, and (b) shows measurement points based on the detection information.
FIG. 5 is a schematic showing the Sobel filter used in the information conversion processing of the image information of the camera 1 in the object detection apparatus of Example 1.
FIG. 6 is an explanatory diagram of how the edge direction vector is obtained in the information conversion processing of the image information of the camera 1 in the object detection apparatus of Example 1; (a) shows a filter for calculating the vertical edge component, (b) shows a filter for calculating the horizontal edge component, and (c) shows the relationship between the edge strength and the edge direction vector.
FIG. 7 is an explanatory drawing of edge detection by the object detection apparatus of Example 1; (a) shows an example of horizontal edge detection of the other vehicle AB, and (b) shows the calculation principle of the edges.
FIG. 8 is an explanatory drawing of the optical flow by the object detection apparatus of Example 1; (a) shows the optical flow when time has passed from the state of FIG. 4, and (b) shows the state of distance detection by the radar 2.
FIG. 9 is an explanatory drawing of obtaining the relative speed with the radar 2 in the object detection apparatus of Example 1.
FIG. 10 is an explanatory drawing of obtaining the relative speed from the luminance image information in the object detection apparatus of Example 1.
FIG. 11 is an explanatory drawing of obtaining the relative speed based on the change of the edge spacing in the object detection apparatus of Example 1; (a) shows a case where the distance is farther than in (b).
FIG. 12 is an explanatory drawing of obtaining the relative speed based on the change of a frequency component in the object detection apparatus of Example 1; (a) shows a case where the distance is farther than in (b).
FIG. 13 is a flowchart showing the flow of the object type determination process in the object type determination processing unit 14 of the object detection apparatus of Example 1.
FIG. 14 is a flowchart showing the flow of the object type determination process in the object type determination processing unit 14 of the object detection apparatus of Example 3.
FIG. 15 is a schematic showing the vehicle MB carrying the object detection apparatus of Example 4; (a) is a view from the side, and (b) is a view from above.
FIG. 16 is a block diagram functionally showing the control unit CU of the object detection apparatus of Example 4.
FIG. 17 is a flowchart showing the flow of the object type determination process in the object type determination processing unit 414 of the object detection apparatus of Example 4.
FIG. 18 is an explanatory drawing of the histogram of the edge direction vector in the object detection apparatus of Example 5; (a) shows luminance image information, (b) shows the time-series histogram of the region Rab in (a), and (c) shows the time-series histogram of the region Rpe in (a).
FIG. 19 is a flowchart showing the flow of the object type determination process in the object type determination processing unit 414 of the object detection apparatus of Example 5.

Explanation of symbols

1  Camera (means for inputting image information)
2  Radar (means for inputting distance information)
14  Object type determination processing unit
414  Object type determination processing unit
AB  Vehicle (other vehicle)
CU  Control unit (object detection processing means)
MS  Motorcycle (object)
PE  Person (object)
WO  Wall (object)

Claims (12)

  1. An object detection method for detecting an object using information related to an object existing in the outside world,
    determining the type of the object based on the actual speed of the detection object and on at least the edge direction vector variance among the frequency distribution of the time-series change of the edge direction vector calculated from image information, the edge direction vector variance, and the edge strength;
    a vehicle determination that determines that the object is a running vehicle when the actual speed of the object is equal to or greater than a preset threshold and the edge direction vector variance is less than a preset threshold;
    a two-wheeled vehicle determination that determines that the object is a two-wheeled vehicle when the actual speed of the object is equal to or greater than a preset threshold and the edge direction vector variance is equal to or greater than a preset threshold;
    a road structure determination that determines that the object is a road structure when the actual speed of the object is less than a preset threshold and the edge direction vector variance is less than a preset threshold;
    a person determination that determines that the object is a person when the actual speed of the object is less than a preset threshold and the edge direction vector variance is equal to or greater than a preset threshold;
    the object detection method determining the type of the object by using at least one of these determinations.
  2. The object detection method according to claim 1, wherein the vehicle determination condition includes that the range of the frequency distribution of the time-series change of the edge direction vector is narrower than a preset set value,
    the road structure determination condition includes that the range of the frequency distribution of the time-series change of the edge direction vector is narrower than a preset set value, and
    the person determination condition includes that the range of the frequency distribution of the time-series change of the edge direction vector is wider than a preset set value.
  3. An object detection method for detecting an object using information related to an object existing in the outside world,
    scanning with a detection wave, and determining the type of the object based on the distance to the detection object, on at least the edge direction vector variance among the frequency distribution of the time-series change of the edge direction vector calculated from image information, the edge direction vector variance, and the edge strength, and on at least the reflection intensity among the reflection intensity calculated from the detection information obtained from the reflection of the detection wave and the reflection intensity variance;
    a vehicle determination that determines that the object is a vehicle when the edge direction vector variance is less than a preset threshold, the reflection intensity variance is equal to or greater than a preset threshold, and the maximum value of the reflection intensity according to distance is equal to or greater than a preset threshold;
    a two-wheeled vehicle determination that determines that the object is a two-wheeled vehicle when the edge direction vector variance is equal to or greater than a preset threshold, the reflection intensity variance is equal to or greater than a preset threshold, the maximum value of the reflection intensity according to distance is equal to or greater than a preset threshold, and the number of measurement points whose reflection intensity according to distance is equal to or greater than a preset threshold is less than a preset threshold;
    a road structure determination that determines that the object is a road structure when the edge direction vector variance is less than the threshold and the number of measurement points whose maximum reflection intensity according to distance is equal to or greater than the threshold is either equal to or greater than the threshold set for a vehicle or zero;
    a person determination that determines that the object is a person when the edge direction vector variance is equal to or greater than a preset threshold, the maximum value of the reflection intensity according to distance is less than a preset threshold, and the number of measurement points whose reflection intensity according to distance is equal to or greater than a preset threshold is zero;
    the object detection method determining the type of the object by using at least one of these determinations.
  4. The object detection method according to claim 1, wherein, as the edge direction vector variance, a first direction vector variance obtained in portions where the edge strength is equal to or greater than a preset threshold and a second direction vector variance obtained in portions where the edge strength is less than the preset threshold are calculated,
    the first direction vector variance is used for the vehicle determination and the road structure determination, and
    the second direction vector variance is used for the person determination.
  5. An input means for obtaining information about an object existing in the outside world;
    Object detection processing means for detecting an object based on the information obtained from the input means;
    With
    The input means includes at least means for inputting image information obtained by imaging the detection target object, and means for inputting speed information of the detection target object.
    the object detection processing means performs an object type determination process that determines the type of the object based on the actual speed of the detection object obtained from the speed information and on at least the edge direction vector variance among the frequency distribution of the time-series change of the edge direction vector calculated from the image information, the edge direction vector variance, and the edge strength,
    In the object type determination process,
    a vehicle determination process for determining that the object is a running vehicle when the actual speed of the object is equal to or greater than a preset threshold and the edge direction vector variance is less than a preset threshold;
    A two-wheeled vehicle determination process for determining a two-wheeled vehicle when the actual speed of the object is equal to or higher than a predetermined threshold and the direction vector variance of the edge is equal to or higher than a predetermined threshold;
    When the actual speed of the object is less than a preset threshold value and the edge direction vector variance is less than the preset threshold value, a road structure determination process for determining a road structure,
    a person determination process for determining that the object is a person when the actual speed of the object is less than a preset threshold and the edge direction vector variance is equal to or greater than a preset threshold;
    An object detection apparatus including at least one of the determination processes.
  6. The object detection apparatus according to claim 5, wherein the vehicle determination condition in the vehicle determination process includes that the range of the frequency distribution of the time-series change of the edge direction vector is narrower than a preset set value,
    the road structure determination condition in the road structure determination process includes that the range of the frequency distribution of the time-series change of the edge direction vector is narrower than a preset set value, and
    the person determination condition in the person determination process includes that the range of the frequency distribution of the time-series change of the edge direction vector is wider than a preset set value.
  7. An input means for obtaining information about an object existing in the outside world;
    Object detection processing means for detecting an object based on the information obtained from the input means;
    With
    The input means includes at least means for inputting image information obtained by imaging the detection target object, and means for inputting distance information on the detection target object.
    As the means for inputting the distance information, detection information on the object based on the reflection of the detection wave obtained by scanning the object detection target region with the detection wave is included,
    the object detection processing means performs an object type determination process that determines the type of the object based on the distance to the detection object, on at least the edge direction vector variance among the frequency distribution of the time-series change of the edge direction vector calculated from the image information, the edge direction vector variance, and the edge strength, and on at least the reflection intensity among the reflection intensity calculated from the detection information and the reflection intensity variance,
    In the object type determination process,
    a vehicle determination process for determining that the object is a vehicle when the edge direction vector variance is less than a preset threshold, the reflection intensity variance is equal to or greater than a preset threshold, and the maximum value of the reflection intensity according to distance is equal to or greater than a preset threshold;
    a two-wheeled vehicle determination process for determining that the object is a two-wheeled vehicle when the edge direction vector variance is equal to or greater than a preset threshold, the reflection intensity variance is equal to or greater than a preset threshold, the maximum value of the reflection intensity according to distance is equal to or greater than a preset threshold, and the number of measurement points whose reflection intensity according to distance is equal to or greater than a preset threshold is less than a preset threshold;
    a road structure determination process for determining that the object is a road structure when the edge direction vector variance is less than the threshold and the number of measurement points whose maximum reflection intensity according to distance is equal to or greater than the threshold is either equal to or greater than the threshold set for a vehicle or zero;
    a person determination process for determining that the object is a person when the edge direction vector variance is equal to or greater than a preset threshold, the maximum value of the reflection intensity according to distance is less than a preset threshold, and the number of measurement points whose reflection intensity according to distance is equal to or greater than a preset threshold is zero;
    an object detection apparatus including at least one of the foregoing determination processes.
  8. An input means for obtaining information about an object existing in the outside world;
    Object detection processing means for detecting an object based on the information obtained from the input means;
    With
    The input means includes at least means for inputting image information obtained by imaging the detection target object, and means for inputting distance information on the detection target object.
    As the means for inputting the distance information, detection information on the object based on the reflection of the detection wave obtained by scanning the object detection target region with the detection wave is included,
    the object detection processing means performs an object type determination process that determines the type of the object based on the distance to the detection object, the actual speed of the object calculated from the change in the distance to the object or in the size of the object on the image, at least the edge direction vector variance among the frequency distribution of the time-series change of the edge direction vector calculated from the image information, the edge direction vector variance, and the edge strength, and at least the reflection intensity among the reflection intensity calculated from the detection information and the reflection intensity variance,
    In the object type determination process,
    a vehicle determination process for determining that the object is a running vehicle when the actual speed of the object is equal to or greater than a preset threshold, the edge direction vector variance is less than a preset threshold, the reflection intensity variance is equal to or greater than a preset threshold, and the maximum value of the reflection intensity according to distance is equal to or greater than a preset threshold;
    a two-wheeled vehicle determination process for determining that the object is a running two-wheeled vehicle when the actual speed of the object is equal to or greater than a preset threshold, the edge direction vector variance is equal to or greater than a preset threshold, the reflection intensity variance is equal to or greater than a preset threshold, the maximum value of the reflection intensity according to distance is equal to or greater than a preset threshold, and the number of measurement points whose reflection intensity according to distance is equal to or greater than a preset threshold is less than a preset threshold;
    a road structure determination process for determining that the object is a road structure when the actual speed of the object is zero, the edge direction vector variance is less than a preset threshold, and the number of measurement points whose maximum reflection intensity according to distance is equal to or greater than a preset threshold is either equal to or greater than the threshold set for a vehicle or zero;
    a person determination process for determining that the object is a person when the actual speed of the object is less than a preset threshold, the edge direction vector variance is equal to or greater than a preset threshold, the maximum value of the reflection intensity according to distance is less than a preset threshold, and the number of measurement points whose reflection intensity according to distance is equal to or greater than a preset threshold is zero;
    an object detection apparatus including at least one of the foregoing determination processes.
  9. The object detection apparatus according to claim 5, wherein, in the object type determination process, as the edge direction vector variance, a first direction vector variance obtained in portions where the edge strength is equal to or greater than a preset threshold and a second direction vector variance obtained in portions where the edge strength is less than the preset threshold are calculated,
    the first direction vector variance is used for the vehicle determination process and the road structure determination process, and
    the second direction vector variance is used for the person determination process.
  10. The object detection processing means includes an object presence determination processing unit that determines whether or not an object exists before performing the object type determination processing,
    the object presence determination processing unit performs a first object presence determination based on the image information and a second object presence determination based on the detection information, and performs an object determination process that finally determines that an object is present when both determination results indicate that an object exists in the same direction; the object detection apparatus according to claim 5.
  11. The object detection apparatus according to claim 10, wherein, in the first object presence determination process and the second object presence determination process, the object presence determination processing unit performs observation in time series and determines that an object exists when the object is measured continuously over a preset time.
  12. The object detection apparatus according to claim 10 or 11, wherein, in the first object presence determination process and the second object presence determination process, the object presence determination processing unit obtains a relative speed for each of the image information and the detection information, the final object presence determination includes that the relative speeds match, and
    the object type determination processing unit performs the object type determination process based on the information of the region where the relative speeds of the image information and the detection information coincide with each other.
JP2006112877A 2006-04-17 2006-04-17 Object detection method and object detection apparatus Expired - Fee Related JP4830604B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006112877A JP4830604B2 (en) 2006-04-17 2006-04-17 Object detection method and object detection apparatus


Publications (2)

Publication Number Publication Date
JP2007288460A JP2007288460A (en) 2007-11-01
JP4830604B2 true JP4830604B2 (en) 2011-12-07

Family

ID=38759823

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2006112877A Expired - Fee Related JP4830604B2 (en) 2006-04-17 2006-04-17 Object detection method and object detection apparatus

Country Status (1)

Country Link
JP (1) JP4830604B2 (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4857909B2 (en) * 2006-05-23 2012-01-18 日産自動車株式会社 Object detection method and object detection apparatus
JP5145986B2 (en) * 2008-02-05 2013-02-20 日産自動車株式会社 Object detection apparatus and distance measuring method
JP4842301B2 (en) * 2008-07-01 2011-12-21 トヨタ自動車株式会社 Pedestrian detection device and program
DE112008004159B4 (en) * 2008-12-09 2014-03-13 Toyota Jidosha Kabushiki Kaisha Object detection device and object detection method
KR100972041B1 (en) 2009-01-15 2010-07-22 한민홍 Method for recognizing obstacle using cameras
JP5136504B2 (en) * 2009-04-02 2013-02-06 トヨタ自動車株式会社 Object identification device
JP5379543B2 (en) * 2009-04-09 2013-12-25 日立オートモティブシステムズ株式会社 Automobile external recognition device
JP5493601B2 (en) * 2009-08-31 2014-05-14 日産自動車株式会社 Driving support device and driving support method
US8610620B2 (en) 2009-12-08 2013-12-17 Toyota Jidosha Kabushiki Kaisha Object detecting apparatus and object detecting method
WO2011114815A1 (en) 2010-03-17 2011-09-22 本田技研工業株式会社 Vehicle surroundings monitoring device
KR101080730B1 (en) 2010-03-18 2011-11-07 서강대학교산학협력단 Radar reflector for vehicle
JP2011243154A (en) * 2010-05-21 2011-12-01 Mitsubishi Electric Corp Device and method for detecting vehicle
DE102015101292A1 (en) * 2015-01-29 2016-08-04 Valeo Schalter Und Sensoren Gmbh Method for detecting an object in an environmental region of a motor vehicle by checking a spatial deviation of measuring points, control device, driver assistance system and motor vehicle
JP2016142661A (en) 2015-02-03 2016-08-08 オプテックス株式会社 Vehicle detection device, vehicle gate system, object detection device, control method for vehicle detection device, and vehicle detection program
JP2017054311A (en) * 2015-09-09 2017-03-16 株式会社デンソー Object detection apparatus
WO2019187216A1 (en) * 2018-03-30 2019-10-03 Necソリューションイノベータ株式会社 Object identification device, object identification method, and non-temporary computer readable medium storing control program

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05265547A (en) * 1992-03-23 1993-10-15 Fuji Heavy Ind Ltd On-vehicle outside monitoring device
JP3377743B2 (en) * 1998-01-20 2003-02-17 三菱重工業株式会社 Mobile object identification device
JPH11271441A (en) * 1998-03-23 1999-10-08 Hino Motors Ltd On-vehicle radar apparatus
JP3450189B2 (en) * 1998-06-16 2003-09-22 ダイハツ工業株式会社 Pedestrian detection system and control method thereof
JP3669205B2 (en) * 1999-05-17 2005-07-06 日産自動車株式会社 Obstacle recognition device
JP3649166B2 (en) * 2001-07-26 2005-05-18 日産自動車株式会社 Object type discrimination device and object type discrimination method

Also Published As

Publication number Publication date
JP2007288460A (en) 2007-11-01


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20090225

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110502

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110621

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110729

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20110823


A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20110905

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140930

Year of fee payment: 3

LAPS Cancellation because of no payment of annual fees