JP5145986B2 - Object detection apparatus and distance measuring method - Google Patents

Publication number: JP5145986B2
Authority: JP (Japan)
Prior art keywords: distance, detection, relative, target, image
Legal status: Active
Application number: JP2008025021A
Other languages: Japanese (ja)
Other versions: JP2009186260A
Inventor: Kazumi Fujimoto (和巳 藤本)
Original assignee: Nissan Motor Co., Ltd. (日産自動車株式会社)
Application filed by Nissan Motor Co., Ltd., with priority to JP2008025021A
Publication of JP2009186260A
Application granted; publication of JP5145986B2


Description

  The present invention relates to an object detection apparatus and a distance measurement method for calculating a distance from a vehicle to an object and detecting the object.

  A known travel safety device extracts objects based on the information of the image output from a camera, calculates their relative positions and relative distances with respect to the host vehicle, detects a stopped vehicle and a moving object beyond the stopped vehicle, and calculates the relative distance between the stopped vehicle and the moving object based on the output signal of a radar (see Patent Document 1).

JP 2005-280538 A

  However, when there is no stopped vehicle ahead and a pedestrian is far away, the reflectance is low, so accurate distance information may not be obtained and objects such as pedestrians may go undetected.

  The problem to be solved by the present invention is to acquire accurate distance information from the host vehicle to an object that exists far away, and to detect an object such as a distant pedestrian with high accuracy.

  The present invention solves the above problem by, when the radar ranging means cannot measure the distance to the target object, calculating the distance from the host vehicle to the target object based on the relative positional relationship between the target object and a reference object calculated from the information of the image captured by the imaging means, and on the reference distance from the host vehicle to the reference object measured based on the received signal acquired by the radar ranging means.

  According to the present invention, the relative positional relationship between the target object and the reference object is converted into a relative positional relationship with the host vehicle with reference to the position of the reference object. Therefore, even when the object is distant or, like a pedestrian, returns a received signal too weak to be measured directly by the radar ranging means, the distance from the host vehicle to the target object can be calculated, and objects such as pedestrians can be detected with high accuracy.

  The object detection device according to the present embodiment is a device that is mounted on a vehicle and detects the presence of objects around the vehicle and the positional relationship with the host vehicle.

  FIG. 1 is a diagram illustrating an example of the block configuration of an in-vehicle device 1000 including the object detection device 100. As shown in FIG. 1, the in-vehicle device 1000 according to the present embodiment includes the object detection device 100, a travel support device 200 that performs travel support based on the determination results of the object detection device 100, and an output device 300 that outputs the detection results of the object detection device 100. These are connected by an in-vehicle LAN such as a CAN (Controller Area Network).

  As shown in FIG. 1, the object detection apparatus 100 includes a camera 10 as one aspect of an imaging means that captures the surroundings of the vehicle, an image memory 11 that temporarily records the images captured by the camera 10, a radar distance measuring device 20 as one aspect of a radar ranging means that measures the distance between target objects around the vehicle and the host vehicle, a distance measurement data memory 21 that temporarily records ranging data based on the received signals acquired by the radar distance measuring device 20, and a control unit 101 that calculates the distance from the host vehicle to the target object based on the relative positional relationship between the target object and a reference object and on the reference distance from the host vehicle to the reference object.

  Each configuration will be described below.

The camera 10 is a camera having an image sensor such as a charge-coupled device (CCD) or a complementary metal-oxide semiconductor (CMOS). The camera 10 according to the present embodiment captures, at a predetermined cycle, objects (including three-dimensional bodies and planar bodies on the road surface) existing around the vehicle (the front, rear, and sides of the vehicle), and outputs the captured images to the image memory 11. The image memory 11 stores the images captured by the camera 10 in an accessible state.

FIG. 2 shows an installation example of the camera 10. FIG. 2A is a side view of a vehicle equipped with the camera, and FIG. 2B is a view of the same vehicle from above. As shown in FIG. 2, in this embodiment one camera 10 is installed in the vehicle; that is, the vehicle surroundings are imaged monocularly. In the present embodiment, the camera 10 is installed in the upper part of the vehicle interior, facing the front of the vehicle. The optical axis LS of the camera 10 is adjusted to point in the Z direction of the vehicle traveling direction (the driver's forward direction), the horizontal axis X of the imaging surface is adjusted to be parallel to the road surface, and the vertical axis Y of the imaging surface is adjusted to be perpendicular to the road surface.

  FIG. 3 is an example of an image of the area in front of the vehicle captured by the camera 10 of the present embodiment. The image captured by the camera 10 is represented by an xy coordinate system whose origin is the upper-left vertex of the image, with the x-axis extending rightward from the origin and the y-axis extending downward. Note that the captured image illustrated in FIG. 3 includes a stopped vehicle and a pedestrian, both of which are stationary objects.

The radar distance measuring device 20 is a device that measures the distance and direction to a target object by emitting electromagnetic waves toward the target object and analyzing the reflected wave (received signal). The radar distance measuring device 20 of this embodiment is, for example, a laser radar with a scanning mechanism that scans laser light within a predetermined scanning range by changing the optical axis by a predetermined angle on the scanning surface, thereby irradiating the objects existing in the scan range with laser light. The radar distance measuring device 20 detects the reflected laser light (reflected wave) returned by an object ahead, and thereby acquires a received signal (reflected signal) based on the light intensity of the reflected laser light. It then generates distance measurement information by performing ranging processing on the acquired received signal, and outputs it to the control unit 101, either via the distance measurement data memory 21 or directly. The timing of the distance measurement by the radar distance measuring device 20 is not particularly limited, but it is preferable to emit the radar wave at substantially the same timing as the imaging of the camera 10.

Further, as shown in FIG. 2, the radar distance measuring device 20 of the present embodiment is installed at the front of the vehicle, and its optical axis is set to face the vehicle front direction (Z direction), like the camera 10.

  The control unit 101 calculates the distance from the host vehicle to the target object based on the relative positional relationship between the target object and the reference object, calculated from the image captured by the camera 10, and on the reference distance from the host vehicle to the reference object, measured based on the received signal of the radar distance measuring device 20. The control unit 101 is configured by combining arithmetic circuits such as a CPU, MPU, DSP, and FPGA.

  The operation of the control unit 101 is shown in FIG. 4. As shown in FIG. 4, the control unit 101 of the present embodiment calculates the distance from the host vehicle to the target object based on the relative positional relationship between the target object and the reference object obtained from the captured image of the camera 10, and on the reference distance from the host vehicle to the reference object measured by the radar distance measuring device 20; it converts this into the relative position between the host vehicle and the target object and detects the target object. According to the method of the present embodiment, even for a target object that reflects weakly, such as a distant pedestrian, and cannot be measured by the radar distance measuring device 20, the distance to the target object can be obtained with high accuracy, and the presence of the object can be detected with high accuracy.

  The control unit 101 according to the present embodiment includes an inter-object relative position calculation unit 30, a reference distance measurement unit 40, and a distance calculation unit 50.

Each component of the control unit 101 of the object detection apparatus 100 is described below.

First, the inter-object relative position calculation unit 30 will be described. The inter-object relative position calculation unit 30 includes a feature extraction unit 31 and a movement information calculation unit 32, and calculates, based on the information of the image captured by the camera 10, the relative positional relationship between the target object to be range-measured among the captured objects and a reference object that serves as the reference for distance measurement.

The feature extraction unit 31 extracts, from each frame of image data captured by the camera 10, features including the outline of an object and characteristic parts of the object, in order to observe the movement of the imaged object on the image. The feature extraction unit 31 of the present embodiment reads the image captured by the camera 10 from the image memory 11, binarizes the read image using a predetermined threshold, and extracts the edges of the objects present in the image. Feature portions are then extracted based on the edge components.

  FIG. 5A shows an example of extracted vertical edges. Thinning processing is applied to each extracted edge to narrow the edge width and accurately set the center of the edge (see FIG. 5B). The edge is then expanded in the horizontal direction so that the thinned edge has a constant width, for example a width corresponding to three pixels (see FIG. 5C). By this operation the extracted edges are normalized, and an edge image in which every edge has a uniform width is obtained.

The movement information calculation unit 32 calculates the speed of the pixels of the feature portions obtained from the edges extracted by the feature extraction unit 31. The obtained moving speed and moving direction of each feature portion are stored in association with an imaging timing identifier or frame identifier. This pixel movement information includes the pixel moving speed and the pixel moving direction along with pixel identification information. Note that if a plurality of feature portions exist in one frame of image data, the speed is calculated for all of them.

The movement information calculation unit 32 of the present embodiment counts up, based on the information of the image of the object captured by the camera 10, the count value of each pixel at a position where an edge corresponding to the outline of the object is detected, and calculates the moving speed and direction of the edge based on the slope of the count values, as follows.

  The movement information calculation unit 32 of the present embodiment updates, by a predetermined method, the counter value of the pixel counter of each pixel corresponding to an edge in image data captured at different timings. Here, the pixel counter is a counter set for each pixel: when the pixel corresponds to an edge, the counter value is incremented by +1, and when the pixel does not correspond to an edge, the counter value is set to 0 (initialized). This counter update is performed for every frame captured repeatedly by the camera 10 at the predetermined cycle. As a result, the counter value grows for pixels that correspond to an edge for a long time, and stays small for pixels that correspond to an edge only briefly.

This change in the counter value of the pixel counter represents the moving direction and moving amount of the edge. Therefore, the moving direction and moving speed of the edge on the captured image can be calculated based on this counter value. Further, since the coordinate system of the image represents the azimuth, the moving direction and moving speed of the edge and the feature corresponding to the edge can be obtained based on the counter value.
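For illustration only, the counter update described above can be expressed as a short sketch. This is a minimal illustration, not the implementation disclosed in the patent; it assumes the normalized edge image is held as a boolean NumPy array, and all names are hypothetical.

```python
import numpy as np

def update_pixel_counters(counters, edge_mask):
    """Per-pixel dwell counters: +1 where a (normalized) edge is detected,
    reset to 0 (initialized) where it is not."""
    counters = counters.copy()
    counters[edge_mask] += 1
    counters[~edge_mask] = 0
    return counters

# A pixel that stays on an edge accumulates a large count, which corresponds
# to a slowly moving feature (e.g. the image of a distant stationary object).
counters = np.zeros((4, 4), dtype=int)
frame = np.zeros((4, 4), dtype=bool)
frame[2, 1:4] = True              # a 3-pixel-wide normalized edge
for _ in range(5):
    counters = update_pixel_counters(counters, frame)
print(counters[2])                # [0 5 5 5]
```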

Furthermore, the movement information calculation method of the movement information calculation unit 32 will be described with reference to FIG. 5, which is a diagram explaining the movement information calculation process. Here, the process of acquiring an edge image in which the extracted edges are normalized, and of calculating the moving direction and moving speed from the edge counter values (dwell times), is described.

First, the feature extraction unit 31 performs binarization processing on the edge image. The binarization process is a process in which a pixel at a position where an edge is detected is set to 1 and a pixel at a position where no edge is detected is set to 0.

FIG. 5A shows an example of the binarized image of an extracted vertical edge. Next, as shown in FIG. 5B, thinning processing is performed on the generated binary image. The thinning process reduces the width of each detected edge until a predetermined pixel width is reached; that is, the edge width is narrowed by thinning each extracted edge. In this example, as shown in FIG. 5B, the edge is thinned until its width is one pixel, and by thinning the edge to this predetermined width, its center position is set as the center of the edge. Note that this example shows thinning to one pixel, but the width to which the edge is thinned is not particularly limited.

Next, an expansion process is performed to widen the thinned edge. The expansion process widens the edge from the center position set by thinning, both in the edge moving direction and in the direction opposite to the edge moving direction, so that the edge width becomes constant. In this example, the edge is expanded in the horizontal direction so that the thinned edge has a width corresponding to three pixels. By this process the extracted edges are normalized, and an edge image of uniform width is obtained. Specifically, as shown in FIG. 5C, the edge is expanded by one pixel from the edge center position x0 in the edge moving direction (the positive direction of the x-axis) and by one pixel in the direction opposite to the edge moving direction (the negative direction of the x-axis), giving an edge width of three pixels.

In this way, by performing the thinning process and the expansion process, the edge width of the extracted edge image is normalized to a predetermined width in the edge moving direction.
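As a concrete illustration of the thinning and expansion just described, the following sketch normalizes one row of a binary vertical-edge map to a three-pixel width. It is a minimal sketch under the stated assumptions (one image row, run-based thinning), not the patent's implementation; all names are hypothetical.

```python
import numpy as np

def normalize_edges_row(edge_row):
    """For each run of edge pixels in a row, keep the center pixel (thinning),
    then expand one pixel to each side (expansion), as in FIG. 5(a)-(c)."""
    out = np.zeros_like(edge_row)
    n = len(edge_row)
    i = 0
    while i < n:
        if edge_row[i]:
            j = i
            while j + 1 < n and edge_row[j + 1]:
                j += 1
            c = (i + j) // 2                             # center of the run (thinning)
            out[max(c - 1, 0):min(c + 1, n - 1) + 1] = 1 # expand to 3 pixels
            i = j + 1
        else:
            i += 1
    return out

row = np.array([0, 1, 1, 1, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
print(normalize_edges_row(row))   # [0 0 1 1 1 0 0 1 1 1]: every edge is 3 px wide
```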

Next, the count-up process performed by the movement information calculation unit 32 to calculate movement information will be described. The count-up process referred to here counts up the value of the memory address corresponding to the position of each pixel where an edge is detected, and initializes the value of the memory address corresponding to the position of each pixel where no edge is detected.

Hereinafter, the edge count-up process performed by the movement information calculation unit 32 will be described with reference to FIG. 5. For convenience of explanation, the case where the edge moves in the positive direction of the x-axis is described as an example; the basic processing method is the same when the edge moves in the negative x-axis direction, in the y-axis direction, or two-dimensionally.

As shown in FIG. 5C, in a certain frame the edge has its center at position x0. It is expanded by one pixel from the center in the edge moving direction, to position x0+1, and similarly by one pixel in the opposite direction, to position x0-1.

  The count values of the memory addresses corresponding to the positions where such an edge is detected, "x0-1", "x0", and "x0+1", are each incremented by +1. On the other hand, the count values of the memory addresses corresponding to positions where the edge is not detected are reset.

For example, in FIG. 5D, edges are detected at positions “x0-1”, “x0”, and “x0 + 1” at time t. Therefore, the count value of the memory address corresponding to each position is incremented by “1”. As a result, the count value at the position “x0 + 1” is “1”, the count value at the position “x0” is “3”, and the count value at the position “x0-1” is “5”.

Next, as shown in FIG. 5E, since the edge does not move even at time t + 1, the edge is detected at each of the positions “x0-1”, “x0”, and “x0 + 1”. . Therefore, the count values at the positions “x0-1”, “x0”, and “x0 + 1” are further incremented by one. As a result, the count value at position “x0 + 1” is 2, the count value at position “x0” is 4, and the count value at position “x0-1” is 6.

  Further, as shown in FIG. 5 (f), at time t + 2 the edge has shifted by one pixel in the positive direction of the x-axis, and is detected at positions "x0", "x0 + 1", and "x0 + 2". Therefore, the count values of the memory addresses corresponding to the positions "x0", "x0 + 1", and "x0 + 2" where the edge is detected are counted up, while the count value at the position "x0-1" where no edge is detected is reset to "0". As a result, as shown in FIG. 5 (f), the count value at position "x0 + 2" is 1, the count value at position "x0 + 1" is 3, the count value at position "x0" is 5, and the count value at position "x0-1" is "0".

  As described above, the movement information calculation unit 32 counts up the count value of the memory address corresponding to the position where the edge is detected, and resets the count value of the memory address corresponding to the position where the edge is not detected.

In the description based on FIG. 5, the count value is detected at three positions: the edge center position "x0", the position "x0 + 1" one pixel from the center in the edge moving direction, and the position "x0-1" one pixel from the center in the direction opposite to the edge moving direction. However, as long as the slope of the count values described later can be obtained, the arrangement and number of points at which the count value is detected are not limited; that is, the count value may be detected at any number of positions, provided it is detected at two or more locations along the edge moving direction.

Further, when an object approaches the vehicle at a constant angle, the edge is detected multiple times at the same position across successive frames. For example, in FIG. 5, the edge is detected twice at position x0 in the consecutive frames at time t and time t + 1. Therefore, when the count value of the memory address corresponding to the position where the edge is detected is counted up, the count value correlates with the time (number of frames; the dwell time) during which the edge is detected at that position.

  Next, a method for calculating the moving speed, moving direction, and position of the edge will be described. First, the inclination of the count value is calculated, and the moving speed, moving direction, and position of the edge are calculated based on this inclination.

For example, in the case of FIG. 5E, the count values at the positions “x0-1”, “x0”, and “x0 + 1” are “6”, “4”, and “2”, respectively. When the count value “2” of “x0 + 1” is subtracted from the count value “6” of the position “x0-1”, the slope H of the count value is calculated as H = (6-2) / 2 = 2.

  This means H = {(time since the edge moved to position x0-1) − (time since the edge moved to position x0+1)} / (2 pixels); that is, the time (number of frames) required for the edge to pass through one pixel at position x0 is calculated.

Therefore, the slope H of the count value corresponds to how many frames are required for the edge to move by one pixel, and the edge moving speed 1 / H can be calculated based on the slope H of the count value. In FIG. 5 (e), two frames are required to move one pixel, so the edge moving speed is calculated as 1/2 (pixel / frame).

Next, a method for determining the edge moving direction based on the magnitudes of the count values will be described. Because the edge moves into positions where it was not present before, the count value at a position where the edge is newly detected is 1, the smallest of the count values. Accordingly, the count values are small in the direction in which the edge is moving and large in the opposite direction, and this tendency can be used to determine the moving direction of the edge.

As described above, by counting up the count value of the memory address corresponding to each position where the edge is detected, the moving speed and moving direction of the edge can be calculated based on the slope of the counted-up values.
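For illustration, the slope-based calculation can be condensed into a worked example that reproduces the numbers of FIG. 5(e). This is a minimal sketch of the calculation described above, not the patent's implementation; the function name and argument layout are hypothetical.

```python
def edge_speed_and_direction(count_minus, count_plus, span_px=2):
    """Estimate edge speed and direction from dwell-counter values taken one
    pixel behind (count_minus) and one pixel ahead (count_plus) of the center."""
    H = (count_minus - count_plus) / span_px   # frames needed to move one pixel
    speed = 1.0 / H if H != 0 else 0.0         # pixels per frame
    # The count is smallest where the edge arrived most recently, so motion
    # points from the larger count toward the smaller count.
    direction = 1 if count_minus > count_plus else -1
    return speed, direction

# FIG. 5(e): counts at x0-1, x0, x0+1 are 6, 4, 2 -> H = (6 - 2) / 2 = 2
print(edge_speed_and_direction(6, 2))          # (0.5, 1): 1/2 pixel/frame, +x
```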

The movement information calculation unit 32 also classifies the movement information of the edges present in the captured image into predetermined class values and generates a movement image that represents the features of the movement information. FIG. 6 shows an example of the movement image. As shown in FIG. 6, in the movement image of the present embodiment a pixel of an edge for which movement information was detected is indicated by a circle, and a higher moving speed is indicated by a larger circle, expressing the pixel velocity information. The moving direction is expressed by filled (black) marks for pixels moving to the right and open (white) marks for pixels moving to the left. In FIG. 6, a moving speed toward the left side of the image is detected at the edges corresponding to the pedestrian standing still on the left side of the road, and likewise at the edges corresponding to the stopped vehicle on the left side of the road. This movement image can thus express movement information that includes both speed information and moving direction.

The inter-object relative position calculation unit 30 sets regions that divide the movement image in order to extract objects from the calculated velocity image. That is, as shown in FIG. 7, a plurality of strip-shaped regions are set on the movement image, dividing it into those regions.

  Next, for each region, objects are extracted by grouping runs of pixels that have common movement information and are contiguous in the vertical direction. That is, each region is scanned from the bottom to the top of the image, and when a pixel with movement information is found, the difference in moving speed between it and the adjacent pixel with movement information above it is compared. When the moving speed difference is equal to or less than a threshold value T1, the pixels can be presumed to move at the same speed relative to the vehicle, and they are therefore extracted as the same object.

For example, if the objects are stationary, an object in the foreground moves faster on the image than an object in the background, so a difference in moving speed within a strip-shaped region indicates objects at different relative positions. Therefore, by comparing the movement information, the region occupied by each object can be extracted; and by comparing positions, the relative positional relationship between the target object and the reference object can be grasped, and the presence (position) of each object can be accurately detected.
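For illustration, the grouping within a single strip can be sketched as follows. This is a minimal sketch assuming the per-pixel speeds of one strip are available as a one-dimensional array scanned from bottom to top, with NaN where no movement information was detected; the function name and the value of T1 are hypothetical.

```python
import numpy as np

def group_objects_in_strip(speed_col, T1=0.2):
    """Group vertically contiguous pixels whose speed difference is <= T1;
    each group is extracted as one object."""
    groups, start, prev = [], None, None
    for i, v in enumerate(speed_col):
        if np.isnan(v):                        # no movement information here
            if start is not None:
                groups.append((start, i - 1))
                start = None
        elif start is None:
            start, prev = i, v                 # begin a new group
        elif abs(v - prev) <= T1:
            prev = v                           # same object: similar speed
        else:
            groups.append((start, i - 1))      # speed jump: different object
            start, prev = i, v
    if start is not None:
        groups.append((start, len(speed_col) - 1))
    return groups

col = np.array([1.0, 1.1, 1.05, np.nan, 0.3, 0.35])  # foreground fast, background slow
print(group_objects_in_strip(col))             # [(0, 2), (4, 5)]: two objects
```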

  In addition, according to this configuration, objects can be extracted without specifying in advance what kind of object is to be detected. Moreover, because the moving speed is calculated by measuring the time a feature point stays at a given position, with the movement between frames limited to one pixel or less, the inter-frame correlation processing normally needed to identify objects is eliminated, enabling high-speed arithmetic processing. High-speed processing is also possible because the computation proceeds sequentially, pixel by pixel, without iterative (recursive) processing. Further, in the movement information calculated by the movement information calculation unit 32, stationary solid objects at different relative positions yield different moving speeds, and a moving object and a stationary object yield different moving speeds even at the same relative position, so the attributes of an object (solid body, planar body, stationary object, moving object, and so on) can be determined based on this information. Objects can thus be extracted while their relative positions and attributes are grasped.

Next, the reference distance measurement unit 40 will be described. The reference distance measurement unit 40 according to the present embodiment obtains the reference distance between the reference object selected as the reference for distance measurement and the host vehicle, based on the received signal acquired by the radar distance measuring device 20. The reference distance measurement unit 40 of the present embodiment acquires, from the distance measurement data memory 21, the reference distance between the host vehicle and the reference object selected as the reference for distance measurement from among the distances between the vehicle and objects measured by the radar distance measuring device 20.

In addition, in order to obtain a single reference distance from the distance information measured by the radar distance measuring device 20, the reference distance measurement unit 40 includes a reference object selection unit 41 that selects, from among the objects whose distance from the host vehicle has been measured, a "reference object" judged appropriate as the reference for distance measurement.

The reference object selection unit 41 selects the "reference object" based on the reliability of the distance measurement of the radar distance measuring device 20 and the reliability of the relative position calculation of the inter-object relative position calculation unit 30, and includes a reliability determination unit 411 that determines these two reliabilities. The reliability of the distance measurement of the radar distance measuring device 20 includes the reliability based on the ranging performance of the radar distance measuring device 20; this reliability can be acquired from the laser data 20a stored in advance on the radar distance measuring device 20 side. The reliability of the relative position calculation includes the reliability based on the imaging performance of the camera 10; this reliability can be acquired from the camera data 10a stored in advance on the camera 10 side.

First, the reliability determination unit 411 obtains the reliability of the relative position calculation of the inter-object relative position calculation unit 30. The reliability determination unit 411 according to the present embodiment determines this reliability based on the movement information of the pixels corresponding to the object image included in the image captured by the camera 10. The movement information of a pixel corresponding to the object image, for example an edge (feature portion) of the object image, indicates the number of frames over which the edge moves by one pixel, that is, the count accumulated until the movement. A larger edge accumulation improves the S/N, so the reliability can be said to increase. Regions with a large edge accumulation correspond to regions where the pixel moving speed is slow, which, in terms of the moving speed of a stationary object, corresponds to regions of the object image far from the host vehicle. That is, since the reliability of the movement information increases with the count accumulated before movement, the reliability increases with the distance from the host vehicle. FIG. 8 shows the relationship between the distance to the target object imaged by the camera 10 and the reliability: as shown in FIG. 8, the reliability (the degree of accumulation) increases with distance. As the reliability of the image captured by the camera 10 increases, so does the reliability of the relative positional relationship, calculated from that image information, between the target object to be range-measured and the reference object serving as the reference for distance measurement.

Next, the reliability determination unit 411 calculates the reliability of the distance measurement of the radar distance measuring device 20. The reliability determination unit 411 according to the present embodiment determines this reliability based on the intensity of the received signal acquired by the radar distance measuring device 20. The distance measurement of the radar distance measuring device 20 is considered to be accurate when the reception intensity of the received signal is high. Since, from the radar equation, the received intensity is inversely proportional to the square of the distance, the reliability of the distance information is inversely proportional to the square of the distance. FIG. 8 shows the relationship between the distance to the target object measured by the radar distance measuring device 20 and the reliability: as shown in FIG. 8, the reliability of the distance measurement decreases as the distance increases.

The reliability determination unit 411 determines the region in which to search for a reference object from the two reliabilities shown in FIG. 8. While the reliability of the relative positional relationship based on pixel movement information is proportional to the distance, the reliability of the distance measurement based on the received intensity is inversely proportional to the square of the distance. The reliability determination unit 411 therefore determines, as the region in which to search for the reference object, a region in which neither reliability becomes extremely low. In the present embodiment, as shown in FIG. 8, a search region of predetermined width is set around the distance at which the two reliability functions intersect. For example, when a camera with a frame rate of 120 fps is used, the search region is an arbitrarily set measurement distance range centered on, for example, 50 m, such as 45 m to 55 m or 30 m to 60 m.

Then, the reference object selection unit 41 selects as the reference object an object that exists within the measurement distance range belonging to this search region. For example, an object is selected as the reference object if it exists both in a measurement distance range in which the reliability of the distance measurement meets a predetermined threshold and in a measurement distance range in which the reliability of the relative position calculation meets the predetermined threshold.
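For illustration, the construction of the search region can be shown numerically. The sketch below assumes, purely for illustration, an image-side reliability proportional to distance and a radar-side reliability inversely proportional to the square of the distance, following the trends described for FIG. 8; the constants, function names, and window width are hypothetical.

```python
import numpy as np

def image_reliability(d, k_img=1.0):
    return k_img * d                  # grows with distance (edge accumulation)

def radar_reliability(d, k_radar=125000.0):
    return k_radar / d**2             # falls with the square of the distance

def reference_search_range(half_width=5.0):
    """Distance window of predetermined width centered on the crossover
    of the two reliability functions."""
    d = np.linspace(1.0, 200.0, 2000)
    crossing = d[np.argmin(np.abs(image_reliability(d) - radar_reliability(d)))]
    return crossing - half_width, crossing + half_width

lo, hi = reference_search_range()
print(f"{lo:.0f} m to {hi:.0f} m")    # about 45 m to 55 m: a 50 m-centered window
```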

  Further, the reference object selection unit 41 can select a stationary object as the reference object by paying attention to whether an object is a stationary body or a moving body. In particular, it can select as the reference object a stationary object whose distance the radar distance measuring device 20 has been able to measure continuously since the object was at least a predetermined distance from the host vehicle.

The reference object selection unit 41 of the present embodiment includes a stationary object determination unit 412 and determines, based on the received signal from the object, whether the object selected as the reference object is a stationary object. The reference object selection unit 41 refers to this determination result and preferentially selects a stationary object as the reference object. The method for determining whether an object is stationary is not particularly limited; when the host vehicle is running, a stationary object has the characteristic that its image moves slowly when it is distant and quickly when it is near, and it can be extracted on that basis. Further, if the vehicle speed of the host vehicle is used, the moving speed of the image of a stationary object can be estimated, so stationary objects can be extracted based on the movement information.

Further, the reference object selection unit 41 of the present embodiment includes a detection duration determination unit 412 and determines, based on the received signal from the object, whether the distance to the object selected as the reference object has been measured continuously since the object was at least a predetermined distance from the host vehicle. The reference object selection unit 41 refers to this determination result and preferentially selects as the reference object an object whose distance has been measured continuously since it was far away.

  Next, the distance calculation unit 50 will be described.

  When the radar distance measuring device 20 cannot measure the distance to the target object, the distance calculation unit 50 calculates the distance from the host vehicle to the target object based on the relative positional relationship between the target object and the reference object calculated by the inter-object relative position calculation unit 30, and on the reference distance from the host vehicle to the reference object determined by the reference distance measurement unit 40. That is, using the reference distance from the host vehicle to the reference object, the relative positional relationship between the target object and the reference object is converted into the positional relationship between the host vehicle and the target object, and the distance is thereby calculated.

The distance calculation unit 50 includes a same object determination unit 51 that determines whether the reference objects are the same object, in order to obtain the positional relationship between the host vehicle and the target object.

The same object determination unit 51 determines whether the first reference object, involved in the relative positional relationship calculated based on the image information of the camera 10, and the second reference object, for which the reference distance was obtained based on the received signal of the radar ranging device 20, are the same object. When the first reference object and the second reference object are the same object, the same object determination unit 51 converts the relative positional relationship between the target object and the reference object into the positional relationship between the host vehicle and the target object, using the reference distance from the host vehicle to the reference object.

  The same object determination unit 51 determines identity by comparing the positions of the reference objects. In other words, the same object determination unit 51 compares the position of the first reference object, whose relative positional relationship was calculated by the inter-object relative position calculation unit 30, with the position of the second reference object, selected as the reference object by the reference distance measurement unit 40, to determine whether the first reference object and the second reference object are the same object.

A specific determination method will be described with reference to FIG. 9.

First, the same object determination unit 51 obtains, from the reference distance measurement unit 40, the distance Z1 (m) to the object selected as the reference object by the reference object selection unit 41 (the second reference object) and the azimuth information θ (rad) of that reference object.

Next, as shown in FIG. 9, the same object determination unit 51 determines which of the objects whose relative positional relationships were calculated by the inter-object relative position calculation unit 30 matches the reference object selected by the reference object selection unit 41.

  Here, let the height of the camera 10 from the road surface be Ch (m), the depression angle of the camera 10 be Tr (rad), the vertical size of the image be Ih, the horizontal size of the image be Iw, the angular resolution per pixel in the height direction be PYr (rad), and the angular resolution per pixel in the horizontal direction be PXr (rad). Then the x and y coordinates of the reference object on the image are calculated from the azimuth θ (rad) and the distance Z1 (m) by the following formulas. Information such as the height of the camera is acquired from the camera data 10a stored on the camera 10 side.

(Formula 1) x = θ / PXr + (Iw / 2)
(Formula 2) y = (ATAN(Ch / Z1) − Tr) / PYr + (Ih / 2)
That is, based on the acquired azimuth information of the second reference object, the position of the second reference object is calculated in the coordinates of the image for which the inter-object relative position calculation unit 30 calculated the relative positional relationships. By comparing this position of the second reference object with the position of the first reference object in that image, the identity of the first reference object and the second reference object can be determined.

In the present embodiment, the vertical position (y) of the lower end region of the second reference object is calculated from the reference distance of the second reference object selected as the reference object. Then, in the image for which the inter-object relative position calculation unit 30 calculated the relative positional relationships, candidates for the first reference object whose lower end regions lie in the vicinity of the calculated lower end position of the second reference object are searched for. Among the candidates found, a first reference object for which the difference between the vertical position of its lower end region and that of the second reference object is equal to or less than a predetermined value is extracted, and the extracted first reference object is determined to be the same object as the second reference object selected as the reference object by the reference distance measurement unit 40.

Specifically, as shown in FIG. 9, objects are searched for on the x-coordinate calculated from the azimuth information of the second reference object. The lower end position y1 of each object found is compared with the lower end position y of the second reference object calculated from the reference distance, and the object with the smallest difference between y1 and y is determined to be the first reference object identical to the second reference object.
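For illustration, this matching step can be sketched compactly: Formulas 1 and 2 project the radar-selected second reference object into image coordinates, and the candidate whose lower end is nearest the projected position is taken as the same object. This is a minimal sketch, not the patent's implementation; the camera parameters, tolerance, and candidate list are hypothetical.

```python
import math

def project_reference(theta, Z1, Ch, Tr, Ih, Iw, PXr, PYr):
    """Image coordinates of the second reference object (Formulas 1 and 2)."""
    x = theta / PXr + Iw / 2
    y = (math.atan(Ch / Z1) - Tr) / PYr + Ih / 2
    return x, y

def match_reference(candidates, x_ref, y_ref, x_tol=5.0):
    """Among candidates (x, y_lower_end) near the computed x-coordinate,
    pick the one whose lower end is closest to y_ref."""
    near = [c for c in candidates if abs(c[0] - x_ref) <= x_tol]
    return min(near, key=lambda c: abs(c[1] - y_ref)) if near else None

# Hypothetical parameters: Ch = 1.2 m, Tr = 0, 640 x 480 image, 0.001 rad/pixel
x, y = project_reference(theta=0.02, Z1=50.0, Ch=1.2, Tr=0.0,
                         Ih=480, Iw=640, PXr=0.001, PYr=0.001)
print(match_reference([(338.0, 262.0), (341.0, 290.0)], x, y))  # (338.0, 262.0)
```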

Furthermore, when the same object determination unit 51 determines that the first reference object, whose relative positional relationship was calculated by the inter-object relative position calculation unit 30, and the second reference object, selected by the reference distance measurement unit 40, are both stationary objects, it determines that the first reference object and the second reference object are the same object. Even when a moving object and a stationary object are at the same relative position, different moving speeds are calculated for them, so whether an object is a stationary object can be determined based on the movement information.

Based on the movement information of the first reference object and the movement information of the second reference object, the same object determination unit 51 of the present embodiment determines whether each is a stationary object. If both are determined to be stationary objects, they are determined to be the same object, and the position of the single reference object (the y coordinate y1 of its lower end region) can be acquired.

As described above, the distance calculation unit 50 identifies a common reference object and obtains y1, the position of the reference object (the y coordinate of its lower end region) needed to obtain the positional relationship between the host vehicle and the target object.

Subsequently, the distance calculation unit 50 calculates the distance between the target object that cannot be measured by the radar distance measuring device 20 and the host vehicle. As shown in FIG. 10, the distance calculation unit 50 of the present embodiment calculates distance information of an object that is extracted from an image captured by the camera 10 and cannot be measured by the radar distance measuring device 20.

First, based on the lower end position y1 of the reference object determined to be the same object by the same object determination unit 51 and the reference distance of the reference object selected by the reference object selection unit 41, the current depression angle Tr1 (rad) of the camera 10, which changes with vehicle behavior such as pitching, is calculated from the following equation.

(Formula 3) Tr1 = ATAN(Ch / Z1) − (y1 − Ih / 2) × PYr
Next, given the position y2 of the lower end region of the target object P whose distance information is to be acquired, the distance Z2 from the host vehicle is calculated by the following equation.

(Formula 4) Z2 = Ch / TAN(Tr1 + (y2 − Ih / 2) × PYr)
The distance between the host vehicle and the target object can be obtained by the above arithmetic processing.
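For illustration, Formulas 3 and 4 can be written directly in code. The sketch below uses hypothetical parameter values (Ch = 1.2 m, Z1 = 50 m, a 480-pixel-high image, PYr = 0.001 rad/pixel) and Formula 3 in the form given above; it is a minimal sketch, not the patent's implementation.

```python
import math

def current_depression_angle(Ch, Z1, y1, Ih, PYr):
    """Formula 3: current depression angle Tr1, corrected for pitching, from
    the reference object's lower-end row y1 and its radar reference distance Z1."""
    return math.atan(Ch / Z1) - (y1 - Ih / 2) * PYr

def distance_from_lower_end(Ch, Tr1, y2, Ih, PYr):
    """Formula 4: distance Z2 to a target object from its lower-end row y2."""
    return Ch / math.tan(Tr1 + (y2 - Ih / 2) * PYr)

Tr1 = current_depression_angle(Ch=1.2, Z1=50.0, y1=260.0, Ih=480, PYr=0.001)
Z2 = distance_from_lower_end(Ch=1.2, Tr1=Tr1, y2=255.0, Ih=480, PYr=0.001)
print(round(Tr1, 4), round(Z2, 1))   # 0.004 rad, 63.2 m: a smaller y2 (higher
                                     # in the image) means a farther object
```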

  When the distance between the host vehicle and the target object is obtained in this way, a detection result such as "the target object exists" or "the target object exists at a distance Z2 from the host vehicle" can be obtained. According to the object detection device 100 of the present embodiment, the distance to a target object that cannot be detected by the radar distance measuring device 20 can be calculated and its presence detected. That is, even when the target object is a distant object or a weakly reflecting object such as a pedestrian, for which the radar distance measuring device 20 may obtain no received signal and thus cannot detect the object, the presence of such a target object can be detected.

  Further, the distance between the host vehicle and the target object obtained by the above arithmetic processing is output to the output device 300, the driving support device 200, and the like. The output device 300 outputs the distance between the host vehicle and the target object and the presence of the target object as the object detection result. The driving support device 200 outputs information for supporting the driving of the host vehicle and performs vehicle control based on the distance between the host vehicle and the target object.

    Subsequently, the object detection processing procedure of the object detection apparatus 100 according to the present embodiment will be described with reference to FIG. 11. FIG. 11 is a flowchart showing the object detection processing procedure of the object detection apparatus 100 of the present embodiment.

When an ignition switch (not shown) is turned on and the in-vehicle device 1000 is activated, this processing program is executed.

  First, the camera 10 captures images of the surroundings of the vehicle at a predetermined cycle (S101). The captured images are stored in the image memory 11. The feature extraction unit 31 performs edge extraction processing on the image, extracts the contours of the objects present in the captured image as an edge image, and normalizes the edge image (S102). The movement information calculation unit 32 then calculates the edge movement information (S103), and a movement image (see FIG. 6) representing the calculated movement information in predetermined gradations is created.

In step S104, the movement information calculation unit 32 sets strip regions for object detection on the generated movement image (S104). Subsequently, in step S105, the inter-object relative position calculation unit 30 scans each strip region from bottom to top for pixels having movement information. When a pixel with movement information is found, the difference in moving speed between it and the adjacent pixel with movement information above it is compared; if the difference is equal to or less than the threshold value, the pixels are grouped as corresponding to the same object, and objects are extracted (S105).

  In step S106, it is determined whether all the objects included in the image have been extracted (S106). If extraction of all objects is completed, the process proceeds to step S107. If extraction is not completed, the process returns to step S105.

In step S107, the reference object selection unit 41 selects a reference object based on the reference distance measured by the radar distance measuring device 20. The reference object selection unit 41 selects the reference object based on the reliability of the distance measurement, obtained by referring to the intensity of the received signal acquired via the radar distance measuring device 20, and on the reliability of the relative position calculation, obtained by referring to the moving speed, computed by the inter-object relative position calculation unit 30, of the pixels corresponding to the target object image included in the captured image. Specifically, an object within a measurement distance range of predetermined width, centered on the distance at the intersection of the function indicating the reliability of the relative position calculation derived from the edge accumulation (see FIG. 8) and the function indicating the reliability of the distance measurement derived from the intensity of the received signal, is selected as the reference object. Further, a stationary object whose distance the radar distance measuring device 20 has been able to measure continuously from far away is selected as the reference object, and the distance to the reference object and its azimuth information are acquired. After this, the flow moves to step S108.

In step S108, it is determined which of the objects imaged by the camera 10 and whose relative positional relationships were calculated by the inter-object relative position calculation unit 30 matches the object selected by the reference object selection unit 41. The same object determination unit 51 calculates the xy coordinates on the image of the second reference object from the distance and azimuth information of the reference object (second reference object) selected by the reference object selection unit 41. The object that exists on the x-coordinate calculated from the azimuth information and whose lower end region has the smallest difference in y-coordinate is determined to be the candidate for the same reference object.

In step S109, the same object determination unit 51 determines from the movement information of the same-reference-object candidate whether it is a stationary object. If the candidate is determined to be a stationary object, it is determined to be the same object and selected as the reference object, and the y coordinate of the lower end region of the reference object is acquired (S109).

In step S110, the current depression angle of the camera is calculated from the coordinates of the lower end region of the reference object determined to be the same object (S110).

In step S111, the calculated current depression angle of the camera is used to calculate the distance between the host vehicle and a target object that cannot be measured by the radar distance measuring device 20. Specifically, the position of the lower end region of the target object is acquired, and the distance between the host vehicle and the target object, indicating their relative position, is calculated (S111). The distance between the host vehicle and the target object obtained in S111 is output to the external output device 300, the driving support device 200, and the like.

In step S112, it is determined whether the distances between all target objects and the host vehicle have been calculated (S112).

If the calculation of the distance from the host vehicle is completed for all target objects, the process proceeds to step S113, and if not completed, the process returns to step S111.

In step S113, it is determined whether or not the ignition switch of the own vehicle is turned off (S113).

If the ignition switch is not turned off, the process returns to step S101 and is repeated. On the other hand, if the ignition switch is turned off, the flow moves to step S114 and the process ends (S114).

Since the object detection apparatus 100 according to the present embodiment is configured and operates as described above, the following effects are obtained.

  When the radar distance measuring device 20 cannot measure the distance to the target object, the object detection device 100 according to the present embodiment calculates the distance from the host vehicle to the target object based on the relative positional relationship between the target object and the reference object calculated from the information of the image captured by the camera 10, and on the reference distance from the host vehicle to the reference object measured from the received signal acquired by the radar distance measuring device 20. Because the relative positional relationship between the target object and the reference object is converted into a relative positional relationship with the host vehicle with reference to the position of the reference object, the distance to a distant target object that cannot be measured directly from the radar received signal can be calculated with high accuracy. That is, even when the target object to be range-measured cannot be measured directly by the radar distance measuring device 20, such as a weakly reflecting pedestrian or an object existing outside the radar detection area, the distance from the host vehicle to the target object can be calculated with high accuracy and the object can be detected accurately.

  Further, by selecting the reference object serving as the reference for distance measurement based on the reliability of the distance measurement of the radar distance measuring device 20 and the reliability of the relative position calculation of the inter-object relative position calculation unit 30, a reference object is selected that takes into account both the reliability of the captured image of the camera 10 (the information of the captured image) and the reliability of the distance measurement of the radar distance measuring device 20, which show different reliability trends with respect to distance. Highly accurate ranging and object detection can therefore be performed.

  In particular, the reference distance measurement unit 40 obtains the reliability of the distance measurement by referring to the intensity of the received signal, and obtains the reliability of the relative position calculation of the inter-object relative position calculation unit 30 by referring to the moving speed of the pixels corresponding to the target object image captured by the camera 10, for example the accumulated count values of the pixels. A reference object with high detection accuracy can thus be selected based on reliabilities suited to the characteristics of the image information and the received signal, so highly accurate ranging and object detection can be performed.

  Further, by having the reference distance measurement unit select as the reference object an object that exists in a measurement distance range in which the reliability of the distance measurement meets a predetermined threshold and in a measurement distance range in which the reliability of the relative position calculation meets the predetermined threshold, both reliabilities can be maintained. Although the reliability of the distance measurement and the reliability of the relative position calculation show different trends with respect to distance, the reference object can thereby be selected from a range of distances in which neither reliability becomes extremely low. By setting a range in which a certain degree of reliability is ensured, an object detected with high accuracy by both the camera 10 and the radar distance measuring device 20 can be selected as the reference object. As a result, an accurate reference object suitable for converting the relative positional relationship between the target object and the reference object into a relative position between the host vehicle and the target object can be detected, and accurate ranging and object detection can be performed.

  In addition, since the reference distance measurement unit 40 selects as the reference object an object whose distance has been measured continuously by the radar distance measuring device 20 from the state in which the object was separated from the host vehicle by a predetermined distance or more, an object with high measurement accuracy can be used as the reference object, and highly accurate ranging and object detection can be performed.
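
This continuity condition could be expressed, for example, as follows; the track fields and the 80 m threshold are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class RadarTrack:
    first_range_m: float  # distance at which the track first appeared
    dropouts: int         # cycles in which ranging failed since then

def continuously_tracked(track, min_first_range_m=80.0):
    # Accept a radar track as a reference-object candidate only if it
    # has been ranged every cycle since first being detected at least
    # min_first_range_m away.
    return track.first_range_m >= min_first_range_m and track.dropouts == 0
```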

  In addition, by selecting as the reference object a stationary object, whose position can be compared accurately among the detected objects, the correspondence between the reference object in the inter-object relative position calculated by the inter-object relative position calculation unit 30 and the reference object at the reference distance obtained by the reference distance measurement unit 40 can be determined with high accuracy. When the distance calculation unit 50 calculates the distance between the host vehicle and the target object, it must determine that the reference object in the inter-object positional relationship and the reference object at the reference distance are one and the same object; using a stationary object makes it possible to determine this identity with high accuracy and to extract the same reference object. Highly accurate ranging and object detection can thereby be performed.
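
A stationary-object check could, for instance, compare the measured pixel speed against the speed expected for a static point under the host vehicle's ego motion; predicting that expected speed (from ego motion and ground geometry) is assumed here and not shown:

```python
def is_stationary(pixel_speed, expected_static_speed, tol=0.3):
    # For a stationary object, the optical-flow speed of its pixels is
    # fully explained by the host vehicle's own motion; a moving object
    # deviates from that prediction even at the same relative position.
    # Speeds are in pixels/frame; the tolerance is an assumed value.
    return abs(pixel_speed - expected_static_speed) <= tol
```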

  Further, in calculating the distance between the host vehicle and the target object, whether the two reference objects are the same object is determined based on the position of the first reference object, for which the relative positional relationship was calculated by the inter-object relative position calculation unit 30, and the position of the second reference object, which was selected as the reference object by the reference distance measurement unit 40. The relative positional relationship between the target object and the first reference object can then be converted accurately into the positional relationship between the host vehicle and the target object based on the reference distance to the second reference object. Highly accurate ranging and object detection can thereby be performed.

  In particular, when determining whether the first reference object and the second reference object are the same object, the distance calculation unit 50 according to the present embodiment converts a predetermined position of the second reference object into coordinates on the image based on the distance and azimuth information of the second reference object, and compares the positional relationship between the second reference object and a first reference object existing in the vicinity of the converted coordinates. The region in which the reference object exists can therefore be narrowed down, among the objects included in the image captured by the camera 10, using the distance and azimuth information of the radar distance measuring device 20, whose measurement accuracy is high, and the object common to the reference object at the reference distance can be identified within the relative positional relationship. Highly accurate ranging and object detection can thereby be performed.
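
A sketch of this projection-and-gating step, assuming a pinhole camera and a flat road; the intrinsics, camera height, and tolerances are illustrative values, not taken from the embodiment:

```python
import math
from dataclasses import dataclass

@dataclass
class ImageObject:
    bottom_col: float  # horizontal pixel position of the lower end region
    bottom_row: float  # vertical pixel position of the lower end region

def match_radar_to_image(radar_dist_m, radar_azimuth_rad, image_objects,
                         focal_px=800.0, cx=640.0, cy=360.0,
                         cam_height_m=1.2, col_tol_px=20.0, row_tol_px=8.0):
    """Project the radar-ranged (second) reference object into the image,
    search for image-derived (first) reference candidates near that
    horizontal position, then gate on the lower-end vertical position."""
    lateral = radar_dist_m * math.sin(radar_azimuth_rad)
    ahead = radar_dist_m * math.cos(radar_azimuth_rad)
    col = cx + focal_px * lateral / ahead        # expected horizontal position
    row = cy + focal_px * cam_height_m / ahead   # expected lower end (flat road)
    near = [o for o in image_objects if abs(o.bottom_col - col) <= col_tol_px]
    same = [o for o in near if abs(o.bottom_row - row) <= row_tol_px]
    return same[0] if same else None
```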

  Furthermore, since the first reference object for which the relative positional relationship was calculated and the second reference object for which the reference distance was calculated are judged to be the same object only when both are determined to be stationary objects, the comparison of their positions or of their movement information can be performed accurately. The identity can therefore be determined with high accuracy and the same reference object extracted reliably, so that highly accurate ranging and object detection can be performed.

  Further, in calculating the relative positional relationship between the target object and the reference object, the inter-object relative position calculation unit 30 counts up the count values of the pixels corresponding to the feature portions of the image captured by the camera 10, calculates the movement information of the pixels corresponding to the features based on the slope of the count values, and calculates the relative positional relationship between the objects based on that movement information. With this configuration, a specific position of an object, for example its lower end position, can be detected accurately by measuring the movement information of the feature points extracted from the image captured by the camera 10, so the relative positional relationship between objects can be calculated over a wide range. It is also possible to judge whether objects are identical based on the position information and the movement information of the feature points, that is, to determine identity with high accuracy from the lower end position of an object and the movement information of that lower end position. Furthermore, since different moving speeds are calculated for a moving object and a stationary object even when their relative positions are the same, whether an object is stationary can be determined accurately.
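
A sketch of this count-up scheme follows, assuming the detected edge is broadened to a few pixels so that the accumulated counts form a ramp; the array-based formulation and the least-squares slope fit are one possible realization, not necessarily the embodiment's:

```python
import numpy as np

def update_counts(counts, edge_mask):
    # Pixels lying on a detected feature (edge) accumulate a count each
    # frame; pixels off the feature are reset. As the edge moves, the
    # surviving counts form a ramp across the edge width.
    counts[edge_mask] += 1
    counts[~edge_mask] = 0
    return counts

def speed_from_counts(counts_row):
    # The slope of the count ramp along an image row is the number of
    # frames the edge needed per pixel, so pixel speed ~ 1 / slope
    # (pixels per frame; the sign encodes the movement direction).
    xs = np.flatnonzero(counts_row > 0)
    if xs.size < 2:
        return 0.0
    slope = np.polyfit(xs, counts_row[xs].astype(float), 1)[0]
    return 0.0 if slope == 0 else 1.0 / slope
```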

  The embodiment described above is provided to facilitate understanding of the present invention and is not intended to limit it. Accordingly, each element disclosed in the above embodiment is intended to encompass all design changes and equivalents that fall within the technical scope of the present invention.

Brief description of the drawings

A diagram showing an example of the block configuration of the vehicle-mounted apparatus 1000 including the object detection device 100 of the present embodiment.
(A) and (B) are diagrams showing mounting examples of the camera 10 and the radar ranging device 20.
An example of an image of the area ahead of the vehicle captured by the camera 10.
A diagram for explaining the operation of the object detection device of the present embodiment.
(a) to (f) are diagrams for explaining the process of calculating the moving speed.
A diagram showing an example of a moving image.
A diagram for explaining an example of a method of detecting an object from the moving image.
A diagram for explaining how the region in which a reference object is searched for is determined based on the reliabilities of different indexes.
A diagram for explaining an example of a method of determining the identity of a reference object.
A diagram for explaining an example of a method of calculating the distance from the host vehicle to the target object.
A flowchart showing the control procedure of the object detection device of the present embodiment.

Explanation of symbols

1000 ... Vehicle-mounted apparatus
100 ... Object detection device
10 ... Camera
11 ... Image memory
20 ... Radar distance measuring device
30 ... Inter-object relative position calculation unit
31 ... Feature extraction unit
32 ... Movement information calculation unit
40 ... Reference distance measurement unit
41 ... Reference object selection unit
50 ... Distance calculation unit
51 ... Same object determination unit

Claims (11)

  1. An object detection apparatus comprising:
    inter-object relative position calculating means for calculating a relative positional relationship between a target object that is a distance measurement target and a reference object that serves as a distance measurement reference, based on information of an image captured by imaging means mounted on a host vehicle;
    reference distance measuring means for obtaining a reference distance between the reference object selected as the reference for the distance measurement and the host vehicle, based on a received signal acquired by radar distance measuring means mounted on the host vehicle; and
    distance calculating means for calculating, when the radar distance measuring means cannot measure the distance to the target object, the distance from the host vehicle to the target object based on the relative positional relationship between the target object and the reference object calculated by the inter-object relative position calculating means and on the reference distance between the reference object and the host vehicle obtained by the reference distance measuring means.
  2. The object detection apparatus according to claim 1,
    wherein the reference distance measuring means selects the reference object based on a reliability related to distance measurement by the radar distance measuring means and a reliability related to relative position calculation by the inter-object relative position calculating means.
  3. The object detection device according to claim 2,
    wherein the reference distance measuring means selects the reference object based on the reliability related to the distance measurement, obtained by referring to the intensity of the received signal received by the radar distance measuring means, and the reliability related to the relative position calculation, obtained by referring to the moving speed of a pixel corresponding to an object image included in the image captured by the imaging means.
  4. The object detection device according to claim 3,
    wherein the reference distance measuring means selects, as the reference object, an object that exists in a measurement distance range in which the reliability related to the distance measurement, obtained with reference to the intensity of the received signal, is equal to or greater than a predetermined threshold, and that exists in a measurement distance range in which the reliability related to the relative position calculation, obtained with reference to the moving speed of a pixel corresponding to an object image included in the image captured by the imaging means, is equal to or greater than a predetermined threshold.
  5. In the object detection device according to any one of claims 1 to 4,
    wherein the reference distance measuring means selects, as the reference object, an object whose distance has been continuously measured by the radar distance measuring means from a state in which the object is separated from the host vehicle by a predetermined distance or more.
  6. In the object detection device according to any one of claims 1 to 5,
    wherein the reference distance measuring means selects, as the reference object, an object that is determined to be a stationary object based on the moving speed of a pixel corresponding to an object image included in the image captured by the imaging means.
  7. In the object detection device according to any one of claims 1 to 6,
    wherein the distance calculating means determines whether a first reference object, for which the relative positional relationship is calculated by the inter-object relative position calculating means, and a second reference object, selected as the reference object by the reference distance measuring means, are the same object based on the position of the first reference object and the position of the second reference object.
  8. The object detection apparatus according to claim 7,
    wherein, in determining whether the first reference object and the second reference object are the same object, the distance calculating means:
    calculates, based on azimuth information of the second reference object, the horizontal position of a lower end region of the second reference object on the image from which the relative positional relationship is calculated by the inter-object relative position calculating means;
    searches, among the objects included in that image, for first reference object candidates whose lower end regions exist in the vicinity of the calculated horizontal position of the lower end region of the second reference object;
    extracts, from the searched first reference object candidates, a first reference object for which the difference between the vertical position of its lower end region and the vertical position of the lower end region of the second reference object is equal to or less than a predetermined value; and
    determines that the extracted first reference object is the same object as the second reference object.
  9. In the object detection device according to claim 7 or 8,
    wherein, in determining whether the first reference object and the second reference object are the same object, the distance calculating means determines that the first reference object and the second reference object are the same object when the first reference object, for which the relative positional relationship is calculated by the inter-object relative position calculating means, and the second reference object, selected by the reference distance measuring means, are both further determined to be stationary objects.
  10. In the object detection device according to any one of claims 1 to 9,
    wherein the inter-object relative position calculating means extracts a feature corresponding to an object based on information of each image of the object captured at a predetermined period by the imaging means, counts up a count value of the pixel corresponding to the position of the extracted feature, calculates movement information of the pixel corresponding to the feature based on the slope of the count values, extracts a presence region of each object by comparing the calculated movement information of the pixels, and calculates the relative positional relationship between the target object and the reference object by comparing the positions of the presence regions.
  11. A distance measuring method for measuring a distance between a target object that is a distance measurement target and a host vehicle,
    wherein one of (i) the distance between the target object and the host vehicle calculated based on a received signal for electromagnetic waves emitted from the host vehicle side and (ii) the distance between the target object and the host vehicle calculated based on a relative positional relationship between the target object and a reference object serving as a distance measurement reference, calculated from information of captured images around the host vehicle, together with a reference distance between the reference object and the host vehicle calculated based on the received signal for the electromagnetic waves emitted from the host vehicle side, is selected according to the intensity of the received signal for the electromagnetic waves emitted from the host vehicle side when the distance between the target object and the host vehicle is measured.
JP2008025021A 2008-02-05 2008-02-05 Object detection apparatus and distance measuring method Active JP5145986B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008025021A JP5145986B2 (en) 2008-02-05 2008-02-05 Object detection apparatus and distance measuring method

Publications (2)

Publication Number Publication Date
JP2009186260A JP2009186260A (en) 2009-08-20
JP5145986B2 2013-02-20

Family

ID=41069659

Country Status (1)

Country Link
JP (1) JP5145986B2 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4883246B2 (en) * 2009-12-08 2012-02-22 トヨタ自動車株式会社 Object detection apparatus and object detection method
JP5659587B2 (en) * 2010-07-09 2015-01-28 富士通株式会社 Radar device, roadside device, and in-vehicle device
KR101665388B1 (en) 2011-01-20 2016-10-12 한화테크윈 주식회사 Method for controlling camera
JP5651642B2 (en) * 2012-07-18 2015-01-14 本田技研工業株式会社 Object position detection device
US9405006B2 (en) * 2012-09-03 2016-08-02 Toyota Jidosha Kabushiki Kaisha Collision determination device and collision determination method
EP2894618B1 (en) * 2012-09-03 2016-10-26 Toyota Jidosha Kabushiki Kaisha Speed calculating device and speed calculating method, and collision determination device
JP5783163B2 (en) * 2012-12-03 2015-09-24 株式会社デンソー Target detection device
JP5790627B2 (en) * 2012-12-03 2015-10-07 株式会社デンソー Target detection device
JP2016057066A (en) * 2014-09-05 2016-04-21 パイオニア株式会社 Identifying device, travel lane identifying method, and travel lane identifying program

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3948450B2 (en) * 2003-10-20 2007-07-25 日産自動車株式会社 Object detection apparatus and object detection method
JP2007187618A (en) * 2006-01-16 2007-07-26 Omron Corp Object identifying device
JP2007240314A (en) * 2006-03-08 2007-09-20 Omron Corp Object detector
JP4857839B2 (en) * 2006-03-22 2012-01-18 日産自動車株式会社 Object detection device
JP4830604B2 (en) * 2006-04-17 2011-12-07 日産自動車株式会社 Object detection method and object detection apparatus
JP2007304033A (en) * 2006-05-15 2007-11-22 Honda Motor Co Ltd Monitoring device for vehicle periphery, vehicle, vehicle peripheral monitoring method, and program for vehicle peripheral monitoring

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105128836A (en) * 2014-05-30 2015-12-09 株式会社万都 Autonomous emergency braking system and method for recognizing pedestrian therein
CN105128836B (en) * 2014-05-30 2019-04-19 株式会社万都 Autonomous emergency braking system and the wherein method of identifying rows people

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20110127

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20120816

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20120821

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20120926

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20121030

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20121112

R150 Certificate of patent or registration of utility model

Ref document number: 5145986

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20151207

Year of fee payment: 3