US8233031B2 - Object detecting system - Google Patents
Object detecting system
- Publication number
- US8233031B2 (application number US12/260,538)
- Authority
- US
- United States
- Prior art keywords
- image
- reference image
- estimated
- information
- detected
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
Definitions
- the present invention relates to object detecting systems, and more particularly, to an object detecting system that detects an object by stereo matching of a pair of images taken by stereo-image taking means.
- a pair of images are taken by a pair of cameras, and one of the taken images used for reference (hereinafter referred to as a reference image) is compared with the other image (hereinafter referred to as a comparative image).
- a difference between positions of corresponding portions of the same object in the images, that is, a parallax is calculated, and the distance to the object is calculated from the parallax according to the principle of triangulation.
- the corresponding portions of the same object in the reference image and the comparative image are typically located by stereo matching (for example, see Japanese Unexamined Patent Application Publication No. 10-283461).
- Japanese Unexamined Patent Application Publication No. 2000-123151 proposes an image processing method using a single camera.
- in this method, differences in pixel density between an image of the environment in front of a vehicle taken in the current frame and an image of the environment taken in the previous frame are calculated.
- on the basis of the average density and distribution of a difference image obtained from the differences, it is determined whether an image of the wiper is included in the image taken in the current frame.
- Japanese Unexamined Patent Application Publication No. 2006-107314 also proposes an image processing apparatus using a single camera.
- in this method, the moving speeds of objects in a taken image are calculated by using optical flow, in which the moving directions and speeds of the objects in the image are shown by vectors.
- on the basis of the optical flow, a wiper whose image is included in the taken image is detected.
- an area of the image of the front environment missing because of the wiper is compensated for by a compensation image obtained from the previous frame.
- when objects around the vehicle are detected on the basis of an image taken by the stereo-image taking means, the stereo-image taking means must be mounted inside the front glass, and noise inevitably occurs in the taken image because of raindrops and dirt adhering to the front glass. Further, raindrops and dirt adhering to the front glass in front of the camera must be removed with the wiper while rain is falling, so an image of the wiper is inevitably included in the image taken by the camera. When noise, such as images of the wiper and adherents, is included in the taken image, it is difficult to obtain highly reliable information about the distance to the object by stereo matching.
- an object detecting system that detects objects by stereo matching is required to accurately detect and distinguish between changes of the external environment and noise due to the wiper and adherents.
- an object of the invention is to provide an object detecting system that can detect objects while accurately distinguishing between changes of an external environment and noise due to a wiper and adherents.
- an object detecting system includes stereo-image taking means including a pair of image taking means mounted inside a front glass, the stereo-image taking means simultaneously taking images of an object around a vehicle in which the object detecting system is mounted and outputting the images as a reference image and a comparative image; stereo matching means for performing, in the comparative image, stereo matching of a reference pixel block set in the reference image and having a predetermined number of pixels, and for calculating a parallax between the reference pixel block and a comparative pixel block specified in the comparative image, the parallax being calculated for each reference pixel block; object detecting means for calculating a distance to the object on the basis of the parallax and for detecting the object in the reference image on the basis of the distance; estimated-region setting means for setting, in the reference image and the comparative image, estimated regions where images of the object are expected to be taken in the current frame, on the basis of the distance of the object detected in the reference image in the previous frame; and determination means for determining, on the basis of an absolute value of a difference between average luminances of the estimated regions, whether information about the object detected in the estimated region of the reference image or information that the object is not detected includes noise.
- estimated regions in which images of the object are expected to be taken in the current frame, are respectively set in the reference image and the comparative image on the basis of the result of the previous frame. Further, it is determined, on the basis of the absolute value of a difference between the average luminances of the estimated regions, whether information about the object detected in the estimated region of the reference image or information that the object is not detected includes noise. Therefore, it is possible to accurately distinguish between a case in which noise is caused by a wiper cutting in front of one of the image taking means or a substance adhering to the front glass and a case in which an external environment changes.
- since the object can be detected while noise and changes of the external environment are thus accurately distinguished, it is possible to accurately distinguish between a case in which only the external environment changes and information about the detected object or information that the object is not detected is highly reliable, and a case in which noise occurs and such information is not reliable.
- even when reliability of the information is low, information about the reliability can be output in proper correlation with the object information.
- when automatic running control of the vehicle is performed on the basis of the output information, the information can be used for automatic control after its reliability is recognized. This allows accurate automatic control.
- the determination means decreases reliability of information about the parallax or the distance calculated for the object detected in the estimated region of the reference image, or decreases reliability of the information that the object is not detected in the estimated region of the reference image when the object is not detected in the estimated region.
- information about the object is correlated with reliability beforehand.
- the object information is correlated with information that noise is included. Therefore, reliability of information about the object detected in the estimated region of the reference image or information that the object is not detected can be evaluated by simply recognizing the information about reliability. This allows the above-described advantages of the present invention to be achieved more effectively.
- the object detecting means calculates a position of the detected object relative to the vehicle on the basis of the distance of the object, and stores the relative position in time series.
- the estimated-region setting means estimates a relative position of the object in the current frame on the basis of the relative position and the relative speed of the object in the previous frame, and sets, in the reference image, an estimated region where an image of the object is expected to be taken.
- the estimated-region setting means also estimates a parallax in the current frame on the basis of the estimated relative position of the object, and sets an estimated region in the comparative image on the basis of the parallax and the estimated region set in the reference image.
- the position of the detected object relative to the vehicle is calculated on the basis of the distance of the object, and an estimated region is set in the reference image on the basis of the relative position. Further, a parallax is estimated from the estimated region set in the reference image, and an estimated region is set in the comparative image on the basis of the parallax. Therefore, estimated regions can be easily and accurately set in the reference image and the comparative image, and the above-described advantages of the present invention can be achieved more accurately.
- the relative speed in the next frame is calculated by smoothing filtering in which the relative speed of the object used in the current frame and a difference between the relative positions in the previous frame and the current frame are subjected to weighted addition by an addition ratio.
- the relative speed of the object used in the next frame is calculated by smoothing the relative speed used in the current frame and the detection result of the object in the current frame. Therefore, even when the detection result of the object fluctuates, fluctuation is suppressed, and the speed of the object relative to the vehicle can be obtained in a state closer to reality. Moreover, estimated regions can be more properly set in the reference image and the comparative image. This allows the above-described advantages of the present invention to be achieved more accurately.
- when information about the object detected in the current frame is correlated with information that noise is included, the addition ratio of the difference is decreased for the object in the smoothing filtering in the current frame.
- the addition ratio of the detection result is decreased in smoothing filtering in the current frame. This can prevent the calculated relative speed from being made greatly different from an actual value by the detection result having low reliability, and a proper relative speed is calculated. Therefore, the above-described advantages of the present invention can be achieved more accurately.
- when the object is not detected in the estimated region, the relative speed of the object used in the current frame is set as the relative speed in the next frame.
- the determination means deletes information about the object when the object is not detected in the estimated region set in the reference image.
- preferably, when it is determined that the information includes noise, the determination means tightens a criterion of stereo matching performed by the stereo matching means.
- the criterion of stereo matching performed by the stereo matching means is tightened. Even when the absolute value of the difference between average luminances of the estimated regions in the reference image and the comparative image is large and reliability of the information about the detected object is low, only comparative pixel blocks in the comparative image that have a luminance pattern very close to that of the reference pixel blocks in the reference image are specified by stereo matching. Therefore, in addition to the above-described advantages of the present invention, reliability of information about the distance of the detected object can be improved further.
- the determination means determines that a wiper cuts in front of the image taking means that takes the reference image or the comparative image to which the estimated region having the lower average luminance belongs.
- the determination means determines that a substance adheres to the front glass in front of the image taking means that takes the reference image or the comparative image.
- FIG. 1 is a block diagram showing a configuration of an object detecting system according to an embodiment
- FIG. 2 shows an example of a reference image
- FIG. 3 explains how stereo matching is performed by an image processor
- FIG. 4 shows a distance image formed on the basis of the reference image shown in FIG. 2 ;
- FIG. 5 is a flowchart showing a procedure performed by a detection means
- FIG. 6 is a flowchart showing the procedure performed by the detection means
- FIG. 7 explains sections of the distance image shown in FIG. 4 ;
- FIG. 8 shows an example of a histogram for extracting the distance of an object in each section shown in FIG. 7 ;
- FIG. 9 shows distances of objects in the sections that are plotted in real space
- FIG. 10 explains grouping of dots shown in FIG. 9 ;
- FIG. 11 shows objects obtained by linear approximation of dots belonging to groups shown in FIG. 10 ;
- FIG. 12 shows detected objects enclosed by rectangular frames in the reference image
- FIG. 13 explains a method for calculating relative positions, where an object is expected to exist in the current frame, on the basis of the previous frame by an estimated-region setting means;
- FIG. 14A shows an estimated region set in a reference image
- FIG. 14B shows an estimated region set in a comparative image
- FIG. 15A shows an estimated region and a frame enclosing an object set in the reference image
- FIG. 15B shows an estimated region set in the comparative image
- FIG. 16A shows a reference image including an image of an object
- FIG. 16B shows a comparative image in which a taken image of the object is hidden by a wiper
- FIG. 17 is a graph explaining a state in which the average luminance of the estimated region in the comparative image greatly decreases in a situation shown in FIGS. 16A and 16B ;
- FIG. 18 is a graph explaining that the average luminance of the estimated region in the comparative image recovers when a state shown in FIGS. 15A and 15B is brought about again;
- FIG. 19 is a graph showing the average luminances of the estimated regions in the reference image and the comparative image in a state in which the wiper cuts in front of a pair of image taking means;
- FIG. 20 is a graph showing changes in the average luminances of the estimated regions in the reference image and the comparative image caused when dirt adheres to a front glass in front of one of the image taking means;
- FIG. 21 is a graph showing changes in the average luminances of the estimated regions in the reference image and the comparative image caused when light is diffused because a raindrop adheres to the front glass in front of one of the image taking means;
- FIG. 22 is a block diagram showing a configuration adopted when a signal is transmitted from a determination means to a stereo matching means so as to tighten the criterion of stereo matching.
- an object detecting system 1 includes a stereo-image taking means 2 , an image processing means 6 including a stereo matching means 7 and so on, and a detection means 9 including an object detecting means 10 and so on.
- the stereo-image taking means 2 is formed by a stereo camera including a main camera 2 a and a sub-camera 2 b respectively including image sensors, such as CCDs or CMOS sensors, which are in synchronization with each other.
- the main camera 2 a and the sub-camera 2 b are mounted inside a front glass of a vehicle (not shown), for example, near a room mirror and are spaced a predetermined distance apart in the vehicle width direction, that is, in the right-left direction.
- the main camera 2 a and the sub-camera 2 b are mounted at the same height from the road surface, simultaneously take images of an object around the vehicle, particularly, in front of the vehicle at a predetermined sampling cycle, and output information about the taken images.
- the main camera 2 a close to the driver outputs image data on a reference image T 0 shown in FIG. 2
- the sub-camera 2 b remote from the driver outputs image data on a comparative image T C that is not shown.
- image data output from the main camera 2 a and the sub-camera 2 b are converted from analog images into digital images, in which each pixel has a luminance of a predetermined number of gray-scale levels, for example, 256 levels, by A/D converters 3 a and 3 b in a conversion means 3 .
- the digital images are subjected to image correction, such as removal of displacement and noise, by an image correction unit 4 .
- after image correction, the image data is transmitted to and stored in an image-data memory 5 , and is also transmitted to the image processing means 6 .
- the image processing means 6 includes a stereo matching means 7 , such as an image processor, and a distance-data memory 8 .
- the stereo matching means 7 performs stereo matching for the reference image T 0 and the comparative image.
- the stereo matching means 7 sets a reference pixel block PB 0 defined by a predetermined number of pixels, such as 3 by 3 pixels or 4 by 4 pixels, in the reference image T 0 , calculates SAD values serving as differences in luminance pattern between the reference pixel block PB 0 and comparative pixel blocks PB C , which have the same shape as the reference pixel block PB 0 , on an epipolar line EPL in the comparative image T C corresponding to the reference pixel block PB 0 according to the following Expression (1), and specifies the comparative pixel block PB C having the smallest SAD value:
- SAD=Σ|p1st−p2st| (1)
- p1st represents the luminance of the pixel at position (s, t) in the reference pixel block PB 0
- p2st represents the luminance of the pixel at position (s, t) in the comparative pixel block PB C .
- the sum of the absolute differences in luminance is calculated over all pixels in the region where 1≦s≦3 and 1≦t≦3 when each of the reference pixel block PB 0 and the comparative pixel block PB C is set as a region defined by 3 by 3 pixels, and over all pixels in the region where 1≦s≦4 and 1≦t≦4 when each block is set as a region defined by 4 by 4 pixels.
- the stereo matching means 7 thus calculates a parallax dp from the position of each reference pixel block PB 0 in the reference image T 0 and the position of a specified comparative pixel block PB C in the comparative image T C corresponding to the reference pixel block PB 0 .
- a threshold value of the SAD value is set as a criterion of stereo matching beforehand. When a SAD value between a reference pixel block PB 0 in the reference image T 0 and a comparative pixel block PB C specified corresponding thereto is more than the threshold value, the stereo matching means 7 determines that matching is not valid, and sets the parallax dp at 0.
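- as a concrete illustration, the block search above can be sketched in Python with NumPy as follows; the function name, image layout, search range, and threshold are illustrative assumptions rather than values from the patent:

```python
import numpy as np

def match_block_sad(ref_img, cmp_img, i, j, block=4, max_disp=64, sad_thresh=2000):
    """Search along the epipolar line (the same row in rectified images) for
    the comparative pixel block with the smallest SAD value.

    Returns the parallax dp, or 0 when the best SAD exceeds the threshold,
    mirroring the "matching is not valid" rule above. Search range and
    threshold are illustrative assumptions, not values from the patent.
    """
    ref_block = ref_img[j:j + block, i:i + block].astype(np.int32)
    best_sad, best_dp = None, 0
    for dp in range(max_disp + 1):
        if i - dp < 0:
            break
        cmp_block = cmp_img[j:j + block, i - dp:i - dp + block].astype(np.int32)
        sad = int(np.abs(ref_block - cmp_block).sum())   # Expression (1)
        if best_sad is None or sad < best_sad:
            best_sad, best_dp = sad, dp
    return best_dp if best_sad is not None and best_sad <= sad_thresh else 0
```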
- the stereo matching means 7 transmits information about the parallax dp thus calculated for each reference pixel block PB 0 to the distance-data memory 8 and stores the information therein.
- the point (X, Y, Z) in real space, the parallax dp, and the point (i, j) in a distance image T Z can be uniquely correlated by coordinate conversion given by the following Expressions (2) to (4) according to the principle of triangulation:
- X=CD/2+Z×PW×(i−IV) (2)
- Y=CH+Z×PW×(j−JV) (3)
- Z=CD/(PW×(dp−DP)) (4)
- a point on the road surface just below the midpoint between the main camera 2 a and the sub-camera 2 b is designated as the origin
- the X-axis indicates the vehicle width direction (right-left direction)
- the Y-axis indicates the vehicle height direction
- the Z-axis indicates the vehicle length direction (front-rear direction).
- CD represents the distance between the main camera 2 a and the sub-camera 2 b
- PW represents the viewing angle for one pixel
- CH represents the mounting height of the main camera 2 a and the sub-camera 2 b
- IV and JV respectively represent i and j coordinates in the distance image T Z of the point at infinity in front of the vehicle
- DP represents the vanishing point parallax.
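- a small helper, sketched below, works Expressions (2) to (4) as code; the numeric camera constants are placeholders chosen for illustration only, since the patent discloses no concrete values:

```python
# Placeholder camera constants; the patent does not disclose concrete values.
CD = 0.35          # distance between the main camera and the sub-camera [m]
PW = 0.00006       # viewing angle for one pixel [rad]
CH = 1.2           # mounting height of the cameras [m]
IV, JV = 320, 120  # i and j coordinates of the point at infinity
DP = 2.0           # vanishing point parallax [pixels]

def pixel_to_real(i, j, dp):
    """Expressions (2) to (4): convert a distance-image point (i, j) with
    parallax dp into real-space coordinates (X, Y, Z) by triangulation."""
    Z = CD / (PW * (dp - DP))        # Expression (4)
    X = CD / 2 + Z * PW * (i - IV)   # Expression (2)
    Y = CH + Z * PW * (j - JV)       # Expression (3)
    return X, Y, Z
```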
- the operations described below are performed by using a distance image T Z (see FIG. 4 ) formed by assigning the parallaxes dp to the reference pixel blocks PB 0 in the reference image T 0 , as described above.
- since the parallaxes dp are stored in the distance-data memory 8 , they can be converted into distances Z according to Expression (4) described above, and the operations can also be performed by using a distance image obtained by assigning the distances Z to the reference pixel blocks PB 0 in the reference image T 0 .
- the detection means 9 is formed by a microcomputer in which a CPU, a ROM, a RAM, an input/output interface, etc. (not shown) are connected to a bus.
- the detection means 9 is connected to sensors Q, such as a vehicle-speed sensor, a yaw-rate sensor, and a steering-angle sensor, for measuring and outputting the vehicle speed V, the yaw rate γ, and the steering angle δ of the steering wheel.
- the yaw-rate sensor can be replaced with a device for estimating a yaw rate from the speed of the vehicle.
- the detection means 9 includes an object detecting means 10 , an estimated-region setting means 11 , and a determination means 12 .
- the detection means 9 also includes a memory that is not shown. Necessary data is input from the sensors Q to the above means in the detection means 9 .
- the detection means 9 performs processing according to flowcharts shown in FIGS. 5 and 6 . The following description will be given according to the flowcharts.
- the object detecting means 10 is based on the vehicle surroundings monitoring apparatus disclosed in, for example, Japanese Unexamined Patent Application Publication No. 10-283461, as described above. Since detailed descriptions are given in that publication, the structure and processing of the object detecting means 10 will be described only briefly below.
- in the object detecting means 10 , the detected parallaxes dp described above are converted into data on distances Z, and data on adjoining distances Z are combined into groups, thus detecting objects in the reference image T 0 (Step S 1 in FIG. 5 ).
- the object detecting means 10 reads out the above-described distance image T Z from the distance-data memory 8 , and divides the distance image T Z into vertical strip sections Dn each having a predetermined pixel width, as shown in FIG. 7 . Then, the object detecting means 10 converts the parallaxes dp assigned to the reference pixel blocks PB 0 belonging to each strip section Dn into distances Z according to Expression (4) described above, forms a histogram Hn of the distances Z, as shown in FIG. 8 , and sets a class value having the highest frequency Fn as an object distance Zn in the strip section Dn. This operation is performed for all sections Dn.
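- a minimal sketch of this histogram step follows, assuming the distance image is available as a 2-D array of Z values with invalid pixels set to 0; the strip width and bin edges are illustrative choices:

```python
import numpy as np

def section_distances(dist_image, strip_width=16, z_bins=np.arange(0.0, 120.0, 2.0)):
    """For each vertical strip section Dn of the distance image, build a
    histogram Hn of the distances Z and take the class value with the highest
    frequency Fn as the object distance Zn in that section."""
    zn = []
    for x0 in range(0, dist_image.shape[1], strip_width):
        strip = dist_image[:, x0:x0 + strip_width]
        valid = strip[strip > 0]                 # skip pixels with invalid distance
        if valid.size == 0:
            zn.append(None)
            continue
        hist, edges = np.histogram(valid, bins=z_bins)
        k = int(np.argmax(hist))
        zn.append(0.5 * (edges[k] + edges[k + 1]))  # center of the modal class
    return zn
```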
- when the object detecting means 10 then plots the distances Zn obtained in the sections Dn in real space, the distances Zn are arranged as dots, as shown in FIG. 9 .
- the object detecting means 10 classifies the adjoining dots into groups G 1 , G 2 , G 3 , . . . on the basis of the distances between the plotted dots and directionality, as shown in FIG. 10 .
- the object detecting means 10 linearly approximates the dots belonging to each group, as shown in FIG. 11 .
- the object detecting means 10 labels a group, in which the dots are arranged substantially parallel to the width direction of a vehicle A, that is, the X-axis direction, with an “object” O, and labels a group, in which the dots are arranged substantially parallel to the length direction of the vehicle A, that is, the Z-axis direction, with a “side wall” S.
- a point that can be regarded as an intersection of an “object” and a “side wall” of the same object is labeled with C as a corner point.
- the object detecting means 10 detects, as one object, each of [Side Wall S 1 ], [Object O 1 ], [Side Wall S 2 ], [Object O 2 , Corner Point C, Side Wall S 3 ], [Side Wall S 4 ], [Object O 3 ], [Object O 4 ], [Side Wall S 5 , Corner Point C, Object O 5 ], [Object O 6 ], and [Side Wall S 6 ]. While “Object” and “Side Wall” are used as labels for convenience, as described above, “side wall” is also detected as an object.
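- a rough sketch of this grouping and labeling, assuming the dots arrive as (X, Z) pairs ordered along the X direction, is shown below; the gap limit and the simple direction test are illustrative stand-ins for the patent's distance-and-directionality criteria:

```python
def group_dots(dots, max_gap=2.0):
    """Group neighboring dots (X, Z) and label each group by its dominant
    direction, roughly following FIGS. 9 to 11. Assumes a non-empty list of
    dots ordered along X; max_gap [m] is an illustrative limit."""
    groups, current = [], [dots[0]]
    for prev, cur in zip(dots, dots[1:]):
        if abs(cur[0] - prev[0]) <= max_gap and abs(cur[1] - prev[1]) <= max_gap:
            current.append(cur)
        else:
            groups.append(current)
            current = [cur]
    groups.append(current)
    labeled = []
    for g in groups:
        dx = abs(g[-1][0] - g[0][0])   # extent in the vehicle width direction (X)
        dz = abs(g[-1][1] - g[0][1])   # extent in the vehicle length direction (Z)
        labeled.append(("object" if dx >= dz else "side wall", g))
    return labeled
```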
- the object detecting means 10 sets regions where images of the objects are taken by setting rectangular frames that enclose the objects in the reference image T 0 on the basis of information about the detected objects, thus detecting the objects in the reference image T 0 .
- the object detecting means 10 calculates the positions (X1new, Y1new, Z1new) and (X2new, Y2new, Z2new) in real space of both ends of an approximation line representing each detected object relative to the vehicle A, and stores the positions in the memory (Step S 2 in FIG. 5 ). Further, the object detecting means 10 stores, in the memory, information about the frames of the detected objects and information about the yaw rate γnew of the vehicle A transmitted from the sensors Q in the current frame.
- the object detecting means 10 collates the relative positions (X1new, Y1new, Z1new) and (X2new, Y2new, Z2new) of both ends of each object with the relative positions (X1old, Y1old, Z1old) and (X2old, Y2old, Z2old) of both ends of the object obtained in the previous frame and stored in the memory.
- when it is determined, by the smoothing filtering that will be described below (Step S 13 in FIG. 6 ), that the object detected in the current frame matches an object detected in the previous frame within the range of consistency with the speeds (ΔX1old, ΔY1old, ΔZ1old) and (ΔX2old, ΔY2old, ΔZ2old) in the X-axis, Y-axis, and Z-axis directions of both ends of the approximation line representing the object relative to the vehicle A, which were calculated in the previous frame, the object detecting means 10 updates the object information by adding the same sign as that of the object detected in the previous frame to the object detected in the current frame.
- the estimated-region setting means 11 sets an estimated region, where an image of each object detected in the previous frame is expected to be taken in the current frame, in each of the reference image T 0 and the comparative image T C , on the basis of the relative positions (X1old, Y1old, Z1old) and (X2old, Y2old, Z2old) of both ends of the approximation line representing the object calculated by the object detecting means 10 in the previous frame, the speeds (ΔX1old, ΔY1old, ΔZ1old) and (ΔX2old, ΔY2old, ΔZ2old) of both ends of the object relative to the vehicle A calculated in the previous frame, and the yaw rate γold of the vehicle A in the previous frame.
- specifically, the estimated-region setting means 11 reads out, from the memory, the relative positions (X1old, Y1old, Z1old) and (X2old, Y2old, Z2old) of both ends of the approximation line representing an object O calculated in the previous frame, and shifts the relative positions by the amounts corresponding to the relative speeds (ΔX1old, ΔY1old, ΔZ1old) and (ΔX2old, ΔY2old, ΔZ2old) in the X-axis, Y-axis, and Z-axis directions of both ends of the object O in the previous frame.
- the estimated-region setting means 11 then turns the positions around the vehicle A by the amount corresponding to the yaw rate γold of the vehicle A measured in the previous frame, and thereby calculates the positions (X1est, Y1est, Z1est) and (X2est, Y2est, Z2est) of both ends of the object, where the object is expected to exist in the current frame, relative to the vehicle A.
- on the basis of the calculated positions (X1est, Y1est, Z1est) and (X2est, Y2est, Z2est), the estimated-region setting means 11 sets an estimated region R 0 est , where an image of the object O is expected to be taken, in the reference image T 0 (Step S 3 in FIG. 5 ). It then calculates the average Zest_ave of the Z coordinates Z1est and Z2est of the relative positions of the object O, and estimates the parallax dpest in the current frame by substituting the average into Expression (4).
- an estimated region R C est is then set at a position in the comparative image T C shifted from the estimated region R 0 est set in the reference image T 0 by the parallax dpest (Step S 4 in FIG. 5 ).
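- under these definitions, Steps S 3 and S 4 can be sketched as follows; the projection function, the time step, the sign convention of the turn, and the camera constants are assumptions for illustration:

```python
import math

def estimate_regions(ends_old, vel_old, yaw_old, dt, to_image,
                     CD=0.35, PW=0.00006, DP=2.0):
    """Predict where the object will appear in the current frame.
    ends_old holds (X, Y, Z) of both ends of the approximation line from the
    previous frame, vel_old their per-frame relative speeds, and to_image
    projects real-space end points to a reference-image box. The sign of the
    turn and all constants are illustrative assumptions."""
    pred = []
    for (X, Y, Z), (dX, dY, dZ) in zip(ends_old, vel_old):
        X, Y, Z = X + dX, Y + dY, Z + dZ       # shift by the relative speed
        th = -yaw_old * dt                     # turn around the vehicle by its yaw
        X, Z = (X * math.cos(th) - Z * math.sin(th),
                X * math.sin(th) + Z * math.cos(th))
        pred.append((X, Y, Z))
    r0_est = to_image(pred)                    # estimated region R0est (Step S3)
    zest_ave = 0.5 * (pred[0][2] + pred[1][2])
    dp_est = CD / (PW * zest_ave) + DP         # Expression (4) solved for dp
    return r0_est, dp_est                      # RCest = R0est shifted by dp_est
```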
- while the seeking (Steps S 10 and S 11 ) that will be described below continues for an object that has not been detected in the previous frame, estimated regions R 0 est and R C est are set for the object in the reference image T 0 and the comparative image T C in subsequent frames.
- the estimated-region setting means 11 performs the above-described Steps S 3 and S 4 for all objects including an object that has not been detected in the previous frame.
- the determination means 12 calculates the average p1ij_ave of the luminances p1ij in the estimated region R 0 est set for each object in the reference image T 0 by the estimated-region setting means 11 , calculates the average p2ij_ave of the luminances p2ij in the estimated region R C est correspondingly set in the comparative image T C , and then determines whether the absolute value |p1ij_ave−p2ij_ave| of the difference between the averages is equal to or more than a predetermined threshold value Δpth (Step S 5 in FIG. 5 ).
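- the check in Step S 5 reduces to a comparison of two means, as the following sketch shows; the region format and the threshold value are assumed for illustration:

```python
import numpy as np

DELTA_P_TH = 30  # threshold Δpth on the 256-level gray scale; illustrative value

def luminance_gap(ref_img, cmp_img, r0_est, rc_est):
    """Step S5: compare the average luminances of the two estimated regions.
    Regions are (left, top, right, bottom) boxes; all names are illustrative."""
    l, t, r, b = r0_est
    p1_ave = float(np.mean(ref_img[t:b, l:r]))
    l, t, r, b = rc_est
    p2_ave = float(np.mean(cmp_img[t:b, l:r]))
    noisy = abs(p1_ave - p2_ave) >= DELTA_P_TH   # YES branch leads to Steps S8-S11
    return p1_ave, p2_ave, noisy
```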
- when the absolute value |p1ij_ave−p2ij_ave| is less than the threshold value Δpth (Step S 5 ; NO), the determination means 12 determines whether an object is detected in the estimated region R 0 est of the reference image T 0 (Step S 6 ).
- when the object is detected (Step S 6 ; YES), the determination means 12 holds the information about the object stored in the memory, for example, the positions (X1new, Y1new, Z1new) and (X2new, Y2new, Z2new) in real space of both ends of the approximation line representing the object.
- when the absolute value |p1ij_ave−p2ij_ave| is equal to or more than the threshold value Δpth (Step S 5 ; YES), the determination means 12 determines whether an object is detected in the estimated region R 0 est of the reference image T 0 (Step S 8 ).
- when an object is detected (Step S 8 ; YES), since reliability of information about the detected object is low, information about the distance Zn of the object detected in the current frame and stored in the memory and information about the positions (X1new, Y1new, Z1new) and (X2new, Y2new, Z2new) in real space of both ends of the approximation line representing the object relative to the vehicle A are stored in the memory together with information that noise is included (Step S 9 ).
- when the absolute value is equal to or more than the threshold value Δpth (Step S 5 ; YES) and an object is not detected in the estimated region R 0 est (Step S 8 ; NO), the determination means 12 sets information about the positions (X1est, Y1est, Z1est) and (X2est, Y2est, Z2est) of both ends of the object relative to the vehicle A, where the object is expected to exist, as the relative positions (X1new, Y1new, Z1new) and (X2new, Y2new, Z2new) of both ends of the object instead (Step S 10 ), and sets information about the estimated region R 0 est as information about the frame of the object.
- the determination means 12 also stores, in the memory, this information together with information that an object is not detected and information that noise is included (Step S 11 ), and continues seeking for the object.
- the above-described determination of whether an object is detected in the estimated region R 0 est of the reference image T 0 is made on the basis of whether the estimated region R 0 est in the reference image T 0 overlaps with a region shown by a rectangular frame set in the reference image T 0 by the object detecting means 10 ; it is necessary for the regions to at least be in contact with each other. Even when the regions overlap in the reference image T 0 , if the distance between the positions of the regions in real space is more than or equal to a predetermined distance, it is determined that an object is not detected in the estimated region R 0 est.
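- this two-part test, image overlap plus a real-space distance limit, might look as follows; the box format and the tolerance are assumptions:

```python
def detected_in_region(r0_est, frame, z_est, z_frame, z_tol=2.5):
    """Steps S6/S8: the frame set by the object detecting means must at least
    touch the estimated region, and the two must also be close in real space.
    Boxes are (left, top, right, bottom); z_tol [m] is an illustrative limit."""
    touching = not (frame[2] < r0_est[0] or r0_est[2] < frame[0] or
                    frame[3] < r0_est[1] or r0_est[3] < frame[1])
    # even touching regions are rejected when too far apart in real space
    return touching and abs(z_est - z_frame) < z_tol
```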
- when the state in which the object is not detected in the estimated region R 0 est (Steps S 10 and S 11 ) continues for a predetermined number of frames or more, it can be determined that the object no longer exists in the reference image T 0 , and information about the object can be deleted from the memory so as to stop seeking the object.
- when the above-described operations have not been completed for all objects for which the estimated regions R 0 est and R C est are set, including any object that has not been detected in the previous frame (Step S 12 ; NO), the determination means 12 repeats Step S 5 and the subsequent steps.
- the determination means 12 reads out, from the memory, information about the objects that thus have been detected or have not been detected, as necessary.
- the determination means 12 then calculates the relative speeds (ΔX1new, ΔY1new, ΔZ1new) and (ΔX2new, ΔY2new, ΔZ2new) of both ends of the object for use in the next frame. These relative speeds are calculated by smoothing filtering: the relative speeds (ΔX1old, ΔY1old, ΔZ1old) and (ΔX2old, ΔY2old, ΔZ2old) used in the current frame and the differences between the relative positions (X1old, Y1old, Z1old) and (X2old, Y2old, Z2old) of both ends of the object in the previous frame and the relative positions (X1new, Y1new, Z1new) and (X2new, Y2new, Z2new) in the current frame are subjected to weighted addition by an addition ratio P.
- specifically, the determination means 12 calculates the relative speeds (ΔX1new, ΔY1new, ΔZ1new) and (ΔX2new, ΔY2new, ΔZ2new) in the next frame according to the following Expressions (5) to (10) (Step S 13 in FIG. 6 ):
- ΔX1new=(1−P)×ΔX1old+P×(X1new−X1old) (5)
- ΔY1new=(1−P)×ΔY1old+P×(Y1new−Y1old) (6)
- ΔZ1new=(1−P)×ΔZ1old+P×(Z1new−Z1old) (7)
- ΔX2new=(1−P)×ΔX2old+P×(X2new−X2old) (8)
- ΔY2new=(1−P)×ΔY2old+P×(Y2new−Y2old) (9)
- ΔZ2new=(1−P)×ΔZ2old+P×(Z2new−Z2old) (10)
- when the object is not detected in the estimated region, the relative positions (X1est, Y1est, Z1est) and (X2est, Y2est, Z2est), where the object has been expected to exist and which were adopted in Step S 10 , are used as (X1new, Y1new, Z1new) and (X2new, Y2new, Z2new).
- when information about the object subjected to the calculation is correlated with information that noise is included, the addition ratio P for the object is decreased so as to reduce the contribution of the differences X1new−X1old and so on to the smoothing. This is because reliability of information about the relative positions (X1new, Y1new, Z1new) and (X2new, Y2new, Z2new) of both ends of the object calculated in the current frame is low.
- when the object is not detected in the estimated region R 0 est, the determination means 12 sets the addition ratio P at 0 in Expressions (5) to (10).
- in this case, the relative speeds (ΔX1old, ΔY1old, ΔZ1old) and (ΔX2old, ΔY2old, ΔZ2old) of the object used in the current frame are used unchanged as the relative speeds (ΔX1new, ΔY1new, ΔZ1new) and (ΔX2new, ΔY2new, ΔZ2new) in the next frame.
- the determination means 12 sets the calculated relative speeds (ΔX1new, ΔY1new, ΔZ1new) and (ΔX2new, ΔY2new, ΔZ2new) as the relative speeds (ΔX1old, ΔY1old, ΔZ1old) and (ΔX2old, ΔY2old, ΔZ2old) for use in the next frame (Step S 14 ), sets the relative positions (X1new, Y1new, Z1new) and (X2new, Y2new, Z2new) as the relative positions (X1old, Y1old, Z1old) and (X2old, Y2old, Z2old) (Step S 15 ), and proceeds to processing for the next frame.
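- Expressions (5) to (10) apply the same first-order filter to each coordinate; a one-line sketch with an illustrative default for the addition ratio P:

```python
def smooth_speed(v_old, p_old, p_new, P=0.25):
    """One coordinate of Expressions (5) to (10): weighted addition of the
    relative speed used in the current frame and the newest position
    difference. The default addition ratio P is illustrative; P is decreased
    when noise is included and set to 0 when the object is not detected, in
    which case the old speed is simply carried over."""
    return (1.0 - P) * v_old + P * (p_new - p_old)

# for example: dX1_new = smooth_speed(dX1_old, X1_old, X1_new)
```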
- estimated regions R 0 est and R C est are respectively set in the reference image T 0 and the comparative image T C , as shown in FIGS. 15A and 15B .
- the estimated regions R 0 est and R C est include images of the same object O. Since the images of the same object O are taken with similar luminances in both the images T 0 and T C , the average p1ij_ave of the luminances p1ij in the estimated region R 0 est of the reference image T 0 is substantially equal to the average p2ij_ave of the luminances p2ij in the estimated region R C est of the comparative image T C .
- in this case, the absolute value |p1ij_ave−p2ij_ave| of the difference between the average values is a small value less than the threshold value Δpth (Step S 5 in FIG. 5 ; NO).
- the determination means 12 determines whether the object O is detected in the estimated region R 0 est of the reference image T 0 (Step S 6 ).
- the object detecting means 10 sets a frame Fr that encloses the object O in the reference image T 0 , as shown in FIG. 15A .
- the frame Fr and the estimated region R 0 est are set at almost the same position and overlap with each other, and the distance between the positions of the regions in real space is less than a predetermined distance. Therefore, the determination means 12 determines that the object O is detected in the estimated region R 0 est of the reference image T 0 (Step S 6 in FIG. 5 ; YES).
- the determination means 12 updates and maintains information about the object O, such as positions (X 1 new, Y 1 new, Z 1 new) and (X 2 new, Y 2 new, Z 2 new) in real space of both ends of an approximation line representing the object O relative to the vehicle A, in the memory (for example, Step S 15 in FIG. 6 ), and seeks the object O.
- even when the absolute value of the difference is small (Step S 5 in FIG. 5 ; NO), when the object O is not detected in the estimated region R 0 est of the reference image T 0 (Step S 6 in FIG. 5 ; NO), the determination means 12 determines that images of the object O are not included in the estimated regions R 0 est and R C est and that the object O does not exist in the reference image T 0 . Then, the determination means 12 deletes information about the object O stored in the memory (Step S 7 ), and stops seeking of the object O.
- suppose that the wiper W operates in case of rain and that the image of the object O in the comparative image T C shown in FIG. 15B is hidden by the wiper W in the next frame, as shown in FIG. 16B .
- the main camera 2 a and the sub-camera 2 b are normally not placed at positions such that the image of the object in the reference image T 0 is hidden by the right wiper viewed from the driver's seat at the same timing as the timing when the image of the same object in the comparative image T C is hidden by the left wiper. Therefore, the image of the object O is included in the reference image T 0 as long as there is no other cause to hinder image taking, as shown in FIG. 16A .
- the stereo matching means 7 cannot perform normal stereo matching, and the object detecting means 10 cannot detect the object O in the reference image T 0 . Therefore, a frame enclosing the object O is not set in the reference image T 0 .
- the estimated regions R 0 est and R C est in the reference image T 0 and the comparative image T C of the current frame are set on the basis of the relative positions (X1old, Y1old, Z1old) and (X2old, Y2old, Z2old) of both ends of the object O detected by the object detecting means 10 in the reference image T 0 and the comparative image T C of the previous frame shown in FIGS. 15A and 15B , and on the basis of the speeds (ΔX1old, ΔY1old, ΔZ1old) and (ΔX2old, ΔY2old, ΔZ2old) of both ends of the object O relative to the vehicle calculated in the previous frame.
- estimated regions R 0 est and R C est are set in the reference image T 0 and the comparative image T C , as shown in FIGS. 16A and 16B .
- the average value p2ij_ave of the luminances p2ij of the estimated region R C est in the comparative image T C significantly decreases, as shown in FIG. 17 , because the estimated region R C est is hidden by the wiper W.
- however, the average value p1ij_ave of the luminances p1ij of the estimated region R 0 est in the reference image T 0 does not change much.
- the determination means 12 determines that the object O is not detected in the estimated region R 0 est of the reference image T 0 (Step S 8 ; NO).
- the object O is not immediately lost.
- Step S 11 since the object O is originally not detected by the object detecting means 10 , information about the relative positions thereof is correlated with information that the object is not detected and information that noise is included (Step S 11 ).
- the wiper W normally comes out of the image in one or several frames.
- the reference image T 0 and the comparative image T C return from the states shown in FIGS. 16A and 16B to the states shown in FIGS. 15A and 15B in the next frame. For this reason, as shown in FIG. 18 , the average value p2ij_ave of the luminances p2ij in the estimated region R C est of the comparative image T C substantially recovers to its original value, and the absolute value |p1ij_ave−p2ij_ave| of the difference between the average luminances becomes less than the threshold value Δpth again.
- the average luminances in the estimated regions also increase or decrease when the wiper W cuts in front of the main camera 2 a that takes the reference image T 0 .
- as shown in FIG. 19 , when the wiper W cuts in front of the pair of image taking means, the average luminances p1ij_ave and p2ij_ave in the estimated regions R 0 est and R C est of the reference image T 0 and the comparative image T C taken by the cameras 2 a and 2 b greatly decrease only for one frame in this embodiment.
- when the average luminance greatly decreases in this manner, the determination means 12 determines that the wiper W cuts in front of the main camera 2 a or the sub-camera 2 b to which the estimated region having the lower average luminance belongs. This makes it possible to detect that the wiper W has cut in front of the main camera 2 a or the sub-camera 2 b .
- when dirt adheres to the front glass in front of, for example, the sub-camera 2 b , the average p2ij_ave of the luminances p2ij in the estimated region R C est remains less than the average p1ij_ave of the luminances p1ij in the estimated region R 0 est correspondingly set in the reference image T 0 , as shown in FIG. 20 .
- when light is diffused because a raindrop adheres to the front glass in front of, for example, the sub-camera 2 b , the average p2ij_ave of the luminances p2ij in the estimated region R C est remains more than the average p1ij_ave of the luminances p1ij in the estimated region R 0 est correspondingly set in the reference image T 0 , as shown in FIG. 21 , contrary to the case shown in FIG. 20 .
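- taken together, FIGS. 19 to 21 suggest a simple classifier; the sketch below is a heuristic reading of those figures rather than the patent's exact test, and all thresholds are illustrative:

```python
def classify_noise(p1_hist, p2_hist, dip_thresh=40, bias_thresh=15):
    """Heuristic reading of FIGS. 19 to 21, not the patent's exact test.
    p1_hist / p2_hist hold at least two recent per-frame average luminances
    of R0est / RCest. A one-frame dip in one image suggests the wiper; a
    sustained lower average in the comparative image suggests dirt; a
    sustained higher average suggests a light-diffusing raindrop."""
    d1 = p1_hist[-2] - p1_hist[-1]          # latest drop in the reference image
    d2 = p2_hist[-2] - p2_hist[-1]          # latest drop in the comparative image
    if max(d1, d2) > dip_thresh:
        return "wiper in front of " + ("main camera" if d1 > d2 else "sub-camera")
    bias = sum(p2 - p1 for p1, p2 in zip(p1_hist, p2_hist)) / len(p1_hist)
    if bias < -bias_thresh:
        return "dirt on the glass in front of the sub-camera"   # FIG. 20
    if bias > bias_thresh:
        return "raindrop in front of the sub-camera"            # FIG. 21
    return "no single-camera noise"
```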
- in these cases, stereo matching is sometimes performed normally by the stereo matching means 7 , and an object can be detected in the reference image T 0 by the object detecting means 10 .
- a frame enclosing the object is set in the reference image T 0 .
- when the absolute value |p1ij_ave−p2ij_ave| of the difference between the average luminances of the estimated regions R 0 est and R C est in the reference image T 0 and the comparative image T C corresponding to the portion where the substance adheres is less than the threshold value Δpth (Step S 5 in FIG. 5 ; NO), a normal object detecting operation is performed by the determination means 12 (Steps S 6 , S 7 ).
- when the absolute value of the difference is equal to or more than the threshold value Δpth, the determination means 12 stores information about the relative positions (X1new, Y1new, Z1new) and (X2new, Y2new, Z2new) of both ends of the object in the memory in correlation with information that noise is included (Step S 9 ). Further, the information about the relative positions is output in correlation with the information that noise is included.
- when such a state continues, the determination means 12 determines that a substance adheres to the front glass in front of the main camera 2 a or the sub-camera 2 b . This makes it possible to detect that a substance adheres to the front glass in front of the main camera 2 a or the sub-camera 2 b .
- when the brightness of the external environment suddenly changes, the object, such as the preceding vehicle, itself becomes light or dark. Therefore, the average values p1ij_ave and p2ij_ave of the luminances p1ij and p2ij in the estimated regions R 0 est and R C est of the reference image T 0 and the comparative image T C including the image of the preceding vehicle decrease or increase simultaneously.
- in this case, the absolute value of the difference between the average luminances remains small, and the determination means 12 continues detection of an object, such as a preceding vehicle, by the basic operation in a normal image taking state (Steps S 6 , S 7 ). While the normal object detecting operation is thus performed in the object detecting system 1 of this embodiment even when the external environment suddenly changes, an operation different from the normal operation is performed when the average luminance decreases only in one of the estimated regions, as shown in FIGS. 17 to 19 .
- as described above, in the object detecting system 1 of this embodiment, estimated regions R 0 est and R C est , in which images of the object are expected to be taken in the current frame, are respectively set in the reference image T 0 and the comparative image T C on the basis of the result of the previous frame. Further, it is determined whether information about the object detected in the estimated region R 0 est or information that the object is not detected includes noise, on the basis of the absolute value |p1ij_ave−p2ij_ave| of the difference between the average luminances of the estimated regions. Therefore, it is possible to accurately distinguish between a case in which noise is caused by the wiper cutting in front of one of the image taking means or a substance adhering to the front glass and a case in which the external environment changes.
- since the object can be detected while noise and changes of the external environment are thus accurately distinguished, it is possible to accurately distinguish between a case in which only the external environment changes and information about the detected object or information that the object is not detected is highly reliable, and a case in which noise occurs and such information is not reliable.
- even when reliability of the information is low, information about the reliability can be output in proper correlation with the object information.
- when automatic running control of the vehicle is performed on the basis of the output information, the information can be used for automatic control after its reliability is recognized. This allows accurate automatic control.
- information about the object detected in the estimated region R 0 est of the reference image T 0 or information that the object is not detected is correlated with information that noise is included, as described above.
- information about the object can be correlated with reliability beforehand. By changing the reliability of information to a large value when the object is detected normally and to a small value when the object is not detected, the information can be correlated with information that noise is included.
- a signal can be transmitted from the determination means 12 to the stereo matching means 7 so as to tighten the criterion of stereo matching in the stereo matching means 7 , as shown in FIG. 22 .
- a threshold value of a SAD value serving as the criterion of stereo matching is changed to a smaller value, and a comparative pixel block PB C is specified in the comparative image T C only when it has a luminance pattern similar to that of a reference pixel block PB 0 in the reference image T 0 .
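- in code, this tightening is just a threshold swap feeding the matcher of Expression (1); both values below are illustrative, not from the patent:

```python
SAD_THRESH_NORMAL = 2000   # illustrative values only
SAD_THRESH_TIGHT = 800

def current_sad_threshold(noise_detected):
    """FIG. 22: when the determination means reports noise, the SAD threshold
    is lowered so that only comparative pixel blocks with a very similar
    luminance pattern are accepted (e.g. passed as sad_thresh to a matcher
    like the earlier sketch)."""
    return SAD_THRESH_TIGHT if noise_detected else SAD_THRESH_NORMAL
```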
- in this way, even when the absolute value |p1ij_ave−p2ij_ave| of the difference between the average luminances of the estimated regions R 0 est and R C est in the reference image T 0 and the comparative image T C is large, reliability of information about the distance of the detected object can be improved further.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Remote Sensing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Radar, Positioning & Navigation (AREA)
- Electromagnetism (AREA)
- Measurement Of Optical Distance (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Traffic Control Systems (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Geophysics And Detection Of Objects (AREA)
- Closed-Circuit Television Systems (AREA)
Claims (10)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2007280483A JP4856612B2 (en) | 2007-10-29 | 2007-10-29 | Object detection device |
JP2007-280483 | 2007-10-29 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20090244263A1 US20090244263A1 (en) | 2009-10-01 |
US8233031B2 true US8233031B2 (en) | 2012-07-31 |
Family
ID=40690141
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/260,538 Active 2031-05-23 US8233031B2 (en) | 2007-10-29 | 2008-10-29 | Object detecting system |
Country Status (3)
Country | Link |
---|---|
US (1) | US8233031B2 (en) |
JP (1) | JP4856612B2 (en) |
DE (1) | DE102008053472B4 (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120044327A1 (en) * | 2009-05-07 | 2012-02-23 | Shinichi Horita | Device for acquiring stereo image |
US9066085B2 (en) | 2012-12-13 | 2015-06-23 | Delphi Technologies, Inc. | Stereoscopic camera object detection system and method of aligning the same |
US20170064286A1 (en) * | 2015-08-24 | 2017-03-02 | Denso Corporation | Parallax detection device |
US20170109884A1 (en) * | 2013-04-15 | 2017-04-20 | Sony Corporation | Image processing apparatus, image processing method, and program |
US10217006B2 (en) * | 2015-08-31 | 2019-02-26 | Continental Automotive Gmbh | Method and device for detecting objects in the dark using a vehicle camera and a vehicle lighting system |
US10823950B2 (en) * | 2016-01-07 | 2020-11-03 | Digital Surigcals PTE. LTD. | Camera system with balanced monocular cues for use in digital stereo microscopes |
US10984509B2 (en) | 2016-01-28 | 2021-04-20 | Ricoh Company, Ltd. | Image processing apparatus, imaging device, moving body device control system, image information processing method, and program product |
US11004215B2 (en) | 2016-01-28 | 2021-05-11 | Ricoh Company, Ltd. | Image processing apparatus, imaging device, moving body device control system, image information processing method, and program product |
US20220026698A1 (en) * | 2020-07-24 | 2022-01-27 | United Scope LLC | Digital microscopy system and graphical user interface |
US11842501B2 (en) | 2019-03-11 | 2023-12-12 | Velo Software, Llc | Methods and systems of tracking velocity |
Families Citing this family (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010084521A1 (en) * | 2009-01-20 | 2010-07-29 | 本田技研工業株式会社 | Method and apparatus for identifying raindrops on a windshield |
JP2011013064A (en) * | 2009-07-01 | 2011-01-20 | Nikon Corp | Position detection device |
JP5371725B2 (en) * | 2009-12-16 | 2013-12-18 | 富士重工業株式会社 | Object detection device |
DE102010034262A1 (en) * | 2010-08-13 | 2012-02-16 | Valeo Schalter Und Sensoren Gmbh | Retrofit system for a motor vehicle and method for assisting a driver when driving a motor vehicle |
JP5644634B2 (en) * | 2011-03-30 | 2014-12-24 | アイシン・エィ・ダブリュ株式会社 | Vehicle information acquisition device, vehicle information acquisition method and program |
US9532011B2 (en) * | 2011-07-05 | 2016-12-27 | Omron Corporation | Method and apparatus for projective volume monitoring |
KR20130024504A (en) * | 2011-08-31 | 2013-03-08 | 삼성전기주식회사 | Stereo camera system and method for controlling convergence |
CN103473757B (en) * | 2012-06-08 | 2016-05-25 | 株式会社理光 | Method for tracing object in disparity map and system |
JP5957359B2 (en) * | 2012-10-19 | 2016-07-27 | 日立オートモティブシステムズ株式会社 | Stereo image processing apparatus and stereo image processing method |
JP6060021B2 (en) * | 2013-03-29 | 2017-01-11 | 株式会社日立ハイテクノロジーズ | Trajectory inspection method and apparatus |
CN104424648B (en) * | 2013-08-20 | 2018-07-24 | 株式会社理光 | Method for tracing object and equipment |
JP6315308B2 (en) * | 2014-01-15 | 2018-04-25 | 株式会社リコー | Control object identification device, mobile device control system, and control object recognition program |
JP2015225450A (en) * | 2014-05-27 | 2015-12-14 | 村田機械株式会社 | Autonomous traveling vehicle, and object recognition method in autonomous traveling vehicle |
JP2014238409A (en) * | 2014-07-23 | 2014-12-18 | 日立オートモティブシステムズ株式会社 | Distance calculation device and distance calculation method |
JP6570296B2 (en) * | 2015-04-09 | 2019-09-04 | キヤノン株式会社 | Image processing apparatus, image processing method, and program |
JP6775285B2 (en) * | 2015-09-24 | 2020-10-28 | アルパイン株式会社 | Rear side vehicle detection alarm device |
JP6660751B2 (en) * | 2016-02-04 | 2020-03-11 | 日立オートモティブシステムズ株式会社 | Imaging device |
US10571913B2 (en) | 2016-08-05 | 2020-02-25 | Aptiv Technologies Limited | Operation-security system for an automated vehicle |
JP6654999B2 (en) | 2016-12-12 | 2020-02-26 | 株式会社Soken | Target detection device |
JP6334773B2 (en) * | 2017-04-07 | 2018-05-30 | 日立オートモティブシステムズ株式会社 | Stereo camera |
JP6875940B2 (en) * | 2017-06-15 | 2021-05-26 | 株式会社Subaru | Mutual distance calculation device |
KR102372170B1 (en) * | 2017-06-26 | 2022-03-08 | 삼성전자주식회사 | Range hood and control method of thereof |
JP6627127B2 (en) * | 2017-11-27 | 2020-01-08 | 本田技研工業株式会社 | Vehicle control device, vehicle control method, and program |
JP7118717B2 (en) * | 2018-04-18 | 2022-08-16 | 日立Astemo株式会社 | Image processing device and stereo camera device |
US11182914B2 (en) | 2018-05-21 | 2021-11-23 | Facebook Technologies, Llc | Dynamic structured light for depth sensing systems based on contrast in a local area |
US10943129B2 (en) * | 2019-01-04 | 2021-03-09 | Ford Global Technologies, Llc | Low-light sensor cleaning |
US11222218B2 (en) * | 2019-03-11 | 2022-01-11 | Subaru Corporation | Vehicle exterior environment detection apparatus |
JP7247772B2 (en) * | 2019-06-13 | 2023-03-29 | 株式会社デンソー | Information processing device and driving support system |
WO2021008712A1 (en) * | 2019-07-18 | 2021-01-21 | Toyota Motor Europe | Method for calculating information relative to a relative speed between an object and a camera |
CN112799079B (en) * | 2019-10-24 | 2024-03-26 | 华为技术有限公司 | A data association method and device |
JP7457574B2 (en) * | 2020-05-21 | 2024-03-28 | 株式会社Subaru | Image processing device |
DE112021008172T5 (en) * | 2021-08-31 | 2024-07-18 | Mitsubishi Electric Corporation | OBSCURBATION ASSESSMENT DEVICE, OCCUPANT MONITORING DEVICE AND OBSCURBATION ASSESSMENT METHOD |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05114099A (en) | 1991-10-22 | 1993-05-07 | Fuji Heavy Ind Ltd | Distance detector for vehicle |
JPH05265547A (en) | 1992-03-23 | 1993-10-15 | Fuji Heavy Ind Ltd | On-vehicle outside monitoring device |
JPH06266828A (en) | 1993-03-12 | 1994-09-22 | Fuji Heavy Ind Ltd | Outside monitoring device for vehicle |
JPH10283461A (en) | 1997-04-04 | 1998-10-23 | Fuji Heavy Ind Ltd | Outside monitoring device |
JPH10283477A (en) | 1997-04-04 | 1998-10-23 | Fuji Heavy Ind Ltd | Outer-vehicle monitoring device |
JP2000123151A (en) | 1998-10-15 | 2000-04-28 | Nec Corp | Image processing method |
US6366691B1 (en) * | 1999-09-22 | 2002-04-02 | Fuji Jukogyo Kabushiki Kaisha | Stereoscopic image processing apparatus and method |
US6381360B1 (en) * | 1999-09-22 | 2002-04-30 | Fuji Jukogyo Kabushiki Kaisha | Apparatus and method for stereoscopic image processing |
JP2006072495A (en) | 2004-08-31 | 2006-03-16 | Fuji Heavy Ind Ltd | Three-dimensional object monitoring device |
JP2006107314A (en) | 2004-10-08 | 2006-04-20 | Nissan Motor Co Ltd | Image processing apparatus and image processing method |
US20090237491A1 (en) * | 2007-10-29 | 2009-09-24 | Toru Saito | Object Detecting System |
US8094934B2 (en) * | 2007-03-07 | 2012-01-10 | Fuji Jukogyo Kabushiki Kaisha | Object detection system |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07280561A (en) * | 1994-04-12 | 1995-10-27 | Nippon Soken Inc | Distance measuring apparatus for vehicle |
JP3384278B2 (en) * | 1997-03-24 | 2003-03-10 | 日産自動車株式会社 | Distance measuring device |
JP3092105B1 (en) * | 1999-07-30 | 2000-09-25 | 富士重工業株式会社 | Outside monitoring device with fail-safe function |
JP2004020237A (en) * | 2002-06-12 | 2004-01-22 | Fuji Heavy Ind Ltd | Vehicle management system |
2007
- 2007-10-29 JP JP2007280483A patent/JP4856612B2/en active Active
2008
- 2008-10-28 DE DE102008053472.2A patent/DE102008053472B4/en active Active
- 2008-10-29 US US12/260,538 patent/US8233031B2/en active Active
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05114099A (en) | 1991-10-22 | 1993-05-07 | Fuji Heavy Ind Ltd | Distance detector for vehicle |
JPH05265547A (en) | 1992-03-23 | 1993-10-15 | Fuji Heavy Ind Ltd | On-vehicle outside monitoring device |
JPH06266828A (en) | 1993-03-12 | 1994-09-22 | Fuji Heavy Ind Ltd | Outside monitoring device for vehicle |
JPH10283461A (en) | 1997-04-04 | 1998-10-23 | Fuji Heavy Ind Ltd | Outside monitoring device |
JPH10283477A (en) | 1997-04-04 | 1998-10-23 | Fuji Heavy Ind Ltd | Outer-vehicle monitoring device |
JP2000123151A (en) | 1998-10-15 | 2000-04-28 | Nec Corp | Image processing method |
US6366691B1 (en) * | 1999-09-22 | 2002-04-02 | Fuji Jukogyo Kabushiki Kaisha | Stereoscopic image processing apparatus and method |
US6381360B1 (en) * | 1999-09-22 | 2002-04-30 | Fuji Jukogyo Kabushiki Kaisha | Apparatus and method for stereoscopic image processing |
JP2006072495A (en) | 2004-08-31 | 2006-03-16 | Fuji Heavy Ind Ltd | Three-dimensional object monitoring device |
JP2006107314A (en) | 2004-10-08 | 2006-04-20 | Nissan Motor Co Ltd | Image processing apparatus and image processing method |
US8094934B2 (en) * | 2007-03-07 | 2012-01-10 | Fuji Jukogyo Kabushiki Kaisha | Object detection system |
US20090237491A1 (en) * | 2007-10-29 | 2009-09-24 | Toru Saito | Object Detecting System |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120044327A1 (en) * | 2009-05-07 | 2012-02-23 | Shinichi Horita | Device for acquiring stereo image |
US9066085B2 (en) | 2012-12-13 | 2015-06-23 | Delphi Technologies, Inc. | Stereoscopic camera object detection system and method of aligning the same |
US20170109884A1 (en) * | 2013-04-15 | 2017-04-20 | Sony Corporation | Image processing apparatus, image processing method, and program |
US10430944B2 (en) * | 2013-04-15 | 2019-10-01 | Sony Corporation | Image processing apparatus, image processing method, and program |
US20170064286A1 (en) * | 2015-08-24 | 2017-03-02 | Denso Corporation | Parallax detection device |
US10217006B2 (en) * | 2015-08-31 | 2019-02-26 | Continental Automotive Gmbh | Method and device for detecting objects in the dark using a vehicle camera and a vehicle lighting system |
US10823950B2 (en) * | 2016-01-07 | 2020-11-03 | Digital Surigcals PTE. LTD. | Camera system with balanced monocular cues for use in digital stereo microscopes |
US10984509B2 (en) | 2016-01-28 | 2021-04-20 | Ricoh Company, Ltd. | Image processing apparatus, imaging device, moving body device control system, image information processing method, and program product |
US11004215B2 (en) | 2016-01-28 | 2021-05-11 | Ricoh Company, Ltd. | Image processing apparatus, imaging device, moving body device control system, image information processing method, and program product |
US11842501B2 (en) | 2019-03-11 | 2023-12-12 | Velo Software, Llc | Methods and systems of tracking velocity |
US20220026698A1 (en) * | 2020-07-24 | 2022-01-27 | United Scope LLC | Digital microscopy system and graphical user interface |
US11782254B2 (en) * | 2020-07-24 | 2023-10-10 | United Scope LLC | Digital microscopy system and graphical user interface |
Also Published As
Publication number | Publication date |
---|---|
DE102008053472B4 (en) | 2018-10-04 |
DE102008053472A1 (en) | 2009-06-25 |
JP2009110173A (en) | 2009-05-21 |
US20090244263A1 (en) | 2009-10-01 |
JP4856612B2 (en) | 2012-01-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8233031B2 (en) | Object detecting system | |
US8244027B2 (en) | Vehicle environment recognition system | |
US8174563B2 (en) | Object detecting system | |
US11836989B2 (en) | Vehicular vision system that determines distance to an object | |
US8437536B2 (en) | Environment recognition system | |
US9740942B2 (en) | Moving object location/attitude angle estimation device and moving object location/attitude angle estimation method | |
CN103129555B (en) | lane tracking system | |
CN101405783B (en) | Road division line detector | |
US7619668B2 (en) | Abnormality detecting apparatus for imaging apparatus | |
US7623700B2 (en) | Stereoscopic image processing apparatus and the method of processing stereoscopic images | |
US8175334B2 (en) | Vehicle environment recognition apparatus and preceding-vehicle follow-up control system | |
US7580548B2 (en) | Abnormality detecting apparatus for imaging apparatus | |
US20100201814A1 (en) | Camera auto-calibration by horizon estimation | |
JPH10512694A (en) | Method and apparatus for detecting movement of an object in a continuous image | |
CN1954351A (en) | Lane boundary recognition apparatus for vehicle | |
US9398227B2 (en) | System and method for estimating daytime visibility | |
CN116648734A (en) | Correction of the surround view camera system image in rain, incident light and dirt | |
JP7227112B2 (en) | OBJECT DETECTION DEVICE, TRIP CONTROL SYSTEM, AND TRIP CONTROL METHOD | |
CN100576283C (en) | Driving dividing line recognition device for vehicles | |
JP2020087210A (en) | Calibration device and calibration method | |
JP2005127781A (en) | Ranging performance degradation detection device for vehicles | |
US20180060671A1 (en) | Image processing device, imaging device, equipment control system, equipment, image processing method, and recording medium storing program | |
JP4100336B2 (en) | Image processing device | |
CN111583387A (en) | Method and system for three-dimensional reconstruction of outdoor scene of unmanned aerial vehicle | |
US20230421739A1 (en) | Robust Stereo Camera Image Processing Method and System |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: FUJI JUKOGYO KABUSHIKI KAISHA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SAITO, TORU;REEL/FRAME:022806/0507 Effective date: 20090522 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
AS | Assignment |
Owner name: SUBARU CORPORATION, JAPAN Free format text: CHANGE OF NAME;ASSIGNOR:FUJI JUKOGYO KABUSHIKI KAISHA;REEL/FRAME:042624/0886 Effective date: 20170401 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 12 |