WO2016039077A1 - Image processing apparatus - Google Patents
Image processing apparatus
- Publication number
- WO2016039077A1 (application PCT/JP2015/072987)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- image
- processing
- distance
- vehicle
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
- G01C3/085—Use of electric radiation detectors with electronic parallax measurement
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
Definitions
- the present invention relates to an image processing apparatus, and more particularly to an image processing apparatus that processes images captured by a plurality of cameras mounted on a vehicle such as an automobile to recognize an environment around the vehicle.
- a system is known that detects vehicles existing around a host vehicle based on images obtained from a camera (an in-vehicle camera) mounted on the vehicle, such as an automobile, and uses the detection result for driver assistance.
- one example is an inter-vehicle distance maintenance system (adaptive cruise control, ACC).
- another is a collision avoidance system that alerts the driver with an alarm or the like and, when a collision becomes unavoidable, reduces injury to the occupants by automatic braking or the like.
- Patent Document 1 discloses a technique for measuring an accurate distance to an object by improving the distance accuracy when measuring the distance to an object appearing in an image.
- the image processing apparatus disclosed in Patent Document 1 extracts, from one image of a pair of images captured at the same time in the same direction by a pair of imaging elements, an image object region including an image of an object. For each of a plurality of image components constituting that image object region, it calculates a background degree, that is, the likelihood of whether the component belongs to the object or to the background, and uses the background degree when collating with the other image.
- a conventional image processing apparatus used for object (particularly vehicle) detection using a stereo camera apparatus generally has the apparatus configuration shown in FIG. 7. That is, the conventional image processing apparatus 50′ mainly includes a right image acquisition unit 501′, a left image acquisition unit 502′, and a parallax information acquisition unit 503′ constituting the image processing unit 52′, and a vehicle detection unit 504′, a distance calculation unit 505′, a collation processing unit 506′, a relative speed calculation unit 507′, and a calculation result output unit 508′ constituting the arithmetic processing unit 53′.
- the left image acquisition unit 502 ′ and the right image acquisition unit 501 ′ acquire images from a pair of left and right cameras mounted on the vehicle.
- the parallax information acquisition unit 503 ′ calculates the parallax by collating the two images captured by the left and right cameras, and outputs a parallax image.
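- the collation between the left and right images that produces such a parallax image is typically a small-block matching search along the horizontal direction. The following is a hedged sketch of that idea only, not the patent's implementation; the block size and maximum disparity are illustrative assumptions:

```python
import numpy as np

def disparity_sad(left, right, block=5, max_disp=16):
    """Dense disparity by SAD block matching of a rectified stereo pair.

    For each pixel of the left image, a small block around it is
    compared against horizontally shifted blocks in the right image,
    and the shift (disparity) with the smallest sum of absolute
    differences is kept.
    """
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            ref = left[y - half:y + half + 1,
                       x - half:x + half + 1].astype(np.int32)
            best_sad, best_d = None, 0
            for d in range(max_disp):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1].astype(np.int32)
                sad = int(np.abs(ref - cand).sum())
                if best_sad is None or sad < best_sad:
                    best_sad, best_d = sad, d
            disp[y, x] = best_d
    return disp
```

- as the document notes below, this fine-region collation over the whole image is heavy; in practice it is done by dedicated image-processing hardware or optimized routines rather than loops like these.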
- the vehicle detection unit 504′ detects and recognizes vehicles from the parallax information (parallax image) obtained as the calculation result and from the left and right images (the left image and the right image captured by the left and right cameras), or from one of them, and outputs the positions (positions in the image) of the vehicle candidate group.
- the distance calculation unit 505′ calculates and outputs the distance to the vehicle (the distance of each vehicle candidate) using the vehicle detection result (the positions of the vehicle candidate group), the parallax information (parallax image), and the like.
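- the distance itself follows from standard stereo geometry: for a disparity d, a focal length f expressed in pixels, and a camera baseline B, the relative distance is Z = f·B/d. A minimal sketch with illustrative camera parameters (not values from the patent):

```python
def distance_from_disparity(disparity_px, focal_px=1000.0, baseline_m=0.35):
    """Relative distance from stereo geometry: Z = f * B / d.

    disparity_px : disparity of the object in pixels
    focal_px     : focal length expressed in pixels (illustrative default)
    baseline_m   : distance between the two cameras (illustrative default)
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# with these illustrative parameters, a 10 px disparity gives about 35 m
```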
- the calculation result is recorded in the registration table 509 ′.
- the collation processing unit 506′ associates the detection result in a certain frame with the detection result in the next frame, using the position information of the vehicle detected between frames (for example, the positions and distances of the vehicle candidate group in a certain frame T and in the past frame T-1) and information on how the image appears. When there is no past information, the collation processing by the collation processing unit 506′ is not performed.
- the relative speed calculation unit 507′ recognizes the rate of change in distance using the correspondence between frames established by the collation processing unit 506′ (that is, the positions and distances of the vehicle candidate group associated between frames), calculates the relative speed of the vehicle, and records the relative speed obtained in the frame in the registration table 509′ together with information such as the position and distance of the vehicle. Note that the relative speed can also be obtained from the enlargement ratio of the image, in which case either the right image or the left image may be used. The calculation result output unit 508′ then outputs the calculation result to a control processing unit or the like. The positions, distances, relative speeds, and the like of the vehicle candidate group measured in a certain frame and recorded in the registration table 509′ are used for processing in the next frame.
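- the distance-based part of this relative speed calculation reduces to differencing the distances of the associated candidate between frames. A minimal sketch, assuming a fixed frame interval (the 30 fps interval is an illustrative assumption):

```python
def relative_speed_mps(dist_prev_m, dist_curr_m, frame_dt_s=1.0 / 30.0):
    """Relative speed from the change in measured distance between
    frame T-1 and frame T; negative means the object is approaching."""
    return (dist_curr_m - dist_prev_m) / frame_dt_s
```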
- this collation is performed within a predetermined search range, but the processing becomes heavy because the fine-region collation described above is performed over the entire target image to be viewed stereoscopically.
- this processing is typically performed by an element dedicated to image processing, but the calculation load is known to remain large even so.
- the image processing by the conventional image processing apparatus 50 ′ will be described in more detail in time series.
- the processing in the image processing unit 52′ is referred to as the S1′ series, and the processing in the arithmetic processing unit 53′ as the S2′ series.
- first, in the image processing unit 52′ (S1′ series), the right image input process S501′ from the right camera and the left image input process S502′ from the left camera are performed, and both images are accumulated.
- at this stage, the parallax calculation process S503′ is performed.
- as a result of the parallax calculation process S503′, the left and right images (or one of them) and the corresponding parallax information (parallax image) are obtained.
- the processing result of the parallax calculation process S503′ is transferred to the arithmetic processing unit 53′ at timing T1′; the arithmetic processing unit 53′ (S2′ series) waits for the end of the transfer and then performs the vehicle detection/recognition process S504′. After that, the distance calculation process S505′, the collation process S506′, the relative speed calculation process S507′, and the calculation result output process S508′ are performed.
- the delay in response from image input from the camera to calculation result output is a problem not only when the stereo camera device alone communicates with a vehicle control device or the like, but also in a sensor fusion system in which the stereo camera device is combined with other sensors and provided to a vehicle: the difference in response between sensors becomes large, making it difficult to integrate their information.
- the present invention has been made in view of the above problems, and its object is to provide an image processing apparatus capable of suppressing the delay in response from image input from the camera to calculation result output while maintaining recognition accuracy such as distance accuracy, thereby meeting mutually contradictory requirements such as improved recognition accuracy, reduced calculation load (improved processing efficiency), and improved response speed.
- an image processing apparatus according to the present invention processes a plurality of images captured in synchronization by a plurality of cameras and measures a relative distance or a relative speed to an object shown in the images. The apparatus includes: an object tracking unit that performs tracking processing for obtaining current position information of the object based on any one of the plurality of images and past position information of the object; a first distance calculation unit that calculates a relative distance to the object based on the current position information and at least two of the plurality of images; an object detection unit that detects the object by obtaining its current position information based on parallax information obtained from the at least two images; a second distance calculation unit that calculates a relative distance to the object based on the detection result of the object detection unit; and a correction processing unit that corrects at least one of the current position information and the relative distance based on the current position information obtained by the object tracking unit, the current position information obtained by the object detection unit, and the relative distance calculated by the first distance calculation unit.
- the processing start timing of the first distance calculation unit is set before the processing end timing of the object detection unit, in particular before the processing start timing of the object detection unit.
- FIG. 1 is a system configuration diagram schematically showing a system configuration of a stereo camera system to which Embodiment 1 of an image processing apparatus according to the present invention is applied.
- FIG. 2 is an internal configuration diagram schematically showing an internal configuration of the image processing apparatus shown in FIG. 1.
- FIG. 3 is an internal configuration diagram showing the internal configuration of the image processing apparatus shown in FIG. 2 in more detail.
- FIG. 4 is a timing chart for explaining the processing contents of the image processing apparatus shown in FIG. 1 in time series.
- FIG. 5 is a timing chart for explaining the processing contents of the image processing apparatus according to the second embodiment of the present invention in time series.
- FIG. 6 is a timing chart for explaining the processing contents of the image processing apparatus according to the third embodiment of the present invention in time series.
- FIG. 7 is an internal configuration diagram showing the internal configuration of a conventional image processing apparatus.
- FIG. 8 is a schematic diagram illustrating the general processing content of a parallax information acquisition unit.
- FIG. 1 schematically shows a system configuration of a stereo camera system to which Embodiment 1 of an image processing apparatus according to the present invention is applied.
- the illustrated stereo camera system 1 is mounted on a vehicle V such as an automobile and mainly includes a stereo camera device 11 composed of a plurality of (two in this embodiment) cameras, an image processing device 10 that processes the plurality of images captured in synchronization by the cameras of the stereo camera device 11, and a control device 12 that controls various devices mounted on the vehicle V (for example, the accelerator 13, the brake 14, the speaker 15, and the steering 16) based on control signals generated by the image processing device 10.
- the stereo camera device 11 is installed facing the front of the vehicle V, for example near the upper portion of the windshield, and is communicably connected to the image processing device 10, the control device 12, the accelerator 13, the brake 14, and the speaker 15.
- the stereo camera device 11 includes a right camera 11a and a left camera 11b as a pair of imaging means for capturing an image of the front of the vehicle V and acquiring image information.
- each of the right camera 11a and the left camera 11b has an image sensor such as a CCD and is installed so as to image the front of the vehicle V from positions separated from each other in the vehicle width direction (left-right direction) (see FIG. 2).
- the image processing apparatus 10 is an apparatus for recognizing the environment outside the vehicle based on image information of the imaging target area ahead of the vehicle V, acquired in time series at a predetermined cycle by the stereo camera apparatus 11. For example, it recognizes various objects such as white lines on the road, pedestrians, vehicles, other three-dimensional objects, signals, signs, and lighting lamps, generates control signals based on the recognition results, and outputs them to the control device 12. Based on the control signal received from the image processing apparatus 10, the control device 12 adjusts the accelerator 13, the brake 14, the steering 16, and the like of the vehicle V (the host vehicle).
- the image processing apparatus 10 and the control apparatus 12 may be incorporated in the stereo camera apparatus 11 so that the stereo camera apparatus 11 itself may also perform processing, or may be incorporated in an integrated controller or the like.
- FIG. 2 schematically shows the internal configuration of the image processing apparatus shown in FIG.
- the illustrated image processing apparatus 10 mainly includes an image input interface 21, an image processing unit 22, an arithmetic processing unit 23, a storage unit 24, a CAN interface 25, and a control processing unit 26.
- these components are communicably connected to one another via the bus 20.
- the image input interface 21 controls the imaging of the stereo camera device 11 and captures images captured by the stereo camera device 11 (each camera 11a, 11b).
- the images from the cameras 11a and 11b captured through the image input interface 21 are transferred through the bus 20 and processed by the image processing unit 22 and the arithmetic processing unit 23, and image data and the like constituting intermediate or final results are stored in the storage unit 24.
- the image processing unit 22 compares the right image obtained from the image sensor of the right camera 11a of the stereo camera device 11 with the left image obtained from the image sensor of the left camera 11b, performs image corrections such as correcting device-specific deviations of the image sensors and interpolating noise, and stores the results in the storage unit 24. It also finds mutually corresponding portions between the right image and the left image, calculates disparity information to generate a disparity image (parallax information), and likewise stores it in the storage unit 24.
- the arithmetic processing unit 23 uses the various images and the parallax information (distance information for each point on the image) stored in the storage unit 24 to recognize the various objects necessary for perceiving the environment around the vehicle V.
- the various objects include people, vehicles, other obstacles, traffic lights, signs, vehicle tail lamps, headlights, and the like.
- the control processing unit 26 generates a control signal for controlling braking or the like of the vehicle V using the recognition results stored in the storage unit 24, and the control signal related to the control policy of the vehicle V and part of the object recognition results are transmitted to the in-vehicle network CAN 27 via the CAN interface 25 and transferred from there to the control device 12 and the like.
- FIG. 3 shows the internal configuration of the image processing apparatus shown in FIG. 2 more specifically.
- here, a case will be specifically described in which a preceding vehicle existing in front of the vehicle V is detected as the object and the relative distance and relative speed from the vehicle V to the preceding vehicle are recognized.
- the image processing apparatus 10 mainly includes a right image acquisition unit 101, a left image acquisition unit 102, and a parallax information acquisition unit 103 constituting the image processing unit 22, and a vehicle tracking unit 104, a first distance calculation unit 105, a relative speed calculation unit 106, a vehicle detection unit 107, a second distance calculation unit 108, a correction processing unit 109, and a calculation result output unit 110 constituting the arithmetic processing unit 23.
- the first recognition process means a process of detecting the target vehicle for the first time, from a state in which it has not been detected in the past.
- the right image acquisition unit 101 and the left image acquisition unit 102 acquire the right image and the left image captured by the right camera 11a and the left camera 11b from the storage unit 24.
- the right image acquisition unit 101 and the left image acquisition unit 102 may acquire the right image and the left image directly from the stereo camera device 11 (each camera 11a, 11b) via the image input interface 21.
- the parallax information acquisition unit 103 performs parallax calculation using the right image and the left image acquired by the right image acquisition unit 101 and the left image acquisition unit 102, and generates a parallax image (parallax information).
- in the first recognition process, the vehicle detection unit 107 uses these results to obtain the current position information (position information in the image) of the vehicle shown in the image, thereby detecting and recognizing the vehicle.
- the second distance calculation unit 108 calculates the distance to the preceding vehicle using the detection result.
- the vehicle detection unit 107 and the second distance calculation unit 108 register the position information and distance information, which are the calculation results, in the registration table 111 as the storage unit 24 in order to use the information as past position information of the object.
- in order to improve processing efficiency, once the right image acquisition unit 101 has acquired the right image, the vehicle tracking unit 104 performs a vehicle tracking process for obtaining the current position information (position information in the image) of the preceding vehicle, using the right image and the results already registered in the registration table 111 (that is, the past position information of the object).
- the vehicle tracking unit 104 searches the vicinity of the position included in the detected result using the information of the right image, and specifies the position of the preceding vehicle in the current frame.
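- such a restricted search can be realized, for example, by template matching within a small margin around the past position. The sketch below is one hedged illustration of the idea, not the patent's implementation; the margin size is an illustrative assumption:

```python
import numpy as np

def track_near(image, template, prev_xy, margin=8):
    """Locate `template` in `image` by SAD matching, searching only a
    small margin around the previous top-left position prev_xy = (x, y).
    """
    th, tw = template.shape
    h, w = image.shape
    px, py = prev_xy
    tpl = template.astype(np.int32)
    best_sad, best_xy = None, prev_xy
    for y in range(max(0, py - margin), min(h - th, py + margin) + 1):
        for x in range(max(0, px - margin), min(w - tw, px + margin) + 1):
            patch = image[y:y + th, x:x + tw].astype(np.int32)
            sad = int(np.abs(patch - tpl).sum())
            if best_sad is None or sad < best_sad:
                best_sad, best_xy = sad, (x, y)
    return best_xy
```

- because the search is confined to the vicinity of a known position, the cost is far smaller than a full-image detection pass, which is the point the document makes next.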
- the first distance calculation unit 105 calculates the distance to the preceding vehicle using the position information obtained by the vehicle tracking unit 104 and the parallax image (parallax information) generated by the parallax information acquisition unit 103. Then, the relative speed calculation unit 106 calculates the relative speed of the preceding vehicle using the calculation result by the first distance calculation unit 105 and the calculation result calculated by the first distance calculation unit 105 in the past. Then, the calculation result output unit 110 outputs the calculation result by the first distance calculation unit 105 and the calculation result by the relative speed calculation unit 106 to the control processing unit 26 via the storage unit 24 and the like.
- unlike the collation processing unit 506′ of the conventional image processing apparatus 50′ described with reference to FIG. 7, the vehicle tracking unit 104 does not rely on the vehicle detection result of the current frame; instead, it specifies the position of the object (preceding vehicle) in the current frame within its own processing, from the past information already registered in the registration table 111.
- the vehicle detection/recognition process by the vehicle detection unit 107 detects and recognizes vehicles using a dictionary (image case dictionary) in which a large number of vehicle image examples are registered, in order to recognize vehicle images of various designs under various external environments. Its processing accuracy is therefore good, but the calculation amount tends to be large and the processing time long. In contrast, in the vehicle tracking process by the vehicle tracking unit 104, an image of a vehicle that has already been detected is tracked only in its vicinity, so the calculation amount is small and the processing time from image input to calculation result output can be made very short.
- as described above, the vehicle detection unit 107 obtains the position of the preceding vehicle in the current frame using the result of the parallax calculation by the parallax information acquisition unit 103, thereby detecting and recognizing the preceding vehicle, and the second distance calculation unit 108 calculates the distance to the preceding vehicle using the detection result.
- the vehicle tracking unit 104 has already obtained the position of the object (preceding vehicle) in the current frame, and the first distance calculation unit 105 has already obtained the distance to the object (preceding vehicle).
- the correction processing unit 109 compares the results of the vehicle detection unit 107 and the second distance calculation unit 108 with the results of the vehicle tracking unit 104 and the first distance calculation unit 105, and decides which position information and distance information to use in the next frame. For example, the correction processing unit 109 can give priority to one of them and use it in the next frame, or use the internal dividing point of the positions obtained by both in the next frame; however, the correction method is not particularly limited.
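- as one concrete illustration of using an internal dividing point, the two position estimates can be blended linearly. The weight below is an illustrative assumption, not a value given in the patent:

```python
def correct_position(tracked_xy, detected_xy, weight=0.5):
    """Blend the tracker's and the detector's position estimates.

    weight = 1.0 keeps only the tracker's result, 0.0 only the
    detector's; values in between give an internal dividing point.
    """
    tx, ty = tracked_xy
    dx, dy = detected_xy
    return (weight * tx + (1.0 - weight) * dx,
            weight * ty + (1.0 - weight) * dy)
```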
- the correction processing unit 109 can also correct the result obtained by the relative speed calculation unit 106 based on the result of the second distance calculation unit 108 and the result of the first distance calculation unit 105.
- the correction processing unit 109 saves the obtained results (position information, distance information, relative speed information, and the like) in the registration table 111 for use as past position information of the object in the vehicle tracking process of the next frame.
- the image processing by the image processing apparatus 10 will be described in more detail in time series.
- the process in the image processing unit 22 is the S1 series and the process in the arithmetic processing unit 23 is the S2 series.
- first, in the image processing unit 22 (S1 series), a right image input process S101 from the right camera 11a and a left image input process S102 from the left camera 11b are performed, and at the stage where both images have been accumulated, the parallax calculation process S103 is performed.
- as a result of the parallax calculation process S103, the left and right images (or one of them) and the corresponding parallax information (parallax image) are obtained.
- the processing result in the parallax calculation processing S103 is transferred to the arithmetic processing unit 23 at timing T1.
- the processing flow so far is the same as the processing flow of the conventional image processing apparatus 50 ′ described with reference to FIG.
- in the present embodiment, at a timing T0 that is before the timing T1 (the transfer timing of the processing result of the parallax calculation process S103, after its end) and after the right image input process S101 has ended, image-only data not including stereoscopic data (parallax information) is transferred from the image processing unit 22 (S1 series) to the arithmetic processing unit 23 (S2 series). Since a vehicle that has already been detected once can be tracked using only an image, the arithmetic processing unit 23 (S2 series) performs the vehicle tracking process S104 for obtaining the position of the vehicle in the current frame in parallel with the left image input process S102 and the parallax calculation process S103 in the image processing unit 22 (S1 series). Thereafter, the first distance calculation process S105 and the relative speed calculation process S106 are performed using the parallax information transferred at timing T1 after the parallax calculation process S103, and at this stage the calculation result output process S110 for outputting the calculation results of the first distance calculation process S105 and the relative speed calculation process S106 is performed. The delay from image input to calculation result output to the control processing unit 26 or the like is thereby suppressed.
- the vehicle detection / recognition process S107 is performed using the parallax information, and the second distance calculation process S108 is performed to calculate the distance to the preceding vehicle using the detection result.
- then, the correction process S109 for determining the position to be used in the next frame is performed, and the processing in this frame ends.
- the image processing in the image processing apparatus 10 is subdivided and reconfigured by dividing it into a monocular processing flow and a stereoscopic processing flow.
- specifically, the processing start timing of the vehicle tracking unit 104 is set before the processing end timing of the parallax information acquisition unit 103 (in particular, before its processing start timing), and the processing start timing of the first distance calculation unit 105 is set before the processing end timing of the vehicle detection unit 107 (in particular, before its processing start timing), that is, before the vehicle detection/recognition process S107 by the vehicle detection unit 107 ends.
- FIG. 5 illustrates processing contents of the image processing apparatus according to the second embodiment of the present invention in time series.
- the image processing apparatus of the second embodiment is different from the image processing apparatus 10 of the first embodiment described above mainly in the distance calculation processing method by the first distance calculation unit and the processing start / end timing. That is, the apparatus configuration of the image processing apparatus according to the second embodiment is the same as the apparatus configuration of the image processing apparatus 10 according to the first embodiment, and thus detailed description thereof will be omitted.
- hereinafter, the processing contents of the image processing by this image processing apparatus will be specifically described.
- in this embodiment, at timing T0A, which is before timing T1A (the transfer timing of the processing result of the parallax calculation process S103A) and after the right image input process S101A ends, image-only data (the right image) not including stereoscopic data (parallax information) is transferred from the image processing unit 22 (S1A series) to the arithmetic processing unit 23 (S2A series), and the vehicle tracking process S104A for obtaining the position of the vehicle in the current frame is performed.
- similarly, at a timing T2A after the left image input process S102A ends, image-only data not including parallax information (the left image alone, or both the right and left images) is transferred from the image processing unit 22 (S1A series) to the arithmetic processing unit 23 (S2A series).
- if the right image and the left image are available, the distance measurement and the relative speed measurement can be performed by limiting the calculation range, based on the same principle as described with reference to FIG. 8, and can generally be carried out in a shorter processing time than the parallax calculation process S103A.
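- limiting the calculation range can mean, for example, estimating a single disparity only for the tracked object's region instead of computing a full parallax image. The sketch below illustrates that idea only (the ROI layout and search range are illustrative assumptions, not the patent's implementation):

```python
import numpy as np

def roi_disparity(left, right, roi, max_disp=32):
    """Estimate a single disparity for a tracked region of interest
    instead of computing a full parallax image.

    roi = (x, y, w, h): the tracked object's box in the left image.
    """
    x, y, w, h = roi
    ref = left[y:y + h, x:x + w].astype(np.int32)
    best_sad, best_d = None, 0
    for d in range(min(max_disp, x) + 1):
        cand = right[y:y + h, x - d:x - d + w].astype(np.int32)
        sad = int(np.abs(ref - cand).sum())
        if best_sad is None or sad < best_sad:
            best_sad, best_d = sad, d
    return best_d
```

- matching one small box against a one-dimensional range of shifts is far cheaper than the dense per-pixel collation of a full parallax calculation, which is why this path can finish earlier.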
- the first distance calculation process S105A and the relative speed calculation process S106A are performed using the position information obtained in the vehicle tracking process S104A and the right image and the left image transferred at the timings T0A and T2A.
- a calculation result output process S110A for outputting a calculation result by the first distance calculation process S105A and the relative speed calculation process S106A is performed. Thereby, the processing time from the image input to the calculation result output to the control processing unit 26 or the like can be further shortened.
- in parallel, the vehicle detection/recognition process S107A is performed using the parallax information of the frame transferred at timing T1A, after the end of the parallax calculation process S103A, and the second distance calculation process S108A is performed to calculate the distance to the preceding vehicle using the detection result. Then, based on the result of the vehicle tracking process S104A described above and the result of the vehicle detection/recognition process S107A, the correction process S109A for determining the position to be used in the next frame is performed.
- the processing start timing of the first distance calculation unit is set before the processing end timing of the parallax information acquisition unit, and the timing is the transfer timing of the processing result in the parallax calculation processing S103A.
- each image is transferred as soon as the image input process from the corresponding camera is completed, before T1A. Because the vehicle tracking process S104A is completed, and the first distance calculation process S105A and the relative speed calculation process S106A are started as soon as each image is transferred, all before the parallax calculation process S103A finishes, the response time from image input to calculation result output can be further shortened while maintaining the recognition accuracy of the distance and the like.
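The timing relationship above can be modeled as a toy schedule. The durations below are invented for the sketch; only the ordering constraints come from the text — tracking and the first distance calculation start as soon as images are transferred, so the result output S110A need not wait for the parallax calculation S103A to finish:

```python
def finish_times(events):
    """events: (name, start, duration) tuples on a shared clock -> name: finish time."""
    return {name: start + dur for name, start, dur in events}

# Invented durations; only the ordering reflects the text. Tracking (S104A),
# the first distance calculation (S105A), the relative speed (S106A) and the
# result output (S110A) all run while the parallax calculation (S103A) is
# still in progress, so the output is ready no later than the parallax end.
frame = finish_times([
    ("S101A_right_input",    0, 2),
    ("S102A_left_input",     0, 2),
    ("S103A_parallax",       2, 6),  # dense disparity: the long pole
    ("S104A_tracking",       2, 2),  # starts as soon as the images arrive
    ("S105A_first_distance", 4, 2),
    ("S106A_rel_speed",      6, 1),
    ("S110A_output",         7, 1),
])
```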
- FIG. 6 illustrates processing contents of the third embodiment of the image processing apparatus according to the present invention in time series.
- the image processing apparatus according to the third embodiment differs from the image processing apparatus according to the second embodiment mainly in that the arithmetic processing by the arithmetic processing unit is performed using two CPUs. Since the apparatus configuration of the image processing apparatus according to the third embodiment is the same as that of the image processing apparatuses according to the first and second embodiments, detailed description thereof will be omitted.
- in the following, the processing contents of the image processing performed by the image processing apparatus will be specifically described.
- the monocular vehicle tracking process and its related processes, and the stereoscopic vehicle detection / recognition process and its related processes, use different data, and the vehicle tracking process does not require stereo information. Since the two groups of processes are independent of each other, the vehicle tracking process and the vehicle detection / recognition process in the arithmetic processing unit 23 can be executed independently on the two CPUs.
- the process in the image processing unit 22 is an S1B series
- the process in the arithmetic processing unit 23 is divided into the S2aB series and the S2bB series, each running on one of the two CPUs.
- the image processing unit 22 (S1B series) performs a right image input process S101B, a left image input process S102B, and a parallax calculation process S103B.
- the processing result (parallax information) is transferred to the S2bB series of the arithmetic processing unit 23.
- at timing T0B (after one image has been acquired) and at timing T2B (after both images have been acquired), image-only data containing no parallax information is transferred from the image processing unit 22 (S1B series) to the S2aB series of the arithmetic processing unit 23.
- the vehicle tracking process S104B and the first distance calculation process S105B are performed.
- the relative speed calculation process S106B and the calculation result output process S110B are performed.
- the results of the vehicle tracking process S104B and the like are transferred from the S2aB series of the arithmetic processing unit 23 to the S2bB series.
- the vehicle detection / recognition process S107B and the second distance calculation process S108B are performed using the disparity information of the frame transferred at timing T1B, after the end timing of the parallax calculation process S103B. Thereafter, the correction process S109B, which determines the position to be used in the next frame, is performed based on the result of the vehicle detection / recognition process S107B and the result of the vehicle tracking process S104B transferred from the S2aB series.
- in this way, the vehicle tracking process and its related processes, and the vehicle detection / recognition process and its related processes, in the arithmetic processing unit 23 are each performed by one of the two CPUs.
- even if the vehicle tracking process S104B takes time and the processing time from vehicle tracking to calculation result output becomes long — which with a single CPU would delay the execution of the vehicle detection / recognition process — the vehicle detection / recognition process can be performed in parallel with the vehicle tracking process. Therefore, the processing time (processing cycle) of the entire image processing by the image processing apparatus can be shortened, and the processing efficiency can be effectively increased.
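A minimal sketch of this two-series split, with threads standing in for the two CPUs (a real deployment would pin the series to separate cores or processes; every function body here is a hypothetical stand-in, not the patent's algorithm):

```python
from concurrent.futures import ThreadPoolExecutor

def track_vehicle(prev_pos, image):
    # Stand-in for S104B/S105B: monocular tracking, needs no disparity data.
    return {"pos": prev_pos + 1, "dist_first": 42.0}

def detect_vehicle(disparity):
    # Stand-in for S107B/S108B: stereo detection on the parallax data.
    return {"pos": 11, "dist_second": 41.5}

def process_frame(prev_pos, image, disparity):
    """Run the S2aB and S2bB series concurrently; they share no data until
    the correction step, which is what makes the two-CPU split possible."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        f_track = pool.submit(track_vehicle, prev_pos, image)
        f_detect = pool.submit(detect_vehicle, disparity)
        t, d = f_track.result(), f_detect.result()
    # Stand-in for the correction process S109B: fuse the two estimates.
    return {"pos": d["pos"], "dist": (t["dist_first"] + d["dist_second"]) / 2}
```

Even if the tracking worker runs long, the detection worker proceeds in parallel, which is the whole point of the split described above.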
- in the above description, the right image is acquired before the left image and the vehicle tracking process is performed using the right image; when the left image is acquired before the right image, the vehicle tracking process may be performed using the left image.
- the present invention is not limited to the first to third embodiments described above, and includes various modifications.
- the first to third embodiments described above have been described in detail for ease of understanding of the present invention, and the invention is not necessarily limited to embodiments having all of the configurations described.
- a part of the configuration of an embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of an embodiment.
- control lines and information lines indicate what is considered necessary for the explanation, and not all the control lines and information lines on the product are necessarily shown. Actually, it may be considered that almost all the components are connected to each other.
Abstract
Description
FIG. 1 schematically shows the system configuration of a stereo camera system to which Embodiment 1 of the image processing apparatus according to the present invention is applied. The illustrated stereo camera system 1 is mounted on a vehicle V such as an automobile and mainly comprises: a stereo camera device 11 composed of a plurality of cameras (two in this embodiment); an image processing apparatus 10 that processes a plurality of images captured in synchronization by the cameras of the stereo camera device 11; and a control device 12 that controls various devices mounted on the vehicle V (for example, an accelerator 13, a brake 14, a speaker 15, a steering 16, and the like) based on control signals generated by the image processing apparatus 10.
FIG. 5 illustrates, in time series, the processing contents of Embodiment 2 of the image processing apparatus according to the present invention. The image processing apparatus of Embodiment 2 differs from the image processing apparatus 10 of Embodiment 1 mainly in the processing method and the processing start/end timings of the distance calculation process performed by the first distance calculation unit. Since the apparatus configuration of Embodiment 2 is the same as that of the image processing apparatus 10 of Embodiment 1, its detailed description is omitted, and the processing contents of the image processing performed by the image processing apparatus are specifically described below.
FIG. 6 illustrates, in time series, the processing contents of Embodiment 3 of the image processing apparatus according to the present invention. The image processing apparatus of Embodiment 3 differs from the image processing apparatus of Embodiment 2 mainly in that the arithmetic processing by the arithmetic processing unit is performed using two CPUs. Since the apparatus configuration of Embodiment 3 is the same as that of the image processing apparatuses of Embodiments 1 and 2, its detailed description is omitted, and the processing contents of the image processing performed by the image processing apparatus are specifically described below.
Claims (11)
- An image processing apparatus that processes a plurality of images captured in synchronization by a plurality of cameras and measures a relative distance or a relative speed to an object appearing in the images, the apparatus comprising:
an object tracking unit that performs a tracking process of obtaining current position information of the object based on any one of the plurality of images and past position information of the object;
a first distance calculation unit that calculates a relative distance to the object based on the current position information and at least two of the plurality of images;
an object detection unit that detects the object by obtaining current position information of the object based on parallax information obtained from at least two of the plurality of images;
a second distance calculation unit that calculates a relative distance to the object based on a detection result of the object; and
a correction processing unit that corrects at least one of the current position information and the relative distance, based on at least one of (a) the current position information obtained by the object tracking unit and the current position information obtained by the object detection unit, and (b) the relative distance calculated by the first distance calculation unit and the relative distance calculated by the second distance calculation unit,
wherein a processing start timing of the first distance calculation unit is set before a processing end timing of the object detection unit. - The image processing apparatus according to claim 1, wherein the processing start timing of the first distance calculation unit is set at or before a processing start timing of the object detection unit.
- The image processing apparatus according to claim 1, further comprising a calculation result output unit that outputs the calculation result of the first distance calculation unit to the outside.
- The image processing apparatus according to claim 1, wherein the object tracking unit performs the tracking process using, as the past position information of the object, at least one of the current position information and the relative distance corrected by the correction processing unit.
- The image processing apparatus according to claim 1, further comprising a parallax information acquisition unit that acquires the parallax information from at least two of the plurality of images,
wherein a processing start timing of the object tracking unit is set before a processing end timing of the parallax information acquisition unit. - The image processing apparatus according to claim 5, wherein the processing start timing of the object tracking unit is set at or before a processing start timing of the parallax information acquisition unit.
- The image processing apparatus according to claim 5, wherein the processing start timing of the first distance calculation unit is set at or after the processing end timing of the parallax information acquisition unit.
- The image processing apparatus according to claim 5, wherein the processing start timing of the first distance calculation unit is set before the processing end timing of the parallax information acquisition unit.
- The image processing apparatus according to claim 1, wherein the processing by the object tracking unit and the first distance calculation unit and the processing by the object detection unit and the second distance calculation unit are performed in parallel.
- The image processing apparatus according to claim 1, further comprising a relative speed calculation unit that calculates a relative speed of the object based on the relative distance calculated by the first distance calculation unit.
- The image processing apparatus according to claim 10, wherein the correction processing unit corrects the relative speed of the object based on the relative distance calculated by the first distance calculation unit and the relative distance calculated by the second distance calculation unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP15839643.2A EP3193134B1 (en) | 2014-09-11 | 2015-08-17 | Image processing device |
JP2016547792A JP6279090B6 (ja) | 2014-09-11 | 2015-08-17 | 画像処理装置 |
US15/500,127 US10247551B2 (en) | 2014-09-11 | 2015-08-17 | Vehicle image processing device for environment recognition |
CN201580038805.8A CN106662441B (zh) | 2014-09-11 | 2015-08-17 | 图像处理装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014185577 | 2014-09-11 | ||
JP2014-185577 | 2014-09-11 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016039077A1 true WO2016039077A1 (ja) | 2016-03-17 |
Family
ID=55458835
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/072987 WO2016039077A1 (ja) | 2014-09-11 | 2015-08-17 | 画像処理装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10247551B2 (ja) |
EP (1) | EP3193134B1 (ja) |
JP (1) | JP6279090B6 (ja) |
CN (1) | CN106662441B (ja) |
WO (1) | WO2016039077A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6920159B2 (ja) * | 2017-09-29 | 2021-08-18 | 株式会社デンソー | 車両の周辺監視装置と周辺監視方法 |
JP6620175B2 (ja) * | 2018-01-19 | 2019-12-11 | 本田技研工業株式会社 | 距離算出装置及び車両制御装置 |
JP7042185B2 (ja) * | 2018-07-27 | 2022-03-25 | 日立Astemo株式会社 | 距離算出装置 |
JP7291505B2 (ja) * | 2019-03-19 | 2023-06-15 | 株式会社Subaru | 車外環境検出装置 |
CN114071132B (zh) * | 2022-01-11 | 2022-05-13 | 浙江华睿科技股份有限公司 | 一种信息延时的检测方法、装置、设备及可读存储介质 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002024807A (ja) * | 2000-07-07 | 2002-01-25 | National Institute Of Advanced Industrial & Technology | 物体運動追跡手法及び記録媒体 |
JP2003061075A (ja) * | 2001-08-09 | 2003-02-28 | Matsushita Electric Ind Co Ltd | 物体追跡装置、物体追跡方法および侵入者監視システム |
JP2008236642A (ja) * | 2007-03-23 | 2008-10-02 | Hitachi Ltd | 物体追跡装置 |
WO2013080745A1 (ja) * | 2011-11-30 | 2013-06-06 | 日立オートモティブシステムズ株式会社 | 物体検知装置 |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10141913A (ja) * | 1996-11-07 | 1998-05-29 | Komatsu Ltd | 移動体の位置、速度計測装置 |
JP5068782B2 (ja) * | 2009-03-25 | 2012-11-07 | 富士フイルム株式会社 | 距離測定方法および装置 |
JP5870510B2 (ja) * | 2010-09-14 | 2016-03-01 | 株式会社リコー | ステレオカメラ装置、校正方法およびプログラム |
JP5414714B2 (ja) * | 2011-03-04 | 2014-02-12 | 日立オートモティブシステムズ株式会社 | 車戴カメラ及び車載カメラシステム |
JP5472538B2 (ja) * | 2011-06-14 | 2014-04-16 | 日産自動車株式会社 | 距離計測装置及び環境地図生成装置 |
JP5587852B2 (ja) * | 2011-11-11 | 2014-09-10 | 日立オートモティブシステムズ株式会社 | 画像処理装置及び画像処理方法 |
JP5587930B2 (ja) * | 2012-03-09 | 2014-09-10 | 日立オートモティブシステムズ株式会社 | 距離算出装置及び距離算出方法 |
-
2015
- 2015-08-17 JP JP2016547792A patent/JP6279090B6/ja active Active
- 2015-08-17 CN CN201580038805.8A patent/CN106662441B/zh active Active
- 2015-08-17 EP EP15839643.2A patent/EP3193134B1/en active Active
- 2015-08-17 WO PCT/JP2015/072987 patent/WO2016039077A1/ja active Application Filing
- 2015-08-17 US US15/500,127 patent/US10247551B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002024807A (ja) * | 2000-07-07 | 2002-01-25 | National Institute Of Advanced Industrial & Technology | 物体運動追跡手法及び記録媒体 |
JP2003061075A (ja) * | 2001-08-09 | 2003-02-28 | Matsushita Electric Ind Co Ltd | 物体追跡装置、物体追跡方法および侵入者監視システム |
JP2008236642A (ja) * | 2007-03-23 | 2008-10-02 | Hitachi Ltd | 物体追跡装置 |
WO2013080745A1 (ja) * | 2011-11-30 | 2013-06-06 | 日立オートモティブシステムズ株式会社 | 物体検知装置 |
Non-Patent Citations (2)
Title |
---|
See also references of EP3193134A4 * |
SHIN'ICHI GOTO ET AL.: "Zen Hoi Stereo Gazo ni yoru Nigan Stereo to Motion Stereo o Heiyo shita 3 Jigen Keisoku", ITE TECHNICAL REPORT, vol. 34, no. 34, 31 August 2010 (2010-08-31), pages 81 - 84, XP009500946 * |
Also Published As
Publication number | Publication date |
---|---|
EP3193134A1 (en) | 2017-07-19 |
CN106662441B (zh) | 2019-06-18 |
US20180321030A1 (en) | 2018-11-08 |
CN106662441A (zh) | 2017-05-10 |
JPWO2016039077A1 (ja) | 2017-04-27 |
JP6279090B6 (ja) | 2018-06-27 |
EP3193134A4 (en) | 2018-05-16 |
US10247551B2 (en) | 2019-04-02 |
EP3193134B1 (en) | 2022-10-12 |
JP6279090B2 (ja) | 2018-02-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10486712B2 (en) | Evacuation control apparatus and evacuation control method | |
JP6279090B6 (ja) | 画像処理装置 | |
US10688997B2 (en) | Lane merging determination apparatus | |
JP6222785B2 (ja) | 操舵支援装置 | |
JP4211809B2 (ja) | 物体検出装置 | |
US9623869B2 (en) | Vehicle driving support control apparatus | |
JP6152673B2 (ja) | 車線変更支援装置 | |
JPWO2016039077A6 (ja) | 画像処理装置 | |
US9574538B2 (en) | Idling stop control system for vehicle | |
JP6313198B2 (ja) | 車両制御装置 | |
JP5545022B2 (ja) | 障害物認識装置 | |
US10871565B2 (en) | Object detection apparatus and object detection method | |
US10252715B2 (en) | Driving assistance apparatus | |
WO2016194900A1 (ja) | 車両制御装置、及び車両制御方法 | |
WO2018123641A1 (ja) | 走行可能領域検出装置及び走行支援システム | |
JP2018060422A (ja) | 物体検出装置 | |
JP2013161190A (ja) | 物体認識装置 | |
US11332135B2 (en) | Driving assistance device, driving assistance method, and computer program | |
JP2012164275A (ja) | 画像認識装置 | |
JP2006298254A (ja) | 走行支援装置 | |
JP5717416B2 (ja) | 運転支援制御装置 | |
WO2020039837A1 (ja) | 画像処理装置 | |
JP5104604B2 (ja) | 衝突判断装置 | |
JP2014123301A (ja) | 白線検出装置 | |
JP2005234999A (ja) | 車両用運転支援装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15839643 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016547792 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15500127 Country of ref document: US |
|
REEP | Request for entry into the european phase |
Ref document number: 2015839643 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015839643 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |