WO2013102529A1 - Method for the image-based detection of objects - Google Patents

Method for the image-based detection of objects Download PDF

Info

Publication number
WO2013102529A1
WO2013102529A1 (PCT/EP2012/074647)
Authority
WO
Grant status
Application
Patent type
Prior art keywords
object
cameras
vehicle
characterized
camera
Prior art date
Application number
PCT/EP2012/074647
Other languages
German (de)
French (fr)
Inventor
Tobias Ehlgen
Leo VEPA
Original Assignee
Robert Bosch Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Links

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/16 Anti-collision systems
    • G08G 1/166 Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G06T 7/593 Depth or shape recovery from multiple images from stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G06T 7/593 Depth or shape recovery from multiple images from stereo images
    • G06T 7/596 Depth or shape recovery from multiple images from stereo images from three or more stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30261 Obstacle

Abstract

The invention relates to a method for monitoring the surroundings of a vehicle by means of at least two cameras (34, 38), in particular by means of two mono cameras (34, 38). First, an object (48), or several objects, located in the surroundings of the vehicle is detected. Then object data are correlated, wherein the object data that one of the at least two cameras (34, 38) records are correlated with the object data of one and the same object (12, 14, 48) that the other of the at least two cameras (34) records. Finally, the recording times of the at least two camera images of the at least two cameras (34, 38) are determined, and the plane in which the object (48) moves is estimated.

Description

Description Title

METHOD FOR IMAGE-BASED DETECTION OF OBJECTS

PRIOR ART

The invention relates to a method for improving the image-based detection of objects, as well as the estimation of their dimensions, by merging data from multiple cameras with different viewing angles, to a computer program implementing this method, and to an apparatus, in particular a driver assistance system, in which this method is implemented.

DE 101 31 196 A1 discloses a device for detecting objects, persons or the like located in the surroundings of a vehicle. An image recording unit includes at least two image sensors whose optical axes can be aligned variably relative to each other and/or relative to the vehicle. Depending on the orientation of the optical axes relative to each other, the overlapping area of the detection regions of the individual sensors is increased or decreased. In the image pairs, corresponding image areas are sought in the two images captured from slightly different viewing angles. For the corresponding image areas, an evaluation unit creates a three-dimensional geometry according to the alignment of the optical axes. The overlap region defines the stereo detection of the recorded scene, while the remaining portions of the image capture regions outside the overlap region define the mono detection of the captured scene.

DE 196 40 938 A1 discloses an arrangement for monitoring traffic areas by means of video cameras, in which two video cameras simultaneously observe an overlapping image region. By evaluating the image information of overlapping image portions from two cameras, a much more robust traffic measurement is possible. With two cameras it can be distinguished whether gray-value changes in an area of the observed three-dimensional scene are due to objects above the surface, to virtual objects on the surface such as shadows, or to virtual objects below the surface such as reflections. For the application of this method, at least two video cameras take pictures simultaneously. Images captured simultaneously and overlapping in this way form the basis for the stereoscopic detection of the traffic situation. Through the use of stereoscopic algorithms, virtual objects are separated from the images, and a physical measurement of objects can be performed.
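The distinction described in DE 196 40 938 A1 between physical objects above the surface, virtual objects on the surface such as shadows, and virtual objects below the surface such as reflections can be sketched as a sign test on the triangulated height of a scene point above the road plane. The function name and the tolerance below are illustrative assumptions, not part of the cited document.

```python
def classify_point(height_above_road_m: float, tolerance_m: float = 0.05) -> str:
    """Classify a triangulated scene point by its height relative to the road.

    Points clearly above the road plane belong to physical objects;
    points on the plane are surface markings or shadows; points that
    triangulate below the plane are mirror-image artifacts (reflections).
    """
    if height_above_road_m > tolerance_m:
        return "object"          # real obstacle above the surface
    if height_above_road_m < -tolerance_m:
        return "reflection"      # virtual object below the surface
    return "surface"             # shadow or road marking on the surface


print(classify_point(1.20))   # a pedestrian's torso point
print(classify_point(0.00))   # a shadow on the asphalt
print(classify_point(-0.80))  # a reflection in a puddle
```

The tolerance absorbs triangulation noise near the road plane; in practice it would be tuned to the baseline and calibration quality of the camera pair.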

DE 197 44 694 A1 discloses a video motion detection device for monitoring an area, the device comprising mutually spaced-apart video cameras covering different surface areas of the total area. A three-dimensional analysis of the monitored space is achieved in that video monitoring of a large area is performed with a plurality of video cameras arranged such that their viewing or detection areas partly overlap. In these overlap sections, image information from at least two video cameras with different viewing angles is therefore available, allowing a three-dimensional analysis.

Image-based detection of objects, and of their dimensions, is mostly performed on images from a single camera or from a stereo camera with very similar fields of view. Thus, such objects can only be seen from one viewing angle or from one viewing region. This in turn implies that, at best, the height and the width of the side of the object facing the camera can be estimated from the images. Since only one viewing angle, or very similar viewing angles in the case of stereo camera systems, is available, not all object dimensions can be estimated: the height and the width only partially, and the depth in particular not at all. If the objects are in a rotated position relative to the camera, they are detected less reliably, and the estimable dimensions such as height and width can exhibit very large errors.

Summary of the Invention

According to the invention, a plurality of cameras arranged at different viewing angles onto the scene is used, so that even objects appearing rotated relative to the cameras, and their dimensions, can be estimated robustly. It is particularly advantageous when such objects lie in the common field of view of multiple cameras.

Since the object detection and the dimension estimation are carried out using multiple cameras installed in known positions on the vehicle, such as front cameras, side cameras and rear cameras, various viewing angles onto common overlapping areas can be realized with several cameras. Thus one and the same object is visible from different viewing angles. Therefore the object, in particular its three-dimensional shape, can be determined more robustly, and the object dimensions, among which the object depth should be emphasized, can be estimated better.
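Because the installation positions of the cameras on the vehicle are known, the object position in the road plane, and with it the depth that a single view cannot deliver, can be sketched as the intersection of the two viewing rays in a top view. All coordinates, angles and function names below are illustrative assumptions, not taken from the patent.

```python
import math

def triangulate_ground(cam_a, bearing_a, cam_b, bearing_b):
    """Intersect two viewing rays in the ground plane (vehicle top view).

    cam_a, cam_b  -- (x, y) camera installation positions on the vehicle [m]
    bearing_a/b   -- absolute ray angles in radians (0 = vehicle x-axis)
    Returns the (x, y) intersection, i.e. the object's ground position.
    """
    # Ray i: p = cam_i + t_i * (cos b_i, sin b_i); solve for t_a.
    dax, day = math.cos(bearing_a), math.sin(bearing_a)
    dbx, dby = math.cos(bearing_b), math.sin(bearing_b)
    denom = dax * dby - day * dbx            # 2D cross product of directions
    if abs(denom) < 1e-9:
        raise ValueError("rays are parallel; no unique intersection")
    rx, ry = cam_b[0] - cam_a[0], cam_b[1] - cam_a[1]
    t_a = (rx * dby - ry * dbx) / denom
    return (cam_a[0] + t_a * dax, cam_a[1] + t_a * day)

# Assumed poses: front camera at the bumper, side camera at the left mirror.
obj = triangulate_ground((2.0, 0.0), math.radians(45), (0.0, 1.0), math.radians(30))
print(obj)
```

The same intersection, applied to the top and bottom edges of the object in both views, yields the width and the depth estimate that the text attributes to the multi-camera arrangement.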

Other information extracted from images or video, such as the object speed (think of a rolling ball) and the direction in which the object is moving, can also be determined significantly better by using multiple cameras with different viewing angles. Instead of cameras, sensor systems such as radar, lidar or ultrasound could also be used.

The method proposed according to the invention can be used, retrofitted or realized particularly easily in vehicles that have a surround view system or area view system, a top view system or a bird's eye view system. Such vehicles are usually equipped with four cameras: at the vehicle front, at the rear, and on the two exterior mirrors. Thus the system proposed by the present invention can exploit such cameras to improve the three-dimensional reproduction of objects penetrating into the driving path for a driver assistance system.

Advantages of the invention

By using the method proposed by the present invention, objects can be detected more robustly, as more image data are available for the respective objects when they are detected with a number of cameras, for example two or three cameras at known installation locations on the vehicle. Due to the different object views, the object dimensions in terms of height, width and depth can be estimated much better. This means that more information can be obtained from the environment of the vehicle, which in turn allows better model assumptions about the environment, so that a driver assistance system implementing the method proposed according to the invention achieves an improved risk assessment or a substantially improved evasion strategy.

BRIEF DESCRIPTION OF THE DRAWINGS

With reference to the drawings, the invention is described in greater detail below.

The figures show:

Figure 1: a scene taking place in front of a vehicle, recorded with a mono camera, and

Figure 2: a view of the partially overlapping detection areas of two mono cameras arranged at different locations on the vehicle, which detect a three-dimensional object at the same time.

Embodiment variants

Figure 1 shows a scene taking place in front of a vehicle in the vehicle environment, recorded by a mono camera arranged at the front of the vehicle.

The illustration according to Figure 1 shows a surrounding area of a vehicle (the vehicle itself is not shown in Figure 1), represented for example by a roadway 10 lying ahead of the vehicle. Beside the roadway 10 there is a side strip 18, in which individual parking bays are recognizable. The scene of Figure 1 shows a person 12 entering the roadway, moving in the direction of movement 16 and following an object 14 that also moves in the direction of movement 16, for example a child suddenly running onto the roadway 10 after a ball 14 that has rolled onto the road. Both the person 12 and the object 14 are highlighted with a first image section and a second image section 24, respectively, as 2D images. It should also be noted that in the representation according to Figure 1, vehicles 20 are parked on the left side of the roadway 10, but only in the background of the scene shown in Figure 1.

The representation according to Figure 2 shows a vehicle 32 in plan view. As Figure 2 shows, the vehicle 32 includes at least one front camera 34, recessed for example into the front end, whose installation position is indicated by reference numeral 36. Furthermore, the vehicle 32 shown in Figure 2 comprises a first side camera 38. The location of the first side camera 38 is, for example, the left exterior mirror 40 on the driver's side of the vehicle 32.

Moreover, the vehicle 32 may comprise, although this is not shown in the illustration according to Figure 2, a further, second side camera 42 in the right exterior mirror 40. The vehicle 32 may also be equipped with a rear view camera 44, located for example at the upper edge of the rear window or in the region of the rear bumper, where the ultrasonic sensors for a driver assistance system, at least for a parking aid, are usually arranged today. With the rear camera 44, whose installation position is indicated by reference numeral 46, the region of the vehicle environment 10 situated behind the vehicle, here represented by the roadway 10, can be monitored, while the two side cameras 38 and 42, usually arranged at the bottom or at the rear of the exterior mirrors 40, monitor the lateral region of the vehicle. They may, for example, be oriented such that the side cameras 38, 42 accommodated in the exterior mirrors 40 point perpendicularly downward and thus monitor the vehicle surroundings from above, in this case the roadway 10.

From the representation according to Figure 2 it can be seen that the front camera 34 detects a three-dimensional object 48, which is also covered by the first side camera 38 arranged in the left exterior mirror 40. The installation positions of the two cameras on the vehicle 32, that is, of the front camera 34 and of the first side camera 38, are known. Likewise, the recording times at which the images of the three-dimensional object 48 are shot are known and can be processed in a control unit, for example of a surround view system, area view system, top view system or bird's eye view system. Furthermore, the cameras listed above, that is, the front camera 34, the first and second side cameras 38 and 42 as well as the rear camera 44, are connected to the control device of the surround view, area view, top view or bird's eye view system.

As is apparent from Figure 2, the front camera 34 at the front of the vehicle 32 is oriented at a first viewing angle 50 and recognizes the three-dimensional object 48 in its detection area 54. At the same time, this three-dimensional object 48 is detected by the first side camera 38 arranged in the left exterior mirror 40, which is oriented at a second viewing angle 52 with respect to the vehicle longitudinal axis. It is to be noted that the first viewing angle 50 of the front camera 34 differs significantly from the second viewing angle 52 of the first side camera 38, owing to the different mounting locations: the installation position 36 on the vehicle 32 on the one hand, and the installation in the exterior mirror 40 on the driver's side on the other. Both mono cameras, that is, the front camera 34 and the first side camera 38, capture the three-dimensional object 48, which is located ahead of the vehicle 32 to the left front, simultaneously.

The three-dimensional object 48 is simultaneously in a detection area 54 of the front camera 34 and in a detection area 56 of the first side camera 38, which is accommodated in the exterior mirror 40 on the driver's side of the vehicle 32. Both detection areas, that is, the detection area 54 of the front camera 34 and the detection area 56 of the first side camera 38, overlap each other, as indicated in Figure 2 by an overlap region 58.

If the three-dimensional object 48 is detected by the front camera 34 and the first side camera 38 at the same time, the images recorded by each of the two mono cameras 34 and 38 are transmitted to a control unit of the driver assistance system. There, the images taken by the mono cameras 34 and 38 are processed, wherein, for the case that the object 48 is detected by the first side camera 38, for example, its object data can be correlated with the object data of the other camera, in the present case the front camera 34, without an elaborate stereo calculation having to be carried out. Due to the different points of view, the control unit receives more information about the three-dimensional object 48. The installation positions of the cameras are likewise known in the system, i.e. the installation position 36 of the front camera 34 as well as the installation location of the first side camera 38, namely in the exterior mirror on the driver's side of the vehicle 32. Since the recording times of the images supplied by the two mono cameras 34, 38 are also known, a plane estimation can take place, so that the three-dimensional object detected simultaneously by both cameras 34, 38 can be correlated in time and place. The cameras oriented at the different viewing angles 50 and 52 provide information from which an estimate of the depth 62 of the three-dimensional object 48 and of its width 60 can be derived with greater significance than from a recording of a scene ahead of the vehicle 32 with only a single camera.
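The temporal and spatial correlation just described, matching the object data of one camera with the object data of one and the same object from the other camera via the recording times and the estimated plane, could look roughly like the following gating step. The detection format, field names and thresholds are assumptions for illustration, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    camera_id: str      # e.g. "front" (34) or "left_side" (38)
    timestamp_s: float  # recording time of the camera image
    ground_xy: tuple    # object position projected into the road plane [m]

def correlate(det_a, dets_b, max_dt_s=0.05, max_dist_m=0.5):
    """Find the detection from the other camera that shows the same object.

    A pair is accepted only if the recording times nearly coincide and
    the ground-plane positions (from the plane estimation) agree.
    """
    best, best_d = None, max_dist_m
    for d in dets_b:
        if abs(d.timestamp_s - det_a.timestamp_s) > max_dt_s:
            continue  # images not taken at (almost) the same time
        dist = ((d.ground_xy[0] - det_a.ground_xy[0]) ** 2 +
                (d.ground_xy[1] - det_a.ground_xy[1]) ** 2) ** 0.5
        if dist < best_d:
            best, best_d = d, dist
    return best

front = Detection("front", 10.000, (7.1, 5.1))
side = [Detection("left_side", 10.008, (7.0, 5.2)),   # same object
        Detection("left_side", 10.008, (2.0, -3.0))]  # different object
print(correlate(front, side))
```

Once a pair is accepted, the two views of the same object can be merged without a full stereo calculation, exactly as the paragraph above states.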

Since the positions of the cameras, i.e. their installation locations on the vehicle 32, are known (compare the installation position 36 of the front camera 34, the installation locations of the first and second side cameras 38, 42 in the exterior mirrors 40, and the installation position 46 of the rear camera 44), a stereo calculation may be carried out in such a multi-camera system only where the geometry allows it, since the two side cameras 38 and 42 in the exterior mirrors 40 are generally oriented downward and thus, despite their 180° optics, often have a viewing angle different from that of the front camera. It is possible here that no correlation for a stereo calculation is feasible, in which case the post-processing outlined above is used to achieve a further gain.
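Whether a correlation for a stereo calculation is possible at all depends on the ray geometry: if the viewing rays of the two cameras toward the object are nearly parallel, triangulation is ill-conditioned and the mono post-processing must be used instead. A minimal sketch of such a feasibility gate, with an assumed angle threshold not specified by the patent:

```python
import math

def stereo_feasible(dir_a, dir_b, min_angle_deg=2.0):
    """Decide whether two unit viewing rays intersect at a usable angle.

    Nearly parallel rays (as with two similarly oriented cameras) give an
    ill-conditioned triangulation, so stereo correlation is rejected.
    """
    dot = max(-1.0, min(1.0, dir_a[0] * dir_b[0] + dir_a[1] * dir_b[1]))
    angle_deg = math.degrees(math.acos(dot))
    return angle_deg >= min_angle_deg

# Front and side camera look at the object from clearly different angles:
print(stereo_feasible((1.0, 0.0),
                      (math.cos(math.radians(15)), math.sin(math.radians(15)))))
# Two almost identically oriented cameras:
print(stereo_feasible((1.0, 0.0),
                      (math.cos(math.radians(0.5)), math.sin(math.radians(0.5)))))
```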

By using at least two cameras, in the present case at least the front camera 34 and the first side camera 38 on the vehicle 32, and by arranging them with different viewing angles 50 and 52, as explained in connection with Figure 2, even objects 48 rotated relative to the mono cameras 34, 38 can be estimated robustly, as can their dimensions with respect to the width 60 and the depth 62, the latter being essential for the spatial perception of a three-dimensional object 48, as it is for human perception. In the proposed inventive method, the object detection and the dimension estimation take place with a plurality of cameras 34, 38 having a common overlap region 58 but different viewing angles 50 and 52. As a result, one and the same three-dimensional object 48 is visible from the different viewing angles 50 and 52 of the two mono cameras 34, 38, as shown in Figure 2. This in turn allows the three-dimensional object 48 to be determined robustly, as well as its dimensions, among which the depth 62 of the three-dimensional object 48 is particularly emphasized.

Also other, obtained from the images of the two mono cameras 34 38 information on the movement speed or movement direction 16 of the three-dimensional object can be produced by using several cameras 34, 38, wherein it should be at least two, in different angles are arranged 50 and 52, respectively, to improve. In vehicles 32, comprising a surround view system, which may also be referred to as Area View system, a Top View system or a bird's eye view system, are usually - as indicated in Figure 2 - a plurality of cameras, to compare the front camera 34, the first side camera 38 and the second side camera 42 as well as the rear camera 44 is provided and thus there is advantageously possible to exploit the camera environment at a such equipped vehicle 32 according to the invention for implementing the proposed method. Through the use of the inventive method proposed three-dimensional objects are detected 48 more robust as more image data of the three-dimensional object 48 are available. By the different object views of the same three-dimensional object 48 is received by the at least two different detection ranges 54, 56 scanning cameras 34, 36, the dimensions of the three-dimensional object Estimate much better. Thus, more information about the vehicle environment 10 are obtained, which can make a better model assumption for the environment, which can be used for example in driving assistance systems to improved risk assessment and the improvement of alternative strategies.

The invention further relates to a computer program with which the method proposed by the invention can be executed on a programmable computer device. Furthermore, the present invention provides a driver assistance system which utilizes components already built onto the vehicle, in the form of the front camera 34, the first side camera 38, the second side camera 42 as well as the rear camera 44. Said cameras are connected to a control unit of the driver assistance system so that the images captured by each of the mono cameras can be transferred to the control unit for further image processing. Furthermore, the recording times of the mono cameras 34, 38, 42, 44 are known to the control unit of the driver assistance system, as are their mounting locations and their different viewing angles, see reference numerals 50 and 52 in the illustration according to Figure 2, which is a bird's eye view representation. Starting from the known installation positions of the cameras 34, 38, 42, 44 and the different viewing angles 50, 52 of the at least two mono cameras, here the front camera 34 with respect to the vehicle 32 and the first side camera 38 with respect to the longitudinal axis of the vehicle 32, the object velocity and the direction of movement of the three-dimensional object 48 can be inferred from the recorded image information. If this is, for example, a child who runs onto the road 10 after a ball, the directions of movement of ball and child are known, so that an evasion maneuver toward the free verge 18 could take place; see the scene illustrated in Figure 1, in which no vehicles 20 are parked there, so that evasion would readily be possible.

The invention is not limited to the embodiments described herein and the aspects highlighted in them. Rather, a variety of modifications is possible within the range specified by the appended claims, which lie within the scope of routine technical action.

Claims

1. Method for monitoring a vehicle environment (10) with at least two cameras (34, 38), in particular mono cameras (34, 38), comprising the following method steps: a) detecting objects (12, 14, 48) in the vehicle environment (10), b) correlating the object data which one of the at least two cameras (34, 38) receives with the object data of one and the same object (12, 14, 48) which the other of the at least two cameras (34, 38) receives,
c) determining the recording times of the images supplied by the at least two cameras (34, 38) and carrying out an estimation of the plane in which the object (12, 14, 48) moves.
2. Method according to claim 1, characterized in that the recording times of the images of the at least two cameras (34, 38) and the plane estimation are used for the temporal and spatial correlation of the data of the object (12, 14, 48).
3. Method according to claim 1, characterized in that from the temporal and spatial correlation of the data of the object (12, 14, 48), a width (60) and a depth (62) of the object (12, 14, 48) are estimated.
4. Method according to one of the preceding claims, characterized in that the object (12, 14, 48) is detected by the at least two cameras (34, 38) at different viewing angles (50, 52). 5. Method according to one of the preceding claims, characterized in that a movement direction (16) and a movement speed of the object (12, 14, 48) are determined.
6. Method according to one of the preceding claims, characterized in that a post-processing of the images supplied by the at least two cameras (34, 38), including a stereo calculation, is carried out.
7. Computer program for carrying out a method according to one of the preceding claims when the computer program is executed on a programmable computing device.
8. Driver assistance system for monitoring a vehicle environment (10) with at least two cameras (34, 38), in particular mono cameras (34, 38), which detect an object (12, 14, 48) located in the vehicle environment, characterized in that a component is provided which correlates the object data that one of the at least two cameras (34, 38) receives with the object data of one and the same object (12, 14, 48) that the other of the at least two cameras (34, 38) receives.
9. Driver assistance system according to the preceding claim, characterized in that the component for correlating the object data determines a width (60) and a depth (62) of the object (12, 14, 48).
10. Driver assistance system according to the two preceding claims, characterized in that the component for correlating the object data of the object (12, 14, 48) exchanges data with a front camera (34), a first and a second side camera (38, 42) and a rear camera (44) of a surround view system or a bird's eye view system.
PCT/EP2012/074647 2012-01-05 2012-12-06 Method for the image-based detection of objects WO2013102529A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE201210200135 DE102012200135A1 (en) 2012-01-05 2012-01-05 Method for the image-based detection of objects
DE102012200135.2 2012-01-05

Publications (1)

Publication Number Publication Date
WO2013102529A1 (en) 2013-07-11

Family

ID=47594613

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2012/074647 WO2013102529A1 (en) 2012-01-05 2012-12-06 Method for the image-based detection of objects

Country Status (2)

Country Link
DE (1) DE102012200135A1 (en)
WO (1) WO2013102529A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016117518A1 (en) 2016-09-16 2018-03-22 Connaught Electronics Ltd. Adjusted joining of individual images into an overall image in a camera system for a motor vehicle

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19640938A1 (en) 1996-10-04 1998-04-09 Bosch Gmbh Robert Arrangement and method for monitoring of traffic areas
DE19744694A1 (en) 1997-10-10 1999-04-15 Bosch Gmbh Robert Video motion detector arrangement
DE10131196A1 (en) 2001-06-28 2003-01-16 Bosch Gmbh Robert Device for detecting objects, people or the like
US20070198189A1 (en) * 2005-11-10 2007-08-23 Valeo Vision Method for evaluation, by motor vehicle, of the characteristics of a front element
US20100148977A1 (en) * 2008-12-15 2010-06-17 Industrial Technology Research Institute Localization and detection system applying sensors and method thereof
US20100149333A1 (en) * 2008-12-15 2010-06-17 Sanyo Electric Co., Ltd. Obstacle sensing apparatus


Also Published As

Publication number Publication date Type
DE102012200135A1 (en) 2013-07-11 application


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12816462

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 12816462

Country of ref document: EP

Kind code of ref document: A1