SE538501C2 - Method and system for improving quality of image information from a 3D detection unit for use on a vehicle


Info

Publication number
SE538501C2
Authority
SE
Sweden
Prior art keywords
different
markings
detection unit
reflectivity
detection
Prior art date
Application number
SE1451428A
Other languages
Swedish (sv)
Other versions
SE1451428A1 (en)
Inventor
Salmén Mikael
Original Assignee
Scania Cv Ab
Priority date
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to SE1451428A priority Critical patent/SE538501C2/en
Priority to DE102015014318.2A priority patent/DE102015014318A1/en
Publication of SE1451428A1 publication Critical patent/SE1451428A1/en
Publication of SE538501C2 publication Critical patent/SE538501C2/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141 Control of illumination
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K31/00 Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator
    • B60K31/0008 Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator including means for detecting potential obstacles in vehicle path
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Image Processing (AREA)

Description

METHOD AND SYSTEM FOR IMPROVING QUALITY OF IMAGE INFORMATION FROM A 3D DETECTION UNIT FOR USE ON A VEHICLE TECHNICAL FIELD The invention relates to a method for improving reliability of image information from a 3D detection unit for use on a vehicle according to the preamble of claim 1. The invention also relates to a system for improving reliability of image information from a 3D detection unit for use on a vehicle. The invention also relates to a vehicle. The invention further relates to a computer program and a computer program product.
BACKGROUND ART Image quality metrics and benchmarking methods exist for 2D cameras. For Time-of-Flight cameras, different metrics can be used to measure the quality of the range data, for instance the variance of the range measurements. Based on these metrics the exposure time of the camera can be adjusted automatically to achieve good image quality across the entire image.
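As a concrete illustration (not part of the patent), a variance-based reliability check over repeated range frames, together with a naive exposure adjustment, could be sketched as follows; the NumPy representation, the 0.05 m threshold and the step factor are assumptions:

```python
import numpy as np

def range_quality_mask(range_frames, max_std_m=0.05):
    """Mark a range pixel as reliable when the standard deviation of its
    repeated measurements (range_frames: N x H x W, in metres) stays
    below max_std_m."""
    std = np.std(np.asarray(range_frames), axis=0)
    return std < max_std_m

def adjust_exposure(exposure_us, reliable_fraction, target=0.9, step=1.2):
    """Naive auto-exposure: lengthen the integration time while too few
    pixels pass the quality metric, otherwise shorten it slightly."""
    return exposure_us * step if reliable_fraction < target else exposure_us / step
```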
When installing a Time-of-Flight camera on a vehicle, its position relative to the ground will depend on the vehicle type. Further, due to the limited illumination strength of the camera, the camera will not be able to illuminate the whole field of view, and the auto-exposure function of the camera may therefore try to maximize visibility and drive the camera at too high an exposure, causing other visual problems instead.
WO201154971 discloses a method and system for detecting 3D objects within a defined area. A camera is calibrated with reference to the ground based on distance data to the ground. An object detection volume is created in which all points from the ground and upwards to e.g. 10 cm are excluded.
OBJECTS OF THE INVENTION An object of the present invention is to provide a flexible and efficient method for improving reliability of image information from a 3D detection unit for use on a vehicle.
Another object of the present invention is to provide a flexible and efficient system for improving reliability of image information from a 3D detection unit for use on a vehicle.
SUMMARY OF THE INVENTION These and other objects, apparent from the following description, are achieved by a method, a system, a vehicle, a computer program and a computer program product, as set out in the appended independent claims. Preferred embodiments of the method and the system are defined in appended dependent claims.
Specifically, an object of the invention is achieved by a method for improving reliability of image information from a 3D detection unit for use on a vehicle, the 3D detection unit being configured for illuminating the object and detecting light reflected from the object representing 3D information so as to determine a 3D image. The method comprises the step of calibrating the 3D detection unit for a certain 3D detection unit position on the vehicle. The step of calibrating the 3D detection unit comprises the steps of: detecting markings with different known minimum reflectivity representing the lowest supported reflectivity corresponding to different object types, using different integration times/exposure times; determining reliable information from said markings detection; and creating different masks, based on the thus determined reliable information, for different markings corresponding to different object types. Hereby the same 3D detection unit may be used for different object types, thus reducing costs, and flexibility and efficiency are improved. Further, the quality of the information/data of the image of different objects is improved, so that processing the data will result in a better identification of the detected objects. The amount of information/data required is minimized by creating the different masks for the different objects. The maximum object detection range for the 3D detection unit may hereby be determined for the different objects. Thus, a generic solution for a vehicle fleet using one variant of 3D detection unit, instead of multiple adjusted variants or manually creating a mask for each 3D detection unit depending on the object, is hereby facilitated.
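A minimal sketch of this calibration loop, assuming a hypothetical camera object with set_integration_time and grab_range_image methods (the patent does not prescribe an API) and reusing the variance test shown earlier:

```python
import numpy as np

def calibrate_masks(camera, integration_times_us, object_types,
                    n_frames=20, max_std_m=0.05):
    """For each object-type marking (known minimum reflectivity) and each
    integration time, record range frames of the marking and keep only
    pixels whose range measurement is stable; this yields one boolean
    mask per (object type, integration time) combination."""
    masks = {}
    for obj_type in object_types:
        # The physical marking for this object type is assumed to be
        # placed in the field of view before the frames are recorded.
        for t_us in integration_times_us:
            camera.set_integration_time(t_us)
            frames = np.stack([camera.grab_range_image()
                               for _ in range(n_frames)])
            masks[(obj_type, t_us)] = frames.std(axis=0) < max_std_m
    return masks
```

Keying the masks by both object type and integration time mirrors the later embodiment in which each stored mask is associated with the used integration time and object type.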
According to an embodiment the method comprises the step of creating different masks, based on the thus determined reliable information, for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit. Hereby the same 3D detection unit may be used for different vehicle configurations requiring different positions of the 3D detection unit, including different heights and/or directions, for identification of an object, thus reducing costs. Further, the quality of the information/data of the image of the object is improved, so that processing the data will result in a better identification of the detected object independently of the position, i.e. height/direction, of the 3D detection unit. Thus, the reliability of the image information from a 3D detection unit is improved. The amount of information/data required is minimized by creating the different masks for the different positions of the 3D detection unit. The maximum object detection range for the 3D detection unit may hereby be determined for the different positions of the 3D detection unit. Thus, a generic solution for a vehicle fleet using one variant of 3D detection unit, instead of multiple adjusted variants or manually creating a mask for each 3D detection unit depending on the position on the vehicle, is hereby facilitated. By thus associating different masks with different positions of a 3D detection unit on a vehicle, such positions become identifiable, so that when a 3D detection unit is arranged at a certain position on a vehicle, the 3D detection unit is controlled such that the mask associated with the identified position is applied.
According to an embodiment of the method the step of detecting markings with known minimum reflectivity is performed for different illumination powers of the 3D detection unit. Hereby different masks for different illumination powers may be created, both for the respective position of the 3D detection unit and for different objects. By thus associating different masks with different illumination powers of a 3D detection unit, such a 3D detection unit with variable illumination power arranged on a vehicle may be controlled such that the mask associated with the set illumination power is applied.
According to an embodiment of the method the step of detecting markings with known minimum reflectivity is performed for different illumination patterns of the 3D detection unit. Hereby different masks for different illumination patterns may be created, both for the respective position of the 3D detection unit and for different objects. By thus associating different masks with different illumination patterns of a 3D detection unit, such a 3D detection unit with variable illumination patterns arranged on a vehicle may be controlled such that the mask associated with the set illumination pattern is applied.
According to an embodiment of the method the step of creating a mask comprises excluding information from 3D unit detection pixels determined not to be reliable. Hereby quality of the image information/data is improved. Object detection and image processing speed is enhanced since not all pixels need to be processed.
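A minimal sketch of applying such a mask, assuming NumPy arrays for the range image and the mask (the patent does not specify a data format):

```python
import numpy as np

def apply_mask(range_image, mask):
    """Set pixels outside the mask to NaN so downstream object detection
    ignores them; returning the valid indices lets later stages iterate
    over reliable pixels only, which shortens processing."""
    masked = np.where(mask, range_image, np.nan)
    return masked, np.flatnonzero(mask)
```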
According to an embodiment the method comprises the steps of: detecting markings with known maximum reflectivity representing highest expected reflectivity using different integration times/exposure times; determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection. Hereby it is possible to determine when a certain object will not be detectable due to its reflectivity/saturation.
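A sketch of such a saturation sweep; the 12-bit saturation level, the 1% tolerance and the camera's grab_amplitude_image method are assumptions:

```python
import numpy as np

def saturation_limit(camera, integration_times_us,
                     saturation_level=4095, max_saturated_frac=0.01):
    """With a marking of known maximum reflectivity in view, sweep the
    integration times in ascending order and return the longest one for
    which the fraction of saturated amplitude pixels stays acceptable."""
    safe = None
    for t_us in sorted(integration_times_us):
        camera.set_integration_time(t_us)
        amplitude = camera.grab_amplitude_image()
        if np.mean(amplitude >= saturation_level) > max_saturated_frac:
            break  # longer integration times will saturate even more
        safe = t_us
    return safe
```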
According to an embodiment of the method the step of detecting markings with known maximum reflectivity comprises the step of detecting markings with different known maximum reflectivity representing the highest expected reflectivity corresponding to different object types, using different integration times/exposure times; wherein saturation characteristics comprising a saturation level from said markings detection are determined so as to determine reliable detections for said different object types. Hereby it is possible to determine when different objects will not be detectable due to their reflectivity/saturation.
According to an embodiment of the method the step of detecting markings with known maximum reflectivity is performed for different illumination powers of the 3D detection unit. Hereby it is possible to determine when a certain object will not be detectable due to its reflectivity/saturation for different illumination powers of the 3D detection unit.
According to an embodiment of the method the step of detecting markings with known maximum reflectivity is performed for different illumination patterns of the 3D detection unit. Hereby it is possible to determine when a certain object will not be detectable due to its reflectivity/saturation for different illumination patterns of the 3D detection unit.
According to an embodiment the method comprises the step of storing thus determined masks. Hereby such masks may be used for validating against another mask of another 3D detection unit installed on another vehicle with corresponding characteristics, i.e. the 3D detection unit being positioned in the same way. Hereby such masks may be used in another 3D detection unit of the same kind without requiring calibration of that 3D detection unit, although some correlation may be preferred.
According to an embodiment the method comprises the step of storing thus determined saturation characteristics. Hereby such saturation characteristics may be used for validating against saturation characteristics of another 3D detection unit installed on another vehicle with corresponding characteristics, i.e. the 3D detection unit being positioned in the same way. Hereby such saturation characteristics may be used in another 3D detection unit of the same kind without requiring calibration of that 3D detection unit, although some correlation may be preferred.
Specifically, an object of the invention is achieved by a system for improving reliability of image information from a 3D detection unit for use on a vehicle, the 3D detection unit being configured for illuminating the object and detecting light reflected from the object representing 3D information so as to determine a 3D image, the system being adapted to perform the methods as set out above.
The system according to the invention has the advantages according to the corresponding method claims.
BRIEF DESCRIPTION OF THE DRAWINGS For a better understanding of the present invention reference is made to the following detailed description when read in conjunction with the accompanying drawings, wherein like reference characters refer to like parts throughout the several views, and in which: Fig. 1 schematically illustrates a side view of a vehicle according to the present invention; Fig. 2 schematically illustrates a system for improving reliability of image information from a 3D detection unit for use on a vehicle according to an embodiment of the present invention; Fig. 3 schematically illustrates a block diagram of a method for improving reliability of image information from a 3D detection unit for use on a vehicle according to an embodiment of the present invention; Fig. 4a schematically illustrates a plan view of a vehicle with a 3D detection unit according to the present invention; Fig. 4b schematically illustrates a side view of a vehicle with a certain configuration for a certain position of a 3D detection unit according to the present invention; Fig. 4c schematically illustrates a side view of a vehicle with a certain configuration for a certain position of a 3D detection unit according to the present invention; and Fig. 5 schematically illustrates a computer according to an embodiment of the present invention.
DETAILED DESCRIPTION Hereinafter the term "link" refers to a communication link which may be a physical connector, such as an optoelectronic communication wire, or a non-physical connector such as a wireless connection, for example a radio or microwave link.
Hereinafter the term "2D" refers to two dimensional. Hereinafter the term "3D" refers to three dimensional.
Hereinafter the term "markings" refers to any suitable markings having a known minimum reflectivity representing lowest supported reflectivity corresponding to a certain object. The term "markings" may also refer to any suitable markings having a known maximum reflectivity representing highest expected reflectivity. The term "markings", i.e. such marking may be any suitable element such as one or more stripes. The term "markings" may refer to a marking forming a continuous pattern and/or several markings forming a pattern. The term "markings" may refer to different markings with different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types. The markings may correspond to the ground, but also other objects such as pedestrians, animals, bicyclists, reflective objects such as road signs, markings on the road, crash barriers etc. The markings/mask may consider expected object size and likelihood of detecting sufficient amount of signals/pixels from the object in a certain area of the field-of-view of the 3D detection unit. For instance, a close object may due to strong signals/reflection and larger size generate e.g. 100 pixels/signals compared to an object further away which due to weaker signal/reflection and due to geometry be smaller to sensor hence generating e.g. 20 pixels.
Hereinafter the term "3D detection unit position" refers to the height of the 3D detection unit and the direction of the 3D detection unit.
The term "direction of the 3D detection unit" refers to the in which direction the 3D detection unit is pointing and thus detecting. The direction of the 3D detection unit comprises angle around the x, y, z axis.
Hereinafter the term "integration time/exposure time" is used. A 3D detection unit constituted by a Time-of-Flight camera such as a continuous wave/PMD type of sensor for example it is referred to as the integration time, such a Time-of-flight camera taking four images during the integration time.
Fig. 1 schematically illustrates a vehicle 1 according to an embodiment of the present invention. The exemplified vehicle 1 is a heavy vehicle in the shape of a truck. The vehicle according to the present invention could be any suitable vehicle, such as a bus, a car, a train or the like. The vehicle comprises a system for improving reliability of image information from a 3D detection unit 110 for use on a vehicle according to the present invention. The vehicle comprises a 3D detection unit 110. The 3D detection unit 110 is arranged to detect objects, which may vary. The 3D detection unit 110 is installed on the vehicle, here on the vehicle cab, at a certain position comprising a certain height H and a certain direction D for detection, the height and/or direction varying depending on the vehicle configuration, e.g. cab configuration and/or chassis configuration. The position comprising the direction D comprises the angles around the x, y and z axes, which need to be determined, i.e. roll, pitch and yaw. If the 3D detection unit 110 is tilted 1 degree around the y-axis, the y-axis being in the forward direction of the vehicle, then the result will differ compared to a 3D detection unit 110 having no roll offset from the horizontal line. A vehicle may have multiple possible mounting positions.
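To make the roll/pitch/yaw dependence concrete, here is a small sketch (not from the patent) of a mounting pose as height plus a rotation matrix; it shows how even a 1-degree tilt changes where a point lands in the transformed frame:

```python
import numpy as np

def mounting_pose(height_m, roll_deg, pitch_deg, yaw_deg):
    """Rotation matrix and translation of the detection unit for a given
    mounting height and roll/pitch/yaw around the x, y and z axes."""
    r, p, y = np.deg2rad([roll_deg, pitch_deg, yaw_deg])
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(r), -np.sin(r)],
                   [0, np.sin(r),  np.cos(r)]])
    Ry = np.array([[ np.cos(p), 0, np.sin(p)],
                   [0, 1, 0],
                   [-np.sin(p), 0, np.cos(p)]])
    Rz = np.array([[np.cos(y), -np.sin(y), 0],
                   [np.sin(y),  np.cos(y), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx, np.array([0.0, 0.0, height_m])

# A point 10 m ahead on the ground, seen without and with a 1-degree tilt:
R0, t0 = mounting_pose(2.0, 0.0, 0.0, 0.0)
R1, t1 = mounting_pose(2.0, 1.0, 0.0, 0.0)
point = np.array([0.0, 10.0, -2.0])
print(R0 @ point + t0)  # untilted reference
print(R1 @ point + t1)  # the 1-degree roll shifts the transformed point
```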
Fig. 2 schematically illustrates a system I for improving reliability of image information from a 3D detection unit 110 for use on a vehicle according to an embodiment of the present invention.
The system I comprises an electronic control unit 100.
The 3D detection unit 110 is configured for illuminating an object O by means of light L1 and detecting light L2 reflected from the object, representing 3D information, so as to determine a 3D image. The 3D detection unit 110 is configured for detecting reflected light by means of the time-of-flight technique. The 3D detection unit 110 uses active light L1 for illumination of the object O. The light L1 could be any suitable light such as infrared light, visible light, laser light or the like. The 3D detection unit 110 comprises according to an embodiment means for illumination of an object O by means of infrared light.
The 3D detection unit 110 comprises according to an embodiment a time-of-flight camera unit. The time-of-flight camera unit may be any kind of sensor transmitting a signal through air and correlating it with the received signal in order to conclude a distance to the measured point(s). The time-of-flight (ToF) camera unit may be any kind of ToF camera based on for instance continuous-wave (CW, such as PMD, Swissranger, etc.), pulsed-wave (such as TDC) or range-gating (Obzerv) principles. ToF cameras typically use an LED that illuminates the whole scene at once. The 3D detection unit 110 comprises according to an embodiment a LIDAR, i.e. a laser scanner unit.
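For the continuous-wave case, the standard phase-to-distance relation (a well-known CW-ToF formula, not specific to this patent) can be sketched as follows:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def cw_tof_distance(phase_rad, mod_freq_hz):
    """Distance from the phase shift between the emitted and received
    modulated light: d = c * phase / (4 * pi * f_mod); the measurement
    is unambiguous up to c / (2 * f_mod)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

print(cw_tof_distance(math.pi, 20e6))  # ~3.75 m; ambiguity range ~7.5 m at 20 MHz
```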
The 3D detection unit 110 is arranged to detect a 3D image of an object O.
The system I comprises means 112, 112a, 112b, 120, 130, 140 for calibrating the 3D detection unit 110 for a certain 3D detection unit position on a vehicle.
The means for calibrating the 3D detection unit comprises means 112 for detecting markings. The means 112 for detecting markings comprises means 112a for detecting markings with known minimum reflectivity representing the lowest supported reflectivity using different integration times/exposure times for different 3D detection unit positions. The different 3D detection unit positions comprise height above said markings and direction of the 3D detection unit. Said means 112 for detecting markings is comprised in said 3D detection unit 110. The 3D detection unit 110 is thus configured to detect said markings. The system I, and thus the 3D detection unit 110, comprises said means 112a for detecting markings with known minimum reflectivity representing the lowest supported reflectivity using different integration times/exposure times.
According to an embodiment the means 112 for detecting markings comprises means for detecting markings with a known certain reflectivity between the lowest and highest reflectivity, representing a certain supported reflectivity, using different integration times/exposure times.
For each position of the 3D detection unit 110 the means 112 for detecting markings will thus take images using different integration times/exposure times to get range data.
The means 112 for detecting markings comprises means 112b for detecting markings with different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types, using different integration times/exposure times.
The means for calibrating the 3D detection unit 110 comprises means 120 for determining reliable information from said markings detection. The system I thus comprises means 120 for determining reliable information from said markings detection. The means 120 for determining reliable information from said markings detection applies a quality metric to each range pixel, so that the 3D detection unit 110, i.e. the means 112a for detecting markings with known minimum reflectivity, is able to estimate which pixels/measurements can be assumed to be reliable. According to an embodiment a margin is added for increased object detection robustness.
The means 112a, 112b for detecting markings with known minimum reflectivity is arranged to be performed for different illumination powers of the 3D detection unit 110.
The means 112a, 112b for detecting markings with known minimum reflectivity is arranged to be performed for different illumination patterns of the 3D detection unit 110.
The means 120 for determining reliable information from said markings detection comprises determining reliable information from different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types.
When determining reliable information from different known minimum reflectivity representing the lowest supported reflectivity corresponding to different object types, object size is considered. The means 112 for detecting markings for object detection is according to an embodiment configured to apply a certain threshold of pixels, for example 30 pixels, to allow a cluster to be included in the object detection. Hence markings/objects returning a number of pixels below the threshold, for example 20 pixels, will not be detected. Thus considering object size when creating a mask may considerably improve the quality/reliability.
The detected pixel density could be used as one measure for thresholding where the cut-off distance for the mask is. For example, assume that for one marking only 25% of the pixels provide sufficiently strong signals, that at this distance the pixel density is e.g. 10 cm, i.e. the physical distance between two adjacent pixels is 10 cm, and that the object to be detected needs to be 30 cm by 30 cm in size. This means that with 100% detection 3x3 pixels, i.e. 9 pixels, can be detected. If only 25% are detectable, this results in about 2 pixels. If the threshold is set at 5 pixels, the object is not detectable at this distance.
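The same arithmetic as a tiny sketch, using the numbers from the example above:

```python
def detectable(object_size_m, pixel_pitch_m, usable_fraction, min_pixels):
    """Expected usable pixels on an object at a given distance: the pixel
    footprint of the object times the fraction of pixels returning a
    sufficiently strong signal, compared against the threshold."""
    per_side = round(object_size_m / pixel_pitch_m)        # 0.30 / 0.10 -> 3
    expected = int(per_side * per_side * usable_fraction)  # 9 * 0.25 -> 2
    return expected >= min_pixels

print(detectable(0.30, 0.10, 0.25, 5))  # False: not detectable at this distance
```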
The means for calibrating the 3D detection unit 110 comprises means 132 for creating different masks, based on the thus determined reliable information, for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit. The system I thus comprises means 132 for creating different masks based on the thus determined reliable information for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit. The masks are created by only allowing the reliable pixels. Once the 3D detection unit 110, i.e. the means 120 for determining reliable information from said markings detection, has decided which pixels can be considered reliable, each mask is stored and associated with the used integration time and object type.
The means for calibrating the 3D detection unit 110 comprises means 134 for creating different masks based on the thus determined reliable information for different markings corresponding to different object types. The system I thus comprises means 134 for creating different masks based on the thus determined reliable information for different markings corresponding to different object types.
The means 132 and the means 134 are comprised in a means 130 for creating masks. The system I thus comprises means 130 for creating masks based on reliable information for markings.
The means 130 for creating a mask comprises means 130a for excluding information from 3D unit detection pixels determined not to be reliable. The means 130a for excluding information from 3D unit detection pixels determined not to be reliable may comprise any suitable filter means. The markings do not need to be continuous, as the 3D detection unit, e.g. a Time-of-Flight camera, can interpolate/extrapolate a mask. The markings should enable the 3D detection unit to build up the information for the whole field of view, only limited by its illumination.
The means 112 for detecting markings comprises means 112c for detecting markings with known maximum reflectivity representing highest expected reflectivity using different integration times/exposure times. The system I and thus the 3D detection unit 110 comprises said means 112c for detecting markings with known maximum reflectivity representing highest expected reflectivity using different integration times/exposure times.
The means 112 for detecting markings comprises means 112d for detecting markings with different known maximum reflectivity representing highest expected reflectivity corresponding to different object types, using different integration times/exposure times.
The means 112c, 112d for detecting markings with known maximum reflectivity is arranged to be performed for different illumination powers of the 3D detection unit 110.
The means 112c, 112d for detecting markings with known maximum reflectivity is arranged to be performed for different illumination patterns of the 3D detection unit 110.
The system I comprises means 140 for determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection.
The means 140 for determining saturation characteristics comprising a saturation level from said markings detection is provided so as to determine reliable detections for said different object types.
By thus determining, by means of said means 140, saturation characteristics comprising a saturation level from said markings detection, the system I including the 3D detection unit 110 will know, when increasing the integration time to see an object with relatively low reflectivity, for instance the ground, when another object with relatively higher reflectivity, for instance a pedestrian, can be expected to become undetectable due to saturation. The system I including the 3D detection unit 110 may according to an embodiment be controlled in such a way that the 3D detection unit 110 at a certain point will go into multi-exposure mode, so-called HDR (High Dynamic Range), to ensure that the most critical object type, e.g. pedestrians, is detectable at all times.
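A sketch of that control decision, assuming the required integration time and the stored saturation limit are both available from the calibration data:

```python
def exposure_mode(required_t_us, critical_saturation_limit_us):
    """Switch to multi-exposure (HDR) when the integration time needed
    for a low-reflectivity object such as the ground exceeds the time at
    which the most critical object type (e.g. pedestrians) saturates, so
    that both remain detectable; otherwise stay in single-exposure mode."""
    if required_t_us > critical_saturation_limit_us:
        return "HDR"  # alternate short and long integration times
    return "single"
```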
The system I comprises means 100, 150 for storing thus determined masks. The means 100, 150 for storing thus determined masks comprises means 150 for storing thus determined masks externally, comprising external storage means. Said means 150 for external storage of masks may comprise any suitable server unit, computer or the like.
The system I comprises means 100, 160 for storing thus determined saturation characteristics. The means 100, 160 for storing thus determined saturation characteristics comprises external means 160 for storing thus determined saturation characteristics. Said external means 160 may comprise any suitable server unit, computer or the like.
The electronic control unit 100 is operatively connected to the means 112a for detecting markings with known minimum reflectivity representing the lowest supported reflectivity using different integration times/exposure times for different 3D detection unit positions, comprising height above said markings and direction of the 3D detection unit, via a link 12a. The electronic control unit 100 is via the link 12a arranged to receive a signal from said means 112a representing 3D image data for said markings for different integration times/exposure times for the different positions of the 3D detection unit 110.
The electronic control unit 100 is operatively connected to the means 112b for detecting markings with different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types using different integration times/exposure times via a link 12b. The electronic control unit 100 is via the link 12b arranged to receive a signal from said means 112b representing 3D image data for said markings for different integration times/exposure times for the different object types.
The electronic control unit 100 is operatively connected to the means 120 for determining reliable information from said markings detection via a link 120a. The electronic control unit 100 is via the link 120a arranged to send a signal to said means 120 representing 3D image data for said markings for different integration times/exposure times for the different positions of the 3D detection unit 110 and/or 3D image data for said markings for different integration times/exposure times for the different object types.
The electronic control unit 100 is operatively connected to the means 120 for determining reliable information from said markings detection via a link 120b. The electronic control unit 100 is via the link 120b arranged to receive a signal from said means 120 representing data for reliable information from said markings detection comprising data for reliable information from markings detection for different integration times/exposure times for the different positions of the 3D detection unit 110 and/or data for reliable information from markings detection for different integration times/exposure times for the different object types.
The electronic control unit 100 is operatively connected to the means 132 for creating different masks based on the thus determined reliable information for different 3D detection unit positions, comprising height above said markings and direction of the 3D detection unit, via a link 132a. The electronic control unit 100 is via the link 132a arranged to send a signal to said means 132 representing mask creation data for determined reliable information for different 3D detection unit positions.
The electronic control unit 100 is operatively connected to the means 132 for creating different masks based on the thus determined reliable information for different 3D detection unit positions via a link 132b. The electronic control unit 100 is via the link 132b arranged to receive a signal from said means 132 representing mask data for different masks based on the thus determined reliable information for different 3D detection unit positions.
The electronic control unit 100 is operatively connected to the means 134 for creating different masks based on the thus determined reliable information for different markings corresponding to different object types via a link 134a. The electronic control unit 100 is via the link 134a arranged to send a signal to said means 134 representing mask creation data for determined reliable information for different object types.
The electronic control unit 100 is operatively connected to the means 134 for creating different masks based on the thus determined reliable information for different markings corresponding to different object types via a link 134b. The electronic control unit 100 is via the link 134b arranged to receive a signal from said means 134 representing mask data for different masks based on the thus determined reliable information for different object types.
The electronic control unit 100 is operatively connected to the means 130a for excluding information from 3D unit detection pixels determined not to be reliable via a link 30a. The electronic control unit 100 is via the link 30a arranged to send a signal to said means 130a representing filtering data for excluding information from 3D unit detection pixels determined not to be reliable.
The mask data sent from said means 132 represents filtered mask data when the means 130a has excluded information from 3D unit detection pixels determined not to be reliable.
The mask data sent from said means 134 represents filtered mask data when the means 130a has excluded information from 3D unit detection pixels determined not to be reliable.
The electronic control unit 100 is operatively connected to the means 112c for detecting markings with known maximum reflectivity representing highest expected reflectivity using different integration times/exposure times via a link 12c. The electronic control unit 100 is via the link 12c arranged to receive a signal from said means 112c representing 3D image data for said markings for different integration times/exposure times.
The electronic control unit 100 is operatively connected to the means 112d for detecting markings with different known maximum reflectivity representing highest expected reflectivity corresponding to different object types, using different integration times/exposure times via a link 12d. The electronic control unit 100 is via the link 12d arranged to receive a signal from said means 112d representing 3D image data for said markings for different integration times/exposure times for the different object types.
The electronic control unit 100 is operatively connected to the means 140 for determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection via a link 140a. The electronic control unit 100 is via the link 140a arranged to send a signal to said means 140 representing 3D image data for said markings comprising 3D image data for said markings for different integration times/exposure times for the different object types.
The electronic control unit 100 is operatively connected to the means 140 for determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection via a link 140b. The electronic control unit 100 is via the link 140b arranged to receive a signal from said means 140 representing data for saturation from said markings detection including saturation levels for different objects so as to determine reliable detection.
The means 130 for creating masks based on reliable information for markings is operatively connected to the means 150 for storing thus determined masks via a link 150a. The means 130 is via the link 150a arranged to send a signal to said means 150 representing mask data for masks for different 3D detection unit positions and/or mask data for masks for different objects, wherein the means 150 is arranged to store said masks.
The electronic control unit 100 is according to an embodiment arranged to store masks for different 3D detection unit positions received as mask data from the means 132 via the link 132b and/or masks for different objects mask data from the means 134 via the link 134b.
The means 130 for creating masks based on reliable information for markings is operatively connected to the means 150 for storing thus determined masks externally via a link 150b. The means 130 is via the link 150b arranged to receive a signal from said means 150 representing mask data for masks for different 3D detection unit positions and/or mask data for masks for different objects having been stored externally for another 3D detection unit and/or another vehicle configuration, wherein the means 130 is arranged to validate said masks with masks created by means of the means 130.
The means for storing 150, e.g. a server unit for externally storing masks, may store a large number of masks for a large number of positions and a large number of objects, according to an embodiment also for different illumination powers and different illumination patterns. The electronic control unit 100 is according to an embodiment configured to assist in sorting out the masks, such that mask data representing a reduced number of masks is sent to the electronic control unit 100.
According to an embodiment the electronic control unit 100 is configured to request accessible masks based upon position, perform a first estimation and then choose the most suitable mask of the received mask data.
According to an embodiment the electronic control unit 100 performs an estimation and sends the estimation, together with the position and possibly other relevant information such as object type and type of 3D detection unit, to the means for storing 150, e.g. an external server unit, which assesses the masks and sends data representing the most suitable masks back to the electronic control unit, which performs another estimation that is correlated with the received data. If the result is not acceptable, a further estimation is performed.
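A sketch of this exchange; estimate_mask, best_masks and correlate are hypothetical stand-ins for the unspecified estimation, server-side selection and validation steps, and the 0.9 acceptance threshold is an assumption:

```python
def fetch_mask(ecu, server, position, object_type, unit_type, max_rounds=3):
    """Repeat the estimate -> request -> correlate loop until a stored
    mask matches the local estimation well enough, or give up."""
    for _ in range(max_rounds):
        estimate = ecu.estimate_mask()
        candidates = server.best_masks(estimate, position, object_type,
                                       unit_type)
        best = max(candidates, key=lambda m: ecu.correlate(m, estimate))
        if ecu.correlate(best, estimate) >= 0.9:
            return best
    return None  # no stored mask matched; fall back to local calibration
```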
The means 140 for determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection is operatively connected to the means 160 for storing thus determined saturation characteristics via a link 160a. The means 140 is via the link 160a arranged to send a signal to said means 160 representing data for saturation from said markings detection so as to determine reliable detection, wherein the means 160 is arranged to store said saturation level for said markings including saturation levels for different objects. This may be particularly relevant where illumination power and/or illumination patterns are variable.
The electronic control unit 100 is according to an embodiment arranged to store saturation levels for said markings, including saturation levels for different objects.
The means 140 for determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection is operatively connected to the means 160 for storing thus determined saturation characteristics via a link 160b. The means 140 is via the link 160b arranged to receive a signal from said means 160 representing data for saturation from said markings detection having been stored externally for another 3D detection unit and/or another vehicle configuration, wherein the means 140 is arranged to validate said saturation characteristics against saturation characteristics determined by means of the means 140.
Fig. 3 schematically illustrates a block diagram of a method for improving reliability of image information from a 3D detection unit for use on a vehicle, the 3D detection unit being configured for illuminating the object and detecting from the object reflected light representing 3D information so as to determine a 3D image.
According to the embodiment the method for improving reliability of image information from a 3D detection unit for use on a vehicle comprises a step S1. In this step markings with different known minimum reflectivity representing the lowest supported reflectivity corresponding to different object types are detected, using different integration times/exposure times.
According to the embodiment the method for improving reliability of image information from a 3D detection unit for use on a vehicle comprises a step S2. In this step reliable information from said markings detection is determined.
According to the embodiment the method for improving reliability of image information from a 3D detection unit for use on a vehicle comprises a step S3. In this step different masks based on the thus determined reliable information for different markings corresponding to different object types are created.
Fig. 4a schematically illustrates a plan view of a vehicle 1a with a 3D detection unit according to the present invention. Different detection ranges R3, R4 are illustrated, as well as different markings S10, S20, S30 corresponding to different objects, e.g. ground/road for marking S10, pedestrians for marking S20 and highly reflective objects such as road signs for marking S30.
The markings do not need to be continuous, as the 3D detection unit, e.g. a Time-of-Flight camera, can interpolate/extrapolate a mask. The markings/pattern should enable the 3D detection unit to build up the information for the whole field of view, only limited by its illumination.
The 3D detection unit 110 is configured for detecting markings S10, S20, S30 with different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types, using different integration times/exposure times.
Reliable information from said markings detection is then determined and different masks M1, M2, M3 based on the thus determined reliable information for said different object types are created.
Here markings S10, S20, S30 corresponding to different object types are illustrated in the field at the same time for illustrative purposes. Markings corresponding to one object type could be detected and then be replaced by markings corresponding to another object type for detection, etc.
Fig. 4b schematically illustrates a side view of a vehicle 1b with a certain configuration for a certain position of a 3D detection unit 110 according to the present invention, and fig. 4c a side view of a vehicle 1c with a certain configuration for a certain position of a 3D detection unit 110, the vehicle 1c being higher than the vehicle 1b, resulting in a higher position of the 3D detection unit 110.
The 3D detection unit position on the vehicle 1b is at a height H1 above the ground O1, the ground or road constituting an object O1. The 3D detection unit on the vehicle 1b is directed towards the ground with a direction D1 corresponding to a certain angle in the x, y, z plane.
The 3D detection unit position on the vehicle 1c is at a height H2 above the ground O1, the ground or road constituting an object O1. The 3D detection unit on the vehicle 1c is directed towards the ground with a direction D2 corresponding to a certain angle in the x, y, z plane.
The detection range R1 of the 3D detection unit 110 when installed on the vehicle 1b may differ from the detection range R2 of the 3D detection unit 110 when installed on the vehicle 1c.
The 3D detection unit 110 is configured for detecting markings S10 with known minimum reflectivity representing the lowest supported reflectivity using different integration times/exposure times. The known reflectivity of said markings S10 corresponds to a certain object O1, e.g. the ground/road O1. Said detection is performed for different 3D detection unit positions, here illustrated with the position on vehicle 1b and the position on vehicle 1c.
Reliable information from said markings detection is then determined and different masks based on the thus determined reliable information for said different 3D detection unit positions are created.
With reference to figure 5, a diagram of an apparatus 500 is shown. The control unit 100 described with reference to fig. 2 may according to an embodiment comprise apparatus 500. Apparatus 500 comprises a non-volatile memory 520, a data processing device 510 and a read/write memory 550. Non-volatile memory 520 has a first memory portion 530 wherein a computer program, such as an operating system, is stored for controlling the function of apparatus 500. Further, apparatus 500 comprises a bus controller, a serial communication port, I/O means, an A/D converter, a time and date entry and transmission unit, an event counter and an interrupt controller (not shown). Non-volatile memory 520 also has a second memory portion 540.
A computer program P is provided comprising routines for improving reliability of image information from a 3D detection unit for use on a vehicle. The program P comprises routines for detecting markings with different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types, using different integration times/exposure times. The program P comprises routines for determining reliable information from said markings detection. The program P comprises according to an embodiment routines for creating different masks based on the thus determined reliable information for different markings corresponding to different object types. P may be stored in an executable manner or in a compressed condition in a separate memory 560 and/or in read/write memory 550.
When it is stated that data processing device 510 performs a certain function it should be understood that data processing device 510 performs a certain part of the program which is stored in separate memory 560, or a certain part of the program which is stored in read/write memory 550.
Data processing device 510 may communicate with a data communications port 599 by means of a data bus 515. Non-volatile memory 520 is adapted for communication with data processing device 510 via a data bus 512. Separate memory 560 is adapted for communication with data processing device 510 via a data bus 511. Read/write memory 550 is adapted for communication with data processing device 510 via a data bus 514. To the data communications port 599 e.g. the links connected to the control unit 100 may be connected.
When data is received on data port 599 it is temporarily stored in second memory portion 540. When the received input data has been temporarily stored, data processing device 510 is set up to perform execution of code in a manner described above. The signals received on data port 599 can be used by apparatus 500 for detecting markings with different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types, using different integration times/exposure times. The signals received on data port 599 can be used by apparatus 500 for determining reliable information from said markings detection. The signals received on data port 599 can be used by apparatus 500 for creating different masks based on the thus determined reliable information for different markings corresponding to different object types.
Parts of the methods described herein can be performed by apparatus 500 by means of data processing device 510 running the program stored in separate memory 560 or read/write memory 550. When apparatus 500 runs the program, parts of the methods described herein are executed.
The foregoing description of the preferred embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated.

Claims (25)

1. A method for improving reliability of image information from a 3D detection unit (110) for use on a vehicle (1, 1a, 1b, 1c), the 3D detection unit (110) being configured for illuminating the object and detecting from the object reflected light representing 3D information so as to determine a 3D image, comprising the step of calibrating the 3D detection unit (110) for a certain 3D detection unit (110) position on the vehicle, characterized in that the step of calibrating the 3D detection unit (110) comprises the steps of: - detecting (S1) markings (S10, S20, S30) with different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types, being the minimum number of pixels returned by said markings to be detected, using different integration times/exposure times; - determining (S2) reliable information from said markings detection; - creating (S3) different masks based on the thus determined reliable information for different markings corresponding to different object types.
2. A method according to claim 1, comprising the step of creating different masks based on the thus determined reliable information for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit.
3. A method according to claim 1 or 2, wherein the step of detecting markings with known minimum reflectivity is performed for different illumination powers of the 3D detection unit (110).
4. A method according to any one of the preceding claims, wherein the step of detecting markings with known minimum reflectivity is performed for different illumination patterns of the 3D detection unit (110).
5. A method according to any one of the preceding claims, wherein the step of creating a mask comprises excluding information from 3D unit detection pixels determined not to be reliable.
6. A method according to any one of the preceding claims, comprising the steps of: detecting markings with known maximum reflectivity representing highest expected reflectivity, being the maximum number of pixels returned by said markings, using different integration times/exposure times; determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection.
7. A method according to any one of the preceding claims, wherein the step of detecting markings with known maximum reflectivity comprises the step of detecting markings with different known maximum reflectivity representing highest expected reflectivity corresponding to different object types, using different integration times/exposure times; wherein saturation characteristics comprising a saturation level from said markings detection are determined so as to determine reliable detections for said different object types.
8. A method according to any one of the preceding claims, wherein the step of detecting markings with known maximum reflectivity is performed for different illumination powers of the 3D detection unit (110).
9. A method according to any one of the preceding claims, wherein the step of detecting markings with known maximum reflectivity is performed for different illumination patterns of the 3D detection unit (110).
10. A method according to any one of the preceding claims, comprising the step of storing thus determined masks.
11. A method according to claim 6 or 7, comprising the step of storing thus determined saturation characteristics.
12. A system (I) for improving reliability of image information from a 3D detection unit (110) for use on a vehicle (1, 1a, 1b, 1c), the 3D detection unit (110) being configured for illuminating the object and detecting from the object reflected light representing 3D information so as to determine a 3D image, comprising means for calibrating the 3D detection unit (110) for a certain 3D detection unit (110) position on the vehicle, characterized in that the means for calibrating the 3D detection unit (110) comprises means (112) for detecting markings (S10, S20, S30) comprising means (112b) for detecting markings (S10, S20, S30) with different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types, being the minimum number of pixels returned by said markings to be detected, using different integration times/exposure times; means (120) for determining reliable information from said markings detection; and means (134) for creating different masks based on the thus determined reliable information for different markings corresponding to different object types.
13. A system according to claim 12, wherein the means (112) for detecting markings comprises means (112a) for detecting markings with known minimum reflectivity representing lowest supported reflectivity using different integration times/exposure times for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit; wherein means (120) for determining reliable information from said markings detection is provided; and wherein means (132) for creating different masks based on the thus determined reliable information for different 3D detection unit positions is provided.
14. A system according to claim 12 or 13, wherein the means (112a, 112b) for detecting markings with known minimum reflectivity is arranged to be performed for different illumination powers of the 3D detection unit (110).
15. A system according to any of claims 12-14, wherein the means (112a, 112b) for detecting markings with known minimum reflectivity is arranged to be performed for different illumination patterns of the 3D detection unit (110).
16. A system according to any of claims 12-15, wherein the means (132, 134) for creating a mask comprises means (130a) for excluding information from 3D unit detection pixels determined not to be reliable.
17. A system according to any of claim 12-16, wherein the means (112) for detecting markings comprises means (112c) for detecting markings with known maximum reflectivity representing highest expected reflectivity, being the maximum number of pixels returned by said markings, using different integration times/exposure times; and means (140) for determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection.
18. A system according to any of claims 12-17, wherein the means (112) for detecting markings comprises means (112d) for detecting markings with different known maximum reflectivity representing highest expected reflectivity corresponding to different object types, using different integration times/exposure times; wherein means (140) for determining saturation characteristics comprising a saturation level from said markings detection is provided so as to determine reliable detections for said different object types.
19. A system according to any of claims 12-18, wherein the means (112c, 112d) for detecting markings with known maximum reflectivity is arranged to be performed for different illumination powers of the 3D detection unit (110).
20. A system according to any of claims 12-19, wherein the means (112c, 112d) for detecting markings with known maximum reflectivity is arranged to be performed for different illumination patterns of the 3D detection unit (110).
21. A system according to any of claims 12-20, comprising means (150) for storing thus determined masks.
22. A system according to any of claims 17-20, comprising means (160) for storing thus determined saturation characteristics.
23. A vehicle (1) comprising a system (I) according to any of claims 12-22.
24. A computer program (P) for improving reliability of image information from a 3D detection unit (110) for use on a vehicle, said computer program (P) comprising program code which, when run on an electronic control unit (100) or another computer (500) connected to the electronic control unit (100), causes the electronic control unit to perform the steps according to claims 1-11.
25. A computer program product comprising a digital storage medium storing the computer program according to claim 24.
SE1451428A 2014-11-26 2014-11-26 Method and system for improving quality of image information from a 3d detection unit for use on a vehicle SE538501C2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE1451428A SE538501C2 (en) 2014-11-26 2014-11-26 Method and system for improving quality of image information from a 3d detection unit for use on a vehicle
DE102015014318.2A DE102015014318A1 (en) 2014-11-26 2015-11-05 Method and system for improving the quality of image information of a 3D recognition unit for use in a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE1451428A SE538501C2 (en) 2014-11-26 2014-11-26 Method and system for improving quality of image information from a 3d detection unit for use on a vehicle

Publications (2)

Publication Number Publication Date
SE1451428A1 SE1451428A1 (en) 2016-05-27
SE538501C2 true SE538501C2 (en) 2016-08-16

Family

ID=55967887

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1451428A SE538501C2 (en) 2014-11-26 2014-11-26 Method and system for improving quality of image information from a 3d detection unit for use on a vehicle

Country Status (2)

Country Link
DE (1) DE102015014318A1 (en)
SE (1) SE538501C2 (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2475104A (en) 2009-11-09 2011-05-11 Alpha Vision Design Res Ltd Detecting movement of 3D objects using a TOF camera

Also Published As

Publication number Publication date
DE102015014318A1 (en) 2016-06-02
SE1451428A1 (en) 2016-05-27
