SE538500C2 - Method and system for improving quality of image information from a 3D detection unit for use on a vehicle - Google Patents

Method and system for improving quality of image information from a 3D detection unit for use on a vehicle

Info

Publication number
SE538500C2
Authority
SE
Sweden
Prior art keywords
different
detection unit
markings
detection
reflectivity
Application number
SE1451427A
Other languages
Swedish (sv)
Other versions
SE1451427A1 (en)
Inventor
Salmén Mikael
Original Assignee
Scania Cv Ab
Application filed by Scania Cv Ab
Priority to SE1451427A
Priority to DE102015014320.4A
Publication of SE1451427A1
Publication of SE538500C2


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141 Control of illumination
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K31/00 Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator
    • B60K31/0008 Vehicle fittings, acting on a single sub-unit only, for automatically controlling vehicle speed, i.e. preventing speed from exceeding an arbitrarily established velocity or maintaining speed at a particular velocity, as selected by the vehicle operator including means for detecting potential obstacles in vehicle path
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Chemical & Material Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

ABSTRACT The present invention relates to a method for improving reliability of image information from a 3D detection unit for use on a vehicle, the 3D detection unit being configured for illuminating the object and detecting from the object reflected light representing 3D information so as to determine a 3D image. The method comprises the step of calibrating the 3D detection unit for a certain 3D detection unit position on the vehicle. The step of calibrating the 3D detection unit comprises the steps of: detecting (S1) markings with known minimum reflectivity representing lowest supported reflectivity using different integration times/exposure times for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit; determining (S2) reliable information from said markings detection; and creating (S3) different masks based on the thus determined reliable information for said different 3D detection unit positions. The present invention also relates to a system for improving reliability of image information from a 3D detection unit for use on a vehicle. The present invention also relates to a vehicle. The present invention also relates to a computer program and a computer program product. (Fig. 3)

Description

METHOD AND SYSTEM FOR IMPROVING QUALITY OF IMAGE INFORMATION FROM A 3D DETECTION UNIT FOR USE ON A VEHICLE

TECHNICAL FIELD

The invention relates to a method for improving reliability of image information from a 3D detection unit for use on a vehicle according to the preamble of claim 1. The invention also relates to a system for improving reliability of image information from a 3D detection unit for use on a vehicle. The invention also relates to a vehicle. The invention in addition relates to a computer program and a computer program product.
BACKGROUND ART

Image quality metrics and benchmarking methods exist for 2D cameras. For Time-of-Flight cameras different metrics can be used to measure the quality of the range data, for instance measuring the variance of the range measurements. Based on these metrics the exposure time of the camera can be automatically adjusted to achieve good image quality of the entire image.
When installing a Time-of-Flight camera on a vehicle the position relative to the ground will depend on the vehicle type. Further, due to the limited illumination strength of the camera, the camera will not be able to illuminate the whole field-of-view and therefore the auto-exposure function of the camera may try to maximize visibility and drive the camera at too high an exposure, causing other visual problems instead.
WO201154971 discloses a method and system for detecting 3D objects within a defined area. A camera is calibrated with reference to the ground based on distance data to the ground. An object detection volume is created in which all points from the ground and upwardly to e.g. 10 cm are excluded.
OBJECTS OF THE INVENTION

An object of the present invention is to provide a method for improving reliability of image information from a 3D detection unit for use on a vehicle which is cost efficient.
Another object of the present invention is to provide a system for improving reliability of image information from a 3D detection unit for use on a vehicle which is cost efficient.
SUMMARY OF THE INVENTION

These and other objects, apparent from the following description, are achieved by a method, a system, a vehicle, a computer program and a computer program product, as set out in the appended independent claims. Preferred embodiments of the method and the system are defined in the appended dependent claims.
Specifically an object of the invention is achieved by a method for improving reliability of image information from a 3D detection unit for use on a vehicle, the 3D detection unit being configured for illuminating the object and detecting from the object reflected light representing 3D information so as to determine a 3D image. The method comprises the step of calibrating the 3D detection unit for a certain 3D detection unit position on the vehicle. The step of calibrating the 3D detection unit comprises the steps of: detecting markings with known minimum reflectivity representing lowest supported reflectivity using different integration times/exposure times for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit; determining reliable information from said markings detection; creating different masks based on the thus determined reliable information for said different 3D detection unit positions. Hereby the same 3D detection unit may be used for different vehicle configurations requiring different positions of the 3D detection unit, including different heights and/or directions, for identification of an object, thus reducing costs. Further, the quality of the information/data of the image of the object is improved so that processing the data will result in a better identification of the detected object independent of the different positions, i.e. heights/directions, of the 3D detection unit. Thus, the reliability of the image information from a 3D detection unit is improved. The amount of information/data required will be minimized by creating the different masks for the different positions of the 3D detection unit. The maximum object detection range for the 3D detection unit may hereby be determined for the different positions of the 3D detection unit. Thus, a generic solution for a vehicle fleet using one variant of 3D detection unit, instead of multiple adjusted variants or manually creating a mask for each 3D detection unit depending on the position on the vehicle, is hereby facilitated. By thus associating different masks with different positions of a 3D detection unit on a vehicle, such different positions may be identifiable such that when a 3D detection unit is arranged at a certain position on a vehicle, the 3D detection unit is controlled such that the mask associated with the identified position is applied.
According to an embodiment of the method the step of detecting markings with known minimum reflectivity comprises detecting markings with different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types, using different integration times/exposure times; wherein reliable information from said markings detection is determined; and wherein different masks based on the thus determined reliable information for different markings corresponding to different object types are created. Hereby the same 3D detection unit may be used for different object types. Further, the quality of the information/data of the image of different objects is improved so that processing the data will result in a better identification of the detected objects. The amount of information/data required will be minimized by creating the different masks for the different objects. The maximum object detection range for the 3D detection unit may hereby be determined for the different objects. Thus, a generic solution for a vehicle fleet using one variant of 3D detection unit, instead of multiple adjusted variants or manually creating a mask for each 3D detection unit depending on the object, is hereby facilitated.
According to an embodiment of the method the step of detecting markings with known minimum reflectivity is performed for different illumination powers of the 3D detection unit. Hereby different masks for different illumination powers may be created. Thus, different masks for different illumination powers for the respective position of the 3D detection unit may hereby be created. Thus, different masks for different illumination powers for different objects may hereby be created. By thus associating different masks with different illumination powers for a 3D detection unit, such a 3D detection unit with variable illumination power arranged on a vehicle may be controlled such that the mask associated with the set illumination power is applied.
According to an embodiment of the method the step of detecting markings with known minimum reflectivity is performed for different illumination patterns of the 3D detection unit. Hereby different masks for different illumination patterns may be created. Thus, different masks for different illumination patterns for the respective position of the 3D detection unit may hereby be created. Thus, different masks for different illumination patterns for different objects may hereby be created. By thus associating different masks with different illumination patterns for a 3D detection unit, such a 3D detection unit with variable illumination patterns arranged on a vehicle may be controlled such that the mask associated with the set illumination pattern is applied.
According to an embodiment of the method the step of creating a mask comprises excluding information from 3D unit detection pixels determined not to be reliable. Hereby the quality of the image information/data is improved. Object detection and image processing speed is enhanced since not all pixels need to be processed.
According to an embodiment the method comprises the steps of: detecting markings with known maximum reflectivity representing highest expected reflectivity using different integration times/exposure times; and determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection. Hereby it is possible to determine when a certain object will not be detectable due to its reflectivity/saturation.
According to an embodiment of the method the step of detecting markings with known maximum reflectivity comprises the step of detecting markings with different known maximum reflectivity representing highest expected reflectivity corresponding to different object types, using different integration times/exposure times; wherein saturation characteristics comprising a saturation level from said markings detection are determined so as to determine reliable detections for said different object types. Hereby it is possible to determine when different objects will not be detectable due to their reflectivity/saturation.
According to an embodiment of the method the step of detecting markings with known maximum reflectivity is performed for different illumination powers of the 3D detection unit. Hereby it is possible to determine when a certain object will not be detectable due to its reflectivity/saturation for different illumination powers of the 3D detection unit.
According to an embodiment of the method the step of detecting markings with known maximum reflectivity is performed for different illumination patterns of the 3D detection unit. Hereby it is possible to determine when a certain object will not be detectable due to its reflectivity/saturation for different illumination patterns of the 3D detection unit.
According to an embodiment the method comprises the step of storing thus determined masks. Hereby such masks may be used for validating against another mask of another 3D detection unit installed on another vehicle with corresponding characteristics, i.e. the 3D detection unit being positioned in the same way. Hereby such masks may be used in another 3D detection unit of the same kind without requiring calibration of that 3D detection unit, although some correlation may be preferred.
According to an embodiment the method comprises the step of storing thus determined saturation characteristics. Hereby such saturation characteristics may be used for validating against saturation characteristics of another 3D detection unit installed on another vehicle with corresponding characteristics, i.e. the 3D detection unit being positioned in the same way. Hereby such saturation characteristics may be used in another 3D detection unit of the same kind without requiring calibration of that 3D detection unit, although some correlation may be preferred.
Specifically an object of the invention is achieved by a system for improving reliability of image information from a 3D detection unit for use on a vehicle, the 3D detection unit being configured for illuminating the object and detecting from the object reflected light representing 3D information so as to determine a 3D image, adapted to perform the methods as set out above.
The system according to the invention has the advantages according to the corresponding method claims.
BRIEF DESCRIPTION OF THE DRAWINGS

For a better understanding of the present invention reference is made to the following detailed description when read in conjunction with the accompanying drawings, wherein like reference characters refer to like parts throughout the several views, and in which:

Fig. 1 schematically illustrates a side view of a vehicle according to the present invention;

Fig. 2 schematically illustrates a system for improving reliability of image information from a 3D detection unit for use on a vehicle according to an embodiment of the present invention;

Fig. 3 schematically illustrates a block diagram of a method for improving reliability of image information from a 3D detection unit for use on a vehicle according to an embodiment of the present invention;

Fig. 4a schematically illustrates a side view of a vehicle with a certain configuration for a certain position of a 3D detection unit according to the present invention;

Fig. 4b schematically illustrates a side view of a vehicle with a certain configuration for a certain position of a 3D detection unit according to the present invention;

Fig. 4c schematically illustrates a plan view of a vehicle with a 3D detection unit according to the present invention; and

Fig. 5 schematically illustrates a computer according to an embodiment of the present invention.
DETAILED DESCRIPTION

Hereinafter the term “link” refers to a communication link which may be a physical connector, such as an optoelectronic communication wire, or a non-physical connector such as a wireless connection, for example a radio or microwave link.
Hereinafter the term “2D” refers to two dimensional. Hereinafter the term “3D” refers to three dimensional.
Hereinafter the term “markings” refers to any suitable markings having a known minimum reflectivity representing lowest supported reflectivity corresponding to a certain object. The term “markings” may also refer to any suitable markings having a known maximum reflectivity representing highest expected reflectivity. Such a marking may be any suitable element such as one or more stripes. The term “markings” may refer to a marking forming a continuous pattern and/or several markings forming a pattern. The term “markings” may refer to different markings with different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types. The markings may correspond to the ground, but also to other objects such as pedestrians, animals, bicyclists, reflective objects such as road signs, markings on the road, crash barriers etc. The markings/mask may consider expected object size and the likelihood of detecting a sufficient amount of signals/pixels from the object in a certain area of the field-of-view of the 3D detection unit. For instance, a close object may, due to strong signals/reflection and larger size, generate e.g. 100 pixels/signals, compared to an object further away which, due to weaker signal/reflection and, for geometric reasons, a smaller size on the sensor, generates e.g. 20 pixels.
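[Editor's note: to make the pixel-count figures above concrete, the expected number of pixels an object returns can be estimated from its size, its distance and the sensor's angular resolution. The sketch below is not from the patent; the 60 degree field of view and 320-pixel resolution are assumed purely for illustration, and real counts would drop further with distance as the returned signal weakens.]

```python
import math

# Assumed optics: a 60 degree field of view imaged onto 320 pixels across.
FOV_DEG = 60.0
PIXELS_ACROSS = 320

def expected_pixel_count(object_size_m: float, distance_m: float) -> float:
    """Approximate pixels subtended by a square object at a given distance."""
    angular_size_deg = math.degrees(object_size_m / distance_m)  # small-angle approximation
    side_px = angular_size_deg * (PIXELS_ACROSS / FOV_DEG)       # pixels along one side
    return side_px ** 2                                          # square object assumed

print(round(expected_pixel_count(0.3, 10.0)))  # near object: ~84 pixels
print(round(expected_pixel_count(0.3, 20.0)))  # same object at twice the range: ~21 pixels
```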
Hereinafter the term “3D detection unit position” refers to the height of the 3D detection unit and the direction of the 3D detection unit.
The term “direction of the 3D detection unit” refers to the direction in which the 3D detection unit is pointing and thus detecting. The direction of the 3D detection unit comprises angles around the x, y and z axes.
Hereinafter the term “integration time/exposure time” is used. For a 3D detection unit constituted by a Time-of-Flight camera, such as a continuous-wave/PMD type of sensor, for example, it is referred to as the integration time, such a Time-of-Flight camera taking four images during the integration time.
Fig. 1 schematically illustrates a vehicle 1 according to an embodiment of the present invention. The exemplified vehicle 1 is a heavy vehicle in the shape of a truck. The vehicle according to the present invention could be any suitable vehicle such as a bus, a car, a train or the like. The vehicle comprises a system for improving reliability of image information from a 3D detection unit 110 for use on a vehicle according to the present invention. The vehicle comprises a 3D detection unit 110. The 3D detection unit 110 is installed on the vehicle, here the vehicle cab, at a certain position comprising a certain height H and a certain direction D for detection, the height and/or direction varying depending on the vehicle configuration, e.g. cab configuration and/or chassis configuration. The position comprising the direction comprises angles around the x, y and z axes, which need to be determined, i.e. roll, pitch, yaw. If the 3D detection unit 110 is tilted 1 degree around the y-axis, the y-axis being in the forward direction of the vehicle, then the result will differ compared to a 3D detection unit 110 having no roll offset from the horizontal line. A vehicle may have multiple possible mounting positions.
Fig. 2 schematically illustrates a system I for improving reliability of image information from a 3D detection unit 110 for use on a vehicle according to an embodiment of the present invention. The system I comprises an electronic control unit 100.
The 3D detection unit 110 is configured for illuminating an object O by means of light L1 and detecting from the object reflected light L2 representing 3D information so as to determine a 3D image. The 3D detection unit 110 is configured for detecting reflected light by means of time-of-flight technique. The 3D detection unit 110 comprises active light L1 for illumination of the object O. The light L1 could be any suitable light such as infrared light, visible light, LASER light or the like. The 3D detection unit 110 comprises according to an embodiment means for illumination of an object O by means of infrared light.
The 3D detection unit 110 comprises according to an embodiment a time-of-flight camera unit. The time-of-flight camera unit may be any kind of sensor using a signal transmitted through air and correlating it with the received signal in order to conclude a distance to the measured point(s). The time-of-flight (ToF) camera unit may be any kind of ToF camera based on for instance Continuous-Wave (CW, such as PMD, Swissranger, etc.), pulsed-wave (such as TDC) or range gating (Obzerv) principles. ToF cameras typically use a LED that illuminates the whole scene at once. The 3D detection unit 110 comprises according to an embodiment a LIDAR, i.e. a laser scanner unit.
The 3D detection unit 110 is arranged to detect a 3D image of an object O.
The system I comprises means 112, 112a, 112b, 120, 130, 140 for calibrating the 3D detection unit 110 for a certain 3D detection unit position on a vehicle.
The means for calibrating the 3D detection unit comprises means 112 for detecting markings. The means 112 for detecting markings comprises means 112a for detecting markings with known minimum reflectivity representing lowest supported reflectivity using different integration times/exposure times for different 3D detection unit positions. The different 3D detection unit positions comprise height above said markings and direction of the 3D detection unit. Said means 112 for detecting markings is comprised in said 3D detection unit 110. The 3D detection unit 110 is thus configured to detect said markings. The system I and thus the 3D detection unit 110 comprises said means 112a for detecting markings with known minimum reflectivity representing lowest supported reflectivity using different integration times/exposure times.
According to an embodiment the means 112 for detecting markings comprises means for detecting markings with a known certain reflectivity between lowest and highest reflectivity representing a certain supported reflectivity using different integration times/exposure times. For each position of the 3D detection unit 110 the means 112 for detecting markings will thus take images using different integration times/exposure times to get range data for each position.
The means 112 for detecting markings comprises means 112b for detecting markings with different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types, using different integration times/exposure times.
The means for calibrating the 3D detection unit 110 comprises means 120 for determining reliable information from said markings detection. The system I thus comprises means 120 for determining reliable information from said markings detection. The means 120 for determining reliable information from said markings detection comprises applying a quality metric on each range pixel such that the 3D detection unit 110, i.e. the means 112a for detecting markings with known minimum reflectivity, will be able to estimate which pixels/measurements can be assumed to be reliable. According to an embodiment a margin is added for increased object detection robustness.
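[Editor's note: the description does not fix a particular quality metric. Below is a minimal sketch assuming the metric is the temporal standard deviation of each range pixel over repeated captures of the markings, with the robustness margin realized as a morphological shrink of the reliable region; the threshold and margin values are assumptions, not values from the patent.]

```python
import numpy as np
from scipy.ndimage import binary_erosion

def reliable_pixels(range_frames, std_threshold=0.05, margin_px=2):
    """Flag pixels whose range measurement is stable across repeated frames.

    range_frames: (n_frames, h, w) array of range images of the marking,
    all taken with the same integration time/exposure time.
    """
    per_pixel_std = np.nanstd(range_frames, axis=0)   # quality metric (metres)
    reliable = per_pixel_std < std_threshold
    # Safety margin: shrink the reliable region so borderline pixels at
    # its edge are excluded, increasing object detection robustness.
    return binary_erosion(reliable, iterations=margin_px)
```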
The means 112a, 112b for detecting markings with known minimum reflectivity is arranged to be performed for different illumination powers of the 3D detection unit 110.
The means 112a, 112b for detecting markings with known minimum reflectivity is arranged to be performed for different illumination patterns of the 3D detection unit 110.
The means 120 for determining reliable information from said markings detection comprises determining reliable information from different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types.
When determining reliable information from different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types, object size is considered. The means 112 for detecting markings for object detection is according to an embodiment configured to assume a certain threshold of pixels, for example 30 pixels, to allow a cluster to be included in the object detection. Hence markings/objects returning a number of pixels below the threshold, for example 20 pixels, will not be detected. Thus considering object size when creating a mask may considerably improve the quality/reliability.
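[Editor's note: a hedged sketch of the cluster-size threshold described above. The 30-pixel threshold comes from the example in the text; the connected-component labelling approach is an assumption about how such clustering could be realized.]

```python
import numpy as np
from scipy.ndimage import label

def filter_clusters(detection_mask, min_pixels=30):
    """Drop pixel clusters smaller than the detection threshold,
    mirroring the 30-pixel example in the description."""
    labels, n_clusters = label(detection_mask)   # connected components
    keep = np.zeros_like(detection_mask, dtype=bool)
    for i in range(1, n_clusters + 1):
        cluster = labels == i
        if cluster.sum() >= min_pixels:          # e.g. a 20-pixel return is rejected
            keep |= cluster
    return keep
```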
Detected pixel density could be used as one measure to threshold where the cut-off distance for the mask is. For example, for one marking only 25% of the pixels provide sufficiently strong signals and at this distance the pixel density is e.g. 10 cm, i.e. the physical distance between two adjacent pixels is 10 cm, and the detectable object needs to be 30 cm by 30 cm in size. This means that with 100% detection 3x3 pixels can be detected, meaning 9 pixels. If only 25% is detectable it results in 2 pixels. If the threshold is set at 5 pixels, the object is not detectable at this distance.
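[Editor's note: the cut-off arithmetic in this example can be written out directly. The sketch below simply re-computes the figures given above (30 cm object, 10 cm pixel spacing, 25% usable pixels, 5-pixel threshold); the function and its parameters are illustrative only.]

```python
def detectable(object_size_m, pixel_pitch_m, valid_fraction, min_pixels):
    """Is an object detectable at the distance where adjacent pixels
    are pixel_pitch_m apart on the object?"""
    pixels_per_side = object_size_m / pixel_pitch_m   # 0.30 / 0.10 = 3
    full_pixels = pixels_per_side ** 2                # 3 x 3 = 9 pixels at 100% detection
    usable = full_pixels * valid_fraction             # 25% usable -> ~2 pixels
    return usable >= min_pixels

# 30 cm object, 10 cm pixel spacing, 25% strong signals, 5-pixel threshold:
print(detectable(0.30, 0.10, 0.25, 5))   # False: not detectable at this distance
```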
The means for calibrating the 3D detection unit 110 comprises means 132 for creating different masks based on the thus determined reliable information for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit. The system I thus comprises means 132 for creating different masks based on the thus determined reliable information for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit. The masks are created by only allowing the reliable pixels. Once the 3D detection unit 110, i.e. the means 120 for determining reliable information from said markings detection, has decided which pixels can be considered as reliable, each mask is stored and associated with the used integration time and object type.
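[Editor's note: a minimal sketch of such a mask store, assuming masks are keyed by position, integration time and object type as the text describes; the dictionary structure and the NaN-masking convention are assumptions for illustration.]

```python
from typing import Dict, Tuple
import numpy as np

MaskKey = Tuple[str, float, str]   # (position id, integration time, object type)
mask_store: Dict[MaskKey, np.ndarray] = {}

def store_mask(position, integration_time, object_type, mask):
    """Associate a calibrated mask with its position/integration time/object type."""
    mask_store[(position, integration_time, object_type)] = mask

def apply_mask(range_image, position, integration_time, object_type):
    """Keep only pixels the calibration found reliable; others become NaN."""
    mask = mask_store[(position, integration_time, object_type)]
    return np.where(mask, range_image, np.nan)
```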
The means for calibrating the 3D detection unit 110 comprises means 134 for creating different masks based on the thus determined reliable information for different markings corresponding to different object types. The system I thus comprises means 134 for creating different masks based on the thus determined reliable information for different markings corresponding to different object types.
The means 132 and the means 134 are comprised in a means 130 for creating masks. The system I thus comprises means 130 for creating masks based on reliable information for markings.
The means 130 for creating a mask comprises means 130a for excluding information from 3D unit detection pixels determined not to be reliable. The means 130a for excluding information from 3D unit detection pixels determined not to be reliable may comprise any suitable filter means. The markings do not need to be continuous as the 3D detection unit, e.g. a Time-of-Flight camera, can interpolate/extrapolate a mask. The markings should enable the 3D detection unit to build up the information for the whole field-of-view, only limited by its illumination.
The means 112 for detecting markings comprises means 112c for detecting markings with known maximum reflectivity representing highest expected reflectivity using different integration times/exposure times. The system I and thus the 3D detection unit 110 comprises said means 112c for detecting markings with known maximum reflectivity representing highest expected reflectivity using different integration times/exposure times.
The means 112 for detecting markings comprises means 112d for detecting markings with different known maximum reflectivity representing highest expected reflectivity corresponding to different object types, using different integration times/exposure times.
The means 112c, 112d for detecting markings with known maximum reflectivity is arranged to be performed for different illumination powers of the 3D detection unit 110. The means 112c, 112d for detecting markings with known maximum reflectivity is arranged to be performed for different illumination patterns of the 3D detection unit 110.
The system I comprises means 140 for determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection.
The means 140 for determining saturation characteristics comprising a saturation level from said markings detection is provided so as to determine reliable detections for said different object types.
By thus determining saturation characteristics comprising a saturation level from said markings detection with said means 140, it will be known to the system I, including the 3D detection unit 110, when increasing the integration time to see an object with relatively low reflectivity, for instance the ground, at what point another object with relatively higher reflectivity, for instance pedestrians, can be expected not to be detectable due to saturation. The system I, including the 3D detection unit 110, may according to an embodiment be controlled in such a way that the 3D detection unit 110 at a certain point will go into multi-exposure mode, so called HDR (High Dynamic Range), to ensure that the most critical object type, e.g. pedestrians, is detectable at all times.
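[Editor's note: a hedged sketch of that HDR decision, assuming the stored saturation characteristics reduce to one limiting integration time per object type; the function, its names and the numbers are invented for illustration.]

```python
def choose_exposure_mode(integration_time, saturation_characteristics,
                         critical_object="pedestrian"):
    """Switch to multi-exposure (HDR) mode when the chosen integration time
    would saturate the most critical object type.

    saturation_characteristics maps object type -> the integration time
    above which that object type saturates (values are assumptions).
    """
    limit = saturation_characteristics[critical_object]
    if integration_time > limit:
        return "hdr"       # multi-exposure mode keeps pedestrians detectable
    return "single"

mode = choose_exposure_mode(
    integration_time=2.0e-3,   # 2 ms needed to see the low-reflectivity ground
    saturation_characteristics={"pedestrian": 1.5e-3, "ground": 4.0e-3},
)
print(mode)   # 'hdr': pedestrians would saturate at 1.5 ms
```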
The system I comprises means 100, 150 for storing thus determined masks. The means 100, 150 for storing thus determined masks comprises means 150 for storing thus determined masks externally, comprising external storage means. Said means 150 for external storage of masks may comprise any suitable server unit, computer or the like.
The system I comprises means 100, 160 for storing thus determined saturation characteristics. The means 100, 160 for storing thus determined saturation characteristics comprises external means 160 for storing thus determined saturation characteristics. Said external means 160 may comprise any suitable server unit, computer or the like.
The electronic control unit 100 is operatively connected to the means 112a for detecting markings with known minimum reflectivity representing lowest supported reflectivity using different integration times/exposure times for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit via a link 12a. The electronic control unit 100 is via the link 12a arranged to receive a signal from said means 112a representing 3D image data for said markings for different integration times/exposure times for the different positions of the 3D detection unit 110.
The electronic control unit 100 is operatively connected to the means 112b for detecting markings with different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types using different integration times/exposure times via a link 12b. The electronic control unit 100 is via the link 12b arranged to receive a signal from said means 112b representing 3D image data for said markings for different integration times/exposure times for the different object types.
The electronic control unit 100 is operatively connected to the means 120 for determining reliable information from said markings detection via a link 120a. The electronic control unit 100 is via the link 120a arranged to send a signal to said means 120 representing 3D image data for said markings for different integration times/exposure times for the different positions of the 3D detection unit 110 and/or 3D image data for said markings for different integration times/exposure times for the different object types.
The electronic control unit 100 is operatively connected to the means 120 for determining reliable information from said markings detection via a link 120b. The electronic control unit 100 is via the link 120b arranged to receive a signal from said means 120 representing data for reliable information from said markings detection, comprising data for reliable information from markings detection for different integration times/exposure times for the different positions of the 3D detection unit 110 and/or data for reliable information from markings detection for different integration times/exposure times for the different object types.
The electronic control unit 100 is operatively connected to the means 132 for creating different masks based on the thus determined reliable information for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit via a link 132a. The electronic control unit 100 is via the link 132a arranged to send a signal to said means 132 representing mask creation data for determined reliable information for different 3D detection unit positions.
The electronic control unit 100 is operatively connected to the means 132 for creating different masks based on the thus determined reliable information for different 3D detection unit positions via a link 132b. The electronic control unit 100 is via the link 132b arranged to receive a signal from said means 132 representing mask data for different masks based on the thus determined reliable information for different 3D detection unit positions.
The electronic control unit 100 is operatively connected to the means 134 for creating different masks based on the thus determined reliable information for different markings corresponding to different object types via a link 134a. The electronic control unit 100 is via the link 134a arranged to send a signal to said means 134 representing mask creation data for determined reliable information for different object types.
The electronic control unit 100 is operatively connected to the means 134 for creating different masks based on the thus determined reliable information for different markings corresponding to different object types via a link 134b. The electronic control unit 100 is via the link 134b arranged to receive a signal from said means 134 representing mask data for different masks based on the thus determined reliable information for different object types.
The electronic control unit 100 is operatively connected to the means 130a for excluding information from 3D unit detection pixels determined not to be reliable via a link 30a. The electronic control unit 100 is via the link 30a arranged to send a signal to said means 130a representing filtering data for excluding information from 3D unit detection pixels determined not to be reliable.
The mask data sent from said means 132 represents filtered mask data when the means 130a has excluded information from 3D unit detection pixels determined not to be reliable.
The mask data sent from said means 134 represents filtered mask data when the means 130a has excluded information from 3D unit detection pixels determined not to be reliable.
The electronic control unit 100 is operatively connected to the means 112c for detecting markings with known maximum reflectivity representing highest expected reflectivity using different integration times/exposure times via a link 12c. The electronic control unit 100 is via the link 12c arranged to receive a signal from said means 112c representing 3D image data for said markings for different integration times/exposure times.
The electronic control unit 100 is operatively connected to the means 112d for detecting markings with different known maximum reflectivity representing highest expected reflectivity corresponding to different object types, using different integration times/exposure times via a link 12d. The electronic control unit 100 is via the link 12d arranged to receive a signal from said means 112d representing 3D image data for said markings for different integration times/exposure times for the different object types. The electronic control unit 100 is operatively connected to the means 140 for determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection via a link 140a. The electronic control unit 100 is via the link 140a arranged to send a signal to said means 140 representing 3D image data for said markings comprising 3D image data for said markings for different integration times/exposure times for the different object types.
The electronic control unit 100 is operatively connected to the means 140 for determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection via a link 140b. The electronic control unit 100 is via the link 140b arranged to receive a signal from said means 140 representing data for saturation from said markings detection including saturation levels for different objects so as to determine reliable detection.
The means 130 for creating masks based on reliable information for markings is operatively connected to the means 150 for storing thus determined masks via a link 150a. The means 130 is via the link 150a arranged to send a signal to said means 150 representing mask data for masks for different 3D detection unit positions and/or mask data for masks for different objects, wherein the means 150 is arranged to store said masks.
The electronic control unit 100 is according to an embodiment arranged to store masks for different 3D detection unit positions received as mask data from the means 132 via the link 132b and/or masks for different objects received as mask data from the means 134 via the link 134b.
The means 130 for creating masks based on reliable information for markings is operatively connected to the means 150 for storing thus determined masks externally via a link 150b. The means 130 is via the link 150b arranged to receive a signal from said means 150 representing mask data for masks for different 3D detection unit positions and/or mask data for masks for different objects having been stored externally for another 3D detection unit and/or another vehicle configuration, wherein the means 130 is arranged to validate said masks with masks created by means of the means 130.
The means for storing 150, e.g. a server unit for externally storing masks, may store a large number of masks for a large number of positions and a large number of objects, according to an embodiment also for different illumination powers and different illumination patterns, wherein the electronic control unit 100 according to an embodiment is configured to assist in sorting out the masks such that mask data representing a reduced number of masks is sent to the electronic control unit 100.
According to an embodiment the electronic control unit 100 is configured to request accessible masks based upon position, perform a first estimation and then choose the most suitable mask of the received mask data.
According to an embodiment the electronic control unit 100 performs an estimation, sends the estimation together with the position and possible other relevant information, such as object type and type of 3D detection unit, to the means for storing 150, e.g. an external server unit, which assesses the masks and sends data representing the most suitable masks to the electronic control unit, which performs another estimation which is correlated to the received data. If not OK, then another estimation is performed.
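[Editor's note: expressed as Python-flavoured pseudocode, that negotiation could look as follows; the `ecu` and `server` objects and every method on them are hypothetical stand-ins, not an API from the patent, and the correlation threshold is invented.]

```python
def select_mask(ecu, server, position, object_type, unit_type, max_rounds=3):
    """Negotiate a mask with external storage: send a local estimate plus
    context, receive candidate masks, keep the first that correlates well;
    otherwise re-estimate and try again."""
    estimate = None
    for _ in range(max_rounds):
        estimate = ecu.estimate_mask()                 # local estimation
        candidates = server.query(estimate, position, object_type, unit_type)
        for mask in candidates:                        # server's "most suitable" masks
            if ecu.correlation(estimate, mask) > 0.9:  # assumed acceptance threshold
                return mask
    return estimate    # fall back to the locally estimated mask
```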
The means 140 for determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection is operatively connected to the means 160 for storing thus determined saturation characteristics via a link 160a. The means 140 is via the link 160a arranged to send a signal to said means 160 representing data for saturation from said markings detection so as to determine reliable detection, wherein the means 160 is arranged to store said saturation level for said markings including saturation levels for different objects. This may be particularly relevant where illumination power and/or illumination patterns are variable.
The electronic control unit 100 is according to an embodiment arranged to store saturation levels for said markings, including saturation levels for different objects.
The means 140 for determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection is operatively connected to the means 160 for storing thus determined saturation characteristics via a link 160b. The means 140 is via the link 160b arranged to receive a signal from said means 160 representing data for saturation from said markings detection having been stored externally for another 3D detection unit and/or another vehicle configuration, wherein the means 140 is arranged to validate said saturation characteristics with saturation characteristics determined by means of the means 140.
Fig. 3 schematically illustrates a block diagram of a method for improving reliability of image information from a 3D detection unit for use on a vehicle, the 3D detection unit being configured for illuminating the object and detecting from the object reflected light representing 3D information so as to determine a 3D image.
According to the embodiment the method for improving reliability of image information from a 3D detection unit for use on a vehicle comprises a step S1. In this step markings with known minimum reflectivity representing lowest supported reflectivity are detected, using different integration times/exposure times, for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit.
According to the embodiment the method for improving reliability of image information from a 3D detection unit for use on a vehicle comprises a step S2. In this step reliable information from said markings detection is determined.

According to the embodiment the method for improving reliability of image information from a 3D detection unit for use on a vehicle comprises a step S3. In this step different masks based on the thus determined reliable information are created for said different 3D detection unit positions.
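[Editor's note: tying steps S1-S3 together, a minimal calibration loop could look as below, assuming a `camera` stand-in with positioning and capture helpers and a range-stability metric like the one sketched earlier; all names and the threshold are assumptions, not the patent's implementation.]

```python
import numpy as np

RANGE_STD_THRESHOLD = 0.05   # metres; assumed reliability cut-off

def calibrate_unit(camera, positions, integration_times, n_frames=10):
    """Run steps S1-S3 for every mounting position (height + direction)."""
    masks = {}
    for pos in positions:
        camera.set_position(pos)                 # assumed test-rig helper
        for t in integration_times:
            camera.set_integration_time(t)
            # S1: image the minimum-reflectivity markings repeatedly
            frames = np.stack([camera.capture_range_image()
                               for _ in range(n_frames)])
            # S2: keep pixels whose range is stable over the repeats
            reliable = np.nanstd(frames, axis=0) < RANGE_STD_THRESHOLD
            # S3: one mask per (position, integration time)
            masks[(pos, t)] = reliable
    return masks
```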
Fig. 4a schematically illustrates a side view of a vehicle 1a with a certain configuration for a certain position of a 3D detection unit 110 according to the present invention, and Fig. 4b a side view of a vehicle 1b with a certain configuration for a certain position of a 3D detection unit 110, the vehicle 1b being higher than the vehicle 1a, resulting in a higher position of the 3D detection unit 110.
The 3D detection unit position on the vehicle 1a is at a height H1 above the ground O1, the ground or road constituting an object O1. The 3D detection unit position on the vehicle 1a is directed towards the ground with a direction D1 corresponding to a certain angle in the x, y, z plane.
The 3D detection unit position on the vehicle 1b is at a height H2 above the ground O1, the ground or road constituting an object O1. The 3D detection unit position on the vehicle 1b is directed towards the ground with a direction D2 corresponding to a certain angle in the x, y, z plane.
The detection range R1 of the 3D detection unit 110 when installed on the vehicle 1a may differ from the detection range R2 of the 3D detection unit 110 when installed on the vehicle 1b.
The 3D detection unit 110 is configured for detecting markings S10 with known minimum reflectivity representing lowest supported reflectivity using different integration times/exposure times. The known reflectivity of said markings S10 corresponds to a certain object O1, e.g. the ground/road O1. Said detection is performed for different 3D detection unit positions, here illustrated with the position on vehicle 1a and the position on vehicle 1b. Reliable information from said markings detection is then determined and different masks based on the thus determined reliable information for said different 3D detection unit positions are created.
Fig. 4c schematically illustrates a plan view of a vehicle 1c with a 3D detection unit according to the present invention. Different detection ranges R3, R4 are illustrated, as are different markings S10, S20, S30 corresponding to different objects, e.g. ground/road for marking S10, pedestrians for marking S20 and highly reflective objects such as road signs for marking S30.
The markings do not need to be continuous as the 3D detection unit, e.g. a Time-of-Flight camera, can interpolate/extrapolate a mask. The markings/pattern should enable the 3D detection unit to build up the information for the whole field-of-view, only limited by its illumination.
The 3D detection unit 110 is configured for detecting markings S10, S20, S30 with different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types, using different integration times/exposure times.
Reliable information from said markings detection is then determined and different masks M1, M2, M3 based on the thus determined reliable information for said different object types are created. Here markings S10, S20, S30 corresponding to different object types are illustrated in a field at the same time for illustrative purposes. Markings corresponding to one object type could be detected and then replaced by a marking corresponding to another, different object for detection, etc.
With reference to figure 5, a diagram of an apparatus 500 is shown. The control unit 100 described with reference to fig. 2 may according to an embodiment comprise apparatus 500. Apparatus 500 comprises a non-volatile memory 520, a data processing device 510 and a read/write memory 550. Non-volatile memory 520 has a first memory portion 530 wherein a computer program, such as an operating system, is stored for controlling the function of apparatus 500. Further, apparatus 500 comprises a bus controller, a serial communication port, I/O means, an A/D converter, a time and date entry and transmission unit, an event counter and an interrupt controller (not shown). Non-volatile memory 520 also has a second memory portion 540.
A computer program P is provided comprising routines for improving reliability of image information from a 3D detection unit for use on a vehicle. The program P comprises routines for detecting markings with known minimum reflectivity representing lowest supported reflectivity using different integration times/exposure times for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit. The program P comprises routines for determining reliable information from said markings detection. The program P comprises according to an embodiment routines for creating different masks based on the thus determined reliable information for different 3D detection unit positions. The program P may be stored in an executable manner or in a compressed condition in a separate memory 560 and/or in read/write memory 550.
When it is stated that data processing device 510 performs a certain function it should be understood that data processing device 510 performs a certain part of the program which is stored in separate memory 560, or a certain part of the program which is stored in read/write memory 550.
Data processing device 510 may communicate with a data communications port 599 by means of a data bus 515. Non-volatile memory 520 is adapted for communication with data processing device 510 via a data bus 512. Separate memory 560 is adapted for communication with data processing device 510 via a data bus 511. Read/write memory 550 is adapted for communication with data processing device 510 via a data bus 514. To the data communications port 599 e.g. the links connected to the control unit 100 may be connected.

When data is received on data port 599 it is temporarily stored in second memory portion 540. When the received input data has been temporarily stored, data processing device 510 is set up to perform execution of code in a manner described above. The signals received on data port 599 can be used by apparatus 500 for detecting markings with known minimum reflectivity representing lowest supported reflectivity using different integration times/exposure times for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit. The signals received on data port 599 can be used by apparatus 500 for determining reliable information from said markings detection. The signals received on data port 599 can be used by apparatus 500 for creating different masks based on the thus determined reliable information for different 3D detection unit positions.
Parts of the methods described herein can be performed by apparatus 500 by means of data processing device 510 running the program stored in separate memory 560 or read/write memory 550. When apparatus 500 runs the program, parts of the methods described herein are executed.
The foregoing description of the preferred embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated.

Claims (25)

CLAIMS
1. A method for improving reliability of image information from a 3D detection unit (110) for use on a vehicle (1, 1a, 1b, 1c), the 3D detection unit (110) being configured for illuminating the object and detecting from the object reflected light representing 3D information so as to determine a 3D image, comprising the step of calibrating the 3D detection unit (110) for a certain 3D detection unit (110) position on the vehicle, characterized in that the step of calibrating the 3D detection unit (110) comprises the steps of:
- detecting (S1) markings (S10, S20, S30) with known minimum reflectivity representing lowest supported reflectivity, being the minimum number of pixels returned by said markings to be detected, using different integration times/exposure times for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit (110);
- determining (S2) reliable information from said markings detection;
- creating (S3) different masks based on the thus determined reliable information for said different 3D detection unit positions.
2. A method according to claim 1, wherein the step of detecting markings with known minimum reflectivity comprises detecting markings with different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types, using different integration times/exposure times; wherein reliable information from said markings detection is determined; and wherein different masks based on the thus determined reliable information for different markings corresponding to different object types are created.
3. A method according to claim 1 or 2, wherein the step of detecting markings with known minimum reflectivity is performed for different illumination powers of the 3D detection unit (110).
4. A method according to any preceding claim, wherein the step of detecting markings with known minimum reflectivity is performed for different illumination patterns of the 3D detection unit (110).
5. A method according to claim 1, wherein the step of creating a mask comprises excluding information from 3D unit detection pixels determined not to be reliable.
6. A method according to any preceding claim, comprising the steps of: detecting markings with known maximum reflectivity representing highest expected reflectivity, being the maximum number of pixels returned by said markings, using different integration times/exposure times; determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection.
7. A method according to any preceding claim, wherein the step of detecting markings with known maximum reflectivity comprises the step of detecting markings with different known maximum reflectivity representing highest expected reflectivity corresponding to different object types, using different integration times/exposure times; wherein saturation characteristics comprising a saturation level from said markings detection are determined so as to determine reliable detections for said different object types.
8. A method according to any preceding claim, wherein the step of detecting markings with known maximum reflectivity is performed for different illumination powers of the 3D detection unit (110).
9. A method according to any preceding claim, wherein the step of detecting markings with known maximum reflectivity is performed for different illumination patterns of the 3D detection unit (110).
10. A method according to any preceding claim, comprising the step of storing thus determined masks.
11. A method according to claim 6 or 7, comprising the step of storing thus determined saturation characteristics.
12. A system (I) for improving reliability of image information from a 3D detection unit (110) for use on a vehicle (1, 1a, 1b, 1c), the 3D detection unit (110) being configured for illuminating the object and detecting from the object reflected light representing 3D information so as to determine a 3D image, comprising means for calibrating the 3D detection unit (110) for a certain 3D detection unit (110) position on the vehicle, characterized in that the means for calibrating the 3D detection unit (110) comprises means (112) for detecting markings (S10, S20, S30) comprising means (112a) for detecting markings with known minimum reflectivity representing lowest supported reflectivity, being the minimum number of pixels returned by said markings to be detected, using different integration times/exposure times for different 3D detection unit positions comprising height above said markings and direction of the 3D detection unit; means (120) for determining reliable information from said markings detection; and means (132) for creating different masks based on the thus determined reliable information for different 3D detection unit positions.
13. A system according to claim 12, wherein the means (112) for detecting markings comprises means (112b) for detecting markings (S10, S20, S30) with different known minimum reflectivity representing lowest supported reflectivity corresponding to different object types, using different integration times/exposure times; wherein means (120) for determining reliable information from said markings detection is provided; and wherein means (134) for creating different masks based on the thus determined reliable information for different markings corresponding to different object types is provided.
14. A system according to claim 12 or 13, wherein the means (112a, 112b) for detecting markings with known minimum reflectivity is arranged to perform the detection for different illumination powers of the 3D detection unit (110).
15. A system according to any of claims 12-14, wherein the means (112a, 112b) for detecting markings with known minimum reflectivity is arranged to perform the detection for different illumination patterns of the 3D detection unit (110).
16. A system according to any of claims 12-15, wherein the means (132, 134) for creating a mask comprises means (130a) for excluding information from 3D unit detection pixels determined not to be reliable.
17. A system according to any of claims 12-16, wherein the means (112) for detecting markings comprises means (112c) for detecting markings with known maximum reflectivity representing highest expected reflectivity, being the maximum number of pixels returned by said markings, using different integration times/exposure times; and means (140) for determining saturation characteristics comprising a saturation level from said markings detection so as to determine reliable detection.
18. A system according to any of claims 12-17, wherein the means (112) for detecting markings comprises means (112d) for detecting markings with different known maximum reflectivity representing highest expected reflectivity corresponding to different object types, using different integration times/exposure times; wherein means (140) for determining saturation characteristics comprising a saturation level from said markings detection is provided so as to determine reliable detections for said different object types.
19. A system according to any of claims 12-18, wherein the means (112c, 112d) for detecting markings with known maximum reflectivity is arranged to perform the detection for different illumination powers of the 3D detection unit (110).
20. A system according to any of claims 12-19, wherein the means (112c, 112d) for detecting markings with known maximum reflectivity is arranged to perform the detection for different illumination patterns of the 3D detection unit (110).
21. A system according to any of claims 12-20, comprising means (150) for storing the thus determined masks.
22. A system according to any of claims 17-20, comprising means (160) for storing the thus determined saturation characteristics.
23. A vehicle (1) comprising a system (I) according to any of claims 12-22.
24. A computer program (P) for improving reliability of image information from a 3D detection unit (110) for use on a vehicle, said computer program (P) comprising program code which, when run on an electronic control unit (100) or another computer (500) connected to the electronic control unit (100), causes the electronic control unit to perform the steps according to claims 1-11.
25. A computer program product comprising a digital storage medium storing the computer program according to claim 24.
SE1451427A 2014-11-26 2014-11-26 Method and system for improving quality of image information from a 3d detection unit for use on a vehicle SE538500C2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE1451427A SE538500C2 (en) 2014-11-26 2014-11-26 Method and system for improving quality of image information from a 3d detection unit for use on a vehicle
DE102015014320.4A DE102015014320A1 (en) 2014-11-26 2015-11-05 Method and system for improving the quality of image information of a 3D recognition unit for use in a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE1451427A SE538500C2 (en) 2014-11-26 2014-11-26 Method and system for improving quality of image information from a 3d detection unit for use on a vehicle

Publications (2)

Publication Number Publication Date
SE1451427A1 SE1451427A1 (en) 2016-05-27
SE538500C2 true SE538500C2 (en) 2016-08-16

Family

ID=55967959

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1451427A SE538500C2 (en) 2014-11-26 2014-11-26 Method and system for improving quality of image information from a 3d detection unit for use on a vehicle

Country Status (2)

Country Link
DE (1) DE102015014320A1 (en)
SE (1) SE538500C2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017126378A1 (en) * 2017-11-10 2019-05-16 Infineon Technologies Ag Method for processing a raw image of a time-of-flight camera, image processing device and computer program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2475104A (en) 2009-11-09 2011-05-11 Alpha Vision Design Res Ltd Detecting movement of 3D objects using a TOF camera

Also Published As

Publication number Publication date
SE1451427A1 (en) 2016-05-27
DE102015014320A1 (en) 2016-06-02

Similar Documents

Publication Publication Date Title
US10276049B2 (en) Camera based trailer identification and blind zone adjustment
US9840253B1 (en) Lane keeping system for autonomous vehicle during camera drop-outs
US20180308282A1 (en) Shape measuring apparatus and method
JP6458651B2 (en) Road marking detection device and road marking detection method
US10643091B2 (en) Automatic feature point detection for calibration of multi-camera systems
US11487009B2 (en) Image capture control device, image capture control method, and recording medium
US9506859B2 (en) Method and device for determining a visual range in daytime fog
US20160162743A1 (en) Vehicle vision system with situational fusion of sensor data
US9886773B2 (en) Object detection apparatus and object detection method
KR20160137247A (en) Apparatus and method for providing guidance information using crosswalk recognition result
US11256932B2 (en) Falling object detection apparatus, in-vehicle system, vehicle, and computer readable medium
US10453214B2 (en) Image capturing device and method, program, and record medium to perform exposure control based on the brightness in an attention area corresponding to a detected object
JP6237874B2 (en) Self-position calculation device and self-position calculation method
KR102707085B1 (en) Camera-calibration system and method thereof
US11373324B2 (en) Depth acquisition device and depth acquisition method for providing a corrected depth image
WO2015125299A1 (en) Local location computation device and local location computation method
KR102297683B1 (en) Method and apparatus for calibrating a plurality of cameras
WO2018153915A1 (en) Determining an angular position of a trailer without target
KR20210005605A (en) Online evaluation of camera specific parameters
JP2011150573A (en) Drive recorder
SE538500C2 (en) Method and system for improving quality of image information from a 3d detection unit for use on a vehicle
KR101276073B1 (en) System and method for detecting distance between forward vehicle using image in navigation for vehicle
WO2021059967A1 (en) Object recognition device and object recognition program
US20220196841A1 (en) Object recognition abnormality detection apparatus, object recognition abnormality detection program product, and object recognition abnormality detection method
KR102371592B1 (en) Apparatus and method for estimating inter-vehicle distance