SE538595C2 - Method and system for improving object detection performanceof a 3d detection unit - Google Patents


Info

Publication number
SE538595C2
SE538595C2
Authority
SE
Sweden
Prior art keywords
image
information
detection unit
unit
correlating
Prior art date
Application number
SE1451426A
Other languages
Swedish (sv)
Other versions
SE1451426A1 (en)
Inventor
Salmén Mikael
Original Assignee
Scania Cv Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to SE1451426A priority Critical patent/SE538595C2/en
Priority to DE102015014199.6A priority patent/DE102015014199B4/en
Publication of SE1451426A1 publication Critical patent/SE1451426A1/en
Publication of SE538595C2 publication Critical patent/SE538595C2/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/77 Retouching; Inpainting; Scratch removal
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/64 Three-dimensional objects
    • G06V20/647 Three-dimensional objects by matching two-dimensional images to three-dimensional objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a method for improving object detection performance of a 3D detection unit, the 3D detection unit being configured for illuminating the object and detecting from the object reflected light representing 3D information so as to determine a 3D image. The method comprises the steps of: by means of the 3D detection unit detecting (S1) a 3D image of the object; and detecting (S2) a 2D image of said object. The method further comprises the steps of: correlating (S3) the 2D image and the 3D image so as to determine possible lack of information in the 3D image; and extrapolating (S4) the 3D image using information from the 2D image in case of information lacking in the 3D image. The present invention also relates to a system for improving object detection performance of a 3D detection unit. The present invention also relates to a vehicle. The present invention also relates to a computer program and a computer program product. (Fig. 5)

Description

METHOD AND SYSTEM FOR IMPROVING OBJECT DETECTION PERFORMANCE OF A 3D DETECTION UNIT

TECHNICAL FIELD

The invention relates to a method for improving object detection performance of a 3D detection unit according to the preamble of claim 1. The invention also relates to a system for improving object detection performance of a 3D detection unit. The invention also relates to a vehicle. The invention in addition relates to a computer program and a computer program product.
BACKGROUND ART

In 3D image processing, data quality is very dependent on having low noise variation in the measurements. For active 3D cameras the noise variation will be a function of the ego-illumination and surrounding noise illumination such as sunlight.
Segmentation in 2D image processing is common and performed as a step in the object detection process. Similar segmentation is done in 3D image processing as well.
US2010322477 discloses a method for increasing the spatial resolution of a 3D image by using a 2D image with higher spatial resolution than the original 3D image. US2010322477 also describes methods for more accurately interpolating the new 3D image voxels.
OBJECTS OF THE INVENTION

An object of the present invention is to provide a method for improving object detection performance of a 3D detection unit which facilitates a more robust and correct detection of objects.
Another object of the present invention is to provide a system for improving object detection performance of a 3D detection unit which facilitates a more robust and correct detection of objects.
SUMMARY OF THE INVENTION

These and other objects, apparent from the following description, are achieved by a method, a system, a vehicle, a computer program and a computer program product, as set out in the appended independent claims. Preferred embodiments of the method and the system are defined in the appended dependent claims.
Specifically, an object of the invention is achieved by a method for improving object detection performance of a 3D detection unit, the 3D detection unit being configured for illuminating the object and detecting from the object reflected light representing 3D information so as to determine a 3D image. The method comprises the steps of: by means of the 3D detection unit detecting a 3D image of the object; and detecting a 2D image of said object. The method further comprises the steps of: correlating the 2D image and the 3D image so as to determine possible lack of information in the 3D image; and extrapolating the 3D image using information from the 2D image in case of information lacking in the 3D image. Hereby a more robust and correct detection of objects by means of a 3D detection unit is facilitated, and thus a more correct and robust 3D image may be obtained.
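For illustration only, the four steps can be sketched in a few lines of Python. This is not the patented implementation: the detector callables, the NaN coding of missing depth and the threshold-based 2D segmentation are all assumptions made for the example.

```python
import numpy as np

def improve_3d_detection(detect_3d, detect_2d):
    """Sketch of the four steps: detect 3D, detect 2D, correlate, extrapolate.

    detect_3d() -> (H, W) depth map, NaN where the 3D unit got no usable return
    detect_2d() -> (H, W) brightness image, assumed pixel-aligned with the depth map
    """
    depth = detect_3d()                          # S1: 3D image (point cloud as depth map)
    brightness = detect_2d()                     # S2: 2D image of the same object

    # S3: correlate the aligned images; pixels where the 2D image shows the
    # object but the 3D image has no data mark a possible lack of information.
    object_2d = brightness > brightness.mean()   # crude illustrative 2D segmentation
    missing = object_2d & np.isnan(depth)

    # S4: extrapolate the 3D image using the 2D information; here the holes
    # are simply filled with the median of the valid depths in the segment.
    valid = object_2d & ~np.isnan(depth)
    if valid.any():
        depth = depth.copy()
        depth[missing] = np.median(depth[valid])
    return depth
```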
According to an embodiment the method comprises the step of segmenting the 2D and 3D images, wherein the step of correlating the 2D image and the 3D image comprises correlating the segmented 2D and 3D images as a basis when extrapolating the 3D image. By thus segmenting the 2D and 3D images and correlating the segmented portions, an even more robust and correct detection of objects by means of a 3D detection unit is facilitated, in that segments and not the entire object are correlated.
According to an embodiment the method comprises the step of transforming the 3D image to a 2D image prior to correlating the 2D and 3D image. Hereby the correlation of the 2D and 3D images/segmented 2D and 3D images is facilitated.
According to an embodiment the method comprises the step of filtering the 3D image prior to correlating the 2D and 3D image. By thus filtering the 3D image, the quality of the 3D image is improved, thus improving the correlation with the 2D image. By removing the noisiest measurements in the 3D image the segmentation will be more accurate, hence making it more likely to find correlation with the 2D image.
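As a rough sketch of such pre-filtering, the noisiest measurements can be masked out before segmentation; the amplitude-threshold heuristic and the threshold value below are assumptions made for the example, not taken from the patent.

```python
import numpy as np

def filter_depth(depth, amplitude, min_amplitude=0.1):
    """Discard depth pixels whose return amplitude is too weak.

    In an active 3D camera, low-amplitude returns tend to be noise-dominated,
    so masking them keeps only "safe" information for the later correlation.
    """
    filtered = depth.copy()
    filtered[amplitude < min_amplitude] = np.nan  # treat weak returns as missing
    return filtered
```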
According to an embodiment of the method, the step of extrapolating information from the 2D image comprises including information extending to the area where information is lacking in the 3D image. Hereby a more robust and correct 3D image is obtained.
According to an embodiment the method comprises the step of performing a range estimation based on said 2D image so as to obtain reconstructed information for the 3D image. Hereby a more correct 3D image is obtained, e.g. for a 3D image where there is no corresponding 3D segmentation for a 2D segmented object. If 3D data is present in one part of the 2D segment, extrapolation of the data, for instance if the surface is considered flat within the 2D segment, can be made using present data and information from the 2D image; hence the range estimation "fills" the missing 3D data for the entire 2D segment.
According to an embodiment the method comprises the step of adapting the image exposure settings and/or the brightness of the 3D detection unit so as to increase the quality of the information in the 3D image. By thus adapting the image exposure settings and/or the brightness of the 3D detection unit, improved data will be obtained in non-detected areas of the 2D image/2D segmented image, wherein a following detection of the 3D detection unit will capture more data and thus an improved 3D image will be obtained. The adaptation of the image exposure settings and/or the brightness of the 3D detection unit will hereby only be done where deemed necessary.
According to an embodiment of the method, the 3D detection unit comprises a time-of-flight camera unit. A time-of-flight camera unit is a suitable camera unit for such object detection.
According to an embodiment of the method, the 3D detection unit is configured to provide said 2D image. Hereby only one detection unit is needed.
Specifically, an object of the invention is achieved by a system for improving object detection performance of a 3D detection unit, the 3D detection unit being configured for illuminating the object and detecting from the object reflected light representing 3D information so as to determine a 3D image, the system being adapted to perform the methods as set out above.
The system according to the invention has the advantages according to the corresponding method claims.
BRIEF DESCRIPTION OF THE DRAWINGS

For a better understanding of the present invention, reference is made to the following detailed description when read in conjunction with the accompanying drawings, wherein like reference characters refer to like parts throughout the several views, and in which:

Fig. 1 schematically illustrates a side view of a vehicle according to the present invention;

Fig. 2 schematically illustrates a system for improving object detection performance of a 3D detection unit according to an embodiment of the present invention;

Fig. 3 schematically illustrates a block diagram of a method for improving object detection performance of a 3D detection unit according to an embodiment of the present invention;

Fig. 4a schematically illustrates a view of a 3D image of an object in the form of a table taken by means of a 3D detection unit;

Fig. 4b schematically illustrates the table in fig. 4a where a 3D segmented part of the table is marked;

Fig. 4c schematically illustrates the table in fig. 4a where a 2D segmented part of the table is marked;

Fig. 4d schematically illustrates the table in fig. 4a where the 3D image has been reconstructed using information from the 2D segmented part;

Fig. 5 schematically illustrates a block diagram of a method for improving object detection performance of a 3D detection unit according to embodiments of the present invention; and

Fig. 6 schematically illustrates a computer according to an embodiment of the present invention.
DETAILED DESCRIPTION

Hereinafter the term "link" refers to a communication link, which may be a physical connector, such as an optoelectronic communication wire, or a non-physical connector such as a wireless connection, for example a radio or microwave link.
Hereinafter the term "2D" refers to two-dimensional. Hereinafter the term "3D" refers to three-dimensional.
Hereinafter the term "range estimation" refers to reconstruction of the depth value for a 3D signal where the 2D image shows the presence of an object. For instance, when a 3D image area has been found where no data exists but the 2D image segment shows that an object exists, then the 3D data needs to be reconstructed. The term "range estimation" further refers to using values of noisy 3D signals present in the area, which were removed/excluded through filtration prior to segmentation and correlation, to estimate/reconstruct new values for the area. If 3D data is present in one part of the 2D segment, extrapolation of the data, for instance if the surface is considered flat within the 2D segment, can be made using present data and information from the 2D image; hence the range estimation "fills" the missing 3D data for the entire 2D segment.
Hereinafter the term "3D image" refers to the collection of points in the 3D set of data, the points having x, y, z coordinates. The term "3D image" may also be referred to as the point cloud.
Fig. 1 schematically illustrates a vehicle 1 according to an embodiment of the present invention. The exemplified vehicle 1 is a heavy vehicle in the shape of a truck. The vehicle according to the present invention could be any suitable vehicle, such as a bus, a car, a train or the like. The vehicle comprises a system for improving object detection performance of a 3D detection unit according to the present invention. The vehicle comprises a 3D detection unit.
Fig. 2 schematically illustrates a system I for improving object detection performance of a 3D detection unit 110 according to an embodiment of the present invention.
The system I comprises an electronic control unit 100.
The 3D detection unit 110 is configured for illuminating an object O by means of light L1 and detecting from the object reflected light L2 representing 3D information so as to determine a 3D image. The 3D detection unit 110 is configured for detecting reflected light by means of the time-of-flight technique. The 3D detection unit 110 comprises active light L1 for illumination of the object O. The light L1 could be any suitable light such as infrared light, visible light, laser light or the like. The 3D detection unit 110 comprises according to an embodiment means for illumination of an object O by means of infrared light.
The 3D detection unit 110 comprises according to an embodiment a time-of-flight camera unit. The time-of-flight camera unit may be any kind of sensor using a signal transmitted through air and correlating it with the received signal in order to conclude a distance to the measured point(s). The time-of-flight (ToF) camera unit may be any kind of ToF camera based on for instance continuous-wave (CW, such as PMD, SwissRanger, etc.), pulsed-wave (such as TDC) or range-gating (Obzerv) principles. A ToF camera typically uses a LED that illuminates the whole scene at once. The 3D detection unit 110 comprises according to an embodiment a LIDAR, i.e. a laser scanner unit.
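For a continuous-wave ToF unit of the kind listed, the distance follows from the phase shift between the emitted and the received modulated light. The relation below is the standard CW-ToF equation, shown only as background; the 20 MHz figure is an example value, not a parameter from the patent.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def cw_tof_distance(phase_shift_rad, mod_freq_hz):
    """Distance from the phase shift of a continuous-wave ToF measurement.

    The light travels to the object and back (round trip), giving
    d = c * phi / (4 * pi * f_mod).
    """
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# A quarter-cycle phase shift at 20 MHz modulation corresponds to about 1.87 m;
# the unambiguous range at that frequency is c / (2 * f_mod) = 7.5 m.
print(cw_tof_distance(math.pi / 2, 20e6))
```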
The 3D detection unit 110 is arranged to detect a 3D image of an object O. The 3D detection unit 110 is arranged to provide a 3D image of the object O. The system I comprises according to an embodiment the 3D detection unit 110. The 3D detection unit is configured to create amplitude data. The 3D detection unit is configured to create depth/distance data. By means of said depth/distance data a point cloud is created, constituting said 3D image.
The system I comprises means 110, 112 for detecting a 2D image of said object O. The 3D detection unit 110 is according to an embodiment arranged to detect the 2D image of the object O. The 3D detection unit 110 is hereby configured to provide said 2D image of the object O. The 3D detection unit 110 is thus configured to extract E1 brightness data from said object O so as to provide said 2D image of the object O. A brightness image is hereby created by means of the 3D detection unit 110. The means 110, 112 for detecting a 2D image of said object O thus comprises the 3D detection unit 110.
The means 110, 112 for detecting a 2D image of said object O comprises according to an embodiment a separate 2D detection unit 112 configured to provide a 2D image of the object O. The 2D detection unit 112 is thus configured to extract E2 brightness data from said object O so as to provide said 2D image of the object O. A brightness image is hereby created by means of the detection unit 112. The detection unit 112 may be any camera unit configured for providing 2D images of an object.
The system I comprises means 120 for correlating the 2D image and the 3D image so as to determine possible lack of information in the 3D image. The means 120 for correlating the 2D image and the 3D image may be any suitable image correlation unit for correlating such images. The means 120 for correlating the 2D image and the 3D image is according to an embodiment comprised in the electronic control unit 100. The means 120 for correlating the 2D image and the 3D image is arranged to compare the 2D image and the 3D image by organizing them so that they overlap and are substantially aligned. The correlation may be performed in any suitable way by any suitable means comprising means for performing calculations. The correlation is according to an embodiment performed in the 3D detection unit 110.
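The patent leaves the correlation technique open, but one common way to bring two images into overlap so that they are substantially aligned is FFT-based cross-correlation. The sketch below estimates the translation between, say, the 2D brightness image and an amplitude image from the 3D unit; using it for that purpose is an assumption of the example, not a feature disclosed in the patent.

```python
import numpy as np

def align_offset(img_a, img_b):
    """Estimate the integer translation that best overlaps two equally sized
    images, via circular cross-correlation computed with FFTs.

    Returns (dy, dx) such that img_a is approximately
    np.roll(img_b, (dy, dx), axis=(0, 1)).
    """
    fa = np.fft.fft2(img_a - img_a.mean())
    fb = np.fft.fft2(img_b - img_b.mean())
    corr = np.fft.ifft2(fa * np.conj(fb)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = corr.shape
    # Shifts past the midpoint wrap around and represent negative offsets.
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx
```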
The system I comprises means 130 for extrapolating the 3D image using information from the 2D image in case of information lacking in the 3D image. The means 130 for extrapolating the 3D image using information from the 2D image may be any suitable unit for such extrapolation. The means 130 for extrapolating the 3D image using information from the 2D image is according to an embodiment comprised in the electronic control unit 100. The extrapolation may be performed in any suitable way by any suitable means comprising means for performing calculations. The extrapolation is according to an embodiment performed in the 3D detection unit 110.
Said means for extrapolating information from the 2D image comprises including information extending to the area where information is lacking in the 3D image. Said means for extrapolating information from the 2D image is thus arranged to include information extending to the area where information is lacking in the 3D image.
The system I comprises means 140 for transforming the 3D image to a 2D image prior to correlating the 2D and 3D image. The means 140 for transforming the 3D image to a 2D image comprises any suitable transformation unit for transforming a 3D image to a 2D image. The 3D point cloud, i.e. the 3D image, is typically projected onto a 2D plane. This is according to an embodiment performed by the 3D detection unit 110 but could also be executed externally.
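A minimal sketch of such a projection, assuming a pinhole camera model with known intrinsic parameters (fx, fy, cx, cy), e.g. from calibration of the 3D detection unit; the patent leaves the projection method open, so this is only one possible realization.

```python
import numpy as np

def project_to_2d(points, fx, fy, cx, cy, shape):
    """Project an (N, 3) point cloud onto a 2D depth image (pinhole model)."""
    pts = points[points[:, 2] > 0]              # keep points in front of the camera
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    u = np.round(fx * x / z + cx).astype(int)   # column index
    v = np.round(fy * y / z + cy).astype(int)   # row index
    depth = np.full(shape, np.nan)
    ok = (u >= 0) & (u < shape[1]) & (v >= 0) & (v < shape[0])
    # Where several points fall in one pixel, keep the nearest one by
    # writing the points in order of decreasing depth.
    for ui, vi, zi in sorted(zip(u[ok], v[ok], z[ok]), key=lambda t: -t[2]):
        depth[vi, ui] = zi
    return depth
```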
The system I comprises means 150 for segmenting the 2D and 3D images. The means 150 for segmenting the 2D and 3D images comprise means 152 for segmenting 2D images. The means 150 for segmenting the 2D and 3D images comprise means 154 for segmenting 3D images.
The means 120 for correlating the 2D image and the 3D image comprises means for correlating the segmented 2D and 3D images as a basis when extrapolating the 3D image. The means 120 is thus configured for correlating 2D images and 3D images as well as segmented 2D images and 3D images.
The system I comprises means 160 for filtering the 3D image prior to correlating the 2D and 3D image. The means 160 for filtering the 3D image may comprise any suitable filter for filtering. Preferably only safe information is included in order to obtain a relevant correlation with the 2D image. Hence 3D images comprising noise/unsafe information are filtered. By removing the noisiest measurements in the 3D image the segmentation will be more accurate, hence making it more likely to find correlation with the 2D image.
The system I comprises means 170 for performing a range estimation based on said 2D image so as to obtain reconstructed information for the 3D image. Range estimation in this context thus means reconstruction of the depth value for a 3D signal where a 2D image shows the presence of an object. For instance, when a 3D image area has been found where no data exists but a 2D image segment shows that an object exists, then the 3D data needs to be reconstructed. If neighbouring pixels exist then known art can be used to estimate 3D signals through interpolation, either linear or more advanced such as bilinear. If noisy 3D signals are present in the area, which according to an embodiment were removed/excluded through filtration prior to segmentation and correlation, then these values could be used to estimate/reconstruct new values for the area. This is referred to as range estimation in this case. If 3D data is present in one part of the 2D segment, extrapolation of the data, for instance if the surface is considered flat within a 2D segment, can be made using present data and information from the 2D image, the range estimation thus filling the missing 3D data for the entire 2D segment.
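A sketch of the flat-surface case described above: fit a plane to the valid depth samples inside the 2D segment and evaluate it where depth is missing. The least-squares plane model is one possible choice made for the example; the description equally allows linear or bilinear interpolation.

```python
import numpy as np

def range_estimate_flat(depth, segment_mask):
    """Reconstruct missing depth inside a 2D segment, assuming the surface
    is roughly flat within the segment.

    Fits a plane z = a*u + b*v + c to the valid depth samples and evaluates
    it at the pixels of the segment where depth is missing (NaN).
    """
    vs, us = np.nonzero(segment_mask)           # pixel coordinates of the segment
    z = depth[vs, us]
    valid = ~np.isnan(z)
    if valid.sum() < 3:
        return depth                            # too little support to fit a plane
    A = np.column_stack([us[valid], vs[valid], np.ones(valid.sum())])
    (a, b, c), *_ = np.linalg.lstsq(A, z[valid], rcond=None)
    out = depth.copy()
    hole = ~valid
    out[vs[hole], us[hole]] = a * us[hole] + b * vs[hole] + c
    return out
```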
The system I comprises means 180 for adapting the image exposure settings and/or the brightness of the 3D detection unit so as to increase the quality of the information in the 3D image.
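A minimal sketch of such an adaptation loop; the missing-data threshold, step factor and exposure cap are assumed values chosen for the example, not parameters disclosed in the patent.

```python
def adapt_exposure(current_exposure, missing_fraction, step=1.5, max_exposure=8.0):
    """Raise exposure (or illumination) when a large share of the 2D-segmented
    object lacks 3D data, so that the next 3D capture collects more signal.
    Adaptation is only applied where deemed necessary, i.e. above a threshold.
    """
    if missing_fraction > 0.2:
        return min(current_exposure * step, max_exposure)
    return current_exposure
```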
The electronic control unit 100 is operatively connected to the 3D detection unit 110 via a link 110a. The electronic control unit 100 is via the link 110a arranged to receive a signal from the 3D detection unit 110 representing 3D data, i.e. point cloud data, for a 3D image of the object O.
The electronic control unit 100 is operatively connected to the 3D detection unit 110 via a link 110b. The electronic control unit 100 is via the link 110b arranged to receive a signal from the 3D detection unit 110 representing 2D data, i.e. brightness data, for a 2D image of the object O.

The electronic control unit 100 is operatively connected to the 2D detection unit 112 via a link 112a. The electronic control unit 100 is via the link 112a arranged to receive a signal from the 2D detection unit 112 representing 2D data, i.e. brightness data, for a 2D image of the object O.
The electronic control unit 100 is operatively connected to the means 140 for transforming the 3D image to a 2D image prior to correlating the 2D and 3D image via a link 140a. The electronic control unit 100 is via the link 140a arranged to send a signal to the means 140 representing 3D data representing the 3D image of said object, wherein the means 140 is arranged to transform the 3D image to a 2D image.
The electronic control unit 100 is operatively connected to the means 140 for transforming the 3D image to a 2D image prior to correlating the 2D and 3D image via a link 140b. The electronic control unit 100 is via the link 140b arranged to receive a signal from the means 140 representing data for the 2D-transformed 3D image.
The electronic control unit 100 is operatively connected to the means 152 for segmenting 2D images via a link 152a. The electronic control unit 100 is via the link 152a arranged to send a signal to the means 152 representing 2D data representing the 2D image of said object, wherein the means 152 is arranged to segment the 2D image so as to obtain a segmented 2D image of a segment of the object.
The electronic control unit 100 is operatively connected to the means 152 for segmenting 2D images via a link 152b. The electronic control unit 100 is via the link 152b arranged to receive a signal from the means 152 representing data for a segmented 2D image of the object.
The electronic control unit 100 is operatively connected to the means 154 for segmenting 3D images via a link 154a. The electronic control unit 100 is via the link 154a arranged to send a signal to the means 154 representing 3D data representing the 3D image of said object, wherein the means 154 is arranged to segment the 3D image so as to obtain a segmented 3D image of a segment of the object.
The electronic control unit 100 is operatively connected to the means 154 for segmenting 3D images via a link 154b. The electronic control unit 100 is via the link 154b arranged to receive a signal from the means 154 representing data for a segmented 3D image of the object.
The electronic control unit 100 is operatively connected to the means 120 for correlating the 2D image and the 3D image so as to determine possible lack of information in the 3D image via a link 120a. The electronic control unit 100 is via the link 120a arranged to send a signal to the means 120 representing 2D data and 3D data representing the 2D image and the 3D image of said object.
The electronic control unit 100 is operatively connected to the means 120 for correlating the 2D image and the 3D image so as to determine possible lack of information in the 3D image via a link 120b. The electronic control unit 100 is via the link 120b arranged to receive a signal from the means 120 representing correlation data for the correlated 2D image and 3D image comprising data of possible lack of information/data in the 3D image.
The electronic control unit 100 is operatively connected to the means 120 for correlating the segmented 2D image and the segmented 3D image so as to determine possible lack of information in the segmented 3D image via the link 120a. The electronic control unit 100 is via the link 120a arranged to send a signal to the means 120 representing 2D data and 3D data representing the segmented 2D image and the segmented 3D image of said object.
The electronic control unit 100 is operatively connected to the means 120 for correlating the segmented 2D image and the segmented 3D image so as to determine possible lack of information in the segmented 3D image via the link 120b. The electronic control unit 100 is via the link 120b arranged to receive a signal from the means 120 representing correlation data for the correlated segmented 2D image and segmented 3D image comprising data of possible lack of information/data in the segmented 3D image.
The electronic control unit 100 is operatively connected to the means 130 for extrapolating the 3D image using information from the 2D image in case of information lacking in the 3D image via a link 130a. The electronic control unit 100 is via the link 130a arranged to send a signal to the means 130 representing extrapolation data for extrapolating the 3D image using information from the 2D image, including where applicable information extending to the area where information is lacking in the 3D image.
The electronic control unit 100 is operatively connected to the means 130 for extrapolating the 3D image using information from the 2D image in case of information lacking in the 3D image via a link 130b. The electronic control unit 100 is via the link 130b arranged to receive a signal from the means 130 representing data for the extrapolated 3D image.
The electronic control unit 100 is operatively connected to the means 130 for extrapolating the segmented 3D image using information from the segmented 2D image in case of information lacking in the segmented 3D image via the link 130a. The electronic control unit 100 is via the link 130a arranged to send a signal to the means 130 representing extrapolation data for extrapolating the segmented 3D image using information from the segmented 2D image, including where applicable information extending to the area where information is lacking in the segmented 3D image.
The electronic control unit 100 is operatively connected to the means 130 for extrapolating the segmented 3D image using information from the segmented 2D image in case of information lacking in the segmented 3D image via the link 130b. The electronic control unit 100 is via the link 130b arranged to receive a signal from the means 130 representing data for the extrapolated segmented 3D image.

The electronic control unit 100 is operatively connected to the means 160 for filtering the 3D image prior to correlating the 2D and 3D image via a link 160a. The electronic control unit 100 is via the link 160a arranged to send a signal to the means 160 representing 3D data representing the 3D image of said object, wherein the means 160 is arranged to filter the 3D image.
The electronic control unit 100 is operatively connected to the means 160 for filtering the 3D image prior to correlating the 2D and 3D image via a link 160b. The electronic control unit 100 is via the link 160b arranged to receive a signal from the means 160 representing filtered 3D data representing the filtered 3D image of said object.
The electronic control unit 100 is operatively connected to the means 170 for performing a range estimation based on said 2D image so as to obtain reconstructed information for the 3D image via a link 170a. The electronic control unit 100 is via the link 170a arranged to send a signal to the means 170 representing 3D data for performing a range estimation based on said 2D image, wherein the means 170 is configured to perform the range estimation.
The electronic control unit 100 is operatively connected to the means 170 for performing a range estimation based on said 2D image so as to obtain reconstructed information for the 3D image via a link 170b. The electronic control unit 100 is via the link 170b arranged to receive a signal from the means 170 representing range estimation data containing reconstructed information.
The electronic control unit 100 is operatively connected to the means 180 for adapting the image exposure settings and/or the brightness of the 3D detection unit so as to increase the quality of the information in the 3D image via a link 180a. The electronic control unit 100 is via the link 180a arranged to send a signal to the means 180 representing 3D data for the 3D image/segmented 3D image.
The electronic control unit 100 is operatively connected to the means 180 for adapting the image exposure settings and/or the brightness of the 3D detection unit so as to increase the quality of the information in the 3D image via a link 180b. The electronic control unit 100 is via the link 180b arranged to receive a signal from the means 180 representing 3D adaption data for adapting the image exposure settings and/or the brightness of the 3D detection unit so as to increase the quality of the information in the 3D image.
The electronic control unit 100 is operatively connected to the 3D detection unit 110 via a link 110c. The electronic control unit 100 is via the link 110c arranged to send a signal to the 3D detection unit 110 representing adaption data for adapting the image exposure settings and/or the brightness of the 3D detection unit so as to increase the quality of the information in the 3D image.
Fig. 3 schematically illustrates a block diagram of a method for improving object detection performance of a 3D detection unit, the 3D detection unit being configured for illuminating the object and detecting from the object reflected light representing 3D information so as to determine a 3D image.
According to the embodiment the method for improving object detection performance of a 3D detection unit comprises a step S1. In this step a 3D image of the object is detected by means of the 3D detection unit.
According to the embodiment the method for improving object detection performance of a 3D detection unit comprises a step S2. In this step a 2D image of the object is detected.
According to the embodiment the method for improving object detection performance of a 3D detection unit comprises a step S3. In this step the 2D image and the 3D image are correlated so as to determine possible lack of information in the 3D image.

According to the embodiment the method for improving object detection performance of a 3D detection unit comprises a step S4. In this step the 3D image is extrapolated using information from the 2D image in case of information lacking in the 3D image.
According to an embodiment not illustrated in fig. 3, the method for improving object detection performance of a 3D detection unit may comprise the steps of segmenting the 2D and 3D images.
According to an embodiment not illustrated in fig. 3, the method for improving object detection performance of a 3D detection unit may comprise the step of transforming the 3D image to a 2D image prior to correlating the 2D and 3D image.
According to an embodiment not illustrated in fig. 3, the method for improving object detection performance of a 3D detection unit may comprise the step of filtering the 3D image prior to correlating the 2D and 3D image.
According to an embodiment not illustrated in fig. 3, the method for improving object detection performance of a 3D detection unit may comprise the step of performing a range estimation based on said 2D image so as to obtain reconstructed information for the 3D image.
According to an embodiment not illustrated in fig. 3, the method for improving object detection performance of a 3D detection unit may comprise the step of adapting the image exposure settings and/or the brightness of the 3D detection unit so as to increase the quality of the information in the 3D image.
Fig. 4a schematically illustrates a view of a 3D image 50 of an object in the form of a table taken by means of a 3D detection unit, fig. 4b the table in fig. 4a where a 3D segmented part of the table is marked, fig. 4c the table in fig. 4a where a 2D segmented part of the table is marked, and fig. 4d the table in fig. 4a where the 3D image has been reconstructed using information from the 2D segmented part.

The 3D image of the object in the form of a table in fig. 4a has portions 50a with good quality 3D information/3D data and portions 50b with poor quality 3D information/3D data which more correspond to noise. The signal-to-noise ratio (SNR) probably differs a lot for the points considered to be of worse quality. The comparison with the 2D image is interesting to do as it gives a better guidance to where in the 3D space there are data comprising extractable information and what could be considered to be mostly noise.
The 3D image of the object, i.e. the table, is according to an embodiment filtered.

In fig. 4b the 3D image is segmented such that a segment 52 of the 3D image with information of good quality is identified. The segmented 3D image is transformed into a corresponding segmented 2D image. According to a variant the 3D image is transformed into a 2D image prior to the segmentation.
A 2D image of the object O, i.e. the table, is also detected. In fig. 4c the 2D image is segmented such that a segment 54 of the 2D image of brightness data, including the segmented part of the 3D image and additional data from the area in which there was data with poor quality/noise in the 3D image, is extracted.

In fig. 4d the segmented 3D image has been reconstructed using information from the 2D segmented part 54. 2D and 3D segmentation data are here merged such that the lacking portion in the segmented 3D image 52 has been reconstructed using the information in the segmented 2D image 54. The segmented 3D image has thus been extrapolated to an extrapolated segmented 3D image 55 using information from the segmented 2D image 54, including information extending to the area 50b where information is lacking in the segmented 3D image. Hereby a more robust and correct 3D image of the object O/table is obtained.

Fig. 5 schematically illustrates a block diagram of a method for improving object detection performance of a 3D detection unit 110 according to embodiments of the present invention. Here different embodiments, i.e. different options, of the method for improving object detection performance of a 3D detection unit 110 are illustrated.
A 3D image of the object is detected by means of the 3D detection unit 110, and a 2D image of the object O is detected by means of the 3D detection unit 110 and/or a separate 2D detection unit 112.
Hereby data of a 3D image of the detected object O, i.e. point cloud data representing information of the 3D image, is obtained. This data of the 3D image is denoted "3D IMAGE" in fig. 5. Further, data of a 2D image of the detected object, i.e. brightness data representing information of the 2D image, is obtained. This data of the 2D image is denoted "2D IMAGE" in fig. 5.
According to an embodiment representing a first variant, the 3D IMAGE is filtered in 160 (FILTER 3D IMAGE). Then the filtered 3D image is transformed into a 2D image, i.e. projected as a 2D image on a plane corresponding to the observation of the existing 2D image, in 140 (TRANSFORM 3D TO 2D). Then the 2D-transformed 3D image is segmented in 154 (SEGMENT) to a segmented 2D image. The 2D image is segmented in 152 (SEGMENT) to a segmented 2D image. The segmented 2D-3D image is correlated with the segmented 2D image in 120 (CORRELATE 2D & 3D) so as to determine possible lack of information in the 3D image.

According to an embodiment representing a second variant, the 3D image is transformed into a 2D image, i.e. projected as a 2D image, in 140 (TRANSFORM 3D TO 2D). Then the 2D-transformed 3D image is segmented in 154 (SEGMENT) to a segmented 2D of 3D image. The 2D image is segmented in 152 (SEGMENT) to a segmented 2D image. The segmented 3D image is correlated with the segmented 2D image in 120 (CORRELATE 2D & 3D) so as to determine possible lack of information in the 3D image.
According to an embodiment representing a third variant, the 3D image is segmented in 154 (SEGMENT) to a segmented 3D image. Then the segmented 3D image is transformed into a 2D image, i.e. projected as a 2D image, in 140 (TRANSFORM 3D TO 2D). The 2D image is segmented in 152 (SEGMENT) to a segmented 2D image. The segmented 3D image is correlated with the segmented 2D image in 120 (CORRELATE 2D & 3D) so as to determine possible lack of information in the 3D image.
According to an embodiment representing a fourth variant, the 3D image is transformed into a 2D image, i.e. projected as a 2D image, in 140 (TRANSFORM 3D TO 2D). The thus transformed 3D image is correlated with the 2D image in 120 (CORRELATE 2D & 3D) so as to determine possible lack of information in the 3D image.
For the first, second, third and fourth variants as mentioned above, after correlation in 120 (CORRELATE 2D & 3D) the 3D image/segmented 3D image is extrapolated in 130 (EXTRAPOLATE 3D BASED ON 2D) using information from the 2D image/segmented 2D image and the filtered weaker/noisier 3D signal.

A new 3D image is then obtained from the originally detected 3D image based on said extrapolation.
The object is then identified.
With reference to figure 6, a diagram of an apparatus 500 is shown. The control unit 100 described with reference to fig. 2 may according to an embodiment comprise apparatus 500. Apparatus 500 comprises a non-volatile memory 520, a data processing device 510 and a read/write memory 550. Non-volatile memory 520 has a first memory portion 530 wherein a computer program, such as an operating system, is stored for controlling the function of apparatus 500. Further, apparatus 500 comprises a bus controller, a serial communication port, I/O means, an A/D converter, a time date entry and transmission unit, an event counter and an interrupt controller (not shown). Non-volatile memory 520 also has a second memory portion 540.
A computer program P is provided comprising routines for improving object detection performance of a 3D detection unit, the 3D detection unit being configured for illuminating the object and detecting from the object reflected light representing 3D information so as to determine a 3D image. The program P comprises routines for detecting a 3D image of the object by means of the 3D detection unit. The program P comprises routines for detecting a 2D image of said object. The program P comprises routines for correlating the 2D image and the 3D image so as to determine possible lack of information in the 3D image. The program P comprises routines for extrapolating the 3D image using information from the 2D image in case of information lacking in the 3D image. The program P comprises according to an embodiment routines for segmenting the 2D and 3D images. The program P comprises according to an embodiment routines for transforming the 3D image to a 2D image prior to correlating the 2D and 3D image. The program P comprises according to an embodiment routines for filtering the 3D image prior to correlating the 2D and 3D image. The program P comprises according to an embodiment routines for performing a range estimation based on said 2D image so as to obtain reconstructed information for the 3D image. The program P comprises according to an embodiment routines for adapting the image exposure settings and/or the brightness of the 3D detection unit so as to increase the quality of the information in the 3D image. The computer program P may be stored in an executable manner or in a compressed condition in a separate memory 560 and/or in read/write memory 550.
When it is stated that data processing device 510 performs a certain function, it should be understood that data processing device 510 performs a certain part of the program which is stored in separate memory 560, or a certain part of the program which is stored in read/write memory 550.
Data processing device 510 may communicate with a data communications port 599 by means of a data bus 515. Non-volatile memory 520 is adapted for communication with data processing device 510 via a data bus 512. Separate memory 560 is adapted for communication with data processing device 510 via a data bus 511. Read/write memory 550 is adapted for communication with data processing device 510 via a data bus 514. To the data communications port 599, e.g. the links connected to the control unit 100 may be connected.
When data is received on data port 599, it is temporarily stored in second memory portion 540. When the received input data has been temporarily stored, data processing device 510 is set up to perform execution of code in a manner described above. The signals received on data port 599 can be used by apparatus 500 for detecting a 3D image of the object by means of the 3D detection unit. The signals received on data port 599 can be used by apparatus 500 for detecting a 2D image of said object. The signals received on data port 599 can be used by apparatus 500 for correlating the 2D image and the 3D image so as to determine possible lack of information in the 3D image. The signals received on data port 599 can be used by apparatus 500 for extrapolating the 3D image using information from the 2D image in case of information lacking in the 3D image. The signals received on data port 599 can according to an embodiment be used by apparatus 500 for segmenting the 2D and 3D images. The signals received on data port 599 can according to an embodiment be used by apparatus 500 for transforming the 3D image to a 2D image prior to correlating the 2D and 3D image. The signals received on data port 599 can according to an embodiment be used by apparatus 500 for filtering the 3D image prior to correlating the 2D and 3D image. The signals received on data port 599 can according to an embodiment be used by apparatus 500 for performing a range estimation based on said 2D image so as to obtain reconstructed information for the 3D image. The signals received on data port 599 can according to an embodiment be used by apparatus 500 for adapting the image exposure settings and/or the brightness of the 3D detection unit so as to increase the quality of the information in the 3D image.
Parts of the methods described herein can be performed by apparatus 500 by means of data processing device 510 running the program stored in separate memory 560 or read/write memory 550. When apparatus 500 runs the program, parts of the methods described herein are executed.
The foregoing description of the preferred embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated.

Claims (19)

1. A method for improving object detection performance of a 3D detection unit, the 3D detection unit (110) being configured for illuminating the object and detecting from the object reflected light representing 3D information so as to determine a 3D image, comprising the steps of: by means of the 3D detection unit (110) detecting (S1) a 3D image of the object; and detecting (S2) a 2D image of said object, characterized by the steps of:
- correlating (S3) the 2D image and the 3D image, by comparing the 2D image and the 3D image by organizing the 2D image and the 3D image, arranging them so that they overlap and are substantially aligned, so as to determine possible lack of information in the 3D image; and
- extrapolating (S4) the 3D image using information from the 2D image in case of information lacking in the 3D image, comprising the step of performing a range estimation, by reconstruction of the depth value for a 3D signal where a 2D image shows the presence of an object, based on said 2D image so as to obtain reconstructed information for the 3D image.
2. A method according to claim 1, comprising the step of segmenting the 2D and 3D images, wherein the step of correlating the 2D image and the 3D image comprises correlating the segmented 2D and 3D images as a basis when extrapolating the 3D image.

3. A method according to claim 1 or 2, comprising the step of transforming the 3D image to a 2D image prior to correlating the 2D and 3D image.

4. A method according to any preceding claim, comprising the step of filtering the 3D image prior to correlating the 2D and 3D image.

5. A method according to any preceding claim, wherein the step of extrapolating information from the 2D image comprises including information extending to the area where information is lacking in the 3D image.

6. A method according to any preceding claim, comprising the step of adapting the image exposure settings and/or the brightness of the 3D detection unit (110) so as to increase the quality of the information in the 3D image.
7. A method according to any preceding claim, wherein the 3D detection unit (110) comprises a time-of-flight camera unit.

8. A method according to any preceding claim, wherein the 3D detection unit (110) is configured to provide said 2D image.

9. A system (I) for improving object detection performance of a 3D detection unit, the 3D detection unit (110) being configured for illuminating (L1) the object (O) and detecting from the object (O) reflected light (L2) representing 3D information so as to determine a 3D image, comprising detecting a 3D image of the object by means of the 3D detection unit (110); and means (110, 112) for detecting a 2D image of said object, characterized by means (120) for correlating the 2D image and the 3D image, by comparing the 2D image and the 3D image by organizing the 2D image and the 3D image, arranging them so that they overlap and are substantially aligned, so as to determine possible lack of information in the 3D image; and means (130) for extrapolating the 3D image using information from the 2D image in case of information lacking in the 3D image, comprising means (170) for performing a range estimation, by reconstruction of the depth value for a 3D signal where a 2D image shows the presence of an object, based on said 2D image so as to obtain reconstructed information for the 3D image.

10. A system according to claim 9, comprising means (150, 152, 154) for segmenting the 2D and 3D images, wherein said means (120) for correlating the 2D image and the 3D image comprises means (120) for correlating the segmented 2D and 3D images as a basis when extrapolating the 3D image.

11. A system according to claim 9 or 10, comprising means (140) for transforming the 3D image to a 2D image prior to correlating the 2D and 3D image.

12. A system according to any of claims 9-11, comprising means (160) for filtering the 3D image prior to correlating the 2D and 3D image.

13. A system according to any of claims 9-12, wherein said means (130) for extrapolating information from the 2D image comprises including information extending to the area where information is lacking in the 3D image.

14. A system according to any of claims 9-13, comprising means (180) for adapting the image exposure settings and/or the brightness of the 3D detection unit (110) so as to increase the quality of the information in the 3D image.

15. A system according to any of claims 9-14, wherein the 3D detection unit (110) comprises a time-of-flight camera unit.

16. A system according to any of claims 9-15, wherein the 3D detection unit (110) is configured to provide said 2D image.

17. A vehicle (1) comprising a system (I) according to any of claims 9-16.
18. A computer program (P) for improving object detection performance of a 3D detection unit, said computer program (P) comprising program code which, when run on an electronic control unit (100) or another computer (500) connected to the electronic control unit (100), causes the electronic control unit to perform the steps according to any of claims 1-8.
19. A computer program product comprising a digital storage medium storing the computer program according to claim 18.
SE1451426A 2014-11-26 2014-11-26 Method and system for improving object detection performance of a 3D detection unit SE538595C2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE1451426A 2014-11-26 2014-11-26 Method and system for improving object detection performance of a 3D detection unit SE538595C2 (en)
DE102015014199.6A DE102015014199B4 (en) 2014-11-26 2015-11-04 Method and system for improving the object recognition performance of a 3D recognition unit

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE1451426A 2014-11-26 2014-11-26 Method and system for improving object detection performance of a 3D detection unit SE538595C2 (en)

Publications (2)

Publication Number Publication Date
SE1451426A1 SE1451426A1 (en) 2016-05-27
SE538595C2 true SE538595C2 (en) 2016-09-27

Family

ID=55967883

Family Applications (1)

Application Number Title Priority Date Filing Date
SE1451426A SE538595C2 (en) 2014-11-26 2014-11-26 Method and system for improving object detection performanceof a 3d detection unit

Country Status (2)

Country Link
DE (1) DE102015014199B4 (en)
SE (1) SE538595C2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022093255A1 (en) * 2020-10-30 2022-05-05 Hewlett-Packard Development Company, L.P. Filterings of regions of object images

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102009023896B4 (en) 2009-06-04 2015-06-18 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Apparatus and method for detecting a plant
DE102011053219B4 (en) 2011-09-02 2022-03-03 pmdtechnologies ag Combined pixel with phase-sensitive and color-selective sub-pixel

Also Published As

Publication number Publication date
DE102015014199A1 (en) 2016-06-02
DE102015014199B4 (en) 2021-08-12
SE1451426A1 (en) 2016-05-27

Similar Documents

Publication Publication Date Title
US11145038B2 (en) Image processing method and device for adjusting saturation based on depth of field information
US11206360B2 (en) Exposure control method for obtaining HDR image, related exposure control device and electronic device
US9412172B2 (en) Sparse light field representation
EP3712841B1 (en) Image processing method, image processing apparatus, and computer-readable recording medium
CN110378946B (en) Depth map processing method and device and electronic equipment
US9786062B2 (en) Scene reconstruction from high spatio-angular resolution light fields
CN108961383B (en) Three-dimensional reconstruction method and device
KR20210006276A (en) Image processing method for flicker mitigation
KR102059906B1 (en) Method and image capturing device for detecting fog in a scene
CN110378944B (en) Depth map processing method and device and electronic equipment
US11644570B2 (en) Depth information acquisition system and method, camera module, and electronic device
CN109885053B (en) Obstacle detection method and device and unmanned aerial vehicle
CN112633181B (en) Data processing method, system, device, equipment and medium
US9367759B2 (en) Cooperative vision-range sensors shade removal and illumination field correction
CN102034230B (en) Method for enhancing visibility of image
CN107527074B (en) Image processing method and device for vehicle
CN110188640B (en) Face recognition method, face recognition device, server and computer readable medium
Łuczyński et al. Underwater image haze removal with an underwater-ready dark channel prior
JP5955291B2 (en) Filtering device and environment recognition system
KR101341243B1 (en) Apparatus and method of restoring image damaged by weather condition
US8655054B2 (en) System and method of correcting a depth map for 3D image
SE538595C2 (en) Method and system for improving object detection performance of a 3D detection unit
CN107025636B (en) Image defogging method and device combined with depth information and electronic device
WO2019175920A1 (en) Fog specification device, fog specification method, and fog specification program
KR102079686B1 (en) Apparatus and method of color image quality enhancement using intensity image and depth image