CN111024040A - Distance estimation method and apparatus - Google Patents

Distance estimation method and apparatus

Info

Publication number
CN111024040A
Authority
CN
China
Prior art keywords
image
distance estimation
lane
target length
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910953997.5A
Other languages
Chinese (zh)
Other versions
CN111024040B (en)
Inventor
李宰雨
具滋厚
李元周
玄伦硕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of CN111024040A publication Critical patent/CN111024040A/en
Application granted granted Critical
Publication of CN111024040B publication Critical patent/CN111024040B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T7/571 Depth or shape recovery from multiple images from focus
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G06T7/536 Depth or shape recovery from perspective effects, e.g. by using vanishing points
    • B60W40/06 Road conditions
    • G01S13/08 Systems for measuring distance only
    • G01S17/08 Systems determining position data of a target for measuring distance only
    • G06T7/13 Edge detection
    • G06T7/20 Analysis of motion
    • G06T7/60 Analysis of geometric attributes
    • G06T7/70 Determining position or orientation of objects or cameras
    • B60W2420/403 Image sensing, e.g. optical camera
    • B60W2420/506 Inductive sensors, i.e. passive wheel sensors
    • B60W2552/00 Input parameters relating to infrastructure
    • G06T2207/30256 Lane; Road marking
    • G06T2210/12 Bounding box

Abstract

A distance estimation method and apparatus are provided. The method comprises the following steps: acquiring an actual target length corresponding to a target point of an object; calculating an image target length corresponding to the actual target length from the input image; and estimating a distance from the distance estimation device to the object based on the actual target length, the image target length, and a focal length of the distance estimation device.

Description

Distance estimation method and apparatus
This application claims priority from Korean Patent Application No. 10-2018-0120158, filed on October 10, 2018, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein by reference in its entirety.
Technical Field
Methods and apparatus consistent with the present disclosure relate to distance estimation techniques.
Background
Adaptive Cruise Control (ACC) technology is required in autonomous driving, such as in Advanced Driver Assistance Systems (ADAS). ACC is a technique for sensing the speed of a preceding vehicle in the lane in which the vehicle is currently traveling and adjusting the speed of the vehicle so that the vehicle does not collide with the preceding vehicle, by traveling while keeping a certain distance from it.
Some vehicles currently on the market include the following function: when a desired target speed is input, the vehicle travels at the target speed if there is no preceding vehicle, and keeps a certain distance from a preceding vehicle by reducing its speed accordingly when one is present. To implement such a function, a technique for stably measuring the distance to another vehicle is required.
Disclosure of Invention
An aspect is to provide a distance estimation method and apparatus.
According to an aspect of the embodiments, there is provided a distance estimation method including: acquiring an actual target length corresponding to a target point of an object; calculating an image target length corresponding to the actual target length from the input image; and estimating a distance from the distance estimation device to the object based on the actual target length, the image target length, and a focal length of the distance estimation device.
According to another aspect of the embodiments, there is provided a distance estimation apparatus including: an image sensor configured to acquire an input image; and a processor configured to: acquiring an actual target length corresponding to a target point of an object; calculating an image target length corresponding to the actual target length from the input image; and estimating a distance from the distance estimation device to the object based on the actual target length, the image target length, and the focal length of the distance estimation device.
According to another aspect of the embodiments, there is provided a distance estimation apparatus including: an image sensor including a lens and an image plane, wherein the image sensor is configured to capture an input image of a target object; and a processor configured to: generating a projection image from the input image; estimating an actual lane width based on a lane distance between lane lines in the projection image; calculating an image lane width and an image target object width by pixels from an input image; calculating a first ratio between an image lane width and an image target object width; calculating an actual target object width based on the actual lane width and the first ratio; and estimating a distance from the image sensor to the target object based on the actual target object width, the image target object width, and a focal length from a focal point of the lens to the image plane.
Drawings
The embodiments will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings, in which:
fig. 1 is a diagram showing a front object and a distance estimation apparatus according to an embodiment;
FIG. 2 is a flow diagram illustrating a distance estimation method according to an embodiment;
fig. 3 is a diagram showing the distance estimation process according to the embodiment in detail;
fig. 4 is a diagram illustrating estimation of an actual lane width according to an embodiment;
fig. 5 and 6 are diagrams showing estimation of the vehicle width according to the embodiment;
FIG. 7 is a top view illustrating distance estimation according to an embodiment;
fig. 8 is a block diagram showing a configuration of a distance estimation apparatus according to an embodiment.
Detailed Description
Fig. 1 is a diagram illustrating a front object and a distance estimation apparatus according to an embodiment.
The distance estimation apparatus 110 according to the embodiment may estimate the distance 119 to the front object 190. The distance 119 to the front object 190 may represent a straight-line distance 119 from the distance estimation device 110 to the front object 190. The distance estimation device 110 may acquire an input image through an image sensor and estimate the distance 119 to the front object 190 based on the input image.
When the vanishing point changes due to camera motion (e.g., pitch motion), distance estimation that relies on the vanishing point may be subject to error. Such a vanishing point change may occur when the vehicle passes over a speed bump or travels on a bumpy or inclined road. The distance estimation apparatus 110 according to the embodiment can estimate the distance stably, regardless of vanishing point changes and even on curved ground, by using the geometric relationship between the object width and the lane width.
For example, the geometric relationship between the width of an object appearing in the input image and the width of a lane may be the same as or similar to the relationship between the width of the object and the width of the lane in the real world. Therefore, the distance estimation device 110 can estimate the vehicle width in the real world based on the relationship between the width of the object appearing in the input image and the lane width, and can then estimate the distance 119 to the front object 190 based on that real-world vehicle width. Hereinafter, a technique for stably estimating the distance 119 to the front object 190 from only a single input image will be described with reference to fig. 2 to 8.
For reference, a road 101 may include one or more lanes 102. A lane line 103 may represent a boundary line between one lane 102 and another. However, the inventive concept is not limited to cases where lane lines 103 are present on the road 101 or the lanes 102. A driving lane 102 may represent the lane 102 in which the vehicle is currently traveling. As one type of lane line 103, a center line (not shown) may represent a boundary line indicating that entry of a vehicle is prohibited.
Fig. 2 is a flowchart illustrating a distance estimation method according to an embodiment.
First, in operation 210, the distance estimation apparatus may acquire an actual target length corresponding to target points of an object. A target point is a specific point designated on the object, and two target points may be designated as references for distance estimation. For example, the actual target length may be a length corresponding to the physical width of the object; in this case, the two target points set the width of the object and may be the leftmost point of the object along the horizontal axis and the rightmost point level with it. In the case of a three-dimensional (3D) bounding box with real dimensions that covers the object in the physical world, each vertex of the front or back face of the bounding box may correspond to a target point. However, the inventive concept is not limited thereto; the target length may also be a distance between feature points in the visual appearance of the object. For example, when the object is a vehicle, points (e.g., center points) corresponding to the two rear lamps may be set as target points, or points corresponding to the two front lamps may be set as target points.
Additionally or alternatively, the target point may be set for height rather than width. For example, when the object is a vehicle, a point corresponding to the bottom of the vehicle and a point corresponding to the top of the vehicle may be set as target points. In this case, the actual target length may represent the height of the vehicle (e.g., vehicle height).
The acquisition of the actual target length will be described below with reference to fig. 5 and 6.
In operation 220, the distance estimation apparatus may calculate an image target length corresponding to the actual target length from the input image. The image target length may represent a length corresponding to the target points in the input image. The distance estimation apparatus may detect a bounding box including the object from the input image and determine the image target length based on the detected bounding box. For example, the image target length may represent the length between vertices of the bounding box. When the bounding box is a two-dimensional (2D) bounding box, the image target length may be the length of the bottom line (base line) of the bounding box. When the bounding box is a 3D bounding box, the image target length may be the length of the bottom line of the back face of the bounding box. Here, the back face of the bounding box is the face oriented toward the distance estimation device with respect to the central axis of the road. In other words, among the faces of the bounding box, the face located closer to the distance estimation device with respect to the central axis of the road may be referred to as the back face, and the face located farther away may be referred to as the front face. The central axis of the road may be an axis defined along the center points of the lane in which the distance estimation device is located, and may be a straight line on a straight road and a curve on a curved road.
For example, the input image may be a color image. The color image may include a plurality of color channel images. For example, the color channel images may include a red channel image, a green channel image, and a blue channel image. The pixels of each color channel image may represent the intensity of light corresponding to the wavelength of a color in the light received by the sensor. However, the input image is not limited thereto.
Subsequently, in operation 230, the distance estimation apparatus may estimate the distance from the distance estimation apparatus to the object based on the actual target length of the object, the image target length, and the focal length of the distance estimation apparatus. For example, the distance estimation apparatus may estimate the distance to the object from the focal distance based on the relationship between the actual target length and the image target length. This is because the proportional relationship between the actual target length and the image target length is the same as the proportional relationship between the distance to the object and the focal length. Such distance estimation will be described below with reference to fig. 7.
Fig. 3 is a diagram illustrating the distance estimation process according to the embodiment in detail.
In operation 301, the distance estimation apparatus may acquire a camera image. For example, the distance estimation device may receive camera images through one or more image sensors. The image sensor may be installed to capture an image at the front side of the distance estimation device. The image sensor may photograph a scene corresponding to an angle of view of the front side. However, the number of image sensors and the mounting positions thereof are not limited thereto.
In operation 311, the distance estimation apparatus may perform a projection image operation. For example, the distance estimation device may generate a projection image by projecting the input image on the ground. The distance estimation device may generate the projection image by homography. For example, the projected image may correspond to a bird's eye view.
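For example, such a warp may be sketched with an OpenCV-style homography; the four source corners marking a ground patch in the camera image, the metric patch size, and the scale are illustrative assumptions rather than values from the disclosure:

```python
# A minimal sketch of the projection-image operation (bird's eye view),
# assuming OpenCV is available. src_corners mark a ground patch in the
# camera image, ordered near-left, near-right, far-right, far-left;
# the patch size in meters and px_per_m are illustrative assumptions.
import cv2
import numpy as np

def to_birds_eye(frame, src_corners, patch_w_m=8.0, patch_h_m=20.0, px_per_m=20):
    out_w, out_h = int(patch_w_m * px_per_m), int(patch_h_m * px_per_m)
    # The same ground patch viewed from directly above: near edge at the bottom.
    dst = np.float32([[0, out_h], [out_w, out_h], [out_w, 0], [0, 0]])
    H = cv2.getPerspectiveTransform(np.float32(src_corners), dst)
    projection = cv2.warpPerspective(frame, H, (out_w, out_h))
    return projection, H
```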
In operation 312, the distance estimation apparatus may detect a lane width around the distance estimation apparatus. The distance estimation apparatus according to the embodiment may estimate a lane line of a lane in which the object is located in the input image. The distance estimation apparatus may determine an image lane width based on the estimated lane line. Here, the image lane width may be a length corresponding to a width of a lane detected from the input image, and may be, for example, a pixel distance between one lane line and another lane line, wherein the pixel distance defines the lane recognized in the input image. The determination of the image lane width will be described below with reference to fig. 5.
For example, the distance estimation apparatus may detect a lane line by interpolating portions corresponding to the same line in the input image. For example, the lane line may be a dashed line in which segments of the line are placed at certain intervals. Even in this case, the distance estimation apparatus can detect the lane line by converting the dashed line into a solid line through the above-described interpolation.
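A minimal sketch of this interpolation, assuming the dash segments have already been detected as pixel coordinates, fits a polynomial through all segment points and resamples it as a solid line:

```python
# Convert a dashed lane line into a continuous one by fitting x = f(y)
# through all detected dash fragments (pixel coordinates) and sampling it.
import numpy as np

def interpolate_lane(segment_points, y_top, y_bottom):
    pts = np.vstack(segment_points)                   # (N, 2) array of (x, y) pixels
    coeffs = np.polyfit(pts[:, 1], pts[:, 0], deg=2)  # x as a polynomial in y
    ys = np.arange(y_top, y_bottom)
    xs = np.polyval(coeffs, ys)
    return np.stack([xs, ys], axis=1)                 # solid-line samples
```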
In operation 313, the distance estimation apparatus may detect a lane line around the object. For example, the distance estimation apparatus may estimate the actual lane width based on a projection image obtained by projecting the input image on the ground. For example, the actual lane width may be estimated in meters. The distance estimation device may identify lane lines in the projection image and estimate a horizontal interval between the lane lines. The horizontal spacing between lane lines may correspond to the actual lane width. Lane line recognition using the projection images and estimation of the actual lane width will be described below with reference to fig. 4.
However, the acquisition of the actual lane width is not limited thereto. When High Definition (HD) map data is accessible, the distance estimation apparatus may acquire the actual lane width of the lane in which the object is located from map data corresponding to the current road. For example, the distance estimation apparatus may acquire lane information corresponding to the geographic coordinates at which the object is located from the HD map data and extract the actual lane width from the lane information.
In operation 314, the distance estimation apparatus may detect the actual width of the target object. For example, the distance estimation device may calculate the actual target length based on the image target length and the lane line appearing in the input image. The distance estimation apparatus may calculate a ratio between an image target length and an image lane width corresponding to a lane in which the object is located from the input image. The distance estimation device may estimate the actual target length from the actual lane width based on the ratio. The estimation of the actual target length will be described below with reference to fig. 5.
In operation 321, the distance estimation apparatus may detect a bounding box of the target object. For example, the distance estimation apparatus may detect a bounding box covering the object from the input image. The distance estimation device may detect the bounding box by using one of various algorithms. For example, the distance estimation apparatus may detect a bounding box including a region corresponding to an object in the input image using a neural network. The neural network may be trained to output a bounding box region from the image that corresponds to an object (e.g., a vehicle) to be detected. The bounding box may represent a 2D box or a 3D box that includes the object. The bounding box may have a particular shape (e.g., a rectangle or cuboid), and may represent a box that includes the space occupied by the object in a 2D space or a 3D space.
For example, each edge of the 2D bounding box may contact a portion of the object, and the 2D bounding box may be a minimum bounding box defined to minimize the size of the 2D bounding box. The top edge of the 2D bounding box may contact a top portion of an object appearing in the input image and the bottom edge thereof may contact a bottom portion of the object. Each face of the 3D bounding box may contact a portion of the object, and the 3D bounding box may be a minimum bounding box defined to minimize a size of the 3D bounding box. When the object is a vehicle, the front of the vehicle may contact the front of the 3D bounding box and the rear of the vehicle may contact the back of the 3D bounding box. An upper portion of the vehicle may contact a top surface of the 3D bounding box and a lower portion of the vehicle may contact a bottom surface of the 3D bounding box. The side of the vehicle may contact the side of the 3D bounding box.
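As an illustrative sketch only (the disclosure does not prescribe a specific network), an off-the-shelf detector such as torchvision's Faster R-CNN could stand in for the bounding-box detector; the score threshold and label index are assumptions:

```python
# Sketch of 2D bounding-box detection with a pretrained detector
# (torchvision >= 0.13). COCO label 3 is 'car' in torchvision's mapping.
import torch
import torchvision

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect_vehicles(image_tensor, score_thr=0.5, car_label=3):
    """Return [x1, y1, x2, y2] boxes; image_tensor is a float CHW tensor in [0, 1]."""
    with torch.no_grad():
        out = model([image_tensor])[0]
    keep = (out["scores"] > score_thr) & (out["labels"] == car_label)
    return out["boxes"][keep]
```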
In operation 322, the distance estimation device may track the bounding box. According to one embodiment, the distance estimation apparatus may track an object in the input image acquired in each frame. The distance estimation device may determine the position in the current frame of the bounding box detected in the previous frame. The distance estimation apparatus may store the result of tracking the bounding box in the object information database 340. For example, in response to a case where the actual target length (e.g., the actual vehicle width) has already been estimated for the bounding box corresponding to the object in a previous frame, the distance estimation apparatus may load the actual target length of the previous frame from the object information database 340 into the current frame. The distance estimation apparatus may track an object to which a previous target length acquired in a previous frame is assigned, and may determine the current target length of the tracked object in the current frame to be the previous target length acquired in the previous frame. Therefore, even when it is difficult to estimate the actual target length in the current frame, the distance estimation apparatus can stably acquire the actual target length corresponding to the object. Here, a frame may represent a time point; when the current frame is assumed to be the t-th time point, the previous frame may represent any one of the 1st to (t-1)-th time points, where t is an integer greater than or equal to 2.
The distance estimation apparatus may estimate a previous target length of the object in a previous frame based on the additional sensor. The additional sensor may be another sensor used with the image sensor and may include, for example, a radar sensor or a LiDAR sensor. The distance estimation device may assign the estimated previous target length to the object. Therefore, when target length estimation using an additional sensor cannot be performed in the current frame, the distance estimation apparatus may use an actual target length previously acquired in the previous frame.
Object information database 340 may represent a database that includes information associated with objects. For example, the object information database 340 may include an Identification (ID) of the object, a size of the bounding box, a number of bounding boxes, and an actual target length estimated in a previous frame for the object.
The object information database 340 may include object size information mapped for each object model (e.g., vehicle model). The object size information may be information related to the size of an object corresponding to a specific model, and may include, for example, the width, height, and length of the object. The distance estimation apparatus according to the embodiment may determine object size information corresponding to an object appearing in the input image based on a visual appearance of the object. The distance estimation device may acquire the actual target length from the object size information.
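A minimal sketch of such an object information store, keeping for each tracked object the width estimate made at the closest range (where it is most accurate), might look as follows; the structure and field names are assumptions:

```python
# Cache the estimated real width per track ID, preferring estimates made
# when the object was closest to the sensor.
class ObjectInfoDB:
    def __init__(self):
        self._widths = {}  # track_id -> (real_width_m, distance_at_estimate_m)

    def update(self, track_id, real_width_m, distance_m):
        prev = self._widths.get(track_id)
        # Replace only when the new estimate was made at a closer range.
        if prev is None or distance_m < prev[1]:
            self._widths[track_id] = (real_width_m, distance_m)

    def width_of(self, track_id):
        entry = self._widths.get(track_id)
        return entry[0] if entry else None
```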
In operation 331, the distance estimation apparatus may extract a distance. For example, the distance estimation apparatus may calculate the distance from the focal distance based on a ratio between the actual target length and the image target length. The distance calculation will be described in detail below with reference to fig. 7.
Fig. 4 is a diagram illustrating estimation of an actual lane width according to an embodiment.
The distance estimation apparatus 491 according to the embodiment can estimate the actual lane width 421 by using the input image 410 captured at the front side thereof. Although the input image 410 shown in fig. 4 includes one vehicle as the front object 492 for convenience of description, the number and shape of the front object 492 are not limited thereto. The distance estimation device 491 may be mounted on a vehicle, and the input image 410 may include a portion of the vehicle (e.g., a hood). However, the inventive concept is not limited thereto, and a portion of the vehicle may appear in the input image 410 according to the installation position and the viewing angle of the image sensor.
The distance estimation device 491 may generate a projection image 420 by projecting the input image 410. For example, the distance estimation device 491 may generate the projection image 420 corresponding to the bird's eye view. The distance estimation device 491 may convert the input image 410 into the projection image 420 by a matrix operation (e.g., homography matrix calculation) that converts the coordinates of each pixel of the input image 410 into coordinates on the projection image 420. Since the influence of the movement of the vehicle or the height of the ground is small relative to the area immediately in front of the vehicle, the coordinates of the area closer to the vehicle in the projection image 420 may be more accurate.
The distance estimation device 491 can identify the lane line in the projection image 420. The distance estimation device 491 can calculate physical coordinates (e.g., 2D coordinates) of a position corresponding to a lane line. The distance estimation device 491 can calculate the horizontal interval between two parallel lane lines by using the physical coordinates of the lane lines. The distance estimation device 491 may estimate the actual lane width 421 based on the interval between the identified lane lines. The projected image 420 generated by the projected image operation corresponding to the bird's eye view may represent precise coordinates relative to an area closer to the sensor. Therefore, in order to estimate the lane width more accurately, the distance estimation device 491 may use information of an area closer to the distance estimation device 491 in the projection image 420.
For example, the distance estimation device 491 may identify a lane line corresponding to a lane in which the object 492 is located in the projection image 420. The distance estimation device 491 may estimate the actual lane width 421 based on the horizontal distance between boundary lines within a threshold distance from the distance estimation device 491 among the identified lane lines. Here, the actual lane width 421 may represent the actual width of a lane in the physical world.
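A sketch of this estimation, assuming the identified lane lines are available as ground coordinates in meters with the forward distance on the second axis, averages the horizontal interval over points within the threshold distance:

```python
# Estimate the real lane width from lane-line ground coordinates in the
# bird's-eye projection, using only points near the sensor where the
# projection is most accurate. Coordinate layout is an assumption.
import numpy as np

def actual_lane_width(left_line_m, right_line_m, max_range_m=10.0):
    """left_line_m / right_line_m: (N, 2) arrays of (x, y) in meters,
    where y is the forward distance from the sensor."""
    near_left = left_line_m[left_line_m[:, 1] < max_range_m]
    near_right = right_line_m[right_line_m[:, 1] < max_range_m]
    # Horizontal interval between the two parallel boundary lines.
    return float(np.mean(near_right[:, 0]) - np.mean(near_left[:, 0]))
```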
The distance estimation device 491 according to the embodiment may assume that the lane width at the position of the object 492 is the same as or similar to the lane width around the distance estimation device 491. Therefore, as described above, the distance estimation device 491 can estimate the actual lane width 421 of the lane around the object 492 based on the projection image 420 around the distance estimation device 491.
As shown in fig. 4, due to the angle of view of the image sensor, a lane line of another lane on the projection image 420 may be recognized at a position spaced apart from the distance estimation device 491. Therefore, the distance estimation apparatus 491 can also estimate the actual lane width 421 of the lane of the subject located in a different lane from the distance estimation apparatus 491 itself.
In response to a case where the point identifying the lane line corresponding to the lane in which the object 492 is located in the current frame is closer than the point identified in the previous frame, the distance estimation apparatus 491 may update the actual lane width 421 based on the point identifying the lane line in the current frame. Therefore, since the lane width can be calculated more accurately by using the information of the area closer to the distance estimation device 491 in the projection image 420, the distance estimation device 491 can update the actual lane width 421 more accurately.
According to another embodiment, the distance estimation apparatus 491 may estimate the actual lane width 421 only when the lane width at the position of the object 492 does not differ greatly from the lane width around the apparatus. For example, in response to a situation where the width of the lane in which the object 492 is located changes by more than a threshold change, the distance estimation device 491 may exclude the estimation of the actual lane width 421. When the estimation of the actual lane width 421 is excluded, the distance estimation apparatus 491 may acquire the actual lane width 421 or the actual target length of the excluded object by another method.
Fig. 5 and 6 are diagrams illustrating estimation of the vehicle width according to the embodiment.
Fig. 5 is a diagram illustrating a process of estimating the actual target length in a state where a vehicle equipped with the distance estimation device travels on a typical road.
The distance estimation device may detect a front object 592 at the front side of the distance estimation device from the input image 510. For example, the distance estimation device may detect a bounding box 582 that includes a front object 592. Fig. 5 shows an input image 510 in which a bounding box for each of three front objects is detected in the input image 510. Fig. 5 shows an example of detecting the bounding box 582 in 2D. The enlarged image 511 is an enlarged partial image including the bounding box 582.
The distance estimation device may determine the image target length and the interval between lane lines 503 from the input image 510.
For example, the distance estimation device may determine the image target length from the bounding box 582 detected from the input image 510. The distance estimation apparatus may determine a pixel distance corresponding to the width of the object as the image target length. The pixel distance corresponding to the width of the object may correspond to the width of the bounding box 582. The distance estimation device may determine the length of the bottom line or the upper edge in the bounding box 582 as the pixel distance corresponding to the width of the object.
The distance estimation device may determine the image lane width based on the bounding box 582 detected from the input image 510. The distance estimation apparatus may determine the image lane width based on an intersection between the lane line 503 and an extension line of the bottom line of the boundary box 582. When the front object 592 is a vehicle, it is assumed that the vehicle keeping a traveling lane generally travels parallel to the lane line 503 along the center axis of the lane. In this case, the longitudinal axis of the preceding vehicle may be parallel to the lane line 503, and the lateral axis of the preceding vehicle may be perpendicular to the lane line 503. Accordingly, since the bottom line of the boundary frame 582 is parallel to the lateral axis of the preceding vehicle, the length between the intersection points between the lane line 503 and the extended lines of the bottom line may correspond to the image lane width. The distance estimation apparatus may determine a pixel distance between intersections between the lane line 503 and the extension lines of the bottom lines as the image lane width.
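A minimal sketch of these two pixel measurements, assuming each lane line has been fitted as x = m·y + b in pixel coordinates (an assumption, consistent with the interpolation sketch above), may be written as follows:

```python
# Image-space measurements: extend the bounding box's bottom line
# horizontally and intersect it with each lane line (x = m*y + b).
def image_lane_width(bbox, left_mb, right_mb):
    x1, y1, x2, y2 = bbox                  # 2D box; y2 is the bottom edge
    y = y2                                 # extension line of the bottom line
    x_left = left_mb[0] * y + left_mb[1]   # intersection with left lane line
    x_right = right_mb[0] * y + right_mb[1]
    return x_right - x_left                # pixel distance between intersections

def image_target_length(bbox):
    x1, y1, x2, y2 = bbox
    return x2 - x1                         # width of the box in pixels
```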
Subsequently, the distance estimation device may estimate the actual target length based on the image lane width and the image target length calculated as described above. For example, the distance estimation apparatus may calculate, from the input image 510, a ratio between the image target length and the image lane width corresponding to the lane in which the object 592 is located. The distance estimation apparatus may calculate a ratio between a pixel distance corresponding to the interval between the lane lines 503 in the input image 510 and a pixel distance corresponding to the width of the object 592. The distance estimation apparatus may estimate the actual target length from the actual lane width based on the above ratio according to equation 1 and equation 2 below.
Equation 1

Ratio_img = W_img,V / W_img,L

Equation 2

W_real,V = Ratio_img · W_real,L

In equation 1, W_img,V may represent the image target length in pixels (e.g., the width of a vehicle appearing in the image), and W_img,L may represent the image lane width in pixels. Ratio_img may represent the ratio of the image target length W_img,V to the image lane width W_img,L. In equation 2, W_real,L may represent the actual lane width; for example, the distance estimation apparatus may acquire the actual lane width W_real,L through the operation described above with reference to fig. 4. W_real,V may represent the actual target length (e.g., the actual width of the vehicle).

For example, when the image target length W_img,V is a distance corresponding to 15 pixels and the image lane width W_img,L is a distance corresponding to 20 pixels, Ratio_img may be 0.75. When the actual lane width W_real,L calculated by homography as in fig. 4 is 4 meters, the actual target length W_real,V may be 3 meters.
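Equations 1 and 2 may be expressed directly in code; the assertion reproduces the numeric example above:

```python
# Equations 1 and 2: the pixel-space ratio transfers the known real lane
# width onto the unknown real object width.
def actual_target_length(w_img_v, w_img_l, w_real_l):
    ratio_img = w_img_v / w_img_l   # Equation 1
    return ratio_img * w_real_l     # Equation 2

# Worked example from the text: 15 px object, 20 px lane, 4 m lane width.
assert actual_target_length(15, 20, 4.0) == 3.0
```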
In the case where the actual target length W_real,V has already been estimated for an object 592 at a close distance from the distance estimation device, the distance estimation device may maintain the previously estimated actual target length W_real,V while continuing to track the object 592 as the object 592 moves away. This is because a value estimated at a close distance may be more accurate than a value estimated at a far distance. By maintaining the previously estimated actual target length W_real,V, the distance estimation device can stably estimate the distance to the object 592 even when the object 592 is far away and thus the lane lines 503 around the object 592 are not recognized or the lane width around the object 592 is not constant. Further, in response to the distance to the object 592 becoming less than the distance at which the actual target length W_real,V was previously estimated, the distance estimation device may update the actual target length W_real,V of the object 592. This is because the actual target length W_real,V can be estimated more accurately when the object 592 is located at a closer position.
Although fig. 5 has been described with reference to a straight road, the inventive concept is not limited thereto. Even when traveling on a curved road, the distance estimation device can detect the 2D bounding box and estimate the actual target length as described with reference to fig. 5.
Fig. 6 is a diagram illustrating a process of estimating an actual target length in a state where a vehicle mounted with a distance estimation device travels on a curved road.
The top view image 620 shows a state in which the vehicle mounted with the distance estimation device 691 travels on the curved road 601. Although only one lane of the curved road 601 is shown in the top view image 620, this is merely for convenience of description and the inventive concept is not limited thereto. The input image 610 may include a front object 692 that appears distorted when the front object 692 travels in a curved portion of the curved road 601. Therefore, the distance estimation device 691 can estimate the actual target length of the object more accurately by considering the attitude (pose) of the object 692.
The distance estimation device 691 according to the embodiment may estimate the pose of the object 692 from the input image 610. The distance estimation device 691 may determine the image target length 671 based on the estimated pose. For example, the distance estimation device 691 may detect a 3D bounding box 682 corresponding to the object 692 from the input image 610. The distance estimation device 691 may determine the image target length 671 based on the 3D bounding box 682. The distance estimation device 691 may determine the width of the 3D bounding box 682 (e.g., the length of the top line or the bottom line on the back of the box) as the image target length 671.
The distance estimation device 691 may determine the image lane width 672 based on the pose of the object 692 in the input image 610. For example, the distance estimation device 691 may determine the image lane width 672 based on an intersection between the lane line and an extension line of a bottom line constituting the back face of the 3D bounding box 682.
The distance estimation device 691 may estimate the actual target length according to equation 1 and equation 2 based on the image target length 671 and the image lane width 672 determined according to the description with reference to fig. 6.
Although fig. 6 has been described with reference to the curved road 601, the inventive concept is not limited thereto. Even when traveling on a straight road, the distance estimation device 691 can detect the 3D bounding box 682 and estimate the actual target length as described with reference to fig. 6.
Further, the distance estimation apparatus may identify linearity of a traveling road and determine the distance estimation method according to the identified linearity. For example, in response to a case where the current traveling road is identified as a straight road (e.g., a road having a curvature smaller than a threshold curvature), the distance estimation apparatus may detect the 2D bounding box from the input image and estimate the actual target length as described above with reference to fig. 5. As another example, in response to a case where the current traveling road is identified as a curved road (e.g., a road having a curvature equal to or exceeding a threshold curvature), the distance estimation apparatus may detect the posture of the 3D bounding box or the object from the input image and estimate the actual target length as described with reference to fig. 6. However, this is merely an example, and the operation of the distance estimation device is not limited thereto.
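A sketch of this branching, with an assumed curvature threshold and the two estimators passed in as placeholders for the pipelines of fig. 5 and fig. 6, may look as follows:

```python
# Dispatch to the 2D- or 3D-bounding-box pipeline by road curvature.
# The threshold value is an assumption; the two callables stand in for
# the fig. 5 (2D box) and fig. 6 (3D box) width estimators.
def estimate_actual_width(curvature, width_from_2d_bbox, width_from_3d_bbox,
                          curvature_thr=0.002):
    if abs(curvature) < curvature_thr:   # road identified as straight
        return width_from_2d_bbox()      # fig. 5 pipeline
    return width_from_3d_bbox()          # fig. 6 pipeline
```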
Fig. 7 is a top view illustrating distance estimation according to an embodiment.
Fig. 7 is a top view showing an object 790, a focal point 711, and an image plane 710 of an image sensor of the distance estimation apparatus. The image plane 710 may be a plane corresponding to the points in the image sensor that receive light through the lens, and each point of the image plane 710 may correspond to a pixel of the input image. Light reflected from an object 790 outside the distance estimation device may pass through the focal point 711 of the lens and reach the image plane 710. The relationship between the focal length f and the distance D may be the same as the relationship between the image target length W_img,V appearing on the image plane 710 and the actual target length W_real,V.
Equation 3

D / f = W_real,V / W_img,V

Therefore, according to equation 3, the distance estimation apparatus can calculate the distance D from the focal length f based on the actual target length W_real,V and the image target length W_img,V. The distance D may represent the distance from the focal point 711 of the lens of the image sensor to the object 790.
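Rearranged for D, equation 3 may be expressed as follows; the focal length in the usage example is an illustrative assumption:

```python
# Equation 3 rearranged for D: similar triangles through the lens focal point.
def estimate_distance(f_px, w_real_v, w_img_v):
    """f_px: focal length in pixels; w_real_v in meters; w_img_v in pixels."""
    return f_px * w_real_v / w_img_v   # D = f * W_real,V / W_img,V

# Continuing the example: a 3 m wide vehicle spanning 15 px with an assumed
# focal length of 700 px sits roughly 140 m away.
print(estimate_distance(700, 3.0, 15))   # 140.0
```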
Fig. 8 is a block diagram showing a configuration of a distance estimation apparatus according to an embodiment.
The distance estimation apparatus 800 according to an embodiment may include a sensor 810, a processor 820, and a memory 830.
The sensor 810 may be an image sensor 810 for acquiring an input image of a front side thereof. Image sensor 810 may generate a color image, and the color image may include a plurality of color channel images. However, the inventive concept is not limited thereto, and the image sensor 810 may generate a monochrome image (e.g., a black-and-white image).
The processor 820 may acquire an actual target length corresponding to a target point of the object. The processor 820 may calculate an image target length corresponding to the actual target length from the input image. The processor 820 may estimate the distance from the distance estimation device 800 to the object based on the actual target length of the object, the image target length, and the focal length of the distance estimation device 800. However, the operation of the processor 820 is not limited thereto, and the processor 820 may perform the operations described with reference to fig. 1 to 7.
The memory 830 may temporarily or permanently store data for performing the distance estimation method. For example, the memory 830 may store the actual target length estimated in the previous frame, the actual lane width, the distance from the object in the previous frame, and the like.
The above-described apparatus may be implemented as hardware components, software components, and/or a combination of hardware and software components. For example, the devices and components described in the embodiments may be implemented using one or more general purpose or special purpose computers, such as a processor, controller, Arithmetic Logic Unit (ALU), digital signal processor, microcomputer, Field Programmable Gate Array (FPGA), Programmable Logic Unit (PLU), microprocessor, or any other device capable of executing and responding to instructions. The processor may execute an Operating System (OS) and one or more software applications executing on the OS. Further, in response to execution of the software, the processor may access, store, manipulate, process, and generate data. For ease of understanding, the processor may be described in the singular; however, one of ordinary skill in the art will appreciate that the processor may include multiple processing elements and/or multiple types of processing elements. For example, a processor may include multiple processors or a processor and a controller. In addition, other processing configurations (such as parallel processors) are also possible.
The software may include computer programs, code, instructions, or a combination of one or more of them, and may configure the processor to operate as desired or may individually or collectively instruct the processor. Software and/or data may be embodied permanently or temporarily in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or transmitted signal wave for purposes of interpretation by or provision of commands or data to a processor. The software may be distributed over network-coupled computer systems so that it is stored and executed in a distributed fashion. The software and data may be stored on one or more computer-readable recording media.
The method according to the embodiment may be implemented in the form of program commands executable by various computer devices, wherein the program commands may be recorded on a computer-readable recording medium. The computer-readable recording medium may include program commands, data files, and data structures alone or in combination. The program commands recorded on the computer-readable recording medium may be program commands specially designed and configured for the embodiment, or may be program commands known and available to a computer programmer in the art. Examples of the computer-readable recording medium may include magnetic recording media (such as hard disks, floppy disks, and magnetic tapes); optical recording media (such as compact disc read only memory (CD-ROM) and Digital Versatile Disc (DVD)); magneto-optical recording media (such as floptical disks); and hardware devices (such as ROM, RAM, and flash memory) specially configured to store and execute program commands. Examples of the program commands may include machine language code that can be generated by a compiler and high-level language code that can be executed by a computer using an interpreter. A hardware device may be configured to operate as one or more software modules to perform the operations of an embodiment, and vice versa.
Although the embodiments have been described with reference to the accompanying drawings, those skilled in the art can make various changes and modifications without departing from the spirit and scope of the inventive concept. For example, the described techniques may be performed in a different order than the described methods, and/or the described components (such as systems, structures, devices, and circuits) may be combined or combined in a different manner than the described methods, or may be replaced or substituted by other components or their equivalents.
Accordingly, other embodiments, other examples, and equivalents of the appended claims are within the scope of the following claims.
While the inventive concept has been particularly shown and described with reference to embodiments thereof, it will be understood that various changes in form and details may be made therein without departing from the spirit and scope of the appended claims.

Claims (22)

1. A method of distance estimation, comprising:
acquiring an actual target length corresponding to a target point of an object;
calculating an image target length corresponding to the actual target length from the input image; and
estimating a distance from the distance estimation device to the object based on the actual target length, the image target length, and a focal length of the distance estimation device.
2. The distance estimation method according to claim 1, wherein the step of acquiring the actual target length includes: the actual target length is calculated based on the image target length and the lane line appearing in the input image.
3. The distance estimation method according to claim 2, wherein the step of calculating the actual target length includes:
calculating a ratio between an image target length and an image lane width corresponding to a lane in which the object is located from the input image; and
an actual target length is estimated from an actual lane width based on the ratio.
4. The distance estimation method according to claim 3, wherein the step of calculating the ratio includes:
estimating a lane line of a lane in which the object is located in the input image; and
an image lane width is determined based on the estimated lane lines.
5. The distance estimation method according to claim 4, wherein the step of estimating the lane line includes: the lane line is detected by interpolating portions corresponding to the same line in the input image.
6. The distance estimation method according to claim 4, wherein the step of calculating the image target length includes:
detecting a bounding box corresponding to an object from an input image; and
the image object length is determined from the bounding box,
wherein the image lane width is determined based on an intersection between the lane line and an extension line of the bottom line of the bounding box.
7. The distance estimation method according to claim 4, wherein the step of calculating the image target length includes:
estimating a pose of an object from an input image; and
an image target length is determined based on the pose,
wherein the image lane width is determined based on the pose of the object in the input image.
8. The distance estimation method according to claim 4, wherein the step of calculating the image target length includes:
detecting a three-dimensional bounding box corresponding to an object from an input image; and
determining an image object length based on the three-dimensional bounding box,
wherein the image lane width is determined based on an intersection between the lane line and an extension line of a bottom line constituting a back face of the three-dimensional bounding box.
9. The distance estimation method according to claim 3, wherein the step of estimating the actual target length includes: the actual lane width is estimated based on a projection image obtained by projecting the input image on the ground.
10. The distance estimation method according to claim 9, wherein the step of estimating the actual lane width includes:
identifying a lane line corresponding to a lane in which the object is located in the projection image; and
estimating an actual lane width based on a horizontal interval between boundary lines within a threshold distance from the distance estimation device among the identified lane lines.
11. The distance estimation method according to claim 3, wherein the step of estimating the actual target length includes: an actual lane width of a lane in which the object is located is acquired from map data corresponding to the current road.
12. The distance estimation method according to claim 3, wherein the step of calculating the ratio includes: a pixel ratio between a first pixel distance corresponding to a space between lane lines in an input image and a second pixel distance corresponding to a width of an object is calculated.
13. The distance estimation method according to claim 3, wherein the step of estimating the actual target length includes: in response to a case where a first point of a lane line corresponding to a lane in which the object is located is identified to be closer in the current frame than a second point identified in the previous frame, the actual lane width is updated based on the first point.
14. The distance estimation method according to claim 3, wherein the step of calculating the ratio includes: excluding the estimated actual target length of the object in response to a situation where the width change of the lane in which the object is located exceeds a threshold change.
15. The distance estimation method according to claim 1, wherein the step of acquiring the actual target length includes:
tracking an object assigned a previous target length acquired in a previous frame; and
a current target length in a current frame of the tracked object is determined as a previous target length acquired in a previous frame.
16. The distance estimation method according to claim 15, wherein the step of tracking the object includes:
estimating a previous target length of the object in a previous frame based on the additional sensor; and
the estimated previous target length is assigned to the object.
17. The distance estimation method according to claim 1, wherein the step of acquiring the actual target length includes:
determining object size information corresponding to an object based on a visual appearance of the object represented in the input image; and
the actual target length is obtained from the object size information.
18. The distance estimation method according to claim 1, wherein the step of estimating the distance includes: the distance is calculated from the focal distance based on the ratio between the actual target length and the image target length.
19. A non-transitory computer-readable recording medium having recorded thereon at least one computer program comprising at least one instruction for executing the distance estimation method according to claim 1.
20. A distance estimation device comprising:
an image sensor configured to acquire an input image; and
a processor configured to:
acquiring an actual target length corresponding to a target point of an object;
calculating an image target length corresponding to the actual target length from the input image; and
estimating a distance from the distance estimation device to the object based on an actual target length, an image target length, and a focal length of the distance estimation device.
21. A distance estimation device comprising:
an image sensor including a lens and an image plane, wherein the image sensor is configured to capture an input image of a target object; and
a processor configured to:
generating a projection image from the input image;
estimating an actual lane width based on a distance between lane lines in the projection image;
calculating, in pixels, an image lane width and an image target object width from the input image;
calculating a first ratio between the image lane width and the image target object width;
calculating an actual target object width based on the actual lane width and the first ratio; and
estimating a distance from the image sensor to the target object based on the actual target object width, the image target object width, and a focal length from a focal point of the lens to the image plane.
22. The distance estimation apparatus according to claim 21, wherein the projection image is generated by a homography matrix calculation.
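For illustration only: a compact sketch of the processing chain of claims 21 and 22 using OpenCV. The four road-plane point correspondences, the output size, and all numeric values are illustrative assumptions, not calibration data from the patent.

```python
import cv2
import numpy as np

# Hypothetical calibration: four road-plane points in the input image
# and their targets in the top-view (projection) image.
SRC_PTS = np.float32([[420, 720], [860, 720], [700, 460], [580, 460]])
DST_PTS = np.float32([[100, 400], [260, 400], [260, 0], [100, 0]])
H = cv2.getPerspectiveTransform(SRC_PTS, DST_PTS)  # homography matrix (claim 22)

def generate_projection_image(frame):
    # Warp the input image onto the road plane to measure lane geometry.
    return cv2.warpPerspective(frame, H, (360, 400))

def estimate_target_distance(actual_lane_m, image_lane_px, image_target_px, focal_px):
    # First ratio between image lane width and image target object width,
    # both measured in pixels at the target's image row.
    first_ratio = image_lane_px / image_target_px
    # Actual target object width from the actual lane width and the ratio.
    actual_target_m = actual_lane_m / first_ratio
    # Pinhole-model distance using the focal length (focal point to image plane).
    return focal_px * actual_target_m / image_target_px
```

Because the target's width is recovered from the lane width in the same frame, no per-vehicle size prior is required; the metric scale comes entirely from the known (or map-provided) lane width.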
CN201910953997.5A 2018-10-10 2019-10-09 Distance estimation method and device Active CN111024040B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2018-0120158 2018-10-10
KR1020180120158A KR20200040374A (en) 2018-10-10 2018-10-10 Method and device to estimate distance

Publications (2)

Publication Number Publication Date
CN111024040A (en) 2020-04-17
CN111024040B (en) 2023-05-30

Family

ID=68172105

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910953997.5A Active CN111024040B (en) 2018-10-10 2019-10-09 Distance estimation method and device

Country Status (4)

Country Link
US (1) US11138750B2 (en)
EP (1) EP3637313A1 (en)
KR (1) KR20200040374A (en)
CN (1) CN111024040B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102633140B1 * 2018-10-23 2024-02-05 Samsung Electronics Co., Ltd. Method and apparatus of determining driving information
US10380440B1 (en) 2018-10-23 2019-08-13 Capital One Services, Llc Method for determining correct scanning distance using augmented reality and machine learning models
GB201900839D0 (en) * 2019-01-21 2019-03-13 Or3D Ltd Improvements in and relating to range-finding
US11037002B2 (en) * 2019-02-08 2021-06-15 ANI Technologies Private Limited Calibration of fixed image-capturing device for depth estimation
KR20210017525A * 2019-08-08 2021-02-17 Samsung Electronics Co., Ltd. Distance estimating apparatus and operating method thereof
US11341719B2 (en) * 2020-05-07 2022-05-24 Toyota Research Institute, Inc. System and method for estimating depth uncertainty for self-supervised 3D reconstruction
US11625859B2 (en) 2020-07-22 2023-04-11 Motorola Solutions, Inc. Method and system for calibrating a camera and localizing objects within the camera field of view
CN112884961B * 2021-01-21 2022-11-29 Jilin Jikesoft Information Technology Co., Ltd. Face recognition gate system for epidemic prevention and control
JP2022172928A * 2021-05-07 2022-11-17 Woven Planet Holdings, Inc. Remote monitoring device, remote monitoring system, remote monitoring method, and remote monitoring program
KR102413162B1 * 2021-12-30 2022-06-24 DareeSoft Co., Ltd. Hazardous object information management server capable of estimating the actual size of a hazardous object on the road by interworking with an information-collecting terminal mounted on a vehicle, and operating method thereof
US11967159B2 (en) * 2022-01-26 2024-04-23 Motional Ad Llc Semantic annotation of sensor data with overlapping physical features
JP2023131720A * 2022-03-09 2023-09-22 Canon Inc. Electronic device, moving body, distance calculation method, and computer program

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3861781B2 2002-09-17 2006-12-20 Nissan Motor Co., Ltd. Forward vehicle tracking system and forward vehicle tracking method
JP4681856B2 2004-11-24 2011-05-11 Aisin Seiki Co., Ltd. Camera calibration method and camera calibration apparatus
EP2863374A4 (en) * 2012-06-14 2016-04-20 Toyota Motor Co Ltd Lane partition marking detection apparatus, and drive assist system
KR101646495B1 2014-11-18 2016-08-08 Hanbat National University Industry-Academic Cooperation Foundation Road distance estimation method using image information of black box camera
JP6450294B2 2015-09-30 2019-01-09 Denso IT Laboratory, Inc. Object detection apparatus, object detection method, and program
KR20180022277A 2016-08-24 2018-03-06 Kim Jung-tae System for measuring vehicle interval based on blackbox
KR102371592B1 2016-11-02 2022-03-07 Hyundai Motor Company Apparatus and method for estimating inter-vehicle distance
US10529083B2 (en) * 2016-12-08 2020-01-07 Lighmetrics Technologies Pvt. Ltd. Methods and systems for estimating distance of an object from a moving vehicle
KR102535540B1 * 2017-01-12 2023-05-23 Mobileye Vision Technologies Ltd. Navigation based on vehicle activity

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002296350A (en) * 2001-03-30 2002-10-09 Nissan Motor Co Ltd Object detector
CN104185588A (en) * 2012-03-28 2014-12-03 金泰克斯公司 Vehicular imaging system and method for determining roadway width
US20150302611A1 (en) * 2014-04-16 2015-10-22 Xerox Corporation Vehicle dimension estimation from vehicle images
CN107105154A (en) * 2016-02-22 2017-08-29 株式会社万都 Drive assistance device and driving assistance method
CN105632186A (en) * 2016-03-11 2016-06-01 博康智能信息技术有限公司 Method and device for detecting vehicle queue jumping behavior

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113468955A * 2021-05-21 2021-10-01 Hong Kong Productivity Council Method, device and storage medium for estimating distance between two points in traffic scene
CN113468955B * 2021-05-21 2024-02-02 Hong Kong Productivity Council Method, device and storage medium for estimating distance between two points in traffic scene
CN114132325A * 2021-12-14 2022-03-04 Jingdong Kunpeng (Jiangsu) Technology Co., Ltd. Vehicle driving method and device
CN114132325B * 2021-12-14 2024-03-01 Jingdong Kunpeng (Jiangsu) Technology Co., Ltd. Method and device for driving vehicle
CN114659489A * 2022-03-11 2022-06-24 Suzhou Qingyan Microvision Electronic Technology Co., Ltd. Front vehicle distance detection method and device based on convex lens imaging principle
CN115861975A * 2023-02-28 2023-03-28 Hangzhou Zhenshi Intelligent Technology Co., Ltd. Obstacle vehicle pose estimation method and device
CN115861975B * 2023-02-28 2023-05-12 Hangzhou Zhenshi Intelligent Technology Co., Ltd. Obstacle vehicle pose estimation method and device

Also Published As

Publication number Publication date
US11138750B2 (en) 2021-10-05
US20200118283A1 (en) 2020-04-16
EP3637313A1 (en) 2020-04-15
CN111024040B (en) 2023-05-30
KR20200040374A (en) 2020-04-20

Similar Documents

Publication Publication Date Title
CN111024040B (en) Distance estimation method and device
US11632536B2 (en) Method and apparatus for generating three-dimensional (3D) road model
CN112292711B (en) Associating LIDAR data and image data
US9916689B2 (en) Apparatus and method for estimating camera pose
US11670087B2 (en) Training data generating method for image processing, image processing method, and devices thereof
KR102455632B1 Method and apparatus for stereo matching
KR102519666B1 (en) Device and method to convert image
CN111086521A (en) Calibration method, calibration device, and non-transitory computer recording medium
KR102436730B1 (en) Method and apparatus for estimating parameter of virtual screen
KR20190030474A (en) Method and apparatus of calculating depth map based on reliability
JP7424390B2 (en) Image processing device, image processing method, and image processing program
Berriel et al. A particle filter-based lane marker tracking approach using a cubic spline model
JP5521217B2 (en) Obstacle detection device and obstacle detection method
KR20210015516A (en) Method and system for improving depth information of feature points using camera and lidar
KR102003387B1 Method for detecting and locating traffic participants using bird's-eye view image, computer-readable recording medium storing traffic participants detecting and locating program
KR102195040B1 (en) Method for collecting road signs information using MMS and mono camera
Neves et al. A calibration algorithm for multi-camera visual surveillance systems based on single-view metrology
WO2022133986A1 (en) Accuracy estimation method and system
KR102049666B1 (en) Method for Estimating 6-DOF Relative Displacement Using Vision-based Localization and Apparatus Therefor
Kowsari et al. Map-based lane and obstacle-free area detection
JP2020027328A (en) Traffic light estimation device, traffic light estimation method, and program
US20240098231A1 (en) Image processing device, image processing method, and computer-readable medium
CN116612459B (en) Target detection method, target detection device, electronic equipment and storage medium
US20230245362A1 (en) Image processing device, image processing method, and computer-readable medium
KR102613257B1 (en) Modeling method, modeling device and modeling system for modeling target object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant