CN110307791B - Vehicle length and speed calculation method based on three-dimensional vehicle boundary frame - Google Patents


Info

Publication number
CN110307791B
CN110307791B (application CN201910509507.2A)
Authority
CN
China
Prior art keywords
vehicle
dimensional
boundary frame
detection area
homography matrix
Prior art date
Legal status
Active
Application number
CN201910509507.2A
Other languages
Chinese (zh)
Other versions
CN110307791A (en)
Inventor
张建
张博
Current Assignee
Southeast University
Original Assignee
Southeast University
Priority date
Filing date
Publication date
Application filed by Southeast University
Priority: CN201910509507.2A
Publication of CN110307791A
Application granted
Publication of CN110307791B
Legal status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/04 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness specially adapted for measuring length or width of objects while moving
    • G01B11/043 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness specially adapted for measuring length or width of objects while moving for measuring length
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01P MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P5/00 Measuring speed of fluids, e.g. of air stream; Measuring speed of bodies relative to fluids, e.g. of ship, of aircraft
    • G01P5/18 Measuring speed of fluids, e.g. of air stream; Measuring speed of bodies relative to fluids, e.g. of ship, of aircraft by measuring the time taken to traverse a fixed distance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/40 Scenes; Scene-specific elements in video content
    • G06V20/41 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G06V20/42 Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items of sport video content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/08 Detecting or categorising vehicles


Abstract

The invention provides a vehicle length and speed calculation method based on a three-dimensional vehicle boundary frame. The method comprises the following steps: (1) generating a vehicle mask with a Mask R-CNN network; (2) drawing tangent lines from the three scene vanishing points to the generated vehicle mask to construct a three-dimensional vehicle boundary frame; (3) establishing a virtual vehicle detection area, and judging whether a vehicle is in the detection area according to whether the midpoint of the front edge of the bottom face of its three-dimensional boundary frame lies inside the area; (4) determining the pixel coordinates of road reference points using dashed lane-line segments and a scene vanishing point, and solving the homography matrix between road-plane world coordinates and the corresponding pixel coordinates from the known dash length and lane width; (5) calculating the actual vehicle length using the homography matrix and the three-dimensional boundary frame; (6) calculating the vehicle speed using the homography matrix, the three-dimensional boundary frame and the virtual detection area. The method offers high calculation accuracy and low equipment cost, and can be effectively applied in intelligent transportation systems.

Description

Vehicle length and speed calculation method based on three-dimensional vehicle boundary frame
Technical Field
The invention relates to a vehicle length and speed calculation method based on a three-dimensional vehicle boundary frame, and belongs to the fields of computer vision technology and intelligent traffic.
Background
Vehicle length and speed are important traffic parameters. Vehicle speed is usually measured by sensors buried under the road surface or by roadside radar. However, such equipment is expensive, and when vehicles follow each other closely it produces large detection errors or fails entirely. Video-based detection is widely studied as a low-cost alternative, but its accuracy is still unsatisfactory. Vehicle length is an important index for classifying vehicle types, yet effective, low-cost methods for measuring it are still lacking. Calculating vehicle length and speed effectively, accurately and cheaply therefore remains a difficult problem in the traffic field.
Disclosure of Invention
To solve these problems, the invention discloses a vehicle length and speed calculation method based on a three-dimensional vehicle boundary frame that offers high calculation accuracy and low equipment cost.
This purpose is achieved by the following technical scheme:
a vehicle length and speed calculation method based on a three-dimensional vehicle boundary frame comprises the following steps:
(1) generating a vehicle Mask based on a Mask R-CNN network;
(2) drawing tangents from three vanishing points in the scene to the generated vehicle mask to construct a three-dimensional vehicle boundary frame;
(3) establishing a vehicle virtual detection area, and judging whether a vehicle is in the detection area according to whether the midpoint of the front edge of the bottom surface of the three-dimensional vehicle boundary frame is in the detection area;
(4) calibrating the road surface to obtain a homography matrix between the world coordinates of the road plane and the corresponding pixel coordinates;
(5) calculating the actual length of the vehicle by using the homography matrix and the three-dimensional vehicle boundary frame;
(6) calculating the vehicle speed by using the homography matrix, the three-dimensional vehicle boundary frame and the virtual detection area.
In the above method, the three-dimensional vehicle boundary frame in step (2) is constructed as follows: the first, second and third orthogonal vanishing points of the scene are determined from the lane lines, vehicle textures and street-lamp positions in the traffic scene, respectively, and tangent lines are drawn from these three orthogonal vanishing points to the vehicle mask generated by Mask R-CNN to construct the three-dimensional vehicle boundary frame.
In the above method, the vehicle virtual detection area in step (3) is established within the camera's field of view, and whether a vehicle is in the detection area is judged by whether the midpoint of the front edge of the bottom face of its three-dimensional vehicle boundary frame lies inside the area.
In the above method, the road surface is calibrated in step (4) to obtain the homography matrix H between road-plane world coordinates and the corresponding pixel coordinates,

H = \begin{bmatrix} m_1 & m_2 & m_3 \\ m_4 & m_5 & m_6 \\ m_7 & m_8 & 1 \end{bmatrix} \qquad (1)

First, the pixel coordinates of the endpoints of a dashed lane-line segment are determined; straight lines are constructed through these endpoints and the second scene vanishing point, and the pixel coordinates of the intersections of these lines with the solid lines on both sides of the lane are obtained. These points of known pixel coordinates are taken as road-surface calibration reference points. Their world coordinates follow from the known actual dash length and lane width. Substituting the pixel and world coordinates of the reference points into equation (2), the eight independent parameters of the homography matrix can be solved by least squares or singular value decomposition, providing the basis for the vehicle length and speed calculations based on the three-dimensional vehicle boundary frame.

X_i = \frac{m_1 x_i + m_2 y_i + m_3}{m_7 x_i + m_8 y_i + 1}, \qquad Y_i = \frac{m_4 x_i + m_5 y_i + m_6}{m_7 x_i + m_8 y_i + 1} \qquad (2)

where (x_i, y_i) and (X_i, Y_i) are the world coordinates and pixel coordinates of reference point i, respectively, and m_1, …, m_8 are the independent parameters of the homography matrix H.
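As a concrete illustration of the calibration step, the following sketch solves equation (2) for the eight parameters of H by singular value decomposition; the reference-point coordinates below are hypothetical values chosen for illustration, not taken from the patent:

```python
import numpy as np

def solve_homography(world_pts, pixel_pts):
    """Solve the 8 independent parameters of H from >= 4 reference
    points (no three collinear) via singular value decomposition."""
    A = []
    for (x, y), (X, Y) in zip(world_pts, pixel_pts):
        # Each point contributes two linear equations derived from
        # X = (m1*x + m2*y + m3) / (m7*x + m8*y + 1) and
        # Y = (m4*x + m5*y + m6) / (m7*x + m8*y + 1).
        A.append([x, y, 1, 0, 0, 0, -X * x, -X * y, -X])
        A.append([0, 0, 0, x, y, 1, -Y * x, -Y * y, -Y])
    # The solution is the right singular vector belonging to the
    # smallest singular value of A (the null-space direction).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so the ninth entry equals 1

# Hypothetical reference points: dash endpoints and their projections
# onto the opposite lane edge (lane width 3.75 m, dash length 6 m).
world = [(0.0, 0.0), (0.0, 6.0), (3.75, 0.0), (3.75, 6.0)]
pixel = [(420.0, 710.0), (455.0, 560.0), (610.0, 705.0), (625.0, 558.0)]
H = solve_homography(world, pixel)

# With exactly 4 points the homography interpolates exactly: re-project
# one world point and normalize by the third homogeneous component.
p = H @ np.array([0.0, 6.0, 1.0])
p = p[:2] / p[2]
```

With more than four reference points the same linear system is overdetermined and the SVD yields the least-squares solution, matching the patent's mention of both least squares and singular value decomposition.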
In the vehicle length calculation of step (5), the world coordinates of the midpoints of the front and rear edges of the bottom face of the three-dimensional vehicle boundary frame are computed using the homography matrix obtained from road-surface calibration, and the distance between these two world coordinates is the actual vehicle length.
In the vehicle speed calculation of step (6), the speed is obtained from the distance travelled by the vehicle inside the virtual detection area and the corresponding time. The travel distance is the world-coordinate distance between the midpoints of the front edge of the bottom face of the three-dimensional vehicle boundary frame when the vehicle enters and leaves the detection area, and the travel time is determined from the video frame rate:

v = \frac{L}{T} = \frac{\lVert P_{out} - P_{in} \rVert}{N_f / F_r} \qquad (3)

where v is the vehicle speed; L and T are the distance and time travelled by the vehicle inside the virtual detection area; N_f is the total number of frames spanning the vehicle's passage through the detection area; F_r is the video frame rate; and P_{in} and P_{out} are the world coordinates of the midpoint of the front edge of the bottom face when the vehicle enters and leaves the detection area, respectively.
Compared with the prior art, the invention has the beneficial effects that:
(1) Compared with calculation methods that rely on the three orthogonal vanishing points and the camera height, the road-plane calibration method of the invention achieves higher accuracy.
(2) Compared with traditional speed-measurement schemes based on buried sensors or radar, the required equipment cost is greatly reduced.
(3) The invention provides a monocular-camera-based vehicle length calculation method, offering a new approach to traffic vehicle-type classification.
Drawings
FIG. 1 shows the construction of the three-dimensional vehicle bounding box;
FIG. 2 shows the positions of the road-surface calibration reference points and the verification targets;
FIG. 3 is a schematic diagram of the vehicle length calculation;
FIG. 4 is a schematic diagram of the vehicle speed calculation.
Detailed Description
The present invention will be further described with reference to the following embodiments.
Taking a traffic scene on a bridge deck as an example, the lengths and speeds of vehicles passing through the camera view are detected. The method comprises the following steps:
1. The first, second and third vanishing points of the scene are determined from the lane lines, vehicle textures and street-lamp positions, respectively; each vanishing point is the least-squares intersection of the different straight lines sharing that direction in the scene. Based on the three orthogonal vanishing points, tangent lines are drawn to the vehicle mask generated by Mask R-CNN to construct the three-dimensional vehicle bounding box, as shown in FIG. 1. Corner points 1, 2, 3 and 4 of the bounding box are determined directly from intersections between the tangent lines, and the remaining four corner points 5, 6, 7 and 8 are then derived from them. Once all eight corner points are determined, the three-dimensional vehicle bounding box is constructed.
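The least-squares intersection of same-direction lines can be sketched as follows; the segment coordinates are illustrative only, and each line is represented by its unit normal so that the vanishing point minimizes the summed squared point-to-line distances:

```python
import numpy as np

def vanishing_point(segments):
    """Least-squares intersection of image-line segments that share a
    3-D direction. Each segment ((x1, y1), (x2, y2)) contributes one
    linear constraint n . p = n . p1, where n is the line's unit normal."""
    A, b = [], []
    for (x1, y1), (x2, y2) in segments:
        n = np.array([y2 - y1, x1 - x2], dtype=float)  # normal direction
        n /= np.linalg.norm(n)
        A.append(n)
        b.append(n @ np.array([x1, y1], dtype=float))
    # Point minimizing the summed squared distances to all lines.
    vp, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    return vp

# Illustrative lane-line segments that converge toward one image point;
# all three lines pass through (1000, 200).
segs = [((0, 0), (500, 100)), ((0, 400), (500, 300)), ((0, 800), (500, 500))]
vp = vanishing_point(segs)
```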
2. To calculate the vehicle speed, a virtual vehicle detection area is established within the camera's field of view, and a vehicle is judged to be in the detection area when the midpoint of the front edge of the bottom face of its three-dimensional bounding box lies inside the area.
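The containment test in this step can be implemented with a standard ray-casting point-in-polygon check; the detection-area corners below are hypothetical pixel coordinates:

```python
def in_detection_area(point, polygon):
    """Ray-casting test: count crossings of a horizontal ray from
    `point` with the polygon's edges; an odd count means inside.
    `point` would be the midpoint of the front edge of the bottom
    face of the 3-D vehicle bounding box."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal ray
            x_cross = x1 + (x2 - x1) * (y - y1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical trapezoidal detection area spanning the visible lanes:
area = [(300, 400), (900, 400), (1100, 700), (100, 700)]
```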
3. Since the homography matrix in equation (1) has eight independent unknown parameters, at least four reference points, no three of which are collinear, are required to solve H. First, the pixel coordinates of the endpoints of the dashed lane-line segments are determined; straight lines are then constructed through these endpoints and the second vanishing point, and the pixel coordinates of their intersections with the solid lines on both sides of the lane are obtained. These points of known pixel coordinates serve as calibration reference points, as shown in FIG. 2. Their world coordinates follow from the known dash length and lane width. Substituting the pixel and world coordinates of the reference points into equation (2), H is solved by least squares or singular value decomposition. To verify the reliability of the proposed calibration, 15 dashed lane-line segments of known length were used as verification targets, shown in FIG. 2. Table 1 lists the target lengths computed from the three orthogonal vanishing points and the camera height alongside those computed by the proposed method; the errors of the proposed method are all below 5%, substantially outperforming the traditional approach.
4. The obtained homography matrix is inverted to give H^{-1}. The homogeneous pixel coordinates of the midpoints of the front and rear edges of the bottom face of the three-dimensional vehicle bounding box are each multiplied by H^{-1} to obtain homogeneous world coordinates, which are normalized by their third component to yield two-dimensional world coordinates. The distance between the two-dimensional world coordinates of these two midpoints is the actual vehicle length; the calculation is illustrated in FIG. 3.
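Under the assumption that H maps world coordinates to pixel coordinates as in equation (2), the length computation reduces to two applications of H^{-1} followed by a Euclidean distance; the calibration matrix and midpoint pixels below are illustrative only:

```python
import numpy as np

def pixel_to_world(H, pt):
    """Map a pixel coordinate to road-plane world coordinates using
    H^{-1}, normalizing the homogeneous result by its third component."""
    w = np.linalg.inv(H) @ np.array([pt[0], pt[1], 1.0])
    return w[:2] / w[2]

def vehicle_length(H, front_mid, rear_mid):
    """Distance between the world coordinates of the midpoints of the
    front and rear edges of the bounding box's bottom face."""
    return float(np.linalg.norm(pixel_to_world(H, front_mid)
                                - pixel_to_world(H, rear_mid)))

# Toy calibration: 100 pixels per metre, no perspective distortion.
H = np.array([[100.0, 0.0, 0.0],
              [0.0, 100.0, 0.0],
              [0.0, 0.0, 1.0]])
length = vehicle_length(H, (0.0, 450.0), (0.0, 0.0))  # 4.5 m
```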
5. The vehicle speed is calculated from the distance travelled by the vehicle inside the virtual detection area and the corresponding time, as shown in FIG. 4. The travel distance is the world-coordinate distance between the midpoints of the front edge of the bounding box's bottom face when the vehicle enters and leaves the detection area, and the travel time is determined from the video frame rate, giving equation (3):

v = \frac{L}{T} = \frac{\lVert P_{out} - P_{in} \rVert}{N_f / F_r} \qquad (3)
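Equation (3) can be sketched directly; the numbers below are illustrative (a 20 m detection area traversed in 24 frames at 30 fps):

```python
import math

def vehicle_speed(p_in, p_out, n_frames, frame_rate):
    """Equation (3): v = L / T, with L the world distance between the
    front-edge midpoints at entry and exit and T = N_f / F_r."""
    L = math.dist(p_in, p_out)        # metres
    return L * frame_rate / n_frames  # equivalent to L / (N_f / F_r)

v = vehicle_speed((0.0, 0.0), (0.0, 20.0), n_frames=24, frame_rate=30)
# v = 25.0 m/s, i.e. 90 km/h
```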
in conclusion, the vehicle length and speed calculation method based on the three-dimensional vehicle boundary frame provided by the invention can be effectively applied to an intelligent traffic system.
TABLE 1. Calculated lengths of the verification targets on the road surface

Claims (6)

1. A vehicle length and speed calculation method based on a three-dimensional vehicle boundary frame is characterized by comprising the following steps:
(1) generating a vehicle Mask based on a Mask R-CNN network;
(2) drawing tangents from three vanishing points in the scene to the generated vehicle mask to construct a three-dimensional vehicle boundary frame;
(3) establishing a vehicle virtual detection area, and judging whether a vehicle is in the detection area according to whether the midpoint of the front edge of the bottom surface of the three-dimensional vehicle boundary frame is in the detection area;
(4) calibrating the road surface to obtain a homography matrix between the world coordinates of the road plane and the corresponding pixel coordinates;
(5) calculating the actual length of the vehicle by using the homography matrix and the three-dimensional vehicle boundary frame;
(6) calculating the vehicle speed by using the homography matrix, the three-dimensional vehicle boundary frame and the virtual detection area.
2. The method of claim 1, wherein in step (2) three vanishing points of the scene are determined from the lane lines, vehicle textures and street-lamp positions in the traffic scene, respectively, and tangent lines are drawn from the three vanishing points to the vehicle mask generated by Mask R-CNN to construct the three-dimensional vehicle boundary frame.
3. The method of claim 1, wherein in step (3) the vehicle virtual detection area is established within the camera's field of view, and whether a vehicle is in the detection area is judged by whether the midpoint of the front edge of the bottom face of its three-dimensional vehicle boundary frame lies inside the area.
4. The method of claim 1, wherein the road-surface calibration of step (4) obtains the homography matrix H between road-plane world coordinates and the corresponding pixel coordinates,

H = \begin{bmatrix} m_1 & m_2 & m_3 \\ m_4 & m_5 & m_6 \\ m_7 & m_8 & 1 \end{bmatrix} \qquad (1)

as follows: first, the pixel coordinates of the endpoints of a dashed lane-line segment are determined; straight lines are constructed through these endpoints and the second scene vanishing point, and the pixel coordinates of the intersections of these lines with the solid lines on both sides of the lane are obtained; these points of known pixel coordinates are taken as road-surface calibration reference points; their world coordinates are obtained from the known actual dash length and lane width; substituting the pixel and world coordinates of the reference points into equation (2), the eight independent parameters of the homography matrix are solved by least squares or singular value decomposition, providing the basis for the vehicle length and speed calculations based on the three-dimensional vehicle boundary frame;

X_i = \frac{m_1 x_i + m_2 y_i + m_3}{m_7 x_i + m_8 y_i + 1}, \qquad Y_i = \frac{m_4 x_i + m_5 y_i + m_6}{m_7 x_i + m_8 y_i + 1} \qquad (2)

where (x_i, y_i) and (X_i, Y_i) are the world coordinates and pixel coordinates of reference point i, respectively, and m_1, …, m_8 are the independent parameters of the homography matrix H.
5. The method of claim 1, wherein in the vehicle length calculation of step (5) the world coordinates of the midpoints of the front and rear edges of the bottom face of the three-dimensional vehicle boundary frame are computed from the homography matrix obtained by road-surface calibration, and the distance between these two world coordinates is the actual vehicle length.
6. The method of claim 1, wherein in the vehicle speed calculation of step (6) the speed is obtained from the distance travelled by the vehicle inside the virtual detection area and the corresponding time, the travel distance being the world-coordinate distance between the midpoints of the front edge of the bottom face of the three-dimensional vehicle boundary frame when the vehicle enters and leaves the detection area, and the travel time being determined from the video frame rate, according to:

v = \frac{L}{T} = \frac{\lVert P_{out} - P_{in} \rVert}{N_f / F_r} \qquad (3)

where v is the vehicle speed; L and T are the distance and time travelled by the vehicle inside the virtual detection area; N_f is the total number of frames spanning the vehicle's passage through the detection area; F_r is the video frame rate; and P_{in} and P_{out} are the world coordinates of the midpoint of the front edge of the bottom face when the vehicle enters and leaves the detection area, respectively.
CN201910509507.2A 2019-06-13 2019-06-13 Vehicle length and speed calculation method based on three-dimensional vehicle boundary frame Active CN110307791B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910509507.2A CN110307791B (en) 2019-06-13 2019-06-13 Vehicle length and speed calculation method based on three-dimensional vehicle boundary frame

Publications (2)

Publication Number Publication Date
CN110307791A CN110307791A (en) 2019-10-08
CN110307791B true CN110307791B (en) 2020-12-29

Family

ID=68076628

Country Status (1)

Country Link
CN (1) CN110307791B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110909620A (en) * 2019-10-30 2020-03-24 北京迈格威科技有限公司 Vehicle detection method and device, electronic equipment and storage medium
CN113689713A (en) * 2020-05-19 2021-11-23 昆山研达电脑科技有限公司 Vehicle speed monitoring method based on automobile data recorder
WO2021237750A1 (en) * 2020-05-29 2021-12-02 Siemens Ltd., China Method and apparatus for vehicle length estimation
CN112489106A (en) * 2020-12-08 2021-03-12 深圳市哈工交通电子有限公司 Video-based vehicle size measuring method and device, terminal and storage medium
CN112798811B (en) * 2020-12-30 2023-07-28 杭州海康威视数字技术股份有限公司 Speed measurement method, device and equipment
CN113011388B (en) * 2021-04-23 2022-05-06 吉林大学 Vehicle outer contour size detection method based on license plate and lane line
CN114863025B (en) * 2022-05-18 2023-03-10 禾多科技(北京)有限公司 Three-dimensional lane line generation method and device, electronic device and computer readable medium
CN115019557B (en) * 2022-06-09 2024-05-14 杭州电子科技大学 Lane virtual boundary construction and boundary crossing detection method based on TUIO protocol

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010214258A (en) * 2009-03-13 2010-09-30 Toyota Motor Corp Masking jig
CN102410836A (en) * 2011-07-26 2012-04-11 清华大学 Space six-freedom degrees article locating system based on two-dimensional sensitive sensor
CN104200483A (en) * 2014-06-16 2014-12-10 南京邮电大学 Human body central line based target detection method under multi-camera environment
CN105573047A (en) * 2014-10-10 2016-05-11 中芯国际集成电路制造(上海)有限公司 System and method for detecting mask figure fidelity
CN105718870A (en) * 2016-01-15 2016-06-29 武汉光庭科技有限公司 Road marking line extracting method based on forward camera head in automatic driving
CN106408589A (en) * 2016-07-14 2017-02-15 浙江零跑科技有限公司 Vehicle-mounted overlooking camera based vehicle movement measurement method
CN107122792A (en) * 2017-03-15 2017-09-01 山东大学 Indoor arrangement method of estimation and system based on study prediction
US10055853B1 (en) * 2017-08-07 2018-08-21 Standard Cognition, Corp Subject identification and tracking using image recognition
CN108550143A (en) * 2018-04-03 2018-09-18 长安大学 A kind of measurement method of the vehicle length, width and height size based on RGB-D cameras
CN108759667A (en) * 2018-05-29 2018-11-06 福州大学 Front truck distance measuring method based on monocular vision and image segmentation under vehicle-mounted camera
WO2019097456A1 (en) * 2017-11-17 2019-05-23 C 3 Limited Object measurement system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8724854B2 (en) * 2011-04-08 2014-05-13 Adobe Systems Incorporated Methods and apparatus for robust video stabilization
US10872531B2 (en) * 2017-09-29 2020-12-22 Uber Technologies, Inc. Image processing for vehicle collision avoidance system


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Speed estimation and length based vehicle classification from freeway single-loop detectors; Benjamin Coifman et al.; Transportation Research Part C; 5 Jan. 2009; pp. 349-364 *
Research on moving-vehicle detection and tracking technology in traffic video; 赵俊梅 et al.; Vehicle & Power Technology; 15 Apr. 2012 (No. 4); pp. 46-49 *
Research on optical-flow-based moving-vehicle detection and tracking technology; 张利平 et al.; Vehicle & Power Technology; 15 Feb. 2014 (No. 2); pp. 61-64 *

Also Published As

Publication number Publication date
CN110307791A (en) 2019-10-08

Similar Documents

Publication Publication Date Title
CN110307791B (en) Vehicle length and speed calculation method based on three-dimensional vehicle boundary frame
US10354151B2 (en) Method of detecting obstacle around vehicle
CN107705331B (en) Vehicle video speed measurement method based on multi-viewpoint camera
CN101750049B (en) Monocular vision vehicle distance measuring method based on road and vehicle information
CN110031829B (en) Target accurate distance measurement method based on monocular vision
CN103487034B (en) Method for measuring distance and height by vehicle-mounted monocular camera based on vertical type target
CN112037159B (en) Cross-camera road space fusion and vehicle target detection tracking method and system
CN104700414A (en) Rapid distance-measuring method for pedestrian on road ahead on the basis of on-board binocular camera
CN103679120B (en) The detection method of rough road and system
CN110197173B (en) Road edge detection method based on binocular vision
CN109791607B (en) Detection and verification of objects from a series of images of a camera by means of homography matrices
CN111694011A (en) Road edge detection method based on data fusion of camera and three-dimensional laser radar
CN105551020A (en) Method and device for detecting dimensions of target object
Kellner et al. Road curb detection based on different elevation mapping techniques
CN112200779B (en) Driverless road surface rut shape and structure transverse difference degree evaluation method
JP6171849B2 (en) Moving body position / posture angle estimation apparatus and moving body position / posture angle estimation method
CN111476798B (en) Vehicle space morphology recognition method and system based on contour constraint
CN116503818A (en) Multi-lane vehicle speed detection method and system
CN110415299B (en) Vehicle position estimation method based on set guideboard under motion constraint
US9098774B2 (en) Method for detection of targets in stereoscopic images
Hara et al. Vehicle localization based on the detection of line segments from multi-camera images
CN118053299A (en) Underground garage blind area display method and system based on thunder fusion
CN116740295A (en) Virtual scene generation method and device
Oniga et al. A fast ransac based approach for computing the orientation of obstacles in traffic scenes
CN113834463B (en) Intelligent vehicle side pedestrian/vehicle monocular depth ranging method based on absolute size

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant