CN103686083B - Real-time speed measurement method based on vehicle-mounted sensor video streaming matching - Google Patents

Real-time speed measurement method based on vehicle-mounted sensor video streaming matching

Info

Publication number
CN103686083B
Authority
CN
China
Prior art keywords
angle point
image
vehicle
destination object
speed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201310659886.6A
Other languages
Chinese (zh)
Other versions
CN103686083A (en)
Inventor
闫莉萍
孙初雄
王晓林
夏元清
王美玲
付梦印
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Original Assignee
Beijing Institute of Technology BIT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT filed Critical Beijing Institute of Technology BIT
Priority to CN201310659886.6A priority Critical patent/CN103686083B/en
Publication of CN103686083A publication Critical patent/CN103686083A/en
Application granted granted Critical
Publication of CN103686083B publication Critical patent/CN103686083B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a real-time speed measurement method based on matching of the video stream from a vehicle-mounted sensor. The number of pixels by which a corner point of the same target object moves between two images captured by a vehicle-mounted camera is used to back-calculate the distance the vehicle travels during the interval between the two images, from which the driving speed of the vehicle is obtained and real-time speed information is provided to the driver. With this method the vehicle speed can be monitored with only a single vehicle-mounted camera, manufacturing and assembly errors of the vehicle and tire wear do not affect the measurement, and the method is simple and practical.

Description

A real-time speed measurement method based on vehicle-mounted sensor video stream matching
Technical field
The invention belongs to the technical fields of digital image processing, pattern recognition and system identification, and specifically relates to a real-time speed measurement method based on matching of the video stream from a vehicle-mounted sensor.
Background technology
At present there are many vehicle speed measurement methods, broadly divided into off-board methods and on-board methods. Off-board methods mainly include radar speed measurement, laser speed measurement and video speed measurement. Video speed measurement processes and analyses the video signal with software algorithms: a reference object is extracted from the video image, the number of pixels the reference object moves within a certain time is counted, the pixel count is converted into an actual distance, and the moving speed of the vehicle is finally calculated from the speed formula. The method needs neither radar nor a variety of other sensors; installing a video camera beside or above the road is enough to measure the travelling speed of the vehicles in the video. However, it serves only as a means of traffic speed enforcement and cannot display the speed on the vehicle in real time to remind the driver.
On-board methods are mainly mechanical speed measurement, tachometer generators, Hall sensors and inductive ground coils. They generally use the information produced by the rotation of the wheel axle to obtain the speed value, which is displayed on the speedometer to inform the driver.
Current self-measurement of vehicle speed essentially takes two forms: one calculates the speed from the rotational speed of the gears inside the gearbox; the other calculates it from the wheel-speed sensors fitted for the ABS brake system. However, manufacturing and assembly errors of the vehicle sensors, tire wear and non-compliant tire pressure may all introduce errors between the measured and the actual vehicle speed, which can easily contribute to traffic accidents.
Summary of the invention
In view of this, the invention provides a real-time speed measurement method based on matching of the video stream from a vehicle-mounted sensor. The number of pixels by which a corner point of the same target object moves between two captured images is used to back-calculate the distance the vehicle travels during the interval between the two images, giving the travelling speed of the vehicle and thus providing real-time speed information to the driver. The method requires only one vehicle-mounted camera to monitor the speed, is unaffected by vehicle manufacturing and assembly errors and by tire wear, and is simple and practical.
A real-time speed measurement method based on vehicle-mounted sensor video stream matching comprises the following steps:
Step 1: mount the video capture device on the side of the vehicle body with the lens pointing perpendicular to the heading of the vehicle, and capture a video image at time t1 and at time t2. The image captured at t1 is defined as the reference image and the image captured at t2 as the image to be registered; define ΔT = t2 - t1 as the vehicle travel time.
Step 2: the video processing equipment extracts the corner points in each image with the Harris corner detection method, obtaining a corner-point set for each image.
Step 3: register the corner-point sets extracted in Step 2 with a local-feature method, obtaining matched corner-point pairs.
Step 4: choose the same target object in the two images, find a corner-point pair of that target object among the matched pairs obtained in Step 3, and use an affine transformation to map the corner point of the target object in the image to be registered into the reference image, thereby obtaining the number of pixels Δd by which the corner point of the target object has moved.
Step 5: use formula (9) to convert the number of pixels the corner point of the target object moves in the image into the actual displacement ΔS of that corner point; ΔS is the distance the vehicle travels in the time ΔT:
ΔS = 2d·tan(θ/2)·Δd / W,  (9)
where θ is the viewing angle of the image capture device, W is the number of pixels in the horizontal direction of the image, and d is the perpendicular distance between the camera lens and the plane of the target object.
Step 6: obtain the travelling speed of the vehicle from the vehicle travel time ΔT of Step 1 and the travelled distance ΔS of Step 5.
Preferably, the reference image and the image to be registered in Step 1 are two consecutive frames.
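Combining steps 5 and 6 gives a single closed-form expression for the measured speed; this is only a restatement of the relations above in the symbols of Step 5, not an additional formula of the patent:

\[ v \;=\; \frac{\Delta S}{\Delta T} \;=\; \frac{2\,d\,\tan(\theta/2)\,\Delta d}{W\,\Delta T} \]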
Beneficial effects:
1) The invention back-calculates the distance the vehicle travels during the interval between two captured images from the number of pixels a corner point of the same target object moves between them, and thereby obtains the travelling speed of the vehicle. The measurement is not affected by vehicle manufacturing and assembly errors or by tire wear, and only a single vehicle-mounted camera is needed to monitor the speed, which is convenient and feasible.
2) In a preferred embodiment of the invention, the reference image and the image to be registered in Step 1 are two consecutive frames. The time the video capture device takes to acquire two consecutive frames is very short, so the vehicle speed can be considered constant and the motion rectilinear during this period, and the images are not subject to large disturbances; the measured result is therefore closer to the true vehicle speed.
Brief description of the drawings
Fig. 1 is the flow chart of the invention;
Fig. 2 shows the displacement of the target object in the image;
Fig. 3 shows the corner points extracted by the Harris corner detector;
Fig. 4 shows the result of corner-point matching;
Fig. 5 is a schematic diagram of the camera;
Fig. 6 is a top view of the camera schematic diagram.
Detailed description of the invention
The invention is described below in conjunction with the accompanying drawings and embodiments.
The invention provides a real-time speed measurement method based on matching of the video stream from a vehicle-mounted sensor. It can be carried out in a computer environment such as Windows 2000/XP and implemented in Matlab, C, C++ or any other language environment; the present embodiment uses Matlab and is tested with an ATRV ground unmanned autonomous vehicle and a video camera. The flow is shown in Fig. 1, and the specific steps are as follows:
Step 1: mount the camera on the side of the body of the ATRV ground unmanned autonomous vehicle with the lens pointing perpendicular to the heading of the vehicle, and capture a video image at time t1 and at time t2. The image captured at t1 is defined as the reference image and the image captured at t2 as the image to be registered; define ΔT = t2 - t1 as the vehicle travel time.
Suppose the time interval between two consecutive frames is Δt and that n frames are acquired within the interval ΔT; then ΔT = nΔt.
Because the background of the on-vehicle video image changes continuously, in order to avoid the influence of various factors on the speed measurement and to guarantee that the target object appears in both captured frames, two consecutive frames are used for the calculation. Their time interval is very short, so the vehicle speed can be considered constant and the motion rectilinear during this period, the images are not subject to large disturbances, and the measured result is closer to the true vehicle speed. In this embodiment n = 1, so ΔT = Δt.
Step 2: extract the corner points in each image with the Harris corner detection method, obtaining a corner-point set for each image.
The Harris operator is a corner extraction operator proposed by C. Harris and M. J. Stephens in 1988. Inspired by the autocorrelation function in signal processing, it defines a matrix M associated with the autocorrelation function; the eigenvalues of M are the first-order curvatures of the autocorrelation function, and a point is regarded as a corner when both curvatures are large. Harris corner extraction is computationally simple and insensitive to image rotation, grey-level changes, noise and viewpoint changes, and is a relatively stable corner extraction method.
The basic principle of the Harris corner detection method is as follows:
In the Harris algorithm, denote the grey level of pixel (x, y) by f(x, y). When each pixel (x, y) of the image is shifted by (u, v), the change of grey-level intensity is expressed as
E(u, v) = Σ(x,y) w(x, y)·[f(x + u, y + v) - f(x, y)]².
Expanding the right-hand side of the equation as a Taylor series and neglecting the higher-order terms gives
E(u, v) ≈ [u v]·M·[u v]ᵀ,
where w(x, y) is the coefficient of the Gaussian window at (x, y). To better reduce the effect of noise, the Harris detection algorithm smooths the image before extracting the corner points. The gradients fx and fy reflect the directions of grey-level change at each pixel; a pixel (x, y) is extracted as a corner point if the grey level changes sufficiently strongly in both directions.
Let
M = Σ(x,y) w(x, y)·[fx², fx·fy; fx·fy, fy²]
be the autocorrelation matrix of pixel (x, y). If λ1 and λ2 are the two eigenvalues of M, they are the first-order curvatures of the autocorrelation function. Flat regions, corner points and edges can be distinguished by examining the values of λ1 and λ2. To keep the computation simple, a corner response function is used so that the eigenvalues of the matrix need not be computed.
The corner response function CRF is defined as
CRF = det(M) - k·trace²(M),
where det(M) = λ1·λ2 and trace(M) = λ1 + λ2, and k is an empirical constant, usually taken as 0.04-0.06. The decision criterion is: when the value of CRF is greater than a preset threshold, the point is a candidate corner; otherwise it is not a corner. The threshold is adjusted according to the number of corner points required.
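For illustration only, the following Matlab sketch computes the Harris corner response described above and thresholds it. The file name, gradient kernel, Gaussian window size, value of k and the threshold scaling are assumptions of this example, not values prescribed by the patent (fspecial requires the Image Processing Toolbox; any Gaussian kernel may be substituted).

```matlab
% Sketch of Step 2: Harris corner response CRF = det(M) - k*trace(M)^2.
I = double(imread('frame1.png'));           % reference image (file name assumed)
if size(I,3) == 3, I = mean(I,3); end       % crude conversion to grey scale

dx = [-1 0 1];                              % simple gradient kernels
fx = conv2(I, dx,  'same');
fy = conv2(I, dx', 'same');

g = fspecial('gaussian', 7, 1.5);           % Gaussian window w (size/sigma assumed)
A = conv2(fx.^2,  g, 'same');               % smoothed fx^2
B = conv2(fy.^2,  g, 'same');               % smoothed fy^2
C = conv2(fx.*fy, g, 'same');               % smoothed fx*fy

k   = 0.04;                                 % empirical constant in the 0.04-0.06 range
CRF = (A.*B - C.^2) - k*(A + B).^2;         % corner response at every pixel

% Candidate corners: response above a preset threshold. The scaling by the
% maximum response is illustrative; the patent quotes 0.000149044 as the
% threshold used for its test image. No non-maximum suppression is applied.
threshold    = 1e-4 * max(CRF(:));
[rows, cols] = find(CRF > threshold);
corners      = [cols, rows];                % corner set as (x, y) pixel coordinates
```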
As shown in Fig. 3, the left image is the reference image and the right image is the image to be registered. The Harris algorithm is applied to both images to extract corner points. The decision criterion is: when the value of the response function CRF of a point exceeds a preset threshold, the point is a candidate corner; otherwise it is not a corner. The threshold is chosen according to the number of corner points required, so a suitable threshold must be set to guarantee enough corner points. Experience shows that about 200 corner points guarantee enough corner pairs for image registration; after many experiments the corner threshold for this image was chosen as 0.000149044.
Step 3: register the corner-point sets extracted in Step 2 with a local-feature method, obtaining matched corner-point pairs. Fig. 4 shows the result of matching the corner points with the local-feature method.
The registration of the corner sets with the local-feature method proceeds as follows: choose a corner point a in the reference image and a corner point b in the image to be registered, and select the same number of nearest corner points around a and around b. If the ratios of the distances from the surrounding corner points to a equal the ratios of the distances from the surrounding corner points to b, then a and b are regarded as a matching corner pair.
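A minimal sketch of the distance-ratio criterion just described, assuming K nearest neighbours and a tolerance tol that the patent does not specify; saved as match_corners.m, with corners1 and corners2 the (x, y) corner sets of the reference image and the image to be registered.

```matlab
function matched = match_corners(corners1, corners2, K, tol)
% Rows of the result are [index into corners1, index into corners2].
matched = zeros(0, 2);
for i = 1:size(corners1, 1)
    r1 = local_ratios(corners1, i, K);      % sorted distance ratios around corner a
    best = 0; bestErr = inf;
    for j = 1:size(corners2, 1)
        r2  = local_ratios(corners2, j, K); % sorted distance ratios around corner b
        err = max(abs(r1 - r2));            % the ratios must (nearly) coincide
        if err < bestErr, bestErr = err; best = j; end
    end
    if bestErr < tol
        matched(end+1, :) = [i, best];      %#ok<AGROW>
    end
end
end

function r = local_ratios(corners, idx, K)
d = sqrt(sum(bsxfun(@minus, corners, corners(idx,:)).^2, 2));  % distances to every other corner
d(idx) = inf;                                                  % exclude the point itself
d = sort(d);
r = d(1:K) / d(1);                                             % ratios of the K nearest distances
end
```

A call such as matched = match_corners(corners1, corners2, 6, 0.05) returns candidate pairs; K and tol would be tuned on real data.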
Because the camera is mounted on the vehicle, it moves with the vehicle and captures images of the roadside. The corner points that can be registered belong to the same target object at the roadside; the objects matched by corner points on both sides of the road are mostly static, unchanging objects such as buildings and trees, which are well suited as reference objects for calculating the speed.
Step 4: choose the same target object in the two images, find a corner-point pair of that target object among the matched pairs obtained in Step 3, and use an affine transformation to map the corner point of the target object in the image to be registered into the reference image, thereby obtaining the number of pixels Δd by which the corner point of the target object has moved.
The principle of on-vehicle video speed measurement is to back-calculate the displacement of the vehicle from the relative displacement of objects such as target buildings between two video frames. Suppose the vehicle passes a building on a level road from left to right; in the video captured by the camera, this building is the target object used for video-stream matching. Because the camera is fixed relative to the car, by the principle of relative motion the displacement of the static building relative to the camera equals the displacement of the moving vehicle relative to the static target building. Within a short time the same building appears in the two frames taken from the video stream; taking the building as the target object, the displacement of the target object is the displacement of a corner point on the target object, so measuring the displacement of the target object in the image becomes an image-matching problem, namely computing the number of pixels the corner point of the target object has moved.
During actual driving, road unevenness makes the vehicle pitch, so the video images acquired by the camera fixed on the vehicle can undergo translation, small rotations and scaling. Image registration depends on the chosen geometric transformation model; common models include the rigid transformation, similarity transformation, affine transformation and projective transformation. Because the affine transformation can represent translation, rotation, scaling and other transformations of the image without a heavy computational load, it can further guarantee the accuracy and real-time performance of image matching. Therefore the affine transformation model is adopted for the image transformation in the on-board system.
The basic form of the affine transformation model is
[x'; y'; 1] = H·[x; y; 1],
where h11, h12, h21 and h22 in H are the scaling (and rotation) factors, h13 and h23 are the translation factors in the horizontal and vertical directions respectively, and (x, y) and (x', y') are the coordinates of the image points of the same scene point in two adjacent frames.
The above can be rewritten as
x' = h11·x + h12·y + h13,
y' = h21·x + h22·y + h23.
Let
H = [h11 h12 h13; h21 h22 h23; 0 0 1].
From the transformation matrix H it can be seen that the image is translated along the x-axis and the y-axis; this is mainly because, during driving, the unevenness of the road makes the camera shake while shooting. In general the position of the camera in space is arbitrary, but the camera coordinate system usually takes the optical centre of the camera as the origin, the optical axis as the z-axis and the x-axis pointing to the right, following the right-hand rule; that is, the travelling direction of the vehicle is along the x-axis. Therefore the displacement of the target object along the x-axis is taken as its displacement on the image.
Through the model transformation, the target object moves h13 pixels in the time interval between the two adjacent frames, which gives the number of pixels by which the target object moves between two consecutive frames of the video stream.
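To make Step 4 concrete, the sketch below fits the affine model above to matched corner pairs by plain linear least squares and reads off the horizontal translation h13 as the pixel shift. The coordinate values, the variable names and the use of least squares over all pairs are assumptions of the example, not the patent's prescribed procedure.

```matlab
% p1: matched corner coordinates (x, y) in the reference image
% p2: the corresponding corners in the image to be registered
p1 = [120 80; 240 95; 310 160; 150 210];    % illustrative numbers only
p2 = [132 81; 252 96; 322 161; 162 211];

n  = size(p2, 1);
A  = [p2, ones(n, 1)];                      % model: each p1 point equals H applied to its p2 point
hx = A \ p1(:, 1);                          % [h11; h12; h13]
hy = A \ p1(:, 2);                          % [h21; h22; h23]
H  = [hx.'; hy.'; 0 0 1];                   % affine transformation matrix

delta_d = abs(hx(3));                       % h13: pixel shift of the target object along x
```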
Step 5: use formula (9) to convert the number of pixels the corner point of the target object moves in the image into the actual displacement ΔS of that corner point; ΔS is the distance the vehicle travels in the time ΔT.
Fig. 5 shows the model of the camera shooting the roadside. Let q be a corner point on the physical target object. At time t1 the optical centre of the camera is o and the image point of q on the reference image is p; at time t2 the optical centre of the camera is o' and the image point of q on the image to be registered is p'. Δd' = |op + o'p'| is the distance by which the image of the point q moves along the x-axis during ΔT, in millimetres.
Fig. 6 is the simplified top view of Fig. 5, showing the projections onto the plane of the physical target object of the camera optical centres o and o' at times t1 and t2. The plane of the physical target object is defined here as the plane through the target object that is parallel to the vehicle heading and perpendicular to the ground. The displacement of the corner point q relative to the car in this plane is the actual displacement of the vehicle, ΔS, in metres. oc = o'c' = f is the focal length of the camera, and d is the perpendicular distance between the camera and the plane of the physical target object; in practice this distance can be obtained by laser ranging to the plane of the target object at the shooting moment. The similar triangles in Fig. 5 give the proportional relations between the object-plane displacement and the corresponding image-plane displacement, formulas (3) and (4).
Substituting formulas (3) and (4) into the expression for the actual displacement gives formula (5); substituting oc = o'c' = f and the distance d into formula (5) then gives
ΔS = (d / f)·Δd'.  (6)
Here Δd' is the displacement of the corner point p on the image during the time interval ΔT, in millimetres. Through the affine transformation, the corner point of the target object in the image to be registered is mapped into the reference image, and the number of pixels the corner point p moves in the reference image is Δd, in pixels. The relation between Δd and Δd' is Δd' = k·Δd, where k is the conversion factor linking the quantity measured in pixels to the quantity measured in millimetres.
Let the viewing angle of the camera be θ and the resolution of the image be W × H. Using the width W of the image in the horizontal direction, in pixels, the corresponding length on the image plane is kW, in millimetres. From the geometry of the viewing angle,
tan(θ/2) = kW / (2f),  (7)
i.e.
k = 2f·tan(θ/2) / W,  (8)
where f is the focal length in millimetres and W is in pixels, so that k is obtained in millimetres per pixel.
Substituting formula (8) into formula (6) gives
ΔS = 2d·tan(θ/2)·Δd / W,  (9)
from which the actual displacement of the vehicle in the time interval ΔT can be calculated.
Step 6: obtain the travelling speed of the vehicle from the vehicle travel time ΔT obtained in Step 1 and the distance ΔS travelled by the vehicle obtained in Step 5.
According to the speed formula v = ΔS / ΔT, the real-time speed of the vehicle is obtained.
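As a worked numerical example of formula (9) and the speed formula, the following Matlab sketch converts an assumed pixel shift into a speed; the viewing angle, distance, resolution, frame interval and pixel shift are invented for illustration and are not measurement results reported in the patent.

```matlab
theta   = deg2rad(60);                    % camera viewing angle (assumed)
W       = 640;                            % horizontal resolution in pixels (assumed)
d       = 5.0;                            % lens-to-target-plane distance in metres (e.g. from laser ranging)
dT      = 1/25;                           % time between the two frames, n = 1 at 25 fps (assumed)
delta_d = 24;                             % pixel shift of the matched corner from Step 4 (assumed)

delta_S = 2*d*tan(theta/2)*delta_d / W;   % formula (9): distance travelled in dT, metres
v       = delta_S / dT;                   % v = delta_S / delta_T, metres per second
fprintf('Delta S = %.3f m, v = %.2f m/s (%.1f km/h)\n', delta_S, v, 3.6*v);
```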
In summary, the above are only preferred embodiments of the invention and are not intended to limit its scope of protection. Any modification, equivalent replacement or improvement made within the spirit and principles of the invention shall be included within the scope of protection of the invention.

Claims (2)

1. A real-time speed measurement method based on vehicle-mounted sensor video stream matching, characterised by comprising the following steps:
Step 1: mounting a video capture device on the side of the vehicle body with the lens pointing perpendicular to the heading of the vehicle, capturing a video image at time t1 and at time t2 respectively, defining the image captured at t1 as the reference image and the image captured at t2 as the image to be registered, and defining ΔT = t2 - t1 as the vehicle travel time;
Step 2: extracting, by video processing equipment, the corner points in each of the two images with the Harris corner detection method, obtaining corner-point sets;
Step 3: registering the corner-point sets extracted in Step 2 with the following matching method, obtaining matched corner-point pairs;
the matching method being: choosing a corner point a in the reference image and a corner point b in the image to be registered, and selecting the same number of nearest corner points around a and around b respectively; if the ratios of the distances from the corner points around a to a are equal to the ratios of the distances from the corner points around b to b, regarding a and b as a matching corner pair;
Step 4: choosing the same target object in the two images, finding a corner-point pair of the target object among the matched corner pairs obtained in Step 3, and using an affine transformation to map the corner point of the target object in the image to be registered into the reference image, thereby obtaining the number of pixels Δd by which the corner point of the target object moves;
Step 5: converting, by formula (9), the number of pixels the corner point of the target object moves in the image into the actual displacement ΔS of the corner point of the target object, ΔS being the distance the vehicle travels in the time ΔT:
ΔS = 2d·tan(θ/2)·Δd / W,  (9)
where θ is the viewing angle of the image capture device, W is the number of pixels in the horizontal direction of the image, and d is the perpendicular distance between the camera lens and the plane of the target object;
Step 6: obtaining the travelling speed of the vehicle from the vehicle travel time ΔT of Step 1 and the distance ΔS travelled by the vehicle obtained in Step 5.
2. The real-time speed measurement method based on vehicle-mounted sensor video stream matching according to claim 1, characterised in that the reference image and the image to be registered in Step 1 are two consecutive frames.
CN201310659886.6A 2013-12-09 2013-12-09 Real-time speed measurement method based on vehicle-mounted sensor video streaming matching Expired - Fee Related CN103686083B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310659886.6A CN103686083B (en) 2013-12-09 2013-12-09 Real-time speed measurement method based on vehicle-mounted sensor video streaming matching

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310659886.6A CN103686083B (en) 2013-12-09 2013-12-09 Real-time speed measurement method based on vehicle-mounted sensor video streaming matching

Publications (2)

Publication Number Publication Date
CN103686083A CN103686083A (en) 2014-03-26
CN103686083B true CN103686083B (en) 2017-01-11

Family

ID=50322200

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310659886.6A Expired - Fee Related CN103686083B (en) 2013-12-09 2013-12-09 Real-time speed measurement method based on vehicle-mounted sensor video streaming matching

Country Status (1)

Country Link
CN (1) CN103686083B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014007900A1 (en) * 2014-05-27 2015-12-03 Man Truck & Bus Ag Method and driver assistance system for determining the driving dynamics of a commercial vehicle
CN104168462B (en) * 2014-08-27 2017-06-13 重庆大学 Camera scene change detection method based on image angle point set feature
CN104331907B (en) * 2014-11-10 2018-03-16 东南大学 A kind of method based on ORB feature detections measurement bearer rate
CN104575004A (en) * 2014-12-24 2015-04-29 上海交通大学 Urban environment vehicle speed measurement method and system based on intelligent mobile terminal
CN106875427B (en) * 2017-01-11 2020-05-22 西南交通大学 Method for monitoring snaking motion of locomotive
CN106803350A (en) * 2017-03-06 2017-06-06 中山大学 A kind of vehicle speed detection method and device based on camera shooting time difference
US10078892B1 (en) * 2017-03-16 2018-09-18 GM Global Technology Operations LLC Methods and systems for vehicle tire analysis using vehicle mounted cameras
CN109543613A (en) * 2018-11-22 2019-03-29 西安科技大学 Vehicle Speed and Vehicle License Plate Recognition System and method based on TOF imaging
CN111091077B (en) * 2019-12-03 2022-11-11 华中科技大学 Vehicle speed detection method based on image correlation and template matching
CN112309135A (en) * 2020-01-07 2021-02-02 常俊杰 Direction judgment system based on lane detection and corresponding terminal

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102303605A (en) * 2011-06-30 2012-01-04 中国汽车技术研究中心 Multi-sensor information fusion-based collision and departure pre-warning device and method
CN103177582A (en) * 2013-04-22 2013-06-26 杜东 All-in-one machine with video velocity measurement and vehicle license plate recognition

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110298988A1 (en) * 2010-06-04 2011-12-08 Toshiba Alpine Automotive Technology Corporation Moving object detection apparatus and moving object detection method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102303605A (en) * 2011-06-30 2012-01-04 中国汽车技术研究中心 Multi-sensor information fusion-based collision and departure pre-warning device and method
CN103177582A (en) * 2013-04-22 2013-06-26 杜东 All-in-one machine with video velocity measurement and vehicle license plate recognition

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Vehicle Velocity Estimation based on Data Fusion by Kalman Filtering for ABS; M. Amiri et al.; 2012 20th Iranian Conference on Electrical Engineering (ICEE); 17 May 2012; pp. 1495-1500 *
Real-time vehicle detection fusing edge and corner features (融合边缘和角点特征的实时车辆检测技术); 徐东彬 (Xu Dongbin) et al.; 《小型微型计算机系统》 (Journal of Chinese Computer Systems); 30 June 2008; vol. 29, no. 6; pp. 1142-1147 *

Also Published As

Publication number Publication date
CN103686083A (en) 2014-03-26

Similar Documents

Publication Publication Date Title
CN103686083B (en) Real-time speed measurement method based on vehicle-mounted sensor video streaming matching
US11697427B2 (en) Systems and methods for vehicle navigation
EP3544856B1 (en) Determining a road surface characteristic
US11200433B2 (en) Detection and classification systems and methods for autonomous vehicle navigation
US11953599B2 (en) Vehicle navigation based on aligned image and LIDAR information
CN102999759B (en) A kind of state of motion of vehicle method of estimation based on light stream
CN102467821B (en) Road distance detection method based on video image and apparatus thereof
CN104021676A (en) Vehicle positioning and speed measuring method based on dynamic video feature of vehicle
Zhang et al. A real-time curb detection and tracking method for UGVs by using a 3D-LIDAR sensor
US20230127230A1 (en) Control loop for navigating a vehicle
US20230117253A1 (en) Ego motion-based online calibration between coordinate systems
WO2021011617A1 (en) Reducing stored parameters for a navigation system
US20230341239A1 (en) Systems and methods for road segment mapping
CN104267209B (en) Method and system for expressway video speed measurement based on virtual coils
CN106408589B (en) Based on the vehicle-mounted vehicle movement measurement method for overlooking camera
KR102414632B1 (en) Method for determining the location of a fixed object using multiple observation information
US20220245831A1 (en) Speed estimation systems and methods without camera calibration
CN116736322A (en) Speed prediction method integrating camera image and airborne laser radar point cloud data
KR102368262B1 (en) Method for estimating traffic light arrangement information using multiple observation information
Gorobetz et al. Vehicle distance and speed estimation algorithm for computer vision sensor system
Liu Performance evaluation of stereo and motion analysis on rectified image sequences
US11919508B2 (en) Computer vision system for object tracking and time-to-collision
US11893799B2 (en) Computer vision system for object tracking and time-to-collision
US20230154013A1 (en) Computer vision system for object tracking and time-to-collision
EP4195151A2 (en) Computer vision system for object tracking and time-to-collision

Legal Events

Date Code Title Description
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20170111

Termination date: 20171209
