CN103983270B — An image-based processing method for sonar data — Google Patents
An image-based processing method for sonar data
Classifications

 G — PHYSICS
 G01 — MEASURING; TESTING
 G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RE-RADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
 G01S15/00 — Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
 G01S15/02 — Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
 G01S15/06 — Systems determining the position data of a target
 G01S15/08 — Systems for measuring distance only

 G — PHYSICS
 G01 — MEASURING; TESTING
 G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
 G01C21/00 — Navigation; Navigational instruments not provided for in preceding groups G01C1/00–G01C19/00

 G — PHYSICS
 G01 — MEASURING; TESTING
 G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RE-RADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
 G01S7/00 — Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
 G01S7/52 — Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
 G01S7/539 — Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
Abstract
Description
Technical field
The present invention relates to a processing method for sonar range data: by mapping the range data into an image space, it realizes image-based processing of sonar data. The invention belongs to the field of mobile robot navigation.
Background art
With the progress of navigation technology, mobile robots are widely used to help humans complete tasks such as unknown-environment exploration; China's "Yutu" (Jade Rabbit) lunar rover is a typical example. To complete exploration tasks in unknown environments, a robot must be capable of autonomous navigation. Autonomous navigation is usually decomposed into three sub-problems: 1. Where am I? 2. Where am I going? 3. How do I get there? These correspond, respectively, to robot localization, map building, and path planning. Perceiving the robot's own pose and the positions of environmental features is the prerequisite for solving these problems.
At present, mobile robots mainly use vision sensors, laser sensors, and ultrasonic sensors to obtain their own position and the positions of environmental features. Vision sensors provide rich information but demand fast data processing from the robot, and they are sensitive to disturbances such as lighting and occlusion, which limits their range of application. Laser sensors and ultrasonic sensors are range sensors: by measuring the distance between the robot and environmental features, they provide the information required for robot navigation. Laser sensors respond quickly and deliver high-precision information, but they impose strict installation-accuracy requirements and are expensive. In contrast, ultrasonic sensors are simple to install, relatively cheap, and still deliver reasonably accurate information, so they remain widely used. However, because of their large beam angle, the information obtained by ultrasonic sensors is uncertain. Probability theory, fuzzy theory, grey-system theory, and other frameworks have all been used to represent and process ultrasonic information and, ultimately, to realize robot map building, localization, and path planning.
Surveying current sonar information processing methods, they usually process the range data obtained by the sonar directly and build a formal description of environmental features from the statistics of that data; typical examples are grid-based environment description methods and feature-map creation methods. Because of the beam angle, the range information obtained by sonar inevitably contains errors. Current sonar range-data processing methods exploit the statistical information contained in the data as far as possible and have been applied successfully in mobile-robot autonomous navigation. However, when the errors are large, the accuracy of the statistics suffers, and existing methods also find it difficult to mine further useful information from the raw data. For this reason, the present method maps sonar range data from the metric space into an image space and uses image-processing techniques to process the sonar range information. The method can be applied effectively in mobile-robot localization, map building, and path planning.
Summary of the invention
The object of the invention is to map the range data of a sonar into an image space and to process it with image-processing techniques, mining the environmental information contained in the sonar data as fully as possible through this image-based processing. On the one hand, the invention provides a new approach to sonar data processing; on the other hand, its image-domain matching techniques for multiple sonar data improve the precision and robustness of sonar data processing.
The invention provides an image-based processing method for sonar data that maps sonar range information into an image space and processes it with image-processing methods. The method mainly comprises the following steps:
Step 1: filter out measurements produced in the blind zone or exceeding the sonar's measuring range. A sonar datum is denoted (x, y, θ, ra), where (x, y) is the coordinate of the target, θ is the bearing of the target relative to the robot, and ra is the distance from the target to the robot. Invalid data are marked ra = R, where R is the sonar sensor's maximum measuring distance (R = 5000 mm in this invention). The filtered data set is denoted S.
Step 2: filter singular values. In this invention, singular values are sparse measurements that do not correspond to any physically present feature. Given the data set S, compute the Euclidean distance between every pair of points in S, and group the points of S according to their coordinates and these distances. Count the number of points in each group; when a group contains fewer points than a preset threshold Num, remove all points of that group. The data set after singular-value filtering is denoted S_0, which the invention calls the range data space, or metric space for short.
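The patent gives no source code; as an illustrative sketch of step 2, the grouping can be implemented as a simple transitive clustering in Python, where the grouping distance `dist_th` is an assumed parameter and `num_th` stands in for the preset threshold Num:

```python
import math

def filter_singular_values(points, dist_th=300.0, num_th=3):
    """Group points whose pairwise Euclidean distance stays below
    dist_th, then drop every group smaller than num_th points
    (sparse readings that represent no physical feature)."""
    n = len(points)
    labels = [-1] * n
    clusters = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        labels[i] = clusters
        stack = [i]
        while stack:                      # grow the cluster transitively
            j = stack.pop()
            for k in range(n):
                if labels[k] == -1 and math.dist(points[j], points[k]) < dist_th:
                    labels[k] = clusters
                    stack.append(k)
        clusters += 1
    sizes = [labels.count(c) for c in range(clusters)]
    return [p for p, c in zip(points, labels) if sizes[c] >= num_th]
```

A dense group of readings survives the filter, while an isolated reading far from any group is discarded.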
Step 3: extract corner points. Define a sliding window of length N with sliding step s. Sliding the window along the data sequence, take N points at a time from S_0 and denote their horizontal and vertical coordinates X_t = [x_1, x_2, ..., x_N] and Y_t = [y_1, y_2, ..., y_N]. The covariance matrix of X_t and Y_t is:

C_t = [ σ(X_t, X_t)  σ(X_t, Y_t) ; σ(Y_t, X_t)  σ(Y_t, Y_t) ],  σ(X_t, Y_t) = (1/N) Σ_{i=1..N} (x_i − x̄)(y_i − ȳ)   (1)

where x̄ and ȳ are the means of the elements of X_t and Y_t, respectively. Denote the eigenvalues of C_t by λ_max and λ_min, and their ratio by EVR = λ_min/λ_max. As the window slides, all EVR values of S_0 are computed, forming the current EVR curve. At corner points, EVR reaches an extremum. The invention finds the peaks of the current EVR curve by comparing adjacent EVR values; each peak corresponds to one corner point.
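As a sketch of step 3 (not the patent's own code), the EVR of one window and the EVR curve of the whole metric space can be computed with the closed-form eigenvalues of a symmetric 2×2 covariance matrix:

```python
import math

def evr(xs, ys):
    """Eigenvalue ratio lambda_min/lambda_max of the 2x2 covariance
    matrix of the window points (closed form for a symmetric 2x2)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) ** 2 for x in xs) / n                    # var(x)
    c = sum((y - my) ** 2 for y in ys) / n                    # var(y)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n  # cov(x, y)
    mid = (a + c) / 2.0
    d = math.sqrt(((a - c) / 2.0) ** 2 + b ** 2)
    lmax, lmin = mid + d, mid - d
    return lmin / lmax if lmax > 0 else 0.0

def evr_curve(points, N=10, s=1):
    """Slide a window of N points with step s over the sequence and
    collect EVR values; peaks of this curve indicate corner points."""
    return [evr([p[0] for p in points[i:i + N]],
                [p[1] for p in points[i:i + N]])
            for i in range(0, len(points) - N + 1, s)]
```

For collinear window points EVR is near zero; a window straddling a corner yields a markedly larger value, which is why the peaks mark corners.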
Step 4: compute error ellipses. For each peak of the current EVR curve an error ellipse is computed; taking peak O as an example, the procedure is as follows. Let (x, y) be the sonar data point corresponding to O. Take the n data points within the circle centred at (x, y) with radius r′, and denote their horizontal and vertical coordinates X = [x_1, ..., x_n] and Y = [y_1, ..., y_n]. Let C be the covariance matrix of X and Y, with eigenvalues λ_1 and λ_2 (λ_1 ≥ λ_2) and corresponding eigenvectors v_1 and v_2. The error ellipse is then the ellipse, in the coordinate system formed by v_1 and v_2, centred at (x̄, ȳ) with λ_1 and λ_2 as its major and minor axes, where x̄ and ȳ are the means of the elements of X and Y.
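A minimal sketch of the error-ellipse computation of step 4, again using the closed form for a symmetric 2×2 covariance matrix; the returned angle is the orientation of the dominant eigenvector v_1:

```python
import math

def error_ellipse(xs, ys):
    """Error ellipse of the n points around a corner: the centre is
    the mean point, the axes are the covariance eigenvalues
    (lam1 >= lam2), and the orientation is the angle of the dominant
    eigenvector v1 with respect to the x axis."""
    n = len(xs)
    cx, cy = sum(xs) / n, sum(ys) / n
    a = sum((x - cx) ** 2 for x in xs) / n
    c = sum((y - cy) ** 2 for y in ys) / n
    b = sum((x - cx) * (y - cy) for x, y in zip(xs, ys)) / n
    mid = (a + c) / 2.0
    d = math.sqrt(((a - c) / 2.0) ** 2 + b ** 2)
    lam1, lam2 = mid + d, mid - d
    angle = 0.5 * math.atan2(2.0 * b, a - c)   # orientation of v1
    return (cx, cy), lam1, lam2, angle
```

The maximum error circle of step 5 is then the circle with centre (cx, cy) and radius lam1.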
Step 5: compute maximum error circles and build the locally expanded map. With (x̄, ȳ) as centre and λ_1 as radius, build the maximum error circle corresponding to each error ellipse. According to the actual connectivity of the corner points, compute the common external tangent lines of the maximum error circles of actually connected corner points, and further compute the tangent points of these lines with the maximum error circles. Connecting the tangent points of the maximum error circles of actually connected corner points yields the locally expanded map M; the centres and tangent points of all maximum error circles form the key point set im_point of M.
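The common external tangents used in step 5 can be computed geometrically. This illustrative helper (not from the patent) returns, for two separated maximum error circles, the tangent points of the two common external tangent lines; it requires the centre distance to exceed |r1 − r2|:

```python
import math

def external_tangents(c1, r1, c2, r2):
    """Tangent points of the two common external tangents of two
    circles (centres c1, c2; radii r1, r2).  For an external tangent
    the touching radius makes the same angle phi on both circles:
    phi = theta +/- alpha, where theta is the centre-line angle and
    alpha = acos((r1 - r2) / d)."""
    dx, dy = c2[0] - c1[0], c2[1] - c1[1]
    d = math.hypot(dx, dy)
    theta = math.atan2(dy, dx)
    alpha = math.acos((r1 - r2) / d)   # needs d > |r1 - r2|
    tangents = []
    for sign in (+1, -1):
        phi = theta + sign * alpha
        p1 = (c1[0] + r1 * math.cos(phi), c1[1] + r1 * math.sin(phi))
        p2 = (c2[0] + r2 * math.cos(phi), c2[1] + r2 * math.sin(phi))
        tangents.append((p1, p2))
    return tangents
```

Connecting these tangent points, pair by pair along actually connected corners, traces the boundary of the locally expanded map M.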
Step 6: map to the image space. Denote the horizontal and vertical coordinates of the points in im_point by P = (p_1, ..., p_m) and Q = (q_1, ..., q_m); p_max and q_max are the maxima of the horizontal and vertical coordinates, and p_min and q_min their minima. A point (p_i, q_i) of im_point is mapped to a point (h_i, k_i) of the image space by formulas (2) and (3):

h_i = η(p_i − p_min)   (2)
k_i = η(q_i − q_min)   (3)

where η is a proportionality coefficient related to the scales of the sonar data and the image space.
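The source text does not reproduce the bodies of formulas (2) and (3); the sketch below therefore assumes a plain shift-and-scale mapping with proportionality coefficient η, which is consistent with the quantities named in step 6 (p_min, q_min, η) but is an assumption rather than the patent's verbatim formula:

```python
def to_image_space(key_points, eta=0.02):
    """Assumed mapping for step 6: shift each metric point by the
    minimum coordinate of the key point set, scale by eta, and
    truncate to integer pixel coordinates."""
    ps = [p for p, q in key_points]
    qs = [q for p, q in key_points]
    p_min, q_min = min(ps), min(qs)
    return [(int(eta * (p - p_min)), int(eta * (q - q_min)))
            for p, q in key_points]
```

With η = 0.02 as in the embodiment, metric coordinates up to 5000 mm land in a roughly 100-pixel-wide image.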
Step 7: rasterize. Let (h_i, k_i), i = 1, ..., j, be the images of the circle centres of im_point, r_i the radius of the corresponding circle, and j the number of centres. With (h_i, k_i) as centre and r_i as radius, determine circular regions; set the pixel value inside these circular regions and inside the region determined by the images of the tangent points of im_point to 0, and to 1 elsewhere. The region of this binary image space whose pixel value is 0 is the locally expanded map p_M in the image space.
Step 8: location matching. Location matching comprises two parts: rotation-invariant matching and two-dimensional scan matching.
Step A: rotation-invariant matching.
Step A1: let h_max and v_max be the maximum pixel spans of p_M in the horizontal and vertical directions. The invention defines P_c = (int(h_max/2), int(v_max/2)) as the centre point of p_M. With P_c as origin, establish a coordinate system Σ_c along the horizontal and vertical directions of p_M.
Step A2: let D = even(((h_max/2)² + (v_max/2)²)^{1/2}), where "even" means rounding ((h_max/2)² + (v_max/2)²)^{1/2} up to an even number. In the coordinate system Σ_c, draw concentric circles about P_c with radii r and r + Δr; together they bound the annulus R_r. Denote by N_r the number of pixels contained in R_r and by U_r the number of those occupied by p_M; their ratio v(r) = U_r/N_r is called the effective duty cycle corresponding to radius r. Accordingly, compute all effective duty cycles from the origin out to D/2, forming the effective duty-cycle vector V = [v(0), v(Δr), ..., v(r), ..., v(D/2)]^T.
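Step A2 can be sketched as follows; the occupancy set, the bin width `dr` (the patent's Δr), and the function name are illustrative choices, not the patent's code:

```python
import math

def duty_cycle_vector(occupied, h_max, v_max, dr=1.0):
    """Effective duty-cycle vector of a binary map: for each annulus
    [r, r+dr) around the centre point, the fraction of its pixels
    that belong to the map (the set `occupied`)."""
    cx, cy = h_max // 2, v_max // 2
    D = math.hypot(h_max / 2.0, v_max / 2.0)
    D = math.ceil(D / 2.0) * 2            # round up to an even number
    n_bins = int(D / 2 / dr) + 1          # radii 0 .. D/2
    total = [0] * n_bins
    occ = [0] * n_bins
    for x in range(h_max):
        for y in range(v_max):
            b = int(math.hypot(x - cx, y - cy) / dr)
            if b < n_bins:
                total[b] += 1
                occ[b] += (x, y) in occupied
    return [o / t if t else 0.0 for o, t in zip(occ, total)]
```

Because each annulus is rotationally symmetric about P_c, this vector is unchanged when the map is rotated, which is what makes the step A matching rotation-invariant.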
Step A3: let the effective duty-cycle vectors of the current and of a historical p_M be V_l = [v_l(0), v_l(Δr), ..., v_l(r), ..., v_l(D/2)]^T and V_d = [v_d(0), v_d(Δr), ..., v_d(r), ..., v_d(D/2)]^T. The matching rate of V_l and V_d is their correlation coefficient:

p_rpt = Σ_r (v_l(r) − v̄_l)(v_d(r) − v̄_d) / [ (Σ_r (v_l(r) − v̄_l)²)(Σ_r (v_d(r) − v̄_d)²) ]^{1/2}   (4)

where v̄_l and v̄_d are the means of the elements of V_l and V_d.
Step B: two-dimensional scan matching.
Step B1: following the coordinate-system construction of step A1, establish the coordinate systems of the current p_M and of the historical p_M.
Step B2: let N be the number of pixels of the current binary image space in both the horizontal and vertical directions, and let N_ri and N_ci be the numbers of pixels occupied by the current p_M in the i-th row and the i-th column, respectively. The effective duty cycles of the i-th row and column are then w_ri = N_ri/N and w_ci = N_ci/N. In general, because the current and historical p_M have different orientations, rows and columns with the same index have different duty cycles. The invention therefore matches the historical p_M against the current p_M by the following rules:
2.1 Fix the current p_M and denote the effective duty cycles of its i-th row and column by w_ri^l and w_ci^l. Compute the effective duty cycles of all its rows and columns, generating the effective duty-cycle vectors W_r^l and W_c^l of the current p_M.
2.2 As in 2.1, denote the effective duty-cycle vectors of the historical p_M by W_r^d and W_c^d.
2.3 Use formula (4) to compute the correlation coefficients of the two dimensions, λ_r(0) and λ_c(0). Rotate the historical p_M counterclockwise by 1 degree and recompute the correlation coefficients of the current and historical p_M, denoted λ_r(1) and λ_c(1). Proceeding in this way, compute the 360 groups of correlation coefficients over one full revolution, generating the correlation-coefficient vectors λ_r and λ_c of the two dimensions.
Step B3: if the current p_M and the historical p_M represent the same local environment, some consecutive elements of the correlation-coefficient vectors λ_r and λ_c will exceed a set threshold λ_th; otherwise, the elements of λ_r and λ_c all stay below λ_th, or at most a few isolated elements exceed it. Take the means of the elements of λ_r and λ_c that exceed λ_th, denoted λ̄_r and λ̄_c; these determine the matching rate P_2d of the current and historical p_M.
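The rotate-and-correlate loop of rules 2.1–2.3 and step B3 can be sketched as below, for one dimension only and assuming the duty-cycle vectors of the rotated historical map have been precomputed; the dictionary `history_by_angle` and the helper names are illustrative, and `pearson` plays the role the patent assigns to formula (4):

```python
def pearson(u, v):
    """Correlation coefficient between two equal-length vectors
    (the role played by formula (4))."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = (sum((a - mu) ** 2 for a in u) *
           sum((b - mv) ** 2 for b in v)) ** 0.5
    return num / den if den else 0.0

def scan_match(current_vec, history_by_angle, lam_th=0.8):
    """Correlate the current duty-cycle vector against the history
    map rotated through 360 one-degree steps, then average the
    coefficients exceeding the threshold lam_th (step B3)."""
    coeffs = [pearson(current_vec, history_by_angle[a]) for a in range(360)]
    above = [c for c in coeffs if c > lam_th]
    return sum(above) / len(above) if above else 0.0
```

In the full method this is done for the row and the column dimension separately, and the two resulting means λ̄_r and λ̄_c are combined into P_2d.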
Step 9: image matching. Use the SIFT operator to extract the interest points of the current p_M and of the historical p_M, and match the two images. The matching rate is denoted P_p; in this invention, P_p is the ratio of the number of correctly matched interest-point pairs to the number of all matched interest-point pairs.
Step 10: matching-rate fusion. The invention specifies the final matching rate as P_f = α·p_rpt + β·p_2d + γ·p_p, where α, β, and γ are the weights of the respective matching rates and satisfy α + β + γ = 1. Their values are determined by the matching precision of each method in the actual application. If P_f does not meet the threshold requirement, the current p_M is stored; otherwise, the match information can be used for tasks such as mobile-robot localization, map building, and path planning.
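The fusion of step 10 is a plain weighted sum; a sketch with the weights later used in the embodiment (α = 0.33, β = 0.37, γ = 0.30) as defaults:

```python
def fused_matching_rate(p_rpt, p_2d, p_p, alpha=0.33, beta=0.37, gamma=0.30):
    """Final matching rate P_f = alpha*p_rpt + beta*p_2d + gamma*p_p;
    the weights must sum to one."""
    assert abs(alpha + beta + gamma - 1.0) < 1e-9
    return alpha * p_rpt + beta * p_2d + gamma * p_p
```

Because the weights sum to one, P_f stays in [0, 1] whenever the three component rates do.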
The invention has the following advantages:
1. Sonar range data are mapped into an image space and processed with image-processing techniques, providing a new technical route for sonar data processing.
2. Building a locally expanded map improves the robustness of sonar data processing.
3. Three image-domain matching methods for sonar data are proposed; the three methods complement one another and improve matching precision.
4. The image-based processing of sonar data can be applied effectively in mobile-robot localization, map building, and path planning, improving the precision and robustness of robot navigation tasks.
Brief description of the drawings
Fig. 1 is the flow chart of the image-based sonar data processing algorithm.
Fig. 2(a) and Fig. 2(b) show the data sets S and S_0, respectively; Fig. 2(b) also shows the error ellipses and maximum error circles.
Fig. 3(a) and Fig. 3(b) are schematic diagrams of the rotation-invariant matching method and of its matching result, respectively.
Fig. 4(a) and Fig. 4(b) are schematic diagrams of the two-dimensional scan matching method and of its matching result, respectively.
Fig. 5(a) and Fig. 5(b) show the interest-point extraction results for the current p_M and the historical p_M, respectively.
Fig. 6(a) shows the SIFT interest-point matching between the current and historical p_M; Fig. 6(b) shows the interest-point self-matching of the current p_M.
Detailed description of the invention
The present embodiment is implemented on the premise of the technical scheme of the invention; detailed implementation modes and processes are given, but the scope of the invention is not limited to the following embodiment.
The embodiment collects data with the 16 sonar sensors of a Pioneer 3-DX robot in a corridor-office environment and implements the image-based processing of the sonar data using mixed programming in Visual Studio 2008, OpenCV 1.0.0, and Matlab R2009a; the algorithm flow is shown in Fig. 1. The specific steps are as follows:
(1) Collect data of the current local environment with the robot's sonar and separate out the sonar data points with ra = 5000 mm; the separation result is shown in Fig. 2(a). Filter the singular values in the sonar data and build the metric space S_0, as shown in Fig. 2(b).
(2) This embodiment takes N = 10 and s = 1, i.e. X_t = [x_1, x_2, ..., x_10] and Y_t = [y_1, y_2, ..., y_10]. First compute the covariance matrix C_t of X_t and Y_t by formula (1); then use the eig() function provided by Matlab to compute the eigenvectors and corresponding eigenvalues of C_t, and further compute EVR = λ_min/λ_max.
(3) Slide the window with step s until fewer than N data points remain in the window. Repeat step (2) to compute the EVR of every window, forming the EVR curve of the current metric space S_0.
(4) Compare adjacent EVR values to find the peaks of the current EVR curve. Suppose O is one such peak, with corresponding sonar data point (x, y). Take the n data points in the circle centred at (x, y) with radius r′, with coordinates X = [x_1, ..., x_n] and Y = [y_1, ..., y_n]. Compute the covariance matrix C of X and Y, its eigenvalues λ_1 and λ_2, and the corresponding eigenvectors v_1 and v_2. In the coordinate system formed by v_1 and v_2, the ellipse centred at (x̄, ȳ) with λ_1 and λ_2 as major and minor axes is the error ellipse, where x̄ and ȳ are the means of the elements of X and Y; the circle centred at (x̄, ȳ) with radius λ_1 is the maximum error circle.
(5) Compute, as in step (4), the maximum error circle corresponding to each peak of the current EVR curve. According to the actual connectivity of the corner points, compute the common external tangent lines of the maximum error circles of actually connected corner points and, further, their tangent points with the maximum error circles; build the locally expanded map M and determine its key point set im_point (see Fig. 3(a) and Fig. 4(a)).
(6) Denote the horizontal and vertical coordinates of the points in im_point by P = (p_1, ..., p_m) and Q = (q_1, ..., q_m), and determine p_max, q_max, p_min, and q_min. Map each point (p_i, q_i) of im_point to the point (h_i, k_i) of the image space by formulas (2) and (3); in this embodiment η = 0.02.
(7) Determine (h_i, k_i) and r_i, i = 1, ..., j. With (h_i, k_i) as centre and r_i as radius, determine circular regions; set the pixel value in the region determined by these circular regions and by the images of the tangent points of im_point to 0, and to 1 elsewhere, yielding the locally expanded map p_M in the image space (see Fig. 5(a) and Fig. 5(b)).
(8) Location matching.
Step A: rotation-invariant matching.
Step A1: compute the maximum pixel spans h_max and v_max of p_M in the horizontal and vertical directions, then the centre point P_c = (int(h_max/2), int(v_max/2)) of p_M, and establish the coordinate system Σ_c, as shown in Fig. 3(a).
Step A2: first compute D = even(((h_max/2)² + (v_max/2)²)^{1/2}); then compute the effective duty-cycle vectors of the current and historical p_M, V_l = [v_l(0), v_l(Δr), ..., v_l(r), ..., v_l(D/2)]^T and V_d = [v_d(0), v_d(Δr), ..., v_d(r), ..., v_d(D/2)]^T, and compute their matching rate p_rpt by formula (4) (see Fig. 3(b)).
Step B: two-dimensional scan matching.
Step B1: following the coordinate-system construction of step A1, establish the coordinate systems of the current p_M and of the historical p_M, as shown in Fig. 4(a).
Step B2: compute the effective duty-cycle vectors of the current p_M.
Step B3: compute the effective duty-cycle vectors of the historical p_M.
Step B4: use formula (4) to compute the correlation-coefficient vectors λ_r and λ_c of the two dimensions.
Step B5: compute P_2d (see Fig. 4(b)).
(9) Use the SIFT operator to extract the interest points of the current p_M and of the historical p_M (see Fig. 5(a) and Fig. 5(b)) and perform image matching (see Fig. 6(a) and Fig. 6(b)); the matching rate is denoted P_p.
(10) Compute the final matching rate P_f = α·p_rpt + β·p_2d + γ·p_p. Repeated experiments show that the precision of the three matching algorithms, from high to low, is: two-dimensional scan matching, rotation-invariant matching, image matching; this experiment therefore takes α = 0.33, β = 0.37, γ = 0.30. If P_f does not meet the threshold requirement, the current p_M is stored; otherwise, the match information can be used for tasks such as robot localization, map building, and path planning.
Claims (11)
Priority Applications (1)

Application Number: CN201410210477.2A
Priority Date / Filing Date: 2014-05-16
Title: An image-based processing method for sonar data

Publications (2)

CN103983270A (application publication): 2014-08-13
CN103983270B (granted patent): 2016-09-28