CN103954283A - Scene matching/visual odometry-based inertial integrated navigation method - Google Patents


Info

Publication number
CN103954283A
CN103954283A (application CN201410128459.XA)
Authority
CN
China
Prior art keywords
image
unmanned plane
navigation
point
scene matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410128459.XA
Other languages
Chinese (zh)
Other versions
CN103954283B (en)
Inventor
赵春晖
王荣志
张天武
潘泉
马鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an Chenxiang Zhuoyue Technology Co ltd
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN201410128459.XA priority Critical patent/CN103954283B/en
Publication of CN103954283A publication Critical patent/CN103954283A/en
Application granted granted Critical
Publication of CN103954283B publication Critical patent/CN103954283B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G01C21/12: Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: Inertial navigation combined with non-inertial navigation instruments
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005: Initial alignment, calibration or starting-up of inertial devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Image Analysis (AREA)
  • Navigation (AREA)

Abstract

The invention relates to an inertial integrated navigation method based on scene matching and visual odometry. The method comprises the following steps: computing the homography matrices of the UAV's aerial real-time image sequence according to the visual odometry principle, and recursively accumulating the relative displacement between consecutive real-time frames to obtain the UAV's current position; introducing a FREAK-feature-based scene matching algorithm for aided correction, because visual odometry navigation accumulates error over time, and performing high-precision positioning inside matching-suitable areas to effectively compensate the error accumulated by long-term visual odometry navigation, scene matching having the advantages of high positioning accuracy, strong autonomy, and resistance to electromagnetic interference; and establishing the error model of the inertial navigation system and the measurement model of the visual data, obtaining an optimal estimate by Kalman filtering, and correcting the inertial navigation system. The method effectively improves navigation accuracy and helps improve the UAV's autonomous flight capability.

Description

Inertial integrated navigation method based on scene matching and visual odometry
Technical field
The invention belongs to the field of unmanned aerial vehicle (UAV) navigation and positioning technology, and relates to an inertial integrated navigation method based on scene matching and visual odometry.
Background art
High-precision, highly dynamic, and highly reliable autonomous navigation is one of the key technologies for ensuring that a UAV completes its various tasks, and is of great significance for strengthening UAV autonomy and improving operational effectiveness. Navigation modes include satellite navigation, radio navigation, inertial navigation, and others. Inertial navigation (INS), with its outstanding advantage of high autonomy, occupies a special position in navigation technology; existing UAV navigation systems all form integrated navigation systems with INS at the core to accomplish autonomous navigation in complex environments.
At present, the main UAV navigation mode is the INS/GPS integrated navigation system. However, in unknown, dynamically changing, complex environments, the GPS signal is weak, vulnerable to electromagnetic interference (EMI), and may even stop working in signal blind zones, causing large errors in the integrated navigation system with unpredictable consequences. Meanwhile, China's BeiDou satellite navigation system is still under development, and its reliability and accuracy do not yet meet high-precision navigation requirements such as those of military applications.
Summary of the invention
Technical problem to be solved
To avoid the deficiencies of the prior art, the present invention proposes an inertial integrated navigation method based on scene matching and visual odometry, achieving fully autonomous UAV navigation in complex environments.
Technical scheme
An inertial integrated navigation method based on scene matching and visual odometry, characterized by the following steps:
Step 1: during UAV flight, acquire ground image a in real time with the airborne downward-looking camera;
Step 2: use image a and the previous frame image a' to determine the visual odometry of the UAV, as follows:
A. Use the Harris corner detection algorithm to extract feature points from the two consecutive real-time images a and a';
B. For each feature point (x, y)ᵀ in image a', search the square region centered at (x, y)ᵀ in image a for the match point with the highest neighborhood cross-correlation; meanwhile, for each feature point in image a, search the square region centered at (x, y)ᵀ in image a' for the match point with the highest neighborhood cross-correlation;
C. Use the RANSAC robust estimation method to obtain the maximal consistent point set and an estimate of the homography matrix H: first randomly draw 4 pairs of matched points to form a random sample and compute a homography matrix H; then, for each matched pair from step B, compute the distance dis; set a threshold t, and if dis < t the pair is an inlier, otherwise reject it; count the inliers; repeat this process k times and select the H with the largest inlier count;
D. Re-estimate H from all matched pairs classified as inliers of the H with the largest inlier count; use the re-estimated H to compute, for each feature point (x, y)ᵀ in image a, the corresponding point (x1, y1)ᵀ in image a'; then, with the method of step B, search the square region centered at (x, y)ᵀ in image a for the match point with the highest neighborhood cross-correlation to each point (x1, y1)ᵀ of image a'; meanwhile, search the square region centered at (x1, y1)ᵀ in image a' for the match point with the highest neighborhood cross-correlation to each feature point (x, y)ᵀ of image a;
E. Repeat steps B through D until the number of matched pairs is stable;
Step 3: when the UAV enters a matching-suitable area, coarsely locate the UAV with the inertial navigation system, find the reference image b corresponding to the coarse position in the onboard storage device, and perform scene matching with real-time image a to determine the UAV's position;
Step 4: use the UAV position obtained in step 3 to correct the error accumulated by visual odometry within the matching-suitable area, obtaining a more accurate UAV position;
Step 5: use the inertial navigation system to provide the UAV's current position and attitude;
Step 6: use the error equation of the inertial navigation system as the state equation of the integrated navigation system, choose the East-North-Up frame as the navigation frame, and take the difference between the position from step 4 and the position from step 5 as the measurement; estimate the drift error of the inertial system with a Kalman filter, use this drift error to correct the inertial navigation system, and obtain the fused navigation parameters.
The process of step 3 is as follows: first acquire ground image a in real time with the UAV's airborne downward-looking camera and preprocess it to obtain image c; then extract the FAST features of images b and c; describe the extracted FAST features with the FREAK feature descriptor; perform feature matching under the nearest-Hamming-distance similarity criterion to obtain the matched position, i.e. the current position of the UAV.
The ground image a is an optical image or an infrared image.
Beneficial effects
The inertial integrated navigation method based on scene matching and visual odometry proposed by the present invention computes, according to the visual odometry principle, the homography matrices of the UAV's aerial real-time image sequence and recursively accumulates the relative displacement between consecutive frames to obtain the UAV's current position. Because visual odometry navigation accumulates error over time, a FREAK-feature-based scene matching algorithm is introduced for aided correction; scene matching has high positioning accuracy, strong autonomy, and resistance to electromagnetic interference, and performs high-precision positioning in matching-suitable areas, effectively compensating the error accumulated by long-term visual odometry navigation. The error model of the inertial navigation system and the measurement model of the visual data are established, an optimal estimate is obtained by Kalman filtering, and the inertial navigation system is corrected. The invention effectively improves navigation accuracy and helps improve the UAV's autonomous flight capability.
However, research on scene matching suitability shows that only matching inside suitable areas can provide reliable position information for the UAV; in unsuitable areas such as deserts and seas, scene matching cannot work properly.
Visual odometry is a technique that estimates motion information by processing two consecutive image frames; as a new navigation and positioning mode it has been successfully applied to autonomous mobile robots. During UAV flight, the two consecutive frames are taken by the same sensor on the same UAV platform, in the same time period, under identical conditions; they have the same noise distribution and imaging error, and are free of imaging differences caused by changing natural conditions, so visual odometry can provide relatively accurate position information in unsuitable areas. Moreover, the two consecutive frames are much smaller than the reference map used in scene matching, so matching time is shorter, which improves the real-time performance of the navigation system.
Therefore, the present invention proposes a UAV inertial integrated navigation method based on scene matching and visual odometry. The method has strong autonomy, light payload, low equipment cost, and strong interference resistance, providing a feasible technical scheme for the engineering application of UAV navigation systems. It is a computer-vision-based navigation mode with the notable advantages of high positioning accuracy, strong resistance to electronic interference, small airborne equipment size, and low cost; it can eliminate the error accumulated by long-term operation of the inertial navigation system, substantially improve the positioning accuracy of the inertial navigation system, and serve as a backup navigation method when GPS fails or its accuracy degrades.
Brief description of the drawings
Fig. 1 is the framework flowchart of the present invention
Fig. 2 is the two-view homography relation diagram
Fig. 3 is the FREAK descriptor sampling pattern
Detailed description of the embodiments
The invention is further described below with reference to the embodiments and the accompanying drawings:
In the inertial integrated navigation method based on scene matching and visual odometry of the present invention, ground scenes of the UAV's planned flight area obtained by reconnaissance are used as the reference map. When a UAV carrying a vision sensor flies over the planned area, it acquires a local ground-scene image of a certain size, generated according to parameters such as pixel resolution, flight altitude, and field of view, as the real-time image. By matching the real-time image against the reference map, the position of the real-time image within the reference map is found, and the UAV's current precise position is determined. Scene matching navigation accuracy is not affected by time, and has the advantages of strong resistance to electromagnetic interference, good autonomy, high accuracy, low cost, small volume, and rich information; combining it with inertial navigation can greatly improve the overall performance of the navigation system. Research on inertial/scene-matching autonomous integrated navigation for UAVs therefore helps shed dependence on external support systems and strengthens the UAV's reliability, maneuverability, concealment, interference resistance, and survivability. The method mainly comprises five parts: inertial navigation, scene matching, visual odometry, fusion correction, and integrated-navigation Kalman filtering, and includes the following steps:
First step: acquire ground image a in real time with the airborne downward-looking camera, and determine the visual odometry of the UAV by estimating the homography matrix between real-time image a and the previous real-time image a'.
The concrete implementation steps are as follows:
1. Use the Harris corner detection algorithm to extract feature points from the two consecutive real-time images a and a';
2. For each feature point (x, y)ᵀ in a', search the square region centered at (x, y)ᵀ in a for the match point with the highest neighborhood cross-correlation. Likewise, for each feature point in a, search a' for its match point, finally determining the matched feature pairs;
3. Use the RANSAC robust estimation method to obtain the maximal consistent point set and compute the homography matrix H between the two consecutive real-time images a and a'. The concrete flow comprises: (1) randomly draw 4 pairs of matched points to form a random sample and compute a homography matrix H; (2) for each matched pair from step 2, compute the distance dis; (3) set a threshold t (t is less than half the side length of the real-time image); if dis < t the pair is an inlier, otherwise reject it; count the inliers; (4) repeat the above process k times and select the H with the largest inlier count;
4. Recompute the homography matrix H from all matched points classified as inliers, and use H to determine the position of the search region in step 2, thereby determining the matched pairs more accurately;
5. Repeat steps 2 through 4 until the number of matched pairs is stable, and compute the homography matrix H from the finally determined stable matched pairs.
6. Using the homography matrix H from step 5, together with the attitude information provided by the inertial unit and the altitude information provided by the barometric altimeter, obtain the camera's relative displacement between the two consecutive frames; by accumulating the camera displacements onto the previously estimated position, recursively compute the UAV's current position.
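The recursive position computation in step 6 reduces to summing per-frame displacements onto the last known position. A minimal pure-Python sketch (the function name and the displacement values are illustrative, and the displacements are assumed to be already resolved into the navigation frame):

```python
# Dead reckoning for visual odometry: the current position is the initial
# position plus the accumulated per-frame relative displacements recovered
# from the homographies (illustrative values).

def accumulate_position(initial_pos, displacements):
    """initial_pos: (x, y); displacements: list of (dx, dy), one per frame pair."""
    x, y = initial_pos
    for dx, dy in displacements:
        x += dx
        y += dy
    return (x, y)

# Start at the origin, three frame-to-frame displacements.
pos = accumulate_position((0.0, 0.0), [(1.0, 0.5), (0.25, 0.25), (0.75, 0.25)])
print(pos)  # (2.0, 1.0)
```

Note that any error in one displacement propagates to every later position, which is exactly the cumulative error that the scene matching correction below is introduced to reset.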
Second step: when the UAV enters a matching-suitable area, coarsely locate the UAV with the inertial navigation system and, according to the coarse position, cut a circular reference subimage b from the pre-stored aerial reference map of the UAV's flight area; the radius of the reference subimage must be greater than 1.5 times the product of the inertial unit's average drift per unit time and the UAV's flight time between two matching-suitable areas. Perform scene matching between real-time image a and reference subimage b to determine the UAV's position. The concrete scene matching steps are as follows:
1. Convert real-time image a and reference subimage b to grayscale, and extract the FAST feature points of both.
2. For each FAST feature point, compute its FREAK feature descriptor as follows:
As shown in Fig. 3, FREAK builds a circular sampling pattern centered on each feature point, dense at the center and sparse toward the periphery, with the number of samples decreasing exponentially. Each sampling circle is pre-smoothed with a different Gaussian kernel whose standard deviation is proportional to the size of the circle.
The FREAK descriptor consists of a bit string of differences of Gaussian-smoothed intensities. The criterion T of a sampling pair is defined as

T(P_a) = 1 if I(P_a^{r1}) > I(P_a^{r2}), and 0 otherwise,

where P_a denotes a sampling pair and I(·) denotes the smoothed intensity of a sampling circle. Selecting N sampling pairs, the binary criterion is defined as

F = Σ_{0 ≤ a < N} 2^a · T(P_a),

which yields an N-dimensional binary bit string.
The selection of sampling pairs is a coarse-to-fine process, with the following criteria:
A) Create a large matrix D holding 50,000 feature points, one feature point per row; comparing the intensities of the 43 sampling circles of each feature point pairwise yields a descriptor of roughly 1000 dimensions;
B) Compute the mean and variance of each column of D; the variance is largest when the mean is 0.5, which guarantees the distinctiveness of the feature descriptor;
C) Sort the columns by variance, placing the largest-variance column first;
D) Keep the first N columns to describe each feature point, obtaining an N-dimensional binary bit string. The present invention selects N = 256.
As in Fig. 3, select M (M = 45) symmetric sampling pairs from all the sampling circles and compute the local gradient

O = (1/M) Σ_{P_o ∈ G} ( I(P_o^{r1}) − I(P_o^{r2}) ) · ( P_o^{r1} − P_o^{r2} ) / ‖ P_o^{r1} − P_o^{r2} ‖,

where P_o^{r1} and P_o^{r2} are the 2-D position vectors of the corresponding sampling circles and I(·) denotes the smoothed intensity of a sampling circle.
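The binary test T and the accumulation F = Σ_{0 ≤ a < N} 2^a · T(P_a) described above can be sketched in a few lines of pure Python (the intensity values are made up; a real descriptor would use N = 256 smoothed sampling-circle intensities):

```python
# FREAK-style binary descriptor: each sampling pair P_a contributes one bit
# T(P_a) = 1 if the smoothed intensity of its first receptive field exceeds
# that of the second, and the bits are packed as F = sum of 2^a * T(P_a).

def freak_descriptor(pairs):
    """pairs: list of (I_r1, I_r2) smoothed-intensity pairs, indexed a = 0..N-1."""
    f = 0
    for a, (i1, i2) in enumerate(pairs):
        t = 1 if i1 > i2 else 0  # binary test T(P_a)
        f |= t << a              # add 2^a * T(P_a)
    return f

# Four sampling pairs -> a 4-bit descriptor.
desc = freak_descriptor([(120, 80), (60, 90), (200, 10), (55, 55)])
print(bin(desc))  # 0b101 : bits a = 0 and a = 2 are set
```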
3. Compute the nearest Hamming distance between the FREAK descriptors of candidate matches. To filter false matches, the present invention sets a threshold of 10: pairs above this threshold are filtered out directly, and two points below it are considered a mutual match.
4. Perform feature matching according to the method of steps 2 and 3; determine the position of real-time image a within reference subimage b from the positions of its feature points in b, thereby determining the aircraft's current position.
Third step: use the UAV position obtained in the second step to correct the error accumulated by visual odometry, obtaining the more accurate UAV position P_vision, and provide the UAV's current position P_ins and attitude A_ins from the inertial navigation system.
Assuming that the UAV position error obtained by scene matching inside a matching-suitable area is very small, this step directly resets the position computed by visual odometry with the scene matching result.
Fourth step: use a Kalman filter to estimate the navigation state from P_vision, P_ins, and A_ins, obtaining the optimal navigation information.
The specific embodiment is as follows:
1. During UAV flight, acquire ground image a in real time with the airborne downward-looking camera.
Use the UAV's airborne downward-looking optical camera or infrared camera to acquire the ground image sequence in real time; only the current frame and the previous frame need to be kept.
2. Use image a and the previous frame image a' to determine the visual odometry of the UAV by estimating the homography matrix between a and a'.
As shown in Fig. 2, with the UAV in flight the airborne camera takes two consecutive image frames I_1 and I_2 at different poses, with corresponding camera frames F and F'. Suppose a point P on the plane π maps to point p in image I_1 and point p' in image I_2, with corresponding vectors p⃗ in F and p⃗' in F'. Then

p⃗' = R_{c1}^{c2} p⃗ + t,

where R_{c1}^{c2} and t denote the rotation and translation of the UAV between the two shots.
From Fig. 2, n is the normal vector of the plane π at c1, and d is the distance from c1 to the plane π, so

nᵀ p⃗ = d.

Therefore

p⃗' = R_{c1}^{c2} p⃗ + t nᵀ p⃗ / d = H_c p⃗,

where

H_c = R_{c1}^{c2} + t nᵀ / d

is called the homography matrix of the plane π with respect to the cameras.
Taking the camera intrinsic matrix K into account,

p = K p⃗,  p' = K p⃗',

therefore

p' = K H_c K⁻¹ p = H p,

where

H = K ( R_{c1}^{c2} + t nᵀ / d ) K⁻¹

is called the homography matrix of the plane π with respect to the two image frames.
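The composition H = K (R + t nᵀ / d) K⁻¹ can be checked numerically with toy values: no rotation, a translation t along x, ground normal n = (0, 0, 1)ᵀ, and depth d. The intrinsic values below are chosen for exact arithmetic and are purely illustrative:

```python
# Pixel-frame homography H = K (R + t n^T / d) K^(-1) for a planar scene,
# evaluated on toy values. With R = I and t along x, the induced homography
# is a constant pixel shift of f * t_x / d.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

f, cx, cy = 128.0, 320.0, 240.0
K = [[f, 0.0, cx], [0.0, f, cy], [0.0, 0.0, 1.0]]
K_inv = [[1.0 / f, 0.0, -cx / f], [0.0, 1.0 / f, -cy / f], [0.0, 0.0, 1.0]]

R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]  # no rotation
t = [2.0, 0.0, 0.0]                                      # camera shift along x
n = [0.0, 0.0, 1.0]                                      # ground-plane normal
d = 8.0                                                  # distance to the plane

H_c = [[R[i][j] + t[i] * n[j] / d for j in range(3)] for i in range(3)]
H = matmul(matmul(K, H_c), K_inv)

p = [320.0, 240.0, 1.0]  # homogeneous pixel at the principal point
q = [sum(H[i][k] * p[k] for k in range(3)) for i in range(3)]
print(q[0] / q[2], q[1] / q[2])  # 352.0 240.0 : shifted by f * t_x / d = 32 px
```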
According to the literature, the homography matrix is a 3 × 3 matrix with 8 degrees of freedom, so 4 pairs of matched points between the two images are needed to compute it. To prevent degenerate configurations, the plane containing the 4 point pairs must not pass through the camera optical center, and no 3 of the 4 space points may be collinear.
In addition, since the camera is rigidly attached to the UAV, the camera attitude can be taken to coincide with the UAV attitude provided by the airborne inertial navigation device, namely yaw angle ψ, pitch angle θ, and roll angle γ. Therefore

R_{c1}^{c2} = R_n^{c2} R_{c1}^n,

where R_n^{c2}, the rotation matrix from the navigation frame to the camera frame at c2, is

R_n^{c2} =
⎡ cos γ cos ψ − sin θ sin γ sin ψ    cos γ sin ψ + sin θ sin γ cos ψ    −cos θ sin γ ⎤
⎢ −cos θ sin ψ                       cos θ cos ψ                        sin θ        ⎥
⎣ sin γ cos ψ + sin θ cos γ sin ψ    sin γ sin ψ − sin θ cos γ cos ψ    cos θ cos γ  ⎦

and R_{c1}^n is obtained likewise.
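As a sanity check on the rotation matrix above, the following pure-Python sketch builds R_n^{c2} from yaw ψ, pitch θ, and roll γ and verifies that it is orthonormal (R Rᵀ = I); the function name and the sample angles are illustrative:

```python
import math

# Build R_n^{c2} from yaw psi, pitch theta, roll gamma, following the
# matrix given in the text, and verify R R^T = I.

def rotation_from_euler(psi, theta, gamma):
    cps, sps = math.cos(psi), math.sin(psi)
    cth, sth = math.cos(theta), math.sin(theta)
    cga, sga = math.cos(gamma), math.sin(gamma)
    return [
        [cga * cps - sth * sga * sps, cga * sps + sth * sga * cps, -cth * sga],
        [-cth * sps,                  cth * cps,                   sth],
        [sga * cps + sth * cga * sps, sga * sps - sth * cga * cps, cth * cga],
    ]

R = rotation_from_euler(0.3, -0.1, 0.2)  # arbitrary sample angles (radians)
for i in range(3):
    for j in range(3):
        rij = sum(R[i][k] * R[j][k] for k in range(3))  # (R R^T)[i][j]
        assert abs(rij - (1.0 if i == j else 0.0)) < 1e-12
print("orthonormal")
```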
If the ground is assumed planar, n = (0, 0, 1)ᵀ. After computing the homography matrix, use the above formulas together with the attitude information provided by inertial navigation and the altitude provided by the barometric altimeter to obtain the camera's relative displacement between the two consecutive frames; by accumulating the camera displacements onto the previously estimated position, recursively compute the UAV's current position. The concrete implementation steps are as follows:
2.1. Use the Harris corner detection algorithm to extract feature points from the two consecutive real-time images a and a';
2.2. For each feature point (x, y)ᵀ in a', search the square region centered at (x, y)ᵀ in a for the match with the highest neighborhood cross-correlation; symmetrically, for each feature point in a, search a' for its match, finally determining the matched feature pairs;
2.3. Use the RANSAC robust estimation method to obtain the maximal consistent point set and the estimate of the homography matrix H. The concrete flow comprises: (1) randomly draw 4 pairs of matched points to form a random sample and compute a homography matrix H; (2) for each matched pair from step 2.2, compute the distance dis; (3) set a threshold t; if dis < t the pair is an inlier, otherwise reject it; count the inliers; (4) repeat the above process k times and select the H with the largest inlier count;
2.4. Re-estimate H from all matched pairs classified as inliers, and use H to determine the search region in step 2.2, determining the matched pairs more accurately;
2.5. Repeat steps 2.2 through 2.4 until the number of matched pairs is stable.
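The RANSAC loop of step 2.3 can be illustrated with a deliberately simplified model: a 2-D translation estimated from a single matched pair, instead of the full 8-DOF homography estimated from 4 pairs (which would need a DLT solver). The data and threshold below are made up:

```python
import random

# Simplified RANSAC: hypothesize a translation from one randomly chosen
# matched pair, count pairs within the distance threshold as inliers, and
# keep the hypothesis with the largest inlier count.

def ransac_translation(pairs, t_thresh, k=100, seed=0):
    """pairs: list of ((x, y), (x2, y2)) matches; returns ((dx, dy), inliers)."""
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(k):
        (x, y), (x2, y2) = rng.choice(pairs)   # minimal sample: 1 pair
        dx, dy = x2 - x, y2 - y                # candidate model
        inliers = sum(1 for (a, b), (c, d) in pairs
                      if abs(c - a - dx) < t_thresh and abs(d - b - dy) < t_thresh)
        if inliers > best_inliers:
            best, best_inliers = (dx, dy), inliers
    return best, best_inliers

# Five matches consistent with a (5, -3) shift, plus one gross outlier.
matches = [((0, 0), (5, -3)), ((1, 2), (6, -1)), ((3, 1), (8, -2)),
           ((4, 4), (9, 1)), ((2, 5), (7, 2)), ((1, 1), (40, 40))]
model, count = ransac_translation(matches, t_thresh=0.5)
print(model, count)  # (5, -3) 5 : the outlier is rejected
```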
3. When the UAV enters a matching-suitable area, coarsely locate the UAV with the inertial navigation system, find the reference image b corresponding to the coarse position in the onboard storage device, and perform scene matching with real-time image a to determine the UAV's position.
The present invention matches the real-time image against the reference map with a FREAK-descriptor-based scene matching algorithm. As shown in Fig. 3, the FREAK descriptor adopts a retina-like sampling pattern: a circular sampling pattern is built around each feature point, dense at the center and sparse toward the periphery, with the number of samples decreasing exponentially. To strengthen the robustness of the sampling circles and improve the stability and distinctiveness of the descriptor, each sampling circle is pre-smoothed with a different Gaussian kernel whose standard deviation is proportional to the size of the circle.
The FREAK descriptor consists of a bit string of differences of Gaussian-smoothed intensities. The criterion T of a sampling pair is defined as

T(P_a) = 1 if I(P_a^{r1}) > I(P_a^{r2}), and 0 otherwise,

where P_a denotes a sampling pair and I(·) denotes the smoothed intensity of a sampling circle. Selecting N sampling pairs, the binary criterion is defined as

F = Σ_{0 ≤ a < N} 2^a · T(P_a),

which yields an N-dimensional binary bit string.
The selection of sampling pairs is a coarse-to-fine process, with the following criteria:
A) Create a large matrix D holding 50,000 feature points, one feature point per row; comparing the intensities of the 43 sampling circles of each feature point pairwise yields a descriptor of roughly 1000 dimensions;
B) Compute the mean and variance of each column of D; the variance is largest when the mean is 0.5, which guarantees the distinctiveness of the feature descriptor;
C) Sort the columns by variance, placing the largest-variance column first;
D) Keep the first N columns to describe each feature point, obtaining an N-dimensional binary bit string. Here N = 256 is selected.
The selection criteria for the sampling pairs guarantee the grayscale invariance of the descriptor.
As in Fig. 3, select M (M = 45) symmetric sampling pairs from all the sampling circles and compute the local gradient

O = (1/M) Σ_{P_o ∈ G} ( I(P_o^{r1}) − I(P_o^{r2}) ) · ( P_o^{r1} − P_o^{r2} ) / ‖ P_o^{r1} − P_o^{r2} ‖,

where P_o^{r1} and P_o^{r2} are the 2-D position vectors of the corresponding sampling circles and I(·) denotes the smoothed intensity of a sampling circle. The symmetry of the sampling pairs guarantees the rotational invariance of the descriptor.
The FREAK descriptor is a binary bit string of 1s and 0s, so the similarity criterion adopts the nearest Hamming distance: the two descriptors to be matched are XORed bit by bit. For a 512-dimensional FREAK descriptor the maximum Hamming distance is 512 and the minimum is 0. To filter false matches a threshold T is set and points above T are filtered out directly; here the threshold is set to 10.
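Nearest-Hamming-distance matching with a rejection threshold, as described above, can be sketched on toy 16-bit strings (real FREAK descriptors are 256 or 512 bits; the second call lowers the threshold just to demonstrate rejection):

```python
# Match a query descriptor to its nearest reference by Hamming distance
# (bitwise XOR, then popcount); reject the match if the distance exceeds
# the threshold T.

def hamming(a, b):
    return bin(a ^ b).count("1")

def match(query, candidates, threshold=10):
    """Index of the nearest candidate, or None if it exceeds the threshold."""
    best_i, best_d = None, threshold + 1
    for i, c in enumerate(candidates):
        d = hamming(query, c)
        if d < best_d:
            best_i, best_d = i, d
    return best_i if best_d <= threshold else None

refs = [0b1111000011110000, 0b0000111100001111, 0b1010101010101010]
print(match(0b1111000011110010, refs))               # 0 (distance 1)
print(match(0b0101010101010101, refs, threshold=5))  # None (nearest distance is 8)
```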
4. Use the UAV position obtained by scene matching in the matching-suitable area to correct the error accumulated by visual odometry, obtaining a more accurate UAV position.
Since the scene matching result inside a matching-suitable area is reliable, the position obtained by visual odometry is directly reset with the UAV position obtained by scene matching, eliminating the error accumulated by long-term visual odometry.
5. Use the inertial navigation system to provide the UAV's current position and attitude.
6. Use the Kalman filtering algorithm to estimate the UAV's current precise position and attitude.
Use the error equation of the inertial navigation system as the state equation of the integrated navigation system, choose the East-North-Up frame as the navigation frame, and take the difference between the positions from steps 4 and 5 as the measurement. Estimate the drift error of the inertial system with a Kalman filter, then correct the inertial navigation system to obtain the fused navigation parameters.
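The fusion in step 6 can be illustrated with a one-dimensional toy filter: the state is the INS position drift, the measurement is the INS position minus the vision position, and the corrected output is the INS position minus the estimated drift. This is a sketch only; the actual filter uses the full multi-dimensional INS error-state model, and all numbers below are made up:

```python
# Scalar Kalman-filter correction of an INS position using a vision fix.
# x: drift-error estimate, p: its variance, q: process noise, r: measurement noise.

def kalman_correct(p_ins, p_vision, x, p, q, r):
    """One predict/update cycle; returns (corrected position, new x, new p)."""
    p = p + q                 # predict: drift uncertainty grows between fixes
    z = p_ins - p_vision      # measurement of the drift
    k_gain = p / (p + r)      # Kalman gain
    x = x + k_gain * (z - x)  # update drift estimate
    p = (1.0 - k_gain) * p    # update variance
    return p_ins - x, x, p

x, p = 0.0, 1.0
corrected = 0.0
# INS positions drift away from the vision fixes by 0.5, 0.6, 0.7 units.
for p_ins, p_vis in [(10.5, 10.0), (20.6, 20.0), (30.7, 30.0)]:
    corrected, x, p = kalman_correct(p_ins, p_vis, x, p, q=0.01, r=0.04)
print(round(x, 3), round(corrected, 3))
```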

Claims (3)

1. the inertia integrated navigation method based on scene matching aided navigation/vision mileage, is characterized in that step is as follows:
Step 1: in unmanned plane during flying process, look video camera Real-time Obtaining ground image a under airborne;
Step 2: utilize image a and former frame image a ', determine the vision mileage of unmanned plane, step is as follows:
A. use the Harris corner detection algorithm to extract feature points from the two consecutive real-time images a and a' respectively;
B. for each feature point (x, y)^T in image a', search a square region centered at (x, y)^T in image a for the match point with the highest neighborhood cross-correlation; meanwhile, for each feature point (x, y)^T in image a, search a square region centered at (x, y)^T in image a' for the match point with the highest neighborhood cross-correlation;
C. use the RANSAC robust estimation method to obtain the maximal consistent point set and an estimate of the homography matrix H; the process is to first randomly draw 4 groups of matching point pairs to form a random sample and compute the homography matrix H; then, for each matching point pair in step B, compute the distance dis; set a threshold t, and if dis < t the matching pair is an inlier, otherwise reject it; count the inliers; repeat the above process k times and select the H with the largest number of inliers;
D. re-estimate H from all the matching point pairs classified as inliers of the H with the largest inlier count, and use the newly obtained H to compute the point (x1, y1)^T in image a' corresponding to each feature point (x, y)^T in image a; then, using the method of step B, search a square region centered at (x, y)^T in image a for the match point with the highest neighborhood cross-correlation to each feature point (x1, y1)^T in image a'; meanwhile, search a square region centered at (x1, y1)^T in image a' for the match point with the highest neighborhood cross-correlation to each feature point (x, y)^T in image a;
E. repeat steps B to D until the number of matching point pairs is stable;
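The neighborhood cross-correlation search of steps B and D can be sketched as follows (illustrative NumPy; the patent does not specify the window sizes, so `search_radius` and `patch_radius` are assumptions):

```python
import numpy as np

def ncc(patch_a, patch_b):
    """Zero-mean normalized cross-correlation of two equally sized patches."""
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def best_match(img_a, img_b, point, search_radius=5, patch_radius=3):
    """Search a square region of img_b centered on `point` (a feature from
    img_a) for the pixel whose neighborhood has the highest NCC with the
    feature's neighborhood in img_a."""
    y, x = point
    r, p = search_radius, patch_radius
    ref = img_a[y - p:y + p + 1, x - p:x + p + 1]
    best, best_score = None, -2.0
    for yy in range(y - r, y + r + 1):
        for xx in range(x - r, x + r + 1):
            cand = img_b[yy - p:yy + p + 1, xx - p:xx + p + 1]
            if cand.shape != ref.shape:
                continue  # skip windows clipped by the image border
            s = ncc(ref, cand)
            if s > best_score:
                best, best_score = (yy, xx), s
    return best, best_score
```

Running the same search in both directions, as the claim requires, keeps only mutually consistent matches.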
Step 3: when the UAV enters an adaptive region, coarsely position the UAV according to the inertial navigation system, find the reference image b corresponding to the coarse positioning result in the onboard storage device, and perform scene matching between b and the real-time image a to determine the position of the UAV;
Step 4: use the UAV position obtained in step 3 to correct the error accumulated by the visual odometry in the adaptive region, obtaining a more accurate UAV position;
Step 5: use the inertial navigation system to provide the current position and attitude of the UAV;
Step 6: use the error equation of the inertial navigation system as the state equation of the integrated navigation system, choose the East-North-Up frame as the navigation coordinate system, and take the difference between the position obtained in step 4 and the position obtained in step 5 as the measurement; estimate the drift error of the inertial system with a Kalman filter, and use this drift error to correct the inertial navigation system, yielding the fused navigation parameters.
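Steps C and D of claim 1 (RANSAC estimation of the homography H followed by re-estimation from the maximal consistent set) can be illustrated with a generic DLT-based sketch; this omits refinements such as Hartley normalization and is not the claimed implementation:

```python
import numpy as np

def homography_dlt(src, dst):
    """Direct Linear Transform: homography H mapping src -> dst (N >= 4 pairs)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)            # null vector of A, reshaped
    return H / H[2, 2]

def project(H, pts):
    """Apply homography H to an (N, 2) array of points."""
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

def ransac_homography(src, dst, t=2.0, k=200, seed=0):
    """Repeat k times: draw 4 random pairs, fit H, count the pairs whose
    transfer distance dis is below t; keep the inlier set of the best H and
    re-estimate H from that maximal consistent set."""
    rng = np.random.default_rng(seed)
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(k):
        idx = rng.choice(len(src), size=4, replace=False)
        try:
            H = homography_dlt(src[idx], dst[idx])
        except np.linalg.LinAlgError:
            continue  # degenerate sample
        dis = np.linalg.norm(project(H, src) - dst, axis=1)
        inliers = dis < t
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Step D: final re-estimation from all inliers of the best H
    H = homography_dlt(src[best_inliers], dst[best_inliers])
    return H, best_inliers
```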
2. The inertial integrated navigation method based on scene matching/visual odometry according to claim 1, characterized in that: said ground image a is an optical image or an infrared image.
3. The inertial integrated navigation method based on scene matching/visual odometry according to claim 1, characterized in that the process of said step 3 is as follows: first, the downward-looking onboard camera of the UAV acquires ground image a in real time, and image a is preprocessed to obtain image c; then the FAST features of image b and image c are extracted; the extracted FAST features are then described with the FREAK feature descriptor; finally, feature matching is performed under the nearest-Hamming-distance similarity criterion to obtain the matched position, which is the current position of the UAV.
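As an illustration of the matching stage of this claim, the sketch below matches binary descriptors (stand-ins for the FREAK descriptors of images b and c) by nearest Hamming distance and estimates the UAV position as the median offset between matched points; the function name and the median-offset step are illustrative assumptions, not the claimed method:

```python
import numpy as np

def match_position(real_pts, real_desc, ref_pts, ref_desc, threshold=10):
    """Match binary descriptors of the real-time image against the reference
    map by nearest Hamming distance, then estimate the UAV position in the
    reference frame as the median offset between matched point pairs."""
    offsets = []
    for i, d in enumerate(real_desc):
        dists = np.count_nonzero(np.bitwise_xor(ref_desc, d), axis=1)
        j = int(np.argmin(dists))
        if dists[j] <= threshold:          # keep only confident matches
            offsets.append(ref_pts[j] - real_pts[i])
    if not offsets:
        return None                        # no reliable fix in this frame
    # The median is robust to the few residual mismatches
    return np.median(np.asarray(offsets, dtype=float), axis=0)
```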
CN201410128459.XA 2014-04-01 2014-04-01 Inertia integrated navigation method based on scene matching aided navigation/vision mileage Active CN103954283B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410128459.XA CN103954283B (en) 2014-04-01 2014-04-01 Inertia integrated navigation method based on scene matching aided navigation/vision mileage

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410128459.XA CN103954283B (en) 2014-04-01 2014-04-01 Inertia integrated navigation method based on scene matching aided navigation/vision mileage

Publications (2)

Publication Number Publication Date
CN103954283A true CN103954283A (en) 2014-07-30
CN103954283B CN103954283B (en) 2016-08-31

Family

ID=51331593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410128459.XA Active CN103954283B (en) 2014-04-01 2014-04-01 Inertia integrated navigation method based on scene matching aided navigation/vision mileage

Country Status (1)

Country Link
CN (1) CN103954283B (en)

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105045276A (en) * 2015-07-03 2015-11-11 深圳一电科技有限公司 Method and apparatus for controlling flight of unmanned plane
CN105180933A (en) * 2015-09-14 2015-12-23 中国科学院合肥物质科学研究院 Mobile robot track plotting correcting system based on straight-running intersection and mobile robot track plotting correcting method
CN105222772A (en) * 2015-09-17 2016-01-06 泉州装备制造研究所 A kind of high-precision motion track detection system based on Multi-source Information Fusion
CN105374043A (en) * 2015-12-02 2016-03-02 福州华鹰重工机械有限公司 Method and device of background filtering of visual odometry
CN105675013A (en) * 2014-11-21 2016-06-15 中国飞行试验研究院 Civil aircraft inertial navigation dynamic calibration method
CN105865451A (en) * 2016-04-19 2016-08-17 深圳市神州云海智能科技有限公司 Method and device applied to indoor location of mobile robot
CN105953796A (en) * 2016-05-23 2016-09-21 北京暴风魔镜科技有限公司 Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone
CN107067415A (en) * 2017-03-21 2017-08-18 南京航空航天大学 A kind of quick accurate positioning method of target based on images match
CN107167140A (en) * 2017-05-26 2017-09-15 江苏大学 A kind of unmanned plane vision positioning accumulated error suppressing method
CN107270904A (en) * 2017-06-23 2017-10-20 西北工业大学 Unmanned plane auxiliary guiding control system and method based on image registration
CN107462244A (en) * 2017-04-24 2017-12-12 北京航空航天大学 A kind of air remote sensing platform attitude angle high-precision measuring method matched based on GPS location and aerial map picture
CN107498559A (en) * 2017-09-26 2017-12-22 珠海市微半导体有限公司 The detection method and chip that the robot of view-based access control model turns to
CN107843240A (en) * 2017-09-14 2018-03-27 中国人民解放军92859部队 A kind of seashore region unmanned plane image same place information rapid extracting method
CN107967691A (en) * 2016-10-20 2018-04-27 株式会社理光 A kind of visual odometry calculates method and apparatus
CN108196285A (en) * 2017-11-30 2018-06-22 中山大学 A kind of Precise Position System based on Multi-sensor Fusion
CN108544494A (en) * 2018-05-31 2018-09-18 珠海市微半导体有限公司 A kind of positioning device, method and robot based on inertia and visual signature
CN108846857A (en) * 2018-06-28 2018-11-20 清华大学深圳研究生院 The measurement method and visual odometry of visual odometry
CN108981692A (en) * 2018-06-14 2018-12-11 兰州晨阳启创信息科技有限公司 It is a kind of based on inertial navigation/visual odometry train locating method and system
CN109102013A (en) * 2018-08-01 2018-12-28 重庆大学 A kind of improvement FREAK Feature Points Matching digital image stabilization method suitable for tunnel environment characteristic
CN109143305A (en) * 2018-09-30 2019-01-04 百度在线网络技术(北京)有限公司 Automobile navigation method and device
CN109341700A (en) * 2018-12-04 2019-02-15 中国航空工业集团公司西安航空计算技术研究所 Fixed wing aircraft vision assists landing navigation method under a kind of low visibility
CN109341685A (en) * 2018-12-04 2019-02-15 中国航空工业集团公司西安航空计算技术研究所 A kind of fixed wing aircraft vision auxiliary landing navigation method based on homograph
CN109360295A (en) * 2018-10-31 2019-02-19 张维玲 A kind of mileage measuring system and method based on Aberration Analysis
CN109523579A (en) * 2018-11-12 2019-03-26 北京联海信息系统有限公司 A kind of matching process and device of UAV Video image and three-dimensional map
CN109791048A (en) * 2016-08-01 2019-05-21 无限增强现实以色列有限公司 Usage scenario captures the method and system of the component of data calibration Inertial Measurement Unit (IMU)
CN109782012A (en) * 2018-12-29 2019-05-21 中国电子科技集团公司第二十研究所 A kind of speed-measuring method based on photoelectric image feature association
CN110388939A (en) * 2018-04-23 2019-10-29 湖南海迅自动化技术有限公司 One kind being based on the matched vehicle-mounted inertial navigation position error modification method of Aerial Images
CN111238488A (en) * 2020-03-18 2020-06-05 湖南云顶智能科技有限公司 Aircraft accurate positioning method based on heterogeneous image matching
CN112577493A (en) * 2021-03-01 2021-03-30 中国人民解放军国防科技大学 Unmanned aerial vehicle autonomous positioning method and system based on remote sensing map assistance
CN113390410A (en) * 2021-08-04 2021-09-14 北京云恒科技研究院有限公司 Inertial integrated navigation method suitable for unmanned aerial vehicle
CN113432594A (en) * 2021-07-05 2021-09-24 北京鑫海宜科技有限公司 Unmanned aerial vehicle automatic navigation system based on map and environment
CN114265427A (en) * 2021-12-06 2022-04-01 江苏方天电力技术有限公司 Inspection unmanned aerial vehicle auxiliary navigation system and method based on infrared image matching
CN116518981B (en) * 2023-06-29 2023-09-22 中国人民解放军国防科技大学 Aircraft visual navigation method based on deep learning matching and Kalman filtering

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107796417B (en) * 2016-09-06 2021-02-05 北京自动化控制设备研究所 Method for adaptively estimating scene matching and inertial navigation installation error
CN107966147B (en) * 2016-10-20 2021-02-05 北京自动化控制设备研究所 Scene matching method under large-locomotive condition

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6856894B1 (en) * 2003-10-23 2005-02-15 International Business Machines Corporation Navigating a UAV under remote control and manual control with three dimensional flight depiction
EP1975646A2 (en) * 2007-03-28 2008-10-01 Honeywell International Inc. Lader-based motion estimation for navigation
CN101598556A (en) * 2009-07-15 2009-12-09 北京航空航天大学 Unmanned plane vision/inertia integrated navigation method under a kind of circumstances not known
CN102435188A (en) * 2011-09-15 2012-05-02 南京航空航天大学 Monocular vision/inertia autonomous navigation method for indoor environment
CN102538781A (en) * 2011-12-14 2012-07-04 浙江大学 Machine vision and inertial navigation fusion-based mobile robot motion attitude estimation method
CN102829785A (en) * 2012-08-30 2012-12-19 中国人民解放军国防科学技术大学 Air vehicle full-parameter navigation method based on sequence image and reference image matching


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ZHUANG TONG ET AL.: "A Micro-UAV Attitude Algorithm Based on Monocular Vision", COMPUTER ENGINEERING *
LI YAOJUN ET AL.: "UAV Scene Matching Navigation Based on Geometric Constraints of Spatial Relations", APPLICATION RESEARCH OF COMPUTERS *
CHEN FANG ET AL.: "Research on Fast Scene Matching Algorithms for Inertial Integrated Navigation Systems", JOURNAL OF ASTRONAUTICS *

Cited By (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105675013A (en) * 2014-11-21 2016-06-15 中国飞行试验研究院 Civil aircraft inertial navigation dynamic calibration method
CN105675013B (en) * 2014-11-21 2019-03-01 中国飞行试验研究院 Civil aircraft inertial navigation dynamic calibration method
CN105045276A (en) * 2015-07-03 2015-11-11 深圳一电科技有限公司 Method and apparatus for controlling flight of unmanned plane
CN105180933A (en) * 2015-09-14 2015-12-23 中国科学院合肥物质科学研究院 Mobile robot track plotting correcting system based on straight-running intersection and mobile robot track plotting correcting method
CN105180933B (en) * 2015-09-14 2017-11-21 中国科学院合肥物质科学研究院 Mobile robot reckoning update the system and method based on the detection of straight trip crossing
CN105222772B (en) * 2015-09-17 2018-03-16 泉州装备制造研究所 A kind of high-precision motion track detection system based on Multi-source Information Fusion
CN105222772A (en) * 2015-09-17 2016-01-06 泉州装备制造研究所 A kind of high-precision motion track detection system based on Multi-source Information Fusion
CN105374043A (en) * 2015-12-02 2016-03-02 福州华鹰重工机械有限公司 Method and device of background filtering of visual odometry
CN105865451A (en) * 2016-04-19 2016-08-17 深圳市神州云海智能科技有限公司 Method and device applied to indoor location of mobile robot
CN105865451B (en) * 2016-04-19 2019-10-01 深圳市神州云海智能科技有限公司 Method and apparatus for mobile robot indoor positioning
CN105953796A (en) * 2016-05-23 2016-09-21 北京暴风魔镜科技有限公司 Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone
CN109791048A (en) * 2016-08-01 2019-05-21 无限增强现实以色列有限公司 Usage scenario captures the method and system of the component of data calibration Inertial Measurement Unit (IMU)
US11125581B2 (en) 2016-08-01 2021-09-21 Alibaba Technologies (Israel) LTD. Method and system for calibrating components of an inertial measurement unit (IMU) using scene-captured data
CN107967691B (en) * 2016-10-20 2021-11-23 株式会社理光 Visual mileage calculation method and device
CN107967691A (en) * 2016-10-20 2018-04-27 株式会社理光 A kind of visual odometry calculates method and apparatus
CN107067415B (en) * 2017-03-21 2019-07-30 南京航空航天大学 A kind of object localization method based on images match
CN107067415A (en) * 2017-03-21 2017-08-18 南京航空航天大学 A kind of quick accurate positioning method of target based on images match
CN107462244A (en) * 2017-04-24 2017-12-12 北京航空航天大学 A kind of air remote sensing platform attitude angle high-precision measuring method matched based on GPS location and aerial map picture
CN107167140B (en) * 2017-05-26 2019-11-08 江苏大学 A kind of unmanned plane vision positioning accumulated error suppressing method
CN107167140A (en) * 2017-05-26 2017-09-15 江苏大学 A kind of unmanned plane vision positioning accumulated error suppressing method
CN107270904B (en) * 2017-06-23 2020-07-03 西北工业大学 Unmanned aerial vehicle auxiliary guide control system and method based on image registration
CN107270904A (en) * 2017-06-23 2017-10-20 西北工业大学 Unmanned plane auxiliary guiding control system and method based on image registration
CN107843240A (en) * 2017-09-14 2018-03-27 中国人民解放军92859部队 A kind of seashore region unmanned plane image same place information rapid extracting method
CN107498559A (en) * 2017-09-26 2017-12-22 珠海市微半导体有限公司 The detection method and chip that the robot of view-based access control model turns to
CN108196285A (en) * 2017-11-30 2018-06-22 中山大学 A kind of Precise Position System based on Multi-sensor Fusion
CN110388939A (en) * 2018-04-23 2019-10-29 湖南海迅自动化技术有限公司 One kind being based on the matched vehicle-mounted inertial navigation position error modification method of Aerial Images
CN108544494B (en) * 2018-05-31 2023-10-24 珠海一微半导体股份有限公司 Positioning device, method and robot based on inertia and visual characteristics
CN108544494A (en) * 2018-05-31 2018-09-18 珠海市微半导体有限公司 A kind of positioning device, method and robot based on inertia and visual signature
CN108981692A (en) * 2018-06-14 2018-12-11 兰州晨阳启创信息科技有限公司 It is a kind of based on inertial navigation/visual odometry train locating method and system
CN108846857A (en) * 2018-06-28 2018-11-20 清华大学深圳研究生院 The measurement method and visual odometry of visual odometry
CN109102013A (en) * 2018-08-01 2018-12-28 重庆大学 A kind of improvement FREAK Feature Points Matching digital image stabilization method suitable for tunnel environment characteristic
CN109143305A (en) * 2018-09-30 2019-01-04 百度在线网络技术(北京)有限公司 Automobile navigation method and device
CN109360295A (en) * 2018-10-31 2019-02-19 张维玲 A kind of mileage measuring system and method based on Aberration Analysis
CN109523579A (en) * 2018-11-12 2019-03-26 北京联海信息系统有限公司 A kind of matching process and device of UAV Video image and three-dimensional map
CN109341700A (en) * 2018-12-04 2019-02-15 中国航空工业集团公司西安航空计算技术研究所 Fixed wing aircraft vision assists landing navigation method under a kind of low visibility
CN109341685B (en) * 2018-12-04 2023-06-30 中国航空工业集团公司西安航空计算技术研究所 Fixed wing aircraft vision auxiliary landing navigation method based on homography transformation
CN109341685A (en) * 2018-12-04 2019-02-15 中国航空工业集团公司西安航空计算技术研究所 A kind of fixed wing aircraft vision auxiliary landing navigation method based on homograph
CN109782012A (en) * 2018-12-29 2019-05-21 中国电子科技集团公司第二十研究所 A kind of speed-measuring method based on photoelectric image feature association
CN111238488A (en) * 2020-03-18 2020-06-05 湖南云顶智能科技有限公司 Aircraft accurate positioning method based on heterogeneous image matching
CN112577493A (en) * 2021-03-01 2021-03-30 中国人民解放军国防科技大学 Unmanned aerial vehicle autonomous positioning method and system based on remote sensing map assistance
CN113432594A (en) * 2021-07-05 2021-09-24 北京鑫海宜科技有限公司 Unmanned aerial vehicle automatic navigation system based on map and environment
CN113390410A (en) * 2021-08-04 2021-09-14 北京云恒科技研究院有限公司 Inertial integrated navigation method suitable for unmanned aerial vehicle
CN114265427A (en) * 2021-12-06 2022-04-01 江苏方天电力技术有限公司 Inspection unmanned aerial vehicle auxiliary navigation system and method based on infrared image matching
CN114265427B (en) * 2021-12-06 2024-02-02 江苏方天电力技术有限公司 Inspection unmanned aerial vehicle auxiliary navigation system and method based on infrared image matching
CN116518981B (en) * 2023-06-29 2023-09-22 中国人民解放军国防科技大学 Aircraft visual navigation method based on deep learning matching and Kalman filtering

Also Published As

Publication number Publication date
CN103954283B (en) 2016-08-31

Similar Documents

Publication Publication Date Title
CN103954283B (en) Inertia integrated navigation method based on scene matching aided navigation/vision mileage
CN105865454B (en) A kind of Navigation of Pilotless Aircraft method generated based on real-time online map
CN102435188B (en) Monocular vision/inertia autonomous navigation method for indoor environment
US8213706B2 (en) Method and system for real-time visual odometry
CN106097304B (en) A kind of unmanned plane real-time online ground drawing generating method
EP2503510B1 (en) Wide baseline feature matching using collaborative navigation and digital terrain elevation data constraints
CN106595659A (en) Map merging method of unmanned aerial vehicle visual SLAM under city complex environment
CN105678754A (en) Unmanned aerial vehicle real-time map reconstruction method
CN110726406A (en) Improved nonlinear optimization monocular inertial navigation SLAM method
CN108917753B (en) Aircraft position determination method based on motion recovery structure
CN102607532B (en) Quick low-level image matching method by utilizing flight control data
CN111623773B (en) Target positioning method and device based on fisheye vision and inertial measurement
CN115406447B (en) Autonomous positioning method of quad-rotor unmanned aerial vehicle based on visual inertia in rejection environment
CN114693754A (en) Unmanned aerial vehicle autonomous positioning method and system based on monocular vision inertial navigation fusion
Warren et al. High altitude stereo visual odometry
Abdi et al. Pose estimation of unmanned aerial vehicles based on a vision-aided multi-sensor fusion
CN112945233B (en) Global drift-free autonomous robot simultaneous positioning and map construction method
Chen et al. Aerial robots on the way to underground: An experimental evaluation of VINS-mono on visual-inertial odometry camera
Beauvisage et al. Multimodal visual-inertial odometry for navigation in cold and low contrast environment
CN114723920A (en) Point cloud map-based visual positioning method
Hosen et al. A vision-aided nonlinear observer for fixed-wing UAV navigation
Liu et al. 6-DOF motion estimation using optical flow based on dual cameras
CN113239936A (en) Unmanned aerial vehicle visual navigation method based on deep learning and feature point extraction
CN109341685B (en) Fixed wing aircraft vision auxiliary landing navigation method based on homography transformation
Mirisola et al. Trajectory recovery and 3d mapping from rotation-compensated imagery for an airship

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20190826

Address after: Room 404, Material Building, Northwest Polytechnic University, 127 Youyi West Road, Xi'an City, Shaanxi Province, 710072

Patentee after: Xi'an Northwestern Polytechnical University Asset Management Co.,Ltd.

Address before: 710072 Xi'an friendship West Road, Shaanxi, No. 127

Patentee before: Northwestern Polytechnical University

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20191113

Address after: 710072 floor 19, building B, innovation and technology building, northwest Polytechnic University, No.127, Youyi West Road, Beilin District, Xi'an, Shaanxi Province

Patentee after: Shaanxi CISCO Rudi Network Security Technology Co.,Ltd.

Address before: Room 404, Material Building, Northwest Polytechnic University, 127 Youyi West Road, Xi'an City, Shaanxi Province, 710072

Patentee before: Xi'an Northwestern Polytechnical University Asset Management Co.,Ltd.

TR01 Transfer of patent right
CP01 Change in the name or title of a patent holder

Address after: Floor 19, block B, innovation and technology building, Northwest University of technology, 127 Youyi West Road, Beilin District, Xi'an City, Shaanxi Province, 710072

Patentee after: Shaanxi University of technology Ruidi Information Technology Co.,Ltd.

Address before: Floor 19, block B, innovation and technology building, Northwest University of technology, 127 Youyi West Road, Beilin District, Xi'an City, Shaanxi Province, 710072

Patentee before: Shaanxi CISCO Rudi Network Security Technology Co.,Ltd.

CP01 Change in the name or title of a patent holder
TR01 Transfer of patent right

Effective date of registration: 20231008

Address after: 518000 Unit 204, Xingyuanju 2, Xilihu Road, Xili Street, Nanshan District, Shenzhen, Guangdong Province

Patentee after: Shenzhen Onoan Technology Co.,Ltd.

Address before: Floor 19, block B, innovation and technology building, Northwest University of technology, 127 Youyi West Road, Beilin District, Xi'an City, Shaanxi Province, 710072

Patentee before: Shaanxi University of technology Ruidi Information Technology Co.,Ltd.

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20231220

Address after: 710000, No. 581, East Zone, National E-commerce Demonstration Base, No. 528 Tianguba Road, Software New City, High tech Zone, Xi'an City, Shaanxi Province

Patentee after: Xi'an Chenxiang Zhuoyue Technology Co.,Ltd.

Address before: 518000 Unit 204, Xingyuanju 2, Xilihu Road, Xili Street, Nanshan District, Shenzhen, Guangdong Province

Patentee before: Shenzhen Onoan Technology Co.,Ltd.

TR01 Transfer of patent right