CN102252681A - Global positioning system (GPS) and machine vision-based integrated navigation and positioning system and method - Google Patents

Publication number: CN102252681A
Application number: CN2011100968964A
Authority: CN (China)
Prior art keywords: GPS, image, locating device, machine vision, navigation
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 张漫, 陈艳, 刘兆祥, 籍颖, 马文强, 吴琼, 刘刚
Current and original assignee: China Agricultural University (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Application filed by China Agricultural University
Priority to CN2011100968964A priority Critical patent/CN102252681A/en
Publication of CN102252681A publication Critical patent/CN102252681A/en

Landscapes

  • Image Analysis (AREA)
Abstract

The invention discloses an integrated navigation and positioning system and method based on a global positioning system (GPS) and machine vision. The system comprises: a GPS positioning device, which performs GPS positioning of the navigated vehicle, acquires its position coordinates, heading angle and travel speed, and transmits this information to a fusion positioning device; a machine vision positioning device, which collects images of the farmland along the navigation path, processes the collected images, extracts the navigation path to obtain the position coordinates of known points on it, and transmits these coordinates to the fusion positioning device; and the fusion positioning device, which performs spatial and temporal registration of the information from the GPS and machine vision positioning devices and filters it to obtain the final positioning information. The system and method have high positioning accuracy, are simple to compute, and are well suited to real-time operation in the field.

Description

Integrated navigation and positioning system and method based on GPS and machine vision
Technical field
The present invention relates to the technical field of navigation and positioning, and in particular to an integrated navigation and positioning system and method based on GPS and machine vision.
Background art
Precision agriculture comprises site-specific prescription farming and automatic collection of field information. Prescription farming requires agricultural machinery to travel through the field along a pre-planned path, arrive accurately at the destination and complete the assigned task. Precision navigation is one of the key technologies for autonomous operation of agricultural machinery, and its positioning accuracy directly determines the quality of automatic path tracking. Improving navigation and positioning accuracy is therefore the first problem to solve in improving the path-tracking quality of agricultural machinery.
GPS (Global Positioning System) navigation is the most widely used navigation technology; position is computed from the radio signals transmitted toward the Earth by the 24 satellites orbiting it. GPS offers high accuracy, all-weather operation and an unlimited number of users, and provides accurate three-dimensional position and three-dimensional velocity. Using real-time kinematic (RTK) differential positioning, GPS accuracy can reach the centimetre level, fully meeting the requirements of field operations. Its accuracy is, however, affected by factors such as the geometric distribution of visible satellites (GDOP), ephemeris error, clock error, propagation error, multipath error and receiver noise; near trees, houses or tall buildings, enough satellite signals may not be received and accuracy suffers. Moreover, GPS provides only the absolute position of the machinery and knows nothing about the relative information around it.
Machine vision navigation plans the desired path from a camera's real-time observation of the surroundings and can move along that path to the intended target and carry out the task without human intervention. With vision navigation the camera observes the farm environment tirelessly and uniformly, relieving the driver of continuously watching the road for long periods and effectively reducing the driver's workload.
Shen Mingxia of Nanjing Agricultural University has done extensive, in-depth research on extracting field-scene regions and edges from images and proposed several novel algorithms. For field-scene region detection, two texture features, geometric symmetry and directionality, were defined from the texture-spectrum perspective of the crops, and crop regions are distinguished from non-crop regions by thresholding these feature values. The approach is novel in that it identifies regions from the texture spectrum and thereby effectively avoids spatial-domain noise. However, the high dimensionality of the texture feature values requires a large amount of computation, so the algorithm's real-time performance is poor and it is unsuitable for real-time field operation.
Summary of the invention
(1) Technical problem to be solved
The technical problem to be solved by the present invention is how to provide an integrated navigation and positioning system and method based on GPS and machine vision that has high positioning accuracy, is computationally simple and is suitable for real-time field operation.
(2) Technical scheme
To solve the above problem, the invention provides an integrated navigation and positioning system based on GPS and machine vision, comprising: a GPS positioning device, which performs GPS positioning of the navigated vehicle, obtains its position coordinates, heading angle and travel speed, and sends them to a fusion positioning device; a machine vision positioning device, which collects farmland images along the navigation path, processes the collected images, extracts the navigation path, obtains the position coordinates of known points on the navigation path, and sends them to the fusion positioning device; and the fusion positioning device, which performs spatial and temporal registration of the information from the GPS and machine vision positioning devices and filters it to obtain the final positioning information.
Wherein the GPS positioning device further comprises: a data acquisition module, which performs GPS positioning of the navigated vehicle and obtains its position coordinates, heading angle and travel speed; and a data processing module, which converts the acquired information to a plane coordinate system via the Gauss projection, then saves, displays and sends it to the fusion positioning device.
Wherein the data processing module further comprises: a microcontroller, which converts the acquired latitude/longitude information to the plane coordinate system via the Gauss projection; and a communication unit, which sends the positioning information processed by the microcontroller to the fusion positioning device.
Wherein the machine vision positioning device further comprises: an image acquisition module, which collects farmland images along the navigation path; and an image processing module, which processes the collected images, extracts the navigation path, obtains the position coordinates of known points on it, and sends them to the fusion positioning device.
Wherein the fusion positioning device further comprises: a data parsing module, which performs spatial and temporal registration of the information sent by the GPS and machine vision positioning devices; and a fusion positioning module, which filters the data processed by the parsing module to obtain the final positioning information.
Wherein the GPS positioning device is an RTK GPS receiver performing virtual reference station (VRS) differential positioning, transmitting data in asynchronous serial mode.
Wherein the image acquisition module is a CCD camera.
Based on the above system, the present invention also provides an integrated navigation and positioning method based on GPS and machine vision, comprising the steps:
S1. The GPS positioning device performs GPS positioning of the navigated vehicle and sends its position coordinates, heading angle and travel speed to the fusion positioning device;
S2. The machine vision positioning device collects farmland images along the navigation path, processes the collected images, extracts the navigation path, and sends the position coordinates of the feature points on the navigation path to the fusion positioning device;
S3. The fusion positioning device performs spatial and temporal registration of the information from the GPS and machine vision positioning devices and filters it to obtain the final positioning information.
Wherein, in step S2, the image processing comprises: grey-scale transformation, image segmentation, image denoising, extraction of navigation path candidate points, Hough-transform extraction of the navigation path, and feature point calculation.
Wherein, in step S3, the filtering is based on the unscented Kalman filter (UKF) algorithm.
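The UKF itself is not detailed in this section. As an illustrative sketch only (not the patent's implementation), the core step of any UKF — generating sigma points whose weighted mean and covariance exactly reproduce the state estimate — can be written as follows; the scaling parameters alpha, beta, kappa follow common convention and are assumptions here:

```python
import numpy as np

def sigma_points(mu, P, alpha=0.1, beta=2.0, kappa=0.0):
    """Generate the 2n+1 sigma points and weights of the unscented transform."""
    n = mu.size
    lam = alpha ** 2 * (n + kappa) - n
    # Matrix square root of (n + lam) * P via Cholesky factorisation
    L = np.linalg.cholesky((n + lam) * P)
    pts = np.zeros((2 * n + 1, n))
    pts[0] = mu
    for i in range(n):
        pts[1 + i] = mu + L[:, i]        # positive spread along column i
        pts[1 + n + i] = mu - L[:, i]    # negative spread along column i
    Wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))  # mean weights
    Wc = Wm.copy()                                   # covariance weights
    Wm[0] = lam / (n + lam)
    Wc[0] = Wm[0] + (1 - alpha ** 2 + beta)
    return pts, Wm, Wc
```

Propagating these points through the (nonlinear) motion and measurement models and re-combining them with the weights Wm and Wc yields the UKF prediction and update without linearisation.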
(3) Beneficial effects
The present invention has the following advantages:
1. The crop-row region is thinned before feature points are extracted, which reduces the difficulty of feature point detection.
2. Traditional guidance-line extraction adapts poorly to changing conditions. The present invention takes factors such as illumination into account, selects suitable image processing methods, and improves existing guidance baseline extraction algorithms on real field-crop images, improving real-time performance and robustness.
3. Common camera calibration methods are based on a 2D planar template. The present invention simplifies the calibration procedure and realises the conversion of the guidance baseline's position coordinates relative to the vehicle from the image coordinate system to the geodetic coordinate system.
4. Fusing the absolute position obtained by GPS with the relative position obtained by machine vision improves the positioning accuracy and reliability of the system and reduces jumps in the positioning data, meeting the operating accuracy requirements of agricultural machinery in precision agriculture.
5. Local storage of the data facilitates analysis and processing of the information.
6. Integrated navigation and positioning software was developed under VC++ 6.0. Through the window interface the user can not only follow the navigation and positioning status in real time, but also conveniently perform operations such as data export and data query. The interface is user-friendly and convenient for system control, and satisfies users' needs.
Brief description of the drawings
Fig. 1 is a block diagram of the integrated navigation and positioning system based on GPS and machine vision according to one embodiment of the present invention;
Fig. 2 is the workflow of the GPS positioning device of the system according to one embodiment of the present invention;
Figs. 3-1 and 3-2 are hardware structure diagrams of the OEMV-3 GPS receiver and the VRS data transmission equipment of the system according to one embodiment of the present invention;
Fig. 4 is the workflow of the machine vision positioning device of the system according to one embodiment of the present invention;
Fig. 5 is the flowchart of the integrated navigation and positioning method based on GPS and machine vision according to one embodiment of the present invention;
Fig. 6 is the grey projection plot of a binary image in the method according to one embodiment of the present invention;
Fig. 7 is a grey projection image in the method according to one embodiment of the present invention;
Fig. 8 is an image after thresholding in the method according to one embodiment of the present invention;
Fig. 9 is the workflow of the fusion positioning device of the system according to one embodiment of the present invention;
Fig. 10 is a schematic diagram of GPS coordinates;
Fig. 11 is a schematic diagram of the fusion model;
Fig. 12 is a schematic diagram of the dead-reckoning principle.
Detailed description of the embodiments
The integrated navigation and positioning system and method based on GPS and machine vision proposed by the present invention are described in detail below in conjunction with the accompanying drawings and embodiments.
In the field, the boundary between harvested and unharvested crop is sometimes not a straight line, so navigating by GPS alone introduces a certain error in determining the guidance baseline. Machine vision can extract the feature information of the current crop row in real time and improve positioning accuracy, but when used alone it sometimes misses detections during image processing. The present invention therefore combines the two sensors, GPS and camera, for navigation.
As shown in Fig. 1, the integrated navigation and positioning system based on GPS and machine vision according to one embodiment of the present invention comprises:
a GPS positioning device, which performs GPS positioning of the navigated vehicle, obtains its absolute position coordinates, heading angle and travel speed, and sends them to the fusion positioning device;
a machine vision positioning device, which collects farmland images along the navigation path, processes the collected images, extracts the navigation path, obtains the position coordinates of known points on the navigation path relative to the vehicle coordinate system, and sends them to the fusion positioning device;
a fusion positioning device, which performs spatial and temporal registration of the positioning information from the GPS and machine vision positioning devices and filters it to obtain the final positioning information.
An RTK GPS receiver used in an agricultural vehicle navigation system should satisfy the following conditions: its hardware resources meet agricultural application requirements; it works stably and reliably with low power consumption; and it can be used flexibly together with other products. Based on this analysis of the required functions, the GPS positioning device of the present invention is preferably an RTK GPS receiver performing virtual reference station (Virtual Reference Station, VRS) differential positioning, transmitting data in asynchronous serial mode. The device further comprises:
a data acquisition module, which configures the serial port and collects GPS data following the NMEA-0183 protocol over it, including latitude/longitude information in the WGS-84 coordinate system;
a data processing module, which converts the acquired latitude/longitude information to plane coordinates via the Gauss projection, then saves it and displays it in the interface. Since information from both the GGA and VTG sentence types is used, the two sentence types are parsed separately. The data acquisition and parsing workflow of the GPS positioning device is shown in Fig. 2.
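The full Gauss projection involves lengthy series expansions. Purely as a hedged illustration of the idea — mapping WGS-84 latitude/longitude to plane metres — the following sketch substitutes a simple local tangent-plane approximation, which is adequate over field-sized areas but is not the true Gauss projection; the function name and parameters are assumptions for illustration:

```python
import math

# WGS-84 ellipsoid parameters
WGS84_A = 6378137.0          # semi-major axis, m
WGS84_F = 1 / 298.257223563  # flattening

def latlon_to_local_xy(lat_deg, lon_deg, lat0_deg, lon0_deg):
    """Convert latitude/longitude to metres east (x) / north (y) of a local origin."""
    e2 = WGS84_F * (2 - WGS84_F)              # first eccentricity squared
    lat0 = math.radians(lat0_deg)
    s = math.sin(lat0)
    # Radii of curvature in the meridian (M) and prime vertical (N) at the origin
    M = WGS84_A * (1 - e2) / (1 - e2 * s * s) ** 1.5
    N = WGS84_A / math.sqrt(1 - e2 * s * s)
    x = math.radians(lon_deg - lon0_deg) * N * math.cos(lat0)  # east offset
    y = math.radians(lat_deg - lat0_deg) * M                   # north offset
    return x, y
```

For centimetre-level work over large extents, a proper Gauss-Krüger (transverse Mercator) implementation or projection library would replace this approximation.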
The data processing module further comprises:
a microcontroller, which converts the acquired latitude/longitude information to the plane coordinate system via the Gauss projection;
a communication unit, which saves, displays and sends the positioning information processed by the microcontroller to the fusion positioning device.
Fig. 3 shows the structure of the data acquisition module (OEMV-3 GPS receiver) and the data processing module (VRS data transmission equipment). The OEM board is preferably the OEMV-3 from Novatel (Canada), a dual-frequency, dual-constellation board with a wide input voltage range; the maximum output rate of the positioning information can be enabled as required. It supports input and output of differential data in RTCM (Radio Technical Commission for Maritime Services) and CMR formats, so it can serve as either base station or rover, and supports multiple serial port outputs and a CAN bus. The CDMA communication unit in the VRS data transmission equipment is the DTGS-800 from Anydata, which is inexpensive and widely used. The microcontroller is a 32-bit ARM7 LPC2366 as the main control chip, running at 72 MHz. It has 512 KB of on-chip Flash program memory, supports in-system and in-application programming with a 256-byte programming time of 1 ms, supports up to 32 vectored interrupts, and also provides multiple serial ports, Ethernet and USB. The data transmission equipment can deliver RTK or RTD differential information according to the user's needs, with RTK differential information selectable in RTCM or CMR format. An internal battery likewise keeps the equipment powered during long working sessions.
The accuracy of this GPS positioning system is 1-2 cm. The positioning information is output as standard NMEA-0183 sentences in the GGA and VTG formats: the GGA sentence gives the position of the navigated vehicle, including longitude and latitude, and the VTG sentence gives its travel speed and heading angle.
The machine vision positioning device further comprises:
an image acquisition module, which collects farmland images along the navigation path;
an image processing module, which processes the collected images; the processing comprises grey-scale transformation, image segmentation (binarisation), image denoising, extraction of navigation path candidate points, Hough-transform extraction of the navigation path, and feature point calculation.
The image acquisition module of the machine vision positioning device is a vision sensor, preferably a CCD camera whose output is a colour image in .bmp format. Captured pictures are deposited in a buffer over USB 2.0, and a callback function coordinates acquisition and processing. The image processing obtains the feature points on the guidance baseline, transforms them to the geodetic coordinate system and saves the data; the workflow is shown in Fig. 4.
The fusion positioning device further comprises:
a data parsing module, which performs spatial and temporal registration of the positioning information sent by the GPS and machine vision positioning devices;
a fusion positioning module, which filters the data processed by the parsing module and outputs the final positioning information.
As shown in Fig. 5, the integrated navigation and positioning method based on the above system according to one embodiment of the present invention comprises the steps:
S1. The GPS positioning device performs GPS positioning of the navigated vehicle and sends its absolute position coordinates, heading angle and travel speed to the fusion positioning device;
S2. The machine vision positioning device collects farmland images along the navigation path, processes the collected images, extracts the navigation path, and sends the position coordinates of the feature points on the navigation path, relative to the vehicle coordinate system, to the fusion positioning device;
S3. The fusion positioning device performs spatial and temporal registration of the information from the GPS and machine vision positioning devices and filters it to obtain the final positioning information.
The farmland images in step S2 are collected by a CCD camera mounted on the navigated vehicle (the agricultural machine). The preferred CCD dimensions are 60 mm × 60 mm × 50 mm; pixel size 4.65 μm × 4.65 μm; effective pixels 1300 × 1024; full-frame rate adjustable from 8 to 24 Hz; minimum electronic shutter exposure time 20 μs. The camera transfers data to the computer over a USB 2.0 interface.
The image processing in step S2 comprises: (1) grey-scale transformation, (2) image segmentation (binarisation), (3) image denoising, (4) extraction of navigation path candidate points, (5) Hough-transform extraction of the navigation path, and (6) feature point calculation.
(1) Grey-scale transformation: a colour image is generally stored with three channels; to simplify segmentation it is converted to a single channel, i.e. grey-scaled. The method and process of the grey-scale transformation in the present invention are as follows:
Based on the colour difference between green crops and soil in the farmland, the 2G-R-B transform is selected for the grey-scale conversion. Any natural colour of light can be decomposed into red, green and blue light, so red, green and blue are called the three primary colours, and the space with the RGB primaries as coordinates is called the RGB colour space. In the RGB model, any colour L can be expressed, according to the proportions of the three colours, as L = r[R] + g[G] + b[B], where r[R], g[G], b[B] are the primary colour components of L. By the basic theory of pattern recognition, features with large variance discriminate well. The three orthogonal colour features are:

    I₁ = (R + G + B)/3,  I₂ = (R − B)/2,  I₃ = (2G − R − B)/4    (1)

where I₁, I₂, I₃ are the colour features. Analysing the three kinds of images produced by these features, the colour feature component that clearly separates green crops from the non-green background, and thus helps extract green plants from the background, is chosen. The transform is:

    f(i, j) = 0,            if 2G − R − B < 0
    f(i, j) = 2G − R − B,   if 0 ≤ 2G − R − B ≤ 255    (2)
    f(i, j) = 255,          if 2G − R − B > 255

where f(i, j) is the grey value of a point in the grey image, ranging from 0 to 255.
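As a minimal sketch of formula (2) — the clipped 2G − R − B ("excess green") transform — applied per pixel with numpy (the function name is an assumption):

```python
import numpy as np

def excess_green(rgb):
    """2G - R - B grey-scale transform, clipped to [0, 255] as in formula (2)."""
    rgb = rgb.astype(np.int32)                       # avoid uint8 overflow
    g = 2 * rgb[..., 1] - rgb[..., 0] - rgb[..., 2]  # channels assumed R, G, B
    return np.clip(g, 0, 255).astype(np.uint8)
```

Pure green pixels map to 255, achromatic and red-dominated soil pixels map to 0, which is what makes the subsequent thresholding easy.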
(2) Image segmentation (binarisation): the main purpose of segmentation is to separate the crop-row region from the soil region; the segmented image contains only the two grey values black and white, i.e. 0 (all black) and 255 (all white). The present invention performs threshold-based segmentation, and to adapt to different environments the OTSU (maximum between-class variance) method is used. The method is derived from the principles of discriminant analysis and least squares and is an adaptive threshold determination method. Its basic principle is to split the image histogram into two groups at a candidate threshold and to choose the threshold at which the between-class variance is maximal. It accounts for the variance between the target and background classes; maximum between-class variance means minimum probability of misclassification, and it suppresses noise effectively.
Suppose the grey values of an image are the levels 0 to m − 1 and the number of pixels with grey value i is nᵢ. The total number of pixels is:

    N = Σ_{i=0}^{m−1} nᵢ    (3)

The probability of each grey value is:

    pᵢ = nᵢ / N    (4)

The threshold T divides the levels into two groups, C₀ = {0, …, T−1} and C₁ = {T, …, m−1}, whose probabilities are:

    C₀:  w₀ = Σ_{i=0}^{T−1} pᵢ = w(T)    (5)
    C₁:  w₁ = Σ_{i=T}^{m−1} pᵢ = 1 − w₀    (6)

The mean of C₀ is:

    μ₀ = (Σ_{i=0}^{T−1} i·pᵢ) / w₀ = μ(T) / w(T)    (7)

The mean of C₁ is:

    μ₁ = (Σ_{i=T}^{m−1} i·pᵢ) / w₁ = (μ − μ(T)) / (1 − w(T))    (8)

The mean grey value of the whole image is:

    μ = Σ_{i=0}^{m−1} i·pᵢ    (9)

The mean grey value up to threshold T is:

    μ(T) = Σ_{i=0}^{T−1} i·pᵢ    (10)

so the mean grey value over all samples is μ = w₀μ₀ + w₁μ₁ (11). The between-class variance is:

    δ²(T) = w₀(μ₀ − μ)² + w₁(μ₁ − μ)² = w₀w₁(μ₁ − μ₀)² = [μ·w(T) − μ(T)]² / (w(T)[1 − w(T)])    (12)
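A direct implementation of formulas (3)-(12) — scoring every candidate split by its between-class variance — might look like the following sketch (an illustration, not the patent's code; pixels ≤ the returned level go to class C₀):

```python
import numpy as np

def otsu_threshold(gray):
    """Return the grey level maximising between-class variance, formula (12)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)  # n_i
    p = hist / hist.sum()                    # p_i, formula (4)
    omega = np.cumsum(p)                     # w(T) as prefix sums, formula (5)
    mu_t = np.cumsum(np.arange(256) * p)     # mu(T), formula (10)
    mu = mu_t[-1]                            # global mean, formula (9)
    denom = omega * (1.0 - omega)
    # Between-class variance for every split; guard against empty classes.
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = np.where(denom > 0, (mu * omega - mu_t) ** 2 / denom, 0.0)
    return int(np.argmax(sigma_b))
```

Because only cumulative sums are needed, all 256 candidate thresholds are evaluated in a single pass over the histogram.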
(3) Image denoising: owing to the nature of the farm environment itself, interference from weeds, stones and shadows is unavoidable, so the segmented image contains many noise points. To obtain a good result, the image is denoised. Morphological dilation and erosion are effective ways of removing noise, and the present invention denoises by dilation followed by erosion.
Dilation: let I be the binary image to be processed and S a structuring element. The dilation of I by S is defined as:

    I ⊕ S = { z | (Ŝ)_z ∩ I ≠ ∅ }    (13)

where Ŝ denotes the reflection of S and (Ŝ)_z denotes Ŝ translated by z. To dilate the image I by the structuring element S, translate Ŝ by z and examine its intersection with the binary image; if the intersection is non-empty, record the point z. The set of all points z satisfying this requirement is the output of the dilation of I by S.
For the farmland binary image, the target region in the image I to be processed is the crop rows, i.e. the set of white pixels. The concrete dilation procedure is:
A. Select the structuring element S and determine its origin. Given the characteristics of field-crop images, crop rows generally need to be dilated along the ridge direction so as to fill holes in the target region; a structuring element of n × 1 pixels is therefore used. In principle any pixel of the structuring element can serve as its origin, but its centre pixel is generally chosen.
B. Take the reflection of S, translate it over the image, and decide the output at each position. The target pixels are white. Since the chosen structuring element is n × 1 pixels and symmetric about its origin, Ŝ = S. Translate the structuring element by z pixels over the binary image I: if any pixel of I within the extent of S is white, the pixel of I at the origin of S is set white; otherwise, if no pixel of I within the extent of S is white, it is set black.
C. The set of all white pixels in I so obtained is the output.
Erosion: let I be the binary image to be processed and S a structuring element. The erosion of I by S is defined as:

    I ⊖ S = { z | (S)_z ⊆ I }    (14)

To erode the image I by the structuring element S, translate S by z and check whether the structuring element is contained in the binary image; if it is, record the point z. The set of all points z satisfying this requirement is the output of the erosion of I by S.
For the farmland binary image, the target region of the image I to be processed is again the crop rows, i.e. the set of white pixels. The concrete erosion procedure is:
D. Select the structuring element S and determine its origin. Given the characteristics of field-crop images, crop rows generally need to be eroded along the direction perpendicular to the ridges so as to remove interfering noise points; a structuring element of 1 × n pixels is therefore used, again with its centre pixel as origin.
E. Translate the structuring element over the binary image and decide the output at each position. The target pixels are white. Translate the structuring element by z pixels over the binary image I: if after translation S is completely contained in I, i.e. every pixel of I within the extent of S is white, the pixel of I at the origin of S is set white; otherwise it is set black.
F. The set of all white pixels in I so obtained is the output.
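The n × 1 dilation and 1 × n erosion above can be sketched with simple array shifts as follows (a minimal illustration; function names are assumptions, and zero padding is assumed at the borders):

```python
import numpy as np

def dilate_v(img, n=3):
    """Dilate a binary image with an n x 1 (vertical) structuring element."""
    assert n % 2 == 1, "odd size so the centre pixel is the origin"
    r = n // 2
    padded = np.pad(img, ((r, r), (0, 0)))
    # A pixel becomes white if any pixel in its vertical n-neighbourhood is white.
    return np.max([padded[i:i + img.shape[0]] for i in range(n)], axis=0)

def erode_h(img, n=3):
    """Erode a binary image with a 1 x n (horizontal) structuring element."""
    assert n % 2 == 1
    r = n // 2
    padded = np.pad(img, ((0, 0), (r, r)))
    # A pixel stays white only if its whole horizontal n-neighbourhood is white.
    return np.min([padded[:, i:i + img.shape[1]] for i in range(n)], axis=0)
```

Vertical dilation fills gaps along a crop row, and the subsequent horizontal erosion removes structures narrower than n pixels across the rows.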
(4) Extraction of navigation path candidate points:
The target region consists of many white-pixel blobs. To reduce the computation of line extraction, a series of candidate points representing the crop rows is determined from the image; this discards useless information and prepares for line extraction. The present invention extracts candidate points with a vertical projection method based on a moving window.
Fig. 6 is the grey projection plot of a binary image. It contains several peaks, whose positions approximate the crop centres. Because the whole image is scanned and the crops in a farmland image are not parallel to the pixel coordinate axes, this result contains error. Considering these factors, feature points are extracted with a vertical projection method based on a moving window. The basic principle of the algorithm is: first set a moving window of h × 1; move this window through the image band by band, computing the grey projection values within the window; after scanning one band, apply a threshold to the projection to decide the approximate extent of each crop row and then determine the row centres; then move the window down one band and continue scanning, finishing when the remaining distance to the image bottom is h pixels.
Suppose the image size is M × N pixels, f(i, j) is the grey value at point (i, j), S(j) is the grey projection value of all pixels of column j inside the window, and A and D are the mean and standard deviation of all S(j) in one band. S(j), A and D are computed by formulas (15)-(17).
$$S(j) = \sum_{i=1}^{h} f(i,j), \qquad j = 1, 2, \ldots, M \qquad (15)$$

$$A = \frac{1}{M} \sum_{j=1}^{M} S(j) \qquad (16)$$

$$D = \sqrt{\frac{1}{M} \sum_{j=1}^{M} \bigl(A - S(j)\bigr)^2} \qquad (17)$$
The main steps of extracting the guidance-path candidate points are:
h. Select an h × 1 window and begin moving the window from the top of the image. To reduce the loss of crop-row information while not increasing the image-processing time, the window size selected in this embodiment is 20 × 1.
i. Compute the gray-level projection value S(j) inside the window. Since the image being processed is a farmland binary image with white pixels as the target pixels, computing the gray-level projection value is in fact just counting the target pixels. Fig. 7 is the projection of the top 20 pixel rows of the image.
j. Move the window to the right along the row and recompute S(j); after finishing one row of window positions, obtain S(1), S(2), …, S(M), and compute the mean A and standard deviation D of S(j).
k. Set a threshold line S = A + D; the part above this line is the target region. The method is to compare S(j) with S: if S(j) ≥ S, set S(j) to 20; if S(j) < S, set S(j) to 0. Fig. 8 shows the image after this thresholding; the dashed line in the figure is the threshold line. The last column of crops is easily neglected because it contains few target pixels, but it still represents one row of information; this crop-row information is filled in again by the later scans.
l. Judge the left and right edge points of the crop rows. Since the pixel distribution of each column of the image now takes only the two values "0" and "20", the crop-row edge points can be judged by computing the pixel-value difference of adjacent columns. The judging method is: compute S(j+1) − S(j); if S(j+1) − S(j) > 0, then column j+1 is a rising edge, i.e. the left edge of a crop row; if S(j+1) − S(j) < 0, then column j is a falling edge, i.e. the right edge of a crop row.
m. Compute the crop-row centres. Set a crop-row distance threshold; when the difference between a scanned left edge and right edge is greater than the given threshold, the region is judged to be a crop-row region, and the centre pixel of the crop row is obtained by computing the midpoint of the left and right edges.
n. Move the window down one pixel and repeat from step i until the window is 20 pixels from the image bottom.
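Steps h-n can be sketched for one window position as follows (an assumed NumPy implementation for illustration only; the crop-row width threshold of step m is omitted for brevity, and the sample image is hypothetical):

```python
import numpy as np

def projection_candidates(binary, h=2):
    """Steps h-n for one window position: compute S(j) over an h-row
    band, threshold at S = A + D, and return the centre column of each
    surviving run (the crop-row candidate columns)."""
    window = binary[:h, :]                # h x 1 window swept over the columns
    S = window.sum(axis=0).astype(float)  # S(j): white pixels in column j
    A, D = S.mean(), S.std()              # mean and standard deviation
    mask = S >= A + D                     # threshold line S = A + D (step k)
    centres, j = [], 0
    while j < len(mask):                  # rising/falling edges (step l)
        if mask[j]:
            left = j
            while j < len(mask) and mask[j]:
                j += 1
            centres.append((left + j - 1) // 2)  # crop-row centre (step m)
        else:
            j += 1
    return centres

img = np.zeros((2, 12), dtype=int)
img[:, 3:5] = 1    # one crop row around columns 3-4
img[:, 9:11] = 1   # another around columns 9-10
print(projection_candidates(img))  # -> [3, 9]
```

Repeating this for each band of rows yields one candidate point per crop row per band, which are the inputs to the line extraction of step (5).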
(5) Extracting the guidance path with the Hough transform:
The guidance path is obtained by processing the candidate points with a straight-line fitting method. Considering both processing time and computation, the present invention extracts the line with a Hough transform based on a known point. Its basic principle is to first determine a known point among the candidate points, with coordinates denoted (x₀, y₀). In a farmland image this known point can represent the features of one crop row; in general the geometric centre of all candidate points is selected as the known point, computed by formula (18):

$$x_0 = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad y_0 = \frac{1}{n}\sum_{i=1}^{n} y_i \qquad (18)$$
The parameter-space curves of the known point and of a candidate point (x_i, y_i) on the same line must intersect at one point (ρ₀, θ₀), which gives formula (19):

$$\rho_0 = x_0 \cos\theta_0 + y_0 \sin\theta_0, \qquad \rho_0 = x_i \cos\theta_0 + y_i \sin\theta_0 \qquad (19)$$

Solving formula (19) yields the corresponding parameters ρ₀, θ₀, and, following the method of the conventional Hough transform, the corresponding accumulator is incremented by "1". Another candidate point is then taken and the above computation repeated to obtain its ρ, θ parameters, until all candidate points have been processed. Finding the maximum accumulator value identifies the corresponding ρ, θ, which determine the line.
According to the above principle, the main steps of line detection with the known-point Hough transform are:
o. When the image contains several lines, the candidate points must first be classified. The classification method is: first, according to the characteristics of the farmland image, initialise n matrices to hold the candidate points of each line class, and determine a distance threshold from the row spacing of the crops; then scan the image from the top. When the first candidate point is encountered, it is stored in the first matrix; scanning then continues, and when the next candidate point is encountered, the distance between it and the previous point is computed, the multiple it forms with the distance threshold is judged, and the point is stored in the corresponding matrix. After one row has been scanned, move down one pixel and repeat the above work until the entire image has been scanned.
p. Determine the known point of each candidate-point class in image space according to formula (18), subdivide the parameter space into M × N cells, assign each cell an accumulator A(i, j), and initialise it to "0".
q. Take one candidate point of the corresponding line class in image space and compute its ρ, θ parameters through formula (19): first let θ take values over the allowed subdivision range of the θ axis, then round ρ to the nearest allowed cell on the ρ axis, and add "1" to the value of that accumulator.
r. Retrieve the next candidate point of the corresponding line class in image space and repeat step q until all target points of the corresponding class have been computed. Examine the value of each accumulator and take the ρ, θ parameters corresponding to the cell with the maximum accumulator value; this gives the line equation. If there are several lines, repeat steps p-r until all line classes have been detected.
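The known-point computation of formula (18) and the vote of formula (19) can be sketched for a single line class as follows (a simplified illustration with 1-pixel/1-degree quantisation; this is not the patent's accumulator layout, and the sample points are hypothetical):

```python
import math
from collections import Counter

def hough_known_point(points):
    """Known-point Hough transform (steps o-r, one line class): pair
    every candidate point with the centroid (x0, y0) of eq. (18); each
    pair fixes one (rho, theta) cell via eq. (19), and the most-voted
    cell gives the line parameters."""
    n = len(points)
    x0 = sum(p[0] for p in points) / n    # known point = centroid, eq. (18)
    y0 = sum(p[1] for p in points) / n
    acc = Counter()
    for x, y in points:
        if x == x0 and y == y0:
            continue  # the known point itself fixes no direction
        # eq. (19): x0*cos(t) + y0*sin(t) = x*cos(t) + y*sin(t)
        theta = math.atan2(x0 - x, y - y0)
        if theta < 0:
            theta += math.pi  # normalise theta into [0, 180) degrees
        rho = x0 * math.cos(theta) + y0 * math.sin(theta)
        acc[(round(rho), round(math.degrees(theta)))] += 1
    return acc.most_common(1)[0][0]

# candidate points of one crop row lying on the line y = x + 2
pts = [(1, 3), (2, 4), (3, 5), (4, 6), (5, 7)]
print(hough_known_point(pts))  # -> (1, 135)
```

Because each candidate point votes for exactly one (ρ, θ) cell instead of a full sinusoid, the accumulation cost drops sharply compared with the conventional Hough transform, which is the motivation stated in the text.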
(6) Computing the feature point: the present invention selects one pixel at a fixed position in the guidance path as the feature point, in preparation for information fusion.
If during image processing the guidance line cannot be extracted, a default navigation baseline is set: its polar parameter θ is set to 90° and the line is placed in the middle of the image; since the size of the captured image is 640 × 480 pixels, the guidance-line parameter is then x = 320. Because a correctly extracted navigation baseline may itself be a line with θ = 90°, a judgment must be made before feature-point extraction: if the two conditions θ = 90° and x = 320 are satisfied simultaneously, the crop-row extraction has gone wrong, and a new image should be captured and the navigation baseline extracted again; if the two conditions are not satisfied simultaneously, the guidance-line extraction is considered correct, and the navigation feature point is extracted and its coordinates converted.
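The fallback judgment described above amounts to a one-line predicate; a minimal sketch (the function name is hypothetical):

```python
def is_valid_guidance_line(theta_deg, x):
    """Return True when the extracted line is trusted: a line with
    theta = 90 degrees AND x = 320 coincides exactly with the default
    baseline of a 640 x 480 image, so it is treated as a failed
    extraction and a new frame must be captured."""
    return not (theta_deg == 90 and x == 320)

print(is_valid_guidance_line(90, 320))  # default baseline -> False
print(is_valid_guidance_line(90, 100))  # genuine vertical crop row -> True
```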
In step S3, the core filtering algorithm is the unscented Kalman filter (UKF); its flow chart is shown in Fig. 9. First the reliability of each sensor's data is judged, the filter is initialised, and the data are fed in for filtering to obtain the output result. The machine vision positioning device takes approximately 165 ms to process one image, a frequency of nearly 6 Hz. To keep the data synchronised in the time domain, the output frequency of the GPS positioning device is set to 10 Hz; after the machine vision positioning device has processed an image, the GPS position information of that moment is extracted immediately, and the two groups of information of the same moment are fed into the filter together.
To guarantee the validity of the system, before the sensor data of the GPS positioning device and the machine vision positioning device are fed into the filter, the validity of the data is tested first: when one kind of sensor fails, the sensor that still works normally is selected for positioning; only when both kinds of sensors fail at the same time is positioning stopped.
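The validity gate can be sketched as follows (an illustrative decision function; the names and return values are assumptions, since the patent gives no code):

```python
def positioning_mode(gps_ok, vision_ok):
    """Validity gate described above: fuse when both sensors pass the
    data check, fall back to the single working sensor, and stop
    positioning only when both fail simultaneously."""
    if gps_ok and vision_ok:
        return "fuse"
    if gps_ok:
        return "gps-only"
    if vision_ok:
        return "vision-only"
    return "stop"

print(positioning_mode(True, False))  # -> gps-only
```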
In terms of spatial registration:
Since the position of the CCD camera can be obtained in real time by GPS, the GPS coordinate system (i.e. the earth coordinate system in the figure) is set as the master coordinate system, and the navigation-vehicle coordinate system (the visual coordinate system) is transformed into the GPS coordinate system. From the heading of the navigation vehicle, the rotation angle between the corresponding coordinate axes can be obtained; together with the position of the origin of the navigation-vehicle coordinate system in the GPS coordinate system, the two coordinate systems can then be unified.
According to the relation in Fig. 10, the position P′(X_w, Y_w) in the earth coordinate system of a point P(x, y) in the navigation-vehicle coordinate system can be computed by formula (20):

$$\begin{bmatrix} X_w \\ Y_w \end{bmatrix} = \begin{bmatrix} \cos\varphi & -\sin\varphi \\ \sin\varphi & \cos\varphi \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} + \begin{bmatrix} x_v \\ y_v \end{bmatrix} \qquad (20)$$

where φ is the heading angle of the navigation vehicle and (x_v, y_v) is the position of the origin of the navigation-vehicle coordinate system in the earth coordinate system. In terms of temporal registration, the output frequency of GPS is set to 10 Hz, and the time for the vision system to process one image is less than 200 ms, i.e. its frequency is below 10 Hz; the vision output frequency is therefore taken as the reference, and after an image has been processed the GPS position information of that moment is extracted immediately, guaranteeing the consistency of the two in time.
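The registration transform of formula (20) can be sketched as follows (the rotation sign convention is an assumption, since the original figure is not reproduced; the sample values are hypothetical):

```python
import math

def vehicle_to_earth(x, y, heading, x_v, y_v):
    """Formula (20): rotate a point (x, y) given in the vehicle
    (vision) frame by the heading angle, then translate by the
    GPS-measured camera position (x_v, y_v) in the earth frame."""
    Xw = x * math.cos(heading) - y * math.sin(heading) + x_v
    Yw = x * math.sin(heading) + y * math.cos(heading) + y_v
    return Xw, Yw

# target point 2 m straight ahead of the camera, heading 90 degrees
print(vehicle_to_earth(0.0, 2.0, math.pi / 2, 10.0, 20.0))
```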
In addition, when processing the state equation the UKF first performs the unscented transform (UT), i.e. computes the σ point set, and then carries out the filtering estimation with the state variables after the UT transform so as to reduce the estimation error. The basic principle is to approximate the probabilistic statistics of a random quantity with a limited set of parameters, and the basic steps can be summarised as:
s. Generate a set of σ points (sigma points) about x; usually 2n + 1 σ points are taken, where n is the dimension of the state vector.
t. Substitute the computed σ point set and carry out the prediction of the state.
u. Update the σ point set and substitute the observation results to update the state.
Consider the following state-space model:

x_k = A_{k−1} x_{k−1} + w_{k−1}  (21)

z_k = H_k x_k + v_k  (22)

where x_k is the state vector of the system, z_k is the measurement vector, w_k is the process noise, v_k is the measurement noise, and w_k and v_k are mutually uncorrelated zero-mean white noises with covariance matrices Q_k and R_k, respectively.
If the statistics of x satisfy E[x] = x̄ and Cov[x] = P_x, the σ points and their weight coefficients can be computed through formulas (23) and (24):

$$\xi_0 = \bar{x}, \qquad \xi_i = \bar{x} + \left(\sqrt{(n+\lambda)P_x}\right)_i, \; i = 1, \ldots, n, \qquad \xi_i = \bar{x} - \left(\sqrt{(n+\lambda)P_x}\right)_{i-n}, \; i = n+1, \ldots, 2n \qquad (23)$$

$$\omega_0^m = \frac{\lambda}{n+\lambda}, \qquad \omega_0^c = \frac{\lambda}{n+\lambda} + \left(1 - \alpha^2 + \beta\right), \qquad \omega_i^m = \omega_i^c = \frac{0.5}{n+\lambda}, \; i = 1, 2, \ldots, 2n \qquad (24)$$

where λ = α²(n + κ) − n;
n is the dimension of the state vector;
α determines the spread of the σ points and usually takes a small positive value;
κ is generally a constant;
β describes the distribution information of x;
(√((n+λ)P_x))_i denotes the i-th column of the matrix square root;
ω_i^m is the weight coefficient used when computing the first-order statistics (the mean);
ω_i^c is the weight coefficient used when computing the second-order statistics (the covariance).
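Formulas (23) and (24) can be sketched as a generic sigma-point generator (an assumed NumPy implementation with illustrative default values for α, β and κ; the patent does not state its parameter choices):

```python
import numpy as np

def sigma_points(x_mean, Px, alpha=1.0, beta=2.0, kappa=0.0):
    """Equations (23)-(24): generate 2n+1 sigma points and the weight
    sets for the mean (first-order) and covariance (second-order)."""
    n = len(x_mean)
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * Px)  # matrix square root
    pts = np.zeros((2 * n + 1, n))
    pts[0] = x_mean                          # xi_0 = x_bar
    for i in range(n):
        pts[i + 1] = x_mean + L[:, i]        # x_bar + i-th column
        pts[n + i + 1] = x_mean - L[:, i]    # x_bar - i-th column
    wm = np.full(2 * n + 1, 0.5 / (n + lam))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    return pts, wm, wc

pts, wm, wc = sigma_points(np.array([0.0, 0.0]), np.eye(2))
print(pts.shape, wm.sum())  # the mean weights always sum to 1
```

Note that the mean weights sum to 1 by construction, so the weighted mean of the sigma points reproduces x̄ exactly.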
The concrete steps of the UKF filtering loop are as follows:
v. Compute the σ points ξ_{k−1}^{(i)} through formula (23), with P_x taken as P_{k−1} and x̄ taken as x_{k−1}.
w. Propagate the σ points through the state equation by formula (25) and predict the state and filtering error covariance of the system:

$$\xi_k^{(i)} = A_{k-1}\xi_{k-1}^{(i)}, \; i = 0, 1, \ldots, 2n, \qquad \hat{x}_k = \sum_{i=0}^{2n} \omega_i^m \xi_k^{(i)}, \qquad \hat{P}_k = \sum_{i=0}^{2n} \omega_i^c \left(\xi_k^{(i)} - \hat{x}_k\right)\left(\xi_k^{(i)} - \hat{x}_k\right)^T + Q_{k-1} \qquad (25)$$

x. Update the σ points through formula (23), with P_x taken as $\hat{P}_k$ and x̄ taken as $\hat{x}_k$.
y. Compute the output prediction through formula (26):

$$\zeta_k^{(i)} = H_k\xi_k^{(i)}, \qquad \hat{z}_k = \sum_{i=0}^{2n} \omega_i^m \zeta_k^{(i)}, \qquad P_z = \sum_{i=0}^{2n} \omega_i^c \left(\zeta_k^{(i)} - \hat{z}_k\right)\left(\zeta_k^{(i)} - \hat{z}_k\right)^T + R, \qquad P_{x,z} = \sum_{i=0}^{2n} \omega_i^c \left(\xi_k^{(i)} - \hat{x}_k\right)\left(\zeta_k^{(i)} - \hat{z}_k\right)^T \qquad (26)$$

z. After obtaining the measurement z_k, update the predicted system state and the predicted filtering error variance matrix according to formula (27):

$$K_k = P_{x,z} P_z^{-1}, \qquad x_k = \hat{x}_k + K_k \left(z_k - \hat{z}_k\right), \qquad P_k = \hat{P}_k - K_k P_z K_k^T \qquad (27)$$

where K_k is the filter gain matrix.
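Steps v-z can be sketched as one self-contained UKF cycle for a linear model of the form (21)-(22) (an illustrative NumPy implementation with assumed toy matrices, not the patent's code):

```python
import numpy as np

def ukf_step(x, P, z, A, H, Q, R, alpha=1.0, beta=2.0, kappa=0.0):
    """One UKF cycle (steps v-z): predict with eq. (25), form the
    output prediction of eq. (26), then update with eq. (27)."""
    n = len(x)
    lam = alpha**2 * (n + kappa) - n

    def sigmas(m, C):
        # eq. (23): 2n+1 points from the columns of chol((n+lam)*C)
        L = np.linalg.cholesky((n + lam) * C)
        return np.vstack([m, m + L.T, m - L.T])

    wm = np.full(2 * n + 1, 0.5 / (n + lam)); wm[0] = lam / (n + lam)
    wc = wm.copy(); wc[0] += 1.0 - alpha**2 + beta

    # steps v-w / eq. (25): propagate sigma points through the state equation
    xi = sigmas(x, P) @ A.T
    x_pred = wm @ xi
    P_pred = (wc * (xi - x_pred).T) @ (xi - x_pred) + Q

    # steps x-y / eq. (26): regenerate sigma points, predict the output
    xi = sigmas(x_pred, P_pred)
    zeta = xi @ H.T
    z_pred = wm @ zeta
    Pz = (wc * (zeta - z_pred).T) @ (zeta - z_pred) + R
    Pxz = (wc * (xi - x_pred).T) @ (zeta - z_pred)

    # step z / eq. (27): gain, state update, covariance update
    K = Pxz @ np.linalg.inv(Pz)
    return x_pred + K @ (z - z_pred), P_pred - K @ Pz @ K.T

A = np.array([[1.0, 1.0], [0.0, 1.0]])   # toy constant-velocity model
H = np.array([[1.0, 0.0]])               # only the position is measured
x, P = ukf_step(np.zeros(2), np.eye(2), np.array([1.0]),
                A, H, 0.01 * np.eye(2), np.array([[0.1]]))
print(np.round(x, 4))
```

For a linear model the unscented transform is exact, so this cycle reproduces the ordinary Kalman filter update; the benefit of the sigma-point formulation appears when the state or measurement equations are nonlinear, as in the coordinate transform of formula (20).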
In summary, the fusion positioning method of step S3 is: according to the characteristics of the operating environment of the navigation vehicle and the characteristics of machine vision positioning and GPS positioning, the fusion model (i.e. the spatial registration model) shown in Fig. 11 is established.
The figure contains two coordinate systems: one is the earth coordinate system and the other is the visual coordinate system (the navigation-vehicle coordinate system); the two can be converted into one unified coordinate system by rotation and translation. The target point in the figure is the feature point, representing the crop-row information, in the destination path obtained with the image processing algorithm above, and its coordinates in the visual coordinate system are (x′_p, y′_p). The position of the CCD camera in the earth coordinate system is obtained in real time by GPS, with coordinates (x_v, y_v); with this coordinate and the heading φ of the navigation vehicle, the visual coordinate system can be transformed into the earth coordinate system according to formula (20), yielding the coordinates (x_p, y_p) of the target point in the earth coordinate system. The output of this algorithm is the position coordinates, in earth coordinates, of points of the guidance path; in addition, GPS provides not only the position of the CCD camera in earth coordinates but also the heading φ and speed V of the navigation vehicle.
According to the above fusion model and the UKF filtering principle, the state vector x_k of the system at time k is determined as:

$$x_k = \left(x_{v,k}, \; y_{v,k}, \; u_{v,k}, \; v_{v,k}, \; \varphi_k, \; x_{p,k}, \; y_{p,k}\right)^T \qquad (28)$$

where:
(x_{v,k}, y_{v,k}) is the position coordinate of the navigation vehicle in the earth coordinate system at time k, the earth coordinates being an east-north plane coordinate system;
(u_{v,k}, v_{v,k}) are the velocities of the navigation vehicle in the x and y directions, respectively, at time k;
φ_k is the heading angle of the navigation vehicle at time k;
(x_{p,k}, y_{p,k}) is the coordinate of the target point in the earth coordinate system at time k.
The dead-reckoning principle is widely applied in automatic navigation; its basic principle is shown in Fig. 12.
Under the condition that the navigation vehicle drives at constant speed, the position of the previous moment and the position of the current moment are related as in formula (29):

$$x_k = x_{k-1} + u_{v,k-1}\,\Delta t, \qquad y_k = y_{k-1} + v_{v,k-1}\,\Delta t \qquad (29)$$

According to the above dead-reckoning principle, the state-transition matrix A from time k to time k+1 is established as in formula (30):

$$A = \begin{bmatrix} 1 & 0 & \Delta t & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & \Delta t & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & \Delta t & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & \Delta t & 0 & 0 & 1 \end{bmatrix} \qquad (30)$$
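The state-transition matrix of formula (30) can be written down directly (an assumed NumPy sketch, with the state ordered as (x_v, y_v, u_v, v_v, φ, x_p, y_p); the sample state is hypothetical):

```python
import numpy as np

def transition_matrix(dt):
    """Formula (30): 7x7 state-transition matrix.  Both the vehicle
    position and the target point are dead-reckoned with the vehicle
    velocity, as in formula (29); velocities and heading are held
    constant over the step."""
    A = np.eye(7)
    A[0, 2] = dt  # x_v <- x_v + u_v * dt
    A[1, 3] = dt  # y_v <- y_v + v_v * dt
    A[5, 2] = dt  # x_p <- x_p + u_v * dt
    A[6, 3] = dt  # y_p <- y_p + v_v * dt
    return A

# vehicle at the origin moving at (1, 2) m/s, target point at (5, 6)
x = np.array([0.0, 0.0, 1.0, 2.0, 0.5, 5.0, 6.0])
print((transition_matrix(0.1) @ x).round(4).tolist())
```

Note that rows 6 and 7 advance the target point with the vehicle velocity, reflecting that the target point is expressed in earth coordinates while the camera that observes it moves with the vehicle.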
Taking the output information of the GPS positioning device and of the machine vision positioning device as observations, the measurement vector z_k at time k is obtained (31); it is formed from the GPS outputs (the vehicle position (x_{v,k}, y_{v,k}) and heading φ_k in the earth coordinate system) and the vision output (the target-point coordinates (x′_{p,k}, y′_{p,k}) in the visual coordinate system). According to the relation between the sensor output data and the state vector, the relation between the target-point position coordinates (x′_{p,k}, y′_{p,k}) in the visual coordinate system and the target-point position coordinates (x_{p,k}, y_{p,k}) in the earth coordinate system is obtained as shown in (32), the inverse of the transform of formula (20):

$$\begin{aligned} x'_{p,k} &= (x_{p,k} - x_{v,k})\cos\varphi_k + (y_{p,k} - y_{v,k})\sin\varphi_k \\ y'_{p,k} &= -(x_{p,k} - x_{v,k})\sin\varphi_k + (y_{p,k} - y_{v,k})\cos\varphi_k \end{aligned} \qquad (32)$$

and through formula (33) the transition matrix H between the state vector and the measurement vector is established for the prediction of the next moment (formula (33) appears only as an image in the source and is not reproduced here).
The integrated navigation and positioning system and method fuse the absolute position information obtained by GPS with the relative position information obtained by machine vision, preventing both the low positioning accuracy caused by weak GPS signals and the loss of positioning information caused by lost visual information; in addition, through the fusion of information, offset correction can be applied to the information of the two kinds of sensors, improving the positioning accuracy of the system while also improving its reliability.
The above embodiments are only used to illustrate the present invention and are not a limitation of it. Those of ordinary skill in the relevant technical field can make various changes and modifications without departing from the spirit and scope of the present invention; therefore all equivalent technical schemes also belong to the scope of the present invention, and the patent protection scope of the present invention shall be defined by the claims.

Claims (10)

1. An integrated navigation and positioning system based on GPS and machine vision, characterised in that the system comprises:
a GPS positioning device, used to perform GPS positioning of the navigation vehicle, obtain the position coordinates, heading angle and travel speed of the navigation vehicle, and send them to the fusion positioning device;
a machine vision positioning device, used to collect the farmland images on the guidance path, perform image processing on the collected images, extract the guidance path, obtain the position coordinates of known points on the guidance path, and send them to the fusion positioning device;
a fusion positioning device, used to perform spatial registration and temporal registration on the information from said GPS positioning device and machine vision positioning device, and perform filtering processing to obtain the final positioning information.
2. The integrated navigation and positioning system based on GPS and machine vision according to claim 1, characterised in that said GPS positioning device further comprises:
a data acquisition module, used to perform GPS positioning of the navigation vehicle and obtain the position coordinates, heading angle and travel speed of the navigation vehicle;
a data processing module, which transforms the information collected by the data acquisition module into a plane coordinate system through the Gauss projection transformation, and saves it, displays it and sends it to said fusion positioning device.
3. The integrated navigation and positioning system based on GPS and machine vision according to claim 2, characterised in that said data processing module further comprises:
a microcontroller, used to transform the latitude and longitude information collected by the data acquisition module into the plane coordinate system through the Gauss projection transformation;
a communication unit, used to send the positioning information processed by said microcontroller to the fusion positioning device.
4. The integrated navigation and positioning system based on GPS and machine vision according to claim 1, characterised in that said machine vision positioning device further comprises:
an image acquisition module, used to collect the farmland images on the guidance path;
an image processing module, which processes the images collected by the image acquisition module, extracts the guidance path, obtains the position coordinates of known points on the guidance path, and sends them to said fusion positioning device.
5. The integrated navigation and positioning system based on GPS and machine vision according to claim 1, characterised in that said fusion positioning device further comprises:
a data parsing module, which performs spatial registration and temporal registration on the information sent from said GPS positioning device and machine vision positioning device;
a fusion positioning module, which performs filtering processing on the data processed by the data parsing module to obtain the final positioning information.
6. The integrated navigation and positioning system based on GPS and machine vision according to claim 2, characterised in that said GPS positioning device is an RTK GPS receiver performing virtual reference station (VRS) differential correction and transmits data in asynchronous serial mode.
7. The integrated navigation and positioning system based on GPS and machine vision according to claim 4, characterised in that said image acquisition module is a CCD camera.
8. An integrated navigation and positioning method based on GPS and machine vision, based on the system of any one of claims 1-7, characterised in that the method comprises the steps:
S1. the GPS positioning device performs GPS positioning of the navigation vehicle and sends the position coordinates, heading angle and travel speed of the navigation vehicle to the fusion positioning device;
S2. the machine vision positioning device collects the farmland images on the guidance path, performs image processing on the collected images, extracts the guidance path, and sends the position coordinates of the feature points on the guidance path to the fusion positioning device;
S3. the fusion positioning device performs spatial registration and temporal registration on the information from said GPS positioning device and machine vision positioning device, and performs filtering processing to obtain the final positioning information.
9. The integrated navigation and positioning method based on GPS and machine vision according to claim 8, characterised in that in step S3, said image processing comprises: image gray-scale transformation, image segmentation, image denoising, extraction of guidance-path candidate points, extraction of the guidance path by the Hough transform, and computation of the feature point.
10. The integrated navigation and positioning method based on GPS and machine vision according to any one of claims 8-9, characterised in that in step S3, said filtering processing is based on the unscented Kalman filter (UKF) algorithm.
CN2011100968964A 2011-04-18 2011-04-18 Global positioning system (GPS) and machine vision-based integrated navigation and positioning system and method Pending CN102252681A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011100968964A CN102252681A (en) 2011-04-18 2011-04-18 Global positioning system (GPS) and machine vision-based integrated navigation and positioning system and method


Publications (1)

Publication Number Publication Date
CN102252681A true CN102252681A (en) 2011-11-23

Family

ID=44980098



Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102788591A (en) * 2012-08-07 2012-11-21 郭磊 Visual information-based robot line-walking navigation method along guide line
CN103149939A (en) * 2013-02-26 2013-06-12 北京航空航天大学 Dynamic target tracking and positioning method of unmanned plane based on vision
CN103196403A (en) * 2013-03-21 2013-07-10 中国农业大学 Earth volume measurement method based on GPS control ground system
CN103390273A (en) * 2013-07-19 2013-11-13 哈尔滨工程大学 Multi-beam side-scan sonar image registration method based on GPS (global positioning system) positioning assistance
CN103679775A (en) * 2014-01-03 2014-03-26 中南大学 Farmland operation zone boundary modeling method with combination of lines and curves
CN103809155A (en) * 2014-01-17 2014-05-21 西北农林科技大学 ZigBee-based quadrocopter farmland positioning system
CN104133192A (en) * 2014-08-14 2014-11-05 西安电子科技大学 Agricultural machine navigation system and method applied to small and medium-sized farmland
CN104567872A (en) * 2014-12-08 2015-04-29 中国农业大学 Extraction method and system of agricultural implements leading line
CN104697544A (en) * 2015-04-02 2015-06-10 芜湖航飞科技股份有限公司 Navigation system based on machine vision and GPS
CN105245281A (en) * 2015-08-31 2016-01-13 深圳市艾励美特科技有限公司 Industrial concentrator system and signal transmission method thereof
CN105388908A (en) * 2015-12-11 2016-03-09 国网四川省电力公司电力应急中心 Machine vision-based unmanned aerial vehicle positioned landing method and system
CN105809632A (en) * 2014-12-31 2016-07-27 中国科学院深圳先进技术研究院 Method for removing noise from radar images of predetermined crops
CN106502252A (en) * 2016-12-05 2017-03-15 聊城大学 The tractor navigation control system of Multi-sensor Fusion and its positioning, control method
CN106949881A (en) * 2017-02-24 2017-07-14 浙江大学 A kind of mobile robot fast vision localization method
CN107290309A (en) * 2017-05-25 2017-10-24 浙江大学 Field rice mildew automatic detection device and detection method based on fluorescence imaging
CN108230420A (en) * 2016-12-15 2018-06-29 千寻位置网络有限公司 The drawing practice in place to be drawn
CN108399743A (en) * 2018-02-07 2018-08-14 武汉理工大学 A kind of vehicle on highway anomaly detection method based on GPS data
CN108572380A (en) * 2017-06-17 2018-09-25 苏州博田自动化技术有限公司 A kind of air navigation aid and its application based on satellite navigation and vision guided navigation
CN108614283A (en) * 2018-05-10 2018-10-02 芜湖航飞科技股份有限公司 A kind of Beidou navigation terminal device for vehicle
CN108897324A (en) * 2018-07-25 2018-11-27 吉林大学 A kind of control method, device, equipment and storage medium that unmanned vehicle is stopped
CN108955713A (en) * 2017-05-27 2018-12-07 腾讯科技(北京)有限公司 The display methods and device of driving trace
CN109029243A (en) * 2018-07-04 2018-12-18 南京理工大学 A kind of improved agricultural machinery working area measurement terminal and method
CN109115223A (en) * 2018-08-30 2019-01-01 江苏大学 A kind of full source integrated navigation system of full landform towards intelligent agricultural machinery
CN109143305A (en) * 2018-09-30 2019-01-04 百度在线网络技术(北京)有限公司 Automobile navigation method and device
CN109341690A (en) * 2018-09-25 2019-02-15 江苏大学 A kind of efficient combined navigation self-adaptive data fusion method of robust
CN109357673A (en) * 2018-10-30 2019-02-19 上海仝物云计算有限公司 Vision navigation method and device based on image
CN109474894A (en) * 2019-01-03 2019-03-15 腾讯科技(深圳)有限公司 Terminal positioning processing method, device and electronic equipment
CN109490931A (en) * 2018-09-03 2019-03-19 天津远度科技有限公司 Flight localization method, device and unmanned plane
CN109559539A (en) * 2017-09-26 2019-04-02 廖凌峣 Action positioning service collocation Object identifying DAS (Driver Assistant System) and its operation method
CN109634305A (en) * 2018-12-21 2019-04-16 国网安徽省电力有限公司淮南供电公司 UAV position and orientation method of adjustment and system based on visual aids positioning
CN110178099A (en) * 2017-05-26 2019-08-27 广州极飞科技有限公司 Unmanned plane course determines method and unmanned plane
CN110196053A (en) * 2019-06-13 2019-09-03 内蒙古大学 A kind of real-time field robot vision navigation method and system based on FPGA
CN110243372A (en) * 2019-06-18 2019-09-17 北京中科原动力科技有限公司 Intelligent agricultural machinery navigation system and method based on machine vision
CN110274600A (en) * 2019-07-10 2019-09-24 达闼科技(北京)有限公司 Obtain the method, apparatus and system of robot GPS information
CN110490830A (en) * 2019-08-22 2019-11-22 中国农业科学院农业信息研究所 A kind of agricultural remote sensing method for correcting image and system
CN110501736A (en) * 2019-08-28 2019-11-26 武汉大学 Utilize vision imaging and GNSS distance measuring signal close coupling positioning system and method
CN110806753A (en) * 2014-02-06 2020-02-18 洋马株式会社 Parallel travel work system
CN110979853A (en) * 2019-12-20 2020-04-10 湖北师范大学 Automatic packaging method and system based on machine vision
CN111380405A (en) * 2018-12-29 2020-07-07 北京理工大学 Guidance control system of high-dynamic aircraft with strapdown seeker
CN112533474A (en) * 2018-07-31 2021-03-19 株式会社久保田 Travel route generation system, travel route generation method, travel route generation program, recording medium containing travel route generation program, work management system, work management method, work management program, recording medium containing work management program, harvester, travel pattern creation system, travel pattern creation program, recording medium containing travel pattern creation program, and travel pattern creation method
CN113721625A (en) * 2021-08-31 2021-11-30 平安科技(深圳)有限公司 AGV control method, device, equipment and storage medium
US20210373181A1 (en) * 2016-06-30 2021-12-02 Faraday&Future Inc. Geo-fusion between imaging device and mobile device
CN114114369A (en) * 2022-01-27 2022-03-01 智道网联科技(北京)有限公司 Autonomous vehicle positioning method and apparatus, electronic device, and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101008671A (en) * 2006-12-29 2007-08-01 深圳市赛格导航科技股份有限公司 Method, system and device for accurately navigating mobile station
CN101452072A (en) * 2008-12-26 2009-06-10 东南大学 Electronic information system for earth monitor and method thereof
CN101661096A (en) * 2008-08-29 2010-03-03 夏晓清 Method and system for generating virtual reference station based on triangular approximation algorithm
CN101661097A (en) * 2008-08-29 2010-03-03 夏晓清 Method and system for high-precision positioning for mobile terminal


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Chen Yan et al., "Integrated navigation and positioning method based on GPS and machine vision", Transactions of the Chinese Society of Agricultural Engineering *
Chen Yan et al., "Research on a machine-vision automatic navigation and positioning algorithm based on the Kalman filter", Proceedings of the 2009 Annual Conference of the Chinese Society of Agricultural Engineering (CSAE 2009) *

Cited By (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102788591A (en) * 2012-08-07 2012-11-21 郭磊 Visual information-based robot line-walking navigation method along guide line
CN103149939B (en) * 2013-02-26 2015-10-21 北京航空航天大学 Vision-based dynamic target tracking and positioning method for an unmanned aerial vehicle
CN103149939A (en) * 2013-02-26 2013-06-12 北京航空航天大学 Dynamic target tracking and positioning method of unmanned plane based on vision
CN103196403A (en) * 2013-03-21 2013-07-10 中国农业大学 Earth volume measurement method based on GPS control ground system
CN103196403B (en) * 2013-03-21 2015-11-18 中国农业大学 Earth volume measurement method based on a GPS-controlled land leveling system
CN103390273A (en) * 2013-07-19 2013-11-13 哈尔滨工程大学 Multi-beam side-scan sonar image registration method based on GPS (global positioning system) positioning assistance
CN103390273B (en) * 2013-07-19 2015-12-02 哈尔滨工程大学 Multi-beam side-scan sonar image registration method assisted by GPS positioning
CN103679775A (en) * 2014-01-03 2014-03-26 中南大学 Farmland operation zone boundary modeling method with combination of lines and curves
CN103679775B (en) * 2014-01-03 2017-01-25 中南大学 Farmland operation zone boundary modeling method with combination of lines and curves
CN103809155A (en) * 2014-01-17 2014-05-21 西北农林科技大学 ZigBee-based quadrocopter farmland positioning system
CN103809155B (en) * 2014-01-17 2015-09-16 西北农林科技大学 ZigBee-based quadcopter farmland positioning system
CN110806753B (en) * 2014-02-06 2023-08-29 洋马动力科技有限公司 Parallel travel work system
CN110806753A (en) * 2014-02-06 2020-02-18 洋马株式会社 Parallel travel work system
CN104133192A (en) * 2014-08-14 2014-11-05 西安电子科技大学 Agricultural machine navigation system and method applied to small and medium-sized farmland
CN104567872A (en) * 2014-12-08 2015-04-29 中国农业大学 Extraction method and system of agricultural implements leading line
CN104567872B (en) * 2014-12-08 2018-09-18 中国农业大学 Method and system for extracting the guidance line of agricultural machinery and implements
CN105809632A (en) * 2014-12-31 2016-07-27 中国科学院深圳先进技术研究院 Method for removing noise from radar images of predetermined crops
CN105809632B (en) * 2014-12-31 2018-11-23 中国科学院深圳先进技术研究院 Method for removing noise from radar images of predetermined crops
CN104697544A (en) * 2015-04-02 2015-06-10 芜湖航飞科技股份有限公司 Navigation system based on machine vision and GPS
CN105245281A (en) * 2015-08-31 2016-01-13 深圳市艾励美特科技有限公司 Industrial concentrator system and signal transmission method thereof
CN105245281B (en) * 2015-08-31 2018-02-13 深圳市艾励美特科技有限公司 Industrial concentrator system and signal transmission method thereof
CN105388908A (en) * 2015-12-11 2016-03-09 国网四川省电力公司电力应急中心 Machine vision-based unmanned aerial vehicle positioned landing method and system
US20210373181A1 (en) * 2016-06-30 2021-12-02 Faraday&Future Inc. Geo-fusion between imaging device and mobile device
CN106502252B (en) * 2016-12-05 2024-02-02 山东双力现代农业装备有限公司 Control method of multi-sensor fusion tractor navigation control system
CN106502252A (en) * 2016-12-05 2017-03-15 聊城大学 Multi-sensor fusion tractor navigation control system and positioning and control methods thereof
CN108230420A (en) * 2016-12-15 2018-06-29 千寻位置网络有限公司 Mapping method for a site to be mapped
CN106949881A (en) * 2017-02-24 2017-07-14 浙江大学 Fast visual positioning method for a mobile robot
CN107290309A (en) * 2017-05-25 2017-10-24 浙江大学 Field rice mildew automatic detection device and detection method based on fluorescence imaging
CN110178099A (en) * 2017-05-26 2019-08-27 广州极飞科技有限公司 Unmanned aerial vehicle heading determination method and unmanned aerial vehicle
CN110178099B (en) * 2017-05-26 2022-05-10 广州极飞科技股份有限公司 Unmanned aerial vehicle course determining method and unmanned aerial vehicle
CN108955713A (en) * 2017-05-27 2018-12-07 腾讯科技(北京)有限公司 Method and device for displaying a driving track
CN108572380B (en) * 2017-06-17 2022-02-08 苏州博田自动化技术有限公司 Navigation method based on satellite navigation and visual navigation and application thereof
CN108572380A (en) * 2017-06-17 2018-09-25 苏州博田自动化技术有限公司 Navigation method based on satellite navigation and visual navigation, and application thereof
CN109559539A (en) * 2017-09-26 2019-04-02 廖凌峣 Driver assistance system combining mobile location services with object recognition, and operation method thereof
CN108399743B (en) * 2018-02-07 2021-09-07 武汉理工大学 Highway vehicle abnormal behavior detection method based on GPS data
CN108399743A (en) * 2018-02-07 2018-08-14 武汉理工大学 Highway vehicle abnormal behavior detection method based on GPS data
CN108614283A (en) * 2018-05-10 2018-10-02 芜湖航飞科技股份有限公司 Beidou navigation terminal device for vehicles
CN109029243A (en) * 2018-07-04 2018-12-18 南京理工大学 Improved agricultural machinery working area measurement terminal and method
CN109029243B (en) * 2018-07-04 2021-02-26 南京理工大学 Improved agricultural machinery working area measuring terminal and method
CN108897324A (en) * 2018-07-25 2018-11-27 吉林大学 Control method, apparatus, device and storage medium for parking an unmanned vehicle
CN112533474A (en) * 2018-07-31 2021-03-19 株式会社久保田 Travel route generation system, travel route generation method, travel route generation program, recording medium containing travel route generation program, work management system, work management method, work management program, recording medium containing work management program, harvester, travel pattern creation system, travel pattern creation program, recording medium containing travel pattern creation program, and travel pattern creation method
CN109115223A (en) * 2018-08-30 2019-01-01 江苏大学 All-terrain, all-source integrated navigation system for intelligent agricultural machinery
CN109490931A (en) * 2018-09-03 2019-03-19 天津远度科技有限公司 Flight positioning method and device, and unmanned aerial vehicle
CN109341690B (en) * 2018-09-25 2022-03-22 江苏大学 Robust and efficient combined navigation self-adaptive data fusion method
CN109341690A (en) * 2018-09-25 2019-02-15 江苏大学 Robust and efficient adaptive data fusion method for integrated navigation
CN109143305A (en) * 2018-09-30 2019-01-04 百度在线网络技术(北京)有限公司 Automobile navigation method and device
CN109357673A (en) * 2018-10-30 2019-02-19 上海仝物云计算有限公司 Image-based visual navigation method and device
CN109634305A (en) * 2018-12-21 2019-04-16 国网安徽省电力有限公司淮南供电公司 UAV pose adjustment method and system based on vision-assisted positioning
CN111380405B (en) * 2018-12-29 2021-01-15 北京理工大学 Guidance control system of high-dynamic aircraft with strapdown seeker
CN111380405A (en) * 2018-12-29 2020-07-07 北京理工大学 Guidance control system of high-dynamic aircraft with strapdown seeker
CN109474894B (en) * 2019-01-03 2020-09-11 腾讯科技(深圳)有限公司 Terminal positioning processing method and device and electronic equipment
CN109474894A (en) * 2019-01-03 2019-03-15 腾讯科技(深圳)有限公司 Terminal positioning processing method, device and electronic equipment
CN110196053A (en) * 2019-06-13 2019-09-03 内蒙古大学 FPGA-based real-time field robot visual navigation method and system
CN110243372A (en) * 2019-06-18 2019-09-17 北京中科原动力科技有限公司 Intelligent agricultural machinery navigation system and method based on machine vision
CN110274600B (en) * 2019-07-10 2021-08-03 达闼科技(北京)有限公司 Method, device and system for acquiring GPS (global positioning system) information of robot
CN110274600A (en) * 2019-07-10 2019-09-24 达闼科技(北京)有限公司 Method, apparatus and system for obtaining robot GPS information
CN110490830A (en) * 2019-08-22 2019-11-22 中国农业科学院农业信息研究所 Agricultural remote sensing image correction method and system
CN110490830B (en) * 2019-08-22 2021-09-24 中国农业科学院农业信息研究所 Agricultural remote sensing image correction method and system
CN110501736A (en) * 2019-08-28 2019-11-26 武汉大学 Tightly coupled positioning system and method using visual imaging and GNSS ranging signals
CN110501736B (en) * 2019-08-28 2023-10-20 武汉大学 System and method for tightly coupling positioning by utilizing visual images and GNSS ranging signals
CN110979853A (en) * 2019-12-20 2020-04-10 湖北师范大学 Automatic packaging method and system based on machine vision
CN113721625B (en) * 2021-08-31 2023-07-18 平安科技(深圳)有限公司 AGV control method, device, equipment and storage medium
CN113721625A (en) * 2021-08-31 2021-11-30 平安科技(深圳)有限公司 AGV control method, device, equipment and storage medium
CN114114369A (en) * 2022-01-27 2022-03-01 智道网联科技(北京)有限公司 Autonomous vehicle positioning method and apparatus, electronic device, and storage medium

Similar Documents

Publication Publication Date Title
CN102252681A (en) Global positioning system (GPS) and machine vision-based integrated navigation and positioning system and method
Shan et al. LeGO-LOAM: Lightweight and ground-optimized lidar odometry and mapping on variable terrain
García-Santillán et al. Automatic detection of curved and straight crop rows from images in maize fields
CN103411609B (en) Aircraft return route planning method based on online map construction
CN102368158B (en) Navigation positioning method of orchard machine
Maddern et al. Real-time kinematic ground truth for the Oxford RobotCar dataset
Kamijo et al. Autonomous vehicle technologies: Localization and mapping
CN109032174B (en) Unmanned aerial vehicle operation route planning method and operation execution method
LeVoir et al. High-accuracy adaptive low-cost location sensing subsystems for autonomous rover in precision agriculture
CN113778081B (en) Orchard path identification method and robot based on laser radar and vision
CN114413909A (en) Indoor mobile robot positioning method and system
CN114077249B (en) Operation method, operation equipment, device and storage medium
Ericson et al. Analysis of two visual odometry systems for use in an agricultural field environment
Khoshelham et al. Vehicle positioning in the absence of GNSS signals: Potential of visual-inertial odometry
Cho et al. Autonomous positioning of the unloading auger of a combine harvester by a laser sensor and GNSS
CN107255446B (en) Dwarfing close-planting fruit tree canopy three-dimensional map construction system and method
Majdik et al. Micro air vehicle localization and position tracking from textured 3d cadastral models
CN110909821B (en) Method for carrying out high-space-time resolution vegetation index data fusion based on crop reference curve
Krejsa et al. Fusion of local and global sensory information in mobile robot outdoor localization task
Rasmussen et al. Trail following with omnidirectional vision
CN115280960A (en) Combine harvester steering control method based on field vision SLAM
Velat et al. Vision based vehicle localization for autonomous navigation
Kragh et al. Multimodal obstacle detection and evaluation of occupancy grid mapping in agriculture.
Ishii et al. Autonomous UAV flight using the Total Station Navigation System in Non-GNSS Environments
Gamallo et al. Combination of a low cost GPS with visual localization based on a previous map for outdoor navigation

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20111123