CN101099162A - Vision-aided system and method for guiding a vehicle


Info

Publication number
CN101099162A
CN101099162A
Authority
CN
China
Prior art keywords
vision
data
locality data
locality
vehicle
Legal status
Granted
Application number
CNA2005800459168A
Other languages
Chinese (zh)
Other versions
CN100580689C (en)
Inventor
S. Han
J. F. Reid
T. D. Pickett
Current Assignee
Deere and Co
Original Assignee
Deere and Co
Application filed by Deere and Co
Publication of CN101099162A
Application granted
Publication of CN100580689C
Status: Active

Landscapes

  • Position Fixing By Use Of Radio Waves (AREA)
  • Navigation (AREA)

Abstract

A method and system for guiding a vehicle comprises a location-determining receiver for collecting location data for the vehicle. A vision module collects vision data for the vehicle. A location quality estimator estimates location quality data for the location data during an evaluation time window. A vision quality estimator estimates vision quality data for the vision data during the evaluation time window. A supervisor module selects a mixing ratio for the vision data and location data (or error signals associated therewith) based on the quality data.

Description

Method and system for guiding a vehicle using vision-based adjustment
Technical field
The present invention relates to a method and system for guiding a vehicle using vision-based adjustment.
Background technology
Global Positioning System (GPS) receivers have been used to provide position data for vehicle guidance applications. However, although certain GPS receivers with differential correction achieve a typical positioning error of about 10 centimeters (4 inches) during a majority of their operating time, absolute positioning errors greater than 50 centimeters (20 inches) are typical during about 5% of their operating time. Further, GPS signals may be blocked by buildings, trees, or other obstructions, which can make a GPS-only navigation system unreliable in certain locations or environments. There is therefore a need to supplement or enhance a GPS-based navigation system with one or more additional sensors to improve accuracy and robustness.
Summary of the invention
A method and system for guiding a vehicle comprises a location module (e.g., a location-determining receiver) for collecting preliminary location data for the vehicle. A vision module collects vision-derived location data for the vehicle during an evaluation time window. A vision quality estimator estimates vision quality data for the corresponding vision-derived location data collected during the evaluation time window. If the vision quality data exceeds a minimum threshold level, an adjuster adjusts the preliminary location data to revised location data based on the vision-derived location data, such that the revised location data is registered with, or generally coextensive in time with, the vision-derived location data.
Description of drawings
Fig. 1 is a block diagram of a system for guiding a vehicle based on preliminary location data and vision-derived location data in accordance with the invention.
Fig. 2 is a flow chart of a method for guiding a vehicle based on preliminary location data and vision-derived location data in accordance with the invention.
Fig. 3 is a flow chart of another method for guiding a vehicle based on preliminary location data and vision-derived location data in accordance with the invention.
Fig. 4 is a chart showing the static positioning error of location data, such as a guidance signal derived from a differential Global Positioning System (GPS) signal.
Fig. 5 is a chart showing the positioning error of location data, such as a guidance signal derived from a differential GPS signal, after "tuning" by another sensor, such as a vision module in accordance with the invention.
Embodiment
Fig. 1 is a block diagram of a guidance system 11 for guiding a vehicle. The guidance system 11 may be mounted on, or collocated with, a vehicle or a mobile robot. The guidance system 11 comprises a vision module 22 and a location-determining receiver 28, which communicate with an adjuster 110.
The vision module 22 may be associated with a vision quality estimator 20. The location-determining receiver 28 may be associated with a location quality estimator 24. The adjuster 110 may communicate with a vehicle controller 25. In turn, the vehicle controller 25 is coupled to a steering system 27.
The location-determining receiver 28 may comprise a Global Positioning System (GPS) receiver with differential correction (e.g., a GPS receiver and a receiver for receiving a differential correction signal transmitted by a satellite or terrestrial source). The location-determining receiver 28 provides location data (e.g., coordinates) of the vehicle. The location-determining receiver 28 may indicate one or more of the following conditions or states (e.g., via a status signal) to at least the adjuster 110 or the location quality estimator 24: (1) the location-determining receiver 28 is disabled, (2) the location data is unavailable or corrupt during one or more corresponding evaluation intervals, and (3) the estimated accuracy or reliability of the location data falls below a minimum threshold during one or more evaluation intervals. The location-determining receiver 28 provides location data for the vehicle that is well suited for global navigation or global path planning.
In one illustrative embodiment, the location-determining receiver 28 outputs location data in the following format:
y_gps = [E_off_gps, E_head_gps]^T, where E_off_gps is the off-track error estimated by the location-determining receiver 28 and E_head_gps is the heading error estimated by the location-determining receiver 28.
The vision module 22 may comprise an image collection system and an image processing system. The image collection system may comprise one or more of the following: (1) one or more monocular imaging systems for collecting a group of images (e.g., multiple images of the same scene with different focus settings or lens adjustments, or multiple images with different fields of view); (2) a stereo vision system (e.g., two digital imaging units separated by a known distance and orientation) for determining depth information or three-dimensional coordinates associated with points on an object in a scene; (3) a range finder (e.g., a laser range finder) for determining range measurements or three-dimensional coordinates of points on an object in a scene; (4) a light detection and ranging (LIDAR) system for detecting the speed, altitude, direction, or range of an object in a scene; (5) a scanning laser system (e.g., a laser measurement system that transmits a pulse of light and estimates the distance between the laser measurement system and an object based on the time of propagation between transmission of the pulse and reception of its reflection) for determining the distance to an object in a scene; and (6) an imaging system for collecting images via an optical micro-electromechanical system (MEMS), a free-space optical MEMS, or an integrated optical MEMS. A free-space optical MEMS uses compound semiconductors and materials with a range of refractive indexes to manipulate visible, infrared, or ultraviolet light, whereas an integrated optical MEMS uses polysilicon components to reflect, diffract, modulate, or otherwise manipulate visible, infrared, or ultraviolet light. A MEMS may be structured as switching matrixes, lenses, mirrors, or diffraction gratings fabricated in accordance with various semiconductor fabrication techniques. The images collected by the image processing system may be, for example, color, monochrome, black-and-white, or grayscale images.
The vision module 22 or the vision-derived location data may support the collection of position data (in two- or three-dimensional coordinates) corresponding to the locations of object features within an image. The vision module 22 is well suited for using (a) features or local features of the vehicle's environment, (b) position data or coordinates associated with such features, or both, to facilitate navigation of the vehicle. The local features may comprise one or more of the following: a plant row location, a fence location, a building location, a field-edge location, a boundary location, a boulder location, a rock location (e.g., greater than a minimum threshold size or volume), a ridge or furrow, a tree location, a crop-edge location, a cutting edge in other vegetation (e.g., turf), and a reference marker. The vision-derived location data or position data of local features may be used to periodically (e.g., at regular intervals) tune the preliminary location data from the location-determining receiver 28.
In one example, a reference marker may be associated with precise point coordinates. Further, other local features may be related to the position of the reference marker. The current vehicle position may be related to the fixed location of the reference marker, a local feature, or the location of the vehicle. In one embodiment, the vision module 22 may express the vision-derived location data of the vehicle location in coordinates, or in a data format that is similar or substantially equivalent to the coordinates or data format of the location-determining receiver 28.
The vision module 22 may indicate, via a status or data message to at least the adjuster 110 or the vision quality estimator 20, one or more of the following: (1) whether the vision module 22 is disabled, (2) whether the vision-derived location data is unavailable during one or more evaluation intervals, (3) whether the vision-derived location data is unstable or corrupt, and (4) whether the image data has a reliability level, performance level, or accuracy level that does not meet a threshold performance/reliability level.
In one example, the vision module 22 can identify plant row locations with errors as small as 1 centimeter for soybeans and 2.4 centimeters for corn.
In one illustrative embodiment, the vision module 22 outputs vision-derived location data in the following format:
y_vision = [E_off_vision, E_head_vision]^T, where E_off_vision is the off-track error estimated by the vision module 22 and E_head_vision is the heading error estimated by the vision module 22.
The location quality estimator 24 may comprise one or more of the following devices: a signal-strength indicator associated with the location-determining receiver 28; a bit-error-rate indicator associated with the location-determining receiver 28; or another device for measuring signal quality, an error rate, signal strength, or the performance of a signal, channel, or code transmitted for location determination. Further, for satellite-based location determination, the location quality estimator 24 may comprise a device for determining whether the location-determining receiver 28 receives a minimum number of satellite signals of sufficient signal quality (e.g., signals on the L1 band of GPS from four or more satellites) to provide reliable location data for the vehicle during an evaluation interval.
The location quality estimator 24 estimates the quality of the preliminary location data output by the location-determining receiver 28, or location quality data (e.g., Q_gps). The location quality estimator 24 may estimate the quality of the preliminary location data based on the signal strength (or bit error rate) of each signal component received by the location-determining receiver 28. The location quality estimator 24 may also estimate the quality based on any of the following factors: (1) the number of satellite signals available in an area, (2) the number of satellites acquired or received by the location-determining receiver with sufficient signal quality (e.g., a signal strength profile), and (3) whether each satellite signal has an acceptable signal level or an acceptable bit error rate (BER) or frame error rate (FER).
In one embodiment, different signal-strength ranges are associated with different corresponding quality levels. For example, the lowest signal-strength range is associated with low quality, a medium signal-strength range is associated with fair quality, and the highest signal-strength range is associated with the highest quality. Conversely, the lowest bit-error-rate range is associated with the highest quality, a medium bit-error-rate range is associated with fair quality, and the highest bit-error-rate range is associated with the lowest quality level.
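The range-to-quality mapping above can be sketched as a pair of threshold functions. The dBm and BER breakpoints below are illustrative assumptions, not values from the patent; an implementation would calibrate them to the receiver in use.

```python
# Map signal strength and bit error rate ranges to discrete quality levels.
# Thresholds are assumed for illustration only.

def quality_from_signal_strength(dbm: float) -> str:
    """Map a received signal strength (dBm) to a quality level."""
    if dbm < -140.0:        # lowest range -> low quality
        return "low"
    elif dbm < -130.0:      # medium range -> fair quality
        return "fair"
    else:                   # highest range -> highest quality
        return "high"

def quality_from_bit_error_rate(ber: float) -> str:
    """Map a bit error rate to a quality level (inverse relationship)."""
    if ber < 1e-6:          # lowest BER range -> highest quality
        return "high"
    elif ber < 1e-3:        # medium BER range -> fair quality
        return "fair"
    else:                   # highest BER range -> lowest quality
        return "low"

print(quality_from_signal_strength(-128.0))  # high
print(quality_from_bit_error_rate(1e-4))     # fair
```

Note that the two mappings run in opposite directions, as the paragraph describes: stronger signal means higher quality, while a higher bit error rate means lower quality.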
The vision quality estimator 20 estimates the quality of the vision-derived location data output by the vision module 22, or vision quality data (e.g., Q_vision). The vision quality estimator 20 may consider the illumination present during a series of time intervals in which the vision module 22 operates and acquires corresponding images. The vision quality estimator 20 may include a photodetector, a photodetector with a frequency-selective lens, a group of photodetectors with corresponding frequency-selective lenses, a charge-coupled device (CCD), a photometer, a cadmium sulfide cell, or the like. Further, the vision quality estimator 20 may include a clock or timer for time-stamping image collection times and corresponding illumination measurements (e.g., luminance values of the images). If the illumination is within a low-intensity range, the visual quality for that time interval is low; if the illumination is within a medium-intensity range, the visual quality for that time interval is high; and if the illumination is within a high-intensity range, the visual quality for that time interval may depend on sub-ranges (e.g., medium, low, or high) defined within the high-intensity range. In one example, the foregoing intensity-versus-quality ranges are applied on a per-frequency or per-color basis for visible light. In another example, the intensity-versus-quality ranges may be applied to infrared-range and ultraviolet-range frequencies differently than to visible light.
The vision quality estimation may also involve a confidence measure obtained while processing the images. If desired features (e.g., plant rows) are apparent in one or more images, the vision quality estimator 20 assigns a high image quality or a high confidence level to the corresponding images. Conversely, if the desired features are not apparent in one or more images (e.g., because crop rows are missing), the vision quality estimator 20 may assign a low image quality or a low confidence level. In one example, the confidence level is determined based on the sum of absolute differences (SAD) of the mean intensities of the column vectors of the image for a hypothesized yaw/pitch pair of the vision module 22. Yaw may be defined as the orientation of the vision module 22 in an x-y plane, and pitch may be defined as the orientation of the vision module 22 in an x-z plane, which is generally perpendicular to the x-y plane.
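The SAD-based confidence cue above can be sketched as follows. The rectification of the image under each hypothesized yaw/pitch pair is omitted, and the use of raw SAD as the confidence score is an assumption for illustration; the idea is simply that strong, regular crop rows produce large column-to-column contrast, while a featureless scene does not.

```python
# Sketch of the sum-of-absolute-differences (SAD) confidence measure:
# compute the mean intensity of each pixel column, then sum the absolute
# differences between adjacent column means. Images are lists of rows.

def column_mean_intensities(image):
    """Mean intensity of each pixel column of a grayscale image."""
    rows = len(image)
    cols = len(image[0])
    return [sum(image[r][c] for r in range(rows)) / rows for c in range(cols)]

def sad_of_columns(image):
    """Sum of absolute differences between adjacent column means."""
    means = column_mean_intensities(image)
    return sum(abs(means[i + 1] - means[i]) for i in range(len(means) - 1))

# A scene with visible rows (alternating bright/dark columns) scores higher
# than a featureless scene, so SAD can rank yaw/pitch hypotheses.
rows_scene = [[200, 40, 200, 40]] * 3   # crop rows visible
flat_scene = [[120, 120, 120, 120]] * 3  # no rows visible
print(sad_of_columns(rows_scene) > sad_of_columns(flat_scene))  # True
```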
If the vision module 22 cannot locate or reference a fixed reference feature or reference marker in an image within a maximum allotted time or threshold, or cannot reference the reference marker in the image at all, the vision module 22 may notify the vision quality estimator 20, which may degrade the quality of the vision-derived data via a quality degradation indicator.
In general, the adjuster 110 comprises a data processor, a microcontroller, a microprocessor, a digital signal processor, an embedded processor, or any other programmable (e.g., field-programmable) device programmed with software instructions. In one embodiment, the adjuster 110 comprises a rule manager. The rule manager of the adjuster 110 may apply the preliminary location data, or a derivative thereof, as the error control signal for a corresponding time interval, unless the vision quality data exceeds a minimum threshold level. No adjustment is needed unless the difference between the preliminary location data and the vision-derived location data is greater than a maximum tolerance. A vision weight determines the extent to which the contribution of the vision-derived location data (e.g., y_vision) from the vision module 22 dominates. A location weight determines the extent to which the contribution of the location data from the location module dominates. A mixer 14 determines the relative contributions of the location data (e.g., y_gps) and the vision-derived location data (e.g., y_vision) to the error control signal (e.g., y) based on the vision weight and the location weight. In one embodiment, the mixer 14 may comprise a digital filter, a digital signal processor, or another data processor arranged to apply one or more of the following: (1) the vision-derived location data weight, (2) the location data weight, and (3) a mixing ratio expressing the relative contributions of the location data and the vision-derived location data for an evaluation time interval.
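The weighted blend performed by the mixer can be sketched as below. Representing each error signal as the vector [off-track error, heading error] follows the y_gps and y_vision formats given earlier; the particular weight values and the simple convex combination are illustrative assumptions.

```python
# Sketch of the mixer: blend GPS-derived and vision-derived error vectors
# into one error control signal, weighted by the quality-driven weights.

def mix_error_signals(y_gps, y_vision, vision_weight, location_weight):
    """Blend two error vectors; weights are normalized so they sum to 1."""
    total = vision_weight + location_weight
    a, b = vision_weight / total, location_weight / total
    return [a * v + b * g for v, g in zip(y_vision, y_gps)]

y_gps = [0.10, 2.0]      # off-track error (m), heading error (deg) from GPS
y_vision = [0.02, 0.5]   # the same errors as estimated by the vision module

# High vision quality -> vision-dominated error control signal.
y = mix_error_signals(y_gps, y_vision, vision_weight=0.8, location_weight=0.2)
print([round(v, 3) for v in y])  # [0.036, 0.8]
```

A mixing ratio of 1.0 for the vision weight would reproduce y_vision exactly, and 0.0 would fall back to the GPS-only error signal, matching the rule-manager behavior described above.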
The error control signal represents the difference (or error) between the vehicle location as measured by the vision module 22 and by the location module, and the true location of the vehicle. The error control signal is input to the vehicle controller 25 to derive a compensated control signal. The compensated control signal corrects the management and control of the steering system 27 based on the error control signal. The steering system 27 may comprise an electrical interface for communicating with the vehicle controller 25. In one embodiment, the electrical interface comprises a solenoid-controlled hydraulic steering system or another electromechanical device for controlling hydraulic fluid.
In another embodiment, the steering system 27 comprises a steering system unit (SSU). The SSU may be associated with a heading-versus-time requirement to steer or direct the vehicle along a desired route or in accordance with a desired path plan. The heading is associated with a heading error (e.g., expressed as the difference between the actual heading angle and the desired heading angle).
The SSU may be controlled to compensate for errors in the estimated position of the vehicle as determined by the vision module 22 or the location-determining receiver 28. For example, an off-track error indicates or represents a comparison between the actual vehicle position (e.g., in GPS coordinates) and the desired vehicle position (e.g., in GPS coordinates). The off-track error may be used to adjust the movement of the vehicle with a compensated heading. However, if there is no off-track error at a given time or during a given time interval, an uncompensated heading is adequate. The heading error is the difference between the actual vehicle heading and the vehicle heading estimated by the vision module 22 or the location-determining receiver 28.
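The two error terms the SSU compensates for can be sketched as follows. Treating off-track error as the signed perpendicular (cross-track) distance from the actual position to the desired path, and working in a flat local x-y plane with headings in degrees, are assumptions for illustration.

```python
# Sketch of off-track and heading error computation for a straight path
# segment in a local planar frame.
import math

def off_track_error(actual, path_start, path_end):
    """Signed perpendicular distance from `actual` to the desired path line."""
    (px, py), (ax, ay), (bx, by) = actual, path_start, path_end
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    # The 2D cross product gives twice the signed triangle area;
    # dividing by the segment length yields the signed distance.
    return ((px - ax) * dy - (py - ay) * dx) / length

def heading_error(actual_deg, desired_deg):
    """Heading difference wrapped to the interval (-180, 180] degrees."""
    e = (actual_deg - desired_deg) % 360.0
    return e - 360.0 if e > 180.0 else e

# Vehicle 0.5 m to one side of an east-running path, heading 20 deg off.
print(off_track_error((1.0, 0.5), (0.0, 0.0), (2.0, 0.0)))  # -0.5
print(heading_error(350.0, 10.0))                           # -20.0
```

Wrapping the heading difference keeps the SSU from commanding a 340-degree turn when a 20-degree correction in the opposite direction suffices.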
Fig. 2 is a flow chart of a method for guiding a vehicle using vision-derived location data and location data. The method of Fig. 2 begins at step S200.
In step S200, the location-determining receiver 28 determines preliminary location data for the vehicle associated therewith. For example, the location-determining receiver 28 (e.g., a GPS receiver with differential correction) may determine coordinates of the vehicle for one or more evaluation time intervals or corresponding times. Further, in step S200, the location-determining receiver 28 may determine or derive a location error signal (e.g., y_gps) from the location data. The location error signal may represent (1) the difference between the actual vehicle location and the desired vehicle location for a desired time, (2) the difference between the actual vehicle heading and the desired vehicle heading for a desired time or position, or (3) another expression of the error associated with the location data. The location error signal may, but need not, be defined as vector data.
In step S202, the vision module 22 associated with the vehicle determines vision-derived location data for one or more of the evaluation time intervals or corresponding times. For example, the vision module 22 may collect images and process the collected images to determine the vision-derived location data. In one example, the vision-derived location data comprises vision-derived position data of the vehicle, obtained by determining the coordinates of the vehicle with reference to one or more visual reference markers or features with corresponding known locations. The coordinates of the vehicle may be determined in accordance with a global coordinate system or a local coordinate system. Further, in step S202, the vision module 22 may determine or derive a vision error signal (e.g., y_vision) from the vision-derived location data. The vision error signal represents (1) the difference between the actual vehicle location and the desired vehicle location for a desired time, (2) the difference between the actual vehicle heading and the desired vehicle heading for a desired time or position, or (3) another expression of the error associated with the vision-derived location data.
In step S204, the vision quality estimator 20 estimates vision quality data during an evaluation time window. The vision quality estimator 20 may comprise a luminance sensor or photodetector, and a timer or clock for time-stamping luminance measurements, to determine a quality level based on ambient lighting conditions. The vision quality estimator 20 may also comprise a measure of the confidence or reliability in processing the images to obtain desired features. The confidence or reliability in processing the images may depend upon any of the following factors, among others: the technical specifications (e.g., resolution) of the vision module 22, the reliability of recognizing an object (e.g., a landmark) in an image, the reliability of estimating the location of the recognized object or a point thereon, and the reliability of converting image coordinates or local coordinates into global coordinates or vision-derived location data that is spatially and temporally consistent with the location data from the location-determining receiver 28.
Step S204 may be carried out by various techniques, which may be applied alternately or cumulatively. Under a first technique, the vision quality estimator 20 may estimate a confidence level or reliability level in the accuracy of the vision-derived location data. Under a second technique, the vision quality estimator 20 first estimates a confidence level, reliability level, or another quality level in the accuracy of the vision-derived location data; second, the vision quality estimator 20 converts the quality level into a corresponding linguistic value.
In step S206, if the vision quality data exceeds a minimum threshold level, the adjuster 110 adjusts the preliminary location data to revised location data based on the vision-derived location data, such that the revised location data is registered with, or generally coextensive in time with, the vision-derived location data. For example, the adjuster 110 adjusts the preliminary location data for any time slot or evaluation time window in which the vision quality data exceeds the minimum threshold level. Registered with, or generally coextensive in time with, means that the vision-derived location data and the preliminary location data for the same time interval are generally coextensive or differ by no more than a maximum tolerance (which may be expressed as a distance, a vector, or a spacing between geometric coordinates in seconds or other units). For example, the maximum tolerance may be set to a particular distance (e.g., 2.54 centimeters) within a range of 1 centimeter to 10 centimeters.
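The decision rule of step S206 can be sketched for a single (one-dimensional) location component as below. The quality scale, the specific threshold values, and full replacement by the vision-derived value as the adjustment policy are assumptions for illustration; the 2.54 cm tolerance is the example value from the text.

```python
# Sketch of step S206: adjust the preliminary location data toward the
# vision-derived data only when the vision quality clears the minimum
# threshold AND the two estimates disagree by more than the tolerance.

MIN_VISION_QUALITY = 0.7   # minimum threshold level (assumed 0..1 scale)
MAX_TOLERANCE_M = 0.0254   # e.g., 2.54 cm, within the 1-10 cm range

def revise_location(preliminary_m, vision_m, vision_quality):
    """Return revised location data for one evaluation time slot."""
    if vision_quality <= MIN_VISION_QUALITY:
        return preliminary_m    # vision not trusted: no adjustment
    if abs(preliminary_m - vision_m) <= MAX_TOLERANCE_M:
        return preliminary_m    # already within the maximum tolerance
    return vision_m             # register with the vision-derived data

print(revise_location(10.00, 10.10, vision_quality=0.9))  # 10.1
print(revise_location(10.00, 10.10, vision_quality=0.5))  # 10.0
print(revise_location(10.00, 10.01, vision_quality=0.9))  # 10.0
```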
In one embodiment, the adjuster 110 transmits an error control signal, based on the preliminary location data or the revised location data, to the vehicle controller 25, or otherwise makes the error control signal available to the vehicle controller 25. The revised location data, or the error control signal derived from it, is updated on a time-slot-by-time-slot basis (e.g., during an application time window). Each time slot may be commensurate in scope with an evaluation time interval.
By using vision-derived location data of verified quality as a quality benchmark for the preliminary location data, the adjuster 110 can enhance the reliability and accuracy of the revised location data or position information provided for navigation or control of the vehicle. Although the preliminary location data and the vision quality data are collected during an evaluation time window, the adjustment of step S206 to produce the revised location data may be applied during an application time window that lags the evaluation time window or is substantially coextensive in time with the evaluation time interval. Regardless of how the evaluation time window and the application time window are defined in this example, in other examples the adjuster 110 may provide predictive control data, feedforward control data, or feedback control data to the vehicle controller 25.
The method of Fig. 3 is similar to the method of Fig. 2, except that the method of Fig. 3 includes an additional step S205 and substitutes step S208 for step S206. Like reference numbers indicate like steps or procedures.
In step S205, the location quality estimator 24 estimates location quality data for the location data during an evaluation time window. Step S205 may be carried out by various techniques, which may be applied alternately or cumulatively. Under a first technique, the location quality estimator 24 may estimate or measure signal quality, an error rate (e.g., bit error rate or frame error rate), a signal strength level (e.g., in dBm), or another quality level. Under a second technique, the location quality estimator 24 first estimates or measures signal quality, an error rate (e.g., bit error rate or frame error rate), a signal strength level (e.g., in dBm), or another quality level; second, the location quality estimator 24 classifies the signal quality data into ranges, linguistic descriptions, linguistic values, or otherwise.
In step S208, if the vision quality data exceeds a minimum threshold level and if the location quality data is less than or equal to a trigger threshold level, the adjuster 110 adjusts the preliminary location data to revised location data based on the vision-derived location data, such that the revised location data is registered with, or generally coextensive in time with, the vision-derived location data. For example, the adjuster 110 may adjust the preliminary location data for any time slot or evaluation time window in which the vision quality data exceeds the minimum threshold level and the location quality data is less than or equal to the trigger threshold level. The trigger threshold level may apply where the reliability or accuracy of the preliminary location data is less than desired, because of a shortage of available satellites or low received signal quality (e.g., low signal strength) of a satellite signal or an ancillary transmission (e.g., a terrestrial reference). By using vision-derived location data of verified quality as a quality benchmark for the preliminary location data, the adjuster 110 can enhance the reliability and accuracy of the revised location data or position information provided for navigation or control of the vehicle. By imposing the additional condition that the location data quality be below a standard (e.g., the trigger threshold level), the method of Fig. 3 proceeds to the adjustment of the revised location data in a more selective manner than that of Fig. 2.
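The more selective gating of step S208 reduces to a two-condition predicate, sketched below. The numeric thresholds and the 0-to-1 quality scale are assumptions for illustration.

```python
# Sketch of the step S208 trigger: adjust only when vision quality exceeds
# its minimum threshold AND location quality has fallen to or below the
# trigger threshold (e.g., degraded satellite reception).

MIN_VISION_QUALITY = 0.7        # minimum threshold level (assumed scale)
TRIGGER_LOCATION_QUALITY = 0.4  # trigger threshold level (assumed scale)

def should_adjust(vision_quality: float, location_quality: float) -> bool:
    return (vision_quality > MIN_VISION_QUALITY
            and location_quality <= TRIGGER_LOCATION_QUALITY)

print(should_adjust(0.9, 0.3))  # True  (good vision, degraded GPS)
print(should_adjust(0.9, 0.8))  # False (GPS still trusted)
print(should_adjust(0.5, 0.3))  # False (vision not trusted)
```

Compared with the rule of step S206, the extra location-quality condition leaves good GPS data untouched even when the vision data is also of high quality.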
The chart of Fig. 4 shows the static positioning error of location data, such as a differential GPS signal. The vertical axis represents the error distance (e.g., meters), and the horizontal axis represents time (e.g., seconds).
The chart of Fig. 5 shows the dynamic positioning error of location data, such as a differential GPS signal (e.g., location data) after "tuning" at a desired update frequency or rate. The vertical axis represents the error distance (e.g., meters), and the horizontal axis represents time (e.g., seconds). Fig. 5 shows the original error without "tuning" as filled circles and the error after "tuning" as open circles. The tuning is obtained by using vision-derived location data to adjust the location data at regular intervals (e.g., 5-second intervals, or 0.2 Hz, as shown in Fig. 5).
Having described the preferred embodiment, it will become apparent that various modifications can be made without departing from the scope of the invention as defined in the accompanying claims.

Claims (16)

1. A method for guiding a vehicle, the method comprising:
collecting preliminary location data for the vehicle during an evaluation time window based on a location-determining receiver associated with the vehicle;
collecting vision-derived location data for the vehicle during the evaluation time window based on a vision module associated with the vehicle;
estimating vision quality data for the vision-derived location data during the evaluation time window; and
adjusting the preliminary location data to revised location data based on the vision-derived location data if the vision quality data exceeds a minimum threshold level, such that the revised location data is registered with, or generally coextensive in time with, the vision-derived location data.
2. The method according to claim 1, wherein the adjusting comprises adjusting the preliminary location data if the vision quality data exceeds the minimum threshold level and if the vision-derived location data is substantially the same as the preliminary location data within a predetermined tolerance.
3. The method according to claim 1, further comprising designating the vision-derived location data as fit if the vision quality data corresponding to the vision-derived location data meets or exceeds the minimum threshold level.
4. The method according to claim 1, further comprising determining whether the preliminary location data for the evaluation time window is consistent with the vision-derived location data if the vision quality data exceeds the minimum threshold level.
5. The method according to claim 1, further comprising repeating the foregoing steps for a next evaluation time window of less than or equal to about 0.2 seconds.
6. The method according to claim 1, wherein the adjusting is accomplished by providing the revised location data at a frequency of 5 Hertz.
7. The method according to claim 1, wherein the collected vision-derived location data estimates the vehicle location with respect to one or more visual reference landmarks in a field of view, each visual reference landmark having known geometric coordinates.
8. The method according to claim 7, wherein the visual reference landmarks comprise plant rows.
9. A system for guiding a vehicle, the system comprising:
a location module for collecting preliminary location data for the vehicle during an evaluation time window based on a location-determining receiver associated with the vehicle;
a vision module for collecting vision-derived location data for the vehicle during the evaluation time window;
a vision quality estimator for estimating vision quality data for the vision-derived location data during the evaluation time window; and
an adjuster for adjusting the preliminary location data to revised location data based on the vision-derived location data if the vision quality data exceeds a minimum threshold level, such that the revised location data is registered with, or generally coextensive in time with, the vision-derived location data.
10. The system according to claim 9, wherein the adjuster adjusts the preliminary location data if the vision quality data exceeds the minimum threshold level and if the vision-derived location data is substantially identical to the preliminary location data within a predetermined tolerance.
11. The system according to claim 9, wherein the vision quality estimator designates the vision-derived location data as acceptable if the vision quality data corresponding to the vision-derived location data meets or exceeds the minimum threshold level.
12. The system according to claim 9, wherein an estimator determines whether the preliminary location data is consistent with the vision-derived location data for the evaluation time window if the vision quality data exceeds the minimum threshold level.
13. The system according to claim 9, wherein the above operations are repeated in a next evaluation time window that is less than or equal to approximately 0.2 second.
14. The system according to claim 9, wherein the adjuster provides the revised location data at a frequency of 5 Hz.
15. The system according to claim 9, wherein a vision-derived location data processing module estimates the vehicle location with respect to one or more visual reference landmarks in a field of view, each visual reference landmark having known coordinates.
16. The system according to claim 15, wherein the visual reference landmarks comprise plant rows.
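The adjustment logic that claims 1-6 describe in words can be sketched for a single evaluation time window. This is a hedged illustration under stated assumptions: the quality threshold, the tolerance value, and the rule of adopting the vision-derived estimate outright are all hypothetical choices, not details fixed by the claims.

```python
def revise_location(preliminary, vision_derived, vision_quality,
                    min_quality=0.7, tolerance=2.0):
    """Return revised location data for one evaluation time window.

    preliminary: location estimate from the location-determining receiver.
    vision_derived: location estimate from the vision module.
    vision_quality: estimated quality of the vision-derived data (0..1).
    min_quality, tolerance: illustrative threshold and predetermined
    tolerance (claims 1 and 2); real values are application-specific.
    """
    if vision_quality <= min_quality:
        # Vision quality data does not exceed the minimum threshold:
        # keep the preliminary location data unchanged.
        return preliminary
    if abs(vision_derived - preliminary) > tolerance:
        # Estimates are inconsistent for this window (claim 4):
        # do not adjust.
        return preliminary
    # Register the revised location data with the vision-derived data.
    return vision_derived
```

Per claims 5 and 6, such a window would repeat at intervals of at most roughly 0.2 second, so revised location data is produced at about 5 Hz.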
CN200580045916A 2005-01-04 2005-12-15 Vision-based system and method for adjusting and guiding vehicle Active CN100580689C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US64124005P 2005-01-04 2005-01-04
US60/641,240 2005-01-04
US11/107,114 2005-04-15

Publications (2)

Publication Number Publication Date
CN101099162A true CN101099162A (en) 2008-01-02
CN100580689C CN100580689C (en) 2010-01-13

Family

ID=39012110

Family Applications (4)

Application Number Title Priority Date Filing Date
CN200580045916A Active CN100580689C (en) 2005-01-04 2005-12-15 Vision-based system and method for adjusting and guiding vehicle
CN200580045966A Active CN100580690C (en) 2005-01-04 2005-12-15 Vision-aided system and method for guiding vehicle
CN2005800459581A Active CN101292244B (en) 2005-01-04 2005-12-16 Vision-aided system and method for guiding a vehicle
CN2005800459562A Active CN101147151B (en) 2005-01-04 2005-12-16 System and method for guiding a vehicle

Family Applications After (3)

Application Number Title Priority Date Filing Date
CN200580045966A Active CN100580690C (en) 2005-01-04 2005-12-15 Vision-aided system and method for guiding vehicle
CN2005800459581A Active CN101292244B (en) 2005-01-04 2005-12-16 Vision-aided system and method for guiding a vehicle
CN2005800459562A Active CN101147151B (en) 2005-01-04 2005-12-16 System and method for guiding a vehicle

Country Status (1)

Country Link
CN (4) CN100580689C (en)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8447519B2 (en) * 2010-11-10 2013-05-21 GM Global Technology Operations LLC Method of augmenting GPS or GPS/sensor vehicle positioning using additional in-vehicle vision sensors
CN102880798A (en) * 2012-09-20 2013-01-16 浪潮电子信息产业股份有限公司 Variable neighborhood search algorithm for solving multi depot vehicle routing problem with time windows
JP7080101B2 (en) * 2018-05-14 2022-06-03 株式会社クボタ Work vehicle
CN109189061B (en) * 2018-08-10 2021-08-24 合肥哈工库讯智能科技有限公司 AGV trolley running state regulation and control method with time error analysis function
CN111197994B (en) * 2019-12-31 2021-12-07 深圳一清创新科技有限公司 Position data correction method, position data correction device, computer device, and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5341142A (en) * 1987-07-24 1994-08-23 Northrop Grumman Corporation Target acquisition and tracking system
US5899956A (en) * 1998-03-31 1999-05-04 Advanced Future Technologies, Inc. Vehicle mounted navigation device
JP3045713B1 (en) * 1998-12-09 2000-05-29 富士通株式会社 Vehicle-mounted vehicle guidance device, communication server system, and alternative vehicle guidance system
CN1258075C (en) * 2001-02-14 2006-05-31 松下电器产业株式会社 Vehiculor pilot system
CN1438138A (en) * 2003-03-12 2003-08-27 吉林大学 Vision guiding method of automatic guiding vehicle and automatic guiding electric vehicle

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108957511A (en) * 2018-05-21 2018-12-07 永康威力科技股份有限公司 A kind of automobile navigation steering control system and the modification method that navigates
CN112859107A (en) * 2019-11-12 2021-05-28 亚庆股份有限公司 Vehicle navigation switching equipment of golf course self-driving vehicle
CN112859107B (en) * 2019-11-12 2023-11-24 亚庆股份有限公司 Vehicle navigation switching device of golf course self-driving vehicle
CN111077549A (en) * 2019-12-31 2020-04-28 深圳一清创新科技有限公司 Position data correction method, apparatus and computer readable storage medium
CN111077549B (en) * 2019-12-31 2022-06-28 深圳一清创新科技有限公司 Position data correction method, apparatus and computer readable storage medium

Also Published As

Publication number Publication date
CN100580689C (en) 2010-01-13
CN101147151B (en) 2010-06-09
CN101292244B (en) 2010-12-08
CN100580690C (en) 2010-01-13
CN101292244A (en) 2008-10-22
CN101099163A (en) 2008-01-02
CN101147151A (en) 2008-03-19

Similar Documents

Publication Publication Date Title
EP1836648B1 (en) Method and system for guiding a vehicle with vision-based adjustment
CN100580689C (en) Vision-based system and method for adjusting and guiding vehicle
EP1836650B1 (en) Method and system for guiding a vehicle with vision enhancement
CA2592977C (en) Vision-aided system and method for guiding a vehicle
CN101681168B (en) Method and system for guiding a vehicle with vision-based adjustment
US8150574B2 (en) Method and system for guiding a vehicle with vision-based adjustment
EP1849113B1 (en) Vision-aided system and method for guiding a vehicle

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant