USRE48527E1 - Optical tracking vehicle control system and method - Google Patents

Optical tracking vehicle control system and method

Info

Publication number
USRE48527E1
Authority
US
United States
Prior art keywords
vehicle
controller
optical movement
gnss
movement sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US15/196,824
Inventor
David R. Reeve
Andrew John Macdonald
Campbell Robert Morrison
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Agjunction LLC
Original Assignee
Agjunction LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US11/620,388 external-priority patent/US7835832B2/en
Priority claimed from US12/504,779 external-priority patent/US8311696B2/en
Priority claimed from US13/573,682 external-priority patent/US8768558B2/en
Application filed by Agjunction LLC filed Critical Agjunction LLC
Priority to US15/196,824 priority Critical patent/USRE48527E1/en
Assigned to AGJUNCTION LLC reassignment AGJUNCTION LLC CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: HEMISPHERE GPS LLC
Assigned to HEMISPHERE GPS LLC reassignment HEMISPHERE GPS LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BEELINE TECHNOLOGIES PTY LTD
Assigned to BEELINE TECHNOLOGIES PTY LTD reassignment BEELINE TECHNOLOGIES PTY LTD ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MACDONALD, ANDREW JOHN, MORRISON, CAMPBELL ROBERT, REEVE, DAVID ROBERT
Application granted granted Critical
Publication of USRE48527E1 publication Critical patent/USRE48527E1/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0276Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle
    • G05D1/0278Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle using satellite positioning signals, e.g. GPS
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/53Determining attitude
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0272Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0274Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D15/00Steering not otherwise provided for
    • B62D15/02Steering position indicators ; Steering position determination; Steering aids
    • B62D15/025Active steering aids, e.g. helping the driver by actively influencing the steering system after environment evaluation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/43Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/48Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/027Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising intertial navigation means, e.g. azimuth detector
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2201/00Application
    • G05D2201/02Control of position of land vehicles
    • G05D2201/0201Agriculture or harvesting machine

Definitions

  • the present invention relates to a control system for controlling the direction of travel of a vehicle, and in particular to a control system having an embedded spatial database using optical tracking for vehicle control and guidance.
  • the control system of the present invention may also be used to control other aspects of a vehicle's motion, such as speed or acceleration.
  • the present control system may be used to control yet other aspects of the vehicle's operation, such as the application of agricultural chemicals at desired locations (including at desired application rates), or the engagement and/or mode of operation of agricultural implements (e.g., plows, harvesters, etc.) at desired locations, etc.
  • Automatic control of steering (“autosteering”) of vehicles is becoming more widespread, especially in agricultural and mining applications.
  • Most commercially available automatic steering systems include a controller that has means for determining, among other things, the position and heading of a vehicle, a computer-based system for comparing the position and heading of the vehicle with a desired position and heading, and a steering control responsive to a control signal issued by the controller when the position and/or heading of the vehicle deviates from the desired position and/or heading.
  • a number of control systems have previously been devised for controlling the steering of agricultural vehicles. These systems are generally used on vehicles such as tractors (including tractors with towed tools or other implements), harvesters, headers and the like which operate in large fields. These vehicles generally move along predetermined trajectories (“paths”) throughout the field. In general, a wayline is entered into the control system and subsequent paths are calculated based on the wayline. If the vehicle deviates from the path as it moves, the controller causes the vehicle to steer back towards and onto the path as described below.
  • the vehicle uses various means such as signals produced by GPS (global positioning system) or INS (inertial navigation system) to identify if the vehicle deviates from the desired path trajectory. If the vehicle deviates, the extent of the deviation (i.e. the difference between the actual curvature of the vehicle's trajectory and the desired curvature, its actual compass heading compared with the desired compass heading, and the distance the vehicle is displaced laterally from the desired path) is expressed in the form of an error, and this error is fed back into the control system and used to steer the vehicle back onto the desired path.
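  • Purely by way of illustration, a minimal Python sketch of feeding these error terms back into a steering correction might look as follows (the function name, gains and sign conventions are assumptions made for the sketch, not taken from the specification):

        import math

        def steering_correction(pos_xy, heading, curvature,
                                path_point, path_heading, path_curvature,
                                k_xte=0.5, k_hdg=1.0, k_curv=0.2):
            """Proportional feedback from the three error terms described above.
            Positions are (x, y) in metres, headings in radians."""
            # Cross-track error: lateral offset of the vehicle from the desired
            # path, i.e. the component of the position error perpendicular to it.
            dx = pos_xy[0] - path_point[0]
            dy = pos_xy[1] - path_point[1]
            xte = -dx * math.sin(path_heading) + dy * math.cos(path_heading)
            # Heading error, wrapped into [-pi, pi).
            hdg_err = (heading - path_heading + math.pi) % (2 * math.pi) - math.pi
            # Curvature error: actual trajectory curvature versus desired curvature.
            curv_err = curvature - path_curvature
            # The combined error is fed back to steer the vehicle onto the path.
            return -(k_xte * xte + k_hdg * hdg_err + k_curv * curv_err)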
  • a problem with previous vehicle control systems is that they are inherently “one-dimensional” or “linear” in nature. This means that, at a fundamental level, the controller operates by “knowing” the path that the vehicle is required to traverse, and “knowing” where the vehicle is located on that path (i.e. how far along the path the vehicle has moved) at a given time. However, the controller does not “know” where the vehicle is actually located in space. This is despite the fact that the controller may often progressively receive information containing the vehicle's spatial location, for example from the GPS/INS signals. In current controllers, the GPS/INS signals are used primarily to determine when the vehicle deviates from the path (i.e. to calculate the error) rather than for the primary purpose of determining the vehicle's actual position in space. Hence, at a fundamental level, the controller only “knows” the geometry of the path and how far the vehicle has moved along the path.
  • one form of common path trajectory that agricultural vehicles are often required to traverse in fields is made up of a number of (usually parallel) path segments or “swaths” (these are sometimes also referred to as “rows”).
  • the vehicle typically moves along one swath, harvesting or plowing as it goes, and it then turns around and moves back along an adjacent parallel swath, harvesting or plowing in the opposite direction.
  • the adjacent swath will generally be spaced from the first swath sufficiently closely that no part of the field or crop is missed between the swaths, but also sufficiently far apart that there is no unnecessary overlap region (i.e. a region that is covered more than once on successive passes).
  • the distance between the mid-lines of each respective swath is determined with reference to the width of the vehicle (i.e. the width of the plow, harvester or possibly the tool being towed by the vehicle).
  • the first swath will often be used as a reference swath or “wayline”.
  • the geometry of the wayline in space will be entered into the control system along with the vehicle or implement width, and this is used to calculate the required spacing (and hence trajectory) for each of the adjacent parallel swaths.
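  • As an illustrative sketch only (names assumed), the centrelines of the parallel swaths can be derived from a straight wayline and the implement width as follows:

        import math

        def parallel_swaths(wayline_start, wayline_end, implement_width, count):
            """Generate swath centrelines parallel to a straight wayline, offset
            laterally by successive multiples of the implement width so adjacent
            passes neither miss ground nor overlap unnecessarily."""
            dx = wayline_end[0] - wayline_start[0]
            dy = wayline_end[1] - wayline_start[1]
            length = math.hypot(dx, dy)
            # Unit vector perpendicular to the wayline (to the right of travel).
            nx, ny = dy / length, -dx / length
            swaths = []
            for n in range(count):
                offset = n * implement_width
                swaths.append(((wayline_start[0] + nx * offset, wayline_start[1] + ny * offset),
                               (wayline_end[0] + nx * offset, wayline_end[1] + ny * offset)))
            return swaths

        # Example: a 200 m wayline and a 4 m wide implement gives swaths 4 m apart.
        print(parallel_swaths((0.0, 0.0), (200.0, 0.0), 4.0, 3))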
  • the controller is only able to control the steering of the vehicle as it proceeds along each of the swaths. It is much harder to control the steering of the vehicle as it turns around between one swath and the next.
  • the “one-dimensional” or “linear” nature of existing control systems also causes other difficulties.
  • One example is in relation to obstacle avoidance.
  • the positions of obstacles (e.g., fences, trees, immovable rocks, creeks, etc.) within the field will often be known.
  • the spatial location may be known according to global latitude and longitude coordinates (e.g., as provided by GPS), or alternatively the location may be known relative to a fixed point of known location (this is generally a point in or near the field used to define the origin of a coordinate system for the field).
  • the control system itself is therefore unable to recognize whether the location of the obstacle coincides with the trajectory of the path, and hence whether there may be a collision.
  • one of the modules would be a collision detection module for calculating the geometry and trajectory of a section of the path a short distance ahead of the vehicle in terms of “real world” spatial coordinates and for determining whether any of the points along that section of path will coincide with the location of an obstacle.
  • if the collision detection module identifies that the section of path is likely to pass through an obstacle (meaning that there would be a collision if the vehicle continued along that path), a further module may be required to determine an alternative trajectory for (at least) the section of the path proximate the obstacle. Yet a further module may then be required to determine how best to steer the vehicle from the alternative trajectory back onto the original path after the vehicle has moved past the obstacle.
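  • A minimal sketch of the look-ahead test such a collision detection module might perform (the function name and clearance value are assumed) is:

        import math

        def path_collides(path_points, obstacles, clearance=2.0):
            """Return the first point of the upcoming path section that comes
            within `clearance` metres of a known obstacle, or None if the
            section is clear. Both inputs are sequences of (x, y) coordinates
            expressed in the field's coordinate system."""
            for point in path_points:
                for obstacle in obstacles:
                    if math.dist(point, obstacle) < clearance:
                        return point
            return None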
  • This multi-modular control system structure is complicated and can lead to computational inefficiencies because the different modules may each perform many of the same geometric calculations for their own respective purposes, separately from one another, leading to “doubling up” and unnecessary computation.
  • control of the vehicle generally passes from one module to another as described above, but determining when one module should take over from another creates significant difficulties in terms of both system implementation and maintenance.
  • Another problem associated with the “one-dimensional” nature of existing control systems is their inherent inflexibility and unadaptability. For example, in practice, if the vehicle deviates from the desired path for some reason, it may be preferable for subsequent paths (swaths) to also include a similarly shaped deviation so that the paths remain substantially parallel along their length (or tangentially parallel and consistently spaced in the case of curved sections of path). If the vehicle is, for example, a harvester or a plow, then keeping the paths parallel in this way may help to prevent portions of the field from being missed, or from being harvested/plowed multiple times (by passing over the same portion of field on multiple passes).
  • the control system does not know where the vehicle is actually located in space, and therefore it cannot recognize that the beginning of the next swath is actually located nearby—it simply does not know where the next swath is (or indeed where the current swath is in space). Therefore, current control systems cannot easily recognize when it would be better to change paths (at least without intervention from the vehicle's driver), as this example illustrates. Nor is the current “one-dimensional” structure inherently adapted to enable the control systems to automatically (i.e. autonomously without assistance from the driver) determine and guide the vehicle along an efficient trajectory between swaths.
  • “attitude” generally refers to the heading or orientation (pitch with respect to the Y axis, roll with respect to the X axis, and yaw with respect to the Z axis) of the vehicle, or of an implement associated with the vehicle.
  • Other vehicle/implement-related parameters of interest include groundspeed or velocity and position.
  • Position can be defined absolutely in relation to a geo-reference system, or relatively in relation to a fixed position at a known location, such as a base station.
  • a change in one or both of the position and orientation of the vehicle can be considered a change in the vehicle's “pose.” This includes changes (e.g., different order time derivatives) in attitude and/or position. Attitude and position are generally measured relatively with respect to a particular reference frame that is fixed relative to the area that the vehicle is operating in, or globally with respect to a geo-reference system.
  • U.S. Pat. No. 6,876,920 which is assigned to a common assignee herewith and incorporated herein by reference, describes a vehicle guidance apparatus for guiding a vehicle over a paddock or field along a number of paths, the paths being offset from each other by a predetermined distance.
  • the vehicle guidance apparatus includes a global navigation satellite system (GNSS) receiver for periodically receiving data regarding the vehicle's location, and an inertial relative location determining means for generating relative location data along a current path during time periods between receipt of vehicle position data from the GNSS receiver.
  • the apparatus also includes data entry means to enable the entry by an operator of an initial path and a desired offset distance between the paths.
  • Processing means are arranged to generate a continuous guidance signal indicative of errors in the attitude and position of the vehicle relative to one of the paths, the attitude and position being determined by combining corrected GNSS vehicle location data with the relative location data from the inertial relative location determining means.
  • the inertial sensor is used to provide a higher data rate than that obtainable from GNSS alone.
  • because the inertial navigation system (INS) part of the steering control system suffers from errors, in particular a yaw bias, the signals received from the GNSS system are used to correct these errors.
  • the combination of a GNSS-based system and a relatively inexpensive INS allows for quite accurate control of the position of the vehicle.
  • while this system allows for accurate vehicle positioning and sound control of the vehicle's steering, difficulties may be experienced if there are prolonged periods of GNSS outage.
  • GNSS outages may occur due to unsuitable weather conditions, the vehicle operating in an area where GNSS signals cannot be accessed, or due to problems with the GNSS receiver. If a period of prolonged GNSS outage occurs, the steering system relies solely upon the INS. Unfortunately, a yaw bias in a relatively inexpensive inertial sensor used in the commercial embodiment of that steering control system can result in errors being introduced into the steering of the vehicle.
  • Optical computer mice are widely used to control the position of a cursor on a computer screen.
  • Optical computer mice incorporate an optoelectronic sensor that takes successive pictures of the surface on which the mouse operates.
  • Most optical computer mice use a light source to illuminate the surface that is being tracked (i.e. the surface over which the mouse is moving). Changes between one frame and the next are processed using the image processing ability of the chip that is embedded in the mouse.
  • a digital correlation algorithm is used so that the movement of the mouse is translated into corresponding movement of the mouse cursor on the computer screen.
  • optical movement sensors used in optical computer mice have high processing capabilities.
  • a number of commercially available optical computer mice include optical mouse sensors that can process successive images of the surface over which the mouse is moving at speeds in excess of 1500 frames per second.
  • the mouse has a small light emitting source that bounces light off the surface and onto a complementary metal oxide semiconductor (CMOS) sensor.
  • the CMOS sensor sends each image to a digital signal processor (DSP) for analysis.
  • the DSP is able to detect patterns in images and see how those patterns have moved since the previous image.
  • the digital signal processor determines how far the mouse has moved in X and Y directions, and sends these corresponding distances to the computer.
  • the computer moves the cursor on the screen based upon the coordinates received from the mouse. This happens hundreds to thousands of times each second, making the cursor appear to move very smoothly.
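  • As a rough illustration of the frame-to-frame correlation idea (a deliberately brute-force sketch with assumed names, not the algorithm of any particular sensor), the displacement between two successive images can be found by testing candidate shifts and keeping the best match:

        import numpy as np

        def frame_displacement(prev_frame, next_frame, max_shift=4):
            """Estimate the (dx, dy) pixel shift between two small greyscale
            frames: try every shift within +/- max_shift pixels and keep the one
            whose overlapping regions differ least (mean squared difference)."""
            h, w = prev_frame.shape
            best_shift, best_score = (0, 0), float("inf")
            for dy in range(-max_shift, max_shift + 1):
                for dx in range(-max_shift, max_shift + 1):
                    # A feature at prev[y, x] is assumed to reappear at
                    # next[y + dy, x + dx], so compare the overlapping windows.
                    a = prev_frame[max(0, -dy):h + min(0, -dy), max(0, -dx):w + min(0, -dx)]
                    b = next_frame[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)]
                    score = np.mean((a.astype(float) - b.astype(float)) ** 2)
                    if score < best_score:
                        best_score, best_shift = score, (dx, dy)
            # Pixel displacement; multiply by the ground distance per pixel for metres.
            return best_shift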
  • the chips incorporated into optical computer mice often include photodetectors and an embedded integrated circuit that is used to analyse the digital signals received from the photodetectors.
  • the photodetectors may include an array of photosensors, such as an array of charge-coupled devices (CCDs).
  • U.S. Pat. No. 5,786,804 (incorporated herein by reference), which is assigned to Hewlett-Packard Company, describes a method and system for tracking attitude of a device.
  • the system includes fixing a two-dimensional (2D) array of photosensors to the device and using the array to form a reference frame and a sample frame of images.
  • the fields of view of the sample and reference frames largely overlap, so that there are common image features from frame to frame.
  • Several frames are correlated with the reference frame to detect differences in location of the common features.
  • an attitudinal signal indicative of pitch, yaw and/or roll is generated.
  • the attitudinal signal is used to manipulate a screen cursor of a display system, such as a remote interactive video system.
  • the present invention provides a control system for controlling movement of a vehicle characterized in that the control system includes an optical movement sensor which scans a surface over which the vehicle is moving and generates a signal indicative of relative movement along an axis of the vehicle and relative movement across an axis of the vehicle, said signal being provided to a controller.
  • the present invention provides a control system for controlling movement of a vehicle comprising a controller having a computer memory for storing or generating a desired path of travel, the controller being adapted to receive position and/or heading signals from one or more sensors, the position and/or heading signals enabling the controller to determine a position and/or heading of the vehicle relative to a desired path of travel, the controller sending control signals to a steering control mechanism in response to the determined position and/or heading of the vehicle, wherein the position and/or heading signals from the one or more sensors include a signal generated by an optical movement sensor configured to scan a surface during travel of the vehicle, the optical movement sensor generating a signal indicative of relative movement along an axis of the vehicle and relative movement across an axis of the vehicle.
  • the surface that is scanned by the optical movement sensor is suitably a surface over which the vehicle is travelling.
  • the optical movement sensor scans a surface that is close to or under the vehicle during travel of the vehicle over the surface.
  • the optical movement sensor may comprise the operative part of an optical computer mouse. Therefore, in saying that the optical movement sensor “scans” the surface over which the vehicle moves, where the optical movement sensor comprises the operative part of an optical computer mouse, it will be understood that the optical movement sensor receives successive images of the surface over which the vehicle is moving. One part or other of the control system then detects patterns in the images and uses the change in the patterns between successive images to obtain information regarding the movement of the vehicle.
  • the optical movement sensor may comprise an illumination source and an illumination detector.
  • the optical movement sensor may comprise an optical movement sensor integrated circuit.
  • the optical movement sensor may comprise the operative part from an optical computer mouse.
  • the optical movement sensor may be adapted from or derived from the operative part of an optical computer mouse.
  • the optical movement sensor may use a light source to illuminate the surface that is being tracked (i.e. the surface over which the vehicle is moving).
  • Changes between one frame and the next may be processed by an image processing part of a chip embedded in the optical movement sensor and this may translate the movement across the surface of the optical movement sensor (which will generally be mounted to the vehicle) into movement along two axes.
  • the image processing may be performed by processing means separate from the optical movement sensor.
  • the signals received by the optical movement sensor may be conveyed to a separate microprocessor with graphics processing capabilities for processing.
  • the optical movement sensor may include an optical movement sensing circuit that tracks movement in a fashion similar to the optical movement sensing circuits used to track movement in computer mice.
  • the person skilled in the art will readily appreciate how such optical movement sensing circuits analyze data and provide signals indicative of movement of the sensor across the surface. For this reason, further discussion as to the actual algorithms used in the optical movement sensing circuits need not be provided.
  • the optical movement sensing circuit may comprise an optical movement sensing integrated circuit. Such optical movement sensing integrated circuits are readily available from a number of suppliers.
  • the control system of the present invention may further comprise one or more inertial sensors for providing further signals regarding the vehicle's attitude and position (or changes thereto) to the controller.
  • Accelerometers and rate gyroscopes are examples of inertial sensors that may be used.
  • the inertial sensors may form part of or comprise an inertial navigation system (INS), a dynamic measurement unit (DMU), an inertial sensor assembly (ISA), or an attitude heading reference system (AHRS). These are well-known to persons skilled in the art and need not be described further.
  • the inertial sensors may be used in conjunction with other navigation sensors, such as magnetometers, or vehicle based sensors such as steering angle sensors, or wheel speed encoders.
  • Inertial sensors, such as rate gyroscopes and accelerometers, can suffer from time-varying errors that can propagate through to create errors in the vehicle's calculated attitude and/or position. These errors can be sufficiently acute that, to avoid providing the controller with significantly inaccurate measures of the vehicle's attitude and/or position, it is preferable (and often necessary) for the control system to also receive signals regarding the vehicle's attitude and/or position (or changes thereto) from a source that is independent of the inertial sensors. These separate signals can be used to compensate for the errors in the inertial sensor signals using known signal processing techniques.
  • GNSS signals, which provide information regarding the vehicle's location, are one such independent source.
  • the present invention opens up the possibility of providing a control system that includes the optical movement sensor and one or an assembly of inertial sensors (and possibly including one or more other vehicle sensors as well).
  • the signals provided by the optical movement sensor may be used to compensate for the errors in the inertial sensor signals instead of or in addition to the GNSS signals.
  • a single optical movement sensor may generally be sufficient to compensate for the errors in inertial sensors, such as accelerometers which measure rates of change in linear displacement.
  • a single optical movement sensor may not be sufficient to compensate for errors in inertial sensors, such as gyroscopes which measure rates of change in angular displacement because the optical movement sensor will often be fixedly mounted to the vehicle such that the orientation of the optical movement sensor is fixed to, and changes with, the orientation of the vehicle.
  • the single optical movement sensor of the kind used in optical computer mice is able to detect and measure movement of the optical movement sensor along the X (roll) and Y (pitch) axes (in the present context this means the X (roll) and Y (pitch) axes of the vehicle because the optical movement sensor is fixed to the vehicle).
  • this kind of optical movement sensor is not generally able to detect and measure rotation about the Z (yaw) axis. Consequently, if it is desired to compensate for the XYZ errors in inertial sensors such as gyroscopes using optical movement sensors that are fixedly mounted to the vehicle, two or more optical movement sensors will generally need to be provided and mounted at different locations on the vehicle.
  • a single optical movement sensor can be used to compensate for the errors in gyroscopes and the like which measure rates of change in rotational displacement if the optical movement sensor is not fixed with respect to the vehicle. Rather, the optical movement sensor could be mounted so that when the vehicle turned (i.e. rotated about its Z (yaw) axis), the orientation of the optical movement sensor would remain unchanged. In effect, even if the vehicle turns, the orientation of the optical movement sensor would remain unchanged, meaning that the optical movement sensor would effectively translate but not rotate with respect to the surface over which the vehicle is moving.
  • a single optical movement sensor might thus be used to compensate for the errors in both accelerometers and gyroscopes, but some system or mechanism (e.g., gimbal-mounting) would need to be provided to maintain the constant orientation of the optical movement sensor.
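  • For the two-sensor arrangement mentioned above, rigid-body kinematics gives the yaw rate from the difference between the velocities measured at the two mounting points. A minimal sketch (names assumed):

        def yaw_rate_from_two_sensors(v1_xy, v2_xy, r1_xy, r2_xy):
            """Recover the yaw rate (rad/s) of a rigid vehicle from two body-fixed
            optical movement sensors at different mounting positions.
            v1_xy, v2_xy: velocities measured by each sensor (m/s, vehicle frame).
            r1_xy, r2_xy: mounting positions of each sensor (m, vehicle frame).
            For a rigid body v2 - v1 = omega x (r2 - r1), which in the plane is
            delta_v = yaw_rate * (-delta_ry, delta_rx)."""
            dvx, dvy = v2_xy[0] - v1_xy[0], v2_xy[1] - v1_xy[1]
            drx, dry = r2_xy[0] - r1_xy[0], r2_xy[1] - r1_xy[1]
            baseline_sq = drx * drx + dry * dry
            if baseline_sq == 0.0:
                raise ValueError("sensors must be mounted at different positions")
            return (dvx * -dry + dvy * drx) / baseline_sq

        # Example: sensors 2 m apart along the vehicle's X axis; the forward sensor
        # sees 0.1 m/s more sideways velocity, implying a yaw rate of 0.05 rad/s.
        print(yaw_rate_from_two_sensors((1.0, 0.0), (1.0, 0.1), (0.0, 0.0), (2.0, 0.0)))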
  • in embodiments where the control system incorporates one or more inertial sensors and one or more optical movement sensors, and where the optical movement sensor(s) are used (instead of GNSS signals) to compensate for the errors in the inertial sensor(s), the control system is a relative measurement control system. The optical movement sensor(s) and the inertial sensor(s) can only measure changes in vehicle attitude and/or position. They are unable to fix the geographic position and attitude of the vehicle in absolute “global” coordinates. References in this document to relative movement of the vehicle, or of an implement associated with the vehicle, or to relative attitude/position/heading/pose information should be understood in this context.
  • the relative coordinate system established by relative measurement control systems can relate to absolute geographic space if the vehicle can be moved sequentially to at least two, and preferably three or more, locations whose absolute geographic locations are known.
  • the inertial navigation system positions of the vehicle could be arbitrarily set on a map whose origin and orientation are known.
  • the vehicle could be located at the first known location, the internal coordinates noted, then moved to a second location and the new internal coordinates likewise noted.
  • the line between the two points could be fitted from the internal map onto the real world map to arrive at the XY offset between the two map origins, the orientation difference between the two map origins, and the linear scaling difference between the two maps.
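  • The fitting step described above amounts to a two-dimensional similarity transform (offset, rotation and uniform scale) determined from two points of known location. A minimal sketch (names assumed), treating (x, y) coordinates as complex numbers:

        import cmath

        def fit_similarity(internal_pts, world_pts):
            """Fit w = a * z + b mapping two internal (relative) coordinates onto
            their known real-world coordinates: |a| is the scale difference,
            arg(a) the orientation difference and b the XY offset."""
            z1, z2 = (complex(*p) for p in internal_pts)
            w1, w2 = (complex(*p) for p in world_pts)
            a = (w2 - w1) / (z2 - z1)
            b = w1 - a * z1
            return a, b

        def to_world(a, b, internal_pt):
            w = a * complex(*internal_pt) + b
            return (w.real, w.imag)

        # Example: internal points (0,0) and (10,0) observed at real-world
        # locations (100,200) and (100,210): a 90 degree rotation plus an offset.
        a, b = fit_similarity([(0, 0), (10, 0)], [(100, 200), (100, 210)])
        print(abs(a), cmath.phase(a), (b.real, b.imag), to_world(a, b, (5, 0)))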
  • the present invention may comprise a control system including one or more optical movement sensors and one or more inertial sensors.
  • the control system may include one or more optical movement sensors and an assembly of inertial sensors.
  • the control system of the present invention may further comprise an assembly of sensors including accelerometers and rate gyroscopes for providing further position and/or attitude signals to the controller.
  • the assembly may comprise between one and three orthogonally mounted sensor sets, with each sensor set comprising no more than one (and not necessarily one) of each of the above-mentioned sensors.
  • Such inertial sensors are well known to persons skilled in the art and need not be described further.
  • the present invention may comprise a control system including one or more optical movement sensors and one or more other sensors.
  • the other sensors may comprise navigation sensors such as magnetometers, or vehicle sensors such as wheel speed encoders, and steering angle encoders.
  • Control systems in accordance with this embodiment of the invention would also be described as relative measurement control systems, and the relative coordinate system established by such a system can relate to absolute geographic space in generally the same way as described above.
  • the control system of the present invention, which incorporates one or more optical movement sensors, may be integrated with a GNSS system.
  • the GNSS system provides absolute measurement in geographic space and the optical movement sensor provides relative movement data that can be used to control the vehicle during periods of outage of GNSS signals or during periods of normal operation when no GNSS signals are being received.
  • the present invention provides a control system including one or more optical movement sensors and a GNSS system.
  • control system of the present invention may incorporate one or more optical movement sensors, a GNSS system and one or more inertial sensors, suitably an assembly of inertial sensors.
  • the optical movement sensor is configured to look at the ground near or under the vehicle.
  • the output signal generated by the optical movement sensor comprises the relative movement along the axis of the vehicle and the relative movement across the axis of the vehicle. This information can be used as an additional source for compensating for the errors in the inertial sensors, giving a combined GNSS/INS/optical movement sensor system with the capability of operating over sustained periods of GNSS outage.
  • the present invention may provide a control system including one or more optical movement sensors, a GNSS system and one or more inertial sensors, such as an assembly of inertial sensors.
  • the term GNSS (global navigation satellite system) is used here to include GPS (global positioning system) and other satellite-based navigation systems.
  • a number of systems also exist for increasing the accuracy of the location readings obtained using GNSS receivers. Some of these systems operate by taking supplementary readings from additional satellites and using these supplementary readings to “correct” the original GNSS location readings.
  • such systems are commonly referred to as Satellite Based Augmentation Systems (SBASs), and include the Wide Area Augmentation System (WAAS), the European Space Agency's “European Geostationary Navigation Overlay Service” (EGNOS) and the Japanese Multi-Functional Transportation Satellite (MFTS). Ground Based Augmentation Systems (GBASs) also exist.
  • the datastream from the optical movement sensor may be combined with a datastream from another sensor. This may be done using known signal processing techniques to obtain a stream of statistically optimal estimates of the vehicle's current position and/or attitude.
  • the signal processing techniques may utilize a statistically optimized filter or estimator.
  • the optimal filter or estimator could usefully, but not necessarily, comprise a Kalman filter.
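  • A minimal one-dimensional sketch of such a filter (the names, noise values and single-axis simplification are all assumptions made only for illustration) blends displacement increments from the optical movement sensor with occasional absolute fixes:

        class KalmanFilter1D:
            """The optical movement sensor supplies displacement increments
            (prediction step); another sensor, e.g. GNSS, supplies occasional
            absolute position fixes (update step)."""

            def __init__(self, x0=0.0, p0=1.0, q=0.01, r=0.25):
                self.x = x0   # position estimate (m)
                self.p = p0   # estimate variance
                self.q = q    # process noise added per optical increment
                self.r = r    # measurement noise variance of an absolute fix

            def predict(self, optical_dx):
                # Dead-reckon forward using the sensor's relative movement.
                self.x += optical_dx
                self.p += self.q

            def update(self, measured_x):
                # Blend in an absolute measurement; k is the Kalman gain.
                k = self.p / (self.p + self.r)
                self.x += k * (measured_x - self.x)
                self.p *= (1.0 - k)

        kf = KalmanFilter1D()
        for dx in (0.5, 0.5, 0.5):
            kf.predict(dx)        # optical sensor: moved 1.5 m in total
        kf.update(1.4)            # absolute fix says 1.4 m
        print(kf.x, kf.p)         # statistically weighted estimate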
  • the optical sensor used in the control system in accordance with the present invention may comprise an optical movement sensing integrated circuit that receives raw data from a lens assembly mounted on a vehicle or on an implement towed by a vehicle.
  • the lens assembly may be configured such that an image of the ground immediately below the lens assembly is formed on a photosensor plane of the optical movement sensing integrated chip by the lens assembly.
  • the lens may be a telecentric lens.
  • the lens may be an object space telecentric lens.
  • An object space telecentric lens is one that achieves dimensional and geometric invariance of images within a range of different distances from the lens and across the whole field of view. Telecentric lenses will be known to those skilled in the art and therefore need not be described any further.
  • the lens assembly may be chosen so that the extent of the image on the optical movement sensing integrated chip represents a physical extent in the object plane which is commensurate with both the anticipated maximum speed of the vehicle and the processing rate of the optical movement sensing integrated circuit. For example, if the maximum speed of the vehicle is 5 m per second and the desired overlap of successive images is 99%, an image representing 0.5 m in extent will require a processing speed of 1000 frames per second.
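  • The frame-rate figure in the preceding example follows directly from the vehicle speed and the required image overlap; as a quick check (function name assumed):

        def required_frame_rate(max_speed_m_s, image_extent_m, overlap_fraction):
            """Frames per second needed so that successive images of the ground
            overlap by the given fraction at the vehicle's maximum speed."""
            advance_per_frame = image_extent_m * (1.0 - overlap_fraction)
            return max_speed_m_s / advance_per_frame

        # 5 m/s, 0.5 m image extent, 99% overlap -> 1000 frames per second.
        print(required_frame_rate(5.0, 0.5, 0.99))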
  • the optical movement sensor may include an illumination source of sufficient power such that the image of the ground beneath the vehicle is rendered with optimum contrast. This can be usefully, but not necessarily implemented as an array of high intensity light emitting diodes chosen to emit light at the wavelength of optimal intensity of the optical movement sensor.
  • the optical movement sensor may be provided with a mechanism to keep the entrance pupil of the optical assembly free of dust.
  • Other mechanisms may be used, such as those that spray a cleaning fluid over the pupil.
  • the cleaning fluid in those embodiments may comprise a cleaning liquid, such as water.
  • Other means or mechanisms suitable for keeping the lens, or at least the entrance pupil of the optical assembly, free of dust will be known to those skilled in the art and may also be used with the present invention.
  • the present invention provides a control system for controlling a position of an implement associated with a vehicle, characterised in that the control system includes an optical movement sensor which scans a surface over which the implement is moving and generates a signal indicative of relative movement along an axis of the implement and relative movement across an axis of the implement, said signal being provided to a controller.
  • the present invention provides a control system for maintaining a position and/or heading (attitude) of an implement close to a desired path of travel, the control system comprising a controller having a computer memory for storing or generating the desired path of travel, the controller being adapted to receive position and/or heading signals relating to a position and/or heading of the implement from one or more sensors, the position and/or heading signals enabling the controller to determine the position and/or heading of the implement relative to the desired path of travel, the controller sending control signals to a position and/or heading control mechanism in response to the determined position and/or heading, wherein the position and/or heading signals from the one or more sensors include a signal generated by an optical movement sensor configured to scan a surface over which the implement is travelling, the optical movement sensor generating a signal indicative of relative movement along an axis of the vehicle and relative movement across an axis of the vehicle.
  • the optical movement sensor is mounted to the implement.
  • the optical movement sensor may scan the surface close to the implement or underneath the implement as the implement traverses the surface.
  • control algorithms and the position control mechanisms may be as described in U.S. Pat. No. 7,460,942, which is assigned to a common assignee herewith and incorporated herein by reference.
  • the position of the implement may be controlled by controlling the steering of the vehicle associated with the implement (this is especially useful if the implement is rigidly and fixedly connected to the vehicle), or by moving the position of the implement (or at least a working part of the implement) relative to the vehicle, which may be achieved by adjusting the lateral offset between the working part of the implement and the vehicle, or by using the working part of the implement to “steer” the implement.
  • the control system may further include one or more of a GNSS system, inertial sensors, navigation sensors and vehicle-based sensors.
  • the present invention resides in a vehicle control system having a controller and a spatial database adapted to provide spatial data to the controller at control speed.
  • the invention resides in a control system for controlling a vehicle within a region to be traversed, the control system comprising: a spatial database containing spatial data; a controller adapted to receive spatial data from the spatial database at control speed; the control system being adapted to receive spatial data from the controller and/or an external source; and the controller using the spatial data for controlling the vehicle.
  • a control system for steering a vehicle within a region to be traversed, the control system comprising: a spatial database containing spatial data; a controller adapted to receive spatial data from the spatial database at control speed; the controller being adapted to control the steering of the vehicle, the spatial database being adapted to receive updated spatial data from the controller and/or an external source; and the updated spatial data relating to the vehicle and/or an implement associated with and proximate the vehicle and/or at least a portion of the region proximate the vehicle.
  • the region to be traversed by the vehicle will generally be the field that is to be plowed, harvested, etc., and the invention will be described generally with reference to agricultural vehicles operating in fields.
  • the region to be traversed by the vehicle may take a range of other forms in different applications.
  • the region to be traversed by the vehicle might comprise roadways located in a particular geographical area.
  • in mining applications, the region could comprise the vehicle-navigable regions of the mine. In underground mining, this could include the various levels of the mine located vertically above and below one another at different relative levels (depths).
  • control system of the present invention could be applied to vehicles that operate on airport tarmacs, in which case the region to be traversed by the vehicle might be the tarmac, or a portion thereof. From these examples, the person skilled in the art will appreciate the breadth of other applications that are possible.
  • the control system of the present invention includes a spatial database that contains spatial data.
  • the spatial database may also be adapted to receive spatial data including updated spatial data, and to provide spatial data to other components of the control system.
  • data may be characterized as “spatial” if it has some relationship or association with “real world” geographical location, or if it is stored somehow with reference to geographical location.
  • Some illustrative examples of the kinds of spatial data that may be stored within the database include (but are not limited to) coordinate points describing the location of an object (e.g., a rock or tree) in terms of the object's “real world” geographical location in a field, the coordinate points for a geographical location itself, information regarding a “state” of the vehicle (e.g., its speed, “pose” (position and orientation) or even fuel level) at a particular geographical location, a time when the vehicle was at a particular geographical location, or a command to the vehicle to change its trajectory or mode of equipment (e.g., plow) operation if or when it reaches a certain geographical location.
  • any data or information that has an association with geographical location, or which is stored with reference to geographical location, can constitute “spatial data”.
  • spatial data and “spatial information” will be used interchangeably.
  • References simply to “data” or “information” will generally also carry a similar meaning, and references simply to the “database” will be to the spatial database, unless the context requires otherwise.
  • the spatial database is an electronic database stored in a memory device, such as, for example, a RAM, as discussed in more detail below.
  • Spatial data may be stored within the database according to any convenient coordinate system, including (but not limited to) cartesian (or projected) coordinates, polar coordinates, cylindrical coordinates, spherical coordinates, latitude/longitude/altitude etc.
  • the coordinate system may also be “global” in the sense of the location references provided by GPS, or “local” coordinates such as those defined with respect to a local origin and reference orientation.
  • the coordinates may or may not take into account the curvature caused by the Earth's overall spherical shape.
  • Cartesian (x,y or x,y,z) coordinates or latitude/longitude/altitude will be used most frequently because of the way these inherently lend themselves to describing geographical location, and because of the ease with which these coordinate systems can be implemented digitally.
  • Particularly representative embodiments may utilize the World Geodetic System 1984 (WGS84) datum, which is consistent with the current GPS.
  • the controller (which controls the vehicle) receives spatial data from the spatial database.
  • the data received by the controller from the database forms at least part of the control inputs that the controller operates on to control the vehicle (i.e. the spatial data forms at least part of the inputs that drive the controller).
  • the fact that the controller operates directly on information that is inherently associated with “real world” geographic location represents a change in thinking compared with existing vehicle control systems.
  • the control system of the present invention “thinks” directly in terms of spatial location.
  • control parameters are defined in geographic space rather than the space of an abstract vector. Consequently, the controller of the present invention may be considered to be inherently “multi-dimensional” or “spatial” in nature, as opposed to “one-dimensional” or “linear” like the existing control systems described in the background section above.
  • the control system, including the controller, may be implemented using equipment that provides memory and a central processing unit to run the one or more algorithms required to control the vehicle.
  • the controller (and hence the control algorithm(s)) used in the present invention may take any form suitable for controlling the steering of a vehicle.
  • closed loop or feedback type control will be used at least in relation to some signal streams (i.e. in relation to at least some of the vehicle variables being controlled by the controller).
  • open loop control may also be used, as may feed-forward control structures wherein the spatial data received by the controller from the spatial database is fed forward to form part of the control outputs used to control the vehicle.
  • the control structure may incorporate combinations of proportional, integral and differential control, or a series of such (possibly nested) control loops.
  • any form of suitable control and/or controller may be used.
  • the control system may also incorporate conventional signal processing and transmitting equipment, for example, for suitably filtering incoming spatial data signals, and for transmitting control signals from the controller to the vehicle's steering system to steer the vehicle.
  • the person skilled in the art will appreciate that any suitable electric, mechanical, pneumatic or hydraulic actuators, or combinations thereof, may be used with the present invention.
  • the actuators may be linked with the vehicle's steering and drive systems to control the steering, acceleration, deceleration, etc. of the vehicle in response to control signals produced by the controller.
  • Associated equipment such as amplifiers and power sources may also be provided as required to amplify the control signals, and to power the actuators.
  • a wide range of power sources may be used including batteries, generators, pumps, etc., depending on the nature of the actuator(s) and the signals to be amplified.
  • the spatial database used to store the spatial data and to provide the spatial data to the controller may be different to other forms of databases used in other areas.
  • such databases often contain vast amounts of information (in this case “information” is not used in its “spatial” sense) and the information is generally stored in complex hierarchical structures.
  • these databases may be considered to be “multi-levelled” in that an initial query may return only relatively superficial level information, but this may in turn allow the user to interrogate the database more deeply to obtain more specific, linked or related information.
  • control speed means that the database is able to provide the information at a rate of the same order as the speed at which the controller repeats successive cycles of the control algorithm (i.e. at a rate of the same order as the “clock speed” of the controller).
  • the database will be adapted to provide the data to the controller, and perhaps also receive data from the controller and/or external sources, at every successive cycle of the control algorithm (i.e. at the controller's clock speed).
  • the database may be adapted to provide (and perhaps receive) data at less than, but close to, the controller's clock speed (for example, at every second or third successive cycle of the control algorithm), provided that the rate is fast enough to provide the controller with sufficiently up-to-date spatial information to achieve adequate vehicle control performance.
  • the database may be adapted to provide data at a rate of the same order as one of those controller clock speeds. In any event, the database should provide data to the controller at a rate commensurate with the control loop bandwidth.
  • the database may be adapted to provide data to the controller at a rate of between 1 Hz and 100 Hz.
  • rates between 1 Hz and 20 Hz will almost always be sufficient, and even rates between 3 Hz and 12 Hz may be sufficient for vehicles moving at significantly less than 60 km/hr.
  • the necessary or achievable rates may vary depending on the level of control precision and performance required in different applications, the speed at which the vehicle in question moves, and the capabilities of the available equipment used to implement the control system.
  • because the spatial database used in the present invention can provide spatial data to the controller at control speed, and therefore forms part of the system's overall configuration, the spatial database may be considered to be “embedded” within the control system, rather than external to it. This is particularly so in embodiments where feedback type control is used, and the spatial database forms part of the system's overall closed loop structure (i.e. in embodiments where the spatial database forms part of the loop).
  • the form of the database should allow the required rapid database access and response times.
  • the database and all of the data that it contains will be loaded into the control system's memory (i.e. loaded into RAM). This way, the data will be directly accessible by the controller's CPU (central processing unit), rather than requiring a query to be sent to a remote disk or storage device containing the data, the response to which would then need to be loaded into RAM before being accessible by the CPU.
  • the database could be located on a separate disk or other storage device, particularly if the device is capable of retrieving data in response to a query with sufficient speed such as, for example, a disk device with RAM read/write cache.
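  • Purely as an illustration of a flat, in-memory store being queried within a fixed-rate control loop (the grid-cell indexing, the 10 Hz rate and every name here are assumptions, not the database design claimed by the invention):

        import time
        from collections import defaultdict

        CELL = 10.0   # grid cell size in metres (illustrative)

        class InMemorySpatialStore:
            """Spatial records held in RAM, keyed by grid cell, so that lookups
            near the vehicle's current position are plain dictionary accesses."""
            def __init__(self):
                self.cells = defaultdict(list)

            def insert(self, x, y, record):
                self.cells[(int(x // CELL), int(y // CELL))].append((x, y, record))

            def query_near(self, x, y):
                cx, cy = int(x // CELL), int(y // CELL)
                hits = []
                for ix in (cx - 1, cx, cx + 1):
                    for iy in (cy - 1, cy, cy + 1):
                        hits.extend(self.cells.get((ix, iy), []))
                return hits

        # A 10 Hz control loop querying the embedded store every cycle.
        db = InMemorySpatialStore()
        db.insert(12.0, 3.0, "obstacle")
        for _ in range(3):
            t0 = time.monotonic()
            nearby = db.query_near(11.5, 2.0)   # spatial data at control speed
            # ... the controller would compute steering outputs from `nearby` ...
            time.sleep(max(0.0, 0.1 - (time.monotonic() - t0)))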
  • the amount of memory required to store the spatial data relating to a particular field to be traversed by the vehicle may be in the order of megabytes.
  • this may correspond to approximately 4 MB of memory required to store the coordinates of each point.
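  • as a rough illustration only (the grid dimensions below are assumed for the example and are not taken from this description), a field represented by a 512 × 512 grid of points, with each point storing an (x, y) pair of 64-bit double-precision values, would occupy

        512 \times 512 \times 2 \times 8\ \text{bytes} = 4{,}194{,}304\ \text{bytes} \approx 4\ \text{MB}

    which is of the same order as the approximate 4 MB figure mentioned above.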
  • the speed of the database may be assisted by the way in which the data is arranged (i.e. stored) within the database.
  • a wide range of methods and algorithms are known for arranging data (i.e. for assigning appropriate “indices” and the corresponding memory allocations to individual items of data) within databases, and the particular method chosen depends on the nature of the data, and the way and speed with which the database is to respond to a query.
  • the data should be arranged so as to enable the database to collate and deliver all relevant information relating to a complex query.
  • the spatial database used in the present invention can store data in a “single-level” or “flat” structure according to the geographical location that particular items of data relate to.
  • Some algorithms which could be used to arrange the spatial data within the database include the algorithms commonly referred to by the names “Grid-indexing”, “Quadtree” or “R-tree”.
  • data may be arranged within the database using a form of algorithm that will be referred to as a “spatial hash-key” algorithm.
  • a spatial hash-key algorithm maps physical locations (based on their “real world” coordinates) into one-dimensional “hash-keys”.
  • the “hash-key” for each location is a string of characters that can be stored in the database's hash table and retrieved in response to a query.
  • Properties of the spatial hash-key algorithm may include: points which are close to each other in the real world should have closely related hash keys (i.e. the algorithm should maintain “locality”); the algorithm should operate using whatever coordinate system the control system uses to represent the region; the algorithm should be adapted for digital implementation (hence, it should be adapted to operate using integer or floating-point numbers, preferably with 64-bit “double” precision or better); and the algorithm should be fast to compute.
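  • By way of illustration only, the following sketch shows one well-known way to build a locality-preserving spatial hash key by interleaving the bits of quantized x and y coordinates (a Morton or “Z-order” key); it is not necessarily the particular hash-key algorithm contemplated here, and the grid resolution, origin and key width are assumptions.

        def spatial_hash_key(x, y, origin=(0.0, 0.0), cell_size=0.01):
            """Map a 2-D position (metres) to a one-dimensional hash key.

            Nearby points usually share long common prefixes in the interleaved
            key, so locality is largely preserved (Morton / Z-order encoding).
            Assumes coordinates lie to the north-east of the chosen origin.
            """
            # Quantize the coordinates onto an integer grid (cell_size metres per cell).
            ix = int((x - origin[0]) / cell_size)
            iy = int((y - origin[1]) / cell_size)

            key = 0
            for bit in range(32):                       # 32 bits per axis -> 64-bit key
                key |= ((ix >> bit) & 1) << (2 * bit)       # even bit positions <- x
                key |= ((iy >> bit) & 1) << (2 * bit + 1)   # odd bit positions  <- y
            return key

        # Example: two points 2 cm apart usually produce closely related keys.
        k1 = spatial_hash_key(152.34, 873.12)
        k2 = spatial_hash_key(152.36, 873.12)
        print(hex(k1), hex(k2))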
  • the control system of the present invention may be adapted to receive updated data from the controller and/or an external source.
  • the spatial database can be adapted to receive the updated information at control speed.
  • Data received from the controller may include or may be used to generate, for example, estimates of the vehicle's predicted state (i.e. its speed, position, orientation, etc.) at an upcoming location based on its current instantaneous state at a particular location.
  • the external sources may include GPS, INS, or any other inertial, visual or other system used for obtaining information relating to the state of the vehicle or other aspects of the region (such as obstacles close to the vehicle). Data received in this way may be (at least initially) recorded in its unprocessed or “raw” form in the database.
  • This unprocessed data may be fed directly back into the controller, or the respective streams of incoming data (possibly relating to disparate variables) may be filtered using a Kalman filter or some other similar digital signal processing technique to obtain a statistically optimized estimate of the state of the vehicle and its proximate surroundings as it travels. This optimized estimate of the vehicle's state at a particular location may then be fed into the controller. The use of statistically optimized estimates and data may help to improve control performance.
  • the invention resides in a closed loop vehicle control system comprising: a spatial database; a controller adapted to receive spatial data from the spatial database at control speed, the controller controlling the steering of the vehicle; wherein updated spatial data is fed back into the control system.
  • the invention resides in a method for controlling a vehicle comprising: entering spatial data relating to a region to be traversed by the vehicle into a spatial database; providing spatial data from the spatial database to a controller at control speed to control the vehicle as the vehicle traverses the region; and entering updated spatial data into the spatial database as the vehicle traverses the region.
  • the invention resides in a vehicle control system comprising: a spatial database; a controller adapted to receive spatial data from the spatial database; and the controller using the spatial data from the spatial database to control the steering of the vehicle.
  • FIG. 1A shows a vehicle comprising a tractor and an implement fitted with an optical tracking control system in accordance with one embodiment of the present invention, and further shows XYZ axial attitude orientations.
  • FIG. 1B shows the vehicle with a block diagram of the control system.
  • FIG. 1C shows a vehicle fitted with an optical tracking control system in accordance with another embodiment of the present invention including a pair of optical tracking sensors.
  • FIG. 2 shows a real-time kinematic (RTK) optical tracking vehicle control system in accordance with another embodiment of the present invention.
  • FIG. 3 shows a vehicle fitted with an optical tracking control system in accordance with another embodiment of the present invention including an inertial navigation system (INS) with inertial sensors.
  • FIG. 4 shows an RTK optical tracking vehicle control system in accordance with another embodiment of the present invention including an INS.
  • FIG. 5 shows a flow sheet illustrating the interaction of an optical movement sensor with a controller in accordance with an embodiment of the present invention.
  • FIG. 6 shows a flow sheet illustrating the interaction of an optical movement sensor and a GNSS sensor with a controller in accordance with an embodiment of the present invention.
  • FIG. 7 shows a flow sheet illustrating the interaction of an optical movement sensor and inertial sensors with a controller in accordance with an embodiment of the present invention.
  • FIG. 8 shows a flow sheet illustrating the interaction of an optical movement sensor, inertial sensors and a GNSS sensor with a controller in accordance with an embodiment of the present invention.
  • FIG. 9 shows a schematic view of an embodiment of the present invention in which the position of an implement is optically tracked.
  • FIG. 10 shows a schematic diagram of one possible arrangement for an optical movement sensor that could be used in the present invention.
  • FIG. 11 shows an end view of the lens and LED (light-emitting diode) illuminator ring used in the arrangement of FIG. 10 .
  • FIG. 12 is a schematic illustration of the operation of a discrete-time Kalman filter, which may be used in an optimal estimator of the present invention.
  • FIG. 13 schematically represents the difference between the vehicle's actual spatial location and what is “seen” by existing forms of “one-dimensional” controllers such as those described in the background section above.
  • FIG. 14 is a pictorial representation of an agricultural vehicle having a control system in accordance with one particular embodiment of the present invention.
  • FIG. 15 illustrates the physical meaning of certain parameters controlled by some versions of the present control system, namely the “cross-track error”, the “heading error” and the “curvature error”.
  • FIG. 16 is a schematic “block-diagram” representation of an overall control system structure that may be used in representative embodiments of the present invention.
  • FIG. 17 is a schematic representation of the “controller” block that may be used in representative embodiments such as that shown in FIG. 16 .
  • FIG. 19 is a block diagram representation of the state space representation used in the digital implementation of certain aspects of the control system.
  • FIG. 20 shows an example trajectory of an agricultural vehicle, and the coordinates corresponding to different points along the trajectory using a simplified integer based coordinate system.
  • FIG. 21 shows a similar example trajectory of an agricultural vehicle to that shown in FIG. 20 , except that the coordinate system is similar in format to the WGS84 coordinate system used by current GPS.
  • FIG. 22 illustrates the way in which numbers are represented in the IEEE 754 standard double-precision floating-point format.
  • FIG. 23 is a “flow-diagram” illustrating the way a particularly preferred spatial hash algorithm may be used to generate hash keys for the coordinates in FIG. 21 .
  • GNSS: global navigation satellite systems, including GPS (U.S.), Galileo (European, proposed), GLONASS (Globalnaya Navigazionnaya Sputnikovaya Sistema, Russia), Beidou (China), Compass (China, proposed), IRNSS (Indian Regional Navigation Satellite System) and QZSS (Quasi-Zenith Satellite System).
  • Inertial navigation systems include gyroscopic (gyro) sensors, accelerometers and similar technologies for providing output corresponding to the inertia of moving components in all axes, i.e. through six degrees of freedom (positive and negative directions along longitudinal X, transverse Y and vertical Z axes). Yaw, pitch and roll refer to moving component rotation about the Z, Y and X axes respectively. Said terminology will include the words specifically mentioned, derivatives thereof and words of similar meaning.
  • FIGS. 1A and 1B show a tractor 10 fitted with an optical movement sensor 16 in accordance with an embodiment of the present invention.
  • the tractor 10 is towing an agricultural implement 12 , such as a plow, sprayer, cultivator, etc.
  • the tractor 10 is fitted with a steering control system.
  • the steering control system includes a GNSS receiver 13 connected to antennas 20, a controller 14 and a steering valve block 15 .
  • the controller 14 suitably includes a computer memory that is capable of having an initial path of travel entered therein.
  • the computer memory is also adapted to store or generate a desired path of travel.
  • the controller 14 receives position and attitude signals from one or more sensors (to be described later) and the data received from the sensors are used by the controller 14 to determine or calculate the position and attitude of the tractor.
  • the controller 14 then compares the position and attitude of the tractor with the desired position and attitude of the tractor.
  • the controller 14 issues a steering correction signal that interacts with a steering control mechanism.
  • the steering control mechanism makes adjustments to the angle of steering of the tractor, to thereby assist in moving the tractor back towards the desired path of travel.
  • the steering control mechanism may comprise one or more mechanical or electrical controllers or devices that can automatically adjust the steering angle of the vehicle. These devices may act upon the steering pump, the steering column or steering linkages.
  • the steering control algorithm may be similar to that described in our U.S. Pat. No. 6,876,920, which is incorporated herein by reference and discloses a steering control algorithm, which involves entering an initial path of travel (often referred to as a wayline).
  • the computer in the controller 14 determines or calculates the desired path of travel, for example, by determining the offset of the implement being towed by the tractor and generating a series of parallel paths spaced apart from each other by the offset of the implement. This ensures that an optimal working of the field is obtained.
  • the vehicle then commences moving along the desired path of travel.
  • One or more sensors provide position and attitude signals to the controller and the controller uses those position and attitude signals to determine or calculate the position and attitude of the vehicle.
  • the controller generates a steering correction signal.
  • the steering correction signal may be generated, for example, by using the difference between the determined position and attitude of the vehicle and the desired position and attitude of the vehicle to generate an error signal, with the magnitude of the error signal being dependent upon the difference between the determined position and attitude and the desired position and attitude of the vehicle.
  • the error signal may take the form of a curvature demand signal that acts to steer the vehicle back onto the desired path of travel.
  • Steering angle sensors in the steering control mechanism may monitor the angle of the steering wheels of the tractor and send the data back to the controller to thereby allow the controller to correct for understeering or oversteering.
  • the error signal may result in generation of a steering guidance arrow on a visual display unit to thereby enable the driver of the vehicle to properly steer the vehicle back onto the desired path of travel.
  • This manual control indicator may also be provided in conjunction with the steering controls as described in the paragraph above.
  • most, if not all, steering control algorithms operate by comparing a determined or calculated position and attitude of the vehicle with a desired position and attitude of the vehicle.
  • the desired position and attitude of the vehicle is typically determined from the path of travel that is entered into, or stored in, or generated by, the controller.
  • the determined or calculated position and attitude of the vehicle is, in most, if not all, cases determined by having input data from one or more sensors being used to determine or calculate the position and attitude of the vehicle.
  • GNSS sensors, accelerometers, wheel angle sensors and gyroscopes are used as the sensors in preferred embodiments of that patent.
  • the controller 14 includes a graphic user interface (GUI) 17 mounted in the cab of the tractor 10 for inputting data to the controller 14 and a display screen.
  • GUI 17 can comprise any means for entering data into the controller 14 , for example a touchscreen, keyboard or keypad for manually entering data, or a cable/wireless connection for transferring data to the controller 14 .
  • the GUI 17 also includes a display screen, and can include various other output devices such as LEDs, audio, printers, hardwired and wireless output connections, etc.
  • the controller 14 also includes a computer memory for receiving and storing data, a CPU for processing data and a control signal generator for generating control signals to the steering control mechanism.
  • the controller 14 may also include random access memory (RAM), read only memory (ROM), and an optical disc drive such as a DVD drive or a CD drive for receiving optical disks and reading information therefrom.
  • the controller 14 may be pre-programmed with software that allows for calculation of the desired path of travel.
  • software may be loaded onto the controller from a recorded media carrier, such as a DVD disc, a CD disc, a floppy disk or the like. Appropriate software may be downloaded from a network.
  • the tractor 10 shown in FIGS. 1A and 1B is also fitted with an optical movement sensor 16 .
  • the optical movement sensor 16 is fitted to an arm 18 extending forwardly from the front of the tractor 10 . This is so that the optical movement sensor 16 is ahead of the wheels to minimize the effect of dust kicked up by the wheels.
  • the optical movement sensor 16 may be positioned at a side of the tractor or at a rear part of the tractor, or even underneath the tractor.
  • the basic requirement for the optical movement sensor positioning and mounting is that the optical movement sensor can emit radiation, typically light, onto the ground and receive reflected radiation or light from the ground. Provided that this basic requirement is met, the optical movement sensor may be mounted anywhere on the tractor. Indeed, the optical movement sensor may even be mounted on the implement 12 ( FIG. 9 ).
  • the optical movement sensor 16 is “gimballed”, meaning that its orientation with respect to the tractor may change. This “gimballed” embodiment is described further below.
  • the optical tracking movement sensor 16 may comprise the operative part of an optical computer mouse.
  • Optical computer mice incorporate an optoelectronics sensor that takes successive pictures of the surface on which the mouse operates. Most optical computer mice use a light source to illuminate the surface that is being tracked. Changes between one frame and the next are processed by an image processing part of a chip embedded in the mouse and this translates the movement of the mouse into movement on two axes using a digital correlation algorithm.
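  • As a simplified illustration of the digital correlation step described above (not the specific chip implementation), successive ground images can be compared by shifting one frame over the other and picking the shift that best matches; the frame size and search window below are assumptions made for the example.

        import numpy as np

        def frame_shift(prev, curr, max_shift=4):
            """Estimate (dx, dy) pixel displacement between two grayscale frames
            by brute-force zero-mean correlation over a small search window."""
            best, best_score = (0, 0), -np.inf
            for dy in range(-max_shift, max_shift + 1):
                for dx in range(-max_shift, max_shift + 1):
                    # Overlapping regions of the two frames for this candidate shift.
                    a = prev[max_shift - dy: prev.shape[0] - max_shift - dy,
                             max_shift - dx: prev.shape[1] - max_shift - dx]
                    b = curr[max_shift: curr.shape[0] - max_shift,
                             max_shift: curr.shape[1] - max_shift]
                    score = np.sum((a - a.mean()) * (b - b.mean()))
                    if score > best_score:
                        best, best_score = (dx, dy), score
            return best  # pixels moved along and across the sensor axes

        # Example with two synthetic 32 x 32 frames, the second shifted by 1 column and 2 rows.
        rng = np.random.default_rng(0)
        prev = rng.random((32, 32))
        curr = np.roll(np.roll(prev, 2, axis=0), 1, axis=1)
        print(frame_shift(prev, curr))   # expected output close to (1, 2)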
  • the optical movement sensor 16 may include an illumination source for emitting light therefrom.
  • the illumination source may comprise one or more LEDs.
  • the optical movement sensor may also include an illumination detector for detecting light reflected from the ground or the surface over which the vehicle is travelling.
  • Appropriate optical components such as a lens (preferably a telecentric lens), may be utilized to properly focus the emitted or detected light.
  • a cleaning system such as a stream of air or other cleaning fluid, may be used to keep the optical path clean.
  • the optical movement sensor 16 may comprise a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor.
  • the optical movement sensor 16 may also include an integrated chip that can rapidly determine the relative movement along an axis of the vehicle and the relative movement across an axis of the vehicle by analysing successive frames captured by the illumination detector. The optical movement sensor can complete hundreds to thousands of calculations per second.
  • the optical movement sensor 16 generates signals that are indicative of the relative movement of the vehicle along the vehicle's axis and the relative movement of the vehicle across the vehicle's axis.
  • the signals are sent to the controller 14 .
  • the signals received by the controller 14 are used to progressively calculate or determine changes in the position and attitude of the vehicle.
  • the controller 14 may include a clock that can be used to determine a time of travel of the vehicle and use that time of travel and possibly other input variables (such as the speed of the vehicle), together with the signals generated by the optical movement sensor, to calculate or determine the position and attitude of the vehicle. This may then be compared to the desired position and attitude of the vehicle arising from the desired path of travel stored in or generated by the controller 14 .
  • a steering correction signal is sent from the controller 14 to the steering (valve block) control mechanism.
  • Examples of such automatic steering control mechanisms are disclosed in U.S. Pat. No. 7,142,956; No. 7,277,792; No. 7,400,956; and No. 7,437,230, all of which are assigned to a common assignee with the present application and are incorporated herein by reference.
  • Only one optical movement sensor 16 is illustrated in FIGS. 1A and 1B . However, as described above, if the optical movement sensor 16 is the kind used in optical computer mice, and if the optical movement sensor 16 is fixed with respect to the vehicle, the optical movement sensor will generally only measure movement along and across the principal axis of the vehicle (i.e. along the longitudinal roll X axis and along the transverse pitch Y axis). Fixed optical movement sensors of this kind generally do not measure rotation about the yaw Z axis.
  • a single optical movement sensor 16 could be used to measure change in the vehicle's orientation with respect to its yaw Z axis (in addition to measuring changes in the movement of the vehicle along the roll X and pitch Y axes), if the optical movement sensor 16 is mounted in a gimbal mount 19 , which can be controlled with input from a compass (GNSS, magnetic, etc.) and/or a gyroscope.
  • the optical movement sensor orients itself in a similar way to a compass needle (which stays in one orientation even if the compass is rotated).
  • a gimbal device or mechanism 19 is provided to dynamically adjust the orientation of the optical movement sensor with respect to the vehicle so that the optical movement sensor's orientation remains the same as the vehicle moves and turns.
  • Such gimbal mounting devices and mechanisms are commercially available and will be known to those skilled in the art. They therefore require no further explanation.
  • the gimballed mounting device or mechanism could also monitor the change in the optical movement sensor's orientation relative to the orientation of the vehicle, and this information could be used to calculate or determine changes in the vehicle's orientation about its yaw Z axis.
  • the alternative embodiment shown in FIG. 1C accommodates calculating and determining attitude changes in the vehicle's orientation about its yaw Z axis, which attitude changes indicate the vehicle's heading or direction of travel.
  • Two optical movement sensors 16 are mounted on the front of the vehicle 10 and each measures movement of the vehicle along the longitudinal roll X axis and along the transverse pitch Y axis of the vehicle.
  • any difference between the movements measured by the two sensors will generally be associated with changes in the vehicle's orientation about its yaw Z axis. Therefore, this difference may be used to calculate or determine changes in the vehicle's orientation about its yaw Z axis.
  • the vehicle might be provided with an optical movement sensor 16 adjacent to and inboard of the front wheels ( FIG. 1C ). Therefore, there would be one optical movement sensor on either side of the vehicle at the front of the vehicle. If the vehicle were to turn a corner (which would make it rotate about its yaw Z axis), the optical movement sensor 16 on the outside of the turning circle would measure a greater distance travelled than the optical movement sensor on the inside. This difference could then be used, along with the known positioning of each optical movement sensor with respect to the other, to calculate or determine the change in the vehicle's orientation about its yaw Z axis.
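  • A minimal sketch of the geometry just described, assuming the two sensors are separated by a known lateral baseline and each reports the forward distance it has travelled over the same interval; the baseline value used here is an assumption for illustration.

        import math

        def yaw_change_from_two_sensors(d_left, d_right, baseline):
            """Estimate the change in yaw (radians) from the forward distances
            measured by left and right optical movement sensors a known lateral
            distance (baseline, metres) apart.

            When the vehicle turns, the sensor on the outside of the turn travels
            further than the one on the inside; the difference divided by the
            baseline approximates the rotation about the yaw Z axis.
            """
            return (d_right - d_left) / baseline

        # Example: left sensor sees 1.00 m, right sensor 1.05 m, sensors 2.0 m apart.
        dyaw = yaw_change_from_two_sensors(1.00, 1.05, 2.0)
        print(math.degrees(dyaw))   # about 1.4 degrees of yaw change (turning left)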
  • FIG. 2 shows a schematic diagram of an alternative embodiment of the present invention.
  • a number of the features of the embodiment shown in FIG. 2 are similar to those shown in FIG. 1 .
  • similar features in FIG. 2 are denoted by the same reference numerals as those used to denote similar features in FIG. 1 , but increased by 100.
  • tractor 110 in FIG. 2 corresponds to tractor 10 in FIG. 1 .
  • the embodiment shown in FIG. 2 also includes a controller 114 and an optical movement sensor 116 .
  • the embodiment shown in FIG. 2 further includes a differential GNSS system.
  • the differential GNSS system includes a GNSS receiver(s) 113 connected to satellite antennas 120 .
  • the satellite antennas 120 are mounted on the roof of the tractor 110 and optionally on the implement 112 . Such multiple antennas 120 enable vector calculations of the tractor attitude.
  • the satellite antennas 120 receive satellite signals from the array of GNSS satellites orbiting the earth, shown schematically at 122 , 124 , 126 and 127 .
  • the differential GNSS system also includes a base station 128 .
  • the base station 128 includes a GNSS antenna 130 connected to a GNSS receiver 132 .
  • the antenna 130 receives signals from the orbiting GNSS satellites 122 , 124 , 126 and 127 .
  • the GNSS receiver 132 calculates and provides positional data to a computer 134 .
  • the computer compares the positional data from the GNSS receiver 132 with a predetermined and accurately known position for antenna 130 .
  • computer 134 is able to calculate an error factor, which is continuously updated and passed to a transmitter 136 , such as a radio modem.
  • the transmitter 136 generates a serial data signal which is upconverted and propagated by the base antenna 130 .
  • the transmitted error signal is received by an antenna 140 mounted on tractor 110 .
  • the GNSS receiver on the tractor 110 receives GNSS signals from the constellation of GNSS satellites via GNSS antenna 120 mounted on the tractor 110 .
  • the signals are sent to controller 114 .
  • the signals received from GNSS receiver(s) 113 on tractor 110 are corrected by the error correction signal sent from the transmitter 136 .
  • an accurate determination of position of the tractor can be obtained from the differential GNSS system.
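  • The differential correction described above can be sketched as follows; the coordinate representation (simple local east/north/up offsets) and the variable names are assumptions made purely for illustration.

        def base_station_error(measured_base_pos, known_base_pos):
            """Error factor computed at the base station: the difference between the
            GNSS-derived position of the base antenna and its accurately surveyed position."""
            return tuple(m - k for m, k in zip(measured_base_pos, known_base_pos))

        def apply_correction(rover_pos, error):
            """Apply the broadcast error factor to the rover (tractor) position."""
            return tuple(r - e for r, e in zip(rover_pos, error))

        # Example (local east/north/up offsets in metres):
        error = base_station_error((1000.8, 2000.3, 50.6), (1000.0, 2000.0, 50.0))
        corrected = apply_correction((1500.9, 2600.1, 51.5), error)
        print(error)      # approximately (0.8, 0.3, 0.6): common-mode GNSS error at this instant
        print(corrected)  # approximately (1500.1, 2599.8, 50.9)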
  • the controller 114 also receives position signals from the optical movement sensor 116 .
  • if the optical movement sensor 116 in FIG. 2 is the kind used in optical computer mice, and if it is fixed to the vehicle, two or more such fixed optical movement sensors would need to be provided if the optical movement sensor is to be used to measure changes in the vehicle's orientation about its yaw axis.
  • a single optical movement sensor might be used, provided the single optical movement sensor is mounted in a gimballed manner and the device or mechanism 119 used for the gimballed mounting can monitor the changes in the orientation of the optical movement sensor relative to the orientation of the vehicle. Further explanation of the embodiment in FIG. 2 will be provided below.
  • FIG. 3 shows a schematic diagram of an alternative embodiment of the present invention.
  • a number of the features of the embodiment shown in FIG. 3 are similar to those shown in FIG. 1 .
  • similar features in FIG. 3 are denoted by the same reference numeral as used to denote those features in FIG. 1 , but increased by 200.
  • tractor 210 in FIG. 3 corresponds to tractor 10 in FIG. 1 .
  • the embodiment shown in FIG. 3 also includes a controller 214 and an optical movement sensor 216 .
  • the embodiment shown in FIG. 3 also includes one or more inertial sensors 221 mounted to the tractor.
  • the one or more inertial sensors may comprise one or more accelerometers and/or gyroscopes.
  • vehicle based sensors may be used instead of the inertial sensors. These may include magnetometers, wheel angle sensors and/or wheel speed encoders.
  • a combination of inertial sensors and vehicle-based sensors may also be used.
  • An assembly of sensors such as an Inertial Navigation System (INS), a Dynamic Measurement Unit (DMU), an Inertial Sensor Assembly (ISA), a Vertical Gyro (VG) or an Attitude Heading Reference System (AHRS) may be used in feature 221 .
  • the inertial sensors may comprise one or more, or an assembly of sensors including accelerometers and rate gyroscopes for providing further position and attitude signals to the controller.
  • the assembly may comprise between one and three sensor sets orthogonally mounted, with each sensor set comprising not necessarily one of each, but no more than one of each of the above mentioned sensors.
  • the sensor assembly 221 provides relative position and attitude information to the controller 214 .
  • the optical movement sensor 216 also provides relative position and attitude information to controller 214 .
  • the controller uses both sets of information to obtain a more accurate determination of the position and attitude of the vehicle. This will be described in greater detail hereunder. Also, as described above with reference to the embodiments in FIGS. 1 and 2 , if the optical movement sensor 216 in FIG. 3 is the kind used in optical computer mice, and if it is fixed to the vehicle, two or more such fixed optical movement sensors would need to be provided if the optical movement sensor is to be used to measure changes in the vehicle's orientation about its yaw Z axis.
  • a single optical movement sensor might be used, provided the single optical movement sensor is mounted in a gimballed manner and the device or mechanism 219 used for the gimballed mounting can monitor the changes in the orientation of the optical movement sensor relative to the orientation of the vehicle.
  • FIG. 4 shows a schematic view of a further embodiment of the present invention.
  • a number of the features of the embodiment shown in FIG. 4 are similar to those shown in FIG. 2 .
  • similar features in FIG. 4 are denoted by the same reference numeral as used to denote those features in FIG. 2 , but with the leading “1” of the reference numerals used in FIG. 2 replaced with a leading “3” in FIG. 4 (i.e., plus 200).
  • tractor 310 in FIG. 4 corresponds to tractor 110 in FIG. 2 .
  • FIG. 4 includes an optical movement sensor 316 , a GNSS-based system 399 (shown in FIG. 8 ) and an inertial sensor and/or vehicle based sensor 321 . These sensors interact with the controller in a manner that will be described hereunder.
  • FIG. 5 shows a schematic flow sheet of the interaction between the optical movement sensor and the controller.
  • the flow sheet is based upon the embodiments shown in FIGS. 1A-C . Only one optical movement sensor is shown in FIG. 5 .
  • the way that two or more optical movement sensors can be used to measure changes in the vehicle's orientation about its yaw axis has already been explained.
  • providing two or more optical movement sensors may also provide the additional benefit of increasing the accuracy of the position and attitude information calculated by the optimal estimator (compared with systems that use only a single optical movement sensor) due to the greater amount of information available upon which the estimates can be based.
  • the controller 14 of FIG. 1 is shown by dotted outline 14 A.
  • the controller of FIG. 5 includes an optimal estimator 60 and an error calculation module 62 .
  • the optimal estimator 60 and error calculation module 62 may form part of the computer memory and/or CPU of the controller 14 .
  • the particular programs required to run the optimal estimator and error calculation module may be written into the computer memory or they may be downloaded from a network or they may be loaded onto the computer memory via an optical drive, such as a CD drive or a DVD drive, or they may be loaded from any other form of recorded media.
  • the optimal estimator and the error calculation module may be provided in firmware associated with the controller 14 .
  • the optical movement sensor(s) 16 of FIG. 1 feeds position and attitude data into optimal estimator 60 .
  • Optimal estimator 60 acts to process the information from the optical movement sensor(s) 16 to provide a statistically optimal estimate of the position and attitude information received from the optical movement sensor(s).
  • the optimal estimator may include algorithms that receive the position and attitude information from optical movement sensor(s) 16 and convert that position and attitude information into a calculated or determined position and attitude of the tractor. This produces a statistically optimal estimate of the calculated or determined position and attitude of the tractor.
  • FIGS. 5-8 schematically represent the operation of the control system in accordance with different embodiments of the invention. However, it is also useful to consider the way in which the vehicle's parameters and dynamics are represented for the purposes of implementing the control system. Those skilled in the art will recognize that a range of methods may be used for this purpose. However, it is considered that one method is to represent the parameters and dynamics in “state space” form.
  • In state space representations, the variables or parameters used to mathematically model the motion of the vehicle, or aspects of its operation, are referred to as “states” x_i.
  • the states may include the vehicle's position (x,y), velocity, heading and other quantities used to describe the vehicle's motion.
  • X(t) = [x_1(t) x_2(t) x_3(t) x_4(t) … x_n(t)]^T, where n is the number of states.
  • the mathematical model used to model the vehicle's motion and aspects of its operation will comprise a series of differential equations.
  • the number of equations will be the same as the number of states.
  • in some situations the differential equations will be linear in terms of the states, whereas in other situations the equations may be nonlinear, in which case they must generally be “linearized” about a point in the “state space”. Linearization techniques that may be used to do this will be well known to those skilled in this area.
  • the quantities that are desired to be known about the vehicle are the outputs y from the model.
  • both the state equation and the measurement equation defined above are continuous functions of time.
  • continuous time functions do not often lend themselves to easy digital implementation (such as will generally be required in implementing the present invention) because digital control systems generally operate as recursively repeating algorithms. Therefore, for the purpose of implementing the equations digitally, the continuous time equations may be converted into the following recursive discrete time equations by making the substitutions set out below and noting that (according to the principle of superposition) the overall response of a linear system is the sum of the free (unforced) response of that system and the responses of that system due to forcing/driving inputs.
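  • One common zero-order-hold form of this continuous-to-discrete substitution, with sample period T_s and continuous state, input and noise matrices A, B and E, is shown below; this is a standard textbook form given for orientation rather than necessarily the exact substitution intended here.

        x_{k+1} = A_d\,x_k + B_d\,u_k + w_k, \qquad
        A_d = e^{A T_s}, \qquad
        B_d = \Big(\int_0^{T_s} e^{A\tau}\, d\tau\Big) B, \qquad
        w_k = \int_0^{T_s} e^{A(T_s-\tau)}\, E\, w(t_k+\tau)\, d\tau

    The first two terms are the free and forced responses that add by superposition; the last term is the discretized process-noise contribution, which is the kind of integral referred to in the next point.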
  • the quantity Ew(t) is not deterministic and so the integral defining L_{w,k+1} cannot be performed (even numerically). It is for this reason that it is preferable to use statistical filtering techniques.
  • the optimal estimator shown in FIGS. 5-8 will use such statistical techniques.
  • One particularly favorable technique involves the use of a Kalman filter to statistically optimize the states estimated by the mathematical model.
  • a Kalman filter operates as a “predictor-corrector” algorithm.
  • the algorithm operates by first using the mathematical model to “predict” the value of each of the states at time step k+1 based on the known inputs at time step k+1 and the known value of the states from the previous time step k. It then “corrects” the predicted value using actual measurements taken from the vehicle at time step k+1 and the optimized statistical properties of the model.
  • the Kalman filter comprises the following equations each of which is computed in the following order for each time step:
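  • The specific equations and notation are not reproduced here, but the predict-correct structure just described can be sketched in a minimal form as follows; the matrix names follow common textbook usage and are assumptions rather than the notation of this description.

        import numpy as np

        def kalman_step(x, P, u, z, A, B, C, Q, R):
            """One discrete-time Kalman filter cycle: predict, then correct.

            x, P : previous state estimate and its covariance
            u, z : known input and actual measurement at the new time step
            A, B, C : discrete state, input and measurement matrices
            Q, R : process and measurement noise covariances
            """
            # Predict the state and covariance forward one time step using the model.
            x_pred = A @ x + B @ u
            P_pred = A @ P @ A.T + Q

            # Correct the prediction using the actual measurement.
            K = P_pred @ C.T @ np.linalg.inv(C @ P_pred @ C.T + R)   # Kalman gain
            x_new = x_pred + K @ (z - C @ x_pred)                    # state update
            P_new = (np.eye(len(x)) - K @ C) @ P_pred                # covariance update
            return x_new, P_new

    Each call would correspond to one controller clock cycle, with the measurement z assembled from the optical movement sensor and whatever other sensors are present.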
  • The operation of the discrete-time Kalman filter which may be used in the optimal estimator of the present invention is schematically illustrated in FIG. 12 .
  • the statistically optimal estimate of the vehicle's position and attitude provided by the optimal estimator 60 is supplied to the error calculation module 62 .
  • the error calculation module 62 receives information on the required control path 64 (or the desired path of travel).
  • the required control path or the desired path of travel may be entered into the computer memory of the controller or it may be calculated from an initial wayline and further operating parameters, such as the width of the implement being towed by the tractor.
  • the error calculation module 62 uses the statistically optimal estimate of the position and attitude of the tractor obtained from the optimal estimator 60 and the desired position and attitude of the tractor determined from the required control path to calculate the error in position and attitude of the tractor. This may be calculated as an error in the x-coordinate, an error in the y-coordinate and an error in the heading of the position and attitude of the tractor. These error values are represented as “Ex”, “Ey” and “Eh” in FIG. 5 . These error values are used in a correction calculation module 66 to determine a correction value. The correction value may result in a curvature demand 68 , which represents a steering control signal that is sent to a steering control mechanism. The correction value is calculated as a function of the error in the coordinate values.
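  • A simplified sketch of the error and correction calculation described above; the specific correction law shown (a weighted sum of a cross-track term and the heading error) and the gain values are illustrative assumptions, not the particular function used by the correction calculation module.

        import math

        def position_errors(actual, desired):
            """Return (Ex, Ey, Eh): errors in x, y and heading (radians)."""
            Ex = desired["x"] - actual["x"]
            Ey = desired["y"] - actual["y"]
            Eh = (desired["heading"] - actual["heading"] + math.pi) % (2 * math.pi) - math.pi
            return Ex, Ey, Eh

        def curvature_demand(Ex, Ey, Eh, heading, k_xy=0.05, k_h=0.8):
            """Turn the error terms into a curvature demand (1/metres).

            The cross-track component of the position error is obtained by
            projecting (Ex, Ey) onto the axis perpendicular to the heading.
            """
            cross_track = -Ex * math.sin(heading) + Ey * math.cos(heading)
            return k_xy * cross_track + k_h * Eh

        actual = {"x": 100.2, "y": 50.0, "heading": math.radians(92.0)}
        desired = {"x": 100.0, "y": 50.0, "heading": math.radians(90.0)}
        Ex, Ey, Eh = position_errors(actual, desired)
        print(curvature_demand(Ex, Ey, Eh, actual["heading"]))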
  • FIG. 6 shows a schematic flow sheet of the interaction of the optical movement sensor(s) and GNSS sensor with the controller.
  • This flow sheet represents one possible implementation for use with the embodiment shown in FIG. 2 .
  • the error correction calculation module 62 , desired control path 64 , correction calculation module 66 and curvature demand 68 shown in FIG. 6 are essentially identical to those shown in FIG. 5 and will not be described further.
  • the controller, which is represented by dashed outline 114 A, receives positional data from the optical movement sensor(s) 116 and the GNSS system 199 .
  • the GNSS system 199 shown in FIG. 6 may correspond to the differential GNSS system described with reference to FIG. 2 .
  • the optimal estimator 160 receives positional data from the optical movement sensor 116 and from the GNSS system 199 .
  • the optimal estimator 160 analyses the positional data from the optical movement sensor and the GNSS system to provide a statistically optimal estimate of the position coordinates of the tractor.
  • the GNSS system provides absolute position coordinate data and the optical movement system provides relative position and attitude data. Both sources of data can be used to obtain a more accurate calculated or determined position and attitude of the vehicle.
  • the optical movement sensor continues to provide position and attitude data to the optimal estimator. In such circumstances, control of the vehicle can be effected by the information received from the optical movement sensor alone.
  • the optical movement sensor provides position and attitude data at a much greater frequency than a GNSS system. Therefore, the position and attitude data received from the optical movement sensor can be used to provide a determined or calculated vehicle position and attitude during periods between receipt of positional data from the GNSS system. This feature assists in maintaining enhanced accuracy in the position and attitude data.
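  • A sketch of the idea that high-rate relative data from the optical movement sensor can carry the position estimate between lower-rate absolute GNSS fixes; the update rates, class name and blending used here are illustrative assumptions.

        import math

        class DeadReckoner:
            """Propagate a position estimate with relative optical-sensor increments,
            re-anchoring it whenever an absolute GNSS fix arrives."""

            def __init__(self, x, y, heading):
                self.x, self.y, self.heading = x, y, heading

            def optical_update(self, d_along, d_across, d_heading):
                """Relative movement along/across the vehicle axis (metres) and
                change in heading (radians) since the previous optical reading."""
                self.heading += d_heading
                self.x += d_along * math.cos(self.heading) - d_across * math.sin(self.heading)
                self.y += d_along * math.sin(self.heading) + d_across * math.cos(self.heading)

            def gnss_update(self, x_fix, y_fix, weight=1.0):
                """Blend in an absolute GNSS position (weight=1.0 snaps to the fix)."""
                self.x += weight * (x_fix - self.x)
                self.y += weight * (y_fix - self.y)

        # e.g. many optical updates (say 100 Hz) between each GNSS fix (say 5 Hz):
        dr = DeadReckoner(0.0, 0.0, math.radians(90.0))
        for _ in range(20):
            dr.optical_update(0.05, 0.0, 0.0)      # 5 cm forward per optical reading
        dr.gnss_update(0.02, 1.01)                 # occasional absolute correction
        print(dr.x, dr.y)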
  • FIG. 7 shows a flow sheet of the interaction of optical movement sensor(s) and inertial sensors with the controller.
  • This flow sheet may be used in the embodiment shown in FIG. 3 .
  • the error correction calculation module 62 , desired control path 64 , correction calculation module 66 and curvature demand 68 shown in FIG. 7 are essentially identical to those as shown in FIGS. 5 and 6 and will not be described further.
  • the controller, which is represented by dashed outline 214 A, receives positional data from the optical movement sensor(s) 216 and the inertial sensors 221 . This positional data is received by the optimal estimator 260 .
  • the optimal estimator 260 analyses this data and provides a best estimate of the position of the vehicle.
  • FIG. 8 shows a flow sheet demonstrating the interaction of optical movement sensor(s), inertial sensors and the GNSS system.
  • the flow sheet shown in FIG. 8 may be used as an implementation for the embodiment shown in FIG. 4 .
  • the error correction calculation module 62 , desired control path 64 , correction calculation module 66 and curvature demand 68 shown in FIG. 8 are essentially identical to those shown in FIGS. 5-7 and will not be described further.
  • the optimal estimator 360 receives positional data from the optical movement sensor 316 , the GNSS system 399 and the inertial sensors 321 . This data is sent to the optimal estimator 360 , which produces a best estimate of the position of the vehicle. This is then sent to the error calculation module 62 .
  • FIG. 9 shows a schematic view of another embodiment of the present invention in which an optical movement sensor 416 A is mounted to the implement.
  • two or more optical movement sensors may be provided, or a single optical movement sensor may be provided with the gimballed mounting.
  • the optical movement sensor(s) 416 A is used to provide positional data relating to the position of the implement 412 .
  • FIG. 9 other features are essentially identical to those shown in FIG. 1 and, for convenience, have been denoted by the same reference numbers as used in FIG. 1 , but increased by 400, and need not be described further.
  • it will be appreciated that the embodiments shown in FIGS. 2-4 may also be modified by replacing the optical movement sensors in those embodiments with an optical movement sensor mounted to the implement of those embodiments. It will also be appreciated that the position of the implement may be determined as well as the position of the vehicle. In such cases, the optical movement sensor 416 A mounted to the implement 412 (as shown in FIG. 9 ) may comprise an additional optical movement sensor to the optical movement sensor mounted to the tractor, as shown in FIGS. 1 to 4 .
  • FIG. 10 shows a schematic diagram of one possible embodiment of an optical movement sensor that may be used in the present invention.
  • the optical movement sensor 500 shown in FIG. 10 includes a housing or enclosure 502 .
  • the housing 502 holds an illumination source 504 , in the form of a ring of LEDs.
  • the ring of LEDs is shown more clearly in FIG. 11 .
  • the housing 502 also houses a charge-coupled device (CCD) detector and an integrated optical movement sensor chip 506 .
  • the detector and optical movement sensor chip 506 is suitably taken from an optical computer mouse.
  • the housing 502 also houses a lens 510 (which will suitably be a telecentric lens). Light from the ring of LEDs that is reflected from the ground 512 is focused by the lens 510 onto the detector 506 .
  • a nozzle 514 may be positioned close to the lens 510 .
  • the nozzle 514 may periodically or continuously blow a jet of air over the lens 510 to thereby blow away any dirt or debris that may have settled on the lens.
  • FIG. 10 also shows the field of illumination 516 and the field of view 518 provided by the arrangement 500 .
  • the optical movement chip 506 sends signals to the optimal estimator, as shown in FIGS. 5 to 8 . These signals may be sent via a wire 509 or via an appropriate wireless connection.
  • the present invention provides control systems that can be used to control the movement of the vehicle or an implement associated with the vehicle.
  • the control system includes an optical movement sensor that may be the operative part of an optical computer mouse. These optical movement sensors are relatively inexpensive, provide a high processing rate and utilize proven technology. Due to the high processing rate of such optical movement sensors, the control system has a high clock speed and therefore a high frequency of updating of the determined or calculated position of the vehicle or implement.
  • the optical movement sensor may be used by itself or it may be used in conjunction with a GNSS system, one or more inertial sensors, or one or more vehicle based sensors.
  • the optical movement sensor can be used to augment the accuracy of inertial and/or other sensors. In particular, the optical movement sensor can be used to debias yaw drift that is often inherent in inertial sensors.
  • The inherent “linear” nature of existing control systems is illustrated schematically in FIG. 13 . Whilst the “real world” spatial geometry of the respective swaths shown on the left in FIG. 13 may have been calculated, nevertheless from the control system's point of view at any given time the controller only “knows” that the vehicle is on the nth swath and that it has been moving along that swath for a known amount of time with known speed. Hence, at a fundamental level, the controller does not inherently know where the vehicle is located in space. This is represented graphically in FIG. 13 .
  • FIG. 14 shows an agricultural vehicle 601 having a control system in accordance with one embodiment of the present invention.
  • the agricultural vehicle 601 is a tractor towing an implement 602 .
  • the implement 602 could be a plow, harvester, seed sower, leveler, agricultural chemical applicator/dispenser or any other kind of agricultural implement.
  • the embodiment of the invention shown in FIG. 14 could equally be applied on other kinds of vehicles operating in other areas, for example cars, mine-trucks, airport tarmac vehicles, etc.
  • the components of the control system in the particular embodiment shown in FIG. 14 include a main control unit (MCU) 603 , a GPS antenna 604 and actuators 605 .
  • the main control unit 603 houses the spatial database and also the electronic hardware used to implement the controller.
  • the main control unit 603 may be an industrial computer (for example an industrial PC) capable of running other applications in addition to the vehicle control system.
  • the main control unit 603 may be a purpose-built unit containing only the hardware required to run the controller, the spatial database and the other components of the vehicle control system.
  • the main control unit 603 receives GPS signals from the GPS antenna 604 , and it uses these (typically in combination with feedback and/or other external spatial data signals) to generate a control signal for steering the vehicle.
  • the control signal will typically be made up of a number of components or streams of data relating to the different parameters of the vehicle being controlled, for example the vehicle's “cross-track error”, “heading error”, “curvature error”, etc. These parameters will be described further below.
  • the control signal is amplified using suitable signal amplifiers (not shown) to create a signal that is sufficiently strong to drive the actuators 605 .
  • the actuators 605 are interconnected with the vehicle's steering mechanism (not shown) such that the actuators operate to steer the vehicle as directed by the control signal.
  • further actuators may also be provided which are interconnected with the vehicle's accelerator and/or braking mechanisms, and the control signal may incorporate components or signal streams relating to the vehicle's forward progress (i.e. its forward speed, acceleration, deceleration, etc.).
  • the component(s) of the control signal relating to the vehicle's forward progress may also be amplified by amplifiers (not shown) sufficiently to cause the actuators which are interconnected with the accelerator/braking mechanism to control the vehicle's acceleration/deceleration in response to the control signal.
  • the vehicle 601 may also be optionally provided with one or more optical sensors 606 , one or more inertial sensors (IS) 607 and a user terminal (UT) 608 .
  • One form of optical sensor 606 that may be used may operate by receiving images of the ground beneath the vehicle, preferably in rapid succession, and correlating the data pertaining to respective successive images to obtain information relating to the vehicle's motion.
  • Other forms of optical sensor may also be used including LIDAR (Light Detection and Ranging) or sensors which operate using machine vision and/or image analysis.
  • the one or more inertial sensors 607 will typically include at least one gyroscope (e.g., a rate gyroscope), although the inertial sensors 607 could also comprise a number of sensors and components (such as accelerometers, tilt sensors and the like) which together form a sophisticated inertial navigation system (INS).
  • the vehicle may be further provided with additional sensors (not shown) such as sensors which receive information regarding the location of the vehicle relative to a fixed point of known location in or near the field, magnetometers, ultrasonic range and direction finding and the like. The data generated by these additional sensors may be fed into the database and used by the control system to control the vehicle as described below.
  • the user terminal 608 may comprise a full computer keyboard and separate screen to enable the user to utilize the full functionality of the computer.
  • the terminal 608 may comprise, for example, a single combined unit having a display and such controls as may be necessary for the user to operate the vehicle's control system. Any kind of controls known by those skilled in this area to be suitable may be used on the main control unit, including keypads, joysticks, touch screens and the like.
  • the user terminal 608 is positioned in the vehicle cabin so that it can be operated by the driver as the vehicle moves.
  • the present control system could also be operated by wireless remote control, meaning that the user terminal 608 could alternatively be totally separate from the vehicle and could operate the vehicle's control system from a remote location.
  • a single remote user terminal 608 may be used to wirelessly interface with the control systems of multiple vehicles (possibly simultaneously) so that the user can control multiple moving vehicles from the one remote terminal.
  • the “cross-track error” is the lateral difference between the vehicle's actual position and its desired position. This is illustrated by the bracket in FIG. 15 .
  • the “heading error” is the difference between the vehicle's actual instantaneous direction of motion h (i.e. its actual compass heading), and its desired instantaneous direction of motion H.
  • the “curvature error” is the difference between the actual instantaneous radius of curvature r of the vehicle's motion and the desired instantaneous radius of curvature R.
  • a vehicle control system in accordance with one particular embodiment of the invention comprises: a task path generator; a spatial database; at least one external spatial data source; a vehicle attitude compensation module; a position error generator; a controller; and actuators to control (steer) the vehicle.
  • the desired path trajectory for the vehicle is first entered into the control system by the user via the user terminal 608 .
  • the task path generator interprets this user-defined path definition and converts it into a series of points of sufficient spatial density to adequately represent the desired path to the requisite level of precision.
  • the task path generator typically also defines the vehicle's desired trajectory along the user-defined path, for example, by generating a desired vehicle position, a desired heading H and a desired instantaneous radius of curvature R for each point on the path. This information is then loaded into the spatial database.
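  • A minimal sketch of what the task path generator might produce for a straight user-defined wayline: a densely spaced series of points, each carrying a desired position, heading H and (for a straight segment) an infinite desired radius of curvature; the point spacing and function name are assumptions for illustration.

        import math

        def generate_task_path(x0, y0, x1, y1, spacing=0.2):
            """Convert a straight wayline into a dense series of path points,
            each carrying the desired position, heading H and radius of curvature R."""
            length = math.hypot(x1 - x0, y1 - y0)
            heading = math.atan2(y1 - y0, x1 - x0)
            n = max(int(length / spacing), 1)
            points = []
            for i in range(n + 1):
                t = i / n
                points.append({
                    "x": x0 + t * (x1 - x0),
                    "y": y0 + t * (y1 - y0),
                    "heading": heading,            # desired heading H along the segment
                    "radius": math.inf,            # straight segment: no curvature
                })
            return points

        path = generate_task_path(0.0, 0.0, 100.0, 0.0, spacing=0.2)
        print(len(path), path[0], path[-1])   # 501 points at 0.2 m spacing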
  • The way in which this and other spatial information is stored within the database in representative embodiments, and in particular the way in which pieces of data are given memory allocations according to their spatial location, is described further below.
  • the control system progressively receives updated information regarding spatial location from the external spatial data sources.
  • the external spatial data sources will typically include GPS. However, a range of other spatial data sources may also be used in addition to, or in substitute for GPS. For example, the inertial navigation systems (INS), visual navigation systems, etc. described above may also be used as external data sources in the present control system.
  • the spatial data collected by the external spatial data sources actually pertains to the specific location of the external spatial data receivers, not necessarily the vehicle/implement reference location itself (which is what is controlled by the control system).
  • the reference location is on the vehicle 601 and is indicated by the intersection (i.e. the origin) of the roll, pitch and yaw axes. In other embodiments, the reference location may be located elsewhere on the vehicle, or on the implement 602 , etc. In any event, to illustrate this point, it will be seen that the GPS antenna 604 in FIG. 14 is located on the roof of the vehicle some distance from the vehicle's reference point.
  • the spatial data collected by the GPS antenna actually relates to the instantaneous location of the vehicle's roof, not the location of the vehicle's reference point.
  • the spatial data collected by the optical sensor 606 actually pertains to the particular location of the optical sensor (slightly out in front of the vehicle in FIG. 14 ).
  • changes in the vehicle's attitude will also influence the spatial position readings received by the different receivers. For example, if one of the vehicle's wheels passes over, or is pushed sideways by a bump, this may cause the vehicle to rotate about at least one (and possibly two or three) of the axes shown in FIG. 14 . This will in turn change the relative position of the spatial data receiver(s) such as GPS antenna 604 with respect to the reference location on the vehicle or implement. This can be used (typically in combination with other sources of external spatial data or “feedback” data) to determine the orientation of the vehicle. The orientation of the vehicle may be considered to be the relative orientation of the vehicle's axes in space.
  • a vehicle attitude compensation module is provided. This is shown in FIG. 16 .
  • the vehicle attitude compensation module converts all readings taken by the various spatial data receivers (which relate to the different specific locations of the receivers) into readings pertaining to the spatial location and orientation of the vehicle's reference point. This data pertaining to the spatial location and orientation of the vehicle's reference point is then fed into the spatial database.
  • the one or more external spatial data sources will progressively receive updated data readings in rapid succession (e.g., in “real time” or as close as possible to it). These readings are then converted by the vehicle attitude compensation module and fed into the spatial database. The readings may also be filtered as described above. Therefore, whilst each reading from each spatial data source is received, converted (ideally filtered) and entered into the spatial database individually, nevertheless the rapid successive way in which these readings (possibly from multiple “parallel” data sources) are received, converted and entered effectively creates a “stream” of incoming spatial data pertaining to the vehicle's continuously changing instantaneous location and orientation. In order to provide sufficient bandwidth, successive readings from each external spatial data source should be received and converted with a frequency of the same order as the clock speed (or at least one of the clock speeds) of the controller, typically 3 Hz-12 Hz or higher.
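  • A simplified two-dimensional sketch of the compensation described above: a reading taken at the antenna (or other receiver) is translated to the vehicle's reference point using the known mounting offset and the vehicle's current attitude. The offsets shown are illustrative assumptions, and a full implementation would use the three-dimensional roll/pitch/yaw rotation rather than heading alone.

        import math

        def antenna_to_reference(antenna_x, antenna_y, heading, offset_fwd, offset_left):
            """Convert a reading taken at the antenna into a reading for the vehicle's
            reference point, given the antenna's mounting offset (metres forward and
            to the left of the reference point) and the vehicle's heading (radians)."""
            dx = offset_fwd * math.cos(heading) - offset_left * math.sin(heading)
            dy = offset_fwd * math.sin(heading) + offset_left * math.cos(heading)
            return antenna_x - dx, antenna_y - dy

        # Antenna mounted 1.5 m ahead of and 0.2 m left of the reference point:
        print(antenna_to_reference(500.0, 200.0, math.radians(90.0), 1.5, 0.2))
        # -> approximately (500.2, 198.5)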
  • the position error generator next receives information from the spatial database.
  • the information it receives from the database includes: the vehicle's desired position, heading H and instantaneous radius of curvature R. (It will be recalled that this information is originally generated by the task path generator and then entered into the spatial database, based on the user-defined path trajectory); and the vehicle's actual position, heading h and instantaneous radius of curvature r. (This information is based on spatial data progressively received from the external spatial data sources as described above, and typically also on data received through feedback.)
  • the position error generator uses this information to calculate an instantaneous “error term” for the vehicle.
  • the “error term” incorporates the vehicle's instantaneous cross-track error, heading error and curvature error (as described above).
  • the error term is then fed into the controller.
  • the controller is shown in greater detail in FIG. 17 .
  • the controller incorporates a cross-track error proportional-integral-derivative (PID) controller, a heading error PID controller and a curvature error PID controller.
  • PID controllers used with the present invention are of a conventional form that will be well understood by those skilled in this area and need not be described in detail.
  • the output from the cross-track error, heading error and curvature error PID controllers then passes through a curvature demand signal integrator.
  • the output from the PID controllers is therefore integrated in order to generate a curvature demand signal.
  • This curvature demand signal is thus the “control signal” which is amplified by amplifiers (not shown) before proceeding to drive the actuators as required.
  • the signal obtained by integrating the output from the PID controllers is amplified and sent to the actuators in the form of a curvature demand to change the vehicle's steering angle and hence steer the vehicle back onto the desired path.
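By way of illustration only, the following Python sketch mirrors the control chain just described: three PID controllers acting on the cross-track, heading and curvature errors, with their combined output integrated to form the curvature demand signal. The class names, gain values, time step and the simple summation of the three PID outputs are assumptions made for this sketch and are not taken from the patent.

```python
class PID:
    """Minimal proportional-integral-derivative controller."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


class CurvatureDemandController:
    """Combines cross-track, heading and curvature PID outputs and integrates
    the result to produce a curvature demand signal."""
    def __init__(self):
        # Illustrative gains only.
        self.xtrack_pid = PID(0.5, 0.01, 0.1)
        self.heading_pid = PID(1.0, 0.0, 0.2)
        self.curvature_pid = PID(0.8, 0.0, 0.0)
        self.curvature_demand = 0.0

    def update(self, xtrack_err, heading_err, curvature_err, dt):
        combined = (self.xtrack_pid.update(xtrack_err, dt)
                    + self.heading_pid.update(heading_err, dt)
                    + self.curvature_pid.update(curvature_err, dt))
        # Curvature demand signal integrator.
        self.curvature_demand += combined * dt
        return self.curvature_demand  # amplified and sent to the actuators
```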
  • the change in vehicle pose, etc., caused by the control-driven change in steering angle is registered via the updated information received through the external data sources (GPS etc.) and the vehicle's new position, heading and instantaneous radius of curvature are re-entered into the spatial database to complete the control system's overall closed-loop control structure.
  • the arrows extending from the actuators/steering mechanism to the external data sources in FIG. 16 are dashed rather than solid lines.
  • in FIG. 18 there is shown a slightly more elaborate embodiment of the control system.
  • the embodiment shown in FIG. 18 is generally the same as that shown in FIG. 16 , except that the embodiment in FIG. 18 incorporates an optimizing filter and an external obstacle detection input.
  • the optimizing filter can operate to statistically optimize at least some of the spatial data contained in the spatial data base.
  • the filter will generally operate as an “observer”, meaning that it does not form part of the control loop. Rather, the filter will typically reside outside the control loop and it will generally operate by taking data directly from the database and returning optimized data directly into the database, as shown in FIG. 18 .
  • the filter will take the updated “feedback” data that re-enters the database from the control loop (described above) together with the updated spatial data obtained from the external spatial data sources (after it has been processed by the vehicle attitude compensation module) and it will then use these disparate streams of data to calculate a statistically optimized updated estimate of, for example, the vehicle's instantaneous position, heading and radius of curvature.
  • the filter will typically comprise a Kalman filter.
  • the external obstacle detection input may comprise any form of vision based, sound based or other obstacle detection means, and the obstacle detection data may be converted by the vehicle attitude compensation module (just like the other sources of external data discussed above) and then fed into the spatial database.
  • if the control system incorporates obstacle detection, it is then necessary for the task path generator to be able to receive updated information from the spatial database. This is so that if an obstacle is detected on the desired path, an alternative path that avoids the obstacle can be calculated by the task path generator and re-entered into the database.
  • the ability of the task path generator to also receive data from the spatial database is indicated by the additional arrow from the spatial database to the task path generator in FIG. 18 .
  • FIGS. 16-18 graphically represent the operation of the control system. However, it is also useful to consider the way in which the vehicle's parameters and dynamics are represented for the purposes of implementing the control system. Those skilled in the art will recognize that a range of methods may be used for this purpose. However, one suitable method is to represent the parameters and dynamics in “state space” form.
  • in state space representations, the variables or parameters used to mathematically model the motion of the vehicle, or aspects of its operation, are referred to as “states” xi. In the present case, the states may include the vehicle's position (x,y), velocity, and so on.
  • X(t) = [x1(t) x2(t) x3(t) x4(t) . . . xn(t)]^T, where n is the number of states.
  • the mathematical model used to model the vehicle's motion and aspects of its operation will comprise a series of differential equations.
  • the number of equations will be the same as the number of states.
  • in some situations the differential equations will be linear in terms of the states, whereas in other situations the equations may be nonlinear, in which case they must generally be “linearised” about a point in the “state space”. Linearisation techniques that may be used to do this will be well known to those skilled in this area.
  • the process noise represents errors in the model and vehicle dynamics which exist in the actual vehicle but which are not accounted for in the model.
  • Ew(t) represents an unknown quantity; its contents are not known. However, for reasons that will be understood by those skilled in this area, in order to allow statistically optimized signal processing and state estimation Ew(t) is generally assumed to be Gaussian, white, have zero mean and to act directly on the state derivatives. It is also assumed that the process noise element associated with each individual state is uncorrelated with the process noise element of the other states.
  • the quantities that are desired to be known about the vehicle are the outputs yi from the model.
  • both the state equation and the measurement equation defined above are continuous functions of time.
  • continuous time functions do not often lend themselves to easy digital implementation (such as will generally be required in implementing the present invention) because digital control systems generally operate as recursively repeating algorithms. Therefore, for the purpose of implementing the equations digitally, the continuous time equations may be converted into the following recursive discrete time equations by making the substitutions set out below and noting that (according to the principle of superposition) the overall response of a linear system is the sum of the free (unforced) response of that system and the responses of that system due to forcing/driving inputs.
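The substitutions referred to above are not reproduced in this text. For orientation only, a standard textbook discretisation of a linear state-space model, consistent with the Lw_k+1 term discussed next, has the following form; the matrices A, B, C, E, F, G and L and the zero-order-hold assumption are conventional labels assumed for this sketch and may differ from the patent's own notation:

```latex
% Continuous-time model (assumed standard form):
%   \dot{X}(t) = A\,X(t) + B\,u(t) + E\,w(t), \qquad y(t) = C\,X(t) + v(t)
% Recursive discrete-time equivalent over a time step \Delta t:
X_{k+1} = F\,X_{k} + G\,u_{k+1} + L\,w_{k+1}, \qquad y_{k+1} = C\,X_{k+1} + v_{k+1}
% where, assuming a zero-order hold on the inputs,
F = e^{A\,\Delta t}, \qquad G = \int_{0}^{\Delta t} e^{A\tau} B \, d\tau
```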
  • the quantity Lw_k+1 is the (forced) response of the system due to the random “error” inputs that make up the process noise.
  • this quantity may be defined as:
  • the Kalman Filter operates as a “predictor-corrector” algorithm.
  • the algorithm operates by first using the mathematical model to “predict” the value of each of the states at time step k+1 based on the known inputs at time step k+1 and the known value of the states from the previous time step k. It then “corrects” the predicted value using actual measurements taken from the vehicle at time step k+1 and the optimized statistical properties of the model.
  • the Kalman Filter comprises the following equations each of which is computed in the following order for each time step:
  • the subscript notation k+1|k+1 means the value of the quantity at time step k+1 given updated information from time step k+1.
  • P is the co-variance in the difference between the estimated and actual value of X.
  • Q is the co-variance in the process noise.
  • K is the “Kalman gain” which is a matrix of computed coefficients used to optimally “correct” the initial state estimate.
  • R is the co-variance in the measurement noise.
  • the measurement vector contains the measurement values taken from the actual vehicle.
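The Kalman Filter equations themselves do not reproduce legibly in this text. For reference, the standard predictor-corrector form is given below, written using the symbols P, Q, K and R defined above; the measurement matrix H, the measurement vector z, and the state transition and input matrices F and G are conventional labels assumed for this sketch and may differ from the notation used in the patent figures:

```latex
% Prediction (time update):
\hat{X}_{k+1|k} = F\,\hat{X}_{k|k} + G\,u_{k+1}
P_{k+1|k} = F\,P_{k|k}\,F^{T} + Q
% Correction (measurement update):
K_{k+1} = P_{k+1|k}\,H^{T}\left(H\,P_{k+1|k}\,H^{T} + R\right)^{-1}
\hat{X}_{k+1|k+1} = \hat{X}_{k+1|k} + K_{k+1}\left(z_{k+1} - H\,\hat{X}_{k+1|k}\right)
P_{k+1|k+1} = \left(I - K_{k+1}\,H\right)P_{k+1|k}
```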
  • the hash table typically operates as a form of index allowing the computer (in this case the control system CPU) to “look up” a particular piece of data in the database (i.e. to look up the location of that piece of data in memory).
  • pieces of data pertaining to particular locations along the vehicle's path are assigned different hash keys based on the spatial location to which they relate.
  • the hash table then lists a corresponding memory location for each hash key.
  • the CPU is able to “look up” data pertaining to a particular location by looking up the hash key for that location in the hash table which then gives the corresponding location for the particular piece of data in memory.
  • the hash keys for different pieces of spatial data can be assigned in such a way that “locality” is maintained. In other words, points which are close to each other in the real world should be given closely related indices in the hash table (i.e. closely related hash keys).
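As an illustration of the look-up mechanism described above, the short Python sketch below stores data records against integer hash keys in a dictionary, which plays the role of the hash table. The key function shown is a deliberately simple placeholder; the spatial hash schemes discussed in the following paragraphs would be substituted for it.

```python
def placeholder_hash_key(x: int, y: int) -> int:
    # Simple placeholder; a locality-preserving scheme such as the
    # "bitwise interleaving" described below would be used in practice.
    return (x << 8) | y


class SpatialHashTable:
    """Maps spatial hash keys to stored data records ("memory locations")."""

    def __init__(self, key_fn=placeholder_hash_key):
        self.key_fn = key_fn
        self.table = {}  # hash key -> data record

    def store(self, x, y, record):
        self.table[self.key_fn(x, y)] = record

    def lookup(self, x, y):
        return self.table.get(self.key_fn(x, y))


db = SpatialHashTable()
db.store(3, 2, {"desired_heading": 90.0})
print(db.lookup(3, 2))  # {'desired_heading': 90.0}
```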
  • the spatial hash algorithm used to generate hash keys for different spatial locations in representative embodiments of the present invention may be most easily explained by way of a series of examples.
  • in FIG. 20 the successive points which define the path are described by a simplified integer based (X,Y) coordinate system.
  • in terms of these (X,Y) coordinates, the vehicle moves in the X direction along the entire length of the first swath from (0,0) to (4,0), before moving up in the Y direction to then move back along the second swath in the opposite direction from (4,1) to (0,1), etc.
  • the hash table must operate by listing the hash key for each particular spatial location together with the corresponding memory location for data pertaining to that spatial location. Therefore, the hash table is inherently one-dimensional, and yet it must be used to link hash keys to corresponding memory allocations for data that inherently pertains to two-dimensional space.
  • Coordinates    Hash key (hexadecimal)    Hash key (decimal)
    (0, 0)         0x0                       0
    (1, 0)         0x0                       0
    (2, 0)         0x0                       0
    (3, 0)         0x0                       0
    (4, 0)         0x0                       0
    (0, 1)         0x1                       1
    (1, 1)         0x1                       1
    (2, 1)         0x1                       1
    (3, 1)         0x1                       1
    (4, 1)         0x1                       1
    (0, 2)         0x2                       2
    (1, 2)         0x2                       2
    (2, 2)         0x2                       2
    (3, 2)         0x2                       2
    (4, 2)         0x2                       2
    (0, 3)         0x3                       3
    (1, 3)         0x3                       3
    (2, 3)         0x3                       3
    (3, 3)         0x3                       3
    (4, 3)         0x3                       3
    (0, 4)         0x4                       4
    (1, 4)         0x4                       4
    (2, 4)         0x4                       4
    (3, 4)         0x4                       4
    (4, 4)         0x4                       4
  • the prefix “0x” indicates that the numbers in question are expressed in hexadecimal format. This is a conventional notation.
  • Coordinates    Hash key (hexadecimal)    Hash key (decimal)
    (0, 0)         0x0                       0
    (1, 0)         0x100                     256
    (2, 0)         0x200                     512
    (3, 2)         0x302                     770
    (4, 2)         0x402                     1026
    (0, 3)         0x3                       3
  • Coordinates    Hash key (hexadecimal)    Hash key (decimal)
    (0, 0)         0x0                       0
    (1, 0)         0x100                     256
    (2, 0)         0x200                     512
    (3, 0)         0x300                     768
    (4, 0)         0x400                     1024
    (0, 1)         0x1                       1
    (1, 1)         0x101                     257
    (2, 1)         0x201                     513
    (3, 1)         0x301                     769
    (4, 1)         0x401                     1025
    (0, 2)         0x2                       2
    (1, 2)         0x102                     258
    (2, 2)         0x202                     514
    (3, 2)         0x302                     770
    (4, 2)         0x402                     1026
    (0, 3)         0x3                       3
    (1, 3)         0x103                     259
    (2, 3)         0x203                     515
    (3, 3)         0x303                     771
    (4, 3)         0x403                     1027
    (0, 4)         0x4                       4
    (1, 4)         0x104                     260
    (2, 4)         0x204                     516
    (3, 4)         0x304                     772
    (4, 4)         0x404                     1028
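The two simple key schemes tabulated above can be written as one-line functions; the second function assumes, consistently with the hexadecimal values in the table, that the X coordinate occupies the high-order byte and the Y coordinate the low-order byte:

```python
def hash_key_y_only(x: int, y: int) -> int:
    # First scheme: every point in a row receives the same key (poor uniqueness).
    return y


def hash_key_x_high_y_low(x: int, y: int) -> int:
    # Second scheme: X in the high-order byte, Y in the low-order byte.
    return (x << 8) | y


assert hash_key_y_only(3, 0) == 0x0
assert hash_key_x_high_y_low(3, 2) == 0x302  # 770 decimal
assert hash_key_x_high_y_low(4, 1) == 0x401  # 1025 decimal
```

Note that under the second scheme the horizontally adjacent points (0, 0) and (1, 0) receive the widely separated keys 0 and 256, which illustrates the locality issue that the bitwise interleaving scheme described below addresses.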
  • Yet a further method for generating hash keys is to use a technique which shall hereinafter be referred to as “bitwise interleaving”.
  • the first step in this technique is to represent the (X,Y) coordinates in binary form.
  • the point (X,Y) may be re-written in 8-bit binary array notation as (X1X2X3X4X5X6X7X8, Y1Y2Y3Y4Y5Y6Y7Y8).
  • the successive bits from the X and Y binary coordinates are alternatingly “interleaved” to give the following 16-bit binary hash key X1Y1X2Y2X3Y3X4Y4X5Y5X6Y6X7Y7X8Y8.
  • the hash keys generated using this method for each point on the vehicle path in FIG. 20 are given in Table 3 below.
  • TABLE 3
    Coordinates    Hash key (hexadecimal)    Hash key (decimal)
    (0, 0)         0x0                       0
    (1, 0)         0x2                       2
    (2, 0)         0x8                       8
    (3, 0)         0xa                       10
    (4, 0)         0x20                      32
    (0, 1)         0x1                       1
    (1, 1)         0x3                       3
    (2, 1)         0x9                       9
    (3, 1)         0xb                       11
    (4, 1)         0x21                      33
    (0, 2)         0x4                       4
    (1, 2)         0x6                       6
    (2, 2)         0xc                       12
    (3, 2)         0xe                       14
    (4, 2)         0x24                      36
    (0, 3)         0x5                       5
    (1, 3)         0x7                       7
    (2, 3)         0xd                       13
    (3, 3)         0xf                       15
    (4, 3)         0x25                      37
    (0, 4)         0x10                      16
    (1, 4)         0x12                      18
    (2, 4)         0x18                      24
    (3, 4)         0x1a                      26
    (4, 4)         0x30                      48
  • the decimal number 3 may be written as 11 in binary notation.
  • the decimal number 4 is written as 100 in binary. Therefore, the location (3,4) may be rewritten in 8-bit binary array notation as (00000011, 00000100). Bitwise interleaving these binary coordinates then gives the single 16-bit binary hash key 0000000000011010, which can equivalently be written as the hexadecimal number 0x1a or the decimal number 26.
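A minimal Python sketch of the bitwise interleaving described above, using the 8-bit coordinate width from the example; it reproduces the worked result for the point (3,4), namely 0x1a (decimal 26):

```python
def interleave_hash(x: int, y: int, bits: int = 8) -> int:
    """Interleave the bits of X and Y (X bit first, most significant bits
    first) into a single hash key of 2*bits binary digits."""
    key = 0
    for i in range(bits - 1, -1, -1):      # from the most significant bit down
        key = (key << 1) | ((x >> i) & 1)  # Xn
        key = (key << 1) | ((y >> i) & 1)  # Yn
    return key


assert interleave_hash(3, 4) == 0x1a  # decimal 26, as in the example above
assert interleave_hash(4, 3) == 0x25  # decimal 37 (compare Table 3)
```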
  • generating hash keys by “bitwise interleaving” the X and Y coordinates leads to unique hash keys (in this example) for each spatial location.
  • the hash keys generated in this way satisfy the requirement that points which are close together in the real world are assigned closely related hash keys. For example, consider again the points (0,0) and (1,0). The hash keys now assigned to these points by “bitwise interleaving” (when written in decimal notation) are 0 and 2 respectively. Furthermore, the point (0,1) which is also nearby is also assigned the closely related hash key 1. Conversely, points which are separated by a considerable distance in the real world are given considerably differing hash keys, for example, the hash key for (4,3) is 37.
  • GPS and other similar systems which describe spatial location typically do so using IEEE double-precision floating-point numbers (not simple integers). For instance, GPS supplies coordinates in the form of (X,Y) coordinates where X corresponds to longitude, and Y corresponds to latitude. Both X and Y are given in units of decimal degrees.
  • FIG. 21 shows an example vehicle path similar to that shown in FIG. 20 , except that the coordinates used to describe the points along the path in FIG. 21 correspond to a “realistic” coordinate system such as that used by current GPS.
  • a double-precision floating-point number represented in accordance with the IEEE 754 standard comprises a string of 64 binary characters (64 bits) as shown in FIG. 22 .
  • the number is represented in three parts, namely the sign, the exponent and the mantissa.
  • the sign comprises one bit. If the sign bit is 1 then the number is negative, and conversely if the sign bit is 0 then the number is positive.
  • the exponent comprises eleven binary characters, and hence can range from 00000000000 to 11111111111. However, because of the need to represent numbers that are both greater and smaller than one, it is necessary to be able to represent both large positive and large negative values for the exponent.
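For illustration, the three fields of an IEEE 754 double-precision number can be extracted in a few lines of Python; the example value is arbitrary:

```python
import struct


def decompose_double(value: float):
    """Split an IEEE 754 double into its sign (1 bit), biased exponent
    (11 bits) and mantissa (52 bits) fields."""
    bits = struct.unpack(">Q", struct.pack(">d", value))[0]
    sign = bits >> 63
    exponent = (bits >> 52) & 0x7FF       # 11-bit biased exponent
    mantissa = bits & ((1 << 52) - 1)     # 52-bit fraction field
    return sign, exponent, mantissa


print(decompose_double(-152.9829))  # arbitrary example coordinate value
```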
  • from step 3) onwards, the binary coordinate representations (and all other numbers which are generated or used by the algorithm in binary form) have been written in the alternative hexadecimal notation for ease of reference and to save space in FIG. 23 .
  • in step 4) the binary representations of the two coordinates are split into their respective exponent (11 bits) and mantissa (52 bits) portions.
  • the resulting exponents are then adjusted by a selected offset.
  • the size of the offset is selected depending on the desired “granularity” of the resulting fixed-point number. In the particular example shown in step 6) of FIG. 23 , the offset is 37; however, those skilled in the art will appreciate that this number can be varied to suit.
  • the next step is to “resurrect” the leading “1” and the binary point which implicitly exist in the mantissa but which are left off when the mantissa is actually written (see above). Hence, the leading “1” and the binary point are simply prepended to the mantissa of each of the coordinates. This is step 7 ) in FIG. 23 .
  • the mantissa for each coordinate is then right-shifted by the number of bits in the corresponding exponent.
  • the exponents for each coordinate are then prepended to their corresponding mantissas forming a single character string for each coordinate.
  • the resultant bit fields for each coordinate are bitwise interleaved to obtain a single hash key corresponding to the original coordinates.
  • the resultant hash key is 32 bits in length. However, the length of the resultant hash key may vary depending on, for example, whether the high-order byte is discarded, etc.
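The exact bit-level manipulations shown in FIG. 23 (the exponent offset, the re-instated leading "1" and the subsequent shifting and concatenation) cannot be fully reconstructed from the text alone. As a loose, interpretive sketch of the same overall idea only, the Python fragment below substitutes a plain fixed-point scaling of each coordinate for the patent's exponent/mantissa handling and then bitwise interleaves the two resulting integers; the scale factor, offsets and 32-bit width are arbitrary choices made for the sketch:

```python
def interleave_bits(x: int, y: int, bits: int) -> int:
    # Interleave the bits of two non-negative integers, X bit first.
    key = 0
    for i in range(bits - 1, -1, -1):
        key = (key << 2) | (((x >> i) & 1) << 1) | ((y >> i) & 1)
    return key


def coordinate_hash(lon: float, lat: float, scale: float = 1e7, bits: int = 32) -> int:
    """Interpretive sketch only: scale each coordinate to a non-negative
    fixed-point integer, then bitwise interleave the two integers into a
    single locality-preserving hash key."""
    x = int((lon + 180.0) * scale) & ((1 << bits) - 1)
    y = int((lat + 90.0) * scale) & ((1 << bits) - 1)
    return interleave_bits(x, y, bits)


key = coordinate_hash(151.2093, -33.8688)  # arbitrary example coordinates
```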

Abstract

A vehicle control system having a controller and a spatial database adapted to provide spatial data to the controller at control speed. The spatial data provided from the spatial database to the controller includes images collected from an optical sensor subsystem in addition to other data collected by a variety of sensor types, including a GNSS or inertial measurement system. The spatial data received by the controller from the database forms at least part of the control inputs that the controller operates on to control the vehicle. The advantage provided by the present invention allows the control system to “think” directly in terms of spatial location. A vehicle control system in accordance with one particular embodiment of the invention comprises a task path generator, a spatial database, at least one external spatial data receiver, a vehicle attitude compensation module, a position error generator, a controller, and actuators to control the vehicle.

Description

CROSS-REFERENCE TO RELATED APPLICATION
This application is a reissue application of application Ser. No. 13/573,682, filed Oct. 3, 2012, now U.S. Pat. No. 8,768,558, issued Jul. 1, 2014, which is a continuation-in-part of and claims the benefit of U.S. patent application Ser. No. 12/504,779, filed Jul. 17, 2009, now U.S. Pat. No. 8,311,696, and is a continuation-in-part of and claims the benefit of U.S. patent application Ser. No. 12/947,620, filed Nov. 16, 2010, now abandoned, which is continuation of and claims the benefit of U.S. patent application Ser. No. 11/620,388, filed Jan. 5, 2007, entitled “Vehicle Control System,” now U.S. Pat. No. 7,835,832, issued Nov. 16, 2010, which are all incorporated herein by reference.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a control system for controlling the direction of travel of a vehicle, and in particular to a control system having an embedded spatial database using optical tracking for vehicle control and guidance. The control system of the present invention may also be used to control other aspects of a vehicle's motion, such as speed or acceleration. Furthermore, in the case of agricultural vehicles and the like, the present control system may be used to control yet other aspects of the vehicle's operation, such as the application of agricultural chemicals at desired locations (including at desired application rates), or the engagement and/or mode of operation of agricultural implements (e.g., plows, harvesters, etc.) at desired locations, etc.
For convenience, the invention will be described mainly with reference to agricultural vehicles and moving agricultural machinery. However, it will be clearly understood that the invention is not limited to agricultural applications and it may equally be applied to vehicles and other moving machinery in other areas.
2. Description of the Related Art
Automatic control of steering (“autosteering”) of vehicles is becoming more widespread, especially in agricultural and mining applications. Most commercially available automatic steering systems include a controller that has means for determining, among other things, the position and heading of a vehicle, a computer-based system for comparing the position and heading of the vehicle with a desired position and heading, and a steering control responsive to a control signal issued by the controller when the position and/or heading of the vehicle deviates from the desired position and/or heading.
A number of control systems have previously been devised for controlling the steering of agricultural vehicles. These systems are generally used on vehicles such as tractors (including tractors with towed tools or other implements), harvesters, headers and the like which operate in large fields. These vehicles generally move along predetermined trajectories (“paths”) throughout the field. In general, a wayline is entered into the control system and subsequent paths are calculated based on the wayline. If the vehicle deviates from the path as it moves, the controller causes the vehicle to steer back towards and onto the path as described below.
As the vehicle moves along the predetermined path trajectory, it uses various means such as signals produced by GPS (global positioning system) or INS (inertial navigation system) to identify if the vehicle deviates from the desired path trajectory. If the vehicle deviates, the extent of the deviation (i.e. the difference between the actual curvature of the vehicle's trajectory and the desired curvature, its actual compass heading compared with the desired compass heading, and the distance the vehicle is displaced laterally from the desired path) is expressed in the form of an error, and this error is fed back into the control system and used to steer the vehicle back onto the desired path.
A problem with previous vehicle control systems is that they are inherently “one-dimensional” or “linear” in nature. This means that, at a fundamental level, the controller operates by “knowing” the path that the vehicle is required to traverse, and “knowing” where the vehicle is located on that path (i.e. how far along the path the vehicle has moved) at a given time. However, the controller does not “know” where the vehicle is actually located in space. This is despite the fact that the controller may often progressively receive information containing the vehicle's spatial location, for example from the GPS/INS signals. In current controllers, the GPS/INS signals are used primarily to determine when the vehicle deviates from the path (i.e. to calculate the error) rather than for the primary purpose of determining the vehicle's actual position in space. Hence, at a fundamental level, the controller only “knows” the geometry of the path and how far the vehicle has moved along the path.
Therefore, with current controllers, if it is desired to know the actual spatial position of the vehicle, this must be calculated from the known geometry of the path and the known distance the vehicle has moved along that path. This calculation can be computationally expensive and difficult to implement in practice, particularly for curved, piecewise, broken or other complex path trajectories.
By way of example, it will be appreciated that one form of common path trajectory that agricultural vehicles are often required to traverse in fields is made up of a number of (usually parallel) path segments or “swaths” (these are sometimes also referred to as “rows”). Thus, the vehicle typically moves along one swath, harvesting or plowing as it goes, and it then turns around and moves back along an adjacent parallel swath, harvesting or plowing in the opposite direction. The adjacent swath will generally be spaced from the first swath sufficiently closely that no part of the field or crop is missed between the swaths, but also sufficiently apart so that there is not an unnecessary overlap region (i.e. a region between the swaths that gets plowed or harvested on both passes). In general, the distance between the mid-lines of each respective swath is determined with reference to the width of the vehicle (i.e. the width of the plow, harvester or possibly the tool being towed by the vehicle).
In cases where paths comprising a series of parallel swaths are used, the first swath will often be used as a reference swath or “wayline”. In general, the geometry of the wayline in space will be entered into the control system along with the vehicle or implement width, and this is used to calculate the required spacing (and hence trajectory) for each of the adjacent parallel swaths. However, with most existing control systems, the controller is only able to control the steering of the vehicle as it proceeds along each of the swaths. It is much harder to control the steering of the vehicle as it turns around between one swath and the next. Therefore, whilst the spatial geometry of the respective swaths may have been calculated, from the control system's point of view at any given time it only “knows” that it is on the nth swath (numbered from the wayline) and that it has been moving along that swath for a known amount of time with known speed (i.e. it knows that the vehicle is a certain distance along the nth swath). However, at a fundamental level, the control system does not inherently know where the vehicle is consequently located in space or the spatial relationship between each swath. A graphical representation of the difference between the vehicle's actual spatial location and what the control system “sees” is given in FIG. 13 .
The “one-dimensional” or “linear” nature of existing control systems also causes other difficulties. One example is in relation to obstacle avoidance. In most agricultural applications, the positions of obstacles (e.g., fences, trees, immovable rocks, creeks, etc.) are known according to their “real-world” spatial location. The spatial location may be known according to global latitude and longitude coordinates (e.g., as provided by GPS), or alternatively the location may be known relative to a fixed point of known location (this is generally a point in or near the field used to define the origin of a coordinate system for the field). However, as current control systems only recognize where the vehicle is located along the path, not where the vehicle is actually located in space, the control system itself is therefore unable to recognize whether the location of the obstacle coincides with the trajectory of the path, and hence whether there may be a collision.
Consequently, with current control systems, it may be necessary for a number of separate modules to be provided, in addition to the primary control module, if automatic obstacle avoidance (i.e. obstacle avoidance without the need for intervention by the driver of the vehicle) is to be achieved. In these cases, one of the modules would be a collision detection module for calculating the geometry and trajectory of a section of the path a short distance ahead of the vehicle in terms of “real world” spatial coordinates and for determining whether any of the points along that section of path will coincide with the location of an obstacle. If the collision detection module identifies that the section of path is likely to pass through an obstacle (meaning that there would be a collision if the vehicle continued along that path), then a further module may be required to determine an alternative trajectory for (at least) the section of the path proximate the obstacle. Yet a further module may then be required to determine how best to steer the vehicle from the alternative trajectory back onto the original path after the vehicle has moved past the obstacle. This multi-modular control system structure is complicated and can lead to computational inefficiencies because the different modules may each perform many of the same geometric calculations for their own respective purposes, separately from one another, leading to “doubling up” and unnecessary computation. Also, with this modular control system structure, control of the vehicle generally passes from one module to another as described above, but determining when one module should take over from another creates significant difficulties in terms of both system implementation and maintenance.
Another problem associated with the “one-dimensional” nature of existing control systems is their inherent inflexibility and unadaptability. For example, in practice, if the vehicle deviates from the desired path for some reason, it may be preferable for subsequent paths (swaths) to also include a similarly shaped deviation so that the paths remain substantially parallel along their length (or tangentially parallel and consistently spaced in the case of curved sections of path). If the vehicle is, for example, a harvester or a plow, then keeping the paths parallel in this way may help to prevent portions of the field from being missed, or from being harvested/plowed multiple times (by passing over the same portion of field on multiple passes). Even with the modular control system structures described above, it is often difficult to determine the geometry of the deviated path portion in terms of “real world” coordinates, and even if this can be done, it is also difficult to adjust subsequent path geometries to correspond to the deviation from the predetermined path trajectory that was originally entered.
As a further example of the inherent inflexibility and unadaptability of current “one-dimensional” control systems, it is illustrative to consider the situation where an obstacle is located near the end of one swath such that it would be quicker and more efficient to simply move on to an adjacent swath located nearby rather than wasting time trying to go around the obstacle to finish the first swath before moving on to the adjacent swath. Current “one-dimensional” control systems are not able to recognize that it would be more efficient to move on. This is because the control system only knows where the vehicle is along its current path (e.g., close to the end of the swath), and if a modular control system is used, that module may also recognize that it is approaching the obstacle. The control system does not know where the vehicle is actually located in space, and therefore it cannot recognize that the beginning of the next swath is actually located nearby—it simply does not know where the next swath is (or indeed where the current swath is in space). Therefore, current control systems cannot easily recognize when it would be better to change paths (at least without intervention from the vehicle's driver), as this example illustrates. Nor is the current “one-dimensional” structure inherently adapted to enable the control systems to automatically (i.e. autonomously without assistance from the driver) determine and guide the vehicle along an efficient trajectory between swaths.
As used herein, “attitude” generally refers to the heading or orientation (pitch with respect to the Y axis, roll with respect to the X axis, and yaw with respect to the Z axis) of the vehicle, or of an implement associated with the vehicle. Other vehicle/implement-related parameters of interest include groundspeed or velocity and position. Position can be defined absolutely in relation to a geo-reference system, or relatively in relation to a fixed position at a known location, such as a base station. A change in one or both of the position and orientation of the vehicle (which can include a towed component, such as an implement or a trailer) can be considered a change in the vehicle's “pose.” This includes changes (e.g., different order time derivatives) in attitude and/or position. Attitude and position are generally measured relatively with respect to a particular reference frame that is fixed relative to the area that the vehicle is operating in, or globally with respect to a geo-reference system.
U.S. Pat. No. 6,876,920, which is assigned to a common assignee herewith and incorporated herein by reference, describes a vehicle guidance apparatus for guiding a vehicle over a paddock or field along a number of paths, the paths being offset from each other by a predetermined distance. The vehicle guidance apparatus includes a global navigation satellite system (GNSS) receiver for periodically receiving data regarding the vehicle's location, and an inertial relative location determining means for generating relative location data along a current path during time periods between receipt of vehicle position data from the GNSS receiver. The apparatus also includes data entry means to enable the entry by an operator of an initial path and a desired offset distance between the paths. Processing means are arranged to generate a continuous guidance signal indicative of errors in the attitude and position of the vehicle relative to one of the paths, the attitude and position being determined by combining corrected GNSS vehicle location data with the relative location data from the inertial relative location determining means.
In the system described in U.S. Pat. No. 6,876,920, the inertial sensor is used to provide a higher data rate than that obtainable from GNSS alone. Although the inertial navigation system (INS) part of the steering control system suffers from errors, in particular a yaw bias, the signals received from the GNSS system are used to correct these errors. Thus, the combination of a GNSS based system and a relatively inexpensive INS navigation system allow for quite accurate control of the position of the vehicle. Although this system allows for accurate vehicle positioning and sound control of the vehicle's steering, difficulties may be experienced if there are prolonged periods of GNSS outage. GNSS outages may occur due to unsuitable weather conditions, the vehicle operating in an area where GNSS signals cannot be accessed, or due to problems with the GNSS receiver. If a period of prolonged GNSS outage occurs, the steering system relies solely upon the INS. Unfortunately, a yaw bias in a relatively inexpensive inertial sensor used in the commercial embodiment of that steering control system can result in errors being introduced into the steering of the vehicle.
Optical computer mice are widely used to control the position of a cursor on a computer screen. Optical computer mice incorporate an optoelectronic sensor that takes successive pictures of the surface on which the mouse operates. Most optical computer mice use a light source to illuminate the surface that is being tracked (i.e. the surface over which the mouse is moving). Changes between one frame and the next are processed using the image processing ability of the chip that is embedded in the mouse. A digital correlation algorithm is used so that the movement of the mouse is translated into corresponding movement of the mouse cursor on the computer screen.
The optical movement sensors used in optical computer mice have high processing capabilities. A number of commercially available optical computer mice include optical mouse sensors that can process successive images of the surface over which the mouse is moving at speeds in excess of 1500 frames per second. The mouse has a small light emitting source that bounces light off the surface and onto a complementary metal oxide semiconductor (CMOS) sensor. The CMOS sensor sends each image to a digital signal processor (DSP) for analysis. The DSP is able to detect patterns in images and see how those patterns have moved since the previous image. Based on the change in patterns over a sequence of images, the digital signal processor determines how far the mouse has moved in X and Y directions, and sends these corresponding distances to the computer. The computer moves the cursor on the screen based upon the coordinates received from the mouse. This happens hundreds to thousands of times each second, making the cursor appear to move very smoothly.
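The digital correlation principle described above can be illustrated with a short sketch that estimates the (dx, dy) shift between two successive grayscale frames by exhaustive block matching. This is an illustration of the general principle only, not the algorithm implemented in any particular mouse sensor chip.

```python
import numpy as np


def estimate_shift(prev: np.ndarray, curr: np.ndarray, max_shift: int = 4):
    """Return the integer (dx, dy) displacement that best aligns the two
    frames, found by minimising the sum of squared differences."""
    best, best_err = (0, 0), np.inf
    h, w = prev.shape
    core = prev[max_shift:h - max_shift, max_shift:w - max_shift].astype(float)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = curr[max_shift + dy:h - max_shift + dy,
                           max_shift + dx:w - max_shift + dx].astype(float)
            err = np.sum((core - shifted) ** 2)
            if err < best_err:
                best, best_err = (dx, dy), err
    return best


prev = np.random.randint(0, 255, (32, 32))
curr = np.roll(prev, shift=(1, 2), axis=(0, 1))  # pattern moved down 1, right 2
print(estimate_shift(prev, curr))                # expected: (2, 1)
```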
The chips incorporated into optical computer mice often include photodetectors and an embedded integrated circuit that is used to analyse the digital signals received from the photodetectors. The photodetectors may include an array of photosensors, such as an array of charge-coupled devices (CCDs).
U.S. Pat. No. 5,786,804 (incorporated herein by reference), which is assigned to Hewlett-Packard Company, describes a method and system for tracking attitude of a device. The system includes fixing a two-dimensional (2D) array of photosensors to the device and using the array to form a reference frame and a sample frame of images. The fields of view of the sample and reference frames largely overlap, so that there are common image features from frame to frame. Several frames are correlated with the reference frame to detect differences in location of the common features. Based upon detection of correlations of features, an attitudinal signal indicative of pitch, yaw and/or roll is generated. The attitudinal signal is used to manipulate a screen cursor of a display system, such as a remote interactive video system.
It will be clearly appreciated that any reference herein to background material or a prior publication is not to be understood as an admission that any background material, prior publication or combination thereof forms part of the common general knowledge in the field, or is otherwise admissible prior art, whether in Australia or any other country.
SUMMARY OF THE INVENTION
In a first aspect, the present invention provides a control system for controlling movement of a vehicle characterized in that the control system includes an optical movement sensor which scans a surface over which the vehicle is moving and generates a signal indicative of relative movement along an axis of the vehicle and relative movement across an axis of the vehicle, said signal being provided to a controller.
In a second aspect, the present invention provides a control system for controlling movement of a vehicle comprising a controller having a computer memory for storing or generating a desired path of travel, the controller being adapted to receive position and/or heading signals from one or more sensors, the position and/or heading signals enabling the controller to determine a position and/or heading of the vehicle relative to a desired path of travel, the controller sending control signals to a steering control mechanism in response to the determined position and/or heading of the vehicle, wherein the position and/or heading signals from the one or more sensors include a signal generated by an optical movement sensor configured to scan a surface during travel of the vehicle, the optical movement sensor generating a signal indicative of relative movement along an axis of the vehicle and relative movement across an axis of the vehicle.
The surface that is scanned by the optical movement sensor is suitably a surface over which the vehicle is travelling. Suitably, the optical movement sensor scans a surface that is close to or under the vehicle during travel of the vehicle over the surface.
The optical movement sensor may comprise the operative part of an optical computer mouse. Therefore, in saying that the optical movement sensor “scans” the surface over which the vehicle moves, where the optical movement sensor comprises the operative part of an optical computer mouse, it will be understood that the optical movement sensor receives successive images of the surface over which the vehicle is moving. One part or other of the control system will then detect patterns in the images, and uses the change in the patterns between successive images to obtain information regarding the movement of the vehicle.
The optical movement sensor may comprise an illumination source and an illumination detector. The optical movement sensor may comprise an optical movement sensor integrated circuit.
As noted above, the optical movement sensor may comprise the operative part from an optical computer mouse. Alternatively, the optical movement sensor may be adapted from or derived from the operative part of an optical computer mouse. The optical movement sensor may use a light source to illuminate the surface that is being tracked (i.e. the surface over which the vehicle is moving).
Changes between one frame and the next may be processed by an image processing part of a chip embedded in the optical movement sensor and this may translate the movement across the surface of the optical movement sensor (which will generally be mounted to the vehicle) into movement along two axes. Alternatively, the image processing may be performed by processing means separate from the optical movement sensor. For example, the signals received by the optical movement sensor may be conveyed to a separate microprocessor with graphics processing capabilities for processing.
The optical movement sensor may include an optical movement sensing circuit that tracks movement in a fashion similar to the optical movement sensing circuits used to track movement in computer mice. The person skilled in the art will readily appreciate how such optical movement sensing circuits analyze data and provide signals indicative of movement of the sensor across the surface. For this reason, further discussion as to the actual algorithms used in the optical movement sensing circuits need not be provided. Suitably, the optical movement sensing circuit may comprise an optical movement sensing integrated circuit. Such optical movement sensing integrated circuits are readily available from a number of suppliers.
In some embodiments, the control system of the present invention may further comprise one or more inertial sensors for providing further signals regarding the vehicle's attitude and position (or changes thereto) to the controller. Accelerometers and rate gyroscopes are examples of inertial sensors that may be used. The inertial sensors may form part of or comprise an inertial navigation system (INS), a dynamic measurement unit (DMU), an inertial sensor assembly (ISA), or an attitude heading reference system (AHRS). These are well-known to persons skilled in the art and need not be described further. The inertial sensors may be used in conjunction with other navigation sensors, such as magnetometers, or vehicle based sensors such as steering angle sensors, or wheel speed encoders.
Inertial sensors, such as rate gyroscopes and accelerometers, can suffer from time varying errors that can propagate through to create errors in the vehicle's calculated attitude and/or position. These errors can be sufficiently acute that, to avoid providing the controller with significantly inaccurate measures of the vehicle's attitude and/or position, it is preferable (and often necessary) for the control system to also receive signals regarding the vehicle's attitude and/or position (or changes thereto) from a source that is independent of the inertial sensors. These separate signals can be used to compensate for the errors in the inertial sensor signals using known signal processing techniques.
It is common to use GNSS signals (which provide information regarding the vehicle's location) to compensate for the errors in the inertial sensor signals. However, the present invention opens up the possibility of providing a control system that includes the optical movement sensor and one or an assembly of inertial sensors (and possibly including one or more other vehicle sensors as well). In other words, in some embodiments of the present invention, the signals provided by the optical movement sensor may be used to compensate for the errors in the inertial sensor signals instead of or in addition to the GNSS signals.
In embodiments such as those described in the previous paragraph, a single optical movement sensor may generally be sufficient to compensate for the errors in inertial sensors, such as accelerometers which measure rates of change in linear displacement. However, a single optical movement sensor may not be sufficient to compensate for errors in inertial sensors, such as gyroscopes which measure rates of change in angular displacement because the optical movement sensor will often be fixedly mounted to the vehicle such that the orientation of the optical movement sensor is fixed to, and changes with, the orientation of the vehicle.
The single optical movement sensor of the kind used in optical computer mice is able to detect and measure movement of the optical movement sensor along the X (roll) and Y (pitch) axes (in the present context this means the X (roll) and Y (pitch) axes of the vehicle because the optical movement sensor is fixed to the vehicle). However, this kind of optical movement sensor is not generally able to detect and measure rotation about the Z (yaw) axis. Consequently, if it is desired to compensate for the XYZ errors in inertial sensors such as gyroscopes using optical movement sensors that are fixedly mounted to the vehicle, two or more optical movement sensors will generally need to be provided and mounted at different locations on the vehicle.
Alternatively, a single optical movement sensor can be used to compensate for the errors in gyroscopes and the like which measure rates of change in rotational displacement if the optical movement sensor is not fixed with respect to the vehicle. Rather, the optical movement sensor could be mounted so that when the vehicle turned (i.e. rotated about its Z (yaw) axis), the orientation of the optical movement sensor would remain unchanged. In effect, the optical movement sensor would translate but not rotate with respect to the surface over which the vehicle is moving. A single optical movement sensor might thus be used to compensate for the errors in both accelerometers and gyroscopes, but some system or mechanism (e.g., gimbal-mounting) would need to be provided to maintain the constant orientation of the optical movement sensor.
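As a sketch of why two displaced sensors resolve rotation about the yaw axis: if two optical movement sensors are rigidly mounted a known distance apart along the vehicle's longitudinal axis, a small rotation produces opposite cross-axis displacements at the two sensors, from which the yaw increment can be recovered. The mounting geometry, sensor names and small-angle treatment below are assumptions made for this sketch.

```python
import math


def yaw_increment(dy_front: float, dy_rear: float, baseline: float) -> float:
    """Estimate the change in yaw (radians) from the cross-axis displacements
    measured by two sensors mounted 'baseline' metres apart along the
    vehicle's longitudinal axis (small-angle approximation)."""
    return math.atan2(dy_front - dy_rear, baseline)


# Example: over one sampling interval the front sensor drifts 5 mm to the left
# and the rear sensor 5 mm to the right, with the sensors 2 m apart.
print(math.degrees(yaw_increment(0.005, -0.005, 2.0)))  # approximately 0.29 degrees
```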
The embodiments of the invention described above where the control system incorporates one or more inertial sensors, one or more optical movement sensors, and where the optical movement sensor(s) are used (instead of GNSS signals) to compensate for the errors in the inertial sensor(s) can generally be described as relative measurement control systems. This is because the optical movement sensor(s) and the inertial sensor(s) can only measure changes in vehicle attitude and/or position. They are unable to fix the geographic position and attitude of the vehicle in absolute “global” coordinates. References in this document to relative movement of the vehicle, or of an implement associated with the vehicle, or relative attitude/position/heading/pose information should be understood in this context.
However, the relative coordinate system established by relative measurement control systems such as those described above can relate to absolute geographic space if the vehicle can be moved sequentially to at least two, and preferably three or more, locations whose absolute geographic locations are known. This leads to the possibility of calibrating a control system having only optical, inertial, and possibly other vehicle sensors, in the total absence of GNSS. For example, during power up (initialization), the inertial navigation system positions of the vehicle could be arbitrarily set on a map whose origin and orientation is known. To relate this map to absolute geographic space, the vehicle could be located at the first known location, the internal coordinates noted, then moved to a second location and the new internal coordinates likewise noted. The line between the two points could be fitted from the internal map onto the real world map to arrive at the XY offset between the two map origins, the orientation difference between the two map origins, and the linear scaling difference between the two maps.
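The calibration just described (fitting the line between two internally recorded points onto the corresponding real-world points) amounts to solving for a translation, rotation and scale from two point correspondences. A minimal sketch follows; the function and variable names are placeholders introduced here, not terminology from the patent.

```python
import math


def fit_two_point_transform(p1, p2, q1, q2):
    """Given two points (p1, p2) in the internal/relative map and their known
    real-world counterparts (q1, q2), return the scale, rotation (radians)
    and translation mapping internal coordinates onto the real-world map."""
    vp = (p2[0] - p1[0], p2[1] - p1[1])
    vq = (q2[0] - q1[0], q2[1] - q1[1])
    scale = math.hypot(*vq) / math.hypot(*vp)
    rotation = math.atan2(vq[1], vq[0]) - math.atan2(vp[1], vp[0])
    c, s = math.cos(rotation), math.sin(rotation)
    # Choose the translation so that p1 maps exactly onto q1.
    tx = q1[0] - scale * (c * p1[0] - s * p1[1])
    ty = q1[1] - scale * (s * p1[0] + c * p1[1])
    return scale, rotation, (tx, ty)


def apply_transform(pt, scale, rotation, t):
    c, s = math.cos(rotation), math.sin(rotation)
    return (scale * (c * pt[0] - s * pt[1]) + t[0],
            scale * (s * pt[0] + c * pt[1]) + t[1])
```

A third known location, if available, provides a useful consistency check on the fitted transform, which is in keeping with the preference noted above for three or more known locations.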
Thus, in one embodiment, the present invention may comprise a control system including one or more optical movement sensors and one or more inertial sensors. Suitably, the control system may include one or more optical movement sensors and an assembly of inertial sensors. In one embodiment, the control system of the present invention may further comprise an assembly of sensors including accelerometers and rate gyroscopes for providing further position and/or attitude signals to the controller. The assembly may comprise between one and three sensor sets orthogonally mounted, with each sensor set comprising not necessarily one of each, but no more than one of each of the above-mentioned sensors. Such inertial sensors are well known to persons skilled in the art and need not be described further.
In another embodiment, the present invention may comprise a control system including one or more optical movement sensors and one or more other sensors. The other sensors may comprise navigation sensors such as magnetometers, or vehicle sensors such as wheel speed encoders, and steering angle encoders. Control systems in accordance with this embodiment of the invention would also be described as relative measurement control systems, and the relative coordinate system established by such a system can relate to absolute geographic space in generally the same way as described above.
In yet another embodiment, the control system of the present invention, which incorporates one or more optical movement sensors, may be integrated with a GNSS system. In this system, the GNSS system provides absolute measurement in geographic space and the optical movement sensor provides relative movement data that can be used to control the vehicle during periods of outage of GNSS signals or during periods of normal operation when no GNSS signals are being received. Thus, in a further embodiment, the present invention provides a control system including one or more optical movement sensors and a GNSS system.
In a further still embodiment, the control system of the present invention may incorporate one or more optical movement sensors, a GNSS system and one or more inertial sensors, suitably an assembly of inertial sensors. In this embodiment, the optical movement sensor is configured to look at the ground near or under the vehicle. The output signal generated by the optical movement sensor comprises the relative movement along the axis of the vehicle and the relative movement across the axis of the vehicle. This information can be used as an additional source for compensating for the errors in the inertial sensors, giving a combined GNSS/INS/optical movement sensor system with the capability of operating over sustained periods of GNSS outage. Thus, in another embodiment, the present invention may provide a control system including one or more optical movement sensors, a GNSS system and one or more inertial sensors, such as an assembly of inertial sensors.
GPS (global positioning system) is the name of the satellite-based navigation system originally developed by the United States Department of Defence. GNSS (including GPS and other satellite-based navigation systems) is now used in a wide range of applications. A number of systems also exist for increasing the accuracy of the location readings obtained using GNSS receivers. Some of these systems operate by taking supplementary readings from additional satellites and using these supplementary readings to “correct” the original GNSS location readings. These systems are commonly referred to as “Satellite Based Augmentation Systems” (SBAS) and some examples of SBASs are: the United States' “Wide Area Augmentation System” (WAAS); the European Space Agency's “European Geostationary Navigation Overlay Service” (EGNOS); and the Japanese “Multi-Functional Transportation Satellite” (MFTS).
A number of “Ground Based Augmentation Systems” (GBASs) also exist which help to increase the accuracy of GNSS location readings by taking additional readings from beacons located at known locations on the ground. It will be understood that, throughout this specification, all references to GNSS include GNSS when augmented by supplementary systems such as SBASs, GBASs and the like.
In embodiments of the present invention where the optical movement sensor is used in combination with one or more other sensors, the datastream from the optical movement sensor may be combined with a datastream from another sensor. This may be done using known signal processing techniques to obtain a stream of statistically optimal estimates of the vehicle's current position and/or attitude. Suitably, the signal processing techniques may utilize a statistically optimized filter or estimator. The optimal filter or estimator could usefully, but not necessarily, comprise a Kalman filter.
The optical sensor used in the control system in accordance with the present invention may comprise an optical movement sensing integrated circuit that receives raw data from a lens assembly mounted on a vehicle or on an implement towed by a vehicle. The lens assembly may be configured such that an image of the ground immediately below the lens assembly is formed on a photosensor plane of the optical movement sensing integrated chip by the lens assembly. Usefully, the lens may be a telecentric lens. Furthermore, the lens may be an object space telecentric lens. An object space telecentric lens is one that achieves dimensional and geometric invariance of images within a range of different distances from the lens and across the whole field of view. Telecentric lenses will be known to those skilled in the art and therefore need not be described any further.
The lens assembly may be chosen so that the extent of the image on the optical movement sensing integrated chip represents a physical extent in the object plane which is commensurate with both the anticipated maximum speed of the vehicle and the processing rate of the optical movement sensing integrated circuit. For example, if the maximum speed of the vehicle is 5 m per second and the desired overlap of successive images is 99%, an image representing 0.5 m in extent will require a processing speed of 1000 frames per second.
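The frame-rate figure quoted above follows directly from the image extent, the required overlap and the maximum ground speed, as this small check shows:

```python
max_speed = 5.0       # metres per second
image_extent = 0.5    # metres of ground covered by each image
overlap = 0.99        # required overlap between successive images

new_ground_per_frame = image_extent * (1.0 - overlap)  # 0.005 m per frame
frames_per_second = max_speed / new_ground_per_frame   # 1000 frames per second
print(frames_per_second)
```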
The optical movement sensor may include an illumination source of sufficient power such that the image of the ground beneath the vehicle is rendered with optimum contrast. This can be usefully, but not necessarily implemented as an array of high intensity light emitting diodes chosen to emit light at the wavelength of optimal intensity of the optical movement sensor.
Desirably, the optical movement sensor may be provided with a mechanism to keep the entrance pupil of the optical assembly free of dust. This could be usefully implemented by means of a high velocity air curtain passing the entrance pupil. Other mechanisms may be used, such as those that spray a cleaning fluid over the pupil. The cleaning fluid in those embodiments may comprise a cleaning liquid, such as water. Other means or mechanisms suitable for keeping the lens, or at least the entrance pupil of the optical assembly, free of dust will be known to those skilled in the art and may also be used with the present invention.
In another embodiment, the present invention provides a control system for controlling a position of an implement associated with a vehicle, characterised in that the control system includes an optical movement sensor which scans a surface over which the implement is moving and generates a signal indicative of relative movement along an axis of the implement and relative movement across an axis of the implement, said signal being provided to a controller.
In another aspect, the present invention provides a control system for maintaining a position and/or heading (attitude) of an implement close to a desired path of travel, the control system comprising a controller having a computer memory for storing or generating the desired path of travel, the controller being adapted to receive position and/or heading signals relating to a position and/or heading of the implement from one or more sensors, the position and/or heading signals enabling the controller to determine the position and/or heading of the implement relative to the desired path of travel, the controller sending control signals to a position and/or heading control mechanism in response to the determined position and/or heading, wherein the position and/or heading signals from the one or more sensors include a signal generated by an optical movement sensor configured to scan a surface over which the implement is travelling, the optical movement sensor generating a signal indicative of relative movement along an axis of the vehicle and relative movement across an axis of the vehicle.
Suitably, in this aspect, the optical movement sensor is mounted to the implement. The optical movement sensor may scan the surface close to the implement or underneath the implement as the implement traverses the surface.
In this aspect, the control algorithms and the position control mechanisms may be as described in U.S. Pat. No. 7,460,942, which is assigned to a common assignee herewith and incorporated herein by reference. In embodiments of this aspect of the invention, the position of the implement may be controlled by controlling the steering of the vehicle associated with the implement (this is especially useful if the implement is rigidly and fixedly connected to the vehicle), or by moving the position of the implement (or at least a working part of the implement) relative to the vehicle, which may be achieved by adjusting the lateral offset between the working part of the implement and the vehicle, or by using the working part of the implement to “steer” the implement.
In this aspect, the control system may further include one or more of a GNSS system, inertial sensors, navigation sensors and vehicle based sensors. These various systems and sensors are described above with reference to other aspects of the invention.
It is a further objective of the present invention to provide a vehicle control system having an embedded spatial database that may at least partially ameliorate one or more of the above-mentioned difficulties, or which may provide a useful or commercial alternative to existing control systems.
Accordingly, in a first broad form the present invention resides in a vehicle control system having a controller and a spatial database adapted to provide spatial data to the controller at control speed.
In another broad form, the invention resides in a control system for controlling a vehicle within a region to be traversed, the control system comprising: a spatial database containing spatial data; a controller adapted to receive spatial data from the spatial database at control speed; the spatial database being adapted to receive updated spatial data from the controller and/or an external source; and the controller using the spatial data for controlling the vehicle.
In a further broad form, a control system is provided for steering a vehicle within a region to be traversed, the control system comprising: a spatial database containing spatial data; a controller adapted to receive spatial data from the spatial database at control speed; the controller being adapted to control the steering of the vehicle, the spatial database being adapted to receive updated spatial data from the controller and/or an external source; and the updated spatial data relating to the vehicle and/or an implement associated with and proximate the vehicle and/or at least a portion of the region proximate the vehicle.
In agricultural applications, the region to be traversed by the vehicle will generally be the field that is to be plowed, harvested, etc., and the invention will be described generally with reference to agricultural vehicles operating in fields. However, no limitation is meant in this regard, and the region to be traversed by the vehicle may take a range of other forms in different applications. For example, in automotive applications the region to be traversed by the vehicle might comprise roadways located in a particular geographical area. Alternatively, in mining applications the region could comprise the vehicle navigable regions of the mine. In underground mining, this could include the various levels of the mine located vertically above and below one another at different relative levels (depths). Furthermore, the control system of the present invention could be applied to vehicles that operate on airport tarmacs, in which case the region to be traversed by the vehicle might be the tarmac, or a portion thereof. From these examples, the person skilled in the art will appreciate the breadth of other applications that are possible.
The control system of the present invention includes a spatial database that contains spatial data. The spatial database may also be adapted to receive spatial data including updated spatial data, and to provide spatial data to other components of the control system. In general, data may be characterized as “spatial” if it has some relationship or association with “real world” geographical location, or if it is stored somehow with reference to geographical location. Some illustrative examples of the kinds of spatial data that may be stored within the database include (but are not limited to) coordinate points describing the location of an object (e.g., a rock or tree) in terms of the object's “real world” geographical location in a field, the coordinate points for a geographical location itself, information regarding a “state” of the vehicle (e.g., its speed, “pose” (position and orientation) or even fuel level) at a particular geographical location, a time when the vehicle was at a particular geographical location, or a command to the vehicle to change its trajectory or mode of equipment (e.g., plow) operation if or when it reaches a certain geographical location. These examples illustrate that any data or information that has an association with geographical location, or which is stored with reference to geographical location, can constitute “spatial data”. For the remainder of this specification, the terms “spatial data” and “spatial information” will be used interchangeably. References simply to “data” or “information” will generally also carry a similar meaning, and references simply to the “database” will be to the spatial database, unless the context requires otherwise. Typically, the spatial database is an electronic database stored in a memory device, such as, for example, a RAM, as discussed in more detail below.
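By way of illustration only, and not as a description of any particular embodiment, the following minimal Python sketch shows one way such location-keyed records could be represented in software. The class name, field names and example attributes are assumptions made purely for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class SpatialRecord:
    """One item of spatial data keyed by its geographical location."""
    easting_m: float   # x coordinate in a local or projected frame (metres)
    northing_m: float  # y coordinate in a local or projected frame (metres)
    attributes: dict = field(default_factory=dict)  # e.g. {"type": "rock"} or {"speed_mps": 4.2}

# Example records: an obstacle in the field, and a sample of vehicle "state" at a location
rock = SpatialRecord(1203.4, 876.1, {"type": "rock"})
state_sample = SpatialRecord(1200.0, 880.0, {"speed_mps": 4.2, "heading_deg": 91.5, "fuel": 0.7})
```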
Spatial data may be stored within the database according to any convenient coordinate system, including (but not limited to) Cartesian (or projected) coordinates, polar coordinates, cylindrical coordinates, spherical coordinates, latitude/longitude/altitude, etc. The coordinate system may also be "global" in the sense of the location references provided by GPS, or "local", such as coordinates defined with respect to a local origin and reference orientation. The coordinates may or may not take into account the curvature caused by the Earth's overall spherical shape. Hence, there is no limitation as to the coordinate system that may be used with the present invention, although it is envisaged that Cartesian (x,y or x,y,z) coordinates or latitude/longitude/altitude will be used most frequently because of the way these inherently lend themselves to describing geographical location, and because of the ease with which these coordinate systems can be implemented digitally. Particularly representative embodiments may utilize the WGS84 (World Geodetic System 1984) datum, which is consistent with the current GPS.
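For illustration, the sketch below shows one assumed way of converting latitude/longitude into local Cartesian (x,y) coordinates using a simple flat-Earth approximation adequate over a single field; it is not taken from the described embodiments, and a full WGS84/UTM projection would be used where the Earth's curvature matters.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius; the WGS84 ellipsoid would be more precise

def latlon_to_local_xy(lat_deg, lon_deg, origin_lat_deg, origin_lon_deg):
    """Approximate small-area conversion from latitude/longitude to local x (east) and
    y (north) in metres, relative to a chosen local origin."""
    dlat = math.radians(lat_deg - origin_lat_deg)
    dlon = math.radians(lon_deg - origin_lon_deg)
    x = EARTH_RADIUS_M * dlon * math.cos(math.radians(origin_lat_deg))  # east
    y = EARTH_RADIUS_M * dlat                                           # north
    return x, y
```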
Those skilled in the art will know that GPS (global positioning system) is the name of the satellite based navigation system originally developed by the United States Department of Defense. GPS is now used in a wide range of applications. A number of systems also exist for increasing the accuracy of the location readings obtained using GPS receivers. Some of these systems operate by taking supplementary readings from additional satellites and using these supplementary readings to "correct" the original GPS location readings. These systems are commonly referred to as "Satellite Based Augmentation Systems" (SBAS) and some examples of SBASs are: the United States' "Wide Area Augmentation System" (WAAS); the European Space Agency's "European Geostationary Navigation Overlay Service" (EGNOS); and Japan's "Multi-Functional Transportation Satellite" (MFTS).
A number of “Ground Based Augmentation Systems” (GBASs) also exist which help to increase the accuracy of GPS location readings by taking additional readings from beacons located at known locations on the ground. It will be understood that, throughout this specification, all references to GPS include GPS when augmented by supplementary systems such as SBASs, GBASs and the like.
It is explained above that the controller (which controls the vehicle) receives spatial data from the spatial database. In this way, the data received by the controller from the database forms at least part of the control inputs that the controller operates on to control the vehicle (i.e. the spatial data forms at least part of the inputs that drive the controller). The fact that the controller operates directly on information that is inherently associated with “real world” geographic location represents a change in thinking compared with existing vehicle control systems. In particular, it means that the control system of the present invention “thinks” directly in terms of spatial location. Put another way, in the control system of the present invention, control parameters are defined in geographic space rather than the space of an abstract vector. Consequently, the controller of the present invention may be considered to be inherently “multi-dimensional” or “spatial” in nature, as opposed to “one-dimensional” or “linear” like the existing control systems described in the background section above.
It is envisaged that at least some (and probably most) of the components of the control system, including the controller, will typically be implemented using commercially available equipment and a generally conventional control architecture. For instance, the controller may be implemented using equipment that provides memory and a central processing unit to run the one or more algorithms required to control the vehicle. Likewise, the controller (and hence the control algorithm(s)) used in the present invention may take any form suitable for controlling the steering of a vehicle. Typically, closed loop or feedback type control will be used at least in relation to some signal streams (i.e. in relation to at least some of the vehicle variables being controlled by the controller). However, open loop control may also be used, as may feed-forward control structures wherein the spatial data received by the controller from the spatial database is fed forward to form part of the control outputs used to control the vehicle. Where feedback type control is used, the control structure may incorporate combinations of proportional, integral and differential control, or a series of such (possibly nested) control loops. However, no particular limitation is meant in this regard and the person skilled in the art will appreciate that any form of suitable control and/or controller may be used.
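As a hedged illustration of the feedback structures mentioned above, and not a description of the controller actually used in any embodiment, the following Python sketch shows a discrete proportional-integral-derivative term and a hypothetical nesting of two such loops; all gains, rates and names are assumptions.

```python
class PID:
    """Discrete PID term of the general proportional/integral/differential form noted above."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical nesting: an outer loop on cross-track error producing a heading demand,
# and an inner loop on heading error producing a curvature (steering) demand.
outer = PID(kp=0.5, ki=0.0, kd=0.1, dt=0.1)
inner = PID(kp=1.2, ki=0.05, kd=0.0, dt=0.1)

# One illustrative cycle with dummy error values:
heading_demand = outer.update(0.35)    # 0.35 m cross-track error
curvature_demand = inner.update(0.05)  # 0.05 rad heading error
```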
The control system may also incorporate conventional signal processing and transmitting equipment, for example, for suitably filtering incoming spatial data signals, and for transmitting control signals from the controller to the vehicle's steering system to steer the vehicle. The person skilled in the art will appreciate that any suitable electric, mechanical, pneumatic or hydraulic actuators, or combinations thereof, may be used with the present invention. The actuators may be linked with the vehicle's steering and drive systems to control the steering, acceleration, deceleration, etc. of the vehicle in response to control signals produced by the controller. Associated equipment such as amplifiers and power sources may also be provided as required to amplify the control signals, and to power the actuators. A wide range of power sources may be used including batteries, generators, pumps, etc., depending on the nature of the actuator(s) and the signals to be amplified.
Whilst the present control system may operate using a conventional form of controller and using at least some commercially available equipment, the spatial database used to store the spatial data and to provide the spatial data to the controller may be different to other forms of databases used in other areas. In other areas (including non-control related applications such as those where data storage is the principal objective), databases often contain vast amounts of information (in this case "information" is not used in its "spatial" sense) and the information is generally stored in complex hierarchical structures. Conceptually, these databases may be considered to be "multi-levelled" in that an initial query may return only relatively superficial level information, but this may in turn allow the user to interrogate the database more deeply to obtain more specific, linked or related information. This complex structure means that these kinds of databases can take considerable time (many seconds, minutes or even longer) to generate the appropriate output in response to a query. Those skilled in the art will appreciate that databases such as these, which take a relatively long time to return information in response to a query, may not be suitable for use in control systems such as the present which require low latencies between variable inputs and control outputs to thereby enable real-time control to be provided.
The spatial database used in the present invention will suitably be adapted to provide the data to the controller at control speed. In this sense, “control speed” means that the database is able to provide the information at a rate of the same order as the speed at which the controller repeats successive cycles of the control algorithm (i.e. at a rate of the same order as the “clock speed” of the controller). Ideally, the database will be adapted to provide the data to the controller, and perhaps also receive data from the controller and/or external sources, at every successive cycle of the control algorithm (i.e. at the controller's clock speed). However, in some embodiments it may be sufficient for the database to be adapted to provide (and perhaps receive) data at less than, but close to, the controller's clock speed (for example, at every second or third successive cycle of the control algorithm), provided that the rate is fast enough to provide the controller with sufficiently up-to-date spatial information to achieve adequate vehicle control performance. In cases where the controller operates at different clock speeds for different data signal streams, the database may be adapted to provide data at a rate of the same order as one of those controller clock speeds. In any event, the database should provide data to the controller at a rate commensurate with the control loop bandwidth.
In practice, it is envisaged that the database may be adapted to provide data to the controller at a rate of between 1 Hz and 100 Hz. Given the speeds that vehicles such as agricultural vehicles typically move at (generally less than 60 km/hr or 37.3 miles/hr), rates between 1 Hz and 20 Hz will almost always be sufficient, and even rates between 3 Hz and 12 Hz may be sufficient for vehicles moving at significantly less than 60 km/hr. Nevertheless, those skilled in the art will recognize that the necessary or achievable rates may vary depending on the level of control precision and performance required in different applications, the speed at which the vehicle in question moves, and the capabilities of the available equipment used to implement the control system.
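The following short calculation, given for illustration only, shows why rates of this order are sufficient: it computes the ground distance covered between successive database/controller updates for an assumed speed and update rate.

```python
def metres_per_cycle(speed_kmh, update_rate_hz):
    """Ground distance covered between successive control/database updates."""
    return (speed_kmh / 3.6) / update_rate_hz

print(metres_per_cycle(60.0, 10.0))  # ~1.67 m per cycle at 60 km/h and 10 Hz
print(metres_per_cycle(20.0, 5.0))   # ~1.11 m per cycle at 20 km/h and 5 Hz
```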
Those skilled in the art will appreciate that because the spatial database used in the present invention can provide spatial data to the controller at control speed, and therefore forms part of the system's overall configuration, the spatial database may be considered to be “embedded” within the control system, rather than external to it. This is particularly so in embodiments where feedback type control is used, and the spatial database forms part of the system's overall closed loop structure (i.e. in embodiments where the spatial database forms part of the loop).
In order for the database to be able to provide (and, if desired, also receive) data at the required rates, the form of the database should allow the required rapid database access and response times. Ideally, the database and all of the data that it contains will be loaded into the control system's memory (i.e. loaded into RAM). This way, the data will be directly accessible by the controller's CPU (central processing unit), rather than requiring a query to be sent to a remote disk or storage device containing the data, the response to which would then need to be loaded into RAM before being accessible by the CPU. However, it is possible that the database could be located on a separate disk or other storage device, particularly if the device is capable of retrieving data in response to a query with sufficient speed such as, for example, a disk device with RAM read/write cache.
It is envisaged that the amount of memory required to store the spatial data relating to a particular field to be traversed by the vehicle may be in the order of megabytes. By way of example (given for illustrative purposes only), consider a straight wayline that is 1 km long and which has 500 parallel swaths of corresponding length. If the database is designed to incorporate information pertaining to locations every 2 m along each of the 500 swaths, this corresponds to 501×500=250,500 locations. When the data is structured within the database in the manner described further below, this may correspond to approximately 4 MB of memory required to store the coordinates of each point. However, it is also envisaged that as the nature and complexity of the data required to be stored in the database increases, the required amount of memory may increase to hundreds of megabytes or gigabytes. Devices which provide this amount of memory are (or are at least becoming) commercially available.
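The illustrative figure above can be reproduced with the following short calculation, which assumes two 8-byte double-precision coordinates per stored location; the exact per-record size depends on how the database lays the data out, as described below.

```python
# A 1 km wayline with 500 parallel swaths and a point every 2 m gives
# 501 x 500 = 250,500 locations; at 16 bytes per point this is roughly 4 MB.
points_per_swath = 501
swaths = 500
bytes_per_point = 2 * 8  # assumed: x and y stored as 64-bit doubles

total_points = points_per_swath * swaths          # 250,500
total_mb = total_points * bytes_per_point / 1e6   # ~4.0 MB
print(total_points, round(total_mb, 2))
```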
The speed of the database may be assisted by the way in which the data is arranged (i.e. stored) within the database. A wide range of methods and algorithms are known for arranging data (i.e. for assigning appropriate “indices” and the corresponding memory allocations to individual items of data) within databases, and the particular method chosen depends on the nature of the data, and the way and speed with which the database is to respond to a query. For the complex hierarchical “multi-leveled” databases described above, the data should be arranged so as to enable the database to collate and deliver all relevant information relating to a complex query. However, as explained above, the requirement for those databases to be able to process complex queries leads to potentially long lag times which may be undesirable in the context of vehicle control applications. Therefore, the spatial database used in the present invention can store data in a “single-level” or “flat” structure according to the geographical location that particular items of data relate to.
Some algorithms which could be used to arrange the spatial data within the database include the algorithms commonly referred to by the names “Grid-indexing”, “Quadtree” or “R-tree”. However, in other embodiments of the invention data may be arranged within the database using a form of algorithm that will be referred to as a “spatial hash-key” algorithm. A spatial hash-key algorithm maps physical locations (based on their “real world” coordinates) into one-dimensional “hash-keys”. The “hash-key” for each location is a string of characters that can be stored in the database's hash table and retrieved in response to a query.
Properties of the spatial hash-key algorithm may include: points which are close to each other in the real world should have closely related hash keys (i.e. the algorithm should maintain "locality"); the algorithm should operate using whatever coordinate system the control system uses to represent the region; the algorithm should be adapted for digital implementation (hence, it should be adapted to operate using integer or floating-point numbers, preferably with 64-bit "double" precision or better); and the algorithm should be fast to compute.
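As a hedged example only, the sketch below shows one possible spatial hash-key scheme with these properties: a Z-order (Morton) code formed by interleaving the bits of grid-cell indices. It is not necessarily the preferred hash algorithm described later with reference to FIG. 23, and the cell size and key width used here are assumptions.

```python
def spatial_hash_key(x_m, y_m, cell_size_m=2.0, bits=32):
    """Map a 2-D location to a one-dimensional key by interleaving the bits of its
    grid-cell indices (a Z-order / Morton code). Nearby points share high-order key
    bits, which preserves the "locality" property described above."""
    ix = int(x_m // cell_size_m) & ((1 << bits) - 1)
    iy = int(y_m // cell_size_m) & ((1 << bits) - 1)
    key = 0
    for bit in range(bits):
        key |= ((ix >> bit) & 1) << (2 * bit)      # even key bits from x
        key |= ((iy >> bit) & 1) << (2 * bit + 1)  # odd key bits from y
    return key

# Nearby locations produce numerically close keys suitable for use in a hash table:
table = {spatial_hash_key(100.0, 50.0): {"type": "rock"}}
print(spatial_hash_key(100.0, 50.0), spatial_hash_key(102.0, 50.0))
```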
It is explained above that the control system of the present invention, and ideally the spatial database, may be adapted to receive updated data from the controller and/or an external source. The spatial database can be adapted to receive the updated information at control speed. Data received from the controller may include or may be used to generate, for example, estimates of the vehicle's predicted state (i.e. its speed, position, orientation, etc.) at an upcoming location based on its current instantaneous state at a particular location. The external sources may include GPS, INS, or any other inertial, visual or other system used for obtaining information relating to the state of the vehicle or other aspects of the region (such as obstacles close to the vehicle). Data received in this way may be (at least initially) recorded in its unprocessed or “raw” form in the database. This unprocessed data may be fed directly back into the controller, or the respective streams of incoming data (possibly relating to disparate variables) may be filtered using a Kalman filter or some other similar digital signal processing technique to obtain a statistically optimized estimate of the state of the vehicle and its proximate surroundings as it travels. This optimized estimate of the vehicle's state at a particular location may then be fed into the controller. The use of statistically optimized estimates and data may help to improve control performance.
According to a further broad form, the invention resides in a closed loop vehicle control system comprising: a spatial database; a controller adapted to receive spatial data from the spatial database at control speed, the controller controlling the steering of the vehicle; wherein updated spatial data is fed back into the control system.
In yet another broad form, the invention resides in a method for controlling a vehicle comprising: entering spatial data relating to a region to be traversed by the vehicle into a spatial database; providing spatial data from the spatial database to a controller at control speed to control the vehicle as the vehicle traverses the region; and entering updated spatial data into the spatial database as the vehicle traverses the region.
In yet a further broad form, the invention resides in a vehicle control system comprising: a spatial database; a controller adapted to receive spatial data from the spatial database; and the controller using the spatial data from the spatial database to control the steering of the vehicle.
It will be appreciated that all preferred features and aspects of the invention described with particular reference to one or other broad form of the invention, may also apply equally to all other forms of the invention, unless the context dictates otherwise.
BRIEF DESCRIPTION OF THE DRAWINGS
Certain embodiments, aspects and features of the invention will now be described and explained by way of example and with reference to the drawings. However, it will be clearly appreciated that these descriptions and examples are provided to assist in understanding the invention only, and the invention is not limited to or by any of the embodiments, aspects or features described or exemplified.
FIG. 1A shows a vehicle comprising a tractor and an implement fitted with an optical tracking control system in accordance with one embodiment of the present invention, and further shows XYZ axial attitude orientations.
FIG. 1B shows the vehicle with a block diagram of the control system.
FIG. 1C shows a vehicle fitted with an optical tracking control system in accordance with another embodiment of the present invention including a pair of optical tracking sensors.
FIG. 2 shows a real-time kinematic (RTK) optical tracking vehicle control system in accordance with another embodiment of the present invention.
FIG. 3 shows a vehicle fitted with an optical tracking control system in accordance with another embodiment of the present invention including an inertial navigation system (INS) with inertial sensors.
FIG. 4 shows an RTK optical tracking vehicle control system in accordance with another embodiment of the present invention including an INS.
FIG. 5 shows a flow sheet illustrating the interaction of an optical movement sensor with a controller in accordance with an embodiment of the present invention.
FIG. 6 shows a flow sheet illustrating the interaction of an optical movement sensor and a GNSS sensor with a controller in accordance with an embodiment of the present invention.
FIG. 7 shows a flow sheet illustrating the interaction of an optical movement sensor and inertial sensors with a controller in accordance with an embodiment of the present invention.
FIG. 8 shows a flow sheet illustrating the interaction of an optical movement sensor, inertial sensors and a GNSS sensor with a controller in accordance with an embodiment of the present invention.
FIG. 9 shows a schematic view of an embodiment of the present invention in which the position of an implement is optically tracked.
FIG. 10 shows a schematic diagram of one possible arrangement for an optical movement sensor that could be used in the present invention.
FIG. 11 shows an end view of the lens and LED (Light-Emitting Diode) illuminator ring used in the arrangement of FIG. 10.
FIG. 12 is a schematic illustration of the operation of a discrete-time Kalman filter, which may be used in an optimal estimator of the present invention.
FIG. 13 schematically represents the difference between the vehicle's actual spatial location and what is “seen” by existing forms of “one-dimensional” controllers such as those described in the background section above.
FIG. 14 is a pictorial representation of an agricultural vehicle having a control system in accordance with one particular embodiment of the present invention.
FIG. 15 illustrates the physical meaning of certain parameters controlled by some versions of the present control system, namely the “cross-track error”, the “heading error” and the “curvature error”.
FIG. 16 is a schematic “block-diagram” representation of an overall control system structure that may be used in representative embodiments of the present invention.
FIG. 17 is a schematic representation of the “controller” block that may be used in representative embodiments such as that shown in FIG. 16.
FIG. 18 is a further schematic “block-diagram” representation of an overall control system structure that may be used with alternative representative embodiments of the invention which incorporate additional features not shown in FIG. 16.
FIG. 19 is a block diagram representation of the state space representation used in the digital implementation of certain aspects of the control system.
FIG. 20 shows an example trajectory of an agricultural vehicle, and the coordinates corresponding to different points along the trajectory using a simplified integer based coordinate system.
FIG. 21 shows a similar example trajectory of an agricultural vehicle to that shown in FIG. 20, except that the coordinate system is similar in format to the WGS84 coordinates used by the current GPS.
FIG. 22 illustrates the way in which numbers are represented in the IEEE 754 standard double-precision floating-point format.
FIG. 23 is a “flow-diagram” illustrating the way a particularly preferred spatial hash algorithm may be used to generate hash keys for the coordinates in FIG. 21.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
1. Introduction and Environment
As required, detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention, which may be embodied in various forms. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present invention in virtually any appropriately detailed structure.
Certain terminology will be used in the following description for convenience in reference only and will not be limiting. For example, up, down, front, back, right and left refer to the invention as oriented in the view being referred to. The words "inwardly" and "outwardly" refer to directions toward and away from, respectively, the geometric center of the embodiment being described and designated parts thereof. Global navigation satellite systems (GNSS) are broadly defined to include GPS (U.S.), Galileo (Europe, proposed), GLONASS (Globalnaya Navigazionnaya Sputnikovaya Sistema) (Russia), Beidou (China), Compass (China, proposed), IRNSS (Indian Regional Navigation Satellite System) (India, proposed), QZSS (Quasi-Zenith Satellite System) (Japan, proposed) and other current and future positioning technology using signals from satellites, using single or multiple antennae, with or without augmentation from terrestrial sources. Inertial navigation systems (INS) include gyroscopic (gyro) sensors, accelerometers and similar technologies for providing output corresponding to the inertia of moving components in all axes, i.e. through six degrees of freedom (positive and negative directions along longitudinal X, transverse Y and vertical Z axes). Yaw, pitch and roll refer to moving component rotation about the Z, Y and X axes respectively. Said terminology will include the words specifically mentioned, derivatives thereof and words of similar meaning.
2. Optical Vehicle Control System 2
FIGS. 1A and 1B show a tractor 10 fitted with an optical movement sensor 16 in accordance with an embodiment of the present invention. The tractor 10 is towing an agricultural implement 12, such as a plow, sprayer or cultivator.
The tractor 10 is fitted with a steering control system. The steering control system includes a GNSS receiver 13 connected to antennas 20, a controller 14 and a steering valve block 15. The controller 14 suitably includes a computer memory that is capable of having an initial path of travel entered therein. The computer memory is also adapted to store or generate a desired path of travel. The controller 14 receives position and attitude signals from one or more sensors (to be described later) and the data received from the sensors are used by the controller 14 to determine or calculate the position and attitude of the tractor. The controller 14 then compares the position and attitude of the tractor with the desired position and attitude of the tractor. If the determined or calculated position and attitude of the tractor deviates from the desired position and attitude of the tractor, the controller 14 issues a steering correction signal that interacts with a steering control mechanism. In response to the steering correction signal, the steering control mechanism makes adjustments to the angle of steering of the tractor, to thereby assist in moving the tractor back towards the desired path of travel. The steering control mechanism may comprise one or more mechanical or electrical controllers or devices that can automatically adjust the steering angle of the vehicle. These devices may act upon the steering pump, the steering column or steering linkages.
In one embodiment of the present invention, the steering control algorithm may be similar to that described in our U.S. Pat. No. 6,876,920, which is incorporated herein by reference and discloses a steering control algorithm, which involves entering an initial path of travel (often referred to as a wayline). The computer in the controller 14 then determines or calculates the desired path of travel, for example, by determining the offset of the implement being towed by the tractor and generating a series of parallel paths spaced apart from each other by the offset of the implement. This ensures that an optimal working of the field is obtained. The vehicle then commences moving along the desired path of travel. One or more sensors provide position and attitude signals to the controller and the controller uses those position and attitude signals to determine or calculate the position and attitude of the vehicle. This position and attitude is then compared with the desired position and attitude of the vehicle. If the vehicle is spaced away from the desired path of travel, or is pointing away from the desired path, the controller generates a steering correction signal. The steering correction signal may be generated, for example, by using the difference between the determined position and attitude of the vehicle and the desired position and attitude of the vehicle to generate an error signal, with the magnitude of the error signal being dependent upon the difference between the determined position and attitude and the desired position and attitude of the vehicle. The error signal may take the form of a curvature demand signal that acts to steer the vehicle back onto the desired path of travel. Steering angle sensors in the steering control mechanism may monitor the angle of the steering wheels of the tractor and send the data back to the controller to thereby allow the controller to correct for understeering or oversteering.
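Purely as an illustrative sketch, and not the algorithm of U.S. Pat. No. 6,876,920 itself, the following Python fragment shows the kind of comparison described above: a cross-track error and a heading error relative to a straight wayline are blended into a curvature demand signal that acts to steer the vehicle back onto the desired path. The gains and sign conventions are assumptions.

```python
import math

def curvature_demand(pos_xy, heading_rad, wayline_origin_xy, wayline_heading_rad,
                     k_xte=0.05, k_heading=0.8):
    """Toy error-signal calculation: returns a curvature demand (1/m) from the
    cross-track and heading errors relative to a straight wayline. Positive
    curvature is assumed to steer the vehicle to the left."""
    dx = pos_xy[0] - wayline_origin_xy[0]
    dy = pos_xy[1] - wayline_origin_xy[1]
    # Signed lateral offset from the wayline (positive to the left of the direction of travel)
    cross_track = -dx * math.sin(wayline_heading_rad) + dy * math.cos(wayline_heading_rad)
    # Heading error wrapped to (-pi, pi]
    heading_error = (heading_rad - wayline_heading_rad + math.pi) % (2 * math.pi) - math.pi
    return -(k_xte * cross_track + k_heading * heading_error)
```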
In an alternative embodiment, the error signal may result in generation of a steering guidance arrow on a visual display unit to thereby enable the driver of the vehicle to properly steer the vehicle back onto the desired path of travel. This manual control indicator may also be provided in conjunction with the steering controls as described in the paragraph above.
It will be appreciated that the invention is by no means limited to the particular algorithm described, and that a wide variety of other steering control algorithms may also be used.
In general terms, most, if not all, steering control algorithms operate by comparing a determined or calculated position and attitude of the vehicle with a desired position and attitude of the vehicle. The desired position and attitude of the vehicle is typically determined from the path of travel that is entered into, or stored in, or generated by, the controller. The determined or calculated position and attitude of the vehicle is, in most, if not all, cases determined by having input data from one or more sensors being used to determine or calculate the position and attitude of the vehicle. In U.S. Pat. No. 6,876,920, GNSS sensors, accelerometers, wheel angle sensors and gyroscopes are used as the sensors in preferred embodiments of that patent.
Returning now to FIGS. 1A and 1B, the tractor 10 is fitted with a controller 14. The controller 14 includes a graphic user interface (GUI) 17 mounted in the cab of the tractor 10 for inputting data to the controller 14 and a display screen. The GUI 17 can comprise any means for entering data into the controller 14, for example a touchscreen, keyboard or keypad for manually entering data, or a cable/wireless connection for transferring data to the controller 14. The GUI 17 also includes a display screen, and can include various other output devices such as LEDs, audio, printers, hardwired and wireless output connections, etc. The controller 14 also includes a computer memory for receiving and storing data, a CPU for processing data and a control signal generator for generating control signals to the steering control mechanism. The controller 14 may also include random access memory (RAM), read only memory (ROM), and an optical disc drive such as a DVD drive or a CD drive for receiving optical disks and reading information therefrom. The controller 14 may be pre-programmed with software that allows for calculation of the desired path of travel. Alternatively, software may be loaded onto the controller from a recorded media carrier, such as a DVD disc, a CD disc, a floppy disk or the like. Appropriate software may be downloaded from a network.
The actual details of the controller will be readily understood by persons skilled in the art and need not be described further.
The tractor 10 shown in FIGS. 1A and 1B is also fitted with an optical movement sensor 16. The optical movement sensor 16 is fitted to an arm 18 extending forwardly from the front of the tractor 10. This is so that the optical movement sensor 16 is ahead of the wheels to minimize the effect of dust kicked up by the wheels. However, it will be appreciated that the optical movement sensor 16 may be positioned at a side of the tractor or at a rear part of the tractor, or even underneath the tractor. The basic requirement for the optical movement sensor positioning and mounting is that the optical movement sensor can emit radiation, typically light, onto the ground and receive reflected radiation or light from the ground. Provided that this basic requirement is met, the optical movement sensor may be mounted anywhere on the tractor. Indeed, the optical movement sensor may even be mounted on the implement 12 (FIG. 9).
In the embodiment shown in FIGS. 1A and 1B, the optical movement sensor 16 is “gimballed”, meaning that its orientation with respect to the tractor may change. This “gimballed” embodiment is described further below.
The optical tracking movement sensor 16 may comprise the operative part of an optical computer mouse. Optical computer mice incorporate an optoelectronics sensor that takes successive pictures of the surface on which the mouse operates. Most optical computer mice use a light source to illuminate the surface that is being tracked. Changes between one frame and the next are processed by an image processing part of a chip embedded in the mouse and this translates the movement of the mouse into movement on two axes using a digital correlation algorithm. The optical movement sensor 16 may include an illumination source for emitting light therefrom. The illumination source may comprise one or more LEDs. The optical movement sensor may also include an illumination detector for detecting light reflected from the ground or the surface over which the vehicle is travelling. Appropriate optical components, such as a lens (preferably a telecentric lens), may be utilized to properly focus the emitted or detected light. A cleaning system, such as a stream of air or other cleaning fluid, may be used to keep the optical path clean. The optical movement sensor 16 may comprise a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor. The optical movement sensor 16 may also include an integrated chip that can rapidly determine the relative movement along an axis of the vehicle and the relative movement across an axis of the vehicle by analysing successive frames captured by the illumination detector. The optical movement sensor can complete hundreds to thousands of calculations per second.
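The digital correlation step can be illustrated with the simplified sketch below, which estimates the integer pixel shift between two successive frames by brute-force correlation; actual optical mouse sensors perform an equivalent computation in dedicated hardware, so this fragment is an assumption-laden illustration only.

```python
import numpy as np

def estimate_shift(prev_frame, next_frame, max_shift=4):
    """Estimate the (dx, dy) pixel shift between two small greyscale frames
    (floating-point NumPy arrays) by searching for the shift that maximizes
    their correlation. The shift corresponds to surface displacement along
    and across the sensor axes, in pixels."""
    best_score, best_shift = -np.inf, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(next_frame, -dy, axis=0), -dx, axis=1)
            score = np.sum(prev_frame * shifted)  # simple correlation score
            if score > best_score:
                best_score, best_shift = score, (dx, dy)
    return best_shift
```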
The optical movement sensor 16 generates signals that are indicative of the relative movement of the vehicle along the vehicle's axis and the relative movement of the vehicle across the vehicle's axis. The signals are sent to the controller 14. The signals received by the controller 14 are used to progressively calculate or determine changes in the position and attitude of the vehicle. In the embodiment shown in FIGS. 1A and 1B, the controller 14 may include a clock that can be used to determine a time of travel of the vehicle and use that time of travel and possibly other input variables (such as the speed of the vehicle), together with the signals generated by the optical movement sensor, to calculate or determine the position and attitude of the vehicle. This may then be compared to the desired position and attitude of the vehicle arising from the desired path of travel stored in or generated by the controller 14. If there are any discrepancies between the calculated or determined position and attitude of the vehicle and the desired position and attitude of the vehicle, a steering correction signal is sent from the controller 14 to the steering control mechanism (valve block 15). Examples of such automatic steering control mechanisms are disclosed in U.S. Pat. No. 7,142,956; No. 7,277,792; No. 7,400,956; and No. 7,437,230, all of which are assigned to a common assignee with the present application and are incorporated herein by reference.
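A minimal sketch of the kind of progressive position calculation described above is given below; it simply rotates the along-track and across-track increments reported by the optical movement sensor into the field frame using the current heading estimate (a simple dead-reckoning step). The function and variable names are illustrative assumptions.

```python
import math

def update_pose(x, y, heading_rad, d_along_m, d_across_m):
    """Accumulate one optical-movement-sensor reading into a position estimate.
    d_along_m is movement along the vehicle axis, d_across_m across it
    (positive assumed to the left); heading_rad rotates these increments
    from the sensor frame into the field frame."""
    x += d_along_m * math.cos(heading_rad) - d_across_m * math.sin(heading_rad)
    y += d_along_m * math.sin(heading_rad) + d_across_m * math.cos(heading_rad)
    return x, y
```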
Only one optical movement sensor 16 is illustrated in FIGS. 1A and 1B. However, as described above, if the optical movement sensor 16 is the kind used in optical computer mice, and if the optical movement sensor 16 is fixed with respect to the vehicle, the optical movement sensor will generally only measure movement along and across the principal axis of the vehicle (i.e. along the longitudinal roll X axis and along the transverse pitch Y axis). Fixed optical movement sensors of this kind generally do not measure rotation about the yaw Z axis. A single optical movement sensor 16 could be used to measure change in the vehicle's orientation with respect to its yaw Z axis (in addition to measuring changes in the movement of the vehicle along the roll X and pitch Y axes), if the optical movement sensor 16 is mounted in a gimbal mount 19, which can be controlled with input from a compass (GNSS, magnetic, etc.) and/or a gyroscope. In this context, “gimballed” means that the optical movement sensor 16 is mounted to the vehicle in a dynamically pivotable manner so that the orientation of the optical movement sensor (at least about its yaw Z axis) remains the same even if the orientation of the vehicle about its yaw Z axis changes. In other words, so that the optical movement sensor orients itself in a similar way to a compass needle (which stays in one orientation even if the compass is rotated). It will be appreciated from the explanations given above that if a single optical movement sensor is mounted to the vehicle in “gimballed” manner, the optical movement sensor will effectively translate, but not rotate, as the vehicle moves and turns. In order to achieve “gimballed” mounting, a gimbal device or mechanism 19 is provided to dynamically adjust the orientation of the optical movement sensor with respect to the vehicle so that the optical movement sensor's orientation remains the same as the vehicle moves and turns. Such gimbal mounting devices and mechanisms are commercially available and will be known to those skilled in the art. They therefore require no further explanation. The gimballed mounting device or mechanism could also monitor the change in the optical movement sensor's orientation relative to the orientation of the vehicle, and this information could be used to calculate or determine changes in the vehicle's orientation about its yaw Z axis.
The alternative embodiment shown in FIG. 1C accommodates calculating and determining attitude changes in the vehicle's orientation about its yaw Z axis, which attitude changes indicate the vehicle's heading or direction of travel. Two optical movement sensors 16 are mounted on the front of the vehicle 10 and each measures movement of the vehicle along the longitudinal roll X axis and along the transverse pitch Y axis of the vehicle. However, where the longitudinal and transverse movements detected by the two optical movement sensors 16 differ, this difference will generally be associated with changes in the vehicle's orientation about its yaw Z axis. Therefore, this difference may be used to calculate or determine changes in the vehicle's orientation about its yaw Z axis. As an example, the vehicle might be provided with an optical movement sensor 16 adjacent to and inboard of each of the front wheels (FIG. 1C). Therefore, there would be one optical movement sensor on either side of the vehicle at the front of the vehicle. If the vehicle were to turn a corner (which would make it rotate about its yaw Z axis), the optical movement sensor 16 on the outside of the turning circle would measure a greater distance travelled than the optical movement sensor on the inside. This difference could then be used, along with the known positioning of each optical movement sensor with respect to the other, to calculate or determine the change in the vehicle's orientation about its yaw Z axis.
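For illustration only, the following sketch computes the change in yaw implied by the difference between the two sensors' along-track distances and their known lateral separation; the baseline value and sign convention are assumptions.

```python
def yaw_change_from_two_sensors(d_left_m, d_right_m, baseline_m):
    """Estimate the change in yaw (radians) from the along-track distances measured
    by two optical movement sensors separated laterally by baseline_m, as in the
    arrangement of FIG. 1C. Positive result is taken as a counter-clockwise (left)
    turn, where the right-hand sensor is on the outside and travels further."""
    return (d_right_m - d_left_m) / baseline_m

# Example: the outer (right) sensor travels 0.52 m while the inner (left) travels
# 0.48 m across an assumed 2 m baseline, implying a 0.02 rad (~1.15 degree) yaw change.
print(yaw_change_from_two_sensors(0.48, 0.52, 2.0))
```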
3. Alternative Embodiment Optical Control System 102
FIG. 2 shows a schematic diagram of an alternative embodiment of the present invention. A number of the features of the embodiment shown in FIG. 2 are similar to those shown in FIG. 1. For convenience and brevity of description, similar features in FIG. 2 are denoted by the same reference numeral as those used to denote similar features in FIG. 1, but increased by 100. For example, tractor 110 in FIG. 2 corresponds to tractor 10 in FIG. 1. It can be seen that the embodiment shown in FIG. 2 also includes a controller 114 and an optical movement sensor 116.
The embodiment shown in FIG. 2 further includes a differential GNSS system. The differential GNSS system includes a GNSS receiver(s) 113 connected to satellite antennas 120. The satellite antennas 120 are mounted on the roof of the tractor 110 and optionally on the implement 112. Such multiple antennas 120 enable vector calculations of the tractor attitude. The satellite antennas 120 receive satellite signals from the array of GNSS satellites orbiting the earth, shown schematically at 122, 124, 126 and 127. The differential GNSS system also includes a base station 128. The base station 128 includes a GNSS antenna 130 connected to a GNSS receiver 132. The antenna 130 receives signals from the orbiting GNSS satellites 122, 124, 126 and 127. The GNSS receiver 132, on the basis of the signals coming from antenna 130, calculates and provides positional data to a computer 134. The computer compares the positional data from the GNSS receiver 132 with a predetermined and accurately known position for antenna 130. On the basis of this comparison, computer 134 is able to calculate an error factor, which is continuously updated and passed to a transmitter 136, such as a radio modem. The transmitter 136 generates a serial data signal which is upconverted and propagated by the base antenna 130. The transmitted error signal is received by an antenna 140 mounted on tractor 110.
The GNSS receiver on the tractor 110 receives GNSS signals from the constellation of GNSS satellites via GNSS antenna 120 mounted on the tractor 110. The signals are sent to controller 114. The signals received from GNSS receiver(s) 113 on tractor 110 are corrected by the error correction signal sent from the transmitter 136. Thus, an accurate determination of position of the tractor can be obtained from the differential GNSS system.
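In greatly simplified, position-domain terms the correction described above can be sketched as follows; this is an illustrative assumption, since practical DGNSS and RTK systems correct the satellite range observations rather than final positions, but the principle is the same.

```python
def apply_position_correction(rover_pos, base_measured_pos, base_known_pos):
    """Subtract the base station's measured-minus-known position error from the
    rover's measured position (simplified position-domain differential correction)."""
    error = tuple(m - k for m, k in zip(base_measured_pos, base_known_pos))
    return tuple(r - e for r, e in zip(rover_pos, error))

# Example with (easting, northing) pairs in metres:
print(apply_position_correction((500123.4, 6100456.7),   # rover as measured
                                (500000.9, 6100001.2),   # base as measured
                                (500000.0, 6100000.0)))  # base surveyed position
```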
The controller 114 also receives position signals from the optical movement sensor 116. As described above with reference to the embodiment in FIG. 1, if the optical movement sensor 116 in FIG. 2 is the kind used in optical computer mice, and if it is fixed to the vehicle, two or more such fixed optical movement sensors would need to be provided if the optical movement sensor is to be used to measure changes in the vehicle's orientation about its yaw axis. Alternatively, a single optical movement sensor might be used, provided the single optical movement sensor is mounted in a gimballed manner and the device or mechanism 119 used for the gimballed mounting can monitor the changes in the orientation of the optical movement sensor relative to the orientation of the vehicle. Further explanation of the embodiment in FIG. 2 will be provided below.
4. Alternative Embodiment Optical Control System 202
FIG. 3 shows a schematic diagram of an alternative embodiment of the present invention. A number of the features of the embodiment shown in FIG. 3 are similar to those shown in FIG. 1. For convenience and brevity of description, similar features in FIG. 3 are denoted by the same reference numeral as used to denote those features in FIG. 1, but increased by 200. For example, tractor 210 in FIG. 3 corresponds to tractor 10 in FIG. 1. It can be seen that the embodiment shown in FIG. 3 also includes a controller 214 and an optical movement sensor 216.
The embodiment shown in FIG. 3 also includes one or more inertial sensors 221 mounted to the tractor. The one or more inertial sensors may comprise one or more accelerometers and/or gyroscopes. Instead of the inertial sensors, one or more vehicle-based sensors may be used. These may include magnetometers, wheel angle sensors and/or wheel speed encoders. A combination of inertial sensors and vehicle-based sensors may also be used. An assembly of sensors, such as an Inertial Navigation System (INS), a Dynamic Measurement Unit (DMU), an Inertial Sensor Assembly (ISA), a Vertical Gyro (VG) or an Attitude Heading Reference System (AHRS), may be used in feature 221. The inertial sensors may comprise one or more, or an assembly of, sensors including accelerometers and rate gyroscopes for providing further position and attitude signals to the controller. Preferably (although not necessarily), the assembly may comprise between one and three orthogonally mounted sensor sets, with each sensor set comprising not necessarily one of each, but no more than one of each, of the above mentioned sensors.
The sensor assembly 221 provides relative position and attitude information to the controller 214. Similarly, the optical movement sensor 216 also provides relative position and attitude information to controller 214. The controller uses both sets of information to obtain a more accurate determination of the position and attitude of the vehicle. This will be described in greater detail hereunder. Also, as described above with reference to the embodiments in FIGS. 1 and 2, if the optical movement sensor 216 in FIG. 3 is the kind used in optical computer mice, and if it is fixed to the vehicle, two or more such fixed optical movement sensors would need to be provided if the optical movement sensor is to be used to measure changes in the vehicle's orientation about its yaw Z axis. Alternatively, a single optical movement sensor might be used, provided the single optical movement sensor is mounted in a gimballed manner and the device or mechanism 219 used for the gimballed mounting can monitor the changes in the orientation of the optical movement sensor relative to the orientation of the vehicle.
5. Alternative Embodiment Optical Control System 302
FIG. 4 shows a schematic view of a further embodiment of the present invention. A number of the features of the embodiment shown in FIG. 4 are similar to those shown in FIG. 2. For convenience and brevity of description, similar features in FIG. 4 are denoted by the same reference numeral as used to denote those features in FIG. 2, but with the leading “1” of the reference numerals used in FIG. 2 replaced with a leading “3” in FIG. 4 (i.e., plus 200). For example, tractor 310 in FIG. 4 corresponds to tractor 110 in FIG. 2.
The embodiment shown in FIG. 4 includes an optical movement sensor 316, a GNSS-based system 399 (shown in FIG. 8) and an inertial sensor and/or vehicle based sensor 321. These sensors interact with the controller in a manner that will be described hereunder.
FIG. 5 shows a schematic flow sheet of the interaction between the optical movement sensor and the controller. The flow sheet is based upon the embodiments shown in FIGS. 1A-C. Only one optical movement sensor is shown in FIG. 5. However, as explained above, there could alternatively be two or more optical movement sensors all feeding into the control system in the same way as the one shown in FIG. 5. The way that two or more optical movement sensors can be used to measure changes in the vehicle's orientation about its yaw axis has already been explained. However, providing two or more optical movement sensors may also provide the additional benefit of increasing the accuracy of the position and attitude information calculated by the optimal estimator (compared with systems that use only a single optical movement sensor) due to the greater amount of information available upon which the estimates can be based.
In FIG. 5, the controller 14 of FIG. 1 is shown by dotted outline 14A. The controller of FIG. 5 includes an optimal estimator 60 and an error calculation module 62. The optimal estimator 60 and error calculation module 62 may form part of the computer memory and/or CPU of the controller 14. The particular programs required to run the optimal estimator and error calculation module may be written into the computer memory or they may be downloaded from a network or they may be loaded onto the computer memory via an optical drive, such as a CD drive or a DVD drive, or they may be loaded from any other form of recorded media. Alternatively, the optimal estimator and the error calculation module may be provided in firmware associated with the controller 14.
The optical movement sensor(s) 16 of FIG. 1 feeds position and attitude data into optimal estimator 60. Optimal estimator 60 acts to process the information from the optical movement sensor(s) 16 to provide a statistically optimal estimate of the position and attitude information received from the optical movement sensor(s). The optimal estimator may include algorithms that receive the position and attitude information from optical movement sensor(s) 16 and convert that position and attitude information into a calculated or determined position and attitude of the tractor. This produces a statistically optimal estimate of the calculated or determined position and attitude of the tractor.
FIGS. 5-8 schematically represent the operation of the control system in accordance with different embodiments of the invention. However, it is also useful to consider the way in which the vehicle's parameters and dynamics are represented for the purposes of implementing the control system. Those skilled in the art will recognize that a range of methods may be used for this purpose. However, it is considered that one method is to represent the parameters and dynamics in “state space” form.
In state space representations, the variables or parameters used to mathematically model the motion of the vehicle, or aspects of its operation, are referred to as "states" x_i. In the present case, the states may include the vehicle's position (x,y), velocity

$$\left( \frac{dx}{dt}, \frac{dy}{dt} \right),$$

heading h, radius of curvature r, etc. Hence the states may include x_1 = x, x_2 = y, x_3 = h, x_4 = r,

$$x_5 = \frac{dx}{dt} = \frac{dx_1}{dt}, \qquad x_6 = \frac{dy}{dt} = \frac{dx_2}{dt},$$
etc. However, it will be appreciated that the choice of states is never unique, and the meaning and implications of this will be well understood by those skilled in the art.
The values for the individual states at a given time are represented as the individual entries in an n×1 “state vector”:
$$\mathbf{X}(t) = \begin{bmatrix} x_1(t) & x_2(t) & x_3(t) & x_4(t) & \cdots & x_n(t) \end{bmatrix}^T$$
where n is the number of states.
In general, the mathematical model used to model the vehicle's motion and aspects of its operation will comprise a series of differential equations. The number of equations will be the same as the number of states. In some cases, the differential equations will be linear in terms of the states, whereas in other situations the equations may be nonlinear in which case they must generally be “linearized” about a point in the “state space”. Linearization techniques that may be used to do this will be well known to those skilled in this area.
Next, by noting that any jth order linear differential equation can be re-written equivalently as a set of j first order linear differential equations, the linear (or linearized) equations that represent the model can be expressed using the following "state" equation:
$$\frac{d}{dt}\,\mathbf{X}(t) = A\,\mathbf{X}(t) + B\,\mathbf{U}(t) + E\mathbf{w}(t)$$
where:
    • A is an n×n matrix linking the state time derivatives to the states themselves,
    • U(t) is an m×1 matrix containing the external “forcing” inputs in the mathematical model,
    • B is an n×m matrix linking the state derivatives to the inputs,
    • m is the number of inputs,
    • Ew(t) is a quantity (represented by an n×1 vector) called the “process noise”. The process noise represents errors in the model and vehicle dynamics which exist in the actual vehicle but which are not accounted for in the model. As Ew(t) represents an unknown quantity, its contents are not known. However, for reasons that will be understood by those skilled in this area, in order to allow statistically optimized signal processing and state estimation Ew(t) is generally assumed to be Gaussian, white, have zero mean and to act directly on the state derivatives. It is also assumed that the process noise element associated with each individual state is uncorrelated with the process noise element of the other states.
The quantities that are desired to be known about the vehicle (the real values for which are generally also measured from the vehicle itself, if possible) are the outputs y from the model. Each of the outputs generated by the linear (or linearized) model comprises a linear combination of the states x and inputs u, and so the outputs can be defined by the "output" or "measurement" equation:
$$\mathbf{Y}(t) = C\,\mathbf{X}(t) + D\,\mathbf{U}(t) + M\mathbf{v}(t)$$
where
    • C is a j×n matrix linking the outputs to the states,
    • D is a j×m matrix linking the outputs to the inputs,
    • j is the number of outputs, and
    • Mv(t) is a quantity (represented by a j×1 vector) called the "measurement noise". The measurement noise represents errors and noise that invariably exist in measurements taken from the actual vehicle. Like Ew(t) above, Mv(t) is assumed to be Gaussian, white, have zero mean, to act directly on the measurements and to be uncorrelated with the process noise or itself.
Next, it will be noted that both the state equation and the measurement equation defined above are continuous functions of time. However, continuous time functions do not often lend themselves to easy digital implementation (such as will generally be required in implementing the present invention) because digital control systems generally operate as recursively repeating algorithms. Therefore, for the purpose of implementing the equations digitally, the continuous time equations may be converted into the following recursive discrete time equations by making the substitutions set out below and noting that (according to the principle of superposition) the overall response of a linear system is the sum of the free (unforced) response of that system and the responses of that system due to forcing/driving inputs. The recursive discrete time equations are:
X_k+1 = FX_k + GU_k+1 + Lw_k+1
Y_k+1 = ZX_k+1 + JU_k+1 + Nv_k+1
where
    • k+1 is the time step occurring immediately after time step k,
    • Z=C, J=D and Nv is the discrete time analog of the continuous time measurement noise Mv(t).
    • F is a transition matrix which governs the free response of the system. F is given by:
      F = e^(AΔt)
    • GU_k+1 is the forced response of the system, i.e. the system's response due to the driving inputs. It is defined by the convolution integral as follows (a brief numerical sketch of computing F and GU_k+1 follows this list):
GU_k+1 = ∫_0^Δt e^(A(Δt - τ)) BU(t_k+1 + τ) dτ
    • where τ is the integration variable of the convolution integral.
    • Similarly, the quantity Lw_k+1 is the (forced) response of the system due to the random "error" inputs that make up the process noise. Hence, conceptually this quantity may be defined as:
Lw_k+1 = ∫_0^Δt e^(A(Δt - τ)) Ew(t_k+1 + τ) dτ
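As a concrete illustration of the discretisation above, the sketch below computes F = e^(AΔt) and the matrix G under a zero-order-hold assumption (the input held constant over the time step). It is a minimal, non-authoritative example: the two-state model matrices A and B, the time step dt, and the use of scipy.linalg.expm with trapezoidal quadrature are all illustrative assumptions, not values or methods taken from this document.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical continuous-time model: two states (e.g. cross-track error and its
# rate) and one steering input. A and B are illustrative only.
A = np.array([[0.0, 1.0],
              [0.0, -0.5]])
B = np.array([[0.0],
              [1.0]])
dt = 0.1  # assumed controller time step in seconds

# Free-response transition matrix: F = e^(A*dt)
F = expm(A * dt)

# G = integral over [0, dt] of e^(A*(dt - tau)) B dtau, approximated with a
# simple trapezoidal rule; GU_k+1 is then G @ U_k+1 for a piecewise-constant input.
taus = np.linspace(0.0, dt, 101)
integrand = np.stack([expm(A * (dt - tau)) @ B for tau in taus])
G = np.trapz(integrand, taus, axis=0)

print("F =\n", F)
print("G =\n", G)
```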
However, as noted above, the quantity Ew(t) is not deterministic and so the integral defining Lw_k+1 cannot be performed (even numerically). It is for this reason that it is preferable to use statistical filtering techniques. The optimal estimator shown in FIGS. 5-8 will use such statistical techniques. One particularly favorable technique involves the use of a Kalman filter to statistically optimize the states estimated by the mathematical model.
In general, a Kalman filter operates as a “predictor-corrector” algorithm. Hence, the algorithm operates by first using the mathematical model to “predict” the value of each of the states at time step k+1 based on the known inputs at time step k+1 and the known value of the states from the previous time step k. It then “corrects” the predicted value using actual measurements taken from the vehicle at time step k+1 and the optimized statistical properties of the model. In summary, the Kalman filter comprises the following equations each of which is computed in the following order for each time step:
Predictor:
    X_k+1|k = F X_k|k + G U_k+1
    P_k+1|k = F P_k|k F^T + Q
    K_k+1 = P_k+1|k Z^T (Z P_k+1|k Z^T + R)^-1
    Y_k+1 = Z X_k+1|k + J U_k+1
Corrector:
    υ_k+1 = Ŷ_k+1 - Y_k+1
    X_k+1|k+1 = X_k+1|k + K_k+1 υ_k+1
    P_k+1|k+1 = (I - K_k+1 Z) P_k+1|k
where
    • the notation k+1|k means the value of the quantity in question at time step k+1 given information from time step k. Similarly, k+1|k+1 means the value of the quantity at time step k+1 given updated information from time step k+1.
    • P is the co-variance in the difference between the estimated and actual value of X.
    • Q is the co-variance in the process noise.
    • K is the Kalman gain which is a matrix of computed coefficients used to optimally “correct” the initial state estimate.
    • R is the co-variance in the measurement noise.
    • Ŷ is a vector containing measurement values taken from the actual vehicle.
    • υ is a quantity called the “innovation” which is the difference between the measured values actually taken from the vehicle and values for the corresponding quantities estimated by the model.
    • T is the transpose operator.
    • I is the identity matrix.
The operation of the discrete time Kalman filter which may be used in the optimal estimator of the present invention is schematically illustrated in FIG. 12.
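For illustration only, the predictor-corrector cycle set out above can be written as the following minimal sketch. The matrix names follow the document (F, G, Z, J, Q, R, P, K); the function itself, its signature, and the use of a plain matrix inverse are assumptions made for clarity rather than a description of how the optimal estimator 60 is actually implemented.

```python
import numpy as np

def kalman_step(x, P, u, y_meas, F, G, Z, J, Q, R):
    """One predictor-corrector cycle of the discrete-time Kalman filter,
    following the equation order given above. All arguments are numpy arrays."""
    # --- Predictor ---
    x_pred = F @ x + G @ u                                    # X_{k+1|k}
    P_pred = F @ P @ F.T + Q                                  # P_{k+1|k}
    K = P_pred @ Z.T @ np.linalg.inv(Z @ P_pred @ Z.T + R)    # Kalman gain
    y_pred = Z @ x_pred + J @ u                               # model-predicted outputs
    # --- Corrector ---
    innovation = y_meas - y_pred                              # measured minus predicted outputs
    x_new = x_pred + K @ innovation                           # X_{k+1|k+1}
    P_new = (np.eye(P.shape[0]) - K @ Z) @ P_pred             # P_{k+1|k+1}
    return x_new, P_new
```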
Returning now to FIG. 5, the statistically optimal estimate of the vehicle's position and attitude provided by the optimal estimator 60 is supplied to the error calculation module 62. The error calculation module 62 receives information on the required control path 64 (or the desired path of travel). The required control path or the desired path of travel may be entered into the computer memory of the controller or it may be calculated from an initial wayline and further operating parameters, such as the width of the implement being towed by the tractor.
The error calculation module 62 uses the statistically optimal estimate of the position and attitude of the tractor obtained from the optimal estimator 60 and the desired position and attitude of the tractor determined from the required control path to calculate the error in position and attitude of the tractor. This may be calculated as an error in the x-coordinate, an error in the y-coordinate and an error in the heading of the position and attitude of the tractor. These error values are represented as “Ex”, “Ey” and “Eh” in FIG. 5. These error values are used in a correction calculation module 66 to determine a correction value. The correction value may result in a curvature demand 68, which represents a steering control signal that is sent to a steering control mechanism. The correction value is calculated as a function of the error in the coordinate values.
FIG. 6 shows a schematic flow sheet of the interaction of the optical movement sensor(s) and GNSS sensor with the controller. This flow sheet represents one possible implementation for use with the embodiment shown in FIG. 2. The error calculation module 62, desired control path 64, correction calculation module 66 and curvature demand 68 shown in FIG. 6 are essentially identical to those shown in FIG. 5 and will not be described further. However, as can be seen from FIG. 6, the controller, which is represented by dashed outline 114A, receives positional data from the optical movement sensor(s) 116 and the GNSS system 199. The GNSS system 199 shown in FIG. 6 may correspond to the differential GNSS system described with reference to FIG. 2. The optimal estimator 160 receives positional data from the optical movement sensor 116 and from the GNSS system 199. The optimal estimator 160 analyses the positional data from the optical movement sensor and the GNSS system to provide a statistically optimal estimate of the position coordinates of the tractor. The GNSS system provides absolute position coordinate data and the optical movement system provides relative position and attitude data. Both sources of data can be used to obtain a more accurate calculated or determined position and attitude of the vehicle.
In cases where a GNSS outage occurs, the optical movement sensor continues to provide position and attitude data to the optimal estimator. In such circumstances, control of the vehicle can be effected by the information received from the optical movement sensor alone.
As a further benefit arising from the system shown in FIG. 6, the optical movement sensor provides position and attitude data at a much greater frequency than a GNSS system. Therefore, the position and attitude data received from the optical movement sensor can be used to provide a determined or calculated vehicle position and attitude during periods between receipt of positional data from the GNSS system. This feature assists in maintaining enhanced accuracy in the position and attitude data.
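One hedged way to picture how the higher-rate optical data can fill the gaps between GNSS fixes is the dead-reckoning sketch below. The body-frame increment format, the example rates, and the function name are assumptions introduced for illustration; the patent's optimal estimator would instead fuse these streams statistically as described above.

```python
import math

def propagate_between_fixes(last_gnss_xy, heading, optical_increments):
    """Dead-reckon the vehicle position forward from the most recent GNSS fix
    using relative (dx, dy, dheading) increments reported by the optical
    movement sensor in the vehicle body frame. Frames and units are assumed."""
    x, y = last_gnss_xy
    for dx_body, dy_body, dheading in optical_increments:
        # Rotate the body-frame displacement into the field (world) frame.
        x += dx_body * math.cos(heading) - dy_body * math.sin(heading)
        y += dx_body * math.sin(heading) + dy_body * math.cos(heading)
        heading += dheading
    return (x, y), heading

# Example: a GNSS fix at (10.0, 5.0) m with heading 0.1 rad, followed by twenty
# high-rate optical increments before the next GNSS fix arrives.
increments = [(0.02, 0.0, 0.0005)] * 20
pos, hdg = propagate_between_fixes((10.0, 5.0), 0.1, increments)
```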
FIG. 7 shows a flow sheet of the interaction of optical movement sensor(s) and inertial sensors with the controller. This flow sheet may be used in the embodiment shown in FIG. 3. The error calculation module 62, desired control path 64, correction calculation module 66 and curvature demand 68 shown in FIG. 7 are essentially identical to those shown in FIGS. 5 and 6 and will not be described further. However, as can be seen from FIG. 7, the controller, which is represented by dashed outline 214A, receives positional data from the optical movement sensor(s) 216 and the inertial sensors 221. This positional data is received by the optimal estimator 260. The optimal estimator 260 analyses this data and provides a best estimate of the position of the vehicle.
FIG. 8 shows a flow sheet demonstrating the interaction of optical movement sensor(s), inertial sensors and the GNSS system. The flow sheet shown in FIG. 8 may be used as an implementation for the embodiment shown in FIG. 4. The error calculation module 62, desired control path 64, correction calculation module 66 and curvature demand 68 shown in FIG. 8 are essentially identical to those shown in FIGS. 5-7 and will not be described further. In the embodiment shown in FIG. 8, the optimal estimator 360 receives positional data from the optical movement sensor 316, the GNSS system 399 and the inertial sensors 321. This data is sent to the optimal estimator 360, which produces a best estimate of the position of the vehicle. This is then sent to the error calculation module 62.
6. Alternative Embodiment Optical Control System 402
FIG. 9 shows a schematic view of another embodiment of the present invention in which an optical movement sensor 416A is mounted to the implement. As in the figures described above, two or more optical movement sensors may be provided, or a single optical movement sensor may be provided with a gimballed mounting. In the embodiment shown in FIG. 9, the optical movement sensor 416A is used to provide positional data relating to the position of the implement 412. In FIG. 9, other features are essentially identical to those shown in FIG. 1 and, for convenience, have been denoted by the same reference numbers as used in FIG. 1, but increased by 400, and need not be described further.
The embodiments shown in FIGS. 2-4 may also be modified by replacing the optical movement sensors in those embodiments with an optical movement sensor mounted to the implement. It will also be appreciated that the position of the implement may be determined as well as the position of the vehicle. In such cases, the optical movement sensor 416A mounted to the implement 412 (as shown in FIG. 9) may be provided in addition to the optical movement sensor mounted to the tractor, as shown in FIGS. 1 to 4.
FIG. 10 shows a schematic diagram of one possible embodiment of an optical movement sensor that may be used in the present invention. The optical movement sensor 500 shown in FIG. 10 includes a housing or enclosure 502. The housing 502 holds an illumination source 504, in the form of a ring of LEDs. The ring of LEDs is shown more clearly in FIG. 11. The housing 502 also houses a charge-coupled device (CCD) detector and an integrated optical movement sensor chip 506. The detector and optical movement sensor chip 506 may suitably be taken from an optical computer mouse. The housing 502 also houses a lens 510 (which will suitably be a telecentric lens). Light from the ring of LEDs that is reflected from the ground 512 is focused by the lens 510 onto the detector 506. In order to keep the lens 510 free of dirt and debris, a nozzle 514 may be positioned close to the lens 510. The nozzle 514 may periodically or continuously blow a jet of air over the lens 510 to thereby blow away any dirt or debris that may have settled on the lens. FIG. 10 also shows the field of illumination 516 and the field of view 518 provided by the arrangement 500.
The optical movement chip 506 sends signals to the optimal estimator, as shown in FIGS. 5 to 8. These signals may be sent via a wire 509 or via an appropriate wireless connection.
The present invention provides control systems that can be used to control the movement of the vehicle or an implement associated with the vehicle. The control system includes an optical movement sensor that may be the operative part of an optical computer mouse. These optical movement sensors are relatively inexpensive, provide a high processing rate and utilize proven technology. Due to the high processing rate of such optical movement sensors, the control system has a high clock speed and therefore a high frequency of updating of the determined or calculated position of the vehicle or implement. The optical movement sensor may be used by itself or it may be used in conjunction with a GNSS system, one or more inertial sensors, or one or more vehicle based sensors. The optical movement sensor can be used to augment the accuracy of inertial and/or other sensors. In particular, the optical movement sensor can be used to debias yaw drift that is often inherent in inertial sensors.
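The remark about debiasing yaw drift can be pictured with the simple complementary-filter sketch below: the gyro yaw rate is compared against the yaw rate implied by the optical movement sensor and a slowly varying bias estimate is tracked. This is an assumed, simplified stand-in for the statistical estimation actually described in this document; the function name, the time constant tau and the update form are illustrative only.

```python
def debias_yaw_rate(gyro_rate, optical_rate, bias, dt, tau=30.0):
    """Track a slowly varying gyro bias by comparing the gyro yaw rate against
    the yaw rate implied by the optical movement sensor. 'tau' is an assumed
    bias time constant in seconds; this is a sketch, not the patent's estimator."""
    alpha = dt / (tau + dt)                  # low-pass weighting for the bias estimate
    bias += alpha * ((gyro_rate - optical_rate) - bias)
    corrected_rate = gyro_rate - bias        # debiased yaw rate passed on to the controller
    return corrected_rate, bias
```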
7. Alternative Embodiment Vehicle Control System 600
As described in the background section above, one of the problems with existing vehicle control systems is that they are inherently “one-dimensional” or “linear” in nature. The inherent “linear” nature of existing control systems is illustrated schematically in FIG. 13. Whilst the “real world” spatial geometry of the respective swaths shown on the left in FIG. 13 may have been calculated, nevertheless from the control system's point of view at any given time the controller only “knows” that the vehicle is on the nth swath and that it has been moving along that swath for a known amount of time with known speed. Hence, at a fundamental level, the controller does not inherently know where the vehicle is located in space. This is represented graphically in FIG. 13.
Next, FIG. 14 shows an agricultural vehicle 601 having a control system in accordance with one embodiment of the present invention. In FIG. 14, the agricultural vehicle 601 is a tractor towing an implement 602. The implement 602 could be a plow, harvester, seed sower, leveler, agricultural chemical applicator/dispenser or any other kind of agricultural implement. Furthermore, the embodiment of the invention shown in FIG. 14 could equally be applied on other kinds of vehicles operating in other areas, for example cars, mine-trucks, airport tarmac vehicles, etc.
The components of the control system in the particular embodiment shown in FIG. 14 include a main control unit (MCU) 603, a GPS antenna 604 and actuators 605. The main control unit 603 houses the spatial database and also the electronic hardware used to implement the controller. The main control unit 603 may be an industrial computer (for example an industrial PC) capable of running other applications in addition to the vehicle control system. Alternatively, the main control unit 603 may be a purpose-built unit containing only the hardware required to run the controller, the spatial database and the other components of the vehicle control system.
The main control unit 603 receives GPS signals from the GPS antenna 604, and it uses these (typically in combination with feedback and/or other external spatial data signals) to generate a control signal for steering the vehicle. The control signal will typically be made up of a number of components or streams of data relating to the different parameters of the vehicle being controlled, for example the vehicle's “cross-track error”, “heading error”, “curvature error”, etc. These parameters will be described further below. The control signal is amplified using suitable signal amplifiers (not shown) to create a signal that is sufficiently strong to drive the actuators 605. The actuators 605 are interconnected with the vehicle's steering mechanism (not shown) such that the actuators operate to steer the vehicle as directed by the control signal.
In some embodiments, further actuators (not shown) may also be provided which are interconnected with the vehicle's accelerator and/or braking mechanisms, and the control signal may incorporate components or signal streams relating to the vehicle's forward progress (i.e. its forward speed, acceleration, deceleration, etc.). In these embodiments, the component(s) of the control signal relating to the vehicle's forward progress may also be amplified by amplifiers (not shown) sufficiently to cause the actuators which are interconnected with the accelerator/braking mechanism to control the vehicle's acceleration/deceleration in response to the control signal.
The vehicle 601 may also be optionally provided with one or more optical sensors 606, one or more inertial sensors (IS) 607 and a user terminal (UT) 608. One form of optical sensor 606 that may be used may operate by receiving images of the ground beneath the vehicle, preferably in rapid succession, and correlating the data pertaining to respective successive images to obtain information relating to the vehicle's motion. Other forms of optical sensor may also be used including LIDAR (Light Detection and Ranging) or sensors which operate using machine vision and/or image analysis. If present, the one or more inertial sensors 607 will typically include at least one gyroscope (e.g., a rate gyroscope), although the inertial sensors 607 could also comprise a number of sensors and components (such as accelerometers, tilt sensors and the like) which together form a sophisticated inertial navigation system (INS). The vehicle may be further provided with additional sensors (not shown) such as sensors which receive information regarding the location of the vehicle relative to a fixed point of known location in or near the field, magnetometers, ultrasonic range and direction finding and the like. The data generated by these additional sensors may be fed into the database and used by the control system to control the vehicle as described below.
In embodiments where the main control unit 603 comprises an industrial PC or the like, the user terminal 608 may comprise a full computer keyboard and separate screen to enable the user to utilize the full functionality of the computer. However, in embodiments where the main control unit is a purpose-built unit containing only hardware relating to the vehicle's control system, the terminal 608 may comprise, for example, a single combined unit having a display and such controls as may be necessary for the user to operate the vehicle's control system. Any kind of controls known by those skilled in this area to be suitable may be used on the main control unit, including keypads, joysticks, touch screens and the like.
In FIG. 14, the user terminal 608 is positioned in the vehicle cabin so that it can be operated by the driver as the vehicle moves. However, those skilled in the art will recognize that the present control system could also be operated by wireless remote control, meaning that the user terminal 608 could alternatively be totally separate from the vehicle and could operate the vehicle's control system from a remote location. It is also envisaged that a single remote user terminal 608 may be used to wirelessly interface with the control systems of multiple vehicles (possibly simultaneously) so that the user can control multiple moving vehicles from the one remote terminal.
In order to control the steering of the vehicle, there are three parameters that should be controlled. These are the “cross-track error”, the “heading error” and the “curvature error”. The physical meaning of these parameters can be understood with reference to FIG. 15. The “cross-track error” is the lateral difference between the vehicle's actual position, and its desired position. This is illustrated by the “{” bracket in FIG. 15. The “heading error” is the difference between the vehicle's actual instantaneous direction of motion h (i.e. its actual compass heading), and its desired instantaneous direction of motion H. The heading error is given by:
Heading Error=H−h
Those skilled in the art will recognize that both h and H are inherently directional quantities.
Finally, the "curvature error" is the difference between the vehicle's actual instantaneous curvature 1/r (where r is the actual instantaneous radius of curvature of the vehicle's motion) and its desired instantaneous curvature 1/R (where R is the desired instantaneous radius of curvature). The curvature error is given by:
Curvature Error=1/R−1/r
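A minimal sketch of the three error terms follows, assuming planar field coordinates, headings in radians, and sign conventions chosen here purely for illustration (the document does not prescribe them).

```python
import math

def control_errors(actual_xy, actual_heading, actual_radius,
                   desired_xy, desired_heading, desired_radius):
    """Compute the cross-track, heading and curvature errors described above.
    Signs and frame conventions are illustrative assumptions."""
    dx = actual_xy[0] - desired_xy[0]
    dy = actual_xy[1] - desired_xy[1]
    # Cross-track error: the component of the position difference perpendicular
    # to the desired heading H (lateral offset from the desired path).
    cross_track = -dx * math.sin(desired_heading) + dy * math.cos(desired_heading)
    # Heading error: H - h, wrapped into [-pi, pi) so the controller always
    # steers through the smaller angle.
    heading_err = (desired_heading - actual_heading + math.pi) % (2 * math.pi) - math.pi
    # Curvature error: 1/R - 1/r.
    curvature_err = 1.0 / desired_radius - 1.0 / actual_radius
    return cross_track, heading_err, curvature_err
```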
It will also be clearly appreciated that there may be many other vehicle variables or parameters which also need to be controlled if, for example, acceleration/deceleration or the vehicle's mode of equipment operation are also to be controlled.
Referring next to FIG. 16, it can be seen that a vehicle control system in accordance with one particular embodiment of the invention comprises: a task path generator; a spatial database; at least one external spatial data source; a vehicle attitude compensation module; a position error generator; a controller; and actuators to control (steer) the vehicle.
In the overall operation of the control system, the desired path trajectory for the vehicle is first entered into the control system by the user via the user terminal 608. The task path generator then interprets this user-defined path definition and converts it into a series of points of sufficient spatial density to adequately represent the desired path to the requisite level of precision. The task path generator typically also defines the vehicle's desired trajectory along the user-defined path, for example, by generating a desired vehicle position, a desired heading H and a desired instantaneous radius of curvature R for each point on the path. This information is then loaded into the spatial database. The way in which this and other spatial information is stored within the database in representative embodiments, and in particular the way in which pieces of data are given memory allocations according to their spatial location, is described further below.
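The task path generator's densification step might be sketched as follows. The example simply resamples a user-defined polyline at a fixed spacing and attaches a desired heading to each generated point; the spacing value, the treatment of every segment as straight (desired curvature zero) and the output format are assumptions for illustration only.

```python
import numpy as np

def densify_path(waypoints, spacing=0.5):
    """Resample a coarse, user-defined polyline at roughly 'spacing' metres and
    attach a desired heading H to every generated point. Assumes at least two
    waypoints; each segment is treated as straight, so the desired curvature
    (1/R) is recorded as zero. A simplifying illustration only."""
    waypoints = np.asarray(waypoints, dtype=float)
    path = []
    for p0, p1 in zip(waypoints[:-1], waypoints[1:]):
        seg = p1 - p0
        length = float(np.hypot(seg[0], seg[1]))
        heading = float(np.arctan2(seg[1], seg[0]))   # desired heading H along this segment
        steps = max(int(length // spacing), 1)
        for i in range(steps):
            x, y = p0 + seg * (i / steps)
            path.append((x, y, heading, 0.0))          # (x, y, desired heading, desired curvature)
    x_end, y_end = waypoints[-1]
    path.append((x_end, y_end, heading, 0.0))
    return path

# Example: a two-swath path entered by the user, densified before being loaded
# into the spatial database.
points = densify_path([(0.0, 0.0), (100.0, 0.0), (100.0, 3.0), (0.0, 3.0)])
```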
As the vehicle moves along the user-defined path, it will invariably experience various perturbations in its position and orientation due to, for example, bumps, potholes, subsidence beneath the vehicle's wheels, vehicle wheel-spin, over/under-steer, etc. Those skilled in this area will recognize that a huge range of other similar factors can also influence the instantaneous position and orientation of the vehicle as it moves. One of the purposes of the present control system is to automatically correct for these perturbations in position and orientation to maintain the vehicle on the desired path (or as close to it as possible).
As the vehicle moves, the control system progressively receives updated information regarding spatial location from the external spatial data sources. The external spatial data sources will typically include GPS. However, a range of other spatial data sources may also be used in addition to, or as a substitute for, GPS. For example, the inertial navigation systems (INS), visual navigation systems, etc. described above may also be used as external data sources in the present control system.
Those skilled in the art will recognize that the spatial data collected by the external spatial data sources actually pertains to the specific location of the external spatial data receivers, not necessarily the vehicle/implement reference location itself (which is what is controlled by the control system). In FIG. 14, the reference location is on the vehicle 601 and is indicated by the intersection (i.e. the origin) of the roll, pitch and yaw axes. In other embodiments, the reference location may be located elsewhere on the vehicle, or on the implement 602, etc. In any event, to illustrate this point, it will be seen that the GPS antenna 604 in FIG. 14 is located on the roof of the vehicle some distance from the vehicle's reference point. Therefore, the spatial data collected by the GPS antenna actually relates to the instantaneous location of the vehicle's roof, not the location of the vehicle's reference point. Likewise, the spatial data collected by the optical sensor 606 actually pertains to the particular location of the optical sensor (slightly out in front of the vehicle in FIG. 14).
In addition to this, changes in the vehicle's attitude will also influence the spatial position readings received by the different receivers. For example, if one of the vehicle's wheels passes over, or is pushed sideways by a bump, this may cause the vehicle to rotate about at least one (and possibly two or three) of the axes shown in FIG. 14. This will in turn change the relative position of the spatial data receiver(s) such as GPS antenna 604 with respect to the reference location on the vehicle or implement. This can be used (typically in combination with other sources of external spatial data or “feedback” data) to determine the orientation of the vehicle. The orientation of the vehicle may be considered to be the relative orientation of the vehicle's axes in space.
In order to compensate for the difference in position between the vehicle's reference point and the location of the spatial data receiver(s), and also to account for changes in the vehicle's orientation, a vehicle attitude compensation module is provided. This is shown in FIG. 16. The vehicle attitude compensation module converts all readings taken by the various spatial data receivers (which relate to the different specific locations of the receivers) into readings pertaining to the spatial location and orientation of the vehicle's reference point. This data pertaining to the spatial location and orientation of the vehicle's reference point is then fed into the spatial database.
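A hedged sketch of the vehicle attitude compensation idea is shown below: the antenna's known offset from the reference point is rotated through the vehicle's current roll, pitch and yaw and subtracted from the antenna reading. The Z-Y-X rotation convention, the coordinate frames and the example lever arm are assumptions, not the document's specification.

```python
import numpy as np

def antenna_to_reference(antenna_pos, roll, pitch, yaw, lever_arm_body):
    """Translate a position measured at the GPS antenna to the vehicle reference
    point at the intersection of the roll, pitch and yaw axes. The rotation
    convention (yaw-pitch-roll, i.e. Rz @ Ry @ Rx) and the lever arm are assumed."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # Body-to-world rotation matrix, R = Rz(yaw) @ Ry(pitch) @ Rx(roll).
    R = np.array([[cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
                  [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
                  [-sp,     cp * sr,                cp * cr]])
    # The antenna sits at reference + R @ lever_arm, so subtract the rotated offset.
    return np.asarray(antenna_pos) - R @ np.asarray(lever_arm_body)

# Example: an antenna assumed to be 2.5 m above and 0.8 m behind the reference point.
ref = antenna_to_reference([500123.4, 6950321.7, 45.2],
                           roll=0.02, pitch=-0.01, yaw=1.3,
                           lever_arm_body=[-0.8, 0.0, 2.5])
```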
Those skilled in the art will recognize that the one or more external spatial data sources will progressively receive updated data readings in rapid succession (e.g., in “real time” or as close as possible to it). These readings are then converted by the vehicle attitude compensation module and fed into the spatial database. The readings may also be filtered as described above. Therefore, whilst each reading from each spatial data source is received, converted (ideally filtered) and entered into the spatial database individually, nevertheless the rapid successive way in which these readings (possibly from multiple “parallel” data sources) are received, converted and entered effectively creates a “stream” of incoming spatial data pertaining to the vehicle's continuously changing instantaneous location and orientation. In order to provide sufficient bandwidth, successive readings from each external spatial data source should be received and converted with a frequency of the same order as the clock speed (or at least one of the clock speeds) of the controller, typically 3 Hz-12 Hz or higher.
Referring again to FIG. 16, the position error generator next receives information from the spatial database. The information it receives from the database includes: the vehicle's desired position, heading H and instantaneous radius of curvature R. (It will be recalled that this information is originally generated by the task path generator and then entered into the spatial database, based on the user-defined path trajectory); and the vehicle's actual position, heading h and instantaneous radius of curvature r. (This information is based on spatial data progressively received from the external spatial data sources as described above, and typically also on data received through feedback.)
The position error generator then uses this information to calculate an instantaneous “error term” for the vehicle. The “error term” incorporates the vehicle's instantaneous cross-track error, heading error and curvature error (as described above). The error term is then fed into the controller. The controller is shown in greater detail in FIG. 17.
From FIG. 17 it can be seen that the controller incorporates a cross-track error proportional-integral-derivative (PID) controller, a heading error PID controller and a curvature error PID controller. The PID controllers used with the present invention are of a conventional form that will be well understood by those skilled in this area and need not be described in detail. The output from the cross-track error, heading error and curvature error PID controllers then passes through a curvature demand signal integrator. The output from the PID controllers is therefore integrated in order to generate a curvature demand signal. This curvature demand signal is thus the "control signal" which is amplified by amplifiers (not shown) before proceeding to drive the actuators as required. In other words, the signal obtained by integrating the output from the PID controllers is amplified and sent to the actuators in the form of a curvature demand to change the vehicle's steering angle and hence steer the vehicle back onto the desired path. Finally, the change in vehicle pose, etc., caused by the control driven change in steering angle is registered via the updated information received through the external data sources (GPS etc.) and the vehicle's new position, heading and instantaneous radius of curvature are re-entered into the spatial database to complete the control system's overall closed loop control structure. It will be noted that the arrows extending from the actuators/steering mechanism to the external data sources in FIG. 16 are dashed rather than solid lines. This is to indicate that, whilst there is no actual control signal or other data flow from the actuators/steering mechanism to the external data sources, there is nevertheless a causal link between the change in vehicle pose, etc., caused by the control driven change in steering angle and the updated information received through the external data sources.
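To make the structure of FIG. 17 concrete, the sketch below chains three PID controllers and integrates the sum of their outputs into a curvature demand. The class names and all gain values are placeholders chosen for illustration; they are not values disclosed in this document.

```python
class PID:
    """Plain proportional-integral-derivative controller."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


class CurvatureDemandController:
    """Combines the cross-track, heading and curvature PID outputs and integrates
    the sum into a curvature demand, mirroring the arrangement of FIG. 17.
    The gains below are placeholders, not values from the patent."""
    def __init__(self):
        self.xtrack_pid = PID(0.8, 0.01, 0.1)
        self.heading_pid = PID(1.2, 0.0, 0.05)
        self.curvature_pid = PID(0.5, 0.0, 0.0)
        self.curvature_demand = 0.0

    def update(self, xtrack_err, heading_err, curvature_err, dt):
        combined = (self.xtrack_pid.update(xtrack_err, dt)
                    + self.heading_pid.update(heading_err, dt)
                    + self.curvature_pid.update(curvature_err, dt))
        self.curvature_demand += combined * dt   # curvature demand signal integrator
        return self.curvature_demand             # amplified and sent to the actuators
```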
In FIG. 18, there is shown a slightly more elaborate embodiment of the control system. The embodiment shown in FIG. 18 is generally the same as that shown in FIG. 16, except that the embodiment in FIG. 18 incorporates an optimizing filter and an external obstacle detection input. The optimizing filter can operate to statistically optimize at least some of the spatial data contained in the spatial database. Also, the filter will generally operate as an "observer", meaning that it does not form part of the control loop. Rather, the filter will typically reside outside the control loop and it will generally operate by taking data directly from the database and returning optimized data directly into the database, as shown in FIG. 18. More specifically, the filter will take the updated "feedback" data that re-enters the database from the control loop (described above) together with the updated spatial data obtained from the external spatial data sources (after it has been processed by the vehicle attitude compensation module) and it will then use these disparate streams of data to calculate a statistically optimized updated estimate of, for example, the vehicle's instantaneous position, heading and radius of curvature. The filter will typically comprise a Kalman filter.
The external obstacle detection input may comprise any form of vision based, sound based or other obstacle detection means, and the obstacle detection data may be converted by the vehicle attitude compensation module (just like the other sources of external data discussed above) and then fed into the spatial database. Where the control system incorporates obstacle detection, it is then necessary for the task path generator to be able to receive updated information from the spatial database. This is so that if an obstacle is detected on the desired path, an alternative path that avoids the obstacle can be calculated by the task path generator and re-entered into the database. The ability of the task path generator to also receive data from the spatial database is indicated by the additional arrow from the spatial database to the task path generator in FIG. 18.
FIGS. 16-18 graphically represent the operation of the control system. However, it is also useful to consider the way in which the vehicle's parameters and dynamics are represented for the purposes of implementing the control system. Those skilled in the art will recognize that a range of methods may be used for this purpose. However, one convenient method is to represent the parameters and dynamics in "state space" form.
In state space representations, the variables or parameters used to mathematically model the motion of the vehicle, or aspects of its operation, are referred to as "states" x_i. In the present case, the states may include the vehicle's position (x, y), velocity (dx/dt, dy/dt), heading h, radius of curvature r, etc. Hence the states may include x_1 = x, x_2 = y, x_3 = h, x_4 = r, x_5 = dx/dt = dx_1/dt, x_6 = dy/dt = dx_2/dt, etc. However, it will be appreciated that the choice of states is never unique, and the meaning and implications of this will be well understood by those skilled in the art.
The values for the individual states at a given time are represented as the individual entries in an n×1 “state vector”:
X(t) = [x_1(t) x_2(t) x_3(t) x_4(t) . . . x_n(t)]^T
where n is the number of states.
In general, the mathematical model used to model the vehicle's motion and aspects of its operation will comprise a series of differential equations. The number of equations will be the same as the number of states. In some cases, the differential equations will be linear in terms of the states, whereas in other situations the equations may be nonlinear in which case they must generally be “linearised” about a point in the “state space”. Linearisation techniques that may be used to do this will be well known to those skilled in this area.
Next, by noting that any jth order linear differential equation can be re-written equivalently as a set of j first order linear differential equations, the linear (or linearized) equations that represent the model can be expressed using the following "state" equation:
dX(t)/dt = AX(t) + BU(t) + Ew(t)

Where:
    • A is an n×n matrix linking the state time derivatives to the states themselves,
    • U(t) is an m×1 matrix containing the external “forcing” inputs in the mathematical model,
    • B is an n×m matrix linking the state derivatives to the inputs,
    • m is the number of inputs,
    • Ew(t) is a quantity (represented by an n×1 vector) called the “process noise”. The process noise represents errors in the model and vehicle dynamics which exist in the actual vehicle but which are not accounted for in the model. As Ew(t) represents an unknown quantity, its contents are not known. However, for reasons that will be understood by those skilled in this area, in order to allow statistically optimised signal processing and state estimation Ew(t) is generally assumed to be Gaussian, white, have zero mean and to act directly on the state derivatives. It is also assumed that the process noise element associated with each individual state is uncorrelated with the process noise element of the other states.
The quantities that are desired to be known about the vehicle (the real values for which are generally also measured from the vehicle itself, if possible) are the outputs y_i from the model. Each of the outputs generated by the linear (or linearized) model comprises a linear combination of the states x_i and inputs u_i, and so the outputs can be defined by the "output" or "measurement" equation:
Y(t) = CX(t) + DU(t) + Mv(t)
where
    • C is a j×n matrix linking the outputs to the states,
    • D is a j×m matrix linking the outputs to the inputs,
    • j is the number of outputs, and
    • Mv(t) is a quantity (represented by a j×1 vector) called the "measurement noise". The measurement noise represents errors and noise that invariably exist in measurements taken from the actual vehicle. Like Ew(t) above, Mv(t) is assumed to be Gaussian, white, have zero mean, to act directly on the outputs and to be uncorrelated with the process noise.
Next, it will be noted that both the state equation and the measurement equation defined above are continuous functions of time. However, continuous time functions do not often lend themselves to easy digital implementation (such as will generally be required in implementing the present invention) because digital control systems generally operate as recursively repeating algorithms. Therefore, for the purpose of implementing the equations digitally, the continuous time equations may be converted into the following recursive discrete time equations by making the substitutions set out below and noting that (according to the principle of superposition) the overall response of a linear system is the sum of the free (unforced) response of that system and the responses of that system due to forcing/driving inputs. The recursive discrete time equations are:
X_k+1 = FX_k + GU_k+1 + Lw_k+1
Y_k+1 = ZX_k+1 + JU_k+1 + Nv_k+1
where k+1 is the time step occurring immediately after time step k, Z=C, J=D and Nv is the discrete time analog of the continuous time measurement noise Mv(t). F is a transition matrix which governs the free response of the system. F is given by:
F = e^(AΔt)
GU_k+1 is the forced response of the system, i.e. the system's response due to the driving inputs. It is defined by the convolution integral as follows:
GU_k+1 = ∫_0^Δt e^(A(Δt - τ)) BU(t_k+1 + τ) dτ
Similarly, the quantity Lw_k+1 is the (forced) response of the system due to the random "error" inputs that make up the process noise. Hence, conceptually this quantity may be defined as:
Lw_k+1 = ∫_0^Δt e^(A(Δt - τ)) Ew(t_k+1 + τ) dτ
However, as noted above, the quantity Ew(t) is not deterministic and so the integral defining Lw_k+1 cannot be performed (even numerically). It is for this reason that it is preferable to use statistical filtering techniques such as a "Kalman Filter" to statistically optimize the states estimated by the mathematical model.
In general, a “Kalman Filter” operates as a “predictor-corrector” algorithm. Hence, the algorithm operates by first using the mathematical model to “predict” the value of each of the states at time step k+1 based on the known inputs at time step k+1 and the known value of the states from the previous time step k. It then “corrects” the predicted value using actual measurements taken from the vehicle at time step k+1 and the optimized statistical properties of the model. In summary, the Kalman Filter comprises the following equations each of which is computed in the following order for each time step:
Predictor:
    X_k+1|k = F X_k|k + G U_k+1
    P_k+1|k = F P_k|k F^T + Q
    K_k+1 = P_k+1|k Z^T (Z P_k+1|k Z^T + R)^-1
    Y_k+1 = Z X_k+1|k + J U_k+1
Corrector:
    υ_k+1 = Ŷ_k+1 - Y_k+1
    X_k+1|k+1 = X_k+1|k + K_k+1 υ_k+1
    P_k+1|k+1 = (I - K_k+1 Z) P_k+1|k

where
    • the notation k+1|k means the value of the quantity in question at time step k+1 given information from time step k. Similarly, k+1|k+1 means the value of the quantity at time step k+1 given updated information from time step k+1.
    • P is the co-variance in the difference between the estimated and actual value of X.
    • Q is the co-variance in the process noise.
    • K is the "Kalman gain" which is a matrix of computed coefficients used to optimally "correct" the initial state estimate.
    • R is the co-variance in the measurement noise.
    • Ŷ is a vector containing measurement values taken from the actual vehicle.
The operation of the discrete time state space equations outlined above, including the Kalman gain and the overall feedback closed loop control structure, are represented graphically in FIG. 19.
In relation to the spatial database, it is mentioned above that a wide range of methods are known for arranging data within databases. One commonly used technique is to provide a “hash table”. The hash table typically operates as a form of index allowing the computer (in this case the control system CPU) to “look up” a particular piece of data in the database (i.e. to look up the location of that piece of data in memory). In the context of the present invention, pieces of data pertaining to particular locations along the vehicle's path are assigned different hash keys based on the spatial location to which they relate. The hash table then lists a corresponding memory location for each hash key. Thus, the CPU is able to “look up” data pertaining to a particular location by looking up the hash key for that location in the hash table which then gives the corresponding location for the particular piece of data in memory. In order to increase the speed with which these queries can be carried out, the hash keys for different pieces of spatial data can be assigned in such a way that “locality” is maintained. In other words, points which are close to each other in the real world should be given closely related indices in the hash table (i.e. closely related hash keys).
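The hash-table arrangement described above can be pictured with the toy sketch below, in which a dictionary stands in for the hash table and any spatial hash function (such as the bitwise-interleaving function illustrated later in this section) supplies the keys. The class and method names are illustrative assumptions, not elements of the patent.

```python
class SpatialDatabase:
    """Toy spatial database: a dict maps spatial hash keys to stored records.
    'hash_fn' is any function that maps an (x, y) coordinate to an integer key."""
    def __init__(self, hash_fn):
        self.hash_fn = hash_fn
        self.table = {}

    def store(self, coord, record):
        # Records for the same spatial cell share one hash key.
        self.table.setdefault(self.hash_fn(*coord), []).append(record)

    def lookup(self, coord):
        # "Look up" all data pertaining to the given spatial location.
        return self.table.get(self.hash_fn(*coord), [])
```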
The spatial hash algorithm used to generate hash keys for different spatial locations in representative embodiments of the present invention may be most easily explained by way of a series of examples. To begin, it is useful to consider the hypothetical vehicle path trajectory shown in FIG. 20. In FIG. 20, the successive points which define the path are described by a simplified integer based (X,Y) coordinate system. Hence, in FIG. 20, the vehicle moves in the X direction along the entire length of the first swath from (0,0) to (4,0), before moving up in the Y direction to then move back along the second swath in the opposite direction from (4,1) to (0,1), etc.
As outlined above, in the present invention all data is stored within the spatial database with reference to spatial location. Therefore, it is necessary to assign indices or “hash keys” to each piece of data based on the spatial location to which each said piece of data relates. However, it will be recalled that the hash table must operate by listing the hash key for each particular spatial location together with the corresponding memory location for data pertaining to that spatial location. Therefore, the hash table is inherently one-dimensional, and yet it must be used to link hash keys to corresponding memory allocations for data that inherently pertains to two-dimensional space.
One simple way of overcoming this problem would be to simply assign hash keys to each spatial location based only on, say, the Y coordinate at each location. The hash keys generated in this way for each point on the vehicle path in FIG. 20 are given in Table 1 below.
TABLE 1
Spatial Hash Key Generated Using only the Y Coordinate
(X, Y) coordinates    Hash key (hexadecimal)    Hash key (decimal)
(0, 0) 0x0 0
(1, 0) 0x0 0
(2, 0) 0x0 0
(3, 0) 0x0 0
(4, 0) 0x0 0
(0, 1) 0x1 1
(1, 1) 0x1 1
(2, 1) 0x1 1
(3, 1) 0x1 1
(4, 1) 0x1 1
(0, 2) 0x2 2
(1, 2) 0x2 2
(2, 2) 0x2 2
(3, 2) 0x2 2
(4, 2) 0x2 2
(0, 3) 0x3 3
(1, 3) 0x3 3
(2, 3) 0x3 3
(3, 3) 0x3 3
(4, 3) 0x3 3
(0, 4) 0x4 4
(1, 4) 0x4 4
(2, 4) 0x4 4
(3, 4) 0x4 4
(4, 4) 0x4 4
The prefix “0x” indicates that the numbers in question are expressed in hexadecimal format. This is a conventional notation.
Those skilled in the art will recognize that the above method for generating hash keys is far from optimal because there are five distinct spatial locations assigned to each different hash key. Furthermore, in many instances, this method assigns the same hash key to spatial locations which are physically remote from each other. For instance, the point (0,1) is distant from the point (4,1), and yet both locations are assigned the same hash key. An identically ineffective result would be obtained by generating a hash key based on only the X coordinate.
An alternative method would be to generate hash keys by concatenating the X and Y coordinates for each location. The hash keys generated using this method for each point on the vehicle path in FIG. 20 are given in Table 2 below.
TABLE 2
Hash Keys Generated by Concatenating the X and Y Coordinates
(X, Y) coordinates    Hash key (hexadecimal)    Hash key (decimal)
(0, 0) 0x0 0
(1, 0) 0x100 256
(2, 0) 0x200 512
(3, 0) 0x300 768
(4, 0) 0x400 1024
(0, 1) 0x1 1
(1, 1) 0x101 257
(2, 1) 0x201 513
(3, 1) 0x301 769
(4, 1) 0x401 1025
(0, 2) 0x2 2
(1, 2) 0x102 258
(2, 2) 0x202 514
(3, 2) 0x302 770
(4, 2) 0x402 1026
(0, 3) 0x3 3
(1, 3) 0x103 259
(2, 3) 0x203 515
(3, 3) 0x303 771
(4, 3) 0x403 1027
(0, 4) 0x4 4
(1, 4) 0x104 260
(2, 4) 0x204 516
(3, 4) 0x304 772
(4, 4) 0x404 1028
In order to understand how the numbers listed in Table 2 above were arrived at, it is necessary to recognize that in the digital implementation of the present control system, all coordinates will be represented in binary. For the purposes of the present example which relates to the simplified integer based coordinate system in FIG. 20, a simplified 8-bit binary representation has been used.
Hence, to illustrate the operation of the spatial hash key algorithm used to generate the numbers in Table 2, consider the point (3,3). Those skilled in the art will understand that the decimal number 3 may be written as 11 in binary notation. Therefore, the location (3,3) may be rewritten in 8-bit binary array notation as (00000011,00000011). Concatenating these binary coordinates then gives the single 16-bit binary hash key 0000001100000011 which can equivalently be written as the hexadecimal number 0x303 or the decimal number 771. The process of converting between decimal, binary and hexadecimal representations should be well known to those skilled in the art and need not be explained.
It will be noted from Table 2 above that concatenating the X and Y coordinates leads to unique hash keys (in this example) for each spatial location. However, the hash keys generated in this way are still somewhat sub-optimal because points which are located close to each other are often assigned vastly differing hash keys. For example, consider the points (0,0) and (1,0). These are adjacent points in the "real world". However, the hash keys assigned to these points using this method (written in decimal notation) are 0 and 256 respectively. In contrast, the point (0,4) is much further away from (0,0) and yet it is assigned the much closer hash key 4. Therefore, this algorithm does not maintain "locality", and an alternative algorithm would be preferable.
Yet a further method for generating hash keys is to use a technique which shall hereinafter be referred to as "bitwise interleaving". As for the previous example, the first step in this technique is to represent the (X,Y) coordinates in binary form. Hence, using the 8-bit binary array representation discussed above, the point (X,Y) may be re-written in 8-bit binary array notation as (X1X2X3X4X5X6X7X8, Y1Y2Y3Y4Y5Y6Y7Y8). Next, rather than concatenating the X and Y coordinates to arrive at a single 16-bit binary hash key, the successive bits from the X and Y binary coordinates are alternatingly "interleaved" to give the following 16-bit binary hash key X1Y1X2Y2X3Y3X4Y4X5Y5X6Y6X7Y7X8Y8. The hash keys generated using this method for each point on the vehicle path in FIG. 20 are given in Table 3 below.
TABLE 3
Hash Keys Generated by "Bitwise Interleaving" the X and Y Coordinates
(X, Y) coordinates    Hash key (hexadecimal)    Hash key (decimal)
(0, 0) 0x0 0
(1, 0) 0x2 2
(2, 0) 0x8 8
(3, 0) 0xa 10
(4, 0) 0x20 32
(0, 1) 0x1 1
(1, 1) 0x3 3
(2, 1) 0x9 9
(3, 1) 0xb 11
(4, 1) 0x21 33
(0, 2) 0x4 4
(1, 2) 0x6 6
(2, 2) 0xc 12
(3, 2) 0xe 14
(4, 2) 0x24 36
(0, 3) 0x5 5
(1, 3) 0x7 7
(2, 3) 0xd 13
(3, 3) 0xf 15
(4, 3) 0x25 37
(0, 4) 0x10 16
(1, 4) 0x12 18
(2, 4) 0x18 24
(3, 4) 0x1a 26
(4, 4) 0x30 48
To further illustrate the operation of the spatial hash algorithm used to generate the numbers in Table 3, consider the point (3,4). As noted above, the decimal number 3 may be written as 11 in binary notation. Similarly, decimal number 4 is written as 100 in binary. Therefore, the location (3,4) may be rewritten in 8-bit binary array notation as (00000011,00000100). Bitwise interleaving these binary coordinates then gives the single 16-bit binary hash key 0000000000011010, which can equivalently be written as the hexadecimal number 0x1a or the decimal number 26.
From Table 3 it will be seen that generating hash keys by “bitwise interleaving” the X and Y coordinates leads to unique hash keys (in this example) for each spatial location. Also, the hash keys generated in this way satisfy the requirement that points which are close together in the real world are assigned closely related hash keys. For example, consider again the points (0,0) and (1,0). The hash keys now assigned to these points by “bitwise interleaving” (when written in decimal notation) are 0 and 2 respectively. Furthermore, the point (0,1) which is also nearby is also assigned the closely related hash key 1. Conversely, points which are separated by a considerable distance in the real world are given considerably differing hash keys, for example, the hash key for (4,3) is 37.
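A short sketch of the bitwise-interleaving hash for the simplified 8-bit integer coordinates of FIG. 20 is given below; it reproduces the keys listed in Table 3. The function name and the fixed 8-bit width are assumptions made for this example.

```python
def interleave_8bit(x, y):
    """Bitwise-interleave two 8-bit integer coordinates into a 16-bit hash key,
    taking bits from X and Y alternately (X bit first), as in Table 3."""
    key = 0
    for bit in range(7, -1, -1):          # from most- to least-significant bit
        key = (key << 1) | ((x >> bit) & 1)
        key = (key << 1) | ((y >> bit) & 1)
    return key

assert interleave_8bit(3, 4) == 0x1a      # the worked example above: (3,4) -> 26
assert interleave_8bit(4, 3) == 0x25      # (4,3) -> 37, as listed in Table 3
```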
From the example described with reference to Table 3, it can be seen that generating hash keys by “bitwise interleaving” the binary X and Y coordinates preserves “locality”. This example therefore conceptually illustrates the operation of the bitwise interleaving spatial hash algorithm that may be used with representative embodiments of the present invention. However, the above example is based on the simplified integer based coordinate system shown in FIG. 20. In order to understand the actual algorithm that may be used in the implementation of the present control system, it is necessary to take into account certain other complexities. These complexities include:
The fact that GPS and other similar systems which describe spatial location typically do so using IEEE double-precision floating-point numbers (not simple integers). For instance, GPS supplies coordinates in the form of (X,Y) coordinates where X corresponds to longitude, and Y corresponds to latitude. Both X and Y are given in units of decimal degrees.
The fact that certain spatial locations have negative coordinate values when described using GPS and other similar coordinate systems. For example, using the WGS84 datum used by current GPS, the coordinates (153.00341, −27.47988) correspond to a location in Queensland, Australia (the negative latitude value indicates southern hemisphere).
Complexities inherent in representing numbers in accordance with the IEEE double-precision floating-point numbers standard.
FIG. 21 shows an example vehicle path similar to that shown in FIG. 20, except that the coordinates used to describe the points along the path in FIG. 21 correspond to a “realistic” coordinate system such as that used by current GPS. In order to understand the implementation of the bitwise interleaving spatial hash algorithm when applied to these realistic coordinates, it is necessary to first appreciate certain aspects regarding the way numbers are represented using the standard IEEE double-precision floating-point number format.
A double-precision floating-point number represented in accordance with the IEEE 754 standard comprises a string of 64 binary characters (64 bits) as shown in FIG. 22. The number is represented in three parts, namely the sign, the exponent and the mantissa. The sign comprises one bit. If the sign bit is 1 then the number is negative, and conversely if the sign bit is 0 then the number is positive. The exponent comprises eleven binary characters, and hence can range from 00000000000 to 11111111111. Because of the need to represent numbers that are both greater and smaller than one, it is necessary to be able to represent both large positive and large negative values for the exponent. However, it is not desirable to use one of the exponent bits to represent the sign of the exponent, because this would leave fewer bits available to represent the exponent's actual value and would therefore greatly limit the size of the numbers that could be represented. Therefore, in the IEEE standard 64-bit format, the true value of the exponent is given by the binary number actually written by the eleven exponent bits minus an implied exponent bias.
Hence, actual exponent value = written exponent value - exponent bias.
The exponent bias is 0x3ff=1023. Consequently, the maximum true exponent value that can be represented (written in decimal notation) is 1023, and the minimum true exponent value that can be represented is −1022.
Finally, the remaining 52 bits form the mantissa. However, as all non-zero numbers must necessarily have a leading “1” when written in binary notation, an implicit “1” followed by a binary point is assumed to exist at the front of the mantissa. In other words, the leading “1” and the binary point which must necessarily exist for all non-zero binary numbers is simply omitted from the actual written mantissa in the IEEE 64-bit standard format. This is so that an additional bit may be used to represent the number with greater precision. However, when interpreting numbers which are represented in accordance with the IEEE standard, it is important to remember that this leading “1” and the binary point implicitly exist even though they are not written.
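For illustration, the sign/exponent/mantissa split described above can be performed as in the following sketch, which unpacks the raw 64-bit pattern of a double and de-biases the exponent. The helper name is an assumption; the bit layout follows the IEEE 754 description given here.

```python
import struct

def decompose_double(value):
    """Split an IEEE 754 double into its sign bit, 11-bit written exponent and
    52-bit mantissa. The de-biased ('true') exponent is the written value - 1023."""
    bits = struct.unpack('>Q', struct.pack('>d', value))[0]   # raw 64-bit pattern
    sign = bits >> 63
    written_exponent = (bits >> 52) & 0x7FF
    mantissa = bits & ((1 << 52) - 1)                          # implicit leading 1 not included
    return sign, written_exponent, written_exponent - 1023, mantissa

# 153.0 = 1.1953125 x 2^7, so the de-biased exponent should come out as 7.
sign, wexp, exp, mant = decompose_double(153.0)
assert (sign, exp) == (0, 7)
```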
Bearing in mind these issues, it is possible to understand the actual spatial hash algorithm used in representative implementations of the present control system. A "worked" example illustrating the operation of the spatial hash algorithm to generate a hash key based on the coordinate (153.0000°, −27.0000°) is given in the form of a flow diagram in FIG. 23. The points are initially expressed in terms of decimal degrees as this is the format in which they are delivered from, for example, GPS.
From FIG. 23 it can be seen that in order to implement the algorithm the X and Y coordinates are separated. The next step is to "normalise" the signs of the respective coordinates (in this case only the Y coordinate needs to be normalized). The reason for normalising the signs of the coordinates is that, when calculating a spatial hash key, it is more convenient to eliminate negative sign bits from the coordinates. In the case of the latitude coordinate, those skilled in this area will recognize that latitude is conventionally written as a number in the range (−90° ≤ latitude ≤ 90°). Therefore, by simply adding 90° to the value of the latitude coordinate, the spatial hash algorithm can operate with values in the equivalent "un-signed" or "normalised" latitude range (0° ≤ latitude ≤ 180°). Those skilled in the art will appreciate that the longitude coordinates can also be normalised to fall within the range (0° ≤ longitude ≤ 360°), although that is not necessary in this example.
After normalising the coordinates, the next step is to convert the respective coordinates from their representations in decimal degrees into binary IEEE double-precision floating-point number format. This is shown as step 3) in FIG. 23. However, it will be noted that the binary coordinate representations (and all other numbers which are generated or used by the algorithm in binary form) have been written in the alternative hexadecimal notation for ease of reference and to save space in FIG. 23.
Next, the binary representations of the two coordinates are split into their respective exponent (11 bits) and mantissa (52 bits) portions. This is step 4) in FIG. 23. Then, in order to determine the correct ("true") value of the exponent, the exponent for each of the coordinates is "de-biased" by subtracting the implicit exponent bias (0x3ff=1023) as described above. This is step 5).
After de-biasing the exponents, the resulting exponents are then adjusted by a selected offset. The size of the offset is selected depending on the desired "granularity" of the resulting fixed-point number. In the particular example shown in step 6) of FIG. 23, the offset is 37, however those skilled in the art will appreciate this number can be varied to suit.
After adjusting the exponent, the next step is to “resurrect” the leading “1” and the binary point which implicitly exist in the mantissa but which are left off when the mantissa is actually written (see above). Hence, the leading “1” and the binary point are simply prepended to the mantissa of each of the coordinates. This is step 7) in FIG. 23.
The mantissa for each coordinate is then right-shifted by the number of bits in the corresponding exponent. The exponents for each coordinate are then prepended to their corresponding mantissas forming a single character string for each coordinate. There is then an optional step of discarding the high-order byte for each of the two bit fields. This may be done simply to save memory if required, but is not necessary. Finally, the resultant bit fields for each coordinate are bitwise interleaved to obtain a single hash key corresponding to the original coordinates. In the example shown in FIG. 23, the resultant hash key is 32-bits in length. However, the length of the resultant hash key may vary depending on, for example whether the high-order byte is discarded, etc.
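An end-to-end sketch following the steps of FIG. 23 is given below. It is an interpretation rather than the patent's exact algorithm: the offset of 37, the field widths, the reading of the right-shift step and the normalisation ranges are all assumptions, and a production implementation would follow the document's precise choices.

```python
import struct

EXPONENT_BIAS = 1023
EXPONENT_BITS = 11
OFFSET = 37          # the "granularity" offset used in the FIG. 23 example (assumed)

def coord_to_bitfield(value):
    """Turn one normalised coordinate (decimal degrees, assumed non-zero) into a
    bit field following the steps of FIG. 23: de-bias and offset the exponent,
    resurrect the implicit leading 1, right-shift the mantissa by the exponent
    width, then prepend the exponent. Widths reflect one reading of the text."""
    bits = struct.unpack('>Q', struct.pack('>d', value))[0]
    exponent = ((bits >> 52) & 0x7FF) - EXPONENT_BIAS + OFFSET    # de-bias, then offset
    mantissa = (1 << 52) | (bits & ((1 << 52) - 1))               # implicit leading 1 restored
    mantissa >>= EXPONENT_BITS                                     # right-shift by exponent width
    return (exponent << (52 - EXPONENT_BITS + 1)) | mantissa       # exponent prepended to mantissa

def interleave(a, b, width=64):
    """Bitwise-interleave two bit fields of 'width' bits into one hash key."""
    key = 0
    for i in range(width - 1, -1, -1):
        key = (key << 1) | ((a >> i) & 1)
        key = (key << 1) | ((b >> i) & 1)
    return key

def spatial_hash(longitude, latitude):
    """Hash a (longitude, latitude) pair in decimal degrees, e.g. (153.00341, -27.47988)."""
    x_field = coord_to_bitfield(longitude % 360.0)   # normalise longitude into [0, 360)
    y_field = coord_to_bitfield(latitude + 90.0)     # normalise latitude into [0, 180]
    return interleave(x_field, y_field)
```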
Those skilled in the art will recognize that various other alterations and modifications may be made to the particular embodiments, aspects and features of the invention described without departing from the spirit and scope of the invention.

Claims (23)

Having thus described the invention, what is claimed as new and desired to be secured by Letters Patent is:
1. A system for controlling a vehicle, the vehicle including an automatic steering system and roll, pitch and yaw axes, and the control system comprising:
a spatial database containing spatial data corresponding to GPS-defined positions in the region;
a controller mounted on said vehicle and adapted for computing guidance signals, to receive spatial data from the spatial database at control speed, and to control the steering of the vehicle;
a guidance subsystem mounted on said vehicle and connected to said controller, said guidance subsystem being adapted for receiving said guidance signals from said controller and utilizing said guidance signals for guiding said vehicle;
external spatial data sources mounted on said vehicle, comprising at least an optical movement sensor subsystem adapted for optically sensing movement of said vehicle relative to a surface over which said vehicle is traveling;
said optical movement sensor subsystem including an optical movement sensor connected to said controller and adapted for providing optically-sensed vehicle movement signals thereto corresponding to optically-sensed relative vehicle movement;
said optical movement sensor subsystem including an optical movement sensor and an optimal estimator providing a statistically optimal estimate of the position and attitude information received from the optical movement sensor;
said optimal estimator including algorithms that receive the position and attitude information from the optical movement sensor and converts said information into a calculated or determined position and attitude of said vehicle producing a statistically optimal estimate of the calculated or determined position and attitude of said vehicle;
said controller being adapted for computing said guidance signals utilizing said vehicle movement signals;
the controller correlating images from said optical movement sensor subsystem to obtain data relating to the vehicle's motion;
a vehicle reference point located at an intersection of the vehicle roll, pitch and yaw axes; and
the spatial database being adapted to receive updated spatial data from the controller and the external spatial data sources as the vehicle traverses the region.
2. The system for controlling a vehicle according to claim 1, further comprising:
a global navigation satellite system (GNSS) positioning subsystem mounted on said vehicle and adapted for providing GNSS-derived position signals to said controller;
said controller using said GNSS-derived position signals for computing said guidance signals;
said GNSS positioning subsystem including a pair of antennas mounted on said vehicle; and
said antennas receiving GNSS ranging signals corresponding to their respective geo-reference locations.
3. The system for controlling a vehicle according to claim 2, further comprising:
said processor being adapted for computing an attitude of said vehicle using ranging differences between the GNSS signals received by said antennas; and
said GNSS antennas being mounted on said vehicle in transversely-spaced relation.
4. The system for controlling a vehicle according to claim 3, further comprising:
said vehicle including a motive component and an implement connected to said motive component;
a GNSS antenna mounted on said implement and connected to said GNSS receiver; and
said guidance subsystem being adapted for automatically steering said vehicle utilizing said positioning signals to accommodate an offset between said tractor and implement and correct relative positioning of said tractor and implement to maintain said implement on a guide path.
5. The system for controlling a vehicle according to claim 4, further comprising:
said guidance subsystem including an hydraulic steering valve block connected to said controller and to a steering mechanism of said vehicle; and
said guidance subsystem including a graphic user interface (GUI) adapted for displaying a guide path of said vehicle.
6. The system for controlling a vehicle according to claim 5, further comprising:
a GNSS base station including a radio transmitter and a radio receiver;
said vehicle including an RF receiver adapted to receive RF transmissions from said base station; and
a real-time kinematic (RTK) correction subsystem using carrier phase satellite transmissions with said vehicle in motion.
7. The system for controlling a vehicle according to claim 1 wherein said optical movement sensor subsystem includes:
a pair of said optical movement sensors fixedly mounted in spaced relation on said vehicle.
8. The system for controlling a vehicle according to claim 1, wherein said external spatial data sources mounted on the vehicle further comprise:
a GNSS system including an antenna and a receiver;
an inertial navigation system (INS) including a gyroscope and an accelerometer; and
a tilt sensor.
9. A control system as claimed in claim 8, wherein the controller uses the GPS system, the inertial navigation system, the gyroscope, the accelerometer and the tilt sensor to generate a control signal for controlling the vehicle.
10. A system for controlling an agricultural vehicle, the vehicle including an automatic steering system and roll, pitch and yaw axes, and the control system comprising:
a spatial database containing spatial data corresponding to GPS-defined positions in the region;
a controller mounted on said vehicle and adapted for computing guidance signals, to receive spatial data from the spatial database at control speed, and to control the steering of the vehicle;
a guidance subsystem mounted on said vehicle and connected to said controller, said guidance subsystem being adapted for receiving said guidance signals from said controller and utilizing said guidance signals for guiding said vehicle;
external spatial data sources mounted on said vehicle, comprising at least an optical movement sensor subsystem adapted for optically sensing movement of said vehicle relative to a surface over which said vehicle is traveling;
said optical movement sensor subsystem including an optical movement sensor connected to said controller and adapted for providing optically-sensed vehicle movement signals thereto corresponding to optically-sensed relative vehicle movement;
said optical movement sensor subsystem including an optical movement sensor and an optimal estimator providing a statistically optimal estimate of the position and attitude information received from the optical movement sensor;
said optimal estimator including algorithms that receive the position and attitude information from the optical movement sensor and converts said information into a calculated or determined position and attitude of said vehicle producing a statistically optimal estimate of the calculated or determined position and attitude of said vehicle;
said controller being adapted for computing said guidance signals utilizing said vehicle movement signals;
the controller correlating images from said optical movement sensor subsystem to obtain data relating to the vehicle's motion;
a vehicle reference point located at an intersection of the vehicle roll, pitch and yaw axes;
the spatial database being adapted to receive updated spatial data from the controller and the external spatial data sources as the vehicle traverses the region;
a global navigation satellite system (GNSS) positioning subsystem mounted on said vehicle and adapted for providing GNSS-derived position signals to said controller;
said controller using said GNSS-derived position signals for computing said guidance signals;
said GNSS positioning subsystem including a pair of antennas mounted on said vehicle;
said antennas receiving GNSS ranging signals corresponding to their respective geo-reference locations;
said processor being adapted for computing an attitude of said vehicle using ranging differences between the GNSS signals received by said antennas;
said GNSS antennas being mounted on said vehicle in transversely-spaced relation;
said vehicle including a motive component and an implement connected to said motive component;
a GNSS antenna mounted on said implement and connected to said GNSS receiver;
said guidance subsystem being adapted for automatically steering said vehicle utilizing said positioning signals to accommodate an offset between said tractor and implement and correct relative positioning of said tractor and implement to maintain said implement on a guide path;
said guidance subsystem including an hydraulic steering valve block connected to said controller and to a steering mechanism of said vehicle;
said guidance subsystem including a graphic user interface (GUI) adapted for displaying a guide path of said vehicle;
a GNSS base station including a radio transmitter and a radio receiver;
said vehicle including an RF receiver adapted to receive RF transmissions from said base station; and
a real-time kinematic (RTK) correction subsystem using carrier phase satellite transmissions with said vehicle in motion.
11. A method for controlling a vehicle within a region to be traversed, the vehicle including an automatic steering system and roll, pitch and yaw axes, the method comprising the steps:
providing a spatial database;
populating said database with spatial data corresponding to GPS-defined positions in the region;
providing a position error generator;
providing a controller;
mounting said controller to said vehicle;
traversing the region with said vehicle;
receiving spatial data with said controller from the spatial database at control speed;
controlling the steering of the vehicle with the controller as the vehicle traverses the region;
providing the controller with a task path generator;
receiving data from the spatial database with the controller and controller task path generator;
providing the controller with a vehicle attitude compensation module;
mounting external spatial data sources, including at least an optical movement sensor subsystem, on said vehicle and optically sensing movement of said vehicle relative to a surface over which said vehicle is traveling;
said optical movement sensor subsystem including an optimal estimator providing a statistically optimal estimate of the position and attitude information received from the optical movement sensor;
providing said optimal estimator with algorithms that receive the position and attitude information from the optical movement sensor and convert said information into a calculated or determined position and attitude of said vehicle producing a statistically optimal estimate of the calculated or determined position and attitude of said vehicle;
populating said spatial database with ground images from said optical movement sensor subsystem;
inputting said ground images to the controller;
correlating the images with said controller to obtain data relating to the vehicle's motion;
designating and locating a vehicle reference point at an intersection of the vehicle roll, pitch, and yaw axes; and
updating said spatial database with spatial data from the controller and said external spatial data sources as the vehicle traverses the region.
12. The method for controlling a vehicle according to claim 11, further comprising the steps:
providing a global navigation satellite system (GNSS) positioning subsystem mounted on said vehicle and providing GNSS-derived position signals to said controller;
providing said GNSS positioning subsystem with a pair of antennas mounted on said vehicle;
receiving with said antennas GNSS ranging signals corresponding to their respective geo-reference locations; and
computing with said processor an attitude of said vehicle using ranging differences between the GNSS signals received by said antennas.
13. The method for controlling a vehicle according to claim 12, further comprising the steps:
mounting said GNSS antennas on said vehicle in transversely-spaced relation.
14. The method for controlling a vehicle according to claim 12, further comprising the steps:
providing said vehicle with a motive component and an implement connected to said motive component;
mounting a GNSS antenna on said implement and connecting said implement-mounted GNSS antennas to said GNSS receiver; and
said guidance subsystem automatically steering said vehicle utilizing said positioning signals to accommodate an offset between said tractor and said implement and to maintain said implement on a guide path.
15. The method according to claim 11, which includes the additional steps of:
providing said optical movement sensor subsystem with a pair of optical movement sensors; and
fixedly mounting said optical movement sensors in spaced relation on said vehicle.
16. The method for controlling a vehicle according to claim 11, wherein said external spatial data sources mounted on the vehicle further comprise:
a GNSS system including an antenna and a receiver;
an inertial navigation system (INS) including a gyroscope and an accelerometer; and
a tilt sensor.
17. The method for controlling a vehicle according to claim 16, wherein the controller uses the GPS system, the inertial navigation system, the gyroscope, the accelerometer and the tilt sensor to generate a control signal for controlling the vehicle.
18. An apparatus for controlling a vehicle, the apparatus comprising:
a spatial database containing spatial data corresponding to absolute positions in a region; and
a controller, in communication with a single gimbal-mounted optical movement sensor of the vehicle or plural optical movement sensors mounted on the vehicle in transversely-spaced relation, the controller configured to:
convert position and attitude information from the single optical movement sensor or the plural optical movement sensors into a calculated position and attitude of the vehicle, wherein the calculated attitude defines a roll, yaw, and pitch of the vehicle,
steer the vehicle using the calculated position and attitude of the vehicle and the spatial data from the spatial database, and
update the spatial database with updated spatial data as the vehicle traverses the region.
19. The apparatus of claim 18, further comprising an optimal estimator to calculate the calculated position and attitude of the vehicle by calculating a statistically optimal estimate of the position and attitude information received from the single optical movement sensor or the plural optical movement sensors.
20. The apparatus of claim 19, wherein the optimal estimator includes algorithms that receive the position and attitude information from the single optical movement sensor or the plural optical movement sensors and convert the information into the calculated position and attitude of the vehicle by calculating the statistically optimal estimate.
21. A method of controlling a vehicle having a single gimbal-mounted optical movement sensor mounted thereon or plural optical movement sensors mounted thereon in transversely-spaced relation, the method comprising:
converting position and attitude information from the single optical movement sensor or the plural optical movement sensors into a calculated position and attitude of the vehicle, wherein the calculated attitude defines a roll, yaw, and pitch of the vehicle;
steering the vehicle using the calculated position and attitude of the vehicle and spatial data corresponding to absolute positions in a region;
wherein the spatial data is from a database, and the method further comprises updating the database with updated spatial data as the vehicle traverses the region.
22. The method of claim 21, further comprising calculating a statistically optimal estimate of the position and attitude information received from the single optical movement sensor or the plural optical movement sensors.
23. The method of claim 22, further comprising:
receiving the position and attitude information from the single optical movement sensor or the plural optical movement sensors; and
converting the received information into the calculated position and attitude of the vehicle by calculating the statistically optimal estimate.
US15/196,824 2007-01-05 2016-06-29 Optical tracking vehicle control system and method Active USRE48527E1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/196,824 USRE48527E1 (en) 2007-01-05 2016-06-29 Optical tracking vehicle control system and method

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US11/620,388 US7835832B2 (en) 2007-01-05 2007-01-05 Vehicle control system
US12/504,779 US8311696B2 (en) 2009-07-17 2009-07-17 Optical tracking vehicle control system and method
US12/947,620 US20110118938A1 (en) 2007-01-05 2010-11-16 Vehicle control system
US13/573,682 US8768558B2 (en) 2007-01-05 2012-10-03 Optical tracking vehicle control system and method
US15/196,824 USRE48527E1 (en) 2007-01-05 2016-06-29 Optical tracking vehicle control system and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/573,682 Reissue US8768558B2 (en) 2007-01-05 2012-10-03 Optical tracking vehicle control system and method

Publications (1)

Publication Number Publication Date
USRE48527E1 true USRE48527E1 (en) 2021-04-20

Family

ID=75440655

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/196,824 Active USRE48527E1 (en) 2007-01-05 2016-06-29 Optical tracking vehicle control system and method

Country Status (1)

Country Link
US (1) USRE48527E1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11167792B2 (en) * 2015-11-19 2021-11-09 Agjunction Llc Single-mode implement steering
US11180189B2 (en) * 2015-11-19 2021-11-23 Agjunction Llc Automated reverse implement parking
US11615639B1 (en) * 2021-01-27 2023-03-28 Jackson Klein Palm vein identification apparatus and method of use

Citations (477)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3585537A (en) 1969-02-10 1971-06-15 Bell Telephone Labor Inc Electric wave filters
US3596228A (en) 1969-05-29 1971-07-27 Ibm Fluid actuated contactor
US3727710A (en) 1971-05-13 1973-04-17 Gen Motors Corp Steer control for a track-laying vehicle
US3815272A (en) 1973-01-03 1974-06-11 G Marleau Collapsible, triangular net assembly
US3899028A (en) 1972-03-30 1975-08-12 Systron Donner Corp Angular position sensing and control system, apparatus and method
US3987456A (en) 1974-08-01 1976-10-19 Lignes Telegraphiques Et Telephoniques Wide relative frequency band and reduced size-to-wavelength ratio antenna
US4132272A (en) 1977-06-30 1979-01-02 International Harvester Company Tractor hitch position control system
US4170776A (en) 1977-12-21 1979-10-09 Nasa System for near real-time crustal deformation monitoring
US4180133A (en) 1978-01-12 1979-12-25 Iowa State University Research Foundation, Inc. Guidance system for towed vehicles
US4398162A (en) 1980-10-22 1983-08-09 Ngk Spark Plug Co., Ltd. Ladder-type piezoelectric filters
US4453614A (en) 1982-03-19 1984-06-12 Deere & Company Steering arrangement for an off-highway articulated vehicle
US4529990A (en) 1979-10-22 1985-07-16 Siemens Aktiengesellschaft Antenna system for a jamming transmitter
US4637474A (en) 1974-11-05 1987-01-20 Leonard Willie B Tractor and towed implement with elevation control system for implement including pressure responsive valve actuator
US4667203A (en) 1982-03-01 1987-05-19 Aero Service Div, Western Geophysical Method and system for determining position using signals from satellites
US4689556A (en) 1984-10-12 1987-08-25 Daymarc Corporation Broad band contactor assembly for testing integrated circuit devices
US4694264A (en) 1986-03-05 1987-09-15 The United States Of America As Represented By The United States Department Of Energy Radio frequency coaxial feedthrough device
US4694639A (en) * 1985-12-30 1987-09-22 Chen Sheng K Robotic lawn mower
US4710775A (en) 1985-09-30 1987-12-01 The Boeing Company Parasitically coupled, complementary slot-dipole antenna element
US4714435A (en) 1985-11-14 1987-12-22 Molex Incorporated Connection for flexible apparatus
US4739448A (en) 1984-06-25 1988-04-19 Magnavox Government And Industrial Electronics Company Microwave multiport multilayered integrated circuit chip carrier
US4751512A (en) 1986-01-21 1988-06-14 Oceanonics, Inc. Differential navigation system for remote mobile users
US4769700A (en) 1981-11-20 1988-09-06 Diffracto Ltd. Robot tractors
US4777601A (en) * 1985-03-15 1988-10-11 Jd-Technologie Ag Method and apparatus for a passive track system for guiding and controlling robotic transport and assembly or installation devices
US4785463A (en) 1985-09-03 1988-11-15 Motorola, Inc. Digital global positioning system receiver
US4802545A (en) 1986-10-15 1989-02-07 J. I. Case Company Steering control system for articulated vehicle
US4812991A (en) 1986-05-01 1989-03-14 Magnavox Govt. And Industrial Electronics Company Method for precision dynamic differential positioning
US4858132A (en) 1987-09-11 1989-08-15 Ndc Technologies, Inc. Optical navigation system for an automatic guided vehicle, and method
US4864320A (en) 1988-05-06 1989-09-05 Ball Corporation Monopole/L-shaped parasitic elements for circularly/elliptically polarized wave transceiving
US4894662A (en) 1982-03-01 1990-01-16 Western Atlas International, Inc. Method and system for determining position on a moving platform, such as a ship, using signals from GPS satellites
US4916577A (en) 1988-12-20 1990-04-10 Grumman Aerospace Corporation Method of mounting removable modules
US4918607A (en) 1988-09-09 1990-04-17 Caterpillar Industrial Inc. Vehicle guidance system
US4963889A (en) 1989-09-26 1990-10-16 Magnavox Government And Industrial Electronics Company Method and apparatus for precision attitude determination and kinematic positioning
US5031704A (en) 1988-05-10 1991-07-16 Fleischer Manufacturing, Inc. Guidance control apparatus for agricultural implement
US5100229A (en) 1990-08-17 1992-03-31 Spatial Positioning Systems, Inc. Spatial positioning system
US5134407A (en) 1991-04-10 1992-07-28 Ashtech Telesis, Inc. Global positioning system receiver digital processing technique
US5144130A (en) * 1989-02-16 1992-09-01 Rieter Machine Works Limited Method for the electrical adjustment of an optical row of sensors
US5148179A (en) 1991-06-27 1992-09-15 Trimble Navigation Differential position determination using satellites
US5152347A (en) 1991-04-05 1992-10-06 Deere & Company Interface system for a towed implement
US5155493A (en) 1990-08-28 1992-10-13 The United States Of America As Represented By The Secretary Of The Air Force Tape type microstrip patch antenna
US5155490A (en) 1990-10-15 1992-10-13 Gps Technology Corp. Geodetic surveying system using multiple GPS base stations
US5156219A (en) 1990-06-04 1992-10-20 A.I.L., Inc. Positioning apparatus for drawn implement
US5165109A (en) 1989-01-19 1992-11-17 Trimble Navigation Microwave communication antenna
US5173715A (en) 1989-12-04 1992-12-22 Trimble Navigation Antenna with curved dipole elements
US5177489A (en) 1989-09-26 1993-01-05 Magnavox Electronic Systems Company Pseudolite-aided method for precision kinematic positioning
US5185610A (en) 1990-08-20 1993-02-09 Texas Instruments Incorporated GPS system and method for deriving pointing or attitude from a single GPS receiver
US5191351A (en) 1989-12-29 1993-03-02 Texas Instruments Incorporated Folded broadband antenna with a symmetrical pattern
US5194851A (en) 1991-02-21 1993-03-16 Case Corporation Steering controller
US5202829A (en) 1991-06-10 1993-04-13 Trimble Navigation Limited Exploration system and method for high-accuracy and high-confidence level relative position and velocity determinations
US5207239A (en) 1991-07-30 1993-05-04 Aura Systems, Inc. Variable gain servo assist
US5229941A (en) * 1988-04-14 1993-07-20 Nissan Motor Company, Limtied Autonomous vehicle automatically running on route and its method
US5239669A (en) 1992-02-04 1993-08-24 Trimble Navigation Limited Coupler for eliminating a hardwire connection between a handheld global positioning system (GPS) receiver and a stationary remote antenna
US5247440A (en) * 1991-05-03 1993-09-21 Motorola, Inc. Location influenced vehicle control system
US5255756A (en) 1992-04-22 1993-10-26 Progressive Farm Products, Inc. Caddy with guidance system for agricultural implements
US5268695A (en) 1992-10-06 1993-12-07 Trimble Navigation Limited Differential phase measurement through antenna multiplexing
US5294970A (en) 1990-12-31 1994-03-15 Spatial Positioning Systems, Inc. Spatial positioning system
US5296861A (en) 1992-11-13 1994-03-22 Trimble Navigation Limited Method and apparatus for maximum likelihood estimation direct integer search in differential carrier phase attitude determination systems
US5311149A (en) 1993-03-12 1994-05-10 Trimble Navigation Limited Integrated phase locked loop local oscillator
US5323322A (en) 1992-03-05 1994-06-21 Trimble Navigation Limited Networked differential GPS system
US5334987A (en) 1993-04-01 1994-08-02 Spectra-Physics Laserplane, Inc. Agricultural aircraft control system using the global positioning system
US5343209A (en) 1992-05-07 1994-08-30 Sennott James W Navigation receiver with coupled signal-tracking channels
US5345245A (en) 1992-07-01 1994-09-06 Kokusai Denshin Denwa Company, Limited Differential data signal transmission technique
US5359332A (en) 1992-12-31 1994-10-25 Trimble Navigation Limited Determination of phase ambiguities in satellite ranges
US5361212A (en) 1992-11-02 1994-11-01 Honeywell Inc. Differential GPS landing assistance system
US5365447A (en) 1991-09-20 1994-11-15 Dennis Arthur R GPS and satelite navigation system
US5369589A (en) 1993-09-15 1994-11-29 Trimble Navigation Limited Plural information display for navigation
US5375059A (en) 1990-02-05 1994-12-20 Caterpillar Inc. Vehicle position determination system and method
US5390207A (en) 1990-11-28 1995-02-14 Novatel Communications Ltd. Pseudorandom noise ranging receiver which compensates for multipath distortion by dynamically adjusting the time delay spacing between early and late correlators
US5390124A (en) 1992-12-01 1995-02-14 Caterpillar Inc. Method and apparatus for improving the accuracy of position estimates in a satellite based navigation system
US5416712A (en) 1993-05-28 1995-05-16 Trimble Navigation Limited Position and velocity estimation system for adaptive weighting of GPS and dead-reckoning information
US5442363A (en) 1994-08-04 1995-08-15 U.S. Army Corps Of Engineers As Represented By The Secretary Of The Army Kinematic global positioning system of an on-the-fly apparatus for centimeter-level positioning for static or moving applications
US5444453A (en) 1993-02-02 1995-08-22 Ball Corporation Microstrip antenna structure having an air gap and method of constructing same
JPH07244150A (en) 1994-02-28 1995-09-19 Fujita Corp Attitude measuring apparatus of heavy machine
US5451964A (en) 1994-07-29 1995-09-19 Del Norte Technology, Inc. Method and system for resolving double difference GPS carrier phase integer ambiguity utilizing decentralized Kalman filters
US5471217A (en) 1993-02-01 1995-11-28 Magnavox Electronic Systems Company Method and apparatus for smoothing code measurements in a global positioning system receiver
US5477458A (en) 1994-01-03 1995-12-19 Trimble Navigation Limited Network for carrier phase differential GPS corrections
US5476147A (en) 1993-03-19 1995-12-19 Fixemer; Richard A. Guidance system for an agricultural implement
US5477228A (en) 1993-04-13 1995-12-19 Differential Corrections Inc. Differential global positioning system using radio data system
US5490073A (en) 1993-04-05 1996-02-06 Caterpillar Inc. Differential system and method for a satellite based navigation
US5491636A (en) 1994-04-19 1996-02-13 Glen E. Robertson Anchorless boat positioning employing global positioning system
US5495257A (en) 1994-07-19 1996-02-27 Trimble Navigation Limited Inverse differential corrections for SATPS mobile stations
US5504482A (en) 1993-06-11 1996-04-02 Rockwell International Corporation Automobile navigation guidance, control and safety system
US5511623A (en) 1994-09-12 1996-04-30 Orthman Manufacturing, Inc. Quick hitch guidance device
US5519620A (en) 1994-02-18 1996-05-21 Trimble Navigation Limited Centimeter accurate global positioning system receiver for on-the-fly real-time kinematic measurement and control
US5521610A (en) 1993-09-17 1996-05-28 Trimble Navigation Limited Curved dipole antenna with center-post amplifier
US5523761A (en) 1993-01-12 1996-06-04 Trimble Navigation Limited Differential GPS smart antenna device
US5534875A (en) 1993-06-18 1996-07-09 Adroit Systems, Inc. Attitude determining system for use with global positioning system
US5543804A (en) 1994-09-13 1996-08-06 Litton Systems, Inc. Navagation apparatus with improved attitude determination
US5546093A (en) 1994-01-04 1996-08-13 Caterpillar Inc. System and method for providing navigation signals to an earthmoving or construction machine
US5548293A (en) 1993-03-24 1996-08-20 Leland Stanford Junior University System and method for generating attitude determinations using GPS
US5561432A (en) 1995-05-12 1996-10-01 Trimble Navigation Out of plane antenna vector system and method
US5563786A (en) 1994-02-16 1996-10-08 Fuji Jukogyo Kabushiki Kaisha Autonomous running control system for vehicle and the method thereof
US5568152A (en) 1994-02-04 1996-10-22 Trimble Navigation Limited Integrated image transfer for remote target location
US5568162A (en) 1994-08-08 1996-10-22 Trimble Navigation Limited GPS navigation and differential-correction beacon antenna combination
US5583513A (en) 1993-03-24 1996-12-10 Board Of Trustees Of The Leland Stanford Junior University System and method for generating precise code based and carrier phase position determinations
US5589835A (en) 1994-12-20 1996-12-31 Trimble Navigation Limited Differential GPS receiver system linked by infrared signals
US5592382A (en) 1995-03-10 1997-01-07 Rockwell International Corporation Directional steering and navigation indicator
US5596328A (en) 1994-08-23 1997-01-21 Honeywell Inc. Fail-safe/fail-operational differential GPS ground station system
US5600670A (en) 1994-12-21 1997-02-04 Trimble Navigation, Ltd. Dynamic channel allocation for GPS receivers
US5604506A (en) 1994-12-13 1997-02-18 Trimble Navigation Limited Dual frequency vertical antenna
US5608393A (en) 1995-03-07 1997-03-04 Honeywell Inc. Differential ground station repeater
US5610616A (en) 1994-08-23 1997-03-11 Honeywell Inc. Differential GPS ground station system
US5610845A (en) 1994-08-30 1997-03-11 United Technologies Corporation Multi-parameter air data sensing technique
US5610522A (en) 1993-09-30 1997-03-11 Commissariat A L'energie Atomique Open magnetic structure including pole pieces forming a V-shape threbetween for high homogeneity in an NMR device
US5612883A (en) 1990-02-05 1997-03-18 Caterpillar Inc. System and method for detecting obstacles in the path of a vehicle
US5617100A (en) 1994-04-07 1997-04-01 Matsushita Electric Industrial Co., Ltd. Accurate position measuring system
US5617317A (en) 1995-01-24 1997-04-01 Honeywell Inc. True north heading estimator utilizing GPS output information and inertial sensor system output information
US5621646A (en) 1995-01-17 1997-04-15 Stanford University Wide area differential GPS reference system and method
US5638077A (en) 1995-05-04 1997-06-10 Rockwell International Corporation Differential GPS for fleet base stations with vector processing mechanization
US5644139A (en) 1995-03-02 1997-07-01 Allen; Ross R. Navigation technique for detecting movement of navigation sensors relative to an object
US5663879A (en) 1987-11-20 1997-09-02 North American Philips Corporation Method and apparatus for smooth control of a vehicle with automatic recovery for interference
US5673491A (en) 1995-10-20 1997-10-07 Brenna; Douglas J. Crane level indicator device
US5680140A (en) 1994-07-19 1997-10-21 Trimble Navigation Limited Post-processing of inverse differential corrections for SATPS mobile stations
US5684476A (en) 1993-12-30 1997-11-04 Concord, Inc. Field navigation system
US5706015A (en) 1995-03-20 1998-01-06 Fuba Automotive Gmbh Flat-top antenna apparatus including at least one mobile radio antenna and a GPS antenna
US5717593A (en) 1995-09-01 1998-02-10 Gvili; Michael E. Lane guidance system
US5725230A (en) 1996-06-17 1998-03-10 Walkup; Joseph L. Self steering tandem hitch
US5731786A (en) 1994-12-29 1998-03-24 Trimble Navigation Limited Compaction of SATPS information for subsequent signal processing
US5739785A (en) 1993-03-04 1998-04-14 Trimble Navigation Limited Location and generation of high accuracy survey control marks using satellites
US5757316A (en) 1997-02-01 1998-05-26 Litton Systems, Inc. Attitude determination utilizing an inertial measurement unit and a plurality of satellite transmitters
US5765123A (en) 1993-08-07 1998-06-09 Aisin Aw Co., Ltd. Navigation system
US5777578A (en) 1997-02-10 1998-07-07 National Science Council Global positioning system (GPS) Compass
WO1998036288A1 (en) 1997-02-14 1998-08-20 Kelvin Korver A navigation/guidance system for a land-based vehicle
US5810095A (en) 1996-07-25 1998-09-22 Case Corporation System for controlling the position of an implement attached to a work vehicle
US5814961A (en) * 1996-09-03 1998-09-29 Nec Corporation Guidance system for automated guided vehicle
US5828336A (en) 1996-03-29 1998-10-27 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Robust real-time wide-area differential GPS navigation
US5848485A (en) 1996-12-27 1998-12-15 Spectra Precision, Inc. System for determining the position of a tool mounted on pivotable arm using a light source and reflectors
US5854987A (en) 1995-02-22 1998-12-29 Honda Giken Kogyo Kabushiki Kaisha Vehicle steering control system using navigation system
US5862501A (en) 1995-08-18 1999-01-19 Trimble Navigation Limited Guidance control system for movable machinery
US5864315A (en) 1997-04-07 1999-01-26 General Electric Company Very low power high accuracy time and frequency circuits in GPS based tracking units
US5864318A (en) 1996-04-26 1999-01-26 Dorne & Margolin, Inc. Composite antenna for cellular and gps communications
US5875408A (en) 1995-07-17 1999-02-23 Imra America, Inc. Automated vehicle guidance system and method for automatically guiding a vehicle
US5877725A (en) 1997-03-06 1999-03-02 Trimble Navigation Limited Wide augmentation system retrofit receiver
US5906645A (en) 1995-12-04 1999-05-25 Toyota Jidosha Kabushiki Kaisha Auto-drive control unit for vehicles
US5912798A (en) 1997-07-02 1999-06-15 Landsten Chu Dielectric ceramic filter
US5914685A (en) 1997-04-25 1999-06-22 Magellan Corporation Relative position measuring techniques using both GPS and GLONASS carrier phase measurements
US5917448A (en) 1997-08-07 1999-06-29 Rockwell Science Center, Inc. Attitude determination system with sequencing antenna inputs
US5919242A (en) 1992-05-14 1999-07-06 Agri-Line Innovations, Inc. Method and apparatus for prescription application of products to an agricultural field
US5918558A (en) 1997-12-01 1999-07-06 Case Corporation Dual-pump, flow-isolated hydraulic circuit for an agricultural tractor
US5923270A (en) 1994-05-13 1999-07-13 Modulaire Oy Automatic steering system for an unmanned vehicle
US5926079A (en) 1996-12-05 1999-07-20 Motorola Inc. Ceramic waveguide filter with extracted pole
US5925080A (en) * 1996-03-29 1999-07-20 Mazda Motor Corporation Automatic guided vehicle control system
US5927603A (en) 1997-09-30 1999-07-27 J. R. Simplot Company Closed loop control system, sensing apparatus and fluid application system for a precision irrigation device
US5929721A (en) 1996-08-06 1999-07-27 Motorola Inc. Ceramic filter with integrated harmonic response suppression using orthogonally oriented low-pass filter
US5928309A (en) 1996-02-05 1999-07-27 Korver; Kelvin Navigation/guidance system for a land-based vehicle
US5933110A (en) 1998-07-13 1999-08-03 Arinc, Inc. Vessel attitude determination system and method
US5936573A (en) 1997-07-07 1999-08-10 Trimble Navigation Limited Real-time kinematic integrity estimator and monitor
US5935183A (en) 1996-05-20 1999-08-10 Caterpillar Inc. Method and system for determining the relationship between a laser plane and an external coordinate system
US5940026A (en) 1997-07-21 1999-08-17 Rockwell Science Center, Inc. Azimuth determination for GPS/INS systems via GPS null steering antenna
US5943008A (en) 1997-09-23 1999-08-24 Rockwell Science Center, Inc. Single global positioning system receiver capable of attitude determination
US5941317A (en) 1996-08-01 1999-08-24 Great Western Corporation Pty Ltd. Row cultivator with laterally moveable tool bar
US5944770A (en) 1995-12-28 1999-08-31 Trimble Navigation Limited Method and receiver using a low earth orbiting satellite signal to augment the global positioning system
US5945917A (en) 1997-12-18 1999-08-31 Rockwell International Swathing guidance display
US5949371A (en) 1998-07-27 1999-09-07 Trimble Navigation Limited Laser based reflectors for GPS positioning augmentation
US5969670A (en) 1998-01-22 1999-10-19 Trimble Navigation Limited Inexpensive monitoring technique for achieving high level integrity monitoring for differential GPS
US5987383A (en) 1997-04-28 1999-11-16 Trimble Navigation Form line following guidance system
US6014101A (en) 1996-02-26 2000-01-11 Trimble Navigation Limited Post-processing of inverse DGPS corrections
US6014608A (en) 1996-11-04 2000-01-11 Samsung Electronics Co., Ltd. Navigator apparatus informing or peripheral situation of the vehicle and method for controlling the same
US6018313A (en) 1995-09-01 2000-01-25 Tilmar Konle System for determining the location of mobile objects
US6023239A (en) 1997-10-08 2000-02-08 Arinc, Inc. Method and system for a differential global navigation satellite system aircraft landing ground station
US6052647A (en) 1997-06-20 2000-04-18 Stanford University Method and system for automatic control of vehicles based on carrier phase differential GPS
US6055477A (en) 1995-03-31 2000-04-25 Trimble Navigation Ltd. Use of an altitude sensor to augment availability of GPS location fixes
US6057800A (en) 1996-06-28 2000-05-02 State University Of New York RDOP surface for GPS relative positioning
WO2000024239A1 (en) 1998-10-27 2000-05-04 Agsystems Pty. Ltd. Vehicle positioning apparatus and method
US6061390A (en) 1994-09-02 2000-05-09 California Institute Of Technology P-code enhanced method for processing encrypted GPS signals without knowledge of the encryption code
US6061632A (en) 1997-08-18 2000-05-09 Trimble Navigation Limited Receiver with seamless correction capacity
US6062317A (en) 1999-09-03 2000-05-16 Caterpillar Inc. Method and apparatus for controlling the direction of travel of an earthworking machine
US6069583A (en) 1996-05-09 2000-05-30 Agence Spatiale Europeene Receiver for a navigation system, in particular a satellite navigation system
US6070673A (en) 1996-11-22 2000-06-06 Case Corporation Location based tractor control
US6076612A (en) 1999-08-31 2000-06-20 Case Corporation Transition from position to draft mode controlled by hitch position command and feedback
US6081171A (en) 1998-04-08 2000-06-27 Nokia Mobile Phones Limited Monolithic filters utilizing thin film bulk acoustic wave devices and minimum passive components for controlling the shape and width of a passband response
US6100842A (en) 1998-02-20 2000-08-08 Trimble Navigation Limited Chained location determination system
US6104978A (en) 1998-04-06 2000-08-15 General Electric Company GPS-based centralized tracking system with reduced energy consumption
US6122595A (en) 1996-05-20 2000-09-19 Harris Corporation Hybrid GPS/inertially aided platform stabilization system
US6128574A (en) 1996-07-23 2000-10-03 Claas Kgaa Route planning system for agricultural work vehicles
US6138062A (en) * 1996-07-15 2000-10-24 Toyota Jidoshia Kabushiki Kaisha Automatic travel controlling device
US6144335A (en) 1998-04-14 2000-11-07 Trimble Navigation Limited Automated differential correction processing of field data in a global positional system
US6191733B1 (en) 1999-06-01 2001-02-20 Modular Mining Systems, Inc. Two-antenna positioning system for surface-mine equipment
US6191730B1 (en) 1997-12-15 2001-02-20 Trimble Navigation Limited Two-channel fast-sequencing high-dynamics GPS navigation receiver
US6199000B1 (en) 1998-07-15 2001-03-06 Trimble Navigation Limited Methods and apparatus for precision agriculture operations utilizing real time kinematic global positioning system systems
US6198992B1 (en) 1997-10-10 2001-03-06 Trimble Navigation Limited Override for guidance control system
US6198430B1 (en) 1999-03-26 2001-03-06 Rockwell Collins, Inc. Enhanced differential GNSS carrier-smoothed code processing using dual frequency measurements
US6205401B1 (en) 1995-09-19 2001-03-20 Litef Gmbh Navigation system for a vehicle, especially a land craft
US6212453B1 (en) 1998-09-11 2001-04-03 Honda Giken Kogyo Kabushiki Kaisha Vehicle steering control system
US6215828B1 (en) 1996-02-10 2001-04-10 Telefonaktiebolaget Lm Ericsson (Publ) Signal transformation method and apparatus
US6230097B1 (en) 1998-08-31 2001-05-08 Trimble Navigation Limited Accurate vehicle navigation
US6233511B1 (en) 1997-11-26 2001-05-15 Case Corporation Electronic control for a two-axis work implement
US6236924B1 (en) 1999-06-21 2001-05-22 Caterpillar Inc. System and method for planning the operations of an agricultural machine in a field
US6236916B1 (en) 1999-03-29 2001-05-22 Caterpillar Inc. Autoguidance system and method for an agricultural machine
US20010004601A1 (en) 1995-12-22 2001-06-21 Drane Christopher R. Location and tracking system
US6253160B1 (en) 1999-01-15 2001-06-26 Trimble Navigation Ltd. Method and apparatus for calibrating a tool positioning mechanism on a mobile machine
US6256583B1 (en) 1998-09-16 2001-07-03 Rockwell Collins, Inc. GPS attitude determination system and method using optimal search space identification for integer cycle ambiguity resolution
US6259398B1 (en) 2000-05-19 2001-07-10 Sri International Multi-valued variable ambiguity resolution for satellite navigation signal carrier wave path length determination
US6266595B1 (en) 1999-08-12 2001-07-24 Martin W. Greatline Method and apparatus for prescription application of products to an agricultural field
US6275705B1 (en) 1995-12-22 2001-08-14 Cambridge Positioning Systems Ltd. Location and tracking system
US6285320B1 (en) 1999-09-03 2001-09-04 Sikorsky Aircraft Corporation Apparatus and method for mapping surfaces of an object
US6292132B1 (en) 1999-08-13 2001-09-18 Daimlerchrysler Ag System and method for improved accuracy in locating and maintaining positions using GPS
US20010025221A1 (en) 2000-03-14 2001-09-27 Bernhard Klein Route planning system
US6307505B1 (en) 1998-07-22 2001-10-23 Trimble Navigation Limited Apparatus and method for coupling data to a position determination device
US6314348B1 (en) 1998-02-11 2001-11-06 Trimble Navigation Limited Correction control for guidance control system
US6313788B1 (en) 1998-08-14 2001-11-06 Seagull Technology, Inc. Method and apparatus for reliable inter-antenna baseline determination
US6325684B1 (en) 1999-06-11 2001-12-04 Johnson Outdoors, Inc., Trolling motor steering control
US6336066B1 (en) 1998-09-29 2002-01-01 Pellenc S.A. Process for using localized agricultural data to optimize the cultivation of perennial plants
US20020004691A1 (en) 2000-03-10 2002-01-10 Yasuhiro Kinashi Attitude determination and alignment using electro-optical sensors and global navigation satellites
US6345231B2 (en) 1998-07-10 2002-02-05 Claas Selbstfahrende Erntemaschinen Gmbh Method and apparatus for position determining
US6356602B1 (en) 1998-05-04 2002-03-12 Trimble Navigation Limited RF integrated circuit for downconverting a GPS signal
US6377889B1 (en) 2000-10-13 2002-04-23 Trimble Navigation Limited Non-linear method of guiding to arbitrary curves with adaptive feedback
US6380888B1 (en) 2000-11-13 2002-04-30 The United States Of America As Represented By The Secretary Of The Navy Self-contained, self-surveying differential GPS base station and method of operating same
US6389345B2 (en) 1999-06-29 2002-05-14 Caterpillar Inc. Method and apparatus for determining a cross slope of a surface
US6397147B1 (en) 2000-06-06 2002-05-28 Csi Wireless Inc. Relative GPS positioning using a single GPS receiver with internally generated differential correction terms
US20020067849A1 (en) 2000-12-06 2002-06-06 Xerox Corporation Adaptive tree-based lookup for non-separably divided color tables
US20020072850A1 (en) 2000-12-08 2002-06-13 Mcclure John A. GPS derived swathing guidance system
US6415229B1 (en) 1996-06-21 2002-07-02 Claas Kgaa System for position determination of mobile objects, in particular vehicles
US6418031B1 (en) 2000-05-01 2002-07-09 International Business Machines Corporation Method and means for decoupling a printed circuit board
US6421003B1 (en) 2000-05-19 2002-07-16 Sri International Attitude determination using multiple baselines in a navigational positioning system
US6424915B1 (en) 2000-06-01 2002-07-23 Furuno Electric Co., Ltd. System for determining the heading and/or attitude of a body
US6434462B1 (en) 2001-06-28 2002-08-13 Deere & Company GPS control of a tractor-towed implement
US6431576B1 (en) 1999-04-28 2002-08-13 Deere & Company System for steering towed implement in response to, or independently of, steering of towing vehicle
US6445990B1 (en) 2001-03-19 2002-09-03 Caterpillar Inc. Method and apparatus for controlling straight line travel of a tracked machine
US6445983B1 (en) 2000-07-07 2002-09-03 Case Corporation Sensor-fusion navigator for automated guidance of off-road vehicles
US6449558B1 (en) 1998-05-29 2002-09-10 David Anthony Small Method and device for creating a network positioning system (NPS)
US20020138187A1 (en) 2001-02-09 2002-09-26 The Board Of Trustees Of The University Of Illinois Fuzzy steering controller
US6463091B1 (en) 1995-08-09 2002-10-08 Magellan Corporation Spread spectrum receiver using a pseudo-random noise code for ranging applications in a way that reduces errors when a multipath signal is present
US6466871B1 (en) 1999-10-03 2002-10-15 Azimuth Technologies Method for calibrating and verifying the attitude of a compass
US6469663B1 (en) 2000-03-21 2002-10-22 Csi Wireless Inc. Method and system for GPS and WAAS carrier phase measurements for relative positioning
US20020161522A1 (en) 2001-02-05 2002-10-31 Clark Cohen Low cost system and method for making dual band GPS measurements
US20020165669A1 (en) * 2001-02-28 2002-11-07 Enpoint, L.L.C. Attitude measurement using a single GPS receiver with two closely-spaced antennas
US20020165645A1 (en) * 1999-05-31 2002-11-07 Masato Kageyama Vehicle interference prevention device
US6484097B2 (en) 1999-04-23 2002-11-19 Global Locate, Inc. Wide area inverse differential GPS
US20020171427A1 (en) 2001-04-24 2002-11-21 The United States Of America Represented By The Secretary Of The Navy Magnetic anomaly sensing system and methods for maneuverable sensing platforms
US6498570B2 (en) * 2001-05-24 2002-12-24 Phillip N. Ross Optical highway line detector
US6501422B1 (en) 1998-08-19 2002-12-31 Trimble Navigation, Ltd. Precise parallel swathing guidance via satellite navigation and tilt measurement
US20030014171A1 (en) 2001-07-16 2003-01-16 Xinghan Ma Harvester with intelligent hybrid control system
US6515619B1 (en) 1998-07-30 2003-02-04 Mckay, Jr. Nicholas D. Object location system
US6516271B2 (en) 2001-06-29 2003-02-04 The Regents Of The University Of California Method and apparatus for ultra precise GPS-based mapping of seeds or vegetation during planting
WO2003019430A1 (en) 2001-08-29 2003-03-06 Beeline Technologies Pty Ltd Vehicle guidance software, method and system
US6542077B2 (en) 1993-06-08 2003-04-01 Raymond Anthony Joao Monitoring apparatus for a vehicle and/or a premises
US6549835B2 (en) 2000-09-28 2003-04-15 Nissan Motor Co., Ltd. Apparatus for and method of steering vehicle
US6553311B2 (en) 2000-12-08 2003-04-22 Trimble Navigation Limited Navigational off- line and off-heading indication system and method
US20030093210A1 (en) 2001-11-15 2003-05-15 Toshiyuki Kondo Travel control apparatus of vehicle
US6577952B2 (en) 2001-01-08 2003-06-10 Motorola, Inc. Position and heading error-correction method and apparatus for vehicle navigation systems
US6587761B2 (en) 2001-10-23 2003-07-01 The Aerospace Corporation Unambiguous integer cycle attitude determination method
US6606542B2 (en) 1995-05-30 2003-08-12 Agco Corporation System and method for creating agricultural decision and application maps for automated agricultural machines
US6611755B1 (en) 1999-12-19 2003-08-26 Trimble Navigation Ltd. Vehicle tracking, communication and fleet management system
US6611228B2 (en) 2000-07-24 2003-08-26 Furuno Electric Company, Limited Carrier phase-based relative positioning apparatus
US6622091B2 (en) 2001-05-11 2003-09-16 Fibersense Technology Corporation Method and system for calibrating an IG/GP navigational system
US20030187577A1 (en) 2000-12-08 2003-10-02 Satloc, Llc Vehicle navigation system and method for swathing applications
US6631394B1 (en) 1998-01-21 2003-10-07 Nokia Mobile Phones Limited Embedded system with interrupt handler for multiple operating systems
US6631916B1 (en) 2000-07-28 2003-10-14 Merle E. Miller Guidance system for pull-type equipment
US6643576B1 (en) 2000-11-15 2003-11-04 Integrinautics Corporation Rapid adjustment of trajectories for land vehicles
US20030208319A1 (en) * 2000-06-05 2003-11-06 Agco System and method for creating demo application maps for site-specific farming
US6646603B2 (en) 2000-06-16 2003-11-11 Koninklijke Philips Electronics, N.V. Method of providing an estimate of a location
US6657875B1 (en) 2002-07-16 2003-12-02 Fairchild Semiconductor Corporation Highly efficient step-down/step-up and step-up/step-down charge pump
US6671587B2 (en) 2002-02-05 2003-12-30 Ford Motor Company Vehicle dynamics measuring apparatus and method using multiple GPS antennas
US20040006426A1 (en) 2002-07-03 2004-01-08 Armstrong Ray G. Vehicle locating system
US6686878B1 (en) 2000-02-22 2004-02-03 Trimble Navigation Limited GPS weather data recording system for use with the application of chemicals to agricultural fields
US6688403B2 (en) 2001-03-22 2004-02-10 Deere & Company Control system for a vehicle/implement hitch
US20040039514A1 (en) 2002-04-05 2004-02-26 Steichen John Carl Method and apparatus for controlling a gas-emitting process and related devices
US6721638B2 (en) 2001-05-07 2004-04-13 Rapistan Systems Advertising Corp. AGV position and heading controller
US6732024B2 (en) 2001-05-07 2004-05-04 The Board Of Trustees Of The Leland Stanford Junior University Method and apparatus for vehicle control, navigation and positioning
US6744404B1 (en) 2003-07-09 2004-06-01 Csi Wireless Inc. Unbiased code phase estimator for mitigating multipath in GPS
US6774843B2 (en) 2001-03-28 2004-08-10 Communications Research Laboratory, Independent Administrative Institution Method for acquiring azimuth information
US6789014B1 (en) 2003-05-09 2004-09-07 Deere & Company Direct modification of DGPS information with inertial measurement data
US6792380B2 (en) 2002-02-12 2004-09-14 Furuno Electric Company Limited Attitude angle detecting apparatus
US20040186644A1 (en) * 2003-03-20 2004-09-23 Mcclure John A. Satellite based vehicle guidance control in straight and contour modes
US20040212533A1 (en) 2003-04-23 2004-10-28 Whitehead Michael L. Method and system for satellite based phase measurements for relative positioning of fixed or slow moving points in close proximity
US20040221790A1 (en) * 2003-05-02 2004-11-11 Sinclair Kenneth H. Method and apparatus for optical odometry
US6819269B2 (en) 2000-05-17 2004-11-16 Omega Patents, L.L.C. Vehicle tracker including battery monitoring feature and related methods
US6819780B2 (en) 2001-02-02 2004-11-16 Cnh America Llc Method and apparatus for automatically steering a vehicle in an agricultural field using a plurality of fuzzy logic membership functions
US6822314B2 (en) 2002-06-12 2004-11-23 Intersil Americas Inc. Base for a NPN bipolar transistor
US6865484B2 (en) 2001-04-11 2005-03-08 Mitsui & Co., Ltd. Satellite position measurement system
US6865465B2 (en) 2002-05-06 2005-03-08 Csi Wireless, Inc. Method and system for implement steering for agricultural vehicles
US20050055147A1 (en) 2002-10-31 2005-03-10 Oliver Hrazdera Agricultural utility vehicle and method of controlling same
US6879283B1 (en) 2003-02-21 2005-04-12 Trimble Navigation Limited Method and system for transmission of real-time kinematic satellite positioning system data
US20050080559A1 (en) 2002-10-02 2005-04-14 Hideto Ishibashi Position measuring system for working machine
US20050110676A1 (en) * 2003-10-06 2005-05-26 Heppe Stephen B. Method and apparatus for satellite-based relative positioning of moving platforms
US6900992B2 (en) 2001-09-18 2005-05-31 Intel Corporation Printed circuit board routing and power delivery for high frequency integrated circuits
US20050150160A1 (en) * 2003-10-28 2005-07-14 Norgaard Daniel G. Method for selecting crop varieties
US6922635B2 (en) 2002-08-13 2005-07-26 Drs Communications Company, Llc Method and system for determining absolute positions of mobile communications devices using remotely generated positioning information
US6931233B1 (en) 2000-08-31 2005-08-16 Sirf Technology, Inc. GPS RF front end IC with programmable frequency synthesizer for use in wireless phones
US20050196162A1 (en) * 2004-03-03 2005-09-08 Mootz Jeffery S. Self Leveling Camera Support Apparatus
US20050225955A1 (en) 2004-04-09 2005-10-13 Hewlett-Packard Development Company, L.P. Multi-layer printed circuit boards
US6967538B2 (en) 2002-11-28 2005-11-22 Hynix Semiconductor Inc. PLL having VCO for dividing frequency
US20050259240A1 (en) * 2003-09-18 2005-11-24 Goren David P Optical navigation of vehicles
US20050265494A1 (en) 2000-05-13 2005-12-01 Goodings Christopher J Method and apparatus for code phase tracking
US20060031664A1 (en) 2004-08-04 2006-02-09 National Instruments Corporation Method and system for loading and updating firmware in an embedded device
US7006032B2 (en) 2004-01-15 2006-02-28 Honeywell International, Inc. Integrated traffic surveillance apparatus
US7027918B2 (en) 2003-04-07 2006-04-11 Novariant, Inc. Satellite navigation system using multiple antennas
US7026982B2 (en) 2001-12-19 2006-04-11 Furuno Electric Ompany Limited Carrier-phase-based relative positioning device
US7031725B2 (en) 2002-08-13 2006-04-18 Drs Communications Company, Llc Method and system for determining relative positions of networked mobile communication devices
US20060095172A1 (en) * 2004-10-28 2006-05-04 Abramovitch Daniel Y Optical navigation system for vehicles
US20060103573A1 (en) 2004-11-12 2006-05-18 Geier George J Frequency error tracking in satellite positioning system receivers
US20060167600A1 (en) 2005-01-27 2006-07-27 Raven Industries, Inc. Architecturally partitioned automatic steering system and method
US7089099B2 (en) 2004-07-30 2006-08-08 Automotive Technologies International, Inc. Sensor assemblies
US20060206246A1 (en) 2004-10-28 2006-09-14 Walker Richard C Second national / international management and security system for responsible global resourcing through technical management to brige cultural and economic desparity
US20060213167A1 (en) 2003-12-12 2006-09-28 Harvey Koselka Agricultural robot system and method
US20060215739A1 (en) 2005-03-24 2006-09-28 Ian Williamson System and method for making correlation measurements utilizing pulse shape measurements
US7142956B2 (en) 2004-03-19 2006-11-28 Hemisphere Gps Llc Automatic steering system and method
US7155335B2 (en) 2003-08-06 2006-12-26 General Motors Corporation Satellite radio real time traffic updates
US20060290779A1 (en) * 2005-01-18 2006-12-28 Reverte Carlos F Autonomous inspector mobile platform
US7162348B2 (en) 2002-12-11 2007-01-09 Hemisphere Gps Llc Articulated equipment position control system and method
US20070055412A1 (en) * 2003-11-20 2007-03-08 Werner Bernhard Lane device, selector device and method for detecting the lane of a vehicle
US7191061B2 (en) 2003-04-17 2007-03-13 Battelle Energy Alliance, Llc Auto-steering apparatus and method
US20070078570A1 (en) 2005-10-04 2007-04-05 Xiaowen Dai Method and apparatus for reporting road conditions
US20070088447A1 (en) 2004-04-27 2007-04-19 Abb Research Ltd Scheduling of industrial production processes
US20070112700A1 (en) 2004-04-22 2007-05-17 Frontline Robotics Inc. Open control system architecture for mobile autonomous systems
US7221314B2 (en) 2004-03-26 2007-05-22 Topcon Gps, Llc Estimation and resolution of carrier wave ambiguities in a position navigation system
US20070121708A1 (en) 2005-11-28 2007-05-31 Honeywell International, Inc. Discriminator function for GPS code alignment
US7248211B2 (en) 2004-07-26 2007-07-24 Navcom Technology Inc. Moving reference receiver for RTK navigation
US20070193798A1 (en) 2005-10-21 2007-08-23 James Allard Systems and methods for obstacle avoidance
US20070194984A1 (en) 2006-02-21 2007-08-23 Honeywell International Inc. System and method for detecting false navigation signals
US20070198185A1 (en) * 2002-12-11 2007-08-23 Mcclure John A GNSS control system and method
US20070205940A1 (en) 2005-07-01 2007-09-06 Chun Yang Method and device for tracking weak global navigation satellite system (gnss) signals
US7271766B2 (en) 2004-07-30 2007-09-18 Novariant, Inc. Satellite and local system position determination
US7277784B2 (en) 2002-05-31 2007-10-02 Deere & Company Combination of a self-moving harvesting machine and a transport vehicle
US20070247361A1 (en) 2006-04-21 2007-10-25 Broadcom Corporation, A California Corporation Communication system with assisted GPS and SBAS
US7292186B2 (en) 2003-04-23 2007-11-06 Csi Wireless Inc. Method and system for synchronizing multiple tracking devices for a geo-location system
US20070285308A1 (en) 2004-07-30 2007-12-13 Integrinautics Corporation Multiple frequency antenna structures and methods for receiving navigation or ranging signals
US7324915B2 (en) 2005-07-14 2008-01-29 Biosense Webster, Inc. Data transmission to a position sensor
US20080039991A1 (en) 2006-08-10 2008-02-14 May Reed R Methods and systems for providing accurate vehicle positioning
US20080059068A1 (en) 2006-09-05 2008-03-06 Honeywell International Inc. Method and system for autonomous vehicle navigation
US7358896B2 (en) 2005-11-03 2008-04-15 Nemerix Sa Multiband GNSS receiver
US20080129586A1 (en) 2005-01-20 2008-06-05 Thales Satellite-Based Positioning Receiver with Improved Integrity and Continuity
US7388539B2 (en) * 2005-10-19 2008-06-17 Hemisphere Gps Inc. Carrier track loop for GNSS derived attitude
US7395769B2 (en) 2004-10-21 2008-07-08 Jensen Layton W Individual row rate control of farm implements to adjust the volume of crop inputs across wide implements in irregularly shaped or contour areas of chemical application, planting or seeding
WO2008080193A1 (en) 2007-01-05 2008-07-10 Hemisphere Gps Llc A vehicle control system
US7400956B1 (en) * 2003-03-20 2008-07-15 Hemisphere Gps Inc. Satellite position and heading sensor for vehicle steering control
US20080204312A1 (en) 2005-05-18 2008-08-28 Leica Geosystems Ag Phase Ambiguity Resolution Method for a Satellite Based Positioning System
US7428259B2 (en) 2005-05-06 2008-09-23 Sirf Technology Holdings, Inc. Efficient and flexible GPS receiver baseband architecture
WO2008119386A1 (en) 2007-03-30 2008-10-09 Carl Zeiss Smt Ag Support for a component of an optical device
US20080269988A1 (en) * 2003-03-20 2008-10-30 Feller Walter J Combined gnss gyroscope control system and method
US7451030B2 (en) 2005-02-04 2008-11-11 Novariant, Inc. System and method for interactive selection and determination of agricultural vehicle guide paths offset from each other with varying curvature along their length
US20080284643A1 (en) * 2007-05-16 2008-11-20 Scherzinger Bruno M Post-mission high accuracy position and orientation system
US20080288205A1 (en) * 2007-05-16 2008-11-20 Edward Kah Ching Teoh Optical navigation device with surface and free space navigation
US7460942B2 (en) * 2001-04-09 2008-12-02 Hemisphere Gps Llc Soil cultivation implement control apparatus and method
US7479900B2 (en) 2003-05-28 2009-01-20 Legalview Assets, Limited Notification systems and methods that consider traffic flow predicament data
US7505848B2 (en) 2003-03-31 2009-03-17 Deere & Company Path planner and method for planning a contour path of a vehicle
US20090093959A1 (en) * 2007-10-04 2009-04-09 Trimble Navigation Limited Real-time high accuracy position and orientation system
US7522100B2 (en) 2005-07-01 2009-04-21 Sirf Technology Holdings, Inc. Method and device for acquiring weak global navigation satellite system (GNSS) signals
US7522099B2 (en) 2005-09-08 2009-04-21 Topcon Gps, Llc Position determination using carrier phase measurements of satellite signals
WO2009066183A2 (en) 2007-09-27 2009-05-28 Hemisphere Gps Llc Tightly-coupled pcb gnss circuit and manufacturing method
US20090160951A1 (en) * 2007-12-20 2009-06-25 Utah State University Research Foundation Three-Axis Image Stabilization System
US20090164067A1 (en) * 2003-03-20 2009-06-25 Whitehead Michael L Multiple-antenna gnss control system and method
US20090171583A1 (en) 2006-03-15 2009-07-02 The Boeing Company Global position system (gps) user receiver and geometric surface processing for all-in-view coherent gps signal prn codes acquisition and navigation solution
WO2009082745A1 (en) 2007-12-22 2009-07-02 Hemisphere Gps Llc Integrated dead reckoning and gnss/ins positioning
US20090177395A1 (en) 2008-01-07 2009-07-09 David Stelpstra Navigation device and method
US20090174587A1 (en) 2007-01-10 2009-07-09 Tomohiro Ogawa Current switch circuit and d/a converter, semiconductor integrated circuit, and communication device using the same
US20090175593A1 (en) 2007-04-18 2009-07-09 Panasonic Corporation Digital broadcast receiving apparatus and digital broadcast receiving method
US20090177399A1 (en) 2008-01-07 2009-07-09 Samsung Electronics Co., Ltd. Method for estimating location and apparatus using the same
US20090174597A1 (en) 2008-01-08 2009-07-09 Dilellio James A Global Positioning System Accuracy Enhancement
US20090174622A1 (en) 2005-12-27 2009-07-09 Kyocera Corporation Transmitter/Receiver Circuit and Transmission/Reception Method
US20090204281A1 (en) * 2008-02-10 2009-08-13 Hemisphere Gps Llc Visual, gnss and gyro autosteering control
US20090251366A1 (en) * 2008-04-08 2009-10-08 Mcclure John A Gnss-based mobile communication system and method
US20090259397A1 (en) 2008-04-10 2009-10-15 Richard Stanton Navigation system with touchpad remote
US20090259707A1 (en) 2006-03-21 2009-10-15 Thales Method and device for fast correlation calculation
US20090262014A1 (en) 2006-03-15 2009-10-22 The Boeing Company Method and system for all-in-view coherent gps signal prn codes acquisition and navigation solution determination
US20090265101A1 (en) 2008-04-22 2009-10-22 En-Min Jow Access Device With Navigation Function
US20090262974A1 (en) 2008-04-18 2009-10-22 Erik Lithopoulos System and method for obtaining georeferenced mapping data
US20090262018A1 (en) 2008-02-05 2009-10-22 Mstar Semiconductor, Inc. High Accuracy Satellite Receiving Controller and Associated Method
US20090265104A1 (en) 2008-04-22 2009-10-22 Itt Manufacturing Enterprises, Inc. Navigation System and Method of Obtaining Accurate Navigational Information in Signal Challenging Environments
US20090265054A1 (en) 2008-04-16 2009-10-22 Gm Global Technology Operations, Inc. In-vehicle sensor-based calibration algorithm for yaw rate sensor calibration
US7610123B2 (en) 2005-01-04 2009-10-27 Deere & Company Vision-aided system and method for guiding a vehicle
US20090274113A1 (en) 2008-05-01 2009-11-05 Mr. Daniel A. Katz Channel Allocation for Burst Transmission to a Diversity of Satellites
US20090274079A1 (en) 2008-05-01 2009-11-05 Qualcomm Incorporated Radio Frequency (RF) Signal Multiplexing
US20090276155A1 (en) 2008-04-30 2009-11-05 Honeywell International, Inc. Systems and methods for determining location information using dual filters
US20090273372A1 (en) 2005-02-25 2009-11-05 Qualcomm Incorporated Half bin linear frequency discriminator
US20090273513A1 (en) 2008-05-01 2009-11-05 Skytraq Technology Inc. Method of dynamically optimizing the update rate of gps output data
US7623952B2 (en) * 2005-04-21 2009-11-24 A.I.L., Inc. GPS controlled guidance system for farm tractor/implement combination
US20090295634A1 (en) 2008-05-30 2009-12-03 O2Micro, Inc. Global positioning system receiver
US20090295633A1 (en) 2008-06-02 2009-12-03 Pinto Robert W Attitude estimation using intentional translation of a global navigation satellite system (GNSS) antenna
US20090299550A1 (en) 2008-05-27 2009-12-03 Baker David A Orientation-based wireless sensing apparatus
WO2009148638A1 (en) 2008-02-26 2009-12-10 Hemisphere Gps Llc Unbiased code phase discriminator
US20090322600A1 (en) 2004-03-19 2009-12-31 Whitehead Michael L Method and system using gnss phase measurements for relative positioning
US20090322597A1 (en) 2007-04-30 2009-12-31 Navento Technologies, S.L. Location method and system and locatable portable device
US20090322598A1 (en) 2008-06-26 2009-12-31 Honeywell International, Inc. Integrity of differential gps corrections in navigation devices using military type gps receivers
US20090322606A1 (en) 2001-10-30 2009-12-31 Sirf Technology, Inc. Method and Apparatus for Real Time Clock (RTC) Brownout Detection
US20090322601A1 (en) 2008-05-22 2009-12-31 Jonathan Ladd Gnss receiver using signals of opportunity and assistance information to reduce the time to first fix
US20090326809A1 (en) 2003-10-06 2009-12-31 Colley Jaime B System and method for augmenting a satellite-based navigation solution
US20100013703A1 (en) 2006-05-25 2010-01-21 The Boeing Company Gps gyro calibration
US20100030470A1 (en) 2008-07-02 2010-02-04 O2Micro, Inc. Global positioning system and dead reckoning (gps&dr) integrated navigation system
US20100026569A1 (en) 2008-07-31 2010-02-04 Honeywell International Inc. Method and apparatus for location detection using gps and wifi/wimax
US20100039318A1 (en) 2006-11-06 2010-02-18 Marcin Michal Kmiecik Arrangement for and method of two dimensional and three dimensional precision location and orientation determination
US20100039316A1 (en) 2008-02-25 2010-02-18 Sirf Technology, Inc. System and Method for Operating a GPS Device in a Micro Power Mode
US20100039320A1 (en) 2008-08-14 2010-02-18 Boyer Pete A Hybrid GNSS and TDOA Wireless Location System
US20100039321A1 (en) 2008-08-15 2010-02-18 Charles Abraham Method and system for calibrating group delay errors in a combined gps and glonass receiver
US20100060518A1 (en) 2008-09-11 2010-03-11 Bar-Sever Yoaz E Method and apparatus for autonomous, in-receiver prediction of gnss ephemerides
US20100063649A1 (en) 2008-09-10 2010-03-11 National Chiao Tung University Intelligent driving assistant systems
US7689354B2 (en) 2003-03-20 2010-03-30 Hemisphere Gps Llc Adaptive guidance system and method
US20100085249A1 (en) 2008-10-03 2010-04-08 Trimble Navigation Limited Compact Transmission of GPS Information Using Compressed Measurement Record Format
US20100084147A1 (en) 2008-10-02 2010-04-08 Trimble Navigation Ltd. Automatic Control of Passive, Towed Implements
US20100103034A1 (en) 2008-10-24 2010-04-29 Ntt Docomo, Inc. Positioning control device and positioning control method
US20100103038A1 (en) 2008-10-27 2010-04-29 Mediatek Inc. Power saving method adaptable in gnss device
US20100106414A1 (en) 2008-10-27 2010-04-29 John Whitehead Method of performing routing with artificial intelligence
US20100103040A1 (en) 2008-10-26 2010-04-29 Matt Broadbent Method of using road signs to augment Global Positioning System (GPS) coordinate data for calculating a current position of a personal navigation device
US20100103033A1 (en) 2008-10-23 2010-04-29 Texas Instruments Incorporated Loosely-coupled integration of global navigation satellite system and inertial navigation system
US20100106445A1 (en) 2008-10-24 2010-04-29 Takahiro Kondoh Angular velocity sensor correcting apparatus for deriving value for correcting output signal from angular velocity sensor, angular velocity calculating apparatus, angular velocity sensor correcting method, and angular velocity calculating method
US20100109945A1 (en) 2008-11-06 2010-05-06 Texas Instruments Incorporated Loosely-coupled integration of global navigation satellite system and inertial navigation system: speed scale-factor and heading bias calibration
US20100114483A1 (en) 2008-11-03 2010-05-06 Samsung Electronics Co., Ltd. Method and apparatus for automatically optimizing and setting a GPS reception period and map contents
US20100109947A1 (en) 2006-05-26 2010-05-06 Savcor One Oy System and method for positioning a gps device
US20100109944A1 (en) 2003-03-20 2010-05-06 Whitehead Michael L Gnss-based tracking of fixed or slow-moving structures
US20100111372A1 (en) 2008-11-03 2010-05-06 Microsoft Corporation Determining user similarities based on location histories
US20100109950A1 (en) 2008-11-06 2010-05-06 Texas Instruments Incorporated Tightly-coupled gnss/imu integration filter having speed scale-factor and heading bias calibration
US20100109948A1 (en) 2008-11-04 2010-05-06 Leonid Razoumov Methods and Apparatuses For GPS Coordinates Extrapolation When GPS Signals Are Not Available
US20100117899A1 (en) 2008-11-13 2010-05-13 Ecole Polytechnique Federale De Lausanne (Epfl) Method to secure gnss based locations in a device having gnss receiver
US20100117900A1 (en) 2008-11-13 2010-05-13 Van Diggelen Frank Method and system for maintaining a gnss receiver in a hot-start state
US20100121577A1 (en) 2008-04-24 2010-05-13 Gm Global Technology Operations, Inc. Three-dimensional lidar-based clear path detection
US20100117894A1 (en) 2008-01-09 2010-05-13 Mayflower Communications Company, Inc. Gps-based measurement of roll rate and roll angle of spinning platforms
US20100124212A1 (en) 2008-11-14 2010-05-20 Ralink Technology (Singapore) Corporation Method and system for rf transmitting and receiving beamforming with location or gps guidance
US20100124210A1 (en) 2008-11-14 2010-05-20 Ralink Technology Corporation Method and system for rf transmitting and receiving beamforming with gps guidance
US20100134354A1 (en) 2008-12-02 2010-06-03 Sirf Technology, Inc. Method and Apparatus for a GPS Receiver Capable of Reception of GPS Signals and Binary Offset Carrier Signals
US20100149037A1 (en) 2008-12-15 2010-06-17 Samsung Electronics Co., Ltd. Global positioning system (GPS) receiver and method of determining location of GPS receiver
US20100152949A1 (en) 2008-12-15 2010-06-17 Delphi Technologies, Inc. Vehicle event recording system and method
US20100149030A1 (en) 2002-08-15 2010-06-17 Rajiv Kumar Verma Position determination system and method
US20100149025A1 (en) 2007-10-09 2010-06-17 Honeywell International Inc. Gps receiver raim with slaved precision clock
US20100149034A1 (en) 2008-12-17 2010-06-17 Altek Corporation Method for calculating current position coordinate and method for calculating pseudo range
US20100150284A1 (en) 2005-12-14 2010-06-17 Dennis Arthur Fielder Gps receiver with improved immunity to burst transmissions
US20100149033A1 (en) 2008-12-12 2010-06-17 Charles Abraham Method and system for power management for a frequency synthesizer in a gnss receiver chip
US20100156718A1 (en) 2008-12-19 2010-06-24 Altek Corporation Method for calculating current position coordinate
US20100156712A1 (en) 2008-12-23 2010-06-24 Toyota Motor Sales, U.S.A., Inc. Gps gate system
US20100161179A1 (en) 2008-12-22 2010-06-24 Mcclure John A Integrated dead reckoning and gnss/ins positioning
US20100161568A1 (en) 2008-07-25 2010-06-24 Seiko Epson Corporation Data Compression by Multi-Order Differencing
US20100161211A1 (en) 2008-12-24 2010-06-24 Mitac International Corp. Method and system for automatically creating poi by identifying geographic information on a screen of a portable navigation device
US20100159943A1 (en) 2008-12-18 2010-06-24 Verizon Corporate Services Group, Inc. Method and system for providing location-based information to a group of mobile user agents
US20100156709A1 (en) 2008-12-19 2010-06-24 Nexteq Navigation Corporation System and method for applying code corrections for gnss positioning
US20100171757A1 (en) 2007-01-31 2010-07-08 Melamed Thomas J Referencing a map to the coordinate space of a positioning system
US20100171660A1 (en) 2005-11-15 2010-07-08 O2Micro, Inc. Novas hybrid positioning technology using terrestrial digital broadcasting signal (dbs) and global positioning system (gps) satellite signal
US20100185364A1 (en) * 2009-01-17 2010-07-22 Mcclure John A Raster-based contour swathing for guidance and variable-rate chemical application
US20100185366A1 (en) 2005-07-19 2010-07-22 Heiniger Richard W Adaptive machine control system and method
US20100185389A1 (en) 2009-01-21 2010-07-22 Michael Glenn Woodard GPS-based vehicle alert and control system
US20100188286A1 (en) 2008-09-17 2010-07-29 St-Ericsson Sa Time reference system
US20100189163A1 (en) 2009-01-27 2010-07-29 U-Blox Ag Method of processing a digital signal derived from a direct-sequence spread spectrum signal and a receiver
US20100188285A1 (en) 2009-01-23 2010-07-29 Her Majesty The Queen In Right Of Canada As Represented By The Minister Of Natural Resources Decoupled clock model with ambiguity datum fixing
US20100201829A1 (en) * 2009-02-09 2010-08-12 Andrzej Skoskiewicz Camera aiming using an electronic positioning system for the target
US20100207811A1 (en) 2009-02-18 2010-08-19 Bae Systems Information And Electronics Systems Integration, Inc. (Delaware Corp.) GPS antenna array and system for adaptively suppressing multiple interfering signals in azimuth and elevation
US20100210206A1 (en) 2007-11-15 2010-08-19 Qualcomm Incorporated Gnss receiver and signal tracking circuit and system
US20100211315A1 (en) 2007-03-22 2010-08-19 Furuno Electric Company Limited Gps composite navigation apparatus
US20100211248A1 (en) 2009-02-17 2010-08-19 Lockheed Martin Corporation System and method for stability control using gps data
US20100211316A1 (en) 2007-05-29 2010-08-19 Pedro Dias Freire Da Silva Highly integrated gps, galileo and inertial navigation system
US20100220004A1 (en) 2009-02-27 2010-09-02 Steven Malkos Method and system for gnss assistance data or lto data download over a broadcast band
US20100222076A1 (en) 2002-01-07 2010-09-02 Qualcomm Incorporated Multiplexed cdma and gps searching
US20100220008A1 (en) 2007-08-10 2010-09-02 Crossrate Technology Llc System and method for optimal time, position and heading solution through the integration of independent positioning systems
US20100228480A1 (en) 2009-03-07 2010-09-09 Lithgow Paul A Space satellite tracking and identification
US20100225537A1 (en) 2001-11-06 2010-09-09 Charles Abraham Method and apparatus for processing a satellite positioning system signal using a cellular acquisition signal
US20100228408A1 (en) 2005-09-14 2010-09-09 Tom Ford Helicopter ship board landing system
WO2010104782A1 (en) 2009-03-11 2010-09-16 Hemisphere Gps Llc Removing biases in dual frequency gnss receivers using sbas
US20100235093A1 (en) 2009-03-10 2010-09-16 Chien-Yang Chang Method for adjusting displayed navigation direction using sensors and navigation device using the same
US20100230198A1 (en) * 2009-02-12 2010-09-16 Frank Jonathan D Automated vehicle and system utilizing an optical sensing system
US20100232351A1 (en) 2009-03-11 2010-09-16 Mangesh Chansarkar Utilizing sbas signals to improve gnss receiver performance
US20100231446A1 (en) 2007-10-19 2010-09-16 Nxp B.V. Processing of satellite positioning system signals
US20100241347A1 (en) 2009-03-17 2010-09-23 Lear Corporation Method and system of locating stationary vehicle with remote device
US20100241353A1 (en) 2007-05-16 2010-09-23 Thinkware Systems Corporation Method for matching virtual map and system thereof
US20100238976A1 (en) 2007-12-05 2010-09-23 Qualcomm Incorporated Global navigation receiver
US20100241441A1 (en) 2009-03-19 2010-09-23 Entrix, Inc. Automated scat system
US20100241864A1 (en) 2008-11-21 2010-09-23 Dafca, Inc. Authenticating an integrated circuit based on stored information
US20100312475A1 (en) 2009-06-08 2010-12-09 Inventec Appliances (Shanghai) Co. Ltd. Turning determining method for assisting navigation and terminal device thereof
US20110015817A1 (en) 2009-07-17 2011-01-20 Reeve David R Optical tracking vehicle control system and method
WO2011014431A1 (en) 2009-07-29 2011-02-03 Hemisphere Gps Llc System and method for augmenting dgnss with internally-generated differential correction
US8098324B2 (en) 2005-12-07 2012-01-17 Sony Corporation Imaging device, GPS control method, and computer program
US8106817B2 (en) 2009-12-31 2012-01-31 Polaris Wireless, Inc. Positioning system and positioning method
US8190337B2 (en) 2003-03-20 2012-05-29 Hemisphere GPS, LLC Satellite based vehicle guidance control in straight and contour modes
US20120182421A1 (en) 2011-01-18 2012-07-19 Asanov Pavel Gps device with integral camera
US20120251123A1 (en) 2007-05-24 2012-10-04 Federal Law Enforcement Development Services, Inc. Led light global positioning and routing communication system
US20120271540A1 (en) 2009-10-22 2012-10-25 Krzysztof Miksa System and method for vehicle navigation using lateral offsets
US20120300070A1 (en) 2011-05-23 2012-11-29 Kabushiki Kaisha Topcon Aerial Photograph Image Pickup Method And Aerial Photograph Image Pickup Apparatus
US20130004086A1 (en) 2009-11-20 2013-01-03 Saab Ab Method estimating absolute orientation of a vehicle
US20130041549A1 (en) 2007-01-05 2013-02-14 David R. Reeve Optical tracking vehicle control system and method
US20130046461A1 (en) 2011-08-17 2013-02-21 Abram L. Balloga Orientation Device and Method
US20130066542A1 (en) 2011-09-08 2013-03-14 Kama-Tech (Hk) Limited Intelligent laser tracking system and method for mobile and fixed position traffic monitoring and enforcement applications
US20130107034A1 (en) 2000-10-06 2013-05-02 Vederi, Llc System and method for creating, storing and utilizing images of a geographical location
US8437901B2 (en) 2008-10-15 2013-05-07 Deere & Company High integrity coordination for multiple off-road vehicles
US20130121678A1 (en) 2011-11-16 2013-05-16 Alfred Xueliang Xin Method and automated location information input system for camera
US20130141565A1 (en) 2011-12-01 2013-06-06 Curtis Ling Method and System for Location Determination and Navigation using Structural Visual Information
US20130147661A1 (en) 2011-12-07 2013-06-13 International Business Machines Corporation System and method for optical landmark identification for gps error correction
US20130156271A1 (en) 2011-12-20 2013-06-20 Net-Fit Tecnologia Da Informacao Ltda. System for analysis of pests and diseases in crops and orchards via mobile phone
US20130166103A1 (en) 2011-12-26 2013-06-27 Hon Hai Precision Industry Co., Ltd. Aircraft exploration system
US20130211715A1 (en) 2012-02-09 2013-08-15 Samsung Electronics Co., Ltd. Apparatus and method for measuring position using gps and visible light communication
US8649930B2 (en) 2009-09-17 2014-02-11 Agjunction Llc GNSS integrated multi-sensor control system and method
CN105531706A (en) 2013-07-17 2016-04-27 索特斯波特有限公司 Search engine for information retrieval system

Patent Citations (521)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3585537A (en) 1969-02-10 1971-06-15 Bell Telephone Labor Inc Electric wave filters
US3596228A (en) 1969-05-29 1971-07-27 Ibm Fluid actuated contactor
US3727710A (en) 1971-05-13 1973-04-17 Gen Motors Corp Steer control for a track-laying vehicle
US3899028A (en) 1972-03-30 1975-08-12 Systron Donner Corp Angular position sensing and control system, apparatus and method
US3815272A (en) 1973-01-03 1974-06-11 G Marleau Collapsible, triangular net assembly
US3987456A (en) 1974-08-01 1976-10-19 Lignes Telegraphiques Et Telephoniques Wide relative frequency band and reduced size-to-wavelength ratio antenna
US4637474A (en) 1974-11-05 1987-01-20 Leonard Willie B Tractor and towed implement with elevation control system for implement including pressure responsive valve actuator
US4132272A (en) 1977-06-30 1979-01-02 International Harvester Company Tractor hitch position control system
US4170776A (en) 1977-12-21 1979-10-09 Nasa System for near real-time crustal deformation monitoring
US4180133A (en) 1978-01-12 1979-12-25 Iowa State University Research Foundation, Inc. Guidance system for towed vehicles
US4529990A (en) 1979-10-22 1985-07-16 Siemens Aktiengesellschaft Antenna system for a jamming transmitter
US4398162A (en) 1980-10-22 1983-08-09 Ngk Spark Plug Co., Ltd. Ladder-type piezoelectric filters
US4769700A (en) 1981-11-20 1988-09-06 Diffracto Ltd. Robot tractors
US4667203A (en) 1982-03-01 1987-05-19 Aero Service Div, Western Geophysical Method and system for determining position using signals from satellites
US4894662A (en) 1982-03-01 1990-01-16 Western Atlas International, Inc. Method and system for determining position on a moving platform, such as a ship, using signals from GPS satellites
US4453614A (en) 1982-03-19 1984-06-12 Deere & Company Steering arrangement for an off-highway articulated vehicle
US4739448A (en) 1984-06-25 1988-04-19 Magnavox Government And Industrial Electronics Company Microwave multiport multilayered integrated circuit chip carrier
US4689556A (en) 1984-10-12 1987-08-25 Daymarc Corporation Broad band contactor assembly for testing integrated circuit devices
US4777601A (en) * 1985-03-15 1988-10-11 Jd-Technologie Ag Method and apparatus for a passive track system for guiding and controlling robotic transport and assembly or installation devices
US4785463A (en) 1985-09-03 1988-11-15 Motorola, Inc. Digital global positioning system receiver
US4710775A (en) 1985-09-30 1987-12-01 The Boeing Company Parasitically coupled, complementary slot-dipole antenna element
US4714435A (en) 1985-11-14 1987-12-22 Molex Incorporated Connection for flexible apparatus
US4694639A (en) * 1985-12-30 1987-09-22 Chen Sheng K Robotic lawn mower
US4751512A (en) 1986-01-21 1988-06-14 Oceanonics, Inc. Differential navigation system for remote mobile users
US4694264A (en) 1986-03-05 1987-09-15 The United States Of America As Represented By The United States Department Of Energy Radio frequency coaxial feedthrough device
US4812991A (en) 1986-05-01 1989-03-14 Magnavox Govt. And Industrial Electronics Company Method for precision dynamic differential positioning
US4802545A (en) 1986-10-15 1989-02-07 J. I. Case Company Steering control system for articulated vehicle
US4858132A (en) 1987-09-11 1989-08-15 Ndc Technologies, Inc. Optical navigation system for an automatic guided vehicle, and method
US5663879A (en) 1987-11-20 1997-09-02 North American Philips Corporation Method and apparatus for smooth control of a vehicle with automatic recovery for interference
US5229941A (en) * 1988-04-14 1993-07-20 Nissan Motor Company, Limited Autonomous vehicle automatically running on route and its method
US4864320A (en) 1988-05-06 1989-09-05 Ball Corporation Monopole/L-shaped parasitic elements for circularly/elliptically polarized wave transceiving
US5031704A (en) 1988-05-10 1991-07-16 Fleischer Manufacturing, Inc. Guidance control apparatus for agricultural implement
US4918607A (en) 1988-09-09 1990-04-17 Caterpillar Industrial Inc. Vehicle guidance system
US4916577A (en) 1988-12-20 1990-04-10 Grumman Aerospace Corporation Method of mounting removable modules
US5165109A (en) 1989-01-19 1992-11-17 Trimble Navigation Microwave communication antenna
US5144130A (en) * 1989-02-16 1992-09-01 Rieter Machine Works Limited Method for the electrical adjustment of an optical row of sensors
US4963889A (en) 1989-09-26 1990-10-16 Magnavox Government And Industrial Electronics Company Method and apparatus for precision attitude determination and kinematic positioning
US5177489A (en) 1989-09-26 1993-01-05 Magnavox Electronic Systems Company Pseudolite-aided method for precision kinematic positioning
US5173715A (en) 1989-12-04 1992-12-22 Trimble Navigation Antenna with curved dipole elements
US5191351A (en) 1989-12-29 1993-03-02 Texas Instruments Incorporated Folded broadband antenna with a symmetrical pattern
US5615116A (en) 1990-02-05 1997-03-25 Caterpillar Inc. Apparatus and method for autonomous vehicle navigation using path data
US5956250A (en) 1990-02-05 1999-09-21 Caterpillar Inc. Apparatus and method for autonomous vehicle navigation using absolute data
US5684696A (en) 1990-02-05 1997-11-04 Caterpillar Inc. System and method for enabling an autonomous vehicle to track a desired path
US5390125A (en) 1990-02-05 1995-02-14 Caterpillar Inc. Vehicle position determination system and method
US5612883A (en) 1990-02-05 1997-03-18 Caterpillar Inc. System and method for detecting obstacles in the path of a vehicle
US5375059A (en) 1990-02-05 1994-12-20 Caterpillar Inc. Vehicle position determination system and method
US5838562A (en) 1990-02-05 1998-11-17 Caterpillar Inc. System and a method for enabling a vehicle to track a preset path
US5156219A (en) 1990-06-04 1992-10-20 A.I.L., Inc. Positioning apparatus for drawn implement
US5100229A (en) 1990-08-17 1992-03-31 Spatial Positioning Systems, Inc. Spatial positioning system
US5185610A (en) 1990-08-20 1993-02-09 Texas Instruments Incorporated GPS system and method for deriving pointing or attitude from a single GPS receiver
US5155493A (en) 1990-08-28 1992-10-13 The United States Of America As Represented By The Secretary Of The Air Force Tape type microstrip patch antenna
US5155490A (en) 1990-10-15 1992-10-13 Gps Technology Corp. Geodetic surveying system using multiple GPS base stations
US5390207A (en) 1990-11-28 1995-02-14 Novatel Communications Ltd. Pseudorandom noise ranging receiver which compensates for multipath distortion by dynamically adjusting the time delay spacing between early and late correlators
US5294970A (en) 1990-12-31 1994-03-15 Spatial Positioning Systems, Inc. Spatial positioning system
US5194851A (en) 1991-02-21 1993-03-16 Case Corporation Steering controller
US5152347A (en) 1991-04-05 1992-10-06 Deere & Company Interface system for a towed implement
US5134407A (en) 1991-04-10 1992-07-28 Ashtech Telesis, Inc. Global positioning system receiver digital processing technique
US5293170A (en) 1991-04-10 1994-03-08 Ashtech Inc. Global positioning system receiver digital processing technique
US5247440A (en) * 1991-05-03 1993-09-21 Motorola, Inc. Location influenced vehicle control system
US5202829A (en) 1991-06-10 1993-04-13 Trimble Navigation Limited Exploration system and method for high-accuracy and high-confidence level relative position and velocity determinations
US5148179A (en) 1991-06-27 1992-09-15 Trimble Navigation Differential position determination using satellites
US5207239A (en) 1991-07-30 1993-05-04 Aura Systems, Inc. Variable gain servo assist
US5467282A (en) 1991-09-20 1995-11-14 Dennis; Arthur R. GPS and satellite navigation system
US5365447A (en) 1991-09-20 1994-11-15 Dennis Arthur R GPS and satellite navigation system
US5239669A (en) 1992-02-04 1993-08-24 Trimble Navigation Limited Coupler for eliminating a hardwire connection between a handheld global positioning system (GPS) receiver and a stationary remote antenna
US5323322A (en) 1992-03-05 1994-06-21 Trimble Navigation Limited Networked differential GPS system
US5255756A (en) 1992-04-22 1993-10-26 Progressive Farm Products, Inc. Caddy with guidance system for agricultural implements
US5343209A (en) 1992-05-07 1994-08-30 Sennott James W Navigation receiver with coupled signal-tracking channels
US5919242A (en) 1992-05-14 1999-07-06 Agri-Line Innovations, Inc. Method and apparatus for prescription application of products to an agricultural field
US5345245A (en) 1992-07-01 1994-09-06 Kokusai Denshin Denwa Company, Limited Differential data signal transmission technique
US5268695A (en) 1992-10-06 1993-12-07 Trimble Navigation Limited Differential phase measurement through antenna multiplexing
US5361212A (en) 1992-11-02 1994-11-01 Honeywell Inc. Differential GPS landing assistance system
US5296861A (en) 1992-11-13 1994-03-22 Trimble Navigation Limited Method and apparatus for maximum likelihood estimation direct integer search in differential carrier phase attitude determination systems
US5390124A (en) 1992-12-01 1995-02-14 Caterpillar Inc. Method and apparatus for improving the accuracy of position estimates in a satellite based navigation system
US5359332A (en) 1992-12-31 1994-10-25 Trimble Navigation Limited Determination of phase ambiguities in satellite ranges
US5523761A (en) 1993-01-12 1996-06-04 Trimble Navigation Limited Differential GPS smart antenna device
US5471217A (en) 1993-02-01 1995-11-28 Magnavox Electronic Systems Company Method and apparatus for smoothing code measurements in a global positioning system receiver
US5444453A (en) 1993-02-02 1995-08-22 Ball Corporation Microstrip antenna structure having an air gap and method of constructing same
US5739785A (en) 1993-03-04 1998-04-14 Trimble Navigation Limited Location and generation of high accuracy survey control marks using satellites
US5311149A (en) 1993-03-12 1994-05-10 Trimble Navigation Limited Integrated phase locked loop local oscillator
US5476147A (en) 1993-03-19 1995-12-19 Fixemer; Richard A. Guidance system for an agricultural implement
US5548293A (en) 1993-03-24 1996-08-20 Leland Stanford Junior University System and method for generating attitude determinations using GPS
US5583513A (en) 1993-03-24 1996-12-10 Board Of Trustees Of The Leland Stanford Junior University System and method for generating precise code based and carrier phase position determinations
US5334987A (en) 1993-04-01 1994-08-02 Spectra-Physics Laserplane, Inc. Agricultural aircraft control system using the global positioning system
US5490073A (en) 1993-04-05 1996-02-06 Caterpillar Inc. Differential system and method for a satellite based navigation
US5477228A (en) 1993-04-13 1995-12-19 Differential Corrections Inc. Differential global positioning system using radio data system
US5416712A (en) 1993-05-28 1995-05-16 Trimble Navigation Limited Position and velocity estimation system for adaptive weighting of GPS and dead-reckoning information
US6542077B2 (en) 1993-06-08 2003-04-01 Raymond Anthony Joao Monitoring apparatus for a vehicle and/or a premises
US5504482A (en) 1993-06-11 1996-04-02 Rockwell International Corporation Automobile navigation guidance, control and safety system
US5534875A (en) 1993-06-18 1996-07-09 Adroit Systems, Inc. Attitude determining system for use with global positioning system
US5765123A (en) 1993-08-07 1998-06-09 Aisin Aw Co., Ltd. Navigation system
US5369589A (en) 1993-09-15 1994-11-29 Trimble Navigation Limited Plural information display for navigation
US5521610A (en) 1993-09-17 1996-05-28 Trimble Navigation Limited Curved dipole antenna with center-post amplifier
US5610522A (en) 1993-09-30 1997-03-11 Commissariat A L'energie Atomique Open magnetic structure including pole pieces forming a V-shape therebetween for high homogeneity in an NMR device
US5684476A (en) 1993-12-30 1997-11-04 Concord, Inc. Field navigation system
US5955973A (en) 1993-12-30 1999-09-21 Concord, Inc. Field navigation system
US5477458A (en) 1994-01-03 1995-12-19 Trimble Navigation Limited Network for carrier phase differential GPS corrections
US5899957A (en) 1994-01-03 1999-05-04 Trimble Navigation, Ltd. Carrier phase differential GPS corrections network
US5546093A (en) 1994-01-04 1996-08-13 Caterpillar Inc. System and method for providing navigation signals to an earthmoving or construction machine
US5568152A (en) 1994-02-04 1996-10-22 Trimble Navigation Limited Integrated image transfer for remote target location
US5563786A (en) 1994-02-16 1996-10-08 Fuji Jukogyo Kabushiki Kaisha Autonomous running control system for vehicle and the method thereof
US5890091A (en) 1994-02-18 1999-03-30 Trimble Navigation Ltd. Centimeter accurate global positioning system receiver for on-the-fly real-time kinematic measurement and control
US5519620A (en) 1994-02-18 1996-05-21 Trimble Navigation Limited Centimeter accurate global positioning system receiver for on-the-fly real-time kinematic measurement and control
JPH07244150A (en) 1994-02-28 1995-09-19 Fujita Corp Attitude measuring apparatus of heavy machine
US5617100A (en) 1994-04-07 1997-04-01 Matsushita Electric Industrial Co., Ltd. Accurate position measuring system
US5491636A (en) 1994-04-19 1996-02-13 Glen E. Robertson Anchorless boat positioning employing global positioning system
US5923270A (en) 1994-05-13 1999-07-13 Modulaire Oy Automatic steering system for an unmanned vehicle
US5680140A (en) 1994-07-19 1997-10-21 Trimble Navigation Limited Post-processing of inverse differential corrections for SATPS mobile stations
US5495257A (en) 1994-07-19 1996-02-27 Trimble Navigation Limited Inverse differential corrections for SATPS mobile stations
US5451964A (en) 1994-07-29 1995-09-19 Del Norte Technology, Inc. Method and system for resolving double difference GPS carrier phase integer ambiguity utilizing decentralized Kalman filters
US5442363A (en) 1994-08-04 1995-08-15 U.S. Army Corps Of Engineers As Represented By The Secretary Of The Army Kinematic global positioning system of an on-the-fly apparatus for centimeter-level positioning for static or moving applications
US5568162A (en) 1994-08-08 1996-10-22 Trimble Navigation Limited GPS navigation and differential-correction beacon antenna combination
US5596328A (en) 1994-08-23 1997-01-21 Honeywell Inc. Fail-safe/fail-operational differential GPS ground station system
US5610616A (en) 1994-08-23 1997-03-11 Honeywell Inc. Differential GPS ground station system
US5610845A (en) 1994-08-30 1997-03-11 United Technologies Corporation Multi-parameter air data sensing technique
US6061390A (en) 1994-09-02 2000-05-09 California Institute Of Technology P-code enhanced method for processing encrypted GPS signals without knowledge of the encryption code
US5664632A (en) 1994-09-12 1997-09-09 Orthman Manufacturing, Inc. Quick hitch guidance device
US5511623A (en) 1994-09-12 1996-04-30 Orthman Manufacturing, Inc. Quick hitch guidance device
US5543804A (en) 1994-09-13 1996-08-06 Litton Systems, Inc. Navigation apparatus with improved attitude determination
US5604506A (en) 1994-12-13 1997-02-18 Trimble Navigation Limited Dual frequency vertical antenna
US5589835A (en) 1994-12-20 1996-12-31 Trimble Navigation Limited Differential GPS receiver system linked by infrared signals
US5600670A (en) 1994-12-21 1997-02-04 Trimble Navigation, Ltd. Dynamic channel allocation for GPS receivers
US5731786A (en) 1994-12-29 1998-03-24 Trimble Navigation Limited Compaction of SATPS information for subsequent signal processing
US5621646A (en) 1995-01-17 1997-04-15 Stanford University Wide area differential GPS reference system and method
US5617317A (en) 1995-01-24 1997-04-01 Honeywell Inc. True north heading estimator utilizing GPS output information and inertial sensor system output information
US5854987A (en) 1995-02-22 1998-12-29 Honda Giken Kogyo Kabushiki Kaisha Vehicle steering control system using navigation system
US5644139A (en) 1995-03-02 1997-07-01 Allen; Ross R. Navigation technique for detecting movement of navigation sensors relative to an object
US5608393A (en) 1995-03-07 1997-03-04 Honeywell Inc. Differential ground station repeater
US5592382A (en) 1995-03-10 1997-01-07 Rockwell International Corporation Directional steering and navigation indicator
US5706015A (en) 1995-03-20 1998-01-06 Fuba Automotive Gmbh Flat-top antenna apparatus including at least one mobile radio antenna and a GPS antenna
US6055477A (en) 1995-03-31 2000-04-25 Trimble Navigation Ltd. Use of an altitude sensor to augment availability of GPS location fixes
US5638077A (en) 1995-05-04 1997-06-10 Rockwell International Corporation Differential GPS for fleet base stations with vector processing mechanization
US5561432A (en) 1995-05-12 1996-10-01 Trimble Navigation Out of plane antenna vector system and method
US6606542B2 (en) 1995-05-30 2003-08-12 Agco Corporation System and method for creating agricultural decision and application maps for automated agricultural machines
US5875408A (en) 1995-07-17 1999-02-23 Imra America, Inc. Automated vehicle guidance system and method for automatically guiding a vehicle
US6463091B1 (en) 1995-08-09 2002-10-08 Magellan Corporation Spread spectrum receiver using a pseudo-random noise code for ranging applications in a way that reduces errors when a multipath signal is present
US5862501A (en) 1995-08-18 1999-01-19 Trimble Navigation Limited Guidance control system for movable machinery
US5717593A (en) 1995-09-01 1998-02-10 Gvili; Michael E. Lane guidance system
US6018313A (en) 1995-09-01 2000-01-25 Tilmar Konle System for determining the location of mobile objects
US6205401B1 (en) 1995-09-19 2001-03-20 Litef Gmbh Navigation system for a vehicle, especially a land craft
US5673491A (en) 1995-10-20 1997-10-07 Brenna; Douglas J. Crane level indicator device
US5906645A (en) 1995-12-04 1999-05-25 Toyota Jidosha Kabushiki Kaisha Auto-drive control unit for vehicles
US20010004601A1 (en) 1995-12-22 2001-06-21 Drane Christopher R. Location and tracking system
US6275705B1 (en) 1995-12-22 2001-08-14 Cambridge Positioning Systems Ltd. Location and tracking system
US5944770A (en) 1995-12-28 1999-08-31 Trimble Navigation Limited Method and receiver using a low earth orbiting satellite signal to augment the global positioning system
US5928309A (en) 1996-02-05 1999-07-27 Korver; Kelvin Navigation/guidance system for a land-based vehicle
US6215828B1 (en) 1996-02-10 2001-04-10 Telefonaktiebolaget Lm Ericsson (Publ) Signal transformation method and apparatus
US6014101A (en) 1996-02-26 2000-01-11 Trimble Navigation Limited Post-processing of inverse DGPS corrections
US5828336A (en) 1996-03-29 1998-10-27 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Robust real-time wide-area differential GPS navigation
US5925080A (en) * 1996-03-29 1999-07-20 Mazda Motor Corporation Automatic guided vehicle control system
US5864318A (en) 1996-04-26 1999-01-26 Dorne & Margolin, Inc. Composite antenna for cellular and gps communications
US6069583A (en) 1996-05-09 2000-05-30 Agence Spatiale Europeene Receiver for a navigation system, in particular a satellite navigation system
US5935183A (en) 1996-05-20 1999-08-10 Caterpillar Inc. Method and system for determining the relationship between a laser plane and an external coordinate system
US6122595A (en) 1996-05-20 2000-09-19 Harris Corporation Hybrid GPS/inertially aided platform stabilization system
US5725230A (en) 1996-06-17 1998-03-10 Walkup; Joseph L. Self steering tandem hitch
US6415229B1 (en) 1996-06-21 2002-07-02 Claas Kgaa System for position determination of mobile objects, in particular vehicles
US6057800A (en) 1996-06-28 2000-05-02 State University Of New York RDOP surface for GPS relative positioning
US6138062A (en) * 1996-07-15 2000-10-24 Toyota Jidosha Kabushiki Kaisha Automatic travel controlling device
US6128574A (en) 1996-07-23 2000-10-03 Claas Kgaa Route planning system for agricultural work vehicles
US5810095A (en) 1996-07-25 1998-09-22 Case Corporation System for controlling the position of an implement attached to a work vehicle
US5941317A (en) 1996-08-01 1999-08-24 Great Western Corporation Pty Ltd. Row cultivator with laterally moveable tool bar
US5929721A (en) 1996-08-06 1999-07-27 Motorola Inc. Ceramic filter with integrated harmonic response suppression using orthogonally oriented low-pass filter
US5814961A (en) * 1996-09-03 1998-09-29 Nec Corporation Guidance system for automated guided vehicle
US6014608A (en) 1996-11-04 2000-01-11 Samsung Electronics Co., Ltd. Navigator apparatus informing of peripheral situation of the vehicle and method for controlling the same
US6070673A (en) 1996-11-22 2000-06-06 Case Corporation Location based tractor control
US5926079A (en) 1996-12-05 1999-07-20 Motorola Inc. Ceramic waveguide filter with extracted pole
US5848485A (en) 1996-12-27 1998-12-15 Spectra Precision, Inc. System for determining the position of a tool mounted on pivotable arm using a light source and reflectors
US5757316A (en) 1997-02-01 1998-05-26 Litton Systems, Inc. Attitude determination utilizing an inertial measurement unit and a plurality of satellite transmitters
US5777578A (en) 1997-02-10 1998-07-07 National Science Council Global positioning system (GPS) Compass
WO1998036288A1 (en) 1997-02-14 1998-08-20 Kelvin Korver A navigation/guidance system for a land-based vehicle
US5877725A (en) 1997-03-06 1999-03-02 Trimble Navigation Limited Wide augmentation system retrofit receiver
US5864315A (en) 1997-04-07 1999-01-26 General Electric Company Very low power high accuracy time and frequency circuits in GPS based tracking units
US6229479B1 (en) 1997-04-25 2001-05-08 Magellan Corporation Relative position measuring techniques using both GPS and GLONASS carrier phase measurements
US5914685A (en) 1997-04-25 1999-06-22 Magellan Corporation Relative position measuring techniques using both GPS and GLONASS carrier phase measurements
US5987383A (en) 1997-04-28 1999-11-16 Trimble Navigation Form line following guidance system
US5987383C1 (en) 1997-04-28 2006-06-13 Trimble Navigation Ltd Form line following guidance system
US6463374B1 (en) 1997-04-28 2002-10-08 Trimble Navigation Ltd. Form line following guidance system
US6052647A (en) 1997-06-20 2000-04-18 Stanford University Method and system for automatic control of vehicles based on carrier phase differential GPS
US5912798A (en) 1997-07-02 1999-06-15 Landsten Chu Dielectric ceramic filter
US5936573A (en) 1997-07-07 1999-08-10 Trimble Navigation Limited Real-time kinematic integrity estimator and monitor
US5940026A (en) 1997-07-21 1999-08-17 Rockwell Science Center, Inc. Azimuth determination for GPS/INS systems via GPS null steering antenna
US5917448A (en) 1997-08-07 1999-06-29 Rockwell Science Center, Inc. Attitude determination system with sequencing antenna inputs
US6061632A (en) 1997-08-18 2000-05-09 Trimble Navigation Limited Receiver with seamless correction capacity
US5943008A (en) 1997-09-23 1999-08-24 Rockwell Science Center, Inc. Single global positioning system receiver capable of attitude determination
US5927603A (en) 1997-09-30 1999-07-27 J. R. Simplot Company Closed loop control system, sensing apparatus and fluid application system for a precision irrigation device
US6023239A (en) 1997-10-08 2000-02-08 Arinc, Inc. Method and system for a differential global navigation satellite system aircraft landing ground station
US6198992B1 (en) 1997-10-10 2001-03-06 Trimble Navigation Limited Override for guidance control system
US6233511B1 (en) 1997-11-26 2001-05-15 Case Corporation Electronic control for a two-axis work implement
US5918558A (en) 1997-12-01 1999-07-06 Case Corporation Dual-pump, flow-isolated hydraulic circuit for an agricultural tractor
US6191730B1 (en) 1997-12-15 2001-02-20 Trimble Navigation Limited Two-channel fast-sequencing high-dynamics GPS navigation receiver
US5945917A (en) 1997-12-18 1999-08-31 Rockwell International Swathing guidance display
US6631394B1 (en) 1998-01-21 2003-10-07 Nokia Mobile Phones Limited Embedded system with interrupt handler for multiple operating systems
US5969670A (en) 1998-01-22 1999-10-19 Trimble Navigation Limited Inexpensive monitoring technique for achieving high level integrity monitoring for differential GPS
US6314348B1 (en) 1998-02-11 2001-11-06 Trimble Navigation Limited Correction control for guidance control system
US6100842A (en) 1998-02-20 2000-08-08 Trimble Navigation Limited Chained location determination system
US6104978A (en) 1998-04-06 2000-08-15 General Electric Company GPS-based centralized tracking system with reduced energy consumption
US6081171A (en) 1998-04-08 2000-06-27 Nokia Mobile Phones Limited Monolithic filters utilizing thin film bulk acoustic wave devices and minimum passive components for controlling the shape and width of a passband response
US6144335A (en) 1998-04-14 2000-11-07 Trimble Navigation Limited Automated differential correction processing of field data in a global positional system
US6392589B1 (en) 1998-04-14 2002-05-21 Trimble Navigation Limited Automated differential correction processing of field data in a global positioning system
US6356602B1 (en) 1998-05-04 2002-03-12 Trimble Navigation Limited RF integrated circuit for downconverting a GPS signal
US6449558B1 (en) 1998-05-29 2002-09-10 David Anthony Small Method and device for creating a network positioning system (NPS)
US6345231B2 (en) 1998-07-10 2002-02-05 Claas Selbstfahrende Erntemaschinen Gmbh Method and apparatus for position determining
US5933110A (en) 1998-07-13 1999-08-03 Arinc, Inc. Vessel attitude determination system and method
US6553299B1 (en) 1998-07-15 2003-04-22 Trimble Navigation Ltd. Methods and apparatus for precision agriculture operations utilizing real time kinematic global positioning system systems
US20030187560A1 (en) 1998-07-15 2003-10-02 Keller Russell J. Methods and apparatus for precision agriculture operations utilizing real time kinematic global positioning system systems
US6199000B1 (en) 1998-07-15 2001-03-06 Trimble Navigation Limited Methods and apparatus for precision agriculture operations utilizing real time kinematic global positioning system systems
US6307505B1 (en) 1998-07-22 2001-10-23 Trimble Navigation Limited Apparatus and method for coupling data to a position determination device
US5949371A (en) 1998-07-27 1999-09-07 Trimble Navigation Limited Laser based reflectors for GPS positioning augmentation
US6515619B1 (en) 1998-07-30 2003-02-04 Mckay, Jr. Nicholas D. Object location system
US6313788B1 (en) 1998-08-14 2001-11-06 Seagull Technology, Inc. Method and apparatus for reliable inter-antenna baseline determination
US6703973B1 (en) 1998-08-19 2004-03-09 Trimble Navigation, Ltd. Guiding vehicle in adjacent swaths across terrain via satellite navigation and tilt measurement
US6501422B1 (en) 1998-08-19 2002-12-31 Trimble Navigation, Ltd. Precise parallel swathing guidance via satellite navigation and tilt measurement
US6230097B1 (en) 1998-08-31 2001-05-08 Trimble Navigation Limited Accurate vehicle navigation
US6212453B1 (en) 1998-09-11 2001-04-03 Honda Giken Kogyo Kabushiki Kaisha Vehicle steering control system
US6256583B1 (en) 1998-09-16 2001-07-03 Rockwell Collins, Inc. GPS attitude determination system and method using optimal search space identification for integer cycle ambiguity resolution
US6336066B1 (en) 1998-09-29 2002-01-01 Pellenc S.A. Process for using localized agricultural data to optimize the cultivation of perennial plants
US6876920B1 (en) 1998-10-27 2005-04-05 Beeline Technologies Pty Ltd Vehicle positioning apparatus and method
WO2000024239A1 (en) 1998-10-27 2000-05-04 Agsystems Pty. Ltd. Vehicle positioning apparatus and method
US6253160B1 (en) 1999-01-15 2001-06-26 Trimble Navigation Ltd. Method and apparatus for calibrating a tool positioning mechanism on a mobile machine
US6198430B1 (en) 1999-03-26 2001-03-06 Rockwell Collins, Inc. Enhanced differential GNSS carrier-smoothed code processing using dual frequency measurements
US6236916B1 (en) 1999-03-29 2001-05-22 Caterpillar Inc. Autoguidance system and method for an agricultural machine
US6484097B2 (en) 1999-04-23 2002-11-19 Global Locate, Inc. Wide area inverse differential GPS
US6431576B1 (en) 1999-04-28 2002-08-13 Deere & Company System for steering towed implement in response to, or independently of, steering of towing vehicle
US20020165645A1 (en) * 1999-05-31 2002-11-07 Masato Kageyama Vehicle interference prevention device
US6191733B1 (en) 1999-06-01 2001-02-20 Modular Mining Systems, Inc. Two-antenna positioning system for surface-mine equipment
US6325684B1 (en) 1999-06-11 2001-12-04 Johnson Outdoors, Inc. Trolling motor steering control
US6236924B1 (en) 1999-06-21 2001-05-22 Caterpillar Inc. System and method for planning the operations of an agricultural machine in a field
US6389345B2 (en) 1999-06-29 2002-05-14 Caterpillar Inc. Method and apparatus for determining a cross slope of a surface
US6266595B1 (en) 1999-08-12 2001-07-24 Martin W. Greatline Method and apparatus for prescription application of products to an agricultural field
US6292132B1 (en) 1999-08-13 2001-09-18 Daimlerchrysler Ag System and method for improved accuracy in locating and maintaining positions using GPS
US6076612A (en) 1999-08-31 2000-06-20 Case Corporation Transition from position to draft mode controlled by hitch position command and feedback
US6062317A (en) 1999-09-03 2000-05-16 Caterpillar Inc. Method and apparatus for controlling the direction of travel of an earthworking machine
US6285320B1 (en) 1999-09-03 2001-09-04 Sikorsky Aircraft Corporation Apparatus and method for mapping surfaces of an object
US6466871B1 (en) 1999-10-03 2002-10-15 Azimuth Technologies Method for calibrating and verifying the attitude of a compass
US6611755B1 (en) 1999-12-19 2003-08-26 Trimble Navigation Ltd. Vehicle tracking, communication and fleet management system
US6686878B1 (en) 2000-02-22 2004-02-03 Trimble Navigation Limited GPS weather data recording system for use with the application of chemicals to agricultural fields
US20020004691A1 (en) 2000-03-10 2002-01-10 Yasuhiro Kinashi Attitude determination and alignment using electro-optical sensors and global navigation satellites
US6611754B2 (en) 2000-03-14 2003-08-26 Siemens Vdo Automotive Ag Route planning system
US20010025221A1 (en) 2000-03-14 2001-09-27 Bernhard Klein Route planning system
US6469663B1 (en) 2000-03-21 2002-10-22 Csi Wireless Inc. Method and system for GPS and WAAS carrier phase measurements for relative positioning
US6418031B1 (en) 2000-05-01 2002-07-09 International Business Machines Corporation Method and means for decoupling a printed circuit board
US20050265494A1 (en) 2000-05-13 2005-12-01 Goodings Christopher J Method and apparatus for code phase tracking
US6819269B2 (en) 2000-05-17 2004-11-16 Omega Patents, L.L.C. Vehicle tracker including battery monitoring feature and related methods
US6421003B1 (en) 2000-05-19 2002-07-16 Sri International Attitude determination using multiple baselines in a navigational positioning system
US6259398B1 (en) 2000-05-19 2001-07-10 Sri International Multi-valued variable ambiguity resolution for satellite navigation signal carrier wave path length determination
US6424915B1 (en) 2000-06-01 2002-07-23 Furuno Electric Co., Ltd. System for determining the heading and/or attitude of a body
US20030208319A1 (en) * 2000-06-05 2003-11-06 Agco System and method for creating demo application maps for site-specific farming
US6397147B1 (en) 2000-06-06 2002-05-28 Csi Wireless Inc. Relative GPS positioning using a single GPS receiver with internally generated differential correction terms
US6646603B2 (en) 2000-06-16 2003-11-11 Koninklijke Philips Electronics, N.V. Method of providing an estimate of a location
US6445983B1 (en) 2000-07-07 2002-09-03 Case Corporation Sensor-fusion navigator for automated guidance of off-road vehicles
US6611228B2 (en) 2000-07-24 2003-08-26 Furuno Electric Company, Limited Carrier phase-based relative positioning apparatus
US6631916B1 (en) 2000-07-28 2003-10-14 Merle E. Miller Guidance system for pull-type equipment
US6931233B1 (en) 2000-08-31 2005-08-16 Sirf Technology, Inc. GPS RF front end IC with programmable frequency synthesizer for use in wireless phones
US6549835B2 (en) 2000-09-28 2003-04-15 Nissan Motor Co., Ltd. Apparatus for and method of steering vehicle
US20130107034A1 (en) 2000-10-06 2013-05-02 Vederi, Llc System and method for creating, storing and utilizing images of a geographical location
US6377889B1 (en) 2000-10-13 2002-04-23 Trimble Navigation Limited Non-linear method of guiding to arbitrary curves with adaptive feedback
US6380888B1 (en) 2000-11-13 2002-04-30 The United States Of America As Represented By The Secretary Of The Navy Self-contained, self-surveying differential GPS base station and method of operating same
US6643576B1 (en) 2000-11-15 2003-11-04 Integrinautics Corporation Rapid adjustment of trajectories for land vehicles
US20020067849A1 (en) 2000-12-06 2002-06-06 Xerox Corporation Adaptive tree-based lookup for non-separably divided color tables
US6539303B2 (en) 2000-12-08 2003-03-25 Mcclure John A. GPS derived swathing guidance system
US6553311B2 (en) 2000-12-08 2003-04-22 Trimble Navigation Limited Navigational off-line and off-heading indication system and method
US20020072850A1 (en) 2000-12-08 2002-06-13 Mcclure John A. GPS derived swathing guidance system
US6711501B2 (en) 2000-12-08 2004-03-23 Satloc, Llc Vehicle navigation system and method for swathing applications
US20030187577A1 (en) 2000-12-08 2003-10-02 Satloc, Llc Vehicle navigation system and method for swathing applications
US6577952B2 (en) 2001-01-08 2003-06-10 Motorola, Inc. Position and heading error-correction method and apparatus for vehicle navigation systems
US6819780B2 (en) 2001-02-02 2004-11-16 Cnh America Llc Method and apparatus for automatically steering a vehicle in an agricultural field using a plurality of fuzzy logic membership functions
US20020161522A1 (en) 2001-02-05 2002-10-31 Clark Cohen Low cost system and method for making dual band GPS measurements
US6570534B2 (en) 2001-02-05 2003-05-27 Integrinautics Corp. Low cost system and method for making dual band GPS measurements
US20020138187A1 (en) 2001-02-09 2002-09-26 The Board Of Trustees Of The University Of Illinois Fuzzy steering controller
US6754584B2 (en) 2001-02-28 2004-06-22 Enpoint, Llc Attitude measurement using a single GPS receiver with two closely-spaced antennas
US20020165669A1 (en) * 2001-02-28 2002-11-07 Enpoint, L.L.C. Attitude measurement using a single GPS receiver with two closely-spaced antennas
US6445990B1 (en) 2001-03-19 2002-09-03 Caterpillar Inc. Method and apparatus for controlling straight line travel of a tracked machine
US6688403B2 (en) 2001-03-22 2004-02-10 Deere & Company Control system for a vehicle/implement hitch
US6774843B2 (en) 2001-03-28 2004-08-10 Communications Research Laboratory, Independent Administrative Institution Method for acquiring azimuth information
US7460942B2 (en) * 2001-04-09 2008-12-02 Hemisphere Gps Llc Soil cultivation implement control apparatus and method
US6865484B2 (en) 2001-04-11 2005-03-08 Mitsui & Co., Ltd. Satellite position measurement system
US20020171427A1 (en) 2001-04-24 2002-11-21 The United States Of America Represented By The Secretary Of The Navy Magnetic anomaly sensing system and methods for maneuverable sensing platforms
US6732024B2 (en) 2001-05-07 2004-05-04 The Board Of Trustees Of The Leland Stanford Junior University Method and apparatus for vehicle control, navigation and positioning
US6721638B2 (en) 2001-05-07 2004-04-13 Rapistan Systems Advertising Corp. AGV position and heading controller
US6622091B2 (en) 2001-05-11 2003-09-16 Fibersense Technology Corporation Method and system for calibrating an IG/GP navigational system
US6498570B2 (en) * 2001-05-24 2002-12-24 Phillip N. Ross Optical highway line detector
US6434462B1 (en) 2001-06-28 2002-08-13 Deere & Company GPS control of a tractor-towed implement
US6516271B2 (en) 2001-06-29 2003-02-04 The Regents Of The University Of California Method and apparatus for ultra precise GPS-based mapping of seeds or vegetation during planting
US6553300B2 (en) 2001-07-16 2003-04-22 Deere & Company Harvester with intelligent hybrid control system
US20030014171A1 (en) 2001-07-16 2003-01-16 Xinghan Ma Harvester with intelligent hybrid control system
WO2003019430A1 (en) 2001-08-29 2003-03-06 Beeline Technologies Pty Ltd Vehicle guidance software, method and system
US7277792B2 (en) 2001-08-29 2007-10-02 Beeline Technologies Vehicle guidance software, method and system
US6900992B2 (en) 2001-09-18 2005-05-31 Intel Corporation Printed circuit board routing and power delivery for high frequency integrated circuits
US6587761B2 (en) 2001-10-23 2003-07-01 The Aerospace Corporation Unambiguous integer cycle attitude determination method
US20090322606A1 (en) 2001-10-30 2009-12-31 Sirf Technology, Inc. Method and Apparatus for Real Time Clock (RTC) Brownout Detection
US20100225537A1 (en) 2001-11-06 2010-09-09 Charles Abraham Method and apparatus for processing a satellite positioning system signal using a cellular acquisition signal
US20030093210A1 (en) 2001-11-15 2003-05-15 Toshiyuki Kondo Travel control apparatus of vehicle
US7026982B2 (en) 2001-12-19 2006-04-11 Furuno Electric Company Limited Carrier-phase-based relative positioning device
US20100222076A1 (en) 2002-01-07 2010-09-02 Qualcomm Incorporated Multiplexed cdma and gps searching
US6671587B2 (en) 2002-02-05 2003-12-30 Ford Motor Company Vehicle dynamics measuring apparatus and method using multiple GPS antennas
US6792380B2 (en) 2002-02-12 2004-09-14 Furuno Electric Company Limited Attitude angle detecting apparatus
US7231290B2 (en) 2002-04-05 2007-06-12 E. I. Du Pont De Nemours And Company Method and apparatus for controlling a gas-emitting process and related devices
US20040039514A1 (en) 2002-04-05 2004-02-26 Steichen John Carl Method and apparatus for controlling a gas-emitting process and related devices
US6865465B2 (en) 2002-05-06 2005-03-08 Csi Wireless, Inc. Method and system for implement steering for agricultural vehicles
US7277784B2 (en) 2002-05-31 2007-10-02 Deere & Company Combination of a self-moving harvesting machine and a transport vehicle
US6822314B2 (en) 2002-06-12 2004-11-23 Intersil Americas Inc. Base for a NPN bipolar transistor
US20040006426A1 (en) 2002-07-03 2004-01-08 Armstrong Ray G. Vehicle locating system
US6657875B1 (en) 2002-07-16 2003-12-02 Fairchild Semiconductor Corporation Highly efficient step-down/step-up and step-up/step-down charge pump
US7031725B2 (en) 2002-08-13 2006-04-18 Drs Communications Company, Llc Method and system for determining relative positions of networked mobile communication devices
US6922635B2 (en) 2002-08-13 2005-07-26 Drs Communications Company, Llc Method and system for determining absolute positions of mobile communications devices using remotely generated positioning information
US20100149030A1 (en) 2002-08-15 2010-06-17 Rajiv Kumar Verma Position determination system and method
US20050080559A1 (en) 2002-10-02 2005-04-14 Hideto Ishibashi Position measuring system for working machine
US6990399B2 (en) 2002-10-31 2006-01-24 Cnh America Llc Agricultural utility vehicle and method of controlling same
US20050055147A1 (en) 2002-10-31 2005-03-10 Oliver Hrazdera Agricultural utility vehicle and method of controlling same
US6967538B2 (en) 2002-11-28 2005-11-22 Hynix Semiconductor Inc. PLL having VCO for dividing frequency
US7373231B2 (en) 2002-12-11 2008-05-13 Hemisphere Gps Llc Articulated equipment position control system and method
US20070198185A1 (en) * 2002-12-11 2007-08-23 Mcclure John A GNSS control system and method
US7885745B2 (en) * 2002-12-11 2011-02-08 Hemisphere Gps Llc GNSS control system and method
US7162348B2 (en) 2002-12-11 2007-01-09 Hemisphere Gps Llc Articulated equipment position control system and method
US6879283B1 (en) 2003-02-21 2005-04-12 Trimble Navigation Limited Method and system for transmission of real-time kinematic satellite positioning system data
US20100109944A1 (en) 2003-03-20 2010-05-06 Whitehead Michael L Gnss-based tracking of fixed or slow-moving structures
US20040186644A1 (en) * 2003-03-20 2004-09-23 Mcclure John A. Satellite based vehicle guidance control in straight and contour modes
US8190337B2 (en) 2003-03-20 2012-05-29 Hemisphere GPS, LLC Satellite based vehicle guidance control in straight and contour modes
US20090164067A1 (en) * 2003-03-20 2009-06-25 Whitehead Michael L Multiple-antenna gnss control system and method
US7689354B2 (en) 2003-03-20 2010-03-30 Hemisphere Gps Llc Adaptive guidance system and method
US20080269988A1 (en) * 2003-03-20 2008-10-30 Feller Walter J Combined gnss gyroscope control system and method
US7437230B2 (en) * 2003-03-20 2008-10-14 Hemisphere Gps Llc Satellite based vehicle guidance control in straight and contour modes
US7400956B1 (en) * 2003-03-20 2008-07-15 Hemisphere Gps Inc. Satellite position and heading sensor for vehicle steering control
US7505848B2 (en) 2003-03-31 2009-03-17 Deere & Company Path planner and method for planning a contour path of a vehicle
US7027918B2 (en) 2003-04-07 2006-04-11 Novariant, Inc. Satellite navigation system using multiple antennas
US7191061B2 (en) 2003-04-17 2007-03-13 Battelle Energy Alliance, Llc Auto-steering apparatus and method
US7292186B2 (en) 2003-04-23 2007-11-06 Csi Wireless Inc. Method and system for synchronizing multiple tracking devices for a geo-location system
US20040212533A1 (en) 2003-04-23 2004-10-28 Whitehead Michael L. Method and system for satellite based phase measurements for relative positioning of fixed or slow moving points in close proximity
US20040221790A1 (en) * 2003-05-02 2004-11-11 Sinclair Kenneth H. Method and apparatus for optical odometry
US6789014B1 (en) 2003-05-09 2004-09-07 Deere & Company Direct modification of DGPS information with inertial measurement data
US7479900B2 (en) 2003-05-28 2009-01-20 Legalview Assets, Limited Notification systems and methods that consider traffic flow predicament data
US6744404B1 (en) 2003-07-09 2004-06-01 Csi Wireless Inc. Unbiased code phase estimator for mitigating multipath in GPS
US7155335B2 (en) 2003-08-06 2006-12-26 General Motors Corporation Satellite radio real time traffic updates
US20050259240A1 (en) * 2003-09-18 2005-11-24 Goren David P Optical navigation of vehicles
US20070069924A1 (en) * 2003-09-18 2007-03-29 Goren David P Optical navigation of vehicles
US20050110676A1 (en) * 2003-10-06 2005-05-26 Heppe Stephen B. Method and apparatus for satellite-based relative positioning of moving platforms
US6961018B2 (en) 2003-10-06 2005-11-01 The Insitu Group, Inc. Method and apparatus for satellite-based relative positioning of moving platforms
US20090326809A1 (en) 2003-10-06 2009-12-31 Colley Jaime B System and method for augmenting a satellite-based navigation solution
US20050150160A1 (en) * 2003-10-28 2005-07-14 Norgaard Daniel G. Method for selecting crop varieties
US20070055412A1 (en) * 2003-11-20 2007-03-08 Werner Bernhard Lane device, selector device and method for detecting the lane of a vehicle
US20060213167A1 (en) 2003-12-12 2006-09-28 Harvey Koselka Agricultural robot system and method
US7854108B2 (en) 2003-12-12 2010-12-21 Vision Robotics Corporation Agricultural robot system and method
US7006032B2 (en) 2004-01-15 2006-02-28 Honeywell International, Inc. Integrated traffic surveillance apparatus
US20050196162A1 (en) * 2004-03-03 2005-09-08 Mootz Jeffery S. Self Leveling Camera Support Apparatus
US7142956B2 (en) 2004-03-19 2006-11-28 Hemisphere Gps Llc Automatic steering system and method
US20090322600A1 (en) 2004-03-19 2009-12-31 Whitehead Michael L Method and system using gnss phase measurements for relative positioning
US7221314B2 (en) 2004-03-26 2007-05-22 Topcon Gps, Llc Estimation and resolution of carrier wave ambiguities in a position navigation system
US20050225955A1 (en) 2004-04-09 2005-10-13 Hewlett-Packard Development Company, L.P. Multi-layer printed circuit boards
US20070112700A1 (en) 2004-04-22 2007-05-17 Frontline Robotics Inc. Open control system architecture for mobile autonomous systems
US20070088447A1 (en) 2004-04-27 2007-04-19 Abb Research Ltd Scheduling of industrial production processes
US7248211B2 (en) 2004-07-26 2007-07-24 Navcom Technology Inc. Moving reference receiver for RTK navigation
US20070285308A1 (en) 2004-07-30 2007-12-13 Integrinautics Corporation Multiple frequency antenna structures and methods for receiving navigation or ranging signals
US7089099B2 (en) 2004-07-30 2006-08-08 Automotive Technologies International, Inc. Sensor assemblies
US7271766B2 (en) 2004-07-30 2007-09-18 Novariant, Inc. Satellite and local system position determination
US20060031664A1 (en) 2004-08-04 2006-02-09 National Instruments Corporation Method and system for loading and updating firmware in an embedded device
US7395769B2 (en) 2004-10-21 2008-07-08 Jensen Layton W Individual row rate control of farm implements to adjust the volume of crop inputs across wide implements in irregularly shaped or contour areas of chemical application, planting or seeding
US20060095172A1 (en) * 2004-10-28 2006-05-04 Abramovitch Daniel Y Optical navigation system for vehicles
US20060206246A1 (en) 2004-10-28 2006-09-14 Walker Richard C Second national / international management and security system for responsible global resourcing through technical management to brige cultural and economic desparity
US20060103573A1 (en) 2004-11-12 2006-05-18 Geier George J Frequency error tracking in satellite positioning system receivers
US7610123B2 (en) 2005-01-04 2009-10-27 Deere & Company Vision-aided system and method for guiding a vehicle
US20060290779A1 (en) * 2005-01-18 2006-12-28 Reverte Carlos F Autonomous inspector mobile platform
US20080129586A1 (en) 2005-01-20 2008-06-05 Thales Satellite-Based Positioning Receiver with Improved Integrity and Continuity
US20060167600A1 (en) 2005-01-27 2006-07-27 Raven Industries, Inc. Architecturally partitioned automatic steering system and method
US7451030B2 (en) 2005-02-04 2008-11-11 Novariant, Inc. System and method for interactive selection and determination of agricultural vehicle guide paths offset from each other with varying curvature along their length
US20090273372A1 (en) 2005-02-25 2009-11-05 Qualcomm Incorporated Half bin linear frequency discriminator
US20060215739A1 (en) 2005-03-24 2006-09-28 Ian Williamson System and method for making correlation measurements utilizing pulse shape measurements
US7623952B2 (en) * 2005-04-21 2009-11-24 A.I.L., Inc. GPS controlled guidance system for farm tractor/implement combination
US7428259B2 (en) 2005-05-06 2008-09-23 Sirf Technology Holdings, Inc. Efficient and flexible GPS receiver baseband architecture
US20080204312A1 (en) 2005-05-18 2008-08-28 Leica Geosystems Ag Phase Ambiguity Resolution Method for a Satellite Based Positioning System
US7522100B2 (en) 2005-07-01 2009-04-21 Sirf Technology Holdings, Inc. Method and device for acquiring weak global navigation satellite system (GNSS) signals
US20070205940A1 (en) 2005-07-01 2007-09-06 Chun Yang Method and device for tracking weak global navigation satellite system (gnss) signals
US7324915B2 (en) 2005-07-14 2008-01-29 Biosense Webster, Inc. Data transmission to a position sensor
US20100185366A1 (en) 2005-07-19 2010-07-22 Heiniger Richard W Adaptive machine control system and method
US7522099B2 (en) 2005-09-08 2009-04-21 Topcon Gps, Llc Position determination using carrier phase measurements of satellite signals
US20100228408A1 (en) 2005-09-14 2010-09-09 Tom Ford Helicopter ship board landing system
US20070078570A1 (en) 2005-10-04 2007-04-05 Xiaowen Dai Method and apparatus for reporting road conditions
US7571029B2 (en) 2005-10-04 2009-08-04 Gm Global Technology Operations, Inc. Method and apparatus for reporting road conditions
US7388539B2 (en) * 2005-10-19 2008-06-17 Hemisphere Gps Inc. Carrier track loop for GNSS derived attitude
US20070193798A1 (en) 2005-10-21 2007-08-23 James Allard Systems and methods for obstacle avoidance
US7358896B2 (en) 2005-11-03 2008-04-15 Nemerix Sa Multiband GNSS receiver
US20100171660A1 (en) 2005-11-15 2010-07-08 O2Micro, Inc. Novas hybrid positioning technology using terrestrial digital broadcasting signal (dbs) and global positioning system (gps) satellite signal
US20070121708A1 (en) 2005-11-28 2007-05-31 Honeywell International, Inc. Discriminator function for GPS code alignment
US8098324B2 (en) 2005-12-07 2012-01-17 Sony Corporation Imaging device, GPS control method, and computer program
US20100150284A1 (en) 2005-12-14 2010-06-17 Dennis Arthur Fielder Gps receiver with improved immunity to burst transmissions
US20090174622A1 (en) 2005-12-27 2009-07-09 Kyocera Corporation Transmitter/Receiver Circuit and Transmission/Reception Method
US20070194984A1 (en) 2006-02-21 2007-08-23 Honeywell International Inc. System and method for detecting false navigation signals
US20090171583A1 (en) 2006-03-15 2009-07-02 The Boeing Company Global position system (gps) user receiver and geometric surface processing for all-in-view coherent gps signal prn codes acquisition and navigation solution
US20090262014A1 (en) 2006-03-15 2009-10-22 The Boeing Company Method and system for all-in-view coherent gps signal prn codes acquisition and navigation solution determination
US20090259707A1 (en) 2006-03-21 2009-10-15 Thales Method and device for fast correlation calculation
US20070247361A1 (en) 2006-04-21 2007-10-25 Broadcom Corporation, A California Corporation Communication system with assisted GPS and SBAS
US20100013703A1 (en) 2006-05-25 2010-01-21 The Boeing Company Gps gyro calibration
US20100109947A1 (en) 2006-05-26 2010-05-06 Savcor One Oy System and method for positioning a gps device
US20080039991A1 (en) 2006-08-10 2008-02-14 May Reed R Methods and systems for providing accurate vehicle positioning
US20080059068A1 (en) 2006-09-05 2008-03-06 Honeywell International Inc. Method and system for autonomous vehicle navigation
US20100039318A1 (en) 2006-11-06 2010-02-18 Marcin Michal Kmiecik Arrangement for and method of two dimensional and three dimensional precision location and orientation determination
US20080167770A1 (en) * 2007-01-05 2008-07-10 Beeline Technologies Pty Ltd Vehicle control system
US8768558B2 (en) 2007-01-05 2014-07-01 Agjunction Llc Optical tracking vehicle control system and method
WO2008080193A1 (en) 2007-01-05 2008-07-10 Hemisphere Gps Llc A vehicle control system
US20130041549A1 (en) 2007-01-05 2013-02-14 David R. Reeve Optical tracking vehicle control system and method
US7835832B2 (en) * 2007-01-05 2010-11-16 Hemisphere Gps Llc Vehicle control system
US20110118938A1 (en) 2007-01-05 2011-05-19 Andrew John Macdonald Vehicle control system
US20090174587A1 (en) 2007-01-10 2009-07-09 Tomohiro Ogawa Current switch circuit and d/a converter, semiconductor integrated circuit, and communication device using the same
US20100171757A1 (en) 2007-01-31 2010-07-08 Melamed Thomas J Referencing a map to the coordinate space of a positioning system
US20100211315A1 (en) 2007-03-22 2010-08-19 Furuno Electric Company Limited Gps composite navigation apparatus
WO2008119386A1 (en) 2007-03-30 2008-10-09 Carl Zeiss Smt Ag Support for a component of an optical device
US20090175593A1 (en) 2007-04-18 2009-07-09 Panasonic Corporation Digital broadcast receiving apparatus and digital broadcast receiving method
US20090322597A1 (en) 2007-04-30 2009-12-31 Navento Technologies, S.L. Location method and system and locatable portable device
US20080288205A1 (en) * 2007-05-16 2008-11-20 Edward Kah Ching Teoh Optical navigation device with surface and free space navigation
US20100241353A1 (en) 2007-05-16 2010-09-23 Thinkware Systems Corporation Method for matching virtual map and system thereof
US20080284643A1 (en) * 2007-05-16 2008-11-20 Scherzinger Bruno M Post-mission high accuracy position and orientation system
US20120251123A1 (en) 2007-05-24 2012-10-04 Federal Law Enforcement Development Services, Inc. Led light global positioning and routing communication system
US20100211316A1 (en) 2007-05-29 2010-08-19 Pedro Dias Freire Da Silva Highly integrated gps, galileo and inertial navigation system
US20100220008A1 (en) 2007-08-10 2010-09-02 Crossrate Technology Llc System and method for optimal time, position and heading solution through the integration of independent positioning systems
WO2009066183A2 (en) 2007-09-27 2009-05-28 Hemisphere Gps Llc Tightly-coupled pcb gnss circuit and manufacturing method
US20090093959A1 (en) * 2007-10-04 2009-04-09 Trimble Navigation Limited Real-time high accuracy position and orientation system
US20100149025A1 (en) 2007-10-09 2010-06-17 Honeywell International Inc. Gps receiver raim with slaved precision clock
US20100231446A1 (en) 2007-10-19 2010-09-16 Nxp B.V. Processing of satellite positioning system signals
US20100210206A1 (en) 2007-11-15 2010-08-19 Qualcomm Incorporated Gnss receiver and signal tracking circuit and system
US20100238976A1 (en) 2007-12-05 2010-09-23 Qualcomm Incorporated Global navigation receiver
US20090160951A1 (en) * 2007-12-20 2009-06-25 Utah State University Research Foundation Three-Axis Image Stabilization System
WO2009082745A1 (en) 2007-12-22 2009-07-02 Hemisphere Gps Llc Integrated dead reckoning and gnss/ins positioning
US20090177399A1 (en) 2008-01-07 2009-07-09 Samsung Electronics Co., Ltd. Method for estimating location and apparatus using the same
US20090177395A1 (en) 2008-01-07 2009-07-09 David Stelpstra Navigation device and method
US20090174597A1 (en) 2008-01-08 2009-07-09 Dilellio James A Global Positioning System Accuracy Enhancement
US20100117894A1 (en) 2008-01-09 2010-05-13 Mayflower Communications Company, Inc. Gps-based measurement of roll rate and roll angle of spinning platforms
US20090262018A1 (en) 2008-02-05 2009-10-22 Mstar Semiconductor, Inc. High Accuracy Satellite Receiving Controller and Associated Method
US20090204281A1 (en) * 2008-02-10 2009-08-13 Hemisphere Gps Llc Visual, gnss and gyro autosteering control
US20100039316A1 (en) 2008-02-25 2010-02-18 Sirf Technology, Inc. System and Method for Operating a GPS Device in a Micro Power Mode
WO2009148638A1 (en) 2008-02-26 2009-12-10 Hemisphere Gps Llc Unbiased code phase discriminator
WO2009126587A1 (en) 2008-04-08 2009-10-15 Hemisphere Gps Llc Gnss-based mobile communication system and method
US20090251366A1 (en) * 2008-04-08 2009-10-08 Mcclure John A Gnss-based mobile communication system and method
US20090259397A1 (en) 2008-04-10 2009-10-15 Richard Stanton Navigation system with touchpad remote
US20090265054A1 (en) 2008-04-16 2009-10-22 Gm Global Technology Operations, Inc. In-vehicle sensor-based calibration algorithm for yaw rate sensor calibration
US20090262974A1 (en) 2008-04-18 2009-10-22 Erik Lithopoulos System and method for obtaining georeferenced mapping data
US20090265104A1 (en) 2008-04-22 2009-10-22 Itt Manufacturing Enterprises, Inc. Navigation System and Method of Obtaining Accurate Navigational Information in Signal Challenging Environments
US20090265101A1 (en) 2008-04-22 2009-10-22 En-Min Jow Access Device With Navigation Function
US20100121577A1 (en) 2008-04-24 2010-05-13 Gm Global Technology Operations, Inc. Three-dimensional lidar-based clear path detection
US20090276155A1 (en) 2008-04-30 2009-11-05 Honeywell International, Inc. Systems and methods for determining location information using dual filters
US20090274079A1 (en) 2008-05-01 2009-11-05 Qualcomm Incorporated Radio Frequency (RF) Signal Multiplexing
US20090274113A1 (en) 2008-05-01 2009-11-05 Mr. Daniel A. Katz Channel Allocation for Burst Transmission to a Diversity of Satellites
US20090273513A1 (en) 2008-05-01 2009-11-05 Skytraq Technology Inc. Method of dynamically optimizing the update rate of gps output data
US20090322601A1 (en) 2008-05-22 2009-12-31 Jonathan Ladd Gnss receiver using signals of opportunity and assistance information to reduce the time to first fix
US20090299550A1 (en) 2008-05-27 2009-12-03 Baker David A Orientation-based wireless sensing apparatus
US20090295634A1 (en) 2008-05-30 2009-12-03 O2Micro, Inc. Global positioning system receiver
US20090295633A1 (en) 2008-06-02 2009-12-03 Pinto Robert W Attitude estimation using intentional translation of a global navigation satellite system (GNSS) antenna
US20090322598A1 (en) 2008-06-26 2009-12-31 Honeywell International, Inc. Integrity of differential gps corrections in navigation devices using military type gps receivers
US20100030470A1 (en) 2008-07-02 2010-02-04 O2Micro, Inc. Global positioning system and dead reckoning (gps&dr) integrated navigation system
WO2010005945A1 (en) 2008-07-11 2010-01-14 Hemisphere Gps Llc Combined gnss and gyroscope control system and method
US20100161568A1 (en) 2008-07-25 2010-06-24 Seiko Epson Corporation Data Compression by Multi-Order Differencing
US20100026569A1 (en) 2008-07-31 2010-02-04 Honeywell International Inc. Method and apparatus for location detection using gps and wifi/wimax
US20100039320A1 (en) 2008-08-14 2010-02-18 Boyer Pete A Hybrid GNSS and TDOA Wireless Location System
US20100039321A1 (en) 2008-08-15 2010-02-18 Charles Abraham Method and system for calibrating group delay errors in a combined gps and glonass receiver
US20100063649A1 (en) 2008-09-10 2010-03-11 National Chiao Tung University Intelligent driving assistant systems
US20100060518A1 (en) 2008-09-11 2010-03-11 Bar-Sever Yoaz E Method and apparatus for autonomous, in-receiver prediction of gnss ephemerides
US20100188286A1 (en) 2008-09-17 2010-07-29 St-Ericsson Sa Time reference system
US20100084147A1 (en) 2008-10-02 2010-04-08 Trimble Navigation Ltd. Automatic Control of Passive, Towed Implements
US20100085249A1 (en) 2008-10-03 2010-04-08 Trimble Navigation Limited Compact Transmission of GPS Information Using Compressed Measurement Record Format
US20100085253A1 (en) 2008-10-03 2010-04-08 Trimble Navigation Limited Continuous Tracking Counter for Enabling Cycle-slip Free Messages in a Network of Global Navigation System Satellite Receivers
US8437901B2 (en) 2008-10-15 2013-05-07 Deere & Company High integrity coordination for multiple off-road vehicles
US20100103033A1 (en) 2008-10-23 2010-04-29 Texas Instruments Incorporated Loosely-coupled integration of global navigation satellite system and inertial navigation system
US20100103034A1 (en) 2008-10-24 2010-04-29 Ntt Docomo, Inc. Positioning control device and positioning control method
US20100106445A1 (en) 2008-10-24 2010-04-29 Takahiro Kondoh Angular velocity sensor correcting apparatus for deriving value for correcting output signal from angular velocity sensor, angular velocity calculating apparatus, angular velocity sensor correcting method, and angular velocity calculating method
US20100103040A1 (en) 2008-10-26 2010-04-29 Matt Broadbent Method of using road signs to augment Global Positioning System (GPS) coordinate data for calculating a current position of a personal navigation device
US20100103038A1 (en) 2008-10-27 2010-04-29 Mediatek Inc. Power saving method adaptable in gnss device
US20100106414A1 (en) 2008-10-27 2010-04-29 John Whitehead Method of performing routing with artificial intelligence
US20100111372A1 (en) 2008-11-03 2010-05-06 Microsoft Corporation Determining user similarities based on location histories
US20100114483A1 (en) 2008-11-03 2010-05-06 Samsung Electronics Co., Ltd. Method and apparatus for automatically optimizing and setting a GPS reception period and map contents
US20100109948A1 (en) 2008-11-04 2010-05-06 Leonid Razoumov Methods and Apparatuses For GPS Coordinates Extrapolation When GPS Signals Are Not Available
US20100109945A1 (en) 2008-11-06 2010-05-06 Texas Instruments Incorporated Loosely-coupled integration of global navigation satellite system and inertial navigation system: speed scale-factor and heading bias calibration
US20100109950A1 (en) 2008-11-06 2010-05-06 Texas Instruments Incorporated Tightly-coupled gnss/imu integration filter having speed scale-factor and heading bias calibration
US20100117900A1 (en) 2008-11-13 2010-05-13 Van Diggelen Frank Method and system for maintaining a gnss receiver in a hot-start state
US20100117899A1 (en) 2008-11-13 2010-05-13 Ecole Polytechnique Federale De Lausanne (Epfl) Method to secure gnss based locations in a device having gnss receiver
US20100124210A1 (en) 2008-11-14 2010-05-20 Ralink Technology Corporation Method and system for rf transmitting and receiving beamforming with gps guidance
US20100124212A1 (en) 2008-11-14 2010-05-20 Ralink Technology (Singapore) Corporation Method and system for rf transmitting and receiving beamforming with location or gps guidance
US20100241864A1 (en) 2008-11-21 2010-09-23 Dafca, Inc. Authenticating an integrated circuit based on stored information
US20100134354A1 (en) 2008-12-02 2010-06-03 Sirf Technology, Inc. Method and Apparatus for a GPS Receiver Capable or Reception of GPS Signals and Binary Offset Carrier Signals
US20100149033A1 (en) 2008-12-12 2010-06-17 Charles Abraham Method and system for power management for a frequency synthesizer in a gnss receiver chip
US20100149037A1 (en) 2008-12-15 2010-06-17 Samsung Electronics Co., Ltd. Global positioning system (GPS) receiver and method of determining location of GPS receiver
US20100152949A1 (en) 2008-12-15 2010-06-17 Delphi Technologies, Inc. Vehicle event recording system and method
US20100149034A1 (en) 2008-12-17 2010-06-17 Altek Corporation Method for calculating current position coordinate and method for calculating pseudo range
US20100159943A1 (en) 2008-12-18 2010-06-24 Verizon Corporate Services Group, Inc. Method and system for providing location-based information to a group of mobile user agents
US20100156718A1 (en) 2008-12-19 2010-06-24 Altek Corporation Method for calculating current position coordinate
US20100156709A1 (en) 2008-12-19 2010-06-24 Nexteq Navigation Corporation System and method for applying code corrections for gnss positioning
US20100161179A1 (en) 2008-12-22 2010-06-24 Mcclure John A Integrated dead reckoning and gnss/ins positioning
US20100156712A1 (en) 2008-12-23 2010-06-24 Toyota Motor Sales, U.S.A., Inc. Gps gate system
US20100161211A1 (en) 2008-12-24 2010-06-24 Mitac International Corp. Method and system for automatically creating poi by identifying geographic information on a screen of a portable navigation device
US20100185364A1 (en) * 2009-01-17 2010-07-22 Mcclure John A Raster-based contour swathing for guidance and variable-rate chemical application
US20100185389A1 (en) 2009-01-21 2010-07-22 Michael Glenn Woodard GPS-based vehicle alert and control system
US20100188285A1 (en) 2009-01-23 2010-07-29 Her Majesty The Queen In Right Of Canada As Represented By The Minister Of Natural Resources Decoupled clock model with ambiguity datum fixing
US20100189163A1 (en) 2009-01-27 2010-07-29 U-Blox Ag Method of processing a digital signal derived from a direct-sequence spread spectrum signal and a receiver
US20100201829A1 (en) * 2009-02-09 2010-08-12 Andrzej Skoskiewicz Camera aiming using an electronic positioning system for the target
US20100230198A1 (en) * 2009-02-12 2010-09-16 Frank Jonathan D Automated vehicle and system utilizing an optical sensing system
US20100211248A1 (en) 2009-02-17 2010-08-19 Lockheed Martin Corporation System and method for stability control using gps data
US20100207811A1 (en) 2009-02-18 2010-08-19 Bae Systems Information And Electronics Systems Integration, Inc. (Delaware Corp.) GPS antenna array and system for adaptively suppressing multiple interfering signals in azimuth and elevation
US20100220004A1 (en) 2009-02-27 2010-09-02 Steven Malkos Method and system for gnss assistance data or lto data download over a broadcast band
US20100228480A1 (en) 2009-03-07 2010-09-09 Lithgow Paul A Space satellite tracking and identification
US20100235093A1 (en) 2009-03-10 2010-09-16 Chien-Yang Chang Method for adjusting displayed navigation direction using sensors and navigation device using the same
US20100232351A1 (en) 2009-03-11 2010-09-16 Mangesh Chansarkar Utilizing sbas signals to improve gnss receiver performance
US20100231443A1 (en) 2009-03-11 2010-09-16 Whitehead Michael L Removing biases in dual frequency gnss receivers using sbas
WO2010104782A1 (en) 2009-03-11 2010-09-16 Hemisphere Gps Llc Removing biases in dual frequency gnss receivers using sbas
US20100241347A1 (en) 2009-03-17 2010-09-23 Lear Corporation Method and system of locating stationary vehicle with remote device
US20100241441A1 (en) 2009-03-19 2010-09-23 Entrix, Inc. Automated scat system
US20100312475A1 (en) 2009-06-08 2010-12-09 Inventec Appliances (Shanghai) Co. Ltd. Turning determining method for assisting navigation and terminal device thereof
US20110015817A1 (en) 2009-07-17 2011-01-20 Reeve David R Optical tracking vehicle control system and method
US8311696B2 (en) 2009-07-17 2012-11-13 Hemisphere Gps Llc Optical tracking vehicle control system and method
WO2011014431A1 (en) 2009-07-29 2011-02-03 Hemisphere Gps Llc System and method for augmenting dgnss with internally-generated differential correction
US8649930B2 (en) 2009-09-17 2014-02-11 Agjunction Llc GNSS integrated multi-sensor control system and method
US20120271540A1 (en) 2009-10-22 2012-10-25 Krzysztof Miksa System and method for vehicle navigation using lateral offsets
US20130004086A1 (en) 2009-11-20 2013-01-03 Saab Ab Method estimating absolute orientation of a vehicle
US8106817B2 (en) 2009-12-31 2012-01-31 Polaris Wireless, Inc. Positioning system and positioning method
US20120182421A1 (en) 2011-01-18 2012-07-19 Asanov Pavel Gps device with integral camera
US20120300070A1 (en) 2011-05-23 2012-11-29 Kabushiki Kaisha Topcon Aerial Photograph Image Pickup Method And Aerial Photograph Image Pickup Apparatus
US20130046461A1 (en) 2011-08-17 2013-02-21 Abram L. Balloga Orientation Device and Method
US20130066542A1 (en) 2011-09-08 2013-03-14 Kama-Tech (Hk) Limited Intelligent laser tracking system and method for mobile and fixed position traffic monitoring and enforcement applications
US20130121678A1 (en) 2011-11-16 2013-05-16 Alfred Xueliang Xin Method and automated location information input system for camera
US20130141565A1 (en) 2011-12-01 2013-06-06 Curtis Ling Method and System for Location Determination and Navigation using Structural Visual Information
US20130147661A1 (en) 2011-12-07 2013-06-13 International Business Machines Corporation System and method for optical landmark identification for gps error correction
US20130156271A1 (en) 2011-12-20 2013-06-20 Net-Fit Tecnologia Da Informacao Ltda. System for analysis of pests and diseases in crops and orchards via mobile phone
US20130166103A1 (en) 2011-12-26 2013-06-27 Hon Hai Precision Industry Co., Ltd. Aircraft exploration system
US20130211715A1 (en) 2012-02-09 2013-08-15 Samsung Electronics Co., Ltd. Apparatus and method for measuring position using gps and visible light communication
CN105531706A (en) 2013-07-17 2016-04-27 索特斯波特有限公司 Search engine for information retrieval system

Non-Patent Citations (42)

* Cited by examiner, † Cited by third party
Title
"ARINC Engineering Services, Interface Specification IS-GPS-200, Revision D", Online [retrieved on May 18, 2010]. Retrieved from the Internet: <http://www.navcen.uscg.gov/gps/geninfo/IS-GPS-2000.pdf> (Dec. 7, 2004) p. 168, para [0001].
"Eurocontrol, Pegasus Technical Notes on SBAS", report [online] Dec. 7, 2004 [retrieved on May 18, 2010], Retrieved from the Internet: <http://www.icao.int.icao/en/ro/nacc/meetings/2004/gnss/documentation/Pegasus/tn.pdg> (Dec. 7, 2004), p. 89, paras [0001]-[0004].
"ISO", 11783 Part 7 Draft Amendment 1 Annex, Paragraphs B.6 and B.7.ISO 11783-7 2004 DAM1, ISO; Mar. 8, 2004.
"Orthman Manufacturing Co., www.orthman.com/htm;guidance.htm"; 2004; regarding the "Tracer Quick-Hitch".
Bevly, David M., "Comparison of INS v. Carrier-Phase DGPS for Attitude Determination in the Control of Off-Road Vehicles"; ION 55th Annual Meeting; Jun. 28-30, 1999; Cambridge, Massachusetts; pp. 497-504.
Han, Shaowei, et al., "Single-Epoch Ambiguity Resolution for Real-Time GPS Attitude Determination with the Aid of One-Dimensional Optical Fiber Gyro"; GPS Solutions, vol. 3, No. 1; pp. 5-12 (1999); John Wiley & Sons, Inc.
International Search Report and Written Opinion for PCT/IB2008/003796 dated Jul. 15, 2009.
International Search Report and Written Opinion for PCT/US08/81727 dated Dec. 23, 2008.
International Search Report and Written Opinion for PCT/US08/88070 dated Feb. 9, 2009.
International Search Report and Written Opinion for PCT/US09/63594 dated Jan. 11, 2010.
International Search Report and Written Opinion for PCT/US10/21334 dated Mar. 12, 2010.
International Search Report and Written Opinion for PCT/US10/26509 dated Apr. 20, 2010; 7 pages.
International Search Report and Written Opinion for PCT/US2004/015678 dated Jun. 21, 2005.
International Search Report and Written Opinion for PCT/US2010/043094 dated Sep. 17, 2010.
International Search Report for PCT/AU2008/000002 dated Feb. 28, 2008.
International Search Report for PCT/US09/039686 dated May 26, 2009.
International Search Report for PCT/US09/067693 dated Jan. 26, 2010.
International Search Report for PCT/US09/33567 dated Feb. 9, 2009.
International Search Report for PCT/US09/33693 dated Mar. 30, 2009.
International Search Report for PCT/US09/34376 dated Nov. 2, 2009.
International Search Report for PCT/US09/49776 dated Aug. 11, 2009.
International Search Report for PCT/US09/60668 dated Dec. 9, 2009.
International Search Report for PCT/US10/26509 dated Apr. 20, 2010.
Irsigler, M., et al., "PLL Tracking Performance in the Presence of Oscillator Phase Noise"; GPS Solutions, vol. 5, No. 4, pp. 45-57 (2002).
Kaplan, E.D., "Understanding GPS: Principles and Applications"; Artech House, MA, 1996.
Keicher, R. et al., "Automatic Guidance for Agricultural Vehicles in Europe"; Computers and Electronics in Agriculture, vol. 25; Jan. 2000; pp. 169-194.
Last, J.D., et al., "Effect of skywave interference on coverage of radio beacon DGPS stations"; IEEE Proc.-Radar, Sonar Navig., vol. 144, No. 3; Jul. 1997; pp. 163-168.
Ling, Dai, et al., "Real-Time Attitude Determination from Microsatellite by LAMBDA Method Combined with Kalman Filtering"; A Collection of the 22nd AIAA International Communications Satellite Systems Conference and Exhibit Technical Papers, vol. 1, Monterey, California; American Institute of Aeronautics and Astronautics, Inc. (May 2004), pp. 136-143.
Noh, Kwang-Mo, "Self-tuning Controller for Farm Tractor Guidance"; Digital Repository @ Iowa State University, Retrospective Theses and Dissertations. Paper 9874; (1990); 192 pages.
Notification Concerning Transmittal of International Report on Patentability for PCT/US2009/049776 dated Jan. 20, 2011.
Notification of Publication of International Application for WO 2011/014431 dated Feb. 3, 2011.
Notification of Transmittal of International Preliminary Report on Patentability for PCT/US09/039686 dated Oct. 21, 2010.
Park, Chansik et al., "Integer Ambiguity Resolution for GPS Based Attitude Determination System"; SICE Jul. 29-31, 1998; Chiba; pp. 1115-1120.
Parkinson, Bradford W., et al., "Global Positioning System: Theory and Applications, vol. II"; Bradford W. Parkinson and James J. Spilker, Jr., eds.; AIAA, Reston, VA, USA; pp. 3-50 (1995).
Rho, Hyundho et al., "Dual-Frequency GPS Precise Point Positioning with WADGPS Corrections" [retrieved on May 18, 2010] Retrieved from the Internet: <http://gauss.gge.unb.ca/papers.pdf/iongnss2005.rho.wadgps.pdf> (Jul. 12, 2006).
Schaer, et al., "Determination and Use of GPS Differential Code Bias Values"; Presentation [online]. Retrieved May 18, 2010. Retrieved from the Internet: <http://nng.esoc.esa.de/ws206/REPR2.pdf> (May 8, 2006).
Schwabe Williamson & Wyatt Listing of Related Cases dated Jan. 31, 2017; 1 page.
Takac, Frank, et al., "SmartRTK: A Novel Method of Processing Standardized RTCM Network RTK Information for High Precision Positioning"; Proceedings of ENC GNSS 2008; Toulouse, France; Apr. 22, 2008.
Van Zuydam, R.P., "Centimeter-Precision Guidance of Agricultural Implements in the Open Field by Means of Real Time Kinematic DGPS"; ASA-CSSA-SSSA, pp. 1023-1034 (1999).
Ward, Phillip W., "Performance Comparisons Between FLL, PLL, and a Novel FLL-Assisted-PLL Carrier Tracking Loop under RF Interference Conditions"; 11th Int. Tech. Meeting of the Satellite Division of the U.S. Inst. of Navigation, Nashville, TN; Sep. 15-18, 1998; pp. 783-795.
Xu, Jiangning et al., "An EHW Architecture for Real-Time GPS Attitude Determination Based on Parallel Genetic Algorithm"; The Computer Society Proceedings of the 2002 NASA/DOD Conference on Evolvable Hardware (EH'02) (2002).

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11167792B2 (en) * 2015-11-19 2021-11-09 Agjunction Llc Single-mode implement steering
US11180189B2 (en) * 2015-11-19 2021-11-23 Agjunction Llc Automated reverse implement parking
US11615639B1 (en) * 2021-01-27 2023-03-28 Jackson Klein Palm vein identification apparatus and method of use

Similar Documents

Publication Publication Date Title
US8768558B2 (en) Optical tracking vehicle control system and method
US8311696B2 (en) Optical tracking vehicle control system and method
AU2008203618B2 (en) A vehicle control system
US20220155794A1 (en) 3-d image system for vehicle control
JP6827627B2 (en) Methods and systems for generating and updating vehicle environment maps
US10151588B1 (en) Determining position and orientation for aerial vehicle in GNSS-denied situations
Hague et al. Ground based sensing systems for autonomous agricultural vehicles
CN102460074B (en) Method and apparatus for combining three-dimensional position and two-dimensional intensity mapping for localization
EP0679976B1 (en) Integrated vehicle positioning and navigation system, apparatus and method
EP1983397B1 (en) Landmark navigation for vehicles using blinking optical beacons
US11480973B2 (en) Robotic mower boundary detection system
US7978128B2 (en) Land survey system
US7363154B2 (en) Method and system for determining the path of a mobile machine
USRE48527E1 (en) Optical tracking vehicle control system and method
US20130293716A1 (en) Mobile mapping system for road inventory
Miller et al. Sensitivity analysis of a tightly-coupled GPS/INS system for autonomous navigation
CN110673608A (en) Robot navigation method
US10955241B2 (en) Aircraft imaging system using projected patterns on featureless surfaces
CN109752016A (en) A kind of parallel traveling route track generation system for unmanned low-speed vehicle
Ahamed Navigation of an autonomous tractor using multiple sensors
Pattinson et al. Galileo enhanced solution for pest detection and control in greenhouses with autonomous service robots
Thorpe et al. Third annual report for perception for outdoor navigation
CN115326054A (en) Automatic navigation method of crawler-type agricultural vehicle
IL286204A (en) Method of navigating a vehicle and system thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: HEMISPHERE GPS LLC, CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BEELINE TECHNOLOGIES PTY LTD;REEL/FRAME:055358/0533

Effective date: 20080421

Owner name: BEELINE TECHNOLOGIES PTY LTD, AUSTRALIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MACDONALD, ANDREW JOHN;REEVE, DAVID ROBERT;MORRISON, CAMPBELL ROBERT;REEL/FRAME:055358/0496

Effective date: 20070404

Owner name: AGJUNCTION LLC, KANSAS

Free format text: CHANGE OF NAME;ASSIGNOR:HEMISPHERE GPS LLC;REEL/FRAME:055362/0215

Effective date: 20130619

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8